Programming Bug IT Technology

Time's Up: 2^30 Seconds Since 1970

An anonymous reader writes: "In Software glitch brings Y2K deja vu, CNET points out that a small wave of Y2K-like bugs may soon hit, though it gets the explanation wrong. It will soon be about 2^30 (1 billion, not 2 billion) seconds since 1970 (do the arithmetic). Systems that use only 30 bits of a word for unsigned/positive integers, or that store time as seconds since 1970 in such a format, may roll back to 1970. (Many systems that do not need full 32-bit integers may reserve some bits for other uses, such as boolean flags, or for type information to distinguish integers from booleans and pointers.)"
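For reference, a minimal sketch of the wrap-around described above, assuming seconds-since-1970 is packed into a 30-bit unsigned field of a 32-bit word (the field layout here is purely illustrative):

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* Illustrative only: a system that reserves the top 2 bits of a 32-bit
     * word for flags/tags and keeps the time in the low 30 bits. */
    uint32_t word = 1073741823u & 0x3FFFFFFFu;   /* 2^30 - 1: last good second */

    printf("before: %u\n", word);                /* 1073741823 */
    word = (word + 1u) & 0x3FFFFFFFu;            /* one second later */
    printf("after:  %u\n", word);                /* 0, i.e. back to Jan 1, 1970 */
    return 0;
}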
  • Some systems... (Score:4, Interesting)

    by NightSpots ( 682462 ) on Sunday December 21, 2003 @08:00PM (#7782367) Homepage
    And which systems are those?

    Do any of the common architectures use 30 bits instead of 31?
  • Wrong writeup. (Score:5, Interesting)

    by crapulent ( 598941 ) on Sunday December 21, 2003 @08:14PM (#7782470)
    Could you be any MORE confusing? 2^30 is not 1 billion. It's 1,073,741,824. And the date as of right now is:

    $ date +%s
    1072051722


    So, yes, there is an issue with the date overflowing a 30-bit space. I'd hardly say it's relevant; any software that made such a braindead choice (why 30 and not 32 bits?) deserves to break. But it has nothing to do with a billion or anything else related to base 10. It hit 1 billion a long time ago, and it was covered then. [slashdot.org]
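    For scale, the gap between the timestamp quoted above and the 2^30 boundary works out to under three weeks; the only input here is the date +%s value from this post:

    #include <stdio.h>

    int main(void)
    {
        long rollover = 1073741824L;   /* 2^30 */
        long now      = 1072051722L;   /* the "date +%s" value quoted above */

        printf("%ld seconds to go (~%.1f days)\n",
               rollover - now, (rollover - now) / 86400.0);
        /* 1690102 seconds to go (~19.6 days) -- i.e. around January 10, 2004 */
        return 0;
    }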
  • by teridon ( 139550 ) on Sunday December 21, 2003 @08:14PM (#7782472) Homepage
    Time should be stored in a self-describing format, such as a header containing:
    2-bits (E) for # of bits for Epoch
    1-bit for whether the time is a floating point format
    if not floating, then:
    2-bits (N) for # of bits for the time
    2-bits (n) for # of bits for the resolution (1/2^n) (e.g. n=8 would mean 1/256 second resolution)
    if floating, then follow some IEEE standard representation.
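    One way to sketch that header in C is with bit-fields; only the bit widths come from the description above, so the field names and their reading as selectors are assumptions:

    /* Sketch of the self-describing header proposed above.  The widths follow
     * the post; the names and semantics are illustrative assumptions. */
    struct time_header {
        unsigned epoch_sel   : 2;  /* E: selects how many bits encode the epoch */
        unsigned is_floating : 1;  /* 1 = value follows an IEEE floating format */
        unsigned value_sel   : 2;  /* N: selects how many bits encode the time */
        unsigned frac_sel    : 2;  /* n: resolution is 1/2^n of a second */
    };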

  • by cbiffle ( 211614 ) on Sunday December 21, 2003 @08:15PM (#7782475)
    Chances are pretty good that you interact with 31-bit machines every day -- namely, older (pre-64-bit) IBM mainframes. Even the new zSeries machines frequently run apps in 31-bit mode for compatibility with older systems.

    Using a couple of bits in an integer for data type is usually (in my experience) called 'tagged data.' I use it in Smalltalk VMs as an optimization -- the "objects" representing Integers are really just 31-bit integers with an extra zero-bit stuck on the LSB. (Object pointers have an LSB of 1, so you mask that to zero before using them and keep everything 16-bit aligned.)

    Essentially what you wind up with there is a tradeoff: you can perform simple arithmetic and logic on the Integer "references" without actually having to allocate an object to hold an Integer, but you lose a bit of dynamic range. In my experience, it's an acceptable tradeoff, and it lets you have all the advantages of a true OO system without the performance penalty of having to use an object for, say, every loop variable.

    So there's an example of why you do that. The aforementioned Smalltalk systems wouldn't be vulnerable to this date issue, however, as their integers will automatically convert themselves to arbitrary-precision numeric types as needed.
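    A rough C rendering of the tagging trick described above (LSB 0 marks an immediate integer, LSB 1 marks an object pointer, which is masked with &~1 before use); the helper names are invented for this sketch:

    #include <stdio.h>
    #include <stdint.h>

    /* One tag bit, so immediate integers keep word-size-minus-one bits
     * (31 bits on the 32-bit machines discussed above). */
    typedef uintptr_t oop;   /* one tagged machine word */

    static oop      tag_int(intptr_t n) { return (oop)n << 1; }   /* LSB = 0 */
    static intptr_t untag_int(oop w)    { return (intptr_t)w >> 1; }
    static int      is_int(oop w)       { return (w & 1u) == 0; }

    int main(void)
    {
        oop a = tag_int(21);
        /* Arithmetic works on the tagged form without allocating an object:
         * (21 << 1) * 2 is already the correctly tagged encoding of 42. */
        oop doubled = a * 2u;
        printf("%ld, is_int=%d\n", (long)untag_int(doubled), is_int(doubled));
        return 0;
    }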
  • by Rosco P. Coltrane ( 209368 ) on Sunday December 21, 2003 @08:17PM (#7782491)
    I'm bracing for the 2038 Y2K (or is it Y2K38?) bug, the one that'll overflow the Unix time() function.

    You think I'm trying to be funny? Well, let's see: people were worried that systems built in the '80s and before would hit the 99 COBOL date bug and/or the 2-digit date bug in 2000. 1980 and before is 20+ years ago, there weren't that many computers/microcontrollers around during those 20 years compared to what's to come, and operating systems weren't very unified. Today we have kajillions of Unix machines around: how much do you bet a lot of these will still be running 30 years from now?

    This said, I'm not bracing quite yet to tell the truth ...
  • ObCalculation (Score:5, Interesting)

    by LaCosaNostradamus ( 630659 ) <[moc.liam] [ta] [sumadartsoNasoCaL]> on Sunday December 21, 2003 @08:20PM (#7782505) Journal
    2^30 = 1073741824s ~= 34y 9d 13h 37m

    1970JAN01 0000hr + (34y 9d 13h 37m) ~= 2004JAN10 1337hr

    January 10th should be an interesting day for somebody.
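    That calendar arithmetic can be checked directly against the C library (assuming the usual Unix epoch of 1970-01-01 00:00 UTC and ignoring leap seconds, as time_t does):

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t t = 1073741824;   /* 2^30 seconds after 1970-01-01 00:00 UTC */
        char buf[64];

        strftime(buf, sizeof buf, "%a %Y-%m-%d %H:%M:%S UTC", gmtime(&t));
        printf("%s\n", buf);     /* Sat 2004-01-10 13:37:04 UTC */
        return 0;
    }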
  • by fireman sam ( 662213 ) on Sunday December 21, 2003 @08:22PM (#7782514) Homepage Journal
    I've seen some comments along the lines of "hey, another Y2K waste of time... blah blah blah." But think of it this way:

    1 - What if all the money that was spent to "fix" the Y2K bug actually fixed the bug?

    2 - Most people say that all the money spent "fixing" the Y2K bug was a waste because nothing happened.

    3 - How many people have insurance of some sort and have never needed it (I am one)? Yet every year, you renew your policies.

    There are two things we can do about these "time" bombs. The first is to do nothing and hope that all is well. Or we could audit the code that may fail. A bit like paying insurance.

    [ PS: it is SCO's code, so they should pay ]
  • Re:subject (Score:3, Interesting)

    by DarkOx ( 621550 ) on Sunday December 21, 2003 @08:23PM (#7782521) Journal
    People did think this would happen eventually, if those systems were still in operation. Nobody thought they would still be in operation, so it was thought safe to save on the memory. Remember that lots of these big old mainframes, which sometimes have hundreds of terminals, have less than 16 megs of memory. I think it was not till 1960 that a computer was even built with that much RAM, and it was common into the late '70s to have much less on big iron. Disk/tape capacities were just as limited. Memory was EXPENSIVE and LIMITED; that is why it was done the way it was.
  • by belarm314 ( 663118 ) on Sunday December 21, 2003 @08:33PM (#7782593)
    Fortunately, it would be rather trivial to convert those to 32-bit structs in the next
    perl -e 'print(((2**31) - time)/(3600*24*365), "\n")'

    34.102266457382
    years.
  • Fixed In Time (Score:2, Interesting)

    by Ashcrow ( 469400 ) on Sunday December 21, 2003 @08:39PM (#7782633) Homepage
    It seems like we will probably all be using 64-bit computing by 2038, which will fix the problem by itself. After that we won't have to worry for at least a few thousand years (I think).
  • by pHDNgell ( 410691 ) on Sunday December 21, 2003 @08:45PM (#7782659)
    I could of course be wrong but I'm pretty sure there aren't 31-bit architectures.

    Right, but that doesn't necessarily dictate how the 32 bits of 32-bit architectures should work. I ran into a problem in a programming system that used 30-bit numbers when I tried to represent time as an int. The programming system wanted two bits of the int for itself (or maybe it was just one and the sign, can't remember).

  • by E-Lad ( 1262 ) on Sunday December 21, 2003 @08:58PM (#7782730)
    [daleg@lithium]~>perl 2038.pl
    Tue Jan 19 03:14:01 2038
    Tue Jan 19 03:14:02 2038
    Tue Jan 19 03:14:03 2038
    Tue Jan 19 03:14:04 2038
    Tue Jan 19 03:14:05 2038
    Tue Jan 19 03:14:06 2038
    Tue Jan 19 03:14:07 2038
    Tue Jan 19 03:14:07 2038
    Tue Jan 19 03:14:07 2038
    Tue Jan 19 03:14:07 2038
    [daleg@lithium]~>uname -rs
    SunOS 5.8

    Interesting, it stays at the limit rather than rolling over.
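    The 2038.pl script isn't reproduced in the thread, but the same boundary is easy to poke at from C; a rough equivalent of what the output above exercises (the loop bounds are an assumption about what the script does):

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    int main(void)
    {
        /* Step across 2^31 - 1.  Whether the result saturates (as above),
         * wraps around to 1901, or fails outright depends on the width of
         * time_t and on how the C library treats out-of-range values. */
        for (int64_t i = -3; i <= 3; i++) {
            time_t t = (time_t)((int64_t)INT32_MAX + i);
            const char *s = ctime(&t);
            printf("%lld -> %s", (long long)((int64_t)INT32_MAX + i),
                   s ? s : "(ctime failed)\n");
        }
        return 0;
    }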
  • Party Like It's 2037 (Score:5, Interesting)

    by yintercept ( 517362 ) on Sunday December 21, 2003 @09:10PM (#7782788) Homepage Journal
    2038 will be a big mess.

    For the first programming job I had (at an insurance agency) they were using 9/9/99 as infinity. So, if your benefits mysteriously stopped a few years ago...hey, it wasn't my fault!

    The most interesting time-related bug I came across was with an RDBMS called Advanced Revelation. The program counted days from 1/1/1970. In May 1997 the sequence counter went from 4 to 5 digits. Interestingly, the database itself was stable, but there were quite a few reports and add-ons that had been designed to expect a 4-digit number.

    BTW, I built a 3/3/3333 into a program that I wrote for a company.
  • Re:subject (Score:4, Interesting)

    by Anonymous Coward on Sunday December 21, 2003 @09:24PM (#7782843)
    Yes, you are the first person ever to understand this.

    Try explaining to a manager, in 1978, why he should spend twice as much on a system so that it wouldn't fail sixty years hence.
  • Time stops for OS X (Score:3, Interesting)

    by shking ( 125052 ) <babulicm@cuu g . a b . ca> on Sunday December 21, 2003 @09:26PM (#7782854) Homepage
    Don't know if the problem is OS X 10.2.8 or Perl 5.6, but time just stops at Jan 19 03:14:07 2038 on my mac

    http://maul.deepsky.com/~merovech/2038.html [deepsky.com]

    mikebabu% perl 2038.pl
    Tue Jan 19 03:14:01 2038
    Tue Jan 19 03:14:02 2038
    Tue Jan 19 03:14:03 2038
    Tue Jan 19 03:14:04 2038
    Tue Jan 19 03:14:05 2038
    Tue Jan 19 03:14:06 2038
    Tue Jan 19 03:14:07 2038
    Tue Jan 19 03:14:07 2038
    Tue Jan 19 03:14:07 2038
    Tue Jan 19 03:14:07 2038

  • Re:Did the math. (Score:5, Interesting)

    by happyduckworks ( 683158 ) on Sunday December 21, 2003 @09:31PM (#7782886)
    > Okay -- I did the math, and 2^29 seconds since January 1st 1970 would have been up on January 4th, 1987.

    I remember that day - the Common Lisp system I was using (on a Sun) all of a sudden stopped recognizing when files were out of date and needed recompiling. Yup, they used a couple bits for a tag and then interpreted the rest as signed...
  • by Walabio ( 660956 ) on Sunday December 21, 2003 @09:33PM (#7782895) Homepage

    If we use Planck time and 256-bit integers, we can handle about 1.981384141637854E+26 years. We should handle time as a 256-bit integer based on Planck time and convert to local human time standards as needed. We should also support a second 256-bit imaginary integer, and conversion to two floating-point values (one real and one imaginary), because some calculations in physics involving time occur on the complex plane. I propose that zero-time be Julian Date zero.
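    A quick double-precision sanity check of the range quoted above, using the approximate Planck time of 5.4e-44 s and a 365.25-day year (inputs that reproduce the quoted figure):

    #include <stdio.h>
    #include <math.h>

    int main(void)
    {
        double planck_time = 5.4e-44;            /* seconds, approximate */
        double ticks       = pow(2.0, 256.0);    /* states of a 256-bit counter */
        double years       = ticks * planck_time / (365.25 * 86400.0);

        printf("range: %.4g years\n", years);    /* about 1.981e+26 years */
        return 0;
    }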

  • by kst ( 168867 ) on Sunday December 21, 2003 @09:48PM (#7782954)
    The point is that time() still returns a 32-bit value in both C and C++. That's the problem we're trying to solve here.

    No, time() returns a time_t value. The C standard defines time_t as an arithmetic type capable of representing times; there's nothing in the standard that says it has to be 32 bits, or that it counts seconds from 1970-01-01, or even that it counts seconds. (I think POSIX imposes some more specific requirements, but it still allows a 64-bit type.)

    I've worked on several systems that already use a 64-bit type for time_t. I suspect that all systems will do so well before 2038.

    If the problem hasn't taken care of itself in 20 or 30 years, then we can start worrying about it.

    (Switching to an unsigned 32-bit type would buy us another 68 years, but it would make it impossible to represent dates before 1970.)
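    A quick way to see what any particular system actually does (standard C only; the answer varies by platform, compiler, and ABI):

    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        /* time_t only has to be an arithmetic type; its width and signedness
         * are left to the implementation. */
        printf("time_t is %zu bits, %s\n",
               sizeof(time_t) * 8,
               (time_t)-1 < (time_t)0 ? "signed" : "unsigned");
        return 0;
    }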
  • by the eric conspiracy ( 20178 ) on Sunday December 21, 2003 @09:50PM (#7782964)
    What are the chances that we'll still be using the Gregorian calendar in Y10K?

    Don't know about the Gregorian calendar, but there are other calendars like the Chinese and Hebrew calendars that are already in year 4000+. These probably will be around for Y10K relative to their starting dates.

  • by Jamie Zawinski ( 775 ) <jwz@jwz.org> on Sunday December 21, 2003 @10:06PM (#7783027) Homepage

    One of the fun tricks you can do is use the bottom 2 bits for the tagged data type, and then reserve two of those for immediate integers: one for even ints, and one for odd. That way, you get 3 tag types, with 30 bits of pointer, but you still get 31-bit integers instead of 30-bit. So the tags might look like:

    • 00: odd int
    • 01: even int
    • 10: pointer to object header
    • 11: pointer to array/string header

    Then you convert raw data to an int with >>1, and to a pointer with &~3 (you only need 30 bits of pointer if all objects are word-aligned in a 32 bit address space.)

    Lucid Common Lisp used this kind of system, and Lucid Emacs/XEmacs do something similar.
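    One consistent way to lay this out in C; the exact tag assignment below is an assumption (the listing above is a little ambiguous about which code means what), but it keeps the stated properties: two tag codes give 31-bit immediate integers decoded with >>1, and two give word-aligned pointers recovered with &~3:

    #include <stdio.h>
    #include <stdint.h>

    /* Assumed layout for this sketch:
     *   bit 0 = 0 : immediate integer.  Bit 1 is the integer's own low bit, so
     *               the "even int" and "odd int" tag codes together give a
     *               31-bit fixnum range on a 32-bit word.
     *   bit 0 = 1 : pointer.  Bit 1 picks object vs array/string header, and
     *               the address must be 4-byte aligned (30 significant bits).
     * Decoding relies on arithmetic right shift, as 32-bit compilers provide. */
    typedef uint32_t lispword;

    static lispword tag_fixnum(int32_t n)    { return (lispword)n << 1; }
    static int32_t  untag_fixnum(lispword w) { return (int32_t)w >> 1; }   /* ">>1" */
    static lispword tag_object(uint32_t a)   { return (a & ~3u) | 1u; }
    static lispword tag_array(uint32_t a)    { return (a & ~3u) | 3u; }
    static uint32_t untag_ptr(lispword w)    { return w & ~3u; }           /* "&~3" */

    int main(void)
    {
        printf("fixnum round-trip: %d\n", untag_fixnum(tag_fixnum(-123456789)));
        printf("array header at %#lx\n",
               (unsigned long)untag_ptr(tag_array(0x1000u)));
        (void)tag_object;
        return 0;
    }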

  • by optikSmoke ( 264261 ) on Sunday December 21, 2003 @10:51PM (#7783207)

    I'm just more worried about the 0.0481298833079654997463216641 seconds after 2004...

    Not to nitpick, but that would be 0.0481298833079654997463216641 years. And besides, your calculation is a little off because you appear to have used 365 days per year :).

    How 'bout this:

    2^30 seconds / 60 seconds/minute / 60 minutes/hour / 24 hours/day / 365.242199 days/year = 34.025551925361 years

    34.025551925361 years + 1970 = 2004.0255519254

    0.0255519254 years * 365.242199 days/year = 9.3326414074074 days

    0.3326414074074 days * 24 hours/day = 7.9833937777782 hours

    0.9833937777782 hours * 60 minutes/hour = 59.003626666694 minutes

    0.003626666694 minutes * 60 seconds/minute = 0.21760000161308 seconds

    OR, on January 10, 2004 at 07:59:00.21760000161308, the world will come to an end.

    Approximately.

  • Re:Umm, none? (Score:4, Interesting)

    by tjb ( 226873 ) on Sunday December 21, 2003 @11:06PM (#7783269)
    Yes you can. You can define a binary system with however many bits you like. It may not look pretty in a hexadecimal representation, but there's no reason you can't do it.

    In fact, I work on a processor in which one engine has 20 bit words, another has 18 bit words, and another does multiplies out to 29 bits (15 bit x 14 bit).

    Tim
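    On a general-purpose machine you can emulate any word width just by masking after each operation; a tiny sketch of wrapping 20-bit arithmetic (the width is simply borrowed from the example above):

    #include <stdio.h>
    #include <stdint.h>

    #define BITS 20u
    #define MASK ((1u << BITS) - 1u)   /* 0xFFFFF: keep only 20 bits */

    static uint32_t add20(uint32_t a, uint32_t b)
    {
        return (a + b) & MASK;         /* wrap exactly like a 20-bit register */
    }

    int main(void)
    {
        printf("%#x\n", add20(0xFFFFFu, 1u));   /* prints 0: 20-bit overflow */
        return 0;
    }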
  • by Just Another Perl Ha ( 7483 ) on Sunday December 21, 2003 @11:33PM (#7783405) Journal
    Bzzzt... wrong answer. Here's how it works:
    1 UK fluid ounce -> 28.41 ml
    1 US fluid ounce -> 29.57 ml

    1 UK pint -> 20 UK oz -> 568.3 ml
    1 US pint -> 16 US oz -> 473.2 ml
    From the above, I'd hazard a guess that you're either not American, not British, or not a connoisseur of fine pints after work.
  • by Walabio ( 660956 ) on Monday December 22, 2003 @12:51AM (#7783747) Homepage

    This is a response to the first two replies

    John Hasler:

    Zero-time should be the instant of the Big Bang. By the time 256 bit cpus are standard we should know that accurately.

    I thought about zero-time being the big bang. I figured that since the time of the big bang is not known very accurately, it would not be a good idea. On second thought, if we revise the timestamps and the conversion tables whenever the estimate of the big bang changes (something which is necessary anyway, as the value of the Planck time unit (5.4E-44 s) is itself subject to revision), it will work. If we use the time of the big bang, we will have to include the version of the revision along with the time stamp, or else the text file from one year ago will show as being from one billion years ago, while the spreadsheet from 2000 AD will not exist for another two billion years.

    Anonymous Coward:

    (BTW, too bad I don't have some mod points to spend -- Planck units are the perfect rebuttal to those that claim the metric system is the best around)

    Natural units (Planck units) are best. After them, metric units are second best. One must remember that the metric system predates the discovery of the Planck constants. The metric system has definite advantages over previous systems:

    • All units are interrelated
    • The prefixes denote powers of ten
    • It is an international standard

    Natural units are too small for practical everyday use. Metric units are in the convenient range for everyday use. Still, it is important to remember that metric units are arbitrary.

    Among my favorite arbitrary units is the byte-based system for measuring memory, in which each prefix step is a factor of 2^10. This way, the amount of memory a computer can handle is always a round number (because the algorithms used for memory allocation use binary arithmetic). As an example, my filesystem can support files of up to 16 exabytes in size.

  • TAI (Score:3, Interesting)

    by Detritus ( 11846 ) on Monday December 22, 2003 @01:30AM (#7783893) Homepage
    UTC is a mess. I'd rather see all computer clocks use TAI [iers.org] (International Atomic Time) internally.
  • Re:subject (Score:2, Interesting)

    by quantum bit ( 225091 ) on Monday December 22, 2003 @01:41AM (#7783926) Journal
    Recompiling everything is no big deal to Gentoo and *BSD people.

    That's pretty much right. Just out of curiosity I tried recompiling everything on my FreeBSD system after changing the definition of time_t from int32_t to int64_t.

    Surprisingly, everything works. I guess this is probably because it's already sometimes compiled for 64-bit architectures anyway.
  • by quantum bit ( 225091 ) on Monday December 22, 2003 @01:59AM (#7783994) Journal
    Bah, you amateurs... the only proper way to represent a date/time is as an unsigned long long, signifying microseconds-since-epoch.

    What do you do when you need to represent a date before the epoch?
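    One common answer is to keep the microsecond resolution but make the count signed, so pre-1970 instants are simply negative; a small sketch (the helper name is invented, whole seconds only):

    #include <stdio.h>
    #include <stdint.h>
    #include <time.h>

    /* Signed 64-bit microseconds since the epoch: dates before 1970 are
     * negative counts. */
    static void print_usec(int64_t usec)
    {
        time_t secs = (time_t)(usec / 1000000);
        struct tm *tm = gmtime(&secs);
        char buf[64];

        if (tm && strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S UTC", tm))
            printf("%lld us -> %s\n", (long long)usec, buf);
    }

    int main(void)
    {
        print_usec(-86400LL * 1000000LL);   /* one day before the epoch: 1969-12-31 */
        print_usec(0);                      /* the epoch itself: 1970-01-01 */
        return 0;
    }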
  • by Baki ( 72515 ) on Monday December 22, 2003 @02:39AM (#7784206)
    At the moment we are using 31-Dec-9999 as infinity (the max date Oracle can represent). I think chances are pretty slim that today's systems will still be running in 9999.
  • Re:Some systems... (Score:5, Interesting)

    by Piquan ( 49943 ) on Monday December 22, 2003 @02:40AM (#7784210)

    So maybe a Lisp Machine might have this problem? Of course, Lispers will tell you that they'd always have the sense to use a bignum :)

    The Symbolics Lispms had wider words than PCs today. They used 36-bit words on the 3600s, with 4 bits of tag and 32 bits of data for numbers (or 8 bits of tag and 28 bits of data for pointers). They used 40-bit words on the Ivory, with 8 bits of tag and 32 bits of data for all types. So either way, the number is a 32-bit value. (This is why Lispms traditionally spec RAM in megawords, not megabytes.)

    That aside, like I mentioned in my other post, they said that all the date code is bignum-friendly anyway.

  • by Dahan ( 130247 ) <khym@azeotrope.org> on Monday December 22, 2003 @06:05AM (#7784762)
    I'm just sayin', if you're going to try and be ultra accurate, then don't half-ass it.

    Taking into account leap seconds:
    env TZ=/usr/share/zoneinfo/right/GMT date -r 1073741823
    Sat Jan 10 13:36:41 GMT 2004

    Ignoring leap seconds:
    env TZ=/usr/share/zoneinfo/posix/GMT date -r 1073741823
    Sat Jan 10 13:37:03 GMT 2004

    (BSD date's -r option is useful sometimes)

Say "twenty-three-skiddoo" to logout.

Working...