Programming Bug IT Technology

Time's Up: 2^30 Seconds Since 1970

An anonymous reader writes: "In Software glitch brings Y2K deja vu, CNET points out that a small wave of Y2K-like bugs may soon hit, though it gets the explanation wrong. It will soon be about 2^30 (1 billion, not 2 billion) seconds since 1970 (do the arithmetic). Systems that use only 30 bits of a word for unsigned/positive integers, and store time as seconds since 1970 in that format, may roll back to 1970. (Many systems that do not need full 32-bit integers reserve some bits for other uses, such as boolean flags, or for type information to distinguish integers from booleans and pointers.)"
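
A minimal sketch of the rollover being described, assuming a Unix-like system where time() returns seconds since 1970; the 30-bit mask below stands in for a tagged or truncated integer field and is not taken from the story itself:

/* Illustration only: what a timestamp kept in a 30-bit unsigned field sees. */
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t now = time(NULL);                       /* full-width seconds since 1970-01-01 UTC */
    unsigned long seen = (unsigned long)now & ((1UL << 30) - 1);   /* keep only the low 30 bits */

    printf("real seconds since the epoch: %ld\n", (long)now);
    printf("value in a 30-bit field:      %lu\n", seen);
    /* Once the real count reaches 2^30 (around 2004-01-10), the masked value
     * wraps back toward zero, i.e. the clock appears to return to 1970. */
    return 0;
}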
  • I could of course be wrong, but I'm pretty sure there aren't 31-bit architectures. At least, such architectures are exceedingly rare if they exist at all.

    What I believe this article is referring to is that some software may have been coded to use a bit or two of its integers to store extra info. This seems like a pretty bad idea, though, as it would have all sorts of interesting effects on overflow and such. It would also seem useful to only a very, very tiny portion of software, since the overhead of using this method as a general-purpose solution would be prohibitive.

    Sounds like it's just the story of yet another software bug...
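
    A sketch of the tagged-integer scheme described above, assuming a Lisp-style runtime that spends the low two bits of a 32-bit word on a type tag; the tag values and helper names here are made up purely for illustration:

    /* Illustration only: a 32-bit word with a 2-bit type tag leaves 30 bits
     * for the integer payload, so stored values wrap at 2^30. */
    #include <stdint.h>
    #include <stdio.h>

    #define TAG_BITS   2u
    #define TAG_FIXNUM 0u                        /* hypothetical "small integer" tag  */
    #define FIXNUM_MAX ((1u << 30) - 1)          /* largest value 30 bits can hold    */

    static uint32_t make_fixnum(uint32_t value)
    {
        return (value << TAG_BITS) | TAG_FIXNUM; /* bits above 2^30 are silently lost */
    }

    static uint32_t fixnum_value(uint32_t word)
    {
        return word >> TAG_BITS;
    }

    int main(void)
    {
        uint32_t t = make_fixnum(1u << 30);      /* one past the 30-bit limit */
        printf("stored %lu, read back %lu\n",
               (unsigned long)(1u << 30), (unsigned long)fixnum_value(t));
        printf("largest representable fixnum: %lu\n", (unsigned long)FIXNUM_MAX);
        return 0;
    }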
  • Re:Y2K (Score:2, Insightful)

    by Anonymous Coward on Sunday December 21, 2003 @08:17PM (#7782493)
    If people hadn't worked to prepare for Y2K, it wouldn't have gone smoothly. There was a problem but it was largely fixed before it happened.
  • Re:Some systems... (Score:5, Insightful)

    by Anonymous Coward on Sunday December 21, 2003 @08:27PM (#7782545)
    Well, they wouldn't just have the sense to use a bignum - they'd have the sense not to override the default behaviour of the damn language, which would be to go to bignum if necessary. It would take effort to write a declaration to actually deliberately override the behaviour, and would be A Seriously Stupid Thing To Do. Doesn't mean that somebody, somewhere wouldn't do it, of course, but it wouldn't be the "common case" that there would be a problem waiting to happen, like in C.

  • by Anonymous Coward on Sunday December 21, 2003 @08:29PM (#7782569)
    Seriously, could we please get started fixing this 2038 bug now? I don't know if it's practical to change time_t to "long long"; if not, could we at least officially define the successor to time_t?

    I know that the emergence of 64-bit chips will alleviate this somewhat, but it wouldn't surprise me if at least some embedded systems are still running 32 bits in 2038.

    I know that "long long" is common, but it's not part of the official C++ standard yet. Shouldn't we be putting this in the standard now? It's not too much to require language libraries to have 64-bit integer support (if necessary). This doesn't have to be painful.

    I'll feel a lot better the day that I know what I'm officially supposed to use instead of time_t -- or if I can be given a guarantee that time_t will be upgraded to 64 bits within the next few years.
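
    A rough sketch of the 2038 failure mode being asked about, assuming a platform whose time_t is a signed 32-bit count of seconds; int64_t merely stands in for whatever wider successor type eventually gets standardized:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int64_t next = (int64_t)INT32_MAX + 1;   /* one second past 2038-01-19 03:14:07 UTC */
        int32_t as32 = (int32_t)next;            /* implementation-defined narrowing; on
                                                    two's-complement machines it wraps to
                                                    -2^31, i.e. back to 1901 */

        printf("64-bit seconds count:          %lld\n", (long long)next);
        printf("same value in a 32-bit time_t: %ld\n", (long)as32);
        return 0;
    }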

  • Re:ObCalculation (Score:3, Insightful)

    by Rick Richardson ( 87058 ) on Sunday December 21, 2003 @08:29PM (#7782570) Homepage
    $ date -d "1/1/70 `dc -e '2 30 ^p'` secs"
    Sat Jan 10 13:37:04 CST 2004
  • by Mr. Slippery ( 47854 ) <.tms. .at. .infamous.net.> on Sunday December 21, 2003 @08:36PM (#7782614) Homepage
    Be smart, and play it safe. Use a 5, or better yet, 10 digit year. What's a few bytes?
    I wrote the following in the RISKS forum a few years ago [ncl.ac.uk]:
    So maybe I'm an April Fool, but it seems to me that the Y10K issue is worth a little serious thought.

    There are areas of human endeavor in which 8000 years is not an extreme time span. At present, we deal with these long time spans only in modeling things like geological and cosmological events. But it is not unreasonable that within the next century, we may begin to build very high technology systems with mission durations of thousands of years - for example, a system to contain radioactive wastes, or a probe to another star system.

    Y2K issues have raised our consciousness about timer overflows, but it's quite possible that this may fade in succeeding generations. There's no reason not to start setting standards now.

    Perhaps all time counters should be bignums?

  • by Kenja ( 541830 ) on Sunday December 21, 2003 @09:18PM (#7782822)
    This is true of IT work in general. If you do your job, nothing happens and people think you're wasting time.
  • Embedded systems (Score:4, Insightful)

    by tepples ( 727027 ) <tepples.gmail@com> on Sunday December 21, 2003 @09:26PM (#7782860) Homepage Journal

    Perhaps all time counters should be bignums?

    Bad idea. A "system to contain radioactive wastes" will usually be an embedded system with a fixed amount of memory. Fixed-memory machines need fixed-size data structures, and a 64-bit count of seconds should hold even over lifetime-of-the-Universe or lifetime-of-copyright time scales.
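
    A quick back-of-the-envelope check of that claim, illustrative only (the 31,556,952-second "year" is just 365.2425 days):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        const int64_t secs_per_year = 31556952;          /* 365.2425 days */
        /* A signed 64-bit count of seconds covers roughly 292 billion years,
         * comfortably more than the ~13.8-billion-year age of the Universe. */
        printf("64-bit range: about %lld billion years\n",
               (long long)(INT64_MAX / secs_per_year / 1000000000));
        return 0;
    }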

  • by be-fan ( 61476 ) on Sunday December 21, 2003 @09:27PM (#7782864)
    Argh! God I hate XML culture! How is this in any way better than a 64-bit integer???
  • Actually... (Score:3, Insightful)

    by Kjella ( 173770 ) on Sunday December 21, 2003 @11:01PM (#7783239) Homepage
    It's interesting how no one considered that this would happen eventually and just started using 64-bit ints to store this for the long run.

    Well, considering it was the 70s, the change of the millennium seemed ages away, and 32 bits seemed like a *lot* of memory back in those days. Think of a small database with 64-bit vs. 32-bit timestamps. My dad used to work at IBM way back, and I remember him telling me about a printing company that turned down another MB of mainframe memory because it would have cost them $150,000. That's 50 cents extra per date.

    If I remember my history right, it was one of the Apple developers who, during the design process, cut two digits from the year in order to save memory. Since they were trying to make a consumer product (or at least a consumer-priced product), even more work went into minimizing cost. This was also in the 70s. The IBM PC just copied that system.

    Today, it seems utterly stupid not to double it. Just understand that today's machines have literally thousands of times more RAM than they used to. Memory was a genuinely scarce resource, where every bit counted. Nobody was thinking of the year 2038; they were thinking of how this would improve performance right here and now. Also, at the time computers were mostly big mainframes, used pretty much only by specialists and geeks, so they probably thought it would be easy to fix when the issue arose.

    That they were mostly unable to foresee the explosion of a) the personal computer, b) the Internet and c) embedded electronics before the end of the millennium is hardly something you can blame them for. It wasn't really until near the very end of the millennium that people understood the magnitude of the task. That's why the Y2K problem became such a big thing: if it had been some mainframes in a computer lab somewhere, no one would have raised an eyebrow. It was the "computers are everywhere, in all the offices, they'll all go to hell, and society will collapse" angle that made it big. By then it was a little late to fix the underlying format, since so much code relied on it being the way it was.

    So if there was a time to do it, it would have to be the middle period... after memory got "cheap", before too much code started to rely on it. It's hard to say if such a period even existed. And even if it did, I'm sure 99.9% of people would have wondered why the fsck you'd need to fix that issue NOW. The old system works, and will work for many years yet (like, long past the lifetime of this PC. We can fix it in the next one). Don't knock it if it works, right? So I find it quite natural it turned out the way it did, no matter how smart people were.

    Kjella
  • by Smelly Jeffrey ( 583520 ) on Monday December 22, 2003 @01:45AM (#7783947) Homepage
    Y2K began on January 1, 19101

    Let:
    Y2K == "year two-thousand"
    19100 == 2000
    19101 == 2001.

    You mean to tell me that the "year two-thousand" began on January 1, 2001?
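
    For anyone who missed the joke: the 19100/19101 figures come from the classic tm_year display bug. A small sketch of it, not taken from the comment, assuming a standard C environment:

    /* struct tm's tm_year field is years since 1900, so naively gluing "19"
     * in front of it prints 19100 for the year 2000 and 19103 for 2003. */
    #include <stdio.h>
    #include <time.h>

    int main(void)
    {
        time_t now = time(NULL);
        struct tm *t = localtime(&now);

        printf("buggy:   19%d\n", t->tm_year);        /* e.g. 19103 in late 2003 */
        printf("correct: %d\n", 1900 + t->tm_year);   /* e.g. 2003 */
        return 0;
    }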
  • Re:subject (Score:3, Insightful)

    Aha, you see, it compiled. But have you tested every function and routine to make sure no programmer assumed an x-bit integer? Or based calculations on the rollover shortcut? Or required a 32-bit integer for certain time calculations? Mere compilation is not enough to ensure accurate and correct operation of all programs.

    Personally, at this point I would have more concerns about the stability of the recompiled system than about the 32-bit-time-based one.
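
    One concrete example of the kind of hidden assumption being described, purely illustrative (the function and values are invented for the sketch): a deadline check written around 32-bit rollover arithmetic, which only works while the counter really is 32 bits wide.

    #include <stdint.h>
    #include <stdio.h>

    /* Returns nonzero if 'deadline' is in the past, assuming the two values
     * are never more than ~68 years apart. The hard-coded 32-bit wraparound
     * subtraction is the buried assumption: widen the time type and code
     * like this has to be found and re-audited, not just recompiled. */
    static int deadline_passed(uint32_t now, uint32_t deadline)
    {
        return (int32_t)(now - deadline) >= 0;
    }

    int main(void)
    {
        uint32_t now      = 0x00000010u;   /* just after a 32-bit wrap  */
        uint32_t deadline = 0xFFFFFFF0u;   /* just before the wrap      */

        printf("wraparound check says deadline passed: %d\n",
               deadline_passed(now, deadline));   /* prints 1 */
        return 0;
    }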

Say "twenty-three-skiddoo" to logout.

Working...