Time's Up: 2^30 Seconds Since 1970
An anonymous reader writes: "In Software glitch brings Y2K deja vu, CNET points out that a small wave of Y2K-like bugs may soon hit, though it gets the explanation wrong. It will soon be about 2^30 (1 billion, not 2 billion) seconds since 1970 (do the arithmetic). Systems that use only 30 bits of a word for unsigned/positive integers, or that store time as seconds since 1970 in this format, may roll back to 1970. (Many systems that do not need full 32-bit integers reserve some bits for other uses, such as boolean flags, or for type information to distinguish integers from booleans and pointers.)"
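As a quick check of the arithmetic, a minimal C sketch: it prints where 2^30 seconds after the epoch lands, and shows a counter limited to 30 bits wrapping back to 0, i.e. 1970, at that same moment.

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        /* 2^30 seconds after the Unix epoch (1970-01-01 00:00:00 UTC). */
        time_t t = (time_t)1 << 30;                 /* 1,073,741,824 seconds */
        printf("2^30 s after 1970: %s", ctime(&t)); /* lands on Jan 10, 2004 */

        /* A counter limited to 30 bits wraps back to 0 -- i.e. 1970 -- here. */
        unsigned long wrapped = ((unsigned long)t) & ((1UL << 30) - 1);
        printf("30-bit counter reads: %lu\n", wrapped);   /* prints 0 */
        return 0;
    }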
I don't think there are 31-bit architectures (Score:5, Insightful)
What I believe this article is referring to is that some software may have been coded to use a bit of its integers to store extra info. This seems like a pretty bad idea, though, as it would have all sorts of interesting effects on overflow and the like. It would only be useful to a very, very tiny portion of software, since the overhead of using this method as a general-purpose solution would be prohibitive.
Sounds like it's just the story of yet another software bug...
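To picture the scheme the parent describes, here is a hypothetical sketch (the tag layout is made up for illustration): two low bits of each word carry type info, leaving only 30 payload bits, so a seconds-since-1970 value stored this way wraps right about now, and every operation has to strip and re-apply the tag.

    #include <stdint.h>
    #include <stdio.h>

    /* Hypothetical tagged word: the two low bits carry type info
       (integer vs. pointer vs. boolean), leaving 30 bits of payload. */
    #define TAG_BITS     2
    #define TAG_INT      0x1u
    #define MAKE_INT(v)  (((uint32_t)(v) << TAG_BITS) | TAG_INT)
    #define INT_VALUE(w) ((uint32_t)(w) >> TAG_BITS)

    int main(void) {
        uint32_t t    = MAKE_INT((1u << 30) - 1);    /* one second shy of 2^30 */
        uint32_t next = MAKE_INT(INT_VALUE(t) + 1);  /* payload silently wraps */
        printf("%u + 1 -> %u\n", INT_VALUE(t), INT_VALUE(next)); /* 1073741823 + 1 -> 0 */
        return 0;
    }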
Seriously, why can't we fix this damn thing now? (Score:5, Insightful)
I know that the emergence of 64-bit chips will alleviate this somewhat, but it wouldn't surprise me if embedded systems, at least, are still running 32-bit code in 2038.
I know that "long long" is common, but it's not part of the official C++ standard yet. Shouldn't we be putting this in the standard now? It's not too much to require language libraries to have 64-bit integer support (if necessary). This doesn't have to be painful.
I'll feel a lot better the day that I know what I'm officially supposed to use instead of time_t -- or if I can be given a guarantee that time_t will be upgraded to 64 bits within the next few years.
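Until the standards catch up, one stopgap is to carry timestamps as an explicit 64-bit count of seconds regardless of how wide the platform's time_t happens to be. A sketch, assuming a C99 compiler with <stdint.h> (the epoch64_t and now64 names here are made up for illustration):

    #include <stdint.h>
    #include <stdio.h>
    #include <time.h>

    /* Hypothetical wrapper type: always 64 bits, whatever time_t's width is. */
    typedef int64_t epoch64_t;

    static epoch64_t now64(void) {
        return (epoch64_t)time(NULL);   /* widen whatever time() returns */
    }

    int main(void) {
        epoch64_t t = now64();
        printf("seconds since 1970: %lld\n", (long long)t);
        printf("32-bit signed limit (2038): %lld\n", (long long)INT32_MAX);
        return 0;
    }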
Re:ObCalculation (Score:3, Insightful)
Sat Jan 10 13:37:04 UTC 2004
Embedded systems (Score:4, Insightful)
Perhaps all time counters should be bignums?
Bad idea. A "system to contain radioactive wastes" will usually be an embedded system with a fixed memory. Fixed-memory machines need fixed-size data structures, and a 64-bit count of seconds should hold even over lifetime-of-the-Universe or lifetime-of-copyright time scales.
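A rough back-of-envelope sketch of that claim, just dividing the signed 64-bit range by seconds per year:

    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        /* Range of a fixed-size signed 64-bit seconds counter. */
        const double seconds_per_year = 365.25 * 24.0 * 3600.0;   /* ~3.16e7 */
        double years = (double)INT64_MAX / seconds_per_year;
        printf("64-bit seconds last ~%.2g years\n", years);
        /* ~2.9e11 years, roughly 20x the current age of the universe. */
        return 0;
    }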
Actually... (Score:3, Insightful)
Well, considering it was the 70s, the change of the millennium seemed ages away, and 32 bits seemed like a *lot* of memory back in those days. Think of a small database with 64-bit vs 32-bit timestamps. My dad used to work at IBM way back, and I remember he told me about a printing company which turned down getting another MB of mainframe memory from them, because it would have cost them $150,000. That's 50 cents extra per date.
If I remember my history right, it was one of the Apple developers who, during the design process, cut two digits from the year in order to save memory. Since they were trying to make a consumer product (or at least a consumer-priced product), even more work went into minimizing cost. This was also in the 70s. The IBM PC just copied that system.
Today, it seems utterly stupid not to double it. Just understand that today's machines have literally thousands of times more RAM than they used to. RAM was a genuinely scarce resource, where every bit counted. Nobody was thinking of year 2038; they were thinking of how this would improve performance right here and now. Also, at the time, computers were mostly big mainframes, used pretty much only by specialists and geeks. So they probably thought it would be easy to fix when the issue arose.
That they were mostly unable to foresee the explosion of a) the personal computer, b) the Internet and c) embedded electronics before the end of the millennium is hardly something you can blame them for. It wasn't really until near the very end of the millennium that people understood the magnitude of the task. That's why the Y2K problem became such a big thing; if it had been some mainframes in a computer lab somewhere, no one would have lifted an eyebrow. It was the "computers are everywhere, in all offices, they'll all go to hell, and society will collapse" angle that made it big. At which point it was a little late to fix the underlying format, since so much code relied on it being that type.
So if there was a window, it would have been somewhere in the middle... after memory got "cheap", but before too much code started to rely on the old format. It's hard to say if such a period even existed. And even if it did, I'm sure 99.9% of people would have wondered why the fsck you need to fix that issue NOW. The old system works, and will keep working for many years yet (like, long past the lifetime of this PC -- we can fix it in the next one). Don't knock it if it works, right? So I find it quite natural that it turned out the way it did, no matter how smart people were.
Kjella
Re:You, sir, are incorrect (Score:2, Insightful)
Let:
Y2K == "year two-thousand"
19100 == 2000
19101 == 2001.
You mean to tell me that the "year two-thousand" began on January 1, 2001?
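Those mappings come from the classic display bug: struct tm's tm_year field counts years since 1900, so code that glues a literal "19" in front prints 19100 for the year 2000. A minimal C illustration:

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        time_t now = time(NULL);
        struct tm *t = localtime(&now);

        printf("Buggy:   19%d\n", t->tm_year);        /* "19100" in the year 2000 */
        printf("Correct: %d\n", 1900 + t->tm_year);   /* add, don't concatenate   */
        return 0;
    }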
Re:subject (Score:3, Insightful)
Personally, at this point I would have more concerns about the stability of the recompiled system than about the one still using 32-bit time.
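For what it's worth, a sketch of the kind of sanity check one might run on such a recompiled system (assumes a C99 compiler for the %zu format):

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        printf("sizeof(time_t) = %zu bytes\n", sizeof(time_t));
        printf("time_t is %s\n", sizeof(time_t) >= 8
               ? "64-bit or wider (safe past 2038)"
               : "32-bit (wraps in January 2038)");
        return 0;
    }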