
You've Got 25 Years Until UNIX Time Overflows

CowboyRobot writes "In 25 years, an odd thing will happen to some of the no doubt very large number of computing devices in our world: an old, well-known, and well-understood bug will cause their calculation of time to fail. The problem springs from the use of a 32-bit signed integer to store a time value as a number of seconds since 00:00:00 UTC on Thursday, 1 January 1970, a practice begun in early UNIX systems with the standard C library data type time_t. On January 19, 2038, at 03:14:08 UTC, that integer will overflow. It's not difficult to come up with cases where the problem could be real today. Imagine a mortgage amortization program projecting payments out into the future for a 30-year mortgage. Or imagine those phony programs politicians use to project government expenditures, or demographic software, and so on. It's too early for panic, but those of us in the early parts of our careers will be the ones who have to deal with the problem."

  • those phony programs politicians use to project government expenditures

    The programs are real, even if the math may be phony.

  • Re:Not NetBSD (Score:4, Insightful)

    by Beezlebub33 ( 1220368 ) on Tuesday January 22, 2013 @11:07AM (#42656811)
    64-bit is already taking over everywhere. In 10-15 years, you will have a hard time finding a 32-bit computer. (Actually, you might have a hard time finding a 'computer' at all at some point, but that's a whole other issue.) With the change to 64 bits, more and more OSes will operate on 64-bit time, and this will not be an issue.
  • by bill_mcgonigle ( 4333 ) * on Tuesday January 22, 2013 @11:09AM (#42656855) Homepage Journal

    Was this a USENET post from '94? Mortgage systems using 32-bit time_t (if there ever were any) failed 5 years ago for 30-year mortgages. We did not hear an earth-shattering kaboom.

  • by JcMorin ( 930466 ) on Tuesday January 22, 2013 @11:12AM (#42656879)
    Having a 64-bit machine will not solve this at all. The problem lies in the software: if a 32-bit time structure has been used, then no matter what computer it's running on, it will still be using only 32 bits to store the time. It will bust. I can see a whole bunch of old programs still running because "it works", and companies, especially big companies, do not rewrite their software for 64 bits.
  • by kenny603 ( 1754842 ) on Tuesday January 22, 2013 @11:15AM (#42656923)
    In terms of personal computers, this probably isn't going to mean much by the time it happens. The problem is that there are many, many embedded devices out in the field that are still running with 32-bit time_t vars. Example from a previous job of mine: we made building automation controllers. Those devices are all still out in the field, and they are all still running old code. Assuming the devices don't die before the time rollover and are not replaced with newer hardware, it might become a problem.
  • by h4rr4r ( 612664 ) on Tuesday January 22, 2013 @11:17AM (#42656963)

    64bit time will solve it for anything that uses time from the system directly. Meaning all those perl and bash scripts that run so much stuff no one thinks about, from billing to transferring data between companies to scheduling your next vacation are going to be fine.

    What will not be fine is crusty old compiled code that was 32-bit. Hopefully it is just a recompile away from being fixed. Sadly, that is likely not to be the case, as corporations generally do not have that kind of foresight.

  • by locofungus ( 179280 ) on Tuesday January 22, 2013 @11:33AM (#42657215)

    Recompiling isn't always that simple.

    You'd be amazed how many people write code that depends on the fact that sizeof(long) == sizeof(int) == sizeof(void*) == sizeof(time_t) == 4 even when they don't need to, and structures are often mapped directly onto binary data, either from disk or the network.

    I don't actually imagine that 2038 will be much of a problem - most of the issues that will be triggered by the above assumptions will occur between now and then and will be fixed as they occur.

    Then 2038 will loom and there will be a big drive to fix everything (else), the magic time will occur and there will be little more than a whimper. Then everyone will complain about the hype about a non-existent problem.

    I am quite looking forward to having the option of some lucrative consulting income in my early retirement should I decide I need it. :-)

    Tim.

  • by Anonymous Coward on Tuesday January 22, 2013 @11:37AM (#42657267)

    History shows us that it's not biased in the slightest to assume that politicians will lie, cheat, and steal their way to riches. Giving them the benefit of the doubt is like Charlie Brown giving that field goal one more shot because maybe, just maybe, Lucy won't pull the ball this time.

  • Re:Not NetBSD (Score:5, Insightful)

    by dingen ( 958134 ) on Tuesday January 22, 2013 @11:57AM (#42657561)

    The problem is not so much in computers still being 32 bits in 25 years time, but 32 bit computers right now doing calculations involving dates 25 years in the future.

  • Re:Not NetBSD (Score:1, Insightful)

    by eek_the_kat ( 249620 ) on Tuesday January 22, 2013 @12:17PM (#42657747)
    Except that it won't happen for 25 years... FEAR, UNCERTAINTY, CHANGE... RUN!!!!! THE SKY IS FALLING!! (in 25 years)... With the acceleration of development that has been occurring over even the last 10 years, I highly doubt there will be much to worry about 25 years from now.
  • Re:Not NetBSD (Score:4, Insightful)

    by Z00L00K ( 682162 ) on Tuesday January 22, 2013 @12:21PM (#42657779) Homepage Journal

    If we even are alive by then.

  • Re:Not NetBSD (Score:2, Insightful)

    by davydagger ( 2566757 ) on Tuesday January 22, 2013 @12:28PM (#42657837)
    "That's not how it works, you dumb shit"
    actually, that is how it works, you dumb shit.
    https://en.wikipedia.org/wiki/Unix_time#Representing_the_number

    read, and get edumicated before you post!

    " time_t has been widened to 64 bits. In the negative direction, this goes back more than twenty times the age of the universe and in the positive direction for approximately 293 billion years."

    So, given that all modern hardware is 64-bit, just about all modern OSes have the option of the same. So you run a 64-bit OS on 64-bit hardware, and the problem is solved. 64-bit has been the standard for the last 5 years. Linux has had 64-bit support since 2005, and for the past 3 years, 64-bit has been the recommended install.

    TODAY, 32-bit UNIX systems are legacy, both in hardware and software. There are 64-bit drop-in solutions for just about everything. The way time works in UNIX, a simple recompile against 64-bit libraries with a 64-bit system clock will fix the program.

    I cannot see a computer system made today (guaranteed to be 64-bit) still being in use in 25 years.

    1. Cars will be 5 years past QQ plates; rebuilding all-new custom aftermarket electronics from scratch will be an option for collectors. No one else is going to care.

    2. Airplanes generally retire after around 10 years. There is no reason to expect 25-year-old airplanes to still be flying, without of course many, many major overhauls, including electronics.

    3. IBM only supports mainframes for around ~10 years. Oh, and mainframes were 64-bit before anything else. UNAFFECTED.
    http://www-03.ibm.com/support/techdocs/atsmastr.nsf/WebIndex/TD105503

    4. The useful life of a desktop computer is around 5-7 years. It might get a second life doing "projects" that don't require modern CPU power, giving it another 7-10 years as legacy hardware handling basic tasks.

    This covers any machine you'd likely use for accounting, business, or any sort of complicated financial transactions, whether it sits on a desk or in a server room. Please note, mainframes and most minicomputers went 64-bit in the 1990s. x86 and ARM are the last to do so.

    After 15-20 years, they have passed the threshold from "legacy" into "obsolete" and "unsupported". 20 years ago was 1993:
    https://en.wikipedia.org/wiki/P6_%28microarchitecture%29

    before the P6.

    Now let's look even further back, 25 years ago: 1988
    http://www.computerhope.com/jargon/num/80386.htm

    the 386, just recently retired from mainline Linux kernel support.
  • by Anonymous Coward on Tuesday January 22, 2013 @12:31PM (#42657865)

    those phony programs politicians use to project government expenditures

    The programs are real, even if the math may be phony.

    The math is real, even if the inputs are phony.

  • by stoborrobots ( 577882 ) on Tuesday January 22, 2013 @12:37PM (#42657923)

    If you have a 64 bit PC and OS, it should be little more than a recompile.

    A number of comments have claimed this: recompile with 64-bit time_t and the problem goes away. Unfortunately, for many apps it's not quite as simple as that.

    Certainly, for those apps which only deal with time information internally, and in a transient fashion, this will be sufficient to eliminate the problem.

    However any program which persists UNIX timestamps in files, or sends UNIX timestamps as part of a networking protocol, or basically anything which sends the data structure outside the application is still going to require work on how to handle the migration.

    What happens if your app is required to talk on the network? If there are few enough machines involved, then sure, you can upgrade all of them at once, but if it's a large or distributed operation, there needs to be a transition plan. How will older clients and newer clients interoperate?

    If data is saved, how will the recompiled application interpret old files? Does it need a way to distinguish them? Can old data be automatically converted? Are there cases where old data may be compromised? (e.g. those 30 year mortgages probably have the wrong end-date stored...) How will we handle those cases?

    What about situations where time information is used to prime other calculations - is it okay to affect those other calculations? Serial numbers, PRNGs, UUID generation, etc - the situation has to be assessed. Many cases might be fine, but you can't make a blanket statement for all cases, so each calculation must be vetted.

    In the end, the result is that there is work to do. Much of it will be easy to fix, but the assessment still needs to be done.

  • by i_ate_god ( 899684 ) on Tuesday January 22, 2013 @12:44PM (#42658027)

    You would think that if all politicians cared about was their own greed, they'd be far better off than they are now, no?

  • by Anonymous Coward on Tuesday January 22, 2013 @01:01PM (#42658231)

    $0.5M to be air-tight sure that a simple 20 lines of Pascal code doesn't crash one (or more) of the hundreds of A-10s in active service, each one worth $12+M, not to mention keeping those pilots safe.

    Yes. Definitely our government dollars at work.

  • by interval1066 ( 668936 ) on Tuesday January 22, 2013 @01:05PM (#42658279) Journal

    you don't need a 64-bit system to deal with 64-bit numbers - it's just dealing with them is a lot slower as the compiler emits library calls to perform the arithmetic...

    A lot slower? Nah. Negligible, at worst. It'll be fine.

  • by jnork ( 1307843 ) on Tuesday January 22, 2013 @01:31PM (#42658581)

    If you're doing so many 64-bit calculations that it's seriously impacting the performance of your 32-bit machine, then either you don't care or you need to upgrade to a 64-bit machine.

    I'm sure you can come up with a scenario where that's not possible, but I see no reason to solve that problem until and unless it comes up. Meantime, I do all my work on 8-bit processors and laugh at your 64-bit woes (somewhat hysterically).

  • by dkleinsc ( 563838 ) on Tuesday January 22, 2013 @01:47PM (#42658793) Homepage

    History shows us that it's not biased in the slightest to assume that politicians will lie, cheat, and steal their way to riches.

    What is biased is to assume that the non-elected civil servants who actually do the math will not attempt to do their jobs to the best of their ability. For example, the staff at the Congressional Budget Office who actually do the math do not directly report to any politician. Similarly, the civil service laws exist specifically to prevent someone from, say, being fired by NOAA for coming up with the "wrong" climate data, or fired from the SSA for coming up with the "wrong" Social Security budget projections.

    So you're right to assume that politicians lie, cheat, and steal. But that doesn't mean that a GS-11 actuary working in the bowels of a government agency is lying in his reports.

  • by InterGuru ( 50986 ) <(jhd) (at) (interguru.com)> on Tuesday January 22, 2013 @03:04PM (#42659749)

    We denigrate politicians because they lie, but candidates who tell the truth do not get elected.
