
COBOL Turning 50, Still Important

Posted by Soulskill
from the legacy-code's-legacy dept.
Death Metal writes with this excerpt from a story about COBOL's influence as it approaches 50 years in existence: "According to David Stephenson, the UK manager for the software provider Micro Focus, 'some 70% to 80% of UK plc business transactions are still based on COBOL.' ... Mike Gilpin, from the market research company Forrester, says that the company's most recent related survey found that 32% of enterprises say they still use COBOL for development or maintenance. ... A lot of this maintenance and development takes place on IBM products. The company's software group director of product delivery and strategy, Charles Chu, says that he doesn't think 'legacy' is pejorative. 'Business constantly evolves,' he adds, 'but there are 250bn lines of COBOL code working well worldwide. Why would companies replace systems that are working well?'"

  • by Samschnooks (1415697) on Saturday April 11, 2009 @11:26AM (#27542181)

    Why would companies replace systems that are working well?

    Because the director of IT or whatever his title is will want to be able to put on his resume that HE moved a company from a "Legacy" and "Outdated" system to a modern "web based solution" that enables "greater productivity" among the workforce saving "millions of dollars". Now, he can put that on his resume and go for the CTO, CIO, or whatever jobs.

    I've seen it and it works.

    • by Jurily (900488) <jurily.gmail@com> on Saturday April 11, 2009 @11:32AM (#27542221)

      Because the director of IT or whatever his title is will want to be able to put on his resume that HE moved a company from a "Legacy" and "Outdated" system to a modern "web based solution" that enables "greater productivity" among the workforce saving "millions of dollars". Now, he can put that on his resume and go for the CTO, CIO, or whatever jobs.

      If and only if they're able to pull it off. It's also a nice way to end your career if you fail.

      • by trash eighty (457611) on Saturday April 11, 2009 @11:54AM (#27542369) Homepage

        No, if you fail you just get promoted out of harm's way.

      • by plopez (54068) on Saturday April 11, 2009 @11:58AM (#27542391) Journal

        If and only if they're able to pull it off. It's also a nice way to end your career if you fail.

        no, you collect a bonus and bail out before it crashes in flames leaving someone else holding the bag. See also the bank failures for examples of this. See a pattern? I hate MBAs.

        • by nurb432 (527695) on Saturday April 11, 2009 @01:47PM (#27543185) Homepage Journal

          The last CFO I worked for did that. He was hired to 'rescue' the company, but instead he ruined it and fired most of us who had stuck with the company even during the dark times. Then he left, with a bonus. (I hear he destroyed the next one too.)

          A year later the company filed for bankruptcy.

          In reality it was the best thing for me personally; it got me out of the slowly dying industry I had been in for 20 years, just before the collapse. But the way it went down was just wrong.

        • by blind biker (1066130) on Saturday April 11, 2009 @02:08PM (#27543361) Journal

          This is not necessarily tied to MBAs as much as it's tied to corporate psychopaths - the ones most likely to succeed in the modern chaotic corporate world (especially in larger publicly-traded companies). They will lie, manipulate and use everything and everybody to further their own interests. If the company, its workers, or even the economy of the whole USA suffers, they won't care one bit - no conscience.

      • If and only if they're able to pull it off. It's also a nice way to end your career if you fail.

        Just change companies before the failure becomes apparent. Then you prepared it all so well, and your successor just screwed it all up.

    • by fuzzyfuzzyfungus (1223518) on Saturday April 11, 2009 @11:34AM (#27542239) Journal
      I'm guessing that this story involves enough Java to float a battleship, but not quite enough to keep the interface responsive...
    • Re: (Score:2, Interesting)

      Well, yes, that could be part of it. The other part of it is that in smaller shops you have a smaller talent pool to pick from, and getting COBOL programmers isn't easy, especially ones who are also proficient with some of the modern programming languages and web development. I'm not saying it can't be done, but it's harder to find those folks. Plus, COBOL programmers are usually older, have much experience in the industry, and consequently command a higher salary. Sticking with the "more modern" technologies in my
      • COBOL not quite dead (Score:3, Interesting)

        by BrokenHalo (565198)
        Quick disclaimer: IAACP (or rather I was in an earlier life, when I had to be).

        COBOL's main drawback was never any real technical issue. It is simply that it can be very tedious to write. Having said that, once you understand it, it is also a very easy language to code in. But there's certainly no reason to throw away the code just because it isn't trendy any more.

        Having said that, I do remember working on a site in the UK back in the '80s where the COBOL source to some important routines had long since disa
    • by kilodelta (843627)
      So very true. IT Directors always try to put their stamp on things. I should know, I was an I.T. Director once.
    • Re: (Score:3, Interesting)

      by jlp2097 (223651)

      Your answer, while maybe true in some rare cases, just shows the ignorance of some IT people to business decisions. Maintaining such an old piece of technology might actually result in higher costs than developing a new system based on current technology. Why?

      1. People. Probably the most important reason. Cobol is not exactly a widely taught language. People knowledgeable in Java, C#, PHP and other languages are younger, much easier to find and thus cheaper. People with real solid Cobol experience are actual

    • Sure, greater productivity is one benefit, but the language is completely irrelevant for that.

      It's about how flexible the system will be when you have to change it. And you will -- that's the whole point of software, that it is soft, and changeable.

      Old Cobol apps generally are not flexible [msdn.com]. (stolen from this comment [slashdot.org]). It's worth mentioning that a decent object-oriented system would've gone a long way towards eliminating this problem -- any idiot can stuff a date into a Date class, which then encapsulates al

      • It's about how flexible the system will be when you have to change it. And you will -- that's the whole point of software, that it is soft, and changeable.

        Not entirely. Software allowed one to perform functions that were not practical to do in hardware.

      • by ralphdaugherty (225648) <ralph@ee.net> on Saturday April 11, 2009 @03:40PM (#27544021) Homepage

        * I apologize for the profanity, but any program that can't change a fucking constant is a broken program. Or did they copy/paste 6.55 all over the place?

              Your comment is idiotic. Do you think there was only one change to the minimum wage in all the decades that that California payroll system was in production?

              There was a long thread on that system, and the real reasons that that California bureaucracy didn't want to modify the payroll system at that time was covered in the thread.

              I also provided a solution in the thread that didn't require modifying the system. The California thing wasn't about technology, it was about politics.

              There, I've wasted some more of my life on this idiotic issue.

          rd

    • You have to understand where many of the COBOL programs are running and what their function is. They're not little shell scripts people wrote for fun. They usually power the core of large businesses and, depending on the business, could be responsible for millions or billions of dollars in transactions a day.

      Screwing with that, when everything is working fine is not a good idea. When Cobol programmers become even more rare it will become a more desirable job for programmers. I think places like ITT tech and De

      • Think of it this way. If Apple came out with a new iHeart would you have surgery to replace your perfectly functioning heart with an artificial one just so you can play mp3s?

        Apple's marketing department thought about that long and hard. Then they shrugged and acknowledged that the risk was too great. If they killed off their noisiest fanboys in a mishap, where would they be then? There's not much of a future selling iTunes off bottlecaps from Pepsi bottles.

  • by Anonymous Coward on Saturday April 11, 2009 @11:27AM (#27542189)

    Why would companies replace systems that are working well?

    So I can have a fucking job?

    • Re: (Score:3, Funny)

      by Extremus (1043274)
      I am willing to offer you a good one, but I couldn't find your address anywhere.
    • Re:Why replace it? (Score:5, Informative)

      by artor3 (1344997) on Saturday April 11, 2009 @11:43AM (#27542297)

      Learn COBOL, and you always will. My dad is a COBOL programmer for the NY state government. According to him, around 95% of their COBOL programmers are within 10 years of retirement. The youngin's (as he calls them) are in their mid to late forties.

      If you know COBOL, you are absolutely guaranteed a job there.

  • Cool (Score:5, Interesting)

    by bigsexyjoe (581721) on Saturday April 11, 2009 @11:37AM (#27542249)
    Does that mean my Java skill set is likely to keep me employed for the next 30 to 40 years?
    • by Ilgaz (86384)

      Well, Java stays for sure. You will just be blamed for not using whatever the fashion is. That is exactly what's happening to COBOL developers, whose code still runs unmodified after 30-35 years on some million-dollar state-of-the-art z/OS mainframe. I know a very big bank's very high-level admin, and the stories he told me were amazing.

      People should look beneath their shiny, fashionable development tools running OS X and they will see UNIX, with exactly the same principles as when it was invented in the 1970s. Thing calls its shells "tty", if one looks up where that comes from, he will be horrified.

      • by WillKemp (1338605)

        Thing calls its shells "tty", if one looks up where that comes from, he will be horrified.

        Anyone reading slashdot who doesn't already know where "tty" comes from should hand their geek card in at the door on the way out.

        Some of us have even used them!

    • Not exactly. You have to wait 30 years until the Java herd is thinned out.

  • by Just Some Guy (3352) <kirk+slashdot@strauser.com> on Saturday April 11, 2009 @11:40AM (#27542273) Homepage Journal

    Before I start, I know all too well that you can write good or terrible code in any language. Still, most COBOL code I've seen is written to run well on hardware that is no longer relevant. A recent experience with an ex-coworker illustrated this pretty well for me:

    Said fellow, call him "Joe", had about 30 years of COBOL experience. We're a Python shop but hired him based on his general coding abilities. The problem was that he wrote COBOL in every language he used, and the results were disastrous. He was used to optimizing for tiny RAM machines or tight resource allocations and did things like querying the database with a rather complex join for each record out of quite a few million. I stepped in to look at his code because it took about 4 hours to run and was slamming the database most of the time. I re-wrote part of it with a bit of caching and got the run-time down to 8 seconds. (Choose to believe me or not, but I'd testify to those numbers in court.) I gave it back to him, he made some modifications, and tried it again - 3 hours this time. I asked him what on Earth he'd done to re-break the program, and he'd pretty much stripped out my caching. Why? Because it used almost half a gig of RAM on his desktop, and he thought that was abhorrent.

    Never mind that it was going to be run on a server with 8GB of RAM, and that I'd much rather use .5GB for 8 seconds than 1MB for 3 hours of intense activity.

    So Joe isn't every COBOL programmer, but you and I both know that he's a lot of them. But back to the direct point, how much of that 250GLOC was written with the assumption that it'd be running on 512KB machines or with glacial hard drives or where making the executable as tiny as possible was an extreme priority? Doing things like storing cache data in hash tables would've been obscenely expensive back in the day, so those old algorithms were designed to be hyper-efficient and dog slow. Whether you think that constitutes "working well" is up to you.
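    The anti-pattern in that story - one database round-trip per record instead of caching a small lookup table - can be sketched in Python (the shop's own language). The schema, table names, and data below are invented for illustration; the comment only describes the access pattern, not the real system.

```python
import sqlite3

# Hypothetical lookup table; the real system is not described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rates (region TEXT PRIMARY KEY, rate REAL)")
conn.executemany("INSERT INTO rates VALUES (?, ?)",
                 [("north", 1.1), ("south", 0.9)])

records = [("north",), ("south",)] * 5_000  # 10,000 rows to process

def per_record_queries(records):
    """One database round-trip per record -- the pattern described above."""
    total = 0.0
    for (region,) in records:
        (rate,) = conn.execute(
            "SELECT rate FROM rates WHERE region = ?", (region,)).fetchone()
        total += rate
    return total

def cached_lookup(records):
    """Read the small lookup table once, then hit an in-memory dict."""
    cache = dict(conn.execute("SELECT region, rate FROM rates"))
    return sum(cache[region] for (region,) in records)

# Same answer; the cached version trades a little RAM for far fewer queries.
assert abs(per_record_queries(records) - cached_lookup(records)) < 1e-6
```

    The cache costs memory proportional to the lookup table, which is exactly the trade-off "Joe" objected to.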

    • by thethibs (882667) on Saturday April 11, 2009 @11:53AM (#27542363) Homepage

      Nice story, but it doesn't say anything about COBOL.

      I have a similar story about 30 programmers who spent two years writing Java code and delivering nothing useful because the requirement called for two different architectures: one best served with a batch system, the other best served with a real-time system. What they needed was COBOL and C, but what they knew was Java and Struts. It's been another four years since I ran screaming from the building and they still haven't delivered anything useful.

      Inept programmers will screw things up in any language.

      • by Just Some Guy (3352) <kirk+slashdot@strauser.com> on Saturday April 11, 2009 @12:15PM (#27542491) Homepage Journal

        Nice story, but it doesn't say anything about COBOL.

        I think it says a lot about The COBOL Way, or at least the way things were done when those ancient codebases were being written.

      • by chthon (580889)

        The problem with Java is that it is overhyped and sold to business types as the silver bullet. As Fred Brooks said more than 20 years ago: THERE IS NO SILVER BULLET (yelling intentional)! Is Java better than Cobol? Probably not, as Java is also an imperative language. Do not fool yourself: object-oriented languages of the SIMULA type (and Java is not better than Simula, just the wheel reinvented) are nothing more than an imperative language with a small layer for dispatching on type, be it in the com

        • by k.a.f. (168896)

          Is Java better than Cobol ? Probably not, as Java is also an imperative language.

          Granted, there is hype, and programming is intrinsically difficult. That said, if you believe that language designers have learnt no lessons whatsoever within 40 years, you are delusional.

          • by chthon (580889)

            That is true, but I do not think they implemented those lessons in Java, even though people like Richard P. Gabriel and Guy L. Steele helped develop and implement the language.

    • by TheRaven64 (641858) on Saturday April 11, 2009 @12:14PM (#27542487) Journal

      You see that kind of thing from lots of programmers who only know one language well. This is why a good programmer always keeps up with modern architectures. I've seen C programmers who put things in globals rather than passing them on the stack, because that was faster before caching (now it breaks locality of reference, and moves something that was in a register or on the stack to an indirect reference where it needs extra loads and causes extra cache churn).

      I've seen programmers who grew up with Pascal carefully turning multiplies into sequences of adds and shifts. Great, except that something like the Athlon can do two multiplies in parallel, but only one shift at a time (because most code contains a lot more multiplies than shifts), stalling the pipeline.

      Another common issue is aggressively testing for potential special cases for optimising, ignoring the fact that branches are very expensive on most modern CPUs and the cost of the checks is now often greater than the saving from shortcutting the special cases.

      Java programmers are not immune to this, and often optimise based on old versions of the JVM. One favourite is to add finally everywhere, making the code very rigid, believing this makes it faster. In a modern JVM, finally is completely ignored; the VM already knows if a class is not subclassed and will do the same optimisations whether it is declared finally or not.

      There's a reason why the rules for optimisation are:

      1. Don't.
      2. Don't yet (experts only).

      If you write good algorithms, your compiler will usually produce reasonable code. If this isn't fast enough, then make sure you really understand how your VM and target CPU work, before you try optimising. The experts only part isn't a joke.
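      The "don't optimise yet" rule implies measuring first. A minimal sketch with Python's standard-library profiler (the function names and workloads here are invented to make the point):

```python
import cProfile
import io
import pstats

def suspected_hotspot(n):
    # Where intuition says the time goes.
    return sum(i * i for i in range(n))

def actual_hotspot(n):
    # Where the time actually goes: a hidden nested loop.
    return sum(i for i in range(n) for _ in range(i % 100))

def program():
    suspected_hotspot(10_000)
    actual_hotspot(10_000)

# Profile first; let the data, not intuition, name the culprit.
profiler = cProfile.Profile()
profiler.enable()
program()
profiler.disable()

stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(10)
report = stream.getvalue()  # the table names actual_hotspot near the top
```

      Only after a report like this fingers a real hotspot does step 2 ("experts only") come into play.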

      • by david.given (6740) <dg AT cowlark DOT com> on Saturday April 11, 2009 @01:11PM (#27542901) Homepage Journal

        One favourite is to add finally everywhere, making the code very rigid, believing this makes it faster.

        I think you mean final here, no? finally does something else.

        1. Don't.
        2. Don't yet (experts only).

        Very true.

        I'd also add the additional rule: You don't know it's slow until you've benchmarked it. All too often I have seen people (and I should add that I've perpetrated this myself) spend ages painstakingly optimising parts of a system that felt like they were causing speed problems, when actually they weren't.

        I once spent about two or three months designing and implementing a clever caching system for properties in a UI framework. It was subtle and complex, and I was very proud of it, and it was a total waste of time. We eventually threw it all away and stored the properties in an AVL tree. Yes, this involved looking up the properties in the inner loops of all the UI component redraw methods, but compared to the amount of work of touching every pixel on the screen, this was trivial. And it increased flexibility, reduced complexity and code size, and vastly improved maintainability.

        • by TheRaven64 (641858) on Saturday April 11, 2009 @01:30PM (#27543055) Journal

          I think you mean final here, no? finally does something else.

          Yes, sorry, final.

          And it increased flexibility, reduced complexity and code size, and vastly improved maintainability.

          Reducing code size is important for three reasons:

          1. It means that a single human can understand more of the program at a time.
          2. It means that there are fewer places to look for bugs.
          3. It reduces instruction cache usage.

          This last part is something I should probably have mentioned in the last post. Instruction cache churn is one of the biggest performance killers on modern CPUs. It is particularly instructive to compare C++ templates and Objective-C. In C++, your compiler will generate a new set of functions for every template instantiation (well, not quite, but close enough). In Objective-C, it will only emit one copy of a method and then use run-time lookups to determine how things work. The Objective-C solution to the problem is slower. You can generate lots of microbenchmarks that prove that it's slower. Except, somehow, the C++ program ends up being slower overall, because the Objective-C program can keep most of the code in cache, while the C++ program is constantly thrashing the instruction cache. If you run a couple of programs concurrently, each one gets even less cache usage, so you end up spending even more time waiting for main memory.

          This is almost a corollary to your comment: it isn't slow until you test it in place. On a modern computer there are so many potential sources of performance issues that you can't always take a microbenchmark and generalise it. There are lots of cases where one option is slower when you do it once but faster when you do it a few million times, or faster when you do other seemingly-independent things. Microoptimisations are no longer a good way of ensuring that an entire program runs quickly.

        • by Unoti (731964) on Saturday April 11, 2009 @06:48PM (#27545095) Journal

          You don't know it's slow until you've benchmarked it.

          This is similar to a mantra of continuous improvement: "In God we trust. All others must bring numbers." The point of that saying is that if you want to make a change to the process, you must support your assertions with real metrics.

      • One favourite is to add finally everywhere, making the code very rigid, believing this makes it faster. In a modern JVM, finally is completely ignored; the VM already knows if a class is not subclassed and will do the same optimisations whether it is declared finally or not.

        Pretty sure you mean "final" - as far as I know, "finally" is only used in exception handling. Optimizations aside, "final" is still useful for making a statement about the intended terminal status of a class in the hierarchy.

      • by Pig Hogger (10379)

        I've seen programmers who grew up with Pascal carefully turning multiplies into sequences of adds and shifts. Great, except that something like the Athlon can do two multiplies in parallel, but only one shift at a time (because most code contains a lot more multiplies than shifts), stalling the pipeline.

        This is silly. With today's optimizing compilers, there is no longer any need for such antics.

        And even back then, it was already silly, because "premature optimization is the root of all evil [c2.com]".

      • by McSnarf (676600) * on Saturday April 11, 2009 @02:15PM (#27543413)
        Hmmm...

        This revived some slightly old memories.
        I remember a talk by the local FORTRAN compiler guru in the mid-70s.

        After talking about some intricacies of the IBM FORTRAN H compiler, he gave some examples of the compiler's abilities. Summarizing it with: Don't try too hard to optimize. Leave it to the compiler. It usually knows what it is doing.

        And that sums it up rather nicely.
        I'd rather work on code written by someone else who concentrated on writing readable code than on code written by someone trying to be clever.

        (Note to people born after around 1980: Yes, we too believed it would be cool to write cool, complex code, with the odd assembler routine thrown in for good measure (which, btw, didn't really save much time), badly documented and demonstrating our superior coding abilities. Looking back, we were idiots who should have been fired. Don't repeat our mistakes, OK?)

      • by mcrbids (148650) on Saturday April 11, 2009 @04:43PM (#27544409) Journal

        There's a reason why the rules for optimisation are:

              1. Don't.
              2. Don't yet (experts only).

        If you write good algorithms, your compiler will usually produce reasonable code. If this isn't fast enough, then make sure you really understand how your VM and target CPU work, before you try optimising. The experts only part isn't a joke.

        Except that there's a clear and definite time to optimise - when performance is in the crapper!

        Just 2 days ago, I heard complaints about a well-written (but very old!) report that was still taking as long as 5 minutes to run when accessing a large data set. Taking a look, I found that it was using an old template system that did its output via regexes, and for a complex report the regexes were killing the server. So I replaced it with a different template engine using simple string replacement, which cut the > 5 minutes of reporting time to about 15 seconds. Further investigation found that a simple 3-column index in the database cut the time down to 2 seconds.

        Now the report comes up INSTANTLY in most cases.

        Optimising is necessary - but only AFTER the fact!
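        A toy version of that swap, with a made-up placeholder syntax (the comment doesn't name the actual template engines, so both renderers below are illustrative):

```python
import re

template = "Dear {name}, your balance on {date} is {amount}.\n" * 200
values = {"name": "Alice", "date": "2009-04-11", "amount": "59.97"}

def render_regex(tpl, vals):
    """Regex-based substitution: a full pattern scan per placeholder."""
    out = tpl
    for key, val in vals.items():
        out = re.sub(r"\{" + re.escape(key) + r"\}", val, out)
    return out

def render_replace(tpl, vals):
    """Plain string replacement: same output, much less machinery."""
    out = tpl
    for key, val in vals.items():
        out = out.replace("{" + key + "}", val)
    return out

# Identical output; for large reports the regex engine does far more work.
assert render_regex(template, values) == render_replace(template, values)
```

        When the placeholders are fixed literal strings, the regex machinery buys nothing, which is why the simpler engine won.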

      • by linuxrocks123 (905424) on Saturday April 11, 2009 @06:49PM (#27545103) Homepage Journal

        > I've seen programmers who grew up with Pascal carefully turning multiplies into sequences of adds and shifts. Great, except that something like the Athlon can do two multiplies in parallel, but only one shift at a time (because most code contains a lot more multiplies than shifts), stalling the pipeline.

        The claim that strength reduction of multiplication to bit shifting might no longer be valuable on modern CPUs surprised me, so I looked up the instruction latency and throughput for the Core 2. To see for yourself, download http://download.intel.com/design/processor/manuals/248966.pdf [intel.com] and look at Appendix C.

        The timings show that integer multiply, IMUL, is still quite a bit slower than the bit shift instructions (SHL/SHR). So, turning multiplies into adds and shifts is still a Good Thing.

        However, these days, a compiler can do this type of thing for you much of the time, so doing the transformation by hand is usually not necessary. Also, since optimizing the source code in this way does take time and decreases source readability, the only cases where people should really still be doing manual strength reduction are in performance-critical code where the compiler, for whatever reason, is missing the optimization opportunity.

        ---linuxrocks123
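        For concreteness, here is the transformation being debated, shown in Python (the payoff in question is at the machine-instruction level; this only illustrates that the rewrite is exact):

```python
def times_ten(x: int) -> int:
    # 10*x = 8*x + 2*x, i.e. (x << 3) + (x << 1): the classic
    # shift-and-add strength reduction for a constant multiplier.
    return (x << 3) + (x << 1)

# The rewrite is exact for all integers, including negatives
# (Python's << is an arithmetic shift on negative values too).
assert all(times_ten(x) == 10 * x for x in range(-1000, 1001))
```

        This is precisely the kind of rewrite a modern compiler will perform, or decline to perform, on its own when it knows the target pipeline.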

    • by mwvdlee (775178)

      You might want to ask what Joe did in between that 1 year of COBOL and you guys hiring him 29 years later.

      I've been an IBM mainframe programmer (COBOL amongst other languages) and will tell you that I've never had to worry about memory much. Sometimes a job would require more than the standard 4GB allocated to it, but then you'd just set it to use 8GB, 16GB or whatever. Typical "irrelevant" mainframes (not just IBM's) measure total RAM in TB nowadays, not GB.

      Modern mainframe programmers are used to aggressively

      • Re: (Score:3, Insightful)

        by Just Some Guy (3352)

        I'm sure you're right for new code written today. I am equally sure that programmers 30 years ago did not have that luxury. Seriously, a 4GB standard allocation? An IBM 4341 from 1979 had up to 16MB of RAM [wikipedia.org]. The article quotes people saying that code 30-50 years old still works well, but I would posit that code that old was probably written with assumptions that are no longer sensible today, and at the very least should be re-evaluated instead of being allowed to blindly accrete new layers.

  • by Anonymous Coward

    COBOL's still around, will perl last as long?

    For instance there is now OO COBOL but the only people that use it are COBOL programmers who are stuck, perhaps because of their company's dictates, perhaps by choice, with COBOL. In the same way perl may be heading towards irrelevance wrt "mainstream" language. I've written commercial perl in the past, it was a pain then and it's still a pain now. The thing is that now there are alternative languages in the same space (python, ruby etc., php for web side) that d

    • I agree with your assessment of COBOL, but lay off perl. It's cool, like jazz. It doesn't go out of style. COBOL is more like the US Mail service. Yeah, it works and is dependable and can't be replaced for everything, but generally time has passed it by. There are better ways to send information than mail. So maybe you don't get rich and famous by playing jazz or coding perl, but there is a certain timeless poetry to it.
    • Re: (Score:3, Insightful)

      by coryking (104614) *

      You know, I guess it depends. I wouldn't port an application from Perl to something else. But I'm not sure I'd base a new project on Perl either.

      There are some things about perl that could be fixed that might change my mind:

      1) Dump Perldoc and liberally rip from javadoc [sun.com] and XML comments [microsoft.com]. I know both of these got their start from perldoc, but perldoc needs to catch up.
      2) Make sure the IDE actually uses said docs. Once your IDE's IntelliSense sucks up your comments and uses them while you are typing in a

    • by Pig Hogger (10379)

      For instance there is now OO COBOL but the only people that use it are COBOL programmers who are stuck, perhaps because of their company's dictates, perhaps by choice, with COBOL

      Ha! After years of working in Pascal, I was assigned to work on the IBM mainframe, in COBOL. So I would label my procedure as:

      PROC-000
      PROC-001 yadda
      yadda
      PROC-002 yadda
      ...
      PROC-014 yadda
      PROC-999

      and call it as PERFORM PROC-000 THROUGH PROC-999.

      The old COBOL heads could not figure why I was doing that...

  • DOH! (Score:3, Funny)

    by Daswolfen (1277224) on Saturday April 11, 2009 @11:44AM (#27542305)

    That's what I get for learning FORTRAN in college rather than COBOL...

    at least I still have my mad HTML skills..

    oh wait... all websites are in FLASH or PHP now...

    DOH!

    Ok.. im going back to watch a movie on my Betamax or HD-DVD player....

    • Re:DOH! (Score:4, Interesting)

      by Brandybuck (704397) on Saturday April 11, 2009 @12:53PM (#27542777) Homepage Journal

      Believe it or not, there is a LOT of Fortran out there. I've run across quite a bit of it in the past few years. A lot of scientific, engineering and other number crunching apps were written in Fortran, and there's no reason to rewrite them just because they're thirty years old. The apps might have brand new GUI and visualization front ends, but deep in the heart there is some Fortran code encapsulating the domain specific math.

  • by conner_bw (120497) * on Saturday April 11, 2009 @11:45AM (#27542315) Homepage Journal

    YES WE COBOL!

  • Adequate Languages (Score:4, Insightful)

    by j. andrew rogers (774820) on Saturday April 11, 2009 @11:54AM (#27542375)

    COBOL is a perfect example of an "adequate" language, like most programming languages that are in common use. Adequate languages linger forever if there is a tool chain to support them because there is little in the way of economic rationale for replacing them.

    The reason adequate languages proliferate over the long term is that only inadequate languages get replaced, and "ideal" languages become hopelessly politicized by purists and ideologues, leaving the adequate, practical, boring languages as the sound business choice. It is a real-world example of perfect being the enemy of good enough, but for economic reasons good enough always wins.

    • Re: (Score:2, Flamebait)

      by isdnip (49656)

      Good point. COBOL is adequate.

      It's just not k3w1. Script kiddi3z don't use it. Hax04 d00dz don't like it. Boring grownups with real work to do use it, and they do real work with it. How uninteresting!

      What COBOL won't do well or at all -- write Linux device drivers, write 3D game engines, or overflow buffers and take control of somebody else's system in order to send spam. What it will do -- process lots of data, paying attention to every penny.

      C, on the other hand, is a miserable language for almost e

      • C isn't a great language for low-level programming either. One of the first languages I learned was PL/M. I forgot most of it, but recently revisited it. In comparison to PL/M, C looks like a toy language. PL/M has, for example, much better support for NUMA architectures (this in a language that Intel was pushing for the 8086). Variants of ALGOL were popular for low-level code for a little while, and supported things like concurrency / reentrant code a lot better than C does.
    • It's more than adequate because it natively handles money data in transparently correct ways. Other languages have to be bent to fit. Integer and double precision data simply do not meet the need, though these days with 64bit integers it is easier to warp integers to be usable.

      In addition, it let one define right-sized record fields on disk and tape (in a day when disks were small and very, very expensive).

      I'm glad I no longer write code in COBOL, but for many years it was the only choice possible for business.
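      The parent's point about native money types can be sketched outside COBOL. Here is a minimal Python illustration, with the decimal module standing in for COBOL's fixed-point PICTURE types (e.g. PIC 9(5)V99); the values are made up for the example:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.10 exactly, so cents drift:
# the classic reason currency code uses fixed-point decimal arithmetic,
# which COBOL provides natively.
float_total = 0.10 + 0.10 + 0.10
print(float_total == 0.30)           # False: accumulates rounding error

# Exact decimal arithmetic, analogous to COBOL's decimal fields:
dec_total = Decimal("0.10") + Decimal("0.10") + Decimal("0.10")
print(dec_total == Decimal("0.30"))  # True: every penny accounted for
```

      The same drift is why "warping" 64-bit integers to count pennies, as the parent notes, is the usual workaround in languages without decimal types.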

    • Cluestick: That hopelessly politicized language is NOT ideal, despite what the purists and ideologues say.

  • COBOL, not so bad (Score:4, Insightful)

    by Ancient_Hacker (751168) on Saturday April 11, 2009 @12:07PM (#27542437)

    As just one data point, a certain place replaced a COBOL app that ran with millisecond response times just fine on a 2-megabyte, 1 MIPS mainframe with a spankin' fresh Java app that ran about 2000 times slower on an 8-gig, 16-CPU, 800MHz, very expensive water-cooled mainframe.

    Now it could have been due to having a bunch of neophyte Java weenies doing the coding, but I'm just sayin', when there's three orders of magnitude in speed and 3.5 orders in RAM, there may be something significant about the implementation language.

    • Yeah, it's the coders, not the language. Java is one of the easiest languages in which to screw up the implementation while still having it "work". Java is flexible enough for bad coders to make a noose and hang themselves.
    • Now you see why IBM embraces Java.

  • by cryfreedomlove (929828) on Saturday April 11, 2009 @12:10PM (#27542451)
    When I was working on my CS undergrad degree, more than one professor told me that I was really limiting my job prospects because I declined to take the elective COBOL classes. I knew enough about COBOL then to know that I could not drag myself out of bed in the morning to make a living with it.

    The ranks of COBOL programmers out there are living drone-like lives without passion or joy. That's not for me. I code for the love of it.
    • ...living drone-like lives without passion or joy.

      Wow. Absolute judgment on the lives of others based on the programming language they use?

    • Those ranks are collecting a nice, secure paycheck each and every month and will do so for the next few decades.

      Coding for the love of it is marvellous, a joy without peer. But without a corresponding paycheck it is just vanity. Mental masturbation, if you will.

      I'd rather be a well-paid drone than a masturbator, wouldn't you?
    • by chthon (580889)

      Bah, you do not know what you are talking about. I wrote not only applications in COBOL, but even tools to help automate my compilations, and once even a remote login shell. Turing completeness applies to every language, and if your COBOL gives you access to your operating system's calls, then the sky is the limit.

    • Not a heroic life (Score:5, Interesting)

      by QuestorTapes (663783) on Saturday April 11, 2009 @02:17PM (#27543423)

      > I declined to take the elective COBOL classes.
      > I knew enough about COBOL then to know that I
      > could not drag myself out of bed in the morning
      > to make a living with it.

      COBOL also encourages "heroic" programming. At a shop I worked at, they were very disparaging of the new systems using relational databases, and proud of the mainframe COBOL stuff, because it never went down.

      If you don't count the 4 times in one year that hardware failures caused critical business systems to go offline for 2 days to 3 weeks at a time, while teams of COBOL coders used tools to manually delete and rebuild tables and indexes when various failures caused -extensive data corruption-.

      And the corrupt data caused by operator errors in nightly batch processing.

      And the recoding to fix a major financials bug that went undetected for 10 years until we compared the numbers on the external PC system.

      You see, that was "normal operations." By contrast, when a network failure occurred and the relational systems we built went offline for 45 minutes, and data recovery was "re-run the job", -that- was a disaster caused by the "sloppy PC programmers and their tinkertoy PC systems."

      COBOL's great stuff...if you like being paged at all hours to manually fix data.

  • Design (Score:2, Interesting)

    Why would companies replace systems that are working well?

    The question is not, 'Does the code work?', the question is, 'Does the code follow good design principles?' If the answer is 'No, the code is not well-designed', then maintenance can be very costly. I like the snippet from this [msdn.com] page:

    "I was once told by an instructor in a design patterns class that the Y2K problem wasn't caused by using 2 digits to represent 4. Instead, it was caused by doing so all over the place. While the reality of the situat
  • by 1c3mAn (532820) on Saturday April 11, 2009 @12:18PM (#27542507)

    Why is no one updating Cobol code? Because the skill to interact with other systems is disappearing.

    As a Mainframe Utilities Programmer I hear it from customers all the time: "We can't touch that system because the guy who wrote it retired." "System" here doesn't mean just the code, but also the server backend stuff, like the database design.

    I have heard stories of an IT department being a 10-man team. In the 80s, that team had everyone dedicated to maintaining the mainframes. Now, they still have 10 people, but only 1 person is there to work on the mainframe.

    So now you have code from the 70s that no one understands, running a mission-critical application, and you think the IT manager is going to touch it? He is praying it doesn't break on his watch, or he might get a call from the COO. Even if it breaks, it is better to patch it than rewrite it, because the database behind it is so vital to all the rest of the application that it can't be changed either.

    The issue mainly is that no one is teaching the old skills anymore. Skills that are still required, but really aren't 'sexy' for young college students to learn. Even the name "mainframe" has a grandfatherly connotation to it, while if people actually looked at the IBM Z Servers, they would see how high-tech these systems actually are.

    • by jimicus (737525)

      Even the name "Mainframe" has grandfather connotation to it while if people actually looked at the IBM Z Servers, one would see how high tech these systems actually are.

      If you want to be picky about it, it would be more accurate to say that everything else is low tech. Many of the latest big things on a boring bog-standard x86 server debuted on mainframes.

  • 1. COBOL programs were written so badly and confusingly that probably only the original programmer (he was also the original maintainer) was able to maintain them.

    2. The company was grabbed by the balls by the provider, who, along with the maintainer, kept on saying, "we can't assure that if you migrate this, all will work fine."

    and, since it was a "critical" application, we never migrated it. Managers never minded that, in a demo, I showed them the response with PostgreSQL was about 8 times faster.

    that is
  • Why replace? (Score:3, Insightful)

    by Jane Q. Public (1010737) on Saturday April 11, 2009 @12:29PM (#27542589)
    Hmmm. I suppose it could be that between the times that COBOL was developed (Grace Murray Hopper, FTW), and today...

    There is more to a language than just being Turing-complete. There is syntax and general usability, for example.

    I know that there are still jobs out there for COBOL programmers. And it makes me sad.
  • books? (Score:2, Insightful)

    Most COBOL books and tutorials are unavailable, out-of-print, or just plain gone.
     
    What resources still exist for someone who wants to learn COBOL?
     
    http://www.opencobol.org can easily be installed on Fedora Linux (for example) with a simple "yum install open-cobol", but what comes next?

  • Does this add up? (Score:5, Insightful)

    by Allicorn (175921) on Saturday April 11, 2009 @12:53PM (#27542787) Homepage

    How does this add up?

    1. Around a third of UK companies say they have at least one COBOL program somewhere in their enterprise.

    2. Around three quarters of all UK electronic business is coded in COBOL.

    I'm aware that there are allegedly pockets of COBOL here and there with some fairly significant nuclei of usage within certain business sectors but seriously... 80% of all electronic transactions?

    Monster.co.uk search for single keyword in title, 11th Apr 2009:

    Java: 173 hits
    C++: 142 hits
    PHP: 95 hits
    Perl: 39 hits
    COBOL: 1 hit

    This doesn't seem to suggest a great demand for COBOL coders at present, which, one would think, suggests little use of COBOL.

    I've heard this "the world secretly runs on COBOL" story countless times over my career, but seldom seen more than a few lines of COBOL actually deployed in businesses I've worked with. Is the whole thing just a weird industry myth?

    • Re:Does this add up? (Score:5, Interesting)

      by PPH (736903) on Saturday April 11, 2009 @01:23PM (#27542991)

      You are confusing the amount of development done in COBOL with the number of transactions handled by COBOL systems.

      In my experience, the reason that COBOL still hangs on is that management continues to pursue the philosophy of "patch it just one more time" rather than "port it to something more easily maintained". As time goes by and continual patching gets more difficult, more expensive, and riskier, the amount of development goes down. But the systems are still up and running.

      The ultimate in unmaintained systems was one at an outfit I used to work for, where they lost the source code. It's gone, never to be modified again. It was lost before the end of the last century and was suspected not to be Y2K compliant. The solution was to write and run a routine on the applicable database to bump all of the date fields back (20 years, I think). Then another small routine was written to post-process the output and add the time back. Management is grinning from ear to ear because, unlike all the other apps that cost millions to maintain, this one costs them nothing. As long as those responsible retire before 2020, that is.
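      The date-shifting trick described above can be sketched in Python with hypothetical helpers; the real pre- and post-processing routines would have done the equivalent against the database and its output. A multiple-of-28-years shift is the textbook choice because it preserves both leap years and weekdays, though the parent recalls 20 being used:

```python
from datetime import date

SHIFT_YEARS = 28  # multiples of 28 keep leap-year and day-of-week alignment

def shift_back(d: date) -> date:
    # Pre-processing: move dates to before 2000 so the legacy
    # two-digit-year logic keeps working.
    return d.replace(year=d.year - SHIFT_YEARS)

def shift_forward(d: date) -> date:
    # Post-processing: restore the true dates in the program's output.
    return d.replace(year=d.year + SHIFT_YEARS)

today = date(2009, 4, 11)
print(shift_back(today))                          # 1981-04-11
print(shift_forward(shift_back(today)) == today)  # True
```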

    • by jimicus (737525)

      If it's true at all, my guess is that it's in niche applications which haven't changed much in 30 years but - and here is the big but - process such huge volumes of data that they constitute 80% easily.

      In other words, it exists at the common points which most transactions are going to go through one way or another. Banks, for example.

    • by perlchild (582235)

      It's not a weird industry myth, but anyone who hires a COBOL Coder can't call it a Cobol job if they wanna migrate OFF Cobol either. It's a catch-22.

  • by slashdot_commentator (444053) on Saturday April 11, 2009 @12:54PM (#27542791) Journal

    It's not just the mythical "mission critical" aspect that keeps businesses dependent on COBOL. MANY of those programs required either financial analysts to "vet" the COBOL program, or lawyers to "vet" that the COBOL program complied with laws (privacy, methods of determination, etc.).

    Not only are you putting in the cost to refactor the program from scratch, not only are you risking a bug costing your company hundreds of millions to billions of dollars, but you also have to take in the costs of expensive NON-programmers to "bless" the new program.

    Then also realize that the new whizbang technologies like SQL and Java will RUN LIKE A DOG compared to the COBOL program. That's because mainframes are optimized data I/O machines. They're not great for making intense calculations or retrieving indexed relationships, but they are a BEAST when it comes to pulling out gigabytes of user data and then making simple calculations and updates to each record. It also sounds like top-notch COBOL programmers programmed to the machine for efficiency. That's not really done anymore by generic programmers.

    New shops don't have to consider COBOL. But any large company (and gov't) could potentially take a huge hit on their finances (in legal issues) if a refactoring project has a bug. You can roll the dice, or you can just pay another couple million a year and hope nothing ever forces you to replace your legacy COBOL programs, whose workings no one understands or knows how to change.

  • by ErichTheRed (39327) on Saturday April 11, 2009 @01:09PM (#27542887)

    I've had an interesting run through IT environments so far. Each one of my employers has successfully used what would be called a "legacy system" to do core business transactions. I'm not old by any means, but I definitely see no reason to get rid of systems that are performing well.

    The qualification for that statement, of course, is "performing well." The side effect to using older systems is all the bolt-ons you have to use to get newer functionality working. My current employer uses a midrange system from the early 80s to run the core of the operation, and it has tons of extra software and hardware riding on top of it to do modern things like access without terminal emulators and message queuing. The time to consider a replacement is when these systems become too unwieldy or brittle to manage. For example, if a transaction-processing system needs a daily FTP feed from some source, and it doesn't get it, will that blow up the whole system? If the answer is yes, it's time to fix the problem or replace the underlying system if it's bad enough.

    I'm very skeptical of anyone who comes in and says, barely looking at the existing system, that it needs to be ripped and replaced. A lot of it stems from the fact that IT hasn't matured all the way yet. People still come into the field knowing little more than the Java they were taught in school, and don't have the big-picture attitude you need to really understand IT. You may think I'm an old fart, but I'm really not. I've learned the basic rule of IT -- you're not there to play with computers and have fun. You're there to make sure the business you support is able to do their work, preferably without calling you in every night to fix something.

  • Why would you want to replace 2 billion lines of working COBOL code?

    Easy... COBOL, while still in use and working well now, is not a language which is still growing... it is shrinking. Nobody would choose COBOL for a new project. The only jobs left in COBOL are maintenance. That means that there are no "exciting" COBOL jobs, and that they attract only coders who learned COBOL, not engineers who are good at design or interested in building and maintaining good designs.

    "So it's old and all you can get is people wi

  • I am a physicist, but I am thinking about switching jobs to software development. COBOL is one of the languages I am considering learning for a living (I have written production applications in Pascal, BASIC, C, assembler, Perl, Python, Lisp, Tcl, Java, JavaScript, Matlab, and shell languages). Any comments on how to start from this starting point?

  • When you want to be sure your financial calculations are accurate and not subject to floating-point shenanigans.

  • by Orion Blastar (457579) <orionblastar@@@gmail...com> on Saturday April 11, 2009 @03:23PM (#27543887) Homepage Journal

    My doctors discover that I have cancer and I will die, unless they put me into suspended animation and freeze me and the cancer so it does not spread until they can find a cure. But I have to sign an agreement that in order to pay for the cure and suspended animation I have to agree to work a job for the company that sponsors the cure and freezing.

    So I sign, and wake up in 9999 AD and they cure my cancer.

    I go to human resources for the company that paid for my freezing and cure and the HR director says:

    "So, Orion Blastar, I see you are a computer programmer and I also see that you know COBOL. We have a Y10K problem and need to make all of our programs work with five digit years. You'll be spending the rest of your life debugging COBOL programs to solve the Y10K problem."

    COBOL is not a dead language; it is an undead language. Like a zombie or a vampire, it is very hard to kill, as well as hard to work with.

  • by Maxo-Texas (864189) on Saturday April 11, 2009 @03:27PM (#27543917)

    At my company, all the time, the salesperson says "this new system will solve all your problems, will take 12 months to implement max, and be very cheap to maintain!!!"

    And the executives bite on it every time.

    And lately, everything is going to "end of life". Excuse me? Cobol, RPG, and Java don't seem to go to end of life. I'm sure there are others.

    It's really hard when you finish a 2 year project (should have been 1 in salesman land), roll it out and debug production issues over the next year, and then 2 years later, it is "end of life".

    I guess it's good to keep me employed but it seems kinda dumb.

    "Wierd.. slasdot requires a wait... "It's been THREE minutes" --- at work it varies -- sometimes at 16 seconds it lets you posts while other times it takes the full minute.
