COBOL Turning 50, Still Important

Death Metal writes with this excerpt from a story about COBOL's influence as it approaches 50 years in existence: "According to David Stephenson, the UK manager for the software provider Micro Focus, 'some 70% to 80% of UK plc business transactions are still based on COBOL.' ... Mike Gilpin, from the market research company Forrester, says that the company's most recent related survey found that 32% of enterprises say they still use COBOL for development or maintenance. ... A lot of this maintenance and development takes place on IBM products. The company's software group director of product delivery and strategy, Charles Chu, says that he doesn't think 'legacy' is pejorative. 'Business constantly evolves,' he adds, 'but there are 250bn lines of COBOL code working well worldwide. Why would companies replace systems that are working well?'"
  • by Samschnooks ( 1415697 ) on Saturday April 11, 2009 @11:26AM (#27542181)

    Why would companies replace systems that are working well?

    Because the director of IT or whatever his title is will want to be able to put on his resume that HE moved a company from a "Legacy" and "Outdated" system to a modern "web based solution" that enables "greater productivity" among the workforce saving "millions of dollars". Now, he can put that on his resume and go for the CTO, CIO, or whatever jobs.

    I've seen it and it works.

  • by Anonymous Coward on Saturday April 11, 2009 @11:43AM (#27542293)

    COBOL's still around, will perl last as long?

    For instance there is now OO COBOL, but the only people that use it are COBOL programmers who are stuck with COBOL, perhaps because of their company's dictates, perhaps by choice. In the same way, perl may be heading towards irrelevance wrt "mainstream" languages. I've written commercial perl in the past; it was a pain then and it's still a pain now. The thing is that now there are alternative languages in the same space (python, ruby etc., php for the web side) that do the "perl thing" better than perl.

    Perl was great, it introduced many people to programming, just like COBOL did. But now it's time to move on. To move on to languages that learnt from perl, that improved on it, that don't have to drag around a syntax and culture that values neat tricks and trying to guess what the programmer really meant over providing the needed building blocks and letting you build code that does what you say, not what it thinks it heard you say. Or even, dare I say it, to move on to languages outside the perl family for some programming and choose the right tool for the job for a change.

    I'd prefer to think of this as provocative rather than a flame, there is a difference you know.

  • by thethibs ( 882667 ) on Saturday April 11, 2009 @11:53AM (#27542363) Homepage

    Nice story, but it doesn't say anything about COBOL.

    I have a similar story about 30 programmers who spent two years writing java code and delivering nothing useful because the requirement called for two different architectures: one best served with a batch system, the other best served with a real-time system. What they need is COBOL and C, but what they know is java and struts. It's been another four years since I ran screaming from the building and they still haven't delivered anything useful.

    Inept programmers will screw things up in any language.

  • by trash eighty ( 457611 ) on Saturday April 11, 2009 @11:54AM (#27542369) Homepage

    No, if you fail you just get promoted out of harm's way.

  • Adequate Languages (Score:4, Insightful)

    by j. andrew rogers ( 774820 ) on Saturday April 11, 2009 @11:54AM (#27542375)

    COBOL is a perfect example of an "adequate" language, like most programming languages that are in common use. Adequate languages linger forever if there is a tool chain to support them because there is little in the way of economic rationale for replacing them.

    The reason adequate languages proliferate over the long term is that only inadequate languages get replaced, and "ideal" languages become hopelessly politicized by purists and ideologues, leaving the adequate, practical, boring languages as the sound business choice. It is a real-world example of perfect being the enemy of good enough, but for economic reasons good enough always wins.

  • by plopez ( 54068 ) on Saturday April 11, 2009 @11:58AM (#27542391) Journal

    If and only if they're able to pull it off. It's also a nice way to end your career if you fail.

    No, you collect a bonus and bail out before it crashes in flames, leaving someone else holding the bag. See the bank failures for examples of this. See a pattern? I hate MBAs.

  • COBOL, not so bad (Score:4, Insightful)

    by Ancient_Hacker ( 751168 ) on Saturday April 11, 2009 @12:07PM (#27542437)

    As just one data point: a certain place took a COBOL app that ran just fine, with millisecond response times, on a 2-megabyte, 1 MIPS mainframe, and replaced it with a spankin' fresh Java app that ran about 2000 times slower on an 8-gig, 16-CPU, 800MHz, very expensive water-cooled mainframe.

    Now, it could have been due to having a bunch of neophyte Java weenies doing the coding, but I'm just sayin': when there's a three-order-of-magnitude difference in speed and 3.5 orders of magnitude in RAM, there may be something significant about the implementation language.

  • by TheRaven64 ( 641858 ) on Saturday April 11, 2009 @12:14PM (#27542487) Journal

    You see that kind of thing from lots of programmers who only know one language well. This is why a good programmer always keeps up with modern architectures. I've seen C programmers who put things in globals rather than passing them on the stack, because that was faster before caching (now it breaks locality of reference, and moves something that was in a register or on the stack to an indirect reference where it needs extra loads and causes extra cache churn).

    I've seen programmers who grew up with Pascal carefully turning multiplies into sequences of adds and shifts. Great, except that something like the Athlon can do two multiplies in parallel, but only one shift at a time (because most code contains a lot more multiplies than shifts), stalling the pipeline.
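    The hand strength-reduction described above can be sketched in a few lines (in Java for illustration; the original anecdote was about Pascal-trained programmers, but the transformation is the same). Both forms compute the same value; the point is that the "clever" version no longer buys anything on modern hardware.

```java
public class StrengthReduction {
    // The straightforward form: let the compiler/JIT decide how to lower it.
    static int timesTen(int x) {
        return x * 10;
    }

    // The hand-"optimized" form: 10*x rewritten as (x << 3) + (x << 1).
    // Identical result, but on a superscalar CPU the shifts and add can
    // schedule no better (and sometimes worse) than a single multiply.
    static int timesTenShifted(int x) {
        return (x << 3) + (x << 1);
    }

    public static void main(String[] args) {
        // Verify the two forms agree across a range of inputs.
        for (int x = -1000; x <= 1000; x++) {
            if (timesTen(x) != timesTenShifted(x)) {
                throw new AssertionError("mismatch at " + x);
            }
        }
        System.out.println("identical results");
    }
}
```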

    Another common issue is aggressively testing for potential special cases for optimising, ignoring the fact that branches are very expensive on most modern CPUs and the cost of the checks is now often greater than the saving from shortcutting the special cases.

    Java programmers are not immune to this, and often optimise based on old versions of the JVM. One favourite is to add finally everywhere, making the code very rigid, believing this makes it faster. In a modern JVM, finally is completely ignored; the VM already knows if a class is not subclassed and will do the same optimisations whether it is declared finally or not.

    There's a reason why the rules for optimisation are:

    1. Don't.
    2. Don't yet (experts only).

    If you write good algorithms, your compiler will usually produce reasonable code. If this isn't fast enough, then make sure you really understand how your VM and target CPU work, before you try optimising. The experts only part isn't a joke.

  • by 1c3mAn ( 532820 ) on Saturday April 11, 2009 @12:18PM (#27542507)

    Why is no one updating COBOL code? Because the skills needed to interact with those systems are disappearing.

    As a Mainframe Utilities Programmer I hear it from customers all the time: "We can't touch that system because the guy who wrote it retired." "System" here doesn't just mean the code, but also the server back-end, like the database design.

    I have heard stories of an IT department that is a 10-man team. In the '80s, everyone on that team was dedicated to maintaining the mainframes. Now they still have 10 people, but only one of them works on the mainframe.

    So now you have code from the '70s that no one understands, running a mission-critical application, and you think the IT manager is going to touch it? He is praying it doesn't break on his watch, or he might get a call from the COO. Even if it breaks, it is better to patch it than rewrite it, because the database behind it is so vital to the rest of the application that it can't be changed either.

    The main issue is that no one is teaching the old skills anymore: skills that are still required, but that really aren't 'sexy' for young college students to learn. Even the name "Mainframe" has a grandfatherly connotation to it, while if people actually looked at the IBM Z Servers, they would see how high-tech these systems actually are.

  • by berend botje ( 1401731 ) on Saturday April 11, 2009 @12:19PM (#27542515)
    When you have working code in COBOL, really battle-hardened proven-beyond-doubt COBOL code, would you really trust a mechanical translation into another language?

    I wouldn't, no way! And there is no way to completely test the new code either, as the specs never existed or at least are missing and/or outdated.

    I'd rather keep the working COBOL code. Even if that means I have to deal with grumpy old geezers to maintain said code.
  • Why replace? (Score:3, Insightful)

    by Jane Q. Public ( 1010737 ) on Saturday April 11, 2009 @12:29PM (#27542589)
    Hmmm. I suppose it could be that a few things have changed between the time COBOL was developed (Grace Murray Hopper, FTW) and today...

    There is more to a language than just being Turing-complete. There is syntax and general usability, for example.

    I know that there are still jobs out there for COBOL programmers. And it makes me sad.
  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Saturday April 11, 2009 @12:49PM (#27542747) Journal

    Sure, greater productivity is one benefit, but the language is completely irrelevant for that.

    It's about how flexible the system will be when you have to change it. And you will -- that's the whole point of software, that it is soft, and changeable.

    Old Cobol apps generally are not flexible [msdn.com]. (stolen from this comment [slashdot.org]). It's worth mentioning that a decent object-oriented system would've gone a long way towards eliminating this problem -- any idiot can stuff a date into a Date class, which then encapsulates all the date-handling code.
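    The "any idiot can stuff a date into a Date class" point can be sketched like this. The `BusinessDate` class and its `yyyyMMdd` wire format are hypothetical, not from any system cited above; the idea is just that once date handling lives in one class, a representation change is one edit rather than a codebase-wide hunt.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

// Hypothetical wrapper: every date rule (format, arithmetic) lives in one
// class, so changing the representation touches one file, not every record
// layout that happens to contain a date.
final class BusinessDate {
    private static final DateTimeFormatter FMT = DateTimeFormatter.ofPattern("yyyyMMdd");
    private final LocalDate value;

    BusinessDate(String raw)          { this.value = LocalDate.parse(raw, FMT); }
    private BusinessDate(LocalDate d) { this.value = d; }

    BusinessDate plusDays(int n) { return new BusinessDate(value.plusDays(n)); }
    String serialize()           { return value.format(FMT); }

    public static void main(String[] args) {
        BusinessDate d = new BusinessDate("19991231");
        System.out.println(d.plusDays(1).serialize()); // prints 20000101
    }
}
```

    A Y2K-style fix (say, widening a two-digit year) would change only `FMT` and the parser, instead of every program that reads the record.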

    Maybe some of it is very well designed. Drupal [drupal.org] proves that you can write good, elegant code in any language, even if you are fighting the language and reinventing the wheel every step of the way. But the converse [slashdot.org] is also true -- you can write bad COBOL in any language.

    My point here is that when changing minimum wage is even a tech [sacbee.com] story [slashdot.org] at all, that program is really fucking broken*. It's very likely too broken to be patched. Really, we've learned things in the past 50 years, and not all of them are buzzwords or ways to waste five times the RAM.

    Not all of them have anything to do with programming languages, either, but if you're building a new system, and you have a choice of languages, why would you choose COBOL?

    I agree in spirit. But what people have to remember is, if it ain't broke, don't fix it. So, if it's broke, fix it!

    * I apologize for the profanity, but any program that can't change a fucking constant is a broken program. Or did they copy/paste 6.55 all over the place?

  • books? (Score:2, Insightful)

    by innocent_white_lamb ( 151825 ) on Saturday April 11, 2009 @12:49PM (#27542753)

    Most COBOL books and tutorials are unavailable, out-of-print, or just plain gone.
     
    What resources still exist for someone who wants to learn COBOL?
     
    http://www.opencobol.org can easily be installed on Fedora Linux (for example) with a simple "yum install open-cobol", but what comes next?

  • Does this add up? (Score:5, Insightful)

    by Allicorn ( 175921 ) on Saturday April 11, 2009 @12:53PM (#27542787) Homepage

    How does this add up?

    1. Around a third of UK companies say they have at least one COBOL program somewhere in their enterprise.

    2. Around three quarters of all UK electronic business is coded in COBOL.

    I'm aware that there are allegedly pockets of COBOL here and there with some fairly significant nuclei of usage within certain business sectors but seriously... 80% of all electronic transactions?

    Monster.co.uk search for single keyword in title, 11th Apr 2009:

    Java: 173 hits
    C++: 142 hits
    PHP: 95 hits
    Perl: 39 hits
    COBOL: 1 hit

    This doesn't seem to suggest a great demand for COBOL coders at present which - one would think - suggests little use of COBOL.

    I've heard this "the world secretly runs on COBOL" story countless times over my career, but seldom seen more than a few lines of COBOL actually deployed in businesses I've worked with. Is the whole thing just a weird industry myth?

  • by slashdot_commentator ( 444053 ) on Saturday April 11, 2009 @12:54PM (#27542791) Journal

    It's not just the mythical "mission critical" aspect that keeps businesses dependent on COBOL. MANY of those programs required financial analysts to "vet" the program's calculations, or lawyers to "vet" that the COBOL program complied with laws (privacy, methods of determination, etc.).

    Not only are you putting in the cost to refactor the program from scratch, and not only are you risking a bug costing your company hundreds of millions to billions of dollars, but you also have to take on the costs of expensive NON-programmers to "bless" the new program.

    Then also realize that the new whizbang technologies like SQL and java will RUN LIKE A DOG compared to the COBOL program. That's because mainframes are optimized data I/O machines. They're not great for making intense calculations or retrieving indexed relationships, but they are a BEAST when it comes to pulling out gigabytes of user data, and then making simple calculations and updates to each. It also sounds like top notch COBOL programmers programmed to the machine for efficiency. That's not really done anymore by generic programmers.

    New shops don't have to consider COBOL. But any large company (and gov't) could potentially take a huge financial hit (in legal issues) if a refactoring project has a bug. You can roll the dice, or you can just pay another couple million a year and hope nothing ever forces you to replace your legacy COBOL programs that no one knows how to change, or even how they work.

  • by ErichTheRed ( 39327 ) on Saturday April 11, 2009 @01:09PM (#27542887)

    I've had an interesting run through IT environments so far. Each one of my employers has successfully used what would be called a "legacy system" to do core business transactions. I'm not old by any means, but I definitely see no reason to get rid of systems that are performing well.

    The qualification for that statement, of course, is "performing well." The side effect to using older systems is all the bolt-ons you have to use to get newer functionality working. My current employer uses a midrange system from the early 80s to run the core of the operation, and it has tons of extra software and hardware riding on top of it to do modern things like access without terminal emulators and message queuing. The time to consider a replacement is when these systems become too unwieldy or brittle to manage. For example, if a transaction-processing system needs a daily FTP feed from some source, and it doesn't get it, will that blow up the whole system? If the answer is yes, it's time to fix the problem or replace the underlying system if it's bad enough.

    I'm very skeptical of anyone who comes in and says, barely looking at the existing system, that it needs to be ripped and replaced. A lot of it stems from the fact that IT hasn't matured all the way yet. People still come into the field knowing little more than the Java they were taught in school, and don't have the big-picture attitude you need to really understand IT. You may think I'm an old fart, but I'm really not. I've learned the basic rule of IT -- you're not there to play with computers and have fun. You're there to make sure the business you support is able to do their work, preferably without calling you in every night to fix something.

  • by david.given ( 6740 ) <dg@cowlark.com> on Saturday April 11, 2009 @01:11PM (#27542901) Homepage Journal

    One favourite is to add finally everywhere, making the code very rigid, believing this makes it faster.

    I think you mean final here, no? finally does something else.
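    A minimal illustration of the difference in Java: final restricts subclassing/overriding (which modern JVMs can infer anyway), while finally is exception-safe control flow that always runs.

```java
public class FinalVsFinally {
    // `final` on a method forbids overriding. Modern JVMs devirtualize
    // non-overridden methods regardless, so it buys safety, not speed.
    static final int answer() { return 42; }

    // `finally` is control flow: the block runs whether or not the
    // try body throws or returns early.
    static String withCleanup() {
        try {
            return "body";
        } finally {
            System.out.println("cleanup always runs");
        }
    }

    public static void main(String[] args) {
        System.out.println(answer());
        System.out.println(withCleanup());
    }
}
```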

    1. Don't.
    2. Don't yet (experts only).

    Very true.

    I'd also add an additional rule: you don't know it's slow until you've benchmarked it. All too often I have seen (and I should add that I've perpetrated this myself) people spend ages painstakingly optimising parts of a system that felt like they were causing speed problems, when actually they weren't.

    I once spent about two or three months designing and implementing a clever caching system for properties in a UI framework. It was subtle and complex, and I was very proud of it, and it was a total waste of time. We eventually threw it all away and stored the properties in an AVL tree. Yes, this involved looking up the properties in the inner loops of all the UI component redraw methods, but compared to the amount of work of touching every pixel on the screen, this was trivial. And it increased flexibility, reduced complexity and code size, and vastly improved maintainability.

  • Inhouse web apps? (Score:4, Insightful)

    by ClosedSource ( 238333 ) on Saturday April 11, 2009 @01:17PM (#27542935)

    In a lot of environments, in-house web apps would only serve the purpose of being trendy. I suspect a company that is smart enough to keep its working code around would probably resist the temptation to unnecessarily "webize" its internal apps.

  • by 1c3mAn ( 532820 ) on Saturday April 11, 2009 @01:25PM (#27543013)

    Why do people always think it is about speed? It isn't. Mainframes are not supercomputers. They don't really need very high processing power; what makes mainframes great is their ability to move data. Two billion transactions a day is a LOT of data to be thrown around, and a mainframe can easily handle that.

    Beowulf Cluster? Not so much.

    Reliability and redundancy are unmatched in a mainframe. There is a reason why financial institutions run mainframes: that data really can't be lost.

    The mainframe was declared dead in the early '90s; companies tried to switch to open systems and FAILED miserably. Twenty years later, most Fortune 1000 companies still use a mainframe, because nothing comes close to what they do.

  • by coryking ( 104614 ) * on Saturday April 11, 2009 @01:28PM (#27543027) Homepage Journal

    You know, I guess it depends. I wouldn't port an application from Perl to something else. But I'm not sure I'd base a new project on Perl either.

    There are some things about perl that could be fixed that might change my mind:

    1) Dump Perldoc and liberally rip from javadoc [sun.com] and XML comments [microsoft.com]. I know both of these got their start from perldoc, but perldoc needs to catch up.
    2) Make sure the IDE actually uses said docs. Once your IDE's IntelliSense sucks up your comments and uses them while you are typing in a method, you are rewarded for documenting your stuff. Nothing like positive feedback to encourage good habits.
    3) Finish EPIC [epic-ide.org]. Perl needs IDE support. Syntax coloring and auto-indentation does not make for an IDE.
    4) Get rid of this my $blah_param = shift; crap and start making function declarations that work like everybody else's: sub myDopeFunction($blah_param){}. This, coupled with perldoc's suckage, leads to hard-to-maintain code.
    5) Give a couple million to the Template::Toolkit guys. They rock.
    6) Mystery option.

  • by Pig Hogger ( 10379 ) <pig.hogger@g[ ]l.com ['mai' in gap]> on Saturday April 11, 2009 @02:04PM (#27543335) Journal

    If by "high tech" you mean "pay $250,000 for the same speed you can get on a commodity desktop in 6 months"... then sure.

    People realized how stupid it was to waste money on mainframes when commodity hardware is moving so quickly.

    "Commodity" desktops will never be able to process 2500 simultaneous transactions in the same database. Even in Beowulf clusters. This is why there is still some big iron around.

  • by McSnarf ( 676600 ) * on Saturday April 11, 2009 @02:15PM (#27543413)
    Hmmm...

    This revived some slightly old memories.
    I remember a talk by the local FORTRAN compiler guru in the mid-70s.

    After talking about some intricacies of the IBM FORTRAN H compiler, he gave some examples of the compiler's abilities. Summarizing it with: Don't try too hard to optimize. Leave it to the compiler. It usually knows what it is doing.

    And that sums it up rather nicely.
    I'd rather work on code written by someone else who concentrated on writing readable code than on code written by someone trying to be clever.

    (Note to people born after around 1980: Yes, we too believed it would be cool to write clever, complex code, with the odd assembler routine thrown in for good measure (which, btw, didn't really save much time), badly documented and demonstrating our superior coding abilities. Looking back, we were idiots who should have been fired. Don't repeat our mistakes, OK?)

  • by Just Some Guy ( 3352 ) <kirk+slashdot@strauser.com> on Saturday April 11, 2009 @02:28PM (#27543489) Homepage Journal

    I'm sure you're right for new code written today. I am equally sure that programmers 30 years ago did not have that luxury. Seriously, a 4GB standard allocation? An IBM 4341 from 1979 had up to 16MB of RAM [wikipedia.org]. The article quotes people saying that code 30-50 years old still works well, but I would posit that code that old was probably written with assumptions that are no longer sensible today, and at the very least should be re-evaluated instead of being allowed to blindly accrete new layers.

  • by Maxo-Texas ( 864189 ) on Saturday April 11, 2009 @03:27PM (#27543917)

    At my company, all the time, the salesperson says "this new system will solve all your problems, will take 12 months to implement max, and be very cheap to maintain!!!"

    And the executives bite on it every time.

    And lately, everything is going to "end of life". Excuse me? Cobol, RPG, and Java don't seem to go to end of life. I'm sure there are others.

    It's really hard when you finish a 2 year project (should have been 1 in salesman land), roll it out and debug production issues over the next year, and then 2 years later, it is "end of life".

    I guess it's good to keep me employed but it seems kinda dumb.

    "Weird... Slashdot requires a wait: 'It's been THREE minutes.'" At work it varies; sometimes at 16 seconds it lets you post, while other times it takes the full minute.

  • by mapkinase ( 958129 ) on Saturday April 11, 2009 @04:25PM (#27544307) Homepage Journal

    He might also have a COBOL-unrelated disease called "optimisis," where a programmer tries to write 100% optimized code right from the beginning.

    It is a valuable practice if you are using conventionally accepted optimization for common things, but it leads to disastrously unscalable code heavily based on logic trees.

    I work with a biologist who gives me decision trees and modifies them every day. If I tried to optimize the logic in those trees by rearranging logical expressions and making shortcuts, within a week I would be able to comprehend zilch in my own code.

  • by HardWoodWorker ( 1032490 ) on Saturday April 11, 2009 @04:39PM (#27544391)
    If you think writing code for Java 5+ is "fashion," I don't think you're a very experienced programmer.

    Generics and java.util.concurrent are mandatory for scalable systems nowadays. Java 5 added a lot of features to make code more reliable, not mindless fluff.

    If you're not using the latest features of the language where appropriate, you're doing your employer a huge disservice.

    Sorry, buddy, but you need to learn your technology. Just because people will employ you to write for outdated systems doesn't mean you're doing your company a favor by refusing to learn the latest technologies and writing your code to run on JDK 1.4.
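    The java.util.concurrent side of that claim can be sketched in a few lines, using Java 5-era syntax (anonymous Runnable, AtomicInteger). The scenario (counting requests on a thread pool) is illustrative, not from the thread; the point is that the pool and the lock-free counter replace hand-rolled Thread + synchronized bookkeeping.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class ConcurrentSketch {
    // Run `tasks` increments across a fixed pool and return the final count.
    static int countRequests(int tasks, int threads) throws InterruptedException {
        final AtomicInteger hits = new AtomicInteger(); // lock-free counter
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (int i = 0; i < tasks; i++) {
            pool.execute(new Runnable() {               // Java 5-era syntax
                public void run() { hits.incrementAndGet(); }
            });
        }
        pool.shutdown();                                // no new tasks accepted
        pool.awaitTermination(10, TimeUnit.SECONDS);    // wait for queued work
        return hits.get();
    }

    public static void main(String[] args) throws InterruptedException {
        // No lost updates, no hand-written synchronized blocks.
        System.out.println(countRequests(1000, 4)); // prints 1000
    }
}
```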
  • by mcrbids ( 148650 ) on Saturday April 11, 2009 @04:43PM (#27544409) Journal

    There's a reason why the rules for optimisation are:

          1. Don't.
          2. Don't yet (experts only).

    If you write good algorithms, your compiler will usually produce reasonable code. If this isn't fast enough, then make sure you really understand how your VM and target CPU work, before you try optimising. The experts only part isn't a joke.

    Except that there's a clear and definite time to optimise - when performance is in the crapper!

    Just two days ago, I heard complaints about a well-written (but very old!) report that was still taking as long as 5 minutes to run when accessing a large data set. Taking a look, I found that it was using an old template system that did its output via regexes, and for a complex report the regexes were killing the server. So I replaced it with a different template engine using simple string replacement, and reduced more than 5 minutes of reporting time to about 15 seconds. Further investigation found that a simple 3-element index in the database cut the time down to 2 seconds.

    Now the report comes up INSTANTLY in most cases.

    Optimising is necessary - but only AFTER the fact!
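    The template swap described above, regex substitution versus plain string replacement, can be sketched like this (the {{name}} placeholder syntax is hypothetical, chosen just for the example). Both produce identical output; the plain version simply avoids the regex engine entirely.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class TemplateSketch {
    // Slow path: build and run a regex per placeholder.
    static String regexFill(String tpl, Map<String, String> vals) {
        for (Map.Entry<String, String> e : vals.entrySet()) {
            tpl = tpl.replaceAll("\\{\\{" + Pattern.quote(e.getKey()) + "\\}\\}",
                                 Matcher.quoteReplacement(e.getValue()));
        }
        return tpl;
    }

    // Fast path: literal substring replacement, no regex engine involved.
    static String plainFill(String tpl, Map<String, String> vals) {
        for (Map.Entry<String, String> e : vals.entrySet()) {
            tpl = tpl.replace("{{" + e.getKey() + "}}", e.getValue());
        }
        return tpl;
    }

    public static void main(String[] args) {
        Map<String, String> vals = new HashMap<String, String>();
        vals.put("name", "COBOL");
        String tpl = "Hello, {{name}}!";
        System.out.println(plainFill(tpl, vals));                       // prints Hello, COBOL!
        System.out.println(regexFill(tpl, vals).equals(plainFill(tpl, vals))); // prints true
    }
}
```

    On a large report with many placeholders, skipping regex compilation and backtracking per substitution is exactly the kind of after-the-fact fix the comment describes.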

  • by windsurfer619 ( 958212 ) on Saturday April 11, 2009 @07:43PM (#27545387)

    I hear Java is like violence. If it doesn't work, use more of it.

    --
    -stolen from someone's sig

  • Re:Why replace it? (Score:1, Insightful)

    by Anonymous Coward on Saturday April 11, 2009 @09:21PM (#27545741)

    Sure, but then I'd be a COBOL programmer in New York.

    I could be a garbageman and have good job security, too. I'd also deal with less garbage and not have to move to NY.
