


COBOL Celebrates 50 Years (277 comments)

oranghutan writes "The language used to power most of the world's ATMs, COBOL, is turning 50. It also runs about 75 per cent of the world's business applications, so COBOL should be celebrated for making it to half a century. In cricketing terms, that's a good knock. The author says: 'COBOL's fate was decided during a meeting of the Short Range Committee, the organization responsible for submitting the first version of the language in 1959. The committee convened after a meeting at the Pentagon first laid down the guidelines for the language. Half a century later, Micro Focus published research showing that people in Australia still use COBOL at least 10 times in the course of an average working day. Only 18 per cent of those surveyed, however, had ever actually heard of COBOL.'"
  • by Anonymous Coward on Monday September 21, 2009 @09:10AM (#29490119)
    Oh God, you're one of those. Look junior, contrary to popular opinion, the majority of computers in the world do not run Windows. PCs are a minority.
  • by Old97 ( 1341297 ) on Monday September 21, 2009 @09:15AM (#29490171)

    The wording is misleading. Perhaps it's more accurate to say that 75% of business computing by value depends on COBOL. I've worked at a number of places in the financial services industry and have a lot of friends who do as well. All of our core business functions are still in COBOL. A lot of the data is still in VSAM, IMS and Model 204 legacy stores. A lot of what is in DB2, an RDBMS, is VSAM files converted directly to tables instead of truly relational databases.

    The fun stuff (Java, .NET, Web) runs the outward facing services and peripheral functions, but claims processing, credit card reconciliation, billing, accounting, etc. is still in COBOL. The computer industry press spends a lot of time admiring the new chrome and fins and that new built-in radio with FM, but business is still powered by the COBOL drive train running on mainframes.

    Even the clued in managers want to get off of it and onto more flexible systems and more productive languages, but it's too scary (risky) because they are afraid to break something. No one knows what the business rules are because they are embedded hither and yon in COBOL programs.

  • by gardyloo ( 512791 ) on Monday September 21, 2009 @09:16AM (#29490179)

    Come on, I've teed it up for you, now knock it out of the park!

    Maybe we can make a touchdown from that half-court shot, as you so nicely handicapped the goalie.

  • Not So Bad (Score:5, Insightful)

    by Ancient_Hacker ( 751168 ) on Monday September 21, 2009 @09:29AM (#29490301)

    COBOL did a lot of things right, things that a lot of modern languages ignored.

    Little things like:

    * Having a manufacturer and machine and OS-independent standard.
    * Quasi human-readable code.

    that said, it's just as easy for numbskulls to write bad COBOL as to write bad C++ or bad Ruby.

  • by PhunkySchtuff ( 208108 ) <kai@automatica.c[ ]au ['om.' in gap]> on Monday September 21, 2009 @09:34AM (#29490343) Homepage

    Can we get rid of it? Surely COBOL has developed faults over time, just like a train that's been running since 1850 would have.

    Or, just maybe, it's proven itself to be stable, reliable, well-understood, suited to the purpose for which it's used and relatively bug free?

    Nah, of course not. It's old and busted. Bring on the new hotness.

  • by drinkypoo ( 153816 ) <martin.espinoza@gmail.com> on Monday September 21, 2009 @09:35AM (#29490361) Homepage Journal

    I think what we're arguing over here is the application of the English language. As the sentence is written, it is probably incorrect. Due to logarithmic growth, it is virtually impossible that the numbers come out right. If one said that 70% of business transactions were facilitated through COBOL at least in part then it might be true, because of all the legacy code still doing its job out there at banks and other financial institutions.

    Mainframes are breathing their last gasp; they will soon exist only in cases where you need very fast access to all of very large data sets. And honestly, clustering filesystems and databases are solving that problem too. Clusters will rule nearly every aspect of large computing because they are the only thing more reliable than a mainframe.

  • by MBGMorden ( 803437 ) on Monday September 21, 2009 @09:37AM (#29490391)

    I have to agree. We recently switched parts of our tax billing software from an old COBOL system running on an AS/400 to Windows. There were some legitimate concerns involved: creating a graphical sketch wasn't possible on a text-mode system, tax laws change very frequently, and the old system was just becoming difficult to maintain.

    So, we switched to a Windows app with a SQL Server backend. FWIW the database backend has been rock-solid, but the actual client? It's junk. That old clunky COBOL system might have been awkward to use and a bit long in the tooth, but it NEVER crashed, and its mistakes were minimal to say the least. This new Windows system crashes constantly (including crashing if you work too fast - yeah I literally have to do a "one one-thousand" count when switching between properties or the client will lock up), and it goofs up the data frequently enough that in another 5 years I think our data will be reduced to an unreliable mess.

    Truthfully though, it's not the fault of Windows, or whatever language the newer apps are written in (Visual Basic in the case of our new pile o' junk). You can certainly write good stuff in new languages on new systems. I think it's a two-fold problem. One, the complexities of a GUI make code many times more intricate, making the job more difficult (and more error prone). Two, programmers today look at problems differently. They program for "features" first so they can give a good sales presentation. In the old days it seems like a reduced feature set was fine so long as your code was done right. That's not the case anymore, which is a shame, because on most of our newer systems we use MAYBE 20% of the features included, and I'd gladly trade the other 80% for stability and accuracy.

  • by sgbett ( 739519 ) <slashdot@remailer.org> on Monday September 21, 2009 @09:44AM (#29490499) Homepage

    I don't think it's the languages that are getting worse...

  • Longevity (Score:5, Insightful)

    by wandazulu ( 265281 ) on Monday September 21, 2009 @09:48AM (#29490563)

    I worked at a company that had a Cobol-based program that went live back in 1969. A team of programmers had kept it going ever since. Shortly after I started (mid 1995), I was in a meeting when one of the Cobol programmers mentioned that so-and-so had died over the weekend. Everybody started talking about her, what a great person she was, etc. After the meeting, I asked who she was, and was told that she was the last surviving member of the original team that wrote and deployed the application. When the system was finally shut down back in 2003 or so (I had long since left, but still had some contacts there to tell me what was going on), I really felt weird about hearing it; here was this thing that had outlived its creators (and some of the later maintainers), and now it was gone too.

    Isn't it strange how computer software is both unbelievably ephemeral and incredibly long-lived? I've worked on both sides and I'm not sure which is more fulfilling; it apparently took several years to write the aforementioned Cobol program, but it outlived its creators. I wonder how a programmer on something like, say, Madden, must feel, knowing that the thing they're working so hard on will be totally supplanted by the next version, next year.

    Strange business, this computing machinery. Strange indeed.

  • Re:Not So Bad (Score:3, Insightful)

    by ChienAndalu ( 1293930 ) on Monday September 21, 2009 @10:07AM (#29490813)

    * Quasi human-readable code.

    Human readability doesn't count on its own; you have to be able to understand it too. Cobol uses English words instead of a more concise syntax with special characters, and is therefore more difficult to understand. Mathematical equations and chemical formulas have their own special syntaxes, and computer programs should have them too.

    that said, it's just as easy for numbskulls to write bad COBOL as to write bad C++ or bad Ruby.

    Obvious. But can you show me *good* COBOL?

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Monday September 21, 2009 @10:10AM (#29490863) Journal

    they will soon exist only in cases where you need very fast access to all of very large data sets.

    Which is quite often.

    And honestly, clustering filesystems and databases are solving that problem too.

    Except that clustering filesystems almost always have to compromise on one of the ACID properties. For example, Amazon's Dynamo and CouchDB are highly available, redundant, and fast, but allow conflicts, assuming the application will correct for them. Ok, but that fails for a banking application -- if I were to withdraw my entire balance from two different nodes simultaneously, I'd have a massive overdraft, but I'd also have the money.

    You could imagine trying to shard it instead, but what happens when you transfer money between two shards? You still need a transaction, only now it needs to be synchronized between two nodes. What do you do? Do you lock both nodes at once? Now you've got a possibility of deadlocks.

    Clusters will rule nearly every aspect of large computing because they are the only thing more reliable than a mainframe.

    Reliability can be defined in several ways. Clusters are more available than a mainframe -- if your mainframe goes down, you're down. But clusters are less consistent than a mainframe, unless you're willing to take such massive hits from synchronization that the performance advantage is gone.

    For the vast majority of applications, some inconsistency is acceptable. Take Amazon's example -- if you tell one node to add item A to your cart, and another node to add item B, producing two conflicting versions of your cart, the cart application should be smart enough to merge them. The only synchronization needed is checkout, and here, all you'd need to do is refer to a specific version of that record in the form that's submitted.

    But for applications which can't tolerate that inconsistency, unless there's some clustering method I'm unaware of, you're still going to want something like a mainframe.
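    The double-withdrawal scenario above can be sketched concretely. A minimal Python illustration (hypothetical account logic, not any real database API): two replicas each accept a withdrawal against the same starting balance, and even the most pessimistic merge of the two versions cannot recover the money already paid out.

    ```python
    # Two replicas of the same account, each unaware of the other's write.
    # This sketches why eventually-consistent stores are risky for banking:
    # both withdrawals succeed locally, and no merge rule can claw back the
    # cash already handed out at each node.

    def withdraw(balance, amount):
        """Allow the withdrawal only if this replica's local balance covers it."""
        if amount > balance:
            raise ValueError("insufficient funds")
        return balance - amount

    start = 100
    replica_a = withdraw(start, 100)  # node A sees balance 100, pays out 100
    replica_b = withdraw(start, 100)  # node B, concurrently, does the same

    # A conflict-resolution rule now has to reconcile the two versions.
    # Taking the minimum (the most pessimistic merge) still leaves the bank
    # having paid out 200 against a balance of 100.
    merged = min(replica_a, replica_b)
    overdraft = 2 * 100 - start
    print(merged, overdraft)
    ```

    A single mainframe (or any single serialization point) refuses the second withdrawal instead, which is exactly the consistency being traded away.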

  • by MikeBabcock ( 65886 ) <mtb-slashdot@mikebabcock.ca> on Monday September 21, 2009 @10:13AM (#29490907) Homepage Journal

    What are you, a college student? You honestly believe anywhere near 75% of the world's business applications run on Windows?

    Microsoft only wishes it had the big iron servers that do the fun stuff like banking and credit card processing.

  • by SanityInAnarchy ( 655584 ) <ninja@slaphack.com> on Monday September 21, 2009 @10:15AM (#29490927) Journal

    One, the complexities of a GUI make code many times more intricate, making the job more difficult (and more error prone),

    I doubt this is much of an issue. GUIs can certainly be abstracted to the point where it's not an issue.

    programmers today look at problems differently.

    Well, some programmers. (Hire me!)

    I do agree, but recent programmers certainly don't have a monopoly on WTFs. I think you've got something of the success effect here -- that is, your old COBOL system was necessarily reliable, because if it wasn't, it wouldn't have lasted this long. So the old COBOL apps that are still in production are likely at least somewhat reliable.

    But reliable and maintainable are different things. I'd argue for rewriting them just to make them more maintainable -- carefully, of course, so they stay reliable, but you also want to be able to open them up twenty years from now and make a minor change without pulling your hair out.

  • by MasterOfMagic ( 151058 ) on Monday September 21, 2009 @10:31AM (#29491167) Journal

    Stable, reliable, well-understood, and bug-free are true of many more recent languages.

    <sarcasm>I didn't know that more modern languages had a 50 year history of reliability, scalability, and security to process transactions 24/7. Live and learn I guess...</sarcasm>

    Further, the cost of developing, debugging, and testing the replacement in any language (including redeveloping the system from the ground up in COBOL) is quite expensive, no matter what language you choose. Likely more expensive than the big iron and software environments necessary to run the old code that has worked reliably for the last 20 to 40 years.

  • by bostei2008 ( 1441027 ) on Monday September 21, 2009 @10:33AM (#29491195)

    and still very much alive.

    You cannot kill it (quite literally, mainframes have a MTBF of what, 40 years? How is your windows box doing?).

    You can sneer at it, disregard it, ridicule it. But it is still there after decades of getting bad rep and no fresh blood. That is actually pretty impressive.

  • by Bakkster ( 1529253 ) <Bakkster,man&gmail,com> on Monday September 21, 2009 @10:43AM (#29491329)

    Stable, reliable, well-understood, and bug-free are true of many more recent languages.

    Yup, JAVA never crashes, C# is easily understood, C++ is free of bloat, and interpreted languages run faster. /s

    I dispute that it's the best suited to the purpose for which it's used. Show me a construct in COBOL that wouldn't be much easier in something modern -- even Java, if we have to.

    COBOL isn't used because it's easier to write than your JAVA or other new language. It's used because it was designed with business transactions in mind and is reliable. If you have to give up reliability or predictability to gain readability or 'modern-ness' (as has often been my experience with JAVA), it's not a good fit for businesses who can hire additional programmers to produce reliable code.

    Regardless, if COBOL works well for the application already, then some modern language would have to be one hell of a lot better for the incremental improvement to be worth the cost and risk involved in a complete rewrite.

  • Re:Longevity (Score:3, Insightful)

    by CharlyFoxtrot ( 1607527 ) on Monday September 21, 2009 @11:24AM (#29491859)

    Well, ideally, they'd be able to get excited about doing it differently, maybe doing it better.

    Le mieux est l'ennemi du bien. (The better is the enemy of the good.) - Voltaire

    It might be anecdotal, but I've seen a few perfectly good systems thrown out in the search for an ephemeral "better" that then failed to materialize.

  • by Anonymous Coward on Monday September 21, 2009 @11:27AM (#29491893)
    Don't bother he'll never understand or believe that one. If he can't play Bejeweled or some first person shooter on it he'll never believe it's a computer.
  • by orzetto ( 545509 ) on Monday September 21, 2009 @11:58AM (#29492317)

    Visual Basic

    I think I see your problem here...

    One, the complexities of a GUI make code many times more intricate [...]

    Here's the rest of your problem. A GUI should never bump into a difficult or mission-critical algorithm; that logic belongs in its own library, which the GUI accesses through a clean and solid software interface. Mixing the two is a major architectural fault, a Big Ball of Mud [laputan.org], and some languages encourage it more than others.

    My suggestion would be: get a language with a lower density of script kiddies that is sufficiently popular and object-oriented (Python, C++, Java, ...), get some good programmers with a proven track record, and rewrite the client. Specify that you want all functions and variables documented, plus test suites; if they say "that will cost you more", show them the door. If they say "we do it anyway", that's a good sign.
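    The separation being advocated here, mission-critical logic in its own library behind a clean interface with the GUI as a thin consumer, can be sketched in a few lines of Python (all names below are invented for illustration, not from any real tax system):

    ```python
    # billing_core -- the mission-critical logic lives here, GUI-free,
    # so it can be unit-tested and reused without starting any window.

    def compute_tax(assessed_value, rate):
        """Pure function: no widgets, no globals, trivially testable."""
        if assessed_value < 0 or rate < 0:
            raise ValueError("negative input")
        return round(assessed_value * rate, 2)

    # GUI layer -- a thin shell that only parses input and formats output,
    # never re-implementing the calculation itself.
    def on_calculate_clicked(value_field, rate_field):
        return "Tax due: %.2f" % compute_tax(float(value_field), float(rate_field))

    print(on_calculate_clicked("150000", "0.012"))
    ```

    When the client inevitably gets rewritten again, only the thin shell has to change.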

  • tags (Score:3, Insightful)

    by smoker2 ( 750216 ) on Monday September 21, 2009 @12:15PM (#29492545) Homepage Journal
    Which prick tagged this !kobol ? Does it SAY kobol anywhere in the title, summary or article ? Or is so that you can easily search for kobol later and not find this story (in which case you could have saved your typing) ? FFS. Next time there is a story about google I'm going to tag it !poodle.
  • Re:Not So Bad (Score:3, Insightful)

    by orzetto ( 545509 ) on Monday September 21, 2009 @12:34PM (#29492787)

    A good example for a language that has certain things in place to prevent bad coding, is Haskell.

    I have been studying some Haskell in the past few months, and while I am in awe at Haskell's type system, and how you can write a language without for and while, and lazy evaluation, and functional purity, and partial application, and so on, there are two things that made me give up:

    1. Functional purity makes I/O a major mess. Monads are complex, unintuitive and unwieldy; I think I spent over a month just trying to wrap my mind around them. It does not help that Haskellers keep repeating that "monads are really simple"; there is a reason why they are the most asked-about topic in the newsgroups.
    2. The worst thing is the Haskell community's coding standards. Single-letter variables are common, and I actually read some delirious rant about this being necessary "because it's so abstract you cannot name it". If it's so abstract you cannot name it, you have abstracted too much, or you don't understand what you are doing. There also seems to be a proliferation of operators, since Haskell foolishly allows defining new ones, even completely useless ones like $. Writing functions as undocumented one-liners seems to be considered a virtue.

    Haskell has many good ideas, but it will never be a successful language because it's just too damn difficult, and no one in the Haskell community seems to care about it. I gave up when I thought that, even if I learned it, it would be utterly useless because few other people will bother to learn it. It also weighed in that there are so few software projects based on Haskell.

    Haskell is a great language to calculate factorials, but very little else.

  • by thethibs ( 882667 ) on Monday September 21, 2009 @01:47PM (#29493783) Homepage

    That assumes that programmers make well-informed, rational decisions at all times. Chuckle.

  • by Bakkster ( 1529253 ) <Bakkster,man&gmail,com> on Monday September 21, 2009 @05:09PM (#29496605)

    I've seen the JVM (that's one that is capitalized!) crash multiple times on more complex programs. One was a digital logic architect program written by a professor and several graduate students. After several years of development, it still crashed frequently on large projects, due to the JVM running out of memory. Crashes were common enough that we had to convince the professor to add an auto-save to make it less un-usable. Not saying it's due to Java specifically, but it did seem to be linked to the virtual machine.

    Regardless, I have yet to see a compelling argument to rewrite COBOL code in Java or any other 'modern' language. If the COBOL works, and will continue to in the near future, what is the benefit of a lengthy rewrite? Does the cost of design, development, verification, testing, and deployment provide at least as much benefit (in dollars/time) to the company? Is it worth the risk to replace a system that has worked well for decades with one that could introduce a serious compounding error that doesn't manifest itself until years down the line?

    Perhaps some modern language might be a better fit for a newly written program, but I'm still not convinced. While I can't say for certain that COBOL will compile to be any faster than the equivalent Java, my experience with high-level languages tells me that they tend to create bloated executables. That isn't a bad thing for a general-purpose language with expansive libraries that can do everything, but for these kinds of transactions a special-purpose language should be able to outperform it.

    tl;dr You may have a case for new programs, but it makes no sense to rewrite working COBOL.

  • by Anonymous Coward on Monday September 21, 2009 @05:49PM (#29497089)

    One of the rare experiences in my life is to encounter a technology platform that just does its job, only that and nothing more with only reasonable downtime for explainable problems. I have a robot lawnmower that works like that -- just keeps the grass cut to the length I set. As long as I clean it off every so often and lube it on occasion it just keeps going. Pity my robot vacuums are not so well designed.

    My experiences with COBOL fall into the same category. While I managed to miss working with the stuff, I did take a Cobol course in school, and except for the first program (a missing period in the environment division) the rest all compiled and ran correctly the first time. Some years later I built a pre-compiler for my employer that modified Cobol programs to permit the use of relational expressions with non-relational file architectures. While what I was doing was experimental, the Cobol side of it was stable and predictable -- a far cry from many other languages I have used over the years.

    It is a pity that we are in love with the new -- COBOL just does its job and if I had to support business programming again I would prefer a language that the boss could actually read over something new that spread the logic of the program over a vast pool of disconnected pieces. Functionality is where it is at -- especially if there is money involved.

  • Re:Not So Bad (Score:4, Insightful)

    by orzetto ( 545509 ) on Monday September 21, 2009 @06:57PM (#29497761)

    You can think of monads in general as a way to formally define types of computation that may have a context in which they operate.

    See, you are part of the problem. Is that supposed to be an explanation? If you cannot explain it in plain English, you do not really understand it yourself (I think I am quoting Feynman). Your explanation is also wrong (as most "explanations" about monads): Maybe has no context, and neither do lists. The general definition of monad has nothing to do with context, only with chaining.

    Try reading Real World Haskell? The text is available online.

    Read, until chapter 15 where I gave up around the Reader monad. I read the Thompson before. See below for horror example.

    OK. Let's take the simple example of map:

        map f []     = []
        map f (e:es) = f e : map f es

    To make it readable, f becomes function, e becomes first, es becomes rest.

    However map is pretty easy to grasp. For the promised RWH horror code, here is one from chapter 14 [realworldhaskell.org]:

    bindSt :: (SimpleState s a) -> (a -> SimpleState s b) -> SimpleState s b
    bindSt m k = \s -> let (a, s') = m s
                       in (k a) s'

    There are multiple issues: the SimpleState is not a state (a problem common to the State monad), but a state processor, a function. I really would like to inflict pain on whoever decided the name. The authors use m presumably for a monad, then (the gods know for what reason) k for a function. They also use a single apostrophe (the smallest character they could find, arguably) to distinguish the new state from the old. Funny thing, they actually follow up with a more "readable" version, a sorry attempt at readability that fails hopelessly (step reads as a noun, but is actually meant as a verb).

    [...] helps keep the code using the combinators reasonably short [...]

    Argh, no, you must not keep the damn code short! The alpha and omega is keeping it readable. Ideally, good code should read almost like plain English. Operators are not English, and should be used sparingly (I once thought it was silly to limit the operators in C++... now I see the wisdom). Sure, you can be too verbose, but at a minimum the code must be self-explanatory. Well, that's the professional coding world, where Haskell does not belong, and never will.

    Funny thing, I discovered that "terse" has a different meaning in English from what I expected. In my language (from which the word comes) it means clean, transparent, so I extrapolated "readable". Guess what, it took people praising Haskell for being "terse" for me to figure out that something was amiss.
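    For readers without Haskell, the bindSt combinator quoted above can be approximated in Python (a rough sketch that ignores Haskell's types and currying): a "state processor" is a function from a state to a (result, new_state) pair, and bind chains one processor into a function that builds the next.

    ```python
    # A state processor takes a state and returns (result, new_state).
    # bind_st mirrors: bindSt m k = \s -> let (a, s') = m s in (k a) s'

    def bind_st(m, k):
        def chained(s):
            a, s_next = m(s)     # run the first processor on the incoming state
            return k(a)(s_next)  # feed its result to k, run the processor k builds
        return chained

    # Example: a counter. 'tick' yields the current count and increments it.
    tick = lambda s: (s, s + 1)

    # Chain two ticks and pair up their results, threading the state invisibly.
    two_ticks = bind_st(tick, lambda first:
                bind_st(tick, lambda second:
                lambda s: ((first, second), s)))

    print(two_ticks(0))  # ((0, 1), 2)
    ```

    The point of the combinator is exactly that the caller never handles s, s_next, and friends by hand; whether the resulting style is readable is the dispute above.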
