Stats Programming

Are We Overestimating the Number of COBOL Transactions Each Day? (archive.org) 90

An anonymous Slashdot reader warns of a possible miscalculation: 20 years ago today, cobolreport.com published an article according to which there are 30 billion Customer Information Control System/COBOL transactions per day. This number has since been cited countless times... [T]his number is still to be found in the marketing of most COBOL service providers, compiler vendors (IBM, Micro Focus and others) and countless articles about how relevant COBOL supposedly still is. The article originally reported 30 billion "CICS transactions", but within 2 years it had already been turned into "COBOL transactions"...

The "30 billion" likely originates from a DataPro survey in 1997, in which they still reported 20 billion transactions per day. Only 421 companies participated in that survey. They actually scaled the results from such a small survey up to the IT-market of the entire world!

That same survey is also the source of many other numbers that are still to be found in the marketing of COBOL compiler vendors and articles:

- There are 200 billion lines of COBOL Code

- That's 60-80% of all the source codes in the world [sic]

- 5 billion lines of COBOL code are newly written each year

- There are 2 million COBOL developers in the world

- COBOL processes 95% of all "in person transactions", "ATM swipes" or similar

DataPro was bought by Gartner Inc. in 1997. Since then, all the numbers are reported to come "from Gartner". Only very early sources quote DataPro as their source.

Some of these numbers are obvious nonsense. The explanation for this is that DataPro had only surveyed mainframe owners. So the survey only says that 60-80% of all the source code on mainframes is written in COBOL (which is plausible, at least for 1997). And it only says that 95% of the credit companies that have mainframes use their mainframes for processing credit card transactions. Considering the low participation, we are probably talking about 19 of 20 credit companies here.

This discussion has been archived. No new comments can be posted.

  • by UnknowingFool ( 672806 ) on Sunday January 31, 2021 @11:39AM (#61012172)
    I guess the point is that maybe those numbers were not correct 20 years ago, but who is relying on those numbers to describe today? Certainly systems have been updated and retired since then, and the numbers need revising. Also, the banking world has changed, with fewer cash ATM withdrawals, etc.
    • by Dr_Barnowl ( 709838 ) on Sunday January 31, 2021 @11:43AM (#61012180)

      Don't forget non-corporate COBOL users.

      e.g. The UK Department for Work and Pensions.

      - Pays out £70,000 in transactions per SECOND (a rough per-day conversion is sketched below)
      - Despite frenetic activity on their new systems and having an in-house staff of over 700 developers, the "over 50% COBOL" statistic still stands
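
      For scale, here is the quoted per-second figure converted to a per-day figure, written as a tiny free-format COBOL sketch (the 86,400 seconds per day is the only number added here; compiles with GnuCOBOL via `cobc -x -free`):

        IDENTIFICATION DIVISION.
        PROGRAM-ID. DWPRATE.
        DATA DIVISION.
        WORKING-STORAGE SECTION.
        01  WS-PER-SECOND  PIC 9(6)  VALUE 70000.
        01  WS-SECONDS-DAY PIC 9(5)  VALUE 86400.
        01  WS-PER-DAY     PIC 9(12).
        01  WS-PRETTY      PIC Z,ZZZ,ZZZ,ZZZ,ZZ9.
        PROCEDURE DIVISION.
            COMPUTE WS-PER-DAY = WS-PER-SECOND * WS-SECONDS-DAY
            MOVE WS-PER-DAY TO WS-PRETTY
            DISPLAY "GBP paid out per day: " WS-PRETTY
            STOP RUN.

      Taking the quoted figure at face value, that works out to roughly £6 billion per day.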

    • by Mr. Barky ( 152560 ) on Sunday January 31, 2021 @12:02PM (#61012240)

      Maybe systems have been updated, but so often the systems doing the processing are so critical to the organization that updating them just means modifying a few (carefully reviewed) lines of code. Perhaps if someone were creating an entirely new bank, it might be COBOL-free, but that's not all that common (especially weighted by volume of transactions).

      My guess is that there are more transactions via COBOL today than 25 years ago, for the simple reason that there are far more electronic transactions now - more is done via credit cards and less is done with cash. Everything that needs to go through a bank probably goes through COBOL code at some point.

      • by Rockoon ( 1252108 ) on Sunday January 31, 2021 @01:02PM (#61012382)
        The problem with migrating from these old mature technologies is the fact that they are mature. That credit transaction code has a thousand conditional statements in it dealing with everything from regional financial laws, to the time-zone peculiarities of eastern nowhere where people west of a river go by one time zone and the people east of it another, regardless of anything else any database anywhere says on the matter.

        Mature code is ugly but it also transforms the situation into implementation defined requirements. Any faithful port of the logic will be just as ugly. The only reason to change languages is for access to subjectively better hardware unless that language is purposefully built for the task. One such purposefully built language is called COBOL.
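
        A purely illustrative sketch of that kind of accumulated special-casing; the region code, the river rule and the offsets below are all invented (free-format COBOL, not from any real system):

          IDENTIFICATION DIVISION.
          PROGRAM-ID. TZPICK.
          DATA DIVISION.
          WORKING-STORAGE SECTION.
          01  WS-REGION        PIC X(4) VALUE "EAST".
          01  WS-WEST-OF-RIVER PIC X    VALUE "N".
          01  WS-UTC-OFFSET    PIC S99 SIGN LEADING SEPARATE VALUE 0.
          PROCEDURE DIVISION.
              EVALUATE WS-REGION ALSO WS-WEST-OF-RIVER
                  WHEN "EAST" ALSO "Y" MOVE -5 TO WS-UTC-OFFSET
                  WHEN "EAST" ALSO "N" MOVE -4 TO WS-UTC-OFFSET
                  WHEN OTHER           MOVE  0 TO WS-UTC-OFFSET
              END-EVALUATE
              DISPLAY "UTC offset: " WS-UTC-OFFSET
              STOP RUN.

        Multiply that EVALUATE by a few decades of regulations and you get the thousand conditionals described above; any rewrite has to carry every one of them across.
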
        • by Mr. Barky ( 152560 ) on Sunday January 31, 2021 @01:10PM (#61012404)

          I completely agree. Here's an article that I think is pretty good on the subject (talking about how Netscape screwed up in choosing to rewrite their code and why they were wrong to do so).

          https://www.joelonsoftware.com/2000/04/06/things-you-should-never-do-part-i/

          • by Anonymous Coward

            It's nonsense though, as a lot of Spolsky's writings are. Sometimes rewrites are the best thing to do; the cost of maintenance can easily become higher than the cost of a rewrite in systems with high levels of technical debt. And writing code isn't the slow part of building software; gathering requirements, understanding the problems you're trying to solve, and coming up with the solutions are.

            Systems for which those things are well understood and well answered and where the original system was developed organical

        • by LostMyBeaver ( 1226054 ) on Monday February 01, 2021 @01:00AM (#61014218)
          I never understood why this is an issue.

          *** sorry for the book below... I started writing and... well it got out of control ***

          Back in the '90s I worked for a bank clearing house and was coding some stuff on PCs, and there was an NCR mainframe which was actually so far out of support that they were literally collecting spare NCR mainframes from dumpsters to have spare parts. There was simply no point in rewriting the banking code, since the mainframe from the 1970s was processing banking transactions for... everything, for about 1/3 of all bank customers in the state of Florida at the time. We had an entire building dedicated to paper check processing, with machines that reached the 5-meter-high ceilings lining all the walls. We were ingesting (what I assume was) millions of paper checks a week.

          The mainframe guys didn't see their jobs threatened because unlike nearly every PC technology, on the mainframe, everything is an object and everything is a transaction.

          To compare to a modern approach... an IBM mainframe is:
          - CICS - Function as a Service, similar to Amazon Lambda functions... almost identical in fact.
          - DB2 - Distributed object-storage database with NoSQL support, as well as (I think) an ACID-compliant SQL query engine and table support
          - JCL - Basically the AWS command-line tools for uploading and scheduling Lambda events and transactions
          - RPL3 - A batch language for generating form reports... it's basically a precursor to Crystal Reports, which is mostly dead since paper is too
          - TSO/E or ISPF - something like HTML for presenting the user interface to smart terminals... IBM 3270 terminals were sort of like a web browser in the sense that they supported things like pages and form controls and had the approximate equivalent of GET and POST.
          - COBOL - One of very many programming languages which can be called by the CICS engine

          I suppose I could go on, but overall, these systems are basically self-hosted AWS alternatives which have evolved ... slowly and hardened since 1969.

          I should point out that COBOL, while not the most exciting language, is insanely simple, once you understand that you generally don't write programs in COBOL; rather, you write functions. You schedule a function to run using JCL (Job Control Language) when an event occurs (a form posted data, for example), and then you use object storage and NoSQL-style methods to query databases and read or store information.
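
          To make that concrete, here is a minimal sketch of the "function" shape being described: a small subprogram that receives a request area from whatever invoked it (a batch caller here; under CICS the data would arrive via the COMMAREA), fills in a result and returns. The program and field names are invented, and it compiles as one free-format GnuCOBOL source file:

            IDENTIFICATION DIVISION.
            PROGRAM-ID. DRIVER.
            DATA DIVISION.
            WORKING-STORAGE SECTION.
            01  WS-REQUEST.
                05  WS-ACCOUNT-ID PIC X(10)           VALUE "ACCT000001".
                05  WS-AMOUNT     PIC S9(9)V99 COMP-3 VALUE 100.00.
                05  WS-TOTAL      PIC S9(9)V99 COMP-3 VALUE 0.
            01  WS-SHOW           PIC Z(8)9.99.
            PROCEDURE DIVISION.
                CALL "ADDFEE" USING WS-REQUEST
                MOVE WS-TOTAL TO WS-SHOW
                DISPLAY "Total with fee: " WS-SHOW
                STOP RUN.
            END PROGRAM DRIVER.

            IDENTIFICATION DIVISION.
            PROGRAM-ID. ADDFEE.
            DATA DIVISION.
            WORKING-STORAGE SECTION.
            01  WS-FEE            PIC S9(5)V99 COMP-3 VALUE 1.50.
            LINKAGE SECTION.
            01  LS-REQUEST.
                05  LS-ACCOUNT-ID PIC X(10).
                05  LS-AMOUNT     PIC S9(9)V99 COMP-3.
                05  LS-TOTAL      PIC S9(9)V99 COMP-3.
            PROCEDURE DIVISION USING LS-REQUEST.
                ADD WS-FEE TO LS-AMOUNT GIVING LS-TOTAL
                GOBACK.
            END PROGRAM ADDFEE.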

          What people generally don't realize about the mainframe is that it completely and totally lacks a single point of failure. You can have a few nodes or a few million nodes. Storage is distributed naturally across any format, from high-performance SSD through reel-to-reel tape... even S3 if you really want it.

          - IBM mainframes are and always have been a cloud.
          - Load balancers on the way into a mainframe distribute transactions across any available node (just like K8S ingress)
          - Unlike ingress, if a transaction doesn't complete, you can schedule a retry that will try another node
          - Functions are stored as source or binary objects in the common object database
          - Within operator provided constraints, all functions are elastic and location independent.
          - When a transaction comes in, the function associated with the transaction (often referenced with something like a URL) is loaded onto a node.
          - The CPU in the node isn't important. The binary format for IBM mainframes is a virtual machine language and has to be just in time compiled on the node where it will run if a binary wasn't already cached from earlier transactions.
          The system depended on the terminals to provide user interface functionality. Since the '60s, IBM had been using declarative UIs and a JSON-like language for triggering transactions. The difference being that terminals didn't need responsive UIs or single-page apps. Instead, when a new page layout was ne
        • I've seen efforts to rewrite "dusty old legacy code" at $very_big_mainframe_corp, in Java. After I don't know how much time and money spent they gave up and let the COBOL continue to do its job. It's clunky, it's boring, and it just works.
  • by Rosco P. Coltrane ( 209368 ) on Sunday January 31, 2021 @11:44AM (#61012186)

    if Netcraft confirms it.

    • by BAReFO0t ( 6240524 ) on Sunday January 31, 2021 @12:28PM (#61012302)

      But who confirms Netcraft?

      I've seen so little of them the last decade that we might ask BSD users to confirm their existence.

      • I am a BSD user, and have been since about 1982. I still exist (but refuse to confirm whether I am alive or undead).

        I have written occasional bits of Cobol, but I am reasonably certain that no Cobol written by me is still in use, and I will not write any more IBM or Microfocus Cobol ever.

        I will not confirm whether or not Netcraft or Gartner exists, nor will I listen to anyone else confirming anything relating to Netcraft unless they have evidence written in Snobol4 that they are a genuine Zombie.

  • Role of COBOL in your daily life overstated, news at 11.

    • See, real conspiracies are around but they're often boring to a general audience, like this conspiracy to overhype the use of an ancient programming language to drum up business for the handful of companies that still sell products and services related to it.

      Most people probably aren't even interested in the fake STEM shortage conspiracy.

    • by Anonymous Coward

      Articles like this are infinitely better and more suitable than the inane political rage-bait that gets posted here. COBOL's incredible PR factoids should absolutely be subject to scrutiny.

  • the number of LINES of Cobol code executed each day.
    • Most lines are in the identification division, the environment division (COBOL programming taught me the correct spelling of the word environment) and the data division.

      Think of Cobol as the software version of bureaucracy, where a small fraction of employees are doing actual work?

      • Yeah, but since '*' has to be written MULTIPLY, among other things (i.e. all the rest), you can count on many statements in all divisions that in most other languages would take far fewer lines. Thus a very different `wc -l`.
        • by Entrope ( 68843 )

          You have to admit that "MULTIPLY xx BY yy GIVING zz" is clearly an imperative statement.

          • He's implying that he can fit 1000 lines of clearly understandable COBOL into one line of any modern language (I exaggerate for effect). What he doesn't mention is that one line will take longer to write than 1000 lines of COBOL and will take three times longer to test and debug. Most people who knock COBOL don't understand just how BIG these systems are, they have worked on one or two large websites and think that's as big as it gets. Usually they come from AGILE environments and have no clue about proc
            • He's implying that he can fit 1000 lines of clearly understandable COBOL

              You are implying that understanding one line of Cobol out of context implies that it is possible to understand the entire context.

              As one of the few people here who has experience of actual Cobol, I initially read this statement as
              1000 lines of clearly incomprehensible COBOL

              which is far more credible if you have ever had to debug the stuff.

              • You are implying that understanding one line of Cobol out of context implies that it is possible to understand the entire context.

                Yeah, no. Pretty sure I wasn't.

                As one of the few people here who has experience of actual Cobol

                As do I, although it was a long time ago in a bank far far... well it's not that far away, and it's still running COBOL.
                Although I did dive into COBOL again not so long ago when writing a .Net front end for a COBOL backend and the COBOL guys kept sending me segments in the wrong ord

                  • Yes, the point I was making is that the problem is not necessarily Cobol but the environment as a whole. It can be very hard to find out why a statement is executed, not because of the code but, as much as anything else, because anyone who knew why it was written at all was dead before my grandchildren were born.

                  References to departments or machines that were scrapped before I moved there can also be a problem.

                  I write code expecting it to be used for 10 years - not 50 years. (I have also supported F

                    • Well, you can't really blame any programming language when the real problem is the corporate environment that caused the loss of system knowledge. I was told to go to one of the banks our company worked with and answer any questions they had. They insisted we had sent them a file via FTP to their mainframe; they wanted a change, and we needed to make it since we sent the file. When I told them: we don't send files to anyone, at all. We send SWIFT messages, and since you've been ACKing them, everything seems fin
        • You have been able to use standard operators in COBOL for decades.
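
          For what it's worth, both spellings below compile side by side on any reasonably current compiler (GnuCOBOL, for instance), so the `wc -l` gap is smaller than the MULTIPLY example alone suggests. Names and values here are made up:

            IDENTIFICATION DIVISION.
            PROGRAM-ID. MULTDEMO.
            DATA DIVISION.
            WORKING-STORAGE SECTION.
            01  WS-QTY    PIC 9(4)    VALUE 12.
            01  WS-PRICE  PIC 9(5)V99 VALUE 19.99.
            01  WS-TOTAL  PIC 9(7)V99.
            01  WS-SHOW   PIC Z(6)9.99.
            PROCEDURE DIVISION.
            *> the verbose form quoted up-thread
                MULTIPLY WS-QTY BY WS-PRICE GIVING WS-TOTAL
                MOVE WS-TOTAL TO WS-SHOW
                DISPLAY "MULTIPLY form: " WS-SHOW
            *> the operator form, available via COMPUTE for decades
                COMPUTE WS-TOTAL = WS-QTY * WS-PRICE
                MOVE WS-TOTAL TO WS-SHOW
                DISPLAY "COMPUTE  form: " WS-SHOW
                STOP RUN.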

  • by xack ( 5304745 ) on Sunday January 31, 2021 @11:56AM (#61012218)
    Look at the recent failures of trying to phase out Python 2 and Flash Player. Internet Explorer and Windows XP still power critical infrastructure. Cobol is here for at least another 100 years, when quantum Cobol will take over.
  • Estimates are always made to favor the needs of the people publishing the estimates. People who sell trash compactors estimate the need for such machines as a way of selling more; companies that train and hire out COBOL programmers estimate the need for COBOL to support getting more people to use their services.

    That isn't to say COBOL isn't used. I know of a large corporation involved in shipping where the majority of the data on packages passes through at least one COBOL-using mainframe as it moves arou

    • The days when the sales of trash compactors were predicted by the use of COBOL ended when recycling of paper was instituted.

    • by lrichardson ( 220639 ) on Sunday January 31, 2021 @02:09PM (#61012532) Homepage

      "Estimates are always made to favor the needs of the people publishing the estimates"

      Well yes, but actually no.
      There *are* a lot of estimates/studies which aim to be unbiased. These are generally from people/companies who have a solid reputation for doing such work.

      Conversely, I still remember the !@#$ing Microsoft ads, across virtually all media they thought would appeal to IT types, pushing the idea that 95% of the internet ran on Microsoft Server**, based on an independent survey!
      Where **, in the smallest font that media would support, mentioned that the number was based on sales ... completely ignoring the *minor* detail that Apache was running on 85-90% of all internet servers - based on numerous unbiased sources - and was completely free.

  • 30 billion per day? I would rather estimate 3,000 billion or more per day.
    For reference, the German railway system - Deutsche Bahn - had ~150 million passengers in the year 2018.
    Assuming everyone booked a ticket (which they did not, as many have monthly or weekly tickets), that would be roughly 400k bookings a day. Considering that most of the backends are in Cobol running on Tandems (yes!) and most transactions consist of a set of sub-transactions, assuming perhaps up to ten per ticket, that leads us to roughly 4 million COBOL-based transactions per day for a single company.
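
    Writing that back-of-the-envelope out (all numbers are the ones assumed above, including the ten sub-transactions per booking):

      IDENTIFICATION DIVISION.
      PROGRAM-ID. BAHNEST.
      DATA DIVISION.
      WORKING-STORAGE SECTION.
      01  WS-PAX-YEAR PIC 9(9) VALUE 150000000.
      01  WS-SUBTXN   PIC 99   VALUE 10.
      01  WS-BOOKINGS PIC 9(9).
      01  WS-TXN-DAY  PIC 9(9).
      01  WS-PRETTY   PIC ZZZ,ZZZ,ZZ9.
      PROCEDURE DIVISION.
          COMPUTE WS-BOOKINGS = WS-PAX-YEAR / 365
          COMPUTE WS-TXN-DAY  = WS-BOOKINGS * WS-SUBTXN
          MOVE WS-TXN-DAY TO WS-PRETTY
          DISPLAY "COBOL transactions/day, one company: " WS-PRETTY
          STOP RUN.

    That prints about 4.1 million per day for this single company, under those assumptions.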

  • Has anyone ever seen COBOL still being actively used?

    How much of what you saw in the last 10 years was COBOL? Not saw, as in code, but as in running software.

    And what industries? Banks? What else?

    My imprecise gut says the word we are looking for is "negligible". But I'm happy with being told I am wrong.

    • by nbvb ( 32836 ) on Sunday January 31, 2021 @12:52PM (#61012368) Journal

      Get a phone bill? COBOL generated that. Every call, every text, every data packet gets rated. And that rating engine? COBOL.

      Flight reservations? COBOL again.

      Just two examples that I’ve touched in my professional life routinely. There’s tons more.

      • Get a phone bill? COBOL generated that. Every call, every text, every data packet gets rated. And that rating engine? COBOL.

        As someone who maintained a phone billing system in Java, and another written in PHP, which I replaced with one I wrote myself at least partly in PHP, I refute that statement.
        The bills may have been generated by Cobol, but the call data collection was not in Cobol.

      • This is exactly the repeated stupidity that the story is about.

        • Indeed. There is a mythology that COBOL is widely used, but little evidence that it actually is.

          The PP is just repeating the myth while providing no evidence and no specifics.

          "Phone companies use COBOL." Really? Which phone company?

          "Airlines use COBOL." Really? Which airlines? Where are their job postings for COBOL coders?

      • Get a phone bill? COBOL generated that.

        No it didn't.

        My phone bills arrive as PDFs in my inbox. PDF generators are not written in COBOL.

        Perhaps you mean the data was processed with COBOL. Highly unlikely. Pac-Bell has been using Java for decades. They have dozens of job listings for programmers, but none for COBOL as far back as I can search.

      • by Cyberax ( 705495 )

        Get a phone bill? COBOL generated that.

        That hasn't been true for a long time. T-Mobile and Verizon both have billing systems written in Java. Not sure about AT&T.

    • by Mr. Barky ( 152560 ) on Sunday January 31, 2021 @01:07PM (#61012394)

      Banks are obviously the big one. In general, any organization that has been around a while (i.e., in business since the '70s) with large amounts of financial transactions. The gains of using computers were so great, and the standardization around COBOL basically made it a technology that is impossible to dislodge. It is too critical to the underlying infrastructure. Their core processing is done using very old code (but probably as close to bug-free as you can get), and it would be far too risky to change it, so it stays the same. Most is in-house so it isn't publicly visible.

      In addition to banks, you have airlines, government agencies (many different ones), railroads, electric grids, ...

      It isn't the "cool" technology and not a lot of new stuff is done using it (which probably means the volume of programmers working on such systems is relatively small), but I wouldn't call it negligible.

    • by RightwingNutjob ( 1302813 ) on Sunday January 31, 2021 @01:19PM (#61012424)

      Insurance.

      My mother writes COBOL for a living. Her employer services the insurance industry. Not the big boys like Allstate or Liberty Mutual who are big enough to have an in-house software team but little-ish regional companies in places like Upstate New York that are big enough to be insurance companies but too small to justify insourcing their software development.

      Many of these little guys have been around for many decades and started going digital versus pencil and paper in the 70s when COBOL was the only option. My mom's company was founded around then as a joint venture of a bunch of these little guys. And they just kept on going.

    • But I'm happy with being told I am wrong.

      You're wrong. Be happy.

    • I have a friend who works at an insurance company. He is a COBOL programmer. The core of their system runs on old mainframes with COBOL code. I asked him why they still run on them with old code. His answer was simple: "All the bugs are known."

      Makes sense. You are dealing with financial records. A bug, hardware or software, could cost millions if not billions of dollars before it's stopped.

      How many remember the old floating point bug in the old Pentium processor? Modern processors or a bug f

      • by bws111 ( 1216812 )

        Highly unlikely they are running 'old mainframes'. Mainframes, yes. Old, no. If they are like most mainframe users, the mainframe (and OS, middleware, etc) is probably less than 5 years old. The only 'old' thing is generally the application (business logic).

    • Logistics systems, insurance, banking, stock exchanges: I have either worked on these systems myself at some point or know people working on them. Also, I still have contacts in most of the places I have worked and can vouch that they are still running on COBOL. If you have ever worked with some backend system that, when called, sends you long fixed-length strings, you are probably talking to a COBOL program somewhere deep in the bowels of the backend. But that doesn't mean ALL COBOL systems will send you fix
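
      A hypothetical illustration of why such backends hand you one long fixed-length string: every field has a fixed PIC size, so the record is just the fields laid end to end. The layout below is invented and free-format, not taken from any real system:

        IDENTIFICATION DIVISION.
        PROGRAM-ID. FIXEDREC.
        DATA DIVISION.
        WORKING-STORAGE SECTION.
        01  SHIPMENT-RECORD.
            05  SHP-TRACKING-ID PIC X(10) VALUE "AB12345678".
            05  SHP-DATE        PIC 9(8)  VALUE 20210131.
            05  SHP-WEIGHT-G    PIC 9(7)  VALUE 1250.
            05  SHP-STATUS      PIC X(3)  VALUE "DLV".
        PROCEDURE DIVISION.
            DISPLAY "What the caller gets: [" SHIPMENT-RECORD "]"
            DISPLAY "Record length: " LENGTH OF SHIPMENT-RECORD
            STOP RUN.
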
      • I can confirm some of that. I know of a large insurance company here in SoCal that runs a large mainframe system fronted by Java apps that act as translation layers or transaction brokers sometimes. The folks who know how to support the mainframe backend are getting pushed out very rapidly, it will be interesting to see what happens....
        • It's because the Java people are promising they can replace the backend easily; they underestimate the amount of work involved (or know and lie about it). The real problem is the people pushing for the change. They want good-looking stuff on their CV, and helping maintain a functioning system based on dated technology does not look good on an IT worker's CV.
    • Lots of on-premises ERP systems use COBOL extensively, most often in payroll systems but also in inventory systems. The thing about COBOL is that, yes, it's ancient and not a lot of fun to code in, but it WORKS. It is extremely efficient at processing huge data sets. For years I have been hearing that it is going to be retired, but it ends up being such a massive undertaking that most places just give up.
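
      The "efficient at huge data sets" part is mostly this pattern: a straight sequential read over fixed-layout records, with no parsing and no object mapping. A minimal free-format sketch, with the file name and record layout invented:

        IDENTIFICATION DIVISION.
        PROGRAM-ID. PAYSUM.
        ENVIRONMENT DIVISION.
        INPUT-OUTPUT SECTION.
        FILE-CONTROL.
            SELECT PAY-FILE ASSIGN TO "payroll.dat"
                ORGANIZATION IS LINE SEQUENTIAL.
        DATA DIVISION.
        FILE SECTION.
        FD  PAY-FILE.
        01  PAY-RECORD.
            05  PAY-EMP-ID PIC X(8).
            05  PAY-GROSS  PIC 9(7)V99.
        WORKING-STORAGE SECTION.
        01  WS-EOF   PIC X        VALUE "N".
        01  WS-TOTAL PIC 9(11)V99 VALUE 0.
        PROCEDURE DIVISION.
            OPEN INPUT PAY-FILE
            PERFORM UNTIL WS-EOF = "Y"
                READ PAY-FILE
                    AT END MOVE "Y" TO WS-EOF
                    NOT AT END ADD PAY-GROSS TO WS-TOTAL
                END-READ
            END-PERFORM
            CLOSE PAY-FILE
            DISPLAY "Total gross: " WS-TOTAL
            STOP RUN.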

  • If it isn't broken, don't fix it.

    or

    I need a job fixing stuff that ain't broken.

    • They're saying the opposite, actually.
    • If it isn't broken, don't fix it.

      I suspect you still go to work on horseback. A much more sensible approach is:

      • If it ain't broken, maintain it.
      • If it's broken, either repair it or discard it.
      • If it's beyond maintenance, it's definitely broken.
  • - There are 200 billion lines of COBOL Code

    There probably aren't 5B lines of code in the entire body of project code bases in the ecosystems of Red Hat, Debian and Ubuntu.

    If you are starting from 200B lines of code and adding 5B new lines a year, your language is nothing short of some satanic torture device meant to prepare developers for an eternity in Hell in terms of how ineffective it is at anything resembling decent practices.
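
    For scale, dividing the summary's own figures against each other: 5,000,000,000 new lines a year across the claimed 2 million COBOL developers is about 2,500 lines per developer per year, or roughly ten lines per working day, with no claim here about whether either figure is right.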

    • Don't be dense; an operating system is actually a very limited piece of software with a very specific job. You are also forgetting one very important thing. There is one Ubuntu in the WORLD. But there is ONE logistics system for EVERY logistics company in the world. Sure, there are some companies which share a common COBOL logistics system, but most COBOL systems are home-grown and heavily individualized. So, do you think there is 5B lines of code in 1000 Ubuntu projects? Or 10,000? Or 100,000? I tr
  • - That's 60-80% of all the source codes in the world [sic]

    Among physicists and engineers of a certain age, who wrote FORTRAN programs until some time in the '80s or '90s, the plural of "program" is also apparently "codes".

    I'm 35. I Learn[ed] To Code for realz in college in the early 2000s. I've always known "code" to be a collective noun.

    Any 4 or 5-digit user ids around here care to weigh in on when the transition happened?

  • We do not know, but it is a lot... probably billions.

    Only once you know who is using COBOL (or whatever) and how it is being used does "how much" become a figure that might be addressed objectively.

    But hey, you have not lived until you have to face the new guy who wants to swap out all of the company systems and implement a service-based architecture, when he cannot even write a simple service.

    • But hey, you have not lived until you have to face the new guy who wants to swap out all of the company systems and implement a service-based architecture, when he cannot even write a simple service.

      Don't you love those guys? Don't you love it even more when those guys are the boss's nephew or son? Fortunately, I was already on the way out and got to watch it all happen from a distance.

    • Or worse, the people who want to implement framework xyz so they can put it on their CV and therefore be more marketable, whether it fits into the current system or not.
  • Just envy (Score:5, Interesting)

    by dromgodis ( 4533247 ) on Sunday January 31, 2021 @02:15PM (#61012546)

    I think many programmers are just envious because they know that the code they write will not be used for a tenth of the lifetime of the reputable Cobol code.

    • I think they are just ignorant; they have never worked on really large systems and probably never will. They THINK they have, though, which is just annoying. I started on these large systems and have never had a problem finding more work on them, but according to others it's actually a hard industry to break into. It doesn't help that they don't generally advertise; if you don't know the right agents, you will probably never even get the chance to try.
    • To be fair, many companies would have been happy to replace Cobol with something else around 1999. How many Cobol programs are still in use because nobody can replace them with something more "modern"?
  • As someone who wrote their first program in FORTRAN, in 1976, I have seen languages come and go, and the birth of C, Java and Python. I have seen/heard thousands of debates about which is the 'better' language, and the pros and cons. I also wrote a lot of assembly language programs in Z80, 6502 and x86. Those are the 'real' programs. I am always bemused when high-level language developers have no idea of a CPU instruction set, and argue among themselves about the best language. I tell them that they will be
  • I began writing software in the early 1980s in BASIC, Assembly, RPG II, eventually C, Bash, Perl, C++, and PHP, along with SQL, and more recently JavaScript/Node.js, C#, and Python. The truth is, these languages have continually become more complex, more memory-bloated, slower, harder to learn, and less efficient to code in. Frameworks have enabled people who cannot write software from scratch to write applications too, but very restricted and poorly performing ones.

    COBOL is still easy to argue as the best overall

    • by Tablizer ( 95088 )

      COBOL is a domain-specific language and it fits the domain pretty well such that general-purpose languages have a hard time competing against it.

  • by greenwow ( 3635575 ) on Sunday January 31, 2021 @11:08PM (#61013922)

    The company I work for still uses COBOL. Every paycheck for well over three million employees is generated by COBOL. We've been trying to replace it for well over a decade, but we've failed so far since there are just so many rules.

  • If you look at CSDs (Central Securities Depositories), you'll find scary amounts of code written in both Cobol and PL/1.

    CICS is also still alive and kicking in a lot of places.

  • Just "Hello World" takes 50-100 lines of code. Modernization of the language to support things like "functions" requires multiplied additional lines of code, just to start telling the program what you actually want it to do. I'm surprised the number of lines of Cobol isn't higher. But that doesn't equate to a lot of functionality. I'm old enough to have written Cobol professionally. I don't ever want to see another line of it!
