
Cobol Programmers Heed the Call to Support America's Overloaded Unemployment Systems (ieee.org)

Earlier this week Slashdot reader puddingebola shared a CNN story headlined "Wanted urgently: People who know a half century-old computer language so states can process unemployment claims..."

But now IEEE Spectrum reports that "Cobol programmers in the United States are heeding the call to work on antiquated state unemployment benefits computer systems..." The new claims brought the three-week total to more than 16 million, the equivalent of a tenth of the U.S. workforce. The spike in new claims has inundated benefits computer systems in states such as Connecticut, Florida, and elsewhere, some of which haven't updated their Cobol-based mainframe systems in years, or decades...

New Jersey isn't alone. Florida's unemployment claims system has been so overwhelmed that the state is reverting to using paper applications. Massachusetts deployed more than 500 new employees to work remotely to meet increased demand that has overloaded its unemployment system... Connecticut's labor department is bringing back retirees and using IT staff from other departments to upgrade its 40-year-old system, which runs on a Cobol mainframe and connected components.

One company that says it reached out to New Jersey was the Texas-based "Cobol Cowboys" placement agency, with nearly 350 contractors, including a man in his mid-80s "who did some work with Grace Hopper." Also helping is U.S. Digital Response, a new group hoping to find skilled volunteers with technical skills for government agencies.

The article notes estimates that Cobol still handles 95% of all ATM swipes and 43% of banking systems.

Comments Filter:
  • by martiniturbide ( 1203660 ) on Sunday April 12, 2020 @08:39AM (#59936290) Homepage Journal
    Rumor has it that a lot of COBOL programmers were cryogenically frozen to solve the Y10K bug. ;D ;D
  • by Rockoon ( 1252108 ) on Sunday April 12, 2020 @08:41AM (#59936300)
    Applied for unemployment mid-March. Before being "approved", the website still listed an additional federal +$25 weekly supplemental from 2001, when the dot-com bubble burst, as part of my expected benefits.

    That's basically how long it's been since anyone maintained the unemployment COBOL code. 19 fucking years. 19 fucking years!
    • Re: (Score:2, Flamebait)

      Last time I had breakfast I had bacon and eggs. Those fucking chickens and the eggs are the same damn chickens and eggs for the last several thousand years at least. The last time the "egg code" was maintained was a billion years ago. A billion fucking years!

      What the fuck does a stupid website have to do with the price of tea in China? That is like saying that the library had books because the ants have hills. One has absolutely nothing whatsoever to do with the other.

    • Was that 19 year old cruft in the COBOL code, or in the legal code though?

  • You have a 40-year-old system running software written in an at least 30-year-old dead language; maybe the state you are in should have some punishments applied to it. Every fuckin time there's even the slightest bump in the road, one of these COBOL stories goes up, and it's usually on a somewhat critical system. While these state agencies waffle and fart around, people are being directly hurt, all because some bureaucrat twat doesn't want to understand anything to do with technology... it's pathetic.

    • Re: (Score:3, Insightful)

      Nobody cares about IT until something breaks. But you knew that already.
    • Re:Here's a thought (Score:5, Interesting)

      by ArchieBunker ( 132337 ) on Sunday April 12, 2020 @10:13AM (#59936718)

      COBOL isn't dead and the fact that it ran for three decades says a lot about its design. Meanwhile your python code breaks between major releases.

      • COBOL isn't dead and the fact that it ran for three decades says a lot about its design. Meanwhile your python code breaks between major releases.

        It has very little to do with the design of the language. ANY code can run for three decades if you utterly control its environment and prevent any part of it from ever changing, to the point you're salvaging replacement boards out of old, dead machines to use when boards in your production machine fail. Which is exactly what is happening.

        Really what we need is a Century Scale Computing Initiative. It kicks off for a limited 2 year period every 20 years to design a computing environment that is intended

        • by bws111 ( 1216812 )

          This post is a riot. First, nobody is taking boards out of old, dead machines. The hardware, OS, and middleware are probably less than 5 years old.

          The rest of the post is meaningless because you failed to realize this is what mainframes are. A hardware and software ecosystem that allows continuous replacement with the newest stuff, while continuing to be able to run existing applications unchanged. While not 100 years old yet, we are currently at the 56 year mark.

          • This is absolutely what mainframes are. The previous post was in fact a riot, as stated. Practically nobody is still running a 360-something other than, perhaps, in a museum. The current (very current - you can call up IBM and order one today) z series still runs all the old 360, 370, etc. OSes and middleware and software. So the story about CT running on 50-year-old computers isn't credible - nothing electronic from 50 years ago is still working reliably enough for daily business. But the software certainly can.

            COBO

            • The current (very current - you can call up IBM and order one today) z series still runs all the old 360, 370, etc. OS's and middleware and software.

              Why buy one? IBM will happily offer you as many as you need as a Z cloud service [ibm.com].

              After IBM generously sends in a few Green Beret COBOL Commandos to help you out during this emergency and form a beachhead . . . they will follow up with a full army of salesmen and consultants who will strategically advise you how to move to their Z cloud model.

        • The IBM Z servers are pretty sweet; check them out [s81c.com]. I wouldn't mind having one of those; I'd install Linux on it and have a Beowulf or something.
          • I wouldn't mind having one of those, I'd install Linux on it and have a Beowulf or something.

            . . . or you can just buy a dedicated Z server with Linux already installed: IBM LinuxONE [ibm.com]

            Getting all those drivers to compile and run on Z hardware might be a bitch if you try to do it yourself. You might need EBCDIC to ASCII thunks, or something gnarly like that.

            Now . . . what does "ONE" stand for . . . ? Didn't Sun used to call their networking stuff "ONE" for "Open Net Environment" . . . ?

            • by bws111 ( 1216812 )

              LinuxOne, like all Z machines, is processors, memory, and IO (network, fiber channel, etc) only. As they do not include any storage they can not come with Linux (or anything else) installed.

              The thing that sets LinuxOne apart from other Z systems is that it contains IFLs only, and will not run 'traditional' mainframe workload.

              You can get Red Hat, SuSE, and Ubuntu distros for Z, there is no need to compile drivers or any such thing.

            • Solved: https://www.ibm.com/support/pa... [ibm.com] In JCL there is an option to use about 60 translation tables, and in SORT you have the same richness of character tables. There is the TRANSLATE verb as well. Data Pipes do translation too: https://www.ibm.com/support/kn... [ibm.com] I suspect they have no IBM system programmers to remind them how the issue is solved out of the box.
        • Most hardware easily lasts 10+ years, especially high-end server hardware.
          And for old systems we have emulators (virtual machines). For long-term storage, optical disks.

          All problems solved long ago.

          • Apropos of nothing related to COBOL, I have a 2007-era desktop (with 8GB RAM, a newer Nvidia GPU and a SSD boot drive) that runs Windows 10 very well. Fast startup, fast response, a little slow when hit with CPU-intensive jobs but for general use it's the nicest Win10 box in the house. No COBOL compiler, though; nobody has done one for the gcc system.

        • The secret behind Cobol was its strongly typed data structures. You could not do anything until you worked your data out. There was a data administrator - call it a data nazi - who made everyone suffer by not letting them re-invent data, or field lengths, or whatever. There are plenty of tools on IBM to globally change all fields and recompile. IBM even has a register that points to the start of data structures, so that if you expanded a 10-digit number to 12 digits the assembler or linker would throw an error. Easy As. Bu
      • Meanwhile your python code breaks between major releases.

        Oh yeah, thanks for reminding me why I hate Python. Could be worse, with Node libraries you have to test to make sure it still works between minor releases. Note this isn't true for Javascript: Javascript has weaknesses, but backwards compatibility is solid.

      • Python is contributing greatly to the decay of software engineering as a science. Any dynamically typed language should be legislated as illegal to use in a critical government function.

  • by Dunbal ( 464142 ) * on Sunday April 12, 2020 @08:56AM (#59936370)
    And just like that, four new divisions were created?
  • by xack ( 5304745 ) on Sunday April 12, 2020 @09:00AM (#59936394)
    Whatever language you propose rewriting it in, you face rewriting it again in 2070. What we need is a standard of computing that can go centuries without maintenance, so that 500 years down the line our descendants can understand the code.
    • Pretty sure that language is called BASIC.
    • What we need is a standard of business practices and legislation that can go centuries without changes. Yeah, right.
    • What we need is a standard of computing that can go centuries without maintenance, so 500 years down the line so our descendants can understand the code.

      Such a standard can only be useful in fields where there is no change for centuries at a time. That is why the legal profession still runs on fax transmission of printed pages.

    • I think you will observe that spoken and written languages evolve in less than 500 years, so our descendants will have trouble reading this Slashdot page. The problem is that our language is not a static thing.

      Some examples:
      a) William Shakespeare's plays were written in his active years of 1585–1613 which is less than 500 years ago. However, his plays are written in Olde English.
      b) Similarly, the King James Bible was originally published in 1611 which is also written in Olde English.
      c) Sir Isaac Newton

      • I think you will observe that spoken and written languages evolve in less than 500 years, so our descendants will have trouble reading this Slashdot page. The problem is that our language is not a static thing.

        Some examples: a) William Shakespeare's plays were written in his active years of 1585–1613 which is less than 500 years ago. However, his plays are written in Olde English. b) Similarly, the King James Bible was originally published in 1611 which is also written in Olde English.

        Both of these were written in early modern English - which is why they are still readily intelligible today, with only a little support in the way of a glossary of obsolete terms. The need for a term glossary for complete understanding is only a matter of degree - people need dictionaries today to understand everything being written today as well, and the farther back you go (19th, 18th...) the need gradually but steadily increases.

        Shakespeare was in fact the most influential contributor to modern English.

      • Shakespeare and Newton spoke modern or early-modern English, which developed around the Elizabethan period (Shakespeare's time). Chaucer wrote in Middle English, which is distinctly tougher than Shakespeare for a 21st-century reader: it requires annotation. And Old English is even more unintelligible to the modern reader.

      • However, his plays are written in Olde English.

        which is also written in Olde English.

        Both of these works are written in late Early Modern English. Old English fell out of use sometime after 1100.

    • That would mean the software needs to be highly adaptable and people would need to predict what future generations might need. The adaptable part is pretty easy, we already have that with some restrictions; for example, an SQL-like database can be read by any type of programming language that can read the database.

      "so our descendants can understand the code"
      What about a source-to-source translator, aka a transcompiler? It can convert the source code to another programming language. We already have that too.

      In t

      • In a way, COBOL was designed to be self-documenting. Yes, that's kind of a joke, but it *is* written in a form of structured English. If you understand the language's basic rules and procedural programming in general, you can pick up a COBOL program and almost immediately understand what it's doing. It always had a reputation for being excessively wordy, but that's because it wasn't hiding its actions behind a bunch of inscrutable symbols (unless you wanted to; there were ways to specify complex calculation
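
        For instance, here is a minimal sketch of that structured-English style (program and field names are invented for illustration, not taken from any real benefits system; it should compile with GnuCOBOL in free format, e.g. cobc -x -free):

        IDENTIFICATION DIVISION.
        PROGRAM-ID. WEEKLY-BENEFIT.
        DATA DIVISION.
        WORKING-STORAGE SECTION.
        01 WS-ANNUAL-WAGE    PIC 9(7)V99 VALUE 41600.00.
        01 WS-WEEKLY-BENEFIT PIC 9(5)V99.
        PROCEDURE DIVISION.
        *> the business rule reads almost like the sentence describing it
            DIVIDE WS-ANNUAL-WAGE BY 52 GIVING WS-WEEKLY-BENEFIT ROUNDED.
            IF WS-WEEKLY-BENEFIT IS GREATER THAN 600.00
                MOVE 600.00 TO WS-WEEKLY-BENEFIT
            END-IF.
            DISPLAY "WEEKLY BENEFIT: " WS-WEEKLY-BENEFIT.
            STOP RUN.

        Even a non-programmer can read the rule straight off: divide the annual wage by 52, cap the result at 600.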

    • What we need is a standard of computing that can go centuries without maintenance

      That's what MongoDB was created for. It's a webscale database and doesn't use SQL or joins, so it is high performance.
      https://www.youtube.com/watch?... [youtube.com]

  • I believe that if you always wanted to learn COBOL but never had the chance (or were too afraid) to start your studies in some very, very ancient language, you don't need to worry anymore! You still have a lot of time to catch up with the other dinosaurs and earn some very good money. I think there will be a lot of legacy systems drowning in COBOL still lurking around for at least another decade or two... :D

    • Re:OUTATIME (Score:4, Interesting)

      by Rockoon ( 1252108 ) on Sunday April 12, 2020 @09:09AM (#59936444)
      The issues are twofold.

      COBOL programmers with credentials to back it up either (a) already have a VERY high-paying job in the financial industry or (b) have retired.
      These states want volunteers who will work for free, rather than paying the market rate.

      What's going to happen is that these states are going to get amateurs, and their amateur code is going to be even harder to maintain.
      • Let's take the prison population and put them through code camps. Afterwards, they can be paid in sentence reductions. Just because someone is a murderer doesn't mean they can't be good at programming. Even truer for white-collar criminals.
      • by Eirele ( 6731032 )

        The issues are twofold.

        COBOL programmers with credentials to back it up either (a) already have a VERY high-paying job in the financial industry or (b) have retired.

        Well, my previous comment had a little funny in between...

        ...But, as a possible scenario, this is still not too far outside reality if people try harder. All novices could try to learn COBOL the hard way: I really believe in learning on the job. If they could join forces with other ancient COBOL programmers (including the retired ones) th

    • COBOL is nothing to fear, unless you believe you can die of boredom.
  • In other news, President Trump was blamed by CNN for defunding COBOL programs.

    "COBOL is a dead language, like latin and we shouldn't put any more money into it. We need more full stack cloud developers, SCRUM Masters, and great Dev Ops Americans to compete with the China.", Trump said in a press conference. Further he retorted to the the fake news operative, "Do you think COBOL keeps Netflix or Fox News running? Give me a break?"

    • We need more full stack cloud developers, SCRUM Masters, and great Dev Ops Americans to compete with the China."

      Wow, I was going to vote for him before I read that. I seriously dodged a bullet, it could have been awful if he got re-elected.

      Here in the modern world, we do Kanban not scrum.

  • What are they tracking... 20 or 30 fields per person?  Why don’t they just junk that big iron and rebuild it?  Seems like something a couple competent devs could make in a few weeks.
    • What are they tracking... 20 or 30 fields per person? Why don’t they just junk that big iron and rebuild it? Seems like something a couple competent devs could make in a few weeks.

      That's cute.
      You forgot to include all the laws and rules about privacy.
      And you forgot to include all the laws and rules about how unemployment works.
      And you forgot to include all the calculation logic to determine payouts.
      And reporting.

      Oh, and you forgot to include the infrastructure "required" to get a process like this going. I don't mean building and servers... I mean the agency boss, assistant, and lieutenants -- lots of lieutenants!

      You're speaking as if a guerrilla process is permitted inside a bureauc

    • by raymorris ( 2726007 ) on Sunday April 12, 2020 @10:09AM (#59936696) Journal

      COBOL is often used in applications where it needs to be correct every time, on many millions of records which will include corner cases. Things like banking - it's not okay for Wells Fargo's code to mess up your bank balance 0.001% of the time and set it to -$16 million because of an overflow.

      I worked at a government agency where they tried re-writing some old stuff, stuff that had stood the test of time, using new languages and tools, and new programmers. That included switching from DB2 to MS-SQL. About half of the new subsystems choked on names like O'Reilly, due to the apostrophe. That indicates they were also susceptible to SQL injections.

      Overall, the new stuff mostly pretty much worked, with some manual hacks to go back and fix data each day. The old stuff went years between errors.

      Is it possible to write new code that doesn't have extremely stupid, obvious errors like embedding untrusted user input directly into your SQL queries? Of course it's possible. And in fact rewrites include such stupid, obvious errors routinely. If you get much better programmers who are careful and actually write both positive and NEGATIVE test cases, you can avoid the obvious problems. Of course most people reading this have never written a negative test case in their life. There will still be bugs, just fewer really stupid bugs. Corner cases abound.

      The older language and development processes weren't about "it looks like it pretty much works", programmers studied how to mathematically prove their code was correct. Not that they always did prove it, but they knew the concepts of constructing systems that can be known to be correct. What bugs did make it to production were found and fixed 35 years ago.
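
      For what it's worth, that posture shows up in the language itself: COBOL uses fixed-point decimal fields and lets you trap overflow at the statement level. Here is a small hypothetical sketch (names and values invented, GnuCOBOL-compatible) of the bank-balance overflow case above being caught instead of stored:

      IDENTIFICATION DIVISION.
      PROGRAM-ID. BALANCE-DEMO.
      DATA DIVISION.
      WORKING-STORAGE SECTION.
      *> signed fixed-point decimal: 9 digits before the point, 2 after
      01 WS-BALANCE PIC S9(9)V99 VALUE 999999999.98.
      01 WS-DEPOSIT PIC S9(9)V99 VALUE 5.00.
      PROCEDURE DIVISION.
          ADD WS-DEPOSIT TO WS-BALANCE
              ON SIZE ERROR
                  *> result would not fit; WS-BALANCE is left unchanged
                  DISPLAY "OVERFLOW - TRANSACTION REJECTED"
              NOT ON SIZE ERROR
                  DISPLAY "NEW BALANCE: " WS-BALANCE
          END-ADD.
          STOP RUN.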

      • People writing government apps weren't proving out anything, and it's obvious that performance testing never happened.

        If it can't do 10x the work on 1000x faster hardware, something is seriously wrong.
        • by dwywit ( 1109409 )

          These aren't "apps", they're "systems".

          What did you pay for that 6-digit UID?

          • App is short for Application, and is more specific than System.

            Or do you still secretly yearn to call them "programs"?

            I've been in various facets of the industry since the late 80s.  I've also evolved skills and frame of reference to stay relevant. Believe it or not, a LOT of really useful stuff has been developed since you were submitting batch jobs and collecting greenbar output in college.
            • by dwywit ( 1109409 )

              1984 System/36, then 1988 AS400. Rock solid, both of them. Then I watched the world turn to "client/server" BS with cheap (or free), but buggy hardware and operating systems.

              "What, the file server crashed AGAIN?"

              So, instead of programming/managing, I turned to fixing. Definitely a reliable source of income, there.

      • " About half of the new subsystems choked on names like O'Reilly, due to the apostrophe. That indicates they were also susceptible to SQL injections."

        This is an idiot whippersnapper bad programmer issue and has nothing to do with the language or the database. It is simply that the young 'uns are stupid, useless, incompetent children who should not be allowed to play with computers.

        This is a very common thing. So common, in fact, that competence and common sense are almost non-existent.

        • by raymorris ( 2726007 ) on Sunday April 12, 2020 @02:34PM (#59937730) Journal

          > This is an idiot whippersnapper bad programmer issue and has nothing to do with the language or the database.

          The first clause is true. Let me suggest that the second clause is less true.

          Suppose you looked at 1,000 random things written as Excel macros, by people who use Excel macros for programming, and then looked at 1,000 things written in assembly, by assembly programmers. I think you'd find that on average the assembly programmers know programming a bit better and make fewer really stupid mistakes.

          If you look at randomly chosen examples of things built with Lego and compare them to randomly chosen examples of things built with a CNC machine, I think you'll find the CNC-built items are, on average, better engineered than the Lego constructions. CNC machines are more likely to be used for well engineered items than Lego is. Lego is easy to build with and suitable for kids. CNC is harder and suitable for professionals.

          When Lego wanted a programming language to allow kids to program their Lego constructions, they chose Python. Python is literally the Lego of programming languages. It fits because it's easy to use to build something, suitable for kids.

          In Python it's also easy to confuse strings with integers, and integers with floating point numbers. It's easy to make a lot of mistakes that are prevented by "harder" languages. Some languages make things harder to make - including mistakes. It's harder to make mistakes when you have to pre-declare the types of all of your variables than it is when variables magically spring to life without ever being declared at all.

          Some languages make certain assertions and error checking a required or normal part of using the language. Other languages make assertions and error checking more difficult. In general, the languages that are "quick and easy" also make it easy to skip any error checking, to not bother with it.

          A huge class of errors comes from side effects and global variables. Some languages have no global variables and side effects aren't permitted - you can't write a function which changes global state. In others, such as MS-SQL, pure functions which only return values (called user-defined functions) are a totally different type of entity than procedures which affect state (called stored procedures). So you know at a glance which routines can affect state and which can't - user defined functions *can't* change the data.

          So I'd say some languages very much encourage a disciplined approach which reduces errors, while other languages encourage practices which are more error prone (often "quick and easy" practices).

          Furthermore, each language comes with a culture, not just semantics. If you interact with the Perl community and read books by Randal Schwartz you'll likely take a different approach to software development than if you're steeped in the Lisp community.

          Which leads to an interesting open question:
          Given that most programming time is spent debugging rather than writing the initial draft of a function, reducing the opportunities for bugs reduces development time. Knowing that, do "quick and easy" languages which have more opportunities for bugs actually take LONGER than more disciplined languages that reduce bugs by doing things like declaring variables? It's entirely possible that for many types of programming "quick and easy" is the slowest, hardest way to do it because of the time you spend at the end getting the bugs out of that quick and easy code, fixing mistakes that you couldn't easily make in a "harder" language.

          • This oft-stated problem of confusing strings with integers: does anyone actually believe that a halfway decent programmer would make this mistake?
            • It's not even so much the PROGRAMMER not being clear on strings vs numbers as it is untyped and weakly typed programming languages doing the wrong thing. You can know that your employee ID is a string, but you can't tell Python that.

              YOU may know that along with addresses like 5610 Maple Street, there can also exist 5608B Maple Street, but there is no mechanism to tell Python that part of the address, in this case 5610 is a string, not actually a number, and have that stick.

              The project I was doing *today* needed t
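
              COBOL's PIC clause is that mechanism: the declaration itself fixes whether a field is character data or a number, however numeric it looks. A hypothetical sketch (field names and values invented, GnuCOBOL-compatible):

              IDENTIFICATION DIVISION.
              PROGRAM-ID. TYPED-FIELDS.
              DATA DIVISION.
              WORKING-STORAGE SECTION.
              01 EMPLOYEE-RECORD.
                 05 EMP-ID        PIC X(8) VALUE "00451297".  *> alphanumeric, even though it looks numeric
                 05 STREET-NUMBER PIC X(5) VALUE "5608B".     *> "5608B" is perfectly legal here
                 05 WEEKLY-WAGE   PIC 9(5)V99 VALUE 684.50.   *> numeric; arithmetic is allowed
              PROCEDURE DIVISION.
                  DISPLAY "ID " EMP-ID " AT " STREET-NUMBER " WAGE " WEEKLY-WAGE.
                  STOP RUN.

              Write ADD 1 TO EMP-ID and the compiler rejects it; the string-ness sticks.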

            • No. Only idiots make that mistake. And in both cases the idiot making the mistake is told about it.

      • It is an easy mistake for inexperienced developers to forget about best practices for SQL, like input validation and parameterized queries.
        The scary part is that they got hired in the first place. Whoever hired them should be fired. When it comes to government software you should take it very seriously and hire the best, most experienced developers you can.

    • Sure, slap in a few Raspberry Pis running emulators. There can't possibly be any bugs or unforeseen issues...

    • Seems like something a couple competent devs could make in a few weeks.

      Unlike the "modern" Agile processes all the Hipster Boomer-haters are in love with today, the existing systems required extensive testing and validation prior to being used. When a system disbursing trillions of dollars has a bug it's a Really Bad Day, and you can't just put it on a whiteboard to "fix it in the next Sprint."

      I find it rather amusing to see all the kids who rail against Boomers and refuse to employ anyone over the age of 30 suddenly forced to come begging those very people to help save them. P

      • Praise be to the Boomers, who bestowed upon us code so awesome that it tips over under 10x normal load, on hardware that's 1000x as fast as it was written on.

        • by bws111 ( 1216812 )

          The need to change the code has nothing to do with the load, it has to do with the fact that the rules changed. The fact that the systems are overloaded has nothing to do with the code, it has to do with the fact that the hardware is sized for typical loads. That problem is easily corrected by adding capacity to the hardware, which can usually be done instantaneously and with zero interruption to the running systems.

          • That stuff was written for a machine less powerful than a smart watch.  How do you even scale down modern hardware that far?  I remember IBM throttling things on the AS400 series unless you paid them.  Is this what's going on?
            • by bws111 ( 1216812 )

              Yes, of course. A customer who needs to process 10k transactions a day does not want to pay for hardware or software license fees that can process 100k transactions a second. They also do not want their software license fees to jump just because they purchased new hardware. So they buy one of the other 300 or so speed choices to match their needs. If a sudden peak comes along they can either add permanent capacity, or temporary capacity (CBU).

              • Sounds like an extremely wasteful scam.
                • by bws111 ( 1216812 )

                  Yeah, having customers only pay for what they use is quite a scam.

                  • When you sell them the hardware for an exorbitant price, yes.
                    • Most mainframes were (and are) leased. There were rare exceptions, but for the most part they were treated like the cloud is now - you bought compute cycles, not machines. Yes, you had to provide a data center to install it in, just like corporate server farms that are now becoming "private clouds."

                      If you suddenly needed more capacity, the mainframe companies (IBM is the only real survivor) could upgrade your system in place quite easily and quickly as long as there was space, power, and cooling available.

                    • At last, I understand. My biases ran away with me.  Thank you for explaining.
        • Don't you mean the executives who saw that it "just works" for decades and so never spent money on planning for a 10x load or the need to make changes to the code for emergency situations, and thus got rid of all the people who could make the changes? Imagine if your code actually just worked continuously for 10 years without having to be changed? Can you imagine that level of reliability? Oh, wait, you can't because your code is shit and fails regularly enough that people like me exist to go in and pro
          • "Imagine if your code actually just worked continuously for 10 years without having to be changed? Can you imagine that level of reliability?"

            Not only can I imagine it, I've achieved it. Used to freelance in the mid 90's, Accounting and Logistics, with a bit of GIS. I moved on to the game industry 15 years ago, but about half of those apps are still in use. Nobody has touched them since. It's not that hard to write solid code for businessey apps if you understand the problem domain and work al
  • IDENTIFICATION DIVISION.
    PROGRAM-ID. WE-ARE-FUCKED.

    ENVIRONMENT DIVISION.
    CONFIGURATION SECTION.
    SPECIAL-NAMES.
    CLASS WS-GLOBAL-WARMING-WILL-MAKE-THIS-LOOK-LIKE-A-PICNIC IS
    'Y' 'E' 'S' 'I' 'N' 'D' 'E' 'E' 'D'.

    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01 WS-FAREWELL-MESSAGE PIC X(30).

    PROCEDURE DIVISION.
    *> completing the joke so the fragment compiles
        MOVE "GOODBYE AND GOOD LUCK" TO WS-FAREWELL-MESSAGE.
        DISPLAY WS-FAREWELL-MESSAGE.
        STOP RUN.

    • I bought an old book on COBOL out of curiosity. After long chapters on the language (which I don't have the capacity to understand), it explains that we may one day build teleporters to other planets using COBOL. Wow... Sorry for my bad English, I'm French.
  • by OscarGunther ( 96736 ) on Sunday April 12, 2020 @09:35AM (#59936568) Journal

    I like the idea of a bunch of septuagenarians riding over the hill to save our butts. Smacks of The Crimson Permanent Assurance, if anyone remembers that short.

  • It is for this very reason that these kinds of systems should be open sourced. That the state and federal governments in the U.S. have steadfastly -refused- to move to open source code repositories and code sharing is near criminal. If code is speech, then government code should be freed speech. Simple as that.

    If these codebases were open sourced in publicly available repositories, they'd be worked on. I have no doubt. Our community is JUST THAT WAY. That wouldn't mean the projects would have to accept any a

    • Absolutely not. Linux and FreeBSD are "maintained", and they are very complex codebases. Plus, I'd imagine that the great majority of the COBOL code in these codebases is actually redundant code, doing things like CMS or IdM or the like, that could be bridged to other, better systems. (And, in fact, it is likely acting counterproductively as a gatekeeper to better data sharing and transparency... not that I want to give the Feds more spying power, but many of these systems seem to have inherent interoperability issues that make them so highly inefficient that they're making government substantially more costly than it needs to be.)

      Here are the underlying problems in doing this:

      1. The code base is old, probably very crufty, but *stable*, and in huge financial systems like this, stability and code-correctness are more important than *anything*

      2. You want to run this stuff on mainframes for the same reason as above. These mainframe architectures have been around forever, and at this point are stable and mature. As others have pointed out, you can take a binary written on an S/370 in 1982, and run it un-modified on a brand new IBM z/Syste

      • If you think they are crufty, then you have never seen the code and have never written anything in COBOL.
  • Who are all the people that need to be fired for allowing such an ancient system to still be used?
    • by bws111 ( 1216812 )

      There is nothing 'ancient' about these systems. The hardware, OS, and middleware are probably all less than 5 years old.

  • What the hell is a "COBOL Mainframe"? Closest I can think of is the printed manuals and the MAPS for the computers. The whippersnappers have probably never seen actual documentation and manuals before.

  • by LostMyBeaver ( 1226054 ) on Sunday April 12, 2020 @04:42PM (#59938154)
    Honestly, COBOL is just a programming language.

    COBOL generally runs on IBM mainframe and midrange computers. These are machines built around a transaction model. It's actually very easy to understand... especially for anyone who has tried AWS Lambda or Azure Functions.

    IBM terminals are not like the VT100 displays most of us are used to. Instead, the 3270 understands the concept of fields. And when you produce an event, it's similar to a form being posted on the web.

    So... everything is event driven. If you schedule a task like a batch job, an event triggers at a given time. If a user submits a form, it generates an event... you get the idea.

    When an event is generated, a system called CICS (pronounced kicks) receives the event and routes it (similar to how most web frameworks route URLs and validate form data these days). And it then calls the function which is registered for the event handler.

    On IBM, everything is an object... this should be familiar to anyone who has used a NoSQL database similar to Mongo. Schema enforcement is still more strict than NoSQL though.

    Data is stored in an ISAM (a fancy word for a database table), and the API for this is called DB2, which does support SQL, but it's amazing how many programmers don't bother with that as there are other APIs for it.

    You don't usually write COBOL programs. Instead, you write COBOL functions... which isn't really a big deal, since a program written in C is generally just a function named main() which calls other functions with different names.

    The layout of a COBOL function is pretty much that you define the model at the top and then you define the code that operates on the model beneath it. These are called divisions.

    Math expressions in COBOL look like English rather than... well, math expressions. You have to spell your math out in words, using verbs like EVALUATE. The language itself is not really much more difficult than SQL to understand.

    If you have a system up and running which supports COBOL, it's REALLY easy to learn. But like with most new languages, getting the build environment running can be a problem.

    If you want to give it a try and see if you can figure it out, install Raincode's free development tools, which pretty much implement the entire IBM programming infrastructure as a .NET platform. It's free and it's a lot easier to learn to use than an IBM mainframe or Microfocus tools.

    There should be no shortage of COBOL programmers... the only difficult part of COBOL is actually figuring out how to use an IBM mainframe. The COBOL/CICS/DB2/RPGIII stuff is quite easy... much easier than Python, C#, Perl etc... at least.
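
    To make that concrete, here is a small invented sketch of the English-like arithmetic (the COMPUTE verb) and of EVALUATE, COBOL's case construct (names and the eligibility rule are made up for illustration, GnuCOBOL-compatible):

    IDENTIFICATION DIVISION.
    PROGRAM-ID. CLAIM-RATE.
    DATA DIVISION.
    WORKING-STORAGE SECTION.
    01 WS-WEEKS-WORKED PIC 9(3) VALUE 14.
    01 WS-ANNUAL-PAY   PIC 9(7)V99 VALUE 41600.00.
    01 WS-BENEFIT      PIC 9(5)V99.
    PROCEDURE DIVISION.
    *> 60% of average weekly pay, then a case statement over the result
        COMPUTE WS-BENEFIT ROUNDED = WS-ANNUAL-PAY / 52 * 0.6.
        EVALUATE TRUE
            WHEN WS-WEEKS-WORKED IS LESS THAN 20
                DISPLAY "NOT ELIGIBLE"
            WHEN WS-BENEFIT IS GREATER THAN 600.00
                DISPLAY "BENEFIT CAPPED AT 600.00"
            WHEN OTHER
                DISPLAY "WEEKLY BENEFIT: " WS-BENEFIT
        END-EVALUATE.
        STOP RUN.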
    • by dwywit ( 1109409 )

      Thanks for that - it's such a different environment from the x86/ARM world.

      Mine was AS400s - I couldn't get over the granularity of control over user-space jobs. Subsystems (AKA cgroups), 100 levels of "nice", all from the first iteration of OS400 in 1988.

      I was running a mid-level local govt F35, with about 500 end users - half on 5250 terminals and half on PCs. The AS400 had IIRC 48MB memory, and still rendered sub-second response times on the terminals.

    • COBOL does not run on a computer. A COBOL compiler runs on a computer. That compiler outputs object modules. A program called a 'linker' then takes the object module and a bunch of runtime libraries and creates a load module. A computer will load the load module (which contains machine instructions, and not COBOL instructions), combine it with "load libraries" and execute the result.

      At NO POINT WHATSOEVER is COBOL executed. Nor is there such a thing as a "computer" which supports COBOL. Compilers supp
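
      That compile-link-run pipeline is easy to see at home with the open-source GnuCOBOL toolchain (an assumption for illustration; the systems in the story use IBM Enterprise COBOL and the z/OS binder instead):

      # compile only: emits an object module (hello.o) containing machine code, no COBOL left in it
      cobc -c hello.cob
      # or compile and link into a standalone executable in one step
      cobc -x -o hello hello.cob
      ./hello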

      • First of all, there are many interpreted COBOL environments in production today. Which would mean very specifically that COBOL is being executed.

        Second, I would hope that the people who read what I wrote would have that level of understanding, take it for granted, and allow me the benefit of saving 10 pages of typing rather than describing the whole process.

        Finally, if we really want to split hairs on this and see whose is longest (pages of errors printed from the least amount of code of course
        • by bws111 ( 1216812 )

          IBM mainframes have not executed machine code produced from COBOL in a very very very long time. The COBOL compilers for IBM produce intermediate code (which to old fogies would look like 370 machine code evolved) that is then AOTted or JITted into the machine language you're referring to.

          Where did you get THAT bullshit from? The result of compiling COBOL on Z is an object module which is linked then executed. There is no 'intermediate code'.

          • Ok... I may have reached the limit of my understanding of IBM mainframes and have been wrong. But the LIST option of IBM Enterprise COBOL generates HLASM (https://www.ibm.com/support/knowledgecenter/SS6SG3_4.2.0/com.ibm.entcobol.doc_4.2/PGandLR/ref/rpbug09.htm) which from what I can tell generates Z machine code (as is shown in the listing and you can verify the format as the same as BAL on the 370 with some extensions). The object code generated from this in coff, goff or xobj (from what I can tell in http
  • One thing I have yet to see explained is what exactly the problem is that programmers are needed to solve. The systems are said to be overwhelmed. That sounds like a hardware problem: not enough cycles, not enough memory, inability to support a large enough number of users, that sort of thing. If the problem is with limitations in the software, what is it? Are there compiled in constants that limit numbers of accounts? Data structures of limited size? What is it that needs to be changed in the software?
    • Probably nothing whatsoever. Someone needs an excuse for why they cannot do their job. This is a convenient excuse that the proletariat will buy. Look at all the proles on here that are buying it. This is CYA for someone who needs to appear to be doing something in order to get a new yacht.

      • This is the same as the dickhead that claimed the Internet was overloaded and asked Netflix and YouTube to please shittify their video streams (and Crave too, though I do not think they can shittify them any more than they already are, since they are already compressed to ratshit unwatchability).

        Both are scams by shitheads who need to appear to be doing something in order to get a new yacht. This is something. So they will get a new yacht. It is not anything meaningful or useful. Just something. It is called "Cov

      • by bws111 ( 1216812 )

        Programmers are needed because the rules for unemployment have suddenly changed. They don't have the normal lead time to contract out the work, hence the scramble.

        Hardware is needed because the systems are sized for normal workload, and are now over capacity. No programmers needed for that.

        No need for conspiracy theories.

    • by bws111 ( 1216812 )

      The problem is these articles are poorly written. There is not ONE problem with these systems, there are TWO different problems, with different solutions.

      The first problem is that the rules for unemployment have changed. There are different eligibility requirements. Probably more disruptive is the fact that now there are funds from the federal government. These types of changes obviously require programming (business logic) changes. In normal times, changes such as these would be scheduled far in adva

    • by bws111 ( 1216812 )

      Oh, and one way programming can help with the overload issue is by not requiring millions of people to log on every week to certify that they are actively looking for work, when the government has ordered that there is no work.

"Someone's been mean to you! Tell me who it is, so I can punch him tastefully." -- Ralph Bakshi's Mighty Mouse

Working...