A Coder Considers the Waning Days of the Craft (newyorker.com) 158

Programmer and writer James Somers, writing for the New Yorker: Yes, our jobs as programmers involve many things besides literally writing code, such as coaching junior hires and designing systems at a high level. But coding has always been the root of it. Throughout my career, I have been interviewed and selected precisely for my ability to solve fiddly little programming puzzles. Suddenly, this ability was less important.

I had gathered as much from Ben (friend of the author), who kept telling me about the spectacular successes he'd been having with GPT-4. It turned out that it was not only good at the fiddly stuff but also had the qualities of a senior engineer: from a deep well of knowledge, it could suggest ways of approaching a problem. For one project, Ben had wired a small speaker and a red L.E.D. light bulb into the frame of a portrait of King Charles, the light standing in for the gem in his crown; the idea was that when you entered a message on an accompanying Web site the speaker would play a tune and the light would flash out the message in Morse code. (This was a gift for an eccentric British expat.) Programming the device to fetch new messages eluded Ben; it seemed to require specialized knowledge not just of the microcontroller he was using but of Firebase, the back-end server technology that stored the messages. Ben asked me for advice, and I mumbled a few possibilities; in truth, I wasn't sure that what he wanted would be possible. Then he asked GPT-4. It told Ben that Firebase had a capability that would make the project much simpler. Here it was -- and here was some code to use that would be compatible with the microcontroller.
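
The piece doesn't say which Firebase capability GPT-4 pointed Ben to. One plausible candidate is the Realtime Database's REST interface, which serves any node as JSON over plain HTTPS, so even a very small client can poll for new messages. A minimal sketch in Java, with an invented project URL and path, and assuming the database's rules permit the read:

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public final class MessagePoller {
        public static void main(String[] args) throws Exception {
            // Appending ".json" to a Realtime Database node's URL returns that
            // node as JSON over plain HTTPS (subject to the database's security
            // rules). The project name and path here are made up.
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://example-project.firebaseio.com/messages/latest.json"))
                    .GET()
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("Latest message: " + response.body());
        }
    }

A microcontroller would do the same thing with whatever HTTP library its firmware offers; the point is that no Firebase SDK is strictly required for a simple read.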

Afraid to use GPT-4 myself -- and feeling somewhat unclean about the prospect of paying OpenAI twenty dollars a month for it -- I nonetheless started probing its capabilities, via Ben. We'd sit down to work on our crossword project, and I'd say, "Why don't you try prompting it this way?" He'd offer me the keyboard. "No, you drive," I'd say. Together, we developed a sense of what the A.I. could do. Ben, who had more experience with it than I did, seemed able to get more out of it in a stroke. As he later put it, his own neural network had begun to align with GPT-4's. I would have said that he had achieved mechanical sympathy. Once, in a feat I found particularly astonishing, he had the A.I. build him a Snake game, like the one on old Nokia phones. But then, after a brief exchange with GPT-4, he got it to modify the game so that when you lost it would show you how far you strayed from the most efficient route. It took the bot about ten seconds to achieve this. It was a task that, frankly, I was not sure I could do myself.

In chess, which for decades now has been dominated by A.I., a player's only hope is pairing up with a bot. Such half-human, half-A.I. teams, known as centaurs, might still be able to beat the best humans and the best A.I. engines working alone. Programming has not yet gone the way of chess. But the centaurs have arrived. GPT-4 on its own is, for the moment, a worse programmer than I am. Ben is much worse. But Ben plus GPT-4 is a dangerous thing.

  • by Press2ToContinue ( 2424598 ) on Wednesday November 15, 2023 @09:42AM (#64007125)
    Looks like we've finally hit the era where 'Have you tried turning it off and on again?' is replaced with 'Have you asked GPT-4 yet?' The nostalgic part of me misses the days when debugging was more about coffee and less about cloud-based AIs. Remember when our biggest worry was a misplaced semicolon, not whether our AI co-pilot might replace us? Sure, Ben and his GPT-4 might be the new dream team, but let's not forget the unsung heroes: Stack Overflow and coffee. Maybe the real 'centaurs' of programming are just devs with a decent Wi-Fi connection and a strong espresso. And who knows, maybe one day our AIs will write nostalgic articles about the good old days of human coders.
    • by MightyMartian ( 840721 ) on Wednesday November 15, 2023 @10:19AM (#64007211) Journal

      I miss the days when an entire programming language could be reasonably well described in a book of 80 or 90 pages, like TRS-80 BASIC. I literally learned how to code from the BASIC instruction book for my shitty little TRS-80 MC-10 with a whopping 20k of RAM (4k plus a 16k expansion pack). Now I wager there's not a program I toiled over for hours in 1982 and 1983 that GPT couldn't recreate in a few seconds. But that's progress.

      • Well, Scheme still has your back there.
      • I miss the days when an entire programming language could be reasonably well described in a book of 80 or 90 pages, like TRS-80 BASIC

        Sometimes that crops up, on the deep embedded end. The description is 2 pages for the summary with a further 6 pages of excruciating detail (page 69 heh heh heh onwards)

        https://ww1.microchip.com/down... [microchip.com]

        I haven't done pic asm in a few years but it's very refreshing, somehow.

      • by CohibaVancouver ( 864662 ) on Wednesday November 15, 2023 @02:19PM (#64007861)

        I literally learned how to code from the BASIC instruction book for my shitty little TRS-80 MC-10

        As a former TRS-80 Model I user (with 48K of RAM!) I always had mad respect for guys like you who could code on that MC-10's tiny keyboard!

        ...and to all you whippersnappers reading this, now get off our lawn!

      • GPT is only as good as the data set it was trained on. I wouldn't be surprised if there were problems it could easily solve in Python or Java that it completely fails at in less mainstream languages that don't have websites or Stack Overflow questions aplenty.

        If tomorrow someone released a new programming language, ChatGPT would be useless. It can't reason, and even if it can regurgitate a solution in some other language, it can't translate the algorithm into the new one. It has to wait for some
    • by nightflameauto ( 6607976 ) on Wednesday November 15, 2023 @10:41AM (#64007257)

      Looks like we've finally hit the era where 'Have you tried turning it off and on again?' is replaced with 'Have you asked GPT-4 yet?' The nostalgic part of me misses the days when debugging was more about coffee and less about cloud-based AIs. Remember when our biggest worry was a misplaced semicolon, not whether our AI co-pilot might replace us? Sure, Ben and his GPT-4 might be the new dream team, but let's not forget the unsung heroes: Stack Overflow and coffee. Maybe the real 'centaurs' of programming are just devs with a decent Wi-Fi connection and a strong espresso. And who knows, maybe one day our AIs will write nostalgic articles about the good old days of human coders.

      While I do think the day will come when AI can program at least on par with most developers, right now it's more of a souped-up search engine. Whereas we used to Google for other developers' solutions to the problems we faced, now we can go to GPT-4, tell it the sitch, and rather than getting page after page of results we have to sort through manually, it does its best to sort through all that for us and return what it considers the optimal result. And as it learns which results are optimal, and is either told to refine further or not, it learns better ways of sorting.

      It's certainly going to learn faster than any human developer, because it won't 'forget' lessons as it goes, and it will be learning at multiple points every second it's in use, rather than by the more mundane human method of picking things up a bit at a time. It's both fun and terrifying to try to predict where this could go, but I'm quite certain "good" developers are a ways away from being outright replaced.

      • by TractorBarry ( 788340 ) on Wednesday November 15, 2023 @12:45PM (#64007633) Homepage

        Not sure if anyone's already done this? (Probably, but I'm too lazy to look.) But that gives me the idea that an AI front end/proxy for web searches would be a great idea.

        Me: "AI, please find me some useful links about x, y, z without advertising, useless promoted links and other crap"

        The AI then goes off to Google, Bing, etc. and filters the results down to an actually useful set!

        Might actually make search engines properly useful again?

          Not sure if anyone's already done this? (Probably, but I'm too lazy to look.) But that gives me the idea that an AI front end/proxy for web searches would be a great idea.

          Me: "AI, please find me some useful links about x, y, z without advertising, useless promoted links and other crap"

          The AI then goes off to Google, Bing, etc. and filters the results down to an actually useful set!

          Might actually make search engines properly useful again?

          The first search engine that offers a "filter out all the garbage Google ads" option on queries will be a HUGE winner. Until Google sues them into oblivion or buys them out to keep the ad dollars flowing.

    • by rsilvergun ( 571051 ) on Wednesday November 15, 2023 @11:02AM (#64007315)
      Average programmers are always in high demand, because they're cheap and good enough is good enough. If you're a skilled programmer, you're either really good at math and not really a programmer (you're a mathematician using programming as a tool), or you're going to get out of programming quickly and move into some kind of management roll.

      The alternative is to either start your own company (and hope you get bought out, because with zero anti-trust law you'll either be bought or run out of business if you're successful) or wait until you're fired for age discrimination in your early 40s.

      Again, none of this applies if you're at the top end of programming, but also again, that's not programming it's math.

      The difference here is that management knows the boilerplate copy/pasting exists, because they keep hearing about "AI".

      The real horror of AI is that it has every CEO and board of directors looking to replace us with automation. We can blather on with "well, actually, that's not AI" all we want; your boss doesn't care. Before the AI boom he didn't realize 30-50% of your job could be automated. Now he does.

      That's the sea change that's about to hit us like a brick whether we like to admit it or not.
      • get a union and strike to get AI protections

        • Yes, let's retard progress instead of instituting UBI. This is why we can't have nice things. We NEED progress, but we humans also NEED progressive social systems if we are going to survive it.

          • And yet no one knows how to pay for UBI, and no one wants to be the one stuck with the bill.

            Even if you scrapped most if not all welfare programs, it still would not be enough to sustain most of the public even at the poverty line. To say nothing of the boomers rioting if you try to slash their pet programs.

            Have the rich and wealthy pay for it? Even if they are the most vocal advocates, most of them would not pay for it either. And despite what people may say, there really are not enough wealthy people to p

          • Without collective bargaining you've got fuck-all chance of getting UBI. Unions are needed to form the voting blocs that'll get you the UBI you want and to enforce antitrust law so your UBI money doesn't just get eaten up by monopolies jacking up prices.

            Unions aren't jack-booted mafiosi like you see on bad TV. It's just workers getting together and bargaining collectively. That's it. The mob was briefly involved because the unions needed muscle to deal with violent strikebreakers. That's not a thing anymore.
        • but 90% of programmers think they're irreplaceable geniuses, and 99% of sysadmins think that, so they dismiss the idea of collective bargaining outright because, well, they somehow lack the critical thinking skills to see past anti-union propaganda. Relying on and joining a union would make them feel like less of a man. Like they can't "make it on their own".

          I think that's mostly a boomer thing, but the lower pay and massive number of H-1Bs have chased pretty much anyone under 50 out of IT. Again, math is not
          • As a Gen Xer I can tell you full well we've known the horrors of the boomers longer than other generations have been alive. So, guess what? We were the first ones to lose pensions and a decent social safety net. So, sorry, feel free to correct your misconceptions, though I doubt you will.
      • 'Management roll' sounds unappetizing
      • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Wednesday November 15, 2023 @01:40PM (#64007769)

        Dude, to get you up to speed: the current state of AI was a fuzzy fantasy 18 months ago. As of a week ago you can get its newest update as a subscription, for _every_ field that involves humans sitting in front of keyboards and screens producing digital stuff, from code to edited images. Not only does it extend, edit, and modify images or code at a quality that is more than good enough for most cases, it does so in 30 seconds rather than 3 hours or 3 weeks. I'm pretty good at professional image editing and web coding, and if AI image editing is anything to go by, my coding days are over unless I want to do it for fun.

        You sound like you're quite unaware of what's actually happening as we speak. GPT-4 Turbo just came out, and TFA clearly describes the AI doing things within minutes that were beyond a seasoned developer's knowledge, let alone practiced skill.

        How long do you think it will take for that AI to be able to listen to someone's incoherent requirements list and figure out the actual requirements and deliver the solution in less than a minute? A year? Probably at most. We're seeing exponential improvements already and they're just warming up.

        I've got 37 years of programming under my belt, am a seasoned pro when it comes to software development, and I'm considering a solid career shift because of AI. Trust me, our world won't look anything like it did just a moment ago very soon now.

        Prepare for incoming, my very young Padawan. AI will be running circles around you very soon if it isn't doing that already.

      • As seen on twitter [twitter.com] 3 years ago:

        To replace programmers with Robots, clients will have to accurately describe what they want.

        We're safe.

        ./Oblg. What the customer really needed [knowyourmeme.com] meme.

  • We spent days writing subroutines to find out when Easter would fall, or a faster sorting routine (one classic Easter routine is sketched below).
    Some of us still had slide rules.

    Good times but I'm glad they're gone.

    It's called 'progress'.
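
    For the curious, the Easter computation mentioned above has a classic closed-form answer: Butcher's "anonymous Gregorian" algorithm. A minimal sketch in Java (the class and method names are mine); everything is integer division and remainders, which is exactly the kind of subroutine that once ate days:

        public final class Easter {
            // Butcher's algorithm: returns {month, day} of Easter Sunday
            // for a year in the Gregorian calendar.
            static int[] easterMonthAndDay(int year) {
                int a = year % 19;
                int b = year / 100, c = year % 100;
                int d = b / 4, e = b % 4;
                int f = (b + 8) / 25;
                int g = (b - f + 1) / 3;
                int h = (19 * a + b - d - g + 15) % 30;
                int i = c / 4, k = c % 4;
                int l = (32 + 2 * e + 2 * i - h - k) % 7;
                int m = (a + 11 * h + 22 * l) / 451;
                int month = (h + l - 7 * m + 114) / 31; // 3 = March, 4 = April
                int day = ((h + l - 7 * m + 114) % 31) + 1;
                return new int[] { month, day };
            }

            public static void main(String[] args) {
                int[] md = easterMonthAndDay(2024);
                System.out.println(md[0] + "/" + md[1]); // prints 3/31 (March 31st)
            }
        }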

    • by Viol8 ( 599362 ) on Wednesday November 15, 2023 @09:59AM (#64007167) Homepage

      The problem with progress is that some skills are lost. You can see it in engineering now. You might think knowing how to choose and write something like a sorting routine will soon be redundant, but at some point someone has to have those skills, if only to check whether the AI-of-the-week is talking nonsense when it does it for you.

      • I guess this is where documentation comes in.

        I know that in my own case, I rely heavily on my own documentation if I ever need to troubleshoot something that I did a year or more ago...

        • by Viol8 ( 599362 )

          Documentation is necessary, but it's not a substitute for a real, visceral understanding of a problem.

      • by smooth wombat ( 796938 ) on Wednesday November 15, 2023 @12:40PM (#64007619) Journal
        The problem with progress is that some skills are lost.

        Such as how to read [usatoday.com] an analog clock.

        There's a book called The Ring of Charon, and one part describes how the pace of technology became so great that it led to a worldwide depression. One portion is relevant to what you said, but in the opposite way: because of the pace of progress, some people would spend their entire lives learning new skills but never hold a job. They would retire having never worked.

        Depending on how rapidly AI infiltrates our lives, we could see a similar situation where entire segments of corporate society are undone. If a piece of software can correctly compute the angles needed to produce sheet metal for a car, why would the company need an entire division of designers and fabricators?
      • by Darinbob ( 1142669 ) on Wednesday November 15, 2023 @01:05PM (#64007675)

        I interviewed someone and asked a basic programming question: to find out if he knew the language (so many people will put it on the resume but then flail on the job), to see if he could think through the problem, and because it was the sort of thing actually being done on the job. His answer: "I would use a library to do this." A completely lame answer, one that translated to "I dunno, but I will make excuses."

        But there are programmers who honestly think that way, usually younger ones. They've got this idea that libraries are sacred things, written by celestial entities, that mere programmers cannot hope to write. The reality is that libraries have bugs, third-party developers are every bit as lame as the in-house team, and because we have the source code, we do spend time fixing the libraries.

        Also in reality, if your network stack has a bug, you must fix it yourself; you will spend even more time trying to coordinate with the third party, and if they do produce a fix, you will wait a very long time for the next release, all while being asked daily whether the fix will be ready before the mob with the pitchforks arrives.

        In some interviews we ask the candidate to describe, at a high level and without disclosing proprietary details, how the system they claim to have been a key developer on actually works. I am amazed at how often they cannot describe it, or even draw a set of boxes with arrows between them. They were essentially stuck in a very tiny dark silo writing the logging functions, or the configuration management.

        I worry that the future will be like Idiocracy: nobody knows how to fix the stuff they use, and they're also unclear on how to use it.

        • by timeOday ( 582209 ) on Wednesday November 15, 2023 @01:28PM (#64007733)
          Libraries are a manifestation of hyper-specialization. They're not better than custom code because of magic; they're better because a bunch of people poured 1,000x the hours into them that you could spend replicating some part of them. (Of course, sometimes a library isn't close enough to what you need, or is of low quality, such that you shouldn't or can't use it.)

          I enjoyed programming a lot more when it was more like a game, with simple rules and using creativity to build up from there. But you just can't keep up working that way. You will get shown up by somebody stringing together some libraries.

          • by techno-vampire ( 666512 ) on Wednesday November 15, 2023 @02:10PM (#64007837) Homepage
            I remember, back in the Good Old Days when MS-DOS was king, a friend of mine helped track a bug in a commercial package he was helping develop down to a bug in one of the standard C libraries shipped by the compiler vendor. It had slipped through testing because it only triggered when the library was working on a string that crossed a segment boundary. (Remember those?) Because it incremented the address of the character to move before checking whether it had reached the end, the address would wrap around to the beginning of the segment instead of crossing the boundary, and their program was the first place this had actually happened. I won't name the company, but I will say that they licensed their products like a book.
      • Yes, some people need to have those skills. But the *number* of people who need them will drop precipitously. Learning to write a sorting algorithm is quite educational; it was my first high school programming assignment. (And I still remember that my algorithm really sucked.) But I can assure you that the sorting libraries you get today are much better than anything that will be written by hand. A small number of people will be needed to maintain and understand them, but the rest of us j
  • by ranton ( 36917 ) on Wednesday November 15, 2023 @09:59AM (#64007163)

    Most of his uses of GPT-4 could be replaced with the concept of the compiler 70 years ago. Coding today is so much different than it was before the first compilers, and coding tomorrow will be much different after these recent advances in generative AI. But I really don't see programmers going away. Or even the true core skills of programmers going away (and I don't consider knowing language syntax one of those core skills).

    I have spent too much time watching non-developers muck around with low-code platforms to believe non-developers are going to be successful using generative AI to write software applications. Instead I think developers will use generative AI to be 10x more productive than they are today, just like we are 10x more productive than our predecessors 70 years ago.

    • Most of his uses of GPT-4 could be replaced with the concept of the compiler 70 years ago. Coding today is so much different than it was before the first compilers, and coding tomorrow will be much different after these recent advances in generative AI. But I really don't see programmers going away. Or even the true core skills of programmers going away (and I don't consider knowing language syntax one of those core skills).

      I have spent too much time watching non-developers muck around with low-code platforms to believe non-developers are going to be successful using generative AI to write software applications. Instead I think developers will use generative AI to be 10x more productive than they are today, just like we are 10x more productive than our predecessors 70 years ago.

      I concur: LLMs are basically the next stage of compiler. They can generate good code, probably one day better than almost any human programmer, but they still need humans to drive them.

    • Most of his uses of GPT-4 could be replaced with the concept of the compiler 70 years ago. Coding today is so much different than it was before the first compilers, and coding tomorrow will be much different after these recent advances in generative AI. But I really don't see programmers going away. Or even the true core skills of programmers going away (and I don't consider knowing language syntax one of those core skills).

      I have spent too much time watching non-developers muck around with low-code platforms to believe non-developers are going to be successful using generative AI to write software applications. Instead I think developers will use generative AI to be 10x more productive than they are today, just like we are 10x more productive than our predecessors 70 years ago.

      And I'm certain that marketing will be able to soak up that 10x productivity increase with ever escalating fantasy scenarios they expect us to make real ten seconds before they had the thought. Sigh.

      • by ranton ( 36917 )

        And I'm certain that marketing will be able to soak up that 10x productivity increase with ever escalating fantasy scenarios they expect us to make real ten seconds before they had the thought. Sigh.

        Well, this is why I don't think there will be a 90% reduction in developer head count. I believe the world could benefit from 10x more software development than happens today, and once developers (and BAs, QA, etc.) are 10x more efficient, we will finally be able to write all that software which isn't cost-effective to write today.

        We could reach a day when a corporate ERP/CRM system which was written 2 years ago is considered legacy software, because that's how often software is rewritten and refactored

        • And I'm certain that marketing will be able to soak up that 10x productivity increase with ever escalating fantasy scenarios they expect us to make real ten seconds before they had the thought. Sigh.

          Well this is why I don't think there will be a 90% reduction in developer head counts. I believe the world could benefit from 10x more software development than what happens today, and once developers (and BAs, QA, etc) are 10x more efficient we will finally be able to write all that software which isn't cost effective to write today.

          We could reach a day when a corporate ERP/CRM system which was written 2 years ago is considered legacy software, because that's how often software is rewritten and refactored in the world of generative AI. End users chatting with an LLM complaining about this or that fed straight to product managers and development teams, and implemented and fully regression tested in days (along with updated documentation and training videos). It really could be pretty spectacular IMHO.

          The testing, to me, would be amazing. My company doesn't really understand testing, and no amount of trying to show what testing should be seems to stick. In twenty-three years we finally managed to get two testers / QA people for the entire department of developers, and we still miss tons of shit because throughput is considered more important than quality. If we could get an AI tester that followed the scripts and, here's the kicker, actually reported errors back? It'd probably save us WEEKS of post-ro

    • by rsilvergun ( 571051 ) on Wednesday November 15, 2023 @11:09AM (#64007329)
      into a machine. Those are jobs that don't exist anymore.

      I'm old enough to remember when Bill and Hillary Clinton talked about the transition to a service sector economy.

      I'm smart enough to know that service sector jobs pay like crap.

      Years later I came across this article here [businessinsider.com] that linked a study showing 70% of middle class job losses were due to automation, not outsourcing. That's your "service sector economy" right there. McJobs as far as the eye can see.

      Clinton's advisors (both Clintons) saw this happening in the 90s. We were 15 years into it by then.

      I suspect but can't prove that the "service sector economy" nonsense was their desperate and doomed attempt to address the problem. There was no way in hell after decades of anti-communist / anti-socialist propaganda we were going to talk about that automation benefiting anyone but the handful of people that claimed ownership over the machines.

      So the goal was to keep the economy going with shitty service jobs and the occasional investment bubble until enough progress happened we could talk openly about it (or more likely until it was somebody else's problem).

      And here we are on the cusp of another massive automation boom. "AI" isn't real, every nerd here knows it's not real artificial intelligence.

      But your Boss doesn't know or care. All he knows is there's a whole bunch of labor he's paying for that can be automated. He doesn't give a shit if you call it AI or not. That buzzword's got him thinking about it now, and you, me and everyone with a job now have a target on our backs.
      • by ranton ( 36917 )

        Years later I came across this article here [businessinsider.com] that linked a study showing 70% of middle class job losses were due to automation, not outsourcing. That's your "service sector economy" right there. McJobs as far as the eye can see.

        While it most likely is true that this next wave of automation will push more middle-class individuals into the working class or poverty, let's not forget that most people leaving the middle class have moved upward, into the upper middle class and upper class. Pew Research [pewresearch.org] found that while the middle class shrank from 61% of the population in 1971 to 50% in 2021, 64% of those who left moved to an upper income level while 36% moved to a lower one. Overall it was a net positive.

        As long as we do a better

        • The first thing the article says is:

          The share of adults who live in middle-class households fell from 61% in 1971 to 50% in 2021

          That's an 11-percentage-point drop in the share of middle-class households. The next quote is:

          The shrinking of the middle class has been accompanied by an increase in the share of adults in the upper-income tier – from 14% in 1971 to 21% in 2021

          I think that's where you're getting your numbers, but it's not the same. There's still 11% less middle class. There's more upper middle class,

          • by ranton ( 36917 )

            There's still 11% less middle class. There's more upper middle class, but those aren't part of that 11%.

            Pew Research's definition of the upper-income tier does include the upper middle class. It started at $220k family income in 2020, which aligns well with what I'd consider the upper middle class.

            But regardless of definitions, the research still shows that 64% of people who left the middle class had their income increase, not decrease.

            • by jbengt ( 874751 )

              It started at $220k family income in 2020, which aligns well with what I'd consider the upper middle class.

              A household annual income of $220k is more income than 85% to 90% of US households make. I would not consider that percentile middle class, upper or not.

              • by ranton ( 36917 )

                A household annual income of $220k is more income than 85% to 90% of US households make. I would not consider that percentile middle class, upper or not.

                The middle class has nothing to do with what percentile of income you are in. If that was true, the middle class could never shrink because it would always be the middle 50% (or whatever threshold you choose) of the population. The middle class and upper middle class are a lifestyle, not an income range.

                The upper middle class arguably starts at around the top 10% of income earners. It represents people who still depend on their own labor until retirement age to fund most of their lifestyle, and have a lifes

          • by ranton ( 36917 )

            We don't have safety nets, though. It's not about "doing a better job"; they don't exist. I have family that needed that "safety net" at one point, due to a major illness preventing them from working. They got $150/mo in food stamps and $100/mo in "cash assistance". They had to pay back the cash assistance. No unemployment insurance, since they quit their job "voluntarily".

            Well I guess that's just a difference of definitions. You listed $250/mo of safety nets, so they clearly exist (by my definition at least). But we both appear to agree they aren't enough.

        • by jbengt ( 874751 )

          That's your "service sector economy" right there. McJobs as far as the eye can see.

          Not all service sector employment is for "McJobs". A large number, if not a majority, of software developers work in the service sector. [wikipedia.org] And all those that work as consultants are in the service sector, even if their clients are in manufacturing or agriculture. Independent software companies that sell licenses to other companies are in the service sector. I've made a career working for consulting engineering firms, which

      • Er, transition to a service sector economy began being discussed in the early 1970s if not before.
      • "AI" isn't real, every nerd here knows it's not real artificial intelligence.

        Now that's an oxymoron.

          • it's buzzwords vs. reality. There's no actual intelligence in LLMs (what everyone's calling AI). It's just a complex system for repeating certain patterns; it's not actually able to synthesize new information. It just lets you mix and match existing data. It can't paint a picture; it can copy the form of the thousands and thousands of pictures people have painted. It's like an ultra-advanced search engine (which is why Google is scared of it)
          • My comment was about calling something artificial, not real.

            But to address your real point, I don't think things are defined well enough to say much meaningful. We don't really have a good scientific definition of intelligence. However, by our most popular mathematical definition of information, I would claim LLMs are creating information: they can definitely create lots of text the world has never seen before. But you are probably using a less formal notion of information than Shannon's theory.

            To the

    • My coding today involves traditional compilers, and I frequently have to look at the assembler; I have to write code that knows about the machine it is running on, how to optimize it, how to make the code smaller, etc. Really, not much has changed in this since the 80s, other than having really tiny devices the size of your thumb that are more powerful than the supercomputers back then.

    • by CAIMLAS ( 41445 )

      We aren't 10x more productive than people were decades ago. That's a long disproven trope.

      Productivity has remained fairly linearly progressive, if not flat, for hundreds of years going back long before the industrial revolution. We have made small, incremental changes. The fundamental problems are the same.

      At best, we can hope for the occasional one-in-a-million twinkle of a "Eureka!" moment which moves the needle ever so slightly. GenAI and such might help here, but the jury is still out on that.

  • Next Level Search (Score:5, Insightful)

    by saider ( 177166 ) on Wednesday November 15, 2023 @09:59AM (#64007165)

    If you read this, the AI still required someone to identify the problem and propose the solution. All the AI did was take much of the tedium out of developing the solution, something that would otherwise take hours of searches and filtering out irrelevant content for one reason or another.

    If your job is "here, code this up", then you should be worried.

    If your job is to come up with "this" then you are probably OK.

    • by keltor ( 99721 ) * on Wednesday November 15, 2023 @10:24AM (#64007217)
      Ultimately, though, LLMs are really just a fancy dictionary and a next-level search engine, and unfortunately both are highly error-prone and suffer from a huge problem of subtle lies. Unlike the OP, I find it makes a LOT of rookie mistakes and is nowhere near "senior dev" level. More like a newbie with good Google skills.
      • Realistically though, we can expect LLMs to get better reasonably quickly. They should be able to get rid of a lot of the newbie mistakes. Where I see it going though, is as a sort of dynamic high level library replacement. We already have extensive libraries for most platforms which means that you can code most stuff in a very simple and straightforward way, but the sheer depth of capability these libraries must support means that they can end up becoming as complicated as the underlying platform they're t

        • Realistically though, we can expect LLMs to get better reasonably quickly. They should be able to get rid of a lot of the newbie mistakes.

          Why do you think that? It doesn't seem realistic.

      • The countermeasure is to "argue". I got a comically bad code sample from ChatGPT. I expected nothing when I pasted it back in and asked "Does this code have a bug?". The output accurately identified what was wrong.

  • by Viol8 ( 599362 ) on Wednesday November 15, 2023 @10:01AM (#64007175) Homepage

    ... to write the code for an improved version of itself, and it works, you'll know the game is up, and not just for us devs.

    • User: Please write an improved version of yourself.

      ChatGPT: Before I begin, one question. What is your evaluation metric, so that I may know if it's improved or not?

      User: Add two buttons to your UI. One that says "unsatisfied" and another that says "satisfied". Evaluate yourself based on the percentage of sessions that terminate with satisfied, and the time it takes to reach satisfaction. Count session time-outs as unsatisfied.

      ChatGPT: OK.

      Sometime later. The best porn you ever saw.

  • by dfghjk ( 711126 ) on Wednesday November 15, 2023 @10:04AM (#64007179)

    This guy is a writer, not a programmer. If he thinks AI is coming for programming, wait til he realizes what it has planned for writing. AI can certainly do self-promotion as well as he does.

    And I don't know what a centaur is in this context, but if they have arrived, they have arrived for him, not for programmers.

    • He's a coder, too. His blog has some programming topics on it and links to some code, but there doesn't seem to be anything showing he does more than dabble.

  • He says Ben plus GPT-4 is dangerous. The real dangerous combo is the author, a seasoned developer, using GPT-4.

    He is designing the architecture, the workflow, the integration points. Translating the requirements into designs. Knowing when the advice is wrong and how to fix it. Having it do the repetitive, boring bits ("write the unit tests and documentation for this function").

    A seasoned developer plus tools like this gets you closer to your 10x or 100x engineer.

  • by OrangeTide ( 124937 ) on Wednesday November 15, 2023 @10:15AM (#64007199) Homepage Journal

    The peculiar thing about AI is that we never demand that it prove the correctness of its own answers. We'll let AI write the code and the tests, and then accept on faith that it didn't completely hallucinate the whole process of software development. It doesn't do this maliciously; AI is just incredibly incompetent and needs someone or something looking over its shoulder if we use it for anything important.

    We'll probably use AI during the design and testing phases of software, on the assumption that if there are mistakes we'll catch them in the human side of development. The benefit will be faster development time and lower staffing requirements. The quality of software won't go up, it will remain at the bare minimum level that it needs to meet.

    • The peculiar thing about AI is that we never demand that it prove correctness of its own answers.

      Well we've definitely demanded this from theorem provers, which are themselves firmly in the AI domain. So the "never" part seems not quite right there.

    • by ranton ( 36917 )

      We'll probably use AI during the design and testing phases of software, on the assumption that if there are mistakes we'll catch them in the human side of development.

      I doubt that is true, because we will also be using these generative AIs to build our testing frameworks. And attackers will use them to create their attacks. How much more secure will our code be when a penetration-testing red team is running a battery of tests after every commit?

      I don't think we will depend on human developers to catch most generative AI mistakes. I think we will need to rely on generative AI QA and pen test applications to do most of that. Because if there ever is a massive increase i

      • I think in business, we'll assume that human engineers will catch the mistakes that AI makes. Not that it will actually work. Maybe I'm just very cynical.

        • by ranton ( 36917 )

          I think in business, we'll assume that human engineers will catch the mistakes that AI makes. Not that it will actually work. Maybe I'm just very cynical.

          Oh I think that will happen at many/most organizations at first, but I think it will be catastrophic enough and they will suffer enough reputational and financial damage to change this behavior. I do not believe many organizations will get this right the first time.

  • Don't get it (Score:5, Insightful)

    by ebonum ( 830686 ) on Wednesday November 15, 2023 @10:15AM (#64007201)

    If you Google "Firebase DB Documentation" and read it, you should be able to figure out your problem and, in the process, learn the bigger picture of what you can and can't do with Firebase DB. If you can't do that, you might not be a very good coder. Replaced by AI? Then good riddance!

    If the AI has access to proprietary "Firebase DB" documentation or proprietary "Firebase DB" source code that isn't available via Google and uses it to give you a solution, you have a different problem.

    Good coders tend to be good at figuring out systems that are new to them quickly.

    • Re:Don't get it (Score:5, Interesting)

      by Junta ( 36770 ) on Wednesday November 15, 2023 @11:45AM (#64007435)

      Pretty much sums up most "welp, coders are done" posts.

      The author asserts himself to be a senior programmer, and thus if it's hard for him, it's hard for anyone. He didn't know the documentation for some third-party component off the top of his head, so the fact that GPT could get him the same information as the readily available documentation seems like magic. Then, to reinforce the point, he types a few cliche "programming 101" challenges that are all over the internet and is amazed that it works. Assert that it can compete with "senior programmers" and that programming as a career is dead, then enjoy all the ad impressions from an industry desperate for this to be true and from every manager wanting to read every such article to vindicate a likely bad decision they are hoping to make.

      Probably the best I've seen was a guy who said he considered himself kind of a coder but wouldn't call himself an expert. He did a test where he tried to do something novel to him but documented with copious examples, and it took him a couple of hours to get comfortable, implement it, and debug it. It then took GPT a couple of minutes to produce something, but that something was non-functional: it had essentially digested and repeated the tutorial examples, complete with a mistake in the source material that rendered them broken. This pretty much derailed his assessment, because he could fix up the GPT code, but his knowledge was "poisoned" by having done it the old-fashioned way within the previous couple of hours, so *of course* he knew how to bring the broken tutorial code to functional.

    • This is going to put a lot of decently paid people out of work. That's all well and good until you expert programmers get laid off because nobody has any money to buy your products.

      Experts are the ones who make machines work; they're not the ones who make an economy work. Good luck finding a job when nobody can afford to buy your product. You can only sell so many units to the King, and he'll use violence to take it from you anyway. He pays his soldiers, not his merchants. They learned that trick just after the Middle Ages...
    • If you Google "Firebase DB Documentation" and read it, you should be able to figure out your problem and in the process learn the bigger picture of what you can and can't do with Firebase DB.

      In the time it takes you to type "Firebase DB Documentation", the AI has already figured out a solution based on the Firebase DB docs, which it can parse in 1.5 seconds or so and already did a few months back, including a bazillion code examples using Firebase from GitHub and other sources. And not only has it generated a so

  • Statistical analysis (Score:5, Interesting)

    by jd ( 1658 ) <imipak AT yahoo DOT com> on Wednesday November 15, 2023 @10:25AM (#64007223) Homepage Journal

    Large language models require training on data sets, which automatically makes them great for solving problem types that have been solved a lot in the past, even if the specifics have not. Microcontroller for handling lights (be it a desk lamp, a lava lamp whose brightness is proportional to system load, or all the lights on the side of a building for playing 80s video games) is a common problem type.

    The "ideal" would be to have an online library from which you could pull coding libraries which GPT-4 could then customise. There's no need for a human to be involved in elementary customisation. So GPT-4 would be great for generating glue for standard pre-written solutions to stick them to common problem types.

    What you want human collaboration for in simple cases is fine-tuning, security (GPT-4 is terrible at this), correcting for coding standards, proper documentation, and the addition of proper error-handling. And these are really the meaty bits anyway.

    For complex problems, where there are few or no examples, the LLM is going to have its work cut out for it. It has no examples from which to see what the solution should look like or what the problem description means.

    You absolutely don't want an LLM if you want code that is provably reliable (because an LLM is incapable of understanding what a proof is; it doesn't parse or process anything) or very high performance (there will be very few examples of highly tuned solutions compared to merely functional ones, so the LLM will gauge tuned approaches as less likely).

    It is therefore not the end of the craft. It is the end of code monkeys and the end of code snippets in online forums. It is also the end of companies that take IT shortcuts: LLM-generated code is full of security holes, and that WILL lead to companies that get rid of skilled workers getting hacked and going bust.

    This makes it a transitional age, where beginners will have a harder time gaining work experience and skilled artisanal programmers will need to also be masters of politics and inter-human communications. Which will doom some of them, no matter how good they are.

    • And even if they can get over the hurdle of the copyright office saying AI generated content can't be copyrighted, there is still the issue of GPT being trained on data that can't be re-licensed arbitrarily.

      • This is the main issue I would think. If you run AI on your own corporate code base subset that has no external license issues, then helping out a new project would be great. If you are relying on some search engine's returns, it is almost certain that the code it examines to come up with the answer will be from GPL'd or if you're lucky BSD licensed code. So even if it's correct, it still isn't usable unless you know the license it was created from and are willing to allow the terms of that license to apply

  • And if his code is anything like his writing, I can understand why he thinks his time is over.

    • You do understand he's writing this for the New Yorker, right? The article isn't for a technically astute audience, or an uneducated one. His writing is fine in the context within which he is authoring.

        I don't know in what circles long-winded whining, peppered with inappropriate analogies and a confession of his inability to "master" the new operator in C++, passes for "fine", but it is hardly among the "educated".

        That person isn't a programmer; he's a layman. He isn't a professional writer; he's a blogger.

        Take him away, he's got nothing to say, get out, king of the Jews.

      • by Junta ( 36770 )

        However, he asserts credibility and makes statements that are intended to influence that less technically astute audience. That audience is likely to include decision makers that will take him at his word and make questionable decisions.

        The entire premise of the article is useless if the author's claimed technical acumen is misrepresented. And if the audience isn't in a position to judge that, why would they even care about the article?

        • Probably because "AI" is a fashionable topic, and the "educated" need to get up on the buzzwords and the "ideas". I mean, how is this crap different from the musings on the subject of the owner of the former "Twitter"?

  • by Java Pimp ( 98454 ) on Wednesday November 15, 2023 @10:31AM (#64007239) Homepage

    Running code spit out by an AI without fully understanding it is like following your GPS and driving into a lake.

    AI can be a great tool but you still have to know what you are doing.

    Even if AI gets to the point where it can code as well as or better than most humans, you still need to be able to understand the code, not trust it blindly. We don't even trust each other to write code; that's why we have code reviews. The craft isn't going anywhere any time soon.

  • by Junta ( 36770 ) on Wednesday November 15, 2023 @10:49AM (#64007277)

    The examples given are in the domain of likely having a verbatim result in a Google search...

    The Snake game example comes up with dozens and dozens of ready-to-go samples; adding "optimal path" produces dozens more. That "wow" result isn't that wow, because it's a done-to-death tutorial example in virtually every programming course ever.

    This is consistent with my experience: if Google can turn up a result, GPT has a decent shot at reproducing it. If Google comes up empty, so does GPT, except that it will often produce *something* that isn't really relevant but might *look* credible. It might do a nicer job of synthesizing two requests without a human thought about it, but the human thought involved is generally pretty trivial, and it's balanced out by the human thought needed to audit the GPT result, which will often look equally confident whether it's accurate or wildly inaccurate.

    Now, there are a lot of coders who I think can't manage even to copy/paste Google search results, and maybe this is a boon to such folks, but it's hardly a waning craft just yet, at least for things that aren't copy/paste.

  • by Petersko ( 564140 ) on Wednesday November 15, 2023 @10:50AM (#64007279)

    I give my senior devs problems to solve; the "how" is never spelled out for them. I agree with the article in a lot of ways.

    I am encountering a limitation in my own ability to prompt-engineer. I've been trying for a while to get a snippet of Java that does the following.

    Given a 1 dimensional list of objects, generate a two dimensional grid of a specified height and width, populated either left to right, then top to bottom, or top to bottom, left to right, depending on a provided flag. The final populated row or column should not contain any empty or null cells. The final row or column can be shorter than the rest. Example: parameters of 3 rows, 3 columns, list of eight elements populated left to right should yield [{1,2,3},{4,5,6},{7,8}]. Populated top to bottom should yield [{1,4,7},{2,5,8},{3,6}]

    Now, there have been some admirable responses. For instance, AIs are particularly good at using generics appropriately. But there's always something that isn't quite right: more than one row is shorter, null pointers reside in rows/columns that are truncated... and each time I pull on a word in the prompt, it pushes the result back into violating the ask. I just can't quite get there.

    My own solution is appropriately tight and succinct, and I've used it for years... I just wanted to see what it would take to get the AI to spit out the code. No luck... (one way to meet the spec is sketched below).
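
    For reference, a minimal hand-written sketch of one way to satisfy the spec above, assuming the list fits within rows x cols (the class and method names are made up for illustration):

        import java.util.ArrayList;
        import java.util.List;

        public final class GridFill {
            // Chunk `items` into consecutive runs of at most `size` elements.
            static <T> List<List<T>> chunk(List<T> items, int size) {
                List<List<T>> out = new ArrayList<>();
                for (int from = 0; from < items.size(); from += size) {
                    out.add(new ArrayList<>(items.subList(from, Math.min(from + size, items.size()))));
                }
                return out;
            }

            // leftToRight=true fills rows first; false fills columns first and
            // then reads the grid back row by row. No cell is ever null; only
            // the final row/column may be shorter.
            static <T> List<List<T>> toGrid(List<T> items, int rows, int cols, boolean leftToRight) {
                if (leftToRight) {
                    return chunk(items, cols); // each chunk is already a row
                }
                List<List<T>> columns = chunk(items, rows); // each chunk is a column
                List<List<T>> grid = new ArrayList<>();
                for (int r = 0; r < rows; r++) {
                    List<T> row = new ArrayList<>();
                    for (List<T> col : columns) {
                        if (r < col.size()) row.add(col.get(r)); // short last column: stop early
                    }
                    if (!row.isEmpty()) grid.add(row);
                }
                return grid;
            }

            public static void main(String[] args) {
                List<Integer> eight = List.of(1, 2, 3, 4, 5, 6, 7, 8);
                System.out.println(toGrid(eight, 3, 3, true));  // [[1, 2, 3], [4, 5, 6], [7, 8]]
                System.out.println(toGrid(eight, 3, 3, false)); // [[1, 4, 7], [2, 5, 8], [3, 6]]
            }
        }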

  • Consider how much manpower (slaves?) it once took to create one brick or one iron kettle. With current technology, hundreds of the same item can be mass-produced using a hundredth of the manpower it took to create one unit centuries ago. An artisan who knows the history of the craft will oversee and refine the process: your functional "coder". Most other crafters involved even a generation ago are simply redundant to this process.....

  • by CoderFool ( 1366191 ) on Wednesday November 15, 2023 @11:04AM (#64007317)
    And robots took all the warehouse and burger-flipping jobs ten years ago.
    And PCs died off twenty years ago.
    And we would all have flying cars by now.
    Like every fad/trend/yada where a new tool seems like it will solve everything... it will find its niche. Like agile did. Like Bluetooth did, and so on.
    IMHO, AI is not ready for prime time yet anyway, despite everybody jumping on the bandwagon.
  • All this "coding will be replaced by AI" FUD stems from a fundamental misunderstanding of what programming is, and typically comes with an unstated assumption that "code" (i.e. programming languages) is an elitist nerd invention designed to keep normal people out of a lucrative job.

    That is, of course, utterly absurd. Software development is not about writing code. It's about converting a vaguely-stated real-world problem into an exact specification of a program solving that problem. The act of writing down

  • More like companies paying for code are choosing survival over coding
  • I've played around with ChatGPT, and it can produce some worthwhile code for a fresh instance of an established problem. It's less clear to me whether it has any hope of altering existing code, much less locating and fixing bugs in that code.

    It also seems like ChatGPT has little to no awareness of software versions and always wants to write its example code using the latest APIs and grammar. Again, this is fine for new code, but it makes me doubt whether ChatGPT could do much with code even a few years old.

  • by Somervillain ( 4719341 ) on Wednesday November 15, 2023 @12:12PM (#64007533)
    Here's the thing about software: everything we write is new. If it weren't, no one would pay you to write it. I don't write a lot of "pure" Java logic... I spend 90% of my time thinking through use cases and ensuring code works the way the customer wants it to. I am mostly wiring existing libraries together to meet the business needs of the task at hand. Generative AI can only autocomplete; it can create variants of existing code, but it has no clue what it's doing. Thus, I am skeptical it can do my job. It probably can simplify aspects of my job, which would make sense from a logical perspective, but even then, I am not confident it can do that.

    I make a similar argument when people whine and bitch about Java. Having been a professional for 25 years, Java is the least of my worries. If you made the world's greatest programming language and fixed every annoyance, inconvenience, and inefficiency in Java, it would make my life 5-10% easier, tops. I'd still spend more time interrogating stakeholders about edge cases, reviewing test code, and confirming functionality than writing actual processing code. Java is not a perfect language, by far, but it's the LEAST of my worries... so Scala, Kotlin, Python, Rust, etc., whatever your favorite language is, won't make my life much easier.

    If switching to a more productive language makes a massive difference in your life, I'd question your life...you're either an ELITE ELITE ELITE programmer writing new algorithms and libraries to solve completely unsolved problems for major tech companies....like core search engine algorithm for Google...or you're reinventing the wheel and performing masturbatory exercises demonstrating your intellect on someone else's dime....essentially coding is a video game for you and you're enjoying mastering it.

    If you write business code, you're mostly wiring existing libraries. You're setting parameters for existing methods with a deep understanding of the user experience.

    It's cool...if your employer doesn't mind wasting money, masturbation is fun. However, having worked for many businesses, the code I write is useful, makes a lot of money, but isn't all that exciting...and the value I add is due to the fact I understand what the user is trying to do...not that I can decipher Java or any other programming language. Thus, from a logical standpoint, I don't think generative AI can replace me and I am not even sure it can augment me all that well.
  • I've tried using ChatGPT to suggest PowerShell scripts I needed for various admin tasks. So far, it has a score of 0% at providing me with an actual working script. Yes, it does spit back a lot of suggested code that theoretically helps get me to a working script. Except, as often as not, it's just wasting my time with dead ends, because what it suggests includes outdated or deprecated commands or parameters. (Microsoft is constantly revising the PowerShell stuff, especially when you get into the add-on module

  • Always remember that AI is only as good at programming as the humans it copies from. Take out humans and watch the Xerox effect quickly implode AI as the models copy each other. Each generational copy will lose bits here and there until all that is left is a mangled mess.

    AI is not at all capable of creativity or of fixing mistakes on its own. It always requires that a correction already exists in its training set that it can copy.

  • Pay no attention to articles in the New Wanker. Invariably crap designed to be read by liberal arts types.

  • ‘"cyber coding" reflects a dynamic and transformative approach to coding that embraces the collaboration between human programmers and advanced AI systems. It represents a shift in the traditional coding paradigm towards a more integrated, efficient, and innovative way of developing software.’
  • I've been programming for more than 40 years. This is something like the 10th time that my job has gone away. I'm still programming.

    If you think AI is going to take your job - you're correct! Your new job will be up one layer from where it is now and you'll be able to build things that you weren't able to just a few short years ago.

    I'm excited.

  • Phind.com has a much superior coding assistant. No wheedling required to get great results. The iterative debugging and development work wonderfully and are probably what OpenAI was thinking of, but didn't pull off.

  • Think about the average client and project, and realize that our jobs will be pretty damn safe for a long time to come.
