AI-Powered Software Delivery Company Predicts 'The End of Programming' (acm.org) 150

Matt Welsh is the CEO and co-founder of Fixie.ai, an AI-powered software delivery company founded by a team from Google and Apple. "I believe the conventional idea of 'writing a program' is headed for extinction," he opines in January's Communications of the ACM, "and indeed, for all but very specialized applications, most software, as we know it, will be replaced by AI systems that are trained rather than programmed."

His essay is titled "The End of Programming," and it predicts a future in which "programming will be obsolete." In situations where one needs a "simple" program (after all, not everything should require a model of hundreds of billions of parameters running on a cluster of GPUs), those programs will, themselves, be generated by an AI rather than coded by hand.... with humans relegated to, at best, a supervisory role.... I am not just talking about things like Github's CoPilot replacing programmers. I am talking about replacing the entire concept of writing programs with training models. In the future, CS students are not going to need to learn such mundane skills as how to add a node to a binary tree or code in C++. That kind of education will be antiquated, like teaching engineering students how to use a slide rule.

The engineers of the future will, in a few keystrokes, fire up an instance of a four-quintillion-parameter model that already encodes the full extent of human knowledge (and then some), ready to be given any task required of the machine. The bulk of the intellectual work of getting the machine to do what one wants will be about coming up with the right examples, the right training data, and the right ways to evaluate the training process. Suitably powerful models capable of generalizing via few-shot learning will require only a few good examples of the task to be performed. Massive, human-curated datasets will no longer be necessary in most cases, and most people "training" an AI model will not be running gradient descent loops in PyTorch, or anything like it. They will be teaching by example, and the machine will do the rest.

In this new computer science — if we even call it computer science at all — the machines will be so powerful and already know how to do so many things that the field will look like less of an engineering endeavor and more of an educational one; that is, how to best educate the machine, not unlike the science of how to best educate children in school. Unlike (human) children, though, these AI systems will be flying our airplanes, running our power grids, and possibly even governing entire countries. I would argue that the vast majority of Classical CS becomes irrelevant when our focus turns to teaching intelligent machines rather than directly programming them. Programming, in the conventional sense, will in fact be dead....

We are rapidly moving toward a world where the fundamental building blocks of computation are temperamental, mysterious, adaptive agents.... This shift in the underlying definition of computing presents a huge opportunity, and plenty of huge risks. Yet I think it is time to accept that this is a very likely future, and evolve our thinking accordingly, rather than just sit here waiting for the meteor to hit.

"I think the debate right now is primarily around the extent to which these AI models are going to revolutionize the field," Welsh says in a video interview. "It's more a question of degree rather than whether it's going to happen....

"I think we're going to change from a world in which people are primarily writing programs by hand to a world in which we're teaching AI models how to do things that we want them to do... It starts to feel more like a field that focuses on AI education and maybe even AI psychiatry. In order to solve these problems, you can't just assume that people are going to be writing the code by hand."
  • too late (Score:2, Funny)

    by Osgeld ( 1900440 )

    The end of programming happened decades ago; now it's just copy-pasting lines, or including entire modules and arranging input/output to get the desired result. All he is really saying is that we can teach a computer to do the job of a student with six months of Python under their belt and no previous interest or experience beforehand.

    • Re: (Score:2, Insightful)

      by JBeretta ( 7487512 )

      The end of programming happened decades ago

      Oh bullshit. What kind of drugs are you on?

      • by Darinbob ( 1142669 ) on Sunday January 01, 2023 @12:06PM (#63172346)

        True, this wasn't decades ago, more like 1 decade only. So much programming these days, outside of low level devices, is ultra high level and involves using existing frameworks that get glued together. Actually writing those frameworks is considered a godly ability that mere workers are never allowed to do. That's because the mantra repeated over and over while worshipping those framework creating deities is "never reinvent the wheel... ommm.. never reinvent the wheel..."

        Except it can go wrong when there are bugs in the deity's code:
        "You know, there's a bug in our sort library..."
        "No problem, I can just whip up a quicksort in about 10 minutes, then 10 minutes to get it fully tested."
        "Don't you dare! We'll submit a support request then wait two weeks for a response and 13 months for the next release."

    • Re:too late (Score:5, Funny)

      by Brain-Fu ( 1274756 ) on Saturday December 31, 2022 @08:59PM (#63171428) Homepage Journal

      You have experience working at Microsoft, I see.

    • Re: (Score:3, Insightful)

      by muntjac ( 805565 )

      The end of programming happened decades ago; now it's just copy-pasting lines, or including entire modules and arranging input/output to get the desired result. All he is really saying is that we can teach a computer to do the job of a student with six months of Python under their belt and no previous interest or experience beforehand.

      Maybe you are a web dev? Or maybe a dev who works for a small company implementing their slightly different version of the same thing everyone else already has. There are still many new things being implemented. I work on test automation for kubernetes storage provisioning drivers. And no you can't copy paste code to do automated integration testing between k8s apis and storage vendor apis to test proprietary functionality. I do think AI will take most software jobs at some point. Not sure how long that w

  • by CaptainLugnuts ( 2594663 ) on Saturday December 31, 2022 @05:41PM (#63171040)
    You can't even get what a program needs to do described properly to a human. There's no way you're going to describe it to something that doesn't make judgement calls from past information to even get close to what they're trying to do.
    • by PPH ( 736903 ) on Saturday December 31, 2022 @05:51PM (#63171070)

      You can't even get what a program needs to do described properly to a human.

      User: "Computer. My allocated disk space is nearly full. I need more free disk space."

      Computer: runs rm -rf $user_home/*

      Computer: "There. Fixed it for you. Anything else?"

    • I'd argue that he's not so much an idiot, as hunting them.
    • Re: (Score:2, Insightful)

      by Anonymous Coward
      It's not naivety, it's stupidity, fueled by good old fashioned greed.

      The ultimate dream of pointy-haired-bosses everywhere is a business with no employees. No more computer programmers -- all software will be written by A.I.-powered computers. It's a stupid and unworkable idea, but they will never give up, because no employees means MORE MONEY FOR ME.
      • This sounds great, but when taken to the end you suggest, who are these businesses serving if none of them have any employees? Because, spoiler alert, employees also fill the equally important role of "consumers".

        • by rworne ( 538610 )

          Usually, those employees are someone else's "consumers".

          That employee only spends a small percentage of their paycheck with their employer for goods and services - likely nothing if it is a Lockheed or Northrop Grumman, and a few percentage points if it is a fast food joint. All that other salary the McD's employee did not spend on McD's products (like rent & utilities) is wasted profit potential. That's where the thought experiment ends with management. Automate and fire all the workers, get a fat bonu

          • The problem is (as you point out) when everyone starts doing this

            The McCormick Reaper and steel moldboard plow put about 60% of the population out of work.

            The obvious result was that living standards soared as all that labor was available for more productive use.

            • by tragedy ( 27079 ) on Sunday January 01, 2023 @02:13AM (#63171798)

              Two issues. One is that the upheavals that such technological changes cause may eventually settle into a situation where we can say that it all worked out in the end. However, there is a period between the introduction of the disruptive technology and people getting their new, better jobs. Classically, there are plenty of people who starve to death in the hedgerows in that in-between period. No-one really remembers them because they died and don't have a voice any more. The other issue is that, just because there were new jobs for people in the past does not mean that trend will last forever. It seems like a simple matter of robot/AI capability. Once robots/AI are as capable as humans in a particular field, they can take basically all of the human jobs in that field. Think about how many people work in retail stores stocking shelves. That's basically a job that humans have been doing for centuries, and we're definitely on the verge of robots being able to take over there. What job do shelf-stockers move on to? Sure, there may be something, but what happens when they take over the cashier jobs and the cleaning as well as the factory jobs and the logistics jobs, the driving and loading and unloading, the warehouse management, most of the office jobs, etc.? Whether they will take over the management jobs, or whether those jobs will simply become irrelevant because there are no employees to manage any more, is unclear, but everyone isn't just going to move to management.

              Basically, once AI/robot capabilities increase enough, they can not only do most of the existing jobs, they can do the new jobs that replace them, leaving little for humans to do. Now, this is not necessarily a bad thing. It all depends on how the transition is handled. A world where it's handled gracefully and people have leisure time and their needs are met is a pretty good one. Then there's the nightmare scenario where most of the population is now homeless and begging and fighting for scraps. Denying that it's even possible that machines could take most of the jobs and there simply won't be any available replacement jobs is the kind of attitude that leads to the second, dystopian world.

              • by dvice ( 6309704 )

                The reason I think "people will get better jobs" won't work is simply that we don't have a need for so much work. If you think about what humans need:
                - Food (mostly automated already)
                - Housing (won't need more people, as automation increases, will need less)
                - Logistics (will soon be almost fully automated)
                - Energy (won't need much more people)
                - Factory production (mostly automated already)
                - Entertainment (already has more people than needs)
                - Education (personal teacher for everyone? For every 2 people? Would you

          employees also fill the equally important role of "consumers".

          That people work for wages and then buy stuff from corporations became common with the industrial revolution about three centuries ago. It isn't some law of the universe that things have to work that way.

          A computer that can write any program will need human-level intelligence and knowledge of the world. Once that happens, the world will profoundly change, and worrying about "jobs" may be the least of our concerns.

        • by sg_oneill ( 159032 ) on Sunday January 01, 2023 @12:21AM (#63171668)

          This was literally the "contradiction in capitalism" Karl Marx talked about. The ideal company for the owners has no employees, but no employees means no consumers, and a hungry and angry former working class pushed beyond reason.

          Of course he imagined steam powered automation would be the "gravedigger", perhaps he was 170 years off......

      • by dvice ( 6309704 )

        Most of my time goes into extracting specifications from the customer and figuring out what they really want and what is the cost-effective way of implementing it (the "what the customer says is not what the customer wants" problem). I think it would be better to invest in automating that part of programming. Once you have the specifications, the actual code-writing part is quite easy, but even then, most of the time goes into thinking "what is the best way to do this so that it is maintainable".

    • Probably the best take on this I've seen.

    • I find it possible we could see the end of programming in the next decade or two. But that doesn't mean the same thing as the end of software development. 80% of the skills of current software engineers will still be needed to write software, we just might not write lines of code anymore. Just like almost no one writes assembler code today.

    • by AmiMoJo ( 196126 )

      Most people commissioning software don't know what they want. They need someone to look at their business, workflow, and needs, and then tell them what will work.

      • They need someone to look at their business, workflow, and needs, and then tell them what will work.

        Sure, but that someone doesn't have to be a human.

    • by Roogna ( 9643 ) on Sunday January 01, 2023 @11:54AM (#63172328)

      This is it exactly. Will programming change? Absolutely! Has it changed before? Yep! The key is, no matter how good this all gets at translating statements to machine language, you'll still need someone who can figure out what the managers meant in order to write those statements clearly enough for the "AI" to do the right thing. Heck, it will even come with specific languages to write those statements in, so that there's no confusion about what is intended. Then of course you'll need people who speak those languages to the machine learning algorithms that generate the applications... we can call them, oh I don't know, programmers perhaps.

    • This idiot is a CEO. And the primary skill requirement for a CEO is to have completely unrealistic expectations.

  • by quonset ( 4839537 ) on Saturday December 31, 2022 @05:41PM (#63171042)

    If someone is relegated to examining the finished code, how are they supposed to know if it's right if they don't know how to code, let alone read code? Are they supposed to go on blind faith it's correct?

    Situations like this are the exact reason the phrase, "What could possibly go wrong?" came into existence. Delegating everything to computers simply means making mistakes that much more quickly.

    • by ThePhin ( 525032 ) on Saturday December 31, 2022 @06:11PM (#63171106) Homepage
      I agree with you. I've been writing software (in the EDA industry) for almost 35 years, and my own intuition tells me that you're right. But then, I also think of Upton Sinclair's observation: "It is difficult to get a man to understand something when his salary depends on his not understanding it." [oxfordreference.com]
      • by CaptainLugnuts ( 2594663 ) on Saturday December 31, 2022 @06:31PM (#63171162)
        That's a valid point of view, but I look at it this way: what's the hardest part about my job programming computers? It's not the programming, it's getting consensus and an accurate description of what to write.

        I don't see that changing any time soon.

        • by ThePhin ( 525032 ) on Saturday December 31, 2022 @06:55PM (#63171204) Homepage
          Yeah, I don't want to draw this thread out, but you are spot on. Writing code is a small fraction of the task of developing complex software. If AIs can gather requirements (for a novel application, or even just a complex instance of an understood one), then carefully specify the use cases, edge conditions, error handling and many other things, then we might see such a shift. Until then, "I'll have what he's having" :)
          • by NFN_NLN ( 633283 ) on Saturday December 31, 2022 @08:06PM (#63171338)

            There is one particular use case that AI could be suitable for and that's navigating tricky politics.

            Manager> ChatGPT I had to fire the last programmer that told me he couldn't write an algorithm to solve NP hard problems. Are you up for the challenge?
            ChatGPT> You're an idiot.

          • by dvice ( 6309704 )

            Instead of error handling, how about an AI that can find bugs in existing code? They should start with that.
            - It is easy to see if it works or not.
            - If they fail, no-one gets hurt.
            - If they succeed, the world becomes a better place (except for software testers)

        • by Comrade Ogilvy ( 1719488 ) on Saturday December 31, 2022 @11:11PM (#63171620)

          As someone on slashdot wiser than me once pointed out, a descriptive text expressed in language that has sufficient mathematical precision to unambiguously describe what a program will do under all possible situations could reasonably be called writing a program in a programming language.

          If the language is truly unambiguous, creating a compiler for the language used is a very solvable Computer Science problem of non-huge effort.

          "I do not need to know how to write code, because I can just describe what it should do and details do not matter!" boils down to not being able to describe what it should do.
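          A toy sketch of the parent's point (all names invented for illustration, nothing from the article): once a "description" is mathematically unambiguous, it is effectively executable, and checking an implementation against it becomes mechanical.

          ```python
          import itertools

          # Hypothetical "spec": an unambiguous, precise description of sorting.
          # If you can state this, you have effectively written a program.
          def meets_sort_spec(func, inputs):
              for xs in inputs:
                  out = func(list(xs))
                  # Output must be ordered...
                  ordered = all(a <= b for a, b in zip(out, out[1:]))
                  # ...and a permutation of the input (same multiset of elements).
                  permutation = sorted(out) == sorted(xs)
                  if not (ordered and permutation):
                      return False
              return True

          # Exhaustively check all small inputs -- feasible here only because the
          # domain is tiny; in general, complete coverage is the hard part.
          small_inputs = [p for n in range(4) for p in itertools.permutations(range(n))]

          print(meets_sort_spec(sorted, small_inputs))          # a conforming implementation
          print(meets_sort_spec(lambda xs: xs, small_inputs))   # identity fails on unsorted input
          ```

          The punchline is the parent's: the predicate above is just as much "code" as the sort routine it judges.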

        • by serviscope_minor ( 664417 ) on Sunday January 01, 2023 @05:25AM (#63171960) Journal

          That's a valid point of view, but I look at it this way: what's the hardest part about my job programming computers? It's not the programming, it's getting consensus and an accurate description of what to write.

          This applies so generally. Certainly applies to the business case kind of thing where people don't know what they want until they see it. But it also applies at a much lower level too. For example if you're writing some code with libUSB to interact with a piece of hardware and the little fucker only ever really responds to the first request until it's power cycled, but of course issues no errors.

          The code isn't hard, at all. It's dead simple, more or less: 1. init libusb. 2. find device. 3. open device. 4. send request. 5. send request which fucks up.

          Basically a copy of the tutorial example, but in this case something's amiss. What does the AI do? Sure it could recommend the initial, usually correct code, and then some variants. Beyond a certain point, however, the rubber hits the road and debugging has to happen. You need to understand libUSB of course, which is well documented online and of course C. You also need to understand the piece of hardware which may well not be documented online or at all, and you might need to understand the firmware too.

          Did the AI also write the firmware?

          All simple code, not a complex algorithm in sight or really much of anything that would even qualify as an "algorithm". It is definitely the "what to write" which is the problem. One could argue that this is consensus, it kind of doesn't matter exactly what but the software on both sides of the wire must match perfectly.

      • by Zangief ( 461457 )

        this is a very valid objection, but have you considered it also applies to the guy being interviewed?

    • by NFN_NLN ( 633283 ) on Saturday December 31, 2022 @06:29PM (#63171158)

      > If someone is relegated to examining the finished code

      This would probably be in the realm of black-box testing. Run it through a test suite, and if the given inputs yield the given outputs, then it passes. If you don't like what it does, design better input parameters or performance conditions.

      There is no point examining machine generated code. It could restructure/optimize the entire thing the next time you tell it to make a change and all that work understanding it was useless.
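      As a rough sketch of that black-box workflow (the function and acceptance table here are invented for illustration): treat the generated code as opaque and accept or reject it purely on input/output pairs. If the model restructures the code later, the same suite still decides acceptance.

      ```python
      # Stand-in for machine-generated code we never read (hypothetical example).
      def generated_discount(price, code):
          if code == "SAVE10":
              return round(price * 0.9, 2)
          return price

      # The acceptance suite: (inputs, expected output) pairs. This table, not the
      # code, is what a human maintains.
      ACCEPTANCE = [
          ((100.0, "SAVE10"), 90.0),
          ((100.0, "BOGUS"), 100.0),
          ((19.99, "SAVE10"), 17.99),
      ]

      def passes_black_box(func, suite):
          # Judge the opaque function only by its observable behavior.
          return all(func(*args) == expected for args, expected in suite)

      print(passes_black_box(generated_discount, ACCEPTANCE))
      ```

      The catch, as the replies below note, is that a suite like this only checks the behaviors someone thought to list.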

      • by quonset ( 4839537 ) on Saturday December 31, 2022 @06:33PM (#63171166)

        It could restructure/optimize the entire thing the next time you tell it to make a change and all that work understanding it was useless.

        That's my point. If you don't understand the code in the first place, or how to read code, how will you know if what the AI produces is any good? How can you tell it to restructure/optimize if you don't know what you're looking at?

        • I think you are missing the point.

          This guy isn't talking about AI generating C code or anything like that; he wants a machine learning app to be generated - and nobody will know what's going on inside.

          • So when I'm using this machine learning app, I'm also responsible for "teaching" it when it's wrong? Like the self-checkout line only much, much worse.

      • You'll be able to pull in immeasurable wealth if you know how to write a black box test that amounts to "Do only the specified behavior, and nothing else. That is, do this and only this." And if you can guarantee your test suite is complete (doesn't miss any cases.)

        The reason critical systems do (well, should, sadly) not rely on black box testing, but also on inspection and white box testing, is because the only way we presently know how to guarantee absence of unwanted behavior is by inspection. E.g., the

      • by sosume ( 680416 )

        Would you fly in an airplane that was designed in a stable diffusion kind of way, refined until the entire airplane is designed, and it passes all the tests we came up with?

      What's interesting is he mentions the "AI Winter", but fails to grasp one of the reasons behind it: early over-promising by AI advocates, and the subsequent rebound of skepticism toward the entire field. I believe we're going to see another "AI Winter" in the next decade as so many of these promises collapse and people grow cynical about AI, as it fails to get beyond a few cool tools and some interesting party tricks.

      There are undoubtedly going to be some amazing new AI-related tools created that surpass what human

    • If someone is relegated to examining the finished code, how are they supposed to know if it's right if they don't know how to code, let alone read code? Are they supposed to go on blind faith it's correct?

      They can look at the output and test it, just like project managers do now. From the point of view of the project manager, it will work as well as developers do now, but a lot, lot cheaper. Which overall would be a net benefit for the economy.

      In theory, that is. This company is going to have to invent technology that no one has invented yet, or has any idea how to invent, to reach that goal. There is nothing in the article to make me think this company knows how to invent technology.

      His whole company is base

    • by adrn01 ( 103810 )
      Not to mention, what if the software examples the program-bot is trained on turn out to have serious exploits?
    • I guess we will use another AI to verify the results of the first AI. Problem solved.
  • by devslash0 ( 4203435 ) on Saturday December 31, 2022 @05:43PM (#63171046)

    Has someone started the New Year's Eve drinking fest early this year?

  • Black Hats (Score:5, Funny)

    by Barny ( 103770 ) on Saturday December 31, 2022 @05:43PM (#63171048) Journal

    Black hats are going to fucking love this guy.

  • More than half of all coders only know how to hide behind their frameworks and libraries; good luck getting them to understand how to put things together from the ground up properly. This is nonsense, just like all tech predictions before.
  • Good luck (Score:5, Insightful)

    by Tony Isaac ( 1301187 ) on Saturday December 31, 2022 @05:54PM (#63171074) Homepage

    Car companies, and Google, have been working for decades now to try to build self-driving cars, and that elusive goal is still years away. Programming problems are much more complex and varied than the problem being solved by self-driving car technology. The chance that AI will be able to handle general programming by 2053 is about the same as achieving warp drive by that date.

  • You wish (Score:5, Insightful)

    by bettodavis ( 1782302 ) on Saturday December 31, 2022 @05:54PM (#63171076)
    These guys are funny. Not the first time some eediot forecasts the end of programming, thanks to the thousand and one code generators programmers have created for their PHBs, trying to eliminate them from the equation (code monkeys are pricey!) with some simple incantation. Things like CASE, IDEs, specification-driven program generators, and a long etc. Well, it has never worked.

    Why? Well, because programs already are a pretty good formal description of what they do, but their full semantics lies outside of the program context. It relies on perception and flesh-and-blood interaction with the system to get its full semantics.

    Programming already is an exercise of mostly copying pre-existing algorithms in the language you use, filling up the required fluff for the system to work. Stackoverflow code is basically everywhere, with a few bits changed here and there. Even design patterns are a bit trite and very reused, if you flaunt the title of "Architect".

    The problem is not producing the required LOC, but integrating them and making sure they respond to the semantics of the problem in question, by running it and seeing it fly. And for that you still need the code monkeys in front of the screen, able to have the user's perspective.
    • If you pitch that you're going to invent a magical AI box that will eliminate 100,000 jobs paying an average salary of $100,000, which is worth $10 billion per year in "savings", VCs can make a really nifty spreadsheet and slide deck showing the massive expected NPV and ROI. Then you say that "100,000 jobs is on the low side", and they can't write checks fast enough.

    • thanks to the thousand and one code generators programmers have created for their PHBs, trying to eliminate them from the equation (code monkeys are pricey!) with some simple incantation.

      Code monkeys have always been cheap. As the summary points out, what is really valuable is the ability to accurately describe requirements:

      The bulk of the intellectual work of getting the machine to do what one wants will be about coming up with the right examples, the right training data, and the right ways to evaluate the training process.

      I do agree with you that people are being sold a bill of goods as to what AI will do. As an example, the summary uses the slide rule as an argument, but it's a flawed one. Yes, it is an obsolete tool, but the ability to identify what numbers to crunch and how is as important today as it was in the era of a slide rule. Don't confuse the tool with the knowledge requi

  • Makes a lot of noise about the death of programming.

    I am shocked.

  • Right (Score:5, Informative)

    by Drethon ( 1445051 ) on Saturday December 31, 2022 @05:57PM (#63171082)
  • Actually — what he's really saying is that everyone will be a user, which is nonsense.

    In a sense, we are already there — very few people understand machine architecture or deal with assembly. But _someone_ has to... otherwise we can't create anything new. More stuff will be automatically generated — and we'll get to a point where Hello World compiles to a terabyte in size — but no one will care.

    Still — _someone_ will have to understand how

  • It was called COBOL.

    I expect this will work out as well as that did.

  • by Gravis Zero ( 934156 ) on Saturday December 31, 2022 @06:05PM (#63171098)

    Anyone who doubts this prediction need only look at the very rapid progress being made in other aspects of AI content generation, such as image generation. The difference in quality and complexity between DALL-E v1 and DALL-E v2—announced only 15 months later—is staggering.

    The problem with this idea is that unlike art, computer programs are not static; rather, they are highly dynamic. This means that in order to get what you want, you need to be very specific about how elements interact. I can see a future where tools are made to simplify the process by manipulating a series of visual representations, but I don't see any way around being very specific if you want your program to behave properly.

      Anyone who doubts this prediction need only look at the very rapid progress being made in other aspects of AI content generation, such as image generation. The difference in quality and complexity between DALL-E v1 and DALL-E v2—announced only 15 months later—is staggering

      This little bit of idiocy can be rephrased with something like: "AI image generation has progressed so well that it can easily replace over-the-road drivers, since image generation and large vehicle operations are so synonymous with each other."

      The guy's a sales idiot, hoping that he's preaching to even larger idiots.

      Once again: AI will, at best, provide assistive technology in the same vein as automatic code completion. However, it will perform much, much worse than automatic code completion, and will requ

      • The guy's a sales idiot, hoping that he's preaching to even larger idiots.

        If he is, then he's not typical, because he has a Ph.D. in Comp Sci from Berkeley.

  • I'm glad I got my quarter century of development-associated livelihood in before now. I think "most" business programming will either be automated or so commoditized that it might be more lucrative to be a talented barista. I'm a PM now and expect my teams to lose devs and gain technically inclined BAs.

  • I've seen several iterations of the "death of programming" mantra. Each one has failed because programming is fundamentally a hard problem. It's the same reason why the "teach coal miners to be programmers" concept fails. The perception that programmers are caffeinated monkeys pounding on keyboards is prevalent.
    • The perception that programmers are caffeinated monkeys pounding on keyboards is prevalent.

      The problem is that writing code involves typing on a keyboard, and typing on a keyboard is something done by minimum-wage secretaries. Therefore, writing code is easy and can be done by anyone.

      This line of thinking leads clueless CEOs everywhere to believe that it should be easy to eliminate programmers and replace them with some sort of A.I.-powered snake oil.

        This line of thinking leads clueless CEOs everywhere to believe that it should be easy to eliminate programmers and replace them with some sort of A.I.-powered snake oil.

        Only if the AI is cheaper than the very cheap third-world "programmers" they already hired to copy and paste lines from Stack Overflow.

      It's the same reason why the "teach coal miners to be programmers" concept fails.

      That was never really a concept: it was mostly a misrepresentation of other concepts which are not wrong.

      The concept being that coal mining will employ fewer and fewer people, due to both reduction in coal use and crucially ever more automation and mechanisation.

      The coal lobbying industry wants its pet politicians to yell MOAR COAL while shovelling money and/or lower regulations at them to "make jobs", while quietly ignoring th

  • "Self driving cars!" Prediction made by Elon Musk, CEO. They were right around the corner! For like, the past decade. But who needs timelines when you're a "visionary"?
  • Heard it before (Score:5, Insightful)

    by SuperKendall ( 25149 ) on Saturday December 31, 2022 @06:25PM (#63171150)

    I still remember a Byte magazine ad, maybe in the 80's, for a product called "The Last One" - what it was meant to be, as I remember, was a product to replace programmers...

    To my mind, after having used a number of different programming languages professionally, what I see AI as is almost its own kind of programming language, because getting what you want from it is a process of taking in requirements for what people want built, then crafting a specific series of words and symbols you feed to the AI that produces the result you want....

    What does that sound like? Sounds like programming to me! Yeah, it MAY be more efficient, but that doesn't change the fact that someone is still employed to do that work.

    There will always be the need for people that are good at bridging the realm of imagination and technical limitations and reality, those people will be computer programmers even if there ends up being some other name to describe them.

    • But even if AI is its own programming language, when you don't know what goes on inside of it, you can't ever fully understand what it is going to do. It's why self driving cars will probably never work - all the machine learning seems great, until it does something totally unexpected and seemingly illogical, like randomly change lanes, crash into another car, or any of the other multitude of failures.

      Nothing in AI to date has shown that it is anything other than a loose approximation of "intelligence". It has

      • But even if AI is its own programming language, when you don't know what goes on inside of it, you can't ever fully understand what it is going to do.

        The same is true of a sufficiently complex framework... :-)

        It's why self driving cars will probably never work

        They are already working, hundreds of thousands of times a day, every day. And at this point better than most human drivers allowed on the road.

        It has no imagination, it can't try to "think through" previously unconsidered scenarios

        No imagination (

        • The same is true of a sufficiently complex framework... :-)

          I think you're conflating the idea that any one person probably doesn't understand a complex framework, or how it all interacts together, with the un-openable black box understood by no one that AI represents. Even the most complex framework could still be studied and documented, as well as separated into smaller, understandable parts because it's made of components that we fully understand.

          They are already working, hundreds of thousands of times a day, every day. And at this point better than most human drivers allowed on the road.

          Driver assist seems to work ok, but nothing is truly self-driving 100% of the time without issues. I'm surprised they

  • by rknop ( 240417 ) on Saturday December 31, 2022 @06:26PM (#63171152) Homepage

    I can predict with confidence that we are NOT reaching the end of blowhards making bold and broad pronouncements.

  • by avandesande ( 143899 ) on Saturday December 31, 2022 @07:22PM (#63171240) Journal
    Any sufficiently developed requirements are indistinguishable from code.
  • There has been a lot of hype and speculation about the potential for artificial intelligence (AI) to revolutionize the way software is developed and delivered. Some people have even suggested that AI could lead to the "end of programming" as we know it. However, it is important to keep in mind that AI is still a developing technology, and it is unlikely to completely replace the need for human programmers anytime in the near future.

    AI can certainly be used to automate certain tasks in the software developme

  • by inglorion_on_the_net ( 1965514 ) on Saturday December 31, 2022 @07:42PM (#63171296) Homepage

    I expect this will end programming to about the same extent compilers have. There will be a whole range of things we programmers currently spend time on that we will spend considerably less time on, and we will spend that time on other things. And some people will still have to develop the software that makes this possible. We are not toggling switches on front panels anymore, and most of us are not hand-optimizing our machine code anymore, but previous leaps in computer programming have hardly put programmers out of work, and I doubt this one will, either.

  • I've been patiently waiting decades for Magic to be invented, apparently I may still be alive when it comes:) Just think what we'll be able to do with it, why, it will be just like... magic.

  • by irving47 ( 73147 ) on Saturday December 31, 2022 @08:16PM (#63171358) Homepage

    But using that open gptchat ai last week, I was instructing it to write arduino sketches... One that would cycle through 8 sounds on a specific board (that it knew to use the right library for) and also a 6 button simon game using neopixel LED's. I watched in amazement as it spit them out. Probably one of the most amazing (tech.) things I've seen all year.

  • It is actually very easy to come up with a set of requirements, a set of input variables, built-in constants, defined relationships (one-one, one-many, many-many), and what is nullable, and get a program that never fails, ever.

    The first thing users do when they finally get the program is ask for a change in the requirements. What was one-many is now many-many. What was not nullable now can be nulled. What was a fixed length max value now needs to be a varchar. What fit in an 'int' now needs to be BigInt. What was ok in a single timezone now needs to adapt to world-wide scheduling.

    And the program needs to still handle the original data with no issues, and reports written from before all of these changes still need to work perfectly because they're the inputs into another program that belongs to another company and so it can't be changed.

    This is the reality of programming: not a fixed requirement set but an ever changing suite of changes as the customer, now that they have it, keeps coming up with more things they'd like or other data sets they want run through it.

    Maybe someday an AI will be able to adapt like that, but it is still dozens of years away.
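    The kind of requirements drift described here can be sketched in a few lines. The field, the version names, and the rules below are hypothetical, just to illustrate the real constraint: the new code must keep honoring the old contract for old data.

```python
# Hypothetical sketch of requirements drift: a parser that must keep
# accepting the original fixed-format rows even after the field was
# widened and made nullable.

def parse_quantity_v1(raw: str) -> int:
    # Original requirement: quantity is a non-null 32-bit int.
    value = int(raw)
    if not (-2**31 <= value < 2**31):
        raise ValueError("quantity out of 32-bit range")
    return value

def parse_quantity_v2(raw: str) -> "int | None":
    # Changed requirement: nullable, arbitrary precision -- but old
    # reports still feed v1-style data through this path unchanged.
    if raw == "":
        return None          # newly allowed: null
    return int(raw)          # newly allowed: beyond 32-bit range

# Old inputs must still parse identically under the new rules:
for raw in ["42", "-7", "2147483647"]:
    assert parse_quantity_v2(raw) == parse_quantity_v1(raw)
```

    The hard part is not either version in isolation; it is that every change multiplies the set of inputs the program must keep handling forever.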

    • So very true.

      And, even before requirements change, you find out that they were mis-stated. I can't tell you how many times I sat with a customer and had exchanges like this:

      Customer: The remote site can never change anything in the order.
      Me: So, if something has to be voided, it has to be done at the primary, not the remote?
      Customer: Well, no.

      Or,

      Customer: Parts always flow directly from the lehr to the grader.
      Me, paraphrasing: So the lehr never sends parts anywhere except the grader.
      Customer: No.

      and I

  • They had to implement crew rest and work load periods according to this. [ecfr.gov]

    They implemented each requirement to loop through all the pilots and all the flights to find the best match.

    Every time the FAA issued a new rule, they added one more loop over all variables nested inside the innermost loop.

    This code too will be part of the training set.

    Bad programmers don't understand the logic; they copy-paste bad code and then make even more bad loops to make it work. Thus bad code examples proliferate. But good p
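    The nested-loop anti-pattern described above can be contrasted with a single pass that applies each rule as a predicate. The rule functions and data shapes below are invented for illustration; the point is that a new rule becomes a list entry, not another nested loop.

```python
# Hypothetical crew-eligibility check: every rule is a predicate applied
# in one pass, instead of one nested loop per rule.

def max_duty_hours(pilot, flight):
    # Combined duty time may not exceed 8 hours (invented threshold).
    return pilot["duty_hours"] + flight["hours"] <= 8

def min_rest(pilot, flight):
    # Pilot needs at least 10 hours of rest (invented threshold).
    return pilot["rest_hours"] >= 10

RULES = [max_duty_hours, min_rest]  # a new rule = append here

def eligible_pilots(pilots, flight):
    # One pass over the pilots; every rule is checked in place.
    return [p for p in pilots if all(rule(p, flight) for rule in RULES)]

pilots = [
    {"name": "A", "duty_hours": 6, "rest_hours": 12},
    {"name": "B", "duty_hours": 7, "rest_hours": 5},
]
flight = {"hours": 1}
# Pilot B fails min_rest, so only A remains eligible.
```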

  • Is it already time for the weekly "AI will replace programmers" post? I thought the last one was only 2 days ago.

    • Oh, I hope it does. And when that happens, I hope to be on an island somewhere in the South Pacific, sitting on a comfortable couch in my living room, with an industrial sized bucket of popcorn, watching it all play out on my TV screen, via the Webcam I setup right before I left.

      You can't fix dumb. Someone should hide the pointy objects from this idiot before he hurts himself. Just give him a length of string to play with; nice, safe string.

      Actually, you should probably hide the string as well.

  • ... for every time someone hawking program generators has proclaimed the End of Programming over the last 40+ years of my career, I'd be spending the holiday on my private island.

    If it's a really really good program generator, it transfers the workload from writing code to trying to figure out where the requirements were wrong or incomplete. Otherwise, it's not a lot better than cut-and-paste.

  • by NotEmmanuelGoldstein ( 6423622 ) on Sunday January 01, 2023 @01:25AM (#63171744)

    ... encodes the full extent of human knowledge ...

    Yes, this is what learning engines pretend: they know everything and they have all the answers. Actually, their programmers pretend the software knows everything and has the answers. As Douglas Adams's Deep Thought realized, sometimes one doesn't know the question.

    Then there's the human factor: Why does Johnny want 3 squares of chocolate but only one car? The first 'AI' robot (arm), in the 1960s, wasn't taught about gravity, so it tried to build everything top-to-ground. Software is a model of the world and like most models, it doesn't know everything.

    Accurate weather models have been 'around the corner' for 70-plus years, and we're finding we still don't know enough, plus some things are truly unknowable (Eg. quantum fluctuations and chaos theory).

    Self-driving highway-only vehicles, which weren't predicted in the 1960s, still aren't here after 40 years of research: measuring the world is difficult, so cars need GPS and pre-printed 'maps' to pretend they understand the world. This is the bit that confuses most fan-bois: they think that even if a machine doesn't know everything, it can 'see' everything. It reminds me of an "I Love Lucy" episode where a mainframe without cameras, microphone, or speaker verbally tells her what it saw.

  • The elephant in the room is what the training set will be. Getting a good training set is quite non-trivial. I do not mean a 15 sample set. If the system is supposed to be robust and useful it has to address a lot of cases, so the set has to provide multiple samples for each case for generalization. I am not sure this is going to be easier. At least, much less predictable. When you have a formula or an algorithm, one expression can cover a lot of cases over well defined domain. For a training set you have
  • Unless the AI system has full context-aware language recognition and an understanding of a very wide range of business models and use cases, how would you tell it what you wanted it to do? (Even a big problem with human developers who do have those capabilities.)
  • Feed "Code a blockbuster AAA game which will make at least $1bn in sales" to the best programming AI. Come back when that AI generated game sells $1bn worth, we can talk then. Until then, it's all just science fiction, or wishful thinking. Perhaps Matt has a future as a science fiction author?
  • Pseudo-AI requires training sets. Where do you get training sets for new problems?

  • Is automated fuzz testing. Fuzz testing is a great way to patch over the human fallibility of software testing. An AI that understands language syntax well enough would be mostly good for generating randomized tests.

    Maybe even somehow find a way to teach it to understand the semantics of hardware instructions well enough to generate randomized tests that uncover surprising hardware artifacts, e.g. those that produce subtle errors or unexpectedly reduced performance.
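    A minimal version of that randomized-testing idea fits in a few lines of standard-library code. The function under test and its invariant here are hypothetical stand-ins; real fuzzers generate far more adversarial inputs, but the shape is the same: random inputs, checked invariant.

```python
# Fuzz-style randomized test using only the standard library.
import random

def clamp(x: int, lo: int, hi: int) -> int:
    # Function under test: constrain x to the range [lo, hi].
    return max(lo, min(hi, x))

def fuzz_clamp(trials: int = 10_000, seed: int = 0) -> None:
    rng = random.Random(seed)  # fixed seed keeps failures reproducible
    for _ in range(trials):
        lo = rng.randint(-10**9, 10**9)
        hi = rng.randint(lo, 10**9)        # keep lo <= hi
        x = rng.randint(-10**18, 10**18)   # deliberately out-of-band inputs
        result = clamp(x, lo, hi)
        # Invariant: the result must always lie within [lo, hi].
        assert lo <= result <= hi, (x, lo, hi, result)

fuzz_clamp()
```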
  • You'll still need to find an exact way to describe what you need the machine to perform... which in the end boils down to programming, just with bells and whistles attached. This isn't the first such claim.
  • But ... how will we specify for the AI exactly what it needs to do?

    I know ... we'll come up with some sort of standardized syntax to specify business rules and logic! We could call it ... hmm, what shall we call it ... and the people who do it could be called "specifiers" ...

  • I predict the end of Fixie.ai will come before the end of programming.

  • Now I have a good stock to short.

  • Business-execs will be able to write their own code.
  • The machine will only be able to duplicate what has already been done, and not be able to come up with new things
