AI Programming

95% of Code Will Be AI-Generated Within Five Years, Microsoft CTO Says

Microsoft Chief Technology Officer Kevin Scott has predicted that AI will generate 95% of code within five years. Speaking on the 20VC podcast, Scott said AI would not replace software engineers but transform their role. "It doesn't mean that the AI is doing the software engineering job.... authorship is still going to be human," Scott said.

According to Scott, developers will shift from writing code directly to guiding AI through prompts and instructions. "We go from being an input master (programming languages) to a prompt master (AI orchestrator)," he said. Scott said the current AI systems have significant memory limitations, making them "awfully transactional," but predicted improvements within the next year.

  • by LeadGeek ( 3018497 ) on Wednesday April 02, 2025 @12:19PM (#65276613)
    AI is really good at boilerplate implementations and boring stuff, and I think this is simply an indication that programming languages really need to evolve.
    • The problem isn't creating code for the bread-and-butter stuff. The problems start when someone figures out that there's a bug in the system and has to find the root cause.

      Since language models evolve rapidly, it will also be hard to maintain the code just a few years from now. Especially with a self-learning, evolving language model.

      Add to that legal requirements that change over time.

    • The real long term problem is:
      AI generates plausible code because it is fed human written code.
      Once >50% of the code is AI generated, _if that ever happens_, well, we are in a self-amplifying diverging crap generator with no exit or alternative.

  • School Assignments (Score:3, Interesting)

    by Calydor ( 739835 ) on Wednesday April 02, 2025 @12:19PM (#65276615)

    It's easy for the AI to write 95% of the code if it writes horribly bloated and inefficient code. Remember using a larger font, wider linebreaks etc. to pad out boring essays? That's how they'll easily accomplish a promise such as this.

    • So, basically "business as usual" for Microsoft products then.

    • by Hadlock ( 143607 ) on Wednesday April 02, 2025 @12:55PM (#65276755) Homepage Journal

      Most developers just need to crank out tickets; if efficiency becomes a problem, they'll tackle those problems from worst to least bad. Almost certainly the code AI writes won't be worse than the database concurrency problem or whatever major showstopper they're trying to fix this week. Done > inefficient

      • by mmdurrant ( 638055 ) on Wednesday April 02, 2025 @01:27PM (#65276863)

        It shocks me how people fail to understand that this isn't a problem of technology or programming languages - it's a problem of capitalism and how these organizations are managed. It's why I say "private equity poisons everything" because it does. If pure-profit is what drives engineering decisions - just like every other market sector (wanna order a pizza from Pizza Hut? They don't wanna pay delivery drivers and instead contract it to DoorDash which inevitably results in a worse experience) - you're gonna end up with a worse product.

        The only thing ML-driven code generation (I respect myself and others too much to call it AI) adds to that equation is being able to make garbage at-scale.

        • by nightflameauto ( 6607976 ) on Wednesday April 02, 2025 @01:33PM (#65276891)

          It shocks me how people fail to understand that this isn't a problem of technology or programming languages - it's a problem of capitalism and how these organizations are managed. It's why I say "private equity poisons everything" because it does. If pure-profit is what drives engineering decisions - just like every other market sector (wanna order a pizza from Pizza Hut? They don't wanna pay delivery drivers and instead contract it to DoorDash which inevitably results in a worse experience) - you're gonna end up with a worse product.

          The only thing ML-driven code generation (I respect myself and others too much to call it AI) adds to that equation is being able to make garbage at-scale.

          That would explain why Microsoft's CTO would be all about it. "Garbage at scale" has been their secret company motto for decades now.

        • by Hadlock ( 143607 )

          Unfortunately garbage at scale is all that's needed in most cases

    • Since it usually takes as much time to review code properly as it does to write it, what are we gaining here? I certainly wouldn't trust any AI currently available to write code without checking it carefully.

  • by sdinfoserv ( 1793266 ) on Wednesday April 02, 2025 @12:26PM (#65276637)
    begrudgingly agree. I think the real and unaccounted for cost will be future maintenance. When no one is left who understands the art or logic of programming, maintenance and security will be relegated to an afterthought ~ like it is now for IoT garbage.
    As code generation is automated, entire systems will become disposable when new feature sets are required.
    • by nightflameauto ( 6607976 ) on Wednesday April 02, 2025 @01:36PM (#65276905)

      begrudgingly agree. I think the real and unaccounted for cost will be future maintenance. When no one is left who understands the art or logic of programming, maintenance and security will be relegated to an afterthought ~ like it is now for IoT garbage. As code generation is automated, entire systems will become disposable when new feature sets are required.

      Consider the perspective of the companies behind this push. If every new library update, every new feature update, every new security update, has to be a ground-up rebuild because there is no one and no "coding system" that actually has a fundamental understanding of the code base, that's a lot of machine hours hired out. They're building in obsolescence at a rate and scale that has only been dreamt of up to now, and it's all designed to make sure that everybody has to pay them constantly for every update forever and ever.

      • by sdinfoserv ( 1793266 ) on Wednesday April 02, 2025 @02:15PM (#65277009)
        Interesting thought experiment. I hadn't taken it that far. At some point, costs will get so egregious, customers both corporate and home, will simply tap out. Like many are doing with VMware. I'm ripping it from my corporate environment, along with every other Broadcom product we have. VMware was the casualty of price gouging, CarbonBlack was the blowback from my anger at Broadcom.
        If AI-generated crap and SaaS/PaaS price gouging go too far, I can bring programmers back into the organization.
        • by nightflameauto ( 6607976 ) on Wednesday April 02, 2025 @02:35PM (#65277065)

          Interesting thought experiment. I hadn't taken it that far. At some point, costs will get so egregious, customers both corporate and home, will simply tap out. Like many are doing with VMware. I'm ripping it from my corporate environment, along with every other Broadcom product we have. VMware was the casualty of price gouging, CarbonBlack was the blowback from my anger at Broadcom. If AI-generated crap and SaaS/PaaS price gouging go too far, I can bring programmers back into the organization.

          I think the AI prophets are counting on holding the cards for just long enough that real programmers will either switch careers or retire before we hit that point. I hope they're wrong. Sure is fun living in the end-run of Greed as God.

  • I merely assumed they'd been using AI-generated code for some years now.
    (/low hanging fruit joke)

    • by Lewie ( 3743 )

      This.

      • That long and expensive developer program, decades of conferences, petabytes of example code, gruesome backwards compatibility, all dashed against the rocks of AI.

        It's almost side-splittingly funny. OMG. Seasoned Microsoft devs, lemmings led to the cliff, following "progress" to the end while its leaders feign that AI will cast them aside. No unemployment insurance because most were contractors, lots of prime real estate to sell now that no one will be in an office.

        It's a gleeful day, those AI investments

  • by Somervillain ( 4719341 ) on Wednesday April 02, 2025 @12:29PM (#65276645)
    Not surprising, Claude and ChatGPT, when they actually work, produce some BLOATED code. 95% of useful code?...or 95% of lines of code? I know MS is trying to sell you their solutions, so they probably meant useful. However, this is not necessarily a good thing. If your code reinvents the wheel at a great performance penalty...sure, you saved 10 minutes vs having a skilled professional write it, but that comes at huge ecological expense....assuming the bloat didn't add bugs or security vulnerabilities. Always remember...

    Nothing is more expensive than a cheap programmer.

    But to the point, this metric is not as meaningful as I think the original author intended.
    • by vux984 ( 928602 )

      Yep, I was looking at some code the other day that formatted a date a little unusually (it was being used as part of a file name, so it was creating a string of "yyyy-MM-dd_hh-mm-ss"), and there was a whole 20+ line custom function, used exactly once in the project, to extract the date and time bits, pad them, and assemble this string.

      Replaced it with one line of code using a custom date-format string and a date-time library the project was already using (luxon).
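For illustration, the same consolidation sketched in Python (the actual fix used a luxon format string in JavaScript; `timestamp_for_filename` is a name invented for this example):

```python
from datetime import datetime

def timestamp_for_filename(dt: datetime) -> str:
    # One format string replaces the hand-rolled extract/pad/join helper.
    return dt.strftime("%Y-%m-%d_%H-%M-%S")

# e.g. timestamp_for_filename(datetime(2025, 4, 2, 12, 19, 7)) == "2025-04-02_12-19-07"
```

The point stands in either language: a single library call is easier to review than twenty lines of padding logic.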

      Nothing is more expensive than a cheap programmer.

      Truly. And nothing is a cheaper programmer than

  • by greytree ( 7124971 ) on Wednesday April 02, 2025 @12:29PM (#65276647)
    5% of the human programmers' time will be spent writing the code the AI can't write, and the other 95% will be spent debugging the code the AI did write.

    Oh joy.
    • by nucrash ( 549705 )

      If you like fixing other people's shit with no way to scream at them for their incompetence, this is the future for you!

      • If you like fixing other people's shit with no way to scream at them for their incompetence, this is the future for you!

        Well this is from the Microsoft CTO. Seems par for the course.

      • If you like fixing other people's shit with no way to scream at them for their incompetence, this is the future for you!

        AI uses more code comments than nearly every human developer I've worked with.

        I use Amazon Q to document human written code and it does a remarkably good job of it.

    • 5% of the human programmers' time will be spent writing the code the AI can't write, and the other 95% will be spent debugging the code the AI did write. Oh joy.

      Look on the bright side, meatsack. You might be one of the lucky ones who can sustain some kind of justification for your human employment long enough to survive the UBI starvation wars.

      You know, that we-know-it’s-coming transitional timeframe before Greed N. Corruption was forced into tribunal to account for a billion dead due to starvation driven by AI mass migration events and the corrupt lobbying against UBI taxes.

      • 5% of the human programmers' time will be spent writing the code the AI can't write, and the other 95% will be spent debugging the code the AI did write. Oh joy.

        Look on the bright side, meatsack. You might be one of the lucky ones who can sustain some kind of justification for your human employment long enough to survive the UBI starvation wars.

        You know, that we-know-it’s-coming transitional timeframe before Greed N. Corruption was forced into tribunal to account for a billion dead due to starvation driven by AI mass migration events and the corrupt lobbying against UBI taxes.

        You have more faith in humanity than I do. The corporations almost outright own the government today. I don't see a time when forced tribunal would come for Greed. I see a time when their ultimate comeuppance will be a slow dawning realization that they have no one left to fleece, as the technological world collapses around them because they're the only humans left, and it turns out money is not actually knowledge.

      • When these times come, companies will find themselves without any customers. Even government contracting relies on taxpayers existing and not unchecked inflation. The very concept of companies themselves will become obsolete and we'll end up with a dystopian form of Communism 2.0 where the means of production can only be afforded by the government and there are no taxes nor UBI because currency begins to fail as a concept.

    • by omnichad ( 1198475 ) on Wednesday April 02, 2025 @02:25PM (#65277035) Homepage

      I know some people who work in medical transcription. It's exactly like you described. The pay has dropped off a cliff because companies have been sold on the idea that this will save them money. But it actually takes longer to find and fix mistakes than it does to do it from scratch. Kind of the same issue we have with "full" self-driving.

  • by david.emery ( 127135 ) on Wednesday April 02, 2025 @12:32PM (#65276657)

    We'll have errors in the AI-generated code (although that rate will hopefully decline, if we can figure out how to teach AI 'good code' from 'bad code'). In particular, a lot of the coding errors (e.g. those that result in security vulnerabilities) will probably go down. But in the short run, I'd expect human-generated errors to go up, until we get better at writing specs for the code to be produced by the AI. And design errors will continue. The work done by Nancy Leveson and John Knight on n-version coding for safety-critical systems (showing that independent groups of designers tend to make the same mistakes) is still relevant. http://sunnyday.mit.edu/papers... [mit.edu]

    What I'd -hope- would happen is that we get a lot better at software specification, that's both more rigorous requirements and more rigorous design/code. The potential effort savings from machine-generated code SHOULD BE used to get better designs, and in particular designs that can be shown correct by more than just "enough testing."

    But the pessimist in me says that "AI generating code will reduce the effort curve to talk to the AI, and the result will be more really poorly -designed- software, even if the generated code does what it was asked to do."

  • Public domain (Score:5, Interesting)

    by Mononymous ( 6156676 ) on Wednesday April 02, 2025 @12:33PM (#65276659)

    Since all of that code is uncopyrightable, I guess that achieves what the anti-copyright extremists want.

    • Re:Public domain (Score:4, Interesting)

      by Prof.Phreak ( 584152 ) on Wednesday April 02, 2025 @01:25PM (#65276857) Homepage

      ...that code is uncopyrightable

      he was clever to say "authorship is still going to be human"... even though we expect AI to write all of it...

      If AI can really write "nearly all" of Microsoft's code... wouldn't that capability put Microsoft out of business? (e.g. hey, AI, write me an operating system, business software, etc., from scratch!, and make it backwards compatible to everything I have.... on second thought, rewrite all of my legacy stuff too!... on third thought, forget me using software at all, you go and generate revenue for me, I don't care how...).

      With the AI "future" they're pushing, what would be Microsoft's competitive advantage over, say, anyone else?

  • by whitroth ( 9367 ) <`su.tnec-5' `ta' `htortihw'> on Wednesday April 02, 2025 @12:34PM (#65276665) Homepage

    Companies start going out of business, or reverting to older code bases, because the "AI"-generated crap is completely unmaintainable.

    • ...because the "AI"-generated crap is completely unmaintainable.

      Or equally (or more) likely, courts rule that all the "AI" companies are massive copyright infringers (which they are, and have admitted to being), and the entire "AI" bubble experiences a nuclear implosion. Then all the "AI"-generated code is replaced in a big scramble before the copyright infringement lawsuits hit their customers' front doors. The need for experienced programmers hits a high not seen since Y2K.

      • I don't think that is likely at all. All the big-tech elites are producing AI models. Collectively they hold a tremendous amount of political power (they should not, of course, but wealth = political power and they've got it). Even if they do somehow get rulings against them, they have the power to change the law to allow themselves an exception. Or some petty "feel good" compromise where they toss a few pennies in the direction of a few middle-sized businesses that own a lot of the copyrighted code, an

  • by nightflameauto ( 6607976 ) on Wednesday April 02, 2025 @12:38PM (#65276683)

    If code engineering, ground-up, is able to be done by machine, and automation is essentially already on the brink of replacing most types of work, what's the use of those pesky humans? There's no need to employ them. I mean, they ask for things employers don't want to give them, like time off, a living wage, insurance (since we have to keep that tied to a job), benefits, and they don't give you anything you can't get from the machines. With no one employed save a few C-Suites and board members to keep the machinations machinating, the majority will be out of work, and out of work = irrelevant. Non-value to business = non-value to community = non-value to society. And things that aren't valuable have no purpose to exist.

    So, since the C-suites are day-dream fapping about how automation will replace us all, what is their plan for how to deal with all the excess bodies? We know that our governments won't do shit about it. They're beholden to the business sector, not to us. So where do we go? Do we just die in the streets once we lose our homes? What's the end-game in this fantasy scenario that we're having to watch be publicly discussed constantly without any discussion at all about the long-term repercussions? Do we just rush headlong toward it and pretend everything's fine until we run off the cliff like a bunch of lemmings? Or do we start discussing what's coming and trying to sort out a solution other than, "Protect the rich, let everybody else die?" Just curious if we even care at this point, or if it's so important to protect the business class that even this daydream of complete dominance over humanity must be sustained at all costs.

    • They'll need lots of backhoe drivers.
      At least for a little while.

      • They'll need lots of backhoe drivers. At least for a little while.

        Have you seen a recent construction job by a high-tech construction company? I watched a great big backhoe run itself this summer for about an hour before I got bored with it. The dude pulled it into the site, set up a couple laser guides, popped in a programmed foundation guide, and turned it loose. He had a killswitch on his belt to stop it if somebody wandered into the area. I'd imagine in another few years, there'll be even less need for a human to be around.

        • And the foundation is the hard part. I saw a hotel being built near me and it was just prefab wall modules being dropped into place by crane - insulation and framing and all completely finished.

      • Humans mattered?

        There was a time, somewhere around the industrial age, where they started to matter less. We're just entering the age where we don't even pay lip-service to the thought that humans are a part of the equation at all. Up to the last few years, businesses and governments still sometimes made rumblings that individuals mattered. Not so much anymore.

        • Interesting paradox. It's also when the Romantic poets felt we started losing our humanity, but at the same time humans started believing that they had an outsized position in the universe.

          Not that humans of the Western Civilized proclivity have not long possessed an outsized sense of entitlement. We now feel a need to save the whales, but who can save the humans from themselves?
    • If code engineering, ground-up, is able to be done by machine, and automation is essentially already on the brink of replacing most types of work, what's the use of those pesky humans?

      You're assuming this stuff writes "good enough" code. I can tell you...it doesn't. It's like a roomba...not useless, expensive, and fun to play with. But when a homeowner buys a roomba, do they fire their maid? do they throw away their brooms?

      Will it someday? Eh...I am skeptical...unless they have totally new algorithms for AI. Generative AI is just fancy pattern matching. Logically, it can take known patterns and reapply them to similar situations. However, most programmers are hired to write ne

      • If code engineering, ground-up, is able to be done by machine, and automation is essentially already on the brink of replacing most types of work, what's the use of those pesky humans?

        You're assuming this stuff writes "good enough" code. I can tell you...it doesn't. It's like a roomba...not useless, expensive, and fun to play with. But when a homeowner buys a roomba, do they fire their maid? do they throw away their brooms? Will it someday? Eh...I am skeptical...unless they have totally new algorithms for AI. Generative AI is just fancy pattern matching. Logically, it can take known patterns and reapply them to similar situations. However, most programmers are hired to write new code and solve novel problems. If the problem was solved in the past, no one would hire you. If Generative AI actually worked...which most of the time it doesn't...it could save time, but not eliminate a skilled programmer. At the very least, you need a human being to sign off that you didn't just expose your employer to a massive lawsuit by writing insecure code.

        OK, so you don't believe me?...fair...I would pose this question. The promises of Generative AI are to generate near-infinite wealth from electricity and time. If you had a magic coding machine, you could basically print money and be the wealthiest company in the world. Given the trillions spent already and the massive motivation to make unfathomable profits, why are these companies selling mere toolkits? Why not sell finished goods? For example, MS, Google, or Amazon have massive cloud hosting environments. Why not sell services to have Generative AI fix your code for you...at a HIGH monthly fee? Businesses would trample over each other for the opportunity to pay any cost for such a service. Why not? Because you could objectively determine if this shit works.

        Similarly...why isn't there a generative AI CLR for Microsoft that takes shitty slow C# code written by someone who sucks at coding and converts it to the leanest C or assembly imaginable? Why even bother with a runtime environment? How about a product that replaces the CLR, JVM, or Python runtime and outputs C/Rust/assembly/whatever...takes your sloppy prototype code, optimizes it, and converts it to the leanest executable imaginable? Why?...because we'd figure out INSTANTLY that this shit doesn't work. OK, performance and security isn't your thing? Why doesn't MS write a Generative AI level generator for your favorite games?...infinite content for a monthly fee...why?...because you'd figure out instantly that it doesn't work.

        I don't disagree with your assessment of current gen AI generated code. I just enjoy playing with the theories the AI prophets are pushing right now. The what ifs are fascinating to me. It reveals the emptiness in all the promises, while also revealing the ultimate psychopathy behind the machines (companies) in charge of the machines they're pushing.

  • by nothinginparticular ( 6181282 ) on Wednesday April 02, 2025 @12:39PM (#65276687)
    LLMs have very low "intelligence" and have no ability to truly create. All they can do is write code based on an amalgamation of github, stackoverflow etc. If the programming languages and software requirements never change then, yes, maybe we could keep churning out code from the same training data. But everyone knows that software is ever changing, which means we will still need an army of developers to write the new programs that can feed the training data for the LLMs. If he's suggesting that we can have AGI in the next five years, well that'll never happen with LLMs which essentially fit curves to datapoints. With regard to AGI, there has so far been no meaningful development in this area. The rate of improvements to LLMs has greatly decreased recently because they've been trained on pretty much all available training data! At least that's François Chollet's recent take on things (I'd highly recommend listening to his episode of Sean Carroll's Mindscape podcast).
    • LLMs have very low "intelligence" and have no ability to truly create.

      Ya, bullshit.

      All they can do is write code based on an amalgamation of github, stackoverflow etc.

      Ya, bullshit.

      If the programming languages and software requirements never change then, yes, maybe we could keep churning out code from the same training data. But everyone knows that software is ever changing, which means we will still need an army of developers to write the new programs that can feed the training data for the LLMs.

      Given they have no capacity to continue learning, it is true that their concept of the state of the art is frozen in time. This is definitely their greatest shortcoming right now.

      If he's suggesting that we can have AGI in the next five years, well that'll never happen with LLMs which essentially fit curves to datapoints.

      This tired argument.
      Yes, neural networks fit curves to data points. Biological, mathematical, all of them. They're universal function approximators.
      Where the fun starts is when you add state (memory) and recurrence.
      In LLMs, this is accomplished with the context window, and feeding it back into the LLM a

      • Where the fun starts is when you add state (memory) and recurrence. In LLMs, this is accomplished with the context window, and feeding it back into the LLM again for every generated token. This means no, the combined system is far more than something that merely "fits a curve to the data points." It's capable of actual computation.

        You can test this by asking it to count whether parentheses are balanced. If it can't do that, then it's not using the context window effectively.

        For Turing machine level of computation, it will need something like scratch memory where it can go back and modify things it has written in memory.
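As a reference point for the parenthesis test described above, the check itself is only a few lines of conventional code (a plain counter suffices for a single bracket type; a stack generalizes to mixed brackets):

```python
def parens_balanced(s: str) -> bool:
    # Depth rises on '(' and falls on ')'; it must never go negative
    # and must end at zero for the string to be balanced.
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:  # a ')' with no matching '('
                return False
    return depth == 0
```

An LLM passes this test only if it can carry the equivalent of that running count across its context window, which is exactly the kind of state-tracking the thread is arguing about.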

        • You can test this by asking it to count whether parentheses are balanced. If it can't do that, then it's not using the context window effectively.

          It can.
          I've posted demonstrations here before.

          Of course "it" means a suitably powerful one.
          For my tests, I used QwQ-27B-fp16. Not even the smartest. In its reasoning tokens, it created a stack and balanced it.

          For Turing machine level of computation, it will need something like scratch memory where it can go back and modify things it has written in memory.

          Or an ability for the tape (context) to overwrite past contexts. That is, the LLM can simulate a memory in the context that's refreshed every response.
          There's a paper somewhere demonstrating it on even a much older LLM.

          This does mean that in the normal way that you interact with it, it's

          • This does mean that in the normal way that you interact with it, it's not Turing Complete always, but it can be effectively Turing Complete at times, or always with the right system prompt.

            What does that mean? What are the limitations?

    • All they can do is write code based on an amalgamation of github, stackoverflow etc

      We're all "standing on the shoulders of giants" to a degree, and humans do the same. We may be far from AGI, but very few people are using GI when writing code either. Even when doing complex things. Most new patents these days are just X+Y combinations of things that already exist.

  • by Virtucon ( 127420 ) on Wednesday April 02, 2025 @12:40PM (#65276695)

    "Ditch those pesky, competent software engineers! Embrace the glorious chaos of our Hallucinogenic AI LLMs! Why write maintainable code when you can generate digital spaghetti that even a seasoned spaghetti monster would find perplexing?

    Imagine: Your Senior Managers, those glorious relics of a bygone coding era (circa 2003, when Java was really cool), churning out entire systems in an afternoon! Based on the finest, most confidently incorrect answers from StackOverflow, naturally. Think of the 'learning opportunities' when you have to pay us exorbitant fees to untangle the mess!

    Our LLMs are trained to respond with the same delightful, condescending snark as your favorite StackOverflow contributors. Feel right at home being told you're doing it wrong, even by a machine! It's like a warm, passive-aggressive blanket of familiarity.

    Don't understand the gibberish our AI spits out? Perfect! That's job security... for us! We'll happily dispatch a team of 'experts' from a location so offshore, that they're practically swimming with the fishes, to 'fix' things at a mere $500 an hour. (Plus travel, lodging, and emotional support for our team after dealing with your code.)

    This isn't just coding; it's CASE 4.0: The Revenge of the Unmaintainable! Remember the CASE hype? We've taken that level of over-promising and under-delivering and injected it with pure, unadulterated AI-powered madness.

    So, throw your sanity out the window and join the revolution! Your shareholders will love the burn rate! It's not a bug, it's a feature... a very expensive, very confusing feature!

  • by ChunderDownunder ( 709234 ) on Wednesday April 02, 2025 @12:47PM (#65276711)

    Windows and Office are so second millennium.

    If 95% of the code is generated by AI, they can lay off 95% of their programmers.

    And who needs an office suite if AI is savvy enough to hallucinate a document that no human will ever need to read, because we have AIs now to process the paperwork of all those silly reports humans used to produce in Word, PowerPoint and Excel.

  • Auto code generation has ALWAYS been a major thing. For example, NASA has claimed 70% to 80% autocoding since its inception. I guess it was dismissed as liberal lies until now...
  • by avandesande ( 143899 ) on Wednesday April 02, 2025 @12:50PM (#65276723) Journal
    If Windows 12 is 95% written by AI.
  • by FudRucker ( 866063 ) on Wednesday April 02, 2025 @12:51PM (#65276725)
    that's why Windows is such a bloated kludge; hopefully Linux does not fall into that abyss
  • by TheMiddleRoad ( 1153113 ) on Wednesday April 02, 2025 @12:51PM (#65276727)

    If the software engineers have a good specification that breaks down into the smallest possible chunks of logic, then they can prompt each chunk individually and understandably until they get sufficiently high-quality code.

    And prompts can be made to create that specification, again chunk by chunk.

    Eventually, there will be systems that can do that whole process.

    The bad part will be that, eventually, few will understand the chunks let alone the overall specifications, or the software that makes them, as there won't be many career paths to train people to that level.

    We're going in a bad direction in the long term, and market forces are going to fuck us over worse than ever.

    • Reasoning LLMs do not require breaking down logic.

      In fact, they're quite good at pointing out the broken logic of overconfident and poorly educated humans.
      • I've seen reasoning LLMs make simple programs, like snake, but never anything big. I think we're a long way off from getting anything good by saying, "Make a specification for a system that makes social security payments".

        • Please take this response as being in good faith and fun-

          But if you were to ask an engineer that exact question, how worried would you be about them cutting you in half with mutant cyclops powers?
          You've outlined a critical problem with LLMs here, actually, in that the LLM will try to satisfy that request, instead of what an engineer would do, which is start plotting your violent removal from the gene pool.
  • by bradley13 ( 1118935 ) on Wednesday April 02, 2025 @12:51PM (#65276729) Homepage

    I'm old enough that I have written programs in machine language. Also assembly. Not much call for that anymore. Now, you say "new ArrayList" or whatever. Do you know how much machine code that generates, and how much it relies on libraries also written in a high-level language?

    Tools advance. Maybe we are on the threshold of the next major shift. There will always be people writing Java, C#, etc, just as a few people still write assembly. But for the masses? Newer, better tools that lead to higher productivity.

    This kind of disruptive shift has happened many times. I know a guy who started out as a typesetter: putting metal letters in a frame. Then he moved on to new tools in the printing trade. Now professional printers barely exist - he became a gardener.

  • by swan5566 ( 1771176 ) on Wednesday April 02, 2025 @12:53PM (#65276735)
    ... the vast majority of code used when creating new things comes from libraries that someone else already wrote. There's nothing scary about that (unless there are bugs, but I digress). AI might help me be more efficient at delivering results, but I'm not sensing AI is getting anywhere close to replacing me.
  • Even if AI wrote 99% of the code, that is all well and good... but there is always that 1% which will be a show stopper.

    This isn't new. People have been copying and pasting from Stack Overflow for years. I've used AI for code, and for something mainstream it can work okay, but with anything that isn't 100% mainstream things can break down, as the AI can confidently generate garbage or try to invoke methods that don't even exist.

    Sometimes AI can happily write a page of code... and d

    • The tricky thing will be what it does to the talent pipeline. In practice a lot of the high-clue programmer supply is either trained or discovered by initially doing the somewhat lower-skilled stuff and gaining experience and working with more experienced people.

      If companies think that they can replace their entry level programmers with bots they'll presumably do so; but if there are basically no entry level programming positions to be had it's unclear who will be gaining experience to become the more se
    • The difference between referencing Stack Overflow and AI is the conversation. AI gives you an authoritative-sounding answer whether correct or not; Stack Overflow offers a dialog on possible solutions. It's a huge difference.
  • I wonder, was there a point at which we hit "95% of all code is libraries"? Because it's true in almost every tech stack I've ever touched.

    And it doesn't get viewed (as much) in the same virulently negative light.

    After all, if we didn't have the tool Tomcat, every single web project would be required to hire a *whole* lot more devs.

  • In the NSA, FSB and among cyber criminals. Insurance companies beware...

  • He will quit his job, give back the money he had been paid, and then kill himself? Come on, put some skin in the game, Kristin.
  • I predict 95% of CXOs to be ground up into dogfood in the next five years. Whose prediction is more likely?
  • because they'll be bad at AI prompts and checking the AI-generated code.

    Good developers will create code better than AI because they understand the user requirements better than can be expressed in AI prompts. Plus they can see much further into the future and will write maintainable code that AI can't.

  • It's not surprising that someone from Microsoft would say this. I suspect AI has been writing code for Microsoft from just before Vista came out. Maybe earlier.
  • by ZipNada ( 10152669 ) on Wednesday April 02, 2025 @01:35PM (#65276895)

    This is an entirely accurate statement. We are already hearing that AI writes a significant amount of the code at Google and tech companies are laying off developers in droves. I do all my development work with AI these days, it is a huge productivity boost. People who criticize it here are obviously not using it.

    You can start off simple; 'examine this codebase and explain it to me'. You'll get loads of details about it without having to plow through it all yourself. You can say 'document this code with comments and a README' and you will get that in a few minutes. A huge savings of time and drudgery. You can say 'refactor this large file into multiple modules that can be independently tested and reused. Write test cases for them'. You will get that in under 10 minutes. Anything you don't like you can simply reject and revert back to the original.

    Then you can experiment. Create a branch and try some things out. 'As this code processes data, accumulate statistics about the performance of these specific modules, store that data in a local database and be able to generate reports that show these key metrics'. No problem, all of that will be produced for you and it will even make a website that shows the reports if you want.

    • You'll never find a programming language that frees you from the burden of clarifying your ideas.
      -Randall Munroe
      https://xkcd.com/568/ [xkcd.com]

      • What's great is that you can start off with ideas that are pretty vague. If you don't like the results you can give the AI more specific instructions. And if you aren't willing to use AI to do any of that you will be out of a job pretty soon.

  • It surely will (Score:3, Interesting)

    by jkechel ( 1101181 ) on Wednesday April 02, 2025 @01:36PM (#65276903)

    Why Not? It’s Already Working Great for Small Projects

    Example 1: Annoyed by Amazon Prime Video showing you all those “buy or rent” videos? Just go to an AI of your choice, give it a simple prompt like “I want a Chrome extension that only shows me videos with a Prime badge,” and provide the HTML source of one of those pages. Bam—never see those unwanted videos again.

    Example 2: Tired of seeing those annoying mobile game ads on Netflix? Do the same as above—give it Netflix’s HTML, and let AI filter them out for you.

    Example 3: For larger projects using Cursor, I just highlight the code where I want to add a parameter, condition, or whatever, describe the change I want, review the AI’s suggestion, and accept or tweak it as needed.

    Example 4: For new projects, you can guide AI step by step:
            1. Provide the planned software documentation and ask it to outline a code structure (refine as needed).
            2. Generate unit and integration tests automatically (or outsource them).
            3. Let AI write the code to pass those tests (or outsource it).

    This already works today—just try Claude Code or similar tools.

    Sure, the real thinking and specification still need to be done, but AI speeds up the process. I don’t mind if it takes a day for tests and a week for implementation as long as it works—and it’s only getting better, cheaper, and faster.

    The key now is having large enough context windows to handle specifications, coding style guides, to-do lists, and 10–50 source files at once. No need to dump the whole Linux kernel in there.

    Next, we’ll see models paired with a smaller, possibly local, AI trained on your specific codebase. It will remember your preferences, decision-making, and prompts, optimizing how it interacts with the larger remote model while working through its own to-do lists.
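    Steps 2 and 3 of Example 4, at their smallest: the test exists first, and the implementation (human- or AI-written) is reviewed against it. `parse_duration` and its behavior are invented for illustration, not taken from any real project.

    ```python
    # Minimal test-first loop from Example 4: the test is written before
    # the code, and the generated implementation must satisfy it.
    import re

    def test_parse_duration():
        assert parse_duration("1h30m") == 90
        assert parse_duration("45m") == 45
        assert parse_duration("2h") == 120

    # Implementation written (or AI-generated) to pass the test above.
    def parse_duration(text: str) -> int:
        """Convert strings like '1h30m' into total minutes."""
        hours = re.search(r"(\d+)h", text)
        minutes = re.search(r"(\d+)m", text)
        total = 0
        if hours:
            total += int(hours.group(1)) * 60
        if minutes:
            total += int(minutes.group(1))
        return total

    test_parse_duration()
    print("all tests pass")
    ```

    Whether a human or a model wrote `parse_duration`, the review burden is the same: read the test, decide whether it actually captures the requirement, and only then trust a green run.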

  • by organgtool ( 966989 ) on Wednesday April 02, 2025 @01:40PM (#65276919)
    AI Prompt 1: Take my code and add a feature that does the following things...
    AI Prompt 2: No, do it like this
    AI Prompt 3: No, I need it to do this other thing as well

    If this ever pans out, you'll probably spend most of your time writing AI commands rather than programming commands. I don't know that development will be any faster, but it does seem a lot more annoying.
  • Are we just writing code, or are we writing programs?

    So much of what we do is just reimplementing what's already been done without innovation anyway, let's skip all that and do the fun stuff, let the AI do the boilerplate and lifting, let's do what we need to make it better.

  • Put up or shut up, assholes. How much of your own money are you willing to wager on this? Is it zero? Then I don't see why any of us should believe what you say, since you yourself don't.
  • Prompt is code. It's like Python but a more retarded way to talk to computers. I mean python was already for retards, this is taking it to another level. You're telling a computer what you want it to do. LLM is like the first stage of a compiler. No different from a high level programming language. Maybe even higher level language, like cocaine high.

    • Wasn't that what COBOL was supposed to do?

      Statements in that programming language sounded like the utterances of a PHB, ergo a PHB could code in COBOL without having programmers around?

  • ...was going to happen 15 years ago but as it turns out humans are not smart enough to replicate human intelligence just yet. In 100 years they'll look back and wonder how anyone thought they could replicate human-level intelligence with primitive shit.
  • by databasecowgirl ( 5241735 ) on Wednesday April 02, 2025 @02:09PM (#65276999)
    It's important to remember that in 2016 all the AI visionaries were prophesying a five-year expiration date for ALL truck driving jobs.

    These soundbites work because "within five years" is corporate speak for "this makes me sound smart and I'm banking on no one being able to fact-check this." The press loves it. Makes great clickbait. No need to consult an opposing position.

    My observation is that AI replaced A1 some time ago, making every day artificial fool's day for those who want to believe.
    • That said, after thinking about it and reflecting on other comments, this prediction is fairly conservative and probably accurate.

      As mentioned by others in this discussion, it's particularly close to true today given automated stubbing, code completion, grammar checking, error highlights, automated test tools like Sonar, and the practice of using existing libraries rather than reinventing the wheel.

      While the five years incantation is, as a rule, misleading hand waving, I should have picked up on the u
  • by dfghjk ( 711126 ) on Wednesday April 02, 2025 @02:13PM (#65277005)

    "Scott said the current AI systems have significant memory limitations, making them "awfully transactional," but predicted improvements within the next year."

    Why don't you ask AI to write you some code to solve those "significant memory limitations"? Maybe AI can invent virtual memory!

    I like how we are supposed to believe that AI will have the ability to replace programmers almost entirely while also accepting basic limitations of AI as though AI cannot possibly do what programmers have taken for granted for decades.

  • then maybe 95% of code will be written by AI. But only 5% of code will be usable.
  • We've had 'prompt engineer' and 'vibe coder'. What other glorified titles can we give to people to make them feel useful?
  • Simple question here. If AI is writing code, then how is it executing this code the test it? If I have a smallish repo containing 250k lines of code, how is the AI executing what it has written that works within that codebase? Is it compiling an EXE, and running it within a Windows environment? Is it running PHP code on a LAMP stack? How does it test the query on my database containing 5 million records? How does it verify the HTML and CSS it created behaves correctly in half a dozen major browsers on deskt

    • In the environment I use the AI will write test cases on demand, run them itself if you want, evaluate the results, and implement fixes for any problems it sees. Yes it will generate and test queries against your database all day long. It's like coding on cruise control.

      "behaves correctly in half a dozen major browsers", not there yet, you have to do that part yourself. But it will definitely generate the HTML and CSS and fix it if you don't like it.
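      One plausible shape of what such tooling does for the database case: run a generated test against a disposable in-memory SQLite database, so no external environment is needed. The table and query here are invented for illustration.

      ```python
      # Sketch of an automatically runnable database test: an in-memory
      # SQLite database stands in for the real one, so the query can be
      # exercised without touching a live environment.
      import sqlite3

      def setup_db() -> sqlite3.Connection:
          """Build a throwaway database with known fixture data."""
          conn = sqlite3.connect(":memory:")
          conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
          conn.executemany("INSERT INTO users (name) VALUES (?)",
                           [("alice",), ("bob",)])
          return conn

      def count_users(conn: sqlite3.Connection) -> int:
          """The query under test."""
          return conn.execute("SELECT COUNT(*) FROM users").fetchone()[0]

      conn = setup_db()
      assert count_users(conn) == 2
      print("query verified against test database")
      ```

      Verifying against 5 million production records is a different exercise, but this is the kind of self-contained check an AI tool can generate, run, and iterate on by itself.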

  • *puts on They Live glasses*

    "AI will replace you because it doesn't get paid"

  • Who's going to write the other 5% and how do you plan on training their replacements, once they retire?
  • I remember what it was like after having taught myself C++, to be told by companies that they wanted someone with experience. Even though I could demonstrate working programs I had written, companies weren't even interested in hiring me unless I had a college degree.

    Debugging is a skill taught through practical experience. The fundamental problem with using AI to generate code is that new engineers will never learn how to debug code, or how the machine really works, so when it comes to really difficult

    • That's an interesting perspective (I already commented, so I can't mod up.) BUT, when you're debugging your own code, you know what it's supposed to do. If you're debugging someone else's code, the first challenge is understanding what that code is supposed to do. Presumably you have a head start on that, when you're told, "Here's what is NOT going right."

      Now the problem with AI generated code will be the need to gain understanding of the -intent-. So you'd have to hope to be given the query that produc

  • I know a large company that had a disastrous company-wide meeting and afterward an AI-generated email came out with a summary of the meeting. It was self-contradictory and had some charts about how successful the meeting was. It was like no one bothered to check the email before it went out.

    My experience with AI coding is also bad. It's deceptively clever and wrong. Back in the 1980s people were talking about auto-generated programs that ended up being tossed in the dumpster. AI isn't holding your hand

  • So 5 years of writing code will become 5 years of trying to edit and fix code that does not do what you want. You can already see this in all the companies looking for content editors for their AI-generated content. Because WE DO NOT HAVE ARTIFICIAL INTELLIGENCE. We have weighted randomization. Let's see how this works for you.

  • It's the basic recipe for publicity now:

    state that <difficult task> will be done <high percentage> by AI in <only a few> years, and boom, people/news sites will bite

    the field is so crowded now, and this is one of the easiest ways to get attention
