

95% of Code Will Be AI-Generated Within Five Years, Microsoft CTO Says
Microsoft Chief Technology Officer Kevin Scott has predicted that AI will generate 95% of code within five years. Speaking on the 20VC podcast, Scott said AI would not replace software engineers but transform their role. "It doesn't mean that the AI is doing the software engineering job.... authorship is still going to be human," Scott said.
According to Scott, developers will shift from writing code directly to guiding AI through prompts and instructions. "We go from being an input master (programming languages) to a prompt master (AI orchestrator)," he said. Scott said the current AI systems have significant memory limitations, making them "awfully transactional," but predicted improvements within the next year.
languages need to evolve (Score:4, Insightful)
Re: languages need to evolve (Score:3)
The problem isn't creating code for the bread-and-butter stuff. The problems start when someone figures out that there's a bug in the system and has to find the root cause.
Since language models evolve rapidly, it will also be hard to maintain the code just a few years from now, especially with a self-learning, evolving language model.
Add to that legal requirements that change over time.
AI is a self-amplifying diverging crap generator (Score:2)
The real long term problem is:
AI generates plausible code because it is fed human-written code.
Once more than 50% of the code is AI-generated, _if that ever happens_, well, we are in a self-amplifying, diverging crap generator with no exit or alternative.
School Assignments (Score:3, Interesting)
It's easy for the AI to write 95% of the code if it writes horribly bloated and inefficient code. Remember using a larger font, wider line spacing, etc. to pad out boring essays? That's how they'll easily fulfill a promise like this.
Re: (Score:2)
So, basically "business as usual" for Microsoft products then.
Re:School Assignments (Score:4, Informative)
Most developers just need to crank out tickets. If efficiency becomes a problem, they'll solve those problems from worst to least worst; almost certainly the code AI writes won't be worse than the database concurrency problem or whatever major showstopper they're trying to fix this week. Done > inefficient.
Re:School Assignments (Score:4, Interesting)
It shocks me how people fail to understand that this isn't a problem of technology or programming languages; it's a problem of capitalism and how these organizations are managed. It's why I say "private equity poisons everything," because it does. If pure profit is what drives engineering decisions, just like in every other market sector (wanna order a pizza from Pizza Hut? They don't wanna pay delivery drivers and instead contract it to DoorDash, which inevitably results in a worse experience), you're gonna end up with a worse product.
The only thing ML-driven code generation (I respect myself and others too much to call it AI) adds to that equation is being able to make garbage at scale.
Re:School Assignments (Score:4, Funny)
It shocks me how people fail to understand that this isn't a problem of technology or programming languages; it's a problem of capitalism and how these organizations are managed. It's why I say "private equity poisons everything," because it does. If pure profit is what drives engineering decisions, just like in every other market sector (wanna order a pizza from Pizza Hut? They don't wanna pay delivery drivers and instead contract it to DoorDash, which inevitably results in a worse experience), you're gonna end up with a worse product.
The only thing ML-driven code generation (I respect myself and others too much to call it AI) adds to that equation is being able to make garbage at scale.
That would explain why Microsoft's CTO would be all about it. "Garbage at scale," has been their secret company motto for decades now.
Re: (Score:2)
Unfortunately garbage at scale is all that's needed in most cases
Re: (Score:2)
Since it usually takes as much time to review code properly as it does to write it, what are we gaining here? I certainly wouldn't trust any AI currently available to write code without checking it carefully.
programmers - tomorrows farrier (Score:4, Insightful)
As code generation is automated, entire systems will become disposable when new feature sets are required.
Re:programmers - tomorrows farrier (Score:5, Interesting)
begrudgingly agree. I think the real and unaccounted for cost will be future maintenance. When no one is left who understands the art or logic of programming, maintenance and security will be relegated to an afterthought ~ like it is now for IoT garbage. As code generation is automated, entire systems will become disposable when new feature sets are required.
Consider the perspective of the companies behind this push. If every new library update, every new feature update, every new security update, has to be a ground-up rebuild because there is no one and no "coding system" that actually has a fundamental understanding of the code base, that's a lot of machine hours hired out. They're building in obsolescence at a rate and scale that has only been dreamt of up to now, and it's all designed to make sure that everybody has to pay them constantly for every update forever and ever.
Re:programmers - tomorrows farrier (Score:4, Interesting)
If AI-generated crap and SaaS/PaaS price gouging go too far, I can bring programmers back into the organization.
Re:programmers - tomorrows farrier (Score:5, Insightful)
Interesting thought experiment. I hadn't taken it that far. At some point, costs will get so egregious that customers, both corporate and home, will simply tap out. Like many are doing with VMware. I'm ripping it out of my corporate environment, along with every other Broadcom product we have. VMware was the casualty of price gouging; CarbonBlack was the blowback from my anger at Broadcom. If AI-generated crap and SaaS/PaaS price gouging go too far, I can bring programmers back into the organization.
I think the AI prophets are counting on holding the cards for just long enough that real programmers will either switch careers or retire before we hit that point. I hope they're wrong. Sure is fun living in the end-run of Greed as God.
Given the quality of MS's code as of late (Score:5, Funny)
I merely assumed they'd been using AI-generated code for some years now.
(/low hanging fruit joke)
Re: (Score:2)
This.
Re: (Score:2)
That long and expensive developer program, decades of conferences, petabytes of example code, gruesome backwards compatibility, all dashed against the rocks of AI.
It's almost side-splittingly funny. OMG. Seasoned Microsoft devs, lemmings led to the cliff, following "progress" to the end while its leaders feign that AI won't cast them aside. No unemployment insurance, because most were contractors; lots of prime real estate to sell now that no one will be in an office.
It's a gleeful day for those AI investments.
How much of that is bloat? (Score:5, Insightful)
Nothing is more expensive than a cheap programmer.
But to the point, this metric is not as meaningful as I think the original author intended.
Re: (Score:2)
Yep, I was looking at some code the other day that formatted a date a little unusually (it was being used as part of a file name, so it was creating a string of "yyyy-MM-dd_hh-mm-ss"), and there was a whole 20+ line custom function, used exactly once in the project, to extract the date and time bits, pad them, and assemble this string.
Replaced it with one line of code using a custom date format string with a date-time library the project was already using (luxon).
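As a sketch of the collapse described above (the original 20+ line helper is reconstructed from the description, not the actual code): even without a library, padStart gets it down to a few lines, and with luxon it becomes roughly `DateTime.fromJSDate(d).toFormat("yyyy-MM-dd_HH-mm-ss")`.

```javascript
// Hand-rolled version of the "yyyy-MM-dd_hh-mm-ss" file-name stamp,
// reconstructed from the description above; padStart does the zero-padding
// the original 20+ line helper apparently did by hand.
function timestampSlug(d = new Date()) {
  const p = (n) => String(n).padStart(2, "0");
  return `${d.getFullYear()}-${p(d.getMonth() + 1)}-${p(d.getDate())}` +
         `_${p(d.getHours())}-${p(d.getMinutes())}-${p(d.getSeconds())}`;
}
```

Either way, the point stands: a format-string API makes the whole custom function disappear.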
Nothing is more expensive than a cheap programmer.
Truly. And nothing is a cheaper programmer than
On the other hand... (Score:5, Insightful)
Oh joy.
Re: (Score:2)
If you like fixing other people's shit with no way to scream at them for their incompetence, this is the future for you!
Re: (Score:2)
If you like fixing other people's shit with no way to scream at them for their incompetence, this is the future for you!
Well this is from the Microsoft CTO. Seems par for the course.
Re: (Score:2)
If you like fixing other people's shit with no way to scream at them for their incompetence, this is the future for you!
AI uses more code comments than nearly every human developer I've ever worked with.
I use Amazon Q to document human written code and it does a remarkably good job of it.
Re: (Score:3)
5% of the human programmers' time will be spent writing the code the AI can't write, and the other 95% will be spent debugging the code the AI did write. Oh joy.
Look on the bright side, meatsack. You might be one of the lucky ones who can sustain some kind of justification for your human employment long enough to survive the UBI starvation wars.
You know, that we-know-it's-coming transitional timeframe before Greed N. Corruption was forced into tribunal to account for a billion dead from starvation driven by AI mass-migration events and the corrupt lobbying against UBI taxes.
Re: (Score:2)
5% of the human programmers' time will be spent writing the code the AI can't write, and the other 95% will be spent debugging the code the AI did write. Oh joy.
Look on the bright side, meatsack. You might be one of the lucky ones who can sustain some kind of justification for your human employment long enough to survive the UBI starvation wars.
You know, that we-know-it's-coming transitional timeframe before Greed N. Corruption was forced into tribunal to account for a billion dead from starvation driven by AI mass-migration events and the corrupt lobbying against UBI taxes.
You have more faith in humanity than I do. The corporations almost outright own the government today. I don't see a time when forced tribunal would come for Greed. I see a time when their ultimate comeuppance will be a slow dawning realization that they have no one left to fleece, as the technological world collapses around them because they're the only humans left, and it turns out money is not actually knowledge.
Re: (Score:3)
When these times come, companies will find themselves without any customers. Even government contracting relies on taxpayers existing and not unchecked inflation. The very concept of companies themselves will become obsolete and we'll end up with a dystopian form of Communism 2.0 where the means of production can only be afforded by the government and there are no taxes nor UBI because currency begins to fail as a concept.
Re:On the other hand... (Score:4, Interesting)
I know some people who work in medical transcription. It's exactly like you described. The pay has dropped off a cliff because companies have been sold on the idea that this will save them money. But it actually takes longer to find and fix mistakes than it does to do it from scratch. Kind of the same issue we have with "full" self-driving.
What percentage of errors will be generated? (Score:3)
We'll have errors in the AI-generated code (although that rate will hopefully decline, if we can figure out how to teach AI 'good code' from 'bad code'), and in particular, a lot of the coding errors (e.g. those that result in security vulnerabilities) will probably go down. But in the short run, I'd expect human-generated errors to go up, until we get better at writing specs for the code to be produced by the AI. And design errors will continue. The work done by Nancy Leveson and John Knight on n-version programming for safety-critical systems (which showed that independent groups of designers tend to make the same mistakes) is still relevant. http://sunnyday.mit.edu/papers... [mit.edu]
What I'd -hope- would happen is that we get a lot better at software specification: both more rigorous requirements and more rigorous design/code. The potential effort savings from machine-generated code SHOULD BE used to get better designs, and in particular designs that can be shown correct by more than just "enough testing."
But the pessimist in me says that "AI generating code will reduce the effort curve to talk to the AI, and the result will be more really poorly -designed- software, even if the generated code does what it was asked to do."
Public domain (Score:5, Interesting)
Since all of that code is uncopyrightable, I guess that achieves what the anti-copyright extremists want.
Re:Public domain (Score:4, Interesting)
...that code is uncopyrightable
He was clever to say "authorship is still going to be human"... even though we expect AI to write all of it...
If AI can really write "nearly all" of Microsoft's code... wouldn't that capability put Microsoft out of business? (e.g. hey, AI, write me an operating system, business software, etc., from scratch! and make it backwards compatible with everything I have... on second thought, rewrite all of my legacy stuff too!... on third thought, forget me using software at all; you go and generate revenue for me, I don't care how...).
With the AI "future" they're pushing, what would be Microsoft's competitive advantage over, say, anyone else?
Re: (Score:2)
No, because the AI will be owned by Microsoft and you will pay to use it.
And in five years and six months (Score:5, Insightful)
Companies start going out of business, or reverting to older code bases, because the "AI"-generated crap is completely unmaintainable.
Re: (Score:2)
...because the "AI"-generated crap is completely unmaintainable.
Or equally (or more) likely, courts rule that all the "AI" companies are massive copyright infringers (which they are, and have admitted to being), and the entire "AI" bubble experiences a nuclear implosion. Then all the "AI"-generated code is replaced in a big scramble before the copyright infringement lawsuits hit their customers' front doors. The need for experienced programmers hits a high not seen since Y2K.
Re: (Score:2)
I don't think that is likely at all. All the big-tech elites are producing AI models. Collectively they hold a tremendous amount of political power (they should not, of course, but wealth = political power and they've got it). Even if they do somehow get rulings against them, they have the power to change the law to allow themselves an exception. Or some petty "feel good" compromise where they toss a few pennies in the direction of a few middle-sized businesses that own a lot of the copyrighted code, an
Humans will cease to matter in under 10 years. (Score:4, Interesting)
If code engineering, ground-up, is able to be done by machine, and automation is essentially already on the brink of replacing most types of work, what's the use of those pesky humans? There's no need to employ them. I mean, they ask for things employers don't want to give them, like time off, a living wage, insurance (since we have to keep that tied to a job), benefits, and they don't give you anything you can't get from the machines. With no one employed save a few C-Suites and board members to keep the machinations machinating, the majority will be out of work, and out of work = irrelevant. Non-value to business = non-value to community = non-value to society. And things that aren't valuable have no purpose to exist.
So, since the C-suites are day-dream fapping about how automation will replace us all, what is their plan for how to deal with all the excess bodies? We know that our governments won't do shit about it. They're beholden to the business sector, not to us. So where do we go? Do we just die in the streets once we lose our homes? What's the end-game in this fantasy scenario that we're having to watch be publicly discussed constantly without any discussion at all about the long-term repercussions? Do we just rush headlong toward it and pretend everything's fine until we run off the cliff like a bunch of lemmings? Or do we start discussing what's coming and trying to sort out a solution other than, "Protect the rich, let everybody else die?" Just curious if we even care at this point, or if it's so important to protect the business class that even this daydream of complete dominance over humanity must be sustained at all costs.
Re: (Score:2)
They'll need lots of backhoe drivers.
At least for a little while.
Re: (Score:2)
They'll need lots of backhoe drivers. At least for a little while.
Have you seen a recent construction job by a high-tech construction company? I watched a great big backhoe run itself this summer for about an hour before I got bored with it. The dude pulled it into the site, set up a couple laser guides, popped in a programmed foundation guide, and turned it loose. He had a killswitch on his belt to stop it if somebody wandered into the area. I'd imagine in another few years, there'll be even less need for a human to be around.
Re: (Score:2)
And the foundation is the hard part. I saw a hotel being built near me and it was just prefab wall modules being dropped into place by crane - insulation and framing and all completely finished.
Re: (Score:2)
Humans mattered?
There was a time, somewhere around the industrial age, where they started to matter less. We're just entering the age where we don't even pay lip-service to the thought that humans are a part of the equation at all. Up to the last few years, businesses and governments still sometimes made rumblings that individuals mattered. Not so much anymore.
Re: Humans will cease to matter in under 10 years (Score:2)
Not that humans of the Western Civilized proclivity have not long possessed an outsized sense of entitlement. We now feel a need to save the whales, but who can save the humans from themselves?
No one hired a maid since the Roomba came out! (Score:2)
If code engineering, ground-up, is able to be done by machine, and automation is essentially already on the brink of replacing most types of work, what's the use of those pesky humans?
You're assuming this stuff writes "good enough" code. I can tell you... it doesn't. It's like a Roomba: not useless, but expensive and fun to play with. But when a homeowner buys a Roomba, do they fire their maid? Do they throw away their brooms?
Will it someday? Eh...I am skeptical...unless they have totally new algorithms for AI. Generative AI is just fancy pattern matching. Logically, it can take known patterns and reapply them to similar situations. However, most programmers are hired to write ne
Re: (Score:2)
If code engineering, ground-up, is able to be done by machine, and automation is essentially already on the brink of replacing most types of work, what's the use of those pesky humans?
You're assuming this stuff writes "good enough" code. I can tell you... it doesn't. It's like a Roomba: not useless, but expensive and fun to play with. But when a homeowner buys a Roomba, do they fire their maid? Do they throw away their brooms?
Will it someday? Eh... I am skeptical, unless they have totally new algorithms for AI. Generative AI is just fancy pattern matching. Logically, it can take known patterns and reapply them to similar situations. However, most programmers are hired to write new code and solve novel problems. If the problem was solved in the past, no one would hire you. If generative AI actually worked (which most of the time it doesn't), it could save time, but not eliminate a skilled programmer. At the very least, you need a human being to sign off that you didn't just expose your employer to a massive lawsuit by writing insecure code.
OK, so you don't believe me? Fair. I would pose this question. The promise of generative AI is to generate near-infinite wealth from electricity and time. If you had a magic coding machine, you could basically print money and be the wealthiest company in the world. Given the trillions spent already and the massive motivation to make unfathomable profits, why are these companies selling mere toolkits? Why not sell finished goods? For example, MS, Google, and Amazon have massive cloud hosting environments. Why not sell a service that has generative AI fix your code for you, at a HIGH monthly fee? Businesses would trample over each other for the opportunity to pay any cost for such a service. Why not? Because you could objectively determine whether this shit works. Similarly, why isn't there a generative-AI CLR from Microsoft that takes shitty slow C# code written by someone who sucks at coding and converts it to the leanest C or assembly imaginable? Why even bother with a runtime environment?
How about a product that replaces the CLR, JVM, or Python runtime and outputs C/Rust/assembly/whatever: one that takes your sloppy prototype code, optimizes it, and converts it to the leanest executable imaginable? Why not? Because we'd figure out INSTANTLY that this shit doesn't work. OK, performance and security isn't your thing? Why doesn't MS write a generative-AI level generator for your favorite games? Infinite content for a monthly fee. Why not? Because you'd figure out instantly that it doesn't work.
I don't disagree with your assessment of current gen AI generated code. I just enjoy playing with the theories the AI prophets are pushing right now. The what ifs are fascinating to me. It reveals the emptiness in all the promises, while also revealing the ultimate psychopathy behind the machines (companies) in charge of the machines they're pushing.
There's a fundamental problem with that statement (Score:5, Interesting)
Re: (Score:2)
LLMs have very low "intelligence" and have no ability to truly create.
Ya, bullshit.
All they can do is write code based on an amalgamation of github, stackoverflow etc.
Ya, bullshit.
If the programming languages and software requirements never change then, yes, maybe we could keep churning out code from the same training data. But everyone knows that software is ever changing, which means we will still need an army of developers to write the new programs that can feed the training data for the LLMs.
Given they have no capacity to continue learning, it is true that their concept of the state of the art is frozen in time. This is definitely their greatest shortcoming right now.
If he's suggesting that we can have AGI in the next five years, well that'll never happen with LLMs which essentially fit curves to datapoints.
This tired argument.
Yes, neural networks fit curves to data points. Biological, mathematical, all of them. They're universal function approximators.
Where the fun starts is when you add state (memory) and recurrence.
In LLMs, this is accomplished with the context window, and feeding it back into the LLM a
Re: (Score:2)
Where the fun starts is when you add state (memory) and recurrence. In LLMs, this is accomplished with the context window, and feeding it back into the LLM again for every generated token. This means no, the combined system is far more than something that merely "fits a curve to the data points." It's capable of actual computation.
You can test this by asking it to count whether parentheses are balanced. If it can't do that, then it's not using the context window effectively.
For Turing machine level of computation, it will need something like scratch memory where it can go back and modify things it has written in memory.
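For anyone who wants to try the parent's probe by hand, the bookkeeping being asked of the model is tiny; for a single bracket type, a depth counter is all it takes:

```javascript
// Balance check for one bracket type. This is the computation the parent
// proposes as a probe of whether an LLM uses its context window
// effectively: track depth, fail on an early close, require depth 0 at end.
function isBalanced(s) {
  let depth = 0;
  for (const ch of s) {
    if (ch === "(") depth++;
    else if (ch === ")" && --depth < 0) return false; // closed too early
  }
  return depth === 0; // every open must be matched
}
```

A model that tracks this reliably across a long string is doing more than local pattern matching, which is the point of the test.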
Re: (Score:2)
You can test this by asking it to count whether parentheses are balanced. If it can't do that, then it's not using the context window effectively.
It can.
I've posted demonstrations here before.
Of course, "it" means a suitably powerful one.
For my tests, I used QwQ-27B-fp16. Not even the smartest. In its reasoning tokens, it created a stack and balanced it.
For Turing machine level of computation, it will need something like scratch memory where it can go back and modify things it has written in memory.
Or an ability for the tape (context) to overwrite past context; i.e., the LLM can simulate a memory in the context that's refreshed every response.
There's a paper somewhere demonstrating it on even a much older LLM.
This does mean that in the normal way that you interact with it, it's
Re: (Score:2)
This does mean that in the normal way that you interact with it, it's not Turing Complete always, but it can be effectively Turing Complete at times, or always with the right system prompt.
What does that mean? What are the limitations?
Re: (Score:3)
All they can do is write code based on an amalgamation of github, stackoverflow etc
We're all "standing on the shoulders of giants" to a degree, and humans do the same. We may be far from AGI, but very few people are using GI when writing code either, even when doing complex things. Most new patents these days are just X+Y combinations of things that already exist.
AI is sputtering we must push harder. (Score:5, Interesting)
"Ditch those pesky, competent software engineers! Embrace the glorious chaos of our Hallucinogenic AI LLMs! Why write maintainable code when you can generate digital spaghetti that even a seasoned spaghetti monster would find perplexing?
Imagine: Your Senior Managers, those glorious relics of a bygone coding era (circa 2003, when Java was really cool), churning out entire systems in an afternoon! Based on the finest, most confidently incorrect answers from StackOverflow, naturally. Think of the 'learning opportunities' when you have to pay us exorbitant fees to untangle the mess!
Our LLMs are trained to respond with the same delightful, condescending snark as your favorite StackOverflow contributors. Feel right at home being told you're doing it wrong, even by a machine! It's like a warm, passive-aggressive blanket of familiarity.
Don't understand the gibberish our AI spits out? Perfect! That's job security... for us! We'll happily dispatch a team of 'experts' from a location so offshore that they're practically swimming with the fishes, to 'fix' things at a mere $500 an hour. (Plus travel, lodging, and emotional support for our team after dealing with your code.)
This isn't just coding; it's CASE 4.0: The Revenge of the Unmaintainable! Remember the CASE hype? We've taken that level of over-promising and under-delivering and injected it with pure, unadulterated AI-powered madness.
So, throw your sanity out the window and join the revolution! Your shareholders will love the burn rate! It's not a bug, it's a feature... a very expensive, very confusing feature!
Microsoft announcing their own demise (Score:3)
Windows and Office are so second millennium.
If 95% of the code is generated by AI, they can lay off 95% of their programmers.
And who needs an office suite if AI is savvy enough to hallucinate a document that no human will ever need to read, because we now have AIs to process the paperwork of all those silly reports humans used to produce in Word, PowerPoint and Excel?
So what..? (Score:1)
I will believe it (Score:4, Funny)
probably already is at Microsoft (Score:3)
Code in chunks (Score:3)
If the software engineers have a good specification that breaks down into the smallest possible chunks of logic, then they can prompt each chunk individually and understandably until they get code of sufficient quality.
And prompting can be used to create that specification, again chunk by chunk.
Eventually, there will be systems that can do that whole process.
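The chunk-by-chunk loop described above can be sketched in a few lines; `generate` here is a stand-in for whatever model call you use, not a real API:

```javascript
// Hypothetical sketch of the chunked workflow. generate() stands in for
// an LLM call; all it has to do is map a prompt string to a code string.
function buildFromSpec(chunks, generate) {
  return chunks
    .map((chunk) =>
      // One small prompt per spec chunk keeps each output reviewable
      // (and testable) on its own.
      generate(`Implement exactly this piece of the spec:\n${chunk}`))
    .join("\n\n");
}
```

The same shape works for the specification itself: feed it chunks of requirements instead of chunks of design.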
The bad part will be that, eventually, few will understand the chunks let alone the overall specifications, or the software that makes them, as there won't be many career paths to train people to that level.
We're going in a bad direction in the long term, and market forces are going to fuck us over worse than ever.
Re: (Score:2)
In fact, they're quite good at pointing out the broken logic of overconfident and poorly educated humans.
Re: (Score:2)
I've seen reasoning LLMs make simple programs, like snake, but never anything big. I think we're a long way off from getting anything good by saying, "Make a specification for a system that makes social security payments".
Re: (Score:2)
But if you were to ask an engineer that exact question, how worried would you be about them cutting you in half with mutant cyclops powers?
You've outlined a critical problem with LLMs here, actually, in that the LLM will try to satisfy that request, instead of what an engineer would do, which is start plotting your violent removal from the gene pool.
It already is. Actually 99% (Score:5, Insightful)
I'm old enough that I have written programs in machine language. Also assembly. Not much call for that anymore. Now, you say "new ArrayList" or whatever. Do you know how much machine code that generates, and relies on out of libraries also written in a high-level language?
Tools advance. Maybe we are on the threshold of the next major shift. There will always be people writing Java, C#, etc, just as a few people still write assembly. But for the masses? Newer, better tools that lead to higher productivity.
This kind of disruptive shift has happened many times. I know a guy who started out as a typesetter, putting metal letters in a frame. Then he moved on to new tools in the printing trade. Now professional printers barely exist; he became a gardener.
Reminds me how... (Score:3)
It is that 5% which counts... (Score:2)
Even if AI wrote 99% of the code, that is all well and good... but there is always that 1% which will be a show stopper.
This isn't new. People have been copying and pasting stuff from Stack Overflow for years. I've used AI for code, and for something mainstream it can work okay; but have anything that isn't 100% mainstream, and that is where things break down, as the AI can confidently generate garbage or try to invoke methods that don't even exist.
Sometimes AI can happily write a page of code... and d
Re: (Score:3)
If companies think that they can replace their entry level programmers with bots they'll presumably do so; but if there are basically no entry level programming positions to be had it's unclear who will be gaining experience to become the more se
Odd question about percentages... (Score:2)
I wonder, was there a point at which we hit "95% of all code is libraries"? Because... it's true in almost every tech stack I've ever touched.
And it doesn't get viewed (as much) in the same virulently negative light.
After all, if we didn't have a tool like Tomcat, every single web project would be required to hire a *whole* lot more devs.
Much rejoicing (Score:2)
In the NSA, FSB and among cyber criminals. Insurance companies beware...
Or what? (Score:2)
95% of CXOs (Score:2)
Bad developers will still create bad code (via AI) (Score:2)
because they'll be bad at AI prompts and at checking the AI-generated code.
Good developers will create code better than AI because they understand the user requirements better than can be expressed in AI prompts. Plus they can see much further into the future and will write maintainable code that AI can't.
Not Surprising. (Score:2)
"95% of code within five years" (Score:3)
This is an entirely accurate statement. We are already hearing that AI writes a significant amount of the code at Google and tech companies are laying off developers in droves. I do all my development work with AI these days, it is a huge productivity boost. People who criticize it here are obviously not using it.
You can start off simple; 'examine this codebase and explain it to me'. You'll get loads of details about it without having to plow through it all yourself. You can say 'document this code with comments and a README' and you will get that in a few minutes. A huge savings of time and drudgery. You can say 'refactor this large file into multiple modules that can be independently tested and reused. Write test cases for them'. You will get that in under 10 minutes. Anything you don't like you can simply reject and revert back to the original.
Then you can experiment. Create a branch and try some things out. 'As this code processes data, accumulate statistics about the performance of these specific modules, store that data in a local database and be able to generate reports that show these key metrics'. No problem, all of that will be produced for you and it will even make a website that shows the reports if you want.
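As a sketch of what such a prompt might produce: per-module timings accumulated into a local SQLite database, with a simple report of key metrics. The module names, schema, and file name here are purely illustrative, not anything a real tool guarantees to emit.

```python
import sqlite3
import time
from contextlib import contextmanager

DB_PATH = "metrics.db"  # hypothetical local database file

def init_db(path=DB_PATH):
    """Create the stats table if it does not exist yet."""
    conn = sqlite3.connect(path)
    conn.execute("""CREATE TABLE IF NOT EXISTS module_stats (
        module TEXT, elapsed_ms REAL, recorded_at REAL)""")
    return conn

@contextmanager
def timed(conn, module):
    """Record how long a block of code spent inside the named module."""
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed_ms = (time.perf_counter() - start) * 1000
        conn.execute("INSERT INTO module_stats VALUES (?, ?, ?)",
                     (module, elapsed_ms, time.time()))
        conn.commit()

def report(conn):
    """Key metrics per module: call count and average elapsed time."""
    return conn.execute("""SELECT module, COUNT(*), AVG(elapsed_ms)
                           FROM module_stats GROUP BY module""").fetchall()
```

Usage is a one-liner around any hot path: `with timed(conn, "parser"): parse(data)`. The point is less the code itself than that a prompt like the one above reliably yields something of this shape.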
Re: (Score:2)
You'll never find a programming language that frees you from the burden of clarifying your ideas.
-Randall Munroe
https://xkcd.com/568/ [xkcd.com]
Re: (Score:2)
What's great is that you can start off with ideas that are pretty vague. If you don't like the results you can give the AI more specific instructions. And if you aren't willing to use AI to do any of that you will be out of a job pretty soon.
It surely will (Score:3, Interesting)
Why Not? It’s Already Working Great for Small Projects
Example 1: Annoyed by Amazon Prime Video showing you all those “buy or rent” videos? Just go to an AI of your choice, give it a simple prompt like “I want a Chrome extension that only shows me videos with a Prime badge,” and provide the HTML source of one of those pages. Bam—never see those unwanted videos again.
Example 2: Tired of seeing those annoying mobile game ads on Netflix? Do the same as above—give it Netflix’s HTML, and let AI filter them out for you.
Example 3: For larger projects using Cursor, I just highlight the code where I want to add a parameter, condition, or whatever, describe the change I want, review the AI’s suggestion, and accept or tweak it as needed.
Example 4: For new projects, you can guide AI step by step:
1. Provide the planned software documentation and ask it to outline a code structure (refine as needed).
2. Generate unit and integration tests automatically (or outsource them).
3. Let AI write the code to pass those tests (or outsource it).
This already works today—just try Claude Code or similar tools.
Sure, the real thinking and specification still need to be done, but AI speeds up the process. I don’t mind if it takes a day for tests and a week for implementation as long as it works—and it’s only getting better, cheaper, and faster.
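Steps 2 and 3 above can be illustrated in miniature: the test is written first as the specification, then just enough code is written (by a human or an AI) to pass it. The `slugify` function and its spec are invented for illustration.

```python
import re

# Step 2: the test exists before any implementation does.
def test_slugify():
    assert slugify("Hello, World!") == "hello-world"
    assert slugify("  spaces  everywhere ") == "spaces-everywhere"

# Step 3: the implementation is written to make the test pass.
def slugify(title: str) -> str:
    """Lowercase, strip punctuation, join words with hyphens."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    return "-".join(words)

test_slugify()
```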
The key now is having large enough context windows to handle specifications, coding style guides, to-do lists, and 10–50 source files at once. No need to dump the whole Linux kernel in there.
Next, we'll see setups where a smaller, possibly local, model is trained on your specific codebase. It will remember your preferences, decision-making, and prompts, optimizing how it interacts with the larger remote model while working through its own to-do lists.
How AI Companies Expect SW Dev in the Future (Score:3)
AI Prompt 2: No, do it like this
AI Prompt 3: No, I need it to do this other thing as well
If this ever pans out, you'll probably spend most of your time writing AI commands rather than programming commands. I don't know that development will be any faster, but it does seem a lot more annoying.
Code or programs (Score:2)
Are we just writing code, or are we writing programs?
So much of what we do is just reimplementing what's already been done, without innovation. Let's skip all that and do the fun stuff: let the AI do the boilerplate and heavy lifting, and let's do what we need to make it better.
Gotta pay to see 'em. (Score:2)
Prompt is code. (Score:2)
Prompt is code. It's like Python but a more retarded way to talk to computers. I mean python was already for retards, this is taking it to another level. You're telling a computer what you want it to do. LLM is like the first stage of a compiler. No different from a high level programming language. Maybe even higher level language, like cocaine high.
COBOL (Score:2)
Wasn't that what COBOL was supposed to do?
Statements in that programming language sounded like the utterances of a PHB, ergo a PHB could code in COBOL without having programmers around?
Human level intelligence (Score:2)
All Truck Driving Jobs Will Be Replaced By 2021 (Score:4, Insightful)
These soundbites work because "within five years" is corporate speak for "this makes me sound smart and I'm banking on no one being able to fact-check this." The press loves it. Makes great clickbait. No need to consult an opposing position.
My observation on this is that AI replaced A1 some time ago, making every day artificial fool's day for those who want to believe.
Re: coding isn't truck driving (Score:2)
As mentioned by others in this discussion, it's particularly close to true today given automated stubbing, code completion, grammar checking, error highlights, automated test tools like Sonar, and the practice of using existing libraries rather than reinventing the wheel.
While the five years incantation is, as a rule, misleading hand waving, I should have picked up on the u
significant memory limitations (Score:3)
"Scott said the current AI systems have significant memory limitations, making them "awfully transactional," but predicted improvements within the next year."
Why don't you ask AI to write you some code to solve those "significant memory limitations"? Maybe AI can invent virtual memory!
I like how we are supposed to believe that AI will have the ability to replace programmers almost entirely while also accepting basic limitations of AI as though AI cannot possibly do what programmers have taken for granted for decades.
If amount of code written increases 20-fold (Score:2)
Bullshit job titles (Score:2)
How does it execute the code? (Score:2)
Simple question here. If AI is writing code, then how is it executing this code to test it? If I have a smallish repo containing 250k lines of code, how is the AI executing what it has written so that it works within that codebase? Is it compiling an EXE and running it within a Windows environment? Is it running PHP code on a LAMP stack? How does it test the query on my database containing 5 million records? How does it verify the HTML and CSS it created behaves correctly in half a dozen major browsers on desktop?
Re: (Score:2)
In the environment I use the AI will write test cases on demand, run them itself if you want, evaluate the results, and implement fixes for any problems it sees. Yes it will generate and test queries against your database all day long. It's like coding on cruise control.
"behaves correctly in half a dozen major browsers", not there yet, you have to do that part yourself. But it will definitely generate the HTML and CSS and fix it if you don't like it.
"AI will replace you because it's faster/better" (Score:2)
*puts on They Live glasses*
"AI will replace you because it doesn't get paid"
But... (Score:2)
Learning to debug... (Score:2)
I remember what it was like after having taught myself C++, to be told by companies that they wanted someone with experience. Even though I could demonstrate working programs I had written, companies weren't even interested in hiring me unless I had a college degree.
Debugging is a skill taught through practical experience. The fundamental problem with using AI to generate code is that new engineers will never learn how to debug code, or how the machine really works, so when it comes to really difficult bugs, they won't know where to start.
Re: (Score:2)
That's an interesting perspective (I already commented, so I can't mod up.) BUT, when you're debugging your own code, you know what it's supposed to do. If you're debugging someone else's code, the first challenge is understanding what that code is supposed to do. Presumably you have a head start on that, when you're told, "Here's what is NOT going right."
Now the problem with AI-generated code will be the need to gain understanding of the -intent-. So you'd have to hope to be given the query that produced it.
Unsupervised AI is a quiet mess (Score:2)
I know a large company that had a disastrous company-wide meeting and afterward an AI-generated email came out with a summary of the meeting. It was self-contradictory and had some charts about how successful the meeting was. It was like no one bothered to check the email before it went out.
My experience with AI coding is also bad. It's deceptively clever and wrong. Back in the 1980s people were talking about auto-generated programs that ended up being tossed in the dumpster. AI isn't holding your hand.
90% (Score:2)
So 5 years of writing code will become 5 years of trying to edit and fix code that does not do what you want. You can already see this in all the companies looking for content editors for their AI-generated content. Because WE DO NOT HAVE ARTIFICIAL INTELLIGENCE. We have weighted randomization. Let's see how this works for you.
basic recipe for publicity (Score:2)
It's the basic recipe for publicity now:
state that <difficult task> will be done <high percentage> by AI in <only a few> years, and boom, people/news sites will bite
the field is so crowded now, and this is one of the easiest ways to get attention
Re: (Score:2)
No framework maker seems to strive for DRY, KISS, and parsimony these days; instead they use "scaffolding" to automate bloat.
I spent time complaining that programmers don't write elegant code, then I realized most of us can't even recognize elegant code, which is the root of the problem.