Programming AI

'No Longer Think You Should Learn To Code,' Says CEO of AI Coding Startup (x.com) 105

Learning to code has become sort of pointless as AI increasingly dominates programming tasks, said Replit founder and chief executive Amjad Masad. "I no longer think you should learn to code," Masad wrote on X.

The statement comes as major tech executives report significant AI inroads into software development. Google CEO Sundar Pichai recently revealed that 25% of new code at the tech giant is AI-generated, though still reviewed by engineers. Furthermore, Anthropic CEO Dario Amodei predicted AI could generate up to 90% of all code within six months.

Masad called this shift a "bittersweet realization" after spending years popularizing coding through open-source work, Codecademy, and Replit -- a platform that now uses AI to help users build apps and websites. Instead of syntax-focused programming skills, Masad recommends learning "how to think, how to break down problems... how to communicate clearly, with humans and with machines."


Comments Filter:
  • Instead of syntax-focused programming skills, Masad recommends learning "how to think, how to break down problems... how to communicate clearly, with humans and with machines."

    If you were ever in doubt about the quality of these "learn to code" initiatives....

  • AI is great, AI can help you program, but you don't want someone who doesn't know how to program using AI to generate programs. Note: a webpage is not a program
    • Re:Doubt it (Score:5, Interesting)

      by DamnOregonian ( 963763 ) on Monday March 31, 2025 @12:51PM (#65271931)
      At this juncture, as someone with ~30 years of programming experience, 20 of them professionally, I agree.
      AI can be a fucking wild helper. It can even pump out little utility programs for you like they're gumballs from a gumball machine... but when doing complex work within a code base... you really do need to keep an eye on that fucker.

      With the improvement I've seen in the last 2 years, I'm not convinced that will remain true, though.
      • ...when doing complex work within a code base... you really do need to keep an eye on that fucker... With the improvement I've seen in the last 2 years, I'm not convinced that will remain true, though.

        I've done a laughably small amount of programming. But in what I DID do, the actual coding was minor compared with understanding the business logic, defining and obtaining the desired behaviour, programming with built-in flexibility for growth and modification, and documenting what I had done to make it easier for others to understand and maintain. Does AI currently come anywhere close to doing those things? Will it ever get there, and if so, how soon?

        To be clear, I'm not picking an argument. I have almos

        • The answer is no, and anybody who incorporates AI-generated code into an existing human-programmed codebase is just looking to justify their job's existence by fixing bug tickets.

        • As for:

          the actual coding was minor compared with understanding the business logic, defining and obtaining the desired behaviour

          No, you really do need to be the logic source. You can't let it come up with its own ways to do things, because its goals are simply not going to be automatically aligned with yours.
          and

          programming with built-in flexibility for growth and modification, and documenting what I had done to make it easier for others to understand and maintain.

          This it absolutely excels at.
          LLM documentation tends to be pretty damn great, and it tends to account for things it thinks might be needed in the future, documenting both what those are and how to do them, in the provided documentation as well as in the code comments. It really is good at this task.

          Will it ever get there, and if so, how soon?

          For part 1, I don

          • An AI that "grows up" with a kid eventually helps the kid with goals in the future. That kind of AI is coming and will have the ubiquitous knowledge for your personal goals.

            • Well- on a long enough time line- ya, I guess I agree.
              But we are a long way away from that, simply due to resource constraints in LLM learning.

              However, if some paradigm shift happens in LLM learning, then that would of course change that perspective... but the standard march of compute horsepower isn't bringing that future to us rapidly at all.
            • by Targon ( 17348 )

              You have missed one of the very key parts that make knowledge of HOW to code all the more important in this day and age. The ability to break down a problem into procedures and functions is something that helps students understand how to break down ANY problem into elements that are small enough to be managed. Suggesting that people stop needing to think because AI can "do more" for someone over time just leads to an increase of clueless people in the world.

              • I get the impression that what you said is out of style: generalized problem solving is out of favor and algorithmic method-following is in. You become disconnected from the details using AI. As fewer people can think rigorously for themselves, you get groupthink, like MBAs. The use of a method tends to give better results for more people, because the expertise is baked in. It's very hard to counter that argument when you save money for the company.
              • Maybe. Depends upon how integrated the AI is with the person. A relationship could develop like a math coprocessor or an executive assistant, either of which offload work and free up capacity for other types of thinking. I do not think "idiots galore" is a foregone conclusion.

        • by allo ( 1728082 )

          Currently not. But it would be naive to rely on AI stagnating at the same level.

      • by e3m4n ( 947977 )

        It's automation all over again. Replace 100 workers. But who fixes the machines? Great; rehire 3 employees. Still a net loss of 97.

      • by tsm_sf ( 545316 )

        > With the improvement I've seen in the last 2 years, I'm not convinced that will remain true, though.

        AI will always be limited by the user's ability to communicate.

        I'm looking forward to my smart agents indexing reference sites to technologies I don't know I'll be interested in yet.

    • Exactly. I use CAM software to generate gcode, but it's still important to be able to read and write gcode to understand the CAM output and make tweaks.
    • by allo ( 1728082 )

      AI models are getting better and AI systems are as well. Look at the trends with agents. The first pass will be writing code, possibly badly (or at least worse than humans). Then another agent will point out the flaws. The first agent, or maybe even a third agent (they may, for example, use models with different computational costs), then rewrites the code.

      The crucial point will be feedback. The AIs need to get humans into the loop to have feedback about whether the product is as expected. That does not mean quali
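The draft-review-rewrite loop this comment describes can be sketched with stub functions standing in for the models. Everything here is an illustrative placeholder (the function names, the single "flaw" the reviewer detects), not a real agent framework's API:

```python
from typing import Optional

def write_code(task: str, feedback: Optional[str] = None) -> str:
    """Stub 'drafting agent': emits a first draft, or a revision if given feedback."""
    code = f"def solve():\n    return 42  # {task}"
    if feedback == "add a docstring":
        code = code.replace("def solve():", 'def solve():\n    """Solve the task."""')
    return code

def review_code(code: str) -> Optional[str]:
    """Stub 'review agent': returns a flaw description, or None if the code passes."""
    return None if '"""' in code else "add a docstring"

def agent_loop(task: str, max_rounds: int = 3) -> str:
    """Draft once, then alternate review and rewrite until the reviewer is satisfied."""
    code = write_code(task)
    for _ in range(max_rounds):
        flaw = review_code(code)
        if flaw is None:
            break
        code = write_code(task, feedback=flaw)
    return code
```

A real system would presumably route the review step and the rewrite step to models of different cost, which is the asymmetry the comment hints at.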

  • by Registered Coward v2 ( 447531 ) on Monday March 31, 2025 @12:46PM (#65271911)
    You mean learning "how to think, how to break down problems... how to communicate clearly, with humans and with machines" is actually valuable? I can't just be antisocial and expect to be successful? I actually have to interact with people, and be able to understand, break down, and explain a solution to a problem to be useful? OMG, I think I have been triggered.
    • It's what we used to call an "analyst".
      I doubt people will be able to do analysis in the future, because they will leave that to the AIs, and therefore will have lost the ability.. it's too hard to think, for gen A,B, or C ... whatever they will be called. I think 10 years from now, all the smart people will be dead and you'll be in Trump's 4th term.
      I'll still be alive, but living in the Alps, tending to a few sheep and hanging out with the dogs.
      Maybe we'll go from asking the AI questions, to the AI asking y
  • A CEO is much more easily replaced by AI than a good developer; his tasks are way easier to pattern-match than writing complex code.
    • Re: (Score:2, Troll)

      by dfghjk ( 711126 )

      The principal characteristic of a CEO or any business executive is sociopathy, something every computer application inherently starts with. It's not clear we even need AI to replace a CEO, unless you need an artificial simulation of the bad parts. Lack of empathy would arguably be something to overcome in AI; it's a valued feature for executives and Republicans.

  • No Longer Think You Should Learn To Code
    Amodei predicted AI could generate up to 90% of all code within six months.

    Who will understand it, or peer review it, or ensure it does what it's supposed to do, or fix it if it doesn't, because the AI didn't "get it" or do it correctly in the first place? You still need people who can code, and code well, not only to understand whatever is generated, but to do the inevitable work that AI can't do.

    Google CEO Sundar Pichai recently revealed that 25% of new code at the tech giant is AI-generated, though still reviewed by engineers.

    To my point ...

    • Firstly, I agree that people should not learn to code, just like everyone should not learn to plumb. It was always a dumb idea. OK, he didn't say everyone, but it implies it's some kind of useful life skill that most people should have; it's not.

      Another important point: what AI does is code from what other people have done. If we wish to progress, we still need some people to actually code. What AI does well is code things that have been done before.

  • by backslashdot ( 95548 ) on Monday March 31, 2025 @12:52PM (#65271935)

    The AI will code in some prompt gibberish that is suited for AIs to debug. I've been "vibe coding" for maybe a year and a half or two ... the thing it REALLY sucks at is UI .. debugging React shit. It gets confused and fucked by its own code .. it can't fix its own shit. Many times I stubbornly spend an hour trying to fix via prompting shit that I can fix in 5 minutes of manual review. Guaranteed what comes out of this mess is UI frameworks and programming languages lobotomized such that AI can code in it and, more importantly, debug it.

    • by Tablizer ( 95088 )

      How about Debugger Camp! AI can crank out cheesy code like nobody's mama, but still can't solve the tricky bugs.

    • by el84 ( 10322963 )
      You're right - it's a cert that future (mainstream) programming languages will be optimised for AIs rather than people - cos it's another (very effective) way to dis-intermediate human devs out of the process.
      • You're right - it's a cert that future (mainstream) programming languages will be optimised for AIs rather than people - cos it's another (very effective) way to dis-intermediate human devs out of the process.

        I for one am looking forward to the return of Hexcode as a mainstream programming language.



        (For giggles, I just googled Hexcode and ALL results on the front page were about colour codes in web pages... anybody else around here still reads "3F" as "SoftWare Interrupt" or "A6" as "LoaD accumulator A indeXed"?)

    • No, the thing it is best at is UI.

      It is much worse at anything else.

    • I'm using Claude 3.7 sonnet for a react UI today and it's perfect, no prompt gibberish and it looks just like something I could have come up with if I spent way more time on it. It even chooses the same UI framework that I would have picked and it does it consistently without me telling it to. Maybe it remembers it gave it to me before but I don't think that's what's happening.
  • hogwash (Score:5, Insightful)

    by GorillaSapiens ( 10413899 ) on Monday March 31, 2025 @12:55PM (#65271943)
    the current crop of LLMs can "write code" the same way a 5-year-old can write a novel. when a human today says "AI can write code", there is a 100% chance that that human can NOT write code. there's WAY MORE to "writing code" than throwing down something that looks syntactically correct. Pichai says "25% of new code at [Google] is AI-generated, though still reviewed by engineers." call me when human "review" is no longer necessary, when it compiles without errors, and when it does what it is supposed to do. then I will say that "AI can write code"
    • when a human today says "AI can write code", there is a 100% chance that that human can NOT write code.

      The guy can code. You're forgetting that he's pushing an AI product that can produce code. He's running with the idea that discouraging people from learning to code will translate into more demand for his product and more profits.

    • by e3m4n ( 947977 )

      Replacing 25 employees with a single human review is still bad news for humans.

      • by ceoyoyo ( 59147 )

        Is it?

        They're building a new subdivision across the street. They needed to extend the water supply, which required digging a ditch. So a guy shows up with a backhoe, digs the ditch, a couple guys show up with some pipe, stick it in the ditch, and backhoe guy fills it back up.

        Take out that backhoe and the job would have required a dozen guys with shovels and taken ten times as long. Gee, which one is better?

        • by e3m4n ( 947977 )

          So 12 guys can't get a job now, in theory, because the backhoe works 24x7. Now your taxes quadruple to subsidize pay to the 12 people that can't work. Major automation and industrialization have always gone hand in hand with population adjustment. The black plague and the printing press; WWI and WWII with industrialization. The more non-workers, the more likely something like a real pandemic or global war will come along and kill off the non-contributors. That level of unsustainability at the scale AI is poten

          • That's not how it works. That's not how anything works. And people worked a lot LESS after the black plague than before it, so you have things utterly backwards.

            • by e3m4n ( 947977 )

              Dumbass. They worked less precisely because of automation. The printing press replaced a thousand scribes, scribes that were wiped out by the plague. Throughout history, every major advancement of automation that did the work of dozens of men has gone hand in hand with a major population adjustment, war being one of the most common.

              • You're seriously claiming that the FARMERS were replaced by the printing press. And you call me "dumbass".

                You have no case, only a "just so" imagined story you're for some reason heavily emotionally attached to.

          • by ceoyoyo ( 59147 )

            Yeah, except that's not how it ever turns out. Those twelve guys have jobs doing something other than digging a ditch. Almost certainly something that's more pleasant and pays better.

            Major automation, and industrialization has always gone hand in hand with population adjustment.

            Yes, increased population.

            • by e3m4n ( 947977 )

              you are thinking too 1950s. When AI replaces every fucking job there is, those 12 guys are competing with another 25 million out-of-work fellows trying to find a job. This isn't one backhoe. This is a backhoe, your car, your mail man, your plumber, your landscaper, your teacher, your amazon delivery, your door dasher, your bartender; the list of potential replacements will be staggering in the next 15-20 years. There won't be a more pleasant, better paying job. If you're lucky you'll be a waiter working f

    • Actually I was moderating ...

      But your point of view is so catastrophically wrong, unless you want to nitpick about what "an AI can do".

      Of course they are not really doing anything; they are just a gigantic library and are able to process natural language well enough to find stuff in the library (as in the Library of Congress, aka books!) that fits your need. Of course that is not "coding".

      The AIs we train at the moment are far beyond "coding". They are used to analyse code, write regression tests and unit tests and stuf

    • Does "25% of new code at Google is AI-generated" explain why Google products today can't do half the stuff that they could do years ago? It seems like my phone loses functionality every year.
      • It seems like my phone loses functionality every year.

        My android loads more games every month, that I don't play and I keep deleting. But now it's harder to delete them, so I guess that's progress?

      • Millennials don't give a shit about us power users, that's why.

    • > call me when human "review" is no longer necessary

      At that point, why would anyone call you? Can't see the forest for the trees.

  • by WarJolt ( 990309 ) on Monday March 31, 2025 @12:56PM (#65271949)

    Struggling to write software gives you design intuition. Learning to code is not the end goal. It's learning to design. AI can barely write code. It is terrible at designing code.

    Many software engineers are terrible at design. Every project devolves into a steaming pile 💩 at some point. AI just accelerates that.

    • Struggling to write software gives you design intuition. Learning to code is not the end goal. It's learning to design. AI can barely write code. It is terrible at designing code.

      Many software engineers are terrible at design. Every project devolves into a steaming pile 💩 at some point. AI just accelerates that.

      I’ve found at least 70% of my time is figuring out what I want to do and the logic behind it, the rest coding and debugging.

      • Struggling to write software gives you design intuition. Learning to code is not the end goal. It's learning to design. AI can barely write code. It is terrible at designing code.

        Many software engineers are terrible at design. Every project devolves into a steaming pile 💩 at some point. AI just accelerates that.

        I’ve found at least 70% of my time is figuring out what I want to do and the logic behind it, the rest coding and debugging.

        For most of us 70% of our time is trying to figure out what the management or user group is actually asking for, because 90% of the time, the words they use have about as much resemblance to what they'd like to see as an end goal as, "I like pie," has to do with how you go about baking a pie that that person would like. Not to mention, there's a world of difference between an apple pie with a traditional flakey crust and a chocolate cream pie. Or did they mean a lemon meringue pie? Although, when it comes t

  • If you are content to churn out crap for minimum wage because you're slightly better than an AI (for now), by all means do not bother learning to code. You will, however, be less likely to progress in your career, extremely unlikely to ever achieve anything interesting, and first to be let go when AI improves enough to surpass you.

    For non-IT people, learning basic coding is an awesome method for learning how to problem solve and think logically. It will absolutely improve your ability to handle life even

    • I always thought that non-developers should take basic learning to code courses because the courses helped them understand what is required to do actual coding. I remember a fellow coworker not in my area approached me because he wanted to change the API of some code to use a new different data type. I immediately told him he could not do that without changing an underlying library first. Being non-technical, it took ten minutes of me unsuccessfully trying to explain what "dependencies" are. Finally, I had
      • Maybe you should have also showed him abstraction and overloading? :) to me that scenario reeks of "I don't wanna" thinking, and I would suspect there are good reasons for the request, but you blocked them by pedantically showing a compile error instead of asking, "Why, and should we accommodate?"

        But regardless, I don't think people should need to learn coding just to communicate with coders. There are plenty of avenues to critical, logical thought that don't include code.

        • Maybe you should have also showed him abstraction and overloading?

          1) I did. It took ten minutes to explain "dependencies". I mentioned abstraction and overloading at the beginning but he was stuck on "why can't I just deploy the change I made." 2) It wasn't my code, my area, etc. He came to me asking how he should approach a change but then refused to believe me when I told him what to do.

          but you blocked them by pedantically showing a compile error instead of asking, "Why, and should we accommodate?"

          I wasn't pedantic. He wanted to make a change that would not compile, and he would not accept that very basic reason until I showed him that it literally would not compile without othe
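The scenario in this thread, changing an API's data type without touching the code that depends on it, can be sketched in a few lines. Everything here (`Record`, `lookup`, `format_result`) is a hypothetical illustration, not the actual code from the story:

```python
from dataclasses import dataclass

# Hypothetical library code. The old API was:  def lookup(key: str) -> str
# Someone "just changes" it to return a structured record instead:

@dataclass
class Record:
    value: str
    source: str

def lookup(key: str) -> Record:  # new return type
    return Record(value=key.upper(), source="cache")

# Downstream dependency, still written against the old str-returning API.
# This is the part the coworker couldn't see: the change breaks code he
# doesn't own, so the underlying callers have to be updated first.
def format_result(key: str) -> str:
    return lookup(key) + "!"  # fails: can't concatenate Record and str
```

In a statically compiled language this fails at build time; in Python it surfaces as a runtime TypeError, but the lesson is the same: the dependencies have to change before the API can.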

      • I always thought that non-developers should take basic learning to code courses because the courses helped them understand what is required to do actual coding.

        The problem is many of those courses don't go much beyond "Hello World" and so coding looks easy.

  • by turtle graphics ( 1083489 ) on Monday March 31, 2025 @01:05PM (#65271987)

    Masad recommends learning "how to think, how to break down problems... how to communicate clearly, with humans and with machines."

    This is great, and in fact the primary focus of most educational programs. But a great way to communicate with machines is to use a precise language to express yourself - we call that a computer programming language. Sure, CS students need to learn some syntax, but most of their effort is spent on thinking and breaking down problems. If you're not going to do that through code, then how? Even accurate specification of a problem is difficult in English. Still, the best way to think about computers is to learn to code. And IMO learning to code makes you a better writer for humans as well.

  • by MpVpRb ( 1423381 ) on Monday March 31, 2025 @01:07PM (#65271991)

    Designing novel, complex systems is inherently hard
    Using an AI prompt to generate a web page only works because millions of webpages exist and the model was trained on them
    Using a simple text prompt to describe a novel, complex system, unlike anything the model was trained on, will fail
    Complex systems require complex designs, regardless of the language used to express them
    We need AI tools that help us manage and understand complex systems

    • Using an AI prompt to generate a web page only works because millions of webpages exist and the model was trained on them
      That is not how it works.

      Training happens by interacting in natural language with the models. You cannot train models by simply feeding them web pages, just like you cannot simply feed them cat pictures to have them recognize or later generate cats.

      Depending on the web page, whatever is behind it is so far abstracted away that the only thing you can feed/teach is HTML and layout.

  • How much are you trying to make me laugh!?

    I've been dabbling in these "AI" tools since their inception, all the way back to when they were called "tab completion"

    The modern LLM style is essentially "tab completion on crack", that's about it. They're glorified StackOverflow search engines. If a problem has -already- been solved, these tools help "find" that solution for you (assuming they're not absolutely hallucinating and injecting security vulnerabilities up the ass everywhere)

    However, as soon as you atte

  • by awwshit ( 6214476 ) on Monday March 31, 2025 @01:11PM (#65272003)

    A few years from now, when even more of slashdot is retired, they will call us up and ask us to fix the holes that AI has made.

  • ...I don't have a pole-dancer body (I bent the pole), so what's Plan C?

  • Learning how to think is not enough. What is sorely lacking is domain knowledge. If you want to produce effective software, you need domain knowledge in the target purpose. Otherwise, what is produced is some horrible mish-mash.

  • by ugen ( 93902 ) on Monday March 31, 2025 @01:24PM (#65272035)

    Learning math has become sort of pointless since there are calculators.
    Learning to write has become sort of pointless, since there are word processors (and dictation, and autocorrect, and AI).
    Learning anything has become sort of pointless, since there are products (made by people who know how to do those things) that will do it for you.

  • Wages will skyrocket as demand for coders increases "unexpectedly" and the supply is scarce.
  • No, AIs can't fully replace devs yet. No, they can't code GUIs. No, the need for human coders won't vanish overnight.

    The point is that we don't need "more" coders. We have more than enough now, and the pace of AI involvement is increasing. We have coders from age 18 to 70+ - we will not run out. So no, people shouldn't learn to code as career prep. The number needed is going to fall, not rise.

  • Tangential comment, but what is with every CEO, big and small, opining all over the internet and media space about...well just about everything. How do they find the time when it's oh so incredibly busy being CEO, no time for anything, working 16 hour days. Or at least that's what is said.

    In reality they all just want to be Tik Tok influencers with 500k subs. Lame.

    • by ceoyoyo ( 59147 )

      They're actually just writing Linkedin posts and business blogg.., er, journalists have discovered that when you need an article you just have to write "so and so said..." and then paraphrase Linkedin.

  • Masad called this shift a "bittersweet realization" after spending years popularizing coding through open-source work, Codecademy...

    Masad recommends learning "how to think, how to break down problems... how to communicate clearly, with humans and with machines."

  • They keep saying that XXX will eliminate the need for people to code. Code generators, Dan Bricklin's Demo to convert demos to code, Microsoft Visual C, now AI.

    In 1982, I was working for someone who insisted on flow charts, which practically were code-level. But even then, you had coders also doing programming and system design. You can have systems pump out code from designs, but the code would be essentially template-level code. You still need the system to be designed, the code needs to be checked, and de

    • by allo ( 1728082 )

      The question is about understanding systems. Programming syntax is not complicated but concise. You can use no-code systems and building blocks to create your program. You'll get annoyed by too much drag&drop of building blocks and wish for a concise syntax, which leads to some programming language as we have it, because they are a more compact way to express the same. What stays the same is that you have to understand the logic you want to express, no matter if it is a programming language's syntax

  • Hey folks, the AI-GUY says don't learn to code because AI. So do most other AI-GUYs. I think it's safe to ignore them. Wait for people like me (a 40-year developer) to say it. When (if) I say it, it won't be motivated by $$.

  • by dfghjk ( 711126 ) on Monday March 31, 2025 @01:49PM (#65272135)

    'Instead of syntax-focused programming skills, Masad recommends learning "how to think, how to break down problems... how to communicate clearly, with humans and with machines."'

    Otherwise known as learning how to code. Also, since when is learning "syntax-focused programming skills" NOT learning "to communicate ... with machines"? And once we have a generation of "programmers" who never learned to program, how will AIs be taught how to program, and who will audit the "work"?

    Once again, another bullshit article about ignorant grifters selling people on the idea that programmers are replaceable. How long before we do something about ignorant know-nothings with money telling us how everything is done?

  • So... instead of copying code from stack exchange we can use AI to copy the code? Progress!
  • Masad called this shift a “bittersweet realization” after spending years popularizing coding through open-source work, Codecademy

    Masad recommends learning “how to think, how to break down problems... how to communicate clearly, with humans and with machines.”

    What I was trying to say is that engineering is literally about how to think, break down problems, and craft solutions. What does this CEO think he's suggesting? He's proposing people should be engineers. Okay, and? If you're not engineering the solution, then what are you doing? Developing software is not just randomly mashing keys and being surprised at the outcome, which is apparently how many executives see developers, since they think mindless AI is the same thing.

  • Anyone who is a real full-stack, non-vibe-coder developer will be in roughly the same position as COBOL developers today:

    "Please don't retire, you're the only one who understands this code enough to modify it."

  • Writing a little program is just one thing AI can surely do extremely well, but why don't we give AI a run at the CEO position and see how well it does? LLMs can already produce as much gibberish text as you could possibly want, so just feed one a few business strategies and access to market data! Then all executive positions will be increasingly redundant as well!

  • ...for several years. You just need to pass a multiple-choice exam [arrl.org]. Being able to code is, however, a nice-to-have, and quite fun.
    Oh....you mean a different kind of code from Morse code, maybe ?!?
  • by KalvinB ( 205500 ) on Monday March 31, 2025 @02:39PM (#65272303) Homepage

    That's all this is about. Skilled labor thinks they're worth money. Unskilled labor can be paid minimum wage.

    It's long past time for developers to unionize and stop playing the game of letting rich people oppress workers by demeaning them to justify poverty wages.

  • After being told to "just learn to code" by the politicians who took their jobs away, what are they left with now?
  • We still need to engineer and program software. The language and tools are changing. Just like it always has and always will.

  • It just means more job security for me.

  • This guy is clearly selling you something, and sadly most won't know not to take him seriously. By design, modern AI is a pattern matcher. It can replicate pre-solved patterns. If you ask it to do something that's not well established in its training data, it will not do very well. That's just basic logic. With software, you can make infinite copies and reuse them easily, so pretty much all commercial programming is writing stuff that is new, not problems that have been solved already.

    Also, I work
  • by Tomahawk ( 1343 ) on Monday March 31, 2025 @04:21PM (#65272567) Homepage

    I tend to be a good problem solver. Despite mainly working in System Ops/DevOps, I've had developers pull me into sessions where their stuff was broken, and I'd help them quickly find and fix their problems.

    I'm good at this because I learned to code. I learned C and assembler in college, and networking, and OS design, and all those low-level things that help me "think" like a computer.

    That's a skill that will be lost if people don't learn how to code, I fear. AI will churn out code, and if it breaks the "developers" won't know how to even start looking at it for issues.

  • by cascadingstylesheet ( 140919 ) on Monday March 31, 2025 @07:04PM (#65272839) Journal

    Instead of syntax-focused programming skills, Masad recommends learning "how to think, how to break down problems... how to communicate clearly, with humans and with machines."

    Er, that was always what good programming was about.

    If you weren't already doing that, you were just a sort of code monkey.

  • Why? Because AI code generation isn't perfect and someone needs to fix the generated code.

    IMHO, AI generated code can take a lot of the tediousness out of coding, but will never be perfect, so will still need some humans to verify and fix the output.

    Plus Software Engineers/Architects are still needed to plan out how and stitch together the small bits of code so they work together well and can be maintained and expanded upon easily.

  • Well of course, what else is he going to say!

  • I really don't get the "let AI write code for you" and humans can debug, improve, and maintain it. It's completely backwards from what it should be. AI should first be used to write unit tests and run automated tests. So long as the tests are mostly right, that's a big win and there's little potential downside in production. Next, AI should move on to identifying and fixing bugs. That's the natural progression from writing tests. Once AI is good at writing tests and fixing bugs, then you let it take a crack
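The tests-first workflow this comment proposes, let the AI generate regression tests before it touches any implementation, might look like the sketch below. `parse_price` is a made-up stand-in for existing human-written code; the point is that a wrong generated test fails loudly in CI rather than silently in production:

```python
import unittest

def parse_price(text: str) -> float:
    """Stand-in for existing human-written code that the tests pin down."""
    return float(text.strip().lstrip("$"))

class TestParsePrice(unittest.TestCase):
    """The kind of regression tests an AI could generate with little downside:
    they document current behavior before anyone (human or AI) changes it."""

    def test_plain_number(self):
        self.assertEqual(parse_price("3.50"), 3.5)

    def test_dollar_sign(self):
        self.assertEqual(parse_price("$3.50"), 3.5)

    def test_surrounding_whitespace(self):
        self.assertEqual(parse_price("  $2 "), 2.0)
```

Once a suite like this exists and passes, letting a model attempt bug fixes becomes much safer, which is the progression the comment argues for.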
