Interview with Programmer Steve Yegge On the Future of AI Coding (sourceforge.net)

I had the opportunity to interview esteemed programmer Steve Yegge for the SourceForge Podcast to ask him all about AI-powered coding assistants and the future of programming. "We're moving from where you have to write the code to where the LLM will write the code and you're just having a conversation with it about the code," said Yegge. "That is much more accessible to people who are just getting into the industry."

Steve has nearly 30 years of programming experience working at Geoworks, Amazon, Google, Grab and now Sourcegraph, where he is working to build out the Cody AI assistant platform. Here's his Wikipedia page. He's not shy about sharing his opinions or predictions for the industry, no matter how difficult they may be for some to hear. "I'm going to make the claim that ... line-oriented programming, which we've done for the last 40, 50 years, ... is going away. It is dying just like assembly language did, and it will be completely dead within five years."

You can watch the episode on YouTube or stream it on all major podcast platforms. A transcription of the podcast is available here.


Comments:
  • by OneOfMany07 ( 4921667 ) on Thursday November 07, 2024 @04:47PM (#64928893)

    Can't have an effective conversation without precise terminology. And that same terminology was what held back inexperienced people from getting what they wanted from previous internet searches (against forum posts, etc).

    Knowing what can easily be done, and what to call it, are both very important to any software change... with or without AI. Best case, AI will eventually be able to translate paragraphs of talking around something into the actual thing, but that's just doing the PM's work too (translating requirements from the business owner).

    • It's neat that AI will be able to code in five years, since the AI we have now sure can't.

    • by war4peace ( 1628283 ) on Thursday November 07, 2024 @05:25PM (#64928969)

      Can't have an effective conversation without precise terminology. And that same terminology was what held back inexperienced people from getting what they wanted from previous internet searches (against forum posts, etc).

      I'd go much, much deeper than that.
      Computer interactions (in order to create something using them as tools, not just using them for entertainment) require discipline. This applies to most, if not all types of work, from drawing something to creating the next operating system. Anyone can "use", say, MS Paint, at its basic level. Click this, drag that, and you get a very basic drawing. But if you want to be proficient with it, you need to develop certain skills.

      Coding with help of LLMs is no different. Yes, it can be successful, even in its current state, but there are both prerequisites to achieving that, as well as problems.

      Issue: using natural language is inherently vague. "Make me the best game in the world" just doesn't work.
      Solution/Prerequisite: You need to know how to break a request into its tiniest parts, as well as how to put them back together (see the sketch below). Some people can't change the batteries in their remote, FFS. Also, you need to understand and be experienced in proper prompting, a task made more difficult by the fact that each LLM interprets the same prompt differently. This is not necessarily visible for very simple tasks, but as the task becomes more complex, prompting needs to be more and more specific and tailored to that specific LLM.
      Issue: It can be much more difficult to gain coding skills if you use an LLM to help you code. I, paradoxically, find it easier, because I can quickly iterate, experiment and test the validity of small pieces of code, especially in languages I don't know well, or at all. But that ability goes back to the previous entry: I can analyze, check, verify and rephrase, simplify and extrapolate, etc. Most people can't. Just listen to people order stuff at McDonald's; sometimes it's a pain to hear them bumble and struggle with something so simple.
      Solution/Prerequisite: "git gud" - and that takes time and a ton of effort.
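
      To make "break a request into its tiniest parts" concrete, here is a minimal sketch in Python. The call_llm() helper and the sub-tasks are hypothetical stand-ins (not any particular vendor's API); the point is only that each narrow, testable prompt can be reviewed on its own, instead of one vague mega-request.

          # Minimal sketch: decompose a vague request into narrow, testable prompts.
          # call_llm() is a hypothetical stand-in for whatever LLM client you use.
          def call_llm(prompt: str) -> str:
              """Pretend LLM call; swap in a real client here."""
              return f"<model output for: {prompt[:40]}...>"

          VAGUE_REQUEST = "Make me the best game in the world"  # too broad to act on

          # The same goal, broken into small pieces that can be prompted,
          # reviewed and tested one at a time.
          SUB_TASKS = [
              "Write a Python function move_player(pos, direction) that returns the "
              "new (x, y) position on a 10x10 grid, clamping at the edges.",
              "Write a unit test for move_player covering all four directions and "
              "both edge-clamping cases.",
              "Write a render(grid, pos) function that prints the grid with the "
              "player's position marked as '@'.",
          ]

          def run_decomposed(tasks):
              """Send each narrow sub-task separately so the output stays reviewable."""
              return [call_llm(task) for task in tasks]

          if __name__ == "__main__":
              for answer in run_decomposed(SUB_TASKS):
                  print(answer)

      Each sub-prompt names a concrete function and an observable behavior, which is what makes the output checkable; the vague version gives the model (and you) nothing to verify against.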

    • Never mind that even if you could just ask the AI to do it, you'd not have a job doing so, nor would there be a "programming" industry. It would just be a prompt that some office assistant or Project Manager dumped the requirements into as part of their other duties. There wouldn't be a special paid position just to input a prompt into a computer, like the C-levels keep claiming there will be, because that would be seen as an unnecessary expense to be cut.
  • If this guy thinks that, he's not really a programmer. Also, "assembly language" did not go away. He's obviously not worked as a programmer on the 80% of programming jobs.

    But hey, who can tell the difference when "people who are just getting into the industry" call themselves programmers but don't know programming and will be incapable of knowing whether AI-generated code works or not.

    What a society we live in, where the most important thing is to repeat lies over and over.

  • "We're moving from where you have to write the code to where the LLM will write the code and you're just having a conversation with it about the code,"

    No. We're not. That's not a thing LLMs can actually do. They seem like they're having a conversation because each piece of text seems like it should flow from the previous text, but it's not actually a conversation in the sense that it has an idea, and your replies affect and change that idea. There's no dialectic to it. It's just responsive to a rollin

    • Re: (Score:1, Funny)

      by Anonymous Coward
      you're very confidently wrong, i respect that.
    • by gweihir ( 88907 )

      Indeed. I should add that the idea of "instead of coding, have a conversation with the computer" is _very_ old. It was already considered old when I studied CS 35 years ago. This is the proverbial pipe dream that people desire but never get in reality.

      Obviously some no-honor scum will try to sell you something that looks like it but essentially does not deliver.

  • Imagine if everyone could write their own software. The days of this stupid shit, where EULAs and regulation stop people from protecting themselves against anti-competitive software companies, would be over. All the shit they do, like phone-home routines, coupled with stuff like pinned certificates that prevent you from seeing what traffic companies are sending about you from software that is on your machine. Imagine when EULAs that are restrictive and prevent you from doing the things you want to do are all irrelevant.

    • by gweihir ( 88907 )

      Hahahaha, no. We are about as far removed from that as ever. I.e. "not in the next few decades and maybe never".

  • I checked his LinkedIn account, he's one of my 3rd-degree connections. Hot dog! Maybe one day I'll graduate to being a 2nd-degree connection!

  • by Tony Isaac ( 1301187 ) on Thursday November 07, 2024 @06:00PM (#64929047) Homepage

    We can all see the vision of being able to just "talk to" the AI when creating code. It's tantalizingly close, we can almost taste it. I mean, if they'd just fix those pesky little glitches, where it pastes a bunch of HTML tags in the middle of my javascript, or adds a new function definition inside of the function I'm working on. Then we'd be there, right?

    Not so fast. Getting AI assistants to the point that they can be a big help with productivity is great, and that's already happening, but you've still got to know what you're doing. Getting to where you can *trust* the AI to do what you meant for it to do... that's going to be about as hard as getting self-driving cars to stop colliding with pedestrians.

  • by Otis B. Dilroy III ( 2110816 ) on Thursday November 07, 2024 @06:32PM (#64929167)
    The real question is not who or what writes the code. The real question is who debugs it.

    In my 30+ years of coding experience I found that debugging someone else's code normally takes longer than it would if I just wrote it and debugged it myself.
  • I don't do this "line oriented coding". I _design_ things, as algorithms, and then I translate that to code. Having an AI would not save time, because to tell the AI what I want, I would have to describe the algorithm. But that's when I am normally almost done anyway.

    I can only see it being useful if, say, the algorithm contained some steps that can be summarized, such as, "Extract data from excel into a set of MySQL tables". If an AI can do that, it would save me time.

    One thing I _don't_ do is code by trial and error, which seems to be the norm today - designing as you code. But that's just stupid.

    • But that's just stupid.

      Have you actually met many humans? :D

      More seriously put, this won't affect you much, but it may well replace a lot of lower-tier code monkeys.

      I've seen a similar shift over the years in the localization industry where I work. Increasing automation has put more pressure on the lower end of the job market. We don't need Bumbling Bob and Crappy Carl as freelance translators anymore, when Google or DeepL have comparable (or better!) error rates. Bob and Carl are tools, and not very good ones.

    • "Extract data from excel into a set of MySQL tables"

      Apparently it can do that. And probably lots of other menial jobs, if you know how to ask it nicely (see the sketch below).
      https://medium.com/@sayaleedam... [medium.com]
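
      For what it's worth, the non-AI version of that particular chore is already only a few lines. Below is a minimal sketch, assuming pandas, SQLAlchemy and a pymysql driver are installed; the file path, credentials and database name are placeholders.

          # Minimal sketch: load every sheet of an Excel workbook into MySQL tables.
          # Assumes: pip install pandas sqlalchemy pymysql openpyxl
          # The path, credentials and database name below are placeholders.
          import pandas as pd
          from sqlalchemy import create_engine

          engine = create_engine("mysql+pymysql://user:password@localhost/mydb")

          # sheet_name=None returns a dict of {sheet name: DataFrame}
          sheets = pd.read_excel("data.xlsx", sheet_name=None)

          for sheet_name, df in sheets.items():
              # One table per sheet; replace it if it already exists.
              df.to_sql(sheet_name.lower(), engine, if_exists="replace", index=False)
              print(f"Loaded {len(df)} rows into table '{sheet_name.lower()}'")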

    • by gweihir ( 88907 )

      One thing I _don't_ do is code by trial and error, which seems to be the norm today - designing as you code. But that's just stupid.

      Yep. Unless you do very simple "business code" only, the actual coding is a minor part of the work. The major part is architecture and design. I guess there is a market for simplistic code, but it is not one in which any real coders are to be found.

      My take: all this "AI coder" and "everybody should learn to code" nonsense is just a step on the way to idiocracy.

  • "Line-oriented coding ... will be completely dead within five years."

    Hmm. Folks still purchase vinyl records. Paper books. Buggy whips, even.

    Very little ever disappears completely.

    That said, changes in the job market are inevitable. Everything changes anyway. We should all plan accordingly.

    • by SirSlud ( 67381 )

      There are two kinds of people in the world: people who can listen to somebody - somebody they might even be predisposed to think of as stupid - say something like "buggy whips are completely dead" and nod their head in agreement like a sane, well-adjusted adult capable of inferring context and implied qualifiers, and people who just can't help themselves and go, "aaaaakkkkshhuaallly" ...

      The latter kind of people are super fucking annoying.

    • by gweihir ( 88907 )

      On the other hand, this prediction is just bullshit. And it is not even the first time it has been made. With a somewhat variable time-horizon (usually 5...10 years) I must have heard it regularly over the last 35 years since I got my CS degree. Apparently, it was also made well before. Never panned out, will not pan out this time. But that guy wants to sell something, so he thinks blatantly lying is acceptable.

  • "I'm going to make the claim that ... living, which I've done for the last 40, 50 years, ... is going away. I am dying just like all humans do, and will be completely dead within 30 to 40 years." - Me

"The lesser of two evils -- is evil." -- Seymour (Sy) Leon

Working...