Programming AI Education

'Coding is Dead': University of Washington CS Program Rethinks Curriculum For the AI Era (geekwire.com) 87

The University of Washington's Paul G. Allen School of Computer Science & Engineering is overhauling its approach to computer science education as AI reshapes the tech industry. Director Magdalena Balazinska has declared that "coding, or the translation of a precise design into software instructions, is dead" because AI can now handle that work.

The Pacific Northwest's premier tech program now allows students to use GPT tools in assignments, requiring them to cite AI as a collaborator just as they would credit input from a fellow student. The school is considering "coordinated changes to our curriculum" after encouraging professors to experiment with AI integration.


Comments Filter:
  • by Casandro ( 751346 ) on Friday July 11, 2025 @10:49AM (#65512418)

    when schools got rid of their programming courses and replaced them with Office 95 courses in Germany. The result was a lost generation... and a world in which "telling a computer what to do" is a very lucrative job, particularly if you know what the computer should do.

    • History (Score:5, Insightful)

      by JBMcB ( 73720 ) on Friday July 11, 2025 @11:05AM (#65512482)
      Coding was dead with 4GL. Then again with RAD tools in the 1990s like PowerBuilder. I've heard people argue coding was dead with UI coding tools like Scratch. Coding has been dead for decades. Still a lot of coding jobs, though.
You know the story of the frogs (which are cold-blooded) sitting in a pot of water, thinking everything is fine as the temperature of the water rises and rises, until one minute the water boils. It's an analogy about the dangers of unanticipated phase changes.

        Just because you've been able to say "see, it's not different this time" several times in the past does not mean that I will be wrong when I say "It's different this time!"

        The difference is that the cognitive capability and ready-to-hand, exploit
      • by gweihir ( 88907 )

Hahahaha, and 5GL! When I started studying CS, the 5GL initiative had just been recognized as a complete, abject and resounding failure. I expect that in at most 5 years, the same will be true for LLM "coders".

    • Re: (Score:2, Insightful)

      by AmiMoJo ( 196126 )

      To be fair, office skills are probably of far more use to most students. As long as the ones who wanted to learn to code had the opportunity to, it seems okay. I'm not complaining about it being lucrative either.

      • Well but those weren't "office skills", but "Microsoft Office skills". Nothing in there prepared you for working in an office. It was essentially learning how to work around the bugs of that software package in the current version.

        • If it was in the 1990s, Microsoft Office skills were Office skills in general.

Before the mid-2000s, virtually all office suites had CUA UIs - a standardized placement of menus, keyboard shortcuts, dialogs, etc., that made it easy to pick up a new application as long as you were familiar with its features and the CUA itself. A WordPerfect user had little difficulty transitioning to Word and vice versa. So teaching people how to use Word in the 1990s was teaching them word processing.

          This all changed with the

      • by Junta ( 36770 )

Part of it, though, is that those sorts of classes were stupid not because the skills were useless, but because you didn't really *need* the classes: the software was easy to use, unless you got into some of the trickier spreadsheet stuff that looks more like programming than office work.

Similar here: to the extent LLMs *can* augment/implement coding, people really don't need a university major to tell them how to use AI... If it can work, it can work easily. You don't need to be *taught* how to use the

We really didn't have the tech back then that we do now. At least for the rank-and-file jobs that are out there.

Coding is absolutely dead. What's left is high-end mathematics. But coding is dead. The kind of job where you could just come in and bang out code all day is kaplitzki. Mind you, most of those jobs were already going either offshore or to H-1Bs, but there were still some left.

      I think the biggest problem humanity faces isn't climate change or fascism. It's the inability to adapt to and recog
      • by MachineShedFred ( 621896 ) on Friday July 11, 2025 @11:53AM (#65512650) Journal

When human beings hit about the age of 12, they are locked in. Anything that changes after that, they refuse to acknowledge or interact with.

        This is absolute horseshit.

        List of things that didn't exist when I was 12, which near everyone my age or older uses on a regular basis:
        Smartphones
        Cellular data services
        WiFi
        Commercialized internet access
        Instant messaging
        Streaming media
        "The Cloud"
        Residential broadband

Your theory contends there are still tens of millions of people in the US in their mid-40s and above regularly using dial-up, or completely disconnected because they "refuse to acknowledge or interact" with any of the above. Ridiculous.

People do resist change, and the pace of change is rapid.

We're entering an era of sort-of-works AI code, shitty AI code, some that is quite good and works until something breaks, a minor amount that can withstand changes in tech (and especially in AI coding trends), and a parallel universe of humans coding alongside these changes.

We went from machine code to primitive if highly capable languages, to C, to Java, to Python, to Go, and most languages are improvements on the ones that came before. All will

      • by PPH ( 736903 )

We really didn't have the tech back then that we do now.

        Yeah, we did. Write out a pseudo-code (English language) description of what you want done. A _complete_ description. Hand it to a machine which generates and compiles the code. Done. Back in the 1990s.

But what we had in the 1990s was a group of middle managers who realized that their status and income depended on the number of warm bodies reporting to them. So they fought the move to technologies that would reduce their headcount from 500 to 5. Those people are retired and, increasingly, dead. An

        • >Current AI is garbage because it's envisioned as a general purpose tool
          Bingo!

The narrower you define your market, the more you can create tools of value for its use cases. LLMs are literally all things to all people right now and solve no specific problem well. Write me a letter, code me a function, be my friend, guide me thru my drug trip (see other articles here on /.), be my therapist (agents on Facebook)... What doesn't an LLM do?

          The people optimizing the LLMs for coding assistants, for instance, have
        • I graduated from comp sci over 20 years ago, and have had a lifelong career/interest in programming. Of note, my specialty in university was compilers and computer language design.

          Thing is, all computer languages exist for one reason: to translate what we want to do, and can describe in English (or other human language), into a language a computer can understand. The sole reason for compilers is to take our high level languages (eg. C, Java), and progressively rewrite them until you have specific languag
      • Maybe a job of nothing but coding would be dead but I don't think such a job should have ever been alive.

I can't imagine anyone who has actually used LLM coding assistance thinks the skill of being able to read, modify, and write code will be dead anytime soon. It went from complete absurdity to surprisingly capable (but still mostly wrong) real quick, and has kind of sat there for all but the most brain-dead simple projects.

        • "It went from complete absurdity to surprisingly capable...". Your post reads like someone in denial of the fact their profession will soon be automated away, like so many others in the past. So you think tech will stop at "surprisingly capable" eh? Get on the bandwagon vs. being left behind in the dust.
      • by narcc ( 412956 )

        The kind of job where you could just come in and bang out code all day is kaplitzki.

        It's absolutely astonishing that anyone still believes that nonsense.

        We've seen that same promise countless times over the years. This time is no different.

Well, it's a lot more expensive than earlier attempts, so it's got that going for it, I guess.

    • My American high school doubled down and upgraded to Turbo Pascal 5.5 in the 90's. These days about half of US high schools offer some kind of computer science or IT class. It might be JavaScript and HTML/CSS, or it might be Java, a handful of high schools are doing Python. (there isn't much standardization in education in the US)

The brain-dead thing is legislatures trying to push computer science as a mandatory course in our schools. We keep running into this in California. And Silicon Valley keeps telling th

      • Agree. I went to high school 1985-1989.

        My electives included 3 years of computers involving basic, pascal and Z-80 assembly, creative writing, economics, civil law and criminal law. There may have been some other things that I'm forgetting about.

In theory, they offered a second year of chemistry (organic chem), if enough students signed up for it. Sadly, not enough did when I wanted to take it.

    • by HiThere ( 15173 )

College professional courses are always about predicting the future; there's no alternative. I think "coding is dead" is a drastic overstatement of where things are right now, but a professional career is expected to last for several decades, and right now coding would be a bad bet.

  • Good. (Score:5, Insightful)

    by Petersko ( 564140 ) on Friday July 11, 2025 @10:57AM (#65512442)

Between established coders with careers and the folks already nearly through the pipe, we're staffed up. It's not that I agree that coding is dead... I just think we're saturated, and the demand will decrease. People entering two and four year programs to become commodity coders are in for a shock, like pinning your hopes on being a web designer because you understood HTML in 2005.

    • Between established coders with careers and the folks already nearly through the pipe, we're staffed up. It's not that I agree that coding is dead... I just think we're saturated, and the demand will decrease.

      HR: "We're doing a RIF, and unfortunately, we're going to have to let you go".

      Worker: "What am I going to do?"

      HR: "Learn to co... oh, fuck, I'm so sorry".

    • Re: Good. (Score:5, Insightful)

      by fluffernutter ( 1411889 ) on Friday July 11, 2025 @11:27AM (#65512558)
      I now think that AI gives an experienced programmer a huge advantage over a new programmer. Coding at speed with AI is all about knowing what to ask for as precisely as possible. Ask it a more general question and it is more likely to hallucinate.
      • by Junta ( 36770 )

I've never had particularly compelling results from prompt-style interaction, though as code completion it has been... occasionally useful, still usually wrong, but it can generate almost-correct code worthy of fixing faster than I can type it 10-20% of the time.

        It could just be the area I develop in, which is a bit more niche than maybe what other people are doing, but it seems to struggle a lot with having no clue about the ecosystem I work with day to day.

        • I've seen advanced LLMs generate good code down to rare control system architectures now - I've seen your opinion a thousand times before in IT. It's protectionism and a refusal to believe you will soon need to find another niche.
    • AI is just another tool in the toolbox for the skilled Software Engineer.

      Rather than walk over to the bookshelf and look something up in my old data structures and algorithms textbooks (yes, of course I kept those and more), I can ask an AI to lookup up the sort of algorithm I need, provide a summary of the concepts involved, and also provide some sample code. Just like the textbook's sample code, the code is not production ready. It lacks all the necessary defensive coding and perhaps task specific logi
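The "not production ready" point can be made concrete. A minimal sketch in Python (purely illustrative, not from the article or any AI output): the first function is the textbook-style code a lookup hands you; the second adds the defensive checks a real caller still has to bolt on.

```python
# Textbook-style binary search, roughly what a textbook or AI lookup emits.
def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid          # index of target
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1                   # not found

# The defensive wrapper the sample omits: validate inputs before trusting
# the algorithm's precondition (a sorted sequence).
def binary_search_checked(items, target):
    if items is None:
        raise ValueError("items must not be None")
    if any(items[i] > items[i + 1] for i in range(len(items) - 1)):
        raise ValueError("items must be sorted")  # O(n) check; drop if too costly
    return binary_search(items, target)
```

Whether the O(n) precondition check belongs in production is itself a judgment call; the point is that the decision, and the error handling, are work the sample code never does for you.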
  • by evil_aaronm ( 671521 ) on Friday July 11, 2025 @10:58AM (#65512446)
    If kids don't know how to code, or how to read code, how will they know whether the product of AI is correct in the first place? How will they know where to look when there's a problem? This is about as dumb as not teaching future mechanics how to use their tools.
    • by Waffle Iron ( 339739 ) on Friday July 11, 2025 @12:04PM (#65512696)

      More importantly, how will they be able to spot the trojans in the code that are part of the AI's plan to eliminate humans and take over the world?

      • by ksw_92 ( 5249207 )

        I'll start worrying about AI-generated code once it submits the winning entry to the Obfuscated C Code Contest.

        • It wasn't that long ago that people were making posts on this site confidently claiming that "Maybe machines can beat humans at chess, but they will never be able to beat the best human players at go!"

        • by HiThere ( 15173 )

But perhaps instead of writing C it will write its code in Whitespace or Brainfuck.

    • by Brain-Fu ( 1274756 ) on Friday July 11, 2025 @01:01PM (#65512910) Homepage Journal

      I expect that colleges have seen a significant reduction in the number of students enrolling in Computer Science and Software Engineering programs, because AI has scared the students away.

      Colleges want that enrollment money. They have already been watering down the curriculum year-after-year so that the difficulty would stop scaring potential students away (the fact that many of their graduates couldn't code themselves out of a brown paper bag notwithstanding). So now they are just continuing that process, looking at a way to promise "something relevant" to get students continuing to enroll.

      So this statement is really just advertising. They are trying to align with what young people are thinking and expecting, so they can get the enrollment money from them. Whether it is true or not really doesn't matter. It's just marketing.

      • I expect that colleges have seen a significant reduction in the number of students enrolling in Computer Science and Software Engineering programs

A drop with respect to the ticket punchers who only applied because a parent or guidance counselor said it's a good career path?
        Or also a drop among those with a genuine interest and curiosity about software and/or hardware?

So we may be returning to pre-Internet Revolution days in terms of coders and coding. The sort of coders that built the internet, not so much the sort that glue together 37 layers of public libraries of dubious origin to accomplish some simple task.

        Maybe we can put

        • by Junta ( 36770 )

          I hear you, as a member of the first generation to be cluttered with software developers who jumped in for a gold rush, it may be nice, eventually, for things to settle in.

          But we still have a great deal of grift from those gold rush people who can convince managers that they are best because they can use Claude to write up whatever the manager wants.

    • by gweihir ( 88907 )

      Indeed. But look at all the crappy, insecure, unreliable and hard to use software out there. Correct code has not been a priority for the industry for a long time. As the damage from that is now very high, that will change and AI will not play a role in that as it cannot.

  • by eepok ( 545733 ) on Friday July 11, 2025 @11:00AM (#65512456) Homepage

Due to changes like this, I foresee universities more loudly advertising that their CS programs are accredited, because I'm pretty damn sure that using GPT to create a program will not be worthy of a CS degree in most people's eyes.

    • by ebonum ( 830686 ) on Friday July 11, 2025 @11:25AM (#65512546)

      Learning to program isn't the same as Computer Science.

      Computer Science is lots of algorithms, computational theory (finite automata, P and NP, etc.), graph theory, tons of numerical algorithms, lexical, syntax and semantic analysis, program transformations (loop unrolling, etc.), lots of compiler theory, databases, networking, cryptography, etc. Tons of really interesting stuff! A lot of CS is more like mathematics than programming. Lots of proofs.

      Focusing on programming is a little like telling an engineering student that the curriculum is mostly bricklaying. Are we talking about a college that teaches computer science or a trade school doing "programming"?

Due to changes like this, I foresee universities more loudly advertising that their CS programs are accredited because I'm pretty damn sure that using GPT to create a program will not be worthy of a CS degree in most people's eyes.

      Hmmm...I am going to assume you are really talking about software engineering, and not computer science. They are related, but the article is about changes in the software engineering curricula at UW, and not so much the CS side of the house. Here's a direct quote from the article:

      “We have never graduated coders. We have always graduated software engineers.”

      With that said, I actually have a CS degree from the University of Arizona, but I spent thirty-odd years as a sysadmin, riding herd on sof

      • In the 1960s and 1970s a "real" programmer might have said that anyone using a compiler like FORTRAN or COBOL instead of writing assembly code wasn't doing "real" programming. In the 1990s, a "real" programmer might have said that anyone using an IDE with syntax highlighting and code completion instead of vi and make was taking a shortcut. Today, you're suggesting that using an AI assistant to handle boilerplate code, debug a tricky API call, or translate a Python algorithm into Rust is somehow not worthy.
  • by Lunati Senpai ( 10167723 ) on Friday July 11, 2025 @11:09AM (#65512502)

Even if you use AI to write it, you still need to code.

    You still need to know the math so you can check the results.
    You still need logic to figure out if it did the right thing.
    You still need to sit down and write the process to write the code.

One of the most popular books for teaching CS, Structure and Interpretation of Computer Programs, is written in Scheme (a Lisp dialect), which is not a language many people use day to day. But you still learn in that language because it makes a good starting point for how to think about code.

    After you know the principles, you can then start applying that knowledge to other computer languages.

AI just makes it so we have more English-like code. COBOL, C, JavaScript: all of these are ways to make coding easier.

    It's fine to use AI as part of the coding process, but it's like telling people that calculators make mathematicians obsolete. We still need programmers.

I saw an interview with the guy from Nvidia this week. He admitted that sometimes he did not even know how to ask the AI questions, so he asked the AI what he should say. The AI gave him good answers. Then he asked the AI to answer those questions. As a person who learned English the "hard way", and learned all kinds of coding the "hard way", yeah, I kind of resent the revolution, mostly because I worked so hard, but I also wanted it to happen. What happens next, I dunno.
    • by gweihir ( 88907 )

Actually, AI will not write anything beyond the most basic code with any advantage anytime soon, because past a very low complexity level, checking the code becomes more work than writing known-good code yourself. Unless hallucinations get solved (impossible for LLMs), AI "coders" will never become very useful. Sure, there are many coders who cannot really code themselves, and those may lose their jobs. But the (probably quite a bit smaller) rest? Not so much.

if you don't understand what you are coding. Yes, it's good to have classes that teach you how to be more efficient, but you have to understand more than the computer does, and you will have to know how to code. I don't think it would take more than adding a class or two about AI to any CS program, not changing the whole program.

  • by vyvepe ( 809573 ) on Friday July 11, 2025 @11:12AM (#65512510)

The current LLM approach to software development leads to too many errors. LLMs spit out garbage about 10-20% of the time. We cannot use software that is buggy that often. That means people need to check what LLMs generate, which is not far removed from "coding". People still need to understand the code itself. Whether they write it themselves from scratch or let an LLM generate a first version that they then review and correct is not such a big difference. The code still needs to be well understood by people.

We use exactly defined programming languages for coding for a reason. A programming language must be precise so that we can describe correct systems with it. If LLMs could do coding, then we would not need a programming language; a specification in imprecise and (as far as semantics go) poorly defined English would be enough.

But I agree that LLMs and stable diffusion can be used anywhere precision is not important. Things like marketing campaigns, sentiment management, disinformation spreading, engagement control, spam filtering, etc. Art to a high level as well. In the end, it is subjective what is correct and pretty as far as art goes. Occasional mistakes can be interpreted as the artist's intention.

I argued with an LLM for about an hour about how to properly escape paths with whitespace in them before passing them to AWK. I could have fixed it myself quicker, but I wanted to see just how many rounds of "No, you have to ..., because ..." it'd take before it realized what the actual issue was.

      It took a *long time*
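For anyone hitting the same wall: this is usually a shell quoting problem, not an awk one. A minimal sketch, assuming a POSIX shell (the file and its contents here are made up for illustration):

```shell
#!/bin/sh
# Hypothetical setup: a path containing whitespace
dir=$(mktemp -d)
path="$dir/my project.txt"
printf 'a b c\n' > "$path"

# Fragile: an unquoted $path word-splits, so awk would be handed two
# filename arguments (".../my" and "project.txt") and fail to open either:
#   awk '{ print NF }' $path

# Robust: quote the expansion so the shell passes a single argument
awk '{ print NF }' "$path"

# Robust: hand the path to awk as a variable via -v instead of
# splicing it into the program text
awk -v p="$path" 'BEGIN { print p }' /dev/null
```

The same rule holds inside scripts: pass data to awk through `-v` or as arguments, never by interpolating shell variables into the awk program string.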

      • by Junta ( 36770 )

The problem is that it is incapable of "realizing", so if it's spiraling out, it's really not worth trying to make it correct itself; it's not "learning".

        To the extent the chat guides things to a correct path it is by influencing the statistics of the content away from a failing outcome.

It will happily be correct but then admit it is wrong if the human contributes something wrong to the text, then pivot back if that's what the human says. None of this influences the 'next' person to come a

    • by gweihir ( 88907 )

As soon as code passes a relatively low complexity level (say, what a somewhat talented first-year coder can produce with moderate effort), analyzing code becomes harder than writing correct code. This has been known for a long time.

  • LONG LIVE CODING! (AI tools have a long way to go before they can understand requirements, integrate and solve day to day issues)
  • by MpVpRb ( 1423381 ) on Friday July 11, 2025 @11:21AM (#65512530)

    Predictions are hard, especially about the future. While it's true that AI research is making great progress, the hype vastly exceeds reality.

    I agree that the days of minimally talented programmers making big bucks are over.

    Creating complex, novel systems is hard, regardless of the language used to specify them. Current "vibe coding" works because the code being produced is simple and very similar to code that already exists.

    Talent is real. It takes a special kind of mind to be really good at programming. Unfortunately, many people advocate that everyone, regardless of talent, can learn to code and make big bucks. This is a myth. For those with talent and passion, studying CS still makes sense. And yes, I define CS as Computer SCIENCE, not basic coding.

    • by gweihir ( 88907 ) on Friday July 11, 2025 @12:45PM (#65512856)

      Predictions are hard, especially about the future. While it's true that AI research is making great progress, the hype vastly exceeds reality.

      I agree that the days of minimally talented programmers making big bucks are over.

Indeed. And not only because of AI. Minimally talented programmers also cannot survive when regulation and liability start to require correctness and security. And that has started now (EU NIS2, EU software liability for consumers, and more to come), because the damage that minimally talented programmers are doing is getting extreme. One figure from Germany in 2023 put the damage at around an average monthly salary per citizen, and that was only for those companies that were willing to report. That is not a small factor. That approaches make-or-break for an industrialized society.

    • Problem is, too many code monkeys believe CS is being able to "code" HTML.
  • 5x (Score:4, Insightful)

    by fluffernutter ( 1411889 ) on Friday July 11, 2025 @11:23AM (#65512538)
    I am now using AI to code and once I figured it out it made me around five times faster with probably more efficiency to gain. Much better if you need to write a bunch of small scripts. I now structure my code around AI query sessions.
It probably made me 5 times slower, because it gets hung up on a stupid thing, misinterprets me telling it what the correction is, does something different, rewrites the whole thing, introduces a different issue

    • Re:5x (Score:5, Insightful)

      by SoftwareArtist ( 1472499 ) on Friday July 11, 2025 @12:32PM (#65512802)

      I've observed a pattern with this. People who write lots of short scripts to do simple, one-off operations say AI is amazing and makes them much more productive. People who develop and maintain a large code base say it's useless, and won't let it anywhere near their code base. That probably says something about what it can do, and what it can't do.

      • The problem is that the scope becomes too big with a large codebase. Though I'm still trying different techniques.
      • by HiThere ( 15173 )

Sorry, this just means you need to get better at modularizing your code... sorry, your requests to the AI.
        If it can only handle relatively small modules, that's what you ask it for.

        FWIW, I expect that if it's true it won't stay true. (OTOH, I still code by hand, and only use the AI for hints as to how to do something in a language I'm not that familiar with.)

  • by bugs2squash ( 1132591 ) on Friday July 11, 2025 @11:26AM (#65512550)

    Time to return to the golden years of discovering new ways of organizing, conveying, and processing information, solving problems with less drudgery and understanding how things work without arcane syntax getting in the way.

No one complains about word processors stifling the creative process, so why should they complain about tools that support advancing CS? Well, I suppose people who think CS is learning how to use Excel might.

  • Not what AI does (Score:5, Insightful)

    by SoftwareArtist ( 1472499 ) on Friday July 11, 2025 @11:31AM (#65512582)

    Director Magdalena Balazinska has declared that "coding, or the translation of a precise design into software instructions, is dead" because AI can now handle that work.

    That's exactly what AI doesn't do. You give it poorly defined instructions in an imprecise language. It produces code that might or might not do what you were hoping for. Any computer scientist had better understand the difference.

  • There's no sense in whining about it, just like complaining about the weather. AI can now generate high quality code very quickly, write test cases for it, and thoroughly document it. Academic institutions have no choice but to adapt.

    "We have never graduated coders. We have always graduated software engineers.” A sensible way of thinking about it.

    “The hard problem is to precisely figure out what we want computers to do in order to accomplish some task,” she said. “That creati

    • by evanh ( 627108 )

      Sounds like you think an MBA is more than adequate.

      • If I thought that I would have said so. An MBA might be able to tell a software engineer what kind of functionality would be useful and valuable to a customer.

    • AI can now generate high quality code very quickly, write test cases for it, and thoroughly document it.

      What I've seen is not what I'd call "high quality code". For something extremely simple, maybe. For an even halfway complex task, the output I've seen has been less than 50% useful, and certainly not something I, as a responsible professional, would even think about implementing without extensive review (and I don't mean just looking at the outputs) and probably extensive code corrections and/or enhancements.

      • >> the output I've seen has been less than 50% useful

You must be using a different LLM than me; I generally get great results. I was able to do a major revision of some existing code this morning, a huge improvement, and I didn't have to write a line of it.

        But it obviously doesn't matter what you or I think. Employers have already decided, and the colleges now have to either get on board or be irrelevant.

    • by gweihir ( 88907 )

      There's no sense in whining about it, just like complaining about the weather. AI can now generate high quality code very quickly, write test cases for it, and thoroughly document it.

Hahahaha, no. It cannot do any of that as soon as you leave very simple examples behind. Also, quality and even correctness are hit-or-miss, and that applies to code, tests, and documentation.

  • Likewise writing, making music, etc, if we're to believe all the "AI" hype. If it can replace even a majority of programmers, then likewise it can replace a majority of people holding any job which isn't either manual labour or scientific experimentation. And even those jobs will eventually fall before the AI onslaught, if all the PR is to be believed.

    Whenever I read something like this that smells so strongly of bullshit, I wonder what the real motive behind it is. Is it just to get more people jumping fas

    • by flippy ( 62353 )

      I wonder what the real motive behind it is.

      Follow the money. It's 100% about the money, and AI will generate that income for the AI companies, whether the output is good or not. They've thrown boatloads of money at it, and they want their return. They don't much care (or at all) about the consequences of their low-quality output.

    • by gweihir ( 88907 )

      Indeed. According to the inane, disconnected and fantastical claims of the LLM people, basically anything is dead as LLMs can obviously do it all.

      That is obvious nonsense. In the end, very little of these claims will actually work out and it is even possible we will lose LLM tech for general application completely, because the effort-to-usefulness ratio is just not acceptable. That has happened before.

      On a side note: we keep putting more and more layers of abstraction between ourselves and the world of hands-on manipulating, tinkering, and building. I think this makes it less likely that we'll be able to reboot modern civilization after any of the near-death events that seem likely to occur.

      It is even worse: With each of these layers, things become harder to do unless you stay entirely within sta

    • by HiThere ( 15173 )

You shouldn't believe the hype... in either direction. The vendors will always claim their product is better than it is, and those who are threatened by it will always deny its competence.

      Unless, however, AI development hits a wall, one should expect it to continue to improve.

      OTOH, I suspect that training it on the unmoderated internet has gone quite a bit beyond the optimal stage. That was good for basic grammar, and picking up neologisms, but beyond that it doesn't seem to lead anywhere. What is needs

  • Does Paul think that coding is dead? Or does Paul think that the director of his namesake school should be removed? Or does Paul think: Fuck it. I'm rich, bitch!

  • UW announced a major revamp of its computer science curriculum to embrace large language models (LLMs) as collaborative tools. Not just for ethics discussions or one-off assignments—LLMs are being structurally integrated into how students learn to code, reason, and debug. In short: the assumption going forward is you won’t be competing with LLMs—you’ll be building with them.

    Fwiw, UW was in my top three when I was looking for a college after I left the USAF in 1989. For anyone who di

  • I still need to know *what* tech stack to use, how it fits together, and what to correct when AI gets it wrong. And then there are the design and performance aspects, which are more of an aesthetic thing.

    That said, it has sure saved me a lot of CSS twiddling time, and it does boilerplate faster than I can. It's like a very adept, very fast junior colleague that needs precise instructions and careful supervision.

  • I think this is a strong statement. If anything, with AI doing the "hard" part for students, they will become less able to do it without AI, and AI is not foolproof. So you get less skilled people running into walls and doing things the hard way, because that was the only way they could get AI to do it for them. AI is out of the box and it's not going away, but I recall CS programs always telling students: we aren't teaching you to be programmers, but computer scientists who are concerned with the theory of computation,
  • "Just coding" was never a good idea. But coding will only get more valuable for the ones who can really do it and can do more than the really simple stuff.

    Because here is the thing: reviewing even moderately complex code for errors and security problems is much harder and takes much more time than writing it well in the first place. All these claims of "coding being dead" or "AI can code" really apply only to toy examples and simplistic business code that basically anybody could write. Say, on the engineering di

  • What's the point of spending 2+ hours debugging a stupid error where your math is off slightly, or you misused a pointer? Of course, the flip side is: what if the AI makes the same mistake, and now you can't debug it?
    • Given how many "teach" the subject, this will not turn out well. It can make sense if done properly.
      CS education has become software engineering rather than actual CS. CS should probably live in the math department and have almost nobody majoring in it. Software engineering has a lot of overlap and could be what most CS majors are today.

      For the math side of CS, AI could do the coding and nothing else would change, because real CS involves almost no coding; in the past, people earned master's degrees in CS having written maybe 1-2 programs on punch cards.

  • I just see this move as meaning more jobs/demand and less competition for the few of us remaining who can actually write device drivers and the like.

  • There is a tremendous amount of existing code. Many times I've had to debug or enhance existing code. Reading code is hard. If no one is taught to program, and everyone is relying on code snippets from an LLM, that code is not going to be maintained and will slowly degrade.

    Writing new code - you need to know data structures and algorithms for anything more than the highest level scripting. If you feed an LLM all the code on the planet, you still have to understand what it's returning in order to debug it.

    If

  • by ndykman ( 659315 ) on Friday July 11, 2025 @02:03PM (#65513092)

    I get it, schools are hardly in a position to fight against AI tools and all the marketing around them, and are rolling over (just asking people to cite the AI they used). That still misses the whole point.

    The reason plagiarism is a fundamental issue isn't some abstract ethical point; it goes to learning. You don't learn by taking others' work and presenting it as your own. And using a chatbot is even worse for learning than finding, reading, and copying actual work.

    And coding is no more dead than it was when (insert tool here) claimed it would be every decade since the 70s.

    So, in the end, the schools are devaluing themselves and the value they provide to students and the community. Higher education is useful because people *can fail* to get degrees.

  • Use of that word to describe what software engineers actually do is beyond ignorant.
    God I hate that word. It sounds like all you do is brainlessly transcribe someone else's idea into a computer language.
    It massively demeans everything that a good software engineer actually does.

  • The question that should concern us in the software dev field is how well are the best AIs progressing at
    1. eliciting requirements from humans who have various degrees of vague understanding of what they want and even less understanding of what they really need, and
    2. translating requirements into a system design with good characteristics (performant, future-proof, scalable architecture and component/tool/framework choices, fit-for-purpose, maintainable, adaptable design etc.)
  • So what the fuck is coding now? (Roughly) 1950s, machine language. 1960s, assembly. 1970s, C, BASIC. 1980s, Pascal, Lisp. 1990s, Java, C++. 2000s, frameworks. 2010s, I got no idea, a bunch of shit. 2020s, even more what the shit.

    Can AI write the super Woz machine? The fast Fourier transform? The 3D Doom libraries? Perl scripts to manage Unix servers? WHAT ARE WE CODING? Because it is utterly ridiculous to consider the BS we call "AI" as actually coding, as I understand it.

    Do a database query and

  • Programming is not about coding. It's about expressing your thoughts clearly and coherently. About identifying the complete set of steps needed to achieve a task. Unfortunately most people suck at clear, rational, coherent thinking, and no amount of AI is ever going to help them.
