Programming AI

Amazon CEO: AI-Assisted Code Transformation Saved Us 4,500 Years of Developer Work (x.com) 130

Long-time Slashdot reader theodp shared this anecdote about Amazon's GenAI assistant for software development, Amazon Q: On Thursday, Amazon CEO Andy Jassy took to Twitter to boast that using Amazon Q to do Java upgrades has already saved Amazon from having to pay for 4,500 developer-years of work. ("Yes, that number is crazy but, real," writes Jassy). And Jassy says it also provided Amazon with an additional $260M in annualized efficiency gains from enhanced security and reduced infrastructure costs.

"Our developers shipped 79% of the auto-generated code reviews without any additional changes," Jassy explained. "This is a great example of how large-scale enterprises can gain significant efficiencies in foundational software hygiene work by leveraging Amazon Q."

Jassy — who FORTUNE reported had no formal training in computer science — also touted Amazon Q's Java upgrade prowess in his Letter to Shareholders earlier this year, as has Amazon in its recent SEC filings ("today, developers can save months using Q to move from older versions of Java to newer, more secure and capable ones; in the near future, Q will help developers transform their .net code as well"). Earlier this week, Business Insider reported on a leaked recording of a fireside chat in which AWS CEO Matt Garman predicted a paradigm shift in coding as a career in the foreseeable future with the prevalence of AI. According to Garman, "If you go forward 24 months from now, or some amount of time — I can't exactly predict where it is — it's possible that most developers are not coding."


Comments Filter:
  • AI is just something to offload the really simple stuff. We already have a lot of code generation. Say one uses a GUI layout tool that generates all the code to display the windows and widgets, plus blank templates for a developer to add the functional code to. That's 1990s tech. With AI, straightforward things could be automated. Select an array and tell HAL to sort it using an insertion sort.
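
    To make that concrete, here's a minimal sketch of the kind of canned template I mean, a textbook insertion sort over an array of records keyed on one field (the Item record and its fields are just made up for illustration):

      // Hypothetical example of generated boilerplate: insertion sort of Item[] ascending by id.
      import java.util.Arrays;

      public class InsertionSortTemplate {
          record Item(int id, String name) {}   // stand-in for "an array of some structure"

          static void insertionSortById(Item[] a) {
              for (int i = 1; i < a.length; i++) {
                  Item key = a[i];
                  int j = i - 1;
                  while (j >= 0 && a[j].id() > key.id()) {
                      a[j + 1] = a[j];   // shift larger elements right
                      j--;
                  }
                  a[j + 1] = key;
              }
          }

          public static void main(String[] args) {
              Item[] items = { new Item(3, "c"), new Item(1, "a"), new Item(2, "b") };
              insertionSortById(items);
              System.out.println(Arrays.toString(items));   // sorted by id: 1, 2, 3
          }
      }

    The point isn't the sort itself; it's that everything above is purely mechanical once you've picked the method, the key, and the direction.
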
    • So I used to work a pretty low tier shit job but it didn't pay the bills and so I had to bust my ass to move up so I could survive.

      Somewhere out there is a shitload of programmers who can do what you do maybe even better but who would be more than happy to take cakewalk jobs where they don't have to do tough challenging work all day.

      All those guys are about to be forced out of their comfort zone and they're going to start competing with you. Yeah a lot of them won't make it but a lot of them will an
      • by jhoegl ( 638955 )
        What is this BS?

        Here is what happens when you try that... enjoy as your crappy business gets crappier.

        https://www.industryweek.com/supply-chain/article/22027840/boeings-737-max-software-outsourced-to-9-an-hour-engineers [industryweek.com]
        • by GrumpySteen ( 1250194 ) on Sunday August 25, 2024 @09:08AM (#64733552)

          I'm against this sort of outsourcing, but that article completely ignores the actual problems with the 737 Max.

          The fundamental problem is that they stuck larger, more fuel efficient engines on it, but the engines were so large that they would scrape the ground if they were mounted the same way engines on older 737 models were. They 'solved' that problem by mounting the engines in front of the wings and higher up.

          The new mounting changed the centerline of the thrust, making the plane pitch upwards a lot more than previous models when thrust was applied. That increased the chances that the plane would stall.

          They 'solved' that problem not by going back to a simpler design, but rather by slapping on some sensors and software to try and compensate. Thus we got the Maneuvering Characteristics Augmentation System (MCAS) which was designed badly too.

          Let me say it clearly: MCAS looking at only the one input wasn't a bug. Boeing specified that it was meant to work that way. Whether the software engineers would have had the knowledge to raise questions about how the software worked, and whether Boeing management would have listened, are different questions.

          When an angle-of-attack sensor went bad and fed incorrect data into MCAS, MCAS responded by trying to nosedive into the ground. When the pilots tried to pull up, the Elevator Feel Computer (the system that provides feedback by moving the pilots' yokes) applied as much force as necessary to prevent them from doing so, because MCAS was indicating that the plane was stalling and pulling up would make the stall worse. Pilots weren't trained on this behavior, so they didn't know it could happen.

          All this was done to avoid designing a new air frame that was capable of handling the larger engines. If they had designed a new air frame, it would have had to go through a full certification and pilots would have to train on it, both of which are costly. By patching things together, Boeing was able to say "It's just another 737. No need for new certifications or pilot training."

          The blame over the 737 Max crashes lies completely with management at Boeing. They were the ones who made the decisions to design an unstable aircraft, patch over the instability with badly specified software, avoid training pilots and generally cut every corner they could.

          • by Aviation Pete ( 252403 ) on Sunday August 25, 2024 @05:53PM (#64734666)

            The new mounting changed the centerline of the thrust, making the plane pitch upwards a lot more than previous models when thrust was applied.

            It wasn't the line of thrust. It was increased suction on the intake lip, which, with the engine's higher mass flow and more forward location, created a stronger pitch-up moment. The critical thing here is the increase in pitch-up as angle of attack increases. This gradient should always be negative but became positive at high angle of attack on the 737 MAX. On top of that, MCAS relied on a single sensor, and pilots were unaware of the system, as you wrote.

      • To put it way too crudely... AI does the stuff dumb programmers did. Now what do we do with the dumb programmers? Let them flip burgers, I hear some frustrated programmers say. If this catches on, we are dumbing down a lot of people. It is not the right thing to do. Just pay a bit extra for your toaster.
        • by Shaitan ( 22585 )

          Actually I think the other guy has a point: lazy programmers aren't always dumb programmers... a lot of them just have a more life-heavy work/life balance. If you force them to wake up you'll find more than a few are capable [especially these days, when clever coding is reviled and code-by-numbers is the norm].

          The other problem is that those aren't just the roles for dumb/lazy programmers but for NEW developers. That means more and more people coming out of school and finding no jobs. Tech on the whole is already n

          • Re: (Score:3, Insightful)

            Lazy programmers are usually the best programmers.
            They spend time drinking tea, coffee or beer thinking about the simplest solution, instead of hammering out code.
            Then they write a few lines, and that is it.

            • by Shaitan ( 22585 ) on Sunday August 25, 2024 @07:45AM (#64733418)

              That's what some of my best coding work has looked like. I'd spend months just thinking about and talking about what to build. White board it out to explain it to others, maybe write a little code to mock up a piece or two and see how it goes together. The actual implementation then gets hammered out in under a week, sometimes in a weekend.

            • by gweihir ( 88907 )

              Agreed. But these people are _rare_. Most coders just write more code to solve problems and create a maintenance nightmare down the line.

            • by ObliviousGnat ( 6346278 ) on Sunday August 25, 2024 @09:49AM (#64733630)

              "My design style is to spend quite a bit of time thinking out every angle in my head and in rough sketches, and then to start coding. The first results aren't visible right away, but at the end they come up very quickly. Steve Jobs got concerned that I wasn't making enough progress. He even accused me of slacking and coming in at 10 AM in one staff meeting, but...I'd been leaving at 4 AM every morning, long after even the Houston brothers, Dick and Cliff, had left."

              --Steve Wozniak [woz.org]

              "The result of all this was that it didn't look like we were working very hard at all. I went home at 5.30pm, I never worked weekends, we didn't spend hours crowded around each other's desks throwing out guesses about what could be wrong with some failing production system. From the outside it must have looked like we'd been given a far easier task than the analogue TV guys. In truth, the requirements were very similar, we just had better designed and implemented software, and better supporting infrastructure, especially the unit tests.

              "Management announced that they were going to give out pay rises based on performance. When it was my turn to talk to the boss, he explained that it was only fair that the pay increases went to the people who worked really hard, and that our team just didn't seem to care so much about the company, not compared to the heroes who gave up their evenings and weekends."

              --Mike Hadlow [blogspot.com]

            • by cusco ( 717999 )

              I used to explain to newer people how to do task such-and-such, and they'd be surprised because what I did was much simpler than what they had learned. I'd just tell them, "If you want to know the most efficient way to do something just ask the lazy guy." They always asked, but then they always came back for more tips. :-)

        • by ShanghaiBill ( 739463 ) on Sunday August 25, 2024 @07:01AM (#64733350)

          We've been automating work for 300 years, and people have always predicted doom and poverty.

          Instead, we have a twenty-fold improvement in living standards and near record low unemployment.

          Automation doesn't lead to unemployment. It leads to a bigger economy.

          Lump of labor fallacy [wikipedia.org]

          Programming is already highly automated, with compilers replacing assemblers, GUI IDEs, and frameworks replacing custom code. The productivity tools have always led to higher demand for programmers. This is an example of Jevons Paradox [wikipedia.org].

          • We've been automating work for 300 years, and people have always predicted doom and poverty.

            Instead, we have a twenty-fold improvement in living standards and near record low unemployment.

            Automation doesn't lead to unemployment. It leads to a bigger economy.

            Yes. And too many people have core competency in inertia. They want to graduate, get their jerb, and do the same thing every day of their life until they retire.

            Ain't happening, and boring.

            Every time an older technology has been on the wane, I picked up something new. This leads to doing a lot of different things in one's career. And I haven't been unemployed other than one short stint when I was around 20 years old.

            Lots of jobs out there, some pay pretty well. But it isn't appealing to the incurious.

            • There is this decades-old Dutch song, Hilversum III by Herman van Veen. It cheerfully complains about radio. Before radio, he goes, people sang and whistled in the streets while doing their work. There was no radio, but everybody had his own voice. Sounds like fun, but it is gone.
              I hear my father-in-law complain about email. Back in the day, he says, people were more relaxed. Need a document urgently? I'd send it through the post office. It will reach you in two days.
              The price each time appears to be a piec
              • How does waiting two days for an urgent letter make us more human?

                  • by Chelloveck ( 14643 ) on Sunday August 25, 2024 @09:49AM (#64733632)

                    It teaches you patience.

                    Then I presume it would be even better to revert to the days of wind-powered ships, when it would take weeks to deliver a message across the Atlantic. That teaches you even more patience.

                    Both the radio and postal arguments are cases of, "The way it was when I grew up is the way it's supposed to be." The time when you learned how the world worked was the pinnacle of human achievement, and everything since then has been a regression in some way. So it has been ever since cave paintings replaced oral storytelling. "Back in my day people would gather around the fire and talk to each other! Now all anyone does is stare at the walls. God help us when we finally develop written language, that will be the final nail in the coffin of humanity!"

                    • Socrates claimed that new-fangled writing was bad because people no longer needed to memorize things.

                    • You have a point but also, back then maybe people had fewer mental health issues because they had more brain downtime?

                    • Things can be both "more human" and worse. There's nothing more human than living the hunter-gatherer lifestyle. Not like in the office when your fight-or-flight response gets triggered by a poor performance review, or when you have job stress but there's no lion to run away from. It's unhealthy, it's not what humans are "designed" for. But also, we can't go back; there's too many humans to live like that, if we did we'd get taken over by whatever group kept industrialization, almost no one would even choos

              • There is this decades-old Dutch song, Hilversum III by Herman van Veen. It cheerfully complains about radio. Before radio, he goes, people sang and whistled in the streets while doing their work. There was no radio, but everybody had his own voice. Sounds like fun, but it is gone. I hear my father-in-law complain about email. Back in the day, he says, people were more relaxed. Need a document urgently? I'd send it through the post office. It will reach you in two days. The price each time appears to be a piece of our humanity. Not something to just wave away.

                Here's my take on it: at each stage of progress, there is a group that thinks it is a terrible thing, and that what existed in the days when they were young was the sweet spot.

                Notwithstanding that the old people, when they were young, thought exactly the same thing. Email is dehumanizing? Radios with integrated circuits, then transistors, then tubes, then RF alternators, then spark-gap sets, then telegraphs. Diesel trains versus coal-fired boiler trains. Back in the day, telephones were a work of the devil for older fo

                • I think it is a bit less cliché than that. When AI really gets good, a lot of jobs really will be obsolete. It will make a large group of people economically obsolete. That is a very dangerous situation. Maybe we can skip the riots this time. Maybe we just need to be a bit careful. Show some restraint. Think a bit further than "ooooh this will make my product cheaper go go go go before the competition finds out." A bit of delay will not be a disaster. I know, new toys are fun.
          • We've been automating work for 300 years, and people have always predicted doom and poverty.

            Instead, we have a twenty-fold improvement in living standards and near record low unemployment.

            Automation doesn't lead to unemployment. It leads to a bigger economy.

            Back when it started, it did lead to doom and poverty and starvation for many. You gloss over decades of bad times a long time ago.

          • ... doesn't lead to unemployment.

            How do you know? Nowhere does Jevons paradox claim that, and if goods are cheaper, someone is getting less money (per unit). Increased efficiency causes more consumption. Who makes all the goods for us to consume? Other humans. It's easy to spot the assumption in your argument: an assumption that is undermined by the existence of robots.

            ... higher demand for programmers.

            Then why are so many out of a job this year? The short answer is the demand was never there. The increase in productivity over the last few years did not increase

    • I predict that somewhere inside that conversion there are going to be some oddball bugs that only occur under special circumstances.

      • by Viol8 ( 599362 )

        This. And the code may possibly have been written in a really odd way, meaning not only does no one have the knowledge of how it works, but figuring out how it works could be the devil's own job.

      • by tippen ( 704534 )
        How is that different than when humans are doing the conversion?
        • How is that different than when humans are doing the conversion?

          Because this is AI. When AI screws up it shows it is unworthy of doing anything useful. When a human screws up it shows they are human.

      • by gweihir ( 88907 )

        Likely. And some of them will be security bugs. Problem is that while you can fix the bugs, you cannot really fix the AI that made them and hence will keep making them. And at some time the attackers will know and routinely test for them.

      • by drnb ( 2434720 )

        I predict that somewhere inside that conversion there are going to be some oddball bugs that only occur under special circumstances.

        Which is why you always carry your helmet with you.

      • by drnb ( 2434720 )

        I predict that somewhere inside that conversion there are going to be some oddball bugs that only occur under special circumstances.

        I'm really thinking of a "conversation" that goes something like:
        Right click on an array of some structure, select sort, select sorting method, select increasing/decreasing, select key.
        The "assistant" implements the canned template for that method.

        Can we contrive a more complicated scenario than such a basic task? Sure. But my point is that such basic tasks are a fairly logical next step for automation.

        Think of the assistant as being more like an expert system trained on the Standard Template Librar

    • by Luckyo ( 1726890 )

      This is the next level of "80% of what we do is simple". It's the 80% of the remaining 20%. And it's the first iteration of LLM assist for coding. It will take another 80%-of-the-remaining-20% chunk soon enough, and then probably at least one more out of what's left.

      That's the actual universal promise of automation. It takes on the part of work that is far less demanding, allowing humans to focus on things that are actually hard, and only on them.

      • by drnb ( 2434720 )
        Technical drawing, the creation of blueprints, used to require the skill of cleanly drawing lines, curves, and letters with pen and ink. Pencil was for lightweights; pen was the ultimate goal. Probably had something to do with reproducing the original.

        In any case, we moved to CAD and let the computers draw the lines, curves, and letters. The pen-and-ink skill was obsoleted.
  • by drnb ( 2434720 ) on Saturday August 24, 2024 @11:03PM (#64732892)
    So that's 11.25 million lines of code:

    10 lines/day * 250 workdays/year * 4,500 years.
    • So the next step is to run a code de-duplication analysis, reducing it to 1.2 million lines of code.

    • by Shaitan ( 22585 )

      Back when people measured output in LOC the usual standard was 100 lines, not 10.

      • by drnb ( 2434720 )

        Back when people measured output in LOC the usual standard was 100 lines, not 10.

        No, pretty damn sure it was 10. And that is lifecycle. So debugging (which may involve throwing some lines out, a negative number if you will), refactoring (more negatives), upgrades, etc. are represented in that 10. A programmer may type more than 10 on a given day, but over the life of a project it will average down to 10 as stuff gets written and less productive days occur. "Less productive" including studying, thinking, not typing. It's why LOC is kinda naive.

        A slightly better metric is operands and oper

  • Bullshit (Score:5, Insightful)

    by lsllll ( 830002 ) on Saturday August 24, 2024 @11:10PM (#64732898)

    "Our developers shipped 79% of the auto-generated code reviews without any additional changes

    Using my own experience with ChatGPT as an anecdote, I can say that my success rate hasn't been anywhere near that. As a matter of fact, with the simple exception of asking ChatGPT to write a function to calculate the square root of a number, every single time I've asked it for anything more complex has resulted in a lot of going back and forth until it gets things more aligned with what I asked, and even then there are still issues and it contains errors. Maybe they're using something that's better. Maybe I'm stoopid and don't ask the right questions.
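
    (For what it's worth, the square-root case really is about as easy as it gets; a Newton's-method version is only a handful of lines. The sketch below is just my own illustration in Java, not anything ChatGPT or Amazon Q produced.)

      // Newton's method (Newton-Raphson) for sqrt(n): iterate x -> (x + n/x) / 2 until it settles.
      public class NewtonSqrt {
          static double sqrtNewton(double n) {
              if (n < 0) throw new IllegalArgumentException("negative input");
              if (n == 0) return 0;
              double x = n > 1 ? n / 2 : 1.0;              // crude initial guess
              for (int i = 0; i < 100; i++) {              // hard cap on iterations
                  double next = 0.5 * (x + n / x);
                  if (Math.abs(next - x) < 1e-12 * next) { // relative convergence check
                      return next;
                  }
                  x = next;
              }
              return x;
          }

          public static void main(String[] args) {
              System.out.println(sqrtNewton(2.0));         // ~1.4142135623730951
          }
      }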

    • I agree with your sentiment. "AI" as it exists now is all well and good for something like (ChatGPT used in this example) "Write C++ code to calculate the square root of a number." Of course, it understandably uses the sqrt() function. "Write C++ code to calculate the square root of a number without using existing libraries" generates code using the Newton-Raphson Method, and it looks about right, but I can think of a few easy ways to make it way more efficient, at least for larger numbers. Of course, the programmer sti

    • Re: Bullshit (Score:5, Informative)

      by phantomfive ( 622387 ) on Saturday August 24, 2024 @11:41PM (#64732924) Journal
      Amazon seems to have put effort into the special case of programming. Their AI will first give you an outline of the algorithm in words. At that point you can adjust the algorithm. Then, once you agree on what it should do, it will write the algorithm. That is an approach that might help it overcome some of the limitations of chatGPT, by functionally increasing the amount of context it can control.

      Secondly, it seems like they've augmented its ability to scan code, and figure out how to upgrade it. That's often something that can be easily automated without AI, especially in Java which has types and compile time binding. However, chatGPT might be a decent frontend for this kind of upgrade program.

      It still sounds like bullshit, upgrading to Java 17 shouldn't be THAT hard unless Amazon was doing something really weird in the past with Java.
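
      To give a flavor of what such an upgrade usually touches, here's an illustrative, hand-written example of the mechanical edits involved. This is not Amazon Q output, just two well-known Java 8 to 17 cases (deprecated boxing constructors and the newer InputStream API):

        // Illustration only: two mechanical rewrites a Java 8 -> 17 migration typically makes.
        import java.io.IOException;
        import java.io.InputStream;
        import java.nio.charset.StandardCharsets;

        public class UpgradeExamples {
            static Integer boxed() {
                // Java 8 style: return new Integer(42);   (boxing constructors deprecated since Java 9)
                return Integer.valueOf(42);                // what a migration tool would swap in
            }

            static String readAll(InputStream in) throws IOException {
                // Java 8 style: a hand-rolled byte[] buffer loop around in.read(...)
                return new String(in.readAllBytes(), StandardCharsets.UTF_8);  // readAllBytes() exists since Java 9
            }
        }

      The harder part, as others note below, is when the upgrade drags library dependencies along with it.
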
      • Amazon seems to have put effort into the special case of programming. Their AI will first give you an outline of the algorithm in words. At that point you can adjust the algorithm. Then, once you agree on what it should do, it will write the algorithm. That is an approach that might help it overcome some of the limitations of chatGPT, by functionally increasing the amount of context it can control.

        Interesting approach! I haven't used theirs.

        I wonder if you could simulate that in ChatGPT - tell ChatGPT to first work with you to hash out the requirements, before generating any code. I bet you could.

    • Oh, I don't know.
      Java is pretty much backwards-compatible.
      Unless they're using words as variables or class names that later became part of the language (like "module"), I don't even know what their "Java upgrade" was supposed to be doing except maybe replacing deprecated methods with their successors.
      Ok, there's some stuff that used to be part of the JDK/JRE that got dropped in newer versions. And Amazon is old enough that they might be using collections without generics in some corners, but still..4,500 ye

      • by bsolar ( 1176767 )

        Oh, I don't know. Java is pretty much backwards-compatible. Unless they're using words as variables or class names that later became part of the language (like "module"), I don't even know what their "Java upgrade" was supposed to be doing except maybe replacing deprecated methods with their successors.

        The main issue is typically dependency upgrades. Upgrading the Java version might mean you also need to update some dependency to a new major version, since the old dependency version does not support the new Java version. This means the dependency would also contain breaking changes, sometimes pretty significant ones.

        The annoying thing is that it might even be that the new dependency version is not compatible with the old Java version, meaning that you have to update both at the same time...

        Dependency manageme
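
        A concrete (and hedged, since it's not specific to Amazon) illustration of that kind of breaking change: when a dependency's new major version adopts the jakarta.* namespace, every import has to move even though the code itself barely changes. (Assumes the new servlet API jar is on the classpath.)

          // Before (old servlet-api dependency):
          //   import javax.servlet.http.HttpServlet;
          //   import javax.servlet.http.HttpServletRequest;
          //   import javax.servlet.http.HttpServletResponse;
          // After (new major version of the dependency):
          import jakarta.servlet.http.HttpServlet;
          import jakarta.servlet.http.HttpServletRequest;
          import jakarta.servlet.http.HttpServletResponse;

          public class HelloServlet extends HttpServlet {
              @Override
              protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                      throws java.io.IOException {
                  resp.getWriter().println("hello");   // the body is untouched; only the namespace moved
              }
          }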

    • by jvkjvk ( 102057 )

      I expect you are doing something much different than they are.

      They are using AI as an auto-update tool to fix library/Java version issues and the like, not to produce new stuff.

      As such, we have been doing stuff like this "AI" for quite some time. It is more of a template-type search/replace: replace this call with that, replace this pattern with that.

      So, good. We finally have a way to reduce the technical debt of language/library version upgrades. That stuff is generally never done anyway. Can you see Amazon

    • Maybe, but how wrong could a guy like this be? /s

      Jassy — who FORTUNE reported had no formal training in computer science — also touted Amazon Q's Java upgrade prowess ...

    • by ras ( 84108 )

      My experience also agrees with yours. My hit rate on AI questions is low; so low that if I'm confident of my search query terms, it's faster to sift through the results a Google search throws up than to ask an AI and then try to decide whether it's lying to me.

      So let's do some basic research. First, who is Andy Jassy? Wikipedia [wikipedia.org] says he got a degree in government, then got an MBA, and was eventually hired by Amazon as a marketing manager. He and Jeff Bezos came up with the idea of AWS, which he headed and was a wild succ

      • Re:Bullshit (Score:4, Insightful)

        by gtall ( 79522 ) on Sunday August 25, 2024 @06:55AM (#64733340)

        My own take is Monkey Butt Syndrome. When down on the tree looking up, all one sees are monkey butts. When on the top of the tree looking down, all one sees are smiling monkey faces. Which monkey is going to tell the top guy that their experiment to save "uncounted" hours of development time is not "robust"? So Jassy hears some encouraging talk. His reptilian marketing brain takes over and we get the above quotes.

        I wouldn't trust anything he says on technical subjects further than I can spit a two-headed rat.

    • every single time I've asked it for anything more complex

      The vast majority of code isn't complex. That's why "no-code" programming exists in the first place. AI isn't there to think for you, it's to do the things you don't normally think about.

      • by Viol8 ( 599362 )

        IME of no-code, it's only no-code so long as you don't need to do anything more complex than load data from a DB and display it in a table. If any kind of processing is required then you're going to have to write some kind of algorithm, whether it's done in text or by drawing lines and circles, and it pretty quickly becomes more hassle than doing it using a traditional programming language.

      • Congratulations. You are officially the entertainment around here.

        Here's your trophy for "Most Moronic Statement on Slashdot 2024".

        Someone help Garbz with his drool bucket.
    • Re:Bullshit (Score:4, Insightful)

      by eneville ( 745111 ) on Sunday August 25, 2024 @03:01AM (#64733104) Homepage

      No question.

      Amazon just wants to tell investors they have an AI product.

      That's all this is.

      • by znrt ( 2424692 )

        plus the ceo wants to boast to justify his bonus.

        spent a couple of minutes in that x thread, the choir of clueless ass-lickers that congregates there is a really embarrassing display of humanity. (i held out that long because there are also a few funny trolls inbetween).

    • by Shaitan ( 22585 )

      Yeah, but this isn't coding from scratch; they said "Java upgrade", so possibly just updating existing Java functions to a new standard. ChatGPT will generally eat that for breakfast.

      • by Shaitan ( 22585 )

        Especially if you have an existing test suite and have it automatically kick it back with whatever tests fail for revision until everything passes and only then pass it to a dev for review.
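
        Something like the loop below is the shape of that idea; everything here is hypothetical scaffolding (the Llm and TestSuite interfaces stand in for whatever model API and test runner you actually wire up):

          // Keep revising until the test suite is green, then hand off to a human reviewer.
          public class UpgradeLoop {
              interface Llm { String revise(String code, String failureReport); }
              interface TestSuite { String run(String code); }   // returns null when every test passes

              static String upgradeUntilGreen(String code, Llm llm, TestSuite tests, int maxRounds) {
                  for (int round = 0; round < maxRounds; round++) {
                      String failures = tests.run(code);
                      if (failures == null) {
                          return code;                           // green: ready for human review
                      }
                      code = llm.revise(code, failures);         // kick it back with the failing tests
                  }
                  throw new IllegalStateException(
                          "still failing after " + maxRounds + " rounds; escalate to a developer");
              }
          }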

    • with the simple exception of asking ChatGPT to write a function to calculate the square root of a number, every single time I've asked it for anything more complex has resulted in a lot of going back and forth

      I don't know if their claims are legit, but they allegedly are only asking for their code to be updated, not for new code to be written out of nothing. That's a much, much easier job. They have working code as input! And they might even be able to use their existing tests to validate it. To me, this seems like something where AI could actually do what it promises, though you still need a human to look over the code produced to make sure it isn't doing something completely bananas.

    • "Our developers shipped 79% of the auto-generated code reviews without any additional changes

      Using my own experience with ChatGPT as anecdote, I can say that my success rate hasn't been anywhere near that. As a matter of fact, with the simple exception of asking ChatGPT to write a function to calculate the square root of a number, every single time I've asked it for anything more complex has resulted in a lot of going back and forth until it gets things to be more aligned with what I asked and even then there are still issues with it and it contains errors. Maybe they're using something that's better. Maybe I'm stoopid and don't ask the right questions.

      I will say that using ChatGPT effectively for programming takes a lot of project management skills, which a lot of programmers seem to be surprisingly lacking in (at any rate, going by some of their complaints about ChatGPT).

      I mean the back and forth is reduced considerably if you write good requirements to give it to begin with.

      And after that, you still have to be willing to engage in back and forth. Which seems to almost personally offend a lot of programmers who post about this.

    • by Junta ( 36770 )

      My personal experience is similar to yours, for most coding AI solutions. To support your point, this is an Amazon exec bragging about an Amazon product, so he's inclined to cherry-pick and exaggerate. Notably, he didn't speak to whether the AI-generated results were sufficient (e.g. they might have been acceptable to pass code review, but missed stuff that a human needed to do). Particularly if the developers were under pressure to make the results 'look good', they may be inclined to accept a review requ

    • Maybe they're using something that's better. Maybe I'm stoopid and don't ask the right questions.

      You didn't ask if migrating from Java 8 to 11 or 17 etc. is even difficult, or if it's a relatively simple task that is very time-consuming for human devs.

      So you're comparing whatever your use case is, and your experience, to one you don't understand, with people that know how to use the tool better, and saying that's impossible.

      I'm not saying you're stupid, but you said it yourself: knowing the questions to ask is what makes a person smart.

  • by end rant ( 7141579 ) on Saturday August 24, 2024 @11:59PM (#64732952)
    3000 Years of Commuting
    1499 Years of unproductive meetings

    Yup, it looks like his estimate is right on!
    • Now there is a potential use for AI! Let AI do the meetings. Everyone is represented in the meeting by an individual AI. They work it all out while you get some actual work done. AI, how did the meeting go? We spent 1h discussing an issue John has. It is unrelated to your part. There is no need to read that part of the summary. I informed the manager AI that your work is going well and pointed out that you have fewer issues than John even though your work is just as complicated. I negotiated a 1% raise with the CEO AI. S
      • by Calydor ( 739835 )

        HR AI reduced his work load and salary.

        Also known as "John got fired."

      • by gtall ( 79522 )

        Sort of makes you wish for an Electric Monk from Dirk Gently's Holistic Detective Agency by Douglas Adams. People have toasters for toasting bread for them, TVs for showing them pictures, etc. An Electric Monk believes things for you, saving a lot of time and effort and mental anguish.

  • by theodp ( 442580 ) on Sunday August 25, 2024 @12:01AM (#64732956)

    ...is the new Mythical Man-Month [wikipedia.org]. Which begs the question, "If it takes 4,500 developer-years to bear 6,000 children [goodreads.com], how long would it take with Amazon Q?"

    • Exactly. Those 4,500 developer-years don't mean what they want you to think they mean.

      Most likely, the number was calculated by some executive who's trying to justify the tons of money he authorized to be spent on a project that didn't have much actual merit.

  • It isn't as good as Amazon CEO claims. And it isn't as bad as some skeptics make it out to be. It is a very very good auto-complete. Much better than any we have had before. And just like you don't completely trust auto-complete, you shouldn't completely trust AI suggestions either.

    • And just like you don't completely trust auto-complete, you shouldn't completely trust AI suggestions either.

      And I don't. Who does?

      That said, I don't "trust" my own handwritten code either. I test it, I have others test it, and I have others review it.

    • This is my experience too. I love how it suggests lines or blocks of code that adjusts for my variable names, or takes a pattern from somewhere else and adapts it. But it gets it "right" maybe 25% of the time or less. It's still useful, because the other 75% of the time, I can usually tweak the output, and I've just saved a significant amount of time.

  • by az-saguaro ( 1231754 ) on Sunday August 25, 2024 @02:55AM (#64733098)

    I am not a professional programmer. Like many, I can code well enough for work and school projects, but that is far from being a pro in the business. So I have no basis for understanding what the pros do at companies like Amazon. But I understand BS and what it smells like, and this post is making my nose run.

    The great American literary humorist Mark Twain famously stated that there are three kinds of untruths: lies, damned lies, and statistics. This statement, that AI did 4,500 developer-years of coding, smells like the last of these, flaunting numbers to sound impressive, but likely numbers that have been mixed, mashed, and massaged far from reality just to make a point.

    I googled "how many programmers does amazon employee" to find out that Amazon employs about 35,000 programmers. So the press release is implying that AI did the same work as fully one-eighth of their massive programmer workforce working for one year.

    Actually, the article did not say it did 4500 man-years of work. It said AI "...saved Amazon from having to pay for 4,500 developer-years of work". Shit, that could mean anything. Like, "we wrote in an annual report that we would hire 12% more programmers, then didn't, instead had AI write code - see how much that saved us!" Or, " our programmers are asking for a 12% raise, so we told them no deal and made them use AI to augment their work, at risk of permanent replacement by AI." Of course, I am just thinking of cynical scenarios.

    For those in the industry, does this sound realistic or even plausible?

    • by gtall ( 79522 )

      An oldie but goodie:

      The Plan
      In the beginning, there was a plan,
      And then came the assumptions,
      And the assumptions were without form,
      And the plan without substance,

      And the darkness was upon the face of the workers. And they spoke among themselves saying, "It is a crock of shit and it stinks."

      And the workers went unto their Supervisors and said, "It is a pile of dung, and we cannot live with the smell."

      And the Supervisors went unto their Managers saying, "It is a container of excrement, and it is very stro

  • So the same company that makes the product is citing how much it saved them? And the CEO, who has a primary duty of selling corporate product, is saying how incredible it is, and so forth?

    No one should take this information, or data, as valid. No one should believe one word.

    If there is any validity to this, then 3rd parties should be sought out for opinion. Never trust a vendor's promises or assertions about their product.

  • by trawg ( 308495 ) on Sunday August 25, 2024 @05:36AM (#64733236) Homepage

    .... claims company that sells the shovels

  • Made it easy to know to skip article.
  • ...that his company's new service is really, really good, OK?

    Of course, we can trust a CEO's announcements to be fair, unbiased, comprehensive, & balanced, right?
  • "Our developers shipped 79% of the auto-generated code reviews without any additional changes," Jassy explained. "This is a great example of how large-scale enterprises can gain significant efficiencies in foundational software hygiene work by leveraging Amazon Q."

    Running an automatic tool on a stack that is self-contained is, of course, going to be easier.

    This boast of Amazon's is like saying an economy car is as good as a Rolls-Royce when sitting at a stop light.

    Great. However, Amazon customers customize th

    • > software hygiene work

      Sounds like toilet cleaning.

      But, more seriously, given that all the AI code reviewing then had to be human-reviewed, with 21% of it rejected, wouldn't it have been more efficient just to have humans review the code in the first place?

      It seems what is really going on here is that human code review time is being saved by only looking at issues raised by the AI, and ignoring the (90-95%?) of code the AI didn't suggest changes to.

      I guess this is the "trust me bro" school of code revie

  • I'll take Things That Never Happened for $500, Alex.

  • They're not posting on slashdot, so they can't be real...

  • From Amazon CEO Andy Jassy's opening remarks in Amazon's August 1st Q2 2024 Earnings Call with Wall Street Analysts [aol.com]: "With Q's code transformation capabilities, Amazon has migrated over 30,000 Java JDK applications in a few months, saving the company $260 million and 4,500 developer years compared to what it would have otherwise cost. That's the game changer."

  • Or is it just a bunch of cupcake recipes translated into Javascript?
  • We have been trying to track down some weird problems on AWS for a while now, all on systems that haven't changed in quite some time. Here is an example: https://redmine.pfsense.org/is... [pfsense.org]
    Tracks with the new CEO and the reliance on AI.

  • Does Q work from home?

  • So they're saying they basically took billions of dollars out of the economy with zero repercussions for themselves.

    Which means a billion-dollar-plus hit to federal and local taxes and billions in lost economic activity, since those developers would have sent that earned income through the economy.

    For the rest of us this is money our companies potentially do not see, more taxes we pay, and lost employment opportunities for many, not just at Amazon but elsewhere.
