AI Programming

80% of Software Engineers Must Upskill For AI Era By 2027, Gartner Warns (itpro.com) 49

80% of software engineers will need to upskill by 2027 to keep pace with generative AI's growing demands, according to Gartner. The consultancy predicts AI will transform the industry in three phases. Initially, AI tools will boost productivity, particularly for senior developers. Subsequently, "AI-native software engineering" will emerge, with most code generated by AI. Long-term, AI engineering will rise as enterprise adoption increases, requiring a new breed of professionals skilled in software engineering, data science, and machine learning.


Comments Filter:
  • by Rosco P. Coltrane ( 209368 ) on Wednesday October 09, 2024 @05:41PM (#64852155)

    AI tools will boost productivity, particularly for senior developers. Subsequently, "AI-native software engineering" will emerge, with most code generated by AI.

    Translation: unskilled code monkeys will use AI to generate shit code that seasoned professionals will get to correct and rewrite properly, because somebody in charge who's never done software engineering in their life has been convinced by Microsoft that it's gonna be cheaper and faster somehow.

    Long-term, AI engineering will rise as enterprise adoption increases, requiring a new breed of professionals skilled in software engineering, data science, and machine learning.

    Translation: eventually AI will get good enough that nobody will be needed anymore other than types-question guys [youtu.be] and everybody will get the sack.

    • by GameboyRMH ( 1153867 ) <gameboyrmh@@@gmail...com> on Wednesday October 09, 2024 @05:50PM (#64852189) Journal

      If companies are going to let generative AI shart code straight into their applications, I can see the future looking bright for security-related roles (and black-hat hackers).

      • And any and all vulnerabilities discovered will be unfixable, because no skilled people will want to touch the shit, auto-generated spaghetti code that would need rewriting from the ground up to fix anything.

        • No way, I'm an expert. "ChatGPT, rewrite this function removing all security vulnerabilities." I'm an expert. I'm getting paid tons while you guys all weep.
      • by xtal ( 49134 )

        Ask GPT to be a security auditor and have it audit your code.

        It's pretty damn good.
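        A minimal sketch of that kind of prompt-driven audit, assuming the OpenAI Python client (the model name, prompt wording, and example snippet are placeholders for illustration, not anything from this thread):

```python
# Hypothetical sketch: ask an LLM to play security auditor for one snippet.
# Treat its answer as a hint to investigate, not as a verdict.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

snippet = '''
def get_user(conn, username):
    # naive string formatting -- classic SQL injection bait
    return conn.execute("SELECT * FROM users WHERE name = '%s'" % username)
'''

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder model name
    messages=[
        {
            "role": "system",
            "content": (
                "You are a security auditor. List likely vulnerabilities in "
                "the code you are given, with severity and a suggested fix."
            ),
        },
        {"role": "user", "content": snippet},
    ],
)

print(response.choices[0].message.content)
```

        It should flag the obvious SQL injection in that toy snippet; whether you can trust it on anything less obvious is exactly what the replies below argue about.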

        • Damn good at fooling people. My heart nearly stopped today when I heard someone at work pitch, as their innovation idea, having AI scan our existing code and combine it with Black Duck to find all our bugs. Which is really shorthand for him saying "I don't have any useful skills but I'm good at running in place to make it look like I'm one of the movers!"

          We already have HUMAN intelligence, and even the dumbest human is vastly better than the state-of-the-art AI, so why not give it a chance for once?

    • by Rujiel ( 1632063 )
      "AI-native"? Seems these assholes have discovered a way to use "native" as branding for something that's actually bad.
    • Other translation: computing power is so cheap that a "hello world" now requires 15 GB of RAM and three virtual machines to run, and *no one cares*, or even notices.

    • Once the AI development loop closes to include automatic testing and validation it's over.

      There is still time to get on the boat early.

    • No, translation is "Gartner is still clueless and is likely to remain in the wrong quadrant for the remainder of their existence."

    • That's a good thought, but they also want people skilled in data science, software engineering and AI. That's quite a difficult skill set (i.e., it takes a lot of time to develop). It's not going to happen.
    • And after all that, those companies will have to hire developers with actual talent to untangle the mess generated by the types-question guys typing bad questions into their AIs.

  • Companies license "AI" stuff in the hope that it lets them get by with fewer, less skilled, and above all cheaper personnel. You will not find HR departments asking for "upskilled" hires after their company licensed $$$$$ "AI" services.
    • The corporate dream is to fire the army of dead-weight developers and testers and hire a handful of AI rockstars to do all the work.
      Good F'ing luck!

      • It's a very old story, and it hasn't really worked out well yet. "Fire the army of those we assume are dead weight employees and hire a $XYZ to do all the work, while we sit back and pat each other on the back."

        I've seen it personally. We get a new CEO, and we're all too fucking stupid to make IoT products (even though that's what we do), and he brings in his special team that managed to do nothing, and be late with the nothing, while the rest of us kept marching along and earning the bulk of the profits. The

      • by sjames ( 1099 )

        Among other problems, they won't be offering rockstar pay for that.

  • I'm thinking the only relevant path now is to get the FUCK out of software engineering. Move to management. Move to practical engineering. Basically, move to anything not directly tech-related. The AI obsession will subsume real development work, whether we like it or not. The tech oligarchs have spoken. It *WILL* happen, because that's how they will tighten their grip on technology and bend what's left of society not already suckling at their teat of treachery and bullshit to their will.

    • I enjoyed some of the work I did in tech and I was good at it, but these effects of the AI obsession and the instability that has revealed itself in the now year-long and seemingly unending wave of layoffs make me want to get out of the industry and never look back, for the sake of future career stability. I've been applying to about 50/50 tech and non-tech jobs for a while now, and most of the really appealing jobs I've seen are non-tech.

    • It will be disruptive, but unless it produces useful results companies will eventually ditch it in favor of traditional approaches that work even if they are more expensive. It doesn't matter if you saved the company a ton of money by firing developers and reducing labor costs by 80% if the AI produces shit code and all of your customers leave and reduce your revenue by 100%.

      The developers who are laid off still have their skillsets. Given a lack of other gainful employment I wonder how many will start t
    • Gartner doesn't have a crystal ball. This article is hype and speculation.

      In order for these predictions to come true, code-generating AI will have to become a whole lot better than it is now. Not just a little better. Not just the natural linear progression we can plan for and expect to achieve. It needs to take an essential "step up" in how it functions before it will be able to deliver at the quality level that is being promised here.

      Of course, articles with claims like this will get a

    • Jump over to COBOL and Ada. The old programmers still work with real intelligence and real tech, instead of the artificial kind.

  • This is not new (Score:5, Insightful)

    by MpVpRb ( 1423381 ) on Wednesday October 09, 2024 @05:53PM (#64852203)

    I learned programming in the 70s with punchcards on mainframes
    Being a programmer has always required constant learning, and being able to teach yourself is one of the most important skills to master

    • I learned programming in the 70s with punchcards on mainframes

      In HS, paper punched tape [wikipedia.org] on a Teletype Model 33 [wikipedia.org] (or similar) over dial-telephone with an acoustic coupler [wikipedia.org].

      Second year in university, punch cards on an IBM 4381 running MUSIC/SP [wikipedia.org] for a FORTRAN class as an EE student. Switched to CS after that and got access to the VAX-11/785 running 4.3BSD and ASCII terminals.

  • ... or code-monkeys?

    Proper software engineering, i.e. systems engineering applied to software systems, isn't going to care whether it's AI-generated code or not.

    • by Anonymous Coward

      Proper software engineering, i.e. systems engineering applied to software systems, isn't going to care whether it's AI-generated code or not.

      Err, no. One of the major differences between coding and software engineering is that you often need to prove the code is correct. In the same way that a real-time OS isn't made to be faster, it's made to operate predictably and reliably.

      In safety-critical scenarios it's not enough that something works, you need to understand and document how it works.

      • A big part of any project is the process. The more critical projects have a stunning amount of formal process. Maybe someone can train a machine to do it. I think for now we would be happy to have a machine that checks that the humans are following the process, and maybe even offers suggestions.
        At the end of the day I need a human being in the driver's seat taking responsibility for the decisions.

  • by Anonymous Coward

    All you have to do is the exact opposite of whatever Gartner thinks you should do.

  • Gartner (Score:5, Insightful)

    by abulafia ( 7826 ) on Wednesday October 09, 2024 @06:03PM (#64852223)

    Gosh, I will have to take time off from my upskilling to stay relevant in the Metaverse

    https://www.gartner.com/en/new... [gartner.com]

    to fit in my upskilling to stay relevant with AI.

    Good thing I was totally on top of the Transformational Impact of shitcoin:

    https://www.gartner.com/en/new... [gartner.com]

  • by dfghjk ( 711126 ) on Wednesday October 09, 2024 @06:05PM (#64852227)

    That article was filled with unprecedented amounts of stupid, with no indication that the author knew the slightest thing about what he was writing, yet he was paid for that "work" and there will be no accountability because nothing falsifiable was even said. Pure bullshit, content-free.

    • It's sort of like how every car company has a million awards from JD Power & Associates. Everybody is #1 at something, even if it's being #2, or taking a #2.

      Ah, yes, your shitty ECM system is the best value visionary niche player in this Gartner quadrant.

  • So, after pushing the "AI will take over the world" narrative for several years, Gartner extends the same to software engineers? Need to keep the hype up, guys.

  • Probably written by AI
  • This website doesn’t even support basic Unicode from decades ago — its 1980s backend will vomit emojibake over port 80 in response to this perfectly normal message from “the future”.
  • A new job will emerge: "fixer of bullshit insane code generated by AI". It'll be called a DevAIOps Engineer.

    • We've seen this cycle before. Ever had to fix Entity Framework-generated SQL queries? It's a lot harder than fixing regular SQL queries, because it creates such crap. AI will do this kind of stuff on steroids.

  • ... a new breed of professionals ...

    Everyone will have AI (like a phone), so everyone will need to know how to use its services (e.g. Facebook, WhatsApp, Instagram, TikTok, X/Twitter).

    To translate the translation: Everyone needs to be a better consumer of this new technology.

    This isn't about a much-touted paradigm shift, change in cost/efficiencies, or even the consequence/convenience of ubiquitous devices. This is advertising.

  • I am forced to work from home due to disability, so getting a job is extremely difficult, and despite 30 years' experience it's been many months now with only one (unsuccessful) job interview. There are simply no programming jobs left.

    That's pretty astonishing, since before the ChatGPT reveal, it was never more than a month before I had a new job. I expect to be unemployed permanently now.

  • Upskill to vote union!

  • Remember how we were all gonna be out of work because visual programming paradigms were going to obviate the need for old-school "identify and implement appropriate layers of abstraction" programmers?

  • Kind of like learning how to formulate a Google search, 20 years ago?

    A lot of people still don't know how to do it. But if you know how to get good info out of Google, you're probably going to be all right asking AI for stuff.

  • I am approaching 30 years of Gartner fluff. To this day I have never heard someone say, "This is why we follow Gartner's guidance and why we are winning."

    Why can't AI just read the headlines and blogs and play futurist? I bet they'd be as good or better than Gartner.

  • "AI" will be a thing of the past by then, just like every other passing fad... such as blockchains, the metaverse, NFT's, and other corporate trends in tech.
