AI Programming

80% of Software Engineers Must Upskill For AI Era By 2027, Gartner Warns (itpro.com) 61

80% of software engineers will need to upskill by 2027 to keep pace with generative AI's growing demands, according to Gartner. The consultancy predicts AI will transform the industry in three phases. Initially, AI tools will boost productivity, particularly for senior developers. Subsequently, "AI-native software engineering" will emerge, with most code generated by AI. Long-term, AI engineering will rise as enterprise adoption increases, requiring a new breed of professionals skilled in software engineering, data science, and machine learning.

  • by Rosco P. Coltrane ( 209368 ) on Wednesday October 09, 2024 @05:41PM (#64852155)

    AI tools will boost productivity, particularly for senior developers. Subsequently, "AI-native software engineering" will emerge, with most code generated by AI.

    Translation: unskilled code monkeys will use AI to generate shit code that seasoned professionals will get to correct and rewrite properly, because somebody in charge who's never done software engineering in their life has been convinced by Microsoft that it's gonna be cheaper and faster somehow.

    Long-term, AI engineering will rise as enterprise adoption increases, requiring a new breed of professionals skilled in software engineering, data science, and machine learning.

    Translation: eventually AI will get good enough that nobody will be needed anymore other than types-question guys [youtu.be] and everybody will get the sack.

    • by GameboyRMH ( 1153867 ) <gameboyrmh@@@gmail...com> on Wednesday October 09, 2024 @05:50PM (#64852189) Journal

      If companies are going to let generative AI shart code straight into their applications, I can see the future looking bright for security-related roles (and black-hat hackers).

      • And any and all vulnerabilities discovered will be unfixable, because no skilled people will want to touch the shitty, auto-generated spaghetti code that would need rewriting from the ground up to fix anything.

        • No way, I'm an expert. "ChatGPT, rewrite this function removing all security vulnerabilities." I'm an expert. I'm getting paid tons while you guys all weep.
          • Hehe, well, I'm a software dev, I use ChatGPT plus my original skillset, and it works rather nicely. Call me crazy, but I feel like I'm just going to write more software and it's going to be higher quality. I've also noticed it's really good at doing things like writing documentation for code I upload. That not only saves me a metric assload of time, but makes my job more fun, because I write more meaningful code at the core of what's going on and I get to "design" more than just bang out boilerplate code snippets.
            • The Gartner Group gets paid to provide strategy consulting to companies for IT transformations. They do not get paid for just making IT things run smoother.

              This is another "the world will end" strategy marketing line from the Gartner Group. Among their achievements is strategy consulting sold to CEOs arguing that laying off tens of thousands of workers in the USA and hiring offshore workers in India or importing H-1B candidates is an effective long-term business strategy.

              The longer-term result is an innovation drain and brain drain.

              • That could be. Another possibility is that companies see less value in hiring in India and getting H-1Bs. Why? Because they can hire someone like me, I can farm out the easy & mundane parts to AI, and finally use my experience and expertise to build something that would have taken me three or four times as long before.

                The goal is to lower cost. A smart businessperson knows they can do it in multiple ways. Hiring one expensive guy can still be a bargain if your alternative is hiring a large number of cheaper ones.
                • Cloud, the most recent big IT trend, is now in the disillusionment phase, with big companies having to examine whether moving to the cloud was a cost-reduction or a business-boosting event.

                  Companies with cloud-based systems more than five years old are now having to reevaluate the cost and impact of finally needing to make large changes for new business logic, and the cost/benefit is in play.

                  • These are just "money in motion," and if money is in motion, then an army of consulting firms, experts, tax collectors, accountants, planners, etc. can make money off of the money in motion.

                    It's now to the point where whenever you see such "money in motion" language, it's a warning siren, for the same reason that a "for the children" in marketing outreach or news articles is. People want to help the children, but adding a "for the children" for clout and sympathy points is worrying.

      • by xtal ( 49134 )

        Ask GPT to be a security auditor and have it audit your code.

        It's pretty damn good.
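
        For what it's worth, that prompt pattern is easy to script. Below is a minimal sketch in Python, assuming the openai 1.x SDK, an OPENAI_API_KEY in the environment, and the "gpt-4o" model name (all of those are my assumptions, nothing the parent specified):

          # Hypothetical sketch: ask a chat model to act as a security auditor.
          # Assumes: pip install openai (1.x) and OPENAI_API_KEY set in the environment.
          from openai import OpenAI

          client = OpenAI()

          def audit_code(source: str) -> str:
              """Return the model's security review of the given source code."""
              response = client.chat.completions.create(
                  model="gpt-4o",  # assumed model name; use whatever you have access to
                  messages=[
                      {"role": "system",
                       "content": "You are a security auditor. List concrete "
                                  "vulnerabilities in the code below, with the "
                                  "affected lines and suggested fixes."},
                      {"role": "user", "content": source},
                  ],
              )
              return response.choices[0].message.content

          if __name__ == "__main__":
              with open("app.py") as f:  # placeholder path
                  print(audit_code(f.read()))

        Treat the output as a starting point for a human review, not as the audit itself.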

        • Damn good at fooling people. Had my heart nearly stop today when I heard someone at work gave, as their innovation idea, having AI scan existing code and combine it with Black Duck to find all our bugs. Which is really shorthand for him saying "I don't have any useful skills, but I'm good at running in place to make it look like I'm one of the movers!"

          We already have HUMAN intelligence, and even the dumbest human is vastly better than the state-of-the-art AI. Why not give it a chance for once?

    • by Rujiel ( 1632063 )
      "AI-native"? Seems these assholes have discovered a way to use "native" as branding for something that's actually bad.
    • Once the AI development loop closes to include automatic testing and validation, it's over.

      There is still time to get on the boat early.

    • No, translation is "Gartner is still clueless and is likely to remain in the wrong quadrant for the remainder of their existence."

    • That's a good thought, but they also want people skilled in data science, software engineering, and AI. That's quite a difficult skill set (i.e., it takes a lot of time to develop). It's not going to happen.
    • And after all that, those companies will have to hire developers with actual talent to untangle the mess generated by the types-question guys typing bad questions into their AIs.

  • Companies license "AI" stuff in the hope that it allows them to require fewer, less-skilled, and primarily cheaper personnel. You will not find HR departments asking for "upskilled" hires after their company licensed $$$$$ "AI" services.
    • The corporate dream is to fire the army of dead-weight developers and testers, and hire a handful of AI rockstars to do all the work.
      Good F'ing luck!

      • It's a very old story, and it hasn't really worked out well yet. "Fire the army of those we assume are dead weight employees and hire a $XYZ to do all the work, while we sit back and pat each other on the back."

        I've seen it personally. We get a new CEO, and we're all too fucking stupid to make IoT products (even though that's what we do), and he brings in his special team that managed to do nothing, and be late with the nothing, while the rest of us kept marching along and earning the bulk of the profits. The

      • by sjames ( 1099 )

        Among other problems, they won't be offering rockstar pay for that.

        • I'm a software developer making rockstar pay and using AI to speed things up, control quality, audit for bugs, etc. If I were fully running a company that needed software written, rather than partnering with one, I'd want a guy like me, and I'd definitely rather pay for that wizard in a cave (even if he's got an LLM in there with him) than for the 1000-monkeys-plus-LLM model.
  • by nightflameauto ( 6607976 ) on Wednesday October 09, 2024 @05:52PM (#64852199)

    I'm thinking the only relevant path now is get the FUCK out of software engineering. Move to management. Move to practical engineering. Basically, move to anything not directly tech related. The AI obsession will subsume real development work, whether we like it or not. The tech oligarchs have spoken. It *WILL* happen, because that's how they will tighten their grip on technology and bend what's left of society not already suckling at their teat of treachery and bullshit to their will.

    • I enjoyed some of the work I did in tech and I was good at it, but these effects of the AI obsession and the instability that has revealed itself in the now year-long and seemingly unending wave of layoffs make me want to get out of the industry and never look back, for the sake of future career stability. I've been applying to about 50/50 tech and non-tech jobs for a while now, and most of the really appealing jobs I've seen are non-tech.

    • It will be disruptive, but unless it produces useful results companies will eventually ditch it in favor of traditional approaches that work even if they are more expensive. It doesn't matter if you saved the company a ton of money by firing developers and reducing labor costs by 80% if the AI produces shit code and all of your customers leave and reduce your revenue by 100%.

      The developers who are laid off still have their skillsets. Given a lack of other gainful employment I wonder how many will start t
    • Re:Upskill? (Score:4, Insightful)

      by Brain-Fu ( 1274756 ) on Wednesday October 09, 2024 @06:42PM (#64852329) Homepage Journal

      Gartner doesn't have a crystal ball. This article is hype and speculation.

      In order for these predictions to come true, code-generating AI will have to become a whole lot better than it is now. Not just a little better. Not just a natural linear progression which we can clearly plan for and expect to achieve. It needs to take an essential "step up" in how it functions before it will be able to deliver at the quality level being promised here.

      Of course, articles with claims like this will get a lot of attention, and hence bring in ad revenue, which is why it was written.

      For those of us living in reality, it makes sense to continue steering our careers as we have been. Moving up to management may be wise just for the pay bump, and many software engineers do this anyway once they have the opportunity. Getting out of tech is a good idea for anyone who doesn't like it. But this all-hype future shouldn't motivate anything more than natural curiosity.

      It's not like we will be "left behind" if we just keep working our current jobs. Once this prediction actually comes true (if ever), we can skill up at that time. Until then, we can just adapt to the reality we presently face.

      • AI will have to become a whole lot better than it is now

        Yep. I use it daily and it's simply not good enough yet. It's helpful and does helpful things (esp writing code documentation) but it's not a panacea for labor costs on coders.

        It needs to take an essential "step up" in how it functions

        I also agree on this. It's clumsy. CoPilot included. It can write functions, snippets, and not more than about 100 lines before major bugs are apparent. It cannot even come close to writing large complex programs with inter-dependencies and multiple data sources, etc... I find it's pretty decent at automating boilerplate code and it
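
        The documentation use case in particular is easy to wire up. A rough sketch along the same lines as the auditor idea upthread, with the same caveats (openai 1.x SDK, OPENAI_API_KEY set, "gpt-4o" as an assumed model name, nothing vendor-blessed about the prompt):

          # Hypothetical sketch: have a chat model draft docstrings for an existing module.
          from openai import OpenAI

          client = OpenAI()

          def draft_docstrings(source: str) -> str:
              """Return the module text with docstrings added; the model is told not to touch the code."""
              response = client.chat.completions.create(
                  model="gpt-4o",  # assumed model name
                  messages=[
                      {"role": "system",
                       "content": "Add accurate docstrings to this Python module. "
                                  "Do not change any executable code. Return only "
                                  "the updated source."},
                      {"role": "user", "content": source},
                  ],
              )
              return response.choices[0].message.content

          if __name__ == "__main__":
              with open("module.py") as f:  # placeholder path
                  print(draft_docstrings(f.read()))

        The draft still needs a human pass; the value is in saving the boilerplate typing, not in skipping review.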

    • Jump over to COBOL and Ada. The old programmers still work with real intelligence and real tech, instead of the artificial kind.

  • This is not new (Score:5, Insightful)

    by MpVpRb ( 1423381 ) on Wednesday October 09, 2024 @05:53PM (#64852203)

    I learned programming in the 70s with punchcards on mainframes.
    Being a programmer has always required constant learning, and being able to teach yourself is one of the most important skills to master.

    • I learned programming in the 70s with punchcards on mainframes

      In HS, paper punched tape [wikipedia.org] on a Teletype Model 33 [wikipedia.org] (or similar) over a dial telephone with an acoustic coupler [wikipedia.org].

      Second year in university, punch cards on an IBM 4381 running MUSIC/SP [wikipedia.org] for a FORTRAN class as an EE student. Switched to CS after that and got access to the VAX 11/785 running 4.3BSD and ASCII terminals.

  • ... or code-monkeys?

    Proper software engineering, i.e. systems engineering applied to software systems, isn't going to care whether it's AI-generated code or not.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Proper software engineering, i.e. systems engineering applied to software systems, isn't going to care whether it's AI-generated code or not.

      Err, no. One of the major differences between coding and software engineering is that you often need to prove the code is correct. In the same way that a real-time OS isn't made to be faster, it's made to operate predictably and reliably.

      In safety-critical scenarios it's not enough that something works, you need to understand and document how it works.

      • A big part of any project is the process. The more critical projects have a stunning amount of formal process. Maybe someone can train a machine to do it. I think for now we would be happy to have a machine that checks that the humans are following the process, and maybe even offers suggestions.
        At the end of the day I need a human being in the driver's seat taking responsibility for the decisions.

      • by Vylen ( 800165 )

        Proper software engineering, i.e. systems engineering applied to software systems, isn't going to care whether it's AI-generated code or not.

        Err, no. One of the major differences between coding and software engineering is that you often need to prove the code is correct. In the same way that a real-time OS isn't made to be faster, it's made to operate predictably and reliably.

        In safety-critical scenarios it's not enough that something works, you need to understand and document how it works.

        Uh, yeah? You've successfully described systems engineering.

        Doesn't matter if the code is generated by an artificial intelligence or an actual idiot, the same processes apply. Unit test, system test, document, verify, etc, as dictated by your engineering management system.

  • by Anonymous Coward

    All you have to do is the exact opposite of whatever Gartner thinks you should do.

  • Gartner (Score:5, Insightful)

    by abulafia ( 7826 ) on Wednesday October 09, 2024 @06:03PM (#64852223)

    Gosh, I will have to take time off from my upskilling to stay relevant in the Metaverse

    https://www.gartner.com/en/new... [gartner.com]

    to fit in my upskilling to stay relevant with AI.

    Good thing I was totally on top of the Transformational Impact of shitcoin:

    https://www.gartner.com/en/new... [gartner.com]

  • by dfghjk ( 711126 ) on Wednesday October 09, 2024 @06:05PM (#64852227)

    That article was filled with unprecedented amounts of stupid, with no indication that the author knew the slightest thing about what he was writing, yet he was paid for that "work" and there will be no accountability, because nothing falsifiable was even said. Pure bullshit, content-free.

    • It's sort of like how every car company has a million awards from JD Power & Associates. Everybody is #1 at something, even if it's being #2, or taking a #2.

      Ah, yes, your shitty ECM system is the best value visionary niche player in this gartner quadrant.

  • So, after pushing the "AI will take over the world" narrative for several years, Gartner extends the same to software engineers? Need to keep the hype up, guys.

  • Probably written by AI
  • This website doesn’t even support basic Unicode from decades ago; its 1980s backend will vomit emojibake over port 80 in response to this perfectly normal message from “the future”.
  • A new job will emerge: "fixer of bullshit insane code generated by AI". It'll be called a DevAIOps Engineer.

    • We've seen this cycle before. Ever had to fix Entity Framework-generated SQL queries? It's a lot harder than fixing regular SQL queries, because it creates such crap. AI will do this kind of stuff on steroids.

  • ... a new breed of professionals ...

    Everyone will have AI (like a phone), so everyone will need to know how to use its services (e.g. Facebook, WhatsApp, Instagram, TikTok, X/Twitter).

    To translate the translation: Everyone needs to be a better consumer of this new technology.

    This isn't about a much-touted paradigm shift, change in cost/efficiencies, or even the consequence/convenience of ubiquitous devices. This is advertising.

  • I am forced to work from home due to disability, so getting a job is extremely difficult and despite 30 years experience, it's been many months now with only one (unsuccessful) job interview. There are simply no programming jobs left.

    That's pretty astonishing, since before the ChatGPT reveal, it was never more than a month before I had a new job. I expect to be unemployed permanently now.

  • up skill to vote union!

  • Remember how we were all gonna be out of work because visual programming paradigms were going to obviate the need for old school "identify and implement appropriate layers of abstraction" programmers?

  • Kind of like learning how to formulate a Google search, 20 years ago?

    A lot of people still don't know how to do it. But if you know how to get good info out of Google, you're probably going to be all right asking AI for stuff.

  • I am approaching 30 years of Gartner fluff. To this day I have never heard someone say, "This is why we follow Gartner's guidance and why we are winning."

    Why can't AI just read the headlines and blogs and play futurist? I bet it would be as good as or better than Gartner.

  • "AI" will be a thing of the past by then, just like every other passing fad... such as blockchains, the metaverse, NFT's, and other corporate trends in tech.
  • The software engineer skills up to use AI.
    Software Engineer: AI, write a function to do X in language Y. Here are the input parameters; here is what the output should be.

    The future:
    Software Development Manager: AI, write me a platform that allows me to fire all my Software Engineers.
  • The job of a software engineer is to constantly learn new skills and tools and apply them to solve problems.

    Coders are tools software engineers use to pump out code.

    So, an engineer is like the person who designed the car, the coder is the assembly-line worker, and the AI is the tool that makes it so we need half the engineers and 1/50th the assembly-line workers.
