80% of Software Engineers Must Upskill For AI Era By 2027, Gartner Warns (itpro.com) 108

80% of software engineers will need to upskill by 2027 to keep pace with generative AI's growing demands, according to Gartner. The consultancy predicts AI will transform the industry in three phases. Initially, AI tools will boost productivity, particularly for senior developers. Subsequently, "AI-native software engineering" will emerge, with most code generated by AI. Long-term, AI engineering will rise as enterprise adoption increases, requiring a new breed of professionals skilled in software engineering, data science, and machine learning.
This discussion has been archived. No new comments can be posted.

  • by Rosco P. Coltrane ( 209368 ) on Wednesday October 09, 2024 @04:41PM (#64852155)

    AI tools will boost productivity, particularly for senior developers. Subsequently, "AI-native software engineering" will emerge, with most code generated by AI.

    Translation: unskilled code monkeys will use AI to generate shit code that seasoned professionals will get to correct and rewrite properly, because somebody in charge who's never done software engineering in their life has been convinced by Microsoft that it's gonna be cheaper and faster somehow.

    Long-term, AI engineering will rise as enterprise adoption increases, requiring a new breed of professionals skilled in software engineering, data science, and machine learning.

    Translation: eventually AI will get good enough that nobody will be needed anymore other than types-question guys [youtu.be] and everybody will get the sack.

    • by GameboyRMH ( 1153867 ) <gameboyrmh@@@gmail...com> on Wednesday October 09, 2024 @04:50PM (#64852189) Journal

      If companies are going to let generative AI shart code straight into their applications, I can see the future looking bright for security-related roles (and black-hat hackers).

      • And any and all vulnerabilities discovered will be unfixable, because no skilled people will want to touch the shitty, auto-generated spaghetti code that would need rewriting from the ground up to fix anything.

        • by phantomfive ( 622387 ) on Wednesday October 09, 2024 @07:59PM (#64852671) Journal
          No way, I'm an expert. "ChatGPT, rewrite this function removing all security vulnerabilities." I'm an expert. I'm getting paid tons while you guys all weep.
            • Hehe, well, I'm a software dev, I use ChatGPT plus my original skillset and it works rather nicely. Call me crazy, but I feel like I'm just going to write more software and it's going to be higher quality. I also have noticed it's really good at doing things like writing documentation for code I upload. That not only saves me a metric assload of time, but makes my job more fun because I write more meaningful code at the core of what's going on and I get to "design" more than just bang out boilerplate code snippets I've written variations on more than 1000 times.
            • Gartner group gets paid to provide strategy consulting to companies for IT transformations. They do not get paid for just making IT things run smoother.

              This is another "world will end" strategy marketing line from the Gartner Group. Among their achievements is strategy consulting sold to CEOs claiming that laying off tens of thousands of workers in the USA and hiring offshore Indian workers or importing H-1B candidates is an effective long-term business strategy.

              The longer-term result is an innovation drain and brain drain.

              • That could be. Another possibility is that companies see less value in hiring in India and getting H1B's. Why? Because they can hire someone like me, I can farm out the easy & mundane parts to AI, and finally use my experience and expertise to build something that would have taken me three or four times as long before.

                The goal is to lower cost. A smart businessperson knows they can do it in multiple ways. Hiring one expensive guy can still be a bargain if your alternative is hiring a large number of cheap ones.
                • Cloud, the most recent big IT trend, is now in the disillusionment phase, with big companies having to examine whether moving to the cloud was a cost reduction or a business-boosting event.

                  Many cloud-based systems over five years old are now due for a reevaluation of the cost and impact of finally needing to make large changes for new business logic, and the cost/benefit is in play.

                  • These are just "money in motion," and if money is in motion, then an army of consulting firms, experts, tax collectors, accountants, planners, etc. can make money off of the money in motion.

                    It's now to the point where seeing such "money in motion" equivalents is a warning siren, for the same reason that "for the children" is in marketing outreach and news articles. People want to help the children, but adding a "for the children" for clout and sympathy points is worrying.

                  • Cloud, the most recent big IT trend, is now in the disillusionment phase, with big companies having to examine whether moving to the cloud was a cost reduction or a business-boosting event.

                    Many cloud-based systems over five years old are now due for a reevaluation of the cost and impact of finally needing to make large changes for new business logic, and the cost/benefit is in play.

                    And at least a few of us said the cloud was a really bad idea even back then, when the cloud was perfectly secure, always available, the final solution for computing.

                • by zlives ( 2009072 )

                  nah, India/wherever else will always be cheaper than you.

            • I also have noticed it's really good at doing things like writing documentation for code I upload. That not only saves me a metric assload of time, but makes my job more fun

              Yeah no shit, but now whoever comes after you has to read through that garbage and figure out what is going on. Documentation isn't for you it's for the people who come after you.

              more than just bang out boilerplate code snippets I've written variations on more than 1000 times.

              Stop writing boilerplate. Encapsulate it in a function.
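              As a rough illustration of what "encapsulate it in a function" can look like in practice, here is a minimal Python sketch; the helper name and the CSV example are mine, not anything from the posts above.

                # Illustrative only: the repeated open/strip/split setup that would otherwise
                # be copy-pasted at every call site is pulled into one small helper.
                from typing import Iterator, List

                def read_csv_rows(path: str, encoding: str = "utf-8") -> Iterator[List[str]]:
                    """Yield the non-empty lines of a simple comma-separated file as field lists."""
                    with open(path, encoding=encoding) as handle:
                        for line in handle:
                            line = line.strip()
                            if line:
                                yield line.split(",")

                # Call sites now state intent instead of repeating setup, e.g.:
                # for row in read_csv_rows("orders.csv"):
                #     print(row[0])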

              • Documentation isn't for you it's for the people who come after you.

                There won't be any. Nobody will read this stuff in my case, it's just required. If anyone does work on this codebase in the future it'll be me. It's almost a certainty in this case.

                Stop writing boilerplate. Encapsulate it in a function.

                That's not always possible as one switches code bases. Sometimes it's due to licenses. Sometimes it's due to customers having different standards. Other times it's because there is too much variation. I've been coding for more than 15 minutes. I'm aware of functions and use them where possible.

              • Some boilerplate is an inevitable response to questionable architecture forced upon you by libraries.

                You can't not make your Django boilerplate if you add a new site to your project, for example.

            • by zlives ( 2009072 )

              and i quote
              "Initially, AI tools will boost productivity"
              but that does not mean enshittification hasn't already taken place around you

      • by xtal ( 49134 )

        Ask GPT to be a security auditor and have it audit your code.

        It's pretty damn good.
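        For anyone who wants to try that suggestion, a minimal sketch using the OpenAI Python SDK (v1+); the model name, prompt wording, and helper function are assumptions of mine, not anything the poster specified.

          # Hypothetical helper: ask a chat model to act as a security auditor for a snippet.
          # Requires the openai package and an OPENAI_API_KEY in the environment.
          from openai import OpenAI

          client = OpenAI()

          def audit_code(source: str) -> str:
              """Return the model's security review of the given source code."""
              response = client.chat.completions.create(
                  model="gpt-4o",  # assumed model; substitute whatever you have access to
                  messages=[
                      {"role": "system",
                       "content": "You are a security auditor. Review the following code, "
                                  "list any vulnerabilities, and suggest fixes."},
                      {"role": "user", "content": source},
                  ],
              )
              return response.choices[0].message.content

          if __name__ == "__main__":
              print(audit_code("query = \"SELECT * FROM users WHERE name = '\" + name + \"'\""))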

          • Damn good at fooling people. My heart nearly stopped today when I heard someone at work offer, as their innovation idea, having AI scan existing code and combine it with Black Duck to find all our bugs. Which is really shorthand for him saying "I don't have any useful skills, but I'm good at running in place to make it look like I'm one of the movers!"

          We already have HUMAN intelligence, and even the dumbest human is vastly better than the state-of-the-art AI; why not give it a chance for once?

    • by Rujiel ( 1632063 )
      "AI-native"? Seems these assholes have discovered a way to use "native" as branding for something that's actually bad.
    • Once the AI development loop closes to include automatic testing and validation, it's over.

      There is still time to get on the boat early.

    • by Darinbob ( 1142669 ) on Wednesday October 09, 2024 @07:54PM (#64852655)

      No, translation is "Gartner is still clueless and is likely to remain in the wrong quadrant for the remainder of their existence."

      • by gtall ( 79522 )

        Yes, but Gartner could also become quite useful. A company could train a bot on their previous "reports" and then use it to produce new "reports". I doubt anything Gartner produces is really insightful, just the usual MBA-Speak to make CEOs feel like they are leading their companies.

        • by Darinbob ( 1142669 ) on Thursday October 10, 2024 @01:05PM (#64854517)

          The concept of Gartner isn't terrible; it's just that Gartner is so terrible at what they claim to do. Gartner doesn't really do analysis of actual technology, they just do analysis of marketing literature, which is not a reliable data set. You're better off having the CEO visit a palm reader than relying on Gartner's analysis. Plus, I've seen evidence that you can just pay off Gartner to get into a different quadrant, though I'm sure you can do that with the palm readers too.

    • That's a good thought, but they also want people skilled in data science, software engineering and AI. That's quite a difficult skill set (ie, it takes a lot of time to develop). It's not going to happen.
    • And after all that, those companies will have to hire developers with actual talent to untangle the mess generated by the types-question guys typing bad questions into their AIs.

    • ... and about to be replaced by software bots epic-style shit-talks "unskilled code monkeys" and thinks he is exempt because he is such a 133+ coder. LOL!

      In a nutshell: Please go away or I will replace you with a very small AI prompt. Like, right now.

    • AI tools will boost productivity, particularly for senior developers. Subsequently, "AI-native software engineering" will emerge, with most code generated by AI.

      Translation: unskilled code monkeys will use AI to generate shit code that seasoned professionals will get to correct and rewrite properly, because somebody in charge who's never done software engineering in their life has been convinced by Microsoft that it's gonna be cheaper and faster somehow.

      It has been a very long time since a person could leave college and end their learning process the day they graduated.

      I've been upskilling since the 1970s. The pace of technology has required lifelong learning from back then to now. And really, that should be enjoyable. Who doesn't like to learn new things, especially when it can fatten your wallet?

    • by whitroth ( 9367 )

      All of which is, of course, BS. A friend's losing his long-term job; the company's going to use AI and send the results to a group of really incompetent programmers in India, whose code he's had to fix, and to whom he's had to explain what's wrong, all along.

  • Companies license "AI" stuff in the hope that it allows them to require fewer, less skilled, and primarily cheaper personnel. You will not find HR departments asking for "upskilled" hires after their company licensed $$$$$ "AI" services.
    • The corporate dream is to fire the army of dead-weight developers and testers and hire a handful of AI rockstars to do all the work.
      Good F'ing luck!

      • It's a very old story, and it hasn't really worked out well yet. "Fire the army of those we assume are dead weight employees and hire a $XYZ to do all the work, while we sit back and pat each other on the back."

        I've seen it personally. We get a new CEO, and we're all too fucking stupid to make IoT products (even though that's what we do), and he brings in his special team that managed to do nothing, and be late with the nothing, while the rest of us kept marching along and earning the bulk of the profits. The

        • by gtall ( 79522 )

          Nonsense, the new team produced volumes of reports, bullet points, talking points, and other sparkling things the CEO could point at as evidence of his new "directions" for the company.

          • Yes, we always need more documentation to prove that we're productive! We can even hire people that do nothing but document stuff they don't understand, and many offshoring agencies specialize in the ability to take your money in return for reducing productivity while increasing documentation bulk.

      • by sjames ( 1099 )

        Among other problems, they won't be offering rockstar pay for that.

        • I'm a software developer making rockstar pay and using AI to speed things up, control quality, audit for bugs, etc. If I were fully running a company that needed software written, rather than partnering with one, I'd want a guy like me, and I'd definitely rather pay for that wizard in a cave (even if he's got an LLM in there with him) than for the 1000-monkeys-plus-LLM model.
          • Some of the people on my team make $200K-$400K base. Even with such salaries, it has not been easy for us to hire more people. On top of that, one guy announced his early retirement last month.
            Staffing is a real uphill battle at almost every level. Well, except for the short-sighted companies willing to accept a high churn rate. Of course, the high churn rate means they get to watch the company's knowledge float over to the competition. Maybe that can't happen with AI. Until some idiot C-suite sells your data

          • by sjames ( 1099 )

            The difference is you're not looking in the bargain basement in the first place. Companies actually willing to pay rockstar salaries for rockstar performance don't replace competent developers with AI assisted monkeys and then hire a rockstar to sweep up after them.

  • Upskill? (Score:5, Interesting)

    by nightflameauto ( 6607976 ) on Wednesday October 09, 2024 @04:52PM (#64852199)

    I'm thinking the only relevant path now is get the FUCK out of software engineering. Move to management. Move to practical engineering. Basically, move to anything not directly tech related. The AI obsession will subsume real development work, whether we like it or not. The tech oligarchs have spoken. It *WILL* happen, because that's how they will tighten their grip on technology and bend what's left of society not already suckling at their teat of treachery and bullshit to their will.

    • I enjoyed some of the work I did in tech and I was good at it, but these effects of the AI obsession and the instability that has revealed itself in the now year-long and seemingly unending wave of layoffs make me want to get out of the industry and never look back, for the sake of future career stability. I've been applying to about 50/50 tech and non-tech jobs for a while now, and most of the really appealing jobs I've seen are non-tech.

    • It will be disruptive, but unless it produces useful results companies will eventually ditch it in favor of traditional approaches that work even if they are more expensive. It doesn't matter if you saved the company a ton of money by firing developers and reducing labor costs by 80% if the AI produces shit code and all of your customers leave and reduce your revenue by 100%.

      The developers who are laid off still have their skillsets. Given a lack of other gainful employment I wonder how many will start their own companies.
      • The developers who are laid off still have their skillsets. Given a lack of other gainful employment I wonder how many will start their own companies. Perhaps they might wind up competing against their former employer given they have some understanding of the business need for whatever they used to get paid to develop code for. Maybe they've secretly been wishing for a chance to start fresh instead of having to support a shitty legacy code base.

        This reminds me of the dot-com wave during the bust period at the end. At the last place I worked, around 1998-99, there was a guy who got shoved out because "techies are everywhere and your salary is just too high." Two years and some change later, he was working as a consultant at his same job, making about four times more.

        The problem is this time around it feels like management is all aboard the hype cycles, and not paying any attention to the fact that it's not really producing viable results yet. "It w

    • Re:Upskill? (Score:5, Insightful)

      by Brain-Fu ( 1274756 ) on Wednesday October 09, 2024 @05:42PM (#64852329) Homepage Journal

      Gartner doesn't have a crystal ball. This article is hype and speculation.

      In order for these predictions to come true, code-generating AI will have to become a whole lot better than it is now. Not just a little better. Not just a natural linear progression that we can clearly plan for and expect to achieve. It needs to take an essential "step up" in how it functions before it will be able to deliver at the quality level that is being promised here.

      Of course, articles with claims like this will get a lot of attention, and hence bring in ad revenue, which is why it was written.

      For those of us living in reality, it makes sense to continue steering our careers as we have been. Moving up to management may be wise just for the pay bump, and many software engineers do this anyway once they have the opportunity. Getting out of tech is a good idea for anyone who doesn't like it. But this all-hype future shouldn't motivate anything more than natural curiosity.

      It's not like we will be "left behind" if we just keep working our current jobs. Once this prediction actually comes true (if ever), we can skill up at that time. Until then, we can just adapt to the reality we presently face.

      • AI will have to become a whole lot better than it is now

        Yep. I use it daily and it's simply not good enough yet. It's helpful and does helpful things (esp writing code documentation) but it's not a panacea for labor costs on coders.

        It needs to take an essential "step up" in how it functions

        I also agree on this. It's clumsy, Copilot included. It can write functions and snippets, but not more than about 100 lines before major bugs are apparent. It cannot even come close to writing large, complex programs with inter-dependencies and multiple data sources, etc. I find it's pretty decent at automating boilerplate code and it

    • Jump over to COBOL and Ada. The old programmers still work with real intelligence and real tech, instead of the artificial kind.

    • by gweihir ( 88907 )

      It will also all get scrapped again because it will create one catastrophe after the other. So surviving in some protected niche and then being in huge demand when it all comes crashing down might also be a viable strategy.

  • This is not new (Score:5, Insightful)

    by MpVpRb ( 1423381 ) on Wednesday October 09, 2024 @04:53PM (#64852203)

    I learned programming in the 70s with punchcards on mainframes.
    Being a programmer has always required constant learning, and being able to teach yourself is one of the most important skills to master.

    • I learned programming in the 70s with punchcards on mainframes

      In HS, paper punched tape [wikipedia.org] on a Teletype Model 33 [wikipedia.org] (or similar) over a dial telephone with an acoustic coupler [wikipedia.org].

      Second year in university, punch cards on an IBM 4381 running MUSIC/SP [wikipedia.org] for a FORTRAN class as an EE student. Switched to CS after that and got access to the VAX 11/785 running 4.3BSD and ASCII terminals.

  • ... or code-monkeys?

    Proper software engineering, i.e. systems engineering applied to software systems, isn't going to care whether it's AI-generated code or not.

    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Proper software engineering, i.e. systems engineering applied to software systems, isn't going to care whether it's AI-generated code or not.

      Err, no. One of the major differences between coding and software engineering is that you often need to prove the code is correct. In the same way that a real-time OS isn't made to be faster, it's made to operate predictably and reliably.

      In safety-critical scenarios it's not enough that something works, you need to understand and document how it works.

      • A big part of any project is the process. The more critical projects have a stunning amount of formal process. Maybe someone can train a machine to do it. I think for now we would be happy to have a machine that checks that the humans are following the process and maybe even offers suggestions.
        At the end of the day I need a human being in the driver's seat taking responsibility for the decisions.

      • by Vylen ( 800165 )

        Proper software engineering, i.e. systems engineering applied to software systems, isn't going to care whether it's AI-generated code or not.

        Err, no. One of the major differences between coding and software engineering is that you often need to prove the code is correct. In the same way that a real-time OS isn't made to be faster, it's made to operate predictably and reliably.

        In safety-critical scenarios it's not enough that something works, you need to understand and document how it works.

        Uh, yeah? You've successfully described systems engineering.

        Doesn't matter if the code is generated by an artificial intelligence or an actual idiot, the same processes apply. Unit test, system test, document, verify, etc, as dictated by your engineering management system.

        • by gweihir ( 88907 )

          Cutting out the part where the code is created by somebody competent is not going to make the overall process cheaper, faster, or better. It will do the opposite. See CrowdStrike for a recent example. You need _all_ safety mechanisms in place and functioning well for systems engineering to work out, and that very much includes the coders.

  • by Anonymous Coward

    All you have to do is the exact opposite of whatever Gartner thinks you should do.

  • Gartner (Score:5, Insightful)

    by abulafia ( 7826 ) on Wednesday October 09, 2024 @05:03PM (#64852223)

    Gosh, I will have to take time off from my upskilling to stay relevant in the Metaverse

    https://www.gartner.com/en/new... [gartner.com]

    to fit in my upskilling to stay relevant with AI.

    Good thing I was totally on top of the Transformational Impact of shitcoin:

    https://www.gartner.com/en/new... [gartner.com]

  • by dfghjk ( 711126 ) on Wednesday October 09, 2024 @05:05PM (#64852227)

    That article was filled with unprecedented amounts of stupid, with no indication that the author knew the slightest thing about what he was writing, yet he was paid for that "work" and there will be no accountability, because nothing falsifiable was even said. Pure bullshit, content-free.

    • It's sort of like how every car company has a million awards from JD Power & Associates. Everybody is #1 at something, even if it's being #2, or taking a #2.

      Ah, yes, your shitty ECM system is the best value visionary niche player in this Gartner quadrant.

    • by gweihir ( 88907 )

      Pretty much. There was some low-level manipulative skill involved though. For example, the concrete number (2027) is an indicator of that.

    • The kind of writing that AI is really good at. Someone's job is certainly going to be replaced by an AI.

    • Gartner has never been able to find its ass with both hands, is constantly wrong, yet people who should know better still believe the crap spewing from that shit-fountain. It's stunning.

    • That article was filled with unprecedented amounts of stupid, with no indication that the author knew the slightest thing about what he was writing, yet he was paid for that "work" and there will be no accountability, because nothing falsifiable was even said. Pure bullshit, content-free.

      So what you are saying is that Gartner analysts can be replaced by AI today. Perhaps the author needs to reskill?

  • So, after pushing the "AI will take over the world" narrative for several years, Gartner extends the same to software engineers? Need to keep the hype up, guys.

  • Probably written by AI
  • This website doesn’t even support basic Unicode from decades ago — its 1980s backend will vomit emojibake over port 80 in response to this perfectly normal message from “the future”.
  • A new job will emerge: "fixer of bullshit insane code generated by AI". It'll be called a DevAIOps Engineer.

    • Re:I am predicting (Score:5, Insightful)

      by Tony Isaac ( 1301187 ) on Wednesday October 09, 2024 @08:03PM (#64852677) Homepage

      We've seen this cycle before. Ever had to fix Entity Framework-generated SQL queries? It's a lot harder than fixing regular SQL queries, because it creates such crap. AI will do this kind of stuff on steroids.

      • by gweihir ( 88907 )

        Exactly. I predict that doing anything larger with AI will result in a minimally functional but grossly unreliable, insecure and unmaintainable mess that cannot actually be fixed anymore. Hence AI will essentially be a tool to help hobbyist coders with simple stuff and that is it.

    • by gweihir ( 88907 )

      Yep. Will probably come with a massive risk of clinical insanity after 2-5 years of working in it.

  • ... a new breed of professionals ...

    Everyone will have AI (like a phone), so everyone will need to know how to use its services (e.g. Facebook, WhatsApp, Instagram, TikTok, X/Twitter).

    To translate the translation: Everyone needs to be a better consumer of this new technology.

    This isn't about a much-touted paradigm shift, change in cost/efficiencies, or even the consequence/convenience of ubiquitous devices. This is advertising.

  • I am forced to work from home due to disability, so getting a job is extremely difficult and despite 30 years experience, it's been many months now with only one (unsuccessful) job interview. There are simply no programming jobs left.

    That's pretty astonishing, since before the ChatGPT reveal, it was never more than a month before I had a new job. I expect to be unemployed permanently now.

  • up skill to vote union!

  • Remember how we were all gonna be out of work because visual programming paradigms were going to obviate the need for old-school "identify and implement appropriate layers of abstraction" programmers?

    • by gweihir ( 88907 )

      Yes. And that was not even the first time this stupid and disconnected idea has failed.

  • Kind of like learning how to formulate a Google search, 20 years ago?

    A lot of people still don't know how to do it. But if you know how to get good info out of Google, you're probably going to be all right asking AI for stuff.

  • I am approaching 30 years of Gartner fluff. To this day I have never heard someone say, "This is why we follow Gartner's guidance and why we are winning."

    Why can't AI just read the headlines and blogs and play futurist? I bet they'd be as good or better than Gartner.

  • "AI" will be a thing of the past by then, just like every other passing fad... such as blockchains, the metaverse, NFT's, and other corporate trends in tech.
  • The software engineer skills up to use AI:
    Software engineer: "AI, write a function to do X in language Y. Here are the input parameters; here is what the output should be."

    The future:
    Software development manager: "AI, write me a platform that allows me to fire all my software engineers."
  • The job of a software engineer is to constantly learn new skills and tools and apply them to solve problems.

    Coders are tools software engineers use to pump out code.

    So, an engineer is like the person who designed the car, the coder is the assembly-line worker, and the AI is the tool that means we need half the engineers and a fiftieth of the assembly-line workers.
    • by gweihir ( 88907 )

      I agree on the definitions, but AI will only increase the need for engineers. The average (really bad) coder may be a dying breed though.

  • You still need to be able to code and understand computers, or you'll be death prompting in circles.

  • by gweihir ( 88907 ) on Thursday October 10, 2024 @01:53AM (#64853037)

    By 2027, most of the AI craze will be over and the pathetic state of the technology will have become clear to everybody except a few die-hard morons. It might also take a bit longer.

  • Don't know if any of you has used the newest shit in any meaningful manner, but just the other week ChatGPT o1 came up with a non-trivial VDS migration for us in 20 seconds that would've taken seasoned admins two days at minimum. It was a five-page script that described in detail what we had to do. Flawlessly.

    Even right now AI shrinks a task that would require a seasoned expert a day or two if not a week of research and experimentation down to 60 minutes, manual execution included. And the AI crew are just wa

    • That may be true, but I love programming, designing and building software, with teams of real people. I don't love reviewing AI code. Maybe they can get an AI to do the review too? If this is where the software industry is going, I'd rather be doing something else.

    • But then we only learn HOW to do things - and only THE WAY THE AI TELLS US.

      I prefer to learn WHY it should be done that way (and HOW it works) and also WHY NOT to do things.

      We are risking losing our rich heritage of wisdom built up over human history, using our own and others' experience and understanding of what works, what doesn't and the consequences of jumping to LET'S TRY A NEW THING that nobody else ever thought was a good idea - often it's because of BAD THINGS that you CAN'T SEE YET, because you are

    • by bartle ( 447377 ) on Thursday October 10, 2024 @10:21AM (#64853869) Homepage

      How do these LLM-based AIs do at modifying existing, human-written code? Let's say a project that consists of hundreds of source files, across multiple platforms and languages. I ask because there are A LOT of legacy systems out there that aren't going away anytime soon.

      Personally, I'm skeptical that an AI will ever be any good at maintaining its own code. Let's say that some manager has an AI spit out the source code to create a website or phone app and the AI does it perfectly. How will the AI respond when the manager asks it to add a few new fields later on, or to change the existing logic in some way? I'm genuinely curious to know the answer.

  • It's our day job to upskill. If something is going to make the work easier, we'll use that framework.

    On the other hand, look at all of the legacy shit still in production today. There is a mountain of old code that gippity won't help with, and it isn't going to be magicked away anytime soon. So, if you don't want to deal with new frameworks, there is still work for you.

  • ...was brought to you by a fine-tuned LLM. One day we'll look back on humanity's AI moment and frown about how people didn't go "can't you see the internet will change everything?!" Oh.
  • But costs for that are moving on up, so... not so much enthusiasm any more. Now it's AI. It's not going to stay cheap for long.
  • How much harder is a middle-class living gonna get? As we get older “upskilling” gets more difficult and the grind doesn't get any easier. Soon working will be as bad for your health as alcohol and cigarettes.
  • When will AI finally replace Gartner? Their BS cannot be that difficult to make up and shit out into the world.

"The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts." -- Bertrand Russell

Working...