AI Programming

Could AI Transform Continuous Delivery Development? (thenextweb.com) 78

An anonymous reader quotes The Next Web: According to one study, high-performing IT units with faster software releases are twice as likely to achieve their goals in customer satisfaction, profitability, market share and productivity. Acknowledgement of this has fueled a headlong rush toward what software developers call "continuous delivery"... It's a process most technology departments aspire to but only a fraction have achieved. According to a recent survey by Evans Data, 65 percent of organizations are using continuous delivery on at least some projects, but only 28 percent are using it for all their software. Among non-SaaS companies, that proportion is just 18 percent...

So what comes next? The future of application development depends on using artificial intelligence within the continuous delivery model... We're at the precipice of a new world of AI-aided development that will kick software deployment speeds -- and therefore a company's ability to compete -- into high gear. "AI can improve the way we build current software," writes Diego Lo Giudice of Forrester Research in a recent report. "It will change the way we think about applications -- not programming step by step, but letting the system learn to do what it needs to do -- a new paradigm shift." The possibilities are limited only by our creativity and the investment organizations are willing to make.

The article was written by the head of R&D at Rainforest QA, which is already using AI to manage their crowdsourced quality assurance testing. But he ultimately predicts bigger roles for AI in continuous delivery development -- even choosing which modifications to use in A/B testing, and more systematic stress-testing.
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Welcome to the new age of yesterday.
    • Re: (Score:2, Insightful)

      by Anonymous Coward

      Yeah, this is an incredibly low quality article. It doesn't specify what it means by AI, which type of AI it has in mind, what the AI is supposed to do, or why AI should be used at all. Junk article.

      It's basically a bullshit bingo post where someone repeats a buzzword without any knowledge of the material behind it.

  • buzzwords (Score:5, Funny)

    by xbytor ( 215790 ) on Sunday August 27, 2017 @03:00PM (#55093989) Homepage

    >a new paradigm shift.

    I stopped reading after this.

    • Not enough leveraging core competencies through blue sky thinking and synergistic best of breed cloud machine learning for you?

    • Same Old Thing (Score:5, Insightful)

      by sycodon ( 149926 ) on Sunday August 27, 2017 @03:10PM (#55094039)

      Holy Fuck.

      Continuous integration
      Prototyping
      Incremental development
      Rapid application development
      Agile development
      Waterfall development
      Spiral development

      Now, introducing, "Continuous Delivery"...or something.

      Here is the actual model, a model that will exist for the next 1,000 years.

      1. Someone (or something) gathers requirements.
      2. They get it wrong.
      3. They develop the wrong thing that doesn't even work the way they thought it should.
      4. The project leader is canned
      5. The software is implemented by an outside vendor, with all the flaws.
      6. The software operates finally after 5 years of modifications to both the software and the workflows (to match the flaws in the software).
      7. As soon as it's all running properly and everyone is trained, a new project is launched to redo it, "the right way".
      8. Goto 1

      • No no. We got rid of line numbers a long time ago.

      • +1 Depressing
      • It's a genetic algorithm where YOU are the population being flushed out each cycle.

      • Here is the actual model, a model that will exist for the next 1,000 years.

        1. Someone (or something) gathers requirements.
        2. They get it wrong.
        3. They develop the wrong thing that doesn't even work the way they thought it should.
        4. The project leader is canned
        5. The software is implemented by an outside vendor, with all the flaws.
        6. The software operates finally after 5 years of modifications to both the software and the workflows (to match the flaws in the software).
        7. As soon as it's all running properly and everyone is trained, a new project is launched to redo it, "the right way".
        8. Goto 1

        You just accurately described a six-year project within our organization... and it made me cry.
        Does this model have a name? An Urban Dictionary name?
        If not, it needs one.

    • Re:buzzwords (Score:5, Insightful)

      by alvinrod ( 889928 ) on Sunday August 27, 2017 @03:15PM (#55094063)
      Yeah, maybe there's something useful in TFA, but I'm not really inclined to go looking based on what was in the summary. At no point did the person being quoted actually say anything of substance. It's just buzzword soup with a dash of new technologies thrown in. Five years ago they would have said practically the same words, but just talked about utilizing the cloud instead of AI.

      I'm also a little skeptical of any study published by a company looking to sell you what the study has just claimed to be great. That doesn't mean it's a complete sham, but how hard did they look for other explanations for why some companies are more successful than others?
  • by Anonymous Coward

    that's all, folks...

  • I notice the targets are all set from the company's point of view... including customer satisfaction. However it's quite easy to meet any goal, as long as you set it low enough.

    Companies like Comcast or Qwest objectively have abysmal customer satisfaction ratings; but they likely meet their internal goal for that metric. I notice, in their public communications, they always use phrasing along the lines of "giving you an even better customer service experience" - again, the trick is to set the target low and

  • by petes_PoV ( 912422 ) on Sunday August 27, 2017 @04:56PM (#55094339)
    This might be good for developers, but it's a nightmare for the poor, bloody, customers.

    Any professional outfit will test a new release (in-house or commercial product) thoroughly before letting it get anywhere close to an environment where their business is at stake.

    This process can take anywhere from a day or two to several months, depending on the complexity of the operation, the scope of the changes, HOW MANY (developers note: not if any) bugs are found and whether any alterations to working practices have to be introduced.

    So having developers lob a new "release" over the wall at frequent intervals is not useful, it isn't clever, nor does it save (the users) any money or speed up their acceptance. It just costs more in integration testing, floods the change control process with "issues" and means that when you report (again, developers: not if) problems, it is virtually impossible to describe exactly which release you are referring to, and even harder for whoever fixes the bugs to reproduce the same version, fix it, and then incorporate those fixes into whatever happens to be the latest version -- that hour. Even more so when dozens of major corporate customers are ALL reporting bugs with each new version they test.

    • Any professional outfit will test a new release (in-house or commercial product) thoroughly before letting it get anywhere close to an environment where their business is at stake. This process can take anywhere from a day or two to several months, depending on the complexity of the operation, the scope of the changes, HOW MANY (developers note: not if any) bugs are found and whether any alterations to working practices have to be introduced.

      I wanted to chime in with a tangible anecdote to support your

      • I can sympathize with that view, of it appearing to have too many developers focused upon deployment/testing rather than actual development.

        However, I come at it from the other side: the developers just push new development out and production support is responsible for addressing the mess. It is horrible; there is too much disconnect between developers and their resulting output, creating consistent outages.

        The most successful teams follow the mantra "Eat your own dog food", developers who support the crap they push

    • This might be good for developers

      It's not even good for developers.

  • AI is enough of a problem, why make it worse?

  • One study, well then I'm sold.

    But do you know who likes Continuous Delivery?

            Not the users.

    The users hate stuff changing for the sake of change, but trying to convince management seems an impossible task.
           

    • Why should users not like it?
      If you shop on amazon you don't know if a specific feature you notice today came there via continuous delivery or a more traditional process.

      • by Junta ( 36770 )

        The crux of the problem is that we (in these discussions and the analysts) describe *all* manner of 'software development' as the same thing. Whether it's a desktop application, an embedded microcontroller in industrial equipment, a web application for people to get work done, or a webapp to let people see the latest funny cat video.

        Then we start talking past each other, some of us terrified of what 'continuous delivery' means in the context of software in the microcontroller of a health care device, others t

        • Well,
          'continuous delivery' is a term with a defined meaning. And releasing phone apps with unwanted UI/functionality in rapid succession is not part of that definition.
          Continuous delivery is basically just the next logical step after continuous integration.
          You deploy the new functionality automatically (or with a click of a button) when certain test criteria are met. Usually on a subset of your nodes, so only a subset of your customers sees it. If you have crashes on those nodes or customer complaints, you roll back.
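          The deploy-to-a-subset, then roll back or roll forward, loop can be sketched in a few lines. Everything here (node names, the monitoring stand-in, the thresholds) is hypothetical, not any particular deployment tool's API:

          ```python
          # Minimal canary-release sketch: deploy to a fraction of nodes,
          # check an error-rate signal, then roll back or finish the rollout.
          import random

          NODES = [f"node-{i}" for i in range(10)]  # hypothetical fleet

          def deploy(node, version):
              """Pretend to push `version` to `node` (real tools go here)."""
              print(f"deploying {version} to {node}")

          def error_rate(nodes):
              """Stand-in for real monitoring; returns a fake aggregate rate."""
              return random.uniform(0.0, 0.02)

          def canary_release(version, canary_fraction=0.2, max_error_rate=0.05):
              n_canary = max(1, int(len(NODES) * canary_fraction))
              canary, rest = NODES[:n_canary], NODES[n_canary:]
              for node in canary:
                  deploy(node, version)
              if error_rate(canary) > max_error_rate:
                  # crashes or complaints on the canary nodes: roll back
                  for node in canary:
                      deploy(node, "previous-version")
                  return False
              # canary looks healthy: roll forward to the remaining nodes
              for node in rest:
                  deploy(node, version)
              return True
          ```

          The point of the gradual step is that a bad release only ever touches the canary fraction before the rollback branch fires.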

          • You deploy the new functionality automatically (or with a click of a button) when certain test criteria are met. Usually on a subset of your nodes, so only a subset of your customers sees it. If you have crashes on those nodes or customer complaints, you roll back.

            Why do you consider this to be a good thing? It's certainly not for those poor customers who were chosen to be involuntary beta testers, and it's also not for the rest of the customers who have to deal with software that is constantly changing underneath them.

            • Why do you consider this to be a good thing?

              Because it is?

              It's certainly not for those poor customers who were chosen to be involuntary beta testers
              They are not forced to use the new functionality.
              And sooner or later they get it anyway, what has that to do with "continuous delivery"?

              and it's also not for the rest of the customers who have to deal with software that is constantly changing underneath them.
              Software is constantly changing. Deal with it.

              • They are not forced to use the new functionality.

                Assuming that it is new functionality, and not a change to old functionality. Or, assuming that it isn't replacing something outright.

                And sooner or later they get it anyway, what has that to do with "continuous delivery"?

                I was responding to your comment about part of "continuous delivery" being that you use a subset of the user base to do your testing. That means they are running code that hasn't been sufficiently tested. That they would get (hopefully fully tested) code eventually has nothing to do with it.

                Software is constantly changing. Deal with it.

                This amount of callous disregard is a great example of why users are increasingly view

                • There is a misunderstanding. The code usually is sufficiently tested.
                  However, there are two reasons not to fully roll it out: there could be a glitch, especially in conjunction with other parts of the system, or simply in connecting to the live data.
                  Secondly, some systems are so big (e.g. Amazon, Zalando) that it makes sense to gradually deploy the new software and gradually shut down old nodes.

                  As a developer myself, I think that what we're supposed to be doing is solving problems users have, not making more.
                  And

          • by Junta ( 36770 )

            'continuous delivery' is a term with a defined meaning. And releasing phone apps with unwanted UI/functionality in rapid succession is not part of that definition.

            It is a natural consequence of continuous delivery, with its emphasis on always evolving and changing and on the developer being king: no one can question developer opinion. Developer decides it should move, it moves. No pesky human testers to stand up and say 'you confused the piss out of us' to make them rethink it. No automatic test is going to capture 'confuses the piss out of the poor non-developer users'.

            If you have crashes on those nodes or customer complaints you roll back.

            Note that a customer with a choice is likely to just go somewhere else rather than use your software.

            • The problem is that automated testing is no substitute for a QA team.
              The QA team is supposed to provide the automatic testing.

              Why don't you check out "shops" that actually do "continuous delivery" instead of boring us with your nonsense rants?

              This is a long video, you will find shorter ones coming to the point of continuous delivery more quickly.
              https://www.youtube.com/watch?... [youtube.com]

              Zalando is the only company I know that has realized it and is marketing, positioning itself, as an IT company, not as "a shop".

              If

              • by Junta ( 36770 )

                Again, not all software development is the same. A QA team in many situations is a team that cannot develop automated testing because they know how to use the software, not develop software. Because they represent the target customer base, not 'more development'. They provide a *distinct* perspective from what those skilled in software development can do for themselves. While automated testing can be done to assure that the various functions execute according to the design intent of the developer, usabi

                • So the QA team would then click the button to trigger the last step: automatic delivery/deployment.

                  • by Junta ( 36770 )

                    So in all my prior conversations with folks advocating the approaches, the 'continuous delivery/integration' referred to reacting automatically to a commit. Triggering automation doesn't seem a particularly new concept to me. For example, software I deal with executes unit test on commits alongside human code reviews. Upon code review and unit test success, then a merge which triggers test builds for QA. Upon the human QA cycle and some external beta testers marking success, then a human triggers update
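                    The gated flow described above (unit tests plus human review gating the merge, the merge triggering QA builds, humans marking QA/beta success, and a human still triggering the final update) can be sketched as a sequence of gates. Stage names and flags here are illustrative, not any real CI system's configuration:

                    ```python
                    # Illustrative gate-by-gate pipeline sketch; each flag stands in
                    # for an automated or human check described in the comment above.
                    def run_pipeline(unit_tests_pass, review_approved,
                                     qa_approved, release_triggered):
                        stages = ["commit"]                 # every push starts here
                        if not (unit_tests_pass and review_approved):
                            return stages                   # red tests or no review: stop
                        stages += ["merge", "qa-build"]     # merge triggers builds for QA
                        if not qa_approved:
                            return stages                   # human QA / beta testers gate here
                        stages.append("ready")
                        if release_triggered:
                            stages.append("release")        # a human still pulls the trigger
                        return stages
                    ```

                    Nothing past "commit" happens automatically without its gate, which is the distinction being drawn from fully automatic delivery on every commit.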

  • I suspect that article was actually written by an AI. That would explain why it makes so little sense to the human mind.
  • IT in my company does network, Windows, Office and virus etc. type of work. Is this what they talk about? Anyway, it's been long outsourced to IT (as in "Indian" technology)...

  • I recently interviewed at a couple of the newfangled big data marketing startups that correlate piles of stuff to help target ads better, and they were continuously deploying up the wazoo. In fact, they had something like zero people doing traditional QA. It was not totally insane at all. But they did have a blasé attitude about deployments -- if stuff doesn't work in production they just roll back, and don't worry about customer input data being dropped on the floor. Heck, they did not worry much about da

    • But they did have a blasé attitude about deployments -- if stuff doesn't work in production they just roll back, and don't worry about customer input data being dropped on the floor.

      It's amazing how common this attitude has become. It's aggressively anti-customer, and a big part of the reason for the acceleration of the decline of software quality over the past several years.

      • by Junta ( 36770 )

        I'd say a large chunk of the problem is people churning out trivial 15 minute frontends to existing functionality being very loud and thinking too much of their perspective, drowning out the voices of people that work on more complex software.

        Also, decision makers see more the impact of .css and it's very easy to make a sophisticated looking UI, which was formerly a viable indicator for someone at least had some sort of skill. The good thing is people who are more purely design minded can provide better qu

      • To some degree, I try to keep an open mind to engineering that seems to be the right tool for the problem at hand. But I do believe that this sloppy kind of engineering, which is not much more than taking dirty data and making it somewhat less dirty, will prove very easy to replace. Some day there will be an Amazon service where one business data analyst with an AI assistant can try out a hundred different correlation models in an afternoon, and deploy it out to a 1024 AWS cluster on a 1AM cron trigger while
  • by Njovich ( 553857 )

    You want your deployment system to be predictable, and as my old AI professor used to say, intelligent means hard to predict. You don't want AI for systems that just have to do the exact same thing reliably over and over again.

    • by Junta ( 36770 )

      This is what people keep missing about 'AI' as it stands today. AI is a good approach for problems that are impractical to address with manual programming. Areas where programmers know the general algorithms that would be useful, but to describe how to apply them in a general chaotic dataset to order it would be incredibly complex. So we know various algorithms will find edges and glossiness and color and those are characteristics that appear in photos, but to describe how to apply all those techniques t

  • A continuous delivery pipeline has as much AI as a nematode has natural intelligence ... probably even less.

  • Analyst who understands neither software development nor AI proceeds to try to sound insightful about both.

  • All I know is that, as a user, rapid-release or continuous delivery has been nothing but an enormous pain in the ass to me and I wish it would die the horrible death it deserves already.

  • As long as customers are comfortable with doing this, I do not see a problem. Now, that will require that developers keep making continuous, worthwhile improvements to the code. Not some fluff change from a marketing recommendation that users want a lighter shade of red for their 'Stop' button.
    • As long as customers are comfortable with doing this, I do not see a problem

      I'm a developer, not a "normal" user, and I would not be comfortable with this at all. However, it would be better than what is usually being done, because at least with that system I could easily decide when to upgrade and when not to.
