Programming AI Technology

Dry.io Wants To Democratize Software Development Using AI (venturebeat.com) 122

An anonymous reader writes: We've seen companies big and small build everything from AI-driven developer tools to AI-powered developer environments. But what if instead of having AI merely help developers write code, it did all the heavy lifting? Dry.io, a developer playground that helps you write web apps using just a few lines of code, began accepting signups today for its first wave of external testing. The programmable software platform lets you set the parameters of what you want to build, "and the AI takes care of the rest."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward

    Simple webapps are just barely software development. If your use case is this simple, Ruby on Rails pretty much does this already and you don't really need an experienced software developer involved. More AI buzzword crap.

    • by Anonymous Coward

      Millennials need job security too, you asshat.

    • by goombah99 ( 560566 ) on Tuesday February 26, 2019 @02:59PM (#58184680)

      What's the point of software development anyhow? Presumably to solve problems, not to develop software.
      Douglas Adams proposed that the interface of the future would be a desk you work at trying to solve a problem. The computer would observe what you were doing, then write an algorithm to do it for you. At the time he meant this as a joke. But this is in fact exactly the sort of problem that so-called Artificial Intelligence is getting good at. It's getting good at recognizing a start on something and then completing it. For example, deepfakes fill a face in where one has been removed. Adobe's eraser removes defects and fills the area back in. And combinatorial materials science is having success taking in some basic physics plus examples of compounds that exhibit desired properties, and then suggesting new molecules that might have similar properties.

      AI is really crappy at figuring out what to do. It's really good at observing what you think is important and then extrapolating that. Thus Douglas Adams' desk interface is no longer a joke concept.

      How hard would it be to have a computer write a sorting algorithm just by watching someone sort numbers? It's plausible.
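
      Here's a minimal Python sketch of that idea, programming by example: watch input/output pairs and pick whichever candidate transformation explains all of them. The candidate pool is tiny and hand-picked purely for illustration; a real synthesizer would search a vastly larger program space.

        # Toy programming-by-example: pick the transformation that is
        # consistent with every demonstration observed so far.
        CANDIDATES = {
            "sort ascending": lambda xs: sorted(xs),
            "sort descending": lambda xs: sorted(xs, reverse=True),
            "reverse": lambda xs: list(reversed(xs)),
            "identity": lambda xs: list(xs),
        }

        def synthesize(examples):
            """Return the name of a candidate matching all (input, output) pairs."""
            for name, fn in CANDIDATES.items():
                if all(fn(inp) == out for inp, out in examples):
                    return name
            return None

        # Two demonstrations of someone "sorting numbers" suffice here.
        demos = [([3, 1, 2], [1, 2, 3]), ([5, 4], [4, 5])]
        print(synthesize(demos))  # -> sort ascending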

  • by Anonymous Coward

    "Founded in April 2018, Dry.io has not raised any money" - Shouldn't that be kind of a red flag or something? Maybe they should let the AI run the business side too?

    • Re:Er, wot? (Score:4, Interesting)

      by ShanghaiBill ( 739463 ) on Tuesday February 26, 2019 @03:26PM (#58184868)

      "Founded in April 2018, Dry.io has not raised any money" - Shouldn't that be kind of a red flag or something?

      No, it is not a red flag. Most companies should not need to "raise" (borrow) money to stay in business.

      When a company funds growth from their own revenue stream, that is good, not bad.

  • by thereddaikon ( 5795246 ) on Tuesday February 26, 2019 @02:47PM (#58184576)
    I have heard about the death of programming for years. Since the mid-90s people have been telling me that software that can write software better than programmers can is just around the corner. I'm still waiting. Development tools have gotten better, and newer languages are certainly easier and faster to develop on, although they don't result in faster code. Right now AI is little more than an industry buzzword. It isn't real yet, not in the way it's marketed at least. Don't expect anyone to change this soon.
    • by aix tom ( 902140 )

      Same here. Actually it has gotten somewhat worse. It used to be "Start project, write software, done...".

      Now it's "Start project, decide on a framework, get the framework set up, install all the libraries, write software, done" most of the time.

      • by Tablizer ( 95088 )

        For custom CRUD apps, what used to take one hybrid programmer/analyst in the late 90s with desktop IDEs now often takes 4x the staff. It's a lot of cross-module (re)wiring-together busywork.

        Sure, the Web made deployment (mostly) easier, but made development harder. And, MS has improved auto-deployment of Windows apps such that it's almost seamless.

        Many shops got fairly close to the 90's with MS Web Forms, but then got spooked MS was going to pull the plug like it did with VB6, leaving them without a migr

    • Since the mid 90's people have been telling me that software that can write software better than programmers can is just around the corner.

      If my memory isn't failing with age, people have been saying that for a lot longer than that.

      • I'm sure they have. I'm just not old enough to remember. I wasn't exactly discussing future programming paradigms in the 80's.
    • Dude, I heard about pocket computing for the first 3 decades of my life, and then, as if overnight, it was here.

  • by theCat ( 36907 )

    I've been coding for 30 years, and for the last 5 years I could see this coming. Recently I've been telling people that there are maybe 2 more years left in this field before the door starts to close. Some types of work will continue, but the overall era of throwing rooms full of coders at software will have ended. Surprised it took this long; much of my work for the last 10 years has been a process of cut/paste from my earlier work, or just Google a question, follow a link to Stackoverflow, read a few posts

    • ...much of my work for the last 10 years has been a process of cut/paste from my earlier work, or just Google a question, follow a link to Stackoverflow, read a few posts for 10 minutes, and only then copy/paste.

      Huh, I hate to say it, but it sounds like you really should not have been employed for the past 10 years.

      If I wrote code that way I would have for sure been fired LONG ago...

      • ...much of my work for the last 10 years has been a process of cut/paste from my earlier work, or just Google a question, follow a link to Stackoverflow, read a few posts for 10 minutes, and only then copy/paste.

        Huh, I hate to say it, but it sounds like you really should not have been employed for the past 10 years.

        If I wrote code that way I would have for sure been fired LONG ago...

        Rubbish. 90% of all coding these days is plumbing libraries together. Computers have gotten so fast that glue code is no longer inefficient, while at the same time big data wrangling, numerical optimization, and network management have become so complex that it's literally dangerous to roll your own. Much better to borrow code that has been demonstrated to work well and adapt it.

        This is not to say that research algorithm development, or the scaling of research algorithms, doesn't require artisanal coding. But THAT is tha

        • by wed128 ( 722152 )

          if I should multiply by 2 or add x+x

          Neither, you should shift left by one.

          • if I should multiply by 2 or add x+x

            Neither, you should shift left by one.

            Not if you're using floats. Even if you managed not to mess up the exponent, you're causing a lack of precision.
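
            To make the distinction concrete, here's a small Python sketch; the bit-twiddling helper is purely illustrative, nobody's production code. The shift is exact for integers, while doubling a float by poking its IEEE-754 exponent only works for normal finite values.

              import struct

              def double_int(x: int) -> int:
                  # For integers, a left shift by one is an exact multiply-by-two.
                  return x << 1

              def double_float(x: float) -> float:
                  # Hypothetical bit-level version for IEEE-754 doubles: bump the
                  # 11-bit exponent field. Only valid for normal finite values;
                  # zeros, subnormals, infinities and NaNs all break, which is
                  # the "mess up the exponent" hazard.
                  (bits,) = struct.unpack("<Q", struct.pack("<d", x))
                  exponent = (bits >> 52) & 0x7FF
                  if exponent == 0 or exponent >= 0x7FE:
                      raise ValueError("not a normal finite value; no simple shift")
                  return struct.unpack("<d", struct.pack("<Q", bits + (1 << 52)))[0]

              assert double_int(21) == 42
              assert double_float(1.5) == 3.0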

        • Computers have gotten so fast that glue code now is not inefficient.

            You realize efficiency is not the same thing as outright speed, right? Something can be wildly inefficient and still run fast through brute force. And most new software falls into that category. I'm reminded of the idiotic re-implementation of Win95 in JS. They took an OS that ran well on a 32-bit, 33 MHz single-core, in-order scalar processor and 8 megs of single-data-rate memory on a 32-bit data bus, and now it requires a 64-bit multicore processor and a few gigs of RAM once you factor in everything running benea

          • First gen is reference code. Then for a while people hack it for speed. Then some saint refactors it to generalize the issues that cause speed problems. And then the refactoring allows a consistent acceleration.

      • If I wrote code that way I would have for sure been fired LONG ago...

        Fired for code reuse? Programmers get paid for solving problems, not for reinventing wheels.

        Professors want to see original work. Employers prefer plagiarism.

    • by es330td ( 964170 )

      read a few posts for 10 minutes

      So you spent 10 minutes doing some VERY high level discrimination to determine, in the context of the problem as understood by your brain, which already-written solution fits your particular problem.

      and only then copy

      If you copied it, somebody else wrote it in the first place.

      Step one is a really, really big deal. The copied code isn't too small a deal either.

    • Doubtful you are a real developer, but an AI can't even do what you describe.

      • If you were a code monkey maybe, since the modern frameworks and IDEs can do a lot of the lifting for you, but that's only because a lot of that code has become boilerplate. But we certainly aren't anywhere close to not needing developers.
    • You must have been incredibly lucky or haven't challenged yourself.

      Easily half of my programming career has been spent understanding existing code (i.e., not my own) as part of reimplementing it in another language or on top of another set of libraries/frameworks. Two-thirds would be writing code that is specific to the problem at hand (i.e., no solution would be found by Googling). Maybe one-tenth of the problems I see can be solved with heavy inspiration from a search.

      But I do see the writing on the wall

    • I'm a big proponent of AI myself, but I think it will simply shift what work gets done - that is to say, there will still be a lot of programmer jobs, but they will be more about directing higher-level concepts than the lower-level programming we are used to...

      But even with that, two years sounds really optimistic for taking over programming, because there is such a large mish-mash of things it could possibly help with.

      I think we'll have real honest-to-goodness self driving cars running around the world long before we

  • by Anonymous Coward

    The hard part isn't the coding, it's figuring out the boundaries of the problem. I have no doubt it can write its Slack alternative in 50 lines of code or a "social network" in 150 lines, but they'll be trivial examples that don't take into account the realities of complex software development.

    This whole concept of ultra-high-level languages was tried before with Visual Basic and Java and Scratch and other languages that promised to make programming accessible to everyone; the problem is that the *problems* are co

    • by Tablizer ( 95088 )

      A distinction should be made between easy to learn and easy to use. Easy to learn means the learning curve to decent productivity will be shorter. Easy to use generally means fewer hand and eye movements to get what you want, although it may take longer to learn.

      Granted, the terms are often commingled, but I separate them here at least as a working definition.

      To some extent one can achieve both by making the language or tool closer to the domain. The tool's objects/parts then are conceptually closer t

      • by Livius ( 318358 )

        A distinction should be made between easy to learn and easy to use.

        Not always, but a lot of the time, these are contradictory goals.

  • I want to cure world hunger. I have a hand-wavy idea about giving food to hungry people.

    Maybe we should save news articles until people actually accomplish something instead of wanting to?

    • by Matheus ( 586080 )

      ...but how then are they going to Slashvertise their next funding round??

    • World hunger is simple when tackled using recursion. Just take half of the current hungry people and feed them to the other half. Algorithm terminates upon reaching the base case of 0 hungry people in the world remaining.
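
      In the same spirit, a (thankfully hypothetical) Python sketch of that algorithm; the function name is invented:

        def solve_hunger(hungry: list) -> list:
            # Base case: zero hungry people remain. Problem solved.
            if not hungry:
                return hungry
            # Recursive case: feed half of the hungry to the other half.
            # The input strictly shrinks, so termination is assured.
            return solve_hunger(hungry[: len(hungry) // 2])
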
  • Oh, Lordy (Score:4, Insightful)

    by JoeDuncan ( 874519 ) on Tuesday February 26, 2019 @03:02PM (#58184700)

    People have been touting variations of this concept for decades, and it NEVER pans out.

    You know what happens when you let "AI" do the "heavy lifting" of writing code?

    You wind up with crap like Dreamweaver's garbage HTML code...

    This MAY work for trivial, formulaic crap like CRUD coding, but for the 50% (minimum) of programming that requires coming up with something novel to address a unique situation? It's going to produce nothing but non-performant, fragile, unmaintainable garbage.

    • You know what happens when you let "AI" do the "heavy lifting" of writing code? You wind up with crap like Dreamweaver's garbage HTML code...

      And when you let a compiler generate assembly, you end up with something an assembly programmer might regard as crap as well. But since you're not the consumer of the result, who cares?

      It's going to produce nothing but non-performant, fragile, unmaintainable garbage.

      Yep, and the solution is to re-generate the results from the inputs. Just like with any other toolchain.

      • ...But since you're not the consumer of the result, who cares?

        False assumption on your part. Being responsible for maintaining the end result DOES make me the consumer, which means I do have a vested interest in a tool that creates a readable and maintainable product.

        Yep, and the solution is to re-generate the results from the inputs. Just like with any other toolchain.

        Which only ever works properly if the problem you are working on is one of the few things the toolchain author was able to predict in advance (so again, possibly usable for CRUD coding). The minute you need to write code that the author was not able to predict you would need to write in advance, the whole

        • Being responsible for maintaining the end result DOES make me the consumer, which means I do have a vested interest in a tool that creates a readable and maintainable product.

          You're the guy who maintains C programs by editing binaries in a hex editor?

          Which only ever works properly if the problem you are working on is one of the few things the toolchain author was able to predict in advance (so again, possibly usable for CRUD coding). Better to just write the code yourself the first time around than risk getting "locked in" to a toolchain that is going to cost you more in the long run than you get from the short-term benefits.

          I actually happen to use Lisp macros properly, thank you.

      • And when you let a compiler generate assembly, you end up with something an assembly programmer might regard as crap as well. But since you're not the consumer of the result, who cares?

        That's all fine unless you someday have to tweak the lower-level code in a way not supported by the thing that generated it in the first place. And if you need to re-generate the code, and re-apply your changes... a creek named shit.

        • Then you update your generator, obviously. That's the only systematic approach.
          • Then you update your generator, obviously. That's the only systematic approach.

            If I need a generator to write code for me, what are the chances I'm capable of augmenting the generator itself for my requirements?

            • Pretty high, I'd say, since in both cases you have an understanding of your needs, i.e., what it should be doing.
              • Pretty high, I'd say, since in both cases you have an understanding of your needs, i.e., what it should be doing

                Okay. How often do you find yourself making changes to your C compiler because it doesn't support some feature you'd really like? I always thought it'd be nice if C could have bounds-safe arrays. I think I'll add it. I'm sure it's just a small change. After all, I understand my needs, so it'll be simple.

                The target for code generation tools are higher-level folks. They don't understand coding. That's the whole point. Software development without software development (or with much higher level constructs).

                • C compilers are not built for that (aside from perhaps GCC, which is more enlightened in this respect [starynkevitch.net]), but I have no problems with doing that to my Lisp compilers. Also, just because you're targeting higher-level programming doesn't mean you're targeting people who don't understand programming. Maybe you simply want to make things easier for yourself? The bigger the problem you're solving is, the more likely it is that raising the level of abstraction will save you a lot of work.
    • What if they integrated something with e.g. Visual Studio Code, such that as you program, it starts figuring out what you want to do? You start giving it an explanation of how you want various pieces to interface and it starts making suggestions about improving your data structures, interfaces, various options for design patterns.

      In other words: a person like me, who knows quite a bit about program architecture and can competently do anything with programming languages but isn't a programmer, can use t

  • by Chris Mattern ( 191822 ) on Tuesday February 26, 2019 @03:11PM (#58184750)

    They make it sound like that's the easy part.

    • by ShanghaiBill ( 739463 ) on Tuesday February 26, 2019 @03:41PM (#58184974)

      They make it sound like that's the easy part.

      Indeed.

      Question: What do you call specified parameters of what you want to build?
      Answer: Source code.

      Commitstrip.com [pinimg.com]

      • They make it sound like that's the easy part.

        Indeed.

        Question: What do you call specified parameters of what you want to build? Answer: Source code.

        . . . and then the customer says:

        "That's what I asked for . . . but not what I need!"

          • . . . and then the customer says: "That's what I asked for . . . but not what I need!"

          This is why you should not write code based on rigid BDUF [wikipedia.org] specs. Instead, the customer needs to be involved in the process, providing regular feedback.

          If you use Agile, you should have a customer rep at the bi-weekly sprint meetings. Both to review what was accomplished in the last sprint, and to set the priorities for the next sprint.

          • Except that the customer doesn't necessarily want to be part of the process. The customer just wants to tell the development team what he or she wants, not necessarily coherently, then go away and have the development team deliver what the customer needs.

    • They make it sound like that's the easy part.

      Exactly!

      We just might need some sort of structured language to set those parameters with ... hmm, what could we call it ... programming?

  • by Anonymous Coward

    Dry.io, a developer playground that helps you write web apps using just a few lines of code

    Just what the world needs. More crappy web apps....

    ... or that everything should become a crappy web app.

    So, other than for ideological reasons, why would you even want to democratize coding? Why not have people who are good coders code? I mean, is it important that I, as a coder, can take out an appendix? Should medicine be democratized?

    And in terms of ideology, I'm all in favor of fairness. I think all should be given equal opportunities, and when social conditions are such that that can't happen, fixes

  • by cascadingstylesheet ( 140919 ) on Tuesday February 26, 2019 @03:33PM (#58184912) Journal

    The dream of businesses since the dawn of programming ... yes, as long as you can fully specify all the branches of logic, the machine can write the code for you!

    Of course, it would help if we could devise some sort of symbolic written language to represent the logic, since human languages tend to be imprecise ...

    Then the computer could tell you if you got the syntax wrong with the symbolic language or something.

    It should only take Marketing a few years to get up to speed with using this. (In the meantime they will stop actually marketing; is that a problem?)

    Hurrah, no more pricey programmers!

    • I heard the same thing when a little programming language called Visual Basic came to market ... and Access would end the need for DBAs ... and in a way they did ... both lowered the bar for entry into creating applications. But... The end result was a torrent of sub-standard programs (and programmers) and a lot more work and opportunities for experienced/skilled programmers. To be sure, things like AI and Dry.io are going to solve lots of problems, but they will create a whole new set of problems (read o
  • are 'learning to code'? Are they out of a job already?

  • It needs people with insight, skill and experience. Doing it by committee routinely produces the worst possible outcomes.

    • The "democratize" buzzword here appears to not have connection to anything. There's no sort of group social decision making involved.

      • by gweihir ( 88907 )

        So I should have read the story then? Well, the title was all the stupidity I could stand, but thanks for pointing this out.

  • This is probably more of an attempt to let the software pick the underlying software for you with pre-packaged open source software ready to configure. Rather than going to WordPress for a blog, Shopify for a store, or whatever, you just go to one provider and their software picks the package that suits your needs.

    Developers will always have a job because the skilled ones already have the toolbox for setting up the baseline software or working with the existing software. They are paid good money to fill t

    • Any skilled developer has already written code that writes code for them, based on patterns they use that need repeating. The most common being an ORM. The skilled developer designs the database, and then the tooling generates the code so that the developer does not need to hand-code a bunch of repeating patterns.

      I don't understand why data-dictionaries are not used more. Automatic code generation speeds up initial development, but is still difficult to maintain, as you have to sift through auto-generated verbos
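
      As a rough illustration of the pattern described above, here's a toy Python generator that turns a table description into the repetitive CRUD statements; the table and column names are made up:

        # Toy schema-driven code generation: one table description in,
        # the four repetitive CRUD statements out.
        def generate_crud(table, columns, key="id"):
            cols = ", ".join(columns)
            marks = ", ".join("?" for _ in columns)
            sets = ", ".join(f"{c} = ?" for c in columns if c != key)
            return {
                "create": f"INSERT INTO {table} ({cols}) VALUES ({marks})",
                "read": f"SELECT {cols} FROM {table} WHERE {key} = ?",
                "update": f"UPDATE {table} SET {sets} WHERE {key} = ?",
                "delete": f"DELETE FROM {table} WHERE {key} = ?",
            }

        for op, sql in generate_crud("employee", ["id", "name", "hire_date"]).items():
            print(f"{op}: {sql}")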

      • by Tablizer ( 95088 )

        One problem with data dictionaries is how to override the defaults for specific problems...

        Clarification: I meant specific spots or areas of the application, not "problems" in the puzzle sense. An example would be a particular listing where screen real estate is at a premium, such that I want the column title to be "Emp. No." instead of "Employee Number". A systematic way is needed to locally override the default title.

        Also a way is needed to create dummy columns for specific needs that are not necessaril
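
        A minimal sketch of one way to do that local override, assuming a central dictionary plus a per-screen override map (all the names here are invented):

          # Central data dictionary: one default definition per column.
          DATA_DICTIONARY = {
              "employee_number": {"title": "Employee Number", "width": 20},
              "hire_date": {"title": "Hire Date", "width": 10},
          }

          # Local overrides for one cramped listing screen.
          SCREEN_OVERRIDES = {
              "compact_listing": {"employee_number": {"title": "Emp. No.", "width": 8}},
          }

          def column_spec(screen, column):
              # Merge the dictionary default with any screen-local override.
              spec = dict(DATA_DICTIONARY[column])
              spec.update(SCREEN_OVERRIDES.get(screen, {}).get(column, {}))
              return spec

          print(column_spec("compact_listing", "employee_number"))
          # -> {'title': 'Emp. No.', 'width': 8}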

  • I'm probably missing something, but did they just re-invent fourth-generation programming languages? Or maybe they're just recycling the hype, because now it's "AI", which no one can even define in the first place.

    • Fourth-generation programming languages were primarily pushed in the 80s, as I remember, so reviving them thirty years later means that most of the people in the field won't remember why they didn't work.

  • Most any project worth having is a system at heart.

    Sometimes a quick chunk of code will get you a feature. But wherever true systems roam, actual software developers will be found to formulate the interactions and clean up the mess.

  • For years now, the hardest part of creating any new software has been... getting the requirements for what it should do.

    At my company, the development bottleneck is not the programmers, it's the business analysts, trying to figure out what the company wants to build. Once they decide, our team is able to build it quite quickly with existing tools.

  • Houses today can be built pre-fabricated in a factory and thrown together in a few weeks. You get what you pay for. If you want a quality house, you still have to build it from the ground up, on site, using more traditional methods.

    Software isn't much different. If you use tools that use "AI" (i.e., pre-fabricated parts), you'll get what pre-fab can do. Crap.

  • All the tools that came before it were good at building some very specific kind of software. The most common use case is data entry screens. Yeah, that's easy. It's also easy with traditional tools. If you want to do something sophisticated, "AI" isn't going to get you any farther than the old 4GL or RAD tools did.
