
Will We Someday Write Code Just By Describing It? (zdnet.com) 158

Using millions of programs from online repositories, researchers at Intel, Georgia Tech, and MIT created a tool called MISIM (Machine Inferred Code Similarity): a database of code scored by the similarity of its outcomes, used to suggest alternatives (and corrections) to programmers.

The hope is "to aid developers with nitty-gritty choices like 'what is the most efficient way to use this API' or 'how can I correctly validate this input'," Ryan Marcus, a scientist at Intel Labs, told ZDNet. "This should give engineers a lot more time to focus on the elements of their job that actually create a real-world impact..." Justin Gottschlich, the lead for Intel's "machine programming" research team, told ZDNet that as software development becomes ever more complex, MISIM could have a great impact on productivity. "The rate at which we're introducing senior developers is not on track to match the pace at which we're introducing new chip architectures and software complexity," he said. "With today's heterogeneous hardware — CPUs, GPUs, FPGAs, ASICs, neuromorphic and, soon, quantum chips — it will become difficult, perhaps impossible, to find developers who can correctly, efficiently, and securely program across all of that hardware."
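
To make "similarity of outcomes" concrete: an outcome-based scorer would rank the two functions below as near-identical, even though they share almost no structure. (A hypothetical sketch in C; the article does not detail MISIM's internal representation, and these function names are invented for illustration.)

    /* Structurally different, behaviorally identical for n >= 0. */
    int sum_to_n_loop(int n) {
        int total = 0;
        for (int i = 1; i <= n; i++)
            total += i;          /* accumulate 1 + 2 + ... + n */
        return total;
    }

    int sum_to_n_formula(int n) {
        return n * (n + 1) / 2;  /* Gauss's closed form, same result */
    }

A tool with a database of such outcome-scored pairs could then suggest the second form as a faster drop-in replacement for the first.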

But the long-term goal of machine programming goes even further than assisting software development as it stands today. After all, if a technology can assess intent and come up with relevant snippets of code in response, it doesn't seem far-fetched to imagine that the algorithm could one day be used by any member of the general public with a good software idea. Combined with natural language processing, for example, MISIM could in theory react to verbal clues to one day let people write programs simply by describing them. In other words, an Alexa of sorts, but for software development.

Gottschlich explained that software creation is currently limited to the 27 million people around the world who can code. Machine programming's ultimate goal is to expand that number and, one day, let people express their ideas in some other fashion than code — be it natural language, visual diagrams or even gestures.

Intel currently plans to use the new tool internally.
  • by Dunbal ( 464142 ) * on Sunday August 09, 2020 @05:38AM (#60382119)
    What, you mean like pseudocode?
  • Already possible (Score:5, Insightful)

    by swilver ( 617741 ) on Sunday August 09, 2020 @05:46AM (#60382139)

    My product owner describes what he wants to me, and I create the code. However, as I'm not psychic, the product may still not do quite what he expected. I suspect similar problems will occur when you replace "me" with AI.

    • by davecb ( 6526 ) <davecb@spamcop.net> on Sunday August 09, 2020 @06:40AM (#60382225) Homepage Journal
      As part of the "automatic programming" efforts of the 1950s, people like John Backus (of Backus-Naur fame) worked on systems in which you simply wrote down a formula, and a computer program converted that into a computer program. We called this a "formula translator", FORTRAN for short. See http://www.cs.toronto.edu/~bor... [toronto.edu]
      • by jythie ( 914043 )
        Heh, and here I was thinking 'didn't we call this COBOL last time?'
      • by ODBOL ( 197239 )

        I count at least 3 times that "automatic programming," where a computer translates a "description" into actual code, was solved:

        1. Programming with wires on a patchboard was automated by machine language.
        2. Programming in machine language, including putting in the binary codes but worse, having to assign addresses to particular pieces of data, was automated by assembly language.
        3. Programming in assembly language, in particular the translation of ideas such as mathematical formulae into sequences of indivi
    • by CastrTroy ( 595695 ) on Sunday August 09, 2020 @06:50AM (#60382235)

      This is where the good software developers shine. Taking the drivel spewed forth by clients or managers and turning it into something useful. This is why you won't ever have computers programming themselves. By the time you've sufficiently described what your program should do, you've basically written a program. There is still a lot of headway to be made in reusable components and frameworks so that the programmer has to do as little coding as possible. But I don't think you'll ever really get to the point where you won't need specialized workers who understand how to "speak computer". Theoretically you could design an entire application in Excel or Access with almost no real code, but you see very few people doing a good job at this simply because most people lack the skills to do it properly.

      • This. Once you have clearly specified the program in a consistent way, you have written it. And it will always be that way. Imagine trying to debug code that has no consistent logical / mathematical description of its behavior: you would be like a psychiatrist for code, with no hard boundaries and no telling what it would do next.

      • This is where the good software developers shine. Taking the drivel spewed forth by clients or managers and turning it into something useful.

        Indeed. But what's critical is what they're actually doing while they're doing that.

        What a good software developer is doing is creating the mental model in their head, and then asking the implementation questions. Asking about default settings, asking about edge cases, asking about interactions with other system parts. Assuming how the client will be using it, and confirming that assumption.

        I don't see AI being able to do this anytime soon.

        Here are some good examples of why:

        Validating input from customer se

      • This is where the good software developers shine. Taking the drivel spewed forth by clients or managers and turning it into something useful.

        * Vague functional requirements because the client is not a programmer.
        * Derived requirements from the functional requirements.
        * Client didn't tell you about all the details he needed.
        * Hitting a showstopper because of an inflexible architecture, or shoehorning the other requirements into system and making it work.

        This requires mental effort on both the client and progra

    • "My product owner describes what he wants to me, and I create the code. However, as I'm not psychic, the product may still not do quite what he expected."

      Normal: you do what he said and not what he meant. That has been the bane of programming since Ada.

    • It's worse than this. You as a person can learn to think like your clients do. You can understand their problems and then translate them into your own world of thinking and solve them your way.

      An AI cannot do this; in this case it will only suggest more or less accurate algorithms the longer the guessing takes. It will not learn to understand your clients. Rather, an AI will force your clients to think like itself, or possibly like a programmer.

  • by Lanthanide ( 4982283 ) on Sunday August 09, 2020 @05:47AM (#60382141)

    I mean, yeah, in principle this is possible. But the thing with languages, natural ones in particular, is that they aren't precise.

    Stating a problem to be solved is easy. It's when you add all the different circumstances and what you want to do in those circumstances that creates complex - and useful - software.

    The hard part for programming is keeping all those things in your head and coming up with code that solves all or at least the important ones. Whether you describe that in C, Java or English doesn't really make that job much easier.

    It's kind of amazing how in Star Trek whenever they used English to ask the computer to do something, it almost always did exactly the right thing the first time.

    • by Joce640k ( 829181 ) on Sunday August 09, 2020 @05:53AM (#60382151) Homepage

      It's kind of amazing how in Star Trek whenever they used English to ask the computer to do something, it almost always did exactly the right thing the first time.

      Star Trek also has warp drive, teleporters, force fields, tractor beams and time travel... this is about on the same level as those things.

    • by war4peace ( 1628283 ) on Sunday August 09, 2020 @06:29AM (#60382197)

      "please help my uncle jack off a horse"

      There are three possibilities:

      - Help Uncle Jack get off the horse
      - Help Uncle Jack kill the horse
      - Help Uncle Jack make the horse happy.

      I wonder what an AI would choose.

      • by chthon ( 580889 )

        Since Google Translate apparently prefers to translate "wiener" into "little dick" instead of sausage, I know which way to bet...

      • by swilver ( 617741 ) on Sunday August 09, 2020 @07:23AM (#60382269)

        The last one is incorrect; the name of the uncle is unknown in that case :)

      • It depends on whether life is a comedy, tragedy, or porno.

        Maybe the response will defy genre, and it will help with all three.

      • Presumably an AI would go about that in exactly the same way humans do.

        1: use context. Is someone sitting on a horse?
        2: make assumptions. Humans are notorious for making too many assumptions, but a robot would quickly learn that it has helped people off horses before.
        3: ask for clarifications. There are 2 people on horses. Which one am I supposed to give a hand job?

        Asking for clarifications is probably the first thing it would get good at because it would be great at seeing ambiguities in the requireme

    • by west ( 39918 )

      > Stating a problem to be solved is easy. It's when you add all the different circumstances and what you want to do in those circumstances that creates complex - and useful - software.

      Absolutely this. I remember interrogating a client about automating a manual task and what should happen in all the cases, and sub-cases, and sub-sub-cases, and he eventually became quite annoyed. How was he supposed to know what should happen in situations that almost never happen? When I asked what did the people on t

  • by bluegutang ( 2814641 ) on Sunday August 09, 2020 @05:53AM (#60382153)

    We already write code "just by describing it". Coding is nothing more than exactly describing the algorithm you want used.

    This article seems to be talking about something else though - a kind of "autocomplete" and "autocorrect" for code.

    For example: You write a for loop, it suggests that you instead use a while loop with the same functionality. Or more interestingly: you write a for loop, it suggests a single command that accomplishes the same thing in a vectorized manner, more efficiently. Or, you define a variable, it suggests the function you should use in the next line to correctly initialize the variable.
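
    A minimal sketch of that kind of suggestion in C (hypothetical; these names and this pairing are invented for illustration, not actual MISIM output):

        #include <string.h>

        /* Hand-rolled zeroing loop a similarity tool might flag... */
        void clear_buffer(char *buf, size_t len) {
            for (size_t i = 0; i < len; i++)
                buf[i] = 0;
        }

        /* ...while suggesting the standard library one-liner with the
           same outcome, which libc typically implements far faster. */
        void clear_buffer_suggested(char *buf, size_t len) {
            memset(buf, 0, len);
        }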

    This wouldn't allow a non-programmer to write a useful program, but it could allow a programmer to avoid weaknesses in their code, by teaching them about better code alternatives when they exist.

    It still seems like a bit of a niche case, but I wouldn't be surprised if it's relatively common in IDEs in a decade or two.

    • So it's going to work like excel and drop unnecessary leading zeroes in zip codes, and turn everything into a date, eh? Or like google auto-complete which auto-selects the first option so when you're typing too fast and hit enter you end up somewhere you didn't intend to be instead of searching? Sweet.

      Because that's how this always ends up working.

      It's great for the average joe, but terrible for anyone trying to do anything complicated. And while it's optional at first, at some point it becomes baked in, an

  • by flyingfsck ( 986395 ) on Sunday August 09, 2020 @06:17AM (#60382185)
    The road to Computing Nirvana is littered with failed 5th generation computer languages.
    • by Entrope ( 68843 ) on Sunday August 09, 2020 @06:40AM (#60382223) Homepage

      Pretty much this. Modern 3GLs have made enormous strides in helping programmers be concise, precise, and clear, although to some extent those three objectives are in a "choose any two" type of tension. Natural language is terrible at being both concise and precise, unless one makes it practically artificial through use of jargon. (Think "shalls" and similar formalisms in work specifications, which in turn impair clarity.)

      Sure, most 4GLs are good at generating code, but the domain of that code is correspondingly narrow.

  • C'mon man! It's called Python.

  • Seen that before (Score:5, Informative)

    by heikkile ( 111814 ) on Sunday August 09, 2020 @06:29AM (#60382199)

    They once had this idea that you could describe the program in more or less plain English, and the computer would figure out how to get the machine to do what you asked for. So was born the first COBOL compiler, in 1959.

    • Re:Seen that before (Score:5, Interesting)

      by znrt ( 2424692 ) on Sunday August 09, 2020 @06:56AM (#60382237)

      more or less plain English

      not really, just a small grammar constructed with english vocabulary so it appeared more familiar to english speakers. this 'plain english' thing was hyped at the time for marketing, and is nowadays too just for vintage nostalgia, mostly by people that haven't written a line of cobol ever.

      you have to consider that the other option at the time was rpg which didn't even look like language at all, as it was just putting numbers into forms, the pinnacle of its sophistication in expression were 4 letter operators (only at positions 28-32 of a calculation record, you get the idea). actually a very practical, efficient and clever system, but quite alien on first approach.

      returning to topic, i see no problem with coding getting to ever higher abstractions, this has already been happening all the time. there has been a mix of progress, genius, hype, failure and snake oil and will continue to be. as abstractions get higher and applications more diverse, i think it is ever more important to think about the questions to ask. a program that wipes out all life on the planet when asked to eradicate coronavirus would actually be working as expected.

    • Anyone who thinks COBOL is English either doesn't know COBOL or doesn't speak English.
  • 'And then randomly every day or two hang all the interrupts and use hundred percent of CPU. Then go back to normal without logging any errors.'
  • We are already there (Score:5, Interesting)

    by Casandro ( 751346 ) on Sunday August 09, 2020 @06:48AM (#60382231)

    As a programmer, once you are out of your "wrangling with broken APIs" phase, you learn that there are many ways in which you can simply describe your problem and let the computer solve it.

    In fact a common trope in C programs is to describe what your program is supposed to do in data structures. You then only write a comparatively small program to interpret that structure to solve your problem. A typical example for this are state machines in embedded systems. Surely you could write a loop that handles the switching of traffic lights, but once you need to add in ways for it to adapt to traffic (e.g. emergency vehicle priority) the far easier way is to create a state table that describes what the program should do.
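
    A minimal sketch of such a state table in C (illustrative states and timings, not from any real controller):

        /* Each row describes one light state; the interpreter below
           stays tiny, and behaviour changes live in the data. */
        enum state { RED, RED_YELLOW, GREEN, YELLOW, NUM_STATES };

        struct row {
            int duration_s;         /* time to stay in this state   */
            enum state next;        /* normal successor             */
            enum state on_priority; /* target when an emergency
                                       vehicle requests priority    */
        };

        static const struct row table[NUM_STATES] = {
            [RED]        = { 30, RED_YELLOW, RED    },
            [RED_YELLOW] = {  2, GREEN,      RED    },
            [GREEN]      = { 25, YELLOW,     YELLOW },
            [YELLOW]     = {  3, RED,        RED    },
        };

        enum state step(enum state s, int priority_request) {
            return priority_request ? table[s].on_priority
                                    : table[s].next;
        }

    Adding a new behaviour then means editing the table, not rewriting the control flow.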

    More advanced ways include writing programs in high-level languages like Prolog. In Prolog, for example, you tell your computer what you want by giving it facts and rules. Then you ask the computer a question and it'll give you a result.

    Then for GUI stuff there are more advanced APIs and frameworks like Delphi. In Delphi (or Lazarus) you can create a database application simply by clicking around, selecting GUI elements and pulling them onto your window. You can then connect those objects to objects representing your database and there you go. The system will automagically deal with the nitty gritty stuff behind it... but if you don't want it to do that, you have hooks to deal with that yourself with a minimal amount of code.

    Of course each "generation" of programmers has to find out about those things themselves. Unfortunately most programmers spend their first decade wrangling broken APIs, because those jobs are easy to get and can bring fairly decent amounts of money.

  • Go to stackoverflow.com . Read a few posted questions. Check how many come anywhere _near_ close to describing what their code should do.
  • by Rosco P. Coltrane ( 209368 ) on Sunday August 09, 2020 @06:56AM (#60382241)

    In the aeronautical world, there are several tools that can automatically generate DO-178-certified code from requirements. SCADE [ansys.com] for instance.

    The problem with this is, while the code is indeed generated automatically, and automatically certified (since it comes from a DO-certified tool), the difficulty becomes writing the requirements properly. And believe me, those who can write requirements in the aero world are very, very well paid.

    So you just move the problem: you don't need talented programmers to implement a functionality, you need talented req writers to describe that functionality instead.

    Or said another way, you still need talented people at some point in the loop. You won't get a code generation software that can take a teenager's wishlist and turn it into a great video game in the foreseeable future.

    • by clawsoon ( 748629 ) on Sunday August 09, 2020 @09:19AM (#60382505)

      There's a joke in video post-production that clients expect you to have knobs for vague terms. "Can you make this more... dynamic?" "Sure, I'll just turn up the 'more dynamic' knob." (That's never said directly to the client, of course, because clients have money and don't like sarcasm.)

      One of the insights of Agile programming - whether or not the insight is useful in practise - is that people don't know what they want until you translate their vague descriptions into a real product and they realize that it's not actually what they want.

      • This is where I'm stuck. How does the client communicate with the AI in a manner that helps the AI determine where to go and what to change?

        If the client is non-technical, this seems even more impossible. How does AI figure out how humans work well enough to make sense of their incoherent ramblings? And if it takes a dedicated AI handler to make this communication work, we've just replaced a developer with AI plus a handler, which doesn't seem like much of a step forward.

    • by PPH ( 736903 )
      From TFA:

      Scade language

      I see what they did there. Is whitespace syntactically significant in SCADE?

  • by Joe2020 ( 6760092 ) on Sunday August 09, 2020 @07:21AM (#60382265)

    The art of programming isn't just about gluing algorithms together, just as one doesn't write a good novel by knowing all 26 letters of the alphabet or all the words in a dictionary. A good artist can create something wonderful with just a pencil, while others cannot draw or paint no matter how many colours you put into their hands. It's about making something new by also removing all that which isn't needed, like chiselling a sculpture out of a rock, or writing a novel by knowing what to say and also what not to say. Describing code does not eliminate the need to understand it. A new, description-based tool can make it easier for some, but I don't see how it will produce better code, only how it will add more bloat, and with bloat come stability and security issues.

    Frankly, it sounds like one of those ideas that are too good to be true. Like the AI that makes you rich by "understanding" the stock markets, or the pill that ends all hunger.

  • God knows they have enough trouble doing that.

  • This will take "It's just what they asked for but not what they wanted." to a whole new level. Now even people with very little skill or critical thinking ability will be able to create programs.

    Let's face it, if people knew what they wanted upfront, coding would already be easier.
    Unfortunately, people don't know, and oftentimes they don't know what they don't know.

    "It ain’t what you don’t know that gets you into trouble. It’s what you know for sure that just ain’t so."

  • We had no-code/low-code programming in the 80's with pfs:File. Then Hypercard and Supercard. Then PowerBuilder. Then UML / Rational Rose. Heck, most early dedicated OO machines (Symbolics, Explorer) were based on the concept that you built up a large library of code and plugged them together at a very high level to construct applications.

    The problem isn't building applications like this. The problem is making them equally easy to maintain.

  • by BAReFO0t ( 6240524 ) on Sunday August 09, 2020 @08:04AM (#60382331)

    It is *literally* "describing it".

    The only difference from languages like Haskell, where you simply describe what it is instead of what it does, is that here you're falsely assuming that you can be vague and still get the expected results.

    Reminds me of the dumbest-common-denominator mold that "smart" phones force users into. Works "great" unless you want *anything* out of the ordinary. So unless you simply eat whatever pap standard ration you are served... never.

    If you actually want to get some exact thing, you have to describe it exactly too. No way around it. And that is, and has always been, the core job of a programmer.

    • Writing code is closer to giving instructions to a very smart but severely autistic child who takes everything literally.

  • by OneHundredAndTen ( 1523865 ) on Sunday August 09, 2020 @08:15AM (#60382351)
    Once we have been able to solve the do-what-I-mean-not-what-I-say problem. Good luck with that.
  • Someday many things will be possible. Which day that is, is another story. It could be next week, in 100 years, or never (the human race being unable to survive its own stupidity).
  • You record your voice giving the specification of the code and then send it into the cloud. After waiting a bit (for various definitions of "a bit"), code comes back and you publish it. Done and done.

    Of course, up in the cloud 1,000 code monkeys take your specification and actually produce the code. But you don't need to know that; just write a check for a subscription to the service.

  • by fuzzyfuzzyfungus ( 1223518 ) on Sunday August 09, 2020 @08:42AM (#60382401) Journal
    Even if such technology works at basically sci-fi levels, rather than being a real but unthrilling iterative advance in compilers and autocompletion, I suspect that a lot of people would be in for a shock about the fact that you need to actually know what you want in order to describe it.

    So long as implementation remains a nontrivial specialist exercise it's easy to maintain the belief that you really do know what you want, just not the fine details of how to get it. If implementation becomes trivial everyone gets an exciting crash course in whether or not they ever possessed conceptual clarity.
  • If the code has sufficient safeguards and blocks of pre-defined code it will simply slot together, then it might be OK, albeit limited to start with.

    However, when you consider the damage a misspelled variable or missing terminating semi-colon can cause in some high-profile apps, you'd better pray to your favourite deity that the code is not built on the fly from standard syntax coding rules, or at the very least that the source parser is absolutely shit-hot!

  • Need to write perfect code for every platform? Need an FPGA? Get an FPGA guy.

    We used to have a developer at work who claimed he could do it all. After he left I got tasked to review some of his code; it was borderline negligent, and it was clear he did not understand the platform he was working with.

    • What do you do when you need specialty integration of CPUs, GPUs, and FPGAs, but your CPU guy only knows CPUs, your GPU guy only knows GPUs, and your FPGA guy is fresh out of college and is far from a seasoned vet?

      That's the kind of problem this tech is trying to solve, because the stated problem is too few Senior Devs for the emerging technologies, and not enough cross-platform expertise.

  • by MMC Monster ( 602931 ) on Sunday August 09, 2020 @08:51AM (#60382421)

    Create a unique Sherlock Holmes mystery with an adversary who is capable of defeating Data.

  • Glancing at this I got this horrible image of someone hooking up Google to Stack Exchange with a bunch of magic glue and trying to build a system that would automatically program things based off questions and answers....
  • We already do that. We "describe" the problem in a high-level language and a "compiler" translates that description into a computer program. Of course, some "high level" descriptive languages are better than others. And the eternal Septemberites invent more of them every day.

  • Sure! We'll probably need some sort of symbolic language with precise syntax and operators to describe it with though.
  • Describing precisely what the computer is supposed to do? That's called programming. And if you babble as incoherently as most project leads I've had, the computer will have as hard a time grasping what you want as I do.

    This is, as we know, the whole purpose of all existence, the epic and eternal battle between the universe and us developers. We try to program ever more foolproof programs whilst the universe tries to produce ever more epic fo

  • Some manager jackass tells me how to design what he doesn't understand, and I am the vehicle by which his "code" gets done.

    If you think adding even more non-thinking people to the mix will make it better, glad I will be fully retired by then.

    GOOD LUCK!
  • Is it limited?

    Or is this just the current amount of the global population who actually want to code?
    Who fell into the idea of coding, whether by choice or circumstance, found they could do it and stuck with it?

    Would being able to describe code - possibly through just speech - make it any more widespread?

    I don't think so. To what end?

    Modern code is, more often than not, an abstraction on top of many other abstractions which end up as machine code at a lower level (and then binary).

    We already describe our c

    • I should add: if the current energy crisis continues, where the energy being used is ultimately environmentally destructive, the onus is on us as programmers to ensure our code uses as little energy as possible to get the job done.

      However, older coders have seen the idea of optimal code being pushed aside and the extra grunt from CPU/GPU power covering a multitude of coding sins - bloated, poor performing software is "Good enough", because the hardware can run it.

      Imagine what describing software could end up wi

  • Do what I meant

  • by pele ( 151312 )

    it's called Z

  • by GerryHattrick ( 1037764 ) on Sunday August 09, 2020 @09:55AM (#60382573)
    50 years ago I wrote COBOL in near-plain language, clear enough for the industrial client to check the logic from the source, with meaningful data-names. We input it on 80-column cards which had been 'interpreted' (i.e. content auto-printed in plaintext along the top of each card, to make debugging easier - just switch card(s)), and the pack 'compiled' overnight, with 4 magical tape decks flipping away while we drank cocoa (sometimes with rum) - hard drives weighed half a ton then, and only Government had them. With all the resources now available, things seem to have gone way downhill since then.
  • No matter how advanced the automated code generator may be, it will invariably produce code that contains more bugs than the automated code generator. Those who want to make use of this [old||new] paradigm should very carefully consider the application being built, and whether the automated code generator is sufficiently safe and secure to be allowed to produce the code for that application.

    For those not familiar with the usage of "safe and secure" in this context, I offer the definitions as put forth by Jo

  • They also cook food by describing it to the waitress.
  • Clippy: It looks like you're trying to custom write a common piece of code, may we suggest the following standardised module:
    void NSA_backdoor(void);

  • Sounds like a new standard: https://xkcd.com/927/ [xkcd.com]
  • Back in the golden age of CGI and visual effects, a lot of people were in awe of the work John Knoll did on a Mac, and those of us who were privileged enough to hear him talk about his methods all wanted a plugin with one button labeled "Make me like John Knoll". Not gonna happen folks. You still have to work hard at something if you want it to be good.

  • Something that you learn quickly is the importance of knowing what you are asking for. My boss will regularly ask for some data, and I'll get him the data. His response will inevitably be something along the lines of "No, when I said 'long time' I meant an hour, not a day. Oh, and I only wanted it for operations during peak hours, etc." After years of these back-and-forth exchanges you'd figure he'd be getting better at asking more specific questions, but he has not. There's no way a computer will
  • Try describing anything to a speech-to-text converter, Siri for example. Error rates are huge, and there is no concept whatsoever of context. There is no – none – zip – zilch – zero – automated process that construes meaning from text. Till context can be conveyed to an automaton, this doesn't even reach the level of a thought experiment. We lack the technology to even begin discussing this.
  • There's a word for what adequately describes what code needs to do.

    It's "code."

  • I've never been a professional programmer; I was entirely self-taught, mostly in classic 'C'. But it sounds to me that what they're describing here amounts to some half-assed software 'interpreting' what you input to it, and then slapping together some 'software' for you out of files full of pre-written copypasta.
    At best I'd hope it would at least deliver source code to you, which I'd hope 'programmers' of the time would still have the smarts and the education to be able to review, line-by-line, and edit it,
  • code describe you
  • Coding is the easiest part of "programming". Identifying a problem, describing the problem, creating solutions for the problem, designing the solution, and finally implementing it, aka coding. I will spend all day on a problem and make a 5-line code change. And as a programmer, I spend more time reading code than writing it. How does this play into the computer making the code for you? Will it be easy to read, or is it write-only and you have to start over if you make a mistake?
