AI Education Networking Programming Software

Has The NSF Automated Coding with ExCAPE? (adtmag.com) 140

The National Science Foundation is developing a way to create working code using "automated program synthesis," a new technology called ExCAPE "that provides human operators with automated assistance.... By removing the need for would-be programmers to learn esoteric programming languages, the method has the potential to significantly expand the number of people engaged in programming in a variety of disciplines, from personalized education to robotics." Rajeev Alur, who leads a team of researchers from America's nine top computer science programs, says that currently software development "remains a tedious and error-prone activity." Slashdot reader the_insult_dog writes: While its lofty goals of broadly remaking the art of programming might not be realized, the research has already made some advances and resulted in several tools now in use in areas such as commercial software production and education...
For example, the NSF created a new tool (which they've recently patented) called NetEgg, which generates code for controlling software-defined networks, as well as Automata Tutor and AutoProf, which provide automated feedback to computer science students.
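The summary doesn't show what "automated program synthesis" looks like in practice. One common flavor (not necessarily ExCAPE's own approach) is programming by example: search a small domain-specific language for a program consistent with user-supplied input/output pairs. Below is a minimal C++ sketch of that idea; the toy DSL, its operations, and the example pairs are invented purely for illustration.

#include <functional>
#include <iostream>
#include <string>
#include <utility>
#include <vector>

// A toy DSL: unary integer operations that can be chained.
struct Op { std::string name; std::function<int(int)> fn; };

int main() {
    std::vector<Op> ops = {
        {"add1",   [](int x) { return x + 1; }},
        {"double", [](int x) { return x * 2; }},
        {"negate", [](int x) { return -x;    }},
    };
    // The "specification": desired input -> output pairs supplied by the user.
    std::vector<std::pair<int, int>> examples = {{1, 4}, {3, 8}, {5, 12}};

    // Brute-force search over all two-step pipelines b(a(x)).
    for (const Op& a : ops)
        for (const Op& b : ops) {
            bool ok = true;
            for (auto [in, out] : examples)
                if (b.fn(a.fn(in)) != out) { ok = false; break; }
            if (ok) {
                std::cout << "synthesized: " << b.name << "(" << a.name << "(x))\n";
                return 0;
            }
        }
    std::cout << "no program of depth 2 fits the examples\n";
}

For the pairs above it prints "synthesized: double(add1(x))". Real synthesis tools replace the brute-force loop with much smarter search over much richer languages, but the user-facing contract is the same: describe what you want, not how to compute it.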
This discussion has been archived. No new comments can be posted.

  • Dumb (Score:5, Insightful)

    by RightwingNutjob ( 1302813 ) on Sunday August 14, 2016 @04:40PM (#52701267)
    If you don't have to learn the intricacies of some esoteric computer programming language, you'll have to learn the intricacies of this esoteric NSF project. Next!
    • Re:Dumb (Score:5, Insightful)

      by Intron ( 870560 ) on Sunday August 14, 2016 @05:01PM (#52701349)

      Why do people keep trying to automate coding, which I spend less than 10% of my time on? What about:

      • - Converting nebulous requests into requirements docs
      • - Convincing the "architect" who hasn't coded anything in years that your functional spec is the 21st century way to meet the requirements.
      • - Going through countless design reviews on the proper background color of the alert dialog
      • - Finding the bug in the vendor-supplied library which is 6 versions behind the current version.
      • - Updating the night before release based on the new customer requirement that your manager forgot to tell you about.
      • Re:Dumb (Score:5, Insightful)

        by RightwingNutjob ( 1302813 ) on Sunday August 14, 2016 @05:24PM (#52701407)
        I'll tell you exactly why: only a small segment of the population (maybe 1%) has figured out that the point of using computers, as opposed to doing things with pencil and paper, is precisely that computers do exactly the things they are told in exactly the order they are told, every time vs humans who don't. That 1% has always understood that the hard part isn't crunching the numbers, and never has been; the hard part is figuring out what operations to do in what order to get the right answer.

        The rest are (and always have been) operating under the misapprehension that computers are electronic brains or oracles that you can converse with as you would another human being.

        The fact that people like Vannevar Bush and Norbert Wiener encouraged this attitude in their attempts to explain computers to 'the common man' did more damage to the public perception of computing than anything else, because it came early and set the tone for public perceptions. And those public perceptions are the first thing almost everyone encounters in their lives, whether they go on to be an MBA in the corner office or a kernel hacker who makes more money than the MBA for shaving a few microseconds off of a trade in an HFT software stack.
        • The rest are (and always have been) operating under the misapprehension that computers are electronic brains or oracles that you can converse with as you would another human being

          Well, that is true - I have seen so many people trying to explain in great detail to Google what it is they are looking for, not understanding what a database search actually does. But given that the NSF probably contains quite a lot of well-educated, highly intelligent and competent people, I would expect a project like this to have an aim that is somewhat intelligent; could it not be that somewhere on the journey from research to summary on /. the essence of the thing got somewhat muddled? Stranger things

        • Exactly. Whenever people say to me: "being a programmer, you must be really smart"! I reply with "no, I've spent lots of time learning to be dumb -- but dumb in the same specific way as a computer. Only by being dumb in the same way as a computer can I know how to tell it what to do."
        • by e r ( 2847683 )

          The rest are (and always have been) operating under the misapprehension that computers are electronic brains or oracles that you can converse with as you would another human being.

          I agree with you; also Dijkstra was right about this. [lambda-the-ultimate.org]

          At risk of being modded off-topic I'll also note that this stuff bears implications for whether or not strong AI is possible...

          • by Altrag ( 195300 )

            this stuff bears implications for whether or not strong AI is possible...

            Not in the slightest. It only bears implications for whether or not strong AI is possible using a standard sequential computer.

            It says absolutely nothing about whether strong AI is possible in general, such as by using a paradigm shift. In particular, real brains operate using massive parallel networks of very simple (to first approximation) "computers" ie: neurons. We've built smaller "brains" following that paradigm with several thousand neurons -- enough to study the concept in relation to very simple

            • There's no difference between what you can compute with one single processor and many parallel ones, although it's likely to affect the performance. A computer can simulate any physical (and hence any biological) process arbitrarily closely, although the performance difference can be extreme. If strong AI (behavior that apparently involves human-level intelligence) is possible for a biological machine, like a brain, it's possible for a computer.

              The arguments against AI entities generally assume that th

              • by e r ( 2847683 )

                A computer can simulate any physical (and hence any biological) process arbitrarily closely, although the performance difference can be extreme.

                A square is an approximation of a circle, but still totally unsuitable for use as a wheel.

                Likewise, our advanced adding machines are, I would argue, unsuitable for implementing artificial intelligence.

      • It sounds like this is what the fourth and fifth generation programming languages were supposed to be all about - describing the problem rather than writing specific code to solve it, I think. Or something like that.

        And still, people keep writing code in boring old third gen languages that actually solve the problems through hand-crafted algorithms. As it turns out, 4GL and 5GL work pretty well within the problem sets imagined by the designers of those languages, but the real world has to deal with issues

        • by Intron ( 870560 )

          Functional programming languages like Lisp predate procedural languages and are arguably more efficient, more productive and produce software that is easier to maintain. The reason we use procedural languages is only that they are easier to learn to program. 5GL constraint-based programming seems so special-purpose as to be useless in a general-purpose programming environment. It is in the same class as ExCAPE - solving the tiny part of the problem that doesn't need to be solved, instead of the 90% that is no

          • by Anonymous Coward

            > Functional programming languages like Lisp predate procedural languages and are arguably more efficient, more productive and produce software that is easier to maintain.

            I'm afraid to say you've probably never had to maintain LISP for anything other than a freshman computer course. Its flexibility is its drawback. Tossing software over the wall to someone else, "and then a miracle occurs", is the underlying problem with most object oriented programming, and LISP was used on a lot of programmers to teac

      • > - Convincing the "architect" who hasn't coded anything in years that your functional spec is the 21st century way to meet the requirements.

        I'm afraid you left out "finding out that the way it was done originally had a very good though undocumented reason, and explaining why the new software actually provides no gain whatsoever".

      • by Altrag ( 195300 )

        Because that 10% of the time is something the vast majority of the population will never be able to accomplish.

        Assuming one of these projects eventually results in a system that doesn't require a degree itself to operate, all of your points could then be done by any old dumbass manager (maybe not as well, but good enough is good enough in the business world. Managers and salesfolk stop giving a crap the second the check is signed.) Except the 4th point, which would presumably be irrelevant since you could

    • If you don't have to learn the intricacies of some esoteric computer programming language, you'll have to learn the intricacies of this esoteric NSF project. Next!

      Agreed, which is the same reason my coworkers and I resisted replacing our existing system with a CASE tool about 18-20 years ago.

    • by BCooley ( 899054 )

      If you don't have to learn the intricacies of some esoteric computer programming language, you'll have to learn the intricacies of this esoteric NSF project. Next!

      I think you're missing the point.

      The point is not that a computer-assisted programming environment will be easier for you to learn. The point is that it can make you dramatically more productive, allowing you to potentially write the same code that it might otherwise take a small team of programmers to write.

      For example, having the system complete partial specifications or partial algorithms and intelligently (at least for a computer) fill in the blanks as you work means that you can work at a greater leve

  • by Todd Knarr ( 15451 ) on Sunday August 14, 2016 @04:48PM (#52701293) Homepage

    They won't automate software development until they come up with a system that can handle creating correct software from incomplete and partially erroneous specifications which don't remain constant between the start of development and delivery. At best they'll be able to automate some of the tedious boilerplate coding.

    • by Tablizer ( 95088 )

      Yip, they really need to automate the slapping of PHB's if they want smoother software development.

      Granted, PHB's are job security. CRUD and GUI idioms have been around long enough that they could largely be encoded into data dictionaries which could control about 90% or more of the rendering of typical applications.

      But UI fads, stupid "standards", Microsoft's forced obsolescence, ego-based customization, customers/managers who don't want to bother to think things through, and other shit muck it up, causing
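      For what it's worth, the data-dictionary idea is easy to sketch: field metadata drives the rendering instead of hand-written form code. Here is a minimal C++ illustration; the Field layout and the plain-text "rendering" are invented for this example and aren't any particular tool's API.

      #include <iostream>
      #include <string>
      #include <vector>

      // One entry in the data dictionary: metadata, not UI code.
      struct Field {
          std::string name;
          std::string type;   // "string", "int", "date", ...
          int maxLength;
          bool required;
      };

      // Generic renderer: the same code draws any form the dictionary describes.
      void renderForm(const std::string& title, const std::vector<Field>& dict) {
          std::cout << "== " << title << " ==\n";
          for (const Field& f : dict)
              std::cout << (f.required ? "* " : "  ") << f.name
                        << " [" << f.type << ", max " << f.maxLength << "]\n";
      }

      int main() {
          std::vector<Field> customer = {
              {"Name",     "string", 80,  true},
              {"Email",    "string", 120, true},
              {"Birthday", "date",   10,  false},
          };
          renderForm("Customer", customer);
      }

      Swap in a different dictionary and the same renderer produces a different form, which is the sense in which a dictionary can "control the rendering" of a typical CRUD screen.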

      • Yip, they really need to automate the slapping of PHB's if they want smoother software development.

        Unfortunately there are two types of programmers: those who can self-manage, and those who need someone to constantly push them to get to work.

        It's unfortunate because most programmers are of the latter type, and those are the ones who are helped by having a PHB. The rest of us are just collateral damage.

        • by Tablizer ( 95088 )

          Good managers use the appropriate technique per employee. Unfortunately, good managers are hard to come by.

  • Oh no. (Score:4, Insightful)

    by mhkohne ( 3854 ) on Sunday August 14, 2016 @04:51PM (#52701315) Homepage

    Yet another thing that will draw a bunch of people who can't think into programming.

    Guys, the languages are NOT the root of the problems we have (they don't necessarily HELP, but they aren't the problem). The problem is people who can't wrap their heads around what they are doing, or the problem they are trying to solve, or the fact that they actually have to check their own work.

    The problem isn't the languages, it's the people.

    • by gtall ( 79522 )

      Well, it is the people, but it is also the problems themselves. Many problems are intricate, many-layered affairs. The best strategy for getting started is to stop with Agile madness and think hard about the problem itself. Talk it over with others, explore the facets, figure out where are the unknowns, etc. Business isn't in business to do these things...well, certainly not the ones run by the MBAs. No automated tooling will help with this.

      A similar thing happens in research but from a slightly different p

      • by Anonymous Coward

        You owe me an apology. As one of the first of the public to be diagnosed with the "disorder" in question, I resent being compared to Millennials, who are not even remotely ADHD; they just never got TAUGHT how to pay attention. There *IS* a deep and fundamental difference.

      • Both issues have another opposing force: millennials. They seem to have an "I want it now" mentality. The fixation on handheld gizmos only reinforces their ADHD and leaves them with the attention span of a gnat. Agile only feeds the disease, as do these gadgets.

        I agree. They fondle their phones like lovers* and feel like they're marooned on a desert island if they don't have one within reach every second of every day. Many of them have ridiculously short attention spans and are completely unable to escape from their self-imposed immediacy. They're like pseudo-zen creatures with cellphones, living solely in the moment.

        -

        *Except most millennials have never actually touched another human sexually because they're so socially awkward / retarded. It's like they have all

      • by Altrag ( 195300 )

        The issue is far deeper than that though:

        - We want to open up software development to a wider audience. We developers want to liken ourselves to the architect of an amazing bridge, but management would prefer we were the manual laborers, mostly because there are so many of us involved when building even moderately sized applications. So there's a huge push to move software development from the realm of "people who can think digitally" to "people who can think slightly better than a chimpanzee."

        Sure you

    • Simulink does that already and the sky hasn't fallen. There are plenty of jobs to be found. [indeed.com]

      I've, for the most part, forgotten a lot of my C because it's not needed. Turns out you don't have to know strict C to be able to think through logic problems.

    • The problem isn't the languages, it's the people.

      Yep. We all know people who are completely unsuited mentally for the job of coding or programming because they can't think their way through a problem. They may be good at doing other things but not problem-solving or coding.

    • by Jeremi ( 14640 )

      The problem isn't the languages, it's the people.

      Quite right! So the sooner we engineer the people out of the process, the better. ;)

    • That is why I am 100% ok with this. As a competent software developer, this will mean sooooooo much more work for me because these "new" programmers will be awful. This will be a boon for good software developers everywhere.
  • There is nothing fundamentally different from spoken languages in programming languages. Difficulties with learning programming languages, or any second language, have more to do with how we teach people to learn than with anything else. We spend so much time trying to have students focus on the boring aspects of literacy when there is no evidence that they are prerequisites for everything else. Knowing the multiplication table doesn't prepare you for other mathematics.
    • by JustAnotherOldGuy ( 4145623 ) on Sunday August 14, 2016 @06:09PM (#52701553) Journal

      There is nothing fundamentally different from spoken languages in programming languages.

      This is so wrong it's clear off the x1000 scale of Wrongness.

      • Oh? Care to elaborate, or are you satisfied with mere posturing?
        • trolling or ignorant?

          When you say there is anything more than the most superficial similarity between programming languages and natural languages then you come across as someone who knows nothing of at least one of the two. Everything I've thought to say would most likely be considered insulting to your intelligence. Perhaps you could offer some actual evidence to support your extraordinary thesis?

          • Let's start with something simple, then. To my understanding there is nothing that can be done that cannot be reduced to computations on a Turing machine. Also, operations can be reduced to machines built from connected logic gates that compute AND, OR, NOT, a constant 1 or 0, or a generator that randomly emits one of the above. I may have missed an operator; I'm not certain. Where would you like to go from there?
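            As a concrete version of that reduction claim, here is a minimal, purely illustrative C++ sketch of a one-bit full adder expressed only in terms of AND, OR and NOT (XOR is derived from those three):

            #include <iostream>

            bool AND(bool a, bool b) { return a && b; }
            bool OR (bool a, bool b) { return a || b; }
            bool NOT(bool a)         { return !a; }
            // XOR built from the three primitives above.
            bool XOR(bool a, bool b) { return OR(AND(a, NOT(b)), AND(NOT(a), b)); }

            int main() {
                // A one-bit full adder: a higher-level operation (addition)
                // reduced entirely to gate-level primitives.
                for (int a = 0; a <= 1; ++a)
                    for (int b = 0; b <= 1; ++b)
                        for (int c = 0; c <= 1; ++c) {
                            bool sum   = XOR(XOR(a, b), c);
                            bool carry = OR(AND(a, b), AND(c, XOR(a, b)));
                            std::cout << a << "+" << b << "+" << c
                                      << " = " << carry << sum << "\n";
                        }
            }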
        • You wrote, "There is nothing fundamentally different from spoken languages in programming languages."

          This is completely untrue, even with no more than a moment's reflection.

          You can't use a given keyword or function in any way other than the way it's programmed to act. But language is flexible, open to interpretation on multiple levels, and the meanings of words change based on their context, time period, and even location.

          For example, trim() will always mean what it has been assigned to mean, and will alway

            • Not entirely true; you can reassign a keyword to mean whatever you want it to mean, according to certain rules. Ever hear of the #undef directive? Even if it does not work in as complete a fashion as I think it might, surely you can see how it might work if extended to the fullest. And simply checking a global variable can change the meaning of a function. Here's one way that your bad example can be represented in pseudo-code: IF WithOldMan THEN BadMeansBad = True ELSE IF WithFriends THEN EnergeticBadMeansG
            • Challenge accepted! Mind if I use QB64?

              Oh please, could you be any more ridiculous if you tried? Your example doesn't work, even as a half-hearted attempt to prove that a function can be made to do something it's not intended to do. I'm sure there are legions of programmers out there using trim() to find cube roots....oh wait, no, there aren't.

              If you really think that "there is nothing fundamentally different from spoken languages in programming languages" then you probably don't understand either of them.

              But your example was amusing. Now use i

                • What do you mean it doesn't work? Is there a syntax error somewhere? Did I get my math wrong that something raised to the power of 1/3 is the cube root? What does the number of programmers using this version over the one in a standard library have to do with anything? Your statement as I interpreted it was that the computer could not be told that trim() has a different meaning from the commonly used one, so I attempted to show you how it wasn't the case. What did you really mean, then? Resigning yourself t
                • I'm still waiting for your example of how to use imagecreatetruecolor() to connect to a mySQL database, and use strtolower() to write a text file.

                  Or is it that those functions can't be used for those things?

                    • It's merely that I don't know how to get any function to do those things. No matter what you call one, once you know how to do something with a routine named one thing, you generally know how to do it with a routine named whatever you like. Taking a step back, I do know how to write a routine called strtolower() to write a text file, but I think I've demonstrated that that is beside the point and that the real issue is that if I were to use the words you used it would mean something completely different and as a
                    • It's merely that I don't know how to get any function to do those things.

                      No, it's that those functions don't do those things and never will.

                      -

                      I do know how to write a routine called strtolower() to write a text file

                      Yes, you can create a function to do anything you want and name it anything you want, but that's not what we're talking about. This is about using a pre-existing function to do something it doesn't do, like using the strtolower() function to write a text file.

                      That's how programming languages are different from spoken languages: the former have strictly defined meanings that don't change, while the latter have meanings that can and do

                    • No, in natural languages when the meaning changes it is the exact same thing: you have just taken a label with a preexisting meaning (function) and created a new word that's a homonym. It only actually changes in the exact same sense that the function appeared to change because I wrote a new function using the same label.
                    • When you create new code for a function, you tell the computer to rerun the compiler and linker and with natural language you do effectively the same thing.
                    • No, in natural languages when the meaning changes it is the exact same thing

                      Right, but that's not true of programming languages. You can't take a function meant to do one thing and use it to do something completely different. If you disagree, please show me how you can use the two PHP functions below to do what I'm asking:

                      1) Use imagecreatetruecolor() to connect to a mySQL database.

                      2) Use strtolower() to write to a text file.

                      In a human language you can change the meaning or usage of the word, but you cannot do that with a programming language. You'll never be able to use the imagec

                    • When you create new code for a function, you tell the computer to rerun the compiler and linker and with natural language you do effectively the same thing.

                      Sure, and now every bit of code previously written to use that function fails. Not so with a human language because human language is contextual. Computer code is not.

                      Again, show me how to use imagecreatetruecolor() to connect to a database, or how to use it to strip periods from a string. You can't, because that's not what it does. You can do that with a human language but not with a programming language.

                      But I can use the word "bad" (for example) in lots of different ways and the different meanings will al

                    • No, not every function that used that function fails, because you had enough sense to put the original version in a separate library file. Human language is the same way. Both are contextual, just in computer languages you are more aware of managing the context, which ironically led you to the conclusion that there is no context. C++ actually deals with managing the context by mangling names. Here's some pseudo code:
                      main
                      {
                      {
                      using namespacewherecolorroutinemeanswhatitusuallydoes
                      colorroutine
                      }
                      {
                      using
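                      For what that pseudocode is getting at, here is a minimal, compilable C++ version (identifiers invented for illustration): the same call, colorroutine(), resolves to different code depending on which namespace is brought into scope.

                      #include <iostream>

                      namespace usual_meaning {
                          void colorroutine() { std::cout << "fills the image with color\n"; }
                      }
                      namespace other_meaning {
                          void colorroutine() { std::cout << "does something else entirely\n"; }
                      }

                      int main() {
                          {
                              using namespace usual_meaning;
                              colorroutine();   // resolves to usual_meaning::colorroutine
                          }
                          {
                              using namespace other_meaning;
                              colorroutine();   // same name, different behavior
                          }
                      }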
                    • Both are contextual, just in computer languages you are more aware of managing the context,

                      No, computer languages are not contextual. In any given computer language a function does only what it's been designed to do, which is why you cannot use imagecreatetruecolor() to connect to a database, or use it to strip periods from a string.

                      Computer languages wouldn't work if they were contextual, that is, if a function's meaning or output were irregular or arbitrary. Human language, however, works just fine in a contextual mode.

                      Again, if you're correct, prove it by using imagecreatetruec

                    • I've already shown you pseudocode for how natural language is actually processed in the same way as computer language, so any apparent context that natural language has and programming languages don't is an illusion, and I asked you to explain what the correct processing for natural language in the human brain is if I am wrong; if you can't do that, then I guess we are done here. Every time you create a new folder, file, or namespace, you create a new context. I'm sorry if you can neither see that or
                    • You originally wrote, "There is nothing fundamentally different from spoken languages in programming languages."

                      That statement is simply incorrect. Ask linguistics professors if there is anything fundamentally different between spoken languages and programming languages, and they'll say "yes". So will nearly every Comp Sci student and teacher.

                      It may sound cool to claim that there is nothing fundamentally different between them, but it's simply not so.

                      Anyway, cheers, and thanks for keeping things civil. It's ni

                    • It doesn't matter if they say "yes" or not, they still have to back it up. Maybe I should try to get one of them to give me more than the time of day but I expect they'll want me to take formal classes and shell out tons of money instead. I think I can counter any claim made with facts, however.
    • Learning how to conjugate ser and estar doesn't help you speak Spanish, either.

      Oh, wait...

        • You saw a pattern and decided to attempt to apply it in a situation where it doesn't apply. In doing so, you created a strawman to attack. I am not entirely sure, but conjugation might be an area where talking more about the concept may aid in handling all the cases. And one more thing about language education: it might be more advantageous to teach by concept instead of by language. If the similarities between languages and their roots are taught first, you are better off. Vater in German and father in Engli
        • what are you smoking? You provide no evidence to support your moronic thesis and dismiss as a strawman counterexamples.

          Ah, wait. You aren't smoking anything: you are trolling.

          For the edification of anyone who might come across this tripe unawares: a native English speaker has an easier time learning German than Japanese because of the many similarities. The same for a native English speaker learning a romance language. A native speaker of German can learn Dutch or Swedish even more easily -- and once you le

          • Oh, it was meant as a counterexample and not an attempt to extend the logic of my initial example into an area where you thought it applied and I did not. Sorry, I misunderstood you. I still do not understand how it serves as a counterexample, though.
            The implication isn't that it is completely unknown or unutilized, but to elaborate on the other part of my thesis, (my experience with learning a second language in formal education is three years of high school German, and I have been working on learning Japane
          • I think you fleshed out what I was trying to say about the connection between natural languages rather well, so good job.
    • There is nothing fundamentally different from spoken languages in programming languages

      Well, there is the matter of ambiguity. Most human languages have scope for ambiguity in the syntax. A piece of computer code means one thing and one thing only. If it doesn't, it's a bug.

      Also, computer languages evolve differently from spoken ones. Spoken languages may have a precise syntax, but the speakers are free to ignore or adapt it and the meaning can still be carried across. If you try and get creative with th

      • In human language the apparent ambiguity is usually resolved with context, and programming languages can also have statements that need context to be resolved. See also logos naki world https://youtube.com/results?q=... [youtube.com] The song has meaningless sounds designed to trick the hearer into resolving them into words.
        I am also working on constructing a programming language that incorporates as many features as can be implemented together of all existing programming languages. You can use DO as an instruction to be
    • There is nothing fundamentally different from spoken languages in programming languages.

      ...is a thing exclusively said by people who have clearly never studied the topic of formal languages and what sets them apart from informal languages.

      • Easy to say, but more difficult to show evidence. I see you didn't present any. Care to have a go? I don't require absolute proof, just the preponderance of evidence. All swans are white until they found a black swan. https://www.bing.com/search?q=... [bing.com]
        • You were basically claiming that we could use swans to teach students the mechanics of jet flight. I generally don't provide evidence in response to posts such as those, since the distinction should be obvious to anyone who gives the topic even half a glance.

          But backing up for a sec and giving you the benefit of the doubt (since you responded more politely than my comment warranted), whether you realize it or not, your original assertion effectively says that there isn't a distinction between natural langua

          • No, I was claiming that you could use the concepts of the mechanics of jet flight to understand how swans fly. Or at least that's closer to what's going on.
            You weren't paying attention to your link very closely because Wikipedia just did.
            You bring up recursion and then proceed to contradict your point.
            My point is that level of rigor is useful in natural language too. If we taught people that they can apply that level of rigor to natural language but that natural languages are a special case where you don
            • You weren't paying attention to your link very closely because Wikipedia just did.

              I linked to Wikipedia's page for general closure properties, but I specifically was talking about them with regards to formal languages. I stand by what I said: you can't explain them without first explaining formal languages. Of course, to make another pedantic caveat (I'm not saying that you said this, but just in case), I suppose we could point out that Wikipedia uses natural language to explain biology, astronomy, and nuclear physics too, but we wouldn't assert that any of them lack are in any way simil

              • You missed my change in word usage, from "defined" to "describes". The mathematical proof showing that the truth of the system taken as a whole cannot be defined describes the very nature of truth and how it is elusive. That the truth of the system can break down at any second is all that really means. There's no net that completely prevents you from setting something down and it suddenly ceasing to exist, for example, and one more concrete example is that of electronics failing. One thing goes wrong and suddenly
                • You may think that things that are not mathematically definable exist, but in fact they do not.

                  No, I don't think that. I know that and backed it up with a link to the math to prove it. You asked for evidence earlier, and I've been providing it. You can't just dismiss it with "in fact they do not".

                  I may be willing to accept your premise that all things are describable, even if they aren't definable, but changing the discussion to being about describing instead of defining is an example of moving the goalposts. After all, whether or not something can be described is a tangential topic of no relevance t

                  • I keep interpreting things differently from the way you mean them. You say something is ineffective and, instead of providing evidence, you speak from your own anecdotal experience; then, instead of describing that experience, you decide that the fact that you have had an experience similar to the criteria should suffice, without any detail to explain why your experience meets the criteria.
                    You say both "I disagree with the apparent implication that they can be taught similarly" and that you have had experience doing it.
          • One thing you said that I missed in my initial response to this post is your notion of mathematically defined. I believe that no corner of reality can escape being able to be described mathematically.
  • by bfwebster ( 90513 ) on Sunday August 14, 2016 @05:28PM (#52701419) Homepage

    Not to be an old programming fart (but, hey!), but this comes up about every 5-10 years. Someone has created a system for automatic program generation that is going to replace programmers (4th generation languages, anyone? How about "The Last One"?), and it turns out to have only limited usefulness.

    Of course, code generation programs exist. They've existed almost as long as we've been programming computers. The most common are assemblers and compilers, which take in text specifications and generate running code (or sometimes bytecode to be interpreted). And if you stop and think about the difficulties that most of us who code have with making the source code we write produce running code that meets our needs, you can immediately see the issues with replacing, or bolting on top of, that system a 'source code generation' system. It can work very well as long as you don't exceed what it can actually do, and only if the code generation system itself is well-written and reliable. (This is why developers feel a sense of betrayal and anger with compiler bugs more than any other kind of tool bug.)

    So, yeah, like strong AI, self-coding systems are always 5 to 10 years out and have been for half a century. ..bruce..

    • The political types are unaware of how much money has been wasted on this effort for decades. Most people believe computers work like "Open the pod bay doors, Hal" rather than being complex mechanisms that must be given carefully prescribed tasks.

      This effort has had a budget of $10 million spent over 5 years, so these guys need to show some results in this election year. I would like to see the budget for the 4th generation computer effort and all the other work to make programmers obsolete. When the real bottleneck is alwa

    • by Tablizer ( 95088 )

      Most of the cost of a working software product is maintenance, not original code creation. Something that automatically generates code had better produce grokkable code, or else maintenance will be even more expensive.

  • ... they did not.

    The software engineers are required to extract the problem out of customers, who often don't know what they want.

  • by JustAnotherOldGuy ( 4145623 ) on Sunday August 14, 2016 @06:07PM (#52701545) Journal

    "Has The NSF Automated Coding with ExCAPE?"

    NO, they have not "automated coding". All they've done is provide a layer of abstraction to some predefined procedure functions.

    And who wrote that layer of abstraction? Real programmers working with actual code, that's who.

    Can you program through a Joe Sixpack GUI? Maybe, but that GUI and all the shit behind it didn't fall out of a fucking tree. It had to be written...in code...by actual developers.

    When Joe Sixpack uses this thing to write a medical billing program with a data warehouse and credit card gateways, let me know.

    • by raymorris ( 2726007 ) on Sunday August 14, 2016 @07:04PM (#52701715) Journal

      > When Joe Sixpack uses this [gui] to write a medical billing program with a data warehouse and credit card gateways, let me know.

      Oh they WILL point and click their way to a Sharepoint site that stores personal medical information, accepts credit cards, and emails it all as an Excel spreadsheet. And it'll look like it pretty much works, most of the time. (It doesn't bother anyone with alerts when it fails on numbers with more than four digits, so nobody sees any problem.) They just saved $6,000 over having a developer with a clue involved!

      Then some script kiddie will find it, the manure will strike the ventilation, and the company will spend $250,000 cleaning up the mess, much of that going to the security company I work for.

  • Programmers like myself like to believe we are exempt from being made as replaceable as possible. We're not. Whether it's Outsourcing, H2B, or crappy projects like this... Greed finds a way. The programmers of today will be the mill workers of tomorrow. I hope society figures out a better economic model by then, because *everything* we've tried up until now seems to fail significant amounts of people.
    • by HiThere ( 15173 )

      I do believe that automated coding is possible. I also believe that it would require a program that could handle English (or some other full language). This doesn't sound like it.

      Actually, it would need to do more than handle English, it would also need to have a rather complete model of the world. This just sounds like another domain specific language.

      • Thank god people haven't understood this yet. As soon as they realize that what a computer lacks is a model of the world, i.e. information, we are almost done as programmers.

        The real reason I cannot give a command to my computer to "create me a site like slashdot" for example, is that the computer doesn't know what I am saying.

        As soon as deep knowledge AI systems are fully developed, and loaded with the appropriate knowledge, our days as programmers will be over.

        • by Altrag ( 195300 )

          You're assuming that human-like intelligence doesn't inherently come with human-like failings. I mean it might be possible to have the best of both worlds, but that's far from clear at this point and at least some research suggests that certain failings (particularly forgetting things) might be an intrinsic part of the way our brains store knowledge, above and beyond simple capacity re-use and recall speed.

          So we'll have to wait and see (probably quite a long time still..) but there's certainly no guarantee

  • by Tom ( 822 ) on Sunday August 14, 2016 @06:55PM (#52701691) Homepage Journal

    By removing the need for would-be programmers to learn esoteric programming languages, the method has the potential to significantly expand the number of people engaged in programming

    Because we really need more amateur programmers fucking things up and creating software with exploitable bugs. Who needs information security anyway...

  • The choice of the word "automated" is unfortunate. This helps coding, but a human operator still has to tell the machine what to do, which is programming.

    True automated coding could only be claimed the day the human operator is removed from the process.

    • by r0kk3rz ( 825106 )

      The choice of the word "automated" is unfortunate. This helps coding, but a human operator still has to tell the machine what to do, which is programming.

      True automated coding could only be claimed the day the human operator is removed from the process.

      As someone who works in industrial automation, I have to disagree. This is what automation looks like: you take something that is labour-intensive and make it less so with machinery. Until we create hard AI, someone will always have to tell computers what to do, and the manner in which to do it; what this is doing is evolving the language we use to get computers to do what we want.

  • (Instead of spending billions on this mad A.I. boondoggle though, they probably should have hired me to write whatever it was instead.)

  • ExCAPE is a research program and grant, not a single finished piece of software. The output from such programs is mainly publications and ideas:

    https://excape.cis.upenn.edu/p... [upenn.edu]

    Automated programming, program synthesis, and similar projects have a long, long history:

    https://en.wikipedia.org/wiki/... [wikipedia.org]

  • The NSF isn't developing anything. The NSF has created a program that funds large scale research grants to universities. In this case, the grant is to a collaboration of several large universities to explore ways to meet this goal. If you click through the article and then to the page about the project, including the universities involved in the collaboration (MIT, Cornell, Michigan, UPenn, etc...), you can see actual useful information: https://excape.cis.upenn.edu/i... [upenn.edu]
  • IDEs like Visual Studio generate code using a GUI. Report builders like SSRS generate code using a GUI. There are lots of examples, and they all have one thing in common: they work within very limited boundaries. Visual Studio generates code to produce data entry forms; SSRS generates code to produce reports. This project might generate some specific class of applications. That's nothing new, but it will certainly never replace general purpose languages.

  • by the agent man ( 784483 ) on Monday August 15, 2016 @05:06AM (#52703259)

    NSF is a US government foundation supporting science through grants. They are NOT developing anything nor are they patenting anything. NSF is funding organizations, mostly universities, but has a clear disclaimer statement: "Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation."

    The original article does not make any such claims and indeed states "a research project funded by the National Science Foundation" - the poster, EditorDavid, should have been a bit more careful.

  • NSF does not actually do things - they *fund* other people who do things. NSF also cannot patent anything (nor can any part of the government), and the submitted patent for NetEgg has nothing to do with NSF.

  • The joke's on you! This is just another of the latest and greatest, for ever and ever frameworks to re-implement everything in. Another latest and greatest, for ever and ever framework, no need for anything more, ever, we really mean it this time, will come along again shortly.

    I've been through about 10 of them during my quarter century career.
