
Dumbing Down Programming? 578

RunRevKev writes "The unveiling of Revolution 4.0 has sparked a debate on ZDNet about whether programming is being dumbed down. The new version of the software uses an English-like syntax that requires 90 per cent less code than traditional languages. A descendant of Apple's HyperCard, Rev 4 is set to '...empower people who would never have attempted programming to create successful applications.' ZDNet reports that 'One might reasonably hope that this product inspires students in the appropriate way and gets them more interested in programming.'"
  • by eldavojohn ( 898314 ) * <eldavojohn AT gmail DOT com> on Thursday November 26, 2009 @05:47PM (#30239900) Journal
    There are two major sides to this issue. One seems (note I said seems, not implying at all that it's unavoidable) to be that the more human-readable and dummy-proof a language is, the more overhead you pay when you design/implement it. You might find the C/C++ crowd commonly accuses the Java or Ruby crowd of this overhead. Indeed, Java has a garbage collector designed to protect you from memory leaks, and Ruby is an interpreted language that pays a mild additional overhead since it cannot be optimized upon compilation. But that's another debate altogether; it is just evident that the more you move away from actual machine language and assembly, the more overhead you pay (generally).

    The other side of this issue is that computers are our servants, not the other way around (and if anyone reading this is a bot or script, don't you forget that). I don't recall where I read it but this is the reason why the string is the most important data structure in computer programming. Because simply put, the string is the best way to communicate with the user. What follows from this logic is to screw the optimizers (or 'misers, if you will) and make the servant learn the language of the master--not the other way around. And isn't this how the most complex applications have progressed? Once requiring training and years of experience, now even a kindergartner can master a word processor. Computers and applications will forever be bending over backwards for the most important thing to us: us.

    And yet if an implementation of a language incurs an average of 10% overhead, your hardware will catch up in a matter of months.

    And yet if you run a data center the size of Google's and have several applications in said implementation running on hundreds of thousands of machines, a cycle here and a cycle there isn't so laughable to work toward saving. And isn't it the big players that ensure the lengthy life of a language and its implementations?

    So it's a good debate with several sides. Personally, I love the fact that I can code a web application in Ruby, run some old C code off sourceforge in Java with JNI (sort of) and bust out a C++ application for manipulating ID3 tags across my entire music library. To those arguing against Rev4, I ask simply "why not?" I mean, you don't have to use it, it's a natural progression so let it happen. Maybe you'll find it useful for prototyping? Maybe you'll find it's a useful tool for some problems and your toolbox will grow? Who knows?

    In the end, I would like to opine that the chip makers are forcing us toward languages that make multi-threading more intuitive and useful. I mean, they concentrate on threads communicating, or even on implementing APIs that help automate multithreading (by enabling what is appropriate to run in parallel) in loops and algorithms. That's going to be a large factor in whether a new language is adopted and survives.
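    The loop-level parallelism described above can be sketched with Python's concurrent.futures, which hides the thread plumbing behind a small API; the checksum workload below is made up purely for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def checksum(chunk):
    # Stand-in workload: sum the bytes of one chunk, mod 256.
    return sum(chunk) % 256

chunks = [bytes(range(10)), bytes(range(10, 20)), bytes(range(20, 30))]

# pool.map() spreads the loop iterations across worker threads; the
# programmer never touches a lock or a thread object directly, which is
# the kind of "automated" multithreading the comment describes.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(checksum, chunks))
```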
  • What's new? (Score:5, Insightful)

    by Rising Ape ( 1620461 ) on Thursday November 26, 2009 @05:49PM (#30239910)

    People have been saying this since FORTRAN meant you didn't need to know assembly language to make use of a computer.

  • by rdean400 ( 322321 ) on Thursday November 26, 2009 @05:54PM (#30239940)

    The big thing I notice in their "competitive comparisons" is that they strive to make Java, C#, C++, and PHP as verbose as possible when they're creating what looks like it should be optimal Rev code.

    I wonder if they didn't compare themselves to Ruby or Python because they couldn't contrive examples that produce huge LOC differences?
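    For what it's worth, a Python version of RunRev's "last item of line 2" example is only a couple of lines as well; the sample data and the fetching URL here are hypothetical:

```python
def last_item_of_line2(csv_text):
    # "the last item of line 2": take the second line, split on commas,
    # return the final field.
    return csv_text.splitlines()[1].split(',')[-1]

# Fetching the text itself (hypothetical URL) would be something like:
#   import urllib.request
#   csv_text = urllib.request.urlopen("http://example.com/quote.csv").read().decode()
sample = "Symbol,Price,Change\nACME,12.34,+0.56"
answer = last_item_of_line2(sample)
```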

  • by jaymz2k4 ( 790806 ) <> on Thursday November 26, 2009 @05:55PM (#30239956) Homepage
    The development of new languages and new ways of simplifying coding has been a part of the computer landscape since the whole thing began. You can argue that coding in Python is a form of "dumbed down" assembly. I wouldn't think of creating a webapp with assembly! Django has "dumbed down" much of the mundane parts I often have to create and dealing with forms and templating. But the one thing I have noticed is that no matter how easy "programming" gets there are still people that will just not "do it".

    I still can't see the masses suddenly deciding that they're going to program applications now. Hell, most of the people I know think conditional formatting in Excel is just too much effort. I can see this being used by actual programmers for users, but I don't think it will usher in a swath of uber-uber-amateur programmers all of a sudden.
  • by DAldredge ( 2353 ) <SlashdotEmail@GMail.Com> on Thursday November 26, 2009 @05:56PM (#30239964) Journal
    .net appears to be doing rather well.
  • by 4D6963 ( 933028 ) on Thursday November 26, 2009 @05:59PM (#30239982)

    Dumbing down programming can only get you so far towards the democratisation of programming: the most dumbed-down programming language still requires a user whose mind can express algorithms. And of all the people who can express algorithms and would want to, few are limited by the commonly used languages; that is, if you have a mind made for creating algorithms, learning to use a programming language will be fairly trivial.

  • by Anonymous Coward on Thursday November 26, 2009 @06:02PM (#30240004)

    Natural languages are full of ambiguities, so these "natural language programming environments" always use a more formal syntax (and semantics) and only look superficially like a natural language. Until you can actually talk to a computer (and the computer can take all the context into account), programming in such a language irritates people to no end when they stumble upon one of the differences between the programming language and the natural language it imitates.

    Programming is the act of understanding and structuring a problem. The coding that follows is practically trivial compared to that first step. There's certainly a need for more programmers, because more and more is automated and someone has to write that software, but please don't create the impression that you can eliminate thinking from programming. Fixing bad code costs more than writing good code.

  • Submarine article (Score:5, Insightful)

    by bjourne ( 1034822 ) on Thursday November 26, 2009 @06:05PM (#30240018) Homepage Journal
    Paul Graham wrote a very informative article about "news stories" like this one many years ago. And congrats to the company behind RunRev; it is not that often /. runs slashvertisements for costly commercial software no one has ever heard of.
  • Re:What's new? (Score:5, Insightful)

    by WED Fan ( 911325 ) <akahige@trash m a i l . net> on Thursday November 26, 2009 @06:06PM (#30240040) Homepage Journal
    If I had mod points I'd find some way to give all of them to you for insight. What follows in this thread is the same tired religious discussion. Back in my day of programming, which included paper tape and teletype terminals on time-shares, things were tough. We were making fun of the new little "home" computers. I can't tell you how many computer languages I sneered at. I was sure C++ would go down in flames, same for Java. I was sure Modula-2 would be the next great thing. The simple fact that languages people hate are still seeing widespread usage shows that the discussion is more religious than logical.
  • Re:BAD IDEA (Score:4, Insightful)

    by hairyfeet ( 841228 ) <bassbeast1968 AT gmail DOT com> on Thursday November 26, 2009 @06:07PM (#30240046) Journal

    The reason VB got a bad rap isn't because of VB itself, which IMHO is fine if used as intended; it is the fact that too many folks tried to stick a square peg in a round hole and use VB where it didn't belong.

    VB does ONE job and does it quite well: GUIs for databases. That's it. Nothing fancy, just a quick and basic GUI so Sally Secretary can input data into a database and get data out of it. While you may argue that you can do the same thing in other languages, which of course you can, the simple fact is most SMBs and SOHOs simply couldn't afford it. Hiring a programmer isn't cheap, whereas with VB even a simple PC repairman like myself can whip up a nice GUI for a database, which in my experience is one of the biggest needs for a custom app that most SMBs have, and is why VB is still the #3 language for business, even though MSFT has done everything they can to kill it.

    So if you want to complain about dumbing down in general, please do. But VB, when used as intended by someone who knew the limits of the language, was and still is just fine for the job at hand. For making custom GUIs for databases, I have yet to see anything that works as easily and as affordably as VB does in that role.

  • by reporter ( 666905 ) on Thursday November 26, 2009 @06:08PM (#30240050) Homepage
    This progression toward using English words and syntax to program a computer is less about dumbing down code and more about encouraging people to document their code.

    Ideally, a programmer should document each section of code by writing a block of comments explaining (1) why the code is used and (2) how the code works -- in plain English. However, given the intense pressure to produce code by an unreasonable deadline (imposed by brutal managers risking a bullet through their head), the first thing that is sacrificed is comments. Not writing comments saves several minutes per section of code and -- in total -- saves days of work over the course of the project. In other words, not writing comments means that an impossible deadline becomes slightly more possible.

    A programming language that uses mostly English words and syntax is essentially an environment for self-documenting code: the holy grail of brutal managers everywhere. However, this self-documentation addresses only "how the code works". The programmer must still write comments explaining "why the code is used". Still, getting half a loaf is better than getting nothing at all.

  • Rev4 syntax (Score:3, Insightful)

    by ickleberry ( 864871 ) <> on Thursday November 26, 2009 @06:11PM (#30240072) Homepage
    Is it just me, or does an ordinary "easy" programming language like PHP, VB or Python seem much easier to work with? The syntax of Rev 4 seems far too verbose, with not nearly enough parentheses.

    Also, if you understand how the machine works, is there any real need to program in "plain English"? The syntax doesn't quite make sense to me the way other languages do. For example, it seems more logical to have a loop that moves something along by a tiny amount and then waits a bit, rather than telling it to move a thing from one side to the other "in 5 seconds". With plain English you also end up with statements that have multiple equally valid meanings.

    I have nothing against making programming easier; I just don't think this is the right way to go about it. A good IDE with syntax highlighting and prompting features like VB's, plus a good set of libraries with decent error handling, is better than any of this plain-English stuff that introduces mostly redundant keywords for the sake of having plain English.
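    The explicit loop suggested above, moving a tiny amount and then waiting a bit, might look like this in Python; move_to is a made-up stand-in for whatever actually repositions the object:

```python
import time

def animate(start, end, duration=5.0, steps=50, move_to=print):
    # move_to is a hypothetical callback that repositions the object.
    for i in range(1, steps + 1):
        position = start + (end - start) * i / steps
        move_to(position)             # move along by a tiny amount...
        time.sleep(duration / steps)  # ...then wait a bit

# animate(left_edge, right_edge) would step the object across in 5 seconds.
```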
  • by Anonymous Coward on Thursday November 26, 2009 @06:16PM (#30240108)

    I hope they embrace this stuff like crazy. There's nothing better for our careers than lots of shitty code written by people who are just barely above monkeys in terms of intelligence.

    Why is that? Because they will fuck it up horribly. And then they will need real programmers to come in and clean up the mess. If you don't mind getting your hands dirty, there's a huge amount of work to be done.

    This is perhaps the best thing to hit the industry since outsourcing to India. My company has been fixing their fuckups for the past decade.

    It's entertaining to see them screw up. It's even better to see the outsourcing managers who have to come to us and admit that their "cost-saving" measures will now cost their companies 5x to 10x what it would have had they just done it properly and had real programmers write their code.

  • Re:xkcd relevance (Score:5, Insightful)

    by jedidiah ( 1196 ) on Thursday November 26, 2009 @06:16PM (#30240112) Homepage

    In this respect, I think "clarity" is improved much more by using constructs from mathematics than from "english".

    So computing languages that try to avoid classic mathematical syntax are probably more a reflection of "the fear of math" than of "the fear of computers", although there's bound to be some overlap. The real problem in both cases is widespread fear and ignorance. This isn't just about people writing their own programs. The reluctance to learn and explore hampers the usefulness of basic end-user interfaces (GUIs).

    We may be encouraging people to run with scissors when they haven't even figured out turning over yet.

  • by sznupi ( 719324 ) on Thursday November 26, 2009 @06:18PM (#30240116) Homepage

    ...which is largely due to external factors.

    (not saying .Net isn't nice, but ask yourself if it would do that well if it came from some small 3rd party dev)

  • by QRDeNameland ( 873957 ) on Thursday November 26, 2009 @06:19PM (#30240124)

    I think one issue here is the concept of "dumbing down", which goes back to COBOL at least. PHBs have always had this idea that better tools will make it so that people who have little talent or interest in programming (or, as you say, can't express algorithms) can write software. That, I think, will always be a pipe dream.

    However, the goal of "simplifying programming" will always be a valid and necessary one, just due to the nature of advancing technology. The trick is to adjust the goal from "helping people with little programming competence to write software" to "create tools that enable competent developers to be more productive and their work less tedious".

  • by GWBasic ( 900357 ) <slashdot@andrewr ... com minus author> on Thursday November 26, 2009 @06:20PM (#30240134) Homepage

    Let dumber people program and you end up with dumber programs. Way back in the year 2000, I found that most of the Y2K bugs were actually from more recently written programs in dumbed-down languages.

    I don't see it that way.

    After spending a few years in garbage collected land, I was assigned to work on a C++ application. The C++ application was a mere SOAP wrapper for a database, so it wouldn't have any noticeable performance advantages over an equivalent program written in a garbage collected language.

    The problem that quickly became obvious to me, and why I ran away from the project, is that manual memory management is so time-consuming, even for an expert programmer, that the "soft and easy" languages are more about letting great programmers get more things done in less time.

    This is even the approach taken in Python; the recommendation is to write well-tuned libraries in C for the parts where C will increase performance. For parts where C is a waste of time, Python lets you not worry about silly details.
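    That Python recommendation is visible even inside the standard library: builtins like sum() run their loop in C, so a hot pure-Python loop can often be replaced wholesale. A rough sketch of the difference (the timing numbers will vary by machine):

```python
import timeit

def slow_total(values):
    # Pure-Python loop: every iteration pays bytecode-dispatch overhead.
    total = 0
    for v in values:
        total += v
    return total

data = list(range(100_000))

# sum() performs the same loop inside the C runtime, with the same result.
assert slow_total(data) == sum(data)

t_python = timeit.timeit(lambda: slow_total(data), number=20)
t_c = timeit.timeit(lambda: sum(data), number=20)
```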

  • by gbjbaanb ( 229885 ) on Thursday November 26, 2009 @06:25PM (#30240162)

    Unfortunately, that makes sense to managers only. Those of us at the coal face know that you can hire cheaper, less skilled programmers, let them loose with easy-to-use languages (e.g. Visual Basic), and you will get a monstrous mess that is impossible to maintain.

    If you make them use a reasonably difficult language, most of them will not bother becoming programmers. This is a good thing.

    One other point that is never noted in these schemes to simplify programming and make programmers generic 'coding resources' is that a good, experienced coder can do more work (and of better quality) than half a dozen cheap, less skilled coders. This is never factored into management ideas of how you can outsource your coding and get the same quality for a tenth of the price. This could be why a lot of outsourced contracts don't tend to last unless they're lost in a sea of big-corporate bureaucracy.

    Oh, and don't forget that the more you chop and change programming languages, the fewer programmers you have who are experienced in using them. You will find C programmers who have 40 years of experience, but you tend to get programmers who've only "had a tinker" with languages like RunRev.

  • Re:What's new? (Score:5, Insightful)

    by ScrewMaster ( 602015 ) * on Thursday November 26, 2009 @06:29PM (#30240188)

    People have been saying this since FORTRAN meant you didn't need to know assembly language to make use of a computer.

    Yes, they have. But at least FORTRAN, for the things that it did do, did them very well. However, in a more modern context, dumbed-down languages invariably have severe restrictions on performance and capability, which makes them unsuitable for many purposes. Putting that aside for the moment, the reality is that unless you're coding mindlessly simple applications, coding is hard. It just is, and it takes both skill and talent to pull off a well-written application, and I don't care what language you're talking about. Furthermore, to a skilled developer a dumbed-down language is a liability ... it just gets in the way. A better approach to this problem is to identify and train more good programmers, but that costs money and time. Certain people think they see a cheap way out by replacing sophisticated developers and their tools with sophisticated tools and simpleminded developers. Good luck with that.

    This all comes down to one faulty but cherished premise (one of many held by today's business community, I might add): that complex, reliable applications can be built by minimally-skilled developers if we can, somehow, just put enough power into the tools. The problem is, the tools aren't the problem, it's the people. In the end, software development is as much an art as it is a science, and there really aren't enough artists to go around.

    All such attempts to short-circuit the need for skilled developers are doomed to failure until such time as computers truly can program themselves. Of course, at that point it won't really matter, and most of us software engineers will be slapping burgers anyway.

  • by icebraining ( 1313345 ) on Thursday November 26, 2009 @06:34PM (#30240222) Homepage

    Isn't the best approach to develop fast, identify the bottlenecks, and then rewrite those parts in a faster language, as Python does with its C modules?
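    Python supports exactly that workflow with its built-in profiler: write it straight, measure, and only consider a faster-language rewrite for the functions the profiler flags as hot. A minimal sketch (the hotspot function is invented for illustration):

```python
import cProfile
import io
import pstats

def suspected_hotspot(n):
    # Stand-in for the slow part of the program.
    return sum(i * i for i in range(n))

profiler = cProfile.Profile()
profiler.enable()
result = suspected_hotspot(10_000)
profiler.disable()

# Report the costliest calls; only the top entries are candidates
# for rewriting in C.
report = io.StringIO()
pstats.Stats(profiler, stream=report).sort_stats("cumulative").print_stats(5)
```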

  • by arevos ( 659374 ) on Thursday November 26, 2009 @06:36PM (#30240232) Homepage

    I wonder if they didn't compare themselves to Ruby or Python because they couldn't contrive examples that produce huge LOC differences?

    Probably. There's no difference in length between:

    get the last item of line 2 of URL url



    I guess the former is easier to read, but languages that have a lot of "magic" in them tend to be pretty bad at scenarios the developers didn't think of. Which will inevitably turn out to be something you want to do.

  • by broken_chaos ( 1188549 ) on Thursday November 26, 2009 @06:38PM (#30240250)

    Not entirely. In some cases they can be, but not in others.

    They might be if something is opaque enough that anyone who is not a 'dummy' will simply fail to produce something working, like trying to create an assembly program with no knowledge. However, they might not be if they're more in-between incomprehensibility and English, like someone entering the wrong command into a shell.

    For the latter, imagine if, instead of "rm -rf *", you had to type "delete all files in this folder, and I'm sure I want to do this". It's more verbose and much less efficient, but it's both more human-readable and likely much more dummy-proof. If someone can more easily understand what they're doing, they're more likely to stop and realise it may not be what they actually intended to do.
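    A sketch of that verbose-but-safe deletion in Python; the confirmation phrase and function name are illustrative, not any real tool's API:

```python
import os

CONFIRM_PHRASE = "delete all files in this folder, and I'm sure I want to do this"

def delete_all_files(folder, confirmation):
    # Delete every regular file in `folder`, but only if the caller has
    # typed out the full confirmation sentence (hypothetical safeguard).
    if confirmation != CONFIRM_PHRASE:
        return 0  # wrong or missing phrase: do nothing
    deleted = 0
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            os.remove(path)
            deleted += 1
    return deleted
```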

  • by SharpFang ( 651121 ) on Thursday November 26, 2009 @06:45PM (#30240296) Homepage Journal

    I can say one thing.
    You've never done embedded programming.

    No chips that have to work in temperatures between -40C and +120C.
    No devices that work 120 meters under the surface of water that is 1800 meters above sea level and a good 4 hours' march from the last road you can reach by car.
    No chips for appliances, toys and small devices where $0.03 per unit savings by choosing a model with 64 bytes(!) of RAM instead of 128 bytes of RAM converts to a six-digit profit.
    No devices where failure to perform according to specs and fail gracefully will land you in prison for between 2 and 15 years.
    No devices that run a dozen sensors and send the results every hour over GPRS running off a single battery the size of a standard "A/R20" for a year.
    No devices where you measure time between sending out a beam of light and receiving it bounced off the obstacle, to determine distance with 5cm resolution.
    No devices where you have to do error correction, encoding and driving control and data lines at 100 megabit/second - or more precisely, at one bit per 10 nanoseconds plus/minus 1.5 nanosecond.

    These kinds of applications won't have the hardware catching up to let you replace C, Assembly and VHDL with Ruby or Java for decades yet.

  • True, but the looser the syntax, the more you need to know to debug. Supporting complete natural language is a daunting task. Whatever constructs you don't employ end up forming an exceptions list.

    To be honest, the language is hardly the real problem. It's been a while since I did it, but I picked up my first couple of languages from books. The challenge was seldom the language itself, and more about breaking a task down logically into discrete units and defining them, ordering them, and putting the right logic around them.

    Text-based languages had many reasons to evolve the way they did. However, I see nothing invalid about producing code in a way or language that defines this information in a different manner. Couldn't you just as easily replace the text editor with a flow chart where each operation or function is represented as an object in the chart? Not saying this is how I want to roll, but I see no reason it couldn't be made functionally equivalent.

    In truth, I am not sure that it will shorten the time it takes to learn, as it will still take time to learn the skill of putting the pieces together. A calculator makes you an instant basic-math wiz: addition, subtraction, no need to learn times tables. However, it's not going to obsolete learning the concepts. It can't make you an algebra god.

    Once you learn one or two languages, picking up another is usually easy (I never really gave Lisp a fair shake, but it was the exception). The concepts are the same. I would imagine that a person who became proficient with something more HyperCard-like would have little trouble translating those concepts and learning some of the higher-level text languages.



  • by Anonymous Coward on Thursday November 26, 2009 @06:48PM (#30240322)

    This can be seen even in universities: there are CS majors with no exposure to assembler or C during their whole studies; some haven't even used C++.

    And money dictates here too. We are forced to study that horrible, disturbed nightmare that is Symbian for the mobile development class, although I did manage to talk the lecturer into allowing one half of the course test to be on Android if one wanted.

    I don't consider myself a good coder; my experience comes via a hobby with a simple C-like language which pretends to have objects. I've also seen some so, so much better coders, especially within the demo scene, but even I can see that the level of a regular CS student is horrible. There are a few rare gems out there, but the average is nasty. It'd be absurd to expect them to learn those languages to any decent level within the regular course times either.

    The efficiency of one's code is hardly ever noted within courses. It's good if it works. There's probably just such a huge need for "bulk coders" and other "IT professionals" that they've had to dumb down most of the courses. There are only a few rare courses which really challenge you (logic programming being the usual culprit), and they're not needed to get the papers.

    It's infuriating having to drowse through such highest level education.

  • Nope (Score:5, Insightful)

    by Colin Smith ( 2679 ) on Thursday November 26, 2009 @07:00PM (#30240412)

    The best approach is to develop fast, identify bottlenecks, and then require the user to upgrade their computer... their IT infrastructure... worldwide networks and datacenters.

    That's the economic history of programming.


  • Re:What's new? (Score:5, Insightful)

    by eddy the lip ( 20794 ) on Thursday November 26, 2009 @07:01PM (#30240416)

    I can't help thinking that particular grail quest comes from mistaking which part of programming is hard. I can't count how many times I've heard "I could do that if I had the time to learn the language." Except the hard part of programming isn't the syntax.

    Tools like the one under discussion seem to be aimed at the crowd that thinks the only thing to learn about programming is the language, and that you can then skate on that knowledge. If the language is English, suddenly you can bypass all those expensive, crotchety programmers.

    (This may be true for some tasks - there must be some utility here. But I'm sure I see some scales floating in that oil.)

  • by vadim_t ( 324782 ) on Thursday November 26, 2009 @07:04PM (#30240438) Homepage

    Ideally, a programmer should document each section of code by writing a block of comments explaining (1) why the code is used and (2) how the code works -- in plain English.

    Yes, but how the code works should be pretty obvious from the code itself. The "how" you may want to explain is the overall algorithm, and that won't appear in the code on its own no matter what language you use. What will appear is a (possibly flawed or misused) implementation of it.

    What you may need to describe is what the code is trying to accomplish, not what it's actually doing. A comment of "This code calculates the square root of the sum of the squares of the differences between points X and Y" on "sqrt((P1.X - P2.X)^2 + (P1.Y-P2.Y)^2)" isn't terribly helpful.

    Now, an explanation that it calculates the distance between points P1 and P2 using the Pythagorean theorem would be more helpful, and an explanation of what that's used for would be better still. But there's no way more verbose code will give you that. Code is just what the program is doing, not what it should be doing, what it's trying to accomplish, or why it needs to do it.

    A programming language that uses mostly English words and syntax is essentially an environment for self-documenting code

    Self-documenting code isn't about turning "x++" into "add one to x". That's more verbose, but doesn't explain anything extra. Self-documenting code is about writing something like:

    float distance = get_distance(&start_pos, &end_pos);

    Where the naming and flow make it perfectly clear what the program is doing without any further explanations.
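    The same point translates directly to Python: the names carry the documentation, not extra verbosity. A small sketch:

```python
import math

def get_distance(start_pos, end_pos):
    # Euclidean distance between two (x, y) points (Pythagorean theorem).
    dx = end_pos[0] - start_pos[0]
    dy = end_pos[1] - start_pos[1]
    return math.hypot(dx, dy)

# The call site reads as what it means, with no comment required.
distance = get_distance((0.0, 0.0), (3.0, 4.0))
```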

  • by eldavojohn ( 898314 ) * <eldavojohn AT gmail DOT com> on Thursday November 26, 2009 @07:05PM (#30240452) Journal

    These kinds of applications won't have the hardware catching up to let you replace C, Assembly and VHDL with Ruby or Java for decades yet.

    The part where I stated that Rev4 may be an appropriate tool for some tasks can be applied to all languages/tools. No one's writing web applications in assembly. And no one's using Jakarta Struts to control an embedded device.

    Nowhere did I make any claims that Ruby, Java or Rev4 will ever replace C, Assembly or VHDL for these problems. I was speaking about the largest chunk of desktop applications and applications that a non-coder might be able to use Rev4 to produce.

    Furthermore, Rev4 makes no claims to tackle a single one of the examples you listed. The premise of the discussion was Rev4 and its target users. No one is going to select a "dumbed down" language or technology to tackle any of the problems you listed, and no one is claiming Rev4 is going to let someone who's never coded tackle them.

    They are interesting problems and must be kept in mind before declaring "No one will ever program in assembly again!" But I made no such claim, nor should anyone. It would take a fundamental revolution in hardware, and at least one hundred years, to be able to make any such claim.

  • by GWBasic ( 900357 ) <slashdot@andrewr ... com minus author> on Thursday November 26, 2009 @07:21PM (#30240592) Homepage


    Yup, I worked with that when it was appropriate. That's not the issue. The real issue is that, when using boost::shared_ptr, you need to treat memory on the heap and stack differently. It's kind of silly to worry about these things in a thin SOAP wrapper for a database where the database is the real bottleneck and the overhead of garbage collection is negligible.

    The other issue that I failed to mention is that compiling C++ is often an order of magnitude slower than newer languages. This slow compile time is much more "expensive" than any lost sales due to requiring a slightly faster CPU and slightly more RAM.

    I can give you another example: an AI researcher I know rewrote newer versions of her algorithm in Java instead of C++. Even though garbage collection gives her some visible performance penalties, she can program and test her algorithms much faster, which has tangible economic value for her company.

  • by Anonymous Coward on Thursday November 26, 2009 @07:24PM (#30240610)

    They really do push it. Comments, unnecessary code, helper functions, unused namespace inclusions. Here is the result of two minutes refactoring of the C# code example they provide:

    class Program
    {
        static void Main(string[] args)
        {
            string[] t_quote_row = new System.Net.WebClient().DownloadString(@"").Split('\n')[1].Split(',');
            System.Console.WriteLine(t_quote_row[t_quote_row.Length - 1]);
        }
    }

    From 42 lines down to 8. It's obviously a contrived example, but whatever; all I did was remove anything unnecessary. I could shave two more lines by moving the open brace to the line above on lines 1 and 3, and another line by not being pedantic about getting the last entry from the quote row (the original example code just indexes in with a magic number).

    I can't imagine coding in that language. It seems so imprecise, forced to make assumptions about what you want. The line of revTalk they provide to do the above work is:

        get the last item of line 2 of URL ""

    How does it know to treat the string as CSV? I can think of a number of ways it could easily guess this like checking the content encoding, but the point is that I can't immediately see that it is guessing. What if I want the last character in the string, or the string is tab (or some other character) delimited?

    Gives me the heebie jeebies just thinking about debugging it.
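    For comparison, here is the same task with every assumption spelled out. A rough Python sketch (the function name and the sample quote string are mine, not from the product):

```python
import csv
import io

def last_item_of_line_2(text):
    """Return the last field of line 2, explicitly treating the text as CSV."""
    # The delimiter (comma), the line numbering (1-based), and the
    # "last item" semantics are all visible here, not guessed.
    rows = list(csv.reader(io.StringIO(text)))
    return rows[1][-1]

quote = "Symbol,Price,Change\nACME,42.50,+1.25\n"
print(last_item_of_line_2(quote))  # -> +1.25
```

    Barely longer than the revTalk one-liner, and nothing left to guess when it's time to debug.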

  • by kbahey ( 102895 ) on Thursday November 26, 2009 @07:43PM (#30240784) Homepage

    This is an age-old quest. There have been attempts at making programming English-like for many decades ...

    First there was COBOL: COmmon Business Oriented Language. Its syntax is very similar to English. It was sold as a way for managers to write programs without needing a developer involved.

    ADD 1 TO IDX.
    MOVE X TO Y.

    What happened instead is that a generation of developers learned COBOL and specialized in it, and managers were still managing.

    Next, there was SQL: Structured Query Language. Despite the mathematical model behind relational databases, SQL was again sold as a way for managers to execute queries and get reports for themselves. That may have worked until a manager ran a query on seven tables without any joins. That sent everyone back to "leave it to the programmers" mode again ...
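    The seven-table story is plausible: without join conditions, SQL happily returns the Cartesian product, and the row counts multiply. A small sketch using Python's built-in sqlite3 (table names invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER)")
cur.execute("CREATE TABLE customers (id INTEGER)")
cur.executemany("INSERT INTO orders VALUES (?)", [(i,) for i in range(100)])
cur.executemany("INSERT INTO customers VALUES (?)", [(i,) for i in range(100)])

# No join condition: every order row is paired with every customer row.
cur.execute("SELECT COUNT(*) FROM orders, customers")
count = cur.fetchone()[0]
print(count)  # 100 * 100 = 10000 rows, not 100
```

    Add five more tables like that and the "manager's report" query never finishes.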

  • Stupid argument (Score:4, Insightful)

    by jcr ( 53032 ) on Thursday November 26, 2009 @07:51PM (#30240842) Journal

    They said the same thing about COBOL.


  • by henrypijames ( 669281 ) on Thursday November 26, 2009 @08:43PM (#30241232) Homepage

    From TFA: "I suppose that adds up, 90 per cent less code equates to ten times faster."

    Really? I suppose you can write haiku ten times faster than stream-of-consciousness recordings, too?

    There's one thing that computer code and natural language text have in common: For both, confusing "writing" with "typing" is moronic.

  • by Nahor ( 41537 ) on Thursday November 26, 2009 @09:05PM (#30241388)

    There's no difference in length between:

    get the last item of line 2 of URL url



    I wonder what the code becomes if the URL contains a list separated by ':' or a more complex data structure.

    With such an example, I can make an even better language. I'll call it dot. It has only one character: a period. Here is how it compares:

    .
    which in that verbose Revolution 4 language gives (approximately):

    get the item 42 of the line 3465 of the structure 3 that contains "foo" in the record 6 counting from 3 of the 38990nd [ cut to avoid the slashdot lame filter...]

    Now, my language does only one thing but it's unbeatable at it :)

  • by Anonymous Coward on Thursday November 26, 2009 @10:01PM (#30241744)

    Depends on how you define "best". The best-practices approach is to define what your user needs -> define what the software must do to meet those needs -> design the software architecture to fulfill those requirements -> implement -> review -> verify -> validate. Implementation is usually a breeze once you have fully defined the software design. If the target audience of a programming language is people who are afraid of programming, then they will ignore the best-practices steps anyway. LabVIEW is a good example of this. LabVIEW empowers people who honestly should not be empowered. Following best practices, good programs can be made in LabVIEW (like all languages, it should be selected for certain purposes). However, when you empower people who don't know squat about good programming practices, you get garbage that works but is impossible to maintain, is unstable, and gives the language a bad name.

  • by shutdown -p now ( 807394 ) on Thursday November 26, 2009 @10:20PM (#30241856) Journal

    ... which is to say, not at all.

    The idea is not new, and something like this has been tried pretty much since the notion of a programming language higher-level than assembler was invented. Most of them die straight away. A few enjoy brief prominence, but then die anyway (e.g. dBase and dialects, and many other "4GL").

    Very few survive and take over some niche, but by that time they've inevitably strayed from the original "can be understood by any random guy" design goal. One example of such is SQL - yes, originally, it was intended to be a language in which managers and like-minded non-techies could easily write queries for reports etc. Yeah, right... every time I'm explaining why you can't write WHERE after GROUP BY, and have to use HAVING instead, I chuckle to myself.
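    The WHERE/HAVING distinction is exactly the kind of thing the "anyone can query" pitch glosses over: WHERE filters rows before grouping, HAVING filters the groups after aggregation. A quick sketch with Python's sqlite3 (schema invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
cur.executemany("INSERT INTO sales VALUES (?, ?)",
                [("east", 100), ("east", 300), ("west", 50)])

# Filtering on an aggregate has to go in HAVING, after GROUP BY...
cur.execute("""SELECT region, SUM(amount) FROM sales
               GROUP BY region HAVING SUM(amount) > 200""")
totals = cur.fetchall()
print(totals)  # [('east', 400)]

# ...because WHERE runs before the groups exist; this one is rejected:
where_rejected = False
try:
    cur.execute("""SELECT region, SUM(amount) FROM sales
                   GROUP BY region WHERE SUM(amount) > 200""")
except sqlite3.OperationalError:
    where_rejected = True
print(where_rejected)  # True
```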

  • 90% claim is fake (Score:3, Insightful)

    by bbn ( 172659 ) on Thursday November 26, 2009 @10:35PM (#30241948)

    One of the astro-turf comments in TFA reads like this:

    "Even though ‘90-per cent less code than traditional languages’ reads like a big claim, it is a valid one.

    If you have a string you want to extract the first 3 characters of the second word on the 5ths line from and display it in an alert box, how many lines of code would you need to write in traditional languages? In rev this is a one-liner.

    answer char 1 to 3 of word 2 of line 5 of theString /* where theString is a variable that holds the content */"

    Of course any competent programmer can do the same in just as little code. In JavaScript it would be something like this:

    alert(theString.split("\n")[4].trim().split(/ +/)[1].substring(0, 3));

    This is roughly the same amount of code. Not 90% less.

    "Text processing" is apparently touted as one of the strong points of the language. Yet I am sure old-fashioned Perl and regular expressions are more concise and powerful. As shown above, even JavaScript can compete.
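    For the record, plain Python is just as terse; the "90 per cent" evaporates against any modern scripting language. A sketch of the astro-turfer's own example (the sample string is mine):

```python
# First 3 chars of word 2 of line 5, spelled out with ordinary slicing:
theString = "one fish\ntwo fish\nred fish\nblue fish\nhello wonderful world"
first_three = theString.splitlines()[4].split()[1][:3]
print(first_three)  # -> won
```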

    How would this fare with real programming tasks? First-class functions? List comprehensions? Closures? A sound type system? You could go on forever.

    These topics seem to be ignored. This is a Visual Basic clone, not an attempt at a language you would write "real" programs in.

  • by Dragonslicer ( 991472 ) on Thursday November 26, 2009 @10:36PM (#30241960)

    But take some hack's C or Python code and then have to sift through it and fix it up just so you can make it a bit more modular.

    Bad programmers will be bad programmers no matter what language you give them. Some languages, however, are easy enough to figure out that even bad programmers can write something that mostly works.

  • by whoop ( 194 ) on Thursday November 26, 2009 @10:48PM (#30242026) Homepage

    You must be confusing Slashdot with rational thinking. You see, 'round these parts, every article that describes something new must be the only one of its kind to exist.

    ChromeOS? It's gotta be able to serve 500 trillion web hits, decode MRI scans, run MS Exchange to 10 people in an office, telephone you when the temperature outside goes above 50, and browse web sites from the couch. If it cannot do all that, and everything everyone in the world wants to do, it's useless.

    Nintendo Wii? It cannot play the simplest version of Pong 40,000 in 1080p across fourteen 10,000-inch televisions with fully realistic explosions. Therefore, it is crap.

    So, don't let it surprise you that this little "language" is crap since it cannot do some things.

  • by Infernal Device ( 865066 ) on Thursday November 26, 2009 @11:45PM (#30242294)

    So what if it's easier or dumber or whatever?

    Most of us aren't out there revolutionizing the world with our leet skills - we're pulling numbers out of a database and shuffling them into some other database. It happens - we get paid.

    If this language gets some of the shovelwork off my back and frees up time for me to solve some interesting problems, then I'm all in favor of it. If it provides a way for me to earn an income (or someone else to) then I'm all in favor of it.

    If it gets a few more people interested in programming, I think the world can handle that. Just because there's a new language on the block doesn't mean that all the other languages are suddenly useless. After all, we still have stuff written in COBOL floating around.

    The big picture guys will still hire programmers to do what we do because we can think about a task and break it down into its component steps.

  • by GigaplexNZ ( 1233886 ) on Friday November 27, 2009 @02:52AM (#30243176)
    The calculate distance comment is completely redundant as the variable that it is being assigned to is already named distance. Unless, of course, the intention was to state that it was in fact calculated instead of magically pulling a number out of a hat. You don't need one liner calculations to be commented to state their intentions if the variable names are chosen properly (which is what self documenting code is largely about).
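    To illustrate the parent's point (names invented): when the identifier already carries the intent, the comment is dead weight.

```python
import math

def average_speed(x1, y1, x2, y2, elapsed_hours):
    # Redundant comment, it only restates the variable name:
    # calculate distance
    distance = math.hypot(x2 - x1, y2 - y1)

    # With well-chosen names, no comment is needed at all:
    return distance / elapsed_hours

print(average_speed(0, 0, 3, 4, 2.0))  # 5.0 units in 2 hours -> 2.5
```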
  • by mcrbids ( 148650 ) on Friday November 27, 2009 @04:05AM (#30243514) Journal

    It is a joke...its code runs so slow.

    But its being slow is an implementation detail, not a language detail. The fact is, you were able to write some code that the machine "understood" well enough to perform. The rest is just implementation detail. There are many C/C++ compilers, all of which have various performance/price/compatibility trade-offs. Surprisingly, the most popular compiler is also one of the slowest: gcc generally prefers cross-platform capability over performance, and neither compiles the quickest nor produces the quickest executing code.

    But NONE of this deals with the real reason why not everybody can be a good programmer. A good programmer must be able to precisely articulate exactly what he/she wants the machine to do. And it's quite surprising how few people can really do that. Most people think that much of what programmers and computers do is just so much hand-waving, and while they crave the power of a programmer, they don't crave the attention to detail, without which something as simple as transposing two numbers can destroy everything.

    Yet, to get something done, you MUST know that you can't mix up a divisor and a dividend. This is a detail, and one of countless details that a programmer is paid to articulate. The REAL skill in being a good programmer isn't in the details of XYZ language, but in the details of the problem being solved!
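    That transposition point is easy to demonstrate (function names mine): both versions run without complaint, and only one is right.

```python
def unit_price(total_cost, quantity):
    # Correct: cost divided by quantity.
    return total_cost / quantity

def unit_price_transposed(total_cost, quantity):
    # Dividend and divisor swapped: runs fine, silently wrong.
    return quantity / total_cost

print(unit_price(10, 4))             # 2.5
print(unit_price_transposed(10, 4))  # 0.4 -- no error, just a wrong answer
```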

    Languages are progressing to be easier to code, and this is a good thing. Programmers are paid to solve real problems, and in the process, have to solve implementation problems. Languages that minimize implementation overhead give programmers more skill to solve more complex and more challenging real-world problems.

    Don't worry - the world won't have any real shortages of problem-solving, logical people any time soon. Today's problems are getting harder, not easier!

"Never give in. Never give in. Never. Never. Never." -- Winston Churchill