Programming

Dumbing Down Programming?

RunRevKev writes "The unveiling of Revolution 4.0 has sparked a debate on ZDNet about whether programming is being dumbed down. The new version of the software uses an English-like syntax that requires 90 per cent less code than traditional languages. A descendant of Apple's HyperCard, Rev 4 is set to '...empower people who would never have attempted programming to create successful applications.' ZDNet reports that 'One might reasonably hope that this product inspires students in the appropriate way and gets them more interested in programming.'"
  • by mwvdlee ( 775178 ) on Thursday November 26, 2009 @06:10PM (#30240064) Homepage

    Reply to myself

    Just looked at the portfolio of their consulting business; they use their own language only for the simple bits and other languages for the interesting stuff. On other projects they don't use their own language at all.

    Says enough to me.

  • by rolfwind ( 528248 ) on Thursday November 26, 2009 @06:34PM (#30240226)

    You might find the C/C++ crowd commonly accuses the Java or Ruby crowd of this overhead. Indeed, Java has a garbage collector designed to protect you from memory leaks, and Ruby is an interpreted language that pays a mild additional overhead since it cannot be optimized at compile time. But that's another debate altogether; it is simply evident that the further you move away from actual machine language and assembly, the more overhead you pay (generally).

    Progression? Like chronological? "Garbage collection was invented by John McCarthy around 1959 to solve problems in Lisp."

    http://en.wikipedia.org/wiki/Garbage_collection_(computer_science) [wikipedia.org]

    In fact, I would argue that the "overhead" figures are a best-case scenario for the C/C++ type languages, because they are usually measured on programs written by people who know exactly what they are doing, and which have had multiple eyeballs look over the code again and again. But is that speed really there for average code? How many OS security flaws are still discovered because of incorrectly handled pointers, or calls like malloc made without checking the result (a sketch follows below)? And what about bugs that leak resources?

    But one thing code can never get away from, no matter how high the abstraction, is that the programmer has to know what they are doing. That's why it's a tool. There is no replacement for that. Photoshop can't make you an expert photographer or turn crap photos into masterpieces, either. Why would anybody expect a programming language to do that for a programmer? It might hold your hand more and keep you out of trouble, but idiots always find ways around the idiot-proofing.
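
    To make the malloc point concrete, here is a minimal hypothetical C/C++ sketch (not taken from any real OS code) of the classic unchecked-allocation pitfall next to a checked version:

        #include <cstdlib>
        #include <cstring>

        // Pitfall: malloc can return NULL, and the buffer size is a guess.
        char *copy_name_bad(const char *name) {
            char *buf = (char *)malloc(64);
            strcpy(buf, name);   // crashes on NULL, overflows past 64 bytes
            return buf;          // caller must also remember to free() it
        }

        // Checked version: size the buffer and test the result.
        char *copy_name_checked(const char *name) {
            size_t len = strlen(name) + 1;
            char *buf = (char *)malloc(len);
            if (buf == NULL)
                return NULL;     // report failure instead of crashing
            memcpy(buf, name, len);
            return buf;
        }

        int main() {
            char *ok = copy_name_checked("slashdot");
            free(ok);            // the explicit release a GC would do for you
        }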

  • Re:What's new? (Score:3, Informative)

    by WED Fan ( 911325 ) <akahige@NOspAm.trashmail.net> on Thursday November 26, 2009 @07:02PM (#30240428) Homepage Journal

    Let me rework my own comment: People really hate it when a language does well despite their hatred for it. It rises to an almost religious fervor.

    Now, I can has my edit button?

  • by Anonymous Coward on Thursday November 26, 2009 @07:36PM (#30240704)

    # print the last comma-separated field of the second line of the CSV
    curl -s "http://ichart.finance.yahoo.com/table.csv?s=RBS.L" | awk -F',' 'NR == 2 {print $NF}'

    Not too shabby...

    It is also worth noting that the Rev code ('get the last item of line 2 of URL "..."') is fairly ambiguous and has to make several guesses about the returned contents: is it a text file at all? If not, what does "the last item of line 2" mean? If it is a text file, does "the last item of line 2" refer to the text source or its rendered result? If it is the rendered result, how is "the last item" determined? And what if the CSV text had used ";" or some other separator? Etc.

    BTW, awfully verbose "natural" programming languages have been around for quite some time: COBOL and AppleScript come to mind...

  • by aix tom ( 902140 ) on Thursday November 26, 2009 @07:56PM (#30240890)

    Well, not the failure to perform according to specs directly, but the results that failure may cause.

    I programmed PLCs for a while, and if I had messed up the emergency stop procedure in an obvious way and someone had died as a result, I might have faced jail time for reckless homicide or involuntary manslaughter. That said, I have heard of only a few cases with actual convictions, and most of those resulted in probation rather than jail, as in the case of this electrician. [democratic...ground.com]

    In the case of the Therac-25 incidents, there were too many contributing factors to really pin the problem on one person. The person who originally wrote the software wrote it for the Therac-20, where it didn't cause any problems because of additional hardware interlocks, so technically the software worked on the "machine" it was written for. The cause of the incidents was not an obvious one, like using an unsuitable language for the task.

  • by ottothecow ( 600101 ) on Thursday November 26, 2009 @08:39PM (#30241206) Homepage
    I have written in it... obviously not the new release, but some earlier version.

    It is a joke... its code runs so slowly. I was happy not to be using VB, but soon realized that VB is at least reasonably quick. It makes sense that the creators don't use it; anyone capable of creating a language is perfectly capable of writing in a better one.

    It could be good for GUI prototyping: it made very fast GUI interfaces that were fully interactive, and you could actually extend it pretty far. It just ran so slowly that I can't see any benefit to using it in a professional environment. The extra time your users would spend waiting on the program would far outweigh the time/money it would take to hire someone to do it right (or even VB-right).

  • Re:Lowering the bar (Score:3, Informative)

    by GWBasic ( 900357 ) <`slashdot' `at' `andrewrondeau.com'> on Thursday November 26, 2009 @08:39PM (#30241212) Homepage

    But in the hands of the inexperienced or sloppy coder ... they can unleash the most horrendous leak-riddled code possible.

    The same can happen in a garbage collected environment: Nothing will ever be collected!

    There was a bug in version 1.0 of .Net where, if you used a calendar control, your entire window wouldn't be collected. If your window referenced other data, that wouldn't be collected either.

    Cause of the bug: The calendar control "forgot" to unregister its event handler for the "system colors change" event, thus keeping an active reference after the window was closed.

    It's also important to understand that garbage collection isn't "magic" memory management; it's just meant to be less time consuming. A poor programmer will still forget to release resources that need to be explicitly released and thus end up "leaking" memory.
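
    The same keep-alive-by-handler pattern exists outside of .Net, too. As a minimal hypothetical sketch (a C++ analogue using shared_ptr, not the actual .Net code), a registered callback that captures a strong reference keeps the "window" alive after it closes:

        #include <functional>
        #include <memory>
        #include <vector>

        // Hypothetical stand-in for the "system colors changed" event.
        std::vector<std::function<void()>> g_color_handlers;

        struct Window {
            void onColorsChanged() { /* repaint */ }
        };

        void open_and_close_window() {
            auto w = std::make_shared<Window>();
            // Bug: the handler captures a strong reference to the window,
            // so unless it is unregistered the Window is never destroyed.
            g_color_handlers.push_back([w] { w->onColorsChanged(); });
        }   // w goes out of scope here, but the Window lives on

        int main() { open_and_close_window(); }   // "leak" persists until exit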

  • Re:Lowering the bar (Score:0, Informative)

    by Anonymous Coward on Thursday November 26, 2009 @08:42PM (#30241228)

    C++ has had automatic memory management for a long time. Keep most objects on the stack, use containers, and use smart pointers for everything else, and there will be almost no need for manual memory management.

    Manual memory management is a pain; I don't understand why so many C++ programmers end up doing it anyway. It is possible to have the language handle most of it without giving up any performance.
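
    A minimal sketch of that style (hypothetical names, written against modern C++ for brevity):

        #include <memory>
        #include <string>
        #include <vector>

        struct Employee { std::string name; };

        int main() {
            Employee boss{"Alice"};                   // stack object: freed at end of scope

            std::vector<Employee> team;               // container owns its elements
            team.push_back(Employee{"Bob"});

            auto temp = std::make_unique<Employee>(); // smart pointer owns the heap object
            temp->name = "Carol";
        }                                             // no delete, no free, no leaks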

  • by Simetrical ( 1047518 ) <Simetrical+sd@gmail.com> on Thursday November 26, 2009 @08:43PM (#30241238) Homepage

    It doesn't really matter on the web, as 90% of the time is spent hitting the database.

    Depends on the application. Wikipedia is much more CPU-bound than database-bound. Look at the database (db*) vs. application (srv*) server lists here [wikimedia.org]: at a quick glance there are at least five times as many app servers as DB servers. A typical request that hits the backend spends (IIRC) tens of milliseconds in the database and hundreds in PHP. Try formatting 500 or 5000 rows of a table when each row takes 1 ms, because yes, that happens when you try writing nice abstract formatting code in PHP.

    The website I administer is also much more CPU-bound than database-bound. Generating the front page of the forums [twcenter.net] takes 602 ms, with only 14 ms in MySQL and the rest in PHP. This is a >20 GB database, by the way.

    I really don't see how any typical web app could spend more than a few tens of milliseconds per request at the database, unless it's poorly written (too much/too little normalization, bad indexes, etc.). But it's very easy to do hundreds of ms of pure computation in a slow language like PHP, even if your code is well-written. Are most web apps really DB-bound? I just haven't seen it, personally.

"When the going gets tough, the tough get empirical." -- Jon Carroll

Working...