Programming

Dumbing Down Programming?

Posted by timothy
from the accessible-is-not-dumb dept.
RunRevKev writes "The unveiling of Revolution 4.0 has sparked a debate on ZDNet about whether programming is being dumbed down. The new version of the software uses an English-syntax that requires 90 per cent less code than traditional languages. A descendant of Apple's Hypercard, Rev 4 is set to '...empower people who would never have attempted programming to create successful applications.' ZDNet reports that 'One might reasonably hope that this product inspires students in the appropriate way and gets them more interested in programming.'"
  • by Anonymous Coward on Thursday November 26, 2009 @05:54PM (#30239944)

    Oh wow, a proprietary software development system? It's a new world, boys; there are plenty of solutions out there, and just because these guys have marketing dollars doesn't mean people will take them up on it.

    Unless it is open, it will die.

    Remember all those old proprietary platforms? They are all but dead. It is a new age; get used to it.

  • by mwvdlee (775178) on Thursday November 26, 2009 @05:59PM (#30239976) Homepage

    The simple truth is that many applications don't need that much performance or strange features and if a language like this enables more people to make their own custom apps, then I applaud it.

    Some people will argue "job-security through obscurity", but if your job depends on other people not understanding what you do, it's bound to end sooner rather than later anyway.

    I do wonder what the limits of this language are feature-wise. What type of applications could you NOT make with this language?

  • by thetoadwarrior (1268702) on Thursday November 26, 2009 @06:01PM (#30240000) Homepage
    Mind you, I only skimmed a couple of pages of the tutorial, but it's just a programming language that adds more words and more typing because it does things like spell out "add" rather than using "+". That may let idiots grasp programming a bit more easily than they would have before, but programming as it is does not require a degree in rocket science. It just requires that you actually have enthusiasm for it rather than thinking it's just a way to make lots of money.

    Not everyone is a programmer, just as not everyone is a mechanic, painter, etc. I don't think we have a lack of programmers but a lack of dirt-cheap programmers, and companies will do whatever they can to lower wages. Perhaps they'd be better off making better programs to earn more profits.
  • by mustafap (452510) on Thursday November 26, 2009 @06:05PM (#30240020) Homepage

    >the more human readable and dummy proof

    Actually, those two are mutually exclusive, not inclusive.

  • by MightyMartian (840721) on Thursday November 26, 2009 @06:18PM (#30240120) Journal

    You're quite right. Plenty of applications don't need the kinds of optimization one is going to get with C/C++. What concerns me is that, as we've seen with Ruby or PHP suddenly finding their way onto production servers, all the design choices (i.e. simplicity vs. efficiency, footprint, etc.) suddenly come back and bite you in the ass. There seems to have been this attitude over the last ten years, as memory and storage prices have fallen, that if you have a slow app, you just throw it on a faster computer and away you go! Java UIs have suffered from this sort of "the future will fix it" thinking for 15 years now.

  • by Daishiman (698845) on Thursday November 26, 2009 @06:47PM (#30240316)
    It doesn't really matter on the web, as 90% of the time is spent hitting the database.
    YouTube runs pretty much 100% on Python, and Facebook runs on Erlang and PHP. Erlang has the benefit of being highly scalable, yet it is relatively slow.
    Speed on the web doesn't really matter much. What's important is scalability, and today's shared-nothing approach pretty much guarantees that at the language level.
  • by SharpFang (651121) on Thursday November 26, 2009 @07:40PM (#30240744) Homepage Journal

    A traffic lights controller.

    Failure to react (detect the condition and perform the action) within the regulatory 300 milliseconds of a green light occurring in two colliding directions (~100 ms to detect the current, ~100 ms for the mechanical switch to cut off power, leaving 100 ms for the software) counts as our fault. If someone dies: unintentional causing of a road traffic accident with a fatality as a consequence. The same rules apply as if I had run someone over with a car as the result of my own traffic violation.

    Of course the 15 years option would be for malicious intent, like a hidden "all green" mode resulting in multiple deaths.

    And of course the level of redundancy of the cut-off systems and general paranoia in the software is awe-inspiring.

  • by innocent_white_lamb (151825) on Thursday November 26, 2009 @08:18PM (#30241054)

    Plenty of applications don't need the kinds of optimization one is going to get with C/C++
     
    Optimization aside (though it's nice to have), stuff written in C is about as future-proof as you can get with computers. C has been around for 37 years now, and userland code written in 1972 can generally be compiled with a modern ISO-compatible C compiler with very few or no modifications.
     
    Sure, printf() isn't necessarily all that pretty in a GUI environment, but that's not exactly the point -- it still works.
     
    C does take a bit of thought and it might take a bit longer to get a usable result out of, but that result will generally be serviceable a lot longer than a program written in the programmer's flavour-of-the-week.
     
    Maybe it's just my approach, but I would rather put in a bit more time up front and have something that I can just recompile to run on a CPU that hasn't been invented yet, instead of having to rewrite it from scratch.

  • Slick (Score:4, Interesting)

    by thePowerOfGrayskull (905905) <.marc.paradise. .at. .gmail.com.> on Thursday November 26, 2009 @09:20PM (#30241480) Homepage Journal
    Seriously, this is slick. I don't mean the language (it appears I need to install a plugin to view samples, which is a bit silly -- I just want to see the language). No, I mean the advertising. Post it to Slashdot with a title that casts it in doubt; link to a web site that requires you to install the plugin... poof! Instant installed client base.
  • by Have Brain Will Rent (1031664) on Thursday November 26, 2009 @09:45PM (#30241646)
    I don't think the point is whether or not someone would still start a project in it (that certainly wasn't my point), but that it was based on the same idea the article summary mentions and was in fact successful, if you use a metric for success based on the number of programs written in it. There are almost certainly far more programs written in Cobol than in Fortran. And even though there may be a lot of Fortran programming still going on in scientific/engineering circles, given the relatively tiny size of that programming community I bet there is still a comparable number of Cobol programs running out there, even though there may not be any more being developed.

    Since I didn't make myself clear the very first time let me say - what is being proposed has been proposed before in the form of Cobol (old boss, new boss thing). Cobol may not have been successful at the goal of enabling non-programmers to turn out useful code but given its very widespread use in the real world it was certainly successful just as a programming language. So while the new idea may not be any more successful than Cobol in enabling non-programmers to generate useful software (but, hey it might!) it may still yield useful tools for the generation of software by actual programmers.
  • by fast turtle (1118037) on Thursday November 26, 2009 @09:47PM (#30241654) Journal

    You mean it's not a good idea for Joe Sixpack to write his own code? Damn. He can't be any worse than MS has been over the last 20 years as far as crappy code goes. The really nice thing is that what actually works well will get improved, and garbage will fall by the wayside, as it should.

  • Re:What's new? (Score:3, Interesting)

    by scamper_22 (1073470) on Thursday November 26, 2009 @09:54PM (#30241704)

    Are we moving up the value chain of programming? Of course.

    However, almost all the things we can automate via compilers are things you wouldn't want to do by hand anyway.
    Who wants to code the assembler for a for loop? I welcome C!
    Who wants to code a GUI by specifying pixel positions and handling resizing? I welcome GUI designers and XAML.
    Who wants to write protocols encoding and decoding data? I welcome XML serialization and RPC! ...

    None of those things make good programs by themselves. Heck, using the right libraries and programmers who like nice-sounding function and variable names, you can make C# or Java really easy too.
    I've never found anyone hampered by syntax. A semi-colon here or brace there is simply not an obstacle to me.

    I taught computer science in high school. You would be amazed at the difference. Some students get it... and some just don't. Even a simple thing like variable assignment.
    Those that can get it...they get variable assignments, they get sequential steps, they get functions... they really pick up the fundamentals of any language quite quickly. The rest is design.
    I don't think people who can program can ever understand the mind of someone who doesn't get these things. I suspect it's why some kids get algebra and some just don't: they're unable to understand variables and abstract notions.

    The only things you can really remove the programming from are those whose rules are pretty static: things like very, very basic forms backed by a DB. But a variety of companies and environments already allow you to do such things.
    Anything that you can write a compiler for or auto-code generator for, I welcome.

  • dBase / Clipper (Score:2, Interesting)

    by daver_au (213961) on Thursday November 26, 2009 @10:06PM (#30241766)

    Wow, deja vu.
    I remember the xBase (dBase / Clipper, etc.) languages being touted as the programming solution for non-programmers. Programmers would be out of jobs; everyone would be writing their own applications. My first programming job when I left university was writing business applications in Clipper.

  • Re:Lowering the bar (Score:2, Interesting)

    by Anonymous Coward on Thursday November 26, 2009 @10:08PM (#30241782)

    I choose you to pick on :)

    Python is a very good example. Recently I spent many hours writing a C app (given the possible destinations of this app, that made sense for an embedded application). Then I ended up on another embedded box that had Python on it. Easy, right? Port it, use the same structs and whatnot, right? Just use classes and pop everything in. Wrong. I was out of memory PDQ. Processor usage was crap. Short story: I ended up 'optimizing' to the language, basically tricking Python into performing correctly. Not what I should have to do in a language.

    Learning/using a language is not about 'how hard it is to manage things' or 'getting things done faster'. It is about what the language is intended for and the cost trade-offs. Python is meant for smallish, short, to-the-point scripts. Sure, you can write large programs in it; I have written crazy programs in DOS batch script. C is meant to be lean and mean and 'on the metal'. But that comes at a cost, as you have pointed out.

    The 'don't worry about it, your hardware will catch up' crowd really ticks me off. It shows a true lack of care in their craft. With 4GL-type languages such as Python, Ruby, Perl, VB, and C# there *ARE* trade-offs. Ignore the details of what they do and why they do it at your own risk. To demonstrate, I will show a nice example of the sort of thing I'm talking about. It hits all the points.

    I will use a C struct to show it

    struct abc{ int a; int b; int c; } abcstruct;

    Clearing out that struct in C is a snap: one call to memset. In a 4GL I have to set each var to 0 with an abc.a = 0 type thing. Passing this struct to other parts of the system is also a snap, as the rest of the system probably expects C-type structs; with a 4GL I have to 'thunk' it. Not to leave C un-picked-on, I will show something Python does 'very' well.

    x = [1,2,3,[4,5,6,7]]

    and poof, x is already initialized properly with variables. In C that would be a painful exercise of properly nesting things and getting it 'just right'. Then iterating over that structure is a snap in Python, as it is 'built in'; with C it is a tedious exercise in nested for loops and type definitions.

    Higher level languages are good at what they do and have their place. But many times people become enamored with them because of the 'ease of programming'. That ease of programming comes at a cost.

    I have noticed many times programmers are in love with their language of choice. Usually it is whatever they learned first. So they see that language as the 'naturally' right one.

    As a side note, whoever came up with the idea of not having a begin and end in Python needs to have a tire iron taken to their head. Sure, it makes your code all 'nice and pretty' by default. But it also creates bugs (I am on my 5th one this year because I mistyped a tab).

    Also, back to the original point: 'Let dumber people program and you end up with dumber programs' -- very true! In *any* language you can write dumb things; it doesn't take much, usually. But let dumb people write things and you will end up with dumb code. I have seen awesome programs written in JavaScript, grotesque monstrosities in Python, and works of art in C. But it all comes down to understanding what the compiler/interpreter is going to do with the code you wrote.

  • by Mortice (467747) on Friday November 27, 2009 @07:34AM (#30244442)

    All right, I can't resist. A Ruby implementation...

    def should_uncheck?(item)
        item.checked? && current_font.provided_styles.include?(item.style)
    end

    menu.items.select { |item| should_uncheck?(item) }.each(&:uncheck)

    You could do something similar in any language supporting higher-order functions. The code meets your requirement of not expressing things through nested loops and reads more like "do this to all items for which 'condition' is true." That said, Inform7 is very cool. :)

  • by Simetrical (1047518) <Simetrical+sd@gmail.com> on Friday November 27, 2009 @10:32AM (#30245260) Homepage

    The thing is, many webapps are actually DB-bound despite appearing to be webserver-bound, because they store most of their state in the database (and nearly all the rest of it in the user's browser).

    Now this means you are shifting the burden of locking, serialization and other tricky stuff to the DB.

    When you do that, you can have as many identical webservers as you want (scaling "horizontally"), since the state is all in the DB (and most of the rest in the user's browser).

    Okay, in that sense I agree that databases are the major optimization concern for web apps, at least from a scalability perspective. It's worth pointing out, though, that this doesn't save you from latency issues, only scalability issues. If your PHP app takes 500 ms to generate a page, it doesn't matter how much hardware you throw at it, users' experience is going to be considerably worse than it should be. On Wikipedia, large pages can take literally 20 seconds or more of CPU time to generate – although this is hidden by two different layers of caching, so it's not usually visible to most users.
