
Removing Software Complexity 178

t482 writes "Charles Simonyi (ex-Xerox PARC & Microsoft) says that software "has become a field where we focus on incremental improvements in processes. That course is futile, because it can never solve the problem of human imperfection." Even as software collapses under the weight of its own complexity, we've barely begun to exploit its potential to solve problems. The challenge, Simonyi believes, is to find a way to write programs that both programmers and users can actually read and comprehend. Simonyi's solution? To create programming tools that are so simple and powerful that the software nearly writes itself. "Software should be as easy to edit as a PowerPoint presentation," Simonyi asserts."
This discussion has been archived. No new comments can be posted.

  • by extrasolar ( 28341 ) on Monday November 03, 2003 @01:06PM (#7378358) Homepage Journal
    If everyone could do it, I wouldn't be doing it.
    • You will still have to create rulesets for this system. I don't think they'll be all that simple.

      I think his intention is that a programmer creates the initial ruleset, and then the programmer's clients/customers can make incremental changes, such as changing the sales tax rate from 8% to 8.25% or modifying income tax tables.

      That eliminates a lot of work for programmers, but by no means all.

      Visual Basic made it possible for untrained people to write software, and Access made it possible for untrained p
      • Visual Basic made it possible for untrained people to write software, and Access made it possible for untrained people to write database applications,

        No, VB and Access made it possible for untrained people (and naive managers) to THINK (quite incorrectly) that they could write software and DB apps.

        but neither of those applications has reduced or eliminated the need for people to create software.

        Exactly.

        There is no silver bullet for making software easy, and this has been known for decades.

        Cobol, for instance, was supposed to be English-like, and hence understandable and programmable by non-programmers. Other English-like programming languages have made the same claim. Wrong every time on all counts.

        The problem is that specifying arbitrary algorithms requires extreme precision, unambiguity, and tedious detail far beyond anything the average person is even interested in attempting, let alone capable of. It doesn't particularly matter which language or tool is offered, what matters most is the person's abilities (and willingness!) to be excruciatingly detailed and logical and patient.

        This has been studied to death, but hope springs eternal...

        Another missing silver bullet is declarative languages...it's vastly easier to declare what is needed than it is to specify procedurally how to achieve the goal. However, no one has ever invented a Turing complete declarative language, and there is good reason to think that such a thing is impossible (infinite potential problem domains).

        Simonyi is a very intelligent and experienced guy, so he likely knows this. I hope he does; he should. So I like to interpret what he's saying as a grandiose way to say that he's hoping to make a big improvement in the art of programming -- there certainly is huge room for improvement.

        But if he literally means he hopes to make all programming as easy as making powerpoint slides, then he is a fool or a liar (but he might still produce some cool tools).

        (Making really cool graphics for the backgrounds of powerpoint slides is an art, BTW ;-)

        • as long as functional programming languages are considered to be declarative (at least according to foldoc [ic.ac.uk]) then there are plenty of declarative programming languages. arguably, functional languages are not completely declarative, but they aim high and do a much better job of defining the problem.

          • as long as functional programming languages are considered to be declarative (at least according to foldoc) then there are plenty of declarative programming languages. arguably, functional languages are not completely declarative, but they aim high and do a much better job of defining the problem.

            I had in mind the older and less fuzzy definition of "declarative language", meaning one that is not Turing complete because it lacks explicit conditionals, loops, and recursion.

            For instance, BNF is use

            • good point. interesting you mention BNF, explicit conditionals, loops, and recursion. loops are not necessary (e.g. SML), BNF *is* recursive (still not turing complete, because CFGs are strictly less powerful than turing machines). conditionals aren't even really necessary, yet BNF (when implemented as an NPDA) sorta does have conditionals since its output depends on the value on top of the stack; a turing machine can be implemented with 2 integer counters, or give a PDA two stacks-- and it's as powerful as
              • good point. interesting you mention BNF, explicit conditionals, loops, and recursion. loops are not necessary (e.g. SML), BNF *is* recursive (still not turing complete, because CFGs are strictly less powerful than turing machines). conditionals aren't even really necessary, yet BNF (when implemented as an NPDA) sorta does have conditionals since its output depends on the value on top of the stack; a turing machine can be implemented with 2 integer counters, or give a PDA two stacks-- and it's as powerful as

                • Oh, and I should've mentioned that I was a strong enough BSD fan to lose my loyalty to Sun (I used a Sparc at home) exactly when they switched from BSD to System V.

                  That's when I started to look for another solution, and found Linux -- which has never, obviously, been pure BSD-like, but did always include enough BSD stuff, rather than sitting exclusively on the SysV side of the world, to make me happy.

    • well, you could be creating really really REALLY complex systems with this then (and that's what they would end up being used for... and we would be at point zero).

      it doesn't sound bad, but all in all it does sound a bit like BASIC, and no matter what way the program is done you still have to know what you're trying to do (what this seems to imply is that there would be some super AI figuring out what exactly the program was supposed to be doing in the first place so that it would do "not what i told it but w
    • Me too.

      But seriously, every once in a while someone comes up with the idea that software should be easy enough to create that anyone could do it.

      It never ceases to amaze me that people think they want that. What you will get is 5 million shitty apps. A great software developer doesn't simply code the solution, but can provide insight into the solution.

      I, for one, am not worried in the slightest.

  • Didn't Apple have some QuickCard thingy for a while? I recall them touting it as programming for the everyman...
    • HyperCard [aol.com]. I believe it is officially cancelled by Apple, but it is still for sale [apple.com] at the Apple Store.

      JP

    • by jmccay ( 70985 )
      Another company tried something like this in the mid 90s (1996 to 1998). Their product was called Icon Author (a glorified flow chart using icons and flowchart symbols); it was simpler, but the logic still perplexed some of the temps we hired at the company where I was working during that time. They even tried to make it cross-platform, but they failed because they used MFC (and MFC on the Mac sucked {on purpose probably}). They might have succeeded today if they had used wxWindows.

      The problem with this idea
  • by WasterDave ( 20047 ) <davep AT zedkep DOT com> on Monday November 03, 2003 @01:14PM (#7378415)
    Little known fact: This is the same Simonyi who invented Hungarian notation.

    Google for "the tactical nuclear weapon of code obfuscation" to receive further enlightenment [mindprod.com]

    Dave
    • by darkov ( 261309 ) on Monday November 03, 2003 @01:23PM (#7378500)
      That little invention really reflects how stupid this guy is. So much for abstraction when all your variables have their names encoded with their concrete representation. God help you if you want to change the type of your var. It's global search and replace for you.
      • That little invention really reflects how stupid this guy is.

        Calling Simonyi stupid is, well, stupid. Or at the very least ignorant. He's brilliant. And he invented his notation while writing in C, a language not known for its abstraction.

      • by GCP ( 122438 ) on Monday November 03, 2003 @04:36PM (#7380509)
        I don't know who you are, but the chance that you're qualified to call Simonyi "stupid" is statistically insignificant.

        Hungarian notation is a means for overcoming a critical flaw in the C language: the lack of type safety. There are about a million different "abstractions" that look to your C compiler like just a sequence of bytes. C code collects bugs like a porch light every time you try to evolve your code by changing abstractions. Hungarian notation, macros, other coding conventions, special "lint" tools, etc., are pretty much all designed to reduce the problems caused by the poor design of C itself.

        Simonyi contributed a workaround that's useful to those who know when and how much to use it.
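
        For anyone who hasn't run into it, here is a minimal sketch of what those prefixes look like in practice (the variable names are invented for illustration, not taken from any real codebase):

        #include <stdio.h>

        int main(void)
        {
            /* Hungarian-style prefixes encode the intended "kind" of a value in
               its name, because the C compiler only sees the raw type. */
            unsigned long ulFileSize = 4096;    /* ul  = unsigned long                     */
            const char   *pszName    = "log";   /* psz = pointer to zero-terminated string */
            int           cItems     = 3;       /* c   = a count of items                  */
            int           ichFirst   = 0;       /* ich = an index into a char buffer       */

            /* cItems and ichFirst are identical ints to the compiler; only the
               prefix tells a reader they name different abstractions. */
            printf("%s: %d items, first char at %d, %lu bytes\n",
                   pszName, cItems, ichFirst, ulFileSize);
            return 0;
        }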

        • Disagree. C problems are in no way improved by making code less readable.

          Including data type in variable names is bad. If your functions are so long that you can't even see where your variables are declared, you need to break them into smaller pieces. And with C inlining, you don't even have the performance argument from Java.

          Hungarian notation bad.

          Code generation bad.

          Simple readable code good.

        • by nathanh ( 1214 ) on Tuesday November 04, 2003 @01:38AM (#7384338) Homepage
          Hungarian notation is a means for overcoming a critical flaw in the C language: the lack of type safety.

          But Hungarian notation doesn't fix that flaw. It's only as reliable as the programmer who writes the code. In most cases, that means not reliable at all.

          I have been bitten before by relying on the Hungarian junk at the start, only to discover an hour later that it was completely unrelated to the actual type.

          Hungarian notation gives the illusion of improved type safety. It achieves nothing. If you want type safety then use a language that supports type safety.
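
          To make that concrete, a contrived sketch (invented names, not from any real codebase): the compiler never checks the prefix, so nothing stops it from drifting away from the declaration.

          #include <stdio.h>

          int main(void)
          {
              /* This started life as an int count of licensed users, hence the
                 "n" (count) prefix. It was later changed to a double to allow
                 fractional "seats", but nobody renamed it, so the prefix lies. */
              double nLicensedUsers = 12.5;

              /* A reader trusting the prefix might write integer-only code or a
                 %d format here; only the declaration tells the truth. */
              printf("licensed users: %.1f\n", nLicensedUsers);
              return 0;
          }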

          • But Hungarian notation doesn't fix that flaw. It's only as reliable as the programmer who writes the code. In most cases, that means not reliable at all. ... I have been bitten before by relying on the Hungarian junk at the start,

            So? You could have been bitten by assuming that a function called IncrementValue() doesn't divide the value by 12. You could have been bitten by assuming that the comment above the function accurately describes the contents. There is no programming technique whatsoever that ca
            • So? You could have been bitten by assuming that a function called IncrementValue() doesn't divide the value by 12. You could have been bitten by assuming that the comment above the function accurately describes the contents. There is no programming technique whatsoever that can stop you from being bitten by the mistakes of dumb cow orkers.

              I think, without reservation, that was entirely my point.

              This does not mean that meaningful function names, accurate comments and a judicious amount of type inf

        • > reduce the problems caused by the poor
          > design of C itself.

          C was designed to be a simple, powerful, low-level language, as easy to use and with as compact a compiler as Pascal, but with improved speed and flexibility.

          With the territory came the ability to write more subtle and deadly bugs, but that was understood.

          So what language would you recommend to rewrite the Linux kernel in, say?
      • I don't think anyone can accuse Simonyi of being dumb, but even smart guys have poor ideas.

        I don't like Hungarian notation just because it makes my code look ugly. I spend a lot of time making things look clean and simple, and Hungarian notation ... well, let's just say it doesn't exactly advance that goal.

        D
        • I don't think anyone can accuse Simonyi of being dumb, but even smart guys have poor ideas.

          I don't necessarily think the idea is poor, but rather has been abused.

          Hungarian notation may serve as a useful mnemonic for beginning programmers to remember how an object can be used. Of course, abstraction and encapsulation are supposed to hide the very implementation that you're striving to describe! Something in my gut tells me that if you don't understand an object's type, you've got no business mucking with i

      • That little invention really reflects how stupid this guy is.

        May I ask what you've achieved to be able to call "stupid" someone who has influenced the current state of the art of programming with several of his ideas?
  • Simonyi (Score:3, Insightful)

    by 4of12 ( 97621 ) on Monday November 03, 2003 @01:15PM (#7378430) Homepage Journal

    I take it that Hungarian notation has been left by the wayside on the road to less complexity :)

    I agree wholeheartedly with the complexity issue.

    I measure my success as a programmer by whether or not another programmer (or myself far in the future) can throw my work onto the screen and understand very quickly what the code is trying to do.

    Bugs can be fixed, features can be added and performance can be enhanced later. But not very easily if the code is too complex or, equivalently, has too much abstraction.

  • And just who (Score:5, Insightful)

    by kalidasa ( 577403 ) * on Monday November 03, 2003 @01:16PM (#7378433) Journal
    Will write the programming tools? Seems to me Simonyi's not talking about a replacement for modern programming, but an incremental advancement over, say, AppleScript or HyperCard. More powerful userland tools will not completely replace programming: someone will need to write the components. Or is he thinking that all the components will be in the OS, and thus third party programmers could be eliminated and the OS vendor and the user would be the only parts of the transaction?
    • "Or is he thinking that all the components will be in the OS, and thus third party programmers could be eliminated and the OS vendor and the user would be the only parts of the transaction?"

      .NET? Having the base classes as part of the operating system so you can truthfully tell the judge that the programming language is 'Part of the Operating System'?

    • How to write a program in AppleScript:

      1. Write your prototype in English pseudocode.
      2. There is no second step.

      (Okay, okay, well, Mac users will get the joke.)

    • by GCP ( 122438 ) on Monday November 03, 2003 @05:10PM (#7380925)
      I think it's great for them to pursue tools that make it easier for non-programmers to do more useful things for themselves.

      I'm not too concerned that this will replace the current type of programming, though. The biggest problem is that the real-world problem being solved is almost always more complicated than the domain experts themselves realize.

      When I sit down with a client domain expert to write a program for them, they are invariably surprised by what I uncover. I gradually tease out a huge variety of scenarios that they've never thought through or decisions that they make on the basis of "experience" whose rules they can't possibly express explicitly, comprehensively, unambiguously, and without contradiction -- on their own.

      It just doesn't matter how easy it is to explain the rules to a computer if you don't have the skill that experienced programmers have: to completely specify the problem. Fully explaining how to solve a problem to something other than another intelligent and experienced human is harder than most non-programmers realize. (Of course Simonyi knows this, but the journalists who cover his work probably don't.)

      • When I sit down with a client domain expert to write a program for them, they are invariably surprised by what I uncover.

        Indeed. Humans are accustomed to expressing their desires to other humans, and relying on common sense to guide the interpretation of their orders and specifications. This doesn't fly with computers; the orders aren't algorithms, they're vague wishlists.

        The first problem is that people don't say how to achieve goals. The second problem is that people don't even specify the goals ver

  • by Oddly_Drac ( 625066 ) on Monday November 03, 2003 @01:16PM (#7378437)
    ...just for saying:

    "Software should be as easy to edit as a PowerPoint presentation,"

    Powerpoint is _evil_ and should be destroyed, and the ground that it rested on salted.

    • While PowerPoint is evil in terms of the boiling down of the nice Red Meat of content into a Grey Goo worthy of an Army chef.... it's better than the true evil of ... FLASH.

      At least you can print a PowerPoint, look at the slides, and the notes. PowerPoint forces a nice, easy to deal with, linear structure on a Presentation.

      A Flash presentation is too programmable, you never know how to advance to the next slide... and you often can't go back!

      --Mike--

  • .. there goes my job ;-)

    Problem is, does anyone want to write an operating system in such a high-level language, where the optimization is questionable?

    Oh, and what the hell does he think MS macros are? They almost write themselves.

  • Now go build it... (Score:3, Insightful)

    by darkov ( 261309 ) on Monday November 03, 2003 @01:18PM (#7378457)
    Saying software is too complex is stating the bleeding obvious. But the world is complex, and it's not that easy building software, whether you're a programmer or user, that can simplify it. A clue to this is how much harder good, user-friendly software is to write.

    He keeps on pushing his Intentional Software barrow, but where are the techniques that actually deliver results? Anything most programmers will come up with will be just as impenetrable as C. The problem is programmers are not known for their empathy for users and don't really want to try to find out what it means to not know how to use a computer or its software.
    • by Oddly_Drac ( 625066 ) on Monday November 03, 2003 @01:28PM (#7378548)
      "The problem is programmers are not known for their empathy for users"

      Oh, I don't know...my customers and I often share amusing stories of stuff that's blue-screened just before you meant to save it. If you mean that some programmers consider themselves godlike because of years of hubris and the certain knowledge that you couldn't be found out if you cloaked as much as you could in jargon, then you might have a point.

      The customer knows what they want, and they'll express it to you in their terms; you have to translate their ideas into something workable, which can be a ballache, but if you've started from a position of billable hours rather than a fixed cost, you're already ahead of the game simply because you can work on milestones rather than having to constantly rehash.

      The idea of having a high-level language that 'anyone' can use has largely already happened with HTML. As a result we have some outrageously bad HTML out there because of the complete lack of understanding of abstraction. Put simply, Decamino wouldn't look through Galileo's telescope because he _knew_ that Galileo is wrong; the paradigm that allowed for the earth to move out of the center of the universe hadn't yet come to pass.

      Although OO is currently viewed with some suspicion because it may not be the best way to do _everything_, everyone involved in commercial programming has at least started to view things in terms of objects; that concept may take a while to filter down to a public that thinks animated cursors on the web are the dog's back wheels, or seems surprised when you have to keep AV software updated.

      • I agree with everything you wrote except for, "the customer knows what they want". That made me giggle. Teehee.

        • customer knows what they want

          A quaint, outdated model.

          In today's mass production world we already know what the customer wants.

          The customer wants

          to feel good about himself.

          Target your pitch that way and you'll get the sheep coming in to be sheared, as you praise them for being adventurous, vigorous, attractive, and knowledgeable sheep (or wolves, if they like being called wolves better)....

          Pretend to listen to your customer, because if they think you're listening to them and care about them, then t

        • "I agree with everything you wrote except for, "the customer knows what they want"."

          They do. That they frequently can't describe what they want isn't necessarily their failing, it's because they aren't in our industry and they may be picking up ideas, but not know the ramifications.

          I frequently have to talk people out of popups and email spamming...they've seen it done and they think it's a good idea; I ask whether they've ever bought anything from either, and do they consider it a nuisance...that usua
      • Put simply, Decamino wouldn't look through Galileo's telescope because he _knew_ that Galileo is wrong; the paradigm that allowed for the earth to move out of the center of the universe hadn't yet come to pass.

        There must be some new meaning for the word "simply" that I am unaware of.
      • > a public that thinks animated cursors on the web
        > are the dog's back wheels

        "The dog's back wheels" is a truly inspired turn of phrase. If I had mod points at the moment, you'd have one right now.
    • Is there a reason that your sig has "i" and "j" in the wrong position? Just wondering.... :-)
  • nothing new (Score:3, Insightful)

    by mikeburke ( 683778 ) on Monday November 03, 2003 @01:28PM (#7378541)
    There's a whole raft of tools out there that put this philosophy into action - witness MS Access, VB etc. Even an Excel spreadsheet can be viewed as a 'programming environment' really.

    There are 2 kinds of problems that programmers solve - technical problems and business problems. The technical problems can be abstracted by tools like the above, but the business problems remain.

    Techniques such as Object Oriented design, abstraction, etc etc are just as useful for solving these kinds of problems as they are, for example, when writing a new web browser.

    It's difficult to see how a groovy GUI can hide or solve these problems. You're still going to need a certain set of skills to guide the development and architecture of any nontrivial system.

    I'm sure we've all seen complex websites that have been put together by naive users of FrontPage - bloated HTML, endless redundancy (cut-n-paste) and a hideous task for someone else (with a similar skill level) to pick up and modify. It's hard to see how you can prevent these kinds of problems when you go down the "everyone can use it" path.

    • MS Access (Score:3, Insightful)

      by Ender Ryan ( 79406 )
      Where I work, we have a guy who has done some simple "programming" with MS Access. Sure, it works pretty darn well, but at the first sign of trouble, it falls all over itself with undebuggable garbage.

      Not only that, but it is entirely unmaintainable, even by him.

      Real programming is a whole lot more than just pushing some buttons around.

    • Re:nothing new (Score:3, Interesting)

      by Godeke ( 32895 ) *
      You hit the nail on the head when you mention Excel as a programming environment. Because the environment determines order of evaluation for you, you have a declarative language, and I think in the long run that is the key.

      Procedural programming runs into complexity limits much more quickly, because the programmer is responsible for all of the interactions explicitly. Excel allows business people to work in their domain, using ideas that make sense because they were extracted from the domain. With the exce
    • On the dog food principle, if the people writing the programs are writing it for their own use, it's going to get better and better for their purposes. If any schmuck can put together software to automate the things they're doing now, then that's less software that gets sold, and more support that gets sold (or bartered for) and it's a win for us all.

      While I was writing that sentence I realized that I agreed that software engineers ought not to get paid. Kinda. The truth is that for most of the software o

  • Who bells the cat? (Score:3, Insightful)

    by crmartin ( 98227 ) on Monday November 03, 2003 @01:47PM (#7378701)
    Simonyi has a good point. Don't let Hungarian notation bother you -- it's a kludge on top of an essentially untyped unprotected language (C) trying to get back some of the protections offered by strong typing, and while Hungarian notation is a horrible solution, so are all the others.

    But the problem is that while keeping clarity is a great idea, it's proven immensely hard to do. Fred Brooks (viz., the No Silver Bullet paper) argues that this is because the problems we're solving with software are intrinsically complex; there's no way to reduce the complexity below a certain point. On the other hand, anyone who writes real code knows that they spend a hell of a lot of their time writing the equivalent of a for loop over index i again and again and again. There's some unnecessary redundancy there.

    But saying that you want less complexity is a lot different from saying you know how to get rid of the complexity.
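
    As a throwaway illustration of the "for loop over index i again and again" point (the helper names here are invented): the loop itself can be written once and reused, which removes some of the accidental redundancy, though not the essential complexity of deciding what to do with each element.

    #include <stdio.h>
    #include <stddef.h>

    /* Write the "for loop over index i" exactly once... */
    static void for_each_int(int *a, size_t n, void (*f)(int *))
    {
        for (size_t i = 0; i < n; i++)
            f(&a[i]);
    }

    static void double_it(int *x) { *x *= 2; }
    static void print_it(int *x)  { printf("%d ", *x); }

    int main(void)
    {
        int data[] = { 3, 1, 4, 1, 5 };
        size_t n = sizeof data / sizeof data[0];

        /* ...and state only the intent at each call site. */
        for_each_int(data, n, double_it);
        for_each_int(data, n, print_it);
        putchar('\n');
        return 0;
    }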
    • There's some unnecessary redundancy there.

      Don't confuse "complexity" with "details". This is the mistake that the author is making. Take for example a web server. The http protocol is complex no matter what language you write one in. With C you have to worry about buffer overflows. With Ruby you don't, but you still have to worry about invalid URLs. With C you're stuck with the drudgery of for loops. With Ruby you don't have to wade through that, but you still have to manage your server threads or process
      • by crmartin ( 98227 )
        Yes, exactly. There is some redundancy in the way we code, but the complexity is not just in that redundancy. There are some hints of the intractability of reducing complexity in Greg Chaitin's algorithmic information theory.
      • But the good news is that we can encapsulate all that http work into an entirely separate program and still interface to it relatively trivially from a variety of languages, like php, perl, et cetera. Now, if we just had easier to use wrappers for everything else, too, well, we'd have delphi I guess. However I would prefer a less goofy glue language than pascal, no matter how sweet delphi might be (and if it is, I have yet to see proof, which might just be borland's fault for writing crappy runtimes, but I
  • Riiiighhhtt. So who is going to write this 'programming environment for idiots'? Surely it must be recognized that you are just moving the complexity problem to a different layer with this approach, PLUS losing the ability to gain low level access when needed.

    • Meta complexity? (Score:5, Interesting)

      by FunkyRat ( 36011 ) <.moc.liamg. .ta. .taryknuf.> on Monday November 03, 2003 @02:56PM (#7379385) Journal
      Surely it must be recognized that you are just moving the complexity problem to a different layer with this approach, PLUS losing the ability to gain low level access when needed.

      I don't know if that necessarily has to be the case. Back in the old 8-bit CP/M days I got my introduction to Forth through an application named KAMAS, which stood for Knowledge And Mind Amplification System. Lofty-sounding name aside, KAMAS was really an outlining tool. A very good one at that. A few years later, after the PC and DOS had taken over, a whole slew of these outlining tool programs appeared, all claiming the ability to revolutionize the way you worked with information. For the most part, this was all bunk, but in a way KAMAS almost lived up to its self-aggrandized promotion.

      What made KAMAS different, IMHO, was that it was based on a Forth-like language at the core of the product, and its author (Adam Trent) left that programmability exposed. Yet he was able to organize the program in such a way that the average user didn't have to interact with the language at all, or even know it was there if they didn't want to. Heck, you didn't even have to use it as an outliner -- if you wanted, it could just act as a simple To Do list.

      As the owners' manual stated, KAMAS was arranged in rings, like a Venn diagram, with the outliner at the outermost ring. However, if the user wanted, they could issue a command that would expose the next inner layer of complexity and do simple programming tasks on their outlines. Because of its Forth heritage, the programming was interactive and could easily be undone. Screw up a word definition? Just tell the interpreter to FORGET it.

      For the true geek crowd, another word could be issued (only while inside the programming layer) that would then expose the inner-most layer and open up access to all the words defined. At this point, the user/programmer would have access to basically a full Forth programming environment and could actually change or extend the outliner tool by rewriting it! If one wished to devote the time to learning how to program in a stack-based, threaded, interpreted environment, your computer was wide open to you. It was like having the keys to the gates of heaven laid at your feet.

      Later on, when I started playing around with Forth proper, I was still impressed with what KAMAS's author (whatever became of Adam Trent anyway?) had done and felt that this managing of complexity was the true power of Forth-based systems. However, even I have to admit that Forth is far from ideal given its RPN and stack-based roots -- at least for Joe Everyuser. More time passed and I discovered Smalltalk and Alan Kay and his ideas for the Dynabook and, lately, Squeak [squeak.org].

      Smalltalk, Squeak and OOP share with KAMAS the idea of bringing the power of the computer to leverage the mind to the everyday user. And, as with KAMAS and Forth too, they are able to prevent a useful, simplified environment at the surface, but still making the power and complexity available to those who want to use it.

      So, in short, I think you're wrong here. One does not have to lose the ability to gain low level access in order to mask complexity from the average user. What I do question after all these years is how many users will actually want access to the power hidden at the core of systems such as Squeak, and KAMAS before it. I mean, come on, I live in a country (US) where a sizeable portion of the population can't identify the Pacific Ocean on a map! I think it's likely that in the end we'll end up with just about the same mix of truly technical users to clueless lusers that we have now.

      As depressing as that may be, and the thought does depress me, I still think it's important to implement Charles Simonyi's ideas (as well as Alan Kay's and Doug Englebart's and Steve Wozniak's and all the others who believe that the computer can serve as a tool to liberate people). If only for the sake of providing a migration path for people to make that crossing from clueless luser to someone who is able to effectively use the computer as a "Knowledge and Mind Amplification Tool."

      • Was this just a typo, or was it intentional?

        And, as with KAMAS and Forth too, they are able to prevent a useful, simplified environment at the surface, but still making the power and complexity available to those who want to use it.

        Look at the 12th word.

        • ROTFL! No, it wasn't intentional! I hope you know I meant present. You'll have to excuse me, I'm doing Slashdot today like Rush Limbaugh does his radio show -- I'm taking hydrocodone for a knee injury. On top of it all, I have this miserable stinking cold!
      • I still think it's important to implement Charles Simonyi's ideas (as well as Alan Kay's and Doug Englebart's and Steve Wozniak's and all the others who believe that the computer can serve as a tool to liberate people).

        All coolness, but how does that eliminate the need for humans to deal with complexity? It's still there at some level in the system until you build a system that can eliminate complexity adaptively. When you do that there will be no need for human beings any more.

        • Well, actually, in a way, that was my point, or at least a corollary to my point.

          We can do really cool things to manage complexity and still make computers useful tools for the general populace. I agree with Simonyi's argument that more needs to be done in this area.

          However, at some point, people must be ready to accept the fact that power brings with it complexity and if they wish to do grander things then they need to be willing to learn. Most people aren't willing to do so. Yet for the few who are,

  • by mugnyte ( 203225 ) * on Monday November 03, 2003 @01:56PM (#7378786) Journal
    People want to make the world in their image. So, they hot-rod their cars, paint their rooms new colors and ask for new software. That software needs to do something that hasn't been done *and published in a coherent way* before. So the programmers delve into the details of APIs and language capabilities and create complexity.

    Also, the migration between new hardware, capabilities (higher bandwidth, wireless) or goals (FPS gaming) and such is always going to create a complex "first example" that needs many iterations before commoditization.

    I think this guy is premature to assume all programming goals are easily commoditized right now. If people were to give up behaviors when the plug-ins given to them don't exist or are buggy, there'd be a lot of hodge-podge solutions.

    Example: Visual Basic programming was supposed to be a "glue" for clicking together COM components, and MS was touting a new era of component "publishers" and "subscribers" (and next up is the same thing via .NET and web services). We all know how Visual Basic attracted lots of newbie programmers without formal degrees who clamored to read Compu-Mags for tips, and MS beefed it into a full-fledged development environment (compiled exe's, generating COM natively, getting away from variants). It has solved many problems, but it didn't create a world of commoditization as hoped (even if there are hundreds of OCX vendors in those same Compu-Mags).

    But it just doesn't happen in the long run. You can buy enterprise software that does things from soup to nuts and still find tons of work in "making it your own" with interfaces, reports and other customizations (talk to an SAP project manager).

    Anyway, this is an interesting topic, but ultimately limited.

  • COBOL (Score:4, Insightful)

    by aridhol ( 112307 ) <ka_lac@hotmail.com> on Monday November 03, 2003 @02:00PM (#7378817) Homepage Journal
    Wasn't COBOL originally written in order to allow the user to bypass the programmer? One of the lessons they learned from that experiment was that, even given a simplified language, most people don't understand computers well enough to write a program.

    I'm not talking about things like API obfuscation or similar. I mean people don't generally think logically. Computers don't think emotionally. The average person has no idea about algorithms, or why you may want an O(lg(n)) algorithm in preference to an O(n^2) algorithm.

    For these things, professional programmers will still be required.
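
    For readers outside the field, a rough sketch of why that O() difference matters (toy code, not from the article): binary search touches about log2(n) elements, while a quadratic algorithm performs on the order of n*n steps, which at n = 1000 is the gap between roughly 10 operations and a million.

    #include <stdio.h>

    /* Binary search over a sorted array, counting comparisons as it goes. */
    static int binary_search(const int *a, int n, int key, int *comparisons)
    {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            ++*comparisons;
            if (a[mid] == key)
                return mid;
            if (a[mid] < key)
                lo = mid + 1;
            else
                hi = mid - 1;
        }
        return -1;
    }

    int main(void)
    {
        enum { N = 1000 };
        int a[N], comparisons = 0;

        for (int i = 0; i < N; i++)
            a[i] = 2 * i;                 /* sorted data: 0, 2, 4, ... */

        binary_search(a, N, 1234, &comparisons);
        printf("O(lg n): %d comparisons for n = %d\n", comparisons, N);
        printf("O(n^2):  ~%d steps for the same n\n", N * N);
        return 0;
    }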

    • I think the point that the article was about was lessening the need for software engineers in favor of process/algorithmic engineers. Processes are still very complex and require intelligence and education to solve efficiently. It's the difference between writing quick sort in pseudo-code or whatever and actually implementing it in a language where you need to worry about implementation details. It's those details which really tend to be bothersome. If they can find a way to make "pseudo-code" executable, or
      • Yeah, I don't think people would even be that good at writing pseudocode. Even before they get to worrying about details like performance and efficiency, the fact is most people are poor at breaking down tasks, even the ones they do all the time, into small enough pieces that even a computer could do it.

        In short, I don't think something like this will happen until we really get AI going. What this guy proposes looks at first glance like a visual tool for what COBOL was supposed to be, first glance meaning t
        • Re:COBOL (Score:2, Interesting)

          "...the fact is most people are poor at breaking down tasks..."

          This has been my observation as well. I used to teach beginning programming at a local university and I have to say that it was amazing how many people had problems thinking through what needed to be done to accomplish a thing as simple as swapping two integers. I'd try and get them to talk about the steps that would need to be taken. I'd have them transfer two objects between two hands. They'd do that fine. I'd then ask them to break it
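
          (Aside, for anyone curious: the swap exercise spelled out in C, with the three explicit steps, "third hand" included, that beginners tend to skip over.)

          #include <stdio.h>

          int main(void)
          {
              int a = 1, b = 2;

              int tmp = a;    /* step 1: set the first value aside (the "third hand") */
              a = b;          /* step 2: copy the second value over the first          */
              b = tmp;        /* step 3: drop the saved value into the second          */

              printf("a = %d, b = %d\n", a, b);    /* prints: a = 2, b = 1 */
              return 0;
          }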

          • Re:COBOL (Score:3, Insightful)

            by kisrael ( 134664 ) *
            This has been my observation as well. I used to teach beginning programming at a local university and I have to say that it was amazing how many people had problems thinking through what needed to be done to accomplish a thing as simple as swapping two integers

            Interesting. And really, learning to break up tasks of fairly great complexity is probably 75% of programming...once you've got that, most of the rest is just details.

            Personally I think this ties into a fallacy the vast majority of us share, that w
    • > Wasn't COBOL originally written in order to allow the user to bypass the programmer? One of the lessons they learned from that experiment was that, even given a simplified language, most people don't understand computers well enough to write a program.

      In my experience, most people don't understand what they're trying to do well enough to write a program, let alone understand 'computers' well enough to write it. I've lost count of the number of times I've had to explain to a PHB that it's useless to

    • Re:COBOL (Score:2, Informative)

      Another technology along these lines was CASE (Computer Aided Software Engineering), which went up in a ball of hype since the complexity of the tools could not keep up with the demands of the user base or the promises of the sales team.

      That will always be the case. Simonyi's marketing position seems to be trying for simpatico with CTOs whose TCO on their ERP solutions is, shall we say, unmanageable. Like the late Michael Dertouzos, Simonyi is attempting the "I'm mad as hell and I can't take it anymore

  • by cheezus ( 95036 ) on Monday November 03, 2003 @02:06PM (#7378864) Homepage
    "Software should be as easy to edit as a PowerPoint presentation"

    oh great, now nearly every app is going to have a random ass ugly transition between user interfaces, will use no fewer than 20 fonts, and have clipart everywhere. You will have to wait for each line of the EULA to slide, spiral, dissolve, or otherwise animate its way onto the screen before you can click ok. Not only that, the application will surely present no other information than reading the bullet points to you.

  • I'd have to say that by far the problem is not the development tools we use. I've used GCC and Cosmic (for HC11, 12 & 68332) and some ridiculous proprietary IDE for the Hitachi H8S series. I'll have to admit that I'll take vim over these IDEs, but JTAGs and BDMs are great. What causes the most problems? Feature creep and the odd attachment management types have to old, crufty code. (I suppose they think they paid too much for it).

    Currently my favorite hobby is ripping out code for old models or features wh

  • Something about this article seems familiar, but I'm not sure where I saw it before. It makes reference to using UML as an 'easy-to-understand interface' for designing programs.

    Please.

    Essentially, what his idea comes down to is finding representations that are more removed from what's going on under the hood, such as using pictures to represent program flow instead of the current textual representation. This is not only an old, established idea, but completely bypasses the fact that the fundamental diff
    • Unfortunately, turbine blades are still designed by highly skilled engineers

      Exactly. Once the highly skilled engineers design the blade, he wants to ensure that a bunch of hamfisted craftsmen and welders and such don't screw up the implementation with mistakes, sabotage, job-security-slackness, labor negotiations, etc.

      Simonyi says he wants the code to "look like the design". There is still a role for the designer.

      what he wants to do is give somebody the ability to say "I want a fan-like-thing that mo
    • Ever seen a simple UML diagram for any real world solution? I haven't. UML wasn't designed to be simple. It was designed so that you could design complex software without worrying about the trivial details of implementation.

      Unfortunately, even the UML advocates tend to forget this. I remember a Rational salesman giving a presentation at my work. "Look how easy this is," he said, as he brought up the UML diagram for Rose itself. Ick!
  • it's called visual basic.

    if it just wasn't for all those script kiddies exploiting vulnerabilities of such a beautiful language.. let's hope they make script kidding illegal with the next version of the Patriot ACT, codenamed LongCorn..
  • ... all of those technologies make designing simple apps a piece of cake. Shouldn't be that hard to make a visual IDE for newbies that generates that XML.
  • by Slowping ( 63788 ) on Monday November 03, 2003 @03:13PM (#7379604) Homepage Journal
    Someone just needs to write a program that users can run, to check and make sure that the target program runs correctly!

    (yes, I'm joking)
  • That's exactly what FORTRAN does. You don't do any programming yourself, you simply describe the problem you want solved in a natural, easy-to-learn language, and the FORTRAN compiler writes a bug-free program that implements the solution.

    If you're not using FORTRAN, you're wasting time and effort. Why, when you write a single line of FORTRAN the FORTRAN compiler writes an average of ten lines of code for you, so you become ten times as productive and can get projects shipping and earning revenue times as
  • by Brandybuck ( 704397 ) on Monday November 03, 2003 @03:19PM (#7379677) Homepage Journal
    "Software should be as easy to edit as a PowerPoint presentation," Simonyi asserts.

    When's the last time you saw a quality PowerPoint presentation? I've seen them, but they're rare. Presentations from people who don't know how to communicate effectively are as lame as Visual Basic programs from people who don't know how to program. The style takes precedence over the actual substance.

    Complexity is not something that needs to be hidden away. Software is complex. Using software is a complex activity. Writing software is more complex still. You cannot manage that complexity by imagining that it is not there. The way to manage it is to recognize that it exists.

    It doesn't matter if you use C, Java, VB or Ruby, the complexity is still there. The advantage of high level languages is not that they hide away the complexity, but rather that they enable you to manage the complexity by taking care of the details.

    Take any book on software development. Not programming, but development. How much time is spent on implementation? Not much. For a good project, 90% or more of the time is spent analyzing, specifying, designing and testing. This is the HARD part of developing. Give me complete specs, a valid design, and a top-notch QA group, and I could code just about anything. All that other stuff is there to MANAGE the complexity.

    I've seen what Microsoft offers to make things easy. Their solution to complexity is to ignore it, which is the wrong approach. And thus we end up with crap presentations, crap documents, and crap VB programs. It's not because these tools are crap in-and-of-themselves, but simply because they lead the user to disregard the existing complexity.
  • NakedObjects? (Score:3, Interesting)

    by BigGerman ( 541312 ) on Monday November 03, 2003 @04:28PM (#7380431)
    there is a framework [nakedobjects.org] where they believe in exposing the business objects inside the app to the end user. Kinda like spreadsheet / powerpoint but the real deal.
  • by Anonymous Coward on Monday November 03, 2003 @05:00PM (#7380817)
    This is from late September, so unfortunately there's no direct link to the full article at nytimes anymore.

    The Level of Discourse Continues to Slide
    By JOHN SCHWARTZ
    Is there anything so deadening to the soul as a PowerPoint presentation?

    Critics have complained about the computerized slide shows, produced with the ubiquitous software from Microsoft, since the technology was first introduced 10 years ago. Last week, The New Yorker magazine included a cartoon showing a job interview in hell: "I need someone well versed in the art of torture," the interviewer says. "Do you know PowerPoint?"

    Once upon a time, a party host could send dread through the room by saying, "Let me show you the slides from our trip!" Now, that dread has spread to every corner of the culture, with schoolchildren using the program to write book reports, and corporate managers blinking mindlessly at PowerPoint charts and bullet lists projected onto giant screens as a disembodied voice reads

    every

    word

    on

    every

    slide.

    When the bullets are flying, no one is safe.

    But there is a new crescendo of criticism that goes beyond the objection to PowerPoint's tendency to turn any information into a dull recitation of look-alike factoids. Based on nearly a decade of experience with the software and its effects, detractors argue that PowerPoint-muffled messages have real consequences, perhaps even of life or death.

    Before the fatal end of the shuttle Columbia's mission last January, with the craft still orbiting the earth, NASA engineers used a PowerPoint presentation to describe their investigation into whether a piece of foam that struck the shuttle's wing during launching had caused serious damage. Edward Tufte, a Yale professor who is an influential expert on the presentation of visual information, published a critique of that presentation on the World Wide Web last March. A key slide, he said, was "a PowerPoint festival of bureaucratic hyper-rationalism."

    Among other problems, Mr. Tufte said, a crucial piece of information -- that the chunk of foam was hundreds of times larger than anything that had ever been tested -- was relegated to the last point on the slide, squeezed into insignificance on a frame that suggested damage to the wing was minor.

    The independent board that investigated the Columbia disaster devoted an entire page of its final report last month to Mr. Tufte's analysis. The board wrote that "it is easy to understand how a senior manager might read this PowerPoint slide and not realize that it addresses a life-threatening situation."

    In fact, the board said: "During its investigation, the board was surprised to receive similar presentation slides from NASA officials in place of technical reports. The board views the endemic use of PowerPoint briefing slides instead of technical papers as an illustration of the problematic methods of technical communication at NASA."

    The board echoed a message that Mr. Tufte and other critics have been trying to disseminate for years. "I would refer to it as a virus, rather than a narrative form," said Jamie McKenzie, an educational consultant. "It's done more damage to the culture."

    These are strong words for a program that traces its pedagogical heritage to the blackboard or overhead projector. But the relentless and, some critics would say, lazy use of the program as a replacement for real discourse -- as with the NASA case -- continues to inspire attacks.

    It has also become so much a part of our culture that, like Kleenex and Xerox, PowerPoint has become a generic term for any bullet-ridden presentation.

    Dan Leach, Microsoft's chief product manager for the Office software, which includes PowerPoint, said that the package had 400 million users around the world, and that his customers loved PowerPoint. When early versions of Office for small business did not include PowerPoint, customers protested, he said, and new versions

  • Reminds me of the fable where the mice want to be protected, so they decide to put a bell on the cat; that way it can't sneak up on them.
    Great idea! Except of course who is going to put the bell on the cat?

    Yes, I would love software that could do everything he wants. Now who is going to write it? Will it collapse under its own weight?

  • by idontgno ( 624372 ) on Monday November 03, 2003 @06:19PM (#7381579) Journal
    Hind's [demon.co.uk] 7th Law of Computer Programs:

    Make it possible for programmers to write programs in English, and you will find that programmers cannot write in English.

  • misread the title of the story as "Removing Software Completely" and get enthused about a new technique to comprehensively uninstall crappy software from Windows?

    You may commence the usual /. geek-bigot joke answers: "fdisk", "install (linux|bsd|beos)", etc.

  • by Frans Faase ( 648933 ) on Tuesday November 04, 2003 @06:24AM (#7385087) Homepage
    I am getting more and more convinced that most of the complexity is caused by the fact that computers are too slow for the kind of tasks we want them to perform. The real problem is that we are no longer aware that 99.5% of our efforts are related to optimization, and that 90% of our code is related to moving data around in the memory pyramid, or to calculating differential queries.

    Calculating a differential query means that you modify the outcome of a query based on how the data changed instead of reexecuting the whole query.
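
    A minimal sketch of that idea (invented names, just to make the definition concrete): keep a materialized aggregate and apply deltas to it as rows change, rather than re-running the full query over every row.

    #include <stdio.h>

    /* Materialized result of the query SELECT SUM(amount) FROM sales. */
    static double sum_of_amounts;

    /* Differential updates: adjust the stored result by the change itself. */
    static void insert_row(double amount) { sum_of_amounts += amount; }
    static void delete_row(double amount) { sum_of_amounts -= amount; }

    int main(void)
    {
        insert_row(10.0);
        insert_row(2.5);
        delete_row(10.0);

        /* Re-executing the query would rescan every row; the differential
           version applied three O(1) adjustments instead. */
        printf("SUM(amount) = %.2f\n", sum_of_amounts);
        return 0;
    }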

  • Software should be as easy to edit as a PowerPoint presentation

    I don't know... I'm capable of programming in a handful of different languages. But I've never been able to spend less than three hours on a bloody PowerPoint presentation and produce more than a single slide.
    I say programming is much simpler ;)
  • From the article:

    What, exactly, would a machine for writing software look like? It would itself be software.

    Well, I call mine a "compiler". They're not new.
  • by PinglePongle ( 8734 ) on Tuesday November 04, 2003 @07:09PM (#7391019) Homepage
    And abstraction is the fundamental means of reducing complexity.
    The history of programming is the movement from physically inserting patch cables to program a computer to manipulating abstractions. In languages like C, those abstractions are still pretty close to the hardware; in OO languages they tend to be closer to the problem domain. Edsger Dijkstra once said that software development was unique as a profession because it required practitioners to operate at 7 levels of abstraction - from transistor to algorithm to software architecture to business domain. Of course, very few of us deal with "transistor-level" programming these days.
    So, Simonyi's "intentional" programming is part of this broad sweep of improvement in programming languages in the last 50 or so years. The current emphasis behind Model-driven architecture [omg.org] is a similar desire to somehow take away all that messy code stuff and replace it with nice, easy to understand pictures.

    The problem with both these approaches is that complexity exists inherently in the problem domain. The role of a software development team is to choose a path through that complex problem domain and implement it with working code. Right now, I don't believe we have tools which are sufficiently expressive and intuitive to model the complexity of the problem domain, and we must be years if not decades away from being able to convert such models to working code.
    UML is lovely - it's a great language for expressing software ideas and conveying a lot of information in a graphical format, but the average business user just does not get it; in my experience, UML diagrams are primarily useful for communicating between developers.
    Use cases (in textual form) are far more useful for communicating with business users, but to convert a use case into a working program would require natural language parsing at a level that must be a generation away.

    We should wish Simonyi luck - his ambitions are worthy, and will benefit all working developers if they bear fruit. And what better use to put a couple of billion dollars to?
  • Simonyi still believes only a handful of super/uber geniuses can design software, but once that design is done, a standard office weenie can build it.

    He believed that back at PARC; he took that belief to Microsoft, where it failed utterly, and he still thinks so today.

    and he's still wrong.
