Programming Bug IT Technology

Intuitive Bug-less Software? 558

Posted by michael
from the dream-on dept.
Starlover writes "In the latest java.sun.com feature at Sun's Java site, Victoria Livschitz takes on some ideas of Jaron Lanier on how to make software less buggy. She makes a couple of interesting points. First, making software more 'intuitive' for developers will reduce bugs. Second, software should more closely simulate the real world, so we should be expanding the pure object-oriented paradigm to allow for a richer set of basic abstractions -- like processes and conditions. The simple division of structures into hierarchies and collections in software is too simple for our needs, according to Livschitz. She offers a set of ideas explaining how to get 'there' from here. Comments?"
This discussion has been archived. No new comments can be posted.

Intuitive Bug-less Software?

Comments Filter:
  • Well... (Score:5, Funny)

    by bobbinFrapples (598252) on Friday February 13, 2004 @03:25PM (#8272427)
    First, making software more 'intuitive' for developers will reduce bugs

    Feels right.
    • Re:Well... (Score:5, Insightful)

      by RocketScientist (15198) * on Friday February 13, 2004 @03:38PM (#8272600)
      Man I hate this.

      How many times do we have to have fundamental truths reiterated?

      "Premature optimization is the root of all evil"

      I'd submit that nearly every bit of non-intuitive code is written because it "should be faster" than the intuitive equivalent. Just stop. Write the code the way it needs to be written. Decide if it's fast enough (not "as fast as it could be" but "fast enough") and then optimize if necessary.
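The advice above can be sketched in Python (a minimal illustration; `count_vowels` is an invented example, not from the thread):

```python
import timeit

def count_vowels(text):
    # The intuitive version: states the intent directly.
    return sum(1 for ch in text.lower() if ch in "aeiou")

# Decide "fast enough" with a measurement, not a hunch, and only
# then consider a hand-optimized rewrite.
elapsed = timeit.timeit(lambda: count_vowels("slashdot " * 100), number=100)
print(count_vowels("premature optimization"))  # -> 10
```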
      • You are wrong! (Score:3, Insightful)

        by sorbits (516598)

        Bad code arises when the requirements change and the code needs to be updated to match.

        Bad code arises when the beautiful algorithm needs to deal with real-world constraints.

        Bad code arises when the program grows over a certain size and too many modules depend on each other (this is often not avoidable).

        Bad code arises for many reasons -- premature optimization is not a problem I face often (in my 15 years of programming), and I have worked with a lot of bad code, much of it my own, which did not start o

        • I agree with you completely that bad code often arises through innocent intentions. I disagree, however, that it's often not avoidable. It's almost always avoidable: if you wrote the same code from scratch, knowing what you know after writing it the first time, you would produce a much better result.

          The problem is just that complete rewrites are very expensive, and most development teams/managers are too stubborn to do minor rewrites when they should, preferring to add hacks or workarounds instead of main

    • Re:Well... (Score:5, Funny)

      by cybermace5 (446439) <g.ryan@macetech.com> on Friday February 13, 2004 @04:30PM (#8273281) Homepage Journal
      I once had a teacher whose recurring theme, loudly stated in every applicable situation plus several more, was that "INTUITION SUCKS!!!"

      I never did get around to asking him how he knew that, or if it was kind of a gut feeling he had.
  • Objects (Score:5, Insightful)

    by tsanth (619234) on Friday February 13, 2004 @03:25PM (#8272431)
    I would love to use a C/C++/Java-like language that utilizes pure objects, versus the mish-mashy hybrid typing that exists in most languages that I've used. To me, Livschitz's observation about how programmers work in metaphors, while mathematicians work in pure syntax, is very true: I breeze through all my programming and software engineering classes, but struggle mightily with math courses (save boolean algebra, but I digress).

    I, for one, would like software writing to resemble (really resemble) building structures with Legos.
    • Re:Objects (Score:5, Insightful)

      by weston (16146) <westonsd&canncentral,org> on Friday February 13, 2004 @04:00PM (#8272845) Homepage
      Livschitz's observation about how programmers work in metaphors, while mathematicians work in pure syntax

      It's an interesting thought, but it's not necessarily true at all. Mathematics is metaphors, even though they're often very abstract. But it's more like working with somebody else's codebase, most of the time. Unless you're striking out and creating your own formal system, you are working with metaphors that someone else has come up with (and rather abstract ones at that).

      The good news is that most mathematicians have an aesthetic where they try to make things... as clean and orthogonal as possible.

      The bad news is that terseness is also one of the aesthetics. :)
    • Re:Objects (Score:3, Informative)

      by be-fan (61476)
      See: Lisp, Dylan, Scheme, Ruby, Python, Smalltalk, among others. These languages are all "objects all the way down" though Dylan, Ruby, and Python are more-so than Lisp and Scheme.
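The "objects all the way down" idea is easy to demonstrate in Python, one of the languages the post names (a quick illustration):

```python
# In a pure-object language there is no primitive/object split as in
# Java: even integer literals and classes themselves respond to messages.
print((5).bit_length())   # -> 3: an int literal is a full object
print(type(5))            # -> <class 'int'>
print(type(int))          # -> <class 'type'>: classes are objects too
print("abc".upper())      # -> ABC
```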

    • by Paradox (13555) on Friday February 13, 2004 @05:04PM (#8273760) Homepage Journal
      There is a language like that. In fact, both C++ and Java borrowed several ideas from it. It's called Smalltalk. :) In Smalltalk, everything is an object. Objects talk to each other via methods. Smalltalk has a limited form of closures, can handle exceptions, and has double-dispatch.

      As languages go, it's pretty awesome. It was well ahead of its time, anyways. Ruby (as another poster mentioned) also does some of this.

      Smalltalk and Ruby are great if you're just working with components and assembling them lego style, sure. But what'd be really nice is to use a language that can do both high level coding and systems programming. Someone else thought of it. Brad Cox came up with Objective-C, which NeXT later expanded upon.

      Apple has been using Objective-C with the old OpenStep library as its primary development environment for a while now. It's very nice, supports a rich feature set, and has explicit memory management that is very flexible but also circumventable and tunable (it uses reference counting, though people have built mark-and-sweep extensions; neither is implicit the way Java's garbage collection is).

      Objective-C supports late binding, weak typing, strong typing, static typing and dynamic typing, all in the same program. It can directly use C, so if you know C you're already 3/4 of the way there. The message syntax is slightly odd, but works out. Unfortunately, Objective-C doesn't have closures. David Stes developed a meta-compiler that turns Objective-C with closures into regular C (called the Portable Object Compiler) which might get you some distance if your work demands them.

      ObjC can use C-style functions, Smalltalk-style message passing, or a hybrid of both. It's a very interesting language. Apple added C++ extensions, so now in most cases you can even use C++ code (however, C++ classes are not quite ObjC classes, and there are some caveats).

      If you're looking for a language that splits the difference between Ruby/Python and C/C++, Objective-C might be your best bet. It's pretty hard to find an easy-to-use language that also provides a lot of performance.
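Smalltalk-style dynamic message passing, as described above, can be mimicked in Python (a rough sketch; the class and selector names are invented):

```python
# Unknown "messages" are caught at runtime, roughly the way Smalltalk
# routes unhandled sends to its doesNotUnderstand: hook.
class MessageLogger:
    def __getattr__(self, selector):
        # Invoked only when normal attribute lookup fails.
        def handler(*args):
            return f"doesNotUnderstand: #{selector} with {args}"
        return handler

obj = MessageLogger()
print(obj.frobnicate(1, 2))  # -> doesNotUnderstand: #frobnicate with (1, 2)
```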
  • I'm sure... (Score:5, Insightful)

    by lukewarmfusion (726141) on Friday February 13, 2004 @03:26PM (#8272433) Homepage Journal
    "software should more closely simulate the real world"

    Because the real world doesn't have bugs, right? Our company doesn't have project management software yet - but we're working on it. Personally, I don't think it's worth it until we fix the real world project management issues that this software is supposed to help with. Maybe that's not quite the point, but it raised my eyebrows. (Which I'm thinking about shaving off.)
  • Jaron Lanier? (Score:3, Insightful)

    by jjohnson (62583) on Friday February 13, 2004 @03:26PM (#8272437) Homepage
    Someone please explain to me why anyone listens to this guy. I've read his essays; they're pedantic and hand-wavey. The term "Virtual Reality pioneer" should be enough to disqualify him from serious discourse.

    Somebody, please point me to something significant he's done so I'll know whether or not I should pay attention to him because, from everything I've seen so far, I shouldn't.
  • by Leffe (686621) on Friday February 13, 2004 @03:26PM (#8272441)
    Writing bugless code is not a good idea, in my opinion. Think about it: debugging is the art of removing bugs, therefore programming is the art of adding bugs.

    Writing bugless code would throw the universe upside down and could possibly mean the end of the world!

    Moderation Guideline: +3 Funny.
  • Comments? (Score:5, Funny)

    by jayhawk88 (160512) <jayhawk88@gmail.com> on Friday February 13, 2004 @03:26PM (#8272446)
    I'd say that with buzz-speak like that, she's going to make some CIO very happy someday.
  • by Anonymous Coward on Friday February 13, 2004 @03:27PM (#8272450)
    This type of stuff is not a problem for me to worry about anymore. It's India's. Direct me to the nearest auto-mechanic school please. It's time to learn how to fix problems that can put money in my pocket.
  • by Telastyn (206146) on Friday February 13, 2004 @03:28PM (#8272464)
    It might be me, but I've seen more bugs created because of assumptions made about abstractions, or because someone was used to a pre-made abstraction and didn't learn how things actually worked.

    Want to make better software? How about actually scheduling enough QA time to test it? When development time runs over schedule, push the damned ship date back!

    • by kvn (64836) on Friday February 13, 2004 @03:44PM (#8272678)
      I agree completely. Whether a developer uses a functional language or an object oriented language doesn't matter. What does matter MORE THAN ANYTHING is understanding the process that the software is supposed to support. If it's hospital management software, you have to know how hospitals are managed. If it's banking software, you have to understand banking.

      And testing, testing, testing. Because people aren't perfect. Nor would we want them to be... Too much money to be made in support contracts. :)
    • by boomgopher (627124) on Friday February 13, 2004 @04:14PM (#8273049) Journal
      True, but good abstraction skills are really important.

      Some of the guys I work with think that "tons of classes == object-oriented", and their code designs are f-cking unreadable and opaque. Whereas a few, thoughtfully designed classes that best model the problem would be magnitudes better.

    • by Greyfox (87712) on Friday February 13, 2004 @04:19PM (#8273113) Homepage Journal
      All the process and buzzwords in the world will not help you if your programmers don't understand your needs.

      Want to make better software? Make sure your programmers understand what you're trying to do and make sure that enough people have "the big picture" of how all the system components interact that your vision can be driven to completion. It also helps if you have enough people on hand that losing one or two won't result in terminal brain drain.

      Recently, management seems to be of the opinion that people are pluggable resources who can be interchangeably swapped in and out of projects. Try explaining to management that they can't get rid of the contracting company that wrote a single vital component of your system because no one else really understands how it works. They won't get it, and you'll end up with a black box that no one really knows how to fix if it ever breaks.

    • When development time runs over schedule, push the damned ship date back!

      Not gonna happen. The vast majority of the time, the economic penalty associated with being late is much greater than the economic penalty associated with being buggy.

  • by pc-0x90 (547757) on Friday February 13, 2004 @03:30PM (#8272495)
    She misuses the term functional programming. I'm assuming she meant imperative languages. A lot of the problems could be solved with true functional languages (Haskell, OCaml, etc.) but the learning curve is too high, especially when you can get a team of second-rate VB coders for the price of one Haskell coder (if you can find one). But really, do you want working code now? Or perfect code in 10 years? That's where the problem is. Time.
    • by pokeyburro (472024) on Friday February 13, 2004 @03:40PM (#8272616) Homepage
      ...it also seemed like she misstated Java's approach as a "sandbag architecture" as opposed to a "sandbox architecture". I keep trying to visualize programmers writing more and more Java code to stave off the inevitable surge of bugs....

    • by Tailhook (98486) on Friday February 13, 2004 @05:17PM (#8273962)
      A lot of the problems could be solved with true functional languages (Haskell, OCaml, etc) but the learning curve is too high.

      A lot of problems are solved with functional languages. Functional advocates claim to have the answer to software correctness and they decry the present state of imperative logic programming. What I think they fail to realize is that functional programming is ubiquitous, solving problems on a scale that contemporary imperative tools will never approach.

      Microsoft Excel is, in essence, a functional programming language. It is utilized by non-"programmers" planet-wide every day to quickly, accurately and cheaply "solve" millions of problems. It has, effectively, no learning curve relative to typical coding. I have found it to be an invaluable software development tool. I take it a bit further than the typical spreadsheet task by using it to model software systems.

      It is especially helpful with business logic problems. I recently implemented a relatively complex web-based product configurator. I know that if I can model the complete problem in a stateless manner using a spreadsheet, writing bug-free, efficient client and server side imperative code becomes a simple matter of translation. For any given state of a collection of inputs there is exactly one atomic result. In this case the result is a (possibly lengthy) structured document computed dynamically from a collection of input forms, both on the client (because page refreshes suck) and on the server (because validation must not depend on an honest client.) Both independent implementations (in different languages) are "obviously" correct in the sense that they are derived from a clear, functional model, built in a spreadsheet.

      You may substitute any contemporary spreadsheet product in place of Excel; I have no love of Excel specifically. It's just what I've happened to have handy in all cases. The fact is that modeling most software problems requires very little of what any reasonably competent spreadsheet can accommodate. Feel free to lecture me on precisely why it is blasphemous to suggest that a spreadsheet qualifies for the designation "functional programming." I know the difference because I've studied LISP and used Scheme. The subset of true functional programming that provides the most value is clearly represented by the common spreadsheet.
      • I think both parent posters are just right: functional programming is a great way to approach problems, and Excel spreadsheets are too.

        The big advantage of FP is its clarity and rigor. To an experienced functional programmer, it's exactly clear what a piece of Haskell code means, since the code is half general functions that are easy to understand (map, zip, fold et al.) and half problem-specific functions that are about as easy. The solution is built from simple bricks everywhere, other than in impe
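The stateless, bricks-from-simple-functions style described in the posts above can be sketched in Python (an invented pricing example; the cell names and the 10% discount rule are assumptions for illustration):

```python
# Spreadsheet-style model: every "cell" is a pure function of the
# inputs, so any given set of inputs yields exactly one result.
def subtotal(items):
    # items: list of (quantity, unit_price) pairs
    return sum(qty * price for qty, price in items)

def discount(sub):
    # Business rule (invented): 10% off orders over 100.
    return sub * 0.10 if sub > 100 else 0.0

def total(items):
    sub = subtotal(items)
    return sub - discount(sub)

print(total([(2, 30.0), (3, 20.0)]))  # 120 minus 10% -> 108.0
```

Because each function is stateless, the same model can be re-implemented independently on client and server, as the configurator post describes, and checked cell-by-cell against the spreadsheet.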

  • Test? (Score:5, Insightful)

    by JohnGrahamCumming (684871) * <slashdot@NosPAm.jgc.org> on Friday February 13, 2004 @03:30PM (#8272501) Homepage Journal
    I find it enlightening that this article does not include the word "test" once. Rather than spending a lot of time hoping that the purest use of OO technology or some other fancy boondoggle is going to make software better, actually writing tests that describe the expected behaviour of the program is a damn fine way to make sure that it actually works.

    Picking just one program from my experience, POPFile: initially we had no test suite; it quickly became apparent that the entire project was unmanageable without one, and I stopped all development to write from scratch a test suite for 100% of the code (currently stands around 98% code coverage). It's particularly apparent when you don't have all day to spend fixing bugs because the project is "in your spare time" that it's vital to have fully automatic testing. You simply don't have time to waste fixing bugs (of course if you are being paid for it then you do :-)

    If you want to be really extreme then write the tests first and then write the program that stops the tests from breaking.
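The test-first approach suggested above looks like this with Python's unittest (a toy example; `classify` is an invented stand-in, not POPFile's actual code):

```python
import unittest

# Written first: the tests state the expected behaviour before the
# implementation exists.
class TestClassify(unittest.TestCase):
    def test_high_score_is_spam(self):
        self.assertEqual(classify(0.95), "spam")

    def test_low_score_is_ham(self):
        self.assertEqual(classify(0.1), "ham")

# Written second: just enough code to stop the tests from breaking.
def classify(score):
    return "spam" if score >= 0.9 else "ham"

suite = unittest.TestLoader().loadTestsFromTestCase(TestClassify)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # -> True
```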

    John.
    • Re:Test? (Score:4, Insightful)

      by ziggy_travesty (611150) on Friday February 13, 2004 @03:45PM (#8272687)
      I agree. From the disposition of her interview, it seems like testing is beneath her; programs should will themselves to work flawlessly. Just like NASA's reflector tests on the Hubble... right. This is definitely hand-waving. She whines about how modern OO languages aren't intuitive for certain relationships and offers no concrete (or abstract) solution for these shortcomings. The bottom line is: software has bugs because it is complex. Deal with it. It's very hard to write large, quality applications. We need more skilled and better-educated engineers, not more language constructs. Launching a space shuttle or writing a weapons targeting system will never be an intuitive process. Also, intuition and simplicity will never be a substitute for testing. What malarkey.

      -AZ
      Strengths: whining and moaning
      Weaknesses: independent thought
    • Re:Test? (Score:3, Insightful)

      by rwa2 (4391) *
      There are two ways to assure quality work, both in manufacturing & software.

      One is to have inspectors look at everything and make sure they're right. QA or "testing"

      The other is to actually fix the broken machines / processes that are stamping out broken widgets / buggy software in the first place. I think she's after this path.

      Of course, you still need both.
  • by irhtfp (581712) on Friday February 13, 2004 @03:31PM (#8272502)
    This is all well and good, but I think there are two primary reasons software continues to be buggy:

    The first is the intense pressure to get the product to market. This is especially true for custom code, written specifically for one client. They want it fast and cheap and in order to satisfy this desire, code invariably gets released/installed before it's ready. Then the "month of hell" starts as the client starts complaining about bugs, "bugs" and other problems and we bend over backwards to get it right.

    As an ISV, we have no choice but to do it this way. If we don't quote the project with this in mind, the client will hire somebody else with a better "can-do attitude".

    The second big reason software is buggy is because all the underlying tools (e.g. code bases, code objects, .dlls, etc.) are buggy as hell. I spend more time working around inherent bugs than I do debugging my own code.

    Most programmers are perfectly capable of making their own code solid, given enough time.

    • by FreshFunk510 (526493) on Friday February 13, 2004 @03:48PM (#8272724)
      I like to compare it to civil engineering.

      Civil engineering is superior for 2 reasons. 1) Time of QA and 2) Dependability of materials.

      In short, look at the time it takes to QA a bridge. Not only is there QA from design to finish, but real load testing is done. Although software does have serious QA, the ratio of time spent on QA to time spent actually building is far higher in civil engineering.

      Dependability. The thing with physical construction is that you can always take it for granted that the nuts, bolts and wires you use can handle a certain amount of pressure and force. Why? Because of the distinct sameness of every nut, bolt and wire ever built. One nut is the same as the other nut. All nuts are the same.

      In software, not all "nuts" are the same. One person's implementation of a string search can vary widely. Yes, we do have libraries that handle this issue, but there is a higher chance of error in software construction because of the ratio of (third-party) libraries used that are not as robust.

      Lastly, one reason why software hasn't been addressed with the same urgency is because of the consequences (or lack thereof). When a bridge is poorly built, people die. Laws go into effect, companies go out of business, and many people pay the price. When software starts failing, a patch is applied until the next piece of it starts failing, when another patch is applied. In the end the software becomes a big patched-up piece of crap.

      One advantage, though, of software is that a new version can be released with all the patches in mind and redesigned. :) This certainly has been proven by products like Mozilla, which was probably crap when first released but has definitely matured into a solid product (imho).
      • While I agree with most of what you said, I do see one difference to the "bridge" analogy and that is fault tolerance.

        Bridges are built to be extremely fault tolerant. MechEs and CivEs use safety factors - big ones. Multiple bolts must fail before the structure becomes critical. Adding safety factors in mechanical structures is relatively cheap and easy.

        In most software, nearly everything is critical in some way due to the logical step-by-step nature of code execution. It's possible to write good fa

      • Anyone can call themselves a programmer, or even a software engineer. Someone who graduates from a BCS program is required[1] to do zero practical work in the field before they get their degree - which is the height of their qualifications.

        Engineers may graduate, but they require at least a few years of work before they can be licensed. Lawyers have to pass tests beyond those based in the fantasy world of academia. Medical doctors require years of on-the-job training under close supervision before they are turned loose. All of these professions are self-governed and discipline their members if - nay, when - one of them screws up. Potentially they can lose their license.

        The IT world has no such professional designation. The IT world has no such self-governing body. Companies/individuals in the IT world can consistently produce almost criminally negligent code and, provided they bid low, will survive.

        MDs, PEngs (and even lawyers) can always refuse to do something if it is clearly dangerous, unsafe, or illegal. Their clients can't really go anywhere else to get that task done, as all professionals will be bound by the same rules.

        Even trades: plumbers, carpenters, electricians, pipe fitters... all have some non-academic certification process. Most, beyond a (say) two-year school program, have to have years of apprentice work before they can be qualified. They are required by law to build things to some safety standard, building code, electrical code, fire code....

        Anyone can call themselves a programmer.

        The closest thing the IT world has is various certs from for-profit companies. But they are generally for variations on systems administration rather than programming. While, so far as I know, they can't be revoked for cause, they do all expire after some finite time.

        What the IT world needs is the equivalent of a PEng professional 'grade' designation for, e.g., people with a 4-year BCS level of schooling, and also a trades-grade designation for 2-year community college types. Implicitly, from there you get a higher-quality product, because the people designing the product (PEng-grade types) and the people implementing it (trade-grade types) have higher obligations than just to the customer. They would have professional responsibilities, violation of which could cause them to lose their respective licenses. This would solve most of the bugs caused by cutting corners to save on cost, releasing before it's done, etc. By no means all, but a lot.

        [1] yes, some schools have Co-Op programs. But I know of none that are requirements.
    • by Slak (40625) on Friday February 13, 2004 @04:26PM (#8273212)
      The zero-th reason software is buggy is the state of requirements. I've seen so many requirements documents that lack any form of internal consistency.

      These issues don't seem to be addressed until the rubber hits the road - when code starts compiling and demos are given. The pressure to market builds, as these issues are being resolved. Unfortunately, that's when The Law of Unintended Consequences strikes, scrapping much of the existing codebase.

      How can a programmer make "their own code solid" when the work it is supposed to perform is not clearly defined?

      Cheers,
      Slak
      • Exactly.

        There are too many Marketing droids talking about the solution they visualize without ever clearly formulating the use case.

        A lot of people can make feature suggestions. Without a clear picture of what the user actually wants to accomplish, the feature suggestions can't be evaluated for usefulness, nor can better suggestions be made to solve the user's problem. Varying solutions also can't be compared on their merits. And so it gets down to an "I have more contact with the customer so I must

  • by brokeninside (34168) on Friday February 13, 2004 @03:31PM (#8272511)
    To produce bugless software we need to start with software designs that are provably correct and then produce code that is provably in line with the design. Using more objects that more closely model the "real world" is an invitation to producing a larger number of bugs as the ambiguity of the real world infects the design and implementation of the program.
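One lightweight, executable step toward code that is "provably in line with the design" is design-by-contract-style assertions, sketched here in Python (the function and its contracts are an invented illustration, not a real verification system):

```python
def integer_sqrt(n):
    # Precondition from the design: input must be non-negative.
    assert n >= 0, "precondition violated: n must be >= 0"
    r = int(n ** 0.5)
    # Correct any floating-point drift in either direction.
    while r * r > n:
        r -= 1
    while (r + 1) * (r + 1) <= n:
        r += 1
    # Postcondition: r is the exact floor of the square root.
    assert r * r <= n < (r + 1) * (r + 1), "postcondition violated"
    return r

print(integer_sqrt(17))  # -> 4
```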
    • To produce bugless software we need to start with software designs that are provably correct and then produce code that is provably in line with the design. Using more objects that more closely model the "real world" is an invitation to producing a larger number of bugs as the ambiguity of the real world infects the design and implementation of the program.

      You're absolutely right. Some people think that turning to the "real world" for guidance is a good idea, but I've found that it only confuses things. Nobo
      • A fair point. The trick is to take a much more principled approach to analyzing real-world entities and relationships. This is the field of formal ontology [stanford.edu], a branch of philosophy.

        Good ontology modelling software would check assumptions about objects such as "if you remove a man's arm, he is still considered the same man" (in business context, yes) and "a company is the same as the people who work in it" (it's not). Basic stuff; people tend to know it intuitively, but that intuition tends not to make it
      I'm sorry, but I've heard this argument from software design "purists" one too many times. The term "provably correct" as applied to software *design* is laughable. For software development we have unit tests, which take some effort but are well worth the overhead. The only way to prove that a software design matches real-world requirements is to implement the design, and completely test its interaction within the domain.

      Simply put, the proof is in the pudding. You can cleanly define (and test) the in
  • by richmaine (128733) on Friday February 13, 2004 @03:36PM (#8272567)
    So she wants to make software more intuitive and wants to make it more like the real world.

    Perhaps she should make up her mind. :-)
  • avoiding the obvious while promising the impossible

    this is an exercise in wish-fulfillment, in suspending disbelief

    writing software with less bugs by making things more intuitive and less hierarchical?

    i mean, that's funny!

    we're talking about telling machines what to do, that is what software writing is

    writing software is an extremely hierarchical exercise, the art is giving people what they want
  • by pottymouth (61296) on Friday February 13, 2004 @03:41PM (#8272644)


    "....especially because I've always thought that the principles of fuzzy logic should be exploited far more widely in software engineering. Still, my quest for the answer to Jaron's question seems to yield ideas orthogonal to his own. "

    I fear people that talk like this. It makes me wonder if they go home at night and plug themselves into something.....

  • by The Slashdolt (518657) on Friday February 13, 2004 @03:41PM (#8272645) Homepage
    "The constant security-related problems associated with Microsoft's products are due to its fundamental platform architecture. Java technology, in contrast, enjoys exceptional immunity to viruses because of its sandbag architecture."

    I think she means sandbox architecture [javaworld.com]
  • by GSVNoFixedAbode (398577) <gf...hyland@@@es...co...nz> on Friday February 13, 2004 @03:43PM (#8272670) Homepage
    "and more closely simulate and resemble the real world". Hey, I know! How about a COmmon Business-Oriented Language? We could call it COBOL perhaps.
  • Jaron Who? (Score:3, Interesting)

    by popo (107611) on Friday February 13, 2004 @03:45PM (#8272685) Homepage

    No really... does anyone care about Jaron Lanier?

    I'd put his contributions to technology right up there with Esther Dyson's.

    He's another person who calls himself a "visionary" because the specifics of technological development are far beyond his capacity.

    He is, always was, and always will be, a non-player.

  • by bigattichouse (527527) on Friday February 13, 2004 @03:46PM (#8272702) Homepage
    Most of my apps have been moving to a "State Machine"-based workflow. Each item of work or task sits in front of you, and only gives you the necessary choices to move it along. Once the engine is in place, you end up doing these simple "OK, let's build the code to show them enough info to make a choice" steps, and the idea translates well to automated processes: just pop the next item from the queue and work on it.
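A minimal sketch of such a state-machine workflow engine in Python (the states and action names are invented for illustration):

```python
# Each state exposes only the legal next choices, as described above.
TRANSITIONS = {
    "new":         {"assign": "in_progress"},
    "in_progress": {"finish": "review", "block": "blocked"},
    "blocked":     {"unblock": "in_progress"},
    "review":      {"approve": "done", "reject": "in_progress"},
}

def choices(state):
    # The only options the UI should offer for this item.
    return sorted(TRANSITIONS.get(state, {}))

def advance(state, action):
    if action not in TRANSITIONS.get(state, {}):
        raise ValueError(f"{action!r} not allowed from state {state!r}")
    return TRANSITIONS[state][action]

state = advance("new", "assign")
print(choices(state))  # -> ['block', 'finish']: only the legal moves
```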
  • I agree, somewhat (Score:5, Insightful)

    by Captain Rotundo (165816) on Friday February 13, 2004 @03:48PM (#8272722) Homepage
    A lot of the article is common sense. But I have been perturbed by the ease with which a lot of people seem to claim that OO is the be-all and end-all of everything.

    Even on simple projects I have sometimes found myself designing to fit into Java's OO and not to fit the problem. It's really a language issue when it comes down to it. I am most comfortable in C, so I start writing a Java app and can feel myself being pulled into squeezing round objects into square holes. You have to then step back and realize what's happening before you go too far. I think this is the main source of "design bugs" for me: either ignoring the strengths of a system (not taking advantage of Java's OO) or trying to squeeze a design that is comfortable without billions of objects into some vast OO system, in effect falling into the weakest parts of a language.

    It's probably very similar to the ways people screw up a second spoken language, mis-conjugating verbs and whatnot - using the style they are already most familiar with.

    So with that, it's such ridiculously common sense to say we need an all-encompassing uber-language that is completely intuitive; I just would like to see someone do it rather than go on about it.

    Why not experiment with adding every feature to Java that you feel it lacks, to see if you can achieve that? Because then you end up with Perl :)

    Seriously, programming languages are the way they are by and large because they have to be designed to fight whatever problems they were created to take care of. It's a bit foolish to say we need a language that is perfect for everything; instead you look at what your problems are and develop a language to fight those. Invariably you end up with failings in other areas, and the incremental process continues.
  • by mysterious_mark (577643) on Friday February 13, 2004 @03:49PM (#8272731)
    I think the current tools exist to produce code that is way less buggy. For instance, much of the industrial code I have seen written in Java poorly uses the OO capabilities of Java, and this in itself causes more bugs and maintainability problems. It seems fairly rare that the existing OO languages and tools are well used, at least at the application-developer level. So I think the problem is really more with the ability of developers to use proper and well-implemented OO methodology.

    It also seems as though the design process is flawed, in that class designs etc. are often done a priori, with insufficient in-depth understanding of the process that is being modelled. This is usually because management insists upon having an immutable design before development starts, and often before the problem and processes the code is being written for are sufficiently understood.

    Bottom line: you can hand anyone a scalpel, but it doesn't make them a surgeon. Skilled developers with a good understanding of the underlying process they are coding for will produce better-quality, more maintainable code. Because it is developer skill that is the issue, not tools, the current race to the bottom to offshore all development to the lowest bidder and lowest developer skill will inherently produce less maintainable, buggier code. The solution to less buggy code is to use skilled programmers who really understand and can use the available OO techniques (i.e. NOT offshore, NOT H1-B, etc.). I think it also helps if the developers have some understanding of the field for which they code, i.e. medical, financial, etc. When you go with the lowest bidder, you get what you pay for. (/rant) MM
    • by ratboy666 (104074) <fred_weigel AT hotmail DOT com> on Friday February 13, 2004 @05:06PM (#8273784) Homepage Journal
      "ability of developers to use proper and well implemented OO methodology..class designs...a priori with insufficient in depth..."

      The reason is that most developers INSIST that class structure should model the application domain. Even if it doesn't make the slightest lick of sense.

      Reason? Because of how OO was taught. Concrete to abstract, keeping in line with a problem domain.
      (coloured rectangle->rectangle->shape). This certainly makes teaching easier, but doesn't make for sensible class hierarchies.

      OO is separate from a class hierarchy. The only reason we HAVE a hierarchy is to allow code to be reused. Therefore, the proper hierarchy is not a taxonomy, it is the one that leverages the code maximally.

      As an example - Where to put a Date class?

      Smalltalk classifies a Date as a Magnitude -- things that can be compared. So comparisons can be leveraged (e.g. =). If it were NOT there, all comparisons would need re-implementation.

      Also Character should be a Magnitude as well.
      Maybe String, but that's a bit shaky (mixins help, it's comparable, but is a collection of Character).

      Where to put a class in the hierarchy should be driven by the principle of minimizing code. *NOT* modelling the real world. If you model the "real world" you are probably in a serious "world of hurt". Also, in this case, the OO "paradigm" isn't going to save you much in the way of coding (will save you debugging, hopefully).

      Avoidance of bugs...

      Stay away from stupid languages. Insist that optimization is the compiler's/computer's job. The Rosetta Stone is to ask for a factorial function, *without* specifying any details. Code it in the *most* natural way, and then test it with 10,000!

      Now, determine how much breakage has occurred (if any).

      The answer to LARGE projects is to write code ONCE, and be able to reuse it in any context that needs the same processing. I don't want to have to code the factorial algorithm for small integers, large integers, and really big integers.
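      That factorial test, as a sketch in Java using BigInteger (the point being that the one natural loop needs no separate small/large-integer variants):

```java
import java.math.BigInteger;

public class Fact {
    // The natural definition, written once; BigInteger absorbs the growth,
    // so the same code handles 5! and 10,000! with no breakage.
    static BigInteger factorial(int n) {
        BigInteger acc = BigInteger.ONE;
        for (int k = 2; k <= n; k++) acc = acc.multiply(BigInteger.valueOf(k));
        return acc;
    }

    public static void main(String[] args) {
        // Prints the size of 10,000! rather than its thousands of digits.
        System.out.println(factorial(10_000).toString().length() + " digits");
    }
}
```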

      I want the code to accommodate the data-type that is needed. If I sort, and use "" ordering, I want that to work across any datatype.

      If I have to re-implement, I lose on the previous work.

      Class hierarchies can help structure (look at Smalltalk), but are not often used in this way.

      Ratboy.
  • fluff (Score:5, Insightful)

    by plopez (54068) on Friday February 13, 2004 @03:50PM (#8272743) Journal
    Well... some interesting ideas in there, mainly flawed.

    1) The concept that software should 'feel' right to the developer. First of all, this cannot be formalized in any sense of the word. Secondly, even if it could be, it is focused on the wrong target: it should feel right to the end user/problem-domain experts. More about this in point 2.

    2) Software tools should model the real world. Well... duh. Any time you build software you are modeling a small part of the real world. The next question is: what part of the real world? The reason that OOP has not progressed farther is that the real world is so complex that you can only build some generic general-purpose tools and then have a programmer use those tools to solve a particular subset. So the programmer must first know what the problem domain is and what the tool set is capable of.

    3) Programmers should be average. Absolutely not. In order to model the real world, a good programmer must be able to retrain in an entirely new problem domain in a few months. This is what is missing in many cases; most people do not have that level of flexibility, motivation or intelligence, and it is difficult to measure or train this skill.

    4) Programmers shouldn't have to know math. Wrong again. Programming IS math. And without a basic understanding of math a programmer really does not understand what is going on. This is like saying engineers shouldn't need to know physics.

    5) The term 'bug' is used very loosely. There are at least 3 levels of bugs out there:
    a) Requirements/conceptual bugs. If the requirements are wrong, based on misunderstanding, you can write great software that is still crap because it does not solve the correct problem. This can only be solved by being a problem-domain expert, or relying heavily on experts (a good programmer is humble and realizes that this reliance must exist).

    b) Design flaws, such as using the wrong search, a bad interface, or poor security models. This is where education and experience come in.

    c) Implementation bugs, such as fencepost errors and null-pointer dereferences. This can be largely automated; Java, Perl and .Net are eliminating many of those issues.

    In short, a bad simplistic article which will probably cause more harm than good.
  • by farnz (625056) <slashdot@NoSPAm.farnz.org.uk> on Friday February 13, 2004 @03:51PM (#8272752) Homepage Journal
    I noticed that she touched on strong typing as an aid to avoiding bugs; would a really strong type system help avoid bugs, or would it just introduce bugs into people's data types?

    I ask because I'm currently looking into dependent type systems, which aren't currently practical. However, their claim to fame is that the type system is much more expressive; it is possible to define types like "date" or "mp3" in them, and ensure that wrong data cannot be supplied to functions. As I play, though, I get the feeling that if the type system is too powerful, people will just create bugs in types, and we'll not improve by as much as we could.
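    Short of full dependent types, even a plain wrapper type moves some of that checking to the compiler. A sketch (the Month type here is invented for illustration):

```java
public final class Month {
    private final int value; // invariant: always 1..12 once constructed

    private Month(int value) { this.value = value; }

    // The only way in: invalid data is rejected at the boundary, so every
    // Month a function receives is already well-formed. Note the parent's
    // worry still applies -- a bug can simply move into this check.
    public static Month of(int value) {
        if (value < 1 || value > 12)
            throw new IllegalArgumentException("month out of range: " + value);
        return new Month(value);
    }

    public int asInt() { return value; }
}
```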

  • by Anonymous Coward on Friday February 13, 2004 @03:51PM (#8272763)
    expanding the pure object-oriented paradigm

    1. WTF does that mean? It's all just buzzwords. Woohoo. Another buzzword engineer. Just what the world needs.

    2. Making programmers program in an OO paradigm doesn't stop bugs. So why should "expanding the pure object-oriented paradigm" do anything productive?

  • by G4from128k (686170) on Friday February 13, 2004 @03:56PM (#8272806)
    Having watched many people struggle with physics, chemistry, and biology courses, I'm not sure that the real world is all that intuitive. Even in the non-scientific, informal world, many people have incorrect intuitive models of how things work. For example, many people think that increasing the setting on the thermostat will make the room warm up faster (vs. warming at a constant rate, but eventually reaching a higher temperature). And my wife still thinks that turning off the TV will disrupt the functioning of the VCR.

    Another problem is that the real world is both analog and approximate, while the digital world calls for hard-edged distinctions. In the real world, close-enough is good enough for many physical activities (driving inside the white lines, parking near a destination, cooking food long enough). In contrast, if I am moving or removing files from a file system, I need an algorithm that clearly distinguishes between those in the selection set and those outside it.

    I like the idea of intuitive programming, but suspect that computers are grounded in logic and that logic is not an intuitive concept.
    • I like the idea of intuitive programming, but suspect that computers are grounded in logic and that logic is not an intuitive concept.

      Slightly OT:
      This is indeed hitting the nail on the head. My father and I have lots of disagreements on the issue of "common sense." We are both smart, but he tends to fall behind when it comes to explaining ... or rather, just ACCEPTING unknowns and their repercussions in logic. Say, if I withhold a fact in an argument but claim to be "right," he will say there i

  • by Tablizer (95088) on Friday February 13, 2004 @03:59PM (#8272832) Homepage Journal
    After many debates and fights over paradigms and languages, it appears that everyone simply thinks differently. The variety is endless. There is no universal model that fits everyone's mind well.

    As far as "modeling the real world", my domain tends to deal with intellectual property and "concepts" rather than physical things. Thus, there often is no real world to directly emulate. Thus, the Simula-67 approach, which gave birth to OOP, does not extrapolate very well.

    Plus, the real world is often limiting. Many of the existing manual processes have stuff in place to work around real-world limitations that a computer version would not need. It is sometimes said that if automation always tried to model the real world, airplanes would have wings that flap instead of propellers and jets. (Do jets model farting birds?)

    For one, it is now possible to search, sort, filter, and group things easily by multiple criteria with computers. Real-world things tend to lack this ability because they can only be in one place at a time (at least above the atomic level). Physical models tend to try to find the One Right Way to group or position rather than take full advantage of virtual, ad-hoc abstractions and grouping offered by databases and indexing systems.
  • by Dirtside (91468) on Friday February 13, 2004 @04:00PM (#8272846) Journal
    - I accept that humans are fallible, and as long as software is produced by humans, or by anything humans create to produce software for them, the software will have bugs.

    - I accept that there is no magic bullet to programming, no simple, easy way to create bug-free software.

    - I will not add unrelated features to programs that do something else. A program should concentrate on one thing and one thing only. If I want a program to do something unrelated, I will write a different program.

    - I will design the structure of the program, and freeze its feature set, before I begin coding. Once coding has begun, new features will not be added to the design. Only when the program is finished will I think about adding new features to the next version. Anyone who demands new features be added after coding has begun will be savagely beaten.

    - A program is only finished when the time and effort it would take to squash the remaining obscure bugs exceeds the value of adding new features... by a factor of at least two.

    - If I find that the design of my program creates significant problems down the line, I will not kludge something into place. I will redesign the program.

    - I will document everything thoroughly, including the function and intent of all data structures.

    - I will wish for a pony, as that will be about as useful as wishing that people would follow the above rules. :)
  • by scorp1us (235526) on Friday February 13, 2004 @04:12PM (#8273021) Journal
    I was thinking about the same exact thing the other day. It's 2004; where are our common primitives?

    glibc is 'it' but it still gets updates, bug fixes, etc. It is not used on every platform. Yet it gets recreated over and over again.

    Then I thought about .Net. Finally, any language gets an interface to any other language's compiled objects. So we're getting closer.

    But I think the biggest problem is the lack of software engineering from flow-charting. As mentioned, flowcharts allow us to map out or learn the most complicated software.

    I think we can accomplish all she describes inside an OOP language, be it Java or C++ or Python. The master-slave relationship is easily done. The cooler thing that I would like to see more of is the state.

    Rather than a process starting off in main(), and ini code run in constructors, each process and object need to have a state associated with it. This state is actually a stack, and not a variable.

    my_process {
        register state handler 'begin' as 'init'
        register state handler 'end' as 'exit'
        state change to 'begin'
    }

    init() {
        do_something();
        register handler condition (x=1, y=1, z=1) as 'all_are_one'
    }

    all_are_one() { // special state
        state change to 'in_special_case'
        do_something_else();

        pop state
        if (exit_condition) exit()
    }

    exit() {
        while (state_stack.length) pop state
    }

    What I'm trying to do is model the logical process with the execution of code, but in an asynchronous manner. Sort of like a message pump, but extended to take process stages and custom events.
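    In plain Java, the state stack the parent describes might look something like this (a sketch; the class and state names are made up):

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.HashMap;
import java.util.Map;

public class StatefulProcess {
    private final Deque<String> states = new ArrayDeque<>();
    private final Map<String, Runnable> handlers = new HashMap<>();

    void register(String state, Runnable handler) { handlers.put(state, handler); }

    // Entering a state pushes it onto the stack and runs its handler --
    // a message pump dispatching on the current stage instead of falling
    // through main() top to bottom.
    void enter(String state) {
        states.push(state);
        handlers.getOrDefault(state, () -> {}).run();
    }

    void pop() { states.pop(); }

    String current() { return states.peek(); }

    public static void main(String[] args) {
        StatefulProcess p = new StatefulProcess();
        p.register("begin", () -> System.out.println("init"));
        p.enter("begin");
        p.enter("in_special_case");
        p.pop(); // back to "begin"
        System.out.println(p.current());
    }
}
```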
  • by Cereal Box (4286) on Friday February 13, 2004 @04:26PM (#8273208)
    Why is it so difficult, if not impossible, to write bug-free programs that contain more than 20 to 30 million lines of code?

    Maybe because the programs contain 20 to 30 million lines of code.

    Look, I understand that a lot of people are yearning for the good old days when software was less buggy. You know what? I suppose that if your entire application consists of something like 4000 assembly code instructions, you might just be able to make the program bug-free.

    But it's not 1983 anymore and programs are on the order of millions of lines of code. Of course it's not feasible to go over the entire program manually and root out every single bug. The stuff I work with every day is considered extremely small and yet it depends on all sorts of external libraries, each of which may have dependencies, etc. It all adds up to amazingly large amounts of code. But, it requires large amounts of code to do extremely complicated things. Is this a surprise to her or something? I don't think there's any "paradigm shift" in the field of programming that's going to change the fact that:

    * Doing complicated things requires lots of code.
    * The more code you write, the higher the chance of bugs.

    I reiterate: duh...
  • by richieb (3277) <[moc.liamg] [ta] [beihcir]> on Friday February 13, 2004 @04:26PM (#8273217) Homepage Journal
    She says:
    It is widely known that few significant development projects, if any, finish successfully, on time, and within budget.

    What bothers me about statements like this, is that no one is suggesting that perhaps our estimation and budgeting methods are off.

    What if someone scheduled one week and allocated $100 for the design and construction of a skyscraper, and the engineers failed to deliver -- who should be blamed? The engineers?!

    • by dutky (20510) on Friday February 13, 2004 @06:02PM (#8274571) Homepage Journal
      richieb [slashdot.org] wrote
      She says:

      It is widely known that few significant development projects, if any, finish successfully, on time, and within budget.


      What bothers me about statements like this, is that no one is suggesting that perhaps our estimation and budgeting methods are off.


      What if someone scheduled one week and allocated $100 for the design and construction of a skyscraper, and the engineers failed to deliver -- who should be blamed? The engineers?!


      First, there are lots of folks who have been saying, for a long time, that our estimation and budgeting methods are inadequate: Fred Brooks [amazon.com] and Tom DeMarco [amazon.com] are just two of the best known advocates of this position. It seems, unfortunately, that it is not a message that many folk like to hear. It is, I guess, easier (and more expedient) to blame the tools or the craftspeople than to figure out what really went wrong.

      Second, your example would be more apt if the building materials (steel and concrete) or the blueprints and construction tools were being blamed for cost overruns and schedule slips. No one would suggest that building skyscrapers would be easier and more reliable if the bricks and jackhammers were more intuitive.

      What she is saying smacks of silver bullets (see Fred Brooks' The Mythical Man-Month, chapter 16: "No Silver Bullet -- Essence and Accident in Software Engineering" [virtualschool.edu] (and succeeding chapters in the 20th Anniversary Edition)) and just can't be taken seriously. To summarize Brooks:

      There is simply no way to take the programming and software engineering tasks and make them easy: they are difficult by their very essence, not by the accident of what tools we use.

      While we may be able to devise languages and environments that make the creation of quality software by talented experts easier, we will never be able to make the creation of quality software easy and certain when undertaken by talentless hacks, amateurs and dilettantes. Unfortunately, the latter is what is desired most by managers, because it would mean that the cost of labor could be greatly reduced (by hiring cheaper or fewer warm bodies). It also happens to be the largest market, at least in the past two decades, for new development tools: think of the target markets for VisualBASIC, dBASE IV, HyperCard and most spreadsheets.
  • by barryfandango (627554) on Friday February 13, 2004 @04:52PM (#8273573)
    "Livschitz grew up in Lithuania, where she was the women's chess champion and a National Chess Master in 1988 -- the same year in which she won the prestigious Russian national junior mathematical competition."

    She's the real story here. I think I'm in love.

  • by BrittPark (639617) on Friday February 13, 2004 @05:35PM (#8274186) Homepage Journal
    Because of human nature and because of the extreme complexity of the ideas we attempt to encapsulate in non-trivial software, buglessness is not an achievable goal, regardless of the methodology of the day. The interviewee seems to think that there is some magic bullet waiting (in new tools or methodologies I guess). This shows a fundamental rift between her and reality, and makes her opinions fundamentally suspect.

    The goal in any real software project is to meet customer's (and I use that in the broadest sense) expectations adequately. What is adequate? That depends on the software. A user of a word processor for instance is likely to not mind a handful of UI bugs or an occasional crash. A sales organization is going to expect 24/7 performance from their Sales Automation Software.

    The canny programmer (or programming group) should aim to produce software that is "good enough" for the target audience, with, perhaps, a little extra for safety's sake (and programmer pride).

    Of course, there are real differences among the tools and methodologies used in getting the most "enough" per programmer-hour. Among the ones I've come to believe in are:

    1. Use the most obvious implementation of any module unless performance requirements prohibit.

    2. Have regular code-reviews, preferably before every check-in. I've been amazed at how this simple policy reduces the initial bug load of code. Having to explain one's code to another programmer has a very salutary effect on code quality.

    3. Hire a small number of first class programmers rather than a larger number of lesser programmers. In my experience 10% of the programmers tend to do 90% of the useful work in large software projects.

    4. Try to get the technical staff doing as much programming as possible. Don't bog them down with micromanagement, frequent meetings, complex coding conventions, arbitrary documentation rules, and anything else that slows them down.

    5. Test, test, test!
  • Old ideas.... (Score:3, Interesting)

    by FoFi (143144) on Friday February 13, 2004 @06:42PM (#8275006)

    I couldn't stop thinking of existing theories and/or implementations of her ideas...

    Modeling processes outside the OO paradigm (opposite to what design patterns started to sacralize, for example) is precisely the subject of so-called business rules. But BR people are close to the relational model of data, which is too quickly assimilated with SQL DBMSs(*), so OO-oriented people don't buy it (see the almighty impedance mismatch).

    Data structures other than trees and collections are already generically implementable in any modern OO language. See Eiffel, for example, which has been able to do that perfectly for 15 years (parametric classes, with full type safety). Maybe Java generics will help to build highly reusable data structures... I doubt that; anchored types are missing (i.e. the possibility to declare the type of a variable as equal to another type, nearly mandatory when dealing with inheritance of generics).

    Tom.

    (*) I warmly recommend the writings of Chris Date and Fabian Pascal to really see how the relational model of data is different from SQL databases... see DBDebunk [dbdebunk.com] for references.

  • by ninejaguar (517729) on Friday February 13, 2004 @08:09PM (#8275711)
    From the post:
    "software should more closely simulate the real world"

    From the article: "It's not the prevention of bugs but the recovery -- the ability to gracefully exterminate them -- that counts."

    While the need to gracefully recover your design from bugs (bugs come from design, or lack of it, not code) is laudable, the proper technique is to design without bugs in the first place. Assuming that you're actually meeting the business requirements or functional specifications, there is a straightforward method for flattening bugs before they become fruitfully overripe and multiply.

    Once you have obtained the proper [amazon.com] requirements [amazon.com] (your goals), and after you've properly atomized them into their smallest component parts, you need to model those parts. Once you've modeled those parts, you need to test the model. This works in single-process design, but it really shines in concurrency, where anyone can truly screw up.

    Get a good [amazon.com] book [amazon.com] on design [amazon.com]. Then get a good [amazon.com] book [ic.ac.uk] on modelling, mechanically analyzing, and testing those designed processes before committing to code.

    = 9J =

  • by Dominic_Mazzoni (125164) * on Friday February 13, 2004 @09:34PM (#8276344) Homepage
    The article doesn't seem to address all of the different types of bugs, nor how to best address them. Anyone care to add to or refine this list?

    1. Algorithmic bugs - you have a function with well-defined input and output, and it does the wrong thing (may include giving the wrong answer, looping forever, leaking memory, or taking too long to return). Can be avoided with a combination of code review, unit tests, and correctness proofs when possible.
    2. Interface bugs - this includes validating input, both from the user and over the network or other ways in which your program gets input data. These bugs include buffer overruns, GUI bugs caused by an unanticipated sequence of clicks, etc. These bugs are mostly found by testing, but sometimes also with automatic code checkers or memory debuggers that highlight potential problems.
    3. Bugs in the operating system or in sublibraries - any large project depends on large volumes of operating system code and usually lots of other libraries. These systems almost certainly have bugs or at the very least undocumented or inconsistent behavior. The only way to avoid this is to validate all OS responses and do lots of testing.
    4. Cross-platform bugs - a program could work perfectly on one system, but not on another. Best way to address this is to abstract all of the parts of your program that are specific to the environment, but mostly this just requires lots of testing and porting.
    5. Complexity bugs - bugs that start to appear when a program or part of a program gets too complicated, such that changing any one piece causes so many unintended side-effects that it becomes impossible to keep track of them. This is one of the few areas where good object-oriented design will probably help.
    6. Poor specifications - these are not even necessarily bugs, just cases where a program doesn't behave as expected because the specifications were wrong or ambiguous. The way to avoid this is to make sure that the specifications are always clear. Resolve any potential ambiguities in the specs before finishing the code.

    My overall feeling is that there are so many different types of bugs in a real-world programming project, and any one technique (like object-oriented design) only helps address one type of bug.
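    For bug type 1 in the list above, a unit test is the cheapest guard. A sketch with plain assertions (no framework assumed; run with `java -ea`), using the classic fencepost example from elsewhere in this thread:

```java
public class RangeCount {
    // How many integers lie in [lo, hi] inclusive? The classic fencepost
    // mistake is to write hi - lo; the correct count is hi - lo + 1.
    static int countInclusive(int lo, int hi) {
        return hi - lo + 1;
    }

    public static void main(String[] args) {
        // Unit tests pin down the well-defined input/output contract,
        // especially at the edge cases where fencepost bugs hide.
        assert countInclusive(3, 3) == 1;   // a single point counts once
        assert countInclusive(1, 10) == 10;
        System.out.println("all checks passed");
    }
}
```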
  • by master_p (608214) on Saturday February 14, 2004 @07:05AM (#8278723)
    This Slashdot discussion about software engineering is one of the best I've ever read. It should be almost mandatory for students of computing to study. Lots of posters have raised very significant points about why it seems impossible to write bugless software.

    Let me add my 2 cents: the problem is that computer programs represent the 'how' instead of the 'what'. In other words, a program is a series of commands that describes how things are done, instead of describing what is to be done. A direct consequence of this is that they are allowed to manipulate state in a manner that makes complexity skyrocket.

    What did object orientation really do for humans? It forced them to reduce management of state... it minimized the domain that a certain set of operations can address. Before OO, there was structured programming: people were not disciplined (or good) enough to define domains that are 100% independent of each other... there was a high degree of interdependency between various parts of a program. As applications got larger, the complexity reached a state where it was unmanageable.

    All todays tools are about reducing state management. For example, garbage collection is there to reduce memory management: memory is at a specific state at each given point in time, and in manual memory management systems, the handling of this state is left to the programmer.

    But there is no real progress in the computing field! OO, garbage collection, aspect-oriented programming and all other conventional programming techniques are about reducing management of state; in other words, they help answer the 'how' question, but not the 'what' question.

    Here is an example of 'how' vs. 'what': a form that takes input and stores it in a database. The 'what' is that 'we need to input the person's first name, last name, e-mail address, user name and password'. The 'how' is that 'we need a textbox there, a label there, a combobox there, a database connection, etc.'. If we simply answered 'what' instead of 'how', there would be no bug!!! Instead, by directly manipulating the state, we introduce state dependencies that are the cause of bugs.
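    A sketch of the 'what' version (everything here is invented for illustration): declare the fields once, and let generic code derive the 'how' -- rendering, validation, storage -- so there is no per-form widget/state code to get wrong.

```java
import java.util.List;
import java.util.Map;

public class DeclarativeForm {
    // The 'what': just the fields a person record needs, declared as data.
    record Field(String name, boolean required) {}

    static final List<Field> PERSON_FORM = List.of(
            new Field("firstName", true),
            new Field("lastName", true),
            new Field("email", true),
            new Field("userName", true),
            new Field("password", true));

    // The 'how' is derived generically: this one validator serves every
    // form built from Field declarations, not just PERSON_FORM.
    static List<String> missingFields(List<Field> form, Map<String, String> input) {
        return form.stream()
                .filter(f -> f.required() && !input.containsKey(f.name()))
                .map(Field::name)
                .toList();
    }
}
```

    (Requires Java 16+ for records and Stream.toList.)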

    Functional programming languages claim to have solved this problem, but they are not widely used, because their syntax is weird for the average programmer (their designers want to keep it as close to mathematics as possible), they are interpreted, etc. The average person who wants to program does not have a clue, society moves away from mathematics day by day, so functional languages are not used.

    The posted article of course sidesteps all these issues and keeps mumbling about 'intuitive programming' without actually giving any hint towards a solution.
