Programming Bug IT Technology

Intuitive Bug-less Software? 558

Starlover writes "In the latest java.sun.com feature at Sun's Java site, Victoria Livschitz takes on some ideas of Jaron Lanier on how to make software less buggy. She makes a couple of interesting points. First, making software more 'intuitive' for developers will reduce bugs. Second, software should more closely simulate the real world, so we should be expanding the pure object-oriented paradigm to allow for a richer set of basic abstractions -- like processes and conditions. The simple division of structures into hierarchies and collections in software is too simple for our needs, according to Livschitz. She offers a set of ideas explaining how to get 'there' from here. Comments?"
  • I'd rather have... (Score:2, Interesting)

    by American AC in Paris ( 230456 ) * on Friday February 13, 2004 @03:24PM (#8272408) Homepage
    Frankly, I think the better approach would be to make software developers less buggy. Educate them properly!

    So long as you allow developers to do such things as basic arithmetic and variable assignment, you're gonna have to deal with buggy code written by self-recursive sphincter-spelunkers.

  • by pc-0x90 ( 547757 ) on Friday February 13, 2004 @03:30PM (#8272495)
    She misuses the term functional programming; I'm assuming she meant imperative languages. A lot of the problems could be solved with true functional languages (Haskell, OCaml, etc.), but the learning curve is too high. Especially when you can get a team of second-rate VB coders for the price of one Haskell coder (if you can find one). But really, do you want working code now? Or perfect code in 10 years? That's where the problem is. Time.
  • by irhtfp ( 581712 ) on Friday February 13, 2004 @03:31PM (#8272502)
    This is all well and good, but I think there are two primary reasons software continues to be buggy:

    The first is the intense pressure to get the product to market. This is especially true for custom code, written specifically for one client. They want it fast and cheap and in order to satisfy this desire, code invariably gets released/installed before it's ready. Then the "month of hell" starts as the client starts complaining about bugs, "bugs" and other problems and we bend over backwards to get it right.

    As an ISV, we have no choice but to do it this way. If we don't quote the project with this in mind, the client will hire somebody else with a better "can-do attitude".

    The second big reason software is buggy is because all the underlying tools (e.g. code bases, code objects, .dlls, etc.) are buggy as hell. I spend more time working around inherent bugs than I do debugging my own code.

    Most programmers are perfectly capable of making their own code solid, given enough time.

  • by brokeninside ( 34168 ) on Friday February 13, 2004 @03:31PM (#8272511)
    To produce bugless software we need to start with software designs that are provably correct and then produce code that is provably in line with the design. Using more objects that more closely model the "real world" is an invitation to producing a larger number of bugs, as the ambiguity of the real world infects the design and implementation of the program.
  • avoiding the obvious while promising the impossible

    this is an exercise in wish-fulfillment, in suspending disbelief

    writing software with less bugs by making things more intuitive and less hierarchical?

    i mean, that's funny!

    we're talking about telling machines what to do, that is what software writing is

    writing software is an extremely hierarchical exercise, the art is giving people what they want
  • Jaron Who? (Score:3, Interesting)

    by popo ( 107611 ) on Friday February 13, 2004 @03:45PM (#8272685) Homepage

    No really... does anyone care about Jaron Lanier?

    I'd put his contributions to technology right up there with Esther Dyson's.

    He's another person who calls himself a "visionary" because the specifics of technological development are far beyond his capacity.

    He is, always was, and always will be, a non-player.

  • by mysterious_mark ( 577643 ) on Friday February 13, 2004 @03:49PM (#8272731)
    I think the current tools exist to produce code that is way less buggy. For instance, much of the industrial code I have seen written in Java poorly uses the OO capabilities of Java, and this in itself causes more bugs and maintainability problems. It seems fairly rare that the existing OO languages and tools are well used, at least at the application-developer level. So I think the problem is really more with the ability of developers to use a proper and well-implemented OO methodology.

    It also seems as though the design process is flawed, in that class designs etc. are often done a priori with insufficient in-depth understanding of the process that is being modelled. This is usually because management insists upon having an immutable design before development starts, and often before the problem and processes the code is being written for are sufficiently understood.

    Bottom line: you can hand anyone a scalpel, but it doesn't make them a surgeon. Skilled developers with a good understanding of the underlying process they are coding for will produce better-quality, more maintainable code. Because it is developer skill that is the issue, not tools, the current race to the bottom to offshore all development to the lowest bidder and lowest developer skill will inherently produce less maintainable, buggier code.

    The solution to less buggy code is to use skilled programmers who really understand and can use the available OO techniques (i.e. NOT offshore, NOT H1-B, etc.). I think it also helps if the developers have some understanding of the field for which they code, i.e. medical, financial, etc. When you go with the lowest bidder, you get what you pay for. (/rant) MM
  • by farnz ( 625056 ) <slashdot&farnz,org,uk> on Friday February 13, 2004 @03:51PM (#8272752) Homepage Journal
    I noticed that she touched on strong typing as an aid to avoiding bugs; would a really strong type system help avoid bugs, or would it just introduce bugs into people's data types?

    I ask because I'm currently looking into dependent type systems, which aren't yet practical. However, their claim to fame is that the type system is much more expressive; it is possible to define types like "date" or "mp3" in them, and ensure that wrong data cannot be supplied to functions. As I play, though, I get the feeling that if the type system is too powerful, people will just create bugs in types, and we won't improve by as much as we could.
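
    A rough sketch of the much weaker, mainstream version of this idea in Java (the IsoDate class and its checks are invented for illustration, not taken from any dependent-type system): the more "date-ness" the type itself can express, the less garbage a function taking a date can ever be handed.

    // Hypothetical value type: the only way to obtain an IsoDate is through the
    // checked factory, so any method with an IsoDate parameter gets valid data.
    public final class IsoDate {
        private final int year, month, day;

        private IsoDate(int year, int month, int day) {
            this.year = year;
            this.month = month;
            this.day = day;
        }

        public static IsoDate of(int year, int month, int day) {
            if (month < 1 || month > 12)
                throw new IllegalArgumentException("bad month: " + month);
            if (day < 1 || day > daysIn(year, month))
                throw new IllegalArgumentException("bad day: " + day);
            return new IsoDate(year, month, day);
        }

        private static int daysIn(int year, int month) {
            boolean leap = (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
            int[] len = {31, leap ? 29 : 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31};
            return len[month - 1];
        }

        public int year()  { return year; }
        public int month() { return month; }
        public int day()   { return day; }
    }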

  • by pokeyburro ( 472024 ) on Friday February 13, 2004 @04:22PM (#8273152) Homepage
    A fair point. The trick is to take a much more principled approach to analyzing real-world entities and relationships. This is the field of formal ontology [stanford.edu], a branch of philosophy.

    Good ontology modelling software would check assumptions about objects such as "if you remove a man's arm, he is still considered the same man" (in business context, yes) and "a company is the same as the people who work in it" (it's not). Basic stuff; people tend to know it intuitively, but that intuition tends not to make it into software, which causes breakage.
  • by MattRog ( 527508 ) on Friday February 13, 2004 @04:24PM (#8273187)
    Scatological reference aside, I do agree. Not only that, but educate them *properly*. Most higher education these days is a sham. They 'teach' products and not processes, cook-book approaches and not critical thinking.

    Dijkstra has a few things to say on the topic as well:

    On Education, Specifically:

    The ongoing process of becoming more and more an a-mathematical society is more an American specialty than anything else (It is also a tragic accident of history).

    The idea of a formal design discipline is often rejected on account of vague cultural/philosophical condemnations such as "stifling creativity"; this is more pronounced in the Anglo-Saxon world where a romantic vision of "the humanities" in fact idealizes technical incompetence. Another aspect of that same trait is the cult of iterative design.

    Industry suffers from the managerial dogma that for the sake of stability and continuity, the company should be independent of the competence of individual employees. Hence industry rejects any methodological proposal that can be viewed as making intellectual demands on its work force. Since in the US the influence of industry is more pervasive than elsewhere, the above dogma hurts American computing science most. The moral of this sad part of the story is that as long as the computing science is not allowed to save the computer industry, we had better see to it that the computer industry does not kill computing science.


    And then on Computer Science in general:
    "I hope very much that computing science at large will become more mature, as I am annoyed by two phenomena that both strike me as symptoms of immaturity.


    The one is the widespread sensitivity to fads and fashions, and the wholesale adoption of buzzwords and even buzz notes. Write a paper promising salvation, make it a "structured" something or a "virtual" something, or "abstract", "distributed" or "higher-order" or "applicative" and you can almost be certain of having started a new cult.

    The other one is the sensitivity to the market place, the unchallenged assumption that industrial products, just because they are there, become by their mere existence a topic worthy of scientific attention, no matter how grave the mistakes they embody. In the sixties the battle that was needed to prevent computing science from degenerating to "how to live with the 360" has been won, and "courses" -- usually "in depth"!-- about MVS or what have you are now confined to the not so respectable subculture of the commercial training circuit. But now we hear that the advent of the microprocessors is going to revolutionize computing science! I don't believe that, unless the chasing of dayflies is confused with doing research. A similar battle may be needed."
    --Edsger W. Dijkstra, My Hopes Of Computing Science, 1979
  • by Cereal Box ( 4286 ) on Friday February 13, 2004 @04:26PM (#8273208)
    Why is it so difficult, if not impossible, to write bug-free programs that contain more than 20 to 30 million lines of code?

    Maybe because the programs contain 20 to 30 million lines of code.

    Look, I understand that a lot of people are yearning for the good old days when software was less buggy. You know what? I suppose that if your entire application consists of something like 4000 assembly code instructions, you might just be able to make the program bug-free.

    But it's not 1983 anymore and programs are on the order of millions of lines of code. Of course it's not feasible to go over the entire program manually and root out every single bug. The stuff I work with every day is considered extremely small and yet it depends on all sorts of external libraries, each of which may have dependencies, etc. It all adds up to amazingly large amounts of code. But, it requires large amounts of code to do extremely complicated things. Is this a surprise to her or something? I don't think there's any "paradigm shift" in the field of programming that's going to change the fact that:

    * Doing complicated things requires lots of code.
    * The more code you write, the higher the chance of bugs.

    I reiterate: duh...
  • by brokeninside ( 34168 ) on Friday February 13, 2004 @04:29PM (#8273264)
    If I follow your train of thought to its natural conclusions, I should arrive at the idea that when building a bridge, it is not necessary to prove that the finished construction will be able to withstand the load that it bears. Would you agree with that assessment?

    As for cost, given the high rate of failure in the current system and the astronomical costs of bugs in current software products, I don't think that cost considerations support your argument.

    Further, virtually every software text in existence states that more time spent in design reduces defects, and yet very few projects spend sufficient time in the design stage. Proper design is currently the path not chosen, so its costs are unknown. One cannot reasonably argue that an unknown cost is higher than a known cost prior to the unknown cost being known.

    Lastly, your request for a proof exemplifies my point. You cannot offer such a proof and that is why Apache has to be patched. If Apache had been properly designed and constructed from the beginning, the only updates to Apache would be for new features. The cost of all the bugfixing that has gone into Apache over the years was unnecessary.

    Unfortunately, computer science is still in its relative infancy. It is currently more akin to a skilled trade than a science. Also, our system of education (at least in the US) is geared toward producing artisans of computers rather than computer scientists. One hopes that this will continue to change over time.

  • by American AC in Paris ( 230456 ) * on Friday February 13, 2004 @04:36PM (#8273338) Homepage
    You misinterpret what I've written.

    I'm not suggesting that we do away with basic arithmetic or variable assignment. You can't do that and still have a programming language. The very idea of writing a program of any complexity that doesn't incorporate basic arithmetic or variable assignment is just plain silly.

    Rather, I'm saying that so long as programmers can use such essential and basic functionality, the bad ones will find ways of producing buggy code. Inaccurate formulae. Hard-to-maintain code. Inefficient design. Poorly formed logic. Bad algorithm selection. You just can't 'fix' bad programmers with better languages.

    There's just no way to teach a compiler to recognize bad code design, and there's no way to tell a programming language, "do as I mean you to do, not as I say you to do." Yes, things like garbage collection and bounds-checking help prevent some bugs, but the really nasty ones--the ones that take ages to fix--are the result of good ol'-fashioned bad design and programming.

    As for the troll remark, go ahead and dig through my user info page. I may be snide at times, but I'm no troll.

  • by GeckoX ( 259575 ) on Friday February 13, 2004 @04:37PM (#8273351)
    I don't believe I've advocated anything in my posts...I was continuing a thought process.

    Actually, the very fact that we live in a binary world kinda makes your post redundant, we ended up here for a reason. However, does that mean we are not allowed to think outside of this paradigm? Can we not discuss things like this? Or shall we stay confined to our little box at all times?
  • by irhtfp ( 581712 ) on Friday February 13, 2004 @04:37PM (#8273354)
    While I agree with most of what you said, I do see one difference to the "bridge" analogy and that is fault tolerance.

    Bridges are built to be extremely fault tolerant. MechEs and CivEs use safety factors - big ones. Multiple bolts must fail before the structure becomes critical. Adding safety factors in mechanical structures is relatively cheap and easy.

    In most software, nearly everything is critical in some way due to the logical step-by-step nature of code execution. It's possible to write good fault tolerant software (i.e. w/ exception handlers) but that's one of the first things to suffer under the deadline as it's very expensive. I've never done a study but I would guess that at least 70% of good code writing is designing for all the non-standard cases!

    I think software is a bit closer to a chain than it is to a bridge.
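
    As a small illustration of the exception-handler point (a sketch only; the config-file scenario and names are invented here), this is what the "safety factor" version of a read looks like: the non-standard cases degrade to a default instead of taking the whole chain down.

    import java.io.BufferedReader;
    import java.io.FileReader;
    import java.io.IOException;

    public class TimeoutConfig {
        static final int DEFAULT_TIMEOUT_MS = 5000;

        // Read a timeout from a one-line config file, tolerating the non-standard
        // cases (missing file, empty file, junk contents) instead of propagating them.
        public static int readTimeout(String path) {
            try (BufferedReader in = new BufferedReader(new FileReader(path))) {
                String line = in.readLine();
                int value = Integer.parseInt(line.trim());
                return value > 0 ? value : DEFAULT_TIMEOUT_MS;   // reject nonsense values
            } catch (IOException | RuntimeException e) {
                return DEFAULT_TIMEOUT_MS;   // degrade gracefully rather than fail the chain
            }
        }
    }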

  • Some ideas (Score:1, Interesting)

    by Anonymous Coward on Friday February 13, 2004 @04:39PM (#8273386)
    I've got some ideas about making software less buggy and faster+easier to program. If somebody reads this, maybe he could comment?

    - A lot of time is spent trying to get the syntax right. By moving software development away from pure-text writing and towards a more humane form (like dragging widgets together), the head could be freed up so that we don't have to think about adding ; at every line end and focus more on the problem.
    - The form of source code (text) today doesn't reflect the complexity of the problem behind it. Surely there are tools that overcome some of these problems, but since they still work on the basis of a text document, they are error prone and won't free our heads. A good example for this is CVS: it compares *lines*, but it should compare the syntax - so it sees differences where there are none.

    I feel that programming is held back in the era of the "command line" while all other fields have long moved to some sort of GUI.

    I've got a list of dozens of ways to improve programming and making it humane without changing the syntax/language... does anybody know if there is research / finished products in this field? What have others already tried out?

    Thank you for any suggestions!
  • What's intuitive?? (Score:2, Interesting)

    by EmbeddedJanitor ( 597831 ) on Friday February 13, 2004 @04:41PM (#8273409)
    The only things that are truly intuitive to humans are those things directly linked with survival: sex, food, sex, shelter, sex, defence, sex.

    Now I really struggle to see how one can dress up programming to be like any of these, though I do admit getting a horn when I see a really good linked list.

    Software is inherently chaotic and complex. I think any attempts to say otherwise are just a front for pushing some new case tool or whatever. What sets geeks apart is that they are wired in a non-intuitive way: hence the ability to program and the problem coping with sex.

  • by spikeham ( 324079 ) on Friday February 13, 2004 @04:43PM (#8273424)
    I have been a software developer for more than 15 years. Currently employed as a Senior Software Engineer.

    I've only come across one way to write code that is close to bug-free: Test-Driven Development (TDD.)

    In TDD, you never write a line of code unless there is a unit test in place to check the results.

    I have seen very complex systems get built from scratch with virtually no bugs when TDD is followed.

    There are lots of online resources about TDD. It is one of the foundations of XP (Extreme Programming.)
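
    For flavour, here is roughly what the TDD loop looks like in JUnit-style Java (the PriceCalculator example and its names are invented for this illustration): the test is written first and fails until the minimal implementation below it exists.

    import junit.framework.TestCase;

    // Written first: this test defines what totalWithTax must do before the code exists.
    public class PriceCalculatorTest extends TestCase {
        public void testTaxIsApplied() {
            PriceCalculator calc = new PriceCalculator(0.07);        // 7% tax
            assertEquals(10.70, calc.totalWithTax(10.00), 0.001);
            assertEquals(0.00,  calc.totalWithTax(0.00),  0.001);
        }

        public void testNegativePriceIsRejected() {
            PriceCalculator calc = new PriceCalculator(0.07);
            try {
                calc.totalWithTax(-1.00);
                fail("expected IllegalArgumentException");
            } catch (IllegalArgumentException expected) {
                // exactly the behaviour the test asked for
            }
        }
    }

    // The minimal code written afterwards to make the tests pass.
    class PriceCalculator {
        private final double taxRate;

        PriceCalculator(double taxRate) { this.taxRate = taxRate; }

        double totalWithTax(double price) {
            if (price < 0) throw new IllegalArgumentException("negative price: " + price);
            return price * (1.0 + taxRate);
        }
    }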
  • Mod parent up! (Score:2, Interesting)

    by PatSmarty ( 135304 ) on Friday February 13, 2004 @04:43PM (#8273439)
    Great point! I wondered about this, too. Could anybody with more knowledge comment?

    • I've got some ideas about making software less buggy and faster+easier to program. If somebody reads this, maybe he could comment?

      - A lot of time is spent trying to get the syntax right. By moving software development away from pure-text writing and towards a more humane form (like dragging widgets together), the head could be freed up so that we don't have to think about adding ; at every line end and focus more on the problem.
      - The form of source code (text) today doesn't reflect the complexity of the problem behind it. Surely there are tools that overcome some of these problems, but since they still work on the basis of a text document, they are error prone and won't free our heads. A good example for this is CVS: it compares *lines*, but it should compare the syntax - so it sees differences where there are none.

      I feel that programming is held back in the era of the "command line" while all other fields have long moved to some sort of GUI.

      I've got a list of dozens of ways to improve programming and making it humane without changing the syntax/language... does anybody know if there is research / finished products in this field? What have others already tried out?

      Thank you for any suggestions!

  • by ratboy666 ( 104074 ) <fred_weigel@[ ]mail.com ['hot' in gap]> on Friday February 13, 2004 @05:06PM (#8273784) Journal
    "ability of developers to use proper and well implemented OO methodology..class designs...a priori with insufficient in depth..."

    The reason is that most developers INSIST that class structure should model the application domain. Even if it doesn't make the slightest lick of sense.

    Reason? Because of how OO was taught. Concrete to abstract, keeping in line with a problem domain.
    (coloured rectangle->rectangle->shape). This certainly makes teaching easier, but doesn't make for sensible class hierarchies.

    OO is separate from a class hierarchy. The only reason we HAVE a hierarchy is to allow code to be reused. Therefore, the proper hierarchy is not a taxonomy, it is the one that leverages the code maximally.

    As an example - Where to put a Date class?

    Smalltalk classifies a Date as a Magnitude -- things that can be compared. So comparisons can be leveraged (e.g. =). If it were NOT there, all comparisons would need re-implementation.

    Also Character should be a Magnitude as well.
    Maybe String, but that's a bit shaky (mixins help, it's comparable, but is a collection of Character).

    Where to put a class in the hierarchy should be driven by the principle of minimizing code. *NOT* modelling the real world. If you model the "real world" you are probably in a serious "world of hurt". Also, in this case, the OO "paradigm" isn't going to save you much in the way of coding (will save you debugging, hopefully).
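
    A rough Java analogue of the Smalltalk point (SimpleDate is invented for the example): put the date under the "things that can be compared" abstraction, write only the date-specific comparison once, and sorting, min, max and friends are all reused rather than rewritten.

    import java.util.Arrays;

    final class SimpleDate implements Comparable {
        final int year, month, day;

        SimpleDate(int y, int m, int d) { year = y; month = m; day = d; }

        // The only date-specific code we have to write: how two dates compare.
        public int compareTo(Object o) {
            SimpleDate other = (SimpleDate) o;
            if (year  != other.year)  return year  - other.year;
            if (month != other.month) return month - other.month;
            return day - other.day;
        }

        public String toString() { return year + "-" + month + "-" + day; }

        public static void main(String[] args) {
            SimpleDate[] dates = {
                new SimpleDate(2004, 2, 13), new SimpleDate(1999, 12, 31), new SimpleDate(2004, 1, 1)
            };
            Arrays.sort(dates);   // sorting comes for free from the shared abstraction
            System.out.println(Arrays.asList(dates));
        }
    }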

    Avoidance of bugs...

    Stay away from stupid languages. Insist that optimization is the compiler's/computer's job. The Rosetta Stone is to ask for a factorial function, *without* specifying any details. Code it in the *most* natural way, and then test it with 10,000!

    Now, determine how much breakage has occurred (if any).
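
    For what it's worth, a sketch of the "most natural way" in Java (class and method names invented for the example): let BigInteger absorb the size of the data, so the same code handles 5! and 10,000! without breakage.

    import java.math.BigInteger;

    public class Factorial {
        public static BigInteger factorial(int n) {
            BigInteger result = BigInteger.ONE;
            for (int i = 2; i <= n; i++) {
                result = result.multiply(BigInteger.valueOf(i));
            }
            return result;
        }

        public static void main(String[] args) {
            // 10,000! has tens of thousands of digits; BigInteger just absorbs it.
            System.out.println(factorial(10000).toString().length() + " digits");
        }
    }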

    The answer to LARGE projects is to write code ONCE, and be able to reuse it in any context that needs the same processing. I don't want to have to code the factorial algorithm for small integers, large integers, and really big integers.

    I want the code to accommodate the data type that is needed. If I sort, and use "<" ordering, I want that to work across any datatype.

    If I have to re-implement, I lose on the previous work.

    Class hierarchies can help structure (look at Smalltalk), but are not often used in this way.

    Ratboy.
  • by Tailhook ( 98486 ) on Friday February 13, 2004 @05:17PM (#8273962)
    A lot of the problems could be solved with true functional languages (Haskell, OCaml, etc) but the learning curve is too high.

    A lot of problems are solved with functional languages. Functional advocates claim to have the answer to software correctness and they decry the present state of imperative logic programming. What I think they fail to realize is that functional programming is ubiquitous, solving problems on a scale that contemporary imperative tools will never approach.

    Microsoft Excel is, in essence, a functional programming language. It is utilized by non-"programmers" planet wide every day to quickly, accurately and cheaply "solve" millions of problems. It has, effectively, no learning curve relative to typical coding. I have found it to be an invaluable software development tool. I take it a bit further than the typical spreadsheet task by using it to model software systems.

    It is especially helpful with business logic problems. I recently implemented a relatively complex web-based product configurator. I know that if I can model the complete problem in a stateless manner using a spreadsheet, writing bug-free, efficient client and server side imperative code becomes a simple matter of translation. For any given state of a collection of inputs there is exactly one atomic result. In this case the result is a (possibly lengthy) structured document computed dynamically from a collection of input forms, both on the client (because page refreshes suck) and on the server (because validation must not depend on an honest client.) Both independent implementations (in different languages) are "obviously" correct in the sense that they are derived from a clear, functional model, built in a spreadsheet.

    You may substitute any contemporary spreadsheet product in place of Excel; I have no love of Excel specifically. It's just what I've happened to have handy in all cases. The fact is that modeling most software problems requires very little of what any reasonably competent spreadsheet can accommodate. Feel free to lecture me on precisely why it is blasphemous to suggest that a spreadsheet qualifies for the designation "functional programming." I know the difference because I've studied LISP and used Scheme. The subset of true functional programming that provides the most value is clearly represented by the common spreadsheet.
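
    A minimal sketch of the "spreadsheet discipline" carried into imperative code (all names and the pricing rules are invented for the example): one pure, stateless function from inputs to result, so independent client and server translations of the same model cannot drift apart through hidden state.

    public final class ShippingQuote {

        // Every input appears as a parameter; the result depends on nothing else.
        public static double quote(double weightKg, int distanceKm, boolean express) {
            double base     = 4.00 + 0.80 * weightKg;
            double perKm    = 0.02 * distanceKm;
            double subtotal = base + perKm;
            return express ? subtotal * 1.5 : subtotal;
        }

        public static void main(String[] args) {
            // Like a spreadsheet cell: one row of inputs, exactly one result, every time.
            System.out.println(quote(2.5, 120, false));
        }
    }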
  • Re:Mod parent up! (Score:3, Interesting)

    by Greyfox ( 87712 ) on Friday February 13, 2004 @05:23PM (#8274051) Homepage Journal
    Eventually the syntax becomes instinctive(ish).

    IDEs hide what's really going on in the development process. Once you learn your environment, the command line is still the fastest and the easiest to isolate build problems in.

    The problem, as I see it, is that a lot of programmers don't want to have to understand how to program, or the problem that they're coding to, because that's hard. Understand what's going on and you write higher quality code. That's all there is to it.

    Now if we were talking about Joe Sixpack banging together an application to track his baseball cards and not some professional IT developer, I could see a less complex environment being useful to him. I was banging out dBase III procedures for my dad's office back when I was a boy, and it didn't have to be the most complex environment in the world for them to get their job done. They still had a solid idea of what they wanted the machine to do, though. They just gave me the requirements, I implemented them, presented them, and let them come back with changes until we had a report they liked.

  • by ahdeoz ( 714773 ) on Friday February 13, 2004 @05:26PM (#8274081)
    Oh my! In going beyond Object Oriented Programming, she has managed to re-invent constructs that OOP was meant to replace. Subroutines, and even lowly "if" statements, have a place in programming. And what's more, they need to be constructs even more heavyweight than objects. Sun really is trying to sell hardware! I can't wait for the next advanced programming abstraction of registers and gates. I'm sure there will be a market for distributed, clustered logic servers, and several competing (standard) XML schemas for ANDs, ORs, XORs, NOTs, and MAYBENOTs. I can't wait for version 1.5, where goto jumps will be implemented via Inversion of Control in an Aspect Oriented Framework! Seriously, what bugs
  • by Adartse.Liminality ( 742343 ) on Friday February 13, 2004 @05:39PM (#8274247) Journal
    Perl? brrr!! thanks but no thanks, POOP==Ruby && ruby==perl.upgrade.

    Perl has its place. I've tried to learn Perl; no big deal until you start to see "perlisms" AKA hacks, and then it becomes unreadable. I prefer Perl's prettier and younger sis, Ruby: OOP done if not right then at least a lot better. Mmmh, how to avoid bugs? Typing less helps ;-) a very readable syntax does too, and intuitive language design and POLS avoid bugs even more.

    And POOP stands for Pure Object Oriented Programming.
  • by Anonymous Coward on Friday February 13, 2004 @05:56PM (#8274483)
    is to _test_ your code.
  • by dasmegabyte ( 267018 ) <das@OHNOWHATSTHISdasmegabyte.org> on Friday February 13, 2004 @06:00PM (#8274539) Homepage Journal
    Eckel's Design Principle #1: Don't be astonishing.

    I find True, False or Maybe to be fairly astonishing. Does maybe mean quit? Does it mean give me more options? Does it mean the same as True or False (it would if my wife coded it, oh!)?

    All the third option gives us is more questions. Which defeats Eckel's sixth principle: Simplicity before generality.
  • by ajs ( 35943 ) <{ajs} {at} {ajs.com}> on Friday February 13, 2004 @06:18PM (#8274754) Homepage Journal
    Perl's OO model isn't.

    That's not to say it's bad, but it simply isn't. Perl provides you with all of the tools you need to build a GREAT OO system, but that's not an OO system.

    This is one of those things that Larry does that's just unfathomable to the rest of the world. He didn't really grok OO back 10 years ago when P5 was in the works. He understood it well enough to program in the large in C++, but he didn't quite have his head fully around the implications, so when he added OO to Perl 5, he did so in such a way that all of the various ways of approaching code from an OO standpoint could be accommodated.

    This means that writing OO code in Perl kind of sucks, but if you want to design an OO model for a programming language, no tool (other than a parser generator) will be more powerful.

    Come Perl 6, Larry finally feels that he gets it enough to tell all of the people who are going to have to use his language how to do it. He doesn't take that responsibility lightly, and the fact that SO MANY other language designers do should worry you.

    That said, if you want to write medium-sized programs that are heavily OO-dependent, I suggest Ruby or Python or even Java. If you are writing small tools, OO vs non-OO won't matter that much.

    If you are building huge systems, then you don't care because the amount of work required to lay out how you will use OO in Perl is insignificant next to the architecture that you have to lay out for the rest of your app. It's just noise in your timeline, and you can fully re-use that policy in every other project that your company tackles.

    What's amazing is how Perl lets all of these OO models interact. I'm always stunned by this, and frankly it's a tribute to the language and its designer.

    As for your comment about typing less... I don't think that languages with the level of abstraction of Ruby or Perl really need to have line-count contests. Dynamic typing, run-time data structure definition and garbage collection make programming SO much easier that Perl and Ruby are in the same order of magnitude, and I don't see a reason to quibble over the details.
  • Re:Test? (Score:4, Interesting)

    by JohnGrahamCumming ( 684871 ) * <slashdot@jgc.oERDOSrg minus math_god> on Friday February 13, 2004 @06:36PM (#8274929) Homepage Journal
    > This is the old way of thinking. When you see the answer, I'm sure you'll jump on board.

    I did a PhD at Oxford in the Programming Research Group and studied Z, CSP and all that stuff. My thesis even includes a program written in Occam proven via an algebra to meet a security specification.

    Believe me, I'm aware of what the world could be like, but it is not practical to write real software this way yet. Hence we still need to test, and not enough people write tests today. Unit and system testing are best practices for the industry today. Sure, there's a better theoretical way to do things, but I need to code in 2004, not 2054.

    John.
  • Old ideas.... (Score:3, Interesting)

    by FoFi ( 143144 ) on Friday February 13, 2004 @06:42PM (#8275006)

    I couldn't stop thinking of existing theories and/or implementations of her ideas...

    Modeling processes outside of the OO paradigm (the opposite of what design patterns started to sacralize, for example) is precisely the subject of so-called business rules. But BR people are close to the relational model of data, which is too quickly equated with SQL DBMSs(*), so OO-oriented people don't buy it (see the almighty impedance mismatch).

    Data structures other than trees and collections are already generically implementable in any modern OO language. See Eiffel, for example, which has been able to do that perfectly well for 15 years (parametric classes, with full type safety). Maybe Java generics will help to build highly reusable data structures... I doubt it; anchored types are missing (i.e. the possibility to declare the type of a variable as equal to another type, which is nearly mandatory when dealing with inheritance of generics).

    Tom.

    (*) I warmly recommend the writings of Chris Date and Fabian Pascal to really see how the relational model of data is different from SQL databases... see DBDebunk [dbdebunk.com] for references.

  • by TigerNut ( 718742 ) on Friday February 13, 2004 @06:49PM (#8275087) Homepage Journal
    Go read some of Henry Petroski's books (their titles escape me right now - there is one that opens with the failure of the hotel skybridge in Kansas City), and you'll see that civil engineering occasionally still has big gaps between the designer's intent and what actually gets screwed together. On the software front, you are correct. Software eventually rots because it is usually "maintained" by continually applying patches as opposed to periodically taking a whole chunk and rewriting it to fit the new extended requirements.

    Even when the latter happens, in a lot of cases the overall soundness of the implementation is compromised by the perceived need to maintain backwards compatibility (e.g. why does the Borland Builder compiler use the 80386 instruction set by default, when almost any executable you are likely to build with it will be larger than the 386 can support?)

  • Bugless code!?!? (Score:2, Interesting)

    by marco0009 ( 716718 ) <marco0009@gm[ ].com ['ail' in gap]> on Friday February 13, 2004 @07:02PM (#8275189)
    There is truly no such thing as bugless software. No matter how well or how stupid-proof a program is coded, a user will find some way to do something to it that was not intended. If they keep this up, eventually it will cause something to go wrong in your program.
    Even if, by some miracle, all programs were immune to user errors, there are infinitely many hardware/OS/software combinations, so some conflict with the code will eventually surface. IMO, this ideal of bugless code is just that: an ideal.
  • by ninejaguar ( 517729 ) on Friday February 13, 2004 @08:09PM (#8275711)
    From the post:
    "software should more closely simulate the real world"

    From the article: "It's not the prevention of bugs but the recovery -- the ability to gracefully exterminate them -- that counts."

    While the need to gracefully recover your design from bugs (bugs come from design, or the lack of it, not code) is laudable, the proper technique is to design without bugs in the first place. Assuming that you're actually meeting the business requirements or functional specifications, there is a straightforward method to flattening bugs before they become fruitfully overripe and multiply.

    Once you have obtained the proper [amazon.com] requirements [amazon.com] (your goals), and after you've properly atomized it to its smallest component parts, you need to model those parts. Once you've modeled those parts, you need to test the model. This works in single process design, but it really shines in concurrency where anyone can truly screw up.

    Get a good [amazon.com] book [amazon.com] on design [amazon.com]. Then get a good [amazon.com] book [ic.ac.uk] on modelling, mechanically analyzing, and testing those designed processes before committing to code.

    = 9J =

  • Excellent! (Score:1, Interesting)

    by Anonymous Coward on Friday February 13, 2004 @08:42PM (#8275936)
    Excellent quotes! Your point is valid although not universal.

    I'm a CS professor and I completely agree. I constantly challenge my tenured senior colleagues (I'm untenured) about their love affairs with UML, Java, and other tools. I teach undergraduate data structures and algorithms and never write one line of source code on the board -- it is all pseudocode. I don't ask students to use a particular language for their programming assignments, either. As a result, I don't lecture from a book either; I lecture about topics and employ several books.

    The problem as I see it is that it is human nature to get in a routine and not stretch your boundaries. Many profs take the easy way by letting the tools hide their complacency. I don't think they are stupid -- they knew things at one time. But they get lazy and "eat their brain". I hope I never eat mine because it is a great disservice to students and taxpayers.
  • by Dominic_Mazzoni ( 125164 ) * on Friday February 13, 2004 @09:34PM (#8276344) Homepage
    The article doesn't seem to address all of the different types of bugs, nor how to best address them. Anyone care to add to or refine this list?

    1. Algorithmic bugs - you have a function with well-defined input and output, and it does the wrong thing (may include giving the wrong answer, looping forever, leaking memory, or taking too long to return). Can be avoided with a combination of code review, unit tests, and correctness proofs when possible.
    2. Interface bugs - this includes validating input, both from the user and over the network or other ways in which your program gets input data. These bugs include buffer overruns, GUI bugs caused by an unanticipated sequence of clicks, etc. These bugs are mostly found by testing, but sometimes also with automatic code checkers or memory debuggers that highlight potential problems.
    3. Bugs in the operating system or in sublibraries - any large project depends on large volumes of operating system code and usually lots of other libraries. These systems almost certainly have bugs or at the very least undocumented or inconsistent behavior. The only way to avoid this is to validate all OS responses and do lots of testing.
    4. Cross-platform bugs - a program could work perfectly on one system, but not on another. Best way to address this is to abstract all of the parts of your program that are specific to the environment, but mostly this just requires lots of testing and porting.
    5. Complexity bugs - bugs that start to appear when a program or part of a program gets too complicated, such that changing any one piece causes so many unintended side-effects that it becomes impossible to keep track of them. This is one of the few areas where good object-oriented design will probably help.
    6. Poor specifications - these are not even necessarily bugs, just cases where a program doesn't behave as expected because the specifications were wrong or ambiguous. The way to avoid this is to make sure that the specifications are always clear. Resolve any potential ambiguities in the specs before finishing the code.

    My overall feeling is that there are so many different types of bugs in a real-world programming project that any one technique (like object-oriented design) only helps address one type of bug.
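
    A tiny illustration of category 2 in the list above (interface bugs), sketched in Java with invented names: the raw input is validated at the boundary instead of being trusted by the rest of the program.

    public class AgeField {

        // Returns a checked age, or -1 if the raw input can't be trusted.
        public static int parseAge(String raw) {
            if (raw == null) return -1;
            String trimmed = raw.trim();
            if (trimmed.length() == 0 || trimmed.length() > 3) return -1;
            try {
                int age = Integer.parseInt(trimmed);
                return (age >= 0 && age <= 150) ? age : -1;
            } catch (NumberFormatException e) {
                return -1;   // "12abc", "0x1F", etc. never reach the rest of the program
            }
        }
    }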
  • by dirt_puppy ( 740185 ) on Friday February 13, 2004 @09:38PM (#8276379)
    I think both parent posters are just right: functional programming is a great way to attack problems, and Excel spreadsheets are too.

    The big advantage of FP is its clearness and rigidity. To an experienced functional programmer, it's exactly clear what a piece of Haskell code means, since the code is half general functions that are easy to understand (map, zip, fold et al.) and half problem-specific functions that are about as easy. The solution is built from simple bricks everywhere, unlike in imperative programming.

    Another thing is, we're talking about functions, aren't we? And shouldn't the first-class citizens of our language be functions, then?

    Besides, functional programs are very much easier to prove, and that's something that will be very important some time in the future (or so I pray... or everyone in the IT business will go nuts over the giant pile of bugs that accumulates). For a little introduction to the theory behind this, check this document about the Curry-Howard isomorphism [www.diku.dk] (at the very bottom of the page). Besides, this is a fundamental link between programming and philosophy (intuitionistic reasoning).

    "Excel"-type spreadsheets are very useful too, maybe not for everyday programming but for short programs intended for quick use by non-programming coworkers. In fact Excel is the one program that tells me that not everyone at Microsoft can be a moron.

  • by Adartse.Liminality ( 742343 ) on Friday February 13, 2004 @09:42PM (#8276400) Journal
    Yes, I did RTA, but is Java's OO the absolute best OO implementation? OOP is not the solution to everything. Do I really want to create a full class and method for a simple task? NO. "The OO paradigm could have limitations": it's not "could" but "HAS". OOP is a step ahead of functional programming IN certain situations; speed is not OO's forte, but reuse of code, refactoring, encapsulation, etc. is.
    I don't really need someone telling me OO has limitations; I already know, and you should too. A programmer should be limited by his or her ability to use the tools, not by knowing only a single language or a single programming paradigm.
    What I want to point out is that:
    • There should be less filler in the code just to keep interpreters/compilers happy; how does Java, or any language, rate on this point?
    • High-level languages: the more you have to type, the more errors/bugs you can make, but HLLs and VHLLs have their place, as low-level languages also do.
    • A simple, logical grammar for the language: with time even perlisms become easier to read ;-) but is it readable in the very first minutes? Is it already familiar? Do you type some code and it does what you expected? Intuitive is the word, but it is relative to one's experience.
    • AFAIK there's no language that does it all and does it very well while also being fast, easy, multiplatform and maintainable. If there is one, please inform me.
    Besides, I was answering the proposition of using Perl instead of Java ;-). Looking ahead? Fine with me, but holding up Java as an exemplary OO language is not, nor as the easier one to use.
    Her proposition of a notch beyond OO looks quite nebulous and ungraspable; I think she should research more languages.
  • by thinkerdreamer ( 682437 ) on Friday February 13, 2004 @11:31PM (#8277008)
    Programming is too littered with complexities for there not to be bugs. You have to define something just right and not misspell anything or put something just "so", or there is a bug. I know there is a reason for all these things. But in the future I see Computer Aided Programming (CAP). The processing power of the PC will be used so that the computer will know what you are programming while you are programming it. The artificial intelligence will make suggestions and even write some of the code for you. The AI will keep you from making disastrous mistakes by analyzing every bit of recent code in comparison with all past code, looking for conflicts. It will even suggest that you put something just "so", or not do it at all. The AI in essence will be your companion, debugging as you go. This will produce huge programming projects with no bugs whatsoever.
  • Re:Objects (Score:2, Interesting)

    by jonadab ( 583620 ) on Saturday February 14, 2004 @01:19AM (#8277505) Homepage Journal
    > I would love to use a C/C++/Java-like language that utilizes pure objects

    If it had a good object system, wouldn't that make it totally unlike C, C++,
    or Java? At least, semantically it would. I suppose the syntax could be
    C-like. Inform for example has C-like syntax, but the object system is so
    much more advanced than the one used by C++ that there is no comparison.
    (Inform, however, is not a general-purpose language. It doesn't have the
    I/O capabilities to be general-purpose. This is mostly due to the virtual
    machine it's designed to compile to, which for portability reasons doesn't
    support such things as a filesystem per se. But it's a great language for
    its intended problem space.)
  • by fractaltiger ( 110681 ) on Saturday February 14, 2004 @03:37AM (#8278128) Journal
    I like the idea of intuitive programming, but suspect that computers are grounded in logic and that logic is not an intuitive concept.

    Slightly OT:
    This is indeed hitting the nail on the head. My father and I have lots of disagreements on the issue of "common sense." I am very smart and he is too, but he tends to fall behind when it comes to explaining ... or rather, just ACCEPTING unknowns and their repercussions in logic. Say, if I withhold a fact in an argument but claim to be "right," he will say there is just NO way of winning the argument -- and then I produce the "new evidence." He sometimes recoils, thinking his logic models cannot be blown away by my (normal logic + hidden evidence).

    Whenever he says that I should know something because it's intuitive, I bring up example after example of why he's mistaken to expect all logical conclusions to be == to his. We saw a lady in a TV contest who had to see words hidden behind her husband and make him guess the word through signs and gestures she made for him. She stumbled upon "otorrino" (this TV show is in Spanish), which is short for otorhinolaryngologist, and said that she didn't know the word. Well, I won't get into more complex translation details. Suffice to say that she didn't know what it was and had to skip to the next thing. My father was outraged:

    1. She speaks 7 languages.
    2. The word is pretty simple Latin and her 7 languages MUST therefore encompass Latin.
    3. Young children, and thus anyone raising them, should be familiar with this doctor because he deals with "ubiquitous" eye problems, ear infections and nose issues.
    4. Besides being familiar with the doctor, she must know the exact name of the doctorate, which is literally "otorhinolaryngologist."
    5. Her job in Europe is with a law firm TRANSLATING legal documents.
    6. Any Hispanic woman speaking 7 languages must have gone through enough 'education' to have seen the word, if she cannot derive it from Latin.
    7. Only my father, by (5), can decide whether you should or shouldn't KNOW a word, concept or whatever, just based on your "expected" life experience.


    I asked my dad why he rationalizes which concepts she SHOULD know rather than why she just DIDN'T know what he's 100% sure she already grasps. Moreover, he's always too shocked to see through his own failure at accepting that common sense doesn't exist, and instead tries to verbally fix something that has proven false before his own eyes. But some people think they already know what is and isn't IMPOSSIBLE.

    I can list three other languages besides English and Spanish that I can understand, so I speak 5 languages, right? No. This is an example of misinformation and generalization: if she says she speaks 7, it doesn't mean her job has made her command them all -- even if you 'knew' as many languages as the Pope and had to work in New York City's linguistic melting pot, you will never exert more than 3 language roles in your official capacity, and your "other 4 languages" will be pet languages, especially if you're only 35. But sometimes dad waves off my facts as crazy talk from today's young and naive offspring. Too bad. Sometimes he's surprised at how lucky I am when my "wrong logic" gets him so many nice surprises, when his ways should be the only solution to my "poor common sense." You can tell I deal with control freak parents, eh?

  • by master_p ( 608214 ) on Saturday February 14, 2004 @07:05AM (#8278723)
    This slashdot discussion about software engineering is one of the best I've ever read. It should be almost mandatory for students of computing to study. Lots of posters have raised very significant points about why it seems impossible to write bugless software.

    Let me add my 2 cents: the problem is that computer programs represent the 'how' instead of the 'what'. In other words, a program is a series of commands that describes how things are done, instead of describing what is to be done. A direct consequence of this is that they are allowed to manipulate state in a manner that makes complexity skyrocket.

    What did object orientation really do for humans? It forced them to reduce management of state... it minimized the domain that a certain set of operations can address. Before OO, there was structured programming: people were not disciplined (or good) enough to define domains that are 100% independent of each other... there was a high degree of interdependency between the various parts of a program. As the application got larger, the complexity reached a point where it was unmanageable.

    All todays tools are about reducing state management. For example, garbage collection is there to reduce memory management: memory is at a specific state at each given point in time, and in manual memory management systems, the handling of this state is left to the programmer.

    But there is no real progress in the computing field! OO, garbage collection, aspect-oriented programming and all other conventional programming techniques are about reducing management of state; in other words, they help answer the 'how' question, but not the 'what' question.

    Here is an example of 'how' vs 'what': a form that takes input and stores it in a database. The 'what' is that 'we need to input the person's first name, last name, e-mail address, user name and password'. The 'how' is that 'we need a textbox there, a label there, a combobox there, a database connection, etc'. If we simply answered the 'what' instead of the 'how', there would be no bugs! Instead, by directly manipulating the state, we introduce state dependencies that are the cause of bugs.
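
    One way to read the poster's 'what' vs 'how' distinction in Java (a sketch only; all names, fields and the SQL-generation idea are invented here): the form is described once as data, and the repetitive 'how' code is derived from that description instead of being hand-wired field by field.

    public class PersonForm {

        // The 'what': the fields the form needs, as plain data.
        static final String[][] FIELDS = {
            // name          label            required
            { "firstName",  "First name",     "yes" },
            { "lastName",   "Last name",      "yes" },
            { "email",      "E-mail address", "yes" },
            { "userName",   "User name",      "yes" },
            { "password",   "Password",       "yes" },
        };

        // One generic piece of 'how' code, driven entirely by the description above.
        public static String buildInsertSql(String table) {
            StringBuffer cols  = new StringBuffer();
            StringBuffer marks = new StringBuffer();
            for (int i = 0; i < FIELDS.length; i++) {
                if (i > 0) { cols.append(", "); marks.append(", "); }
                cols.append(FIELDS[i][0]);
                marks.append("?");
            }
            return "INSERT INTO " + table + " (" + cols + ") VALUES (" + marks + ")";
        }

        public static void main(String[] args) {
            System.out.println(buildInsertSql("person"));
        }
    }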

    Functional programming languages claim to have solved this problem, but they are not widely used, because their syntax is weird for the average programmer (their designers want to keep it as close to mathematics as ever), they are interpreted, etc. The average person who wants to program does not have a clue, and society moves away from mathematics day by day, so functional languages are not used.

    The posted article of course sidesteps all these issues and keeps mumbling about 'intuitive programming' without actually giving any hint towards a solution.
  • by Anonymous Coward on Saturday February 14, 2004 @10:20AM (#8279253)
    I have spent most of this last week debugging a large Excel "work-book". Without going into excessive detail, this one uses look-up tables to match specific Code requirements to equipment ratings for a large family of electrical products. This job requires many "worksheets" for the models involved (each to be readable in one screen and printable on one page). Because of the opacity of the spreadsheet "language"--you can only see the formula "source code" for one cell at a time--it's really hard to keep everything synchronized as new models are added and calculation formulas are copied into the new section. This was a design and debugging nightmare. God help anybody else who has to decipher this "intuitive" piece of software.
  • by dmhayden ( 752539 ) on Saturday February 14, 2004 @12:30PM (#8279944)
    To see how software development could be made less bug-prone, all you have to do is look at a more mature industry. Take home-building for example.

    First of all, homes are built from standard components. Lots of standard components. In software development, if you want a linked list, you have, at most, a handful of choices. When building a home, if you want a nail, you have dozens or hundreds of choices. There are roofing nails, framing nails, finish nails, decking nails, galvanized nails, nails with ribs, nails with spiral grooves, all available in a variety of sizes.

    The point is, if you need a nail, you can walk into home depot and get *exactly* the nail that you want. In software, you have to make do with what's available.

    You could argue that one can create whatever custom list behavior is needed from the building blocks available. That's true. You can also make whatever nail you want from steel wire. But we don't do that, because it's error prone and you might not get exactly the right kind of nail for the job. It's better to get one off the shelf, because you can be pretty sure that someone has thought of all the issues related to the problem it's designed to solve and taken care of them.

    Next, look at who builds a home and compare it to who builds software. If we built houses the way we build software, you'd bring 150 people to the job site and say "you there, do the framing; you over there, you're on plumbing; you two in the back have electrical work." After all, builders are builders, right? Of course not. There are specialized skills involved in each step and you need the expertise to do them. You wouldn't want your plumber doing the roof work any more than you'd want an excavator doing the plumbing. To build a complex system, you need lots of people with very specific skills.

    Standards and interfaces. So how is it that you can walk into Sears and buy a blender that interfaces so perfectly with the electricity in your house? Because there are standards, of course. The standards specify the voltage and frequency on the line, the exact shape of the plug that you use, how long the cord should be, what sort of insulation it has and probably a lot of other stuff. Software systems need the same sort of standardized interfaces for their basic properties. We've started this with things like constructors, destructors and I/O operators.
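
    A loose software analogue of the blender-and-wall-socket point, sketched in Java with invented names: implement the standard plug (java.util.Iterator here) and anything built against that standard consumes your component unchanged.

    import java.util.Iterator;
    import java.util.NoSuchElementException;

    class Countdown implements Iterator {
        private int current;

        Countdown(int start) { current = start; }

        public boolean hasNext() { return current > 0; }

        public Object next() {
            if (!hasNext()) throw new NoSuchElementException();
            return new Integer(current--);
        }

        public void remove() { throw new UnsupportedOperationException(); }

        public static void main(String[] args) {
            for (Iterator it = new Countdown(3); it.hasNext(); ) {
                System.out.println(it.next());   // 3, 2, 1 -- consumed through the standard interface
            }
        }
    }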

    Basically, I see software development as a very young science. As the standards evolve, the quality and reliability will grow also.

  • by voodoo1man ( 594237 ) on Saturday February 14, 2004 @04:19PM (#8281364)
    It was called the Programmer's Apprentice, and sat on top of Interlisp. I think that PA was built on top of a lot of features that already came with Interlisp, like spelling correction (DWIM - Do What I Mean), versioning tools, undoable assignment, and a program cross-referencing/refactoring tool called Masterscope. What it added was an AI-enabled layer for error detection and source generation. This is kind of neat, but although I've never tried it I don't think it's all that useful a tool (I can't help but think that anything that deals out programming advice at such a high level will get in the way more often than it helps). Do a Google or CiteSeer search on it - there are about a dozen papers on the project.

    I think in the end the lower-level techniques of such a system prove more useful to a programmer. At the very least they have had a higher rate of successful adoption. Most modern programming languages use Lisp's ideas of uniform object referencing and automatic memory management, and stuff like quasiquoting pre-processors has started popping up for other languages in the past couple of years. Concepts a little higher-level, like AOP and refactoring, go a very long way toward the intelligent programmer's apprentice. Of course, if nothing else all this stuff makes building programming apprentices much easier - just try to imagine building one to deal with C's pointer referencing!
