Calling Software Reliability Into Question 461

phillymjs writes "CNN is running a story on software reliability, and how the lack of it may cost more and more lives as technology creeps further into everyday products. It appears a debate is finally starting amongst everyday (read: non-geek) people about vendor liability for buggy software. Some opponents of the liability push are unsurprising: Says the story, 'Microsoft contends that setting [reliability] standards could stifle innovation, and the cost of litigation and damages could mean more expensive software.' The article also says, however, that consumers' favoritism of flashy products over reliable ones is partly to blame for the current state of software."
  • Microsoft (Score:3, Informative)

    by Pres. Ronald Reagan ( 659566 ) on Sunday April 27, 2003 @03:46PM (#5820681)
    I like what Microsoft has been doing with security these days, quite frankly. The new security features in Windows Server 2003 look innovative and very modern, and quite easy to use.

    Linux may be secure when configured correctly, but Windows Server 2003 looks to be the most secure OS out of the box at the moment.
  • But how... (Score:4, Interesting)

    by C.Maggard ( 635855 ) on Sunday April 27, 2003 @03:46PM (#5820684)
    ...could reliability standards stifle innovation? How hard is it to design something that works well and is extremely robust, yet, be creative and innovative in its design?
    • Re:But how... (Score:4, Insightful)

      by Tyler Eaves ( 344284 ) on Sunday April 27, 2003 @03:52PM (#5820713)
      Well, based on all the software I've ever seen, pretty damn hard.
      • Why should... (Score:5, Insightful)

        by Uber Banker ( 655221 ) on Sunday April 27, 2003 @04:33PM (#5820916)
        Why should liability be software- or hardware-based?

        If I design a system to move some gears via an operator pressing big electronic buttons, as a mechanical engineer, why should an electronic engineer who designs a program to operate the gears be exempt?

        We are both designing a system to do a job. As an electronic engineer, I build my system on some OS, so either I or the OS manufacturer (which, I add, licenses an OS; if it is used against the license terms, it is my liability) carries the liability.

        Don't be lazy allocating responsibility.

        • why should an electronic engineer who designs a program to operate the gears be exempt?

          • Because none of his development tools are sufficiently accountable to trust.
          • Because he is paid less than a plumber - he can't afford to be bonded on the wages you're willing to pay.
          • Because nobody is set up to bond programmers.
          • Because nobody wants to pay for quality.
          • Because guaranteeing the correct operation of a complex piece of machinery in an uncontrolled environment is idiocy. At least a manual gearbox leaves
          • Re:Why should... (Score:4, Interesting)

            by dubious9 ( 580994 ) on Sunday April 27, 2003 @11:33PM (#5822681) Journal
            Programmers build on top of an OS

            Not always. There are a lot of embedded applications where there is no operating system at all. Each program functions as its own operating system. There is overhead with OSes, and sometimes you don't need the functionality. When you have simple hardware with a simple interface, dropping the OS is a good option.

            Also, I'm pretty sure the software that runs air traffic control or cars has a chain of responsibility going back to the programmer.
    • Re:But how... (Score:5, Insightful)

      by ctr2sprt ( 574731 ) on Sunday April 27, 2003 @04:14PM (#5820820)
      Look at how much time NASA's programmers spent writing bug-free code. That's a pretty reasonable estimate, unfortunately. The number of bugs in any given program increases dramatically with the size of the program. (I don't know if it's geometric or what, but trust me, it goes up fast.) So while you may be able to whip out 1000 lines of code a day at the beginning, by the end you'll be writing 5 new lines a day if you're lucky. The rest of your time will be spent making sure those 5 lines work correctly with the 150,000 you've already written.
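
      A toy model (my own assumption, not the poster's actual math): if each new line has to be reconciled against everything already written, effective output decays with codebase size. The constant 750 below is chosen only so the curve hits the two figures above (1,000 lines/day on an empty codebase, ~5 lines/day at 150,000 lines); nothing here comes from the article.

          /* Toy productivity-decay model; all numbers are illustrative. */
          #include <stdio.h>

          int main(void) {
              double total = 0.0;             /* lines written so far */
              const double raw = 1000.0;      /* lines/day on an empty codebase */
              for (int day = 1; total < 150000.0; day++) {
                  double rate = raw / (1.0 + total / 750.0);  /* assumed decay */
                  total += rate;
                  if (day % 2000 == 0)
                      printf("day %5d: %6.0f lines total, %6.1f lines/day\n",
                             day, total, rate);
              }
              return 0;
          }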

      This is what Microsoft is, quite rightly, afraid of. If I can sue Microsoft for $100k because IE crashed, MS isn't going to have time to do anything except fix bugs. This isn't even entirely their own fault, since the nature of programming makes it impossible to write any large program without bugs. And unless you grandfather all of MS's products, they'd be screwed.

      But this is even worse. Unless the laws are written to special-case free software, we might see Linus sued because Linux crashed one day. RMS might end up $15m in debt because Emacs ate somebody's email. How's that for stifling innovation? If I (personally) might get sued for some bug I missed, there's no way I'm going to give away my programs.

      The guy in the article advocates only a limited sort of liability: you're liable only up to a point, or only if you don't divulge the bugs you know about. But does anyone out there really think the politicians, who are more in the pocket of trial lawyers than of anyone else, are going to make it hard to sue?

      • Re:But how... (Score:5, Insightful)

        by JimDabell ( 42870 ) on Sunday April 27, 2003 @05:29PM (#5821202) Homepage

        Where life-critical systems are put in place, there will be an insurance policy. The insurance company should require a guarantee from the software vendor. Therefore, in life-critical systems, the software vendor should always be able to be held accountable. Yes, this will be expensive, but not as expensive as all those lawsuits.

        Most software does not fall into this category. Virtually every business is heavily dependent upon software, though, so it is mission-critical. The nature of closed-source software creates a massive imbalance between vendor and customer: the vendor is the only one who can fix bugs; it's the ultimate form of vendor lock-in. Those vendors with monopolies (for example Microsoft) should therefore be regulated in some way, as they can literally hold a majority of businesses to ransom.

        Suppose a defect were found in Windows that affected only a small number of businesses. Microsoft has little economic incentive to fix the issue. The businesses are heavily dependent on the software, yet nobody can help them - the only thing they can do is work around the issue somehow, which may not be possible, or undertake an expensive migration to another platform (expensive in terms of resources; even if the software is free, the downtime is not).

        What can be done to fix this situation? Obviously, if you run a business, you take appropriate notice of this business risk, and plan accordingly. But this doesn't escape the fact that sometimes you have to resort to using software you cannot rely on. I'm a web developer; I have no choice but to test in Internet Explorer. If a bug prevents me from running it, it's a major setback.

        I believe a solution to this is to loosen the grip the vendors have on the software. Copyright is an artificial monopoly on creating copies; it shouldn't be an artificial monopoly on fixing bugs. If you are a software vendor, you should have three options:

        1. Provide the software in a form that allows the customer to fix bugs and rebuild. In other words, provide the source and everything needed to compile it.
        2. License the buildable source code to third parties for free. These third parties should pay the original vendor the retail price + 10% for any copies they sell. Any third party should be able to license the code in this way.
        3. Be unable to disclaim liability for the software.

        This, I feel, is the balance between protecting businesses from having no control over their software, and protecting the rights of the software vendor. Have I missed anything?

      • And even worse... (Score:5, Interesting)

        by raehl ( 609729 ) <raehl311.yahoo@com> on Sunday April 27, 2003 @06:23PM (#5821494) Homepage
        Lots of people don't even WANT reliable software - at least, they don't want to pay for it. I'll happily accept my software crashing once a week if it saves me the extra $300 that would turn my $100 software into $400 software. The last thing we need is to have the software industry start to look like the healthcare industry, where everyone pays 3x what they should to cover the insurance in case someone needs to sue someone else.

        If you need absolutely, positively reliable software for some purpose, then contract with a company who is willing to provide it, and pay the price it takes to get it. But Joe Blow software user shouldn't have to foot the bill because someone ELSE wants to force ALL software to be reliable under penalty of multi-million dollar lawsuits. If I sell an operating system designed to let you play MP3s and video games and browse the internet for $99, and you use it to run your mission-critical application that causes you to lose $100 million when it crashes, why should I be liable because you used my (albeit buggy) tool for a $100 million mission-critical app? It's YOUR job to ensure that you are using the correct tools for the job, NOT the guy who makes the tools!

        It's like cars - just because your transmission goes out doesn't mean you get to sue the manufacturer. You get your transmission fixed if you've purchased a car with warranty terms that allow it to be fixed, and otherwise you pay for it yourself.
  • strange (Score:3, Funny)

    by fjordboy ( 169716 ) on Sunday April 27, 2003 @03:47PM (#5820688) Homepage
    So...basically people are just finding out now that not all software is as perfect as it is intended to be?

    Great..I'm gonna have to explain this one to my parents...
  • by Ed Avis ( 5917 ) <ed@membled.com> on Sunday April 27, 2003 @03:49PM (#5820698) Homepage
    The trouble is, the more accustomed users become to bugs, the harder it is to get them reported and fixed. If my computer crashes, I just reset it and get back to work. I don't bother to investigate what caused the bug, to try to reproduce it, to contact the vendor (hah!) and try to work out the problem. Crashes occur much too frequently for that.

    OTOH, if computers were reliable enough to crash only once every few years, then users might report every crash that happens, the vendor can diagnose it, and fix the bug or family-of-bugs so that it never happens again. This is roughly what happens when a mainframe crashes, I believe - it's a big event.

    Imagine if when your microwave crashed, you could call some hotline, they would come and replace the microwave and take away the old one for analysis. Instead, even on complex software systems the standard first resort for the help line is 'reboot and see if it goes away'.
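
    As a sketch of what that hotline workflow would need on the software side: trap the fatal error, persist a minimal report for the vendor, then die. Purely illustrative - the file name, version string, and exit convention below are all invented.

        /* Hypothetical crash reporter; everything here is illustrative. */
        #include <fcntl.h>
        #include <signal.h>
        #include <unistd.h>

        static void crash_handler(int sig) {
            /* Only async-signal-safe calls belong here, hence raw write(). */
            int fd = open("crash-report.txt", O_WRONLY | O_CREAT | O_APPEND, 0644);
            if (fd >= 0) {
                static const char msg[] = "fatal signal; version=1.2.3 build=456\n";
                write(fd, msg, sizeof msg - 1);
                close(fd);
            }
            _exit(128 + sig);   /* conventional "killed by signal" exit status */
        }

        int main(void) {
            signal(SIGSEGV, crash_handler);   /* install before doing real work */
            /* ... application code; after a crash, the report is left behind
               for the user to send to the vendor ... */
            return 0;
        }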
    • And yet those IT staffs who run mainframes are quite willing to install NT servers running IIS or SQL-server and put up with Microsoft's poor security and stability. Where's the sense in that?
      • by Dark Lord Seth ( 584963 ) on Sunday April 27, 2003 @04:34PM (#5820920) Journal

        Because some IT staffs have a higher-up who went to the most recent Microsoft seminar ($25,000 for entry and attendance, $750 for the hotel, $2,250 for the flight) and got amazed by MS. After budget-cutting away the drinks dispenser and replacing it with an old coffee maker (hey, that $28,000 is more important than employee satisfaction! *sarcasm*), the higher-up has a great idea: replace all servers with Windows 2003 Enterprise Server! All the crying and complaining from the IT staff won't convince the higher-up, because a shifty $40B company that can throw a flashy seminar is far more trustworthy in his opinion than his IT staff, who worked with the company before he got there. Several budget cuts later, to accommodate the Win2k3 licensing costs, the entire department switches to Win2k3. Several more budget cuts later, mainly spent on MS support, the entire company goes to hell. The IT staff gets fired along with the rest of the company, while management gets scattered among several other companies, ready to ruin them anew.

        Welcome to the modern economic system.

    • "
      "OTOH, if computers were reliable enough to crash only once every few years, then users might report every crash that happens, the vendor can diagnose it, and fix the bug or family-of-bugs so that it never happens again. This is roughly what happens when a mainframe crashes, I believe - it's a big event."

      I think that has a lot more to do with the critical, often costly, tasks that mainframes are used for than with crashes being an infrequent occurrence.

      In my experience, infrequent crashes are much easier to ignore
      • Exactly. I have a homebrew machine running my Linux firewall/NAT and web server. In 2+ years it has had exactly one freeze not attributable to a power outage. I rebooted it. I don't care what happened; it's not worth my time to try to figure out.

        On the other hand, when I was managing physics reconstruction software, that software, when I started, would crash once every couple of days. Those were repeatable, so you could track them down and fix them. When that process was done, we could run for months on 60+ machines

    • by sasami ( 158671 ) on Sunday April 27, 2003 @06:33PM (#5821527)
      The trouble is, the more accustomed users become to bugs, the harder it is to get them reported and fixed.

      This is absolutely and shockingly true. Microsoft is almost singlehandedly responsible for the widespread cultural mentality that faulty software is okay.

      You'll find this notion all over the place but the worst part is seeing it in the upcoming generation. I work with teenagers, bright kids who are totally immersed in technology. Yet almost none of them understand why I complain about Windows all the time. If I tell them that a real OS doesn't crash and is not permitted to crash... they laugh -- or glare -- and say, you're crazy.

      --
      Dum de dum.
  • by conan_albrecht ( 446296 ) on Sunday April 27, 2003 @03:51PM (#5820706)
    What's wrong with flashy stuff for some things? I like flashy (i.e. sometimes buggy) software for my laptop. I don't mind if my beta-version browser crashes once in a while, because I get the new features.

    My servers, OTOH, are another story. I wouldn't use anything but Debian (for linux, that is) because it is incredibly stable. My two Debian boxes on woody stable run 2+ yr old software. Guess what? They don't crash. I didn't upgrade from potato right away, but waited a little while.

    Consumers are generally willing to accept buggier software because they don't run servers. So what if Word crashes once in a while? Most consumers are so conditioned to it that they don't give it a second thought.

    I realize that mail servers, electricity systems, and space probes need stable software, but most consumers don't administer these things. They use browsers, email, and cell phones that don't cause (much) physical harm when they crash.
  • by supton ( 90168 ) on Sunday April 27, 2003 @03:51PM (#5820708) Homepage
    Free and open-source software are threatened by the idea of forcing liability on software. This has been discussed on /. before [slashdot.org].

    Remember, one thing M$ does well is pay lawyers.

    • Indeed. For-pay software vendors will just pass any legislated liability costs on to the customer. I read somewhere that, on average, about 15% of the cost of goods bought in America goes to cover liability insurance. It doesn't make the products any more reliable, it just makes them more expensive, and protects the manufacturers' asses. Oh yeah, and it makes a whole bunch of lawyers filthy rich. Besides, have you read a EULA recently? They usually have a section that says you can't sue the developers for a
    • It's only a problem if a "one size fits all" approach to liability is taken. What many of us would like to see is consumers given a choice:

      - they can have access to the source and are responsible for identifying and fixing their own problems. This won't help the average user, but organizations can often provide their own support more efficiently than going through the vendor,

      - they don't have access to the source but the vendor has to deliver what they promised,

      - they have access to the source but paid
    • No, the Open Source zealots have an answer for that...

      Their software will be exempted.

      Of course that right there guarantees Open Source software will never be used in government or business climates.

      Most regulations are in place to protect the existing companies from competition by raising the barrier to entry even higher. So I'm actually surprised Microsoft is against this, although maybe it's a Brer Rabbit defense.
      • No, the Open Source zealots have an answer for that...

        Their software will be exempted.

        Since when do lawmakers take their guidance from the Open Source Community? I think such regulation will be devastating for open source or low-cost software. It will be like medicine, where malpractice insurance will raise prices. And don't expect any exemptions for free/Open Source software.

      • by dogfart ( 601976 ) on Sunday April 27, 2003 @05:58PM (#5821380) Homepage Journal
        Or companies would buy OSS from middlemen who would obtain the appropriate insurance. In order to ensure a reasonable profit, the middlemen would have to perform some sort of due diligence to minimize their risk. Even with the insurance (plus other costs plus profit) passed through, the software would be MUCH less expensive than what you would buy from Microsoft.

        Sort of like the RedHat/IBM model for making money from OSS/FOSS - sell the services, give away the software. In this case the service is managing the risk.

        What about free (as in beer) software? In this case, the best solution would be for the user of the software to assume the liability. The software user could either accept the liability for free software, or pay someone else to assume that liability (meaning buy the software from the middlemen).

        The point is we need the ability of software users and producers to rationally cost the risks of software malfunction, then assign these risks to the party that makes most sense. What we have now is a unilateral non-negotiable assignment of ALL risks to the purchaser.

        Why should software companies face multi-million lawsuits for software errors? The same reason that software users ALREADY assume multi-million dollar costs of flawed software. Allowing tort liability does not change the fact that there are real costs to bad software - it only allows a mechanism for allocating these costs (versus the current unilateral buyer-takes-all-the-risks).

    • Don't forget that you can separate the role of a certifier (of integrity, of quality etc.) from the software owner or the development group.

      In fact, this probably happens inside commercial shops anyway.

      Just as I trust SuSE to pull together a decent set of Linux apps for me, I might trust them or some other organization to certify a package by signing the code or similar technique.
    • I think that there is one large distinction here. In the case of Microsoft and other vendors, people are buying the software. If you are BUYING a product, you SHOULD expect that the vendor is subject to a degree of liability. If you are using a product that you have not "bought", such as OSS, you should NOT expect a degree of liability on the developer. Sure this may stifle the acceptance of OSS, but I hope that lawmakers keep this in mind. On the other hand, I believe that if you have paid for a modifi
    • by Sanity ( 1431 ) *
      It sounds like they are proposing that people will only be liable for bugs that they know about and don't disclose - not exactly a serious problem in the Open Source world.

      Forcing companies to disclose bugs in this way could only serve to allow users to make more educated purchasing decisions on the basis of software reliability.

      Imagine that I wrote some software that I *knew* was buggy, and then sold it to a hospital or into another situation where people's lives were at risk. Imagine then that one of

  • by Realistic_Dragon ( 655151 ) on Sunday April 27, 2003 @03:53PM (#5820720) Homepage
    IMO if a company is unwilling to supply you with the source code (under whatever license) to let you see and fix problems that exist they should have no possible exemption from litigation, no matter what POS EULA they persuade you to sign.

    They are asking you to place your trust in them that their code is good enough to bet your business on. If their software is not all it's cracked up to be and you had no chance to check their claims (but instead had to take their word for it) then they clearly are responsible for breaking their word.

    Unless they told you that it was a buggy product that you couldn't rely on in the first place... now that would make for amusing adverts.

    (Can you imagine Windows boxes with cigarette-health-warning style labels on them saying "The Computer-General warns that this product may be bad for your business.")
  • Sad. So very sad... (Score:5, Interesting)

    by mcrbids ( 148650 ) on Sunday April 27, 2003 @03:53PM (#5820721) Journal
    The company with the most to gain from this (with its unique cash reserve - Microsoft) is the company most in opposition...

    Yes, I said it. I'll say it again. Microsoft could gain *a lot* from this movement.

    With their resources, they are the ones that could easily afford a true source-code audit the likes of which the BSDs are only beginning to approach.

    They could build an operating system that fully, completely, and truly matches the concept of "secure by default" and they have the resources, manpower, and ability to do so.

    But, instead, they oppose it. Building a secure system is against corporate culture, so they won't do it.

    Thanks xBSD!
    • by Sycraft-fu ( 314770 ) on Sunday April 27, 2003 @05:40PM (#5821272)
      You want verified design? Cool, you can get it. You can get a design that is guaranteed to have no bugs and to never crash. This exists today; however, you need to be prepared to PAY for it, in many ways.

      First, say goodbye to the concept of being able to load your own apps or run it on your own hardware. If the company is going to certify that everything will be bug-free, they need to know that a 3rd party isn't going to fuck that up. Your system will be verified to run on certain hardware and using certain software. You will not change any of that without consulting the company first to do a verification of the proposed changes, or you'll invalidate the guarantee and service contract. After all, you can have 100% stable code, but if a piece of hardware has a dodgy kernel-mode driver it doesn't matter; that can bring the system down.

      Second, your access will be restricted. You won't be able to just put this system on the Internet to be accessed in any way you like; it will be accessed only through verified channels. You cannot have the integrity potentially compromised by clients sending unforeseen data to it, so all access must be controlled.

      Finally, you will pay in terms of price. If you want a system of this level you are not getting it for under a thousand dollars. Think 6 or 7 figures, plus a yearly maintenance contract, since you yourself aren't allowed to maintain it.

      We have systems of this level in the real world, like the AT&T/Lucent phone switches that run most of your phone network. We have one at the university and you know what? It never goes down; it didn't even go down when they upgraded it from a 5ESS to a 7R/E. It is 100% reliable. However, it is totally inflexible. We can't mess around with new technologies with it; it does phones and it does them only one way. We don't even work on it directly - it came with two technicians as part of the service contract. Oh, and it cost nearly 20 million dollars.

      Look, if you want to have a computer market where anyone is free to build hardware and assemble it how they like, and you can freely use whatever software you want, you have to accept that there WILL be problems and you won't get verified design. The big part of a verified design is just that, verification. You check EVERY part of the design and make sure it works properly with the other parts and has no errors. The problem is that hardware, software, and user interaction are all a part of that and all have to be restricted. Once a design has been tested and verified, it can't be changed without re-verifying.

      So, if you really want 100% reliability, and can afford it in terms of monetary cost and the sacrifices you have to make, then don't whine - go and get it. Talk to IBM, EMC, Dell or the like. They'll design you a system to do what you need that will never crash, ever. However, you'll need to decide what it needs to do and be happy with that, because you won't be able to change it, and you'll have to pay a real cash premium for it.
    • They could build an operating system that fully, completely, and truly matches the concept of "secure by default" and they have the resources, manpower, and ability to do so. But, instead, they oppose it. Building a secure system is against corporate culture, so they won't do it.

      I think you dramatically underestimate the work in creating a secure, robust system.

      First, Microsoft's money only buys them so much. You can't just put more money into something and get more out of it. Of course, they can p

  • Flash and burn (Score:5, Insightful)

    by HanClinto ( 621615 ) <hanclinto@nosPaM.gmail.com> on Sunday April 27, 2003 @03:54PM (#5820726)
    Isn't the trend towards "flashy products" rather than reliable ones the same reason why current marketing pushes sex rather than product qualities (Pepsi, A&F, etc.), movies flaunt big-name actors and actresses, and people won't go see a movie unless it has a PG-13 or R rating (PG? It's got to be boring)? This is the same reason why Lego now has 3-piece dumptrucks and 8-piece Hogwarts castles. Why is this? Dumbed-down education? Why is it that people have just started to gobble up whatever the media tells them rather than understanding what they need for themselves? *sigh* What's society coming to?
  • by fjordboy ( 169716 ) on Sunday April 27, 2003 @03:54PM (#5820730) Homepage
    I've often thought about how many products use simple programs and stuff to run correctly...like traffic light systems...right now they work pretty well and everything goes together properly. However, the day that cities decide to have a central server run the traffic lights so they can...say, control traffic around accident areas...things will go wrong. The "foolproof" software will cause all sorts of problems.

    I don't see this so much as software causing problems as much as the tendency we have to make what used to be simple things really complicated. One example from my life is a train system that runs around inside a building near the ceiling at a camp [susque.org] I work at. The system looks really nice, and it could work well. However, having a couple of electrical engineers volunteer their time to build the system made it very different. Now, what could have been a simple on/off switch is a whole panel with an LCD display and all sorts of error lights and little IR detectors on the track to make sure the train is in the right place. It is a geek paradise... but the train NEVER works. Despite all the fancy assembly code they have running the whole thing, it doesn't work. An on/off switch would have worked... I'm certain of it!

    As we complicate more and more appliances with complex software, there are going to be more problems. Heck..what's gonna happen next time your toaster oven timer crashes...you could burn down a house!

    The cavemen had something going for them...
    • "However, the day that cities decide to have a central server run the traffic lights so they can...say, control traffic around accident areas...things will go wrong. "

      Actually I wouldn't be surprised if traffic lights aren't already centrally controlled in some urban areas.

      Traffic lights have a human safety factor, in the event of bad instructions they can fail over to flashing red in all 4 directions. Humans are trained to understand that flashing red means stop. So the worst case, that the lights are
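
      A sketch of that fail-over logic in a controller loop - purely illustrative; the timeout value and names are invented, not taken from any real traffic controller:

          /* Fail-safe selection: bad or missing instructions from the central
             server degrade to all-ways flashing red, which drivers already
             know to treat as a stop sign. Illustrative values only. */
          #include <time.h>

          #define CMD_TIMEOUT_SECS 5

          enum mode { NORMAL_CYCLE, FLASH_RED_ALL_WAYS };

          enum mode select_mode(time_t now, time_t last_valid_command) {
              if (now - last_valid_command > CMD_TIMEOUT_SECS)
                  return FLASH_RED_ALL_WAYS;   /* fail safe, not fail silent */
              return NORMAL_CYCLE;
          }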
  • by ptarjan ( 593901 ) <spam@NosPAM.paulisageek.com> on Sunday April 27, 2003 @03:57PM (#5820743) Homepage
    And Bill Gates turns to the CEO of GM and says, 'Why is your technology moving so slowly? If you advanced at the same rate as we do, we would have flying cars by now!' Immediately the CEO of GM turns to Billy and says, 'Because the government doesn't allow us to build cars that crash 4 times a day.'
  • Ignorance. (Score:2, Insightful)

    I'd say that most non-geek users are completely ignorant of software reliability. A computer just has errors. They have grown to accept that. To them that's why they have a warranty and that's why tech support exists. The typical Windows 9x user believes that a restart is the natural second step to every click or change they make. I knew a girl who thought an illegal operation meant she could go to jail (for what, she did not know.) So the first step in making software companies more reliable and mor
  • Just noting that the regulations in UCITA give you the worst parts of liability and disclaiming against it... The bill states that software companies must warranty their software's performance -- but says that they can disclaim that warranty in their license agreement.

    So what does that mean? It means that companies like Microsoft can ask their lawyers for the appropriate legalese to have no liability against their software fuck-ups, but some hobbyist who coded up something and stuck it on their web site
  • by Macrobat ( 318224 ) on Sunday April 27, 2003 @04:01PM (#5820764)
    Holding software vendors liable for failure won't stifle innovation. A great deal of (most?) innovation goes on in academic settings anyhow, where results are published and critiqued by outside experts (i.e., from other universities), not hidden away like some Special Sauce recipe.

    Moreover, how innovative has MS (or anyone else) been about reliability? Adding new features like embedding full-length motion pictures into Word documents (apologies to Neal Stephenson) is one kind of 'innovation,' but it comes at the cost of gains in stability. One could argue, and people have, that most commercial software is derivative anyhow, and its mass adoption has stifled opportunities to create more stable products.

    And finally, do we really need that many new twists on things? I'm not saying stop research or trying new things, but mainframes running COBOL code have been hosting most of the world's financial and business information for decades, and they are legendary for their stability, with low incidence of data corruption and uptimes measured in years to decades.

    • "Adding new features like embedding full-length motion pictures into Word documents (apologies to Neal Stephenson) is one kind of 'innovation,' but it comes at the cost of gains in stability."

      So if you mounted a rocket on your car to help with acceleration but you knew that one out of every ten uses it would completely fail and likely destroy your car are you innovating or are you being stupid? Innovating is when you add a feature and it just works. When Microsoft or any other company adds a feature t
  • by ZenShadow ( 101870 ) * on Sunday April 27, 2003 @04:06PM (#5820783) Homepage
    10 steps for building a successful software product:

    1) Fire half (perhaps all) of your programming staff. Most of them don't know the difference between a heap and a stack (see the sketch after this list), don't have a clue beyond the Java language, and if faced with the prospect of learning x86 assembly language, they'd faint.

    2) Hire people that *do* know the difference between a heap and a stack, perhaps have written in some assembly somewhere (even if just in college), and have figured out how to use a few more languages besides Java.

    3) When doing #2, ignore college degrees. Whether or not someone has one doesn't indicate whether or not they're a good programmer, at least until the majority of our school system can actually teach the *relevant* skills.

    4) Plan. Plan. Plan. Document. Plan. Flowchart. Plan. Plan. Discuss. Plan. Discuss. Plan. Document. Plan.

    5) Code.

    6) Discuss. Test. Fix. Discuss. Test. Fix.

    7) Refactor

    8) Repeat 6-7 until all the software has been reduced to the simplest, most error-free possible codebase for its functionality.

    9) QA. (Yup, this happens *after* the testing in (6)!)

    10) Ship.
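
    The heap/stack distinction from step 1, in miniature - a refresher sketch, not anything from the article:

        /* Refresher: the two storage classes step 1 is hiring for. */
        #include <stdlib.h>

        void example(void) {
            int on_stack = 42;                       /* automatic storage: gone when example() returns */
            int *on_heap = malloc(sizeof *on_heap);  /* dynamic storage: survives until freed */
            if (on_heap != NULL) {
                *on_heap = 42;
                free(on_heap);   /* forget this and you leak; free twice and you corrupt the heap */
            }
        }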
    • Wow, so it's either assembly or nuthin, huh? That's kinda lame. Granted, it's very valuable to know the diff between a stack and a heap, but that's not the end all of everything.

      And frankly, after leading teams, the last thing you should do is fire people on it. I would suggest letting go of clueless managers and real payroll hogs. Then teach/mentor your jr. level programmers. I dunno, being l33t like that actually will hurt a project more than help.

      Also, Step 9.5 should be refactor, test, repeat 9, since
    • by Rombuu ( 22914 ) on Sunday April 27, 2003 @04:40PM (#5820944)
      Boy what fucking useful advice.

      And if someone asked you how to play a flute you'd say, "oh, just blow in here and move your fingers."
    • This sounds a lot like what companies are doing today. They rank their employees and then cut 50% of them because their sales are in the tank.

      The real right answer is to move that 50% to testing, double project timelines, add diagnostics and plan for quality from the very beginning.

    • by cpeterso ( 19082 ) on Sunday April 27, 2003 @05:56PM (#5821362) Homepage

      11) Profit?

  • "It always takes us by surprise when the rocket blows up or the ATM goes down," Guttman says.

    That was Microsoft all this time? Wow. I guess I shouldn't feel so bad when my workstation acts funny. Just one reboot and I'm back to work. But if my workstation blows up, I'll know who to blame.
  • So Frontline has a great 52-minute show on this exact thing - viewable online! (Personally, I copy out the links in the HTML and watch it in RealPlayer or QuickTime, but whatever suits ya..) It's called Cyberwar. Interviews with White House/govt types along with a cracker and an M$ guy. It's got more of a 'war' slant, but nonetheless, pretty relevant. Check it out here [pbs.org]

    Ah, gotta love Frontline..
  • Says the story, 'Microsoft contends that setting [reliability] standards could stifle innovation, and the cost of litigation and damages could mean more expensive software.'

    The story also specifically proposes holding vendors legally liable, and in some respects I half agree with Microsoft on this one. At the very least, any legislation would have to be very well designed.

    If I write software freelance (as many people here do), I want to be able to give it to people and tell them to use it at the

  • "The idea that we depend on something that's inherently untrustworthy is very frightening," he says.

    This is such crap. Software is not inherently untrustworthy. The fatal incidents cited all appear due more to human error than to software bugs, as has happened since man started building machines.

    If software were so inherently buggy, no one would get on a plane or dare trust a traffic control signal.

    As for making manufacturers liable, you can, but I would expect a negative response rather th
    • I agree. Most software is very reliable. More aircraft crashes are caused by mechanical problems than software issues. If there is a life threatening fault in a piece of software, this usually results in a recall. The only software that's really unreliable is consumer level, and you are quite unlikely to die from Word crashing (even that doesn't happen to me much).
  • I think a big problem is determining the question of who is liable: the person who wrote the software or the person who deployed it? I think software vendors can often successfully argue in court that the user "was trying to do something with it that it wasn't designed to do".
  • by Anonymous Coward
    I guess I've had a different experience with reliability than most of the posters here have had.

    Given a piece of software that has both Windows and Linux versions, the Windows version is almost always more reliable/less buggy.

    The Linux version usually seems to have been done as an afterthought, and most of the development work goes into the NT product.

    I'd like to choose the Linux version every time, but for most software, the Linux implementation just isn't there yet.
  • Let's be realistic (Score:4, Interesting)

    by phillymjs ( 234426 ) <slashdot@nOsPam.stango.org> on Sunday April 27, 2003 @04:16PM (#5820831) Homepage Journal
    As long as companies like Microsoft are around to pump money into lobbyist firms and election campaigns, we'll never see a software-reliability law that's actually beneficial to consumers.

    I'm pretty much willing to settle for some sort of truth-in-software-advertising law... so when William H. Macy's voice tells us that Microsoft's server software is totally secure and reliable, it also has to tell us that Microsoft's EULA says that if this turns out not to be so, tough shit on you for believing it in the first place.

    ~Philly
  • The article also says, however, that consumers' favoritism of flashy products over reliable ones is partly to blame for the current state of software

    Very true. If it weren't for flashy junk, I wouldn't have to make a huge project out of uninstalling the million varieties of Hotbar's spyware.

    However, on the server level, it will hardly be a consumer thing. If they install SkyNet, it probably won't be running a commercial OS.

  • by west ( 39918 ) on Sunday April 27, 2003 @04:21PM (#5820855)
    It is certainly true that users place reliability very low on their list of priorities when buying products, but that does not necessarily mean that they don't value reliability. It merely means that they take reliability for granted.

    For example, the last time I filled in a car survey, I didn't put "does not explode when ignition key turned" anywhere on the form.

    The problem is a fundamental one. There are way, way, way too many possible parties to blame. The only logical reaction for MS if such a law were enacted would be to immediately stop their system from running any software that wasn't authorized by MS (with appropriate fees, bars for competing programs, etc.), a situation that I imagine they see only in their fondest dreams. Legislation like this would be the perfect excuse. To be honest, even I would barely question their right to secure their system if they are going to be held responsible for its flaws.

    As for the idea that open source software should be exempt - I doubt that you'd accept the idea that cars should be exempt from safety standards if they provided you with the blueprints :-).
    • It is certainly true that users place reliability very low on their list of priorities when buying products, but that does not necessarily mean that they don't value reliability. It merely means that they take reliability for granted.

      I think there's also a cost-benefit tradeoff that people make, which varies from item to item. I spend a lot more time in Word than in vi because the added features and usability are worth far more to me than the occasional crashes or file corruption costs.

      If it were a pacema

    • Hmmm... One of the suggestions in the article was that companies only be liable for bugs they fail to disclose. If the entire source code is disclosed, does that mean all the bugs are disclosed?
    • by starseeker ( 141897 ) on Sunday April 27, 2003 @05:04PM (#5821041) Homepage
      "As for the idea that open source software should be exempt - I doubt that you'd accept the idea that cars should be exempt from safety standard if they provided you with the blueprints :-)."

      But I would if the car were given to me for free with the blueprints. When I use such a car I am knowingly accepting the conditions: while the designers may have done their best to make it work properly, I accept the risk of failure. That's where the no-free-lunch part comes in for free stuff - you don't get to nail hides to the wall if it doesn't do what you want. If you want someone behind it, pay them to take the legal risk. Otherwise, you're at the mercy of the developer's good will unless you want to become an auto mechanic. The difference is - with the blueprints, I can figure my way out. Commercial software vendors sue you if you try the equivalent operations.

      Anyway. Bad analogy. The act of paying someone an agreed upon sum for support is where the responsibility part comes in. Not supplying blueprints.
  • I'm not entirely sure that new laws are the answer here. So far as I am aware, if someone commits a reckless act that they knew or should have known would cause injury or death to someone, isn't that already actionable?
  • What amazes me is that some shyster (pronounced: Peter Angelos) hasn't filed a class action lawsuit against Microsoft and everyone else. Seems like the money he could make on that deal would dwarf what he expects to get in the (bogus) cigarette lawsuit.

  • So computers crash. And you know what? Cars crash every day too. Few software bugs end up killing people, but crashing cars is one of the top killers! Why don't they make cars that can't crash? It could be done!

    Get some perspective here people. Computers aren't made perfectly reliable because the free market says they don't have to be. And they don't. The cost of making bug-free software is much higher than the value of bug-free software. If you are going to argue the point, please take that energy a
  • by bcrowell ( 177657 ) on Sunday April 27, 2003 @04:31PM (#5820908) Homepage
    Is there really a downward trend in quality? How should we measure quality?

    I started using computers ca. 1979, when my dad got a TRS-80. I don't remember ever encountering a single software bug on that system, although the hardware certainly had its problems.

    But does that mean that quality is getting worse? The OS on that machine was on ROM, and was about 4 kb. A modern OS weighs in at many, many megabytes. It's possible that the number of bugs per line of code has actually been going down.

    Another possible metric is how often the user encounters a bug. By this metric, non-OSS consumer-level software has certainly been getting much, much worse. I switched to Linux from MacOS, and my average number of bugs encountered per day went from maybe 5-10 to some number less than one.

    Some things have definitely changed since 1979:

    • In the early 80s, software mostly came as BASIC source code. If you encountered a bug, you could fix it.
    • Software houses used to be much more open about bugs. I briefly worked on tech support and quality assurance for Digital Research around 1983. We had a list of known bugs in each product, and we would fax customers the list on request.
    • Performance is much worse. A TRS-80 would boot in a matter of seconds, whereas today the Windows boxes at my work take up to 3 minutes. The first word processor I ever used, Electric Pencil, started up in a fraction of a second, and never had any noticeable delays in handling input. This was on a CPU a hundred times slower than current ones!
  • I think that part of the fault is with people who decide to use commonly available (i.e. usually cheap) components for critical products (see the warship incident a couple of years ago, for example).

    Most electronics components, from resistors to microcontrollers, are usually marked "not for use where human lives can be put at risk" or something like that. Say, if you were to build a pacemaker, you wouldn't buy the parts at your local RadioShack. Software (or anything else) should be the same way.

    High availa

  • Leave it to Microsoft to blame consumers for their software's poor reliability. They pretend that people haven't been bitching about it for years. News to MS: Everyone wanted reliability all along. Sure pretty is nice, but I would have traded my WinME interface for an Etch-a-Sketch if it meant it wouldn't crash all the time.
  • I think things could be just fine as-is.

    For normal, non-life-threatening apps, use open standards for your data. Then, when a less-buggy product comes along, start using it. Let your wallet move the market in the right direction. If people keep using buggy software, companies have no reason to do any better.

    In the serious life-and-death software cases, it's always a different ballgame anyway. Companies feel an obligation to test the whole product to make sure it doesn't kill often enough to hurt the c
  • Consequently, Humphrey teaches engineers to plan and pay attention to details early, and reject aggressive deadlines.

    I tried this on my last project. This was a huge complex project with many people working on it. If one person messed something up it would take half a day to find the problem. I explained to the project management that I thought the software was getting out of control and badly needed some refactoring and at least some unit tests to aid in quickly identifying problems. I also wasn't
  • Well (Score:3, Interesting)

    by mindstrm ( 20013 ) on Sunday April 27, 2003 @04:40PM (#5820946)
    Generally, any software that LIVES depend on has guarantees..

    There is a world of difference between average windows software, and say, hospital management software, or flight control software, or what runs the space shuttle.

    PEOPLE are liable.

    We KNOW software is prone to not being perfect, just like *any other system*.

    When you build a bridge, you don't just slap it up and hope it works... that's what the guy who throws a board over the creek in his back yard does.. he eyeballs it, decides it's adequate, and that's it.
    When we build a suspension bridge, engineers SIGN OFF on the soundness of the bridge... which is dictated by long-standing test principles. There are many, many things that lead up to a declaration that the bridge is stable... if it turns out not to be, and no negligence is seen, it's something wrong with the process itself.

    If you want software with guarantees, everyone has to agree on test suites, methods, and processes that PRODUCE good software... and we all agree that if they pass said tests, then liability is waived. Something like that.
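
    In miniature, "liability is waived if the agreed tests pass" might look like a vendor shipping a function together with the acceptance suite both sides signed off on. Everything below is invented for illustration:

        /* Toy acceptance suite: pass it and, under the scheme above, the
           waiver applies. All names and values are made up. */
        #include <assert.h>
        #include <stdio.h>

        static double gear_ratio(int teeth_in, int teeth_out) {
            return (double)teeth_out / teeth_in;
        }

        int main(void) {
            /* The agreed-upon tests, signed off by both parties. */
            assert(gear_ratio(10, 20) == 2.0);
            assert(gear_ratio(20, 10) == 0.5);
            assert(gear_ratio(1, 1) == 1.0);
            puts("acceptance suite passed");
            return 0;
        }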

  • by 1nv4d3r ( 642775 ) on Sunday April 27, 2003 @04:42PM (#5820952)
    Pressing 'File' sends a suggestion to Word that it should open the file menu, which it may or may not choose to do.

    In the event that the file menu does drop down, the user in most circumstances can then press 'Save.' This could potentially update the on-disk copy of the document. The (notional) screenshot, which is likely to be on the next page, depicts a common scenario.

    etc..

    Try to find a bug in that!
  • by Boss, Pointy Haired ( 537010 ) on Sunday April 27, 2003 @04:43PM (#5820954)
    Seriously, is this problem with crappy software and software reliability a result of people out there in programming jobs who really SHOULD NOT BE PROGRAMMING?

    We've all seen those questions asked in bulletin boards and usenet groups, where some newbie pops up and says:

    "I'm learning $language, how do you do $something_obvious".

    And you think to yourself, "If you have to ask a question like that, you should NOT be programming." But we're all too nice to say it.

    Trouble is, people who are having to ask questions like that are writing software that people's lives depend on.

    Scary stuff.
  • I've always thought the way to approach this problem would be to start from the ground up (hardware) and redesign a new system using proper engineering methods to incorporate things like security, reliability, and modularity. It's very easy to criticize the software out there today (with a few exceptions) because of the environment it was designed and written in. My dream way to design a new computer:

    1.) x86 must die. Kill it. Take RISC as a starting point, and work from there to design an optimal pro
  • "The article also says, however, that consumers' favoritism of flashy products over reliable ones is partly to blame for the current state of software."

    I disagree that consumers are responsible for the state of software. We fix computers for a living. We have clients happily running Windows 98 who wouldn't move up for love or money.

    We have clients buying new computers who want to convert back from XP to Win98. As this esteemed audience knows this can be difficult. Dell boxes have their XP pretty

  • by miketang16 ( 585602 ) on Sunday April 27, 2003 @04:57PM (#5821014) Journal
    Who would have thought Microsoft would oppose setting a standard for reliability....
  • 1991

    Consumer: I would like to buy this newfangled Windows 3.0 I've been hearing so much about.

    Microsoft: (Brings out a large dead carp and slaps it across the consumer's face a dozen times.)

    Consumer: Thank you!


    1995

    Consumer: Having grown tired of the hideous deformity known as Windows 3.1 sitting on my hard drive, and being easily swayed by massive advertising campaigns using music by The Rolling Stones, I would like to upgrade to that spiffy Windows 95 everyone is talking about.

    Microsoft: (Brings out a large dead carp and slaps it across the consumer's face nine times.)

    Consumer: Thank you!


    1998

    Consumer: Having exhausted the aesthetic enjoyment of the Blue Screen of Death, and daunted by the inability to pick a browser not clumsily tied into my operating system as an anti-competitive practice by a coercive monopoly, I would like to upgrade to the ever-so-delightful Windows 98.

    Microsoft: (Brings out a large dead carp and slaps it across the consumer's face six times.)

    Consumer: Thank you!


    2000

    Consumer: For reasons unclear even to me, I have decided to upgrade to the heavily-hyped Windows ME!

    Microsoft: (Brings out a large dead carp and slaps it across the consumer's face nine times.)

    Consumer: Hmm, that seemed rather suboptimal...


    2002

    Consumer: Obeying the voices in my head, I've decided to upgrade to Windows XP.

    Microsoft: (Brings out a large dead carp and slaps it across the consumer's face three times.)

    Consumer: Now we're getting somewhere!



    End result: Consumers will never sue Microsoft for defective software.

  • the notion that... (Score:3, Interesting)

    by zogger ( 617870 ) on Sunday April 27, 2003 @05:16PM (#5821106) Homepage Journal
    ..the notion that vendors would be liable for *bugs they know about* has some merit. Think about it. If the large companies - we'll pick on MS because it's such a good example - were forced to fix bugs in a timely manner, then they would need to accept bug reports. They would also have to release bug reports as soon as they knew about them, i.e., they couldn't sit on a critical exploit and let people hang out in the wind for months and months. Once a report was made to them, it would become an official bug they couldn't ignore. They'd have two choices then: switch to open source to find as many bugs as possible in the shortest time, or keep paying claims forever because they ignored bugs. Either way they would release less software, of better quality - not really a bad idea. If they wanted to hire professional beta testers, so what? More paid jobs. I don't see that as being all that bad. Nope, I don't.

    Open source -FOSS- is in a unique position because it's "free". There can't be any damages if you haven't paid for it, or at least that could be part of "the law" written into it.

    Normally I'm against new laws, but instituting some sort of consumer protection should be in order if these companies want to make serious profits all the time. There are very few examples of consumer products out there that have no liability at all attached to them. With just a short time reflecting on it, I can't think of any offhand - just *some* software. Eventually it's going to happen, so better to sort it out now; it really should have been sorted out 30 years ago, IMO. I'll tell you what will cause it, too, if it's not done voluntarily in advance and adhered to: the first uber killer mass virus or trojan that makes Code Red or Slammer look like a case of the sniffles, a net-killer. You'll get ten times worse legislation out of Washington if the software community waits until that happens.

    I'd say it's bound to happen sometime, too. The article already cites some $50-odd billion a year in losses due to either bad or insecure programs; if something bad happens that does ten times that in one day, you WILL see the mother of all knee-jerk reactions from "the software consumers".

    Well, OK, say that "something" is needed - what would be reasonable, but still not stifle development? One option would be outright sales of software, not just renting/licensing of software. You buy it, you OWN it. You get it at such and such a date; as of that date it worked as advertised. After that date, well, it's up to the vendor - anything "new" that needs to be added is up to them, from free unlimited patches and updates to pay-for individual bugfixes and exploits as you go, forever. Could be a yearly lease thing, whatever. For-profit vendors would get on the ball pretty quickly then if they charged too much or it didn't work all the time; they'd be forced into making auditing the most important part of production. Hmm, is this a bad idea, really? The software is sold as "works on this and this, won't work with that and that". Yes, that would make software developers tend to work around just a few pieces of hardware and one or two OSes max, no doubt. It would also be very expensive. Very expensive. Maybe people would go to the no-liability but free stuff then? And I can see various versions in between those two extremes.

    Could there be set limits per incident? Perhaps. Max liability, perhaps.

    How about classifications of software?

    "Entertainments" might be of lower criticality (so less liable in terms of maximum cash) then say the pacemaker software, or auto-controlling software. "Communications" like browsers and email and chat would be in the middle someplace in those terms of criticality. If your business depends on UPS or FEDEX to ship widgets, and they constantly don't get there or they are smashed, those companies would be sued out of existence. but if your widgets are electronic, well? It's just your tough luck as the consumer then, the software and the infrastructure has let you down, but they all get to say
  • by nomadicGeek ( 453231 ) on Sunday April 27, 2003 @05:48PM (#5821317)
    I disagree with the article's assertion that there is no liability for defects in software.

    I deal with embedded controls in industrial control equipment all of the time. I just had to change my insurance company last year, and my rates went up because companies are being held accountable and insurance companies are paying out when people screw up. Many companies don't want to insure programmers anymore. Sounds like the hammer is coming down to me.

    You may not be able to sue MS the next time Excel craps out on you but I assure you that you could sue a programmer because the system that he programmed dumped 1000 gallons of a toxic substance into your containment area or because you just released a toxic cloud of ammonia from your plant.

    When the stakes are high, programmers tend to have to test a lot more. You still have to remain economically viable, though. Three lines of code a day may work for NASA, but the rest of us can't afford to be that inefficient. Of course, the stuff that I can blow up is at most worth tens of millions of dollars, not billions.

    When it comes to embedded control apps, I don't think things are much worse than they are for our physical counterparts. Yeah, a plane crashed because of a bug in an altitude control system, but planes also crash because of design problems in the mechanical, electrical, and materials engineering areas. I don't think programmers are any less aware that lives depend on their work than any other kind of engineer.

    If you are doing number crunching types of applications, you also tend to run the code through a battery of tests. You can definitely be sued for screwing that stuff up.
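    To give a flavor of what that battery looks like, here's a minimal hypothetical sketch (the function, names, and numbers are made up, not from any real system): known answers, edge values, and a tolerance for floating-point rounding.

        // Hypothetical sketch of the sanity checks numeric code gets run through:
        // known answers, edge cases, and an explicit tolerance for rounding error.
        public class InterestTest {
            static double compound(double principal, double rate, int years) {
                return principal * Math.pow(1.0 + rate, years);
            }

            public static void main(String[] args) {
                check(compound(100.0, 0.05, 1), 105.0); // known answer
                check(compound(100.0, 0.0, 10), 100.0); // zero-rate edge case
                check(compound(0.0, 0.05, 10), 0.0);    // zero-principal edge case
                System.out.println("all checks passed");
            }

            static void check(double got, double want) {
                if (Math.abs(got - want) > 1e-9) {
                    throw new AssertionError("got " + got + ", want " + want);
                }
            }
        }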

    Now, the little controllers in your dishwasher and your run-of-the-mill desktop apps are held to a lower standard, I agree. You are rewarded by the market for getting new stuff out the door cheaply and quickly. You can certainly argue that it shouldn't be that way, but the masses have spoken. If your stuff gets too far out of hand, the market will let you know. MS is definitely feeling the pressure from OSS, and rightly so; I'd bet they are at least trying to respond. I can see a big improvement between the Windows XP I run on my notebook and desktop and the NT 4 I ran a few years ago. I can also see that Windows 2000 is much better on the server than NT 4 was, but it isn't good enough yet, and that is why a lot of people are moving to Linux for things like web servers, DB machines, etc. The market is speaking.

    I would say that programmers are ultimately held accountable. I would hate to see things swing too far the other way, as I think that would ultimately stifle innovation.

  • by Kjella ( 173770 ) on Sunday April 27, 2003 @06:13PM (#5821459) Homepage
    Price, features, speed, and reliability: pick some, because you can't have them all.

    To write almost bug-free software the way DoD and NASA do (just be sure to check whether the specs are metric or not), the price is astronomical. Despite its obscene profit margin, Windows would be *much* more expensive if written to the same standards.

    Adding features is another source of instability. Not only commercial software but also OSS has been accused of focusing too much on new features: in the commercial world because features sell, and in OSS, I think, mainly because adding features is more fun than debugging an elusive bug that only happens on Friday the 13th under a full moon.

    Another thing is speed. Games in particular run the latest beta drivers on a tweaked and re-tweaked engine, all for speed. This happens both at the high end (pushing eye candy) and at the low end (pushing playability on low-powered machines). Don't expect perfect stability from that.

    In short, I think the market would normally work this one out by itself. When delivering appliances, I feel you should have the same liability as for the rest of the car: whether the brakes fail because of a mechanical design flaw or an electronic (software) one is not very relevant. For a typical program that merely processes information on your own computer, though, I don't see liability as very useful. Requiring some kind of standard would not change the basic trade-off, and it's not the producers' fault that consumers don't value reliability and security. They aren't willing to pay the price in money (how many complain about the price of Windows already?), features (go Linux: more stable, fewer features though), or speed (how many complain about the speed of Java, which tries to abstract away the bugs that come from unterminated strings, pointer arithmetic, and out-of-bounds array indexes?). So what did you expect?
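    To make that last trade-off concrete, here is a minimal hypothetical sketch (not from any real codebase): the same off-by-one that silently scribbles over memory in C becomes a defined, catchable error in Java, and the safety is paid for in runtime checks.

        // Hypothetical sketch: the off-by-one below would silently corrupt
        // memory in C; Java's bounds checking turns it into a catchable error.
        public class BoundsDemo {
            public static void main(String[] args) {
                int[] buf = new int[4];
                try {
                    for (int i = 0; i <= buf.length; i++) { // off-by-one: i == 4 is out of bounds
                        buf[i] = i;
                    }
                } catch (ArrayIndexOutOfBoundsException e) {
                    System.err.println("caught at runtime instead of corrupting memory: " + e);
                }
            }
        }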

    Kjella
  • by silverhalide ( 584408 ) on Sunday April 27, 2003 @06:36PM (#5821539)
    This isn't really a huge issue; it just illustrates the need for another certification program. Look at the semiconductor market: there are parts you can use in anything, and then there are parts rated MILSPEC or hospital grade, which have been tested and are approved for critical situations. It's more or less the same semiconductor; it has just been exhaustively tested. Those parts usually cost many times what the ordinary ones do, but you KNOW they will work, because whoever made them is going to stand behind them.

    We need the same thing for software: someone to set up guidelines and certify software that is going to be used in critical applications. Hell, maybe the UL could even open a division and do it. It is plain stupid to assume authors bear liability for all software ever written, especially in the open source world. But if I buy a product whose software has been certified by a trustworthy organization, I'd feel better about it.
  • by dpuu ( 553144 ) on Sunday April 27, 2003 @06:58PM (#5821615) Homepage
    "The idea that we depend on something that's inherently untrustworthy is very frightening," he says

    If something is inherently unreliable, then you don't need to fix it: you find ways to live with it. A perfect example is the internet itself: TCP is a reliable transport provided over IP, an unreliable internetworking layer.

    Make no mistake: IP is explicitly and deliberately unreliable. This keeps it simple, and allows upper layers to choose appropriate quality of service parameters for their application.
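    A minimal sketch of that choice (hypothetical host and port, purely for illustration): the same application can ask for a reliable stream or for bare datagrams, and the burden of coping with loss moves accordingly.

        // Hypothetical sketch: the application picks its own reliability level.
        import java.net.*;

        public class TransportChoice {
            public static void main(String[] args) throws Exception {
                // Reliable byte stream: TCP retransmits, reorders, and acknowledges for you.
                Socket tcp = new Socket("example.com", 80);
                tcp.getOutputStream().write("HEAD / HTTP/1.0\r\n\r\n".getBytes());
                tcp.close();

                // Bare datagrams: UDP is send-and-hope; any loss handling is yours.
                DatagramSocket udp = new DatagramSocket();
                byte[] payload = "ping".getBytes();
                udp.send(new DatagramPacket(payload, payload.length,
                        InetAddress.getByName("example.com"), 9)); // port 9 is "discard"
                udp.close();
            }
        }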

    How this relates to the issue of unreliable application software is fuzzy, but it's pretty obvious that humans have adapted to the reality of the situation: the power-cycling protocol is just one example of the ways in which we cope.

    If a situation is life-critical, I'd be happier knowing the system is designed to cope with glitches than knowing it assumes those glitches have been tested out of existence. Cosmic rays really do exist, so some level of unreliability is guaranteed!
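    As a toy illustration of "cope with glitches" rather than "assume none" (entirely hypothetical; readSensor() stands in for any flaky source), one classic trick is to read several times and take a majority vote:

        // Hypothetical sketch: tolerate a single glitched read by majority vote,
        // instead of trusting that any one read is never corrupted.
        public class VotingRead {
            static int readSensor() { return 42; } // placeholder for real, flaky I/O

            static int readWithVote() {
                int a = readSensor(), b = readSensor(), c = readSensor();
                if (a == b || a == c) return a; // a is in the majority
                if (b == c) return b;           // b is in the majority
                throw new IllegalStateException("no two reads agree; fail safe");
            }

            public static void main(String[] args) {
                System.out.println(readWithVote());
            }
        }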

  • by zoftie ( 195518 ) on Sunday April 27, 2003 @08:00PM (#5821850) Homepage
    We expect reliability; further, we expect more and more features, and at a low price. People design software in a language that is backwards compatible with ones from 20 years ago, namely C++, which carries many old failures, on many levels, into living applications. The language itself isn't wrong, but how many people seriously considered writing their applications in, say, Lisp, Scheme, or Forth? Each language has its advantages, yet the economics of software development demand the most widespread language, so that it won't be so hard to hire decent developers. What most managers do not realize is that by choosing a language they meddle in the affairs of those who know the field much better. There is a whole stigma around choosing software tools, languages being the core of it, and the choice is often made by managers who carry no responsibility for developing and maintaining the software; even when they do, common fallacies are used to justify imposing specific tools on software teams.
    However, sometimes teams are fortunate enough to have a choice of tools, yet they never really have a way to verify that what they have created is exactly what the customer needs. Scrutiny by expert users is often absent from software development... except in games! So what do you expect? Incomplete requirements, unfit tools... the list goes on and on. Very few people are able to cut through the bullshit, and the crap in general, to get a very good software package out the door, never mind treat their employees right. Bugs in corporate software are just some of the symptoms; the core of the problem is sheer miscommunication in the way public companies are run, and that is what most software companies are.

    In the end it is all about compromises and vision. Software bugs (that is, bugs in tested and released software) are just side effects that will exacerbate whatever main problems a software company has.

    Plus, something that was not tested for and does not have a fatal outcome for the program is not really a bug; I'd rather call it a glitch...
    My 2c.
