Security

Holding Developers Liable For Bugs 838

sebFlyte writes "According to a ZDNet report, Howard Schmidt, ex-White House cybersecurity advisor, thinks that developers should be held personally liable for security flaws in code they write. He doesn't seem to think that writing poor code is entirely the fault of coders though: he blames the education system. He was speaking in his capacity as CEO of a security consulting firm at Secure London 2005."
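For readers who want a concrete picture of the kind of coding flaw Schmidt is talking about, here is a minimal, hypothetical sketch (table and names invented for illustration): a query built by string concatenation is injectable, while a parameterized one is not.

```python
import sqlite3

# Vulnerable: user input is spliced directly into the SQL text,
# so crafted input can change the meaning of the query.
def find_user_unsafe(conn, name):
    return conn.execute(
        "SELECT id FROM users WHERE name = '%s'" % name
    ).fetchall()

# Safer: a parameterized query treats `name` strictly as data.
def find_user_safe(conn, name):
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (name,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [(1, "alice"), (2, "bob")])

payload = "nobody' OR '1'='1"
print(len(find_user_unsafe(conn, payload)))  # 2 - injection matched every row
print(len(find_user_safe(conn, payload)))    # 0 - no user has that literal name
```

Whether the blame for the first version belongs to the coder, the reviewer, or the schedule is exactly what the discussion below argues about.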
This discussion has been archived. No new comments can be posted.
  • by Agelmar ( 205181 ) * on Wednesday October 12, 2005 @10:15AM (#13772940)
    I will admit that I have seen a lot of bad programmers and bad code over the past few years, but let's step back and think about this. Programming jobs are rapidly being sent overseas to India and China. This is not going to create much of an incentive to keep such jobs in the States, nor does it create much of an incentive for people to go into the field. Holding companies accountable, as suggested in the article, might be a slightly better solution, but again it's somewhat complicated when you start trying to hold an overseas company accountable. (It's more doable than holding an overseas individual accountable, but still not a simple task).

    As for the article's last point about CMM environments: it's not at all an indication that software has been developed by quality developers; all it means is that the code was developed using a reasonable development framework. CMM Level 3 means that you document your processes, and typically have peer review. Bad peers make peer review worthless - it does not guarantee good programs. CMM Level 4 involves "quantitative quality goals" by which productivity, quality and performance are to be measured. This is a bit better, but again it's a matter of where the bar is set. CMM Level 5 is about continual improvement, and is extremely strict. I think that CMM Level 5 is the only environment where one can actually be assured of reasonable quality code. I've seen way too much bad code come out of CMM-3 and -4 environments to give them much credit. If you've got great people, then a CMM-3 environment typically produces great results. For -3 and -4, what you put in is what you get out - not guaranteed greatness.
  • by muellerr1 ( 868578 ) on Wednesday October 12, 2005 @10:16AM (#13772953) Homepage
    Whatever happened to holding the people who exploit vulnerabilities responsible?
  • Sheesh! (Score:5, Insightful)

    by MeBadMagic ( 619592 ) <mtpenguin@@@gmail...com> on Wednesday October 12, 2005 @10:18AM (#13772967)
    Remind me not to work for this guy.....

    Why not make CEOs personally liable for not putting the code through proper QC channels and for selling it over-promised?

    Made to sell, not to use? Whose fault is that?

    B-)
  • by HeaththeGreat ( 708430 ) <hborders@mail.win.org> on Wednesday October 12, 2005 @10:18AM (#13772969)
    That proposal sounds fine, but then we should hold government leaders personally responsible for wrongdoings of government.

    I'd love to see some jail time or a fine for Mike Brown after Katrina, or how about some jail time for Bush after the false pretences of Iraq?
  • by Anonymous Coward on Wednesday October 12, 2005 @10:18AM (#13772971)
    Want me to pay 10x more attention when I code?

    Pay me 10x more. And don't be in such a hurry for your product to get completed.
  • Re:Right.... (Score:5, Insightful)

    by Overzeetop ( 214511 ) on Wednesday October 12, 2005 @10:19AM (#13772975) Journal
    No, gun manufacturers should be liable for producing faulty safeties which do not function properly, or firing pins which may actuate without a trigger press.
  • Not coders fault (Score:5, Insightful)

    by Quasar1999 ( 520073 ) on Wednesday October 12, 2005 @10:19AM (#13772979) Journal
    It's usually poor management that forces the product to be out the door 6 months before it's ready. Either keep your job and release a buggy product or stick to your guns and get fired. I think it should be the company, not the individual developer held accountable. How the company handles things internally is up to them.
  • Right. (Score:5, Insightful)

    by Bozdune ( 68800 ) on Wednesday October 12, 2005 @10:19AM (#13772980)
    Sure, let's sue the pants off anyone who does anything wrong. Let's make it impossible for anyone to create anything new or different. Cradle-to-grave protection, ensured by armies of well-intentioned and socially-responsible attorneys -- that's the sure way to economic success!

  • by scovetta ( 632629 ) on Wednesday October 12, 2005 @10:20AM (#13772986) Homepage
    Not at all. It'd be like holding car manufacturers liable for defects that cause people to get hurt.

    And we do that today.

    Why should software be any different, except that writing bug-free software is probably just as hard as designing a "perfect" car?
  • by pturpin ( 801430 ) on Wednesday October 12, 2005 @10:20AM (#13772988)
    Nah, that requires too much effort. It is much easier to find someone whose name is tied to the code.
  • by Anonymous Coward on Wednesday October 12, 2005 @10:20AM (#13772990)
    CMM level 5 is no guarantee of quality! I worked in India and interviewed many a developer from CMM level 5 companies who were utterly useless. And this idiot who wants to make developers responsible for poor code - does he also advocate Ford or GM workers should be liable for cars that are easily broken into?
  • Re:Right.... (Score:5, Insightful)

    by TheRealMindChild ( 743925 ) on Wednesday October 12, 2005 @10:20AM (#13772993) Homepage Journal
    Only if the gun blew up and killed the shooter.

    Your comparison doesn't match, because developers would be held liable for a skill that they present as "professional". A similar case would be holding the bricklayer accountable for a building coming down.
  • nonsense (Score:5, Insightful)

    by moz25 ( 262020 ) on Wednesday October 12, 2005 @10:21AM (#13772997) Homepage
    While I agree that accountability is a good thing, liability without major restrictions seems like a dangerous thing. I am a software developer myself, and I give my clients the guarantee that all bugs they discover within 6 months will be removed free of charge. Since I have no idea how large the losses they might claim as a result of even trivial bugs would be (yes, some clients are greedy), accepting liability is not something I'm going to do.
  • Oh, yeah (Score:3, Insightful)

    by ceeam ( 39911 ) on Wednesday October 12, 2005 @10:21AM (#13773000)
    You might as well ban "software development" as a trade. After all - WTF? You get what you pay for. I say that your average "in-house" enterprise software system is no less complex than a Toyota Camry or something. The difference is that the software is developed by 1-10 men over a year or two, whereas any other _industrial_ design costs much, much, much more (both in $$$ and man-hours). But who cares? Get back to coding, you idiots!
  • CMMI (Score:5, Insightful)

    by pdmoderator ( 63509 ) on Wednesday October 12, 2005 @10:21AM (#13773004)
    CMMI doesn't guarantee good practice any more than membership in the Better Business Bureau guarantees good business. But I'd rather work in a shop that has CMMI in place than one that doesn't. It's insurance against the sort of death marches that create slapdash practice, shoddy product, and security holes in the first place.
  • by killproc ( 518431 ) on Wednesday October 12, 2005 @10:22AM (#13773007)

    I am currently the Development Lead / System Architect at my company. In my experience, the majority of "issues" and/or "bugs" that I have seen crop up have been directly tied to poor requirements gathering by our "Business Analysts".

    Often, it turns into a real pissing contest between the two groups. Usually, after testing reveals that the grand vision of the BA is a crock, we revert back to the original recommendation of the development group.

    Yeah, let's blame the developers for the problems. That's the ticket.
  • by mfifer ( 660491 ) on Wednesday October 12, 2005 @10:22AM (#13773009)
    The two need not be exclusive.

    One slightly contrived example...

    A house has a door lock that's poorly made. A burglar jiggles the handle and it falls off and the door opens. You can bet yer bippy that the lock manufacturer is gonna hear from the homeowner's lawyer(s).
  • What a dumb idea. (Score:2, Insightful)

    by mjparme ( 9020 ) on Wednesday October 12, 2005 @10:23AM (#13773023)
    So should construction workers who help build a house that gets burglarized be held personally responsible?
  • by digitaldc ( 879047 ) on Wednesday October 12, 2005 @10:23AM (#13773031)
    If we are supposed to hold developers responsible for security flaws, why don't we hold politicians responsible when they give us false reasons for going to war, responding to disasters and evaporating budget surpluses?

    In the world of corrupted politics today, it is hard to find ANYONE accountable for ANYTHING. Why should it be different for everyone else?

    Just a thought.
  • Re:I can see... (Score:5, Insightful)

    by rovingeyes ( 575063 ) on Wednesday October 12, 2005 @10:24AM (#13773034)
    No kidding! If a car manufacturer produces a car that has a faulty part, is the engineer held liable? Hell no! It's the company. You don't hear John Doe recalling the cars. It's GM that recalls it. Whether John is fired or not is a different issue and up to the company. Similarly, the software company is liable for the product. You blame Microsoft (sorry, it was an easy target)!
  • nice sound bite (Score:1, Insightful)

    by romeo_in_blk_jeans ( 782924 ) <mythandraNO@SPAMjuno.com> on Wednesday October 12, 2005 @10:24AM (#13773038)
    The only thing that's happening here is a nice sound bite that's engineered to sound good to the clueless masses but, ultimately, isn't meant to go anywhere or do anything. Basically, it's politics in action. "See? I'm tough on problems! I'm a go-getter! I want to hold the developers personally responsible for the bugs they write!" Whatever.
  • Bah (Score:2, Insightful)

    by kpat154 ( 467898 ) on Wednesday October 12, 2005 @10:25AM (#13773048)
    This is just what the software industry needs: Another business guy who has never written a line of code trying to tell the rest of us how to do our jobs. For all of the whining and crying about bad software you'd think they'd actually put the developers in charge for once. I can't speak for the industry as a whole but from my perspective 70% of the problems in the development world come from business types setting impossible deadlines and failing to listen to their developers.
  • by Skye16 ( 685048 ) on Wednesday October 12, 2005 @10:25AM (#13773050)
    We hold them liable for defects that cause people to get hurt.

    If you're going to attempt to compare apples and oranges, let's at least use an orange colored apple, shall we?

    It'd be like holding car manufacturers liable for not making a car absolutely impossible to break into.
  • by 91degrees ( 207121 ) on Wednesday October 12, 2005 @10:25AM (#13773051) Journal
    Hold the vendors responsible. They are responsible for 100% of all problems that are not the fault of the customer.

    The vendor then holds the developer responsible. They are responsible for 100% of all vendor bugs that are not the responsibility of the vendor.

    The developer then holds the programmer responsible. He or she is responsible for 100% of all developer bugs that are not the responsibility of the developer.

    It's the way it works everywhere else. If you have a faulty product, you take it back to the shop. They then take it back to the manufacturer and if it's a fault caused by a specific individual, they either sack him or train him properly. The purchaser would generally not sue the guy on the production line or the designer, even if it was their fault.

    There are good reasons for doing things this way. It prevents people from passing the buck. It means each entity along the line is wholly responsible for ensuring quality.
  • Liable for what? (Score:4, Insightful)

    by mccalli ( 323026 ) on Wednesday October 12, 2005 @10:26AM (#13773057) Homepage
    For bugs in the code you write? For bugs in the compiler which compiled it? For bugs in the operating system which ran the code? For bugs in the design of processor which executed it? For impurities in the particular processor the code was run with which caused it to malfunction at a certain clock speed?

    Nonsense.

    Cheers,
    Ian

  • He can't afford it (Score:5, Insightful)

    by samjam ( 256347 ) on Wednesday October 12, 2005 @10:26AM (#13773059) Homepage Journal
    Few people on this planet can afford software developed to such a standard.

    There will always be a market for "cheaper" software that is not guaranteed to such a level, with support contracts instead, where developers will try a moderate amount to fix problems as they arise.

    From another perspective, the market demands cheap software - not good software - which is why there is so much of it.

    Sam

  • Dear Mr Schmidt (Score:1, Insightful)

    by Anonymous Coward on Wednesday October 12, 2005 @10:28AM (#13773075)
    Dear Mr Schmidt,


    Thank you for your insightful comments regarding security flaws in code. As a well regarded member of the 'cyber-security' community, I find your perspective to be quite fascinating.


    No doubt, in your long years as the former head of security with this community's favourite software development company, Microsoft, you gained much valuable experience in developing secure code.


    I am not entirely clear how you envisage this 'personal liability' working in practice. Should we perhaps place a lien on a programmer's personal property, dwelling and car as soon as he or she begins development of software? This will no doubt have the beneficial effect of attracting many new recruits to this fun and exciting industry.


    Might I also suggest, whilst we consider matters of personal responsibility, that we hold politicians and their appointees personally responsible for their actions. There is the small matter of the US national debt, which I am sure we could sit down and discuss at some length.


    Kind regards,



    Anonymous Coward

  • No problem (Score:1, Insightful)

    by Anonymous Coward on Wednesday October 12, 2005 @10:28AM (#13773080)
    But... You no longer get to dictate any kind of timeframe for completion. It will be done when I'm certain that it's perfect.

    Deal?
  • by rovingeyes ( 575063 ) on Wednesday October 12, 2005 @10:29AM (#13773092)
    Holding companies accountable, as suggested in the article, might be a slightly better solution, but again it's somewhat complicated when you start trying to hold an overseas company accountable

    You don't hold overseas companies accountable; it's not our job. We hold local companies accountable. They received the money from us. We don't care how they spend it or don't spend it. Normally these companies don't tell you upfront that they are the middleman. If they do, then their accountability is diminished. But in reality most of these companies say they are producing the code, and have their licenses and brand name on it. So you just hold them accountable. If the software screws up, they pay, not the overseas company.

  • Flawed Premise (Score:2, Insightful)

    by GogglesPisano ( 199483 ) on Wednesday October 12, 2005 @10:29AM (#13773094)
    To put the entire blame on the developer misses the point.

    While programmer ignorance, incompetence, and/or laziness certainly play a role in the problem, there are other factors that should be considered:

    (1) Death-march-style deadlines imposed by management, leaving no time for proper design, threat modeling, or testing.

    (2) Security flaws in the underlying infrastructure (operating system, network, etc).

    (3) Malice or stupidity of authorized users who bypass established safeguards.

    Security is the responsibility of everyone involved in the creation, management, and use of a system, not just the hapless developer.

  • by Thud457 ( 234763 ) on Wednesday October 12, 2005 @10:32AM (#13773128) Homepage Journal
    1. What about slipshod companies that don't have proper processes in place to test & verify code before they ship it?

    2. What about laissez-faire management that ignores any such processes that are in place so as to ship code on some arbitrary market-driven deadline?

  • Full of "Schmidt" (Score:5, Insightful)

    by guitaristx ( 791223 ) on Wednesday October 12, 2005 @10:33AM (#13773147) Journal
    This is absolute bunk! Most often, programmers would have a 5-10% stake in responsibility when compared with the mountainous bureaucracy above them. Consider how often a non-technical exec overseeing a software development project will agree to a contract that is nigh impossible to complete on time. The customer holding that contract begins squeezing testicles, placing pressure (by extension, through the bureaucracy) on the entire development process. The exec says, "You mean there isn't a programmer writing or debugging code this very instant!? What a crime! You're not doing your jobs properly!" The truth of the matter is that ~30% of the project timeline should be research and design. Without a good design, and resources on hand, bugs creep in. It is impossible to test quality into software; it must be designed in.

    Programmers don't draft contracts, they don't set deadlines, they don't make budget decisions, and certainly aren't responsible for failing to keep bugs out of a system that was (due to poor decision making in the aforementioned areas) designed to have bugs.
  • Re:CMMI (Score:5, Insightful)

    by ShieldW0lf ( 601553 ) on Wednesday October 12, 2005 @10:34AM (#13773172) Journal
    CMMI doesn't guarantee good practice any more than membership in the Better Business Bureau guarantees good business. But I'd rather work in a shop that has CMMI in place than one that doesn't. It's insurance against the sort of death marches that create slapdash practice, shoddy product, and security holes in the first place.

    That's where this sort of thing leads: insurance.

    If something like this were to happen, there would be an immediate chilling effect on software development, followed by liability insurance policies similar to what doctors have. Software developers would start having this insurance, and then when the end users start making claims, the mighty insurance companies will simultaneously raise their rates and use their financial and political powers to buy laws that cap their liability.

    Developers pay money, insurance companies get money, end users get screwed, politicians and executives get rich. This is called "building economic value".
  • Re:CMMI (Score:1, Insightful)

    by Anonymous Coward on Wednesday October 12, 2005 @10:38AM (#13773212)
    Insurance and corporations. All developers would just shield themselves in the exact same fashion that CEOs do: hide behind a few pages of paper entitled "Articles of Incorporation."
  • Re:Right.... (Score:1, Insightful)

    by jimbolauski ( 882977 ) on Wednesday October 12, 2005 @10:39AM (#13773215) Journal
    I'm sure you have heard of contractors being sued or put in jail because they didn't follow code. The difference between software and construction is that there are rules that govern construction; even personal sheds must meet certain codes. I don't think that there are any regulations on software. Although if regulations were put in place, the cost of software would rise.
  • by Proaxiom ( 544639 ) on Wednesday October 12, 2005 @10:41AM (#13773242)
    The company writing the code should be responsible for organizing such things.

    You got it right. Producing good code is a complicated process, not something one person can do. You need controls. You need reviews. You need methodical testing.

    Why blame the developer who wrote the buggy code, and not the tester who missed the bug? What about the designer who produces a complicated bug-prone design?

    Good software is a collaborative effort. You need a lot of people who know what they're doing working within a good process. Singling one person out in the system is misguided.
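The "methodical testing" the parent describes can be sketched cheaply. Assuming a hypothetical `clamp` helper (invented here for illustration), boundary-value checks are exactly the kind of control a process can enforce even when an individual forgets:

```python
def clamp(value, lo, hi):
    """Keep value within [lo, hi] - a classic home for off-by-one
    or swapped-argument bugs that boundary tests catch cheaply."""
    if lo > hi:
        raise ValueError("lo must not exceed hi")
    return max(lo, min(value, hi))

# Methodical boundary tests: interior, both edges, both overshoots, bad input.
assert clamp(5, 0, 10) == 5
assert clamp(0, 0, 10) == 0      # exact lower edge
assert clamp(10, 0, 10) == 10    # exact upper edge
assert clamp(-1, 0, 10) == 0     # below the range
assert clamp(11, 0, 10) == 10    # above the range
try:
    clamp(5, 10, 0)              # misuse must fail loudly, not silently
except ValueError:
    pass
```

The point is the checklist of cases, not the function: a review process asks "did you test the edges?" regardless of who wrote the code.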

  • Profession (Score:5, Insightful)

    by archeopterix ( 594938 ) * on Wednesday October 12, 2005 @10:43AM (#13773261) Journal
    Merely holding developers accountable won't do anything without big, big changes in the software industry. Look at people who are personally accountable for their fuckups - medical doctors. There are several distinct things about them:

    1. You cannot become a doctor without long theoretical and practical training, intermixed with hard exams. All this is heavily regulated. To become a coder, you just have to pass a job interview. Software engineering certifications are optional and generally regarded worthless.

    2. Doctors are insured against malpractice. The costs are high, and generally passed on to patients.

    3. Doctors can choose not to operate (administer drugs, etc.) if the action constitutes malpractice. In the software industry it's "use this braindead tool, or get fired".

    4. Malpractice. Ok, today's revolutionary therapy may be tomorrow's malpractice (or vice versa), and experts might disagree about some practices, but there is some sort of general agreement on what constitutes malpractice. I'm not sure whether IT is mature enough to speak of "malpractice" here.

    To sum it up: yeah, you can make developers liable for their mistakes, but the consequences would be huge. The costs of IT would skyrocket. Are you ready to pay for that?

  • by Chas ( 5144 ) on Wednesday October 12, 2005 @10:47AM (#13773304) Homepage Journal
    Dammit!

    I'm so fscking SICK of these people who treat security as if it's something that can be permanently gained by doing A, B, and C.

    BULL!

    Security is about understanding your platform.
    It's about knowing the strengths and weaknesses of said platform.
    It's about maximizing the strengths and limiting/minimizing the impact and exploitability of the weaknesses.

    It's about doing A, B and C, to get going. Then next week, you do D and E. Then think about implementing F. But make sure that it doesn't conflict with B.

    Also, they need to understand that security is NOT about keeping people out of the system. Face it. If someone wants to get into your systems bad enough, they WILL get in. Regardless of your protections.

    It's about making it so difficult to access it in an unauthorized manner that:

    A: The invader gives up and moves on to easier targets.
    B: The invader spends so much time trying to gain access that he eventually gets noticed.
    C: The invader has to utilize truly heroic (and traceable and wildly obvious) means to gain access, so he gets noticed right away.

    So please, people! STOP with the damn pipe-dreams about "totally secure" systems already!

    The only "totally" secure system is one that's been rendered down to shavings and dispersed in random geographic locations via wind, water, and other means of distribution.
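The "make it slow and noisy" strategy described above can be sketched as a toy login throttle (the class and numbers are hypothetical, purely to illustrate the idea): each failed attempt doubles the required wait, so brute force either gives up or becomes slow enough to notice.

```python
class LoginThrottle:
    """Per-account exponential backoff on failed login attempts."""

    def __init__(self):
        self.failures = {}  # account -> (fail_count, earliest_next_try)

    def allowed(self, account, now):
        # An account may try again only after its backoff deadline.
        _, next_try = self.failures.get(account, (0, 0.0))
        return now >= next_try

    def record_failure(self, account, now):
        count, _ = self.failures.get(account, (0, 0.0))
        count += 1
        delay = 2 ** count  # 2, 4, 8, ... seconds of forced waiting
        self.failures[account] = (count, now + delay)

    def record_success(self, account):
        self.failures.pop(account, None)

t = LoginThrottle()
t.record_failure("alice", now=0.0)   # next try allowed at 2.0
t.record_failure("alice", now=1.0)   # tried early anyway; next try at 5.0
print(t.allowed("alice", now=2.0))   # False
print(t.allowed("alice", now=6.0))   # True
```

This doesn't keep a determined attacker out; it just raises the cost and the visibility of trying, which is the whole point of the comment.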
  • by jellomizer ( 103300 ) * on Wednesday October 12, 2005 @10:48AM (#13773318)
    1. We can pass the blame for any bugs in libraries, in other people's code that we use, or in the operating system, because we followed the specs of the 3rd-party tool but the 3rd-party tool is not working up to spec.

    2. We get paid for the full development cycle, with no pressure to get it done on time, or even close.

    3. The specs for the application never change from the written specs that existed before it was written.

    4. We are not responsible for any flaws in old versions when there is a newer version out there.

    5. The latest version of the application is less than 3 months old.

    6. The application went through full debugging and testing for 2 years, with at least 10 people per line of code.

    7. The application doesn't try to keep compatibility with an older system.

    8. It is used only on hardware whose specs were approved and that was created before the release of the application.

    9. And the developer wants to support it.

    When developing a car or building a house, there is a lot more prework that goes in; they know what they want and how it works before they build it. Programming right now is not set up like that, because that would be too expensive for a single application or a custom application. Plus, it will make more people decide not to be a programmer if they are responsible for every line of code they ever wrote.
  • by KWTm ( 808824 ) on Wednesday October 12, 2005 @10:50AM (#13773340) Journal
    My first reaction was: I wonder which lobbyist of a Large Software Company helped put this one through?

    The programmer is personally liable, but the big corporation who employs him/her profits from the work? Wasn't the whole point of creating a corporation to put a degree of separation into liability?

    Also, even if A Large Software Company promised to protect their own employees (some liability insurance as part of the benefit, say), this would still be bad news because it discourages independent programmers and coerces everyone into joining A Big Corp.

    A better idea would be to make it optional, like certification by a licensed Software Engineer. Just like, for example, how you could build your own toolshed with wood and hammer, but to build a house, you have to get a Licensed Inspector or be a Licensed Civil Engineer or something. (Details fuzzy, but you get the idea.)

    Okay, now to go RTFA.
  • Re:I can see... (Score:5, Insightful)

    by MasterOfUniverse ( 812371 ) on Wednesday October 12, 2005 @10:51AM (#13773347)
    Exactly. If a piece of software makes $1 million, do the developer(s) get the million? Then why should they be held responsible for the "loss" and not the profit?
  • Re:Sheesh! (Score:5, Insightful)

    by arkanes ( 521690 ) <arkanes@NoSPam.gmail.com> on Wednesday October 12, 2005 @11:12AM (#13773554) Homepage
    Unfortunately, it'd also completely destroy the very strong non-professional software development community. Not just OSS either, but shareware, hobbyists, even personal development. The tools required to do software development, like a compiler, would be enormously more expensive. So the question is whether the cover of professionalism is worth the impact of essentially destroying the amateur community, and whether the economic gain of (maybe) better software is worth the massively increased price of software development, the essential extinction of low-price shareware, and the loss of the freedoms that OSS provides, notably the push to open standards that OSS drives. We would eventually have a "big 3" (or maybe 5 or 6) of software development, just as we do with automobile manufacturers, to the detriment of the consumer.
  • Re:CMMI (Score:5, Insightful)

    by Danse ( 1026 ) on Wednesday October 12, 2005 @11:13AM (#13773569)

    Yes, but if the hypothetical law was written so that the coder was responsible, as recommended by the ex-cybersecurity czar, it wouldn't matter how many levels of incorporation you hid behind.

    Well, it would probably eliminate at least 90% of the software being written, since there aren't many coders who would want to be held personally responsible for flaws in the code, especially since it's usually a complex team process where they don't always have the final say in the outcome. So I guess that would reduce the overall number of bugs, right? :) Seriously though, I think this guy is barking up the wrong tree. You can put methods in place to improve software quality, but I don't believe it's possible to produce perfectly secure software, of anything more than very basic complexity, in a timely manner and for a price that people are willing to pay. Feel free to prove me wrong, but I haven't seen it done yet.

  • by MerlinTheWizard ( 824941 ) on Wednesday October 12, 2005 @11:14AM (#13773574)
    So, not content with the actual state of things (an ever-decreasing number of young people willing to choose a career in the software development field, at least in most developed western countries), the guy wants developers to be held responsible for the bugs they introduce (as if that were easy to determine). So basically no one will want to be in the field anymore, which will just make everything worse and force us to outsource development work even more than we already do. Yes, all in all, a great move indeed.
  • Contempt. (Score:5, Insightful)

    by CDPatten ( 907182 ) on Wednesday October 12, 2005 @11:17AM (#13773600) Homepage
    Programmers are not a parallel to automotive makers; they are a parallel to authors, book writers. Can you think of anything more absurd than suing the author of a book over typos? Or the reviewer of that book who says "this is the best book of the year" when you thought it was the third best?

    This is the same reason patents on software are ridiculous, can you patent a love story plot? It's just absurd. This is another example of our society's run-away liberal government mentality. Big government stifles creativity, freedom, and crushes capitalism.

    A case like this should be thrown out of court as a frivolous lawsuit and the lawyer held in contempt, but we won't get that from activist judges.
  • Re:CMMI (Score:3, Insightful)

    by LeonGeeste ( 917243 ) * on Wednesday October 12, 2005 @11:18AM (#13773609) Journal
    Hold on - insurance is actually a good idea. That way, clients get compensated for a bad product, and developers pay premiums based on their history. The liability insurance problem with doctors is a problem of the legal system, not with insurance itself. Payments vary widely, probably partially due to juries' emotionalism, but more likely due to the fact that they have nothing to compare them to. If you break a vase worth $10,000, they award $10,000 + admin costs. It's really simple. But people are not allowed to negotiate with doctors before operations: "I will pay you for surgery, but only if you agree to pay $X to my family if you kill me, $Y if you lop off a limb, etc." So juries can never know what a "reasonable" payment is.

    If payments vary widely and, as is the case, depend on a doctor's wealth, NOT the actual harm done (rich doctors pay more), it's extremely difficult to insure, and, even worse, becoming a better doctor won't lower insurance premiums! (This is because everyone will make some mistake at some point, and at that point the jury will award an amount closer to the doctor's net worth, meaning that over time bad doctors pay the same as good ones.)

    The way to solve this is to agree to a specific schedule of payments if there are bugs as part of a contract to develop code. This avoids all the problems you describe above (like trying to get out of liability) and keeps down insurance costs for good coders.
  • Re:CMMI (Score:5, Insightful)

    by Impy the Impiuos Imp ( 442658 ) on Wednesday October 12, 2005 @11:19AM (#13773630) Journal
    Can we make the politicians responsible for the consequences of their actions?
  • Re:CMMI (Score:2, Insightful)

    by Impy the Impiuos Imp ( 442658 ) on Wednesday October 12, 2005 @11:27AM (#13773685) Journal
    I have seen severe, fundamental design flaws get through to nearly the end of a project, passing multiple code and design reviews, and that's without a sinister agent out there trying to defeat you.

    It cannot be guaranteed.

    People who plan for long-term storage of nuclear waste have as rule #1 that they acknowledge they cannot design a system that will defeat people determined to break in. If the army protecting it goes AWOL over the centuries, as happened at the great pyramids, well, ...

    Even if you could prove mathematically a system was secure, there's still the social engineering aspect. Which, I see from various news stories, seems to account for a good chunk of these security "lapses".

    And I don't think you could prove a system mathematically secure "in general" anyway, as data must go over a network, and people can crack encryption given enough time.

  • Re:CMMI (Score:5, Insightful)

    by Valdrax ( 32670 ) on Wednesday October 12, 2005 @11:30AM (#13773699)
    Ok, developers pay money, insurance companies get money. So, how does this screw end users? Software developers would be forced to write more secure code to avoid crippling insurance rates. How do politicians and executives get rich, any more than they do already?

    Three words: Medical malpractice insurance. Take any side of this issue you want. In the end, patients get screwed somehow. You want this for software?
  • by Quasar1999 ( 520073 ) on Wednesday October 12, 2005 @11:33AM (#13773738) Journal
    I think you gave away a little too much. Besides, I was under the impression that if a developer checks in code that breaks the build they are put into 'developer jail' to prevent them from screwing up more of the code. Did this not happen in your example? If not, it defeats the purpose of 'gates' in the first place. I used to work for a company that was contracted out to do work at your former company. Needless to say, the 'gate' system was totally stupid, as the person in charge of integration simply changed the requirements until our buggy code passed through the gate (redefined the requirements until the bugs were in code that was no longer mandatory). Go figure. Took me a few months to find another job and quit, but I know exactly how you feel.
  • Re:Deadlines (Score:3, Insightful)

    by tomstdenis ( 446163 ) <tomstdenis AT gmail DOT com> on Wednesday October 12, 2005 @11:37AM (#13773770) Homepage
    Yeah that's the point. Don't take money to do bad things.

    If your employer says "get this ready tomorrow or you're fired" you're probably not at a good place anyway. And if enough people held such personal convictions the employer would have no choice.

    That said, all too many developers don't do their share of documentation or proper development practices. How many developers write doxygen/javadoc comments? How many developers verify their code? Write use cases, at least? etc...

    There should be liabilities for software developers. Otherwise what are they worth if any "monkey" can develop software?

    Tom
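    Tom's point about doxygen/javadoc comments and verified use cases has a lightweight analogue in most languages. As a sketch (the function and its contract are invented for illustration), Python docstrings can double as both documentation and executable use cases via doctest:

    ```python
    def clamp(value, lo, hi):
        """Return value limited to the inclusive range [lo, hi].

        The examples below are both documentation and tests:

        >>> clamp(5, 0, 10)
        5
        >>> clamp(-3, 0, 10)
        0
        >>> clamp(42, 0, 10)
        10
        """
        if lo > hi:
            raise ValueError("lo must not exceed hi")
        return max(lo, min(value, hi))

    if __name__ == "__main__":
        # Run the documented use cases as a regression test.
        import doctest
        print("doctest failures:", doctest.testmod().failed)
    ```

    The habit is the point, not the tool: the documented behavior is checked every time the tests run, so the docs can't silently rot.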
  • Re:CMMI (Score:2, Insightful)

    by 'nother poster ( 700681 ) on Wednesday October 12, 2005 @11:42AM (#13773804)
    No. Whenever we try, they pass a law granting themselves immunity. Funny how that works.
  • by willCode4Beer.com ( 783783 ) on Wednesday October 12, 2005 @11:44AM (#13773827) Homepage Journal
    Let's not forget that nobody has really figured out how to manage software development, while the demands on software keep going up.

    Microsoft (in days of old) was criticized for raiding the top developers from other companies and universities. So with the top developers in the world we got Windows, Office and IE. (I don't think there is a need to say what people think of the quality here.) Google is now the one raiding the top coders, yet they are still producing some buggy code.

    If the best in the business can't produce secure, bug-free software, how can anybody else? Granted, we should all strive to make the most secure and bug-free code possible. But I really don't think it will be a common practice until the management of the process is figured out.
    We've seen waterfall fail, over and over and over and over ....
    RUP, while an improvement, still falls short.
    Agile (XP, etc...) tries to address some realities of development but, it still doesn't really manage it.

    Still, we do see some really good software pop onto the scene every once in a while. Even this is a symptom. The same groups who produce these gems often fail to repeat the process on other projects.
  • by tomstdenis ( 446163 ) <tomstdenis AT gmail DOT com> on Wednesday October 12, 2005 @11:45AM (#13773835) Homepage
    I think you missed an important engineering concept known as the "tradeoff". Usually 100% verification of software is just not possible [e.g. prove that your implementation of AES works for EVERY plaintext/key combination]. And bugs are normally not intentional; they simply lurk in use cases developers/coders haven't explored.

    So while I agree with the sentiment that bug-free software is possible, the notion that no software [or hardware, for that matter] should ever have a bug is ludicrous, and isn't even the standard held by "real engineers" [e.g. people who build buildings, roads, bridges].

    Tom
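    Tom's AES example can be made concrete with a back-of-the-envelope count (the checking rate is an assumption picked to be generous). This is why practical assurance of crypto code relies on published test vectors and review rather than exhaustive verification:

    ```python
    # Sketch of why exhaustive verification of AES-128 is impossible:
    # count every (key, plaintext) pair, then divide by an assumed,
    # deliberately generous checking rate.
    KEYS = 2 ** 128               # possible 128-bit keys
    PLAINTEXTS = 2 ** 128         # possible 16-byte plaintext blocks
    PAIRS = KEYS * PLAINTEXTS     # what "EVERY combination" really means

    CHECKS_PER_SECOND = 10 ** 12  # assume a trillion encryptions per second
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365

    years = PAIRS // (CHECKS_PER_SECOND * SECONDS_PER_YEAR)
    print(f"(key, plaintext) pairs: 2^{PAIRS.bit_length() - 1}")
    print(f"years needed at that rate: roughly 10^{len(str(years)) - 1}")
    ```

    The pair count is 2^256, which works out to on the order of 10^57 years even at that fantasy rate, so "prove it for every input" is off the table and testing has to be a sampled tradeoff.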
  • by ShieldW0lf ( 601553 ) on Wednesday October 12, 2005 @11:48AM (#13773859) Journal
    The best medical care system in the world comes from keeping the insurance companies out of it. Instead of paying a team of 12 to determine if you should be healed, then paying a doctor to heal you, we just get the doctor to heal you and call it a day. It's much more efficient, and enriches everyone except insurance salesmen.

    Insurance was created as a concept to deal with the fact that in a purely capitalist society there is no sense of community or common good and no one will help you when you need it most. Does anyone actually consider it to be an efficient and effective means of addressing this need?
  • by ctid ( 449118 ) on Wednesday October 12, 2005 @11:48AM (#13773871) Homepage
    If the law simply said that software products must be traceable to a company with more than X% (for X>80) certified software developers, and that the certification must be in a territory where the sanctions are credible, you'll see off-shoring end. Grandfather in existing developers and you're all sorted.


    Wouldn't this utterly destroy the Free Software movement? (Incidentally, we'd probably lose the internet too).

  • Re:CMMI (Score:2, Insightful)

    by wbradney ( 922339 ) on Wednesday October 12, 2005 @11:49AM (#13773876)
    Right. I'd be happy to take personal responsibility for the code I write for my employer, as long as I get: a) a direct and substantial cut of the profit and b) the code I write belongs to me, not them, and I take it with me when I quit or get fired. I will, of course, license it back to them for a 'reasonable' fee...
  • Re:CMMI (Score:3, Insightful)

    by sedyn ( 880034 ) on Wednesday October 12, 2005 @11:52AM (#13773907)
    Exactly. I forgot about the owning of code part...

    Hell, then they really don't want to piss me off. Then I'll just GPL it and make ownership topple like dominoes.
  • Re:CMMI (Score:3, Insightful)

    by dwandy ( 907337 ) on Wednesday October 12, 2005 @11:52AM (#13773909) Homepage Journal
    This is only a good idea if you're an insurance company, since they are the only ones guaranteed to make a profit on this.
    developers pay premiums based on their history.

    I don't know how it works where you are, but 'round here people pay car insurance based on how everyone else drives (factors like age, gender, etc. can play an enormous role in the rate, regardless of the driver's own record).

    The liability insurance problem with doctors is a problem of the legal system, not with insurance itself.

    ...and that very same legal system will preside over these claims.

  • Re:CMMI (Score:3, Insightful)

    by Kortec ( 449574 ) on Wednesday October 12, 2005 @11:55AM (#13773930) Homepage
    I agree that obviously this sort of a development, if you'll excuse the pun, would lead to the need for software malpractice insurance, but this is by no means any sort of solution. It's a decently well documented fact that malpractice insurance costs are driving many out of the medical profession. On the anecdotal level, I'm personally aware of people who have stopped doing riskier procedures, root canals in the case I'm thinking of, just to lower their insurance bills so they can stay in operation. (For a less anecdotal approach, there's some documentation here [wikipedia.org], and lots here [google.com].) Does this translate to programmers only using languages and operating systems deemed "well supported" by some bean counter, and therefore less risky? Forget about the IP debacle, can we even begin to quantify the sort of effect this would have on innovation and technical advancement? Taking risks and trying out new things is the very soul of technical work, and working with unstable material is the most efficient way to make it stable.

    On a more pragmatic level, there are a number of differences between the more traditional professions to be held liable and that of the code-monkey. Most important, to my mind, is that at the larger firms much work is done in teams, so tracing down which exact individual is personally and exclusively responsible for a specific bug would be expensive on a grand scale. This, therefore, would seem to point to a larger corporate liability, which I guess is fine for the truly large corporations, but could kill a small company or an open source group without a second thought. I have yet to see any large company (*cough* Microsoft *cough*) actually being held truly responsible for their mistakes and bugs, so this obviously hasn't happened yet.
  • by Anonymous Coward on Wednesday October 12, 2005 @11:56AM (#13773932)
    Amen, Brother.

    In trying to maintain some sanity in the development process, I am constantly reminding project managers that I will be the one held responsible for errors, despite the fact that I didn't write the ambiguous specs, didn't set the unreasonable schedule, and didn't under-estimate the required QA resources. This is just another example of the misunderstanding that most people have of the software development process - coders get blamed for bugs and managers escape accountability (and get promoted for getting projects out "on time").
  • money (Score:5, Insightful)

    by willCode4Beer.com ( 783783 ) on Wednesday October 12, 2005 @12:00PM (#13773970) Homepage Journal
    And who is going to pay for this?

    We create a "secure" web browser but it's gonna cost $10K per copy. This will cover the cost of development, security auditing, extra QA, and the dev cycles that go along with it. Since the OS can't be trusted to run the browser, it will only work on a dedicated browsing computer with no operating system. Since other people's code poses a risk, it will not run javascript, java, flash, or any kind of plugin.
    Who would buy this?

    If developers are carrying malpractice insurance, then the insurance companies are going to have a lot to say about how development is done, and *if* it should be done. Your boss hands you a project specification, you send a copy to your insurance co. You then tell your boss that you can't work on his project because you won't be covered.

    Developers are going to have to charge a lot more for their services. Both for the personal risk involved and to cover the cost of insurance.

    Programs can be made "more" secure and have "fewer" bugs, but it's going to take more time. Time=money. Look how everybody is whining that Microsoft is taking too long for the next version of Windows. Maybe if they want it to be *secure and bug free* they'll tell MS not to rush; to take a few extra years to be sure about the product; and they'll pay more for it.
  • by Marsala ( 4168 ) on Wednesday October 12, 2005 @12:11PM (#13774056) Homepage

    Actually, you're not far off the mark here.

    In any company, there is one person and one person alone who's responsible for the defective product -- the CEO. If payroll isn't met on time, that's the CEO's fault. If someone gets mugged out in the parking lot because there wasn't adequate lighting or your building security was nonexistent, that's the CEO's fault. If there's no toilet paper in the bathroom, that's the CEO's fault. If the company fails to meet its sales expectations, whether it's because the sales staff sucks, the marketing staff screwed up their job, or the engineers decided that the 40% chance of the widget blowing up and unleashing a bolt of lightning into the frontal lobe of the user was "good enough"... that's the CEO's fault.

    The CEO is the chief executive officer. He's responsible for everything that his company does and everything that happens at the company. The weight of the whole company rests on his shoulders, and it's why he gets the big bucks, the golden parachute, and the nice office.

    It's also why it's his responsibility to make sure that the developers that get hired by his company have either been trained properly or get trained properly. It's why it's his responsibility that project managers know what the hell they're doing and make sure that when you design "end-to-end solutions" that they don't have gaping security holes like customer data passing into the accounts payable system in clear text. It's why it's his job to hire a CTO that understands all of this and can hire the project managers and programmers necessary to do the job right.

    Schmidt is trying to pass the buck for his mistake. It's as absurd and cowardly as a general saying he lost a war because his soldiers didn't fight hard enough, or an NFL coach blaming a missed field goal for the loss.

    If he was serious about getting it fixed, there'd be a lot less whining to a trade rag trying to pin the blame on his employees, and a whole lot more fixing it.

    Leadership for the win. \o/

  • by AstroDrabb ( 534369 ) on Wednesday October 12, 2005 @12:11PM (#13774058)
    I agree 100%. I think all companies should be liable for their products. However, I do not think it should be at the individual employee level. After all, the point of the fictitious entity known as a "corporation" is to remove personal liability. If one employee causes a bad product, well, fire that employee. However, in the end it should be the "company" that is liable.

    If Ford has a car with faulty steering that locks and causes me to be in a very bad accident, should Ford be liable? IMO, yes. Should the engineers be personally liable? IMO, no. It is up to Ford and their management to hire competent employees and competent management to make sure those employees put out a safe product.

    Imagine what would happen if people were allowed to sue an individual employee because of a faulty product. The cost of labor for _any_ technical job would go through the roof, because those engineers, developers, machinists, etc. would all need to buy personal liability insurance, just like doctors have to. One of the reasons doctors _have_ to charge so much here in the USA is the cost of insurance to protect them against sue-happy lawyers and people. Top surgeons can easily pay $100,000+ a year just for insurance!

  • Re:CMMI (Score:2, Insightful)

    by cenobyte40k ( 831687 ) on Wednesday October 12, 2005 @12:13PM (#13774073)
    You can just as easily pay someone to certify that the closed source software does what you want. I have a lab just for that here at work. Nothing gets rolled out until it is completely and totally tested. You really need to stop looking at this as some way to get rid of companies that you don't like and think about it logically. How about this: Ford publishes the blueprints for all its cars and trucks every year; most major automakers do. (You can pick them up at most auto parts stores.) These cars are much less complicated than, say, SQL, Linux or Office, so it should be easier to figure out if there is a problem. However, I am going to bet that A) you don't get the manual and check over everything about your car, and you don't pay anyone to do it either, and B) you would expect to be able to sue if the car has some horrible problem that is dangerous. By your logic, because they publish the blueprints you should not be able to, which is just silly.
  • by DavidpFitz ( 136265 ) on Wednesday October 12, 2005 @12:18PM (#13774117) Homepage Journal
    2. We get paid for the full development cycle, and no pressure to get it done on time, or even close.

    Get real! No pressure to get it done on time? What other engineering discipline would this be acceptable in? None. "Sorry sir, your bridge is not built yet - but we don't feel pressured to complete it in the timeframe we said we could do it in".

    3. If the Specs for the application never changes from the writen specs of the application before it is written.

    The world changes. Deal with it. Or be unemployed. Requirements change, just a fact of life.

    4. We are not responsible for any flaws that happen in old versions when there is a newer version out there.

    What?! So, never retro-fix a serious security flaw into a product just because a newer version exists? So, in your world you would just give your customer the one-fingered salute and tell them to upgrade? You'll find yourself without a customer.

    5. The latest version of the application is younger than 3 months.

    I have no idea what you mean by this. Are you saying you would not take responsibility for anything you produced which is older than 3 months?!

    6. The application went through full debugging and testing for 2 years with at least 10 people per line of code.

    This is engineering. There is an acceptable level of failure.

    7. The application doesn't try to keep compatibility with an older system.

    Fantastic! I'll just tell the Bank who consult with me that they need to upgrade every system they have because our new application doesn't like talking to anything which is not 0 day. I can see that going down well.

    8. Is used on hardware the specs were approved in and were created before the release of the application.

    Yep, valid point. This should always be the case. Although now you can't beef up the servers in case of performance problems, even though this is the cheapest way to do it.

    9. And if the developer wants to support it.

    Now you're just making me laugh!!

  • There are so many (Score:3, Insightful)

    by niiler ( 716140 ) on Wednesday October 12, 2005 @12:21PM (#13774148) Journal
    points of failure in any software that it is impossible to know whom to blame.

    For example: today I set up HPLIP for the first time instead of HPOJ for my PSC2110. What a pain. I had no problems configuring or making, but then there was an issue when I tried installing. Clearly the HPLIP programmers' fault, right? Or was it that I was using a Slackware derivative with a mixture of packages, and as a result many libraries and config files were in non-standard places? I would have guessed that if ./configure && make worked, everything was found properly. But it wasn't. If my nonstandard config was the problem, then perhaps I'm responsible. Eventually I got everything working, but with one caveat: I could only scan as root.

    In the real world, if this happens to a litigation-happy individual who likes to bill $400/hour, he'll sue:

    • The distro - for not giving directions, or having the package properly precompiled for exactly his system
    • Slackware - for not providing a compatible package (the reasoning being that if the distro is Slackware-based, then Slackware must assume some liability)
    • The hpoj developers, since this could have been the issue
    • The cups developers, since this could have been the issue
    • The kernel developers, since this could have been the issue
    • HP, since their driver didn't work instantly in the desired way
    • etc...
    I'm sure I've left someone out. Anyhow, considering the sense of entitlement most people who can hire lawyers have, this is not a path that we want to walk down. Each possible point of failure would become the target for a lawsuit when the real failure might be summed up as a case of not RTFM.
  • by RobinH ( 124750 ) on Wednesday October 12, 2005 @12:21PM (#13774157) Homepage
    While this may actually be feasible for shrink wrapped software that sells a million copies and has a team of expensive testers going over it button by button, this would completely destroy custom programming.

    I write software that is usually only run on one or two computers at one location, and it's constantly modified to add features, fix bugs, etc. Our company and our customers can't afford to pay triple the cost for the stringent software testing that a huge Micro$oft type place would have, so a law making the programmers personally liable would make all custom software prohibitively expensive.

    We do sell our code with a 1-year warranty, so we agree to fix all bugs that come up within the first year. However, the agreement is not a guarantee. If there is a bug, we agree to fix it, but we're not going to compensate the customer for lost production or expenses.

    There is software in this world (I'm thinking the QNX kernel here) that actually comes with a guarantee that it works as documented. The company (QSSL) has liability insurance just in case. Of course, that makes QNX licenses more expensive than they would otherwise be.

    Most software comes with a disclaimer. Microsoft tells you that the user accepts the liability for any bugs. Even though nobody reads that disclaimer, it still exists. Right now you have a choice - you could hire someone to write code and give you a guarantee (expensive), or you could just buy something off the shelf (cheap) that would probably work ok most of the time. The article is talking about removing that choice.
  • Not a good idea. (Score:3, Insightful)

    by unoengborg ( 209251 ) on Wednesday October 12, 2005 @12:23PM (#13774188) Homepage
    People/companies are not writing bad code because they are sloppy or don't want their code to be secure or correct. They write bad code because there really is no way of ensuring security today. If there were, price-insensitive things like battleships would not be dead in the water because of software errors. I suppose you could make code reasonably secure for certain certified environments, e.g. running a certain build of MS Office on a certain build of Windows XP on certain hardware in a specified configuration.

    What if the user doesn't run it under the conditions specified, e.g. connects it to the internet when the internet was not covered by the specification; should the developer be liable then? Of course you could hold the developer liable no matter what, but that would put software development in a different position from all other products. E.g. should the building contractor of a tall building be held responsible for the damage to a parked car outside the building caused by somebody jumping from the roof in the act of committing suicide? I think not, even though the errors in building construction making this possible, and the means to fix them, are much more evident than most software problems.

    The only thing that would happen if this were introduced is that software prices would go up radically, as software companies or individual developers need to make sure they make a profit even if they have to pay damages now and then. I.e. the price of the software will have to cover more lawyer and insurance fees. If this were introduced in a country, the cost of running a business would increase significantly, and I am not just talking about the software business. How many businesses could afford to have the cost of their IT infrastructure increased by several orders of magnitude? A country that introduced such laws would kill every business that needs some kind of IT support, at least if it did not also have very high customs fees or taxes on imported products and services.

    As for the software industry of such a country, you would probably see fewer and bigger companies, with the money to bury customers claiming their rights in legal process for a very long time, perhaps until they go out of business before they get their money. The fact that there were fewer actors in the market would in itself raise the price of software due to less competition. It would also slow down the speed of development. If you, for instance, create a new version of an office productivity suite, you would probably want to test it for several years on a group of subjects who have waived all their legal rights before you release it to the general public. Then you would want to profit from that investment for a very long time. Perhaps 20 years or so.

  • by Sycraft-fu ( 314770 ) on Wednesday October 12, 2005 @01:01PM (#13774559)
    My car is buggy, very buggy by software standards. Here's a list of just a few of its bugs:

    1) It is not resilient to attacks. If someone wants to break in and steal it, it's very easy to do. Trivially easy for someone with training. The manufacturer has done NOTHING to fix this. In fact, all suggested solutions are just band-aids; they don't really do anything. Stronger glass, a kill switch, the Club, all are easily defeatable. They offer me no absolute security against attacks.

    2) My car does not deal with user error very well. If I put it in neutral and floor it, the engine will overheat and seize up, with no cut-out. If I put toothpaste in the oil tank instead of oil I'll ruin the engine. There is virtually no protection against me making mistakes, and many of the mistakes will permanently disable the car.

    3) My car doesn't handle unexpected situations well. If it suddenly hits a brick wall, it will be damaged or destroyed; same if another driver suddenly collides with me. It only operates properly under normal circumstances.

    What's worse? They KNEW about all these problems from the car's inception. They sold it to me, knowing these problems, and are doing NOTHING to fix them! Even upgrading to a newer version of my car (for which I must pay full price) won't fix them.

    So I find it absurd to say "we have to hold software to the same standard as cars" and by that mean that software should be perfect. Cars aren't perfect; by software standards they are buggy pieces of shit. I expect that software should be essentially immune to any malicious attack. If a flaw is found, I expect it fixed in a timely fashion for no charge. Likewise, I expect software to deal with user error well and not blow up if I do something wrong. Yet if I told you I wanted a car that did all that, I'd be laughed at.
  • by Phisbut ( 761268 ) on Wednesday October 12, 2005 @01:06PM (#13774605)
    If something is advertised as being secure, then it had better be secure. The same thing should apply to software. You should be responsible if you tell customers your software is secure, and it isn't.

    I don't remember ever seeing a piece of software that wasn't provided "AS IS, WITHOUT WARRANTY OF ANY KIND INCLUDING THE IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR PURPOSE". Maybe the military or NASA can afford to buy software that has a purpose, but so far, all the software I have ever installed was somehow useless by design, since none of it is warranted to serve any purpose... Hard to hold me responsible if I sell you something and tell you in BIG CAPITAL LETTERS that whatever I'm selling you is useless.

  • by onkelonkel ( 560274 ) on Wednesday October 12, 2005 @01:14PM (#13774653)
    You are correct about tradeoffs. "Perfect is the mortal enemy of good enough." For most software, good enough is, well, good enough. Development cost and time-to-market drive decisions much more than the ultimate lack of bugs. I realize that 99.9% of all software out there is not life-and-safety critical. Nobody dies if a printer driver crashes. Thus, there is no point, other than pride in one's own craftsmanship, in writing perfect code.

    However, a lot of people here have stated that it is flat out impossible to write code with no bugs. Almost as if it were a law of physics, or religious dogma. It is, of course, also a handy excuse for writing buggy software, and a great way to dodge the responsibility.

    It IS possible to write bug free software; I know because that's what I do for a living. We write software for railroad traffic control systems. An unsafe failure can easily lead to dozens of lives being lost. For an analogy, picture an intersection of two busy four-lane highways, where the traffic lights once in a while all go green at the same time.

    There simply can't be bugs in our finished software. The procedures and methods we use to ensure this are time consuming and expensive, but we have no choice. And if, God forbid, somebody died because of a bug in my software, then I would be responsible.
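    The traffic-light analogy hints at what "no bugs in the finished software" means in practice: design the system so unsafe states are unreachable, not merely untested. A toy sketch of that interlock idea (route names and rules are invented for illustration, not how any real interlocking works):

    ```python
    # Hypothetical, much-simplified interlock: conflicting routes may never
    # be cleared at the same time, and anything unexpected fails SAFE
    # (clearance denied) rather than failing open.
    CONFLICTS = {frozenset({"north_south", "east_west"})}  # routes that cross
    KNOWN_ROUTES = {"north_south", "east_west"}

    def safe_to_clear(route, currently_clear):
        """Grant clearance only if the route is known and conflicts with nothing."""
        if route not in KNOWN_ROUTES:
            return False  # unknown input: fail safe, never guess
        for pair in CONFLICTS:
            if route in pair and (pair - {route}) & currently_clear:
                return False  # a conflicting route is already cleared
        return True

    cleared = set()
    for request in ["north_south", "east_west", "diagonal"]:
        if safe_to_clear(request, cleared):
            cleared.add(request)

    print(cleared)  # only "north_south" is granted; the others fail safe
    ```

    The point of the design is that the "both directions green" state cannot be expressed at all: a reviewer verifies one small predicate instead of hunting for the bad state across the whole program.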
  • by fossa ( 212602 ) <pat7@gmx. n e t> on Wednesday October 12, 2005 @01:25PM (#13774744) Journal

    And what if Ford sells you a car that fails to leap to the side to avoid an imminent collision, causing you to get into a very bad accident? And if Ford sells you a car that can drive into a building at 100mph? And if you use your car in some extreme environment that causes the brakes to degrade rapidly? What if the steering only locks after 20 years of use? I think you need to make a distinction between gross negligence and simple physics. Certainly if Ford misrepresents the capabilities of the auto that is different, but one simply cannot expect everything to work perfectly at all times. Life is fatal; everything is a tradeoff of risks, and at the end of the day you've got to watch out for yourself.

    There's also a big difference in that if I drive a faulty car (which there are various regulations against, or at least manufacturers must meet various regulations before they can sell a car), I put you in danger. If I use faulty software, I only put my own data in danger (ignoring worms and the like). I'm not really interested in paying more for higher quality because you think I should.

    That leaves the question: if my faulty software damages your data because it contracted some malware that attacked you (or perhaps it's just faulty somehow), then who is at fault? Should the internet be regulated like roads are? I would like to think "no, certainly not", but who knows. Would regulation even improve things? Highly unlikely, I think.

  • by tomstdenis ( 446163 ) <tomstdenis AT gmail DOT com> on Wednesday October 12, 2005 @01:38PM (#13774837) Homepage
    Not disagreeing with the sentiment. I've had to deal with my share of "my god this ain't right" code because for a lot of people simply demonstrating that a function CAN get the right results is enough to call it a day.

    The problem is that us few folk who actually care to do things right at the start usually get pushed aside by the peeps who want a really quick solution. Of course it usually happens that a proper start ends up saving time and money down the road, but management doesn't care about that.

    For me the best compliment I get is "it just works". As in, people use my libraries [and various programs] and they "just work" as advertised [e.g. documented]. People also seem surprised that I document stuff [e.g. I have a manually typed/formatted manual in LaTeX as well as doxygen comments].

    On my part I do things like make rational design choices [e.g. clear function names, consistent parameter orders, return values, expected behavioural models] because *I* want to use the code. The fact that it helps others [the code is public domain] is purely immaterial.

    And I think that's the trick. Most "really crappy" software is written by people who

    a) don't know better
    b) won't be using it themselves in future work [e.g. it works now, I'm done]
    c) see a)

    Look at some kernel modules, for example. They're for the most part horribly written, but that's solely because once it works once, they think their job is done.

    Then you have a host of really buggy pieces of commercial, shareware, freeware and OSS tools that come from people who bite off more than they can chew. They come out of college or university without a single successful project under their belts, and they assume they have unquestionable knowledge about the nature of the universe.

    What's worse is some of these people turn into 50 yr olds with a chip on their shoulders about a golden yesteryear.

    If people just wrote code under the working assumption they'd have to see it again one day you'd see more modular, flexible, well documented and thoroughly tested code. Or they're stupid for inventing more work for themselves...

    Tom
  • by sterno ( 16320 ) on Wednesday October 12, 2005 @02:14PM (#13775128) Homepage
    I agree 100%. I think all companies should be liable for their products. However, I do not think it should be at the individual employee level.

    Here's an interesting question. A piece of software that is written to work with Windows has a security flaw in it. The security flaw creates an exploitable condition in Windows such that you can gain total control over the system. Whose fault is it?

    Obviously there was a security flaw in the software that you were using, but then it wouldn't be that critical if Windows handled its security better. So isn't Windows partially to blame? And what if you set it up in an insecure manner? Isn't that your fault? Or is it the developer's fault for not making it more idiot-proof?

    Now taking that down to the code inside of a program is just ridiculous. If you've got a team of 10 people (which is small in the grand scheme), each one of them could, individually, write totally secure code. However, come integration time, it turns out that they are opening up holes in each other's code. So then whose fault is it? What about QA? Shouldn't they have some liability too?

    Finally there's the PHB factor. You could have a group of the best, most security-knowledgeable programmers in the world, and they could still screw up due to lack of time and resources. What if the boss tells them to do something that makes the system innately insecure? Whose fault is it then, his for telling them to do it or theirs for not pushing back on the requirement? Not to mention what happens after people have worked a few months of 60-hour work weeks trying to get a project done.

    In the end, liability is just a dumb concept in computers. This is one of those places where the invisible hand of the marketplace is the best correction. Companies that routinely write buggy software will be smacked by the marketplace, by and large. The only exception to that rule is companies like Microsoft who have an effective monopoly. But then that's why we have anti-trust law, isn't it?
  • Re:Contempt. (Score:3, Insightful)

    by greg_barton ( 5551 ) <greg_barton@yaho ... m minus math_god> on Wednesday October 12, 2005 @04:14PM (#13776259) Homepage Journal
    This is another example of our society's run-away liberal government mentality.

    Right. This was proposed by a former member of a Republican administration, who was appointed and served at a time the Republicans controlled the White House, Senate, and House of Representatives.

    Oh yeah. We're overrun by run-away liberalism.
  • by kaladorn ( 514293 ) on Wednesday October 12, 2005 @07:03PM (#13777504) Homepage Journal
    Add to the 'sign-off' aspect the usually required (at least here in Canada) training in law and ethics and you will find that few P.E.s will sign their names or affix their seals to things they don't have relatively high degrees of confidence in. When a P.E. screws up, they lose their license to practice and quite often their business, consultancy, or academic credentials at the same time. Thus, they try very hard not to screw up. This means they act as a check on poor practices.

    But getting to be a P.E. involves overcoming the standard challenges and it isn't for everyone. A lot of engineering in non-software fields seems based around working with known processes and known parameters to produce a product or some result.

    The reason bridge building is a pretty sane discipline is that the characteristics of materials and the physics of bridges are pretty well explored. When a Civil Engineer designs a bridge, he has good computer-aided tools to do it, standard catalogs of parts and materials, and he knows all about tolerances, safety factors, and good processes. He couldn't sign off on the project otherwise without taking his career in his hands.

    Contrast that with my work, where I have to build applications using an OS I know is inherently flawed (they all are, but some more notably), it must be designed to work on a wide variety of hardware platforms (many of which I don't have on hand), it must often work with other people's code from outside my organization which is bleeding edge and often of dubious standards, and it is built with tools I only mostly trust and on top of libraries from the OS provider and from third parties into which I have no visibility. There are strategies to mitigate risk, but I'd be very damn leery of signing my name or affixing my sigil in a P.E. context to even my best code - because I know the system it is part of has so many components I don't control and so many points of failure.

    One risk mitigation strategy involves extensive testing (some say up to 90% of project cost). Anyone interested in paying $1500 for a copy of Office? I don't see many hands.

    I'm all for seeing an improvement of professional standards and practices in the field, the injection of more engineering approaches into the field, etc. But the software field moves faster (IMO) than any other technical field. It also is one in which you have the least faith in the parts you build with. Until reform happens *across and throughout* the field, any efforts to go after companies or individual engineers is a waste of time.

    Let's put it another way, more succinctly: if I had to sign off in a legal liability sense for the code I've been writing for the last two years on the current contract, I imagine I'd have written about 10% of the code I have written and I'd have demanded a *lot more* from the people supplying me with 3rd party code to integrate. Since I know the business model wouldn't support that (the costs would kill the product as it stands), I have to think this approach is only viable once we decide we don't want 'the next new thing' in software and that we care about what we get enough to pay for it.

    Someone compared the effort to Ford or GM making cars. If you want to spend $15-50K for a computer, I'm sure we can offer you a much higher level of reliability from the software. Heck, at those kinds of costs, you might get the same sorts of warranties you get from Ford and GM, though they warrant only as much as they can get away with. But if you want to pay under $1000 for the hardware and under $1000 for the principal software, then you might as well expect something that works about 1/10th as well. And it seems to me you've got that.

    So, who here is lining up to buy the first $15K personal computer?

    Nice idea, don't see it happening anytime soon.
  • by Kuros_overkill ( 879102 ) on Thursday October 13, 2005 @02:42AM (#13779784)
    Here is the thing. My car is 20 years old. I'm living in a 40-year-old building. The building is warm and secure. My car runs, and has very few maintenance issues. My 3-year-old computer is seriously out of date. If the computer could be guaranteed to last for 10+ years (even I will admit that 20 years for a car is exceptional), I would have no problem shelling out $15K for it. I think this is where some of the issues come into play. Everyone is so desperate to have the newest and prettiest that they aren't demanding the best. And companies deliver what the people want. And don't get me started on the big companies (not just M$, others do the same) that push the new and pretty just so they can keep turning large profits. Most of it is the software equivalent of repainting last year's model and selling it as this year's. Yes, it looks newer, but it still has that faulty alternator, that unreliable fuel line. E.g. Word Perfect 5.1 did everything I use Word 2003 for, so why did I need to buy Word 97, Word 2000, Word XP, and Word 2003? (O.K., I still use Word Perfect 5.1, don't tell anybody, but most people out there know what I am talking about.)

    sorry, got on a little bit of a rant there.

"Gravitation cannot be held responsible for people falling in love." -- Albert Einstein
