Security

Liability and Computer Security 159

Pelerin writes "In the latest Crypto-Gram, Bruce Schneier has written an interesting essay with some thoughts about the current lack of business incentives for the deployment and production of more secure software. His main recommendation/prediction is this: "Step one: enforce liabilities. This is essential. Today [...] the marketplace rewards low quality. More precisely, it rewards early releases at the expense of almost all quality. If we expect CEOs to spend significant resources on security -- especially the security of their customers -- they must be liable for mishandling their customers' data. If we expect software vendors to reduce features, lengthen development cycles, and invest in secure software development processes, they must be liable for security vulnerabilities in their products." Schneier's five-step plan for thinking about security is also good.

Pelerin continues: "All well and good, but this raises some questions in the case of a company offering security solutions based on open source / free software.

  • Where does the chain of liability end? Can somebody attempt to recover damages from Linus when a kernel security hole shows up?
  • Can a case be made for lower insurance rates for free software solutions? (I mean, can it be made to the accountants and the lawyers, not the techies).
  • When liability enters the picture, which mechanisms can allow free software to compete based on its merits, not on the likelihood of surviving a liability lawsuit?
"
This discussion has been archived. No new comments can be posted.

  • Free software (Score:3, Insightful)

    by chennes ( 263526 ) on Saturday April 20, 2002 @09:12PM (#3381397) Homepage
    Fortunately, the GPL states that the software is distributed under no warranty of any kind, which might provide some legal relief. If this were legislated around, it could be a MAJOR blow to the free software community -- if you can be held liable for your code fucking someone's computer up, that's a BIG incentive for small freelance coders to give up. Microsoft can pay the legal fees and out-of-court settlements -- I cannot.

    Chris
    • Guess what... Microsoft has the exact same no-warranty clause. Obviously this sort of clause is the cause of all these problems... offering a product that they admit is unsafe is, well... questionable.

      See, if the GPL gets any sort of legal relief, the next MS license will copy it, unless a Free Software clause is added to any such bill (which is unlikely, thanks to Bush's absurdly pro-business, i.e. pro-Microsoft, administration).

      In other words, only the BSD license would be screwed by this.
      • Re:Free software (Score:3, Interesting)

        by Stonehand ( 71085 )
        It would be amusing if a HUGE sticker were required to be slapped on the outside of software boxes containing such licenses, stating that "Food for thought: The publisher of this product would like you to know that he feels entitled to FUCK OVER YOUR COMPUTER AND ALL ITS CONTENTS and he won't owe you a dime."

        In big alarming black-on-yellow letters.

        Pity it'd never happen, but...
        • This is where the difference between free as in speech and free as in beer comes into play. When you pay for a piece of software, I think you have the right to expect it to behave as advertised -- fuck any "no warranty" clause to the contrary. If you obtained it for free as in beer, then there is literally no warranty: some folks published it because they thought it would be useful, and that's it. If there's an open-source piece of software that's free as in speech but not as in beer, then we have the right to expect it not to fuck up our machines, and we should be able to hold the company that sold it to us liable.
    • Re:Free software (Score:1, Interesting)

      by Anonymous Coward
      'Some States do not permit exclusion of liability, even for free software. You are not licensed to use this software in those States.'
    • just look at all of the security updates.
    • Every other software license also disclaims any warranty, so if those disclaimers are rendered null and void, the GPL's warranty disclaimer is null and void as well.
    • - Microsoft can pay the legal fees and out-of-court settlements - I cannot.

      Microsoft can't do much of that either. They might have $10^12 in market capitalization, but that would melt like a snowball in Hell if they were held strictly liable for security breaches. Their cash reserves would melt pretty fast too, if they had to pay the reported damages for each virus they enabled.

      I'll bet that any such legislation which actually gets passed will be weasel-worded enough that MS will have no real risk of ever shelling out, while Linus et al. will be on the hook for everything.

      To judge the true effect of such a bill, just watch MS's stock price: if it falls as the bill goes through, it might hold someone liable for something. If the stock price holds steady, the bill will have no effect at best, and eliminate Libre software at worst.

  • Liability? (Score:3, Insightful)

    by quantaman ( 517394 ) on Saturday April 20, 2002 @09:12PM (#3381398)
    The problem with liability is that your financial risk now becomes proportional to your success. While the model sounds good, one bad security error could potentially put the software provider out of business from the lawsuits, which would also leave hanging the people still using the software. The only time a company should be held liable is when the bug or security problem was intentionally left in (they would have had to take out a feature to fix it), and even then it's not a clear-cut issue. The only other time is when an incident happens at a time when the company has the fix but did not distribute it for some reason (i.e. marketing wanted to make the installer a different colour).

    • Who absorbs what risks between buyer and seller? It seems to me that there would be differences between free software and packaged software, between open source and closed source.
      Equitably, for free software, the buyer is assuming essentially all risks. Double your money back if not satisfied.
      With open source, the buyer is in a position to identify and fix problems and not totally at the mercy of the seller.
    • > The problem with liability is that your financial risk now becomes proportional to your success.

      That's true for all industries. If Mom'n'Pop's Tire company makes a bad tire then they won't lose much. If Firestone makes a bad tire then they lose millions.

      Fortunately, revenue and profit are also proportional to success. Well, usually.
    • The problem with liability is that your financial risk now becomes proportional to your success. While the model sounds good, one bad security error could potentially put the software provider out of business from the lawsuits, which would also leave hanging the people still using the software.

      I agree that this is problematic. This is the reason why I think that the manufacturer of a piece of software should have no liability at all unless they choose to accept it!

      I think that everybody that uses computers should be insured against risks caused by them. Now if MS says "No warranty" and the GPL says "No Warranty" then the insurance will say "MS is not better than GPL with regard to manufacturer liability, your insurance will cost the same."

      Of course this can be moderated by other factors, like product history and operator competence, but essentially it would prevent manufacturers from making claims they don't back up. The insurance companies would just tell everybody straight what the situation is.
  • by drDugan ( 219551 ) on Saturday April 20, 2002 @09:14PM (#3381404) Homepage
    I hate to be naive here (but I am)... why do we need MORE laws to control us? What about those magic fingers of the markets? ...you know -- the ones that are supposed to push products toward what people demand.

    It's not clear to me that legislating software through increased liability is the best way to get security.

    thoughts?
    • by Anonymous Coward on Saturday April 20, 2002 @09:40PM (#3381496)
      This isn't about adding new laws to make writing software more difficult. It's about ending special protection and holding software companies to the same standards as everyone else. If I buy something from you, it better work--this is how it is for every other product under the sun, why is software special? As for free software, well the same standards would apply as for anything else that is free. You normally can't sue over something that is free, except in extreme cases where you can prove gross negligence or outright malice. That standard would work just fine here too.

      This may give proprietary software a PR advantage over free software (it has to uphold higher standards), but them's the breaks. Besides, free software has always touted an equivalent PR advantage (the source has been reviewed by countless experts in the field), so it's just good old-fashioned competition.

      In my view, those who are against software liability are no better than the RIAA/MPAA who try to prop up their inefficient ways of doing business through lobbying and legal bullying. They too like to blame their customers when anything goes wrong.
    • Well, we don't need more laws. There are already product liability laws. They just don't apply here. Just one of the many reasons that MS doesn't want software to be seen as a "product."

      I can see where the liability guys are coming from. OSS folks release the source, and GPL folks release a whole bunch of other rights as well. With code in hand and a pile of rights to do with it as you please, as well as probably not having paid a dime for it, the customer is more of a partner- assuming a lot of responsibility. Proprietary people charge money for what is really, despite their protests, a product. I've got a CD, maybe a book or two, some shrink wrap lying on the floor, and I'm at the vendor's mercy. I find out about security holes by getting cracked, even if the vendor has known about the hole for six months.

      The bottom line is that by retaining power, proprietary software companies also retain responsibility. If I am not allowed to look through and modify the source, the holes in my system are not my responsibility (except for buying bad software), but that of the vendor who won't allow me access. Power = responsibility. Money = money. People pushing for finding software companies liable aren't the "let's sue everybody" crowd, they are using the standards of the proprietary, corporate world against itself. Or, if you prefer, holding those companies to their own standards.

      License agreements are funny. According to one, I can't use my copy of XP on any box except the one it came on (don't worry, I haven't even used it on that one). How legally binding is an "agreement" that I didn't get to see until after the sale was completed? For that matter, how legally binding is an "agreement" with a monopoly? The "magic fingers of the markets" that you are holding out hope for are wearing thumbcuffs, my friend. But if the customers have to pay through the nose and have all real power held back from them, then the only answer is financial liability for the vendor. They might actually bother to produce good software then. If that financial incentive isn't enough, then there are other, more drastic legal measures. MS is illegally maintaining a monopoly, you know.

  • I know this will probably get modded into the ground, but what about Microsoft? Nimda and Code Red, which exclusively affected IIS on Win2K, did "millions of dollars" in damage. If software companies are found to be liable for their hole-laden software, I would think Microsoft should be at the top of the list.

    The argument about manufacturer liability can be extended to gun manufacturers. Just because a gun can be used to kill someone doesn't mean the manufacturer should be held liable for the wrongful death. The lack of common sense in the user should not be cause to pass the blame onto someone else.


    • Just because a gun can be used to kill someone, doesn't mean the manufacturer should be held liable for the wrongful death.


      An apt analogy would be a gun firing backwards into your skull when you aim at a burglar's foot. Now, that is a problem. In that case, the defect is *IN THE PRODUCT*. Just contrasting "format c:" with the US Navy ship left dead in the water because of a divide-by-zero error. In the case of the Code Red and LookOut viruses, someone else can be blamed for the attack, so M$FT is easily off the hook.

      If a M$FT product is perceived as the only show in town there is nothing that can be done.

      S
    • I know this will probably get modded into the ground, but what about Microsoft? Nimda and Code Red, which exclusively affected IIS on Win2K, did "millions of dollars" in damage. If software companies are found to be liable for their hole-laden software, I would think Microsoft should be at the top of the list.
      Bad example. The patch for this was available for a month before the exploits started rolling in. What would OSS do if such laws existed? It would either need to be classified as 'non-professional' code, meaning it's indemnified against liability, but then nobody would use it, or it would need to play by EXACTLY the same rules as any other software release. Having the code available should NOT release it from that responsibility, any more than an engineer would be released from responsibility for building an unsafe bridge, even if he allowed the random public to look at the blueprints all they wanted.
      • The patch has been out, and I STILL find Nimda and Code Red attacks in my logs. Sigh.
      • Re:microsoft anyone? (Score:3, Interesting)

        by rgmoore ( 133276 )
        I know this will probably get modded into the ground, but what about Microsoft? Nimda and Code Red, which exclusively affected IIS on Win2K, did "millions of dollars" in damage. If software companies are found to be liable for their hole-laden software, I would think Microsoft should be at the top of the list.

        Bad example. The patch for this was available for a month before the exploits started rolling in.

        It seems to me that this is exactly the kind of test case that needs to be looked at when discussing legal liability for software. If the patch is available, how much of the responsibility is on the administrator to apply it and how much is on the software company not to have written the buggy code in the first place? You can certainly argue that the availability of the patch should exempt the manufacturer from liability, but just how long does the patch have to be available to count? Is it acceptable if the patch is only available one month before the exploiting code shows up? One week? One day? One hour? Or should software authors have an affirmative responsibility to send patches to users, the same way that car manufacturers have to contact their buyers in the event of a recall? Who is liable when the patch is available but unapplied is the really interesting issue, not who is liable when no patch is available.

      • The patch for this was available for a month before the exploits started rolling in.
        Which leads to the question of why it wasn't applied.
  • By making software makers liable for security holes in their programs, you kill free and low-cost software. Nobody smaller than a large corporation would dare releasing networking software, out of fear that they'd be sued for millions in damages caused by a tiny bug.
    • Doubtful. People engage in risky businesses all the time. At some point the profit becomes worthwhile and the risk acceptable. Now, some companies might be driven out because they're genuinely unqualified to write software in this domain and it is too risky for them, but in that case they shouldn't be in that business to begin with. Believe me, I've seen plenty of companies doing business in my local area that are clearly unqualified to be advising clients on security issues -- they should be driven from the marketplace -- and my own company, which is much smaller than the worst and yet most common offender in the area, would greatly benefit from this.

      So, upon reflection, I must disagree with you here. This sort of thing would be a godsend for my company (currently with a grand total of 8 employees). The companies this would drive from the market are ones that shouldn't be in it, which would be a big benefit for my little co...

  • Here is an interesting part of the article:

    "A company doesn't buy security for its warehouse -- strong locks, window bars, or an alarm system -- because it makes it feel safe. It buys that security because its insurance rates go down. The same thing will hold true for computer security. Once enough policies are being written, insurance companies will start charging different premiums for different levels of security. Even without legislated liability, the CEO will start noticing how his insurance rates change. And once the CEO starts buying security products based on his insurance premiums, the insurance industry will wield enormous power in the marketplace. They will determine which security products are ubiquitous, and which are ignored. And since the insurance companies pay for the actual liability, they have a great incentive to be rational about risk analysis and the effectiveness of security products. And software companies will take notice, and will increase security in order to make the insurance for their products affordable. "

    Could you imagine if the corporation you own were charged more for liability insurance because you used the current version of IIS? It's so sad it's funny. If this wouldn't make Microsoft or Company X clean up their act, I can't imagine what would, other than the ethics of it :)

    Personally I work in healthcare, so if my crap's not together I am going to jail. Too bad there's no HIPAA for everyone.
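Schneier's insurance argument boils down to quantitative risk analysis. As a minimal sketch (all figures hypothetical, and the function name is mine, not from the essay), the textbook annualized-loss calculation an insurer might run looks like this:

```python
def annualized_loss_expectancy(asset_value, exposure_factor, annual_rate):
    """ALE = SLE * ARO, the textbook risk-analysis formula.

    SLE (single loss expectancy) = asset value * exposure factor,
    i.e. the cost of one incident; ARO (annualized rate of
    occurrence) is how many such incidents are expected per year.
    """
    single_loss = asset_value * exposure_factor
    return single_loss * annual_rate

# Hypothetical numbers: a $2M customer database, 30% of its value
# lost per breach, 0.5 expected breaches/year on unpatched software...
risky = annualized_loss_expectancy(2_000_000, 0.30, 0.5)
# ...versus an assumed 0.05 breaches/year after hardening.
hardened = annualized_loss_expectancy(2_000_000, 0.30, 0.05)
# The gap between the two expected losses is what would show up in
# the premium, and why the CEO starts caring what software is run.
```

The point of the quoted passage is that once premiums track numbers like these, security becomes a line item the CEO can see.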

    • Actually, there are insurance companies that charge MORE if you use MS products than if you use UNIX/Linux.

      However, since so few companies actually have that insurance, it hasn't yet influenced the marketplace.

      It seems that MS products are more problematic than open source/free software.
  • ...you're just going to end up with a swarm of lawyers invading the software industry, looking for anyone to sue.
    And the hardest hit will be the small and free software developers.

    Honestly it looks like the _best_ way to make big companies serious about software quality is to get the press on your side. A few high-profile MS security holes and what do they do? Launch a major internal initiative and rewrite IIS from scratch. If they continue to have holes after this, you can bet the press will be right there to grill them for it.

    Why do with lawyers what the free press and word of mouth can do better, faster, and cheaper?
    • by sulli ( 195030 )
      Flooding tech with lawyers at every stage would be really really bad. Just wait until Milberg, Weiss is hammering the Apache project because someone handed out some warez using it.

      Schneier is smart and knows a lot, but this is a stupid idea.

      • Just wait until Milberg, Weiss is hammering the Apache project because someone handed out some warez using it.

        What on Earth does this have to do with Schneier's idea?

        • Liability for software "security defects." Some court will find that this includes defeated copy protection. Then the class action lawyers will strike.
    • Why do with lawyers what the free press and word of mouth can do better, faster, and cheaper?

      Hehehe! That's the most hopelessly, optimistically naive thing I've seen anyone say all year! Congrats!

      Historical note: it wasn't the book Unsafe at Any Speed that resulted in improvements in car safety. The publication of that book just got GM to spend huge amounts of money attempting to smear the author. It wasn't until the lawyers and politicians took notice and stepped in that the situation actually improved.

      Yes, politicians and lawyers suck, but the alternative is worse. Faith in the Bible requires faith in "facts" no one was around to see. Faith in Libertarianism requires faith in "facts" the older among us were around to see, and know are false. An altogether higher order of faith... :)

      • Ugh. It really does hurt to admit it, but it is a very sad truth that no matter how much we may despise lawyers, we'd be worse off without them. Enforcing liability would be an order of magnitude more difficult if there were not people on hand who can do it full time.

        But you have to admit, some of them seem to try very hard to get the general population to want to strangle everyone with a law degree. And if there are too many lawyers around, why are they so expensive? If there aren't enough, why do we have so many willing to take on frivolous cases?

  • Indemnity clauses (Score:4, Insightful)

    by xrayspx ( 13127 ) on Saturday April 20, 2002 @09:19PM (#3381425) Homepage
    If you read a license, any license, it basically states that you use the enclosed software "at risk," meaning you can't sue if something, anything, goes wrong -- including data corruption, script-kiddie 0wn@g3, etc. What he's proposing is getting rid of that. Fine, now Microsoft is liable for NT vulns, but you can't throw MS licensing rules out the window and leave BSD and GPL intact. So the "as is" portions of the open licenses have to go too.

    Why not hold Network Admins responsible for problems on their networks? I am a network admin, and if some kid got in and stole a database from one of my employers, compromising customers, I would expect to take the full heat for it. In the back of my mind I'd be saying "F*** Microsoft and their buggy-ass code", but I would know it was my fault for allowing it to happen.

    This is no solution. What's the estimated cost of Nimda so far? Code Red? sadmind? Melissa? ILOVEYOU? All the other Outlook worms?

    The cost of lawsuits from just these AUTOMATED attacks would cripple even Microsoft. Not to mention the CDUniverses of the, er, Universe.

    Software authors need these clauses for a reason: if they didn't have them, they might as well go start a farming commune instead, because it wouldn't be worth it to code anymore.

    Free software authors would then also have to specify the conditions under which they would ALLOW their software to be run. Otherwise some schmuck could install some 0.01a version of code that some guy wrote on his weekend off as a proof of concept on their primary webserver, immediately get hacked, and sue Joe Programmer into the stone age.

    Nice idea, just to tweak MS, but I don't like the way it would play out.

    • Why not hold Network Admins responsible for problems on their networks?

      Ummmm, I'm an admin and I don't want to be responsible for *MS's* buggy software. Are you seriously suggesting that MS can't afford to be liable for their problems, so I should be?

      As for GPL/open source etc. software, you're right that there is a problem :)

      • I'm saying that if we, as admins, don't properly secure our networks by using layers of security solutions, email scanners, not allowing certain attachment types, etc, we deserve what happens to us.

        I hate saying this, but even Microsoft should not be held liable for the effects of sloppy admin work.

        I just see this as an 'out' for admins and companies who didn't want to spend the money up front to invest in their data security. They can put it out there, not fully secure their network, get hacked because of some buffer overflow or IIS directory traversal attack, blame the software and go on with life.

        I hate to see that happen. Especially when the victim software becomes ssh
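The "IIS directory traversal attack" mentioned above is a good concrete example of the class of hole being argued over. A minimal sketch of the flaw and its fix, in Python (the handler and document root are hypothetical illustrations, not IIS code):

```python
import os.path

WEB_ROOT = "/var/www/html"  # hypothetical document root

def resolve(requested: str) -> str:
    """Map a requested URL path to a file under WEB_ROOT.

    A naive os.path.join would happily let "../../etc/passwd"
    escape the root -- the same class of flaw exploited against
    IIS. Normalizing the path and checking the prefix closes it.
    """
    candidate = os.path.normpath(os.path.join(WEB_ROOT, requested.lstrip("/")))
    if candidate != WEB_ROOT and not candidate.startswith(WEB_ROOT + os.sep):
        raise ValueError("path escapes web root: " + requested)
    return candidate
```

Whether missing a three-line check like this should put the vendor, the admin, or nobody on the hook is exactly what this thread is arguing about.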
        • Now that I'll give ya: admins responsible for bad admin work :) ... I have seen a lot of that, and occasionally been party to it :D
          • Now that I'll give ya: admins responsible for bad admin work :) ... I have seen a lot of that, and occasionally been party to it :D
            Hey, who hasn't been there eh? Sometimes you have to weigh risk against cost-effectiveness and business-value and lots of other 50 cent words I don't understand coming down from management.

    • Here's a case where the free-as-in-beer aspect of free software becomes more important.

      It seems significantly more reasonable to hold a large company charging loads of money (Microsoft) liable than it does to hold poor starving Linus Torvalds, who gives his software away for no money, liable.
    • Amateur cars (Score:5, Insightful)

      by interiot ( 50685 ) on Saturday April 20, 2002 @11:03PM (#3381713) Homepage
      Take as an analogy the auto industry. Ford had legal suits brought against it due to possible problems with its cars. This is good for the general safety of consumers, but it results in almost zero amateur cars. Individuals can build kit cars for themselves but can't sell newly manufactured ones, and smaller manufacturers [personaltransport.com] can distort their cars so they fit into some exception in the laws. But generally, 99.9% of the cars in the US are made by a couple of manufacturers [edmunds.com].

      This is what will happen to software if similar laws are applied to software.

      • Take as an analogy the auto industry. Ford had legal suits brought against it due to possible problems with its cars. This is good for the general safety of consumers, but it results in almost zero amateur cars....This is what will happen to software if similar laws are applied to software.

        That's a good point. However, it's a bit disingenuous to imply that the automotive hobbyist has disappeared. There are veritable legions of people who mod cars, do body work, repair engines and work on the racing circuit. It's simply not cost-effective for hobbyists to build whole cars that comply with street-legal standards.

        However, it's not quite as expensive to author "street legal" software. Some would argue that it's even cheaper for hobbyists than for corporations. Manufacturing would not be made more expensive by the author's proposal, just the testing and certification process.

        If it turns out that only a handful of corporations are capable of authoring mass-market software under the proposed system, then so be it. I don't think the hobbyist market will disappear. It's hard to argue that auto hobbyists have been disenfranchised by the current system. The only hobbyist action that is currently restricted is to take an unsafe car onto a public road.
        • You are assuming some particular composition for an as-yet-unwritten law. It would be possible to write what we would consider a good law. I just doubt that this is what would actually happen.

          Please remember that most legislators don't know much about the technical issues. They do know about liability, and in those laws they have a marked history of being pro-lawyer rather than pro either of the litigants. Then there are the lobbyists, trying to push their particular twist on the law. I don't think this is the formula for ending up with a law that either we would approve of or that would be better for the country.

    • by cthugha ( 185672 ) on Saturday April 20, 2002 @11:09PM (#3381730)

      Fine, now Microsoft is liable for NT vulns, but you can't basically throw MS licensing rules out the window and leave BSD and GPL intact.

      You can get MS and leave the GPL (essentially) intact. The difference between them is that you pay for MS stuff, whereas you generally don't pay for GPL software. Of course, if you pay for GPL software, you should probably have a right of action against the supplier (but not necessarily the original author, if s/he gives it away).

      The technical legal difference between the two is that an MS EULA is a contract (a legally binding agreement for mutual consideration), whereas the GPL is only a licence (permission to do something the grantee couldn't previously do, without anything in return). I understand the contract/licence nature of the GPL is still a matter of some debate, but if a law were passed saying "no clauses excluding liability in contracts for the sale of software," then we could probably catch the EULAs and leave the GPL and other open source licences intact where the GPL'd or OSL'd software was provided gratis. At any rate, I think it should be possible somehow to distinguish the two on a "you pay for one, you don't pay for the other" basis.

      Why not hold Network Admins responsible for problems on their networks? I am a network admin, and if some kid got in and stole a database from one of my employers, compromising customers, I would expect to take the full heat for it. In the back of my mind I'd be saying "F*** Microsoft and their buggy-ass code", but I would know it was my fault for allowing it to happen.

      It depends on who made the decision to go with the buggy software. If it was your decision, then yes, the responsibility falls on your shoulders. If, however, the decision came from management on the rationale that "nobody got sued for going with MS" or some other non-tech-related reason, and that decision was made against your own advice, then you shouldn't cop the heat for that.

      Of course, given your lowly position in your organization relative to the goon that actually made the decision, office politics will pretty well guarantee that you'll take the heat anyway :).

      • And I know that was what the original article is trying to get across. You can treat the GPL'd/BSD'd stuff as some sort of non-profit entity and exempt them from the game. That would be great, but you can bet MS's PAC would bitch up a storm in DC about being singled out, even though they aren't, and try to get it applied to every piece of software there is.

        People pay for GPL'd software every time they buy a distro of Linux; the burden in that case would probably fall on RedHat or SuSE or Mandrake rather than the original authors, but the argument can be made. And if there's an argument to make, leave it to some shifty ambulance chaser to make it...

        • People pay for GPL'd software every time they buy a distro of Linux; the burden in that case would probably fall on RedHat or SuSE or Mandrake rather than the original authors, but the argument can be made. And if there's an argument to make, leave it to some shifty ambulance chaser to make it...

          The argument could be raised, but thankfully the GPL contains an "as is" warning effectively from the author. The law of negligence says that if you warn people about the risks they are undertaking before they commit to something, the standard of care you owe is lowered. If I warn a person to whom I give my work about the potential dangers of that work, responsibility would be transferred from my shoulders to theirs. In addition, if I go so far as to give them the means to guard against any flaws in my work (by providing source, in the case of software), then I probably have an effective defence in the form of contributory negligence if it blows up in their face. Finally, any case in negligence would be weakened by the enormous social (and arguably economic) benefit that flows from me giving away software. I doubt that even a lawyer who's clever to the power of evil could get around all those factors.

          Products liability law usually says that such warnings don't apply, or if they do, they have to be prominently displayed on packaging, etc (which would tend to put consumers off buying it: can you imagine MS labelling its boxes in large letters "Warning! This software may delete all your work and ruin your life!"), but again, you have to be in the position where you're selling software, rather than just giving it away, in order to attract liability under those laws.

          As you say, if MS was really smart, they would go along with any new legislation, but insist on strict liability provisions that would cover all software. They could afford the increased legal exposure, but the average individual author of GPL'd work couldn't.

      • " If, however, the decision came from management on the rationale that "nobody got sued for going with MS" or some other non-tech-related reason, and that decision was made against your own advice, then you shouldn't cop the heat for that"

        If management is employing you for your tech-related expertise and advice, then they should listen to it, whatever their prejudices about the matter. However, even when management makes a bad decision contrary to advice, they often want a scapegoat to blame.
    • Software authors need these clauses for a reason: without them, they might as well go start a farming commune instead, because it wouldn't be worth it to code anymore.

      That's true. Software is unlike most any other product because of its complexity and nonlinearity. The average software developer makes hundreds of individual decisions per day that end up embedded in their code. Any one of those decisions could be a hole that destroys the security of the entire product.

      Testing and review help, but decades ago it was mathematically shown that, in general, you cannot prove whether an algorithm is bug-free. The tiniest crack in the logic could be used by an attacker as a wedge to subvert the entire product.
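The result being invoked here is Turing's halting problem (and its generalization, Rice's theorem). The contradiction can be sketched in a few lines; note that the `halts` analyzer below is purely hypothetical, which is exactly the point:

```python
# Sketch of the halting-problem diagonalization: suppose a perfect
# analyzer halts(f, x) could decide whether f(x) eventually halts.

def halts(f, x):
    """Hypothetical perfect analyzer; no real implementation can exist."""
    raise NotImplementedError("no total halting analyzer exists")

def paradox(f):
    # Do the opposite of whatever the analyzer predicts about f(f).
    if halts(f, f):
        while True:      # analyzer said "halts", so loop forever
            pass
    return "halted"      # analyzer said "loops", so halt immediately

# Asking whether paradox(paradox) halts defeats the analyzer: either
# answer it gives is wrong. So a general "is this program bug-free?"
# checker is mathematically impossible.
```

The same diagonal argument rules out any general-purpose checker for nontrivial behavioral properties, security holes included; real tools can only approximate.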

      This is very different from designing bridges or buildings, for example, where the thousands of decisions going into the design tend to reinforce the basic premise of its fundamental soundness. The mathematics of each calculation are usually verified by calculations done during other parts of the work. Due to this feedback, systematic failures are extremely rare, and when they do happen, often end up showcased on History Channel programs such as "Engineering Disasters".

      Laws developed to assign liability for bridge failures, train wrecks, etc. are not suitable for software problems. There needs to be a crystal-clear distinction made between companies and individuals who make an honest mistake and work in good faith to correct it (no matter what havoc it wreaked), versus those who recklessly ignore third-party warnings and past problems in favor of distributing obviously flawed products time and time again.

      In other words, software liability should not focus on individual incidents, but on trends and patterns of behavior. Unfortunately, the law usually focuses on minutiae, and it would be very hard to get it to focus on the big picture to punish only the genuine schmucks. Current legal practice usually likes to make examples out of a few unlucky small-timers. But as I explained, every software developer is almost certainly a potentially unlucky small-time offender.

      • Software is unlike most any other product because of its complexity and nonlinearity....decades ago it was mathematically shown that in general you cannot prove whether an algorithm is bug-free....This is very different from designing bridges or buildings...

        Best post so far. Software lemon laws would be great, but very difficult to do without taking the intent and/or general practices of the manufacturer into consideration.
      • Re:Indemnity clauses (Score:3, Interesting)

        by Electrum ( 94638 )

        This is very different from designing bridges or buildings, for example, where the thousands of decisions going into the design tend to reinforce the basic premise of its fundamental soundness. The mathematics of each calculation are usually verified by calculations done during other parts of the work. Due to this feedback, systematic failures are extremely rare, and when they do happen, often end up showcased on History Channel programs such as "Engineering Disasters".

        But it is possible to write secure software through good software engineering practices. Unfortunately, not many people seem to understand them. Only a few individuals like Dan Bernstein [cr.yp.to] can consistently and effectively write secure software, and will guarantee [cr.yp.to] that it is secure.

        If software were thoroughly designed from the start, before any code was written, as with traditional engineering projects, then perhaps more software would be secure. If you look at his guarantee for qmail, you'll notice that he followed several principles throughout the design and implementation that allow him to guarantee its security. If software engineers become liable for their work in the same way that traditional engineers are, then maybe software engineering will become more like traditional engineering.
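One of those principles, least privilege, can be sketched as below. This is an illustrative pattern under assumed names, not qmail's actual code; the `mailnull` user and port 25 are made-up examples:

```python
# Privilege separation sketch: perform the one root-only operation
# (binding a low port), then irreversibly drop to an unprivileged
# user BEFORE parsing any untrusted network input.
import os
import pwd
import socket

def open_privileged_socket(port=25):
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.bind(("0.0.0.0", port))   # needs root for ports below 1024
    s.listen(5)
    return s

def drop_privileges(username="mailnull"):
    info = pwd.getpwnam(username)
    os.setgid(info.pw_gid)      # group first: setuid must come last
    os.setuid(info.pw_uid)      # after this, root is gone for good
    # A bug in the request-handling code that runs from here on can
    # no longer be leveraged into root access.
```

The point of the design is that the window in which the process is dangerous is tiny and contains no handling of attacker-controlled data.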
  • All right, let's start making software producers liable. But how exactly are we going to enforce it? There is no such thing as perfect software once you get past a certain level of complexity. Who will say whether the software producer has done a reasonable job of securing the product?
    As countless examples in the US legal history tell us, this problem will most probably be solved by creating a set of (rather stupid) arbitrary rules that software makers must follow.
    Consider this example: US government institutions may only use software that meets certain accessibility standards (e.g. you have to be able to increase the font size, display stuff in high-contrast mode etc.) The only company that has resources to make its software compliant with these rules at the moment is Microsoft, it is just too expensive for others.
    Now what makes anyone think it would be any different with these security requirements? The rules will probably be something like "all financial transactions must use SSL" or "passwords must be encrypted with 128-bit keys" or something like that. But the reason behind most security holes today is not so much insecure protocols or insufficient key lengths but invalid assumptions between different components in complex software. And no law is ever going to take care of this problem.
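A toy sketch of such an invalid inter-component assumption (every name and path here is invented for illustration): a front end validates file names once, while the storage layer silently assumes every caller has already validated.

```python
import os

BASE_DIR = "/srv/files"   # illustrative path

def frontend_fetch(name):
    # Component A: validates the name before handing it on.
    if "/" in name or name.startswith("."):
        raise ValueError("bad file name")
    return storage_read(name)

def storage_read(name):
    # Component B: merely ASSUMES validation already happened.
    # Any new caller (another endpoint, a batch job) that reaches
    # this function directly re-opens the hole.
    return os.path.join(BASE_DIR, name)

# frontend_fetch("../../etc/passwd") is rejected, but calling
# storage_read("../../etc/passwd") yields a path that resolves to
# /etc/passwd. Neither component is "broken" in isolation; the bug
# lives in the assumption between them.
```

A key-length or protocol mandate would never catch this class of hole, because both components individually look fine.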
  • Unless there was some way to enforce this for software companies around the world, this won't work. No government will handicap its own country's software companies by making them delay product releases. The masses will buy whatever is out first, putting those security-conscious companies at a competitive disadvantage, since software companies outside the country could simply beat them to the markets.
    • Not really, because this is enforced from the purchaser's end, not from the manufacturer's end. It's not whether software written in country X will have the rules applied to it, but whether software sold there will. It doesn't matter where the writer is, just where the user is. Of course, if a country that's a small player in the software purchasing market tries this, it probably won't work because companies will simply refuse to sell there. But if the United States or the European Union decide to try it, there will be a huge impact. The US and EU are big enough markets that all of the big players would be forced to bring up their quality, because they couldn't afford to be shut out.

  • by lux55 ( 532736 ) on Saturday April 20, 2002 @09:31PM (#3381476) Homepage Journal

    Liability is the reason that the Broadcast 2000 project was removed from public access, which is a tragedy because I'm sure tons of people could benefit from their free software. From their web site:

    In recent months the line between warranty exemption and liability has become increasingly blurred as more companies have liquidated and more individuals have begun to seek compensation. We've already seen several organizations win lawsuits against GPL/warranty free software writers because of damage that software caused to the organization. Several involved the RIAA vs mp3/p2p software writers. Several involved the MPAA vs media player authors. You might say that warranty exemption has become quite meaningless in today's economy.

    Theirs isn't a security issue, but it's still very relevant as they are acting out of the fear of being held liable for what they were offering for free. That is really sad.

    Security issues are deep-rooted, and most definitely can't be solved by nullifying the liability clause in licenses.

    • This is not the *real* reason. Heroine Warrior is not a company. From their web site: "We're not a corporation. We're not a guy in a dorm room trying to act like a corporation. We don't have accounts receivable but we do have accounts payable." The *real* reason was that Broadcast 2000 was not good enough, so they started to develop Cinelerra and made Broadcast 2000 unavailable from their site. But to keep it from getting too much attention, they pretended it wasn't there: you get the message "It's not here anymore. Why don't you go to this award winning page," where "this" takes you to Microsoft.com. However, it is still available at http://sourceforge.net/projects/heroines/. Basically these guys are still developing the software, but like playing stupid games to fool the general public into thinking they are no longer doing anything, to avoid getting too much attention. A nice overview of the *real* history of Broadcast 2000/Cinelerra is available with the Cinelerra docs.
    • The authors could have purchased liability insurance.

      However, theirs was a somewhat special circumstance, as they had a reasonable expectation of being sued by deep-pocketed organizations (MPAA, RIAA) whose motivations are well known and have little to do with actual software security or quality.

  • Algorithm:

    foreach (@msft_bug) { pay $_ and wait };
    foreach (@competitors_bug) { they_pay $_ or go belly_up };
    negotiate any class_action_suit ( "just like tobacco companies did" => increase price if (you_can_cite_extra_liabilities));
    expand kitty;
    wait and see everyone vanish "like fleas";
  • If we expect software vendors to reduce features, lengthen development cycles, and invest in secure software development processes, they must be liable for security vulnerabilities in their products.

    The simple fact is, customers do not want software that has reduced features and is more expensive. The customer wants cheap software that is slightly secure and feature-full. If a customer wants secure software, then there are alternatives (Linux, BSD, contract a programmer to create a secure program...)

    If the government begins to enforce software liability, then it is essentially telling the customer what she wants. The government does not know what the customer wants; the customer does.
  • by Anonymous Coward
    The problems caused by insecure and misapplied software can be partly attributed to failures by software vendors, but I don't think it's realistic to insist that Microsoft be held liable for its bugs. For one thing, this would make it legally impossible to disclaim warranty over software... which would expose many open-source developers and hobby programmers to lawsuits for code they've posted to the public.

    For another thing, many of the security problems that exist (as the article points out) stem from improper configuration and use of a software product. If I buy something from CheckPoint, and accidentally leave myself wide open while installing it because I'm too cheap to hire a real firewall jockey to do it right, how is that CheckPoint's fault? And if we don't hold vendors responsible for these misconfigurations, the "sue the vendors" fix doesn't solve this part of the problem at all.

    As an alternative, think about holding the person or company who deploys insecure products, or deploys secure products incorrectly, responsible for the damage caused. If some virus emerges that roots your webserver and uses it to DoS me, it's your fault that I'm losing traffic. This puts the incentive to fix insecure configurations in the hands of the people who are closest to the problem.

    Additionally, holding users responsible will tend to breed better security products. If a company realizes that it can be sued when its machines are compromised by ILOVEYOU and used to harm others' property, it will have a strong incentive to be selective and careful when purchasing and installing security measures. The guys selling IIS will have to clean up their act, or face a complete lack of customer interest.
  • I have some interesting (+1 interesting worthy!) ideas about the whole thing. First, instead of either blaming the company or the person, look at the circumstances. I realize that a company can't just shut down its entire network to patch everything, but it should be its risk if it chooses not to. I also realize that a vendor shouldn't release obviously shitty software to make a profit, and then patch it into maturity. If a company has installed all its patches like a good customer, and it still gets hacked and damaged somehow due to an obvious bug for which there is no patch, it should be entitled to some kind of compensation from the vendor. Not full compensation, but at least enough to give incentive to fix the gaping hole in the software.
  • "Today [...] the marketplace rewards low quality. More precisely, it rewards early releases at the expense of almost all quality."

    Was he thinking of Windows XP when he wrote this?

    Anyhoo, I agree with him about the ineffective airline security measures after September 11. If someone wanted to get on a plane and run it into some building, soldiers in terminals and having guards check for tickets at the gate won't stop them. If they really felt like doing so, I don't think spending the $200-whatever for a ticket is going to deter them. I think it's just there for looks. (they confiscate nail clippers for crying out loud)
  • "The costs of ignoring security and getting hacked are small: the possibility of bad press and angry customers, maybe some network downtime, none of which is permanent. And there's some regulatory pressure, from audits or lawsuits, that add additional costs. The result: a smart organization does what everyone else does, and no more."

    I don't understand. If the cost of having no security is so low then liability won't change anything. Why get security insurance if you can easily swallow the cost of getting hacked?

    It seems like the real problem is that security doesn't matter that much to most companies. If it did they would work hard to protect their bottom line by finding secure software. Liability won't significantly change this.
  • Design Limits ... (Score:2, Insightful)

    by LL ( 20038 )
    An engineer can guarantee a bridge to fail at specific loads ... can the state of software engineering claim the same for a piece of software? Even design-by-contract software like Eiffel is no security blanket when used by the wrong hands or with incomplete specifications (cf. the rocket that blew up because its engine software was calibrated for a different flight mode).

    We are still in the dark ages as far as software liability goes ... :-(

    LL
  • What if a software company were to change its license such that it WOULD assume liability? Granted, it would probably need insurance of some kind, but how much more comfortable would a purchaser of this hypothetical company's software be if they had somebody to sue?

    Let the free market speak - Once a company is confident enough in its product to offer a warranty, the rest will follow.

    • Once a company is confident enough in its product to offer a warranty, the rest will follow.

      One small problem. The security of the software is directly tied to the OS that it runs on.

      If you write Windows software, you have to use Windows APIs (either directly or indirectly), and the security of YOUR application is then dependent upon the OS.

      If you write on a more secure OS, then your application would be inherently more secure, and you could probably offer a warranty.

      Likewise, if you write a module to add-on to an application, then your module would be at the same security level as the application ... and so forth.

  • This proposal sounds to me like proposing Ford Motors be liable for Fords crashing... which is not the way that works, and everyone knows why. The operator makes a big difference.

    Not that "common best practice" insurance for security liability would be a bad thing: it's so much easier to cost-justify "running this will take our insurance premiums up $x" than "running this will increase our risk of Something Bad Happening by some unknown percentage." But it's the operators that bear that cost, not the manufacturers.

    If you wanna run that FlashyRedSportscar 1.0 software that makes it more likely you hit a wall at 140 MPH - your risk, your call. Providing FlashyRedSportscar Software, Inc. was diligent in its processes, they shouldn't have to hire lawyers when you meet the wall.
    • This proposal sounds to me like proposing Ford Motors be liable for Fords crashing

      If the car crash is determined to be caused by a design flaw, Ford IS liable. The classic case is the Ford/Firestone fiasco of a couple of years ago.

      Software manufacturers should be held to the same standards as every other company out there. If a design flaw is present which could cause problems for customers (i.e. cost them money or productivity, damage equipment, cause injury or death, etc.), they risk getting their ass sued. However, misuse by the user (e.g. typing 'rm -rf /' and losing all their work) does not make the manufacturer liable.

  • Basis of liability (Score:4, Interesting)

    by gweihir ( 88907 ) on Saturday April 20, 2002 @10:43PM (#3381671)
    The liability will not go to Linus. Basically, everybody operating computing equipment will have to have insurance, just as you do to operate a car, or as you should if you go wind-surfing.

    This insurance will get much cheaper if you use good systems and have the required competence to make them secure.

    Some problems will have to be resolved by the legal community:
    • Who is responsible for the operation of a piece of computing equipment, and how does this responsibility transfer?
    • How is the competence of such an "operator" graded?
    • What constitutes criminal/unauthorised misuse of computing equipment?

    The last point is important, since you are only responsible for problems caused by your equipment, as long as they are not due to some criminal action by somebody else that you could not easily detect.
    To stay with the car analogy: If somebody sabotages your brakes in a way you don't notice until they stop working, accidents that result may not be your responsibility.

    An additional point: While a car manufacturer has certain responsibilities, not everything that can go wrong is their responsibility. Only things they claim or are required by law to claim have to be backed up by their product. If you hit a tree because you don't know how to drive or if you start sliding on ice, that is certainly not the manufacturer's fault.

    In the case of software this gets a little more complicated, as there is no "unit" of software. My feeling is that manufacturers will not face legal requirements for characteristics their software must have, because such characteristics might be impossible to specify (not that people won't try). Instead, I think that cheap "computer operation insurance" will only be available for products where either the manufacturer takes legal responsibility for some characteristics of the product, or the insurance companies have a strong indication that the piece of software has those characteristics.

    I also think that Computer Scientists and other people that produce code and systems will have to have a kind of "Malpractice Insurance" whenever they commercially create code for others.
    • While a car manufacturer has certain responsibilities, not everything that can go wrong is their responsibility. Only things they claim or are required by law to claim have to be backed up by their product. If you hit a tree because you don't know how to drive or if you start sliding on ice, that is certainly not the manufacturer's fault.

      Interesting points ...

      So ... basically, every company that operates a MS Server (or Advanced Server) that does not have a MCSE running it would not be able to sue Microsoft because of the "seeming incompetence" of the administrator ... due to him not setting up the server correctly?

      I don't think this would fly, since even if you DO setup applications "according to spec", there are security holes. Even with the latest patches installed.

      This would be one of the points, in a lawsuit, that the defendant would have to prove ... that the operator was incompetent to run the machine. IF they were able to prove this, the plaintiff would lose. If they were unable to prove this, then the defendant would lose.

      • So ... basically, every company that operates a MS Server (or Advanced Server) that does not have a MCSE running it would not be able to sue Microsoft because of the "seeming incompetence" of the administrator ... due to him not setting up the server correctly?

        Not quite. My thought was that you cannot sue MS unless they claimed something untrue, like "no security holes".

        The missing MCSE would not have an impact on that, as the product is not changed by the way it is used. However, the missing MCSE, and the choice of a specific MS product with its specific set of vendor-claimed properties, would have a huge impact on the server insurance cost the company would have to pay.

        And if something goes wrong because MS claimed something untrue, not the company but the insurance company would sue MS, to get back the money they had to pay for the damage done by the MS server.

        In the case that the damage (to the server-using company and other involved parties) was not due to false claims by MS, the insurance cost for this company would probably rise, just as car insurance gets more expensive after you have an accident. But no claims against MS would result.

        Note that I am used to the European legal system. If you misuse something or make a stupid mistake, you cannot sue the manufacturer here. The European legal system expects common sense from people. As an example, if you get cancer from cigarettes, you cannot sue the cigarette manufacturers here, because it is pretty obvious that cigarettes cause cancer. Or if you fall from a ladder, you cannot sue the manufacturer unless the ladder was not well made. If it just fell over because you did not secure it, that is your problem, because it is pretty obvious that ladders have to be secured.

        I admit that I don't understand this American habit of suing the manufacturer, but my impression is that it is counterproductive, especially with the huge amounts of money people sometimes get in settlements. No such thing here: if you are compensated for pain/damage suffered, the amount of money you get is entirely measured against your losses. Of course, companies might have to pay penalties, but that money goes to the state.
  • "OpenBSD & Qmail are examples of insecure freeware." But isn't OpenBSD exhaustively audited, with many sections rewritten to eliminate security bugs, probably spotted before ever being exploited? Doesn't Dan Bernstein write rock-solid secure code in packages like Qmail and DJBDNS? The answer to both of those questions is a definite yes. The problem isn't that these things are made insecure, which is not the case at all. The problem is that the end user, or the system administrator, can too easily make things insecure with even the simplest mistake in configuration.

    I'd love to have secure programs as much as anyone, and OpenBSD and Qmail certainly show that some of that is available now. But when I choose what software I will install, I have to do more than just choose what is securely written; I have to balance development security against administrative security. Certainly, hiring more skilled administrators can improve security. But if the software is harder to configure and manage, then it takes more administrator time, skill, and attention for a given level of result. To that end, one of the important factors I judge software by is how easily it can be configured.

    Close to that is another factor measuring how easily a given package can be hacked to correct a bug, or change a feature, if needed. If the code is well written, well documented, and clearly organized, the time it takes to hack it, and the certainty of hacking it correctly, is improved.

    For any given package, there will be some people more experienced with that one than others, and so this isn't always a clear cut decision. I made the choice to go with Linux and Postfix, instead of the other choices. But this decision suited my needs, balancing reasonably secure software and reasonably secure adaptability to my environment (including programming and administrative skills). It won't be the same choice for everyone. And there are cases I've recommended software I don't actually use because it better suited someone else's different circumstances (fortunately I was at least reasonably familiar with it from evaluation to know its specifics). For example, if you have no need to change anything, I'd say OpenBSD would be the best choice for a combo firewall and server (just don't let anyone touch it ... a console is a dangerous thing).

    Now here's the rub. What if someone does install OpenBSD and/or Qmail, and after they configure them, some kid breaks in and takes the machine for a ride? Are we going to blame Theo and Dan? I wouldn't, because I've seen way too many administrator mistakes (and learned from the ones I've made) to be putting the blame on the software. My big worry is that if we start pointing the liability finger at the software vendor, they're going to end up taking the heat way way more than they should be.

    The OpenBSD and Qmail development people, as far as I know, fess up to their bugs, especially the security bugs, and let people know when a hole is found. If we are going to have software liability, I think that a practice of consistently divulging known vulnerabilities should be considered a safe-harbor from the liability, even for bugs that got exploited before the developers were aware. It's the practice of covering up on the vulnerabilities that I despise. That's where the liability should be.

    The legal test should be whether the software vendor has carried out a consistent practice of immediately divulging the existence of the vulnerability (if not to the whole public, then at least to all their customers), even if they don't have a fix for it yet. I'd rather take a web site down for a day if it is discovered to be insecure, while waiting to get it fixed. Of course, open source is a plus here, as I can dig in and hack up a fix or workaround myself, even if it's just a quick and dirty one (like a grossly oversized malloc with some randomizing, for buffer overflows, to ride out a few days until a proper fix is available). And this means all customers, not just a few privileged big corporate customers.

  • This raises an interesting way in which the closed source/open source battleground could be leveled somewhat, and could bring computer software up to the level of quality we expect from other engineered products. Would we cross bridges if they BSOD'ed while we were on them, killing us? I think not.

    What the government needs to do is enact legislation that ties source code to a company's liability for the damage their software causes in case of failure. If a company releases its code with its products, then exempt them from liability; the customer has the code and could fix it if they wanted to. But, for companies that choose not to release their code, make them liable for their shoddy product. After all, what they're selling us is *supposed* to be complete and useable, and if they're not going to put their customers in a position where they can fix problems with a product themselves, then the closed source software company should pay.

    This would even be a positive situation for the closed software companies in the long run, as the liability that they are selling along with their product is yet another feature their software can claim. This could one day end up being the competitive point between open source and closed source: open source = a gamble for your company, but a cheaper product, closed source = guaranteed to work by the producer at extra cost.

    Either way, something has to be done.
    • This is the first post I find interesting and correct on some points.

      Your opinion is correct on some points, but on the other hand you'd be putting a bad name ("open source = a gamble for your company") on all OSS products. This would spawn a mass of "how not to gamble, go secure" commercials, even though that's not true.

      For the average user, who knows almost nothing about security, this would just be good advertising for proprietary software.

      To conclude my opinion: THE LAW SHOULD BE MORE STRICT.
      Either you're doing OSS and you're not liable (since you give it away for free), or you're selling software and you're automatically liable for what you're selling. Disclaimer clauses for either party should not be acceptable.

      And yes, something really has to be done.
  • Is it practical to guarantee that extremely large systems are error-free? For little programs all you need is a few test cases and you've basically covered every set of circumstances. I would imagine a modern operating system is a different story.

    I think software companies should take reasonable steps to ensure that their software is secure, and they should design their software with security in mind from day one. However, I don't think companies should be held liable for flaws unless those flaws are the result of negligence.
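Some back-of-the-envelope arithmetic shows how quickly "a few test cases" stops being exhaustive; the test throughput below is an assumed, deliberately generous figure:

```python
# Exhaustively testing even a tiny function is hopeless. Consider a
# function of just two 32-bit integer arguments:
inputs = 2 ** 32 * 2 ** 32          # every argument pair: 2^64 cases
tests_per_second = 10 ** 9          # assume a billion tests per second

seconds = inputs / tests_per_second
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.0f} years")         # roughly 585 years of testing
```

A modern operating system carries vastly more state than two integers, which is why a negligence standard is more workable than a "provably error-free" one.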
  • As it stands, just about every software license and EULA out there says that the software is not certified to be fit for any purpose, that results may vary unexpectedly, and that the vendor is not responsible for damages resulting from the use of the software.

    To me that's a huge load. As far as software is concerned, we're still selling snake oil and living in the old west. There's a lot of buyer beware which is why I support trial-warez.

    On the other hand, open source software is almost always considered "a work in progress" that seemingly never completes. That's just a given. But when a commercial product is released, there's a sense of finality involved. This is version 1.0 and any newer version will cost you money.

    To me, once you exchange money and acquire a product, there is a moral responsibility on the supplier's part to guarantee the work in some way. I hate to use physical world analogies and so I won't go into detail. But imagine if the same sort of agreement went into the purchase of cars?

    There is a huge difference between a publicly contributed free work and one that is licensed (not sold) to a user for a given purpose. This game of "I want your money but not the liability" is a load of attorney crap. If you're a professional, be prepared to behave like a professional.

    In any case, I think I'll go into business as a brain surgeon and make people sign agreements saying that I'm not useful for any particular purpose, that I'm not responsible for my actions, and that any additional surgical procedures resulting from my accidentally leaving my tools inside the patient's body are an undocumented feature and not an error on my part.
  • ...just jack up the price to include your liability insurance.
  • Common aspects of liability are not really significant. Every company pushes its product as soon as it's barely working; these products are pushed out untested (or barely tested). One of the greatest examples is Microsoft, which ships products that surely contain security holes. One point of view is that these security holes help the company spy easily on users' actions; the other is that they're simply a lack of testing.

    The trouble with liability is that everybody expects greater liability from GPL products than from proprietary ones. Usually OSS is more secure and better tested, but to expect liability for a product you got for free is insane (at least as long as Microsoft's $0.50 clause is valid).

    In all of my cases that involved security or patches, I got much better support from OSS projects than from proprietary ones. For instance, contacting Corel or Microsoft is quite painful: either they don't have a patch, don't know about the problem, or in the worst case don't even know that a patch already exists.

    If you take all five points into consideration, what do you get?
    1. Greater expectations of proprietary software (liability must be included, so I really don't know if Microsoft could push another product like IIS or IE; without the $0.50 clause they'd be dead and buried in no time).
    2. An attack on OSS, where people give software away for free but are liable for it??? This is insane. OSS projects have always either been well supported or they died; in 99% of cases I felt I got great support, either an answer or an explanation. In the one case where I didn't get an answer, I didn't really bother; I just swapped to another project that supported my needs well.

    So if you ask me, liability has been talked about too much. Talk instead about who should and who should not be liable. That is the real question, not liability itself.
  • Secure servers without back doors are for weenies. :)
  • by Anonymous Coward
    I can't believe most of the posts so far say that this is the end of free software - I don't know the USA, and IANAL, but here in the UK I believe there's a legal (product liability) distinction drawn between

    A - Offering anything at no cost.
    and

    B - Paying for a product for a given purpose.

    I am sure you can expect very little legal comeback if someone gives you $product, and you lose a finger messing about trying to make it work. However if someone makes you pay leading you to believe their $product is suitable and safe, and you lose a finger due to a poorly designed product, Trading Standards & Consumer Protection laws can be used to sue the seller of $product for damages.

    Free Software is given away: no money, no trade, therefore the performance expectation is zero. Anything more is a bonus.

    Commercial software is sold, therefore assumed to be of a certain level of performance, usually "as advertised" - if a product fails to work as it should, or worse causes damage, the people making money should be liable, for sure.

    Open Source Software surely must avoid such liability issues, since compilation is required before anything can be expected to work, e.g. "Here are some text files - I find they can produce a program which may carry out function X". Even with harsh software product liability laws, you could charge money for the source code, since it alone can do nothing without a careful process required by the user - the binaries produced are the user's responsibility :) Distributing binaries, claiming that they do something, in exchange for money, is a totally different kettle of fish to Open Source.

    By not disclosing source code, companies take on the responsibility of making sure it works right. This should make them liable.

  • UCITA (Score:3, Interesting)

    by coyote-san ( 38515 ) on Saturday April 20, 2002 @11:31PM (#3381785)
    It's worth recalling that the proposed changes to UCITA (since only two states were dumb enough to adopt the original model law right away) contain a truly incomprehensible couplet.

    A commercial contract can waive all liability. I seem to recall that the "technical self-help" measures (which allow them to write software that actively damages your system if it thinks your license has lapsed) have been removed, but it still gives vendors broad rights to gag you when you try to report problems, to falsely claim others haven't reported problems, to falsely claim that the problem either doesn't really exist or has been fixed, etc. It can do all of this because you handed over hard cash and a bona fide contract exists. (I'm not so sure it's bona fide: a contract requires an *exchange* of items of value, and I don't see much value in this software.)

    In contrast, free software isn't covered by a contract (since no money was exchanged), and UCITA explicitly requires that warranties apply.

    This means that Microsoft (to pick a company at random), a company with billions of dollars in the bank and easily able to afford decent product testing, gets a free walk. Meanwhile Joe Sixpack, a professional programmer who released a simple "scratch my itch" program, can lose his house in legal fees defending himself even if he ultimately wins the court cases.

    The commentators (UF law professors, working under the aegis of the ACM?) suggested that the voting delegates seemed indifferent to this indefensible state of affairs. Hopefully they'll either fix it, or the lawmakers in the various states will quickly realize that UCITA 2.0 is just as bad as the original.

    But it's something that MUST be considered whenever we talk about the need for liability law to start applying in the software world. We can see the importance of having your own source code, but the people who would actually write the laws are still hearing from Microsoft et al, not us.
    • My opinion on this ... is that commercial vendors (any and ALL of them) should be liable.

      This means that Microsoft (the company that you picked at random) WOULD be held responsible for bugs and security holes. Joe Sixpack, even if he only does shareware, would be liable to a limited degree (maybe with a cap?). A possible formula for this would work like graduated income tax brackets: the more copies of a product you sell, the more you should have tested it, and the more liable you become.

      However, ANY software that also released its source would be free from this liability, since the person installing the application has the source to recompile it if necessary. If they don't have a developer staff or friend to modify the source, they should contact the author and plead for a fix. If the author refuses, too bad; fix it yourself.

      Then I think it would be acceptable to have software liability.
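      The graduated brackets suggested above can be sketched like an income-tax computation. This is purely hypothetical: the bracket boundaries and per-unit rates below are invented for illustration, not taken from any real proposal.

```python
# Hypothetical sketch of the "graduated liability brackets" idea.
# All numbers here are made up for illustration only.

BRACKETS = [
    # (units_sold_up_to, liability_per_unit_in_dollars)
    (1_000,      0.00),   # hobbyist scale: no liability
    (100_000,    0.50),   # small commercial vendor
    (10_000_000, 2.00),   # mass-market vendor
]
TOP_RATE = 5.00           # rate for everything beyond the last bracket


def liability_cap(units_sold: int) -> float:
    """Total liability cap, accumulated marginally like income tax."""
    total = 0.0
    prev_limit = 0
    for limit, rate in BRACKETS:
        if units_sold > limit:
            # This bracket is completely filled; add its full share.
            total += (limit - prev_limit) * rate
            prev_limit = limit
        else:
            # Units fall partway into this bracket; add the remainder.
            total += (units_sold - prev_limit) * rate
            return total
    # Sold more than the highest bracket: top rate applies to the rest.
    total += (units_sold - prev_limit) * TOP_RATE
    return total
```

      A hobbyist selling 500 copies would owe nothing, while a vendor moving 50,000 copies would face a cap of $24,500 under these invented numbers, which is roughly the "more sales, more exposure" curve the comment describes.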

  • We talked about this almost a year ago.

    Lower Your Insurance Premiums: Use Linux. [slashdot.org] The article is here. [com.com] I haven't seen any follow-up news, but this is where product liability has the best potential to hurt MS: where the only way they can affect the true cost of their product is by releasing a product that works.

  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Saturday April 20, 2002 @11:45PM (#3381821)
    Comment removed based on user account deletion
      Software liability, in the same sense as liability for a "standard" engineering product (electrical appliances, cars, buildings, etc.), is, like you say, ludicrous. For physical products, companies can employ underwriting laboratories to do testing that would exceed the cost of any in-house testing matrix, and engineering is governed by the laws of physics, which can generally tell you a lot about how resistant a building is to heat, wind, and rain. Software, in general, is just plain not tested enough. This is the biggest obstacle to the formulation of software engineering as a respectable discipline on par with civil or mechanical engineering.

      1. Businesses can crumble because of security assured to them by their software vendor that doesn't exist. People lose houses, jobs, and families because of this kind of thing. Security is dependent on more than just each component of a solution being appropriately secure - it needs the combination of each individual piece to be secure. This task is, in general, too difficult for the average tech lead at a small business, college, or school, who will have enough problems with basic functionality. To some extent, the burden needs to be shifted to software providers- I don't think this is a point of contention.

      2. It is easy to purchase the software you need, with a guarantee of security and reliability, and at a reasonable price, only if you are involved with the government of a large country, and even then you don't always get it right. [nasa.gov]

      3. IIS on its own may be secure enough for a company intranet, but if the intranet's firewall and proxy servers are compromised, then it has become not secure enough. Schneier wants insurance companies to take the brunt of deciding how effective security solutions are - not the US government.

      4. Schneier's main goal in instituting software liability is the management of security risk by lowering insurance premiums for people with more secure software. People who want to develop software without liability protection can count on an according security check level - if a system was in place that made security important for everybody, and not just these guys, [ncsc.mil] the world might be a better place.

      5. There are enough larger players within the software world that I don't think this would happen - specifically, IBM wants to protect AIX, Apple wants to protect OS X, and Sun wants to protect Solaris. And if IBM and the NSA want to continue to promote Linux, they WILL make it secure [nsa.gov]

      6. OpenBSD has had four years without a remote hole in the default install configuration - it has also had several local holes, and this is entirely discounting the problem of people who configure the software the wrong way. People are choosing to do this, and the market is sorting it out, but not to the extent that's necessary to prevent another Nimda, Code Red, or Iloveyou virus - the cost in lost productivity alone is earth-shattering. And people don't need to get hacked for terrible things to happen to them- in fact, if they never figure it out, all the better for the attacker. No, for the most part, people don't care- and they should. Most people don't want to get vaccinated, but we make them- because the cost to not get vaccinated for society as a whole is that much greater.

    • by Ed Avis ( 5917 )
      Exactly. Schneier complains that the market prefers quickly-released software to secure software. He may think this is foolish. But since when was it up to him to dictate what people should and should not be able to buy? Currently you have the choice between cheap software with no liability and very expensive software with sufficient checking. Some, like NASA and the military, may choose the expensive option, but the cheap option should be available too.

      Most Slashdot readers may think it unfortunate that the market prefers Windows and MS Office to more capable alternatives, but few would argue for the more popular choice to be banned as a way of 'correcting' the market's decision.
    • 1.<snip>People's lives do not depend on commodity software. That's the bottom line. We demand more of products that hold people's lives in the balance.
      Sure, people's lives do not depend on commodity software; but not all software is commodity software, and there is no distinction in law, AFAIK, between types of software for liability purposes. Additionally, even commodity software can expose users to significant risks (leaks of credit card numbers or valuable proprietary business data, etc.) which can result in real damage to the user. I have yet to see *one* cogent argument for why users of software should have no recourse in law for damages caused by the negligence of their software vendor, when that is not the case for any other industry. Software may be different, meriting different fine details in liability legislation, but IMO it's not so different that it merits exemption from all liability whatsoever.
      2. It is entirely possible to purchase software with the level of protection you need and require
      I have no problem with this, but would add that all software should come with an explicit declaration of the level of reliability claimed by the author/vendor. If you don't warrant the software, you should *have* to declare that it is NOT FIT FOR ANY USE. Thems the breaks. If you can't stand by what you write, don't promote its use by others. I believe this covers point #3 as well.
      4. This will have a chilling effect on software development. Individuals or small projects may never be released because of fear of liability.
      Good! If some developer doesn't have any faith in their own ability to produce quality software, why the hell would I want to run it? Developers who are serious and believe in their abilities will purchase some limited amount of liability insurance and go boldly forth and code!

      5. <snipped nightmare scenario about all software becoming proprietary and encrypted>
      There is no consensus about what the "easiest way" to security or product quality is, so these wild speculations are silly. Insurance companies will likely review the known incidents and develop a somewhat empirical set of best-practices guidelines. At this point, raves like this constitute FUD, nothing more.
      6. The costs far outweigh the benefits. People today can choose secure software (OpenBSD: four years without a remote hole). They just don't care to. What does this mean? People simply don't care.
      NO! People don't know. Oracle advertises its software as "unbreakable." M$ ads feature beatific datacenters with hypnotic voiceovers describing the server software existing in a state of nirvanic bliss. Believe me, if M$ advertised IIS as "works great if disconnected from the internet and you don't move any windows in the GUI," people would choose not to use it! See my answer to #2 for more on how merchantability claims should be spelled out.

      I honestly believe that software liability would be a net win for the industry, admittedly with a little pain in the short term as people learned to live within the new system.

  • The person using the flawed application?
    Or the person creating the flawed application?

    Follow me on this.

    There are two sides to this: some people MIGHT want less secure software (a bit more rushed, thus less expensive) because of their specific application. Why should customers who don't request the extra assurance absorb the costs?

    We could discuss this point and map out the gray areas, and it could make an interesting debate, but it's 1am, so I'll limit this to something plain and simple. And this is no Microsoft-bashing karma whoring, since I already topped the 50 limit.

    Here goes: if you want companies to be responsible for security flaws in their software, you first have to see whether they make any misleading claims. Guess who comes to mind first? Yes: Microsoft. I don't run unix servers at work yet; I am exploring putting my email server on FreeBSD with postfix (which is kind of a pain for a win2k guy who lost his unix/amiga side a long time ago :) ), and there's one thing seriously annoying me about Microsoft, not as a Linux zealot (I dislike linux in its current state, to be honest) but as a customer who has bought close to six figures' worth of microsoft software. They put so much into marketing, so much into presentation, so much into finding new ways of doing stuff, or clever ways to steal^H^H^H^H^H implement existing ideas. But GOD I *HATE* it when I see them claiming their OS is the most secure, GOD I HATE it when they say it's more reliable than any competitor's OS, and I hate it when there's a bug and I think I'm the problem and then find out it's an OS bug (though I can live with that). All this to say: if a vendor claims for marketing purposes that his OS/software suite/product is more secure, that SHOULD AUTOMATICALLY MAKE HIM RESPONSIBLE FOR *ANY* UNEXPECTED SECURITY ISSUES THAT ARISE, ESPECIALLY THE MAJOR ONES.

    Look at how Nimda killed most servers and workstations running IIS, and look at the freakin' time it took for that bastard to get off the net. Even MONTHS later I still had port 80 probing on my machine, for god's sake. How many high-speed providers shut down incoming traffic on port 80? That was due to one serious SECURITY flaw, and it cost a lot of downtime and unexpected expenses.

    Yes, there are stupid admins who don't update their machines often. But let's be honest here: how many updates do you need for major flaws on IIS versus, say, Apache? I run IIS on an intranet, so I can "forget to update," but if I ran it on the internet, how many updates a month would I have to do compared to Apache? A LOT more. I read both security lists out of curiosity, and the feeling I had about this was absolutely true. Too bad Apache doesn't have an IIS-style front end and ease of use on win2k, because I'm sure IIS would take an even bigger drop. I guess Microsoft will do something really good with IIS6, because they are probably feeling the heat right now.

    Anyway, this is why I will NEVER run my critical services such as DNS or EMAIL on Microsoft software (I use the ISP's for now, though I'm considering moving them locally). They rush their things out and fix them later, which is totally unacceptable, and they force you to upgrade your browser instead of patching the bugs, introducing new ones in the process, etc. This is really becoming a serious issue. I wouldn't mind all this if they would at LEAST be honest about it, but no, they want to go the PR route and bullshit people about security compared to unix systems. Come on, I have yet to see a Nimda breaking loose on unix servers (and that's only one example; let's not even talk about Melissa or the others).

    There aren't only negative sides to Microsoft software. Windows 2000 is the best OS I've used since my amiga; it has its downsides, but the amiga had its downsides too, so nothing is perfect. Win2k server is great for a small business like mine, it's stable enough to do the job, and I find IIS great for running my intranet. IIS would probably be the only piece of software I'd expose to the internet (on a non-critical server), because it's simple, easy to manage, and permissions set up pretty simply (for those of us who hate text files). But like a lot of people here, even though I find most Microsoft software simple and OK, I'd NEVER build a mission-critical solution on their products. I'd never run an "ebay" on IIS, and I'd never be an ISP running my DNS services on win2k. Some do, and some don't have many problems, but when they do have them, they can tell you what hell looks like.

    So all this to say: if you want to sell stuff with no responsibilities attached because people don't ask for them, or simply because of budget constraints, you can still be successful and fill a need. But if you LIE about it, in my book you deserve to be punished, and severely. If you were turning blue and a doctor told you "it's nothing, just take two aspirins" and you died a few hours later, he'd kiss his career goodbye. While business isn't necessarily life, you can mess up a LOT of lives if your business goes down because you miss a demo, or your 20 programmers are down for 2 days because of a big virus attack and you need to rebuild all the servers, and so on. I'm sure at least one slashdot reader could say he missed a demo and his financing because of a stupid issue like this (that might be a bit of a stretch, but you get the point). What about the lives of those employees? What about the total cost of all this downtime across the country?

    Microsoft is quick to blame piracy for costing BILLIONS of $$$, but they are just as quick to change the subject when we ask how much THEY are costing the industry in downtime, upgrades, and patching. Again, I am not against Microsoft, because I think they still make great products; I am against their ATTITUDE towards the industry and all the false (or at least exaggerated) claims they make. If I did half of this as a small business, I would kiss my career goodbye. Why should an American icon be allowed to do it so blatantly?

  • One of the few good security ideas around is LOMAC [pgp.com]. This is mandatory security, the only kind that works, because the apps aren't trusted. It works like Perl tainting, but at the kernel level. There's a Linux module, and it's in the "current" version of FreeBSD.

    The problem is modifying applications to live within the limits of LOMAC-type security. Work is underway to make WU-FTPD work under LOMAC, but somebody needs to do Apache and a mail program.

    If you work on any of those apps, read the LOMAC stuff and fix your apps to live within the LOMAC rules. This will do more for security than any amount of patching.
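    For the curious, the Low Water-Mark idea behind LOMAC can be sketched in a few lines. This is a toy model of the general LWM demotion rule only, not the real LOMAC kernel module's interface:

```python
# Toy sketch of LOMAC-style Low Water-Mark mandatory access control.
# Integrity levels and the demotion rule follow the general LWM model;
# the class below is illustrative, not LOMAC's actual API.

HIGH, LOW = 2, 1  # e.g. system files vs. data read from the network


class Process:
    def __init__(self):
        self.level = HIGH  # processes start at high integrity

    def read(self, obj_level: int) -> None:
        # Reading lower-integrity data demotes the process (the "low
        # water-mark"), much like Perl's taint mode marking outside data.
        if obj_level < self.level:
            self.level = obj_level

    def can_write(self, obj_level: int) -> bool:
        # A demoted process may no longer modify higher-integrity objects,
        # regardless of what the (untrusted) application code wants.
        return self.level >= obj_level


p = Process()
assert p.can_write(HIGH)       # untainted: may write system files
p.read(LOW)                    # reads data from the network...
assert not p.can_write(HIGH)   # ...now demoted: system files protected
assert p.can_write(LOW)        # but low-integrity files are still fine
```

    The point of the model, and why apps need modification to live under it, is visible even in this sketch: once a daemon touches network input, it can never again write the high-integrity files it may have been designed to update.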

    • Interesting.

      Mandatory access is also part of Java security in the form of Protection Domains. [sun.com]

      This is part of a very sophisticated, multilevel model which can enable components from different sources to interoperate with the minimum of overhead.

      <IMHO>In general, OSS designers should track Java features and figure out how to use or duplicate them. It's easy to be complacent when considering the current generation of MS products, but once MS has transitioned to Dotnet swathes of security and reliability problem areas will have been eliminated. OSS will be left looking very exposed (unless it wants to be dependent on Mono...)</IMHO>
  • It would still lose a huge userbase.

    Consider a vendor forced to take on liability for its Zapwicky Mark II. It uses some free software internally; this is known, and nothing untoward is happening. The problem is that the vendor is itself taking on liability for the free software. If I were making the decision on what to include in the product, that by itself would be reason to abandon the use of free software and choose something proprietary whose liability, if there were problems, could be "passed on".

    Clearly, IANAL.

  • This would fuel the certification and standards industry and virtually stop all free software development, because nobody would use free software written by people without dozens of certificates who could not prove that they followed certain standards.

    I say invest more in education and teach people about security.

    I know that sometimes the only way to force businesses to improve quality of service is to hurt them where it hurts most, i.e. dig into their bank accounts, but the proposed solution would kill software development.

    Not good.
  • This would make software more expensive; you simply cannot have it for free. Period. So the only question is whether being able to hold software producers liable is worth what it costs.

    And the answer to that is entirely variable and conditional. If it costs (pulling a number out of my ass) $5000 extra for a machine that you only play games on, then it's not worth it; if it costs a million dollars for a machine that controls your water recycling on a year-long trip to Mars, it probably is.

    And because it is sometimes worth it and sometimes not, it should be an option. Instead of making every programmer bonded and liability-insured, thereby increasing the cost of all software, let the user decide when it's important and when it's not, and deal with buying the insurance themselves.

    And once it comes down to that, we all damned-well know that most users really don't care about security. So Bruce is trying to push this against the will of the people who will have to pay for it. I know he means well, but it's really just another attempt to enforce good ideas, which is usually a bad idea. Instead, he should stick to educating people about security, which he happens to be very, very good at. Instead of writing your congressman about outlawing liability disclaimers in license agreements, buy a copy of "Secrets and Lies" for a friend.

  • Like many in the security industry, he just can't argue. He's all but given up trying to convince people that security is important. It's funny, but he actually believes that the common man gives a shit about security. News flash: they don't. Because of this misunderstanding he pushes the blame onto the developers. Why should they be forced to develop secure software if no one wants it? Therein lies the problem. If you want security to be taken seriously, stop trying to use the force of government to make developers do something the market doesn't want; convince the market that it is important. And no, that doesn't mean releasing scripts on bugtraq so kiddies go attacking innocents and you can point your finger at developers and say "see, see, bad software".
  • (A day late, a dollar short... I doubt anyone will read this. Oh well.)

    I agree with Schneier that software liability is the only thing that can fix the sorry state of today's commercial software. I also agree with the Slashdotters who say that making authors of free (either meaning) software liable would kill off the practice. When I first pondered this dilemma, I came up with an idea so fiendishly perfect that I'm sure tons of people have thought of it before: make the degree of liability proportional to the cost of the software!

    The Microsofts and Oracles of the world who make expensive, broken software will have to change one or the other or be sucked dry by damages awarded in liability lawsuits. On the flip side of the coin, the freeware and Open Source/Free Software communities won't have to change anything, and the shareware folks would be protected by the fact that most people who use their stuff never pay for it, perhaps even encouraging more people to buy shareware so that they might have legal recourse if it ever fails in the future.
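    The proportional-liability idea reduces to trivial arithmetic; a sketch, with a made-up multiplier chosen only for illustration:

```python
# Hypothetical sketch of "liability proportional to price".
# The multiplier is invented purely for illustration.

LIABILITY_MULTIPLIER = 100  # max damages = 100x the purchase price


def max_liability(price_paid: float) -> float:
    """Free software (price 0) carries zero liability;
    expensive software carries proportionally more."""
    return price_paid * LIABILITY_MULTIPLIER
```

    Under this invented multiplier, gratis software stays immune by construction (100 times zero is zero), while a $299 product could be on the hook for up to $29,900, which is exactly the asymmetry the comment argues for.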

For God's sake, stop researching for a while and begin to think!
