Liability and Computer Security
Pelerin writes: "In the latest Crypto-Gram, Bruce Schneier has written an interesting essay with some thoughts about the current lack of business incentives for the deployment and production of more secure software. His main recommendation/prediction is this: 'Step one: enforce liabilities. This is essential. Today [...] the marketplace rewards low quality. More precisely, it rewards early releases at the expense of almost all quality. If we expect CEOs to spend significant resources on security -- especially the security of their customers -- they must be liable for mishandling their customers' data. If we expect software vendors to reduce features, lengthen development cycles, and invest in secure software development processes, they must be liable for security vulnerabilities in their products.' Schneier's five-step plan for thinking about security is also good."
Pelerin continues: "All well and good, but this raises some questions in the case of a company offering security solutions based on open source / free software.
- Where does the chain of liability end? Can somebody attempt to recover damages from Linus when a kernel security hole shows up?
- Can a case be made for lower insurance rates for free software solutions? (I mean, can it be made to the accountants and the lawyers, not the techies).
- When liability enters the picture, which mechanisms can allow free software to compete based on its merits, not on the likelihood of surviving a liability lawsuit?
Free software (Score:3, Insightful)
Chris
Re:Free software (Score:2, Insightful)
See, if the GPL gets any sort of legal relief, the next MS license will copy it, unless a Free Software Clause is added to any such bill (which is unlikely, given how absurdly pro-business -- i.e., pro-Microsoft -- the current administration is).
In other words, only the BSD license would be screwed by this.
Re:Free software (Score:3, Interesting)
In big alarming black-on-yellow letters.
Pity it'd never happen, but...
Re:Free software (Score:1)
Re:Free software (Score:1, Interesting)
Re:Free software (Score:1)
redhat would be screwed then (Score:1)
Re:Free software (Score:1)
Re:Free software (Score:2)
Microsoft can't do much of that either. They might have $10^12 in market capitalization, but that would melt like a snowball in Hell if they were held strictly liable for security breaches. Their cash reserves would melt pretty fast too, if they had to pay the reported damages for each virus they enabled.
I'll bet that any such legislation which actually gets passed will be weasel-worded enough that MS will have no real risk of ever shelling out, while Linus et al. will be on the hook for everything.
To judge the true effect of such a bill, just watch MS's stock price: if it falls as the bill goes through, it might hold someone liable for something. If the stock price holds steady, the bill will have no effect at best, and eliminate Libre software at worst.
Liability? (Score:3, Insightful)
Re:Liability? (Score:2)
Equitably, for free software, the buyer is assuming essentially all risks. Double your money back if not satisfied.
With open source, the buyer is in a position to identify and fix problems and not totally at the mercy of the seller.
Re:Liability? (Score:1)
That's true for all industries. If Mom'n'Pop's Tire company makes a bad tire then they won't lose much. If Firestone makes a bad tire then they lose millions.
Fortunately, revenue and profit are also proportional to success. Well, usually.
Re:Liability? (Score:2)
I agree that this is problematic. This is the reason why I think that the manufacturer of a piece of software should have no liability at all unless they choose to!
I think that everybody that uses computers should be insured against risks caused by them. Now if MS says "No warranty" and the GPL says "No Warranty" then the insurance will say "MS is not better than GPL with regard to manufacturer liability, your insurance will cost the same."
Of course this can be moderated by other factors like product history and operator competence, but essentially it would prevent manufacturer claims they do not back up. The insurance companies would just tell everybody straight what the situation is.
on legal liability (Score:3)
Do we really need MORE laws to control us? What about those magic fingers of the markets? You know -- the ones that are supposed to push products toward what people demand.
It's not clear to me that legislating software through increased liability is the best way to get security.
thoughts?
Re:on legal liability (Score:5, Insightful)
This may give proprietary software a PR advantage over free software (it has to uphold higher standards), but them's the breaks. Besides, free software has always touted an equivalent PR advantage (the source has been reviewed by countless experts in the field), so it's just good old-fashioned competition.
In my view, those who are against software liability are no better than the RIAA/MPAA who try to prop up their inefficient ways of doing business through lobbying and legal bullying. They too like to blame their customers when anything goes wrong.
Re:on legal liability (Score:3, Insightful)
I can see where the liability guys are coming from. OSS folks release the source, and GPL folks release a whole bunch of other rights as well. With code in hand and a pile of rights to do with it as you please, as well as probably not having paid a dime for it, the customer is more of a partner -- assuming a lot of responsibility. Proprietary people charge money for what is really, despite their protests, a product. I've got a CD, maybe a book or two, some shrink wrap lying on the floor, and I'm at the vendor's mercy. I find out about security holes by getting cracked, even if the vendor has known about the hole for six months.
The bottom line is that by retaining power, proprietary software companies also retain responsibility. If I am not allowed to look through and modify the source, the holes in my system are not my responsibility (except for buying bad software), but that of the vendor who won't allow me access. Power = responsibility. Money = money. People pushing to hold software companies liable aren't the "let's sue everybody" crowd; they are using the standards of the proprietary, corporate world against itself. Or, if you prefer, holding those companies to their own standards.
License agreements are funny. According to one, I can't use my copy of XP on any box except the one it came on (don't worry, I haven't even used it on that one). How legally binding is an "agreement" that I didn't get to see until after the sale was completed? For that matter, how legally binding is an "agreement" with a monopoly? The "magic fingers of the markets" that you are holding out hope for are wearing thumbcuffs, my friend. But if the customers have to pay through the nose and have all real power held back from them, then the only answer is financial liability for the vendor. They might actually bother to produce good software then. If that financial incentive isn't enough, then there are other, more drastic legal measures. MS is illegally maintaining a monopoly, you know.
microsoft anyone? (Score:1, Interesting)
The argument for manufacturer liability can be extended to be applied toward gun manufacturers. Just because a gun can be used to kill someone, doesn't mean the manufacturer should be held liable for the wrongful death. The lack of common sense present in the user should not be cause to pass the blame onto someone else.
Re:microsoft anyone? (Score:1)
Just because a gun can be used to kill someone, doesn't mean the manufacturer should be held liable for the wrongful death.
An apt analogy would be that of a gun firing backwards into your skull when you aim it to shoot a burglar in the foot. Now, that is a problem. In that case, the defect is *IN THE PRODUCT*. Just contrast "Format C:" with the US Navy ship left dead in the water because of a divide-by-zero error. In the case of Code Red and the LookOut viruses, someone else can be blamed for the attack, so M$FT is easily off the hook.
If a M$FT product is perceived as the only show in town there is nothing that can be done.
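The divide-by-zero ship failure mentioned above is worth sketching, because it shows how a defect *IN THE PRODUCT* differs from user error. This is a hypothetical reconstruction in Python (the function and field names are invented, not the Navy's actual code): an operator leaves a form field blank, it is stored as zero, and the value reaches a division with no guard.

```python
def fuel_efficiency(distance_nm, fuel_used_gal):
    """Defective version: assumes the operator never enters 0.
    A blank field stored as 0 crashes every caller downstream."""
    return distance_nm / fuel_used_gal

def fuel_efficiency_guarded(distance_nm, fuel_used_gal):
    """Fixed version: validate the input at the boundary instead of
    letting a bad value propagate through the whole system."""
    if fuel_used_gal <= 0:
        raise ValueError("fuel_used_gal must be positive")
    return distance_nm / fuel_used_gal
```

The point of the contrast: "Format C:" is the user wielding the product as designed, while the unguarded division is the product failing on input it should have anticipated.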
Re:microsoft anyone? (Score:3, Insightful)
Re:microsoft anyone? (Score:1)
Re:microsoft anyone? (Score:3, Interesting)
It seems to me that this is exactly the kind of test case that needs to be looked at when discussing legal liability for software. If the patch is available, how much of the responsibility is on the administrator to apply it and how much is on the software company not to have written the buggy code in the first place? You can certainly argue that the availability of the patch should exempt the manufacturer from liability, but just how long does the patch have to be available to count? Is it acceptable if the patch is only available one month before the exploiting code shows up? One week? One day? One hour? Or should software authors have an affirmative responsibility to send patches to users, the same way that car manufacturers have to contact their buyers in the event of a recall? Who is liable when the patch is available but unapplied is the really interesting issue, not who is liable when no patch is available.
Re:microsoft anyone? (Score:2)
Which leads to the question of why it wasn't applied.
this hurts independent developers (Score:1)
Re:this hurts independent developers (Score:1)
So, upon reflection, I must disagree with you here. This sort of thing would be a godsend for my company (currently with a grand total of 8 employees). The companies this would drive from the market are ones that shouldn't be in it, which would be a big benefit for my little co...
Higher insurance because... (Score:2, Interesting)
"A company doesn't buy security for its warehouse -- strong locks, window bars, or an alarm system -- because it makes it feel safe. It buys that security because its insurance rates go down. The same thing will hold true for computer security. Once enough policies are being written, insurance companies will start charging different premiums for different levels of security. Even without legislated liability, the CEO will start noticing how his insurance rates change. And once the CEO starts buying security products based on his insurance premiums, the insurance industry will wield enormous power in the marketplace. They will determine which security products are ubiquitous, and which are ignored. And since the insurance companies pay for the actual liability, they have a great incentive to be rational about risk analysis and the effectiveness of security products. And software companies will take notice, and will increase security in order to make the insurance for their products affordable. "
Could you imagine if the corporation you own was charged more for liability insurance because you used the current version of IIS? It's so sad it's funny. If this wouldn't make Microsoft or Company X clean up their act, I can't imagine what would, other than the ethics of it :)
Personally I work in healthcare, so if my crap's not together I am going to jail. Too bad there's no HIPAA for everyone.
Re:Higher insurance because... (Score:1)
However, since so few companies actually have that insurance, it hasn't yet influenced the marketplace.
It seems that MS products are more problematic than open source/free software.
I'd be careful about calling for liabilities (Score:2, Interesting)
And the hardest hit will be the small and free software developers.
Honestly it looks like the _best_ way to make big companies serious about software quality is to get the press on your side. A few high-profile MS security holes and what do they do? Launch a major internal initiative and rewrite IIS from scratch. If they continue to have holes after this, you can bet the press will be right there to grill them for it.
Why do with lawyers what the free press and word of mouth can do better, faster, and cheaper?
Yah (Score:2)
Schneier is smart and knows a lot, but this is a stupid idea.
Re:Yah (Score:1)
What on Earth does this have to do with Schneier's idea?
Re:Yah (Score:1)
Re:I'd be careful about calling for liabilities (Score:2, Interesting)
Hehehe! That's the most hopelessly, optimistically naive thing I've seen anyone say all year! Congrats!
Historical note: It wasn't the book Unsafe at Any Speed that resulted in improvements in car safety. The publication of that book just got GM to spend huge amounts of money attempting to smear the author. It wasn't until the lawyers and politicians took notice and stepped in that the situation actually improved any.
Yes, politicians and lawyers suck, but the alternative is worse. Faith in the Bible requires faith in "facts" no one was around to see. Faith in Libertarianism requires faith in "facts" the older among us were around to see, and know are false. An altogether higher order of faith... :)
Re:I'd be careful about calling for liabilities (Score:2)
But you have to admit, some of them seem to try very hard to get the general population to want to strangle everyone with a law degree. And if there are too many lawyers around, why are they so expensive? If there aren't enough, why do we have so many willing to take on frivolous cases?
Indemnity clauses (Score:4, Insightful)
Why not hold Network Admins responsible for problems on their networks? I am a network admin, and if some kid got in and stole a database from one of my employers, compromising customers, I would expect to take the full heat for it. In the back of my mind I'd be saying "F*** Microsoft and their buggy-ass code", but I would know it was my fault for allowing it to happen.
This is no solution. What's the estimated cost of Nimda so far? Code Red? sadmind? Melissa? ILOVEYOU? All the other Outlook worms?
The cost of lawsuits from just these AUTOMATED attacks would cripple even Microsoft. Not to mention the CDUniverses of the, er, Universe.
Software authors need these clauses for a reason, if they didn't have them there, they might as well go start a farming commune instead because it wouldn't be worth it to code anymore.
Free Software authors would then also have to specify under which conditions they would ALLOW their software to be run. Otherwise some schmuck could install some
Nice idea, just to tweak MS, but I don't like the way it would play out.
Re:Indemnity clauses (Score:2)
Ummmm, I'm an admin and I don't want to be responsible for *MS's* buggy software. Are you seriously suggesting that MS can't afford to be liable for their problems, so I should be?
As for GPL/Open Source etc. software, you're right that there is a problem :)
Re:Indemnity clauses (Score:2)
I hate saying this, but even Microsoft should not be held liable for the effects of sloppy admin work.
I just see this as an 'out' for admins and companies who didn't want to spend the money up front to invest in their data security. They can put it out there, not fully secure their network, get hacked because of some buffer overflow or IIS directory traversal attack, blame the software and go on with life.
I hate to see that happen. Especially when the victim software becomes ssh
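For what it's worth, the IIS directory traversal attack mentioned above is easy to sketch. This is a hypothetical file server check in Python, not IIS's actual logic: the URL path is joined to the document root, and without normalization a request containing ".." segments walks right out of the web root.

```python
import os.path

WEB_ROOT = "/var/www/htdocs"  # hypothetical document root

def resolve(request_path):
    """Map a URL path onto the filesystem, refusing paths that
    escape the web root once '..' segments are collapsed."""
    joined = os.path.join(WEB_ROOT, request_path.lstrip("/"))
    candidate = os.path.normpath(joined)
    if candidate != WEB_ROOT and not candidate.startswith(WEB_ROOT + os.sep):
        raise PermissionError("path escapes web root: " + request_path)
    return candidate
```

The real-world bugs of that era were usually one step subtler: the server checked the path *before* fully decoding overlong or doubled URL encodings of "../", so the check passed and the traversal still worked.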
Re:Indemnity clauses (Score:2)
Re:Indemnity clauses (Score:2)
Re:Indemnity clauses (Score:1)
It seems significantly more reasonable to hold a large company charging loads of money (Microsoft) liable than it does to hold poor starving Linus Torvalds, who's giving away his software for no money, liable.
Amateur cars (Score:5, Insightful)
This is what will happen to software if similar laws are applied to software.
Re:Amateur cars (Score:1)
That's a good point. However, it's a bit disingenuous to imply that the automotive hobbyist has disappeared. There are veritable legions of people who mod cars, do body work, repair engines and work on the racing circuit. It's simply not cost-effective for hobbyists to build whole cars that comply with street-legal standards.
However, it's not quite as expensive to author "street legal" software. Some would argue that it's even cheaper for hobbyists than for corporations. Manufacturing would not be made more expensive by the author's proposal, just the testing and certification process.
If it turns out that only a handful of corporations are capable of authoring mass-market software under the proposed system, then so be it. I don't think the hobbyist market will disappear. It's hard to argue that auto hobbyists have been disenfranchised by the current system. The only hobbyist action that is currently restricted is to take an unsafe car onto a public road.
Re:Amateur cars (Score:2)
Please remember that most legislators don't know much about the technical issues. They do know about liability, and in those laws they have a marked history of being pro-lawyer rather than pro either of the litigants. Then there are the lobbyists, trying to push their particular twist on the law. I don't think this is the formula for ending up with a law that either we would approve of or that would be better for the country.
Re:Indemnity clauses (Score:5, Insightful)
Fine, now Microsoft is liable for NT vulns, but you can't basically throw MS licensing rules out the window and leave BSD and GPL intact.
You can get MS and leave the GPL (essentially) intact. The difference between them is that you pay for MS stuff, whereas you generally don't pay for GPL software. Of course, if you pay for GPL software, you should probably have a right of action against the supplier (but not necessarily the original author, if s/he gives it away).
The technical legal difference between the two is that an MS EULA is a contract (legally binding agreement for mutual consideration), whereas the GPL is only a licence (permission to do something the grantee couldn't previously do, without anything in return). I understand the contract/licence nature of the GPL is still a matter of some debate, but if a law were passed saying "no clauses excluding liability in contracts for the sale of software", then we could probably catch the EULAs and leave the GPL and other open source licences intact where the GPL'd or OSL'd software was provided gratis. At any rate, I think it should be possible somehow to distinguish the two on a "you pay for one, you don't pay for the other" basis.
It depends on who made the decision to go with the buggy software. If it was your decision, then yes, the responsibility falls on your shoulders. If, however, the decision came from management on the rationale that "nobody got sued for going with MS" or some other non-tech-related reason, and that decision was made against your own advice, then you shouldn't cop the heat for that.
Of course, given your lowly position in your organization relative to the goon that actually made the decision, office politics will pretty well guarantee that you'll take the heat anyway :).
Very good point (Score:2)
People pay for GPL'd software every time they buy a distro of Linux; the burden in that case would probably fall on RedHat or SuSE or Mandrake, rather than the original authors, but the argument can be made. And if there's an argument to make, leave it to some shifty ambulance chaser to make it...
Re:Very good point (Score:2)
The argument could be raised, but thankfully the GPL contains an "as is" warning effectively from the author. The law of negligence says that if you warn people about the risks they are undertaking before they commit to something, the standard of care you owe is lowered. If I warn a person to whom I give my work about the potential dangers of that work, responsibility would be transferred from my shoulders to theirs. In addition, if I go so far as to give them the means to guard against any flaws in my work (by providing source in the case of software), then I probably have an effective defence in the form of contributory negligence if it blows up in their face. Finally, any case in negligence would be weakened by the enormous social (and arguably economic) benefit that flows from me giving away software. I doubt that even a lawyer who's clever to the power evil could get around all those factors.
Products liability law usually says that such warnings don't apply, or if they do, they have to be prominently displayed on packaging, etc (which would tend to put consumers off buying it: can you imagine MS labelling its boxes in large letters "Warning! This software may delete all your work and ruin your life!"), but again, you have to be in the position where you're selling software, rather than just giving it away, in order to attract liability under those laws.
As you say, if MS was really smart, they would go along with any new legislation, but insist on strict liability provisions that would cover all software. They could afford the increased legal exposure, but the average individual author of GPL'd work couldn't.
Re:Indemnity clauses (Score:2)
If management is employing you for your tech-related expertise and advice then they should listen to it - whatever their prejudices about the matter are. However even when management actually made a bad decision contrary to advice they often want a scapegoat to blame.
Re:Indemnity clauses (Score:3, Insightful)
That's true. Software is unlike most any other product because of its complexity and nonlinearity. The average software developer makes hundreds of individual decisions per day that end up embedded in their code. Any one of those decisions could be a hole that destroys the security of the entire product.
Testing and review help, but decades ago it was mathematically shown (this is the halting problem and its generalizations) that in general you cannot prove whether an algorithm is bug-free. The tiniest crack in the logic could be used by an attacker as a wedge to subvert the entire product.
This is very different from designing bridges or buildings, for example, where the thousands of decisions going into the design tend to reinforce the basic premise of its fundamental soundness. The mathematics of each calculation are usually verified by calculations done during other parts of the work. Due to this feedback, systematic failures are extremely rare, and when they do happen, they often end up showcased on History Channel programs such as "Engineering Disasters".
Laws developed to assign liability for bridge failures, train wrecks, etc. are not suitable for software problems. There needs to be a crystal-clear distinction made between companies and individuals who make an honest mistake and work in good faith to correct it (no matter what havoc it wreaked), versus those who recklessly ignore third-party warnings and past problems in favor of distributing obviously flawed products time and time again.
In other words, software liability should not focus on individual incidents, but on trends and patterns of behavior. Unfortunately, the law usually focuses on minutiae, and it would be very hard to get it to focus on the big picture to punish only the genuine schmucks. Current legal practice usually likes to make examples out of a few unlucky small-timers. But as I explained, every software developer is almost certainly a potentially unlucky small-time offender.
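The "tiniest crack" point above can be made concrete with a contrived Python sketch (the function is invented for illustration): the guard and the test generator share the same wrong assumption, so a perfectly green test suite never exercises the one input that fails.

```python
import random

def checked_div(a, b):
    """One honest slip among hundreds of daily decisions: the guard
    rejects negative divisors but forgets that b == 0 also crashes."""
    if b < 0:
        raise ValueError("negative divisor")
    return a // b

# A respectable-looking random suite that will never find the hole,
# because the generator assumes divisors are at least 1 -- the same
# assumption the author of checked_div made.
random.seed(1)
for _ in range(10_000):
    a, b = random.randrange(10**6), random.randrange(1, 10**6)
    assert checked_div(a, b) == a // b
```

Ten thousand passing cases distinguish this function from a correct one on exactly zero inputs; only b == 0 does, which is the sense in which testing and review help but cannot prove the absence of bugs.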
Re:Indemnity clauses (Score:1)
Best post so far. Software lemon laws would be great, but very difficult to do without taking the intent and/or general practices of the manufacturer into consideration.
Re:Indemnity clauses (Score:3, Interesting)
This is very different from designing bridges or buildings, for example, where the thousands of decisions going into the design tend to reinforce the basic premise of its fundamental soundness. The mathematics of each calculation are usually verified by calculations done during other parts of the work. Due to this feedback, systematic failures are extremely rare, and when they do happen, they often end up showcased on History Channel programs such as "Engineering Disasters".
But it is possible to write secure software through good software engineering practices. Unfortunately, not many people seem to understand them. Only a few individuals like Dan Bernstein [cr.yp.to] can consistently and effectively write secure software, and will guarantee [cr.yp.to] that it is secure.
If software was thoroughly designed from the start before any code was written, the same as with normal engineering projects, then perhaps more software would be secure. If you look at his guarantee for qmail, then you'll notice that he followed several principles throughout the design and implementation that allow him to guarantee that it is secure. If software engineers become liable for their work in the same way that traditional engineers are liable, then maybe software engineering will become more like traditional engineering.
Interesting idea but will probably get f**ked up (Score:1)
As countless examples in the US legal history tell us, this problem will most probably be solved by creating a set of (rather stupid) arbitrary rules that software makers must follow.
Consider this example: US government institutions may only use software that meets certain accessibility standards (e.g. you have to be able to increase the font size, display stuff in high-contrast mode etc.) The only company that has resources to make its software compliant with these rules at the moment is Microsoft, it is just too expensive for others.
Now what makes anyone think it would be any different with these security requirements? The rules will probably be something like "all financial transactions must use SSL" or "passwords must be encrypted with 128-bit keys" or something like that. But the reason behind most security holes today is not so much insecure protocols or insufficient key lengths but invalid assumptions between different components in complex software. And no law is ever going to take care of this problem.
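The "invalid assumptions between different components" point deserves a sketch, since it is exactly the kind of hole that checklist rules ("use SSL", "128-bit keys") never touch. This is a contrived Python example (the function names and query are invented): each layer meets its own spec, but each assumes the *other* one escapes user input, so neither does.

```python
def build_query(username):
    """Storage layer: documented as assuming username is already escaped."""
    return "SELECT * FROM users WHERE name = '%s'" % username

def handle_login(raw_username):
    """Web layer: assumes the storage layer escapes its arguments.
    Both components are 'correct' in isolation; the interface is the bug."""
    return build_query(raw_username)

def escape(s):
    """The step each layer thought the other was doing."""
    return s.replace("'", "''")
```

handle_login("x' OR '1'='1") yields a classic injectable query, over a perfectly encrypted, perfectly long-keyed connection, which is why protocol-and-key-length rules miss this whole class of holes.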
Won't work unless it's globally enforceable (Score:2, Interesting)
Re:Won't work unless it's globally enforceable (Score:2)
Not really, because this is enforced from the purchaser's end, not from the manufacturer's end. It's not whether software written in country X will have the rules applied to it, but whether software sold there will. It doesn't matter where the writer is, just where the user is. Of course, if a country that's a small player in the software purchasing market tries this, it probably won't work, because companies will simply refuse to sell there. But if the United States or the European Union decide to try it, there will be a huge impact. The US and EU are big enough markets that all of the big players would be forced to bring up their quality because they couldn't afford to be shut out.
Liability and free software (Score:3, Informative)
Liability is the reason that the Broadcast 2000 project was removed from public access, which is a tragedy because I'm sure tons of people could benefit from their free software. From their web site:
Theirs isn't a security issue, but it's still very relevant as they are acting out of the fear of being held liable for what they were offering for free. That is really sad.
Security issues are deep-rooted, and most definitely can't be solved by nullifying the liability clause in licenses.
Re:Liability and free software (Score:1)
From their web site: "We're not a corporation. We're not a guy in a dorm room trying to act like a corporation. We don't have accounts receivable but we do have accounts payable". The *real* reason was that Broadcast 2000 was not good enough. So they started to develop Cinelerra and made Broadcast 2000 unavailable from their site. But to keep it from getting too much attention they pretended it wasn't there, as you get this message: "It's not here anymore. Why don't you go to this award winning page." Where "this" takes you to Microsoft.com. However, it is available at http://sourceforge.net/projects/heroine -- these guys are still developing the software, but like playing stupid games to fool the general public into thinking they are no longer doing anything, to avoid getting too much attention. A nice overview of the *real* history of Broadcast 2000/Cinelerra is available with the Cinelerra docs.
Re:Liability and free software (Score:1)
Re:Liability and free software (Score:1)
Broadcast 2000 (Score:2)
However, theirs was a somewhat special circumstance, as they had a reasonable expectation of being sued by deep-pocketed organizations (MPAA, RIAA) whose motivations are well known and have little to do with actual software security or quality.
Sure fire success policy for M$FT (Score:1)
foreach (@msft_bug) { pay $_ and wait };
foreach (@competitors_bug) { they_pay $_ or go belly_up };
negotiate any class_action_suit ( "just like tobacco companies did" => increase price if (you_can_cite_extra_liabilities));
expand kitty;
wait and see everyone vanish "like fleas";
What do the customers want? (Score:1)
The simple fact is, customers do not want software that has reduced features and is more expensive. The customer wants cheap software that is slightly secure and feature-full. If a customer wants secure software, then there are alternatives (Linux, BSD, contract a programmer to create a secure program...)
If the government begins to practice software liability, then they are essentially telling the customer what she wants. The government does not know what the customer wants; the customer knows what the customer wants.
Hold users accountable, not vendors (Score:1, Interesting)
For another thing, many of the security problems that exist (as the article points out) stem from improper configuration and use of a software product. If I buy something from CheckPoint, and accidentally leave myself wide open while installing it because I'm too cheap to hire a real firewall jockey to do it right, how is that CheckPoint's fault? And if we don't hold vendors responsible for these misconfigurations, the "sue the vendors" fix doesn't solve this part of the problem at all.
As an alternative, think about holding the person or company who deploys insecure products, or deploys secure products incorrectly, responsible for the damage caused. If some virus emerges that roots your webserver and uses it to DoS me, it's your fault that I'm losing traffic. This puts the incentive to fix insecure configurations in the hands of the people who are closest to the problem.
Additionally, holding users responsible will tend to breed better security products. If a company realizes that it can be sued when its machines are compromised by ILOVEYOU and harming others' property, it will have a strong incentive to be selective and careful when purchasing and installing security measures. The guys selling IIS will have to clean up their act, or face a complete lack of customer interest.
Re:Hold users accountable, not vendors (Score:1)
Ideas. (Score:1)
Low quality stuff (Score:1)
Was he thinking of Windows XP when he wrote this?
Anyhoo, I agree with him about the ineffective airline security measures after September 11. If someone wanted to get on a plane and run it into some building, soldiers in terminals and having guards check for tickets at the gate won't stop them. If they really felt like doing so, I don't think spending the $200-whatever for a ticket is going to deter them. I think it's just there for looks. (they confiscate nail clippers for crying out loud)
Liability not the problem (Score:1)
I don't understand. If the cost of having no security is so low then liability won't change anything. Why get security insurance if you can easily swallow the cost of getting hacked?
It seems like the real problem is that security doesn't matter that much to most companies. If it did they would work hard to protect their bottom line by finding secure software. Liability won't significantly change this.
Design Limits ... (Score:2, Insightful)
We are still in the dark ages as far as software liability goes.
What about the reverse? (Score:2, Interesting)
What if a software company were to change its license such that it WOULD assume liability? Granted, it would probably need insurance of some kind, but how much more comfortable would a purchaser of this hypothetical company's software be if they had somebody to sue?
Let the free market speak - Once a company is confident enough in its product to offer a warranty, the rest will follow.
Re:What about the reverse? (Score:2)
One small problem. The security of the software is directly tied to the OS that it runs on.
If you write Windows software, you have to use Windows APIs (either directly or indirectly), and the security of YOUR application is then dependent upon the OS.
If you write on a more secure OS, then your application would be inherently more secure, and you could probably offer a warranty.
Likewise, if you write a module to add-on to an application, then your module would be at the same security level as the application ... and so forth.
Whose liability? (Score:2)
Not that "common best practice" insurance for security liability would be a bad thing - it's so much easier to cost-justify "running this will take our insurance premiums up $x" than it is to say "running this will increase our risk of Something Bad Happening by some unknown percentage." But it's the operators that bear that cost, not the manufacturers.
If you wanna run that FlashyRedSportscar 1.0 software that makes it more likely you hit a wall at 140 MPH - your risk, your call. Providing FlashyRedSportscar Software, Inc. was diligent in its processes, they shouldn't have to hire lawyers when you meet the wall.
Re:Whose liability? (Score:2)
If the car crash is determined to be caused by a design flaw, Ford IS liable. The classic case is the Ford/Firestone fiasco of a couple of years ago.
Software manufacturers should be held to the same standards as every other company out there. If a design flaw is present which could cause problems for customers (i.e. cost them money or productivity, damage equipment, or cause injury or death), they risk getting their ass sued. However, misuse by the user (e.g. typing 'rm -rf /' and losing all their work) does not make the manufacturer liable.
Basis of liability (Score:4, Interesting)
This insurance will get much cheaper if you use good systems and have the required competence to make them secure.
Some problems will have to be resolved by the legal community:
The last point is important, since you are only responsible for problems caused by your equipment, as long as they are not due to some criminal action by somebody else that you could not easily detect.
To stay with the car analogy: If somebody sabotages your brakes in a way you don't notice until they stop working, accidents that result may not be your responsibility.
An additional point: While a car manufacturer has certain responsibilities, not everything that can go wrong is their responsibility. Only things they claim or are required by law to claim have to be backed up by their product. If you hit a tree because you don't know how to drive or if you start sliding on ice, that is certainly not the manufacturer's fault.
In the case of software this gets a little more complicated, as there is no "unit" of software. My feeling is that manufacturers will not face legal requirements for characteristics their software must have, because such characteristics might be impossible to specify (not saying people won't try). Instead I think that cheap "computer operation insurance" will only be available for products where either the manufacturer takes legal responsibility for some characteristics of the product or the insurance companies have a strong indication that the piece of software has these characteristics.
I also think that Computer Scientists and other people that produce code and systems will have to have a kind of "Malpractice Insurance" whenever they commercially create code for others.
Re:Basis of liability (Score:2)
Interesting points ...
So ... basically, every company that operates an MS Server (or Advanced Server) without an MCSE running it would not be able to sue Microsoft because of the "seeming incompetence" of the administrator ... due to him not setting up the server correctly?
I don't think this would fly, since even if you DO set up applications "according to spec", there are security holes. Even with the latest patches installed.
This would be one of the points, in a lawsuit, that the defendant would have to prove: that the operator was incompetent to run the machine. If they were able to prove this, the plaintiff would lose. If they were unable to prove it, the defendant would lose.
Re:Basis of liability (Score:2)
Not quite. My thought was that you cannot sue MS unless they claimed something untrue, like "no security holes".
The missing MCSE would not have an impact on that, as the product is not changed by the way it is used. However, the missing MCSE, and the choice of a specific MS product with its specific set of vendor-claimed properties, would have a huge impact on the server insurance cost the company would have to pay.
And if something goes wrong because MS claimed something untrue, not the company but the insurance company would sue MS, to get back the money they had to pay for the damage done by the MS server.
In the case that the damage (to the company using the server and other involved parties) was not due to false claims by MS, the insurance cost for this company would probably rise, just as car insurance gets more expensive after you have an accident. But no claims against MS would result.
Note that I am used to the European legal system. If you misuse something or make a stupid mistake, you cannot sue the manufacturer here. The European legal system expects common sense from people. As an example, if you get cancer from cigarettes, you cannot sue the cigarette manufacturers here, because it is pretty obvious that cigarettes cause cancer. Or if you fall from a ladder, you cannot sue the manufacturer unless the ladder was badly made. If it just fell over because you did not secure it, that is your problem, because it is pretty obvious that ladders have to be secured.
I admit that I don't understand this American habit of suing the manufacturer, but my impression is that it is counterproductive, especially with the huge amounts of money people sometimes get in settlements. No such thing here: if you are compensated for pain or damage suffered, the amount of money you get is measured entirely against your losses. Of course companies might have to pay penalties, but that money goes to the state.
OpenBSD & Qmail are examples of insecure freeware (Score:2)
OpenBSD & Qmail are examples of insecure freeware. But isn't OpenBSD exhaustively audited, with many sections rewritten to eliminate security bugs, probably spotted before ever being exploited? Doesn't Dan Bernstein write rock-solid secure code in packages like Qmail and DJBDNS? The answer to both of those questions is a definite yes. The problem isn't that these things are made insecure; they are not. The problem is that the end user, or the system administrator, can too easily make things insecure with even the simplest mistake in configuration.
I'd love to have secure programs as much as anyone, and OpenBSD and Qmail certainly show that some of that is available now. But when I choose what software I will install, I have to do more than just choose what is securely written; I have to balance development security against administrative security. Certainly hiring more skilled administrators can improve security. But if the software is harder to configure and manage, then it takes more administrator time, skill, and attention for a given level of result. To that end, one of the important factors I judge software by is how easily it can be configured.
Close to that is another factor measuring how easily a given package can be hacked to correct a bug, or change a feature, if needed. If the code is well written, well documented, and clearly organized, the time it takes to hack it, and the certainty of hacking it correctly, is improved.
For any given package, there will be some people more experienced with that one than others, and so this isn't always a clear cut decision. I made the choice to go with Linux and Postfix, instead of the other choices. But this decision suited my needs, balancing reasonably secure software and reasonably secure adaptability to my environment (including programming and administrative skills). It won't be the same choice for everyone. And there are cases I've recommended software I don't actually use because it better suited someone else's different circumstances (fortunately I was at least reasonably familiar with it from evaluation to know its specifics). For example, if you have no need to change anything, I'd say OpenBSD would be the best choice for a combo firewall and server (just don't let anyone touch it ... a console is a dangerous thing).
Now here's the rub. What if someone does install OpenBSD and/or Qmail, and after they configure them, some kid breaks in and takes the machine for a ride? Are we going to blame Theo and Dan? I wouldn't, because I've seen way too many administrator mistakes (and learned from the ones I've made) to be putting the blame on the software. My big worry is that if we start pointing the liability finger at the software vendor, they're going to end up taking the heat way way more than they should be.
The OpenBSD and Qmail development people, as far as I know, fess up to their bugs, especially the security bugs, and let people know when a hole is found. If we are going to have software liability, I think that a practice of consistently divulging known vulnerabilities should be considered a safe-harbor from the liability, even for bugs that got exploited before the developers were aware. It's the practice of covering up on the vulnerabilities that I despise. That's where the liability should be.
The legal test should be whether the software vendor has carried out a consistent practice of immediately divulging (if not to the whole public, then at least to all their customers) the existence of the vulnerability, even if they don't have a fix for it yet. I'd rather take a web site down for a day when it is discovered to be insecure, while waiting to get it fixed. Of course open source is a plus here, as I can dig in and hack up a fix or workaround myself, even if it's just a quick and dirty one (like a grossly oversized malloc with some randomizing, for buffer overflows, to ride out a few days until a proper fix is available). And this means all customers, not just a few privileged big corporate customers.
Interesting way to level the playing field. (Score:2, Interesting)
What the government needs to do is enact legislation that ties source code to a company's liability for the damage its software causes in case of failure. If a company releases its code with its products, then exempt it from liability; the customer has the code and could fix it if they wanted to. But, for companies that choose not to release their code, make them liable for their shoddy product. After all, what they're selling us is *supposed* to be complete and usable, and if they're not going to put their customers in a position where they can fix problems with a product themselves, then the closed source software company should pay.
This would even be a positive situation for the closed software companies in the long run, as the liability that they are selling along with their product is yet another feature their software can claim. This could one day end up being the competitive point between open source and closed source: open source = a gamble for your company, but a cheaper product, closed source = guaranteed to work by the producer at extra cost.
Either way, something has to be done.
Re:Interesting way to level the playing field. (Score:1)
Your opinion is somewhat correct, but on the other hand it would put a bad name ("open source = a gamble for your company") on all OSS products. It would invite a mass of "how not to gamble, go secure" commercials, even though that's not true.
For the average user, who knows almost nothing about security, this would just be a good advertisement for proprietary software.
To conclude my opinion: LAW SHOULD BE MORE STRICT
Either you're doing OSS and you're not liable (since you give it away for free), or you're selling software and you're automatically liable for what you're selling. Escape clauses for either party should not be acceptable.
And yes, something really has to be done.
Practicality (Score:1)
I think software companies should take reasonable steps to ensure that their software is secure, and they should design their software with security in mind from day one. However, I don't think companies should be held liable for flaws unless those flaws are the result of negligence.
The WESAYSO license (Score:2)
To me that's a huge load. As far as software is concerned, we're still selling snake oil and living in the old west. There's a lot of buyer beware which is why I support trial-warez.
On the other hand, open source software is almost always considered "a work in progress" that seemingly never completes. That's just a given. But when a commercial product is released, there's a sense of finality involved. This is version 1.0 and any newer version will cost you money.
To me, once you exchange money and acquire a product, there is a moral responsibility on the supplier's part to guarantee the work in some way. I hate to use physical world analogies and so I won't go into detail. But imagine if the same sort of agreement went into the purchase of cars?
There is a huge difference between a publically contributed free work and one that is licensed (not sold) to a user for a given purpose. This game of "I want your money but not the liability" is a load of attorney crap. If you're a professional, be prepared to behave like a professional.
In any case, I think I'll go into business as a brain surgeon and make people sign agreements that say I'm not useful for any particular purpose, am not responsible for my actions, and any additional surgical procedures resulting from my accidentally leaving my tools inside the patient's body are an undocumented feature and not an error on my part.
Why make security software better? (Score:2, Informative)
Common problems of liability (Score:1)
The trouble with liability is that everybody expects greater liability from GPL products than from proprietary ones. Usually OSS is more secure and better tested, but expecting liability for a product that is given away is insane (at least as long as Microsoft's $0.50 clause is valid).
In all of my cases that involved security or patches, I got much better support from OSS projects than from proprietary ones. For instance, contacting Corel or Microsoft is quite painful: either they don't have a patch, don't know of one, or in the worst case don't even know that a patch already exists.
If you take all five points into consideration, what do you get?
1. Greater expectations of proprietary software (liability must be included, so I really don't know if Microsoft could push another project like IIS or IE; without the $0.50 clause they'd be dead and buried in no time).
2. An attack on OSS, where people give software away for free but would be liable for it??? This is insane. OSS projects have always either been well supported or they died; in 99% of cases I felt I got great support, either an answer or an explanation. In the one case where I didn't get an answer, I wasn't really bothered: I just swapped to another project that supported my needs well.
So if you ask me, liability itself has been talked about too much. Talk rather about who should be liable and who should not. That is the real question, not liability itself.
Microsoft's Security Stance. (Score:2, Funny)
Product Liability for Free Software (Score:1, Insightful)
A - Offering anything at no cost.
and
B - Paying for a product for a given purpose.
:) Distributing binaries, claiming that they do something, in exchange for money, is a totally different kettle of fish to Open Source.
I am sure you can expect very little legal comeback if someone gives you $product, and you lose a finger messing about trying to make it work. However if someone makes you pay leading you to believe their $product is suitable and safe, and you lose a finger due to a poorly designed product, Trading Standards & Consumer Protection laws can be used to sue the seller of $product for damages.
Free Software is given away, no money, no trade, therefore the performance expectation is zero - anything more is a bonus
Commercial software is sold, therefore assumed to be of a certain level of performance, usually "as advertised" - if a product fails to work as it should, or worse causes damage, the people making money should be liable, for sure.
Open Source Software surely must avoid such liability issues, since compilation is required before anything can be expected to work, e.g. "Here are some text files - I find they can produce a program which may carry out function X." Even with harsh software product liability laws, you could charge money for the source code, since it alone can do nothing without a careful process carried out by the user; the binaries produced are the user's responsibility.
By not disclosing source code, companies take on the responsibility of making sure it works right. This should make them liable.
Re:Product Liability for Free Software (Score:1)
UCITA (Score:3, Interesting)
A commercial contract can waive all liability. I seem to recall that the "technical self-help" measures (which allowed vendors to write software that actively damages your system if it thinks your license has lapsed) have been removed, but it still gives them broad rights to gag you when you try to report problems, to falsely claim others haven't reported problems, to falsely claim that the problem either doesn't really exist or has been fixed, etc. It can do all of this because you handed over hard cash and a bona fide contract exists. (I'm not so sure it's bona fide - a contract requires an *exchange* of items of value, and I don't see much value in this software.)
In contrast, free software isn't covered by a contract (since no money was exchanged) and UCITA explicitly requires that warranties apply.
This means that Microsoft (to pick a company at random), a company with billions of dollars in the bank and easily able to afford decent product testing, gets a free walk. Meanwhile Joe Sixpack, a professional programmer who released a simple "scratch my itch" program, can lose his house in legal fees defending himself even if he ultimately wins the court cases.
The commentators (UF law professors, working under the aegis of the ACM?) suggested that the voting delegates seemed indifferent to this indefensible state of affairs. Hopefully they'll either fix it, or the lawmakers in the various states will quickly realize that UCITA 2.0 is just as bad as the original.
But it's something that MUST be considered whenever we talk about the need for liability law to start applying in the software world. We can see the importance of having your own source code, but the people who would actually write the laws are still hearing from Microsoft et al, not us.
Re:UCITA (Score:2)
This means that Microsoft (the company that you picked at random) WOULD be held responsible for bugs/security holes. Joe Sixpack, even if he only does shareware, would be liable to a limited degree (maybe a cap?). A possible formula for this would work like graduated income tax brackets: the more you sell of a product, the more you should have tested it, and the more you would be liable for ... etc.
However, ANY software that also released its source would be free from this liability, since the person installing the application has the source to recompile it if necessary. If they don't have a developer staff/friend to modify the source, they should contact the author and plead for a fix. If the author refuses, too bad, fix it yourself.
Then I think it would be acceptable to have software liability.
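The graduated-brackets idea above can be made concrete. This is a hypothetical sketch only: the bracket boundaries, liability rates, and the source-release exemption rule are invented for illustration, not taken from any actual proposal, and for simplicity it uses a flat rate per tier rather than true marginal tax-style brackets.

```python
# Invented brackets: (upper bound on copies sold, fraction of damages
# the vendor is liable for). None marks the open-ended top tier.
BRACKETS = [
    (1_000,   0.00),  # hobbyist scale: no liability
    (100_000, 0.25),  # small commercial vendor
    (None,    0.75),  # mass-market vendor
]

def liability_share(copies_sold):
    """Fraction of damages the vendor is liable for at this scale."""
    for limit, rate in BRACKETS:
        if limit is None or copies_sold <= limit:
            return rate

def vendor_liability(copies_sold, damages, released_source=False):
    # Per the comment above: releasing the source exempts the vendor,
    # since the customer could fix the code themselves.
    if released_source:
        return 0.0
    return damages * liability_share(copies_sold)
```

A real scheme would need a legal definition of "copies sold" and of the damages base, but the shape of the incentive is the point: scale up, test more, or pay more.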
There's more than one way to cost. (Score:2)
Lower Your Insurance Premiums: Use Linux. [slashdot.org] The article is here. [com.com] I haven't seen any follow-up news, but this is where product liability has the best potential to hurt MS: where the only way they can affect the true cost of their product is by releasing a product that works.
Comment removed (Score:5, Insightful)
Re:Bad Idea, Very Bad (Score:2, Interesting)
Software liability, in the same sense as liability for a "standard" engineering product (electrical appliances, cars, buildings, etc.) is, like you say, ludicrous. That's because companies can employ underwriting laboratories to do testing that would exceed the cost of an in-house testing matrix. Engineering is governed by the laws of physics, which generally can tell you a lot about how resistant a building is to heat, wind, rain, etc. In general, software is just plain not tested enough. This is the biggest problem to the formulation of software engineering as a respectable discipline on par with civil or mechanical engineering.
1. Businesses can crumble because of security assured to them by their software vendor that doesn't exist. People lose houses, jobs, and families because of this kind of thing. Security is dependent on more than just each component of a solution being appropriately secure - it needs the combination of each individual piece to be secure. This task is, in general, too difficult for the average tech lead at a small business, college, or school, who will have enough problems with basic functionality. To some extent, the burden needs to be shifted to software providers- I don't think this is a point of contention.
2. It is easy to purchase the software you need, with a guarantee of security and reliability, and at a reasonable price, only if you are involved with the government of a large country, and even then you don't always get it right. [nasa.gov]
3. IIS on its own may be secure enough for a company intranet, but if the intranet's firewall and proxy servers are compromised, then it has become not secure enough. Schneier wants insurance companies to take the brunt of deciding how effective security solutions are - not the US government.
4. Schneier's main goal in instituting software liability is the management of security risk by lowering insurance premiums for people with more secure software. People who want to develop software without liability protection can count on an according security check level - if a system was in place that made security important for everybody, and not just these guys, [ncsc.mil] the world might be a better place.
5. There are enough larger players within the software world that I don't think this would happen - specifically, IBM wants to protect AIX, Apple wants to protect OS X, and Sun wants to protect Solaris. And if IBM and the NSA want to continue to promote Linux, they WILL make it secure [nsa.gov]
6. OpenBSD has had four years without a remote hole in the default install configuration - it has also had several local holes, and this is entirely discounting the problem of people who configure the software the wrong way. People are choosing to do this, and the market is sorting it out, but not to the extent that's necessary to prevent another Nimda, Code Red, or Iloveyou virus - the cost in lost productivity alone is earth-shattering. And people don't need to get hacked for terrible things to happen to them- in fact, if they never figure it out, all the better for the attacker. No, for the most part, people don't care- and they should. Most people don't want to get vaccinated, but we make them- because the cost to not get vaccinated for society as a whole is that much greater.
Re: (Score:2)
Re:Bad Idea, Very Bad (Score:3, Insightful)
Most Slashdot readers may think it unfortunate that the market prefers Windows and MS Office to more capable alternatives, but few would argue for the more popular choice to be banned as a way of 'correcting' the market's decision.
Point by point rebuttal (Score:2)
I honestly believe that software liability would be a net win for the industry, admittedly with a little pain in the short term as people learned to live within the new system.
Re: (Score:2)
Who do you make responsible? (Score:2)
the person creating the flawed application?
Follow me on this.
There are two sides to this: some people MIGHT want less secure software (thus a bit more rushed, thus less expensive) for their specific application. Why should customers who don't request those features absorb the costs?
We could discuss this point and map out the gray areas, and it could make an interesting debate, but it's 1 AM, so I'll limit this to something plain and simple. And this is no Microsoft-bashing karma whoring, since I already topped the 50 limit.
Here goes: if you want companies to be responsible for security flaws in their software, you first have to see whether they make any misleading claims. Guess who comes to mind first? Yes, Microsoft. I don't run Unix servers at work yet; I am exploring putting my email server on FreeBSD with Postfix (which is kinda bitchy for a Win2k guy who lost his Unix/Amiga side a long time ago).
Look at how Nimda killed most servers and workstations running IIS, and look at the freakin' time it took for this bastard to get off the net. Even MONTHS later I still had port 80 probing on my machine, for god's sake. How many high-speed providers shut down incoming traffic on port 80? This was due to one serious SECURITY flaw, and it cost a lot of downtime and unexpected expenses.
Yes, there are stupid admins who don't update their machines often. But let's be honest here: how many updates do you need for major flaws on IIS versus Apache, for example? I run IIS as an intranet, so I can "forget to update", but if I ran it on the internet, how many updates a month would I have to do compared to Apache? A LOT more. I read both security lists out of curiosity, and the feeling I had about this was absolutely true. Too bad Apache doesn't have IIS's front-end and ease of use on Win2k, because I'm sure IIS would take an even bigger drop. I guess Microsoft will do something really good with IIS 6, because they are probably feeling the heat right now.
Anyway, this is the reason why I will NEVER run my critical services such as DNS or email on Microsoft software (I use the ISP's for now, and am considering moving them locally). They rush their things out and fix them later, which is totally unacceptable; they force you to upgrade your browser instead of patching the bugs, introducing new ones in the process, etc. This is really becoming a serious issue. I wouldn't mind all this if they would at LEAST be honest about it, but no, they want to go the PR route and bullshit people about security compared to Unix systems. Come on, I have yet to see a Nimda breaking loose on Unix servers (and this is only one example; let's not talk about Melissa or any of the others).
There aren't only negative sides to Microsoft software. Windows 2000 is the best OS I've used since my Amiga; it has its downsides, but the Amiga had its downsides too, so nothing is perfect. Win2k Server is great for a small business like mine, it's stable enough to do the job, and I find IIS great for running my intranet. IIS would probably be the only Microsoft software I'd expose to the internet (on a non-critical server), because it's simple and easy to manage, and permissions set up pretty simply (for those of us who hate text files). But like a lot of people here, even though I find most Microsoft software simple and OK, I'd NEVER build a mission-critical solution on their products. I'd never run an "eBay" on IIS, and I'd never be an ISP running my DNS services on Win2k. Some do, and some don't have many problems, but when they do have them, they can tell you what hell looks like.
So all this to say: if you want to sell stuff with no responsibilities attached because people don't ask for them, or simply because of budget constraints, you can still be successful and fill a need. But if you LIE about it, in my book you deserve to be punished, and severely. If you were turning blue and a doctor told you "it's nothing, just take two aspirins" and you died a few hours later, he'd get his career kissed goodbye. While business isn't necessarily life, you can mess up a LOT of lives if your business goes down because you miss a demo, or your 20 programmers are down for 2 days because of a big virus attack and you need to rebuild all the servers, and so on. I'm sure there's at least one Slashdot reader who could say he missed a demo and financing because of a stupid issue like this (this might be a bit of a stretch, but you get the point). What about the lives of those employees? What about the total cost of all this downtime across the country?
Microsoft is quick to blame piracy for costing BILLIONS of dollars, but they are just as quick to change the subject when we ask how much THEY are costing the industry in downtime, upgrades, and patching. Again, I am not against Microsoft because I think they still make great products; I am against their ATTITUDE towards the industry and all the false (or at least exaggerated) claims they make. If I did half of this as a small business, I would kiss my career goodbye. Why should the icon of the US be allowed to do it so blatantly?
Get behind LOMAC and push! (Score:2)
The problem is modifying applications to live within the limits of LOMAC-type security. Work is underway to make WU-FTPD work under LOMAC, but somebody needs to do Apache and a mail program.
If you work on any of those apps, read the LOMAC stuff and fix your apps to live within the LOMAC rules. This will do more for security than any amount of patching.
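For readers unfamiliar with LOMAC, the low water-mark model it implements can be sketched in a few lines. The level numbers and class names here are invented for illustration; LOMAC itself enforces this in the kernel, not in application code, which is exactly why daemons like WU-FTPD and Apache need rework to keep functioning under it.

```python
# Low water-mark integrity model (the idea behind LOMAC), sketched with
# two invented levels: 2 = high integrity, 1 = low (e.g. network input).

HIGH, LOW = 2, 1

class Subject:
    """A process with a current integrity level."""
    def __init__(self, level=HIGH):
        self.level = level

    def read(self, object_level):
        # Reading drops the subject to the object's level (the "low
        # water-mark"): once tainted by low-integrity data, stay tainted.
        self.level = min(self.level, object_level)

    def can_write(self, object_level):
        # Writes to objects above the subject's level are denied.
        return self.level >= object_level

daemon = Subject()
assert daemon.can_write(HIGH)      # untainted: may write system files
daemon.read(LOW)                   # reads a request from the network
assert not daemon.can_write(HIGH)  # demoted: system files now off-limits
assert daemon.can_write(LOW)       # may still write low-integrity files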
Re:Get behind LOMAC and push! (Score:2)
Mandatory access is also part of Java security in the form of Protection Domains. [sun.com]
This is part of a very sophisticated, multilevel model which can enable components from different sources to interoperate with the minimum of overhead.
<IMHO>In general, OSS designers should track Java features and figure out how to use or duplicate them. It's easy to be complacent when considering the current generation of MS products, but once MS has transitioned to Dotnet swathes of security and reliability problem areas will have been eliminated. OSS will be left looking very exposed (unless it wants to be dependent on Mono...)</IMHO>
Even if free software were somehow exempt (Score:2)
Consider a vendor forced to take liability for its Zapwicky Mark II. It uses some free software internally; this is known, and nothing untoward is happening. The problem is that the vendor is itself taking on liability for the free software. If I were making the decision on what to include in the product, that in itself would be reason to abandon the use of free software and choose something proprietary, so that if there were problems, liability could be "passed on".
Clearly, IANAL.
Bad idea (Score:1)
I say invest more in education and teach people about security.
I know that sometimes the only way to force businesses to improve quality of service is to hit them where it hurts most, i.e. dig into their bank accounts, but the proposed solution would kill software development.
Not good.
TANSTAAFL (Score:2)
This would make software more expensive; you simply cannot have it for free. Period. So the only question is whether being able to hold the software producers liable is worth what it costs.
And the answer to that is entirely variable and conditional. If it costs (pulling a number out of my ass) $5000 extra for a machine that you only play games on, then it's not worth it; if it costs a million dollars for a machine that controls your water recycling on a year-long trip to Mars, it probably is.
And because it is sometimes worth it and sometimes not, it should be an option. Instead of making every programmer bonded and liability-insured, thereby increasing the cost of all software, let the user decide when it's important and when it's not, and deal with buying the insurance themselves.
And once it comes down to that, we all damned-well know that most users really don't care about security. So Bruce is trying to push this against the will of the people who will have to pay for it. I know he means well, but it's really just another attempt to enforce good ideas, which is usually a bad idea. Instead, he should stick to educating people about security, which he happens to be very, very good at. Instead of writing your congressman about outlawing liability disclaimers in license agreements, buy a copy of "Secrets and Lies" for a friend.
Bruce is annoying (Score:2)
Software liability: My 2 cents... (Score:2, Interesting)
(A day late, a dollar short... I doubt anyone will read this. Oh well.)
I agree with Schneier that software liability is the only thing that can fix the sorry state of today's commercial software. I also agree with the Slashdotters who say that making authors of free (either meaning) software liable would kill off the practice. When I first pondered this dilemma, I came up with an idea so fiendishly perfect that I'm sure tons of people have thought of it before: make the degree of liability proportional to the cost of the software!
The Microsofts and Oracles of the world who make expensive, broken software will have to change one or the other or be sucked dry by damages awarded in liability lawsuits. On the flip side of the coin, the freeware and Open Source/Free Software communities won't have to change anything, and the shareware folks would be protected by the fact that most people who use their stuff never pay for it, perhaps even encouraging more people to buy shareware so that they might have legal recourse if it ever fails in the future.
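The proportional-liability idea above is easy to model. The sketch below is my own illustration, not anything from the essay or the thread; the `multiplier` value is an arbitrary assumption, chosen only to show the shape of the rule (free software pays nothing, expensive software risks much):

```python
def liability_cap(price_paid: float, multiplier: float = 10.0) -> float:
    """Toy model: cap recoverable damages at a fixed multiple of the
    purchase price, so liability scales with what the user actually paid."""
    return multiplier * price_paid

# Free software or unpaid shareware: nothing paid, nothing recoverable.
print(liability_cap(0.0))      # 0.0

# A $499 commercial package risks up to $4990 per claim under this toy rule.
print(liability_cap(499.0))    # 4990.0
```

Under such a scheme the vendor, not the regulator, effectively sets its own exposure by setting its price, which is what gives free software its automatic exemption.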
Re:Trust No One (Score:2)
That's interesting, but I still say that the user is solely liable for his data. I don't fully trust third-party encryption because anyone could have a master key or a back door, and no one knows for sure what interests go into things like this. The only safe way is to use your own homemade ciphers, assuming that you're not a total idiot.
If you're smart enough to make your own ciphers, you're smart enough to examine others' ciphers and determine whether they're good or not. Furthermore, many people will have investigated the security of well-known ciphers, but that will not be true of ciphers you create yourself. There's no reason to make your own cipher.
Re:Trust No One (Score:2, Insightful)