Should Developers Be Sued For Security Holes?
An anonymous reader writes "A Cambridge academic is arguing for regulations that allow software users to sue developers when sloppy coding leaves holes for malware infection. European officials have considered introducing such a law but no binding regulations have been passed. Not everyone agrees that it's a good idea — Microsoft has previously argued against such a move by analogy, claiming a burglary victim wouldn't expect to be able to sue the manufacturer of the door or window in their home."
Nah (Score:5, Insightful)
I think excessively poor software should result in some form of negligence ... but general “can happen to anyone” type bugs.. no.
You can buy software with a (real) warranty attached. In general this costs a fuck tonne of money because they are accepting a fair amount of liability. Even in a very horizontal market, the price increase for accepting that liability is going to be way more than anyone can afford.
You get what you pay for. Want software that is very secure and unlikely to have serious bugs.. you can get it.. but it’s gonna cost more than you are willing to pay if you don’t really _need_ that level of support.
Would stop a lot of development (Score:5, Insightful)
If it were possible to prevent all security holes, this wouldn't be a bad idea. However, it is provably impossible to do so. This would just create a new insurance industry, profiting from others' mistakes. It would really only serve to cut down on development, especially from small companies and individuals that couldn't afford to make a single security mistake (or insurance against lawsuits).
Sure (Score:5, Insightful)
What we need is more and richer lawyers and frightened software developers with malpractice costs bigger than doctors. Perhaps we can eventually make sure all code is only developed by giant corporations made up primarily of legal defense teams dedicated to patent exploitation and liability control with tiny development arms tagged on the end.
For "sloppy coding"? Definitely! (Score:2, Insightful)
Exception: FOSS. All commercial software vendors should be liable for any and all damage caused by sloppy coding, including system cleanup, downtimes, etc. In most European countries this would just require classifying sloppy coding as "gross negligence". I am all for it.
Engineering Discipline (Score:5, Insightful)
Betteridge's law of headlines (Score:5, Insightful)
Sue the actual developer? How would you propose to do that if they're working for an incorporated company with limited liability?
Fair's fair (Score:3, Insightful)
Sure, let's agree with Prof. Clayton that you should be able to sue developers for malpractice if their code contains security holes.
Then perhaps you should also be able to sue professors like Richard Clayton for malpractice if their students are undereducated, or if their papers contain flaws.
Re:Nah (Score:5, Insightful)
excessively poor software should result in some form of negligence ... but general "can happen to anyone" type bugs.. no.
And how do you define the difference?
Based on the quality of code?
Based on the amount of unit testing that was (provably) performed?
This will start a slew of software that is only warranted under specific OS/software configurations (and then installing an aggressive anti-virus or not error-checking your RAM chips regularly would void your warranty).
Re:Short answer: No (Score:5, Insightful)
It'll have very little impact on actual code quality.
All that will happen is:
- software prices will increase
- a whole insurance industry will spring up around it (think malpractice insurance)..
- people will specifically seek out stuff developed by small shops and try to break it specifically so they can sue..
- producing software will become so expensive and require so much up-front investment that indie devs will be SOL
- the big guys will keep producing shit, and just protect themselves behind lawyers (and feed the cost back to the customer)
Re:Nah (Score:5, Insightful)
Some things, like allowing SQL injection, might be considered negligence. But no programmer can possibly guarantee a complete absence of bugs, and any bug can be a security hole. It takes time and money to track them down. If you don't give them that time and money, you can't expect perfect security.
like other engineering fields (Score:2, Insightful)
It's really time for computer science to grow up and join the rest of the pack. If an engineer designs a bridge that collapses under normal load, that engineer can be held PERSONALLY responsible for breach of duty. It's long past time for us to stop forgiving shoddy practices and people making the same old mistakes over and over that cause 90% of security vulnerabilities. Until people are held accountable, there won't be any meaningful change, and we'll keep having a field dominated by hacks and talent-free people without a real understanding of what they are doing.
We NEED this responsibility, and so does the public we serve. They're growing tired of the mess that exists right now. Apple is trying to do better on this front but really it needs to go much further, and the whole field needs to improve. We've had many decades of ad-hoc cowboy-coders. It's time to start demanding professional behavior.
Re:For "sloppy coding"? Definitely! (Score:5, Insightful)
Why should FOSS get a bye? What user really has the time to validate the code, line by line, to search for security weaknesses BEFORE using it? No. Users expect the software, free or commercial, to work as advertised. And, given the "superiority" that FOSS purports over commercial software, maybe it should be held to an even higher standard? Didn't think you'd want to go there.
In many ways, FOSS would find itself encountering lawsuits despite the "good samaritan" approach it provides. A loss, whether from something you paid for or got for free, is still a loss and, in our litigious society, fair game.
No, leave it to an academic to propose making individual developers liable for each line of code they write. This would destroy the entire IT industry (and most institutions) in a sweeping blow. Who could afford the "malpractice" insurance given the widespread dissemination of most commercial and FOSS software?
Re:Nah (Score:4, Insightful)
Perfect, no... but I suspect there are companies that, if required to justify what they did to protect the end users (was the thought of security even a consideration at any point?), would pretty much give a blank stare.
It's not about having perfect security imo, but rather about at least making an effort proportional to the risk you are putting users in.
Re:For "sloppy coding"? Definitely! (Score:4, Insightful)
Re:Nah (Score:2, Insightful)
Allowing SQL injection attacks is negligence. And if you allow an SQL Injection attack, you need to find another line of work.
SQL injection is the easiest attack vector to ward off: all you have to do is use prepared statements.
Done.
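To make the point concrete, here's a minimal sketch of the difference using Python's sqlite3 module (the table and user names are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "nobody' OR '1'='1"  # attacker-controlled string

# Vulnerable: string interpolation lets the input rewrite the query,
# so the WHERE clause becomes a tautology and matches every row.
vuln_rows = conn.execute(
    "SELECT role FROM users WHERE name = '%s'" % user_input
).fetchall()
print(len(vuln_rows))  # 1 -- the attacker got alice's row back

# Safe: a prepared statement treats the input strictly as a value,
# never as SQL, so the injection attempt matches nothing.
safe_rows = conn.execute(
    "SELECT role FROM users WHERE name = ?", (user_input,)
).fetchall()
print(len(safe_rows))  # 0
```

The fix is literally one character of query text plus a tuple of parameters, which is why commenters treat this class of bug as negligence rather than bad luck.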
Re:like other engineering fields (Score:5, Insightful)
OTOH a professional engineer differs from a software developer in one key way: he can't legally be overridden on safety matters. If management orders him to use steel that doesn't meet spec for the bridge's designed load, he can refuse to sign off on the plans and if the company tries to fire him the company is the one who'll end up in legal hot water after he reports them. If you want to make software developers responsible in that same way, you need to give them the same authority and immunity to repercussions for using that authority.
Re:Short answer: No (Score:3, Insightful)
Re:Nah (Score:4, Insightful)
Re:Sure (Score:2, Insightful)
Developers outside of the US would thank us for our malpractice driven suicide.
It would be the end of OSS (Score:5, Insightful)
While OSS zealots like to think it is bug free, it isn't. Bugs can and do happen in OSS. Well who the hell is going to contribute to a free project if they know they can be sued for it?
Also it would lead to way less flexibility in software. Vendors would restrict what you could run and how you could run it. That is what you find in real high end systems where problems aren't ok. They are very expensive, they only do what they are designed to do, no installing arbitrary software, and upgrades are very slow in coming.
So long as you want your system to be the wild west where you can install whatever you like, use it in any way you like, and be nice and cheap then you have to accept that problems can and will happen. If you want verified design you can have that, however you need to be prepared to pay the price both in terms of money and in terms of restrictions.
Requires total change of industry (Score:4, Insightful)
This would also cause a massive drop in the number of available licensed programmers for any work that needs to be done, as well as requiring programmers to carry liability insurance. Development costs would get much larger. I seriously doubt half of the programmers in the country are close to prepared to pass any licensing tests.
Re:Engineering Discipline (Score:4, Insightful)
He's thinking about the money that could be made if demand were to greatly outpace supply. He also thinks he isn't a hack.
In the long run, it'd be a huge mistake. Any number of great programmers started off as hacks, then through time and experience, became who they are today. Implementing something like this would only serve to help destroy the technology field faster (and where it goes, others will be sure to follow).
Re:Nah (Score:5, Insightful)
Are you guys crazy?
Do you realize how much bridges cost?
Re:Short answer: No (Score:5, Insightful)
As a professional engineer in a closely related field (industrial control systems), I disagree. What is required is a degree of rigour in design to remove systematic errors as much as is humanly possible. Engineered products still fail, and end-users may sue, but the test is simply whether the engineer, or developer in this case, took all *reasonable* measures to limit errors.
Long overdue in the software development profession, IMO. It's time we grew up.
Re:Nah (Score:5, Insightful)
The problem is that the market is by definition asymmetric. The customer usually knows jack about the implications of security concerns. If people had any idea of security, Facebook wouldn't be a success.
Re:Nah (Score:4, Insightful)
What's wrong with basic auth over SSL? It's still encrypted...
No worse than form-based auth.
The biggest problem is that users have no idea how their data will be stored or used on the back end... the frontend auth system, whether it uses SSL, basic auth, forms or whatever, is visible to the user, but beyond that your data is going into an unknown black box..
If you've no idea what is being done at the backend, you can't make an informed decision as to whether you want to trust a given site with your info or not.
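For context on why SSL matters here: HTTP Basic auth (RFC 7617) is just the username and password base64-encoded into a header, so anyone who can read the wire can trivially decode it. A quick sketch (credentials are made up):

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Build the Authorization header value for HTTP Basic auth (RFC 7617)."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

# base64 is an encoding, not encryption: base64.b64decode() reverses it
# instantly, which is why Basic auth is only acceptable inside SSL/TLS.
print(basic_auth_header("alice", "s3cret"))  # Basic YWxpY2U6czNjcmV0
```

So the parent is right: over SSL the header is protected in transit just like a form POST would be; the real unknown is what happens to the credentials once they reach the server.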
Re:Short answer: No (Score:4, Insightful)
So, how does the price of a typical industrial control system compare to the price of typical boxed software for popular platforms?