Should Developers Be Sued For Security Holes?
An anonymous reader writes "A Cambridge academic is arguing for regulations that would allow software users to sue developers when sloppy coding leaves holes for malware infection. European officials have considered introducing such a law, but no binding regulations have been passed. Not everyone agrees that it's a good idea. Microsoft has previously argued against such a move by analogy, claiming a burglary victim wouldn't expect to be able to sue the manufacturer of a door or window in their home."
Nah (Score:5, Insightful)
I think excessively poor software should result in some form of negligence liability ... but general “can happen to anyone” type bugs.. no.
You can buy software with a (real) warranty attached. In general this costs a fuck tonne of money because they are accepting a fair amount of liability. Even in a very horizontal market, the price increase for accepting that liability is going to be way more than anyone can afford.
You get what you pay for. Want software that is very secure and unlikely to have serious bugs.. you can get it.. but it’s gonna cost more than you are willing to pay if you don’t really _need_ that level of support.
Re:Nah (Score:5, Insightful)
excessively poor software should result in some form of negligence liability ... but general “can happen to anyone” type bugs.. no.
And how do you define the difference?
Based on the quality of code?
Based on the amount of unit testing that was (provably) performed?
This will start a slew of software that is only warranted under specific OS/software configurations (and then installing an aggressive anti-virus or not error-checking your RAM chips regularly would void your warranty).
Re:Nah (Score:5, Interesting)
Simply requiring encryption when handling something sensitive like credit card info is a start. See: Sony and the PSN disaster.
Re:Nah (Score:4, Interesting)
The engineers, possibly; the architects, definitely. Not so much the builders, as long as they can show they were following the spec.
If there is any liability, it should lie with the company releasing the software. No individual developer can be held responsible - the software should have gone through testing, QA, user acceptance testing... where do you draw the line? Why the developers and not, say, the testing team for failing to develop a test that shows the bug?
Re:Nah (Score:5, Insightful)
Are you guys crazy?
Do you realize how much bridges cost?
Re:Nah (Score:5, Insightful)
The problem is that the market is by definition asymmetric. The customer usually knows jack about the implications of security concerns. If people had any idea of security, Facebook wouldn't be a success.
Re: (Score:3)
Or Apple, Google, Samsung, HTC, ...
Re: (Score:3)
Translation: You want bespoke (nuclear powerplant-level) security at mass-market prices.
Re: (Score:3)
> You want bespoke (nuclear powerplant-level) security
No, you misunderstand the point completely. This is not so much about functional guarantees of the software as about non-functional ones. If you buy a game for $0.99, you do not expect it to be bug free. However, you do (reasonably) expect it not to steal your credit card data or compromise your system.
It is just as with any other item for sale: a cheap toy made in China may not last very long, but you do expect it not to harm anyone. Every toy sold for $
Re:Nah (Score:4, Insightful)
What's wrong with basic auth over ssl? it's still encrypted...
no worse than form based auth.
the biggest problem is that users have no idea how their data will be stored or used on the back end... the frontend auth system, whether it uses ssl, basic auth, forms or whatever, is visible to the user, but beyond that your data is going into an unknown black box..
if you've no idea what is being done at the backend, you can't make an informed decision as to whether you want to trust a given site with your info or not.
Re:Nah (Score:5, Interesting)
What really needs to be tackled is the insane and deceitful difference between software marketing, software warranties and software EULAs: the worst examples of corporate disinformation and outright lies ever seen by man. It's as if the very worst of the snake oil con men from the 19th century all joined the software sales business, with all sorts of lies printed on the outside of the label, while the disclaimers, hidden by the contents, only appear on the inside of the label once you've consumed them. Insanely and corruptly enough, this is now accepted as normal practice, led by M$.
Re:Nah (Score:4)
You might want to start here:
https://www.pcisecuritystandards.org/ [pcisecuritystandards.org]
Re: (Score:3)
What's really lame is that PCI is an interconnect bus on computers, and this PCI group ripped off the name. They should be forced to change it.
Re: (Score:2)
That of course is a huge issue.
Realistically you would need a standard (or set of standards) defining what "secure software" is... and good luck with that!
I would venture that in the case of a huge vulnerability, the company would be required to show "what they did" to secure the software (what kind of testing they did, review, etc..) and a jury would decide if they were negligent (excessively negligent would be the lead dev cracking on the stand about how the boss kept shouting "ship it or you get the cane
Re: (Score:2)
You don't define the difference. We already have mechanisms under law to handle situations such as this. A good example would be cases where someone's computer was damaged by a virus scanner that quarantined a key system file and caused a failed boot condition. Those consumers could sue the company for losses associated with getting their computers fixed. It could also escalate into a class action lawsuit. I would think the same recourse is available to an enterprise.
Re: (Score:3)
excessively poor software should result in some form of negligence liability ... but general “can happen to anyone” type bugs.. no.
And how do you define the difference?
A. Some things can simply happen (using < when you meant to use <=). Or obscure bugs that manifest themselves as corner cases between many components. Those are the "general, can happen" types.
B. There are others that are simply not acceptable given how much knowledge we have acquired in the last 40 years. Things that are simply eradicated by using widely accepted coding techniques, e.g. putting constants on the left side of the equality operator in C/C++ (so an accidental = instead of == fails to compile); always checking pointers before dereferencing them.
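A minimal C sketch of the kind of habits meant here (the function and the buffer-copy scenario are invented purely for illustration):

    #include <string.h>

    /* Hypothetical helper, for illustration only. */
    int copy_name(char *dst, size_t dstlen, const char *src)
    {
        /* Check pointers before dereferencing them. */
        if (NULL == dst || NULL == src)   /* constant on the left: an accidental
                                             '=' instead of '==' fails to compile */
            return -1;

        /* Bounded copy instead of strcpy(), so a long input can't overflow dst. */
        if (strlen(src) >= dstlen)
            return -1;
        memcpy(dst, src, strlen(src) + 1);
        return 0;
    }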
Re: (Score:3)
One of the principal myths surrounding medical malpractice is its effect on overall health care costs. Medical malpractice is actually a tiny percentage of health care costs, in part because medical malpractice claims are far less frequent than many people believe. In 2004, the CBO calculated malpractice costs amounted to “less than 2 percent of overall health care spending. Thus, even a reduction of 25 percent to 30 percent in malpractice costs would lower health care costs by only about 0.4 percent to 0.5 percent, and the likely effect on health insurance premiums would be comparably small.”
src:Justice.org [justice.org]
Re:Nah (Score:5, Insightful)
Some things, like allowing SQL injection, might be considered negligence. But no programmer can possibly guarantee a complete absence of bugs, and any bug can be a security hole. It takes time and money to track them down. If you don't give them that time and money, you can't expect perfect security.
Re:Nah (Score:4, Insightful)
Perfect, no... but I suspect there are companies that, if required to justify what they did to protect end users (was the thought of security even a consideration at any point?), would pretty much give a blank stare.
It's not about having perfect security imo, but rather about at least making an effort proportional to the risk you are putting users in.
Re: (Score:2)
That last line is the big one: think about what's at stake. If there's any sensitive data involved (credit cards, medical, whatever), security is a very real concern. Of course that also needs to be included in the price. You get what you pay for.
Re: (Score:2, Insightful)
Allowing SQL injection attacks is negligence. And if you allow an SQL Injection attack, you need to find another line of work.
SQL injection is the easiest attack vector to ward off; all you have to do is use prepared statements.
Done.
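A minimal sketch of what that looks like in C with SQLite (the table and column names are made up, and error handling is trimmed):

    #include <sqlite3.h>

    /* Look up a user by name with a bound parameter instead of string
     * concatenation. */
    int user_exists(sqlite3 *db, const char *name)
    {
        sqlite3_stmt *stmt;
        int found = 0;

        if (sqlite3_prepare_v2(db, "SELECT 1 FROM users WHERE name = ?;",
                               -1, &stmt, NULL) != SQLITE_OK)
            return -1;

        /* The driver treats 'name' purely as data: a value like
         * "x' OR '1'='1" can no longer change the query's structure. */
        sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);
        if (sqlite3_step(stmt) == SQLITE_ROW)
            found = 1;
        sqlite3_finalize(stmt);
        return found;
    }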
Re: (Score:2)
Absolutely. It amazes me how many sites, important ones, even, are vulnerable to it. It's trivial to prevent, and doing so makes your code prettier and faster. There's no excuse.
Re: (Score:3)
Oh, there is "an excuse", happily camping on line 50231 of the EULA. That's why we love the software industry.
College != ready to program (Score:3)
OK, they might have taught you that in college. But they didn't necessarily teach everybody that.
In any case, every time there's a post about college on /., most posters come out to say that college isn't about learning the specifics of development (like frameworks, SCM, etc.). It's about learning the theory of computer science. Learning minutiae is for trade school.
If that's the case, then people who studied the science of computing would have no clue about SQL injection. There's really no place you'd be able t
Re: (Score:3)
I think he means a non-privileged user inserted some values via a prepared statement, which became active web content (maybe unescaped JavaScript, but not unescaped SQL!).
That content was then viewed by an admin on another machine, at which point the JavaScript executed and could submit requests that carried admin privileges.
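The usual fix for that scenario is escaping on output, not just on the SQL side. A minimal C sketch of the idea (assuming an HTML text context, where these five metacharacters are the dangerous ones):

    #include <stdio.h>

    /* Escape HTML metacharacters before echoing stored data back into a
     * page, so stored input can't become active script. */
    void html_escape(FILE *out, const char *s)
    {
        for (; *s; s++) {
            switch (*s) {
            case '&':  fputs("&amp;",  out); break;
            case '<':  fputs("&lt;",   out); break;
            case '>':  fputs("&gt;",   out); break;
            case '"':  fputs("&quot;", out); break;
            case '\'': fputs("&#39;",  out); break;
            default:   fputc(*s, out);
            }
        }
    }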
Re: (Score:3)
Obviously you shouldn't be using eval() on user input. That should be pretty obvious to everyone with the slightest bit of programming knowledge.
eval() is always risky; everybody who knows about the existence of eval() has to know that you shouldn't use it recklessly. If there's any PHP framework that uses eval() on user input, then its maker should be banned from ever programming anything again.
Re: (Score:3)
PHP suffers the same problem as any language that's accessible to novice programmers...
Novice programmers will use it, and create poor code with it...
And then non technical management with no understanding of security or technology will decide to start putting such code into production.
Most decisions are made by people not qualified to make them.
Re: (Score:2)
No, the developer will just have a very constrained usage scenario for the no-security-hole policy. The NSA has proved that you can make Windows and Linux secure; it's just going to be a pain in the a$$ to use and annoying to do the simplest things.
Re: (Score:3)
Blame the organization not the developer.
Week one we need a working prototype.
Week two we need to put the prototype into production.
We want it to do all these features... 6 months down the line we need to put it in production with half the features.
Yes, this product will only be used internally.
As developers we are often not given the full picture, and the organization changes its mind. Often good developers in bad organizations write bad code.
Re:Nah (Score:5, Funny)
I dunno, as an indie dev you can change the warranty in the license of your software, stating that
"This software will occasionally test your hardware by deadlocking it, perform random functions regardless the user inputs, and make hardware resources available to the cloud, specifically to the latest botnet makers."
So it always performs as planned :)
Re: (Score:3)
That's the problem of the lowest bidder system...
Companies want to reduce costs, and don't really understand much if anything about technology... So the developer who estimates 100 hours is perceived as bad value relative to the developer who estimates he can do the same thing in 20 hours.
In reality, the 20 hour developer will probably overrun his 20 hours, and will also almost certainly write shoddy code which is full of bugs and difficult to maintain. Although the initial quote is obviously cheaper, once
Would stop a lot of development (Score:5, Insightful)
If it was possible to prevent all security holes, this wouldn't be a bad idea. However, it is provably impossible to do so. This would just create a new insurance industry, profiting from others' mistakes. It would really only serve to cut down on development, especially from small companies and individuals that couldn't afford to make a single security mistake (or afford insurance against lawsuits).
Re: (Score:2)
Not really - you would need professional indemnity insurance.
The insurance would be based on the risk of a claim (the more copies sold, the bigger the premium; it could be priced at a fixed rate per copy) and the impact of the damage (just make sure that the license terms exclude indirect consequential damages).
The risk side of the equation can be reduced by using appropriate development structures (code reviews, etc).
This could improve the quality of the industry long term but there will be some pain getting there...
Re:Would stop a lot of development (Score:5, Informative)
If it was possible to prevent all security holes, this wouldn't be a bad idea. However, it is provably impossible to do so.
This is true. However, I still think it should be possible to sue for gross negligence. Like lack of input validation, or storing passwords in plain text, or installing everything world writable.
That's like a bike lock manufacturer whose locks open if hit with a shoe, or a car manufacturer whose cars start if you roll them downhill and put them in gear, even without an ignition key. Both existed, but would be considered gross negligence today.
I don't expect software to be perfect, but I do expect it to not be outright stupid.
It would be the end of OSS (Score:5, Insightful)
While OSS zealots like to think it is bug free, it isn't. Bugs can and do happen in OSS. Well who the hell is going to contribute to a free project if they know they can be sued for it?
Also it would lead to way less flexibility in software. Vendors would restrict what you could run and how you could run it. That is what you find in real high end systems where problems aren't ok. They are very expensive, they only do what they are designed to do, no installing arbitrary software, and upgrades are very slow in coming.
So long as you want your system to be the wild west where you can install whatever you like, use it in any way you like, and be nice and cheap then you have to accept that problems can and will happen. If you want verified design you can have that, however you need to be prepared to pay the price both in terms of money and in terms of restrictions.
Re: (Score:3)
It would be the end of plugins: think of the liabilities Mozilla opens itself to by letting you extend it in unexpected ways. It would be the end of app stores, given that apps extend Android and allow personal information to be collected in ways that are hidden and unexpected to most. Hmm, but then it would also be the end of ALL Flash support and its vulnerabilities. You'd probably get a fresh start on code signing and trust / authentication systems. Personally I hate that MS forced EVEN legitimate (read expe
Re:Would stop a lot of development (Score:5, Interesting)
Turing already did. This reduces to the halting problem.
That's incorrect. The halting problem deals with finding a general purpose algorithm that applies to all cases, but we're talking about solutions to specific problem sets, which absolutely can and do exist.
I've written assembly programs, especially drivers, that were provably free of all bugs: Every series of opcodes did EXACTLY what it was supposed to do under all possible inputs. It's infeasible to develop all software in such a rigorous manner only due to cost, not due to some fault of the hardware or its software (the opcodes).
I didn't have to test my driver over an infinity of inputs because the hardware had a finite set of possible inputs. The bits are limited -- We don't have Turing's infinitely long tape as a CPU word size.
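As a toy version of the same idea: when the input space is finite, you can check a routine against a reference over every single input. The candidate function below is invented for the example, and the arithmetic right shift of a negative value is implementation-defined in C, so treat this as a sketch for typical two's-complement platforms:

    #include <stdint.h>
    #include <stdio.h>

    /* Candidate: branchless absolute value for 16-bit ints. */
    static int16_t abs16(int16_t x)
    {
        int16_t m = (int16_t)(x >> 15);   /* 0 for non-negative, -1 for negative */
        return (int16_t)((x ^ m) - m);
    }

    int main(void)
    {
        /* The input space is only 2^16 values, so test every one. */
        for (int32_t i = INT16_MIN; i <= INT16_MAX; i++) {
            int16_t x = (int16_t)i;
            int16_t expect = (int16_t)(x < 0 ? -x : x);   /* reference */
            if (abs16(x) != expect) {
                printf("mismatch at %d\n", (int)i);
                return 1;
            }
        }
        printf("all %u inputs verified\n", 1u << 16);
        return 0;
    }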
Re: (Score:3)
As soon as it is up to juries to set precedent, the small-time software developer is going to quit. I certainly wouldn't release or contribute to any free software that could cause me to get sued; I wouldn't have time or money to pay a lawyer to eventually hopefully get a decision in my favor.
Re: (Score:3)
You may not be able to take just any program and prove that it works, but you can construct a program of your own in such a way that it is possible to prove it works; in practice you do the proof first and then, in the best case, extract the machine code from it.
Of course, doing this is still very costly: 7,500 lines of C code can take 200,000 lines of Isabelle proof code:
http://ertos.nicta.com.au/research/l4.verified/numbers.pml [nicta.com.au]
Sure (Score:5, Insightful)
What we need is more and richer lawyers, and frightened software developers with malpractice costs bigger than doctors'. Perhaps we can eventually make sure all code is only developed by giant corporations made up primarily of legal defense teams dedicated to patent exploitation and liability control, with tiny development arms tagged on the end.
Windows (Score:5, Funny)
Interesting choice of words there!
For "sloppy coding"? Definitely! (Score:2, Insightful)
Exception: FOSS. All commercial software vendors should be liable for any and all damage caused by sloppy coding, including system cleanup, downtimes, etc. In most European countries this would just require classifying sloppy coding as "gross negligence". I am all for it.
Re: (Score:2)
"I'd say that most developers want to create the best code they can"
No, they don't.
Because if they did, they'd be in a perfect position to enforce it: just leave out a single semicolon and the software doesn't run at all. It's up to the developer when exactly to write down said semicolon.
Re:For "sloppy coding"? Definitely! (Score:5, Insightful)
Why should FOSS get a pass? What user really has the time to validate the code, line by line, to search for security weaknesses BEFORE using it? No. Users expect the software, free or commercial, to work as advertised. And, given the "superiority" that FOSS purports to have over commercial software, maybe it should be held to an even higher standard? Didn't think you'd want to go there.
In many ways, FOSS would find itself facing lawsuits despite the "good samaritan" approach it takes. A loss, whether from something you paid for or got for free, is still a loss and, in our litigious society, fair game.
No, leave it to an academic to propose making individual developers liable for each line of code they write. This would destroy the entire IT industry (and most institutions) in a sweeping blow. Who could afford the "malpractice" insurance, given the widespread dissemination of most commercial and FOSS products?
Re: (Score:3)
Except the license agreements already say that there's no warranty, express or implied, and that the developer accepts no liability for defects. This is usually there in commercial software as well. If you don't agree with that clause, you don't get a license to the software.
BSD license:
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY
DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES
(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES;
LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND
ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS
SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
GPL:
15. Disclaimer of Warranty.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
The law would have to be changed to specifically deny that right to the author. If you buy a car, or electronics or something, there may not be an explicit warranty, but they usually haven't disclaimed all warranties in advance.
Re:For "sloppy coding"? Definitely! (Score:4)
Why does that make incompetent software developers any less responsible for the bugs in their code?
Re: (Score:2)
That makes no sense at all.
First, that puts FOSS at a huge disadvantage. If a customer uses 'commercial software' (whatever that is), they can sue if something goes wrong. If they use FOSS - too bad?
Second, define 'commercial software', and more importantly 'commercial developers'. What about the large amount of FOSS that is developed by 'commercial' developers (IBM, Red Hat, etc)? What about FOSS that is 'sold' (RHEL, SuSE)? Do those companies get a free pass on selling crap, or are the developers o
Re: (Score:3)
No?
I just want anybody that earns money producing software to be able to demonstrate they were using sound engineering practices. You know, like manufacturers of food, medical goods, building materials, cars, etc. The current state of a lot of software is a disgrace, and the problem is companies earning boatloads of money without any liability if quality sucks. As soon as a company can prove they were not "sloppy", residual bugs become accidents and do not make you liable unless specifically arranged contra
Bad Analogy (Score:5, Informative)
You can not sue a door or window manufacturer for the failure of your own action (leaving the door / window open).
You should be able to successfully sue a door / window manufacturer for failing to provide the requested product (i.e. sealing the opening).
That then hits the ugly question of what is "reasonable". Did the manufacturer provide a reasonable product that provided the expected level of security?
Re: (Score:2)
Many rental laws give you the right to self-fix and deduct from the next rent payment. Look into them wherever you live.
--
BMO
Re: (Score:2)
You should be able to successfully sue a door / window manufacturer for failing to provide the requested product (i.e. sealing the opening).
They provided the product, it's not their fault how the product was actually used or installed. It's not the manufacturer's fault that the person responsible for the actual installation was on his first day on the job after reading several tutorials online. It's the responsibility of whoever sets up a computer to secure that computer, you can't sue the manufacturer of your CPU because it executed malicious code that deleted your files.
Re: (Score:3)
Yeah, no. Software programs juggle so many variables that it's virtually impossible to prove a program is bug-free. And add in the computer illiterate, the "the computer didn't preserve the placement of the file icons I dragged around the folder after I copied it to a new drive; therefore it's a design flaw" crowd, who will find a way to generate giant lawsuits, and you know as well as I do that tech will be sacrificed to the greater human stupidity.
Re: (Score:3)
That then hits the ugly question of what is "reasonable". Did the manufacturer provide a reasonable product that provided the expected level of security?
Well, since my software explicitly states that it disclaims all warranty, and I make no claim as to its fitness for any particular use, then what is your expected level of security? I basically say, Here's the bits I configured. Hope you find them useful! I use input fuzzing, unit testing, stress testing, stateful malloc & free replacements, etc, and do my best to create good secure software. However, I don't know what else you have running in your machine -- I don't even know if your hardware is fa
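A sketch of how simple the input fuzzing mentioned above can be (parse_record here is just a stand-in for whatever code is actually under test):

    #include <stddef.h>
    #include <stdint.h>
    #include <stdlib.h>

    /* Stand-in for the real code under test. */
    static int parse_record(const uint8_t *buf, size_t len)
    {
        return len > 2 && buf[0] == 'R';   /* pretend parser */
    }

    int main(void)
    {
        uint8_t buf[256];
        srand(12345);                       /* fixed seed keeps runs reproducible */
        for (long i = 0; i < 1000000; i++) {
            size_t len = (size_t)(rand() % (int)sizeof buf);
            for (size_t j = 0; j < len; j++)
                buf[j] = (uint8_t)(rand() & 0xff);
            (void)parse_record(buf, len);   /* crashes and overreads show up
                                               under ASan, Valgrind or a debugger */
        }
        return 0;
    }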
Re:Engineering Discipline (Score:4, Insightful)
He's thinking about the money that could be made if demand were to greatly outpace supply. He also thinks he isn't a hack.
In the long run, it'd be a huge mistake. Any number of great programmers started off as hacks, then through time and experience, became who they are today. Implementing something like this would only serve to help destroy the technology field faster (and where it goes, others will be sure to follow).
Betteridge's law of headlines (Score:5, Insightful)
Sue the actual developer? How would you propose to do that if they're working for an incorporated company with limited liability?
Fair's fair (Score:3, Insightful)
Sure, let's agree with Prof. Clayton that you should be able to sue developers for malpractice if their code contains security holes.
Then perhaps you should also be able to sue professors like Richard Clayton for malpractice if their students are undereducated, or if their papers contain flaws.
it depends (Score:2)
> Microsoft has previously argued against such a move
Well, of course (/snark)
> claiming a burglary victim wouldn't expect to be able to sue the manufacturer of the door or a window in their home.
Maybe one could expect that, if the advertisement for the door or window led one to believe in a level of security that the door or window was not designed to supply. Or if a reasonable person would assume, for instance, that a door with a security-type cylindrical-key lock on it could not be opened with a commo
Hanlon's (Score:4, Interesting)
You could consider suing developers who intentionally planted backdoors (even if they were following NSA or other US government agency orders). But you can't target the ones who weren't aware of them, who erred by mistake or lack of knowledge, whose assumptions stopped holding because things changed (e.g. having/forcing an 8-character password was "good enough" several years ago, not anymore), or whose assumptions were broken by end-user choices (think how many portals meant for intranets without especially strong security end up being used on the internet).
Also, who do you sue over a bug in an open source program with a lot of contributors? Or a big corporation whose legalese says they aren't responsible for any damage or problem that could happen from using their product (that is, most commercial software licenses)?
Sure. it can be done. (Score:3)
As long as you're going to foot the bill for a $500 application that changes your computer's wallpaper.
PEs and PLSs, doctors, psychologists, etc, all carry liability insurance. They're also not cheap. In the 80s, a survey crew cost $100/hr to come out and measure your land with a half-day minimum.
Now apply these costs to software.
--
BMO
Re: (Score:3)
In an up economy, it should be 300.
Did it for a few years. Walking barefoot with your pants rolled up in the middle of winter to locate a stream centerline isn't as cold as it sounds, though.
Hip waders? Bah.
--
BMO
like other engineering fields (Score:2, Insightful)
It's really time for computer science to grow up and join the rest of the pack. If a mechanical engineer designs a bridge that collapses under normal load, that engineer can be held PERSONALLY responsible for breach of duty. It's long past time for us to stop forgiving shoddy practices and people making the same old mistakes over and over that cause 90% of security vulnerabilities. Until people are held accountable, there won't be any meaningful change, and we'll keep having a field dominated by hacks and
Re:like other engineering fields (Score:5, Insightful)
OTOH a professional engineer differs from a software developer in one key way: he can't legally be overridden on safety matters. If management orders him to use steel that doesn't meet spec for the bridge's designed load, he can refuse to sign off on the plans and if the company tries to fire him the company is the one who'll end up in legal hot water after he reports them. If you want to make software developers responsible in that same way, you need to give them the same authority and immunity to repercussions for using that authority.
Re: (Score:3)
It's really time for lovers of over-regulation (including regulation through overbroad liability) to consider the consequences of their actions.
He's not, however, responsible if someone dynamites the bridge, deliberately overloads it, or commits other malicious acts designed to destroy it.
What happens if there is gross negligence? (Score:4, Interesting)
Bugs and security vulns are almost unavoidable, but some are due to gross negligence, and gross negligence should always be open to litigation. To follow on from Microsoft's analogy: suppose a door manufacturer (let's assume the door includes the lock and hinges, even though this isn't normally the case) was grossly negligent and sold a high security door system, but had accidentally keyed all the doors to a single grand-master key. If you were burgled because a burglar happened to find out about this grand-master key, then potentially you have a claim.
I don't see why it should be any different in software development. A software vendor needs to bear some responsibility for good programming practice.
Bad software is everywhere; some is so bad, that it does border on grossly negligent.
As an example, I recently reverse engineered an "electronic patient record" system that was installed at a local hospital. This had a number of interesting design features:
1. Password security was via encryption rather than hashing. The encryption was a home-brew modified Vigenère cipher. (A sketch of the standard alternative, salted hashing, follows this list.)
2. The database connection string was stored in the clear in a conf file in the user's home directory. Interestingly, the database connection used the "sa" user.
3. Presumably for performance reasons, certain database tables (notably "users") would be cached in plaintext in the user's home directory. This way an SQL join could be avoided and the joins done client side.
4. The software ran an auto-updater that would automatically connect to a specified web site and download and run patches as admin - without any kind of signature verification.
5. All SQL queries were dynamically generated strings: no parameters, prepared statements or stored procedures. Not every user input was properly escaped. Entry of a patient name with an apostrophe in it would cause very peculiar behavior. In the end, regular memos had to go round to staff telling them under no circumstances to use apostrophes in patient names, and to avoid, wherever possible, the use of apostrophes in plain text entries.
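Regarding point 1, a sketch of the standard alternative: a salted, deliberately slow hash rather than any cipher, home-brew or otherwise. This version assumes OpenSSL is available; the iteration count and buffer sizes are illustrative, not a recommendation:

    #include <openssl/evp.h>
    #include <openssl/rand.h>
    #include <string.h>

    /* Store a salted, slow hash of the password instead of an encrypted copy:
     * there is no key to steal and no way to decrypt the stored value. */
    int hash_password(const char *password,
                      unsigned char salt[16], unsigned char out[32])
    {
        if (RAND_bytes(salt, 16) != 1)   /* fresh random salt per password */
            return -1;
        /* 100000 iterations of PBKDF2-HMAC-SHA256. */
        return PKCS5_PBKDF2_HMAC(password, (int)strlen(password),
                                 salt, 16, 100000, EVP_sha256(),
                                 32, out) == 1 ? 0 : -1;
    }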
This is by no means all of the security problems this software had, never mind the bugs, e.g. a race condition when synchronising with a second application, which would result in the two components opening different patients' charts.
Amazingly, there weren't any security breaches or significant medical errors as a result of this software - but I can't really conclude that this software production was anything other than grossly negligent.
For those who didn't RTFA (Score:5, Informative)
They aren't talking about suing the individual programmers, they're talking about suing the software companies. Specifically, they want to disallow this kind of language very common in EULAs (this is taken from an actual EULA, name omitted to protect the guilty):
_______ and/or its respective suppliers hereby disclaim all warranties and conditions with regard to this product, including all implied warranties and conditions of merchantability, fitness for a particular purpose, title and non-infringement. In no event shall _______ and/or its respective suppliers be liable for any special, indirect or consequential damages or any damages whatsoever resulting from loss of use, data or profits, whether in an action of contract, negligence or other tortious action, arising out of or in connection with the use of this software.
The translation of this clause out of legalese is "No matter what happens, you can't sue us, we're not responsible. We don't promise that this software is even remotely like what we advertised it to be."
Depends on the intention (Score:2)
If the security hole was introduced intentionally and with malice, or nothing is done about a known security bug, then getting sued may be on the cards...
It also depends on the impact of the security hole... e.g. a privacy or information breach.
If the security hole was introduced without malice or fixed in time, or does not have major impact (ie affects only test data, or performance...), then you're unlikely to have a case for litigation.
Only an academic would think this is a good idea. (Score:2)
That is all.
Oh, there are ways around this. (Score:2)
Add a EULA that forbids anyone suing me.
Short of that, stop licensing the software and instead provide it for free, but tivo'd. The binary blob inside is what I'll license for a cost. And guess what, it's just a trivial piece of software that cannot contain any bugs.
si (Score:2)
Gee, I wonder why a company like Microsoft that writes perfect code would ever lobby against this...
Doors vs Vault Doors (Score:5, Interesting)
Just like anything else, pay for whatever guarantee you desire. If you want your software created in record time, for a low cost, then the bugs are a part of the equation. If you want secure coding, then you'll get to pay for it in time and money. It's always been that simple. You don't sue the manufacturer of your house door, but you do sue the manufacturer of your bank vault door. The difference in cost is tremendous.
It's rare that my clients ask for proper security. But for the elements that they do indeed want to protect, they pay for me to do my very best work. And you'd better believe that they hold me responsible and often accountable for significant problems should they result.
But in the end, it's all just insurance anyway. If a client of mine wants a particular e-commerce feature to be super-secure, then they'll ask me to pay for any dollars lost due to bugs. I know that I'm not perfect, and of the thirty possible bugs, there's a small chance that I'll fall into one or two of them, and a partial chance that I won't catch it before it's exploited. So while much of the added price is for me to sit there and check things closely, the rest of the added price is for me to accumulate in the event that I need to pay it back. Over multiple clients and multiple exploits, that's the only way to do it.
The obvious alternative of checking things even closer winds up being far more money, and is only really relevant when physical safety is an issue.
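As a worked example of that pricing, with invented numbers: if experience suggests a 2% chance per project of an exploit costing $50,000 to make good, the expected loss is 0.02 × $50,000 = $1,000 per project, so padding each quote by roughly that amount (plus a margin) is what funds the occasional payout across many clients.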
Developers, no. Companies, yes. (Score:3)
Developers don't choose when to release software; management does. Think you need to do more testing, but management thinks it looks ready? It's out the door and you can't do anything to stop them. Testing is just as important as coding, and the developers don't do all the testing; it's usually outsourced.
Bottom line: if a company doesn't do its due diligence, then yes, it should be responsible for putting out bad software.
Requires total change of industry (Score:4, Insightful)
This would also cause a massive drop in the number of available licensed programmers for any work that needs to be done, as well as requiring programmers to carry liability insurance. Development costs would get much larger. I seriously doubt half of the programmers in the country are close to prepared to pass any licensing tests.
Why not managers and owners? (Score:3)
At the last corporate job I held our managers would frequently push the development staff to put things out before they had been fully tested.
Why punish the people who write the software when often they have an extremely limited amount of control over things?
Businesses selling shitty, insecure software should absolutely be held accountable. Individual developers within those businesses being directly liable? No way.
Why not hold each factory worker who was responsible for a round of ammunition or piece of a missile liable for murder when a drone strike takes out a civilian?
Hold the people responsible for making decisions responsible, not the people who are just putting things together.
Re: (Score:2)
Why not? As an indie dev, it kind of freaks me out, but if it drives as little as half of the crap coders out of the market, it might be a very good idea...
Re:Short answer: No (Score:5, Insightful)
It'll have very little impact on actual code quality.
All that will happen is:
- software prices will increase
- a whole insurance industry will spring up around it (think malpractice insurance)..
- people will specifically seek out stuff developed by small shops and try to break it specifically so they can sue..
- producing software will become so expensive and require so much up-front investment that indie devs will be SOL
- the big guys will keep producing shit, and just protect themselves behind lawyers (and feed the cost back to the customer)
Re: (Score:2)
Exactly. Imagine the insurance premiums: they will be comparable to, or larger than, what engineers / doctors pay.
Software will become so expensive in reaction that the field will collapse.
Re:Short answer: No (Score:5, Insightful)
As a professional engineer in a closely related field (industrial control systems), I disagree. What is required is a degree of rigour in design to remove systematic errors as much as is humanly possible. Engineered products still fail, and end-users may sue, but the test is simply whether the engineer, or developer in this case, took all *reasonable* measures to limit errors.
Long overdue in the software development profession, IMO. It's time we grew up.
Re:Short answer: No (Score:4, Insightful)
So, how does the price of a typical industrial control system compare to the price of typical boxed software for popular platforms?
Re: (Score:2)
The "why not" is because all it would result in is more outsourcing to countries outside of jurisdiction.
Either way, bad software is more a consequence of bad managers than of bad developers. Managers allocate resources to projects, including development talent, and managers demand deadlines contrary to software quality. If people should be held accountable, it should be the ones running the show, not the ones taking orders.
Re: (Score:2, Funny)
Yes, and I am sure the managers will be the ones to pay the price.
Re: (Score:3)
Depends on the manager. It's a myth that managers make more money than developers; usually, the lowest-level manager (the one who directly supervises the developers) doesn't make much more (if any) than the developers. It's the levels above him that get paid exponentially more at each level. The only reason someone goes into management is because either 1) they're not that great a developer, and prefer talking and being in meetings and bossing people around to doing actual work, and/or 2) they want to go
Re: (Score:2)
If such a regulation passed, we'd see a lot more attention given to removing bugs, and a lot less attention given to creating new but probably useless features. Yes, I'm looking at both the Gnome3 and Win8 dev teams here.
If such a regulation passed, you wouldn't be able to look at the Gnome3 dev team at all, because it wouldn't exist anymore. Neither would most free software projects.
Put yourself in the position of Linus Torvalds when he started developing Linux, for instance. Do you think he would even start distributing it if he could be sued for it?
Re: (Score:3)
You seem to have a strange idea about what motivates people to write free software.
I'm a free software developer in my free time. I mostly work on my own projects, but I occasionally contribute random patches to large projects. I write free software for fun, mostly doing things I'm interested in at the time. Most software I write has no bad bugs, probably because I write it very slowly, and because the stuff I write tends to be very simple. But if I had to go out of my way to ensure that it's impossible for some
Re: (Score:2)
'Tis quite alright. We'll just stop making software for the other sectors. They can go back to doing things on an abacus, while the rest of us have 30 minute work days.
Re: (Score:3)
If you start holding them liable, you can say goodbye to freeware and hobbyist projects.. No one will want liability that only super corporations can afford to defend.. This will grant Microsoft, Apple, and the government exactly what they want.