
 



Crime Programming The Courts

Should Developers Be Sued For Security Holes? 550

An anonymous reader writes "A Cambridge academic is arguing for regulations that would allow software users to sue developers when sloppy coding leaves holes for malware infection. European officials have considered introducing such a law, but no binding regulations have been passed. Not everyone agrees that it's a good idea — Microsoft has previously argued against such a move by analogy, claiming a burglary victim wouldn't expect to be able to sue the manufacturer of a door or window in their home."
This discussion has been archived. No new comments can be posted.


  • Nah (Score:5, Insightful)

    by Anrego ( 830717 ) * on Thursday August 23, 2012 @06:41PM (#41102783)

    I think excessively poor software should result in some form of negligence ... but general “can happen to anyone” type bugs.. no.

    You can buy software with a (real) warranty attached. In general this costs a fuck tonne of money because the vendor is accepting a fair amount of liability. Even in a very horizontal market, the price increase for accepting that liability is going to be way more than anyone can afford.

    You get what you pay for. Want software that is very secure and unlikely to have serious bugs.. you can get it.. but it’s gonna cost more than you are willing to pay if you don’t really _need_ that level of support.

  • by Burdell ( 228580 ) on Thursday August 23, 2012 @06:43PM (#41102809)

    If it were possible to prevent all security holes, this wouldn't be a bad idea. However, it is provably impossible to do so. This would just create a new insurance industry, profiting from others' mistakes. It would really only serve to cut down on development, especially from small companies and individuals that couldn't afford a single security mistake (or insurance against lawsuits).

  • Sure (Score:5, Insightful)

    by BigSlowTarget ( 325940 ) on Thursday August 23, 2012 @06:44PM (#41102819) Journal

    What we need is more and richer lawyers and frightened software developers with malpractice costs bigger than doctors. Perhaps we can eventually make sure all code is only developed by giant corporations made up primarily of legal defense teams dedicated to patent exploitation and liability control with tiny development arms tagged on the end.

  • by gweihir ( 88907 ) on Thursday August 23, 2012 @06:46PM (#41102859)

    Exception: FOSS. All commercial software vendors should be liable for any and all damage caused by sloppy coding, including system cleanup, downtimes, etc. In most European countries this would just require classifying sloppy coding as "gross negligence". I am all for it.

  • by DemonGenius ( 2247652 ) on Thursday August 23, 2012 @06:47PM (#41102867)
    If software development was an official engineering discipline that required P.Eng designation, then maybe this case would have more legs. Even then I'd be in disagreement. Otherwise, hell no, HELL NOOOOOOOOOOOO!!!!!!! That is definitely one way to drive people away from a career in software development. This actually seems like a sneaky way for management to evade culpability if their product harms a customer/user.
  • by fiannaFailMan ( 702447 ) on Thursday August 23, 2012 @06:47PM (#41102877) Journal

    Sue the actual developer? How would you propose to do that if they're working for an incorporated company with limited liability?

  • Fair's fair (Score:3, Insightful)

    by Anonymous Coward on Thursday August 23, 2012 @06:48PM (#41102881)

    Sure, let's agree with Prof. Clayton that you should be able to sue developers for malpractice if their code contains security holes.

    Then perhaps you should also be able to sue professors, like Richard Clayton for malpractice, if their students are undereducated, or if their papers contain flaws.

  • Re:Nah (Score:5, Insightful)

    by Mitreya ( 579078 ) <<moc.liamg> <ta> <ayertim>> on Thursday August 23, 2012 @06:52PM (#41102925)

    excessively poor software should result in some form of negligence ... but general “can happen to anyone” type bugs.. no.

    And how do you define the difference?
    Based on the quality of code?
    Based on the amount of unit testing that was (provably) performed?

    This will start a slew of software that is only warranted under specific OS/software configurations (and then installing an aggressive anti-virus or not error-checking your RAM chips regularly would void your warranty).

  • by Anrego ( 830717 ) * on Thursday August 23, 2012 @06:53PM (#41102935)

    It'll have very little impact on actual code quality.

    All that will happen is:
    - software prices will increase
    - a whole insurance industry will spring up around it (think malpractice insurance)..
    - people will specifically seek out stuff developed by small shops and try to break it specifically so they can sue..
    - producing software will become so expensive and require so much up-front investment that indie devs will be SOL
    - the big guys will keep producing shit, and just protect themselves behind lawyers (and feed the cost back to the customer)

  • Re:Nah (Score:5, Insightful)

    by mcvos ( 645701 ) on Thursday August 23, 2012 @06:55PM (#41102981)

    Some things, like allowing SQL injection, might be considered negligence. But no programmer can possibly guarantee a complete absence of bugs, and any bug can be a security hole. It takes time and money to track them down. If you don't give them that time and money, you can't expect perfect security.

  • by Anonymous Coward on Thursday August 23, 2012 @06:57PM (#41103015)

    It's really time for computer science to grow up and join the rest of the pack. If a mechanical engineer designs a bridge that collapses under normal load, that engineer can be held PERSONALLY responsible for breach of duty. It's long past time for us to stop forgiving shoddy practices and people making the same old mistakes over and over that cause 90% of security vulnerabilities. Until people are held accountable, there won't be any meaningful change, and we'll keep having a field dominated by hacks and talent-free people without a real understanding of what they are doing.

    We NEED this responsibility, and so does the public we serve. They're growing tired of the mess that exists right now. Apple is trying to do better on this front, but it really needs to go much further, and the whole field needs to improve. We've had many decades of ad-hoc cowboy-coders. It's time to start demanding professional behavior.

  • by Ronin Developer ( 67677 ) on Thursday August 23, 2012 @06:57PM (#41103019)

    Why should FOSS get a pass? What user really has the time to validate the code, line by line, to search for security weaknesses BEFORE using it? No. Users expect the software, free or commercial, to work as advertised. And, given the "superiority" FOSS purports to have over commercial software, maybe it should be held to an even higher standard? Didn't think you'd want to go there.

    In many ways, FOSS would find itself facing lawsuits despite the "good samaritan" approach it takes. A loss, whether it comes from something you paid for or got for free, is still a loss and, in our litigious society, fair game.

    No, leave it to an academic to propose making individual developers liable for each line of code they write. This would destroy the entire IT industry (and most institutions) in a sweeping blow. Who could afford the "malpractice" insurance, given the widespread dissemination of most commercial and FOSS software?


  • Re:Nah (Score:4, Insightful)

    by Anrego ( 830717 ) * on Thursday August 23, 2012 @07:01PM (#41103065)

    Perfect, no... but I suspect there are companies that, if required to justify what they did to protect end users (was security even a consideration at any point..), would pretty much give a blank stare.

    It's not about having perfect security imo, but rather about at least making an effort proportional to the risk you are putting users in.

  • by swillden ( 191260 ) <shawn-ds@willden.org> on Thursday August 23, 2012 @07:02PM (#41103073) Journal
    I think the company would be liable, not the individual.
  • Re:Nah (Score:2, Insightful)

    by Anonymous Coward on Thursday August 23, 2012 @07:04PM (#41103103)

    Allowing SQL injection attacks is negligence. If you ship code vulnerable to an SQL injection attack, you need to find another line of work.

    SQL injection is the easiest attack vector to ward off: all you have to do is use prepared statements.

    Done.
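    To illustrate the commenter's point — a minimal sketch using Python's built-in sqlite3 module (the comment names no particular language; the table and values here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES (?, ?)", ("alice", "alice@example.com"))

# The untrusted value is bound as a parameter, never spliced into the SQL
# string, so the classic "' OR '1'='1" payload is matched as a literal name.
user_input = "' OR '1'='1"
rows = conn.execute(
    "SELECT email FROM users WHERE name = ?", (user_input,)
).fetchall()
print(rows)  # [] -- the injection attempt matches no row
```

    With string concatenation instead of the `?` placeholder, the same input would rewrite the query's logic; with a bound parameter it is just data.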

  • by Todd Knarr ( 15451 ) on Thursday August 23, 2012 @07:07PM (#41103141) Homepage

    OTOH a professional engineer differs from a software developer in one key way: he can't legally be overridden on safety matters. If management orders him to use steel that doesn't meet spec for the bridge's designed load, he can refuse to sign off on the plans and if the company tries to fire him the company is the one who'll end up in legal hot water after he reports them. If you want to make software developers responsible in that same way, you need to give them the same authority and immunity to repercussions for using that authority.

  • by Elminster Aumar ( 2668365 ) on Thursday August 23, 2012 @07:09PM (#41103163)
    Just because managers hire employees doesn't mean they're in 100% control of who they hire. Managers have supervisors just like everyone else. They have deadlines, too. Besides that, I'm not so sure it's wise to assume that just because you're the one hiring, you somehow control whether you hire talented developers. Last but not least, just because you have a talented pack doesn't mean they work together well or adapt to the technologies the clientele needs. Same with new hires: out of 200 applicants, hiring the one who screened best doesn't guarantee success, because there are too many moving parts. Long story short, it's not all on the manager's shoulders, just like it's not all the fault of the developers, clients, etc. All this article is about is some whiny asshole trying to point his fat, stubby finger at someone because he made a bad decision. They're angry, they're pissed, and now someone has to pay their bill. That's all this boils down to.
  • Re:Nah (Score:4, Insightful)

    by tomhath ( 637240 ) on Thursday August 23, 2012 @07:18PM (#41103261)
    So how do you define "sensitive"? There's no end to it; once you open the door, a good lawyer can convince a jury of anything.
  • Re:Sure (Score:2, Insightful)

    by Anonymous Coward on Thursday August 23, 2012 @07:25PM (#41103327)

    Developers outside of the US would thank us for our malpractice driven suicide.

  • by Sycraft-fu ( 314770 ) on Thursday August 23, 2012 @07:49PM (#41103539)

    While OSS zealots like to think it is bug free, it isn't. Bugs can and do happen in OSS. Well who the hell is going to contribute to a free project if they know they can be sued for it?

    Also it would lead to way less flexibility in software. Vendors would restrict what you could run and how you could run it. That is what you find in real high end systems where problems aren't ok. They are very expensive, they only do what they are designed to do, no installing arbitrary software, and upgrades are very slow in coming.

    So long as you want your system to be the wild west where you can install whatever you like, use it in any way you like, and be nice and cheap then you have to accept that problems can and will happen. If you want verified design you can have that, however you need to be prepared to pay the price both in terms of money and in terms of restrictions.

  • by Algae_94 ( 2017070 ) on Thursday August 23, 2012 @07:53PM (#41103579) Journal
    Sure, this could be done. Professional Engineers put their asses on the line when they approve designs, and programmers could be held to the same standard. This would most likely require devs to go through a similarly rigorous process of training and test-taking.

    This would also cause a massive drop in the number of available licensed programmers for any work that needs to be done, as well as having Programmers carrying liability insurance. This would cause development costs to get much larger. I seriously doubt half of the programmers in the country are close to prepared to pass any licensing tests.
  • by lightknight ( 213164 ) on Thursday August 23, 2012 @08:18PM (#41103871) Homepage

    He's thinking about the money that could be made if demand were to greatly outpace supply. He also thinks he isn't a hack.

    In the long run, it'd be a huge mistake. Any number of great programmers started off as hacks, then through time and experience, became who they are today. Implementing something like this would only serve to help destroy the technology field faster (and where it goes, others will be sure to follow).

  • Re:Nah (Score:5, Insightful)

    by ColdWetDog ( 752185 ) on Thursday August 23, 2012 @08:39PM (#41104061) Homepage

    Are you guys crazy?

    Do you realize how much bridges cost?

  • by Anonymous Coward on Thursday August 23, 2012 @10:03PM (#41104637)

    As a professional engineer in a closely related field (industrial control systems), I disagree. What is required is a degree of rigour in design to remove systematic errors as much as is humanly possible. Engineered products still fail, and end-users may sue, but the test is simply whether the engineer, or developer in this case, took all *reasonable* measures to limit errors.

    Long overdue in the software development profession, IMO. It's time we grew up.

  • Re:Nah (Score:5, Insightful)

    by Opportunist ( 166417 ) on Thursday August 23, 2012 @11:31PM (#41105125)

    The problem is that the market is by definition asymmetric. The customer usually knows jack about the implications of security concerns. If people had any idea of security, Facebook wouldn't be a success.

  • Re:Nah (Score:4, Insightful)

    by Bert64 ( 520050 ) <bert AT slashdot DOT firenzee DOT com> on Friday August 24, 2012 @01:45AM (#41105923) Homepage

    What's wrong with basic auth over ssl? it's still encrypted...
    no worse than form based auth.

    the biggest problem is that users have no idea how their data will be stored or used on the back end... the frontend auth system, whether it uses ssl, basic auth, forms or whatever, is visible to the user, but beyond that your data is going into an unknown black box..
    if you've no idea what is being done at the backend, you can't make an informed decision as to whether you want to trust a given site with your info or not.
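    As context for why the "over ssl" part is doing all the work in this comment — a small sketch (hypothetical credentials, Python chosen for illustration) showing that Basic auth itself is only reversible base64 encoding, not encryption:

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    # Per RFC 7617, the credentials are merely base64-encoded, not encrypted;
    # anyone who sees this header can decode it, which is why Basic auth is
    # only safe inside a TLS (SSL) connection.
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"

header = basic_auth_header("alice", "s3cret")
print(header)  # Basic YWxpY2U6czNjcmV0
```

    Decoding the token trivially recovers "alice:s3cret", so without TLS the credentials travel effectively in the clear — the same is true of a plain HTML login form POSTed over HTTP, which is the commenter's point.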

  • by shutdown -p now ( 807394 ) on Friday August 24, 2012 @01:55AM (#41105969) Journal

    So, how does the price of a typical industrial control system compare to the price of typical boxed software for popular platforms?

"God is a comedian playing to an audience too afraid to laugh." - Voltaire
