Insecure Code - Vendors or Developers To Blame?
Annto Dev writes "Computer security expert Bruce Schneier feels that vendors are to blame for 'lousy software'. From the article: 'They try to balance the costs of more-secure software--extra developers, fewer features, longer time to market--against the costs of insecure software: expense to patch, occasional bad press, potential loss of sales. The end result is that insecure software is common...' he said. Last week Howard Schmidt, the former White House cybersecurity adviser, argued at a seminar in London that programmers should be held responsible for flaws in code they write."
Errors and Omissions Insurance (Score:4, Insightful)
E&O is incredibly expensive. I looked into buying a policy when I started doing environmental work, due to the possibility that I could be named a 'potentially responsible party' in an environmental enforcement action by the government. I side-stepped that need when I went to work for a large firm that could afford the E&O insurance. You can bet that cost was included in my charge-out rate.
That is what this effort will lead to for independent programmers. You will have the choice of buying E&O insurance, provided you qualify, and jacking your prices up to cover your costs, or you will have to work for a company that already has it. Hobby/free software enthusiasts are screwed.
I prefer the policy of 'caveat emptor'. If you install free software on your production machine without properly vetting it you are not only a fool but should bear all of the costs yourself.
Why not?! (Score:2, Insightful)
Kettle = black; (Score:5, Insightful)
OK. And to make it fair, let's let lawmakers be responsible for all the unintended consequences their legislation brings about.
Welcome to business. (Score:3, Insightful)
this doesn't sit well with me (Score:4, Insightful)
Secure code will never happen (Score:5, Insightful)
The software industry operates more like the automobile industry: automakers know their cars will have problems, so they freely fix those problems during the warranty period. Software's warranty period is as long as the vendor or developer says they'll support that software.
The major difference is that with closed-source software, after the "warranty" period is up, you usually can't pay someone to fix the problems. Open source provides a great car analogy: after, say, Red Hat stops supporting your OS, you can still fix it yourself or hire a developer to fix it for you.
This is why nobody would buy a car with the hood welded shut. For the life of me I can't figure out why anybody would buy software with the "hood" welded shut.
Re:Why not?! (Score:5, Insightful)
Tell you what, I'll get licensed to write code and be legally responsible for it the day that customers are willing to pay about 8x what they currently pay for the software, and can wait about 4x as long. Can you make that happen? I'm waiting anxiously, because *I* don't get to make those decisions.
So guess what... you want good code, hold the *EMPLOYER* responsible. I'll bet I suddenly find myself with all of the time I need to develop quality software.
Re:Errors and Omissions Insurance (Score:5, Insightful)
-Rick
Re:How about both? (Score:2, Insightful)
Re:Why not?! (Score:5, Insightful)
Re:Errors and Omissions Insurance (Score:3, Insightful)
Net result: not much additional motivation to secure code, just more lawsuits, so costs increase to feed the lawyers instead of the process.
come on, really. (Score:2, Insightful)
People really overcomplicate this topic. Nobody is perfect, and people make mistakes. It really doesn't matter what excuse I use (deadlines, bad company decisions, whatever): if it's code I wrote, it's my fault. Even if I identified the hole and my boss told me to skip it, I still published flawed code. If I was perfect, it would be bulletproof from the get-go, and if my team was perfect the same would apply. My boss would never have to tell me to ignore the error/hole, because my UML model was flawless, and my execution of it was flawless.
I think this topic comes up because of one of two things: developers passing the buck, or bloggers/writers trying to get some press. The fault is obvious; the solution, however, is far more difficult, and since humans will create the solutions, chances are they will be flawed too, and the cycle repeats...
Re:Why not?! (Score:5, Insightful)
The vendor... (Score:5, Insightful)
If I develop X for a company that then takes X to market, and X turns out to be faulty, the company should be at fault. I am at fault for writing shoddy code, the effect of which will be that I get fewer future contracts or less employment doing the same. The company is at fault for taking X to market, and as such should be responsible for any liability due to X's shortcomings.
GM is responsible for a shoddy part on one of its vehicles, not the engineer that developed the part.
Sole proprietors who take their code to market should be responsible, but in that instance, the sole proprietor is both developer and vendor.
Re:Why not?! (Score:5, Insightful)
My point is, there's a whole range of "bad things" that happen, from clearly negligent to uncontrollable, and a lot of stuff in between, and we make that judgement every day by assessing or not assessing blame.
To construct large, complex software systems without bugs (including security flaws) is beyond the state of the art. In fact, it is beyond the state of the art by definition: if we could make today's systems bug-free, we could, and would, make even more ambitious systems by tolerating some rate of errors. Conversely, with today's state of the art, if we placed correctness (including security) above everything else, we'd have to cut way back on what we attempt, and charge a lot more. The market has already decided that's the wrong approach.
Languages with buffer overflows need to be avoided (Score:5, Insightful)
It's the customers' fault (Score:3, Insightful)
If people really cared about security, MS would have been driven out of business a long time ago, and other vendors would have taken note of that and made sure the same thing didn't happen to them. We would have more secure, less featureful, less convenient, more expensive software. But people don't care that much, so that didn't happen.
I'll accept liability for the code (Score:5, Insightful)
As a programmer, I'll accept liability for bugs in the code... the day I get the same protection that a professional engineer gets: if I say I need X for the program to be properly designed/written/tested, any manager or executive or marketer who says otherwise can be thrown in jail. No mere job protection, no civil remedies, jail time for anyone who tries to overrule me, same as would happen to a manager who overruled the structural engineer's specification of the grades of concrete and steel to be used in a building.
Responsibility and authority go hand in hand. If they want to hand me the responsibility, they give me the authority to go with it. If, OTOH, they don't want to give me the authority, then they can shoulder the responsibility.
Plenty of Examples (Score:4, Insightful)
They have weak or no APIs, the built-in tools aren't able to perform the most basic tasks the users want, and the customized workarounds take as much work as rewriting the software.
I think the guy from the article has a point, as there are many businesses that spend many times any of our salaries running commercial software, and the people involved in the purchase have no idea they're throwing bad money at subpar products. I'm not sure he's talking about something relevant to most slashdotters, though: even those of us who work in IT don't really get to pick the accounting software people use; the CFOs pretty much run what they know, and we have to build the network around that accounting package.
Who Do You Trust? (Score:5, Insightful)
Noted security expert or political hack?
It's not even close. On the credibility front Schneier has hundreds - no, thousands - of times more credibility on this issue than a political appointee out of the White House. Actually, it's infinitely more credibility, because anything times zero is zero where the White House is concerned.
Re:Why not?! (Score:4, Insightful)
Name them. At least those that do not require a minimum level of formal training or accreditation.
Also, this is a moot issue, because lawsuits follow the money. If I were to sue somebody for faulty software, would I waste my time, money, and lawyer expertise on suing the developer who makes, say, $80,000 a year and probably has no real capital to speak of, or the multi-million dollar company?
Also, when I was a developer, I was not the one that decided when the product was done testing and ready to ship. If I can't make that decision, then I have no liability. Period.
Re:E&O by company or by employee (Score:5, Insightful)
Plus, the company owns the product (Score:5, Insightful)
Besides, it is a company's responsibility to sell good products. If they sell a product that is defective, it is often because they didn't do sufficient QA on the product, or rushed it to market.
Bottom line is that if a car maker sells a car with a defective part (say, the tire lugs were defective) and it passes shoddy QA, it is the maker's fault, not the assembly-line guy's. If it doesn't pass QA, you can be sure Ford won't sell it -- but the same doesn't seem true of software.
Re:Languages with buffer overflows need to be avoided (Score:3, Insightful)
I see no reason why a user shouldn't be able to download and run any program they find, as they should all be sandboxed appropriately so that they cannot cause damage.
Sure, it may be a good start to remove some of the bugs, but who writes the sandbox? In what language? Is the sandbox itself sandboxed, to keep it from being compromised? If so, who writes that sandbox? In what language? Is that sandbox itself sandboxed, to keep it from being compromised? If so...
It's not an "obvious solution." It's an "obvious time-saver" when it comes to having to check for bugs.
Could you explain to me why a program downloaded from the internet has read and write access to every file on my computer?
I think that says more about your choice of operating system than about the program itself.
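To illustrate, here's a minimal sketch in C of how an OS or program launcher could confine an untrusted program before running it. It assumes a POSIX system started as root; the empty directory and the uid/gid 65534 ("nobody" on many systems) are illustrative assumptions, not a hardened design:

    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        /* Confine the process to an (assumed empty) directory so the
           rest of the filesystem becomes unreachable. Requires root. */
        if (chroot("/var/empty") != 0 || chdir("/") != 0) {
            perror("chroot");
            return 1;
        }
        /* Drop root immediately, so the process can't chroot back out.
           65534 is conventionally the unprivileged 'nobody' account. */
        if (setgid(65534) != 0 || setuid(65534) != 0) {
            perror("drop privileges");
            return 1;
        }
        /* From here on, the host's /etc/passwd simply isn't visible. */
        if (fopen("/etc/passwd", "r") == NULL)
            puts("host filesystem is out of reach");
        return 0;
    }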
It essentially means that if you write a program to be used on a network, you have to maintain and patch it forever because you'll never catch all the buffer overflows it contains.
I think you mean:
It essentially means that if you write a program to be used on a network, you have to maintain and patch it forever, because you'll never catch all the programming errors, incorrect assumptions, and random unexpected behaviour introduced through unforeseen run-time activity it contains.
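To make that concrete, the buffer-overflow class of bug fits in a few lines of C. This is only an illustrative sketch -- the function names are invented, not taken from any real product:

    #include <stdio.h>
    #include <string.h>

    /* Unsafe: any request longer than 15 characters writes past the
       end of 'name' -- the classic stack buffer overflow. */
    void handle_request_unsafe(const char *request) {
        char name[16];
        strcpy(name, request);                /* no bounds check */
        printf("hello, %s\n", name);
    }

    /* Bounded: the copy can never exceed the destination buffer, so
       this particular class of bug is ruled out by construction. */
    void handle_request_bounded(const char *request) {
        char name[16];
        snprintf(name, sizeof name, "%s", request);
        printf("hello, %s\n", name);
    }

    int main(void) {
        handle_request_bounded("a request far longer than the sixteen-byte buffer");
        return 0;
    }

Note that the bounded version silently truncates long input, which may itself be a logic error: ruling out one class of bug doesn't make the program correct, which is exactly the point about incorrect assumptions.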
The broader picture... (Score:2, Insightful)
Although it may vary from shop to shop, the shop where I currently work follows a pretty standard model.
There is a major misconception that a Developer is the "one-stop" source for software, but that is rarely the case. Even when some of the first steps are handled by a single person (usually when the Developer is a Lead or a Programmer Analyst, in title), the process entails more than just a single person.
It is only a matter of time, if it hasn't happened already, before insurance companies start selling liability insurance to Developers, just like they sell malpractice insurance to Doctors. There are companies out there that will claim the "collective effort" when the profits roll in, but will hang a developer out to dry when something goes wrong. Thankfully, I left that job for the one I am at now. ;-)
Software Assurance (Score:5, Insightful)
The same principle applies to security. While you may not be able to say your system is completely invulnerable without expending enormous amounts of time and money, you can make certain formal assurances like "no buffer overflow exploits exist in this software" or "the software will always fully and correctly carry out security protocol X, or abort with an error and deny access". Such things don't ensure 100% security, but being able to formally make such assurances does significantly improve the expected security of the software.
For some reason software has gotten stuck in an "all or nothing" mentality, claiming that since you obviously can't ensure perfection, you should assume nothing and make no assurances at all. That is neither necessary nor productive. Being able to formally guarantee certain properties of software is both possible and only as much extra work as the amount of assurance you choose to provide.
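As a toy illustration of making one narrow assurance checkable, the C sketch below (the type and function names are hypothetical) funnels every read of a buffer through a single guarded function, so "no out-of-bounds read of this buffer" becomes a property an auditor can verify without proving the whole program correct:

    #include <assert.h>
    #include <stddef.h>

    typedef struct {
        const unsigned char *data;
        size_t len;
    } checked_buf;

    /* The only sanctioned way to read the buffer: because every access
       passes through this one bounds check, the no-out-of-bounds-read
       property can be audited by inspecting a single function. */
    unsigned char checked_buf_get(const checked_buf *b, size_t i) {
        assert(b != NULL && i < b->len);      /* the assured property */
        return b->data[i];
    }

    int main(void) {
        unsigned char raw[4] = {1, 2, 3, 4};
        checked_buf b = { raw, sizeof raw };
        return checked_buf_get(&b, 3);        /* in bounds: returns 4 */
    }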
Jedidiah.
Re:Errors and Omissions Insurance (GPL V3) (Score:3, Insightful)
I'm wondering if the GPL 3 should include a clause to protect against this kind of lawsuit as well as patent lawsuits.
Re:Why not?! (Score:5, Insightful)
Because an Engineer of Real World Objects(tm) won't ever have to, say, open the bridge for traffic while the road deck is still being attached because someone decided it needed to be released early.
An Engineer wouldn't be told that we need flying buttresses, a bike lane, and cantilevered sidewalks two weeks before the bridge is supposed to open, but that it can't affect the delivery timeline, there's no time to test them, and there's no extra time to do the work, so we'll have to do it on our own time.
Until such time as your employer builds in several extra weeks (months?) of testing for security, provides you with resources to do it, and brings in independent experts to help verify it, it will be completely impossible for professional developers to meet that standard.
And as long as the company is selling the software with a license that absolves them from any blame, and helps to ensure they have that theoretically-perfect software, I'm sure as hell not putting my ass on the line with the ultimate responsibility for it.
Just because the company made several million, and the salesperson got a huge commission, doesn't mean that, when it was rushed out the door for reasons out of my control, I got paid any more for the effort. Shit may run downhill, but no *way* it falls that far.
Re:Why not?! (Score:3, Insightful)
Why should a software engineer have to provide this same security guarantee?
The level of security is a feature like any other feature.
If you are paying the architect to design "Fort Knox", then it should be secure to top security standards; otherwise top security cannot be expected, because sometimes people just want a cheap place to live with doors that have locks.
Do we expect every piece of software written, every house built, every vault designed, etc. to meet the highest security standards? In a word, no. Because, at some point, the cost of security becomes so high that we decide that the security is "good enough".
The US military wasn't even able to secure the Pentagon from an attack, and we're worried about suing Joe Blow software developer for some bugs?
Re:Secure code will never happen (Score:3, Insightful)