Security Programming IT Technology

Insecure Code - Vendors or Developers To Blame?

Annto Dev writes "Computer security expert Bruce Schneier feels that vendors are to blame for 'lousy software'. From the article: 'They try to balance the costs of more-secure software--extra developers, fewer features, longer time to market--against the costs of insecure software: expense to patch, occasional bad press, potential loss of sales. The end result is that insecure software is common...' he said. Last week Howard Schmidt, the former White House cybersecurity adviser, argued at a seminar in London that programmers should be held responsible for flaws in code they write."

  • by geomon ( 78680 ) on Friday October 21, 2005 @12:46PM (#13845476) Homepage Journal
    Great news for the E&O insurance industry! When programmers become liable for the mistakes (read: human nature) of their creations then everyone who codes for a living will have to consider buying insurance to hedge their risk, or find another form of work.

    E&O is incredibly expensive. I looked into buying a policy when I started doing environmental work due to the possibility that I could be named a 'potentially responsible party' in an environmental enforcement action by the government. I side-stepped that need when I went to work for a large firm that could afford the E&O insurance. You can bet that cost was included in my chargeout rate.

    That is what this effort will lead to for independent programmers. You will have the choice of buying E&O insurance, provided you qualify, and jacking your prices up to cover your costs, or you will have to work for a company that already has it. Hobby/free software enthusiasts are screwed.

    I prefer the policy of 'caveat emptor'. If you install free software on your production machine without properly vetting it, you are not only a fool but should also bear all of the costs yourself.
  • Why not?! (Score:2, Insightful)

    by orangeguru ( 411012 ) on Friday October 21, 2005 @12:47PM (#13845491) Homepage
    Almost all other professions have to take responsibility for their work and constructs - why are programmers an exception?!
  • Kettle = black; (Score:5, Insightful)

    by LaughingCoder ( 914424 ) on Friday October 21, 2005 @12:49PM (#13845534)
    "the former White House cybersecurity adviser, argued at a seminar in London that programmers should be held responsible for flaws in code they write."

    OK. And to make it fair, let's let lawmakers be responsible for all the unintended consequences their legislation brings about.
  • by shdragon ( 1797 ) on Friday October 21, 2005 @12:50PM (#13845547) Homepage Journal
    I'd be glad to take responsibility for any code I write just as soon as they're willing to pay my new, updated fees. If it's really *that* important, shouldn't the client be at least as concerned with getting it done right as with cost?
  • by fak3r ( 917687 ) on Friday October 21, 2005 @12:53PM (#13845569) Homepage
    • "White House cybersecurity adviser, argued at a seminar in London that programmers should be held responsible for flaws in code they write
    "The problem with that is when an employee writes code for a company, it becomes the companies' code, so it would follow that any litigation should fall on the company, and not the programmer. I would also argue that the programmer doesn't release the software, that's up to the company which *should* have testing and QA measures in place to find bugs and insecurity.
  • by digidave ( 259925 ) on Friday October 21, 2005 @12:55PM (#13845585)
    I'm sick and tired of hearing talk about holding vendors or developers legally responsible for writing insecure code. It's impossible to write any complex application and not have security problems.

    The software industry operates more like the automobile industry: they know their cars will have problems, so they freely fix those problems for the warranty period. Software's warranty period is as long as the vendor or developer says they'll support that software.

    The major difference is that with closed source software, after the "warranty" period is up you can't usually pay someone to fix the problems. Open source provides a great car analogy, because after, say, Red Hat stops supporting your OS you can still fix it yourself or hire a developer to fix it for you.

    This is why nobody would buy a car with the hood welded shut. For the life of me I can't figure out why anybody would buy software with the "hood" welded shut.
  • Re:Why not?! (Score:5, Insightful)

    by Anonymous Coward on Friday October 21, 2005 @12:55PM (#13845594)
    We're an exception because you pay us next to nothing and give us no time to work. You don't hire an engineer and then tell him to build a bridge in a few weeks.

    Tell you what, I'll get licensed to write code and be legally responsible for it the day that customers are willing to pay about 8x what they currently pay for the software, and can wait about 4x as long. Can you make that happen? I'm waiting anxiously, because *I* don't get to make those decisions.

    So guess what...if you want good code, hold the *EMPLOYER* responsible. I'll bet I suddenly find myself with all of the time I need to develop quality software.
  • by RingDev ( 879105 ) on Friday October 21, 2005 @12:57PM (#13845607) Homepage Journal
    You'll notice also that this does nothing to improve the security of the code. It just makes it more expensive.

    -Rick
  • Re:How about both? (Score:2, Insightful)

    by whyne ( 784135 ) on Friday October 21, 2005 @12:57PM (#13845616)
    Today's corporate structure has more than 2 groups involved. Excrement rolls downhill. The analysts and advisers explain the value of shipping a release this quarter or year (profit now for the quarter, later for the next quarter, then for the year ... ). Upper management leans on middle management. Middle then leans on lower management and supervisors. Then the developers work harder to bring it to life. Even if they do not like the product/method/model, a programmer may not be able to affect the outcome.
  • Re:Why not?! (Score:5, Insightful)

    by Quasar1999 ( 520073 ) on Friday October 21, 2005 @01:01PM (#13845653) Journal
    As a software developer, I'll take responsibility for bugs in my code and the damages they may cause, the day that politicians take responsibility for their campaign promises and the crap they end up passing as law later...
  • by Godeke ( 32895 ) * on Friday October 21, 2005 @01:02PM (#13845669)
    Give this man a dollar! E&O insurance actually increases the rate at which lawsuits are filed (since you have a better chance of actually being paid out). And while the threat of having your E&O insurance premiums increase is a motivator, I don't think it is as much of one as the scenario where you don't have E&O insurance and you are "self insuring".

    Net result: not much additional motivation to secure code, more suits and thus costs increase to feed the lawyers instead of the process.
  • come on, really. (Score:2, Insightful)

    by CDPatten ( 907182 ) on Friday October 21, 2005 @01:06PM (#13845704) Homepage
    I'm a developer, and errors/holes in my code are my fault. Some could, in theory, be the fault of the framework I use, but typically it's mine.

    People really overcomplicate this topic. Nobody is perfect, and people make mistakes. It really doesn't matter what excuse I use (deadlines, bad company decisions, whatever): if it's code I wrote, it's my fault. Even if I identified the hole and my boss told me to skip it, I still published flawed code. If I was perfect, it would be bulletproof from the get-go, and if my team was perfect the same would apply. My boss would never have to tell me to ignore the error/hole, because my UML model was flawless, and my execution of it was flawless.

    I think this topic comes up because of one of two things: developers passing the buck, or bloggers/writers trying to get some press. The fault is obvious; the solution, however, is far more difficult, and since humans will create the solutions, chances are they will be flawed too, and the cycle repeats...

  • Re:Why not?! (Score:5, Insightful)

    by cpuh0g ( 839926 ) on Friday October 21, 2005 @01:07PM (#13845713)
    Really? If your car's engine has a problem, do you sue the machinist who made the faulty part or just sue his company? Individual engineers who work for a company that creates software are responsible within the company, but should not be exposed personally. The company takes the ultimate responsibility for the products they produce. If they shortchange the development cycle in order to rush to market and the product is crap, the company takes the hit, not the engineer who wrote the code.
  • The vendor... (Score:5, Insightful)

    by BewireNomali ( 618969 ) on Friday October 21, 2005 @01:09PM (#13845730)
    I don't code, but I don't think making developers responsible for faulty code is a good solution.

    If I develop X for a company that then takes X to market, and X turns out to be faulty, the company should be at fault. I am at fault for writing shoddy code, the effect of which will be that I get fewer future contracts or employment to do the same. The company is at fault for taking X to market, and as such should be responsible for any liability due to X's shortcomings.

    GM is responsible for a shoddy part on one of its vehicles, not the engineer that developed the part.

    Sole proprietors who take their code to market should be responsible, but in that instance, the sole proprietor is both developer and vendor.
  • Re:Why not?! (Score:5, Insightful)

    by timeOday ( 582209 ) on Friday October 21, 2005 @01:09PM (#13845732)
    Almost all other professions have to take responsibility for their work and constructs
    They do? I never sued a farmer because I cut open a watermelon and it was bad inside. I never sued a GM engineer because my transmission only lasted 60K miles. I never sued a weatherman because his forecast was wrong. I never sued a chef because I had a bad meal.

    My point is, there's a whole range of "bad things" that happen, from clearly negligent to uncontrollable, and a lot of stuff in between, and we make that judgement every day by assessing or not assessing blame.

    To construct large, complex software systems without bugs (including security flaws) is beyond the state of the art. In fact, it is beyond the state of the art by definition: if we could make today's systems bug-free, we could, and would, make even more ambitious systems by tolerating some rate of errors. Conversely, with today's state of the art, if we placed correctness (including security) above everything else, we'd have to cut way back on what we attempt, and charge a lot more. The market has already decided that's the wrong approach.

  • by applecrumble ( 910692 ) on Friday October 21, 2005 @01:10PM (#13845738)
    A good start to our current security problem would be to stop writing internet-based software in languages that allow buffer overflows to occur (e.g. C, C++). 90% of security exploits are caused by buffer overflows. I've seen a figure like this in research papers, but it should be obvious to anyone from reading patch descriptions and current security alerts. Writing computer programs in these types of languages and patching the errors as they are found is simply not a scalable solution. It essentially means that if you write a program to be used on a network, you have to maintain and patch it forever because you'll never catch all the buffer overflows it contains (e.g. the zlib bug: not a particularly large library, and it has been around for a long time). Picking a tool that doesn't even allow these types of errors is the obvious solution.

    In addition, we need to start using more granular security permissions for our programs. Blaming security problems solely on users is ridiculous. Could you explain to me why a program downloaded from the internet has read and write access to every file on my computer? Why it can open up network connections? Having root users is a start, but we need to be able to sandbox all the applications we download so they just aren't allowed to do anything bad. I see no reason why a user shouldn't be able to download and run any program they find, as they should all be sandboxed appropriately so that they cannot cause damage.
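
    To make the bug class concrete, here is a minimal, made-up C sketch (function and buffer names are invented for illustration, not taken from any real product) of the unbounded copy that causes an overflow, next to a bounded version that cannot write past its buffer:

        #include <stdio.h>
        #include <string.h>

        /* Classic overflow: strcpy() trusts the caller to respect the buffer size. */
        void greet_unsafe(const char *input)
        {
            char buf[16];
            strcpy(buf, input);   /* input longer than 15 bytes overruns buf
                                     and corrupts adjacent memory */
            printf("hello %s\n", buf);
        }

        /* Bounded alternative: the copy can never write past the end of buf. */
        void greet_safe(const char *input)
        {
            char buf[16];
            snprintf(buf, sizeof buf, "%s", input);   /* truncates, always NUL-terminated */
            printf("hello %s\n", buf);
        }

        int main(void)
        {
            greet_safe("a string longer than fifteen characters");
            return 0;
        }

    Memory-safe languages rule out the first version by construction, which is the point about picking a tool that doesn't allow the error in the first place.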
  • by rsheridan6 ( 600425 ) on Friday October 21, 2005 @01:10PM (#13845748)
    If customers demanded secure software, then vendors would produce secure software. People instead buy software that's either the most familiar, easiest to use, cheapest, or has the most features checked off, so that's what vendors produce. That's why the utter pile of crap known as Windows 3.1/95 won while OS/2 and other more secure alternatives lost, and Windows continues to win over more secure alternatives today. Why should vendors spend their resources on something people have proven they don't care about?

    If people really cared about security, MS would have been driven out of business a long time ago, and other vendors would have taken note of that and made sure the same thing didn't happen to them. We would have more secure, less featureful, less convenient, more expensive software. But people don't care that much, so that didn't happen.

  • by Todd Knarr ( 15451 ) on Friday October 21, 2005 @01:11PM (#13845756) Homepage

    As a programmer, I'll accept liability for bugs in the code... the day I get the same protection that a professional engineer gets: if I say I need X for the program to be properly designed/written/tested, any manager or executive or marketer who says otherwise can be thrown in jail. No mere job protection, no civil remedies, jail time for anyone who tries to overrule me, same as would happen to a manager who overruled the structural engineer's specification of the grades of concrete and steel to be used in a building.

    Responsibility and authority go hand in hand. If they want to hand me the responsibility, they give me the authority to go with it. If, OTOH, they don't want to give me the authority, then they can shoulder the responsibility.

  • Plenty of Examples (Score:4, Insightful)

    by jpowers ( 32595 ) on Friday October 21, 2005 @01:12PM (#13845761) Homepage
    You can make a case for this without worrying about impinging on the right to make free software. Peopleware really isn't worth the thousands of dollars it runs you. Solomon Accounting isn't worth the $100K it costs for a companywide install, Great Plains and larger packages like Deltek's Costpoint (actual install cost: $450K) are no better.

    They have weak or no APIs, the built-in tools aren't able to perform the most basic tasks the users want, and the customized workarounds take as much work as rewriting the software.

    I think the guy from the article has a point, as there are many businesses that spend many times any of our salaries running commercial software, and the people involved in the purchase have no idea they're throwing bad money at subpar products. I'm not sure he's talking about something relevant to most slashdotters: even those of us who work in IT don't really get to pick the accounting software people use; the CFOs pretty much run what they know and we have to build the accounting network around that package.
  • Who Do You Trust? (Score:5, Insightful)

    by PingXao ( 153057 ) on Friday October 21, 2005 @01:13PM (#13845766)
    Who do you trust more?

    Noted security expert or political hack, ... noted security expert or political hack, ... noted security expert or political hack?

    It's not even close. On the credibility front Schneier has hundreds - no, thousands - of times more credibility on this issue than a political appointee out of the White House. Actually it's infinitely more credibility because anything times zero is zero where the White House is concerned.
  • Re:Why not?! (Score:4, Insightful)

    by hackstraw ( 262471 ) * on Friday October 21, 2005 @01:20PM (#13845827)
    Almost all other professions have to take responsibility for their work and constructs - why are programmers an exception?!

    Name them. At least those that do not require a minimum level of formal training or accreditation.

    Also, this is a moot issue because the lawsuits follow the money. If I were to sue somebody for faulty software, would I waste my time, money, and lawyer expertise on suing the developer that makes, say, $80,000 a year and probably has no real capital to speak of, or the multi-million dollar company?

    Also, when I was a developer, I was not the one that decided when the product was done testing and ready to ship. If I can't make that decision, then I have no liability. Period.
  • by mrchaotica ( 681592 ) on Friday October 21, 2005 @01:23PM (#13845846)
    The day that every programmer is licensed the way that doctors and lawyers...
    In other words, the day when they start making "software engineering" a real engineering discipline, and letting programmers become Professional Engineers [wikipedia.org]:
    The earmark that distinguishes a professional engineer is the authority to "sign off" or "stamp" on a design or a structure, thus taking legal responsibility for it. (emphasis added)
  • by kcurtis ( 311610 ) on Friday October 21, 2005 @01:24PM (#13845863)
    Most contracts result in the company owning all of the intellectual property. If the programmer can't own their work, then the owner should be responsible for it.

    Besides, it is a company's responsibility to sell good products. If they sell a product that is defective, it is often because they didn't do sufficient QA on the product, or rushed it to market.

    Bottom line is that if a car maker sells a car with a defective part (the tire lugs were defective), and it passes shoddy QA, it is the maker's fault, not the assembly line guy's. If it doesn't pass QA, you can be sure Ford won't sell it -- but the same doesn't seem true of software.
  • by mopslik ( 688435 ) on Friday October 21, 2005 @01:25PM (#13845867)

    I see no reason why a user shouldn't be able to download and run any program they find, as they should all be sandboxed appropriately so that they cannot cause damage.

    Sure, it may be a good start to remove some of the bugs, but who writes the sandbox? In what language? Is the sandbox itself sandboxed, to prevent being compromised? If so, who writes that sandbox? In what language? Is that sandbox itself sandboxed, to prevent being compromised? If so...

    It's not an "obvious solution." It's an "obvious time-saver" when it comes to having to check for bugs.

    Could you explain to me why a program downloaded from the internet has read and write access to every file on my computer?

    I think that says more about your choice of operating system than about the program itself.

    It essentially means that if you write a program to be used on a network, you have to maintain and patch it forever because you'll never catch all the buffer overflows it contains.

    I think you mean:

    It essentially means that if you write a program to be used on a network, you have to maintain and patch it forever because you'll never catch all the programming errors, incorrect assumptions, and random unexpected behaviour introduced through unforeseen run-time activity it contains.

  • by jferris ( 908786 ) on Friday October 21, 2005 @01:27PM (#13845887) Homepage
    ...is that developers are not the only people responsible.

    Although it may vary from shop to shop, where I am currently follows a pretty standard model:

    • Business Analysts gather requirements from the prospective users.
    • Project Management creates specifications.
    • Specifications are presented in a JAD session where they are gone over in a public discussion.
    • Project Management and Business Analysts work together to deliver a formal Design Document.
    • Second JAD session to dissect the Design Document and petition for any changes.
    • Development begins while Technical Writers produce any documentation in conjunction with Project Management and Developers.
    • Development performs Unit Testing to verify that the requirements of the specifications are met, as detailed in the Design Document.
    • Quality Assurance tests the entire functionality on top of what has been Unit Tested by Development.
    • Release is scheduled.
    • Repeated as needed (although usually more briefly) for bugs and maintenance.

    There is a major misconception that a Developer is the "one stop" source for software, where that is rarely the case. Even when some of the first steps are handled by a single person (usually when the Developer is a Lead or a Programmer Analyst, in title) the process entails more than just a single person.

    It is only a matter of time, if it hasn't happened already, before insurance companies start selling liability insurance to Developers, just like they sell Malpractice insurance to Doctors. There are companies out there that will claim the "collective effort" when the profits roll in, but will hang a developer out to dry when something goes wrong. Thankfully, I left that job for the one I am at now. ;-)

  • Software Assurance (Score:5, Insightful)

    by Coryoth ( 254751 ) on Friday October 21, 2005 @01:34PM (#13845954) Homepage Journal
    Yes, software has bugs and mistakes and errors, and in a large project it can become infeasible to guarantee that there aren't issues somewhere. That doesn't mean, however, that software vendors should simply ignore the issue. It's a matter of contracts and assurance: it is possible to make certain assurances about a piece of software and spend the time making sure it fulfills those properties. For instance, while you might not go to the trouble of ensuring a word processor is completely bug free, it may be worth providing assurances that files cannot be corrupted when the program crashes, and that the print preview is exactly what will be printed. There are methods for proving and verifying such properties, and if you restrict it to key properties that the client wants formal assurance on, then it is not significant extra work to use those methods.

    The same principle applies to security. While you may not be able to say your system is completely invulnerable without expending enormous amounts of time and money, you can make certain formal assurances like "no buffer overflow exploits exist in this software" or "the software will always fully and correctly carry out security protocol X, or abort with an error and deny access". Such things don't ensure 100% security, but being able to formally make such assurances does significantly improve the expected security of the software.
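
    A minimal C sketch of that second kind of assurance (all names and the token check are invented for illustration): the single property enforced is that every path either passes the check or aborts with an error and denies access.

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>

        /* Hypothetical check; stands in for "security protocol X". */
        static int credentials_valid(const char *user, const char *token)
        {
            return user != NULL && token != NULL
                && strcmp(token, "expected-token") == 0;
        }

        static void deny(void)
        {
            fprintf(stderr, "access denied\n");
            exit(EXIT_FAILURE);   /* abort with an error, as the assurance requires */
        }

        /* Assured property: there is no route to "granted" that skips the check. */
        static void login(const char *user, const char *token)
        {
            if (!credentials_valid(user, token))
                deny();
            printf("access granted to %s\n", user);
        }

        int main(void)
        {
            login("alice", "expected-token");   /* prints "access granted to alice" */
            return 0;
        }

    Verifying one narrow property like this at every entry point is far cheaper than proving the whole program correct, which is exactly the partial-assurance argument being made here.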

    For some reason software has gotten stuck in an "all or nothing" mentality, claiming that obviously you can't ensure perfection, therefore you should assume nothing, and make no assurances at all. That is neither necessary, nor productive. Being able to formally guarantee certain properties of software is both possible, and only as much extra work as the amount of assurance you choose to provide.

    Jedidiah.
  • by diakka ( 2281 ) on Friday October 21, 2005 @01:35PM (#13845965)
    The parent poster is correct in that this would destroy hobbyist programming, at least in the US.

    I'm wondering if the GPL 3 should include a clause to protect against this kind of lawsuit as well as patent lawsuits.
  • Re:Why not?! (Score:5, Insightful)

    by gstoddart ( 321705 ) on Friday October 21, 2005 @01:50PM (#13846103) Homepage
    Almost all other professions have to take responsibility for their work and constructs - why are programmers an exception?!

    Because an Engineer of Real World Objects(tm) won't ever have to, say, open the bridge for traffic while the road deck is still being attached because someone decided it needed to be released early.

    An Engineer wouldn't be told that we need flying buttresses, a bike lane, and cantilevered sidewalks two weeks before the bridge is supposed to be open, but that it can't affect the delivery timeline and there's no time to test them, and no extra time to do the work so we'll have to do it in our own time.

    Until such time as your employer builds in several extra weeks (months?) of testing for security, provides you with resources to do it, and brings in independent experts to help verify it, it will be completely impossible for professional developers to meet that standard.

    And as long as the company is selling the software with a license that absolves them from any blame, and helps to ensure they have that theoretically-perfect software, I'm sure as hell not putting my ass on the line with the ultimate responsibility for it.

    Just because the company made several million and the salesperson got a huge commission doesn't mean that I got paid any more for the effort when it was rushed out the door for reasons beyond my control. Shit may run downhill, but no *way* it falls that far.
  • Re:Why not?! (Score:3, Insightful)

    by x8 ( 879751 ) on Friday October 21, 2005 @02:04PM (#13846213)
    Does an architect have to guarantee that every house he designs is secure against break-ins?
    Why should a software engineer have to provide this same security guarantee?

    The level of security is a feature like any other feature.

    If you are paying the architect to design "Fort Knox", then it should be secure to top security standards; otherwise top security cannot be expected, because sometimes people just want a cheap place to live with doors that have locks.

    Do we expect every piece of software written, every house built, every vault designed, etc. to meet the highest security standards? In a word, no. Because, at some point, the cost of security becomes so high that we decide that the security is "good enough".

    The US military wasn't even able to secure the Pentagon from an attack, and we're worried about suing some Joe Blow software developer for some bugs?
  • by Brad Mace ( 624801 ) on Friday October 21, 2005 @03:12PM (#13846791) Homepage
    Another problem with these discussions is that they treat all 'flaws' the same. I can see at least 3 distinct categories that people should be thinking about:
    1. Malicious code, deliberately inserted into the program. I think everyone would agree the programmer should be held accountable in this case. The vendor could potentially have liability as well.
    2. True bugs: code that doesn't function as the programmer intended. I think accountability would vary dramatically on a case by case basis. What percentage of programmers would have made a similar mistake? Perhaps the QA team should be held accountable instead. But expecting perfection from either group seems unreasonable.
    3. Unrealistic expectations: The windows on my house aren't bulletproof, but I don't sue the manufacturer if someone shoots me through them. We should be focusing on the perpetrators in these cases, and perhaps on countries that do nothing about them. As mentioned previously, cars aren't nearly 100% safe, and people are dying in those crashes.
