Insecure Code - Vendors or Developers To Blame? 284

Annto Dev writes "Computer security expert Bruce Schneier feels that vendors are to blame for 'lousy software'. From the article: 'They try to balance the costs of more-secure software--extra developers, fewer features, longer time to market--against the costs of insecure software: expense to patch, occasional bad press, potential loss of sales. The end result is that insecure software is common...' he said. Last week Howard Schmidt, the former White House cybersecurity adviser, argued at a seminar in London that programmers should be held responsible for flaws in code they write."
This discussion has been archived. No new comments can be posted.

  • by geomon ( 78680 ) on Friday October 21, 2005 @12:46PM (#13845476) Homepage Journal
    Great news for the E&O insurance industry! When programmers become liable for the mistakes (read: human nature) of their creations then everyone who codes for a living will have to consider buying insurance to hedge their risk, or find another form of work.

    E&O is incredibly expensive. I looked into buying a policy when I started doing environmental work due to the possibility that I could be named a 'potentially responsible party' in an environmental enforcement action by the government. I side-stepped that need when I went to work for a large firm that could afford the E&O insurance. You can bet that cost was included in my chargeout rate.

    That is what this effort will lead to for independent programmers. You will have the choice of buying E&O insurance, provided you qualify, and jacking your prices up to cover your costs, or you will have to work for a company that already has it. Hobby/free software enthusiasts are screwed.

    I prefer the policy of 'caveat emptor'. If you install free software on your production machine without properly vetting it you are not only a fool but should bear all of the costs yourself.
    • by RingDev ( 879105 ) on Friday October 21, 2005 @12:57PM (#13845607) Homepage Journal
      You'll notice also that this does nothing to improve the security of the code. It just makes it more expensive.

      -Rick
      • Give this man a dollar! E&O insurance actually increases the rate at which lawsuits are filed (since you have a better chance of actually being paid out). Now, while the threat of having your E&O insurance premiums increase is a motivator, I don't think it is much more of one than the scenario where you don't have E&O insurance and you are "self insuring".

        Net result: not much additional motivation to secure code, more suits and thus costs increase to feed the lawyers instead of the process.
      • However, if $Corporation knows it could be taken to court, and its insurance rates would then rise, it would be motivated to, at least, have proper testing.
        • However, if $Corporation knows it could be taken to court, and its insurance rates would then rise, it would be motivated to, at least, have proper testing.

          It would also be irresponsible for the $Corporation to just eat the cost of this insurance; it would instead have to raise the price of its products. It would also be prudent for the $Corporation to be risk averse and not change the code too often, leading to feature stagnation.

          The only winners in this situation are the insurance companies and chest puffi
      • Comment removed based on user account deletion
    • If you install free software on your production machine without properly vetting it you are not only a fool but should bear all of the costs yourself.

      If you install ANY software on your production machine without properly vetting it, you are not only a fool but should bear all of the costs yourself.
      • Granted.

        Now how are you going to "properly vet" a mysterious black box, by which I mean anything other than open source software? Read the vendor's documentation so that you know they think it is secure? Try to hack it yourself?

        I'll allow that there are a few private products with an established reputation for security solid enough that you'd consider taking their word even if the code hasn't been subject to public review.

        Personally, I favor the following - vendors should be respons
    • The parent poster is correct in that this would destroy hobbyist programming, at least in the US.

      I'm wondering if the GPL 3 should include a clause to protect against this kind of lawsuit as well as patent lawsuits.
      • Most other free software licenses also have something similar:

        11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
        FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
        OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
        PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
        OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
        MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
        TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
        PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
        REPAIR OR CORRECTION.

            12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
        WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
        REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
        INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
        OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
        TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
        YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
        PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
        POSSIBILITY OF SUCH DAMAGES.
      • I wonder if they can get around it by claiming the code is the documentation of what the program does. That way, if it does something wrong, it is perfectly documented that that is what it is supposed to do. If you don't want it to do that, let me know and I (the programmer) can change it. This would be kind of like the Microsoft argument of "it's not a bug, it's a feature", except with open source it is a documented feature that is subject to change upon request.

        Closed source applications wouldn't be able to use
    • Of course, it would also mean that programmers would have to gain the right of signoff. That is, the company cannot for any reason even pretend to ship the software unless/until all of the programmers sign off certifying that the code is clean.

      Of course, there is a huge can of worms here. What if the software uses 4096 bit DSA sigs for authentication, and next year someone figures out a way to defeat it? You really can't blame the programmers who used (at the time) the most secure authentication known.

  • Why not?! (Score:2, Insightful)

    by orangeguru ( 411012 )
    Almost all other professions have to take responsibility for their work and constructs - why are programmers an exception?!
    • Almost all other professions have to take responsibility for their work and constructs - why are programmers an exception?!

      Well, they aren't. Doctors and lawyers have certifying boards (here in the US) that can sanction irresponsible behavior, and the worst cases can result in lawsuits. Architects and engineers usually have to find an insurance carrier who will underwrite their work. That is what errors and omissions insurance is for. It is expensive and hard to qualify for.
    • Re:Why not?! (Score:5, Insightful)

      by Anonymous Coward on Friday October 21, 2005 @12:55PM (#13845594)
      We're an exception because you pay us next to nothing and give us no time to work. You don't hire an engineer and then tell him to build a bridge in a few weeks.

      Tell you what, I'll get licensed to write code and be legally responsible for it the day that customers are willing to pay about 8x what they currently pay for the software, and can wait about 4x as long. Can you make that happen? I'm waiting anxiously, because *I* don't get to make those decisions.

      So guess what...you want good code, hold the *EMPLOYER* responsible. I'll bet I suddenly find myself with all of the time I need to develop quality software.
    • Re:Why not?! (Score:5, Insightful)

      by Quasar1999 ( 520073 ) on Friday October 21, 2005 @01:01PM (#13845653) Journal
      As a software developer, I'll take responsibility for bugs in my code and the damages they may cause, the day that politicians take responsibility for their campaign promises and the crap they end up passing as law later...
    • Re:Why not?! (Score:5, Insightful)

      by cpuh0g ( 839926 ) on Friday October 21, 2005 @01:07PM (#13845713)
      Really? If your car's engine has a problem, do you sue the machinist who made the faulty part or just sue his company? Individual engineers who work for a company that creates software are responsible within the company, but should not be exposed personally. The company takes the ultimate responsibility for the products they produce. If they shortchange the development cycle in order to rush to market and the product is crap, the company takes the hit, not the engineer who wrote the code.
      • If your software crashes, you won't die. The best solution IMHO is to sell insurance to customers. You buy the software normally and buy insurance, from either a third party or the company providing the software, so that if you ever get problems with it, they are going to pay you.

        That way, the outrageous fees this would cost would be limited to those who actually want it, and those of us who test properly and have good security (including backup) practices can continue to act as we always did.
    • Say a building collapses because one of the supports was not welded/poured/constructed properly. Would it be a fair solution to hold the few construction workers that worked on that support responsible for any damage or loss of life? No, that is completely ridiculous. If a company sells a product it is their legal responsibility. The construction company has a right to fire those responsible, but they should not be able to sue them for mistakes.
    • Re:Why not?! (Score:5, Insightful)

      by timeOday ( 582209 ) on Friday October 21, 2005 @01:09PM (#13845732)
      Almost all other professions have to take responsibility for their work and constructs
      They do? I never sued a farmer because I cut open a watermelon and it was bad inside. I never sued a GM engineer because my transmission only lasted 60K miles. I never sued a weatherman because his forecast was wrong. I never sued a chef because I had a bad meal.

      My point is, there's a whole range of "bad things" that happen, from clearly negligent to uncontrollable, and a lot of stuff in between, and we make that judgement every day by assessing or not assessing blame.

      To construct large, complex software systems without bugs (including security flaws) is beyond the state of the art. In fact, it is beyond the state of the art by definition: if we could make today's systems bug-free, we could, and would, make even more ambitious systems by tolerating some rate of errors. Conversely, with today's state of the art, if we placed correctness (including security) above everything else, we'd have to cut way back on what we attempt, and charge a lot more. The market has already decided that's the wrong approach.

      • Re:Why not?! (Score:2, Interesting)

        by Anonymous Coward
        In no way can they hold me personally responsible. The company I work for makes all kinds of sacrifices to win the bid. General quality is what is mostly sacrificed, through ridiculous deadlines and cut testing time.

        I would love to have luxury of being able to build properly secure solutions and perform extensive system testing, but it's just not possible. The same is true for proper documentation and being pro-active during maintenance contracts.

        The worst part of it all is that the clients have gott
      • by kcurtis ( 311610 ) on Friday October 21, 2005 @01:24PM (#13845863)
        Most contracts result in the company owning all of the intellectual property. If the programmer can't own their work, then the owner should be responsible for it.

        Besides, it is a company's responsibility to sell good products. If they sell a product that is defective, it is often because they didn't do sufficient Q&A on the product, or rushed it to market.

        Bottom line is that if a car maker sells a car with a defective part (the tire lugs were defective), and it passes shoddy Q&A, it is the maker's fault, not the assembly line guy's. If it doesn't pass Q&A, you can be sure Ford won't sell it -- but the same doesn't seem true of software.
        • If it doesn't pass Q&A...

          Wow! I didn't know that software did Questions and Answers! OK, people - one last time...

          Q&A=Question and Answer (as in, "The speaker was available for a Q&A session following his presentation.")
          QA=Quality Assurance (i.e., the process by which a product or service is assured of suitability for use).

          The next one of you who messes this up is really going to get an earful.

          Sincerely,
          Mr. Language Nazi

    • Almost all other professions have to take responsibility for their work and constructs
      ...and so do programmers. They're generally held responsible by their employers, which is the way it really should be.
    • The Code of Hammurabi has one of the oldest product liability clauses in history: if a building collapses and kills people, the builder shall be stoned to death. One could make the punishment fit the crime in that way: the bug revealed your email address to spammers; force the programmer into reading spam all day....and MAKE him reply to all the unsubscribe links ;)
      • I find it interesting that pointing out that product liability as a concept is as old as the oldest laws gets modded trollish, simply for suggesting that programmers should be responsible.... Remind me to call the Help Desk next time Air Traffic Control has problems....
        -Now THIS is trollish!
    • Well, let's see what would happen...

      Assume I work for a company with 100 programmers working on the same code.

      Now, of course, I can't know what's going on with all 100 programmers at the same time. They may do something against the specs, and because of this, they FUBAR my code that follows said specs. Am I responsible? Do you trust a jury on that one, when you have to argue your case against your fellow employee? And what if the specs are absolute shit, who gets the blame there?

      Furthermore, what's the
    • Re:Why not?! (Score:4, Insightful)

      by hackstraw ( 262471 ) * on Friday October 21, 2005 @01:20PM (#13845827)
      Almost all other professions have to take responsibility for their work and constructs - why are programmers an exception?!

      Name them. At least those that do not require a minimum level of formal training or accreditation.

      Also, this is a moot issue because the lawsuits follow the money. If I were to sue somebody for faulty software, would I waste my time, money, and lawyer expertise on suing the developer that makes, say, $80,000 a year and probably has no real capital to speak of, or the multi-million dollar company?

      Also, when I was a developer, I was not the one that decided when the product was done testing and ready to ship. If I can't make that decision, then I have no liability. Period.
    • Re:Why not?! (Score:5, Insightful)

      by gstoddart ( 321705 ) on Friday October 21, 2005 @01:50PM (#13846103) Homepage
      Almost all other professions have to take responsibility for their work and constructs - why are programmers an exception?!

      Because an Engineer of Real World Objects(tm) won't ever have to, say, open the bridge for traffic while the road deck is still being attached because someone decided it needed to be released early.

      An Engineer wouldn't be told that we need flying buttresses, a bike lane, and cantilevered sidewalks two weeks before the bridge is supposed to open, but that it can't affect the delivery timeline, there's no time to test them, and no extra time to do the work, so we'll have to do it on our own time.

      Until such time as your employer builds in several extra weeks (months?) of testing for security, provides you with resources to do it, and brings in independent experts to help verify it, it will be completely impossible for professional developers to meet that standard.

      And as long as the company is selling the software with a license that absolves them from any blame, and helps to ensure they have that theoretically-perfect software, I'm sure as hell not putting my ass on the line with the ultimate responsibility for it.

      Just because the company made several million and the salesperson got a huge commission doesn't mean that I got paid any more for the effort when it was rushed out the door for reasons out of my control. Shit may run downhill, but no *way* it falls that far.
    • Uh, because what does 'work on' mean?

      If I inherit a massive codebase and add a few standalone utility functions, it would be easy to track, but most software projects involve inserting code in numerous places.

      Who's responsible? Whoever touched it last? It's not like when you buy a house and you know what responsibilities the plumber, electrician, and HVAC installers have.

      You'd have programmers suing each other. "That's your code." "No, it's not, it's your code"...every organization would have to use a c
    • Re:Why not?! (Score:3, Insightful)

      by x8 ( 879751 )
      Does an architect have to guarantee that every house he designs is secure against breakins?
      Why should a software engineer have to provide this same security guarantee?

      The level of security is a feature like any other feature.

      If you are paying the architect to design "Fort Knox", then it should be secure to top security standards; otherwise top security cannot be expected, because sometimes people just want a cheap place to live with doors that have locks.

      Do we expect every piece of software written, every
  • How about both? (Score:3, Interesting)

    by 8127972 ( 73495 ) on Friday October 21, 2005 @12:49PM (#13845520)
    Vendors (more specifically, the product managers, sales types, etc.) are under pressure to get products out the door to get sales and keep shareholders happy. That forces developers to limit the amount of time they spend writing quality software so that they can keep the PHBs happy. Net result: crappy, insecure software.

    BTW, this topic seems vaguely familiar. Is this a dupe?
    • by eln ( 21727 )
      BTW, this topic seems vaguely familiar. Is this a dupe?

      It's not a dupe, it's an "encore presentation."
    • Vendors are under pressure to get products out the door to get sales and keep shareholders happy. That forces developers to limit the amount of time they spend writing quality software

      Ahh, so you're saying that developers are to blame for the decisions of their managers?

      Yeah, *that* makes sense.
    • Re:How about both? (Score:2, Insightful)

      by whyne ( 784135 )
      Today's corporate structure has more than 2 groups involved. Excrement rolls downhill. The analysts and advisers explain the value of shipping a release this quarter or year (profit now for the quarter, later for the next quarter, then for the year...). Upper management leans on middle management. Middle then leans on lower management and supervisors. Then the developers work harder to bring it to life. Even if they do not like the product/method/model, a programmer may not be able to affect the outcome.
    • Yes, it is a dupe [slashdot.org]. Had 800+ comments.
    • How about neither?

      The customers are the ones who put crappy software into service and never hold companies or programmers accountable for the disastrous results. How many butt-in-the-air versions of Windows have corporations pressed into service in place of more secure solutions because of the 'ease of use' and lower 'TCO' arguments?

      When customers are willing to buy or, even worse, actually prefer 'good enough' solutions, all they ever get are 'good enough' solutions.
  • Kettle = black; (Score:5, Insightful)

    by LaughingCoder ( 914424 ) on Friday October 21, 2005 @12:49PM (#13845534)
    "the former White House cybersecurity adviser, argued at a seminar in London that programmers should be held responsible for flaws in code they write."

    OK. And to make it fair, let's let lawmakers be responsible for all the unintended consequences their legislation brings about.
  • by Godeke ( 32895 ) * on Friday October 21, 2005 @12:50PM (#13845540)
    Let's see: do we hold employees at an auto factory responsible when unrealistic timetables mean shoddy workmanship, or do we hold the employer who chooses speed to market over quality responsible? If that failure means the death of someone, do we sue the manufacturer or the guy who made the poor weld?

    Large software companies have more in common with factories than they do with law firms or medical practices, two places where the liability *is* on the individual. The employees don't get to choose how much time is spent designing quality and security into the product, nor do they get to choose how much quality assurance is done on the back end (although that is a lesser solution to quality code, it is still necessary).

    The day that every programmer is licensed the way that doctors and lawyers are is the day I will reassess this position, but for now programmers are *not* in the position to make the decisions that lead to quality code. I'm not convinced that licensing would ensure that, but without licensing, coders are nothing more than code churners cranking to the beat of the employer's drum.
    • Large software companies have more in common with factories than they do with law firms or medical practices

      Actually, this is true ... witness outsourcing. When's the last time you saw law firms outsource?

      BTW, how is this going to work if the programmer is a citizen of India? Are US prosecutors going to extradite him or her for inadvertent buffer overflows?
    • I agree 100% that the company, not the individual, should be the one holding the bag, but what happens to freelancers? Unless they can pass the liability on to the customer when they hand over the code (or otherwise shield their personal assets), virtually no one is going to be sure enough of their work to code outside the protection of a company.
    • by mrchaotica ( 681592 ) on Friday October 21, 2005 @01:23PM (#13845846)
      The day that every programmer is licensed the way that doctors and lawyers...
      In other words, the day when they start making "software engineering" a real engineering discipline, and letting programmers become Professional Engineers [wikipedia.org]:
      The earmark that distinguishes a professional engineer is the authority to "sign off" or "stamp" on a design or a structure, thus taking legal responsibility for it. (emphasis added)
      • In other words, the day when they start making "software engineering" a real engineering discipline, and letting programmers become Professional Engineers [wikipedia.org]:

        The difference being that on large projects, when the lead engineer says "can't be done" or "not on time", that decision is final, since anyone over-ruling the engineer has just taken on the responsibility that engineer is legally on the hook for. And he'll probably tell you to go to hell and say too bad.

        Whether it gets made into a "real en

  • insecure software (Score:3, Informative)

    by unix_geek_512 ( 810627 ) on Friday October 21, 2005 @12:50PM (#13845542)
    Having been involved in software development I can confirm that most companies are more concerned about cost than the security of their code.

    They would rather get the product out there quickly in order to produce revenue rather than hire more and better developers
    to secure the code.

    It is very sad....
  • by shdragon ( 1797 ) on Friday October 21, 2005 @12:50PM (#13845547) Homepage Journal
    I'd be glad to take responsibility for any code I write just as soon as they're willing to pay my new, updated fees. If it's really *that* important, shouldn't the client be equally if not more concerned with getting it done right than with cost?
  • it's all about EULA (Score:5, Informative)

    by Thud457 ( 234763 ) on Friday October 21, 2005 @12:51PM (#13845556) Homepage Journal
    That's ok, it's covered in the EULA -- the vendor's not responsible for anything. Since the developers are either employed by or are contractors for the vendor, they're similarly protected from any responsibility. So it boils down to caveat emptor -- test, test, and retest before accepting any software product.

    Too bad you have to click through the EULA before you can test it, suckers!

  • by fak3r ( 917687 ) on Friday October 21, 2005 @12:53PM (#13845569) Homepage
    • "White House cybersecurity adviser, argued at a seminar in London that programmers should be held responsible for flaws in code they write"

      The problem with that is when an employee writes code for a company, it becomes the company's code, so it would follow that any litigation should fall on the company, and not the programmer. I would also argue that the programmer doesn't release the software; that's up to the company, which *should* have testing and QA measures in place to find bugs and insecurity.
  • by Quiet_Desperation ( 858215 ) on Friday October 21, 2005 @12:54PM (#13845576)
    Everyone knows that insecure code is caused by code rot and magical error pixies.

    Next you'll be claiming that bad movies are the fault of the people making them, or that it's Britney Spears' fault she sounds like a howler monkey being run over by a bus.

    Sheesh. Scientologists...

  • by digidave ( 259925 ) on Friday October 21, 2005 @12:55PM (#13845585)
    I'm sick and tired of hearing talk about holding vendors or developers legally responsible for writing insecure code. It's impossible to write any complex application and not have security problems.

    The software industry operates more like the automobile industry: they know their cars will have problems, so they freely fix those problems for the warranty period. Software's warranty period is as long as the vendor or developer say they'll support that software.

    The major difference is that with closed source software, after the "warranty" period is up you can't usually pay someone to fix the problems. Open source provides a great car analogy, because after, say, Red Hat stops supporting your OS, you can still fix it yourself or hire a developer to fix it for you.

    This is why nobody would buy a car with the hood welded shut. For the life of me I can't figure out why anybody would buy software with the "hood" welded shut.
    • It's impossible to write any complex application and not have security problems.

      Tell me about it. It's too cool that I can always find an exploit in my credit card company's computer system, my bank's computer system, and the IRS computer system, so that I can simply raise my credit limit, lower my balance, put more money in my account, and never have to pay taxes. Next week, I'm going to start a nuclear war just for fun, because the password on WOPR is still "Joshua".

      WOULD YOU LIKE TO PLAY A GAME OF CHE
      • "Its too cool that I can always find an exploit in my credit card company's computer system, my bank's computer system, and the IRS computer system"

        You have no access to those systems. They are not on the Net. Give a good hacker access to a banker's terminal for a long period of time and you'll see him get access he shouldn't.
        • You have no access to those systems. They are not on the Net.

          Yes they are. At least my bank and my credit cards are on the net. I can transfer funds, see my balance, dispute a transaction, and open new accounts with any web browser.
    • Although you're right about the nature of the problem, the welded hood analogy doesn't fit.

      Automobiles typically need 'preventive maintenance' (PM) performed on them, such as changing filters, belts, and other mechanical parts that need to be replaced due to use and wear. The closest analogy to this in computers is defragging the hard drive, and maybe the occasional disk replacement or vacuuming out the dust.

      Automobile manufacturers have done the closest thing they can to allow you to do the PM, without
      • The analogy works for as far as I took it, which is all analogies are meant to do. Arguably, non-security related bug fixes are similar to automobile preventative maintenance, or at least similar to auto recalls.
    • I'm sick and tired of hearing talk about holding vendors or developers legally responsible for writing insecure code. It's impossible to write any complex application and not have security problems.

      Software is unreliable because we have been doing it essentially the same way for 150 years without stopping to think that there might be a better way. We've been writing algorithms ever since Lady Ada Lovelace penned the first table of instructions for a digital computer. It's time we reevaluate the algorith
    • Since you can't make something 100% perfectly secure, we shouldn't bother trying right? Just produce complete and utter crap like IE, IIS, Mozilla, etc?
    • Quality code CAN happen... but first things must change...

      Right now the environment in the business world today prevents truly bug-free programming. A lot needs to change:

      1 - Fire all the programmers and developers that can't program. We all know which ones in the group fit into this category. Unfortunately our bosses don't know. They're the ones that cause the majority of the bugs. They came into the industry just for money (pre-2000 bust) and they have no real feel for programming yet they know how to e

    • Another problem with these discussions is that they treat all 'flaws' the same. I can see at least 3 distinct categories that people should be thinking about:
      1. Malicious code, deliberately inserted into the program. I think everyone would agree the programmer should be held accountable in this case. Vendor could potentially have liability as well
      2. True bugs: code that doesn't function as the programmer intended. I think accountability would vary dramatically on a case by case basis. What percentage of
  • Is that developers are looking to make a profit, because that is what it comes down to when you do cost vs. benefit analysis... It's like when factories were required to put giant filters in place to help cut down on pollution, and if they didn't they were fined $1000... Well, if it costs a million dollars to implement the pollution solution (yes, that rhymes), then the cost vs. benefit calculation for a profit margin was simple for them (although the environment suffered...). Should there be penalties for bad security prac
  • Vendors or Developers To Blame?

    I don't know anything about what causes buggy software, but years of training by the press, television, and movies have meticulously prepared my brain to accept the oversimplifying fiction that it must be one of them and not the other.

  • So I think everyone is financially to blame, other than my client of course.
  • by Nevyn ( 5505 ) * on Friday October 21, 2005 @01:01PM (#13845648) Homepage Journal

    This all seems to be a rehash of the "worse is better" meme ... that those damn software programmers/companies aren't doing what we want. The only problem is, it's all crack [artima.com]. Almost no customers, even now, are willing to pay more for "quality".

    Yes, I think all other things being equal, people will go towards quality/security ... but it just isn't high on anyone's list. Cheap, features, usable ... and maybe quality comes in fourth, maybe.

    And, yes, there are exceptions ... NASA JPL obviously spends huge amounts of money to get quality at the expense of everything else, and I say this having written my own webserver because apache-httpd had too many bugs [and.org] (which comes with a security guarantee against remote attacks) ... but I'm not expecting people to migrate in droves from apache-httpd; it's got more features. The 90%+ market share have spoken, consistently, and they just don't care about the same things Bruce and I do.

    I have a lot of respect for Bruce, but the companies really are just producing what most people want ... so stop blaming them.

    • This all seems to be a rehash of the "worse is better" meme ... that those damn software programers/companies aren't doing what we want. The only problem is, it's all crack. Almost no customers, even now, are willing to pay more for "quality".

      That is slowly changing as the security and reliability meme becomes more common in the mainstream. In practice it was Microsoft's horrible run with security, which got a lot of press time, that began to bring security into public focus. It's still not entirely mainstr
  • There are laws that say that the employer assumes the liability for the employees -- unless the employee is acting so bad (e.g. going out of his way to kill someone) that you then say the employee is acting badly. This is why, for example, Domino's contracts with drivers to provide pizza delivery services: Domino's doesn't want the liability for auto accidents. I guess the law could be changed, but that's basically how it goes now. I don't see how it could go any other way: e.g. Billy Gates tells you to s
  • by dzfoo ( 772245 ) on Friday October 21, 2005 @01:06PM (#13845703)
    The real article by Bruce Schneier is in Wired:

    http://www.wired.com/news/privacy/0,1848,69247,00.html [wired.com]

    It's more interesting than the sound-bite-full ZD-Net summary.

          -dZ.
  • come on, really. (Score:2, Insightful)

    by CDPatten ( 907182 )
    I'm a developer and errors/holes in my code are my fault. Some could, in theory, be the fault of the framework I use, but typically, it's mine.

    People really overcomplicate this topic. Nobody is perfect, and people make mistakes. It really doesn't matter what excuse I use (deadlines, bad company decisions, whatever): if it's code I wrote, it's my fault. Even if I identified the hole and my boss told me to skip it, I still published flawed code. If I was perfect, it would be bullet proof from the get go, and
    • "I'm a developer and errors/holes in my code are my fault."

      Yes, but fault != financial responsibility for the consequences of the errors. Who is taking the financial risk by publishing the software? It's the same entity that will be reaping the rewards of sales of that software, or of other revenue streams derived from distribution of that software.

      "Even if I identified the hole and my boss told me to skip it, I still published flawed code"

      Not really. You wrote it, but your company published it.
  • The link in the post goes to an LA Times summary of the article. The real article is at Wired [wired.com].
  • Sorry: I screwed up my formatting and punctuation. Here it is again, this time with feeling.

    There are laws that say that the employer assumes the liability for the employees -- unless the employee is acting so bad (e.g. going out of his way to kill someone) that you then say the employee is acting badly.

    This is why, for example, Domino's contracts with drivers to provide pizza delivery services: Domino's doesn't want the liability for auto accidents.

    I guess the law could be changed, but that's basically how
  • The vendor... (Score:5, Insightful)

    by BewireNomali ( 618969 ) on Friday October 21, 2005 @01:09PM (#13845730)
    I don't code, but I don't think making developers responsible for faulty code is a good solution.

    If I develop X for a company that then takes X to market, and X turns out to be faulty, the company should be at fault. I am at fault for writing shoddy code, the effect of which will be that I get fewer future contracts or employment to do the same. The company is at fault for taking X to market, and as such should be responsible for any liability due to X's shortcomings.

    GM is responsible for a shoddy part on one of its vehicles, not the engineer that developed the part.

    Sole proprietors who take their code to market should be responsible, but in that instance, the sole proprietor is both developer and vendor.
    • They have TIME to do the tests and have made a conscious decision on what constitutes acceptable failure tolerances.

      NOBODY expects a car tire to perform acceptably under the load of a 747 suddenly going down on it so it suddenly accelerates to 120 miles per hour.

      No industry is subjected to performing perfectly each and every time with untested configurations of their components. Testing COSTS.

      Basically, it's the attitude that permitted the rise of Microsoft that is at fault here, and this attitude came from
  • by applecrumble ( 910692 ) on Friday October 21, 2005 @01:10PM (#13845738)
    A good start to our current security problem would be to stop writing internet-based software in languages that allow buffer overflows to occur (e.g. C, C++). 90% of security exploits are caused by buffer overflows. I've seen a figure like this in research papers, but it should be obvious to anyone from reading patch descriptions and current security alerts. Writing computer programs in these types of languages and patching the errors as they are found is simply not a scalable solution. It essentially means that if you write a program to be used on a network, you have to maintain and patch it forever, because you'll never catch all the buffer overflows it contains (e.g. the zlib bug; not a particularly large library, and it has been around for a long time). Picking a tool that doesn't even allow these types of errors is the obvious solution.

    In addition, we need to start using more granular security permissions for our programs. Blaming security problems solely on users is ridiculous. Could you explain to me why a program downloaded from the internet has read and write access to every file on my computer? Why it can open up network connections? Having root users is a start, but we need to be able to sandbox all the applications we download so they just aren't allowed to do anything bad. I see no reason why a user shouldn't be able to download and run any program they find, as they should all be sandboxed appropriately so that they cannot cause damage.
    • I see no reason why a user shouldn't be able to download and run any program they find, as they should all be sandboxed appropriately that they cannot cause damage.

      Sure, it may be a good start to remove some of the bugs, but who writes the sandbox? In what language? Is the sandbox itself sandboxed, to prevent being compromised? If so, who writes that sandbox? In what language? Is that sandbox itself sandboxed, to prevent being compromised? If so...

      It's not an "obvious solution." It's an "obvious t

  • by rsheridan6 ( 600425 ) on Friday October 21, 2005 @01:10PM (#13845748)
    If customers demanded secure software, then vendors would produce secure software. People instead buy software that's either the most familiar, easiest to use, cheapest, or has the most features checked off, so that's what vendors produce. That's why the utter pile of crap known as Windows 3.1/95 won while OS/2 and other more secure alternatives lost, and Windows continues to win over more secure alternatives today. Why should vendors spend their resources on something people have proven they don't care about?

    If people really cared about security, MS would have been driven out of business a long time ago, and other vendors would have taken note of that and made sure the same thing didn't happen to them. We would have more secure, less featureful, less convenient, more expensive software. But people don't care that much, so that didn't happen.

  • by Todd Knarr ( 15451 ) on Friday October 21, 2005 @01:11PM (#13845756) Homepage

    As a programmer, I'll accept liability for bugs in the code... the day I get the same protection that a professional engineer gets: if I say I need X for the program to be properly designed/written/tested, any manager or executive or marketer who says otherwise can be thrown in jail. No mere job protection, no civil remedies, jail time for anyone who tries to overrule me, same as would happen to a manager who overruled the structural engineer's specification of the grades of concrete and steel to be used in a building.

    Responsibility and authority go hand in hand. If they want to hand me the responsibility, they give me the authority to go with it. If, OTOH, they don't want to give me the authority, then they can shoulder the responsibility.

    • Not only is this exactly the correct solution to this problem, it also illustrates why anyone who calls themselves a "Software Engineer" is full of it!
      • I work with a bunch of licensed Electrical Engineers. I write engineering applications they use to do their work.

        My job title ?

        Senior Computer Programmer.

        Despite the fact that my degree came from a school of engineering just like theirs did, and I do complicated system design *and* have to understand all of the details of their engineering practices and workflow, I actually think my job title is more appropriate than that of 'Software Engineer'. Not that Engineer would be inappropriate, it's just that 'pro

    • You imply that authority and responsibility go hand in hand.

      Rather, IMO, responsibility consists of equal parts accountability and authority. If you are responsible for X, then you have authority over X, and are held accountable for it.

      If you agree to be held accountable for X without having authority to influence it, you've signed a recipe for disaster, regardless of what it's called. I guess "responsibility for X" is something nice for a boss to say, but it's a false premise unless you have accountabilit
  • Plenty of Examples (Score:4, Insightful)

    by jpowers ( 32595 ) on Friday October 21, 2005 @01:12PM (#13845761) Homepage
    You can make a case for this without worrying about impinging on the right to make free software. Peopleware really isn't worth the thousands of dollars it runs you. Solomon Accounting isn't worth the $100K it costs for a companywide install, Great Plains and larger packages like Deltek's Costpoint (actual install cost: $450K) are no better.

    They have weak or no APIs, the built-in tools aren't able to perform the most basic tasks the users want, and the customized workarounds take as much work as rewriting the software.

    I think the guy from the article has a point, as there are many businesses that spend many times any of our salaries running commercial software, and the people involved in the purchase have no idea they're throwing bad money at subpar products. I'm not sure he's talking about something relevant to most slashdotters: even those of us who work in IT don't really get to pick the accounting software people use; the CFOs pretty much run what they know, and we have to build the network around that accounting package.
  • Who Do You Trust? (Score:5, Insightful)

    by PingXao ( 153057 ) on Friday October 21, 2005 @01:13PM (#13845766)
    Who do you trust more?

    Noted security expert or political hack, ... noted security expert or political hack, ... noted security expert or political hack?

    It's not even close. On the credibility front Schneier has hundreds - no, thousands - of times more credibility on this issue than a political appointee out of the White House. Actually it's infinitely more credibility, because anything times zero is zero where the White House is concerned.
  • I am still in college for CS, so I'm not really in the loop enough to know about the real-world application of programming, but I am split on this subject. Is it the developer's fault bugs are in code? Yeah. But could it be helped? Probably not. The nature of programming alone makes it seem that almost everyone probably misses a few bugs (that are probably caught during testing before the product goes gold), since people do not work like computers. And from what I've heard, the time constraints for man
  • ...is that developers are not the only people responsible.

    Although it may vary from shop to shop, where I am currently follows a pretty standard model:

    • Business Analysts gather requirements from the prospective users.
    • Project Management creates specifications
    • Specifications are presented in a JAD session where they are gone over in a public discussion
    • Project Management and Business Analysts work together to deliver a formal Design Document
    • Second JAD session to dissect Design Document and petition for
  • If you're an engineer practicing on behalf of a single company, you're entitled to an "industrial exemption," at least in the state laws I've read. EEs working for Motorola don't need to be a PE. The company takes the liability, not the engineer. Businesses actively make a decision to place time to market over correctness, over "security." Forcing liability on people engaged in development is stupid and causes inevitable friction between the developer and the business who's hired him and ten others. If you
  • in different ways.

    If a developer produces shoddy code, then they can be reprimanded and ultimately fired if there is a clear sign of shoddy work and an inability on the part of the developer to improve.

    If the vendor deploys shoddy code, then they can be sued if it causes some sort of damages to the customer. It's the vendor's responsibility to ensure the software works fine before it goes out to the customer, and that involves having the right processes in place for quality testing, that are usually dedicated
  • Software Assurance (Score:5, Insightful)

    by Coryoth ( 254751 ) on Friday October 21, 2005 @01:34PM (#13845954) Homepage Journal
    Yes, software has bugs and mistakes and errors, and in a large project it can become infeasible to guarantee that there aren't issues somewhere. That doesn't mean, however, that software developers should simply ignore the issue. It's a matter of contracts and assurance: it is possible to make certain assurances about a piece of software and spend the time making sure it fulfills those properties. For instance, while you might not go to the trouble of ensuring a word processor is completely bug free, it may be worth providing assurances, for instance, that files cannot be corrupted when the program crashes, and that the print preview is exactly what will be printed. There are methods for proving and verifying such properties, and if you restrict it to key properties that the client wants formal assurance on, then it is not significant extra work to use those methods.

    The same principle applies to security. While you may not be able to say your system is completely invulnerable without expending enormous amounts of time and money, you can make certain formal assurances like "no buffer overflow exploits exist in this software" or "the software will always fully and correctly carry out security protocol X, or abort with an error and deny access". Such things don't ensure 100% security, but being able to formally make such assurances does significantly improve the expected security of the software.

    For some reason software has gotten stuck in an "all or nothing" mentality, claiming that since you obviously can't ensure perfection, you should assume nothing and make no assurances at all. That is neither necessary nor productive. Being able to formally guarantee certain properties of software is both possible, and only as much extra work as the amount of assurance you choose to provide.

    Jedidiah.
  • If an individual or a company produces software, and they are liable for any damages the software causes, what do you think is going to happen?

    End users may end up in a situation where they either...

    a) sign an agreement whereby they accept liability themselves, getting the product at a reasonable price - or free of charge.

    or

    b) pay the vendor an exorbitant fee for some form of software insurance - which would be necessary to cover costs should the company have to recompense for damages.
  • There are 2 kinds of issues:

    1) Design Issues. These are just plain sloppy design. But is there a way to mitigate this? Yes, with more extensive and proven frameworks. Still, designer lack-of-talent can doom an app. These can generally be easily fixed because they are at a high enough level.

    2) The harder problems come from the toolkits. This could be due to GLIBC or other "low level" libraries (LLL). I define a LLL as any toolkit which requires and permits you to manage memory implementation directly
  • If you're going to make individual programmers liable then you should also make corporations liable, because corporations are legal persons as well. Legal persons are called "legal" precisely because they can be sued. Indeed, the whole purpose of the creation of the corporation originally was to have a place where liability for engineering projects could be absorbed from individual investors. So I agree and disagree with both Schmidt and Wired magazine's Bruce Schneier [wired.com], the former focusing on persons and
  • Proposals like these are stupid and idiotic because half the time it simply ISN'T CLEAR what a piece of software should be doing. We simply don't have formally rigorous and perfect descriptions to meet real-world demands, and for the most part we CAN'T have them. If we had, we could use those instead of writing computer programs. For the time being, however, computer programs are the best descriptions we have.
  • we sue the inventors and implementors of the programming languages that are used to write insecure software!

    or better yet, we could sue the hardware manufacturers for allowing their hardware to run insecure software!
  • In my post [slashdot.org] from the last article, I already stated the many reasons why blaming the developer is simply ridiculous.

    But the fact that software, in general, has so many flaws is a simple matter of economics. As I said, in a different way, in my last post, people are willing to pay the current price for software with the types of flaws they have now. People simply wouldn't be willing to pay what software WOULD cost if it had few or no flaws.

    Left to its own devices, an economy can generally regulate such things
  • It's a ridiculous idea, but if it were to happen, it has the following implications:

    - Cost of software would have to go WAY up. Time to produce would increase significantly as well. Developers would take longer to produce software so more developers were needed or less software would be produced. MORE DEVELOPER JOBS. Better job security.

    - Increase in cost of software would cause more in-house development in order to save money, where there is no external liability as the software is not sold in the in
