Holding Developers Liable For Bugs
sebFlyte writes "According to a ZDNet report, Howard Schmidt, ex-White House cybersecurity advisor, thinks that developers should be held personally liable for security flaws in code they write. He doesn't seem to think that writing poor code is entirely the fault of coders though: he blames the education system. He was speaking in his capacity as CEO of a security consulting firm at Secure London 2005."
Send jobs overseas, CMM (Score:5, Insightful)
As for the article's last point about CMM environments: it's not at all an indication that software has been developed by quality developers; all it means is that the code was developed using a reasonable development framework. CMM Level 3 means that you document your processes, and typically have peer review. Bad peers make peer review worthless - it does not guarantee good programs. CMM Level 4 involves "quantitative quality goals" by which productivity, quality and performance are to be measured. This is a bit better, but again it's a matter of where the bar is set. CMM Level 5 is about continual improvement, and is extremely strict. I think that CMM Level 5 is the only environment where one can actually be assured of reasonable quality code. I've seen way too much bad code come out of CMM-3 and -4 environments to give them much credit. If you've got great people, then a CMM-3 environment typically produces great results. For -3 and -4, what you put in is what you get out - not guaranteed greatness.
Who is the bad guy? (Score:5, Insightful)
Sheesh! (Score:5, Insightful)
Why not make CEOs personally liable for not putting the code through proper QC channels and selling it over-promised?
Made to sell, not to use? Whose fault is that?
B-)
Hold Government Leaders personally responsible (Score:4, Insightful)
I'd love to see some jail time or a fine for Mike Brown after Katrina, or how about some jail time for Bush for the false pretences behind Iraq?
Want me to pay 10x more attention when I code? (Score:3, Insightful)
Pay me 10x more. And don't be in such a hurry for your product to get completed.
Re:Right.... (Score:5, Insightful)
Not coders fault (Score:5, Insightful)
Right. (Score:5, Insightful)
Re:Wouldn't that be like... (Score:3, Insightful)
And we do that today.
Why should software be any different, except that writing bug-free software is probably just as hard as designing a "perfect" car?
Re:Who is the bad guy? (Score:5, Insightful)
Re:Send jobs overseas, CMM (Score:4, Insightful)
Re:Right.... (Score:5, Insightful)
Your comparison doesn't match, because developers would be held liable for a skill that they present as "professional". That would be similar to making the bricklayer accountable for a building coming down.
nonsense (Score:5, Insightful)
Oh, yeah (Score:3, Insightful)
CMMI (Score:5, Insightful)
Yeah, let's blame the developers. (Score:5, Insightful)
I am currently the Development Lead / System Architect at my company. In my experience, the majority of "issues" and/or "bugs" that I have seen crop up have been directly tied to poor requirements gathering by our "Business Analysts".
Often it turns into a real pissing contest between the two groups. Usually, after testing reveals that the grand vision of the BA is a crock, we revert to the original recommendation of the development group.
Yeah, let's blame the developers for the problems. That's the ticket.
Re:Who is the bad guy? (Score:3, Insightful)
One slightly contrived example...
A house has a door lock that's poorly made. A burglar jiggles the handle and it falls off and the door opens. You can bet yer bippy that the lock manufacturer is gonna hear from the homeowner's lawyer(s).
What a dumb idea. (Score:2, Insightful)
If we hold devs responsible how about politicians? (Score:1, Insightful)
In today's world of corrupt politics, it is hard to find ANYONE accountable for ANYTHING. Why should it be different for anyone else?
Just a thought.
Re:I can see... (Score:5, Insightful)
nice sound bite (Score:1, Insightful)
Bah (Score:2, Insightful)
Re:Wouldn't that be like... (Score:5, Insightful)
If you're going to attempt to compare apples and oranges, let's at least use an orange colored apple, shall we?
It'd be like holding car manufacturers liable for not making a car absolutely impossible to break into.
Chain of responsibility (Score:5, Insightful)
The vendor then holds the developer responsible. They are responsible for 100% of all vendor bugs that are not the responsibility of the vendor.
The developer then holds the programmer responsible. He or she is responsible for 100% of all developer bugs that are not the responsibility of the developer.
It's the way it works everywhere else. If you have a faulty product, you take it back to the shop. They then take it back to the manufacturer and if it's a fault caused by a specific individual, they either sack him or train him properly. The purchaser would generally not sue the guy on the production line or the designer, even if it was their fault.
There are good reasons for doing things this way. It prevents people from passing the buck. It means each entity along the line is wholly responsible for ensuring quality.
Liable for what? (Score:4, Insightful)
Nonsense.
Cheers,
Ian
He can't afford it (Score:5, Insightful)
There will always be a market for "cheaper" software that is not guaranteed to such a level, sold with support contracts instead, where developers will put in a moderate amount of effort to fix problems as they arise.
From another perspective, the market demands cheap software - not good software - which is why there is so much of it.
Sam
Dear Mr Schmidt (Score:1, Insightful)
Thank you for your insightful comments regarding security flaws in code. As a well-regarded member of the 'cyber-security' community, I find your perspective to be quite fascinating.
No doubt, in your long years as the former head of security with this community's favourite software development company, Microsoft, you gained much valuable experience in developing secure code.
I am not entirely clear how you envisage this 'personal liability' working in practice. Should we perhaps place a lien on a programmer's personal property, dwelling and car as soon as he or she begins development of software? This will no doubt have the beneficial effect of attracting many new recruits to this fun and exciting industry.
Might I also suggest, whilst we consider matters of personal responsibility, that we hold politicians and their appointees personally responsible for their actions. There is the small matter of the US national debt, which I am sure we could sit down and discuss at some length.
Kind regards,
Anonymous Coward
No problem (Score:1, Insightful)
Deal?
Re:Send jobs overseas, CMM (Score:4, Insightful)
We don't hold overseas companies accountable; it's not our job. We hold local companies accountable. They received the money from us. We don't care how they spend it or don't spend it. Normally these companies don't tell you upfront that they are the middleman. If they do, then their accountability is diminished. But in reality most of these companies say they are producing the code, and have their licenses and brand name on it. So you just hold them accountable. If the software screws up, they pay, not the overseas company.
Flawed Premise (Score:2, Insightful)
While programmer ignorance, incompetence and/or laziness certainly plays a role in the problem, there are other factors that should be considered:
(1) Death-march-style deadlines imposed by management, leaving no time for proper design, threat modeling, or testing.
(2) Security flaws in the underlying infrastructure (operating system, network, etc).
(3) Malice or stupidity of authorized users who bypass established safeguards.
Security is the responsibility of everyone involved in the creation, management, and use of a system, not just the hapless developer.
organizational problems are bigger part (Score:4, Insightful)
2. What about laissez-faire management that ignores any such processes that are in place in order to ship code on some arbitrary market-driven deadline?
Full of "Schmidt" (Score:5, Insightful)
Programmers don't draft contracts, they don't set deadlines, they don't make budget decisions, and certainly aren't responsible for failing to keep bugs out of a system that was (due to poor decision making in the aforementioned areas) designed to have bugs.
Re:CMMI (Score:5, Insightful)
That's where this sort of thing leads: insurance.
If something like this were to happen, there would be an immediate chilling effect on software development, followed by liability insurance policies similar to what doctors have. Software developers would start having this insurance, and then when the end users start making claims, the mighty insurance companies will simultaneously raise their rates and use their financial and political powers to buy laws that cap their liability.
Developers pay money, insurance companies get money, end users get screwed, politicians and executives get rich. This is called "building economic value".
Re:CMMI (Score:1, Insightful)
Re:Right.... (Score:1, Insightful)
Re:If anyone it should be the managers (Score:5, Insightful)
You got it right. Producing good code is a complicated process, not something one person can do. You need controls. You need reviews. You need methodical testing.
Why blame the developer who wrote the buggy code, and not the tester who missed the bug? What about the designer who produces a complicated bug-prone design?
Good software is a collaborative effort. You need a lot of people who know what they're doing working within a good process. Singling one person out in the system is misguided.
Profession (Score:5, Insightful)
1. You cannot become a doctor without long theoretical and practical training, intermixed with hard exams. All this is heavily regulated. To become a coder, you just have to pass a job interview. Software engineering certifications are optional and generally regarded as worthless.
2. Doctors are insured against malpractice. The costs are high, and generally passed on to patients.
3. Doctors can choose not to operate (administer drugs, etc.) if the action would constitute malpractice. In the software industry it's "use this braindead tool, or get fired".
4. Malpractice. OK, today's revolutionary therapy may be tomorrow's malpractice (or vice versa), and experts might disagree about some practices, but there is some sort of general agreement on what constitutes malpractice. I'm not sure whether IT is mature enough to speak of "malpractice" here.
To sum it up: yeah, you can make developers liable for their mistakes, but the consequences would be huge. The costs of IT would skyrocket. Are you ready to pay for that?
SECURITY IS NOT A STATE, IT IS A PROCESS! (Score:2, Insightful)
I'm so fscking SICK of these people who treat security as if it's something that can be permanently gained by doing A, B, and C.
BULL!
Security is about understanding your platform.
It's about knowing the strengths and weaknesses of said platform.
It's about maximizing the strengths and limiting/minimizing the impact and exploitability of the weaknesses.
It's about doing A, B and C, to get going. Then next week, you do D and E. Then think about implementing F. But make sure that it doesn't conflict with B.
Also, they need to understand that security is NOT about keeping people out of the system. Face it. If someone wants to get into your systems bad enough, they WILL get in. Regardless of your protections.
It's about making it so difficult to access it in an unauthorized manner that:
A: The invader gives up and moves on to easier targets.
B: Spends so much time trying to gain access that he gets noticed eventually.
C: Has to utilize truly heroic (and traceable and wildly obvious) means to gain access that he gets noticed right away.
So please, people! STOP with the damn pipe-dreams about "totally secure" systems already!
The only "totally" secure system is one that's been rendered down to shavings and disbursed in random geographic locations via wind, water, and other means of distribution.
Developers will take responsibility if... (Score:5, Insightful)
2. We get paid for the full development cycle, with no pressure to get it done on time, or even close.
3. The specs for the application never change from the written specs drawn up before it is written.
4. We are not responsible for any flaws in old versions when there is a newer version out there.
5. The latest version of the application is younger than 3 months.
6. The application went through full debugging and testing for 2 years, with at least 10 people per line of code.
7. The application doesn't try to keep compatibility with an older system.
8. It is used only on hardware whose specs were approved, and which was created before the release of the application.
9. And the developer wants to support it.
When developing a car or building a house, there is a lot more prework that goes in: they know what they want and how it works before they build it. Programming is not set up like that right now, because that would be too expensive for a single application or a custom application. Plus, it would make more people decide not to be a programmer if they were responsible for every line of code they ever wrote.
Programmers liable but big companies profit?? (Score:3, Insightful)
The programmer is personally liable, but the big corporation who employs him/her profits from the work? Wasn't the whole point of creating a corporation to put a degree of separation into liability?
Also, even if A Large Software Company promised to protect their own employees (some liability insurance as part of the benefit, say), this would still be bad news because it discourages independent programmers and coerces everyone into joining A Big Corp.
A better idea would be to make it optional, like certification by a licensed Software Engineer. Just like, for example, how you could build your own toolshed with wood and hammer, but to build a house, you have to get a Licensed Inspector or be a Licensed Civil Engineer or something. (Details fuzzy, but you get the idea.)
Okay, now to go RTFA.
Re:I can see... (Score:5, Insightful)
Re:Sheesh! (Score:5, Insightful)
Re:CMMI (Score:5, Insightful)
Yes, but if the hypothetical law was written so that the coder was responsible, as recommended by the ex-cybersecurity czar, it wouldn't matter how many levels of incorporation you hid behind.
:) Seriously though, I think this guy is barking up the wrong tree. You can put methods in place to improve software quality, but I don't believe it's possible to produce perfectly secure software, of anything more than very basic complexity, in a timely manner and for a price that people are willing to pay. Feel free to prove me wrong, but I haven't seen it done yet.
Well, it would probably eliminate at least 90% of the software being written, since there aren't many coders who would want to be held personally responsible for flaws in the code, especially since it's usually a complex team process where they don't always have the final say in the outcome. So I guess that would reduce the overall number of bugs, right?
Great news for the software industry... (Score:2, Insightful)
Contempt. (Score:5, Insightful)
This is the same reason patents on software are ridiculous: can you patent a love story plot? It's just absurd. This is another example of our society's run-away liberal government mentality. Big government stifles creativity and freedom, and crushes capitalism.
A case like this should be thrown out of court as a frivolous lawsuit and the lawyer held in contempt, but we won't get that from activist judges.
Re:CMMI (Score:3, Insightful)
If payments are widely varying and, as is the case, dependent on a doctor's wealth, NOT actual harm done (rich doctors pay more), it's extremely difficult to insure, and, even worse, becoming a better doctor won't lower insurance premiums! (This is because everyone will make some mistake at some point, and at that point the jury will award an amount closer to the doctor's net worth, meaning that over time bad doctors pay the same as good.)
The way to solve this is to agree to a specific schedule of payments if there are bugs as part of a contract to develop code. This avoids all the problems you describe above (like trying to get out of liability) and keeps down insurance costs for good coders.
Re:CMMI (Score:5, Insightful)
Re:CMMI (Score:2, Insightful)
It cannot be guaranteed.
People who plan for long-term storage of nuclear waste have as rule #1 that they acknowledge they cannot design a system that will defeat people determined to break in. If the army protecting it goes AWOL over the centuries, as happened at the great pyramids, well...
Even if you could prove mathematically a system was secure, there's still the social engineering aspect. Which, I see from various news stories, seems to account for a good chunk of these security "lapses".
And I don't think you could prove a system mathematically secure "in general" anyway, as people's data must go over a network, and people can crack encryption given enough time.
Re:CMMI (Score:5, Insightful)
Three words: Medical malpractice insurance. Take any side of this issue you want. In the end, patients get screwed somehow. You want this for software?
Re:Not coders fault (Score:3, Insightful)
Re:Deadlines (Score:3, Insightful)
If your employer says "get this ready tomorrow or you're fired", you're probably not at a good place anyway. And if enough people held such personal convictions, the employer would have no choice.
That said, all too many developers don't do their share of documentation or follow proper development practices. How many developers write doxygen/javadoc comments? How many developers verify their code? Write use cases at least? etc...
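And for the record, the kind of comment I'm talking about takes about a minute to write. A contrived doxygen example (the function is made up, not from any real library):

#include <stddef.h>  /* for size_t */

/**
 * @brief Compute a CRC-32 checksum over a buffer (hypothetical function).
 *
 * @param buf  Pointer to the input data; must not be NULL.
 * @param len  Number of bytes to read from buf.
 * @return     The CRC-32 of the first len bytes of buf.
 */
unsigned long crc32_of(const unsigned char *buf, size_t len);

Six lines, and the difference between a library someone can actually verify and a pile of mystery functions.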
There should be liabilities for software developers. Otherwise what are they worth if any "monkey" can develop software?
Tom
Re:CMMI (Score:2, Insightful)
Re:If anyone it should be the managers (Score:5, Insightful)
Microsoft (in days of old) was criticized for raiding the top developers from other companies and universities. So with the top developers in the world we got Windows, Office and IE. (I don't think there is a need to say what people think of the quality here.) Google is now the one raiding the top coders, yet they are still producing some buggy code.
If the best in the business can't produce secure, bug-free software, how is anybody else going to? Granted, we should all strive to make the most secure and bug-free code possible. But I really don't think it will be a common practice until the management of the process is figured out.
We've seen waterfall fail, over and over and over and over.
RUP, while an improvement, still falls short.
Agile (XP, etc...) tries to address some realities of development, but it still doesn't really manage it.
Still, we do see some really good software pop onto the scene every once in a while. Even this is a symptom: the same groups who produce these gems often fail to repeat the process on other projects.
Re:Software Engineer - Oxymoron (Score:4, Insightful)
So while I agree with the sentiment that bug-free software is possible, this notion that no software [or hardware, for that matter] should ever have a bug is ludicrous and isn't even reflected by the "real engineers" [e.g. people who build buildings, roads, bridges].
Tom
Re:Insurance is bad, mm'kay? (Score:5, Insightful)
Insurance was created as a concept to deal with the fact that in a purely capitalist society there is no sense of community or common good and no one will help you when you need it most. Does anyone actually consider it to be an efficient and effective means of addressing this need?
Re:He's right, and it'd be good for IT (Score:3, Insightful)
Wouldn't this utterly destroy the Free Software movement? (Incidentally, we'd probably lose the internet too).
Re:CMMI (Score:2, Insightful)
Re:CMMI (Score:3, Insightful)
Hell, then they really don't want to piss me off. Then I'll just GPL it and make ownership topple like dominoes.
Re:CMMI (Score:3, Insightful)
I don't know how it works where you are, but 'round here people pay car insurance based on how everyone else drives (factors like age, gender etc. can play an enormous role in the rate, regardless of the driver's own record).
Re:CMMI (Score:3, Insightful)
On a more pragmatic level, there are a number of differences between the more traditional professions held liable and that of the code-monkey. Most important, to my mind, is that at the larger firms much work seems to be done in teams, so tracing down exactly which individual is personally and exclusively responsible for a specific bug would be computationally expensive on a grand scale. This, therefore, would seem to point to a larger corporate liability, which I guess is fine for the truly large corporations, but could kill a small company or an open source group without a second thought. I have yet to see any large company (*cough* Microsoft *cough*) actually be held truly responsible for its mistakes and bugs, so this obviously hasn't happened yet.
Re:Developers will take responsibility if... (Score:1, Insightful)
In trying to maintain some sanity in the development process, I am constantly reminding project managers that I will be the one held responsible for errors, despite the fact that I didn't write the ambiguous specs, set the unreasonable schedule, or under-estimate the required QA resources. This is just another example of the misunderstanding that most people have of the software development process - coders get blamed for bugs while managers escape accountability (and get promoted for getting projects out "on time").
money (Score:5, Insightful)
We create a "secure" web browser, but it's gonna cost $10K per copy. This will cover the cost of development, security auditing, extra QA, and the dev cycles that go along with it. Since the OS can't be trusted to run the browser, it will only work on a dedicated browsing computer with no operating system. Since other people's code poses a risk, it will not run javascript, java, flash, or any kind of plugin.
Who would buy this?
If developers are carrying malpractice insurance, then the insurance companies are going to have a lot to say about how development is done, and *if* it should be done. Your boss hands you a project specification, you send a copy to your insurance co. You then tell your boss that you can't work on his project because you won't be covered.
Developers are going to have to charge a lot more for their services. Both for the personal risk involved and to cover the cost of insurance.
Programs can be made "more" secure and have "fewer" bugs, but it's going to take more time. Time = money. Look how everybody is whining that Microsoft is taking too long with the next version of Windows. Maybe if they want it to be *secure and bug free* they'll tell MS not to rush, to take a few extra years to be sure about the product, and they'll pay more for it.
Re:Chain of responsibility (Score:3, Insightful)
Actually, you're not far off the mark here.
In any company, there is one person and one person alone who's responsible for the defective product -- the CEO. If payroll isn't met on time, that's the CEO's fault. If someone gets mugged out in the parking lot because there wasn't adequate lighting or your building security was nonexistent, that's the CEO's fault. If there's no toilet paper in the bathroom, that's the CEO's fault. If the company fails to meet its sales expectations, whether it's because the sales staff sucks, the marketing staff screwed up their job, or the engineers decided that the 40% chance of the widget blowing up and unleashing a bolt of lightning into the frontal lobe of the user was "good enough"... that's the CEO's fault.
The CEO is the chief executive officer. He's responsible for everything that his company does and everything that happens at the company. The weight of the world is on his shoulders, and it's why he gets the big bucks, the golden parachute, and the nice office.
It's also why it's his responsibility to make sure that the developers that get hired by his company have either been trained properly or get trained properly. It's why it's his responsibility that project managers know what the hell they're doing and make sure that when you design "end-to-end solutions" that they don't have gaping security holes like customer data passing into the accounts payable system in clear text. It's why it's his job to hire a CTO that understands all of this and can hire the project managers and programmers necessary to do the job right.
Schmidt is trying to pass the buck for his mistake. It's as absurd and cowardly as a general trying to say he lost a war because his soldiers didn't fight hard enough, or an NFL coach blaming the loss on his kicker missing a field goal.
If he was serious about getting it fixed, there'd be a lot less whining to a trade rag trying to pin the blame on his employees, and a whole lot more fixing it.
Leadership for the win. \o/
Re:organizational problems are bigger part (Score:5, Insightful)
If Ford has a car with faulty steering that locks and causes me to be in a very bad accident, should Ford be liable? IMO, yes. Should the engineers be personally liable? IMO, no. It is up to Ford and their management to hire competent employees and competent management to make sure those employees put out a safe product.
Imagine what would happen if people were allowed to sue an individual employee because of a faulty product. The cost of labor for _any_ technical job would go through the roof, because those engineers, developers, machinists, etc. would all need to buy personal liability insurance, just like doctors have to. One of the reasons doctors _have_ to charge so much here in the USA is the cost of insurance to protect them against sue-happy lawyers and people. Top surgeons can easily pay $100,000+ a year just for insurance!
Re:CMMI (Score:2, Insightful)
Re:Developers will take responsibility if... (Score:3, Insightful)
Get real! No pressure to get it done on time? What other engineering discipline would this be acceptable in? None. "Sorry sir, your bridge is not built yet - but we don't feel pressured to complete it in the timeframe we said we could do it in".
The world changes. Deal with it. Or be unemployed. Requirements change, just a fact of life.
What?! So, never retro-fix a serious security flaw into a product just because a newer version exists? So, in your world you would just give your customer the one-fingered salute and tell them to upgrade? You'll find yourself without a customer.
I have no idea what you mean by this. Are you saying you would not take responsibility for anything you produced which is older than 3 months?!
This is engineering. There is an acceptable level of failure.
Fantastic! I'll just tell the bank that consults with me that they need to upgrade every system they have because our new application doesn't like talking to anything which is not 0-day. I can see that going down well.
Yep, valid point. This should always be the case. Although now you can't beef up the servers in case of performance problems, even though this is the cheapest way to do it.
Now you're just making me laugh!!
There are so many (Score:3, Insightful)
For example: today I set up HPLIP for the first time, instead of HPOJ, for my PSC2110. What a pain. I had no problems configuring or making, but then there was an issue when I tried installing. Clearly the HPLIP programmers' fault, right? Or was it that I was using a Slackware derivative with a mixture of packages, and as a result many libraries and config files were in non-standard places? I would have guessed that if ./configure && make worked, everything was found properly. But it wasn't. If my nonstandard config was the problem, then perhaps I'm responsible. Eventually I got everything working, but with one caveat: I could only scan as root.
In the real world, if this happens to a litigation-happy individual who likes to bill $400/hour, he'll sue:
There are different categories of software (Score:5, Insightful)
I write software that is usually only run on one or two computers at one location, and it's constantly modified to add features, fix bugs, etc. Our company and our customers can't afford to pay triple the cost for the stringent software testing that a huge Micro$oft type place would have, so a law making the programmers personally liable would make all custom software prohibitively expensive.
We do sell our code with a 1-year warranty, so we agree to fix all bugs that come up within the first year. However, the agreement is not a guarantee. If there is a bug, we agree to fix it, but we're not going to compensate the customer for lost production or expenses.
There is software in this world (I'm thinking the QNX kernel here) that actually comes with a guarantee that it works as documented. The company (QSSL) has liability insurance just in case. Of course, that makes QNX licenses more expensive than they would otherwise be.
Most software comes with a disclaimer. Microsoft tells you that the user accepts the liability for any bugs. Even though nobody reads that disclaimer, it still exists. Right now you have a choice - you could hire someone to write code and give you a guarantee (expensive), or you could just buy something off the shelf (cheap) that would probably work ok most of the time. The article is talking about removing that choice.
Not a good idea. (Score:3, Insightful)
What if the user doesn't run it under the conditions specified (e.g. connects it to the internet when the internet was not covered by the specification)? Should the developer be liable then? Of course you could hold the developer liable no matter what, but that would put software development in a different position from all other products. E.g. should the contractor of a tall building be held responsible for the damage to a parked car outside the building caused by somebody jumping from the roof in the act of committing suicide? I think not, even though the errors in building construction making this possible, and the means to fix them, are much more evident than most software problems.
The only thing that will happen if this is introduced is that software prices will go up radically, as software companies or individual developers need to make sure they make a profit even if they have to pay damages now and then. I.e. the price of the software will have to cover more lawyer and insurance fees. If this is introduced in a country, the cost of running a business there will increase significantly, and I am not just talking about the software business. How many businesses could afford to have the cost of their IT infrastructure increased by several orders of magnitude? A country that introduced such laws would kill all business that needs some kind of IT support, at least if it did not also have very high customs fees or taxes on imported products and services.
As for the software industry of such a country, you would probably see fewer and bigger companies, with the money to bury customers claiming their rights in legal process for a very long time, perhaps until they go out of business before they get their money. The fact that there were fewer actors in the market would in itself raise the price of software due to less competition. It would also slow down the speed of development. If you, for instance, created a new version of an office productivity suite, you would probably want to test it for several years on a group of subjects who have waived all their legal rights before you release it to the general public. Then you would want to profit from that investment for a very long time. Perhaps 20 years or so.
And people want software held to a higher standard (Score:5, Insightful)
1) It is not resilient to attacks. If someone wants to break in and steal it, it's very easy to do. Trivially easy for someone with training. The manufacturer has done NOTHING to fix this. In fact, all suggested solutions are just band-aids; they don't really do anything. Stronger glass, a kill switch, the Club: all are easily defeatable. They offer me no absolute security against attacks.
2) My car does not deal with user error very well. If I put it in neutral and floor it, the engine will overheat and seize up; there is no cut-out. If I put toothpaste in the oil tank instead of oil, I'll ruin the engine. There is virtually no protection against me making mistakes, and many of the mistakes will permanently disable the car.
3) My car doesn't handle unexpected situations well. If it suddenly hits a brick wall, it will be damaged or destroyed; the same if another driver suddenly collides with me. It only operates properly under normal circumstances.
What's worse? They KNEW about all these problems from the car's inception. They sold it to me, knowing these problems, and are doing NOTHING to fix them! Even upgrading to a newer version of my car (for which I must pay full price) won't fix them.
So I find it absurd to say "we have to hold software to the same standard as cars" and by that mean that software should be perfect. Cars aren't perfect; by software standards they are buggy pieces of shit. I expect software to be essentially immune to any malicious attacks. If a flaw is found, I expect it fixed in a timely fashion for no charge. Likewise, I expect software to deal with user error well and not blow up if I do something wrong. However, if I told you I wanted a car that did all that, I'd be laughed at.
Re:Who is the bad guy? (Score:3, Insightful)
I don't remember ever seeing a piece of software that wasn't provided "AS IS, WITHOUT WARRANTY OF ANY KIND INCLUDING THE IMPLIED WARRANTIES OF FITNESS FOR A PARTICULAR PURPOSE". Maybe the military or NASA can afford to buy software that has a purpose, but so far, all the software I have ever installed was somehow useless by design, since none of it should serve any purpose... Hard to hold me responsible if I sell you something and tell you in BIG CAPITAL LETTERS that whatever I'm selling to you is useless.
Re:Software Engineer - Oxymoron (Score:2, Insightful)
However, a lot of people here have stated that it is flat out impossible to write code with no bugs. Almost as if it were a law of physics, or religious dogma. It is, of course, also a handy excuse for writing buggy software, and a great way to dodge the responsibility.
It IS possible to write bug free software; I know because that's what I do for a living. We write software for railroad traffic control systems. An unsafe failure can easily lead to dozens of lives being lost. For an analogy, picture an intersection of two busy four-lane highways, where the traffic lights once in a while all go green at the same time.
There simply can't be bugs in our finished software. The procedures and methods we use to ensure this are time consuming and expensive, but we have no choice. And if, God forbid, somebody died because of a bug in my software, then I would be responsible.
Re:organizational problems are bigger part (Score:4, Insightful)
And what if Ford sells you a car that fails to leap to the side to avoid an imminent collision, causing you to get into a very bad accident? And if Ford sells you a car that can drive into a building at 100mph? And if you use your car in some extreme environment that causes the brakes to degrade rapidly? What if the steering only locks after 20 years of use? I think you need to make a distinction between gross negligence and simple physics. Certainly if Ford misrepresents the capabilities of the auto that is different, but one simply cannot expect everything to work perfectly at all times. Life is fatal; everything is a tradeoff of risks, and at the end of the day you've got to watch out for yourself.
There's also a big difference in that if I drive a faulty car (which there are various regulations against, or at least manufacturers must meet various regulations before they can sell a car), I put you in danger. If I use faulty software, I only put my data in danger (ignoring worms and the like). I'm not really interested in paying more for higher quality because you think I should.
That leaves the question: if my faulty software damages your data because it contracted some malware that attacked you (or perhaps it's just faulty somehow), then who is at fault? Should the internet be regulated like roads are? I would like to think "no, certainly not", but who knows. Would regulation even improve things? Highly unlikely, I think.
Re:Software Engineer - Oxymoron (Score:3, Insightful)
The problem is that the few of us who actually care to do things right at the start usually get pushed aside by the peeps who want a really quick solution. Of course, down the road a proper start usually ends up saving time and money, but management doesn't care about that.
For me the best compliment I get is "it just works". As in, people use my libraries [and various programs] and they "just work" as advertised [e.g. documented]. People also seem surprised that I document stuff [e.g. I have a manually typed/formatted manual in LaTeX as well as doxygen comments].
On my part I do things like make rational design choices [e.g. clear function names, consistent parameter orders, return values, expected behavioural models] because *I* want to use the code. The fact that it helps others [the code is public domain] is purely immaterial.
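To make that concrete, here's a contrived sketch of the kind of consistency I mean (a made-up buffer API, not from any real library): destination first, lengths explicit, and every function reports errors the same way.

#include <stddef.h>  /* for size_t */

/* Hypothetical API: one convention everywhere. Destination comes first,
 * lengths are explicit, and every function returns 0 on success or a
 * negative error code. */
int buf_copy(unsigned char *dst, const unsigned char *src, size_t len);
int buf_append(unsigned char *dst, size_t dstlen, const unsigned char *src, size_t srclen);
int buf_compare(const unsigned char *a, const unsigned char *b, size_t len);

Once a caller has used one function, they've effectively learned them all. That's most of what "it just works" comes down to.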
And I think that's the trick. Most "really crappy" software is written by people who
a) don't know better
b) won't be using it themselves in future work [e.g. it works now, I'm done]
c) see a)
Look at some kernel modules, for example. They're for the most part horribly written, but that's solely because once it works once they think their job is done.
Then you have a host of really buggy pieces of commercial, shareware, freeware and OSS tools that come from people who bite off more than they can chew. They come out of college or university without a single successful project under their belts, and they assume they have unquestionable knowledge about the nature of the universe.
What's worse is some of these people turn into 50 yr olds with a chip on their shoulders about a golden yesteryear.
If people just wrote code under the working assumption that they'd have to see it again one day, you'd see more modular, flexible, well documented and thoroughly tested code. Or they're stupid for inventing more work for themselves...
Tom
I'd even argue the company angle (Score:5, Insightful)
Here's an interesting question. A piece of software that is written to work with Windows has a security flaw in it. The security flaw creates an exploitable condition in Windows such that you can gain total control over the system. Whose fault is it?
Obviously there was a security flaw in the software that you were using, but then it wouldn't be that critical if Windows handled its security better. So isn't Windows partially to blame? And what if you set it up in an insecure manner? Isn't that your fault? Or is it the developer's fault for not making it more idiot-proof?
Now taking that down to the code inside of a program is just ridiculous. If you've got a team of 10 people (which is small in the grand scheme), each one of them could, individually, write totally secure code. However, come integration time, it turns out that they are opening up holes in each other's code. So then whose fault is it? What about QA? Shouldn't they have some liability too?
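A contrived C sketch of what I mean (made-up functions, not from any real product) - each half honors its own contract, and together they still overflow a buffer:

#include <stdio.h>
#include <string.h>

/* Developer A: read a name of up to 255 characters. Safe on its own terms. */
static void read_name(char *out, size_t outlen) {
    if (fgets(out, (int)outlen, stdin) != NULL)
        out[strcspn(out, "\n")] = '\0';   /* strip the trailing newline */
    else
        out[0] = '\0';
}

/* Developer B: log a name, quietly assuming it fits in 64 bytes. */
static void log_name(const char *name) {
    char line[64];
    strcpy(line, name);           /* overflows when name is 64+ chars: the integration bug */
    printf("user: %s\n", line);
}

int main(void) {
    char name[256];
    read_name(name, sizeof(name));  /* A's contract: anything up to 255 chars */
    log_name(name);                 /* B's assumption: always under 64 */
    return 0;
}

Developer A never writes past his buffer, developer B's function is fine for every input he was told to expect, and the hole only exists in the combination. So whose fault is that?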
Finally there's the PHB factor. You could have a group of the best, most security-knowledgeable programmers in the world, and they could still screw up due to lack of time and resources. What if the boss tells them to do something that makes the system innately insecure? Whose fault is it then, his for telling them to do it, or theirs for not pushing back on the requirement? Not to mention what happens after people have worked a few months of 60-hour weeks trying to get a project done.
In the end, liability is just a dumb concept in computers. This is one of those places where the invisible hand of the marketplace is the best correction. Companies that routinely write buggy software will, by and large, be smacked by the marketplace. The only exception to that rule is companies like Microsoft who have an effective monopoly. But then that's why we have anti-trust law, isn't it?
Re:Contempt. (Score:3, Insightful)
Right. This was proposed by a former member of a Republican administration, who was appointed and served at a time the Republicans controlled the White House, Senate, and House of Representatives.
Oh yeah. We're overrun by run-away liberalism.
Professional Engineers, Practices and the Industry (Score:3, Insightful)
But getting to be a P.E. involves overcoming the standard challenges and it isn't for everyone. A lot of engineering in non-software fields seems based around working with known processes and known parameters to produce a product or some result.
The reason bridge building is a pretty sane discipline is that the characteristics of materials and the physics of bridges are pretty well explored. When a Civil Engineer builds a bridge (or designs one), he has good computer-aided tools to do it, standard catalogs of parts and materials, and he knows all about tolerances, safety factors, and good processes. He couldn't sign off on the project otherwise without taking his career in his hands.
Contrast that with my work, where I have to build applications using an OS I know is inherently flawed (they all are, but some more notably), it must be designed to work on a wide variety of hardware platforms (many of which I don't have on hand), it must often work with other people's code from outside my organization which is bleeding edge and often of dubious standards, and it is built with tools I only mostly trust and on top of libraries from the OS provider and from third parties into which I have no visibility. There are strategies to mitigate risk, but I'd be very damn leery of signing my name or affixing my sigil in a P.E. context to even my best code - because I know the system it is part of has so many components I don't control and so many points of failure.
One risk mitigation strategy involves extensive testing (some say up to 90% of project cost). Anyone interested in paying $1500 for a copy of Office? I don't see many hands.
I'm all for seeing an improvement of professional standards and practices in the field, the injection of more engineering approaches, etc. But the software field moves faster (IMO) than any other technical field. It is also the one in which you have the least faith in the parts you build with. Until reform happens *across and throughout* the field, any effort to go after companies or individual engineers is a waste of time.
Let's put it another way, more succinctly: if I had to sign off, in a legal-liability sense, on the code I've been writing for the last two years on the current contract, I imagine I'd have written about 10% of the code I have written, and I'd have demanded a *lot more* from the people supplying me with 3rd party code to integrate. Since I know the business model wouldn't support that (the costs would kill the product as it stands), I have to think this approach is only viable once we decide we don't want 'the next new thing' in software and that we care about what we get enough to pay for it.
Someone compared the effort to Ford or GM making cars. If you want to spend $15-50K on a computer, I'm sure we can offer you a much higher level of reliability from the software. Heck, at those kinds of costs, you might get the same sorts of warranties you get from Ford and GM, though they warrant only as much as they can get away with. But if you want to pay under $1000 for the hardware and under $1000 for the principal software, then you might as well expect something that works about 1/10th as well. And it seems to me you've got that.
So, who here is lining up to buy the first $15K personal computer?
Nice idea, don't see it happening anytime soon.
Re:Professional Engineers, Practices and the Indus (Score:2, Insightful)
Sorry, got on a little bit of a rant there.