Yet Another Software Sucks Article 32
Narril Duskwalker writes "This one's from cNet.
`There's only one problem with software development these days, according to security analyst and author Gary McGraw: It isn't any good.'"
Yeah C sucks... (Score:1)
And popular languages like C and C++ are really awful from a security standpoint.
I wonder what language they wrote the JVM in...
Re:Yeah C sucks... (Score:1)
Get off your soap box ... (Score:1)
I can now say "I got my thoughts published on the internet at a popular news site," just like him.
Re:Get off your soap box ... (Score:2, Funny)
Listen, pal, very, very few commuters run on electricity. Oh, yeah, you'll see a few GM EV1's or Honda Insights (which is only a hybrid, anyway) during your morning commute, but the vast majority of drivers still rely on the internal combustion engine.
Get your facts straight before you post, buddy!
Wrong Problem. (Score:2, Interesting)
The problem is that software vendors get away with using the laughable disclaimer that "this product isn't warranted for any suitability or purpose."
I'm not even sure that the kind of disclaimer above should be legal without a more concise "NOT GUARANTEED TO WORK" stamped across the splash screen.
If a company isn't willing to guarantee that a program fucking does something, why do they keep coming back to it? Because it's got a Madonna song and fluffy clouds in the commercial?
If a company consistently provides unstable software, why do people run to upgrade instead of demanding more comprehensive patches for what they've already paid for? Is rushing toward flashy new features more important than stabilizing what you've already got?
Re:Wrong Problem. (Score:1, Interesting)
Unfortunately this world doesn't allow for either of those things. Features that you don't need are needed by someone else, so they get put in. The company not only needs to compete with time to get the product out, but also with other companies who are also unwilling to give up any marketshare.
It may be a pretty bad system, but release product now and release fixes later is good business.
Re:Wrong Problem. (Score:3, Interesting)
We can blame Microsoft for this sick state of affairs. Until they came along, it was SOP for software licenses to essentially rent the software to the user, giving the author a stable revenue stream. Then Microsoft came along and realized that to sell software for microcomputers, for a number of reasons (lack of hardware support, lack of user interest, non-commercial nature of licensees), a different licensing paradigm was called for, so they went with something more like books.
The catch is, books are generally heavily proofread, but once they've been printed, you're stuck. Normally this isn't so bad, since most books work just fine with a few typos. Not so with computer programs: a single typo going unnoticed during testing can be a fatal flaw for software.
Unfortunately, the book model provides no significant incentive to fix released software. Sure, the major showstopper bugs will get patched if they escape testing, but most of the minor glitches and irritations will be left in until the next release, when the product can again be sold for a profit.
The real kicker is that the market for new systems is slowing, and with it the market for new software. Consumers are tiring of having to pay to upgrade to properly working software, but the last 20 years of paying once for software have led them away from accepting subscription-style payments, putting software houses in quite a bind...
So, if consumers were able to accept a payment model that didn't reward the perpetual ignore-bugs/add-features/release-new-version cycle, we might have non-bloated, functional software.
Personally, I give it another 20 years before the general populace has enough common sense about computers to force vendors to do the Right Thing.
Re:Wrong Problem. (Score:2)
I don't expect hobbyist, academic or non-profit software to have warranties (such as Linux), but I do expect all software that I purchase to have one (like the Redhat distribution of Linux). It doesn't have to be a fancy warranty, but it should be more than "we disclaim all merchantability even though we put a price sticker on it like it was". Don't claim that your software is usable then disclaim fitness for use. That's borderline fraud.
Re:Wrong Problem. (Score:1)
For security, things are further complicated because in general, functionality and security trade off badly against one another. Plus security is not a feature...it's a property.
We wrote "Building Secure Software" to help developers negotiate that tradeoff (and other related thorny tradeoffs). In the real world of "should have shipped it last quarter" developers concerned about security need all the help they can get!
gem
Gary McGraw
http://www.cigital.com/~gem [cigital.com]
Re:Wrong Problem. (Score:2)
Yes, but what other industry is allowed to include a blanket disclaimer for all effects of using a product? And what other industry can refuse to let you return your product when it doesn't work for you?
I reiterate that the current software warranties and disclaimers should be illegal.
Gimme specifics (Score:1)
Interesting article, but I'd like to see more specifics. I guess McGraw is (1) trying to sell his book and (2) talking to a wide audience as likely to include managers as developers, but what I really want are some meaty bits that, as a developer, I can directly act upon.
The FreeBSD site has some secure programming guidelines [freebsd.org] which are worth a gander.
Re:Gimme specifics (Score:1)
The second half of Building Secure Software [buildingse...ftware.com] has detailed chapters on: buffer overflows, access control, race conditions, random numbers, applying crypto, input validation, password systems, tamperproofing, and getting through firewalls. There's (too much) C, some Java, and a bit of python to make things real.
Security weenies tend not to understand that software lies at the heart of the security problem, choosing instead to throw firewalls and crypto at the problem and call it solved. You guys know better.
Software, the proactive security solution.
gem
Gary McGraw
http://www.cigital.com/~gem [cigital.com]
Some insight. (Score:1)
Does it occur to him that software designers might write complex, hasty code because expectations are high and time is short? And where, besides software, would computer security problems lie anyway? SCSI cables?
Re:Some insight. (Score:1)
Some people believe that computer security problems lie in the network cable connecting a LAN to the Net. That would be wrong.
gem
Re:Some insight. (Score:2)
Even if the software is bug-free, a bad system design can make the system insecure.
Take him seriously (Score:1)
If someone can't even use the English language "proper", how on earth can they be trusted to use any other language without dodgy grammar?
Maybe he was in a rush?
Try DeMarco's take on it... (Score:3, Interesting)
Kicking those who manage complexity is always going to be easy, but until you can do better you're not really helping.
The book is well worth a read... if only to shut up all those metrics freaks...
T
What the.... (Score:1)
Let's assume we take all the author's advice and read his book. Write some "secure" software that meets his requirements, and for all intents and purposes the software is "secure". Now take this piece of secure software and put it in an insecure environment, with no physical security, no network security, no security policies in place to use the software, etc. Suddenly your "secure" software is not secure anymore. So obviously, the problem is not software. Perfectly secure software will not solve security problems.
Let's use an example. Let's say somebody keeps breaking in my front door. I add locks and chains and somebody keeps getting in. So I go out and spend $10k on a front door that is bulletproof and guaranteed to be secure. Haha, I've got them now. The next day, they break my window and climb through. The door was just the easiest way to break in. Secure the door and there will be another weakness. The problem is much bigger than just the software.
In the same way you should not trust a newly invented crypto algorithm, you should not trust some random joe blow's opinion who got published on cnet.
Re:What the.... (Score:1)
Of course there is no such thing as perfect security or perfect software! I don't think I said that. It's all about risk management.
Here's what Bruce Schneier says about "Building Secure Software" in the preface that he graciously agreed to write for it:
"We wouldn't have to spend so much time, money, and effort on network security if we didn't have such bad software security...
Sometimes, network security can defend against these vulnerabilities. Sometimes the firewall can be set to block particular types of packets or messages, or to only allow connections from trusted sources. (Hopefully, the attacker is not already inside the firewall, or employed by one of those trusted sources.) Sometimes the intrusion detection system can be set to alarm if the particular vulnerability is exploited. Sometimes a Managed Security Monitoring service can catch the exploit in progress, and halt the intrusion in real time. But in all of these cases, the original fault lay with the software. It was bad software that resulted in the vulnerability in the first place."
A point well worth pondering. Now go read the book and *then* you will be allowed to rant.
gem
Gary McGraw
"a random joe blow"
Re:What the.... (Score:1)
10. Assume nothing. Question all assumptions and choices you make when you develop software. It's very hard not to be shortsighted. So it's really good to get other people to assess your design and help you with risk analysis. Something that goes along with this is: Never trust security claims people make. Computer security these days is chock full of snake oil.
Re:What the.... (Score:1)
gem
"snakeoil"
Just a normal evolutionary process... (Score:1)
Simplicity takes time. Complexity is easy. Early phases of evolution always spawn fantastically complex structures. Over time these get beaten down into simpler, stronger, and more subtle designs.
This is really a great opportunity for those who would try to dethrone Microsoft from its place as the software vendor of choice for many companies. Provide an alternative, use security as the winning card.
The challenge is to improve the existing structures, secure all the entry points into a network, assume that all contacts are potentially hostile, and generally turn the naive servers we use today into something resembling a real organism: alert, defensive, paranoid. Competition will then do the rest.
Reactions kind of funny... (Score:2)
Let's face it, the guy is, unfortunately, right. C and C++ ARE crappy languages, in that they rely on the programmer to ensure there are no buffer overflows. Other languages do offer such protection. OK, they may be different from what you are used to working with, but too many programmers (me included) don't check for overflows. As a result, we get bugs. Some merely crash the program or the system, some can be used to crack the whole box.
Who's to blame? Designers, programmers, consumers? It's tough work to retrofit security into an insecure design -- look at all the work sendmail has required in the last five years. In general, only those programmers who have been bitten by a security bug take the time to put in the extra checks -- it slows down the programming and the program. As for consumers, we'd all rather have the latest and greatest "features." (Like Clippy!) Until you've been bitten, of course; mine came when somebody hijacked my 14.4k dial-up connection to relay spam.
Why don't we return all the software that crashes, like everybody's talking about doing with the new copy-protected CDs? What do you need with the new Office XP that wasn't in Office 95, for example? The new, improved, crash-resistance? ;)
It turns out that software is way more complicated (Score:1)
"It turns out that software is way more complicated than it used to be."
[Wow. That's the most blindingly obvious comment I've heard for a long time!]
I feel that poor software quality boils down to (a) imprecise specification, (b) incomplete planning, and (c) development environments that are too damn fast.
I recall in the good 'ol days of computing having to wait hours for a compilation run to complete. You can bet that the thought behind code changes was considerable and that every line was checked very carefully, dry-running and all that stuff.
Nowadays it's hack, compile, test, hack, compile, test, etc., until it works. Why it works and whether it will hold up is a separate, and largely ignored, issue. And don't point at QA and say it's their job to find the bugs in your crappy code. It isn't -- it's yours.
So, who's for going back to really s l o w compilers as a mechanism for improving software quality????
determining number of bugs... (Score:2, Interesting)
No, the best method also factors in the competency of management, the competency of engineers, and the cost of failure.
Let's take as an example nuclear power plants that have operational control code behind them: how many lines of code do they have? I'd suggest tens of millions.
Why don't we see crashes of these systems widely reported?
1) They are safety critical: if an error occurs anywhere, the surrounding code must fail closed, meaning that it should not produce false results.
2) If you screw up you can't just say "hey we'll fix it in the next version" - if you are lucky you'll simply get your day in court for negligence and you will no longer have a place in the safety critical market. If you are unlucky that still happens but you then get the ass sued off you by the relatives of anyone injured, maimed or killed by your software bug.
You have to admit the second point really is one hell of an incentive not to screw up!