Yet Another Software Sucks Article

Narril Duskwalker writes "This one's from CNET:
'There's only one problem with software development these days, according to security analyst and author Gary McGraw: It isn't any good.'"

  • And popular languages like C and C++ are really awful from a security standpoint.

    I wonder what language they wrote the JVM in...

  • IMO, this guy needs to get off his soap box and stop stating the obvious. He makes a lot of valid points, but he stopped just short of saying "comuters run on electricity, and you need to plug them in to use them." I'm amazed that this article got published at CNET and that I actually read it. Hell, if I ran this post through a spell checker, I would only be a half step behind him.

    I can now say "I got my thoughts published on the internet at a popular news sight." just like him.
    • comuters run on electricity

      Listen, pal, very, very few commuters run on electricity. Oh, yeah, you'll see a few GM EV1s or Honda Insights (which are only hybrids, anyway) during your morning commute, but the vast majority of drivers still rely on the internal combustion engine.

      Get your facts straight before you post, buddy!

  • Wrong Problem. (Score:2, Interesting)

    by Snowfox ( 34467 )
    The problem is that consumers are willing to put up with crap because they buy marketing promises instead of software.

    The problem is that software vendors get away with using the laughable disclaimer that "this product isn't warranted for any suitability or purpose."

    I'm not even sure that the kind of disclaimer above should be legal without a more concise "NOT GUARANTEED TO WORK" stamped across the splash screen.

    If a company isn't willing to guarantee that a program fucking does something, why do customers keep coming back to it? Because it's got a Madonna song and fluffy clouds in the commercial?

    If a company consistently provides unstable software, why do people run to upgrade instead of demanding more comprehensive patches for what they've already paid for? Is rushing toward flashy new features more important than stabilizing what you've already got?

    • Re:Wrong Problem. (Score:1, Interesting)

      by Anonymous Coward
      In an ideal world software makers would have enough time to fix all bugs and ship only those features that are necessary for *you* to get your work done.

      Unfortunately, this world doesn't allow for either of those things. Features that you don't need are needed by someone else, so they get put in. The company not only has to race against time to get the product out, but also against other companies that are equally unwilling to give up any market share.

      It may be a pretty bad system, but release product now and release fixes later is good business.
      • Re:Wrong Problem. (Score:3, Interesting)

        by ameoba ( 173803 )
        Not quite... more like features that aren't needed keep getting added so that, instead of fixing the current version, the version number can be incremented and customers can be charged for upgrading to a newer version.

        We can blame Microsoft for this sick state of affairs. Until they came along, it was SOP for software licenses to essentially rent the software to the user, giving the author a stable revenue stream. Then Microsoft came along and realized that, for a number of reasons (lack of hardware support, lack of user interest, the non-commercial nature of licensees), selling software for microcomputers called for a different licensing paradigm, so they went with something more like books.

        The catch is, books are generally heavily proofread, but once they've been printed, you're stuck. Normally this isn't so bad, since most books work just fine with a few typos. Not so with computer programs: a single typo that goes unnoticed during testing can be a fatal flaw.

        Unfortunately, the book model provides no significant incentive to fix released software. Sure, the major showstopper bugs will get patched if they escape testing, but most of the minor glitches and irritations will be left in until the next release, when the product can again be sold for a profit.

        The real kicker is that the market for new systems is slowing, and with it the market for new software. Consumers are tiring of having to pay to upgrade to properly working software, but the last 20 years of paying once for software have led them away from accepting subscription-style payments, putting software houses in quite a bind...

        So, if consumers were able to accept a payment model that didn't reward the perpetual ignore-bugs/add-features/release-new-version cycle, we might have non-bloated, functional software.

        Personally, I give it another 20 years before the general populace has enough common sense about computers to force vendors to do the Right Thing.
    • I hear your cry, and I have shouted it myself before. All commercial software should have a warranty, even open source commercial software. My toaster has a warranty. My coffee pot has a warranty. My car, stereo and carpet have warranties. Even my RAM has a warranty. Why not my software?

      I don't expect hobbyist, academic or non-profit software to have warranties (such as Linux), but I do expect all software that I purchase to have one (like the Red Hat distribution of Linux). It doesn't have to be a fancy warranty, but it should be more than "we disclaim all merchantability even though we put a price sticker on it as if it were merchantable." Don't claim that your software is usable and then disclaim fitness for use. That's borderline fraud.
    • Even companies that have traditionally used egregious licenses to escape the blame for their bad shovelware are coming to realize that users are demanding better stuff.

      For security, things are further complicated because in general, functionality and security trade off badly against one another. Plus security is not a feature...it's a property.

      We wrote "Building Secure Software" to help developers negotiate that tradeoff (and other related thorny tradeoffs). In the real world of "should have shipped it last quarter" developers concerned about security need all the help they can get!

      gem

      Gary McGraw
      http://www.cigital.com/~gem [cigital.com]

  • Interesting article, but I'd like to see more specifics. I guess McGraw is (1) trying to sell his book and (2) talking to a wide audience as likely to include managers as developers, but what I really want are some meaty bits that, as a developer, I can directly act upon.

    The FreeBSD site has some secure programming guidelines [freebsd.org] which are worth a gander.

    • Happily there are many more specifics and tons of code examples in the book that I was (in fact) plugging on cnet.

      The second half of Building Secure Software [buildingse...ftware.com] has detailed chapters on buffer overflows, access control, race conditions, random numbers, applying crypto, input validation, password systems, tamperproofing, and getting through firewalls. There's (too much) C, some Java, and a bit of Python to make things real. [A small illustrative race-condition sketch, not from the book, follows this comment.]

      Security weenies tend not to understand that software lies at the heart of the security problem, choosing instead to throw firewalls and crypto at the problem and call it solved. You guys know better.

      Software, the proactive security solution.

      gem

      Gary McGraw
      http://www.cigital.com/~gem [cigital.com]
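
      To make the race-condition chapter topic above a bit more concrete, here is a minimal C sketch of the classic time-of-check-to-time-of-use (TOCTOU) bug and one common fix. It is purely illustrative and not taken from the book; the file path, flags, and error handling are arbitrary.

      ```c
      /* Illustrative TOCTOU (time-of-check-to-time-of-use) sketch.
       * The path is arbitrary; this is not production code. */
      #include <fcntl.h>
      #include <stdio.h>
      #include <unistd.h>

      int risky_open(const char *path)
      {
          /* BAD: the file can be swapped (say, for a symlink to /etc/passwd)
           * in the window between the access() check and the open() call. */
          if (access(path, W_OK) != 0)
              return -1;
          return open(path, O_WRONLY);
      }

      int safer_open(const char *path)
      {
          /* Better: make the check and the use a single atomic operation.
           * O_CREAT|O_EXCL fails if the path already exists, including when
           * it is a symlink planted by an attacker. */
          return open(path, O_WRONLY | O_CREAT | O_EXCL, 0600);
      }

      int main(void)
      {
          /* Demonstration only, on a scratch path. */
          printf("risky: %d, safer: %d\n",
                 risky_open("/tmp/toctou-demo"), safer_open("/tmp/toctou-demo"));
          return 0;
      }
      ```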

  • Score: -1, Kick in the Head Obvious

    Does it occur to him that software designers might write "complex and hastily" written code because expectations are high and time is short? And where, besides software, would computer security problems lie anyway? SCSI cables?
  • "... way more complicated than it used ..."

    If someone can't even use the English language "proper", how on earth can they be trusted to use any other language without dodgy grammar?

    Maybe he was in a rush?
  • by MeerCat ( 5914 ) on Friday November 30, 2001 @10:36AM (#2636520) Homepage
    OK, so the article is about coding for security, but it's worth considering Tom DeMarco's line in his excellent book Why Does Software Cost So Much? [dorsethouse.com] where, he says, the correct answer is "Compared to what??".

    Kicking those who manage complexity is always going to be easy - but until you can do better, you're not really helping.

    The book is well worth a read... if only to shut up all those metrics freaks...

    T
  • For one, who the hell is this guy? What qualifications or experience does he have that make him a security expert? I couldn't care less about his doctorate or his Java security book. Security has nothing to do with a programming language. You can write secure or insecure programs in any language. As Bruce Schneier says, security is a process, not a product.

    Let's assume we take all the author's advice and read his book, then write some "secure" software that meets his requirements and is, for all intents and purposes, "secure". Now take this piece of secure software and put it in an insecure environment, with no physical security, no network security, no security policies governing its use, and so on. Suddenly your "secure" software is not secure anymore. So obviously, the problem is not software. Perfectly secure software will not solve security problems.

    Let's use an example. Let's say somebody keeps breaking in through my front door. I add locks and chains and somebody keeps getting in. So I go out and spend $10k on a front door that is bulletproof and guaranteed to be secure. Haha, I've got them now. The next day, they break my window and climb through. The door was just the easiest way to break in. Secure the door and there will be another weakness. The problem is much bigger than just the software.

    In the same way you should not trust a newly invented crypto algorithm, you should not trust the opinion of some random joe blow who got published on CNET.
    • This guy is me. Who the hell are you?

      Of course there is no such thing as perfect security or perfect software! I don't think I said that. It's all about risk management.

      Here's what Bruce Schneier says about "Building Secure Software" in the preface that he graciously agreed to write for it:

      "We wouldn't have to spend so much time, money, and effort on network security if we didn't have such bad software security...
      Sometimes, network security can defend against these vulnerabilities. Sometimes the firewall can be set to block particular types of packets or messages, or to only allow connections from trusted sources. (Hopefully, the attacker is not already inside the firewall, or employed by one of those trusted sources.) Sometimes the intrusion detection system can be set to alarm if the particular vulnerability is exploited. Sometimes a Managed Security Monitoring service can catch the exploit in progress, and halt the intrusion in real time. But in all of these cases, the original fault lay with the software. It was bad software that resulted in the vulnerability in the first place."

      A point well worth pondering. Now go read the book and *then* you will be allowed to rant.

      gem
      Gary McGraw
      "a random joe blow"

      • You should take your own advice:

        10. Assume nothing. Question all assumptions and choices you make when you develop software. It's very hard not to be shortsighted. So it's really good to get other people to assess your design and help you with risk analysis. Something that goes along with this is: Never trust security claims people make. Computer security these days is chock full of snake oil.

  • Simplicity takes time. Complexity is easy. Early phases of evolution always spawn fantastically complex structures. Over time these get beaten down into simpler, stronger, and more subtle designs.

    This is really a great opportunity for those who would try to dethrone Microsoft from its place as the software vendor of choice for many companies. Provide an alternative, use security as the winning card.

    The challenge is to improve the existing structures, secure all the entry points into a network, assume that all contacts are potentially hostile, and generally turn the naive servers we use today into something resembling a real organism: alert, defensive, paranoid. Competition will then do the rest.

  • and kind of sad, too. Seems almost nobody on slashdot wants software that is secure. We'd rather have it complex, with lots of bells and whistles. Really???

    Let's face it, the guy is, unfortunately, right. C and C++ ARE crappy languages, in that they rely on the programmer to ensure there are no buffer overflows. Other languages do offer such protection. OK, they may be different from what you are used to working with, but too many programmers (me included) don't check for overflows. As a result, we get bugs. Some merely crash the program or the system; some can be used to crack the whole box. [A short C sketch of this kind of overflow follows this comment.]

    Who's to blame? Designers, programmers, consumers? It's tough work to retrofit security into an insecure design -- look at all the work sendmail has required in the last five years. In general, only those programmers who have been bitten by a security bug take the time to put in the extra checks -- it slows down the programming and the program. As for consumers, we'd all rather have the latest and greatest "features." (Like Clippy!) Until you've been bitten, of course; mine came when somebody hijacked my 14.4k dial-up connection to relay spam.

    Why don't we return all the software that crashes, like everybody's talking about doing with the new copy-protected CDs? What do you need with the new Office XP that wasn't in Office 95, for example? The new, improved, crash-resistance? ;)
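
    As a purely illustrative aside (not from the article, the book, or the comment above): the unchecked copy the parent comment is talking about usually looks something like the C sketch below. The buffer size and input are arbitrary.

    ```c
    /* Minimal illustration of an unchecked copy versus a bounded one.
     * Buffer size and input are arbitrary; this is a sketch, not a recipe. */
    #include <stdio.h>
    #include <string.h>

    void vulnerable(const char *attacker_controlled)
    {
        char buf[64];
        /* BAD: strcpy() writes past the end of buf if the input is longer
         * than 63 bytes, smashing the stack and potentially handing an
         * attacker control of the program. */
        strcpy(buf, attacker_controlled);
        printf("%s\n", buf);
    }

    void bounded(const char *attacker_controlled)
    {
        char buf[64];
        /* Better: snprintf() never writes more than sizeof(buf) bytes and
         * always NUL-terminates, so the worst case is truncation. */
        snprintf(buf, sizeof(buf), "%s", attacker_controlled);
        printf("%s\n", buf);
    }

    int main(void)
    {
        bounded("an arbitrary input string");
        /* vulnerable() is shown only to illustrate the bug; feeding it more
         * than 63 bytes is undefined behaviour. */
        return 0;
    }
    ```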

  • "It turns out that software is way more complicated than it used to be."

    [Wow. That's the most blindingly obvious comment I've heard for a long time!]

    I feel that poor software quality boils down to (a) imprecise specification, (b) incomplete planning, and (c) development environments that are too damn fast.

    I recall, in the good ol' days of computing, having to wait hours for a compilation run to complete. You can bet that the thought behind code changes was considerable and that every line was checked very carefully, dry-running and all that stuff.

    Nowadays it's hack, compile, test, hack, compile, test, etc. until it works. Why it works, and whether it will hold up, is a separate, and largely ignored, issue. And don't point at QA and say it's their job to find the bugs in your crappy code. It isn't -- it's yours.

    So, who's for going back to really s l o w compilers as a mechanism for improving software quality????

  • "And the best way to determine how many problems are going to be in a piece of software is to count how many lines of code it has. The simple metric goes like this: More lines, more bugs."

    No, the best method also factors in the competency of management, the competency of the engineers, and the cost of failure.

    Let's take as an example nuclear power plants that have operational control code behind them: how many lines of code do they have? I'd suggest tens of millions.

    Why don't we see crashes of these systems widely reported?

    1) They are safety-critical: if an error occurs anywhere, the surrounding code must fail closed, meaning it must not produce false results. [A small fail-closed sketch follows this comment.]

    2) If you screw up, you can't just say "hey, we'll fix it in the next version." If you are lucky, you'll simply get your day in court for negligence and lose your place in the safety-critical market. If you are unlucky, that still happens, but you also get the ass sued off you by the relatives of anyone injured, maimed, or killed by your software bug.

    You have to admit the second point really is one hell of an incentive not to screw up!
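
    To make the "fail closed" point above concrete, here is a hedged C sketch. The sensor interface, the temperature limits, and the shutdown action are all invented for this illustration; real safety-critical code is far stricter than this.

    ```c
    /* Sketch of a fail-closed control check.  The sensor interface, the
     * temperature limits, and the shutdown action are invented for this
     * illustration; real safety-critical code is far stricter. */
    #include <stdbool.h>
    #include <stdio.h>

    #define TEMP_MIN   0.0
    #define TEMP_MAX 350.0

    /* Hypothetical reading: ok is false if the sensor did not respond. */
    struct reading { bool ok; double celsius; };

    /* Stubs so the sketch stands alone; real code would talk to hardware. */
    static struct reading read_core_temp(void)
    {
        struct reading r = { false, 0.0 };
        return r;
    }

    static void trip_shutdown(const char *why)
    {
        fprintf(stderr, "TRIP: %s\n", why);
    }

    static void control_step(void)
    {
        struct reading r = read_core_temp();

        /* Fail closed: a reading we cannot trust is treated the same as a
         * dangerous one -- drop to the safe state instead of acting on a
         * guess and producing a plausible-looking but false result. */
        if (!r.ok || r.celsius < TEMP_MIN || r.celsius > TEMP_MAX) {
            trip_shutdown("untrusted or out-of-range temperature reading");
            return;
        }

        /* ... normal control action using r.celsius would go here ... */
    }

    int main(void)
    {
        control_step();
        return 0;
    }
    ```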
