
The Peon's Guide To Secure System Development

libertynews writes "Michael Bacarella has written an article on coding and security. He starts out by saying 'Increasingly incompetent developers are creeping their way into important projects. Considering that most good programmers are pretty bad at security, bad programmers with roles in important projects are guaranteed to doom the world to oblivion.' It is well worth the time to read it."
This discussion has been archived. No new comments can be posted.

  • by jpt.d ( 444929 ) <abfall@rogers.com> on Friday November 15, 2002 @03:05PM (#4679041)
    The P.Eng has one thing right - we need 'software engineers' or 'computer engineers' who are liable for their work (and the companies that use them should be liable too).

    If Microsoft's products are so good, why do they disclaim liability on it?

    Of course it isn't just Microsoft doing this either. The whole licensing thing. If a 'license' is supposed to give you the privilege to do or use something, then in most things you are completely liable for your actions. For example, I have a driver's license; if I kill somebody it is my fault. If Acme's Nuclear Control Software 2002 goes faulty and blows up part of the States - they would probably claim no fault (bad example I know - probably a special case currently).
    • I think it, currently, is tied back to US 1st Amendment free speech protection - a book is free speech; it doesn't have to be correct. If you read a book, follow its advice and lose your shirt or damage something, the book publisher probably has legal protection against being held liable for it ("we just published the false information, you're the one who acted upon it"). However, yelling 'fire' in a crowded theatre or 'fighting words'* is not protected speech - we might need to tie at least some software with potentially damaging consequences to something like that in some situations.

      *In 1942, the U.S. Supreme Court in Chaplinsky v. New Hampshire defined fighting words as words which are likely by their very utterance to inflict injury, or which tend to incite the average person to immediate violence. The high court said that fighting words receive no First Amendment protection.

    • How absurd. This whole certification thing is such a tired argument, though it's one that the IEEE is revving up as a new source of income (and I'm an IEEE member, but that doesn't mean that I agree with ridiculous certifications). Certifications and licensing are not, in most cases, a guarantee of quality. In reality, in many cases these licensing boards turn into self-protective entities that allow their members to get away with things that they would never get away with if not surrounded by the shroud of, err, "personal responsibility" (see some of the medical boards that act more like shields against personal responsibility). Did you know that one of the P.Eng criteria, at least here in Ontario, is that you cannot discredit another P.Eng?

      Most certifications are nothing more than an economic barrier to entry: A club, if you will, whose membership betrays zero information about the capabilities of their members, but rather excludes those who haven't signed up. P.Eng is a particularly notorious one because they've tried to get their grubby hands on virtually all aspects of society, while provably offering nothing in return. No thank you. I don't need a "P.Eng of Burger Making" to make my Big Mac, even if that does help Bob get his friends a job through his exclusive club.
      • Sorry, but there I think you are absolutely wrong. I happen to be an engineer, not because of an economic barrier, but because I managed to study x years at a university.

        And since you are in Ontario, which is where I got my engineering degree, you should know that money is not the issue in getting an education.

        Also, engineering certification does not mean quality. It means that you studied so many years and have gone through specific procedures. Just like police officers and firefighters. Some police officers are good and some are buffoons, but regardless you know that they have gone through police training....

        When engineers become liable for stuff that they design, people design very DIFFERENTLY. This is not to say that everybody has to be an engineer to work on software. Just like in a custom machinery shop not everybody is an engineer. You just need enough engineers to sign off legally on designs.
    • You don't need to make engineers liable. It's management (executive management, to be precise) that needs to be liable. Hell, I keep getting turned down on projects to improve the code in my company. The only way I could possibly put in really good security would be to put in 20-30 hours a week of my own time - past the 50-60 I already work. No thanks.
      • Agree 100%. That's the problem, you can't sell anything to mgmt unless it makes them look good somehow, and security is not as "sexy" as new features. Hence companies like Microsoft can sell the garbage they do, because they just add more bells and whistles (bloat) each version.
    • by Anonymous Coward
      > High level languages like Ruby, Python, or even Java are strongly recommended for all new projects.

      All of these languages use a C program to run (interpreter, VM).

      First this guy advises against using closed source components, or components that you do not understand.

      Well, what are these high level languages that he is suggesting? They're just convenient ways to write C (Java excluded).

      Maybe he thinks that you should read through the Ruby and Python source before you start using these languages?

      I think his suggestion is the reason we have bloated, insecure software: everyone trusts that their language is in a little black box just because it has a VM. What if the VM has a security flaw? Isn't this just like running a secure program on top of Windows?

      Just keeping a developer from using pointers is no way to ensure a project's security.

      • There's a simple solution to both the problem of high level languages being "just convenient ways to write C" (and I don't see why one should exclude Java here, but I don't think that this argument is valid anyway) and them being slow. Use high level languages that don't run in a C-based interpreter. Duh.

        For example, try Common Lisp [cons.org], Objective Caml [ocaml.org] or Ada [gnat.com] (not that high-level, but not the worst idea if you care about security).

      • Using a high level language is the best kind of software reuse. The reason behind this is simple: chances are good that you are never going to be as talented as Guido van Rossum or Larry Wall. Nor will your data structures get as many eyeballs examining them as Python lists or Perl arrays. Borrowing the work of the hackers that created some of these languages only makes sense.

        Now, I am not saying that these programs don't have bugs, because they do, but I would bet that they have fewer bugs than anything you have ever written. So while using high level languages doesn't ensure security, it certainly does help.

    • The problem with this is that the whole system has to interoperate together. The system is only as secure as the total of all its components working together.

      You can say the same thing for a bridge. It will only stand if all the parts of construction are good, which the developer (not the engineer) is in control of. If the design is inherently flawed, the engineering firm will be liable. If the construction is flawed, the developer is liable.

      The difference between software and your analogy is that the engineer/developer has complete control over the whole system. Developers don't. Microsoft doesn't. If the user of that same bridge goes and replaces all the rivets used, the developer can hardly be blamed when the bridge fails because of this.

      If I build a huge structure right in the middle, and you build another, and CowboyNeal builds a third, much smaller building, and suddenly the bridge collapses, whose fault is that? The bridge developer? Me for starting the trend? CowboyNeal even though his was the smallest?

      And then when we bring security into play, that is a whole different ball game. The engineer doesn't have to worry about people actively trying to make his bridge fail. If someone (say a terrorist) plants shaped charges to destroy the main supports, and the bridge collapses under its own weight, no one would even think about suing the engineer (except for maybe the lady that dumped coffee in her own lap, and somehow thought McDonald's was at fault).

      In software systems we rely on everyone else to be well behaved. We also rely on the combination of everyone else's systems not interfering with our systems in unexpected ways. A system of mine could run fine without a single crash. A system of yours could run without a single crash. Together they might get spurious crashes. I have never had a crash on a fresh install of Windows while playing Freecell before I install anything else.

      The same idea of liability can't be applied to software systems.
    • by aero6dof ( 415422 ) <aero6dof@yahoo.com> on Friday November 15, 2002 @03:32PM (#4679284) Homepage
      If Microsoft's products are so good, why do they disclaim liability on it?

      Because the customers don't want to pay the added cost of reliability beyond what they need. If you want absolutely, positively bulletproof software, you're going to have to pay a higher development cost (mostly in testing, but in extra liability insurance for the company too). For safety-critical applications, customers are willing (or should be willing, anyway) to pay for the additional cost, but it's ridiculous to pay for it when you don't need to. Do some googling on the cost of the space shuttle software, for instance...
      • Yes. See the April 98 (!) Byte Magazine. "Why PCs Fail, and Mainframes Don't"
      • Do they have an option? I mean, is there any competing product that will be Windows compatible? Don't tell me Linux, because it's not Windows. And Windows is a monopoly; you cannot get away with not using it in a corporate environment today. You can at most replace some servers and selected workstations.
        • by DrSkwid ( 118965 )
          You don't need secure clients, you only need secure servers.

          Tell me, what is the compelling business reason for using windows that prevents me from using anything else in a corporate environment?

          There is only one answer (or is it three?):
          "Fear, Uncertainty and Doubt"

          And that's one feature we can all do without!

      • If you want absolutely, positively bulletproof software

        Doesn't need to be bullet-proof. But they should be liable for negligence, or for overstating their security. If the product is not meant to be used in a secure environment, or only with certain limitations and certain controls (say, following an ISO standard), it should be stated in the license or outside the box (or somewhere).

        If you get roOted, it can depend on a thousand things. If the reason is that there was a huge hole in the code and the company did not patch it in 1 day (to name an example), they SHOULD be liable.

        What companies want is a declaration of how secure the product is, and a statement about how a hole will be handled, what to expect and how to proceed. The "we fix as we go, and don't blame us, but nonetheless we claim our products are as secure as everyone else's" attitude is unfair.
      • Not only that (Score:3, Insightful)

        by Sycraft-fu ( 314770 )
        But you have to have a verified design. Actually MS does offer solutions like this, indirectly, with Windows Datacenter. With a Datacenter server you can get things like guaranteed uptime and so on. What happens is you contact an SI that is authorised to sell it, and you work with them to design the hardware and software you are going to use. They build and test the whole thing, and then sell you a guarantee with the system and a service contract. You then can't mess with the system. You can't go and install whatever software you want, because the software might break the system.

        Real verified, reliable designs are, by necessity, very inflexible. You have to verify all the components and make sure they work together to ensure that one won't cause problems. You then can't change the components without re-verifying.

        This just doesn't work for a desktop, where the user expects to be able to operate the system as they desire. That means that people can, and will, find combinations of software and hardware that will fail. Hence, a software company can't guarantee reliability in that situation.
    • "If Microsoft's products are so good, why do they disclaim liability on it?"

      Well, let's see:

      1.) MS has millions (like many many millions) of people using their products. Even a small percentage of liability could lead to bankruptcy.

      2.) Lots of ppl hate MS enough that they maliciously work to create problems with it. (Nimda, Melissa, etc...)

      3.) Windows-based machines are built from a broad variety of hardware that MS cannot possibly vouch for. If Windows is unstable as a result of a bad driver, blame will be misplaced.

  • They're just paying for what they get. I tend to believe that it's not so much bad programmers as it is a general apathetic attitude that good programmers have now. If there's no incentive to bust your balls, you're not going to!
    • Most programmers graduate from state universities with no real-world experience in security, hacking, and so forth, and no connections to anything that's going on -- it's simply a pass from the university for a student molded by the dirt-poor standards of a mainstream college system into a corporate programming world of laziness and no liabilities.

      These people are no more qualified to write code than a third worlder with no previous formal schooling trained to be an H1B in a cert mill -- yet they are paid much more, for no good reason.

      If anything, regular programmers who would ever, for example, use PHP's fopen() for a proxy like the article described should be paid like H1Bs and school teachers -- about $35,000 a year, at the most.

      However, the ones who really know their shit -- like Mr. Bacarella -- should be the ones making $100,000 a year or more.
  • So basically... (Score:5, Informative)

    by vasqzr ( 619165 ) <vasqzr@netscape.net> on Friday November 15, 2002 @03:09PM (#4679082)

    He read a few books on the subject, and summarized the most simple concepts in an article.

    Nothing new here.

    Head to Amazon and find some books ...

    Software Project Survival Guide by Steve C McConnell (Paperback)
    Writing Solid Code: Microsoft's Techniques for Developing Bug-Free C Programs by Steve Maguire (Paperback)
    The Mythical Man-Month: Essays on Software Engineering, Anniversary Edition (2nd Edition) by Frederick P. Brooks (Paperback)
    The Pragmatic Programmer: From Journeyman to Master by Andrew Hunt, et al (Paperback)

    • by Anonymous Coward on Friday November 15, 2002 @03:20PM (#4679190)
      Writing Solid Code: Microsoft's Techniques for Developing Bug-Free C Programs by Steve Maguire (Paperback)

      Also holds the world record for "Shortest Book".
    • Re:So basically... (Score:2, Informative)

      by Shackleford ( 623553 )
      Yes, it seemed that much of the article focused on security basics, such as the importance of disabling unnecessary services and not trusting firewalls to be a security panacea.

      Anyway, here are a few more suggestions for books that apparently go beyond the basics:

      Any others?

    • Writing Solid Code: Microsoft's Techniques for Developing Bug-Free C Programs by Steve Maguire (Paperback)

      Why do I find that title funny?

      Seriously now, I had the good luck to find and buy that book about 4 years ago, and ever since I always go back and check some of the insights there. There's pretty much everything you need to write solid C code that's bullet-proof and easy to maintain/upgrade. Too bad they don't use the book in-house more often.
  • in your system design, you should probably give up now.

    A non-Windows system is not a guarantee of invulnerability, but keeping a Windows system is guaranteed to put you at risk.

    The real world seems to agree with him on these.

  • I work for an IT security company that works on some pretty secure systems. When we come across custom apps we are amazed time and time again at the logic that was put into developing them, not just the security. It's one thing to code; it's another to do it well. My favorite catch was an SQL developer who created a hyperlink to care for and feed his system that simply had too many bugs and was pushed to production. It's important that companies have good end-to-end IT policies - apps, usage and security - but in large part management doesn't recognize the risk until it's too late.
  • by Lumpy ( 12016 ) on Friday November 15, 2002 @03:11PM (#4679095) Homepage
    The coders are still shackled to the management that is trying to push it out as soon as it compiles and runs.. management doesn't CARE about stability or security, and sales/marketing doesn't even care if it works.

    Until you can get the COMPANY liable for their software claims, and make their claims open and public, not buried in legalese.. i.e. if you don't want to be liable for it not working, then the packaging must state "MIGHT NOT WORK" on the front in big letters.

    until then it will not change... not in commercial software anyways...

  • Useless advice? (Score:3, Insightful)

    by Ars-Fartsica ( 166957 ) on Friday November 15, 2002 @03:11PM (#4679101)
    To quote: "It should be a crime to teach C/C++"


    While I have taken this out of context, it's not worthwhile to dispense with systems coding issues - that's exactly where most security problems start and need to be stopped. Anyone can be safe in a sandbox.

    • Re:Useless advice? (Score:5, Insightful)

      by Subcarrier ( 262294 ) on Friday November 15, 2002 @03:43PM (#4679369)
      He's contradicting himself here:

      You can tell the difference between a developer who gets it and one who doesn't because the developer who doesn't get it is content to build a custom system using closed source components that they cannot understand, let alone keep secure.

      when he goes on to say that

      High level languages are usually more secure than C/C++ ...

      High level languages are built on layers and layers of things written by other people, things that you know nothing about. If you use C or assembler, you're much more likely to be in control of the security of your code.

      I guess the comment about C/C++ is aimed at coders who suck more than average; they're certainly better off using code written by other people.
  • bad coders (Score:2, Insightful)

    by greechneb ( 574646 )
    When I look back at my programming classes in college, the majority of the people didn't have a freakin' clue. I don't think most of them could install a program on their own. Unfortunately the teachers all walked them through it, and they ended up passing, because they had their hands held the entire way. It's scary to think that some of them could end up in high places.
    • by BoomerSooner ( 308737 ) on Friday November 15, 2002 @03:25PM (#4679232) Homepage Journal
      I guess you shot out of the womb with coding skills (doubtful). Everyone has to learn in their own way. In the end if someone wants to learn to program well, they will. Otherwise they'll just coast along until it's required.

      I was a shitty programmer out of college and after moving between various jobs I learned along the way.

      Business works by getting the most for the least amount of cash. Unfortunately most businesses don't have competent managers that can tell the difference between anything applicable in the real world and a buzz word they just read on CNet (most technical conversations are over their heads). That is my experience anyway.
      • I guess you shot out of the womb with coding skills (doubtful).

        Realize, though, that it's often the so-called experienced programmers who are making mistakes, and these are the people that the article is criticizing. The newbie programmer will write a typical program in Python, because it's just so much damn easier. The el33t programmer will write it in C++, because C++ is a Manly Language. (In case you didn't read the article, one of the key points is that languages like C++ should be avoided.)
  • a good read (Score:5, Funny)

    by lactose99 ( 71132 ) on Friday November 15, 2002 @03:12PM (#4679111)
    I found 2 quotes particularly enjoyable:

    Call yourself a computer professional? Congratulations. You are responsible for the imminent collapse of civilization.


    The user is pure evil.

    Very true and sometimes misunderstood bits of information.
    • by Digital Mage ( 124845 ) on Friday November 15, 2002 @03:35PM (#4679307)
      1) Users are pure evil.
      2) Civilization is made up of users.
      3) Computer professionals are responsible for the collapse of civilization.
      4) Computer professionals will therefore destroy all evil. ;^)...Cool!
      • by Tack ( 4642 )
        This is almost true; unfortunately I must amend 2) and 4):

        1) Users are pure evil. (Given.)
        2) Civilization is made up of users and computer professionals. (Assumption)
        3) Computer professionals are responsible for the collapse of civilization. (Given.)
        4) Computer professionals will therefore destroy all evil and take themselves out in the process. (Conclusion.)

  • by ultramk ( 470198 ) <ultramk@pacbell.net> on Friday November 15, 2002 @03:13PM (#4679123)
    the real question that any developer needs to ask...

    "What you need doing? Daboo!"

    going back to minding my fortress now...

  • by Max Coffee ( 559629 ) on Friday November 15, 2002 @03:16PM (#4679149)
    I really don't think the blame can be placed on the programmers here. Software development organizations get from their programmers what they measure and reward.

    I used to work at a software house, and I noticed our code always adapted to whatever the organization cared about. When they cared about timeliness, we gave it to them, but the bug count went up. When they cared about a low defect rate, we gave it to them, but the volume of code (completed feature set) went down. When they cared about maintainability too, they got that, but app performance suffered.

    Most competent programmers can probably make meaningful contributions to secure apps, especially if the efforts are led by good architects. Not everyone has to be the best. The only thing is, whoever is commissioning the software has to rank security (which includes a low defect rate) above timeliness and feature count. If that's done, most programmers can rise to the challenge.

    Don't blame the programmers. They're just adapting to their environment. They do have to put food on the table, after all, so they'll do what their companies value.

    • by aero6dof ( 415422 ) <aero6dof@yahoo.com> on Friday November 15, 2002 @03:49PM (#4679417) Homepage
      I agree with you and would go a little further. It's not only their internal company environment, but the environment of the market too. Unless customers are deciding to purchase one product over another due to its security features, software isn't going to get more secure.

      Look at airbags in cars, the government doesn't mandate side impact airbags, but some manufacturers put them in anyway because it's a selling point that some of the customers care about.

      Now, I'm sure someone is going to point out that maybe we should have gov't enforced minimum security standards. However, I'm skeptical that government would be capable of doing it sanely right now.
  • Better languages (Score:3, Interesting)

    by PylonHead ( 61401 ) on Friday November 15, 2002 @03:16PM (#4679153) Homepage Journal
    It should be a crime to teach people C/C++.

    High level languages like Ruby, Python, or even Java are strongly recommended for all new projects.

    How about a high level, compiled language with static typing like Ocaml. More speed, more protection, and it's been officially certified as "The programming tool of choice for discriminating hackers".

    Ocaml [ocaml.org]
    • First, Ocaml is one in a loooooong line of languages claiming to be safer than C/C++ while simultaneously claiming to be faster. I have not seen one new language in the last five years NOT claim to (in "some cases") be faster than C/C++, yet they never can back this claim up in the average case.

      Now how many Ocaml coders are there out there? Five thousand? Actually that number is probably generous. Just fess up that no one cares about this language regardless of its benefits. It's added to the list of Lisp, Haskell, and all of the other languages that could save the world if we just adopted them.

      Even then, Ocaml does nothing to secure the monstrous existing C/C++ code base.

      When coders run out of answers, they often resort to blanket claims of utopia delivered by a mysterious and obscure language.

    • What about embedded software? I read the FAQ/introduction and couldn't see anything about embedded software. In fact, I can't see how this language would cope with embedded things, where type castings and volatile addresses are *needed*.

      Strikes me that a lot of code out there has to be written in low level languages because of the application, and ironically, these applications may just be the ones that need securing the most (think of a cascading failure in a router for example, depending on the data and network, it could be very very bad).

      I could be entirely wrong, and embedded OCaml might exist - if not, I'm sure it's been thought about - but I just don't see how it can work for embedded code (well, I guess you can abstract the device interaction from a core written in OCaml, but that still leaves the I/O risky - and that's probably where you want secure code, right?).
  • What hubris. (Score:5, Insightful)

    by ProtonMotiveForce ( 267027 ) on Friday November 15, 2002 @03:16PM (#4679155)
    This "technologist" is carrying on about bad programmers and security? Wow - I assume he's a seasoned professional with many large-scale projects under his belt?

    With such trenchant insights as "Don't use C/C++!" "Don't use Windows!" "Watch out for user input!"

    Wow. How truly insightful. I'm not even going to bother pointing out the utter absurdity of claiming that using or not using C/C++ has anything to do with it, or the added security problems with using high level languages (do you trust the implementation?).

    I'm just going to say I've had bloody poops with more useful information in them than this article.
    • Re:What hubris. (Score:5, Informative)

      by stephanruby ( 542433 ) on Friday November 15, 2002 @04:17PM (#4679611)
      This "technologist" is carrying on about bad programmers and security? Wow - I assume he's a seasoned professional with many large-scale projects under his belt?

      Here is his professional experience on projects [bacarella.com]. You can actually see his code, and the depth of his work is not at all impressive.

    • There are two points I'd like to make.

      A: I agree with everything he says in his article, and it obviously isn't obvious to most programmers these days, thus insightful.

      B: Skimming through random code of his, it does seem that his code doesn't live up to such high standards that he may claim.

      A: If you just look at the huge number of high-level projects written in low-level languages such as C and C++, and the sheer number of bugs, you can see he has a point.
      High level languages may have implementations that add security risks, but the languages themselves make it harder to accidentally express bugs, including those that create security flaws. A language's practical security value can be measured by the security level of its implementations. If you look at CPython's implementation (the one written in C), for example, you see some very good code, written by very good hackers. I have no clue about the bug levels in Python systems like Psyco, Jython, or others, but they are probably adequate. Perl has been around long enough to have probably been well debugged too.
      Java is newer and has some seriously crappy implementations with lots of bugs. But out of the vast number of implementations, some must be safe.
      Adding the language implementation code is similar to adding any other code to your project (libraries, system calls, etc.). However, language code is probably better debugged, and only a very small base of its implementation has to be debugged for pointer flaws and indexing problems to be eliminated.
      There is almost no doubt that Python, in its official sources and typical configurations, has (nearly?) no pointer problems or indexing errors.

      B: Just skimming through his "light, secure, lightning-fast HTTP server" code, I saw some ugliness right off.
      Using integers for enum values (even when an enum is declared!). Using complicated pointer arithmetic where a simple indexing "for" construct could be used (to eliminate error-proneness). Using a static array of pointers to structures, malloc'ating each entry: this combines the evils of static allocation (limited size, unused bytes in the smaller cases) with the evils of dynamic allocation (complexity of pointers, the need to malloc'ate - slower than doing nothing - and memory fragmentation). Of course he should have used a dynamically growing array of pointers (preferably one implemented generically via macros or void* lists, to reuse debugged code), or a static array of structures, but apparently his code is second-class.

      To summarize: while I think his code isn't the best, I do agree with the points he makes, and if you act on your claim that high-level languages cannot help security, I think you are probably worse off.

  • Peon?! (Score:5, Funny)

    by gergi ( 220700 ) on Friday November 15, 2002 @03:16PM (#4679156)
    Everyone knows peons don't care about security. They just go around doing whatever they're told to do. Half the time, they're just standing around because there's nothing for them to do. They are oblivious to security breaches... I can't tell you how many peons I've seen getting hacked to death without them even noticing! And if they do notice, all they ever respond with is "Stop poking me!!!"

    Peons, indeed
    • Re:Peon?! (Score:2, Interesting)

      by LostCluster ( 625375 )
      This is the same as the school system admin who sets up a mail server for the school but fails to restrict its use to only the school's IP space. Suddenly it's discovered as an open relay, published on web sites, discovered by spammers, and they find their IP space in a black hole.

      They're puzzled wondering why their network is sorta-broken. Most web sites work just fine but some don't. Everybody can send out e-mail, but people are complaining that the messages are bouncing half of the time.

      When they discover they've been black-holed, they don't understand why they're being punished for the actions of spammers, which they think are out of their control. They want what the spammers are doing with their network to be illegal, and they want the lawyers to make the problem go away.

      Oh, all the trouble a little security knowledge could save.
  • Open source systems offer this power to the end user (you), that is their real strength. You can tell the difference between a developer who gets it and one who doesn't because the developer who doesn't get it is content to build a custom system using closed source components that they cannot understand, let alone keep secure.

    That's precisely why the IT department of my company is setting themselves up to fall apart. My group's lead tech (lead not because of higher knowledge, but because he's hung around a while and sold himself) is convinced closed source is better. His arguments come from quoting Microsoft's advertising and web sites (which are basically just more advertising). Without even trying anything open source, my company has whole-heartedly adopted .NET. I am so out of here as soon as possible.

    Ignorance may be bliss, but only for the person who's ignorant. They're happy... I'm not.
  • ...I mean, really:

    "...Considering that most good programmers are pretty bad at security, bad programmers with roles in important projects are guaranteed to doom the world to oblivion..."


    We are the important little center of the universe, aren't we!

    Oh! this is just book-marketing bullshit?

    Or maybe hyperbole, if the author is literate enough to know what that word means...


  • Gotta agree with him on this one. I finally got out of a multi-year project where we used a gigantic POS graphics package as the back end. It added unnecessary complexity and over a year of hacked code to what should have been a month-long project (had we coded the graphics functions ourselves).

    We got stuck with the package because the client chose it, and refused to admit they were wrong. When the project went 10X over budget and people got fired, they still stayed with the graphics package and even upgraded it to the 2.0 version.

    The only way out was to quote them an astronomical figure for upgrading our software to match the POS and hope they wouldn't bite. I cheered when they politely declined.

    It's good to have a job where you can choose your clients.

  • I agree whole-heartedly with the first of 2 non-superfluous statements the author makes: Why do you think Java and, to a lesser extent, C# are so popular right now? ESPECIALLY for teaching? Because with Java and C#, it's very, very hard to write code that can break the system it's running on. I also agree to some extent with his position on cryptography... most serious (non-IE/Outlook) insecurities aren't based on cracked crypto - they're in buffer overflows and weak points in code. I don't pretend to be anything but a pathetic first-year Java student, but I can see where this author is coming from just by reading this website once a week...
    • by Christopher Thomas ( 11717 ) on Friday November 15, 2002 @03:57PM (#4679458)
      Why do you think Java and, to a lesser extent, C# are so popular right now? ESPECIALLY for teaching? Because with Java and C#, it's very, very hard to write code that can break the system it's running on.

      It's also very hard with C/C++. The most you can break on any system without very broken protection handling is the faulty program itself.

      The reason Java is taught as an introductory language is that it was stylish about 5 years ago. The reason C# is taught as an introductory language is that Microsoft threw a lot of money at universities to teach it, and at marketing to attempt to make it stylish.

      It boggles my mind that people in second-year programming courses at my university don't know what a pointer is, because it wasn't covered in their first-year programming course (which used Java).

      Languages with built-in safeguards are great, if that's your primary concern, but programming courses in university are supposed to teach you about all aspects of programming that you might reasonably encounter. If someone graduates without knowing how to debug memory errors and then has to maintain a C++ program, God help us all. This is also why we're forced to learn Lisp/Scheme and exposed to Fortran at some point - exposure to the concepts is what's important.

      As far as what's used in industry is concerned, first likelihood is whatever the shop has used for the past several years (anything from VC++/VB down to Cobol, depending on where you're working), and second likelihood is whatever the industry fad was when upper management was setting up specifications.
    • Point at one competitive OS, RDBMS or graphics editing program written in Java or C#. This is why C/C++ still matters and will for many more years.

      High level languages are great for high level problems. Low level languages are great for low level problems. Use the right tool.

    • When all these half-trained Java-based "CS major"s have to deal with real systems of all types, including those that require memory management by hand?

      This is precisely why Java and C# SHOULDN'T be the primary teaching language at any serious institution. It doesn't just encourage bad habits in memory management, it breeds ignorance of the CONCEPT of memory management. I'm extremely glad I had a good background in C/C++ (and even some Pascal before those) before I ever learned Java or Python, or I wouldn't have a clue about half the concepts that a good C background forces you to learn.
    • Schools are teaching languages like Java and C# because there is so much hype surrounding those languages.

      IMO, this is part of the problem. Languages should not be taught. Concepts should be taught, and those concepts will translate to any language. People are coming out of schools having only been taught in Java (C# / whatever) and have no concept of things like memory management, buffer overflows, etc. These folks have never had to think about these issues before because they only wrote Java code in school. When these people get out into the industry, they will almost certainly have to maintain some code at some time that deals with issues like this. Sorry, but the world doesn't revolve around all of these high level languages yet. There is a lot of C/C++ code out there in the real world that will need to be maintained for many years to come. We haven't managed to kill off COBOL yet after all :).

      Also, as others have mentioned, coding in these high-level languages can give you a false sense of security. Do you trust your Java implementation, for example? Are you willing to say for certain that the JVM doesn't have any buffer overflows in it :).
    • Because with Java and C#, it's very, very hard to write code that can break the system it's running on.
      If I can break the system I'm running on with a user level program, be it in C, assembly, or whatever, the operating system has a bug and should be fixed. Once you're running at the kernel level, well, you can pretty easily break things in any language.

      Languages like Java and C# give you controlled, well-known failure states for certain categories of bugs (you can still walk off the end of an array hosing your program, but the behavior is well defined, unlike C or C++). These languages also restrict your ability to specify unsafe things, but in doing so they take away your ability to specify certain useful ways of doing things. But there are still huge numbers of ways to put bugs into your programs in any language. Java and C# are not magic bullets, they're simply points along a spectrum of safety, power, and expressiveness.

  • Here's a wonderful paragraph...
    High level languages like Ruby, Python, or even Java are strongly recommended for all new projects. The reason these languages are more secure (in theory) is that they don't have pointers. Most security vulnerabilities that involve breaking program code involve manipulating pointers; in fact, many programming bugs are generally related to pointers in some way. As with the OS issue noted above, do not mistake this for invulnerability. You're simply less likely to be compromised using this particular attack vector with a high level programming language.

    I guess we'd better throw out every other language, since these are "strongly recommended for all new projects." Here's a better idea: why not just write the software in the language best suited for the job, or that you're more familiar with, and code it to check for unexpected data.

  • Wrong approach (Score:5, Insightful)

    by lazyl ( 619939 ) on Friday November 15, 2002 @03:23PM (#4679221)
    It should be a crime to teach people C/C++.

    This guy is a little rough I think.

    High level languages like Ruby, Python, or even Java are strongly recommended for all new projects.

    This sentence should be continued "..for mediocre programmers.". Professional experts should use whatever language they are best at as long as it's reasonable for the project.

    This article looks like he's giving advice on how to take a group of wanna-be programmers and try to get useful results from them. I think that's the wrong approach. What you should do is hire real experts. That way all the wanna-be programmers won't be able to get jobs, and so they might realize "hmm.. maybe I should go back to school and get some real skills". Then we won't have as many of the problems that this guy talks about. Though maybe the schools aren't teaching the skills properly, but that's a different topic.
    • So are OpenBSD programmers good at coding in C securely? Are they not professionals? Have they not made mistakes that compromised the security of the entire system? What was your point again?
  • by DoctorMabuse ( 456736 ) on Friday November 15, 2002 @03:27PM (#4679251) Homepage
    During the 1980s, I developed software for ICBM command and control systems and for ICBM targeting. One of these systems ran on a Rolm 16-bit computer and was programmed in Jovial, assembly and Fortran. At the time, this computer was already 5 to 10 years behind the commercial state-of-the-art. However, it worked and almost all of the bugs in the computer and the compilers were known, and THAT is the key to developing secure software.

    Don't use the latest and greatest. Use something that has been in production for several years and has had the bugs worked out. The military used to do this on critical systems. Did I hate coding in Jovial on a machine that only had 64K? Yes. But I also knew the machine inside and out and had hand-checked the compiler's assembly code generation to make sure that it wasn't doing silly things. It didn't, because 5 years in production had wrung out all of the bugs.
    • Hey...
      We're talking important stuff here, like e-mail and P2P networks, not silly ICBM toys

      Now getting into a more serious attitude, the DOD has always done things in a way which is completely different from Corporate America and Consumer America, where 2.0 is much better than 1.5, because it has more features, nicer GFX, whatever. Ohh, and 8.0 is much better, even if there was never a version 7.0, or 6.0, or 5.0, etc.

      Do you guys think that the Marketing people at Microsoft were thinking about security when they gave the 8.0 number to the new MSN?? Unfortunately, this is a marketing world, and the best marketing almost always wins. And if they lose, the marketing people try to make it look like they won anyway !!
    • Using an obscure, outdated toolkit may get you a sense of security, but you often have a harder time hiring programmers at competitive salaries. People with obscure skills charge more. This is but one reason why many firms use the prevailing language (however it became prevailing is another issue) - because they need to aim at the widest possible hiring pool.

      Added to which, using outdated hardware is never an option in industry. You must write to your user's platform.

      • SOMEONE has to use the new stuff, in order to get the bugs worked out. Hire skilled, intelligent people who stay on top of new technology and are able to intelligently evaluate its strengths and weaknesses, as well as compensate and correct any new bugs in less-tested systems. That way you can get the benefits of new technology, and you don't need to get bitten by every bug that comes along.
  • by j_kenpo ( 571930 ) on Friday November 15, 2002 @03:27PM (#4679256)
    You know, there's something to be said for ignoring articles written in a degrading way towards their audience. It does make an interesting read if you imagine the comic book shop guy from the Simpsons as the author... worst article ever...

  • This is one of the best all-around security articles I've read in a long time. If even 10% of the world's programmers read this and take it to heart, the world will be a measurably better place.
  • If something like Windows plays any part at all in your system design, you should probably give up now. Despite being closed source, holes are discovered constantly.

    I hate to break it to this guy, but this article is basically a big rant of his personal opinions. Not that I have anything against that, but I feel anyone heeding this person's advice unerringly would be making just as big a mistake as if they didn't listen to any of his advice.

    Open-source, closed-source, it doesn't goddamn matter. The fact is, code is written by humans, and is therefore imperfect. Realize that now and save yourself a lot of time. Open-source continues to have just as many flaws in it as closed-source. How many times has the bind package been updated in recent memory? And don't start the "many eyes" thing again, we all know it and we're all tired of it, and I realize open source gets fixed faster.

    My point is, when I first got into Linux, I took a default install of Red Hat and threw it on there. I had read all sorts of advice that if I wanted a secure server, I should use *nix, so I did. Yeah... rooted. Rebuilt the box, using a way newer distro... rooted. My failing was trusting the code implicitly based on what other people said. Old versions of open source stuff are just as vulnerable as old versions of closed source stuff! And you know what? I guarantee that this will always continue to be true.

    Constant vigilance is your only safe-guard. The open-source/closed-source argument is secondary to this. If you can build, deploy and maintain a closed-source based system much easier/cheaper/faster than an open-source one, well, balance that against your security requirements.
  • by bigmouth_strikes ( 224629 ) on Friday November 15, 2002 @03:30PM (#4679275) Journal
    The article is a nice read, but it is obvious that the author has little experience in commercial software production.

    Quality and security of a commercial software product is a financial decision, not a technical one. Much like how software architecture is a strategic decision, not a technical one, which many software developers do not realize.

    When the cost of continuing to improve quality and security exceeds the income from support contracts, you have to draw the line. If you don't provide or charge for support, you draw the line when your investment exceeds your targeted income projections.

    There are software products that are secure and virtually bug-free, but you and I can't afford them. They run nuclear plants, space shuttle command centers, etc etc. Hundreds of millions of dollars have been spent on that software, and it is not a question of "the user is evil". It's about having a thorough and mature development process and organization, preferably at CMM level [cmu.edu] 5.

    So, I really don't know where the article would apply. Maybe when writing simple VB games for your website. Absolutely not when writing commercial grade software.
    • Your point is noted, but the author is speaking of the collective crappiness and the fallout that will occur.

      I just spent the last 3 weeks cleaning up crappy programming from one of my project-mates. Pick something - not closing db connections, 18 points where infinite loops could occur (!), 48 cases where error returns are ignored as if they didn't exist, and the program continues. In a program that is 60Kb of bytecode! I'm already rewriting code, and this is the first release!

      This is not a low budget, miniscule project. But still, one bad grape and the whole bunch goes. Time and time again.

      So for everyone chanting "hire experts!", count the number of truly solid programmers you know, and drum up a percentage against those you know that suck. For a while there, the industry was stretched across ALL of those people, good and bad, and dying for more techies. Do you really think that the good developers (i.e., the ones who know to slow down and get it right the first time) can take up the entire load? Do you think industry is gonna wait for these experts? Now how about CMM level 4+ rated groups versus all those developing code. Rinse, repeat.

      On a more humorous note, the budget problems would probably all disappear if it weren't for Slashdot, but I'm not exactly out to kick my habit...
  • One of the most common security bugs is a buffer overflow. BUGTRAQ often sounds like a broken record which says "buffer overflow"; obviously coding practices which prevent buffer overflows is desirable.

    For my application [maradns.org], I have made a special string library which is resistant to buffer overflows. Instead of a string being a simple pointer to a string of characters, terminated by a null, a string is a structure with the following information:

    • The current length of the string
    • The maximum possible length for the string
    • The encoding of the string
    • The length, in octets, of a single piece of data in the string
    I then make sure that any manipulations via the string library always check that we do not exceed the maximum length; I also have a three-byte cushion in every allocated string to ensure that one-byte buffer overflows do not happen.

    Some other practices:

    • Only give static strings to anything which accepts format (%s, etc.) strings.
    • Do not use signal handlers; or use them with the utmost care.
    • Do not use the system() call.

    - Sam

  • by talks_to_birds ( 2488 ) on Friday November 15, 2002 @03:31PM (#4679280) Homepage Journal
    Surf to his web site [bacarella.com], and it's just the same old self-absorbed bullshit that so many other people put up.


    Let's see...

    • I was born on August 28th 1980, 4PM in Long Island.
    • My life was pretty aimless until I broke my arm in the 8th grade, keeping me from most sports and physical activity. That's when I discovered the magic of computers. I haven't stopped poking them since.
    • My chief interest is in information science (ie, computers). This interest involves my day job, my business, and most of my recreational activity.

    Wow! Pretty exceptional, don't you think?

    'bout the only thing going for the guy is he *doesn't* have a blog...

    How the f*ck did this nonsense get put up on /. anyway?

    What changed hands to get this deal done?


    • I agree, what mindless drivel. All rant and no facts.

      It should be a crime to teach people C/C++.

      Then further into the article:
      Whenever possible, use industry standards. For example: POSIX, ANSI C, OpenGL, SQL, etc. Resist using non-standard extensions, if you must have them, keep them limited.

      I feel for his clients. Slashdot blew it on this story.

    • I think I've figured it out... Finally...

      The Slashdot crew MUST BE using a magic 8-ball to decide what stories go up, and which do not.

      That's the only explanation I've got.
    • Looks like the editors just got trolled. 'Netgraft' indeed. Perhaps this guy is a mate of the Afghan with a C64 that so intrigued Katz, speaking of which, what happened to him ?
    • A story by Michael Bacarella posted by michael... One and the same??

      Seriously though, this is one of the worst rants I have read. First of all, his claims about closed vs. open source are nonsense. He says that no one at Oracle could possibly understand the 1.2 GB codebase, and then says we should all understand every process on our own Linux box. Ummm.... And if my Linux box runs Oracle??

      We trust our OS'es to be reasonably secure, whether it is Windows or Linux or Plan9. Linux can be more secure due to its open source nature. Conversely, one could state that since Windows is the dominant OS, i.e. the one most attacked, it could evolve to be more secure than an OS that is rarely attacked.

      I recall that in the last PC Week "Hack our Honeypot" contest, the Linux system was hacked long before the WinNT box, because the software was open source and could be combed for vulnerabilities. Is open source still more secure??

      I am not trying to say that Windows is better or Linux is better, I'm just saying that when you make sweeping generalizations about design methodologies relating to product quality you deserve to be lambasted as an idiot. Open source is not inherently more secure than closed source. Period. Yes, you can review the source of open source, but who really does?? And for every package, every revision?? Most OS'es are simply too complex for one person to get his/her brain around. Same for the office suites and databases. I'm sure some Yakoff will shoot back that he understands all X million lines of Open Office, but I doubt he will be telling the truth. Most folks can't be bothered to read the EULA (this includes most engineers), but they can read the full source for sendmail??

      Also, are we talking about the OS or its applications?? Outlook virii are the by-product of Outlook, not Windows. IIS is responsible for its own security bugs. The only real Windows components I can think of with security problems outstanding are Shatter attacks on the COM subsystem (local) and the remote help exploit (easily fixed). Most of the other attacks can be avoided by having the latest patches, turning off the time service and UPnP, and not using IIS. (In Linux it's: sendmail, bind, etc...)

      OK, to qualify for the Linux Zealots out there: Linux has had more security advisories this year than Windows. (See earlier story) Many Windows "Security Vulnerabilities" require user interaction from outlook, etc.

      The author posits that we should only use code that we understand to the letter, but we only program in Perl, Ruby, etc... What a joke. I'm supposed to understand C well enough to understand the entirety of the perl interpreter, but I'm not supposed to program in it. Speaking of which, I should read the entirety of the EMACS source too because that'll be my text editor. So, I should be able to start Coding in 6-8 Months. OOPS! Sorry, Kernel update and I have to read the entire source and all of my device drivers, give me another month or two.

      Again, we trust our OS'es to be reasonably secure. Open Source, Closed Source, it's like Democrat or Republican. Some always choose one or the other, but the intelligent choose the best one (at the time) based on common sense and trust.

      One must also assume that at the ripe old age of 22 (haha) he has tons of real-world application experience. Perhaps he'll be sending letters to Apple about interface design next.... Actually, considering he has another 43 years of software drudgery ahead of him (if he manages to get a job in the industry), we should be seeing high quality software pouring off of Long Island for the rest of my foreseeable life. Hooray!

      Perhaps his life was aimless until 22 when he broke his head and discovered the magic of stupidity.

      And all that nonsense about the end of civilization.... Takes our job too seriously, don't he. Yes dear, poor software design will be the end of civilization. Let's just ignore the fact that civilization A: existed before software and computers, B: continues even in the midst of all of this bad software.

      I could just keep going on this. But I will finish by saying, "How the f*ck did this nonsense get put up on /. anyway? "
  • Is written by someone without any relevant experience in the field. Someone who has not put down any specific examples / case studies to support his case. He makes a point that he has not proved, and we are supposed to argue about his unfounded and unproven theorem?

    Yet, his article appears on the front page of /., the very "home" of the people he offends. To quote Michael:

    It is well worth the time to read it

    No it is definitely not.

  • Let the flames begin!

    Honestly, I applaud open source. I think it can be quite a boon to the rest of the world. However, I've definitely seen enough public code that looked like it was written by a wannabe compsci major. It's nice to see this topic discussed. Open source is a powerful tool, but without good management and high coding standards it's just broken source.
    • However, I've definitely seen enough public code that looked like it was written by a wannabe compsci major.

      That may be true, but proprietary software I've experienced on a collective whole has been, far far worse.

      At least with open source you can actually look at it and say that it's trash that's waiting to fall to pieces, no?

  • I'd like to see someone, just once, say "He's right. I'm a bad programmer and I do these things sometimes. My co-workers are much more competent than I am and maybe that's why, because their code is over my head."

    But noooo. It's always "the other guy" and "the place I used to work", etc... Bah.
  • by Anonvmous Coward ( 589068 ) on Friday November 15, 2002 @03:59PM (#4679477)
    ... Isn't it great how book titles are getting worse and worse about calling their customers names? How long before we have "Total Retarded Dumbfuck's Guide to the Blindingly Obvious"?
  • Guess what, every choice every entity makes regarding anything is a compromise. Security is no exception and so long as it costs money and takes time, it will never be THE ONLY FACTOR as this guy thinks it should be.

    What does he expect? One security expert per I.T. staff to watch over their shoulder and make sure they never do anything insecure? Maybe we should train everyone on the planet as a security expert, and dedicate 100% of every available resource we have to securing software.

    I understand what he's saying, but give it a rest. We take chances all the time and adjust according to the outcome.
  • by Pyromage ( 19360 ) on Friday November 15, 2002 @04:21PM (#4679647) Homepage

    Bull fucking shit.

    It should be a crime to *start* students on a protected environment like Java. Programmers who start on Java begin with less understanding of what's going on, because it sweeps too much complexity under the carpet.

    I realize this argument was made for assembler when C was introduced. BUT! There was a massive shift between assembler and C, which is why that argument is not valid.

    C and Java both have pointers/references. They both have functions, etc. But Java's references are hidden from the user, and most students don't have a clue about a reference.

    Asm. vs. C was a big difference, but Java and C++ share so much, but Java sweeps all that complexity under the carpet. If a programmer who's only used Java gets into a C++ project, he'll fsck it up so fast it'll make your head spin.

    It should be a crime to teach Java as a beginners language. It's not a bad language, but under no circumstances could it conceivably be considered a beginner's tool.
  • Many have already commented on the claim of supposed security of not using C/C++. So following his "logic" - you shouldn't increase the length of code by 4-8 times by using C++ (my paraphrase) -- but you should write all of your own code?

    avoid third party code whenever possible. Take the time and reinvent the damn wheel.

    Sorry, but if I agree that one person cannot make Oracle (by this I assume he means the database) secure - then wouldn't multiple people on the project at least help? Maybe they can see the things that I cannot see? AFAIK, the more people who can find flaws in software, the quicker it gets more secure (as in nothing will ever be completely secure).

  • Make up your mind? (Score:2, Insightful)

    by kelcey ( 626524 )
    Earlier in this article, we see the inflammatory statement: It should be a crime to teach people C/C++. But later in the very same article: Whenever possible, use industry standards. For example: POSIX, ANSI C, OpenGL, SQL, etc.

    Beg pardon? If we use C or C++ (industry standards, I believe), should we then be imprisoned or lauded? Or is it just possible that the author is getting a little carried away with himself?

    "Programming languages without mandated security don't break systems. Bad code written in programming languages without mandated security breaks systems."

    Kelcey
  • I wonder if the image links on the page, eg:


    are intentionally broken to show how easy it is to screw up.

    I dunno.
  • C vs C++ (Score:3, Informative)

    by magi ( 91730 ) on Friday November 15, 2002 @04:56PM (#4679929) Homepage Journal
    Article says: "It should be a crime to teach people C/C++."

    I've been wondering if there's much difference between C and C++ in security. C seems to be the most used language for system and server programming nowadays, especially in Open Source projects.

    C++ has many features that forgive your mistakes. With proper string, buffer, and other basic data type classes, your bounds are always checked, so there can't be buffer overflows, which seem to be the most common source of problems. In addition, automatic destruction of objects eases memory leaks.

    You can, of course, do all the same things in C, but it's always syntactically more complex than in C++. You need to learn dozens of different coding rules just to avoid trivial problems. Often you forget to apply them; each time you create a risk.

    For example, just today I noticed a dangerous situation when I initialized a callback function table with:
    somestructtype myfuncs = {myFuncA, myFuncB};
    While this works quite nicely, it's secure only if the struct always contains the two items. If a new item is added to the struct, all uses of the structure would have to be updated, but the compiler might not warn about this situation. In this case, the result would probably be a program crash. A more secure way would be:
    somestructtype myfuncs;
    memset (&myfuncs, 0, sizeof (myfuncs));
    myfuncs.func1 = myFuncA;
    myfuncs.func2 = myFuncB;
    This is much safer. However, in C++ this problem simply wouldn't exist, because structs are typically never used and classes have constructors that always initialize them properly, so the user doesn't have to care so much about possible changes in the classes.

    This is just one example. There are plenty more.

    On the other hand, objects are more often allocated from the heap in C++ rather than the stack. Memory might therefore fragment more easily in C++ than in C.
  • by dstone ( 191334 ) on Friday November 15, 2002 @05:29PM (#4680200) Homepage
    From the article: "Considering that most good programmers are pretty bad at security,"

    I don't necessarily accept this assumption. Most good programmers are good at coding up the design and requirements they've been given. The customer/architect/business analyst/technical lead needs to identify security requirements before they can be coded. It's very expensive to leave identifying security requirements to programmers. Not every project has the same needs. Sure, the programmer could guess. But each programmer on the project would end up spending a different amount of time and money on the security aspect if it's not clearly prioritized.

    Likewise, if security requirements are not specified well enough, a security test-plan cannot be written or executed. If you need security, ensure it's somebody's explicit JOB on the project to ensure security gets into the design & QA.

    Security costs money before a single line of code is written. Decide how much you need, where it's to be applied, and ensure it becomes a critical requirement through coding and testing. You can't expect security to just "happen" simply by hiring some "good programmers" as the author says.
  • by ydrol ( 626558 ) on Friday November 15, 2002 @06:04PM (#4680928)
    It seems that security-aware coding is moving towards a situation akin to the bean counters who decide whether to recall a certain model of car... People didn't set out to write insecure code, but usually they have a set of requirements to meet in order to get paid. Apart from a few industries where large sums of money or human life are directly involved, the rule is: meet the requirements ASAP and get paid. Even "closed source" development projects have Quality Assurance processes where some dude checks your code (whether they know what they are looking for is another issue). But particularly with bespoke code, people write according to a set of requirements: "I want it to do this, I want it to do that." If it doesn't, I can sue/refund/get free upgrades; if it gets hacked by some snotty-nosed kid, tough, that kid wasn't in your requirements.

    Security is not easily specified as a requirement and is hard to insure against (financially), so pretty soon you will see the emergence of "security support contracts". This is the direction Micro$oft is going in (sustainable revenue is good for any business). Yes, there is a wide range of programmers with varying abilities, but (apart from open source products) certain companies have realized they can and will charge big bucks for more security-oriented support contracts, so what do they care? For non-open-source companies, lack of security/defensive programming has changed from being a liability to a profit generator. Either they'll make a lot of money or open source will prevail.

    Also expect a lot of specialist code review/certification/QA companies to pop up: "This product is independently DeadBolt certified and hence costs $30 more + $30 a year for the latest security upgrades...." (multiply those figures as appropriate!)

"No, no, I don't mind being called the smartest man in the world. I just wish it wasn't this one." -- Adrian Veidt/Ozymandias, WATCHMEN