Security Programming IT Technology

Blame Bad Security on Sloppy Programming 592

CowboyRobot writes "ACM Queue has an article that blames security flaws on poor programming, rather than any inherent problems with particular languages. From the article: 'Remember Ada? ... we tried getting everyone to switch to a 'sandboxed' environment with Java in the late 1990s... Java worked so well, Microsoft responded with ActiveX, which bypasses security entirely by making it easy to blame the user for authorizing bad code to execute.'"
  • Uhh.. (Score:5, Insightful)

    by cbrocious ( 764766 ) on Monday June 28, 2004 @12:42PM (#9552360) Homepage
    Does anyone feel that this is just publicizing what every GOOD developer has been saying for the last 10-15 years?
    • Re:Uhh.. (Score:5, Insightful)

      by strictnein ( 318940 ) * <strictfoo-slashdot AT yahoo DOT com> on Monday June 28, 2004 @12:44PM (#9552382) Homepage Journal
      Yeah, no shit... This is news? Bad programming = security issues. Wow... we learn something new every day on Slashdot.

      Here's a tip, editor boys: if group A says statement A and you post it as a news item, great. But when groups B, C, D, E, F, G, and H all say the same statement A, it's not news. It's redundant (remember that modifier you put in? -1 Redundant? That's what it is).
      • Re:Uhh.. (Score:4, Insightful)

        by Cruciform ( 42896 ) on Monday June 28, 2004 @12:59PM (#9552553) Homepage
        Redundant it may be, but how many people were going to read the article, or those like it, before it got put on Slashdot?

        Now that it's up there, reporters cruising for an easy story for one of the big news providers will take it, rewrite it at a 5th grade reading level (that's the bar for tabloid news, I shit you not), and turn it into something that the average consumer may read and understand (albeit poorly, because the meaning will get twisted along the way).

        But if it gets more uninformed consumers to say "Hey, why are we paying for poorly designed crap?" then it's done its job, and you're out nothing but the time it took to reply to the parent.
        • Re:Uhh.. (Score:5, Funny)

          by aardvarkjoe ( 156801 ) on Monday June 28, 2004 @01:11PM (#9552707)
          Redundant it may be, but how many people were going to read the article, or those like it, before it got put on Slashdot?
          Are you implying that anyone's going to read it now that it's been posted on Slashdot?
      • Had you read the article, which is now /.'ed, you would know the author was arguing against the hardware people putting buffer-overflow checks in hardware for what is essentially a software design problem.

        He was saying that warnings, etc. need to require the programmer to fix broken code (i.e., gets() vs. fgets()). Like code optimization, security needs to be a low-level function of the runtime/compiler.

        It is a different take on an old problem. If you don't like it, simply don't read it. However, since you waste time posting redundant comments...
        • And it's a great idea, too! Let's extrapolate it, and have gun manufacturers asking gun owners to get safeties for their fingers. After all, it's not the gun's problem if you've got shaky hands! It's not the car's problem if the road is icy and people don't know how to pump the brakes...let's get rid of ABS!

          Seriously... if after so many years and so many exploits programmers still haven't got it, they probably never will -- they'll just ignore the warning or throw in lip service methods (the way some J
          • by bee-yotch ( 323219 ) on Monday June 28, 2004 @02:11PM (#9553360) Homepage
            Let's extrapolate it, and have gun manufacturers asking gun owners to get safeties for their fingers.

            Actually, it would be more like gun manufacturers telling bullet manufacturers not to make defective bullets that will blow up in the gun so that the gun manufacturers don't have to add safety against defective bullets.

            they'll just ignore the warning or throw in lip service methods

            At least they'll get a warning.
      • Re:Uhh.. (Score:5, Insightful)

        by ackthpt ( 218170 ) * on Monday June 28, 2004 @01:13PM (#9552722) Homepage Journal
        Yeah, no shit... This is news? Bad programming = security issues. Wow... we learn something new every day on Slashdot.

        Here's a tip, editor boys: if group A says statement A and you post it as a news item, great. But when groups B, C, D, E, F, G, and H all say the same statement A, it's not news. It's redundant (remember that modifier you put in? -1 Redundant? That's what it is).

        Here's a clue: Not everyone started programming at the same time, back in the enlightened age of limited resources and cautious programming. When I saw some jerk writing login spoofs on a PDP-11 back in the early '80s, I worked out a few ways to spot these running and suspend them (and to pass information on to the Campus Police to have the perpetrator evicted from the grounds). People are still learning to program, and it's not uncommon for them to take idiot-proofing for granted, unless one of two things took place: 1) they had a good instructor who warned them of the consequences of untrapped errors, or 2) there's a directive where they work which they must follow. I expect even Microsoft must be able to backtrack to the person who wrote leaky code. Another problem is two or more departments whose products must interface, but which pass the buck on who is responsible for trapping errors; that role should be filled by a management group responsible for the work between groups.

        Microsoft responded with ActiveX, which bypasses security entirely by making it easy to blame the user for authorizing bad code to execute.

        When's the tenth anniversary of the Win95 bug which allowed people to hack Quicken?

        • Re:Uhh.. (Score:4, Insightful)

          by e2d2 ( 115622 ) on Monday June 28, 2004 @01:36PM (#9552979)
          Of course you should try to learn how to write the most secure code possible, but the developer is only a link in the chain of quality and security. It comes from the top, with management demanding higher quality AND at the same time giving the developers and testers the proper time and tools to succeed in that mission. The market told them that mediocre was acceptable, and this lesson has been hard to unlearn. Only when the market demands it will it change. Until then we will continue to see bugs and their inherent bad security.
          • Re:Uhh.. (Score:5, Insightful)

            by jc42 ( 318812 ) on Monday June 28, 2004 @03:18PM (#9554104) Homepage Journal
            Of course you should try to learn how to write the most secure code possible ...

            This sounds nice, but there's a serious problem: There is a widespread attitude in the security community that the details of security holes should be kept secret from programmers. They're worried about those evil hackers exploiting the holes, and there is reason to worry. But when they keep such things secret, the major effect is to keep programmers ignorant of how they might be making mistakes.

            If you combine this "keep the programmers ignorant of the details" practice with the widespread "don't bother the readers with nerdy stuff that'll be over their heads", the result is the security disaster we now have in some parts of the industry.

            As a programmer, I'd like to learn how to write the most secure code possible. But when I try to read about it, I usually find myself reading text that is frustratingly vague on exactly how something might go wrong. If I could learn the details, I could usually write (meta-)code that would check my own code for those problems. But I can't do that if all I have is a vague "don't do things wrong" sort of statement.

            So, yes, sloppy programming is part of the problem. But keeping your programmers ignorant is also a major part of the problem. Don't feed us vague, feel-good commands to write secure code. Tell us exactly how things have been screwed up in the past. Then maybe we can figure out how not to do it again in the future.

            • Re:Uhh.. (Score:4, Insightful)

              by llefler ( 184847 ) on Monday June 28, 2004 @04:19PM (#9554738)
              This sounds nice, but there's a serious problem: There is a widespread attitude in the security community that the details of security holes should be kept secret from programmers. They're worried about those evil hackers exploiting the holes, and there is reason to worry. But when they keep such things secret, the major effect is to keep programmers ignorant of how they might be making mistakes.

              You shouldn't need the security community to tell you about the issues addressed in this article. Basically all he is saying is: have the compiler give warning messages for all unsafe practices. (If you learn to keep your arrays safe, there are no exploits for the security community to find.) And there is no such thing as an OK warning. Just look at his gets/fgets example.

              Personally I'd like to see people keep writing these kinds of articles until everyone gets it; because a lot of programmers don't.

              BTW, I read the summary and was all ready to disagree with the premise that it's lazy programmers and not a language issue. Then he explained that programmers are lazy because we haven't fixed the compilers so that we don't have to worry about these problems. But I'm still leaning towards 'C is evil'.

      • Re:Uhh.. (Score:5, Insightful)

        by bay43270 ( 267213 ) on Monday June 28, 2004 @01:16PM (#9552764) Homepage
        First of all, if it weren't worth talking about, there wouldn't be so many comments here.

        Although I can't read the article (/.ed already), I won't let that stop me from disagreeing with the premise. While ignorant developers may have directly caused more individual security problems, the long-term solution isn't to blame the programmers and consider the issue solved. It's a lot more realistic and efficient to fix the programming tools than the programmers (even if the tools can already be used securely).

        Saying that security issues will go away by educating developers is like saying America's obesity problem will be solved by telling all fat people to work out. It's just not practical or constructive on a large scale. (On a small scale, of course, a developer can educate himself, just as an obese person can lose weight -- with hard work and dedication.)

        So rather than criticizing our co-workers (who probably don't care what we think anyway), why don't we identify ways to isolate these people from situations where they could cause harm?
        • by khasim ( 1285 ) <brandioch.conner@gmail.com> on Monday June 28, 2004 @01:58PM (#9553230)
          It isn't the coders.

          I believe it is management's fault for insisting on designs that are insecure (like ActiveX).

          You can have coders who KNOW the correct way.

          You can give them tools that help them secure their code.

          But those only work if management focuses on secure code instead of "user friendly" features.

          And management will only focus on security when the buyers focus on security.

          I think we're seeing that now. More people are accepting that Linux is more secure than Windows, so Microsoft has to start focusing on security in an attempt to narrow that perception.
        • by zogger ( 617870 ) on Monday June 28, 2004 @02:16PM (#9553420) Homepage Journal
          ...and there he said it was (paraphrasing here) common for programmers to sorta ignore error flags and just code out the warnings about memory leaks and arcane whatnot like that, like that made the problem "fixed". No warnings, no problems! On to the next project.....

          probably more stuff too, that's all I read though.....

          Not a coder here, so I have *no* idea if this is common or not, or true or not, but I *have* noticed on slashdot NO ONE writes bad code, or has written bad code, or thought about bad code, and *everyone* has personally corrected every other coder they ever met on their code, and no one has ever had a boss who knew what he was doing or could read so much as a grocery list without speaking the big words out loud, and only the *other guys* someplace else write bad code, and they always use the wrong language and editor to boot, like on bizarro dotslash forum or something. It's ALL "their" fault that there's ANY of this alleged "bad code" that causes buffer overflows and like acne and flat tires and girls who say no... Them dang guys "over there", buncha no-good slackers... let's hang 'em!
          • by clamatius ( 78862 ) on Monday June 28, 2004 @03:43PM (#9554347) Homepage
            Ok, it's time for me to own up. I'm the one creating all the bugs you're talking about.

            Acne? Bug in face.cpp.
            Flat tires? You guessed it, tire.cpp, line 5572.
            Girls who say no? That's not a bug, it's a feature.
          • by I8TheWorm ( 645702 ) on Monday June 28, 2004 @03:55PM (#9554507) Journal
            Being a coder, I'll own up to having written bad code in the past. I even tried justifying it at times with "but I had a deadline" or "I tried to plead my case but management wouldn't listen" or other drivel.

            These days, I simply won't take work from people who demand I write code their way, or who impose unreasonable deadlines. Even in the programming decline since 2001 (it's bounced back well this year), I refuse to compromise my work because of someone else's ideas/deadlines/etc., because the end result is a reflection on me.

            I like to think most programmers, early on at least, went through the same thing, but I could be wrong. It had nothing at all to do with having to build experience before I knew anything about the necessity of writing apps with security in mind. Rather, it had to do with needing work and compromising my own principles to gain employment.
    • Re:Uhh.. (Score:5, Insightful)

      by Short Circuit ( 52384 ) <mikemol@gmail.com> on Monday June 28, 2004 @12:48PM (#9552426) Homepage Journal
      Unfortunately, unless someone as big as Microsoft (ha!) or IBM gets behind the message, you're not going to see much come of it.

      It's too cheap to quickly pump out code, then run it by QA. You don't even need a shoddy programmer to do it...just pile too many high-priority near-deadline tasks on a good programmer. (Which is all too likely...if you build a reputation for getting things done, you'll get landed with a workload that would put a tech-support guy in a funny farm.)
      • Re:Uhh.. (Score:3, Insightful)

        by doinky ( 633328 )
        If Microsoft hadn't killed OS competition, IBM _would_ be doing this today. OS/2 had a far more secure infrastructure than Windows did (at the time, of course, the main concern was the ability of a bad app to screw with the system; but one could easily imagine today's OS/2 doing a better job against things like internet exploits).
        • Re:Uhh.. (Score:3, Funny)

          Fact is, if it's an operating system written by anyone but Microsoft, it's more secure than Microsoft's. If you are bored, we can make up some reasons as to why this is so. (sarcasm implied, but hell, I know y'all will miss it)
      • Re:Uhh.. (Score:5, Insightful)

        by C.Batt ( 715986 ) on Monday June 28, 2004 @01:12PM (#9552708) Homepage Journal
        As one of those "good" programmers with a reputation for getting things done, I must concur with your statement. In fact I've observed that the first thing cut from most project budgets, if it's even included in the first place, seems to be adequate technical QA. There's lots of emphasis on meeting business requirements/application feature goals, but very little on engineering quality under the hood.

        Part of the problem is that enforcing best practices and doing technical QA is both time-consuming and expensive, not to mention boring as all heck. So there isn't much motivation to do it. Bad, bad attitude, and we're paying the price.
        • BINGO! (Score:4, Insightful)

          by sterno ( 16320 ) on Monday June 28, 2004 @01:51PM (#9553155) Homepage
          The problem is that QA and development of good specifications prior to a project have a huge impact on the quality of the product that results. Having said that, QA and specifications are never seen directly by the outside world.

          Most programmers I know WANT to write good code but have the odds stacked against them. They aren't given the time and resources to do the job well. When it's crunch time, security and quality are the first things to go because they are less likely to get canned over a bug than over a completely missing feature.
          • Re:BINGO! (Score:5, Insightful)

            by GCP ( 122438 ) on Monday June 28, 2004 @02:36PM (#9553675)
            That's where I think the author completely failed to make his case that changing programming languages is not a solution.

            People who program in C/C++ are vulnerable to all of the security risks Java and C# programmers are vulnerable to, plus quite a few more that Java and C# programmers are NOT vulnerable to.

            So, if you have a program that could reasonably be written in either Java or C++, and you choose C++, you've just increased the number of security vulnerabilities you'll have to check for. Given the same development deadlines, but with more areas to check, you're going to be handicapped from a security perspective if you choose C++.

            Then add to that the fact that almost everybody with equivalent experience is more productive implementing a feature in Java or C# than in C++. With the same deadline pressures, you have even less time available for security checking, on top of more things to check, if you work in C++.

            Of course there are some tasks for which C or C++ is still the best choice for other reasons, so I still use both frequently and applaud any attempt at adding better security scanning to the compiler.

            I can't help thinking, though, that even in those cases a language with the granularity of C, but with built-in strings (UTF-8), arrays that are checked by default but with an override, fixed built-in data types (e.g., a 'byte' type that isn't signed in some places and unsigned in others), and yet without all of the massive baggage of C++, would go a long way toward improving C's bug-proneness without removing its power.

            Unfortunately, most developers value such things as security, globalization, and, frankly, reliability so little, resist change so much, and are so arrogant about their l33t ski11z (which would only be impeded by "guard rails") that a language offering only these improvements on top of C would never put a dent in C's popularity.

            And to that extent only I agree with his thesis that bad programmers are the root of the problem.

      • Re:Uhh.. (Score:5, Insightful)

        by Unnngh! ( 731758 ) on Monday June 28, 2004 @01:13PM (#9552724)
        Yep. In my experience, the good developers get way more work, and their quality goes to sh*t b/c they're under pressure and generally unhappy with their jobs at that point. The poor developers get lower-priority tasks to work on.

        Being in QA, however, I can honestly say that all the testing you can do on a poorly developed product results only in a poorly developed product with fewer bugs. There is just no way to catch all the bugs in a real POS piece of software when the entire framework is jacked. Not that you can ever catch *all* the bugs, but there's a point at which everyone pretty much agrees that something is good to ship... this usually never happens with crap; crap just ships.

      • They have (Score:3, Informative)

        by bonch ( 38532 )
        Unfortunately, unless someone as big as Microsoft (ha!) or IBM gets behind the message, you're not going to see much come of it.

        They have--see C# and .NET. Longhorn will be entirely .NET-based.
        • Re:They have (Score:3, Insightful)

          by rgmoore ( 133276 ) *

          They have--see C# and .NET. Longhorn will be entirely .NET-based.

          Which doesn't really address the underlying issue. Yes, managed languages and environments like C# and .NET are essentially immune to some classes of exploits that cause problems in C and C++. That doesn't mean that they're completely secure, though; there are still plenty of classes of security holes that apply to managed languages. You can bet that bad programmers will find plenty of them.

          • Re:They have (Score:4, Insightful)

            by GCP ( 122438 ) on Monday June 28, 2004 @02:03PM (#9553289)
            Yes, it DOES address the underlying issue. Just because part of the problem remains doesn't mean that the problem hasn't been addressed at all.

            If you stop using C/C++ by default and use safer languages such as Java or C#, your code will become more secure. The fact that it still isn't 100% secure doesn't mean you've made no progress. And with fewer vulnerabilities, you can pay more attention to the types of vulnerabilities that remain.

            • Re:They have (Score:5, Informative)

              by johnnyb ( 4816 ) <jonathan@bartlettpublishing.com> on Monday June 28, 2004 @02:52PM (#9553837) Homepage
              Actually, you can continue to use C/C++ and just use a garbage collector [hp.com] with them. I don't know why more people don't do this. You don't even need to change your code, as Boehm's garbage collector translates malloc() to its own allocation routine, and free() does nothing.

              In fact, even better, if you have Boehm GC installed anywhere on your system you can do this for already compiled programs using LD_PRELOAD.

              Just do:

              export LD_PRELOAD=/path/to/libgc.so
              /path/to/program

              and you're automagically using a garbage-collected runtime for the program, even if it was compiled to use the standard malloc()/free() calls.
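              As a quick illustration (a hypothetical toy program, not from the parent or the article): built and run normally, this leaks roughly a gigabyte; run with the LD_PRELOAD line above, the Boehm collector reclaims each block once it becomes unreachable.

              #include <stdlib.h>
              #include <string.h>

              int main(void)
              {
                  int i;
                  for (i = 0; i < 1000000; i++) {
                      char *p = malloc(1024);   /* never freed */
                      if (p != NULL)
                          memset(p, 'x', 1024);
                  }
                  return 0;
              }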
    • Re:Uhh.. (Score:5, Insightful)

      by kfg ( 145172 ) on Monday June 28, 2004 @01:03PM (#9552602)
      Don't play with matches. Don't run with scissors. If you push it hard enough, it will fall over.

      Some things you just have to keep saying over and over. People are dense, and by the time one group gets it there's a whole new litter coming up from behind.

      You, for instance, who think we've only been saying that for 10-15 years, whereas, in reality, 10-15 years ago you heard it from someone who'd already been saying it for 10-15 years.

      Now it's your turn to smack your forehead and say "Oy".

      KFG
  • by SilentChris ( 452960 ) on Monday June 28, 2004 @12:44PM (#9552373) Homepage
    "Microsoft responded with ActiveX, which bypasses security entirely by making it easy to blame the user for authorizing bad code to execute"

    Uh, not quite. ActiveX was more a response to JavaScript/Flash/et al. -- anything that created a lightweight web app. .NET is their response to Java (and, for all intents and purposes, .NET is miles ahead of anything MS has ever created in terms of security).
    • by tcopeland ( 32225 ) * <.tom. .at. .thomasleecopeland.com.> on Monday June 28, 2004 @12:49PM (#9552432) Homepage
      > ActiveX was more a response
      > to JavaScript/Flash/et al.

      Right on... I thought the "ActiveX was a response to Java" was a bit of a stretch too. Also, the author says

      > "everyone complained about wanting to
      > bypass the "sandbox" to get file-level
      > access to the local host.".

      I'm not sure that was why applets were not a big hit... I'd blame the slow JVM startup time for that one.
      • by RAMMS+EIN ( 578166 )
        ``I'm not sure that was why applets were not a big hit... I'd blame the slow JVM startup time for that one.''

        There is that, and there are the various incompatibilities. Microsoft's VM is not going to run your code, unless you specifically write it to work on it. For other code, you'll typically need a fairly recent VM, which means a hefty download if yours is not up to date. Many users are not willing to invest so much time just to see your sucky applet - and most of them are sucky, compared to real native
    • by StephenLegge ( 558177 ) on Monday June 28, 2004 @12:56PM (#9552524)
      I think the writer meant ActiveX was Microsoft's response to Java *Applets*.

      Java Applets had a well-defined and flexible security API that provided a fine-grained set of privileges governing what an Applet could do on the user's system.

      To combat Applets, Microsoft implemented ActiveX with a brain-dead all-or-nothing approach that is still used today ("Do you want to trust whoever wrote this to do anything they want to your system? Yes / No"). Then Microsoft forced Java Applets to work the same brain-dead all-or-nothing way in IE.

      SLL
    • by jeffy124 ( 453342 ) on Monday June 28, 2004 @01:01PM (#9552592) Homepage Journal
      .NET is miles ahead of anything MS has ever created in terms of security

      I'm a little hesitant about that. A year or so ago (when I was still in college), an MS guy came to campus to give a demo on .NET, which included a survey of the security features. A classic MS fallacy has happened here too: The Feature Creep. It is so overloaded with features that I felt it was unusable and difficult to understand.

      My suspicions were confirmed when the guy couldn't get a Demo to work correctly. His demo program, written in C# and provided to him by MS, was supposed to deny him access in scenario #1, and it worked correctly. But he couldn't retool the program to get scenario #2 right, where access was to be granted. No matter what the guy did, he kept getting denied access. Makes me wonder whether scenario #1 was actually correct or provided the expected denial for a different reason than intended. Oddly enough, the same guy had it working at a conference of local ACM chapters a week earlier.

      My take on something like this: Yeah, you could get the configuration right when setting up your security model. But if it's this easy to get it wrong such that the program is unusable, then it's just as easy to get it wrong while still being usable.
    • by Tassach ( 137772 ) on Monday June 28, 2004 @01:08PM (#9552657)
      Almost, but not quite

      ActiveX was MS's answer to Java Applets. Flash is another applet alternative.
      .Net is MS's answer to J2EE.

      J2EE and Java Applets, despite being written in the same language, have very little to do with one another.

    • by Decaff ( 42676 ) on Monday June 28, 2004 @02:49PM (#9553796)
      Uh, not quite. ActiveX was more a response to JavaScript/Flash/et al. Anything that created a lightweight web app.

      Uh, yes quite! ActiveX was exactly and precisely a response to Java -- Java Applets, that is. The idea was to run embedded binaries in a web browser. ActiveX components would run only on Windows, so they could use the Windows APIs and not require a plug-in or pre-installed VM, like a Java Applet. ActiveX 'security' was by digital authentication, as against Java's sandbox. ActiveX is not related at all to client-side scripting, as with JavaScript. Microsoft's response to JavaScript was -- to support JavaScript.

      .NET is their response to Java (and, for all intents and purposes, .NET is miles ahead of anything MS has ever created in terms of security).

      .Net is what MS developed when they decided not to support Java client-side. It's pretty good in terms of security, but still has weaknesses when compared to Java.
  • The human factor (Score:5, Insightful)

    by SIGALRM ( 784769 ) * on Monday June 28, 2004 @12:44PM (#9552377) Journal
    Anything we do to improve software security must work without the programmer having to switch languages

    I agree; it's not so much the language--or the tools--as it is that each developer on a project must be personally aware of vulnerabilities and exploits. Using "managed code" does not "secure" your projects. These days, a C programmer ignoring the dangers of gets(), for example, is incompetent and should not be trusted. It's not, as the article reads, "sloppy"... it's ignorance pure and simple.

    Also, relying on tools like an updated gcc, gprof, or splint [splint.org]--helpful as they are--without experience and education in writing secure code is asking for trouble too.
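    To make the gets() point concrete, here is a minimal sketch (mine, not the article's). gets() has no way to learn the buffer size, so long input overflows it; fgets() takes the size and truncates instead:

    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        char buf[64];

        /* gets(buf) would write past the end of buf on input longer
           than 63 characters -- there is no way to tell it the size. */

        if (fgets(buf, sizeof buf, stdin) != NULL) {
            buf[strcspn(buf, "\n")] = '\0';   /* strip the trailing newline */
            printf("Read: %s\n", buf);
        }
        return 0;
    }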
    • by Short Circuit ( 52384 ) <mikemol@gmail.com> on Monday June 28, 2004 @12:52PM (#9552464) Homepage Journal
      ...it's ignorance pure and simple.

      No, it's not. You try being a programmer with a six-digit salary, a mortgage, and a workload Hercules couldn't metaphorically shoulder.

      Fast, good, cheap. Companies have chosen to drop "good" in favor of fitting more products through the pipeline.
      • by surreal-maitland ( 711954 ) on Monday June 28, 2004 @01:01PM (#9552582) Journal
        you could argue, though, that 'good' saves you time in the long run because you don't have to patch and patch and patch and eventually scrap it and redesign.

        ignorance isn't always the fault of the programmer, but if he doesn't have the knowledge, ignorance is still the problem.

        • by leerpm ( 570963 ) on Monday June 28, 2004 @01:11PM (#9552702)
          you could argue, though, that 'good' saves you time in the long run because you don't have to patch and patch and patch and eventually scrap it and redesign.

          Try arguing that to the CEO who is watching his market share drop 25% to his competitors because his development team needs 2 extra months to ensure the security is top-notch. The reality is that until the market and customers start demanding that security be a priority, there isn't going to be much of a change from the status quo.

          That is part of the reason why Microsoft is so successful, they listen to what the customers want. Up until now their customers wanted features, features, and more features. Now their customers have started to realize that security can have a significant impact on their bottom line. So they are wising up to the situation and demanding that software vendors (not just Microsoft) start making security a priority too.
      • I think I see your problem. You should become Atlas instead of Hercules! I hear he can bear more on his shoulders...
    • absolutely.

      i think a major part of the problem is that security is not an idea which is ingrained in young programmers from the start. i believe this is because teachers don't want to overwhelm students who are learning a whole new set of ideas already, but it's critical that security be something that is kept in mind at all times when programming. i mean, nobody ever *means* to introduce security bugs. there are a few simple techniques which take only a little more time and can save you a lot of heartache

    • by Tassach ( 137772 )
      Using "managed code" does not "secure" your projects.
      No, and it's not supposed to. What it does do is make it EASIER to write secure code by eliminating a very common source of security bugs. This allows you to concentrate on the big picture rather than having to waste time micromanaging the code.
  • by tcopeland ( 32225 ) * <.tom. .at. .thomasleecopeland.com.> on Monday June 28, 2004 @12:45PM (#9552390) Homepage
    > ...if you can just shoot the messenger?

    So true. Thus the logo for PMD [sf.net], a Java static analysis tool - "don't shoot the messenger".
  • by Anonymous Coward on Monday June 28, 2004 @12:47PM (#9552408)
    I blame bad security on the Speak'n'Spell keyboards we have to use in this office.
  • by mratitude ( 782540 ) on Monday June 28, 2004 @12:47PM (#9552411) Journal
    I remember the bad ol' days when security was a matter of what you did or didn't do, rather than of what was occurring without your knowledge!

    Abstracting the user from programmatic events wasn't supposed to make your use of the computer a crap-shoot.
  • It is time (Score:5, Interesting)

    by roman_mir ( 125474 ) on Monday June 28, 2004 @12:47PM (#9552413) Homepage Journal
    It is time to create an official Engineering Certification for software designers/developers, the certified Engineer will have to be financially responsible (insurance etc.) for their creations.

    I would like to see that happen, anyone else?

    • Fuck no. (Score:5, Insightful)

      by Mongoose Disciple ( 722373 ) on Monday June 28, 2004 @12:52PM (#9552459)
      Are you crazy?

      Anyone who's worked on a software project of any size (especially in terms of number of people on the project) can tell you that the person who takes the official blame for a development flaw is almost never the person actually responsible for it.

      Maybe if we had a programmers union and I could strike if I was ever asked to implement bad design or put out someone else's fire... maybe. But as things stand? You'd drive a lot of good developers out of the field because they're not skilled enough at office politicking to avoid being made scapegoats for the messes of others, and can't afford to bear the direct financial burden of it.

    • Re:It is time (Score:3, Informative)

      by gront ( 594175 )
      They already have that; it's the whole FE/EIT/PE type deal.


      Not real popular with EEs, unless you are in power systems. I've never heard of a software engineer with a PE stamp, but the system is in place.



      http://www.ncees.org/licensure/licensure_for_engineers/

      • by Raul654 ( 453029 ) on Monday June 28, 2004 @01:02PM (#9552597) Homepage
        Speaking as a computer engineer who passed the FE (on my first try) - the FE is most definitely biased in favor of civil and mechanical engineers, and against electrical and chemical. That being said, there's really very little incentive for EEs to take it. The only things you need it for are government work or testifying in court.

        However, it really gets under my skin when people call themselves "engineers" when they have *no clue* about engineering in general. In Texas, they had a school collapse and kill 100 children because the guy who designed it wasn't a real engineer. As a result, they passed the toughest engineering-standards legislation in the country: if you call yourself an engineer and you are not certified (that is, you have not passed the PE), then you go to jail.
    • Re:It is time (Score:3, Insightful)

      by enjo13 ( 444114 )
      Yes, that would be great. Open source would cease to exist overnight, for one.

      We could all be just like doctors and spend half of our salaries paying for malpractice insurance. That's AWESOME.
    • Re:It is time (Score:3, Insightful)

      here in the UK the BCS [bcs.org] in conjunction with the Engineering Council [engc.org.uk] do accreditation for Chartered Engineer (CEng) status, and the new Chartered IT Professional (CITP) too.
  • Well duh/ (Score:5, Insightful)

    by grub ( 11606 ) <slashdot@grub.net> on Monday June 28, 2004 @12:47PM (#9552416) Homepage Journal

    That's why OpenBSD's continuous code auditing makes for good security. Everything but the kitchen sink != better.
    That all said, a sandbox environment allows the developer to make sloppy mistakes, not program better.
    • Re:Well duh/ (Score:4, Insightful)

      by hchaos ( 683337 ) on Monday June 28, 2004 @01:21PM (#9552809)
      That all said, a sandbox environment allows the developer to make sloppy mistakes, not program better.
      The point isn't to allow anyone to program better, but to protect the user from the sloppy mistakes that already happen regardless of the programming environment.
  • by Dozix007 ( 690662 ) on Monday June 28, 2004 @12:48PM (#9552419)
    The same is especially true in PHP. The short learning curve for getting started in the language allows for a great deal of insecure coding on the internet. I run a site that promotes secure programming, and is running a security challenge for writing scripts as well. The URL is http://www.uberhacker.com
    • I'd have to agree, and I'll throw another language into the mix: ColdFusion is traditionally regarded as a language where newbies start their web programming (usually Macromedia Flash people who need a little more power from the server, so we have artists learning how to program). Consequently, ColdFusion has a bad reputation as an insecure language. While IMHO (I'm Macromedia Certified in ColdFusion, so I know my stuff) it is as secure as any other programming language, it's the programmers, not the language p
  • Well, duh! (Score:3, Insightful)

    by dacarr ( 562277 ) on Monday June 28, 2004 @12:48PM (#9552422) Homepage Journal
    Figure this - code is only as good as the coder.
  • No. (Score:5, Insightful)

    by Tarantolato ( 760537 ) on Monday June 28, 2004 @12:48PM (#9552425) Journal
    Okay, it's the happy-fun Slashdot thing to talk about how retarded 'lusers' are. Almost as hi-larious as jokes about Clippy and rebooting Windoze machines.

    But you know what?

    Most developers are retarded too.

    This probably includes you, my friend, as you read it in your grease-stained Manga t-shirt. This is not a problem that will be solved by yelling at people about bad code - they're going to produce it anyways, and in droves. The solution to dumb users is good UI design and a sensible permissions architecture. Similarly, the only workable solution to this problem is architectural.
  • It Rolls Downhill? (Score:4, Interesting)

    by stinkyfingers ( 588428 ) on Monday June 28, 2004 @12:50PM (#9552436)
    There's the old maxim that "shit rolls down hill". Let's change it to "shit stays at the lowest part of the valley".

    When will we see "ACM Queue has an article that blames security flaws on HR departments and middle management?"

  • by prostoalex ( 308614 ) on Monday June 28, 2004 @12:50PM (#9552438) Homepage Journal
    A lot of the production code that gets written nowadays is created by college graduates who have learned to develop in a quick-and-dirty way, rolling out the prototype for their homework assignment as soon as possible.

    When you're in college, the graders are not trying to break into your application; they're just evaluating the source code and giving you points for a correct stack and linked-list implementation. This gives a false assurance that real-world development is pretty much the same: a friendly, non-threatening environment with no need to check and validate input, no need to resort to minimum security permissions, and so on.

    I think Caustictech [typepad.com] said it better than I can:

    PrototypeProductionMan came to the ObjectFools team after successful stints at the Unemployment Office and the basement in his parents' home. PrototypeProductionMan's talent is making sure that barely functional prototype mockups get rolled out into production. Exception management, security, separation of concerns between business logic and UI code, thread safety, resource management... these are all things you could say good-bye to with PrototypeProductionMan on site! With a mentality like that, it's no surprise that every production deployment ObjectFools has been involved with has turned into a completely fucking unmitigated disaster! At the end of the day, our clients should really thank PrototypeProductionMan as the reason we need to charge them a fucking arm and leg for post-rollout support and maintenance.
  • by fiannaFailMan ( 702447 ) on Monday June 28, 2004 @12:51PM (#9552453) Journal
    a bad workman always blames his tools.
    • by DunbarTheInept ( 764 ) on Monday June 28, 2004 @02:11PM (#9553361) Homepage
      People who repeat that phrase are typically trying to imply the converse, that anyone blaming his tools must be doing so because he is a bad workman. This is only true in the case where the workman got to pick his tools himself. The whole point of the expression, when it was originally coined, was that picking good tools to use is still part of the responsibility of the good workman, so he's got no right to complain about having bad tools - even if he has bad tools it's still his fault anyway. The problem is that ignoramuses keep trying to use this expression to refer to the software industry while ignoring the fact that in the software industry, the "workman" that they are referring to rarely gets to pick his own tools, and so the analogy completely fails on that point.
  • C / C++ (Score:3, Interesting)

    by Brandon Glass ( 790653 ) on Monday June 28, 2004 @12:51PM (#9552455) Homepage

    A lot of problems result from the C and C-like languages' inherent faultiness. The language is a great way of writing "portable assembly language", but makes for sloppy code a lot of the time, even from relatively experienced programmers. It's mainly due to the "New Jersey" [jwz.org] approach to development.

    C++ and C are very popular and widely used, and will probably never fade completely, as they are too entrenched. However, there are a lot of languages these days with compilers that can produce code as tight and fast as C/C++, but without the mess. This [purebasic.net] is one example; there are many others.

  • about time (Score:4, Insightful)

    by Deadbolt ( 102078 ) * on Monday June 28, 2004 @12:52PM (#9552467)
    I'm glad someone other than me (someone who can get published on a site Slashdot will link to) said it:

    Compilers shouldn't generate warnings; they should generate errors.

    It's time to stop holding the programmer's hand. If I write a C program that makes 5 malloc() calls but only 4 free() calls, the compiler should notice that and say, "Gee, you have a memory leak here" and refuse to compile. It should NOT say, "Well, what you're doing is provably unsafe and probably not what you want, but yes sir, Mr. Developer, I'll happily crash the system for you!" It is NEVER correct to write unsafe code.

    I understand that there is a certain laxness built into C to make it easy to port to multiple platforms, etc., but these were compromises made in the early 70s, ffs. How long must we live with choices made under circumstances that became outdated 20 years ago?
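    For what it's worth, part of this already exists as an opt-in. A minimal sketch, assuming gcc: compiled with "gcc -Wall -Werror", the format mismatch below fails the build outright instead of merely warning.

    #include <stdio.h>

    int main(void)
    {
        long n = 42;
        printf("%d\n", n);   /* -Wformat: "%d" expects int, n is long */
        return 0;
    }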
    • Re:about time (Score:3, Insightful)

      by tcopeland ( 32225 ) *
      > If I write a C program that makes 5 malloc()
      > and 4 free(), the compiler should notice
      > that and say, "Gee, you have a
      > memory leak here"

      That's a tricky tradeoff, though... the more stuff the compiler checks, the longer a compile takes.

      Some things couldn't be caught at compile time anyway. I mean, the compiler would have to actually run the program to ensure it correctly allocated and deallocated memory. That's what stuff like Valgrind [kde.org] is for...
    • Re:about time (Score:4, Insightful)

      by Anonymous Coward on Monday June 28, 2004 @12:58PM (#9552541)
      I know you are trying to be helpful, but static analysis of code like you are suggesting is usually pretty worthless. Consider:

      #include <stdlib.h>

      int somefunc(int somecondition)
      {
          void *z;

          if (somecondition)
              z = malloc(100);
          else
              z = malloc(200);

          /* ... some operation on z ... */

          free(z);
          return 0;
      }

      You could argue that the contrived example code is poorly written (which it is), but I merely wanted to demonstrate how easy it is to produce code which breaks your suggestion of counting mallocs/frees.
      • Re:about time (Score:4, Insightful)

        by DunbarTheInept ( 764 ) on Monday June 28, 2004 @04:33PM (#9554878) Homepage
        Your sentiment is correct, but that's a poor example that doesn't really demonstrate the problem. A compiler could still follow your if/else ladder and detect that, no matter what the condition is, exactly one instance of the malloc call will be made, and thus the one free call is correct. Consider: this is kind of what happens when a compiler complains that a line can be reached while a variable is uninitialized.

        A better example is this. Consider the following code:
        int x;
        int i;
        char **strings;

        x = 5;
        strings = calloc( x, sizeof(char*) );
        // Make some 100-character strings:
        for( i = 0 ; i < x ; i++ )
        { strings[i] = calloc( 100, sizeof(char) );
        }

        // do some stuff with the strings (not shown)

        // commented-out line:
        // x = x - 1;

        // Free the strings:
        for( i = 0 ; i < x ; i++ )
        { free( strings[i] );
        }
        free( strings );
        That code works without orphaning memory.

        But now, consider modifying the above example so that the 'x = x - 1' line is uncommented. What would happen then? There'd be 5 allocations, but only 4 frees.

        Trying to write a compiler that can detect the difference between those two cases, with regard to counting the allocs and frees, is essentially a restatement of the halting problem, and cannot be done. The only way to detect that the change to the x variable is important to the orphaning of memory is for the compiler to go through and examine every statement of the code and think "hypothetically, what would happen if I ran this statement?" -- and at that point the compiler ceases to be a compiler and becomes an interpreter, and thus has the same memory-orphaning problem that the code itself has.

  • Mozilla (Score:5, Informative)

    by IntlHarvester ( 11985 ) on Monday June 28, 2004 @12:53PM (#9552476) Journal
    While it's easy to rip on the idea behind ActiveX, Mozilla.org thought it was a good enough idea to copy it as XPI*.

    The basic idea is that plugins and toolbars should be easy to install, and due to the nature of these things, they often can not be "sandboxed" or run in a Java VM. One of the big complaints about Mozilla is that people find it difficult to install the Flash/Java/Real plug-ins. If vendors supported XPI, this would be mostly resolved.

    The real security problems with IE are not directly related to ActiveX, but rather to the holey and flawed "zone" system. There are also some operational annoyances with ActiveX (like throwing up dialogs even though ActiveX is disabled, and the lack of an easy way to whitelist), but it sounds like XP SP2 is going to try to fix some of those things.

    * ? Apologies if I'm confused about the moz alphabet soup.
    • Re:Mozilla (Score:5, Insightful)

      by 1010011010 ( 53039 ) on Monday June 28, 2004 @01:01PM (#9552585) Homepage

      It's not just that. They integrated web browsing into the file manager -- which is different from merely integrating HTML viewing. They designed the entire Windows UI Shell to be, basically, remotely exploitable.

      There's no good reason to confound the local file manager with a networked program.
  • Warnings (Score:5, Insightful)

    by dekashizl ( 663505 ) on Monday June 28, 2004 @12:55PM (#9552508) Journal
    The final and main point the author makes in the article is to suggest that compilers start getting smarter and generate warnings for security problems (such as the "gets()" warnings put in many compilers not too long ago). But:
    These tools have existed for years but are not popular. Why? Because they generate a lot of warnings, and, as countless software engineers have pointed out, it's time-consuming to sift through the spurious warnings looking for the ones that really matter. I've got news for them: there is no such thing as a warning that doesn't matter. That's why it warns you.
    I can't agree more. Almost every large project I've worked on with multiple programmers has tons of warnings throughout development. I mean BOTH compiler warnings AND runtime warnings in the log files. Sometimes you can track one down and find out "I forgot to tell you that you need to change XXX in your config file", but most of the time you don't even see the new warnings amid a sea of "acceptable" ones, and the rest of the time, it's more of a "I don't know why that's happening, but it seems to work anyway" type of response.

    If you see a warning, get rid of it right away! Once you slack off a bit, it becomes like dirty dishes piling up in the kitchen sink. Nobody wants to touch them, and everybody feels like most of them are the other roommate's anyway.

    • Re:Warnings (Score:3, Informative)

      by cbowland ( 205263 )
      Check out the Pragmatic Programmers for their 'broken window' [pragmaticprogrammer.com] theory, which they use as an analogy for software development.

      In inner cities, some buildings are beautiful and clean, while others are rotting hulks. Why? Researchers in the field of crime and urban decay discovered a fascinating trigger mechanism, one that very quickly turns a clean, intact, inhabited building into a smashed and abandoned derelict.

      A broken window.

      One broken window, left unrepaired for any substantial length of time, instil

  • by Anonymous Coward on Monday June 28, 2004 @12:58PM (#9552544)
    Ada is a language roughly equivalent to C++ in form and expressiveness. Ada goes beyond C++ in that it allows one to more tightly specify constraints on data and to have these constraints automatically checked and enforced. That is the basic strength of Ada.

    The weakness of Ada is its woefully outdated standard libraries, which reflect a 1960s mainframe view of the world. There are no containers, no STL, no general algorithms.

    If Ada had the powerful standard libraries that C++ has, that, combined with the safety of Ada, would make it a first choice for many programming tasks. Ada can still deliver on bug-free programming. But it lacks the scaffolding needed for 21st-century projects.

  • by jellomizer ( 103300 ) * on Monday June 28, 2004 @12:59PM (#9552552)
    Most security issues are a combination of bad specs and limited time (where Time==Money); that is what truly makes a program insecure.
    Companies are afraid to keep the specs simple: they want the program integrated, customizable, expandable, and all that other good stuff, so programmers are forced to make their applications very dynamic, which makes the program more complex and open to security issues. But combined with these specs, they are not willing to pay the programmer for all the time that is needed, and they get very annoyed when the programmer is over budget. So the programmer, in order to keep his job, will find shortcuts to cut programming time (hoping the product will be used in a well-protected network). But of course, once the program is completed, they decide to use it outside the normal specs and put it on a hostile network.
    • by Anonymous Coward
      Specs? They gave you specs? Man, where I work, they give you vague instructions, then you go through several successive iterations of "you're getting warmer... now you're getting colder" until they finally run out of time and ship whatever you last checked in...
  • by MadRocketScientist ( 792254 ) on Monday June 28, 2004 @01:00PM (#9552563)
    Didn't even finish reading the article before:
    Fatal error: Call to undefined function: message_die() in /var/www/acmqueue.com/htdocs/db/db.php on line 88
  • by Animats ( 122034 ) on Monday June 28, 2004 @01:01PM (#9552589) Homepage
    Over in the Boost sandbox, some of us are working on C++ classes to replace C strings in existing code. The usual C string operations (sprintf, strcat) work, but they're all protected against overflow. The idea is that you replace just the declarations, and the code either becomes safe or won't compile. So
    • char s1[80];
      ...
      void foo(char* s, char* in)
      { sprintf(s,"In = %s\n",in); }
    which has a risk of buffer overflow, becomes
    • char_string<80> s1;
      ...
      void foo(char_string_base& s, char* in)
      { sprintf(s,"In = %s\n",in); }
    which will truncate the string at the specified length. Note that the "sprintf" line hasn't changed. So you don't have to rewrite complex formatting code. Changing the declarations does the job.

    The new "sprintf" is actually an overload on fixed_string.

  • by Tassach ( 137772 ) on Monday June 28, 2004 @01:02PM (#9552593)
    Good security requires that you understand the principles of what makes a program secure, as well as knowing the exploitable weaknesses of the language in which you are developing the software. Using a "more secure" language will not improve your security if your system architecture is not built with security in mind. A securely implemented system is rendered insecure if it isn't administered intelligently.

    The security advantage of some languages is that they make it EASIER to write secure code, not that they make it impossible to write insecure code. There's a difference between protecting you from accidentally shooting yourself in the foot and preventing you from intentionally aiming at your foot and pulling the trigger.

    It is possible to write secure code in C or C++ -- but it takes a whole lot more effort and talent to get it right than it would to do so in a language which does automatic bounds checking and runs in a sandbox. Unfortunately, history has shown us that it's extremely difficult to write secure C/C++ code -- only a handful of programmers are able to consistently get it right, and even the best of the best still make basic mistakes.

  • by sql*kitten ( 1359 ) * on Monday June 28, 2004 @01:04PM (#9552616)
    Fatal error: Call to undefined function: message_die() in /var/www/acmqueue.com/htdocs/db/db.php on line 88
    Seems some folk ought to practice what they preach, eh?
  • by Doesn't_Comment_Code ( 692510 ) on Monday June 28, 2004 @01:08PM (#9552664)
    Depending on how skeptical you are today, you might think:

    Really bad/inexperienced programmers write insecure code.

    Good programmers write good, secure code.

    Excellent programmers who work for companies that make a lot of money from support and updates write insecure code that is easy to fix.
  • I Disagree (Score:5, Insightful)

    by RAMMS+EIN ( 578166 ) on Monday June 28, 2004 @01:10PM (#9552691) Homepage Journal
    I don't agree. Yes, if programmers wrote perfect code, there would not be vulnerabilities. But programmers are people, and people make mistakes. This is a given.

    For the solution, I think we must look not to the programmers, not to the languages per se, but to their standard libraries. C's pointer arithmetic and unchecked array bounds allow for a variety of mistakes, but also for great efficiency. It's the standard functions like gets, scanf, sprintf, even printf that make C unsafe. Sure, the programmer can be blamed for writing unsafe code, but if these functions were removed and replaced by safer ones, there would be that many fewer mistakes to make.
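    To make the sprintf point concrete, a minimal sketch (mine, not the poster's): the bounded snprintf() does the same job but cannot overrun the destination buffer.

    #include <stdio.h>

    void greet(const char *name)
    {
        char buf[32];

        /* sprintf(buf, "Hello, %s!", name);  -- overflows buf if name is long */

        snprintf(buf, sizeof buf, "Hello, %s!", name);   /* truncates instead */
        puts(buf);
    }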

    Pointer arithmetic is mostly evil and should be avoided. As for bounds checking, I would think that with all the constant propagation modern optimizing compilers do, it would be easy enough to determine which accesses are guaranteed to not go out of bounds, and do bounds checking for the rest. Exceptions help, too. If something goes wrong that the programmer didn't account for, the program stops. In the best case, that means no harm done. In the worst case, the system is DoSed, a situation which is so undesirable from a productivity point of view that it's going to be fixed, whether or not the parties involved care about security.

    Comparing a language that follows all the guidelines set out here to one that doesn't (e.g., Java to C) will quickly reveal the truth: there are far fewer vulnerabilities in safer languages than in less safe ones.

    Of course, mentality plays a role, too. With the industry having mainly focused on features and quantity, I am not surprised that software is so insecure, and I think businesses depending on this model are getting what they asked for.
  • by javajedi ( 81810 ) on Monday June 28, 2004 @01:15PM (#9552751) Homepage
    "We need to be realistic in recognizing that we're stuck with a set of languages and environments that are not susceptible to a massive change."
    This is a huge cop-out. Buffer overflows simply cannot happen in Java. The same goes for almost all of the security problems that are turned into exploits these days. Instead of applying patches to compilers and yelling at ignorant developers, how about just switching to a development language and runtime environment (e.g., Java and its Virtual Machine) that simply doesn't allow these kinds of mistakes to be made?
  • Define secure. (Score:5, Interesting)

    by bs_02_06_02 ( 670476 ) on Monday June 28, 2004 @01:27PM (#9552853)
    Define secure.

    I can guarantee that a developer and a customer will have two different definitions of secure. And the cost will be more than the customer will want to pay.

    How many customers can write a scope of work, send it off to a developer, and get a proper quote for a project that includes adequate security? How many customers actually remember to ask for security? Or if they do, do they put enough priority on security?

    I bet the answer is very few. I know from past experience that most customers take the cheapest bid. The cheapest bid is usually the one that is skipping something, and the easiest thing to skip is security. If the customer didn't ask for it, is the developer responsible? Is Micro$haft responsible? Nope. Security is not in their project. They want speed. So, there's always a niche for ActiveX. Microsoft knows they can undercut someone's cost because security isn't an issue.

    And everyone complains about Microsoft's future security ideas. Well, what do people really want? Security? Or no security?
  • 1. The bad programmer can't fix it, but says they can, tries anyway, and makes it worse.

    2. The good programmer can't design it, but says they can, tries anyway, and makes it OK.

    3. The great programmer invents something like the blinking cursor, which makes life better.

    4. The expert is on an island, living off the revenue generated by their two great ideas: one was to hire the good and the bad programmer for peanuts, since the economy sucks, and two was to score clients using the great programmer's invention of the blinking cursor.
  • by gillbates ( 106458 ) on Monday June 28, 2004 @01:29PM (#9552887) Homepage Journal

    Good security is more a matter of developer foresight than anything else. Almost all of the security flaws known to date hinge on two factors:

    1. The developer failed to foresee the manner in which his code could be used for malicious purposes.
    2. The developer failed to build a security implementation that was practical for his intended users.

    The first point applies to a lot of Microsoft software; the second, to a lot of software across the board. The fact that a sysadmin blames compromises on easily-guessed passwords is no solution at all - yes, the user is at fault, but the user wouldn't have chosen a bad password if the username/password system weren't broken in the first place. It seems that sysadmins and developers alike forget that ordinary people have to remember things far more important than the dozen or so username/password combinations that it takes to live in today's society...

  • Make it hard to fail (Score:5, Informative)

    by mcrbids ( 148650 ) on Monday June 28, 2004 @01:33PM (#9552933) Journal
    Bugs should not result in security issues.

    I repeat: Bugs should not result in security issues!

    A properly designed application will have multiple layers of error detection and security checking. As you write your software, abstract things like security checks and database access into an API, and then do insane amounts of input validation behind that API!

    In my home turf language, PHP, one of the biggest common problems in applications is the dreaded SQL-injection bug.

    The pat, standard answer is to validate-validate-validate!

    But, I'm human. I *WILL* make mistakes. It's only a question of when, not if.

    Ask yourself: How can I structure my application so that mistakes in this regard do not result in an immediate, full compromise?

    I bury database access behind an API that forces me to identify the data being passed to the database, and then trap errors from the database so it doesn't show anything to the web client.

    Example:

    <?php
    // The query is only a template: the bracketed placeholders name
    // the fields, so no user data appears in the SQL string itself.
    $sql = "INSERT INTO logindata (login, password) VALUES ('[login]', '[password]')";

    // Map placeholder names to the actual (untrusted) values.
    $todb = array('login' => $login, 'password' => $password);

    // SafeQuery() escapes each value into the template; on failure we
    // log the details and show only a generic error page.
    if (!$DB->SafeQuery($sql, $todb))
        Error($DB->Error());
    ?>

    What happened here? The SQL statement does not contain any data - instead I'm passing a template for the query, and the data array to parse into the query. The function SafeQuery() does a pattern match to get the names of the fields (in the square brackets) and then does the requisite addslashes(), as well as checking the number of fields to ensure that everything matches up, before actually dumping this statement over to the database.

    Errors get trapped within the object, and are accessed through the function Error(). This prevents any sensitive information being sent to the browser; the global Error() function simply displays a "Sorry, but an error occurred" webpage while logging the text of the error message, and quits.

    Now, none of this negates the need to do input validation - but this makes a very bad threat for PHP applications all but disappear!

    As you develop your applications, structure them as much as possible such that bugs and errors do not result in security breaches.

    Use constraints and triggers in your databases to kick out data that can't be demonstrated as good. Use APIs and functions to interface with areas (such as the shell/CGI interface) so that common security mistakes (such as not escaping a shell argument) simply can't happen.
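
    For example, here is a minimal sketch of the shell idea (C++ on POSIX, with a hypothetical run_command() helper): arguments go straight to execvp(), so there is no shell in the path and nothing to escape:

    #include <sys/wait.h>
    #include <unistd.h>
    #include <string>
    #include <vector>

    // Run a program with explicit arguments and NO shell in between:
    // a hostile argument like "; rm -rf /" is just an ordinary string.
    int run_command(const std::string& program,
                    const std::vector<std::string>& args) {
        std::vector<char*> argv;
        argv.push_back(const_cast<char*>(program.c_str()));
        for (const std::string& a : args)
            argv.push_back(const_cast<char*>(a.c_str()));
        argv.push_back(nullptr);

        pid_t pid = fork();
        if (pid < 0)
            return -1;                  // fork failed
        if (pid == 0) {                 // child: become the program
            execvp(program.c_str(), argv.data());
            _exit(127);                 // exec failed
        }
        int status = 0;
        waitpid(pid, &status, 0);       // parent: wait for the child
        return status;
    }

    With an API like this, run_command("ls", {"-l", userPath}) can't be tricked into running a second command, no matter what userPath contains.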

    Repeat after me: Bugs should not result in security issues!
  • Well no Sh!* ! (Score:5, Insightful)

    by mysterious_mark ( 577643 ) on Monday June 28, 2004 @01:34PM (#9552942)
    This is exactly what those of us in the trenches coding have been saying for many years. The current abysmal state of software quality seems directly correlated to the race to the bottom in 'cutting' development cost.

    The solution to producing secure, reliable code is to hire experienced, competent programmers who understand security issues and have a vested, sincere interest in producing reliable, secure code. That generally means a long-term relationship with, and understanding of, the client's needs and business perspectives, as well as the technical competence and willingness to put forth the effort required to produce quality code.

    This is necessarily the opposite of the current trend towards going with the lowest bidder, outsourcing, H1-Bs, and throwing large numbers of low-skilled developers at a project rather than using a small group of highly skilled ones. Fortunately for me, my current client recognizes this and retains only a couple of long-term, highly skilled developers, and they have a number of very nice, secure, and reliable applications to show for it, absent the usual bloated development team. That, however, may be the exception in the industry.

    Hopefully the corporate types will eventually figure out that throwing large numbers of low-skilled developers at a project will not produce reliable, secure code. This issue has been well documented in works such as "The Mythical Man-Month" and "The Pragmatic Programmer", yet most corporate manager types have yet to acquire that wisdom.

    Mark
    • Re:Well no Sh!* ! (Score:5, Interesting)

      by doinky ( 633328 ) on Monday June 28, 2004 @01:50PM (#9553149)
      The "lesson" most CEOs learn from an unsuccessful software project which failed due to one or more of the reasons you cite is:

      "software people are worthless".

      except he'd substitute "shit" for "worthless". In this country (USA), the people responsible for the failures of these types of projects are never held accountable in a way that makes it possible for the next executive to learn from their mistakes.

      Some days, working in this industry feels like the story of Sisyphus.

  • by Junks Jerzey ( 54586 ) on Monday June 28, 2004 @01:52PM (#9553157)
    The first image conjured up by "sloppy" is someone using sprintf in production code (much buffer overrun potential), raw pointers unnecessarily, ad-hoc string manipulation code, and so on. But it's much deeper than that.

    Consider something as simple as a BMP file format decoder. Writing a decoder is easy. It takes about 30 minutes, tops, to write one for a subset of the format. But writing a safe version is much more difficult. First, you have to validate all fields. Easy enough. Then you have to handle attempts to crash an application by passing in really huge values, like 10,000,000 pixels in each dimension. That's a bit trickier, because you have to figure out what you should allow and what you shouldn't. Then you have to deal with intentionally malformed images, where the RLE information doesn't add up to the total image size. Depending on how the code is written, this can cause you to chew through memory past the end of the image. To fix this, you have to put some checks into your inner decoding loop. The temptation to avoid doing this is strong, especially among "performance" oriented coders.

    So, yes, you can blame this on poorly written code. But had this been written in a checked language, like Lisp or Python or any similarly safe language, some of the problems would go away immediately. Not all of them, but some.
    • Here is a better solution: write the decoder in C++ using std::vector instead of raw arrays, and use at(pos) instead of []. That way the worst thing that can happen is an exception, which is not a security risk.

      (This solution still has the problem that some input might cause it to block forever, thus allowing a DoS attack, so if you really want security you need to validate the header anyway.)
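
      A minimal sketch of that approach (the header fields, the RLE scheme, and the limits here are hypothetical, not the real BMP format):

      #include <cstddef>
      #include <cstdint>
      #include <stdexcept>
      #include <vector>

      // Decode a run-length-encoded image given as (count, value) pairs.
      // at() throws std::out_of_range instead of writing past the end,
      // so the worst a malformed stream can do is raise an exception.
      std::vector<std::uint8_t> decode_rle(const std::vector<std::uint8_t>& data,
                                           std::uint32_t width,
                                           std::uint32_t height) {
          // Validate the header first: reject absurd dimensions so an
          // attacker can't make us allocate gigabytes.
          const std::uint32_t kMaxDim = 16384; // arbitrary sanity limit
          if (width == 0 || height == 0 || width > kMaxDim || height > kMaxDim)
              throw std::runtime_error("implausible image dimensions");

          std::vector<std::uint8_t> pixels(static_cast<std::size_t>(width) * height);
          std::size_t in = 0, out = 0;
          while (in + 1 < data.size() && out < pixels.size()) {
              std::uint8_t count = data.at(in++);  // run length
              std::uint8_t value = data.at(in++);  // pixel value
              for (std::uint8_t i = 0; i < count; ++i)
                  pixels.at(out++) = value;        // throws if runs overshoot
          }
          return pixels;
      }

      If the runs overshoot, the caller catches a single exception instead of debugging a heap corruption.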

      Martin

  • by tz ( 130773 ) on Monday June 28, 2004 @04:26PM (#9554804)
    Starting with the last point: he praises M$ development environments. OK, what was the last major bug that even made the Drudge Report? An IIS and IE combo. Why are IIS sites defaced more often when IIS is the minority server, and why is even CERT saying to switch browsers? (And how do I load Firefox or Apache into Visual.net?)

    He complains about C malloc/free (ever heard of Electric Fence?). C++ (wasn't OOP the magic bullet?) gave us new and delete, some garbage collection, and more memory leaks (but he says we shouldn't use Java, which actually gets the conceptual model right). Oh, and every makefile I have uses -Wall, and running them produces no warnings (at least once I'm through changing things algorithmically). And I'll have to look again for a good open source lint; I used commercial products for a while.

    Since I'm often doing embedded work, I have to be careful and have space and timing requirements (and in many cases NO debug facilities) that few others have to deal with.

    The article is probably worth reading, but won't be very helpful. A lot of people don't understand the art of programming even if they make their living that way. The people doing the hiring either want the buzzword checklist, or don't care if the result is brittle (at least not when the project starts).

    His solution is more "magic bullets". Everyone I know (even many I would not let write a simple sort routine) was horrified by ActiveX (vs. Java). How do you make that secure? You can't.

    A security shell inserter like pixie? Maybe it would be a source of exploits (basically it is a manual virus - if you can alter an EXE to add a security shell...). And an IDE can be a great tool (Emacs works for me) but can also make one lazy - I assume IE and IIS, with all their holes, are developed on the same praised IDEs.

    There is an art to programming; it often takes five years to learn, and it is a way of thinking, not a "method". I do Java and C++, but I don't do them differently than I do C. Yet much education (and investment) seems to be directed toward finding a product or method to replace process.

    As often has been said: Security is a process not a product.

    My own saying: I don't write complex programs, I simplify and reduce complex tasks into simple programs.

    Good, cheap, quick: pick 2.
  • by TastyWords ( 640141 ) on Tuesday June 29, 2004 @04:41AM (#9558455)
    This'll end up on the fourth screen of threads, but it's worth reading for those who find it. It's over seven years old, but essentially everything in it holds true today. (It shows up on various reading lists; I don't have it bookmarked, but I knew where it was.)

    "They Write The Right Stuff" [fastcompany.com]. I'm not even going to provide a summary of everything which is listed in the article. There are a lot of good lessons in well organized, well thought out explanations as to why the software doesn't shut down but how few errors are found.
    There is a difference between a shuttle crew and standard users. 1) A shuttle crew is a smaller user body. 2) They're more likely to follow instructions ala "what happens if I hit this button?"

    I've never sent a note to the author, but I think it would be a book as important as Writing Solid Code (is that the right one? (I've been up a little too long without a syringe.)
