Why Coding Is Insecure
Stuart of Wapping writes "Even patches are not safe, especially if they come from a closed background (maybe) - an interesting article from SecurityFocus on why coding is naturally insecure."
It is easier to write an incorrect program than understand a correct one.
Bridges and software (Score:1, Insightful)
Root of the problem (Score:5, Insightful)
Deadlines are normally imposed by companies trying to earn a living through the development of software.
One might then expect that the Open Source community, not faced with deadlines, would be able to write its programs under more ideal conditions, leading to code with a higher degree of elegance and security than code developed by companies attempting to make money from it.
Then you have the flip side of that, where the software may never reach a stable state since it is continually in flux. But how you view this state depends entirely on your point of view.
At least the code in flux has a higher chance of adapting to its environment, and thus of surviving, than the slower-to-adapt Closed Source code.
Sloppy design, sloppy coding, sloppy enforcement (Score:4, Insightful)
Often the designer doesn't consider the bigger picture: how this piece fits in. It can be as simple as not requiring verification of input.
Coders who are rushed, inexperienced, or simply bad (like the rafts of people who suddenly became "programmers" in 1998-2000 when demand was extremely high, even though they had only taken a couple of classes and were really English, anthropology, history, or other majors, hired just to fill positions) will fail to see the lapses left by designers and will build porous code.
Then there's lack of review, or review so anal that its focus is spelling errors in prompts or whether there are enough documentation lines, but which fails to identify where secure practices are not followed. Well, don't get me started. ;)
Last, Q/A: everyone knows Microsoft's Q/A is called "ship it out and let customer support pick up the bug reports," and then sometimes charge the people reporting for the fix. Q/A is often the first department cut in layoffs, because management underestimates its importance. Too bad that, like the Enron execs, they won't take a cut themselves to save the product and the company. Good Q/A needs to ask the unthought-of questions: what happens if I do this instead of what's expected?
Perhaps somewhere in the evolution of IT that has lowered programmers from the status of mystical wizards to grunt code jockeys, management will recognize that code, even in new products, isn't just some big patch, and will give it the attention and personnel it really deserves.
Wrong conclusion (Score:5, Insightful)
To have any hope of writing secure software, a programmer first has to be aware that a problem exists. Aware of issues like safely handling user input and securely transporting data (and when it's appropriate and when it's pointless).
Once a programmer is aware of the existence of these issues, he can start learning about all of the technical problems of writing secure code. In a UNIX environment, that means things like not exposing unnecessary parts of the filesystem to external users, not blindly writing to files in /tmp, and not trusting your PATH or your IFS in privileged scripts.
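To make the /tmp point concrete, here is a minimal sketch of the safe pattern, assuming a POSIX environment (the filename template and the helper name are mine, not from the article):

```cpp
// Sketch: why "blindly writing to files in /tmp" is dangerous, and the
// standard fix. A predictable name like /tmp/myapp.tmp can be pre-created
// by an attacker as a symlink, redirecting our write to any file we can
// open. mkstemp() instead creates a unique file atomically with mode 0600.
#include <cstdio>
#include <cstdlib>
#include <cstring>
#include <unistd.h>

int write_scratch(const char *data) {
    char path[] = "/tmp/myapp.XXXXXX";   // template; mkstemp fills in the X's
    int fd = mkstemp(path);              // O_CREAT|O_EXCL: never follows a symlink
    if (fd < 0)
        return -1;
    ssize_t n = write(fd, data, strlen(data));
    close(fd);
    unlink(path);                        // scratch file: remove when done
    return n < 0 ? -1 : 0;
}
```

The key property is atomicity: the name is chosen and the file created in one step, so there is no window for an attacker to plant a symlink first.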
Forget focus, we need education.
He nailed it. (Score:5, Insightful)
When code has to be done before a certain deadline (usually yesterday), this kind of shit always happens. I happen to be one of those idealistic (youthful) coders, and cringe thinking about what sometimes goes into released software. Is it any wonder why there are so many bugs in software? There is never even time to design, let alone test.
Why does this happen? No one really has perfected the art of accurately estimating projects. So you end up taking a quick look at the project's complexity, compare it to something you did before, and tell them how long that previous project took. Then when you give sales/management the time estimate (which is usually bogus anyway), they just ignore it and continue on with their own schedule.
Then you have sales/marketing types who consider software to be "magical." They don't have a clue how it's designed, written, and tested. All they see is something in a box that they have to sell. So when they ask for more features (as if you simply add them like you add flour to a recipe), and an engineer tells them that rushing it out may lead to security holes, etc., all you get is a blank stare.
Inexperienced programmers and C/C++ (Score:2, Insightful)
Where I work there are people, people responsible for an important part of a project, who can't understand why returning pointers to variables on the stack (from functions in C/C++) is bad. When this happened to one guy, he blamed the library he was using (an in-house library we're currently developing). When a colleague checked out the code, he was horrified that the guy had done just that: returned a pointer to a local variable.
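The bug described above, and the usual fix, look roughly like this (a sketch; the function names are mine):

```cpp
#include <cstring>
#include <string>

// The buggy pattern (don't do this): the local array dies when the
// function returns, so the caller gets a dangling pointer into a stack
// frame that has already been reclaimed.
//
//   const char *greet_bad() {
//       char buf[16];
//       strcpy(buf, "hello");
//       return buf;   // address of stack memory about to be reused
//   }

// The fix: return by value. std::string copies the bytes into storage
// it owns, so the result outlives the function's stack frame.
std::string greet_good() {
    char buf[16];
    strcpy(buf, "hello");
    return std::string(buf);   // copies buf's contents before buf dies
}
```

The dangling version often appears to "work" in a debug build because the stack frame hasn't been overwritten yet, which is exactly why people blame the library instead of their own code.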
But how do you differentiate between good and bad programmers? First of all, I think a good programmer has to really enjoy programming. When I went to college (software development degree), I coded a lot of stuff in my spare time (I'm not saying that I'm a particularly good programmer, but at least I'm better than some of the other guys at work :). Not everyone does that; some hardly complete their programming assignments. This means that after some years of college they will get their degree, but they can't write a good program. Yet they will still get a job.
When writing software, especially in C or C++, you have to have a good knowledge of how stuff actually works: how virtual functions work, the difference between the stack and the heap, what happens when objects go out of scope, and so on. This stuff may be a boring part of a programming course, but it is actually very important. One problem is that in some places people don't learn C or C++ at all, only Java, and thus they don't need to learn most of this stuff. (Although they maybe have to learn a lot of Java-specific stuff instead, such as how the garbage collector works.)
The problem, as I see it, isn't that there are too many inexperienced programmers, just too few of the good ones. Another problem is the tool. Many projects are written in C or C++, which pretty much allows you to do anything. It is possible to write robust programs in C++. If I were to manage a large C++ project, one of the first things I would do is ban almost all use of raw pointers and C-style arrays: smart pointers with reference counting, array classes with optional bounds checking, and things like that. Why use char* when you can use std::string (or your own string class)? Another solution is to not use C/C++ at all, but in many cases this is just not an option. And I think that C++ is a really powerful language, which with a tiny bit of effort by the programmer(s) can be a robust language, even for "newbies".
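A small sketch of the discipline being described, using standard-library types (the function names and the sample data are illustrative, not from any real coding standard):

```cpp
#include <initializer_list>
#include <memory>
#include <string>
#include <vector>

// Owning, checked types instead of raw pointers and C arrays.
std::string last_word(const std::vector<std::string> &words) {
    // .at() does a bounds check and throws std::out_of_range instead of
    // silently reading past the end the way indexing a C array would.
    return words.at(words.size() - 1);
}

std::shared_ptr<std::vector<std::string>> make_words() {
    // Reference-counted ownership: the vector is freed when the last
    // shared_ptr goes away, so there is no raw new/delete to get wrong.
    return std::make_shared<std::vector<std::string>>(
        std::initializer_list<std::string>{"robust", "by", "construction"});
}
```

The point isn't that these wrappers are free; it's that an out-of-range access becomes a catchable exception rather than silent memory corruption, which is usually the right trade for anything security-sensitive.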
Re:Throw-away code (Score:3, Insightful)
Let's face it: security holes are bugs, and good tests and documentation help spot them earlier. Obfuscating your code intentionally won't make your life easier.
Nonsense: Consider Open BSD (Score:2, Insightful)
In my opinion, the article is extremely badly written. Also, it is nonsense, as is easily proven by giving a link to another operating system:
Open BSD: Four years without a remote hole in the default install! [openbsd.org]
If the Open BSD team can make a secure operating system as volunteers, Microsoft, with a reported $33 billion in the bank, could take one of those billions and clean up their code.
Microsoft's security problems come partly from feeling that they don't have to care, apparently.
Also, maybe there is some secret U.S. government surveillance agency that requires that Microsoft operating systems not be secure. For years the U.S. government tried to prevent cryptography. For example, see these notes from the Center for Democracy and Technology: An overview of Clinton Administration Encryption Policy Initiatives [cdt.org]. The notes say, "The long-standing goal of every major encryption plan by the [U.S. government] has been to guarantee government access to all encrypted communications and stored data."
It is not impossible that software insecurity is secret U.S. government policy. The U.S. government is involved in many hidden activities, as this collection of links and explanation shows: What should be the Response to Violence? [hevanet.com]
Re:Bridges and software (Score:3, Insightful)
They just learnt from what stayed up...
Re:Nonsense: Consider Open BSD (Score:3, Insightful)
Or more precisely, that features were literally more important than security.
If they spend 80% of their time trying to improve their feature set, then they will only be able to spend 20% worrying about security; and if that turns out not to be enough, tough.
What's been happening recently is that Linux competing with them, and being seen as more reliable, has actually hit Microsoft in the pocketbook. They are having to change their priorities to adapt to this new threat.
It will be interesting to see if they can change perceptions quickly enough.
>Also, maybe there is some secret U.S. government surveillance agency that requires that Microsoft
>operating systems not be secure. For years the U.S. government tried to prevent cryptography.
That's more or less one of the two jobs the NSA does: to 'protect national security'. The other is to protect commerce. The latter probably requires a secure OS; the former doesn't. (That's why there were export versions of software.) The NSA is a pretty schizoid organisation, but most of the time they do a good job.
Aren't most security holes (Score:5, Insightful)
The argument that these things slow down code too much doesn't make much sense, considering that we have to do the runtime bounds checking ourselves, every time, and that we occasionally make mistakes.
I think it is time we dropped all insecure functions from the standard C library and replaced it with a bounds-checking version that is also more complete and consistent.
It would also be interesting to have a taint flag on the standard C compiler, like the one Perl has, to detect when people are using user input as format strings and the like without cleaning the input first.
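A sketch of what the bounded replacements look like in practice, covering both pitfalls mentioned above (buffer size and names are mine):

```cpp
#include <cstdio>
#include <string>

// Two classic libc pitfalls and their bounded spellings.
std::string echo(const char *user_input) {
    char buf[64];
    // Bounded: snprintf never writes more than sizeof(buf) bytes, and the
    // user input is passed as DATA (%s), never as the format string itself.
    snprintf(buf, sizeof(buf), "you said: %s", user_input);
    return std::string(buf);
    // The insecure spellings this replaces:
    //   sprintf(buf, user_input); // format-string attack AND overflow
    //   strcpy(buf, user_input);  // no bounds check at all
}
```

Passing user input directly as the format string is exactly the "taint" case a Perl-style flag would catch: a string containing %n lets an attacker write to memory, not just read it.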
Software Engineering not yet Engineering (Score:5, Insightful)
Consider, however, software engineering. The platform you use, the language you speak, the tools you employ -- they all evolve over short time scales. None have had a century or more of Darwinian pressure applied. No one expects them to work, fully. The liability for failure rests with the company or person using the software, not with the company or person writing it. We haven't had the time to develop the technical or social methods for preventing bad software and reinforcing good software.
How many computer programmer professional societies require rigorous entrance exams and periodic proof of competency?
This will continue until the costs are brought back to the companies that write insecure code. This can happen through government regulation -- the creation of a "software building code" -- or through the dead hand of Adam Smith -- companies start to avoid purchasing insecure software.
The greatest sign that this sort of sea change might be a-coming? The fact that Microsoft feels there is enough market interest to attempt, at the very least, to jump onboard a PR train.
I don't get it (Score:2, Insightful)
Actually, coding is not inherently insecure. There are a couple of good counter examples (qmail and djbdns, for example).
Microsoft's code is insecure because this way customers can be made more dependent on them. And each time they download a patch, they get a big Microsoft logo in their face. Talk to a PR specialist if you don't see why this is good for them. Besides, there is no incentive to make bug-free code. Nowadays customers are so used to broken code that they actually believe that it can't be any different.
Re:Root of the problem (Score:2, Insightful)
Deadlines aren't the problem: unreasonable, inflexible deadlines are the problem. All the vices associated with coding under deadline pressure come from bad time management, not the simple fact that some thing needs doing by some specific time.
Joel Spolsky [joelonsoftware.com] goes on at great length about proper scheduling of software development, and seems to get it right.
Why is code insecure? (Score:2, Insightful)
Time. The article was right about this one. If you look through our source code, you can see a definite difference between the "we've got all the time in the world, so follow the style guide to the letter, comment everything, and desk-check it all before you send it to test" code and the "beta is due on Monday, so tell your girlfriend to have a nice weekend, and could you get some Code Red on the way in, we're going to be here a while" code.
When you are trying to get code done fast, you are much more prone to looking only at the stated goal of the code (i.e., it takes file X, converts it to format Y, and sends it to machine Z) and ignoring things like modularity and security. You tend to be much more concerned with "how do I get this to work" than "how can someone get this to break".
Ignorance. I don't know a whole lot about buffer overflows, or gaining root when I shouldn't have it, etc. I've got a book on it (which I'm sure my sysadmin would love to see sitting on my desk), but the fact of the matter is that most colleges don't do a whole lot of teaching in this area; what people know about security holes is usually because they hack around (either on their own system or someone else's), or they got hacked. The industry would be a lot better off if schools were teaching would-be programmers what people will try to do to their systems, and how to avoid it.
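For readers in the same boat as this commenter, here is a toy model of the buffer overflow class of bug (the struct and all names are illustrative; the copy is done over the whole struct so the demo stays well-defined, but an unchecked strcpy into the name field has the same effect in practice):

```cpp
#include <cstring>

// An unchecked copy of attacker-supplied bytes runs past the field it
// was meant for and lands on adjacent data. Real exploits overwrite a
// return address the same way; here we just clobber a neighboring flag.
struct Login {
    char name[8];     // the field the copy was "for"
    int  authorized;  // the adjacent data that gets clobbered
};

int overflow_demo() {
    Login login = {};                       // authorized starts at 0
    // 12 bytes of "input" for an 8-byte name: the last 4 bytes spill
    // over onto `authorized`. The missing fix is a length check against
    // sizeof(login.name) before copying.
    const unsigned char input[12] =
        {'A','A','A','A','A','A','A','A', 1, 1, 1, 1};
    memcpy(&login, input, sizeof input);
    return login.authorized;                // no longer 0
}
```

This is the whole idea behind "never trust the length of user input": the four stray bytes turned an unprivileged login into an authorized one without any code path ever setting the flag.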
Over-reliance on the OS. At least in Microsoft's case, I believe they are trying to do too many things at the OS level, which means a security flaw that affects one program can often be opened up to exploit all programs. Take, for example, the registry. If one program's
Why software developers write insecure code (Score:3, Insightful)
Developers who are more inclined to write secure code seem to come from a background that involves administering free UNIX systems in the mid-90s. This is when we started seeing an explosion in the number of nodes attached to the internet 24/7, most of them running a freenix. We were the first to bear the security onslaughts that everyone deals with today. A sneak preview.
We had to deal with release after agonizingly insecure release from Berkeley, Washington University, and Carnegie Mellon; deal with urgent "security patches" that simply added bounds checking to strcpy; and pray to god that we got our bugtraq email before the script kiddies figured out how to uncripple the exploit code.
Servers being attacked just because one user was running an IRC bot in a channel some teenage punk wanted to take over. ISPs being knocked off the net just for running an IRC server. Spammers, denial-of-service attacks, buffer-overflow exploits, rootkits, social engineering, man-in-the-middle attacks, password sniffing, brute-force cracks...
Developers who lived through this find that the rest of the world (i.e., the people starting to do serious stuff on the internet today) is blissfully unprepared for the security onslaught. More NT servers are connected now than ever, ASPs are waking up to the harsh reality that they have 40,000 lines of insecure trash running their web sites, and home users are completely unaware that their broadband "always-on" connection really means "always-vulnerable".
The only common trait we share is cynicism. Cynicism toward all developers, all companies, all users, everyone. Hundreds of security holes being introduced every second. Every gadget you buy, every shopping cart you push, your comb could have a buffer overflow, careful! that milk might be sour!, oh no! quiet or the cake won't rise!!! they're crawling all over my skin--get them off get them off, use the ice pick use the ice pick!!$%*)!@!!
If you as a programmer don't see the world that way, don't expect to write anything but insecure garbage. But don't worry, you'll learn your lesson just as we all did. And don't be mad at us if we laugh, because we're laughing with you.
Alarm bells going off! (Score:1, Insightful)
Statements like this are silly. *HOW* can the author say M$ has brilliant designers when all you see is the end product?!?! They could have gone through thousands of design iterations, each entirely different, with no vision until they hit on something they think looks good! And brilliant programmers?!?! There are an unlimited number of ways you can write an algorithm, but there are only a handful of ways to do it `brilliantly'! Do the programmers at M$ write brilliant algorithms? Well, let's check the source...oops!
Re:Bridges and software (Score:2, Insightful)
Also, buttressing was a mid-project refactoring on a few of them, as their sibling projects started to fall down once they got above a certain size.
Finally, cathedrals were projects lasting a few hundred years with fairly stable requirements ("watertight building to worship a deity"); I dream of s/w projects that stable.
Re:Software Engineering not yet Engineering (Score:1, Insightful)
In the first, a hacker exploits a buffer overrun to run malicious code.
In the second scenario, a terrorist destroys a bridge by planting a time bomb on one of the pillars.
Is there a real difference here? You say that the difference is that in the software case a design flaw is being exploited. Couldn't someone just as easily claim that a bridge's inability to withstand a bomb blast is a design flaw? Before you answer too quickly, know that a bridge's inability to withstand an earthquake or high winds (see Galloping Gertie, the old Tacoma Narrows Bridge) is considered a design flaw; is a bomb much different from an earthquake or high winds? In both cases, the bridge's physical strength is being tested.
A bridge's failure to withstand an earthquake or high winds is a design flaw only if those conditions can be foreseen at the location of the bridge; otherwise, it's one of those "We don't _have_ earthquakes in Iowa, why would we need a California- (or Tokyo-)hardened bridge?" situations. Same for the high winds: those winds were well known to occur in the Narrows, and _any_ bridge designed to be located there should have taken winds into account.
The World Trade Center was engineered to withstand a 707-class airplane strike. Strikes by much larger aircraft weren't covered under the original design guidelines.
The relevancy to a piece of software being resistant to crackers should be obvious - especially software exposed to anonymous users via the Internet. They're very much a part of the intended "location of use", and so a design has to take such abuses into account.
OpenBSD security flaws? (Score:2, Insightful)
Exercise: how many OpenBSD security flaws exist (or have existed) where the weakness was exploited before the team fixed it? What has the severity of those flaws been compared to flaws found in other systems?
There are no programs called ftp, httpd or smtp. FTP, SMTP and HTTP are protocols, for which there are many implementations; rarely does a protocol itself have a bug. Implementations of these protocols may have bugs. So it makes sense to talk of Apache or Sendmail having a bug, but not httpd, since there's no such thing.
If one particular OS distribution -- one of the *BSDs or a Linux distribution -- runs BIND as root, and another runs it as a user with no privileges except to read files in one particular part of the filesystem, then a flaw in BIND is obviously much more severe in the former than in the latter.
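The least-privilege pattern described here can be sketched as follows, assuming a POSIX daemon (the account name and function name are illustrative; the packager picks the actual account, e.g. "named" or "nobody"):

```cpp
#include <pwd.h>
#include <unistd.h>

// A daemon does its one privileged step (e.g. binding port 53) as root,
// then permanently drops to an unprivileged account before parsing any
// network input, so an exploited parser bug yields only that account's
// (near-zero) privileges.
bool drop_privileges(const char *user) {
    struct passwd *pw = getpwnam(user);     // look up uid/gid by name
    if (!pw)
        return false;                       // no such account
    if (setgid(pw->pw_gid) != 0)            // group first, while still root
        return false;
    if (setuid(pw->pw_uid) != 0)            // after this, root is gone
        return false;
    return setuid(0) != 0;                  // paranoia: regaining root must fail
}
```

The ordering matters: setgid must come before setuid, because once the uid is no longer 0 the process has lost the right to change its group.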
With OpenBSD, when you run BIND you're not just running BIND version 4, you're running a version of BIND 4 that has been audited by the OpenBSD team for flaws. (This is why OpenBSD is still using BIND4 and will continue to do so for a while: the code has been audited, and it works perfectly well providing DNS. Why "upgrade" when the old version isn't missing anything you need?)
All the code that is part of a standard OpenBSD install has been audited. If Apache is found to have a bug, it is not necessarily true that Apache on OpenBSD has a bug. And unfortunately bug fixes that the OpenBSD team makes in standard daemons don't always get accepted into the mainstream code for it.