Security

Too Cool For Secure Code? 471

Posted by michael
from the java-for-everything dept.
An anonymous reader writes "Looks like not everyone believes Linux is the monolith of security folks might like us to think. Jon Lasser raises some interesting points in this article over at SecurityFocus. Though it has to be said that, whilst he focuses on the Linux/Unix side of things, a good proportion of programmers (no matter what they work on) are guilty of similar conceit to some extent."
  • Everyone thinks they can hide in the crowd. Well, if someone is determined to hack YOU, then they will hack you. If you have something valuable, hacking attempts WILL BE MADE! Many will involve social engineering and stuff like that, and they will be targeted at YOU.

    Everyone likes to brag about what they are doing and to be nice to people. The best security is social in nature: clam the fuck up about what you are working on, isolate yourself from others who are trained to know the meaning of what you are doing. T

    • "In an age where processing power is cheap, there's no excuse for a mail client written in C or C++."
      This sounds like Microsoft's philosophy - bloat because we can afford to.
      Why not use security tools and a dedication to high performance?
    • Despite your random capitalization of words, I disagree with you. The best security is not
      The best security is social in nature: clam the fuck up about what you are working on, isolate yourself from others who are trained to know the meaning of what you are doing.

      The reason this does not work is because nobody understood what you just said.
      It is true that you cannot always hide in a crowd, because most exploits end up in the hands of script kiddos who run a scan on *.*.*.* from a previously exploited
    • Security through obscurity? Yeah, that works, just ask MS.
  • Of course not (Score:2, Interesting)

    by ch-chuck (9622)
    But it is OPEN, meaning a COMPETENT admin can MAKE it very secure. About the closest thing to 'out of the box' security is OpenBSD. My Linux (RH71) box was rooted in less than a day after putting it on the 'net. My OpenBSD box has lasted for almost a year.
    • by Azghoul (25786)
      So, what you're saying is, you're not a competent admin? :)

    • I don't think that's the article's point. The author argues that programming in low-level languages inherently invites bugs. I'm betting your RH7 box was running a few daemons which got rooted. OpenBSD runs many of the same daemons. And sysadmins couldn't care less, from a security standpoint, whether or not it was open, as they don't have time to look at every single line of code they're using.
    • I've had 6 Solaris boxes on the Internet with no firewall for over 2 years now with no break-ins. It's not rocket science. Packet filters and chrooted services go a long way.
      • Re:Of course not (Score:3, Informative)

        by dougmc (70836)

        chrooted services go a long way

        A minor nitpick ...

        chrooted services don't prevent break-ins. They prevent/reduce damage done after the break-in.

        With some security holes, a chrooted daemon can't be cracked where a non-chrooted daemon can, because the exploit does something like invoke /bin/sh, but that just means that the exploit needs to be altered somewhat to do something other than just invoke /bin/sh.

        Don't get me wrong -- chrooted daemons are more secure than non-chrooted ones if done pr

    • I am sick and tired of this "Competent Admin" prick-waving. It's a rehash of the old "only bad girls get knocked up" mentality. If you're rooted, it's because you were a Bad Admin.

      Bullshit. Exploits happen -- and they're going to happen right through your firewall, over your precious port 23, and right into your public key encrypted file system. And when it counts, they're going to happen well before the information hits BugTraq.

      What makes you a Competent Admin is quickly noticing attacks, quickly re
  • Well first off, I would like to say that every OS has had their fair share of vulnerabilities.

    Secondly, we learned at skool (yes, some non-cool nerds go to school to learn things) certain ways to make your code less likely to be exploitable. For example, making certain objects static is often useful. Object Oriented Programming lends itself to making secure code.

    I will leave it to karma sluts to find a link for me :) (bah, but I will also look for some old sample assignments which dem
  • by grub (11606) <slashdot@grub.net> on Thursday March 27, 2003 @09:48AM (#5606858) Homepage Journal

    I've been trying to r00t GORILLAS.BAS on my DOS box for years with no luck.
  • So true (Score:3, Insightful)

    by Jack Wagner (444727) on Thursday March 27, 2003 @09:51AM (#5606879) Homepage Journal
    I've found over the years that most new coders aren't taught the proper basics of coding because they focus on learning high level languages and arcane algorithms, instead of focusing on the art of computing, like Donald Knuth's books.

    Only too often have I sat in on meetings with immature little dweebs who rant on and on about XML or the technology flavour of the month, or hacking at code to achieve Olog(n) cache hits instead of focusing on making proper underlying designs.

    Fred Brooks talks quite a bit about this in his book "The Mythical Man-Month", where he states that secure computing is getting worse at the rate of one order of magnitude per generation of new programming languages. That book should be required reading for all CS students.

    Warmest regards,
    --Jack
    • Only too often have I sat in on meetings with immature little dweebs who rant on and on about XML or the technology flavour of the month, or hacking at code to achieve Olog(n) cache hits instead of focusing on making proper underlying designs.

      The correct notation is O(log n).
  • by Ed Avis (5917) <ed@membled.com> on Thursday March 27, 2003 @09:51AM (#5606882) Homepage
    The thing is, you can program in a low-level language yet still avoid bugs like buffer overflows and failing to handle NUL characters in strings.

    In C, it just requires using a library like glib for your string handling, and some similar library to provide bounds-checked arrays.

    In C++, it means avoiding char * strings like the plague and using the standard 'string' class instead; similarly using 'vector' or other STL containers instead of C-style arrays.

    I think it would help if there were a standard, minimal C string library, and if existing APIs including system calls were given equivalents using it. So open() could take a pointer to a 'const struct String' rather than a char *. Having done this, the existing functions like strlen() and strdup() could be declared deprecated. There are plenty of decent string manipulation libraries for C; what's lacking is a single library which everyone can agree to support.

    At least in C++ there is a standard 'string' type, although some people insist on reinventing the wheel (Microsoft's MFC with CString, Qt with QString).
    • by Tom7 (102298) on Thursday March 27, 2003 @10:17AM (#5607111) Homepage Journal
      How do you protect against double-frees ... use a garbage collector? Integer overflow ... use a package that checks for overflow on each integer op? Bad pointer arithmetic ... disallow it? Force the programmer to check error conditions on system calls ... use exceptions?

      What you're describing is, in fact, a modern high-level language. Except that if you were to do all this stuff in C, you'd be in such an environment that even the simplest stuff is a pain in the ass: calling special functions or macros to access arrays, etc. Nobody wants to do this, and nobody wants to read code that does -- and that's why nobody does. These high-level safe languages do all this stuff automatically and transparently, so that you can write clean, natural code and it is secure against the most common kinds of holes. Java is an example if you like Object Oriented programming; for my part, I like SML.

      Of course it's possible to write bug-free C code. But it's really hard when you are at the scale of even a modest network daemon. Even our best programmers make security bugs when writing in C though (see: Quake I, II, III, Half-Life, Linux Kernel, sshd, ftpd, apache, perl, mozilla, and just about every other software package you think is written by great programmers), so how can we expect the rest of the world to do it?
      • I'm pretty much a Java coder who used to do a lot of programming in C and C++. If I ever go back to a job that requires me to code in C, the first thing I'm going to do is look for a memory management package (at least to keep track of what's been alloced and freed), a decent string handling library, and something for bounded-array access. Honestly though, it won't be for the security aspect, but for the EASE OF DECENT CODING.

        I won't have to worry about debugging something that segfaults before I get to real
      • You're right, C or C++ code with explicit memory allocation still makes it too easy to code double-free bugs. But despite the recent zlib vulnerability, double-free is not the most common cause of security holes.

        It would help if the standard free() did a bit more checking, in particular making sure you can only free() an address you've previously malloc()ed. That would make it slower, but not too much slower. Programmers could always turn off this checking for code that relies on allocating and freeing
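A sketch of the kind of checking the parent means, done in user code rather than in libc (names are ours): remember the live allocations and refuse to free anything else. A std::set lookup is slower than real allocator tricks, but it shows the idea.

```cpp
#include <cstdlib>
#include <set>

static std::set<void*> live;   // addresses handed out and not yet freed

void *checked_malloc(std::size_t n) {
    void *p = std::malloc(n);
    if (p) live.insert(p);
    return p;
}

// Refuses double frees and pointers that never came from checked_malloc,
// instead of silently corrupting the heap.
bool checked_free(void *p) {
    if (live.erase(p) == 0)
        return false;
    std::free(p);
    return true;
}
```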
    • That's what I'm saying. I mean he (the author of the article) even shot himself in the foot by claiming that he sees bugs in higher level web-languages like PHP.
      Just because you use a high level language, if you suck at coding, your program will have security holes.

      Referring to the standard, minimal C string library: I used one in the past. I believe it was called APSTRING. It was a nice string object and it had a method to make it compatible with those older functions that wanted char *'s. Making o
      • That's what I'm saying. I mean he (the author of the article) even shot himself in the foot by claiming that he sees bugs in higher level web-languages like PHP.

        No he didn't. The point is that a good programmer can reduce their risk by using a high level language. They can't eliminate it. According to your logic, because people can get killed in Volvos, Volvos are as safe as motorcycles.

        Just because you use a high level language, if you suck at coding, your program will have security holes.

        Brillian

          • How would it be a de facto standard if only one person used it? After more than 10 years of C++, don't you think that such a thing would have arisen if it was going to?

          It has. It's called the STL, and when used properly it fixes nearly every complaint the columnist had.
    • Another way? (Score:2, Informative)

      by spydir31 (312329)
      Use source analyzers to find common mistakes; here are a few:
      Flawfinder [dwheeler.com]
      RATS [securesoftware.com]
      ITS4 [cigital.com]
      Splint [splint.org]
      also look at Splint's Links page [splint.org] for more on the topic
    • The problem with C or C++ is that no matter how much library code you add to them, they don't provide fault isolation. That is, I can do everything right in my code, but some other module can still screw up my data structures through a stray pointer.

      An additional problem with C (and to a lesser degree, C++) is that it doesn't protect you against mistakes or typos: it is very easy to introduce crashes and security problems accidentally.

      Many languages other than C/C++ almost completely eliminate that wor

    • What he said.

      Tainting support ala Perl would be another great thing to have in a widely available library. Lazy programmers might still cast 'struct Poisonedstringfullofshellcode' to 'struct String' but at least it wouldn't happen by accident.
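A rough C++ version of that idea (all names here are hypothetical): keep untrusted input in its own type, so only an explicit sanitize() step yields a plain string, and the accidental mixup becomes a compile error rather than a shell escape.

```cpp
#include <cctype>
#include <string>

// Untrusted bytes live here; the struct won't convert to std::string
// by accident, so "tainted" data can't silently reach a shell or query.
struct Tainted { std::string raw; };

// Whitelist, don't blacklist: keep only characters known to be harmless.
std::string sanitize(const Tainted &t) {
    std::string out;
    for (char c : t.raw)
        if (std::isalnum(static_cast<unsigned char>(c)) || c == ' ')
            out += c;
    return out;
}
```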
  • SecurityFocus columnist Jon Lasser is the author of Think Unix (2000, Que), an introduction to Linux and Unix for power users. Jon has been involved with Linux and Unix since 1993 and is project coordinator for Bastille Linux, a security hardening package for various Linux distributions. He is a computer security consultant in Baltimore, MD.

    Jeeze, he isn't even a programmer..

    • Jeeze, he isn't even a programmer..

      I would think that writing a massively popular security-oriented program (Bastille) would qualify him to comment on secure programming, wouldn't you?

    • Jeeze, he isn't even a programmer..

      See, this is exactly what he's talking about. Just because he doesn't code day in and day out doesn't mean he can't understand the vulnerabilities low-level languages introduce.

      Basically, I see his argument as an extension of the "coders vs. scripters" discussion [slashdot.org] that took place a few weeks ago. A lot of people seem to think that it's not programming unless it's written in C/C++ (and sometimes Java). Which is, of course, complete bullshit.

      For example, I came acro

  • To be honest, I've thought for quite some time that people should be moving away from C and C++ for most stuff on UNIX, simply for the productivity benefits it would bring - the added security is OK, but as pointed out in the article, high level languages aren't a silver bullet.

    Unfortunately, the vast majority of our desktop software on Linux is still being written in C or C++. Why? Well, those are the "native" languages of the two big desktop projects. C++ was chosen in KDE because, well, that's what Q

    • The article talks about three big classes of vulnerabilities: format string bugs, buffer overruns, and input validation bugs. The first two need not ever occur in a C++ program: the STL containers do not use fixed-size buffers, and the STL input/output routines don't use format strings, printf() or scanf(). If you deliberately ignore what the C++ standard library provides for you and go back to C library routines and fixed-size arrays you become more vulnerable to these attacks. (Out-of-bounds subscripts
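The first two classes in miniature (a hedged sketch of the parent's point, not code from the article): with iostreams there is no format string for hostile input to interpret, and std::string means no fixed buffer to overrun.

```cpp
#include <sstream>
#include <string>

// A printf-style greeting with nothing for a "%n" to exploit:
// whatever the user typed is inserted as plain text, and the
// result grows to whatever size it needs.
std::string greet(const std::string &user) {
    std::ostringstream out;
    out << "hello, " << user;
    return out.str();
}
```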
      • It's not C and C++ which are at fault, it's the C *standard library*.

        The language takes some of the blame. What other programming language in widespread usage lacks a built-in, pointer safe string class?

      • Tools like the STL definitely help, but they don't offer a guarantee against these kinds of problems. Like you said, you can still reference out of bounds indexes. Furthermore, you can still reference uninitialized pointers, double-free memory, and use objects before they're initialized.

        Also, these kinds of abstractions have performance costs. At some point, these costs are going to mount until something like Java or .NET becomes competitive in terms of speed and memory footprint.

        At their core, C and C
    • the added security is OK, but as pointed out in the article, high level languages aren't a silver bullet.

      No, but they do remove several major classes of vulnerabilities that have a long history of being exploited. There's definitely a security advantage to them.

      What's really needed is a decent component model, making it easier to choose the right language for the job, instead of choosing a language because that's what all the libraries are written for, or because that's what everybody else uses.

      I qu
  • by binaryDigit (557647) on Thursday March 27, 2003 @09:54AM (#5606919)
    After programmers take responsibility, perhaps they can consider using the right tool for the job, rather than the right tool for the job of their dreams.

    I don't think this macho thing plays into it nearly as much as he states. I think it has to do with comfort and laziness. I've been programming in C/C++ for over 15 years, so obviously, if I have a programming task to tackle, I will lean towards using those languages. I can do a minimal amount of VB, so if I need to slap together a UI, I can, but not anything that does anything interesting. If I have a task, how much time should I spend learning a new language if that language is better suited for the task than a language I know? Since I'm new to this language, how much worse is the code going to be than what I could have written in a less suitable language?

    I'm not saying that I'm against learning new languages, but a programmer can only realistically be "good" at a small set of languages. And the realities are that unless I'm working on a pet project, I don't have the time to learn something new or try to come back up to speed on a language I last used two years ago. Perl is an excellent example in my case: I don't know a lick of it. If I have simple text processing to do, I use the "simpler" text utilities (awk and sed primarily), unless the problem is very simple, in which case I fire up my text editor. If it's more complex, then I use C/C++. Could it be done quicker in Perl? Maybe, maybe not, if by quicker you allow me to ignore the ramp-up time to learn the language. If I were doing this type of thing all the time, then the ramp-up could be amortized in the long run. If it's onesie-twosie, then forget it, out comes the C compiler.
    • And the realities are that unless I'm working on a pet project, I don't have the time to learn something new or try to come back up to speed on a language I last used two years ago.

      The author is talking about significant applications, not little toy scripts. You are going to be very familiar with (and maybe even expert in) the language used to build a significant application by the time you are done programming it. But another point the author is making is that once you learn a high level language you'

    • Think of it as an investment - you need to spend a couple hours reading at least the first chapter of Learning Perl [oreilly.com], maybe browsing a few others. Then, when you need to do something, look it up in The Perl Cookbook [oreilly.com]. Most of the time the example code will do everything you need. If not, you can modify it with what you've learned in Learning Perl. For the kind of tasks you mention, this should be more than sufficient, and for a few hours and $75 (retail) you'll save hours and hours and hours down the road
    • I've been programming in C/C++ for over 15 years. ... If I have a task, how much time should I spend learning a new language if that language is better suited for the task than a language I know?

      Java and C# are based on a subset of C++. If you know C++ well, you should be going full steam in these languages within 2 weeks. If your task is long enough, it might be worth it.
      • Java and C# are based on a subset of C++. If you know C++ well, you should be going full steam in these languages within 2 weeks. If your task is long enough, it might be worth it.

        True, but are Java and C# truly high level enough to warrant the time to learn their environments? (Remember, the development environment, esp. with VM-based languages, is often more than half the battle of learning "the language".) BTW, I don't do Java, but my company is moving slowly in the .NET world so I've already started
  • This comparison is unfair.

    He's comparing all the vulnerabilities in open source programs released in a period of time and calling them 'Linux' bugs.

    Those programs have nothing to do with the Linux kernel, other than that they can run on it.

    Do you want to compare that list with every program from every vendor that codes for Microsoft Windows? On a hunch, I'd say it's a lot higher.

  • If coders must use C or C++ for everything, there are tools to make these languages a little less dangerous: WireX's StackGuard and FormatGuard come immediately to mind, as do various high-level string libraries.

    This part intrigued me. It seems like most of the issues are with the libraries (libc in particular), not with the languages. Forgive my ignorance here (don't do much C) but IIRC there are safe and unsafe ways to copy strings, for example.

    The author seemed to be advancing a stronger argument (ag
    • > This part intrigued me. It seems like most of the issues are with the libraries (libc in particular), not with the languages.
      > Forgive my ignorance here (don't do much C) but IIRC there are safe and unsafe ways to copy strings, for example.

      Though the libraries are partially at fault (printf is a big one), it's not just strcpy any more. It's easy to grep source code for unsafe library functions and replace them. Buffer overflows these days are usually in programmer-written parsing routines (recent s
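The "safe and unsafe ways" side by side, as a hedged sketch (the helper name is ours): sprintf() trusts that the destination is big enough, while snprintf() is told its size and truncates instead of writing past the end.

```cpp
#include <cstdio>
#include <cstring>

// Builds "/home/<user>" without ever writing more than dstlen bytes,
// however long the attacker-supplied name is. The same line with
// sprintf() would be the classic overflow.
void home_path(char *dst, std::size_t dstlen, const char *user) {
    std::snprintf(dst, dstlen, "/home/%s", user);
}
```

Note that truncation can itself be a bug (a wrong path rather than a smashed stack), which is why the parent is right that hand-written parsing code still needs review.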
  • by Omkar (618823) on Thursday March 27, 2003 @09:57AM (#5606951) Homepage Journal
    "In an age where processing power is cheap, there's no excuse for a mail client written in C or C++."
    This sounds like Microsoft's philosophy - bloat because we can afford to.
    • A higher level language doesn't mean "bloat".

      Bloat means heaping unneeded functionality upon unneeded functionality. I'll assume instead that you'd put the same functionality in the C and in the python/ruby/whatever mail client. So the comparison "higher level language" = "M$ bloat philosophy" isn't fair.

      Secondly, C is high level compared to assembly or machine code. But everyone programs in C or higher and only uses assembly for the 1% in the code where it does matter (if any). Likewise you could do ev
    • Actually, this sounds like a catch-22 on your part as well. "Bloat is bad even if it brings security. We need to have the most heavily optimized applications possible, so we can run them on obsolete hardware."

      I think you can run Linux on a machine that doesn't have a 33 MHz bus, man. If you tell your boss that a $300 processor and memory upgrade will allow you to use a free application that'll keep company records and email secure, I'm pretty sure he'll open his wallet.
  • Actually, the bugs are coming out because use of the software is increasing. The more the software is used, the more the different code paths are traveled in varying ways. When a particular code path mishandles some set of input data, that's when you can find the bug. You can peer-review all you want, but no one can find 100% of all bugs. You can only fix bugs you know about.

    If anyone has ever written a piece of software, they should be able to tell you that they always found bugs while the software was in produ

  • ...about the recent paper detailing the cracking of a JVM [slashdot.org]?

    Of course he is, because that would make the article larger in focus than what he wants, which is to push buttons. C may be a riskier language than some, but even the big-dog interpreted languages have their own problems.


    • OK, chewie, what the hell does that mean?

      First, a safe language does NOT have to be interpreted. You can statically compile most of Java to native code -- you'll still have all of the class checking, but there will be no virtual machine, no JIT overhead, and the result will be fairly lean. Java is far from the best though -- SML (see mlton) and O'Caml both produce fast lean native binaries from safe language source.

      Second, the JVM "cracking" is totally irrelevant here. The scenario described in that paper
      • Language and implementation are irrelevant. That is my point. Security breaches happen. Bad code happens. Cracks happen. If he wanted to make a general plea to programmers to use more secure tools, he didn't have to focus on Unix/Linux and C to do it. He could have chosen any language, any interpreter, and made exactly the same point. Instead, he chose to point his finger at a specific group of developers to see what kind of response he could raise. He succeeded in getting over 130 posts on /.

        I don

  • by lildogie (54998) on Thursday March 27, 2003 @10:07AM (#5607030)
    Please, spare me from the armchair drivel of these SecurityFocus columnists! (Okay, I should spare myself, but I'm compelled to comment.)

    The thrust of the article is that most programmers are not skilled enough to write secure code, so they should be using HLL's that do the security for them, and leave C/C++ code to the "experts."

    Hogwash.

    Repeat after me: Security is a process, not a product. HLL's can be misused just as effectively as LLL's.

    Back to this columnist's soapbox rant. It ends up reveling in an admittedly fallacious comparison:

    > Real programmers manipulate the system at the lowest possible level,
    > for the maximum possible effect.

    I'll accept that, in the diversity of programmers, there are some that are writing insecure code. But stereotyping of this sort is an act of the columnist. Even if there are some programmers who adopt this stereotype, they do not nearly comprise the entire population. The existence of many professional, responsible programmers is completely discounted by the columnist.

    > The fallacy of the comparison should be obvious...(
    > I think it's safe to say that programmers spent less time at
    > self-criticism than pilots.)...

    It's safe to say in the one-way communication of the columnist's world. It's safe to say when your profession is to write sassy, not-too-verifiable copy. It's safe to say if you don't have to have your article vetted by fact-checkers.

    > It would be nice if we could expect that our programmers would act
    > more like airline pilots than fighter pilots: that they acknowledge,
    > and accept, the responsibility that they take for the well-being of
    > others. Until they take this step, I doubt that the quality and
    > security of the code that we all rely on will improve.

    Here the columnist exercises the same comparison he recognized as fallacious. Programmers are not pilots. Not airline pilots, not fighter pilots. While I believe there is a need for the computing industry to move towards more responsibility for security, focusing just on C/C++ programmers will not do the job. There is plenty of improvement to be made by the end users, and the columnists as well!

    > There is also a macho streak in programmers:

    There's a macho streak in this columnist who disparages professions that he probably hasn't been participating in as of late.

    Pfft!
    • The thrust of the article is that most programmers are not skilled enough to write secure code, so they should be using HLL's that do the security for them, and leave C/C++ code to the "experts."

      No, that is not the thrust of the argument. The thrust of the argument is that high level languages have automated features (memory management in particular) that can reduce the chance of security bugs, just as there are features in airplanes to help pilots avoid mistakes.

      Security is a process, not a produc

    • The article really touched a nerve, didn't it?

      While I believe there is a need for the computing industry to move towards more responsibility for security, focusing just on C/C++ programmers will not do the job. There is plenty of improvement to be made by the end users, and the columnists as well!

      Where in the article did he say that the only way to improve security is to use high level languages? As I read it, he said that one way to improve security is to do so.

  • Problems (Score:3, Informative)

    by evilviper (135110) on Thursday March 27, 2003 @10:08AM (#5607032) Journal
    Well, he wants everyone to write everything in Perl, except that, as he himself brings up, Perl is just as insecure.

    So what will everything be written in??? Every part of the OS in Java? Never mind the performance, and the lack of a good, Open Source JVM/JDK.

    If you want secure programs, start putting strong typing and other security measures into GCC. You can't have a string overflow if GCC checks everywhere data is input and makes sure the input can be no longer than the string...

    A bit different, but it deserves to be mentioned that OpenBSD 3.3 (which will be released VERY soon-already tagged), has numerous protections as described here, by Theo [theaimsgroup.com]. Yes, that's right... OpenBSD has been doing the job before Lasser even lifted a finger to complain.
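Until the compiler does it for you, the check has to live in the code; a small sketch of the difference (the helper name is made up): gets() has no idea how big the buffer is, while fgets() is told.

```cpp
#include <cstdio>
#include <cstring>

// Bounded line read: stores at most len-1 bytes plus the NUL, which is
// exactly the "input can be no longer than the string" check the parent
// wants enforced everywhere.
bool read_line(char *buf, std::size_t len, std::FILE *in) {
    if (!std::fgets(buf, static_cast<int>(len), in))
        return false;
    buf[std::strcspn(buf, "\n")] = '\0';  // drop the trailing newline
    return true;
}
```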
  • by grokBoy (582119) on Thursday March 27, 2003 @10:09AM (#5607042)
    "Why do we still see these bugs?"

    Well, perhaps it's because of one of the following reasons:

    1) Too many programmers aren't granted adequate debugging time by companies who'd rather get code to market on time than test it thoroughly and miss deadlines.
    2) How many programmers do you know who actually know how to audit code for security issues? How many companies are going to invest time and effort (and money) in hiring these people (or training their own)?
    3) As people learn new languages, do they learn secure practices too? No, they learn how to write functional code. For some, that's enough to get the job done.

    "In an age where processing power is cheap, there's no excuse for a mail client written in C or C++."

    How about portability? Or efficiency? Or the fact that the guys writing the code would rather work on the mail client than go and learn a new language first? If they are writing bad code in a language they have been using for years, why move the problem to an area where they have even less expertise?

    Writing secure code is a black art to many, and we can only hope that peer review and open source will help to spread the word amongst today's developers.

  • You said it yourself: "Neither programmers nor system administrators like diversity in the underlying environment: it makes debugging much more difficult." So, the solution isn't to switch en masse to Java or Perl; the solution is to make it harder to write insecure code in gcc. No one should be using sprintf anymore, so why doesn't its use trigger a warning of some sort? For instance, have libc only export "unsafe_sprintf", and have stdio.h #define sprintf to that *and* emit a #pragma warning each time
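GCC already has a blunt version of this knob: #pragma GCC poison makes any later use of an identifier a hard compile error (an error rather than the proposed warning, and a GCC extension, not libc, but the effect is similar). A sketch, with a made-up wrapper name:

```cpp
#include <cstdio>
#include <string>

#pragma GCC poison sprintf
// From here on, any mention of sprintf in this translation unit fails
// to compile; the bounded snprintf still works fine.

std::string format_port(int port) {
    char buf[32];
    std::snprintf(buf, sizeof buf, "port=%d", port);
    return buf;
}
```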
  • I routinely get into "discussions" with old coworkers about how I can possibly stand my job, which is largely Java-related. It's plenty fast for my tasks (I'm not streaming video), and pretty darn secure. I just sort of sit back and smirk as they rip what little hair remains from their heads struggling to figure out why the vtable gets corrupted on a certain long-lived object but only after an uncertain number of events.

    Moreover, my productivity is three times better than with C/C++. (Measured back whe
  • The article's author states:
    For users accessing mail via IMAP or POP, network speed and congestion have a greater influence over performance than anything done on the client side; even for users with local mailboxes, I doubt that we're looking at a huge performance hit.

    I beg to differ. I use email a lot - as do most people. I need it to be lightning quick in handling multi-megabyte local mailboxes. Basically, he could not have thought of a worse example of where performance supposedly does not matter
    • Basically, he could not have thought of a worse example of where performance supposedly does not matter since modern email repositories resemble databases these days.

      Amen! For example, AOL's new AOL Communicator, which I believe is based on the interesting but weighty Mozilla Mail component, downloads mail from an IMAP server *5 times* as slowly as Outlook Express, and has slower list boxes as well.
  • Most of the most serious vulnerabilities (in things like the kernel, TCP stacks, libc, high-traffic web servers, etc.) *need* to be written in a low-level language for reasons of efficiency. While I agree that it probably makes sense to re-implement lpd in Perl or Python or whatever, we will always need core services to be written "close to the metal".

    That being said, I think ANSI should revisit printf and find a way to fix it. Any compiler can do a good job at type checking, and it should be possible to
  • C sucks, C++ sucks, PHP sucks, Perl sucks, programmers suck (I'm not a programmer, but trust me they suck) and there may be tools to make them not suck, about which I know nothing so I'll just use this lame-ass fighter pilot analogy instead. Vote Quimby.

  • I guess he didn't get the memo. Real programmers use assembler! ;)

  • by BigBir3d (454486)
    I never knew ego was such a problem for humans.

    </sarcasm>
  • by Kunta Kinte (323399) on Thursday March 27, 2003 @10:24AM (#5607169) Journal

    It's bad programming habits and the lack of tools that catch those program time errors.

    Static analysis, the use of programs to analyze code that has not been compiled completely to machine code, has historically been underdeveloped in open source. I used to have a list (it was my research focus for a while), but basically we have one decent static analyzer, Splint [splint.org], and it's not that hot compared to commercial offerings.

    For instance, the HANDS group from Stanford has tracked down lots of kernel bugs using their in-house analyzer, in lots of obscure places. MS has an in-house program I hear they guard more closely than the kernel itself! I have a friend who did kernel work for them who agreed with this; they gave him the kernel source but not even the analyzer binaries. The guy who wrote it, well known in pointer-analysis circles, often goes on about how he's found tons of bugs in open source projects (bet you he's not filing bug reports).

    There are lots of groups working on static analysis, but no one wants to open source their code.

    Hind, Michael http://www.research.ibm.com/people/h/hind/ [ibm.com]

    Hind has written, amongst other things, probably the best and most recent introductory papers on pointer analysis, at http://www.research.ibm.com/people/h/hind/paste01.ps [ibm.com].

    stanford checker [stanford.edu]

    SUIF compiler suif [stanford.edu]

    I had a few other links, but the lameness filter is complaining about "too many junk characters". You'd think slashcode would have a better filter by now.

    • It's bad programming habits and the lack of tools that catch those program time errors.

      ...There are lots of groups working on static analysis, but no one wants to open source their code.

      A high level language compiler/interpreter is a tool. ;) And there are lots of them out there. Even open source ones!

  • I got maybe halfway into it when he started talking the same old bullshit about higher-level languages etc etc. His intro: "Until Unix and Linux programmers get over their macho love for low-level programming languages, the security holes will continue to flow freely."

    Sure, but ummm the same goes for every other programmer, not just Unix or Linux programmers. What about your favorite Windows programmers? Is their system not based on a low-level language? Even in comparison to high level languages the secur
  • I started off as a C/C++ programmer; after about 8 years I got into Perl instead, for much the same reasons he objects to using C for a mail client.

    C/C++ is wonderful for implementing operating systems, programming languages, servers that need to be fast, and libraries/modules.

    For a program which spends its time waiting for the user to press a key, it doesn't matter whether it's implemented in CRAY assembler or GWBasic---it's still waiting for the user to press a key. Of course, once an action has be

  • It's a bitch to deliver anything that doesn't compile using the standard, built-in tools. It's annoying as hell to deal with a package that requires a lot of installation.

    Perl is especially nasty in this regard. For any given nifty package, one needs to go out to CPAN and install 15 different libraries, some of which are broken, which make assumptions about the particular developer's environment, etc.

    The only way to deliver a portable program that can be assured to compile on the majority of machines is
  • A lot of what he gripes about is infeasible to write in a higher level language (which he does note). In those cases, the problem is better addressed with mandatory access controls.

    In the descriptions below, where HLL appears, I would mean something like Ruby, Python, maybe Perl.

    Kernel: If you're trying to be a UNIX system, only C is practical

    OpenSSL: The bulk of this is performance critical (C/C++)

    MySQL: Much of MySQL is performance critical (ie C/C++), although there's no damned reason why MySQL

  • Jon Lasser doesn't sound like a professional programmer. He also doesn't sound like he is very familiar with software engineering.

    There are no silver bullets.

    That's the first rule of software engineering.

    High level languages might help prevent or detect buffer overflows, but they don't prevent library function misuse, which is the cause of format string errors. They don't prevent input-validation errors; in fact most Perl programmers that use -T for taint also use stupid, unsafe untaint routines for user
  • "Computer science is the art of using a bulldozer to put your problems in someone else's backyard."

    I respect what Jon's trying to say here, and he's not entirely wrong...but it's a bit more complicated than he thinks.

    He seems to believe there's a direct correlation between bare metal risk and high level safety. This just isn't entirely accurate. For example, shell scripting is extraordinarily high level code; you're literally directing individually compiled applications to do your dirty work. And yet,
    • "High level languages have the advantage that "....

      Agreed entirely. The language exists at a given level of "highness" or otherwise.

      I like to take assembler, C and Haskell as 3 handy options, for example. You can write something fairly funky like:

      unwords (filter ((>3) . length) (words "a short string or something"))

      and it'll return a string comprising all those words that were >3 characters long in the original, when split on whitespace. ("short string something", duh.)

      You can also write some functi
  • Studies have shown that programmer productivity, measured by lines of code over time, varies little between languages.

    Great! Now let's move on to some benchmark that actually matters. Lines of code over time has never been a good benchmark. Better ones are number of bugs, time to milestone, number of milestones accomplished on time, and user satisfaction. No, none of them are perfect. Welcome to reality.

    the low-level constructs that C and C++ programmers spend time managing are the same ones that can ge
  • Lasser has a pessimistic view of programmers. He assumes they care about security, and he also assumes they are unable to write secure (abuse-resistant) application code.

    Lasser is optimistic about high-level languages. He assumes that the buffer-overflow and input-validation problems can be solved by giving programmers complicated components wrapped in high-level language constructs, and forcing programmers to use them, as in making the low-level features unavailable.

    I agree with Lasser that sloppy use of l


  • What Mr. Lasser seems to miss is the fact that security holes are caused by programmer error. Whether an application is written in C or Perl, the developer can still make fatal errors in anticipating input to the program, leading to security problems.

    For example, he cites a very good example that a POP/IMAP server does not need to be written in a low-level language. I tend to agree, but the bug problems certainly won't stop there. Sure it'll be harder to create a buffer overflow (provided the Perl/Pytho
  • While I have to give him credit for managing part of the Bastille project, which I've used at home and which I think is extremely useful, his philosophy of programming is pretty simplistic and dumb, IMHO.

    He thinks we should all be using languages like Perl and Java, unless we're doing something His Highness considers elite enough to *require* the use of a lower-level language like C or C++. He thinks that all the people developing for Linux who have chosen to use C/C++ instead of his pet languages (again,
  • I think he's absolutely right: trying to write secure systems in C or C++ is an uphill battle. With enormous amounts of work and testing, it can be done, but why would anybody want to?

    On the other hand, Linux and Windows are in the same boat here: most of their critical components are written in the same languages, C and C++. Ditto for Solaris and Mac OS X. So, Windows doesn't have an intrinsic advantage there when it comes to languages and tools.

    But Linux's development processes, community, and modu

  • by icantblvitsnotbutter (472010) on Thursday March 27, 2003 @11:29AM (#5607795)
    Somewhere in there, Lasser has some good intentions. But his line of argument basically goes:

    • there's no excuse to code in "low level" C++
    • okay, if you insist on coding in C++, use a scoping toolkit to make it less dangerous
    • "high-level" languages like Perl and PHP have an astounding number of bugs


    Huh???

    So, where's the improvement coming from? Anyone who arbitrarily lumps C++ into "low-level" and Perl into "high-level" needs to get his head checked. Then there's the fallacy that automated tools make things better.

    Look, if you're going to code well, it's a technique and a discipline. Bugginess depends on your toolkits -- and you sure as hell aren't getting fewer bugs by using PHP as opposed to C++. Unless I live in a parallel universe, Perl is no more innately readable than C/C++, so that clears up any "natural" predisposition to bug-free code.

    This is an article that could've been expressed in one paragraph and made the same point. Or taken the same amount of space and actually had some supporting logic.
  • by j3110 (193209) <samterrellNO@SPAMgmail.com> on Thursday March 27, 2003 @11:45AM (#5607941) Homepage
    The biggest problem with C/C++ is the complete lack of a standard safe library. Developers don't like to say "You need to install these 20 other packages before my program will run on your system."

    Also there are language bigots that say "C is better than X because it's faster" or "Just because we have a 2GHz, doesn't mean we should waste it."

    C is faster than X?
    A language isn't fast; only its compiler has had more time to be optimized. Natively compiled programs in fact have less of a chance to be fast for these reasons:
    1) compiled for a specific architecture, usually 386.
    2) can not inline shared library calls.

    At the moment C may be faster than X, but in a lot of cases, X (where X is an interpreted or VM-compiled language) will eventually overtake C in terms of performance.

    Just because we have 2Ghz, doesn't mean we should waste it?

    Also, you have to consider that these languages usually have enormous libraries and abstractions, like Java's, so that developers don't have to code as much. This means there is less code that has to be checked for security issues. It means more efficient developers. At some point, the time of the developer outweighs the time spent on the CPU. I would rather people were making this choice than skimping on security. Security is often implemented after the fact in C.

    Security is always more important than performance to me. You are playing with data. Data is the single most important asset that anyone has. People cry for hours over losing just 4K-10K of data. I would say to those who don't have the time to make a program right in C (it takes more time than Java if you consider security and performance): don't write it in C. You could end up causing a lot of people a lot of pain.

    If you don't like Java, use Python. If you don't like Python, write a good VM. If you don't like VMs, you better take the time to check your code well. If you don't have the time either, don't even bother programming. You are just going to do more harm than good.
  • bah humbug (Score:3, Informative)

    by dh003i (203189) <dh003i@@@gmail...com> on Thursday March 27, 2003 @03:09PM (#5609759) Homepage Journal
    This is a bunch of hogwash. This guy sounds more like an MS rep than a Security Focus programmer. Because RAM and CPU are so cheap and powerful now, it doesn't matter if we write crappy bloated code? That's such bullshit. Consider that writing crappy bloated slow code in fact reduces system stability, by adding to CPU and RAM strain. Just as a security breach can result in data loss, so can a system or program crash.

    And, in case this author does not know, not everyone is buying the latest 2GHz CPUs. For most people's needs, a 100MHz PC is still fine and dandy -- that is, unless fuckwits like him and the geniuses at MS continue to make ever-more bloated and slow code.

    And why is RAM-usage so important, even with typical PC's having 256MB of RAM? Well, apparently this guy is still stuck in the good old one-program-at-a-time days. People run many programs concurrently -- the more resources each one takes up, the less happy the user is, because that means everything is slower overall, and may even mean the system has to switch over to virtual memory.

    And why is it that we always get complacent jackasses like this assuming that speed, memory usage, stability, and security are always inversely related to one another? They aren't. For the most part, writing smaller, faster code isn't necessarily going to make the program less secure, nor is writing more secure, stable code going to make it perform worse. There are a *few* specific cases where there are trade-offs: in those cases, it's up to the programmer to decide what's best based on his target audience. If he's targeting home users, a little bit of security can be sacrificed for increased performance. If he's targeting business users, a little bit of performance can be sacrificed for security. Stability is always a priority.

    Alas, I'm sick of hearing "this programming language (C) is faster than that one (Perl), and assembly is faster than them all". Programming languages aren't "fast". Assembly allows a programmer to produce very highly optimized, fast code. However, poor assembly will run much worse than C compiled with GCC. Comparing C to other languages, C programs tend to be faster because C allows for more direct control and C compilers have had more time to be optimized. I doubt other languages will ever catch up to C in that regard, but a crappily written C program will still be beaten by a well-written Perl program (personally, I suggest Objective-C, because it provides OO with only a few additions to C).

"Everything should be made as simple as possible, but not simpler." -- Albert Einstein

Working...