
How Your Compiler Can Compromise Application Security 470

jfruh writes "Most day-to-day programmers have only a general idea of how compilers transform human-readable code into the machine language that actually powers computers. In an attempt to streamline applications, many compilers actually remove code that they perceive to be undefined or unstable — and, as a research group at MIT has found, in doing so they can make applications less secure. The good news is the researchers have developed a model and a static checker for identifying unstable code. Their checker is called STACK, and it currently works for checking C/C++ code. The idea is that it will warn programmers about unstable code in their applications, so they can fix it, rather than have the compiler simply leave it out. They also hope it will encourage compiler writers to rethink how they can optimize code in more secure ways. STACK was run against a number of systems written in C/C++ and it found 160 new bugs in the systems tested, including the Linux kernel (32 bugs found), Mozilla (3), Postgres (9) and Python (5). They also found that, of the 8,575 packages in the Debian Wheezy archive that contained C/C++ code, STACK detected at least one instance of unstable code in 3,471 of them, which, as the researchers write (PDF), 'suggests that unstable code is a widespread problem.'"
  • by istartedi ( 132515 ) on Tuesday October 29, 2013 @07:25PM (#45274579) Journal

    If my C code contains *foo = 2; the compiler can't just leave that out. If my code contains if (foo) { *foo = 2; } else { return EDUFUS; } it can verify that my code is checking for NULL pointers. That's nice; but the questions remain:

    What is "unstable code" and how can a compiler leave it out? If the compiler can leave it out, it's unreachable code and/or code that is devoid of semantics. No sane compiler can alter the semantics of your code, at least no compiler I would want to use. I'd rather set -Wall and get a warning.

  • by Zero__Kelvin ( 151819 ) on Tuesday October 29, 2013 @07:46PM (#45274757) Homepage

    "For example, if a pointer is passed to a function, is the function allowed to dereference it without first checking it for NULL?"

    Of course it is, and it is supposed to be able to do so. If you were an embedded systems programmer you would know that, and also know why. Next you'll be complaining that languages allow infinite loops (again, a very useful thing to be able to do). C doesn't protect the programmer from himself, and that's by design. Compilers have switches for a reason. If a tool doesn't know how the code is being built or what its purpose is, then it can't possibly determine with another program whether the code is "unstable".

  • -Wall (Score:3, Insightful)

    by Spazmania ( 174582 ) on Tuesday October 29, 2013 @07:50PM (#45274785) Homepage

    If I set -Wall and the compiler fails to warn me that it optimized out a piece of my code then the compiler is wrong. Period. Full stop.

    I don't care what "unstable" justification its authors gleaned from the standard, don't mess with my code without telling me you did so.

  • by Myria ( 562655 ) on Tuesday October 29, 2013 @08:03PM (#45274871)

    The C standard needs to face some realities to fix this issue. The C committee wants their language to be usable on the most esoteric of architectures, and this is the result.

    The reason that the results of signed integer overflow and underflow are not defined is that the C standard does not require the machine to be two's complement. Same for 1 << 31 and the negation of INT_MIN being undefined. When was the last time you used a machine whose integer format was one's complement?

    Here are the things I think should change in the C standard to fix this:

      * Fix two's complement as the integer format.
      * For signed integers, a 1 bit shifted left out of the most-significant value bit lands in the sign bit. Combined with the above, this means that for type T, ((T) 1) << ((sizeof(T) * CHAR_BIT) - 1) is the minimum value.
      * The results of signed addition, subtraction, and multiplication are defined as converting all promoted operands to the equivalent unsigned type, executing the operation, then converting the result back. (In the case of multiplication, the high half is chopped off. This makes signed and unsigned multiplication equivalent.)
      * When shifting right a signed integer, each new bit is a copy of the sign bit. That is, INT_MIN >> ((sizeof(int) * CHAR_BIT) - 1) == -1.

    That should fix most of these. Checking a pointer for wraparound on addition, however, is just dumb programming, and should remain the programmers' problem. Segmentation is something that has to remain a possibility.

  • Re:-Wall (Score:3, Insightful)

    by Anonymous Coward on Tuesday October 29, 2013 @08:31PM (#45275077)

    If the compiler finds two constants it can combine then I've usually made a mistake in my code...

    Or it inlined a function for you. Or you indexed at a constant index (perhaps 0) into a global array. Or any number of other things that can arise naturally and implicitly.

    The compiler has a setting where it doesn't "mess with your code" -- it's called -O0.

  • Re:PC Lint anyone? (Score:4, Insightful)

    by EvanED ( 569694 ) <evaned@NOspAM.gmail.com> on Tuesday October 29, 2013 @08:34PM (#45275113)

    Don't worry, the authors know what they're doing.

    Just because PC Lint could find a small number of potential bugs doesn't mean it's a solved problem by any means. Program analysis is still pretty crappy in general, and they made another improvement, just like tons of people before them, PC Lint before them, and tons of people before PC Lint.

  • by seebs ( 15766 ) on Tuesday October 29, 2013 @08:36PM (#45275125) Homepage

    Pretty sure the embedded systems guys wouldn't be super supportive of this, and they're by far the largest market for C.

    And I just don't think these are big sources of trouble most of the time. If people would just go read Spencer's 10 Commandments for C Programmers, this would be pretty much solved.

  • by Old Wolf ( 56093 ) on Tuesday October 29, 2013 @10:06PM (#45275821)

    >The dereference is undefined, and therefore

    Stop right here. Once undefined behaviour occurs, "all bets are off" as they say; the remaining code may have any behaviour whatsoever. C works like this on purpose, and it's something I agree with. It means the compiler doesn't have to insert screeds of extra checks, both at compile-time and run-time.

    There are plenty of other languages you can use if you want a different language definition :)
