Programming Security

Eric Raymond Shares 'Code Archaeology' Tips, Urges Bug-Hunts in Ancient Code (itprotoday.com) 109

Open source guru Eric Raymond warned about the possibility of security bugs in critical code which can now date back more than two decades -- in a talk titled "Rescuing Ancient Code" at last week's SouthEast Linux Fest in North Carolina. In a new interview with ITPro Today, Raymond offered this advice on the increasingly important art of "code archaeology". "Apply code validators as much as you can," he said. "Static analysis, dynamic analysis, if you're working in Python use Pylint, because every bug you find with those tools is a bug that you're not going to have to bleed through your own eyeballs to find... It's a good thing when you have a legacy code base to occasionally unleash somebody on it with a decent sense of architecture and say, 'Here's some money and some time; refactor it until it's clean.' Looks like a waste of money until you run into major systemic problems later because the code base got too crufty. You want to head that off...."
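As a minimal illustration of the kind of defect a linter like Pylint flags before the code ever runs, consider Python's classic mutable-default-argument bug (Pylint reports it as W0102, "dangerous default value"). The function names here are hypothetical, just for the sketch:

```python
# Bug: a mutable default argument is created once, at function
# definition time, and then shared across every call.
def append_item(item, bucket=[]):
    bucket.append(item)
    return bucket

first = append_item("a")
second = append_item("b")  # surprise: "a" is still in the list
print(second)              # ['a', 'b'], not ['b']

# The safe idiom uses None as a sentinel and allocates per call.
def append_item_safe(item, bucket=None):
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket

print(append_item_safe("b"))  # ['b']
```

This is exactly the sort of thing Raymond means: a static tool catches the pattern mechanically, with no eyeball-bleeding required.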

"Documentation is important," he added, "applying all the validators you can is important, paying attention to architecture, paying attention to what's clean is important, because dirty code attracts defects. Code that's difficult to read, difficult to understand, that's where the bugs are going to come out of apparent nowhere and mug you."

For a final word of advice, Raymond suggested that it might be time to consider moving away from some legacy programming languages as well. "I've been a C programmer for 35 years and have written C++, though I don't like it very much," he said. "One of the things I think is happening right now is the dominance of that pair of languages is coming to an end. It's time to start looking beyond those languages for systems programming. The reason is we've reached a project scale, we've reached a typical volume of code, at which the defect rates from the kind of manual memory management that you have to do in those languages are simply unacceptable anymore... I think it's time for working programmers and project managers to start thinking about, how about if we not do this in C and not incur those crazy downstream error rates."

Raymond says he prefers Go for his alternative to C, complaining that Rust has a high entry barrier, partly because "the Rust people have not gotten their act together about a standard library."

Comments Filter:
  • Well, that reminds me of my Rust bootstrapping experience this week, just to build Firefox: https://www.youtube.com/watch?... [youtube.com]
  • by Bruce Perens ( 3872 ) <bruce@perens.com> on Saturday June 16, 2018 @03:08PM (#56795560) Homepage Journal

    Eric suggests it's time to move on from C, and there are indeed better languages today that can help eliminate many classes of error.

    Crystal [crystal-lang.org] is a rising programming language with the slogan "Fast as C, Slick as Ruby". It has some compelling features that make it more attractive than other modern language attempts like Go. You really can program in a Ruby-like language and achieve software that performs with the speed of a compiled language. And you can do systems programming in Crystal, too: it has pointers (though it doesn't encourage using them for anything but systems programming and inter-language interfaces), and it can format structs as required to work on hardware registers.

    But the greatest advantage of Crystal, that I have experienced so far, is that it provides type-safety without excessive declarations as you would see in Java. It does this through program-wide type inference. So, if you write a function like this:

    def add(a, b)
      a + b
    end

    add(1, 2) # => 3, and the returned type is Int32
    add(1.0, 2) # => 3.0, and the returned type is Float64

    You get type-safe duck-typing at compile-time. If a method isn't available in a type, you'll find out at compile-time. Similarly, the type of a variable can be inferred from what you assign to it, and does not have to be declared.

    Now, let's say you never want to see nil as a variable value. If you declare the type of a variable, the compiler will complain at compile-time if anything tries to assign another type to it. So, this catches all of those problems you might have in Ruby or JavaScript with nil popping up unexpectedly as a value and your code breaking in production because nil doesn't have the methods you expect.

    There are union types. So, if you want to see nil, you can declare your variable this way:

    a : String | Nil

    a : String? # Shorthand for the above.
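    For comparison from outside Crystal: Python's optional typing expresses the same union-of-nil idea, though there the check comes from an external analyzer such as mypy rather than the compiler. A minimal sketch (the function is hypothetical, just to show the narrowing):

```python
from typing import Optional

# Optional[str] is Python's analogue of Crystal's String? shorthand,
# i.e. the union "str | None".
def greet(name: Optional[str]) -> str:
    if name is None:          # the checker forces you to handle the nil case
        return "hello, stranger"
    # After the None check, the type is narrowed to plain str,
    # so str methods are known to be safe here.
    return "hello, " + name.upper()

print(greet(None))     # hello, stranger
print(greet("world"))  # hello, WORLD
```

    The difference Perens is pointing at: in Crystal this discipline is enforced by the compiler itself, not an optional add-on tool.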

    Crystal handles metaprogramming in several ways. Type inference and duck typing gives functions and class methods parameterized types for free, without any declaration overhead. Then there are generics which allow you to declare a class with parameterized types. And there is an extremely powerful macro system. The macro system gives access to AST nodes in the compiler, type inference, and a very rich set of operators. You can call shell commands at compile-time and incorporate their output into macros. Most of the methods of String are duplicated for macros, so you can do arbitrary textual transformations.

    There is an excellent interface to cross-language calls, so you can incorporate C code, etc. There are pointers and structs, so systems programming (like device drivers) is possible. Pointers and cross-language calls are "unsafe" (can cause segmentation faults, buffer overflows, etc.) but most programmers would never go there.

    What have I missed so far? Run-time debugging is in a very primitive state. The developers complain that LLVM and LLDB have changed their debugging data format several times recently. There's no const and no frozen objects. The developers correctly point out that const is propagated through all of your code and doesn't often result in code optimization. I actually like it from an error-catching perspective, and to store some constant data in a way that's easily shareable across multiple threads. But Crystal already stores strings and some other data this way. And these are small issues compared to the benefits of the language.

    Lucky

    Paul Smith of Thoughtbot (a company well-known for their Ruby on Rails expertise) is creating the Lucky web framework [luckyframework.org], written in Crystal and inspired by Rails, which has pervasive type-safety - and without the declaration overhead as in Java.

    The point of all of this is that you can create a web application as you might using Ruby on Rails, but you won't have to spend as much time writing tests, because some of the

    • I find it very difficult to find really good C programmers who also have a good understanding of low-level systems with some domain knowledge. Lots of self-taught C programmers, though; they can write stuff that's good enough for their own simple tools, but they lack experience working on complex software in a team, mostly because they're coming from EE or science backgrounds. People coming from a CS background don't seem to know C, only vaguely know C++ (and can't make the tiny step from there to C), and who com

      • I find C programmers in embedded work who don't understand that they are responsible for everything coming up out of the reset vector. They assume timers are initialized, that interrupt vectors are pre-defined, and that they can just coast along at an application level. That's just the consequence of too much programming on top of an operating system, I guess.

      • I am not so concerned with companies making the move, as the Open Source community and green-field projects. The same folks who use Rails in start-ups should be keeping an eye on this.
      • I find it very difficult to find really good C programmers
        If you really need C programmers, why not hire a competent programmer and teach him C or simply ask him to learn it over a weekend ...

      • by jma05 ( 897351 )

        C takes a huge amount of experience before you stop shooting yourself in the foot and start being productive. Languages like Go are trivial to learn. Rust is less trivial.

        No one is expecting C to disappear any time soon.

    • by jma05 ( 897351 )

      You should also check out Nim.
      https://nim-lang.org/ [nim-lang.org]

      It is the Python version of Crystal. It transpiles to C/C++/Node. Has type inference, integrates Boehm GC, good FFI, meta-programming etc. Similar performance to C (CPU, RAM, static binary size), but with the productivity of Python.

      Crystal seems to have more modules - both seem to have the essential libraries covered.

  • Another language-du-jour article. I remember in the early days of UNIX, one of the sayings about C was something like "it expects the programmer to know what they are doing, it is in a hold-your-hand language".

    Well I guess we should go back to COBOL or FORTRAN then

    • by jmccue ( 834797 )
      Well, a typo: instead of "it is in a" I meant "it is NOT a". Anyway, this was tongue-in-cheek due to "archaeology".
  • Eric Raymond is renowned not for his programming/engineering achievements, but for being a public speaker. What is the value of his advice?

    • He is visionary in the same sense that a guy assigned to give the keynote speech at an IBM Symposium in 1966 was visionary.

    • What is the value of his advice?

      I always find it valuable to evaluate the advice first; then see whom it came from.

      Let's look at his advice:

      • Add static code analysis even to legacy code.

        Frankly I think static code analysis is always worth doing. It's a way of doing automated "apply learned lessons from someone else". Yes, there may be false positives - but you'll quickly uncover code smells.

      • Change to modern languages.

        I think this one is also a no-brainer. I used to do Fortran...switched to C/C++

  • by Anonymous Coward

    "we've reached a typical volume of code, at which the defect rates from the kind of manual memory management that you have to do in those languages are simply unacceptable anymore"

    Translation: "We kept telling programmers they're too dumb to do their own memory management and should rely on high-level languages like Java which handle memory management automatically, and now the programmers coming out of college don't even understand memory management because it's too complex, so it's time to move onto something like

    • It's not a matter of dumb or smart, it's a matter of being human. There are only so many things you can pay attention to at once, and at a certain level of machine capability it's just not worth doing anymore for many use cases. And the choice isn't C++ or silly over-the-top OO; there are a lot of options for memory-safe languages.

  • It's odd that he talks about using tools to validate your code on one hand and then recommends moving away from C or C++ on the other.

    There's actually some pretty fantastic work on sanitizers being done right now in Clang (and other tooling chains) that can enforce memory and type safety at run time.

    You can do all your development with the sanitizers turned on, and then when you want speed when you're ready to release, turn them off.

    There's still nothing faster than C or C++ other than assembly, and even then you

  • Documentation is important.....Code that's difficult to read, difficult to understand

    "Real programmers don't comment their code. If it was hard to write, it should be hard to understand." - some coder who doesn't work here anymore.
