Embedded RTOS Maker Raises Linux Security Issues 341

drquizas writes "Embedded RTOS provider Green Hills recently delivered an address where they raised the question of whether Linux can be considered secure enough to be used in defense applications. Much of the usual FUD is present in the remarks, although an interesting question is raised regarding what defense and other government contractors are required to do in testing code (in this case anyway): is the closed code here being held to a higher standard than its open-source equivalent, and does this change the 'security through obscurity' argument?"
  • by mindless4210 ( 768563 ) * on Saturday April 10, 2004 @11:14PM (#8828453) Homepage Journal
    "Open Source is actually more secure than closed source proprietary software because the oversight of technology content is broader and deeper. Instead of just one company monitoring its own contributions -- or potentially hiding security holes and exploits -- a worldwide community of interested parties actually oversees Linux to make it strong and secure. That's why the NSA -- the most security-conscious organization in the world -- chose to standardize on Linux, and even supplies its own version of secure Linux."

    Can't put it much better than that. When you have the contribution of the entire open source development community, so much knowledge and experience comes to the table that it's difficult for any one group of programmers to compete.

    • by beacher ( 82033 ) on Saturday April 10, 2004 @11:31PM (#8828538) Homepage
      Yeah, but he's spewing this crap: "Everyday new code is added to Linux in Russia, China and elsewhere throughout the world. Everyday that code is incorporated into our command, control, communications and weapons systems. This must stop." C'mon, he has a vested interest: his own company puts out its own RTOS [ghs.com]. Go to that link. Now. Read the top of the middle column, "Real-Time Operating Systems Must be Highly Reliable":
      Microsoft Windows, MacOS, Unix, and Linux often crash, lock up, or go crazy. They indicate this condition by displaying a sad face, an exploding bomb, a red X, a blue screen of death, or by simply refusing to respond to mouse-clicks or keyboard input.

      This is FUD and he does have a vested interest.
      • Well, yes, he does have a vested interest and is trying to sell his product, but maybe that's why his company devised this product in the first place: they felt there was a need and a market for it.

        Frankly, even as a faithful Linux user, I still have to agree with him. Our missile defense systems should not be running the same software as my home PC whether it is a commercial or open-source product.

        • No.

          You want custom, quality, made for Govt. spec code! The kind that is produced by either the low-bidder, or corporate crony!

        • by cmacb ( 547347 ) on Sunday April 11, 2004 @01:17AM (#8828905) Homepage Journal
          "Frankly, even as a faithful Linux user, I still have to agree with him. Our missile defense systems should not be running the same software as my home PC whether it is a commercial or open-source product."

          Funny... I feel just the opposite. Whether it's missile control, voting machines, or accounting systems, 99% of what the operating system's components are doing is the same. I'd want that code tested millions of times if possible. Of course some of the code, unique to that application, can only be tested in place, but the less there is of that the better. For every person who would want to introduce a flaw into such software there are hundreds, more likely thousands, who would want to expose that flaw and fix it. It really doesn't matter if their reasons are patriotic or ego-related.

          It is closed systems, after all, that produce voting machines with huge bugs in them, and closed systems that crash vehicles into Mars due to metric-to-English conversion bugs. It is also closed systems that left laptop computers used in Afghanistan subverted by pop-up messages from ... well, nobody really knows. The notion that closed systems are superior from the security point of view simply doesn't hold up to any sort of statistical analysis. Heck, it doesn't even hold up to a back-of-the-napkin analysis.
          • by Anonymous Coward
            Most RTOSs are small, a tiny fraction in size compared to general purpose operating systems, making them easier to write well and test thoroughly.

            The feature requirements for control systems are also vastly different and would inevitably exercise different features of the system, so testing in the server or desktop areas would be of limited value. No general purpose operating system provides hard real-time constraints out of the box.

            My preference would be an open source RTOS. I know there are a lot of p
    • by Total_Wimp ( 564548 ) on Saturday April 10, 2004 @11:33PM (#8828548)
      Come on. These guys have a valid point. When you rely on high-quality closed source vendors like Cisco, at least you guarantee you won't have back doors built into your system.

      Oh. Wait. Nevermind.
    • by alangmead ( 109702 ) * on Sunday April 11, 2004 @12:59AM (#8828848)

      When you buy an RTOS, you usually aren't getting compiled executable code. You usually get source code that you need to port to the hardware you are building.

      Data sheets like this [ghs.com] imply that Green Hills adheres to this common practice. So all the "open source is more trustworthy than a black box" arguments don't apply: anyone who wishes to deploy a system based on Green Hills' RTOS can audit the code; it isn't hidden from them. Also, this linked PDF [ghs.com] says:

      INTEGRITY178B has been audited and approved by the FAA for DO178B Level A use.
      Which to me implies that it has had a more thorough external audit than most open source packages.

      One final argument is that an RTOS is usually very small. Their Velocity [ghs.com] RTOS can run in 3KB of RAM. When the OS is stripped down to something that small, a full audit seems like a much less daunting task.

      This implies that he isn't arguing security through obscurity; he is arguing for the cathedral approach vs. the bazaar. Don't get me wrong, he is still spreading FUD. It's just a different FUD than you think. He is ignoring the role that Linus Torvalds and some of his trusted lieutenants like Alan Cox play in planning a direction, vetting ideas, and protecting the stability of the code base. Patches don't just come out of the blue from anonymous sources and get applied without any examination, no matter what Dan O'Dowd may think.

      • that they sell source code instead of compiled code just makes it even funnier and more desperately pathetic that, in their press release, they made a reference to this ken thompson paper [acm.org] as proof that the "many eyes" theory doesn't hold.
      • Sure, Integrity is certified, but it has very limited capability. If I were doing something that required DO-178B Level A certification, I would consider it, and I would likely not consider Linux (yet). I would consider other vendors (Wind River pops to mind) as well as going OS-less and using a smaller microkernel approach.

        However, very very little Defense software requires DO-178B level ANYTHING certification.

        This certification does not mean that there are not bugs in the software. Based on some limited

    • by Eskarel ( 565631 ) on Sunday April 11, 2004 @01:09AM (#8828879)
      Well, the problem here is that that's not entirely true. Yes, OSS receives testing from a much larger and broader group of people, but how much of an asset is that for the military?

      I mean, I can test the latest version of Red Hat; I can even, if I really desire to do so and am willing to work out the specifics, fix some of the problems I might encounter. But the military is unlikely to care how something works on my system; they are going to want to know how it performs on their systems, the most important of which are likely to be either expensive and difficult-to-obtain servers or proprietary military hardware. I can't test that nor, I believe, can 99% of the people who test and examine OSS software.

      Even the NSA doesn't use stock Linux; they use their own brand of Linux which they've probably modified the bejesus out of. Linux was just an easier place to start than other OSes. (I don't doubt that the NSA could make their own version of Windows if they liked, and there wouldn't be a damned thing MS could do about it, but it'd be a pain.)

      • the military uses the exact same off-the-shelf software and hardware that the rest of the world does - you think they have their own computer chip manufacturing going on? of course not.

        you think they don't use the same big oracle databases that everyone else uses?

        there was an article posted in the last week about the US navy's newest fanciest warship, the commanders were all drooling about how they can run the ship with 3 people on the bridge compared to 8 on a standard ship - and the article SPECIFICALL
    • by HungWeiLo ( 250320 ) on Sunday April 11, 2004 @01:18AM (#8828910)
      I develop aircraft safety software, and the FAA's guidelines require that all code and tools must be certified at the same level of competency. Windows cannot be qualified as a valid development tool or environment, because it is closed source.
      • by flossie ( 135232 ) on Sunday April 11, 2004 @07:22AM (#8829653) Homepage
        I develop aircraft safety software ... Windows cannot be qualified as a valid development tool or environment

        Perhaps that may be true of civilian aircraft systems, but the DoD certainly has no objection to using Windows as a development environment for military aircraft. The Common Operating Environment [dtic.mil] may change that in the future, but MS Windows is definitely used at the moment.

    • I believe that being able to build on top of open source software is one of the best parts about it. Customizing an open source project, in my mind, doesn't make it a proprietary or closed source project by any means.

      It is not too difficult to build your own customized OS based on Linux [linuxfromscratch.org], even using Red Hat (although it wouldn't be my choice of distribution to start with).
  • by Anonymous Coward
    on a related topic... here [slashdot.org].
  • by zogger ( 617870 ) on Saturday April 10, 2004 @11:19PM (#8828473) Homepage Journal
    quote from this raty-os dude

    "It costs us $500 to $1,000 a line to review our source code. It would cost billions of dollars to review Linux."

    Say whut? It actually costs this? why? where can I sign up???? I'll sub my per-line auditing out, rake it in...

    Naw, cmon, really? the government charges this, or he just pays this cost? Because..huh?

    • OS dude's got the quote wrong:

      "It costs us $500 to $1,000 a line to review our source code. It would cost _us_ billions of dollars to review Linux."

      That's why he's losing business.
    • In all fairness (Score:5, Insightful)

      by Anonymous Coward on Saturday April 10, 2004 @11:41PM (#8828581)
      The parent post is funny, but in all fairness I think the general idea is that he's discussing the cost per line for a very large system. A single line in isolation is easy to debug, but you can't debug lines in isolation, can you now? It should be fairly obvious that the average cost to debug per line of code increases the more lines of code you have in the system, since the different lines of code interact.

      And this tendency is probably much more pronounced when rather than debugging, you are, for example, attempting to certify something as a failsafe system.

      Linux is a fairly large and multifarious system. If his company sells a product that is designed and streamlined to be an embedded RTOS kernel, it more than likely achieves this in far, far fewer lines of code than Linux overall. While he is probably being unfair by counting things like desktop video card drivers in the total number of Linux lines of code, it is an altogether reasonable statement to suppose that the streamlined and smaller RTOS kernel this company sells is probably easier to debug and reason about than the Linux kernel, which is relatively larger, more complex, and has more complex design goals.

      • Let's not forget, also, that what he's selling is not as capable across multiple architectures as Linux is, nor is it going to have the diverse hardware support.

        Sure, a kernel designed for a specific hardware config and for specific applications is going to be more secure than one designed as I pointed out above.

        I fail to see what point he is making, if any. Apples, Oranges, and FUD.

        SB
      • Microsoft can at least assure that they know where their coders spend their workdays, and can submit their programmers for higher-level government background checks if need be. Open Source can't exactly counter that... I mean, how do you prove that an OSS project isn't being tampered with by contributors loyal to the enemy?

        Anonymous anything is annoying to the military. They need to be able to trust who and what they're dealing with. They want to be friendly with the Iraqis in the street as much as possibl
        • oss anonymous? (Score:3, Insightful)

          by IncohereD ( 513627 )
          Excuse me? Isn't the whole point of the LKML/CVS/BitKeeper process that every line that goes into the kernel (at the least) is traceable to somebody? Do any major projects give out anonymous CVS commit access? Or even access to people who aren't at least somewhat known by other developers?

          Meanwhile, at many commercial companies you could have employees who worked there for a few months and got fired/quit. Depending on their internal code tracking it might be hard to tell what code they submitted, and whether it s
      • So CP/M would be the ultimate system then? So secure not even someone on a network can log in due to lack of support.

        No features and really just a program bootstrapper.

    • While that figure is probably a fairly high exaggeration, I guess he is factoring in all costs of testing, not just the verification that a single line is correct.

      i.e., perhaps to test a piece of software that is responsible for guiding missiles or whatever, they may have to actually fire a few million-dollar missiles. Or they may have to build a test suite or simulation software, etc., for testing.
  • by pair-a-noyd ( 594371 ) on Saturday April 10, 2004 @11:20PM (#8828479)
    that Linux can be made pretty damn secure.
    If they have faith in it....
    http://www.nsa.gov/selinux/ [nsa.gov]
    • by Anonymous Coward
      that Linux can be made pretty damn secure.
      If they have faith in it....
      http://www.nsa.gov/selinux/


      except they say:

      There is still much work needed to develop a complete security solution. In addition, due to resource limitations, we have not yet been able to evaluate and optimize the performance of the security mechanisms.

      One problem, as I see it, is there are so many people messing with the code that each update would require a line-by-line check to verify nothing has changed - greatly increasing the co
      • One problem, as I see it, is there are so many people messing with the code that each update would require a line-by-line check to verify nothing has changed - greatly increasing the cost to maintain it certified as secure. Closed source, however, can be maintained with strict procedures to ensure only parts of it get changed, greatly reducing the time needed to verify. Is it "more secure"? That's debatable, but it is certainly easier to control changes, making it easier to keep secure.

        No matter whether closed o
    • by Anonymous Coward
      Wow! The NSA also thinks that Windows can be made pretty damn secure [conxion.com]!!

      Back to square one for that argument.
    • by Animats ( 122034 ) on Sunday April 11, 2004 @01:52PM (#8831319) Homepage
      There's a general misconception about NSA Secure Linux. It has a tough security model, but it's not developed to high security standards. The whole point of NSA Secure Linux is to find out if useful applications can be built on an OS with a mandatory security model. NSA has had tough OSs built for them before (I worked on one), but they were so restrictive that very few applications were developed for them.

      What developers can do to help is to modify a web server, a mail server, and a DNS server (the most attacked server side software) to run under NSA Secure Linux, partitioned into levels of integrity.

      The idea is that just because somebody attacks, say, the mail server receive program, that doesn't give them power over the whole system. All it should do is let them run their attack code in a jail where it can't do anything except burn CPU cycles, and maybe generate phony incoming mail. Mail is associated with the known IP address from which that copy of the receiver was launched, so the problem can be tracked. When they disconnect, or some monitor program kills the corrupted component off, all the damage should be flushed.

      You need to rework key server apps so that about 95% of the code is untrusted and jailed, while the 5% that has to do security-related functions is isolated, identified, and carefully examined.

      That's what NSA Secure Linux is for.
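The partitioning Animats describes maps directly onto SELinux type enforcement. A hypothetical policy fragment in SELinux type-enforcement syntax, sketching the "jailed 95%" idea; the type names here are invented for illustration and are not from the NSA's actual policy:

```
# Untrusted, network-facing mail receiver and the small trusted core.
type mail_rx_t;       # jailed receiver process
type mail_spool_t;    # incoming spool files
type mail_core_t;     # trusted, carefully examined component

# The receiver may only create and append to spool files.
allow mail_rx_t mail_spool_t:file { create append };
# No other allow rule mentions mail_rx_t: under mandatory access
# control, everything not explicitly allowed is denied, so a
# compromised receiver can burn CPU and write phony incoming mail,
# but cannot touch mail_core_t or the rest of the system.
```

The point of the sketch is the default-deny stance: the trusted 5% gets its own type and its own narrow rules, and a takeover of the untrusted receiver confers no authority beyond the spool.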

  • Review cost (Score:3, Insightful)

    by CrystalChronicles ( 706620 ) on Saturday April 10, 2004 @11:24PM (#8828501)
    "It costs us $500 to $1,000 a line to review our source code. It would cost billions of dollars to review Linux."

    How's that any different from if they chose Windows? Wouldn't it still cost them just as much? That's assuming they can get access to the Windows code. At least with Linux you don't have to pay to get it.

    And no the leaked source does not count.
    • Re:Review cost (Score:3, Informative)

      by plone ( 140417 )
      Green Hills isn't comparing Linux with Windows, but rather with their own RTOS, Integrity. And in that case, THEY ALREADY OWN AND CONTROL THE BLEEDING SOURCE CODE!
    • One would never use Windows for a secure system - well, it can be secure when it's sitting at a blue screen.

      Anyway, software security, I believe, is more about keeping it simple: fewer lines of code to go wrong, knowing every move the OS makes. The full Linux source is huge, but you don't need all of it. Linux isn't the answer for everything, but everyone is trying to make it solve yet another problem, which adds to its size and complexity.

      I think it would be alot easier starting from scratch which is jus
  • by CaptainPinko ( 753849 ) on Saturday April 10, 2004 @11:25PM (#8828507)
    While it is never good to rely on "security through obscurity", that doesn't mean it is useless. For example, if after all the thorough testing the same number of bugs were left (hypothetically) in the software, they would be harder to find in the closed system, where you wouldn't know where to start looking, as opposed to open source, where you could scan the source until you came upon what looked like a vulnerability. The obscurity isn't harmful in itself, and it provides an additional barrier. Maybe not a powerful one, but every little bit helps. I'd feel a little nervous if I knew some terrorist (as a much-overused example) could look over the source code (even if it had no holes!) for a nuclear weapon command centre or something of that sort. I think the ultimate question should be whether the open nature of open source development can lead to fewer bugs - and thus greater security - than closed source development plus the small bonus of obscurity. I think the value of obscurity may have been undervalued in the past; it does have some value.
    • by Aneurysm9 ( 723000 ) on Saturday April 10, 2004 @11:35PM (#8828559)
      The problem with your thinking is that you assume military applications would be opened. That's highly unlikely. Military applications may be built on an open source platform, but the code for a "nuclear weapon command centre" will remain closely guarded. And, as was mentioned earlier, terrorists don't need open source software to exploit security holes. Have you ever used Microsoft's Flight Simulator? How about Wilco's 767 Pilot-In-Command? There are two pieces of closed-source software that could have greatly facilitated the September 11, 2001 attacks.
  • give us a break (Score:5, Insightful)

    by Aneurysm9 ( 723000 ) on Saturday April 10, 2004 @11:26PM (#8828510)
    "Everyday that code is incorporated into our command, control, communications and weapons systems. This must stop."
    I don't know how they do things at his shop, but if the DoD is pulling code from CVS into their production systems without auditing it, we deserve whatever we get as a result. That said, I highly doubt that's happening and it's more likely this blowhard is just trying to put a good scare into the technophobic jarheads who control procurement.
    • Re:give us a break (Score:2, Insightful)

      by gnugie ( 757363 )
      IAWTP.

      If the Gov't requires the vendor to audit the code that stringently, why wouldn't they put the same requirement on the embedded Linux provider?

      In that case, it's the vendor's responsibility to audit to the gov't requirements. I seriously doubt it'll cost $500/line, but it should already be a part of the quote.
      • Re:give us a break (Score:3, Informative)

        by calidoscope ( 312571 )
        In that case, it's the vendor's responsibility to audit to the gov't requirements. I seriously doubt it'll cost $500/line, but it should already be a part of the quote.

        It might be more like $2,000/line - or more.

        There's also the issue of what kernel version you want to run - once you've decided on a certain version, it will be extremely painful to update to a new one. You also have to validate that the compilers are generating the expected code. Compared to a well designed RTOS, Linux is bl

  • Compare that to (Score:3, Informative)

    by rasafras ( 637995 ) <tamasNO@SPAMpha.jhu.edu> on Saturday April 10, 2004 @11:27PM (#8828518) Homepage
    in-house code, as well.
    The advantages of closed source coding seem to me to be a faster development time, stronger integration of components, and more support. The drawbacks, though, are that you are ultimately trusting somebody else.
    Open source code, I would say, is more secure overall - there are more people looking at the code, so it is less likely that bugs slip through. The drawbacks would be that open source is less custom-made and possibly less supported than the rest (also, as O'Dowd would have it, people 'contributing' backdoors).
    As for simply writing your own secure code (an agency doing this, that is), it's obviously just more expensive.
    The best solution, in my opinion, is to make your own custom flavor of Linux that is open to all, but contribution is regulated so no questionable code can be admitted - the tack taken by the NSA [nsa.gov].
  • Higher Standards (Score:5, Interesting)

    by njcoder ( 657816 ) on Saturday April 10, 2004 @11:28PM (#8828520)
    There are much higher standards for security in these situations.

    I know Sun had to have a special version of Solaris just to meet these needs and Solaris was already considered very secure to begin with. I can't remember if MS released a secure NT for this reason as well or if they tried to and failed.

    Talking about the openness of the Linux code, there's another question I always wonder why nobody asks. Sure, Linux is open source and that's what helps it get better, but I don't see the argument in terms of cost and security. Saying "you have the source, you can see how secure it is" doesn't work for me. People buy an OS because it's cheaper to spend a few hundred or a few grand per PC than it is to hire the staff to build their own OS. Having the staff that can review, maintain and patch their own Linux kernel alone isn't easy - it's something like 1.5 million lines of code right now. People want an OS that just works and is cheaper than building one themselves.

    • Re:Higher Standards (Score:3, Informative)

      by HiThere ( 15173 ) *
      They got a C2 rating (whatever that means), but only on the condition that it wasn't connected to a network.
    • Talking about the openness of the Linux code, there's another question I always wonder why nobody asks. Sure, Linux is open source and that's what helps it get better, but I don't see the argument in terms of cost and security. Saying "you have the source, you can see how secure it is" doesn't work for me. People buy an OS because it's cheaper to spend a few hundred or a few grand per PC than it is to hire the staff to build their own OS. Having the staff that can review, maintain and patch their own Linux
      • Just to clarify: I'm not saying open source is bad or not secure. I just don't understand that argument. That piece wasn't necessarily in reference to anything the article author said, but a general comment that is on topic to Linux/security/open source and gets brought up a lot.

        I'm not saying open source is bad; I'm saying that I don't see that argument as being valid in terms of security.

        but I'd like to comment on :

        "Third is that if there's a problem with a particular piece of code (say, the SSL l

  • by pridkett ( 2666 ) on Saturday April 10, 2004 @11:28PM (#8828521) Homepage Journal
    First, this isn't the first time that Green Hills has come out complaining about Linux; you may remember a previous Slashdot story [slashdot.org] where they claimed that the embedded Linux tools market was a myth. Second, this article, like their previous one, is through EETimes. If you've ever read EETimes, you'll know why that should make you question the quality/validity/truthfulness of all the statements in the article.

    Basically, Green Hills seems to be just another proprietary software vendor scratching for ways to try and derail a competitor in their market space. Nothing to see here. Move along now.

    • They're taking their example from someone we know quite well, aren't they :) F, U, and D.

      (Sorry, just finished reading Kaplan's "StartUp" again...)

      SB
    • I have to say that if a market didn't exist for embedded linux, why would they feel compelled to say anything about it?

      Microsoft didn't say much about Linux until it started becoming a threat.

      Linux certainly isn't always the best tool for the job; it is as inappropriate to say that Linux can do every job as it is to say it can't be trusted for any job.
  • Is it only me - 'cause when I read "Green Hills" I immediately thought about the Windows XP background :P
  • Open vs. closed... (Score:3, Insightful)

    by briaydemir ( 207637 ) on Saturday April 10, 2004 @11:30PM (#8828532)

    This is kind of a side remark that I haven't really thought on too much, but here goes. (I think I'm playing devil's advocate...)

    (1) Who audits the open source software that they use? I certainly don't. I rarely even bother to look at the source. So in this respect, it doesn't matter (to me) if the software is closed source or open source since the code isn't looked at even if you (I) had the chance to.

    (2) If you're not going to audit the code, will you trust the code developers to have done adequate auditing? Again, the folks who write open source software are, for the most part, as much a stranger as the folks working in some company (at least if you're me). Why should I trust "open source" strangers more than "closed source" strangers?

    These points rarely seem to get brought up here. I can certainly see the answers to (2) giving the edge to open source, but what about (1)?

    • 1) Doesn't matter. The source is open to everyone, so anyone is allowed to audit. The "more eyes looking at it" is a good thing, since it only takes one person to find a hole and report it.

      2) This one is a bit off base, since by definition the programmer who wrote the intrusive code is the one who introduced it to the system; thus, it is inherent in any system that one does not trust the programmer. That is why companies have audits. The real question is why trust open source strangers?

      Trust is often mi
    • Look at it this way. (Score:5, Informative)

      by mcc ( 14761 ) <amcclure@purdue.edu> on Sunday April 11, 2004 @12:02AM (#8828669) Homepage
      Who audits the open source software that they use? I certainly don't. I rarely even bother to look at the source. So in this respect, it doesn't matter (to me) if the software is closed source or open source since the code isn't looked at even if you (I) had the chance to.

      Let's say you're in a major city. Let's say there's a small, narrow street. And let's say you walk down this street twice; once in the middle of the day, when the street is well-lit and crowded. And the second time in the middle of the night, when it is empty, abandoned and dark.

      In which of these situations do you feel more safe?

      I would probably guess the first. Why? In both the first and the second case, you have no way of knowing if there is someone who has a weapon and wishes you harm. But in the first case, this possibility is not something that worries you.

      Open source is kind of like the well-lit, crowded street, and auditing is kind of like being able to tell if someone wishes you harm. First off, you don't *have* to be constantly watching your back; it's more than likely someone is looking at your back at any given moment, and can tell if someone is walking up behind you with a blunt instrument. Second off, the fact that *everyone knows people are watching* means it's less likely anyone will try anything, because they know they'll get caught and have messy problems with the law.

      Note that this argument does not go so well unless the open source product is relatively well used. If you're the only user, well, you're not much better off.

      ---

      Anyway, as far as (1) goes, I would imagine that while it may be a very important point insofar as you go, as far as the kinds of software discussed in this article go, it's not so useful. You probably don't audit the open source software you use. But a proprietary embedded RTOS vendor certainly would, since their demands for security and reliability are much higher.

      But wait, you say, wouldn't the need to audit the code at that level be an argument against open source, since it destroys the "free" nature? Well, here we run into the basic problem of the conflation of "free" software with "open source" software, or the conflation of "free" software with what RMS might call "software libre". These two concepts are often described by the same term. However, they are not the same thing!

      A program being open source does not mean that it can't have a trusted company of some kind behind it! A company such as IBM could be providing an open source program they internally wrote, or you could have a case such as that of MySQL (ok, maybe that's not a great example, but you know what I mean) where a community-developed program passes through a certain central trusted point (MySQL AB) which can be trusted to-- or demanded to-- perform the auditing for you. So this is not a problem with open source; this is just a problem with software you downloaded for free off the internet.
  • by Nice2Cats ( 557310 ) on Saturday April 10, 2004 @11:30PM (#8828535)
    I had submitted this two days ago and it got thrown away, probably because I had the better quote:

    "Now that foreign intelligence agencies and terrorists know that Linux is going to control our most advanced defense systems, they can use fake identities to contribute subversive software."

    The whole story is so absolutely paranoid (The Russians are coming! Beware of the Yellow Peril!) and shows such a complete lack of understanding of the Linux Open Source process that it would make me worry if I were buying Green Hills' software: Do you want to buy something from somebody who is this divorced from reality and has this little understanding of how his competitor works?

    • Considering the number of double agents we have caught in the US lately, I think our concern should be the employees of closed source companies sticking evil easter eggs into the code used in national defense. We have all these Americans selling secrets for years before being caught. OTOH, we keep arresting these residents only to release them for lack of evidence. It is not the foreign agent that is the danger, but the domestic agent doing anything to pay a mortgage, private school, and vacations.
    • by Anonymous Coward
      The whole story is so absolutely paranoid (The Russians are coming! Beware of the Yellow Peril!) and shows such a complete lack of understanding of the Linux Open Source process that it would make me worry if I were buying Green Hills' software: Do you want to buy something from somebody who is this divorced from reality and has this little understanding of how his competitor works?

      There are a number of open source projects that have had their servers 0wn3d by crackers in the last year or two. In at leas
  • by bogie ( 31020 ) on Saturday April 10, 2004 @11:31PM (#8828539) Journal
    For example you'll never see backdoors in commercial software. You can rest easy that they've done their job well and everything is nice and secure. That's why its better to stick with big commercial vendors like Cisco.

    btw, why even give a story like this press? What a joke.
  • by Perseid ( 660451 )
    I half expected to see a big "Sponsored by Microsoft" sticker on the bottom of the page.

    Basically this guy is recommending we entrust the security of our defense systems to the code review teams of the closed-source OS, rather than taking the time and money to have the DoD do it themselves. Sounds like a money saver until a missile goes blue-screen and blows up a school...

    If these people are so concerned about code review (which, admittedly, they ought to be), then perhaps they should be writing their
  • by e**(i pi)-1 ( 462311 ) on Saturday April 10, 2004 @11:36PM (#8828564) Homepage Journal
    Some good reading about this topic can be found here [dwheeler.com].
  • Pot Kettle (Score:5, Insightful)

    by DAldredge ( 2353 ) <SlashdotEmail@GMail.Com> on Saturday April 10, 2004 @11:36PM (#8828565) Journal
    "Everyday new code is added to Linux in Russia, China and elsewhere throughout the world. Everyday that code is incorporated into our command, control, communications and weapons systems. This must stop."

    Does he get this pissed about Microsoft, IBM, Sun, HP and other companies that outsource core dev to those same countries?
  • by chendo ( 678767 ) on Saturday April 10, 2004 @11:43PM (#8828595)
    In the article, he states that anyone can contribute backdoors/trojans into the code because nobody is looking at the code. This is completely and utterly wrong. I'm pretty sure that to get code into the kernel, you have to subscribe to the mailing list and send in a diff. There, other kernel hackers can easily see the code, and if Linus accepts it, it goes into the tree. Even though anyone around the world can do this, the process is fairly strict.
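    As a toy illustration of what "send in a diff" means mechanically (the file path and contents below are invented stand-ins, not a real driver), a one-line change travels as a unified diff that reviewers on the list can read hunk by hunk:

```python
# Generate a unified diff for a hypothetical one-line kernel change.
import difflib

original = ["old line\n"]          # invented file contents
modified = ["new line\n"]

patch = "".join(difflib.unified_diff(
    original, modified,
    fromfile="a/drivers/foo.c",    # hypothetical path, not a real driver
    tofile="b/drivers/foo.c",
))
# The headers and hunk markers name exactly which file and lines changed,
# which is what makes review on a public mailing list practical.
print(patch)
```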

    Anyone want to place bets that Microsoft paid him to say that?
    • by Tony ( 765 )
      Anyone want to place bets that Microsoft paid him to say that?

      Nah, MS isn't the only one with a livelihood at stake. Linux is going to change the way a lot of people do business; some companies will not be able to adapt, and will die.

      This is the sound of someone running scared. Plus, he probably believes what he says. Think about it from his perspective: his company is in the business of supplying good software, and he *knows* it's good software. Linux is developed in a strange way, one that is counte
  • by Animats ( 122034 ) on Saturday April 10, 2004 @11:50PM (#8828622) Homepage
    Green Hills makes a big deal about being "#1 in market share growth". [ghs.com] But in the RTOS market, independent analysts list them in seventh place [embedded.com].

    Actually, this is somewhat misleading. The top players listed are Microsoft (Windows CE), Wind River, Symbian, Palm, QNX, OSE, and Green Hills. But Microsoft, Symbian, and Palm are really selling into handheld devices, not hard real-time control. (The phone and PDA markets are much bigger than real-time control, though.) Wind River's VxWorks is the dominant player by a big margin, especially in low-end embedded control. QNX is next, and is usable on a broad range of platforms. Green Hills is more of a specialist maker catering to the military Ada market.

    Following these seven come LynuxWorks and MontaVista, who are moving up. These are the main Linux-based offerings.

    Also confusing the issue is Windows XP Embedded, which is basically a Windows XP from which you can delete stuff you don't need. This sells more into point-of-sale applications than hard real time control.

  • SAYODF (Score:3, Interesting)

    by 10101001 10101001 ( 732688 ) on Saturday April 10, 2004 @11:52PM (#8828630) Journal
    Green Hills seems to be making some unsubstantiated claim that open source isn't held to the same standards as closed source, and I find that rather funny. I think the real issue is, when Green Hills approaches the FAA or whatever, the FAA will do its own testing of the source. If Green Hills' code is breakable, Green Hills is the one responsible for fixing it. But, if what the FAA is reviewing is open source, it's possible the FAA can just fix the source themselves (and avoid having to pay an outside contractor). So, Green Hills, to avoid the scenario where the FAA might be displeased with Green Hills' RTOS and switch to open source, decides on its *own* to spend $500/$1,000 per line to audit their OS.

    In the end, this means to me that Green Hills believes OSS has an unfair advantage. Personally, I think it's perfectly fair for people to offer free software. If Green Hills doesn't like it, tough. Or, they can just make their RTOS so good that the FAA or some other organization will be so impressed they won't bother going over some OSS and possibly having to fix bugs or write documentation. Looks like the free market to me.

    PS: SAYODF == Self-Analyzing Your Own Dog Food; it's like a water bottling plant bitching about there being freshwater lakes because lakes don't have to do their own quality control
  • In other news, Budweiser doesn't taste good according to Pete Coors, and a new study commissioned by Mitsubishi found that Sony equipment causes cancer in laboratory rats.

    I wish suits would stop blathering about each other's products because really it's just a waste of time. The source is so obviously biased that even reading is pointless.

  • by ShatteredDream ( 636520 ) on Sunday April 11, 2004 @12:08AM (#8828689) Homepage

    I caught this story on OSNews yesterday and posted a rebuttal on my blog [blindmindseye.com]. This sort of thing probably doesn't carry a lot of weight with most of the defense types because the military is the very definition of mission critical, no pun intended. People's lives are at risk on a daily basis in most jobs in the military these days. There is almost no price too high to pay for the freedom to design to specification that Linux provides.

    Linux is certainly not ready to take over a lot of things yet, but it is good enough for many things that traditional defense contractors are involved with. I wouldn't trust it yet as an OS for our warships or other vehicles, but I would trust it for communication systems and things like that. For situations like that, an RTOS from a company like Green Hills may not provide enough benefit to justify the cost. Linux is free; their product isn't. They can try to get the military hooked for a while, but Linux will always be free and there are plenty of IT workers in the military who could work on existing RTOS Linux forks for military use.

    Another thing that has to be kept in mind is that with the push for homeland security, the laissez faire attitude that has been prevalent toward security has to go. The military wants transparency so it knows it's not getting something bugged all to hell by some Jihadi who wormed his way into Microsoft or Sun via the H1-B visa program. The Debian and Fedora teams are great for that very reason. Everything is open to public scrutiny, from the installer to every package, so the military gets a chance to audit everything.

    Free markets are great, but in this case the military has to perform a more core mission: defend the US from attack. If that means violating free market principles by pouring taxpayer dollars into a free OS for public use, then they should and most likely will do it eventually.

    • I wouldn't trust it yet as an OS for our warships or other vehicles,

      I would trust it more in that application than Windows or even Green Hills.

      Free markets are great, but in this case the military has to perform a more core mission: defend the US from attack. If that means violating free market principles by pouring taxpayer dollars into a free OS for public use, then they should and most likely will do it eventually.

      What makes you think that is not part of the free market? When the military invests
  • by thewiz ( 24994 ) * on Sunday April 11, 2004 @12:11AM (#8828700)
    Having worked as a systems administrator on DoD programs, I can tell you for a FACT that ANY software that goes on mission-critical systems is either developed in-house or very thoroughly scrutinized. They do code review, bug fixes and testing in a continuous cycle to get all the software bugs out. (This is one of the big reasons you hear about DoD projects going over time and/or over budget.)

    If COTS products are used, the DoD programmers will test the software for defects and ask the vendor to correct the defects they find. There have been cases where the DoD has signed NDAs to gain access to source code for COTS software to fix bugs that caused problems with the DoD software that the software company WOULDN'T fix. This has even been done to find backdoors, trojans, and other bad things that disgruntled employees of proprietary software vendors have put into that company's products.

    OSS gives the DoD the power to make the changes they want to secure their systems the way they want. They WILL go through the code and look for backdoors, trojans, viri, etc. They may even set up their own repository and fork the kernel. Once the DoD has a trusted version of Linux, they'll use it in-house. I suspect that most DoD programs looking at Linux are probably testing NSA's version.

    The DoD should be able to release some of the improvements they make back to the community, but don't expect them to release everything. The military still has its secrets.
    • The DoD should be able to release some of the improvements they make back to the community, but don't expect them to release everything. The military still has its secrets.

      I, for one, am still waiting for the DoD to release the research of one Colonel Sanders, who apparently came up with 11 Secret Ingredients for a secret project with the code name of "Fried Chicken".

  • I used to work as a Defense contractor and I spent quite a lot of time going through the various processes. As a Linux developer, I can certainly say that Linux has not been developed to the same standards that the projects I've been involved on have.

    For starters, in the DoD, every line of code is reviewed by hand by a team of reviewers (usually 4-5). There are records of all the defects found and verification that fixes were made. After the initial development cycle, there's a rigorous testing phase where all specifications are tested, scenarios are run, and stress tests are performed. Any defect from this testing is recorded, and the software doesn't ship until it's fixed. This usually ends up being a 2-4 year process of just doing bug fixes.

    And for those that don't know the difference, Windows is *not* certified for tactical use. Having EAL4 is not the same as being certified for tactical use.

    It's really a different type of software. It's not that Linux isn't a good piece of software; it's just that it wasn't developed for this type of work. There's nothing wrong with that.


    • Just curious - what about the NSA version of linux? Isn't that exactly what you are talking about?

      (Yeah, I'm aware that what the NSA makes public and their in-house versions are probably very different. Still curious - and no, I'm not a terrorist :)

      SB
      • NSA Linux is different. NSA Linux isn't actually used on tactical systems. The NSA is not part of the military.

        NSA Linux is not a reviewed version of Linux; it's a version of Linux with enhanced security services (Linux Security Modules grew out of it, for example).

        • Info from the NSA certainly filters down to the military, tho, so one could argue that they are part of the chain. But I may just be blowing smoke there :)

          I take it that tactical (?battlefield CCC?) is essentially written from scratch "inhouse" then? If so, that's a very good thing IMO. Gotta be damned difficult tho.

          Hats off to those devs.

          SB
  • by LostCluster ( 625375 ) * on Sunday April 11, 2004 @12:44AM (#8828797)
    Linux, in a proper definition, isn't very functional. It's the OS kernel... you're gonna need some software to go with that. So, which distro should be the "standard issue" for a military use?

    Drawing a line between what's secure enough to make the grade and what out there might not be trustworthy enough for "secure" use is quite a tough thing. Sure, Open Source allows the code to be reviewed... but the government doesn't have the time to do that, so that's no good for them.

    Microsoft can at least come forward and show a big company standing behind their product... how can Linux match that?

  • by technoCon ( 18339 ) on Sunday April 11, 2004 @12:48AM (#8828811) Homepage Journal
    Note to self: If ever pirate Russian gas pipeline control software, look for the "paybacksAreHell" subroutine.
  • Wow, that's pretty thin. But let's assume it is a real possibility. What are the employee vetting procedures of closed-source companies? How do we know terrorists aren't working for Microsoft? If I were a terrorist, I think I would rather go the route of working for closed source in order to insert my devious code. There isn't a public review of my code, and apparently, important decision-makers seem to want to blindly trust closed-source companies while being hyper-suspicious of publicly-available ope
  • I'm dealing with an issue like this at work. We have to qualify all software we use. Since we run mostly DOS and Windows on some stuff, we don't have to qualify that because its an off the shelf (OTS) product. However, when the subject of Linux comes up, they don't want to touch it because 1. How do you qualify it and 2. How are you sure it doesn't change. Since the source is open, technically I, having the root password, can compile a new kernel or something else and skew results (forgetting for a moment t
  • by stox ( 131684 ) on Sunday April 11, 2004 @01:32AM (#8828955) Homepage
    IMHO, closed source solutions must be held to a higher standard than open source solutions. Open source solutions are proven in the wild, while closed source solutions are much less so. With the availability of the source code, far more permutations of attack have probably been attempted against open source than closed source. In bottom line testing terms, chances are that open source has had far better code coverage tested than the closed source competitor. Closed source solutions must be held to a higher standard to compensate for this difference.
  • One of the major advantages to open source is that it keeps everyone honest; no funny schtuff.

    With closed source, it might work fine in test cases but there might be code hidden/lurking in the shadows that does something other than what's intended. If a cyberterrorist (or some a-hole) adds code that can be remotely activated and wreak havoc, it's harder to detect it if the code is like a black box (a mystery), and more likely to be detected in a clear box.

    Just like ppl aren't likely to buy a car if you c
  • by hak1du ( 761835 ) on Sunday April 11, 2004 @01:41AM (#8828987) Journal
    I doubt Trojan horses are much of an issue in embedded systems: since embedded systems don't generally have external Internet access, it would be hard to trigger a Trojan horse in an embedded system, so any failures it would introduce would have to show up randomly and not just in response to a trigger.

    Furthermore, for embedded code to try and infer what kind of system it's running on (military/non-military, essential/non-essential, deployed/non-deployed) and only fail in the essential, deployed, military systems is essentially impossible with the kind of minimalist code that could be hidden in an open source project and not noticed.

    That means that if anybody planted a Trojan horse in OSS that was of any military significance, it would show up during testing as random failures, and that is just taken care of by normal testing procedures.

    Note that the same argument doesn't work for closed source: something like a Green Hills embedded kernel could easily ship with a huge Trojan horse that looks for specific strings in system output/logs ("military", "target", "live munitions", "vehicle speed", whatever the military lingo is) and/or looks for specific sensor types, output devices, and/or communications channels and only triggers under specific circumstances likely to represent actual combat situations. While such attempts to identify combat situations would be blatantly obvious in 100% open source software and be noticed right away, they could easily be hidden in any big binary component of any closed source system.
  • DO-178B and Linux. (Score:5, Interesting)

    by BStorm ( 107974 ) <bill AT mcleansoftware DOT com> on Sunday April 11, 2004 @02:02AM (#8829052)
    The FAA approves software when it is written according to the DO-178B specification. This specification states that software, when developed, must adhere to a development process.

    This is defined within DO-178B as software requirements, software specification, software design, source code configuration, and software test suites. If one changes one part, then all levels affected must change as well.

    Simply put, a paper trail must exist for every change made in a system. It is a stringent, anal-retentive form of development. It is costly because of the amount of bookkeeping that must be done to incorporate changes.

    This is the 'cost' that O'Dowd is referring to. In order to make a 'DO-178B' compliant version of Linux, a group of developers/software house would have to:

    1) Ensure that a comprehensive set of functional requirements is generated to match the desired platform.

    2) Define a kernel that matches desired functional requirement. Any kernel portion that is not needed is defined out.

    3) Specify the behaviour for each driver. Ensure the driver is fully specified. Work from the source and ensure that the behaviour of each execution path is documented.

    4) Ensure that all changes to this build are reviewed and a paper-trail exists for all changes and changes are made for solid well documented reasons.

    5) Use the documented behaviours to generate test cases that validate the documented behaviour.

    It goes on and on...
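    To make the flavor of that bookkeeping concrete, here is a toy sketch (the requirement and test IDs are invented for illustration, not from any real DO-178B artifact) of the basic traceability invariant behind steps 1-5: every requirement must trace to at least one test, and no test may cite a requirement that doesn't exist.

```python
# Toy traceability audit; IDs and descriptions are invented placeholders.
requirements = {
    "REQ-001": "Scheduler always runs the highest-priority ready task",
    "REQ-002": "Timer tick period is 1 ms",
}
tests = {
    "TC-01": ["REQ-001"],
    "TC-02": ["REQ-002"],
    "TC-03": ["REQ-001"],
}

def audit(requirements, tests):
    covered = {req for refs in tests.values() for req in refs}
    untested = set(requirements) - covered   # requirements no test exercises
    dangling = covered - set(requirements)   # tests citing unknown requirements
    return untested, dangling

untested, dangling = audit(requirements, tests)
print(sorted(untested), sorted(dangling))   # → [] []
```

    Real DO-178B bookkeeping covers far more (structural coverage, change records, review sign-offs), but every added artifact hangs off a mapping like this one.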

    There is nothing inherent within Linux that would prevent a DO-178B build to be created.

    Only in the last 3 years has Green Hills marketed a DO-178B compliant system. DO-178B as a standard has been around for, I believe, the last 10 years. Hmmm...

  • by Eric Smith ( 4379 ) * on Sunday April 11, 2004 @03:34AM (#8829298) Homepage Journal
    Mr. O'Dowd of Green Hills Software obviously didn't really learn anything when reading Ken Thompson's paper, or he would realize that the trust problem Thompson described is just as severe with commercial closed-source software. Actually, the compiler trojan Thompson described was for commercial, closed source software.

    In fact, open-source software may have a slight advantage here, because it's less of a monoculture. Presumably Microsoft always uses their own Visual C++ compiler to build Windows, so if there were a trojan in the compiler that compromised the resulting Windows executables, it would be present in all copies of Windows that Microsoft distributed. But open source software is by its nature built on many different platforms using different compilers, so a compiler trojan would only affect a portion of the deployed copies of the open source software. And it is possible that a trojan introduced by one particular compiler would be found due to the executable it produces being different in some noticeable way from the executable produced by a different compiler. For instance, strace might show the trojaned executable making extra system calls.
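    A rough sketch of that cross-check, with CPython's own compiler standing in for two independent toolchains (the source snippet is invented for illustration): build the same source through two different paths and confirm the observable behavior agrees, even though the compiled artifacts need not be identical.

```python
# Sketch: "build" one source through two independent paths and compare the
# observable behavior. In the real case the paths would be two different
# C compilers and the comparison would be over program output or syscall
# traces; here compile() at two optimization levels is a stand-in.
src = "result = sum(i * i for i in range(10))"

code_a = compile(src, "<build-path-a>", "exec", optimize=0)
code_b = compile(src, "<build-path-b>", "exec", optimize=2)

env_a, env_b = {}, {}
exec(code_a, env_a)
exec(code_b, env_b)

# A trojaned toolchain would have to subvert *both* independent paths
# consistently to survive this comparison.
assert env_a["result"] == env_b["result"]
print(env_a["result"])  # → 285
```

    This is the intuition behind diverse builds: the more independent the toolchains, the harder it is for a single compromised compiler to hide.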

    How does Mr. O'Dowd propose to assure us that his company's operating systems and compilers are more secure than Linux, xBSD, GCC, etc? Is he certain that none of his employees who have written code incorporated into his products have ever installed trojans? If so, how has he gained this certainty? Has he scrutinized every line of source code himself? Including those of the compilers that compiled the compilers, back all the way to the machine-code only origin of the system? Somehow I doubt it.

    It is a matter of historical fact that far more trojan and back door exploits have been present in commercial, closed source software than in open source software. Just two days ago Cisco had to issue a security advisory regarding a back door found in their WLSE and HSE products. Would Mr. O'Dowd conclude that foreign agents and terrorists are responsible for that? Would he really have us believe that these shadowy figures can compromise open source software developed in the public eye more easily than they could subvert a commercial closed-source software package for which the source code and development process get no public scrutiny?

    One is forced to conclude that Mr. O'Dowd feels his company's business model is threatened, and rather than change that model to reflect changes in the marketplace, he prefers to use "the sky is falling" proclamations in an attempt to scare customers into sticking with his products.

  • by oldCoder ( 172195 ) on Sunday April 11, 2004 @06:34AM (#8829582)
    The costs and benefits of reliability are different
    If you've got your real-time system in ROM inside a piece of equipment, or in thousands of pieces of equipment, you've got to be very careful with it.

    Desktop systems can be patched and upgraded, but ROMs have to be replaced or flashed. For example, you've got to bring the missile into the hangar/lab and hook up the reflashing unit or swap out the ROM chip. You've got to test the missile with the new chip. Out in the field, the soldiers have newly upgraded missiles (or tanks...) and would really like to know that they will work when they need them. You can field test a tank, but some missiles are expensive, especially when all you want to do is prove you installed the right chip in the right way.

    When a desktop or server software hiccups, the human user can work around it. Often this is not the case in communications and avionics.

    Linux Advantages Don't Translate to Military Embedded Systems
    Embedded systems are almost always memory-resident and have no disk for software storage. There are usually no user identities to manage, and the user interface is quite often absent or primitive.

    Most of the advantages of Linux do not apply to an embedded, military situation: licensing fees for software are usually a negligible part of a tank, missile, or radar. Embedded RTOS systems are already quite reliable, do not suffer from nearly as many buffer overruns, and are not as susceptible to hackers. Embedded military systems are almost never connected to the internet.

    You could build a reliable, compact embedded software system from embedded Linux, but you'd want to write all your own drivers and you would have to port it to special hardware. This is approximately the same amount of work that you would have to do if you were to use a proprietary RTOS.

    The vast bulk of the problems users experience with proprietary OS's are 1) expensive to license, more expensive to license across many machines. 2) Security vulnerabilities resulting from using a system designed and built for stand-alone personal use in an internetworked environment. Neither of these problems matters much to embedded, military systems.

  • He is overreacting. (Score:3, Interesting)

    by master_p ( 608214 ) on Sunday April 11, 2004 @10:34AM (#8830168)
    Defense applications are usually running in an isolated environment, not connected to the internet or any WAN. So I can't see how there is a security problem. Furthermore, most real-time weapon and radar systems use operating systems like Lynx, not Linux or Windows.

    Security issues may exist in development environments, which are usually LANs connected to WANs. In that case, Linux is preferable, due to better security.

    As for open source being better when it comes to security, it is irrelevant to defense applications subcontracting. As long as the subcontractor is audited and found to have satisfactory methodologies and coding procedures, the contractor is ok. The focus in these cases is on qualification and testing, and they usually do exhaustive testing (i.e. testing every possible case) to make sure the application works as intended.

A modem is a baudy house.
