The Cost of Crappy Security In Software Infrastructure

blackbearnh writes "Everyone these days knows that you have to double- and triple-check your code for security vulnerabilities, and make sure your servers are locked down as tight as you can. But why? Because our underlying operating systems, languages, and platforms do such a crappy job of protecting us from ourselves. The inevitable result of clamoring for new features, rather than demanding rock-solid infrastructure, is that the developer community wastes huge amounts of time protecting their applications from exploits that should never be possible in the first place. The next time you hear about a site that gets pwned by a buffer overrun exploit, don't think 'stupid developers!', think 'stupid industry!'"
  • Ugh (Score:5, Insightful)

    by Anonymous Coward on Friday June 01, 2012 @02:50PM (#40184415)

    Tools are dangerous. If I want to cut my hand off with a chainsaw, I can. If I want to leave my PHP script open to XSS, I can.

    • Re:Ugh (Score:5, Insightful)

      by h4rr4r ( 612664 ) on Friday June 01, 2012 @03:01PM (#40184639)

      This 1 million times THIS!

      Any tool that is useful will be dangerous.

      • Agree.

        As you increase the general-purpose utility of any piece of technology, you open corresponding opportunity for abuse or exploitation.

        Security comes through ongoing practice. This includes implementation specifics, ongoing management operations and individual initiative/decision capacity of users.

        To believe there is a technology solution that - correctly implemented at the correct point of design and lifecycle - would automatically solve the security "problem"? This is a naive point of view, which ign

    • Re:Ugh (Score:4, Insightful)

      by I_am_Jack ( 1116205 ) on Friday June 01, 2012 @03:03PM (#40184687)

      Tools are dangerous. If I want to cut my hand off with a chainsaw, I can. If I want to leave my PHP script open to XSS, I can.

      True. But I think the biggest impediment to secure systems and code is what people like my 82-year-old dad are going to do if you ask them to start making selections or decisions regarding how tight or loose they want access to the internet. He's going to get angry and tell me, like he always does when I have to clean viruses off his computer, "I just want to read my email!" And there are plenty of people a lot younger than he is who will respond the same way, only it'll be over free smilies, fonts or porn.

      • by Anonymous Coward

        Have you considered buying him a Mac? Best investment I ever made when it came to my parents' computing, and my dad is even an electrical engineer. Some people will complain about the cost, but unless your time is completely free, it is easily worth it.

        • Have you considered buying him a Mac?

          ...or installing Linux Mint for him? (a decent amount cheaper, and less confusing for someone moving from Windows...)

          • The problem is, my Grandmothers (both of them) are partial to being able to pick up a bargain-bin tile/card/casino/casual games cd at walmart or best-buy... and it's not the easiest thing to walk them through fixing their video drivers when the system update mangled their settings.

            I'm running Linux, Windows, OSX and FreeBSD at home on various computers, let alone vmware, and mobile (android, webos and ios) devices. What's easiest for me, isn't what's most compatible for my parents/grandparents. I will
        • Re: (Score:1, Funny)

          by Anonymous Coward

          I considered giving up my girlfriend and peddling my ass down the gay part of town, but no, I really didn't. Just like I would never buy a Mac for myself or my family when I could get the same performance and security out of a Linux box for 1/7 the price.

          Here's a Mac-related joke, though - why did they bury Steve Jobs face-down? So people like you could stop by for a cold one! Hah heh, silly Macfags.

          -- Ethanol-fueled

        • Have you considered buying him a Mac? Best investment I ever made when it came to my parents' computing, and my dad is even an electrical engineer.

          Funny. My dad was a mechanical engineer and you'd think he'd know. I've told him under pain of death that he is never to buy another computer without my input, and yes, it'll be a Mac. Every computer in my house is a Mac, except for the file server, and it's running Mint.

        • Re:Ugh (Score:5, Insightful)

          by mlts ( 1038732 ) on Friday June 01, 2012 @03:41PM (#40185513)

          I personally am from the IT school of "all operating systems suck, so pick what sucks less", and in some cases, the Mac recommendation may be the best way to go.

          First, Apple has actual customer service compared to the PC companies (well, unless you buy from the "business" tier and get the better support plan.) So, they will have someone to call to get problems fixed and questions answered that isn't you.

          Second, building them a desktop is in some ways the best solution, but it means you are on 24/7/365 call if anything breaks.

          Third, Macs are not unhackable, but as of now, the biggest attack is through Trojan horses, while Windows is easily compromised through browser and browser add-on holes. So, for now, Macs have less of a chance of being compromised by browser exploits.

          Fourth, Time Machine isn't perfect, but compared to other consumer level backup programs, it is good enough. Especially if paired up with Mozy or Carbonite for documents. That way, the parent's documents are stashed safely even if the computer and its backup drive are destroyed or stolen.

          Fifth, the App Store and a stern instruction to not run anything unless it came from there will help mitigate the possibility of Trojans. It isn't perfect, but it is a good method.

          Of course, Linux is a workable solution as well, but a Mac's advantage is that it still has a mainstream software selection available for it, so Aunt Tillie can get a copy of Photoshop if she so chooses.

        • Experts say Macs will be targeted more and more in upcoming years. Low market share is no longer protecting the platform, it seems.
      • Re:Ugh (Score:4, Insightful)

        by mlts ( 1038732 ) * on Friday June 01, 2012 @03:47PM (#40185659)

        I find the biggest impediment to secure systems is cost. In previous companies I have worked for, there was a mantra by the management, "security has no ROI."

        The fact that proper security practice doesn't add black numbers to the accounting ledger, but keeps red numbers from being added, escaped them. The typical response when I asked what the contingency plan for a break-in was: "We call Geek Squad. They are open 24/7."

        Yes, good security costs. Good routers, firewalling, hiring clued network guys, and running penetration scenarios are not cheap. However compared to other business operating costs, it isn't expensive on the relative scale.

        Because there is little to no penalty if a business does get compromised, there is not much interest in locking things down. Until this is addressed, crappy security policies will be the norm.

        • by Bengie ( 1121981 )
          "security has no ROI."

          Security has a best case of no return and a worst case of "you lose everything".

          Preventative doctor visits also have no ROI, yet they keep saving money by saving lives.

          Ask them why they think banks "waste" money on security.
    • I think it's harder to cut your hand off with a chainsaw than you realize, since they really require two hands to operate. A circular saw, on the other hand. . .

      • by h4rr4r ( 612664 )

        Hit a nail in a tree one time and see where it wants to go. It won't get your arm, but a chainsaw to the ribcage is not much better.

        • NOODLE ARMS! Really. You should have control over that bastard.

          • by h4rr4r ( 612664 )

            I grew up in a rural area where logging was and still is a common occupation. Noodle arms has nothing to do with it; if you are not paying 100% attention at all times when it kicks back, bad things will happen.

            • But don't you think it's advisable to pay 100% attention at all times when using a chainsaw? I mean, it is a chainsaw after all. . .

        • but a chainsaw to the ribcage is not much better.

          Or to the neck. A guy I knew lost his brother due to a chainsaw accident.

      • by Surt ( 22457 )

        My chainsaw is a couple of years old, but it just has a startup lock that requires two hands. Once it is in operation, it requires only one hand to operate.

      • This. Now, if you were talking about *feet*, the OP would have a point...
      • by eimsand ( 903055 )
        I wish I had mod points for this.
      • by AK Marc ( 707885 )
        I had an electric one, and it was surprisingly light. Sure, you couldn't chase the coeds down the street with it, or cut down a Real Tree, but for trimming, it was about as light as a hedge trimmer, and as powerful as the smallest common gas-powered ones.
    • Re:Ugh (Score:4, Insightful)

      by neonKow ( 1239288 ) on Friday June 01, 2012 @03:17PM (#40184973) Journal

      Yeah, and tools have safety standards too. Just because you accept the risk of a car crash when you buy a car doesn't mean you have to accept the risk of your car spontaneously exploding.

      More importantly, if you're writing PHP code that costs money when you have an XSS vulnerability, that means you're responsible for your users' information. So, no, if you want to leave your PHP open to XSS, do it where it doesn't add to the cost of crappy security. And do it in a way that doesn't result in your site being hijacked to serve malware and spam for months on end before you notice.

      You're not an island. Personal responsibility means you don't blame other people for stuff that's your own responsibility (like getting hacked); it doesn't mean you can just neglect the responsibility of protecting your customers' or boss's data, or the network that you share.
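The XSS hole being argued about has a boring, mechanical fix: escape untrusted text before it lands in HTML. A minimal sketch using Python's stdlib (PHP's htmlspecialchars plays the same role; the function name render_comment is made up for illustration):

```python
import html

def render_comment(user_text: str) -> str:
    # html.escape rewrites the HTML metacharacters (& < > " ') so the
    # user's input is rendered as inert text, never parsed as markup
    # or executed as script.
    return "<p>" + html.escape(user_text, quote=True) + "</p>"

payload = "<script>alert('pwned')</script>"
safe = render_comment(payload)
```

The point of the thread stands either way: the escaping call exists in every platform, but nothing in the infrastructure forces the developer to use it on every output path.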

      • by Deorus ( 811828 )

        Yeah, and tools have safety standards too. Just because you accept the risk of a car crash when you buy a car doesn't mean you have to accept the risk of your car spontaneously exploding.

        That kind of safety is already available in modern implementations in the form of buffer canaries, randomized memory maps, read/write/execute memory protections, and managed memory allocations that are nearly transparent to the users. This is analogous to your car example, in which safety features exist in order to mitigat

    • by AK Marc ( 707885 )
      XSS is still a systemic error, not strictly a coding one. Why? Because it's code injection. If the browser were sandboxed, the injected code couldn't do anything. Now, if your bank was hit and your browser is sandboxed per instance rather than per tab, you could still lose your bank info to an attack; again, a high-level design issue, not a coding issue.

      But designing things well is hard. For one, most designs like that focus on wants and nice to haves, and not strictly on needs, and the other, the designs are discarded when
  • Yeah, yeah, yeah. (Score:5, Insightful)

    by localman57 ( 1340533 ) on Friday June 01, 2012 @02:52PM (#40184467)

    The next time you hear about a site that gets pwned by a buffer overrun exploit, don't think 'stupid developers!', think 'stupid industry!'"

    Yeah, yeah. Hate the game, not the player, and all that. If you code a buffer overrun and you get pwned, it may mean the industry is stupid. But that doesn't mean that you're not stupid too.

    • Re: (Score:2, Insightful)

      by i kan reed ( 749298 )

      Except the industry has painfully simple solutions to buffer overruns; almost any programming language developed after 1990 has no risk of them.

      • by h4rr4r ( 612664 )

        No risk of maturity either.

        Truly powerful tools are always dangerous.

        • Oh yeah, I've heard that java is such an immature platform that no one ever uses it, and it can't do ANYTHING.

          Get over yourself.

          • by h4rr4r ( 612664 )

            The JDK has never had a buffer overflow?
            Secunia disagrees.

            • I don't feel up to dealing with dissembling of this sort. You said "code a buffer overrun". You can't do that. The end. The JDK ISN'T programmed in Java. The applicability of the overruns it gets (which are basically impossible to cause without a significantly more complicated exploit to run arbitrary code first) is irrelevant.

              • by h4rr4r ( 612664 )

                Fine, then I will just admit this one flaw is something java does a good job at preventing. It still will not magically check your inputs for you.

                • No, it doesn't, but the original point is that the industry does nothing systematic for security. It's untrue. We actually work pretty hard as a whole on considering at least nominal security. Perfect security requires a non-existent level of perfection, so we address the problems with our software as it stands, and plan for the underlying basic security concepts that are well known and we can afford to.

                  I honestly believe that software engineering is natively one of the most security-conscious professions

            • The designers of Java tried to do two things regarding security:
              1. allow running untrusted code (applets) without letting it break out of its sandbox
              2. prevent unsafe memory access by bounds checking, type checking on casts, no explicit deallocation

              #2 is a prerequisite for #1, since if code can write to arbitrary memory locations then it can take over the Java runtime process. However, #1 is not a prerequisite for #2. Java has in practice done poorly at meeting goal #1 but has been quite solid at #2.
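Goal #2, bounds checking on every access, is easy to see from any managed runtime. A toy sketch in Python (the JVM behaves analogously, throwing ArrayIndexOutOfBoundsException instead of IndexError):

```python
buf = bytearray(8)          # fixed-size buffer: 8 bytes, all zeros

def write_at(index: int, value: int) -> None:
    # The runtime range-checks every index before the write lands,
    # so the failure mode is an exception, not silent corruption of
    # adjacent memory -- which is exactly what makes goal #2 "solid".
    buf[index] = value

overflow_blocked = False
write_at(7, 0xFF)           # last valid slot: fine
try:
    write_at(8, 0xFF)       # one past the end: rejected by the runtime
except IndexError:
    overflow_blocked = True
```

This is why the classic stack-smashing exploit simply has no foothold in such languages: there is no write primitive that can reach past the buffer.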

          • by 0123456 ( 636235 )

            Note that while Java may prevent common bugs like buffer overflows, they may simply cause it to throw an unexpected exception which is caught by random code which then causes the software to behave in an unexpected way. So it's an improvement, but not a magic solution to all your security issues.

            And you can probably do all kinds of exciting stuff with random Java programs by throwing so much data at them that they run out of memory and explode in a hail of cascading exceptions.

            • I wasn't saying it was a magic bullet, I was saying it addressed a fairly basic common security problem by being an advancement in technology. But it's 20 years old now, better things have come along too.

        • What are you saying? That modern languages aren't powerful because you can't perform buffer overruns? Even if your claim that more power means more exploits were true, there are a million other vulnerabilities out there besides buffer overruns, a "feature" completely useless for the vast majority of programming.

          • by dgatwood ( 11270 )

            Yes. If you truly cannot have buffer overflows, then there are many things the language cannot do. You will never, for example, be able to write a device driver for any existing hardware architecture in any language that does not allow you to construct fixed-length buffers and fixed data structures. By definition, any language that supports those things can experience a buffer overflow.

            This is not to say that languages should not have string handling capabilities that are immune to buffer overflows, mind

            • by blueg3 ( 192743 )

              Buffer overflows are independent of whether you have fixed-length buffers and fixed data structures. You can have them with variable-length buffers as well.

              The essential problem that causes a buffer overflow is that your language supports a data-copying (or data-writing) operation that either does not care about or must be explicitly told the amount of space available in the destination. This essentially means that you must have range-checking for all pointers.

              Last I knew, Ada is both immune to buffer overf
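The parent's diagnosis, that the dangerous primitive is a copy operation that never asks (or must be separately told) how big the destination is, can be sketched directly. A hypothetical checked_copy in Python, standing in for what a memcpy with mandatory range checking would look like; the name and the 16-byte buffer are made up for illustration:

```python
def checked_copy(dest: bytearray, src: bytes) -> int:
    """Copy src into dest, refusing to write past the end of dest.

    This is the range check the parent describes: the copy primitive
    itself knows the destination's capacity and enforces it, instead
    of trusting the caller the way C's memcpy/strcpy do.
    """
    if len(src) > len(dest):
        raise ValueError("source would overflow destination buffer")
    dest[: len(src)] = src
    return len(src)

stack_buf = bytearray(16)   # stands in for a fixed-length C stack buffer
copied = checked_copy(stack_buf, b"hello")
```

Whether the check lives in the language, the library, or the hardware is the whole argument of the thread; the invariant itself is the same in all three places.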

  • by BagOBones ( 574735 ) on Friday June 01, 2012 @02:54PM (#40184497)

    Most web app exploits ARE the developers fault!
    - They don't check input lengths (buffer overflow)
    - They concatenate user data into database commands (SQL injection)
    - They don't limit abuse (brute-force retry attacks)

    Yes, some of these can be mitigated at other levels, but ALL are common APPLICATION DEVELOPER ISSUES! By the measure of deployments to number of exploits, I would say the programming languages and OS already do a MUCH better job than the application developers...

    • by Korin43 ( 881732 ) on Friday June 01, 2012 @03:08PM (#40184777) Homepage

      - They parse or merge database commands (SQL injection)

      I would argue that this one is sometimes the fault of the tool. In most database APIs, there's a function like:


      run_sql(String command, Object[] data)

      But the language that most amateur programmers use only has:


      mysql_query(String command);

      Looking at that function signature, who's to know that you're supposed to also use mysql_real_escape_string? Even if you know what you're doing, you might accidentally use addslashes or mysql_escape_string. Or forget it for one parameter.

      Interestingly, the language that does this best is also the second worst language ever invented (after PHP). In ColdFusion, if you do this:

      select * from cats where cat_name = '#catname#'

      It's perfectly safe, since ColdFusion detects that catname is inside of quotes, so it automatically escapes it. You can still use variables inside of SQL, since it only escapes them when they're inside quotes.
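The run_sql(command, data) shape described above maps onto placeholder parameters in most real database APIs. A minimal sketch with Python's stdlib sqlite3 (the cats table and column names are borrowed from the ColdFusion example, not from any real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cats (cat_name TEXT)")
conn.execute("INSERT INTO cats VALUES ('Whiskers')")

def find_cat(name: str):
    # The ? placeholder hands quoting and escaping to the driver,
    # so a hostile value stays data and never becomes SQL.
    return conn.execute(
        "SELECT cat_name FROM cats WHERE cat_name = ?", (name,)
    ).fetchall()

# Naive string interpolation would turn this into "... OR '1'='1'"
# and match every row; the parameterized query matches none.
hostile = "x' OR '1'='1"
```

The design point stands: when the only function in easy reach takes a single string, developers will build that string by concatenation, and the escaping step becomes something you can forget.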

      • Your example is still a failure of the developer to understand the tool, which is what caused the problem, not the tool missing an alternate secure way to do it.

        • by Korin43 ( 881732 )

          Say someone sells a car with three pedals: a gas pedal in the normal place, a brake pedal in an unusual place, and a pedal where the brake normally goes that works like a brake, but swerves into oncoming traffic if you're going over 50 mph. Is it the users' fault that suddenly they're all crashing? The manual clearly states, "don't ever use the pedal where the brake pedal used to be, because you might swerve into oncoming traffic". But still, isn't it mainly the manufacturer's fault for including that pedal at all?

          • Developers are not end users... they are some level of engineer, as they are BUILDING things for end users to use... They should be reading some kind of docs before choosing tool / function they use for the job... the more powerful the language the more you need to know.

            In your example, the developers would be the ones who built the BAD CAR with the exploit in it; they are not the poor end users who purchased it.

            • by Korin43 ( 881732 )

              They should be reading some kind of docs before choosing tool / function they use for the job... the more powerful the language the more you need to know.

              This is the exact opposite of how normal people describe powerful languages. The most powerful languages let you get the most work done while fighting your language the least. With a language like Java or Python, I can generally *guess* what the correct function for a task is, and it tends to work (although if you do this in Java, you tend to get the worst possible implementation -- but it will work correctly). In PHP, you need to read the full documentation for every function you call, then Google it to ma

              • by Bengie ( 1121981 )

                This is the exact opposite of how normal people describe powerful languages. The most powerful languages let you get the most work done while fighting your language the least.

                When a language is doing lots of work for you, you'd better know what kind of work it's doing.

            • In my opinion, acting like an engineer means accepting that you're only human and are going to make mistakes. Therefore, adopt tools and practices that reduce the chance of mistakes and reduce the damage when mistakes do occur.

              In the case of SQL, pick a database interface that automatically escapes every string substituted into a query. This one architectural decision eliminates an entire class of bugs; it's much more effective than double checking hundreds of individual query constructions. And it has the

      • This so much. I went from ColdFusion development to PeopleSoft, and I find myself missing such nice things from ColdFusion.

        • by Korin43 ( 881732 )

          I can't bring myself to actually miss anything about ColdFusion, but I do find it interesting that they managed to solve the #1 security problem on the web. If only they could automatically detect when you're trying to store a password, and run it through bcrypt first ;)
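On the bcrypt aside: the same "put it in the platform" argument applies to password storage. bcrypt itself is a third-party package in Python, but the stdlib's PBKDF2 shows the same shape, salted, deliberately slow, compared in constant time; the iteration count below is an illustrative choice, not a recommendation:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None):
    # A fresh random salt per password defeats precomputed rainbow
    # tables; 200_000 PBKDF2 iterations makes brute force expensive.
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def check_password(password: str, salt: bytes, expected: bytes) -> bool:
    _, digest = hash_password(password, salt)
    # compare_digest avoids leaking match length through timing.
    return hmac.compare_digest(digest, expected)
```

A framework that did this automatically whenever it saw a password field is exactly the kind of "infrastructure takes care of it" feature the article wishes for.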

    • by Anonymous Coward

      > I would say the programing languages and OS already do a MUCH better job than the application developers...
      Unless the language in question is either PHP or JavaScript.
      The first was written by someone with no clue about language design or network security. The second was written by a competent enough programmer who was constrained by having to write the language and library from scratch within a week.
      PHP's idea of a database API was an "execute string as SQL" function, pushing the responsibility for avo

    • I have blamed the developers, but the greatest source of issues I've seen usually circles back to managers and users. These fundamental issues/problems are still going on
      - Prototypes are taken directly into production. The prototype's intention was to flesh out the business rules; after initial testing shows it does (or can do) what the customer wants, the customer is happy and wants it right now.
      So how did the prototype get so poor -- after 4 or 5 iterations where the develope
  • Totally off-base (Score:4, Insightful)

    by i kan reed ( 749298 ) on Friday June 01, 2012 @02:55PM (#40184515) Homepage Journal

    Computers are inherently instruct-able. That's their power, and that's where all security flaws come from. The underlying problems don't arise out of an industry-wide antipathy. If anything the reality is the opposite; the entire industry is quite interested in the fundamentals of security.

    The problem lies in the fact that we want to be able to tell computers what to do with a wide assortment of options on each of multiple layers (machine, operating system, high-level language, and user application). Every one of those layers necessarily includes things we don't want done that someone else could want to do (i.e., a security flaw).

    This is like blaming car theft on a general malaise towards car security, when in fact a car that doesn't go wherever the driver wants, or that only ever accepts one driver, is nigh useless.

    • I disagree. While what you're saying COULD be true, everything I've heard from everyone from programmers to IT employees points to the opposite. As if the news weren't enough. Sure, a security firm getting hacked might be an instance of "if someone wants to hack you badly enough, they will," but many more problems arise because management makes the decisions about where to spend resources, and they're always pushing for features because that is where the money is.

      • Whoa, I'm blown away by the concept that there are compromises that establish a balance of different needs due to budget limitations. That never affects anything but security.

        • The underlying problems don't arise out of an industry-wide antipathy. If anything the reality is opposite, the entire industry in quite interested in the fundamentals of security.

          Whoa, I'm blown away by the concept that there are compromises that establish a balance of different needs due to budget limitations. That never affects anything but security.

          Stop being stupid. Sarcasm doesn't make you right, and my point is quite clear, but I see I need to spell it out for you anyway.

          There IS industry-wide apathy, and the reason is because higher-ups benefit more from pushing features that edge out the competition next month than they do from long-term investment in security that might save them two years from now, which is what they are doing. If the managers/CEOs at a small car maker could save money by making one identical key/lock across all their cars, the

    • by damm0 ( 14229 )

      The car industry did move towards key fobs that authenticate legitimate holders. And the computer industry can do similar kinds of tricks.

      • We do. We do all the time. That doesn't mean that every choice made should be with a "Security first, freedom second" attitude. Every situation needs known severe risks addressed, and everything else played by ear. That's just how complex projects work.

  • ...because we love hearing not only the clamor for new features, but also:

    "Why won't you run on commodity hardware? I can get a system that does everything yours does, plus more [including things others make it do against my will], for half the price!"

    "Why is your system so much slower? Every benchmark shows that other systems can do X in a quarter of the time [leaving the other 75% for executing malware]."

    "Why does your system make it such a PITA for me to do this simple operation, when all the other systems let me [or any unauthenticated user] do it with a few simple lines of code?"

    • ... and yet such PITA systems like SELinux do exist and are utilized.

      Solutions are there, people need to just stop being lazy bitches about it.

  • Just Ask Apple (Score:2, Insightful)

    When you protect developers and users from themselves, when you start making engineering tradeoffs that reduce functionality and tinkering and fiddling ability in exchange for greater security and stability, some people start screaming that you're being evil, paternalistic and unfreedomly and not letting them decide for themselves whether they want to make tragic mistakes.

    • I'd say ask IT security people. Apple is hardly a good example, since the reasons for its walled-garden model have been more about what makes money than what makes things secure. It's been very successful at creating a certain experience for the users, but only recently has it taken up the slack as far as security goes, so I feel Apple deserves the criticisms it receives.

    • by Xtifr ( 1323 )

      That's funny--I don't hear that sort of screaming about OpenBSD, which is miles ahead of Apple on making a solid, secure system. Maybe it's not the greater security and stability that pisses people off. And it's not the reduced functionality, because OpenBSD has that too. Maybe it's the reduced ability to tinker and fiddle, and the fact that you don't actually own what you bought, and the fact that Apple really are arrogant and paternalistic.

      (Actually, I think OSX is a perfectly adequate system. It's Ap

  • by brainzach ( 2032950 ) on Friday June 01, 2012 @03:11PM (#40184829)

    If you design your tools and infrastructure to prevent those with bad intent, it can also prevent those with good intent from using your system.

    There is no magical solution that will solve our security needs. In reality, everything will require tradeoffs which developers have to balance out according to what they are trying to do.

    • by DamonHD ( 794830 )

      Wait, the evil bit[RFC 3514]? ...

      Rgds

      Damon

    • by damm0 ( 14229 )

      The great majority of applications today could be coded up in environments similar to what developers are already used to using, but constrained by sandboxes, if the sandbox author were to provide useful tools for the developer to do things they want to do. Examples include local storage on the file system, database interactions, etc.

      Oddly enough, efforts to solve the concurrency problem might also help our security problem. Witness http://flowlang.net/ [flowlang.net] for example. Being able to analyze the source latti

      • If your variable includes user input without going through any of the input-checking routines and is then passed to a string concatenation routine before being passed to the database... the runtime can detect that easily and abort or check!

        Like Perl?

  • Ugh. What a flaky, uninformed piece of drivel that was.

    The author can think of himself as an artist all he wants to. Here's a newsflash: other "arts" have to do things responsibly, too.

    His whole argument is like an architect blaming the bricks when his/her poorly designed building falls over.

    • That took you 7 minutes to read?
    • by Tarsir ( 1175373 )
      Agreed. This was an article with many low points, but I think the following two excerpts highlight the flawed reasoning quite well:

      The underlying platforms and infrastructures we develop on top of should take care of [ensuring security], and leave us free to innovate and create the next insanely great thing.

      The other major factor in why things are so bad is that we don't care, evidently. If developers refused to develop on operating systems or languages that didn't supply unattackable foundations, companies such as Apple and Microsoft (and communities such as the Linux kernel devs) would get the message in short order.

      This article is missing even a gesture towards explaining why "the infrastructure" should be responsible for security while developers create their masterpieces, and boils down to mere whining: "Security isn't fun so someone else should do it for me!" Perhaps the worst part is that there is a good argument to be made that the OS and hardware should take care of security, and a fundam

  • America's biggest threat is not terrorism. It's complacency. For such an arrogant industry, IT "solutions" sure do have a LOT of holes. That's what you get when you demand quantity over quality.
    • Correction: MANAGERS don't know anything about security.

      Let me assure you, IT does - unless MANAGEMENT has ensured they only hire those who don't. The ones that do, however, cannot exercise it because of MANAGEMENT.

  • by BenSchuarmer ( 922752 ) on Friday June 01, 2012 @03:16PM (#40184953)

    I've got a Far Side on my cube wall. The caption is "Fumbling for his recline button, Ted unwittingly instigates a disaster." The picture is a guy sitting in an airplane seat about to grab a switch that's labeled "wings stay on" and "wings fall off".

    It's a reminder to me to try to avoid giving my users a way to shoot themselves in the foot.

    On the other hand, people need powerful tools to get their jobs done, and those tools can do horrible things when used incorrectly. There's only so much we can do to make things safe.

  • In .NET there is no buffer overflow or HTML injection (query string and POST data are scanned by default) or SQL injection (using SqlParameter, all data are encoded). I "feel" a lot safer about basic security problems.
    • My specialty is not .NET.

      I can tell you that when I rewipe my computer with a fresh Win 7 image there are over 120 security updates, and half are .NET. Hours later there are more security updates for the .NET platform. I do not think it is as secure as you think.

  • by Shoten ( 260439 ) on Friday June 01, 2012 @03:25PM (#40185157)

    The article focuses on security problems that have been largely addressed, in exactly the way he's complaining hasn't happened yet. He focuses on stack smashing and buffer overruns, for example...and disregards the latest higher-level languages that manage memory in ways that make these attacks far less common. He entirely ignores the most frequent and effective attacks (XSS, SQL injection), and he doesn't talk about the underlying causes of such vulnerabilities either. (I, for one, am extremely curious how a SQL injection attack can be the fault of a fundamentally insecure operating system, since in many cases the attack traverses multiple different OSes with nary a hiccup.) I'm not entirely convinced that he even understands the current state of what most vulnerabilities look like, to be honest. And finally, he gives absolutely no indication as to how to accomplish this lofty goal of an OS that would prevent there from being such a thing as an insecure app in the first place. It looks to me that all he's doing is whining about the fact that he's having to learn proper and secure programming methods, which is taking away from his hobby of eating bear claws two at a time.

  • Blaming the users, developers, tool chains, internet, or operating systems isn't going to help fix anything because those aren't the root cause of the problem.

    Complexity is the problem. The solutions we're all used to using involve adding even more complexity on top of things, which makes it impossible to secure things.

    There is another approach. It's called capability based security, and here's how it works:

    For a given process, create a list of the files it needs to access, and the rights it needs for each.

  • by Animats ( 122034 ) on Friday June 01, 2012 @03:59PM (#40185963) Homepage

    The article is stupid. But the language and OS problem is real.

    First, we ought to have secure operating system kernels by now. Several were developed and passed the higher NSA certifications in the 1980s and 1990s. Kernels don't need to be that big. QNX has a tiny microkernel (about 70KB) and can run a reasonable desktop or server environment. (The marketing and politics of QNX have been totally botched, but that's a different problem.) Microkernels have a bad rep because CMU's Mach sucked so badly, but that was because they tried to turn BSD into a microkernel.

    If we used microkernels and message passing more, we'd have less trouble with security problems. The way to build secure systems is to have small secure parts which are rigorously verified, and large untrusted parts which can't get at security-critical objects. This has been known for decades. Instead, we have bloated kernels for both Linux and Windows, and bloated browsers on top of them.

    On the language front, down at the bottom, there's usually C. Which sucks. The fundamental problems with C are 1) "array = pointer", and 2) tracking "who owns what". I've discussed this before. C++ doesn't help; it just tries to wallpaper over the mess at the C level with what are essentially macros.

    This is almost fixable for C. I've written about this, but I don't want to spend my life on language politics. The key idea is to be able to talk about the size of an array within the language. The definition of "read" should look like int read(int fd, &char[n] buf; size_t n); instead of the current C form int read(int fd, char* buf, size_t n); The problem with the second form, which is the standard UNIX/Linux "read" call, is that you're lying to the language. You're not passing a pointer to a char. You're passing an array of known size. But C won't let you say that. This is the cause of most buffer overflows.

    (It's not even necessary to change the machine code for calling sequences to do this. I'm not proposing array descriptors, just syntax so that you can talk about array size to the compiler, which can then do checking if desired. The real trick here is to be able to translate old-style C into "safe C" automatically, which might be possible.)

    As for "who owns what", that's a language problem too. The usual solution is garbage collection, but down at the bottom, garbage collection may not be an option. Another approach is permissions for references. A basic set of permissions is "read", "write", "keep", and "delete". Assume that everything has "read" for now. "Write" corresponds to the lack of "const". "Delete" on a function parameter means the called function has the right to delete the object. That's seldom needed, and if it's not present, the caller can be sure the object will still be around when the function returns.

    "Keep" is more subtle. "Keep" means that the callee is allowed to keep a reference to a passed object after returning. The object now has multiple owners, and "who owns what" issues come up. If you're using reference counts, only "keep" objects need them; objects passed without "keep" don't need reference count updates.

    Do those few things, and most low-level crashes go away.

    I won't live to see it.

    • int read(int fd, &char[n] buf; size_t n);

      char buffer[10];
      &buffer[9] will point to the last element of the buffer.
      &buffer[10] is outside the buffer range -->> BUG, C programming 101.

      If the function as stated above requires that n be the buffer size, then:

      1. You will always be passing a pointer to outside the buffer.
      2. You will always be required to read ONLY the full size of the buffer. This will prevent reading more than what the buffer can hold, but it will also prevent reading less than the buffer size.

      • by Animats ( 122034 )

        The intent of the new syntax is that &char[n] buf means passing a reference to an array of size n. char[n] is an array type, something C currently lacks. Syntax like this is needed so that you can have casts to array types.

        I've had a few go-rounds at this syntax problem. See "Strict pointers for C" [animats.com]. Unfortunately, there's no solution that's backwards-compatible with existing code. However, mixing files of old-style and new-style code is possible, and mechanical conversion of old-style code to new-style code may be possible.

  • Ask any MBA or beancounter. They will gladly tell you that IT is a cost center that adds no value, so there are no costs at all associated with running IE 6 with no security updates after June 7th, 2009, on a Tuesday, that won't work with intraCrap from MegaCorp.

    It is not like any financially sensitive information is ever used on computers anyway, and since IT is a dollar sink there is nothing wrong with scrapping it and switching to typewriters, since after all they are just a cost. ... (... end sarcasm)
    My rant above i

  • We can have a wide-open, no-holds-barred space to create anything good, bad or indifferent. Or we can lock it all down according to someone's idea of safe, fair and convenient. Under the second plan, a thousand things you are going to want to do will not be possible because they exceed the mandate of the security environment (no matter where you arbitrarily draw the line). So you get to pick your demons. Me, I like it the way it is. That's just me.

Think of it! With VLSI we can pack 100 ENIACs in 1 sq. cm.!
