Programming | Security | Software

Are Flawed Languages Creating Bad Software? (techcrunch.com) 531

"Most software, even critical system software, is insecure Swiss cheese held together with duct tape, bubble wrap, and bobby pins..." writes TechCrunch. An anonymous reader quotes their article: Everything is terrible because the fundamental tools we use are, still, so flawed that when used they inevitably craft terrible things... Almost all software has been bug-ridden and insecure for so long that we have grown to think that this is the natural state of code. This learned helplessness is not correct. Everything does not have to be terrible...

Vast experience has shown us that it is unrealistic to expect programmers to write secure code in memory-unsafe languages...as an industry, let's at least set a trajectory. Let's move towards writing system code in better languages, first of all -- this should improve security and speed. Let's move towards formal specifications and verification of mission-critical code.

Their article calls for LangSec testing, and applauds the use of languages like Go and Rust over memory-unsafe languages like C. "It's not just systemd, not just Linux, not just software; the whole industry is at fault."
  • by Anonymous Coward on Sunday October 02, 2016 @06:37AM (#52998137)
    It's not the language, it's the programmers and the rush to produce easy code. Speed and simplicity trump security and efficient coding these days.
    • by Anonymous Coward on Sunday October 02, 2016 @06:46AM (#52998171)

      A good craftsman doesn't blame his tools because a good craftsman doesn't use poor tools.

      • Re: (Score:3, Insightful)

        by Anonymous Coward

        A good craftsman doesn't insist that his tools necessarily do the job for him either.

        • by Anonymous Coward on Sunday October 02, 2016 @07:41AM (#52998385)

          Give me a crappy handsaw and nothing else and expect me to do perfectly mitered crown molding in no time at all? You get shit.

          Same tools with more time? Now we can talk about my skills, i.e. can I do it properly with just that handsaw, given the time to make it right?

          On the other hand, give me a nice table saw where I can simply set the saw to miter correctly and I can do these crown moldings perfectly in no time at all.

          The moral of the story? Yes, in scenario 2 the poor craftsman will still blame the tools, but a good one will too: he knows how to do the job with just the handsaw, but he also knows that the table saw exists.

        • by Anonymous Brave Guy ( 457657 ) on Sunday October 02, 2016 @08:09AM (#52998495)

          A good craftsman doesn't insist that his tools necessarily do the job for him either.

          As programmers, automation is the essence of what we do. Any programmer who isn't insisting on their tools doing work so they don't have to do it themselves isn't making very good use of those tools. That is as true for safety, security and defensive programming as for any other aspect.

      • by Dunbal ( 464142 ) * on Sunday October 02, 2016 @07:33AM (#52998359)
        On the other hand, the tools don't make the craftsman. You give sophisticated tools to an idiot and you will still get something idiotic - although sophisticatedly idiotic.
        • by Joce640k ( 829181 ) on Sunday October 02, 2016 @08:08AM (#52998487) Homepage

          Yep. Too much 'critical' code is written by the boss's nephew just because he "seems to be good at computers".

          Bjarne said it best:

          The idea of programming as a semiskilled task, practiced by people with a few months' training, is dangerous. We wouldn't tolerate plumbers or accountants that poorly educated. We don't have as an aim that architecture (of buildings) and engineering (of bridges and trains) should become more accessible to people with progressively less training. Indeed, one serious problem is that currently, too many software developers are undereducated and undertrained. Obviously, we don't want our tools--including our programming languages--to be more complex than necessary. But one aim should be to make tools that will serve skilled professionals--not to lower the level of expressiveness to serve people who can hardly understand the problems, let alone express solutions. We can and do build tools that make simple tasks simple for more people, but let's not let most people loose on the infrastructure of our technical civilization or force the professionals to use only tools designed for amateurs.
          - Bjarne

          • by plopez ( 54068 )

            Or built by some low-end contractor who got their degree from a street-corner diploma mill, and then gets thrust into building enterprise-scale software.

            • Credential-itis (Score:4, Insightful)

              by stoicio ( 710327 ) on Sunday October 02, 2016 @12:32PM (#52999629) Journal

              To a certain extent this can be true. Our society, however, also suffers from another problem at the other end of this scale.

              This is what I refer to as 'Academic Credentialitis'. This disease is pervasive in our society and needs to be stamped out.

              There is no certainty that achieving academic standing in a subject actually makes someone good enough at that subject to be 'fail-safe'.
              There is a systemic myth that somehow links academic standing with actual skill.

              Another question is the design of academic programs. Programs can only be developed reactively, based on social context. This means that any new technology that may be disruptive can only have curriculum developed for it once it is known. If we look at the top 20 historic developments of technology that have shaped human history in a disruptive way, the majority of those were non-credentialed developments.

              Consider, if you will, the rise of the desktop computer. It was **NOT** a degreed professional who designed the first broadly successful consumer PC.
              (Wozniak only finished his engineering degree in 1986.) Even if we only consider the commercial success of the PC from an academic perspective, there were no business academics who predicted or pursued the development of a consumer personal computer until after it had already arrived on the business scene from a garage. So, in the context of the computing world that we now live in, academia had little to do with the early development and adoption of PC technology except to claim it after the fact. In short, if we had all adhered to academic credentials as the basis for the development of this technology, none of us would have it right now. We would all still be using teletype and reading paper newspapers delivered by hand.

              The academic myth has been created as a socio-economic filter to ensure that only those with suitable amounts of cash may achieve status in industry or government. This does not scale well to either skill or aptitude.

              It has been suggested that aptitude testing would be a better way to validate skill level rather than degrees. The question is, "who designs the test?". There would be a strong bias to load the content of tests with useless information that only a degreed academic would know in just the same way as requests for proposals are biased toward favoured contractors.

              The credential is a problem, not a solution. We need to remove our social addiction to that particular social snake oil and get back to skills assessment instead.

          • Definitely agree.

            It is really good that programming is so accessible. It was really easy for me to get started back in the day. First in BASIC. Then in C/C++.

            The problem is that line between casual use and professional.

            I've made bridges before. I built them using Lego at one point. I built them using wood and Popsicle sticks. But who would think that qualifies me to build an actual bridge across a river that people would use?

            Or less crazy, I use Excel and know spreadsheets. I have pretty good knowledge of n

      • Re: (Score:3, Insightful)

        by plopez ( 54068 )

        I have worked in the industry since the late 80's. I have NEVER been allowed to choose all of my tools.

      • False Analogy (Score:4, Insightful)

        by stoicio ( 710327 ) on Sunday October 02, 2016 @10:40AM (#52999103) Journal

        First of all, it's a programming language, not a saw.

        Secondly, almost all other languages are compiled using a 'C' compiler.
        If the 'C' language were a flawed language, then producing the code for all those other languages using 'C' would make all of those languages inherently contain the same systemic flaws.

        'C/C++' gets a bad rap from programmers because most programmers lack the skills necessary to make reusable patterns or program securely in any language let alone 'C'.

        A better analogy would be that programming has become a lot like carpentry. All manner of people claim to be carpenters and joiners just because they own a hammer. In computing, all manner of people claim to be programmers because they own a computer, took a Comp. 150 or 250 course, became a Microsoft Certified Engineer (whatever that means), or downloaded a free compiler and read a manual once. Such is our culture.

        It takes a great deal of experience to understand where problems can be produced in any programming language. Unfortunately, the under-informed masses of underskilled programmers tend to be negative about the technologies they understand the least.

        The industry needs job entrance tests to demonstrate proficiency in programming rather than simply accepting that people are 'qualified' because they dicked with code for 20 hours in high school.

        • Re:False Analogy (Score:4, Interesting)

          by gweihir ( 88907 ) on Sunday October 02, 2016 @12:18PM (#52999547)

          Indeed. And just look at what pretty impressive, secure, stable and fast code is written in C by people that know what they are doing. This whole idiocy about "we need better languages" comes from the same morons that want to teach everybody how to code: They still believe that coding is a menial task that can be done very cheaply, when in actual reality it is among the hardest engineering disciplines known to man.

    • Re: (Score:2, Interesting)

      by awe_cz ( 818201 )
      While your statement is correct, it kind of misses the problem. With current demand we need more quality craftsmen than there are or ever will be available (in the foreseeable future). In this regard, having safe languages looks like a good choice. Not all pieces of software need to be written by the best of the best anyway. We just need to make sure someone's badly written GUI application does not crash the whole OS. If if it crashes, make sure there is an output from that crash that even sub-standard programm
      • If if it crashes

        Case in point, semantic analysis is available in some text editors to point out your mistake. A programming language which doesn't require that the programmer state their intentions makes it hard for silly mistakes to be identified early.

    • by ganv ( 881057 )
      Are flawed programmers creating bad code? Yes, but there is a bigger cause. Our era assumes that complicated things can be done by a small number of people in a tiny amount of time. It is the failure to simplify and to allocate adequate resources to creating great code that is really creating bad code.
    • by allo ( 1728082 ) on Sunday October 02, 2016 @07:18AM (#52998289)

      A good craftsman chooses good tools.

      Of course you can create excellent work with very bad tools.
      But the first thing a good craftsman does is search for the right tools. He checks his budget, then starts to search for the right tools, and if they are too expensive, he searches for replacements which are, for him (but not necessarily for everyone), similarly useful. If he cannot find a tool he needs for good work, he's honest about it and tells his client before starting the work.

      • And therein lies the problem. Programming tools are not interchangeable. Once a programming project starts using one set of tools, everyone working on it then and in the future has to use that same set of tools. A lot of the craftsmen writing software aren't very good. Many of them are self-taught with very little formal training in programming or algorithms. A lot of the "popular" tools are the ones chosen by these poor craftsmen. It's like making science decisions by democracy, when 80% of the popul
    • This meme, that certain languages are memory unsafe, is BS. The programmer is free to add all the memory checking that a so-called memory safe language automatically inserts. However the programmer using the "unsafe" language is free to use knowledge unavailable to the compiler to decide when and when not to perform such checks.

      Furthermore the suggestion that memory-safe languages are faster is bogus. You don't get faster by automatically generating more code.
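
      As a rough sketch of that point (the helper name and sizes here are illustrative, not from the comment): the programmer can use knowledge the compiler does not have, namely that the whole index range is validated up front, and then deliberately skip per-access checks inside the loop.

        // Hedged sketch: one explicit range check before the loop stands in for the
        // per-access checks a "memory-safe" language would insert automatically.
        #include <cstddef>
        #include <stdexcept>
        #include <vector>

        long sum_range(const std::vector<int>& v, std::size_t first, std::size_t count) {
            if (first > v.size() || count > v.size() - first)
                throw std::out_of_range("sum_range: window exceeds vector size");

            long total = 0;
            for (std::size_t i = 0; i < count; ++i)
                total += v[first + i];   // unchecked on purpose: the single check above covers every index
            return total;
        }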

      These memory-safe languages are only more
      • True, but the effort required to do it in (e.g.) C is ridiculous. You can't really expect anybody to do it.

        Other languages can do it a lot more easily, e.g. C++. Range checking for things like std::vector is turned on by default in recent compilers.

        ie. "array[-1] = 0x666;" will throw an exception, just like in Java.

        You can go outside the box and start using raw pointers in C++ but it's not something you need to do, or should be doing very often.
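
        A minimal sketch of the behaviour described above, with one caveat stated as an assumption: std::vector::at() is required by the standard to throw std::out_of_range on a bad index, while checking of plain operator[] depends on the compiler and build flags (for example -D_GLIBCXX_ASSERTIONS on libstdc++), so it is not guaranteed in every default build.

          #include <iostream>
          #include <stdexcept>
          #include <vector>

          int main() {
              std::vector<int> v(10, 0);
              try {
                  v.at(10) = 0x666;        // out of range: at() must throw
              } catch (const std::out_of_range& e) {
                  std::cerr << "caught: " << e.what() << '\n';
              }
              // v[10] = 0x666;            // unchecked by default: undefined behaviour
              return 0;
          }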

        And *this* is why Linus Torvalds is an idiot with all his anti-C+

        • As someone who spent a lot of time writing assembly language back in the day, I understand the "we only use C here" mentality. It's only one small step from assembly language, which is what C was intended to be. I still have an urge to go to the bare metal (assembly), but it's just not worth the time anymore.

          I also understand caution with C++. Its features can be overused to the point of gross inefficiency and/or a lack of clarity in the code. However with highly curated source code like the Linux kernel th
        • And sometimes when talking directly to hardware, array[-1] is exactly what is needed. Of course, for the majority of software that will never be the use case, so it would be nice to have warnings for such things, and thankfully there are static source code checkers that do exactly this. http://cppcheck.sourceforge.ne... [sourceforge.net] for example would warn about "out of bounds" for this very code.
    • by Anonymous Brave Guy ( 457657 ) on Sunday October 02, 2016 @07:47AM (#52998413)

      It's not the language, it's the programmers and the rush to produce easy code.

      Well, I think it's a lot the language as well. To a first approximation, every major piece of system and networking software written in C has had serious security issues at one time or another, even the ones written by the best programmers of their generation and hailed as being exemplary in their code quality. I think after the first few decades of evidence we're allowed to call this one now, and say that writing critical software in unnecessarily dangerous languages produces less than optimal results.

      • The first time I was confronted with C I was shocked - they check for binary zero as a string terminator? seriously? wtf?
        I studied compilers and languages (amongst other things) in the 70's and that one decision - which Dennis Ritchie has since said he very much regrets - flies against everything we were being taught. Of course C had been released a year earlier but I don't remember it being mentioned directly as something to avoid.
        What else was there?

        • Pascal with its fixed-length strings was certainly no
    • by guruevi ( 827432 )

      Exactly this. I am building an application, slowly building it up and taking my time, and there are very few bugs in it. I test after each section. It's a rebuild of an application I built years ago under extreme time and budget pressure; needless to say, that thing is held together with metal wire and duct tape at this point. Now that I have the opportunity to rebuild it on my own time frame, it's so much better.

      The other problem is that things are trying to be too much. They are following the weirdest c

    • by Greyfox ( 87712 )
      This is the core of the problem, here. I've been in the software industry since '89 and see the same patterns again and again. The software you're working on was always rushed. They rushed it out the door with incomplete (or no) understanding of the problems they were trying to solve, the business model and in a lot of cases, of writing software at all. They crapped out one giant piece of software with no way to verify its outputs and then went into maintenance mode where they were constantly putting out fi
  • Yes (Score:4, Interesting)

    by MichaelSmith ( 789609 ) on Sunday October 02, 2016 @06:56AM (#52998217) Homepage Journal

    Every programming language I have used commercially has had a few things it does well and a whole bunch of limitations. Bad code gets written all the time to work around language limitations. Consider the lack of data type declarations in Python, JavaScript, CoffeeScript, etc. Terrible for code readability.

    It's all religious theory and habit with very little up-front thought and design. RIP Pascal and Ada.

    • by tomhath ( 637240 )

      Consider the lack of data type declarations

      On the other hand, pretty much every language that uses static typing gives you a way to get around it with overloading, annotations, injections, factories, etc., all of which makes the code even more buggy and difficult to read, verify, and maintain.

  • The suggestion to use formal verification has been around for decades. It isn't used outside of the ivory towers because writing a correct specification and proving it's correct is harder than writing correct software. It becomes increasingly difficult as the code base gets bigger and more interfaces are added.
    • by gtall ( 79522 )

      There's another problem: writing correct specifications and proving that they and the code together are correct requires a different set of mental tools. In my estimation, most programmers are terrible at formal logic. It is not computation, it is more like mathematics. Programmers just do not have the mathematical maturity to grind their way through logical proofs.

      And it is extraordinarily time consuming.

  • by Required Snark ( 1702878 ) on Sunday October 02, 2016 @07:18AM (#52998291)
    Back in the early days of computers, when the world had many hardware architectures, there were machines that had co-designed hardware and software. They were designed from the ground up to avoid certain types of problems, and they worked really well.

    Burroughs Large System [wikipedia.org] stack machines were one example and Symbolics Lisp Machines were another. Burroughs had array descriptors that did bounds checking at run time and tagged memory. Tagging added non-user accessible bits to each memory word. The tag defined what kind of data the word contained and the hardware detected any attempt to use a memory value illegally. Symbolics machines also had tag bits, but their implementation was microcoded, so the tag interpretation was also in microcode.

    Until computer implementations include features like tagged memory and hardware array bounds checking they will never be truly secure. Some problems cannot be addressed by an isolated software layer: they can only be made secure by hardware enforcement of fundamental features that prohibit some classes of software errors.
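
    A purely conceptual sketch of the tagging idea in software terms; it is not how the Burroughs or Symbolics machines implemented it, where the tag lived in extra, non-user-accessible bits of each word and was enforced by hardware or microcode rather than by library code like this. The class and method names are made up for illustration.

      #include <cstdint>
      #include <iostream>
      #include <stdexcept>

      enum class Tag : std::uint8_t { Integer, Pointer };

      // Toy "tagged word": the tag records what the word holds, and reading it as
      // anything else is rejected, loosely analogous to hardware tag checking.
      class TaggedWord {
      public:
          static TaggedWord makeInt(std::intptr_t v) { return TaggedWord(Tag::Integer, v); }
          static TaggedWord makePtr(void* p) {
              return TaggedWord(Tag::Pointer, reinterpret_cast<std::intptr_t>(p));
          }
          std::intptr_t asInt() const {
              if (tag_ != Tag::Integer) throw std::runtime_error("tag violation: not an integer");
              return value_;
          }
          void* asPtr() const {
              if (tag_ != Tag::Pointer) throw std::runtime_error("tag violation: not a pointer");
              return reinterpret_cast<void*>(value_);
          }
      private:
          TaggedWord(Tag t, std::intptr_t v) : tag_(t), value_(v) {}
          Tag tag_;
          std::intptr_t value_;
      };

      int main() {
          TaggedWord w = TaggedWord::makeInt(42);
          std::cout << w.asInt() << '\n';   // fine: the tag matches the access
          try {
              w.asPtr();                    // rejected: an integer used as a pointer
          } catch (const std::runtime_error& e) {
              std::cerr << e.what() << '\n';
          }
      }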

  • Let's move towards writing system code in better languages, first of all -- this should improve security and speed.

    There ain't no such thing as free memory checking. It takes extra code and therefore takes extra time.

    Plus the memory-unsafe premise is BS. There is nothing preventing a programmer from adding their own memory checking in such languages.
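
    As a hedged illustration of that claim (the buffer size and helper name are made up for the example): nothing in a C-style language stops the programmer from writing a checked store, and nothing forces them to use it either.

      #include <cstddef>
      #include <cstdio>

      static int buffer[16];

      // Hand-rolled bounds check: the programmer supplies the bound explicitly.
      static bool checked_store(int* base, std::size_t len, std::size_t idx, int value) {
          if (idx >= len) {
              std::fprintf(stderr, "index %zu out of range (len %zu)\n", idx, len);
              return false;   // the caller decides how to recover
          }
          base[idx] = value;
          return true;
      }

      int main() {
          checked_store(buffer, 16, 3, 42);   // ok
          checked_store(buffer, 16, 99, 7);   // rejected instead of silently corrupting memory
          // buffer[99] = 7;                  // the unchecked version compiles just as happily
          return 0;
      }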

    • by rew ( 6140 )

      The "better" counter to the original argument is that not all bugs are memory overruns.

      Back in the early nineties I was reading the manual page for the daemon that would send a message to a terminal when a mail message came in. I concluded, from the "published specs", that I could trick it into doing "nasty" things. And that turned out to work.

      This is an example where no overrun, just the published behaviour of a program, led to a security issue.

    • Re: (Score:2, Interesting)

      by Anonymous Coward

      There is nothing preventing a programmer from adding their own memory checking in such languages.

      Sure there is: ignorance and laziness.

      One of the problems is that programmers generally view PEBKAC as a "get out of jail free" card. Once a problem is diagnosed as PEBKAC, they wash their hands of it and say "not my problem". But ask any UX expert, and they will disagree with this - if a user is having problems with a computer system, it's the computer system's fault (or at least an opportunity for the computer system to be better). Heck, ask any *security* expert about it. You can't just ignore issues bec

    • There ain't no such thing as free memory checking. It takes extra code and therefore takes extra time.

      It wasn't clear to me whether the author meant run-time speed or development speed. Certainly better languages and tools can make a big difference to the latter.

      It's also quite possible for better languages to generate run-time code that is more efficient. The more semantic information about programmer intent, restrictions and guarantees that can be encoded in the language, the more scope there is for optimisers to produce better output.

      Plus the memory-unsafe premise is BS. There is nothing preventing a programmer from adding their own memory checking in such languages.

      But that only matters if programmers do add their own checks. Evidently i

      • But that only matters if programmers do add their own checks. Evidently in reality most do not.

        And hence the idea that the problem is some programmers, not the language; the craftsman, not the tool.

        • But that only matters if programmers do add their own checks. Evidently in reality most do not.

          And hence the idea that the problem is some programmers, not the language; the craftsman, not the tool.

          If a tool doesn't fit my hand, it's a bad tool. If a tool doesn't fit human intuition and tendencies, it's a bad tool. Tools are here to assist humans, not the other way around, and have no inherent value of their own, just their utility, thus any mismatch between a tool and a user is a problem with the tool,

        • Perhaps, but when close to 100% of a population have the same trait, arguments that the trait should be changed rather than designing tools and processes that accommodate that trait are unrealistic and therefore not very useful.

  • Bugs in code are not a given; they are only a given for a certain complexity of software, created with a certain competence, under a certain requirement of perfection. The only way around this is to control those variables, so this does have some merit. For instance, in the process industry, when programming safety systems you don't assume perfect competence of the programmer, so you present them with a limited language made of pre-vetted function blocks to limit the amount of damage they can do.

    However wheneve

  • When you write a program that needs to print the primes up to a certain number, you can easily create a formal proof that your program is correct.

    But when your program is, say, "apache", which needs to interact with many different browsers on one side and interpret PHP scripts that interact with databases on the other, this formal proof becomes impossible. Similarly, you cannot write a formal spec for the interaction with the user in, for example, a web browser.

    Even though both examples I put forward today (web server and web browser) didn't exist back then, I've held this opinion for thirty years (spring 1987).
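
    A toy version of the primes example, to make the contrast concrete: the entire specification fits in one sentence ("print every prime p with 2 <= p <= limit, in increasing order"), which is what makes a correctness argument tractable; nothing comparably short specifies a web server or browser. The bound of 100 is just for illustration.

      #include <iostream>

      // Trial division is enough for a toy whose spec is a single sentence.
      static bool is_prime(unsigned n) {
          if (n < 2) return false;
          for (unsigned d = 2; d * d <= n; ++d)
              if (n % d == 0) return false;
          return true;
      }

      int main() {
          const unsigned limit = 100;   // illustrative bound
          for (unsigned p = 2; p <= limit; ++p)
              if (is_prime(p)) std::cout << p << '\n';
      }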

    • by Halo1 ( 136547 ) on Sunday October 02, 2016 @08:52AM (#52998673)

      When you write a program that needs to print the primes up to a certain number, you can easily create a formal proof that your program is correct.

      But when your program is, say, "apache", which needs to interact with many different browsers on one side and interpret PHP scripts that interact with databases on the other, this formal proof becomes impossible. Similarly, you cannot write a formal spec for the interaction with the user in, for example, a web browser.

      While things like the halting problem obviously prevent fully formally proving the correctness of programs, you can go much farther than we generally go today. For example, I participated in an EU project [euromils.eu] where they constructed a formal model [isa-afp.org] of the PikeOS separation kernel (kind of like an embedded real-time hypervisor). They also generalised this model, which includes support for things like interrupts and context switches.

  • The most commonly used languages do absolutely nothing to prevent programmers from creating completely unmaintainable and broken code. Like creating a public arraylist and then accessing it from all over, or creating 8-layer-deep inheritance hierarchies with untested spaghetti code that's impossible to understand and modify, or copy-pasting all over.

    There are efforts to fix some of the problems -- making it harder to use 'null', encouraging immutable objects, simplifying concurrency, more capable type syste

    • by Tomster ( 5075 )

      Regarding a few of the other comments:

      "It is a poor craftsman that blames his tools." Absolutely. However, the same good craftsman who can do amazing things with terrible tools will choose high quality tools because he can do so much better work with them.

      "It's... the rush to produce easy code." That continues to be a problem -- and everyone who wants to throw more bodies at a late project needs to be fed a copy of The Mythical Man-Month one page at a time -- but it's a different problem.

      "Formal verificatio

      • I think this is a bigger problem than is being recognized here. Most coders that I work with don't get to decide on ship dates. They may in a few cases have a claimed "veto power" if the code isn't ready, but they won't use it, because they'll be let go if they don't ship on time.

        The management that I see is too often of the "Give me a demo. What are you talking about, that works fine! Ship it! Let's move the press date up by two months!" variety. Some of the better ones are of the "What's our risk exposure

  • by aglider ( 2435074 ) on Sunday October 02, 2016 @08:28AM (#52998567) Homepage
    Programming languages are just tools to help create programs.
    Flawed languages in the hands of skilled programmers can still allow for good programs.
    And vice versa.
    So my answer is: no, they don't.
  • by renzhi ( 2216300 ) on Sunday October 02, 2016 @08:50AM (#52998669)
    Posting this issue on /. will never produce any meaningful discussion, given the attitude of this crowd towards anything barely resembling formal engineering. When the majority think it's cool to write large and complex systems in languages which don't even support strong static typing, and would give you a blank stare if you asked them why they think their code is correct, there is simply no place for any serious discussion. Good tools certainly do not guarantee a good outcome, but why do we think bad tools would have a good outcome, given the same pool of talent? If we can't fix bad programmers, why not think about creating better tools? Formal engineering is hard, and no one said it's easy, but that's no reason not to strive for better ways.
  • Their article calls for LangSec testing, and applauds the use of languages like Go and Rust over memory-unsafe languages like C.

    Criticizing C and C++ for being unsafe is quite justified (although both C and C++ can, in fact, be implemented in a type-safe manner). But then lauding language turds like Go and Rust for being safe is laughable. There are plenty of mature safe languages around: Java, C#, Swift, SML, OCaml, Scala, F#; even Python and Clojure are safe (though dynamically typed). In fact, safe langu

  • New languages and methods always have the advantage over older ones that the big old spaghetti projects written in the newer language with the new method do not exist yet.

    The problem is right here:
    > opting to cram an enormous amount of unnecessary complexity

    Doesn't matter which language is used - as complexity goes up so does the bugginess and new features are increasingly difficult to get working.

    This, together with the fact that nobody wants to admit when they think the system has become too complex, afraid to loo

  • by charronia ( 3780579 ) on Sunday October 02, 2016 @09:58AM (#52998927)

    I think part of it is that software developers simply do not have seas of time to optimize their code to perfection. There's a strong culture of cranking out features as quickly as possible, because otherwise competitors might beat you to the punch.

"Mach was the greatest intellectual fraud in the last ten years." "What about X?" "I said `intellectual'." ;login, 9/1990

Working...