Programming Government Security

NSA Urges Organizations To Shift To Memory Safe Programming Languages (nsa.gov) 196

In a press release published earlier today, the National Security Agency (NSA) says it will be making a strategic shift to memory safe programming languages. The agency is advising organizations to explore such changes themselves by utilizing languages such as C#, Go, Java, Ruby, or Swift. From the report: The "Software Memory Safety" Cybersecurity Information Sheet (PDF) highlights how malicious cyber actors can exploit poor memory management issues to access sensitive information, promulgate unauthorized code execution, and cause other negative impacts. "Memory management issues have been exploited for decades and are still entirely too common today," said Neal Ziring, Cybersecurity Technical Director. "We have to consistently use memory safe languages and other protections when developing software to eliminate these weaknesses from malicious cyber actors."

Microsoft and Google have each stated that software memory safety issues are behind around 70 percent of their vulnerabilities. Poor memory management can lead to technical issues as well, such as incorrect program results, degradation of the program's performance over time, and program crashes. NSA recommends that organizations use memory safe languages when possible and bolster protection through code-hardening defenses such as compiler options, tool options, and operating system configurations.
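For those who want a concrete starting point, here is a rough sketch of the kind of compiler-option hardening the sheet alludes to, assuming a GCC toolchain on Linux; the exact flag set below is illustrative rather than quoted from the NSA report:

    # warnings treated as errors, plus commonly used exploit-mitigation options
    gcc -O2 -Wall -Wextra -Werror \
        -fstack-protector-strong -D_FORTIFY_SOURCE=2 \
        -fPIE -pie -Wl,-z,relro -Wl,-z,now \
        -o server server.c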
The full report is available here (PDF).
  • Because capitalism (Score:5, Insightful)

    by Gabest ( 852807 ) on Thursday November 10, 2022 @06:27PM (#63042175)

    Making everything more secure in the last 40 years of me following computers and such did not eliminate viruses at all, but stripped our access to our devices bit by bit. I want to hack my own ROM onto my phone, I want to exploit my router and put OpenWrt on it, I want to boot any PC operating system from USB, I want to record whatever is on the screen, I want retro console emulation.

    • by flyingfsck ( 986395 ) on Thursday November 10, 2022 @06:41PM (#63042205)
      In my experience, C# has memory leaks.
      • Every language will leak memory. What I'm convinced the complaint actually is: people on the heap side of development don't want to tangle with platform-level and architecture-level memory problems when it's their job to just give users what they ask for.

        • I fail to see users asking for apps that crash. Am I looking in the wrong feature request list?
        • by lsllll ( 830002 )

          Every language will leak memory

          Funny how I didn't ever have that problem while programming in the 80s. It's not the language. It's the implementation and careless programming.

          • by theshowmecanuck ( 703852 ) on Friday November 11, 2022 @01:05AM (#63042737) Journal

            From current programmers who think programming is implementing frameworks many layers of abstraction away from anything actually dealing with memory, but that eats memory regardless. They don't care if it leaks; just throw ever more powerful computers at it, machines that now support 64GB and more of RAM. When you are sitting so high up on a stack of frameworks, it's impossible to see anything leaking so far down at the bottom.

          • Everybody who has written more than trivial software has made mistakes that leaked memory. If you don't think you did, it just means you didn't discover them.

      • In my experience, C# has memory leaks.

        Memory leak and memory safe aren't (necessarily) the same thing.

      • by Spazmania ( 174582 ) on Friday November 11, 2022 @03:49AM (#63042861) Homepage

        In my experience, C# has memory leaks.

        Every language has memory leaks. That's about garbage-collected memory versus not; it has nothing to do with memory safety.

        Memory safety is about whether it's possible to reference memory not associated with the handle. Here's a simple example: what happens when:

        b = -100
        a[b] = 5

        In a "safe" language, this either throws an error or does nothing at all. In an "unsafe" language like C, this overwrites memory in some random location that doesn't belong to array "a".

        Unsafe languages can be much faster because they don't have to check whether "b" is valid for "a"'s current size before writing the value 5 to the location. In fact, there are some really hyper-optimized things you can do in C that just aren't possible in other languages. But unless you need that hyper optimization (you rarely do) you're better off using a memory safe language.
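        To make that concrete, here's a minimal C sketch of the scenario (the variable names are made up); the out-of-bounds write is undefined behavior, so in practice it may corrupt a neighboring variable, crash, or silently appear to work:

        #include <stdio.h>

        int main(void) {
            int a[8] = {0};    /* the only memory that legitimately belongs to "a" */
            int secret = 42;   /* an unrelated variable that happens to live nearby */
            int b = -100;

            a[b] = 5;          /* no bounds check is emitted in C, so this scribbles
                                  on whatever happens to sit at that address */

            printf("secret = %d\n", secret);   /* may or may not still print 42 */
            return 0;
        }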

      • Memory safe means you can't have buffer overflows that clobber data. It's not a statement about the potential for memory leaks. And yes you can have some kinds of memory leaks in C# and Java.
        • And yes you can have some kinds of memory leaks in C# and Java.
          You can have memory leaks in every language.
          Just keep allocating memory ...

          It is the fault of the programmer, not the language.

          However, DamnOrigon pointed out an exception: recursive calling of closures in Perl (not sure what the leak is; it seems the closures are heap objects that later do not get collected, because Perl uses reference counting instead of GC). However, that implies that you can construct memory leaks in any way, not only via rec

    • It's not harder; it requires not being lazy, which could be worse.
    • Except for the part where you can still do all of those things, you have a totally valid argument.

    • Making everything more secure in the last 40 years of me following computers and such did not eliminate viruses at all

      The goal is not to eliminate risk, it is to make exploitation harder. I still have not-so-fond memories of connecting a Windows XP machine to the internet only to have it automagically owned before Windows Update was even able to download security fixes for the major wormable issue du jour.

      You just think in absolutes, but malware has gotten significantly more difficult to deploy without some form of user interaction. The fact that stupid users continue to exist notwithstanding.

  • by devslash0 ( 4203435 ) on Thursday November 10, 2022 @06:30PM (#63042183)

    Given Python's current popularity, why is it not mentioned in the paper?

    • by CaptainLugnuts ( 2594663 ) on Thursday November 10, 2022 @07:29PM (#63042323)
      Why? Because nobody does anything serious in Python. All the serious stuff is in libraries written in a more efficient language.
      • Re: (Score:2, Insightful)

        by TwistedGreen ( 80055 )
        Like Javascript?
      • Why? Because nobody does anything serious in Python.

        I wish that were true, but there are now tons of examples of influential internet-facing projects written mostly in Python because the devs were afraid of curly braces. Dropbox. Instagram. Pinterest. Spotify.

      • That is wrong.

        E.g. Plone ... https://plone.org/ [plone.org]

        All the serious stuff is in libraries written in a more efficient language.
        Nope. The serious stuff is the Python code linking those libraries together.

      • I don't agree with this statement. I've been a Python dev for the past 15 years, out of which 11 professionally. A LOT of serious projects are written in Python. Instagram is written in Python, for example. So are significant parts of Dropbox, Ansible, Spotify, World of Tanks, and many more. Most pentesting tools these days are also written in Python. Oh, and serverless architecture is only going to boost the use of this language even more.

        What I'm trying to say is that just because you personally are not a

    • Re: (Score:2, Interesting)

      Because it's a gigantic pile of shit.

      There is a small group of things that Python actually does really damn well. But everything else, it's worse than nearly every other tool.

      Notably, one of the few things Python does do really well, is make it very easy to use C libraries.
      And there we are, full circle.
  • Should we trust their advice? Maybe they're trying to lull people who use "memory-safe languages" into a false sense of security.

  • by greytree ( 7124971 ) on Thursday November 10, 2022 @06:32PM (#63042189)
    .. is a sure sign that they think hacking costs the US much more in security terms than the NSA thinks it can get from hacking other countries.

    In other words, China, North Korea and Russia are winning this war.
  • by franzrogar ( 3986783 ) on Thursday November 10, 2022 @06:37PM (#63042197)

    Do they reaaaaaaaaally think we are so gullible as to do whatever the NSA "urges"?

    And moreover... suggesting C#, Go, Java, Ruby, or Swift as *the* languages... ROFL.

    Sorry, but no.

    • Actually it's a double bluff. The NSA knows how contrarian many programmers are, especially C fanatics, so this is really a ploy to get you to cling to C ever more tightly. Lots of juicy exploits for them that way, you see.

  • I am really liking the new & improved approach to security & spying. Agencies have come to remove their dependence on backdoors and are instead sharing tools, software & expertise for creating more secure software. Of course the spying continues unabated with metadata, but that is not a bad thing imo.
    • And... they got ya. All this means is that they have managed to get bugs in all of these language specifications that are obscure and hidden enough that they are confident nobody else will find them. It is a clear indication that you should use anything but these languages if you're concerned about the NSA hacking your system. /s
    • "" Of-course the spying continues unabated with metadata, but that is not a bad thing imo.""

      Could be your sense of humor here, but... looking for clarification...

      So spying is OK as long as it's metadata being spied on?
  • No-Brainer (Score:5, Insightful)

    by DaPhil ( 811162 ) on Thursday November 10, 2022 @06:43PM (#63042215)

    This is so obvious that it is amazing it took so long to point this out at that level.

    (Most) human beings suck at writing good code, so the more help we can get from the language we use, the better.

    Is it possible to write good code in C? Sure! It's just way harder than doing the same thing in Java or .

    I wish things wouldn't turn so... IDEOLOGICAL once a programming language is involved.

    • Re: (Score:3, Interesting)

      by Anonymous Coward

      If you follow the CVE databases you will see that these "memory safe" languages have MORE vulnerabilities than skillfully written traditional software.

      Plus they're bloated and slow as hell (yes, even Rust).

    • Re:No-Brainer (Score:4, Interesting)

      by phantomfive ( 622387 ) on Friday November 11, 2022 @12:51AM (#63042723) Journal

      Is it possible to write good code in C? Sure! It's just way harder than doing the same thing in Java or .

      If you aren't writing good code in C, then you aren't writing good code in Java, either. Easier/harder doesn't apply here.

      Just look at all the security bugs in log4j.

    • by Tom ( 822 )

      Is it possible to write good code in C? Sure! It's just way harder than doing the same thing in Java or .

      It's not really HARDER. It just requires a lot more DISCIPLINE, and that's something coders are terrible at. Most of the stuff that the more modern programming languages take care of isn't stuff that's hard to do. It's just stuff that's TEDIOUS to do, and thus often gets overlooked, forgotten or ignored.

      • It's just stuff that's TEDIOUS to do, and thus often gets overlooked, forgotten or ignored.
        Not only that. It is simply error prone. More code to write, more opportunities for errors. On top of that, copy&paste programming: people copy a code block, start changing some variable names and forget something ...

        I always want to disable copy/paste on their machines ^_^

    • Efficiency is a genuine concern. When you have code running on billions of devices or millions of servers, the energy use is a real issue. I bet you could cut cooling costs in half at Google, Microsoft, or Amazon by rewriting everything (safely) in C.
  • by dohzer ( 867770 ) on Thursday November 10, 2022 @06:45PM (#63042219)

    Absolute drive-by on Rust. They just drove by without acknowledging it.

    • Re:Rust? (Score:5, Informative)

      by dohzer ( 867770 ) on Thursday November 10, 2022 @06:49PM (#63042229)

      Actually, reading the PDF reveals they did list it, but the /. author omitted it.

      Examples of memory safe language include C#, Go, Java, Ruby, Rust, and Swift

      • Right, just 4 days ago, this was posted on /.
        "Wired Hails Rust as 'the Viral Secure Programming Language That's Taking Over Tech' (wired.com)"

        https://developers.slashdot.org/story/22/11/05/2143237/wired-hails-rust-as-the-viral-secure-programming-language-thats-taking-over-tech

  • by david.emery ( 127135 ) on Thursday November 10, 2022 @06:50PM (#63042235)

    Ada, of course, is a memory-safe language by design since the original Ada83 version. And NSA did some work with the verifiable SPARK subset and proof tools. Yet another opportunity lost when the DoD walked away from its substantial investment in Ada because "it's not what industry is doing." (As if 'what industry is doing' was a justification...)

    I was sitting at a SIGAda meeting in the late '80s, the guy next to me had a badge that had his name and "US DoD". I said to him, "You must work at NSA, do you know xxx?" He was not happy, and said "Yeah, I'm his boss. How did you know?" "Everyone else here from DoD lists the Service (e.g. Army)/Agency. Only NSA says 'US DoD' "

    • Going a different way than industry means you can't find talent to hire.
      • by david.emery ( 127135 ) on Thursday November 10, 2022 @08:06PM (#63042389)

        My back of the envelope estimate was that DoD & contractors trained at least 40k developers in Ada by the early 1990s. Many of my friends from the Ada days moved into other languages because they couldn't find work in Ada, which they would have preferred. Some still do under-the-table work in Ada, when there's no language requirement for delivered code, just the working product.

        Sure, 'you go where the jobs are.' But as I said to a contractor that tried to justify its choice of programming languages by "ads in the LA Times," - "Don't try to justify the decision you've already made for 'business reasons' with some veneer of technical rationality." And this was a contractor that had a significant Ada investment in its own workforce.

        The real hell of it is that DoD abandoned Ada just when they could start to -measure- the return on early investments. There was a study by MITRE where in the early 1990s they recalibrated their COCOMO cost model from the first half dozen or so of large delivered Ada systems. To their surprise, the exponent for life-cycle costs written in Ada was 1.0. Ada83 was shown to be linear (rather than exponential) on lines of code for development and maintenance. I tried very hard to get that report released to the public, but that never happened. Instead, DoD abandoned Ada, moved to "languages used by industry", and paid the price.

        What I'm saying here is that (a) there have been memory-safe languages designed for production use for 40 years. (b) the results from memory-safe, strong typing, exceptions and modularity -have been shown- to have SIGNIFICANT value. (c) There are new languages that are getting lots of buzz like Rust, Scala, etc, and we'd be better off using them than C or C++. But (d) software development MANAGERS do not want to pay the cost for better software, either in training or tools. Sure, just toss bodies at the problem...

        See this https://www.lawfareblog.com/se... [lawfareblog.com] that includes an argument for vendor liability, something I've been demanding for at least 30 years. ONLY WHEN COMPANIES HAVE TO PAY A PRICE FOR BAD SOFTWARE, will they break the habits that produce bad software cheaply.

        • by PCM2 ( 4486 )

          The real hell of it is that DoD abandoned Ada just when they could start to -measure- the return on early investments.

          Yeah, but what I always heard was that there was going to be such an ongoing investment in things like compliance and certification that the Ada dog just wouldn't hunt. Hell, just buying an Ada compiler cost like $3,000. Might be fine for the timeshare era, but when you've got individual developers trying to build code on their desktops, that just ain't gonna cut it. The massive overhead involved in just launching an Ada project pretty much made it infeasible for anybody but the Federal government. And

          • That was true for a while. But there were also $100 compilers for PCs, and eventually the GNAT project produced a very high quality, open source, -free- Ada compiler (that continues...)

            But that also gets back to my "refusal to capitalize software engineers". A company that wouldn't spend $3k (what is that, a week's wage at the time?) for a tool that substantially reduced errors (including vulnerabilities) probably got what it paid for.

            • But there were also $100 compilers for PCs

              Those were very crappy compilers. They didn't support the entire language and produced bloated code.

              I worked on defense projects for several years. Ada was a requirement, but other languages could be used with a waiver.

              We would implement each project in C and then demo it. The clients were happy. We then told them they couldn't have it because it had to be rewritten in Ada, requiring months of development time, twice the memory, and a much faster CPU, which they would have to pay for.

              We would always get the

              • rewritten in Ada, requiring months of development time, twice the memory, and a much faster CPU,
                Then you were lying, as none of this is/was true. I hope this is not a habit.

              • Why not use the trick that seemed to satisfy most managers (who didn't read code)?
                Have your Ada program simply call your C program -- that was completely allowed in Ada!!!

    • During the time when Ada "was a thing", a C++ compiler (for PC/Windows) cost $1500 and upward. Over the next decade costs dropped significantly, down to $200 and less.
      Ada was always expensive, to the point of being unobtainable. Outside of universities, NASA, or defense projects, simply no one could afford it.
      Basically the same happened with Eiffel.

  • by iliketrash ( 624051 ) on Thursday November 10, 2022 @06:54PM (#63042243)

    From TFA:

    "Examples of memory safe language include C#, Go, Java®, Ruby, Rust®, and Swift®."

    Mind-boggling that Ada is not listed as an "example."

    • by gweihir ( 88907 )

      Ada is a failed language. That is why it is not listed. It has just far too many problems to be a sane choice.

  • by cyberfunkr ( 591238 ) on Thursday November 10, 2022 @07:36PM (#63042341)

    And by memory safe they mean hire developers that remember to check for Null Pointer Exceptions, Buffer Overflow, no passwords in the git repository... That sort of thing, right?

  • This is precisely what Congress is for. New law, as of 1 Jan 2024 all new applications or new application versions used by the US government or critical infrastructure must be written in a CISA approved memory safe programming language.

    Send to POTUS for signature.

    ... back to my insider trading.

    • ..... At which point no compilers can be written anymore. Machine language itself is not considered to be a memory safe programming language.
  • by WaffleMonster ( 969671 ) on Thursday November 10, 2022 @07:59PM (#63042377)

    Poor memory management can lead to technical issues as well, such as incorrect program results, degradation of the program's performance over time, and program crashes.

    One of the reasons I prefer C is that at least it is reliable. Garbage-collected languages are black boxes that only guarantee reachability, not that your program's resource utilization will be managed in an understandable or coherent manner. There is no guarantee it won't randomly slow your software to a crawl as it runs the system clear out of memory.

    While the forced-RAII options don't suffer from these problems, personally I think the most likely scenario for the future is that we will see C with constraints. The compiler/assistant analyzes code and provides some kind of feedback to the coder, so they understand what can be changed to make it tractable to verify behavior. Analysis capabilities would improve over time, granting more freedom to coders and requiring fewer changes to existing software.

    This at least has a snowball's chance in hell of working, versus telling everyone to rewrite everything in a completely different language.

    It's important to remember that over 90% of security compromises exploit people, not systems. Even if all of the security bugs in all of the world's software were patched overnight, very little would actually change.

    • Stack overflows and pointer mistakes are easy to find. Logic errors or just plain buggy code are much harder to discover. I've discovered that badly written C code can be reverse engineered and fixed. Even well-written JavaScript is essentially write-only code. You have no chance of looking at someone else's code and figuring out if the code does what the writer intended. So C will always have more security bugs discovered, because some of them are easy to find and because the code is easier to analyze th
      • by narcc ( 412956 )

          I've discovered that badly written C code can be reverse engineered and fixed. Even well-written JavaScript is essentially write-only code.

        Do you know how I can tell that you've never used either language? [ioccc.org]

        I'd love to see an example of JavaScript code you think is both well-written and "write only". Hell, I'd like to see you give an example of that in any language.

        • How to tell you have your head up your ass: you think Javascript can be well written.

            • Being angry and opinionated is not the same as being smart and knowledgeable, no matter how much it feels like it. You are not as good as you think you are, and JavaScript is not that bad.

          • You can write in JavaScript just like in C, and except for the absence of the "*-operator" and the idioms associated with it, there is no real visible difference.

    • by gweihir ( 88907 )

      With sound practices and tool use, we already have that. For example, using "gcc -Wall" and treating every result as a potential error is pretty good. Adding Valgrind and running good tests with it helps with things like uninitialized pointers and buffer-overflows. Fuzz-testing also helps a lot. Invalidating pointers to memory after freeing it helps a lot with use-after-free. Doing input validation instead of stupidly relying on input being what you expect helps. And so on.

      The real problem is incompetent co
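      To illustrate one of those habits, here is a tiny C sketch of invalidating a pointer right after freeing it (the helper name is made up, purely illustrative):

      #include <stdlib.h>
      #include <string.h>

      /* Free a buffer and null out the caller's pointer, so a later
         use-after-free dereferences NULL (a loud, easily caught crash)
         instead of silently reusing stale memory. */
      static void free_and_clear(char **p) {
          free(*p);
          *p = NULL;
      }

      int main(void) {
          char *buf = malloc(64);
          if (buf == NULL)
              return 1;                /* check the allocation before using it */
          strcpy(buf, "hello");
          free_and_clear(&buf);
          /* buf is NULL from here on; an accidental buf[0] now faults
             immediately instead of corrupting the heap. */
          return 0;
      }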

      • With sound practices and tool use, we already have that. For example, using "gcc -Wall" and treating every result as a potential error is pretty good.

        You mean adding -Werror :)

        Adding Valgrind and running good tests with it helps with things like uninitialized pointers and buffer-overflows. Fuzz-testing also helps a lot. Invalidating pointers to memory after freeing it helps a lot with use-after-free. Doing input validation instead of stupidly relying on input being what you expect helps. And so on.

        Might I a

  • I suspect the Voyager code was written in assembler; still going.
    • by gweihir ( 88907 )

      The Voyager code was written by engineers that knew what they were doing and using sound engineering processes. Most software today is not and that is the actual problem.

  • I'm pretty sure that step #1 is a non-starter:
    My employer's contingency plan is that I will fix it . . . usually in realtime.

    At some point it ceases to be paranoia once you can chart getting spanked on a graph.

  • The LISP family of languages is also memory-safe, by design, and has been since 1958.

    It is pre-dated only by Fortran (1954), which is not memory-safe, to the best of my recollection.

    Just sayin'.

    • Re:LISP? (Score:4, Funny)

      by dsgrntlxmply ( 610492 ) on Thursday November 10, 2022 @09:38PM (#63042537)
      Years ago I attended a public talk by Henry Baker on realtime garbage collection. Someone asked if Symbolics Lisp machines used any of the techniques described. I don't recall if it was Baker or someone else who said that people typically rebooted the machine when they believed that a major GC was needed.
      • by gweihir ( 88907 )

        GC is a really hard problem if you need to do it generically. For example, I implemented some custom memory management for a large in-memory table a few years back, because one hard requirement was that users must not notice any delays and there were hard constraints on allocated memory. A generic GC cannot ensure both. It will either lock things up too long or it will leave too much memory allocated that could be collected.

        What this boils down to is that generic GCs are limited and sometimes you need to do it
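        To give a flavor of what such custom memory management can look like, here is a minimal fixed-block pool allocator in C; this is a generic sketch, not the implementation described above. Reserving every block up front bounds both the memory footprint and the worst-case latency, which a general-purpose GC cannot promise:

        #include <stddef.h>

        #define BLOCK_SIZE  256
        #define BLOCK_COUNT 1024

        /* All memory is reserved at startup, so the footprint is fixed
           and alloc/free are O(1): no collector pauses, no unbounded growth. */
        static unsigned char pool[BLOCK_COUNT][BLOCK_SIZE];
        static void *free_list[BLOCK_COUNT];
        static size_t free_top;

        void pool_init(void) {
            for (free_top = 0; free_top < BLOCK_COUNT; free_top++)
                free_list[free_top] = pool[free_top];
        }

        void *pool_alloc(void) {
            return free_top ? free_list[--free_top] : NULL;   /* fail fast when exhausted */
        }

        void pool_free(void *block) {
            free_list[free_top++] = block;
        }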

        • Android just requires you to have more RAM, mainly because of GC. I don't think it matters with the beefy phones of our era.

          • by gweihir ( 88907 )

            If you already are within 30% of the max the hardware can give you, that is not an option. Also, I am talking about a server/proxy situation with a few 100 people accessing things concurrently. No idea why you bring up a mobile os targeted at weak-ass devices with slow networking (relatively speaking).

        • because one hard requirement was that users must not notice any delays and there were hard constraints on allocated memory. A generic GC cannot ensure both.
          Of course it can. And there are super simple solutions for it.

          I implemented some custom memory management for a large in-memory table a few years back,
          While it is an interesting experiment, it is kind of doomed to fail. But one learns from it. And it might be fun.

          You could have taken a shortcut by simply reading a book about GC(s) ...

    • I haven't written any Fortran in decades, but fondly remember the EQUIVALENCE statement. It allowed for all sorts of mischief in your memory space.
  • by gweihir ( 88907 ) on Thursday November 10, 2022 @11:09PM (#63042637)

    Yes, unless you have a good reason to use a non-memory safe language, you should use a memory safe one, but there _are_ good reasons to use non-memory safe languages in quite a few situations. As always, "the right tool for the job" is paramount and risk-management must be a part of that decision. But if you have good reasons to use a specific tool, then use it. Professional chefs will not stop using chef's knives because you can hurt yourself really badly with them if you do not know what you are doing.

    Hence, for example, C will not vanish because nothing can replace it in a lot of applications, and it is a _good_ thing that we have C and not a zoo of languages filling its role. There are reasons C has been in the top-3 TIOBE index languages forever, and a lot of those are and will remain good reasons.

    Of course, when using a professional tool (which often is dangerous in some way), the qualification of the tool-users becomes critical. If that is not there, things go to hell. Same if processes (architecture, design, review, testing, etc.) are not good. But the primary problem is not the tools used. The reason why so much software (and a majority of it actually written in memory-safe languages) is so bad is that it is often produced more cheaply than is possible, and that cannot work. If you let a gardener calculate the parameters of a bridge, chances are it will collapse. Chances are also that the ones who allowed this to happen will face personal punishment. And that is something missing in software creation and maintenance: professional standards and personal accountability. Language used is really a minor concern.

  • ... now offer enough other vulnerabilities to compensate for much of the gains. In its extreme case one can exploit a large number of software projects by posting, then changing, a library like "leftpad". In other cases like Log4J this is caused by well documented, but unexpected, behaviour of widely used dependencies.

    In any case, instead of providing a memory safe alternative to C, most "C Alternatives" have integrated dependency management systems instead of specifying standard libraries.

    • The (in?)famous log4j bug was not well documented.
      It existed only in one version, and that version was removed from all repositories ASAP. No idea what you mean by "dependencies"; the bug was in log4j, not in any of its dependencies.

      And: it was not a typical bug but an intentional introduction of a "feature" that simply was nonsense.

  • on a low-cost microcontroller. Otherwise it's business as usual using C/C++. I'm surprised they didn't mention Rust, though.
  • by LordHighExecutioner ( 4245243 ) on Friday November 11, 2022 @02:52AM (#63042845)
    Quoting Bob Pease [jsneidhart.de]: "my favourite programming language is solder!".
  • If they suggest C# or Java over languages such as C/C++, they have a screw loose for sure. We live in a world where power usage reduction is a thing, for the environment and all. Java/C# are incredibly inefficient and bloated languages, and a bad software developer can create buggy and wasteful applications with any language under the sun.

    What they should be doing instead is research on tooling for analysis/modification of C/C++ code or other low-level efficient languages, so that the existing mega code
    • Java/C# are incredibly inefficient
      That is simply wrong.

      and bloated languages,
      Bloated compared to what? Both are certainly "more bloated" than C, and both are certainly less bloated than C++

      And a language is usually not called "bloated", just because it requires a bit more typing.

  • The English language and risk management are still things.

"Show me a good loser, and I'll show you a loser." -- Vince Lombardi, football coach

Working...