NSA Urges Organizations To Shift To Memory Safe Programming Languages (nsa.gov)
In a press release published earlier today, the National Security Agency (NSA) says it will be making a strategic shift to memory safe programming languages. The agency is advising that organizations explore such changes themselves by utilizing languages such as C#, Go, Java, Ruby, or Swift. From the report: The "Software Memory Safety" Cybersecurity Information Sheet (PDF) highlights how malicious cyber actors can exploit poor memory management issues to access sensitive information, promulgate unauthorized code execution, and cause other negative impacts. "Memory management issues have been exploited for decades and are still entirely too common today," said Neal Ziring, Cybersecurity Technical Director. "We have to consistently use memory safe languages and other protections when developing software to eliminate these weaknesses from malicious cyber actors."
Microsoft and Google have each stated that software memory safety issues are behind around 70 percent of their vulnerabilities. Poor memory management can lead to technical issues as well, such as incorrect program results, degradation of the program's performance over time, and program crashes. NSA recommends that organizations use memory safe languages when possible and bolster protection through code-hardening defenses such as compiler options, tool options, and operating system configurations. The full report is available here (PDF).
Because capitalism (Score:5, Insightful)
In the 40 years I've been following computers and such, making everything more secure did not eliminate viruses at all, but it stripped our access to our devices bit by bit. I want to hack my own ROM onto my phone, I want to exploit my router and put OpenWrt on it, I want to boot any PC operating system from USB, I want to record whatever is on the screen, I want retro console emulation.
Re: Because capitalism (Score:2)
Every language will leak memory. What I'm convinced the complaint actually is: people on the heap side of development do not want to tangle with platform- and architecture-level memory problems when their job is just to give users what they ask for.
Re: (Score:3)
Every language will leak memory
Funny how I didn't ever have that problem while programming in the 80s. It's not the language. It's the implementation and careless programming.
Re: Because capitalism (Score:5, Insightful)
From current programmers who think programming is implementing frameworks many layers of abstraction away from anything actually dealing with memory, but that eats memory regardless. They don't care if it leaks; just throw ever more powerful computers at it, machines that now support 64GB and more of RAM. When you are sitting so high up on a stack of frameworks, it's impossible to see anything leaking so far down at the bottom.
Re: (Score:2)
Everybody who has written more than trivial software has made mistakes that leaked memory. If you don't think you did, it just means you didn't discover them.
Re: (Score:3)
In my experience, C# has memory leaks.
Memory leak and memory safe aren't (necessarily) the same thing.
Re: Because capitalism (Score:5, Informative)
In my experience, C# has memory leaks.
Every language has memory leaks. That's about garbage-collected memory versus not; it has nothing to do with memory safety.
Memory safety is about whether it's possible to reference memory not associated with the handle. Here's a simple example: what happens when:
b = -100
a[b] = 5
In a "safe" language, this either throws an error or does nothing at all. In an "unsafe" language like C, this overwrites memory in some random location that doesn't belong to array "a".
Unsafe languages can be much faster because they don't have to check whether "b" is valid for "a"'s current size before writing the value 5 to the location. In fact, there are some really hyper-optimized things you can do in C that just aren't possible in other languages. But unless you need that hyper optimization (you rarely do) you're better off using a memory safe language.
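To make that concrete, here is a minimal C sketch of my own (not from the NSA sheet) of what the unsafe case looks like:

    #include <stdio.h>

    int main(void) {
        int a[10] = {0};
        int b = -100;
        a[b] = 5;               /* out-of-bounds write: C does no bounds check here */
        printf("%d\n", a[0]);   /* undefined behavior: may crash or silently corrupt data */
        return 0;
    }

A memory safe language either rejects this or raises a runtime error; C just scribbles on whatever happens to live 400 bytes before the array.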
Re: (Score:2)
And yes you can have some kinds of memory leaks in C# and Java. ...
You can have memory leaks in every language.
Just keep allocating memory
It is the fault of the programmer, not the language.
However, DamnOrigon pointed out an exception: recursive calling of closures in Perl (not sure what the leak is; it seems the closures are heap objects that later do not get collected, because Perl uses reference counting instead of GC). However, that implies that you can construct memory leaks in any way, not only via rec
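A minimal C sketch of the "just keep allocating" point (mine, deliberately silly):

    #include <stdlib.h>

    int main(void) {
        for (int i = 0; i < 1000000; i++) {
            void *p = malloc(4096);   /* allocate... */
            (void)p;                  /* ...and never free: a leak by construction */
        }
        return 0;
    }

The garbage-collected equivalent is keeping every allocation reachable - say, appending to a global list that is never cleared. The GC cannot reclaim what the program can still reach.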
Re: Because capitalism (Score:5, Insightful)
Maybe the backdoors are built in to these new compilers...
Or... maybe the NSA just wants apps to be generally safe from unintentional buffer overruns and race conditions, so that the only remaining back doors are the ones that companies explicitly add to the product per their 'request'.
Re: Because capitalism (Score:4, Informative)
The NSA has other ways to get to your stuff, while most cybercriminals don't have access to those ways. So making software more safe is usually a benefit.
Plus I'm sure there's quite a bit of internal conflict within the NSA about its two sometimes-conflicting missions.
Re: Because capitalism (Score:5, Insightful)
In managed languages, memory leaks generally refer to objects that the GC system, for whatever reason, inappropriately fails to reap.
These things are almost always a consequence of the particular implementation of the GC in that language's runtime.
For example, in perl, recursive closure calls will leak, unless you weaken the reference to the closure.
No other language that supports closures that I have ever used has that particular bug. It's a quirk of perl.
Re: Because capitalism (Score:2)
Angelo, you've come out of the gate saying something stupid, yet again.
His language of choice is Java and he thinks linked lists are the only good way to create a double-ended stack while insisting that having a CS degree means he's an expert. What else did you expect?
Re: (Score:2)
he thinks linked lists are the only good way to create a double-ended stack
Well, they're the easiest/quickest way, perhaps. Definitely not a good way, unless you like to build structures that defy any attempt at caching.
But ya, I've got a BS in CS as well. Previous discussions with him have led me to the determination that CS educations in Germany are trash.
Then again, my CS degree focused on cache architectures, compilers, kernels, and low-level programming primitives.
I'm not sure when Java became part of CS courses, but I suspect it has something to do with the declining qua
Re: (Score:2)
My language of choice is C++, idiot.
thinks linked lists are the only good way to create a double-ended
No, I do not think that. Idiot. Why would I?
while insisting that having a CS degree
You seem to mix me up with someone. Perhaps check your posting history?
he's an expert.
In programming? Most certainly. Are you?
Re: (Score:2)
In managed languages, memory leaks generally refer to objects that the GC system, for whatever reason, inappropriately fails to reap.
That is wrong.
Sorry, perhaps you should finally start reading something about programming.
For example, in perl, recursive closure calls will leak, unless you weaken the reference to the closure.
No other language that supports closures that I have ever used has that particular bug. It's a quirk of perl.
So: it is a bug. Good. So it is not "a feature of the language" - right? In
Re: (Score:2)
Sorry, perhaps you should finally start reading something about programming.
Shut up, Angelo.
You're conflating a programmer forgetting that he's got an object in scope, with a runtime that has lost an object due to whatever quirks exist in its reference counting implementation.
When someone makes the claim, "Language X leaks memory..." they're referring to the runtime implementation in that language, not programmers who have accidentally left references sitting in a property list of an object somewhere.
P.S. I doubt you can call a closure recursively; how would you do that? Assign the closure to a variable, and then, inside the closure, call the closure that was assigned to that variable, which is out of scope?
I don't care what you doubt, lol. I wouldn't have said it if I didn't know it c
Re: (Score:2)
BTW, here is a post about what is "wrong" with perl in that regard: https://metacpan.org/pod/Sub::... [metacpan.org]
Re: (Score:2)
That CPAN module is neat, but I just solve the problem myself by weakening the scalar reference.
use Scalar::Util qw(weaken);
my $closure;
my $s_closure = $closure = sub { my $n = shift; $closure->($n - 1) if $n > 0 };
weaken($closure);   # break the self-reference cycle; $s_closure keeps the strong ref
$s_closure->(3);
Basically, you break GC for $closure but leave $s_closure unaffected, with the correct reference count, and that way the environment of the closure won't be left dangling every time it's called from the outer scope.
As said, it's a quirk of perl.
Re: (Score:2)
It is harder.
Lazy implies you know what you should do, but you are too lazy to do it ...
Re: (Score:2)
Except for the part where you can still do all of those things, you have a totally valid argument.
Re: (Score:2)
In the 40 years I've been following computers and such, making everything more secure did not eliminate viruses at all
The goal is not to eliminate risk, it is to make exploitation harder. I have distinctly un-fond memories of connecting a computer with Windows XP to the internet only to have it automagically owned before Windows Update was even able to download security fixes for the major wormable issue du jour.
You just think in absolutes, but malware has gotten significantly more difficult to deploy without some form of user interaction. That stupid users continue to exist notwithstanding.
Python is memory-safe, too (Score:3)
Given its current popularity, why is it not mentioned in the paper?
Re: (Score:3)
Most python libraries are written in C.
Re: (Score:3)
Why? Because nobody does anything serious in Python.
I wish that were true, but there are now tons of examples of influential internet-facing projects written mostly in python because the devs were afraid of curly braces. Dropbox. Instagram. Pinterest. Spotify.
Re: (Score:3)
That is wrong.
E.g. Plone ... https://plone.org/ [plone.org]
All the serious stuff is in libraries written in a more efficient language.
Nope. The serious stuff is the Python code linking those libraries together.
Re: (Score:3)
I don't agree with this statement. I've been a Python dev for the past 15 years, out of which 11 professionally. A LOT of serious projects are written in Python. Instagram is written in Python, for example. So are significant parts of Dropbox, Ansible, Spotify, World of Tanks, and many more. Most pentesting tools these days are also written in Python. Oh, and serverless architecture is only going to boost the use of this language even more.
What I'm trying to say is that just because you personally are not a
Re: (Score:2, Interesting)
There is a small group of things that Python actually does really damn well. But for everything else, it's worse than nearly every other tool.
Notably, one of the few things Python does do really well, is make it very easy to use C libraries.
And there we are, full circle.
Re: (Score:2)
Rust is still new, new enough that it may be just a fad. Also, the long list of benefits coupled with a conspicuous lack of drawbacks indicates that it is probably 99% viral marketing and 1% ready. If it is a programming language, then stop treating it like a religion.
Re: (Score:3)
The number of people who love Rust is way higher than the number of people who actually use it.
Re: Python is memory-safe, too (Score:4, Informative)
The main problem is most people don't understand the concept of memory ownership. The good news is that rust's learning curve keeps getting gentler, because the borrow checker is a lot less strict than it used to be, and it gets less strict with almost every release.
I've only been using rust for two years and it's already a hell of a lot easier than what it was when I started. That, and the fact that:
- It has by far the most user-friendly compiler
- Its coding patterns are so flexible that they are in many ways as easy as scripting languages
- Traits are a hell of a lot easier to learn and master than classes and interfaces
- async/await is easily the most intuitive pattern for concurrency/parallelism of any language (something it has in common with c#) in addition to being data race free (which is hard to debug otherwise)
- of any language, it's the most likely to prevent you from running into unexpected runtime errors
I think rust has a ton of promise that is yet to be realized, even by its current users, let alone new ones.
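For anyone who hasn't met the ownership idea: it exists informally in C too, but only as a convention the programmer tracks by hand. A hedged sketch (the function names are made up):

    #include <stdlib.h>
    #include <string.h>

    /* Convention: consume() takes ownership of 'buf' and frees it.
       Nothing in C enforces this comment; rust's borrow checker does. */
    static void consume(char *buf) {
        /* ... use buf ... */
        free(buf);
    }

    int main(void) {
        char *buf = strdup("hello");
        consume(buf);    /* ownership moved: the caller must not touch buf again */
        /* consume(buf);    double free -- exactly the bug class rust rejects at compile time */
        return 0;
    }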
Re: (Score:2)
The main problem is most people don't understand the concept of memory ownership.
I think you underestimate most people. (Including mixing them up with each other, referring to your double linked list stack example and my "language of choice", lol)
And it seems you do not know that "memory ownership" is an artificial construct, which has nothing to do with:
a) the algorithms in your code
b) the data structures
c) the business model
d) or object model
You only have it: because otherwise manual memory management w
Spy agency explains how to avoid spying (Score:2)
Should we trust their advice? Maybe they're trying to lull people who use "memory-safe languages" into a false sense of security.
Interesting. NSA supporting safer programming... (Score:3)
In other words, China, North Korea and Russia are winning this war.
LOL ROFL *giggles* (Score:3)
Do they reaaaaaaaaally think we are so gullible as to do whatever the NSA "urges"?
And moreover... suggesting C#, Go, Java, Ruby, or Swift as *the* languages... ROFL.
Sorry, but no.
Re: (Score:2)
Actually it's a double bluff. The NSA knows how contrarian many programmers are, especially C fanatics, so this is really a ploy to get you to cling to C ever more tightly. Lots of juicy exploits for them that way, you see.
Great job on the turn-around (Score:2)
Re: Great job on the turn-around (Score:2)
Could be your sense of humor here, but... looking for clarification...
So spying is OK as long as it's metadata being spied on?
No-Brainer (Score:5, Insightful)
This is so obvious that it is amazing it took so long to point this out at that level.
(Most) human beings suck at writing good code - so, the more help we can get from the language we use the better.
Is it possible to write good code in C? Sure! It's just way harder than doing the same thing in Java or .
I wish things wouldn't turn so... IDEOLOGICAL once a programming language is involved.
Re: (Score:3, Interesting)
If you follow the CVE databases you will see that these "memory safe" languages have MORE vulnerabilities than skillfully written traditional software.
Plus they're bloated and slow as hell (yes, even Rust).
Re:No-Brainer (Score:4, Interesting)
Is it possible to write good code in C? Sure! It's just way harder than doing the same thing in Java or .
If you aren't writing good code in C, then you aren't writing good code in Java, either. Easier/harder doesn't apply here.
Just look at all the security bugs in log4j.
Re: (Score:2)
Is it possible to write good code in C? Sure! It's just way harder than doing the same thing in Java or .
It's not really HARDER. It just requires a lot more DISCIPLINE, and that's something coders are terrible at. Most of the stuff that the more modern programming languages take care of isn't stuff that's hard to do. It's just stuff that's TEDIOUS to do, and thus often gets overlooked, forgotten or ignored.
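To make "tedious" concrete, here is a minimal sketch of the classic C cleanup idiom; the error paths are exactly where the forgotten frees hide:

    #include <stdio.h>
    #include <stdlib.h>

    int process(const char *path) {
        int rc = -1;
        FILE *f = fopen(path, "rb");
        if (!f)
            return -1;
        char *buf = malloc(4096);
        if (!buf)
            goto close_file;   /* forget this branch and you leak the file handle */
        if (fread(buf, 1, 4096, f) == 0)
            goto free_buf;     /* every early exit must unwind everything acquired so far */
        rc = 0;                /* ... real work would go here ... */
    free_buf:
        free(buf);
    close_file:
        fclose(f);
        return rc;
    }

None of this is hard; it is bookkeeping. GC and RAII languages do that bookkeeping for you.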
Re: (Score:2)
It's just stuff that's TEDIOUS to do, and thus often gets overlooked, forgotten or ignored. ...
Not only that. It is simply error-prone: more code to write, more opportunities for errors. On top of that, copy&paste programming. People copy a code block, start changing some variable names, and forget something.
I always want to disable copy/paste on their machines ^_^
Rust? (Score:3)
Absolute drive-by on Rust. They just drove by without acknowledging it.
Re:Rust? (Score:5, Informative)
Actually, reading the PDF reveals they did list it, but the /. author omitted it.
Examples of memory safe language include C#, Go, Java, Ruby, Rust, and Swift
Re: (Score:2)
Right, just 4 days ago, this was posted on /.
"Wired Hails Rust as 'the Viral Secure Programming Language That's Taking Over Tech' (wired.com)"
https://developers.slashdot.org/story/22/11/05/2143237/wired-hails-rust-as-the-viral-secure-programming-language-thats-taking-over-tech
Re: (Score:2)
Someone at Wired has a friend who is a rust fan-boi. THAT should have been the headline.
I remember NSA at Ada meetings (Score:5, Informative)
Ada, of course, is a memory-safe language by design since the original Ada83 version. And NSA did some work with the verifiable SPARK subset and proof tools. Yet another opportunity lost when the DoD walked away from its substantial investment in Ada because "it's not what industry is doing." (As if 'what industry is doing' was a justification...)
I was sitting at a SIGAda meeting in the late '80s, the guy next to me had a badge that had his name and "US DoD". I said to him, "You must work at NSA, do you know xxx?" He was not happy, and said "Yeah, I'm his boss. How did you know?" "Everyone else here from DoD lists the Service (e.g. Army)/Agency. Only NSA says 'US DoD' "
Re:I remember NSA at Ada meetings (Score:5, Interesting)
My back of the envelope estimate was that DoD & contractors trained at least 40k developers in Ada by the early 1990s. Many of my friends from the Ada days moved into other languages because they couldn't find work in Ada, which they would have preferred. Some still do under-the-table work in Ada, when there's no language requirement for delivered code, just the working product.
Sure, 'you go where the jobs are.' But as I said to a contractor that tried to justify its choice of programming languages by "ads in the LA Times," - "Don't try to justify the decision you've already made for 'business reasons' with some veneer of technical rationality." And this was a contractor that had a significant Ada investment in its own workforce.
The real hell of it is that DoD abandoned Ada just when they could start to -measure- the return on early investments. There was a study by MITRE where in the early 1990s they recalibrated their COCOMO cost model from the first half dozen or so large delivered Ada systems. To their surprise, the exponent for life-cycle costs written in Ada was 1.0: Ada83 was shown to be linear (rather than super-linear) in lines of code for development and maintenance. I tried very hard to get that report released to the public, but that never happened. Instead, DoD abandoned Ada, moved to "languages used by industry", and paid the price.
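(For context: COCOMO estimates effort roughly as a * (KLOC)^b. With b around 1.5, going from 100 KLOC to 200 KLOC nearly triples the estimated cost; with b = 1.0 it merely doubles. That is what "linear on lines of code" buys you.)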
What I'm saying here is that (a) there have been memory-safe languages designed for production use for 40 years. (b) the results from memory-safe, strong typing, exceptions and modularity -have been shown- to have SIGNIFICANT value. (c) There are new languages that are getting lots of buzz like Rust, Scala, etc, and we'd be better off using them than C or C++. But (d) software development MANAGERS do not want to pay the cost for better software, either in training or tools. Sure, just toss bodies at the problem...
See this https://www.lawfareblog.com/se... [lawfareblog.com] that includes an argument for vendor liability, something I've been demanding for at least 30 years. ONLY WHEN COMPANIES HAVE TO PAY A PRICE FOR BAD SOFTWARE, will they break the habits that produce bad software cheaply.
Re: (Score:2)
The real hell of it is that DoD abandoned Ada just when they could start to -measure- the return on early investments.
Yeah, but what I always heard was that there was going to be such an ongoing investment in things like compliance and certification that the Ada dog just wouldn't hunt. Hell, just buying an Ada compiler cost like $3,000. Might be fine for the timeshare era, but when you've got individual developers trying to build code on their desktops, that just ain't gonna cut it. The massive overhead involved in just launching an Ada project pretty much made it infeasible for anybody but the Federal government. And
Re: (Score:2)
That was true for a while. But there were also $100 compilers for PCs, and eventually the GNAT project produced a very high quality, open source, -free- Ada compiler (that continues...)
But that also gets back to my "refusal to capitalize software engineers". A company that wouldn't spend $3k (what is that, a week's wage at the time?) for a tool that substantially reduced errors (including vulnerabilities) probably got what it paid for.
Re: (Score:2)
But there were also $100 compilers for PCs
Those were very crappy compilers. They didn't support the entire language and produced bloated code.
I worked on defense projects for several years. Ada was a requirement, but other languages could be used with a waiver.
We would implement each project in C and then demo it. The clients were happy. We then told them they couldn't have it because it had to be rewritten in Ada, requiring months of development time, twice the memory, and a much faster CPU, which they would have to pay for.
We would always get the
Re: (Score:2)
rewritten in Ada, requiring months of development time, twice the memory, and a much faster CPU,
Then you were lying, as none of this is/was true. I hope this is not a habit.
Re: (Score:2)
Why not use the trick that seemed to satisfy most managers (who didn't read code)?
Have your Ada program simply call your C program -- that was completely allowed in Ada !!!
Re:I remember NSA at Ada meetings (Score:4, Interesting)
During the time when Ada "was a thing", a C++ compiler (for a PC/Windows) cost $1,500 and upward. Over the next decade costs dropped significantly, down to $200 and less.
Ada was always expensive, verging on unobtainable. Outside of universities, NASA, or defense projects, simply no one could afford it.
Basically the same happened with Eiffel.
Ada not listed as example of memory-safe language (Score:4, Interesting)
From TFA:
"Examples of memory safe language include C#, Go, Java®, Ruby, Rust®, and Swift®."
Mind-boggling that Ada is not listed as an "example."
Re: (Score:2)
Ada is a failed language. That is why it is not listed. It simply has far too many problems to be a sane choice.
And by memory safe they mean... (Score:3)
And by memory safe they mean hire developers who remember to check for Null Pointer Exceptions, Buffer Overflows, no passwords in the git repository... That sort of thing, right?
We just had an election (Score:2)
This is precisely what Congress is for. New law, as of 1 Jan 2024 all new applications or new application versions used by the US government or critical infrastructure must be written in a CISA approved memory safe programming language.
Send to POTUS for signature.
Managed languages leak (Score:3, Interesting)
Poor memory management can lead to technical issues as well, such as incorrect program results, degradation of the program's performance over time, and program crashes.
One of the reasons I prefer C is that at least it is reliable. Garbage-collected languages are black boxes that only guarantee reachability, not that your program's resource utilization will be managed in an understandable or coherent manner. There is no guarantee the GC won't randomly slow your software to a crawl as it runs the system clear out of memory.
While the forced-RAII options don't suffer from these problems, personally I think the most likely scenario for the future is that we will see C with constraints. The compiler/assistant analyzes code and provides feedback so the coder understands what has to change for the tool to be able to verify behavior. Analysis capabilities would improve over time, granting more freedom to coders and requiring fewer changes to existing software.
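A fragment of this already exists: GCC's -fanalyzer flag (available since GCC 10) does path-sensitive static analysis of plain C and will, for example, flag a double free with no annotations at all. A minimal sketch:

    /* compile with: gcc -fanalyzer example.c */
    #include <stdlib.h>

    int main(void) {
        char *p = malloc(16);
        free(p);
        free(p);    /* the analyzer flags this path as a double-free (CWE-415) */
        return 0;
    }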
This at least has a snowball's chance in hell of working, vs. telling everyone to rewrite everything in a completely different language.
It's important to remember that over 90% of security compromises exploit people, not systems. Even if all of the security bugs in all of the world's software were patched overnight, very little would actually change.
In defense of C - many eyes (Score:2)
Re: (Score:2)
I've discovered that badly written C code can be reverse engineered and fixed. Even well written Javascript is essentially write only code.
Do you know how I can tell that you've never used either language? [ioccc.org]
I'd love to see an example of JavaScript code you think is both well-written and "write only". Hell, I'd like to see you give an example of that in any language.
Re: (Score:2)
How to tell you have your head up your ass: you think Javascript can be well written.
Re: (Score:2)
Being angry and opinionated is not the same as being smart and knowledgeable, no matter how much it feels like it. You are not as good as you think you are, and Javascript is not that bad.
Re: (Score:2)
You can write in JavaScript just like in C, and except for the absence of the "*-operator" and the idioms associated with it, there is no real visible difference.
Re: (Score:3)
With sound practices and tool use, we already have that. For example, using "gcc -Wall" and treating every result as a potential error is pretty good. Adding Valgrind and running good tests with it helps with things like uninitialized pointers and buffer-overflows. Fuzz-testing also helps a lot. Invalidating pointers to memory after freeing it helps a lot with use-after-free. Doing input validation instead of stupidly relying on input being what you expect helps. And so on.
The real problem is incompetent co
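As a concrete instance of the Valgrind point, here is a toy use-after-free (my example): it compiles cleanly but gets caught at runtime.

    /* gcc -Wall -g uaf.c && valgrind ./a.out */
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        char *p = malloc(8);
        if (!p)
            return 1;
        strcpy(p, "hi");
        free(p);
        return p[0];   /* Valgrind reports an invalid read of size 1 here */
    }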
Re: (Score:2)
With sound practices and tool use, we already have that. For example, using "gcc -Wall" and treating every result as a potential error is pretty good.
You mean adding -Werror :)
Adding Valgrind and running good tests with it helps with things like uninitialized pointers and buffer-overflows. Fuzz-testing also helps a lot. Invalidating pointers to memory after freeing it helps a lot with use-after-free. Doing input validation instead of stupidly relying on input being what you expect helps. And so on.
Might I a
Hmm (Score:2)
Re: (Score:3)
The Voyager code was written by engineers who knew what they were doing, using sound engineering processes. Most software today is not, and that is the actual problem.
Meh (Score:2)
I'm pretty sure that step #1 is a non-starter:
My employer's contingency plan is that I will fix it . . . usually in realtime.
At some point it ceases to be paranoia, once you can chart getting spanked on a graph.
LISP? (Score:2)
The LISP family of languages is also memory-safe, by design, and has been since 1958.
It is pre-dated only by Fortran (1954), which is not memory-safe, to the best of my recollection.
Just sayin'.
Re: (Score:2)
GC is a really hard problem if you need to do it generically. For example, I implemented some custom memory management for a large in-memory table a few years back, because one hard requirement was that users must not notice any delays and there were hard constraints on allocated memory. A generic GC cannot ensure both. It will either lock things up too long or it will leave too much memory allocated that could be collected.
What this boils down to is that generic GCs are limited and sometimes you need to do it
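For the curious, a toy sketch of the kind of fixed-pool allocator that trades generality for bounded, predictable behavior (all names and sizes made up; the real thing was surely more involved):

    #include <stddef.h>

    #define SLOT_SIZE  64
    #define SLOT_COUNT 1024

    /* One static arena: alloc and free are O(1), and total memory is fixed at
       compile time -- no GC pauses and no unbounded growth, by construction. */
    static unsigned char arena[SLOT_COUNT][SLOT_SIZE];
    static void *free_slots[SLOT_COUNT];
    static size_t free_top;

    void pool_init(void) {
        for (size_t i = 0; i < SLOT_COUNT; i++)
            free_slots[i] = arena[i];
        free_top = SLOT_COUNT;
    }

    void *pool_alloc(void) {
        return free_top ? free_slots[--free_top] : NULL;   /* fail fast at the cap */
    }

    void pool_free(void *p) {
        free_slots[free_top++] = p;
    }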
Re: (Score:2)
Android just requires you to have more RAM, mainly because of GC. I don't think it matters with the beefy phones of our era.
Re: (Score:2)
If you already are within 30% of the max the hardware can give you, that is not an option. Also, I am talking about a server/proxy situation with a few 100 people accessing things concurrently. No idea why you bring up a mobile os targeted at weak-ass devices with slow networking (relatively speaking).
Re: (Score:2)
because one hard requirement was that users must not notice any delays and there were hard constraints on allocated memory. A generic GC cannot ensure both.
Of course it can. And there are super simple solutions for it.
I implemented some custom memory management for a large in-memory table a few years back,
While it is an interesting experiment, it is kind of doomed to fail. But one learns from it. And it might be fun.
You could have taken a shortcut by simply reading a book about GC(s) ...
What nonsense (Score:3)
Yes, unless you have a good reason to use a non-memory safe language, you should use a memory safe one, but there _are_ good reasons to use non-memory safe languages in quite a few situations. As always, "the right tool for the job" is paramount, and risk management must be a part of that decision. But if you have good reasons to use a specific tool, then use it. Professional chefs will not stop using chef's knives because you can hurt yourself really badly with them if you do not know what you are doing.
Hence, for example, C will not vanish, because nothing can replace it in a lot of applications, and it is a _good_ thing that we have C and not a zoo of languages filling its role. There are reasons C has been in the top 3 of the Tiobe index forever, and a lot of those are and will remain good reasons. Of course, when using a professional tool (which often is dangerous in some way), the qualification of the tool-users becomes critical. If that is not there, things go to hell. Same if processes (architecture, design, review, testing, etc.) are not good. But the primary problem is not the tools used. The reason so much software (a majority of it actually written in memory-safe languages) is so bad is that it is often produced more cheaply than it can be done properly, and that cannot work. If you let a gardener calculate the parameters of a bridge, chances are it will collapse. Chances are also that the ones who allowed this to happen will face personal punishment. And that is what is missing in software creation and maintenance: professional standards and personal accountability. The language used is really a minor concern.
In other news: memory safe programming languages.. (Score:2)
... now offer enough other vulnerabilities to compensate for much of the gains. In the extreme case, one can exploit a large number of software projects by publishing, then changing, a library like "leftpad". In other cases, like Log4j, this is caused by well-documented but unexpected behaviour of widely used dependencies.
In any case, rather than providing a memory safe alternative to C, most "C alternatives" ship integrated dependency management systems instead of specifying standard libraries.
Re: (Score:2)
The (in?)famous log4j bug was not well documented.
It existed only in one version, and that version was removed from all repositories ASAP. No idea what you mean by "dependencies" - the bug was in log4j itself, not in any of its dependencies.
And: it was not a typical bug but an intentional introduction of a "feature" that simply was nonsense.
Let me know when any of those languages can run (Score:2)
Memory safe programming language? (Score:4)
Code analysis tools, not shitty languages (Score:2)
What they should be doing instead is research on tooling for analysis/modification of C/C++ code or other low-level efficient languages, so that the existing mega code
Re: (Score:2)
Java/C# are incredibly inefficient
That is simply wrong.
and bloated languages,
Bloated compared to what? Both are certainly "more bloated" than C, and both are certainly less bloated than C++
And a language is usually not called "bloated", just because it requires a bit more typing.
Safer not safe. (Score:2)
Re:One can write shitty code in any language. (Score:4, Insightful)
But it's MUCH EASIER to look at code and see it's shitty in some languages than others. After all, how many programming languages besides C have "obfuscation" contests (as if it was A Good Thing to write incomprehensible code...)
I have seen bad Ada. It was -obviously bad-. And that was actually A Feature, because I could go to the PM and say "Look, the software side of this system is in trouble." Now I was often called in when "They have an Ada problem, they can't get the code to compile." And that's because the code was bad! It was inconsistent in its use of types. Its package structure was convoluted, so they got into order-of-compilation problems. The Compiler was actually telling them "Your code is so messed up here I can't make sense of it." But of course, the -compiler- and the -language- got blamed.
Re: One can write shitty code in any language. (Score:3)
Forth 3 if honk then
Re: One can write shitty code in any language. (Score:2)
/. is stripping my heart tonight.
sorry but no (Score:2)