
CISA Boss: Makers of Insecure Software Are the Real Cyber Villains (theregister.com) 120

Software developers who ship buggy, insecure code are the true baddies in the cyber crime story, Jen Easterly, boss of the US government's Cybersecurity and Infrastructure Security Agency, has argued. From a report: "The truth is: Technology vendors are the characters who are building problems" into their products, which then "open the doors for villains to attack their victims," declared Easterly during a Wednesday keynote address at Mandiant's mWise conference. Easterly also implored the audience to stop "glamorizing" crime gangs with fancy poetic names. How about "Scrawny Nuisance" or "Evil Ferret," Easterly suggested.

Even calling security holes "software vulnerabilities" is too lenient, she added. This phrase "really diffuses responsibility. We should call them 'product defects,'" Easterly said. And instead of automatically blaming victims for failing to patch their products quickly enough, "why don't we ask: Why does software require so many urgent patches? The truth is: We need to demand more of technology vendors."

  • by Arzaboa ( 2804779 ) on Friday September 20, 2024 @03:07PM (#64803769)

    What is she smoking? I believe most of us have been demanding mostly the opposite of what Microsoft has been packaging for years. I mean, let's get real: who asked for Windows 11?

    Microsoft turned their patches into bloatware. While there may be an actual fix somewhere in the patch update, most of it is there to make sure you get more advertising and to turn back on things you've already turned off.

    --
    It is never too late to be what you might have been. - George Eliot

    • I blame anything "version 11" on Nigel Tufnel [wikipedia.org].

    • by Roger W Moore ( 538166 ) on Friday September 20, 2024 @03:32PM (#64803871) Journal

      I mean lets get real. Who asked for Windows 11?

      That's easy: Microsoft shareholders. MS is there to make money, not to make the best, most secure OS possible. It only has to be good enough, and secure enough, that people don't decide it's so bad they have to look for something else, and that's a pretty low bar given how hard it is to switch.

      • Re: (Score:2, Insightful)

        by Narcocide ( 102829 )

        The problem is, they've gotten there by knowingly lying about whether they're making the best and most secure OS possible. That's actually illegal, but they've gotten away with it because early on it wasn't clear to law enforcement and government officials (and common idiots) that it was possible to do any better, or that they weren't just incompetent. But I think we can all see now that it's far past time to stop letting this slide.

        • The problem is, they've gotten there by knowingly lying about whether they're making the best and most secure OS possible

          Other than a few specific releases that claim outside certification as being "secure" or "suitable for [certain] high-risk operations" I don't think Microsoft is marketing Windows as being secure, at least not in the "secure enough to run the guts of your nuclear plant safety systems"-secure sense of the word.

          To the contrary: I haven't read Windows 11's license agreement, but decades ago the Windows license agreement specifically disclaimed suitability for being used in things like nuclear-reactor-safety-s

          • What they say, or even used to say, in TV commercials, and what their third-party vendors still say, matters when it comes to truth-in-advertising laws. Covering their ass in the most recent version of the TOS doesn't make any of this okay while top-secret government departments and hospitals are still getting regularly pillaged for using it because they think they have no other options.

      • by Torodung ( 31985 ) on Friday September 20, 2024 @06:23PM (#64804381) Journal

        The shareholders will be upset if Microsoft actually starts paying damages for the negligence liabilities they have accrued over the years. TFA points out that it is an externality being shifted to the wrong parties. I don't care what the EULA says; criminal negligence and civil negligence are a thing precisely because the conduct is not accidental but willful. A choice "not to make the best, most secure OS possible" in order to boost profits for stakeholders is textbook negligence, if not malfeasance. It's a direct harm, and lots of paying customers have standing to consider legal action.

        The Clownstrike (i.e. CrowdStrike) incident first and foremost. That one, AFAIC, was not solely CrowdStrike's fault. Microsoft should do a better job securing and vetting newly introduced kernel extensions too. Like maybe tell people at boot and give the operator a chance to reject the module. On every boot, until it is dismissed, with a sensible timeout.

        However, expect to kiss backward compatibility goodbye if we go with that. Maybe expect to lose some performance hacks too. The actual truth is business demands that they keep around the insecure stuff, so they don't have to keep up a real software maintenance budget. There's a lot of negligence to go around here.

        TL;DR: Software maintenance and security is apparently too expensive for everyone involved.

        • There is much truth to this, although it's a Pandora's box.

          The reality is that most coders care about security and building secure platforms, and a handful do not. Some organizational practices are sound, and there is excellent QA; many apps and libs on the periphery have no QA, or are built on dodgy dependencies.

          Be real. It's the culture that has to change, and its values. Litigation might help, but it casts a wide and ugly net that has no referential and legal standards to fall back on-- just ha

          • by mbkennel ( 97636 )

            Coders care more about code quality

            Managers care more about finance and schedule and they control the coder's employment and compensation

      • by Z00L00K ( 682162 )

        Just make the standard EULA liability disclaimer and similar statements legally invalid.

        Yes, this would be a pain in the behind for many software companies, but in most other areas of engineering you have laws regulating the responsibility of the maker of something like a bridge.

    • by gweihir ( 88907 )

      What I would demand from Microsoft (or rather have enforced on them) is full liability whenever they do not follow the state-of-the-art. If it was not by accident, add triple damages on top. Obviously, they would go out of business with that, but that would be a good thing.

      • Obviously, they would go out of business with that, but that would be a good thing.

        As would every OSS developer or startup. Microsoft might be big enough to navigate such waters (read: lobby for a grandfather clause / government waiver), but such legislation would over-penalize those just trying to get started.

        It's also unnecessary. Software cannot do anything without hardware to run on, and it's not Microsoft providing it. The sysops and those hiring them have just as much culpability as the developers. Arguably more so if they do it for a living, as they are supposed to know when so

        • by gweihir ( 88907 )

          As would every OSS developer or startup.

          Stop building strawmen. Obviously FOSS (not OSS) would need some exception here. For example, anybody selling FOSS could be liable, but anybody just offering it would not be. FOSS could get professional, paid-for security review if important enough. The German government is doing that, for example. There are other approaches. Clearly FOSS is important enough now that some acceptable solution would be found. But you know what, Red Hat selling RHEL and _not_ having liability for what is in there is not accept

    • You mean the makers of Windows 11, the OS that still can't get a pinned program to start reliably when you click on it?

    • I believe most of us

      Who is this "us" you are talking about? Slashdot is not "us" globally. We are a curious and tiny group of nerds detached from the reality of how normal people use computers.

      Most people give zero fucks about our anti-Windows culture war.

  • by jddj ( 1085169 ) on Friday September 20, 2024 @03:08PM (#64803775) Journal

    Right?

    • There are niche situations where paying for perfection, or at least fully-known, avoidable failure modes, is called for.

      But paying for mathematically-proven-sound software [wikipedia.org] running on hardware you can trust rarely comes cheap.

      • by gweihir ( 88907 )

        Nobody is talking about that. What we need, though, is software actually developed according to the state-of-the-art and vendor liability for the actual damage done if it is not.

        That call for "mathematically proven software" is just bullshit.

        • Hmmm... Even OpenSSH has had a huge hole within recent memory, and another concerning one just recently.

          It's true that a lot of software is just insecure slop. But it's also true that nobody can write "defect-free" software. Nobody. Just as no car or pair of pants will be free of flaws if you look closely enough.

          The idea that it's 'kinda like math' with a "correct" answer is just not the case, outside very narrowly defined criteria. If people who think it is took a minute to see what formal verification m

          • by gweihir ( 88907 ) on Friday September 20, 2024 @06:32PM (#64804403)

            OpenSSH only had that hole because moron distro maintainers patched systemd crap into it. I am now preparing to compile from source to avoid that stupidity.

            But it's also true that nobody can write "defect-free" software.

            And, again (!), this is _not_ needed. Defects need to be a) reasonably sparse, b) not entirely stupid, and c) caught by redundancies. For example, the architecture pattern of "privilege separation" can nicely achieve (c). But you know what? When I teach that to my year 3 CS students, they have never heard of it. And that is just not acceptable, because it means the state of the art in secure software is not even taught in many places.
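            For anyone who hasn't met it: below is a minimal sketch of that privilege-separation pattern (illustrative only, POSIX assumed, error handling trimmed). A privileged parent does the one step that needs root, and the worker drops to an unprivileged user before it ever reads untrusted network input, so a parser bug ends up running as "nobody" rather than as root.

            #include <netinet/in.h>
            #include <pwd.h>
            #include <stdio.h>
            #include <sys/socket.h>
            #include <sys/types.h>
            #include <sys/wait.h>
            #include <unistd.h>

            int main(void) {
                /* Privileged step: only root may bind port 80. */
                int fd = socket(AF_INET, SOCK_STREAM, 0);
                struct sockaddr_in addr = {0};
                addr.sin_family = AF_INET;
                addr.sin_addr.s_addr = htonl(INADDR_ANY);
                addr.sin_port = htons(80);
                if (fd < 0 ||
                    bind(fd, (struct sockaddr *)&addr, sizeof addr) < 0 ||
                    listen(fd, 16) < 0) {
                    perror("bind/listen");
                    return 1;
                }

                if (fork() == 0) {                        /* unprivileged worker */
                    struct passwd *pw = getpwnam("nobody");
                    if (!pw || setgid(pw->pw_gid) != 0 || setuid(pw->pw_uid) != 0) {
                        perror("drop privileges");        /* refuse to keep running as root */
                        _exit(1);
                    }
                    int client = accept(fd, NULL, NULL);  /* untrusted input is handled only here */
                    char buf[512];
                    ssize_t n = read(client, buf, sizeof buf - 1);
                    if (n > 0) { buf[n] = '\0'; /* ... parse the request without root ... */ }
                    close(client);
                    _exit(0);
                }

                close(fd);                                /* parent keeps no client sockets */
                wait(NULL);
                return 0;
            }

            The defect in the parser may still be there; the redundancy just keeps it from handing out root, which is exactly the point of (c) above.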

            If we finally make software development an actual engineering task with skill and education requirements, exploitable bugs will get rare enough that most of the current attackers will go into other professions to avoid starving, and the remaining few will be government-level, with their work expensive enough to be rarely used.

            As to formal verification, I have some experience with that and it is definitely not the way to go except in very rare cases. I agree with you there.

            • by ksw_92 ( 5249207 )

              If we finally make software development an actual engineering task with skill and education requirements,

              Ah, so you're for licensing. Even PEs make mistakes, and even perfect systems can be damaged by factors outside the reasonable scope of engineering designers. Take the recent Key Bridge collapse in Baltimore as an example: well-designed and efficient, but built without scoping in a future where container ships got huge... the Dali is a "Neopanamax" ship. A real monster.

              Another example might be the recent battlefield evolution of armor in Ukraine. Russian designs have some pretty bad design flaws and the Ukr

              • by gweihir ( 88907 )

                If we finally make software development an actual engineering task with skill and education requirements,

                Ah, so you're for licensing. Even PEs make mistakes, and even perfect systems can be damaged by factors outside the reasonable scope of engineering designers.

                Yes. So? That is a non-argument. This whole thing is about risk-management and that means probabilities.

                • by ksw_92 ( 5249207 )

                  You are correct and I am not against licensing for certain classes of software development. Guys that do internal LOB apps that never touch the internet probably don't need to be licensed, for example.

                  My point is that licensing, while a responsible risk management tool, does not ensure good results over the entirety of a product lifespan. My thought is that, for major infrastructure, which may operate for generations and outlive the companies that built it, some sort of perpetual insurance or bonding should

          • But it's also true that nobody can write "defect-free" software

            C++ developers always claim they can. That's why they always pooh-pooh memory safety.

    • Re: (Score:3, Interesting)

      by Bongo ( 13261 )

      There's the notion of negligence (careless mistakes) and there's the notion of structural engineering (liability for miscalculations).

      But software does not resemble those much. Like the guy who wrote an open source library and one day got a letter from NASA demanding he prove its safety, because NASA was planning to use it on a Mars rover (something like that).

      Software has extreme complexity and fragility and change.

      But it is getting better in how we have better architectures -- remember when we di

  • by david.emery ( 127135 ) on Friday September 20, 2024 @03:22PM (#64803825)

    I've been calling for corporate and individual liability for bad software for at least 35 years. For individuals, that includes licensing software engineers under terms similar to those for other engineers (like civil engineers). Engineering liability also includes means to limit that liability.

    Corporations who deliver products that don't work should NOT be allowed to disclaim any liability, particularly through 'shrinkwrap/clickthrough licenses.'

    When corporations and corporate officers feel the pinch in their pocketbooks when they f**k up, then they'll take quality seriously.

    • by davidwr ( 791652 )

      I've been calling for corporate and individual liability for bad software for at least 35 years.

      This makes sense when life or limb is at stake. It also makes sense when the customer is willing to pay the high cost of software that doesn't have any bugs they can't live with.

      But in the everyday world of consumer software and a lot of business software, the cost of holding programmers and their employers liable for bugs would make a lot of projects economically infeasible.

      It would also pretty much kill open-source software unless said software was allowed to be distributed with a "use at own risk" claus

      • Liability applies where there's an exchange of money. If you sell something, you assume liability that the product actually does what you said it would. If you accept money, you should also assume liability for that work. If some company packages OSS in their product, it's their liability. And if a company offshores development, it does NOT offshore liability. So the argument "this will drive US programmers out of work" is silly. But maybe if the US is full of people who aren't sufficiently competent to w

        • CrowdStrike wasn't hacked. Their mess was entirely their fault. What I have a problem with is blaming developers when hackers break into a system using some vulnerability. Can you sue Master Lock when thieves use bolt cutters to break into your shed?

    • by gweihir ( 88907 )

      Exactly. Same here. Liability (and insurance) is the only way to fix this mess. What happens then is that insurers pretty quickly start demanding reasonable qualifications.

      Sure, we need some ideas for FOSS, but there are ways.

      • You need to rid yourself of the delusion that lawmakers will create laws that are acceptable to your world view (unless your world view insists on the end of small software developers). If lawmakers get involved, a simple echo server will cost $5M and take several years to make, and will cost each purchaser $15M to buy. FOSS will go away, and nothing will have changed except for the largest software vendors getting much larger and much more powerful. They will, once again, buy their way out of liability.

        We'

        • by gweihir ( 88907 )

          What you call "delusion" is what I call "knowledge of engineering history". I guess you lack that and hence claim nonsense.

    • You've been calling for an end to software.

      Great.

      I'm all for going Amish, but that's not possible short of cataclysm.

    • by Dadoo ( 899435 )

      I've been calling for corporate and individual liability for bad software for at least 35 years.

      LOL. The only kind of corporate liability I've seen in my lifetime is where the corporations get bailed out and the taxpayers pick up the tab.

      No. The only way you're going to get a company to change their ways is to stop using their products and impact their bottom line.

    • A long time ago before the laws got stupid one simple thing was obvious: if you make hacking illegal, the only hackers will be criminals.

      Well, now this is the world we live in. Hacking was freakishly stupidly made illegal and now most bugs are found by foreign hacking gangs running crypto extortion schemes. It's completely stupid. Your laws aren't making these computer systems more secure, they are making them less secure. Let the local nerds have a crack at it where using extortion would be illegal so a

  • There's a difference between making an imperfect product because making a perfect one is infeasible/too-expensive-to-sell, and deliberately putting in harmful imperfections with the intent of harming others.

    I also wouldn't fault customers for deferring patches until they are tested.

    The rest of the summary is mostly sensible: Defects should be called out as defects, glamorization doesn't help, and we should expect more from technology vendors.

    • by Rinnon ( 1474161 )
      For the sake of injecting a bit of extra nuance into a position I mostly agree with, I would suggest that there are certain types of negligence that can reasonably be categorized as villainous. Evil intentions are not a strict requirement of villainy. (For clarity, I'm not suggesting this situation fits the bill)
      • by davidwr ( 791652 )

        If you are thinking of things that currently qualify (or that, in your opinion, should qualify) under "criminal negligence" laws, I see where you are coming from.

        I wouldn't necessarily call those things "villainous" but I would agree that the person doing the act (or failing to fulfill a legal obligation) should be held criminally responsible.

        Here's one example:

        In much of my country, things like serving alcohol to someone who is obviously drunk then not taking reasonable action to prevent them from driving

    • by gweihir ( 88907 )

      Organizations that are willing to take a moderate gain for themselves in exchange for massive damage to society qualify as "villains", no exceptions. Seriously.

  • Because people like Mark Zuckerberg "want to really have a culture that values shipping and getting things out and getting feedback" [slashdot.org] rather than shipping a product free of security defects.
  • When governments make the decision that software needs to be intentionally less secure so they can spy on everyone (in the name of "think of the children" or "terrorism" or whatever) or when they intentionally hoard known security flaws for the same purposes, security is worse for everyone.

  • by lsllll ( 830002 ) on Friday September 20, 2024 @03:25PM (#64803843)

    I know of no software out there that guarantees itself to be defect-free. Even the Excel EULA tells you that Microsoft is not responsible for the calculations Excel does and that it's up to you to double-check that they're correct. It's impossible to build any sort of half-complicated software without bugs, let alone complicated software. Furthermore, many times when we write software, we take other people's software and libraries for granted. The clause in contracts that pertains to this usually says that the developer uses industry best practices for development. I bet even NASA has managed to send software with bugs up into space.

    She is correct that we need to demand more from technology vendors. And that's fine and dandy, as long as you've got the money to pay for it.

    • But outside of academic examples, it's not common, at least not yet.

    • by gweihir ( 88907 )

      That is a bullshit argument. The discussion is not about "defect free". The discussion is about software produced according to the state-of-the-art, with qualified people, well-tested and well-designed.

    • by Tom ( 822 )

      It's impossible to build any sort of half-complicated software without bugs, let alone complicated.

      True, but it absolutely is possible to build software with two or even three orders of magnitude fewer bugs than we currently do.

      That's not guessing, there are organisations who do that. It just requires non-hero coders and strict processes including proper root cause analysis of any bugs that creep in (and I mean actual root cause, not the "ok, we found someone to blame" bullshit that most people call by that name).

      • by lsllll ( 830002 )

        I agree. Look at NASA. I'm sure their software is orders of magnitude more bug free than the software out here, but it comes at a cost and someone has to be willing to pay that cost.

        • by Tom ( 822 )

          Correct. And it might also mean deciding not to have the latest bells & whistles from that new framework you pulled off definitelynotmalware.io yesterday.

          Or, in simple terms: Writing software like that isn't cool. It's tedious and precise.

  • by Tablizer ( 95088 ) on Friday September 20, 2024 @03:30PM (#64803865) Journal

    "why don't we ask: Why does software require so many urgent patches? The truth is: We need to demand more of technology vendors."

    1. Ensuring responsible coding usually isn't free; it takes monitoring.

    2. Technology moves too fast to define responsible system design in a clear way.

    3. Companies value short-term profits over long-term profits. It's similar to why neither political party pays down the national debt: they'd take a short-term hit to the economy for benefits that may only come after their term. "A dollar today is worth more than a dollar tomorrow".

    Fix the incentive systems instead of nagging.

  • by tinkerton ( 199273 ) on Friday September 20, 2024 @03:32PM (#64803869)

    We would never have had that worldwide outage if we'd listened to CISA. Oh wait...
    https://www.zdnet.com/article/... [zdnet.com]

  • by Rick Schumann ( 4662797 ) on Friday September 20, 2024 @03:34PM (#64803879) Journal
    Over all the years I've been a Slashdotter, and in other places on the Internet as well, my observation is that software engineers (or 'coders', as they seem to be called now, or 'programmers', which is what they used to be called) more often than not aren't allowed by their bosses or companies to dot all the i's and cross all the t's necessary to really deliver a software or firmware product that is truly bomb-proof, so stuff gets pushed out the door with holes in it.
    To be fair about it, though, 'due diligence' can only cover so much; no one is omniscient or precognitive, and you can't always think of every possible exploit someone might use, or every set of circumstances that might cause a piece of code to act in a way other than what was intended.
    But I'd say very often our old friend 'profit above all else' is to blame, along with poor decision-making by management types whose responsibility it is to decide what the requirements of software or firmware should be.
    Am I right, folks?
    • "Coders" and software engineers are very different. Most software engineers know how to code, but not the reverse. Engineers are the ones designing a building, and coders are the ones doing the actual welding a riveting.
      • I've worked with 'software engineers' who also were writing code; I'd assumed it's dependent on the size of the organization they're working at.
    • by ctilsie242 ( 4841247 ) on Friday September 20, 2024 @11:32PM (#64804831)

      As someone who has had the job title "Software Engineer", that is just how life is. The focus will always be getting a product out the door, and if there are some show-stoppers, oh well.

      The problem is that there isn't anything pushing back on the concept of "it builds, ship it!". Government isn't going to yank a corporation's articles of incorporation. The top brass will never see any penalties. In fact, companies have to do this, or else shareholders will sue because all the work on fixing stuff is work that could be done to make more features.

      Think government will do anything? It won't be the US. It likely will be Europe, but even then, business seems to be able to easily tap dance around stuff like the GDPR.

      Probably the best option is to see about funding F/OSS projects. The money spent for a year's license of some enterprise software, if added up, could probably get 100+ top notch developers to make a F/OSS project and do it right.

  • by Murdoch5 ( 1563847 ) on Friday September 20, 2024 @03:38PM (#64803905) Homepage
    People generally don't write insecure code, or use insecure hardware, intentionally. Sometimes, to be fair, the problem is a backdoor that was left open, but how often is the issue compatibility? Why do we need patches on patches? Why do you run a 20-year-old email server, upgraded to service pack 10.4, that needs Windows 7, hooked up to a mainframe?

    This idea that we need to hold the engineers accountable is fine, but then saying the user isn't the problem is f'ing stupid. If I had my say, all my users would be on Qubes OS, or Fedora, locked down with policy on top of policy, with updated software stacks and all the best security features enabled. The reality is I can't do that, and I really do have to support software, and systems, that are 20+ years old.

    What's her plan for creating ideal security? If I email her office, will the response emails be signed / encrypted with PGP, or will they be plain text, or even worse, HTML? Should we hold engineers accountable? Yes, absolutely, but a lot of the problems we have to work around aren't our fault.

    A real-world example: I have a library in my project that can't be updated. The update completely breaks our platform, and the work required to put everything back together again is idiotic, on the order of weeks. Moreover, they want to charge us a license fee in the thousands / tens of thousands, and force us into a support package which costs about the same, but they won't help during the upgrade.

    The library being on the old version blocks another library from being updated, which ultimately leaves the framework we use locked on version X instead of Y. X has some known bugs, not serious, but known, and the way to solve them is to update to version Y, which we can't do. In this case, I'm both the engineer and the user, so who's at fault? We need the old library to support an out-of-date stack that someone in the government uses, because why update?
    • fork/patch. red herring bullshit. try harder.
    • People, generally, don't write insecure code, or use insecure hardware, intentionally.

      They do if they don't want to get fired and replaced with someone who will shit out a mostly-functional product by the deadline promised by marketing.

  • by gavron ( 1300111 ) on Friday September 20, 2024 @03:40PM (#64803911)

    Another poster says "Demand More From Microsoft" but those of us in the industry have been doing that for literally almost 30 years!

    Windows for Workgroups was a popular update to Windows 3.1, but Microsoft wanted to sell licenses. So they created Windows 95, and in order to keep/maintain/convert current WFW users, Microsoft declared that Win95 would run whatever WFW could run. They even special-cased some software so Win95 would detect it and go into "super-emulation" or "bug-emulation" modes so the software would see it as true WFW instead of the Win95 it really was.

    In the process of doing so, big gaping security holes and shoddy programming weren't just continued... they were purposefully maintained, supported, and required by Microsoft. Win95 was revised majorly on the outside for Win98, and a bit on the inside for WinME, but it wasn't until the NT kernel was put in Win2000 that Microsoft truly had an opportunity to put in real multi-layer security, separate the user and kernel, etc. But, for the sake of backward compatibility, they didn't.

    Win2003, 2008, and the rest all continued this trend. Add a bit of "user verification" that was easily bypassed -- especially if Microsoft made its software think you were one of those "special case" pre-1995 pieces of software.

    ALL OF THIS was a nice warm sugary petri dish for growing the entire malware industry.

    Win8 didn't fix this. Win9 couldn't exist because older software checked the first digit of the Windows major version to see how it needed to frame its API calls, and if it starts with a "9" it's 95/98. Win10 and Win11 didn't fix this mess either.

    Don't Demand "More" From MIcrosoft. Demand the minimum in developing a secure OS or levy a fine. That's the government's operation and they're so deep in Microsoft's packet every time they hear "Linux" or "LibreOffice" they hiss like a Slytherin Vampire entering a church and being hit with a hose of holy water.

    To use an analogy, Microsoft is the home builder that sold you a house with a lock anyone can open in seconds, and walls that open up when someone hits them with a 2600Hz tone. An entire "cottage" industry built up around robbing people in these Microsoft houses. Sure, the criminal is the one who robs you... but when you buy a house with wall openings and doors that open too... and it wasn't disclosed to you except as "This house is 100% backward compatible with the empty lot that was here before"... that's where the blame should start.

    • by gweihir ( 88907 )

      Don't Demand "More" From MIcrosoft. Demand the minimum in developing a secure OS or levy a fine.

      We need one step more: if their products do damage (and they do so all the time), then unless they can prove they followed sound engineering practices, used qualified personnel, and respected the state of the art, they become liable for any and all damage. Like any other vendor of an engineering product, really.

    • Most homes in my country have locks that can be defeated in less than a minute by breaking a nearby window.

      Substitute "big truck moving at high speed" for "2600Hz tone" and most homes' walls will come tumbling down as well.

      I've been told that there is a cottage industry [google.com] that helps you break into a home in a less attention-grabbing way than breaking a window or ruining your truck to open up a wall.

      By the way, thanks for pointing out that the need for backward 100%-compatibility can lead to a less-secure environm

    • Maybe I'm blaming the victim here, but instead of "demanding" anything from them, people should go elsewhere. It's the only way they'll listen. Stop using Microsoft software.

    • No. Websites that run solely on Linux servers are no more secure than the ones that run on Windows.

  • by El_Muerte_TDS ( 592157 ) on Friday September 20, 2024 @03:40PM (#64803915) Homepage

    Software comes with no warranty because nobody is willing to pay for it.

    • by gweihir ( 88907 )

      Given the damage done, this model is not sustainable. For example, in Germany the damage from cybercrime was about 2,600 EUR per person in 2023. That is not a small problem anymore.

  • ...there is NO procedure that, if followed carefully, will produce perfectly secure, bug-free code.
    I do agree that managers often cut corners, hire cheap, poor-quality programmers, and impose unrealistic deadlines, but even the best of us can't create perfect code.

    • Re: (Score:3, Informative)

      by davidwr ( 791652 )

      ...there is NO procedure that, if followed carefully, will produce perfectly secure, bug-free code.

      In practical terms, at least for most modern situations, you are correct.

      But I would encourage you to read up on "formal verification" as it applies to computing. There are a few use cases where the work needed for formally-verified-correct algorithms and their implementations is worth the effort.
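      For a concrete taste of what that looks like in practice, here is a minimal sketch of a machine-checkable contract written in the ACSL annotation style used by tools such as Frama-C (the function and its spec are invented here purely for illustration). The annotations state what the code must guarantee; the verifier then tries to prove the implementation actually meets them.

      #include <stddef.h>

      /*@ requires n > 0;
          requires \valid_read(a + (0 .. n-1));
          assigns \nothing;
          ensures \forall integer k; 0 <= k < n ==> \result >= a[k];
          ensures \exists integer k; 0 <= k < n && \result == a[k];
      */
      int max_of(const int *a, size_t n) {
          int best = a[0];
          /*@ loop invariant 1 <= i <= n;
              loop invariant \forall integer k; 0 <= k < i ==> best >= a[k];
              loop invariant \exists integer k; 0 <= k < i && best == a[k];
              loop assigns i, best;
              loop variant n - i;
          */
          for (size_t i = 1; i < n; i++) {
              if (a[i] > best) best = a[i];   /* keep the running maximum */
          }
          return best;
      }

      Even in this toy, the specification and loop annotations are about as long as the code itself, which is why the approach pays off mainly for small, critical kernels rather than whole products.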

  • by gweihir ( 88907 ) on Friday September 20, 2024 @03:53PM (#64803969)

    Sure, the attacker are to blame to some degree as well, but they are just opportunists. The real problem are those that create the opportunity. They are also the only point where this problem can be fixed, because it is impossible to even begin to track down the attackers in a globally connected world.

    • This is like blaming stores for not locking up their products tightly enough. The problem is, if they lock it up too tightly, people won't shop. So they have to do this balance between security and accessibility. You know, kind of like all software security.

      Yes, there is a such thing as negligence. But most software security flaws are difficult to spot, not just the result of carelessness.

      • by gweihir ( 88907 )

        That is bullshit and you know it. Most software security bugs are due to pathetic incompetence of the coder, designer and architect. I guess you are somebody that has a lot to lose when you would be held accountable for your work.

        • Security is a moving target.

          Consider the C language, with its malloc() and free() functions, and pointers to access memory. By today's standards, this kind of design would be considered "incompetent." But when the language was built, this type of security was not top of mind. Were the designers of C incompetent and should they be prosecuted? Was the design "buggy"? I think not. It's not fair to apply today's standards to software that was designed 50 years ago.
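          To make the manual-memory point concrete, here is a minimal sketch (the session struct is invented purely for illustration) of the kind of defect C happily compiles: a use-after-free, which a memory-safe language would reject at compile time or trap at runtime.

          #include <stdio.h>
          #include <stdlib.h>
          #include <string.h>

          /* Hypothetical session record, purely for illustration. */
          struct session {
              char user[32];
              int  is_admin;
          };

          int main(void) {
              struct session *s = malloc(sizeof *s);
              if (!s) return 1;
              strcpy(s->user, "alice");
              s->is_admin = 0;

              free(s);            /* the session ends here...                     */

              if (s->is_admin) {  /* ...but the stale pointer is still read: a
                                     use-after-free. The compiler accepts it, the
                                     behavior is undefined, and an attacker who
                                     controls what lands in the freed slot may
                                     control is_admin. */
                  puts("granting admin access");
              }
              return 0;
          }

          None of which means C's designers were incompetent; it means the language leaves this entire class of defect to the programmer's discipline, which is exactly the trade-off being argued about here.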

          Should we *immediately* replace all C code with

          • by gweihir ( 88907 )

            Security is a moving target.

            Not really. Linus is right when he calls security bugs "just bugs". The only reason IT security feels like a moving target at the moment is because so much software and systems are so bad in this regard.

            Also I disagree on C. The design is in no way "incompetent". It is low-level, but C has a ton of advantages that are still valid today. You just need to be competent when you use it and you need to know when to be extra careful. Too many coders these days are simply incompetent and that is not the fault of t

  • 100% guarantee that "Jen Easterly" has never written any code for anything. She has no idea how to code securely and, worse, no idea of how vulnerabilities happen. Who would get into software development if they could be held liable for something like this? Even the smartest and most diligent person can't be expected to foresee every possible type of attack and vector. It's dumb. And btw, I'm not saying it cause all the code I wrote back in the 90s and prolly even early 2000s didn't give a fuck about security

    • jump up your own ass
    • Who would get into software development if they could be held liable for something like this?

      People who stood to make a lot of money. Now you can make a lot of money and not be liable for the product defects. It's not really surprising that we have a lot of insecure, crappy products out there with defects.

      You can't build houses without flaws either, but when a house burns down because the wiring was bad the people responsible are held liable. That's why we license electricians.

      It's long past time that we required software to be programmed by licensed programmers and software approved for release only

  • by vladoshi ( 9025601 ) on Friday September 20, 2024 @05:03PM (#64804197)
    Learn to comply with the standards real engineers have to, or face bankrupting your company and even going to prison?
    • Stop with this.

      The word engineer is a thousand years old.

      20th Century bureaucrats don't get to steal and redefine the word.

      Call it bonded-engineers or whatever is actually true.

  • This dim bulb is basically making the case that the job and the agency are unnecessary.

  • by mukundajohnson ( 10427278 ) on Friday September 20, 2024 @05:19PM (#64804237)

    When you construct something that people physically use, there are a lot of codes and inspections that support safety and discourage fires and such. In software infrastructure, there are certifications and authorizations such as SOC-2 and FedRAMP (requiring a security audit to maintain "secure" status), but I'm not sure about anything for the software itself.

    Software gets extremely complex, so obviously inspection isn't as easy, but what if B2B vendors were required to have regular code reviews ("inspections") from a third party to help with security? I'm not sure how cost-viable that is when it comes to software, but something should be done to go after vendors that really don't give a shit while milking their customers for huge profit margins.

    • SOC, FedRAMP, and other items don't really affect code quality. They will tell you about setting up machines via STIGs, but code quality isn't something those guidelines touch in any way, shape, or form. At most, they may touch on "program 'A' has to access program 'B' via TLS, use only this type of auth", but they don't really go into how much defensive programming is in the code.

      • One other thing that comes to mind is that FedRAMP requires authorized vendors to fix their code within a short timeframe (depending on severity) if a vulnerability is discovered. To pass that security control you need to prove that your engineering process will be effective.

        But yeah, nothing I've heard of enforces audits while building the program to begin with.

  • It doesn't exist. There is only tolerance and price. Lower tolerance == more money. And you have to specify things like temperature. You ok with your computer only being "valid" at 70F +/- 0.0001F?

    If you want legal liability for bugs, you are asking for VERY expensive software. And it would require ditching most hardware as well. All the bugs in CPUs lately? And patches are not good enough, so not only is hardware going to be expensive, it's going to be SLOW. There's a reason space missions don't use the la

  • What a maroon. No wonder she's in government.

    Which also means she could easily go to NASA and study how they produced mostly-correct code for the Space Shuttle.

    I forget the exact number from my Software Engineering class in college, but it's something like 300 man-hours per LOC of production code. That reduced the error rate to something like 1 in 1000 LOC. At the time it was estimated that the Shuttle was launching with probably 40 unknown bugs.

    Why not just have the EPA require 99% efficient engines in cars?

    • Digressing slightly, I wonder if writing the Space Shuttle code in Ada helped with the bugs per LoC, or made it worse.

      Overall, I think "AI"... or in reality, an expanded "lint" binary which does deeper checks than just syntax, may help with this... but who knows.

  • And raped and pillaged your tiny bits. Such victim blaming here.
  • If a store has merchandise on shelves where people can easily pick it up and walk away with it, are the store owners the "real villains" for not implementing sufficient security? Hardly.

  • "Other" engineering professions have strict compliance for safety reasons, not for security.
    - Safety is protection against unintentional harm to humans.
    - Security is protection against intentional harm to humans.
    - Everything else falls under general protection, e.g. robust uptime.

  • This is a drum Easterly has been beating since she took the helm of the US cyber defense agency. She tends to bang it louder at industry events, such as the annual RSA Conference where she told attendees secure code "is the only way we can make ransomware and cyber attacks a shocking anomaly."

    This is alarmingly naive. The vast majority (over 90%) of compromises exploit people, not systems.

    But the pledge remains voluntary, so software companies who fail to follow its guidelines - such as increasing the use of multi-factor authentication across their products and reducing default passwords - aren't going to be slapped down if they ignore it.

    This is the wrong thing to focus on. The most salient problem with authentication is ubiquity of insecure authentication methods which completely lack verifier impersonation resistance.

    In the face of phishing attacks, the single biggest threat faced by users, a single factor properly implemented is worth more than multiple factors that are not. Virtually nobody implements knowledge factors with verifier impersonati

    • > This is alarmingly naive. The vast majority (over 90%) of compromises exploit people, not systems.
      That is correct for initial access. Privilege escalation and lateral movement are then done using system exploits.

  • You should be able to have your passwords set to "password" and your SQL queries unsanitized, and no one would even consider trying to hack in. The problem is a severe lack of morality in the human race in general.
  • Absolute security on your basic office system? Sure.

    Your OS now costs $100,000 per client seat.

    Oh, you want basic office apps? Wow. Big spender! Well, here's Word and Excel; $30,000. Each.

    Outlook? No. We gave up on securing that immediately. Here's a locked-down web browser for an absolute steal at $150,000. Per year. (What? We have to pay a competent "engineer" to stand at your shoulder at all times while you're using it. That gets expensive.) To access your plain-text-only Hotmail account.

    • If we did not externalize costs, a lot of products would be prohibitively expensive. Same as this situation.
