
CISA Boss: Makers of Insecure Software Are the Real Cyber Villains (theregister.com) 74

Software developers who ship buggy, insecure code are the true baddies in the cyber crime story, Jen Easterly, boss of the US government's Cybersecurity and Infrastructure Security Agency, has argued. From a report: "The truth is: Technology vendors are the characters who are building problems" into their products, which then "open the doors for villains to attack their victims," declared Easterly during a Wednesday keynote address at Mandiant's mWise conference. Easterly also implored the audience to stop "glamorizing" crime gangs with fancy poetic names. How about "Scrawny Nuisance" or "Evil Ferret," Easterly suggested.

Even calling security holes "software vulnerabilities" is too lenient, she added. This phrase "really diffuses responsibility. We should call them 'product defects,'" Easterly said. And instead of automatically blaming victims for failing to patch their products quickly enough, "why don't we ask: Why does software require so many urgent patches? The truth is: We need to demand more of technology vendors."

Comments Filter:
  • by Arzaboa ( 2804779 ) on Friday September 20, 2024 @04:07PM (#64803769)

    What is this guy smoking? I believe most of us have been demanding mostly the opposite of what Microsoft has been packaging for years. I mean let's get real. Who asked for Windows 11?

    Microsoft turned their patches into bloatware. There may be a fix buried in the patch update, but most of it exists to serve you more advertising and to turn things back on that you've already turned off.

    --
    It is never too late to be what you might have been. - George Eliot

    • I blame anything "version 11" on Nigel Tufnel [wikipedia.org].

    • I mean lets get real. Who asked for Windows 11?

      That's easy: Microsoft shareholders. MS is there to make money, not to make the best, most secure OS possible. It only has to be good enough and secure enough that people don't decide it's so bad they have to look for something else, and that's a pretty low bar given how difficult it is to switch.

      • The problem is, they've gotten there by knowingly lying about whether they're making the best and most secure OS possible, and that's actually illegal. They're only getting away with it because early on it wasn't clear to law enforcement and government officials (and common idiots) that it was possible to do any better, or that they weren't just incompetent. But I think we can all now see it's far past time to stop letting this slide.

        • The problem is, they've gotten there by knowingly lying about whether they're making the best and most secure OS possible

          Other than a few specific releases that claim outside certification as being "secure" or "suitable for [certain] high-risk operations," I don't think Microsoft is marketing Windows as being secure, at least not in the "secure enough to run the guts of your nuclear plant safety systems" sense of the word.

          To the contrary: I haven't read Windows 11's license agreement, but decades ago the Windows license agreement specifically disclaimed suitability for being used in things like nuclear-reactor-safety-systems.

          • What they say, or even used to say, in TV commercials, and what their 3rd-party vendors still say, matters when it comes to truth-in-advertising laws. Covering their ass in the most recent version of the TOS, while top secret government departments and hospitals are still getting regularly pillaged for using it because they think they have no other options, doesn't actually make any of this okay.

      • by Torodung ( 31985 )

        The shareholders will be upset if Microsoft actually starts paying damages for the negligence liabilities they have accrued over the years. TFA points out that it is an externality being shifted to the wrong parties. I don't care what the EULA says; criminal negligence and civil negligence are a thing, because they're negligence, i.e. not accidental but willful. A choice "not to make the best, most secure OS possible" in order to boost profits for stakeholders is textbook negligence, if not malfeasance. It's

    • by gweihir ( 88907 )

      What I would demand from Microsoft (or rather have enforced on them) is full liability whenever they do not follow the state-of-the-art. If it was not by accident, add triple damages on top. Obviously, they would go out of business with that, but that would be a good thing.

      • Obviously, they would go out of business with that, but that would be a good thing.

        As would every OSS developer or startup. Microsoft might be big enough to navigate such waters (read: lobby for a grandfather clause / government waiver), but such legislation would over-penalize those just trying to get started.

        It's also unnecessary. Software cannot do anything without hardware to run on, and it's not Microsoft providing it. The sysops and those hiring them have just as much culpability as the developers. Arguably more so if they do it for a living, as they are supposed to know when so

        • by gweihir ( 88907 )

          As would every OSS developer or startup.

          Stop building strawmen. Obviously FOSS (not OSS) would need some exception here. For example, anybody selling FOSS could be liable, but anybody just offering it would not be. FOSS could get professional, paid-for security review if important enough. The German government is doing that, for example. There are other approaches. Clearly FOSS is important enough now that some acceptable solution would be found. But you know what, Red Hat selling RHEL and _not_ having liability for what is in there is not acceptable.

          • So what you're saying is, to become a millionaire all I need to do is start a business that runs 1 copy of every application out there and profit from all the vulnerabilities?

            Seriously, this would jack the prices WAY up to an unaffordable level for most people. Technologies exist to secure software regardless of whether or not it's secure by design or has exploits.

            AI should be analyzing code bases for vulnerabilities and there should be a bounty program, that's it. No liability. It's not that difficult.

    • You mean the makers of Windows 11, the OS that still can't get a pinned program to start reliably when you click on it?

    • There are niche situations where paying for perfection, or at least fully-known, avoidable failure modes, is called for.

      But paying for mathematically-proven-sound software [wikipedia.org] running on hardware you can trust rarely comes cheap.

      • by gweihir ( 88907 )

        Nobody is talking about that. What we need, though, is software actually developed according to the state-of-the-art and vendor liability for the actual damage done if it is not.

        That call for "mathematically proven software" is just bullshit.

        • Hmmm.. Even OpenSSH has had a huge hole within recent memory and another concerning one just recently.

          It's true that a lot of software is just insecure slop. But it's also true that nobody can write "defect-free" software. Nobody. Just as no car or pair of pants will be free of flaws if you look closely enough.

          The idea that it's 'kinda like math' with a "correct" answer is just not the case, outside very narrowly defined criteria. If people who think it is took a minute to see what formal verification m

          • by gweihir ( 88907 )

            OpenSSH only had moron distro maintainers patching systemd crap into it. I am now preparing to compile from source to avoid that stupidity.

            But it's also true that nobody can write "defect-free" software.

            And, again (!), this is _not_ needed. Defects need to be a) reasonably sparse, b) not entirely stupid, and c) caught by redundancies. For example, the architecture pattern of "privilege separation" can nicely achieve (c) (a sketch follows below). But you know what? When I teach that to my year 3 CS students, they have never heard of it. And that is just not acceptable, because it means the state-of-the-art is not being taught.
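
            A minimal sketch of the privilege-separation pattern mentioned above, assuming a hypothetical Unix daemon; the port and the uid/gid of 65534 are illustrative assumptions, not OpenSSH's actual design:

            import os
            import socket

            def bind_privileged_port(port: int) -> socket.socket:
                # Binding a port below 1024 requires root, so it happens first,
                # before any untrusted input has been read.
                s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
                s.bind(("0.0.0.0", port))
                s.listen()
                return s

            def drop_privileges(uid: int, gid: int) -> None:
                # Irrevocably give up root before touching attacker-controlled
                # data. Order matters: setgid() would fail after setuid().
                os.setgid(gid)
                os.setuid(uid)

            listener = bind_privileged_port(80)
            drop_privileges(uid=65534, gid=65534)  # e.g. the "nobody" account
            # From here on, a bug in the request parser is contained to an
            # unprivileged account instead of handing an attacker root.
            conn, addr = listener.accept()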

  • by david.emery ( 127135 ) on Friday September 20, 2024 @04:22PM (#64803825)

    I've been calling for corporate and individual liability for bad software for at least 35 years. For individuals, that includes licensing software engineers under terms similar to that for other engineers (like civil engineers). Engineering liability also includes means to limit that liability.

    Corporations who deliver products that don't work should NOT be allowed to disclaim any liability, particularly through 'shrinkwrap/clickthrough licenses.'

    When corporations and corporate officers feel the pinch in their pocketbooks when they f**k up, then they'll take quality seriously.

    • I've been calling for corporate and individual liability for bad software for at least 35 years.

      This makes sense when life or limb is at stake. It also makes sense when the customer is willing to pay the high cost of software that doesn't have any bugs they can't live with.

      But in the everyday world of consumer software and a lot of business software, the cost of holding programmers and their employers liable for bugs would make a lot of projects economically infeasible.

      It would also pretty much kill open-source software unless said software was allowed to be distributed with a "use at own risk" clause.

      • Liability applies where there's an exchange of money. If you sell something, you assume liability that the product actually does what you said it would. If you accept money, you should also assume liability for that work. If some company packages OSS in their product, it's their liability. And if a company off-shores development, it does NOT offshore liability. So an argument that "this will drive US programmers out of work" is silly. But maybe if the US is full of people who aren't sufficiently competent to w

    • by gweihir ( 88907 )

      Exactly. Same here. Liability (and insurance) is the only way to fix this mess. What happens then is that insurers pretty quickly start demanding reasonable qualifications.

      Sure, we need some ideas for FOSS, but there are ways.

      • You need to rid yourself of the delusion that lawmakers will create laws that are acceptable to your world view (unless your world view insists on the end of small software developers). If lawmakers get involved, a simple echo server will cost $5M and take several years to make, and will cost each purchaser $15M to buy. FOSS will go away, and nothing will have changed except for the largest software vendors getting much larger and much more powerful. They will, once again, buy their way out of liability.

        We'

        • by gweihir ( 88907 )

          What you call "delusion" is what I call "knowledge of engineering history". I guess you lack that and hence claim nonsense.

    • You've been calling for an end to software.

      Great.

      I'm all for going Amish, but that's not possible short of cataclysm.

    • by Dadoo ( 899435 )

      I've been calling for corporate and individual liability for bad software for at least 35 years.

      LOL. The only kind of corporate liability I've seen in my lifetime is where the corporations get bailed out and the taxpayers pick up the tab.

      No. The only way you're going to get a company to change their ways is to stop using their products and impact their bottom line.

  • There's a difference between making an imperfect product because making a perfect one is infeasible/too-expensive-to-sell, and deliberately putting in harmful imperfections with the intent of harming others.

    I also wouldn't fault customers for deferring patches until they are tested.

    The rest of the summary is mostly sensible: Defects should be called out as defects, glamorization doesn't help, and we should expect more from technology vendors.

    • by Rinnon ( 1474161 )
      For the sake of injecting a bit of extra nuance into a position I mostly agree with, I would suggest that there are certain types of negligence that can reasonably be categorized as villainous. Evil intentions are not a strict requirement of villainy. (For clarity, I'm not suggesting this situation fits the bill)
      • by davidwr ( 791652 )

        If you are thinking of things that currently qualify (or that, in your opinion, should qualify) under "criminal negligence" laws, I see where you are coming from.

        I wouldn't necessarily call those things "villainous" but I would agree that the person doing the act (or failing to fulfill a legal obligation) should be held criminally responsible.

        Here's one example:

        In much of my country, things like serving alcohol to someone who is obviously drunk and then not taking reasonable action to prevent them from driving can qualify as criminal negligence.

    • by gweihir ( 88907 )

      Organizations that are willing to take a moderate gain for them in exchange for a massive damage to society qualify as "villains", no exceptions. Seriously.

  • Because people like Mark Zuckerberg "want to really have a culture that values shipping and getting things out and getting feedback" [slashdot.org] rather than shipping a product free of security defects.
  • When governments make the decision that software needs to be intentionally less secure so they can spy on everyone (in the name of "think of the children" or "terrorism" or whatever) or when they intentionally hoard known security flaws for the same purposes, security is worse for everyone.

  • by lsllll ( 830002 ) on Friday September 20, 2024 @04:25PM (#64803843)

    I know of no software out there that guarantees itself to be defect free. Even Excel's EULA tells you that Microsoft is not responsible for the calculations Excel does and that it's up to you to double-check to make sure they're correct. It's impossible to build any sort of half-complicated software without bugs, let alone complicated software. Furthermore, many times when we write software, we take other people's software and libraries for granted. The clause in contracts that pertains to this usually says that the developer uses industry best practices for development. I bet even NASA has managed to send software with bugs up into space.

    She is correct that we need to demand more from technology vendors. And that's fine and dandy, as long as you've got the money to pay for it.

    • But outside of academic examples, it's not common, at least not yet.

    • by gweihir ( 88907 )

      That is a bullshit argument. The discussion is not about "defect free". The discussion is about software produced according to the state-of-the-art, with qualified people, well-tested and well-designed.

    • EULA tells you that Microsoft is not responsible for the calculations Excel does

      This terms-and-conditions bullshit has got to stop. How is microsuck not responsible for the bullshit they make?
      If a hammer company builds a hammer out of a metal that shatters on impact and you lose an eye, it's your fault?

  • by Tablizer ( 95088 ) on Friday September 20, 2024 @04:30PM (#64803865) Journal

    "why don't we ask: Why does software require so many urgent patches? The truth is: We need to demand more of technology vendors."

    1. Ensuring responsible coding isn't free; it takes monitoring.

    2. Technology moves too fast to define responsible system design in a clear way.

    3. Companies value short-term profits over long-term profits. It's similar to why neither political party pays down the national debt: they'd take a short-term hit to the economy for benefits that may only come after their term. "A dollar today is worth more than a dollar tomorrow."

    Fix the incentive systems instead of nagging.

  • by tinkerton ( 199273 ) on Friday September 20, 2024 @04:32PM (#64803869)

    We would never have had that worldwide outage if we'd listened to CISA. Oh wait..
    https://www.zdnet.com/article/... [zdnet.com]

  • by Rick Schumann ( 4662797 ) on Friday September 20, 2024 @04:34PM (#64803879) Journal
    Over all the years I've been a Slashdotter, and on other places on the Internet as well, my observation is that software engineers (or 'coders', as they seem to be referred to now, or 'programmers', which is what they used to be called) more often than not aren't allowed by their bosses or companies to dot all the I's and cross all the T's necessary to really deliver a software or firmware product that is truly bomb-proof, so stuff gets pushed out the door with holes in it.
    To be fair about it, though, 'due diligence' can only cover so much, no one is omniscient or precognitive, you can't always think of every possible exploit someone might use, or every set of circumstances that might cause a piece of code to act in a way other than what was intended.
    But I'd say very often our old friend 'profit above all else' is to blame, along with poor decision-making by management types whose responsibility it is to decide what the requirements of software or firmware should be.
    Am I right, folks?
    • "Coders" and software engineers are very different. Most software engineers know how to code, but not the reverse. Engineers are the ones designing a building, and coders are the ones doing the actual welding a riveting.
      • I've worked with 'software engineers' who also were writing code; I'd assumed it's dependent on the size of the organization they're working at.
  • People, generally, don't write insecure code, or use insecure hardware, intentionally. Sometimes, to be fair, the problem is a backdoor that was left open, but how many times is the issue compatibility? Why do we need patches on patches? Why do you run a 20-year-old email server, upgraded to service pack 10.4, that needs Windows 7, hooked up to a mainframe?

    This idea that we need to hold the engineers accountable is fine, but then to say the user isn't the problem is f'ing stupid. If I had my say,
  • by gavron ( 1300111 ) on Friday September 20, 2024 @04:40PM (#64803911)

    Another poster says "Demand More From Microsoft" but those of us in the industry have been doing that for literally almost 30 years!

    Windows For Workgroups was a popular update to Windows 3.1, but Microsoft wanted to sell licenses. So they created Windows 95, but in order to keep/maintain/convert current WFG users, Microsoft declared that Win95 would run whatever WFG could run. They even special-cased some software so Win95 would detect it and go into "super-emulation" or "bug-emulation" modes so the software would see it as true WFG instead of the Win95 it really was.

    In the process of doing so, big gaping security holes and shoddy programming weren't just continued... they were purposefully maintained, supported, and required by Microsoft. Win95 was revised majorly on the outside for Win98, a bit on the inside for WinME, but it wasn't until the NT kernel was put into Win2000 that Microsoft truly had an opportunity to add real multi-layer security, separate the user and kernel, etc. But, for backward compatibility, they didn't.

    Win2003, 2008, all continued this trend. Add a bit of "user verification" that was easily bypassed -- especially if Microsoft made its software think you were one of those "special case" pre-1995 pieces of software.

    ALL OF THIS was a nice warm sugary petri dish for growing the entire malware industry.

    Win8 didn't fix this. Win9 couldn't exist because older software checked the first digit of the Windows version to see how it needed to frame its API calls, and if it starts with a "9" it's 95/98 (see the sketch below). Win10 and Win11 didn't fix this mess either.

    Don't Demand "More" From MIcrosoft. Demand the minimum in developing a secure OS or levy a fine. That's the government's operation and they're so deep in Microsoft's packet every time they hear "Linux" or "LibreOffice" they hiss like a Slytherin Vampire entering a church and being hit with a hose of holy water.

    To use an analogy, Microsoft is the home builder that sold you a house with a lock anyone can open in seconds, and walls that open up when someone hits them with a 2600Hz tone. An entire "cottage" industry built up around robbing people in these Microsoft houses. Sure, the criminal is the one who robs you... but when you buy a house with wall openings and doors that open too... and it wasn't disclosed to you except as "This house is 100% backward compatible with the empty lot that was here before"... that's where the blame should start.
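
    A minimal sketch of the version-check bug alluded to above (the function name and strings are hypothetical illustrations, not actual Windows application code): legacy programs matched on the prefix of the OS name string, so a real "Windows 9" would have been misdetected as 95/98.

    def is_win9x(os_name: str) -> bool:
        # Buggy legacy prefix test (illustrative only).
        return os_name.startswith("Windows 9")

    assert is_win9x("Windows 95")
    assert is_win9x("Windows 98")
    assert is_win9x("Windows 9")     # a real "Windows 9" would be misdetected
    assert not is_win9x("Windows 10")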

    • by gweihir ( 88907 )

      Don't Demand "More" From MIcrosoft. Demand the minimum in developing a secure OS or levy a fine.

      We need one step more: if their products do damage (and they do so all the time), then unless they can prove they followed sound engineering practices, that qualified personnel were used, and that the state-of-the-art was respected, they become liable for any and all damage. Like any other vendor of an engineering product, really.

    • Most homes in my country have locks that can be defeated in less than a minute by breaking a nearby window.

      Substitute "big truck moving at high speed" for "2600Hz tone" and most homes' walls will come tumbling down as well.

      I've been told that there is a cottage industry [google.com] that helps you break into a home in a less attention-grabbing way than breaking a window or ruining your truck to open up a wall.

      By the way, thanks for pointing out that the need for 100% backward compatibility can lead to a less-secure environment.

      • by KlomDark ( 6370 )
        Jay and Silent Bob will just ram a cheap old import car thru your wall and call Tommy Chong his dad.
    • Maybe I'm blaming the victim here, but instead of "demanding" anything from them, people should go elsewhere. It's the only way they'll listen. Stop using Microsoft software.

  • Software comes with no warranty because nobody is willing to pay for it.

    • by gweihir ( 88907 )

      Given the damage done, this model is not sustainable. For example, in Germany, the damage from cybercrime was 2,600 EUR per person in 2023. That is not a small problem anymore.

  • ...there is NO procedure that, if followed carefully, will produce perfectly secure, bug-free code.
    I do agree that managers often cut corners, hire cheap, poor-quality programmers, and impose unrealistic deadlines, but even the best of us can't create perfect code.

    • by davidwr ( 791652 )

      ...there is NO procedure that, if followed carefully, will produce perfectly secure, bug-free code.

      In practical terms, at least for most modern situations, you are correct.

      But I would encourage you to read up on "formal verification" as it applies to computing. There are a few use cases where the work needed for formally-verified-correct algorithms and their implementations is worth the effort.
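
      A toy illustration of that idea in Lean 4 (a hypothetical example, not drawn from any real project): the theorem is machine-checked for every possible input, which is what distinguishes formal verification from testing.

      -- Hypothetical example: a clamping helper plus a machine-checked
      -- proof that its result can never exceed the bound.
      def clamp (x bound : Nat) : Nat := min x bound

      theorem clamp_le (x bound : Nat) : clamp x bound ≤ bound :=
        Nat.min_le_right x bound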

  • by gweihir ( 88907 ) on Friday September 20, 2024 @04:53PM (#64803969)

    Sure, the attackers are to blame to some degree as well, but they are just opportunists. The real problem is those who create the opportunity. They are also the only point where this problem can be fixed, because it is impossible to even begin to track down the attackers in a globally connected world.

  • 100% guarantee that "Jen Easterly" has never written any code for anything. She has no idea how to code securely, and worse, no idea of how vulnerabilities happen. Who would get into software development if they can be held liable for something like this? Even the smartest and most diligent person can't be expected to foresee every possible type of attack and vector. It's dumb. And btw, I'm not saying it because all the code I wrote back in the 90s and prolly even early 2000s didn't give a fuck about security

  • by vladoshi ( 9025601 ) on Friday September 20, 2024 @06:03PM (#64804197)
    Learn to comply with the standards real engineers have to, or face sending their companies broke and even going to prison?
    • Do engineers in other disciplines have to worry about government-sanctioned actors trying to exploit their projects? If I build a bridge I need to make sure it doesn't fall down under normal circumstances, but I don't have to worry about what would happen if someone puts a bunch of C4 under it, right?
    • Stop with this.

      The word engineer is a thousand years old.

      20th Century bureaucrats don't get to steal and redefine the word.

      Call it bonded-engineers or whatever is actually true.

  • This dim bulb is basically making the case that the job and the agency is unnecessary.

  • by mukundajohnson ( 10427278 ) on Friday September 20, 2024 @06:19PM (#64804237)

    When you construct something that people physically use, there are a lot of things that support safety and discourage fires and such. In software infrastructure, there are certifications and authorizations such as SOC-2 and FedRAMP (requiring a security audit to maintain "secure" status), but I'm not sure about anything for the software itself.

    Software gets extremely complex, so obviously inspection isn't as easy, but what if B2B vendors were required to have regular code reviews ("inspections") from a 3rd party to help with security? I'm not sure how cost-viable that is when it comes to software, but something should be done to go after vendors that really don't give a shit while milking their customers for huge profit margins.

  • It doesn't exist. There is only tolerance and price. Lower tolerance == more money. And you have to specify things like temperature. You ok with your computer only being "valid" at 70F +/- 0.0001F?

    If you want legal liability for bugs, you are asking for VERY expensive software. And it would require ditching most hardware as well. All the bugs in CPUs lately? And patches are not good enough, so not only is hardware going to be expensive, it's going to be SLOW. There's a reason space missions don't use the latest hardware.

  • What a maroon. No wonder she's in government.

    Which also means she could easily go to NASA and study how they produced mostly-correct code for the Space Shuttle.

    I forget the exact number from my Software Engineering class in college, but it's something like 300 man-hours per LOC of production code. That reduced the error rate to something like 1 in 1000 LOC. At the time it was estimated that the Shuttle was launching with probably 40 unknown bugs.

    Why not just have the EPA require 99% efficient engines in cars?
