Programming

The Great Software Quality Collapse (substack.com) 182

Engineer Denis Stetskov, writing in a blog: The Apple Calculator leaked 32GB of RAM. Not used. Not allocated. Leaked. A basic calculator app is hemorrhaging more memory than most computers had a decade ago. Twenty years ago, this would have triggered emergency patches and post-mortems. Today, it's just another bug report in the queue. We've normalized software catastrophes to the point where a Calculator leaking 32GB of RAM barely makes the news. This isn't about AI. The quality crisis started years before ChatGPT existed. AI just weaponized existing incompetence.

[...] Here's what engineering leaders don't want to acknowledge: software has physical constraints, and we're hitting all of them simultaneously. Modern software is built on towers of abstractions, each one making development "easier" while adding overhead: Today's real chain: React > Electron > Chromium > Docker > Kubernetes > VM > managed DB > API gateways. Each layer adds "only 20-30%." Compound a handful and you're at 2-6x overhead for the same behavior. That's how a Calculator ends up leaking 32GB. Not because someone wanted it to -- but because nobody noticed the cumulative cost until users started complaining.
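To sanity-check that arithmetic, here is a minimal sketch (in C, assuming only the author's own model: each stacked layer adds a fixed fractional overhead, and the costs compound multiplicatively):

    #include <stdio.h>
    #include <math.h>

    /* If each of n stacked layers adds a fixed fractional overhead,
       the total cost multiplies rather than adds. */
    int main(void) {
        for (int n = 4; n <= 8; n += 2)
            printf("%d layers: x%.1f at 20%% each, x%.1f at 30%% each\n",
                   n, pow(1.20, n), pow(1.30, n));
        return 0;
    }
    /* 4 layers: x2.1 / x2.9
       6 layers: x3.0 / x4.8
       8 layers: x4.3 / x8.2 */

Four to six layers at 20-30% each lands squarely in the quoted 2-6x range.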

[...] We're living through the greatest software quality crisis in computing history. A Calculator leaks 32GB of RAM. AI assistants delete production databases. Companies spend $364 billion to avoid fixing fundamental problems. This isn't sustainable. Physics doesn't negotiate. Energy is finite. Hardware has limits. The companies that survive won't be those who can outspend the crisis. They'll be the ones who remember how to engineer.

  • by TigerPlish ( 174064 ) on Tuesday October 14, 2025 @10:45AM (#65723882)

    Part of this decline is MBA-driven.

Beancounters ruin every shop they run, whether it be Boeing or any of the ones I've worked at where Mr. Moneybags controls IT.

In shops where Mr. Moneybags (the CFO) does not control IT, I've noticed things are much better for IT.

    • IT, whether it be software, hardware or the people that tie it together, is seen by all-too-many CxOs as a commodity cost to be minimised, rather than an asset and competitive advantage to invest in and nurture.

      Of course, those same CxOs don't take any responsibility for the failures they create either...

• Might you be seeing that small companies with focused development teams (perhaps it's just a single development team) are more efficient than large companies, which need bean counters?
    • by gweihir ( 88907 ) on Tuesday October 14, 2025 @11:16AM (#65723978)

Short-term planning and profit is everything to these cretins. Every one of them is essentially a get-rich-quick preacher. That destroys things reliably. In some sense this is a variant of the "Tragedy of the Commons": nobody makes things sustainable, nobody gets things to a stable and dependable state, nobody looks at the longer-term effects.

I mean, Microsoft is making record profits while their cloud gets hacked again and again and their OS is crumbling. Same thing for Boeing, with their planes falling out of the sky while they have forgotten how to design planes. Same thing for Intel (with the profits there more closely reflecting their actual performance, but still too high). Same thing for Crowdstrike, which again has record profits after screwing up about as badly as it is possible to screw up.

      This cannot go on. We need regulation and liability. We need to recognize that IT is critical infrastructure. We need to send CEOs that endanger it to prison until they all have gotten the message.

• Honestly, I just had an "I have thoughts, let me spit them out without thinking" moment with this post - unnecessary complexity (piles of dependencies, frameworks, etc.) absolutely IS a big part of the problem.
  • by gnasher719 ( 869701 ) on Tuesday October 14, 2025 @10:49AM (#65723892)
On iOS, malloc(1ull << 35) will succeed. It will reserve 32 GB in the address space but nothing else. As long as you don't write to the memory, nothing happens at all. So you are right to ask "WTF is going on here", but it won't hurt any user.
    • by evanh ( 627108 )

      I think you'll find that leaking means written to. And the SSD is swapping like crazy. Hence the bug complaints.

      • by PCM2 ( 4486 )

        But the OP says the 32GB was "Not used. Not allocated. Leaked." It's a little hard to parse, but if true, then maybe the actual effect is truly negligible.

        • I think:

          Not used = not legitimately used, as in trying to calculate for example 9^(9^9) with all digits (which obviously will blow up)

Not allocated = not merely saying it needs that much while in fact not using any real resource

          Leaked = just some garbage pushed there and forgotten (as in unused, but still taking the space)

    • by gweihir ( 88907 )

Same on Linux. This is really bad software engineering, but it gets made worse by a really bad OS (Windows).

    • by monkeyxpress ( 4016725 ) on Tuesday October 14, 2025 @11:31AM (#65724050)

I bet the problem is that parts of the original calculator app are written in Objective-C from before there was automatic reference counting, and someone messed with the legacy code and didn't free memory properly - a good old-fashioned dangling-pointer leak.

      It should not be possible for this sort of bug to occur in a modern language with garbage collection (or even ARC unless you're going crazy with your program structure).
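For anyone who never wrote pre-ARC code, here is a minimal sketch of the manual retain/release discipline being described (plain C as a stand-in for illustration, not the actual Calculator code):

          #include <stdlib.h>

          /* Hand-rolled reference counting: the pattern ARC later automated. */
          typedef struct {
              int refcount;
              double value;
          } Object;

          Object *obj_new(double v) {
              Object *o = malloc(sizeof *o);
              o->refcount = 1;
              o->value = v;
              return o;
          }

          void obj_release(Object *o) {
              if (--o->refcount == 0)
                  free(o);   /* freed only when the last owner lets go */
          }

          void on_keypress(double v) {
              Object *result = obj_new(v);
              (void)result;  /* ... display result ... */
              /* BUG: the matching obj_release(result) was lost in a later
                 edit, so every keypress leaks one Object. A GC (or ARC,
                 barring retain cycles) makes this class of mistake
                 impossible. */
          }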

    • by vyvepe ( 809573 )

On iOS, malloc(1ull << 35) will succeed. It will reserve 32 GB in the address space but nothing else.

Well, at least some page-table entries were created. The heap's free-block list was extended as well (though this could be avoided if only brk or mmap were used). The operation has some overhead, but it is very little if the allocated memory is never used.
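A quick way to watch this happen (a sketch, assuming a 64-bit OS that overcommits, as Linux does by default and macOS/iOS effectively do):

          #include <stdio.h>
          #include <stdlib.h>
          #include <string.h>

          int main(void) {
              char *p = malloc(1ull << 35);      /* "allocate" 32 GiB */
              if (!p) return 1;
              puts("reserved 32 GiB; RSS is still tiny");
              getchar();
              memset(p, 1, 1ull << 30);          /* now actually touch 1 GiB */
              puts("touched 1 GiB; RSS is now ~1 GiB");
              getchar();
              free(p);
              return 0;
          }

The malloc succeeds immediately; physical pages are consumed only on first write, which is why a bare 32 GB reservation "won't hurt any user."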

  • Answer is simple: (Score:3, Insightful)

    by Travelsonic ( 870859 ) on Tuesday October 14, 2025 @10:50AM (#65723894) Journal
Software development needs to slow the hell down, to allow development to happen properly - resources need to be given so developers can take a machine and do what it needs to do - no more, no less.

We need to stop shifting the blame onto costs (it's a bullshit excuse IMO; not WANTING to spend the resources, time and money to allow quality to shine =/= being unable to).

    We need to stop blaming complexity when we don't give developers the time and space needed to explore that complexity and work with it.

Companies need to learn that gaining X amount of money slower than before is not only still earning X amount, but more worthwhile, because the issues (and thus the costs of dealing with them after) will be better mitigated.

    So much in this fucking industry needs to change if software quality is to improve.
    • by gweihir ( 88907 ) on Tuesday October 14, 2025 @11:06AM (#65723938)

      Actually we need to bring down complexity. No amount of time and money will make the current complexity in the IT space sustainable.

      And there is another thing: Disregard for KISS and high complexity is amateur-hour. Actually competent engineers avoid complexity like the plague it is.

      • Actually competent engineers avoid complexity like the plague it is.

        Yeah but avoiding complexity takes a lot of time. It's like the saying 'I didn't have time to write a short letter...'. Especially with powerful high level languages, I can just bash out something that works extremely quickly. Things like JavaScript where what they call 'dynamic programming features' could also be construed as 'no respect for any sort of useful scoping rules' means you can pull assets around your project like globals on steroids. This is damn fast if you just want to mock something, but it'

        • by gweihir ( 88907 )

It definitely is much more effort and takes more skill and experience to respect KISS than to just heap complexity on complexity. That is why high complexity in systems is an almost sure sign of the designers being amateurs or being forced to do it on the cheap.

          The problem with "management" is that management is dumb and often greedy and cannot do long-term planning. The only way to get that under control is regulation and liability. And a few "managers" that got in the way of engineers trying to do it right in

          • The problem with "management" is that management is dumb and often greedy and cannot do long-term planning.

            They can do long term planning, but their incentives are about doing things that improve their own career. (If you want people to change, their incentives need to change).

      • by dargaud ( 518470 )
There's a saying that goes "A complex software project that works started from a simple project that worked". For instance, I have a 40k-line program that has been working non-stop for years without any memory leak or crash, and it started from a half-page of specs "to do a test" and then they kept asking me to add things... Granted, it now looks like a monstrous pile of kludges, but still...
        • by gweihir ( 88907 )

          Yes, very much so. Obviously, if you keep adding too much, at some point it all collapses. But if you start with a good solid as-simple-as-possible design and architecture, that point comes much later.

    • Re:Answer is simple: (Score:5, Interesting)

      by Pieroxy ( 222434 ) on Tuesday October 14, 2025 @11:10AM (#65723956) Homepage

      "If you are not embarrassed by the first version of your product, you've launched too late"

      Reid Hoffman originally coined this sentence when discussing startup culture and the launch experience of LinkedIn.

This is the reality of the market. People will jump on the first service available, be it bad or otherwise. The competitor that releases its stuff 1 year later has already lost the race.

So the bad technical decision makes market sense. You can either make money with fast crap or go bankrupt with a well-engineered project.

    • Re: (Score:3, Interesting)

      by Anonymous Coward

Nope. Most software engineers I've known spend most of their days in planning meetings, scrums, side engineering meetings, off-sites, training, Slack, coffee breaks, and pretty much anything else that isn't heads-down coding. The actual amount of code they produced was minimal (good quality but very minimal). We need to remove the additional noise so they can use the 40 hrs to focus on their core job functionality. AI will hopefully provide assistance with the boring parts but come on, slow down? Yeah people do

  • Streaming Apps (Score:5, Interesting)

    by thecombatwombat ( 571826 ) on Tuesday October 14, 2025 @11:01AM (#65723926)

    OK. Old man rant. But I've been saying this for years about streaming apps. How is the overall quality so awful?

It is *stunning* how bad they are and people just put up with them. If you have two or three streaming apps, you will be constantly playing whac-a-mole with the same basic set of player bugs. And somehow, even trillion-dollar companies can't handle syncing where I am in a show across multiple devices; it's apparently the software challenge of our time.

    When I press the "next show" button in a series, I really, really, really . . . don't mean the episode I just watched two days ago, I mean the next one. Apparently no one is able to even come close to doing this reliably.

For years now, Apple has pushed a big feature everyone seemed to think they wanted: a unified queue across many apps. The thing is, it's terrible. In theory, it shows you where you are in a show/movie, in a list, across multiple apps, on multiple devices. Watch half a movie on my phone, then launch it on Apple TV, and it should go to that same spot. This ridiculously simple feature, from this trillion-dollar company, works, at best, 50% of the time.

    And the thing is . . . streaming is supposed to be the hot new business every tech company can't stay out of. Amazon and Apple, huge software companies, spend tons for shows, and produce terrible software. Disney is basically staking the company on it, same for WB. *How* is it so bad, while also being so basic?

• I hate the planned obsolescence of apps myself. There was a new update to an app that my kids use for watching streaming stuff; all of a sudden it doesn't work, and it says to update to the new version, but I can't because the hardware doesn't support the new iOS... but nothing is different about the app. We just use it to watch shows. If it worked yesterday, why wouldn't it work today?
    • How is the overall quality so awful?

Because streaming apps no longer exist. They are a front-end wrapper around the streaming company's website, written in Electron. Calling them an app implies a level of thought and development that simply wasn't applied.

  • I agree (Score:4, Insightful)

    by gweihir ( 88907 ) on Tuesday October 14, 2025 @11:04AM (#65723932)

Software is typically done incompetently, and that seems to be slowly getting worse. This is mostly in the Windows space, including the OS. For example, Win 11 crashed on me 2 times in the first hour of using it, while Win 10 was rock-solid on the same hardware. And the GUI is infantilized and decidedly has worse usability. What can you expect from application development, if this is the culture being set?

    At the same time, requirements increase. Software now runs everything, attackers have solid business models and are more and more of a real threat. AI makes that worse as it mostly helps the attackers and it helps them a lot.

    This is headed for a big crash. And yes, my DNS/Email/Webservers are all on Linux and I have a desktop on Linux as well. But what does that help me if most things do not run anymore because Azure got fully compromised (again) and this time the attackers take it hostage or just want to destroy it?

I hope the crash will not be civilization-ending. And after that we hopefully get engineering practices in the IT space that match other engineering disciplines. I mean, you do not expect your car to fall to pieces while driving it, do you? (Unless it is a Cybertruck or a Jeep, of course...) Amateur hour really has to stop in the software and general IT space or we will see things fall to pieces.

    • by mspohr ( 589790 )

      I think that "AI programming" vibe will only make this worse.

      • by gweihir ( 88907 )

        Oh, definitely. It makes the defenders weaker and the attackers stronger, because attack software does not have to be reliable or secure. And I think we may see a real software maintenance catastrophe from that stupidity in a few years as well.

      • I'm less sure about that.

I'm finding Claude is really good at removing layers of abstraction - I just went through a stack of shipping code that created FedEx labels; there were multiple layers of abstraction on top of some code written years ago. I had the LLM go through and develop a spec for what each method needed to do, propose a clean interface, and then rewrite the existing code into the new architecture.

        Then I can have it find a common interface between my UPS and FedEx code and wrap those up a
• One problem I have with Windows has been the ongoing obfuscation of controls. Replacing Control Panel with Settings is something I suppose needed to happen eventually. I personally dislike the style of Settings compared to Control Panel. However, it appears to me Settings is just a front end to the actual controls, and Windows can lie to/hide things from the user. Before, Control Panel showed the actual settings.

      For example, Add/remove programs appears to remove optional Windows components but in reality it does not. For ex

  • by Atmchicago ( 555403 ) on Tuesday October 14, 2025 @11:10AM (#65723954)

    Most of the "sentences," as defined by punctuation, are really just phrases. The article was published on Substack, so not only are there no editors, but apparently no standards. Please take time to edit your work, otherwise it's illegible. I'll take a stab at repeating what I just said, but in that same terse style:

    The author used sentence fragments. Hardly 5 words. Barely even phrases. What the hell? Twenty years ago, this would have triggered editorial review. Not now. Nobody cares! It's awful. Here's what bloggers don't want to acknowledge: writing takes time and effort. It's a skill. And guess what? If you write well, people will have an easier time understanding your point. If there ever was one.

• Seriously. I agree with the author, but the writing was atrocious. Also, at least one of his examples was a problem with beta software, which is expected to have problems like this; it's not really an issue for me if software I know is beta doesn't work right. I'm glad someone else noticed this too.

    • A consequence of "the great writing quality collapse" is the great comprehension collapse. One can reasonably ask whether Stetskov's poor writing is by default or by design. I spent decades both working in higher education and reading tech news. The result in this case was that, despite all my training, I didn't even notice the poor writing until I read this post.

  • Not surprising (Score:5, Insightful)

    by TwistedGreen ( 80055 ) on Tuesday October 14, 2025 @11:12AM (#65723966)

    "Move Fast and Break Things" inspired a generation of incompetence. Note that actually fixing the things you broke gets swept under the rug.

    It seemed to be an almost orgasmic revelation to the business-types when they realized they can continually ship broken software and nobody cares.

    • Re:Not surprising (Score:5, Insightful)

      by Voyager529 ( 1363959 ) <voyager529 AT yahoo DOT com> on Tuesday October 14, 2025 @11:28AM (#65724038)

      "Move Fast and Break Things" inspired a generation of incompetence.

      I think there were three elements of this mindset that were assumed knowledge on the part of the person who said it:

      1. "Move fast and break things...in a development environment where possible".
      2. "Move fast and break things...in a way that is easily reversible." (see #1)
      3. "Move fast and break things...and assume they will break, so assume you'll be fixing what broke" (see #2 and #1).

      I can appreciate that Facebook can have this mindset, and in the case of a social network, there *is* an element of wisdom in not treating it like the IBM-of-old that overengineered EVERYTHING, making it super-reliable, but also making development very slow and very expensive. Facebook's focus on agility makes perfect sense for the nature of the work.

      This doesn't work in every field, though. From finance to medicine to engineering, the costs are much, much greater than the loss of cat videos. Just because something makes sense in one field, doesn't mean it makes sense in EVERY field...and unfortunately, there are very, very few MBAs who understand the one thing that is more valuable than money: wisdom. Wisdom can earn money, but money can't buy wisdom.

      • very few MBAs who understand the one thing that is more valuable than money: wisdom.

No, it isn't more valuable. In fact, unused, it has no value at all. What IS more valuable than money is time. It is finite and eventually we all use up our share. The problem here is not intellectual. The problem is that money is the measure by which success is judged. Most software is junk because it's cheaper to produce than a quality product, the cost of users' time is irrelevant to the producer, and the cost of that wasted time is hard to factor into the choice of software.

    • by taustin ( 171655 )

      "Move Fast and Break Things" inspired a generation of incompetence. Note that actually fixing the things you broke gets swept under the rug.

      That's not incompetence, that's corruption. And it's not just tech bros, it's ubiquitous across our society, from top to bottom.

    • by DarkOx ( 621550 )

I might be wrong, but I don't think move fast and break things referred to that which you were building. It was more about not worrying about the consequences for the rest of the marketplace, and perhaps not worrying about the consequences for cooperative shared infrastructure - like shoveling tons of data over DNS, or, say, abusing NTP to distribute a bunch of very large binaries...

      Another example would be Electron apps, the move fast and break things does not apply to your own app it applies to using the giant fra

  • And in 2005, if the Apple Calculator had leaked arbitrary amounts of memory... it just would have been a bug report in a queue. Because it's just a little utility, not an emergency.

    He inadvertently disproves his own thesis by linking to the Spotify bug. It's from 2020, when his chart was still in the green, though it's likely there are multiple different memory leak bugs discussed in the thread.

    It's true that in the much-further distant past there were fewer bugs of this nature. Or at least fewer that ma

• I remember in the late 90's / early 2000's discussing with a colleague how it was possible that X allocated 2 megabytes for an empty window, just for sitting there on the screen.

  • by Rosco P. Coltrane ( 209368 ) on Tuesday October 14, 2025 @11:23AM (#65724010)

When you have 32 kilobytes of RAM and a 1 MHz processor, you need all the programming talent you can get to squeeze the most performance out of them.

    When you have 32 gigabytes and dozens of cores, any incompetent code monkey can churn out the same application in Visual Basic or Python.

    Resources don't make your computer faster. They empower incompetent and sloppy developers, who crucially are paid less than good ones, so their boss can make more money.

  • by MpVpRb ( 1423381 ) on Tuesday October 14, 2025 @11:30AM (#65724046)

    I learned software engineering in the 70s. Back then, there weren't very many of us and even fewer who were really good at it. As a result, we were paid really well.
    Then the news spread, software is the key to riches.
    Pundits, advisors and politicians told kids that everybody needs to learn to code.
    This resulted in a flood of CS students of varying talent. The talented ones worked hard, the not-so-talented ones somehow ended up with diplomas and entered the market.
    At the same time, the beancounters wanted to reduce costs. This created a market for a wide variety of tools, frameworks and languages, optimized for mediocre, cheap programmers. The tools and frameworks were often buggy, opaque black boxes, but nobody cared.
    And also at the same time, processors were getting fast, really fast. Along with giant memory and storage, it encouraged companies to optimize solely for development speed and cost, while ignoring performance and code quality.
    Now, the fad of the day is "vibe coding" that promises to allow people with absolutely no engineering skill or training to create code with a text prompt.
Expert software engineers still exist and use AI tools effectively where they work well, but the tsunami of crap continues to overwhelm us.

  • Not evangelizing Ruby, but the start of the keynote made a lot of sense.

    https://www.youtube.com/watch?... [youtube.com]

  • by ak3ldama ( 554026 ) on Tuesday October 14, 2025 @11:34AM (#65724062) Journal

It is likely true that software quality is dropping. But the important point I would like to make is that quality elsewhere is horrible too. Our relatively new house is on its third bathroom sink faucet in about 12 years total. I cannot fathom how this could be so bad. Car quality, parts, engines, transmissions - all of it is worse. Worse parts, worse designs; it's all bad and so much more expensive. A twenty-year-old car with only front-wheel drive, a four-speed transmission, and a reasonably powered V6 is, in comparison, SO solid. A little worse MPG but that is it, and sometimes even that isn't so clear cut. I'm sure there are examples of things that have improved, and others that have gotten worse; those were a couple I can think of offhand. A lot of this is driven by big government making decisions for us, even during Republican administrations, ironically. Fuel efficiency standards go back to Bush. Anyways, enjoy: the future is gonna suck. And be expensive.

    Farmers will bitch about a def burn/regen, but still buy the new huge combine because even with sitting for 45 minutes, they get so much more done so fast it is unreal. So they buy/rent huge equipment they cannot work on because they run so many acres they really have no other options. Our government bankrolls all their risk so land/rent values keep going up and everyone is too happy to question anything. Red America complains about market access while voting in Trade War Trump. We're all so stupid. Nevermind GMO everything. If only we could make tofu w/ all this cheap soy. Nope, we gotta feed it to cows/pigs/chickens. No one cares about the river and aquifer water quality, or how expensive it is to treat for high nitrate levels. Sorry for the slightly unrelated farming rant. But I really wish our local river was cleaner. It is never a focus and so sad. But I'd refer to it as a "water quality collapse." Ironically you have to pay farmers for buffer strips and CRP, I'm not sure they'd do it on their own. Left to their own devices they're even ripping out the shelter belts around here.

    • Car quality, parts, engines, transmissions all of it is worse.

BS. A car that lasted 100,000 miles used to be unusual, and getting 20 mpg was outstanding mileage. Gas stations checked the oil with a fill-up because most cars burned oil constantly. There is a lot of modern junk - just compare your jeans to a pair from the 60's. But in our society, you buy something new and throw the old things away even when they still serve their purpose well.

Software developers have made that an institution. Nobody thinks a 10-year-old computer should be able to run the newest software. And no one thinks a new computer wouldn't have new software with a modern look. Just listen to the discussions about the need to replace legacy software because it was written in COBOL. It's still working as designed and rock solid, but it "needs" to be replaced with modern software with a multi-colored interface. Simple web sites are "upgraded" into complex new websites with lots of new features that users have to learn just to use the site. The reality is that this drives the industry, and bugs are part of that. As the COBOL example shows, creating rock-solid programs without bugs just eliminates a future customer for new software.

  • by ctilsie242 ( 4841247 ) on Tuesday October 14, 2025 @11:41AM (#65724088)

If I had to pin down why software quality is so crappy, it is because of a few factors:

* Companies don't have to do better. They can ship something that doesn't even run and still make money. At most, they ship something barely functional so the implied promise that it runs is taken care of. However, even that can be stomped on in the TOS/EULA agreement.

* Devs are viewed as fungible. They're not delivering what marketing wants, even if it is impossible? Offshore and outsource until management is happy.

* The entire Agile/Scrum system had good intentions, but has become nothing except beating on devs until they quit. I worked at one software place that had daily 4-6 hour standups, where every deliverable was demanded of every dev, and the Scrum master personally insulted the devs every day. Then people would whine that they were blocked and point fingers, and the person pointed at would start cursing at them. After that exhausting kangaroo court, not much got done after lunch. Even worse, I was in Ops, got dragged into the meetings, and was often considered the cause of deadlines slipping... because I refused to put things into production unless they went through some type of testing first. This was after a dev used a root PW on a production machine, threw his code in, and caused an insanely expensive outage. Code quality was absolute garbage. Security? Hah!

    Security was out the window, because the devs knew that if the company had software that caused a major breach, there were many layers between the legal team and them, while not getting a deliverable out the door meant the boss saying tata to their job. So, on install, SELinux was disabled, firewalld was masked out, and the program would quietly exit with an error code zero if not run by root. All DB access was done by the SYSTEM or sa user, and the program would just exit out, error code 0, if that wasn't present. The devs didn't care... to them and the Scrum master, security has no ROI.

If you want something done right, don't do permanent sprints, and allow for times when the entire code base goes through a complete refactor, perhaps every 6-12 months. Get that technical debt paid off.

    Overall, this makes me glad I work with embedded programming. Stuff that is acceptable for web pages isn't going to cut it when at most, you are programming on the resources of an Apple ][+. You are not going to vibe program stuff in that environment.

  • by TrentTheThief ( 118302 ) on Tuesday October 14, 2025 @11:44AM (#65724092)

    Gartner predicted this when they critiqued Microsoft's new philosophy of releasing when the software is "just good enough." Inherent flaws and known bugs were ignored.

    Now that that's what everyone does, expect it to get worse.

  • He makes a bit of a valid - if poorly articulated - point.
    We should be using assembly language! More efficient!

  • The author forgot the most important thing: You can't make iPhone apps on an iPhone (or Android apps on an Android phone, now that Google will lock sideloading). You can't mod iPhone or Android apps (on an un-jailbroken/un-rooted device). You can't even view the files that make up an app on an iPhone or an Android phone. Anything that could motivate the "smartphone generation" (Gen-Z) to try coding is not there. Gen-Z are "digital natives" but mostly clueless about what an app really is. In fact, some of th
  • Modern software is built on towers of abstractions, each one making development "easier" while adding overhead: Today's real chain: React > Electron > Chromium > Docker > Kubernetes > VM > managed DB > API gateways.

    It doesn't make it easier, it's to force DOM to act like a real GUI at gunpoint. DOM is the wrong tool for the GUI job, time for a new state-friendly GUI markup standard.

    Apps that devs used to make in 3 weeks in VB, Delphi, Paradox, or Oracle Forms now take 7 months and be 5x

  • by pauljlucas ( 529435 ) on Tuesday October 14, 2025 @12:08PM (#65724178) Homepage Journal
    .. but I don't think it means what the author thinks it means. FTFA:

    The Apple Calculator leaked 32GB of RAM. Not used. Not allocated. Leaked.

    First, AFAIK, leaking memory means you allocate it, but don't deallocate it. So how can he say "Not allocated?"

    Second, leaked how? If it's leaking 32GB of RAM on, say, every keystroke, that would be serious; but if it allocates 32GB RAM once on start-up and simply forgets to deallocate it upon termination, it doesn't matter since the OS will reclaim the RAM for the entire process.

    Today's real chain: React > Electron > Chromium > Docker > Kubernetes > VM > managed DB > API gateways.

    OK, those are lots of layers of abstraction and they each use memory, perhaps a lot, and he has a point that modern software tends to use too many layers, but that doesn't mean that any of that memory is leaked: just used.

Based on that part of his rant, it seems he's complaining more about the 32GB size of the (alleged) leak in the Calculator app, i.e., why should a calculator need 32GB? Sure, complaining that a calculator uses 32GB is valid, but that's not a leak, just inefficiency or laziness on the part of the programmer.

• Based on the context clues, the author may have used the word "leaked" exactly as you expected, and then used "allocated" as a synonym for "written," and "used" as a synonym for "written and later read." If I were speaking to the author and uncertain about this, I'd probably ask for clarification directly. People rarely use the exact dictionary definition for every single word they write, and I usually find it more useful to understand what someone intended than what words they should have used.

      Your sign

      • What I asked relates to my comment because I'm NOT assuming he actually meant anything. I'm asking what he meant, or at least I'm pointing out the inconsistency if I were to assume a particular meaning; but at no time did I assume he actually meant a particular thing. That aside, I'd hope that a software engineer would be more precise when using computer-related terms.
  • It is ironic that the memory leak is reported in the current iteration of the Apple calculator.

    The website chronicling the history of Apple (Folklore.org) notes that Steve Jobs wanted a calculator, and wanted the developers to make it look a particular way
    https://folklore.org/Calculato... [folklore.org]

He also wanted a few other odds and ends, but including a sliding block puzzle would have made it too big to fit onto a 400k floppy disk that already had the System and Finder. So the developers made it smaller by hand-coding

  • > A basic calculator app is hemorrhaging more memory than most computers had a decade ago

    32 Gigabytes is still more memory than most computers have nowadays (which would be around 16 GB). A decade ago it probably was more like 8 GB or even just 4 GB.

  • The article claims to measure the severity of a memory leak defect based on the amount of memory it leaked -- but most memory leaks (that are severe enough to be noticed) are small leaks that occur at regular intervals, meaning that the program's memory footprint will continually grow larger over repeated operations.

    Therefore, do you want a 1MB memory leak? Run the program for a while. Do you want a 1GB memory leak? Run the program for that much longer. Keep going, and you can eventually get to any numb
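To make that concrete, a sketch of the growth pattern (a hypothetical 1 MiB-per-operation leak, in C):

        #include <stdlib.h>
        #include <string.h>
        #include <unistd.h>

        int main(void) {
            for (;;) {                           /* one "operation" per second */
                char *chunk = malloc(1 << 20);   /* 1 MiB */
                if (!chunk) return 1;
                memset(chunk, 0, 1 << 20);       /* touch it so it counts as RSS */
                /* BUG: no free(chunk); the footprint grows without bound */
                sleep(1);
            }
        }

    Run it for an hour and you have a "3.5 GB leak"; the headline number measures uptime as much as severity.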

  • by whitroth ( 9367 )

Thirty years ago, I had a course (before I got my B.Sc.) in OOP and GUIs. OO design I found interesting. OOP, not so much - the closer you got to the code, the fuzzier the picture.

    Since then, hell, for about 20 years, I've been saying you want a clipping of Godzilla's toenail, and what they give you is Godzilla, with a tiny frame around part of his toenail. Invoke x, then you can invoke this method x.y, then you can invoke x.y.z... Rather than invoke z directly, because you don't know how to do that, and bec

  • by Somervillain ( 4719341 ) on Tuesday October 14, 2025 @12:49PM (#65724316)
    I thought AI would fix this...I thought WRONG! For my entire career, I have been passionately fighting bloat...and losing. Why? Because I show someone how to do the job in a nice compact way...then they move me to another project and some POS charismatic 24yo rewrites the fucking thing in Python with 40 frameworks...in slightly more time than it took me, but he's younger and better looking...and now management has a choice...listen to the passionate, but slightly autistic greybeard saying "no...we can do this with a minimal Java REST service and it'll nicely fill into your existing architecture"...or listen to handlebar moustached skinny jeans wearing hipster with his metrosexual turtleneck and oversized scarf...telling you about all the cool frameworks he's going to integrate...oh, and he's going to write part of it in node.js...because JavaScript is the language of the internet!!!!...that old guy?...he's trying to deliver something that runs in the language of the old internet...Applets!!!!...My version is tangibly faster...I delivered it faster...it costs less to run, but the hipster swears Python makes you more productive...so management has to figure out which one of us is wrong....and spoiler alert, they're not going to look at the evidence.

Why?...because fucking management is dumb AF....at the end of the day, I need a paycheck. I don't take it personally. It's a mild frustration. This is not my religion. However, I deliver REST services that do the job in megabytes of RAM using 15yo frameworks in Java that are well-known industry standards, faster than the hipsters and even with fewer lines of code....young people who think "Java is hard" try to force everything to Python, and megabytes become 100s of megabytes if not gigabytes...and the fucking thing is slow as molasses and breaks constantly with every change, because Python and Node rely on the developer to write PERFECT unit tests...

    And EVERYONE writes PERFECT unit tests, right?...no need for those annoying compilers getting in the way

    So why did I bring up AI? Well...OK, dumb fucks can't figure out Java...let alone C++/Rust...one of the first use cases I pictured was AI taking your SHITTY SHITTY node and Python apps and rewriting them to not be shitty...it can't be that hard. Take functioning Python and convert it to Rust...or if you insist, Java...or C++...fuck...do assembly if you really like pain...The AI should be able to handle it, right? I've managed to convert a few pieces from Java to Rust nicely...but nope..that doesn't seem to be a popular option. People are mostly using AI to generate new SLOP in sloppy languages. So...instead of taking existing code and making it faster and cheaper to run and lowering their cloud spend, companies just want garbage buggy new python apps...because python is more error tolerant than C/Java (Claude and OpenAI still can't generate code in correct syntax 75% of the time, in my experience....they can't put commas or semicolons in the correct place or match braces). I don't know if that's a limitation of the tools or just the people using them.
  • by laxr5rs ( 2658895 ) on Tuesday October 14, 2025 @12:55PM (#65724336)
I didn't read the blog, but many of the assumptions here are not well thought out. "The Apple Calculator leaked 32GB of RAM." What does this mean, exactly? If there was a story, I missed it. What's the actual story? "We've normalized software catastrophes...." Our computers are a lot more powerful now and can absorb more problems with ease. I used to run 4MB of RAM on my Windows 3.1 box. So...? I'm not defending poor software quality, but comparing present and past computer resources is comparing apples and oranges. "AI just weaponized existing incompetence." This is a toss-it-in, make-myself-look-clever BS statement. We don't know that. "Here's what engineering leaders don't want to acknowledge:" Which leaders? Who is he talking about? "but because nobody noticed the cumulative cost until users started complaining." Nobody? Really? Bullshit. There's all kinds of competency in computing... and all kinds of incompetency. No one noticed... Yeah, it hit us like a bolt from the blue: "hey! some of this software sucks!" Welcome to the world of the real. "We're living through the greatest software quality crisis in computing history." I don't know who this person is, but he sounds young and excitable. Like he doesn't know the past, maybe. "THIS crisis that I'M living through is the WORST!" Really? No, it's not. When people make bad software, it inspires others to attempt to improve it... Crisis? Shitty software? Microsoft used to routinely sell software ideas that didn't even exist yet. This person would do well to calm down. He thinks this is a huge crisis. Really? A child starves to death in the world about every 7 seconds. Do the math over time on that... that's a crisis. Bad software? It's par for the course. Just look backwards a little.
  • Last month I started noticing how bad websites are getting. Tried to order a laptop from the manufacturer's site? Nope, website inexplicably removes next button somewhere in the process. Call to helpdesk: sorry, we know the issue, took them three days to fix it. Wanted to register our car to drive through the LEZ in London. Nope, I could not make an account as my phone number was already used by another account. We tried all our phone numbers, random numbers, helpdesk numbers, everything was taken. Tried ag
  • About 30 years ago Niklaus Wirth published an article titled "A Plea for Lean Software" which got summarized as Wirth's Law "software is getting slower more rapidly than hardware is becoming faster". Nobody seems to have paid attention.
• 10-15 years ago there was such a split in web engineering. They wanted to make everything on the web look like an app, and a lot of backend guys hate anything that looks like UI, so let's have an amicable divorce and do everything through these god-awful endpoints, so the backend folks don't have to touch UI and the frontend folks can think they're "more real" engineers by making stuff that looks like a black-box app instead of enjoying the natural versatility and iterability of the old web.

I'm sure I'll never get hired for it, but good ol' PHP (hell, for most things I skip the MySQL; a poor man's no-SQL with JSON files on the file system works and scales well for so many things)... vanilla JavaScript can even be beautifully declarative when you want it to be, with string templates building up whatever new DOM you didn't get from the server. I have these sites that last for decades, and when it comes time to add something, they're easy to figure out and adapt, and there's no library hell (browsers have gotten so GOOD yet still so backwards-compatible over the years).

    So I look for like minded souls using terms like "buildless" and "evergreen". But it's like an underground movement...

• We don't have time to write programs the way they should be written. We just barely have enough time to write them at all.

This is the same as it always was. Software today is probably better than it has ever been. People just don't remember what it used to be like. Windows 95 crashed all the time, and that was the entire OS going down regularly. Every application was equally buggy. This is not new. It's only new to people who don't remember what it used to be like.
    • It's only new to people who don't remember what it used to be like.

      Absolutes like this ABSOLUTELY warrant a "citation needed" response.

  • Quality takes time: time to design, time to implement, and time to test before release.

    The market has favored time to market and continuous delivery over quality. If we start putting a premium on quality then quality will return, plain and simple. Perhaps the pendulum will soon turn around, perhaps not.

• The first computer I programmed on had 4K RAM - the TRS-80.
    The first computer I used on a job had 256K RAM - the PDP-11/44.
    The first computer I paid for myself had 128K and a 400K floppy - the OG Macintosh.

    On the latter, you could fit the entire OS (less some fonts and desk accessories), MacWrite and MacPaint on one floppy.

We got lots of useful $#!+ done on those computers. It's not just errors, it's laziness.

    • My first computer had 256 bytes of RAM. I had to hijack some of the video RAM to load BASIC programs. (TI-99/4A)

      It was barely more powerful than a programmable calculator until you upgraded it with expansion RAM. Even in the bare bones configuration you could do useful things on it. Maybe not a spreadsheet or word processing, but these were computers in an era where most people didn't think they needed a computer.

  • And this is what I've been saying all along. I started my computing journey with 4 KiloBytes of main memory. No kidding. 4K. And I've always had to be careful with my mallocs and garbage collection and disk space and everything else, including power usage. A lot of junior coders that I encounter today want to start with basically unlimited cores, RAM, and disk space. Of course they do, they're not paying the hosting bill, whether it's AWS or OVH or Hetzner. And until you threaten to take the AWS bill

  • Dave Cutler, of Microsoft Windows NT (the basis of 2000, XP, 7, 8, 10, and 11) fame, had signs posted around the Microsoft offices that said "Software quality is not a crime."

  • Let's take a look at software sizes, for a moment.

    UNIX started at around 8k, and the entire Linux kernel could happily sit in the lower 1 megabyte of RAM for a long time, even with capabilities that terrified Microsoft and Apple.

The original game of Elite occupied maybe three quarters of a 100k floppy disk and made extensive use of swapping and data files to create a massive universe that could be loaded into 8k of RAM.

    On a 80386SX with 5 megabytes of RAM (Viglens were weird but fun) and a 20 megabyte hard d

  • And nary a joke to be seen? Sadness.

• A completely ignorable, easily and freely replaceable app that most people don't keep open on their computer for more than 15 seconds at a time leaking memory is hardly a catastrophe. It barely qualifies for a "meh".

  • Today's real chain: React > Electron > Chromium > Docker > Kubernetes > VM > managed DB > API gateways.

    So, how else am I going to get all that stuff on my resume?

• The title talks about a software quality collapse. That implies that software quality was better in the past. While we all bemoan the current state of software quality, how many of us actually think, "Remember the good old days when software quality was good!"? Or is this complaint-fest simply an acknowledgment of the ever-present challenges of engineering software quality, challenges that are not necessarily different from before? As with all things in life and not just software, it's easy to complain abo
