
The Great Software Quality Collapse (substack.com) 182
Engineer Denis Stetskov, writing in a blog: The Apple Calculator leaked 32GB of RAM. Not used. Not allocated. Leaked. A basic calculator app is hemorrhaging more memory than most computers had a decade ago. Twenty years ago, this would have triggered emergency patches and post-mortems. Today, it's just another bug report in the queue. We've normalized software catastrophes to the point where a Calculator leaking 32GB of RAM barely makes the news. This isn't about AI. The quality crisis started years before ChatGPT existed. AI just weaponized existing incompetence.
[...] Here's what engineering leaders don't want to acknowledge: software has physical constraints, and we're hitting all of them simultaneously. Modern software is built on towers of abstractions, each one making development "easier" while adding overhead: Today's real chain: React > Electron > Chromium > Docker > Kubernetes > VM > managed DB > API gateways. Each layer adds "only 20-30%." Compound a handful and you're at 2-6x overhead for the same behavior. That's how a Calculator ends up leaking 32GB. Not because someone wanted it to -- but because nobody noticed the cumulative cost until users started complaining.
[...] We're living through the greatest software quality crisis in computing history. A Calculator leaks 32GB of RAM. AI assistants delete production databases. Companies spend $364 billion to avoid fixing fundamental problems. This isn't sustainable. Physics doesn't negotiate. Energy is finite. Hardware has limits. The companies that survive won't be those who can outspend the crisis. They'll be the ones who remember how to engineer.
Part of this decline is all MBA-driven (Score:5, Insightful)
Part of this decline is MBA-driven.
Beancounters ruin every shop they run, whether it be Boeing or any of the ones I've worked at where Mr. Moneybags controls IT.
In shops where Mr. Moneybags (the CFO) does not control IT, I've noticed things are much better for IT.
This. (Score:2)
IT, whether it be software, hardware or the people that tie it together, is seen by all-too-many CxOs as a commodity cost to be minimised, rather than an asset and competitive advantage to invest in and nurture.
Of course, those same CxOs don't take any responsibility for the failures they create either...
Re: (Score:2)
Re:Part of this decline is all MBA-driven (Score:5, Insightful)
Short-term planning and profit is everything to these cretins. Every one of them is essentially a get-rich-quick preacher. That destroys things reliably. In some sense this is a variant of the "Tragedy of the Commons": Nobody makes things sustainable, nobody gets things to a stable and dependable state, nobody looks at the longer-term effects.
I mean, Microsoft is making record profits while their cloud gets hacked again and again and their OS is crumbling. Same thing for Boeing, with their planes falling out of the sky while they have forgotten how to design planes. Same thing for Intel (with the profits there more reflecting their actual performance but still being too high). Same thing for Crowdstrike, which again has record profits after screwing up so badly that it could hardly have been any worse.
This cannot go on. We need regulation and liability. We need to recognize that IT is critical infrastructure. We need to send CEOs that endanger it to prison until they all have gotten the message.
Re: (Score:3)
Re: (Score:2)
Thanks. And don't worry about it, you just followed up on it, after all.
Re: (Score:3)
32GB of RAM means nothing
Just add more swap space. Thank goodness that's free.*
*An actual comment I found in some crappy code. Most likely put there by someone attempting to debug it.
Re: (Score:2)
All that does is collapse into crazy levels of swapping. The whole system bogs down. Which will likely be why the bug reports were happening in the first place.
Re: (Score:2)
Re: Part of this decline is all MBA-driven (Score:3)
I think the poster knew that, but not the "dev" that actually wrote the original comment.
Related story:
We once had a production server where Java went from 2GB of RAM usage to 10GB in 2 minutes. I told the main dev he had a bug in his code. He went over my head to the VP and forced us to upgrade the servers to 20GB (we had VMs so it was possible). A week later, same day, same time, the RAM went from 2GB to 20GB. We could not double the RAM again, so they actually looked at the bug. The issue was a customer click
Re: (Score:2)
Right now, the OS's OOM killer will do it... but until then, it causes extreme memory pressure on a system, forcing swap, which will nosedive performance on the machine until that process is killed. It may cause other processes to see that the machine has a lot of memory pressure and quietly freeze or exit.
Historically, failing to keep track of RAM was a bad thing, especially pre-OS X on Macs, before the OS had "hard" memory protection, and a memory leak like what is mentioned would cause a complete system cr
Re:Part of this decline is all MBA-driven (Score:5, Interesting)
The Linux oom_killer is really, really bad when running on Microsoft Hyper-V and Azure. The ballooning of memory causes it to think the VM has run out of memory, when it hasn't.
VMware and KVM do not have this problem.
Re: (Score:2)
The Linux oom_killer is really, really bad.
Fixed that for you. What you really want is earlyoom [github.com].
Re: (Score:2)
Makes me wonder if AI can take something from the top layer, go through the entire stack, and compile something in machine code directly, all statically linked, with bounds checking. Basically a next-gen transpiler/compiler.
Could a commercial AI do that? Yes. Will it? No. Follow the money. Firstly, RAM is free, because the customer pays for their hardware, not the app creator. Secondly, the driver behind AI coding is not to make things better, it is to make things cheaper by making them faster; quality is not a requirement.
There are two exceptions to this. Firstly, the AI providers' internal programs. Training an AI is resource heavy, so you can be sure they have optimised that as much as possible, to save their own money. S
Re:Part of this decline is all MBA-driven (Score:5, Insightful)
> 32GB of RAM means nothing
Well, it's about double the amount of RAM most modern computers come with, so I'm not sure it's nothing. Are you sure you're not confusing GB with MB?
That said, I'm curious to know where the figure comes from. Modern operating systems are extremely unreliable at reporting memory usage figures. According to 'top' right now on my PC, Pluma, a very basic text editor shipped with MATE, is using either 15.5g or 8g. There's a Java app that's either using 23.1g (Virt) or 2.1g (Res). Firefox's WebExtensions process is either using 18.9G (Virt) or 0.25G (Res). And... well, you get the idea. By rights, just running Firefox on that on my 64G machine should be putting me in swap hell. But oddly enough... it isn't.
It isn't because those memory measurers aren't actually all that reliable. Modern operating systems allocate pages when they're written to, and modern programming languages assume they run on modern operating systems that will do that. So the entire addressable memory a program can see is usually measured in gigabytes or even terabytes, while the amount it uses is more normal. To add to the confusion, shared memory is extremely common and normally involves huge addressing spaces. Which means the figure reported by top is, if not useless, very nearly useless.
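To make that concrete, here is a minimal, hypothetical sketch (plain C, Linux-only, and obviously nothing to do with Apple's actual Calculator) of how a huge malloc balloons the VIRT figure top reports while resident memory barely moves until the pages are actually written:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Print the VmSize (virtual) and VmRSS (resident) lines from /proc/self/status. */
static void print_mem(const char *label) {
    char line[256];
    FILE *f = fopen("/proc/self/status", "r");
    if (!f) return;
    while (fgets(line, sizeof line, f))
        if (!strncmp(line, "VmSize", 6) || !strncmp(line, "VmRSS", 5))
            printf("%-9s %s", label, line);
    fclose(f);
}

int main(void) {
    print_mem("start:");
    char *p = malloc(8UL << 30);      /* reserve 8 GB of address space             */
    print_mem("malloc:");             /* VmSize jumps by ~8 GB, VmRSS barely moves */
    if (p) memset(p, 1, 1UL << 30);   /* actually write 1 GB of it                 */
    print_mem("touched:");            /* now VmRSS grows by roughly 1 GB           */
    free(p);
    return 0;
}

On a 64-bit Linux box with default overcommit the malloc "succeeds" instantly; only the memset costs real pages. Which is exactly why a VIRT-style figure tells you almost nothing about whether anything is actually leaking.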
In order to believe the Mac's Calculator app has allocated 32GB, we have to believe that:
1. It's leaked that amount of memory, or loaded so much crap - images, etc - that it somehow took up that many pages of memory. Neither is likely. Memory leaks take time to show up. And even uncompressed, high-resolution 12bpp versions of every single calculator button wouldn't take up 32GB.
AND
2. That it nicely "leaked" a power-of-two amount of memory.
I don't buy it.
Programmers have, as you point out, always treated memory as something to waste, but I think the story's assertion is bunk.
Re:Part of this decline is all MBA-driven (Score:5, Interesting)
Re: (Score:2)
It means nothing because if any app gets stuck in a loop allocating memory, it will keep allocating more and more until it crashes or the OS kills it. The bug is the same if it does it on a DOS machine with 640k RAM and allocates all of that, or if it does it on a modern server and allocates a terabyte of RAM.
We have had looping memory allocation bugs since the invention of memory allocation and recursion.
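For anyone who has never tripped over one, a bare-bones hypothetical version of that bug looks like this (plain C, but the shape is the same in any language):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    size_t total = 0;
    /* The "stuck in a loop" part: allocate forever and never free.
       On a 640K DOS box the malloc fails almost immediately; on a big
       modern server the crash or the OOM killer just arrives much later. */
    for (;;) {
        char *p = malloc(1UL << 20);         /* 1 MB per iteration, never freed */
        if (!p) {
            fprintf(stderr, "out of memory after %zu MB\n", total >> 20);
            return 1;
        }
        memset(p, 0xAA, 1UL << 20);          /* touch the pages so they count   */
        total += 1UL << 20;
    }
}

Only the time to failure changes with the hardware; the defect itself is identical.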
What is the effect of the leak? (Score:3)
Re: (Score:2)
I think you'll find that leaking means written to. And the SSD is swapping like crazy. Hence the bug complaints.
Re: (Score:2)
But the OP says the 32GB was "Not used. Not allocated. Leaked." It's a little hard to parse, but if true, then maybe the actual effect is truly negligible.
Re: (Score:2)
I think:
Not used = not legitimately used, as in trying to calculate for example 9^(9^9) with all digits (which obviously will blow up)
Not allocated = not only saying it needs that much and in fact not using any real resource
Leaked = just some garbage pushed there and forgotten (as in unused, but still taking the space)
Re: (Score:2)
Same on Linux. This is really bad software engineering, but it gets made worse by a really bad OS (Windows).
Re:What is the effect of the leak? (Score:4, Insightful)
I bet the problem is that parts of the original calculator app are written in Objective-C from before there was automatic reference counting, and someone messed with the legacy code and didn't free memory properly - a good old-fashioned dangling-pointer leak.
It should not be possible for this sort of bug to occur in a modern language with garbage collection (or even ARC unless you're going crazy with your program structure).
Re: (Score:2)
On iOS, malloc(1ull << 35) will succeed. It will reserve 32 GB in the address space but nothing else.
Well, at least some page table entries were created. The free-block list in the heap was extended as well (though this could be avoided if only brk or mmap were used). The operation has some overhead, but it is very little if the allocated memory is never used.
Re: What is the effect of the leak? (Score:2)
SSDs will wear out, so swapping to them for no reason means your machine will break sooner. On Apple machines, where you (usually) cannot replace the internal storage, that means you will eventually brick your machine. My wife has a 2018 Mac mini that still works fine, but we never used the internal drive and got an external one instead. Sure, it is not as fast, but she can't tell the difference and we can upgrade to a larger capacity as needed for a lot less.
Answer is simple: (Score:3, Insightful)
We need to stop shifting the blame onto costs (it's a bullshit excuse IMO; not WANTING to spend the resources, time and money to allow quality to shine =/= being unable to).
We need to stop blaming complexity when we don't give developers the time and space needed to explore that complexity and work with it.
Companies need to learn that gaining X amount of money slower than before is not only still earning X amount, but more worthwhile because the issues (and thus the costs of dealing with them after) will be better mitigated.
So much in this fucking industry needs to change if software quality is to improve.
Re:Answer is simple: (Score:5, Insightful)
Actually we need to bring down complexity. No amount of time and money will make the current complexity in the IT space sustainable.
And there is another thing: Disregard for KISS and high complexity is amateur-hour. Actually competent engineers avoid complexity like the plague it is.
Re: (Score:2)
Actually competent engineers avoid complexity like the plague it is.
Yeah but avoiding complexity takes a lot of time. It's like the saying 'I didn't have time to write a short letter...'. Especially with powerful high level languages, I can just bash out something that works extremely quickly. Things like JavaScript where what they call 'dynamic programming features' could also be construed as 'no respect for any sort of useful scoping rules' means you can pull assets around your project like globals on steroids. This is damn fast if you just want to mock something, but it'
Re: (Score:2)
It definitely is much more effort and takes more skill and experience to respect KISS than to just heap complexity on complexity. That is why high complexity in systems is an almost sure sign of the designers being amateurs or being forced to do it on the cheap.
The problem with "management" is that management is dumb and often greedy and cannot do long-term planning. The only way to get that under control is regulation and liability. And a few "managers" that got in the way of engineers trying to do it right in
Re: (Score:3)
The problem with "management" is that management is dumb and often greedy and cannot do long-term planning.
They can do long term planning, but their incentives are about doing things that improve their own career. (If you want people to change, their incentives need to change).
Re: (Score:3)
Re: (Score:2)
Yes, very much so. Obviously, if you keep adding too much, at some point it all collapses. But if you start with a good solid as-simple-as-possible design and architecture, that point comes much later.
Re: (Score:3)
No. Just no.
Re:Answer is simple: (Score:5, Interesting)
"If you are not embarrassed by the first version of your product, you've launched too late"
Reid Hoffman originally coined this sentence when discussing startup culture and the launch experience of LinkedIn.
This is the reality of the market. People will jump on the first service available, be it bad or otherwise. The competitor that releases its stuff 1 year later has already lost the race.
So the bad technical decision makes market sense. You can either make money with fast crap or go bankrupt with a well engineered project.
Re: (Score:3, Interesting)
Nope. Most software engineers I've known spent most of their days in planning meetings, scrums, side engineering meetings, off-sites, training, Slack, coffee breaks, and pretty much anything else that wasn't heads-down coding. The actual amount of code they produced was minimal (good quality, but very minimal). We need to remove the additional noise so they can use the 40 hrs to focus on their core job functionality. AI will hopefully provide assistance with the boring parts, but come on, slow down? Yeah people do
Streaming Apps (Score:5, Interesting)
OK. Old man rant. But I've been saying this for years about streaming apps. How is the overall quality so awful?
It is *stunning* how bad they are and people just put up with them. If you have two or three streaming apps, you will be constantly playing whac-a-mole with the same basic set of player bugs all the time. And somehow, even trillion dollar companies can't handle syncing where I am watching a show across multiple devices, it's the software challenge of our time.
When I press the "next show" button in a series, I really, really, really . . . don't mean the episode I just watched two days ago, I mean the next one. Apparently no one is able to even come close to doing this reliably.
For years now, Apple has been pushing a big feature everyone seemed to think they wanted: a unified queue across many apps. The thing is, it's terrible. In theory, it shows you where you are in a show/movie, in a list, across multiple apps, on multiple devices. Watch half a movie on my phone, then launch it on Apple TV, and it should go to that same spot. This ridiculously simple shortcut feature, from this trillion-dollar company, works, at best, 50% of the time.
And the thing is . . . streaming is supposed to be the hot new business every tech company can't stay out of. Amazon and Apple, huge software companies, spend tons for shows, and produce terrible software. Disney is basically staking the company on it, same for WB. *How* is it so bad, while also being so basic?
Re: (Score:2)
Re: (Score:2)
How is the overall quality so awful?
Because streaming apps no longer exist. They are front-end wrappers around the streaming company's website, written in Electron. Calling them apps implies a level of thought and development that simply wasn't applied.
Re: (Score:2)
Re: (Score:2)
I agree (Score:4, Insightful)
Software is typically being done incompetently and that seems to slowly be getting worse. This is mostly in the Windows-space, including the OS. For example Win 11 crashed on me 2 times in the first hour of using it, while Win10 was rock-solid on the same hardware. And the GUI is infantilized and decidedly has worse usability. What can you expect from application development, if this is the culture being set?
At the same time, requirements increase. Software now runs everything, attackers have solid business models and are more and more of a real threat. AI makes that worse as it mostly helps the attackers and it helps them a lot.
This is headed for a big crash. And yes, my DNS/email/web servers are all on Linux and I have a Linux desktop as well. But how does that help me if most things do not run anymore because Azure got fully compromised (again) and this time the attackers take it hostage or just want to destroy it?
I hope the crash will not be civilization-ending. And after that we hopefully get engineering practices in the IT space that match other engineering disciplines. I mean, you do not expect your car to fall to pieces while driving it, do you? (Unless it is a Cybertruck or a Jeep, of course...) Amateur hour really has to stop in the software and general IT space or we will see things fall to pieces.
Re: (Score:2)
I think that "AI programming" vibe will only make this worse.
Re: (Score:3)
Oh, definitely. It makes the defenders weaker and the attackers stronger, because attack software does not have to be reliable or secure. And I think we may see a real software maintenance catastrophe from that stupidity in a few years as well.
Re: (Score:2)
I'm finding Claude is really good at removing layers of abstraction - I just went through a stack of shipping code that created FedEx labels; there were multiple layers of abstraction on top of some code written years ago. I had the LLM go through and develop a spec for what each method needed to do, propose a clean interface, and then rewrite the existing code into the new architecture.
Then I can have it find a common interface between my UPS and FedEx code and wrap those up a
Re: (Score:2)
One problem I have with Windows has been the ongoing obfuscation of controls. Replacing Control Panel with Settings is something I suppose needed to happen eventually. I personally dislike the style of Settings over Control Panel. However, it appears to me Settings is just a front end to the actual controls, and Windows can lie to/hide things from the user. Before, Control Panel showed the actual settings.
For example, Add/remove programs appears to remove optional Windows components but in reality it does not. For ex
The great writing quality collapse (Score:5, Insightful)
Most of the "sentences," as defined by punctuation, are really just phrases. The article was published on Substack, so not only are there no editors, but apparently no standards. Please take time to edit your work, otherwise it's illegible. I'll take a stab at repeating what I just said, but in that same terse style:
The author used sentence fragments. Hardly 5 words. Barely even phrases. What the hell? Twenty years ago, this would have triggered editorial review. Not now. Nobody cares! It's awful. Here's what bloggers don't want to acknowledge: writing takes time and effort. It's a skill. And guess what? If you write well, people will have an easier time understanding your point. If there ever was one.
Re: (Score:2)
Seriously. I agree with the author, but the writing was atrocious. Also, at least one of his examples of problems was with beta software. It's built to have problems like this, not really an issue for me if my known beta software doesn't work right. I'm glad someone else noticed this too.
Re: (Score:2)
A consequence of "the great writing quality collapse" is the great comprehension collapse. One can reasonably ask whether Stetskov's poor writing is by default or by design. I spent decades both working in higher education and reading tech news. The result in this case was that, despite all my training, I didn't even notice the poor writing until I read this post.
Not surprising (Score:5, Insightful)
"Move Fast and Break Things" inspired a generation of incompetence. Note that actually fixing the things you broke gets swept under the rug.
It seemed to be an almost orgasmic revelation to the business-types when they realized they can continually ship broken software and nobody cares.
Re:Not surprising (Score:5, Insightful)
"Move Fast and Break Things" inspired a generation of incompetence.
I think there were three elements of this mindset that were assumed knowledge on the part of the person who said it:
1. "Move fast and break things...in a development environment where possible".
2. "Move fast and break things...in a way that is easily reversible." (see #1)
3. "Move fast and break things...and assume they will break, so assume you'll be fixing what broke" (see #2 and #1).
I can appreciate that Facebook can have this mindset, and in the case of a social network, there *is* an element of wisdom in not treating it like the IBM-of-old that overengineered EVERYTHING, making it super-reliable, but also making development very slow and very expensive. Facebook's focus on agility makes perfect sense for the nature of the work.
This doesn't work in every field, though. From finance to medicine to engineering, the costs are much, much greater than the loss of cat videos. Just because something makes sense in one field, doesn't mean it makes sense in EVERY field...and unfortunately, there are very, very few MBAs who understand the one thing that is more valuable than money: wisdom. Wisdom can earn money, but money can't buy wisdom.
Re: (Score:2)
very few MBAs who understand the one thing that is more valuable than money: wisdom.
No, it isn't more valuable. In fact, unused, it has no value at all. What IS more valuable than money is time. It is finite and eventually we all use up our share. The problem here is not intellectual. The problem is that money is the value by which success is measured. Most software is junk because it's cheaper to produce than a quality product, and the cost of users' time is irrelevant to the producer; capturing the cost of the wasted time is hard to factor into the choice of software.
Re: (Score:2)
"Move Fast and Break Things" inspired a generation of incompetence. Note that actually fixing the things you broke gets swept under the rug.
That's not incompetence, that's corruption. And it's not just tech bros, it's ubiquitous across our society, from top to bottom.
Re: (Score:2)
I might be wrong but I don't think move fast and break things referred to that which you were building. It was more about don't worry about the consequences for the rest of the market place, perhaps don't worry about the consequences for cooperative shared infrastructure, like shoveling tons of data over DNS, or say abusing NTP to distribute a bunch of very large binaries..
Another example would be Electron apps; the move fast and break things does not apply to your own app, it applies to using the giant fra
Nope, software's been crap for years (Score:2)
And in 2005, if the Apple Calculator had leaked arbitrary amounts of memory... it just would have been a bug report in a queue. Because it's just a little utility, not an emergency.
He inadvertently disproves his own thesis by linking to the Spotify bug. It's from 2020, when his chart was still in the green, though it's likely there are multiple different memory leak bugs discussed in the thread.
It's true that in the much-further distant past there were fewer bugs of this nature. Or at least fewer that ma
This really, REALLY isn't new (Score:2)
I remember in the late 90's / early 2000's discussing with a colleague how it was possible that X allocated 2 megabytes for an empty window, just for sitting there on the screen.
Resources allow the incompetent to make products (Score:5, Insightful)
When you have 32 kilobytes of RAM and a 1 MHz processor, you need all the programming talent you can get to squeeze the most performance out of them.
When you have 32 gigabytes and dozens of cores, any incompetent code monkey can churn out the same application in Visual Basic or Python.
Resources don't make your computer faster. They empower incompetent and sloppy developers, who crucially are paid less than good ones, so their boss can make more money.
Perspective from an old timer (Score:3)
I learned software engineering in the 70s. Back then, there weren't very many of us and even fewer who were really good at it. As a result, we were paid really well.
Then the news spread, software is the key to riches.
Pundits, advisors and politicians told kids that everybody needs to learn to code.
This resulted in a flood of CS students of varying talent. The talented ones worked hard, the not-so-talented ones somehow ended up with diplomas and entered the market.
At the same time, the beancounters wanted to reduce costs. This created a market for a wide variety of tools, frameworks and languages, optimized for mediocre, cheap programmers. The tools and frameworks were often buggy, opaque black boxes, but nobody cared.
And also at the same time, processors were getting fast, really fast. Along with giant memory and storage, it encouraged companies to optimize solely for development speed and cost, while ignoring performance and code quality.
Now, the fad of the day is "vibe coding" that promises to allow people with absolutely no engineering skill or training to create code with a text prompt.
Expert software engineers still exist and use AI tools effectively where they work well, but the tsunami of crap continues to overwhelm us.
DHH spoke about the same thing recently (Score:2)
Not evangelizing Ruby, but the start of the keynote made a lot of sense.
https://www.youtube.com/watch?... [youtube.com]
Re: (Score:2)
It isn't just software (Score:3, Insightful)
It is likely true that software quality is dropping. But the important point I would like to make is that quality elsewhere is horrible too. Our relatively new house is on its third bathroom sink faucet in about 12 years total time. I cannot fathom how this could be so bad. Car quality, parts, engines, transmissions all of it is worse. Worse parts, worse designs, it all is bad and so much more expensive. A twenty year old car w/ only front wheel drive, a four speed transmission, a reasonable power v6 in comparison is SO solid. A little worse MPG but that is it, and sometimes that isn't so clear cut. I'm sure there are examples of things that have improved, and others that have gotten worse. But those were a couple I can think of off hand. A lot of this is driven by big government making decisions for us, even during republican administrations ironically. Fuel efficiency standards go back to Bush. Anyways, enjoy, the future is gonna suck. And be expensive.
Farmers will bitch about a def burn/regen, but still buy the new huge combine because even with sitting for 45 minutes, they get so much more done so fast it is unreal. So they buy/rent huge equipment they cannot work on because they run so many acres they really have no other options. Our government bankrolls all their risk so land/rent values keep going up and everyone is too happy to question anything. Red America complains about market access while voting in Trade War Trump. We're all so stupid. Nevermind GMO everything. If only we could make tofu w/ all this cheap soy. Nope, we gotta feed it to cows/pigs/chickens. No one cares about the river and aquifer water quality, or how expensive it is to treat for high nitrate levels. Sorry for the slightly unrelated farming rant. But I really wish our local river was cleaner. It is never a focus and so sad. But I'd refer to it as a "water quality collapse." Ironically you have to pay farmers for buffer strips and CRP, I'm not sure they'd do it on their own. Left to their own devices they're even ripping out the shelter belts around here.
Re: (Score:2)
Car quality, parts, engines, transmissions all of it is worse.
BS. A car that lasted 100,000 miles was unusual and getting 20 mpg was outstanding mileage. Gas stations checked the oil with a fill-up because most cars burned oil constantly. There is a lot of modern junk, just compare your jeans to a pair from the 60's. But in our society, you buy something new and throw the old things away even when they still serve their purpose well.
Software developers have made that an institution. Nobody thinks a 10-year-old computer should be able to run the newest software. And no one thinks a new computer wouldn't have new software with a modern look. Just listen to the discussions about the need to replace legacy software because it was written in COBOL. It's still working as designed and rock solid, but it needs to be replaced with modern software with a multi-colored interface. Simple web sites are "upgraded" with complex new websites that have lots of new features that users have to learn just to use the site. The reality is that this drives the industry, and bugs are part of that. As the COBOL example shows, creating rock-solid programs without bugs just eliminates a future customer for new software.
Marketing/MBA driven slop... (Score:5, Insightful)
If I had to pin on why software quality is so crappy, it is because of a few factors:
* Companies don't have to do better. They can ship something that doesn't even run, and make money. At most, they can ship something barely functional so the implied promise that it runs is taken care of. However, even that can be disclaimed in the TOS/EULA agreement.
* Devs are viewed as fungible. They are not getting what marketing wants, even if it is impossible? Offshore and outsource until management is happy.
* The entire Agile/Scrum system was well-intentioned, but has become nothing except beating on devs until they quit. I worked at one software place that had daily 4-6 hour standups, where every deliverable was asked for, from every dev, and the Scrum master personally insulted the devs every day. Then people would whine that they were blocked and point fingers, and the person pointed at would start cursing at them. After that exhausting kangaroo court, not much got done after lunch. Even worse, I was in Ops, and was dragged into the meetings, and often considered the cause of deadlines slipping... because I refused to put things into production unless they went through some type of testing first. This was after a dev used a root PW on a production machine, threw his code in, and caused an insanely expensive outage. Code quality was absolute garbage. Security? Hah!
Security was out the window, because the devs knew that if the company had software that caused a major breach, there were many layers between the legal team and them, while not getting a deliverable out the door meant the boss saying tata to their job. So, on install, SELinux was disabled, firewalld was masked out, and the program would quietly exit with an error code zero if not run by root. All DB access was done by the SYSTEM or sa user, and the program would just exit out, error code 0, if that wasn't present. The devs didn't care... to them and the Scrum master, security has no ROI.
Want something done right? Don't do permanent sprints, and allow for times where the entire code base goes through a complete refactor, perhaps every 6-12 months. Get that technical debt paid off.
Overall, this makes me glad I work with embedded programming. Stuff that is acceptable for web pages isn't going to cut it when at most, you are programming on the resources of an Apple ][+. You are not going to vibe program stuff in that environment.
That's unfortunate (Score:3)
Gartner predicted this when they critiqued Microsoft's new philosophy of releasing when the software is "just good enough." Inherent flaws and known bugs were ignored.
Now that that's what everyone does, expect it to get worse.
Let's all get back to basics... (Score:2)
He makes a bit of a valid - if poorly articulated - point.
We should be using assembly language! More efficient!
Re: (Score:2)
Nuke the Evil DOM!* (Score:2)
It doesn't make it easier, it's to force DOM to act like a real GUI at gunpoint. DOM is the wrong tool for the GUI job, time for a new state-friendly GUI markup standard.
Apps that devs used to make in 3 weeks in VB, Delphi, Paradox, or Oracle Forms now take 7 months and be 5x
"Leak": he keeps using that word .. (Score:3)
First, AFAIK, leaking memory means you allocate it, but don't deallocate it. So how can he say "Not allocated?"
Second, leaked how? If it's leaking 32GB of RAM on, say, every keystroke, that would be serious; but if it allocates 32GB RAM once on start-up and simply forgets to deallocate it upon termination, it doesn't matter since the OS will reclaim the RAM for the entire process.
OK, those are lots of layers of abstraction and they each use memory, perhaps a lot, and he has a point that modern software tends to use too many layers, but that doesn't mean that any of that memory is leaked: just used.
Based on that part of his rant, it seems he's complaining more about the 32GB size of the (alleged) leak of the Calculator app, i.e., why should a calculator need 32GB? Sure, complaining that a calculator uses 32GB is valid, but it's not a leak, just inefficiency or laziness on the part of the programmer.
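To illustrate the difference: a hypothetical sketch of the two cases (plain C, sizes made up for illustration):

#include <stdlib.h>
#include <string.h>

/* Case 1: leak a little on every event. The footprint grows without bound
   the longer the program runs -- this is the serious kind of leak. */
static void on_keystroke(void) {
    char *scratch = malloc(4096);             /* never freed                     */
    if (scratch) memset(scratch, 0, 4096);    /* written, so it costs real pages */
}

/* Case 2: allocate once at startup and never bother to free. Wasteful but
   bounded, and the OS reclaims the whole process image at exit anyway. */
static char *startup_cache;

int main(void) {
    startup_cache = malloc(32UL << 20);       /* 32 MB held for the process lifetime */
    for (long i = 0; i < 1000000; i++)
        on_keystroke();                       /* case 1 alone grows to roughly 4 GB  */
    return 0;                                 /* case 2 simply vanishes at exit      */
}

One is a defect that will eventually take the machine down with it; the other is, at worst, sloppy bookkeeping.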
Re: (Score:2)
Based on the context clues, the author may have used the word "leaked" exactly as you expected, then used "allocated" as a synonym for "written," and "used" as a synonym for "written and later read." If I were speaking to the author and uncertain about this, I'd probably ask for clarification directly. People rarely use the exact dictionary definition for every single word they write, and I usually find it more useful to understand what someone intended than what words they should have used.
Your sign
Re: (Score:2)
Historical irony (Score:2)
It is ironic that the memory leak is reported in the current iteration of the Apple calculator.
The website chronicling the history of Apple (Folklore.org) notes that Steve Jobs wanted a calculator, and wanted the developers to make it look a particular way
https://folklore.org/Calculato... [folklore.org]
He also wanted a few other odds and ends, but including a sliding block puzzle would have made it too big to fit onto a 400k floppy disk that already had the System and Finder. So the developers made it smaller by hand-coding
32 Gigabytes of RAM (Score:2)
> A basic calculator app is hemorrhaging more memory than most computers had a decade ago
32 Gigabytes is still more memory than most computers have nowadays (which would be around 16 GB). A decade ago it probably was more like 8 GB or even just 4 GB.
The article's premise is flawed (Score:2)
The article claims to measure the severity of a memory leak defect based on the amount of memory it leaked -- but most memory leaks (that are severe enough to be noticed) are small leaks that occur at regular intervals, meaning that the program's memory footprint will continually grow larger over repeated operations.
Therefore, do you want a 1MB memory leak? Run the program for a while. Do you want a 1GB memory leak? Run the program for that much longer. Keep going, and you can eventually get to any numb
OOps (Score:2)
Thirty years ago, I had a course (before I got my B.Sc) in OOPs and GUI. OO design I found interesting. OOP, not so much - the closer you got to the code, the fuzzier the picture.
Since then, hell, for about 20 years, I've been saying you want a clipping of Godzilla's toenail, and what they give you is Godzilla, with a tiny frame around part of his toenail. Invoke x, then you can invoke this method x.y, then you can invoke x.y.z... Rather than invoke z directly, because you don't know how to do that, and bec
I'm on the frontlines fighting this daily (Score:3)
Why?...because fucking management is dumb AF....at the end of the day, I need a paycheck. I don't take it personally. It's a mild frustration. This is not my religion. However, I deliver REST services that do the job in megabytes of RAM using 15yo frameworks in Java that are well known industry standards, faster than the hipsters and even with less lines of code....young people who think "Java is hard" try to force everything to Python and megabytes becomes 100s of megabytes if not gigabytes...and the fucking thing is slow as molasses and breaks constantly with every change because Python and Node rely on the developer to write PERFECT unit tests...
And EVERYONE writes PERFECT unit tests, right?...no need for those annoying compilers getting in the way
So why did I bring up AI? Well...OK, dumb fucks can't figure out Java...let alone C++/Rust...one of the first use cases I pictured was AI taking your SHITTY SHITTY node and Python apps and rewriting them to not be shitty...it can't be that hard. Take functioning Python and convert it to Rust...or if you insist, Java...or C++...fuck...do assembly if you really like pain...The AI should be able to handle it, right? I've managed to convert a few pieces from Java to Rust nicely...but nope..that doesn't seem to be a popular option. People are mostly using AI to generate new SLOP in sloppy languages. So...instead of taking existing code and making it faster and cheaper to run and lowering their cloud spend, companies just want garbage buggy new python apps...because python is more error tolerant than C/Java (Claude and OpenAI still can't generate code in correct syntax 75% of the time, in my experience....they can't put commas or semicolons in the correct place or match braces). I don't know if that's a limitation of the tools or just the people using them.
These statements are very poorly written, at least (Score:3)
Slowly downwards (Score:2)
Wirth's Law (Score:2)
old beautiful LAMP stack - buildless, evergreen (Score:3)
10-15 years ago there was such a split in web engineering. They wanted to make everything on the web look like an app, and a lot of backend guys hate anything looking like UI, so let's have an amicable divorce and do everything through these god-awful endpoints, so the backend folks don't have to touch UI and the frontend folks can think they're "more real" engineers by making stuff that looks like it's a black-box app vs enjoying the natural versatility and iterability of the old web.
I'm sure I'll never get hired for it, but good ol' PHP (hell, for most things I skip the MySQL; a poor man's no-SQL w/ JSON files on the file system works and scales well for so many things)... vanilla JavaScript can even be beautifully declarative when you want it to be, with string templates building up whatever new DOM you didn't get from the server. I have these sites that last for decades, and when it comes time to add something, they're easy to figure out and adapt and there's no library hell (browsers have gotten so GOOD yet still so backwards compatible over the years)
So I look for like minded souls using terms like "buildless" and "evergreen". But it's like an underground movement...
No time, not much different (Score:2)
This is the same as it always was. Software today is probably better than it has ever been. People just don't remember what it used to be like. Windows 95 crashed all the time, and that was the entire OS going down regularly. Every application was equally buggy. This is not new. It's only new to people who don't remember what it used to be like.
Re: (Score:2)
It's only new to people who don't remember what it used to be like.
Absolutes like this ABSOLUTELY warrant a "citation needed" response.
Incentives. (Score:2)
Quality takes time: time to design, time to implement, and time to test before release.
The market has favored time to market and continuous delivery over quality. If we start putting a premium on quality then quality will return, plain and simple. Perhaps the pendulum will soon turn around, perhaps not.
I'm old (Score:2)
The first computer I programmed on had 4K RAM - the TRS-80.
The first computer I used on a job had 256K RAM - the PDP-11/44.
The first computer I paid for myself had 128K and a 400K floppy - the OG Macintosh.
On the latter, you could fit the entire OS (less some fonts and desk accessories), MacWrite and MacPaint on one floppy.
We got lots of useful $#!+ done on those computers. It's not just errors, it's lazy.
Re: (Score:2)
My first computer had 256 bytes of RAM. I had to hijack some of the video RAM to load BASIC programs. (TI-99/4A)
It was barely more powerful than a programmable calculator until you upgraded it with expansion RAM. Even in the bare bones configuration you could do useful things on it. Maybe not a spreadsheet or word processing, but these were computers in an era where most people didn't think they needed a computer.
Well said. Just exactly this. (Score:2)
And this is what I've been saying all along. I started my computing journey with 4 KiloBytes of main memory. No kidding. 4K. And I've always had to be careful with my mallocs and garbage collection and disk space and everything else, including power usage. A lot of junior coders that I encounter today want to start with basically unlimited cores, RAM, and disk space. Of course they do, they're not paying the hosting bill, whether it's AWS or OVH or Hetzner. And until you threaten to take the AWS bill
Software quality is not a crime (Score:2)
Dave Cutler, of Microsoft Windows NT (the basis of 2000, XP, 7, 8, 10, and 11) fame, had signs posted around the Microsoft offices that said "Software quality is not a crime."
Well, that's one example. (Score:2)
Let's take a look at software sizes, for a moment.
UNIX started at around 8k, and the entire Linux kernel could happily sit in the lower 1 megabyte of RAM for a long time, even with capabilities that terrified Microsoft and Apple.
The original game of Elite occupied maybe three quarters of a 100k floppy disk and used swapping and extensive data files to create a massive universe that could be loaded into 8k of RAM.
On an 80386SX with 5 megabytes of RAM (Viglens were weird but fun) and a 20 megabyte hard d
90 comments (Score:2)
And nary a joke to be seen? Sadness.
"Software catastrophes" (Score:2)
A completely ignorable, easily and freely replaceable app that most people don't keep open on their computer for more than 15 seconds at a time leaking memory is hardly a catastrophe. It barely qualifies for a "meh".
Tool Chain (Score:2)
Today's real chain: React > Electron > Chromium > Docker > Kubernetes > VM > managed DB > API gateways.
So, how else am I going to get all that stuff on my resume?
Is software worse now than before? (Score:2)
The title talks about a software quality collapse. That implies that software quality was better in the past. While we all bemoan the current state of software quality, how many of us actually think, "Remember the good old days when software quality was good!"? Or is this complaint-fest simply an acknowledgment of the ever-present challenges of engineering software quality, challenges that are not necessarily different than before. As with all things in life and not just software, it's easy to complain abo
Re:Percentages. (Score:5, Insightful)
You are the problem.
Re: (Score:2)
Re: (Score:2)
Re: (Score:3)
Why are you running a calculator app on the cloud instead of locally?
Because the local machine does not have enough resources to run such a complex application, of course.
Re:Percentages. (Score:5, Insightful)
As our memory size has increased, the acceptable losses have also gone up. Similarly, the developers have shifted their requirements. More important to get things done faster and add features than to be a memory miser.
This attitude, IMO, is a huge part of the fucking problem in the first place.
Re: (Score:2)
There are also two sides to this coin. One is the attitude that says a calculator app - rarely used, and when used often kept open only for seconds at a time - leaking memory is not a catastrophe in the slightest.
There are too many developers who are complacent, and too many other developers/reporters/users freaking the fuck out over nothing.
Re: (Score:2)
I built a top line gaming PC late last year. It has 64 GB of RAM. If a fucking CALCULATOR took up half of that I don't even want to know what a game with the complexity of Solitaire would do to it!
Re: (Score:2)
20 years ago a calculator leaking 32 GB would have been physically impossible. The hardware that it ran on didn't have 32 GB. So leaking even 1GB would have been a major issue.
Now we have Terabytes, so a Gig is NOT THAT BIG A DEAL.
As our memory size has increased, the acceptable losses have also gone up. Similarly, the developers have shifted their requirements. More important to get things done faster and add features than to be a memory miser.
Is the leak good? No. But it is not a catastrophe nor a sign of bad times. It's just a shifting of priorities due to a shifting of resources.
Also, this is an AI issue because our AI are being trained on the existing software. That is more of a problem than the current memory surpluses.
You know the feeling of not knowing whether someone is being serious? I have a similar feeling now except I don't know whether everyone from TFA on down is posting AI slop.
Re: (Score:2)
Now we have Terabytes, so a Gig is NOT THAT BIG A DEAL.
You're thinking about storage, not memory.
Re: (Score:2)
I'm not sure you know what RAM is, or a memory leak, or SI units.