Ask About Proprietary vs. Open Source Code Quality 196
Scott Trappe is CEO of Reasoning, a company that has gained a certain amount of notoriety (and a Slashdot mention) by running its Ilumna automated inspection service on several versions of TCP/IP -- and concluding that the Linux version has fewer bugs than most proprietary ones. Why is this? Let's ask Scott, and also ask him any other question you can think of about software quality and how to achieve it, since, after all, that's his business. We'll send him 10 of the highest-moderated questions and post his answers when we get them back.
Well ... (Score:3, Interesting)
Re:Well ... (Score:2)
Re:Well ... (Score:3, Insightful)
Re:Well ... (Score:2)
Would that be the Microsoft Shared Source license then? ;)
Security through Obscurity (Score:5, Interesting)
Re:Uhh, gee (Score:2, Insightful)
On the one hand we have open source, which is subject to a large amount of peer review.
On the other hand we have closed source, where nowhere near as many people can check the code, and end users can't help much in finding bugs either.
A persistent h4x0r may be helped a little by the O/S, but security through obscurity has been proven to fail, time and again.
Where was the most interesting ... (Score:5, Interesting)
Does this trend extend to other areas? (Score:1, Interesting)
Re:Does this trend extend to other areas? (Score:2, Insightful)
Re:Does this trend extend to other areas? (Score:3, Interesting)
Grab.
Re:Does this trend extend to other areas? (Score:2)
sample size and conclusions (Score:5, Insightful)
Where in the product lifecycle is the problem? (Score:5, Interesting)
Re:Where in the product lifecycle is the problem? (Score:2, Insightful)
I don't know what Scott's opinion is on this, but I know I've found the specifications to be the biggest point of failure. I can't tell you how many times I, or someone I know, have written the perfect program that nobody wants because it didn't follow what the customer actually wanted. --Jason
Re:Where in the product lifecycle is the problem? (Score:2)
Re:Where in the product lifecycle is the problem? (Score:2)
Oh, and having the sales dolts sell a product that does not exist to the customer is not a good thing either. Not even a good project manager can help how the code ends up when the development team is asked to do stuff that is out of the scope of the current projects!
Re:Where in the product lifecycle is the problem? (Score:2, Interesting)
oh what is that old yarn? "You never plan to fail, you fail to plan"
What exactly is being compared. (Score:5, Interesting)
So did you go cherry picking to find OS's that had more bugs than linux, or was it random or what?
Too often the Open vs Closed argument turns into linux vs windows, and then the criteria are arbitrarily picked. Since the two OS's are designed largely for very different purposes, the comparison is by definition never fair, no matter who conducts it.
Saying that one product is better doesn't necessarily mean that the way it was created is inherently superior.
Implementing properly documented standards is something the OS community is great at, since they're all on the same page. Creating from scratch is different.
Hence, TCP/IP is rock solid in linux, yet development on the desktop crawls along in 100 different directions at once, gaining little ground.
Re:What exactly is being compared. (Score:2, Interesting)
And as for implementing properly documented standards, I've seen a lot of programs which are RFC compliant, yet are incompatible between systems (Kerberos). I've also seen different implementations of an RFC compliant program (MS's DHCP server) take down a seemingly rock solid OS (SCO), rendering its TCP/IP capabilities null. Unless it is explicitly spelled out in an RFC as to how something is to be done, you're going to have failures on some things, unless everyone uses the same hardware, the same software, and the RFCs are actually code snippets for the only programming language in use.
So implementing properly documented standards is not as easy as you put it.
Re:What exactly is being compared. (Score:4, Insightful)
Hence, TCP/IP is rock solid in linux, yet development on the desktop crawls along in 100 different directions at once, gaining little ground.
Actually, the Linux desktop has gained a lot of ground, as have the distro installers. If anything, public perception lags far behind the reality. That's not too surprising given that the OS community isn't pumping millions of dollars into marketing.
Re:What exactly is being compared. (Score:2)
Perhaps it's the window manager you're using, perhaps it's that I really like a fairly simple and uncluttered desktop, but I haven't had any problems at all with windowmaker. The multiple workspaces are great and have always worked better than Windows.
The fanboys are always there, and will always exaggerate. The only difference is that the MS fanboys are called the marketing department, and get paid big bucks to say what they say. In any case, public perception is not what the fanboys say, it's what the public at large says.
On the usability front, on the rare occasions I do have to sit down with Windows, I feel like one hand is tied behind my back (at least). There are just too many very useful things missing.
Re:What exactly is being compared. (Score:2)
I like WindowMaker too, it is possibly the best (light + stable + aesthetics) window manager I've used on any Unix.
Problem is, again, where are the apps? Mozilla and Evolution are probably the best desktop apps (from a usability perspective) on Linux right now-- the others are in varying stages of alpha- and beta-ness.
Also, every app on Linux looks and feels like a different beast (different toolkit, different keyboard shortcuts, etc). This is great for feeding the app-developers' egos (gee, we wrote our own toolkit for our bitmap editor!) but it is *hell* for usability. Why was OO.o not written with Gnome? That way, with Gnome2, it too could have taken advantage of the accessibility + display features of Gnome2. Ditto Mozilla. (Yes, I know I can hack OO and Moz to use anti-aliased fonts now. Won't do it, no time.) Most OSS projects seem to thrive on re-inventing the wheel.
9 years after Win95, and *19* years after the Mac, the most *usable* programs on Unix (and Linux) are still the ones that use the command line and the console -- at least they present a consistent interface and are easy to learn.
Re:What exactly is being compared. (Score:2)
OK, now I see where you're going. I agree that it would be nice if the Free and open source world could agree on a toolkit (or at least agree on fully interoperable toolkits) and the distros could put together standards for UI design.
Part of the issue there is that the projects are truly independent of one another. At MS, and at Apple, designers can be told "design to these standards because that's what you're paid to do".
I do see hope here; the open and Free software world isn't totally blind to this. There are efforts to get KDE and Gnome to freely interoperate, and skinnable apps provide hope for a vendor that wants to define UI standards for their distro. At that point, they would just need to change the default skins to a set that is consistent with their UI scheme. It's taking a bit longer, but the result will be far more versatile. Once completely implemented, it will not only allow for UI consistency, but for each individual user to have their own preferred UI standard wherever they go. Most will probably stick with their distro's default or something close to it. Even then it would be useful for people who like distro A and use it at home while their employer has standardized on distro B. The user could just bring the UI configs from home and have a familiar interface.
Of course, it ain't here yet.
To be fair, you have to remember that MS rolled Windows out in the mid '80s. It took until version 3.1 before it was even usable for anything but solitaire and minesweeper. The UI didn't start to get interesting at all until '95. Stability wasn't anywhere near there until 2000. Security still isn't there at all. If the virus and worm writers get really nasty, Windows usability will be deeply affected. Part of the problem for MS is that some of the worst security problems are architectural rather than just a matter of auditing code. If proper security were added by fiat, Windows would be unusable.
Linux is on a different track with different emphasis. Stability and security (or securability) have been given a much higher priority, and UI consistency is a lesser consideration.
To fairly compare the development tracks, we have to judge how long it will take for Linux to have consistent toolkits and UI design vs. how long until Windows becomes securable.
Re:What exactly is being compared. (Score:2)
I thought we had properly documented standards for the UNIX desktop too. Whatever happened to CDE and Motif anyway?
They were too ugly to live.
Re:What exactly is being compared. (Score:3, Interesting)
and what would those be? linux was designed as a desktop unix originally. that it is a great server platform is testament to its quality, etc. but it was designed to run on top of x86 hardware, same with windows.
oh wait, i know what you mean. one was designed to enslave and control you... gee i wonder which one, (those pesky finns)
Re:What exactly is being compared. (Score:2)
Having to maintain backward compatibility with software written for pre-386 processors accounts for a lot of the stability problems of pre-NT/2000/XP Windows. This is an issue that Linux never had to contend with.
Proprietary v Open (Score:5, Interesting)
Re:Proprietary v Open (Score:2, Informative)
As is frequently pointed out, in some cases their software is just overall better than others.
it helps us (Score:2)
We just need a talking penguin that harasses you while you're trying to work and the market will be ours!
Code quality (Score:3, Interesting)
The development environment (Score:5, Interesting)
Re:The development environment (Score:4, Interesting)
For example:
Fine and dandy. (Score:5, Interesting)
OK "Your TCP stack is cleaner than theirs" but what metrics are being used? How do we know bugs in their testing software doesn't skew the numbers?
Re:Fine and dandy. (Score:2)
The right tool (Score:5, Interesting)
How many bugs are emergent phenomena? (Score:5, Interesting)
What needs to happen? (Score:5, Interesting)
Re:What needs to happen? (Score:2)
What gave you the idea that companies care about better quality?
How do you measure quality (Score:3, Interesting)
50% Stability, efficiency
33% Form, structure
17% Ease of build
Stability and efficiency, of course, are the most important things. Does the code work? How well does it cover all cases? Does it do it efficiently? Does it make 10 copies of a string just to return a substring?
Form and structure are important too. This is key for maintainability. Is the code broken down into logical modules? Or is the entire 50000 line code base contained all in one monster if/else function? Does the code itself follow sensible, consistent conventions? Or did the developer purposely obfuscate it to prove how smart he is? Or did the developer hack the whole thing together due to a failure to understand the actual problem to be solved? How well documented is the code?
Ease of build - how many #define's (or their analogues in other langs) do I need to get the thing to compile? Does it come with a makefile or build script? Do I need to install a 100MB SDK because the author decided to use 1 small function he could have written himself?
These are the factors I use to measure the quality of source.
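To make the "10 copies of a string" point above concrete, here is a minimal C sketch (the function names are invented for this example, and error and bounds checking are omitted for brevity) contrasting a wasteful substring helper with a leaner one:

    #include <stdlib.h>
    #include <string.h>

    /* Wasteful: duplicates the whole source string, then the tail, just to
     * hand back a slice.  Assumes start and len are within bounds. */
    char *substr_wasteful(const char *s, size_t start, size_t len)
    {
        char *whole = strdup(s);             /* copy #1: the entire string */
        char *piece = strdup(whole + start); /* copy #2: everything after start */
        piece[len] = '\0';                   /* truncate in place */
        free(whole);
        return piece;                        /* caller frees */
    }

    /* Leaner: allocate exactly what is returned and copy once. */
    char *substr_lean(const char *s, size_t start, size_t len)
    {
        char *piece = malloc(len + 1);
        if (piece == NULL)
            return NULL;
        memcpy(piece, s + start, len);
        piece[len] = '\0';
        return piece;                        /* caller frees */
    }

Both return the same substring; the first just burns an extra allocation and a full-length copy to do it.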
Re:How do you measure quality (Score:1)
</humor> (hopefully)
Re:How do you measure quality (Score:2)
50% Stability, efficiency
33% Form, structure
17% Ease of build
You must be a clueless programmer:
Usability should be number one. If the code is ugly but the user is happy, the program is a success. If the code is beautiful but hard to use or functionless, then the user will look for alternative software. My list would be:
60% Usability
25% Stability
15% Form, Structure, Design, etc...
It is not that stability and the rest are not good and important, but programs are meant to help make life easier for users. Computers/programs are tools (and entertainment). If the user does not get what they want, you are not doing your job right no matter how well designed and debugged your programs are.
Is this applicable everywhere? (Score:5, Interesting)
If so, which areas do you think are benefitting and which need more community action / peer pressure to excel?
Are there any areas you think this phenomenon will never apply? (eg areas in which proprietary code will always be better)
What about BSD? (Score:5, Interesting)
So did you take a look at the BSD tcp/ip implementation, if so, how did it compare to the rest?
If you didn't, why not?
Re:What about BSD? (Score:2, Informative)
People like Terry Lambert pop up often with quasi-benchmarks taken from personal experience.
Check out http://news.gw.com/freebsd.arch/9169 [gw.com] for a detailed way to get 1.6 million simultaneous connections in FreeBSD, a number that Linux simply can't match.
Check out http://linuxpr.com/releases/5611.html [linuxpr.com] for IBM's simultaneous connection limit:
1.6 million compared to 6,900. To be fair, one is excessively tuned, but despite that, it's a huge difference.
Re:What about BSD? (Score:2)
Are you sure that those connections talked about are the same types of connections?
Re:What about BSD? (Score:2)
(off topic) Number of connections with SPECweb99: (Score:2)
If you read the posting he referenced, you can see the calculations, and how you can get useful work done. It all boils down to transmit buffer usage (mbufs).
Remember that for most HTTP traffic, you have very small requests, and it's the responses that are larger, so the mbuf usage is asymmetric between inbound and outbound data.
The product this was for was a reverse proxy cache, and so if you didn't care about a lot of content, just getting it out fast, you could compromise between connections and cache size, and operate with 500,000 simultaneous client connections.
This was back in the days when there was an mbuf required per connection for the tcp_template structure. The thing that let me push it to 1.6M was I shrunk the size of that from 256b to 64b. But as of FreeBSD 4.5, the structure went away; a FreeBSD 4.5 based port of the same changes could probably gain another 150,000 connections, which would move the number up to 1.75M. The number of useful connections would (based on cache size) have moved up to 300,000 (or 600,000) as a result.
Practically, the cache was a special case, because it was possible to share mbuf chains containing cached content between connections.
-- Terry
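As a rough back-of-the-envelope sketch of the arithmetic behind the numbers above (the pool size and per-connection cost below are illustrative figures reverse-engineered to roughly match the post, not the actual configuration), the connection ceiling is simply the memory available for per-connection state divided by the per-connection cost, which is why shaving a few dozen bytes off a per-connection structure moves the limit by six figures:

    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical figures, for illustration only. */
        double pool_bytes        = 1.2e9;  /* assumed memory for per-connection state */
        double per_conn_with_tpl = 750.0;  /* assumed cost incl. the shrunken 64-byte tcp_template */
        double per_conn_no_tpl   = per_conn_with_tpl - 64.0; /* template removed entirely */

        printf("ceiling with template:    %.2fM connections\n",
               pool_bytes / per_conn_with_tpl / 1e6);   /* ~1.60M */
        printf("ceiling without template: %.2fM connections\n",
               pool_bytes / per_conn_no_tpl / 1e6);     /* ~1.75M */
        return 0;
    }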
bugs (Score:2)
Re:bugs (Score:2)
>> closed source software engineering
Please pass the crack pipe.
Influence of project size (Score:5, Interesting)
Quality Software vs Fewer Bugs? (Score:5, Interesting)
It sounds like your company focuses on analyzing the code bugs, and not necessarily the perceived bugs. What are your opinions on this? I know that locating and eliminating the bugs *is* a critical part of software QA, but do you feel that bug-free ensures true quality? A bug-free Open Source project may still be too difficult to use or confusing for the non-technically inclined.
Irony (Score:4, Interesting)
(-1, Flamebait) (Score:3, Informative)
Sure, because it's well known that commercial software vendors never fix serious vulnerabilities as fast as the open source community. Particularly ones like Apple, for example, who have fixed several vulnerabilities in MacOS X way before the equivalent Linux patches were released. Since you like sendmail so much, I suggest you check how fast the major commercial *nix vendors released their patches compared to the open source world, and get back to us.
Now please pick up your ill-informed pro-OS FUD and go away.
What about the bad guys? (Score:5, Interesting)
Re: What about the bad guys? (Score:3, Interesting)
But there's nothing stopping the cracker from using the same closed-source techniques on open source software. Of course it doesn't work the other way around.
So basically open source does provide the cracker a second avenue that they can optionally try. Whether it takes more time or patience is irrelevant; they still have all the same options they have with closed source.
Now, OTOH, I do think in most cases (any public project like Linux or Apache) the "many eyes" theory works wonders. Things that are very easily overlooked by specific programming teams in closed-source development may be caught in open-source development. Even if the number of actual, contributing developers is relatively low, many of the end users will be looking at the code.
I haven't contributed a line of code to Linux, Apache, MPlayer, or anything else. But I have looked at tons of code, and have reported bugs or potential exploits for all of them. Remember, many eyes can easily be users, not necessarily contributing developers.
So, while a cracker may be able to look at the code, you have to consider:
- The "non-patient" cracker can of course use whatever methods he wants on closed source code as he can on open source code. This of course doesn't work in the reverse.
- The "many eyes" approach is still great, because a) the developers KNOW other people will see the source, and are less likely to be lax in certain areas, or "I'll fix that potential security problem later" and b) even if they do, someone, somewhere (not necessarily an OSS contributor) will point out the potential vulnerability.
Many end-users, especially of things like Linux and Apache, are coders, but not necessarily OSS coders. They instead rely on these systems, and if ever there is a question we go to the source. And if a problem is found, we report it. And in most (99% or higher) cases, it is quickly addressed.
Re:What about the bad guys? (Score:2)
I think the number of viruses and worms has much more to do with the popularity of the system than anything else. It's the same reason more shareware is available for Windows than Linux: there are more Windows users.
Why release a virus that can only affect a small percentage of users, when you can target the one OS that the vast majority use?
I do feel that *nix is, in general, more secure and less prone to the types of attacks that viruses and worms target; however, I also feel that *nix has too small a user-base, and (in general) a more intelligent user-base, to where these attacks are simply not worth implementing on a *nix platform.
If you transferred all the idiots to Linux -- take away their XP boxes and just sit them in front of a RedHat 8.0 box -- we'd have many more problems. These people aren't going to log in as a non-privileged user, or take other precautions -- and will be targeted for attack.
Until then, these people use Windows, run effectively as "root", and do dangerous things like execute unknown binaries from unknown sources, and thus will be the primary target for such attacks.
Not that I don't think *nix security in general is better; rather, I think other factors contribute to the virus/worm issue. More importantly, the typical Windows user doesn't care nearly as much about security as the typical *nix user, by nature. Note that I consider a "Windows IIS/SQL Admin" who couldn't be bothered to patch a known hole (Code Red/Nimda/Slammer) in the months before it was exploited a "typical Windows user"...
A bug is a bug is a bug? (Score:5, Interesting)
Did they fix the bugs they found? (Score:5, Interesting)
General quality of programming (Score:5, Interesting)
How do you know you have found all the bugs? (Score:4, Interesting)
Why the TCP/IP subsystem? (Score:4, Interesting)
What Metrics Are Used to Determine Buginess? (Score:5, Interesting)
Are there checks for use of unsafe functions like gets and the str* family of functions in C? Are there more complex data flow analysis algorithms at play here, like those used in Stanford's Meta-level compilation [stanford.edu] techniques?
Inquiring minds want to know. A pronouncement like "OS foo has more/fewer bugs than OS bar" is meaningless without a definition of what having more or fewer bugs means.
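To make the "unsafe functions" point above concrete, here is a minimal C sketch (a toy example, not code from any of the audited stacks) of the sort of pattern such checkers flag, alongside a safer equivalent:

    #include <stdio.h>
    #include <string.h>

    #define BUF_LEN 64

    /* Pattern a checker would flag: strcpy() trusts the source to fit the
     * destination, just as gets() trusts the input to fit its buffer. */
    void risky(const char *name)
    {
        char buf[BUF_LEN];
        strcpy(buf, name);               /* overflows if name is >= BUF_LEN bytes */
        printf("hello %s\n", buf);
    }

    /* Safer equivalent: bounded copy plus explicit termination. */
    void safer(const char *name)
    {
        char buf[BUF_LEN];
        strncpy(buf, name, BUF_LEN - 1);
        buf[BUF_LEN - 1] = '\0';
        printf("hello %s\n", buf);
    }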
Issues behind test cases for proprietary vs. open (Score:5, Insightful)
What are your thoughts on this trade-off between test case management and confidentiality as it relates to proprietary vs. open source code development?
Compare yourselves to Checker and Smatch (Score:4, Interesting)
company offers with those offered by Checker [stanford.edu] or Smatch [sf.net]. They seem pretty similar. In fact, do you use Checker or Smatch internally? It would seem logical.
- Dan Kegel (dank@kegel.com)
Internally inconsistent argument (Score:4, Insightful)
Purify (Score:2)
How do you maintain your neutrality? (Score:5, Interesting)
A simpler question (Score:2)
In the same vein as the parent, but much more black and white: who paid for the study in question?
Peer review (Score:5, Interesting)
In proprietary source systems, there is generally formal peer review, as per CMMI [cmu.edu]. But I have seen this done rarely (almost exclusively for CMMI level 3+ projects). There seems to be a disincentive to do formal peer review, for various reasons: cost, workplace environment, and group dynamics. Which do you think are most significant?
In open source projects, by contrast, there is no formal peer review, but rather what amounts to mass informal peer review. This seems to foster an environment of besting each other, trying to find the most bugs and the most obscure bugs.
What do you say?
Formal Peer Review Is Expensive (Score:2)
For formal peer review to work it must be scheduled in and implemented with the blessing of management. The surest way to fail at code reviews is to announce one day that code reviews are mandated but never provide the time or the framework to execute them.
As mentioned in the parent, open source has a more informal review structure. Before you implement new features you inspect the code and ask the author questions, which can lead to improved and more robust designs even without implementing new features. Either the author gets sick of answering questions or seeing comments about their weak design and implements a new one, or a newcomer goes ahead and does it. It's a win-win.
What makes for better code (Score:3, Interesting)
What I mean is this: over the years there have been numerous methodologies that to some extent all claim to make programmers write better code in less time. eXtreme Programming is a recent and - imho - fairly impressive example. All of them boil down to a slightly different approach to the task of programming.
So if you find fewer programming defects in the Linux IP stack, would you think that this indicates that there is something that works well about the way open-source programmers approach programming? Or could it be simply that people willing to donate their time to a project tend to be talented?
So if open source is so good... (Score:5, Insightful)
Is this a matter of brute force vs. process? (Score:3, Interesting)
Too many cooks... (Score:2)
Do you think that Linux, with its "benevolent dictator", is a better model than having "teams" make every development decision by committee?
Re:Too many cooks... (Score:2)
Automated code 'auditing' (Score:3, Interesting)
Strictly speaking, static analysis tools measure what is called kwalitee, a property which isn't the same as code quality but is usually closely correlated with it. In other words the tools do make mistakes, but most of the time they are on the right track.
It would also be possible to have a big online 'databank' of C source from many projects - the top thousand on Sourceforge plus the GNU programs, or something like that - and make this a standard 'corpus' for code analysis tools.
Hmm, I have to get a question out of this. Do you think that code analysis tools like Splint could improve free software quality further? What sort of infrastructure could be created for doing code kwalitee checks across a whole Linux or BSD distribution?
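As a minimal sketch of what such a check looks like in practice (the /*@null@*/ annotation is standard Splint notation, but the function itself is invented for this example):

    /* The annotation on the parameter tells Splint that the argument may
     * legitimately be NULL; dereferencing it without a check would then be
     * reported when running "splint file.c". */
    int string_length(/*@null@*/ const char *s)
    {
        int n = 0;
        if (s == NULL)      /* remove this check and Splint complains */
            return 0;
        while (s[n] != '\0')
            n++;
        return n;
    }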
The future of automated code inspection (Score:5, Interesting)
What errors are currently hard to detect automatically but which you would really like to be able to find?
What is the next category of errors that you're trying to detect with automatic code inspection?
To give you some ideas, what about:
Language Choice (Score:4, Interesting)
Test first (Score:5, Interesting)
Developers' motivation (Score:3, Insightful)
Do you think part of the difference in resulting code quality is due to the developers' motivation for working on the project -- that perhaps closed-source programmers are more likely to be doing it just to earn a salary, while open-source programmers are more interested in the art of coding itself?
The terms open/proprietary don't help you tell ... (Score:3, Insightful)
It is possible to have an open source model and have the code reviewed by no one but the original coder. Or to have 15 reviewers of varying competence looking at every line and debating it vigorously.
It is possible in the same OS to have source files or code fragments from various sources with various development and review methodologies. Some can be as extreme as using/requiring automated tools to find potential errors and requiring skilled reviewers. Some as lax as no review by anybody or anything.
Given this diversity, how can the terms open and proprietary be used to usefully describe software quality? Doesn't it depend not on open/closed but on the skill of the coder, the automation of the review, and the experience of the reviewers? And isn't that independent of open/proprietary?
Open vs shared source quality (Score:2, Interesting)
Do you study the process of the Development team? (Score:4, Insightful)
I, Code (Score:3, Funny)
Improving the testability of code (Score:3, Interesting)
How would you extend C/C++ to include information about the intended behavior of programs, so that programmers can tell the tool directly what is supposed to happen?
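One existing (if limited) way to state intended behavior directly in C today is assert(); a hedged sketch of the kind of thing the question is driving at might look like the following, where a smarter tool could read the asserts as pre- and post-conditions and check callers statically rather than only at run time (the function itself is hypothetical):

    #include <assert.h>
    #include <string.h>

    /* Intended behavior stated in code: the destination must be large enough
     * for the source, and on return the copy must be NUL-terminated. */
    void copy_name(char *dst, size_t dst_len, const char *src)
    {
        assert(dst != NULL && src != NULL);   /* precondition */
        assert(strlen(src) < dst_len);        /* precondition */

        strcpy(dst, src);

        assert(dst[strlen(src)] == '\0');     /* postcondition */
    }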
Security, reliability, support (Score:4, Interesting)
Most primes and militaries are moving towards COTS products to reduce costs and improve reliability and support. If we were to port our product s/w to run on Linux, how on earth could we achieve the value and benefits of COTS-like s/w -- s/w like Wind River's Tornado -- which has great robustness, standard (purchasable) support, and carries the perception (remember: perception is reality here) of greater security?
For those of you who think support is not important, market data has shown that for larger organizations, the number one "care about" is support. And since Sept 11, security is moving to the top of the list of care abouts for the militaries and primes.
Does bug count==quality? (Score:4, Insightful)
Do you foresee any metrics in the (near) future to measure other aspects of code quality? Performance is obviously important, but what about things like code style, modularity, and 'cleanliness?'
How open source translates into fewer bugs (Score:2, Insightful)
This is because, although being open makes it possible to involve many more people, it is not necessarily true that many people will look at your code. Coding is not an easy task; it takes time. In general, many open source projects are maintained by a few people, which is actually worse than commercial applications, since commercial companies can hire the top people in their area, and they can hire as many as needed to compete with any other product, including open source applications. So being "open" does not translate into anything by itself. It is the number of people, their quality, and the time they can dedicate to the project that matter, not the license of the product.
I am partly an open source advocate, and I really appreciate people working on open source. But there are big problems associated with it, and I think instead of trying to cheat people into using open source, we need to focus on the problems of open source itself. Otherwise it will be a hobby for people, geeks, but nothing more.
In short, can you explain your logic behind this conclusion, because it just seems to me either you or Slashdot is making it up.
Stupidity and Lies (Broken Metric) (Score:4, Insightful)
Think about that.
If Stack A is 3 times as large (bloated code) but has only 2 times the bugs of stack B, then stack A (worse in all respects) gets a better grade!!!
You can halve your defect density (bugs per thousand lines) by doubling the number of lines of code in your module. What a rip! How could so many people read and write about this and not see the problem?
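A quick worked example of the distortion, with invented figures:

    #include <stdio.h>

    int main(void)
    {
        /* Stack A is three times larger and has twice as many bugs as stack B. */
        double lines_b = 10000.0, bugs_b = 30.0;
        double lines_a = 30000.0, bugs_a = 60.0;

        printf("stack B: %.1f defects per KLOC\n", bugs_b / (lines_b / 1000.0)); /* 3.0 */
        printf("stack A: %.1f defects per KLOC\n", bugs_a / (lines_a / 1000.0)); /* 2.0 */
        return 0;
    }

By the defects-per-KLOC yardstick, the bigger, buggier stack A scores better (2.0 vs 3.0), which is exactly the objection being raised here.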
What forms of bugs? (Score:2, Interesting)
I wonder if there were marked similarities among the bugs found in the proprietary code bases that were compared?
Were these similarities found in FOSS code that was looked at, or did the dendritic peer review process handle that to some degree?
Were bugs found in the proprietary code that were already (verifiably) marked as things to be fixed, and if so, what was the average lag time (Bug turn over)? Do these companies keep track of their bug turn over periods, and what is the empirical comparison with that of FOSS?
Was there proactive debugging done in the FOSS code as a result of known bugs in the proprietary code base, and if so, were those bugs addressed in the proprietary code?
Was there a verifiable process for maintenance in the proprietary companies that had changed in the 3-6 months prior to the testing?
I think that will do for now. Plenty more where that came from. :)
Taran
What does it mean... (Score:2)
easier to understand? (and how do you evaluate this)
more compact code? (i.e., fewer LOC)
more evidence of encapsulation and data hiding?
more comments (to better explain what the code is doing)
fewer comments (the code stands on its own)
rigorously standardized naming conventions?
choice of language?
All too often, one man's notion of quality is another's nightmare.
Re:Licenses (Score:2)
FIX IT YOURSELF (Score:1)
Why should they? (Score:2)
Someone here has misunderstood open source.
If you think it means that any bug anyone ever sees will get fixed, it's you, though.
Re:Give due credit (Score:5, Informative)
No. The Linux TCP/IP stack was written from the spec mainly by Alan while he was at Swansea. Haven't you seen the credit to SUCS [sucs.org] in your Linux boot-up? That's the problem with graphical splash screens...
Re:Give due credit (Score:2, Informative)
Re:Open. Source. Fucking. Sucks. (Score:2)
-1 Flamebait? +5 Funny! -1 Offtopic...
Re:Open. Source. Fucking. Sucks. (Score:4, Insightful)
"open source" works because your owner needs something done and may realize that it makes more sense to spend labor on the problem rather than money. There also may be no compelling reason to maintain ownership over the results.
Software is a tool, not fool's gold.
Software is valuable for what it can do for people who don't have any interest in selling it.
Re:So nice you chose GNU/Linux (Score:2)
Besides, if you're going to compare using a flawed example of open source, you'd better justify the value of that comparison.
Re:Why? (Score:2)
They're selling process automation, e.g. automatic installation of these patches, to save a busy administrator valuable time. The cost is worth it if you have more than, say, five boxes. (The break-even point is up to the end-users' judgment.)
You can always download the updates for free yourself and apply them by hand.