Michi Henning on Computing Fallacies
Karma Sucks writes "Check out this summary of a keynote at Linux.conf.au by Michi Henning of CORBA fame. It really hits the nail on the head on several points. I especially liked the point about people constantly rewriting letters in these modern times, as opposed to, say, 1945, when it just wasn't worth the pain of re-typing a letter. The only point that didn't make sense in this summary was the one about "source code being useless"."
Of course. (Score:3, Insightful)
Re:Of course. (Score:2, Interesting)
"Given enough eyeballs, all bugs are shallow."
The idea is that if everybody gets to see the code, the problem will be obvious to somebody. It certainly stands a better chance of being found than if only the original coders (who might not see anything wrong -- after all, they wrote it that way in the first place) get to look under the hood.
Re:Of course. (Score:2, Insightful)
If software has bugs that are easy to see, and come up frequently, it's perceived as "buggy", and few people will download it, and fewer eyes will look for the bugs.
If a bug is harder to reproduce, it probably doesn't come up very often, and not everyone will spend time looking for it. Hell, I'm in a research environment, I know how to code, and my KDE print daemon crashes every day. I don't care enough to submit bug reports or look at why it's crashing; I'm just going to hope that it magically gets fixed in the next release.
Exactly how many people here have actually hunted through source code to find the one little bug that annoys them, like the fact that Konq occasionally ignores links, or that its JavaScript interpreter is far from perfect? Source code is not the magic bullet, but I'll admit it's better than nothing.
Re:Of course. (Score:5, Interesting)
An absurd fallacy. Perhaps for fetchmail or hello, world! or other similarly sized projects, but nowhere else. Debugging requires not merely a pair of eyeballs, nor even crackerjack programming skills, but mostly an understanding of the problems and compromises that went into the creation of the software system in the first place.
To produce better software, we need better programmers, and better
tools, not meaningless platitudes about the business justification of
Open Source licensing.
Peace,
(jfb)
Re:Of course. (Score:2)
distribution of source is an invaluable gift. But it's not going to
make problems disappear without a reevaluation of the culture of
software.
Peace,
(jfb)
Re:Of course. (Score:3, Interesting)
Care to back this up, say with some examples of projects where large numbers of people swarmed over the code and still couldn't fix the bugs?
Re:Of course. (Score:5, Insightful)
Well that sounds good, but it's been proven wrong in practice.
At this point, with all of the incredible software that has been produced by open source methods, I don't think it leaves people much room to attack the open source design philosophy. It clearly works and works well, it just works differently than people expect.
Re:Of course. (Score:5, Interesting)
Through my time as a sysadmin I've come across bugs in both open and closed source software, and I have definitely come to appreciate being able to fix the bugs on my own.
Example: Last week's helpdesk software installation. The software was incompatible with qmail. Fix: 5 minutes. Any guesses how long it would have taken to get the closed source equivalent fixed?
Re:Of course. (Score:2, Insightful)
Re:Of course. (Score:5, Insightful)
source code isn't necessarily about bugs
it's also about insulation from change or situations the author couldn't see, test, have predicted, have known about.
I may never READ the source code for 99.9% of my apps, but the day something gets changed and one of Eric's OSS projects fails, I can go find out why and fix it. Without the source I'm screwed.
And as it happened I did exactly that yesterday, when the plan9 imap file server didn't get along with Courier. By having the source code I was able to track the problem down to a wrong assumption in the code AND a config problem in Courier.
If I'd had no source code I would have been screwed.
So Mr Henning can't be that clever if he can't even see the potential.
He's making the classic mistake of saying something is worthless to everybody when he means it's worthless to him.
Re:Of course. (Score:3, Interesting)
Two latent bugs. With the source, it's almost as good as if the bugs didn't exist. The overall effect is getting 5-nines reliability at the cost of 3-nines reliability. Also, if you are facing a scissors/rock/paper scenario, any assumption you make will be wrong in some cases.
For most people, most things, most of the time, source code is useless. For most people, 5-nines reliability is useless expense.
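For reference, the nines arithmetic the parent is invoking works out like this; a quick sketch using the standard definition of N-nines availability:

```python
# Downtime implied by "N nines" of availability over one year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes(nines: int) -> float:
    """Annual downtime for an availability of 0.99...9 (N nines)."""
    availability = 1 - 10 ** -nines
    return MINUTES_PER_YEAR * (1 - availability)

print(f"3 nines: {downtime_minutes(3):.0f} min/year")   # roughly 8.8 hours
print(f"5 nines: {downtime_minutes(5):.2f} min/year")   # roughly 5 minutes
```

So the difference between "3 nines" and "5 nines" is the difference between most of a working day of outage per year and a coffee break's worth.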
Insurance (Score:4, Interesting)
Never had to use it, never wanted to use it. But it was there, and allowed us to pick something other than IBM (way too expensive at the time. Not sure if they even offer a similar product anymore.)
Re:Of course. (Score:5, Insightful)
Our business is not writing this software.
I work for a law firm. Our business is to produce legal documents and legal arguments. Our business is not accounting, yet we have accountants on staff. Our business is not records management, yet we have records management specialists on staff. Our business is not facility maintenance, yet we have facility maintainers on staff. Our business is not programming, yet we have programmers on staff.
We want software that works, so we can do our business.
All commercial software is broken in some way (exceptions number in the single digits). Hiding source code hinders your ability to have software that works. It follows that source code hiding hinders your ability to do your business.
what makes you think that throwing somebody at it in their spare cycles is going to help?
We have 400 attorneys. A bug (misfeature, non-optimized routine, poorly designed UI, etc.) that costs us three minutes per attorney per day costs us $3000 daily (our average billing rate is $150/hr). It may be worth our while to hire a programmer at $50/hr to fix the problem. Without source code availability, we have no choice but to burn money on a daily basis.
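The arithmetic above checks out; sketched out with the figures from the comment:

```python
# Back-of-the-envelope cost of a small daily annoyance, using the
# figures from the comment above (400 attorneys, 3 minutes lost per
# attorney per day, $150/hour billing rate).
attorneys = 400
minutes_lost_per_day = 3
billing_rate_per_hour = 150

daily_cost = attorneys * (minutes_lost_per_day / 60) * billing_rate_per_hour
print(f"${daily_cost:,.0f} per day")  # $3,000 per day
```

At $50/hr, a week of programmer time ($2,000) pays for itself before the end of the first day the fix is in place.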
If the people who designed and wrote the software can't find the bugs
The bug may be specific to the way we use the software, or it may be preventing us from using the software the way we want. Perhaps we want a dialog box organized in the way that is most efficient for us. Maybe a program has its data path hardcoded and we want to store data someplace else. One program we have produces a hash that is used for the filename; I'd like to see a different algorithm used (for reasons too complex to go into now). I'm hardly a programmer (I know a bit of C, a bit of VB), yet I'm confident that I could, by studying the code, determine whether these changes are feasible and locate where code needed to be changed. A pro could be hired to validate my opinions (or deny them!); another pro could be hired to do the work.
Here's another reason why source-code availability and the right to modify and recompile it is a good thing to have: companies go out of business. We use a program called Wealth Transfer Planning that is pretty cool; it automates the creation of wills, trusts, estate plans, etc. The company that makes it has disappeared. We are stuck with ALL our bugs and NO possibility of improvement to either the content or the engine.
Re:Of course. (Score:3, Insightful)
Guys, guys, you're all missing the point. So is Henning. In response to the question "Is Open Source the solution or isn't it?" I answer with a resounding "Yes". :)
Examples where the proprietary model has excelled: highly optimized code (Intel compiler); user friendliness (MacOS, Windows); timeliness (Sun's original Java implementation -- was any OSS project working on cross-platform GUIs before Java?).
Examples where the Open Source model has excelled: portability (are there any platforms that don't support the JPEG libraries?); endurance (LISP stuff from the 80s will never die); security (OpenBSD or NSA's Linux).
Examples where proprietary has failed: ongoing access and support for legacy products (where can I legally buy MS-DOS and get support for it?); broken formats (Word documents); security (Outlook); customer relations (product activation -- no thank-you).
Examples where Open Source has failed: as a business model (Loki); time to market (HURD, where are you?); political entanglements (say "GNU" before everything or you are not my friend).
When choosing, you have to look at the strengths and weaknesses and decide what is important to you. Sometimes that will lead to Open Source software as the correct choice. Other times it will lead to proprietary. If you are lucky you can mix-n-match. That's why I love using MSVC (proprietary) to write freeware (proprietary) that uses IJG code (truly Free Open Source), and using the resulting app to generate frames that I pass through Gifsicle (GPL) to generate GIFs (proprietary format) to put on the Internet (open standards) that most people will view through IE (proprietary). And everybody is happy if they choose to be.
Where's this guy's asbestos suit? (Score:3, Funny)
- Economic model is doubtful
- Source code is useless
- Motivation for Open Source is inappropriate for most software
- Nerd culture is counter-productive
I'd like to see him come here and say that.
Re:Where's this guy's asbestos suit? (Score:3, Insightful)
The economic model has so far been proven to work for only a few companies, and not for the industry as a whole. I suspect that basing the industry entirely on service and support is going to drive the prices for those functions much higher and frustrate most users. If the program is offered for free, most users are going to expect free support.
Another relevant thought is that without closed-source companies to support the programmers who are donating software, how are these programmers going to survive? A recent article in the Register (http://www.theregister.co.uk/content/4/23935.htm
Source code IS useless if you don't have time to look over it or modify it. It only benefits the 5% or so that are actively involved in maintaining or modifying the code. The remaining consumers get absolutely zero benefit from it.
I'm not sure I can argue either for or against the third point, except to say that once the money is removed from the equation, how do you force change without innovation? I.e., fixes instead of new features?
The nerd culture IS counterproductive, since it emphasizes an antagonism toward those who run businesses (suits) and those who sell products (marketroids). In order for Open Source to succeed, there is going to have to be a meeting of the minds on a massive level, not just a few companies here and there.
Re:Where's this guy's asbestos suit? (Score:2, Interesting)
Re:Where's this guy's asbestos suit? (Score:2)
Well, the economic model for open source is doubtful, under current conditions at least. I was a very early customer of Cygnus. We needed to pay them in part because g++ was so horrendously buggy in those days: it's easier to have a support business to support code that badly needs it.
Source code isn't useless, but it is useless to many people (those without the skill to change it or the funds to hire someone to do so). It is very useful to folks like me. But most computers are embedded systems programmed in assembly language. How useful would the source code to your microwave oven be to you?
The motivation for open source works very well for tools that the programmer himself/herself needs, for producing tools with rough edges that can be handled successfully by other programmers. It gets harder with applications; in this case the only successful open source projects clone some proprietary design (the Gimp, Gnumeric, etc.). The truly original open source creations, like Perl, Python, and Emacs, are environments built by nerds for nerds.
The nerd culture can be counterproductive. Nerds focus on minutiae and often don't see the big picture. In many cases, nerds find themselves working for someone who has the opposite limitation. This should be no surprise. Also, many programmers are the wrong kind of nerd. Civil and mechanical engineers obsess over getting everything correct, because they are well aware that if they don't, people may die and careers may end. Too many programmers lack rigor and think of themselves as artists, not engineers, even if they use the term "software engineer" in their title.
A key issue, that software is brittle and downright dangerous, is not addressed by either proprietary or open source software today. If we fix this by requiring proprietary software to have a warranty against severe defects, what happens with open source software, where the distributor cannot possibly provide a warranty?
I'm afraid that Microsoft may start to get it about security before the open source movement does. If you think that the open source movement gets it, then why did the Debian project need to issue 81 security updates in 2001? Both Microsoft and Linux are putting out software that is too buggy, and the BSD world isn't as much better as they claim, despite better practices (code auditing is great, but a lot of work: move most of what Linux distributions call the system into "ports" and then the bugs don't count against you).
I think that open source can work, but not in the current economic climate (native to the US, being forced on other countries through the GATT and the like), which elevates "intellectual property" to a universal value. A funding mechanism is needed. One possibility is that governments fund it. This would actually save taxpayers a lot of money, since governments are currently paying Microsoft and the like hundreds of millions just for Office, and paying again every few years for upgrades. That would pay for a lot of full-time programmers.
List of things developed with pre-1946 technology (Score:5, Insightful)
Iowa/Yamato class battleships - 1920's technology
Apollo moonrockets - 1940's with a dash of 50's
Polio vaccine - 1880's with a dash of 1940's
Transistor - 1930's
Bulk transport system, rail - 1860's
Bulk transport system, car/truck - 1920's
Airplane - 1910's
Fast airplane - 1950's
Yup, makin progress fast.
sPh
Re:Sauce for the goose and gander (Score:4, Funny)
He said advance, not regress
The only way computers have made my life better... (Score:4, Insightful)
Is that they enable me to make a large salary without having to turn to medicine, law or crime.
Recently, I was walking back to my car late at night in downtown San Francisco. A homeless person standing in front of an all night donut shop asked me for a dollar. I said no, but invited him into the shop and bought him a donut and a coffee.
I would lay money that that $1.25 spent on a human being had more impact to society than all the software I have written over the past 20 years of my career.
He is wrong about number of letters (Score:2, Insightful)
Where does he come up with that? We are about twice as productive as in 1945 (amount of goods produced per hour of work). We have taken this productivity gain in more goods and services in place of working less. You could have a 1945 standard of living and take 6 months off per year. Just start to think about what we now consider necessities: a car that goes 100,000 miles before it needs a tune-up, large color TV, phone in every room, answering machine, air conditioning, jet travel -- it goes on and on.
be careful: (Score:2)
It's a matter of how you count. If salaried employees are forced to work longer now than in 1945 (they are) -- then productivity goes up. If tax laws change (they have) -- then productivity goes up. Use cheaper labor and more efficient machines, and productivity goes up. The economy makes a difference, since idle workers aren't productive either. So while yes, we are more productive now than before, the actual delta is hard to measure and certainly much smaller than the official figures.
More importantly, how much of this is the result of using computers (as opposed to increased education, and market pressure)? Quite a bit in some specialized fields like air traffic control, telecommunications, or warehouse management. But for general office work?
And now factor in the amount of money business spends on computers. I think during the 1980's that IT ate away half of all capital spending in the US. Are the secr^H^H^H admins more productive now?
Re:He is wrong about number of letters (Score:5, Insightful)
One cannot have a 1945 standard of living on 6 months of work per year.
In 1949, thanks to labor unions, an illiterate coal miner could afford to buy a house and his wife could afford to stay at home and take care of the kids full time, because he made the equivalent of $50k/year in today's money. Try that today. For people who are neither professionals nor managers, real income peaked in 1973 and has been dropping precipitously ever since.
One size != all (Score:5, Insightful)
Nerds are the computer equivalent of the Enos, the Yoko Onos, the Peter Gabriels
What's wrong with different people being born for different goals? Even if you don't directly contribute to the masses, most changes in fundamental social systems (and technical systems) start with someone rejecting the norm. As well it should be. Leave them alone and let them nerd!
Re:One size != all (Score:4, Insightful)
The point is that when one is making software to be used by the masses, nerdiness is a bad thing. Nerds like lots of features; we like complexity; we like living in our little world and working on our little pet project without much care for what others want.
This in general is BAD for most people, most of the time. They want something that works, that makes sense, that's easy and simple and gets the job done. They couldn't care less about command line options, flashing text, and alpha blending.
That's the point that was being made and it's a great one.
Impractical Thinking != Visionary Thinking (Score:4, Insightful)
It's not counterproductive to have people pushing the envelope, it's counterproductive to have people outside of the mainstream dictating to those in it what their needs are.
Despite advances in UIs, computers are still designed as general-purpose hobby devices, rather than for the specific functions for which the majority of their sales are used. When users complain that it doesn't make sense to have to log in to a system or to "start" a word processor, or to "double-click" to "open" a file through a graphical icon, they're simply told that they don't understand the technology. Same when they have to figure out [to avoid being scammed] what kind of RAM they need with their new P4 processors.
The point is that for products to be useful and effective, they need to be designed with more consideration for the needs of the user; and much of the time, that which is "neat" to enthusiasts has held sway over design at the expense of what would be useful [see featurebloat].
BTW: impractical thinking is not necessarily visionary. It might just be impractical.
Larsal
Re:Impractical Thinking != Visionary Thinking (Score:2)
That's the world we live in. Those who get paid do so at the expense of their creativity, making things that are appropriate for a popular mass.
Good point (Score:2)
Re:One size != all (Score:4, Informative)
- How often do you need to perform a Fourier analysis?
Several times a day, usually. How often do I need to email a document to more than one person? Almost never. One tool is not adequate for all people. This is a fact all too often overlooked in arguing for software applications as standards.
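Since "Fourier analysis" came up: the core of it is small enough to sketch with nothing but the standard library. This is a toy discrete Fourier transform for illustration; real work would use an FFT library.

```python
# A minimal O(n^2) discrete Fourier transform, stdlib only.
import cmath

def dft(samples):
    """Return the DFT of a list of (real or complex) samples."""
    n = len(samples)
    return [sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n))
            for k in range(n)]

# A pure tone at 3 cycles per 8 samples...
signal = [cmath.cos(2 * cmath.pi * 3 * t / 8).real for t in range(8)]
spectrum = dft(signal)

# ...shows up as a magnitude peak in frequency bin 3 (plus its mirror, bin 5).
peak = max(range(8 // 2 + 1), key=lambda k: abs(spectrum[k]))
print(peak)  # 3
```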
I are an engineer. (Score:5, Funny)
I like my metaphors stirred, not mixed.
Good points! (Score:5, Interesting)
I'll agree here, although I see it most in database design. With the advent of super-fast DBs such as MySQL there has been a FLOOD of horribly written applications that utilize them. For instance, you'll see every column defined as CHAR( 255 ), or every table prepended with AUTO_INCREMENT columns even when they are not necessary. Indexing is poor or non-existent, and tables are horribly in need of normalization.
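For contrast, a sketch of the saner schema being implied, using sqlite3 from the Python stdlib as a stand-in for MySQL (the table and column names are invented for illustration): size columns to the data, and index the columns you actually search on.

```python
# Illustrative schema: typed, sized columns and an explicit index,
# instead of everything-is-CHAR(255) with no indexes.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE users (
        id       INTEGER PRIMARY KEY,   -- surrogate key only where needed
        username VARCHAR(32)  NOT NULL, -- sized to the data
        email    VARCHAR(254) NOT NULL,
        UNIQUE (username)
    );
    CREATE INDEX idx_users_email ON users (email);
""")
conn.execute("INSERT INTO users (username, email) VALUES (?, ?)",
             ("alice", "alice@example.com"))

# With the index in place, lookups by email can use it rather than
# scanning the whole table:
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?",
    ("alice@example.com",)).fetchall()
print(plan)
```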
Some finer points in design; I see some stuff like this a lot as well:
function bob( $joe ) { $var = $joe + 12345; return $var; }
You're wasting memory and such on the variable declaration and assignment; simply return $joe + 12345; instead.
Fallacy 12: We are Making Progress
- Progress in quality assurance has been remarkably slow
I used to work in QA for a software company and I wouldn't say that I was the worst programmer there, but I think the problem is that 90% of the QA staff WERE NOT PROGRAMMERS or didn't have access to the source. Basically, QA reports bugs, they go into the queue, and then a developer, if they have the time when compared to all their code development, meetings and such, may have a chance to get to the bug. It would be nice if the QA staff, who may have software programming skills, would be allowed to be developers as well (e.g. all the rights of a developer but QA is their main focus). They attend the same dev meetings and such which gives them the insight to the architecture to allow them to fix bugs which have been approved by management.
So in effect, have two programming teams.
Re:Good points! (Score:2, Informative)
I know that you are talking about (what appears to be) PHP here, but I thought I'd toss in my 2 cents. In compiled languages small differences like that don't matter. If your optimizer doesn't suck (and most don't, these days), it will reorder your code to be as efficient as it can get it to be, and that includes things like eliminating unneeded variables, etc. So maybe what you are seeing is developers used to working with compiled languages that include a good optimizer, who like to go for good, clear code as a first rule of thumb. No, that doesn't make it right, but it's just something to be aware of.
Re:Good points! (Score:2)
From talking to Zeev (author of the Zend Engine), he said that there would be a slight performance hit to doing something like that (I assume it is, as you said, the reorg and such).
My style of programming is to keep excessive things like that to a minimum because 10ms more might not seem like much until your program (or in this case, web page) is hit 100 times a second.
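For what it's worth, the parent's arithmetic: a 10ms slowdown on a page served 100 times a second adds up to a full CPU-second of extra work every second of wall clock, i.e. roughly one whole core.

```python
# The arithmetic behind the worry above (figures from the comment).
overhead_s = 0.010           # 10 ms of extra work per request
requests_per_second = 100

extra_cpu_per_second = overhead_s * requests_per_second
print(extra_cpu_per_second)  # 1.0 -- about one core's worth of work
```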
Re:Good points! (Score:4, Insightful)
function bob( $joe ) { $var = $joe + 12345; return $var; }
and
function bob( $joe ) { return $joe + 12345; }
might actually be the same number of operations. Not because of the compiler, but just because of the way that the machine works.
Regardless, as said before, this kind of micro-optimizing is pointless and dumb. It is not programming, it is coding. Coding is a mechanical process. Programming is an art. You can optimize your code, but it is almost impossible to optimize an API. Designing APIs is where I think all modern languages have totally failed us. It is way too easy to write a bad API with today's languages. I've had to implement too many crazy interfaces written by people who didn't think them through. I've also created interfaces that later I went back and scrapped because they were dumb. This is the way programming is, and it doesn't make any sense.
Compiler Theory Lesson (Score:2, Informative)
function bob( $joe ) { $var = $joe + 12345; return $var; }
You're wasting memory and such for the variable declaration and assignment, simply return $joe + 12345;.
Well, when you simply return $joe + 12345, the compiler creates a variable of the same type as $joe, gives it the new value, and then returns it, negating any hoped-for savings on memory.
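The thread's snippets are PHP, but the analogous question is easy to poke at in CPython with the stdlib `dis` module; this is just an illustration of the same temp-variable question in a different interpreter, not a claim about what the Zend Engine does.

```python
# Compare the bytecode of the two styles from the thread: CPython's
# compiler does not eliminate the named temporary, so the first version
# carries an extra store/load pair even though the results are identical.
import dis

def bob_with_temp(joe):
    var = joe + 12345
    return var

def bob_direct(joe):
    return joe + 12345

with_temp = list(dis.get_instructions(bob_with_temp))
direct = list(dis.get_instructions(bob_direct))
print(len(with_temp), len(direct))  # the temp version has more opcodes
```

Whether those extra opcodes are ever worth worrying about is exactly the micro-optimization debate above; the measurable cost is tiny.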
Re:Good points! (Score:3, Insightful)
As a developer, I inherently know what NOT to do. A computer moron doesn't know these things, and will use it like the end user will. An experienced programmer will use my programs like I will, and will usually get the tough errors back to me. A computer moron will get the obscure ones back, and it tends to be those errors which make it through to the end user.
Re:Good points! (Score:3, Interesting)
I agree with you as well -- if I came across as 'QA should only be programmers' then I apologize; that was not my intent.
QA is more than just 'poking' at the program and seeing if it breaks. It's authoring test procedures, finding new and interesting ways to break the program, interacting with other developers and management, and a whole lot more. As a programmer I know I hated to write test procedures -- it is very, very boring, and as the complexity of what you are testing increases linearly, the complexity of your test procedure increases exponentially.
However, we'd write up bugs such as "Inserting 32 characters in field XYZ on form 123 causes program to crash" which, in the grand scheme of things, could be viewed as either a "Show Stopper" (highest priority) or a "Do We Care/When We Have Time" sort of bug. Considering that adding range checking to a form is trivial, giving QA clearance to fix it would result in a much better program (again, provided the QA developers are qualified) and give the regular developers more time (since we'd find 30 or so of these things on a single form) for fixing the hard-core bugs or developing new features.
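For illustration, the sort of trivial range check being described might look like this; the field name and the 32-character limit are taken from the hypothetical bug report above, not from any real product.

```python
# Reject over-long input at the form layer instead of crashing deeper in.
MAX_FIELD_LEN = 31  # the hypothetical bug: 32 characters crashed the program

def validate_field(name: str, value: str) -> str:
    """Return the value unchanged, or raise instead of overflowing the field."""
    if len(value) > MAX_FIELD_LEN:
        raise ValueError(f"{name}: input exceeds {MAX_FIELD_LEN} characters")
    return value

validate_field("XYZ", "a" * 31)       # accepted
try:
    validate_field("XYZ", "a" * 32)   # rejected, instead of a crash
except ValueError as e:
    print(e)
```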
Re:Good points! (Score:3, Informative)
Thank you!
I'm (sort of) in my company's QA department, and get a whole lot of guff about it from the other engineers ("You're QA? Ewwww"). Thing is, QA doesn't need to be a bad job -- I've spent my last few years largely working on (nifty, new) automated testing tools, and love it. There's nothing quite so interesting as coming up with a test for something that on the surface doesn't look practical to test programmatically, or putting together a home-grown piece of software that does a task in a massively cross-platform manner that comparable (expensive) commercial solutions could only do on one or two platforms.
Now, writing loads of Expect scripts has never been my thing (that's what the
Anyhow, I'm just glad to see someone putting QA in a light that reflects that it doesn't have to be a boring and tedious job done by those who don't have what it takes to be
Re:Good points! (Score:2)
Re:Good points! (Score:4, Insightful)
That's only a "fine point of design" to a 15-year-old. No, scratch that; it's not design at all. Any decent or even semi-decent compiler or interpreter should be able to make that particular optimization all by itself. A real fine point of design is whether to use events or threads, update or invalidate, distance vector or shortest path, this class hierarchy or that class hierarchy, this module layering or that module layering...stuff that can't be automated or even delegated to an inexperienced programmer.
Dream much? Ever hear of specialization? You're right that QA tends to get the short end of the stick in a lot of ways. QA engineers should have some programming experience, should attend (some) development meetings, should have more authority wrt the disposition of bugs... but they should not be checking in production code. Good QA is hard work, requiring its own specialized set of knowledge and skills. Any QA engineer who's making (and, one would hope, unit testing) their own changes to the production code is not doing QA, and QA needs to get done. Hire another developer or extend the schedule, but don't take good QA engineers away from the necessary task that they do best to have them do someone else's job.
Re:Good points! (Score:3, Insightful)
The problem with this is that you really do want two different loci of responsibility for development and QA. You don't even want the two teams to have the same manager (or generally, the same chain of command) because that creates a conflict of interest for the manager. While wearing the DevMgr hat, he wants to get stuff out the door quickly, so he's rewarded for cutting corners when he puts on the QAMgr hat.
It might work to do what you suggest, as long as the chains of command were kept distinct so only the people at the bottom of the hierarchy ever wore both hats. But do you really want to work for two bosses at the same time, and be answerable to both?
Another possible model would be a "clean room" approach, where you're given read-only access to the source database, and you can tinker on your own machine. You can propose a specific change to the developer working on your bug, but he checks it in. Things are still sped up that way, and you avoid blurring the responsibilities.
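One concrete shape the "propose a specific change" step above could take is a unified diff attached to the bug report for the developer to review and check in. A sketch using Python's `difflib`; the file name and code are invented for illustration.

```python
# Generate a reviewable patch from a before/after pair of source versions.
import difflib

before = [
    "def save(path, data):\n",
    "    open(path, 'w').write(data)\n",
]
after = [
    "def save(path, data):\n",
    "    with open(path, 'w') as f:\n",
    "        f.write(data)\n",
]

patch = "".join(difflib.unified_diff(
    before, after, fromfile="a/storage.py", tofile="b/storage.py"))
print(patch)  # a unified diff the developer can inspect and apply
```

The QA engineer never touches the source database; the developer who owns the bug applies (or rejects) the patch, keeping the responsibilities distinct.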
frm the artical (Score:5, Funny)
but now is ok for ppl 2 put 42 typos in inrnet msg & hit submitt
Windows for dummies. (Score:5, Funny)
Oh how right you are...except (Score:5, Insightful)
"The best UI people on the planet are those working in the car industry.
We need to make it a criminal law to change certain API's. There are potentially
huge impacts. When we produce a new drug, we can't just release it to millions of
people without some sort of testing."
Yeah, but how long should the testing cycle be? For example, we hear all the time about drugs being recalled because of illnesses caused by their use. Beta testing is a great way to do this; however, even then you can't know until your program is running on a lot of machines in different environments, with different variables.
So, what can you do? Well, you release the software after doing as much testing as possible, and wait to see the results... then patch, patch, patch... which is the way it's being done now. That's why early adopters know (or should know) what they are getting into, and why most of the companies I have dealt with (running Win2K) waited for SP2 to come out before upgrading.
Or, you could establish some sort of body, like the FDA, that tests the heck out of software for a while before shipping. The problem with that, though, is that by the time the software is approved, it's obsolete.
Other than that, this was a most excellent read.
Re:Oh how right you are...except (Score:2, Informative)
I find the drug analogy a bit absurd... yes, UIs and APIs should undergo rigorous testing, but when was the last time a person was killed by an underdeveloped and undertested program? There are some notable exceptions (like that X-ray machine a couple of decades or so ago that was giving radiation doses that were off by a factor of ten)... but by and large, people who grab the latest instant messenger beta don't have to worry about being physically hurt.
Drugs, on the other hand, can KILL people if they are not understood and tested fully.
Maybe I'm missing the boat here, but I agree with the idea... I just think the analogy is a bit much.
dude.
A similar reference (Score:3, Informative)
Landauer. I think it's kind of controversial, but he points out that a lot of the promised and perceived productivity gains due to computers have never come about.
Source code *IS* useless ... (Score:3, Insightful)
Source code *is* useless to about 99% of the people that use the program. My aunt Benita isn't going to track down a Microsoft Word bug and fix it even if she HAD the source. She wouldn't care - she'd just wait for the update. So in that context, the source code is useless.
Where the source code does become useful is in the hands of developers, but for users it's just another disk of stuff they get in the package that they'll never use.
Re:Source code *IS* useless ... (Score:2)
My company recently ran into this same thing. We dedicated resources to looking into ways to make our Java code more difficult to decompile. I brought up the fact that they were wasting their time. Why? Because our product is quite large. If someone were to decompile it, they would spend months trying to document the overall design and engineering behind it before they would be competent enough to modify it or use it. Even with complete documentation and source code, it takes a long time for someone to grasp the whole system.
Re:Source code *IS* useless ... (Score:2)
True. But Aunt Benita might go to www.joescodefixingservice.com and pay Joe $50 to fix the bug for her, if she needed it fixed right away. Without the source code, she (and Joe) don't have that option.
Imagine there was something wrong with your car's engine, and the only place that could fix it was Honda Corporate headquarters in Japan. Wouldn't you like to have the option to go to the local mechanic instead?
Re:Source code *IS* useless ... (Score:2)
But it does not follow from this that source code is useless. If the value that you get from giving the code to the tiny fraction of people who will actually do something useful with it is larger than the cost of doing the distribution, then distributing it is worthwhile. Given the low cost of source distribution these days, that may make distributing worthwhile even if only one or two people will ever look at the code.
Besides, users get value from having the source even if they never modify it. I find that it's very useful to compile programs for my system. They wind up being optimized for my processor and take advantage of the other resources that are on my system. This may not be a big thing, but there are certainly more people out there who compile than who write, and source availability helps them.
Re:Source code *IS* useless ... (Score:2)
Well then it's not useless. Make up your mind.
I believe I did - my main point was that usefulness is all about context. Programs are made for end users and from their perspective, the source code is useless.
There are a few people that have said that source code is useless even to developers, which I can see. Unless the documentation fairy has visited the company that produced the source and made absolutely superb docs, chances are you won't be able to make heads or tails of the code without a serious time investment. What percentage of developers have this amount of time to fix a bug that might be fixed in a service pack in a month or two?
source code is useful to me (Score:2)
Forgetting all the myriad reasons source code is useful, the one best thing about getting source code for your product is: it's the ultimate documentation for the program.
I always look at the source code when trying to solve a problem. It's like a reference manual written in a terse language that doesn't slow me down.
It's like including schematics with a piece of test equipment. Why bother with the manual when you can just look and see EXACTLY what that button does and how.
Source code may not be useful to users of software, but to the coders and people doing the actual work, it is a tremendous productivity boost.
Re:source code is useful to me (Score:2)
Yes, exactly. Here's an example:
My workplace is in the market for a new firewall. However, we have some staff who periodically need to do weird things with the network, and want to make sure that the firewall can be set not to interfere with them. Many commercial firewalls do particular classes of filtering (such as flood filtering, rejection of invalid packets, etc.) in a way which is not completely documented. So we can't tell whether they will interfere or not, or which functions we need to enable or disable in order to get them to work for our purposes.
Enter OpenBSD. I am not the sort of person who usually reads kernel source -- whether on the job or for fun -- but I can pick up the kernel source for OpenBSD's pf packet filter and know (for instance) exactly which combinations of TCP flags it rejects as invalid. I can then look at a network dump and tell someone exactly what pf will do with the traffic represented there. I can, in short, prove that my firewall will or will not pass that traffic.
I can't do that with a product that comes with nothing but a guide to "Basic Firewalling for the Beginning Networks Staffer" and a command reference.
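The kind of check the parent describes reading in pf's source can be sketched in a few lines. This is a hypothetical illustration, not OpenBSD's actual code: it flags a couple of TCP flag combinations (no flags at all -- the classic "null scan" -- and the contradictory SYN+FIN and SYN+RST) that stateful filters commonly reject as invalid:

```python
# TCP flag bits, as laid out in the TCP header.
FIN, SYN, RST, PSH, ACK, URG = 0x01, 0x02, 0x04, 0x08, 0x10, 0x20

def flags_look_valid(flags: int) -> bool:
    """Return False for flag combinations no legitimate TCP stack sends.
    Hypothetical sketch of a packet filter's sanity check."""
    if flags == 0:                     # "null scan": no flags set at all
        return False
    if flags & SYN and flags & FIN:    # SYN+FIN never occurs in normal TCP
        return False
    if flags & SYN and flags & RST:    # SYN+RST is likewise contradictory
        return False
    return True

print(flags_look_valid(SYN))        # normal connection attempt -> True
print(flags_look_valid(SYN | FIN))  # bogus combination -> False
```

The parent's point stands: with the real filter's source in hand, you can read the actual predicate instead of guessing at one like this.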
Nerd culture.... (Score:4, Insightful)
Hello?... 'Nerds' are the whole impetus behind the electronics industry. Without nerds wanting to show off with faster processors, cooler video boards or better OSes, a few billion-dollar industries wouldn't exist today.
Hell, Star Wars would have earned $1.50 at the theater without nerds creating the cult that propels it. Nerds created Pong on a friggin' mainframe just to goof off and sparked the video game industry, which quickly became one of the most widespread forms of mass entertainment on earth.
I am nerd, hear me calculate!
Old Outlaw Quote (Score:2)
If we make innovation illegal, only Microsoft will innovate.
Interesting if debatable (Score:3, Interesting)
Fallacy 1 (Computing is Easy) I think is spot on. I shudder when I see some of the "For Dummies" titles out there now.
Fallacy 6 (Computers are Getting Faster), I would have to say I disagree with him on. Sure, my desktop boots slower than my old 386 from 10 years ago. But my Handspring Visor has more memory and boots instantly. Web pages load faster with my DSL connection than they did over my modem (where could you get that 5 years ago?). Most of my compiles are shorter than they were 3 years ago. Sure, people tend to put bloat in, but Moore's law is still winning overall.
This one's really a quibble, but a subpoint of Fallacy 7 asks "How often do you need to do a Fourier transform?" I don't know about need per se, but I kind of like some of the music visualizations that use a whole bunch of frequency-domain stuff.
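For what it's worth, the frequency-domain trick those music visualizations rely on is a discrete Fourier transform. Here's a toy sketch in plain Python (assuming nothing beyond the standard library; real visualizers use an FFT, which is just a faster algorithm for the same result):

```python
import cmath, math

def dft_magnitudes(samples):
    """Naive discrete Fourier transform: O(n^2), fine for a toy example.
    Returns the magnitude of each frequency bin."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n)))
            for k in range(n)]

# A pure tone that completes 5 cycles over 64 samples...
n = 64
tone = [math.cos(2 * math.pi * 5 * t / n) for t in range(n)]
mags = dft_magnitudes(tone)

# ...shows up as a spike in frequency bin 5 (plus its mirror at bin 59).
peak = max(range(n // 2), key=lambda k: mags[k])
print(peak)  # -> 5
```

A visualizer does essentially this on each short chunk of audio and draws the magnitudes as dancing bars.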
One of the subpoints to Fallacy 13 (The Industry Knows where it's going) is
"There haven't been any new ideas in a decade"
My response
"There is no new thing under the sun"
--Ecclesiastes
That said, he certainly seemed to bring up a lot of food for thought. Do you think he'd be willing to do a Slashdot Interview?
Something to think about (Score:2)
Fallacy 10: Open Source is the Answer
- Economic model is doubtful
- Source code is useless
- Motivation for Open Source is inappropriate for most software
- Nerd culture is counter-productive
We write software for peer recognition. We write fancy structures because "it's cool", even when they're not particularly useful.
If this were a Microsoft developer conference, would you expect a keynote speaker to stand up in front of thousands of Microsoft employees and users and claim that Microsoft is a monopoly, produces insecure and unusable software, and only cares about money, not its users? One would expect a security team (think two metres tall and muscular, not SecurityFocus) up on that podium to carry the infiltrator off stage pretty quickly. More likely, it just wouldn't happen. I'm certain Microsoft puts millions just into screening the opinions that are expressed during its conferences, written on its website or posted on Usenet by its employees.
I think the Linux community's willingness to listen to criticism before (perhaps sometimes vehemently) counterarguing is one of its greatest strengths.
I don't agree with what Michi says towards the end of his keynote, but I doubt the organisers of GUADEC will cause too much of a fuss about it (perhaps they will ask him once or twice if he _really_ thinks Open Source is no good for production software).
Source code is useless (Score:5, Funny)
Cheers,
Ian
Re:Source code is useless (Score:3, Insightful)
The source code is VERY useful to me, even though I haven't seen 90% of it. That's because I built my system optimized for the Pentium IV (Athlon at home). You just can't do that with a binary. In addition, I get to build certain packages the way *I* want them built. I love Dia but I don't use Gnome so I get to build Dia without Gnome instead of using the binary package which requires Gnome.
I wouldn't even be running an X server on my workstation if it weren't for the source code, since XFree86 doesn't fully support my video card. But with a simple patch it works great. Yes, this patch could have been posted binary-only, but how the hell would the poster know how I compiled my server? How the hell would he even know which OS I am using? Is he going to have a binary patch available for every possible combination of CPU and OS?
I may never look at the source code for gcc, konqueror or XFree86, but I damn well want it available.
wrong on all (most) counts (Score:4, Informative)
Well, actually it IS easy to learn syntax. This fallacy is just sniping at inexperience. No one teaches you how to write great code; even the greatest C hackers learned their loops one at a time. And most of the rationale behind spaghetti code nowadays is extreme commercial pressure, not any lack of aesthetic sense.
- Teach Yourself C++ in 14 Easy Lessons
- Brain Surgery in 14 Easy Lessons
It's completely arrogant to equate brain surgery with C++. For one thing, lives are not at stake. This analogy suffers from delusions of grandeur.
Fallacy 2: Computers Allow People to Do things They Could Not Do Otherwise
As a matter of fact, they DO empower us. With Word I can do mass mailings in an hour instead of all day. A great word processor will do a lot of the annoying things like spellcheck, thesaurus lookups, and automatic formatting of headings, footnotes, and equations -- which used to be a severe drain on time. A great spreadsheet lets you analyse numbers with impressive ease -- ask any accountant how much the spreadsheet has transformed their practice. This power of analysis has allowed professionals to actually expand their business instead of being mired in drudgery.
Fallacy 3: Computers Increase Productivity
Yes, they do, if used with discipline. See above. The idiots who waste all day adding sound effects are the same ones who in the '40s used to while the day away lobbing sharp pencils into the ceiling. Procrastination has evolved with technology but is essentially the same.
The point about typos in letters written in 1945 illustrates the opposite point.
quote: "Nowadays, we rewrite the letter many, many times, changing fonts, format etc.
We are no better off in terms of letters produced."
Really? You call a letter produced with no typos "no better off"? And all of the ways we can edit documents today can be used effortlessly. The default templates that come with Word do all of this already. It's only the "power users" who seem to obsess like that; people who actually use computers daily for their profession simply get the work done.
Fallacy 4: Programs Help Their Users
True, software companies try to ensnare their users. It's also true that DVD makers try to ensnare their consumers, and that groceries, airlines, and car salesmen all use deceptive marketing, schemes, and even planned obsolescence to suck your wallet drier. You should blame capitalism, not computers.
Fallacy 5: If It's Graphical, It's Easy
The vast majority of GUIs make simple tasks much easier. If you think that arcane text codes and commands are easier than just clicking the Underline button, then you're a
With a GUI, you don't NEED to be a "sysadmin, programmer, typesetter, etc." to get work DONE. You just get work done. In a CLI you have to be all these things and more.
Also, the paperclip has NEVER interrupted me to tell me a joke. Document the allegation!
Fallacy 6: Computers are Getting Faster
Yes, they are. NO software I can buy today can really tax my 2 GHz PC, not even the most bloated WinXP install. My Pentium DOES boot faster than my old 386, Word loads in a few seconds, and my web browsing is limited by my dial-in connection (which I am forced to use because of monopolies and lack of regulation in telecom, not because of any computer issue). It's obvious that a Pentium 4 compiles faster than a 486, and the programs of today have more functionality anyway. EVERYTHING took FAR LONGER 5, 10 years ago.
Hardware is SO FAR AHEAD of software that only id Software can really claim to have taxed the metal. And can YOU tell the difference between 100 and 200 fps? NO! Stick your head out of the benchmark app!
Fallacy 7: Programs are Getting Better
Yes they are. True, many obscure functionalities are barely used, but they are there -- and they barely slow things down in today's 2 GHz age.
I don't buy the anecdote about a single hyperlink inflating an 800 KB document to 2.2 MB. I just tried it myself, by taking 800 KB of raw text and pasting it into Word. Then I added a link. The file size difference is negligible, but don't take my word for it, TRY IT YOURSELF! And then stop propagating foolish incendiary lies.
Fallacy 8: Programmers are Getting Better
Well, if they all bitch and moan like this, maybe this really is a fallacy. But I doubt it. Most of the programmers I know are able to switch between languages and adapt to different environments. Most old-time programmers are surgically attached to their Language of Choice and will never change. Look at the quality of coding being done on the Linux kernel, in Oracle's 8i, in Windows'
BTW, ANY student who majors in CS will know what a core dump is; don't be alarmist. Any student who isn't in CS has no reason to know. So what?
The jab about knowing how to write Excel memos being a mark of qualification is just arrogant snobbery. And the average retention time is from the dotcom boom; it surely isn't true anymore. You have a problem with people cashing in on their skills while they could?
Fallacy 9: Programming is About Data Structures and Algorithms
This is an extremely provincial accusation -- probably better to just nod and agree rather than set off a religious war.
Agreed that programmers are not taught to design. Well, who taught you? If experience sufficed for you to become a self-declared expert, then it will suffice for others, too.
Fallacy 10: Open Source is the Answer
The Answer? The Answer to what? With apologies to Douglas Adams: first off, you'd better figure out just what the Question is!
Re:wrong on all (most) counts (Score:3, Informative)
My theory on that story -
The email address highlighting was set up to include changing them to a custom font. Word was also set up to embed custom fonts in documents. Thus when the only use of that font was deleted, the font was no longer embedded, explaining the 1.4 MB difference.
Re:wrong on all (most) counts (Score:5, Interesting)
Bruce
This document is a fallacy (Score:4, Insightful)
99% of all documents are written to be printed on paper.
Hell, no! 99% of documents (besides programs) I write are emails.
I'm not nitpicking, this is a major flaw in the argument.
What is obsolescence? (Score:2, Insightful)
It reminds me of the guy who had an old 68k Macintosh running Word 5.1. He knew how to use it and it did everything he wanted it to do.
One day the IT people at his company took his Mac away and gave him a new PC because the Mac was "too slow." Well, what happened?
First of all, he was not familiar with the PC or with the new features available in Word. Second, many of these new features were more annoying than useful, especially when the newer version of Word autocorrected something that didn't need correction. Also, considering that this new, more complex software is both more demanding of hardware and more prone to bugs, he found that his new system was slower than his old one and crashed more often.
So, why again was that Mac obsolete?
Fallacy 2 (Score:5, Insightful)
How is this a fallacy? He cites perhaps the handful of examples in which it may NOT be true, but leaves out the seemingly unending number of examples in which it is in fact very true.
- Telephone switching
- All the sophisticated computers running those F-16s we see in Afghanistan
- Power grid / sewage / water / gas control (in most areas)
- The entire Internet
- The level of visual effects in movies
- Computer and video games
- Thousands of different manufacturing processes that need to be computer controlled to get the level of accuracy needed
- Protein folding research
- and so on...
Inmates Are Running the Asylum (Score:3)
If you have anything to do with designing any sort of interface to any sort of product (be it a piece of hardware, a piece of software, a widget, whatever), you should read this book. It will open your eyes.
His arguments don't apply to a lot of people (Score:5, Insightful)
Let's consider Fallacy 2: Computers Allow People to Do things They Could Not Do Otherwise. This is not a fallacy; it's true. As an amateur composer, I can compose and print a piece of music in a tenth of the time it would take me by hand. I am not taking advantage of any automatic composition or any silly A.I. technology. I'm just talking about using Finale 2000 to enter the notes using my MIDI keyboard, edit them quickly with the mouse, and listen to the result through my speakers to make sure I didn't make any musical "typos".
How about scientific research? Scientists now have amazingly powerful tools at their disposal. I know plenty of people who do need to perform a Fourier analysis on a daily basis (see Fallacy 7), and for people like this, who are leading experts in physics but know little about computers, a book like "Learn Matlab in 21 Days" is all they need. I agree that you can't become a good DB programmer or QA person by reading a quick book or studying at DeVry, but most people who use computers aren't programmers and don't need to be.
While we're talking about scientific researchers, clearly "Computers are Getting Faster" is NOT a fallacy for them!
Finally, what about the Internet? Yes, the dot-com bubble burst, but note that all major companies still have websites. It's silly to even consider a company not having one. E-mail definitely allows you to do things that weren't possible (or at least weren't realistic) before, like collaborating on a book or article with someone who lives halfway around the world.
Also, statements like "Programmers are Getting Better" are hard to really analyze. One problem is that there are hundreds of times more programmers now than there were twenty years ago. As a natural consequence, the average level of expertise has gone down a lot. But the best programmers today are a lot better than the best programmers twenty years ago - because they're building off of the best ideas of the last twenty years. And there's no question that even below-average programmers are far more productive today than below-average programmers twenty years ago, simply because there are more high level tools available to them. People who write Visual Basic scripts for internal company programs may be very poor programmers, but if they can get the job done, who cares?
< / RANT >
Sorry. Computers have definitely made my life better, and have enabled me to do many things I never could have done without them, so I get upset when people try to argue that computers suck and that things are basically the same as they were before computers.
History Repeats Itself (Score:3, Informative)
Slides and video from one of these (given on April 18th, 2000) are available here [onthenet.com.au].
He will probably continue to give his talk for many years to come, as it is unlikely things will change much in the short to medium term.
Programs are all the same? (Score:4, Insightful)
-Word
-Matlab
-Apache
-Linux/Embedded
-AutoCAD
...
While you (should) want to make Word as simple as possible, you want to let Apache users configure everything, and you want to let people modify the source of embedded Linux to exactly fit their needs. AutoCAD needs lots of features, but not necessarily source code ('cuz there are fewer programmers in mech. eng. than in EE).
So I'd add fallacy #11: Programs are all the same
-Software management should be done the same way, regardless of the software being produced
-All programs should focus on simplicity, not features
Looking at it from the wrong decade (Score:5, Insightful)
I mean, the cars don't actually go any faster. The speed limits aren't much higher, and if anything, the increased traffic makes us drive slower. Environmental improvements from catalytic converters and the like are nullified by the increased number of cars producing pollution. We add rear-window wipers and CD players, and instead of buying (or building) a more efficient vehicle we demand (and get) SUVs from every last manufacturer on the planet.
So, are cars actually any better, when any technological improvements are effectively nullified by the people driving them?
Well, yes, they are. Cars are more popular every decade because they're easier to use, cheaper to own, and more comfortable for everyone inside. They may not be "better" from a numerical perspective, but anyone driving a 2002 model right after driving a 1962 model will immediately notice the difference.
Computers are the same way. The faster they get, the more we expect them to do. The more people use them, the fewer things they are used for. Developers get sloppier about optimization, and APIs get changed with every iteration of the OS. It takes longer to start up this year's computer than it did 1979's, and people still do the same basic things with them.
But look at how much they've changed: graphical UIs make it easier for anyone to use a computer, instead of having to know what to type in at a text prompt. WYSIWYG doesn't happen 100% of the time, but 98% is a fair sight better than 0%. I may not get anything more interesting using a cable modem than I could using a 14.4 and a BBS, but at least all the commands are on screen instead of hidden behind a hundred scrolling screens of
So people are using all this computing power for nothing more than playing video poker and typing papers. So what? 90% of the population never needed it to do anything more; at least in 2002, they can do it for a lot less money and with a lot less reading. Companies and users may throw away countless man-hours developing skins and pretty interfaces, but at least they're successful in making computers familiar, comfortable, and desirable to the common man.
And besides, look at all the things we can do with a PC today that we couldn't ten years ago: access millions of pages of esoteric information online. Take photos digitally and organize them on CD-R discs, taking up 1/100th of the space for about the same cost. Listen to a thousand songs from a single digital jukebox, no vinyl or tape required. IM your mom across the continent without spending a penny on long-distance. Order anything from the Sears catalog without having to own the catalog. Find a new job. Locate a special interest group. Print a map. Comparison shop.
Or, just write and print out your resume. But at least nowadays, as with Henry Ford's first cars, you're not stuck with "any color you want, as long as it's black."
I have to disagree (Score:3, Interesting)
Lets run down them quick:
I have no idea what the 'Progress' is at the end. Apparently it's quite different from Progress? I guess I had to be there.
I think the designers should focus on design and let everybody else do their job.
Very, very true that we need realistic growth expectations, especially for startups. I remember an anecdote where AOL had projected a certain growth rate without factoring in any sort of slowdown as they reached critical mass. By that projection, they intended to account for something like 15% of the nation's GNP by 2010.
Value of source code (Score:5, Insightful)
Let me explain.
Imagine that you're a young student who wants to become a writer. You ask your teacher, "What do I need to do in order to become a great writer."
Your teacher, if she has a bit of sense about her, will tell you to read all the works that you can, by the best writers, and learn from them. By recreationally reading and studying the works of great writers, as a young person, you will learn to recognize and understand, from experience, what differentiates good writing from bad writing. This is the educational process that, if you are both diligent and lucky, will turn you into a talented writer.
Contrast this advice with the world of computer programming. In the world of software, programs are distributed as object code -- meaning that you can't learn from them by reading them. Plus, they contain "licenses" that purport to deny you the right to study them and learn from them. Any programmer who obtains surreptitious access to some major program's source code runs a serious risk of becoming unemployable -- as a legal liability.
It is as if our original hypothetical budding author were told:
"If you want to become an author, you must be sure to never, ever read anyone else's books, especially popular books by great authors. The way to become an author is to wait until you are of college age, then enroll in a two-year "writing school", where you will learn grammar, spelling, and sentence structure, and then write a series of short essays. For your final project, you will write a single chapter of a book co-authored by the entire class. Once you have completed this two-year course, you will have your degree and, having only studied textbooks, will be fully qualified and ready to join the workforce as a writer, uncontaminated by exposure to real-world writing experience."
If this were the way we taught writing, then our novels would show the same lack of quality -- and lack of progress -- as our software does right now!
So how do we fix this problem?
I believe that the answer is to reform copyright law. The current system of closed-source, proprietary programming technology -- and the lack of any noticeable progress in the craft of programming -- reflects the complete failure of copyright law brought on by the extension of copyright protection to proprietary software.
Patent and Copyright law are supposed to promote progress by placing the best examples of science and technology into the public domain, where they can be studied and learned from. If I want to learn about any physical science or engineering discipline -- if I want to catch up to the current state of the art, all I need do is go to the patent databases and -- right there, are thousands of examples of the latest, real-world scientific technology, written by actual scientists working in actual companies on actual products -- all there for me to study and learn from, and, 17-20 years after disclosure, to freely draw upon and use.
This is the public benefit of the patent system -- the dissemination of practical engineering and scientific knowledge. This is supposed to be the public benefit of the copyright system as well. Copyright is supposed to be a tradeoff: it is supposed to provide monopoly benefits in exchange for publication -- public disclosure. This works just fine in the case of natural-language writings, because the source text is the product, but not for object code, where the product can only, for all practical purposes, be used -- not studied and learned from.
Copyright law could and should be used to leverage a similar public benefit, however, in the case of software, our legislators have completely missed the point of having copyright in the first place. The purpose of copyright is not to protect authors. The purpose of copyright is to create the next generation of authors -- to "promote progress" -- by encouraging the publication of works.
Imagine an alternate universe in which copyright protection were only afforded to software that was distributed in conjunction with full, buildable source code. Companies would have to choose between copyright protection, and DRM protection, instead of the current dysfunctional system, where they are able to effectively obtain copyrights on works that are at the same time, in effect, trade secrets.
In such an alternate universe, young programmers would start out as computer users. However, if they became curious about how their software worked, they would find the source code to their programs waiting for them.
Like the young, would-be writer with a library full of books, they would have the entire world of software to read, study, analyze, and learn from.
One objection to the source code requirement for copyright protection that I have heard is that it would encourage code theft. If companies distributed the source code to their products, I have heard it said, other companies will steal their work and incorporate it into their own code.
The answer to this objection is that, under such a system, they would not be able to do that, because it would be trivially easy to detect such theft. If I were to steal a portion of the Windows source code and add it to my program, then, in order to obtain copyright protection when I went to market, I would be forced to distribute my source code -- with the stolen Windows source code embedded. Microsoft would discover it and shut me down.
In this way, mandatory disclosure of source code would severely limit, or effectively end the practice of code theft. Who's to say who is stealing code today? It's nearly impossible to tell, when only object code is published.
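The detection argument above is easy to illustrate. This is a deliberately crude sketch (the file contents and function name `frobnicate` are made up for illustration); real detection tools normalize identifiers and compare token streams, but the principle -- verbatim copying is obvious once both sides' source is public -- is the same:

```python
def normalize(source: str):
    """Strip indentation and blank lines so trivial reformatting
    doesn't hide a verbatim copy."""
    return {" ".join(line.split()) for line in source.splitlines() if line.strip()}

def shared_lines(a: str, b: str):
    """Lines that appear verbatim (modulo whitespace) in both code bases."""
    return normalize(a) & normalize(b)

# Hypothetical disclosed sources: an original and a suspect repackaging.
original = """
int frobnicate(int x) {
    return (x << 3) ^ 0x5f3759df;
}
"""
suspect = """
// repackaged product
int frobnicate(int x) {
    return (x << 3) ^ 0x5f3759df;
}
"""
print(len(shared_lines(original, suspect)))  # -> 3 distinctive lines in common
```

With only object code published, this comparison is impossible, which is exactly the poster's point.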
Fortunately, free software, and to a lesser extent open source software is bridging this gap. Yesterday's young budding software writers had little to work from. The new generation of young software writers -- and I am talking about high-school age students -- have the entire GNU/Linux/Gnome/KDE system to study and learn from. Free software is the only software that earns its copyright. It's the only software that "promotes progress", because it's the only software that can be freely studied by the general public. It's a functional replacement for the public domain that has been lost/destroyed by misguided, failed copyright law.
In other words, just as having access to a library of great books is everything to a young, budding writer, having access to quality, real-world source code is everything to a young, budding programmer.
In a certain sense, it's probably the only thing that really matters.
Programming is NOT about DS & A (Score:3, Interesting)
When is the last time you thought it necessary to analyze (algorithmically) code that you are writing?
It's far more important to be very good in the programming language you have chosen and its libraries. Knowing how to write quicksort in your latest language is a dead skill -- it's already been done better by someone else and added into the SDK.
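The quicksort point is easy to demonstrate. A hand-rolled version takes a dozen lines; the library routine (Python's built-in `sorted`, in this sketch) gives the same answer while being faster, better tested, and already in the "SDK":

```python
def quicksort(xs):
    """Textbook quicksort -- the 'dead skill' version."""
    if len(xs) <= 1:
        return list(xs)
    pivot, rest = xs[0], xs[1:]
    return (quicksort([x for x in rest if x < pivot])
            + [pivot]
            + quicksort([x for x in rest if x >= pivot]))

data = [33, 5, 82, 5, -1, 17]
# Same result either way:
print(quicksort(data))  # -> [-1, 5, 5, 17, 33, 82]
print(sorted(data))     # -> [-1, 5, 5, 17, 33, 82]
```

Knowing that `sorted` exists, is stable, and takes a `key` argument is the living skill; reimplementing it is the dead one.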
Computer Science is not Engineering (Score:3, Interesting)
Computer Science now gets to join Chemistry, Physics, and Biology as science disciplines that can no longer handle their own engineering. Physicists don't design boilers any more, Chemists don't design refineries, and biologists don't build waste treatment plants. And computer scientists don't build operating systems well.
Enter Software Engineers and Computer Engineers, who get to learn their stuff from the CS boys, but who focus on production, on tradeoffs, on integration, on management. It's the engineers that push for legislation, that make sure you have the education and experience to practice, and that build the systems we are willing to call 'infrastructure'.
What people need to clue into is that we have an industry that has hit the point where it needs to split and to recognize those that advance the theory and those that pave the roads.
where did this guy come from? (Score:3, Insightful)
- Economic model is doubtful
Getting tired of hearing this. A bunch of people start companies using "open source" products, have no real business plan; then, surprise surprise, they fail, and somehow open source is at fault. There are companies making money in the open source arena. Most companies fail, in any arena.
- Source code is useless
I'd like to see him say that after a vendor goes out of business and he has software that must be fixed, or he goes out of business.
Or the vendor changed its focus, and since you are tied to them, your company must change the way it does business.
A lot of companies got surprised when MS finally discovered the Internet and changed its focus.
- Motivation for Open Source is inappropriate for most software
Not sure what he means here. My motivation is twofold: improve my programming knowledge, and make better code. I fail to see how that's inappropriate.
- Nerd culture is counter-productive
Yes, we nerds never ever produce anything, or start big companies *coughapplecough*.
Pretty much every large computer company was started by a nerd.
As a matter of fact, I can't think of any that weren't.
Xerox was founded by a nerd, and so were Apple, Microsoft, IBM, Sun, etc. etc.
Re:I love Fallacy 10 (Score:5, Insightful)
Um, no, doing stuff you don't like to do just "for the money" is what leads to burnout. Nobody ever says "Boy, am I burned out playing around carefree...I gotta take a vacation and do some drudgery!". That said, the concepts of *sound design*, *quality*, *maintainability*, *lifespan*, etc., have to be built into the "programming curriculum". I mean, they don't just hand soldiers a bazooka and say "Ok, if you can pull the trigger, you're ready! Off you go!".
Re:I love Fallacy 10 (Score:5, Funny)
Re:I love Fallacy 10 (Score:3, Insightful)
I am somewhat indignant at this remark. I write software because it is useful to me, and because I feel it might be useful to others. I release it as Free Software because I feel this is the way people will get the most use out of it, and possibly improve it for everyone else as well.
As for Fallacies 10.2 and 10.4, they're easily shown to be invalid via counterexample. Linux, gcc, XFree86, etc., are all cases in point here.
Fallacy 10.3 is my own personal business. Who are you to tell me my motivation is inappropriate? I think the sole desire to make money is an inappropriate motivation. Should I tell you to stop writing software for money? Of course not.
As for F10.1, I consider this highly irrelevant. I don't give two hoots about Open Source or Free Software as an economic model. (In fact, if my Free Software ruins your market, I'd be more than apathetic, I'd be somewhat gleeful. ;-))
Re:I love Fallacy 10 (Score:3, Insightful)
Sounds ok, but s/cool/fun and I disagree completely.
Don't do anything just because it's "cool". What kind of person does that? Some mindless MTV wannabee?
OTOH, if it's fun, well, why not do it?
Fun doesn't lead to burnout, it leads to well, children. erm, no, that's something else...
Re:I love Fallacy 10 (Score:3, Insightful)
Apparently you and I are very different people. I get burned out writing software for money. Writing software that is 'cool', on the other hand, is fun.
At least you get something back and don't ruin the market for the rest of us.
I get something back from my 'cool' software -- reputation and job opportunities. When I want a job with XYZ-Corp, I don't have to make do with just a (mostly unverifiable) resume; I can point them to places on the internet where my code is in use every day, let them download source code of my work and look at it for themselves, and send them the email addresses of my code's happy users. That way they know just what sort of programmer they will get for their money, and I get the job I want.
As for "ruining it for the rest of you", tough shit. I bet you complained about the people in your college classes who set the curve on exams, too.
Re:I love Fallacy 10 (Score:5, Interesting)
Fallacy 10: Open Source is the Answer
- Economic model is doubtful
- Source code is useless
- Motivation for Open Source is inappropriate for most software
- Nerd culture is counter-productive
It seems like he's trying to make the point that many open source developers' motivation is in the wrong place (making technically interesting, but not useful, software), but he does a pretty horrible job conveying that with these bullets.
While there are *some* (I'm not going to make up statistics) who do a pretty horrible job at making useful software because of poor motivation, there are also plenty of Open Source developers whose contributions to core technologies are VERY underappreciated because they were able to make the technology transparent.
Unfortunately, he only begins to make some good points about these issues.
1. He's right insofar as source code isn't everything and won't solve everything, but that hardly makes it useless.
2. Yes, the economic model is pretty doubtful at this point. Some have made it work, others haven't. Some do it for profit, others as philanthropists, and others do it to set standards that will benefit a consortium.
Personally, I think he's just beginning to hit the iceberg by pointing out these fallacies that many of us need to address, but he doesn't follow through with supporting arguments. Instead, it's as if he expects us to just "get it" because he "gets it".
Maybe we can expand on his work and fill in some of the holes.
Re:I love Fallacy 10 (Score:2, Insightful)
In my particular branch of financial services (business appraisal), the equivalent to open source would be all of the articles, books and speeches about newly discovered techniques, insight into court cases and mathematical formulas...in short, industrial infrastructure. Everyone in the industry utilizes and benefits from the sharing of information. And, the people that share this information, such as Shannon Pratt, gain tons of respect and are held in the highest regard.
The equivalent to closed-source work in my industry would be actual valuation assignments. They have to be "closed" and not open for anyone to see due to the sensitive nature of the information we're working with.
The funny thing is, we use all of our openly discussed ideas and techniques to create confidential work. Sort of like a BSD licensing system.
Re:I love Fallacy 10 (Score:4, Interesting)
The Harvard model(turning away qualified applicants because you have more applicants than slots to fill) ain't gonna cut it in the world of software. If the software industry expects to sell its wares, it damn well better hire all the qualified applicants.
Elitism will not work. Because if people have the ability and the time, but no job, they will sit around making high quality software and giving it away for free. And that poses quite a little problem if you have a similar product and want to charge for it, now doesn't it?
The current downturn in the computer industry is by far the worst I have ever seen. Ever since I can remember (back to the early 80's when PCs first arrived) the computer industry had always expanded and provided more jobs. Now it's experiencing its first real downturn and you have a lot of skilled people without jobs. If those skilled people continue to produce software, but they give it away for free, that spells disaster for software companies who expect to sell their product for money.
Open source software will indeed "catch up" to its commercial equivalents. I give KDE less than five years before it is equivalent or superior in every way to Windows. Same thing with the Open Offices, the databases, the programming languages, etc. The software industry has one choice - start paying open-source programmers or die.
I'm not sure if our current economic model can deal with the situation of high quality products being given away en masse for free. I certainly don't see how the software industry can grow like it did in years past. Since the computer industry has led the economy for the last 20 years prior to the current recession, we may never see a recovery. Unless we revamp our current economic system to deal with the fact that what had previously been leading the economy into prosperity (software) is now being given away for free. Also, on a global scale we have to compete with entire economies of scale (China) that don't pay for software.
Re:I love Fallacy 10 (Score:2, Insightful)
Though at first appealing, it's economically impossible. There is a cost to everything, whether financial, labor or opportunity cost.
There is no such thing as a free lunch.
Re:I love Fallacy 10 (Score:5, Insightful)
Money is not the only motivator for people to do work -- even skilled labor. Doctors who work in well-paid jobs in cushy suburban hospitals routinely donate their efforts to free clinics and programs like Doctors Without Borders. Lawyers take on pro-bono cases for causes they believe in. Programmers write Free software. They all do this not for money, but for personal satisfaction, out of a sense of duty, or to gain experience they wouldn't get in their day jobs. Why do you feel it's appropriate to praise the doctors and lawyers who donate their hard-won professional knowledge to the world, but to deride and ridicule the programmers who do so?
Consider this - a volunteer doctor or lawyer can only help one person at a time, whereas an infinite number of people can benefit from the efforts of a volunteer programmer.
Re:I love Fallacy 10 (Score:2)
Who knows? Interesting thought to entertain, though.
Re:I love Fallacy 10 (Score:5, Funny)
Which makes your 680
I salute you.
Re:A Bit more than that (Score:3)
As far as source code being useless... Let me ask you - how many times do you actually go through the code itself and change things? I would be willing to bet that most people here simply download, unzip, untar, make, make install and go on with their lives. If this is even 70% true, then actually *having* the source code IS useless. The only reason you have it is so that you can 'make install' to the path of your choosing. That, in and of itself, is not the reason the source code exists.
Having said that, I disagree with his blanket argument. He should have quantified it somewhat, because some people *do* look at the source code. Some people *do* make additions, and some people *do* feel more secure having it available in case something goes awry. I certainly feel more secure knowing that there is a body of peers overlooking every code change that goes into Our Favorite Operating System (tm). Do I use it? Not usually - most of the time I apt-get install [the binary]. But I like having the *option*.
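For what it's worth, the "download, unzip, untar, make, make install" ritual the parent describes can be sketched as a runnable shell script. The package name "someapp" and every path here are invented; the script fakes up the tarball itself so the whole ritual runs end to end:

```shell
#!/bin/sh
set -e
work=$(mktemp -d)
cd "$work"

# --- stand-in for the downloaded source tarball (made up for illustration) ---
mkdir someapp-1.0
printf '#!/bin/sh\necho hello from someapp\n' > someapp-1.0/someapp
# \t gives the Makefile its required tab-indented recipe lines
printf 'PREFIX ?= /usr/local\ninstall:\n\tmkdir -p $(PREFIX)/bin\n\tinstall -m 755 someapp $(PREFIX)/bin/\n' > someapp-1.0/Makefile
tar czf someapp-1.0.tar.gz someapp-1.0
rm -r someapp-1.0

# --- the ritual itself ---
tar xzf someapp-1.0.tar.gz
cd someapp-1.0
make install PREFIX="$work/local"   # install to the path of your choosing

"$work/local/bin/someapp"
```

The PREFIX override on the `make install` line is exactly the "install to the path of your choosing" option the parent is talking about.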
Re:A Bit more than that (Score:2)
This is why source tarballs are still used even on x86 Linux systems. They can be remarkably less trouble than binary packages for smaller applications.
Re:A Bit more then that (Score:4, Insightful)
It would have been better, perhaps, to say "for most users, source code is useless."
I remember when I was first getting started, and I heard about Open Source. "Hey, cool, I can teach myself to write a word processor!" The truth, though, was that I couldn't. The code to any non-trivial program is going to be very hard to follow if you don't have someone walking you through it, or loads of time to work it through. And that's if you're a programmer. If you aren't, all the source code is good for is taking up space on your hard drive.
Re:A Bit more then that (Score:5, Insightful)
Or if you ask me. This guy has a few interesting points and a bunch of useless or fundamentally flawed points.
The one "decent" point is that people spend too much time dicking around with fonts and colors. But that's a problem that doesn't sit on a hard drive - it sits in the chair in front of the computer.
Fallacy 1: Computing is Easy, he "shows" by pointing out that there are "Teach Yourself" and "for Dummies" books. These are merely titles. He then points out that something like "Air Traffic Control for Morons" is silly. Yes, it is, if you're going to be a professional ATC. But then, if you're seeing whether you're interested in the field, or possibly getting caught up on a new system, it makes sense. These are industry specific publications - grabbing a copy of New Riders' "Essential Python" won't teach you Python if you've never programmed, but it will get you going quickly if you've done C++, Perl, etc. before.
Fallacy 2: Computers Allow People to Do Things They Could Not Do Otherwise This is not a fallacy. I don't have a typewriter, lightboard, razors, specialized photography equipment and a printing press. But I've used a multitude of layout programs and a printer, which is far cheaper and quicker. The fallacy he *is* showing is the concept of "software automatically makes me an expert", which is not a common credo at all. In fact, most office workers will actively try to avoid learning new software because they don't understand the field.
Fallacy 3: Computers Increase Productivity
Walk into an engineering firm or architect's firm and ask them if they want to go back to sketching blueprints. See if most small businesses have a CPA on hire anymore. He says: "It only took five hours to format this memo". If that's the case, the problem sits in the chair, not in the software. Your HR department should take care of that, not IT.
Fallacy 4: Programs Help Their Users Which he then says is false because vendors are only focused on upgrades, money and crushing the competition. And yet, later he'll say why open source is useless. Um.
Fallacy 5: If It's Graphical, It's Easy CLI might be more "powerful" in the hands of a skilled user (I won't give mine up), but well done GUIs can be self explanatory (assuming you know the conventions of that UI). Again, he's phrasing this so it is self-fulfilling. Of *course* there are lousy graphical interfaces out there. There are also quite a few easy ones. And I think the metaphor of windows is a very, very good one in a computing environment when you are moving from task to task constantly (letter, email, check the intranet for some numbers, back to the email, task list, look up a phone number, make a call, pull up client records and make a note, check email, write a letter). For very dedicated tasks, it's less useful, but many real world users need to jump around between tasks. Even if it's just a spreadsheet and solitaire, as one receptionist listed as her "necessary applications".
Fallacy 6: Computers are Getting Faster Computers are getting faster, the experience is not. On my Apple ][, I could turn it on and in a matter of several seconds, be typing in a word processor. But then, it didn't have spell checking, fonts, and I couldn't make a spreadsheet larger than something like 32x64. Humans can only work so quickly, so as long as the computer can keep up, the coders will add new features. There is the issue of "snappiness", but that's a feel issue less than a task speed issue.
"We have come along and destroyed all the gains we have made in hardware" - no, we have leveraged them into more flexibility.
Fallacy 7: Programs are Getting Better
He asks how often I make pie charts. Me? Very seldom. But the guy down the hall who does financial pitches to clients really, really needs that ability. How often do I embed live info into a document? Not often - but the guy who manages the intranet does it all the time. How often do I perform a Fourier analysis? Pretty damn often - when I was in college. Had I gone into a different field, I would be stone dead if I couldn't.
His wife was trying to save a 2.2MB, 2-page Word document onto a floppy disk. Plain text, default font, left aligned. There was one email address, underlined. After 17 minutes of searching, he found a way to turn this email address highlighting off. The document then saved at 800KB.
My comment: Then it wasn't plain text! We all *know* that Word's doc format sucks... so use something else (and yet open source has no advantages in creating sane standards).
Fallacy 9: Programming is About Data Structures and Algorithms
No... Programs = Data Structures + Algorithms. Knowing how they work makes you a better programmer. Period. And Michi? I wrote a program using linked lists last week. Some of us do low level code for specific reasons.
Fallacy 10: Open Source is the Answer I agree with this one - again, because he phrased it carefully. Open Source is an answer, not "the" answer.
Fallacy 11: Standards are the Solution Right then, what is CORBA?
Fallacy 12: We are Making Progress If you bought the shite at the beginning, I assume this makes sense.
Fallacy 13: The Industry Knows Where it is Going Name an industry that does know what is in the future. Hell, name a *person* who is certain of the future and is not delusional.
This guy is a twonk, and almost dangerous: The best UI people on the planet are those working in the car industry. And yet people die all the time because the controls for AC and radio are off to the side and different in each car. I get into a new car, and I have to play for a minute to figure out the lights, shifting, etc... and I just forget about things like cruise control unless I'm on a road trip - and then I ask.
And finally, the biggest thing that shows what an idiot this guy is: We have to stop doing things just because they are fun. If you don't enjoy your activities and you aren't pursuing alternate activities, that's a pathological condition. You are mentally ill.
Me? I enjoy my profession, *and* do the best damn job I can do to make powerful, easy to use and useful solutions for my users.
--
Evan
Re:A Bit more then that (Score:3, Funny)
"Teach Yourself C++ in 10 Minutes"
Chapter 1: Minute 1:
First, realize that if you really thought you could learn a programming language in 10 minutes, you're too stupid to be a programmer.
Spend the remaining 9 minutes 40 seconds letting that sink in.
Sucker.
Re:A Bit more then that (Score:2)
sPh
Re:A Bit more then that (Score:2)
No way it should have been over 2 megs though.
Re:A Bit more then that (Score:2, Informative)
Re:MS Paperclip (Score:2)
No it wouldn't, because you still have the choice not to use the software. However, it would be ethically wrong if you did not have the right to choose a different software package. Even then, the ethics lie with the person who revokes your right.
Software has no ethics, right or wrong. People do.
--Jeff
how much oil could a gargoyle gargle if a gargoyle could gargle oil?
Re:You Fscking Morons (Score:3, Insightful)
Well... (Score:2)
Virg
Re:Car Industry? (Score:4, Insightful)
But the people that design and position the steering wheels, pedals, shifters, turn signals, gauges and door handles do it in such a way that *anyone* can go from one car to another without *any* difficulty or re-education.
The example I give is this: I once drove a friend's car in the dark. I drove it on city streets for close to an hour before I realised the turn signal lever wasn't a lever, but a little switch on the dashboard. However, that switch was right where my fingers expected it to be, and worked just like the turn signal lever of any other car I ever drove, so I didn't know that it was a completely different implementation!
Unfortunately, not all things are like that. When I drive my dad's truck I often turn the headlights off on a rainy day; his headlight switch is where the windshield wipers are on most cars.