Michi Henning on Computing Fallacies 587

Karma Sucks writes "Check out this summary of a keynote at Linux.conf.au by Michi Henning of CORBA fame. It really hits the nail on several points. I especially liked the point about people constantly rewriting letters in these modern times, as opposed to, say, 1945, when it just wasn't worth the pain of re-typing a letter. The only point that didn't make sense in this summary was the one about "source code being useless"."
  • Of course. (Score:3, Insightful)

    by SuiteSisterMary ( 123932 ) <slebrunNO@SPAMgmail.com> on Friday February 08, 2002 @02:44PM (#2975468) Journal
    source code is useless.
    "But wait!" you hear the OSS people cry. "If you have the source code, you can fix bugs!" "Well," I have to ask, with a rather pensive look on my face. "If the people who designed and wrote the software can't find the bugs, what makes you think that throwing somebody at it in their spare cycles is going to help? We want software that works, so we can do our business. Our business is not writing this software." From a business perspective, at least.
    • Re:Of course. (Score:2, Interesting)

      by coyul ( 119455 )

      "Well," I have to ask, with a rather pensive look on my face. "If the people who designed and wrote the software can't find the bugs, what makes you think that throwing somebody at it in their spare cycles is going to help? We want software that works, so we can do our business. Our business is not writing this software."

      "Given enough eyeballs, all bugs are shallow."

      The idea is that if everybody gets to see the code, the problem will be obvious to somebody. It certainly stands a better chance of being found than if only the original coders (who might not see anything wrong -- after all, they wrote it that way in the first place) get to look under the hood.

      • Re:Of course. (Score:2, Insightful)

        by taliver ( 174409 )
        Sure, the very obvious bugs might be caught, but there's a catch:

        If software has bugs that are easy to see, and come up frequently, it's perceived as "buggy", and few people will download it, and fewer eyes will look for the bugs.

        If a bug is harder to reproduce, it probably doesn't come up very often, and not everyone will spend time looking for it. Hell, I'm in a research environment, I know how to code, and my KDE print daemon crashes every day. I don't care enough to submit bug reports or dig into why it's crashing; I'm just going to hope that it magically gets fixed in the next release.

        Exactly how many people here have actually hunted through source code to find the one little bug that annoys them, like the fact that Konq occasionally ignores links or that its JavaScript interpreter is far from perfect? Source code is not a magic bullet, but I'll admit it's better than nothing.
      • Re:Of course. (Score:5, Interesting)

        by nosferatu-man ( 13652 ) <spamdot@homonculus.net> on Friday February 08, 2002 @03:00PM (#2975584) Homepage
        "Given enough eyeballs, all bugs are shallow."

        An absurd fallacy. Perhaps for fetchmail or hello, world! or other similarly sized projects, but nowhere else. Debugging requires not merely a pair of eyeballs, nor even crackerjack programming skills, but mostly an understanding of the problems and compromises that went into the creation of the software system in the first place.

        To produce better software, we need better programmers and better tools, not meaningless platitudes about the business justification of Open Source licensing.

        Peace,
        (jfb)
        • I should point out that I do in fact believe in Free Software, and the
          distribution of source is an invaluable gift. But it's not going to
          make problems disappear without a reevaluation of the culture of
          software.

          Peace,
          (jfb)
        • Re:Of course. (Score:3, Interesting)

          by Jeremi ( 14640 )
          "Given enough eyeballs, all bugs are shallow." An absurd fallacy.


          Care to back this up, say with some examples of projects where large numbers of people swarmed over the code and still couldn't fix the bugs?

    • Re:Of course. (Score:5, Insightful)

      by jidar ( 83795 ) on Friday February 08, 2002 @02:57PM (#2975560)
      "If the people who designed and wrote the software can't find the bugs, what makes you think that throwing somebody at it in their spare cycles is going to help? We want software that works, so we can do our business. Our business is not writing this software."

      Well, that sounds good, but it's been proven wrong in practice.

      At this point, with all of the incredible software that has been produced by open source methods, I don't think it leaves people much room to attack the open source design philosophy. It clearly works and works well, it just works differently than people expect.
    • Re:Of course. (Score:5, Interesting)

      by gmack ( 197796 ) <gmack@@@innerfire...net> on Friday February 08, 2002 @03:00PM (#2975585) Homepage Journal
      Complete BS. I've yet to see any testing that manages to find 100% of the bugs.

      Through my time as a sysadmin I've come across bugs in both open and closed source software, and I have definitely come to appreciate being able to fix the bugs on my own.

      Example: last week's helpdesk software installation. The software was incompatible with qmail. Fix: 5 minutes. Any guesses how long it would have taken to get the closed-source equivalent fixed?
    • Re:Of course. (Score:2, Insightful)

      by Daemonik ( 171801 )
      The greatest benefit of open source code is that when a bug inevitably does show up, it can quickly be fixed without waiting for some uber corporation to:
      • admit there's a bug in the first place
      • see if they can wait and force you to upgrade the whole product to get this bug fixed
      • decide if you rank as a large enough financial interest, or if this bug will affect a significant number of users, to actually put an engineer on this bug for a few hours to create a patch
    • Re:Of course. (Score:5, Insightful)

      by DrSkwid ( 118965 ) on Friday February 08, 2002 @03:10PM (#2975666) Journal
      *sigh*

      source code isn't necessarily about bugs

      it's also about insulation from change, or from situations the author couldn't see, test, have predicted, or have known about.

      I may never READ the source code for 99.9% of my apps, but the day something gets changed and Eric's OSS project fails, I can go find out why and fix it. Without the source I'm screwed.

      And as it happened I did exactly that yesterday, when the plan9 imap file server didn't get along with Courier. By having the source code I was able to track the problem down to a wrong assumption in the code AND a config problem in Courier.

      If I'd had no source code I would have been screwed.

      So Mr Henning can't be that clever if he can't even see what the potential is.

      He's making the classic mistake of saying something is worthless to everybody when he means it's worthless to him.
      • Re:Of course. (Score:3, Interesting)

        by Tony-A ( 29931 )
        Building plans are worthless to most people, most of the time. Still better if they exist.
        Two latent bugs. With the source, it's almost as good as if the bugs didn't exist. The overall effect is getting 5-nines reliability at the cost of 3-nines reliability. Also, if you are facing a scissors/rock/paper scenario, any assumption you make will be wrong in some cases.
        For most people, most things, most of the time, source code is useless. For most people, 5-nines reliability is a useless expense.
    • Insurance (Score:4, Interesting)

      by gmhowell ( 26755 ) <gmhowell@gmail.com> on Friday February 08, 2002 @04:00PM (#2976044) Homepage Journal
      Company I work for purchased a system in the early '80s. The company was new. Who knew if it would last? This was pretty damned important software. So, the source went into an escrowed safety deposit box. If they disappeared or filed for bankruptcy, we got to open the box.

      Never had to use it, never wanted to use it. But it was there, and it allowed us to pick something other than IBM (way too expensive at the time; not sure if they even offer a similar product anymore).

    • Re:Of course. (Score:5, Insightful)

      by dillon_rinker ( 17944 ) on Friday February 08, 2002 @04:03PM (#2976062) Homepage
      I'll take this in reverse...

      Our business is not writing this software.

      I work for a law firm. Our business is to produce legal documents and legal arguments. Our business is not accounting, yet we have accountants on staff. Our business is not records management, yet we have records management specialists on staff. Our business is not facility maintenance, yet we have facility maintainers on staff. Our business is not programming, yet we have programmers on staff.

      We want software that works, so we can do our business.
      All commercial software is broken in some way (exceptions number in the single digits). Hiding the source code hinders your ability to have software that works. It follows that source code hiding hinders your ability to do your business.

      what makes you think that throwing somebody at it in their spare cycles is going to help?
      We have 400 attorneys. A bug (misfeature, non-optimized routine, poorly designed UI, etc.) that costs us three minutes per attorney per day costs us 20 attorney-hours, or $3,000 daily at our average billing rate of $150/hr. It may well be worth our while to hire a programmer at $50/hr to fix the problem. Without source code availability, we have no choice but to burn money on a daily basis.

      If the people who designed and wrote the software can't find the bugs
      The bug may be specific to the way we use the software, or it may be preventing us from using the software the way we want. Perhaps we want a dialog box organized in the way that is most efficient for us. Maybe a program has its data path hardcoded and we want to store data someplace else. One program we have produces a hash that is used for the filename; I'd like to see a different algorithm used (for reasons too complex to go into now). I'm hardly a programmer (I know a bit of C, a bit of VB), yet I'm confident that I could, by studying the code, determine whether these changes are feasible and locate where code needed to be changed. A pro could be hired to validate my opinions (or refute them!); another pro could be hired to do the work.

      Here's another reason why source-code availability and the right to modify and recompile it is a good thing to have: companies go out of business. We use a program called Wealth Transfer Planning that is pretty cool; it automates the creation of wills, trusts, estate plans, etc. The company that makes it has disappeared. We are stuck with ALL our bugs and NO possibility of improvement to either the content or the engine.

      • Re:Of course. (Score:3, Insightful)

        by istartedi ( 132515 )

        Guys, guys, you're all missing the point. So is Henning. In response to the question "Is Open Source the solution or isn't it?" I answer with a resounding "Yes". :)

        Examples where the proprietary model has excelled: highly optimized code (Intel compiler); user friendliness (MacOS, Windows); timeliness (Sun's original Java implementation; was any OSS project working on cross-platform GUIs before Java?).

        Examples where the Open Source model has excelled: portability (are there any platforms that don't support the JPEG libraries?); endurance (LISP stuff from the 80s will never die); security (OpenBSD or the NSA's Linux).

        Examples where proprietary has failed: ongoing access and support for legacy products (where can I legally buy MS-DOS and get support for it?); broken formats (Word documents); security (Outlook); customer relations (product activation, no thank you).

        Examples where Open Source has failed: as a business model (Loki); time to market (HURD, where are you?); political entanglements (say "GNU" before everything or you are not my friend).

        When choosing, you have to look at the strengths and weaknesses and decide what is important to you. Sometimes that will lead to Open Source software as the correct choice. Other times it will lead to proprietary. If you are lucky you can mix-n-match. That's why I love using MSVC (proprietary) to write Freeware (proprietary) that uses IJG code (Truly Free Open Source), and using the resulting app to generate frames that I pass through Gifsicle (GPL) to generate GIFs (proprietary format) to put on the Internet (Open Standards) that most people will view through IE (proprietary). And everybody is happy if they choose to be.

  • by wrinkledshirt ( 228541 ) on Friday February 08, 2002 @02:44PM (#2975469) Homepage
    Fallacy 10: Open Source is the Answer

    - Economic model is doubtful

    - Source code is useless

    - Motivation for Open Source is inappropriate for most software

    - Nerd culture is counter-productive


    I'd like to see him come here and say that. ;)
      I agree with those points for the most part (they may be overstated a bit too much for my taste).

      The economic model is so far proven to work for only a few companies and not for the industry as a whole. I suspect that basing the industry entirely on service and support is going to drive the prices for those functions much higher and frustrate most users. By offering the program for free, most users are going to expect free support.

      Another relevant thought is that without closed-source companies to support the programmers who are donating software, how are these programmers going to survive? A recent article in the Register (http://www.theregister.co.uk/content/4/23935.html) noted that most Open Source programmers are employed by closed-source companies. If they damage their employer's ability to deploy software, what good does that do anybody?

      Source code IS useless if you don't have time to look over it or modify it. It only benefits the 5% or so that are actively involved in maintaining or modifying the code. The remaining consumers get absolutely zero benefit from it.

      I'm not sure I can argue either for or against the third point, except to say that once the money is removed from the equation, how do you force change without innovation? I.e., fixes instead of new features?

      The nerd culture IS counterproductive, since it emphasizes an antagonism toward those who run businesses (suits) and those who sell products (marketroids). In order for Open Source to succeed, there is going to have to be a meeting of the minds on a massive level, not just a few companies here and there.
    • I'll come here and say that, at least about one of the points: "Motivation for Open Source is inappropriate for most software." That is my main beef with open source advocacy: it will only produce good software that does things that *programmers* really want and need. Hence such things as Apache, whereas many Linux advocates see no need for a text processor more sophisticated than emacs. My main obstacle to moving entirely to Linux (other than games) is business applications, like Timeslips, Peachtree Accounting, Kleinrock's Tax Expert, and much other tax and legal software. Sure, I could find open-source alternatives *almost* as good, but that would entail reconfiguring my entire way of running my practice. Perhaps I should; but it is simply much more practical for a lawyer to use Windows and available Windows programs -- and I doubt that the open source community will produce a viable alternative soon. The bazaar does have its advantages; but it is not a replacement for the cathedral.
    • Well, the economic model for open source is doubtful, under current conditions at least. I was a very early customer of Cygnus. We needed to pay them in part because g++ was so horrendously buggy in those days: it's easier to have a support business to support code that badly needs it.

      Source code isn't useless, but it is useless to many people (those without the skill to change it or the funds to hire someone to do so). It is very useful to folks like me. But most computers are embedded systems programmed in assembly language. How useful would the source code to your microwave oven be to you?

      The motivation for open source works very well for tools that the programmer himself/herself needs, for producing tools with rough edges that can be handled successfully by other programmers. It gets harder with applications; in these cases the only successful open source projects clone some proprietary design (the Gimp, Gnumeric, etc.). The truly original open source creations, like Perl, Python, and Emacs, are environments built by nerds for nerds.

      The nerd culture can be counterproductive. Nerds focus on minutiae and often don't see the big picture. In many cases, nerds find themselves working for someone who has the opposite limitation. This should be no surprise. Also, many programmers are the wrong kind of nerd. Civil and mechanical engineers obsess over getting everything correct, because they are well aware that if they don't, people may die and careers may end. Too many programmers lack rigor and think of themselves as artists, not engineers, even if they use the term "software engineer" in their title.

      A key issue, that software is brittle and downright dangerous, is not addressed by either proprietary or open source software today. If we fix this by requiring proprietary software to have a warranty against severe defects, what happens with open source software, where the distributor cannot possibly provide a warranty?

      I'm afraid that Microsoft may start to get it about security before the open source movement does. If you think that the open source movement gets it, then why did the Debian project need to issue 81 security updates in 2001? Both Microsoft and Linux are putting out software that is too buggy, and the BSD world isn't as much better as they claim, despite better practices (code auditing is great, but a lot of work: move most of what Linux distributions call the system into "ports" and then the bugs don't count against you).

      I think that open source can work, but not in the current economic climate (native to the US, being forced on other countries through the GATT and the like), which elevates "intellectual property" to a universal value. A funding mechanism is needed. One possibility is that governments fund it. This would actually save taxpayers a lot of money, since governments are currently paying Microsoft and the like hundreds of millions just for Office, and paying again every few years for upgrades. That would pay for a lot of full-time programmers.

  • by sphealey ( 2855 ) on Friday February 08, 2002 @02:47PM (#2975489)
    Penicillin - 1920's technology
    Iowa/Yamato class battleships - 1920's technology
    Apollo moon rockets - 1940's with a dash of 50's
    Polio vaccine - 1880's with a dash of 1940's
    Transistor - 1930's
    Bulk transport system, rail - 1860's
    Bulk transport system, car/truck - 1920's
    Airplane - 1910's
    Fast airplane - 1950's

    Yup, makin' progress fast.

    sPh
  • by dghcasp ( 459766 ) on Friday February 08, 2002 @02:48PM (#2975497)

    Is that they enable me to make a large salary without having to turn to medicine, law or crime.

    Recently, I was walking back to my car late at night in downtown San Francisco. A homeless person standing in front of an all night donut shop asked me for a dollar. I said no, but invited him into the shop and bought him a donut and a coffee.

    I would lay money that that $1.25 spent on a human being had more impact on society than all the software I have written over the past 20 years of my career.

  • "We still produce exactly the same amount of letters as in 1945. "
    Where does he come up with that? We are about twice as productive as in 1945 (amount of goods produced per hour of work). We have taken this productivity gain in more goods and services in place of working less. You could have a 1945 standard of living and take 6 months off per year. Just start to think about what we now consider necessities: a car that goes 100,000 miles before it is tuned up, a large color TV, a phone in every room, an answering machine, air conditioning, jet travel; it goes on and on.
      We are about twice as productive as in 1945 (amount of goods produced per hour of work)

      It's a matter of how you count. If salaried employees are forced to work longer now than in 1945 (they are), then productivity goes up. If tax laws change (they have), then productivity goes up. Use cheaper labor and more efficient machines, and productivity goes up. The economy makes a difference, since idle workers aren't productive either. So while yes, we are more productive now than before, the actual delta is hard to measure and certainly much smaller than the official figures.

      More importantly, how much of this is the result of using computers (as opposed to increased education, and market pressure)? Quite a bit in some specialized fields like air traffic control, telecommunications, or warehouse management. But for general office work?

      And now factor in the amount of money business spends on computers. I think during the 1980's that IT ate away half of all capital spending in the US. Are the secr^H^H^H admins more productive now?

    • by JoeBuck ( 7947 ) on Friday February 08, 2002 @03:25PM (#2975775) Homepage

      One cannot have a 1945 standard of living on 6 months of work per year.

      In 1949, thanks to labor unions, an illiterate coal miner could afford to buy a house and his wife could afford to stay at home and take care of the kids full time, because he made the equivalent of $50k/year in today's money. Try that today. For people who are neither professionals nor managers, real income peaked in 1973 and has been dropping precipitously ever since.

  • One size != all (Score:5, Insightful)

    by SirSlud ( 67381 ) on Friday February 08, 2002 @02:49PM (#2975508) Homepage
    > Nerd culture is counter-productive

    Nerds are the computer equivalent of the Enos, the Yoko Onos, the Peter Gabriels: very creative, with a propensity to push boundaries. Their influences may not be appropriate for the masses, but they lay the framework for those who compute and program (or write pop and rock) to achieve practical purposes. Practical people see no value in thinking outside the boundaries of current methods, but are more than happy to stand on the shoulders of those that do (as well they should).

    What's wrong with different people born for different goals? Even if you don't directly contribute to the masses, most change in fundamental social systems (and technical systems) starts with someone rejecting the norm. As well it should. Leave them alone and let them nerd!
    • Re:One size != all (Score:4, Insightful)

      by km790816 ( 78280 ) <wqhq3gx02 AT sneakemail DOT com> on Friday February 08, 2002 @03:13PM (#2975693)
      Nerding is fine and good. I consider myself a nerd (some of the time). I like mountains of weird and crazy features and talk of obscure technical jargon.

      The point is that when one is making software to be used by the masses, nerdiness is a bad thing. Nerds like lots of features, we like complexity, we like living in our little world and working on our little pet project without much care for what others want.

      This in general is BAD for most people, most of the time. They want something that works, that makes sense, that's easy and simple and gets the job done. They couldn't care less about command line options, flashing text, and alpha blending.

      That's the point that was being made and it's a great one.
    • by larsal ( 128351 ) on Friday February 08, 2002 @03:18PM (#2975724)

      It's not counterproductive to have people pushing the envelope, it's counterproductive to have people outside of the mainstream dictating to those in it what their needs are.


      Despite advances in UIs, computers are still designed as general-purpose hobby devices, rather than for the specific functions for which the majority of their sales are used. When users complain that it doesn't make sense to have to log in to a system or to "start" a word processor, or to "double-click" to "open" a file through a graphical icon, they're simply told that they don't understand the technology. Same when they have to figure out [to avoid being scammed] what kind of RAM they need with their new P4 processors.


      The point is that for products to be useful and effective, they need to be designed with more consideration for the needs of the user; and much of the time, that which is "neat" to enthusiasts has held sway over design at the expense of what would be useful [see featurebloat].


      BTW: impractical thinking is not necessarily visionary. It might just be impractical.


      Larsal

      • I agree 100%. That's what I said. I was only saying that just because nerdiness is bad for the masses doesn't mean that nerdiness should be done away with. My point was that nerdiness is being utilized too close to the practical, for-the-masses end of the business, but that it's still essential for the development of the industry, as it has been in all industries. Someone does something because they love it (which immediately makes it unsuitable for most people, since love is very personal, and can/should only be able to serve adjacent communities/ideologies), and then someone does something with it to make money from it.

        That's the world we live in. Those who get paid do so at the expense of the kind of creativity that isn't appropriate for a popular mass.
    • It's sort of like the fashion industry, what the models wear on the runways isn't what you're going to buy in Sears, but it does set the "trend".

    • Re:One size != all (Score:4, Informative)

      by Witchblade ( 9771 ) on Friday February 08, 2002 @04:41PM (#2976297) Homepage
      Fallacy 7: Programs are Getting Better
      - How often do you need to perform a Fourier analysis?

      Several times a day, usually. How often do I need to email a document to more than one person? Almost never. One tool is not adequate for all people. This is a fact all too often overlooked in arguing for software applications as standards.



  • by testpoint ( 176998 ) on Friday February 08, 2002 @02:50PM (#2975512)
    It really hits the nail on several points.
    I like my metaphors stirred, not mixed.
  • Good points! (Score:5, Interesting)

    by MattRog ( 527508 ) on Friday February 08, 2002 @02:50PM (#2975519)
    Fallacy 9: Programming is About Data Structures and Algorithms

    I'll agree here, although I see it most in database design. With the advent of super-fast DBs such as MySQL there has been a FLOOD of horribly written applications that utilize them. For instance, you'll see every column defined as CHAR( 255 ), or every table prepended with AUTO_INCREMENT columns even when they are not necessary. Indexing is poor or non-existent, and tables are horribly in need of normalization.

    Some finer points in design; I see some stuff like this a lot as well:
    function bob( varlist ) { $var = $joe + 12345; return $var; }
    You're wasting memory and such on the variable declaration and assignment; simply return $joe + 12345;.

    Fallacy 12: We are Making Progress
    - Progress in quality assurance has been remarkably slow

    I used to work in QA for a software company and I wouldn't say that I was the worst programmer there, but I think the problem is that 90% of the QA staff WERE NOT PROGRAMMERS or didn't have access to the source. Basically, QA reports bugs, they go into the queue, and then a developer, if he has any time left over from all his code development, meetings and such, may have a chance to get to the bug. It would be nice if the QA staff, who may have software programming skills, were allowed to be developers as well (e.g. all the rights of a developer, but QA is their main focus). They would attend the same dev meetings and such, which gives them the insight into the architecture to allow them to fix bugs which have been approved by management.
    So in effect, have two programming teams.
    • Re:Good points! (Score:2, Informative)

      by BigZaphod ( 12942 )
      Some finer points in design; I see some stuff like this a lot as well: function bob( varlist ) { $var = $joe + 12345; return $var; } You're wasting memory and such for the variable declaration and assignment, simply return $joe + 12345;.

      I know that you are talking about (what appears to be) PHP here, but I thought I'd toss in my 2 cents. In compiled languages, small differences like that don't matter. If your optimizer doesn't suck (and most don't, these days), it will reorder your code to be as efficient as it can get it to be, and that includes things like eliminating unneeded variables, etc. So maybe what you are seeing is developers who are used to working with compiled languages that include a good optimizer and who go for good, clear code as a first rule of thumb. No, that doesn't make it right; it's just something to be aware of.
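
      To make that concrete, here's a minimal C sketch (the function names are made up for illustration); any optimizing compiler, e.g. gcc with -O2, should emit identical machine code for both versions:

          /* Minimal sketch with hypothetical names: the temporary in
             the first version is eliminated by the optimizer, so both
             functions compile to the same code under e.g. gcc -O2. */
          int bob_with_temp(int joe)
          {
              int var = joe + 12345;  /* temporary, optimized away */
              return var;
          }

          int bob_direct(int joe)
          {
              return joe + 12345;
          }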
      • Correct; I didn't want to type a whole lot of C++ or whatnot, so I stuck to type-independent PHP. :D

        From talking to Zeev (author of the Zend Engine), he said that there would be a slight performance hit from doing something like that (I assume it is, as you said, the reorg and such).

        My style of programming is to keep excessive things like that to a minimum because 10ms more might not seem like much until your program (or in this case, web page) is hit 100 times a second. :D
      • Re:Good points! (Score:4, Insightful)

        by Steveftoth ( 78419 ) on Friday February 08, 2002 @04:43PM (#2976327) Homepage
        Another point is that
        function bob( varlist ) { $var = $joe + 12345; return $var; }
        and
        function bob( varlist ) { return $joe + 12345; }
        might actually be the same number of operations. Not because of the compiler, but just because of the way that the machine works.

        Regardless, as said before, this kind of micro-optimizing is pointless and dumb. It is not programming; it is coding. Coding is a mechanical process. Programming is an art. You can optimize your code, but it is almost impossible to optimize an API. Designing APIs is where I think all modern languages have totally failed us. It is way too easy to write a bad API with today's languages. I've had to implement too many crazy interfaces written by people who didn't think them through. I've also created interfaces that later I went back and scrapped because they were dumb. This is the way programming is, and it doesn't make any sense.
    • Some finer points in design; I see some stuff like this a lot as well:
      function bob( varlist ) { $var = $joe + 12345; return $var; }
      You're wasting memory and such for the variable declaration and assignment, simply return $joe + 12345;.


      Well, when you simply return $joe + 12345, the compiler creates a variable of the same type as $joe, gives it the new value, and then returns it, negating any hoped-for savings on memory.
    • Re:Good points! (Score:3, Insightful)

      I disagree, somewhat. It is important to have programmers in a QA environment, but I find that it's just as important to have complete computer morons in there too.


      As a developer, I inherently know what NOT to do. A computer moron doesn't know these things, and will use it like the end user will. An experienced programmer will use my programs like I will, and will usually get the tough errors back to me. A computer moron will get the obscure ones back, and it tends to be those errors which make it through to the end user.

      • Re:Good points! (Score:3, Interesting)

        by MattRog ( 527508 )
        Kris,

        I agree with you as well -- if I came across as 'QA should only be programmers' then I apologize; that was not my intent.

        QA is more than just 'poking' at the program and seeing if it breaks. It's authoring test procedures, finding new and interesting ways to break the program, interacting with other developers and management, and a whole lot more. As a programmer I know I hated to write test procedures; it is very, very boring, and as the complexity of what you are testing increases linearly, the complexity of your test procedure increases exponentially. :D However, there were guys there who, although they didn't know much about programming, wrote EXCELLENT and in-depth test procedures and saved my butt many a time. :)

        However, we'd write up bugs such as "Inserting 32 characters in field XYZ on form 123 causes program to crash" which, in the grand scheme of things, could be viewed as either a "Show Stopper" (highest priority) or a "Do We Care/When We Have Time" sort of a bug. Considering that adding range checking to a form is trivial, giving QA clearance to fix that would result in a much better program (again, provided the QA developers are qualified) and give the regular developers more time (since we'd find 30 or so of these things on a single form) to fix the hard-core bugs or develop new features.
        • Re:Good points! (Score:3, Informative)

          by cduffy ( 652 )
          QA is more than just 'poking' at the program and seeing if it breaks. It's authoring test procedures, finding new and interesting ways to break the program, interacting with other developers and management, and a whole lot more.

          Thank you!

          I'm (sort of) in my company's QA department, and get a whole lot of guff about it from the other engineers ("You're QA? Ewwww"). Thing is, QA doesn't need to be a bad job -- I've spent my last few years largely working on (nifty, new) automated testing tools, and love it. There's nothing quite so interesting as coming up with a test for something that on the surface doesn't look practical to test programmatically, or putting together a home-grown piece of software that does a task in a massively cross-platform manner that comparable (expensive) commercial solutions could only do on one or two platforms.

          Now, writing loads of Expect scripts has never been my thing (that's what the /other/ QA guys do), and I'll probably find my way back into product development if I find that the other tools I build no longer need heavy development.

          Anyhow, I'm just glad to see someone putting QA in a light that reflects that it doesn't have to be a boring and tedious job done by those who don't have what it takes to be /real/ engineers. Thanks. :)
    • Re:Good points! (Score:4, Insightful)

      by Salamander ( 33735 ) <jeff@ p l . a t y p.us> on Friday February 08, 2002 @03:40PM (#2975883) Homepage Journal
      Some finer points in design; I see some stuff like this a lot as well: function bob( varlist ) { $var = $joe + 12345; return $var; }

      That's only a "fine point of design" to a 15-year-old. No, scratch that; it's not design at all. Any decent or even semi-decent compiler or interpreter should be able to make that particular optimization all by itself. A real fine point of design is whether to use events or threads, update or invalidate, distance vector or shortest path, this class hierarchy or that class hierarchy, this module layering or that module layering...stuff that can't be automated or even delegated to an inexperienced programmer.

      It would be nice if the QA staff, who may have software programming skills, would be allowed to be developers as well (e.g. all the rights of a developer but QA is their main focus).
      ...
      in effect, have two programming teams.

      Dream much? Ever hear of specialization? You're right that QA tends to get the short end of the stick in a lot of ways. QA engineers should have some programming experience, should attend (some) development meetings, should have more authority wrt the disposition of bugs...but they should not be checking in production code. Good QA is hard work, requiring its own specialized set of knowledge and skills. Any QA engineer who's making (and, one would hope, unit testing) their own changes to the production code is not doing QA, and QA needs to get done. Hire another developer or extend the schedule, but don't take good QA engineers away from the necessary task that they do best to have them do someone else's job.

    • Re:Good points! (Score:3, Insightful)

      by WillWare ( 11935 )
      It would be nice if the QA staff, who may have software programming skills, would be allowed to be developers as well (e.g. all the rights of a developer but QA is their main focus). They attend the same dev meetings and such which gives them the insight to the architecture to allow them to fix bugs which have been approved by management. So in effect, have two programming teams.

      The problem with this is that you really do want two different loci of responsibility for development and QA. You don't even want the two teams to have the same manager (or generally, the same chain of command) because that creates a conflict of interest for the manager. While wearing the DevMgr hat, he wants to get stuff out the door quickly, so he's rewarded for cutting corners when he puts on the QAMgr hat.

      It might work to do what you suggest, as long as the chains of command were kept distinct so only the people at the bottom of the hierarchy ever wore both hats. But do you really want to work for two bosses at the same time, and be answerable to both?

      Another possible model would be a "clean room" approach, where you're given read-only access to the source database, and you can tinker on your own machine. You can propose a specific change to the developer working on your bug, but he checks it in. Things are still sped up that way, and you avoid blurring the responsibilities.

  • by Waffle Iron ( 339739 ) on Friday February 08, 2002 @02:50PM (#2975520)
    Back then it was okay to have 3 or so typos per page without re-typing the entire letter.

    but now is ok for ppl 2 put 42 typos in inrnet msg & hit submitt

  • by www.sorehands.com ( 142825 ) on Friday February 08, 2002 @02:51PM (#2975521) Homepage
    Well, that is redundant. But even so, it makes me think of the book "Sex for Dummies," which gave me a whole new perspective on RTFM.
  • by Em Emalb ( 452530 ) <ememalb.gmail@com> on Friday February 08, 2002 @02:52PM (#2975532) Homepage Journal
    From the article:

    "The best UI people on the planet are those working in the car industry.
    We need to make it a criminal law to change certain API's. There are potentially
    huge impacts. When we produce a new drug, we can't just release it to millions of
    people without some sort of testing."

    yeah, but how long should the testing cycle be? For example, we hear all the time about drugs being recalled because of illnesses caused by their use. Beta testing is a great way to do this; however, even then you can't know until your program is running on a lot of machines in different environments, with different variables.

    So, what can you do? Well, you release the software after doing as much testing as possible, and wait to see the results... then patch, patch, patch... which is the way it's being done now. That's why early adopters know (or should know) what they are getting into, and why most of the companies I have dealt with (running Win2k) waited for SP2 to come out before upgrading.

    Or, you could establish some sort of body, like the FDA, that tests the heck out of software for a while before shipping. Problem with that, though, is that by the time it is approved, it's obsolete.

    Other than that, this was a most excellent read.
    • "The best UI people on the planet are those working in the car industry. We need to make it a criminal law to change certain API's. There are potentially huge impacts. When we produce a new drug, we can't just release it to millions of people without some sort of testing."

      i find the drug analogy a bit absurd... yes, UIs and APIs should undergo rigorous testing, but when was the last time a person was killed by an underdeveloped and undertested program??? there are some notable exceptions (like that x-ray machine a couple of decades or so ago that was giving radiation doses that were off by a factor of ten)... but by and large, people who grab the latest instant messenger beta don't have to worry about being physically hurt.

      drugs on the other hand can KILL people if they are not understood and tested fully.

      maybe i'm missing the boat here, but i agree with the idea... i just think the analogy is a bit much.

      dude.
  • A similar reference (Score:3, Informative)

    by big.ears ( 136789 ) on Friday February 08, 2002 @02:52PM (#2975535) Homepage
    If you found some of his earlier points interesting, you may want to read the 1995 book "The Trouble with Computers" by Tom Landauer [colorado.edu]. I think it's kind of controversial, but he points out that a lot of the promised and perceived productivity gains due to computers have never come about.
  • by rlowe69 ( 74867 ) <ryanlowe_AThotmailDOTcom> on Friday February 08, 2002 @02:53PM (#2975545) Homepage
    "The only point that didn't made sense in this summary was the one about "source code being useless"."

    Source code *is* useless to about 99% of the people that use the program. My aunt Benita isn't going to track down a Microsoft Word bug and fix it even if she HAD the source. She wouldn't care - she'd just wait for the update. So in that context, the source code is useless.

    Where the source code does become useful is in the hands of developers, but for users it's just another disk of stuff they get in the package that they'll never use.
    • And even in the case of 80% (or so) of developers it's useless. For any given piece of software that is sufficiently complex, it takes a lot of effort just to learn the code well enough to be able to modify it. If you were given the code for MS Word, how many developers would have the time to go through it and attempt to modify it? Not many.

      My company recently ran into this same thing. We dedicated resources to looking into ways to make our Java code more difficult to decompile. I brought up the fact that they were wasting their time. Why? Because our product is quite large. If someone were to decompile it, they would spend months trying to understand the overall design and engineering behind it before they would be competent enough to modify it or use it. Even with complete documentation and source code, it takes a long time for someone to grasp the whole system.
    • My aunt Benita isn't going to track down a Microsoft Word bug and fix it even if she HAD the source.


      True. But Aunt Benita might go to www.joescodefixingservice.com and pay Joe $50 to fix the bug for her, if she needed it fixed right away. Without the source code, neither she nor Joe has that option.


      Imagine there was something wrong with your car's engine, and the only place that could fix it was Honda Corporate headquarters in Japan. Wouldn't you like to have the option to go to the local mechanic instead?

    • Source code *is* useless to about 99% of the people that use the program.

      But it does not follow from this that source code is useless. If the value that you get from giving the code to the tiny fraction of people who will actually do something useful with it is larger than the cost of doing the distribution, then distributing it is worthwhile. Given the low cost of source distribution these days, that may make distributing worthwhile even if only one or two people will ever look at the code.

      Besides, users get value from having the source even if they never modify it. I find that it's very useful to compile programs for my system. They wind up being optimized for my processor and take advantage of the other resources that are on my system. This may not be a big thing, but there are certainly more people out there who compile than who write, and source availability helps them.
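
      A minimal sketch of what I mean, assuming a C program and gcc (the file name is made up, and which -march values exist depends on your gcc version):

          # build a program tuned for a Pentium 4 rather than generic x86
          gcc -O2 -march=pentium4 -o myapp myapp.c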

  • Forgetting all the myriad reasons source code is useful, the one best thing about getting source code for your product is: it's the ultimate documentation for the program.

    I always look at the source code when trying to solve a problem. It's like a reference manual written in a terse language that doesn't slow me down.

    It's like including schematics with a piece of test equipment. Why bother with the manual when you can just look and see EXACTLY what that button does and how.

    Source code may not be useful to users of software, but to the coders and people doing the actual work, it is a tremendous productivity boost.

    • Forgetting all the myriad reasons source code is useful, the one best thing about getting source code for your product is: it's the ultimate documentation for the program.

      Yes, exactly. Here's an example:

      My workplace is in the market for a new firewall. However, we have some staff who periodically need to do weird things with the network, and want to make sure that the firewall can be set not to interfere with them. Many commercial firewalls do particular classes of filtering (such as flood filtering, rejection of invalid packets, etc.) in a way which is not completely documented. So we can't tell whether they will interfere or not, or which functions we need to enable or disable in order to get them to work for our purposes.

      Enter OpenBSD. I am not the sort of person who usually reads kernel source -- whether on the job or for fun -- but I can pick up the kernel source for OpenBSD's pf packet filter and know (for instance) exactly which combinations of TCP flags it rejects as invalid. I can then look at a network dump and tell someone exactly what pf will do with the traffic represented there. I can, in short, prove that my firewall will or will not pass that traffic.

      I can't do that with a product that comes with nothing but a guide to "Basic Firewalling for the Beginning Networks Staffer" and a command reference.
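
      As an illustration only -- this is NOT the actual pf source, just a sketch of the kind of sanity check I mean -- the flag constants match <netinet/tcp.h>, and the rejected combinations shown are merely typical examples:

          /* Illustrative sketch, not pf code: reject TCP flag
             combinations that never occur in legitimate traffic. */
          #include <stdint.h>

          #define TH_FIN 0x01
          #define TH_SYN 0x02
          #define TH_RST 0x04

          int tcp_flags_valid(uint8_t flags)
          {
              /* SYN combined with FIN or RST is never legitimate */
              if ((flags & TH_SYN) && (flags & (TH_FIN | TH_RST)))
                  return 0;
              /* a "null" packet with no flags set is a scan artifact */
              if (flags == 0)
                  return 0;
              return 1;
          }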

  • Nerd culture.... (Score:4, Insightful)

    by Daemonik ( 171801 ) on Friday February 08, 2002 @02:58PM (#2975568) Homepage
    Nerd culture is counter productive??

    Hello?... 'nerds' are the whole impetus behind the electronics industry. Without nerds wanting to show off with faster processors, cooler video boards or better OSes, a few billion-dollar industries wouldn't exist today.

    Hell, Star Wars would have earned $1.50 at the theater without nerds creating the cult that propels it. Nerds created pong on a friggen mainframe just to goof off and sparked the video game industry, which is quickly gaining ground as the most widespread form of mass entertainment on the earth.

    I am nerd, hear me calculate!
  • We need to make it a criminal law to change certain API's.

    If we make innovation illegal, only Microsoft will innovate.

  • by Cato the Elder ( 520133 ) on Friday February 08, 2002 @03:02PM (#2975597) Homepage
    That's a really interesting summary of what looks like a talk I would have liked to have attended. Of course, a lot of the points were matters of opinion, and I disagree on some of them.

    Fallacy 1 (Computing is Easy) I think is spot on. I shudder when I see some of the "For Dummies" titles out there now.

    Fallacy 6 (Computers are Getting Faster), I would have to say I disagreed with him on. Sure, my desktop boots slower than my old 386 from 10 years ago. But my Handspring Visor has more memory and boots instantly. Web pages load faster with my DSL connection than they did over my modem (where could you get that 5 years ago?). Most of my compiles are shorter than they were 3 years ago. Sure, people tend to put bloat in, but Moore's law is still winning overall.

    This one's really a quibble, but a subpoint of Fallacy 7 asks "How often do you need to do a Fourier transform?" I don't know if it's need per se, but I kind of like some of the music visualizations that use a whole bunch of frequency-domain stuff.

    One of the subpoints to Fallacy 13 (The Industry Knows where it's going) is
    "There haven't been any new ideas in a decade"
    My response
    "There is no new thing under the sun"
    --Ecclesiastes

    That said, he certainly seemed to bring up a lot of food for thought. Do you think he'd be willing to do a Slashdot Interview?


  • Fallacy 10: Open Source is the Answer
    - Economic model is doubtful
    - Source code is useless
    - Motivation for Open Source is inappropriate for most software
    - Nerd culture is counter-productive

    We write software for peer recognition. We write fancy structures because
    'it's cool', but not particularly useful.


    If this were a Microsoft developer conference, would you expect a keynote speaker to stand up in front of thousands of Microsoft employees and users and claim that Microsoft is a monopoly, produces insecure and unusable software and only cares about money, not its users? One would expect a security team (think 2 metres tall and muscular, not SecurityFocus) up on that podium to carry the infiltrator off stage pretty quickly. More likely, it just wouldn't happen. I'm certain Microsoft puts millions just into screening the opinions that are expressed during its conferences, written on its website or posted on Usenet by its employees.

    I think the Linux community's willingness to listen to criticism before (perhaps sometimes vehemently) counterarguing is one of its greatest strengths.

    I don't agree with what Michi says towards the end of his keynote, but I doubt the organisers of GUADEC will cause too much of a fuss about it (perhaps they will ask him once or twice if he _really_ thinks Open Source is no good for production software).
  • by mccalli ( 323026 ) on Friday February 08, 2002 @03:09PM (#2975655) Homepage
    ...unless, of course, you either compile or interpret it into executable form and then use the resulting software tool to create lecture notes containing the text 'source code is useless'...

    Cheers,
    Ian

    • I run two FreeBSD systems, one at home and one at work. Everything from the kernel to this browser was built from source code. Even the documentation was built from the original DocBook sources. The only things installed as binaries are Acroread and Realplayer.

      The source code is VERY useful to me, even though I haven't seen 90% of it. That's because I built my system optimized for the Pentium IV (Athlon at home). You just can't do that with a binary. In addition, I get to build certain packages the way *I* want them built. I love Dia but I don't use Gnome so I get to build Dia without Gnome instead of using the binary package which requires Gnome.
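
      On FreeBSD that kind of system-wide tuning is just a couple of lines in /etc/make.conf; a minimal sketch (treat the exact knob names as assumptions, since they vary by release):

          # /etc/make.conf -- sketch; knob names vary by FreeBSD release
          CPUTYPE?=p4      # tune world and ports builds for Pentium 4
          CFLAGS=-O2 -pipe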

      I wouldn't even be running an X server on my workstation if it weren't for the source code, since XFree86 doesn't fully support my video card. But with a simple patch it works great. Yes, this patch could have been posted binary-only, but how the hell would the poster know how I compiled my server? How the hell would he even know which OS I am using? Is he going to have a binary patch available for every possible combination of CPU and OS?

      I may never look at the source code for gcc, konqueror or XFree86, but I damn well want it available.
  • by abde ( 136025 ) <apoonawa-blog&yahoo,com> on Friday February 08, 2002 @03:10PM (#2975664) Homepage
    Fallacy 1: Computing is Easy

    well, actually it IS easy to learn syntax. This fallacy is just sniping at inexperience. No one teaches you how to write great code; even the greatest C hackers learned their loops one at a time. And most of the rationale behind spaghetti code nowadays is due to extreme commercial pressure, not any lack of aesthetic sense.

    - Teach Yourself C++ in 14 Easy Lessons
    - Brain Surgery in 14 Easy Lessons

    it's completely arrogant to equate brain surgery to C++. For one thing, lives are not at stake. This analogy has delusions of grandeur.

    Fallacy 2: Computers Allow People to Do things They Could Not Do Otherwise

    As a matter of fact, they DO empower us. With Word I can do mass mailings in an hour, instead of all day. A great word processor will do a lot of the annoying things like spellcheck and thesaurus and automatic formatting of headings and footnotes and equations, which used to be a severe drain of time. A great spreadsheet lets you analyse numbers with impressive ease; ask any accountant how much the spreadsheet has transformed their practice. This power of analysis has allowed professionals to actually expand their business instead of being mired down in drudgery.

    Fallacy 3: Computers Increase Productivity

    yes, they do, if used with discipline. See above. The idiots who waste all day adding sound effects are the same ones who in the '40s used to while the day away lobbing sharp pencils into the ceiling. Procrastination has evolved with technology but is essentially the same.

    the point about typos in letters written in 1945 actually illustrates the opposite.

    quote: "Nowadays, we rewrite the letter many, many times, changing fonts, format etc.
    We are no better off in terms of letters produced."

    really? you call a letter produced with no typos "no better off"? and all of the ways we can edit documents today can be applied effortlessly. The default templates that come with Word do all of this already. It's only the "power users" who seem to obsess like that, while people who actually use computers daily for their profession simply get the work done.

    Fallacy 4: Programs Help Their Users

    true, software companies try to ensnare their users. Also true that DVD makers try to ensnare their consumers, and that groceries and airlines and car salesmen all use deceptive marketing schemes and even planned obsolescence to suck your wallet drier. You should blame capitalism, not computers.

    Fallacy 5: If It's Graphical, It's Easy

    the vast majority of GUIs make simple tasks much easier. If you think that arcane text codes and commands are easier than just clicking the Underline button, then you're an /etc/conf hacker, not someone working in an office relying on Word to get the memo done.

    with a GUI, you don't NEED to be a "sysadmin, programmer, typesetter, etc." to get work DONE. You just get work done. In a CLI you have to be all these things and more.

    also, the paperclip has NEVER interrupted me to tell me a joke. Document the allegation!

    Fallacy 6: Computers are Getting Faster

    yes, they are. NO software I can buy today can really tax my 2 GHz PC, not even the most bloated WinXP install. My Pentium DOES boot faster than my old 386, Word loads in a few seconds, and my web browsing is limited by my dial-in connection (which I am forced to use because of monopolies and lack of regulation in telecom, not because of any computer issue). It's obvious that a Pentium 4 compiles faster than a 486, and the programs of today have more functionality anyway. EVERYTHING took FAR LONGER 5, 10 years ago.

    Hardware is SO FAR AHEAD of software that only id Software can really claim to have tested the metal. And can YOU tell the difference between 100 and 200 fps? NO! Stick your head out of the benchmark app!

    Fallacy 7: Programs are Getting Better

    Yes they are. True, many obscure functionalities are barely used, but they are there, and they barely slow things down in today's 2 GHz age.

    I don't buy the anecdote about a single hyperlink inflating an 800K document to 2.2 MB. I just tried it myself, by taking 800K of raw text and pasting it into Word. Then I added a link. The file size difference is negligible, but don't take my word for it, TRY IT YOURSELF! And then stop propagating foolish incendiary lies.

    Fallacy 8: Programmers are Getting Better

    well, if they all bitch and moan like this, maybe this really is a fallacy. But I doubt it. Most of the programmers I know are able to switch between languages and adapt to different environments. Most old-time programmers are surgically attached to their Language of Choice and will never change. Look at the quality of coding being done on the Linux kernel, in Oracle's 8i, in Windows' .NET. These are true advances in computing complexity, and it is a continuing process.

    BTW, ANY student who majors in CS will know what a core dump is; don't be alarmist. Any student who isn't in CS has no reason to know. So what?

    The jab about knowing how to write Excel memos being a mark of qualification is just arrogant snobbery. And the average-retention-time figure is from the dot-com boom; it surely isn't true anymore. You have a problem with people cashing in on their skills while they could?

    Fallacy 9: Programming is About Data Structures and Algorithms

    This is an extremely provincial accusation - probably better to just nod and agree with you rather than set off a religious war.

    Agreed that programmers are not taught to design. Well, who taught you? If experience sufficed for you to become a self-declared expert, then it will suffice for others also.

    Fallacy 10: Open Source is the Answer

    The Answer? The Answer to what? With apologies to Douglas Adams: first off, you'd better figure out just what the Question is!

    • (Quoting the parent: "I don't buy the anecdote about a single hyperlink inflating an 800 KB document to 2.2 MB...")

      My theory on that story -
      The email-address highlighting was set up to change addresses to a custom font, and Word was also set up to embed custom fonts in documents. Adding the link embedded the font; when the only use of that font was deleted, the font was no longer included, explaining the 1.4 MB difference.

    • by Bruce Perens ( 3872 ) <bruce@perens.com> on Friday February 08, 2002 @04:29PM (#2976214) Homepage Journal
      Remember, this is one of the people behind CORBA. He would say source is useless. He wants a software world of black boxes connected together. Most people have accepted that this particular promise of OO programming was hype. He hasn't.

      Bruce

  • by Nicolas MONNET ( 4727 ) <nicoaltiva@gmai l . c om> on Friday February 08, 2002 @03:11PM (#2975669) Journal
    He claims:

    99% of all documents are written to be printed on paper.

    Hell, no! 99% of documents (besides programs) I write are emails.

    I'm not nitpicking, this is a major flaw in the argument.

  • It amazes me how quickly people that have anything to do with computers like to declare them obsolete.

    It reminds me of the guy who had an old 68k Macintosh running Word 5.1. He knew how to use it and it did everything he wanted it to do.

    One day the IT people at his company took his Mac away and gave him a new PC, because the Mac was "too slow." Well, what happened?

    First of all, he was not familiar with the PC or with the new features available in Word. Second, many of these new features were more annoying than useful, especially when the newer version of Word autocorrected something that didn't need correction. Also, considering that this new, more complex software was both more demanding of hardware and more prone to bugs, he found that his new system was slower than his old one and more prone to crashing.

    So, why again was that Mac obsolete?
  • Fallacy 2 (Score:5, Insightful)

    by ChristianBaekkelund ( 99069 ) <draco AT mit DOT edu> on Friday February 08, 2002 @03:14PM (#2975698) Homepage
    "Computers Allow People to Do things They Could Not Do Otherwise"

    How is this a fallacy?? He cites perhaps the handful of examples in which it may NOT be true, but leaves out the seemingly unending number of examples in which it is in fact very true.

    - Telephone switching
    - All the sophisticated computers running those F-16s we see in Afghanistan
    - Power grid / sewage / water / gas control (in most areas)
    - The entire Internet
    - The level of visual effects in movies
    - Computer and video games
    - Thousands of different manufacturing processes that need to be computer controlled to get the level of accuracy needed
    - Protein folding research
    - and so on...
  • I would like to call attention to the Useful Reading list at the bottom of the linked article. One of the books listed, "The Inmates Are Running The Asylum [amazon.com]" is a fabulous book by Alan Cooper [cooper.com].

    If you have anything to do with designing any sort of interface to any sort of product (be it a piece of hardware, a piece of software, a widget, whatever), you should read this book. It will open your eyes.

  • by Dominic_Mazzoni ( 125164 ) on Friday February 08, 2002 @03:17PM (#2975719) Homepage
    While I think many of his fallacies have some truth to them (and I find them amusing), I think that his arguments only apply to business, or more specifically, they DON'T apply to a lot of areas where computers have revolutionized the way people do things, for example, in music composition, graphic design, scientific research, etc. - not to mention communication.

    Let's consider Fallacy 2: Computers Allow People to Do things They Could Not Do Otherwise. This is not a fallacy, it's true. As an amateur composer, I can compose and print a piece of music in a tenth of the time it would take me to do it by hand. I am not taking advantage of any automatic composition or any silly A.I. technology. I'm just talking about using Finale 2000 to enter the notes using my MIDI keyboard, edit them quickly with the mouse, and listen to the result through my speakers to make sure I didn't make any musical "typos".

    How about scientific research? Scientists now have amazingly powerful tools at their disposal. I know plenty of people who do need to perform a Fourier analysis on a daily basis (see Fallacy 7), and for people like this, who are leading experts in physics but know little about computers, a book like "Learn Matlab in 21 Days" is all they need. I agree that you can't become a good DB programmer or QA person by reading a quick book or studying at DeVry, but most people who use computers aren't programmers and don't need to be.
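
    For a sense of scale, here is a minimal sketch of what such a daily analysis can look like (Python with numpy here, standing in for Matlab, purely as an illustration of my own):

        import numpy as np

        # Sample a noisy 50 Hz sine wave at 1 kHz for one second
        t = np.arange(0, 1.0, 0.001)
        signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.random.randn(t.size)

        # Two library calls give the spectrum and its frequency axis
        spectrum = np.abs(np.fft.rfft(signal))
        freqs = np.fft.rfftfreq(signal.size, d=0.001)

        # Skip the DC bin and report the dominant frequency (~50 Hz)
        print("Dominant frequency: %.1f Hz" % freqs[spectrum[1:].argmax() + 1])

    That really is the whole analysis; no numerical-methods course required.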

    While we're talking about scientific researchers: clearly "Computers are Getting Faster" is NOT a fallacy for them!

    Finally, what about the Internet? Yes, the dot-com bubble burst, but note that all major companies still have websites. It's silly to even consider a company not having one. E-mail definitely allows you to do things that weren't possible (or at least weren't realistic) before, like collaborating on a book or article with someone who lives halfway around the world.

    Also, statements like "Programmers are Getting Better" are hard to really analyze. One problem is that there are hundreds of times more programmers now than there were twenty years ago. As a natural consequence, the average level of expertise has gone down a lot. But the best programmers today are a lot better than the best programmers twenty years ago - because they're building off of the best ideas of the last twenty years. And there's no question that even below-average programmers are far more productive today than below-average programmers twenty years ago, simply because there are more high level tools available to them. People who write Visual Basic scripts for internal company programs may be very poor programmers, but if they can get the job done, who cares?

    < / RANT >

    Sorry. Computers have definitely made my life better, and have enabled me to do many things I never could have done without them, so I get upset when people try to argue that computers suck and that things are basically the same as they were before computers.
  • by nanobug ( 446693 ) on Friday February 08, 2002 @03:20PM (#2975740)
    Michi Henning has given his Computing Fallacies talk several times in various venues in the last few years.

    Slides and video from one of these (given on April 18th, 2000) are available here [onthenet.com.au].

    He will probably continue to give his talk for many years to come, as it is unlikely things will change much in the short to medium term.
  • by jmv ( 93421 ) on Friday February 08, 2002 @03:22PM (#2975758) Homepage
    There are good points, but I've got a problem with what looks like the unwritten assumption that programs are all the same and are targeted at the same people. Let's see who uses:
    -Word
    -Matlab
    -Apache
    -Linux/Embedded
    -AutoCAD
    ...
    While you (should) want to make Word as simple as possible, you want to let Apache users configure everything, and you want to let people modify the source of embedded Linux to exactly fit their needs. AutoCAD needs lots of features, but not necessarily source code (because there are fewer programmers in mech. eng. than in EE).

    So I'd add fallacy #11: Programs are all the same
    -Software management should be done the same way, regardless of the software being produced
    -All programs should focus on simplicity, not features
  • by mblase ( 200735 ) on Friday February 08, 2002 @03:35PM (#2975839)
    I wonder just how far we've come in automotive technology since the 1950s?

    I mean, the cars don't actually go any faster. The speed limits aren't much higher, and if anything, the increased traffic makes us drive slower. Environmental improvements from catalytic converters and the like are nullified by the increased number of cars producing pollution. We add rear-window wipers and CD players, and instead of buying (or building) a more efficient vehicle we demand (and get) SUVs from every last manufacturer on the planet.

    So, are cars actually any better, when any technological improvements are effectively nullified by the people driving them?

    Well, yes, they are. Cars are more popular every decade because they're easier to use, cheaper to own, and more comfortable for everyone inside. They may not be "better" from a numerical perspective, but anyone driving a 2002 model right after driving a 1962 model will immediately notice the difference.

    Computers are the same way. The faster they get, the more we expect them to do. The more people who use them, the fewer things they are used for. Developers get sloppier about optimization, and APIs get changed with every iteration of the OS. It takes longer to start up this year's computer than it did 1979's, and people still do the same basic things with them.

    But look at how much they've changed: graphical UIs make it easier for anyone to use a computer, instead of having to know what to type in at a text prompt. WYSIWYG doesn't happen 100% of the time, but 98% is a fair sight better than 0%. I may not get anything more interesting using a cable modem than I could using a 14.4 and a BBS, but at least all the commands are on screen instead of hidden behind a hundred scrolling screens of /help documentation.

    So people are using all this computing power for nothing more than playing video poker and typing papers. So what? 90% of the population never needed it to do anything more; at least in 2002, they can do it for a lot less money and with a lot less reading. Companies and users may throw away countless man-hours developing skins and pretty interfaces, but at least they're successful in making computers familiar, comfortable, and desirable to the common man.

    And besides, look at all the things we can do with a PC today that we couldn't ten years ago: access millions of pages of esoteric information online. Take photos digitally and organize them on CD-R discs, taking up 1/100th of the space for about the same cost. Listen to a thousand songs from a single digital jukebox, no vinyl or tape required. IM your mom across the continent without spending a penny on long-distance. Order anything from the Sears catalog without having to own the catalog. Find a new job. Locate a special interest group. Print a map. Comparison shop.

    Or, just write and print out your resume. But at least nowadays, as with Henry Ford's first cars, you're not stuck with "any color you want, as long as it's black."
  • I have to disagree (Score:3, Interesting)

    by jandrese ( 485 ) <kensama@vt.edu> on Friday February 08, 2002 @03:56PM (#2976009) Homepage Journal
    A lot of these "fallacies" are the types of things I see in joke emails. The problem is they're simply not true.

    Lets run down them quick:
    1. True, computing is not easy, especially if you are a programmer. However, for user applications it's frequently easy enough. How long did it take your mother to learn how to type a letter in Word? I bet it was less than a few minutes. More complex things may be beyond her, but for what she wants to do it's easy enough.
    2. False, but he's in the wrong context. Ask a meteorologist if he'd like to run those weather simulations without a computer. As another poster noted, we are about twice as productive now as we were back in the 1940s.
    3. Vague. I'm not sure exactly what he's getting at here. I think he's talking about how software companies are unfriendly to their consumers by requiring them to buy upgrade products and by not making their software forward compatible (i.e., you can't open a Word 2000 document in Word 3). The software industry is somewhat unique in this respect, so the comparison is not completely fair.
    4. True. GUIs do not make everything easier automatically; however, a well-designed GUI will tend to be more intuitive than a well-designed text interface, because we can pack more precise contextual information (make the widget buttons look like real-life buttons, for instance) into the graphical representation of the concepts we are trying to convey. They also make ergonomic pointing devices possible.
    5. False. Even when you upgrade your software, it's generally faster than it was 10 years ago. People look through the past with rose-colored glasses and forget that you had to wait half a second for the stupid menu to draw. Booting time is largely a function of how long you need to probe all that new hardware you didn't have 10 years ago, and of loading a real operating system instead of DOS (which is no doubt what the speaker is referring to). The webpage example is particularly bad, as 10 years ago there was no such thing, and 5 years ago pretty much the entire web was slow (and your slow computer took forever to render even the simplest page). Compiling is definitely faster than it used to be too, but I haven't changed my compiler much over the years (still gcc).
    6. False. You may not need to animate fonts (what does that even mean?), but my productivity is much better when I'm using vim instead of ed. Sure, we don't have to create a pie chart, but it sure helps make the meeting go faster when you don't have to run through the major numbers and have something to point at. Where does that 99% statistic come from anyway? I haven't printed a document for work in ages. Nobody wants a paper copy of anything short anymore; they want it emailed to them the instant it's ready. Nobody reads long documents unless they really, really have to, so there is little need to print out your documents. Caveat: I'm an engineer and write fairly technical documents meant mostly for other engineers; I don't spend a lot of time "prettying up" my documents because it's useless.
    7. Kinda true, but the programmers don't have to be as "good" anymore. There are a lot of tasks out there that are excellent for mediocre programmers and their elite VB skills. Because our development environments (and languages, to a certain extent) have gotten so much better, we don't have to worry so much about hiring the rocket-scientist types to design the "save as" dialog or the disk I/O routines. This isn't to say there aren't a lot of really talented programmers out there, but there are more "fillers" as well. I'd say the average programmer talent is higher than it was a few years ago, simply because more people ARE getting formal education in programming. A few years ago it seemed like every other developer I met graduated with some weird degree like animal husbandry and then got a job programming. Also, experience is the best teacher, and many of the aforementioned people are the master programmers of today.
    8. True. Data structures haven't changed much, because they do their job. People aren't really interested in fixing the array because it isn't broken. Also, some algorithms are about as good as they are going to get (and have been proven so), such as sorting, searching, etc. (see the sketch after this comment). Was the speaker expecting someone to come up with a better Traveling Salesman by now? I think most great programmers have written assembly at some point, because the great programmers are the old ones with lots of experience. The old ones wrote assembly because that's all they had back then (or they come from a time when structured languages were still in their infancy).
    9. False. I'd say most modern programmers can say yes to the first one because they did it in school, but once they graduated they immediately started using the toolkit like any normal person. I'd say yes to the second only if they're programming in C++ (not a safe assumption, speaker!). What does HCI have to do with data structures and algorithms? Wouldn't the interface programmers be more interested in that? We were taught when to return bools and ints in school, thank you very much. Granted, C programmers have it easy (or hard, depending on how you look at it) since they have no native bool type.
    10. False. Without open source practically none of my projects would have gotten anywhere (since I tend to work on nonstandard routing protocols and test them in embedded environments). The economic model is doubtful in many ways (if you are going to try to make money off of open source, at least), but you don't write open source software to make money. Having the source code has saved my butt a couple of times when tracking down very obscure bugs only brought forth by running nonstandard protocols (although they SHOULD work, sometimes they don't). The nerd-culture comment is too vague for me to say anything about.
    11. False. Without standards we are left with the connector conspiracy everywhere. For many things (networking, communication, HCI!) standards are the key to making the whole thing work. The last part is just a random insult.
    12. False. Progress in many areas is fast, in other areas slow. You can't single out a few examples and say that everything is slow. PC OSes have become much, much better in the past few years (especially on the Windows side). PC hardware is much easier to work with than it used to be (remember when you had to configure I/O and IRQs manually, and when you could accidentally fry the motherboard by plugging in the power connectors backwards?). Remember when MacOS had no memory management to speak of? Remember when it was hard to network computers with TCP/IP? Remember when everyone was using their own standard for everything and nothing ever worked right if it wasn't plugged into the same brand of machine? Remember when programmers had to write in assembly or even toggle the bootloader in on the front of the machine? Do you remember when the weatherman wasn't able to accurately predict more than a few hours into the future? Remember carburetors? Don't you like your TiVo? Just because we don't write newer and better sort routines every year doesn't mean there isn't progress.
    13. True. The computing industry is very hard to predict. A lot of people were broadsided by the Web for instance.

    14. I have no idea what the 'Progress' is at the end. Apparently it's quite different from Progress? I guess I had to be there.
      I think the designers should focus on design and let everybody else do their job.
      Very, very true that we need realistic growth expectations, especially for startups. I remember an anecdote where AOL had projected a certain growth rate without factoring in any sort of slowdown as they reached critical mass. They intended to account for something like 15% of the nation's GNP by 2010.
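
    To make point 8 above concrete, here is a minimal sketch (Python, purely illustrative, not from the talk): the binary search that ships in every standard library already meets the proven O(log n) comparison lower bound for searching sorted data, so there is genuinely nothing left to "fix":

        import bisect

        # Binary search via the standard library: O(log n) comparisons,
        # provably optimal for comparison-based search of sorted data.
        def contains(sorted_xs, target):
            i = bisect.bisect_left(sorted_xs, target)
            return i < len(sorted_xs) and sorted_xs[i] == target

        data = list(range(0, 1000000, 2))  # half a million sorted evens
        assert contains(data, 123456)
        assert not contains(data, 123457)

    Progress shows up in the libraries and tools built around such algorithms, not in replacements for them.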
  • by jms ( 11418 ) on Friday February 08, 2002 @04:06PM (#2976079)
    The public benefit of open source and free software has nothing to do with economics. It has everything to do with learning and progress, and free access to source code is the key to advancing past the current stagnant backwater that computer science has become.

    Let me explain.

    Imagine that you're a young student who wants to become a writer. You ask your teacher, "What do I need to do in order to become a great writer?"

    Your teacher, if she has a bit of sense about her, will tell you to read all the works that you can, by the best writers, and learn from them. By recreationally reading and studying the works of great writers, as a young person, you will learn to recognize and understand, from experience, what differentiates good writing from bad writing. This is the educational process that, if you are both diligent and lucky, will turn you into a talented writer.

    Contrast this advice with the world of computer programming. In the world of software, programs are distributed as object code -- meaning that you can't learn from them by reading them. Plus, they contain "licenses" that purport to deny you the right to study them and learn from them. Any programmer who obtains surreptitious access to some major program's source code is running a serious risk of being unemployable -- as a legal liability.

    It is as if our original hypothetical budding author were told:

    "If you want to be come a author, you must be sure to never, ever read anyone else's books, especially popular books by great authors. The way to become an author is to wait until you are of college age, then enroll in a two year "writing school", where you will learn grammar and spelling, sentence structure, and then write a series of short essays. For your final project, you will write a single chapter of a book co-authored by the entire class. Once you are completed with this two year course, you will have your degree, and, having only studied textbooks, will be fully qualified and ready to join the workforce as a writer, uncontaminated by exposure to real-world writing experience.

    If this were the way we taught writing, then our novels would show the same lack of quality -- and the same lack of progress -- as our software does right now!

    So how do we fix this problem?

    I believe that the answer is to reform copyright law. The current system of closed-source, proprietary programming technology -- and the lack of any noticeable progress in the craft of programming -- reflects the complete failure of copyright law brought on by the extension of copyright protection to proprietary software.

    Patent and Copyright law are supposed to promote progress by placing the best examples of science and technology into the public domain, where they can be studied and learned from. If I want to learn about any physical science or engineering discipline -- if I want to catch up to the current state of the art -- all I need do is go to the patent databases, and right there are thousands of examples of the latest, real-world scientific technology, written by actual scientists working in actual companies on actual products -- all there for me to study and learn from, and, 17-20 years after disclosure, to freely draw upon and use.

    This is the public benefit of the patent system -- the dissemination of practical engineering and scientific knowledge. This is supposed to be the public benefit of the copyright system. Copyright is supposed to be a tradeoff. Copyright is supposed to provide monopoly benefits in exchange for publication -- public disclosure. This works just fine in the case of natural-language writings, because the source code is the product, but not for object code, where the product can only, for all practical purposes, be used -- not studied and learned from.

    Copyright law could and should be used to leverage a similar public benefit, however, in the case of software, our legislators have completely missed the point of having copyright in the first place. The purpose of copyright is not to protect authors. The purpose of copyright is to create the next generation of authors -- to "promote progress" -- by encouraging the publication of works.

    Imagine an alternate universe in which copyright protection were only afforded to software that was distributed in conjunction with full, buildable source code. Companies would have to choose between copyright protection, and DRM protection, instead of the current dysfunctional system, where they are able to effectively obtain copyrights on works that are at the same time, in effect, trade secrets.

    In such an alternate universe, young programmers would start out as computer users. However, if they became curious about how their software worked, they would find the source code to their programs waiting for them.

    Like the young, would-be writer with a library full of books, they would have the entire world of software to read, study, analyze, and learn from.

    One objection to the source code requirement for copyright protection that I have heard is that it would encourage code theft. If companies distributed the source code to their products, I have heard it said, other companies will steal their work and incorporate it into their own code.

    The answer to this objection is that, under such a system, they would not be able to do that, because it would be trivially easy to detect such theft. If I were to steal a portion of the Windows source code and add it to my program, then when I went to market my program, in order to obtain copyright protection I would be forced to distribute my source code -- with the stolen Windows source code embedded. Microsoft would discover it and shut me down.

    In this way, mandatory disclosure of source code would severely limit, or effectively end the practice of code theft. Who's to say who is stealing code today? It's nearly impossible to tell, when only object code is published.

    Fortunately, free software, and to a lesser extent open source software, is bridging this gap. Yesterday's young budding software writers had little to work from. The new generation of young software writers -- and I am talking about high-school age students -- have the entire GNU/Linux/Gnome/KDE system to study and learn from. Free software is the only software that earns its copyright. It's the only software that "promotes progress", because it's the only software that can be freely studied by the general public. It's a functional replacement for the public domain that has been lost/destroyed by misguided, failed copyright law.

    In other words, just as having access to a library of great books is everything to a young, budding writer, having access to quality, real-world source code is everything to a young, budding programmer.

    In a certain sense, it's probably the only thing that really matters.
  • by Ars-Fartsica ( 166957 ) on Friday February 08, 2002 @04:29PM (#2976215)
    Come on now, when is the last time you wrote a data structure to store the primitive types of your language in a way that hasn't been done before?

    When is the last time you thought it necessary to analyze (algorithmically) code that you are writing?

    It's far more important to be very good in the programming language you have chosen and its libraries. Knowing how to write quicksort in your latest language is a dead skill - it's already been done better by someone else, and added into the SDK.
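
    A quick illustration (a sketch of my own, in Python for concreteness): even a perfectly respectable hand-rolled quicksort is just a slower spelling of the one-liner the SDK already gives you.

        # A hand-rolled quicksort: correct, instructive, and obsolete
        def quicksort(xs):
            if len(xs) <= 1:
                return xs
            pivot, rest = xs[0], xs[1:]
            return (quicksort([x for x in rest if x < pivot])
                    + [pivot]
                    + quicksort([x for x in rest if x >= pivot]))

        data = [5, 3, 8, 1, 9, 2]
        # The library's sort is shorter, faster, and battle-tested.
        assert quicksort(data) == sorted(data)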

  • by JohnsonWax ( 195390 ) on Friday February 08, 2002 @05:54PM (#2976841)
    So, basically what Michi is saying is that Computer Science isn't having the day-to-day impact that it once did. Advances in data structures and algorithms aren't driving the development of products like they once did.

    Computer Science now gets to join Chemistry, Physics, and Biology as science disciplines that can no longer handle their own engineering. Physicists don't design boilers any more, Chemists don't design refineries, and biologists don't build waste treatment plants. And computer scientists don't build operating systems well.

    Enter Software Engineers and Computer Engineers, who get to learn their stuff from the CS boys, but who focus on production, on tradeoffs, on integration, on management. It's the engineers who push for legislation, who make sure that you have the education and experience to practice, and who build systems that we are willing to call 'infrastructure'.

    What people need to clue into is that we have an industry that has hit the point where it needs to split and to recognize those that advance the theory and those that pave the roads.
  • by geekoid ( 135745 ) <dadinportland&yahoo,com> on Friday February 08, 2002 @07:10PM (#2977237) Homepage Journal
    Fallacy 10: Open Source is the Answer

    - Economic model is doubtful
    Getting tired of hearing this. A bunch of people start companies using "open source" products, have no real business plan, and then, surprise surprise, they fail, and somehow open source is at fault. There are companies making money in the open source arena. Most companies fail, in any arena.


    - Source code is useless
    I'd like to see him say that after a vendor goes out of business and he's stuck with software that must be fixed, or his own company goes under.
    Or the vendor changed its focus, and since you are tied to them, your company must change the way it does business.
    A lot of companies got surprised when MS finally discovered the Internet and changed its focus.
    - Motivation for Open Source is inappropriate for most software
    Not sure what he means here. My motivation is twofold: improve my programming knowledge, and make better code. I fail to see how that's inappropriate.
    - Nerd culture is counter-productive


    Yes, we nerds never ever produce anything or start big companies *coughapplecough*.
    Pretty much every large computer company was started by a nerd.
    As a matter of fact, I can't think of one that wasn't.
    Xerox was founded by a nerd; so were Apple, Microsoft, IBM, Sun, etc., etc.
