Programming IT Technology

Good Software Takes 10 Years?

webword writes: "Joel Spolsky is running an editorial where he claims that writing really good software takes ten years. (Yes, TEN years.) He explains several mistakes that are made by companies that don't understand his ten year principle. The mistakes are backed up with some interesting industry examples, so it isn't just vapor. Unfortunately, he doesn't mention any OSS/FS examples, but I'm sure we can handle that. I am mostly skeptical of his idea because I'm not sure how he thinks a young software company could possibly survive without building really good software in much less time. Spolsky mentions that new companies need to strictly control their cost structures, and that will save them, but that doesn't seem like enough to me."
This discussion has been archived. No new comments can be posted.

  • by Anonymous Coward
    What the hell happened to Windows?
  • by Anonymous Coward
    ... he did mean something you can drink. You would have known it if you had clicked on the link.
  • by Anonymous Coward on Thursday July 19, 2001 @03:09AM (#75512)
    It's all about job security, folks: why build an app in 3 months when you could spend some quality time reading Slashdot and drag the project out for ten years?

    Those old hippies were smart.
  • by Gleef ( 86 )
    Netscape used key NCSA Mosaic developers, and started their code from scratch.

    Internet Explorer 1.0 was based on Spyglass Mosaic code, which was the NCSA's commercial distribution of Mosaic. Later versions of IE were, of course, based on IE1.

    There's an excellent site devoted to the history of key browsers (though it's missing Konqueror and Lynx) at http://www.blooberry.com/indexdot/history/browsers.htm [blooberry.com]


    ----
  • You make a very good point. The real standards as produced by the W3C are OK, but the de-facto standards created initially by Netscape, and then embraced and extended by MS, are horrible.

    But, did you ever examine the early Mozilla code (i.e. what they started with from Netscape)? That was a poisoned chalice if ever I saw one. It didn't support any of the trickier elements, but it still took a ton of code to do not very much, badly.

    BTW, how come nobody ever mentions Opera in all this? It is *by far* the best browser around, but nobody ever mentions it. If writing a decent browser is so impossible (unless you're inventing the standards yourself), how did they do it?
  • by joss ( 1346 ) on Thursday July 19, 2001 @06:25AM (#75516) Homepage
    nearly...

    Marc Andreessen was an intern at NCSA. His maths wasn't good enough for him to do what he was hired for, but he did know Motif pretty well, so his superior Ping Foo asked him to write a GUI for the WWW. This proved to be a good idea. Then Jim Clark (founder of SGI) thought it would be a good idea to commercialise the idea, and hired the more commercially minded people from NCSA (not the same as the smartest; for instance, Ping, whose idea the project was and who made some of the smarter design decisions (e.g. single click to follow a link), never joined).

    Netscape was the browser tail wagging the internet dog for a while, but they declared MS history too early. The VC money they got from being declared the next MS enabled them to hire 1000s of developers very quickly. This was profoundly stupid, because having 1000 mediocre developers is far worse than having 10 decent ones. These people wrote 100s of 1000s of lines of shitty code. This code was then released to the mozilla effort, and left it with no fucking chance. Writing a browser isn't *that* tough, but if you start with a massive shitty code base, you can lose sight of that.
  • by sql*kitten ( 1359 ) on Thursday July 19, 2001 @05:32AM (#75517)
    The catch is you have to specify the expected properties of your program in terms of logical language (yes, and this is very hard sometimes). If you stated the properties correctly, then our tool is able to detect violations against the properties.

    Uh, if you have a logical language that describes what your program is supposed to do, why not just compile it? It sounds like what you're doing is writing the same program twice in two different languages and comparing the output - you would be better off with a code generation tool.
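
    To make the distinction concrete, here's a minimal, hypothetical sketch (Python, purely illustrative, not the poster's tool): the logical property "output is sorted and is a permutation of the input" is a few lines to check, while the implementation it constrains can be arbitrarily complicated, so the spec is not a second copy of the program.

        # Hypothetical example: a property check is not a second implementation.
        from collections import Counter

        def my_sort(xs):
            # The implementation under test (could be thousands of lines).
            return sorted(xs)

        def satisfies_spec(inp, out):
            # The logical property: ordered output, same multiset of elements.
            ordered = all(a <= b for a, b in zip(out, out[1:]))
            same_elements = Counter(inp) == Counter(out)
            return ordered and same_elements

        data = [3, 1, 2, 2]
        assert satisfies_spec(data, my_sort(data))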

  • You can easily survive 10 years without releasing "really good software". How many people bought Mechwarrior Mercenaries? Win95? How many installations of sendmail are out there? bind 8? I think we would all agree that the likelihood of encountering a bug in these is pretty darned high, and yet people bought and installed them anyway.
  • Angband (and its finest variant) is continually improving, and is far in advance of any commercial game in terms of long-term playability.

    Hmmm. I'm not entirely convinced. I felt Angband peaked at around v2.5.8, and since then, has suffered from the same problems as MS Office -- new features for the sake of new features, not because they're needed. Yes, some of them are quite nice, but they're just candy, and had game development stopped back then, I'd still be playing it today. By that I mean, I am still playing it today, but the *gameplay* hasn't got significantly better for some time. But then, it's hard to improve when you're that close to perfection in the first place :-) Zangband is intriguing, in that it does give altered gameplay (I'm still torn on whether it's an improvement or not, but there is definitely innovation there).

  • Nitpick: GNU tar has an option to pipe files through bunzip2. It doesn't have internal support for either gzip or bzip2.
  • The point is not how long it takes to get something useful - that can be done rather quickly - but instead something that is mature. A mature piece of software interacts with people and other software in the best ways possible. Figuring out the best way for software to interact with other software and people takes a long time. Compare Notepad and BBEdit. Notepad is what you get when you throw something together in a month or two. BBEdit is what you get when you take an editor and run with it. Notepad is useful, but BBEdit is useful and mature.

    I think this is also one of the things that we free software people (myself included) miss when trying to convince others to use free software. Most free software has not had the time to mature. For example, let's take PostgreSQL (which I love) and Oracle (which I can't stand). I have to hand it to the Oracle people - they have added bits and pieces to their platform that make it great for enterprise use. Things like _extremely_ verbose logging, session tracing, the ability to move tables all over the disk and all through files, and all sorts of little administrative things that help the administrator keep things going. All-in-all, PostgreSQL has a much better design, but it will take many years of production use to get all of the little features that Oracle has.

    Also, if you look at the --help option of most of the UNIX commands, you see a plethora of options. Most likely, the original author had a really, really limited use for the command. However, users said "wouldn't it be cool if it did X", and so that was implemented.

    Two things are adding to this time frame - the feedback cycle (it takes a long time for users to learn something, and then figure out how they _wish_ it would run) and the development cycle (each feature takes a lot of time). These cycle times are slow. That's why software takes a long time. More programmers won't make the feedback cycle go any faster. Neither will better programmers. Good software takes time.
  • Actually, Q3A is much older than that. You could say that it is more like Wolfenstein 5.0, in which case the software is about 10 years old. The fact that they changed the name and some graphics doesn't hide the fact that they've been working on this for many, many years.
  • Even if its completely from scratch, they are still building on what they did earlier.
  • Help/about in MSIE 5:
    Based on NCSA Mosaic

    --
  • by larien ( 5608 ) on Thursday July 19, 2001 @03:24AM (#75527) Homepage Journal
    I will give the Internet Explorer team credit. With IE versions 3.0 and 4.0 they probably created software about ten times faster than the industry norm. This had nothing to do with the Internet and everything to do with the fact that they had a fantastic, war-hardened team that benefited from 15 years of collective experience creating commercial software at Microsoft.
    Personally, I'd put a lot of it down to them nicking the Mosaic source and using it. That effectively makes the time of development something more like 6 or 7 years rather than the 2 or 3 MS worked on it.

    Also, I'd agree with others who have posted that this counts for things like OSes, DBMSes and the like. To create a simple application (e.g., a text editor) should take less time.
    --

  • While I generally agree with you, there are games that have been in continuous development for over 10 years, and are all the better for it. Angband [phial.com] (and its finest variant [zangband.org]) is continually improving, and is far in advance of any commercial game in terms of long-term playability. Nethack [nethack.org] is another one that has been around for a very, very long time now, and continues to improve. It is amazing how much gameplay can get squeezed into a game over a decade or so, especially when no one cares about the graphics :)
  • Lotus Notes is also a bad case because there are several huge factors that affected its adoption.

    1) It was a PC-based client-server program in the 1980s. Way ahead of its time, but that meant it pretty much had to run on the unpopular OS/2 operating system, and even then 10 concurrent users was a miracle on the hardware people had.

    2) It was exclusively marketed to large corporations (read: it was ridiculously expensive) until the early 90s. It still is a lousy product for places without dedicated system admins and developers.

    3) Lotus also sold cc:Mail which was the #1 corporate e-mail system from about 1987 to 1997. Cancelling cc:Mail and transitioning users to Notes obviously had a huge effect on the Notes user base.

    4) Oh, it's true that the email component wasn't close to being "done" until v5 shipped a couple years ago. (insert link to UI Hall of Shame)

    And because it's so old, it does carry tons of legacy baggage, including lots of back- and forward-compatibility features.
    --
  • Perl [perldoc.com] looks like it followed more or less the same pattern. Perl 1.000 was released in December '87, Perl 5.000 in October '94, and by December '97 it was up to 5.004_56.

    If you look at the "selected release sizes" table, there was a big jump in "core + lib + ext" from 4.036 (798K) to 5.000 (1504K), and a big jump in the documentation size from 5.003_07 (976K) to 5.004 (1587K).

    Does anyone have the Camel Book sales figures from 1991 (when it was first published) to the present?
    --

  • he chastises anyone who has ever re-written anything from scratch, saying that it is a waste of a huge investment. I don't know. I certainly think refactoring code is often necessary. When does it stop being refactoring and start being a re-write?
    When your goal is no longer "make build x+1 pass the same unit tests as build x", either because you're now trying to add new features to the code, or because your code is now in little pieces strewn all over the workshop (so to speak) and some crucial function won't work until you've figured out how to fit them back together.
    --
  • You're assuming that the reason software takes so long is because of the time needed to get the bugs out. Joel's point is that it takes this long to put the features in -- to find out, based on experience and customer feedback, what features you need and where you need to optimize.
    --
  • by maroberts ( 15852 ) on Thursday July 19, 2001 @05:08AM (#75545) Homepage Journal
    The word "good" in the author's article is a very imprecise word. Does good software have lots of features, is it free of bugs, or is it an innovative 'I never knew I needed it till I saw it' type of package?

    Software need only fall into one or possibly two of the above categories in order to reach the accolade of 'good'. There are even packages in mainstream use today that are not necessarily good but have just become ubiquitous by default. [Windows Notepad springs to mind - it's minimally featured, you can't edit large files with it, yet everyone uses it at some point or other].

    There are other packages (e.g. games) which take nowhere near 10 years to develop (Minesweeper is incredibly simple but is a great game and is virtually unchanged since Windows 3.0/1).

  • Two points:

    1. All CompSci degrees are not created equal. What some lower-rung schools call CS makes me cry. No theory, no math, no algorithms, just programming languages and trendy topics.

    2. Software Engineering != CompSci. They are very different disciplines, and there are very few schools teaching Software Engineering right now. CMU and RIT are the two I know of (based on previous /. posters). You don't hire a physicist to design a car, you hire a mechanical engineer. The same should be true in software.

    -jon

  • I could be wrong about the comp-sci bit. Maybe it's the lack of comp-sci that does it. When I think of the 10 worst programmers I've known, 7 of them had comp sci degrees, though. On the other hand, to be fair, of the 10 best I personally know, 6 have comp sci degrees (and the top three all have them). So it's probably just laziness in general, and has nothing to do with comp sci.

    I also overspoke. Good programs require more than good code (as someone else pointed out). They also require correct design and the correct feature set. I guess I subscribe to the preference of "as few features as possible". But still, I think bugs are worse, and more prevalent, than feature sets that are too small, and that blame lies squarely with the programmers.

  • You're right. I'm never specific enough when I mouth off. I think bugs are more common (and thus worse) than malformed feature sets, and bugs are clearly the fault of the programmers. That's what I meant, and tried to keep it so short I wound up not actually saying so.
  • Yeah, I probably shouldn't have generically insulted comp-sci like that; it's probably more because of personal laziness in programmers. It's actual work to test all parameters, provide traps for all exception points, and then remove everything unnecessary and test like hell. It doesn't feel like part of solving the problem, and isn't fun. That's where the bugs come from, imho.

    However, I still agree with you on the curricula probably being still experimental by necessity of the comparative newness of the subject. I'd love to see a class on quality coding practices in general though.
  • The book you linked is *fantastic*. I've read it twice.
  • No. Every time you haven't been bitten, you haven't noticed.

    You're 100% right of course.

    You may also be right about the environments (I work for a shop that creates and sells products, not customer-specific). I have worked in an IT-type shop, and even there speed was a factor, but not to the minute. I also suspect we're talking about differing degrees of bad code. I've worked with web designers cum javascripters, vb guys writing transactional queue systems, java people pretending they're not just building web applets, x86 ASM people reinventing the wheel (and the road), and c++ people coding to Win32. You can bet your ass their levels of checks and errors were vastly different. I've worked with lots and lots of new programmers, too. I'll say this though: If you can write an app in 1 day and fix its problems a month later in 20 minutes, the code must either be trivial or use very high-quality tested existing chunks (which reinforces my point).

    ...How do you check for a nearly infinite number of failures in a distributed complex system ... We use transactions and high level exception handlers; if there is a failure anywhere we roll back the transaction and report the reason to the user...

    You're doing precisely what I'm talking about. I don't mean checking every damn variable on every use. I'm talking about checking the parameters as they come in, making sure you haven't overflowed your buffers, and making sure you haven't wrapped your integers. You're at least providing the high-level trap. I'm talking about programmers that don't do that :) We're not in as much disagreement as you think.

  • by Katravax ( 21568 ) on Thursday July 19, 2001 @05:24AM (#75553)

    It isn't always worth it.

    It is to me. That one thing or other I've skipped in the past always comes back to bite me. I've never had a boss that pressured me on time once they saw that my code didn't break the thing 6 months later when we were all struggling to fix a few pages that the coder didn't take the time to do right. Also, that occasional wundercode I've pulled off pretty much buys me whatever time I ask for. I guess it's the old saying: Fast, Good, Cheap; pick any two. My way certainly isn't the only way, but I can never pick the check to leave out, because I can't always conceive what the person that uses my code will do with it.

    Once in a while I get burned by that long-lived program or scope creep that I underestimated and end up with a big program that should have had the checks in place.

    And that's why I always build the checks in first. The time saved up front has never, ever, been worth it.

    In some situations where I've gone into an existing program, the checks were wrong simply because there was so much code that the developers got lost and ended up with crap.

    And thus one of my points. They weren't doing quality work. If they were, the burn would have been much smaller or non-existent.

    I understand what you're saying, though. Throw-away code can be written with throw-away techniques. I'm just particularly bad at knowing in advance what's going to be throw-away code.

  • by Katravax ( 21568 ) on Thursday July 19, 2001 @03:44AM (#75554)

    If programmers wrote solid code and tested it thoroughly, it would not take ten years to produce truly good software. I see more unchecked parameters, non-tested failure conditions, and badly designed function interfaces now in the work of the average programmer than ever before. I'm not sure who or what to blame for the problem (dare I say comp-sci curricula?), but I suspect part of it lies with the fact that "safe" languages are the first that most programmers gain mastery of.

  • robbyjo says:

    I am working for a research group that develops a tool for checking software *automatically*.

    [...much deleted...]

    Our tool is still very very buggy and limited.

    Doesn't that basically sum up what happens when new 'silver bullet' software technologies hit the real world? I don't mean to knock valuable research, because I know there's much room for improvement in software development tools and techniques, but when you have to apply them to the fuzzy requirements and deadlines of the real world, the results aren't as revolutionary as one would hope.

  • Ender Ryan. Dude, you're in time-out.

    Read your posts. You have a bit of growing up to do as far as your social skills go. You are replying to real people that put serious thought and effort into their messages and articles. Respect the fact that they are posting their ideas for you to learn from.

    Mozilla is a product based on Netscape 5.0, which was based on previous versions of Netscape, which were based on Mosaic, which was first released in 1993. Give it 10 years. Judge the quality of Mozilla in 2003.

    Q3A is definitely based on older versions of Quake, Doom & Wolfenstein.

    The fact is that if you are a programmer for a commercial product, you should not expect any of your code to last more than 3-4 years in that product. The product will continue to improve. Those improvements entail replacing older code, eventually replacing all older code.

    Think about it.
  • by Priestess ( 30745 )
    You want to find the number 10 in the world you will be able to find it everywhere. 10 steps from your street corner to your front door, 10 seconds you spend in the elevator? When your mind becomes obsessed with anything, you will filter everything else out and find that thing everywhere. 350, 420, 22, whatever. You have chosen 10, and you will find it everywhere in nature. But as soon as you discard scientific rigor you are no longer a mathematician. You are a numerologist.

    Pre......
  • it is expensive, but it's a solid product. On win2k it's the only way to go. About 4 times faster than PCAnywhere (must be nice to have source code) and is pretty stable. For GUI Win adminin' it's the best I've found. Not much different than remote X sessions, just more expensive.
    --
  • For those of you who think this is interesting, you can also check out meta-level compilation [stanford.edu], implemented as an extension to gcc.

    And for those of you wondering how useful stuff like this is, it's already caught bugs in the Linux kernel [stanford.edu], among other things. So that low-level, tricky race condition that was fixed in the newest version? It might have been pointed out by this tool.

    -sugarescent

  • What about TeX? Started work in 1977, but the final version was written in 1982, and was mostly-bug-free by 1985. Last known bug found in 1995.

    Err... I have no idea how you count that, but remember, it was (almost) all done by one man, who was also writing METAFONT at the same time.

    True, Donald Knuth isn't your average developer, but -- see? It can be done.

    -grendel drago
  • Exactly right. Does this guy actually think that grip or cdparanoia won't be stable for another five to eight years?

    Maybe KDE or GNOME in general will take ten years to mature into a 'final' state. Maybe Apache too... but when I think 'software', I think 'XMMS' or 'konsole'. Little stuff.

    -grendel drago
  • You mean the

    <tag attribute="value">data</tag>

    idea? Something like this is so basic you don't really think about where it came from...

    -grendel drago
  • "Good stuff takes 10 years"
    This guy must be talking about red wine, not software.

    -- Pure FTP server [pureftpd.org] - Upgrade your FTP server to something simple and secure.
  • Quake 3 is basically version 4 of the DOOM engine.
    ------
  • Not necessarily true. There are one-way property checks such as:

    Input: request for door to open
    Output: door opens or error reported IF (this or that or that)

    While this logic may be complex (let's say the API to the door is a nightmare), verifying that the output matches a set of desired conditions can be far simpler in certain cases than the actual logic to implement the desired behaviour. Especially depending on the language you are using to implement it... think of doing something in assembler, where correct operation is verified simply by walking through some memory and doing a few checks like 'not everything is null', and ruling out other trivial outputs you know you won't EVER want. His program almost sounds to me like high-level asserts, where the app goes through all the possible inputs you specify to check against the asserts.
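
    A rough sketch of that "high-level assert" idea (Python, hypothetical names and a hypothetical door API, not the poster's actual tool): the check inspects only the outcome, never the nightmare logic that produced it.

        def request_door_open(door):
            # The messy implementation hides behind door.open(); we only report the outcome.
            try:
                door.open()
                return ("opened", None)
            except Exception as err:
                return ("error", str(err))

        def check_property(result):
            # One-way property: the door opened, or an error was reported.
            status, detail = result
            assert status in ("opened", "error")
            if status == "error":
                assert detail  # an error must actually carry a report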
  • by bachelor3 ( 68410 ) on Thursday July 19, 2001 @04:18AM (#75583)
    Hang on to that HP3300C scanner...you're gonna get a really good linux driver for it in 2011.
  • You can't put it down to nicking code. I mean, if it were that, Mozilla would be ready, wouldn't it?

    It's probably more because MS diverted their best and brightest from every other product it had to work on IE. During those years they must have spent more $ on making IE than working on Windows, Office, and all their other stuff put together...


    ---
  • If it's a project with government involvement, he's being far too optimistic...
  • by alispguru ( 72689 ) <bob.bane@ m e . c om> on Thursday July 19, 2001 @05:04AM (#75588) Journal
    The semantic content of XML is equivalent to S-expressions, the core data structure of Lisp, which was invented by John McCarthy in 1959.

    To a Lisp hacker, XML is S-expressions in drag.
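
    For anyone who hasn't seen the parallel, here's a toy sketch (Python standing in for Lisp; the names are made up) showing the same tree written both ways:

        def sexp_to_xml(sexp):
            # A string is character data; a list is (tag child...) -> <tag>...</tag>
            if isinstance(sexp, str):
                return sexp
            tag, *children = sexp
            return "<%s>%s</%s>" % (tag, "".join(sexp_to_xml(c) for c in children), tag)

        tree = ["person", ["name", "Ada"], ["born", "1815"]]
        print(sexp_to_xml(tree))
        # prints: <person><name>Ada</name><born>1815</born></person>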
  • I can't stand it when people make idiotic statements like this fool. Good software takes 10 years to develop... What a load...

    Simply put, software takes... as long as it takes. Some software is incredibly great after only 2 or 3 years, other software takes 10 or even 15 years before it is great. And then there is tons and tons of software that is NEVER great.

    Just take a look at Netscape. It was a very widely used piece of software that has NEVER been fully stabilized, and there are countless bugs in that POS. It is a bad piece of software.

    Then there is Mozilla (WTF is he laughing about?). It's only 3 years old, 100% new code, and while it's not even to 1.0 yet, I'd say it's good. It is well on its way to becoming great software well before 10 years.

    Another example of a great piece of software, which happens to be one of my favorites, Q3A! ; ) I'm not kidding though, it is a fantastic piece of software. It is almost perfectly stable, I can't even remember the last time I saw Q3A crash. I've played Q3A for countless hours over the last year, and it hasn't crashed in that whole time. And, there is only 1 bug I can think of off the top of my head. Q3A is only (3? 4?...) years old and it is a complicated piece of software.


  • E.g. an old warhorse like tar now has support for .bz2 files (tar -zI).


    I assume you meant tar -j instead of -zI (apparently there were a couple of different patched versions using -y or -I, and so they picked -j in the newest versions instead).

    OT? Yes. Many apologies. I am bored.
  • Some would argue that linux was in a usable and stable state in the 1.2.x series. 2.0.x was really a milestone, although that series lagged on for quite a while (due to lackluster 2.1.x development).

    You also have to realize that Linux actually means GNU/Linux, and the GNU project really started around 1984. So you could argue that in 1994 Linux 1.2.x + the GNU operating system was the final product, the Linux kernel being the last piece.

    • Personally, I like the ability to launch a text editor for 'small' documents without waiting for 300 DLLs to load, or a splash screen, or 50 million freakin options that I'll never have a use for.
    I think what you're looking for is vim under Linux. Unless you are addicted to point-n-click, in which case I will simply sympathize and move on...
  • Cool. Does most gtk software port over to windows easily? I know gimp is ported. Maybe we can subvert MS on their own operating system. Interesting.
  • But as soon as you discard scientific rigor you are no longer a mathematician. You are a numerologist.

    Or an idiot opinion writer on the web :)
  • I'm not a Java programmer (IANAJP), but how is Bandera different than Java Pathfinder [nasa.gov]?

    From the Java Pathfinder page: "In keeping with this philosophy we have developed a verification and testing environment for Java which integrates model checking, program analysis and testing. Part of this work has consisted of building a new Java Virtual Machine that interprets Java bytecode. JPF is the second Java Model Checker developed by the Automated Software Engineering group at NASA Ames - JPF1 used a translation from Java to PROMELA in order to do model checking with the SPIN model checker."

    While your enthusiasm is great, I do not think it will revolutionize software development. The tool would have to be amazing. And yes, I do mean amazing. If it were amazing right now, which it is not, you'd have a chance to change the software development world. It also seems very academic and not truly practical. I could be totally wrong about this -- remember, IANAJP!

    Bottom line: It is a good idea and sounds interesting so keep your chin up. But, try to be a bit more realistic.
  • by mhelie ( 83207 ) on Thursday July 19, 2001 @03:33AM (#75604)
    A lot of games have been in development for a very long time and are still being developed. Quake, for example, is 5 years old and at least three upcoming games I'm aware of (SoF 2, Wolfenstein, Medal of Honor) are using it. Unreal has also been in development for years.

    Just because it changes on the surface doesn't mean the whole program gets scrapped and everyone starts over. Unless we encounter a serious technological bottleneck in our current engines, it is quite likely they will still be in use in another five years.

    -------------------------

  • No kidding...

    Actually, we may make a pitch to them.

  • by Pedrito ( 94783 ) on Thursday July 19, 2001 @05:23AM (#75611)
    I've been a part of software projects that produced "good" software. Really good software, in my opinion. Really good, large systems that were reliable and that the users loved. One took roughly 3 years to be ready for market and had another 2 years of additional development.

    Another is one that we're nearing completion on. It has taken 1 year for four developers, and it's a large distributed system. It's very stable and it's good software. We've also taken on the idea of "plugins" and exposed a great deal of our system's internal data and functionality so that we can add almost all of our new features via the "plugins" and not have to worry about mucking with the base system and messing it up.

    Now, as my old boss used to say: "We're not sending rockets to Pluto," but these are fairly large, complex systems.

    The first was a multi-user engineering system for developing cell phone networks (base station locations, traffic analysis, propagation prediction, interference prediction, etc.). The second is an enterprise-wide tracking system, used to track everything from bugs in the software itself, to evidence in police stations, to prisoners in prisons, to assets for a company.

    So, I don't really buy into the 10 year thing. Not to mention the speed of technology changes, hell, you can't design for what's going to be there 10 years from now. Who knows what's going to be on your desktop?

  • The final line of the article is...

    Good software, like wine [winehq.org], takes time.

    (Okay, so I changed the link).

  • He may have a point, but the "number of lotus users" graph might have some correlation to "the number of PCs in use" graph. The size of the market has increased so much it doesn't really prove much. I know Joel tends to think older software is better, hence this article entitled "Things you should never do, part 1" [joelonsoftware.com] where he chastises anyone who has ever re-written anything from scratch, saying that it is a waste of a huge investment. I don't know. I certainly think refactoring code is often necessary. When does it stop being refactoring and start being a re-write?
  • by MrBlack ( 104657 )
    I forgot to mention... Has anyone else noticed how much of a Philip Greenspun devotee Joel is? All those photos were flashbacks to "Travels With Samantha", and he praises Philip in a few places on his site. He's about as opinionated as Phil G., too...
  • Has taken 10 years!

    Notice that some of the best projects, although usable, are not "mature" software until at least 3-4 years.

    Linux has been around about that long, and in the last 2-3 years has really picked up speed (and dollars) from corporations.

    Microsoft's OS picked up speed around Windows 3.1 & 3.11 (early 90's) and had matured for approximately 10 years prior to that (DOS 1.0 - DOS 6.22). You get the idea. Now that Windows is approaching 10 years old (Windows 2000), it is finally an almost, dare I say it, "good" OS.

    Although 10 years might not hold true for all applications, it will hold for a large number.

  • See my paper (with Scott Johnson), Practical Program Verification [nec.com], in POPL '83.

    It's quite possible to use formal methods, and tools that check them, to produce bug-free programs. But the formalism is too much for most programmers. That's the real problem. Using a verifier means writing code that you can explain in a formal notation. This is a huge pain. It is, though, quite straightforward to check for low-level problems like possible numeric overflow and subscript errors by formal methods. More recent work has extended this to race conditions and deadlocks.

    Back when we were working on this, a major problem was that it took 45 minutes on a 1 MIPS VAX 11/780 to verify about 1000 lines of code. Today, that would take 2 seconds. You can be too early with a technology.

    Incidentally, undecidability isn't a problem. It's possible to construct programs whose halting is formally undecidable, but they're not very useful.

    Although program verification hasn't done much for software, formal techniques are widely used in IC design, where people are serious about engineering and stuff is expected to work.
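
    For readers wondering what "checking for subscript errors by formal methods" amounts to in practice, here is a toy sketch (illustrative only, nothing to do with the POPL '83 system): the assertions below are the obligations a verifier would prove for every possible input, rather than only the inputs a test suite happens to exercise.

        def window_average(xs, start, width):
            # Precondition the caller must establish.
            assert width > 0 and 0 <= start and start + width <= len(xs)
            total = 0
            for i in range(start, start + width):
                assert 0 <= i < len(xs)  # subscript safety, provable from the precondition
                total += xs[i]
            return total / width

        print(window_average([1, 2, 3, 4], 1, 2))  # 2.5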

  • by jallen02 ( 124384 ) on Thursday July 19, 2001 @04:18AM (#75629) Homepage Journal
    What about the whole Rapid Application Development philosophy?

    I think it still applies that in a month I can write a very useful application using something such as Visual Basic or Delphi (Kylix).

    The time in development does not have to fall solely on MY shoulders. The Delphi people have spent many years coding the base libraries and IDE for Delphi. How does this entire development philosophy fit into this guy's plan?

    What about 4GLs, where many years have been spent refining these languages? Many years have been spent thinking of how to make programming flexible (to a degree) and easier. Does this not /cheat/ the example given, since you have so much time invested in your platform that you can get a good boost from having such a strong underlying development platform?

    I am not implying any such platform exists; I am just stating that it seems to me this is quite feasible with the right tools. Maybe they are not out there yet, but they will be some day.

    I just think that in software development you will find many rules and philosophies spring up, only to be later invalidated as the nature of the beast changes.

    Anyhow, laters.

    Jeremy

  • Agreed. If I want to edit a webpage, I use Notepad. A webpage that is too big to load in Notepad is too big. Period.

    If I want to view or edit a really large plain-text file, I open it in MSVC. MSVC whips the pants off Wordpad for large plain-text files. Wordpad is only useful for viewing documents from people who insist on sending you .doc files. As long as there isn't any crap embedded in a Word file, you can view it. If people send me Word files with crap in them, it usually doesn't matter. 99% of the time the graphics are just that--crap with no information that can't be deduced from the accompanying text.

  • Ok, not the best example of a game that took many years to "perfect". I guess Daikatana isn't either... ;-) Games are probably not a good example of "more dev time == better app" because gamers want the latest, bleeding-edge thing more than a "workhorse" (Half-Life is a workhorse, but its persistence is probably dependent on a steady supply of new mods/Counterstrike updates).
  • The catch is you have to specify the expected properties of your program in terms of logical language


    Oh, so all we have to do is write a bug-free program to check our first program for bugs? Brilliant! (Seriously, I'm all for correctness checking, but if you think it is a solution and not a tool, you're smoking too much of that there crack cocaine.)

    "Beware of bugs in the above code - I have only proved it correct, not tested it."
    - Knuth
  • This was previously intractable and infeasible task as the scientist proves that nobody ever breaks the Turing machine. But, we've got a way to get around with it.

    You might want to take a few more math classes, and then get back to us. I'd tell you to suggest to your professor to do the same, but incompetent professors are often extremely vindictive.

    There's often nothing useful you can do with a computer science professor who has refused to learn any computer science, except nod and smile and stay the hell out of their way.
  • Look at PC Anywhere. This program has been around since the DOS days, and is now at version 10. It's still buggy as hell. You would think that by the time you get into double digit version numbers that you would pretty much have most of the bugs worked out.

    It's kind of amusing that you can do the same thing in UNIX with X without all the hassle. Who says UNIX is harder to use than Windows??

    ---

  • I haven't tried Windows Terminal Server; I hear it is quite expensive. We are slowly starting to use MS NetMeeting, which is a little more stable than PC Anywhere. I should look into Terminal Server, though, since we have to support Windows-based PCs at 4 remote warehouses.

    ---

  • But the calculator *still* can't do squares.

    View Menu > Scientific
    Enter your number, and click the "x^2" button

    It's been this way since at least Windows 3.1
  • by Atom Tan ( 147797 ) on Thursday July 19, 2001 @04:03AM (#75643) Homepage
    Obviously, 10 years as a blanket maturation time for all software is over-the-top. But Spolsky does make some good points about software development. His main points can be summarized in the statement that creation of new, non-trivial software is a "wicked problem"--it requires constructing an incomplete or unsatisfactory solution to fully characterize the problem in the first place. A deep understanding of software design as a wicked problem will make you a better (and less frustrated) developer. For example:
    • Requirements will always fluctuate throughout the development cycle, because users cannot entirely formulate what they want until they have seen something close. For the same reason, requirements can never be fully formalized--the more explicitly complex behavior is described, the more likely it is not exactly what the user wants.
    • It is nearly impossible to get a complex design right the first time, even if you are among the very best. Design and build functionality slowly and incrementally, and expect to revise aspects of the design and code you didn't anticipate revisiting.
    • All non-trivial software has defects--you should accept this and spend time and effort developing an effective QA process, rather than treating QA as an afterthought or feeling guilty that your software has bugs. The development process lasts months, but the QA process lasts for the lifetime of the software. Concentrate on making the software better, not just fixing the bug at hand.
    These are just a few points, acceptance of which will lead to more effective development. A big barrier to acceptance of the real difficulty of software is that we spent our formative years developing trivial, batch-oriented programs that our professors could run easily and grade objectively. With these programs, the requirements were known upfront (any ambiguities having been flushed out by repetition of the assignment), the program was known to be doable by most developers in the allotted time, and it could be completed bug-free. None of those things is certain in the real world. Yet we treat complex software as if it can be done this way. It doesn't work--"Zaro Boogs", anyone?
  • To develop a single game for 10 years would be madness.

    Madness? You bet. http://www.nethack.org [nethack.org]

  • I think the reason for the poor code lies more in the shortened development cycle than anything else. The eXtreme camp would say refactor, but usually the conditions that caused the shortened cycle in the first place mean you can't go back. If the conditions are caused by poor project management, that is one thing. Often though, they are brought about as the only means for survival [fuckedcompany.com].

    Maybe we should just stick with safe [schemers.org] languages?

    I also take exception to the comp-sci dig =). I think a good comp-sci curriculum leads to LESS slipshod code. I've seen too many people reinvent the wheel (poorly) because they didn't understand basic computer science concepts and design. The teaching of which is the goal of the very first class in the curriculum. [ou.edu]
  • I don't know about you, but it doesn't take most programmers ten years to write a REALLY good HELLO, WORLD script. Of course, there's always the chance it can be used as a buffer-overflow denial-of-service Trojan!
  • Sounds like he's fairly anti-OSS/FS to me. He advises against "release-early, release-often" and "it'll ship when it's ready"; he takes a poke at Mozilla because they're not stamping "1.0" on a buggy release; and seems to believe that software rental is the only way to go.

  • New features are still being added, but they are few and far between. E.g. an old warhorse like tar now has support for .bz2 files (tar -zI).

    Since the support is external (stuff is just piped through the bzip2 command), one could argue that tar could've been properly designed from the start to handle an arbitrary compression program (just as 'tin' works with arbitrary text editors and 'rsync' works with an arbitrary rsh-like program [though in the rsync case, I'm not sure when that was added -- but should ssh ever get supplanted by a new rsh-like program, rsync's already ready to support it]).

    Furthermore, one could argue that the existence of both 'compress' and 'gzip' should've clued the tar developers in on the idea of supporting arbitrary compression programs.

    However, I don't think the people who make tar particularly dropped the ball or anything. It's just that the software could've been made flexible enough beforehand.
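
    A small sketch of the "arbitrary compression program" design being described (Python, hypothetical helper; GNU tar's --use-compress-program option works along these lines): the archiver just pipes its stream through whatever filter the user names, so new compressors need no changes to the archiver.

        import subprocess, sys

        def archive_through(compressor, paths):
            # Write the archive to stdout and pipe it through a user-chosen compressor.
            tar = subprocess.Popen(["tar", "-cf", "-", *paths], stdout=subprocess.PIPE)
            comp = subprocess.Popen([compressor], stdin=tar.stdout, stdout=sys.stdout.buffer)
            tar.stdout.close()  # let tar see a broken pipe if the compressor exits early
            comp.wait()
            tar.wait()

        # e.g. archive_through("bzip2", ["somedir"]) -- or gzip, or any future compressor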

  • What do you think I'm doing right now? :)
  • Well, note that I said 'single game'. The Quake engine is getting pretty old, but that's not really the game per se. As I said previously, to develop a game for 10 years would be nuts, but to develop a system to be used as the basis for perhaps 30 games over 10 years is pretty sound.

    As for the 3 games you mention, Medal of Honor is a port of a PlayStation game. And PlayStations haven't been around for even 5 years yet, so that could hardly be 10 years' work.

  • by onion2k ( 203094 ) on Thursday July 19, 2001 @03:17AM (#75664) Homepage
    I'd agree in principle. To write Notes, or an OS, or a DBMS, certainly, 10 years is probably a fair amount of time to get something of reasonable quality. But..

    Not all software has even close to the lifespan of the big applications this guy is talking about. Most user applications, games, web technologies -- they are all projects that get used for a few years and then get replaced. To develop a single game for 10 years would be madness. The amount of time a project has is usually linearly related to the lifespan of the outcome. If you're writing something that's going to be used in 20 years' time, then it's probably not an afternoon's work.

  • Actually I believe that he's really referring to what software revision codes actually mean [ornot.com].

  • by evanbd ( 210358 ) on Thursday July 19, 2001 @04:10AM (#75671)
    Linux anyone? started in 1991, really taking off in 2000/2001. Sounds like ten years to me...
  • Personally, I'd put a lot of it down to them nicking the Mosaic source and using it. That effectively makes the time of development something more like 6 or 7 years rather than the 2 or 3 MS worked on it.

    Well, first of all, they licensed the Mosaic source from Spyglass (completely screwing them over). But the issue here is IE 3 and 4 development. Using Mosaic code got them out the door a lot faster, but their blowing by Netscape was their own doing, and Netscape's.

    I think where Joel is off is that he says Mozilla made a crucial error by restarting from scratch. But didn't Microsoft completely rewrite the rendering engine for IE 5?

    Unsettling MOTD at my ISP.

  • by update() ( 217397 ) on Thursday July 19, 2001 @05:31AM (#75675) Homepage
    Not all software has even close to the lifespan of the big applications this guy is talking about. Most user applications, games, web technologies -- they are all projects that get used for a few years and then get replaced.

    I agree about games, with the exception of low-tech stuff like Angband and other Rogue-likes. I also agree about simple utilities. (That's why I don't care whether Miguel and Ximian decide to throw their energies into chasing .NET -- the stuff I care about in Gnome, like grip and xchat, is only going to get slower and buggier if they ever really get Bonobo in place.)

    But user applications like office suite components, financial applications, things like that definitely need time to mature. It took Word and Excel years to go from their original Mac-only incarnations to their usability peaks (Mac Word 5.1 and Excel 6.0, IMHO). And web browsers are far more usable than they were five years ago.

    While I'm here, responding to some things in the article:

    Mistake number 2. the Overhype syndrome. When you release 1.0, you might want to actually keep it kind of quiet.

    A corollary of this -- don't crank up the hype before you've written anything! That way lies Eazel, Mozilla, Marimba and, I'm guessing, Mono.

    Mistake number 5. The "We'll Ship It When It's Ready" syndrome. Which reminds me. What the hell is going on with Mozilla? I made fun of them more than a year ago because three years had passed and the damn thing was still not out the door.

    I made fun of them, too. Until it turned out that its function at AOL was to serve as a bargaining chip with Microsoft. It did its job. And I don't think you can fault them for not releasing those horrific 0.9 versions as 1.0. You can fault them for taking so long to make it usable.

    Unsettling MOTD at my ISP.

  • > Unfortunately, he doesn't mention any OSS/FS examples

    He did mention Wine [slashdot.org] at the end. But I guess it'll take longer than 10 years for this wine to come of age ;)

  • I suspect part of the reason is that comp. sci. curricula are, in many universities, dictated by the students who are convinced that their investment in time and money is best spent on learning great chunks of Java, VB, C++ practice, in terms of graduating with the best job prospects.

    Unfortunately the teach-em-only-what-industry-is-using approach does not provide students with any real perspective and you end up with second rate programmers.

    For my own part, the "safe" languages aren't safe enough. Java etc. don't give Joe programmer access to dangerous pointers and so forth, but you still get runtime type errors, unchecked error cases, suckful performance, and an inability to stick to a spec. [Plug: it's nigh on impossible to avoid checking error codes in Mercury...]

  • by hillct ( 230132 ) on Thursday July 19, 2001 @05:04AM (#75681) Homepage Journal
    The submitter of this article states:
    I'm sure we can handle that. I am mostly skeptical of his idea because I'm not sure how he thinks a young software company could possibly survive without building really good software in much less time
    Perhaps it could be said that the customer expectations in the software industry have effectively closed the market to new companies in some areas.

    Will there ever be a new commercially viable operating system (not saying Linux is bad, just that its market share is far too low to consider it at this point)? Can such an effort exist if the company producing it needs seed capital for 10 years of operation before a quality product can be produced?

    This seems actually to be a great opening for Open Source. OSS has the advantage of not requiring large capital outlays to continue development. Look at the number of OSS projects started in the past 5 years, and at the number of corporate software startups from the past 5 years. How many of the corporate software startups are still around? How many OSS projects are still around?

    I recall a statistic somewhere that only 2% of companies ever really succeed beyond 3 years. I wonder what that percentage is for OSS projects...

    Based on the above comment, it seems to me that the only major competition in large-scale software such as operating systems, enterprise-quality databases and perhaps word processor software will come from OSS, which is the only development model that could survive for the decade necessary for the products to reach maturity.

    The alternative is to say that the only major software companies that will ever exist have already been established and that the barriers to entry into these markets are too high for any new startup. This is not something I would ever want to have to admit.

    --CTH

    --
  • by Codeala ( 235477 ) on Thursday July 19, 2001 @03:26AM (#75688)

    I think a piece of software is only good iff it takes forever. As software *approaches* perfection, changes are made less often. New features are still being added, but they are few and far between. E.g. an old warhorse like tar now has support for .bz2 files (tar -zI). It is impossible to say something is perfect and cannot be improved anymore. Anyone remember that quote about "all the inventions had already been invented"?

    Of course this probably doesn't apply to the commercial environment (which is what the article is aimed at, I think). Imagine starting a business selling tar: buy tarXP because... okay, let's go to a subscription model ;-)

    ====

  • Agree: No way. I was a software engineer on two key projects which worked "out of the box" and performed very well. Both took 9 months to a year to complete:

    1. Thomson's 1st DirecTv satellite receiver.
    2. Thomson's DOCSIS Cable Modem.

    Most impressive about these programs is that they incorporated "first silicon" for their technologies. So not only did the software work, but the hardware worked on the first "tape-out".

    Ten years? BS. That sounds like a poor excuse for crappy engineering.
  • There is also VNC, just like PC-Anywhere, 'cept it's free, and works on many platforms.

  • Well, that's an interesting misinterpretation, but the fact is that Java in a browser cannot be expected to be up to date, anyway. Java is a portable virtual machine that used to pack an incredible punch and have a lot of power, even in its most compact version. I've written programs for Java 1.0 that look gangbusters and run fast enough that older machines can keep up with their HotSpot JIT brethren. It took a lot of effort, and required a lot of care, but you COULD make Java run everywhere (no matter what the naysayers, like our braindead IT department, will tell you). But now that MS isn't including Java with their browser, Java's just another plug-in that consumers visiting your site might not have. And the Java plugin is currently only available as part of a 5 meg J2RE download that is decidedly not for the average Joe. Hell, our QA department has fscked up installs on three boxes! At the same time, I can't get rid of this damn Comet Cursor, the most idiotic, useless and proprietary software around. In fact, unless something is done to ease the Java gap that will begin to widen when XP hits the CompUSAs and Warez Servers of the world, we, the development community, will have to consign ourselves to a world where the most powerful cross-browser, cross-platform control is Flash 4.

    *Shudder*.
  • This sounds like an attempt to automate formal verification. And formal verification can work, but at best it proves that the program meets the spec. Usually the problem is that the customer decides he wants something else when he sees the program running according to spec. So you do better by implementing part of the spec, getting the customer to play with it, then changing the spec accordingly.

    On the other hand, will this detect whether or not the programmer put anything in place to catch a buffer overflow? Considering that Microsoft has issued six patches since December, each because an unchecked buffer allowed a security exploit, ...

  • Sorry, but who is going to be interested in coding a boring, routine business application like that? Certainly not me. I do shit like that all day at work. I imagine it's exactly the same for most open source guys.

    Um, maybe, just maybe: the guys who are doing it right now? There is nothing that says open source has to be made in somebody's parents' garage. I'm positive that this is not the only clinic with a programmer on hand working on keeping their crusty old custom software working.

    Now if two or more of these guys working on these custom apps for clinics get together and work on something together, then they can all enjoy the work done by the others (after auditing it themselves, of course). You don't have at least a peripheral grasp of how OSS works, right?

    - RustyTaco
  • Yes, she (female, not male) wrote a very good article about it.
    "Things you should never do"

    --
    Two witches watched two watches.
  • Not at all.
    What she (it's a she, not he) says is that:
    A> Don't advertise your *1.0 product* during the Super Bowl. It's not going to be good enough, and you'll lose people's trust that way.
    B> Don't create over-ambitious goals. Don't say, "I want to create a word processor", and then try to copy every feature of Word on your 1.0 product.
    C> *Keep* a schedule. Without one, programmers code for fun, not for the best of the application.
    D> Don't expect *immediate* commercial success.

    --
    Two witches watched two watches.
  • Actually, I think that Win9x is a masterpiece of engineering.
    The one thing that MS needed with 9x is compatibility with DOS & Win16 applications.
    They damn well got *that* one.
    The problem is that this compatibility *cost*. And that cost is in the stability of the system. The OS can't guard itself against rogue applications.

    --
    Two witches watched two watches.
  • Depends on what you think of as Linux.
    If you are talking about the kernel, it was usable quite some time ago.
    It still needs to work around some of the problems it has, but the kernel is mature.

    But the kernel isn't the problem with Linux; it's the rest of the system that raises many objections.
    Especially in the desktop market.


    --
    Two witches watched two watches.
  • "I didn't have a boyfriend"

    http://joel.editthispage.com/stories/storyReader$4

    Gay?

    --
    Two witches watched two watches.
  • You haven't seen some of the female soldiers that *I*'ve seen.
    The food the IDF supply *does* make hair grow on your chest.

    "I didn't have a boyfriend"

    http://joel.editthispage.com/stories/storyReader$4

    Gay?

    --
    Two witches watched two watches.
  • NT's kernel is actually quite tiny.
    Most of the code is *not* in the kernel, you know.
    Put Linux, X, KDE/GNOME, Bonobo, a J2EE implementation, TCP/IP, Apache, and sendmail together; how many LOC does this come out to?

    That is (very roughly) what NT has.


    --
    Two witches watched two watches.
  • But the calculator *still* can't do squares.

    --
    Two witches watched two watches.
  • by tb3 ( 313150 ) on Thursday July 19, 2001 @04:32AM (#75718) Homepage
    Because Windows is the razor and Office is the razor blades. Notepad and Wordpad are in there to annoy you into buying Office. Check the retail price of Office ($580) vs the retail price of Windows ($310). Microsoft doesn't announce its profits by division or product, but I bet Applications makes more money than Operating Systems.
  • by n76lima ( 455808 ) on Thursday July 19, 2001 @03:23AM (#75741)
    I work in the medical practice management business. A group of Docs that I consult with have their own PM app suite that they had written in house. It's 14 years old, and has gone through 2 major re-writes. It didn't take 10 years to produce good workable software, but it continues to take development to refine it and add features that the staff need to transition to an office model that uses less paper. It started as a DOS program and evolved into several DOS sessions that could be task-switched under Windows 3.1, then 95. Now the current version is all Windows (on the desktop; we run extensive Linux support on all the servers). They have had a full-time programmer on staff for 14 years with no end in sight. The current thrust is to make the medical records available to the Docs on the web through a browser interface. The cost to develop it has all been paid by 1 clinic, and I estimate that they have ~$1 million tied up in the coding over the life of the project.
  • by Smedrick ( 466973 ) on Thursday July 19, 2001 @04:36AM (#75752) Homepage
    I suppose you could blame comp-sci curricula (along with simple laziness)... I've got mixed feelings in that area. Some courses I have taken were incredibly helpful, while some were complete wastes of time. Yet, I'm lucky enough to be going to a pretty damn good school for CIS. I've seen graduates of some other schools that put shame to our field. I don't know what some of these colleges are teaching, but it sure ain't comp sci.

    The best way to teach comp sci is not to give the students "safe languages", but to teach them how to learn languages and how to develop software (planning, modularity, commenting, etc.). The problem is that I think the field of computer science is still going through beta testing in most universities. Topics like CoE and EE have a pretty firm base. The classes don't vary much from school to school. But with CIS, schools are still learning how to teach the concepts. Sure, a class in C++ or assembly languages won't vary much... but knowing a language can only get you so far. There's really not a solid platform yet for universities to start their curriculum from.

    As high schools start offering more technology classes (which they're starting to) and colleges work out the bugs in their curricula, you'll probably start seeing the software industry become more streamlined and standardized... like engineering.

    --
