
Rewrites Considered Harmful?

ngunton writes "When is "good enough" enough? I wrote this article to take a philosophical look at the tendency for software developers to rewrite new versions of popular tools and standards from scratch rather than work on the existing codebase. This introduces new bugs and abandons all the small fixes and tweaks that made the original version work so well. It also often introduces incompatibilities that break a sometimes huge existing userbase. Examples include IPv4 vs IPv6, Apache, Perl, Embperl, Netscape/Mozilla, HTML and Windows. "
  • by halo1982 ( 679554 ) * on Thursday January 15, 2004 @02:28PM (#7988847) Homepage Journal
    I wasn't aware of this....ehh...I thought XP was just modified Win2k code (and I remember my early XP alphas/betas looked exactly like Win2k...same with Server 2003...)

    Was it a "good idea" for Microsoft to rewrite Windows as XP and Server 2003? I don't know, it's their code, they can do whatever they like with it. But I do know that they had a fairly solid, reasonable system with Windows 2000 - quite reliable, combining the better aspects of Windows NT with the multimedia capabilities of Windows 98. Maybe it wasn't perfect, and there were a lot of bugs and vulnerabilities - but was it really a good idea to start from scratch? They billed this as if it was a good thing. It wasn't. It simply introduced a whole slew of new bugs and vulnerabilities, not to mention the instability. It's just another example of where a total rewrite didn't really do anyone any good. I don't think anyone is using Windows for anything so different now than they were when Windows 2000 was around, and yet we're looking at a 100% different codebase. Windows Server 2003 won't even run some older software, which must be fun for those users...

    • by nakhla ( 68363 ) on Thursday January 15, 2004 @02:31PM (#7988902) Homepage
      Windows XP/Server 2003 were NOT complete rewrites of the OS. Many of the individual components within the OS may have received extensive retooling, but the OS as a whole was not a complete rewrite. New features were added. Existing features were modified. The code simply evolved from one version to another, just as with most products.
      • by rushfan ( 209449 ) on Thursday January 15, 2004 @02:45PM (#7989143) Homepage Journal
        Very true... One of the few "rewrites" Microsoft has ever done is the NT codebase (which was actually more a case of OS/2 morphing into NT), and even that wasn't a true rewrite, since the "original" DOS/Win3.1 codebase kept living on with Win95/98/ME.

        MS has tried some rewrites (I think they attempted an Excel rewrite; Code Complete references it) but scrapped them (also never giving up on the previous-generation codebase).

        That's one thing they do well (for better or worse): they don't waste money on rewrites (look at Win9x).

        Rushfan
    • XP and 2003 are fairly minor tweaks of Windows NT, but they are missing some of the back-compatibility that was in Windows 2000 if I remember right.

      Windows NT was as close to a complete rewrite (of Windows 3.1) as Microsoft has attempted in a long time. Since then, two main branches diverged: Windows 3.1 -> Windows 95 -> 98 -> ME, and Windows NT 3.5 -> NT4 -> 2000 -> XP -> 2003.

      XP was in no way "from scratch".

      Longhorn sounds like it will use an NT-derivative kernel, b
      • by Gherald ( 682277 ) on Thursday January 15, 2004 @02:55PM (#7989291) Journal
        > XP and 2003 are fairly minor tweaks of Windows NT, but they are missing some of the back-compatibility that was in Windows 2000 if I remember right.

        No, you have got it backwards. XP and 2003 are both MUCH more back-compatible than Win2k.

        Aside from NT, Win2k was the most incompatible Windows ever. Stable, but with many compatibility problems with both hardware and software, especially before the various service packs came out.

        > XP was in no way "from scratch"

        You are correct. XP is the Win2k codebase with many features added and much better hardware/software compatibility. It was designed to be both a home and an office OS, whereas Win2k was designed specifically to be a robust server/workstation.

        Incidentally, after all this time there is still an ongoing debate about whether XP or 2000 is more stable as a workstation client. As a network admin for 46 stations, my vote goes to XP.
    • by Fnkmaster ( 89084 ) * on Thursday January 15, 2004 @02:32PM (#7988915)
      It was not a rewrite, and thus is a terrible example. In fact, if you poke around it's referred to in several places internal to the OS as "Windows NT 5.1" to Win 2k's "Windows NT 5.0". That should give you a pretty good clue that it's not a rewrite.


      And the fact that somebody thought it was should give you a good clue that Microsoft's marketing machine is quite a powerhouse indeed - they want the average consumer to THINK that XP was some totally new thing. It wasn't. In fact, if you install all the latest DirectX runtimes, patches and so forth into Win 2k, you will basically conclude that the difference between a fully patched up-to-date Win 2k and Win XP is themeability and some graphics geegaws. And that product activation stuff if you are running a non-corporate version of XP.

      • New network stack, kernel, driver model, GDI, interface, hardware layer. If that's what you think of as a small change, I'd hate to see what you consider a major one.
        • When you say "new", you mean changed. I don't think the kernel was rewritten from scratch, was it? Driver model? I'm under the impression that most Windows 2000 drivers are more-or-less ABI compatible with Windows XP without modification. Apparently the DDKs aren't that different between the two OSes, though there were of course changes (http://www.osronline.com/article.cfm?id=249). There were a substantial number of additions to the network stack (http://www.microsoft.com/technet/treeview/default.as
  • by [TWD]insomnia ( 125505 ) on Thursday January 15, 2004 @02:28PM (#7988854)
    .. as they are rewriting the security layer!
  • Design decisions (Score:5, Interesting)

    by FedeTXF ( 456407 ) on Thursday January 15, 2004 @02:28PM (#7988856)
    As a coder I can assure you that working on somebody else's code is frustrating, because you always say: "I would have done this differently." Most rewrites, I think, come from there: having the idea of a better implementation.
    • by polished look 2 ( 662705 ) on Thursday January 15, 2004 @02:35PM (#7988968) Journal
      As I recall, Torvalds mentioned that some of his original code in the Linux base was not very good and he would write it much differently today. Indeed, almost anyone who programs habitually becomes more skilled over time, and if the underlying premise/framework/model of an application or tool is not as good as it could be - or lacks a certain methodology that time has proven beneficial, and only rewriting will fix that - what is wrong with rewriting the code from the ground up?
      • That is an excellent point.
        I was once working on a time card program for a business a few years ago. I was relatively new at programming in the real world (I was still in high school, actually), and after the initial program was 'finished' and we introduced it into the current setup, we had to fix bugs. After fixing bugs, and fixing the bugs that the previous fixes introduced, I realized that if I had written the program using a different layout, the entire system would just be better.
        I didn't ever comp
      • by jadavis ( 473492 ) on Thursday January 15, 2004 @02:52PM (#7989239)
        what is wrong with rewriting the code from the ground-up?

        Nothing is wrong with that, as long as your time is worth nothing.

        The obvious answer is that if you get some enjoyment out of a rewrite, and you actually do it, then sure, it's great. But if you have to trade away something more important, then it's bad. What else could you do with your time, and how much enjoyment or productivity would you get out of the alternatives?

        When I first started programming, I would always get a vision about how a piece of software should work, and think about rewriting it. But usually the current software is, to an extent, cluttered for a reason. I think it's only worthwhile if you can actually work out the details, and you're still confident. The details are what always cause the problems.
      • what is wrong with rewriting the code from the ground-up?

        I usually find Jamie Zawinski to be an arrogant rude asshole, but occasionally our opinions overlap. In this brief rant [jwz.org] he describes the Cascade of Attention-Deficit Teenagers software development model, which often leads to rewriting code from the ground up. Over and over and over.

        Stay out of that trap, and actually fix stuff during your rewrite, and there's nothing at all wrong with doing it over from scratch. Rewrite it just because you d

      • by Webmonger ( 24302 ) on Thursday January 15, 2004 @04:55PM (#7991147) Homepage
        What is wrong with that is that most of the code is correct and solid. It's just organised in the wrong way.

        Instead of rewriting, restructure! When you rewrite, there's a period where the new code doesn't work. If you restructure in suitably-sized steps, the code always works between steps.
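
        A minimal sketch of what "suitably-sized steps" can look like (Python, with invented function names): each step is a small, behavior-preserving change, and the program still runs between steps.

          # Step 0: the original -- works, but the pricing rule is buried in the loop.
          def report(orders):
              total = 0
              for o in orders:
                  total += o["qty"] * o["price"]
              return "Total: %.2f" % total

          # Step 1: extract the rule into its own function. Behavior is identical,
          # the tests still pass, and only then do we take the next step.
          def line_total(order):
              return order["qty"] * order["price"]

          def report2(orders):
              return "Total: %.2f" % sum(line_total(o) for o in orders)

          assert report([{"qty": 2, "price": 1.5}]) == report2([{"qty": 2, "price": 1.5}])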
    • by Old Wolf ( 56093 )
      A lot of time wasting comes from that too. Even if you can think of a better implementation, if it isn't better by much, it's not worth the development time + debugging time to do it that way.

      Re. the original post, I think a lot of the problem is caused by bad code commenting. When you make a "little tweak", or fix some minor bug, or fix a subtle logic bug, you should clearly comment in the code what you have done, so that it can serve as a warning when somebody else looks at the code and does not realise the s
    • by selderrr ( 523988 ) on Thursday January 15, 2004 @02:35PM (#7988976) Journal
      I disagree. Most rewrites come from the experience learned during long periods of adaptations. The roots of this rewriting problem go back to the source of all coding evil: specs.

      In 15 years of coding, I have NEVER worked on a project whose specs could foresee further into the future than, say, 4-6 years. After that, either the managers start pushing new features that simply do not fit the original concepts, or you bump into uses of your software you did not foresee, simply because the scale of applications has grown beyond the scope of your own usage.

      The last 4 years I've been writing an app for authoring psychology priming experiments (somewhat like E-Prime, but with far more randomisation capabilities). In the original concept, no one on our team expected anyone to make randomisations with a tree wider than 6 stages. So I went for 15 in my code. By now, 4 years later, I have seen projects with twice that depth. I could expand the code by changing some #defines to provide for larger arrays, but that ignores the fact that such complex randomisations demand a whole other interface. So after a few weeks of puzzling, we decided... you guessed it: a rewrite.
      • Re:Design decisions (Score:5, Interesting)

        by blinder ( 153117 ) <blinder@dave.gmail@com> on Thursday January 15, 2004 @02:45PM (#7989141) Homepage Journal
        specs could foresee further into the future than, say, 4-6 years

        LOL! Good grief man... the client I'm working with, their specs can't see past 4-6 weeks!

        Over the last year and a half I've been working on building a "policy engine" that manages this company's various business policies... everything from ordering to communications to whatever.

        Well, the ding-dang business users and their minions, the "business analysts", can't see past a month or so... then oops... more functionality... change existing functionality... because "oops... we really need it to do this", to the point where I have to make this a unified system of "one-offs".

        Yeah, ugh... and the idea of a "rewrite" has come up, because right now... the codebase is huge... it's a mess and looks like, well, patchwork. We are trying to get management buy-in... and calling it "upgrading and refactoring", because we know full well that "rewrite" is a dirty word in these parts :)

    • by BinxBolling ( 121740 ) on Thursday January 15, 2004 @02:54PM (#7989266)

      And often, you're mistaken when you think you have a better implementation.

      Here's an experience I used to have somewhat often: I'd be revisiting a piece of code I'd written a few months earlier. I'd think "Wait, this makes no sense. It shouldn't work at all. New approach X is much better." So I'd start refactoring it, and when I'm about 3 hours into the implementation of 'X', I begin to understand why I chose the original solution, and realize it remains the best approach. And so I nuke my changes.

      I don't tend to let that happen so much, any more. Partly I try to better document why I make the design decisions I do, and partly I try to have a little more faith in myself, and partly I stick to the attitude of "Don't fix what you don't empirically know to be broken."

      The point of my story is this: If someone can misunderstand their own design decisions after the fact (and talking to fellow programmers, I'm not the only one with this kind of experience), think how much easier it is to misunderstand someone else's.

      • Re:Design decisions (Score:3, Informative)

        by Bas_Wijnen ( 523957 )

        I know the feeling. But it hardly ever happens anymore, and that is only because I now document every "smart" move I make. If I do something which may look weird, I write a comment about why I don't do it the other way, or that it should have been the other way, but I was too lazy to do it.

        If I see something which looks like it shouldn't work, then I study it, and find out why it does, and document it. Or I study it, find out that indeed it doesn't work in some cases, document and fix it, or document t
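
        A hypothetical illustration of that kind of comment (Python), so the "weird" code explains itself the next time someone reads it:

          def flush(queue):
              # Looks redundant: we sort a *copy* instead of calling queue.sort().
              # Deliberate -- callers hold references into `queue`, and sorting it
              # in place broke them once before. Don't "simplify" this.
              for item in sorted(queue):
                  print(item)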

    • Give 10 software developers the same problem and you are guaranteed to get at least 11 different solutions
    • Maybe times have changed, but when I started my career as a _maintenance_ coder, there were two ways to do it:

      1) The usual way: fix small sections of code in the same style and technique that it was originally written,

      2) rewrite large sections of code that were _truly_ hard to maintain, taking great care to leave something much more maintainable behind. This route requires much more thorough testing than (1).

      I remember another of us "programmers" who said he didn't do maintenance, he was a "development
    • by Salamander ( 33735 ) <`jeff' `at' `pl.atyp.us'> on Thursday January 15, 2004 @03:28PM (#7989826) Homepage Journal

      There are necessary and beneficial rewrites, but the vast majority of rewrites occur because it's easier to write a new piece of code than to understand an old one. Yes, easier. The "rewrite bug" afflicts brash beginners the most and top-notch experienced programmers the least. The best programmers tend to get that necessary rewrite out of the way during initial development, by writing a serious first-cut version, throwing it away, and then writing it a second time for real, all before anyone else even sees it. Such code will often pass unit tests earlier than the "never refactor" code written by second-raters, and rarely requires a rewrite after that.

  • by devphaeton ( 695736 ) on Thursday January 15, 2004 @02:29PM (#7988867)
    This introduces new bugs and abandons all the small fixes and tweaks that made the original version work so well. It also often introduces incompatibilities that break a sometimes huge existing userbase.

    Microsoft has built an entire, successful, multibillion-dollar-a-year business model on top of this!!

    Sheesh.
  • Slashdot (Score:2, Funny)

    In light of the preceding article, I propose that we completely rewrite slashdot! In BASIC! This will provide unsurpassed slowness and crashing, making the world better for all!
  • by Metallic Matty ( 579124 ) on Thursday January 15, 2004 @02:29PM (#7988877)
    The trick is to include all the tweaks and fixes that were implemented in the old code. Obviously, if you rewrite and then leave open all the gaps and problems from the earlier version, there's no point in rewriting. However, you could rewrite with those fixes in mind, and come out with a completely new (and problem-free) edition.
  • Ego? (Score:5, Informative)

    by Undaar ( 210056 ) on Thursday January 15, 2004 @02:30PM (#7988883) Homepage
    I think it may have something to do with programmer ego and something to do with the challenge. I'm guilty of it myself. You find something you're interested in and you want to build it. It doesn't matter if someone else has done it or even done it well before you. The challenge is to do it yourself.
    • Re:Ego? (Score:3, Interesting)

      by CommieLib ( 468883 )
      There may be some of that, I suppose, but speaking as someone who is strongly advocating a rewrite for a new version, it's also a matter of just seeing vastly better ways of doing things now that you've had the benefit of the experience of v1.0.

      Furthermore, all of the features that creep into v1.1, .2, .3, etc. that were not envisioned at all in v1.0 become code barnacles, stuck on to the coherent codebase and clinging for dear life. Better to create a new codebase that incorporates them as part of it.
      • Re:Ego? (Score:3, Interesting)

        by StormReaver ( 59959 )
        "There may be some of that, I suppose, but speaking as someone who is strongly advocating a rewrite for a new version, it's also a matter of just seeing vastly better ways of doing things now that you've had the benefit of the experience of v1.0."

        I've done (and still do) regular rewrites. In my case, it frequently ends up being rewritten because Management didn't accept my initial recommendations on development and/or back-end technology. Their chosen technology, which I am compelled to use, fails to ada
    • Re:Ego? (Score:4, Insightful)

      by globalar ( 669767 ) on Thursday January 15, 2004 @03:39PM (#7989979) Homepage
      How do you learn how something works?

      1) You take it apart (literally the backwards approach, though if you have the time it works).

      2) Read the documentation, learn how to use it, and work with it. (Still will not show you everything, especially with well encapsulated components. And when was the last time documentation, even Google, answered all your questions?).

      3) Build something similar (a variant, clone, emulator, etc.)

      The experience of programming your own components cannot be substituted. Bad, but passable analogy: building a house vs. repairing a house. In the former, you experience the thought process; in the latter, you adapt your thought process (to some degree).

      Also, I think once you see all the work and brilliance that has gone into software you take for granted, you are motivated to build something once with the intention of reuse. To be a forward thinker you have to understand what has gotten us this far and what has to change to get us farther. Experience with what the wheel is made of and why, not necessarily rebuilding it, can provide you with these perspectives.
  • by Kenja ( 541830 ) on Thursday January 15, 2004 @02:31PM (#7988896)
    Slashdotter: Why won't Microsoft just drop the Windows codebase and start over? There are too many problems to fix.

    Microsoft: OK, Windows XP and 2003 have a full rewrite of the TCP/IP stack and security system.

    Slashdotter: Why did Microsoft rewrite the core OS? They just introduced more bugs and lost the stability and security fixes from older versions of the OS!

  • by Anonymous Coward on Thursday January 15, 2004 @02:31PM (#7988906)
    OK, this dude uses Netscape 4.x and thinks it's fast. Next article, please.
    • by Eil ( 82413 )

      4.x is much faster than Mozilla. By a long shot. Its downfall, aside from having unmaintainable source code, was that it was unstable, did not follow any kind of standards, and had a tendency to screw up whatever *should* have worked right. I think Internet Explorer 1.0 is the only browser in existence to beat the general crappiness of Netscape 4.x.

      Give me a slow and bloated, yet stable and standards-compliant web browser over the opposite (Netscape 4.x) any day.
      • Sure - if all you're browsing are pages served from a single domain, consisting primarily of flowed elements (headers, lists, images, and that's about it), with pages that are fairly short.

        Start adding tables and forms, trying to reflow the page when resizing (especially if it's a long one), and prepare for the wait of your lifetime.

        At least Mozilla can display part of a page while the rest renders, and resolve more than one domain name at a time when connecting to resources in parallel.
  • by Viral Fly-by ( 662186 ) <ross@truman.edu> on Thursday January 15, 2004 @02:31PM (#7988909) Homepage
    The minor tweaks, fixes, and changes that made the old version work so well can only go so far. Such is often the nature of code. Tiny fixes and patches are (sometimes haphazardly) hacked on to the code.

    Perhaps if rigorous software engineering and documentation techniques were followed, a full rewrite might not be necessary. However, as long as quick fixes continue to pollute the code and make it more and more difficult to work with, an eventual total rewrite will always be necessary.
    • I used to think so. But I was forced by economics at my current employer [zappos.com] to just keep on hacking. For years I predicted the imminent collapse of our system under the weight of a thousand hacks.

      But it never collapsed. And the functionality and performance has been greatly increased. And we've added five more developers. And we're profitable.

      And because the original design was decent, there have been no catastrophic failures, or impenetrable bugs.

      Sure, we've rewritten many small parts of the system, b
  • by JohnGrahamCumming ( 684871 ) * <slashdotNO@SPAMjgc.org> on Thursday January 15, 2004 @02:32PM (#7988925) Homepage Journal
    I'm sympathetic to the idea behind this article, but does it deserve a place on /.? There's absolutely no empirical data, or even a reasonable example given in the document. The author is talking about IPv6 and Perl6 both of which are unknown quantities at this point.

    He's right that just throwing away old code means you lose a lot of valuable bug fixes; on the other hand, if you look at some code and realize there is a better way, then the solution is to rewrite it.

    Of course you can have it both ways. What you do is write an automated test case for every bug that you fix in your code. When you write the new version it has to pass the old test suite, then you've got new code and all the experience from the old code.

    John.
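
    A rough sketch of the discipline the parent describes (Python; the module and tests are invented): every fixed bug leaves a test behind, and the rewrite ships only once it passes the accumulated suite.

      # test_records.py -- regression suite accumulated from fixed bugs.
      # Run it against the old module first, then point the import at the
      # rewrite; the new code is done only when every old test still passes.
      from recordparser import parse_record  # hypothetical module; swap in the rewrite here

      def test_empty_input():
          # Left behind by a fixed crash-on-empty-input bug.
          assert parse_record("") is None

      def test_trailing_whitespace():
          # Left behind by a fixed whitespace-handling bug.
          assert parse_record("name=joe  ") == {"name": "joe"}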
  • I can see one good reason to rewrite -- when old techniques no longer do a sufficient job. Take Perl 4, for instance. It's simple, yes, but not extensible. You had to write your own libraries of Perl code to do more complex tasks (sirc, anyone?). Then Perl 5 comes out... and introduces package namespaces and modules, plus a lot of cleaner code!

    A few of my perl scripts were just hacks. Patchy hacks that were dirty and buggy. I rewrote one, Anything, to be cleaner. Oh so much better.
  • by kps ( 43692 )
    Ouch! There goes my karma!
  • Untrue (Score:2, Insightful)

    by Shazow ( 263582 )
    Although I have an unhealthy habit of wanting to start things from scratch, I believe it can be a good thing more often than not.

    When you've developed a piece of software, fixed its bugs, and tweaked it, more often than not those fixes and tweaks are nothing more than workarounds for your currently flawed structure. Usually, you don't realize these flaws until AFTER you've created it.

    By starting it from scratch, you can keep your mistakes in mind, and make better and more efficient software.

    Sure, there
  • Wow (Score:5, Funny)

    by Boing ( 111813 ) on Thursday January 15, 2004 @02:34PM (#7988947)
    That article had about the highest flamebait-to-content ratio I've ever seen on Slashdot (and that's SAYING something).

    This oughtta be good. (puts on asbestos-lined pants)

  • by shaka999 ( 335100 ) on Thursday January 15, 2004 @02:34PM (#7988949)
    Your point about ego often driving rewrites is well taken, but in my experience the driving force is often maintainability.

    As a program ages and drifts from its original intent, ugly hacks are often placed on top of the original code to add unforeseen functionality. There is also the opposite effect, where old code sits around that no longer has any function. I remember one drastic case of this when rewriting a program where only about half the code was even being utilized.

    By rewriting the code you clean things up and make it easier for future programmers to understand what the code is doing.
  • by mikewren420 ( 264173 ) on Thursday January 15, 2004 @02:34PM (#7988951) Homepage
    For Windows users, Winamp is probably the best example I can think of. Take a stable, usable, simple and elegant audio player (Winamp2) and fuck it up by rewriting it from scratch (Winamp3), then ultimately abandon that clusterfuck rewrite in favor of yet another rewrite (Winamp5) that fixes what they fucked up with Winamp3.

    I'm mighty happy sticking with Winamp2, thank you very much.
  • ReFactor! (Score:3, Insightful)

    by gbr ( 31010 ) on Thursday January 15, 2004 @02:34PM (#7988952) Homepage
    Don't rewrite. Refactoring code is the way to go. Refactoring in small pieces allows the app to maintain compatibility as the process progresses.
  • Maintainability (Score:5, Interesting)

    by PhxBlue ( 562201 ) on Thursday January 15, 2004 @02:34PM (#7988962) Homepage Journal

    The other side of the rewrite issue is: how long can you continue to maintain code from a legacy system? I worked on a project a couple of years ago that had been migrated from assembler to COBOL and is now being rewritten (as opposed to redesigned) for Oracle. Never mind for a moment that the customers wanted to turn the Oracle RDBMS into just another flat-file system -- which included designing a database that had no enabled foreign key constraints and that was completely emptied each day so that the next day's data could be loaded...

    Some of the fields that are now in the Oracle database are bitmapped fields. This is done because there's no documentation for what those fields originally represented in the assembler code, and because the designers are afraid of what they might break if they try to drop the fields or attempt to map them out into what they might represent. I had the good fortune to get out of the project last August... last I checked, they had settled for implementing a Java UI over the COBOL mainframe UI.

    Anyway, my point is this: at some point, you have to decide whether the system you're updating is worth further updates. Can you fix everything that's wrong with the code, or are there some things you'll have to jerry-rig or just shrug your shoulders and give up on? Under circumstances like what I mentioned above, I truly think you're better off taking your licks and designing from scratch, because at least that way you can take advantage of the new features that more recent database software and programming languages have to offer.

  • Full of shit. (Score:4, Insightful)

    by iantri ( 687643 ) <iantri@gPOLLOCKmx.net minus painter> on Thursday January 15, 2004 @02:35PM (#7988977) Homepage
    This guy is full of shit and has no idea of what he is talking about.

    Some of the better parts:

    - He claims that the Mozilla project and everything Netscape >4 are pointless, and that Netscape 4 "just works". We all know that Netscape 4 is an awful, crashy, buggy, standards-breaking piece of crap that set the Internet back years.

    - He claims that Windows XP was a complete rewrite. Windows XP is NT 5.1 (check with ver if you want) -- Windows 2000 with the PlaySkool look.

  • When is "good enough" enough? ... This introduces new bugs and abandons all the small fixes and tweaks that made the original version work so well. It also often introduces incompatibilities that break a sometimes huge existing userbase. Examples include IPv4 vs IPv6, Apache, Perl, Embperl, Netscape/Mozilla, HTML and Windows.

    I don't get it. Windows 95 was a piece of crap. Are you saying they should have extended that codebase, instead of developing Windows NT onward, into WinXP? Basically, you're *compl
  • Fluff Article (Score:5, Insightful)

    by SandSpider ( 60727 ) on Thursday January 15, 2004 @02:36PM (#7988983) Homepage Journal
    Okay, so most of the article consists of, "Here's software X. They re-wrote it, and now it's not as good or as accepted. Why'd they do that? They suck."

    Software is re-written for many reasons. Sometimes it's ego, sometimes it's for fun, but usually it's because you take a look at the existing codebase and what you want to do with it in the future, and you decide that it's going to cost a lot less to implement the future features by re-writing and fixing the new bugs than to work around the existing architecture.

    I've had to make the re-write or extend decision more than once, and it's rarely a simple decision.

    What I would have preferred from this article is some interviews with the people responsible for the decision to re-write, and what their thinking was, as well as whether they still agree with that decision or would have done something differently now.

    =Brian
  • Structural bugs (Score:3, Interesting)

    by GrouchoMarx ( 153170 ) on Thursday January 15, 2004 @02:36PM (#7988988) Homepage
    Sometimes, rewriting from scratch is necessary to remove bugs. Not all bugs are just failure to check a buffer overflow, which can be fixed without a complete rewrite. Sometimes your basic communications architecture in the program is fundamentally flawed and insecure. At that point, by the time you've fixed that bug in the existing codebase it would have been easier to start from scratch and make everything else work "natively" on the new internal standard.

    Take for example, Windows. ;-) There are fundamental security issues with all GUI windows operating in the same user space. If one is compromised, they're all 0wnz3d. That's a reasonably major flaw, but to fix it would require essentially rewriting the entire GUI portion of Windows, because it's so integral to the system. To try and fix that without a rewrite would be harder and more complicated than chucking it and starting from scratch, and probably introduce a dozen other bugs in the process.

    Sometimes you really do need to throw out the baby with the bath water, if the baby is that dirty. Besides, making a new one can be fun! :-)
  • by mitchner ( 524884 ) * on Thursday January 15, 2004 @02:36PM (#7988992) Homepage
    Joel on Software has covered this point in a good article: http://www.joelonsoftware.com/articles/fog0000000069.html [joelonsoftware.com].
  • Who is this guy? Rewrites done properly and for the correct reasons are great. One would think that after seeing everything Microsoft spews out, people would realize heaping code on top of code isn't good. For starters, it is unlikely that you have the same programmers working on a codebase after more than a year or two, and most code isn't exactly well documented. So you see duplications. As for getting rid of tweaks? Huh? Where does he get this stuff?!?!? Then again, if you read the article, he still uses Ne
  • There is only "good enough for now". Rewriting software is good from several standpoints:

    1. It may be possible to optimize slower areas of the code
    2. It may be possible to take umpteen patches and merge them into the regular code flow
    3. It may be possible to move modules outside the main code base
    4. etc., ad infinitum
    X. it may be billable as a new product (for commercial software only)

    Anyone who sits back and says a given codebase is "good enough for now" needs to be consigned to the scrapheap of history and
  • by billnapier ( 33763 ) <napier&pobox,com> on Thursday January 15, 2004 @02:37PM (#7988997) Homepage
    It was too messy and unmaintainable. I'll wait until the rewrite comes out to fix all the grammer and spelling bugs.
  • As a video game developer, I've been involved in many "code upgrades", as well as rewrites. As long as the rewrite is being done by people who wrote the original code, and they invest time in some preproduction carefully thinking through what they did right and wrong, the rewritten product will always be faster, more stable, easier to maintain, etc. etc. In the end it's always been a clear winner.
  • Joel Spolsky (Score:5, Informative)

    by Boing ( 111813 ) on Thursday January 15, 2004 @02:37PM (#7989010)
    Joel of Joel on Software [joelonsoftware.com] has written a much more insightful and useful (IMO) analysis of the motivations and fallacies behind code rewrites.

    Things You Should Never Do, Part I [joelonsoftware.com]

  • by SerialHistorian ( 565638 ) on Thursday January 15, 2004 @02:39PM (#7989041)

    Rewrites are 'bad' from a management point of view (at least for a manager who isn't familiar with software development), which looks at return on investment (ROI).

    However, from a developer's point of view, a partial or complete rewrite is sometimes the only way to FIX certain bugs. While it may introduce new, small ones, usually developers are smart enough to read the old code and learn from its mistakes before they do a rewrite.

    A partial or complete rewrite is ALSO sometimes the only way to fix 'spaghetti code' -- code that's become so tangled from patch upon patch being applied to it that it's now impossible to trace and fix a bug. If spaghetti code isn't pursued and rewritten on a regular basis (this is 'constant improvement' -- a management buzzword from the past few years that actually works), new bugs can be inadvertently introduced -- and it can sometimes take weeks to hunt down an intermittent bug by tracing spaghetti code. Ladies and gents, WEEKS of programmer time is expensive compared to one programmer spending 8-10 hours per week tracking down bad code in the codebase and rewriting it.

    Really, there's a case for doing rewrites on a constant basis. The author should have instead addressed adequate testing in software development environments...

  • Rewrites are Good (Score:3, Insightful)

    by RichiP ( 18379 ) on Thursday January 15, 2004 @02:40PM (#7989058) Homepage
    As a software designer, developer, programmer and user, I have to say that rewrites done right are A Good Thing(TM). When I do a rewrite, it is with the intention that it be better than the old one. I only do rewrites when a limitation of the old codebase has been reached or can be foreseen to be reached.

    When a rewrite is to be made, it goes without saying that anything learned from previous development should also be applied to the newer project. If you can't learn from the mistakes of the past, don't do a rewrite.

    It is not rewriting, per se, that is the problem. It is choosing WHEN to do a rewrite. Unless there is sufficient reason to do one (i.e., old code hard to maintain, scalability problems, old code reaching its maximum potential, etc.), of course one should stick to improving the existing code. If, however, the reason is so that "we could have something new", or so that "we could say we did a rewrite", or "I'm the new architect around here. Scrap the old code and write my design", then of course rewrites might be more trouble than they're worth.

    All common sense.
  • by hcg50a ( 690062 ) on Thursday January 15, 2004 @02:41PM (#7989069) Journal
    From the Perl 6 development [perl.org] webpage:

    "The internals of the version 5 interpreter are so tangled that they hinder maintenance, thwart some new feature efforts, and scare off potential internals hackers. The language as of version 5 has some misfeatures that are a hassle to ongoing maintenance of the interpreter and of programs written in Perl."

    For me, this is a necessary and sufficient condition for rewriting something.

    Another one is: When changing the original will take longer than rewriting from scratch.
  • by Enahs ( 1606 ) on Thursday January 15, 2004 @02:42PM (#7989081) Journal
    Enlightenment DR17.
  • by melted ( 227442 ) on Thursday January 15, 2004 @02:42PM (#7989094) Homepage
    Every successful piece of software I've ever worked on was rewritten at least once, by the same team (or by myself on private projects) in the process of development, fully or at least partially.

    The fact of the matter is, even if you hire an expensive architect and have him do a good job, he's not a god. When you develop software, some parts of it tend to become ugly as heck, and you can't help but think about how to do the same thing better and/or with less effort, so that it won't become a PITA to run, maintain, improve and extend. When you reach critical mass, you become "enlightened", throw some shit away and rewrite it to save time later on. In all cases where I've seen it done, I think it was worth the extra effort. I also think re-engineering code as you go saves money long-term if it's done reasonably.

    All of this, of course, doesn't apply to those who start their separate standalone projects even though there are dozens of other reasonably good projects to contribute to (and maybe rewrite some parts of). Freshmeat.net is full of examples.
  • bizarre (Score:3, Insightful)

    by cultobill ( 72845 ) on Thursday January 15, 2004 @02:44PM (#7989126)
    While the article is a good rant, it's just wrong sometimes. For instance:

    * He says that IPv6 uses 64-bit addresses. In reality it uses 128-bit addresses. You would think that, if you were explaining why something was bad, you'd do some basic research.

    * Also in the IPv6 stuff, "TCP/IP works pretty well". So? TCP/IPv4 and TCP/IPv6 are the same damn thing. That's not an argument against IPv6, it's an argument for knowing what you're talking about.

    * Perl. Sorry, the reasons for moving to the model in Perl 6 are well documented and sane. There are some problems with Perl 5 that we can't get around without losing backwards compatibility (syntax braindamage, for instance).

    * Mozilla. Ok, it's slow. The Mozilla team even admits it at this point. MozFirebird is better. The reason for starting fresh wasn't speed, it was because the old codebase sucked.

    * HTML. Having a language for both layout and data sucks. Splitting it into 2 parts is much better. There are developer perks, too (no rewriting the website to make it look different, no playing with layout to add data).

    The basic point he seems to be missing is: a major version change (1 to 2) is supposed to be a radical update. The version system used by the kernel (and a lot of OSS projects) is based on that. Major.minor.revision. Bump revision when making bug fixes, bump minor when adding features (without breaking too much API), bump major when it's something new altogether.
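
    Read literally, that convention is mechanical enough to sketch (Python, purely illustrative):

      def bump(version, change):
          # major.minor.revision, as described above.
          major, minor, revision = map(int, version.split("."))
          if change == "bugfix":        # bug fixes bump the revision
              revision += 1
          elif change == "feature":     # compatible features bump the minor
              minor, revision = minor + 1, 0
          elif change == "rewrite":     # something new altogether: new major
              major, minor, revision = major + 1, 0, 0
          return "%d.%d.%d" % (major, minor, revision)

      assert bump("2.4.9", "bugfix") == "2.4.10"
      assert bump("1.5.3", "rewrite") == "2.0.0"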
  • BULLSHIT! (Score:3, Insightful)

    by xutopia ( 469129 ) on Thursday January 15, 2004 @02:45PM (#7989145) Homepage
    Right now the people who still use Netscape 4 should be hung upside down by their little toes, whipped with a chainsaw and burned with acid dripping on their genitals.

    Netscape 4 is horrible. Its usage is actually slowing down adoption of Mozilla and other far superior browsers. Once we start creating web sites with standards, rather than with code that just looks like HTML, we'll have smaller browsers that can do things much faster than Mozilla can today. Indeed, Mozilla isn't just one browser but multiple browsers, one for each of the F'ing crappy implementations of HTML there have been. Just look at the page this article is on. It's laden with mistakes and isn't even standard HTML 4.0!

    This guy would prefer to see the net stop growing than see some change so he doesn't have to rewrite some stuff. Lazy ass.

    • Sorry bud (Score:3, Informative)

      by Kombat ( 93720 )
      I have a 400 MHz PC at home. Netscape 4.7 runs acceptably fast. Mozilla is a hog. So I'm sticking with Netscape 4.7.

      It's also useful to have that browser around when doing web development, to ensure that my sites look OK in the older browsers. There are still a lot of Netscape 4.7 browsers floating around out there.

      That said, I use Mozilla on both my 1.8 GHz laptop and my 2.0 GHz work PC.
  • by Lumpy ( 12016 ) on Thursday January 15, 2004 @02:46PM (#7989160) Homepage
    Three years ago I wrote a Perl-based web app that is currently the heart of one of the tasks my company does. I am in the process of completely rewriting it in PHP, using no code or concepts from the first iteration in the new release.

    Why? I have better ways of doing things now; I need it to scale to a worldwide company instead of being simply a regional tool; and I want to increase speed, usability and stability.

    A rewrite is the only way to achieve these things. Anyone who has been with a project for an extended period of time and had to expand/modify it beyond its original capabilities knows this.
  • Firebird (Score:3, Insightful)

    by sofakingl ( 690140 ) on Thursday January 15, 2004 @02:46PM (#7989165)
    Don't like Mozilla? Use Mozilla Firebird. Honestly, I can't think of any browser I've used that is better than Firebird (especially with the addition of extensions). Firebird should be enough proof to this guy that Mozilla was a step in the right direction.
  • by seanmeister ( 156224 ) on Thursday January 15, 2004 @02:53PM (#7989251)
    The Problem: Rewrite Mania
    Waaaaaaa!!

    Case 1: IPv4 vs IPv6
    Waaaaaaa!

    Case 2: Apache 1.x vs Apache 2.x
    Waaaaaaaaaa!

    Case 3: Perl 5.x vs Perl 6
    Waaaaaaaaa! Waaaaaaaaaaa!

    Case 4: Embperl 1.x vs Embperl 2
    Waaaaa!

    Case 5: Netscape 4.x vs Mozilla
    Waaaaaaaaa!

    Case 6: HTML 4 vs XHTML + CSS + XML + XSL + XQuery + XPath + XLink + ...
    XML is hard! My HTML for Dummies book weighs too much! Waaaaaaa!

    Case 7: Windows 2000 vs Windows XP vs Server 2003
    Waaaaaaaa!

    Conclusion: In Defense of "good enough" and simplicity
    Waaaaa waaaaaaaaa!
  • by crush ( 19364 ) on Thursday January 15, 2004 @02:56PM (#7989310)
    • Case 1: IPv4 vs IPv6 - IPv4 with NAT presents all sorts of complications and problems when trying to implement IPsec. Not least are the problems of NAT traversal, which have only partial solutions available (the NAT-T patches for linux-2.4, integrated into linux-2.6) and which can't do AH. Even if this is sorted out, there are still problems with MTU sizes leading to fragmentation as a result of protocol headers being layered within each other. IPv6 avoids these problems.
    • Case 3: Perl 5.x vs Perl 6 - Perl6 development is tied to the fact that the interpreter was hard to maintain [perl.org] due to the structure of Perl5. So, in order to maintain the interpreter and continue to make incremental developments (as the author wishes) it's necessary to clean house. There is also the desire to keep developers (people that can make improvements) involved by allowing interesting new ideas to be incorporated (such as Parrot: a multi-language interpreter)
    • Case 5: Netscape 4.x vs Mozilla - Well, I run Mozilla on a 466MHz Celeron with 196MB RAM, and I don't see the speed difference the author talks about.
    • Case 6: HTML 4 vs XHTML + CSS + XML - There's no question that CSS and XHTML make webpage authoring easier and cleaner. Thank god for these developments. The author makes no substantive argument here.
  • Give and take (Score:3, Interesting)

    by j-turkey ( 187775 ) on Thursday January 15, 2004 @03:00PM (#7989373) Homepage

    I'm not sure that the author of the story really discusses the give and take of patching an old codebase, vs a complete rewrite. Instead, he focuses on a negative that isn't really there.

    As soon as I read the headline, the first apps that sprang to mind were Sendmail, and WuFTPD. Both have been historically full of holes, and a complete mess. I haven't really looked at Sendmail code, but having to configure each option with regular expressions, while powerful, is just lame (IMO). The WuFTPD code is a mess. It's been passed on and passed on, and patched and patched. It eventually became a total whore that nobody really wanted to touch on any level.

    Now, both of these (AFAIK) were not rewritten from scratch, and suitable replacements have been produced all over the place. However, would it have been so bad to rewrite them from scratch while still maintaining the older versions? How would that be any different from, say, the Linux kernel? I run 2.4.x on my production machines. 2.6 is out, but I'm not going to run it until it's proven itself elsewhere (and is integrated into a mainstream distribution). 2.4 will be maintained for a long, long time -- and it's not even a complete rewrite (AFAIK). Code rewrites are usually adopted by the public... not right away, but eventually.

    Finally, his gripe about Mozilla/Netscape is interesting, but not really warranted (and he does acknowledge this). The applications became more bloated as system resources became more plentiful. Software tends to do this -- it has to do with greater layers of abstraction as hardware gets better. But furthermore, it's because Mozilla had to "compete" with the latest and greatest from Microsoft... which MSFT will always be updating as new standards are added.

    The point is, it doesn't really matter. It doesn't do a disservice one way or the other, and since much of the software we're talking about is Free Software, it matters even less: the code is out there, and if there are enough people using the older versions, there will always be someone to maintain them.

  • by squarooticus ( 5092 ) on Thursday January 15, 2004 @03:02PM (#7989418) Homepage
    Exactly when did Netscape ever work well on Linux?

    All I remember is consistent crashing from Netscape Gold through the finally-put-down Netscape 4.x. It was the biggest piece of shit browser ever written precisely because its codebase was old (forked from NCSA Mosaic in 1994, which itself was much older) and non-extensible, yet more and more shit was thrust into it. It had to be rewritten, and all the Gecko-based browsers have been much more feature-complete and reliable for the past 2-3 years than Netscape ever was.

    I use Galeon, and the thing basically never crashes. Back in 1999, I considered myself lucky if a particular version of Netscape 4.x only crashed once every half-hour.
  • What about Gnome? (Score:3, Insightful)

    by Anonymous Coward on Thursday January 15, 2004 @03:06PM (#7989485)
    The Gnome desktop environment is a prime example of disasters through re-writes.

    As we all know, Gnome's original purpose was to provide a free rival to KDE, which was the first easy-to-use desktop environment for Linux; this was back before Qt was GPL.

    Unfortunately for Gnome, its problems started as it kept replacing and rewriting core components. For example, it started out with the Enlightenment window manager, then switched to Sawfish, then to the buggy and slow Metacity. Metacity has had many problems, and most people want the old Sawfish back, but Havoc Pennington refuses and insists that people use Metacity.

    The file manager keeps changing too. First it was GMC, then the slow and buggy Nautilus from the now-defunct Eazel corporation; now they are writing a new Windows 95-like file manager for Gnome, the spatial Nautilus.

    It also rewrote the graphics toolkit, GTK, breaking compatibility with GTK 1.x. There are many legacy GTK apps still in wide use, and they look ugly on newer desktops.
    There are also the many problems with the file dialog, a replacement for which is only now emerging in GTK 2.4. It too is incompatible with older GTK versions. This means that if you want to use a new program, you HAVE to upgrade to Gnome 2.6 and can't keep your legacy Gnome 2.0/2.4 desktops.

    They keep switching default apps, for example, Galeon was dropped in favour of the buggy and far less featureful Epiphany in 2.0. They also dumped several other applications that were useful.

    To make matters worse, Gnome is moving away from the old philosophy of simple text files and is using an XML-based registry clone to configure things. KDE keeps the text file format underneath and has a standardized API for it.

    It also lacks true integration; Miguel de Icaza has PUBLICLY ADMITTED that Bonobo was a failure. KDE has had this BUILT IN from day one using KParts technology, which is now being used in Apple's Mac OS X Panther.

    Gnome developers, realising they kan't kompete with KDE technology, have spread various FUD about KDE, but the message is getting through. Red Hat has abandoned its Gnome desktops; Fedora developers are working hard to make KDE 3.2 the default desktop for Core 2. Debian, traditionally pro-Gnome, has announced full support for KDE, and they are working hard to make KDE the default desktop for

    KDE, on the other hand, has kept consistent technology and has changed very little internally since 2.0. Distros like Lycoris are still using 2.x because it is very stable and mature. KDE 3.2 will be a good example of why maturity, not wheel-reinventing, is the better idea overall. They have taken their technology and optimized it for usability.

    Gnome 2.6 will need more than just propaganda about the HIG if it is going to get the attention it needs, but instead it looks like they are reinventing wheels again.
  • A similar article... (Score:3, Interesting)

    by slamb ( 119285 ) on Thursday January 15, 2004 @03:09PM (#7989553) Homepage

    Here's a much better article with a similar thesis: Joel on Software - Things You Should Never Do, Part I [joelonsoftware.com]

    There are parts of it that I've never agreed with:

    "Well," they say, "look at this function. It is two pages long! None of this stuff belongs in there! I don't know what half of these API calls are for."

    [...]

    Back to that two page function. Yes, I know, it's just a simple function to display a window, but it has grown little hairs and stuff on it and nobody knows why. Well, I'll tell you why: those are bug fixes. One of them fixes that bug that Nancy had when she tried to install the thing on a computer that didn't have Internet Explorer. Another one fixes that bug that occurs in low memory conditions. Another one fixes that bug that occurred when the file is on a floppy disk and the user yanks out the disk in the middle. That LoadLibrary call is ugly but it makes the code work on old versions of Windows 95.

    This should never happen! If you have all these bugfixes in your code and no way to know why they were put in, you've screwed up badly. You should have each one documented in:

    • a bug number in the database
    • a log message in your commit history (cross-referenced to the bug database, and easy to pull up with "cvs annotate" or similar)
    • if it's particularly weird-looking, a comment in the code

    So the idea that you'd have all these important bugfixes without any way of knowing what they are should be laughable! Given a codebase like that, you probably would be better off throwing it out, because it was clearly developed without any kind of discipline.
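
    As a hypothetical example of what that discipline looks like at the code level (Python; the bug number is invented), the weird-looking fix carries its own paper trail:

      import importlib

      def display_window(title):
          # Fix for bug #1042 (also recorded in the tracker and in the commit
          # message that cites it): some installs lack the optional GUI
          # component, so import lazily and degrade instead of crashing.
          try:
              tk = importlib.import_module("tkinter")
          except ImportError:
              print("no GUI available, skipping window: %s" % title)
              return
          root = tk.Tk()
          root.title(title)
          root.destroy()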

    Also, he's embellishing a lot. If it's just "a simple routine to display a window", it doesn't need to load a library, require Internet Explorer, etc., and thus can't possibly have bugs related to those things. He makes the situation sound a lot more extreme than it really is.

    But in general, I think he's right. Refactor, don't rewrite. That's the same thing the XP people say to do. They also have extensive unit tests to make it easier to refactor with the confidence that you haven't screwed anything up. Which can help in situations like this [joelonsoftware.com]:

    I laughed heartily as I got questions from one of my former employees about FTP code that he was rewriting. It had taken 3 years of tuning to get code that could read the 60 different types of FTP servers; those 5000 lines of code may have looked ugly, but at least they worked.

    Ugh. I bet it would have taken a lot less tuning if there were a decent way to test that the change to support server type #60 hadn't broken any of the previous 59 -- or that a mere refactoring hadn't broken any.
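
    A sketch of what that "decent way to test" might look like (Python; the listings and parser are invented stand-ins for the real 5000 lines): keep one canned response per known server type, and run the whole table on every change.

      # One canned LIST line per known server type (two shown; the real table
      # would have an entry for each of the 60 types in the anecdote).
      CANNED_LISTINGS = {
          "unix": "-rw-r--r-- 1 ftp ftp 120 Jan 15 2004 readme.txt",
          "dos":  "01-15-04  02:28PM                  120 readme.txt",
      }

      def parse_name(line):
          # Toy stand-in for the hard-won parsing code being refactored.
          return line.rsplit(None, 1)[-1]

      def test_all_server_types():
          # Supporting server type #60 must not break types 1 through 59.
          for style, line in CANNED_LISTINGS.items():
              assert parse_name(line) == "readme.txt", style

      test_all_server_types()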

    I don't think this advice always applies, though. I rewrote one major project from scratch at work: our personnel system. Our database schema was hopelessly denormalized and broken. That's not something you can refactor easily -- with a widely-used database schema, it's easier to make one big change than many smaller ones, because a lot of the work is just hunting down all the places that use it, and that's easier to do once. So I believe there are situations where this advice does not apply, but I also believe they are rare.

  • by astrashe ( 7452 ) on Thursday January 15, 2004 @03:10PM (#7989566) Journal
    It's hard to write code that is robust enough not to need rewrites. The ability to do that is what separates the really good programmers from amateurs like myself. It's the difference between being a piker and an engineer.

    I'm not a great programmer, and don't do it regularly, but when I have written fairly big projects, I find that the need for rewrites came out of poor design choices that I had made.

    I typically start out with something small, that can handle the core functionality expected from the project. Then I try to add features and fix bugs.

    Eventually, the code becomes very difficult to maintain, and ultimately, you get to the point where the ad-hoc architecture simply won't support a new feature.

    To the user, everything looks fine, everything runs reliably, but under the hood, there are real problems.

    My worst experience was with a web app. I started out with script-based pages in ASP (not my call), and kept writing new pages to do different things. It got to the point where I had about three hundred script pages and lots of redundant code.

    When it would become necessary to change the db table structures for another app hitting the same data, I'd have a lot of trouble keeping up, fixing my code quickly in a reliable way.

    The problem was that it just wasn't possible to stand still. I couldn't go to my boss and say, "I need a three month feature freeze, to rewrite this stuff."

    Writing a new version in parallel was hard because maintaining the crummy but functional code was taking more and more time. It was a real problem, and caused me a fair amount of pain, and suffering.

    After digging myself into that hole, I stepped back and tried to figure out how other people did it. I would have been a lot better off building on top of something like struts.

    The lesson I took from this is that it's important to study design patterns, and to use tested frameworks whenever possible. You have to think like an engineer, and not someone who codes by the seat of his pants. I'm not an engineer, so it's not easy for me to do that.

    I'm not saying that the people who run the projects mentioned are in the same boat that I was. As programmers, they're in a different league.

    But they're often working on problems that aren't well understood. Patterns and frameworks are ways to leverage other people's experiences. But if that experience doesn't exist, you have to guess on certain design decisions, and see how it comes out.

    Top notch programmers are obviously going to guess a lot better than someone like me will. But they're still going to make mistakes. When enough of those mistakes pile up, you're going to need to do a rewrite.

    You could make the opposite point to the one the article makes by looking at the Java libraries.

    They made choices with their original AWT GUI toolkit that were just wrong. They weren't dumb people -- they just didn't know; the experience necessary to make the right choice simply didn't exist. Once they tried it, they realized it wasn't working, and they came back with Swing.

    Rewrites are always going to be necessary for new sorts of projects, because you can't just sit in your armchair and predict how complex systems will work in the real world. You have to build them and see what happens.

  • by Waffle Iron ( 339739 ) on Thursday January 15, 2004 @03:12PM (#7989605)
    When I feel that a program is in need of a structural overhaul, I don't just throw it away. I usually comment out all of the source code, back down to a blank 'main()' function. Then I build it back up piece by piece using a cleaner design. However, instead of writing a lot of new code, I uncomment and tweak the bits of the commented-out old program that worked fine. I usually find that huge swaths of the original program need little or no work to adapt to the new architecture.

    Once all of the old code has been either pasted back in, revised or deleted, I've usually got a program that does everything the old one does and more, but it is smaller, simpler and cleaner.

    Most of the subtle features and knowledge embedded in the old code are not lost with this approach; they get pulled back in.

  • by ReadParse ( 38517 ) <john@nOSPAM.funnycow.com> on Thursday January 15, 2004 @03:19PM (#7989704) Homepage
    The story says, in part...

    "Examples include IPv4 vs IPv6, Apache, Perl, Embperl, Netscape/Mozilla, HTML and Windows"

    All props to IPv4 and all, but I don't think it stands a chance against all of those put together (even with Windows on their team).

    RP
  • Lousy Examples (Score:4, Insightful)

    by avdi ( 66548 ) on Thursday January 15, 2004 @03:25PM (#7989787) Homepage
    Most of the examples given needed rewrites to remain viable. It's easy to look at a package from afar and declare it "perfectly sufficient". Things look different when you have to work with a system daily. In particular, rewrites often address shortcomings in a system's capacity for extension. Just compare the number of third-party extensions available for Netscape 4.* vs. the number now available at mozdev.org for Mozilla and Firebird.

    A bigger problem, to my mind, is when a half-dozen projects with the noble intention of replacing an aging kludged-up tool are started, all of which suck in different ways, and none of which learn from each other. And then they lose momentum and stagnate.

    Examples? Most programmers agree that "make" is overdue for replacement, but despite many attempts (cmake, jam, cons, ant) no one has managed to come up with one that is compelling enough to catch on. CVS is a crufty mess, but none of its potential replacements are mature enough, or have the kind of widespread tool support, to make much of a dent in CVS installations. And there are dozens of written-from-scratch applications which differ primarily in the GUI toolkit they are based on, and which would be better apps if they incorporated the best features from all into a joint effort. My idea of the perfect browser combines features of Konqueror, Galeon, Epiphany, Firebird, and Safari.
  • Throw one away (Score:3, Insightful)

    by cdunworth ( 166621 ) on Thursday January 15, 2004 @03:29PM (#7989840)
    It's really common to build something, step back, examine its warts, and start over again with a new perspective and understanding. It's called prototyping. Some people actually build the first one with the intent of throwing it away. Others release it as v1.0, and introduce issues of the kind this author is referring to.

    There are many reasons you might prefer a rewrite. The main one, to me, is that complicated applications contain layers and dependencies, not all of which are obvious to a new programmer. If, after some analysis, your assumptions about these dependencies are wrong, you'll break the original code faster than you can say "global variable". In the end, you could easily spend more time and effort patching and praying than you would rebuilding from the ground up.

    Of course, if some of the original architects are still involved in the project, architectural knowledge and assumptions can be transferred to new programmers in a fairly fluid way, and I suspect it is in these cases that you can confidently add on to an existing code base.

    And it's always helpful if the previous programmers were actually good programmers who wrote code and comments mindful of those who might follow them later. But that's not within your control.
  • by stevew ( 4845 ) on Thursday January 15, 2004 @03:29PM (#7989845) Journal
    I've had the good fortune to be affiliated with the Icarus Verilog compiler/simulator effort over the last 3-4 years. The first version of the code had some specific design decisions that made scaling simulations beyond a thousand or so gates impossible.

    The author chose to throw out his simulation engine and much of his code generation and adopt a completely new model. It took him the better part of a year to get roughly where he was with the original code base as far as functionality is concerned. He also has a regression environment with several hundred tests that he uses regularly to let him know how he is doing with respect to functionality. About 2 1/2 years into the rewrite period, Icarus is now handling behavioral code of a million gates at about 80% of the performance of commercial tools!

    Was the rewrite needed? YES! Did it take a while? YES! Was it worth the wait? YES!
  • Perl 6 v Perl 5 (Score:4, Informative)

    by Yrd ( 253300 ) on Thursday January 15, 2004 @03:52PM (#7990220) Homepage
    How is it possible to so completely miss the point of Perl 6? The intent is not necessarily to replace Perl 5 - Perl 5 is fantastic, and the Perl 6 developers know this above all people. Perl 6 is perhaps best thought of as a DIFFERENT LANGUAGE which will 'just happen' to be, in many places, very similar/identical to Perl 5.

    Once you start thinking of Perl 6 in that manner, you realise what it's for. It's not to replace all of the Perl already out there. It's to provide a new tool, a new language for doing new things in, drawing on the experience gained in years of working with Perl 5 and other languages.

    Ponie, of course, is part of the effort to make sure that at least some of the vast amounts of Perl 5 code is usable with Perl 6, should programmers wish it. And even that's not a total rewrite of the existing Perl codebase.

    So ultimately, that article has nothing of use in it. Yes, programmers should be careful what they rewrite and when they rewrite it, but many times such things are actually worth it. GTK+ 2, anybody?
    • Perl 6 is perhaps best thought of as a DIFFERENT LANGUAGE which will 'just happen' to be, in many places, very similar/identical to Perl 5.

      Sort of how tea can be a substance almost, but not quite, entirely unlike tea.
  • by bigpat ( 158134 ) on Thursday January 15, 2004 @03:58PM (#7990313)
    This seems common in other areas of engineering too. Bridges could just be retrofitted and buildings added on to, but sometimes there are too many unknowns in engineering old structures: Are the building materials made with asbestos? How has the structure held up after so many years? Have other modifications extended or complicated further changes beyond anything the original plans called for? Sometimes the unknowns themselves justify building from scratch. Sure, we could just keep tacking new technologies onto old, but the result will seldom be better. More often the real advancement comes from taking the knowledge gained from past experiences and applying it to something new, rather than taking old work and trying to make it fit a new situation.

    Would you really want horses running on a treadmill attached to the front of your car, just because humanity wouldn't want to throw away its previous investment in transportation technology?

  • by mypalmike ( 454265 ) on Thursday January 15, 2004 @03:59PM (#7990333) Homepage
    Well, it was professionally written, except for the use of phrases and words like "pain in the ass", "jackass", and "asshole".
  • by pclminion ( 145572 ) on Thursday January 15, 2004 @04:05PM (#7990418)
    I find myself constantly rewriting any code that I have complete control over. Code I write for my employer evolves continuously, but personal code for my own enjoyment is constantly getting axed and redone.

    Having done this for years, I think I'm starting to figure out why I do it, and perhaps someday I'll be able to stop myself from doing it, so that I can actually release something :-)

    I think the need to rewrite is more emotional than intellectual. As I work on an existing codebase, I notice the little bumps and warts on it, the little "tweaks and fixes" which make it work, and I find them ugly. For some reason, I place the highest aesthetic value on code that was written in one big, flowing session, where the entire structure was understood from the beginning, and the entire thing looks like it was born fully-formed from some supernatural source.

    In an ever futile attempt to realize this goal, I constantly chuck out perfectly good code and redo it from scratch. I do this because I seek the emotional experience of those few times when I really do sit down and blast out something that's beautiful, elegant, and functional. Even if, practically, it's no better than before.

    Open source programming is often described as scratching an itch. It should be immediately apparent why this correlates to extensive rewriting of code. Some problems are simply enjoyable to solve. The necessary thinking feels good. Just as we watch a good movie again and again even though we've got the plot memorized, some programmers want to rewrite the same functionality repeatedly because it just feels good.

    To hell with practical considerations, like whether or not that's "bad" for the codebase. I program for pleasure.

  • Opera 3,5,6,7 (Score:3, Informative)

    by danila ( 69889 ) on Thursday January 15, 2004 @04:50PM (#7991093) Homepage
    Another great example not mentioned by anyone yet is the excellent Opera Internet browser. It isn't always rewritten from scratch, but overall there are enough changes in each new major version to make it almost unusable, at least to me. Every time a new version (3.0, 5.0, 6.0, 7.0) is rolled out, many little things no longer work as they did, and sometimes they are clearly and unequivocally broken.

    Before I knew better, I used to download the release versions (not betas or RCs), but each and every time I ended up uninstalling the new version and switching back. It usually took more than a month and about 10 updates for a new version to reach relative maturity; witness 3.21, 6.05 and 7.20 - only these versions could be considered better than their predecessors in all respects. With version 7 I succumbed at about 7.1, but next time I really will know better and not even consider Opera 8 until there has been a month without updates. :)

    On a more serious note, I think there is a moment of maturity in nearly every product's lifetime: a moment when new features can no longer justify an upgrade (other things, such as compatibility, being equal).
  • by Trejkaz ( 615352 ) on Thursday January 15, 2004 @05:56PM (#7991908) Homepage

    I sent this reply to the author through the site, but it would probably get some use here too.

    "The Web was based on the idea that a simple markup language could allow us to divorce document presentation from document structure"

    Which HTML 1.0 through 3.2 didn't really achieve, admittedly...

    "Some of the changes to HTML were done in a way that shouldn't break old browsers, but as I said before, I am increasingly seeing websites that don't render properly in Netscape 4.x"

    There's a shock. I thought it was 2004, yet you're still testing on a browser that is at least three major revisions old; never mind that Mozilla itself seems to be more useful than Netscape's rebadged browser.

    "So apparently the FONT tag is deprecated - now we have to use style sheets and whatnot to do something that was originally very simple"

    This is because "the web was based on the idea that a simple markup language could allow us to divorce document presentation from document structure", and the FONT tag is presentation appearing in the document structure. That's sort of like a divorce where the couple still sleep with each other.

    "but at the expense of being able to do simple things quickly."

    I beg to differ. Even if I really want to break style guidelines and make a chunk of text red for no particular purpose, it still takes the same amount of time to type <span class="red"> as it did to type <font color="red">. Never mind that this really is a bad thing to do. Why is it red? Is there a meaning to the red? Perhaps it should be <span class="important">, in which case why not just use <strong>?
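    To make that concrete, here's a minimal sketch (the class name and the sentence are invented for the example). The stylesheet defines the presentation once; the markup only says what the text is:

        /* stylesheet: the presentation, defined in one place */
        .important { color: red; font-weight: bold; }

        <!-- document: the structure; no presentation in sight -->
        <p>Invoices unpaid after 30 days are
          <span class="important">subject to penalty</span>.</p>

    Change that one CSS rule and every "important" span on the site follows along; with FONT you'd be hunting down each tag by hand.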

    "As a Web developer I have long wondered why they didn't add more types to the INPUT form tags to express different types - for example, a DATE attribute, or INTEGER, DOUBLE, or whatever."

    Of course XHTML 2.0 will be partnered with XForms, which attains this functionality insofar as any field which can store a value can be of an XML Schema type. This includes -- wait for it -- dates, integers, doubles, and arbitrary regular expressions.
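    For instance, a rough sketch of a typed date field in XForms 1.0 syntax (the xforms: elements and the xsd:date type are per the spec; the instance data and field names are made up):

        <!-- in the document head: the data model, with a typed field -->
        <xforms:model>
          <xforms:instance>
            <order xmlns=""><shipdate/></order>
          </xforms:instance>
          <xforms:bind nodeset="/order/shipdate" type="xsd:date"/>
        </xforms:model>

        <!-- in the body: the control; a browser that understands
             xsd:date is free to render a calendar widget here -->
        <xforms:input ref="/order/shipdate">
          <xforms:label>Ship date</xforms:label>
        </xforms:input>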

    "These "rich" (but simple! not XML!) attributes could then be seen by the browser and presented to the user in whatever way is supported by the system"

    Hopefully browsers do implement this. I would love to see them offer a calendar popup. I can't count the number of times we've had to use JavaScript for this.

    "But the direction we're going in, the HTML books have just become thicker and thicker over the last few years."

    This I don't get. There are fewer tags now, right? It's the CSS and XSL books which should be getting thicker. By the way, never buy a book on XSL:FO. I accidentally dropped one on my foot, and Christ, it hurt.

    I think the progression from HTML 4.0 through XHTML 1.0 to XHTML 1.1 was smooth. They're encouraging people to go back to the roots of the web: to mark up content according to what it means, not according to how it's supposed to look. Sites like www.csszengarden.com are living proof that separating HTML from CSS can achieve an excellent separation of concerns between the graphic designer and the web developer, and I'd personally love to see more sites like it (only with real content!) pop up all over the place. If for no other reason than the pages loading faster, thanks to far fewer tags in the HTML! :-)
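    The trick behind a site like that is easy to sketch (file names, text and rules invented for the example): one HTML page, two competing stylesheets, and not a single change to the markup between "designs":

        <!-- page.html: the structure, written once -->
        <link rel="stylesheet" href="theme.css">
        <h1 id="title">Monthly Report</h1>

        /* theme.css, design one: quiet and gray */
        #title { color: #444; font: 1.5em serif; }

        /* theme.css, design two: same HTML, radically different look */
        #title { color: white; background: navy; font: bold 2em sans-serif; }

    Swap in a different theme.css and the whole site is redesigned without the web developer touching a thing.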

"If value corrupts then absolute value corrupts absolutely."

Working...