
Rewrites Considered Harmful?

ngunton writes "When is 'good enough' enough? I wrote this article to take a philosophical look at the tendency for software developers to rewrite popular tools and standards from scratch rather than work on the existing codebase. This introduces new bugs and abandons all the small fixes and tweaks that made the original version work so well. It also often introduces incompatibilities that break a sometimes huge existing userbase. Examples include IPv4 vs IPv6, Apache, Perl, Embperl, Netscape/Mozilla, HTML and Windows."
  • by [TWD]insomnia ( 125505 ) on Thursday January 15, 2004 @03:28PM (#7988854)
    .. as they are rewriting the security layer!
  • by Viral Fly-by ( 662186 ) <ross@truman.edu> on Thursday January 15, 2004 @03:31PM (#7988909) Homepage
    The minor tweaks, fixes, and changes that made the old version work so well can only go so far. Such is often the nature of code: tiny fixes and patches are (sometimes haphazardly) hacked onto it.

    Perhaps if truly rigorous software engineering and documentation techniques were followed, a full rewrite might never be necessary. However, as long as quick fixes continue to pollute the code and make it more and more difficult to work with, an eventual total rewrite will always be necessary.
  • by grendel_x86 ( 659437 ) on Thursday January 15, 2004 @03:32PM (#7988918) Homepage
    XP is not a total rewrite; there is still some code in there from Win 3.1 and NT 3.51.

    Longhorn is a rewrite.

    You must consider the source when reading articles. This is not a credible source, just some person's site. I'm sure they have pondered this, but they supply no sources for their information.
  • I'm sympathetic to the idea behind this article, but does it deserve a place on /.? There's absolutely no empirical data, or even a reasonable example, given in the document. The author is talking about IPv6 and Perl 6, both of which are unknown quantities at this point.

    He's right that just throwing away old code means you lose a lot of valuable bug fixes; on the other hand, if you look at some code and realize there is a better way, then the solution is to rewrite it.

    Of course, you can have it both ways. What you do is write an automated test case for every bug that you fix in your code. When you write the new version, it has to pass the old test suite; then you've got new code and all the experience from the old code.
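
    A minimal sketch of that idea in Python's pytest style (the datelib module and the bug numbers are hypothetical stand-ins, not from any project discussed here):

    ```python
    # regression_suite.py -- one automated test per fixed bug, as described above.
    import datelib  # the module being rewritten (hypothetical stand-in)

    def test_bug_101_two_digit_year():
        # Bug #101: "99" was once parsed as 1899 instead of 1999.
        assert datelib.parse_date("01/02/99").year == 1999

    def test_bug_215_trailing_whitespace():
        # Bug #215: trailing spaces used to crash the tokenizer.
        assert datelib.tokenize("a b ") == ["a", "b"]
    ```

    Run the same suite against the rewrite, and the new code inherits every lesson the old code learned the hard way.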

    John.
  • Untrue (Score:2, Insightful)

    by Shazow ( 263582 ) <{andrey.petrov} {at} {shazow.net}> on Thursday January 15, 2004 @03:34PM (#7988946) Homepage
    Although I have an unhealthy habit of wanting to start things from scratch, I believe it can be a good thing more often than not.

    When you've developed a piece of software, fixed its bugs, and tweaked it, more often than not those fixes and tweaks are nothing more than workarounds for your currently flawed structure. Usually, you don't realize these flaws until AFTER you've created it.

    By starting it from scratch, you can keep your mistakes in mind, and make better and more efficient software.

    Sure, there are chances of running into new bugs, but isn't that what the whole learning process is about? The more you learn, the better the software you'll keep making. You can only go so far if you need to turn a paper bag full of feces into an operating system. But if you start from scratch, you can create your own digitized significant other. You know, relatively speaking.

    - shazow
  • by shaka999 ( 335100 ) on Thursday January 15, 2004 @03:34PM (#7988949)
    Your point is well taken about ego often driving rewrites, but in my experience the driving force for rewrites is often maintainability.

    As a program ages and drifts from its original intent, ugly hacks are often placed on top of the original code to add unforeseen functionality. There is also the opposite effect, where old code is sitting around that no longer has any function. I remember one drastic case of this when rewriting a program where only about half the code was even being utilized.

    By rewriting the code you clean things up and make it easier for future programmers to understand what the code is doing.
  • ReFactor! (Score:3, Insightful)

    by gbr ( 31010 ) on Thursday January 15, 2004 @03:34PM (#7988952) Homepage
    Don't rewrite. Refactoring code is the way to go. Refactoring in small pieces allows the app to maintain compatibility as the process progresses.
  • by Old Wolf ( 56093 ) on Thursday January 15, 2004 @03:35PM (#7988972)
    A lot of time wasting comes from that too. Even if you can think of a better implementation, if it isn't better by much, it's not worth the development time plus debugging time to do it that way.

    Re. the original post, I think a lot of the problem is caused by bad code commenting. When you make a "little tweak", or fix some minor bug, or fix a subtle logic bug, you should clearly comment in the code what you have done, so that it can serve as a warning when somebody else looks at the code and does not realise the subtlety involved.
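
    For instance, here is a hypothetical Python fragment where the comment is the only thing stopping the next maintainer from "simplifying" the fix away:

    ```python
    def monthly_totals(rows):
        """Sum row["amount"] per month (hypothetical example of a commented tweak)."""
        totals = {}
        for row in rows:
            # Subtle fix (hypothetical bug #88): some upstream exports send the
            # month as the string "07", others as the int 7. Without int() here,
            # the same month silently appears twice in the report.
            month = int(row["month"])
            totals[month] = totals.get(month, 0) + row["amount"]
        return totals
    ```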
  • by selderrr ( 523988 ) on Thursday January 15, 2004 @03:35PM (#7988976) Journal
    I disagree. Most rewrites come from the experience learned during long periods of adaptations. The roots of this rewriting problem go back to the source of all coding evil: specs.

    In 15 years of coding, I have NEVER worked on a project with specs that could foresee further into the future than, say, 4-6 years. After that, either the managers start pushing new features that simply do not fit the original concepts, or you bump into uses of your software you did not foresee, simply because the scale of applications has grown beyond the scope of your own usage.

    The last 4 years I've been writing an app for authoring psychology priming experiments (somewhat like e-prime, but with far more randomisation capabilities). In the original concept, no one in our team expected someone to make randomisations with a tree wider than 6 stages. So I went for 15 in my code. By now, 4 years later, I have seen projects with twice that depth. I could expand the code by changing some #defines to provide for larger arrays, but that ignores the fact that such complex randomisations demand a whole other interface. So after a few weeks of puzzling, we decided... you guessed it: a rewrite.
  • Full of shit. (Score:4, Insightful)

    by iantri ( 687643 ) <iantri&gmx,net> on Thursday January 15, 2004 @03:35PM (#7988977) Homepage
    This guy is full of shit and has no idea what he is talking about.

    Some of the better parts:

    - He claims that the Mozilla project and everything after Netscape 4 is pointless, and that Netscape 4 "just works". We all know that Netscape 4 is an awful, crashy, buggy, standards-breaking piece of crap that set the Internet back years.

    - He claims that Windows XP was a complete rewrite. Windows XP is NT 5.1 (check with ver if you want) -- Windows 2000 with the PlaySkool OS look.

  • Fluff Article (Score:5, Insightful)

    by SandSpider ( 60727 ) on Thursday January 15, 2004 @03:36PM (#7988983) Homepage Journal
    Okay, so most of the article consists of, "Here's software X. They re-wrote it, and now it's not as good or as accepted. Why'd they do that? They suck."

    Software is re-written for many reasons. Sometimes it's ego, sometimes it's for fun, but usually it's because you take a look at the existing codebase and what you want to do with it in the future, and you decide that it's going to cost a lot less to implement the future features by re-writing and fixing the new bugs than to work around the existing architecture.

    I've had to make the re-write or extend decision more than once, and it's rarely a simple decision.

    What I would have preferred from this article is some interviews with the people responsible for the decision to re-write, and what their thinking was, as well as whether they still agree with that decision or would have done something differently now.

    =Brian
  • by SerialHistorian ( 565638 ) on Thursday January 15, 2004 @03:39PM (#7989041)

    Rewrites are 'bad' from a management point of view (at least to a manager who isn't familiar with software development), which looks at return on investment (ROI).

    However, from a developer's point of view, a partial or complete rewrite is sometimes the only way to FIX certain bugs. While it may introduce new, small ones, usually developers are smart enough to read the old code and learn from its mistakes before they do a rewrite.

    A partial or complete rewrite is ALSO sometimes the only way to fix 'spaghetti code' -- code that's become so tangled from patch upon patch being applied to it that it's now impossible to trace and fix a bug. If spaghetti code isn't pursued and rewritten on a regular basis (this is 'constant improvement' -- a management buzzword from the past few years that actually works), new bugs can be inadvertently introduced -- and it can sometimes take weeks to hunt down an intermittent bug by tracing spaghetti code. Ladies and gents, WEEKS of programmer time is expensive compared to one programmer spending 8-10 hours per week tracking down bad code in the codebase and rewriting it.

    Really, there's a case for doing rewrites on a constant basis. The author should have instead addressed adequate testing in software development environments...

  • Rewrites are Good (Score:3, Insightful)

    by RichiP ( 18379 ) on Thursday January 15, 2004 @03:40PM (#7989058) Homepage
    As a software designer, developer, programmer and user, I have to say that rewrites done right are A Good Thing(TM). When I do a rewrite, it is with the intention that it be better than the old one. I only do rewrites when a limitation of the old code base has been reached or can be foreseen to be reached.

    When a rewrite is to be made, it goes without saying that anything learned from previous development should also be applied to the newer project. If you can't learn from the mistakes of the past, don't do a rewrite.

    It is not rewriting, per se, that is the problem. It is choosing WHEN to do a rewrite. Unless there is sufficient reason to do one (i.e. old code hard to maintain, scalability problems, old code reaching its maximum potential, etc.), of course one should stick to improving the existing code. If, however, the reason is so that "we could have something new", or so that "we could say we did a rewrite", or "I'm the new architect around here. Scrap the old code and write my design", then of course rewrites might be more trouble than they're worth.

    All common sense.
  • by hcg50a ( 690062 ) on Thursday January 15, 2004 @03:41PM (#7989069) Journal
    From the Perl 6 development [perl.org] webpage:

    "The internals of the version 5 interpreter are so tangled that they hinder maintenance, thwart some new feature efforts, and scare off potential internals hackers. The language as of version 5 has some misfeatures that are a hassle to ongoing maintenance of the interpreter and of programs written in Perl."

    For me, this is a necessary and sufficient condition for rewriting something.

    Another is when changing the original will take longer than rewriting from scratch.
  • by melted ( 227442 ) on Thursday January 15, 2004 @03:42PM (#7989094) Homepage
    Every successful piece of software I've ever worked on was rewritten at least once, by the same team (or by myself on private projects) in the process of development, fully or at least partially.

    The fact of the matter is, even if you hire an expensive architect and have him do a good job, he's not a god. When you develop software, some parts of it tend to become ugly as heck, and you can't help but think about how to do the same thing better and/or with less effort, so that it won't become a PITA to run, maintain, improve and extend. When you reach critical mass, you become "enlightened", throw some shit away and rewrite it to save time later on. In all cases where I've seen it done, I think it was worth the extra effort. I also think re-engineering code as you go saves money long-term if it's done reasonably.

    All of this, of course, doesn't apply to those who start their separate standalone projects even though there are dozens of other reasonably good projects to contribute to (and maybe rewrite some parts of). Freshmeat.net is full of examples.
    bizarre (Score:3, Insightful)

    by cultobill ( 72845 ) on Thursday January 15, 2004 @03:44PM (#7989126)
    While the article is a good rant, it's just wrong sometimes. For instance:

    * He says that IPv6 uses 64-bit addresses. It actually uses 128-bit addresses. You would think that if you were explaining why something was bad, you'd do some basic research.

    * Also in the IPv6 stuff, "TCP/IP works pretty well". So? TCP/IPv4 and TCP/IPv6 are the same damn thing. That's not an argument against IPv6, it's an argument for knowing what you're talking about.

    * Perl. Sorry, the reasons for moving to the model in Perl 6 are well documented and sane. There are some problems with Perl 5 that we can't get around without losing backwards compatibility (syntax braindamage, for instance).

    * Mozilla. Ok, it's slow. The Mozilla team even admits it at this point. MozFirebird is better. The reason for starting fresh wasn't speed; it was that the old codebase sucked.

    * HTML. Having a language for both layout and data sucks. Splitting it into 2 parts is much better. There are developer perks, too (no rewriting the website to make it look different, no playing with layout to add data).

    The basic point he seems to be missing is: a major version change (1 to 2) is supposed to be a radical update. The version system used by the kernel (and a lot of OSS projects) is based on that: major.minor.revision. Bump revision when making bug fixes, bump minor when adding features (without breaking too much API), bump major when it's something new altogether.
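
    That convention is simple enough to sketch as code (a toy Python illustration of the major.minor.revision rule above, not any project's actual release tooling):

    ```python
    def bump(version: str, change: str) -> str:
        """Bump a major.minor.revision version string per the rule above."""
        major, minor, revision = (int(part) for part in version.split("."))
        if change == "major":    # something new altogether
            return f"{major + 1}.0.0"
        if change == "feature":  # new features, without breaking too much API
            return f"{major}.{minor + 1}.0"
        return f"{major}.{minor}.{revision + 1}"  # bug fixes only

    assert bump("1.4.2", "fix") == "1.4.3"
    assert bump("1.4.2", "feature") == "1.5.0"
    assert bump("1.4.2", "major") == "2.0.0"
    ```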
  • BULLSHIT! (Score:3, Insightful)

    by xutopia ( 469129 ) on Thursday January 15, 2004 @03:45PM (#7989145) Homepage
    Right now the people that still use Netscape 4 should be hung upside down by their little toes, whipped with a chainsaw and burned with acid dripping on their genitals.

    Netscape 4 is horrible. Its usage is actually slowing down adoption of Mozilla and other far superior browsers. Once we start creating web sites with standards rather than with code that merely looks like HTML, we'll have smaller browsers that can do things much faster than what Mozilla can do today. Indeed, Mozilla isn't just one browser but multiple browsers, one for each of the F'ing crappy implementations of HTML there have been. Just look at the page this article is on. It's laden with mistakes and isn't even standard HTML 4.0!

    This guy would prefer to see the net stop growing than see some change so he doesn't have to rewrite some stuff. Lazy ass.

  • Firebird (Score:3, Insightful)

    by sofakingl ( 690140 ) on Thursday January 15, 2004 @03:46PM (#7989165)
    Don't like Mozilla? Use Mozilla Firebird. Honestly, I can't think of any browser I've used that is better than Firebird (especially with the addition of extensions). Firebird should be enough proof to this guy that Mozilla was a step in the right direction.
  • by jadavis ( 473492 ) on Thursday January 15, 2004 @03:52PM (#7989239)
    what is wrong with rewriting the code from the ground-up?

    Nothing is wrong with that, as long as your time is worth nothing.

    The obvious answer is that if you get some enjoyment out of a rewrite, and you actually do it, then sure, it's great. But if you have to trade something more important, then it's bad. What else could you do with your time, and how much enjoyment or productivity would you get out of the alternatives?

    When I first started programming, I would always get a vision about how a piece of software should work, and think about rewriting it. But usually the current software is, to an extent, cluttered for a reason. I think it's only worthwhile if you can actually work out the details, and you're still confident. The details are what always cause the problems.
  • by BinxBolling ( 121740 ) on Thursday January 15, 2004 @03:54PM (#7989266)

    And often, you're mistaken when you think you have a better implementation.

    Here's an experience I used to have somewhat often: I'd be revisiting a piece of code I'd written a few months earlier. I'd think "Wait, this makes no sense. It shouldn't work at all. New approach X is much better." So I'd start refactoring it, and about 3 hours into the implementation of 'X', I'd begin to understand why I chose the original solution, and realize it remained the best approach. And so I'd nuke my changes.

    I don't tend to let that happen so much anymore. Partly I try to better document why I make the design decisions I do, partly I try to have a little more faith in myself, and partly I stick to the attitude of "Don't fix what you don't empirically know to be broken."

    The point of my story is this: If someone can misunderstand their own design decisions after the fact (and talking to fellow programmers, I'm not the only one with this kind of experience), think how much easier it is to misunderstand someone else's.

  • by rgmoore ( 133276 ) * <glandauer@charter.net> on Thursday January 15, 2004 @03:54PM (#7989268) Homepage

    One point that the author seems to miss is that there are better and worse ways of doing a rewrite. Several of the examples he mentions (notably Apache 1 vs 2 and Perl 5 vs 6) are being handled very well. Development on the old versions is continuing while the new versions are being improved essentially in the background. That means that nobody is forced to upgrade until the new version actually provides them with enough tangible benefits that the switch is justified.

    Perl is an especially good example because the new version is actually separating the language specification (Perl6) from the Virtual Machine (Parrot). Parrot will be flexible enough to run both the old and new language specifications, so even people who don't want to rewrite their scripts will benefit from the performance enhancements. Combined with continued development of the existing codebase, this makes Perl very future safe, all while offering the potential benefits of a complete code rewrite.

  • by Gherald ( 682277 ) on Thursday January 15, 2004 @03:55PM (#7989291) Journal
    > XP and 2003 are fairly minor tweaks of Windows NT, but they are missing some of the back-compatibility that was in Windows 2000 if I remember right.

    No, you have got it backwards. XP and 2003 are both MUCH more back-compatible than Win2k.

    Aside from NT, Win2k was the most incompatible Windows ever. Stable, but with many compatibility problems with both hardware and software, especially before the various service packs came out.

    > XP was in no way "from scratch"

    You are correct. XP is the Win2k codebase with many features added and much better hardware/software compatibility. It was designed to be both a home and office OS, whereas Win2k was designed specifically to be a robust server/workstation OS.

    Incidentally, after all this time there is still an ongoing debate about whether XP or 2000 is more stable as a workstation client. As a network admin for 46 stations, my vote goes to XP.
  • Re:Full of shit. (Score:5, Insightful)

    by mcmonkey ( 96054 ) on Thursday January 15, 2004 @03:56PM (#7989306) Homepage
    Netscape 4 basically ended the browser wars. That was the point many users switched to IE, and they never switched back.

    Yes, it was that bad.
  • by FerretFrottage ( 714136 ) on Thursday January 15, 2004 @03:58PM (#7989325)
    Give 10 software developers the same problem and you are guaranteed to get at least 11 different solutions
  • by squarooticus ( 5092 ) on Thursday January 15, 2004 @04:02PM (#7989418) Homepage
    Exactly when did Netscape ever work well on Linux?

    All I remember is consistent crashing, from Netscape Gold through the finally-put-down Netscape 4.x. It was the biggest piece of shit browser ever written, precisely because its codebase was old (forked from NCSA Mosaic in 1994, which itself was much older) and non-extensible, yet more and more shit was thrust into it. It had to be rewritten, and the Gecko-based browsers have all been much more feature-complete and reliable for the past 2-3 years than Netscape ever was.

    I use Galeon, and the thing basically never crashes. Back in 1999, I considered myself lucky if a particular version of Netscape 4.x only crashed once every half-hour.
  • by wwvuillemot ( 676894 ) on Thursday January 15, 2004 @04:05PM (#7989473) Homepage
    I do not concur with the author's seemingly blanket assumption that a complete rewrite of a codebase is wasteful. There are times when it is necessary for both practical and philosophical reasons.

    From the practical standpoint, as suggested by other astute readers, oftentimes the initial specs did not sufficiently anticipate future growth. Nevertheless, it is a poor programmer who does not anticipate this and do his/her best to provide a framework robust enough to allow at least an order of magnitude of growth in a primary spec. On top of this, standards change, new ones emerge, "paradigms" shift, needs change and so on -- at times it just makes sense to start from scratch. You are not going to build a business building on top of your house's foundation... it just is not scalable to the new needs.

    Philosophically, I think it is worth tearing down the structures and building anew at times. Too much incremental growth can lead to long-term stagnation as the original skills to build the foundation are lost through inactivity. As an aerospace engineer I can see it happening now, where too much information and process have become institutionalized -- I fear what would happen if we ever needed to do it from scratch.
  • What about Gnome? (Score:3, Insightful)

    by Anonymous Coward on Thursday January 15, 2004 @04:06PM (#7989485)
    The Gnome desktop environment is a prime example of disasters through re-writes.

    As we all know, Gnome's original purpose was to provide a free rival to KDE, which was the first easy-to-use desktop environment for Linux; this was back before Qt was GPL.

    Unfortunately for Gnome, its problems started as it kept replacing and rewriting core components. For example, it started out with the Enlightenment window manager, then switched to Sawfish, then switched to the buggy and slow Metacity. Metacity has had many problems, and most people want the old Sawfish back, but Havoc Pennington refused and insists that people use Metacity.

    The file manager keeps changing too. First it was GMC, then it was the slow and buggy Nautilus from the now-defunct Eazel corporation; now they are writing a new Windows 95-like file manager for Gnome called Spatial Nautilus.

    It also rewrote the graphics toolkit, GTK, and broke compatibility with GTK 1.x. There are many legacy GTK apps still in wide use, and they look ugly on newer desktops.
    There are also the many problems with the file dialog, which is only now emerging in GTK 2.4 and which is also incompatible with older GTK versions. This means that if you want to use a new program, YOU HAVE to upgrade to Gnome 2.6 and can't keep your legacy Gnome 2.0/2.4 desktops.

    They keep switching default apps; for example, Galeon was dropped in favour of the buggy and far less featureful Epiphany in 2.0. They also dumped several other applications that were useful.

    To make matters worse, it is moving away from the old philosophy of simple text files and is using an XML-based registry clone to configure things. KDE keeps the text file format underneath and has had a standardized API for it.

    It also lacks true integration; Miguel de Icaza has PUBLICLY ADMITTED that Bonobo was a failure. KDE has had this BUILT IN from day one using KParts technology, which is now being used in Apple's Mac OS X Panther.

    Gnome developers, realising they kan't kompete with KDE technology, have spread various FUD about KDE, but the message is getting through. Red Hat has abandoned its Gnome desktops, and Fedora developers are working hard to make KDE 3.2 the default desktop for Core 2. Debian, which has traditionally been pro-Gnome, has announced its full support for KDE, and they are working hard to make KDE the default desktop.

    KDE, on the other hand, has kept consistent technology and has internally changed very little since 2.0. Distros like Lycoris are still using 2.x because it is very stable and mature. KDE 3.2 will be a good example of why maturity, not wheel reinventing, is the better idea overall. They have taken their technology and optimized it for usability.

    Gnome 2.6 will need more than just propaganda about the HIG if it is going to get the attention it needs, but instead it looks like they are reinventing wheels again.
  • by chromatic ( 9471 ) on Thursday January 15, 2004 @04:11PM (#7989583) Homepage
    You are not going to build a business building on top of your house's foundation...it just is not scalable to the new needs.

    That may not be a good analogy. You can't gradually update your house's foundation, while you can gradually improve the foundation of a piece of software.

  • by JohnnyComeLately ( 725958 ) on Thursday January 15, 2004 @04:12PM (#7989596) Homepage Journal
    I have to agree in concept with the author of the piece. I have installed Solaris 8 on quite a few systems and just ran with what it included, maybe adding a few packages (SSH, PHP, etc.) for different things. Life was simple and everything worked, every time.

    Then I changed jobs to a place where there's not a single Unix machine to be found. Needing to set up a server (and knowing Apache did what I needed), I took a PC and tried installing RedHat 8 from a disk I had burned from a friend's copy a long time ago. This is where my life got frustrating. I spent three weeks banging on this thing until someone mentioned Apache and mod_perl having different versions... oh yeah, and 1.99 is actually 2... WTF?? I came so close many times to just going on eBay, buying an old Sun Ultra 10, and donating it to work.

    Yes, I eventually got it to work (it's what I'm using now to write this, on a Mozilla browser), but it just seems like things were more complicated than they needed to be. Apache 1 worked....mod_perl 1 worked....

    So, since I agree with the author, does that make me flamebait too? I guess the upside is that now I have more books, as I went out and bought some Linux books to sit on the shelf next to my Sun Solaris System Admin 1 and 2 boxes.

    The reason I knew right away that I'd sympathize with this article is that all my co-workers (at my former job) were upgrading to Sol 9. They'd ask, why aren't you upgrading? I'd always ask, why should I? Silence... Occasionally someone would mention OpenSSH, but then I'd remind them that it took me less than 2 minutes to download, untar and pkgadd. A lot less time than a full OS load.

    John

  • Rewrites (Score:2, Insightful)

    by JASegler ( 2913 ) <jasegler AT gmail DOT com> on Thursday January 15, 2004 @04:16PM (#7989660)
    There comes a time in any software's life when a rewrite IS the correct decision.

    To put it in real world terms...

    If you take a single-floor home and start adding floors to it, you won't ever turn it into a skyscraper. At least not one I'd ever want to be near.

    If you want a skyscraper, you bulldoze the house, design the skyscraper and build it.

    A lot of early design decisions can really haunt you later. Like the Apache threading example in the article.

    -Jerry
  • by Anonymous Coward on Thursday January 15, 2004 @04:24PM (#7989773)
    Rewrites can be good or bad depending on the goal and the understanding of the people doing them.
    For a good rewrite to occur the following need to be true:
    - emphasis on smaller/simpler code, NOT adding new features (new features may come about "for free" as part of the new design, but must not be added in as extras at this stage)
    - the person/team doing the work should have a full understanding of the whole architecture of what is being rewritten
    - full access to all the previous bugs and bugfixes, to make sure the new version addresses all these problems
    - fairly complete regression testing should be available to compare the old and new versions (one way to do this is sketched just after this list)
    - reduce/simplify/refactor as much as possible; identify common patterns and eliminate the redundancy, and in so doing hopefully eliminate bugs as well
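
    To picture the regression-testing item, here is a hypothetical Python harness (old_impl, new_impl and the corpus file are stand-ins) that replays recorded inputs through both versions and diffs the outputs:

    ```python
    # Hypothetical harness: replay recorded inputs through the old and the new
    # implementation and report any divergence before the rewrite ships.
    import json

    import old_impl  # battle-tested version (stand-in module)
    import new_impl  # candidate rewrite (stand-in module)

    def divergences(corpus_path):
        """Return (input, old output, new output) for every case that differs."""
        diffs = []
        with open(corpus_path) as f:
            for line in f:
                case = json.loads(line)["input"]
                old_out, new_out = old_impl.handle(case), new_impl.handle(case)
                if old_out != new_out:
                    diffs.append((case, old_out, new_out))
        return diffs

    if __name__ == "__main__":
        print(f"{len(divergences('recorded_cases.jsonl'))} divergences found")
    ```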

    It's a pretty major chore that requires a lot of understanding and very competent people to head up the effort. But if done right, it's well worth it, because if the rewrite can maintain compatibility (or at least mostly) and greatly simplify the source base, it will dramatically decrease the maintenance time needed in the future, and can naturally wipe out many potential bugs at the same time, reducing future debugging time.
  • Lousy Examples (Score:4, Insightful)

    by avdi ( 66548 ) on Thursday January 15, 2004 @04:25PM (#7989787) Homepage
    Most of the examples given needed rewrites to remain viable. It's easy to look at a package from afar and declare it "perfectly sufficient". Things look different when you have to work with a system daily. In particular, rewrites often address shortcomings in a system's capacity for extension. Just compare the number of third-party extensions available for Netscape 4.* vs. the number now available at mozdev.org for Mozilla and Firebird.

    A bigger problem, to my mind, is when a half-dozen projects with the noble intention of replacing an aging kludged-up tool are started, all of which suck in different ways, and none of which learn from each other. And then they lose momentum and stagnate.

    Examples? Most programmers agree that "make" is overdue for replacement, but despite many attempts (cmake, jam, cons, ant) no one has managed to come up with one that is compelling enough to catch on. CVS is a crufty mess, but none of its potential replacements are mature enough, or have the kind of widespread tool support, to make much of a dent in CVS installations. And there are dozens of written-from-scratch applications which differ primarily in the GUI toolkit they are based on, and which would be better apps if they incorporated the best features from all into a joint effort. My idea of the perfect browser combines features of Konqueror, Galeon, Epiphany, Firebird, and Safari.
  • This is bogus... (Score:2, Insightful)

    by clifgriffin ( 676199 ) on Thursday January 15, 2004 @04:25PM (#7989790) Homepage
    There are always ways to improve code. Much of the time you'll end up with a much smaller, much more efficient, much more extensible application.

    Rewriting is almost always a good thing. The rules of writing English papers apply here.

    At a certain point, certain portions will mature to the point that they can't, or needn't, be improved with each successive version. If you're content with the architecture, you'll reach this stage. But not many applications can evolve to new heights with the same diagram/layout they started with 5 years ago.

    This is just a poor attempt to get noticed.

  • by Salamander ( 33735 ) <jeff AT pl DOT atyp DOT us> on Thursday January 15, 2004 @04:28PM (#7989826) Homepage Journal

    There are necessary and beneficial rewrites, but the vast majority of rewrites occur because it's easier to write a new piece of code than to understand an old one. Yes, easier. The "rewrite bug" afflicts brash beginners the most, and top-notch experienced programmers the least. The best programmers tend to get that necessary rewrite out of the way during initial development, by writing a serious first-cut version, throwing it away, and then writing it a second time for real all before anyone else even sees it. Such code will often pass unit tests earlier than the "never refactor" code written by second-raters, and rarely requires a rewrite after that.

  • Throw one away (Score:3, Insightful)

    by cdunworth ( 166621 ) on Thursday January 15, 2004 @04:29PM (#7989840)
    It's really common to build something, step back, examine its warts, and start over again with a new perspective and understanding. It's called prototyping. Some people actually build the first one with the intent of throwing it away. Others release it as v1.0, and introduce issues of the kind this author is referring to.

    There are many reasons you might prefer a rewrite. The main one, to me, is that complicated applications contain layers and dependencies, not all of which are obvious to a new programmer. If, after some analysis, your assumptions about these dependencies are wrong, you'll break the original code faster than you can say "global variable". In the end, you could easily spend more time and effort patching and praying than you would rebuilding from the ground up.

    Of course, if some of the original architects are still involved in the project, architectural knowledge and assumptions can be transferred to new programmers in a fairly fluid way, and I suspect it is in these cases that you can confidently add on to an existing code base.

    And it's always helpful if the previous programmers were actually good programmers who wrote code and comments mindful of those who might follow them later. But that's not within your control.
  • Re:Ego? (Score:4, Insightful)

    by globalar ( 669767 ) on Thursday January 15, 2004 @04:39PM (#7989979) Homepage
    How do you learn how something works?

    1) You take it apart (literally the backwards approach, though if you have the time it works).

    2) Read the documentation, learn how to use it, and work with it. (Still will not show you everything, especially with well encapsulated components. And when was the last time documentation, even Google, answered all your questions?).

    3) Build something similar (a variant, clone, emulator, etc.)

    The experience of programming your own components cannot be substituted. Bad, but passable analogy: building a house vs. repairing a house. In the former, you experience the thought process; in the latter, you adapt your thought process (to some degree).

    Also, I think once you see all the work and brilliance that has gone into software you take for granted, you are motivated to build something once with the intention of reuse. To be a forward thinker you have to understand what has gotten us this far and what has to change to get us farther. Experience with what the wheel is made of and why, not necessarily rebuilding it, can provide you with these perspectives.
  • by Gherald ( 682277 ) on Thursday January 15, 2004 @04:54PM (#7990264) Journal
    The new UI really is a love/hate thing. I started using 95 the week it was released, so I am very attached to the old theme.

    There are plenty of people who like the new theme... but there is too much color there for my tastes.

    "Crayola interface", indeed!
  • by bigpat ( 158134 ) on Thursday January 15, 2004 @04:58PM (#7990313)
    This seems common in other areas of engineering also: bridges could just be retrofitted and buildings added on to, but sometimes there are too many unknowns in engineering old structures. Are the building materials made from asbestos? How has the structure held up after so many years? Have other modifications extended or complicated further modifications beyond what the original plans called for? Sometimes the unknowns themselves justify building from scratch. Sure, we could just keep tacking new technologies onto old, but the result will seldom be better. More often the real advancement comes from taking the knowledge gained from past experiences and applying it to the new, rather than actually taking old work and trying to make it work in a new situation.

    Would you really want horses running on a treadmill attached to the front of your car, just because humanity wouldn't want to throw away its previous investment in transportation technology?

  • by pclminion ( 145572 ) on Thursday January 15, 2004 @05:05PM (#7990418)
    I find myself constantly rewriting any code that I have complete control over. Code I write for my employer evolves continuously, but personal code for my own enjoyment is constantly getting axed and redone.

    Having done this for years, I think I'm starting to figure out why I do it, and perhaps someday I'll be able to stop myself from doing it, so that I can actually release something :-)

    I think the need to rewrite is more emotional than intellectual. As I work on an existing codebase, I notice the little bumps and warts on it, the little "tweaks and fixes" which make it work, and I find them ugly. For some reason, I place the highest aesthetic value on code that was written in one big, flowing session, where the entire structure was understood from the beginning, and the entire thing looks like it was born fully-formed from some supernatural source.

    In an ever futile attempt to realize this goal, I constantly chuck out perfectly good code and redo it from scratch. I do this because I seek the emotional experience of those few times when I really do sit down and blast out something that's beautiful, elegant, and functional. Even if, practically, it's no better than before.

    Open source programming is often described as scratching an itch. It should be immediately apparent why this correlates to extensive rewriting of code. Some problems are simply enjoyable to solve. The necessary thinking feels good. Just as we watch a good movie again and again even though we've got the plot memorized, some programmers want to rewrite the same functionality repeatedly because it just feels good.

    To hell with practical considerations, like whether or not that's "bad" for the codebase. I program for pleasure.

  • by Anonymous Coward on Thursday January 15, 2004 @05:10PM (#7990503)
    > But you have to consider what Netscape would be like if it had had the amount of work put into it that mozilla has now.

    If you knew the history of the Mozilla project, you would know that modifying the old Netscape code would have gone nowhere.

    Before the Mozilla developers decided that a rewrite was necessary, they spent the better part of a year trying to improve the original Netscape code.

    But the code was so bad that they couldn't attract any developers -- no one was willing to work on it.

    Trying to work on the old code was boring, and difficult, and required huge amounts of effort for very little gain. Also, the code was not modularized properly, which meant that very few developers could work on it at the same time without constantly tripping over each other.

    If they had simply tried to upgrade the old code, you would have a slightly better Netscape browser today (assuming they didn't give up entirely).

    But the rewrite attracted large numbers of developers, and produced some real innovations.

    Today, Mozilla is a much better browser -- head and shoulders above IE -- with better stability, better standards support, and features such as tabbed browsing, and pop-up blocking.

    But more than that, the Mozilla project has given us a powerful cross-platform development toolkit, with the XUL user-interface facility. This has not only created a new field in developing Mozilla plug-ins, but is being used for the construction of many other products.

    The original poster was right. The author of the article is talking nonsense.
  • Netscape 4 (Score:1, Insightful)

    by Anonymous Coward on Thursday January 15, 2004 @05:11PM (#7990515)
    Netscape, and the web, became less usable with each version since at least 2 or 3. In the days of Netscape 2/3, it was stable. The HTML standard was a lot simpler, and Netscape was standards-compliant. I remember those days. The web worked great. It was faster than it is today, more machine-parsable, more disabled-friendly, easier to make web pages for, and just generally better.

    The problem came in when web design weenies decided they want pixel-by-pixel control of where everything went. Netscape and Microsoft, in competition to embrace-and-extend each other out of business, added crappy extensions to allow this. We saw layers, JavaScript, Java, Flash, etc. come out around this period.

    In order to keep up, the W3C came out with hairy standards to allow web pages to describe this sort of stuff. Writing a web client went from taking several months to taking many, many years, with the only added benefit being web pages that look a little bit prettier with a good graphic designer, and much uglier and less usable the other 95% of the time. Meanwhile, modern web pages now consistently cause the masses of complexity called web browsers to crash (I use Netscape, Galeon and Mozilla -- all of them consistently crash). Due to the complexity of the standards, different web browsers support different subsets of the standards, and so render differently. Web designers usually only sit with one, or at most two, browsers, and so never see how the pages break on everything else. Heck, with a lot of web pages, having a different set of fonts or different font sizes from the web designer's box causes them to render uselessly.

    Normal people no longer learn HTML -- it is now much more complex than languages like TeX. They use tools like FrontPage, but increasingly the web becomes asymmetric, with content providers (big companies with web design teams) and content receivers (typical end-users, overwhelmed by the web, and only capable of putting up very crude web pages).

    I really don't see any win here. We introduced shitty standards, and coded shittier software to run on those standards, and then whine about how the old software isn't compatible with the new standards.
  • by CmdrTHAC0 ( 229186 ) on Thursday January 15, 2004 @05:24PM (#7990733)
    Those instances of Slashdotter are not necessarily the same person. Furthermore, Microsoft would be totally braindead to make a business plan from Slashdot comments. BillG probably isn't that dumb, despite what we wish for ;-)
  • by Webmonger ( 24302 ) on Thursday January 15, 2004 @05:55PM (#7991147) Homepage
    What is wrong with that is that most of the code is correct and solid. It's just organised in the wrong way.

    Instead of rewriting, restructure! When you rewrite, there's a period where the new code doesn't work. If you restructure in suitably-sized steps, the code always works between steps.
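
    A tiny Python illustration of that (hypothetical code): each step below changes structure, not behavior, so the code keeps working between steps.

    ```python
    # Step 0 -- starting point: parsing and totalling are tangled together.
    def report(lines):
        total = 0
        for line in lines:
            total += int(line.split(",")[2])
        return "total=%d" % total

    # Step 1 -- extract the parsing into its own function; behavior is unchanged,
    # so everything still works before the next step even begins.
    def amount(line):
        return int(line.split(",")[2])

    def report_step1(lines):
        return "total=%d" % sum(amount(line) for line in lines)

    assert report(["a,b,3", "c,d,4"]) == report_step1(["a,b,3", "c,d,4"]) == "total=7"
    ```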
  • by Trejkaz ( 615352 ) on Thursday January 15, 2004 @06:56PM (#7991908) Homepage

    I sent this reply to the author through the site, but it would probably get some use here too.

    "The Web was based on the idea that a simple markup language could allow us to divorce document presentation from document structure"

    Which HTML 1.0 through 3.2 didn't really achieve, admittedly...

    "Some of the changes to HTML were done in a way that shouldn't break old browsers, but as I said before, I am increasingly seeing websites that don't render properly in Netscape 4.x"

    There's a shock. I thought it was 2004, and you're still testing on a browser which is at least three major revisions old, never mind that Mozilla itself seems to be more useful than Netscape's rebadged browser.

    "So apparently the FONT tag is deprecated - now we have to use style sheets and whatnot to do something that was originally very simple"

    This is because "the web was based on the idea that a simple markup language could allow us to divorce document presentation from document structure", and the FONT tag is presentation appearing in the document structure. That's sort of like a divorce where the couple still sleep with each other.

    "but at the expense of being able to do simple things quickly."

    I beg to differ. Even if I really want to break style guidelines and make a chunk of text red for no particular purpose, it takes the same amount of time to type <span class="red"> as it did to type <font color="red">. Never mind that this really is a bad thing to do. Why is it red? Is there a meaning to the red? Perhaps it should be <span class="important">, in which case why not just use <strong>?

    "As a Web developer I have long wondered why they didn't add more types to the INPUT form tags to express different types - for example, a DATE attribute, or INTEGER, DOUBLE, or whatever."

    Of course XHTML 2.0 will be partnered with XForms, which will attain this functionality insofar as any field which can store a value can be of an XML Schema type. This includes -- wait for it -- dates, integers, doubles, and arbitrary regular expressions.

    "These "rich" (but simple! not XML!) attributes could then be seen by the browser and presented to the user in whatever way is supported by the system"

    Hopefully browsers do this. I would love to see them implement a calendar popup. I can't count the number of times we had to use JavaScript for this.

    "But the direction we're going in, the HTML books have just become thicker and thicker over the last few years."

    This I don't get. There are fewer tags now, right? It's the CSS and XSL books which should be getting thicker. By the way, never buy a book on XSL:FO. I accidentally dropped one on my foot, and Christ, it hurt.

    I think the progression from HTML 4.0 through XHTML 1.0 to XHTML 1.1 was smooth. They're encouraging people to go back to the roots of the web: to mark up content depending on what it means, not on how it's supposed to look. Sites like www.csszengarden.com are living proof that the separation of HTML and CSS can achieve excellent separation of concerns between the graphic designer and the web developer, and I'd personally love to see more sites like this (only with real content!) pop up all over the place. If for no other reason than the pages loading faster, due to many, many fewer tags in the HTML! :-)

  • by Anonymous Coward on Thursday January 15, 2004 @09:37PM (#7993561)
    Actually, it's possible to get that level of compatibility with W2K. Just install the Windows Application Compatibility Toolkit (it's available for free off of M$'s site). I use it at work (a high school). IMO, W2K + WACT is far more stable (and faster) than XP Pro.
  • by CreateWindowEx ( 630955 ) on Friday January 16, 2004 @12:26AM (#7994907)
    I think this discussion is being clouded because people have experience with different-sized projects. If you are working on some fairly small tool which has a well-defined usage, it may be reasonable to rewrite the whole thing from scratch, partly because it is possible to test it fairly thoroughly yourself to verify your new version, and because it is possible to understand the whole thing in your head at one time, and thus make the decision to rewrite rationally.

    However, if you're talking about a larger project, such as a commercial software application with 50 man-years of development, a complete rewrite will usually be undertaken with a large degree of ignorance of the true problem domain. Also, you can rewrite your codebase to fit more nicely with the *current* state of your specs and requirements, but the new "elegant" design may be even less suitable for tomorrow's new feature request. Even rewriting a major subsystem from scratch can be a costly mistake.

    The real trick is how to maintain the code in such a way that it continuously improves instead of just getting more and more riddled with spaghetti, dead code paths, and other clutter. It's especially hard, because it's easy to forget to treat a mature code base with respect, and just hack in "one more" thing because you are hoping to rewrite it at some point.

    There is the "broken window" idea -- one broken window will lead to an increasing spiral of vandalism, and one line of crud in a source file will give future programmers the feeling that they can add ten more lines of crud because the code is already "dirty". While the usual adage is to fix the window immediately, the reality is that most source files have a hundred broken windows already, and fixing them all right now is not an option. What takes discipline is making sure to leave each file in better condition than you found it -- remove some dead code, do a little refactoring to clean it up, rename identifiers or reformat code to conform with project-wide standards (NOT your personal pet style!!! It is very annoying when people check out a file, reformat it to their own preference, add one small feature, break some other part of the code, and check it back in to source control...). Another common problem is when people come up with some "new religion", convert about half the code over to the new way, but leave it in a "worst of both worlds" state, because it turns out the "new way" was just "different" and added as many problems as it solved. It is easy to add code that goes against the grain of the existing code because you didn't bother to really understand the system and structure of the code, or because you don't "like" a certain design decision.

    In many ways, the real achievement is that higher mental state where you stare at the huge, messy codebase that you've been working with for ages, have the "aha" moment, come up with a simple refactoring, make changes to fifty files, replace hundreds of lines of code with tens of lines, and it just works the first time and only takes a few hours, because for one fleeting instant, you had the whole thing in your head at once. I wish I could have more of those moments.

    I'm as guilty as any of violating these ideas -- I often keep my nice clean "new" subsystems tidy because they are already tidy and conform to my current design philosophies/religion, but let my big, mature -- and often more important -- subsystems grow increasingly crudified, because I have the continuing fantasy that I will be rewriting them from scratch "next project"...
