
Why Vista Had To Be Rebuilt From Scratch

iliketrash writes "The Wall Street Journal has a long front-page article describing how Jim Allchin approached Bill Gates in July 2004 with the news that then-Longhorn, now-Vista, was 'so complex that its writers would never be able to make it run properly.' Also, the article says, 'Throughout its history, Microsoft had let thousands of programmers each produce their own piece of computer code, then stitched it together into one sprawling program. Now, Mr. Allchin argued, the jig was up. Microsoft needed to start over.' And start over they did. The article is astonishing for its frank comments from the principals, including Allchin and Gates, as well as for its description of Microsoft's cowboy spaghetti code culture."
  • And Microsoft rule (Score:5, Insightful)

    by Timesprout ( 579035 ) on Saturday September 24, 2005 @07:39AM (#13637270)
    Because, much as /. knocks them, this is the sort of thing they can manage: astonishing turnarounds.
    • by davmoo ( 63521 ) on Saturday September 24, 2005 @07:49AM (#13637309)
      Right on, dude. I wish I had mod points to give you this week.

      The only computer company that has reinvented itself more times than Microsoft is IBM. And both companies are, contrary to popular belief around here, very far from dead. They aren't even sick or gasping.
      • You forget Apple; they reinvented themselves more than once AND have always managed to be the frontrunner of computer innovation...

        The Lego-block analogy applies to how Apple wrote code for a while - I would go as far as saying since System 6, but more realistically System 7, with its core OS and extensions attaching to it. They invented plug-ins before browsers were even invented...

        That ultimately gave us OS X, the ultimate in plug-in philosophy, from the kernel to the GUI.
        • by leandrod ( 17766 ) <l.dutras@org> on Saturday September 24, 2005 @09:44AM (#13637787) Homepage Journal
          OS X, the ultimate in plug-in philosophy,

          Mac OS X is not that modular. GNU Hurd is far more so, and even GNU/Linux is.

          from the kernel

          Mac OS X's kernel is not modular at all. It has conflated the Mach microkernel, which the Hurd has already abandoned for its bad performance, with the monolithic BSD kernel. The result is something just as monolithic as BSD, but much larger, more complex and slower. Linux is not as fast or simple as BSD, but it is still much faster than Mac OS X, and both are just as modular.

          In contrast, the Hurd on Mach is a little bit slower but much more modular, and the new L4 version has the potential to be much faster while being still more modular, because it is a true microkernel with multiple servers.

          to the GUI

          The Mac OS X GUI is not modular at all; X's is.

          • I've tried Apple's Remote Desktop product. It isn't much of an improvement over VNC. Apple (I'm sure misinformed X-haters will love this) COMPLETELY ripped sane remoting capability out of their desktop. Their own remote admin products send bitmaps just like VNC. It is maybe slightly more efficient because they can hook in a bit lower, but they can't do X11, much less rdesktop.

            Low-bandwidth, responsive remote desktops are a bullet point that modern OSes should be able to meet. The capability most cer
          • by pohl ( 872 ) *
            Could you cite a specific example of two regions of code within those systems that are not linked through a well-defined interface, and make a convincing argument that they should be?

            Did you know, by the way, that a system can be modular at the source code level, and then (based upon a compilation flag) it can be built such that either (A) both regions are in kernel space, or (B) one region is in kernel space and the other is in user space? The former would use a very efficient in
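
            A minimal C sketch of what the parent describes (all names hypothetical, not from any real kernel): callers are compiled against one well-defined interface, and a compile-time flag decides whether the implementation is a direct in-kernel call or a hand-off to a separate server.

            /* Build with -DBUILD_IN_KERNEL for variant (A), without it for (B). */
            #include <stdio.h>
            #include <string.h>

            /* The single well-defined interface both builds share. */
            static int fs_read(int node, char *buf, unsigned len);

            #ifdef BUILD_IN_KERNEL
            /* (A) Both regions in kernel space: fs_read is the very efficient
             * direct call the parent alludes to. */
            static int fs_read(int node, char *buf, unsigned len) {
                snprintf(buf, len, "data for node %d (direct call)", node);
                return (int)strlen(buf);
            }
            #else
            /* (B) The implementation lives in a user-space server; the call is
             * marshalled into a message.  The "server" here is just a function
             * standing in for real IPC to another process. */
            static int fs_server_handle(int node, char *buf, unsigned len) {
                snprintf(buf, len, "data for node %d (via IPC)", node);
                return (int)strlen(buf);
            }
            static int fs_read(int node, char *buf, unsigned len) {
                return fs_server_handle(node, buf, len);
            }
            #endif

            int main(void) {
                char buf[64];
                fs_read(7, buf, sizeof buf);
                puts(buf);   /* caller code is identical under either flag */
                return 0;
            }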

        • by cahiha ( 873942 ) on Saturday September 24, 2005 @12:08PM (#13638399)
          You forget Apple; they reinvented themselves more than once AND have always managed to be the frontrunner of computer innovation...

          According to their marketing and PR departments, anyway.

          That ultimately gave us OS X, the ultimate in plug-in philosophy, from the kernel to the GUI.

          Apple didn't give us OS X. The kernel came from CMU (an open source project), and NeXT and Apple spent the last 20 years making it less modular. The GUI software architecture came from NeXT, borrowed heavily from Smalltalk, and is client-server, like X11, only not as well architected or as efficient.

          In fact, Apple's own systems programming staff screwed up so badly that Apple had to go out and buy a new operating system; all their attempts to develop a next generation Macintosh OS in-house failed.
          • by SewersOfRivendell ( 646620 ) on Sunday September 25, 2005 @12:01AM (#13642612)
            Apple's own systems programming staff

            You misspelled executive management. Apple had plenty of fine programming talent who would have been happy to execute on a strategy. Any strategy.

            It may surprise you to learn that many programmers at Apple -- including key members of the Cocoa team, the Carbon team, and the IOKit team -- worked on Copland. Difference between Copland and Mac OS X? Executive management. Define a goal and stick to it. Q.E.D.

            In fact, a non-trivial amount of code and concepts from Copland is recycled in Mac OS X (excluding Classic for the purpose of this discussion ;) ). A lot of the Carbon toolbox implementation comes from Copland (most of it via Mac OS 8.5). Much of the Darwin IOKit design (but _not_ the implementation) is derived from Copland's NuKernel driver architecture, and some small parts of IOKit are derived from Pink/Taligent designs (but not the implementation).
        • by NutscrapeSucks ( 446616 ) on Saturday September 24, 2005 @12:13PM (#13638417)
          System 7, with its core OS and extensions attaching to it. They invented plug-ins before browsers were even invented...

          The MacOS extension mechanism was nothing like "plug-ins" -- there was no defined "extension API" as with a browser; they were system call traps and often relied totally on undocumented behavior.

          Someone could write such extensions for any OS, but it's generally considered to be a bad practice - as the unstable, conflicting mess of MacOS extensions proved.
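
          For contrast, a defined plug-in API tends to look like this minimal C sketch (hypothetical names; this is not how MacOS extensions actually worked): the host publishes a fixed interface and only ever calls extensions through it.

          #include <stdio.h>
          #include <stddef.h>

          /* The documented interface every extension must implement. */
          typedef struct {
              const char *name;
              int  (*init)(void);          /* called once at load time */
              void (*on_event)(int event); /* called for each host event */
          } plugin_api;

          /* A sample extension implements only the interface... */
          static int  logger_init(void) { puts("logger loaded"); return 0; }
          static void logger_on_event(int e) { printf("logger saw event %d\n", e); }
          static const plugin_api logger = { "logger", logger_init, logger_on_event };

          /* ...and the host calls through the struct, never the other way
           * around - no trap patching, no undocumented behavior. */
          int main(void) {
              const plugin_api *plugins[] = { &logger };
              for (size_t i = 0; i < sizeof plugins / sizeof plugins[0]; i++) {
                  plugins[i]->init();
                  plugins[i]->on_event(42);
              }
              return 0;
          }
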
    • by idlake ( 850372 ) on Saturday September 24, 2005 @08:07AM (#13637386)
      Of course, you are right: Microsoft is indeed one of the most competently managed companies around. And that is exactly their problem.

      Why is that a problem? Because their management, sales, and marketing are so good that their technology doesn't have to be. They can ship software with security holes, bugs, poor usability, and bad design, but the non-technical part of the company will somehow manage to still sell it and make a bundle on it.
    • by hayden ( 9724 ) on Saturday September 24, 2005 @08:08AM (#13637393)
      If you honestly believe they have re-written all of Windows in 18 months then I have a bridge to sell you.

      This is probably one of two things. He's telling the truth and they have re-written the core parts. This won't fix the vast mass of code sitting on top of the core code, which relies on the way things used to work.

      The other option is this is the latest round of "we've fixed it this time, honest". The result of this is left as an exercise to the reader.

      • by TMLink ( 177732 )
        From the article:

        The day before in Microsoft's auditorium, Mr. Allchin had announced to hundreds of Windows engineers that they would "reset" Longhorn using a clean base of code that had been developed for a version of Windows on corporate server computers.

        So they took the clean code they had from their server core (maybe from the next server edition, maybe from 2003?) and then used that to create Longhorn. So no, they didn't start from scratch, but they aren't building upon the XP quicksand foundation either.

      • Indeed, Microsoft never re-writes anything. And arguably any company that did so would be very foolish and probably heading for self-destruction. The article is couched in so much layman's talk and so many analogies that it's hard to know what the real engineering changes have been. But at a guess, they are just writing the new parts of Windows in a modular, plugin way. And that's why they can delay them till after the launch of Longhorn. So WinFS, for example, was probably originally implemented as spaghetti (co-ming
      • by timeOday ( 582209 ) on Saturday September 24, 2005 @09:03AM (#13637622)
        The other option is this is the latest round of "we've fixed it this time, honest".
        Most software development houses struggle with this.

        Every piece of software starts with a clean, elegant structure - in the mind of whoever created it. Over time some of their assumptions prove false, and more importantly, many of the "true believers" who originally engineered the system move on. The inevitable result is that the next wave of developers has a burning urge to throw it out and start from scratch. Virtually all developers want to throw out the code they maintain and start from scratch. As this faction gains momentum, what do you think they say about the software? It sucks, it's not engineered, it's not maintainable, and so on. There's probably some truth to it, but a lot of it is people making an argument to justify doing what they want.

      • How the story tracks (Score:5, Interesting)

        by DannyO152 ( 544940 ) on Saturday September 24, 2005 @10:35AM (#13638025)

        Put my two cents in as to how the article's storyline doesn't quite track. If Mr. Allchin, despite massive institutional inertia, gave the pig winglets and put it back on track to actually being releasable, then we're missing the motive for why he'll leave on Vista D-Day and why the company wouldn't fight to keep him. In some sense, the article is the story Microsoft wishes to tell - we were writing bad code, but we've fixed that now (and look at the bruises: no pain, no gain, right?) - which is what the parent post suspects.

        Now I suspect that the interviews took place before the Microsofta est omnis divisa in partes tres announcement, and there was no desire from Microsoft to have Mr. Allchin candidly describe his reasons for retirement (and maybe Mr. Allchin has a book up his sleeve), so off to press with this peek into the hallowed halls of Redmond.

        One quibble I would have with the article is its suggestion that Mr. Gates, as Chief Software Architect, has two paradoxical duties to reconcile: coming up with innovations and putting down unrealistic projects. A lot of the candid reporting I've seen says there's a third element he practices with zeal, which is to grind into a fine powder any idea he believes shakes a stick at the cash cows.

        One implication of the story is that in Summer 2004 Bill Gates didn't know that one of the cash cows was flatlining. There's a thought to ponder.

    • by labratuk ( 204918 ) on Saturday September 24, 2005 @02:02PM (#13639056)
      Because, much as /. knocks them, this is the sort of thing they can manage: astonishing turnarounds.

      Hah. I'm still trying to count the number of times I've heard "Yeah, we admit that everything so far has been kinda crap, but we've sorted it out this time..." from them.
  • by Gopal.V ( 532678 ) on Saturday September 24, 2005 @07:43AM (#13637291) Homepage Journal
    So Microsoft screwed up... and they're trying very hard to do it again. Dropping WinFS, porting Avalon back to XP, etc.

    To quote:

    And so at last the beast fell and the unbelievers rejoiced. But all was not lost, for from the ash rose a great bird. The bird gazed down upon the unbelievers and cast fire and thunder upon them. For the beast had been reborn with its strength renewed, and the followers of Mammon cowered in horror.
    Microsoft's greatest enemies now are still two for-profit companies - Google and Apple. I'll rest easier when FOSS replaces them (as was promised in 1999). Instead it's just a new master in place of the old one.
    • by cowscows ( 103644 ) on Saturday September 24, 2005 @08:03AM (#13637363) Journal
      The goal isn't for MS to disappear. We don't want them to get replaced by any single organization. We just want them to lose enough monopoly power and influence so that the rest of the computer world can get around without MS stomping on whatever they don't like. It already looks like they've lost some control. Google is doing their own thing, Apple openly taunts MS now, but neither of them are going to suddenly be ubiquitous on 90%+ of the world's computers. If Apple could get their marketshare up around 10%, maybe this "web as a platform" dealie sort of replaces windows 10% of the time, and maybe FOSS gets a 20% marketshare. Things would be way different, and about a zillion times better for consumers. I don't want FOSS to replace Google, Apple, or MS. I just want them all to be competitive, and to keep each other honest.
      • by nine-times ( 778537 ) <nine.times@gmail.com> on Saturday September 24, 2005 @11:41AM (#13638287) Homepage
        Dead on. We don't need a monoculture. We don't need a single technology or a single kernel or a single philosophy behind all of software development, and so it simply doesn't make sense to demand that all software be FOSS.

        In the midst of FOSS activism (which I have no problem with - I often consider myself a FOSS advocate), people tend to take their eyes off the ball. The important goal is not to have all software be GPL'ed, but to have real open standards. In fact, I don't think we should even mind Microsoft maintaining a large market share so long as they start using open standards. As customers and potential customers, we should all demand (in whatever way we're capable) that Microsoft provide freely available documentation for their file formats, protocols, and APIs. Insofar as they fail to do so, we should consider that a problem with their product, and look for alternatives.

        The tremendous value and power of FOSS is not in having everyone use it all the time, but in anyone and everyone having the ability to use it whenever is appropriate for them. If a Linux server can be used as an easy drop-in replacement for a Windows server and OpenOffice can open/save MS Office documents, then Microsoft will not be capable of abusing their own customers. Microsoft will be forced to compete with FOSS by offering better quality and features rather than vendor lock-in, and frankly, if they would do that, I would have no problem with Microsoft whatsoever.

        Also, as much of a fan of FOSS as I am, I am also a fan of Apple and Google because I do believe they're competing by offering quality and features that people want.

  • by Anonymous Coward on Saturday September 24, 2005 @07:43AM (#13637292)

    "Microsoft's cowboy spaghetti code culture"

    If it's anything like Guns N' Roses' "The Spaghetti Incident?" then this should effectively be the last we hear of Microsoft.

  • by NeuralAbyss ( 12335 ) on Saturday September 24, 2005 @07:44AM (#13637296) Homepage
    It's interesting to hear how their software development survived in such an anarchic environment - everyone producing their own code, with ad-hoc integration. It's also a good example of how software development methodology can work, though: if everyone codes to a documented interface, software development can work on such a grand scale - even though the specifics of the specification design weren't discussed in the article.

    I personally would like to hear more about the software development procedures and methodologies used in other large projects - how successful different types of development are.

    I work for an automotive parts manufacturer, and to see the lack of consistency within the organisation's software development is disturbing. Safety-critical parts are being produced, and the level of testing between said parts varies quite considerably. Additionally, the level of oversight and adherence to software development procedures is rather bad to say the least. I just hope it's not characteristic of the industry as a whole.
    • It's interesting to hear how their software development survived in such an anarchic environment - everyone producing their own code, with ad-hoc integration. It's also a good example of how software development methodology can work, though: if everyone codes to a documented interface, software development can work on such a grand scale - even though the specifics of the specification design weren't discussed in the article.

      The impression I get from the article is that even the documented interfaces (i.e., fu

      • by NeuralAbyss ( 12335 ) on Saturday September 24, 2005 @08:01AM (#13637356) Homepage
        I have a friend at university who was recently hired by Microsoft, partially for a quality control role. While this is a single case, and in no way can be extrapolated to the whole company, from what he's said it's apparent that they're reusing a large amount of their codebase, with the dodgy bits either rewritten or modified and thoroughly tested.

        As you said, there's no way in hell you can have a 12 month rewrite. But, with any luck (for the end-users), this will hopefully turn out to be more than PR fluff.
    • by Jugalator ( 259273 ) on Saturday September 24, 2005 @08:15AM (#13637414) Journal
      I personally would like to hear more about the software development procedures and methodologies used in other large projects - how successful different types of development are.

      Not sure if this is what you were interested in, but I think Paul Thurrott has some great lengthy and detailed articles, along with some interviews with Microsoft engineers, giving some insight into the stress, problems, and achievements of various large Windows projects - with pictures of their build labs and test machines. :-)

      For example:

      Windows 2000

      Windows XP SP2

      Windows Server 2003


      A disclaimer, bias-wise: Paul Thurrott is a guy who wants Microsoft to do well, but he's not afraid of criticizing them harshly when he doesn't agree with their decisions, so I don't think it's a case of "inside stories" being too biased to be useful. He was, for example, the guy behind the quote that Windows Vista had the markings of a shipwreck after seeing Beta 1. Although he has had some missteps IMO, such as saying Windows Me should be far more reliable than Windows 98. ;-) I guess he had to eat his own words there...
  • by imipak ( 254310 ) on Saturday September 24, 2005 @07:46AM (#13637297) Journal
    From TFA:
    In 2001 Microsoft made a documentary film celebrating the creation of Windows XP, which remains the latest full update of Windows. When Mr. Allchin previewed the film, it confirmed some of his misgivings about the Windows culture. He saw the eleventh-hour heroics needed to finish the product and get it to customers. Mr. Allchin ordered the film to be burned.

    Man, that's a shame. I'd love to have seen that film. Shame on Allchin if he didn't at least demand that an archive copy be retained, even if it's only released in 20 years' time.

  • why "astonishing"? (Score:5, Interesting)

    by dankelley ( 573611 ) on Saturday September 24, 2005 @07:49AM (#13637308)
    Why is it "astonishing" that the article does a decent job of providing hard-hitting information without spin? That's what we are supposed to expect of journalists. The Wall Street Journal is supposed to be (and often is) an example of real journalism. That makes it distinct from computer magazines that rely on advertising revenue from the computer industry, and from discussion forums whose course is steered by peeves and submission sequencing.
    • by Anonymous Coward on Saturday September 24, 2005 @08:25AM (#13637448)
      It's astonishing because nobody does it anymore; real journalism is few and far between. It's gotten to this level and hardly anyone has noticed. Nowadays if you ask hard-hitting questions, they will just find someone else to be interviewed by - an interview that will have better PR results. With companies buying or owning media companies, they can just choose some of their own and build themselves and their empire up. A better question to ask is what incentive there is to do a hard-hitting interview, for both the interviewer and the interviewee? Both want to perpetuate their jobs and positive PR, but it requires criticism.
  • "Generally" (Score:4, Insightful)

    by X.25 ( 255792 ) on Saturday September 24, 2005 @07:49AM (#13637310)
    Microsoft's holy grail is a system that cranks out a new, generally bug-free version of basic Windows every few years, with frequent updates in between to add enhancements or match a competitor's offering.

    I really wish they would explain to me the difference between "generally bug-free" and "bug-free". Is the difference around 65,000 (as Win2000 had ~65,000 known bugs when launched)?
    • by justforaday ( 560408 ) on Saturday September 24, 2005 @07:54AM (#13637326)
      It's really quite simple. "Generally bug-free" means that it "usually works" "most of the time."
    • Re:"Generally" (Score:5, Insightful)

      by Malor ( 3658 ) on Saturday September 24, 2005 @09:56AM (#13637845) Journal
      Way back when, people were flipping out about the 65,000 bugs in Windows 2000. I kept saying, "No, you don't understand... this means they can COUNT the bugs now. They have a process that's good enough to detect those bugs, so they'll be able to fix them." Being able to claim with some precision that you have 65,000 bugs is a huge, huge step forward from not knowing how many you have at all. And, as it turns out, Windows 2000 was possibly the best OS Microsoft ever shipped. This was not a coincidence.

      I'm much more hopeful that Vista will be a real product after reading this article. It sounded like fluff/vaporware, but now it's starting to sound like it may have actual benefits for real people. (I likely still won't use it, because of the DRM/Palladium evilness inside, and I'll suggest to other people that they not do so either. But it may actually offer some real technical benefits along with the evil.)

      I doubt it will ever be secure. As Microsoft has spent billions demonstrating, you cannot retrofit security.

      The open source people might be able to learn from this process change at Microsoft. The 2.6 kernel has been very, very low quality, at least compared to earlier Linux releases. I have seen at least one of the problems myself... bugs in the kernel directly cost me a couple hundred dollars, because I replaced a hard drive that had nothing wrong with it at all. I was bitten by ACPI bugs, which mysteriously caused apparent hard drive failures. I figured out the problem after the new drive started failing too, but I was about $200 poorer for it. As far as I remember, I hadn't replaced non-broken hardware due to OS bugs since Win95... not exactly the best example to follow.

      I also worry about the desktop environments... they're getting so large and complex, they're starting to look like Windows. Tons of features with lots of interdependencies. I'm sure the code is a lot better than a lot of the stuff in Windows, but clean, tight code will protect against only so much bloat and overcomplex design.

      I'm starting to think that part of the reason the open source code was so very much better than Windows' was because it was a fresh start, with no backward compatibility to worry about.

      I wonder if, once the kernel, KDE, and GNOME guys have to lug around twenty years' worth of backward compatibility, they'll be exactly like Windows... bloated, buggy, and insecure. The last couple of years haven't looked too promising in that regard.
      • Re:"Generally" (Score:3, Interesting)

        by RLiegh ( 247921 ) *
        BSD lugs around nearly 30 years' worth of baggage, but I can boot and reliably run NetBSD and OpenBSD on my 2004 HP Pavilion. Linux 2.6.* sometimes will boot from the installer, but only if I disable ACPI (and quite often other things such as AGP and USB -wtf?- as well).

        Older 2.4.* releases work ok, and the BSDs work ok (except for FreeBSD 5.0-5.3).

        After having similar experiences on other computers with 2.6, I've pretty much come to the conclusion that Linux has jumped the shark; at least in terms of stabil
        • Re:"Generally" (Score:5, Insightful)

          by Malor ( 3658 ) on Saturday September 24, 2005 @02:40PM (#13639257) Journal
          Well, I'm hopeful they can nail things down and get them stable, but their focus doesn't seem to be on quality first. I think it was Rik van Riel who said that it was perfectly okay for only 1 stable release in 3 to actually be stable. I kid you not. I'd link it for you, as it's in my old comments. Unfortunately, I can't get to my old submissions, as I don't pay Slashdot anymore. So you'll have to find the quote yourself. lwn.net definitely has it somewhere in their archives.

          It's worth pointing out that the whole move of Linux into the server market was accidental. It was always being written as a desktop Unix. It just happened to be so amazingly robust that it made a dynamite server, and took over a good chunk of the internet. That'd be a good book title, "The Accidental Server". Unfortunately, the development model never changed to match the actual use of the system.

          The reason I started using Linux to begin with was because it didn't ever break... it didn't have as many features as Windows, but it just never, ever, EVER fell over. The 2.2 kernel was probably the most bulletproof piece of software I've ever run on a PC. 2.4 never got to the sheer solidity of 2.2... on good hardware it's quite robust, but I saw a number of machines where stressing it would lock it up after a few days. (from the kernel messages, it looked like it might be bugs in the (different) network drivers.) 2.6, relatively speaking, has just been a disaster. They won't leave it alone long enough to let it stabilize... they insist on jamming new code into every release, and dropping old releases very quickly. (the new 2.6.X setup.) So I can't get my bugfixes without new features if I want to use a vanilla kernel.

          People, of course, instantly bash me and say 'you're stupid, you should be using a distribution kernel'. I'm doing that now, even though I liked rolling my own, but I shouldn't have to. The dev team's attitude seems to be 'ship it and let the distros debug it'... which, as far as I'm concerned, is waving one's hand in the air, hoping that someone else will fix it. Linus' kernel should be rock-solid. It's the center around which the Linux universe turns. Their new attitude means that both Mandrake and Red Hat will have to spend time fixing the same problems, possibly in incompatible ways. And it means that programs may run on Red Hat, but not on Mandrake or vanilla Linux, or some other variation on that. There needs to be a gold standard, a One True Linux. We don't have that anymore, and I think the inevitable result will be to balkanize the community. Without that central kernel, switching from one distro to another, particularly with commercial software like Oracle, becomes much chancier. You'll end up with vendor lock-in... Oracle will run only on Red Hat's kernel, so you're stuck with Red Hat's distro. That's not supposed to happen with Open Source, but it looks nearly inevitable if we can't get a stable kernel at the center.

          Wow, that was quite a segue. Sorry about that. :)
      • "I wonder if, once the kernel, KDE, and GNOME guys have to lug around twenty years' worth of backward compatibility, they'll be exactly like Windows... bloated, buggy, and insecure."

        They do. man 2 pipe. That's not new. man 2 fork. That's not new. Read up on POSIX. That's not new. Read up on the C stdlib. That's not new.

        Nothing that has been implemented in a Linux distribution is very young. Most of it is so old that Windows was just a copy of a program called QDOS bought by a young man named Bill Gates
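
        The parent's point is easy to demonstrate: pipe(2) and fork(2), sketched below in a minimal self-contained C program, are essentially unchanged since the 1970s.

        #include <stdio.h>
        #include <stdlib.h>
        #include <string.h>
        #include <unistd.h>
        #include <sys/wait.h>

        int main(void) {
            int fds[2];                      /* fds[0]: read end, fds[1]: write end */
            if (pipe(fds) == -1) { perror("pipe"); return EXIT_FAILURE; }

            pid_t pid = fork();
            if (pid == -1) { perror("fork"); return EXIT_FAILURE; }

            if (pid == 0) {                  /* child: write one message and exit */
                close(fds[0]);
                const char *msg = "hello from 1970s-era APIs\n";
                write(fds[1], msg, strlen(msg));
                close(fds[1]);
                _exit(EXIT_SUCCESS);
            }

            close(fds[1]);                   /* parent: read the message back */
            char buf[64];
            ssize_t n = read(fds[0], buf, sizeof buf - 1);
            if (n > 0) { buf[n] = '\0'; fputs(buf, stdout); }
            close(fds[0]);
            waitpid(pid, NULL, 0);
            return EXIT_SUCCESS;
        }
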
        • by Malor ( 3658 ) on Saturday September 24, 2005 @02:05PM (#13639082) Journal
          Your entire comment appears to consist of "you're stupid. Microsoft maintains backward compatibility because of money. Linux maintains backward compatibility through 'standard engineering practices'[whatever the hell those are], and because everything in Linux is ancient. You're dumb, you're stupid, you haven't been using computers very long, go away."

          What with all the insults, you're awfully light on actual content in your reply. Ignoring those, I don't even see a clear argument. What, exactly, are you asserting? I think I see 'everything in Linux is old', but that's just so ludicrous that I'll assume I'm misunderstanding. You may want to elaborate a bit.

          By the way, I'm not likely to be an astroturfer. I expect you can probably figure out why.

          I realize that base Unix is very old. However, it's very old and very, very simple in terms of the POSIX APIs. Now, I'm a sysadmin rather than a programmer, but it has always been my understanding that POSIX was a very limited subset of the Unix libraries; if you wrote to that subset, you were guaranteed portability. From what I remember, the last time I looked (years and years ago), there just isn't a whole lot there. It's a solid set of base functions, but it's quite primitive. There's nothing like, say, DCOM, DirectX, or DirectSound. It's a solid base but, as a guess (and I invite correction from more knowledgeable people), it covers maybe 10% of the API ground handled by more modern environments. The Qt/KDE and GNOME APIs are not very old. And the Linux-specific extensions to the POSIX standard can't be older than about 12 years.

          So yes, there's an ancient standard at the base, but most modern code is going to be hitting libraries that are quite young, relatively speaking.

          All the complexity in KDE and GNOME has many of the same benefits that Windows does, like easy integration of web browsers into other applications. I wonder, though, if they're not getting themselves into the same pickle that Microsoft has. When everything is integrated and interdependent, one tiny code change can blow up an awful lot of other stuff.

          Mind you, I LIKE these desktops, and I appreciate the features very much. But the programmers of old, at the dawn of the Unix era, were some of the most phenomenally intelligent people ever. Most software work today isn't being done by the same kind of luminary. I'm fundamentally trying to make the observations that A) Microsoft has a lot of smart people too, and blew it, and B) the smart people in the open source world may be making the same mistakes, by inventing desktop systems with APIs to do everything from balancing your checkbook to flossing your teeth.

          Now, it'll be EASIER to support them in open source, because it's much easier to modify programs to match API changes. That alone will probably make a significant difference. But it doesn't change the fact that APIs don't easily go away, and lugging them around gets expensive, even in open source. (Binary compatibility is far worse.)

          I talked about Linux in that sense because I'm irritated with it, and because I was thinking about their great efforts toward binary compatibility in userspace. That's a great feature, and I appreciate it, but I wonder how much it costs, relatively speaking. I was reaching a bit, trying to be somewhat charitable about the reasons behind the poor state of the 2.6 kernel.

          If, as you appear to say, everything in Linux is ancient, and "standard engineering practices" will somehow magically make everything run correctly, then don't you think your comments are particularly damning of its code quality?
  • by yagu ( 721525 ) * <yayagu.gmail@com> on Saturday September 24, 2005 @07:51AM (#13637318) Journal

    It's interesting to juxtapose PR spin from Microsoft. At any given point in time in Microsoft's history, their stance and PR is that they are "state of the art", the most advanced, etc. Yet also at any given point in time they're badmouthing their own product, their own methodologies, from their recent past. Of course their chest thumping for their current "state" prevails, but I'm guessing down the road we're going to hear how messed up they are today, but not until they've made billions off of today's products.

  • Jim Allchin (Score:5, Informative)

    by tyates ( 869064 ) on Saturday September 24, 2005 @07:54AM (#13637324) Homepage
    One of the best books I ever read on the Microsoft code culture was "Breaking Windows: How Bill Gates Fumbled The Future Of Microsoft" by David Bank. From the book, Jim Allchin is the Windows guy who quashed Brad Silverberg and the (relatively) innovative Internet team - although ironically he was an early advocate for getting TCP/IP support in Windows. He believed that all innovation in Microsoft should take place under the Win 2k banner and that the company should just keep making Windows bigger and bigger and bigger. Hmm, maybe it got too big.
    http://www.amazon.com/exec/obidos/tg/detail/-/0743203151/qid=1127565487/sr=8-1/ref=pd_bbs_1/102-0616241-1101748 [amazon.com]
  • by ruebarb ( 114845 ) <colorache AT hotmail DOT com> on Saturday September 24, 2005 @07:55AM (#13637333)
    When I took C programming in college, one of the points our professor made was: if you like your program, rewrite it...

    the first time you write something, it's always hackneyed - and it stays that way until you figure out what you want to do and how to do it - afterwards, it becomes so much clearer to see ways to clean up the code and fix issues...

    so one of the first rules he had was that once we were almost done, we should restart our stuff - it ended up being a lot cleaner and more modular the 2nd time around...

    of course, that won't help MS, but good for the rest of ya to know ;)

    RB
    • by Anonymous Coward on Saturday September 24, 2005 @08:20AM (#13637433)

      That's terrible advice. Real-world code tends to be messy because you have to put in a lot of workarounds and bug fixes. When you rewrite something, you lose years of cumulative bugfixes. Suddenly obscure configurations are crashing, and you have no clue why, because the old code bears no resemblance to the new code, and the bearded expert on that platform has retired, so nobody is there to tell you that although the specs say foo should be a float, it actually expects an int.

      It's one of those practices that works well in college courses, but simply falls apart when applied to a project larger than a few thousand lines of code. Tell me, did this professor have actual real world experience, or was he in academia for his whole career? I'm betting on the latter.

      Instead of rewriting, you should refactor, preferably with the aid of lots of regression tests. That enables you to restructure the application slowly, without changing behaviour in unexpected ways.

      Things you should never do: rewrite. [joelonsoftware.com]
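
      A toy C illustration of the refactor-plus-regression-tests idea (parse_port is a hypothetical helper, not from any real codebase): the asserts pin down observable behavior, including accumulated edge cases, so the internals can be restructured without changing what callers see.

      #include <assert.h>
      #include <stdio.h>
      #include <stdlib.h>

      /* Parse a TCP port from a string; return -1 on any invalid input. */
      static int parse_port(const char *s) {
          if (s == NULL || *s == '\0') return -1;
          char *end;
          long v = strtol(s, &end, 10);
          if (*end != '\0' || v < 1 || v > 65535) return -1;
          return (int)v;
      }

      int main(void) {
          /* Each assert encodes behavior the refactored version must keep,
           * including the obscure edge cases accumulated over the years. */
          assert(parse_port("80") == 80);
          assert(parse_port("65535") == 65535);
          assert(parse_port("0") == -1);       /* out of range */
          assert(parse_port("65536") == -1);   /* out of range */
          assert(parse_port("8x") == -1);      /* trailing junk */
          assert(parse_port("") == -1);        /* empty string */
          assert(parse_port(NULL) == -1);      /* defensive case */
          puts("all regression tests passed");
          return 0;
      }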

      • Things you should never do: rewrite.

        Naah. Software is math and the first proof of a theorem is generally ugly. So, it can pay to start over. I am not going to say in all cases it's better to do one or the other, but sometimes rewriting is the best option. An example from my own life: I wrote a MUD with some neat AI stuff (quests that actually impact the world in large numbers) in it and now I am working with a small startup to make an MMO and started over rewriting because the way I did it was bad the first
      • When to rewrite (Score:5, Insightful)

        by Jamesday ( 794888 ) on Saturday September 24, 2005 @08:56AM (#13637596)
        It's not so much that rewriting is bad, but that there are bad times to rewrite. Really old and stable code isn't a good target. Really new code, with completely new function and an architecture which has been found not to be a good match for the real-world objective it's addressing, would be a much better target.
    • There is, as I'm sure you already know, a difference between a C program you wrote in class and an OS. The reason your C program gets better when you rewrite it is because you now have a clear view of what it should look and work like. When it comes to a behemoth like Windows, no one understands the system fully. So even if we have all these people who understand parts of the system rewriting their parts, plenty of design errors can still persist in the way the system is modularized and put together.

      So what
    • by Cthefuture ( 665326 ) on Saturday September 24, 2005 @09:01AM (#13637612)
      That might work for small college projects but the real world is a different place.

      Often the rewrite never gets completed as there is too much crap added to it.

      If you truly want to make something that works, you need to plan for an evolution of your software. That is, write the first version with a modular design that can be modified or rewritten in phases. Doing one big rewrite of a non-trivial software system is damn near impossible. It's better to evolve the software over time, always keeping a working system and slowly moving parts in the desired (presumably better) direction.

      I could write more on this but it's too early in the morning and I'm not even sure if what I wrote makes sense. ;)
    • by leonmergen ( 807379 ) <lmergen@@@gmail...com> on Saturday September 24, 2005 @10:04AM (#13637881) Homepage

      the first time you write something, it's always hackneyed - and it stays that way until you figure out what you want to do and how to do it - afterwards, it becomes so much clearer to see ways to clean up the code and fix issues...

      Ok, I'm not a C programmer myself, but I do know one thing: if you have to find out what you're going to write after you start writing it, there's something extremely wrong in your process. I mean, whatever happened to actually designing the application? Thinking about what you want to do makes for much better code, and heck, it even saves you time; but yes, it's tempting, it's very tempting to rewrite code... why? Because programmers like clean code...

      When you're writing an application over the course of, say, 6 months, and in the 6th month you look back at the code you wrote in the 1st month, you think "Oh my god, what did I do there? Look at all the mess! This can't possibly be the best way to solve it!"... but if you designed your application well, and the function does what it does, there's no need to rewrite your application - you can possibly optimize the function, but please, don't throw away code that works - it's plain silly!

      Anyway, to sum it up, the lesson I'm trying to preach: design before you code, don't throw away...

  • Amazing (Score:5, Interesting)

    by Ruie ( 30480 ) on Saturday September 24, 2005 @07:57AM (#13637341) Homepage
    The article is totally amazing:

    • I had no idea they were still doing manual builds. Was it so hard to borrow Tinderbox?
    • Still, even after the changes it takes several *days* to do a build - this is likely an indication of interdependency between components; otherwise they could have used a cluster to do it.
    • They decided to start from scratch - I'll believe it when I see it. (Hint to Microsoft: Apple used BSD.)
    • Re:Amazing (Score:5, Informative)

      by Antity-H ( 535635 ) on Saturday September 24, 2005 @08:33AM (#13637488) Homepage
      I mostly agree with your remarks, but you are mistaken on the last one: they did not restart from scratch.
      The day before in Microsoft's auditorium, Mr. Allchin had announced to hundreds of Windows engineers that they would "reset" Longhorn using a clean base of code that had been developed for a version of Windows on corporate server computers.

      From what I read on the net, the code base used was that of Windows Server 2003.
      • Re:Amazing (Score:3, Funny)

        by hhawk ( 26580 )
        Sounds like they just re-built a lot of the userland/desktop stuff,

        added automated tests (whoooo!! that's so cutting edge),

        and enforced some minimal methodology.
      • Re:Amazing (Score:4, Interesting)

        by dioscaido ( 541037 ) on Saturday September 24, 2005 @12:57PM (#13638632)
        This is indeed what happened. We are building Vista on top of the Win2k3 code, so from now on we won't have two code bases -- the less stable/secure client platform vs. the rock-solid server platform -- instead now both are one and the same... seems smart to me. Although a side effect was the 'reset' which caused the long delays.
  • by Anita Coney ( 648748 ) on Saturday September 24, 2005 @08:01AM (#13637358) Homepage
    Microsoft is not a NEAR monopoly. It is a convicted monopoly. And since that irrefutable and well-publicized fact escaped the notice of the Wall Street Journal, I can't help but smell a little bias.
    • Microsoft's 'conviction' happened several years ago - 1999, if I remember correctly. Has the world stayed the same since then? No. Things change; Microsoft was called a monopoly 6 years ago, and that may not be the case today. Labels don't stay attached forever just because you want them to.
    • by DavidinAla ( 639952 ) on Saturday September 24, 2005 @08:53AM (#13637582)
      There is sometimes a difference between what a word really means and what a court defines a word as meaning in a specific context. In MS's case, a court convicted the company of having a monopoly within the context of anti-trust law. The Wall Street Journal is using the word as it is actually defined by real people, which means to own ALL of a market. The newspaper is properly labeling reality, not showing evidence of bias one way or another. The fact that I detest MS and Windows doesn't keep me from seeing that the WSJ is just doing its job properly in saying "near monopoly." The moment you don't have ANY choice other than Windows in the market, it will be a monopoly. For now, though, the fact that I'm typing this on a Mac and can go buy as many non-Windows computers as I want says MS does NOT have a monopoly. Period.
  • by ex-geek ( 847495 ) on Saturday September 24, 2005 @08:04AM (#13637370)
    This is a Wall Street Journal article. It has no technical details whatsoever since it was written for business people.

    Just look at this quote:
    The second man Mr. Allchin tapped was Amitabh Srivastava, now 49, a fellow purist among computer scientists. A newcomer to the Windows group, Mr. Srivastava had his team draw up a map of how Windows' pieces fit together. It was 8 feet tall and 11 feet wide and looked like a haphazard train map with hundreds of tracks crisscrossing each other.

    That was just the opposite of how Microsoft's new rivals worked. Google and others developed test versions of software and shipped them over the Internet. The best of the programs from rivals were like Lego blocks -- they had a single function and were designed to be connected onto a larger whole. Google and even Microsoft's own MSN online unit could quickly respond to changes in the way people used their PCs and the Web by adding incremental improvements.

    They are comparing an operating system - which has to be backward compatible with a dozen or so earlier versions of Windows and DOS, and support oodles of devices and subsystems - with a bunch of mostly unrelated web applications and gimmicks from Google.

    All I'm getting from the article is that the "let's rewrite from scratch" crowd got the upper hand within Microsoft. But that doesn't necessarily mean they are right, or that the end result will be better than continuous improvement. At the beginning, it is easy to maintain a nice, clean and simple system. But a complex set of requirements can't always be broken down into simple Lego-like blocks, as the article suggests.
  • by Waffle Iron ( 339739 ) on Saturday September 24, 2005 @08:05AM (#13637378)
    Throughout its history, Microsoft had let thousands of programmers each produce their own piece of computer code, then stitched it together into one sprawling program.

    Microsoft's new approach: Ultra-Extreme Programming.

    Now they have taken the pair coding concept well beyond the next level. They put over 5000 developers in one auditorium, and they now write Vista together as a group. The shared display is up on the movie screen, and every coder has a wireless keyboard and mouse.

    They're going to use thousands of minds working as one to produce a single, cohesive body of code. With so much manpower on the problem, development moves at a lightning pace: once a function has been typed in, it gets refactored dozens of times within a matter of seconds.

  • by G4from128k ( 686170 ) on Saturday September 24, 2005 @08:10AM (#13637400)
    I'm sure the root cause of cowboy coding is Microsoft's quest to put check marks in feature boxes so PHBs can pick MS software as having the most "features." Back in the '80s there used to be a number of standalone outlining applications and high-quality outliners embedded in competing word processors. Then Word got an "outliner." That this "outliner" never worked, and still doesn't work to this day, is irrelevant. It enabled MS to put a check mark in the outliner feature box and eliminate users' arguments that they needed a non-MS product because they needed an outliner.

    Checkbox marketing -- about the only way to market when non-users make purchase decisions -- drives software companies to bolt on features without regard to consistency of, or destructive interactions between, features.
  • by Alsee ( 515537 ) on Saturday September 24, 2005 @08:12AM (#13637407) Homepage
    After the Windows group was able to install a workable version of the system on their PCs four days before Christmas, Mr. Srivastava says the group celebrated by not working over the holidays.

    They also like to celebrate by not having their fingers broken.

  • by Cereal Box ( 4286 ) on Saturday September 24, 2005 @08:13AM (#13637410)
    Throughout its history, Microsoft had let thousands of programmers each produce their own piece of computer code, then stitched it together into one sprawling program.

    Sounds like SOP for any massive program/OS. If you've ever been part of a truly massive product's development, you'd know what this is like. There are dozens, if not hundreds, of small groups that each specialize in a particular piece of functionality. Executives and architects determine the work items for a particular release. Responsibilities filter down the chain of command. Teams develop their work items for the release and everything is thrown together into the pot as it's done. Builds break frequently, and problems are addressed as they're encountered. Eventually testers can get their hands on decent builds, and testing/bug fixing commences during the whole process. Some ways down the road, a release finally occurs.

    Really, I don't know what the executive in the article thinks should be happening. There really isn't any other way to develop programs on the scale of Windows without the aforementioned "organized chaos". It's not a text editor, it takes numerous small teams working in a coordinated manner to produce such massive piles of code. Obviously, the more teams there are, the harder perfect coordination is to achieve. Hence, things go wrong fairly frequently. This is to be expected, IMO.
  • by ninjamonkey ( 694442 ) on Saturday September 24, 2005 @08:28AM (#13637465) Homepage

    There's just one more lesson Microsoft needs to learn from Longhorn/Vista: don't start promising features and showing PowerPoint presentations to the press until you understand the scale of the project.

    I love Google, because they rarely promise something and don't deliver. Actually, they rarely promise something. It just shows up one day and it's elegant, clean, and fast.
    • by spisska ( 796395 ) on Saturday September 24, 2005 @11:03AM (#13638140)

      I love Google, because they rarely promise something and don't deliver. Actually, they rarely promise something. It just shows up one day and it's elegant, clean, and fast.

      Hear, hear. MS holds flashy press conferences to announce products that won't ship for a year (if at all), includes laundry lists of features that will be radically pared down before release, and ultimately ships products that are, at best, incremental improvements over previous versions, although they are touted as 'revolutionary' - e.g. Win2K vs. WinXP.

      Google doesn't talk about products in preparation. They quietly release full-function betas before announcing them, and the betas offer features that really are revolutionary. No, Gmail wasn't the first web mailer, but it redefined what a web mail program was capable of. No, Google didn't make the first map, but maps.google blows everyone else away.

      Yes, there is a big difference between building something like Google Desktop Search and building a whole new filesystem and all the other changes that requires. But the point is what is promised and what is delivered.

      Google promises nothing, and delivers products that become essential. Microsoft promises the sky and moon (I thought Windows was supposed to be voice-controlled by now, and my fridge was supposed to automatically order milk when I need it), and delivers products whose importance to daily life is based primarily on the difficulty in avoiding them.

      When Google does drop the next bomb (Google TV?, GoogleFS?, Googlix OS for running a smart terminal?), you won't hear about it in a press release. You'll be an invited Beta tester.

  • Comments (Score:5, Insightful)

    by MyLongNickName ( 822545 ) on Saturday September 24, 2005 @08:47AM (#13637562) Journal
    90% of the comments I've read so far are either entirely or partially "omfg... microsoft suks!". However, read the entire article, and you are faced with an interesting situation.

    Software always has to strike a balance... between features, quality, cost and timing. All software does (sans Duke Nukem Forever). Microsoft has been very good at getting product out there with the feature sets people want (Microsoft is also very good at manipulating folks into wanting what it is able to deliver). Now they are at a crossroads: continue their current coding model and get the next couple of versions out there (relatively) inexpensively and quickly, or bite the bullet and try a new way that will make them competitive for several versions.

    Seems like an easy choice. But here you have thousands of developers whose style is being crimped. Software engineers generally want to write code, not have constraints placed on them. Add to that the fact that Google is gobbling up the best and brightest, and suddenly you wonder: if Microsoft forges forward, do they lose even more of their best engineers? They may have a better model for code development, but will they have the best coders to move forward with?

    Which leads to the final question: does Microsoft really need the "best and brightest" anymore? If so, do they need as many (in percentage terms) as they used to? Their products are mostly in the mature stage. Can a few intellectuals keep the ship moving forward? Despite what groupthink on Slashdot may indicate, 90% of coding is not revolutionary, or even evolutionary.

    Just some things to think about and watch for over the next few years.
  • by t35t0r ( 751958 ) on Saturday September 24, 2005 @09:53AM (#13637834)
    If there ever were to be a mascot for Vista, it should be a pig with M$'s trademark four-colored butterfly wings. Sort of interesting: if you look at the penguin, it has "wings" but cannot fly.
  • by roman_mir ( 125474 ) on Saturday September 24, 2005 @10:29AM (#13637989) Homepage Journal
    It is unbelievable how sad this article is. These MS 'engineers' only now started using automated integration testing, and possibly automated unit testing. They only now started writing to predetermined interfaces and producing modular code. Gates, who calls himself 'chief engineer', never cared to start doing any of it before the house of cards he calls a software production process collapsed.

    I can't get over this; I thought this must have been obvious, especially in a firm that releases products as big and complex as OSes. I have only worked in this field for 9.5 years, and in that time I delivered a bunch of projects doing exactly that: well-defined interfaces, components, automated unit testing and automated integration testing. And at MS there was no one, before the shit hit the fan, to start doing it that way - over what, 25 years?

    New process they have? New process my ass.

  • by DJ-Dodger ( 169589 ) on Saturday September 24, 2005 @11:07AM (#13638156) Homepage
    There are some more technical details on the big map of windows and the quality gates in this blog post:

    http://blogs.msdn.com/larryosterman/archive/2005/08/23/455193.aspx [msdn.com]
  • Same Old PR Spin (Score:4, Informative)

    by dcuny ( 613699 ) on Saturday September 24, 2005 @11:25AM (#13638213)
    When Microsoft comes out with a new OS, they have to convince users that they need to switch to it. This can be difficult, since customers have made a hefty investment in the technology, and tend to be pretty happy where they are.

    There's a carrot and stick approach. The carrot is that Microsoft touts all the cool new features that will make life so much easier. Features you won't be able to live without.

    Then there's the stick. Part of it is to have Office use features of the new OS, so you won't be able to perform some spiffy operation without it.

    Another part of the stick is to badmouth the prior version, but explain that all the issues being badmouthed are fixed and gone in the new OS.

    So you get stories where Microsoft "finally admits" to various things (like that DOS really does underlie Win9x, despite assurances that it was gone)... You've read them.

    There's certainly truth to what Microsoft claims, and it's nice to see real issues being addressed. For example, WinXP's move away from the Win9x base to the more solid WinNT base was a huge win for most users (although gamers complained about a lack of drivers).

    But don't be fooled - fundamentally, you're just looking at PR spin designed to create demand for a new OS.

  • Unit Tests (Score:3, Interesting)

    by guinsu ( 198732 ) on Saturday September 24, 2005 @11:40AM (#13638275)
    It seems pretty clear from the article that it's describing Microsoft implementing unit testing on a large scale, but trying to explain it in layman's terms. So they didn't have to "rewrite" everything; they just wrote unit tests for everything they could, and dropped other parts (WinFS) until they could get those properly tested. The part about "code jails" and all of that reads right out of an extreme programming book. I'm surprised no one else picked up on this.
  • by ravee ( 201020 ) on Saturday September 24, 2005 @11:41AM (#13638284) Homepage Journal
    I wonder why Microsoft can't do what Apple did with Mac OS. That is, why can't Microsoft take the FreeBSD code base and build added features onto it to create a robust OS? They could also include hooks in it so that MS Office and other software suites would run only on their OS, like Apple does with Mac OS.
    This could make their job a lot easier and could win them more patrons for their OS.

    But Microsoft has always been good at making even simple things seem very complex.

  • by tji ( 74570 ) on Saturday September 24, 2005 @12:08PM (#13638402)
    Microsoft got The Wall Street Journal to publish that free advertisement? That's incredible.

    Look at MS's big challenge now: they are a monopoly; they are not going to increase their market share any more, because they already own the market. Their challenge is getting people to stick with their stuff, despite the demonstrated long-standing problems in security.

    So, they throw in some tidbits critical of MS's past practices, because everyone is painfully aware of the problems they have had with security, viruses, etc. And they introduce our savior, Jim Allchin, who, in a miraculously short amount of time, fixed all the development issues and got the company on track producing bug-free software.

    Now, IT managers can breathe easy, assured that the next release of Windows will solve all that pains them, and will be well worth the high price MS demands.

    This article is a great demonstration of why MS is on top. They have the clout to place a piece of propaganda in a national publication that will be read by a good percentage of corporate execs. That's innovation, MS style.
  • by geminidomino ( 614729 ) * on Saturday September 24, 2005 @01:22PM (#13638792) Journal
    They've been saying forever "Windows will never be secure without a complete rewrite." Could this be their chance?
  • Love this line (Score:4, Insightful)

    by nacs ( 658138 ) on Saturday September 24, 2005 @01:58PM (#13639038) Journal
    Tiny Internet browser maker Mozilla Foundation beat Microsoft to market with browser features planned for Longhorn.
    I love how it's phrased to make it look like Microsoft had plans for all these great new features for IE7 but this bad little company "Mozilla" comes around and steals their featureset.

    If anything, Mozilla is the reason they're finally getting around to 'upgrading' IE to possibly make it a decent browser compared to Firefox.
