Why Vista Had To Be Rebuilt From Scratch
iliketrash writes "The Wall Street Journal has a long front-page article describing how Jim Allchin approached Bill Gates in July 2004 with the news that then-Longhorn, now-Vista, was 'so complex that its writers would never be able to make it run properly.' Also, the article says, 'Throughout its history, Microsoft had let thousands of programmers each produce their own piece of computer code, then stitched it together into one sprawling program. Now, Mr. Allchin argued, the jig was up. Microsoft needed to start over.' And start over they did. The article is astonishing for its frank comments from the principals, including Allchin and Gates, as well as for its description of Microsoft's cowboy spaghetti code culture."
And Microsoft rule (Score:5, Insightful)
Re:And Microsoft rule (Score:5, Insightful)
The only computer company that has reinvented itself more times than Microsoft is IBM. And both companies are, contrary to popular belief around here, very far from dead. They aren't even sick or gasping.
Re:And Microsoft rule (Score:3, Insightful)
The Lego-block analogy applies to how Apple wrote code for a while; I would go as far as saying since System 6, but more realistically System 7, with its core OS and extensions attaching to it. They invented plug-ins before browsers were even invented...
That ultimately gave us OS X, the ultimate in plug-in philosophy, from the kernel to the GUI.
Mac OS X not that modular (Score:5, Informative)
Mac OS X is not that modular. GNU Hurd is far more modular, and so is GNU/Linux.
Mac OS X's kernel is not modular at all. It has conflated the Mach microkernel, which the Hurd is already abandoning for its bad performance, with the monolithic BSD kernel. The result is something just as monolithic as BSD, but much larger, more complex and slower. Linux is not as fast or simple as BSD, but it is still much faster than Mac OS X; and both are just as modular.
In contrast, the Hurd on Mach is a little bit slower but much more modular, and the new L4 version has the potential to be much faster and still more modular, because it is a true microkernel with multiple servers.
The Mac OS X GUI is not modular at all; X is.
Re:Mac OS X not that modular (Score:3, Insightful)
A low-bandwidth, responsive remote desktop is a bullet point that modern OSes should be able to meet. The capability most cer
Re:Mac OS X not that modular (Score:3, Interesting)
Did you know, by the way, that a system can be modular at the source code level, and then (based upon a compilation flag) be built such that either (A) both regions are in kernel space, or (B) one region is in kernel space and the other is in user space? The former would use a very efficient in
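To make that concrete, here is a minimal, hypothetical C sketch (all names invented for illustration) of one source file that builds either way depending on a compilation flag:

    /* region_comm.c -- hypothetical sketch: one source, two build modes.
     * Compiled with -DIN_KERNEL, both regions live in kernel space and
     * talk through a direct call; compiled without it, the second region
     * conceptually runs in user space and the call goes through a stub
     * that stands in for message passing. */
    #include <stdio.h>

    static int handle_request(int req)     /* the "other region" */
    {
        return req * 2;                    /* stand-in for real work */
    }

    #ifdef IN_KERNEL
    /* Build A: both regions in kernel space -- a plain function call. */
    static int dispatch(int req) { return handle_request(req); }
    #else
    /* Build B: the second region in user space -- in a real system this
     * stub would marshal req into a message, cross the boundary, and
     * unmarshal the reply; here it only marks the crossing. */
    static int dispatch(int req)
    {
        fprintf(stderr, "(crossing kernel/user boundary)\n");
        return handle_request(req);
    }
    #endif

    int main(void)
    {
        printf("reply: %d\n", dispatch(21));
        return 0;
    }

Built with -DIN_KERNEL the dispatch is a direct call; built without the flag, the same call site pays for a (simulated) boundary crossing. That is the whole trade being described.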
Re:Mac OS X not that modular (Score:3, Insightful)
Can you say monolithic kernel and UI? Nothing like the Hurd or X. You can dislike microkernels and X, but you can't call Mac OS X the ultimate plug-in architecture.
Re:Mac OS X not that modular (Score:4, Funny)
Re:And Microsoft rule (Score:5, Interesting)
According to their marketing and PR departments, anyway.
That ultimately gave us OS X, the ultimate in plug-in philosophy, from the kernel to the GUI.
Apple didn't give us OS X. The kernel came from CMU (an open source project), and NeXT and Apple spent the last 20 years making it less modular. The GUI software architecture came from NeXT, borrowed heavily from Smalltalk, and is client-server, like X11, only not as well architected or as efficient.
In fact, Apple's own systems programming staff screwed up so badly that Apple had to go out and buy a new operating system; all their attempts to develop a next generation Macintosh OS in-house failed.
Re:And Microsoft rule (Score:4, Informative)
You misspelled executive management. Apple had plenty of fine programming talent who would have been happy to execute on a strategy. Any strategy.
It may surprise you to learn that many programmers at Apple -- including key members of the Cocoa team, the Carbon team, and the IOKit team -- worked on Copland. Difference between Copland and Mac OS X? Executive management. Define a goal and stick to it. Q.E.D.
In fact, a non-trivial amount of code and concepts from Copland is recycled in Mac OS X (excluding Classic for the purpose of this discussion
Re:And Microsoft rule (Score:4, Interesting)
The MacOS extension mechanism was nothing like "plug-ins". There was no defined "extension API" as with a browser; they were system call traps, and they often relied totally on undocumented behavior.
Someone could write such extensions for any OS, but it's generally considered to be a bad practice, as the unstable, conflicting mess of MacOS extensions proved.
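For those who never saw it, the pattern looks roughly like the following hypothetical C sketch. The real mechanism patched the 68k trap dispatch table through Toolbox calls, but the idea is the same: save the old handler, install your own, chain to the original, and hope no other extension disagrees.

    /* trap_patch.c -- hypothetical sketch of trap patching, the mechanism
     * classic MacOS extensions used. There is no registration API: the
     * "extension" just swaps a pointer in a global dispatch table. */
    #include <stdio.h>

    typedef long (*trap_handler)(long arg);

    static trap_handler trap_table[256];   /* stand-in dispatch table */
    static trap_handler original_open;     /* saved by the "extension" */

    static long real_open(long arg)        /* the OS's own handler */
    {
        printf("OS open(%ld)\n", arg);
        return 0;
    }

    static long patched_open(long arg)     /* the extension's wrapper */
    {
        printf("extension saw open(%ld)\n", arg);  /* undocumented games here */
        return original_open(arg);                 /* chain to whoever was there */
    }

    int main(void)
    {
        trap_table[0xA0] = real_open;      /* the system as shipped */

        /* what an extension did at boot: */
        original_open = trap_table[0xA0];
        trap_table[0xA0] = patched_open;

        trap_table[0xA0](42);              /* a "system call" */
        return 0;
    }

Two extensions patching the same trap in the wrong order was exactly how the famous extension conflicts happened.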
Re: (Score:3, Informative)
Re:And Microsoft rule (Score:3, Insightful)
yes, very competently managed (Score:5, Interesting)
Why is that a problem? Because their management, sales, and marketing are so good that their technology doesn't have to be. They can ship software with security holes, bugs, poor usability, and bad design, but the non-technical part of the company will somehow manage to still sell it and make a bundle on it.
Re:And Microsoft rule (Score:5, Insightful)
This is probably one of two things. Option one: he's telling the truth and they have re-written the core parts. That won't fix the vast mass of code sitting on top of the core, which relies on the way things used to work.
The other option is that this is the latest round of "we've fixed it this time, honest". The result of that is left as an exercise for the reader.
Re:And Microsoft rule (Score:3, Informative)
So they took the clean code they had from their server core (maybe from the next server edition, maybe from 2003?) and then used that to create Longhorn. So no, they didn't start from scratch, but they aren't building upon the XP quicksand foundation e
Re:And Microsoft rule (Score:3, Interesting)
Re:And Microsoft rule (Score:4, Insightful)
You don't do it all at once... you rewrite bits of it progressively until the whole code has been refreshed (from experience it'll be smaller and faster... devs always write better code the second time around).
Pointless. There is nothing fundamentally better about freshly written code. Indeed it tends to have far more defects than code that has matured for a number of years and had the defects knocked out of it.
Size? Pretty much irrelevant now. The difference in size between code written by the last person and the new person, or by the same person twice, will not be enough to matter. Storage and memory are essentially free for the few K difference in the size of the code.
Faster? If you need more speed, you profile the software and just fix the small areas of code that are executed most of the time. A rewrite wouldn't give nearly as much benefit, and would require profiling at the end of it anyway. Meanwhile your company has wasted a year or three on the rewrite (even if done bit by bit as you are advocating) that could have been spent on writing other products.
Re:And Microsoft rule (Score:3, Insightful)
It's taken the best part of a decade to get the rewrite to an acceptable state, and the company that started it is now dead.
More here (but note the article is 5 years old)
http://www.joelonsoftware.com/articles/fog0000000069.html [joelonsoftware.com]
Far from proving the other poster right, your example is a perfect illustration of why he's wrong.
Re:And Microsoft rule (Score:3, Interesting)
IMHO, what really killed them was not the code rewrite per se, but attempting to be an early pioneer in open source
Re:And Microsoft rule (Score:5, Insightful)
Every piece of software starts with a clean, elegant structure - in the mind of whoever created it. Over time some of their assumptions prove false, and more importantly, many of the "true believers" who originally engineered the system move on. The inevitable result is that the next wave of developers have a burning urge to throw it out and start from scratch. Virtually all developers want to throw out the code they maintain and start from scratch. As this faction gains momentum, what do you think they say about the software? It sucks, it's not engineered, it's not maintainable, and so on. There's probably some truth to it, but a lot of it is people making an argument to justify doing what they want.
How the story tracks (Score:5, Interesting)
Let me put my two cents in as to why the article's storyline doesn't quite track. If Mr. Allchin, despite massive institutional inertia, gave the pig winglets and put it back on track to actually being releasable, then we're missing the motive for why he'll leave on Vista D-Day and why the company wouldn't fight to keep him. In some sense, the article is the story Microsoft wishes to tell, which is "we were writing bad code, but we've fixed that now" (and look at the bruises: no pain, no gain, right?), which is what the parent post suspects.
Now I suspect that the interviews took place before the Microsofta est omnis divisa in partes tres announcement, and there was no desire from Microsoft to have Mr. Allchin candidly describe his reasons for retirement (and maybe Mr. Allchin has a book up his sleeve), so off to press with this peek into the hallowed halls of Redmond.
One quibble I would have with the article is its suggestion that Mr. Gates, as Chief Software Architect, has two paradoxical duties to reconcile: coming up with innovations and putting down unrealistic projects. A lot of the candid reporting I've seen is that there's a third element he practices with zeal, which is to grind into a fine powder any idea he believes shakes a stick at the cash cows.
One implication of the story is that in Summer 2004 Bill Gates didn't know that one of the cash cows was flatlining. There's a thought to ponder.
Re:And Microsoft rule (Score:4, Insightful)
Hah. I'm still trying to count the number of times I've heard "Yeah, we admit that everything so far has been kinda crap, but we've sorted it out this time..." from them.
Re:And Microsoft rule (Score:4, Interesting)
Re:Oh please (Score:3, Interesting)
Microsoft should fear FOSS, not google.. (Score:3, Interesting)
To quote
Microsoft's greatest enemies now are still two for-profit companies - Google and Apple. I'll rest easier when FOSS replaces them (as was promised in 1999). Instead it's just a new master instead of the old one.
Re:Microsoft should fear FOSS, not google.. (Score:5, Insightful)
In the midst of FOSS activism (which I have no problem with, being a FOSS advocate myself), people tend to take their eyes off the ball. The important goal is not to have all software be GPL'ed, but to have real open standards. In fact, I don't think we should even mind Microsoft maintaining a large market share so long as they start using open standards. As customers and potential customers, we should all demand (in whatever way we're capable) that Microsoft provide freely available documentation for their file formats, protocols, and APIs. Insofar as they fail to do so, we should consider that a problem with their product, and look for alternatives.
The tremendous value and power of FOSS is not in having everyone use it all the time, but in anyone and everyone having the ability to use it whenever it is appropriate for them. If a Linux server can be used as an easy drop-in replacement for a Windows server and OpenOffice can open/save MS Office documents, then Microsoft will not be capable of abusing their own customers. Microsoft will be forced to compete with FOSS by offering better quality and features rather than vendor lock-in, and frankly, if they would do that, I would have no problem with Microsoft whatsoever.
Also, as much of a fan of FOSS as I am, I am also a fan of Apple and Google because I do believe they're competing by offering quality and features that people want.
Re:Microsoft should fear FOSS, not google.. (Score:3, Funny)
another Spaghetti Incident? (Score:5, Funny)
"Microsoft's cowboy spaghetti code culture"
If it's anything like Guns N' Roses' "The Spaghetti Incident?" then this should effectively be the last we hear of Microsoft.
Anarchy of Development (Score:5, Interesting)
I personally would like to hear more about the software development procedures and methodologies used in other large projects - how successful different types of development are.
I work for an automotive parts manufacturer, and to see the lack of consistency within the organisation's software development is disturbing. Safety-critical parts are being produced, and the level of testing between said parts varies quite considerably. Additionally, the level of oversight and adherence to software development procedures is rather bad to say the least. I just hope it's not characteristic of the industry as a whole.
Re:Anarchy of Development (Score:3, Interesting)
The impression I get from the article is that even the documented interfaces (ie, fu
Re:Anarchy of Development (Score:5, Informative)
As you said, there's no way in hell you can have a 12 month rewrite. But, with any luck (for the end-users), this will hopefully turn out to be more than PR fluff.
Re:Anarchy of Development (Score:5, Informative)
Not sure if this is what you were interested in, but I think Paul Thurrott has some great lengthy and detailed articles, along with some interviews with Microsoft engineers, for some insight into the stress, problems, and achievements of various large Windows projects, with pictures of their build labs and test machines.
For example:
Windows 2000
Windows XP SP2
Windows Server 2003
A disclaimer, bias-wise: Paul Thurrott is a guy who wants Microsoft to do well, but he's not afraid of criticizing them harshly when he doesn't agree with their decisions, so I don't think this is a case of "inside stories" being too biased to be useful. He was, for example, the guy behind the quote that Windows Vista had the makings of a shipwreck after seeing Beta 1. Although he has had some missteps IMO, such as saying Windows Me should be far more reliable than Windows 98.
Documentary film burned?! (Score:5, Insightful)
Man, that's a shame. I'd love to have seen that film. Shame on Allchin if he didn't at least demand that an archive copy be retained, even if it's only released in 20 years' time.
why ''astonishing''? (Score:5, Interesting)
Re:why ''astonishing''? (Score:5, Insightful)
"Generally" (Score:4, Insightful)
I really wish they would explain to me the difference between "generally bug-free" and "bug-free". Is the difference around 65,000 (as Win2000 had ~65,000 known bugs when it launched)?
Re:"Generally" (Score:5, Funny)
Re:"Generally" (Score:5, Insightful)
I'm much more hopeful that Vista will be a real product after reading this article. It sounded like fluff/vaporware, but now it's starting to sound like it may have actual benefits for real people. (I likely still won't use it, because of the DRM/Palladium evilness inside, and I'll suggest to other people that they not do so either. But it may actually offer some real technical benefits along with the evil.)
I doubt it will ever be secure. As Microsoft has spent billions demonstrating, you cannot retrofit security.
The open source people might be able to learn from this process change at Microsoft. The 2.6 kernel has been very, very low quality, at least compared to earlier Linux releases. Even I myself have seen at least one of the problems.... bugs in the kernel directly cost me a couple hundred dollars, because I replaced a hard drive when it had nothing wrong with it at all. I was bitten by ACPI bugs, which mysteriously caused hard drive failures. I figured out the problem after the new drive started failing too, but I was about $200 poorer for it. As far as I remember, I haven't replaced non-broken hardware due to OS bugs since Win95... not exactly the best example to follow.
I also worry about the desktop environments... they're getting so large and complex, they're starting to look like Windows. Tons of features with lots of interdependencies. I'm sure the code is a lot better than a lot of the stuff in Windows, but clean, tight code will protect against only so much bloat and overcomplex design.
I'm starting to think that part of the reason the open source code was so very much better than Windows' was because it was a fresh start, with no backward compatibility to worry about.
I wonder if, once the kernel, KDE, and GNOME guys have to lug around twenty years' worth of backward compatibility, they'll be exactly like Windows... bloated, buggy, and insecure. The last couple of years haven't looked too promising in that regard.
Re:"Generally" (Score:3, Interesting)
Older 2.4.* releases work ok, and the BSDs work ok (except for FreeBSD 5.0-5.3).
After having similar experiences on other computers with 2.6, I've pretty much come to the conclusion that Linux has jumped the shark; at least in terms of stabil
Re:"Generally" (Score:5, Insightful)
It's worth pointing out that the whole move of Linux into the server market was accidental. It was always being written as a desktop Unix. It just happened to be so amazingly robust that it made a dynamite server, and took over a good chunk of the internet. That'd be a good book title, "The Accidental Server". Unfortunately, the development model never changed to match the actual use of the system.
The reason I started using Linux to begin with was because it didn't ever break... it didn't have as many features as Windows, but it just never, ever, EVER fell over. The 2.2 kernel was probably the most bulletproof piece of software I've ever run on a PC. 2.4 never got to the sheer solidity of 2.2... on good hardware it's quite robust, but I saw a number of machines where stressing it would lock it up after a few days. (from the kernel messages, it looked like it might be bugs in the (different) network drivers.) 2.6, relatively speaking, has just been a disaster. They won't leave it alone long enough to let it stabilize... they insist on jamming new code into every release, and dropping old releases very quickly. (the new 2.6.X setup.) So I can't get my bugfixes without new features if I want to use a vanilla kernel.
People, of course, instantly bash me and say 'you're stupid, you should be using a distribution kernel'. I'm doing that now, even though I liked rolling my own, but I shouldn't have to. The dev team's attitude seems to be 'ship it and let the distros debug it'... which, as far as I'm concerned, is waving one's hand in the air, hoping that someone else will fix it. Linus' kernel should be rock-solid. It's the center around which the Linux universe turns. Their new attitude means that both Mandrake and Red Hat will have to spend time fixing the same problems, possibly in incompatible ways. And it means that programs may run on Red Hat, but not on Mandrake or vanilla Linux, or some other variation on that. There needs to be a gold standard, a One True Linux. We don't have that anymore, and I think the inevitable result will be to balkanize the community. Without that central kernel, switching from one distro to another, particularly with commercial software like Oracle, becomes much chancier. You'll end up with vendor lock-in... Oracle will run only on Red Hat's kernel, so you're stuck with Red Hat's distro. That's not supposed to happen with Open Source, but it looks nearly inevitable if we can't get a stable kernel at the center.
Wow, that was quite a segue. Sorry about that.
Do you even understand Unix? (Score:3, Insightful)
They do. man 2 pipe. That's not new. man 2 fork. That's not new. Read up on POSIX. That's not new. Read up on the C stdlib. That's not new.
Nothing that has been implemented in a Linux distribution is very young. Most of it is so old that Windows was just a copy of a program called QDOS bought by a young man named Bill Gat
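The grandparent is at least right that those primitives are ancient. A self-contained example using nothing beyond the classic calls cited above, which still compiles unchanged today:

    /* pipe_fork.c -- pipe(2) and fork(2), the decades-old POSIX
     * primitives the parent post cites, used to send one message
     * from a child process to its parent. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <sys/wait.h>

    int main(void)
    {
        int fd[2];
        if (pipe(fd) == -1) { perror("pipe"); exit(1); }

        pid_t pid = fork();
        if (pid == -1) { perror("fork"); exit(1); }

        if (pid == 0) {                    /* child: write end */
            close(fd[0]);
            const char *msg = "hello from the child\n";
            write(fd[1], msg, strlen(msg));
            close(fd[1]);
            _exit(0);
        }

        close(fd[1]);                      /* parent: read end */
        char buf[64];
        ssize_t n = read(fd[0], buf, sizeof buf - 1);
        if (n > 0) { buf[n] = '\0'; fputs(buf, stdout); }
        close(fd[0]);
        wait(NULL);
        return 0;
    }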
Re:Do you even understand Unix? (Score:5, Insightful)
What with all the insults, you're awfully light on actual content in your reply. Ignoring those, I don't even see a clear argument. What, exactly, are you asserting? I think I see 'everything in Linux is old', but that's just so ludicrous that I'll assume I'm misunderstanding. You may want to elaborate a bit.
By the way, I'm not likely to be an astroturfer. I expect you can probably figure out why.
I realize that base Unix is very old. However, it's very old and very, very simple in terms of the POSIX APIs. Now, I'm a sysadmin, rather than a programmer, but it has always been my understanding that POSIX was a very limited subset of the Unix libraries; if you wrote to that subset, you were guaranteed portability. From what I remember, the last time I looked (years and years ago), there just isn't a whole lot there. It's a solid set of base functions, but it's quite primitive. There's nothing like, say, DCOM, or DirectX or DirectSound. It's a solid base, but as a guess, (and I invite correction from more knowledgeable people), it covers maybe 10% of the API ground handled by more modern environments. The QT/KDE and GNOME APIs are not very old. And the Linux-specific extensions to the POSIX standard can't be older than about 12 years.
So yes, there's an ancient standard at the base, but most modern code is going to be hitting libraries that are quite young, relatively speaking.
All the complexity in KDE and GNOME has many of the same benefits that Windows does, like easy integration of web browsers into other applications. I wonder, though, if they're not getting themselves into the same pickle that Microsoft has. When everything is integrated and interdependent, one tiny code change can blow up an awful lot of other stuff.
Mind you, I LIKE these desktops, and I appreciate the features very much. But the programmers of old, at the dawn of the Unix era, were some of the most phenomenally intelligent people ever. Most software work today isn't being done by the same kind of luminary. I'm fundamentally trying to make the observations that A) Microsoft has a lot of smart people too, and blew it, and B) the smart people in the open source world may be making the same mistakes, by inventing desktop systems with APIs to do everything from balancing your checkbook to flossing your teeth.
Now, it'll be EASIER to support them in open source, because it's much easier to modify programs to match API changes. That alone will probably make a significant difference. But it doesn't change the fact that APIs don't easily go away, and lugging them around gets expensive, even in open source. (Binary compatibility is far worse.)
I talked about Linux in that sense because I'm irritated with it, and because I was thinking about their great efforts toward binary compatibility in userspace. That's a great feature, and I appreciate it, but I wonder how much it costs, relatively speaking. I was reaching a bit, trying to be somewhat charitable about the reasons behind the poor state of the 2.6 kernel.
If, as you appear to say, everything in Linux is ancient, and "standard engineering practices" will somehow magically make everything run correctly, then don't you think your comments are particularly damning of its code quality?
tale of two companies, same campus (Score:5, Insightful)
It's interesting to juxtapose PR spin from Microsoft. At any given point in time in Microsoft's history, their stance and PR is that they are "state of the art", the most advanced, etc. Yet also at any given point in time they're badmouthing their own product, their own methodologies, from their recent past. Of course their chest thumping for their current "state" prevails, but I'm guessing down the road we're going to hear how messed up they are today, but not until they've made billions off of today's products.
Jim Allchin (Score:5, Informative)
http://www.amazon.com/exec/obidos/tg/detail/-/074
one of the first rules of programming - start over (Score:5, Interesting)
the first time you write something, it's always hackneyed - it stays that way till you figure out what you want to do and how to do it - afterwards, it becomes so much clearer to see ways to clean up the code and fix issues...
so one of the first rules he had was: once we were almost done, restart our stuff - it ended up being a lot cleaner/modular the 2nd time around...
of course, that won't help MS, but good for the rest of ya to know
RB
Re:one of the first rules of programming - start o (Score:4, Insightful)
That's terrible advice. Real-world code tends to be messy because you have to put in a lot of workarounds and bug fixes. When you rewrite something, you lose years of cumulative bugfixes. Suddenly obscure configurations are crashing, and you have no clue why, because the old code bears no resemblance to the new code, and the beardly expert on that platform has retired, so nobody is there to tell you that although the specs say foo should be a float, it actually expects an int.
It's one of those practices that works well in college courses, but simply falls apart when applied to a project larger than a few thousand lines of code. Tell me, did this professor have actual real world experience, or was he in academia for his whole career? I'm betting on the latter.
Instead of rewriting, you should refactor, preferably with the aid of lots of regression tests. That enables you to restructure the application slowly, without changing behaviour in unexpected ways.
Things you should never do: rewrite. [joelonsoftware.com]
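A toy illustration of that refactor-under-tests discipline (everything below is invented for the example): pin the observable behavior down first, including the ugly edge cases that years of bug fixes produced, then restructure with the tests rerunning at every step.

    /* regress.c -- toy sketch: regression tests guard a function
     * while its body is being refactored. */
    #include <assert.h>
    #include <stdio.h>

    /* the function being refactored; the zero-quantity workaround is
     * a field-reported bug fix a rewrite would silently lose */
    static int price_in_cents(int qty)
    {
        if (qty <= 0)
            return 0;          /* workaround: some clients send 0 */
        return qty * 150;
    }

    static void regression_tests(void)
    {
        assert(price_in_cents(0)  == 0);   /* the old bug fix */
        assert(price_in_cents(-3) == 0);
        assert(price_in_cents(1)  == 150);
        assert(price_in_cents(4)  == 600);
    }

    int main(void)
    {
        regression_tests();    /* rerun after every restructuring step */
        puts("all regression tests pass");
        return 0;
    }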
Re:one of the first rules of programming - start o (Score:3, Interesting)
Naah. Software is math and the first proof of a theorem is generally ugly. So, it can pay to start over. I am not going to say in all cases it's better to do one or the other, but sometimes rewriting is the best option. An example from my own life: I wrote a MUD with some neat AI stuff (quests that actually impact the world in large numbers) in it and now I am working with a small startup to make an MMO and started over rewriting because the way I did it was bad the first
When to rewrite (Score:5, Insightful)
Re:one of the first rules of programming - start o (Score:3, Interesting)
So what
Re:one of the first rules of programming - start o (Score:4, Interesting)
Often the rewrite never gets completed as there is too much crap added to it.
If you truly want to make something that works, you need to plan for an evolution of your software. That is, write the first version with a modular design that can be modified or rewritten in phases. Doing one big rewrite of a non-trivial software system is damn near impossible. It's better to evolve the software over time, always keeping a working system and slowly moving parts in the desired (presumably better) direction.
I could write more on this but it's too early in the morning and I'm not even sure if what I wrote makes sense.
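It makes sense. One concrete way the plan-for-evolution advice cashes out in C is to put each part behind a small struct of function pointers, so a later phase can replace the implementation without touching the callers. A hypothetical sketch:

    /* storage_iface.c -- hypothetical sketch of designing for phased
     * rewrites: callers program against a small interface struct, so
     * the implementation behind it can be replaced later without the
     * rest of the system noticing. */
    #include <stdio.h>

    struct storage_ops {                   /* the stable boundary */
        int (*put)(const char *key, int value);
        int (*get)(const char *key);
    };

    /* v1 implementation: deliberately trivial, shipped first */
    static int slot;                       /* one slot is enough for v1 */
    static int v1_put(const char *key, int value) { (void)key; slot = value; return 0; }
    static int v1_get(const char *key)            { (void)key; return slot; }
    static const struct storage_ops v1 = { v1_put, v1_get };

    /* a v2 (hashing, disk, whatever) can be dropped in later, as long
     * as it fills in the same struct */

    int main(void)
    {
        const struct storage_ops *store = &v1;   /* pick implementation */
        store->put("answer", 42);
        printf("answer = %d\n", store->get("answer"));
        return 0;
    }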
Re:one of the first rules of programming - start o (Score:4, Insightful)
the first time you write something, it's always hackneyed - it stays that way till you figure out what you want to do and how to do it - afterwards, it becomes so much clearer to see ways to clean up the code and fix issues...
Ok, I'm not a C programmer myself, but I do know one thing: if you have to find out what you're going to write after you start writing it, there's something extremely wrong with your process. I mean, whatever happened to actually designing the application? Thinking about what you want to do makes for much better code, and heck, it even saves you time; but yes, it's tempting, very tempting, to rewrite code... why? Because programmers like clean code...
When you're writing an application over the process of say, what, 6 months, and at the 6th month you look back at the code you wrote in the 1st month, you think "Oh my god, what did I do there? Look at all the mess! This can't possibly be the best way to solve it!"... but if you designed your application well, and the function does what it does, there's no need to rewrite your application - you can possibly optimize the function, but please, don't throw away code that works - it's plain silly!
Anyway, to sum it up, the lesson I'm trying to preach: design before you code, don't throw away...
Amazing (Score:5, Interesting)
Re:Amazing (Score:5, Informative)
From what I read on the net, the code base used was that of Windows Server 2003.
Re:Amazing (Score:3, Funny)
Automated tests (whoooo!! that's so cutting edge)
And enforced some minimum methodology
Re:Amazing (Score:4, Interesting)
FTA: "near-monopoly" (Score:5, Insightful)
Re:FTA: "near-monopoly" (Score:3, Interesting)
Semantics and journalism (Score:5, Interesting)
Not a good article to base Microsoft bashing on (Score:5, Interesting)
Just look at this quote:
They are comparing an operating system, which has to be backward compatible with a dozen or so earlier versions of Windows and DOS and support oodles of devices and subsystems, with a bunch of mostly unrelated web applications and gimmicks from Google.
All I'm getting from the article is that the "let's rewrite from scratch" crowd got the upper hand within Microsoft. But that doesn't necessarily mean that they are right or that the end result will be better than continuous improvements. At the beginning, it is easy to maintain a nice, clean and simple system. But a complex set of requirements can't always be broken down into simple Lego-like blocks, as the article suggests.
Ultra-Extreme Programming (Score:5, Funny)
Microsoft's new approach: Ultra-Extreme Programming.
Now they have taken the pair programming concept well beyond the next level. They put over 5,000 developers in one auditorium, and they now write Vista together as a group. The shared display is up on the movie screen, and every coder has a wireless keyboard and mouse.
They're going to use thousands of minds working as one to produce a single, cohesive body of code. With so much manpower on the problem, development moves at a lightning pace: once a function has been typed in, it gets refactored dozens of times within a matter of seconds.
Re:Ultra-Extreme Programming (Score:3, Funny)
Refactoring is Futile.
Feature lists, PHBs, and cowboy coding (Score:5, Insightful)
Checkbox marketing -- about the only way to market when non-users make purchase decisions -- drives software companies to bolt on features without regard to consistency of, or destructive interactions between, features.
Celebration! (Score:5, Funny)
They also like to celebrate by not having their fingers broken.
-
This is what normally happens... (Score:5, Interesting)
Sounds like SOP for any massive program/OS. If you've ever been part of a truly massive product's development, you'd know what this is like. There are dozens, if not hundreds, of small groups that each specialize in a particular piece of functionality. Executives and architects determine the work items for a particular release. Responsibilities filter down the chain of command. Teams develop their work items for the release and everything is thrown together into the pot as it's done. Builds break frequently, and problems are addressed as they're encountered. Eventually testers can get their hands on decent builds, and testing/bug fixing commences during the whole process. Some ways down the road, a release finally occurs.
Really, I don't know what the executive in the article thinks should be happening. There really isn't any other way to develop programs on the scale of Windows without the aforementioned "organized chaos". It's not a text editor, it takes numerous small teams working in a coordinated manner to produce such massive piles of code. Obviously, the more teams there are, the harder perfect coordination is to achieve. Hence, things go wrong fairly frequently. This is to be expected, IMO.
Re:This is what normally happens... (Score:3, Funny)
Getting into trouble.. (Score:5, Insightful)
There's just one more lesson Microsoft needs to learn from Longhorn/Vista: don't start promising features and showing PowerPoint presentations to the press until you understand the scale of the project.
I love Google, because they rarely promise something and don't deliver. Actually, they rarely promise something. It just shows up one day and it's elegant, clean, and fast.
Re:Getting into trouble.. (Score:5, Interesting)
I love Google, because they rarely promise something and don't deliver. Actually, they rarely promise something. It just shows up one day and it's elegant, clean, and fast.
Hear, hear. MS holds flashy press conferences to announce products that won't ship for a year (if at all), includes laundry lists of features that will be radically pared down before release, and ultimately ships products that are, at best, incremental improvements over previous versions, although they are touted as 'revolutionary', e.g. Win2k vs. WinXP.
Google doesn't talk about products in preparation. They quietly release full-function betas before announcing them, and the betas offer features that really are revolutionary. No, Gmail wasn't the first web mailer, but it redefined what a web mail program was capable of. No, Google didn't make the first map, but maps.google blows everyone else away.
Yes, there is a big difference between building something like Google Desktop Search and building a whole new filesystem and all the other changes that requires. But the point is what is promised and what is delivered.
Google promises nothing, and delivers products that become essential. Microsoft promises the sky and moon (I thought Windows was supposed to be voice-controlled by now, and my fridge was supposed to automatically order milk when I need it), and delivers products whose importance to daily life is based primarily on the difficulty in avoiding them.
When Google does drop the next bomb (Google TV?, GoogleFS?, Googlix OS for running a smart terminal?), you won't hear about it in a press release. You'll be an invited Beta tester.
Comments (Score:5, Insightful)
Software always has to strike a balance... between features, quality, cost and timing. All software does (sans Duke Nukem Forever). Microsoft has been very good at getting product out there with the feature sets people want (Microsoft is also very good at manipulating folks into wanting what it is able to deliver). Now they are at a crossroads: continue their current coding model and get the next couple of versions out there (relatively) inexpensively and quickly, or bite the bullet and try a new way that will make them competitive for several versions.
Seems like an easy choice. But here you have thousands of developers whose style is being crimped. Software engineers generally want to write code, not have constraints placed on them. Add the fact that Google is gobbling up the best and brightest, and suddenly you wonder: if Microsoft forges forward, does it lose even more of its best engineers? They may have a better model for code development, but will they have the best coders to move forward with?
Which leads to the final question: does Microsoft really need the "best and brightest" anymore? If so, do they need as many (in percentage terms) as they used to? Their products are mostly in the mature stage. Can a few intellectuals keep the ship moving forward? Despite what groupthink on Slashdot may indicate, 90% of coding is not revolutionary, or even evolutionary.
Just some things to think about and watch for over the next few years.
Vista will now always be known as the flying pig (Score:3, Funny)
almost unbelievable (Score:5, Insightful)
I can't get over this. I thought this must have been obvious, especially in a firm that releases products as big and complex as OSes. I only worked in this field for 9.5 years, and in that time I delivered a bunch of projects doing exactly that: well-defined interfaces, components, automated unit testing and automated integration testing. And at MS there was no one, before the shit hit the fan, to start doing it that way, over what, 25 years?
New process they have? New process my ass.
More technical details (Score:4, Informative)
http://blogs.msdn.com/larryosterman/archive/2005/
Same Old PR Spin (Score:4, Informative)
There's a carrot and stick approach. The carrot is that Microsoft touts all the cool new features that will make life so much easier. Features you won't be able to live without.
Then there's the stick. Part of it is to have Office use features of the new OS, so you won't be able to perform some spiffy operation without it.
Another part of the stick is to badmouth the prior version, but explain that all the issues being badmouthed are fixed and gone in the new OS.
So you get stories where Microsoft "finally admits" to various things (like that DOS really does underlie Win9x, despite assurances that it was gone)... You've read them.
There's certainly truth to what Microsoft claims, and it's nice to see real issues being addressed. For example, WinXP's move away from the Win9x base to the more solid WinNT base was a huge win for most users (although gamers complained about a lack of drivers).
But don't be fooled - fundamentally, you're just looking at PR spin designed to create demand for a new OS.
Unit Tests (Score:3, Interesting)
Why can't microsoft rebuild windows like Apple did (Score:4, Insightful)
This could make their job a lot easier and could get them more patrons for their OS.
But Microsoft has always been good at making even simple things seem very complex.
Well placed propaganda. (Score:3, Informative)
Look at MS's big challenge now: they are a monopoly, and they are not going to increase their market share any more, because they already own the market. Their challenge is getting people to stick with their stuff, despite the demonstrated long-standing problems in security.
So they throw in some tidbits critical of MS's past practices, because everyone is painfully aware of the problems they have had with security, viruses, etc. And they introduce our savior, Jim Allchin, who in a miraculously short amount of time fixed all the development issues and got the company on track producing bug-free software.
Now, IT managers can breathe easy, assured that the next release of Windows will solve all that pains them, and will be well worth the high price MS demands.
This article is a great demonstration of why MS is on top. They have the clout to place a piece of propaganda in a national publication that will be read by a good percentage of corporate execs. That's innovation, MS style.
Wow, Maybe, just maybe, it could be secure now... (Score:3, Insightful)
Love this line (Score:4, Insightful)
If anything, Mozilla is the reason they're finally getting around to 'upgrading' IE to possibly make it a decent browser compared to Firefox.
Re:That explains a lot (Score:5, Insightful)
A highly structured and organized operating system developed under the instruction of a central authority, no doubt?
Don't be such a hypocrite.
Re:That explains a lot (Score:5, Insightful)
You know, when I read the article, I was thinking: This sounds almost exactly like how Linux is developed, except that all the authors aren't employed by the same company. Who would have thought that the Open Source development model would be the same as that at Microsoft?
Re:That explains a lot (Score:5, Interesting)
Right, but have you ever noticed how many successful Free / Open Source software projects use a modular architecture? Take (from my own area) Nessus, or Snort. Both consist of a core engine and frameworks that accept plug-ins and modules. (Actually they both also have a lower level that allows ordinary non-programmer users to contribute signatures (rules) to the project.) This applies also to Apache, Mozilla, the Linux kernel, and plenty more.
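Reduced to its skeleton, that core-engine-plus-plug-ins shape looks like the following hypothetical C sketch (not the actual Nessus or Snort API; in the real projects the registry is filled by loading plug-in files at run time rather than from a static table):

    /* plugin_core.c -- hypothetical sketch of a core engine that knows
     * nothing about any particular check, iterating plug-ins that all
     * implement one small interface. */
    #include <stdio.h>
    #include <string.h>

    struct plugin {
        const char *name;
        int (*check)(const char *packet);  /* 1 = alert, 0 = pass */
    };

    static int check_long(const char *p) { return strlen(p) > 5; }
    static int check_evil(const char *p) { return p[0] == 'X'; }

    static const struct plugin plugins[] = {
        { "long-packet", check_long },
        { "evil-byte",   check_evil },
    };

    static void engine_run(const char *packet)
    {
        for (size_t i = 0; i < sizeof plugins / sizeof plugins[0]; i++)
            if (plugins[i].check(packet))
                printf("ALERT [%s]: %s\n", plugins[i].name, packet);
    }

    int main(void)
    {
        engine_run("Xattack1");            /* trips both plug-ins */
        engine_run("ok");                  /* trips neither */
        return 0;
    }

The signature/rules level the parent mentions is the same idea one layer up: the plug-in itself becomes an engine whose "plug-ins" are data.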
That explains a lot-Anatomy of a F/OSS programmer. (Score:3, Insightful)
That's out of necessity. Due to the distributed nature of its dev
Re:That explains a lot (Score:3, Insightful)
The reason we tend to have more modular code in the Open Source wo
Re:That explains a lot (Score:5, Insightful)
I would be surprised if people who actually are employed by MS itself don't have access to all the code. They may not have check-in rights, but they should get viewing rights, because there is no credible (legal, management, or technical) reason to prohibit them from doing so.
2. There are a bunch of users running the code all the time as its being developed and feeding back info.
Do you believe there are more testers for the linux kernel than for the windows kernel? I sincerely doubt it. Most FOSS users use only the stable release of most software (they may run development releases for a select few programs), because running development versions of anything tends to leave you with a non-usable system.
(3. They use the code themselves and have an ethic of working to make the best code they can for themselves, knowing it won't be used as a tool to extort money from people.)
Yes, and the Windows developers don't use Windows themselves. Of course not. Why ever would they do that?
I would challenge you to find anything open source developers can do process-wise that is not feasible in private enterprise. I have yet to find something.
You are all missing the key difference (Score:5, Insightful)
One would think that because of this, Linux would be a mess, but we've seen the opposite is true: for projects to continue to evolve rather than quickly die off, they require _rigid_ structure and sane, intuitive modularity to support the OSS development model. Projects that turn into spaghetti code too fast just fizzle out and never make it into my Slackware distro. Microsoft, meanwhile, has this whole management system that makes it easier to support spaghetti code. OSS has a much more brutal "natural selection" process that constantly favors modular, readable, easy-to-learn code bases.
Plus, spaghetti code is not fun, so hobbyist programmers aren't going to waste their time with it.
That's why so much OSS software is structured so well.
Re:That explains a lot (Score:4, Insightful)
Here's one - never having to hear "Ship it!". People working on OSS projects on their own time aren't generally being told, "you have to ship before Dec. 31st so we can get the revenue on this quarter's books", with no regard to whether that date is reasonable. Lots and lots of companies do it, and almost invariably the preference is to hit the ship date rather than spend the extra time to get it right. It really bothers me and everyone else I work with when we have to ship something we know is broken simply because the powers-that-be won't agree to a reasonable date that allows us to get it right the first time.
I'm rather surprised that Microsoft got their priorities straight this time, but you'll notice from the article that management wasn't exactly a friend to the process.
Re:That explains a lot (Score:5, Informative)
Re:That explains a lot (Score:4, Interesting)
Looking under the hood, the Linux development model is more organised than one might expect. Consider the parts that make up a Linux system.
Libraries and applications are typically managed by smaller teams, and even if people contribute, those contributions are reviewed. Accepting that, we only have to look at the big structure. Some observations about libraries:
The whole is a mixture of bottom-up and top-down hierarchical control. To understand the dynamics, consider an individual project. At an early stage, the developers look around to identify what is already done, and try to identify reasonably stable, common, and well-managed libraries which they can use. This is a very feasible thing to do thanks to open source licenses. They will then start from there, and make occasional changes in dependencies throughout the lifetime of the project due to new needs and changes in the availability and quality of dependent parts. Sometimes, libraries are split out of projects by abstracting out identifiable tasks.
An important observation is that the maintainers of a popular project cast a vote when choosing dependent projects. The more important the project is, the more heavily weighted is its vote for the dependent parts to survive. When most projects thus migrate to a better library, the rest will have to choose to follow suit or risk losing ground due to a more difficult installation process. The distributors are the ultimate judges, though their power is limited by what software is available.
In other words, there is a semi-democratic system that organises a hierarchical structure of components, with no single central authority.
Re:That explains a lot (Score:5, Insightful)
The difference being, Windows is touted as a professional OS built by professional coders, upheld to a high standard, etc, etc, etc. Simply put: People expect more when they have to pay for it. Microsoft has constantly criticized projects such as Linux, because the code isn't built by a central authority. Now we learn that Windows is made pretty much like Linux. I think criticizing Microsoft for this is definitely justifiable.
Re:That explains a lot (Score:5, Insightful)
I didn't say that, and don't even think my logic says that. My logic is, if Company X produces product Ya, whereas I can get product Yb for free, I'm going to need product Ya to be damn good for me to get that instead. Is Yb perfect? No. Should it be used in place of something that's better? hell no. But should it be used in place of something that's just as good? Why wouldn't you want to?
Microsoft has attacked Linux's development method, saying how much better theirs was. People bought into it. Now we learn that they've been lying all this time, and that their development method is just "as bad" as Linux's. When you lie to people in order to get them to buy your "state of the art" product, people are going to expect it to be good. When they learn you've lied, they're going to be pissed, and it's fair for them to criticize Microsoft for this.
That's what I said. I don't know where this "implied logic" that Mac should be selling like pancakes comes from.
Re:That explains a lot (Score:3, Interesting)
We may recall how Gates said security was job #1 a while ago. Obviously they are paying more attention to that now, but a large part is to deflect blame for their daily exploits. And in this case the article says how all the nice new features of Vista have fallen by the wayside, it's years late, but the spin is, as always, "the next version will be better than anything ever made". The classic FUD, and in the WSJ; so the CEOs can tell their geeks
Re:That explains a lot (Score:5, Insightful)
On Linux, all code gets inspected by others before it is accepted.
So, what you're saying is that linux development works better because it is top down cathedral style, where microsoft's model fails because it is a chaotic bazaar style?
Re:That explains a lot (Score:5, Funny)
Translation :
All the developers live in their parents' basements, and walk the code upstairs to show their mom.
Re:That explains a lot (Score:5, Insightful)
It will be interesting to see if Vista demonstrates an improved level of quality due to this new process.
Re:That explains a lot (Score:3, Funny)
So the amazing new innovation that's
Re:That explains a lot (Score:3, Informative)
That should be easy, SourceSafe just can't handle large numbers of developers or files.
Microsoft doesn't even use SourceSafe. I'm pretty sure the VS guys did a study and found the vast majority of SourceSafe users were people like admins or secretaries who were backing up docs and spreadsheets on their department server.
When I started at Microsoft, they used SLM (source library manager), which conveniently was pronounced SLIME. This thing was horri
Re:That explains a lot (Score:4, Informative)
Re:Linux Vs Windows (Score:5, Funny)
Re:Linux Vs Windows (Score:3, Interesting)
You sound like somebody who hasn't used Linux in a long time. In fact, it's amazing how far Linux has come in the last few years.
You've obviously never heard of Synaptic [nongnu.org]. I suggest you take a look at some of the screenshots. Most distributions now come with Synaptic. To install software, you just load up Synaptic, select the programmes you want to install from a list and click a big "Install" button. What could be simpler?
You seem to have a hard time grasping this, but this is actually simpler and better