The Mythical Man-Month Revisited 317
jpkunst writes "Ed Willis, over at O'Reilly's ONLamp.com, gives his varied reactions to Fred Brooks' classic The Mythical Man-Month, after 'having finally read it in its entirety'. '[...] simultaneously you can see just how much the field has changed since the original writing and just how much has stayed stubbornly the same.'"
Man-month? (Score:5, Funny)
Re:Man-month? (Score:3, Funny)
Which reminds me of a line from the book. Something like: it takes nine months to produce a child, no matter how many women are assigned to the project.
Re:Man-month? (Score:3, Interesting)
Then the really fun meetings when you're behind schedule. The finger-pointing. Blame shifting. Back-stabbing.
communication growth exponential (Score:3, Interesting)
Still one of the best "I-was-there" books (Score:4, Interesting)
It has helped me tremendously (Score:2, Insightful)
Re:It has helped me tremendously (Score:3, Insightful)
Re:It has helped me tremendously (Score:3, Interesting)
(Analogy: don't start from friggin' scratch, and accept that you can't customize everything; the parents have already been chosen!) Otherwise, you've got 9+ months of waiting.
Re:Still one of the best "I-was-there" books (Score:5, Interesting)
And from an outsider's view of another "I Was There" project, try Soul of a New Machine by Tracy Kidder. Both books were required reading in Computer Science at college about 20 years ago.
Now, is MMM still relevant in the current Microsoft-dominant environment, with a new Operating System every few years, impacting software development? Is the concept of software development still valid, or is it a matter of cobbling "off the shelf" solutions together?
Re:Still one of the best "I-was-there" books (Score:5, Insightful)
It's no different than any other consumer market. Cars come with standard options that were top end ten years ago. What's top end now is pretty far removed from "just being a car," stuff like DVD navigation systems, radar nightvision and dynamic suspension systems. In another ten years, some of these will be standard on all cars, and what's top-of-the-line will be something that seems obscene and unnecessary to us right now.
Re:Still one of the best "I-was-there" books (Score:4, Insightful)
There are plenty of REAL reasons to dislike advertising (such as the fact that it caters to the least common denominator, is overly self-important and rarely tells you what you REALLY need to know when evaluating a product or service, instead misleading you with empty statistics such as how popular something is or how many awards it's gotten in advertiser-supported magazines). But you can't blame ADVERTISERS for the fact that, someday, a better product may be made. Their job is to inform you of the product that exists RIGHT NOW -- and if the 1973 Corvair was the best Corvair ever made, they'd be right to say so, even though it's an extremely shitty car.
Is this a rip off? I dunno. If I need to buy a car, I don't really care that a better one will be available in ten years. I might like to know which is the best car right now. And certainly, since I'm going to be test driving it, I'll be in a prime position to judge for myself whether the car is sufficiently "ultimate" to meet my exacting standards.
Personally, I don't think it's possible for a company to rip you off. People rip themselves off by placing impractical expectations on products with minimal research. Advertisers merely take advantage of that; they make things out to be useful, because they're trying to sell you something. Sneaky, yes, but I don't know why you feel the need to take their word at face value when you KNOW they'd benefit by not telling you the defects.
But I guess in a world where people believe that the world is less than ten thousand years old because some guy who died SIX thousand years ago says a ghost told him that, you can't expect a whole lot of logic. After all, if people can base their whole worldview on wild, unsubstantiated claims, how do you think they're going to evaluate what brand of facial tissue to purchase?
Re:Still one of the best "I-was-there" books (Score:3, Insightful)
Back when XP came out, I distinctly remember disrespecting people at work who went out to buy it. But many of them were thrilled with it, mostly for the "user-land" applications. One guy told me he was excited because it had CD burning built in to the OS and had actually got a CD burner bundled with his purchase. Another was excited by the prospect of XP's driver backoff (which, incidentally, does the same thing I did in 2000 for ye
Re:Still one of the best "I-was-there" books (Score:3, Insightful)
And yes, Brooks' "The Mythical Man Month" is still valid, because it isn't about code, it's about software project management. Like it or not, nothing has really changed in the field in the last 30 years. Yes, the languages have changed (although APL programs and C programs typically have the same number of comments
Re:Still one of the best "I-was-there" books (Score:2)
Just pointing out that between now and when TMMM came out, you have the rise of the microcomputers (Apple and the IBM Clones) and the operating systems which run them (DOS, Windows, Mac, Linux, e
Re:Still one of the best "I-was-there" books (Score:5, Insightful)
Does Brooks' model change from the days when the behemoth computers of the '60s walked the Tech World?
No. Brooks' model is one of software development in general, so the particulars of what is being developed matter not at all.
Re:Still one of the best "I-was-there" books (Score:3, Insightful)
No matter how "huge" an IT project is, it is still made up of individual pieces that must be developed and maintained individually. Each of those pieces needs a team to develop it.
OSS merely takes care of a lot of the core functions for you. Instead of having to go out and implement a kernel, you can use a ready-made one. Instead of having to implement a network file system, you can employ one of the myriad that are available. Your project sits atop these other pieces, but the same fundi
Re:Still one of the best "I-was-there" books (Score:3, Insightful)
Re:Still one of the best "I-was-there" books (Score:4, Informative)
It's called the "second-system effect".
Person as four-port and hierarchical organization. (Score:4, Interesting)
I saw a great explanation of WHY you get less per man on a large project than a small one, and why hierarchical organization seems to be necessary on projects with large numbers of people but can be dispensed with on tiny ones.
Imagine each person as a device with four "ports" (each representing a fraction of his time and/or attention). Each "port" can be used for communicating with one other person or doing one unit of work.
On a one-person project all the ports are used for work. You get four units of work done per day.
On a two-person project each person has one port used for communicating with the other and three for doing work. You get six units of work done per day.
On a three-person (non-hierarchical) project, each person has TWO ports tied up communicating, and TWO for doing work. Again you get six units of work done per day.
On a four-person (non-hierarchical) project, each person has THREE ports tied up in communication, and only ONE left for work. Now you're down to FOUR units of work per day - same as a single hacker in a closet.
On a five-person (non-hierarchical) project, each person has all four ports tied up with communicating. Nothing gets done. B-)
Of course you can, to a limited extent, increase the number of "ports" with tools that improve communication, or by overtime. And some people are better at switching tasks or communicating quickly, and thus have more "ports". But the same basic idea applies.
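A quick sketch makes the arithmetic above easy to check for any team size (the fixed four-port assumption and the function name are mine, purely illustrative):

```python
def daily_output(team_size: int, ports: int = 4) -> int:
    """Toy model: each person has a fixed number of 'ports'; one port is
    spent per teammate on communication, and each remaining port does
    one unit of work per day."""
    free_ports = max(ports - (team_size - 1), 0)
    return team_size * free_ports

for n in range(1, 6):
    print(n, "people ->", daily_output(n), "units/day")
# 1 -> 4, 2 -> 6, 3 -> 6, 4 -> 4, 5 -> 0
```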
You can go beyond a handful of people and retain some productivity by restricting the interpersonal communication paths - to keep people from using up job-time communicating with others when it's not job-related. This tends to lead to specialization, with some people only communicating. That leads to a tree organization, with the "leaves" being people who actually do some work on the code proper, communicating only with one or two neighboring leaves, and others just communicating - and deciding what messages to forward.
And of course this leads to all the classical pathologies of hierarchies: Distortion of messages by multiple hops. Much decision-making must be done in the tree (and often far from the relevant data) to prevent saturating the communication links. "Leaves" are data-starved and must follow the decisions of "non-leaf nodes" or the project becomes disorganized. So the non-leaves become authorities and run the show.
To do large projects without such explicit communication hierarchies controlling the workers you need to divide it into modules done by standalone groups, plus assemblies also done by standalone groups. The standalone groups must be redundant (so that at least ONE of the groups doing each particular thing gets it to work adequately.) Then the hierarchy is still there, but in the form of the invisible hand of evolutionary/market forces: Leaf modules are adopted or rejected by the assembly-constructing group constituting the next level up the hierarchy toward the root of the overall project, assemblies are adopted or rejected by larger-assembly groups, and so on. (Of course there can ALSO be more than one root, and users of the resulting product can replace modules or assemblies with others that do the job if they care to do so.) Each group can be flat or hierarchical, according to their own leanings (and the needs of their task).
Am I the only one... (Score:3, Interesting)
Maybe I'm just uneducated, or maybe it's an American thing... here in England, we probably have dozens of books that are unknown anywhere else.
Re:Am I the only one... (Score:2)
It was probably the first non-technical IT book I read (many years ago), and I remember it had a very big influence on me back then. I really ought to re-read it.
Re:Am I the only one... (Score:2)
Re:Am I the only one... (Score:5, Informative)
If you were, for example, painting a big house, it may take one man two months to complete. But if you had two men, it takes one month. The more people you add, the faster the job is done. So we often talk about how many man-months are needed to complete a job. But there are many tasks that cannot be made faster by adding more people. Brooks states that programming is one of those tasks. Adding too many people to the programming effort will only make it take longer because of the interdependencies, communication, and coordination required. The programmer and time are not fungible. We cannot simply expect to complete a project that takes 1 man 18 months with 18 men in 1 month. As you add more men, the time improvements become less and less.
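A back-of-the-envelope sketch of that diminishing return (the 0.15 man-month-per-pair coordination cost is a made-up illustrative number, not anything from the book):

```python
def months_to_finish(people: int, work_man_months: float = 18.0,
                     overhead_per_pair: float = 0.15) -> float:
    """Toy schedule model: the work divides evenly, but every pair of
    people adds a fixed coordination cost in man-months."""
    pairs = people * (people - 1) / 2
    return (work_man_months + overhead_per_pair * pairs) / people

for n in (1, 2, 4, 9, 18):
    print(f"{n:2d} people: {months_to_finish(n):5.2f} months")
# 1 -> 18.00, 2 -> 9.08, 4 -> 4.73, 9 -> 2.60, 18 -> 2.28 -- nowhere near
# 1 month, and with enough people the schedule starts getting longer again.
```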
Re:Am I the only one... (Score:2)
Barnes and Noble's website couldn't offer that as far as I could see...
Thanks
Re:Am I the only one... (Score:5, Insightful)
And in fact as you add more people it takes longer and longer.
The trick is to have a team just small enough that you get the project done as quickly as possible. It's sort of like the marginal revenue curve: charge more and fewer people will buy the item, charge less and your profit is less.
But the comparison to a surgical team is apt: You don't add more surgeons, necessarily, you add assistants to hand instruments to the surgeon, keep tabs on the patient, hold the light, etc.
Re:Am I the only one... (Score:5, Interesting)
In other words, programmers tend to run afoul of Amdahl's Law [wlu.edu].
Actually, Amdahl's Law would probably be a good way of calculating the maximum effective team size. Unfortunately, it can be very difficult to ascertain a value for the "work" needed on a project. Not to mention the "human factor": programmers who are faster, programmers who are less experienced, and "cowboy coders" who refuse to check any of their work into version control.
No, Brooks' point goes beyond Amdahl's Law (Score:5, Insightful)
Amdahl's Law just says there is a part of the work that can't be parallelized; in a system that follows Amdahl's Law, adding more resources always makes things slightly faster, though there are diminishing returns.
Brooks' Law says that you can actually make the project later by adding more people. That's because the new people have to be brought up to speed and all the team members have to communicate, so you can lose more time than you gain.
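A toy comparison of the two laws (the 20% serial fraction and the communication constant are assumptions chosen just to show the shapes of the curves):

```python
def amdahl_time(n: int, serial: float = 0.2, parallel: float = 0.8) -> float:
    """Amdahl: more workers always help, approaching the serial floor."""
    return serial + parallel / n

def brooks_time(n: int, serial: float = 0.2, parallel: float = 0.8,
                comm: float = 0.01) -> float:
    """Brooks-ish: add a pairwise communication cost, so past some
    team size the schedule actually gets worse."""
    return serial + parallel / n + comm * n * (n - 1) / 2

for n in (1, 2, 4, 8, 16):
    print(n, round(amdahl_time(n), 2), round(brooks_time(n), 2))
# Amdahl's curve keeps (slowly) improving; the Brooks-ish curve bottoms
# out around 4-5 people and then climbs again.
```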
Re:No, Brooks' point goes beyond Amdahl's Law (Score:3, Funny)
If one programmer can do it in one year, two programmers can do it in two years.
Re:Am I the only one... (Score:5, Informative)
Re:Am I the only one... (Score:3, Funny)
A simpler metaphor (Score:2)
Re:Am I the only one... (Score:5, Funny)
Re:Am I the only one... (Score:5, Interesting)
The British equivalent would be C.A.R. Hoare's ACM Turing Award acceptance speech The Emperor's Old Clothes [braithwaite-lee.com].
Re:Am I the only one... (Score:2)
Re:Am I the only one... (Score:4, Interesting)
Most programmers I've worked with in the UK have either read "Mythical Man Month" or at the very least heard of it. The same goes for Jon Bentley's "Programming Pearls".
Both books were a little bit of an anti-climax when I first read them, probably because I expected way too much in the way of blinding insights. I found I was like the bloke that Brooks sat next to on a plane journey (described in the second edition) - so much of what the book has to say seems obvious now.
However obvious those insights may seem, big projects still get bogged down with the same old problems. I guess that means managing really big projects is still a bit too much for most of us to cope with.
Chris
Re:Am I the only one... (Score:4, Interesting)
This one is beyond a classic; it is still very useful and I re-read it every couple of years. The notes on back-of-the-envelope calculations (pi seconds is a nanocentury, the rule of 72, etc.) and the continual admonishment to rethink your data structures are things I try to always keep in mind during meetings and implementation.
You'd be surprised how often a SWAG (scientific wild-ass guess) about memory or time requirements can point things in the right direction early in the process.
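Both of those tricks are easy to sanity-check; this just verifies the folklore rather than anything project-specific:

```python
import math

# "Pi seconds is a nanocentury": a century is about 3.156e9 seconds,
# so a nanocentury is ~3.156 s, within half a percent of pi.
century_seconds = 100 * 365.25 * 24 * 3600
print(century_seconds / 1e9, math.pi)

# Rule of 72: at r% growth per period, doubling takes roughly 72/r periods.
r = 8  # percent per period
print(math.log(2) / math.log(1 + r / 100), 72 / r)  # ~9.01 vs 9.0
```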
Don't forget "Death March". (Score:3, Insightful)
People keep making the same mistakes, for the same reasons. Even when they know better.
The trick is to identify the conditions that exist PRIOR to making the mistake and focus on changing those conditions (example: management does NOT know what they want, just that they want something and it has to be next month).
Managing the conditions is very tricky.
Re:Am I the only one... (Score:3, Interesting)
The answer is yes: read it. It's a classic of the IT World and contains some important ideas (as well as being an interesting view of the IT World 30 years ago).
Compression (Score:5, Funny)
The Mythical Man-Week.
Re:Compression (Score:5, Funny)
The Hypothetical Person-Week
Switch to the metric month! (Score:5, Funny)
We've found that we get a lot more accomplished by switching to the 10 day work week and 10 hour work days.
Now, if only Swatch would come out with a metric time piece.
Re:Switch to the metric month! (Score:4, Funny)
Psh. Real geeks use binary [thinkgeek.com].
Re:Switch to the metric month! (Score:2)
Re:Switch to the metric month! (Score:2)
what a stupid article (Score:4, Insightful)
Re:what a stupid article (Score:2)
Re:what a stupid article (Score:3, Insightful)
Likewise, h
The more things change ... (Score:5, Informative)
Another concept he brought to light was originally Harlan Mills's, that of making the programming team like a surgical team. A surgeon, or chief programmer, has primary architectural, design, and implementation responsibility, but is assisted by a copilot, administrator, editor, two secretaries, and a program clerk.
While I've never seen such a team, I have witnessed the pair programming that the XP (not Windows, eXtreme Programming) folks praise, and it works quite well. It may not be a full-fledged surgical team as Brooks would've liked, but having a pilot on the keyboard and a copilot catching every little mistake certainly improves productivity.
Re:The more things change ... (Score:5, Interesting)
My first programming gig was writing device diagnostics for prototype set-top boxes in the mid-nineties. I was still in college, and my programming experience was basically just C -- and on Windows and Mac machines (I was a kid).
The lead programmer could tell I had potential, but knew that the only way I'd be able to do a good job was to work *with* him. Since I had to learn vi and learn how to work on an old Sparc (where we cross-compiled for the embedded platform), he figured the learning curve would be easier if he sat at the keyboard and I went over the algorithms alongside him.
It worked beautifully; we shared responsibility and caught each other's bugs. After a while, as I demonstrated that I was catching up (read: I learned vi), we began to take turns as keyboard jockey -- but regardless, our combined productivity was much greater than by ourselves.
The camaraderie was great. He was an old-school AT&T programmer; I had a hoot working with him and he had a hoot teaching me how to write *tight* low-level code.
The only troublesome part was, since we were developing a precursor to modern video on demand boxes, and it was back in 1995, we had a distinct lack of movie-length mpegs to test against. So we had only _Demolition Man_ and _The Crush_... Which means that for proper testing I must have seen each at least 100 times during my employment there.
Plus we were testing picture-in-picture and looping stuff for multiple mpeg streams, and this meant I sometimes would be watching Demolition Man while Alicia Silverstone's stunt-butt scene would loop *forever* in a mini-window.
It drove me mad.
Re:The more things change ... (Score:4, Interesting)
Needless to say, what little pair programming we did has caused me to swear off of it forever. You were very lucky.
Re:The more things change ... (Score:3, Informative)
But then, you could simply have read my comment.
Re:The more things change ... (Score:5, Funny)
There was a practice of leaving the audio up for all of the radio dubs that were made for each single, so that the glassy-eyed intern could ensure that it was recorded properly. This was done literally thousands of times... one for each major and minor radio station in North America. For each song that was released. And each interview/soundbite. All during the Lilith Fair days. Joy.
Unfortunately, the interns didn't last too long in this job, as they quickly got very bored of it, so there would be a new one every day or two... each one initially VERY excited about working with "Sarah!", so they'd crank the volume.
This drove me nuts. Almost literally. I'm an older Van Halen and Ozzy fan, and cannot stand to listen to Sarah's stuff more than once or twice... it's not my cup-O-tea. That being said, this was like some insane water torture for me.
It really hit home when I was in to see the dentist a few years back, and he was doing a routine examination on me, and he started to get really concerned. "Are you in pain? There doesn't look like there should be any pain, but you're all tense and flinching... what's up?"
It was at that point that I realized that the receptionist was a HUGE Sarah fan, and was playing Sarah's just released Mirrorball compilation in its entirety... that I'd already heard almost infinitely.
So, I spilled the beans to the doc, and he laughed, got up, went to the CD player, and popped in some classic VH. I loosened right up, almost to the point of going to sleep, I was so relaxed.
The next time I went in to see him, sure enough, Sarah was back on the CD player, but on seeing me, the receptionist killed it and popped in some Stevie Ray Vaughan, and all was well. They'd actually made a note in the book that said "absolutely NO SARAH while he's here".
That dentist has my business for LIFE now, let me tell you!
I guess what I find interesting is that repeated exposure to audio/video stimuli can have big impacts on you... without even really knowing it. I wasn't actually consciously aware of my "audio rage" until it was pointed out to me.
It's almost like it's audio/visual repetitive stress injury or something.
Weird.
Re:The more things change ... (Score:5, Interesting)
Those workers carried a lot of institutional knowledge and brought a lot of unseen benefits to organizations.
Build one to throw away (Score:4, Insightful)
Donald: You're fired!
Re:Build one to throw away (Score:3, Interesting)
Well, as long as you're being honest about one approach, you could be honest about the traditional other approach:
My Thoughts (Score:2, Interesting)
Re:My Thoughts (Score:2)
I'm doing C# mostly now.
Re:My Thoughts (Score:5, Insightful)
Like I care, I do most of my work in scripting languages. (IncrTCL if anyone cares.)
Re:My Thoughts (Score:2)
Ummmm, COBOL??
A Classic Book (Score:4, Interesting)
Re:A Classic Book (Score:5, Insightful)
They should make this book required reading in all MBA programs, in other words
Re:A Classic Book (Score:3, Informative)
Re:A Classic Book (Score:5, Funny)
It warms my heart to see MBAs are getting real training. I hope some day to have to revise my targets of derision, and (gasp) perhaps raise my level of esteem for them above household vermin.
Re:A Classic Book (Score:3, Funny)
Re:A Classic Book (Score:2)
The interesting thing is that everything is going exactly as the book said it would... We're getting ready to throw out the first one!
A wonderful dissection (Score:5, Funny)
================
Regarding source code documentation:
"The most serious objection is the increase in the size of the source code that must be stored. As the discipline moves more and more toward on-line storage of source code, this has become a growing consideration. I find myself being briefer in comments to an APL program, which will live on disk, then on a PL/I one that I will store as cards."
For who among us is this not true? Honestly, you just can't shut me up on cards.
================
Definitely worth a read. To coin a phrase: LOL.
Re:A wonderful dissection (Score:2)
When the sage points to the sky, the idiot sees the finger.
Re:A wonderful dissection (Score:5, Insightful)
Modern computers have their quirks. In 30 years my kids are going to be asking me why I keep referring to "disk space" and "RAM." Then I'll have to explain that back when I programmed, you had two types of memory: the high-speed stuff the computer would work in, RAM. RAM was expensive, finite, and would lose its contents when the computer rebooted. We also had "disks" that, while they were slower, stored a lot more information, were cheaper, and were non-volatile.
Laugh. But you too are going to sound like an old fart one day. And the respect you show or don't show for those that came before you is going to be what you instill in those that come after you.
Re:A wonderful dissection (Score:3, Insightful)
Well, maybe we are the ones that have it wrong.
From the standpoint of users, anything in RAM is forgotten when the power is killed, while everything on disk is "remembered."
Now, which should be called memory?
yes it was "kilohertz" and "kilobytes" in the 1960s (Score:4, Insightful)
A lot of basic technology in compilers, OSes, user interfaces, and artificial intelligence was invented under those terrible constraints.
Perpetual Conflicts of Interest (Score:5, Interesting)
It's also hard convincing "novice" customers to buy into the experience-proven truth that small feasibility projects make the bigger projects cheaper, more productive and more deadline-friendly. The instant-gratification complex of customers is as much at fault as the hunger to get and keep jobs among the IT workers.
Also, programmers usually get into programming through hacking, pleasure programming, or other forms of "undisciplined" programming. Often, the impulsive "go at it" style is the only one they know and enjoy. That causes problems too. As anyone who has ever tried project-managing programmers tends to find out, managing programmers (especially newer ones) is a bit like herding cats.
The one ugly truth nobody likes to talk about is that buggy/complicated systems help ensure jobs. Let's face it... the fact that Microsoft software crashes a lot creates good opportunities for consultants and IT staffs to justify their jobs. And does anyone think that Oracle would have grown into a multi-billion company if there weren't so many highly trained DBAs/High Priests running around promoting its mysterious wonders? Who knows how quickly this foul fruit will sour when all of this rot is billed by the hour?
Open source (Score:5, Insightful)
There is a certain smugness at work in the idea that the architect will make better decisions here than the user will. Certainly this view is out of favor now. We normally try to find out what the user wants (somehow) and then find a way to design our software to provide this to them in the most sensible manner we can envision. I can't imagine saying "no" to the user regarding a feature...
It seems that a lot of open source development actually adheres to the original architect premise here. In this case, the developer is the user and therefore knows best, at least for himself. I always find gathering requirements to be frustrating, and it never feels like a completed task. Especially when the developer is green in whatever industry they're developing for, the users can kill the usability of an app by nitpicking it to death--there is no real overall vision.
It's a shame, IMO...
Re:Open source (Score:4, Funny)
So... if the developer tries to do something in a field that he has no exposure to, and the users complain that he's missed the point, it's somehow their fault? Hmm... whatever.
Re:Open source (Score:2)
Re:Open source (Score:3, Interesting)
I found reading this article quite fascinating. I'm one of those old-timers who remembers what Brooks was writing about. I've read that book several times, and still recommend it to people who want to understand software project management.
But what was most fascinating was the author's impressions of the book. He certainly pointed out artifacts that I had glossed over (they seemed normal to me). However, I was also surprised at how he interpreted what Brooks said much differently than I had.
For exam
Funny how Willis... (Score:5, Insightful)
I guess eye of the beholder and all that.
Infantile review (Score:5, Insightful)
Picking on Fred Brooks' TMMM by noting its anachronisms is about the most juvenile thing I can imagine. I can only surmise that the alleged reviewer was forced to read the book by somebody he did not like, and while he read the words he certainly didn't extrapolate the lessons to his present-day situations.
When I re-read The Mythical Man Month I can see, in every paragraph, perfect analogies to my work today, and the work I see of other people in other fields. I can't wait to have the reviewer look at The Soul of a New Machine and laugh about how people used to build CPUs out of discrete parts, and how therefore none of the lessons of that book have any applicability today.
Who hasn't seen -- or lived -- an example of Brooks's "The Second System Effect?" The movie that I just finished working on, The Chronicles of Riddick, was precisely an example of that paradigm with respect to Pitch Black. Every page of the chapter on The Second System Effect has one-to-one correspondences to the work on this movie.
There are few things that I'm dogmatic about -- but Everybody needs to read this book!
Thad Beier
Re:Infantile review (Score:5, Insightful)
It's a book that requires a mature mindset to appreciate properly. (Kind of like object oriented programming.) It only makes sense after you yourself have hit the very walls the book describes.
Shannon's theorem states that information is measured by its surprise, what you weren't expecting. This book is one non-intuitive (at least to the layman) observation after another. But they are all true. And they all make sense once you are in the field.
It's that "you would have had to have been there" quality that makes the book such a difficult read for the layman and the newb. It's also what makes it so damn interesting to the veteran. You know you are ready for the book when every chapter makes you feel relief that you aren't the only person in the world who has gone through that.
Re:Infantile review (Score:5, Insightful)
Indeed. The Brooksian concerns may be situated in a different era, but the reviewer's derision betrays a pervasive lack of understanding of the underlying constraints - and that within those constraints, Brooks actually makes some damn good points.
For example, the APL story, where the reviewer ridicules the anachronistic idea of renting memory for software. And yet, he completely misses Brooks's larger point - that the cost of ownership for software is not just from the code itself, but from code plus the infrastructure it requires. Once we generalize it to modern kinds of infrastructure (e.g. bandwidth costs), we see the lesson is just as valid, and just as ruthless to those who haven't learned it.
Not to mention other instances of missing the forest for the trees. Sure, Brooks may have foreshadowed XP and other strange team development approaches. But his points were much more fundamental - that team efficiency is sublinear with respect to team size and non-monotonic, that it peaks at fairly small team sizes, and then starts decreasing, etc. Indeed, this analysis did not merely foreshadow development styles - such analysis made them possible at all.
But the author is a self-professed neophyte, so maybe this review should be taken with a grain of salt.
Re:Infantile review (Score:4, Interesting)
An article that actually analyzes these issues would make a spectacular read.
Alas, instead of doing that, this article only picked out a few random, specific pieces for discussion, and made a few observations about them. The questions you mention didn't seem to be reflected in the finished piece at all. And the flippant tone and lack of breadth or depth suggest a rather unflattering modus operandi.
TMMM is a complicated book about complicated processes; spending two pages discussing only a few of its elements does it no justice at all. But the questions you mention are very much worth asking, and should not be abandoned because of a rough start on one article.
I wholeheartedly hope that the author would take another look at his article, and maybe write another, this time really comprehensive, in-depth analysis of how and whether the practice of programming changed since TMMM. Maybe even publish it as a series of articles on the site. A comprehensive analysis of Brooks's postulates would be a most welcome contribution.
Ed may be missing the point... (Score:5, Insightful)
[in response to a passage about developers needing their own machine (singular), and that it is supported]
Ed is missing the point here. I think that such a comment by the original author was based on the time-share days, not the more modern workstation days. "Back then", you all worked on terminals and did batch work on a central frame. Nowadays, the central server is good for no more than saving your Pr0n
If one were to generalize, I think that it would be better to say that "Teams building core applications need a dedicated development environment in which to work; a system that is up to the task, properly isolated, and properly supported."
Re:Ed may be missing the point... (Score:4, Insightful)
Re:Ed may be missing the point... (Score:2)
Re:Ed may be missing the point... (Score:5, Insightful)
No, things haven't changed that much on many software projects.
Want to develop with real data? It often makes sense to share a development database - one that can be designed, populated, and maintained by the DBA.
Developing large, complex analytical applications? Is your production destination a massive cluster? Then you'll probably need a development environment that's at least a small cluster. And no - every developer doesn't get their own cluster.
Need to interface with MQSeries, Websphere, a content manager, and a workflow manager? You really don't want to spend the time to get all that crap working on everyone's pc. Once again, you'll be way better off sharing a development server.
etc, etc.
Zeno's Paradox (Score:2, Interesting)
Old Timers Take on the MMM (Score:3, Insightful)
Re:Old Timers Take on the MMM (Score:2)
Yes, but these days one man and one woman can make 7 babies in 9 months.
Programming Large scale systems (Score:5, Insightful)
Brooks' writing is focused on programming large systems like operating systems, or major Information Systems (IS) like a bank's accounting system or Wal-Mart's inventory system. These are still large, complex tasks, and they aren't done by a couple of programmers sitting side-by-side writing a bunch of code on a couple of PCs.
Willis' comparison of a classic book to modern programming methods is laughable, because all those said modern methods (XP, Agile, iterative development, refactoring) were influenced by Brooks' writings.
IMHO Willis' piece at ONLamp wasn't very insightful and didn't do much for me. I would recommend that any new or young programmer read The Mythical Man-Month; it's considered a classic for a reason. Don't get bogged down with the historic context in which it was written, or try to poorly graft modern programming paradigms onto MMM.
Project Managers can't read (Score:3, Insightful)
When I think of the problems we've encountered in the intervening years and how much time, energy, money and emotional stress would have been alleviated by simply understanding half of what Brooks covers in his book, I want to cry; okay, sometimes I want to just laugh maniacally . . .
silver bullet(s) (Score:4, Insightful)
There is no such thing as multiple silver bullets. "Silver bullet" is a term derived from killing werewolves, where it takes a single silver bullet to kill the beast. Not 2, not 3, but one. One thing and it's done.
The author of the article implies that there may be several silver bullets; that's how I read this section. Saying "so there is no one, single silver bullet" is redundant and suggests that there is a concept of multiple silver bullets. That's wrong.
There is no silver bullet. Just leave well enough alone.
Ed Willis leaves a lot to be desired (Score:3, Insightful)
In his commentary on Brooks' work. There are a number of issues Willis comments about, including a 'sneer' at the software rent and memory rent, and other comments on the expensive costs of computers at that time. Realize Brooks is talking about programming on mainframes, machines where you mostly did batch processing and served hundreds or thousands of users.
It wasn't all that long ago when parts for microcomputers were expensive, very expensive. I remember when 16 megabytes of memory - and a lot slower than what is available now - cost US$400. I remember when an 80 megabyte hard drive cost US$420.00. I remember these prices because that's what I paid. This is less than 15 years ago. The availability of really powerful computers for individuals at astonishingly low prices is an extremely recent development.
The lowering of prices (and the resultant raising of the standard of living for those who buy those things) has been going on for thousands of years, as long as we've had free markets to allow this to happen. But initially (or as long as someone has had monopoly control over supply) prices were high and often the items were difficult to obtain. As products become commodities, prices drop. This is why 640 MB CDs (commodity) are now as low as 16c each (qty. 100), 50c each qty. 1. 4,200 MB DVD-Rs are $1 each (qty 4), while 100MB zip disks (proprietary) are still about $8 each (almost no discount in quantity).
Willis is comparing terms and conditions now with the situation (of much worse scarcity) of 30-35 years ago, then cracks up in laughter at his own ignorance of the past.
Paul Robinson <Postmaster@paul.washington.dc.us [mailto]>
Guess I need my eyes checked (Score:3, Funny)
So I was thinking Arthur from "The Tick" was coming back.
Imagine my disappointment...
The author is a whiner and a nitpicker (Score:4, Insightful)
I wish my management read it, too. They seem to think they're gods and they can solve everything by hiring more contractors (as opposed to managing existing programmers/testers better).
Alarming quotes from the article (Score:3, Insightful)
Is the author saying that most people have more gigs of RAM than an install of MS Office takes on disk? I doubt any real major app fully fits into physical memory.
I think he's saying you will invariably throw away the whole implementation either all in one go or a little bit at a time, so it's wise to "plan to throw one away."... This is probably not acceptable now -- certainly I'd be embarrassed to have to do this.
I guess that's why we are exposed to so many programs that should have been thrown away. Airplane designers build and discard many mockup models to discover problems that are not apparent beforehand. In programming, you just need to build one airplane and you are free to reuse any well-working pieces from the discarded model, so what's the big deal?
"The fundamental problem with software maintenance is that fixing a defect has a substantial (20-50 percent) chance of introducing another." I do not believe the risks to be this high now in any reasonably well-run organization.
Didn't we see a study recently that Microsoft is more likely than not to introduce another vulnerability with a security update? Definitely, simple software maintenance should be supplemented by periodic major cleanups and even discarding/rewriting problem pieces.
"A discipline that will open an architect's eyes is to assign each little function a value: capability x is worth not more than m bytes of memory and n microseconds per invocation. These values will guide initial decisions and serve during implementation as a guide and warning to all." Even in embedded development where I make my living, I rarely see anything like this level of budgeting detail.
So assign values at a granularity applicable to your field: "capability x is worth not more than 100K and 0.1 second per invocation".
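A minimal sketch of what enforcing such a budget might look like today (the capability name and the budget numbers here are hypothetical; the point is just writing the limits down and checking them automatically):

```python
import time
import tracemalloc

# Hypothetical per-capability budgets: (max peak bytes, max seconds per call)
BUDGETS = {
    "parse_order": (100_000, 0.001),
}

def check_budget(name, fn, *args):
    """Fail loudly if a capability exceeds its agreed memory/time budget."""
    max_bytes, max_seconds = BUDGETS[name]
    tracemalloc.start()
    start = time.perf_counter()
    fn(*args)
    elapsed = time.perf_counter() - start
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    assert peak <= max_bytes, f"{name}: peak {peak} bytes over budget {max_bytes}"
    assert elapsed <= max_seconds, f"{name}: {elapsed:.4f}s over budget {max_seconds}s"

check_budget("parse_order", lambda line: line.split(","), "42,widget,3")
```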
I think the author of the review is still in denial, despite his efforts to keep an open mind. "Mythical Man-Month" was written at the time of small, efficient programs running on limited hardware. Now we have proportionally (and sometimes disproportionately) more complicated and inefficient programs running on more powerful hardware. This just makes software development more perilous, although the end result is undeniably more valuable to users.
Sure some problems shifted from lower-level ("this function is 600 bytes. I ought to cut it down to 200 or less") to high-level ("our app takes up 512MB when running. We need to make each feature loadable on demand to keep average user's memory footprint reasonable"). And if nothing else helps, god bless you, maybe you really have to go through each function in 512MB and shrink it from 600 bytes to 200. But overall, few things really went away. You just need to look for them in another place/design phase.
Lightweight review by a lightweight reviewer (Score:3, Interesting)
The "chief programmer team" concept has fallen out of favor, with one notable exception - game development. Game projects have team members with well-defined roles, because they must integrate many elements that aren't just code. Games have artwork, music, motion capture data, maps, textures, character models, and props. Game teams look more like film production crews, with individuals responsible for specific areas. "Librarian" and "toolsmith" jobs are very real in game development. There's usually a lead "director", who is expected to know all the technologies involved.
SysOps (Score:4, Insightful)
Umm, ever heard of an IT department? Granted they rarely actually program anymore, but they're still configuring and maintaining your system for you*.
*Except of course in my job, where the great & powerful IT department is afraid to even touch a Linux machine (like the ones we use for actual development!)
Brooks and Agile development (Score:3, Informative)
Software Engineering as a discipline (Score:3, Insightful)
Hefty support structure (Score:4, Insightful)
Certainly many of the criticisms were well-supported, but I think the author missed the background on this one.
Re:Updated 20 year old book... (Score:2)