The Mythical Man-Month Revisited 317
jpkunst writes "Ed Willis, over at O'Reilly's ONLamp.com, gives his varied reactions to Fred Brooks' classic The Mythical Man-Month, after 'having finally read it in its entirety'. '[...] simultaneously you can see just how much the field has changed since the original writing and just how much has stayed stubbornly the same.'"
Still one of the best "I-was-there" books (Score:4, Interesting)
Am I the only one... (Score:3, Interesting)
Maybe I'm just uneducated, or maybe it's an American thing... here in England, we probably have dozens of books that are unknown anywhere else.
My Thoughts (Score:2, Interesting)
A Classic Book (Score:4, Interesting)
Man Mythical Month (Score:1, Interesting)
The only thing about this experience that sticks out in my mind is that the professor always referred to the book as the "Man Mythical Month". It was kind of hard to take any of the in-class discussions seriously with this going on. He knew his stuff, too; he clearly understood the concept of the "man month" and the mysticism surrounding it, yet he continued to hilariously butcher the title day in and day out.
The book itself was halfway interesting, but it didn't say anything that anybody with a couple years of software engineering experience didn't already know.
Perpetual Conflicts of Interest (Score:5, Interesting)
It's also hard convincing "novice" customers to buy into the experience-proven truth that small feasibility projects make the bigger projects cheaper, more productive, and more deadline-friendly. The instant-gratification complex of customers is as much at fault as the hunger to get and keep jobs among IT workers.
Also, programmers usually get into programming through hacking, pleasure programming, or other forms of "undisciplined" programming. Often, the impulsive "go at it" style is the only one they know and enjoy. That causes problems too. As anyone who has ever tried project-managing programmers tends to find out, managing programmers (especially newer ones) is a bit like herding cats.
The one ugly truth nobody likes to talk about is that buggy/complicated systems help ensure jobs. Let's face it... the fact that Microsoft software crashes a lot creates good opportunities for consultants and IT staffs to justify their jobs. And does anyone think that Oracle would have grown into a multi-billion company if there weren't so many highly trained DBAs/High Priests running around promoting its mysterious wonders? Who knows how quickly this foul fruit will sour when all of this rot is billed by the hour?
Re:Still one of the best "I-was-there" books (Score:5, Interesting)
And from an outsider's view of another "I Was There" project, try Soul of a New Machine by Tracy Kidder. Both books were required reading in Computer Science at college about 20 years ago.
Now, is MMM still relevant in the current Microsoft-dominant environment, with a new Operating System every few years, impacting software development? Is the concept of software development still valid, or is it a matter of cobbling "off the shelf" solutions together?
Re:The more things change ... (Score:5, Interesting)
My first programming gig was writing device diagnostics for prototype set-top boxes in the mid-nineties. I was still in college, and my programming experience was basically just C -- and on Windows and Mac machines ( I was a kid ).
The lead programmer could tell I had potential, but knew that the only way I'd be able to do a good job was to work *with* him. Since I had to learn vi and learn how to work on an old Sparc ( where we cross-compiled for the embedded platform ), he figured the learning curve would be easier if he sat at the keyboard and I went over the algorithms alongside him.
It worked beautifully; we shared responsibility and caught each other's bugs. After a while, as I demonstrated that I was catching up ( read: I learned vi ), we began to take turns as keyboard jockey -- but regardless, our combined productivity was much greater than it would have been working separately.
The camaraderie was great. He was an old-school AT&T programmer and I had a hoot working with him, and he had a hoot teaching me how to write *tight* low-level code.
The only troublesome part was, since we were developing a precursor to modern video on demand boxes, and it was back in 1995, we had a distinct lack of movie-length mpegs to test against. So we had only _Demolition Man_ and _The Crush_... Which means that for proper testing I must have seen each at least 100 times during my employment there.
Plus we were testing picture-in-picture and looping stuff for multiple mpeg streams, which meant I sometimes would be watching _Demolition Man_ while Alicia Silverstone's stunt-butt scene would loop *forever* in a mini-window.
It drove me mad.
Re:Am I the only one... (Score:5, Interesting)
In other words, programmers tend to run afoul of Amdahl's Law [wlu.edu].
Actually, Amdahl's Law would probably be a good way of calculating the maximum effective team size. Unfortunately, it can be very difficult to ascertain a value for the "work" needed on a project. Not to mention the "human factor": some programmers are faster, some are less experienced, and some are "cowboy coders" who refuse to check any of their work into version control.
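A rough sketch (my own, not the parent's) of what that calculation might look like, assuming a fraction p of the project parallelizes cleanly across programmers and the rest is inherently serial:

```python
def amdahl_speedup(p, n):
    """Amdahl's Law: speedup with n workers when fraction p of the work
    can be split perfectly and (1 - p) must be done serially."""
    return 1.0 / ((1.0 - p) + p / n)

# If 80% of a project can be divided up, ten programmers give at most:
print(round(amdahl_speedup(0.8, 10), 2))  # 3.57 -- nowhere near 10x
# The serial fraction caps the payoff of hiring, no matter the headcount:
print(amdahl_speedup(1.0, 10))            # 10.0 only in the ideal p=1 case
```

The hard part, as the comment says, is estimating p for a real project; the formula itself is the easy bit.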
Re:The more things change ... (Score:5, Interesting)
Those workers carried a lot of institutional knowledge and brought a lot of unseen benefits to organizations.
Zeno's Paradox (Score:2, Interesting)
Re:Am I the only one... (Score:5, Interesting)
The British equivalent would be C.A.R. Hoare's ACM Turing Award acceptance speech The Emperor's Old Clothes [braithwaite-lee.com].
Re:Am I the only one... (Score:2, Interesting)
Their view was that if you want to deploy more people on a project, you have to divide it into sub-projects with much more formal and documented interfaces between the separate teams.
My experience would not contradict this at all.
Re:Am I the only one... (Score:3, Interesting)
The answer is yes: read it. It's a classic of the IT World and contains some important ideas (as well as being an interesting view of the IT World 30 years ago).
Re:Am I the only one... (Score:4, Interesting)
Most programmers I've worked with in the UK have either read "Mythical Man Month" or at the very least heard of it. The same goes for Jon Bentley's "Programming Pearls".
Both books were a little bit of an anti-climax when I first read them, probably because I expected way too much in the way of blinding insights. I found I was like the bloke that Brooks sat next to on a plane journey (described in the second edition) - so much of what the book has to say seems obvious now.
However obvious those insights may seem, big projects still get bogged down with the same old problems. I guess that means managing really big projects is still a bit too much for most of us to cope with.
Chris
Re:Open source (Score:3, Interesting)
I found reading this article quite fascinating. I'm one of those old-timers who remembers what Brooks was writing about. I've read that book several times, and still recommend it to people who want to understand software project management.
But what was most fascinating was the author's impressions of the book. He certainly pointed out artifacts that I had glossed over (they seemed normal to me). However, I was also surprised at how differently he interpreted what Brooks said than I had.
For example, the above quote was in reaction to the statement:
What I think Brooks was saying (or at least what I read from his statement) was that adding a new feature that behaves significantly differently from the rest of the system is a bad idea, even if the new feature is very useful. I don't have the book with me, but I'm guessing it was in the chapter where he talked about how the beauty of a cathedral came from the fact that each builder followed the original plan.
(During the middle ages, it took so long to build massive church buildings that the construction spanned the lifetimes of several builders. In many cases, each new builder had a "better idea", and so their part of the building looked different than the rest. The result was a patchwork architecture that didn't look anywhere near as nice as it would have if any one of the individual builders had been able to build the entire structure.)
I don't think Brooks was saying to ignore the needs of the users, but rather to make sure your changes fit into the overall structure of the program. If different parts of the system work differently, it will most likely lead to user confusion. That is why changes should fit within the framework of the original program. Imagine a system where the author of each component was able to create their own user interface. When you select option 'A', you do it this way, but when you are using option 'B', you have to do it that way. The end result is a confusing mess, even though each individual component might have a perfectly reasonable way of doing things. It's just that most people expect a system to have some consistency in its behavior, appearance, and interfaces.
Speaking of ignoring users, however, I recall reading an article where Kernighan claimed you should ignore all suggestions when a system is first released. The reason is that most people are reacting to the fact that they are trying to use the system differently than it was originally intended. Often, they are expecting the new system to do something in the same way as some other system they used. Once they get used to working with the system, they are able to anticipate how the system wants them to do something, and they become happier and more productive.
If you rush to implement many of the initial suggestions, you will often start changing the overall architecture and/or interface of the system, which is what Brooks (IMHO) is warning against.
Re:Switch to the metric month! (Score:2, Interesting)
On the other hand, you would make your employees very happy if you had gone binary instead.
Re:Build one to throw away (Score:3, Interesting)
Well, as long as you're being honest about one approach, you could be honest about the traditional other approach:
The secret to the build-one-to-throw-away approach is to do it in small increments. Is the boss eager for a dubious feature? Is a developer all hot and bothered over some new technical approach? Try it for a week, producing a new version of the app at the end of it.
If the idea turns out to be a complete turkey, then the worst case is you've lost a week of work. But generally, the idea has some merit, even if it's only as a stepping-stone to a better idea, so it's rare that it's a 100% loss.
Re:The more things change ... (Score:4, Interesting)
Needless to say, what little pair programming we did has caused me to swear off of it forever. It was something like [...]
You were very lucky.
Re:Am I the only one... (Score:4, Interesting)
This one is beyond a classic; it is still very useful and I re-read it every couple of years. The notes on back-of-the-envelope calculations (pi seconds is a nanocentury, the rule of 72, etc.) and the continual admonishment to rethink your data structures are things I try to always keep in mind during meetings and implementation.
You'd be surprised how often a SWAG (scientific wild ass guess) about memory or time requirement can point things in the right direction early in the process.
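Both of those envelope tricks check out numerically; here's a quick sketch (the constants and the 8% rate are my own picks, not from the comment):

```python
import math

# "Pi seconds is a nanocentury": a century in seconds, scaled down by 1e9,
# lands remarkably close to pi.
century_seconds = 100 * 365.25 * 24 * 3600   # ~3.156e9 s
nanocentury = century_seconds * 1e-9         # ~3.156 s
print(abs(nanocentury - math.pi) < 0.02)     # True -- off by under half a percent

# Rule of 72: a quantity growing r% per period doubles in roughly 72/r periods.
rate = 8  # percent per year (hypothetical)
print(72 / rate)                               # 9.0 years, approximation
print(round(math.log(2) / math.log(1.08), 1))  # 9.0 years, exact doubling time
```

That the shortcut and the exact logarithm agree to one decimal place is exactly why these rules survive in meetings.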
Re:The more things change ... (Score:2, Interesting)
I am more than happy to commit my knowledge to paper (or bits), because I know that the written information will likely be a ghostly echo of real knowledge. It is hard to communicate explicit understanding through writing, and all but impossible to communicate the implicit knowledge that is the real value of experience. If a business were to attempt a moderately effective program of creating written records of the institutional knowledge of their people, they would quickly discover that the cost and effort swamp the budget.
Most attempts to write documents for things that are contained in the practices and processes of the people, of which I have experience with a few, result in a listless pile of binders that few read and fewer get any understanding from. In the cases I've been involved with, once the written document was published to the organization, the calls and emails from people trying to understand and put into practice the material just added another demand on the time of the person who has the knowledge.
Knowledge can't be effectively captured merely through writing it down for many reasons, but a good one is that not everyone learns most effectively by reading. On the other hand, so-called "social learning" [co-i-l.com] techniques like those discussed in Situated Learning [amazon.com] and The Social Life of Information [amazon.com] are much better guides for how to retain and spread knowledge.
It appears common, however, that professional trainers are threatened by anything that would reduce the budget and power of the corporate training department. As an experiment, if your company is big into pre-packaged training materials, try getting a formal mentoring program going in your company.
Re:Switch to the metric month! (Score:2, Interesting)
Still, it bugs me that the 10s and the 1s each get their own group of binary digits. I suppose it means more LEDs (and lord knows I want more), but 12 o'clock should be 01100, not 01 0010.
Really it's just decimal if you think about it...
Lightweight review by a lightweight reviewer (Score:3, Interesting)
The "chief programmer team" concept has fallen out of favor, with one notable exception - game development. Game projects have team members with well-defined roles, because they must integrate many elements that aren't just code. Games have artwork, music, motion capture data, maps, textures, character models, and props. Game teams look more like film production crews, with individuals responsible for specific areas. "Librarian" and "toolsmith" jobs are very real in game development. There's usually a lead "director", who is expected to know all the technologies involved.
Re:Am I the only one... (Score:2, Interesting)
In fact it's exactly like a marginal revenue curve.
It's a generally applicable economic principle that is called "declining marginal utility".
As you add more resources to any production the "marginal utility" of each new resource will be less than the last until eventually they start getting negative.
In plain English, what this means is that if you can do something with one person, adding a second will probably speed things along. Adding a third person may also help, but less than adding the second did... eventually you will reach a point where adding another resource (people, in this example) will actually slow things down.
Like I say, this is a general economic principle. Usually the example used is agricultural (a little fertilizer allows for more crops, other things being equal; keep adding more and more fertilizer and eventually you'll start reducing your yield instead of increasing it), but it's widely applicable and just one of the reasons that a more widespread understanding of basic economics would be a Good Thing.
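The shape of that curve is easy to see with a toy production function (the numbers here are entirely made up for illustration, not measured from any project):

```python
def output(workers):
    """Hypothetical total output: grows with headcount but pays a
    quadratic coordination cost, so it peaks and then falls."""
    return 10 * workers - workers ** 2

# Marginal product of each successive hire:
marginal = [output(n) - output(n - 1) for n in range(1, 8)]
print(marginal)  # [9, 7, 5, 3, 1, -1, -3]
```

Each new hire contributes less than the one before, and past the peak (five workers here) another hire makes total output go down -- the "negative marginal utility" the comment describes.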
Re:Man-month? (Score:3, Interesting)
Then the really fun meetings when you're behind schedule. The finger-pointing. Blame shifting. Back-stabbing.
Re:It has helped me tremendously (Score:3, Interesting)
(Analogy: you don't start from friggin' scratch and you can't customize everything; the parents have already been chosen!) Otherwise, you've got 9+ months of waiting.
Person as four-port and hierarchical organization. (Score:4, Interesting)
I saw a great explanation of WHY you get less per man on a large project than a small one, and why hierarchical organization seems to be necessary on projects with large numbers of people but can be dispensed with on tiny ones.
Imagine each person as a device with four "ports" (each representing a fraction of his time and/or attention). Each "port" can be used for communicating with one other person or doing one unit of work.
On a one-person project all the ports are used for work. You get four units of work done per day.
On a two-person project each person has one port used for communicating with the other and three for doing work. You get six units of work done per day.
On a three-person (non-hierarchical) project, each person has TWO ports tied up communicating, and TWO for doing work. Again you get six units of work done per day.
On a four-person (non-hierarchical) project, each person has THREE ports tied up in communication, and only ONE left for work. Now you're down to FOUR units of work per day - same as a single hacker in a closet.
On a five-person (non-hierarchical) project, each person has all four ports tied up with communicating. Nothing gets done. B-)
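The port arithmetic above can be tabulated directly; here's a sketch of the model exactly as described (fully connected team, four ports per person):

```python
PORTS = 4  # units of time/attention per person per day

def daily_work(team_size):
    """Work units per day when every member communicates with every
    other member, each link consuming one port on each side."""
    comms_per_person = team_size - 1
    free_ports = max(PORTS - comms_per_person, 0)
    return team_size * free_ports

for n in range(1, 6):
    print(n, daily_work(n))
# 1 -> 4, 2 -> 6, 3 -> 6, 4 -> 4, 5 -> 0
```

The output reproduces the numbers in the comment: productivity peaks at two or three people, a fully connected four-person team does no better than one hacker in a closet, and at five nothing gets done.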
Of course you can, to a limited extent, increase the number of "ports" with tools that improve communication, or with overtime. And some people are better at switching tasks or communicating quickly, and thus have more "ports". But the same basic idea applies.
You can go beyond a handful of people and retain some productivity by restricting the interpersonal communication paths - to keep people from using up job-time communicating with others when it's not job-related. This tends to lead to specialization, with some people only communicating. That leads to a tree organization, with the "leaves" being people who actually do some work on the code proper, communicating only with one or two neighboring leaves, and others just communicating - and deciding what messages to forward.
And of course this leads to all the classical pathologies of hierarchies: Distortion of messages by multiple hops. Much decision-making must be done in the tree (and often far from the relevant data) to prevent saturating the communication links. "Leaves" are data-starved and must follow the decisions of "non-leaf nodes" or the project becomes disorganized. So the non-leaves become authorities and run the show.
To do large projects without such explicit communication hierarchies controlling the workers you need to divide it into modules done by standalone groups, plus assemblies also done by standalone groups. The standalone groups must be redundant (so that at least ONE of the groups doing each particular thing gets it to work adequately.) Then the hierarchy is still there, but in the form of the invisible hand of evolutionary/market forces: Leaf modules are adopted or rejected by the assembly-constructing group constituting the next level up the hierarchy toward the root of the overall project, assemblies are adopted or rejected by larger-assembly groups, and so on. (Of course there can ALSO be more than one root, and users of the resulting product can replace modules or assemblies with others that do the job if they care to do so.) Each group can be flat or hierarchical, according to their own leanings (and the needs of their task).
Re:Infantile review (Score:4, Interesting)
An article that actually analyzes these issues would make a spectacular read.
Alas, instead of doing that, this article only picked out a few random, specific pieces for discussion, and made a few observations about them. The questions you mention didn't seem to be reflected in the finished piece at all. And the flippant tone and lack of breadth or depth suggest a rather unflattering modus operandi.
TMMM is a complicated book about complicated processes; spending two pages discussing only a few of its elements does it no justice at all. But the questions you mention are very much worth asking, and should not be abandoned because of a rough start on one article.
I wholeheartedly hope that the author will take another look at his article, and maybe write another: this time a really comprehensive, in-depth analysis of how and whether the practice of programming has changed since TMMM. Maybe even publish it as a series of articles on the site. A comprehensive analysis of Brooks's postulates would be a most welcome contribution.
communication growth exponential (Score:3, Interesting)
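Strictly speaking, the pairwise channel count Brooks describes grows quadratically rather than exponentially, but it blows up fast enough to feel exponential in practice; a one-liner shows the growth (my own illustration):

```python
def channels(n):
    """Brooks's count of communication paths: n people, fully connected,
    gives n*(n-1)/2 distinct pairwise channels."""
    return n * (n - 1) // 2

print([channels(n) for n in (2, 5, 10, 50)])  # [1, 10, 45, 1225]
```

Doubling a ten-person team to twenty takes you from 45 channels to 190 -- more than quadrupling the coordination surface.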
Nads that go crunch (Score:2, Interesting)
What a self-loving asshole.
"Fred wrote in a time where systems were smaller and slower, where capacity was expensive.
So I'll mock that, and ignore the fact that he contributed more to our world than I'll ever even review."