Can Software Schedules Be Estimated?

Posted by timothy
from the now-stop-abusing-the-mozilla-team dept.
J.P.Lewis writes " Is programming like manufacturing, or like physics? We sometimes hear of enormous software projects that are canceled after running years behind schedule. On the other hand, there are software engineering methodologies (inspired by similar methodologies in manufacturing) that claim (or hint at) objective estimation of project complexity and development schedules. With objective schedule estimates, projects should never run late. Are these failed software projects not using proper software engineering, or is there a deeper problem?" Read on for one man's well-argued answer, which casts doubt on most software-delivery predictions, and hits on a few of the famous latecomers.

"A recent academic paper, Large Limits to Software Estimation (ACM Software Engineering Notes 26, no. 4, 2001), shows how software estimation can be interpreted in algorithmic (Kolmogorov) complexity terms. An algorithmic-complexity variant of mathematical (Gödel) incompleteness can then easily be interpreted as showing that all claims of purely objective estimation of project complexity, development time, and programmer productivity are incorrect. Software development is like physics: there is no objective way to know how long a program will take to develop."

Lewis also provides a link to this "introduction to incompleteness (a fun subject in itself) and other background material for the paper."

  • by Anton Anatopopov (529711) on Monday November 05, 2001 @11:07AM (#2522300)
    But not with any degree of accuracy. Function point analysis is one method that has had some success. The key to delivering projects on time always has been and always will be RISK MANAGEMENT.

    Software development is not a science in the normal sense. Designing large software systems is an art. It cannot be pigeonholed. Stroustrup has a lot to say about this when he describes the 'interchangeable morons' concept in the 2nd edition of his C++ book.

    Anyway, read Death March by Ed Yourdon, The Mythical Man-Month by Fred Brooks, and AntiPatterns; any time someone asks you for an estimate, say 'two weeks' and then bullshit from there on.

    That is how it works in the real world. The numbers are essentially meaningless, but the bean counters and suits have to justify their existence somehow :-)

    Can you imagine asking Linus when 2.5 will be ready ?

  • Yes but No (Score:1, Insightful)

    by Anonymous Coward on Monday November 05, 2001 @11:11AM (#2522324)
    If you want to keep the suits happy, it's easy: take the time you think it will take, multiply by two, and add 10%.

    Now when it comes to the actual work, forget it. Unless your project management is extra tight - which is unlikely, judging from all the places I've seen - you will have too many unknown variables, such as:
    • That hot new developer you just hired turns out to be clueless
    • The specs were badly written, or your customer changes the specs mid-project
    • You can never factor in testing time properly (Trust me, I'm a tester). See below.
    Especially when it comes to testing, too many project managers think you can just say "Oh sure, a twenty-page test plan for that module will take one person three days. So we'll allow two weeks of testing per build, at three builds." This is total BS, because frankly you won't know how many bugs are in the product, and therefore how many builds it will take to test the product, until you actually test it.

    Sure, you can get an estimate, maybe to within about 30% of the actual time it will take, about 60% of the time. It's certainly an inexact science, though.
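    The "multiply by two and add 10%" rule of thumb from this comment is simple enough to write down. A minimal sketch (the function name and the sample 10-day input are invented for illustration; the 2x and +10% factors come from the comment itself):

```python
def padded_estimate(gut_feel_days: float) -> float:
    """Rule of thumb from the comment above: take the time you think
    the work will take, multiply by two, and add 10%."""
    return gut_feel_days * 2 * 1.10

# A 10-day gut feeling becomes a 22-day quote for the suits.
print(round(padded_estimate(10), 1))  # 22.0
```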
  • by sphealey (2855) on Monday November 05, 2001 @11:13AM (#2522340)
    Very large and complex projects do get completed, sometimes even on-time/on-budget. Examples include skyscrapers, nuclear submarines, aircraft carriers, power plants (whether conventional or nuclear), oil refineries, B-747/A-320, etc. And all of these systems nowadays have a software component as well.

    So the easy response is that bad management in general, and bad project management in particular, is responsible for software project failures. While this is no doubt true, the next question has to be, why do software projects have such bad project management?

    I don't have a good answer, but one thing that occurs to me is the lack of a fixed endpoint. When an oil refinery ships its first load of POL, it is complete. When an aircraft carrier launches its first plane, it is complete. But the amorphous and malleable nature of software means that it is hard to define an exact endpoint, and very hard to avoid changing the definition of the endpoint as the project proceeds. So things keep "creeping" along until disaster occurs.

    sPh
  • Projects != R&D (Score:2, Insightful)

    by TheKodiak (79167) on Monday November 05, 2001 @11:16AM (#2522350) Homepage
    Straightforward implementation, no matter how complex, can be scheduled accurately. Developing new technology cannot.
  • by sql*kitten (1359) on Monday November 05, 2001 @11:21AM (#2522389)
    Software development is not a science in the normal sense. Designing large software systems is an art. It cannot be pigeonholed

    That's exactly the sort of attitude that has caused the spectacular failures of software projects to be accepted as the norm. Software Engineering is *not* "hacking" or "coding" or "programming", it's *engineering*, like building a bridge or a skyscraper. Yes, those projects go over time and budget too sometimes, but they are the exception rather than the rule.

    That is how it works in the real world. The numbers are essentially meaningless, but the bean counters and suits have to justify their existence somehow

    The problem is endemic in the industry. The other Engineering professions require rigorous accreditation before they let practitioners loose in the world, like the PE (in the US) or the Charter (in the UK). But the software industry hires anyone, and lets them get on with whatever they do, with no real management or oversight or planning.

    In a well analyzed and properly planned project, the actual coding stage is little more than data entry.
  • by ciurana (2603) on Monday November 05, 2001 @11:21AM (#2522391) Homepage Journal

    My company develops turn-key systems. Sometimes we also develop custom solutions for our customers. Our customer base has increased steadily since the dotcom crash, when we switched from products to services. One of the reasons our customers like us is that we don't bill projects by the hour. We bill the project on a fixed-price, not-to-exceed basis.

    The programmers who work with us on a contract basis don't bill us by the hour either. After we have the design and we distribute tasks and prior to submitting the final estimate, we ask contractors to place a fixed bid.

    We've done six major projects like this since March, and in all cases we finished within budget and on-schedule, and the systems are currently in production. They are all mission-critical systems running in either robotics environments or high-availability networks.

    Our economic motivation is then to do things well and quickly in order to increase our profits. That also enables us to move on to the next project faster than slaving over some customer in order to bill the maximum hours.

    As far as development techniques go, we adopted XP early on and it's working for us.

    Cheers!

    E
  • by dybdahl (80720) <info AT dybdahl DOT dk> on Monday November 05, 2001 @11:22AM (#2522395) Homepage Journal
    There are four parameters to a software project:

    - Quality
    - Quantity
    - Deadline
    - Costs

    In a competitive environment with humans involved, up to three can be specified. Not four. Good examples are:

    - Many guidelines for managing software projects tell you to reduce quantity when you get near deadline.
    - Some customers have a specified budget but really don't know how much software they can get for that money. They prefer to have costs fixed than to have quantity or deadline fixed.
    - Sometimes the deadline is so important that costs may increase tenfold in order to reach it, and quality and quantity may get reduced a lot in order to finish the project.

    It is extremely important to realize the meaning of all four parameters before you can talk about estimating project schedules.

    Lars.
  • by Tassach (137772) on Monday November 05, 2001 @11:24AM (#2522406)
    I've been developing software professionally for about 14 years now. In that time, I've almost NEVER seen a development project get completed in the allotted time. This has been true even when the schedule has been padded outrageously to account for slippage.


    The biggest problem I've seen is requirements creep. Most often, you don't have a firm set of requirements to start with. Management and programmers both have a tendency to view requirements documents and other formal software engineering practices as superfluous. The problem is that without a firm set of fixed requirements, you are always trying to hit a moving target.



    Another problem is attitude, mostly on the part of management, but programmers are guilty too. One faulty attitude is that we are conditioned to expect immediate results. There's also a prevailing attitude that there is never enough time to do it right, but there's always enough time to do it over. This leads to undocumented, unmaintainable masses of code that just get thrown away after a while.



    Even worse, you wind up with garbage code that SHOULD be thrown away and re-written from scratch, but winds up getting patched and modified for years. I can't tell you how many times I've had a manager say "there isn't time to rewrite it, just patch it". That would be OK if you were only going to patch it once -- but you wind up patching the same program a half dozen times, and it winds up taking twice as long to do all the patches as it would have if you had just rewritten it from scratch.

  • by Bobo the Space Chimp (304349) on Monday November 05, 2001 @11:26AM (#2522416) Homepage
    You can develop properly, but you have to design modules with all specified functionality in mind -- no last second adding in "oh yeah, don't forget the login system" or "we're gonna want a WOW display attached to the processing so add in all these hack hooks at the last second into the core engine."

    If you need that stuff, design it in from the start. Too many programmers worry about general design to make future expansion easier, while leaving out consideration for real, hard requirements that won't be implemented until later in the project.

    And to avoid the problem with really bad bugs that are responsible for the (double it and add 5) estimation, take a little extra time to write exhaustive tests (as far as possible) for each module, indeed each function, to make sure it doesn't do something wrong when given values outside the "happy path" input range.
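    The testing advice above can be sketched as a check that deliberately leaves the happy path. The module function `parse_percentage` is a made-up example, not something from the original comment:

```python
def parse_percentage(text: str) -> float:
    """Toy module function: parse a string like '42%' into 0.42."""
    value = float(text.rstrip("%")) / 100.0
    if not 0.0 <= value <= 1.0:
        raise ValueError(f"percentage out of range: {text}")
    return value

# Happy path:
assert parse_percentage("42%") == 0.42

# Checks with values outside the happy path, per the comment's advice:
for bad in ["-5%", "150%", "abc", ""]:
    try:
        parse_percentage(bad)
        raise AssertionError(f"{bad!r} should have been rejected")
    except ValueError:
        pass  # rejected as expected
```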
  • by KyleCordes (10679) on Monday November 05, 2001 @11:28AM (#2522433) Homepage
    This approach applies, more or less, sometimes MUCH less, depending on how well understood the problem domain is, how many times you have done it before.

    If you're building your 57th e-commerce web site, which works roughly like the 56 you built before, you can estimate very, very well, and you can reduce coding to nearly data entry.

    If you're solving a problem of unknown scope, which your team has not solved before, for which the solution is not clear, and for which analysis has revealed some but not all of the details, then your estimate will not be very accurate.
  • by keath_milligan (521186) on Monday November 05, 2001 @11:28AM (#2522436) Homepage

    If the software industry were saddled with the same level of process that exists in other engineering professions, we'd still be using character-based software, the web and the internet as we know it today wouldn't exist and most business would still be conducted on paper.

  • by xyzzy (10685) on Monday November 05, 2001 @11:29AM (#2522438) Homepage
    I agree with your risk management comment, and with a later poster who mentioned fixing the endpoint, but I'm not sure I agree with your claim that it can't be pinpointed with any degree of accuracy.

    After ~15 years in the industry, I've found that one thing that makes a huge difference is the experience of the team, and the familiarity between the actual engineers and the project management.

    As you have experience solving a variety of classes of problems, you can predict with increasing accuracy the time it'll take you to solve later problems. And as your management sees you getting increasingly accurate in your estimates (based on past projects) they can create better and better schedules and estimates for the project as a whole, and have a better intuition for the gray areas of development, or the greener developers.

    Projects that tend to go off into the weeds have included (in my experience) wholly green teams, wholly green management, or areas of development that are outside the areas of expertise of one or both.
  • by KyleCordes (10679) on Monday November 05, 2001 @11:31AM (#2522450) Homepage
    Note that many of the kinds of projects you mentioned also sometimes have cost and time overruns of remarkable size.

    Note also the enormous difference between building the first 747 / skyscraper / nuclear submarine and the 15th or 1500th of each.
  • by Trinity-Infinity (91335) on Monday November 05, 2001 @11:32AM (#2522459) Homepage
    Check out the CSE Center for Software Engineering [usc.edu]
    Home of ....
    • COCOMO [usc.edu] (COnstructive COst MOdeling)
    • MBASE [usc.edu] (Model-Based Architecting & Software Engineering)
    • and other resources [usc.edu]
  • by sid_vicious (157798) on Monday November 05, 2001 @11:32AM (#2522461) Homepage Journal
    I remember being in my software engineering class in college the day the professor was lecturing on "CoCoMo" (think it stood for "Cost Completion Model").

    He very carefully laid out the algorithm - I don't have my textbook handy, but it involved elementary mathematical operations on estimated man hours, estimated lines of code, estimated overhead, etc., then at the end -- and I am not making this up -- they multiply the result by a "magic number".

    Where did you get the magic number, oh sage of the ivory tower? Well, we just made it up -- it seems to work.

    It hit me then that the whole discipline of estimating cost completion is all bullshit. You might as well be estimating with a crystal ball or divining the future with chicken bones. Since I've been working, the best advice I've gotten so far has been "take how long you think it'll take and double it".
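    For the record, COCOMO stands for COnstructive COst MOdel, and the "magic numbers" are Boehm's empirically fitted constants. A sketch of the basic model in Python (the constants are the published basic-COCOMO values as best recalled; treat them as illustrative, not authoritative):

```python
# Basic COCOMO: effort (person-months) = a * KLOC**b, where (a, b) are
# constants fitted to historical project data -- the "magic numbers"
# the professor admitted were simply made to fit.
COCOMO_CONSTANTS = {
    "organic":       (2.4, 1.05),
    "semi-detached": (3.0, 1.12),
    "embedded":      (3.6, 1.20),
}

def cocomo_effort(kloc: float, mode: str = "organic") -> float:
    a, b = COCOMO_CONSTANTS[mode]
    return a * kloc ** b

# A 32-KLOC organic project comes out around 91 person-months.
print(round(cocomo_effort(32)))  # 91
```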
  • by dybdahl (80720) <info AT dybdahl DOT dk> on Monday November 05, 2001 @11:33AM (#2522478) Homepage Journal
    When you construct a house or a power plant, you are in a business with subcontractors who can take on some of the risk. It is generally accepted to set a fixed price, because the procedures involved are mostly known.

    In software, however, most projects do not rely on known procedures. It is fairly easy to estimate the costs of creating 1000 different window layouts, which is a known procedure, but it is a very difficult task to estimate the costs of implementing the layouts.

    If software development spent as much energy estimating each new task as construction projects do, developing software would be extremely expensive. Just imagine that you had to write one while loop according to one ISO standard, and another while loop according to a different ISO standard, because the two loops were in functions categorized differently by a third ISO standard. Instead we hire a bunch of programmers and let them program it themselves. Sometimes we do it a little more elaborately - Open Source, Extreme Programming, etc. - but it's still a bunch of programmers hacking around.

    The trick is to manage it anyway - and that's why managing software projects will always be risk management and not very predictable.

    Lars.
  • by mobiGeek (201274) on Monday November 05, 2001 @11:47AM (#2522571)
    Software development is not a science in the normal sense. Designing large software systems is an art. It cannot be pigeonholed.

    An experienced software project manager can usually be quite accurate in estimation of effort for a well analyzed software project.

    This, however, highlights a few problems in The Real World:

    • many (most?) software projects are ill defined.
    • many (most?) software projects are not analyzed properly prior to the start of architecture design and start of coding
    • many (most?) software projects are not resourced properly up front; resources are thrown haphazardly at a project once deadlines are quickly approaching
    • many (most?) software projects are given unrealistic deadlines prior to analysis being done
    • many (most?) software project leaders do not have the political experience needed to manage the business expectations of a project [most engineering schools have mandatory Management Sciences courses for their students. Most CS schools avoid Humanities courses...yes, I am a CS grad].
    • many (most?) software senior developers are not encouraged to get involved in the "business" aspects of software projects.

    Am I too pessimistic? I don't believe so.

  • by clare-ents (153285) on Monday November 05, 2001 @11:51AM (#2522598) Homepage
    "
    That's exactly the sort of attitude that has caused the spectacular failures of software projects to be accepted as the norm. Software Engineering is *not* "hacking" or "coding" or "programming", it's *engineering*, like building a bridge or a skyscraper. Yes, those projects go over time and budget too sometimes, but they are the exception rather than the rule.
    "

    But that's simply not true. Writing any non-trivial software is not the same as straightforward engineering. For a start, there is the rate of progress: how many people have 30+ years' experience of building 50-story buildings? How many people have 30+ years' experience of dealing with terabyte-sized datasets?

    When building software, previous code can be reused for a very small amount of effort; when building skyscrapers, the previous design can be reused for only marginally less effort than the last one.

    Compare the difference between building a C compiler from the gcc source and the World Trade Center from its blueprints.

    Essentially the estimate is

    Time = [time to do the bits we know how to do (accurate)] + [guess for the bits we don't know how to do (inaccurate)]

    With software, the first part of that expression tends towards zero, since for most things we know how to do we can reuse code, whereas with building it remains a large, accurately estimated term.

    The error here will be of the form

    Error = [variance of inaccurate terms] / [total]

    For the example of a skyscraper, whose construction is mostly a known method, this will tend to a small number, since the inaccurate term is much smaller than the accurate term; but for software, with reuse of all the known methods of coding, this will tend to 1 - i.e. 100% error in the estimate - and hence the conclusion that it's worthless to even bother estimating.
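    A toy calculation makes the point concrete (all numbers are invented for illustration; the error formula is the poster's own, simplified by assuming the guessed term can be off by roughly its own size):

```python
def relative_error(known_part: float, guess_part: float) -> float:
    """Poster's formula: error ~= variance of the inaccurate term / total,
    assuming the guess's variance is about as large as the guess itself."""
    return guess_part / (known_part + guess_part)

# Skyscraper-like project: routine, well-understood work dominates.
print(relative_error(known_part=90, guess_part=10))   # 0.1

# Software project: code reuse shrinks the known part toward zero.
print(relative_error(known_part=5, guess_part=95))    # 0.95
```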

    In my company we can accurately estimate how long projects will take, provided the projects are mostly identical to ones we have done before - and if this is the case, it generally costs the client more in programmer time in meetings to discuss the cost of the job than it does to write it.
  • by markmoss (301064) on Monday November 05, 2001 @11:52AM (#2522605)
    The problem is, you don't get paid for coding up a small working prototype in order to do an estimate. So my estimating technique is:

    Figure the time to do the parts I understand.

    Count the parts I don't understand. Allow a very long time for each of them.

    Add it all up, then multiply by 3
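    The three-step recipe above translates directly into code (the 10-day allowance per not-understood part is a hypothetical number, not the poster's):

```python
def estimate_days(understood_task_days, mystery_count, allowance_days=10):
    """Figure the time for the parts I understand, allow a long time
    for each part I don't, add it all up, then multiply by 3."""
    raw = sum(understood_task_days) + mystery_count * allowance_days
    return raw * 3

# Two understood tasks (3 and 5 days) plus one mystery part:
print(estimate_days([3, 5], mystery_count=1))  # (3 + 5 + 10) * 3 = 54
```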
  • by vanix (177445) on Monday November 05, 2001 @11:53AM (#2522607) Homepage
    That is very interesting, but how do you determine the fixed price you charge the customer?
  • by Mr. Slippery (47854) <tms@nOspam.infamous.net> on Monday November 05, 2001 @11:53AM (#2522609) Homepage
    That's exactly the sort of attitude that has caused the spectacular failures of software projects to be accepted as the norm. Software Engineering is *not* "hacking" or "coding" or "programming", it's *engineering*, like building a bridge or a skyscraper.

    We'd like it to be so, but it ain't.

    The behavior of bridges and skyscrapers is determined by classical physics, which allows us to make precise predictions.

    The behavior of computer programs is governed by complexity theory, which tells us that any reasonably complex program has non-predictable behavior. And the manageability of software development depends on human understanding and appreciation of code - there's an aesthetic factor.

    Certainly things could be better...the fact that something has a large component of art doesn't mean that there aren't areas of mastery for a practitioner to study. But at its heart, the creation of complex software requires a creativity and intuition that cannot be set to a timetable.

    (Yes, one can "engineer" art to some degree - popular music being an example, where teams of marketers follow formulas to construct the next boy band. But that does not result in a quality product that stands the test of time.)

    In a well analyzed and properly planned project, the actual coding stage is little more than data entry.

    But the problem still applies to the design phase.

  • by halflinger_n (534215) on Monday November 05, 2001 @11:55AM (#2522629)
    And that's beyond the marketroid messing things up. In the physical world you just would not build some things certain ways - they would fall down. (This is one reason that engineers need certification and licensing: a way of making sure that none of them will succumb to the marketroid telling them that "concrete is out - use this cool blue toothpaste to build that bridge.") I think this kind of licensing would be very difficult to enforce in software engineering, though that is for another discussion... (which IIRC has already happened here...). In the world o' software there is no upper limit to the amount of complexity you can add to a project. Some of the complexity comes in automatically (various OSes, hardware profiles, DBs, etc.) and some is sprinkled liberally by the marketroids who tell you that now it has to have an "XML tie-in"; either one is enough to make it "fall down" on its own. Isn't that the fun of it? (No - you're not allowed to beat the marketroids... that one is the boss's nephew...)
  • 2 weeks (Score:3, Insightful)

    by KarmaBlackballed (222917) on Monday November 05, 2001 @11:56AM (#2522633) Homepage Journal
    Ask a sharp programmer to estimate the time to develop a software solution and he might shrug and look irritated. Ask him if 2 weeks will be enough time, and there is an 80% chance he will say "of course" no matter what the task!

    Gung-ho programmers are optimists. Couple optimism with the innumerable factors involved in programming a non-trivial application and you will get what we have today.

    By the way, I am a programmer and I have little to no confidence in my time-estimation abilities, or anyone else's. It has taken me 14 years to come to grips with that.
  • by chrisreedy (127131) on Monday November 05, 2001 @11:59AM (#2522655)

    People like to compare the software development process to manufacturing. But people also ignore the fact that before manufacturing there is design, which culminates in the first version of the object. Manufacturing produces versions 2 and beyond.

    The process of developing software is more like the process of producing the ultimately detailed design. For software, manufacturing is a mechanical process -- duplicating the initial working version.

    Now, with this view, ask how often the design for a product is completed on schedule, especially for a large complex product like an airplane (or the Intel Itanium processor :-)). I don't believe (I have no firm data) that the experience is a lot better than the experience for large software projects.

    Chris

  • by Kefaa (76147) on Monday November 05, 2001 @12:05PM (#2522687)
    The issue is not physics versus manufacturing; it is the scope and cost containment that manufacturing practices. As a person who has led multi-million-dollar projects, I have grown used to the cliché that goes something like this:
    If we built homes like software we would all be living in the street, penniless...

    The major issues I have seen revolve around a lack of scope and cost control. In many cases it is because there is little penalty for being late or over budget. In cases where penalties exist, it is often beneficial to overestimate the effort or cost required. Then once the money is approved, using it becomes easy.

    Going back to the analogy consider the following:
    Scope
    If you were building a house, each piece has a specified cost, known in advance to a very large degree. In addition, altering the scope itself often incurs a penalty, because the work is not done by the owner. You plan a three bedroom, 1.5 bath home. Midway through planning you decide to make it a two bath home instead. The architect will charge the "re-scoping" fee and the builder will add the material fee. Now do the same after construction has begun. The architect gets their fee, the builder adds the material and resource costs, plus a "revision" fee for changing your mind after construction begins.

    During a software project, it is common for individuals to approach the developers and ask to expand the scope. This would be analogous to approaching one of the work crew and asking them to just add the extra half a bath. The difference is the work crew would get fired, and the developer gets bonus points for adding the feature, either directly or indirectly.

    If the developer chooses not to do it, or pushes them to the project manager, the client may label them uncooperative or difficult to work with. The project manager, not wanting to be labeled either, may coerce, cajole, or beg the developer to accomplish it without a scope revision. Refusing carries real financial impact for the developer at some point, so there is little incentive to hold the line.

    Cost
    I call this the "Porsche syndrome".

    I go into the Porsche dealership and see a new 911 Carrera Coupe. Smiling, the dealer offers to sell it at a deep discount: with options and accessories, $84,000 (U.S.). Whewwww baby!!! I cannot afford that. "Look," I tell him, "my wife will never approve that. You need to get it down to $28,500 tops." Would any of us expect to have the price cut down? By half or more?

    Okay, how about: "Look, what will it take to get it under $30,000? Seriously now, what do I have to give up?" As the dealer is escorting me to the door, he explains that the only way I will get this car under $30k is with a mask and a gun, or from a scrap metal dealer.

    Yet daily we go to developers and tell them to do the same. We ask for an estimate and then go back with "This is too much, it needs to be smaller or it won't get approved!" --Insert blank stare here-- The idea that if something cannot be cost-justified it should not be done is often lost in the "request" itself.

    To nearly guarantee a project is on budget and on time requires things many companies are unwilling to provide: strict scope-control procedures, with oversight by the person responsible for the money. That means each change, regardless of how trivial, must be approved by someone above the project management team, with business justification. It also means that requests for scope change cannot be made to developers directly, by anyone.

    I was very happy with the people who built my home. When speaking to many of my friends and coworkers who built their homes, they describe it as a process akin to having their flesh removed. Everything required such effort and detail that many would not do it again.

    Most of them were looking for the relationship to be like one at the office, where we all want to get along and help each other out. But an office relationship is not a commercial arrangement, and when we do put a commercial context around it, we see that many offices lack structure.

    Internal organizations can be set up like commercial ones, but it is usually unwelcome, as the perception is that everyone should be working for the greater good of the company, and this has the appearance of bureaucracy. Even if that perception is inaccurate, everyone "wanting to get along" prevents it from being implemented.
  • by King Of Chat (469438) <fecking_address@hotmail.com> on Monday November 05, 2001 @12:06PM (#2522691) Homepage Journal
    (Maybe someone should do a survey to find out how many of us are pros?)

    Likewise, I've been developing (C++) for a living for about 12 years now and I've come to some conclusions:

    There are estimating techniques/metrics which will work. They depend upon going round a few times to "calibrate" and consistent application. "Task Points" was a good one - basically break your use cases down and down until you have a series of one-line statements about the system. Multiply these by your magic number and that's the estimate. This, like all estimating techniques, is built on sand because:

    It depends upon a development team sticking around long enough to do a few projects to calibrate your method.

    It depends upon the exact functions of the system being known at the time you do the estimate. This is the killer.

    I have never worked on a project where the exact functioning is known at the time coding starts. I have, however, observed that the more analysis/design you do before estimating, the more accurate the estimate is. The problem is that people always want the answer (estimate) before they've given you the problem (spec).

    FWIW On small projects (which are generally better defined), I run through the spec, do a rough n' ready count up of the number of classes, multiply by a factor (decided by the complexity of each class and who I think is going to code it) add a QA+debugging allowance and come up with figures which aren't too wide of the mark.
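    That rough-and-ready recipe can be jotted down as a sketch (every constant here - the days per complexity point, the QA fraction - is invented for illustration, not taken from the comment):

```python
def small_project_estimate(class_complexities, days_per_point=2.0, qa_fraction=0.3):
    """Count the classes, weight each by complexity (and by who will
    likely code it), then add a QA + debugging allowance on top."""
    coding_days = sum(class_complexities) * days_per_point
    return coding_days * (1.0 + qa_fraction)

# Five classes with complexities 1-3: (1+2+3+2+1) * 2 * 1.3 = 23.4 days.
print(round(small_project_estimate([1, 2, 3, 2, 1]), 1))  # 23.4
```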

    Oh yeah, and the "who's coding it" is important. Lots of studies show that the difference between "good" and "bad" coders can be a factor of ten. I've been slammed by PMs after estimating how long something would take me, then the PM puts some "cross trained" ex VB dork on it.

    To summarise: it is possible if you know who is coding what. Recommendations: 1) read Brooks, 2) keep it small, 3) ignore any of the "latest methodologies" that Project Managers try to sell you.

  • by Totally_Lost (177765) on Monday November 05, 2001 @12:12PM (#2522716)
    You are absolutely right for most inexperienced developers. It was certainly the case when I was 24 and first started fixed-price contracting. The reality is that with a small amount of positive feedback most developers can start to get this right - typically within 25% after 3 months, and within 10% in a year. In my case I underbid the first project by a factor of five, and spent 3 months working at about $0.50/hr; the second project was within 50%, and the third nearly dead on. Working and getting paid by the job is experience that I think nearly every programmer needs BEFORE being allowed to work T&M or salaried.

    There are secondary effects of working by the job - you very quickly learn to do only what you are getting paid for - and don't spend a lot of time on personal research projects or unnecessarily rewriting other people's code that is working just fine but doesn't conform to your personal style. KISS is absolutely a necessary personal style - anything else and you are doomed to continuous cycles of project overruns and long talks with management about why your project is another month or two away from completion.
  • by nologin (256407) on Monday November 05, 2001 @12:13PM (#2522724) Homepage
    1. Salesperson comes to initial agreement with client about a product.
    2. Salesperson contacts Software Department and finds out that product doesn't exist.
    3. Salesperson diplomatically alerts client to that effect (essentially turning product into project).
    4. Salesperson initiates project with Software Department director and sets a firm deadline based on estimates from company.
    5. Developers begin working on project.
    6. Problems crop up; appears project may run a little late.
    7. Company hires or assigns a project manager to try to put project back on time.
    8. Project manager and Software Director send mixed signals to the development team, wasting time and causing further delays.
    9. Project manager makes a final projection of delivery, which appears to be far later than expected.
    10. Company replaces Software Director due to delays in project.
    11. Developers become less efficient due to uncertainty in company.
    12. New Software Director and Project Manager make compromises of project to reduce the delays.
    13. Project is eventually completed. Project manager is assigned to new project or leaves company.
    14. Go to step 1...

    Some companies actually do business this way. It scares the hell out of you if you are the client, but it is even scarier if you are working for the company in question.

  • by Overt Coward (19347) on Monday November 05, 2001 @12:16PM (#2522750) Homepage
    The key to function points -- or any other estimation technique -- is relying on historical data to predict future results. This means that they are fairly accurate as long as you collect metrics and stay within the same general project domain and relative project size. The more radical the new project's departure from historical size or domain, the less accurate an estimate will be.

    However, the biggest thing to remember is that no matter what estimation method is used, a methodical approach to analyzing the problem will almost always yield a reasonable estimate.
    The main reasons projects go over schedule and budget are:

    1. "Feature creep" -- having the requirements change significantly over the course of the project without adding the impact of changes into the schedule.

    2. Rampant optimism -- many engineers (and managers) will typically estimate how long they think it should take to do a specific task but will not add in a buffer in case something goes wrong. And something always goes wrong.

    3. Artificial deadlines -- project schedules where the budget (time and money) was set by customer/marketing commitments, and not by the technical requirements at all.

    4. Calendar/personnel issues -- people take vacations, there are holidays, and people occasionally fall ill. Plan for it. Also, don't forget any company/department meetings, training, seminars, etc.

    5. Dependencies -- if a required piece of hardware or software won't be available (or is late), it can impact the overall schedule, especially if critical path tasks depend on those materials.

    Risk management is indeed the key. As the project manager or lead engineer, it is your job to predict what the potential risks might be and attempt to mitigate them on a cost-effectiveness basis. You can still be bitten by bad luck, but you can minimize the chances it will strike.
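    A toy illustration of that mitigation arithmetic - pad each task's raw estimate by the expected cost (probability times impact) of the risks you've identified. The tasks and all the numbers below are invented:

```python
# Risk-buffered schedule sketch: each task carries a raw estimate plus
# a list of (probability, impact_in_days) risks; the buffer is the
# expected cost of those risks. All figures are made-up illustrations.

tasks = {
    "parser":  {"days": 10, "risks": [(0.5, 4)]},            # 50% chance of +4 days
    "ui":      {"days": 15, "risks": [(0.3, 10), (0.2, 5)]},
    "install": {"days": 5,  "risks": []},
}

def buffered(task):
    return task["days"] + sum(p * impact for p, impact in task["risks"])

total = sum(buffered(t) for t in tasks.values())
print(total)  # 12 + 19 + 5 = 36 days instead of the raw, optimistic 30
```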
  • by bedmison (534357) <808@@@music...vt...edu> on Monday November 05, 2001 @12:34PM (#2522866)
    I think it is important to distinguish between building custom systems and building shrinkwrapped apps. It IS possible to estimate, with a fairly high degree of accuracy, projects which have a fixed set of FULLY defined requirements. This means everyone interested in the project signs off on the requirements before the first line of code is written. This is very useful for beating your customer into submission when they change their minds 3 months into the project.

    Shrinkwrap developers face a much different problem, in that the requirements are often set by the marketing goons based on a tenuous grasp of what they THINK the buying public wants, as opposed to actually polling existing users to find out what they REALLY want.

  • by Xiver (13712) on Monday November 05, 2001 @12:44PM (#2522910)
    The problem with estimating development time lies mostly in the management's concept of software development. I was hired to work on a project that was estimated by management to last two months. My estimate was four months and the actual time it took to complete was over a year. Why could I not meet the project deadline?

    The customer claimed it was because I could not seem to fully complete a component of the project. What they really meant was I could not fully complete a component of the project before they would request a change to that component that in some cases required a complete rewrite of the component. They didn't think it was a big deal to add a button here or there in the application after all it was only a button. Never mind the fact that each of those buttons required stored procedures to be written and existing stored procedures to be altered. They would get upset that I could not make their requested changes in a day when they wanted to completely alter the way the interface to the application worked.

    The bottom line is most people who don't know anything about software development don't think it is a big deal to add a feature here and there at the end of the development cycle. I try to equate software development to carpentry. Sure I can add another door in the center of those cabinets, but don't expect it not to affect the other doors and their space within.
  • by rafial (4671) on Monday November 05, 2001 @12:46PM (#2522921) Homepage
    It would seem that with fixed cost billing you'd need to specify rigid acceptance criteria up front to avoid the customer lobbying for "just one more feature" under the cost umbrella of the current contract.

    How do you reconcile this with the tendency of XP projects to deliver something that is noticeably different from the customer's original conception of their need (but that in fact fits very well the customer's need as learned over the course of the project)?

    I'm seriously interested to hear from folks who have figured out how to marry an agile development process to fixed cost contracts.
  • by andy4us (324798) on Monday November 05, 2001 @01:05PM (#2523041)
    One of the greatest criteria for a good programmer, whether it is the quality of the code or the ability to estimate a schedule, stems from humility. Part of the problem with people when estimating a schedule is that they think they are Superman. They think that they are so good that the complex task in front of them is trivial. These people tend to have very buggy code as well (normally from insufficient testing). All programmers suffer from this to some extent. I've also noticed that these people tend to never use libraries, since they can write one better, but then use up all their scheduled time rewriting libraries and never actually working on the project.

    Personally, I tend to do the best hourly breakdown I can and then double it before submission. This is normally not too far wrong (off by maybe a week on a 3 month project). The doubling allows for inaccuracies, meetings (which really do take time!), and spec changes. I may add more "fudge factor" depending on my feelings for how well the spec is sorted out and the quality of management (i.e. weak management will allow spec changes every week, good management will filter well).

    ANdy
  • by tz (130773) on Monday November 05, 2001 @01:07PM (#2523053)
    When I estimate, and the resources are there, I usually hit, if not dead-on, then very close. Basically I look at how complex the system (in this case, embedded systems) is going to be, and can fairly accurately estimate how long it will take me to complete the program. The occasional 20% miss is because things go easier (e.g. I find an OS solution so I don't have to write something) or worse (e.g. the hardware has problems so I can't test). But I can usually see the complexity - number of inputs, outputs, equations (reduced to atomic operations), and how they interact - and I know my own "velocity" (see the Extreme Programming series for a larger discussion of something that does work).

    But that doesn't help. The first problem is if I say something will be done by January 15th, they will still want it (without any help, tools, extra paid OT, etc.) on December 15. The technically correct estimate is not politically (or in marketing terms) correct.

    A second problem is when you are at the bottom of the feeding chain, so if some of your test hardware goes bad, you can't get it fixed quickly, or if they disassemble your test setup every few weeks to ship engineering modules (which aren't replaced) to customers, so you start with the assumption of a reasonable development and test environment, and retrograde to LEDs on soldered leads to check things.

    Sometimes this effect is in a different order - I depend on a computer or test hardware being engineered in parallel by another group, so the first test milestone in January can't be done until May, when the hardware actually appears. Oh, and the extra time for an emulation system so we could develop without actual hardware was shot down, because the hardware was guaranteed to be there in January. I think one project didn't have functional hardware until two weeks before the first ship date.

    Those are purely technical, but then there are political considerations. E.g. I'm using the Unix type work environment that exists everywhere free (Linux, Win32 with CygWin, etc.) and GCC, but they have been using idiosyncratic windows tools - something not quite completely unlike make as a builder, some other C compiler (it had much better C++ support, but C vs. C++ embedded is another war). Some code (non-)documentation and editing tool that isn't integrated (they promise they might do something in a few years to integrate things). So I have to change from a Porsche to a top-heavy underpowered motorhome and still try to keep up speed.

    Then some higher up doesn't like version control tools. Not even something as simple as CVS. So we can't reconstruct anything other than release images making simple changes or backouts (or integrations) much more difficult.

    Why is it impossible to estimate how long it takes to empty a 50 gallon trough with a 1 gallon bucket assuming you can do one bucketful every 10 seconds? Well, they want it emptied in 3 minutes regardless of your calculation. No, you can't use the spigot so when the trough gets empty you won't be able to fill the bucket. Oh, and the bucket had a hole in it and we replaced it with a sieve. And didn't we tell you before the estimate that you can't empty close to the trough, you need to walk 100 feet up stairs and pour carefully through a 1 inch hole - we haven't budgeted for a funnel either. Oh and...

    Estimates are wrong more because the assumptions are wrong (or those doing the calculation are wrong). Or what needs to be submitted needs to be wrong to be accepted - lowest bidder then add cost after it is half done vs. accurate original bid.

    And if the environment is such that you can't control things, something like extreme programming is the way to go since it is flexible enough to accommodate constant changes to function, priority, and staffing. Though it won't work when the problems are political.
  • by splante (187185) on Monday November 05, 2001 @01:11PM (#2523086)
    How are you compensated for changes that occur after the bid, as development occurs?

    XP calls for short release cycles of a few months at most. Do you just bid on the current short release cycle or on the whole several month (or year) project?

    XP calls for implementing the highest priority features first, so features that slip past the release will be of lower priority. Do you get paid for a release even if lower priority features slip?

    XP recognizes four variables in software development: cost, time, quality, and scope. Of these, one is usually going to have to give. XP recommends fixing cost, time, and quality and allowing the scope to change. It recognizes that requirements are never clear at first, and customers can never tell you exactly what they want. As development progresses, you adjust the scope to match the conditions as you find them. So, following XP, are you saying that you charge a fixed price but change the scope throughout the life of the project? I can see how that can work, but I don't think that's what people understood your post to mean, and it's not what most people consider 'fixed bid'.

    We use and like XP as well, though we charge by the hour. I am intrigued to hear more about how you use XP with fixed bids. It seems like it might be a fixed bid for "whatever we can get done in 3x8 man months," though.

    (my comments about what XP says come almost directly from Extreme Programming Explained, by Kent Beck [amazon.com]).

  • by Anonymous Coward on Monday November 05, 2001 @01:17PM (#2523127)
    It can be done. The place I work now does pretty well at meeting goals. Here's my experience:
    • You must have good, seasoned management that has successfully shipped working products before (preferably, products in the same category).

    • Most of your programming staff must also be seasoned pros with multiple products shipped in the past.

    • You must have formal requirements and design documents and they must be maintained over the development cycle.

    • Middle management must protect the programming staff from capricious changes to schedules and requirements.

    • Middle management must protect upper management from capricious programming changes (hey! let's develop a new language to meet this requirement, it'll be a lot more fun to code that way). Programmers are just as bad as management at changing things late in the process.

    • The best practice I've seen for making schedules is to set up lots and lots of intermediate goals. Just as important, those intermediate goals must involve integration from the very front end to the very back end of the product in question. Integration of all components must happen as soon as possible in the process, even if nothing is fully working.

    • A formal process of builds, build tracking and build deployment into a test environment must be in place from the very first week of the project. Everything goes under source code control from the very start (including stuff that isn't exactly software, like documentation, html files, etc).

    • Testing should start before there is even anything to test. In the beginning, it's enough to test that the stuff that is there builds and installs and is available, even if it doesn't do much yet. Just being able to say you've started testing is worth something, even if you aren't formally tracking bugs yet.

    • Code for demos. It's a pain in every programmer's butt, but done properly, coding for demos can really help you stay on track. We do a major demo every time the board meets (early on, your demo may be pretty lame and you may have a lot of stuff faked out or held together with chewing gum and baling wire, but at least you have SOMETHING to show). The demos often cause extra work that doesn't go to the bottom line, but they also provide insight into the final product and feedback to the requirements process. Often, it is the process of integrating several components early on so they can be demoed that uncovers glaring holes in the design or implementation. Also, just knowing that you have to lash together a demo (at least once a quarter) influences the way you code and the flexibility you incorporate.

    • Programmers have most of the responsibility for the coding being done in time and it is imperative that they use good practices. Requirements WILL change (for a good reason, you hope) and well-designed and well-written code will adapt to requirements changes ten times better than dreck that was thrown together by a careless programmer.

    • The best way to keep your programmers on track (and I'm speaking as one of them, not a manager) is to have real, formal design reviews (based on written designs) and at least informal code reviews. The best system I've seen is to have each programmer have one "buddy". The buddy looks over all the programmer's code and understands it pretty completely. The programmer is then driven to make the code look good (so the buddy doesn't find foolish mistakes or obviously lazy shortcuts) and is backed up by the buddy (in case the person leaves the company or gets switched to another project or something). In my experience, strict, formal code reviews aren't as useful as informal code reviews. You get everyone in a room picking on one programmer's code and that programmer is going to resent some of the comments, no matter how well it's done and no matter that tomorrow will be someone else's turn. The resentment turns against the process, not the defects, and pretty soon you have a broken process.



    It can be done, teams do it all the time. It just takes skill, dedication and attention to not-very-fun process.
  • by Anonymous Coward on Monday November 05, 2001 @01:18PM (#2523129)
    12a. Salesperson notices that competitor has new whizbang feature and changes specifications to include it without adjusting deadline.
    12b. Software Director and Project Manager inform developers that half their work is to be discarded and that they will be working unpaid overtime. Developers not pleased.
    12c. Best developers leave to work for competitor.
  • by Aceticon (140883) on Monday November 05, 2001 @01:20PM (#2523144)
    Let's see:
    • At any point in time the ground your skyscraper stands on can crumble into nothingness. [Operating System bugs]
    • Your skyscraper can be required to stand on slightly different types of ground. [Operating System types and versions]
    • Also the steel, glass and cement you are using have wildly varying properties. They also might have been imposed by an outside entity (read Company Standards). [Third Party Components]
    • Plus the elevators that you get always do less than their specifications (for example they don't stop on the 5th floor). The next version of the elevator will actually do that but on the other hand it doesn't fit on the elevator shaft.[Third Party Components and Applications]
    • Also half-way through building the skyscraper you find out that the plan has been changed and it's now supposed to have a Shopping Mall on the ground floor. [Creeping Requirements]
  • by richieb (3277) <richieb AT gmail DOT com> on Monday November 05, 2001 @01:25PM (#2523170) Homepage Journal
    Building software from requirements is just like walking on water. It's real easy if they are both frozen. ;-)

    ...richie

  • Ummm, ok......but (Score:2, Insightful)

    by tacokill (531275) on Monday November 05, 2001 @01:32PM (#2523215)
    At one point, NASA could estimate within 5 or 10% of EVERY development project they had running. Of course, they are CMM level 5 - which basically means they have their shit together. Most of everyone else, however, does not. In fact, I would say that the vast majority of projects out there could be considered to be in a state of chaos and I don't see that changing until two things happen: a) the "business" people think through what they REALLY want instead of just throwing a bunch of unformed ideas at the wall and hoping they stick. It constantly amazes me how little thought is given to systems by the very people who have to depend on them. (ie: solid requirements) and b) the developers must start acting like professional developers and not "hackers". I realize that there is a grey area between art and science but too many programmers I know take too many risks and don't think through their analysis. Oftentimes, projects fail because something is not thoroughly analyzed or thoroughly thought out. Don't get me wrong, programmers don't need to be experts in risk management, but some acknowledgement of risk MUST be made by developers nowadays. You can't just go into your corner and code away.
  • by cdn-programmer (468978) <terr AT terralogic DOT net> on Monday November 05, 2001 @01:34PM (#2523226)
    But only AFTER it has been designed. I've been a developer for over 20 years and far too often what is done is that development estimates are demanded before the project has even been designed.

    In traditional development projects, typically people KNOW what they need to do before it is undertaken. The contractor starts with a blueprint. It is actually possible to count the number of 2x6's that a house will need. One can make an estimate of the time required to nail one 2x6 to another and then multiply by the number in the house in order to estimate how long it will take.
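    The contractor's luxury, sketched in two lines (the board count and the nailing time are invented numbers - the point is that a blueprint makes them countable at all):

```python
# Blueprint estimating: count the pieces, multiply by the unit time.
# Both numbers below are made-up illustrations.
boards = 480            # 2x6's counted from the blueprint
minutes_per_joint = 4   # time to nail one board to the next
hours = boards * minutes_per_joint / 60
print(hours)  # 480 * 4 / 60 = 32.0 hours of framing
```

Software rarely gets this, because the "blueprint" usually doesn't exist until the project is half built.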

    I've had ignorant management ask on far too many occasions how long it will take to develop such and such a project. The best answer is: how long is a string?

    Management that has no feel for the problem is the problem. How long does it take to write a book?

    Well - I suppose it depends on the book. Just because you can not estimate how long it will take does not mean that books will not be written or that they are not valuable.

    I can write a book in a day... It will just be a simple book and quite short... but then did anyone define how many pages a book must contain in order to qualify as a book?

    I can write a programming project in a day also. But it won't contain over 1/2 million lines of code. For a complex project... well, when we start to see light at the end of the tunnel, then we'll be able to make an estimate how long the tunnel was.

    That is the best answer I can give.
  • PM Estimates (Score:3, Insightful)

    by Martin S. (98249) <Martin.SpamerNO@SPAMgmail.com> on Monday November 05, 2001 @01:42PM (#2523274) Homepage Journal
    PM: How long to do this work ?
    ME: How about a spec ?
    PM: You're kidding :) I only want a rough guess.
    ME: Roughly 6 weeks.
    PM: Nah, too long, we'll never get that past the customers, let's call it 4 weeks.
    ME: Not again - remember what happened last time, when you chopped my estimate ?
    PM: Don't worry I won't hold you to it, this time!

    PM: That work finished ?
    ME: NO, two more weeks.
    PM: You said 4 weeks, look here it is in the plan.
    ME: I said 6. You said 4 weeks, and that you wouldn't hold me to it.

    PM: The only thing I can fault you on is your estimates, they aren't very good.
    ME: You £$%&* git !!!

    And practically every project manager does the same thing.

    Why engineer failure into the plan ?
  • by DrSpin (524593) on Monday November 05, 2001 @01:57PM (#2523387)
    You are forgetting politics: I have been explicitly told "Your estimates are unacceptable - they will have to be halved!"

    Others have mentioned "creeping featureism".

    There is also the "event horizon" - when faced with a project of infinite size, people will tend towards an estimate that is based on their idea of how long it takes to solve an infinite problem. For a salesman, this is a couple of days. For a typical manager, a couple of weeks. For an engineer, a couple of months.

    For estimates to be meaningful, the work has to be divided into units which you can guarantee will never exceed your event horizon.

    I have managed many successful estimates on large (over one year, more than 5 people) projects, based on the rule that any feature of the project you can identify before the project starts needs an average of two weeks to implement, document and test.

    By "feature" I mean an explicit bit of behaviour by the code, e.g. "ack an inbound packet", "echo the character on the serial line". I know any number of people who can code this in 3 minutes in perl or whatever. That is not the same as developing supportable code. All loops have to be unwound, all nesting flattened. Every level of the hierarchy has to be accounted for serially.
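    The arithmetic behind the rule is deliberately dumb (the feature list below is invented; the two-week average is the rule as stated, not a universal constant):

```python
# Two-weeks-per-identifiable-feature estimate. The features here are
# illustrative; the method is just: enumerate, then multiply.
features = [
    "ack an inbound packet",
    "echo the character on the serial line",
    "parse the config file",
    "log errors to flash",
]
WEEKS_PER_FEATURE = 2  # implement + document + test, per the rule above
effort_weeks = len(features) * WEEKS_PER_FEATURE
print(effort_weeks)  # 4 features -> 8 person-weeks
```

All the skill is in the enumeration: a feature you failed to identify before the project starts is two weeks the estimate doesn't contain.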

    Let me introduce Dr Spin's 2:1 Law: supportable code needs 2kg of paperwork per byte of executable code. That includes minutes of meetings and sketches on envelopes. (Most of it is binned, but it still has to be created.)
  • by deanj (519759) on Monday November 05, 2001 @02:01PM (#2523423)
    The real problem with software schedules is that most managers won't believe the estimates that software engineers give them in the first place. When you've been around for a while, you have a pretty good handle on how to estimate things. If you come up with an honest answer, 10-to-1 the manager doesn't want to hear it, and wants something earlier than that. I usually revert to "When do you need something?", get the info, and then tell them what features we can do within that timeframe. If they want more, it'll take longer. If they want it faster, they get fewer features.
  • by remande (31154) <remande&bigfoot,com> on Monday November 05, 2001 @02:15PM (#2523534) Homepage
    Software development isn't always like physics--often we are boldly going where people have gone before. However, certain factors in software houses cause underestimations:

    Underestimation as a Marketing Tactic
    AKA "Vaporware". Even if marketing knew when a product would be shippable, a particularly cynical marketing department may claim it to be earlier, thus freezing competitors' development.

    Lack of Feedback (Moving Targets)
    Software engineers are particularly bad at estimating because they have never done what they estimated. They are given a large project, give a large estimate, start working on it, and the project changes in the middle in a major way. This is a moving target; the estimate no longer applies.

    Major law of software development: you cannot change the spec or the development team on the project without impacting the real ship date. If you don't re-assess the estimated ship date, you are simply fooling yourself. And since the finished project never matches what was estimated, the engineers never learn whether their estimates were any good.

    One way to defend against this is to break the project down into bite-sized pieces and estimate them; a small piece gives you a chance to do precisely what you estimated. Once you have that, you can have somebody track your estimates, and come back saying something like "On average, you go one third over your estimates. Add a third to your estimates from now on, and we'll be accurate".
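    That feedback loop is just bookkeeping; here is a sketch with an invented task history:

```python
# Calibration sketch: track (estimated, actual) days for bite-sized
# tasks, then scale future raw estimates by the average overrun ratio.
# The history below is made up for illustration.
history = [(3, 4), (5, 7), (2, 2.5), (4, 5.5)]  # (estimated, actual) days

ratio = sum(actual / est for est, actual in history) / len(history)

def calibrated(raw_estimate):
    return raw_estimate * ratio

print(round(ratio, 2))  # this engineer runs about a third over (1.34)
print(calibrated(6))    # a raw 6-day guess, corrected upward
```

None of this works on a moving target; the history is only meaningful if each task was completed as estimated.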


    Management Estimates
    Often, engineers don't do the estimate. The management or marketing people tell you what must be done, and how long you have. Sometimes this is done explicitly; other times, management may have a number in mind and shame a software team into agreeing with it by laughing off any number that doesn't match theirs. Business people often negotiate the ship date with the geeks, like they negotiate with any other vendor. To a suit, vendor negotiations are how you determine the "margin", or how much the vendor is making (like when you buy a car, you and the dealer come to a number that determines the dealer's margin). This doesn't work in in-house software development because geeks hold back precious little "slack" or "margin" (they don't get paid profits, they get paid salaries); in a decent shop, geeks program at flank speed all the time and always give the project 100%.

    See Ed Yourdon's Death March or Kent Beck's Extreme Programming books for more details, and ways to avoid the above traps. Yourdon suggests that the head geek has to take a hard stand in scheduling to prevent business interests from setting both the project spec and the ship date. He especially tells you never to negotiate schedule, and to help the suits understand why you never do. Whatever number you estimate doesn't affect the actual ship date, so playing with that number is simply fooling yourself.


    Extreme Programming actually has a "planning game" (sort of a ritual dance) which places business interests and geeks on the same side of the table. Two big rules are "The geeks may not reject any part of the spec" and "The suits may not reject any part of the estimate". Once the suits set the spec, both teams break it down into pieces-parts, line them up in order of what gets done first and the geeks give their estimates. From there, the suits can choose the ship date (and can instantly see how much product will be ready by then), or can choose a certain amount of project completion (and can instantly see the ship date). The fun part about this method is that the suits can change their minds at any time by changing, adding, or removing pieces-parts, and can instantly see how that affects the ship date. The other fun part is that breaking up the project into pieces-parts allows developers to do a (small) project they estimated. This allows people to track estimated versus real time, and to give developers feedback that lets them make better estimates. Such a team will start off with bad estimates like everybody else, but they will be able to improve rapidly.
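    The suits-pick-a-date half of the game is simple arithmetic. A sketch, with invented stories, point estimates, and velocity (XP itself doesn't prescribe any of these numbers):

```python
# Planning-game sketch: stories in priority order with geek estimates;
# the suits pick a ship date and instantly see what's included.
# Stories, points and velocity are all made-up illustrations.
stories = [("login", 5), ("billing", 8), ("reports", 13), ("export", 3)]
velocity = 10  # points the team historically burns per week

def scope_by_date(weeks):
    budget, included = weeks * velocity, []
    for name, points in stories:       # walk in priority order
        if points <= budget:           # greedily take what still fits
            included.append(name)
            budget -= points
    return included

print(scope_by_date(2))  # 20 points buys login, billing and export
```

Change a story or the date and rerun: that instant what-if is the whole appeal of the game.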

  • by johnnyb (4816) <jonathan@bartlettpublishing.com> on Monday November 05, 2001 @02:18PM (#2523550) Homepage
    The truth is, you can somewhat accurately estimate project time. The problem is, few know how.

    The thing is, you must get entirely through the design stages first. The design stages should include every screen, every possible error message, sub-screen, or whatever else can pop up, plus an outline of how the program flow will go. This takes a lot of time, but not quite as much as it sounds.

    Once you have done the complete design, you can accurately make schedules. The problem is, most programmers put all error handling and messaging off as something that doesn't need to be designed. That's where the extra time comes in. If you know _exactly_ how the program flow is supposed to work, estimating time is easy. However, if you haven't finished the design stage, YOU DON'T KNOW WHAT YOU'RE PROGRAMMING, so, obviously, you can't estimate the time. So, with a _complete_ design, including all possible error conditions and actions to be taken, scheduling is not that hard.
  • by cr0sh (43134) on Monday November 05, 2001 @02:24PM (#2523601) Homepage
    "Suffering" from it right now, AAMOF...

    1. Programmer comes up with new system in spare time while learning a language. New system, if polished, would actually make a nice application to sell to current clients. Programmer is excited, and shows "product" to higher-ups.
    2. Higher-ups are excited, can see it may take a bit more work, and look into what it would take to get it to market. They tell sales and marketing to go see the programmer to have him demo it to them.
    3. Programmer is excited, shows it to sales and marketing. Sales and marketing love it.
    4. Months pass. Unbeknownst to the programmer, sales and marketing have sold it to a client, as part of the contract, to be a finished package by the end of the year - OR ELSE.
    5. More months pass - higher-ups finally tell programmer, and others, that this new system is wanted - and oh, BTW, it is wanted in Java - not in the VB it was shown in.
    6. Three months are left to complete the project. Original programmer knows little Java. Other Java coders know little Swing. Architecture of app is changed from a simple app to a three-tier client-server system. Only two other coders have sufficient Java experience to code on it. The lead of the project knows no Java, and only takes notes at meetings.
    7. Twenty-one days until deadline (ie, it has to be in QA in 21 days) - everyone sweating bullets knowing it can't be done. Oh, and BTW, at every meeting it seems like a new section not planned for is realized...

    It was an ad-hoc system, and it is progressing as an ad-hoc system - a system that should have NEVER been shown to marketing and sales. I am not the programmer who originated it, but suffice to say it is a system that will be nice for our clients once it is completed. Fortunately, it sounds like things will be able to be smoothed over if we miss the deadline...

    So remember, all you budding coders out there - if you create something in your "learning" time - don't show it to anyone BUT other coders. If marketing and sales come around, have them sign an NDA promising not to sell it or something - you don't want to release a product to market before it is done - quit "selling" vaporware!!!
  • by the_great_cornholio (83888) on Monday November 05, 2001 @02:40PM (#2523700)
    As silly as this paper is, most responses to it are off-topic. What he is trying to show is that there is a good case for saying there is no general, algorithmic way to estimate how long it will take to do a given software project. What he isn't saying is that you cannot make reasonable estimates on a given project.
  • by 1010011010 (53039) on Monday November 05, 2001 @04:30PM (#2524356) Homepage
    This reminds me of "The New Jersey Method versus the MIT method."

    The MIT Method is to take as long as needed to get a task done "right," regardless of cost and schedules.

    The New Jersey method calls for solving 80% of the problem, and putting off 20% until later.

    The MIT method results in more project failures than the New Jersey method. Microsoft epitomizes the New Jersey method, as does open source. Multics followed the MIT method, and was never actually finished, just killed off years later...

    If anyone has a reference for the "MIT vs NJ" in its original form, please post it.
  • by Anonymous Coward on Monday November 05, 2001 @06:47PM (#2525028)
    In my limited experience, the problem is that the "coding" is relatively easy and can be likened to engineering. The problem is that the team doing the coding normally also has to do the following:
    1. Setup the process and all the tools required to support it
    2. Interpret/Interview and write the specifications
    3. Fix old software the engineers worked on that broke
    4. Figure out why simple things don't work like the documentation says
    5. Try to figure out why the manufacturer of the tools left out essential features
    6. etc.
    These items are left out of the estimate, and they dominate the time.

    That's the real problem. At least in most smaller companies.

    Also one should note that a large software package is more like building/designing a new airplane than building a building.... just a thought....
  • by Anonymous Brave Guy (457657) on Monday November 05, 2001 @06:50PM (#2525045)
    With objective schedule estimates, projects should never run late. Are these failed software projects not using proper software engineering, or is there a deeper problem?"

    Yep, there's a deeper problem, and it's very simple. Suppose your manager asks you for an estimate, and you say "six months" because that's how long you think it will take. Your manager works out that the project will not succeed if it takes six months, and asks you if you can do it in four. If you say "Yes", you have just become a statistic.

    Saying yes does not mean that you can do it if you couldn't before; it just means that you have lied to management and prevented them from doing their job properly. If your project would take six months, but it will not make money if it takes six months, then you simply should not start that project. Failing to realise that simple fact is the major cause of late/failed projects, IME.

  • The article presents an interesting argument for why a completely new software project must have an arbitrarily large upper bound for time/quality estimates and can have no lower bound.

    But herein lies the rub -- exactly how many software systems are "completely new?"

    Damn few!!

    Unless you're merely doing maintenance on an existing program and know exactly what you need to change, what you are doing is new. Especially if you are trying to fix a problem with a software package that you are not familiar with.
    The average software project in an average industry will be primarily a repackaging of previously solved problems. The majority of integration tasks will be sufficiently similar to previous integration tasks as to be known.
    If that were the case, we would be able to make better estimates. It almost always is not the case.
    You will be left with a small number of "sub problems" which are unique and new. But now we have a situation where the caveats of the article are very important. Specifically, if we have decomposed the programming tasks to a sufficient degree, it should be the case that the estimation is tractable.
    Software development is an art form. You can hire someone to paint your house and he can tell you exactly what it will cost. But that presumes the house is already built and is an exact structure before he starts; that you don't rebuild the house while he is painting it; that you don't change the paint color in the middle of the job; that you don't ask him to remove the previous paint coat, etc. Otherwise it's akin to doing the Sistine Chapel without even an image to start with. An unlimited job results in an unlimited requirement. Until someone pulls the plug.
    Also, it should be noted, that the author assumes that a good estimate is one obtained through formal methods that is objectively defensible. However, in project management, a good estimate is defined as one that is believable and acceptable to all stakeholders in the process. The method for obtaining the estimate is not important.
    It is if you want it to be realistic. Usually the estimate is either totally unrealistic or it's manufactured from whole cloth.
    Moreover, good project management will include some significant up-front analysis. One common technique (at least common to companies with good PM'ing track records) is to run "monte-carlo" simulations of project work with large variances in schedule-v-actual work. With a run of a few thousand simulations, those processes that are most important to the time and budget performance of the project can be identified.
    This is ridiculous. If management knew what it was doing we wouldn't have so many businesses run themselves into the ground and the dot com bubble would never have happened in the first place.
    These "key" work packages are often non-obvious without this type of simulation work. However, with a good work breakdown structure and a good simulator, it is possible to generate a reasonably accurate picture of project performance based on what is not known.
    Asking for estimates on the development of art work is ridiculous unless you have fixed guidelines and an exact idea of what you want, something which is usually lacking.
    This means that in the "real world" of business, the article's claim is irrelevant!!
    If it's irrelevant, why is it in the "real world" more than 3/4 of all projects run over time and over budget and something near 1/2 end up being cancelled?
    We don't NEED objectively defined and defensible estimates. Instead we need estimates that the project stakeholders (which includes the people doing the work) can agree to.
    You can get people to agree to anything. The question is whether the estimates are anything close to accurate. In most cases, they are not.
    We don't NEED our estimates to be generated by formal methodologies. Subjective estimates backed up by years of experience are just as good, and often better, from a planning perspective.
    True. But the problem is, most places don't know enough about what they are doing or how it is defined to be able to give any kind of reasonable estimate. If you don't measure what's going on, and you do everything in an ad-hoc style, you will get estimates that are essentially about as valid as rolling dice to get an answer. And maybe less valid than that.
    This whole article strikes me as another programmer trying to show how dumb the business people are.
    It is not that business people are dumb, it is that we are failing to make adequate estimates and standing up for them as based upon what we know to be correct. But again, since the measurements of what is being done are often missing, the estimates are usually nothing better than seat-of-the-pants guesses, and wildly wrong.
    Hey folks, good business people KNOW that estimating is hard and that it isn't objective. But just because something isn't objective doesn't mean it can't be done well. It is possible to build models that compensate for unknowns if you can do enough decomposing of the problem to limit the unknowns to a small, well-defined, manageable few.
    If that were the case, why is it commonplace for managers to demand increases in functionality and cuts in the schedule? Because those who hear the estimates think they are overly padded (and therefore should be cut), and those who make the estimates don't have the means to show where they get the numbers from (and therefore can't show why their estimate is even close to correct, when it probably wasn't anyway).
    So, in the view of this PM, this is all just academic and has no bearing on the real world.
    Believe that if you will; the way things are really happening in the world prove otherwise.

    Paul Robinson <Postmaster@paul.washington.dc.us [mailto]>
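
    The "monte-carlo" simulation approach mentioned in the thread above can be sketched quite simply. This is a minimal illustration under assumed inputs: each work package gets a hypothetical (optimistic, likely, pessimistic) duration in days, and we draw from a triangular distribution - real tools use richer distributions and task dependencies:

    ```python
    import random

    def simulate_schedule(tasks, n_runs=5000, seed=42):
        """Monte Carlo schedule simulation: each task's duration is drawn
        from a triangular distribution; returns simulated total durations."""
        rng = random.Random(seed)
        totals = []
        for _ in range(n_runs):
            # random.triangular takes (low, high, mode)
            totals.append(sum(rng.triangular(lo, hi, mode)
                              for (lo, mode, hi) in tasks))
        return totals

    def percentile(samples, p):
        """p-th percentile of the simulated totals."""
        s = sorted(samples)
        return s[min(len(s) - 1, int(p / 100 * len(s)))]

    # Hypothetical work breakdown: (optimistic, likely, pessimistic) days.
    tasks = [(5, 8, 20), (3, 5, 10), (10, 15, 40)]
    totals = simulate_schedule(tasks)
    # The 50th percentile is a "fair" estimate; the gap to the 90th
    # percentile is a direct picture of schedule risk.
    p50, p90 = percentile(totals, 50), percentile(totals, 90)
    ```

    Note how the pessimistic tails dominate: the 90th-percentile finish date sits well above the sum of the "likely" figures, which is exactly the padding argument the estimators in this thread keep losing.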

  • by Kris_J (10111) on Monday November 05, 2001 @07:34PM (#2525219) Journal
    The best phrasing is: the project can be on time, on budget, or right; pick two.

    It all comes down to experience with similar things. Like any other project, if a software project is very like something you've done hundreds of times before, you'll know pretty well how long it will take. If it's unlike anything you've done before, there isn't even much point in guessing.

    Thing is, in the real world development happens once and then the "project" is duplication (ie; For a "So-and-So Homes" place - Design house once. Build house hundreds of times), but with software duplication is instant - just copy - the project is the original design. (ie; Design software once. Burn 10,000 CDs)

    The fact that many companies design their own software, even when they're not software design companies, is the problem. If you were a real-estate place you wouldn't build your own cars, or photocopiers, so why do you design your own software? Moreover, why are you surprised when it takes longer than you estimated?

  • by Baldrson (78598) on Tuesday November 06, 2001 @01:59AM (#2526398) Homepage Journal
    Of the question "Can software schedules be accurate?" I can only say, it depends on how much new stuff has to get done.

    To take a reductio ad absurdum:

    You are given the task of duplicating the functionality of Windows NT. Furthermore, you are given the source code for Windows NT in a .tgz file and the associated development environment within which that source code can be tested. The question now degenerates into "How long does it take me to copy the tgz file?" That can be accurately predicted by measuring how long it takes to copy files on that environment in general, and the estimated schedule can be predicted to absurdly high degrees of accuracy with enough benchmarks of the system's file copying performance.

    Here's another reduced complexity angle:

    Translate a program written in Visual Basic into C++ (readably).

    You actually can sit down and convert a sampling of the program and get a measure of how long it will take you to do the whole thing -- the more you sample, the more accurate the measure right up to the point where you have converted the whole thing.
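
    The sampling idea above can be made concrete with a little arithmetic. This is a minimal sketch with made-up numbers: measure the hours spent converting a few sampled modules, compute a conversion rate per line, and extrapolate to the whole program, with a rough error band from the spread of the sampled rates:

    ```python
    import math

    def extrapolate_effort(sample_hours, sample_units, total_units):
        """Estimate total effort by extrapolating from a measured sample.
        sample_hours: hours spent converting each sampled module.
        sample_units: size (e.g. lines) of each sampled module.
        total_units:  size of the whole program.
        Returns (estimate, rough one-sigma band), both in hours."""
        rates = [h / u for h, u in zip(sample_hours, sample_units)]
        mean = sum(rates) / len(rates)
        var = sum((r - mean) ** 2 for r in rates) / (len(rates) - 1)
        stderr = math.sqrt(var) / math.sqrt(len(rates))  # std error of mean rate
        return mean * total_units, stderr * total_units

    # Hypothetical sample: three converted modules out of a 20,000-line program.
    est, band = extrapolate_effort([4.0, 6.0, 5.0], [400, 500, 450], 20000)
    ```

    As the comment says, the more modules you sample, the tighter the band gets - right up to the degenerate point where the "sample" is the whole conversion.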

    Here's another example with a bit less reduction in complexity:

    You are given a working program but no source code, and some expert users of that program. Here we are getting into what might be thought of as "function point analysis," but really, it is much easier and more accurate than that: since the program exists and works as it is "supposed" to work, you can bang away on it, and the expert users can bang away on your version of it to ensure it meets their needs -- perhaps discovering that some of the features in the old program were not really used, thereby simplifying the task.

    Each step has been away from the "absurd" position of simply copying a program which was, in a sense, a "spec" for itself.

    At the other extreme, we get to the problem of "write a program that will make me as rich as Bill Gates". Note that this specification is not very specific.... it is very far from being source code for a program you can simply copy, isn't it? Guess what that says about the accuracy of the schedule?

    So a lot of this hubbub about estimating software schedules is really hubbub about the nature of the program specification process.

  • Frank Lloyd Wright (Score:2, Insightful)

    by shovelface (466145) on Tuesday November 06, 2001 @03:45AM (#2526591) Homepage
    Frank Lloyd Wright's buildings were often new ideas in theory and construction, much like the "unknown" part of estimating a software or web project today. His materials were often strange (or at least had traditional material joining with more exotic material) and the structures were oddly shaped.
    This is why Frank Lloyd Wright's buildings were often way behind schedule and way over-budget. He was a great architect and a wonderful designer, and I'm sure most of the engineers and builders were talented as well... but when you are dealing with brand new ideas, there is a certain amount of trial and error necessary. Unfortunately he also didn't build that trial and error into his estimates.
    Also unfortunate is that many of his buildings have leaky roofs.

    The way that guy Joel does project management is the way I've been doing it for quite awhile, but he does say it so nicely:
    http://www.joelonsoftware.com/stories/storyReader$31

    If you are going to compare building a bridge or a house with building software, choose the right bridge or house to compare with. Most software projects are not cookie-cutter suburban homes where everyone knows exactly what they're gonna be like and how to make them. Most of the time it's more like a Frank Lloyd Wright or IM Pei house.... We know the physics and tools of building a house. But we usually want to make them more useful, more livable, and more beautiful. That last part takes more time.

    -trout
  • by DoctorNathaniel (459436) <nathaniel.tagg@gmai[ ]om ['l.c' in gap]> on Tuesday November 06, 2001 @09:28AM (#2527044) Homepage
    Physics allows projectable timelines? Think again. I'm currently employed on a fairly major project (http://www-numi.fnal.gov:8875) that, when I joined, had a completion date of 2003. Now it's 2005 and counting.

    Software and physics have certain similarities (not least of which being that physics requires software development). The essential point is that you don't know how long it will take to do something that you haven't done yet. If you HAVE done it, then you don't need to do it again; all software design (or experimental physics experimentation) is essentially a research endeavour, although the research results aren't necessarily of interest in themselves.
  • by xdangavinx (534619) on Tuesday November 06, 2001 @04:22PM (#2529356) Homepage
    From my experience in the development department at my place of work, no matter how wacky some of the deadlines given by "project managers" for fairly significant pieces of code or patches are, they are often met - and more times than we'd like to admit, they're met at the last second.

    However, since they're met at the last second, the code that is written often suffers. From there the QA department will often find something wrong with the poorly written code and send it back to the development department, which then has to spend more time fixing the new errors that the sloppy code created. So although the "project manager's" deadline was met, the end client often is delayed by the additional things that were discovered.

  • by noisebrain (46937) on Sunday November 11, 2001 @08:57PM (#2552170)

    It looks like several people (well, more than several) posted responses without reading beyond the lead-in. If you're one of them, yes, the argument here is in the general ballpark of "software estimation is hard or impossible", but it actually says something more specific than that.

    The article does NOT say the following:

    1. software estimation is impossible
    2. objective software estimation is impossible therefore software estimation is impossible

    The article DOES say

    • Various software engineering authorities claim that objective software estimation is possible [paragraph 3, quotes on first page].
    • objective software estimation is in fact not possible [body of article]

    From this, it does NOT conclude either of the points 1,2 above. Instead, it concludes:

    • Software construction is inherently creative and subjective, having more in common with physics than manufacturing; software estimation is inherently subjective [conclusion, Bollinger quote].
    • Because software is used in the government, in vehicles, and other places where it can potentially have a negative effect on people's lives, we (software writers) have an ethical responsibility to not over-represent our ability to estimate (especially when it comes to estimation of software quality - re: the correctness claim in the supplementary material).

    Now some of the response posts, paraphrased:

    • "The article says that estimation must be objective rather than subjective"
      No, it does not say this.

    • "The article says that subjective software estimation is not useful"
      It also does not say this.

    • "The article says that we are looking for exact answers, not estimates" or "the article doesn't understand what `estimate' means"
      No, the article distinguishes subjective and objective estimates, and specifically discusses the case of an objective estimate with bounds in detail.

    • "People/organizations can make accurate estimates, I made one last week" or "Estimation is hard, I double my estimates and still miss them".
      Ok, but slightly off topic: the article is specifically talking about those who claim objective estimates.

    • "You can do objective estimation, and I did it last week using COCOMO"
      And where did you get an objective estimate of the complexity of a new project? Read the article...

    • "I think I'm the only person who has read this far".
      Yes, you are. Your boss is monitoring you, get back to work.

    • "Software estimation needs common sense, not advanced mathematics."
      Certainly. The 'manufacturing' camp of software estimators (Humphrey quote in the supplementary material [idiom.com]) say or hint that software construction can be made into a repeatable, fairly boring process where projects are always on time and programmers are like factory workers. This may or may not be true (I don't think it is), but regardless: to make this view seem more science than philosophy some of these people have fallen into the trap of cloaking their estimating process with formal notation and claiming or hinting objectivity. This part is wrong.

      On the contrary, [conclusions to the article and the supplementary material]:

      Good estimation therefore requires experience and judgment. The software industry should value human experience, intuition, and wisdom rather than claiming false objectivity and promoting entirely impersonal "processes".

Prototype designs always work. -- Don Vonada

Working...