Test-Driven Development by Example

PinglePongle writes "Kent Beck is well known as one of the main drivers behind eXtreme Programming -- a style of development which favours a very disciplined but low-formality approach to coding. Writing applications 'test-first' is one of the practices of XP, and this book explores the subject of test-driven development in detail." Read on for the complete review.
Test-Driven Development by Example
author Kent Beck
pages 220
publisher Addison Wesley
rating Superb
reviewer PinglePongle
ISBN 0321146530
summary Kent Beck -- author of the original Extreme Programming book -- explains in detail how to turn your development world upside down by starting with the test, then writing the code.

What it's all about:

Test-driven development is about being able to take tiny little steps forward, test that the step took you in the right direction, and repeat. The "TDD Mantra" is red/green/refactor:

  • Red: write a test which will exercise a feature, but which will fail (because you haven't yet written the code)
  • Green: make the test succeed, doing whatever you need to do to get to "green" as quickly as possible -- don't worry about prettiness
  • Refactor: now that you have code which passes the test, eliminate all the duplication
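The cycle above can be sketched in a few lines. This is an illustrative example using plain Python asserts rather than the book's JUnit-based code; the `Counter` class is invented for the sketch, not taken from the book:

```python
# One turn of the red/green/refactor cycle, sketched in plain Python.
# The Counter class and its test are illustrative, not from the book.

# Red: write the test first. Running it at this point fails,
# because Counter does not exist yet.
def test_increment_adds_one():
    c = Counter()
    c.increment()
    assert c.value == 1

# Green: write the simplest code that makes the test pass -- no prettiness.
class Counter:
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

# Refactor: with the test passing, remove duplication and clean up,
# re-running the test after every change to stay at "green".
test_increment_adds_one()
```

The point of the discipline is that the test, not the programmer's optimism, decides when each tiny step is done.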

The book then shows two fairly detailed examples of a development project (or snippet of a project) progressing in this style of coding. The first example deals with adding multi-currency capability to an existing project. In the space of 17 chapters, the author walks you through the creation of six classes (one test class, five functional classes), complete with the thought processes behind them. The code is written in Java, and is trivially easy to follow because it is introduced in tiny chunks; most chapters are fewer than six pages long.
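To give a flavor of that first example, here is a rough transliteration of the spirit of the book's opening test (five dollars times two should be ten dollars) from the book's Java into Python. The names approximate the book's code; the details are simplified:

```python
# A Python sketch in the spirit of the book's opening multi-currency test:
# $5 * 2 == $10. Class and method names approximate the book's Java code.
class Dollar:
    def __init__(self, amount):
        self.amount = amount

    def times(self, multiplier):
        # Return a new Dollar rather than mutating, keeping values immutable.
        return Dollar(self.amount * multiplier)

    def __eq__(self, other):
        return isinstance(other, Dollar) and self.amount == other.amount

# The test that drives the code above into existence:
def test_multiplication():
    five = Dollar(5)
    assert five.times(2) == Dollar(10)
    assert five.times(3) == Dollar(15)

test_multiplication()
```

In the book, even a class this small is grown over several chapters, one failing test at a time.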

The second example is the creation of a unit-testing framework in Python. It is significantly more complex and more real-world than the first example, but again proceeds in very small steps, and in small chapters.
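The core trick of that second example is a framework small enough to test itself. The sketch below is heavily abbreviated and is not the book's actual code, though the class names echo its Python example:

```python
# A minimal self-testing unit-test framework, in the spirit of the book's
# second example. Heavily abbreviated; not the book's actual code.
class TestCase:
    def __init__(self, name):
        self.name = name  # name of the test method to run

    def set_up(self):
        pass  # hook for per-test fixture setup, overridden by subclasses

    def run(self):
        self.set_up()
        method = getattr(self, self.name)  # look up the test method by name
        method()

class WasRun(TestCase):
    """A test case whose only job is to record whether it was run."""
    def set_up(self):
        self.was_run = False

    def test_method(self):
        self.was_run = True

# The framework is bootstrapped by testing it with itself:
test = WasRun("test_method")
test.run()
assert test.was_run
```

The circularity is deliberate: each new framework feature is demanded by a test the framework itself runs.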

The final part of the book contains patterns for test-driven development -- practical real-world advice on how to do this stuff for real. Nearly all the "patterns" are phrased as question/answer pairs, and they range from deeply technical design patterns to advice on the best way to arrange the furniture.

What's good about the book?

Kent Beck is a very good writer -- his writing is clear, and he is not afraid to leave out material he assumes you can work out for yourself; when he does go into detail, you feel it is necessary to the big picture rather than mere geek bravado. Even if you don't adopt Test-Driven Development, many of the ideas are well worth considering for your day-to-day coding situations.

What could have been better?

The book stresses the importance of taking 'little steps,' and sometimes you feel impatient to move to more challenging tests before properly finishing the current chapter. I was also hoping for more of a discussion on the practicalities of unit testing database-driven systems, where you frequently have to test business entities which are closely coupled to the database.

Summary

If you code for a living, or manage people who do, you should read this book -- it's a quick enough read -- and consider some of the assertions it makes. If you feel you're introducing more bugs than you expected, or you feel uneasy about how closely your work matches the requirements, this book gives you some powerful ideas.


You can purchase Test-Driven Development by Example from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.
Comments Filter:
  • Although it sucks to lose 20% of your code when you drive it off the lot.
  • hopefully... (Score:2, Redundant)

    by EnderWiggnz ( 39214 )
    hopefully, this will last as long and have as big an impact as extreme programming did.

    for the sarcasm impaired : this is sarcasm.
  • Once my manager came in with the buzzword eXtreme Programming. I read a couple of articles and tried it, but what a disaster. I just can't stand other people nosing around in my half-completed code, the extreme attention to testing, and the team-spirit you ought to have. Needless to say, the experiment failed and things went back to normal :-)
    • by Peyna ( 14792 )
      XP has worked in some small scale applications pretty well. It isn't for every project or every person, but in some cases it can be very effective, and in others (as you stated) it can be a disaster.

      I think it's great that there are new ideas for ways to develop software, (although XP is actually fairly 'old' in computer science terms). There is no magic formula for producing quality software in a reasonable amount of time yet, but XP is another step in the right direction.

      Note: While XP's entire methodology wasn't written down until about 1996, many of the ideas that it uses had been in circulation since the 70s.
    • by Anonymous Coward
      Lucky! Had the buzzword bingo won out, they'd replace you with 12 hard-working yet inexpensive immigrants. Of course, the 12 immigrants wouldn't produce any results faster or better, but they sure look busy! And the respect the boss gets! Those immigrants sure know how to run around and bow to the boss! Run! Run! Or I'll call the INS!
    • by Lumpish Scholar ( 17107 ) on Tuesday January 28, 2003 @11:22AM (#5174902) Homepage Journal
      I just can't stand other people nosing around in my half-completed code ...
      Then you ought to (1) try the other XP practices before you try pair programming, and (2) learn how pair programming really works!
      ... the extreme attention to testing ...
      You don't think testing is important???
      ... and the team-spirit you ought to have ...
      Check out the work of James O. Coplien. He's an extremely hard-core C++ guy, but when he was doing research at Bell Labs, he discovered that organizational effectiveness was far more important for software development productivity than any technological advance.

      I once worked at a start-up where someone started on Monday, and never came back after Wednesday night, leaving a voice mail message that said, "You never told me I was going to have to work with other people!"

      You're going to have to work with other people. The better you work with them, the better you work, the better everyone works. (Hugs not required.)
      • by October_30th ( 531777 ) on Tuesday January 28, 2003 @11:58AM (#5175209) Homepage Journal
        You're going to have to work with other people. The better you work with them, the better you work, the better everyone works. (Hugs not required.)

        I couldn't agree more.

        I can't think of many jobs - at least for college/university educated people - that do not require soft skills like ability to work with your coworkers and communication (meetings, presentations, acting as a tour guide for VIPs, etc.).

        I used to hate having to deal with other people in any way. In fact, that was one of the reasons why I decided to embark on a research scientist's (physics) career. As a scientist I wouldn't have to deal with people, give talks, socialize or -- most important of all -- end up in any kind of manager/boss line of work. I would just sit, think and write papers. That's what I thought, and boy, was I wrong.

        Most science, and experimental physics in particular, is done in groups. There's no way around it. You can't run a lab alone, so you need to have people around you. Even as a postdoc you have to be able to hire good PhD candidates and supervise them. You have to be able to interview them, understand what makes each of them tick in the workplace, and know how to manage them to get the best out of them.

        Then, unless your lab is established and extremely well funded, staffed and equipped, you often need collaboration from other groups. Making contacts with other scientists and establishing mutually beneficial collaboration requires publicity (talks!), diplomacy (socializing!) and patience.

        A person who cannot work with other people simply does not fit into this environment. No matter how brilliant a scientist he is, without social skills he is very likely to amount to nothing.

        I still get slightly nervous when I have to give a talk. I still don't like meetings. However, I have grown fond of managing people, because, as difficult as it sometimes can be, it's a wonderful feeling to see highly motivated and skilled people working with you in harmony. The older I get, the more I appreciate social skills over raw intelligence or mathematical/logical ability. If it all comes in one package, jolly good. However, if I had to choose between a budding physics genius with a highly abrasive personality and a merely solid performer with good social skills, I'd choose the latter. No question about it.

      • by jstoner ( 85407 )
        Check out the work of James O. Coplien. He's an extremely hard-core C++ guy, but when he was doing research at Bell Labs, he discovered that organizational effectiveness was far more important for software development productivity than any technological advance.

        This is exactly the point the XP people are missing. All the practices they prescribe require a very sound organizational culture. Unusually sound. If you already have that in place, then XP might enhance your organization's performance. But if you already have that in place, you've already solved a long list of much harder problems.

        And don't get me started on the requirements gathering end of the process: in full XP, you have to collaborate tightly with your business stakeholder folks. You have a problem with the other developers in your own department? Wait till you have to deal face to face with the idiot from marketing or the arrogant ass from finance. A lot.
  • Good (Score:3, Insightful)

    by giel ( 554962 ) on Tuesday January 28, 2003 @10:47AM (#5174623) Journal

    I think we must get rid of the urge to design and plan every little thing up front, only to find out that stuff doesn't work, end up running out of time, and in the end have no one willing to pay for all the wasted effort.

    Although many people don't believe in XP, it is a way to organize development so that you actually get deliverables. Maybe it does not improve speed, but it does improve quality and reduce risk.

    So any book which is able to explain the pros of XP and open the eyes of non-believers is a good thing.

    • XM (Score:1, Insightful)

      eXtreme Medicine.

      You're feeling ill, so you pop along to the doctor.
      'Hey Doc, I have a problem -- can you fix it?'
      The Doc has an idea what's wrong -- could be complicated -- but:
      throws a load of drugs at you,
      does some tests; you're a bit better, but you have some nasty side effects,
      changes some of the drugs, and gives you a few more......

      In the end, you're sort of all right, but you're a drug addict, and occasionally have to go back and get a different fix from the Doc.

      Not-so eXtreme Medicine.
      You're feeling ill, so you pop along to the doctor.
      'Hey Doc, I have a problem -- can you fix it?'
      The Doc has an idea what's wrong -- could be complicated.

      'I'll put you on some light medication and get you an appointment with a consultant.'

      The consultant comes along, has a word, is still not sure, advises the Doc on some better medication, and gets in a specialist for one of your conditions.

      The Doc treats most of your conditions, and some have already cleared up.
      The specialist takes a look at you and gives the Doc some more advice and training.
      After a few months you're in perfect health.
      • Re:XM (Score:3, Insightful)

        by giel ( 554962 )
        IMHO you've mixed things up, sorry.

        eXtreme Medicine -> not so eXtreme Programming:

        (in general use a lot of blahblah and don't show what you're doing)
        make a big picture of everything that's wrong
        try to fix everything at once (dripping nose, broken legs, breast enlargements, etc.)
        determining whether things have gone well is left as the patient's problem (hey, my...?? oh, cute, well, let's make 'em even bigger)
        leave your patient with a lot of (new) problems


        not so eXtreme Medicine -> eXtreme Programming:

        (in general, use a little blahblah and show what you're doing)
        focus on the most important issues (broken legs)
        make sure you can see when it's fixed
        fix the most important issue, then continue with the next important thing (dripping nose)
        discourage breast enlargement


        Yes, it means that you need very good developers for XP. These people must be able to design well, keeping the big picture in mind, and they must be able to judge quality.
        Do they exist? Yes...
        Many? Enough, but there are an awful lot of very crappy developers out there.
  • by thac0 ( 644918 ) on Tuesday January 28, 2003 @10:47AM (#5174624)
    The reason some XP projects are successful is that they actually have testing as part of the game plan. Having been in the industry for better than a decade and pounding code for 20 years, I find the state of testing in corporate America *shocking*. Just atrocious.

    There are many labs that don't test at all, and the vast majority test poorly. I've worked in some Fortune 500 labs that didn't test at all. Scary stuff. Nothing life-threatening, but all of a sudden I wasn't so convinced that the reason my account was misconfigured was that *I* gave wrong data. Simply bug-riddled. Those that do test often do so manually. Forgetting for a moment that humans are likely to take shortcuts and not bother to execute tests they perceive to be out of the scope of their recent change, they are fallible. Of course they are -- that's how the bugs got there in the first place.

    So, the XP folks have the testing thing down. They test before the code is written, and their tests are automated.

    Then they take leave of their senses. The claim that because they've successfully turned one idea on its head (i.e. testing *first*) they can turn others is ludicrous. Design first is still valuable, guys. I've eliminated thousands of bugs simply by having the right design to begin with. Waiting until you've cobbled something together that passes the test and then hoping that your boss will allow you to refactor is a loser. If it weren't, Scott Adams wouldn't be a millionaire.

    So, write your tests first. But do your design before you code, not after you've put together a thousand lines of crap.
    • Do they even consult?

      Seriously, the percentage of projects I've seen with features that don't work 'because that's what the client asked for,' or that are just poorly put together, is amazing.

      You can't test if you don't have a design to test against. You can't design if you haven't consulted the requirements.
    • by Lumpish Scholar ( 17107 ) on Tuesday January 28, 2003 @11:09AM (#5174807) Homepage Journal
      ... some XP projects are successful is because they actually have testing as part of the game plan.... the XP folks have the testing thing down.
      The best thing about XP is that, with the possible exception of test-first (a.k.a. test driven) development, none of the practices are new and original. I mean that in the best possible way! All the practices are tried and true, based on experience long before Kent Beck was using the word "extreme" this way.
      Then they take leave of their senses.... Design first is still valuable ...
      "Design" in the pure waterfall sense -- do 100% of the requirements before doing any of the design, do 100% of the design before doing any of the coding -- doesn't scale up to large projects or rapid development. It's important to use an iterative approach: do a little analysis, do a little design, do a little coding, do a little testing, repeat until done.

      XP breaks the design/coding/testing cycle into very small iterations, each one as big as one automated unit test case. It's a very exploratory style of software development. XP doesn't mandate any high-level design artifacts (though it doesn't forbid them either).

      What none of the XP books say is that developing unit tests is a design activity, and the unit tests are design artifacts! Unit tests outline the responsibilities of classes, in the original responsibility-based style of object-oriented design.

      XP programmers do design on whiteboards, and in their heads. Some of these artifacts are lost. Some would have become obsolete in a hurry. (The unit tests are guaranteed not to be obsolete, at least as long as they're passing.)

      I'll take that, any day, over a hundred pages of out-of-date UML diagrams.
    • In a way you are right: don't throw together a thousand or ten thousand lines of test code for testing's sake.

      You missed one thing though: Test-first is design. The tests don't just ensure that the thing works as written, they define how the thing is intended to work, and in order to write a test that exercises that aspect, you have to think about what problem the code is supposed to solve. This results in the test being the thing that validates the features of the code under test, and by definition, the features of the code are its design.
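To illustrate the "test-first is design" point with a hypothetical example (the Stack class here is invented, not from the thread or the book): a test written before any implementation exists already specifies the design of the thing under test -- here, that a stack is LIFO and knows when it is empty.

```python
# A test as a design artifact: written before Stack exists, it pins down
# the class's responsibilities (LIFO order, emptiness). Hypothetical example.
def test_stack_is_lifo():
    s = Stack()
    s.push(1)
    s.push(2)
    assert s.pop() == 2   # last in, first out
    assert s.pop() == 1
    assert s.is_empty()

# The minimal implementation the test then drives out:
class Stack:
    def __init__(self):
        self._items = []

    def push(self, item):
        self._items.append(item)

    def pop(self):
        return self._items.pop()

    def is_empty(self):
        return not self._items

test_stack_is_lifo()
```

The test doubles as an executable statement of the interface: anyone reading it knows what `Stack` must do before reading a line of the implementation.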
    • So, write your tests first. But do your design before you code, not after you've put together a thousand lines of crap.

      The name of the practice is either Test Driven Development or (more commonly) Test Driven Design. I think it's a bad name, because it leads to the sort of confusion you have.

      The tests they're talking about ARE design, so by writing them before you write code, you are, in fact, designing before coding. The difference is twofold: first, instead of stopping the design at an arbitrary level of detail and going to some other part of the program, you continue the detailed design until it produces code for that part of the program; second, instead of writing your design in an ignorable form, you write it in a simple, immediately testable form.

      Because of the first difference, you get to implement your design while it's still fresh in your head; because of the second, your implementation will be forced to match your design.

      I've worked with TDD for about a year now. It's a real winner -- mainly because it makes testing a creative, constructive activity (part of design) instead of destructive gruntwork (intended to tear down parts of your program).

      Waiting until you've cobbled something together that passes the test ... is a loser.

      Waiting? That's exactly the point -- you DON'T wait. You implement the solution needed to pass the test (i.e. the design) immediately, as soon as you've managed to express your design in a testable form. You get to see immediately whether or not your design is workable, before you build some other aspect of your design on it.

      ...hoping that your boss will allow you to refactor is a loser.

      I've worked TDD both with and without refactoring. It works either way -- without refactoring I have to take a few more risks, code a little slower, and accept slightly lower quality code overall (i.e. it doesn't fit together quite as well); refactoring doesn't slow me down at all. The nice thing, though, is that it's not under the control of my boss, any more than which shift key I use is under his control. I don't send a unit in to Configuration Management until it's ready to roll; so within that unit, I refactor at will. (We're a CMM level 3 organization.)

      Amusingly enough, the only criticism I get from my boss is that I _may_ be testing too much. Literally. He's not sure, but he's a bit nervous that I might be testing outside of the needed range -- for example, testing for negative numbers when the only allowable input is positive (he hasn't read my tests, and I don't think he can even imagine tests being something he'd want to read). You should see his tests -- but that's a different story.

      -Billy
      • I don't have a lot of experience with XP and TDD, but from what I have seen, only a little of it impresses me. I like the idea of writing the tests as you go along, but some of the "small" steps I've seen in examples are stupid -- for example, writing the test but not completely filling in the framework for the code of the object being tested.
        In one example I saw, the person wrote the test and the framework for the function, but didn't write the skeletal framework of the class which contained the function used in the test. They also didn't put in any of the includes/imports that were needed. Instead, they compiled it and added them after the compiler told them these things didn't exist -- kind of stupid when you knew they were needed before you compiled.
        To me that's too small a step. I can see the benefit in taking small steps, but not in taking microsteps. Also, I can see using the compiler errors as a sort of to-do list, but I can't see waiting to put in something like a framework and/or includes when you know they are required. That seems excessive and a waste of time.
        In some ways this is just a rehash of the methodology of structured programming. The way I was taught to break things down was to write the higher-level code first, which meant writing empty functions (or functions that return a constant). After that was done, you'd go back and fill in the code in increments. This just seems to be a rehash of that idea -- which seemed to get lost when the OO craze started.
        • I like the idea of writing the tests as you go along, but some of the "small" steps I've seen in examples are stupid. For example, writing the test but not completely filling in the frame work for the code of the object being tested.

          Yes, that's an example of an example trying to make a point: it's very easy to run the tests, so you'll often run the tests before the code's ready to compile. In real life, this happens a lot, no matter how silly it looks in a book: it's way more expensive to put in too much functionality than it is to put in too little, since if you put in too much your tests won't complain.

          BUT, you don't have to do that. The #2 rule in the Pragmatic Programmers book is "THINK!" Good advice -- don't turn your brain off.

          In some ways this is just a rehash of the methodology of structured programming. The way I was taught to break things down was to write the higher-level code first, which meant writing empty functions (or functions that return a constant). After that was done, you'd go back and fill in the code in increments. This just seems to be a rehash of that idea -- which seemed to get lost when the OO craze started.

          I see what you mean, but although TDD provides some wonderful support for top-down programming, it neither is top-down programming nor requires it. It's just as easy to start in the middle and design outwards.

          -Billy
    • So, write your tests first. But do your design before you code, not after you've put together a thousand lines of crap.

      If you've written 1000 lines of code before you refactor, you're waiting way too long. I do little refactorings every few lines, and bigger ones whenever things don't look right.

      Look at it this way: you can do your design before you code, while you're coding, or afterwards. The waterfall approach advocates a split of about 80:20:0. But that assumes you learn nothing and think of nothing while coding. That's stupid; instead, what people actually do is grumble and imagine what they'll do in the next version.

      Now that I'm doing test-driven development, my ratio is about 20:30:50. I still do some design up front, and that's very valuable. But instead of having raging arguments about which design might work best, we start building and find out, refactoring towards the best design as we go.

      Waiting until you've cobbled something together that passes the test and then hoping that your boss will allow you to refactor is a loser.

      If a boss is such a micromanager that he's telling you which lines of code to work on every 15 minutes, then it's time for a new job. The people I work for recognize that I'm a professional and trust me to tend to my business, especially given how willing I am to explain it to them.

      When bosses ask about refactoring, I explain that we do it so we can go faster. If they think it's a net loss, then I am always glad to prove them wrong: if they want to measure productivity each month, we can see which way works better.

      Think of it this way: not refactoring is like not exercising. You get slower and slower, everything gets creaky and unreliable, and you die sooner because your arteries are clogged. A little regular exercise keeps you healthy, limber, and active.
    • You wrote:

      > I've eliminated thousands of bugs simply by having the right design to begin with.

      Have you? How do you know that?

      Yours,

      Tom
      http://pmd.sf.net/
    • Here's the thing in a nutshell as I see it. Up-front design is not outlawed by XP, but the XP theory is that you can go faster if you skip it. More importantly, if you have a big up-front design you are more likely to stick to it, even when you realize that real life is different from the assumptions you based your design on. You will then start living with adaptations and local fixes, and your code starts growing these tumor-like workarounds that will eventually kill your ability to maintain it.

      The really important thing in XP is that there is zero tolerance for bad code and incomprehensible design. Once it is realized that something could be made better another way it is changed to be that way. The automated tests greatly facilitate this.
    • Yes, do your design before your code, but you will _still_ need to refactor, partly because requirements change, but also because after doing some implementation you may realize the original design was not quite right.

      That much I think is not contentious. Very few people (even those with experience) can pick the ideal design ahead of doing any implementation or predict what the changes in requirements will be.

      More controversial, perhaps, is the XP idea that your initial design should not be any more general than it needs to be to implement the functional requirements of your first code drop. There is some merit in this, since the requirements _will_ change and unused generality is often a waste of coding effort (not to mention creating extra complexity which may not be tested enough), but still I feel you have to use common sense and often design for extensibility at the start, even if you are not 100% certain the extra flexibility will be needed. You might be 50% certain and that is often enough.

      But I do feel that the XP approach fits my personality. If there is no bus approaching the bus stop, I would rather walk to the next stop than wait for one to come along: at least I am making some small progress, and this journey strategy minimizes risk, even if the mean journey time would be shorter by waiting at the stop.
  • by netsavior ( 627338 )
    it would be nice to have 2 developers for every problem... but that is never going to happen.
    • it would be nice to have 2 developers for every problem
      It would be even nicer if two developers working together were more than twice as productive as either of them working alone ... but that's pair programming, a different XP practice, and a whole different book (Amazon.com [amazon.com], BN.com [barnesandnoble.com]).
  • by Some Bitch ( 645438 ) on Tuesday January 28, 2003 @10:48AM (#5174633)
    ...been doing this for years anyway? My code development cycle has always been...
    1. Write code that uses the function/procedure.
    2. Write the function/procedure.
    3. If function/procedure==fucked, return to 2 and unfuck.
    4. Once it works, tidy it up.
    Now it appears someone else has added steps 5 and 6...
    5. Write a book about it.
    6. Profit!!!
  • by QEDog ( 610238 ) on Tuesday January 28, 2003 @10:48AM (#5174634)

    What would Dilbert do?

    Here [dilbert.com]and here [dilbert.com]!

    I love buzzwords with X anyway...
  • Buy PEAA, too (Score:3, Informative)

    by Amsterdam Vallon ( 639622 ) <amsterdamvallon2003@yahoo.com> on Tuesday January 28, 2003 @10:49AM (#5174640) Homepage
    You can get Martin Fowler's "Patterns of Enterprise Application Architecture" in a bundle bargain at Amazon along with Kent Beck's "Test Driven Development" for $79.98 [link to PEAA bundle [amazon.com]]
  • Is XP good? (Score:5, Interesting)

    by timeOday ( 582209 ) on Tuesday January 28, 2003 @10:49AM (#5174644)
    Now that XP has been "out there" for a while, does anybody have some war stories?

    Does the reliance on incremental development and refactoring, rather than an intricate up-front design, really work, or does it result in a big wad of band-aids?

    Is pair programming OK, or do you sometimes get stuck with the nitpicker from hell who has to have every detail his own way?

    Is close involvement with the customer good, or does it just give them daily opportunities for endless bright ideas that prevent convergence?

    Just wondering...

    • Success Stories? (Score:4, Informative)

      by Lucas Membrane ( 524640 ) on Tuesday January 28, 2003 @11:07AM (#5174788)
      A few years ago, the XP promoters were bragging about the Chrysler Compensation System, the first big XP project. They were redoing all of Chrysler's payroll systems, they were part done, they were succeeding where much larger efforts had failed, and the new managers from Mercedes wanted them to finish it so that they could go on and apply XP to other problems.

      Last I heard, the Chrysler Compensation System was not finished, scrapped prior to going into production. What are the more recent projects that demonstrate how well XP works?

      • Any ideas why it failed?
        You've only told half the story :) Don't leave us in suspense.
        • Re:Success Stories? (Score:5, Informative)

          by Joseph Vigneau ( 514 ) on Tuesday January 28, 2003 @11:32AM (#5174984)
          Any ideas why it failed?

          Hear it from the players themselves [c2.com].

          One of the benefits of XP is that it can tell you much earlier about whether or not to terminate a project in the first place. From Extreme Programming Explained [amazon.com]:

          One of the features of XP is that customers can see exactly what they are getting as they go along.
          -- Kent Beck
          • Hear it from the players themselves [c2.com].

            A brief excerpt:

            I think this is almost always the case if you have a OnsiteCustomer, since the GoldOwner usually cannot be full time on site. ScrumMethodology tries to balance this by having the GoldOwner in the DemoAfterSprint meeting (roughly equivalent to XP's IterationPlanning).


            Ack. Whatever its merits as a development technique, XP has clearly succeeded in generating a remarkably high concentration of silly buzzwords.
    • Re:Is XP good? (Score:3, Insightful)

      by silkySlim ( 565600 )
      I can't speak for XP specifically, but lightweight development has been nothing but a positive experience for my team. But it's a tricky question. Because if you truly believe in the "Agile" methodologies, you find that the development process quickly becomes customized based on your team members and type of project. It's all about creating the path of least resistance for your team while still moving towards the end product.

      I work in a game studio where our last project had 6 months of pre-production time. We generated reams of technical design documents. The intention was good, but they were never maintained or even referenced after their initial creation. We just said "documentation is necessary" and it needed to be done. In production, the team wasn't on the same page. Every programmer had a task list they just milled through. The assumption was the initial requirements won't change. The result was ugly. The product was subpar and a couple months late. Everyone was miserable. It sucked.

      I'm currently leading a new project here. We're 6 months into production and every milestone has been delivered ON TIME and accepted by our customer. The team is focussed on the current milestone, there isn't a lot of process to get in the way. The best part is, writing code is fun again. We don't have goals we can't accomplish. And we fully expect the product requirements to change during production.

      I could get into specifics about our process. But I don't think it would be that helpful. I think specific methodologies like XP are guidelines to get you started. From there, you really should re-evaluate your process frequently (a fundamental exercise to be "Agile") and make changes as necessary. Kind of like optimizing your code.

      The following links gave me all the information I needed to devise an initial process plan (which included TDD). But once it was put into practice, it naturally evolved into the process we have now (which doesn't include TDD)...

      The New Methodology by Martin Fowler [martinfowler.com]

      The Agile Manifesto [agilemanifesto.org]

      Agile Software Development Ecosystems by Jim Highsmith [amazon.com]

      I also suggest reading the chapters on "thematic milestones" in Writing Solid Code.

      • IIRC, some years back he was the mastermind of a big OO project at Cash'n'Carry stores. Is that right? It was supposed to be a killer, major-competitive-advantage kind of huge leading-edge triumph. Somehow, I never heard a success-story kind of post-mortem on that one. What happened?
    • Re:Is XP good? (Score:2, Insightful)

      by spstrong ( 175063 )
      Now that XP has been "out there" for a while,
      does anybody have some war stories?

      It's interesting. The "war" has been with management. We showed our Project Manager a couple of the books (he actually READ them) and we were allowed to use XP. After our first project was successful, we have been trying to get official approval to develop using the methodology and it has been tough.

      Does the reliance on incremental development and
      refactoring rather than an intricate, up-front
      design really work, or result in a big wad of
      band-aids?

      As with any approach the team must be disciplined to
      1. Test First
      2. Talk to each other
      3. Ask for help when needed
      4. Refactor mercilessly
      5. Code the simplest thing that will work

      You will probably say "of course!" to all of the above, but if you don't stay disciplined within the team, you will get into trouble.

      Is pair programming OK, or do you sometimes get
      stuck with the nitpicker from hell who has to
      have every detail his own way?

      Pairing is great. If your goal is not well written simple software, you are not part of the team. You are the cowboy who is the reason the team is all working on Saturday to fix the sh*t you went off and wrote by yourself! (Ok, Ok, breathe....)

      Is close involvement with the customer good, or
      does it just give them daily opportunities for
      endless bright ideas that prevent convergence?

      The customer wants an application as quickly as possible. They have a business need and don't want to have to wait 4 more months for their great ideas to be implemented. If they continue to think off the top of their heads, they probably didn't know what they wanted in the first place and it will take that extra 4 months to get it out of them and get it into the application anyway.

      XP works. Our teams are 4 or 6 programmers with a tester. When we test first and give the tester something to test that is not fragile, we get much farther much faster than "code it and throw it" at the tester and hope it works.

    • Re:Is XP good? (Score:2, Interesting)

      by Anonymous Coward
      War Stories you want? You got it!! By the way, I am going as Anonymous Coward 'cause I can't say anything negative about the company. The company shall also remain anonymous. We tried it and it worked. However, it did not work well. The primary reason it did not work well was that the concepts were not consistently enforced. Also, upper management forced it on us as the next great thing and then proceeded to do things in the same old way.
      • We had a project and a project lead who developed the stories.
      • A customer is supposed to develop the stories. The customer tells you what is needed and you develop that, and
        only that. Stories are small pieces of functionality that can be developed in a few days (three to five). One of the mantras is that the stories have to be short.
      • We had problems with the project lead writing stories and then changing them in mid-stream.
      • There is nothing wrong with the project lead re-writing the story. However, our project lead would re-write the story
        without allowing the developers or testers to re-factor their development time.
      • We had problems with the lead developer including functionality because the customer will want this down the road.
      • One of the primary rules is that you code the story and
        only the story. You do not add in functionality that you think the customer might want.
      • We had stars and gurus that were allowed to work alone rather than as part of a team.
      • Another rule is that all production code is developed in pairs. This is for mentoring and learning as well as having two sets of eyes looking at the code to notice
        the missed comma or semi-colon. You pair an experienced programmer with one less experienced so the less experienced programmer can learn.
      • We had tests written to test the story, and tests written to test what the customer really wanted.
      • We wasted time testing the
        additional functionality as well as wasting time coding the additional functionality. It also, again, violated the XP rules.
      • We had stories that took three days to code; the tests to verify the story took two weeks to code.
      • This was, in most cases, an extremely valid time requirement. However, upper management and the project lead refused to allow the time. The statement was made repeatedly that
        it shouldn't take that long. Again, this violates an XP rule. The customer tells the team what is needed and developers tell the customer how long it will take to code the story. The testers tell the customer how long it will take to code the test. Management, the project lead, and the developers, as a whole, were of the opinion that the test should not take any longer than the actual code. This resulted in the testers working overtime on a near-continuous basis.

        Then, after several months (almost a year) of developing the XP way, the company had to have a demonstrable product within the next two weeks. In the end, nothing mattered except the ol' bottom line. We had to go to market to beat a competitor. It did not matter that we still had nine weeks of stories to finish the project. It did not matter that tests were failing. It did not matter that the kernel and OS version had changed twice in the year, each time causing the developers and testers to re-visit, re-test, and re-code a large part of the existing code.

        A nice shiny package was not ready exactly when the suits wanted it, and the company shut down the division, moved everything to another state and other developers. The product still has not gone to market, but the company got rid of the developers and testers that didn't deliver.

        Lest you think this is sour grapes, I actually think XP practices can work for what they were intended for: small-project development.

        Ours was a major undertaking with an entirely new product. There was education and training necessary that was never taken into consideration as far as the project timeline was concerned. We also had to learn new hardware technology, operating systems, development and testing languages at the same time we were learning XP methodology.

        It was not fun!!
    • Re:Is XP good? (Score:5, Interesting)

      by dubl-u ( 51156 ) <2523987012@@@pota...to> on Tuesday January 28, 2003 @12:52PM (#5175618)
      Hi! I've been doing XP on and off (depending on client preferences) for a couple of years.

      I'm immensely happy with it. It's not a magic bullet, but gives me the tools to solve a lot of common problems.

      Does the reliance on incremental development and refactoring rather than an intricate, up-front design really work, or result in a big wad of band-aids?

      Yes, it works. It takes a bit to learn how to do it well, but it works.

      Indeed, in some ways, it works better. When I did most of my designing up front, I had to guess about a lot of things. Now I'm a pretty good guesser, but I'm not perfect. And since I knew I was making designs that I had to live with for years, I tended to play it safe, putting in things that I would probably need someday. I hoped.

      Now, I don't put anything in until I know I need it. This keeps the design cleaner, and saves me all of the time I would have spent on bad guesses and things obsoleted by changing requirements.

      Is pair programming OK, or do you sometimes get stuck with the nitpicker from hell who has to have every detail his own way?

      I like pair programming a lot, and more people are good at it than you would expect. But some people are indeed pains in the ass; just last week we had to sit one guy down and have The Talk with him. If he doesn't shape up, he can go be a nitpicking cowboy on somebody else's project.

      Right now I'm also doing some solo work, and it really suffers from the lack of a pair.

      Is close involvement with the customer good, or does it just give them daily opportunities for endless bright ideas that prevent convergence?

      It can work really, really well. The trick is that you must let them steer the project, so that they can see that asking for all sorts of random stuff means that their precious project will be screwed.

      For example, I and another fellow were recently called in to work on a project that was six weeks from delivery and running late. The product manager handed us a one-page spec, spent 30 minutes showing us the existing code and said, "So can you do that in time?"

      Now the only honest answer was, "I dunno, but probably not with all that you've asked for." But we walked him through the XP practice known as the Planning Game, where we broke his requirements down into bite-size chunks, wrote them on index cards, and then he ordered them by importance. There was far more work than could be done in time.

      So then we asked him to draw a line at the point which, if we didn't reach it, he would slip the date, and if we did, he would ship. It caused him great pain, but he did it. Then we agreed that we would start developing and see how things were progressing.

      As time went on, he, like any client, had all sorts of great ideas, and so did we. Every one got written on a card, and we asked him to place them wherever he wanted in the stack. This forced him to make tradeoffs; the more stuff he put in before the release, the farther away the release would be.

      When the six weeks were up, we shipped a product with less than half of what he originally asked for. But instead of being pissed, he was happy! He got the half that was most important to him, and he was the one who made every choice about scope, so it wasn't our fault; that was just how things were.

      And best of all, because we'd stuck to the XP practices and done extensive testing, there have been no bugs reported against our code. With code that solid, we're glad to start extending it, instead of the usual pattern where a big deadline push means the code is crap.

      So yes, I like XP a lot. I get to write good code, polish the design, and have good relationships with the business folk. It rocks!
    • Re:Is XP good? (Score:2, Insightful)

      by jjl ( 514061 )
      > Is pair programming OK, or do you sometimes get stuck with the nitpicker from hell who has to have every detail his own way?

      The team should create a common guidelines specification, which must then be agreed on by everyone in the project. After that, nitpicking to make stuff conform to the guideline is a good practice, since sticking to the agreed style is an improvement.

  • by Anonymous Coward
    Isn't that the whole crux of XP? If you're not organized, nothing will help. If things are organized but need tweaking, then XP can be beneficial. Meeting the prerequisite of having some organization to how software is developed is the hardest part. If your PM isn't organized and you're not organized, forget about it. If you're organized, then you're probably already doing unit testing in some form.
  • The idea of breaking your program into small parts and testing them is fine. Writing programs to test these parts seems to be a bit problematic. I wrote a test program to test this module. But then I remembered that I forgot to write a program to test the test module! You use your specifications to write your module, then you test against these specifications. If you write a program to test these specifications, then you already must have a firm grasp of the specifications. What assumptions about your module are you making when you write your test? Or is this a simple black-box type test where you do not care how it works inside? Any faults that you have in your conception of your test program will be carried over into your module, or worse, your module might be correct but your test program flawed. Seems like an extra layer of complexity rather than a useful method. I also dislike the do-it-dirty-and-fast, then refactor part. Proper planning and implementation seems the better approach to me. I understand that "mad hacker" instinct to dive in and code away, but that has to be restrained a bit when the complexity of the problem rises.
    • Actually, I think that refactoring roots itself in reality. A greater percentage of the time a programmer (as a consultant or newly hired employee) will be brought into a situation where there is an existing application. Generally there isn't an option to redesign an entire application without costing the company more $$$ than it would like.
      Better design is preferred, but it is not always a reality in the workplace.
    • Any faults that you have in your conception of your test program will be carried over into your module, or worse your module might be correct but your test program flawed

      In practice, this isn't really a problem. As long as you write your tests first (rather than hacking them together afterwards) then they're a pretty good expression of what the code should do. Then the code itself is an expression of how to accomplish that.

      Of course, it's possible that one has the wrong idea and makes the same mistake twice, but you use other practices to guard against that.

      Seems like an extra layer of complexity rather than a useful method.

      Maybe. My score for the last year or so is less than one reported bug per man-month of coding. And it's lower than that for my code written while doing pair programming.

      I spend almost zero time hunting bugs.
    • I wrote a test program to test this module. But then I remembered that I forgot to write a program to test the test module!

      Actually, test-first development tackles this case specifically. You write the test first - which has to fail since you haven't written the code yet. (Technically, it even fails to compile. You then create stubs for the methods you call so that it compiles.) This is the state known as "Red". If the test passes when you first create it, you know that you have a bad test.


      Once the test fails, you then write the code so that the test passes. This is the state known as "Green".


      In practice, I do find that I occasionally write insufficient tests - which means that bugs get into the final product. But once a bug report comes back in and we find the problem, the first thing we do is write a test that fails (ie, back to the red state.) Then we fix the bug so that we move to the green state. And voila, we have a regression test for that particular bug.
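      The red/green cycle on a bug report, as described above, can be sketched in plain Java. The Fraction class and its sign bug are hypothetical stand-ins, and hand-rolled checks are used instead of JUnit so the sketch runs standalone:

```java
// A minimal sketch of the red/green cycle on a bug report.
// Fraction and its sign bug are hypothetical; plain checks stand in for JUnit.
class Fraction {
    final int num, den;
    Fraction(int num, int den) {
        // The fix (green): normalize the sign into the numerator.
        // Before the fix, this constructor stored a negative denominator as-is.
        if (den < 0) { num = -num; den = -den; }
        this.num = num;
        this.den = den;
    }
}

public class FractionRegressionTest {
    public static void main(String[] args) {
        // Step 1 (red): this check was written when the bug was reported,
        // and it failed against the unfixed constructor.
        Fraction f = new Fraction(1, -2);
        if (f.num != -1 || f.den != 2) {
            throw new AssertionError("sign not normalized: " + f.num + "/" + f.den);
        }
        // Step 2 (green): the fix makes it pass, and the check stays in
        // the suite as a regression test for this exact bug.
        System.out.println("green");
    }
}
```

      Once the check passes, it is never deleted; it keeps guarding against the bug reappearing during later refactoring.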


    • This argument (that tests may be flawed, and so don't help) isn't as much of a problem as you think.

      Suppose 10% of the code you write contains errors (at random), and you write tests covering 50% of the code. Then you find 90% of the errors in 50% of the code (assuming your error rate in test code is the same as the error rate in 'real' code). Your error rate for the overall program should now be around 5.5%: the untested half keeps its 10% error rate, the tested half drops to 1%, and 0.5 × 10% + 0.5 × 1% = 5.5%.

      In other words, it doesn't matter if your tests can also be flawed; you'll still improve your code by doing testing. On the other hand, planning will make absolutely /zero/ difference to your error rate (compared to any other method that appears to implement a specification). In order to remove bugs, you have to do /some/ testing, so I have to assume you're advocating test-last.

      It then boils down to whether test-driven or test-last is the most economic policy, and how much testing to do.

      Anyway - since this economic argument (not flawed tests) is the crux of the matter, you should take a read of Beck's thinking on the issue, in May 1999 issue of C++ Report [adtmag.com], particularly the last paragraph.

      Cheers,
      Baz

  • The Unix Philosophy [camalott.com]
  • The RIAA issued a press release saying that they are initiating a lawsuit against Kent Beck because eXtreme Programming principles could lead to faster development of file-sharing software. The RIAA's main argument is that CD sales are falling at levels that could only be explained by eXtreme Programming.
  • by Waffle Iron ( 339739 ) on Tuesday January 28, 2003 @10:59AM (#5174733)
    XP testing techniques are great in theory, but there is still a gaping hole: if your tests aren't correct, your program could still end up flawed.

    That's why I've moved on to XXP, which focuses first on correctness of tests. First, I write a test that tests a test. Then I write the test. I test the test until the test tests ok. Then I write a test for another test, and so on.

    My pair programming partner is currently working on an idea he calls "XXXP". I'll post our results if we ever finish a project without getting lost in infinite recursion.

    • I test the test until the test tests ok. Then I write a test for another test, and so on.

      Test the test? Sure it's possible ...

      Tools like Jester [sourceforge.net] let you evaluate the quality of your JUnit test cases. So yes, you can revise your test cases in between refactoring your code.

      Hopefully the developers of Jester didn't evaluate their test code with Jester (XXP with Jester-for-Jester), the mere thought has me spiralling into a daze of infinite recursion.
  • In my experience, extreme programming was invented by/for programmers who hate designing and discipline, and just want to start hacking. It leads to sloppiness and half-finished code, and it's not a scalable approach for large projects. I had the misfortune of having some guys on a project who insisted on working this way, and we had a huge mess to clean up in their wake. I'm glad this programming philosophy seems to be dying out.
    • It's not necessarily dying out. Look around at a lot of the young people heading into the profession, there's a lot of BPFH (Bastard Programmer Fro...why am I explaining that on slashdot?) wannabees out there who see incomprehensible code as 'job security' and a sign of their '1337 sk1llz'.

      Admittedly I tend to just sketch out a basic overview of any project I set out on then dive straight into the code but my projects are small scale so I can get away with it :). I do tidy things up and comment before declaring them done though.
    • Yeah, aren't you glad we live in a world with disciplined designers who don't test their code and never work to customer specifications and don't integrate continually and never run acceptance tests? I'd hate to think about the quality of software we'd have if people did THOSE things!

      • I'd hate to think about the quality of software we'd have if people did THOSE things!

        Are you saying THOSE things were invented for XP?
        • By no means! I am merely saying that they're fundamental to XP. If you're attempting to do XP without those, you will need a combination of very lucky and very, very brilliant developers to have a chance of delivering good software on time.

          (If you're not doing XP but you are doing those practices, you have a pretty good chance of delivering good software on time.)

    • by Anonymous Coward

      Actually, if you look at the credentials of the original "inventors" of XP, and some of its biggest champions (Robert Martin, Martin Fowler), you'll see that they are big champions of proper software design as well. They just realized that "big design up front" -- which they were preaching -- wasn't working out, and that most implementations of more iterative approaches weren't working out either. Hence a "reaction" in the form of XP.


      As for discipline, proper implementation of the XP practices requires far more discipline than most highly structured approaches. What it minimizes is the creation of documentation for documentation's sake. What has happened is that people who hate design/discipline have decided that by saying "we're doing extreme programming", they are off the hook. But don't blame the inventors.

  • by kahei ( 466208 ) on Tuesday January 28, 2003 @11:05AM (#5174773) Homepage

    I'd be the first to admit that XP offers a lot of risk-reduction -- for teams that are working on things that are easy to unit-test.

    With a class that is supposed to take in a bond and output the yield curve, it's easy to write a unit test. But what about the next class, the one that renders the yield curve on the screen? What about the complex, distributed system of Excel objects and forms, the things that draw a network, and the things that flash green and go 'ping' to indicate a change, all of which are equally necessary but generally much harder to write and much more likely to go wrong?

    Has anyone tried to apply test-first programming to complex guis? I can't say that any obvious way to do it has ever occurred to me. Worse yet, when I ask I generally find that people either

    a) Are in the same position as me, or
    b) Believe that a GUI is a little thing you spend a couple of days on after you finish the application

    So, for now XP is something I read about rather than something I actually do.

    • Just for the record, the "correct answer" to your conundrum is to get the widget writers on board with the testing stuff. They need to add methods for testing, things that might expose inner details (readonly) about the widgets and other things that technically violate OO principles, so that testing can be done.

      Other things such as matching pieces of the picture and such would also be nice.

      It'll never be easy to test a GUI, but right now GUI widgets are almost entirely focused on giving commands and limit the information you can extract back from them to screen position, size, and little else. I want to be able to assert that my textbox can handle text of the size 50Ms with all of it being visible, which is exactly the sort of thing a user requirement might be. (Indeed, I think of this example because of a text dialog I just dealt with for entering a Wireless access key that was two chars too short; if you're that close, why not go for the gusto and let me see the whole thing for visual verification of correctness?) In fact you can do some of this in some widget sets; Tk will let you do that exact test, for instance. But it's haphazard from a testing point of view, because it wasn't implemented for testing.

      The other thing we need is a way of inserting all user-sourced events cleanly, and in a well-documented and supported manner, directly into the event loop as if they came from the user, indistinguishable from user input to the rest of the application. Again, haphazard, poorly-supported and poorly-documented abilities to do this exist in some toolkits, but since it's not meant for testing it's often not complete or completely undocumented.
    • All my GUIs seem to collapse on themselves, so I'm in no position to talk...

      However :), what I _do_ like are programs where you separate the code and GUI as much as possible. Even to the point of making them separate programs, or at least a library.

      Pros:
      You can have many different GUIs - personally this is how I like cross-platform GUIs done - one tailored for each platform.
      If it is a separate program, then you can script it, right from the start too.
      You can test the code more easily, since the GUI is separate.

      Cons:
      For some reason having them separate seems to make them less robust in the examples I've seen.

      So er yeah anyway. One solution is, as above, to separate your GUI and code. Another solution, perhaps, is to plan all your DCOP calls (or whatever your favourite scripting thing is).
      This ties in nicely with your use-case diagrams, since you can have calls for your "stories".

      I'll shut up now.
      • It is commonly known as the MVC model. MVC stands for Model, View, Controller. Basically you have code that handles the data (the model) and code that handles the view (the GUI); the controller then sits in between as a basic sort of glue to combine them.

        On smaller projects the controller and the view often get integrated into the same place, but once you start talking multiple windows etc then you need to split the Controller out.

        I believe that Java's GUI stuff is done like this, and Cocoa on MacOS X definitely uses this (the framework is built on this foundation).
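        A minimal sketch of that split, with illustrative names; a listener callback is one simple way to let the model notify the view without depending on it:

```java
import java.util.function.IntConsumer;

// Minimal MVC sketch; class names are illustrative.
class CounterModel {                       // Model: owns the data
    private int count;
    private IntConsumer listener = c -> {};
    void setListener(IntConsumer l) { listener = l; }
    void increment() { count++; listener.accept(count); }
    int getCount() { return count; }
}

class ConsoleView {                        // View: rendering only
    String render(int count) { return "Count: " + count; }
}

class CounterController {                  // Controller: the glue
    private final CounterModel model;
    CounterController(CounterModel model, ConsoleView view) {
        this.model = model;
        // Model changes are pushed to the view; the model itself
        // knows nothing about rendering.
        model.setListener(c -> System.out.println(view.render(c)));
    }
    void onIncrementClicked() { model.increment(); }
}
```

        Because the model and view never reference each other directly, each can be unit tested on its own, which is exactly why the separation helps with testing.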

    • Has anyone tried to apply test-first programming to complex guis?
      Yes, but this book doesn't really cover the subject. See the test driven development forum at Yahoo! Groups [yahoo.com] for more on test-driven development of systems with GUIs.
    • b) Believe that a GUI is a little thing you spend a couple of days on after you finish the application

      A couple of days? If a variable dump's good enough for me it's good enough for the proles ;)
    • MarathonMan [sourceforge.net] Brought to you by your good friends at ThoughtWorks.
    • The problem is that a lot of GUI frameworks are pretty hostile to testing, but it can be done.

      Rendering stuff, for example, can be tested by comparing output bitmaps. For example, I'm doing some photo stuff, and all of the methods that output images I just get working manually and then once their output looks right, I save that and make the check part of the test suite. Anything that outputs numbers (e.g., the calculation of the size of the new image for a scaling operation) is just tested the usual way.

      Other GUI stuff you can test by generating fake events and setting up mock collaborator classes to make sure things do what they should when they should.

      And you can also take a just-the-essentials approach. For my web work, I test the underlying components extensively. I also have a web robot that hits a test copy of the site and makes sure that it all works well enough. I don't test every bit of HTML; I just make sure that important strings appear in the right place, that clicking through gets to where I think it should, and that feeding the right data to the right forms gets about the right responses.

      Still, it's more of a pain than it should be. I look forward to the day that somebody develops a GUI framework test-first; those components will be much easier to work with.
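      The mock-collaborator idea above can be sketched like this; the interface and class names are hypothetical. The GUI glue is tested by handing it a recording stand-in instead of a real widget:

```java
// Testing GUI glue with a fake collaborator; all names are hypothetical.
interface Display {
    void show(String text);
}

// Test double that records what the code under test asked it to do.
class RecordingDisplay implements Display {
    String lastShown;
    public void show(String text) { lastShown = text; }
}

// The "glue" under test: reacts to an event by driving the display.
class SaveButtonHandler {
    private final Display display;
    SaveButtonHandler(Display display) { this.display = display; }
    void onClick() { display.show("Saved"); }
}
```

      A test constructs SaveButtonHandler with a RecordingDisplay, fires onClick(), and asserts on lastShown -- no real window or event loop required.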
    • My solution for web development has been to use unit tests wherever possible, and to write the UI components to be as test-friendly as possible.

      That is, I try to do a strict MVC (Model-View-Controller) class separation, with the View as lightweight as possible. In other words, make the UI as dumb as possible, so that if there is any room for errors they show up in the simplest UI walk-through.

      This also means that you can do rapid development (usability testing, anyone?) on the nitpicky presentation details. Automated GUI testing tends to be screen-reader testing, which means the test needs to change when minor GUI components change-- which is unreasonably nasty.

      As a specific example, JSP tag classes have a doStartTag() method which typically write to the web page stream. It is annoying to unit test because it needs web page context and must handle lots of data gathering and exceptions.

      I write my doStartTag()s to gather data, handle exceptions, and write to the web page-- and everything else (the real meat of the method) is done by a helper method that takes data and returns a String. The helper method is written to be easy to unit test: it throws all the interesting exceptions and can fail in all the interesting ways. The doStartTag() is simply the JSP wrapper.

      Of course, a pure MVC approach has my helper method in a separate class-- which is actually my most common approach-- but in many cases that would be overkill.

      This approach can be applied to any kind of GUI: treat GUI code as a minimal wrapper around easy to test code.
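      The split described above might look like this; GreetingTag and renderGreeting are hypothetical names, and the real tag-handler plumbing (TagSupport, pageContext, JspException) is only indicated in comments so that the testable part stands alone:

```java
// The testable meat of a hypothetical JSP tag, separated from its wrapper.
public class GreetingTag /* would extend TagSupport in a real tag */ {

    // Helper: takes plain data, returns a String, and throws the
    // interesting exceptions. It needs no web container, so it is
    // trivial to unit test.
    static String renderGreeting(String userName) {
        if (userName == null || userName.trim().isEmpty()) {
            throw new IllegalArgumentException("missing user name");
        }
        return "<p>Hello, " + userName + "</p>";
    }

    // doStartTag() would just gather userName from the page context,
    // call renderGreeting(), write the result to the JspWriter, and
    // wrap any failure in a JspException -- a wrapper too thin to
    // need its own unit tests.
}
```

      The same shape applies to any GUI toolkit: keep the toolkit-facing method a minimal wrapper around a plain-data helper, and test the helper.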

    • Having had to test GUI apps before, I think a useful approach is to separate the GUI ("View") and the data ("Model"). Automating GUI tests is a bitch and the tests are very fragile if anything in the GUI changes. If possible, design a command-line version of the app that includes everything but the GUI. The command-line version can be scripted to perform automated input/output tests. If your real app is a COM object, your command-line version could even be a simple app that loads the app COM object.

      This application design is not a "natural design" a developer would choose on their own, but I think using a test-driven process it can really improve the app's modularity.
  • by Anonymous Coward
    I've never really tried the official XP methods, but I have had a couple experiences with some of the ideas of it, like when I'm coding when my brothers or "programming literate" friends are around. My general conclusion is that it works pretty well, you don't make as many mistakes when first typing your code and they make it easier to find problems when you do make mistakes.

    I don't know whether it is truly more practical than two separate individuals programming, but it does feel more productive and more enjoyable, it's like you hardly ever get stuck on anything.

    The thing is, like I said I've only done it with my brothers or friends, people I generally get along with. I could easily imagine my great experience with it turning into the programming project from hell if you don't like the person you're with.
    • It does not make any sense to make an XP team out of two developers who do not like each other. If a manager does this, then he or she is a bad manager. Generally you should change pairs quite often to ensure knowledge flow, and you should never force anyone to do pair programming. Some people just can't stand it, and they would surely be less effective if forced.
  • by Anonymous Coward on Tuesday January 28, 2003 @11:16AM (#5174861)
    What? Is he going to write a book for every rule of XP?

    Testing is important, but the XP testing philosophy is a catch-all for actually thinking about your product and its purpose. XP is about making hack programmers look legit. XP has some good points, such as an emphasis on simplicity, testing, and customer satisfaction, but mostly it's about making bad habits look good, like no design and iterative feature hacking in ignorance of the bigger picture of the app.

    Some design up front is important. Documentation is important. Code ownership is important to an extent. In a medium to large system, having everyone able to change any line of code is just stupid. People change shit and don't have a clue why the code looks the way it does. One of the arguments for no code ownership is that a lead architect can't keep it all in his head. Well, what about a team made up of folks who aren't as capable as that lead architect? According to XP, they are able to comprehend the whole system. And they are allowed to change whatever they want, when they want. So, get a couple of average programmers with large egos, and you have a lot of problems.

    XP is great for people who are happy doing bug fixes all day instead of avoiding the bugs to begin with. The assertion that XP results in fewer bugs is pure speculation and, from my experience, a very misleading claim. Just because your test succeeds doesn't mean that your program is correct. And if the test is the only thing validating the success of your final solution, you're screwed.
  • by WPIDalamar ( 122110 ) on Tuesday January 28, 2003 @11:18AM (#5174874) Homepage
    It sounds to me like this is just another way of designing, but that makes the wrong emphasis. These tests sound a lot like low level use-cases. You should be creating those anyways. But you should create all of those first! Not one at a time as you code.

    • It sounds to me like this is just another way of designing, but that makes the wrong emphasis. These tests sound a lot like low level use-cases. You should be creating those anyways. But you should create all of those first! Not one at a time as you code.

      Should? What, because it was written on the back of the stone tablets that Moses brought down?

      Hi, I've tried it both ways. Test-driven design and refactoring work. It's possible to build good software both ways. In my opinion, TDD and refactoring are more productive and more fun. But hey, you're welcome to stick with your stone tablets if you want.
  • Requirements are the Achilles heel of XP. Without rock solid requirements, you are just guessing for the test scripts.

    Take a trivial example -- an entry form for a phone number. What is a valid phone number? Add in real world things like extensions, folks using alphanumeric substitution (1-800-DISCOVER), and internationalization and it gets interesting. Now a test driver is not that big of a deal if you know what to put in it. From a design standpoint, it would really be nice to have solid requirements and test scripts that provide concrete examples as to what the business was asking for. Real world? I could only dream for mediocre requirements that might resemble not only what they asked for, but what they want.... At least enough to try and read their minds.
    • Requirements are the Achilles heel of XP.
      Then user stories (what others call "use cases") must be the glass slipper. XP addresses this; read Planning Extreme Programming by Kent Beck and Martin Fowler (Amazon.com [amazon.com], BN.com [barnesandnoble.com]).

      XP also calls for "customer on site"; the theory is, it's far quicker to get answers in real time rather than waiting for someone to write a 600 page document.

      THIS DOES NOT SCALE UP. You know it, I know it, Kent Beck probably knows it. Some of the other agile development methods [agilemanifesto.org] try to address this.
    • I think you are missing the point. You seem to realize that you are never going to get rock solid requirements, which I think most people would agree with. But then you use that as an excuse to throw away XP?

      Test Driven Development is a great way to deal with changing requirements. For a phone number validator, you would write up tests for all of the initial requirements:

      testLocalNumber()
      testLongDistanceNumber()
      testTenDigitLocalNumber()
      testAlphaNumericNumber()

      Then when you deliver the application and find out that you need to deal with international numbers, you write:

      testInternationalNumber()

      You get a red bar, because you can't handle i18n numbers yet. So get that working, and when you are on a green bar, you know that it still works for all those US numbers and it works for the new foreign numbers.

      Then you extend as you get new requirements:

      testEnglishNumber()
      testFrenchNumber()
      testItalianNumber()

      What is the alternative to this? You are still lacking requirements, even if you aren't doing TDD. But you wouldn't have any tests, and you wouldn't know exactly what your class can do for you.

      Test Driven Development is not all about leaving test artifacts. The tests are constantly changing during development as your requirements change. The main idea behind TDD is to program from the client side of things first. This is similar to the idea of writing documentation first, with the added benefit that as you finish the tests, you prove that the class does what you want it to do.
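
      As a rough illustration (the validator and its rules are invented here, not from the book or this thread), the growing test list might sit alongside code like this:

```python
# Hypothetical validator -- just an illustration of tests
# accumulating one per requirement, as described above.

KEYPAD = {c: d for d, letters in {
    "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
    "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ",
}.items() for c in letters}

def normalize(number):
    """Map vanity letters to keypad digits, then keep only digits."""
    chars = [KEYPAD.get(c, c) for c in number.upper()]
    digits = "".join(c for c in chars if c.isdigit())
    # Carriers ignore digits dialed past the 11th, which is how a
    # 12-digit vanity number like 1-800-DISCOVER still connects.
    return digits[:11] if digits.startswith("1") else digits

def is_valid_phone_number(number):
    return len(normalize(number)) in (7, 10, 11)

# One test per requirement, mirroring the names in the list above.
def test_local_number():
    assert is_valid_phone_number("555-1234")

def test_long_distance_number():
    assert is_valid_phone_number("1-212-555-1234")

def test_ten_digit_local_number():
    assert is_valid_phone_number("(212) 555-1234")

def test_alpha_numeric_number():
    assert is_valid_phone_number("1-800-DISCOVER")

if __name__ == "__main__":
    for test in (test_local_number, test_long_distance_number,
                 test_ten_digit_local_number, test_alpha_numeric_number):
        test()
```

      When testInternationalNumber arrives, it starts out red; normalize grows a rule for "+" prefixes, and the existing tests guard the old behavior while you do it.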

      -Mike

      • I think you are missing the point. You seem to realize that you are never going to get rock solid requirements, which I think most people would agree with. But then you use that as an excuse to throw away XP?

        I'm not sure I have. First off, I'll qualify this - I've done some XP development and I am a bit cynical at this point in time. I'm not expecting a 200-page specification document, use cases, etc. Normally I'd settle for a return phone call, email, or some sort of feedback/response. The business users tended to be vague, or to rotate the requirements faster than a Borg's shields, if they responded at all. That was a pretty big if, btw... Poor communication is a major hurdle for any methodology, and XP suffers 'when business behaves badly'.

        In the above example, what the business really wanted was a unique identifier - they just did not know it at the time. The phone number changed to an email address, to a SSN, and a few other things. The XP process puts too much emphasis on testing at the wrong time, IMHO. The forest was lost in the trees.

        Don't think I'm against test scripts... I create drivers in my own code, and also expect someone to test the veracity as well. To be honest, I think it is better if I don't create the final test script as users tend to have an unlimited imagination when it comes to using things improperly. The lack of faith in XP does not translate into dumping QA at the development, staging, or production level.

        I've got opinions on pair programming, and a few other aspects of XP too. The short of it is if you have strong communication, teamwork, and realistic expectations, almost any methodology will deliver. Success when things are going to hell in a hand basket? XP is not a silver bullet. I've seen management cut the number of workstations in half, but when was the last time you saw a 40 hour work week?

        Anyhow, I concur about test based development. Same idea, different denomination... (grin)
    • So write tests only for the cases your code can handle. If someone asks if your code handles extensions, you can look at the tests, see that there's not one there, and confidently answer "No, but I can add that..." Your tests are then invaluable in describing the functionality of your system.

      --Steven
    • Requirements are the Achilles heel of XP. Without rock solid requirements, you are just guessing for the test scripts.

      Spoken like a guy who has never done XP.

      One of the core practices of XP is known as the On-Site Customer. This means that there should be somebody at all the meetings, sitting within shouting distance of the programmers, who can answer questions like that or find out the answers. (In some companies this person is called a Product Manager; in others, a Business Analyst.)

      I could only dream for mediocre requirements

      Yes, requirements documents are generally silly. That's why XP doesn't use them. Instead of waving around some phone-book sized pile of garbage, XP practitioners use high-bandwidth, low-latency communications techniques: they talk. And they try out new versions of the software every week.

      At least enough to try and read their minds.

      Yep! And get yelled at when it turns out you don't have psychic powers. 'Cause that's what it takes to make standard development practices work. Isn't that a sign we should try something different?
  • OK, so developing automated unit tests, possibly even before writing the code that will be tested, is usually straightforward and almost always a good idea.

    But how can anybody design automated integration tests for applications that are intended and designed to have pseudorandom behavior, such as interactive entertainment software?

  • by Googol ( 63685 ) on Tuesday January 28, 2003 @11:40AM (#5175070)
    here [yahoo.com] (PDF), with more information here [vt.edu].
  • by Lumpish Scholar ( 17107 ) on Tuesday January 28, 2003 @11:42AM (#5175091) Homepage Journal
    I've done a little test-driven development on my own.

    Test-driven development seems to call for a series of baby steps, each corresponding to a unit test case. Unfortunately, I wasn't always able to identify "the next baby step"; even if I could pick what I thought was the next unit test, I sometimes found myself spending far too much time, and writing far too much code, for just that next test.

    I also sometimes found that "the next test case" already passed. I don't know if I wrote more than I needed to early on, or I picked the wrong next case, or if there's more to all this than I've picked up.

    When I was in good TDD mode, I was flying; test, red, code, green, refactor, green, next! It's a very rapid, and very intense, experience. There's a reason XP usually calls for a 40 hour week; by the time you're done with a few hours of this, you are tired! (But you've gotten a lot done.)
    • I sometimes found myself spending far too much time

      Generally my unit test code writing takes about 100% - 150% as long as my production code writing. However, I spend almost no time on debugging. I find 95% of the bugs that would otherwise make it to the QA department (or worse yet, production) while the problem is still fresh in my mind. Usually it's caught and cleaned within a few minutes of when I write it.

      Try not writing any unit tests on a fairly isolated chunk of code sometime. Then keep track of your debug time on that class for the following six months.

      and writing far too much code

      The more you write unit tests, the quicker you will become at laying out test data. You may also want to look through the constructors of your production code - sometimes this can be an indicator that it would be worthwhile to simplify the construction process. If it's taking you a bunch of lines of code to get a test-worthy object when testing, it probably takes a bunch of lines of code to get a usable object in production.
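
      For instance (all names invented here), a small factory with sensible defaults turns several lines of setup into one per test:

```python
# Illustrative only: when building a test-worthy object takes many
# lines, a factory with sensible defaults pays off in both tests and
# production code.

class Order:
    def __init__(self, customer, items, currency, priority):
        self.customer = customer
        self.items = items
        self.currency = currency
        self.priority = priority

def make_order(customer="test-customer", items=(), currency="USD",
               priority="normal"):
    """One line in each test instead of four lines of setup."""
    return Order(customer, list(items), currency, priority)

# A test only spells out what it actually cares about:
order = make_order(items=["widget"])
assert order.currency == "USD"
assert order.items == ["widget"]
```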

      That said, there are some objects that are just a hassle to create a sample of. For those, I grit my teeth and get through it by reminding myself how much worse it would be to have to debug it after it's in production, with the customers at a standstill and my boss pacing a groove in the carpet.

      I also sometimes found that "the next test case" already passed. I don't know if I wrote more than I needed to early on

      My first test for a given class usually fails with a "cannot resolve symbol" because the class doesn't exist yet. I also occasionally go back and intentionally break a class to get the test to fail, because I want to verify that the test itself is actually testing something (prevents you from having to write a test class for your test class: Monkey, MonkeyTestCase, MonkeyTestCaseTestCase, MonkeyTestCaseInfinity, MonkeyTestCaseInfinityPlusOne...)

  • As time passes, it seems to me that all this "test-driven programming" is promoted by people who completely forget about basic design.
    Really! Who today cares about all those boring things like FSM analysis and formal task description? Who needs a deep understanding of what his or her software does, when one can just test a small part of it?
    I just want to emphasize the following simple thought: one may invent a new perpetuum mobile without the slightest understanding of physics. And such inventions nowadays aren't even considered anymore.
    Computer science is mature enough to be utilized in everyday work. One has no need to study another perpetuum-mobile invention guide. Just remember your classes.

    I hope that in my lifetime analytical computer science will take its place in program development instead of witchcraft spells.

    I just want to ask those adepts of XP: how will you ensure that a small change in the code won't affect quite distant parts of the code?

  • One day my manager came and suggested we try XP. So I did what I normally do in such situations and did a search on the pattern [stupid management suggestion] +stupid and quickly turned up what I was looking for:

    http://http://www.softwarereality.com/lifecycle/xp/case_against_xp.jsp [http]

    In two minutes I was able to save both myself and my company many man-years and headaches.

    • Oops, link should be:


      http://www.softwarereality.com/lifecycle/xp/case_against_xp.jsp [softwarereality.com]


      How XP of me.

    • Some of the comments I read in the "software reality" web forum that defend XP sound a little... well, defensive. One comment made it sound more like a philosophy than a set of measurable practices. I also sensed in the "case against XP" article that the author feels that XP's defenders can always say "Well, they didn't implement XP in the way that Marx^H^H^H^HBeck intended, so it doesn't count!" just as supporters of Communism sometimes said about Marxism(TM).

      I think the "Case Against XP" makes a lot of valid points. Who can afford to continuously refactor 100KLOCs of C code for example? It's hard enough keeping up with the integration issues with a stable code base. You want code rage? Try breaking a few dozen test cases by making "improvements" to your design that ripple in all directions.

      One important point about "Case Against XP" is that it may be criticizing radical forms of XP that might not exist in practice. A little slack might be warranted if XP projects in practice do not actually churn the code base as much as the religion might suggest. And the author makes fun of Beck for excessive optimism, even as he extols the effortlessness and elegance of "getting it right the first time"... yeah, right.

      So yes I'm a critic of XP, but I'm also a critic of ~XP.

      If you meet the XP {booster,critic} in the road, listen to him but don't follow him... chaos and madness may await you as surely as dereferencing NULL. You must find your own way on the road, grasshopper.

  • Great without XP (Score:1, Interesting)

    by Anonymous Coward
    We've been going through a big management push to have quality code. A QA manager is now senior VP over development. So, I picked up the book and read it and tried it out.

    I'd already written an engineering requirements document which we reviewed with management and our partner company on the project. I've already put together an architecture document showing all the major components and who talks to whom (reviewed with the dev and QA teams). My manager even insisted that I write a design document on the specific module I was about to write, which included a schedule and unit tests.

    After two months of documentation I was desperate for a chance to code. TDD? Why not. I gave it a try -- not in nearly the detail the book mentions, but I have a dozen tests each hitting a key requirement of the design. When I recompiled the code for the ARM hardware, it was gratifying to see that everything still worked on the target platform. Every time we make a change, we have the initial tests I wrote to make sure nothing is broken. It's nice.

    I'd like to make the junior guys do this -- at least then I can tell how far along they really are. "Oh, it's 80% there!!" Sure, show me the test cases that pass and how they relate to the requirements and design.

    Don't ask me to pair program. I'd rather write docs and review them with the team and do code reviews.


    BTW, I didn't like the book that much. But the method is good.

  • I work for a largish (the largest?) media organisation in the world cutting code as part of a team of 4 developers plus a senior developer.

    We've been using Scrum and Test Driven Development for about six months now, and there is NO WAY I'm ever going back to writing code without writing tests first.

    Scrum (see http://www.controlchaos.com) is a "lightweight" development methodology. It's developer-driven (no more Gantt charts !!!!), and it's agile, meaning that it embraces change throughout the project lifecycle. I can highly recommend it. But I digress...

    Test Driven Development is something that every halfway serious programmer should be doing IMHO. It doesn't replace the initial back-of-a-fag-packet design stage, nor does it stop you designing elegant and effective architectures. What it does do is:

    • force you to define rules that your code must obey by requiring that you write tests first. This really is useful, because it makes you think about the implications of how your application behaves.
    • facilitate unit testing - you know that every function, procedure etc works as expected, and fails as expected.
    • facilitate regression testing - when you change / enhance / refactor or otherwise modify your codebase, you know that you have not broken any existing functionality, because you will re-run the test suite against the modified code. Of course, you will have added tests for the new / changed functionality before having coded the new / changed functionality.
    • focus your mind on developing the required functionality - when all the tests pass, it's time to stop coding. No temptation to just tweak a line of code. Careful with that axe, Eugene.

    So I can recommend TDD. Check it out. By the way, we're coding mainly in python with some java thrown in too.
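
    Since we're in Python anyway, here's a toy sketch (not from our codebase) of the regression-testing point above: the same tests stay green before and after a refactor, which is what makes the refactor safe:

```python
# Hypothetical example: two versions of the same function. The tests
# pass against both, so swapping in the refactored version is safe.

def word_count(text):
    # First version, written just to get to a green bar.
    count = 0
    for chunk in text.split(" "):
        if chunk != "":
            count = count + 1
    return count

def word_count_refactored(text):
    # Cleaner version; the same tests must stay green.
    return len(text.split())

for impl in (word_count, word_count_refactored):
    assert impl("the quick brown fox") == 4
    assert impl("  spaced   out  ") == 2
    assert impl("") == 0
```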

  • ... but I have always found that quality goes up and the number of bugs goes down when there is actually adequate time allowed for development and testing.

    So many projects don't even get completed and this is mainly because management will not accept realistic estimates as to how long they will take to complete so the timescales get shrunk and funnily enough, the project runs over. Once the time pressure is on, quality goes out of the window because it is more important to deliver.

    Cynical??? - Maybe, but I have seen this happen so many times and yet no-one ever faces up to the fact that poor estimates are the root cause of most failures. If it can't be done in time, DON'T DO IT. If it must be done, allow sufficient time to do it properly.

    Still, I'm only a software engineer. What do I know about anything.

    harrumpf.

  • First of all, note that I'm not knocking the principles of Extreme Programming (XP).

    The first XP book written by Kent Beck reads like a self-help book. If you're going to write a book whose principle is "feel good about yourself," and you're trying to fill 200 pages, then you can't just cut to the chase. You have to ramble on for a few chapters about what you're going to say, slowly let out bits of information here and there, and then there are chapters that reiterate what you've already said. Beck's books--and all of the books in the XP line that I've seen--read the same way. You could explain XP clearly and concisely in a few pages, but the XP books go around and around in circles, and after a while you're not sure if you're getting new information or not. And, miraculously, the XP line has been extended to six or more books, each of which goes over the same small bit of information in another verbose and rambling way. There's even a book about XP criticisms, an officially sanctioned book in the XP series, which exists simply to reinforce the basic principles of XP.

    The whole thing smacks of books like Dianetics or various lightweight volumes from self-help gurus. If there was any meaning to XP, it has been lost in endless self-justification. Imagine an entire series of books that did nothing but tell you how cool Linux was. What's the point?
  • Here's an idea: design projects so that they're testable! Yeah, that's right: design components and subcomponents so that they can be tested! This is the biggest problem I have found when tackling any project. Developers are always thinking in terms of quick and dirty solutions, delving straight into implementation details without paying any attention to architecture.

    The fundamental problem is that most folks aren't thinking in terms of testing and architecture. I'm not convinced that the XP method works. Actually, after having read the case against it, it seems counter-intuitive and plain annoying. I'll say this though, XP is onto something by placing such a strong emphasis on testing. The first question on every developer's mind should not be "How can I solve this problem?" but "How can I solve this problem with a testable solution?" The project architecture should be subsetable and the only way to achieve that is by designing each piece of the total solution to be testable.

    A project I'm currently working on has a portion of the team building a GUI enabled, unified test suite for testing the many components being developed. The developers in turn are making sure that each component has an interface available so that they can plug their components into the tester and test them. Not only can the developers test their own code with confidence but so can their peers without having to look at code written by someone else.

    So far it seems as if the effort being invested into the dedicated testing environment is paying off...
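
    As a loose sketch of that plug-in arrangement (all names made up), each component can expose one uniform self-test hook that a single harness drives:

```python
# Sketch of the "unified tester" idea: each component implements the
# same tiny interface, so one harness can exercise any of them
# without knowing their internals.

class Component:
    name = "base"
    def self_test(self):
        raise NotImplementedError

class Checksum(Component):
    name = "checksum"
    def compute(self, data):
        return sum(data) % 256
    def self_test(self):
        return self.compute(b"abc") == (97 + 98 + 99) % 256

class Parser(Component):
    name = "parser"
    def parse(self, line):
        key, _, value = line.partition("=")
        return key.strip(), value.strip()
    def self_test(self):
        return self.parse(" a = b ") == ("a", "b")

def run_harness(components):
    # The GUI tester described above would wrap something like this.
    return {c.name: c.self_test() for c in components}

results = run_harness([Checksum(), Parser()])
assert all(results.values())
```

    Because every component speaks the same interface, peers can test each other's pieces without reading the code behind them.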

    • I find most stable, well written code I encounter is usually pretty modular and thus, able to be tested easily.

      I have some misgivings about XP, having had a very, very bad experience with it in the past. But I do think there is something there... however, perhaps a lot of what makes XP work is that simply by writing tests first, they cannot help but write implementations that lend themselves to testing as you have noted! The resulting code is thus more modular and less prone to error.

      I still think that XP presents a lot of overhead that might not be necessary given the right people - like, instead of having a thousand unit tests that you have to maintain with the code (one of the nightmares encountered), you have well-written modular code with just a few tests.
  • Most of the time software is bad because inexperienced college grads not trained in testing are hired to code out a project... Win 3.0 through Win95, and WinNT 3 to WinNT 4, are good examples of that approach.

    Pair programming starts with the HR department when they hire programmers for a project, in that one should hire 75% experienced in solving that exact or a similar problem and 25% inexperienced in that area for that specific project.

    But most companies, as you may know, would rather save short-term costs and vastly increase future costs than do it right the first time.

    Contrast this with open-source OS efforts, where testing is a standard process simply because it guarantees the code will compile right on the very lean, sub-standard machines that this group tends to have access to for their work.

    The reason why HP and Microsoft and Cisco went to India to recruit and staff coding centers is that they understand testing just as open source does...

    How to change things in the US? Volunteer to be an informed software engineering speaker at your local university... speak to and instruct the new CS students on testing...
  • Extreme Programming was eating Taco Bell and coding at the same time. /rimshot
  • Write a bunch of code that possibly does something like you think you or the user wants. Let someone else try it out, then fix what they complain about.

    I'm only partly joking. When the requirements are vague, this actually works.
