Software Accountability Made Real? 49
An Anonymous Reader writes "In a recent presentation and post, Kent Beck (Extreme Programming Explained: Embrace Change) highlights Open Quality Dashboards as a means to make software development accountable. Many different approaches attempt to reduce the number of issues creeping in throughout the development process. Whether a shop abides by the rules of up-front UML design, test-driven development, or a methodology somewhere in between, the ongoing burst of popularity of tools enabling continuous integration and frequent releases shows the need for unit testing to appear earlier in the development process. In this context, quality dashboards could well establish a credible benchmark for software accountability."
We do this internally already. (Score:5, Informative)
This makes it a lot easier for developers to do the right thing (and fix these problems). Nothing like a big red bar to motivate you!
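For the curious, the "big red bar" gate is trivial to wire up yourself. Here's a minimal sketch (Python; the command and function names are illustrative, not any particular CI tool's API): run the test suite and map its exit code to a dashboard color.

```python
# Minimal sketch of a "big red bar" gate: run the test suite and turn
# its exit code into a dashboard color. Command and names are illustrative.
import subprocess

def status_from_returncode(code):
    """Map a process exit code to a dashboard color."""
    return "GREEN" if code == 0 else "RED"

def build_status(test_command):
    """Run the test suite and report GREEN (passing) or RED (broken)."""
    result = subprocess.run(test_command, capture_output=True)
    return status_from_returncode(result.returncode)

if __name__ == "__main__":
    # e.g. the stdlib runner; swap in whatever your shop uses
    print(build_status(["python", "-m", "unittest", "discover"]))
```

A real dashboard would publish this status somewhere everyone can see it, but the motivating mechanism is just this: exit code nonzero, bar goes red.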
Re:We do this internally already. (Score:2, Funny)
Wha? (Score:4, Insightful)
Re:Wha? (Score:5, Interesting)
There are a number of interesting benefits to this. The best one so far is that we maintain a 'responsibility trace' right from individual stakeholders in Management, to each requirement, to each design element... we can actually tell who in management has a stake in a particular _block of code_.
The other neat thing is, the execs can make changes all they want. We really don't care. Because we're on a fixed 3-week development cycle (all the way through the cycle each 3 weeks, culminating in a release) we can either say "sure, we'll do that in the next build" or "scratch the current cycle and we'll do that now". In the latter case, we only lose a maximum of 3 weeks work. Not bad at all, and if management complains, well, we can show them WHY we lost 3 weeks. They shut up pretty quick.
Unfortunately, convincing management that the paperwork we end up doing to improve and maintain our process is a Good Thing, is difficult. If we aren't coding, we must not be working, right? Wrong. Now we have nice graphs showing number of defects in our software falling through the floor, time spent fixing defects falling through the floor, developer productivity skyrocketing... It's fantastic.
Bottom line: Management in some places doesn't WANT responsibility. They want to hand down directives from above, and we are the magical little gnomes who make their projects happen at 1/4 of their salary, if we're lucky. If they go sour on a gnome for whatever reason, they want to be able to fire with impunity. Process is the way to make them eat their own crap whether they like it or not. They WILL end up liking it, and you get your life back.
Re:Wha? (Score:2, Funny)
Movie trivia. Which movie is this from?
"Attitude reflects leadership".
Re:Wha? (Score:3, Informative)
However, his solution might work well in many places where feature-creep happens, even when there isn't as much animosity between developers and management.
Re:Wha? (Score:3, Insightful)
I think his solution stinks to high heaven.
You know what writers say? "The story needs to be as long as it takes to tell the story well."
I say that trying to make all software development fit the 3-week cycle is akin to making all software development fit the J2EE Way. I need more flexibility.
Re:Wha? (Score:3, Informative)
A 3-week cycle could work pretty well in a web environment (which is what I work in).
Re:Wha? (Score:2)
The worst part is that the devs think it's normal.
Bitter? Nah...
Re:Wha? (Score:3, Insightful)
Just don't discount the value of _having_ a lifecycle. I never said there was One True Way, I just presented our shop's journey as an example.
If it sounds like we're in chains, you're dead wrong. We've never had more freedom. The hard part is deciding you're actually willing to do what is necessary to get it when you work in an environment
Re:Wha? (Score:1)
Re:Wha? (Score:1)
"Attitude reflects leadership"
"Remember the Titans".
I'd write it as "Attitude reflect leadership", since that's the way it was said, and because I like the combination of bad grammar (or poor enunciation) and sharp insight. Julius Campbell (Wood Harris), a star black player on the team, said it to Gerry Bertier (Ryan Hurst), the white team captain, when Gerry complained of Julius' bad attitude.
Yes, my kids have watched it so many times I can recite most of the dialog. Good movie, though.
Re:Wha? (Score:2, Insightful)
Nobody will pay extra for quality software (I'm talking about business customers; individuals don't have any kind of realistic influence on software development). If you (as a developer) tell a customer they'll have to pay $X and wait Y months for the software they want, they'll just buy it from someone else who promises it at $0.5X and 0.6Y months
Re:Wha? (Score:3)
Re:Wha? (Score:1)
You can have all sorts of measurements, but whether they are accurate or meaningful is another question...
Re:Wha? (Score:2)
Indeed.
"High risk change. Requested at 6.30pm one day, needs to be available by the following morning."
Hmmm, wonder why there's no high-quality code on that project...?
Rise of software liability (Score:5, Interesting)
If General Motors can be held liable for damages caused by a defective car part, some argue that software makers should be held liable for damages arising from buggy code.
Re:Rise of software liability (Score:2, Interesting)
This happened today: a customer calls claiming my software is broken -- he's getting "invalid opcode" messages trying to run the thing. Well, no one else gets the same message. Turns out he's running the software on some crap 386. So, is it my software, or: a) an old Windows 95 so loaded up with spyware, viruses, registry crap from software uninstalled 5 years ago, outdated network clients, etc. that it takes 5 minutes to boot, or b) bad hardware (me
Re:Rise of software liability (Score:2)
Re:Rise of software liability (Score:1)
Re:Rise of software liability (Score:2)
Re:Rise of software liability (Score:1)
Re:Rise of software liability (Score:2)
Re:Rise of software liability (Score:1)
Ok, fine. I'll just gouge my customers up front rather than sticking it to them later by not reimbursing them for patching their systems.
Software shops can then sell protection plans along with the product that guarantees a payout in the event of patching.
Do you wanna pay now or later?
Re:Rise of software liability (Score:3, Insightful)
Do you wanna pay now or later?
Absolutely true! But if a competing software company actually creates quality code, then it won't have to gouge its customers, won't have many pay-outs for defective/patched software, and won't have to sell a protection plan
The most important thing is common sense (Score:3, Interesting)
The most important aspect of development for my team today is requirements reuse; sounds silly, but it works great. By following this simple methodology we have made errors nonexistent; it beats unit testing by a mile in efficiency, and the results match.
Most other teams fail with this approach though, and hard. It simply comes down to what the team is made of; mine loves it.
Re:The most important thing is common sense (Score:1, Insightful)
1) As someone said, "Common sense is not that common"
2) IMO unit testing IS common sense. As a matter of fact, I can't think of a more sensible thing to do than to test units of code before integration. It's a very logical approach that is finally getting the recognition and importance it deserves.
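To make the point concrete, here's about the smallest possible illustration (Python's stdlib `unittest`; the function under test is a made-up example): exercise one unit in isolation before it ever touches the rest of the system.

```python
# Smallest possible illustration of unit testing: exercise one function
# in isolation, before integration. parse_price is a hypothetical unit.
import unittest

def parse_price(text):
    """Parse a price string like '$1,234.56' into a float."""
    return float(text.replace("$", "").replace(",", ""))

class ParsePriceTest(unittest.TestCase):
    def test_plain_number(self):
        self.assertEqual(parse_price("42"), 42.0)

    def test_dollar_sign_and_commas(self):
        self.assertEqual(parse_price("$1,234.56"), 1234.56)
```

Run it with `python -m unittest` and the failures show up long before integration does.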
Re:The most important thing is common sense (Score:1)
Dart (Score:2)
I've seen their in-house dashboards and they're quite impressive. These guys eat their own cooking.
What I don't like about XP (Score:5, Insightful)
I can't recall where, but I remember reading the quote somewhere, "you can't refactor an elephant into a cheetah". I don't think many managers truly understand that...
To me XP/Agile is just an excuse that allows marketing and management to not have to do their job.
Re:What I don't like about XP (Score:3, Insightful)
The fundamental premise of XP is that there are ways to reduce those costs dramatically. A secondary premise is that exposing the costs of those changes at the point of change gives stakeholders more information to evaluate the necessity of those changes.
Re:What I don't like about XP (Score:4, Insightful)
I don't doubt there are ways to reduce those costs dramatically. But reducing those costs increases costs in other places. Change is not free. You aren't reducing overall costs, you're just moving them around. You can simplify your design, make it completely decoupled and resilient to change. But, again, this is not free. These decisions have costs, especially in terms of technologies, performance, etc. XP offloads up-front work onto developers later in the project. They don't tell you that the entire project will cost more and take longer; they leave that part off. Management only sees, "As requirements change, your software changes". Your comment is an example of how XP fools management into thinking that reducing costs in one area does not impact costs in other areas.
Re:What I don't like about XP (Score:3, Insightful)
The fundamental premise of XP is that there are ways to reduce those costs dramatically.
Yes, as long as you keep in mind that (a) those costs are still much greater than zero and (b) those costs are also much greater than the cost of doing the up-front analysis.
Designing and implementing for change is a good thing, but it's still more cost-effective to get it right the first time wherever possible.
IMO, some of the ideas in XP are valuable for every development approach, but they don't change the fact
Re:What I don't like about XP (Score:3, Insightful)
Only as far as the business requirements do not change after the analysis or the initial analysis anticipates changes in business requirements accurately.
That's possible, but I've never seen it happen.
Indeed; that's exactly what XP suggests!
Re:What I don't like about XP (Score:3, Interesting)
I think you're talking about two things here. The first is the reduction of scope of a large project, where we know we'll have to do X, but we'll only do x for now. We'll incorporate into our design the knowledge that we will eventually have to do X. The second is anticipation or prediction of related requirements. As in, I am doing an email client, of
Re:What I don't like about XP (Score:1)
Whether or not you'll be able to get an expert is another question. But if you read Agile Project Management
Re:What I don't like about XP (Score:1)
Ah, I see our differences now. My experience is in building software for specific clients. Yours sounds closer to off-the-shelf software.
I suspect that having greater knowledge of what your target market wants and needs will be helpful, which is similar to what you said earlier. You're right; if you don't have a clear customer capable of identifying and prioritizing actual business needs for the software, XP's planning features won't work very well.
Re:What I don't like about XP (Score:2)
Only as far as the business requirements do not change after the analysis or the initial analysis anticipates changes in business requirements accurately. That's possible, but I've never seen it happen.
Me neither, not 100%. But my point is that doing as much as you can up front to really understand the problem pays off, and the payoff is greater than linear.
If you can get the requirements half right before you start, your project will be much shorter and much cheaper than if you just start building
Re:What I don't like about XP (Score:1)
Even if that were true, it's never possible; not in the real world. Even in a pretend world, the effort needed to get the theoretically correct, up-front-and-for-all-time requirements is prohibitive. In the same theoretical play world where getting it right the first time is possible, you'd still find that the company would be bankrupt and the business problem to be solved would be moot before you were done.
Re:What I don't like about XP (Score:2)
Even if that were true, it's never possible; not in the real world.
Of course not. Not 100%. My point was that you're money ahead by getting as much as you can up front.
In my case, the software development I do is on a contract basis, often for firm fixed price. So I'm very accustomed to having to get the requirements very close to complete and correct up front before planning the development effort (and including some padding for the inevitable changes!).
Re:What I don't like about XP (Score:1)
That comment makes it abundantly clear that you know that it's never right up front. Why pad then? That's either a) cheating the customer, when your estimate turns out high or b) a gigantic risk of your company's profits if you guess low. Do you think the customer doesn't know you are padding? Of course they do -- negotiations up front on scope/price/schedule then end up being a struggle to push the padding one way or another. XP and similar agile methods want you
Re:What I don't like about XP (Score:2)
Customers don't hire my company to get the lowest possible price, they hire us to get a guaranteed-to-work solution, and sometimes to get a guaranteed price. They know that we have the resources to do the job (whatever it may take) and when they sign a firm fixed-price contract they know that we've added plenty of contingency to cover the risks, and that we intend to walk away with a very healthy profit, and hope to have an insanely high profit, but that's okay because they've run their numbers and determined
Re:What I don't like about XP (Score:2)
XP/Agile is a way to capture this process. You don't have
Re:What I don't like about XP (Score:1)
Really Make It Real (Score:5, Insightful)
These are all good ideas, the unit testing, the automated frequent testing, etc.
Having experienced a few crashes of bleeding-edge versions of Evolution and Firefox, with the automated call-back to the developers about the crash symptoms, I got to thinking that having actual use (and abuse) be automatically incorporated into test suites might really abet the development of less crash-prone code.
Despite the capability of automated testing to cover many more features than can be done by hand, new applications have so much context and so many options that we need to test for what the users are actually doing with the application. Not just what we think they're doing, or what we hope they're doing, but what they're really doing.
The most important bugs would be the ones that happen to the greatest number of people the most times.
Harvesting application interactions and sending them back to the test suite has a lot of value, but it's up to the developers to do this in ways that are sensitive to the user's need for privacy, too.
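As a sketch of what that harvesting might look like (Python; the field names and tuple shape are assumptions, not any real crash-reporter's schema): aggregate anonymized reports by stack signature and rank them by total hits, then by how many distinct users were affected.

```python
# Hypothetical sketch of ranking harvested crash reports: the bugs that
# hit the most times, for the most users, come first. The (user_id,
# stack_signature) shape is an assumption, not a real reporter's schema.
from collections import Counter

def rank_crashes(reports):
    """reports: iterable of (user_id, stack_signature) pairs.
    Returns signatures ordered by total hits, then by distinct users."""
    totals = Counter(sig for _, sig in reports)
    users = {}
    for user, sig in reports:
        users.setdefault(sig, set()).add(user)
    return sorted(totals,
                  key=lambda s: (totals[s], len(users[s])),
                  reverse=True)
```

Feed the top-ranked signatures back into the regression suite first, and the test effort lands where real users are actually hurting; user IDs here would of course need to be anonymized before leaving the user's machine.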
thinly veiled ad for a startup (Score:2)
Should you collect statistics on your project, bugs, test coverage, and all that? By all means. And there are lots of tools to do that, free and commercial.