
Programmers work 47 days per year

According to a new study from the consulting firm Software Productivity Research, software programmers spend only 47 days a year developing or enhancing software applications. The rest of the time is spent testing, fixing bugs, and working on projects that will later be cancelled. This might be deemed poetic justice, given that users avoid using more than 10% of an application's functionality for fear that something will break. On the other hand, this could be seen as good news for newer projects: add fewer features but get them right, e.g. a lightweight word processor that imports foreign formats correctly but only has the features most people use. What do you think? Can anyone corroborate the article's statement that 90% of nonprofit organizations in the U.S. cannot afford to maintain more than 15 networked computers?
  • Where I work, the biggest problem comes from requirements not being solid and changing from day to day, sometimes without our even being informed of the changes. It's a moving target... Luckily I'm not so shabby at sniping...

    Other issues include the constant re-organizations and endless meetings. People not taking responsibility for their actions. Idiotic contractors... The list goes on. People are coming and going so fast no one even remembers who they were or what they did.

    Miscommunication across the board. People going around finding out who didn't do what instead of just doing it and getting it out of the way.

    It's a mess.

    And I don't doubt these things happen elsewhere as well. Time is wasted on misunderstandings that planning things out at the beginning would have saved everyone at the end.

    Okay. I *think* I'm done ranting...
  • WRT the simple vs. complex applications: Isn't that the Unix philosophy? A number of simple, small utilities that work together to perform complex tasks! Edit a file with VI, check it with ispell, convert to the format of your choice with any number of utilities. Granted, modern users don't like the command line, but couldn't we create GUI programs that allow the same interoperation?

    Jeremy
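The pipe-style interoperation Jeremy describes can be sketched in miniature. This is a toy illustration only, in Python rather than shell; the filter names below are made-up stand-ins for small single-purpose tools, not real programs:

```python
# Toy sketch of the Unix philosophy: small single-purpose "utilities"
# composed into a pipeline, like `cmd1 | cmd2 | cmd3`.
# All three filters are hypothetical examples, not real tools.
from functools import reduce

def strip_blank_lines(text: str) -> str:
    """Drop empty lines, like a tiny grep -v '^$'."""
    return "\n".join(line for line in text.splitlines() if line.strip())

def lowercase(text: str) -> str:
    """Normalize case, like tr 'A-Z' 'a-z'."""
    return text.lower()

def word_count(text: str) -> str:
    """Count words, like wc -w."""
    return str(len(text.split()))

def pipeline(*filters):
    """Compose filters left to right; each one reads the previous one's output."""
    return lambda text: reduce(lambda acc, f: f(acc), filters, text)

count_words = pipeline(strip_blank_lines, lowercase, word_count)
print(count_words("Edit a file with VI\n\ncheck it with ispell"))  # prints "9"
```

A GUI could expose the same idea by letting users chain such filters visually; the composition model is what matters, not the terminal.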

  • In practice, I find that the complexities of meeting user requirements add major challenges to the job. The alternative would be writing hard-core mathematical software, which needs raw brainpower to determine efficient algorithms. At least users are friendly, buy you coffee and let you beat them at snooker - and are also someone to moan to when you need to let off some stress.

    The other benefit is that because the users don't even know their own business processes half the time, you can make a discernible impact on the business itself by both making recommendations to them and also just making decisions on their behalf. I like having that level of influence, and it's a nice side-effect of the job I do.

    I guess I could get a job that is just making business decisions, except that writing software is far more fun...

    ~Cederic
  • Coding is a minimal part of design work such as software development. It is like the time architects spend drawing up the plans for their designs.
  • 47 days a year performing the tasks that're in our job description, I can believe. However, I'd bet a big stack o' money (that I don't have) that we spend, and are expected to spend, a lot more time working than most other professionals.
  • Just a few comments-to-your-comments:

    Not to get into a methodology war, but RUP is iterative at a fundamental level, and perhaps like XP's "user stories", the fundamental requirements building block, if you will, is the "use case".

    The point I was making in my earlier post is not so much that modern software methodologies don't have methods of measuring productivity - The ones we've mentioned do. My point is that the culture of software management is often one that is uncomfortable with these metrics. What most software managers (at least the ones I've met) want to see is source code gathering in repositories, not use case diagrams or insert_your_methodology_objects_here.

    As for having quick access to users and/or requirements generator folk to help me with requirement clarification/disambiguation, I've found that this only happens under two conditions:
    1.) The software you're developing is so overwhelmingly wonderful and necessary to the user that they are dying to help you develop it. (Don't laugh! It's happened to me! :-)
    2.) Some upper level manager tells the users/requirement generator people that unless they get completed software by Day X, they should seek other employment. (This has happened as well...)


  • The reason they are out of date is that a lot of the testing has been taken over by QA departments, at least after Unit Testing.

  • 42 is the answer. 47 is everything else.
    ----
  • Part of the problem (perhaps most) is that so many of us are maintaining applications and systems that have hung around since the dark ages. We inherit this junk and spend most of our time trying to get it to do things that were never part of the original concept. Few organizations will authorize the rewrite of a functioning system, even if it's riddled with bugs. Nor will they give us the time necessary to 'do the "Raid" thing' and exterminate the bugs.

    Another part of the problem is that if we waited around until the users gave us enough info for complete specifications, NOTHING WOULD EVER GET DONE! Sheesh folks, we're SOFTWARE ENGINEERS, we write code that automates what's in the users' heads! If the user can't (won't) explain what they do, we can't implement it...

    I'd LOVE to be able to wait for complete specs, but if I did, someone else would soon have my job...

    Capitalism - a horrible system, but simply better than anything else that's been tried...
  • At least in my current experience, I don't think it's so much coders not planning their code out; it's schedules artificially compressed by mgmt at the expense of high-level design, etc., which would prevent a lot of the back-end cost of debugging this crap. There's too much pressure to just sit down and start coding.
    anthony
  • Fah, neither management nor the customer actually has any idea what they want until they see what you have done and then utter the fatal words...

    "Well, that's almost right."

    At which point the customer is generally able to point out the ways in which your software is wrong. This is perfectly normal. If your customer knew anything about software design, they could write the software themselves. They don't, however, and so they don't categorize the million and one exceptions until your software somehow magically fails to divine them from thin air.

    That is actually one of the reasons why Open Source hackers "scratching their own itch" are such an effective programming force. They happen to know what they actually want the software to do the first time that they sit down to write it. It's also why planning to "throw the first one out" is such an important concept. Unless you are writing software for yourself, you will almost certainly misunderstand the customer the first time (or, more likely, they will fail to give you enough information to accurately understand the problem).

    Planning needs to happen, design needs to happen as well, but feature creep will happen (especially with new projects where the programmers are not expert in the particular field).

  • . . . or maybe you're writing on a broken API like MFC, and due to project constraints, doing otherwise isn't an option.

    Maybe programmers spend the other 318 days a year working around MFC bugs.

  • then you might not be spending enough time on solid design and correct coding, or even just determining requirements before working

    Remember that most programmers are not the ones coming up with the requirements. I don't know how many times we have something planned out at the beginning, start building, and then every few days someone from marketing or biz dev calls and says "oh yeah, we need this too", or "this has to do this now". Every time this happens, you can't throw everything out, redesign, and start over, or you would never finish. So you adapt the old code to fit the new requirements or enhancements. Unfortunately, since you didn't plan for the new reqs, there will be unforeseen problems.

    The more mature a project, the more bugs there will be.

  • have fewer than 15 people in the office needing a PC. Just a thought.
  • And yet some of these packages still manage to be quite robust. For example, when was the last time that you crashed your Linux box (not on purpose :)?

    I have been using Emacs since 1995, and am constantly amazed by the next weird thing someone has gotten it to do. For example, did you know that some freak show wrote a GNUS backend for Slashdot? I can now read /. using both my GNUS scoring system and the /. moderation system. That's insane!

    And yet in all the years I have used Emacs I have never had it crash on me. My journal is a 200-page LaTeX document written primarily in Emacs, and I have written who knows how many lines of code, in several languages. I use Emacs as a debugger, a mail client, a web browser (well, sometimes). I even play the built-in games. And yet I have never ever crashed the darn thing.

    Of course, Emacs is pretty modular (read, there are a whole pile of Lisp packages). But it certainly goes to show that a program can be big without being a bug haven. Sometimes the next cool thing really is the next cool thing.

  • A friend of mine who used to work at Apple was in a program where they helped inner city schools set up their networks, etc. I think this was organized by the corporation. Does anyone know of anything similar that I could do to volunteer some of my time to help out in San Francisco schools? You'd think there would be programs like this.
  • 47 [47.org] just makes a lot of sense [47.net]. I can't imagine it being any other number.
    ----
  • Simplicity: One function
    Simplexity: Many ways of doing one thing
    Complexity: One way of doing many things
    Complicity: Many functions

    I can usually deal with the first until I'm bored, appreciate the freedom of the second, respect the wisdom of the third, and I absolutely abhor the fourth.

    It's not fear of breaking that gets people to not use functions. It's the utter disorganization involved. Most products are built so that an illiterate could learn each thing in a day and do just that. The problem is that even if you're somewhat intelligent, the big easy books make things impossible to recall. Granted, no hacker would pride him/herself on his/her memory, but on his/her skill. Thing is, you don't even have the ability to record enough to make a picture of what you're learning and to organize it.

    Leave the basics to the beginners.

    If you're doing something a bit more complicated, step up to the plate and learn some insights into the software before you go further. The same goes for devs. Don't dumb it down. It turns into crap. You end up making computers think the way humans do and react the way computers do, instead of making them react the way humans do and think the way computers should think.
  • How on Earth can you say that? That was and still is one of the best things invented. If only modern IDEs, such as JBuilder, supported a VI mode. My productivity would be 2 times higher.

    VI still cuts it.. VI lives.. VI is the best.... well.. YEAH!!


  • i am a "consultant" but i'd say i spend much more time resetting passwords and fixing things than i do planning
  • I have a fairly popular local non-profit as a client and I can testify that the funds game is never-ending and tiresome. Their director spends most of his time finding cash instead of running the place. They have 10 computers: 4 very old ones and 6 brand spankin' new ones bought with donations from local politicians and such. Very touch and go for them.

    bm :)-~

  • by leppi ( 207894 ) on Wednesday November 29, 2000 @02:29PM (#593247)
    I can definitely vouch for the fact that non-profit orgs cannot afford to stay "high-tech", to say the least.

    My mother is the director of operations at a school for handicapped children that just begs to have a sysadmin. They are now even in multiple sites (with two separate small networks), and would like to have those networked together in some way (right now they use dial-up networking in Windows on a one-by-one basis). They have people that would like to dial in from home, etc., but they just don't have the resources to manage a small-to-medium-sized network.

    I used to "run" their network, which is just a simple file-sharing network with no real servers (in my spare time). But they have set up Microsoft Mail on their own, and are frustrated that it isn't easier to maintain. They are just not technically oriented people, but find great benefits in using tech.

    The real problem is the pay. Most of the people there are there because they want to help the less fortunate, and often take pay cuts. There just does not seem to be a great desire to lend one's technical knowledge to them. I help when I'm home, but since I have moved away, they are just stuck. I have a feeling that it is the same for most non-profits out there as well. You don't make the bucks, so you can't hire the help. People can donate their time, but the people with the proper skills (usually younger people) are not willing to help.

    Just my observations, and encouragement to get involved in a cause that you think is worthwhile. They could definitely use someone who would rather hammer at a keyboard than hammer a nail.

    /db

  • So if I work 47.5 days a year, I can get a pay raise?
  • ...is because when you get to 16, "office work" becomes 8-hour-days of 8-on-8 Counterstrike.

    Huzzah!

  • I got a B in economics in college, so maybe my crack habit hasn't affected it (too much).

    The way I see it, if only 47 days out of 365 are put to innovation... that's about 13%. That's assuming all programmers do is drink Jolt and are handcuffed to their cubes... oh, I was supposed to make something up for that analogy... forget it. 13% of anything sucks. Now keep in mind IANAP (programmer), and I know that it can never be 100%. The only point I'm trying to make is that I think as the IT market matures (20+ years) we'll see this ratio get higher, because there will be less standing in progress's way, like the Execu-droid 2000 that listens to the consultants instead of you and asks you to bring in your own copy of a Visual C compiler. Also, IANAE (economist)


    "Me Ted"
  • Discrepancies of several hundred dollars were common. The rule was, any gap less than $1k is not worth auditing.
    Sorry, but this says nothing about Excel. My wife used to be an auditor (she's a financial analyst now). From what she's told me, the standard rule for any system is that gaps less than $1k are not worth auditing.
  • Comment removed based on user account deletion
  • Technically, only private universities, and only some of them, are non-profit. There are for-profit universities. Also, state-funded universities are not non-profit organizations; they are publicly funded (i.e., they are a part of the state government and therefore don't fall into the non-profit category)
  • (j/k)

    (o/t)

    Hmm, it's a while since we've seen Sengan round here... he got booted, didn't he? (Can anybody remember what for? Something to do with Iraq if I recall?)

    Anyway, welcome back!

    Cheers,
    Alastair
  • by m00t ( 256995 ) on Wednesday November 29, 2000 @02:31PM (#593255)
    Something similar to this happened at my high school. There wasn't a large budget for the tech program at the time, so one of the teachers recruited students to learn and set up the network and computer systems in the district. He got it to the point where students could receive credits for doing it. He taught the 'Tech class' on his own time for no pay. The district ended up driving him away when he asked for some compensation, and replaced him with 3 people who knew less and were *each* paid more than what he was asking for.
  • Oh my, check out the moderation on that post: funny, informative, underrated, redundant, troll. Looks like the moderators are having turkey hangovers today.
  • by jfedor ( 27894 )
    I know a certain non-profit organization [distributed.net] that maintains a helluva lot of networked computers... :)

    -jfedor
  • My father worked for Bonneville Power Admin. for 9 years in the Long Term Study program. He had models of stream flow with over 100 years of data, all in the form of multi-hundred-MB Excel spreadsheets. He said the margin for error was less than 1/100th of a percent. For the most part, Excel sucks only as bad as the operator!
  • Aside from the fact that a product is just no good unless its bugs are fixed (assuming it has bugs), it's really sad that projects that seem promising and represent a large investment of time get cancelled.

    But about making and selling applications with simpler, smaller feature sets, I think you're going to come up against two kinds of reality.

    • The first is the fact that maintaining mostly essential features is a good thing (yay vi or bbedit [barebones.com]) for the user and the programmer (especially in terms of memory, disk and cpu usage).
    • The second is the fact that no average consumer is capable of passing up a feature, no matter how detrimental to their overall memory, disk, and CPU usage, or their stability. Besides the popularity of Word, just take a look at an average permanently connected Windows box, and see how many little pieces of crap software they're running at startup and in that little tray area that they just don't need, no longer understand, and that often cause them to wonder why they keep getting error messages from this or that cryptic application name. They see a cool Windows feature/extension on the net, they grab it, they forget.

    -Daniel

  • I don't charge by the hour; that is my retainer. If my stuff broke and it took me 20 hours to fix it, I would get $200 for that month. I'm good, but not that good; I do have the good sense to choose good tools. This is why I can make $200 on retainer and barely work. OSS rules for this reason.
  • You are not only rude, but you also lack a minimum sense of humor. I don't believe very much in rules of thumb; that was my intention when I tried to add a little sarcasm by saying "we don't drink coffees at all".

    And I mentioned this book because it was referenced in the survey. It also mentioned that Brooks (his surname is "Brooks", not Brook; read the cover page, not only page 19 :-) was given the Turing Award, so he _should_ be right.

    So, take it easy, guy, we all know you've read the book. Indeed, although I am not William Shakespeare (nor Cervantes), I can read (and write a little) English.

    --ricardo

  • Having also (alsø alsø wik) worked for a NP [aapt.org], I know first-hand that talented network admins are extraordinarily hard to hold on to. The turnover is massive. I stuck it out longer than most, and I was only there for a year. At less than 30K a year, someone was bound to outbid the NP for me.

    On the other hand, I wouldn't mind volunteering my time to work on a NP's network. Of course, it has to be a Mac network. Or a Linux network. Either so easy to set up, you can show someone else, or so bullet-proof, you set it up once and forget about it (Set it and forget it, Ron!). Windows networks are designed to keep sub-par network admins employed.

  • ...and on the 48th they rested.

  • by Thomas Wendell ( 98443 ) on Wednesday November 29, 2000 @03:58PM (#593266)

    While it may be true that most Word users use only 10% of the program's features, they don't all use the same 10%. If you were to create a simple word processor that only had 10% of Word's features, which 10% would you pick? The 10% you use?

    It's the obscure features that make a product popular and hard to displace. A small fraction of Word's users use Word to create documents with complex mathematical formulas, but those users have to have something like the EQ field or the Equation Editor. Most users have no idea what either of those features are, but most users have some backwoods feature they do depend on to make better looking documents quickly. They will never willingly switch to a product that doesn't have their pet feature.

    The same is true of other products. Have you ever looked at a new email client that did some useful things your current client doesn't but lacked some minor feature you're sure you can't live without? Did you willingly switch anyway? Most users won't.

    Simply requiring that this simplified word processor does a good job of importing documents requires that it be complex. When you open a document in a product that doesn't support the original application's full feature set, and your document doesn't come through looking just like it did originally, do you say, "Gosh, I guess I shouldn't have used a feature outside of the universal 10%" or do you say "this product and its crappy import filter are useless, give me the real thing even if it is bloated and cumbersome."

    Complex cumbersome apps are a big problem, but it's hopelessly naive to think the solution is to just write a new product with fewer features. It doesn't work. Even Microsoft tried that strategy once - Microsoft Write for Macintosh. It was a trimmed down version of Word 3.0 that sold for about half of Word's price. Sounds great, right? But Microsoft couldn't give it away.

    If you want to design a new product to replace a high-powered popular product, you need to build something that supports a superset of the feature set and does it in a more elegant, intuitive manner, where "intuitive" means "easily figured out by current users of the big ugly product". Anyone who thinks that's easy to do has never tried. Of course there's a $billion a year waiting for anyone who can do it.

  • The "magic bullet" for those issues is requirements analysis and requirements, design, and code reviews. Requirements that are supported by test cases, peer-reviewed code, and configuration management by technical experts would go a long way towards making sure that the right code is being written. This sort of falls under the communication that you mentioned, though.

    One possible exception: when the user decides they want things to work in a different way after you've already written the software once. That's more of a feature creep problem than a real bug, though, and it's basically management/business people's jobs to deal with that sort of thing.

  • Can anyone corroborate the article's statement that 90% of nonprofit organizations in the U.S. cannot afford to maintain more than 15 networked computers?


    As IT administrator for all nonprofit organizations in the U.S., yes I can.

  • 47 hours per week, plus another 20 reading tech journals and books, 20 more reading/researching online, and 10 more hours per week mulling over problems in the back of your mind. 100 hours per week seems right.
  • The rest of the time is spent testing, fixing bugs, and working on projects that will later be cancelled.

    This is kind of a muddled analysis. If you work a lot on projects that get cancelled, then this is a problem with your management. If you spend time dealing with poorly written vendor products, as illustrated in the anecdote at the beginning, then this is a problem with the vendor. If you spend a lot of time debugging, then, well, that's programming for you. As for testing, are they saying people should test less? Then we'll have more of the second problem, except you'll be the incompetent vendor to someone else.

    I'm not exactly sure what this article is about. Bad software? Bad management? That programming is hard? You can't really address a problem if you don't know what it is.

  • Very good point. By that argument, quality control people and business consultants don't work at all.

    Well maybe they're right about that last one ;)

    Doug
  • One thing comes with simplicity: the inability to fix more complex problems. The article keeps mentioning how software should be designed for basic use, and that the same software must be bug-free. However, the issue of power users not being able to use simplified software has been addressed (one program to do all the tasks, rather than 50 that each perform one, is much better after all); and simplification also requires more coding. Design is another matter: programmers shouldn't be called on to do it if people complain so much. As for coding, much software requires special features for different users. If some installation or process requires automation, you get more code and more bugs in the end. Finally, simple design is not always best. Case in point: I help out with the Macs at my school, and while they may be simple for using software, it is a pain to install special hardware. All the problems mentioned turn up sooner or later; it's a matter of how easily they can be fixed.
  • Not only are testing and fixing bugs considered work, they are also considered to be an integral part of software engineering.

    The last thing the industry needs is the belief that programmers should only code... oh wait, that is the belief...
  • Having worked for Blackbaud [blackbaud.com], the largest supplier of fund-raising management software, I can attest to the fact that non-profits generally do not have more than a few networked computers and rarely have in-house IT staff. Often, they have no one on staff who is more than rudimentarily computer literate. One of my job functions was to restore databases for examination by the programming and QA staff, and I can't count the number of times that I received a tape from a client that was marked "return ASAP" because it was their only backup tape. It was infuriating. Now I'm a private consultant and I don't LET my clients get into such a situation. :)
  • I marvel that if Microsoft is so interested in innovation, they can't "integrate" a basic word processor with spell-check into the operating system like they can a browser. They must be too busy inserting Pac-Man and Flight Simulator easter eggs into their products to give a damn about what most consumers would like. Microsoft's position is that you are entitled to get the features you want integrated into the operating system. More people want a simple spellcheck-capable word processing program included with their OS than want an integrated browser. Microsoft is in effect saying that it is OK for the consumer to get what they want bundled in their OS for free. That means it is OK to pirate Office.
  • My, this really sounds stupid, to consider coding and testing/debugging as two distinct tasks.

    As far as I'm concerned, I'm gonna spend the next 365 days writing code non-stop, adding functionalities each time I think of any, and I won't even try to compile or run the software until these 365 days are over. Then I'll send my code out on the Internet so that other "less productive" developers test and debug it, and because it will obviously turn out that all this code is an awful mess that can't be debugged, I'll claim that the rest of the world is outrageously unproductive. Deal?
  • > Either one of the figures lies, or we work more than 11 man-months a year, or we don't drink coffees at all.

    I hate to be rude, but another alternative is that you can't read. What you didn't quote is the sentence immediately preceding Brooks' list. It reads "For some years I have been successfully using the following rule of thumb for scheduling a software task:". You'll note he says "I", not "most people", or "the industry in general". If most people followed more of the advice in TMMM then perhaps we'd all be better off, and the figures in the survey would be rather different.

    Mike.
  • by plague3106 ( 71849 ) on Wednesday November 29, 2000 @05:03PM (#593325)
    Yeah, it's funny how they say that the bugs shouldn't even be there, yet they also say that testing is a waste of time. By that logic, it's also a waste of time to test circuit boards, cars, or just about anything else you buy. Computer programs are among the most complex things ever designed and 'built' by man. As such, it's expected that large and complex things are not 100% by release, and they obviously need to be tested. I got halfway through the article before I couldn't read any more; it's too absurd, and written by someone with little to no clue.
  • > What other 1/6th were you talking about?

    These are Brooks' figures, given after this sentence: "For some years I have been successfully using the following rule of thumb for scheduling a software task:".

    So the answer is: it's a rule of thumb, giving general quantities. An even more pedantic answer would be to point out that the values are for *scheduled* work, and therefore he is free to leave a certain amount of time un-scheduled to allow for project flexibility.

    Mike.
  • I'm spending more of the time debugging and cleaning things up than I do when I'm first writing the code.

    No offense, but if that's the case you're not doing it right. You need to do some design before you code, or at least do a prototype that you throw away. Coding straight out of the gate will only lead to endless nights of debugging massive piles of spaghetti code.
  • I'd pretty much concur. Most of the NPs that I've worked with have fewer than 5 computers. In fact, 90% of all BUSINESSES may need fewer than 15 computers (most businesses are SMALL; many just need one or two computers).

    Big nonprofits, like the Smithsonian, are the exception rather than the rule. They may account for lots of computers, but they're only one entity.
    `ø,,ø`ø,,ø!

  • I don't think Quality Control (QA) has much to do with debugging, but everything to do with testing. Debugging is an attempt to track down a problem, is almost always done by the developer in response to a bug (probably submitted by QA). Development and debugging go hand in hand. Now testing is another matter. Having developers working on testing can be a costly waste of resources.
    This is a very interesting point, but I don't think it will help with the problems the article describes. The article is partially complaining about how software is not easy to use, but any powerful set of tools will necessarily be more abstract. You may remove the feature bloat and allow all technical people to be competent in a wide range of tasks, but stupid or unskilled people will still be stupid or unskilled.

    I think there is a kind of Church-Turing thesis which applies to this: you can choose between using a blender or using a computer. A blender only does a few things, but a computer's interface must be a programming language (or else it's a blender).
  • by gallir ( 171727 ) on Wednesday November 29, 2000 @02:49PM (#593343) Homepage
    This is not a new topic; it is already studied in Brooks' book.

    On page 20 it says:

    1. 1/3 planning
    2. 1/6 coding
    3. 1/4 component test and early system test
    4. 1/4 system test, all components in hands
    That yields 1/2 for testing and only 1/6 for programming. If we assume that programmers in small companies and start-ups do at least the first three phases (I guess the fourth one is for customer Alpha/Beta), that gives us:
    1. Planning: 94 days
    2. Programming: 47 days
    3. Component Testing: 70.5 days
    Total: 211.5 days/year

    Given that a working year has about 45 weeks * 5 days = 225 days (11 man-months), that means that, according to Brooks' distribution of software engineering work, those programmers only spend 0.66 man-months a year on coffee, reading, research and so on.

    One of the figures lies: either we work more than 11 man-months a year, or we don't drink coffee at all.
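    The arithmetic above is easy to check with a few lines of Python (a minimal sketch, assuming the article's 47 coding days correspond to Brooks' 1/6 coding share):

```python
# If 47 coding days are 1/6 of total effort (Brooks' split),
# reconstruct the rest of the year from the same fractions.
coding = 47
total = coding * 6                  # 282 days of total effort
planning = total / 3                # 1/3 planning       -> 94.0 days
component_test = total / 4          # 1/4 component test -> 70.5 days
first_three = planning + coding + component_test   # 211.5 days

working_year = 45 * 5               # ~225 working days (11 man-months)
slack = working_year - first_three                 # 13.5 days left over
slack_mm = slack / (working_year / 11)             # ~0.66 man-months

print(first_three, slack, round(slack_mm, 2))  # 211.5 13.5 0.66
```

    So, taken at face value, Brooks' split plus the 47-day figure leaves only about 13.5 days of slack in the whole year, which is the poster's point.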

    --ricardo

  • Well, a lot has to do with whether or not the non-profit is swimming in cash, and where their priorities lie. Most non-profits can get semi-decent hardware donated. But this frequently doesn't include any software licenses, so if they want to be legal and run Windows, they have to pony up the cash. And the creep of new applications puts the pressure on to upgrade applications, which sometimes means a new version of the OS, plus better hardware.

    Some non-profits have a very healthy cash flow, and are able to take advantage of technology, and spend money accordingly. Others recognize the potential benefits of technology, but don't have the cash to put it in place. As an example, animal shelters can make great use of computers, but if the available money can either go to computers, or to pay someone to shovel the shit out of the pens, well....the shit shoveling has to be done, the other stuff can be done the old-fashioned way.

    Keep in mind that many non-profit jobs are low paying, and you don't always get the most computer literate people. Turnover is frequently high as well. This leads to a training nightmare, along with system problems caused by clueless users, and supposed problems that are really just the user making a mistake.
  • by Cyclopatra ( 230231 ) on Wednesday November 29, 2000 @03:02PM (#593347)
    Don't assume that the people working at nonprofits can manage their own computers. My mother is a director of a family of nonprofit women's clinics, and I just spent half an hour last night explaining to her (again) that when you're typing in an exact URL, you type it in the address bar, and not Yahoo!'s search box. She's considered one of the *more* technically-savvy people in her organization.

    All of this means that when something goes wrong with their 4-computer network, they have to hire in an MCSE at $200/hr to fix it (which he usually can't; I think he knows less about networking than your average gerbil) because there's *no* way they're going to afford (or justify to their BOD) a full-time network technician. Their accountant is studying for her MCSE, but she appears to be learning even less than their current computer rodent knows.

    As for buying new computers, forget it! My mother has finally convinced her boss to replace her 486, after she made him sit down and watch how long it took to open Word, *and* got a quote proving it would be cheaper/easier to buy new than to upgrade the 486 into something useful. They barely make payroll most months, they certainly can't increase the size of their network.

    oh, and by the way - their board of directors are such technophobes that one of them adds up the financial report every month on his pocket calculator, because he doesn't believe Excel will add numbers right. Good luck convincing them to free up funds for a shiny new network (or even a slightly dinged one), even if there were such funds. I'm donating my old Celeron 266 to them in a few months, just so I don't have to look at the pieces of crap they're using now.



  • It is perfectly valid and is in fact a major component of any programmers job (and it SHOULD be as well).


    I'd have to disagree with that. If a programmer is doing his job right, then he should spend time at the beginning setting up tests, but that's mainly it. Spending a lot of time after the fact trying to ensure the job was actually done correctly, or trying to track down why it wasn't, should not be a major part of his job.

    I know XP [extremeprogramming.org] takes this to the extreme, but the basic principle holds true. If you're spending too much time debugging, then you might not be spending enough time on solid design and correct coding, or even just on determining requirements before working.

    In other industries this is known as "measure twice, cut once" and the like.

  • It has a simple(ish) core, tons of plugins, and is used by web designers, and others, all over. And it's available on just about everything.

    Not only that, but while new versions may not be available on Unix, Photoshop is available in older versions for Unix. Not being a pro, and having used both, I find GIMP does everything I need.
    ----------

  • by Kristopher Johnson ( 129906 ) on Wednesday November 29, 2000 @02:49PM (#593362)
    One of the points in the article is that lots of this "work" is finding and fixing bugs that should never have been there in the first place. Capers Jones and other software engineering gurus believe that if the proper techniques are used, the number of bugs can be cut drastically from the current norm. So, such bug fixing is not "productive work", it is "wasted time".

    It's certainly true that testing and debugging are part of a programmer's job. But it would be nice if they didn't make up 90% of a programmer's job.

  • The thing about users only using 10% of an app is probably true. The problems with using this 'fact' to promote less featureful programs are as follows:

    • Buying decisions are always made on the basis of highest capability.
    • They all use a different 10% of the features

    Why do you think manufacturers put bullet points all over the box? Features are the only easily quantifiable selling points.

    Also, manufacturers are motivated to reach the widest possible audience, to get the highest return on their development cost. Additional features are perceived to cost very little, and thus it seems attractive to add features to reach more market segments.

    The only solution to bloatware is to successfully communicate the actual cost of adding features, in terms of exponentially increasing debugging time and, inevitably, more bugs at release.

  • The same researchers who came up with this study went into other fields hoping to improve efficiency, and came up with these remarkable statistics:

    - In the food service industry, waiters and waitresses are spending a distressingly small percentage of their time serving food. In fact, more time on average was spent simply taking the order. The median time to get a meal to a customer was just 10 seconds, showing that the food servers have mastered the art of moving food from a tray to a table, but are still lacking in doing this efficiently.

    - Engineers were found to spend a considerable amount of time "testing" their potential designs, hurting their cost-effectiveness when their products could have gone straight to the market without this in-between step.

    - In a very disturbing trend, medical doctors seemed to be treating their patients only a remarkably small percentage of the time. However, new techniques may make the "diagnosis" stage a thing of the past. Future patients may be treated for a disorder before they know they have one (including glitches like broken bones, decapitation, ebola, and autoerotic asphyxiation). The fact still remains that doctors have a 100% failure rate among patients when viewed in the long term.
  • by barleyguy ( 64202 ) on Wednesday November 29, 2000 @03:04PM (#593366)
    software engineering gurus believe that if the proper techniques are used, the number of bugs can be cut drastically from the current norm

    This is a topic we spend a lot of time discussing where I work. There are adamant opinions on both sides of this issue. The question is: what magical technique are you going to use to eliminate bugs?

    Most of the techniques that are commonly used help eliminate crashes, deadlocks, interface problems, etc. But they do NOT ensure that a program does what it is supposed to, or that it is intuitive in the eyes of the user.

    What good does it do for a program to be stable if it adds together 2+2 and gets 5? And as the article in this thread pointed out, just because a program is stable doesn't mean the user understands how to use it.

    My basic point is that nobody has found the "magic bullet" yet. Even if we use some magical tool to ensure that a program will not crash or deadlock, the computational logic and usability are not guaranteed. Only a human being can check those types of things. So in the end, magical techniques are no substitute for simply being a good programmer (or a well-communicating team of programmers).
  • I don't think they're trying to say that time spent debugging isn't considered "work" - obviously it is, as fixing problems to make a stable product is an extremely vital activity (as we all know). What I think they're trying to say is that a lot of our debugging time is spent dealing with stuff that shouldn't be a problem in the first place (e.g. typos, poorly implemented standard algorithms, hurried functionality additions, etc.), and thus time which could be spent doing "real work" (e.g. implementing new, fast algorithms, figuring solutions to latency/concurrency issues, etc.) is instead wasted on fixing simple problems.

    Canceled projects can also be considered wasted time, IMHO, because in the end there is nothing to show for the work that was done. Sure, your team did a crapload of work, but if someone scraps everything you've done, what was the point in having you do it in the first place? Was there not something more productive you could've been doing than working on what is essentially trash? Yes, perhaps your team has gained a bunch of new skills, and that's great, but usually they can also do the same thing in something called "training."

    Personally, I think there is a whole ton of wasted effort going on in the working world, but everyone involved is in such massive denial about it that nobody attempts to do anything about it. While I think this study is a bit extreme in its estimates, I hope that it induces some change for the better in terms of productivity.

  • by Anonymous Coward on Wednesday November 29, 2000 @03:08PM (#593374)
    There are a number of 80-20 rules; this is one.
    • Of the total spent on debugging, 20% goes to killing 80% of the bugs.
    • The remaining 20% are really tough bugs, and take 80% of time.
    • In a business, 80% of the profits are produced by 20% of the employees.
    • 20% of a business's customers create 80% of the problems.
    • Writing the program is 20% programming, 80% debugging.
    • Your boss gets 80% more pay than you, but does 20% less work.
    Just to mention a few. And you can always create more:
    • 80% of trolls on /. are posted by 20% of the people
  • said "There are lies, damn lies, and Statistics"?

    I mean really, 90% can't manage more than 15 computers? I can do 5 at home by myself..
  • by 2nd Post! ( 213333 ) <gundbear@pacbe l l .net> on Wednesday November 29, 2000 @02:18PM (#593379) Homepage
    This sounds sooo much like the old CISC vs RISC arguments!

    Someone needs to develop and outline a RISC UI environment and tools; Reduced Interface Set Computing, or something.

    A set of small, fast, reliable, easy-to-develop, easy-to-debug, easy-to-enhance tools. The base for this exists in the GNU toolset... but it has to be applied to a bigger base.

    An image processor that handles photos, web prep, and printing for the average consumer, without continually adding features or cruft that users don't really use. Leave the hooks for plugins, of course, to enhance and extend it... but leave the core simple, small, and reliable.

    Winamp is something very similar for songs and sounds!

    Mozilla could stand to be something similar. Is there already too much cruft in it? Mozilla-PARED?

    Word processors? VI doesn't cut it, as much as I like it. A Wordpad++ or something like that.

    Anyone agree?

    Geek dating! [bunnyhop.com]
  • by cra ( 172225 ) on Wednesday November 29, 2000 @02:18PM (#593381) Homepage
    I am a programmer, so I know for a fact that this is not right. 47 days is the time reported spent on programming. About 70-80% of that time is wasted on stretching lunch breaks, surfing the web, chatting and other things.


    ---
  • The URL for the story on "Programmers work 47 days per year" has changed from http://www.latimes.com/business/columns/techcol/todays.topstory.htm to http://www.latimes.com/business/cutting/20001127/t000113753.html
  • by ZanshinWedge ( 193324 ) on Wednesday November 29, 2000 @02:19PM (#593387)
    Why is testing and debugging not considered "work"? It is perfectly valid and is in fact a major component of any programmer's job (and it SHOULD be as well). Programs would be a lot better if there were more testing and debugging. In other industries this is called "refining" and "perfecting".
  • Eg: a light-weight word-processor that imports foreign formats correctly, but only has the features most people use.


    One of the things that makes importing foreign formats so difficult is that you almost have to support every feature the foreign product supports, or it won't import correctly. Don't support watermarks? Funky table alignments? Post-it-notes? Then it's never going to import 100% correctly.

  • A few, admittedly highly cynical observations, respectfully submitted:

    This article is the typical mishmash of clueless quotes, half-baked whines and self-serving statistics that we all know and love from our friends the paparazzi. It's no better or worse than hundreds of others. I believe the slashdot community is one of the few that knows this; it's not really worth much of our time responding to these kinds of silly know-nothing opinions. That said, I'm going to anyway dammit...

    One truth (apparently accidentally) embedded here is that most projects should be killed at inception. As most of us know, projects are started by users with no idea what they want, which is frankly understandable, to a point. They are then ably aided and abetted by the common variety of clue-challenged technology management teams, too weak to question them or help them figure it out. There's no motivation to challenge or help the users get focussed, because most managers are never measured or held accountable for the time they waste. Why not? Because managers, being humans (yes it's true!), are very careful not to measure the number and dollar cost of their failures.

    I believe that an innovation we need is not some new language, tool or OS (even Linux), but a simple and practical project-killing methodology. Basically, software development teams need to consider all projects a waste of time and money until they are proven to have a viable business case and a clearly achievable design. Middle management opinions should be considered inherently suspect. Failing all else, they should document their concerns and make sure these are noted by the CFO when the inevitable failure occurs. If you ever try to do this, you will probably be told it's not your problem and the users/leaders/daddy knows best. If you ever hear this, it's a huge red flag and clearly bullsh*t, given the failure rate. I consider this a perfectly ethical response to gross mismanagement, for which developers are incorrectly blamed.

    Remember, the fox is guarding the chicken coop. I believe that most management teams secretly love to have a huge backlog of support and bugs and many understaffed, under-planned, slowly failing projects. They are arsonists, who love to run around being heroes putting out fires they secretly started. Preventing the fires is no fun, and too hard; it doesn't get you visibility or promoted, and it forces you to clash with ingrained cultures and cynical developers (yes, that's us). IMHO, this is an un-virtuous circle that lies at the heart of the so-called software problem. That is, there isn't a software development problem at all; there is a software development leadership problem of enormous magnitude. I know, I am one (d*mn, how did that happen! I used to be a real developer, honest..).

    I am very doubtful that this problem is truly fixable. I believe it isn't, because of the nature of human organization, the way technology projects are initiated, and the built-in motivations to leave the situation the hell alone. These factors are built in; process improvements are transient and subject to entropy, and in my experience they all fail, given sufficient time. That doesn't mean don't improve things; it means expect it to be temporary, and it won't matter much given the meta-chaos just described.

    Finally, I believe the business and academic crowd, though fun and entertaining to watch, are fundamentally out of touch with what we need to fix this. They are inadvertently contributing to the problems and confusion with silly research, bogus stats and idealized, impractical methodologies. They mean well, but they can't openly grok what's going on.

    Ok, I feel better now, gotta go, there's a new, cool, top priority, mission critical, must be done, greenfield, fully buzzword compliant project just starting. More sharp acronyms for the resume. You want in?...
    Gotcha!

  • I disagree. The person who wrote the spec should be the one writing the test cases. That way the program will be tested according to the spec, not according to the way the coder interpreted it.
  • Let's see: you pay me ~1000 dollars to set up a nice little Samba server and an Internet gateway and pull the cable. Use a simple switch (on a network that size you don't need much else). Then ~200 dollars a month to maintain it. I have several that size; I make money for about an hour's work per month and they are happy. Oh wait, I forgot: these people are trying to use Winders on servers. Yup, it would be much more than that.
  • From everything I've done and seen, debugging a program is often more difficult than programming it in the first place. Some of my friends spend 3 day straight binges doing nothing but bugfixing and bughunting.

    Sounds like work to me.


    -CoG

    "And with HIS stripes we are healed"
  • I'm sick of hearing the same crap. Whenever somebody makes a sweeping comment about software consumption, half the slashdot population responds with their two cents about fine-tuning VI or some other nonsense. The point is that while mass-produced software is sold on the basis of endless options LISTED on a shrink-wrapped box, it is widely used in a cautious and naive manner. YOU don't come into it at that level. None of you. You are all pros; your efficiency is based on flexibility at a complex level... usually in a development environment. VI, emacs, blah blah are all perfectly suited to your needs. This is one of the biggest shortfalls in software engineering: programmers programming for themselves, rather than for the end user. In the end I think the problem the article is getting at is that these end users will not buy simple software because they think it is a waste of money.
  • by isdnip ( 49656 ) on Wednesday November 29, 2000 @03:12PM (#593404)
    Ken Olsen, founder of DEC, was once asked how many people worked there.

    "About half."
  • by back@slash ( 176564 ) on Wednesday November 29, 2000 @03:13PM (#593406)
    One big reason software developers can't spend time working on new features is that most programmers won't write code that is easily modifiable unless they know they will have to modify it themselves in the future (and even then a lot don't).

    Using RAD tools to develop quick-and-dirty two-tier applications with dbtext widgets works great when you develop the application the first time, but when the schema or data source changes you have to go and modify ALL the widgets in the application, instead of having modular components that take care of db access for all the widgets. Ending this shortsighted two-tier approach and instead creating robust multi-tier applications the first time will go a huge way toward increasing the productivity of the programmer in the long run, where it matters, instead of the productivity of the first cut, which is what RAD tools focus on.

    I'll take a good guess (maybe others can give more insight) in saying this is especially a problem in contract work, where basically there is a lot of pressure to meet deadlines and who gives a f*ck how easy the code is to modify after the contract is finished. I think it's obvious that more programmers isn't going to solve the problem, as you only wind up with graduates who can't even grasp the concept of creating modular multi-tier programs (you know the ones I'm talking about). The best path is for skilled programmers to lead by example with good programming practices, taking accountability and pride in their code and teaching others to do so too.
  • by rknop ( 240417 ) on Wednesday November 29, 2000 @02:56PM (#593407) Homepage

    OK, 47 days sounds reasonable for actually writing code, but debugging it is work!! I've heard it said that programmers spend 10% of the time writing code and 90% debugging it. That's perhaps too extreme, but some bugs can certainly be hard to swat.

    I agree fully. Indeed, when I'm "programming", I'm spending more of the time debugging and cleaning things up than I do when I'm first writing the code. I don't even distinguish between the two when I say what I'm doing. Debugging is such an integral part of getting something working that I consider it all part of the same thing.

    When we talk about when a writer is "writing", do we just count the first draft? Take out the time when he presses the "delete" key? Take out the time reading, proofreading, and rewriting? All of that is considered part of writing. Why is debugging not considered part of programming?

    -Rob

  • by Carnage4Life ( 106069 ) on Wednesday November 29, 2000 @02:56PM (#593409) Homepage Journal
    As anyone who has read Frederick Brooks' classic The Mythical Man-Month knows, software development is best done with the following time allotments:
    1. 1/3 planning
    2. 1/6 Coding
    3. 1/4 unit testing
    4. 1/4 system and integration testing
    Most of the time spent developing software is spent planning, designing algorithms, and testing; initial coding takes very little time relative to the other factors. For anyone to claim that debugging and testing are not, or should not be, major parts of the development process is sheer nonsense.

    Grabel's Law
  • Write it right, write it once.

    If you spend more time up front making a solid product, you don't spend time on the backend fixing it.

  • by Gandalf_007 ( 116109 ) on Wednesday November 29, 2000 @02:20PM (#593416) Homepage
    OK, 47 days sounds reasonable for actually writing code, but debugging it is work!! I've heard it said that programmers spend 10% of the time writing code and 90% debugging it. That's perhaps too extreme, but some bugs can certainly be hard to swat.

    It's still work, even if a project gets cancelled, because I spent time on it, and I still got paid for it! My boss has sometimes not implemented a feature because the cost for programmer time would outweigh the benefits.

    Now, some programmers may only spend 47 days a year working and the rest surfing the web, but I value my job!

  • For one, I can say that this varies widely depending upon where the programmer is working. In the private sector, work was fast, often requiring massive effort, such that 47 days of programming may have happened in one month. In the public sector, it's more often been the case that projects move at a slower pace, because they usually lose their patron saint or just become less interesting.

    It's quite probable that within specific areas the public and private examples transpose, as I've seen examples of that, too: doing one year's work in 3 months (just before quitting) at a college.

    --

  • Complex cumbersome apps are a big problem, but it's hopelessly naive to think the solution is to just write a new product with fewer features. It doesn't work. Even Microsoft tried that strategy once - Microsoft Write for Macintosh. It was a trimmed down version of Word 3.0 that sold for about half of Word's price. Sounds great, right? But Microsoft couldn't give it away.

    That was because Microsoft was trying to compete with Write Now, which was a fabulous word processor on the Mac. I remember loading up Word on my old LC and remembering how sluggish it felt compared to good old Write Now. If only people didn't keep sending me blasted Word documents!
  • It makes a lot of sense, actually... Have you looked at any piece of software recently? The linux kernel supports so much more than it used to, and don't begin to think that it didn't require testing to get there.
    Same goes for most things, be it X, Mozilla, MS Windows, etc. While I refuse to make a statement regarding the quality of any of these pieces of software, they all have a lot of features that we'll rarely, if ever, use.
    Same for a few obscure vi commands, random crap in MS Word, half of emacs :-), and so on.

    It's just the way things work. Remember, most packages did start small. Then they got hit by the "wouldn't it be cool if?" syndrome.


    Raptor
  • You forgot the worst form of lie. Computer modeling. (yeah, that wasn't around when Twain said the original but you all know it's true)

    There are (obviously) some people who are better at managing computer systems then others. There are also some supposed network administrators who can't do anything right. I suspect that's what affects the "average" number of computers that can (or can't) be managed. Also, those statistics can be massaged any number of ways to produce results that are (1) frightening (2) thought-provoking (3) sound-byte sized or (4) all of the above.

    Just my 2 shekels.

    Kierthos
  • by goliard ( 46585 ) on Thursday November 30, 2000 @06:52AM (#593426)


    Howdie! Two stories.

    0)

    A friend of mine who is a $200/hr alpha-geek programming god and entrepreneur saw something about Doctors Without Borders, and was so moved, he grabbed a similarly mage-like friend, convinced this guy of the righteousness of their cause, and went down to Doctors Without Borders and said: "We want to help! We'll help you network your computers! We'll help you with your computers! We'll do it for free and we'll even pay for equipment out of our own pockets."

    And the flunky looked down his nose at them and said something like "We'll get back to you."

    And never did.

    There are a lot of non-profits, or so has been explained to me, which just don't know what they're doing technically. They don't know what they want to accomplish, or how to do it, or even how to let people do it for them.

    1)

    I've done tech-support and coding at non-profits, as a temp. I have found that they often have a very exploitative corporate culture.

    The two worst ethical situations I have ever been put in (in over 50 clients) were at non-profits (both educational). Apparently there is something about working for a "higher cause" which convinces people that ordinary rules and laws are trumped by the moral righteousness of their mission. Ikk.

    Even at places where things aren't quite that rotten, I have found that at non-profits, once you have demonstrated that you are willing to be moved by the "we're a good cause, help us" spiel, they keep loading more projects on your shoulders. People who were once saying "If only our mail server worked reliably, that's all we want", turn -- *bamf!* -- into people saying "You're incredible! Wonderful! Let's set up an extranet for every volunteer across the country, and set up an e-commerce system to take donations on it!"

    It's one thing to be in the front lines, actually doing the primary mission work of the non-profit -- teaching students, helping the sick, donating to the deprived, whatever -- in which you are directly engaged in the mission. When you do that, you feel a connection with the purpose of the enterprise, and that in itself recharges you and inspires you to do more, give more.

    But when you're in the back room with the servers, there is no recharge. It's just more work. The people out in the front lines -- who are giving 110% -- don't understand why the people in the back room just aren't so inspired. Well, duh. Could they be more disconnected from the point?

    You're right that working at non-profits is not sufficiently rewarding for most geeks -- but the reward sought is not just money.

    Add to that the fact the sysadmins are often treated as janitors, and often personally blamed for failures of the network, software, hardware, etc.... If you're going to be treated badly, you might as well be treated badly and well paid.

    Non-profits have to cultivate a culture -- perhaps different than they already have -- in which technical staff and technical volunteers are treated as partners in the mission, and keep them involved.

  • by jafac ( 1449 ) on Wednesday November 29, 2000 @02:22PM (#593427) Homepage
    so you're saying (with that title) that programmers aren't *WORKING* when they're testing, debugging, coding projects that end up shitcanned, etc.?

    A more proper title for this article:
    Programmers Code for projects that eventually see the light of day, 47 days per year.

    The other 318 18-hour days, they're working their ass off doing the other stuff.

    Do programmers only code? News flash, duh, they don't. gee whiz, I think I'll change my major now that I've learned this startling revelation.

    No, actually they spend the other 318 days reading "Visual Basic For Dummies" books.
  • If I recall correctly, OpenDoc wasn't necessarily tied to a specific language or platform. It was a description of how to write container apps and objects. A spec, which Apple produced an implementation of. I think a company called Digital Harbor actually produced the theoretical word processor.

    I've heard you can do most of the things that OpenDoc did with Java, and perhaps other languages. True?

    And if so, why not try and create such a thing?
  • If a program team came into a project with the specific goal of satisfying 1 need, instead of 100, or 1 customer (the boss, or his daughter, or someone's niece), then you'd have much higher efficiency.

    Then perhaps v2 would be to rewrite the program into a framework such that it could be extended with plugins and paired with other similarly written programs.

    It's a methodology that might actually work. Design for one person. Test on one person. Target one person. Support one person.

    Then, afterwards, rework it so that that one person can still use it perfectly, but that everyone else on the team can still use it. Maybe stage three is to rework it so anyone else can use it, with small, minor enhancements and such, but always back to the basis that the one tester, the one user, the one customer, can still use the product.

    Does this sound bad? It sounds reasonable, to me!

    As for the Church Turing Thesis; a computer is a computer, but MS Word is a blender. It should not be used for maintaining address books, making flow charts, web site design, or databases. It is a blender.

    It could have a plugin for web export. For address book export. For Database export, whatever. But it isn't a computer, or a system. It's just a word processing program!

    Geek dating! [bunnyhop.com]
  • I spend the other 318 days reading slashdot myself.
  • Sorry to comment twice, I know, "read twice, comment once."

    However, I have to respond to this statement. When I started my business, I made the commitment that 50% of my clientele would be non-profits. Since then, I've cut that down to two low-maintenance non-profits.

    Why? Simple: non-profits are as high-maintenance as the $150/hour for-profit clients, and they can only afford to pay 1/4 of that. Because the average non-profit has spent years without adequate tech support or training, most start phoning any friendly tech consultant morning, noon, and night. It's not harassment - they're just desperate.

    Worse, publicly funded non-profits and the government are all too similar in their operation. I've had one grant-funded non-profit threaten to sue me over a database I created for them at 10% of my standard rates, because it wasn't as fast as they wanted. I've had grant-funded non-profits waste dozens of hours of my time on pointless meetings and specification drift, and then throw away the software once I completed it. As a result, I only do private non-profits now, and only ones with 10 staff or less.

    Still, I feel that every independent developer should help support one local non-profit. Stick to the small ones, and make sure they know the limits of your support.

  • heh - that's *IF* you believe everything you read. And what happened to the other 1/6th??

    1/4 + 1/4 = 1/2

    1/3 + 1/6 = 1/2

    1/2 + 1/2 = 1

    What other 1/6th were you talking about?

    Grabel's Law
  • by Flavio ( 12072 ) on Wednesday November 29, 2000 @02:25PM (#593446)
    I somehow fail to find that surprising.

    Every programmer knows that, no matter how determined one is, debugging and pieces of "work in progress" (that'll be canned for some reason) are major time sinks.

    I assume that study is targeted for management audiences with no technical knowledge so I won't slander it.
    I believe it's pretty clear that simpler is the way to go. Compare Linux to the proprietary OS with the largest desktop market share and you'll see the clearest example of this.

    Furthermore, a "light-weight word-processor that imports foreign formats correctly" is an oxymoron. If your word processor is that light-weight, it will end up discarding most of the features that advanced processors use, so the file import won't be all that correct.

    I can safely state that this "faster, lighter, simpler" idea is nothing new at all and has been used for eons by competent programmers. Every simple piece of software that stays simple is an example (look at vi and all basic unix tools).
    If you want something newer, I'll mention abiword and xmms.

    This revolutionary concept is summarized in one term: focus.

    Flavio
  • by ackthpt ( 218170 ) on Wednesday November 29, 2000 @02:26PM (#593447) Homepage Journal
    ...that life is like a Dilbert strip.

    To paraphrase a popular photocopied piece: The meetings will continue until we find out why so little programming gets done.

    --

  • I can't really find any information on the site confirming that this is grounded in hard research, like studying people actually working at software firms.

    It's sort of suspicious that the company reporting this is in the business of selling planning tools... this sounds like a great sales line: "Your current process is amazingly inefficient, but using our tools, you can streamline productivity!"

    I'm not bashing planning tools or a good development process, but this number seems to be a bit sketchy.
  • by DanMcS ( 68838 ) on Wednesday November 29, 2000 @02:26PM (#593451)
    In Austin, Texas, where I live, the technical-support ratio in the local school district is about 2,500 computers per tech-support person. The district's ratio means a majority of its computers get no attention at all. One local high school has computers still sitting in boxes, months after their purchase, because no one available knows how to set them up.

    Yeah, because there aren't three dozen geeks at that school that wouldn't rather set up the boxes for free than do super-easy high school assignments. When we got computers at my old school, our physics teacher wanted to actually use his, so he had a couple of us set it up surreptitiously (students weren't allowed to touch them). Before long, there was this underground network of teachers saying "psst, could you guys set this up for me?", and when the official guy came around to finally do it, he poked around, shrugged, and told the teachers everything was great.
    --
  • The article said that of the approximately 200 days a worker spends at work, 47 days are spent developing new code and the other 150 are spent fixing bugs and doing testing. This gets back to the statistics about how few lines of code a person generates per year.

    What would be more interesting to learn is how those other 150 days break down into design, testing, and bug fixing. Working on a large project myself, I would certainly agree that not enough testing is done, and the brightest aren't always the ones doing the design.

    I'm not sure, however, that all of the labor shortage could be avoided by designing things properly in the first place. I believe The Mythical Man-Month postulates that as software projects and their teams grow in size, their complexity grows disproportionately faster. Sure, it is easier to design something small that works just as you intended. Try throwing 40 people together, each doing that, with a mediocre architect, and I'll tell you the results aren't pretty. Even worse are the people usually put on maintenance.

    Everyone should go read the Big Ball of Mud [laputan.org]. Really. Everyone that has ever worked on a big project will be able to relate to its contents.

    -mjg
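
    The "grows disproportionately faster" point above is usually illustrated (in The Mythical Man-Month itself, not in the article) with Brooks's intercommunication formula: a team of n people has n(n-1)/2 potential communication paths, so coordination overhead grows quadratically while headcount grows linearly. A minimal sketch:

    ```python
    def communication_paths(n: int) -> int:
        """Pairwise communication paths in a team of n people: n*(n-1)/2."""
        return n * (n - 1) // 2

    # Headcount grows linearly, but coordination overhead grows quadratically:
    for team_size in (5, 10, 40):
        print(f"{team_size:3d} people -> {communication_paths(team_size):4d} paths")
    # 5 people -> 10 paths, 10 -> 45, 40 -> 780
    ```

    At 40 people the team has 78 times the coordination overhead of a 5-person team, with only 8 times the hands.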
