Why Software Sucks, And Can Something Be Done About It?
CPNABEND tipped us to a story carried on the Fox News site, pointing out that a lot of programmers don't understand their users. David Platt, author of the new book 'Why Software Sucks ... And What You Can Do About It', looks at the end user experience with end user eyes. While technically inclined individuals tend to want control, Platt argues, most people just want something that works. On the other hand, the article also cites David Thomas, executive director of the Software & Information Industry Association. His opinion: Users don't know what they want. From the article: "'You don't want your customers to design your product,' he said. 'They're really bad at it.' As more and more software becomes Internet-based, he said, companies can more easily monitor their users' experiences and improve their programs with frequent updates. They have a financial incentive to do so, since more consumer traffic results in higher subscription or advertising revenues." Where does your opinion lie? Should software 'just work', or are users too lazy?
Re:This is just a little bit crazy. (Score:3, Interesting)
I don't even agree that software sucks, I'm perfectly happy with most of it. In fact, what am I even doing here reading this?
Soul of a new machine (Score:3, Interesting)
You should read "The Soul of a New Machine" by Tracy Kidder. It's an old book, but it's written by a guy embedded with the hardware and firmware design guys at Data General as they build an entirely new processor.
At one stage the PHB arrives in the war room and utters his one and only edict: "NO MODE SWITCHES".
Though pissed off at him at the time for making their design job more difficult, the engineers, by general consensus, later applauded him for his vision (however, the company has since folded, so perhaps this was not such a great analogy).
Re:Of course it should just work. (Score:5, Interesting)
Re:one example of too many (Score:4, Interesting)
The real question is how much sophistication can reasonably be expected from lifelong computer users. The file concept and needing to save one's work is an example of something that we've accepted that everyone can and should learn, in spite of the dire predictions of UI experts. The idea that people would be better off sheltered from the file concept is, in retrospect, pretty silly -- as silly as the idea of equipping an automobile with reins and a whip so it would feel like a horse-drawn carriage.
I think we should stop projecting limitations onto humanity and see what happens. The typical "poor ignorant user" of 2030 is going to be at least as savvy as today's typical middle-class college student, and maybe more savvy in ways that surprise us.
Good Design Means Not Needing To Choose (Score:3, Interesting)
The key is good UI design, in particular a clean separation between advanced options and standard options. Windows fails because far too frequently a normal user needs to access the advanced features, so all the advanced features and terminology are there to confuse the user. Try sharing files in Windows and you need to do arcane things like change the workgroup name. Just to check whether I could uninstall programs, I've needed to run msconfig. Conversely, on a Mac the normal user just deals with the preference pane and never has to run command-line programs or the like.
I don't mean to be a Mac zealot. They've done things wrong as well (I'm pretty pissed about their special power cord), but they did a good job of separating advanced from basic features, partially because they were willing to jettison the old ways of doing things.
In any case good design doesn't require a choice between power and ease of use. It just requires a clear cut distinction so the normal user never needs to deal with the advanced features.
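The basic/advanced split described above can be sketched in code. This is a minimal, hypothetical example -- the class, field names, and defaults are invented for illustration, not taken from any real preference framework -- showing how options tagged by audience let the UI hide advanced settings until a user asks for them:

```python
from dataclasses import dataclass, field, fields

@dataclass
class ShareSettings:
    """File-sharing preferences, each tagged with an audience level."""
    # Basic options: what a normal user sees by default.
    sharing_enabled: bool = field(default=False, metadata={"level": "basic"})
    shared_folder: str = field(default="~/Public", metadata={"level": "basic"})
    # Advanced options: hidden unless explicitly requested.
    workgroup: str = field(default="WORKGROUP", metadata={"level": "advanced"})
    smb_port: int = field(default=445, metadata={"level": "advanced"})

def visible_options(settings, show_advanced=False):
    """Return the option names a preference pane should display."""
    levels = {"basic"} | ({"advanced"} if show_advanced else set())
    return [f.name for f in fields(settings) if f.metadata["level"] in levels]

prefs = ShareSettings()
print(visible_options(prefs))                      # ['sharing_enabled', 'shared_folder']
print(visible_options(prefs, show_advanced=True))  # all four options
```

The point is that the advanced machinery still exists, but the default view never forces the normal user to confront it.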
Re:This is just a little bit crazy. (Score:1, Interesting)
The word "save" isn't that hard of a word to grasp. People save money. People save possessions. Saving documents is no different. Grade schoolers understand it.
So when I write a document with a pen and paper, I have to "save" it? No, it is "saved" by default. I have to specifically THROW IT AWAY to "not save" it.
Same with money: the way to "save" money is to NOT do anything with it. It doesn't evaporate if I don't click the right button.
Like the poster above, I would much rather see everything saved by default, and have everything be undoable. I've been using Apple Aperture lately, for instance. It's got non-destructive image editing, and no concept of "save". You just make your edit and it sticks. Don't like it? Remove it later.
I'd also like to have a list of every file I've edited or saved, in chronological order. I'd like to go back to any previous version. Like time machine + RSS feeds or something.
Having to "save" things is still slightly bizarre to me, and I've been working in IT since 1990. My computer-illiterate mom has figured out that if she wants things to stick around she has to click "save", but still doesn't know WHERE files go when they are saved, nor does she realize that you can save things WHILE you're working on them, as a checkpoint.
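The "everything saved by default, everything undoable" model the poster describes is essentially an append-only edit history. Here is a toy sketch of that idea in Python -- the class and method names are my own, not Aperture's actual design -- where every edit is an implicit save and "undo" just steps back through the chronological history:

```python
class Document:
    """Saves implicitly on every edit; undo steps back through history."""
    def __init__(self, base=""):
        self._versions = [base]  # chronological list of every state

    def edit(self, new_text):
        self._versions.append(new_text)  # implicit save: no "Save" button

    @property
    def text(self):
        return self._versions[-1]  # the current state is simply the latest version

    def undo(self):
        """Discard the most recent edit; earlier states remain intact."""
        if len(self._versions) > 1:
            self._versions.pop()

    def history(self):
        """Every state in chronological order, oldest first."""
        return list(self._versions)

doc = Document("draft")
doc.edit("draft, revised")
doc.edit("final")
doc.undo()
print(doc.text)  # draft, revised
```

A real implementation would persist the history to disk and probably keep undone states around too (so redo works), but the user-facing idea is the same: nothing is ever lost unless you deliberately throw it away.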
Re:Fine, not lazy (Score:1, Interesting)
We wouldn't sympathize with anyone who fiddled with their engine when they installed a "Nitrous Unit Of Extraas Booostinger" that they got for free from some guy on the street but we accept that kind of thing from computer users. Then we blame the machine instead of the moron using it.
One has to learn something (Score:1, Interesting)
Basically, it all comes down to what you learn. If you learned French all your life and suddenly find yourself in an English environment, then you are probably going to hate English and how difficult it is to use, and perhaps, rightly so. But your average American (ah-mir-i-can) will think that learning French is a horrible idea because it is so different from English.
So you can't really blame the programmer. They can't make a program that will suit everybody. They try to make something that has everything that anyone could possibly want and leave it to the user to not use what they don't need. Hey, if I have a program that is capable of doing a million things that I might need and one that I definitely do... I'm glad that it has that one thing that I do need to do.
At least with software I have a little bit more of a choice of what I learn than language. I had to learn Ada in college... No control over language. At least the VHDL has some application.
Re:Better analogy (Score:2, Interesting)
One of his peeves is when a text-editing program like Microsoft Word asks users if they want to save their work before they close their document.
blah
For them, a clearer question would be: "Throw away everything you've just done?"
No, I think the original analogy was to the point. The author complains that they're using a term that's new to people new to computers, which is really stupid. What do you mean 'Throw away everything you've just done?'. So much ambiguity... Does that delete the document, or merely discard changes? Where does the document go? Once you throw it away, can you pick it up if you change your mind?
Besides this, dumbed-down terms don't really help teach users how computers work. If there's a new term, it's associated with a new action, and a new understanding of how that action works. It'll take time, like anything new does. Once you're used to it, you wouldn't dream of calling it anything else. Microsoft Bob [wikipedia.org]... Where did that go?
If you don't know, ask someone who does. Or click the god-damned 'Help' button. Maybe we need another label saying 'Clicking: Pointing your mouse at an object and depressing the left (or right, if you're left-handed) mouse button and then releasing it'.
Perfect timing (Score:5, Interesting)
Re:one example of too many (Score:3, Interesting)
I wouldn't say users are lazy, but they often don't seem to care about the choices that exist in what it is they want to do. Never mind needing a bachelor's degree in computer science; how about at least understanding the complexity of the task they've chosen to do, or at least having some appreciation for that complexity? The average user is very simple-minded. They don't want to know about the things they've chosen to do. They want to know only about the small piece of it that they have patience for. In that sense, yes, they are VERY lazy.
While I agree with some particular gripes that Platt has, I think his general approach is overly simplistic, does not address the actual CAUSE of the problem, and is unrealistic. As I'm sure many have already said here, the tasks people want to do are often NOT simple, especially not when dealing with all the different variations and choices to be made when performing the task. Oversimplifying will frustrate other users. I think that David Thomas has a much more realistic and practical appreciation for the problem. It's not about lumping all users into a simple-minded bucket and catering to them. It's about understanding the users of a particular environment and working to streamline for those users; it's about truly getting to understand the needs of the user. Platt does not do this. He has a preconceived notion that all users are like his view of the least common denominator. He has a statement for "programmers": YOUR.USERS.ARE.NOT.YOU. Well, to throw that back at him... our users are not him, either.
Platt uses a lame example (of many lame examples): the "Do you want to save" prompt. If it were changed to "Do you want to throw your work away?", more people would hit "Yes" by mistake. I'd be willing to bet that statistics would show people are more likely to make that mistake and be frustrated than there is frustration with being asked to save. He just bitches because he wants it to be HIS way (see my comment about our users not being him). Sure, we all have our frustrations with how certain software works and its defaults... but try seeing past your own narrow needs, and understand what the software is actually TRYING to cater to.
It's worth going and reading the comments on his blog. Many people quite intelligently rip his views apart. And also recognize: Platt is trying to sell a book. He's shooting for the "hype" and positioning that will get people "talking" about his book. It just makes me a HECK of a lot more skeptical about the validity of his claims, versus his attempts to sell a book:
http://suckbusters2.blogspot.com/2006/12/web-site
-Alex
Re:Fine, not lazy (Score:2, Interesting)
Re:one example of too many (Score:1, Interesting)
The computer illiterate are in no danger of going away. You can get away with that kind of thinking for certain applications like in an office setting where people either use the custom software for their company or find another job. But this entire argument is worthless for people writing shrink-wrap software (e.g. me).
I make a PC-based scan tool for auto mechanics. The cars are quite complex these days and the mechanics don't want to have to learn the ins and outs of a complex menu structure or user interface if they don't have to. To make our product competitive we have a very simple interface that packs lots of features into it. We spent a great deal of time and energy by having each developer go to a mechanic shop and fix a car with the tool (most of us here are car buffs anyhow). Then we had a mechanic use the tool with a developer "standing by" taking notes.
Doing this for two years allowed us to streamline the interface without it becoming a complete mess. We do get quite a few compliments on the interface because of all this work. But the computer illiterate guy who wants to get his work done in a hurry (most of these guys are paid flat rate -- a fixed amount per job) is not a temporary phenomenon.
These people are not going away. And guess what: even the computer-literate ones don't want to spend time on an unintuitive interface -- they are trying to do their job, which does not revolve around computers.