GUI Programming Software IT Technology

On The Durability Of Usability Guidelines 233

Ant writes "Useit.com's Durability of Usability Guidelines article says about 90% of usability guidelines from 1986 are still valid. However, several guidelines are less important because they relate to design elements that are rarely used today... The 944 guidelines related to military command and control systems built in the 1970s and early 1980s; most used mainframe technology. You might think that these old findings would be completely irrelevant to today's user interface designers. If so, you'd be wrong."
  • by lachlan76 ( 770870 ) on Tuesday January 18, 2005 @08:47AM (#11393970)
    The thing is, interfaces are still relatively similar to how they were 20 years ago. We still use a CLI (not all of us, but enough to matter), and when in a GUI, the graphics have improved, but little is radically different.
    • Indeed (Score:5, Insightful)

      by Oestergaard ( 3005 ) on Tuesday January 18, 2005 @09:45AM (#11394272) Homepage
      Absolutely correct (and his nonsense about unique identification of the "screen" you're working on being outdated; well, guess he never noticed the title bar in his windows...)

      Also, take a look at the Orange Book - it's from 1985, and close to 100% of what you find in there is as relevant and correct as ever, but unlike the user interface guidelines, the computing industry has not, with a few notable exceptions, really converged towards compliance here.

      (The closest thing you get in a general-purpose OS is Trusted Solaris, certified against the LSPP, which corresponds pretty much to the 'B2' profile from the Orange Book. Nobody ever made a general-purpose OS that even approached B3 (let alone A1 and beyond) from the Orange Book.)

      With all the "computer security" fluff in the media these days, it's easy to feel disappointed at the "evolution" in the IT world when you read the 20-year-old cookbook for how secure systems can be built.

      Nobody cared enough. And people pay dearly for that today.
      • Well, eyes and fingers and thoughts are pretty much the same as they were twenty years ago.

        Amazing that the principles for interacting with these have changed so little! :-)

  • Guideline 1 (Score:5, Funny)

    by Anonymous Coward on Tuesday January 18, 2005 @08:47AM (#11393971)
    Make sure any self-aware military machine in control of nuclear weapons can play tic-tac-toe against itself.
  • by stevie-boy ( 145403 ) on Tuesday January 18, 2005 @08:48AM (#11393982)
    They're *guidelines*, not *rules*, so they should be sufficiently general to still be relevant today.
  • WILI v KISS (Score:5, Insightful)

    by MosesJones ( 55544 ) on Tuesday January 18, 2005 @08:58AM (#11394023) Homepage

    I studied usability at university in the late 80s/early 90s. One of the key elements was the ongoing battle between WILI and KISS.

    WILI means "Well I Like It" and is normally for those interfaces designed, built and initially used by one person or group.

    KISS means "Keep It Simple Stupid"

    There are many other rules (reachability, etc.), but underpinning them all is the concept that WILI is bad and KISS is good.

    The Web was a whole bunch of WILIes ignoring 30 years of interface design. I'm not stunned at all that lessons from 20 years ago are still valid, because people are still the same and interactivity is still the same. Mainframes are very similar to the web in that both tend toward a modal interaction model (click... wait... read... click... etc.). There are some different concepts when elements are being dynamically updated and adjusted based on context and input, but most of that research is 20+ years old as well.

    WILI v KISS: it's the battle of "art" v "HCI". HCI is a discipline that takes in ergonomics, psychology and computing, and produces the best engineering. "Art" or "creative interfaces" are the equivalent of a chocolate teapot: it doesn't matter if you like it... it's still rubbish.

    The best interface is the one you don't notice, it just does its job and enables you to get on.
    • Bwah, the _majority_ of current electronics, software and stuff is based on the KISS principle: "Keep It Simply Stupid".
    • Re:WILI v KISS (Score:5, Interesting)

      by fossa ( 212602 ) <pat7@gmx. n e t> on Tuesday January 18, 2005 @09:42AM (#11394253) Journal

      Hm... you have a point certainly, but to dismiss "art" in an interface as rubbish is a bit drastic. Have you read Don Norman's Emotional Design? In it, he cites two studies that compared ATMs. Two types of ATMs were used, identical in function, with one looking "good" and one "not so good". Users of the nicer-looking ATM had fewer problems using it than those of the other. Yes, actual observed problems, not answers to a survey asking "did you like it?". I do not know how they decided one ATM was "better looking" than the other, which is the first question I'd like to have answered.

      At any rate, the study seems fascinating but not terribly surprising. Norman proceeds to sketch a theory of why the nicer looking ATM was easier to use, using cognitive psychology and the usual HCI tools to do so. I have only read the first couple chapters of the book, but highly recommend it.

      Your final comment is appropriate. An interface that ignores art will likely look awkward or be otherwise noticed by the user, thus negatively affecting usability.

    • Well, it looks to me like the flaw in the WILI principle is actually that people pretend to like something when they actually don't, or else convince themselves that they do for personal ease, or for reasons to do with group dynamics.

      People's minds differ, but similar principles apply, so using yourself as a guide is not too bad. Using each member of a group, individually, as a guide is better still, especially if you have different thinking styles. And usability need not be maximum stupidity, although

    • I would like to expand on your thoughts a bit, as I do a lot of interface design for the web and shrink-wrap software.

      I read people like Jakob Nielsen and appreciate what they have to say. But, most importantly, I read everything he said with a critical eye. I'm pretty sure he'd like that. Frankly, some of his rules are just incorrect or highly suspect, but most are as true as the sky being blue.

      I liken User Interface design to putting on a good outfit. What you're wearing depends on the type of event (yo
  • Now we don't have a "unique identification for each display" like an ugly serial number. We have a URL.
  • by Alioth ( 221270 ) <no@spam> on Tuesday January 18, 2005 @09:03AM (#11394038) Journal
    Something that's a lot shorter, easier to remember, and incredibly useful today (despite having been around quite a while) is Shneiderman's 8 Golden Rules of UI design.

    Obligatory linkage:
    http://www.cs.utexas.edu/users/almstrum/cs370/elvisino/rules.html [utexas.edu]

  • by jellomizer ( 103300 ) * on Tuesday January 18, 2005 @09:07AM (#11394059)
    The two examples of unneeded guidelines are still needed today; they are just in a slightly different form. The first is having a unique identifier on your screen to show the user where they are in the program. This is still a good idea, still needs to be used, and is used sometimes. Look at the title bar of your web browser: every time you go to a different web page, the title of the window changes. Or at least it should, and every screen in an application should have a different title. Some programs don't, and when people call you with a problem and you ask them where they are in the program, they will just tell you the program name. If you are lucky they will tell you the names of all the buttons they clicked. But if you gave every window a unique title, say PROGRAMNAME: Intro Screen or PROGRAMNAME: ECO Screen, then people could help you pinpoint where the problem is, and giving directions would be easier too.

    Secondly, using function keys for commonly used functions. They are called hot keys now, and they are still relevant; I know from experience. I once programmed a section of an application that had no hotkeys and got the worst feedback on that part of the program because of it. A lot of programs are still data-entry programs in a way, and hot keys keep the speed up. In the same vein as hotkeys, there are things like task bars or docks with the most common tools available on screen, so you don't need to navigate a slew of menu items just to get to the next part of the program. (A small sketch of both ideas follows below.)
    • To make the point that this is still important, I've recently been using a web app that uses the name of the program as the title of every page, and it is therefore very difficult to return to a page you were at before.
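    A minimal, hypothetical sketch of both points from the parent comment, using Tkinter purely as an illustration (the app name and screen names are placeholders taken from the comment, not anything prescribed by the guidelines):

      # Hypothetical sketch: unique per-screen window titles plus a hot key.
      import tkinter as tk

      APP_NAME = "PROGRAMNAME"  # placeholder application name

      def show_screen(root, screen_name):
          # Retitle the window so users (and support staff) can name the exact screen.
          root.title(f"{APP_NAME}: {screen_name}")

      root = tk.Tk()
      show_screen(root, "Intro Screen")

      # Hot key: F2 jumps straight to the (hypothetical) ECO screen, so
      # data-entry users never have to dig through menus for a common action.
      root.bind("<F2>", lambda event: show_screen(root, "ECO Screen"))

      root.mainloop()

    With titles set this way, a support call can start from the exact screen name shown in the title bar.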
  • by JustOK ( 667959 )
    Usability guides are too hard to read.
  • by thrill12 ( 711899 ) * on Tuesday January 18, 2005 @09:08AM (#11394068) Journal
    In the first paragraph, the author already mentions that the 977 guidelines found pale against the 1,277 guidelines he ('we') offers.
    Upon entering said page, the user is shown a bill of hundreds of dollars to buy these guidelines.

  • Guideline 4.2.6 said to provide a unique identification for each display in a consistent location at the top of the display frame. This guideline worked well in the target domain of mainframes: Users typically navigated only a few screens, and having a unique ID let them understand their current location. The IDs also made it easy for manuals and help features to refer to specific screens.

    Today, screen identifiers would clutter the screens with irrelevant information. They would not help modern users, who m

  • No surprise (Score:4, Insightful)

    by AtariAmarok ( 451306 ) on Tuesday January 18, 2005 @09:11AM (#11394085)
    "You might think that these old findings would be completely irrelevant to today's user interface designers. If so, you'd be wrong."

    While the machines have evolved, the humans have not, so it should be no surprise that usability standards for humans would change so little.

    • Speak for yourself. Whilst you may not have changed, I have evolved a distended belly, widely spread rear, pale skin, poor eyesight and arm muscles specially suited for resting on a desk.
  • by hcdejong ( 561314 ) <hobbes@nOspam.xmsnet.nl> on Tuesday January 18, 2005 @09:11AM (#11394088)
    Of course most of those guidelines are still valid. Human behavior hasn't changed, and that's what these guidelines are based on.
    The real trick is getting people to follow the damn guidelines [asktog.com]. Programmers should have them tattooed into their foreheads. They should be able to recite them verbatim, and show examples for every guideline.
    Apple got it right with their Macintosh Human Interface Guidelines (and associated Thought Police). Following these guidelines shouldn't be an option or an afterthought, it should be at the core of everything a programmer does.
  • Wow.... (Score:4, Funny)

    by ZoneGray ( 168419 ) on Tuesday January 18, 2005 @09:11AM (#11394089) Homepage
    Wow.... we could have Windows applications that are as reliable and usable as nuclear weapons.

    Better than the other way around, I suppose.
  • by Anonymous Coward on Tuesday January 18, 2005 @09:14AM (#11394105)

    As a person who designs user interfaces, I'd have to say that, while those usability guidelines are in fact dated, they're still quite useful, as the general concepts carry on. There are certain guidelines that, although related to older technology, are still relevant, much like iterative software development from the 1970s is still relevant today.

    On another note, doesn't anyone find it ironic that the section508 [section508.gov] government website doesn't even conform to the same accessibility guidelines it lists?



    Cheers, James Carr [james-carr.org]
  • Usability (Score:3, Informative)

    by CastrTroy ( 595695 ) on Tuesday January 18, 2005 @09:31AM (#11394198)
    I was lucky enough to have UI design class forced upon me in my University studies. I think that everyone taking a degree in computer science (I took Software engineering) should have to take this course. Despite the fact that my professor was pretty bad, I still learned a lot of useful things from that class.
  • Paradigm rot (Score:5, Insightful)

    by Doc Ruby ( 173196 ) on Tuesday January 18, 2005 @09:32AM (#11394201) Homepage Journal
    In 1986, most office workers were familiar with a desk's top, a file cabinet, and a stack of documents. Almost none had ever seen a computer, and they couldn't visualize a directory structure or any other virtual organization of their data. So the virtual "desktop", "folders", menus, dialog boxes, mouse, and other now-familiar artifacts of the Xerox system adopted by Apple for the Mac were a way to leverage people's physical experience into computer techniques.

    20 years later, more office workers are familiar with the "desktop" than with hanging file folders. Most "civilians" are more familiar with the desktop on their computer than with the one in their home. The office prop simulations are the starting reference for reality now, and are more of a straitjacket than a life preserver. We need a new paradigm, especially because mobile devices need a desktop about as much as a desktop needed a papertape reader, or a fish needs a bicycle.

    We've got to get our info architecture universally separated into tiers: at least a data/logic/presentation (M/V/C) model, which is possibly the most flexible one that can still work on legacy desktop systems (a rough sketch follows at the end of this comment). As the components become more distributed, humans will directly interact more with "mere" multimedia terminals, their ticket into the heterogeneous networked virtual world overlaying the physical one, in which all communications work is performed. Implicit state management, without human intervention (saving, login: all invisibly automated), must be the norm; otherwise, state must be discarded outside the transaction, which must complete immediately. Universal APIs and network protocols must join the tiers together, so any of a number of choices can be selected for a particular operation, depending on the combination of human/corporate parties to any transaction. For example, data entry/reports must be sent to the presentation layer in a format independent of rendering *sense*, like sight or hearing (or even Braille touch, etc.), rather than the current paradigm of mere "device independence". And when an extended subset of, say, an office desktop plus a car's internal environment and (of course) their common media player (e.g. stereo) is operated from a "mobile phone" console, rather than a notebook computer or projection/tablet control room, the frameworks for those Human/Computer Interfaces must be only as different as is appropriate to the constraints of those scenarios, which are severely different.

    The 1980s got "computing for the masses" by making desktop computing look like old skills. Now, desktop computing is an old skill that holds many back. The new paradigm that is emerging, mainly among mobile devices, must take into account not only the new scenario but also the new wisdom that computing paradigms change drastically yet the old ones must stick around too, if only because many people won't want to learn two different paradigms in one lifetime. Enough new data and operations are undeniably floating around now that a paradigm shift is inevitable very shortly. In the words of William Gibson, "the future is already here, it's just not evenly distributed yet". Let's use the vast momentum we've generated by wisely using desktops to get us here, by moving far beyond them.
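    A rough, purely illustrative sketch of the data/logic/presentation split described above, with hypothetical class names; the point is that the model hands the presentation tier plain data, and each view renders it for a different sense:

      # Hypothetical sketch of a data/logic/presentation (M/V/C-style) split.
      class ReportModel:                 # data tier: plain data, no rendering
          def __init__(self, rows):
              self.rows = rows

      class TextView:                    # presentation tier: visual rendering
          def render(self, rows):
              return "\n".join(f"{name}: {value}" for name, value in rows)

      class SpeechView:                  # presentation tier: audio rendering (stub)
          def render(self, rows):
              return [f"{name} is {value}" for name, value in rows]  # phrases to speak

      class ReportController:            # logic tier: joins a model to whichever view fits
          def __init__(self, model, view):
              self.model, self.view = model, view
          def show(self):
              return self.view.render(self.model.rows)

      model = ReportModel([("Unread mail", 3), ("Battery", "82%")])
      print(ReportController(model, TextView()).show())   # same data, swappable view

    Swapping TextView for SpeechView changes the rendering sense without touching the data or the logic, which is the kind of separation the parent comment is arguing for.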
    • What "vast momentum" would that be? Interface design has been stagnant since 1984. I've seen nothing that can replace the desktop. The two companies placed best to introduce such a replacement are either too busy refining their current projects (Apple) or are content to sit on their ass and copy other people's work (Microsoft).
        • Momentum and inertia are two sides of the same coin. Interfaces are moving lots of people and info every day, though generally in a linear path. I personally think that GNOME's Beagle is the beginning of a new paradigm, but I'm sure more are out there, and more to come. We must keep looking for the new paradigm to emerge, rather than stay mired in the past model, which doesn't cover our new world accurately.
          • Until file cabinets, folders, desks, walls, and documents all disappear, nothing is going to be more familiar than those ideas.

          I don't know about you, but I'm surrounded by file cabinets. Don't 'fix' what isn't broken.
    • You go ahead and do that, and then when nobody wants it we can all point and laugh, okay?
      • Do what, participate in the ongoing distributed computing revolution by helping develop new, better paradigms? Your puny finger and horsey laughter are totally irrelevant to what we're doing for our customers. What are you doing on a News site for Nerds? You're a cryptojock.
    • 20 years later, more office workers are familiar with the "desktop" than with hanging file folders. Most "civilians" are more familiar with the desktop on their computer than with the one in their home. The office prop simulations are the starting reference for reality now, and are more of a straitjacket than a life preserver. We need a new paradigm, especially because mobile devices need a desktop about as much as a desktop needed a papertape reader, or a fish needs a bicycle.

      Err, maybe in another ten or twenty years. Th

  • I'm not surprised? (Score:5, Interesting)

    by defile ( 1059 ) on Tuesday January 18, 2005 @09:33AM (#11394207) Homepage Journal

    The fact that I get my work done faster using a command line 95% of the time, and manipulate GUI elements using conventions established in the 80s around the X11 project, suggests that computers haven't gotten that much easier to use. In fact, in their rush to become more usable for the uninitiated, I think they've become harder for experienced people to use.

    When I sit down at Windows or Mac, my productivity drops. Eventually it comes to a total standstill because I'm so frustrated that I have to stop and find out how to emulate x-mouse under the workstation I'm in front of today. Or find some alt/ctrl-click window resize equivalent since every laptop has a difficult to control pointing device and positioning it over the exact lip of the edge to drag is pretty troublesome. Or look for some xkill equivalent and realize that most systems don't have one and that I really do have to wait for this sluggish application to decide to respond.

    I'm still trying to figure out how to make MacOS X usable since everyone sits me in front of it expecting me to enjoy it more because it's "UNIX underneath, somewhere". Then I spend a few minutes to try to remember where to find Terminal and then spend another 10 minutes trying to adjust the colors/font settings so that it's white on black and not 6pt font. I've been doing it for about 4 years now so I figured I'd be an expert on it, but I never can seem to remember. Maybe it's because one day Apple decided to improve it and moved the widgets around and I haven't been able to make any sense of it since. I usually give up and go to a different computer or suffer with the terminal as-is, hoping that I get my work done before I go blind. At least when I can't figure out how to make gnome-terminal or kterm do what I want, I can ALT-CTRL-F1 and get the virtual console which is usually a heavenly 80x25.

    Also, apparently no one but me feels that MacOS X's interface is too slow, even on really, really powerful machines.

    Complaints that no one understands. *sigh*

    • That was a good rant, no doubt about that. Now, was there a point relating to usability guidelines somewhere?
      • Now, was there a point relating to usability guidelines somewhere?

        Yes, actually. The headline is: Usability Guidelines From Early 80s Still Relevant, Industry Shocked.

        My point is that it's not that shocking. Computer using conventions from the early 80s are still relevant, and haven't much improved since. For me.

    • Complaints that no one understands.
      I'm not surprised. You're part of the tiny minority that actually prefers a CLI environment. For that minority, GUIs can be hard to use, mostly because this group can't be bothered to learn to use the GUI, or has CLI reflexes so stubborn that no amount of GUI work is going to displace them.
      ... I think they become harder for experienced people to use.
      I think that only goes for the tiny minority I mentioned before.
      Still, you're right in saying that there hasn't been much progress
      • Re:I'm not surprised? (Score:3, Interesting)

        by defile ( 1059 )

        ...there hasn't been much progress in UI design.

        Maybe even anti-progress. I think computer uninitiates are terrified of modern desktops. They don't really guide the experience. You get a mouse pointer and you're basically told to figure it out. Some things you push down get mad and ring a bell, some things open other things to push, and some things just do nothing. Seemingly at random, ALERTS! pop up and give you a cryptic message. The thing you once pushed down that gave you a mad sound now does n

        • A command-line interface, on the other hand, is a guided conversation.
          Uh, no. A command line offers no clue as to possible commands, for example. You need a teacher to figure it out. "man ls" offers no help for the non-technical, and supposes you already know to use "ls" if you want to know about a folder's contents.
          'Calm and soothing' aren't words I'd use to describe the CLI learning curve, either. "Hair-tearingly frustrating and arcane" would come closer - for me, at least.
          With a GUI, you have no syntax errors
          • Re:I'm not surprised? (Score:3, Interesting)

            by narcc ( 412956 )
            I teach adult computer literacy classes professionally. I've always believed it was easier to teach someone (especially someone older) how to use a CLI than a GUI. It's true that the learning curve is (or can be) much higher, and that they often can't use new skills across applications. However, the older user isn't as limited by their degrading physical dexterity and vision in a CLI.

            I once had an 80y/o woman with crippling arthritis spend 2 days trying to use the mouse to play solitaire. She knew what to do, b
        • Try teaching your grandparents to use a computer sometime. I did. My grandmother kept picking the mouse up and moving it around in the air because she didn't understand dragging it on the pad. She was confused that such a sophisticated piece of machinery didn't let her do something that felt so natural to her.

          Sure, it's easy to teach someone to memorize CLI commands. But if they really want to learn, they need an interface that utilizes patterns. Can you teach someone vi and expect them to understand sed?
    • I use MacOS X, and I find a few things exceedingly hard/tedious to do. One is leaving my home directory and finding my way to the Macintosh HD, which takes exactly one click with Finder and I've never been able to do in the Terminal. When I use SSH and FTP, downloading and uploading to varying folders is much, much harder without a split-frame interface that represents the server on one side and my home, client computer on the other. I'd like an easy way to download into nested directories aside from the h
      • I don't care much for OS X, or anything else, really, because there's been so little progress in all sorts of UIs as implemented for so long. The CLI in particular needs to be rethought from the beginning, and needs to take into consideration the improved technologies and UI principles available; it's still too much like a teletype.

        However, in regards to some of your immediate problems:

        One is leaving my home directory and finding my way to the Macintosh HD, which takes exactly one click with Finder and I
        • Thanks very much for the information.

          I suspect that the CLI may be very useful for users who accumulate sufficient lore to use it effectively. I'm trying to do that to some degree, but I have many other cares and burdens aside from my general interest in effective, efficient computing. Some of the commands you list -- especially "q" -- seem fairly obvious in retrospect.

          I suspect the real reason more people don't use the command line is the steep learning curve it requires, particularly compared to GUIs. Per

      • When I use SSH and FTP, downloading and uploading to varying folders is much, much harder without a split-frame interface that represents the server on one side and my home, client computer on the other. I'd like an easy way to download into nested directories aside from the home folder, which I've never been able to do.

        Have you used scp? It's an ssh-based file copier that supports recursion. I can't help you on the home folder thing, though.

      • When I use SSH and FTP, downloading and uploading to varying folders is much, much harder without a split-frame interface that represents the server on one side and my home, client computer on the other.

        I've always thought those interfaces are pretty brain-dead... I mean, you already HAVE a perfect representation of the files on YOUR computer, it's called OS X Finder. Just open up a Finder window at the directory and have your FTP client display the remote directory, and drag&drop between them. Why
      • One is leaving my home directory and finding my way to the Macintosh HD, which takes exactly one click with Finder and I've never been able to do in the Terminal.

        erm.... 'cd /'?

        When I use SSH and FTP, downloading and uploading to varying folders is much, much harder without a split-frame interface that represents the server on one side and my home, client computer on the other.

        Huh? The CLIs for FTP and SSH are the same, no? If you're talking about a GUI client, there are some you can download for FTP (

    • by Anonymous Coward
      You seem confused. Your experience with UNIX and command lines isn't a magical thingy that lets you use any platform you please.

      The Mac interface is not hard for an experienced Mac user, nor is the Windows interface hard for an experienced Windows user. If you feel these platforms make you unproductive then it's because you're a novice user, too locked in by his prejudices to figure out how to use them properly.
    • Re:I'm not surprised? (Score:3, Interesting)

      by Eythian ( 552130 )
      You're definitely not the only one. I came from using Linux to using OSX, which I suffered through for about a year. I was used to the customisability that Linux provides (focus-follows-mouse (I started UNIX on a minimal WM on Digital UNIX where that was the default), the ability to define shortcuts for all kinds of things, the ability to pick a theme that behaves how I want, and so on). I was really slow working on OSX, as about the biggest customisation I could make was between the blue theme or the grey one, unless
    • No one understands them because you're unique. You represent maybe 0.01% of computer users-- users who got used to a Unix command line and have resisted learning a GUI over the last 25 years.

      Most users who got used to the Unix command line back in the day have taken the time to learn the philosophy behind GUI interfaces enough so that they have at least a working knowledge of them. And look at it this way: At least OS X gives you the OPTION to use the Unix interface you love so much. Windows doesn't.

      A
  • by daveho ( 235543 ) on Tuesday January 18, 2005 @09:33AM (#11394213)
    IMHO, software has gotten significantly less usable in the 15 or so years that I have been using computers on a regular basis. Several examples of this trend:
    • Skins. Goodbye consistency. I blame Winamp for this, although Windows Media Player has taken the problem to dizzying new heights.
    • Software that tries to outguess the user. E.g., menu items that aren't shown by default, because you "probably don't need to use them".
    • Feature bloat. Firefox is the only mainstream application I can think of that actually removed features in order to improve usability. Bravo.
    • Agreed.

      But recall that in early Winamp (version 2?) the skin only effectively "coloured in" the various UI components. There were some terrible skins available, but the basic one was okay, if a little fiddly with tiny buttons. However, it was not until later versions, like the unforgivable WMP, that the skinning effectively changed the whole shape of the app. Nasty.

      Oh yeah, "personaliozed menus" are crap too. Always the first thing I switch off.
      • Winamp has sucked from day one. Its ability to use skins meant that none of the controls were standard Windows controls, including the bar at the top of the window, and the text. So it didn't scale with the Windows UI ("use large fonts" setting).
        It also looked horrible, with tiny controls and text.
        Windows Media Player sucks even more, with moronic skins that only work half the time (try running the app full-screen: it'll revert to the default appearance).
    • by Rie Beam ( 632299 ) on Tuesday January 18, 2005 @11:10AM (#11395215) Journal
      "Skins. Goodbye consistency. I blame Winamp for this, although Windows Media Player has taken the problem to dizzying new heights."

      You think that's bad? Can you explain to me in a rational-and-well-thought-out manner why Spybot: Search and Destroy has a skinning mechanism? Why exactly would you need to skin your anti-Spyware program?
    • Firefox is the only mainstream application I can think of that actually removed features from an application in order to improve usability. Bravo.

      This happens all the time in the commercial software world. Sometimes it's more successful than others. Look at 'light' versions of professional apps: Photoshop Elements, Final Cut Express and even Windows XP Home Edition (like I said, sometimes it's successful, sometimes not so much). In most cases the UI is also simplified in the consumer versions.
    • Actually, I like the hidden menus - it's a nice usability feature that keeps the unused things hidden. I think you just need to get used to it, and then you love it.

      The only disadvantage is that the menu items change around until it's fully trained on which items you do and don't use.
    • 1. Skins are a good selling point. I also think the ability to make your workstation look and work the way you want is a boon to efficiency.

      However, having said that - it is important to both understand the standards, and have public computers stick to the standards.

      I think for any home use computer system, there needs to be a balance reached between the equivalent of having your home decorated like the local school, and having things randomly thrown about.

      People want to have some control over their env
  • If it's not broken, don't fix it.

    In some cases you might need the second guideline:
    If it breaks, shake it a little.

  • by Vollernurd ( 232458 ) on Tuesday January 18, 2005 @09:54AM (#11394308)
    A good article and a good site too. I always had pretensions of reading more into the HCI/usability arena.

    You would think that with 20+ years of generally accepted interface guidelines, look-and-feel consistency, etc., AOL would get their Nutscrape browser working with the "File Edit ..." menus on the correct side of the window.

    Don't get me crapping-on about that damn, godforsaken RealPlayer. What the hell is that supposed to be? Ack.

    I wish people would just try and stick to conventions. After all, how many times are you fooled into pulling open a door with a pull-handle fitted when the door is actually push-only?

    Or is that just me?
  • Guideline 1.4.15 (Score:3, Insightful)

    by CrazyWingman ( 683127 ) on Tuesday January 18, 2005 @10:05AM (#11394380) Journal

    1.4.15 Explicit Tabbing to Data Fields

    Require users to take explicit keying ("tabbing") action to move from one data entry field to the next; the computer should not provide such tabbing automatically.

    If only M$ had been listening. I know I'm not the only one here who hates that damn auto-tabbing IP address entry box! (A sketch of the difference follows at the end of this comment.)

    BTW, anyone interested should read M$'s HIG sometime. I hope they've started following it recently, because there were many sections I found where the HIG said one thing and their Office suite did something completely different.

    Also, read Apple's HIG, Gnome's HIG, KDE's HIG... Subtle differences, interesting things.
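    Purely as an illustration of guideline 1.4.15 (a hypothetical Tkinter sketch, not code from any HIG): the auto-advance behaviour below is what the IP-address box does; the guideline says to leave it off and let the user press Tab explicitly.

      # Hypothetical sketch contrasting auto-tabbing with explicit tabbing.
      import tkinter as tk

      root = tk.Tk()
      octets = [tk.Entry(root, width=4) for _ in range(4)]  # four IP octet fields
      for entry in octets:
          entry.pack(side="left")

      AUTO_TAB = False  # per guideline 1.4.15, leave this off: the user tabs explicitly

      def maybe_advance(event):
          # Auto-advance focus once an octet holds three characters -- the
          # behaviour the parent comment complains about.
          if AUTO_TAB and len(event.widget.get()) >= 3:
              event.widget.tk_focusNext().focus_set()

      for entry in octets:
          entry.bind("<KeyRelease>", maybe_advance)

      root.mainloop()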

    • I hope they've started following it recently, because there were many sections I found where the HIG said one thing and their Office suite did something completely different.

      A huge problem for anyone who's trying to conform to the guidelines. "That's the standard!" "Well, it's not what Office does!".

      I have the same complaint about Apple, by the way, although they cover themselves by editing the HIG every time they do something stupid, like brushed metal.

    • If only M$ had been listening. I know I'm not the only one here who hates that damn auto-tabbing IP address entry box!

      I liked the IP thingy - type in an address with all the dots and it shows up just right. It is a little funky, but an IP address is one data field, right?

      BTW, anyone interested should read M$'s HIG sometime. I hope they've started following it recently, because there were many sections I found where the HIG said one thing and their Office suite did something completely different.

      Offi

      • Ever try editing the field after you've screwed something up? Backspace doesn't exactly do what you'd like... neither do the arrow keys. Are you changing 118.153.0.19 to 118.152.0.19? After clicking in the 153 field, backspace the 3 and type the 2. Now your cursor is in the 0 field. Why? It's just moving stuff around a whole bunch for no reason.

        Disclaimer: I have no idea what the above IP addresses relate to - I've never used them, they're just examples.

  • by cbelt3 ( 741637 ) <cbelt AT yahoo DOT com> on Tuesday January 18, 2005 @10:16AM (#11394481) Journal
    Am I the only one who is concerned about the base fact that the concept of "intuitive" is deeply cultural? It reminds me of the kids who scored poorly on IQ tests because they had questions like: Cup is to Saucer as Hat is to Head. The kids who had never seen a saucer or a hat were screwed.

    In my experience (14 years of weapons system / military test systems design), the real benefit of milspecs/standards is that they are mono-cultural: military culture ONLY. They assume NOTHING, and define only those things that personnel who fit the military human standards (height, weight, strength, dexterity, vision, etc.) are capable of doing.

    "Modern Intuitive" GUI's and CLI's are intuitive to the designer ONLY. Icon to hold documents together as a staple ? Great ! What about cultures that use straight pins instead of staples ? etc., etc, etc... Good design means knowing your audience. Great design means BEING your audience.
  • Over a decade back, my employer wrote this application that ran on a DOS platform. It did its thing and it was replaced with a Win3.1 application. Last year, I was cleaning out some old files and found the original DOS specs. They had some stuff that had been obsoleted, but the core calculations had not changed and they provided test cases that were pure gold for the current rewrite.

    Mindful of this, I am careful to retain specs of even the deadest of projects, because 90% of the time someone with pointy ha
  • by Anonymous Coward

    5.2.16 Editing Address Headers Allow users to edit the address fields in the header of a message being prepared for transmission.

    Even though this is still true, it's hard to imagine a designer who would produce an uneditable address field today. I'm sure they exist, but they must be rare.

    Unfortunately not. One of the most annoying features of MS Outlook (Exchange mode) is that you can't edit addresses once they've been verified (which typically happens in the background). This makes perfect sense if you're

  • by faust2097 ( 137829 ) on Tuesday January 18, 2005 @11:54AM (#11395857)
    ...and Jakob Nielsen is not an interface designer. He's an analyst and a pundit extraordinaire, but he's not a designer.

    Any rule of UI design should be broken if there's a solution that benefits the user more than the one that follows the traditional guidelines. Now, the reason we have all these HCI folks busily compiling lists of the 'right' way to do things is that nobody actually taught them how to design anything in their master's programs.

    The sad fact of our industry is that the people who reach 'guru' status tend to spend more time bolstering their book [and overpriced PDF report] sales and retainers for giving speeches than they do trying to advance the state of the art. I can't blame them, book tours are probably easier than real work anyway.

  • So are there any studies backing any of this up or is it all just personal opinions?
