A Non-Dogmatic History of the GUI 305

Zoxed writes "Jeremy Reimer provides an 8-page history of GUIs from the early 1930s to the present day. For example, from the conclusion: 'the truth of the story is that the GUI was developed by many different people over a long period of time. Saying that "Apple invented the GUI" or "Apple ripped off the idea from PARC" is overly simplistic, but saying that "Xerox invented the GUI" is equally so.'"
This discussion has been archived. No new comments can be posted.


  • I'd want to see 'Windows 32' up and running. I have these win32* files on my computer; I'm pretty sure they are remnants of that ancient '30s GUI. But what computer hardware would that have been running on?
  • Love of the Mouse (Score:5, Insightful)

    by Janitha ( 817744 ) on Saturday May 07, 2005 @05:18PM (#12464304) Homepage
    After reading this, I noticed one thing: it seems like we have been stuck on the same idea this whole time, a simple 2D screen. Even VR goggles use two 2D screens. Hopefully this will change as layered LCDs and other technologies start coming along. True 3D GUIs are what I am waiting for now.
    • While a 3D GUI would be nifty, I can't see layered LCDs really bringing it to its full potential. Furthermore, the mouse has to go as an interface device for a 3D environment; it needs to be replaced with something better for simple 3D interaction. The mouse works for 3D games, but it can't move in six directions for a 3D environment.
      • Re:Love of the Mouse (Score:4, Interesting)

        by Janitha ( 817744 ) on Saturday May 07, 2005 @05:29PM (#12464353) Homepage
        The mouse subject was accidental; I was initially writing about the mouse when I realized this, and you are right, a mouse wouldn't really work well.

        Would be pretty nice to have simple point-and-touch in real space, or simply have the computer track eye movements, where you can just stare down an item you wish to click (or look at it and click), or a combination of touch and look.
        • Re:Love of the Mouse (Score:4, Interesting)

          by elgatozorbas ( 783538 ) on Saturday May 07, 2005 @05:38PM (#12464392)
          Would be pretty nice to have simple point-and-touch in real space

          Nice, but probably also very tiring and difficult without a point of reference. Besides, what would all these 3D GUIs be suitable for? 3D modeling, mechanical CAD, and the like are obviously good candidates. But apart from those? I never felt the need for either a 3D pointing device or display. Can you give some more examples of applications where such devices would have added value?

          • by Anonymous Coward on Saturday May 07, 2005 @06:19PM (#12464623)
            Besides, what would all these 3D GUIs be suitable for? [..] I never felt the need for either a 3D pointing device or display. Can you give some more examples of applications where such devices would have added value?

            One thing that stuck in my head from my CS course in user interfaces was that (paraphrasing and condensing), "3D interfaces suck".

            A major problem is that in a 2D interface, everything is visible (*); in a 3D interface, things can be 'hidden' behind other things, or even 'inside' a group of other things. At least you'll have to move around to see items; in the latter case, you'll actually have to move some items themselves aside to see other items.

            Also; bear in mind that 3D interfaces will always be observed via two 2D projections (in your eyes, of course); this does *not* give anything approaching full knowledge of the 3D world from a given perspective. 2D on the other hand, does not have this problem.

            On the course mentioned above, we were shown a video from the early 90s (this was circa 2001/02), with a virtual reality file management system. Navigation was via a hierarchy of logically-organized rooms in a building; files were finally found within boxes and folders.

            Even then, it seemed very dated in an "early 1990s virtual reality fetishising" way. In short, with the benefit of hindsight (*or* the ability to step back from the hype), it is obvious that the system was gimmicky and inefficient compared to the still-prevalent Mac/Windows-style folder navigation.

            If technology were the limiting factor on 3D interfaces, I believe they'd have become commonplace by the second half of the 1990s. The fact that they haven't says more about their mechanics and usability than it does about display technology.

            (*) Yeah. Things *can* be hidden by other things in most current GUIs. That's because they offer a very restricted level of 3D; namely layers of depth. They get away with this *because* it's limited.
        • Personally, I'd like to have a mouse with a mini-trackball rather than a jog-wheel. Move the mouse around for large distances, move the trackball around for small distances.
    • by Peter Cooper ( 660482 ) on Saturday May 07, 2005 @05:41PM (#12464404) Homepage Journal
      I don't see the point. It's like demanding 3D paper or 3D TV. Paper and TV have been around significantly longer than GUIs, and I don't see anyone jumping to make those 3D. I saw some demos of some 3D TVs in the 90's, and while the idea had a certain 'cool' factor.. it seemed pointless.

      Some things simply don't need to be more complicated than they are.. like adding buttons and extra text boxes onto Google search, or developing 3D paper.
      • The Sharp 3D Actius laptop can do true stereo display without the use of special glasses, although images do have to be preprocessed. If this could be extended to LCDs in general and windowing systems updated accordingly, then true 3D GUIs could be possible.
    • by Dun Malg ( 230075 )
      After reading this, I noticed one thing: it seems like we have been stuck on the same idea this whole time, a simple 2D screen. Even VR goggles use two 2D screens. Hopefully this will change as layered LCDs and other technologies start coming along. True 3D GUIs are what I am waiting for now.

      Why? We can't see in 3 dimensions. Our visual organs only see two-dimensional pictures. Our brains use the parallax from two 2D images to give us depth perception, but this really isn't true 3D

    • Re:Love of the Mouse (Score:3, Informative)

      by kryptkpr ( 180196 )
      Layered LCDs (aka SOLED [universaldisplay.com]) really have two advantages. One is that they (that is, TOLEDs [universaldisplay.com]) can be made transparent (around 70% opacity) by using a transparent conductor called ITO for the electrodes (read Universal Display's patents one day, if you can manage to get through them), and two is that they can produce the full RGB spectrum (by stacking transparent layers) in a third of the physical space.

      So while this is cool and all, it's not really going to help with a 3D GUI, as these OLEDs are still made in 2D sheets
    • Re:Love of the Mouse (Score:5, Informative)

      by iamlucky13 ( 795185 ) on Saturday May 07, 2005 @07:07PM (#12464853)
      The Sphere desktop environment [hamar.sk] probably isn't quite what you're looking for, but I think it's neat. It's really more of a toy than a tool.

      It's basically a program that replaces the standard desktop in XP with a spherical one. Your vision is at the center of the sphere, and you can look 360 degrees, including up and down, and place windows throughout the entire environment. You can also get some rather dizzying backgrounds for it.
  • Very interesting... and you can see GUIs branch off into those on console gaming machines, Palm Pilots, cell phones, even home theater receivers and DVD players. Well done, nice read (well, I skimmed through it).
    • Re:Cool (Score:5, Interesting)

      by MichaelSmith ( 789609 ) on Saturday May 07, 2005 @06:18PM (#12464620) Homepage Journal

      I work for a company which sells air traffic control software. Lately I have been conducting training courses for software engineers working with our product.

      I offered the opinion to my students that the radar display, implemented as computer graphics, is one of the best graphical user interface metaphors that you can find.

      And there it is in this article:

      During the war he had worked as a radar operator, so he was able to envision a display system built around cathode ray tubes where the user could build models of information graphically and jump around dynamically to whatever interested them.

      Which makes me think that the CRT radar display where theta on the screen tracks the radar head revolution, and R represents the time for the echo to come back was the first true, working GUI.

  • Heh, my primary interface to my computer is still a terminal, but inside a GUI. The only applications I use that really use the mouse are Mozilla and Gaim. For the rest, I use the keyboard primarily, for school work (Emacs, gcc, and LaTeX) and such.
  • Jef Raskin (Score:3, Funny)

    by drivinghighway61 ( 812488 ) on Saturday May 07, 2005 @05:22PM (#12464313) Homepage
    Why, why, why, I invented Ars Technica!
    Jef Raskin [folklore.org]
  • GUI is over-rated (Score:4, Insightful)

    by xiando ( 770382 ) on Saturday May 07, 2005 @05:25PM (#12464330) Homepage Journal
    I know, I am a bit strange for thinking the GUI is overrated. And in very many cases, a GUI does the best job. But the CLI, text-based, is my preferred choice for a broad variety of applications. Text-based simply gets the job done quicker and more smoothly in many cases. Actually, unless I am working with something that actually requires graphics, I prefer text-based.
    • here here!! (or is it Hear Hear? I can never remember).

      and also, there are very few (any?) server apps that would benefit from a gui.

      I don't really have an imigination but I cannot think of one off hand.

      anyone?

      • or an "imagination" for the Grammar Nazis (go GNs, Go)

      • The only server I can think of that requires a GUI is X11 (boom tish) or the like.
        Other than that, a GUI is really just a preference for the user.
        I personally do most of my work on the command line; the only thing I would really not enjoy doing there is graphical work. I know it's possible, but it's not pleasant.
        • good one.

          But really, is there a server that really benefits from a GUI?

          Isn't a text config preferred by ALL admins?

          Windows Admins? Remember being able to use a logon script to actually do stuff?

          And please don't mention Windows Scripting Host. That is the most non-intuitive device I have ever seen.

          A script is a set of commands. On their own the commands should do work.

          • Re:GUI is over-rated (Score:3, Interesting)

            by jc42 ( 318812 )
            Isn't a text config preferred by ALL admins?

            Indeed. I've been involved in a number of network-management projects. In all of them, we've eventually had discussions of an interesting phenomenon: When you watch network admins in action, and a problem occurs, they invariably (and instantly) abandon the fancy GUI tools on their workstations. They open up 2 or 3 text windows and start typing commands.

            If you talk to them about it, you'll find that their attitude is "Oh, those fancy GUI things are for impre
    • I agree, there are cases where the CLI works better, but there are also cases where GUIs work better too. Tell me how to select and manipulate 10 dissimilarly named files at the same time in a directory with 100 files, on the command line? E.g. to FTP somewhere, to cp somewhere, or to delete. You might spend some moments formulating a regexp to catch just the files you want and testing it, but a GUI selection ability is much faster.

      My point? There's a point to both methods.
      • Tab completion ;)
        Globbing, wildcards, tab completion, and regexes are good tools for the different situations where you need to quickly get the computer to pin down certain files without much effort from the user. Clicking with the mouse can be useful, but sometimes the CLI will actually be faster; it depends on what you're doing.
        One really nice GUI feature is copying+pasting of files. I like it. I would like more, though, the ability to gather files as I maneuver around the filesystem and then deposit them
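To make the globbing point above concrete, here is a small illustrative bash session (the directory and file names are invented for the example): an explicit brace list selects the dissimilarly named files in one command, and a glob catches the rest.

```shell
# Hypothetical files, made up for the example: pick out a handful of
# unrelated names plus everything matching a pattern.
rm -rf /tmp/globdemo
mkdir -p /tmp/globdemo/backup
cd /tmp/globdemo
touch report.txt notes.md data.csv a.log b.log chart.png

# A brace list selects the dissimilar names in one command...
cp {report.txt,notes.md,data.csv} backup/
# ...and a glob picks up the similar ones.
cp *.log backup/
```

Brace expansion is a bash feature, so this sketch assumes bash rather than a strictly POSIX shell.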
      • I agree, there are cases where the CLI works better, but there are also cases where GUIs work better too. Tell me how to select and manipulate 10 dissimilarly named files at the same time in a directory with 100 files, on the command line?

        There are always full-screen text-console apps like Midnight Commander, which resides somewhere between a GUI and the CLI. I probably use bash to navigate files 70% of the time, a GUI file manager 20%, and Midnight Commander 10%. Each has its own uses: CLI when you're g

      • ... but a GUI selection ability is much faster.

        Not entirely clear. It wouldn't be if the list of files is long or spread across separate folders (requiring scrolling, etc.). For your example, a cut/copy/paste command-line equivalent might be something like:

        • fcopy file1 file2 ... - Remember file selection.
        • fcut file1 file2 ... - Remember file selection and remember to delete them.
        • fadd file1 file2 ... - Add file selection to existing selection.
        • fpaste - Paste the file selection in the current directory, deleting the source if it was cut.
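The fcopy/fcut/fadd/fpaste commands sketched in the list above are the commenter's hypothetical proposal, not an existing tool; as a rough illustration, they could be approximated as bash functions like this:

```shell
# Minimal sketch of the hypothetical fcopy/fcut/fadd/fpaste commands.
# The "clipboard" is a remembered list of absolute paths plus a
# copy/cut mode flag.
_fsel=()          # remembered file selection
_fmode=copy       # "copy" or "cut"

# fcopy: start a new selection, remembering absolute paths.
fcopy() { _fsel=(); _fmode=copy; local f; for f in "$@"; do _fsel+=("$(realpath "$f")"); done; }

# fcut: same as fcopy, but mark the selection for removal on paste.
fcut()  { fcopy "$@"; _fmode=cut; }

# fadd: append more files to the existing selection.
fadd()  { local f; for f in "$@"; do _fsel+=("$(realpath "$f")"); done; }

# fpaste: drop the selection into the current directory; move instead
# of copy if the selection was cut, then clear the selection.
fpaste() {
    local f
    for f in "${_fsel[@]}"; do
        if [ "$_fmode" = cut ]; then mv "$f" .; else cp -r "$f" .; fi
    done
    _fsel=()
}
```

With these definitions, `fcopy a.txt b.txt; fadd c.txt; cd ../elsewhere; fpaste` would drop copies of all three files into the new directory, while `fcut` would move them instead.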
    • by NanoGator ( 522640 ) on Saturday May 07, 2005 @05:49PM (#12464442) Homepage Journal
      "I know, I am a bit strange for thinking the GUI is overrated. And in very many cases, a GUI does the best job. But the CLI, text-based, is my preferred choice for a broad variety of applications. Text-based simply gets the job done quicker and more smoothly in many cases. Actually, unless I am working with something that actually requires graphics, I prefer text-based."

      In your case, that's great. However, UI isn't just about quick, efficient interfaces with the computer; it's also about making an interface that a new user can do something with. A text interface is the WORST interface to give somebody who's never used the system. If a GUI is designed well, you can tell a user what their goal is and they'll work it out. With a text UI, the user will fly over to Google.
      • With a text UI, the user will fly over to Google.

        No. They'd like to, but they won't be able to figure out how.
      • A text interface is the WORST interface to give somebody who's never used the system. If a GUI is designed well, you can tell a user what their goal is and they'll work it otu. With a text UI, the user will fly over to Google.

        Doesn't Google already have a text UI? That is, between all of the graphical white space around the sides of where the user actually types the text explaining what they want.

        I would have classified the Google interface as a very well designed text interface. Consequen

    • I think CLI is valuable but there's a point where it is more or less a bunch of excess keypresses to perform some operation, even with tab completion.

      The CLI to MySQL is atrocious, for example; it seems the only way to fix it is to have a graphical interface.
    • Doom in CLI (Score:4, Funny)

      by Rallion ( 711805 ) on Saturday May 07, 2005 @06:17PM (#12464612) Journal
      > Forward 4 meters
      > Turn left 7 degrees
      > Fire
    • Pine Email! (Score:3, Interesting)

      by iamlucky13 ( 795185 )
      Just four years ago, Pine was still the standard for remote email access at my school. Going from hotmail to that took some getting used to, but now I'm hooked. I can check for new messages in the time it takes Firefox to start up and load the hotmail login page. Of course, I don't even need to touch the mouse. The only downside, in my opinion, is downloading attachments is slightly more complicated, since I have to FTP them.

      Of course, progress has a nasty tendency to ruin the best things in this world.
  • by Anonymous Coward on Saturday May 07, 2005 @05:30PM (#12464358)
    ...as it took place on IRC

    maoepdmz: apple invent gui, m$ stealz it
    CHRIS: FAG STFU... XEROX MADE FIRST gui
    maoepdmz: no
    jay: microsoft made the gui truly successful
    CHRIS: WTF
    maoepdmz: STFU JAY FAGG0t
    *** jay has been kicked (Suck Bill's cock)
    CHRIS: MY DAD WORKS FOR APPLE, HE WAS THERE WHEN THEY MADE THE GUI FOR LISA AND STEVE JOBS CREATED THE MOUSE
    maoepdmz: raelly?
    CHRIS: YEAH
    maoepdmz: so xerox are liars
    CHRIS: HELLS YEAH THEY TOTALLY RIPPED IT FROM APPLES
    maoepdmz: wow
    gorbulon_neo_matrix21: u all fags, linux had the first gui in 1983... its called x windows system, idiots
    CHRIS: NO, UNIX MADE THAT
    maoepdmz: chris do u have unix
    CHRIS: YES.. ITS L33t
    maoepdmz: can i see screenshot?
    CHRIS: NO
  • by unassimilatible ( 225662 ) on Saturday May 07, 2005 @05:31PM (#12464359) Journal
    "Apple ripped off the idea from PARC" is overly simplistic

    How about, "Apple bought some ideas from Xerox for millions in cash and stock?"

    This "Apple ripped off PARC" thing is nonsense. Just because the PARC group didn't like that their company sold the GUI rights doesn't make it a rip-off.

    Bought and paid for.

    • by As Seen On VT ( 881871 ) on Saturday May 07, 2005 @06:05PM (#12464536)
      Once we'd bought the GUI from PARC and started to develop it, loads of Xerox engineers started to jump ship and move to Apple - we were where the action was at. This was a hard time for most of us old timers, the PARC engineers were smelly, had facial hair and a tendency to work naked (a very confronting sight).

      Acceptance of the new engineers did slowly grow though. I remember the turning point was when in a weekly meeting where we showed off the latest advances in ASCII porn - a vital part of Apple's plan to get computers into every bedroom - the PARC guys demoed a system of multiple 'windows' containing 4-bit Grayscale photorealistic porn!

      It took a minute to get over the fact that they had used themselves as models, but Steve immediately saw the potential, and that won us over quickly.

    • This "Microsoft ripped off Apple" thing is nonsense.

      Just because Apple later changed their minds about licencing the Mac technology to Microsoft in return for a word processor doesn't make it a rip-off.
      • Apple screwed up the license when it wanted MS to develop programs for the Mac OS. Bad lawyering, yes, but Jobs and Co. never intended that MS be able to copy the "look and feel" of the Mac OS.

        Unfortunately, their licensing agreement let MS do just that, and the rest is history.

        There was, however, no "changing of minds."

        • Well, the post was a little joke, so don't read too much into it -- Although PARC also later argued "bad lawyering" and tried to sue Apple over their deal.

          The vast majority of the "Look and Feel" of a GUI system is not 'intellectual property', and therefore was not Apple's to sell anyway.
          • Well, the post was a little joke, so don't read too much into it

            Yeah, I figgered that.

            The vast majority of the "Look and Feel" of a GUI system is not 'intellectual property', and therefore was not Apple's to sell anyway.

            Not sure if this is a value judgement or a legal one. The look and feel claim failed because of the license, according to the gist of the judge's decision. In other words, Apple might have had an IP claim, the judge said, but the poorly-written license trumped it. I have no doubt t

    • it means they didn't invent it. they bought it at the xerox flea market.

      rip-off can usually mean two things :

      1: it was stolen

      2: it was a blatant copy (as if that's a bad thing)

      but in this case it also means that it wasn't a home grown project.

      people really have to get off the ridiculous bandwagon that copying is bad.

      it is THE way that the human species has lived for millennia: they copy/learn/share info/data/ideas.
        people really have to get off the ridiculous bandwagon that copying is bad.

        There are a few of us who read /. for the articles (you know, as with Playboy), but don't buy into the whole "IP is bad" thing. The US Constitution is clear on mandating Congress to pass laws to protect art and science:

        To promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries;

        Some of us still believe in reasonable

    • Licensing something doesn't give you the moral right to claim credit for its invention. Do a Google search for "xerox parc" or "engelbart" or "xerox alto" on apple.com; the lack of attribution and credit given by Apple to these pioneers is embarrassing. Unfortunately, that kind of arrogance and lack of credit continues to this day; the claims accompanying Tiger are more of the same.
    • by john82 ( 68332 ) on Saturday May 07, 2005 @08:01PM (#12465119)
      How about, "Apple bought some ideas from Xerox for millions in cash and stock?"

      Bzzt. Still wrong.

      The accurate statement would be that some things started at Xerox, some at Apple, and some occurred simultaneously. One common thread is the late Jef Raskin. Raskin's computer science thesis (in 1967) took the position that computer interfaces should be graphical. Note that would be some 17 YEARS before the Mac appeared and some time before much of the work at Palo Alto.

      As a professor at UCSD, Raskin was a visiting scholar at PARC where one would expect there was a bit of a mutual admiration society. Raskin and the folks at PARC were on the same wavelength. To be fair, one might wonder if the work at PARC may have owed something both to Raskin's thesis, and also to his occasional presence in the labs.

      Following his move to Apple, Raskin apparently curtailed his visits to PARC. But it was Raskin who got Jobs interested in things UI and the work at PARC. For more on this, check out this article [vwh.net] by Raskin.
    • "How about, "Apple bought some ideas from Xerox for millions in cash and stock?""

      This is a common myth on Slashdot. Xerox never licensed their GUI technology to Apple. That's why they sued Apple for violation of copyright.

      As I've described before, it was common in those days for a company to sue another over the "look and feel" of the product. The theory was that copying the behavior of a product violated the copyright.

      This was exactly the same argument Apple made when trying to sue MS. Neither Xerox nor
  • by mcc ( 14761 ) <amcclure@purdue.edu> on Saturday May 07, 2005 @05:40PM (#12464403) Homepage
    They go directly from Smalltalk/PARC to Apple/LISA as if nothing happened in between. There actually were a decent number of GUI/windowing systems in the late 70s / early 80s, and a number of pre-X attempts at making a UNIX GUI, that time has totally forgotten. PERQ [greenend.org.uk] is the only one I can seem to find evidence of the existence of on Google offhand. If you can somehow find a copy of the book containing this history of GUIs written in 1986 [acm.org], it's rather fascinating...
    • by sydb ( 176695 ) <michael@Nospam.wd21.co.uk> on Saturday May 07, 2005 @07:22PM (#12464931)
      I used to have a PERQ II. They were made by ICL. It was a washing machine sized brown box; it was heavier than a washing machine though. The screen was a remarkably clear black & white portrait job. It ran Unix/X and it came with a copy of the Bell Labs manuals. I believe it came from a local university via a couple of friends.

      The "mouse" was a "puck" - no ball; it was used with a tablet (like a Wacom). The puck had a bit of transparent plastic at the top with cross-hairs - I presume so you could trace out a drawing. IIRC the buttons were different colours.

      This was my first exposure to Unix and I loved it. My biggest regret, other than falling in love with the wrong woman, was taking this to the dump six years ago because I had no room for it. Now I have lots of room :-(
  • Oh shit (Score:5, Funny)

    by boomgopher ( 627124 ) on Saturday May 07, 2005 @05:45PM (#12464419) Journal
    Don't show the Gnome devs this:

    Alto File Manager [arstechnica.com]

    it might end up being the next version of Nautilus...
    • Re:Oh shit (Score:2, Insightful)

      by Anonymous Coward
      Anything that gets them to change the current file manager is a good thing.
  • Thank you (Score:3, Interesting)

    by sahrss ( 565657 ) on Saturday May 07, 2005 @05:48PM (#12464433)
    A "thank you" goes out to the author of this article, from me. At my college, we have two different versions of OS history: the one where Windows was the first real OS, and the one where Linux is the newbie version of the first real OS, which is really UNIX. (Note: this was sarcasm.)

    Those are the two versions our Win32/UNIX teachers preach. Neither bothers to look at the facts or the history of any other OS. *growl*

    I'll be showing this to some of them who aren't totally hopeless.
  • by Anonymous Coward on Saturday May 07, 2005 @05:48PM (#12464436)
    GUI screenshots.
    http://www.aci.com.pl/mwichary/guidebook/interfaces [aci.com.pl]

    Engelbart's famous 1968 demo.
    http://sloan.stanford.edu/MouseSite/1968Demo.html [stanford.edu]

    Acorn Archimedes GUI
    http://homepage.tinet.ie/~lrtc/computers/acorn_ro/acorn/ [tinet.ie]

    http://www.bbc.co.uk/h2g2/guide/A225785 [bbc.co.uk]

    Knowledge Navigator.
    http://en.wikipedia.org/wiki/Knowledge_navigator [wikipedia.org]

    Apple II GS
    http://applemuseum.bott.org/sections/computers/IIgs.html [bott.org]

    BeBox
    http://www.bebox.nu/history.php [bebox.nu]

    8-1/2: The Plan 9 Window system
    http://plan9.bell-labs.com/sys/doc/8%BD/8%BD.pdf [bell-labs.com]

    Genera
    http://www.geocities.com/mparker762/toys.html [geocities.com]

    Video Interviews of Early Pioneers
    http://www.invisiblerevolution.net/ [invisiblerevolution.net]

    GUI News
    http://interfacelift.com/news/ [interfacelift.com]

    ZUI's
    http://www.cs.umd.edu/hcil/piccolo/applications/index.shtml [umd.edu]

  • I had forgotten how impressive Smalltalk was, and still is. For more on the history of Smalltalk you can go here [smalltalk.org].
  • Hoorah! (Score:2, Insightful)

    We now have a simple place to point the trolls who insist that Xerox invented the mouse, NeXT invented the dock, and so on.

    It's actually worth it to RTFA, even if (as pointed out) it's not textbook-thorough...

    DN
  • next generation guis (Score:2, Informative)

    by zazelite ( 870533 )
    ... will rely on a rock-solid & well-established foundation of 3D rendering techniques whose relative usage of system resources is at or below that of the rock-solid & well-established foundation of 2D rendering techniques used by today's GUIs.

    Sound familiar? This is what Microsoft (among others) is working on. Exploiting the raw processing power of GPUs to create the GUI means the performance hit is minimal for applications, letting them become prettier without getting noticeably slower. Let's

    • Argh... I know this sounds Apple fanboy-ish, and I know how unpopular that is in moderation =p, but while many others are working on this... Apple has it now. It's called Quartz Extreme (catchy, I know =/).


      Maybe it's too soon to tell where they (among others) are going to take it, but the signs from them (and MS) point toward Nowheresville. It just speeds up the same old gui.

    • ... will rely on a rock-solid & well-established foundation of 3D rendering techniques whose relative usage of system resources is at or below that of the rock-solid & well-established foundation of 2D rendering techniques used by today's GUIs.
      Sound familiar?


      Why yes. Yes it does. I'm typing on it right now, it's called Tiger.

      I'm not sure what the Next Generation GUI looks like, Apple hasn't announced what the next OS release will hold yet. Hopefully Mail.app uses it though and moves beyond its
  • I did not read the article, but having lived through it and watched the technology develop, let me say that it is difficult to say what happened and who did what.

    What is true is that MS helped move the market from IBM and mainframes and minis to microcomputers. The computers were not advanced. The CLI sucked when compared to what one could do on a mainframe: command completion, complex pipes, etc. MS-DOS and Apple DOS and ProDOS, even CP/M, were terribly inadequate. For anything but spreadsheets and wr

  • by rmallico ( 831443 ) on Saturday May 07, 2005 @06:28PM (#12464674) Homepage
    http://xanadu.com.au/ted/ [xanadu.com.au] This guy is one of those folks who happened to have gone through the growing pains of the GUI, hypertext world as it came to be (at least from its inception to its current state). I could say he is my crazy uncle, but... 1. he is definitely not crazy 2. he and my aunt just won't marry (but I still think of him as one helluva uncle) 3. he's pretty cool.
  • It's a nice historical piece on GUIs, but it lacks many of the drivers behind things: vector graphics and math co-processors; how CPU and bus design influences graphics; multi-tasking/multi-threading and its impact on GUI design; the advent of the (awful) browser; the Unix schisms and X developments; other window managers and their designs; how the awfulness of small, low-res monitors impacted GUI design; memory mapping, bit mapping vs. drawing real lines in the kinescope fashion.

    Also glossed: Desqview, RHM, and

  • This article omits rather a lot, in my opinion.

    Showing an early KDE screenshot in no way represents the 10 years of X history that came before that, nor does it speak to the competitive features of early X Windows software, such as network transparency.

    Where were the 8 major versions of the Mac OS that appeared between System 1 and OS X?

    The article doesn't even touch on Windows XP, except to show it on the final timeline!
  • One thing I would have added in there was an entry on Enlightenment - back in the day, it was the first window manager that allowed almost complete customization and theming. I'd say it's probably behind the drive to 'prettify' GUIs that has only become popular in the commercial world in the past few years (OS X, XP (to an extent, anyway), other window managers for X). I can still remember the first time I saw someone running E - it blew my mind.

    Also, did anyone else notice that the one entry for X was li
  • by Anonymous Coward
    Pretty feeble description of X-windows-- no mention of the fact that X is network aware (or network transparent); i.e. X clients do not need to be running on the same machine as the X server they display to. This was a revolutionary idea at the time and allowed for the X terminal thin-client architecture. This is STILL a pretty radical idea for folks in the Windows world...

    Maybe VNC also deserves a mention?
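The network transparency described above is easy to demonstrate from a terminal. A rough sketch, with placeholder hostnames, assuming an X server is reachable and has granted access (via xhost/xauth):

```shell
# Run an X client on the remote machine but draw its window on your
# local X server. "mydesktop:0" is a placeholder display address, and
# this assumes the server accepts remote connections.
DISPLAY=mydesktop:0 xclock &

# The more common route today: let ssh tunnel the X protocol for you.
ssh -X user@remotehost xclock
```

Either way, the program runs on one machine and its GUI appears on another - the separation of X client and X server the comment is referring to.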
  • It is correct to say that Xerox did not invent the GUI. But the article seems to use that as some kind of exoneration of Apple, and it isn't. The researchers at Xerox made enormous contributions, both to the user-visible aspects of GUIs and to the underlying technologies (OOP, design patterns, etc.). In contrast, the developers at Apple made some moderate, practical improvements to the user-visible aspects of the GUI (although, ironically, in OS X, they are actually picking up more and more
    • by SuperKendall ( 25149 ) * on Saturday May 07, 2005 @07:46PM (#12465060)
      And little has changed. The poor foundations of the original MacOS haunted Apple until they finally had to throw out MacOS and start over again with OS X. And what do they do? They base it on NeXT and Objective-C, a system that was pretty nice in the 1980's, but that has never been technologically cutting edge and is pretty much obsolete today as far as software technologies go.

      Now Objective-C I'll grant is a bit of a mixed bag - primarily because of the lack of garbage collection, though autorelease pools are not too bad...

      But the NeXT foundation and Objective-C together are actually very pertinent to the world we live in today. The very heavy message-passing style of calls actually mirrors the growing popularity of message passing in large enterprise systems, such as JMS.

      Objective-C is actually where the industry should have gone instead of C++. It's easier to learn and use than C++ (I've done both) and might be a little behind Java or C#, but then again it's also not really been overhauled for a while.

      The rapid progress Apple has managed to make in the OS and in other programs is a good demonstration of how productive Objective-C can be.
      • But the NeXT foundation and Objective-C together are actually very pertient to the world we live in today. [...] Objective-C is actually where the industry should have gone instead of C++.

        Objective-C is indeed better for GUI and application programming than C++. It would have been great if people had adopted it in the 1980's, because it might have allowed C programmers to find out about, and transition to, better approaches to programming. But even in the 1980's, Objective-C was not state of the art; it
        • by SuperKendall ( 25149 ) * on Sunday May 08, 2005 @01:19AM (#12466382)
          If you give, say, the iTunes design to a Cocoa programmer and a programmer using, say, Java/Eclipse or VisualBasic programmer, you'll probably find that the Cocoa programmer will take longer to implement it.

          Actually, having used the tools for all those languages, I do not think that's an accurate statement. I've built a lot of Java GUI apps, just a few Visual Studio ones, and have only done a bit of Xcode so far - but I really feel that, once you're up to speed, Xcode is probably the best GUI design app around. Again, the message-passing nature of the language underneath really helps, since you're basically building a stub GUI that you then flesh out the code behind.

          The thing that makes Apple apps really good is that their developers don't have to go to heroics to design nice interfaces; the tools lend themselves to easy and rapid GUI refinement.
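The "message-passing style" discussed in this subthread can be sketched in a few lines. This is Python rather than Objective-C, purely for illustration; `send` and `Receiver` are made-up names, and `__getattr__` stands in loosely for Objective-C's forwardInvocation: (or Smalltalk's doesNotUnderstand:) - the receiver decides at runtime how to handle a message, even one it has no method for.

```python
class Receiver:
    def greet(self, name):
        return f"hello, {name}"

    # Fallback for unrecognized messages -- loosely analogous to
    # Objective-C's forwardInvocation: / Smalltalk's doesNotUnderstand:.
    # Python only calls __getattr__ when normal lookup fails.
    def __getattr__(self, selector):
        def handler(*args):
            return f"no handler for '{selector}'"
        return handler

def send(obj, selector, *args):
    """Dispatch a 'message' by name at runtime, Smalltalk-style."""
    return getattr(obj, selector)(*args)

r = Receiver()
print(send(r, "greet", "world"))  # -> hello, world
print(send(r, "dance"))           # -> no handler for 'dance'
```

The point is that the selector is just data resolved at runtime, which is what makes this style feel so close to enterprise message queues like JMS.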
  • Smalltalk GUI (Score:2, Interesting)

    by ipoverscsi ( 523760 )
    I had no idea that Smalltalk implemented the GUI on the Alto, and all I can say is "wow". That has got to be the most powerful programming concept ever. You've got all the introspection and application-hooking capabilities you can imagine to customize every feature of every application, including the window manager! Of course there was probably no memory protection nor access controls, making it totally useless for today's desktop. But, damn! it would be cool to play with.
  • by The Lion of Comarre ( 799695 ) on Saturday May 07, 2005 @07:46PM (#12465061)
    From the article:
    Smalltalk was the world's first object-oriented programming language, where program code and data could be encapsulated into single units called objects that could then be reused by other programs without having to know the details of the object's implementation.

    http://en.wikipedia.org/wiki/Object-oriented_programming [wikipedia.org]
    History
    ...
    The first object-oriented programming language was Simula 67, a language designed for making simulations, created by Ole-Johan Dahl and Kristen Nygaard of the Norwegian Computing Centre in Oslo. (Reportedly, the story is that they were working on ship simulations, and were confounded by the combinatorial explosion of how the different attributes from different ships could affect one another. The idea occurred to group the different types of ships into different classes of objects, each class of objects being responsible for defining its own data and behavior.)

    http://en.wikipedia.org/wiki/Simula [wikipedia.org]
    Simula introduced the object-oriented programming paradigm and thus can be considered the first object-oriented programming language and a predecessor to Smalltalk, C++, Java, and all modern class-based object-oriented languages.
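The encapsulation idea described above - bundling data with the behavior that operates on it, so callers reuse a class without knowing its internals - can be sketched in a few lines. Python is used here purely for illustration (Simula and Smalltalk expressed the same idea in their own syntax), and the ship example echoes the Norwegian Computing Centre story; all names are made up.

```python
class Ship:
    """Each class of ship owns its own data and behavior (the Simula insight)."""

    def __init__(self, name, speed_knots):
        self._name = name        # internal state, hidden behind methods
        self._speed = speed_knots

    def eta_hours(self, distance_nm):
        """Callers reuse this behavior without knowing how it is implemented."""
        return distance_nm / self._speed

cargo = Ship("MV Example", 15)
print(cargo.eta_hours(45))  # -> 3.0
```

Rather than a combinatorial explosion of cross-cutting attribute code, each object answers for itself - exactly the design move the ship-simulation anecdote describes.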
  • X has changed a lot from the early versions to the modern ones, and there have been numerous Window managers and operating environments over the years. I would have liked to have seen more discussion of that.

    Also, what about Sun's aborted alternative, Sunview?

    A discussion of some of the Windows alternative GUIs (Dashboard, etc.) would also have been interesting.

    Bruce
  • The real first GUIs (Score:4, Informative)

    by Animats ( 122034 ) on Saturday May 07, 2005 @09:12PM (#12465445) Homepage
    The first "intelligent graphical user interface" was probably General Railway Signal's NX [nycsubway.org] system, in 1937. Interlocking systems, which prevented setting signals and switches in incorrect ways, predated NX, but NX was the first system that went beyond interlocking to actually helping the user do things. The dispatcher selected a train, and NX would light up all the potential routes the train could take, taking into account all conflicts. The dispatcher could then select a route, and NX would set and lock all the switches and signals for that route, releasing the resources as the train passed. This was the birth of "user-friendly" systems.

    The first computerized system with a GUI was SAGE, [mitre.org] the air defense system. This had CRTs and pointing devices in 1958. The pointing device was a light gun, and it really looked like a gun. This was appropriate, because, in the appropriate modes, pulling the trigger on the light gun could launch a surface to air missile.

    There were a number of graphical CAD systems well before the PARC effort. Sutherland's Sketchpad, in 1963, was the first prototype. The General Motors DAC-1, in 1964, was the first commercial one.

    The PLATO system, a very early computer-based instruction system, was demoed in 1960, but, like most of the other systems of that era, tied up a whole mainframe for one user. PLATO was gradually scaled up - by 1967, there were special plasma flat-panel displays (red only) and time-shared access.

    So by the early 1970s, there were quite a few GUI projects that worked. They just cost too much.

    Getting the cost down took a while. The early minicomputer-based workstations like the Alto were in the $25-50K range. The UNIX workstations of the early 1980s (Sun, Apollo, PERQ) were in that price range. The original Apple Lisa, a good but expensive machine, cost $10K. The original cost-reduced Macintosh was around $2500, and, lacking a hard drive, it really wasn't very useful. Not until the Macintosh was built up to a reasonable hardware level (512K and a hard drive) could you really get any work done with it.

    By then, in the late 1980s, the hardware was finally ready. You could get a megabyte of memory, a bit-mapped display, a reasonable CPU, and a hard drive in a desktop box for under $3K. At which point Microsoft moved into the field.

  • by mveloso ( 325617 ) on Saturday May 07, 2005 @09:16PM (#12465468)
    One incredibly important tidbit is buried in the article: regions.

    "One critical advance from the Lisa team came from an Apple engineer who was not a former PARC employee, but had seen the demonstration of Smalltalk. He thought he had witnessed the Alto's ability to redraw portions of obscured windows when a topmost window was moved: this was called "regions". In fact, the Alto did not have this ability, but merely redrew the entire window when the user selected it. Despite the difficulty of this task, regions were implemented in the Lisa architecture and remain in GUIs to this day."

    That man was Bill Atkinson, and he came up with region drawing code that Apple patented. It's the reason that Apple's GUI was brutally faster than any other GUI out there. What was great about it was that it not only did rectangular regions, it was able to handle arbitrarily complex regions.

    It's worth going over the patent, if you get the chance. It just goes to show that a misunderstanding can have incredibly positive repercussions.
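Atkinson's actual region code handled arbitrarily complex shapes and is far more sophisticated than anything shown here. As a rough sketch of the core idea, though, here is the rectangular case: subtracting an overlapping topmost window from a window yields the strips that must be redrawn when the top window moves. (Python used purely for illustration; the function and representation are made up, not QuickDraw's.)

```python
def subtract(r, cut):
    """Return up to four rectangles covering r minus cut.

    Rectangles are (x1, y1, x2, y2) with x1 < x2 and y1 < y2.
    """
    x1, y1, x2, y2 = r
    cx1, cy1, cx2, cy2 = cut
    # No overlap: the whole rectangle stays visible.
    if cx1 >= x2 or cx2 <= x1 or cy1 >= y2 or cy2 <= y1:
        return [r]
    out = []
    if cy1 > y1: out.append((x1, y1, x2, cy1))    # strip above the cut
    if cy2 < y2: out.append((x1, cy2, x2, y2))    # strip below the cut
    iy1, iy2 = max(y1, cy1), min(y2, cy2)         # vertical overlap band
    if cx1 > x1: out.append((x1, iy1, cx1, iy2))  # strip left of the cut
    if cx2 < x2: out.append((cx2, iy1, x2, iy2))  # strip right of the cut
    return out

# A 100x100 window partly covered by a topmost window: only the exposed
# strips need repainting, instead of redrawing the whole window.
exposed = subtract((0, 0, 100, 100), (50, 50, 150, 150))
print(exposed)  # -> [(0, 0, 100, 50), (0, 50, 50, 100)]
```

Tracking exposed areas this way is what let the Lisa (and later GUIs) repaint only the damaged strips rather than every obscured window, which is where the speed advantage came from.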
  • by cr0sh ( 43134 ) on Saturday May 07, 2005 @10:19PM (#12465708) Homepage
    Something else PARC invented, but which has yet to make a full impact on today's systems, was a concept they called "Tabs, Pads, and Boards".

    Essentially, "Tabs" were little name-tag-like devices a user could wear; via RF signals, other devices near the user could know when and where a user was within the environment and change/react accordingly. Simple things like finding out where a person in an organization was, or turning lights on and off, were one use. More complex tasks were also bandied about, like having the phone nearest the user ring with his personal ringtone, or having whatever computer the user sat down in front of become active with the user's last state (from the last machine he was at?).

    Pads were essentially the same thing as today's PDAs - wireless input/output devices linked to the network via RF links, which could share information with other users. Thus, a scribble could be made and saved, or "beamed" to another user(s), perhaps in a business meeting or such.

    Finally, Boards were something like an "electronic whiteboard" - a user could draw on the board (actually a large-screen display with a light pen) and save the data - or beam it to other users' Pads - or receive scribbles from other users' Pads...

    Ultimately, the idea was a ubiquitous, pervasive, network-aware computer-aided collaboration system for (mainly) office use - possibly with the goal of eliminating paper from the office. With everybody having a Tab, a Pad, and access to Boards, collaboration and meetings could become real brainstorming and learning sessions. All data generated or viewed during the meeting would be electronic, so the idea was that nothing would be lost, misinterpreted, mistyped, or any number of other things that could occur...

    Today, do we have any of this technology? The obvious answer is that we have all of it available, for a price: Tabs are simply RFID tags of a sort (one could even add a real LCD or OLED display to have them show messages as well), Pads are today's wireless PDAs and cellphones, and Boards exist today as (albeit expensive) electronic whiteboards. All the wireless networking systems exist as well, and we also have IP phones and such...

    Basically, we have everything available except an integrated software solution to tie everything together so that it operates as a cohesive whole - and a price point that makes it affordable enough for regular businesses to switch to it from what is currently done (standard whiteboards, flipcharts, maybe a projector and PowerPoint).

    I would be so grateful to have PARC's vision become the norm - the savings in time and hair-pulling alone, from no longer losing information (either purposefully or accidentally) off regular whiteboards because it is all electronic, would be worth it...

  • Ok, so we have a history of GUIs for windowing systems. It's a shame, because there is just so much more GUI out there. I'd like to see more on modern GUIs than highlighting the Start menu and double-buffered windows.
    I want to learn about Tk, Qt, Java GUIs, .NET, MFC, the webapp, the thin client, up to the modern day.
    The benefit being so much more use of the word GUI!
  • by Zobeid ( 314469 ) on Sunday May 08, 2005 @01:09AM (#12466357)
    More than 20 years after the introduction of Macintosh, there are still a lot of people who don't seem to "get" the GUI concept. I suspect a lot of them are Linux programmers.

    My first exposure to a GUI was when I got my Atari 520ST in 1985. I approached it with skepticism - it was a newfangled-looking thing at the time - but I soon became a true believer. The ST didn't have any command line; everything you did on it, you did using a GUI. That applied right down to application programming, which was done using a GUI/WIMP-based text editor, IDE, and visual GUI editor (i.e., resource editor). Likewise, every third-party utility, no matter how technical in nature, came with a GUI. They had to, because that was the only way to do it.

    And you know what I found out? The Atari ST, despite its limitations, was an easy computer to use -- and an easy computer to program.

    It's remarkable today to observe how many programming environments *still* don't come with features like an IDE or visual GUI interface creator, and to ponder the reasons why not.

    Example: Python is hot. . . I'm sure it's OK for simple scripting, but why are people getting so excited about a language that doesn't even come with a good WIMP-based IDE and visual GUI creator? Are we really expected to create applications with this?

    The problem as I see it is that a lot of programmers from the Unix tradition still view "user friendly" computers with contempt. To them, user-friendly means idiot-friendly, and a GUI exists only so that Grandma can launch her web browser without getting confused. They don't program their Unix/Linux boxes using a GUI environment, and it would never occur to them that they should.

    And here's the revelation. . . The GUI wasn't invented for Grandma. It was invented for everybody: office workers, scientists, artists, publishers, musicians, network administrators . . . and yes, programmers. The purpose is to make complicated things easy, not simple things.

    I find one of the most frustrating aspects of Mac OS X is the occasional need to work with utilities from the Unix world (Subversion being a recent example). As long as I stick with Apple-supplied software, everything is easy and natural. As soon as I need to install and configure any program from the Unix/Linux world and am forced to dig "under the hood" of Mac OS X, everything quickly goes to Hell. Sure, I can make it work eventually - after enough tinkering and fiddling and digging around for documentation - but I find myself asking why. Why should I have to put up with this nonsense in the year 2005?

    The problem goes beyond the lack of GUI interfaces for programs coming from the Unix and Linux world. There's also the poor quality and inconsistency of those programs that have a GUI. These are interfaces designed by somebody who doesn't want to use a GUI himself. They're tacked on as an afterthought because "the dumb users" want a GUI, not because the program's designer wants or appreciates a GUI. And can you blame them? The only kind of GUIs they have regular experience with are the desperately *bad* and confusing ones typical of Unix and Linux applications.

Hackers are just a migratory lifeform with a tropism for computers.
