Do You Remember Bob? 315

GdoL writes: "Do you remember Bob? Byte's editor starts his monthly column talking about Bob, the OS interface from Microsoft in the mid-1990s. Nor did he forget the other Bob: the programming language from David Betz, a former technical editor of Dr. Dobb's Journal. This OO language is widely used on 'DVD players and set-top boxes produced by the likes of Toshiba, Samsung, and Motorola.' Do you remember any other long-forgotten language that is still used in the real world?"
This discussion has been archived. No new comments can be posted.

  • C? (Score:2, Funny)

    by SwingGeek ( 85187 )
    How about C? I hear some people still use it where VB and JavaScript won't work.

    Heh.
  • If I remember correctly, BOB had a "feature" that let you set a new password if, after three login attempts, you still hadn't given the correct one.
    • Microsoft claims that they have been really innovative, but most of their technical stuff has been "inspired" by or invented by someone else, from DOS to Windows, etc.

      However, BOB was different. BOB WAS innovative. This shows one important lesson, folks: as much as MS talks about "innovation," innovation by itself is meaningless. When a problem arises we can resolve it, but not every product needs to be technically innovative.
  • What about BASIC? It is still alive and well, and although it's no longer mainstream, many schools still use BASIC and LOGO to teach introductory programming courses.
    • Those who can't do, teach. A lot of silly things pass for education (those who can't teach, teach gym).
  • BOB UI and WinXP (Score:5, Informative)

    by mESSDan ( 302670 ) on Sunday November 18, 2001 @09:40AM (#2580901) Homepage
    While reading the first article, I was struck by something strange:

    In the picture of the Bob UI, it shows a little dog who has a caption bubble coming from his mouth. Well, in WinXP if you do a file search (hit F3), you'll see an almost identical dog.
    Maybe Microsoft thought that Bob was ahead of its time?
    Anyway, it's strange.
    • The dog looks French, doesn't he? Think Spirou...
    • If I'm remembering this right, the little "Bob Assistants" in MS Bob were their first shot at those things.

      Clippy et al. are the only refugees left from that experiment.

      *shudder*

      But, I suppose, you have to give them some credit for trying a different OS interface. Even if it did suck in all ways...

      • Re:Clippy (Score:3, Interesting)

        by Jay L ( 74152 )
        Bob wasn't just a random experiment. Read _The Media Equation_, by Reeves and Nass. A very readable account of psychological experiments that show how people respond to computers, and to technology in general. Much of this research went into Bob, and later the Office Assistant. I'd love to know what went wrong.

        Some examples: People watching a news program on a TV *set* labeled "news television" will rate that program as more informative and authoritative than those watching the same program on a TV labeled "general-purpose television". People using a computer program that praises another computer program will rate it smarter than a program that criticizes another program. Larger pictures will be better remembered and better liked than smaller pictures. People will rate a speaking tutorial program more honestly if the rating program uses a different voice!

        Fascinating stuff.
        • by n8willis ( 54297 ) on Sunday November 18, 2001 @05:17PM (#2581836) Homepage Journal
          Lord, no! I implore you! Stay AWAY from The Media Equation! This is one of the worst pieces of drivel ever pawned off on an unsuspecting public. Please allow me to elaborate.

          I had to read this book for a graduate Mass Comm class two years ago, and it is without a doubt the most awful excuse for experimental science that I have ever seen. Unfortunately for the authors, I had taken a class in research methods before I encountered their book (they should consider doing the same).

          It's shoddy science, through and through. They ignore intervening variables, run every experiment without controls, provide no accounting for intercoder reliability, the samples are always too small to be statistically significant (only one had more than 30 participants; many had fewer than ten) and are composed of forced participants (Reeves' and Nass's freshman psych students at Stanford, to be precise. Even without a grade on the line, that's a bad sample). Usually, they rely on reported rather than observed behavior, and the only operating hypothesis ever examined is their goofy "equation" (you want me to spoil the beginning of the book for you? Here is The Media Equation: Media Equals Real Life. That's it. Word for word).

          As if that wasn't enough, they make constant generalizations of their results (which, with forced, nonrandom sampling, is the first thing thrown out the window). They grandstand at every turn -- everything supports the Media Equation, and there is nothing it doesn't affect. You should always be suspicious when "scientists" do thirty experiments and always find their hypothesis supported exactly as they predicted. It's usually bunk, and in this case, it's a pantload. In one instance, they even admit to writing the hypothesis AFTER the experiment was performed. They make repeated references to other "research" in this area, but if you read through the bibliography, they are merely citing *themselves* from previous experiments. Many of these experiments, if you were interested, have still not been accepted for publication, many years after they were done. Most are not even available at the authors' own Web sites. If it weren't for the fact that The Media Equation was published by Reeves and Nass's employer, I doubt whether they could've gotten it published at all.

          It's bad science, and it's only an afterthought: they plainly thought up their "equation" first, and then set out to prove it. That's the Scientific Method in reverse, people.

          Let me give you one quick example in reference to the poster above: in the larger/smaller pictures experiment mentioned above, they show participants photographs of people's faces, some in close-up and others standing 10 yards away. And the "test" is showing the subject another photo of the same people, and seeing which person they recognize most often. Guess what: it's the person whose face they saw in close-up. Surprised? You shouldn't be. No thinking adult would be; you see the face close up, so you see more detail, and see it better. Plain and simple. But that's not the conclusion Reeves and Nass come up with; they decide instead that this turn of events means that you are having a psychological reaction to the face, and the bigger face makes you happier because it seems more like a person. So if you think it's a person, you will remember it better.

          That's the media equation, you see? The more person-like an electronic communication is, the better it works. The only time this has been tried under real-world circumstances, of course, is their grand experiment: Microsoft Bob. Funny how well that went over, huh.

          Man, I hate that book.

          Nate

          PS - you might try the following link to Amazon, I submitted the review under "n8willis": here [amazon.com].

          PPS - if anybody cares, I'll follow up what I say by emailing them the paper I had to write about this ridiculously bad book; it goes into more detail. Or perhaps I'll submit it as a /. book review. I meant to at the time, but it was just way too long....

  • I know some colleges still teach Pascal in their "Intro to Programming" classes. I know that's what I learned on.
    • And Delphi (which is based on Pascal) from Borland is still alive and well. I haven't heard much about it, but I very frequently run across programs written in it.
    • Re:Pascal? (Score:1, Troll)

      by tmark ( 230091 )
      The University of Toronto used to teach their Comp Sci students a language called Turing. I can't recall what the rationale was, but no doubt they believed it was well suited for teaching programming. I am not sure if it is still taught today.

      I remember at the time (over 10 years ago) thinking this was the most ill-advised thing the department could have done. I thought it was absolutely pointless to teach kids a language that was not also in wide use outside the classroom - such as C. To this day, I have never seen any jobs for Turing programmers, and I laugh out loud every time I see a resume where the only language the applicant knows is Turing, which still happens.
  • I must admit I forgot Byte even existed. Since the paper edition stopped being printed I haven't really bothered checking the on-line edition.

    I know this is off-topic, but I have too many karma points and I just need to air my view ;)
    • Since the paper edition stopped being printed

      There have been rumors that the print edition may be returning, based on passing comments by certain columnists in their web journals. But nothing tangible yet.

  • Modula-2 (Score:3, Interesting)

    by Space Coyote ( 413320 ) on Sunday November 18, 2001 @09:54AM (#2580923) Homepage
    I worked at a nuclear power plant for a while working on the plant monitoring systems. All the PC-based stuff was written for OS/2 using Modula-2. Anybody ever use Modula-2? Anyone ever use it outside of a first year CS class? Turns out it's actually a great language for systems programming, at least with the Object Oriented extensions that the version I used came with. It was actually a lot like Delphi. And much nicer to debug than C++.
  • Yes, I remember Bob. I bought a Gateway in the 1994-1995 timeframe and it came with Windows 95 preinstalled and Bob (and a whole bunch of other software) on CD-ROM. I installed Bob just for fun. The next day, I was reinstalling the whole system from scratch and, if I remember right, quadruple-booting 95, NT 3.51, Linux, and OS/2 Warp.
    • That's when my father bought his Gateway too, I remember helping him set it up and getting a kick out of Bob. Totally useless, but the little geography quiz game rocks! We played it for hours.
  • by Billly Gates ( 198444 ) on Sunday November 18, 2001 @10:01AM (#2580937) Journal
    It's rumoured that Linus is one of the animated characters. I would love to hear him explain why XP just crashed.
  • I remember when I was working in a computer shop, my colleague installed Bob on my boss's PC. You can imagine my boss's reaction... it was actually quite a good joke.
  • HTML (Score:1, Funny)

    by Chas ( 5144 )
    Look at the crap that passes for it on some of these "you must use BrowserXYZ to view this site" sites.

    ;)
  • I remember Bob as being the smart driver who doesn't drink...
    In Belgium, it's the guy who volunteers at a party to drive all the others home, while drinking soft drinks...
  • DIL16 (Score:3, Interesting)

    by geophile ( 16995 ) <jao@NOspAM.geophile.com> on Sunday November 18, 2001 @10:15AM (#2580950) Homepage
    I don't know if it's still in use, but it sure was odd. DataSaab was a division of Saab, and they had their own hardware and system software.

    DIL16 was DataSaab Interpretive Language for their 16-bit minis. Looked like assembler but had no registers, and yes, it was interpreted. Completely bizarre. I used it to work on a teller system at Citibank in the mid 70s.

    Any other DIL16 programmers out there?
  • I'm currently consulting for Merrill Lynch in Jacksonville, FL, and I can say it has been an eye-opener! We transitioned 49 PC-based insurance apps from Springfield, MA, down to Jacksonville this past summer, and you wouldn't believe the mix of languages there. By the way, I'm talking about over 960,000 lines of code in these things. The predominant language was Clipper, versions Summer '87, 5.0, and 5.01a. There are also a couple of RBase apps, one written in something called ArevDos, a generous smattering of C modules (mainly linked into the Clipper apps for faster calculations), a couple of compiled BASIC apps, some C++, Visual Basic, and some PowerBuilder. From what I've been able to find out, the insurance industry is chock-full of applications written in "obsolete" languages.

    Now for my real question, how do I write this into my resume and make it look good? ;-)
  • SNOBOL4 (Score:5, Informative)

    by geophile ( 16995 ) <jao@NOspAM.geophile.com> on Sunday November 18, 2001 @10:19AM (#2580956) Homepage
    The greatest string processing language of all time. Blows away Perl. In SNOBOL4, the space was (is!) an unbelievably powerful pattern-matching operator. A single match could break apart a string and assign variables with pieces of it. A single statement could succeed or fail, and there were up to two transfers of control at the end, one for success and one for failure.

    SNOBOL4 was completely flexible on type, (e.g. you could do "5" + 3); had dynamic memory allocation and garbage collection; had the ability to evaluate dynamically generated SNOBOL4 ... the list goes on.

    It's probably still in use, and it was bizarre and wonderful. I also have fond memories of two compiler courses taught by RBK Dewar, one of the implementers of the Spitbol implementation of SNOBOL4.
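    A rough Python analogue (purely illustrative; real SNOBOL4 syntax is far terser, and the date format here is an invented example) of the idea above: one match breaks a string apart, binds the pieces to variables, and control then branches on success or failure.

```python
import re

# Sketch of a SNOBOL4-style match in Python: decompose a string,
# bind the pieces to names, then branch on success or failure.
def match_date(line):
    m = re.match(r"(?P<y>\d{4})-(?P<mo>\d{2})-(?P<d>\d{2})$", line)
    if m:  # the "success" transfer of control
        return m.group("y"), m.group("mo"), m.group("d")
    return None  # the "failure" transfer of control

assert match_date("2001-11-18") == ("2001", "11", "18")
assert match_date("not a date") is None
```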
    • SPITBOL 360 (written by Robert Dewar and Kenneth Belcher) has just been released under the GPL.

      However, I prefer to use the results of Robert's more recent programming language activities.
  • Other Language? (Score:4, Offtopic)

    by Foxman98 ( 37487 ) on Sunday November 18, 2001 @10:20AM (#2580958) Homepage
    Hmmmmmm...... how about Latin? Not everything has to be computer related...
  • by weave ( 48069 ) on Sunday November 18, 2001 @10:23AM (#2580960) Journal
    This is still hanging on my office wall...

    UNIX Gurus in Hell [ibiblio.org]

  • I tried to learn Lisp using XLISP (despite having an old book on Lisp for reference), but I failed. Somehow, nothing worked as I expected. I probably didn't know that XLISP was, despite its name, a Scheme dialect.

    Instead I learnt FORTH (using the great F-PC system for PCs), and returned to Lisp later when I encountered Emacs 19.
    • That rather depends on which version of XLISP you're talking about. XLISP up through 2.x was in fact a Lisp dialect, and it branched off into XLISP-Plus; however, the main branch turned Schemey at 3.0.

      /Brian
      • Hmm, I think I used 2.0 or something like that. Anyway, the defuns I encountered in the book from the library (yes, in Germany, small suburban libraries have books on Lisp!) didn't work, and I was frustrated because there was so much interesting stuff in the book (natural language processing, for example).
  • by mtm ( 10808 )
    Perhaps not as forgotten as some, but it is included with every Sun and most modern Macs in the form of OpenFirmware [openfirmware.org].

    I used to have a blast messing around with FORTH on my old Kaypro II, Vic 20, Apple II, CoCo, Amiga, etc. A really fun language to hack around in.

    • I still have warm feelings for forth. I remember the first time I got acquainted with forth. It was some 3d framework called graforth.

      I was quite impressed with its counter-intuitive reverse-polish-notation syntax:

      c a b + = if then

      Isn't it much more stylish than writing:
      if (a+b==c) {} ?
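      The stack discipline behind that postfix phrase can be sketched in a few lines of Python (the word set here is a toy for illustration, not graforth's actual vocabulary):

```python
# Minimal postfix (RPN) evaluator: tokens are processed left to
# right against a stack, so "a b + c =" computes (a + b) == c.
def rpn(expr, env):
    stack = []
    for tok in expr.split():
        if tok == "+":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif tok == "=":
            b, a = stack.pop(), stack.pop()
            stack.append(a == b)
        else:
            stack.append(env[tok])  # push a variable's value
    return stack.pop()

assert rpn("a b + c =", {"a": 2, "b": 3, "c": 5}) is True
```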
  • PL/M for the 8085 and 80x86 is still being used by Thermo Jarrell Ash in some of their legacy analytical instruments as the language the firmware was written in. What a beast that was...no floating point, no pointers... ugh.

    We used ICON to whip up a driver program for the CNC milling machine that the on-site machine shop had in order to aid conversion from their paper tape (!) library.
  • Subgenius (Score:5, Funny)

    by nickovs ( 115935 ) on Sunday November 18, 2001 @10:40AM (#2580993)
    Bob is a language. Bob is an OS interface. Bob is everything. You would know this if you had joined the Church of the Subgenius [subgenius.com].

    May Bob be with you.

  • by kptBlaha ( 522498 )
    Real programmers still use Fortran and Cobol. For someone who reads tons of articles about Java and Haskell and who considers C++ obsolete, it may be very surprising that a large part of scientific numerical computation is still done in Fortran. Do you remember the demand for Cobol programmers in 1999 (Y2K)? Critical systems still use Cobol.
    • There are some libraries, such as LAPACK, BLAS, LINPACK, EISPACK, etc, that are still widely used. They are well written, complete, and, most important, well debugged.

      If you look into Octave source code you will find those FORTRAN libraries there. Since they are public domain, Matlab and other commercial number-crunching software probably use them as well.

      They are still coming up with new FORTRAN versions, I believe FORTRAN 2000 is the latest. Someone once said that we don't know which language people will use for numerical analysis in the year 2050, but we know what its name will be: FORTRAN.
  • assembly! (Score:2, Insightful)

    by Simm0 ( 236060 )
    Although it's an oldie, assembly is still a goodie (IMHO, at least). I still use assembly in code that the compiler (gcc) doesn't handle well, to shave that extra ms off the app. The only problem is that it isn't cross-platform/architecture at all.

    That is, if you regard assembly as a programming language :P
  • Bob the language (Score:5, Informative)

    by p3d0 ( 42270 ) on Sunday November 18, 2001 @10:54AM (#2581023)
    Just in case anyone is wondering...

    It's hard to find any documentation for the Bob language. Having a quick look at some Bob source code, it is a simple OO language without classes, where subclassing is the same as instantiation, much like Self or Cecil. It seems to support only single inheritance, though I gather it's dynamically typed, so there's no need for "interface inheritance".

    It's not "purely" object-oriented, since you can define procedures that are not methods of any class. At first glance, there doesn't seem to be any access control: all features of an object are public.
    • Re:Bob the language (Score:3, Informative)

      by BlueGecko ( 109058 )

      (Disclaimer: I'm not an expert in prototype-based languages, but I'm fairly confident that the description below is quite accurate. Corrections are, therefore, very welcome.)

      Languages such as this are called prototype-based languages, and are generally seen as the successor to object-oriented programming. However, no prototype-based language has, to my knowledge, actually gone anywhere (the one that got closest was NewtonScript, which would have made it if it weren't for the fact that the only place to use it was on the Apple Newton), so it's nice to see that at least one actually made some progress towards general acceptance.

      For those unfamiliar with the concept of prototype-based programming languages, Bob (and all prototype-based languages, for that matter) are by their very nature single-inheritance. The general idea is to eliminate the whole idea of classes, and instead treat everything as an already existing object. You then modify those objects as necessary, and, if it's an object which is handy, you just make lots of copies of it. I find it much clearer to use the word "copy" instead of "instantiate," as you did, for this reason. For example, define anObject = new Foo() makes a new copy of the already existing Foo object, which will have all of the same values, etc. as Foo. You can then modify that copy by adding new values as necessary.

      The reason I make this distinction is that one of the powerful things you can do in prototype-based languages is give an instance of a class a new function. For an example of when you'd want to do this, let's say you have a Canvas object in a GUI. Now, you, at some point, are going to need a screen, and the screen is going to need some variables and methods you don't need for a standard Canvas (for example, Screen.refreshRate or Screen.setColorDepth()). The normal way to do this in a regular OO language would probably be to declare a subclass of Canvas that had the extra functionality, and then to make a single instance of it, probably called theScreen. This is awkward, however, because, 99.9% of the time, you really only want one screen object, so making a subclass just for a single instance seems odd. In a prototype-based language, however, you'd simply "copy" a new instance of a Canvas (called theScreen) and add your extra methods and functionality specifically to that object. Ultimately decide that you really do need multiple screens? No problem! Probably you'll want to add a monitorNumber attribute directly to the already existing screen object, and then make a copy of that. Similar functionality is also present in Dylan [gwydiondylan.org], and, by extension, probably LISP's CLOS, although I'm honestly just not familiar enough to know.
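      The Canvas/theScreen scenario can be mimicked in Python (a hedged sketch: Python is class-based, so a tiny helper stands in for true prototype cloning, and all the names here are illustrative, not Bob's actual API):

```python
import copy

# Prototype-style objects: no subclassing, just copy an existing
# object and attach new slots directly to the one copy.
class Proto:  # scaffolding only; stands in for "everything is an object"
    def clone(self, **slots):
        new = copy.copy(self)  # shallow copy of the prototype's slots
        for name, value in slots.items():
            setattr(new, name, value)
        return new

Canvas = Proto()
Canvas.width, Canvas.height = 640, 480

# "Copy" the prototype and extend just this one object:
theScreen = Canvas.clone(refresh_rate=60)

assert (theScreen.width, theScreen.refresh_rate) == (640, 60)
assert not hasattr(Canvas, "refresh_rate")  # the prototype is untouched
```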

      • Er... one prototype-based language has gotten pretty far. You're very likely running it, given it's built into pretty much every GUI web browser on the planet. It's called JavaScript by some, or ECMAScript by the rest of us, and is actually quite a nice little language.
      • Re:Bob the language (Score:5, Interesting)

        by p3d0 ( 42270 ) on Sunday November 18, 2001 @01:53PM (#2581348)
        Good description, but...
        Languages such as this are called prototype-based languages, and are generally seen as the successor to object-oriented programming.
        "Generally seen" by whom? Perhaps I'm out of touch, but I have never heard this before. Is this really a common belief?

        Prototype-based languages have an interesting minimalistic feel that allows language designers to get right to the heart of some issues, and I think they are very valuable for that reason. However, I think ultimately classes will win out.

        One of the common mistakes in OO programming is to confuse kinds-of-things with instances-of-things. A program for a clothing store might have "shirt" objects with properties such as the manufacturer, size, colour, and quantity in stock. Seems ok so far. Then, when looking for a place to store information on a specific shirt, such as the customer who bought it, the programmer notices that a "shirt" has two meanings: a kind of shirt versus a specific individual shirt. The quantity-in-stock belongs to a shirt type, while the purchasing-customer belongs to a specific shirt instance. The design has to be juggled a bit to accommodate this new insight.
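        The shirt confusion above can be made concrete in a short Python sketch (class and attribute names are illustrative):

```python
# Separating the kind-of-thing from the individual thing: quantity
# in stock belongs to the shirt *type*; the purchasing customer
# belongs to one specific shirt *instance*.
class ShirtType:
    def __init__(self, manufacturer, size, colour, in_stock):
        self.manufacturer = manufacturer
        self.size = size
        self.colour = colour
        self.in_stock = in_stock  # property of the kind

class Shirt:
    def __init__(self, shirt_type):
        self.shirt_type = shirt_type
        self.customer = None  # property of the individual shirt

    def sell(self, customer):
        self.customer = customer
        self.shirt_type.in_stock -= 1

kind = ShirtType("Acme", "M", "blue", 10)
one = Shirt(kind)
one.sell("Alice")
assert kind.in_stock == 9 and one.customer == "Alice"
```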

        Combining classes and objects the way prototype-based languages do seems to encourage this kind of confusion on a massive scale. Programmers would undoubtedly add comments to indicate which objects are actually "meta-objects" (i.e. classes). The Bob language examples do exactly this, as a matter of fact. I suspect that the degree to which they don't differentiate between classes and objects is exactly the degree to which they will have confusion between instances and kinds of things.

        Incidentally, I think there is a great deal of promise in the idea that everything (including classes) is an object, but I don't see prototype-based languages as being a successor to class-based ones.

      • Oh, I forgot one thing:
        For those unfamiliar with the concept of prototype-based programming languages, Bob (and all prototype-based languages, for that matter) are by their very nature single-inheritance.
        I don't think that's true. If I'm not mistaken, both Cecil and Self are prototype-based languages with multiple inheritance.
      • > Languages such as this are called prototype-based languages, and are generally seen as the successor to object-oriented programming. However, no prototype-based language has, to my knowledge, actually gone anywhere

        The guy I knew who wrote one called it an exemplar-based language, which is probably clearer. He didn't necessarily think they were going to take over from class-based OOL, just that it suited his purposes well - the proprietary language within one particular company's product (where I gather it worked very well, but "general acceptance" was never a goal).

        > probably LISP's CLOS

        See "The Art of the Metaobject Protocol". Worth a read even if you never expect to program in LISP, just for exposure to the ideas.
        http://www.amazon.com/exec/obidos/ASIN/0262610744/
    • I've seen the docs -- they're about as minimal as they come. It's no more than a quick reference.

      It does look like a rather interesting language, though; in some ways it seems almost like a predecessor to Java (if that means anything). It's a fairly cleverly done macro language, basically similar to C++ with all the garbage stripped out.

      /Brian
    • According to the XLISP home page [mv.com], the Bob source code was last updated today (oh, ok, maybe in response to the /. article). But at least the author is out there and paying attention...
  • Used for marketing hype, to curse companies into bankruptcy, and to bring false hopes for about six years now, and still in use in my 5 Amigas at home :) Nothing beats an A1200 (unless it's a Dell 8100 laptop with AmigaXL [haage-partner.de] on it :) ).
  • x86.. (Score:2, Funny)

    by wackybrit ( 321117 )
    Yeah, I remember a really crappy obsolete language, called something like 'x86'. To write even the simplest program you had to write about 1000 lines.

    You could barely do anything with a single line of code. Whereas in Perl, you can make the coffee and clean your bedroom in one line, with the obsolete 'x86' you had to pretty much write a bible-worth of code.

    I reckon they should consign x86 to the scrap-heap and make Intel processors run directly on BASIC instead.
  • Think about what Bob was...a layer on top of the OS, that simplified it for novice users, categorized things, made it more "friendly", etc.

    The first version was widely reviled, but the team started working on a second one. Now, it is often true that the third version of a product is the one that catches on -- the first one is rushed out, the second has all the stuff that was supposed to go in the first, and the third can actually respond to user feedback and become useful. But for some reason, Microsoft uncharacteristically cancelled Bob 2.0 in mid-development.

    Now if you imagine Bob continuing to evolve and eventually adding Internet access (still categorized, simplified, friendlier, etc), then it could have become...AOL. People make fun of AOL also (for similar reasons), but it's a pretty successful company and viewed in many ways as the only tech competitor to Microsoft. Now imagine if Microsoft had short-circuited that with Bob 5.0.

    - adam

  • by Christopher Bibbs ( 14 ) on Sunday November 18, 2001 @12:02PM (#2581133) Homepage Journal
    Last time I checked there were very few books being published on it and most new developers have never heard of it. However, several large insurance companies still use an app written in it. It appears they all bought the source code and continue to modify it to keep things up to date.

    As a tip, if you are ever called out to do a consulting gig and the customer mentions "Visual DOS", run like hell.
    • I believe there was only one version of Visual Basic for DOS; before that it was known as QuickBasic, of which QBasic was a cut-down version sans compiler (and, I imagine, quite a bit else).

      Seriously, though, I think all operating systems should come with a version of BASIC, for learners and for people old to the game but new to the OS to play with.
      • QuickBasic (4.5 was the latest version, I think) was a structured BASIC which allowed $INCLUDEs, subroutines, and user-definable data types. As BASICs go, it's not too shabby at all.

        It didn't have the 'please hog all my system resources' attitude that VB has.

        dave
        • No, the last version was QuickBasic PDS (Professional Development System), which implemented quite a few new features (from what I understand) over QB 4.5. After QB 4.5 came VB for DOS 1.0, which was dog slow (i.e., its .EXEs compared to QB 4.5 .EXEs).
  • Don't blame Lisp for the failure that was Microsoft Bob!

    This would be like blaming general relativity for atomic weapons or Thomas Edison for phone sex and the psychic friends network.
  • by jht ( 5006 ) on Sunday November 18, 2001 @12:18PM (#2581167) Homepage Journal
    As we all know, Microsoft is absolutely merciless when it comes to tolerating failure. People get bounced out of the company constantly.

    So does anyone want to guess what happened to the program manager for Bob?

    That's right. Bill Gates married her. Go figure.

    The idea of predictive interfaces was interesting, but Bob had the fatal flaw of being way too complicated for the hardware of the day. Some of the technology lives on in Office's Clippy, but Bob itself was a disaster to the point that even the people who pirated it returned it.
    • As we all know, Microsoft is absolutely merciless when it comes to tolerating failure. People get bounced out of the company constantly.

      So does anyone want to guess what happened to the program manager for Bob?

      That's right. Bill Gates married her. Go figure.

      Actually, if memory serves, it was the other way around. He married her, then let her spec out what she wanted and see it developed.

      Anyone else and it probably wouldn't have seen the light of day.

    • by Maigus ( 118056 ) on Sunday November 18, 2001 @02:08PM (#2581371)
      Not Quite.
      Karen Fries was the driving force behind Bob. Melinda was just part of the PM team associated with it.
      I was on that team as a contract tester. It was my first job after dropping out of college. I'm terribly surprised I still work in the industry after being associated with that disaster. I did come away with some entertaining memories, however.
      The original project codename was "Utopia" (actually, it might have been Utopia Home). I've still got a T-Shirt with the Petie the Parrot character on it and Utopia scrawled across in a kind of abstract architect font.
      When the name Bob was revealed, there was a meeting of all the team members and a bunch of muckety-mucks. This incredibly cliché marketing consulting team was the group which came up with the name. They were all up in front of the room in their black turtlenecks and black plastic-framed glasses.
      When they got through their PowerPoint presentation to the name and the glasses-wearing smiley face icon, the room was deathly still except for Karen Fries' excited squeal and clapping. She looked out over the crowd assembled and started to look cross -- we got the message and started clapping.
      Now I've been involved with more than a couple doomed projects since then (perhaps I'm some sort of CS pariah) but I've never seen a group of people so unhappy and depressed about their work.
      A little while after that, I believe, Melinda got her engagement ring. There was another big party. The joke I always tell about that event is: "It was a pity about her arm..." I'm sure that Bill, being a geek, felt he had to make up for something there and prove to the world that this relatively attractive woman was indeed taken. The rock on her ring was as large as one of my knuckles. There's no way she could put her hand in a tight pocket while wearing that ring. It was one of the most ridiculous and sad things I've ever seen, yet there I was saying, "Wow, that's... great!"
      We knew when we were working on Bob itself that it would be a disaster. At the time, Pentium computers were just coming out in the consumer space. A P90 was required to run Bob with any sort of usability. Most of it was written in VB, back when VB had no chance to rival C in any task and just using the product was painful. These computers were 3-5 thousand dollars, and we expected new computer users to buy them just to use a piece of software even less functional than MS Works?
      The whole thing would have been unbelievable to me if I hadn't lived it myself.
    • Anybody else notice that the 'helpful dog' in the article's screenshot has been resurrected into the default search helper for Windows XP? Now with animation and 3-D rendering!

      Apparently, there are some aspects that Microsoft is determined to keep alive; I guess you gotta have something for people to consider 'cute'.
  • by wstearns ( 5784 ) on Sunday November 18, 2001 @01:43PM (#2581323) Homepage
    For those interested in old languages...
    "The Retrocomputing Museum is dedicated to programs that induce sensations that hover somewhere between nostalgia and nausea -- the freaks, jokes, and fossils of computing history. Our exhibits include many languages, some machine emulators, and a few games.

    Most are living history -- environments that were once important, but are now merely antiques. A few never previously existed except as thought experiments or pranks. Most, we hope, convey the hacker spirit -- and if not that, then at least a hint of what life was like back when programmers were real men and sheep were nervous."

    http://www.tuxedo.org/~esr/retro/ [tuxedo.org]
  • How about REXX? (Score:3, Interesting)

    by Greyfox ( 87712 ) on Sunday November 18, 2001 @02:42PM (#2581431) Homepage Journal
    REXX wasn't just a wonder dog. IBM embedded it in all their OSes and you can still find an object-oriented version of it on their sites somewhere (probably alphaworks.ibm.com has it.) It's a cute little scripting language and the first language I'd ever run across with the nifty built-in stack/queue primitives. Oh, they put it on the Amiga too. These days I prefer Perl over it though, or I'd still be using it.

    And then there's PostScript. PostScript isn't forgotten, but there aren't a whole lot of programmers who know how to use it. It's a rather unwieldy language with a lot of primitives, but it looks a lot like Forth. I preferred it over Forth though, as it struck me as being a lot cleaner. If I were going to use a reverse Polish notation language, it'd be a stripped-down version of PostScript. If anyone wants to learn PostScript, Adobe sells a language reference manual and some tutorials that cover the language very nicely. Ghostscript is all the language interpreter you need. Then you could do cool stuff like make the printer compute and print calendars for you.
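    For anyone who has never touched a reverse Polish notation language, a tiny stack evaluator in Python (purely an illustrative sketch, not PostScript itself; the operator names just mirror PostScript's add/sub/mul) shows the operand-stack model that PostScript shares with Forth:

```python
# Minimal sketch of the operand-stack model shared by PostScript and Forth:
# numbers are pushed onto a stack; operators pop their arguments and push
# the result, so "3 4 add" computes 3 + 4 with no parentheses or precedence.
def rpn(tokens):
    ops = {"add": lambda a, b: a + b,
           "sub": lambda a, b: a - b,
           "mul": lambda a, b: a * b}
    stack = []
    for tok in tokens:
        if tok in ops:
            b, a = stack.pop(), stack.pop()  # top of stack is the second operand
            stack.append(ops[tok](a, b))
        else:
            stack.append(int(tok))
    return stack

print(rpn("3 4 add 2 mul".split()))  # (3 + 4) * 2 → [14]
```

    Real PostScript adds procedures, a dictionary stack, and graphics operators on top of this same evaluation model, which is why it reads so much like Forth.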

    • You can (or could) write viruses in it too.
      • by Greyfox ( 87712 )
        When I was doing work writing printer drivers, I was kicking around an idea for a PostScript worm that would propagate from printer to printer and whose only other function would be to replace every instance of the word "strategic" with the word "satanic."

        Unfortunately, I never did figure out how to open a network socket in PostScript. It would have been a really cool hack...

  • Solder (Score:2, Funny)

    by sks ( 85021 )
    One of my favorite quotes is from Steve Ciarcia, who wrote the long-running Circuit Cellar column in Byte (long since evolved into Circuit Cellar Ink Magazine). Steve preferred to do most of his work in hardware, and viewed software as a necessary evil upon occasion. Steve said this in one column, and it's now immortalized: "My favorite programming language is solder."
  • by isorox ( 205688 )
    Do I remember bob?

    Do I remember waterworld!
  • by olevy ( 63189 ) on Sunday November 18, 2001 @03:54PM (#2581612)
    OK, first I have to admit that I was one of the developers for Bob. Don't hold it against me, it has been a long time since I worked for Microsoft. Most of the other Bob developers have long since left as well.

    Bob was one of the very, very few truly creative product attempts for the general market Microsoft has ever made. The first version was deeply flawed, but it also had some very good ideas. Microsoft is not very comfortable with the messiness of creativity, and so, like a foreign microbe, Bob got expelled before these problems could be fixed. Version 2 got cancelled just a week before going into general beta.

    The product started out as a skunk works, and if it had stayed like that, we might have done a better job. However, I think the biggest curse was that mid-project our Product Unit Manager (PUM) became Melinda French, soon to become Melinda Gates. Melinda never had much direct say in the product, but she was obviously very well connected. We then got showered with money and developers and it went to our heads. It has become a very good object lesson to me on the dangers of over-engineering.

    What I find distressing, though, is that the good ideas that were in Bob are ignored, and no other product seems to be picking them up.

    Here are some of the key ideas:

    * Menus are not necessarily the best UI. Think about it; they are passive, they quite often show lots of options that are inappropriate, and the commands are stuffed in all sorts of weird places. Even experienced users have trouble finding some of the options.

    * A shockingly high percentage of people are still scared of computers. If you are truly going to create consumer software you have to address this somehow.

    * UI is a conversation. GUIs are built on the realization that we are very visual creatures. But what about tapping into our sociability? We are very social creatures. There is a body of evidence that shows that people interact in a social way with their computer (really!). That is where the characters come in -- in extensive usability tests we found a real benefit to them. They helped allay the fear factor and they served as a useful UI metaphor -- UI as a conversation. By the way, the characters were always completely optional -- there was a very easy way to turn them off completely.

    * Task-based UI. Most programs are general-purpose programs that do quite a number of things. The only problem is that the vast majority of people only use a small fraction of the features. One solution is to take the code for word processing and present it as a family of specialized tasks. So you would end up with a letter writer, a report writer, an e-mail writer, a list maker, etc.

    I wrote Bob's Letter Writer. This may sound like a weird specialization, but since we knew that people using this particular program were just writing letters, we could do a great job of making mail merge easy, and also doing neat graphic effects (ala Publisher) that would appeal to someone writing a letter to a friend.

    * Files are a low-level concept. I mean really -- why should the common user have to care about such a geeky thing as a file? They just want to get their document. They couldn't care less about whatever low-level construct the developers have come up with to store this information, and really they shouldn't have to. It is weird that we still do not have an object-oriented OS. My biggest disappointment with Linux is that it has done very little to push forward truly new ideas (I'm still rooting for it though).
    On a technical side, the reason why Bob performed so poorly was because we tried to create the very first OLE component system that worked just as well for C++ as for Visual Basic. VB was not yet up to the challenge, and yet most of the apps were done in VB. We also used every Microsoft technology (the Jet database engine, the Quill word processing engine, VBA, etc.) and yet machines of that time only had 4 megabytes of memory! We required way too much memory for the time -- probably around 12 MB. The graphics looked bad because we had such a tight memory budget that we did not use any bitmaps at all. Everything was done with metafiles (vector objects). On top of that we had to write to Windows 3.1 -- 16-bit programming.
    • > Bob, was one of the very, very few truly creative product attempts for
      > the general market Microsoft has ever made.


      yes, many people forget that. It shouldn't be that hard to remember all three innovations from Microsoft:
      1) 8-bit BASIC. Yes, the language existed, but actually implementing it for those silly little hobbyist toys as a commercial product was innovative.
      2) The usable word processor footnote in 1984 (Word 1.0, Mac). Yes, we *could* make footnotes in WordStar, but it was a PITA. I'm told that WordPerfect came out with a footnote the same year, but it would be another couple of years before WP was in wide use (WS still reigned. Right up until that WS 2000 fiasco . . .)
      3) Bob. Oddly, I've actually met two students who have seen it--both times in response to asking if anyone had ever heard of it. One not only remembered its existence, but actually thought it was cool, and had spent a lot of time at it.


      And why doesn't it surprise me that most of the people from MS's last round of innovation are gone??? I still occasionally use what I think are the final two decent products to leave MS: Word 5.1a and Excel 4.0 (both Mac).


      Hawk, who really isn't anti-MS, but a) just hasn't seen anything worth owning from them in close to 10 years now, and b) has the usual free-market economist's distaste for monopolies which mess with his precious markets.

    • In Donald Norman's [jnd.org] excellent book The Psychology of Everyday Things [amazon.com] (or in paperback, less colorfully, the Design of Everyday Things [amazon.com]), he notes that it takes several iterations before a really new & revolutionary product can mature enough to be accepted by the public. The examples he gives are the talking vending machines & cars that you used to see in the 80s: being able to walk up to a coke machine & say "give me a coke please", or telling your car "change the station to WZBC" isn't such a bad idea, but the early implementations of it were so bad that the public completely soured on the whole concept, and now no one will even research it because it doesn't seem to be viable in the market anymore. Microsoft Bob, as it was developed & released in the mid-90s, was another great example of a highly revolutionary but incredibly unfinished / unready product, but maybe it deserves to be reconsidered in this light.

      As the commenter above notes, the now standard WIMP (windows, icons, menus, pointer) interface isn't necessarily the optimal way to interact with a computer; it's just what we've all learned to work with. And it's worth noting that even that interface took several iterations to get right, just as it does for a lot of MS software (IE, Office, Windows, etc all seemed to come of age with the 3.x versions, and start surpassing the competition that they copied with the 4.x & 5.x versions. They of course start bloating by the time they get to 5.x & 6.x, but that's a separate problem... :)

      Computer hardware is now drastically more capable than it was when Bob came out, to the point that software developers are always looking for ways to fill up all those extra clock cycles -- anything from running Seti in the background to having hooks in the Windows interface that pause for a few hundred milliseconds before opening a menu so that "it feels like the computer is working harder" -- surely my least favorite part of the Windows interface and the first thing I try to disable with TweakUI on any computer I'll be using regularly. The really "revolutionary" releases of the recent past -- Mac OSX and Win XP -- aren't really revolutionary at all, but glossier and more refined versions of what we've been using for well over a decade now -- and in the case of OSX at least, you could argue that the interface is a step backwards in terms of flexibility and usability, emphasizing style over substance at the UI level, even if the underpinnings are surely much more advanced than before. XP might also be guilty of this, but I haven't used it yet so I can't say; I do know that the dissolving menus that Win2k had were guilty of the same sort of cute wastefulness that OSX/Aqua's pervasive translucence & drop shadows represent...

      Maybe it's time to consider abandoning the WIMP interface. Maybe the world is ready for Bob or something like Bob to give it another shot. Or is it? Bob tried to represent the computer 'space' as the interior of a home, and for a desktop computer of sufficient power (i.e. what most of us have now, but didn't have when Bob came out), this isn't so bad. But in a networked world? Can you achieve some sort of network transparency & represent it in that sort of metaphor? I dunno, maybe. I am sure that it's an interesting challenge, much more than ever more glossy iterations of the same old Mac & Win interfaces could ever be, as they both try to refine their implementations of the Macintosh Interface Guidelines ever further.

      Maybe it's time to give the Anti-Mac Interface [acm.org] a try -- a system that inverts all the assumptions that we've been working with for years now.

  • My kids loved Bob! (Score:2, Insightful)

    by Newer Guy ( 520108 )
    And I credit Bob for making them feel comfortable with the computer. My son began playing with the computer at five thanks to Bob. He still occasionally asks if the Bob CD is still around. The problem with most of us is that we see things through OUR eyes as opposed to seeing through the eyes of a child. Yes, Bob was roundly trashed in reviews....and all the reviewers were ADULTS!! It shows how truly clueless so many of us can be...software designed for children being trashed by adults.
  • You can see Bob alive on Windows today as a DesktopX theme (www.desktopx.net).

    Theme is on Wincustomize:
    http://www.wincustomize.com/preview2.asp?source=http://www.wincustomize.com/library/accounts/Frogboy/dx/bobxp.jpg

    It's just for fun. Nobody in their right mind would run this as their UI. Just like no one in their right mind would use Bob before. ;)
  • The funny thing that I see while reading the comments is the blasting of Clippy and the Dog for being cute. This group of people actually love and enjoy these little AI characters and build them for themselves. I see them in many different channels on IRC. The most notable one is purl. So, would it make you all feel better if Clippy reacted to ! commands and wasn't so cute? The functionality that these "bots" provide is not the greatest, but they are useful and are needed. Otherwise, purl wouldn't exist.

    Oh, and how could one forget the greatest "bot" of them all? The computer from Star Trek. "Computer, where is Worf?"

    You want these bots, You need these bots. If you don't like the manner in which I provide these bots, then why don't you sit at a keyboard and write one yourself?
  • There was a very good parody of this in PC Answers magazine, called Bubba, Bob's Country Cousin. Its main feature was an office in a barn. It actually worked as a Windows interface, and fitted on a 720K floppy. The other free shell that fits on a 720K floppy was an ad for OS/2's WPS. It is the only shell that allows you to set folder backgrounds.

    For a product that failed to make an impact on the market, Bob has a surprisingly large number of entries in Microsoft's TechNet. Despite Bob being gone, its annoyances and bugs soldier on through Windows 95, Windows 98, Windows 2000, Office and ultimately, Windows XP.

    Firstly, the sound themes are already present in Bob. So is the annoybot that ultimately becomes Clippit. Then there are the sound schemes. And cab files. But there are perhaps a lot of technical features that ultimately appeared in Win95, the Plus! pack, and later.

    When you want to annoy the hell out of some MCSE or Microserf, you tell them that Windows NT is Microsoft Bob on top of a bloated WinOS2 shell running on top of 16-bit OS/2 1.3.

    This explains the extensive entries for both MS OS/2 [in both TechNet and WinNT/2K help] and Bob. It's a handy place to hide surplus bugs. :)
