


Do You Remember Bob? 315
GdoL writes: "Do you remember Bob? Byte's editor starts his monthly column talking about Bob the OS interface from Microsoft in the mid-1990s. And he didn't forget Bob the programming language either, from David Betz, a former technical editor of Dr. Dobb's Journal. This OO language is widely used on 'DVD players and set-top boxes produced by the likes of Toshiba, Samsung, and Motorola.' Do you remember any other long-forgotten language that is still used in the real world?"
C? (Score:2, Funny)
Heh.
BOB a hollywood OS (Score:2, Funny)
BOB (Score:2)
However, BOB is different. BOB WAS innovative. This shows one important lesson, folks: as much as MS talks about "innovation," innovation by itself is meaningless. When a problem arises we can resolve it, but not every product needs to be technically innovative.
What about BASIC? (Score:1)
Re:What about BASIC? (Score:1)
nah... (Score:2)
And so what if it doesn't encourage good code writing? That's not the point of it. If you want 'good code writing,' you start learning more serious languages and formally studying computer science.
but they're boring. (Score:2)
Scheme, I hate to tell you, is not exciting. And until the world switches to Lisp, students moving on to more imperative programming are going to be confused (or vice versa, coming to Scheme from an imperative language). If you don't make it interesting for beginners, they're not going to want to continue on to the more 'serious' side of it.
Re:What about BASIC? (Score:2)
Re:What about BASIC? (Score:2)
10 BORDER 5 : PAUSE 10 : BORDER 3 : PAUSE 10 : BORDER 2 : PAUSE 10 : BORDER 7 : PAUSE 10
etc., but to a non-programmer's mind, it is easy to see the connection between this sequence of commands and the result (bright, flashy screen output). This is the first step of learning to program.
Try writing something in C or Java or Haskell to the same effect...?
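Purely to illustrate: here's roughly what that trick costs in C, assuming an ANSI-capable terminal. The escape codes and the POSIX usleep() call are my own choices for the sketch, since C itself has no BORDER or PAUSE at all -- which is rather the point.

/* Flash the screen background through a few colours (ANSI terminals only). */
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    int colours[] = { 45, 42, 44, 47 };   /* ANSI background colour codes */
    for (int i = 0; i < 4; i++) {
        printf("\033[%dm\033[2J\033[H", colours[i]);  /* repaint the screen */
        fflush(stdout);                   /* make it visible right now */
        usleep(200000);                   /* roughly a PAUSE 10 */
    }
    printf("\033[0m\033[2J\033[H");       /* put the terminal back */
    return 0;
}

A beginner has to get past includes, main(), escape codes, and output buffering before anything flashes; the cause-and-effect link that the one BASIC line makes obvious is completely buried.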
exactly. (Score:2)
With C and its arcane conventions, it's hardly easy for the complete beginner. Heck, there were times with BASIC I'd get really frustrated.
BOB UI and WinXP (Score:5, Informative)
In the picture of the Bob UI, it shows a little dog with a speech bubble coming from his mouth. Well, in WinXP if you do a file search (hit F3), you'll see an almost identical dog.
Maybe Microsoft thought that Bob was ahead of its time?
Anyway, it's strange.
Re:BOB UI and WinXP (Score:1)
Clippy (Score:2)
Clippy et al. are the only refugees left from that experiment.
*shudder*
But, I suppose, you have to give them some credit for trying a different OS interface. Even if it did suck in all ways...
Re:Clippy (Score:3, Interesting)
Some examples: People watching a news program on a TV *set* labeled "news television" will rate that program as more informative and authoritative than those watching the same program on a TV labeled "general-purpose television". People using a computer program that praises another computer program will rate it smarter than a program that criticizes another program. Larger pictures will be better remembered and better liked than smaller pictures. People will rate a speaking tutorial program more honestly if the rating program uses a different voice!
Fascinating stuff.
Re:Clippy NO! NO! NO! NO! NO! NO! NO! NO! (Score:5, Interesting)
I had to read this book for a graduate Mass Comm class two years ago, and it is without a doubt the most awful excuse for experimental science that I have ever seen. Unfortunately for the authors, I had taken a class in research methods before I encountered their book (they should consider doing the same).
It's shoddy science, through and through. They ignore intervening variables, run every experiment without controls, provide no accounting for intercoder reliability, and the samples are always too small to be statistically significant (only one had more than 30 participants; many had fewer than ten) and composed of forced participants (Reeves and Nass's freshman psych students at Stanford, to be precise; even without a grade on the line, that's a bad sample). Usually, they rely on reported rather than observed behavior, and the only operating hypothesis ever examined is their goofy "equation" (you want me to spoil the beginning of the book for you? Here is The Media Equation: Media Equals Real Life. That's it. Word for word).
As if that wasn't enough, they make constant generalizations of their results (which, with forced, nonrandom sampling, is the first thing thrown out the window). They grandstand at every turn -- everything supports the Media Equation, and there is nothing it doesn't affect. You should always be suspicious when "scientists" do thirty experiments and always find their hypothesis supported exactly as they predicted. It's usually bunk, and in this case, it's a pantload. In one instance, they even admit to writing the hypothesis AFTER the experiment was performed. They make repeated references to other "research" in this area, but if you read through the bibliography, they are merely citing *themselves* from previous experiments. Many of these experiments, if you were interested, have still not been accepted for publication, many years after they were done. Most are not even available at the authors' own Web sites. If it weren't for the fact that The Media Equation was published by Reeves & Nass's employer, I doubt whether they could've gotten it published at all.
It's bad science, and it's only an afterthought: they plainly thought up their "equation" first, and then set out to prove it. That's the Scientific Method in reverse, people.
Let me give you one quick example, in reference to the poster above: in the larger/smaller pictures experiment mentioned above, they show participants photographs of people's faces, some in close-up and others standing 10 yards away. And the "test" is showing the subject another photo of the same people, and seeing which person they recognize most often. Guess what: it's the person whose face they saw in close-up. Surprised? You shouldn't be. No thinking adult would be; you see the face close up, so you see more detail, and see it better. Plain and simple. But that's not the conclusion Reeves and Nass come up with; they decide instead that this turn of events means that you are having a psychological reaction to the face, and the bigger face makes you happier because it seems more like a person. So if you think it's a person, you will remember it better.
That's the media equation, you see? The more person-like an electronic communication is, the better it works. The only time this has been tried under real-world circumstances, of course, is their grand experiment: Microsoft Bob. Funny how well that went over, huh.
Man, I hate that book.
Nate
PS - you might try the following link to Amazon, I submitted the review under "n8willis": here [amazon.com].
PPS - if anybody cares, I'll follow up what I say by emailing them the paper I had to write about this ridiculously bad book; it goes into more detail. Or perhaps I'll submit it as a /. book review. I meant to at the time, but it was just way too long....
Re:Clippy NO! NO! NO! NO! NO! NO! NO! NO! (Score:2)
1. We think A probably implies B.
2. Let's devise an experiment to try and show this.
Admittedly, that sounds more vague than I thought it would.... The difference is that they try to devise the experiment in order to get the results they want to see, rather than devising the experiment to be neutral and then testing the hypothesis. Or, to engage in some wishful thinking, more than one hypothesis.
Besides, they use the language of statistics, but they clearly have no understanding of it (disclaimer: I am still bitter at having had six hours of statistics in college from an old analysis professor who made us do proofs on everything). Example: they give a survey, asking respondents to check "always/sometimes/rarely/never" as their response. Then they do means and standard deviations to report their results. That's absolutely meaningless. First of all, the numbers you get are completely dependent on how you map those qualitative, non-numerical values to numbers, and secondly, nominal, categorical data like that has no correct mapping, because it's nominal and not numerical.
It doesn't matter how you encode it, saying you've taken the average of NBC, CBS, and FOX or the standard deviation of Red, Green, and Orange is meaningless. You can assign values and weights to them and then do your calculations on those, but it's totally arbitrary.
Nate
PS - also, before someone else mentions it, yes always/sometimes/rarely/never has order to it; that is called ordinal data. But it's not continuous, and order in a set does not imply oh, what's the word... interval? You can say that the person who checked sometimes ranks ahead of the person who checked rarely, but not by how much. Not even for a single response, much less for the data set as a whole.
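To make the mapping problem concrete, here's a toy C sketch -- the five responses and both encodings are invented for illustration -- showing that the "mean" of ordinal data is whatever your arbitrary encoding makes it:

#include <stdio.h>

/* Responses are indices: 0=always, 1=sometimes, 2=rarely, 3=never. */
static double mean_under(const int *resp, int n, const double *map)
{
    double sum = 0;
    for (int i = 0; i < n; i++)
        sum += map[resp[i]];              /* encode, then average */
    return sum / n;
}

int main(void)
{
    int responses[] = { 0, 1, 1, 2, 3 };  /* made-up survey answers */
    double map_a[] = { 4, 3, 2, 1 };      /* always=4 ... never=1 */
    double map_b[] = { 100, 10, 5, 0 };   /* just as defensible */

    printf("mean under mapping A: %.2f\n", mean_under(responses, 5, map_a));
    printf("mean under mapping B: %.2f\n", mean_under(responses, 5, map_b));
    return 0;
}

Mapping A reports 2.60, mapping B reports 25.00, and neither number means anything -- which is exactly the point.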
Pascal? (Score:1)
Re:Pascal? (Score:1)
Re:Pascal? (Score:1, Troll)
I remember at the time (over 10 years ago) thinking this was the most ill-advised thing the department could have done. I thought it was absolutely pointless to teach kids a language that was not also in wide use outside the classroom - such as C. To this day, I have never seen any jobs for Turing programmers, and I laugh out loud every time I see a resume where the only language the applicant knows is Turing, which still happens.
Byte (Score:1)
I know this is off-topic, but I have too many karma points and I just need to air my view
Byte Print Edition (Score:2)
There have been rumors that the print edition may be returning, based on passing comments by certain columnists in their web journals. But nothing tangible yet.
Modula-2 (Score:3, Interesting)
Remembering Bob (Score:1)
Re:Remembering Bob (Score:2, Funny)
I am waiting for MS-bob.net for linux (Score:3, Funny)
Bob as a joke (Score:1)
HTML (Score:1, Funny)
;)
Bob is a smart driver.... ;) (Score:1)
In Belgium, "Bob" is the designated driver: the volunteer at a party who sticks to soft drinks and drives everyone else home.
DIL16 (Score:3, Interesting)
DIL16 was DataSaab Interpretive Language for their 16-bit minis. Looked like assembler but had no registers, and yes, it was interpreted. Completely bizarre. I used it to work on a teller system at Citibank in the mid 70s.
Any other DIL16 programmers out there?
Forgotten languages still used in the real world (Score:1)
Now for my real question, how do I write this into my resume and make it look good?
Re:Forgotten languages still used in the real worl (Score:2)
A senior engineer once told me "the only reason for a new hire to learn any PL/M language is so that they can understand what the current code does when they reimplement it in C++ or Modula-2."
SNOBOL4 (Score:5, Informative)
SNOBOL4 was completely flexible on types (e.g., you could do "5" + 3); had dynamic memory allocation and garbage collection; and had the ability to evaluate dynamically generated SNOBOL4 code.
It's probably still in use, and it was bizarre and wonderful. I also have fond memories of two compiler courses taught by RBK Dewar, one of the implementers of the Spitbol implementation of SNOBOL4.
Re:SNOBOL4 (Score:2)
However, I prefer to use the results of Robert's more recent programming language activities.
Re:SNOBOL4 (Score:2)
I'm not sure that not liking it because it was on an Amiga is sensible, given how advanced and powerful the Amiga was compared to its peers.
Other Language? (Score:4, Offtopic)
Re:Other Language? (Score:3, Interesting)
Re:Other Language? (Score:2)
Yeah, like you'd notice!
Re:Other Language? (Score:2)
Dr. Fun Cartoon that sums it up so well... (Score:3, Funny)
UNIX Gurus in Hell [ibiblio.org]
Re:Dr. Fun Cartoon that sums it up so well... (Score:2, Funny)
cmclean
I remember XLISP (Score:2)
Instead I learnt FORTH (using the great F-PC system for PCs), and returned to Lisp later when I encountered Emacs 19.
Re:I remember XLISP (Score:2)
/Brian
Re:I remember XLISP (Score:2)
FORTH (Score:1)
I used to have a blast messing around with FORTH on my old Kaypro II, Vic 20, Apple II, CoCo, Amiga, etc. A really fun language to hack around in.
Go FORTH (Score:2)
I still have warm feelings for Forth. I remember the first time I got acquainted with Forth. It was some 3D framework called GraFORTH.
I was quite impressed with its counter-intuitive reverse-polish-notation syntax:
c a b + = if then
Isn't it much more stylish than writing:
if (a+b==c) {} ?
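For anyone who hasn't met postfix before, the whole trick is a stack. This throwaway C evaluator (hard-coded tokens, deliberately nothing like a real Forth system) walks through how c a b + = executes:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    /* "8 5 3 + =" -- i.e. c a b + = with c=8, a=5, b=3 */
    const char *tokens[] = { "8", "5", "3", "+", "=" };
    int stack[16], sp = 0;

    for (int i = 0; i < 5; i++) {
        const char *t = tokens[i];
        if (strcmp(t, "+") == 0) {               /* pop two, push the sum */
            int b = stack[--sp], a = stack[--sp];
            stack[sp++] = a + b;
        } else if (strcmp(t, "=") == 0) {        /* pop two, push equality */
            int b = stack[--sp], a = stack[--sp];
            stack[sp++] = (a == b);              /* real Forth pushes -1 for true */
        } else {
            stack[sp++] = atoi(t);               /* a number: push it */
        }
    }
    printf("top of stack: %d\n", stack[sp - 1]); /* prints 1: 5+3 == 8 */
    return 0;
}

Real Forth is far richer (words are user-definable, for a start), but that stack discipline is the whole of the "counter-intuitive" part.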
PL/M, ICON (Score:1)
We used ICON to whip up a driver program for the CNC milling machine that the on-site machine shop had in order to aid conversion from their paper tape (!) library.
Subgenius (Score:5, Funny)
May Bob be with you.
Forgotten languages. (Score:2, Insightful)
FORTRAN will live "forever" (Score:2)
If you look into Octave source code you will find those FORTRAN libraries there. Since they are public domain, Matlab and other commercial number-crunching software probably use them as well.
They are still coming up with new FORTRAN versions; I believe Fortran 2000 is the latest. Someone once said that we don't know which language people will use for numerical analysis in the year 2050, but we know what its name will be: FORTRAN.
COBOL has its place.. (Score:2)
It's not a given that COBOL is a scalable language. However, the best implementations of COBOL have always been the ones that reside on highly scalable architectures, i.e. mainframes (usually IBM's). It's also been true that the best language on those scalable architectures for business problems has almost always been COBOL. Therefore, businesses with very high volume needs have used COBOL. It's not difficult to understand.
Now, I'm sure IBM looked at public perception and said something like "Hey, everyone seems to despise COBOL. Maybe we should do something to get a new language in the works so we don't lose our shirts." And thus was born their support for Java. I think you should expect to see Java take over tasks for which COBOL would previously have been used.
Also, I think you're absolutely right: it's mostly non-IT companies that are probably using COBOL. But who really pays the bills anyway? You think IT companies pay the bills? Really?! Despite our over-inflated self-opinion, IT will always be a derivative industry. Just like accountants, HR folks, etc., we will only ever have jobs where other industries already exist. IT has no value in a vacuum.
My last point is this: Every programming language out there that actually gets used, gets used by a community. Every community has a culture. And every programming language serves that culture. Don't think this is true? Try using VB in a Unix shop. The culture clash will be immediately apparent.
So, people still do program in COBOL, because it works and because that's how they and their peers think. End of story. It would be stupid to deny the reality of that. But those people aren't necessarily stupid for being the product of their local culture. Just like we're not stupid for being (at least in part) the product of the culture we work in.
Re:Forgotten languages. (Score:2)
Two words:
CobolScript.
http://www.deskware.com [deskware.com]
dave
assembly! (Score:2, Insightful)
This is if you regard assembly as a programming language
Bob the language (Score:5, Informative)
It's hard to find any documentation for the Bob language. Having a quick look at some Bob source code, it is a simple OO language without classes, where subclassing is the same as instantiation, much like Self or Cecil. It seems to support only single inheritance, but since I gather it's dynamically typed, there's no need for "interface inheritance".
It's not "purely" object-oriented, since you can define procedures that are not methods of any class. At first glance, there doesn't seem to be any access control: all features of an object are public.
Re:Bob the language (Score:3, Informative)
(Disclaimer: I'm not an expert in prototype-based languages, but I'm fairly confident that the description below is quite accurate. Corrections are, therefore, very welcome.)
Languages such as this are called prototype-based languages, and are sometimes seen as the successor to class-based object-oriented programming. However, no prototype-based language has, to my knowledge, actually gone anywhere (the one that got closest was NewtonScript, which would have made it if it weren't for the fact that the only place to use it was on the Apple Newton), so it's nice to see that at least one actually made some progress towards general acceptance.
For those unfamiliar with the concept of prototype-based programming languages, Bob (and all prototype-based languages, for that matter) are by their very nature single-inheritance. The general idea is to eliminate the whole idea of classes, and instead treat everything as an already existing object. You then modify those objects as necessary, and, if it's an object which is handy, you just make lots of copies of it. I find it much clearer to use the word "copy" instead of "instantiate," as you did, for this reason. For example, define anObject = new Foo() makes a new copy of the already existing Foo object, which will have all of the same values, etc. as Foo. You can then modify that copy by adding new values as necessary.
The reason I make this distinction is that one of the powerful things you can do in prototype-based languages is give an instance of a class a new function. For an example of when you'd want to do this, let's say you have a Canvas object in a GUI. Now, you, at some point, are going to need a screen, and the screen is going to need some variables and methods you don't need for a standard Canvas (for example, Screen.refreshRate or Screen.setColorDepth()). The normal way to do this in a regular OO language would probably be to declare a subclass of Canvas that had the extra functionality, and then to make a single instance of it, probably called theScreen. This is awkward, however, because, 99.9% of the time, you really only want one screen object, so making a subclass just for a single instance seems odd. In a prototype-based language, however, you'd simply "copy" a new instance of a Canvas (called theScreen) and add your extra methods and functionality specifically to that object. Ultimately decide that you really do need multiple screens? No problem! Probably you'll want to add a monitorNumber attribute directly to the already existing screen object, and then make a copy of that. Similar functionality is also present in Dylan [gwydiondylan.org], and, by extension, probably LISP's CLOS, although I'm honestly just not familiar enough to know.
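Since C has no objects at all, here's a deliberately crude sketch of that copy-and-extend idea; the slot table and every name in it are invented, so treat it as pseudocode that happens to compile. An "object" is just a bag of named slots, cloning is a plain copy, and theScreen gets its extra slot without a subclass existing anywhere:

#include <stdio.h>
#include <string.h>

#define MAX_SLOTS 8

typedef struct {
    const char *names[MAX_SLOTS];
    int values[MAX_SLOTS];
    int nslots;
} Object;

/* "new Foo()" in a prototype language is just a copy of Foo. */
static Object clone(const Object *proto) { return *proto; }

static void add_slot(Object *o, const char *name, int value)
{
    if (o->nslots < MAX_SLOTS) {
        o->names[o->nslots] = name;
        o->values[o->nslots] = value;
        o->nslots++;
    }
}

static int get_slot(const Object *o, const char *name)
{
    for (int i = 0; i < o->nslots; i++)
        if (strcmp(o->names[i], name) == 0)
            return o->values[i];
    return -1;                                /* slot not found */
}

int main(void)
{
    Object canvas = { {0}, {0}, 0 };          /* the Canvas prototype */
    add_slot(&canvas, "width", 640);
    add_slot(&canvas, "height", 480);

    Object theScreen = clone(&canvas);        /* copy, don't subclass */
    add_slot(&theScreen, "refreshRate", 60);  /* extend just this copy */

    printf("width %d, refreshRate %d\n",
           get_slot(&theScreen, "width"),
           get_slot(&theScreen, "refreshRate"));
    return 0;
}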
Re:Bob the language (Score:2)
Re:Bob the language (Score:5, Interesting)
Prototype-based languages have an interesting minimalistic feel that allows language designers to get right to the heart of some issues, and I think they are very valuable for that reason. However, I think ultimately classes will win out.
One of the common mistakes in OO programming is to confuse kinds-of-things with instances-of-things. A program for a clothing store might have "shirt" objects with properties such as the manufacturer, size, colour, and quantity in stock. Seems OK so far. Then, when looking for a place to store information on a specific shirt, such as the customer who bought it, the programmer notices that a "shirt" has two meanings: a kind of shirt versus a specific individual shirt. The quantity-in-stock belongs to a shirt type, while the purchasing-customer belongs to a specific shirt instance. The design has to be juggled a bit to accommodate this new insight.
Combining classes and objects the way prototype-based languages do seems to encourage this kind of confusion on a massive scale. Programmers would undoubtedly add comments to indicate which objects are actually "meta-objects" (i.e., classes). The Bob language examples do exactly this, as a matter of fact. I suspect that the degree to which they don't differentiate between classes and objects is exactly the degree to which they will have confusion between instances and kinds of things.
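The shirt example translates directly into the class-based fix: keep the kind-of-thing and the individual thing as separate types. A minimal C sketch, with all the field names mine:

#include <stdio.h>

typedef struct {                  /* a kind of shirt */
    const char *manufacturer;
    const char *colour;
    int quantity_in_stock;        /* belongs to the kind */
} ShirtStyle;

typedef struct {                  /* one specific, individual shirt */
    const ShirtStyle *style;
    const char *purchased_by;     /* belongs to the individual */
} Shirt;

int main(void)
{
    ShirtStyle oxford = { "Acme", "blue", 12 };
    Shirt sold = { &oxford, "J. Smith" };
    oxford.quantity_in_stock--;   /* stock is tracked on the kind */

    printf("%s %s shirt sold to %s; %d left in stock\n",
           sold.style->manufacturer, sold.style->colour,
           sold.purchased_by, oxford.quantity_in_stock);
    return 0;
}

A prototype-based language blurs that line by design, which is exactly where the parent expects the confusion to creep in.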
Incidentally, I think there is a great deal of promise in the idea that everything (including classes) is an object, but I don't see prototype-based languages as being a successor to class-based ones.
Re:Bob the language (Score:2)
Re:Bob the language (Score:2)
The guy I knew who wrote one called it an exemplar-based language, which is probably clearer. He didn't necessarily think they were going to take over from class-based OOL, just that it suited his purposes well - the proprietary language within one particular company's product (where I gather it worked very well, but "general acceptance" was never a goal).
> probably LISP's CLOS
See "The Art of the Metaobject Protocol". Worth a read even if you never expect to program in LISP, just for exposure to the ideas.
http://www.amazon.com/exec/obidos/ASIN/02626107
Re:Bob the language (Score:2)
It does look like a rather interesting language, though; in some ways it seems almost like a predecessor to Java (if that means anything). It's a fairly cleverly done macro language, basically similar to C++ with all the garbage stripped out.
/Brian
Re:Bob the language: it's aliiive!!! (Score:2)
AmigaOS. (Score:2)
x86.. (Score:2, Funny)
You could barely do anything with a single line of code. Whereas in Perl, you can make the coffee and clean your bedroom in one line, with the obsolete 'x86' you had to pretty much write a Bible's worth of code.
I reckon they should consign x86 to the scrap-heap and make Intel processors run directly on BASIC instead.
Re:x86.. (Score:2)
There's no such thing - there are many Basic compilers which produce native code - Moonrock [sensation.net.au], ASIC [basicguru.com], and I believe the newer versions of Visual Basic (?)
See this old post [slashdot.org] also.
From all this you might think that I ever used Basic, but the fact is I never learned it
people make fun of Bob, but... (Score:2, Interesting)
The first version was widely reviled, but the team started working on a second one. Now, it is often true that the third version of a product is the one that catches on -- the first one is rushed out, the second has all the stuff that was supposed to go in the first, and then the third can actually respond to user feedback and become useful. But for some reason, Microsoft atypically cancelled Bob 2.0 in mid-development.
Now if you imagine Bob continuing to evolve and eventually adding Internet access (still categorized, simplified, friendlier, etc), then it could have become...AOL. People make fun of AOL also (for similar reasons), but it's a pretty successful company and viewed in many ways as the only tech competitor to Microsoft. Now imagine if Microsoft had short-circuited that with Bob 5.0.
- adam
Visual Basic for MS-DOS (Score:3, Interesting)
As a tip, if you are ever called out to do a consulting gig and the customer mentions "Visual DOS", run like hell.
Re:Visual Basic for MS-DOS (Score:2)
Seriously tho, I think all operating systems should come with a version of basic, for learners and people old to the game but new to the OS to play with.
Re:Visual Basic for MS-DOS (Score:2)
It didn't have the 'Please hog all my system resources' behavior that VB has.
dave
Re:Visual Basic for MS-DOS (Score:2)
Don't blame Lisp! (Score:2)
This would be like blaming general relativity for atomic weapons or Thomas Edison for phone sex and the psychic friends network.
Re:Don't blame Lisp! (Score:2, Informative)
It was Alexander Graham Bell who invented the telephone, by the way...
Re:Don't blame Lisp! (Score:2)
The disaster that was Bob... (Score:3, Funny)
So does anyone want to guess what happened to the program manager for Bob?
That's right. Bill Gates married her. Go figure.
The idea of predictive interfaces was interesting, but Bob had the fatal flaw of being way too complicated for the hardware of the day. Some of the technology lives on in Office's Clippy, but Bob itself was a disaster to the point that even the people who pirated it returned it.
Re:The disaster that was Bob... (Score:2)
Actually, if memory serves, it was the other way around. He married her, then let her spec out what she wanted and see it developed.
Anyone else and it probably wouldn't have seen the light of day.
Re:The disaster that was Bob... (Score:5, Informative)
Karen Fries was the driving force behind Bob. Melinda was just part of the PM team associated with it.
I was on that team as a contract tester. It was my first job after dropping out of college. I'm terribly surprised I still work in the industry after being associated with that disaster. I did come away with some entertaining memories, however.
The original project codename was "Utopia" (actually, it might have been Utopia Home). I've still got a T-Shirt with the Petie the Parrot character on it and Utopia scrawled across in a kind of abstract architect font.
When the name Bob was revealed, there was a meeting of all the team members and a bunch of muckety-mucks. This incredibly clichéd marketing consulting team was the group that came up with the name. They were all up in front of the room in their black turtlenecks and black plastic-framed glasses.
When they got through their PowerPoint presentation to the name and the glasses-wearing smiley-face icon, the room was deathly still except for Karen Fries's excited squeal and clapping. She looked out over the crowd assembled and started to look cross -- we got the message and started clapping.
Now I've been involved with more than a couple doomed projects since then (perhaps I'm some sort of CS pariah) but I've never seen a group of people so unhappy and depressed about their work.
A little while after that, I believe, Melinda got her engagement ring. There was another big party. The joke I always tell about that event is: "It was a pity about her arm..." I'm sure that Bill, being a geek, felt he had to make up for something there and prove to the world that this relatively attractive woman was indeed taken. The rock on her ring was as large as one of my knuckles. There's no way she could put her hand in a tight pocket while wearing that ring. It was one of the most ridiculous and sad things I've ever seen, yet there I was saying, "Wow, that's... great!"
We knew when we were working on Bob itself that it would be a disaster. At the time, Pentium computers were just coming out in the consumer space. A P90 was required to run Bob with any sort of usability. Most of it was written in VB, back when VB had no chance of rivaling C in any task and just using the product was painful. These computers were 3-5 thousand dollars, and we expected new computer users to buy them just to use a piece of software even less functional than MS Works?
The whole thing would have been unbelievable to me if I hadn't lived it myself.
Re:The disaster that was Bob... (Score:2)
Apparently, there are some aspects that Microsoft is determined to keep alive; I guess you gotta have something for people to consider 'cute'.
Eric Raymond's retrocomputing museum (Score:3, Informative)
"The Retrocomputing Museum is dedicated to programs that induce sensations that hover somewhere between nostalgia and nausea -- the freaks, jokes, and fossils of computing history. Our exhibits include many languages, some machine emulators, and a few games.
Most are living history -- environments that were once important, but are now merely antiques. A few never previously existed except as thought experiments or pranks. Most, we hope, convey the hacker spirit -- and if not that, then at least a hint of what life was like back when programmers were real men and sheep were nervous."
http://www.tuxedo.org/~esr/retro/ [tuxedo.org]
How about REXX? (Score:3, Interesting)
And then there's PostScript. PostScript isn't forgotten, but there aren't a whole lot of programmers who know how to use it. It's a rather unwieldy language with a lot of primitives, but it looks a lot like Forth. I preferred it over Forth, though, as it struck me as being a lot cleaner. If I were going to use a reverse-Polish-notation language, it'd be a stripped-down version of PostScript. If anyone wants to learn PostScript, Adobe sells a language reference manual and some tutorials that cover the language very nicely. Ghostscript is all the language interpreter you need. Then you could do cool stuff like make the printer compute and print calendars for you.
PostScript (Score:2)
Yep! (Score:2)
Unfortunately, I never did figure out how to open a network socket in PostScript. It would have been a really cool hack...
Solder (Score:2, Funny)
Bob? (Score:2)
Do I remember Waterworld!
GUIs still aren't good enough (Score:5, Informative)
Bob was one of the very, very few truly creative product attempts for the general market Microsoft has ever made. The first version was deeply flawed, but it also had some very good ideas. Microsoft is not very comfortable with the messiness of creativity, and so, like a foreign microbe, Bob got expelled before these problems could be fixed. Version 2 got cancelled just a week before going into general beta.
The product started out as a skunk works, and if it had stayed like that, we might have done a better job. However, I think the biggest curse was that mid-project our Product Unit Manager (PUM) became Melinda French, soon to become Melinda Gates. Melinda never had much direct say in the product, but she was obviously very well connected. We then got showered with money and developers, and it went to our heads. It has become a very good object lesson to me on the dangers of over-engineering.
What I find distressing, though, is that the good ideas that were in Bob are ignored, and no other product seems to be picking them up.
Here are some of the key ideas:
* Menus are not necessarily the best UI. Think about it: they are passive, they quite often show lots of options that are inapplicable, and the commands are stuffed in all sorts of weird places. Even experienced users have trouble finding some of the options.
* A shockingly high percentage of people are still scared of computers. If you are truly going to create consumer software you have to address this somehow.
* UI is a conversation. GUIs are built on the realization that we are very visual creatures. But what about tapping into our sociability? We are very social creatures. There is a body of evidence that shows that people interact in a social way with their computer (really!). That is where the characters come in -- in extensive usability tests we found a real benefit to them. They helped allay the fear factor, and they served as a useful UI metaphor -- UI as a conversation. By the way, the characters were always completely optional -- there was a very easy way to turn them off completely.
* Task-based UI. Most programs are general-purpose programs that do quite a number of things. The only problem is that the vast majority of people only use a small fraction of the features. One solution is to take the code for word processing and present it as a family of specialized tasks. So you would end up with a letter writer, a report writer, an e-mail writer, a list maker, etc.
I wrote Bob's Letter Writer. This may sound like a weird specialization, but since we knew that people using this particular program were just writing letters, we could do a great job of making mail merge easy, and also doing neat graphic effects (ala Publisher) that would appeal to someone writing a letter to a friend.
* Files are a low-level concept. I mean, really -- why should the common user have to care about such a geeky thing as a file? They just want to get their document. They couldn't care less about whatever low-level construct the developers have come up with to store this information, and really they shouldn't have to. It is weird that we still do not have an object-oriented OS. My biggest disappointment with Linux is that it has done very little to push forward truly new ideas (I'm still rooting for it, though).
On the technical side, the reason Bob performed so poorly was that we tried to create the very first OLE component system that worked just as well for C++ as for Visual Basic. VB was not yet up to the challenge, and yet most of the apps were done in VB. We also used every Microsoft technology (the Jet database engine, the Quill word-processing engine, VBA, etc.), and yet machines of that time only had 4 megabytes of memory! We required way too much memory for the time -- probably around 12 MB. The graphics looked bad because we had such a tight memory budget that we did not use any bitmaps at all. Everything was done with metafiles (vector objects). On top of that, we had to write to Windows 3.1 -- 16-bit programming.
One of three (Score:2)
> the general market Microsoft has ever made.
yes, many people forget that. It shouldn't be that hard to remember all three innovations from Microsoft:
1) 8-bit BASIC. Yes, the language existed, but actually implementing it for those silly little hobbyist toys as a commercial product was innovative.
2) The usable word-processor footnote in 1984 (Word 1.0, Mac). Yes, we *could* make footnotes in WordStar, but it was a PITA. I'm told that WordPerfect came out with a footnote the same year, but it would be another couple of years before WP was in wide use (WS still reigned, right up until that WordStar 2000 fiasco)...
3) Bob. Oddly, I've actually met two students who have seen it -- both times in response to asking if anyone had ever heard of it. One not only remembered its existence, but actually thought it was cool, and had spent a lot of time at it.
And why doesn't it surprise me that most of the people from MS's last round of innovation are gone??? I still occasionally use what I think are the final two decent products to leave MS: Word 5.1a and Excel 4.0 (both Mac).
Hawk, who really isn't anti-MS, but a) just hasn't seen anything worth owning from them in close to 10 years now, and b) has the usual free-market economist's distaste for monopolies which mess with his precious markets.
Re:GUI's still aren't good enough (Score:3, Interesting)
As the commenter above notes, the now-standard WIMP (windows, icons, menus, pointer) interface isn't necessarily the optimal way to interact with a computer; it's just what we've all learned to work with. And it's worth noting that even that interface took several iterations to get right, just as it does for a lot of MS software (IE, Office, Windows, etc. all seemed to come of age with the 3.x versions, and start surpassing the competition that they copied with the 4.x & 5.x versions. They of course start bloating by the time they get to 5.x & 6.x, but that's a separate problem... :)
Computer hardware is now drastically more capable than it was when Bob came out, to the point that software developers are always looking for ways to fill up all those extra clock cycles -- anything from running Seti in the background to having hooks in the Windows interface that pause for a few hundred milliseconds before opening a menu so that "it feels like the computer is working harder" -- surely my least favorite part of the Windows interface and the first thing I try to disable with TweakUI on any computer I'll be using regularly. The really "revolutionary" releases of the recent past -- Mac OS X and Win XP -- aren't really revolutionary at all, but glossier and more refined versions of what we've been using for well over a decade now -- and in the case of OS X at least, you could argue that the interface is a step backwards in terms of flexibility and usability, emphasizing style over substance at the UI level, even if the underpinnings are surely much more advanced than before. XP might also be guilty of this, but I haven't used it yet so I can't say; I do know that the dissolving menus that Win2k had were guilty of the same sort of cute wastefulness that OS X/Aqua's pervasive translucence & drop shadows represent...
Maybe it's time to consider abandoning the WIMP interface. Maybe the world is ready for Bob or something like Bob to give it another shot. Or is it? Bob tried to represent the computer 'space' as the interior of a home, and for a desktop computer of sufficient power (i.e. what most of us have now, but didn't have when Bob came out), this isn't so bad. But in a networked world? Can you achieve some sort of network transparency & represent it in that sort of metaphor? I dunno, maybe. I am sure that it's an interesting challenge, much more than ever more glossy iterations of the same old Mac & Win interfaces could ever be, as they both try to refine their implementations of the Macintosh Interface Guidelines ever further.
Maybe it's time to give the Anti-Mac Interface [acm.org] a try -- a system that inverts all the assumptions that we've been working with for years now.
My kids loved Bob! (Score:2, Insightful)
Bob XP is out (Score:2, Funny)
Theme is on Wincustomize:
http://www.wincustomize.com/preview2.asp?source
It's Just for fun. Nobody in their right mind would run this as their UI. Just like no one in their right mind would use Bob before.
Purl who?? Computer? (Score:2)
Oh, and how could one forget the greatest "bot" of them all? The computer from Star Trek. "Computer, where is Worf?"
You want these bots, You need these bots. If you don't like the manner in which I provide these bots, then why don't you sit at a keyboard and write one yourself?
Bob, Bubba and Technet. (Score:2)
For a product that failed to make an impact on the market, Bob has a surprisingly large number of entries in Microsoft's TechNet. Despite Bob being gone, its annoyances and bugs soldier on through Windows 95, Windows 98, Windows 2000, Office and, ultimately, Windows XP.
Firstly, the sound themes are already present in BOB. So is the annoybot that ultimately becomes Clippit. Then there are the sound schemes. And cab files. But there are perhaps a lot of technical features that ultimately appeared in Win95, the Plus! pack, and later.
When you want to annoy the hell out of some MCSE or Microserf, you tell them that Windows NT is Microsoft Bob on top of a bloated WinOS2 shell running on top of 16-bit OS/2 1.3.
This explains the extensive entries for both MS OS/2 [in both TechNet and WinNT/2K help], and Bob. It's a handy place to hide surplus bugs. :)
Re:Linux cost analysis (Score:1)
Megaflops (Score:2)
Wait a minute. Wasn't Apple advertising the G4 as a gigaflop system? I thought the more megaflops the better. Maybe you mean microflop. (For those that don't know, "flops" -- floating-point operations per second -- is a measure of processing power; the more, the better.)
Re:Megaflops (Score:2)
Re:Megaflops (Score:2)
Re:A warrior's programming language (Score:2)
-Restil
Re:A warrior's programming language (Score:2)
Har!
dave
Re:How about Fortran (Score:2, Informative)
It's still used in scientific applications. I know three years ago, when I contracted at Raytheon STX (formerly Hughes STX...reasonably big name in aerospace), the guys who did the number crunching side of things worked with FORTRAN a lot.
Re:How about Fortran (Score:2)
I briefly worked in that world but thank God I now work on the client side and write C/C++.
Re:How about Fortran (Score:2)
dave
Re:Syntax for Bob? (Score:2)
I find this bit about Bob being used in DVD players to be rather interesting. As the creator of the language, can you tell us a bit more?
/Brian
RPG was especially important (Score:2)
Re:APL (Score:2)
It has a different problem domain. APL is heavily used in statistical and financial analysis and Perl is used for text file processing. You really can't beat APL when you need to do math work, especially math work involving huge sets of data.