IBM Announcements Programming IT Technology

Computer Pioneer Bob Bemer Dies 348

tpconcannon writes "Bob Bemer, the man who helped introduce the backslash as well as the escape key to computing, has passed away at his home at the age of 78. He also helped develop ASCII during the '60s at IBM. More interestingly, he predicted the Y2K bug all the way back in 1971!"
This discussion has been archived. No new comments can be posted.

  • He was 84, not 78 (Score:5, Informative)

    by Anonymous Coward on Thursday June 24, 2004 @11:33PM (#9525186)
    And I posted this yesterday [slashdot.org].
  • by Anonymous Coward on Thursday June 24, 2004 @11:35PM (#9525194)
    that he's been ALT-F4'ed?
  • 82 73 80 (Score:5, Funny)

    by Daikiki ( 227620 ) <daikiki@wan a d oo.nl> on Thursday June 24, 2004 @11:36PM (#9525200) Homepage Journal
    82 73 80
  • His website (Score:5, Informative)

    by NearlyHeadless ( 110901 ) on Thursday June 24, 2004 @11:37PM (#9525204)
    His website is here [bobbemer.com]. There are a lot of interesting tidbits on his history [bobbemer.com] page.
    • Re:His website (Score:5, Informative)

      by orkysoft ( 93727 ) <orkysoft.myrealbox@com> on Friday June 25, 2004 @12:46AM (#9525506) Journal
      And if you read that, you'd know he invented the escape sequence, rather than just a key on your keyboard. The website hardly mentions the key, it mentions the concept of the escape sequence. That the ESC key is used to activate terminal escape sequences, or the backslash (which he also introduced into ASCII) is used to activate C-like escape sequences, isn't as relevant as the concept of the escape sequence itself.
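The concept described above can be sketched concretely: an escape character is one reserved character that changes how the characters after it are interpreted. A minimal Python illustration (the ANSI color sequence and the C-style `\n` are just examples of the pattern, not anything Bemer specified):

```python
# The ESC control character (code 27, 0x1B) introduces terminal
# escape sequences; the backslash introduces C-style string escapes.
# Both apply the same idea: one reserved "escape" character that
# changes the interpretation of whatever follows it.
ESC = chr(27)

ansi_red = ESC + "[31m"   # terminal escape sequence: red foreground
literal = "\\n"           # two characters a C compiler reads as one newline
escaped = "\n"            # the single character those two characters denote

print(repr(ansi_red))              # shows the ESC byte as \x1b
print(len(literal), len(escaped))  # 2 1
```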
  • That Y2K thingy... (Score:5, Insightful)

    by sljgh ( 742290 ) on Thursday June 24, 2004 @11:37PM (#9525207)
    Predicted it back in '71? That seems like something a smart person would do; a shame the rest of us didn't follow up on it until almost 30 years later.
    • by f1ipf10p ( 676890 ) on Friday June 25, 2004 @12:05AM (#9525373)
      And still most people don't realize that the counter from the epoch date (Jan 1, 1970) has a rollover flaw too. Seems to me 2038 is the magic year... but I have poor memory recall... I'm sure my recall will be even worse by then...
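The rollover this comment recalls is easy to check: on systems that keep time as a signed 32-bit count of seconds since the epoch, the counter overflows early in 2038. A quick sketch in Python:

```python
from datetime import datetime, timezone

# A signed 32-bit time_t counts seconds from 1970-01-01 00:00:00 UTC
# and tops out at 2**31 - 1; one second later, it wraps negative.
MAX_32BIT_TIME_T = 2**31 - 1
rollover = datetime.fromtimestamp(MAX_32BIT_TIME_T, tz=timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00
```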
    • Note that both Yahoo and Microsoft hit their all-time highs during the week of Y2K and have never recovered since.

      Anyone who says the Y2K problem wasn't real hasn't been following tech stocks.

    • by grrrl ( 110084 ) on Friday June 25, 2004 @12:41AM (#9525499)
      My year 10 computing teacher told us a story about how in the '60s they used a single number to store the year, and when they got to 1970 they were like "wait a sec!" It all came down to space: what was the point of storing two or four numbers if you only needed one? It doesn't take much to extend the idea to 2000 for programs written in 1970, except they needed the space, and why would their code last so long?

      We are now rather spoilt with storage space/bandwidth/etc.
      • by Frobnicator ( 565869 ) on Friday June 25, 2004 @02:53AM (#9525831) Journal
        My year 10 computing teacher told us a story about how in the 60s they used a single number to store the year, and when they got to 1970 they were like "wait a sec!"
        Sounds like your year 10 computing teacher needs to take some more history lessons. Many of the groups who first dealt with computers (banks) were hit with the Y2K bug right at 1970, since they couldn't do 30-year loans, and many had already considered the problem. And since these companies counted out memory by the byte (or rented memory by the byte), they certainly wouldn't have been storing years as simple text. If they were, the programmer of the day would certainly have been considered wasteful, and possibly even fired for such practices.
        • by Detritus ( 11846 ) on Friday June 25, 2004 @05:16AM (#9526172) Homepage
          Years were commonly stored as text, BCD and packed BCD. What they weren't commonly stored as were 16-bit or 32-bit integers. The first two digits, sometimes three digits, were implied. DEC used three bits for the year in some of their early operating systems.

          Text and BCD formats were popular because they were efficient. Binary (integer) formats for date and time required complex conversions for I/O. There was no such thing as the microprocessor. Multiplication and division were usually very slow operations. Many computers implemented them in software, not hardware. The hardware for them was often an expensive option, not a standard part of the CPU. BCD could be converted to/from your favorite character code with simple hardwired logic.
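For readers who never met BCD: each decimal digit takes four bits, so a two-digit year fits in a single byte and converts to and from characters with trivial logic, as the comment above explains. A rough sketch of packed BCD (function names are illustrative, not any historical API):

```python
def year_to_packed_bcd(two_digit_year: int) -> int:
    """Pack a two-digit year (0-99) into one packed-BCD byte:
    tens digit in the high nibble, ones digit in the low nibble."""
    tens, ones = divmod(two_digit_year, 10)
    return (tens << 4) | ones

def packed_bcd_to_year(byte: int) -> int:
    """Unpack a packed-BCD byte back into a two-digit year."""
    return (byte >> 4) * 10 + (byte & 0x0F)

print(hex(year_to_packed_bcd(71)))  # 0x71 -- the nibbles read as the digits
print(packed_bcd_to_year(0x99))     # 99; the next year wraps to 00: Y2K
```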

    • by Ungrounded Lightning ( 62228 ) on Friday June 25, 2004 @03:02AM (#9525854) Journal
      Predicted it back in '71? That seems like something a smart person would do, shame the rest of us didn't follow up on it before 30 years later.

      I was already predicting it no later than '70. Didn't have the cute three-symbol acronym - I was calling it "The Great Bimillenial Computer Date Disaster."

      (I was resuscitating a batch processing system in '70 that wouldn't start - turned out to be a 'sanity check' on the date entry. But if I recall correctly I'd been predicting it even before then.)

      Nobody listened to ME, either.

      (In fact, in the early '80s, while I was consulting, I tried to convince the customer to let me specify date entry in a way that wouldn't blow up in 2000, and was directly ordered not to spend time doing so - because the design life of the system was only 15 years. B-(

      I guess I can feel a bit better if Bemer couldn't get the message across either. (Sigh.)
  • by Anonymous Coward on Thursday June 24, 2004 @11:39PM (#9525217)

    As recently as a month ago, "He was on the computer every day," Teeler said Wednesday. "He is a man who literally worked just about every day until he died. He felt at home sitting in front of a (computer) screen."


    In Memory Of A True Geek :)
  • Let's give credit where credit is due. Al Gore clearly took the initiative in creating the backslash, the ESC key, and ASCII.
  • by ErichTheWebGuy ( 745925 ) on Thursday June 24, 2004 @11:39PM (#9525223) Homepage
    www.bobbemer.com [bobbemer.com] (official website)

    And the google cache [64.233.167.104] for the impending slashdotting

    Among the more interesting tidbits is that he coined the word COBOL.
  • Sounds Like... (Score:5, Insightful)

    by Snagle ( 644973 ) on Thursday June 24, 2004 @11:40PM (#9525234)
    The guy must have been lucky or just had a lot of foresight. We could all pretend we knew who he was and say he'll be missed, but that would be a lie, so let's just give him credit for his contributions. He gets an "A" in my book for thinking up "Esc" and "\", unlike the bastard who invented "CAPS LOCK"!!!
  • by orthogonal ( 588627 ) on Thursday June 24, 2004 @11:41PM (#9525236) Journal
    "Bob Bemer... passed away at his home at the age of 78."

    The AP reported he was 84, and Wikipedia [wikipedia.org] confirms that he was born in 1920.

    In any case, I'd like to commemorate Mr. Bemer with the traditional Slashdot version of a Viking funeral:

    I just heard some sad news on talk radio - COBOL standardizer/Father of ASCII Bob Bemer was found dead in his Texas home this morning. There weren't any more details. I'm sure everyone in the Slashdot community will miss him - even if you didn't enjoy his character set, there's no denying his contributions to popular culture. Truly an American icon.
  • ahhh "esc" (Score:4, Funny)

    by atarione ( 601740 ) on Thursday June 24, 2004 @11:41PM (#9525239)
    I keep pressing it and yet I'm still stuck at my crappy job....sigh

  • ASCII (Score:5, Interesting)

    by Teri in Hell ( 727154 ) on Thursday June 24, 2004 @11:43PM (#9525245) Homepage
    ASCII really is something of beauty. It is universal (debatably) and useful. Everyone knows how to read or write it. It is simple to use for program configuration because almost any language can read and interpret it. It is the driving force of the web. We owe a lot to Bob for giving it to us. Plus, even though /. uses a forward slash, it could have been the other way.
  • RIP (Score:3, Insightful)

    by burtonator ( 70115 ) on Thursday June 24, 2004 @11:46PM (#9525267)

    Rest\ In\ Peace
  • ASCII Art Tributes? (Score:3, Interesting)

    by geofforius ( 791412 ) on Thursday June 24, 2004 @11:56PM (#9525322) Homepage
    There must be people out there with a bit of talent willing to have a crack at this!
  • by Acaila ( 259043 ) on Thursday June 24, 2004 @11:59PM (#9525340) Homepage Journal
    "Computer pioneer Bob Bemer, who published Y2K warnings in '70s, dies at 78" ....
    "has died after a battle with cancer. He was 84."

    2nd paragraph contradicts the first...
  • Y10k bug (Score:5, Funny)

    by sbergman2 ( 523735 ) <steve@rueb.com> on Friday June 25, 2004 @12:04AM (#9525367)
    Just for the record, I would like to predict that on Jan 1, 10000 much of the software currently in existence will malfunction unless it is modified to handle 5 digit years. Bemer made his prediction 29 years in advance. I'm making mine 7996 years in advance. So there! :-)
    • Re:Y10k bug (Score:3, Interesting)

      by 0racle ( 667029 )
      Only if you keep using outdated and poorly engineered OSes and hardware instead of Power Macs [businessweek.com], which are designed to handle dates through A.D. 29,940.

      Disclaimer: Y2K was nothing but overblown crap reported on by the uninformed media, and I would not want to be in any way associated with it. I just found it funny that PPC Macs handle such huge dates.
  • by Anonymous Writer ( 746272 ) on Friday June 25, 2004 @12:09AM (#9525385)

    As recently as a month ago, "He was on the computer every day," Teeler said Wednesday. "He is a man who literally worked just about every day until he died. He felt at home sitting in front of a (computer) screen."

    Do you people think he knew about Slashdot? Maybe he actually had an account and got involved with the story discussions. For all you know, he may have been a regular comment and story submitter on this site and nobody will notice his disappearance. Just a thought.

  • Goodbye Bob (Score:5, Insightful)

    by f1ipf10p ( 676890 ) on Friday June 25, 2004 @12:15AM (#9525407)
    EBCDIC to ASCII was as big a step as ASCII to Unicode. I hope that Bob's next step is even bigger. May he join that big computer in the sky and have restful NOOPs;

    from my (limited) COBOL days-

    CLOSE mName-# BobBemer

    Thanks Bob.
  • by keefey ( 571438 ) on Friday June 25, 2004 @12:16AM (#9525414)
    He surely can't have been the only one to predict the Y2K issue, but he was probably one of the only people, back then, who actually cared. I constantly hear the argument "ah well, they'll not be using it in x years' time, so we can forget about that; it's not an issue".

    Well, it was! Now, what happens when the number of seconds since 1970 rolls over the maximum value of a 32-bit int?
  • It figures (Score:5, Funny)

    by Skapare ( 16644 ) on Friday June 25, 2004 @12:17AM (#9525419) Homepage

    It figures that his age across the year 2000 would end up being miscalculated by someone ... or something.

  • by phreakv6 ( 760152 ) <phreakv6@nOSPAM.gmail.com> on Friday June 25, 2004 @12:36AM (#9525480) Homepage
    ''Don't drop the first two digits. The program may well fail from ambiguity in the Year 2000.''

    He wrote this in his article "Time and the Computer" way back in the '70s.
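The ambiguity Bemer warned about still has to be resolved by convention whenever a two-digit year is parsed. Python's strptime, for instance, applies the POSIX pivot, mapping 69-99 to 19xx and 00-68 to 20xx:

```python
from datetime import datetime

# %y accepts a two-digit year and pivots it: 69-99 -> 19xx, 00-68 -> 20xx.
print(datetime.strptime("69", "%y").year)  # 1969
print(datetime.strptime("00", "%y").year)  # 2000
# A program that blindly prefixed "19" would read "00" as 1900 --
# exactly the Year 2000 ambiguity Bemer described.
```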
  • by glMatrixMode ( 631669 ) on Friday June 25, 2004 @12:46AM (#9525507)
    > More interesting is that he predicted the Y2K bug all the way back in 1971!"

    which has not happened.
  • What a shame that something with such a powerful influence for good (as an escape for special characters) could be perverted and made so powerfully evil (as a directory separator under DOS/Windows).
  • by SdnSeraphim ( 679039 ) on Friday June 25, 2004 @01:08AM (#9525581)
    From about a year and a half ago until a couple of months ago, I had the pleasure of exchanging e-mails with him. He was very easygoing and responded to every one of my e-mails, even when they weren't that important. Even though I didn't know him beyond the history on his website, the way he treated me, a complete stranger, tells me that there was something special about him, beyond his "father of ASCII" title.
    • by orcmid ( 122810 ) on Friday June 25, 2004 @02:50AM (#9525821) Homepage
      That's nice to hear. Thanks. I worked for Bob while he was Director of Software at Sperry Univac in the '60s. He was a lot of fun: he kept calling me "Bub." I found him on the web prior to Y2K as the result of an article reporting that he was suggesting a repair that would not require people to remap existing records. (He wanted to pack the numbers tighter and buy some time.)

      I exchanged e-mail with him a few times in the last few years, and I had a chance to acknowledge the inspiration [miser-theory.info] he was for me while he was still around. I don't know that he was around here. When I last exchanged e-mail with him he was frustrated about what it took to maintain his web site. Your contact was more recent. What do you think?

      I guess he was a geek at heart. I had produced a fast decimal-to-binary algorithm for a machine that didn't have a built-in converter but addressed in binary and calculated in decimal (makes subscripting hard). He was the only one in his organization who worked it over and took more cycles out of it, and then I took out more using his ideas. He thanked me for giving him a chance to play.

      He also worried about improving programming languages, establishing software forensics, and making software engineering an activity that exploited reusable piece parts, anticipating components by a good 30 years. He funded Peter Landin and Bill Burge's work on functional programming in the US. He also understood about small details, like character sets and escape techniques.

      With regard to his people, he didn't believe in burning out developers, and he thought there was a lot of life to be had outside of the office. I'm pleased to learn that he was active to the end. I'll never forget him. -- Dennis E. Hamilton
  • Asshats.. (Score:3, Insightful)

    by cepheusfilms ( 776952 ) on Friday June 25, 2004 @02:21AM (#9525754)
    Here is a guy who imagined amazing things and contributed to the start of the computer revolution, and yet... what do Slashdot users do? ATTEMPT to think up witty and STUPID remarks to get themselves a nice "5, Funny" on their posts. Get a grip. Here is a great icon who has passed on. Why don't you take a moment to admire what he has done instead of being a total fuck? C
  • by karnat10 ( 607738 ) on Friday June 25, 2004 @02:52AM (#9525824)

    he defined the concept of using a special character to "escape" from one character set to another, and proposed to use the backslash for this (which hadn't existed in character sets until then).

    the escape key has nothing to do with this!

    thanks, slashdot editors, for misinforming people
  • by CactusCritter ( 182409 ) on Friday June 25, 2004 @02:57AM (#9525842)
    Back in about 1964, when I was an engineer and a member of the Cincinnati-Dayton Chapter of the ACM, I was surprised to learn from business-programming members that 2-digit year representation was being used. We agreed that it had better not be too long before the 2-digit year was replaced in databases.

    When Bob's article on the Y2K problem appeared in 1971, I was surprised that nothing had been done. Of course, disk storage space was still quite pricey. I thought that Bob's article would stir things up.

    When Y2K finally publicly surfaced in 1998 or 1999, I was stunned that not a damned thing had been done since Bob's definitive 1971 article on the topic.

    Last year when I was proofing a local guru friend's in-process book ("The Healthy PC" by Carey Holzman, Osborne-McGraw Hill), we fell into a dispute (which I lost, of course) about his belief that Y2K should be described as a bug (because that's the way it was presented to the public) rather than a temporary disk space-saving convenience which had lived much too long.

    I got in touch with Bob Bemer, with whom I had worked in the 1970s and 1980s, about what had actually gone down. He was very gracious and sent me a URL for a definitive newspaper article on Y2K:
    http://www.bobbemer.com/weingart.htm

    Bob was a very gracious person, as someone else observed, and both pleasant and impressive to work with; I knew somewhat of what he had accomplished.
  • by Thaidog ( 235587 ) <slashdot753@nym.hush. c o m> on Friday June 25, 2004 @03:16AM (#9525896)
    I mean, if he can't escape death who can?
  • ASCII art tribute (Score:5, Interesting)

    by ChronoWiz ( 709439 ) on Friday June 25, 2004 @04:34AM (#9526094) Journal
    Bob Bemer ASCII Art Tribute [optusnet.com.au]

    Hats off to a truly great man.
  • by Pedrito ( 94783 ) on Friday June 25, 2004 @06:53AM (#9526390)
    One of the most reviled men in computers, the creator of the EBCDIC character set, continues living.
  • by bdsesq ( 515351 ) on Friday June 25, 2004 @10:10AM (#9528167)
    Back in the early '80s we both worked for Honeywell. Bob was working on a full-screen editor that ran on Honeywell mainframes using TTY-based terminals. It was a neat hack.

    He was a true geek. He was very focused on whatever he was working on. So non-geeks thought he was difficult.

    He was living near Phoenix then, and his license plate was ESCAPE. I wondered what the police thought about that. Perhaps that's why he changed it to ASCII.

    R. I. P.

    (this all happened over 20 years ago so I may have some details wrong)
