60 Years of Hamming Codes

swandives writes "In 1950, Bell Labs researcher Richard W. Hamming made a discovery that would lay an important foundation for the modern computing and communications industries: a method for performing computing operations on a large scale without errors. Hamming wrote about how self-checking circuits help eliminate errors in telephone central offices. He speculated that the 'special codes' he proposed, which became known as Hamming codes, would only need to be applied to systems requiring unattended operation for long periods, or to 'extremely large and tightly integrated' systems where a single failure would incapacitate the entire installation. The Hamming code was the first discovery in an immense field called coding theory. This article looks back on the history of Hamming codes and their applications, and includes interviews with Todd Moon, professor of electrical and computer engineering at Utah State University, and David MacKay, professor of natural philosophy in the Department of Physics at the University of Cambridge and chief scientific adviser to the UK Department of Energy and Climate Change. An interesting read about a little-known but fundamental element of information theory."
  • News For Nerds (Score:5, Insightful)

    by DarkKnightRadick ( 268025 ) on Thursday November 25, 2010 @01:02PM (#34343552) Homepage Journal

    News for nerds, stuff that matters.

    This submission qualifies.

  • by radio4fan ( 304271 ) on Thursday November 25, 2010 @01:19PM (#34343686)

    Feynman's excellent book 'Lectures on Computation' has a fantastic explanation of Hamming codes and distance, error correction etc.

    If you're even remotely interested in information theory you *must* read this book! No prior knowledge required.

    If you're a cheap bastard I'm sure you can find a pdf, but it's well worth the asking price.

    • by puto ( 533470 ) on Thursday November 25, 2010 @01:42PM (#34343812) Homepage
      Anything by Feynman is worth it. I am 41 years old, and I remember seeing Super 8 films of him in grammar school, shown by my science teacher. Brilliant and engaging.
      • by Rockoon ( 1252108 ) on Thursday November 25, 2010 @02:13PM (#34343986)
        For your enjoyment

        Feynman - The Douglas Robb Memorial Lectures
      • by refactored ( 260886 ) on Thursday November 25, 2010 @05:22PM (#34345266) Homepage Journal
        ... is excessively vague and handwavy, requiring literally hours and pages of close work to fill in the gaps between equation N as he shows it and the next equation N+1 that he claims follows from it.

        Yup, a brilliant guy I'm sure, but not the guy I want teaching me. At the end of a course, (call me greedy), _I_ want to know how to do everything in the course, not merely have a warm fuzzy WEE-WOW feeling that something exciting just went by that I can't quite reproduce.

        Give me Richard Hamming's books instead of Feynman's any day. Ok, they won't make you go "WEEE WOW!!!", but on the other hand you will have an excellent understanding of the material AND be able to reproduce and USE any result in them yourself.

        • by godunc ( 1836034 )

          At the end of a course, (call me greedy), _I_ want to know how to do everything in the course, not merely have a warm fuzzy WEE-WOW feeling that something exciting just went by that I can't quite reproduce.

          Holy flamebait; you are talking about America's most famous physics teacher. You were expecting a "how to" book from a Nobel laureate?

          • Flamebait nothing; a wake-up call to see past the hype... yes.

            Maybe if I were Nobel Laureate material... I'd agree with you.

            However, Feynman is not a suitable teacher for me, nor for that 99.9% segment of humanity we call "mere ordinary mortals".

            My wife, bless her, is quite capable of digesting and following Feynman's books, but then her skills are "world class" (to me "goddess-like").

            By far most maths/physics university graduates, let's be honest now, get a "Whee! Wow! That was exciting!" and not much more.

    • I'll have to look this up. I took an undergrad engineering elective in Error Coding and found it to be one of the most fascinating subjects I have been exposed to. The mathematics behind it really are amazing.
    • by martin-boundary ( 547041 ) on Thursday November 25, 2010 @05:15PM (#34345224)

      If you're a cheap bastard I'm sure you can find a pdf, but it's well worth the asking price.

      Exactly. Remember kids, the money you spend goes directly to Richard Feynman, so he can continue to write excellent books from beyond the grave.

  • TFA mentions MacKay's book. It is an awesome book, and it is free online. I have a dead-tree copy, too. Well worth the price.

  • by hoytak ( 1148181 )

    Hamcode, hamcode, where you been? Around the world and I'm going again...

  • Hamming code was the first discovery in an immense field called coding theory

    First discovery? I would say Shannon's historic paper on coding theory "A Mathematical Theory of Communication" from 1948 was earlier.

    • To summarize the article that you seem not to have read: Shannon is cited as writing the seminal paper to which you refer, and in it he gave an existence proof for error-correcting codes. He did not, in his paper, actually go so far as to construct an ECC. According to TFA, Shannon is credited with creating the entire field of information theory. Not a bad accomplishment. Hamming was noted as actually creating ECCs and laying the foundation stone for coding theory. It's probably why they named the codes after him.
    • by rrohbeck ( 944847 ) on Thursday November 25, 2010 @03:30PM (#34344574)

      Shannon only proved that those codes exist. Hamming gave the first examples. So it's fair to say that Shannon was about information theory, not coding theory.

      • Re: (Score:3, Informative)

        by epine ( 68316 )

        Shannon's work was general over the error model. Coding theory assumes a specific error model (such as bit error rates, insertion, deletion, magnitude distortion).

        It doesn't take much wit to translate Shannon's work into a rudimentary code. Constructing codes near the bounds of optimality is extremely difficult, especially if the decoder is to correct errors more efficiently than combinatorial trial and error.

        I wouldn't say Shannon's work was light on coding theory, much of which was implied pretty directly.

        • By the time you're correcting 5 bit errors in a 256-bit packet, you don't want to have to search for the nearest valid codeword combinatorially.

          The best codes, LDPCC, can only be decoded optimally using combinatorial search, the question "is this the optimal decode?" being NP-complete. So, they always use heuristic decoders, generally loopy BP. Fortunately, that is extraordinarily suitable for efficient hardware implementation. But it gives no bounds on error rate, nor even a guarantee of convergence.

          At thi

  • For the record, Dr. Moon is the Department Head of ECE. He gives killer lectures, but his tests will make you wish for death ;-).

    I'm currently working on an MS there. Good to see him get some publicity.
  • Easy to understand (Score:2, Interesting)

    by elbow_spur ( 749878 )
    Hamming codes are practical things, while Shannon's analyses of codes were more abstract (though still hugely useful and important).
    Consider the checksum (parity) bit. It helps to catch errors, but there are two problems. First, if there is a double error (more likely when the checksum covers a longer string), the error isn't caught. Second, even if we know there is an error, we can't recover; we have to resend.
    The easiest error-correcting code is to replace every bit with a triple copy of itself, so 101 becomes 111000111.
    • Not to mention the very small footprint: for N bits in a transmitted message, you only need about log2(N) parity bits to correct any single-bit error, instead of tripling the message. You can pretty easily balance how robustly you want to protect your data against how much extra information you transmit.
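The two schemes discussed in this sub-thread can be sketched in a few lines of Python. This is a rough illustration, not production ECC code; the function names are made up for the example. It shows the triple-repetition code with majority-vote decoding, and the classic Hamming(7,4) construction, which protects 4 data bits with only 3 parity bits and corrects any single-bit error.

```python
def repeat3_encode(bits: str) -> str:
    # Send each bit three times: "101" -> "111000111"
    return "".join(b * 3 for b in bits)

def repeat3_decode(coded: str) -> str:
    # Majority vote over each triple corrects one flipped bit per triple
    return "".join(
        "1" if coded[i:i + 3].count("1") >= 2 else "0"
        for i in range(0, len(coded), 3)
    )

def hamming74_encode(d):
    # d = [d1, d2, d3, d4]; parity bits occupy positions 1, 2 and 4
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # covers codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # covers codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # covers codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(c):
    # Recomputing the parities gives a 3-bit syndrome that is exactly
    # the (1-based) position of a single-bit error, or 0 if there is none.
    p1, p2, d1, p3, d2, d3, d4 = c
    syndrome = ((p1 ^ d1 ^ d2 ^ d4)
                + (p2 ^ d1 ^ d3 ^ d4) * 2
                + (p3 ^ d2 ^ d3 ^ d4) * 4)
    c = list(c)
    if syndrome:
        c[syndrome - 1] ^= 1  # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]

assert repeat3_encode("101") == "111000111"
assert repeat3_decode("110000111") == "101"   # one flipped bit, corrected

word = hamming74_encode([1, 0, 1, 1])
corrupted = word[:]
corrupted[4] ^= 1                             # flip a single bit
assert hamming74_decode(corrupted) == [1, 0, 1, 1]
```

The footprint difference is visible directly: repetition triples the message, while Hamming(7,4) pays only 3 parity bits per 4 data bits, and larger Hamming codes pay proportionally less.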

  • by knutert ( 1142047 ) on Thursday November 25, 2010 @04:10PM (#34344832)

    Want to be like Hamming? Here's how:
    In summary, I claim that some of the reasons why so many people who have greatness within their grasp don't succeed are:
            * they don't work on important problems,
            * they don't become emotionally involved,
            * they don't try and change what is difficult to some other situation which is easily done but is still important,
            * and they keep giving themselves alibis why they don't.
            * They keep saying that it is a matter of luck.
    I've told you how easy it is; furthermore I've told you how to reform. Therefore, go forth and become great scientists!

    Source: Hamming's talk "You and Your Research" []

    • by sahai ( 102 ) []

      That is an online transcription of Claude Shannon's thoughts on the matter. Google pointed to this link from Sharif, but you can see a printed version in the second volume of his collected works, I believe. I ran across this during my blissful graduate student days.

      I love how this talk ends with him asking his audience to come and look at this machine that he built.

  • In my Naval career, I was lucky enough to come across both of these titans of computing's early age. RADM Hopper gave a lecture to every plebe class at the Naval Academy, including mine in 1984, where she would give each Midshipman a short length of wire, the length that light travels in a nanosecond. She used these to illustrate stories she told of the early days of computers, which were programmed by connecting wires differently. Her speech was the first place I heard "it was easier to beg forgiveness than to ask permission."
