60 Years of Hamming Codes 66
swandives writes "In 1950 Bell Labs researcher Richard W. Hamming made a discovery that would lay an important foundation for the modern computing and communications industries — a method for performing computing operations on a large scale without errors. Hamming wrote about how self-checking circuits help eliminate errors in telephone central offices. He speculated that the 'special codes' he proposed — which became known as Hamming codes — would only need to be applied to systems requiring unattended operation for long periods, or to 'extremely large and tightly integrated' systems where a single failure would incapacitate the entire installation. Hamming codes were the first discovery in an immense field called coding theory. This article looks back on the history of Hamming codes and their applications, and includes interviews with Todd Moon, professor of electrical and computer engineering at Utah State University, and David MacKay, professor of natural philosophy in the Department of Physics at the University of Cambridge and chief scientific adviser to the UK Department of Energy and Climate Change. An interesting read about a little-known but fundamental element of information theory."
See Feynman's Lectures on Computation (Score:5, Informative)
Feynman's excellent book 'Lectures on Computation' has a fantastic explanation of Hamming codes, Hamming distance, error correction, etc.
If you're even remotely interested in information theory you *must* read this book! No prior knowledge required.
If you're a cheap bastard I'm sure you can find a pdf, but it's well worth the asking price.
Re:Anonymous Coward (Score:2, Informative)
If you know anything about information theory, you know about hamming codes.
Re:See Feynman's Lectures on Computation (Score:5, Informative)
Feynman - The Douglas Robb Memorial Lectures [vega.org.uk]
Re:Anonymous Coward (Score:0, Informative)
There's a good reason Grace Hooper isn't well known for her contributions to computing. Of course, the same can't be said of Grace Hopper, whose team, among other things, logged the famous 'first actual case of bug being found' — a moth stuck in a relay of the Harvard Mark II.
Re:TFA mentions Mackay's book. (Score:3, Informative)
I just downloaded it.
http://www.inference.phy.cam.ac.uk/mackay/itila/book.html [cam.ac.uk]
Another Mackay book. (Score:2, Informative)
"Sustainable energy - without the hot air", available as a free PDF download.
I haven't read it yet, but I will, given the credibility he's earned from the article and his other book.
http://www.withouthotair.com/ [withouthotair.com] or http://www.inference.phy.cam.ac.uk/withouthotair/ [cam.ac.uk]
Here is a podcast of a lecture he gave at Cambridge on the topic of sustainable energy.
http://mediaplayer.group.cam.ac.uk/component/option,com_mediadb/task,play/idstr,CU-CSF-Lectures_2008-12_David_MacKay/vv,-1/Itemid,42 [cam.ac.uk]
How to do research like Hamming (Score:5, Informative)
Want to be like Hamming? Here's how:
In summary, I claim that some of the reasons why so many people who have greatness within their grasp don't succeed are:
* they don't work on important problems,
* they don't become emotionally involved,
* they don't try and change what is difficult to some other situation which is easily done but is still important,
* and they keep giving themselves alibis why they don't.
* They keep saying that it is a matter of luck.
I've told you how easy it is; furthermore I've told you how to reform. Therefore, go forth and become great scientists!
Source: http://paulgraham.com/hamming.html [paulgraham.com]
Re:Not the First Discovery in Coding Theory (Score:3, Informative)
Shannon's work was general over the error model. Coding theory assumes a specific error model (such as bit error rates, insertion, deletion, magnitude distortion).
It doesn't take much wit to translate Shannon's work into a rudimentary code. Constructing codes near the bounds of optimality is extremely difficult, especially if the decoder is to correct errors more efficiently than by combinatorial trial and error.
I wouldn't say Shannon's work was light on coding theory — much of it is implied pretty directly. I would say instead that his work was light on the algorithmic efficiency of sophisticated codes.
By the time you're correcting 5 bit errors in a 256 bit packet, you don't want to have to search for the nearest valid codeword combinatorially.
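To see why efficient decoding matters, here's a minimal sketch of the classic Hamming(7,4) code in Python (function names are my own). The point is that the 3-bit syndrome directly names the position of a single flipped bit, so the decoder never has to search over codewords at all:

```python
# Hamming(7,4): 4 data bits -> 7-bit codeword, positions numbered 1..7.
# Positions 1, 2, 4 hold parity bits; the syndrome of a received word
# is the binary position of a single flipped bit (0 means no error).

def encode(d):
    """d: list of 4 data bits -> 7-bit codeword as a list."""
    c = [0] * 8                 # index 0 unused; positions 1..7
    c[3], c[5], c[6], c[7] = d
    c[1] = c[3] ^ c[5] ^ c[7]   # parity over positions with bit 0 set
    c[2] = c[3] ^ c[6] ^ c[7]   # parity over positions with bit 1 set
    c[4] = c[5] ^ c[6] ^ c[7]   # parity over positions with bit 2 set
    return c[1:]

def correct(word):
    """word: 7 received bits -> corrected codeword (single-bit errors)."""
    c = [0] + list(word)
    syndrome = 0
    for p in (1, 2, 4):
        parity = 0
        for i in range(1, 8):
            if i & p:           # position i is covered by parity bit p
                parity ^= c[i]
        if parity:
            syndrome += p
    if syndrome:                # nonzero syndrome = position of the flip
        c[syndrome] ^= 1
    return c[1:]

data = [1, 0, 1, 1]
cw = encode(data)
received = cw[:]
received[4] ^= 1                # flip position 5 in transit
assert correct(received) == cw  # decoder repairs it with no search
```

With only 16 codewords, brute-force nearest-codeword search would also work here — but the syndrome trick scales, which is exactly what you need once you're correcting multiple errors in long packets with more powerful codes.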