Turbo Codes Promise Better Wireless Transmission
captain igor writes "IEEE is running a story about two French professors who have created a new class of encoding, called 'Turbo Codes,' that will allow engineers to pass almost twice as much data through a given communications channel, or equivalently, the same amount of data at half the power. The new codes allow the Shannon limit (the theoretical maximum capacity of a channel) to be approached to within, currently, 0.5 dB. Scientists hope that this breakthrough will revolutionize wireless communications, especially with the coming reclamation of large swaths of the EM spectrum." As the article points out, such codes are in use now, but seem poised for much wider implementation.
Finally!!! (Score:4, Funny)
Oh wait...
Odd wording (Score:5, Informative)
It also seems to be making a natural progression into new areas, beginning with satellite transmissions 11 years ago and making its way into other digital wireless applications along the way.
Thanks to timothy for almost clearing this up
Re:Odd wording (Score:5, Funny)
Wonder how long until the story about a revolutionary new compression scheme called LZW hits the frontpage?
Re:Odd wording (Score:4, Funny)
Yeah, I read about that new LZW thing the other day. I just hope it doesn't have patent problems.
Just wait three months (Score:2)
It gets better (Score:5, Interesting)
The main problem is that optimal decoding of any non-trivial code is NP-hard, which has been known for about 30 years now (i.e., every known optimal algorithm has complexity exponential in the code length). The Turbo breakthrough was to show that a suboptimal decoder with O(n) complexity for code length n could nonetheless achieve excellent results. This is the so-called "Turbo principle".
There is an even "newer" class of codes called Low-Density Parity-Check Codes that can beat turbo codes. Turbo codes have a small gap to the Shannon limit, and these new codes can potentially eliminate the gap. Small gains are a big deal; the rule of thumb is that 1 dB of gain is equal to a million dollars of annual revenue for a wireless provider.
The twist is that these LDPC codes were actually proposed in a 1963 PhD thesis, but were disregarded as beyond the computational abilities of the time. They were only "rediscovered" in 1996, after the Turbo code furore.
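For the curious, the decoding idea from that 1963 thesis (Gallager's) is simple enough to sketch. Below is a toy hard-decision bit-flipping decoder in Python; the tiny parity-check matrix is just the (7,4) Hamming one for illustration, whereas a real LDPC matrix is huge and sparse:

```python
import numpy as np

# Toy parity-check matrix (the (7,4) Hamming code); a real LDPC
# matrix would be far larger and very sparse.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(r, max_iters=10):
    """Gallager-style hard-decision bit flipping."""
    r = r.copy()
    for _ in range(max_iters):
        syndrome = H @ r % 2          # which parity checks fail?
        if not syndrome.any():
            break                     # all checks satisfied: done
        votes = syndrome @ H          # failed-check count per bit
        r[np.argmax(votes)] ^= 1      # flip the most-suspect bit
    return r

c = np.array([1, 0, 1, 1, 0, 1, 0])  # a valid codeword (H @ c % 2 == 0)
r = c.copy(); r[2] ^= 1              # one bit corrupted by the channel
assert (bit_flip_decode(r) == c).all()
```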
Re:It gets better (Score:3, Insightful)
So, turbo codes have brought the gain differential (sorry, don't know the proper term) to less than 0.5 dB of the Shannon limit?
That doesn't sound like terribly much, considering how much money those companies are pulling in already.
Re:It gets better (Score:4, Insightful)
Re:It gets better (Score:3, Informative)
Re:It gets better (Score:5, Interesting)
The twist is that these LDPC codes were actually proposed in a 1963 PhD thesis, but were disregarded as beyond the computational abilities of the time. They were only "rediscovered" in 1996, after the Turbo code furore.
The article also mentions that the latency associated with turbo codes is too high for most voice applications and that LDPC codes, while more computationally intensive, have a low latency. (At least, that's what I remember from the article).
I thought it was funny that their sponsor, Alcatel or whoever, never patented it in Asia, so NTT has been using turbo codes in Japan for years, free.
Re:It gets better (Score:5, Informative)
These large block sizes are not a problem at high data rates or on very long deep space links where the propagation delay still exceeds the block transmission time, but they are a problem with heavily compressed voice where low latencies are required.
One way to decrease the average latency associated with a forward error correcting code is to attempt a decode before the entire block has been received. If the signal-to-noise ratio is high, the attempt may succeed; if not, you wait, collect more of the frame and try again, and you've only lost some CPU cycles. This is called "early termination", and it's one of the tricks done in Qualcomm's 1xEV-DO system, now deployed by Verizon as BroadbandAccess. 1xEV-DO is probably the first widespread commercial application of turbo codes. A 128-byte code block is used to get good coding gain. This relatively large block size is practical because 1xEV-DO is primarily designed for Internet access rather than voice.
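A rough sketch of that early-termination loop, in Python; the trial decoder and the CRC framing here are made-up stand-ins for illustration, not Qualcomm's actual 1xEV-DO design:

```python
import zlib

def try_decode(symbols):
    # Stand-in for a real trial decode: an actual receiver would run a
    # (cheap, possibly failing) turbo decode over the soft symbols here.
    return bytes(symbols)

def receive_with_early_termination(frame, payload_len, chunk=16):
    buf = []
    for i in range(0, len(frame), chunk):
        buf.extend(frame[i:i + chunk])            # more symbols arrive
        guess = try_decode(buf[:payload_len + 4])
        if len(guess) == payload_len + 4:
            data, crc = guess[:payload_len], guess[payload_len:]
            if crc == zlib.crc32(data).to_bytes(4, "big"):
                return data                       # high SNR: done early
    return None                                   # never decoded cleanly

payload = b"hello, turbo world!!"
frame = list(payload + zlib.crc32(payload).to_bytes(4, "big")) + [0] * 40
print(receive_with_early_termination(frame, len(payload)))
```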
Re:It gets better (Score:2)
Isn't EV-DO designed _only_ for data access? I thought the DO stood for Data Only. (I currently work on IS-2000 1xRTT applications, and I have very little knowledge of the inner workings of EV-DO and EV-DV.)
Re:It gets better (Score:3, Insightful)
Also (please correct me if I'm wrong), wasn't EV-DV designed to treat all (non-legacy) calls as data? (What I am asking is: are all EV-DV calls VoIP calls, and do voice & data calls share the same service option number?) (I'm also of the assumpti
LDPC (Score:3, Insightful)
I'm sure there are some efficient implementations, but for certain applications having packets that long can be prohibitive.
Re:LDPC: It gets even betterer (Score:5, Informative)
They also don't require the more time-consuming decoding algorithm needed for Turbo Codes (although it can be used). Check 'em out; you can search for papers on IEEE Xplore. Also, someone is doing research to show that Turbo Codes are LDPC codes! Crazy.
I almost crapped my pants when I saw the statement that Turbo codes were new.
-Take that Lisa's beliefs!
Re:LDPC: It gets even betterer (Score:2)
Re:LDPC: It gets even betterer (Score:5, Insightful)
LDPC codes will be behind DVB-S2, the new transmission system for digital satellite video distribution. Since they approach the Shannon limit so closely, there will be no DVB-S3.
I should say that the IEEE article is a little over-hyped, in that these codes really only buy about 2-4 dB of additional gain. Concatenated RS and convolutional coding were already pretty close to the Shannon limit in AWGN. Those last couple of dB are nice, but the remaining 0.5-1 dB beyond LDPC and Turbo Codes isn't worth much.
Much more important now are ways to handle the "fast fading" channels found in mobile environments; this is what is driving OFDM.
Also, both Turbo Codes and LDPC codes are really computationally intensive to decode. They are currently only decoded at speeds below 20 Mbps, generally implemented in (expensive) FPGAs. We won't see really cheap ASICs for another year or two.
Re:LDPC: It gets even betterer (Score:3, Informative)
What amazes me is how far behind these companies are in actually implementing this technology. 10-year-old technology is actually "new."
A friend of mine was saying that even when new research developments are implemented in new hardware the IT
Re:LDPC: It gets even betterer (Score:3, Informative)
The ability to accurately distinguish bits in a communication system decreases with data rate (crudely speaking, this is because you get less time to "look" at each bit and distinguish it from noise). For a given communication system, we have an acceptable probability of bit error, say 10^-6. Given
Re:Then maybe you can explain something to me (Score:2)
Haha! (Score:4, Funny)
Double your hard drive space, your bandwidth, data transfer, penis size...
Lucky guys! (Score:4, Funny)
Re:Lucky guys! (Score:4, Funny)
1.) Suffering from severe dehydration.
2.) Skeletor's distant relative
News? (Score:3, Informative)
The problem is that turbo codes are so computationally intensive that using them in consumer electronics is only now becoming feasible.
Re:News? (Score:3, Insightful)
The problem is that turbo codes are so computationally intensive that using them in consumer electronics is only now becoming feasible.
Did you know about it? Did most people know about it? If something is making it more feasible to use, then it's news. Dr. Evil wanted to put laser beams on sharks *years* ago, but no one's really done it. If something made it possible to put frikking laser beams on sh
Re:News? (Score:2)
Re:whu-huh? (Score:2)
Re:News? (Score:5, Insightful)
Turbo coding is ultimately not much of a theoretical breakthrough, but a compromise and algorithmic hack that happens to work fairly well for real-world problems and is expressed in a language that people who work on communications systems understand. But that's nothing to be sneezed at, since it will ultimately mean that we will get higher data rates and other benefits in real-world products.
Re:News? (Score:3, Funny)
Re:News? (Score:2, Insightful)
Re:News? (Score:4, Funny)
Ahh, that would explain the "Turbo" thing: It comes from the era of Borland compilers and 486 clone boxes. If it were invented today, it would surely be called something like "iCodes".
Actually not true (Score:2)
Part of the turbo coding system involves a feedback loop between two separate decoders at the receiver. The feedback from each decoder helps the other decoder make a better decision about
Theory vs. Practice (Score:2, Insightful)
1905: Einstein's E=mc^2 predicts the enormous energy locked up in mass. ~1938: the atom is first split, and the equation proves correct.
FM radio was theorized for many years, but until Armstrong came up with his wideband FM system, no one could make an FM radio capable of broadcasting more than about a hundred feet.
CDMA has been around for a while too; Qualcomm doesn't own the patent on it, just some techniques for practically implementing it, which wasn't
TURBO! (Score:4, Funny)
Dang it, this software isn't made out of platinum, either.
Re:TURBO! (Score:4, Interesting)
Re:TURBO! - The name is correct! (Score:2, Interesting)
The key idea is that this feedback gives you an infinite impulse response, i.e. in theory all bits ever transmitted through such a channel will continue to affect it for ever after.
Even if you do limit the feedback time to more reasonable levels, you can still get a very useful increase in channel capacity.
It is also important to notic
Re:TURBO! (Score:2)
Re:TURBO! (Score:2)
As explained earlier, turbo codes were named because they use a positive feedback loop to improve their performance, just like a turbocharger; strictly speaking, though, "turbo" is a misnomer, since it implies the use of a turbine somewhere.
Question... (Score:2)
Re:Question... (Score:3, Informative)
Re:Question... (Score:2)
Re:Question... (Score:2)
Re:Question... (Score:2)
Maybe there is a problem with the latency or the processing power required. It could not be used in UMTS for voice communication because the latency would have been too high (see the article).
Surely it will be used as soon as it becomes cheap to do so for the operators
Reception Quality. (Score:3, Interesting)
Turbo Codes for modems? (Score:4, Interesting)
Re:Turbo Codes for modems? (Score:5, Informative)
Well, they are not exactly a compression technique... but basically an error-correcting technique, like Hamming codes. And thus they help you transfer more reliably at the same power.
see this small writeup [ed.ac.uk]
Besides, changes like the one you talk about would require hardware changes... as I said, it's not compression.
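Since Hamming codes came up: here is a minimal Hamming(7,4) encoder and single-error corrector in Python, just to show the idea of spending extra parity bits on reliability (the opposite of compression):

```python
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 1, 0],   # generator: 4 data -> 7 coded bits
              [0, 1, 0, 0, 1, 0, 1],
              [0, 0, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]])
H = np.array([[1, 1, 0, 1, 1, 0, 0],   # parity-check matrix
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(data):
    return data @ G % 2                 # 4 data bits in, 7 bits out

def correct(r):
    r = r.copy()
    syndrome = tuple(H @ r % 2)
    if any(syndrome):                   # a nonzero syndrome matches the
        for col in range(7):            # H column of the flipped bit
            if tuple(H[:, col]) == syndrome:
                r[col] ^= 1
    return r

word = encode(np.array([1, 0, 1, 1]))
noisy = word.copy(); noisy[5] ^= 1      # one bit flipped in transit
assert (correct(noisy) == word).all()   # ...and corrected at the receiver
```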
Re:Turbo Codes for modems? (Score:3, Informative)
Re:Turbo Codes for modems? (Score:2)
Re:Turbo Codes for modems? (Score:3, Informative)
Just for those reading, DSL isn't just louder shouting, it's shouting at a much higher frequency, so there are a lot more waves to float your data on. =)
Re:Turbo Codes for modems? (Score:2)
Or we could reach close to 1,000K by calling it DSL.
Minor problem (Score:2)
Note the log(1 + SNR) in Shannon's theorem...
Sorry, 56K is about it. (Score:5, Informative)
Nope. (They'd actually reduce the data rate if used.)
Turbo codes are members of the class of "Forward Error Correction" codes, which are used to correct errors that creep in during signal propagation from a sender to a receiver over a noisy channel. They work by sending EXTRA bits (typically 3 times as many with turbo), then processing what you get to correct the errors.
Turbo codes are typically used on noisy analog channels - such as radio links. Because they make the data bits less susceptible to noise, you can reduce the amount of power you use to transmit them. It's like saying the same sentence three times in a noisy room, rather than screaming over the noise level just once, so the guy at the next table can hear exactly what you said.
Turbo codes (and other FEC codes) send more bits. But because the underlying data is so much less likely to be received incorrectly, the sender can reduce the amount of power used for each bit by MORE than enough to make up for the extra bits. You can trade this "coding gain" either for more BPS at a given power level or a lower power consumption at a given BPS.
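A toy illustration of that trade, in Python: a rate-1/3 repetition code with majority voting - literally "saying it three times." Real turbo codes earn far more gain per redundant bit, but the principle is the same:

```python
import random

def coded_ber(p, repeat=3, trials=100_000):
    """Simulate a repetition code over a binary symmetric channel."""
    errors = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(repeat))
        errors += flips > repeat // 2      # majority vote got outvoted
    return errors / trials

p = 0.05                                   # raw channel bit-error rate
print(f"uncoded BER: {p}, coded BER: {coded_ber(p):.4f}")
# coded BER is roughly 3*p^2 ~ 0.007: far fewer errors at the same power
```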
But adding bits also increases the amount of information you're sending - either by broadening the bandwidth or more finely dividing the signaling symbol (for instance: more tightly specifying and measuring the voltage on a signal). So if your bitrate is limited by the channel capacity rather than the noise level you're stuck. The coding scheme will "hit the wall" and after that the extra bits come right out of your data rate. Not a problem with Ultra Wideband, or with encoding more bits-per-baud to get closer to a noise floor. But a big problem with telephone connections.
If you had a pure analog telephone connection they would be useful for increasing your bandwidth utilization. And in the old days you did. And analog modems struggled to push first 110 BPS, then 300, then 1200 through it despite the distortion and noise. Then they got fancy and cranked it up to 9.6k and beyond by using DSPs.
But these days you don't have an analog connection all the way. Your analog call is digitized into 8,000 8-bit samples per second and transmitted at 64,000 BPS to the far end. No matter WHAT you do to your signal you can't get more than that number of bits through it.
(This is actually very good, by the way, if the connection is more than a couple miles long. The digital signal is propagated pretty much without error, while an analog signal would accumulate noise, crosstalk, and distortion. So for calls outside your neighborhood you usually get a better signal-to-noise ratio with the digital system for the long hops than with an analog system. That's why cross-continent calls these days sound better than cross-town calls in the '50s.)
In practice it's worse than 64,000 BPS - because the system sometimes steals one of the bits from one sample in six for signaling about dialing, off-hook, ringing, etc. And you don't know WHICH sample. So you can only trust 7 of the bits. 56,000 BPS max. (It's a tad worse yet, because some combinations of signals are forbidden due to regulatory restrictions on how much energy you can put on a phone line - a particular signal could slightly exceed the limit. This makes the actual bit rate a little lower. And you have to sacrifice a little bandwidth to keep the receiver synchronized, too.)
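The arithmetic, spelled out:

```python
samples_per_sec = 8000          # PCM sampling rate on the digital trunk
bits_per_sample = 8
print(samples_per_sec * bits_per_sample)        # 64000 bps raw
print(samples_per_sec * (bits_per_sample - 1))  # 56000 bps, since robbed-bit
                                                # signaling makes the LSB of
                                                # every sample untrustworthy
```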
And you only get the approximately 56k in the downlink direction, because to use it you have to have a digital connection at the head end to make full use of the digital transmission. At your uplink you can't be sure enough of the sampling moment to create a waveform that would force exactly the right bit pattern out of the A-to-D converter. So you have to fall back to a modulation scheme that allows some slop when you're transmitting at the POTS end of the link. (If you had a digital connection at both ends - i.e. ISDN - yo
Re:Turbo Codes for modems? (Score:2)
Not really news... (Score:5, Informative)
Turbo codes aren't really new. As the article states, they were invented in 1993. I have a big stack of papers on them at home.
Turbo codes have a few problems, though. One, they are a pain to implement and consume a lot of resources. Two, turbo codes are SNR dependent, which makes them harder to use in varied channels.
The article also makes it seem as if there were no coding advances after Shannon published his original paper on channel capacity. Ungerboeck's papers on trellis-coded modulation (TCM) from 1982 or so radically altered digital communications, and he should have been mentioned. Some of the current turbo code research is trying to unify TCM and Turbo Coding (TTCM), which has great promise once it is practical.
there's plenty of room at the bottom!! (Score:2, Informative)
Error correction again seems like one of those bottomless pits... like trying to achieve zero kelvin... or a perfect vacuum... and of course this [zyvex.com] landmark talk by Feynman.
Another thing that worries me is why all preposterous claims are met with so much resistance... relativity, quantum mechanics, secure-crashfree-windows (oops)...
strange world we live in.
Re:there's plenty of room at the bottom!! (Score:2)
If it's true, it will be accepted in the end. Scientists don't want to live in a fantasy world. It seems logical that preposterous claims are met with resistance, otherwise every nutcase who makes an outrageous statement would cause a massive waste of time as people try to validate it.
And anyway, nothing about turbo codes is preposterous. It's been around for years, and I really don't understand why this arti
Moore's Law? (Score:3, Interesting)
Just interested...
Re:Moore's Law? (Score:3, Informative)
You can calculate the best possible compression scheme for handling arbitrary data. That is, the compression algorithm which, if you fed it every possible combination of input data, would compress the data the best. The algorithm is called Huffman coding.
The problem is, in actual use Huffman coding is crap. Why? Because real data isn't random; it doesn't cover the entire space of possible data.
So useful compression algorithms take advantage of the non-randomness of actual data, to do
Re:Moore's Law? (Score:5, Informative)
No. Huffman coding is only optimal if the block size grows to infinity. In that case, it can approach the Shannon limit. For finite data sets (i.e., all data sets), arithmetic coding performs slightly better on a per-symbol basis, because it is able to use fractional numbers of bits to represent symbols.
Huffman is never used on its own, except as a demonstration of basic compression algorithms. It's commonly used as the final step in entropy-coding the output symbols of another compressor, commonly a lossy compressor, like JPEG, MP3, or MPEG.
The problem is, in actual use Huffman coding is crap. Why? Because real data isn't random; it doesn't cover the entire space of possible data.
In actual coding, it is crap, but not for the reasons you listed. It's crap because it's infeasible to allow the block size to grow without bound, since the number of possible codes increases exponentially.
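For reference, the classic Huffman construction is only a few lines; this is the textbook greedy merge in Python, not any particular codec's implementation:

```python
import heapq, itertools
from collections import Counter

def huffman_codes(data):
    """Build a prefix code: frequent symbols get short bit strings."""
    tiebreak = itertools.count()          # keeps heap comparisons sane
    heap = [(n, next(tiebreak), sym) for sym, n in Counter(data).items()]
    heapq.heapify(heap)
    while len(heap) > 1:                  # repeatedly merge the two
        n1, _, a = heapq.heappop(heap)    # least-frequent subtrees
        n2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (n1 + n2, next(tiebreak), (a, b)))
    codes = {}
    def walk(tree, prefix=""):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"   # single-symbol edge case
    walk(heap[0][2])
    return codes

print(huffman_codes("abracadabra"))       # 'a' gets the shortest code
```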
As someone else has pointed out in the threads, if you allow your compression scheme to be useful only for particular files, and allow the compression and decompression software to be arbitrarily complex, there's essentially no limit to how tight you can compress data.
There's a very clear limit, given by the Shannon entropy (which can vary widely, depending on the transformations you apply to the data). It is also limited by the quite obvious pigeonhole principle. You cannot pick an algorithm which will compress a given data set arbitrarily well.
So in general, there's no way to tell what the technical limit is on compression.
No. The limit is definitely the Shannon entropy. You cannot magically represent data using fewer bits than necessary (and Shannon's entropy tells you how many that is). What changes between compression algorithms is the transformations which are applied to the data, or in other words, the different ways of looking at it which reveal a lower entropy than other ways of looking at it. However, the limit is always the Shannon entropy. The transformations can toss out data which is irrelevant. Your playing card example is a good one -- the information which is relevant is which card it is and possibly its angle. You've applied a transformation to the data -- transformed it from a bitmapped image to a series of card values and positions. You can then compress those pieces of information with Huffman codes (if the entropy allows it).
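A quick way to see that bound for a memoryless, per-symbol model is to compute the empirical Shannon entropy directly:

```python
import math
from collections import Counter

def entropy_bits_per_symbol(data):
    """Empirical Shannon entropy: -sum(p * log2(p)) over symbol frequencies."""
    total = len(data)
    return -sum(n / total * math.log2(n / total)
                for n in Counter(data).values())

text = "abracadabra"
h = entropy_bits_per_symbol(text)
print(f"{h:.2f} bits/symbol")        # ~2.04; no per-symbol lossless coder
print(f"{h * len(text):.1f} bits")   # can beat ~22.4 bits for this string
```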
Re:Moore's Law? (Score:2)
Your distinction between "transformation" and "compression" is unclear and appears arbitrary. For the purposes of the original question, "put data in, get less data out that represents original data losslessly" is compression; whether the method used is technically known as compression or coding or transformation or notational re-engineering is irrelevant.
Re:Moore's Law? (Score:2)
In the case of a Turing machine, it's not really relevant to mention how long the tape is (for example). It's definitely relevant to mention block sizes when discussing Huffman compression, because it has a major impact on its effectiveness.
Your distinction between "transformation" and "compression" is unclear and appears arbitrary.
I wasn't drawing a d
Re:Moore's Law? (Score:2)
Re:Shannon limit applies only to linear systems... (Score:2)
I don't even have the slightest idea what "linear" or "nonlinear" means in the context of information entropy. Thus I can only assume you're talking about the channel theorem.
Re:Moore's Law? (Score:2)
That algorithm is a simple "copy" operation, compressing the input to an output that is the same size.
It is not possible to construct a compression algorithm that will compress every possible combination of input data with a better result.
Re:Moore's Law? (Score:5, Informative)
The total spectrum goes from 30 Hz right up past light at some crazy big number. Frequencies below 30 MHz are called HF; these frequencies are pretty full, but the total band is only 30 MHz - that's only about 6 TV channels, or a lot of small 30 kHz voice channels. This band is so saturated because it's not limited to line of sight, and it's possible to communicate right around the world on it. It's also the easiest to transmit and receive on, because filters for these low frequencies have very high Q (very narrow bandpass; basically, they tune better). At these frequencies things are already pretty saturated, but they are strictly regulated, along with most of the spectrum up to 300 GHz.
Frequencies from 30 MHz to 1 GHz have been the primary frequencies for local communications for the past 50 years or so, ever since superheterodyning came along, which lets you "move" a frequency down to a lower intermediate frequency where you have high-Q filters. This range is pretty full too; there's not much room for new communications. Aircraft, broadcast TV and radio, police, cell phones - everything is in this range, and old UHF TV channels were recently given up to make room for cell phones. These bands are strictly LOS - well, RF line of sight, which is a little better than optical line of sight, since lower frequencies refract and diffract more than higher-frequency light.
Above 1 GHz has been very expensive to use until recently; with the advent of new semiconductors and processes, it's now viable to use these frequencies. There is lots of bandwidth up here to use. A lot of it is tied up in radar and with the military, which is just legal BS because 99.999% of it isn't being used - it's more political than technical. There is also quite a bit being wasted by older technology that uses something like 20 MHz for a 6 MHz signal.
Also, a lot of companies own rights to these microwave frequencies (above 1 GHz), have switched to fiber optics, and are just letting the spectrum sit idle.
Frequencies above 1 GHz can be used again and again in different areas, provided they are separated by mountains or the curvature of the earth. The problem with 802.11b is that there are only 3 channels that don't overlap.
Also, with frequencies above 1 GHz you can use a dish to focus the signal on one area rather than using more power in all directions. This way you will only interfere with people using the same frequencies in the direction your dish is pointing. Think flashlight vs. room lamp: the flashlight can light up a small area far away, and so can the room lamp, but the room lamp uses more power because it also lights up the full room.
In short, there are tons of spectrum left; we just need to get some of it back from the military and wasteful users.
Not long (Score:2)
i.e. a communications system is always optimized to maximize performance with a bitstream that is assumed to be non-redundant.
Turbo codes are a method for error correction, not compression. In fact, they do the exact opposite of compression - they ADD redundant data.
As to a
Delay (Score:3, Informative)
As other posters have commented, turbo codes have been around a long time. Any digital modulation course will cover them.
Turbo Codes do have one major drawback
Re:Delay (Score:2, Informative)
What are they using for the decoding? Monkeys turning hand cranks?
Modern turbo coding cores (and I mean modern as in several years ago) introduce a latency of only a couple frames.
Whoa, way false (Score:5, Informative)
Gee, that's funny. I work at a satellite internet company, and we use Turbo codes in our FPGAs. The delay from TPC is in the low millisecond range - trust me.
Anything close to one second would have been unacceptable.
I don't know where you got your numbers from.
Re:Whoa, way false (Score:2)
Re:Whoa, way false (Score:3, Informative)
High-powered? (Score:4, Informative)
Re:High-powered? (Score:3, Informative)
It's a really clever hack -- when the protocol asks for a retry, the modem compares the analog levels of the second copy of the packet to the analog levels of the first copy. At first I thought they were signal-averaging, but a telecom engineer told me it's doubtless more sophisticated.
So, even before applying ECC to corre
Turbo Codes? (Score:3, Insightful)
However, the latest word on source and channel coding is Space-Time Coding. Convolutional S-T codes especially are very, very promising, and quite naturally perform even better than block S-T codes...
What I don't understand is: why this, now? It's like running a feature on GSM instead of writing about TDD or FDD in 3G, or even the ongoing discussion about how 4G will shape up...
Value of bands (Score:2)
Are there any of you who could comment on whether this will reduce the value of such a chunk of the spectrum?
LDPC codes (Score:5, Interesting)
Re:LDPC codes (Score:3, Interesting)
The patent issue might be one of the reasons why it is probably not as widespread in use today. In space communication various methods of FE
Re:LDPC codes (useful links to coding information) (Score:2, Interesting)
This allows the system to do a good job of guessing what the original message is quickly. If you're interested in Turbo Codes, one of my former professors has done a lot of work with them, and has links to other turbo code sites on h
Re:LDPC codes (Score:3, Interesting)
Next on the list... (Score:2, Funny)
Up, Up, Down, Down, Left, Right, Left, Right... etc.
But what is it good for? (Score:5, Funny)
I was wondering when they would get to the practical applications.
TURBO! (Score:2, Funny)
Shannon's Limit.... (Score:5, Informative)
max data rate = 2H log2(V) bits/sec
Here, H is the bandwidth available (usually through a low-pass filter), and V is the number of discrete signal levels (V = 2 in a straight binary system). (This noiseless-channel formula is actually due to Nyquist.)
This equation, however, is for a noiseless channel, which doesn't exist. So Shannon updated the formula for a channel with a given signal-to-noise ratio. This turns out to be:
max bits/sec = H log2(1 + S/N)
Here the log is again base 2, H is still the bandwidth (in Hertz), and S/N is the signal-to-noise ratio expressed as a plain power ratio (not in dB). Notice that the parameter V is missing. This is because no matter how many discrete symbols you have, or how often you sample them, this is still the maximum number of bits/sec that you will be able to attain.
Most coding schemes, or modulation techniques as they are also called, rely on shifting signal amplitude, frequency, and phase to transmit more than one bit per symbol. The problem is that the more symbols there are, the harder it is to detect them correctly in the presence of noise. Basically, when you fire up your modem and you hear all that weird buzzing and beeping, a lot of that is time your modem spends determining just how noisy the channel is, and what the best modulation scheme is that it can use while still detecting symbols correctly.
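A quick sanity check of both formulas on a nominal voice line; the 3 kHz bandwidth and 30 dB SNR figures below are illustrative assumptions, not numbers from the article:

```python
import math

H = 3000                        # assumed voice-line bandwidth, Hz
V = 2                           # binary signaling levels
snr_db = 30                     # assumed line SNR
snr = 10 ** (snr_db / 10)       # convert dB to a linear power ratio

nyquist = 2 * H * math.log2(V)      # noiseless limit: 6000 bps
shannon = H * math.log2(1 + snr)    # noisy limit: ~29,900 bps
print(nyquist, round(shannon))
```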
Most frightening information .... (Score:3, Informative)
New phone buttons? (Score:3, Funny)
Not new (Score:2, Informative)
The article simply says that turbo coding is about to go mainstream.
Why not use VMSK? (Score:3, Funny)
Check out http://www.vmsk.org/ where all the claims are made. Supposedly you could compress all communication into 1 Hz of bandwidth, or so.
[of course I do not believe this]
Re:Why not use VMSK? (Score:2)
Re:Why not use VMSK? (Score:2)
Interesting article (Score:2, Informative)
original paper / there are actually 3 authors (Score:2, Informative)
Claude Berrou, Alain Glavieux, and Punya Thitimajshima, "Near Shannon Limit Error-Correcting Coding and Decoding: Turbo-Codes," in Proceedings of the IEEE International Conference on Communications (Geneva, Switzerland), pp. 1064-1070, May 1993.
http://gladstone.systems.caltech.edu/EE/Courses/EE127/EE127B/handout/berrou.pdf [caltech.edu]
(with 400 citations, as counted by CiteSeer ResearchIndex)
http://citeseer.ist.psu.edu/context/31968/0 [psu.edu]
-- yeah, it has been around since 1993
Re:Pfff there's a lot faster than this (Score:2)
Re:Pfff there's a lot faster than this (Score:2)
And there's a lot better than that too... (Score:2)
Hint: transmitting the names of things is not the same as transmitting the things themselves.
Re:Has it something to do with signal sampling? (Score:5, Informative)
Shannon capacity is the theoretical bit-rate you can stuff through a channel of a given physical bandwidth with a given signal to noise ratio:
C = W * log2(1 + S/N)
where C is capacity (bits per second), W is bandwidth (Hz), S is signal power, and N is noise power.
As far as Turbo Codes go, without getting too technical, it's an extension of the principle that you can increase the efficiency of a channel, while transmitting at a given power, by attaching some redundant bits (redundant signal dimensions is probably a better way to look at it, though) to the signal. I'm not too familiar with the particulars of Turbo Coding, but it is a lot like Viterbi coding, where the redundant bits depend on the data bits; when you detect an error (the redundant bits don't match a valid sequence), you back-trace through your data, find the most likely non-errored sequence, and adjust your data bits accordingly.
Tim
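To make that back-tracing description concrete, here is a toy hard-decision Viterbi decoder in Python for the classic rate-1/2, constraint-length-3 convolutional code (generators 7 and 5 octal). Turbo decoders use a soft-output relative of this idea, so treat this only as a sketch of the back-trace principle:

```python
G = (0b111, 0b101)      # generator taps of the classic (7,5) code

def conv_encode(bits):
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state                     # [newest, s1, s0]
        out += [bin(reg & g).count("1") % 2 for g in G]
        state = reg >> 1                           # shift register
    return out

def viterbi_decode(received):
    metrics, paths = {0: 0}, {0: []}               # encoder starts in state 0
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metrics, new_paths = {}, {}
        for state, m in metrics.items():
            for b in (0, 1):                       # hypothesize an input bit
                reg = (b << 2) | state
                expect = [bin(reg & g).count("1") % 2 for g in G]
                cost = m + sum(e != x for e, x in zip(expect, r))
                nxt = reg >> 1
                if nxt not in new_metrics or cost < new_metrics[nxt]:
                    new_metrics[nxt] = cost        # keep only the survivor
                    new_paths[nxt] = paths[state] + [b]
        metrics, paths = new_metrics, new_paths
    return paths[min(metrics, key=metrics.get)]    # best path = likeliest data

msg = [1, 0, 1, 1, 0, 0, 1, 0]
coded = conv_encode(msg)
coded[3] ^= 1                                      # one channel bit error
assert viterbi_decode(coded) == msg                # corrected by back-trace
```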
Re:Has it something to do with signal sampling? (Score:2, Informative)
Theory finally caught up with them more recently, from the framework of probabilistic networks. It turned out that if you have a Bayesian network with cycles, inference is difficult. But there is a method of belief propagation throu
Re:Has it something to do with signal sampling? (Score:2)
Soft Decision Decoding (Score:2)
Well, no, not obvious (Score:3, Informative)
Re:STOP USING THE WORD "TURBO" (Score:2, Informative)
It was France Telecom that asked Berrou to come up with a commercial name for the invention. He found the name one day while watching a car race on TV: he noticed that the newly invented code used the output of the decoders to improve the decoding process, much as a turbocharger uses its exhaust to force air into the engine and