3. Communication System: a structured, modular approach built from components with defined functions: channel coding, source coding, modulation, formatting, digitization, multiplexing, and access techniques, on both the send and receive sides.
4. Channel Coding: encoding the information sent over a communication channel so that, in the presence of channel noise, errors can be detected and/or corrected. It can be categorized into backward error correction (BEC) and forward error correction (FEC). Objective: provide coded signals with better distance properties.
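The FEC idea above can be illustrated with the simplest possible code, a rate-1/3 repetition code (an assumed toy example, far weaker than the codes discussed in these slides): the decoder corrects a single flipped bit by majority vote.

```python
# Minimal forward-error-correction sketch: a rate-1/3 repetition code.
# Illustrative only -- real systems use far stronger codes.

def encode(bits):
    """Repeat each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each group of three received bits."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

msg = [1, 0, 1, 1]
tx = encode(msg)
tx[4] ^= 1                 # flip one bit to simulate channel noise
assert decode(tx) == msg   # the single error is corrected
```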
5. Types of coding: block coding and convolutional coding. Convolutional codes differ from block codes in that they do not break the message stream into fixed-size blocks; instead, redundancy is added continuously to the whole stream. The encoder keeps the M previous input bits in memory, and each output bit depends on the current input bit as well as the M stored bits.
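A sketch of that idea, assuming a textbook rate-1/2 convolutional encoder with memory M = 2 and generator polynomials 7 and 5 (octal) — these parameters are an assumption, not taken from the slides. Each input bit yields two output bits computed from the current bit and the M bits in the shift register.

```python
# Rate-1/2 convolutional encoder sketch, memory M = 2,
# generators g1 = 111 and g2 = 101 (binary). Each output bit is the
# parity of the register bits selected by a generator polynomial.

def conv_encode(bits, M=2, g1=0b111, g2=0b101):
    state = 0                    # the M previous input bits
    out = []
    for b in bits:
        reg = (b << M) | state   # current bit followed by the memory
        out.append(bin(reg & g1).count("1") % 2)  # parity for g1
        out.append(bin(reg & g2).count("1") % 2)  # parity for g2
        state = reg >> 1         # shift: keep the newest M bits
    return out

# The impulse response reproduces the generators, interleaved:
print(conv_encode([1, 0, 0]))   # [1, 1, 1, 0, 1, 1]
```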
7. A Need for Better Codes: energy efficiency vs. bandwidth efficiency. Codes with a lower rate (i.e. more redundancy) correct more errors, so the communication system can operate with lower transmit power, transmit over longer distances, tolerate more interference, use smaller antennas, and transmit at a higher data rate. These properties make the code energy efficient. However, low-rate codes have a large overhead and are hence heavier on bandwidth consumption. Also, decoding complexity grows exponentially with code length.
8. Shannon Theory: for every combination of bandwidth (W), channel type, signal power (S), and received noise power (N), there is a theoretical upper limit on the data transmission rate R at which error-free data transmission is possible. This limit is called the channel capacity, or Shannon capacity, and it sets a limit on the energy efficiency of a code.
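For the AWGN channel, the capacity mentioned above is C = W log2(1 + S/N) bits per second. A small sketch with illustrative figures (the 3 kHz / 30 dB example is an assumption, not from the slides):

```python
# Shannon capacity sketch for an AWGN channel: C = W * log2(1 + S/N).
import math

def shannon_capacity(W_hz, snr_linear):
    """Upper bound on the error-free data rate, in bit/s."""
    return W_hz * math.log2(1 + snr_linear)

# A 3 kHz telephone-style channel at 30 dB SNR (linear SNR = 1000):
C = shannon_capacity(3000, 1000)
print(round(C))   # roughly 3e4 bit/s
```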
9. A decibel is a relative measure. If E is the actual energy and E_ref is the theoretical lower bound, then the relative energy increase in decibels is 10 log10(E / E_ref). Since 10 log10(2) ≈ 3, a twofold relative energy increase equals 3 dB.
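The 3 dB claim above can be checked directly:

```python
# Relative energy increase in decibels: 10 * log10(E / E_ref).
import math

def energy_increase_db(E, E_ref):
    """Energy ratio expressed on the decibel scale."""
    return 10 * math.log10(E / E_ref)

# A twofold energy increase is about 3 dB:
print(round(energy_increase_db(2.0, 1.0), 2))   # 3.01
```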
10. Turbo codes: a class of error-correcting codes introduced in 1993 that come closer to approaching Shannon's limit than any other class of error-correcting codes. Turbo codes achieve their remarkable performance with relatively low-complexity encoding and decoding algorithms.
11. Turbo Encoder (diagram): the input is fed to one RSC encoder directly and to a second RSC encoder through a random interleaver; the systematic codeword consists of the input X together with the two parity streams Y1 and Y2.
12. Recursive Systematic Coders (diagram): the data stream is copied to the output in natural order (the systematic part), while a recursive shift register with cells S1, S2, S3 produces the calculated parity bits.
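A minimal RSC encoder sketch matching the description above, assuming memory 2 with feedback polynomial 7 and forward polynomial 5 (octal) — a common turbo-code constituent, though the slides do not fix these parameters. The data is copied out unchanged, and the feedback through the shift register makes the code recursive.

```python
# Recursive systematic convolutional (RSC) encoder sketch,
# assumed parameters: memory 2, feedback poly 1+D+D^2, forward poly 1+D^2.

def rsc_encode(bits):
    s1 = s2 = 0                  # shift-register cells
    systematic, parity = [], []
    for b in bits:
        fb = b ^ s1 ^ s2         # recursive feedback (taps 1, 1)
        systematic.append(b)     # copy of the data in natural order
        parity.append(fb ^ s2)   # calculated parity bit (taps 1, 0, 1)
        s1, s2 = fb, s1          # shift the register
    return systematic, parity

print(rsc_encode([1, 0, 0, 0]))   # ([1, 0, 0, 0], [1, 1, 1, 0])
```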
13. Interleaver: the interleaver's function is to permute low-weight code words in one encoder into high-weight code words for the other encoder. A "row-column" interleaver: data is written row-wise and read column-wise; while very simple, it also provides little randomness. A "helical" interleaver: data is written row-wise and read diagonally. An "odd-even" interleaver: first, the bits are left uninterleaved and encoded, but only the odd-positioned coded bits are stored; then the bits are scrambled and encoded, but now only the even-positioned coded bits are stored. Odd-even encoders can be used when the second encoder produces one output bit per input bit.
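The row-column interleaver described above can be sketched in a few lines (grid dimensions are an illustrative choice):

```python
# "Row-column" interleaver sketch: write the data row-wise into a
# rows x cols grid, then read it back column-wise.

def row_column_interleave(bits, rows, cols):
    assert len(bits) == rows * cols
    return [bits[r * cols + c]
            for c in range(cols)      # read column-wise...
            for r in range(rows)]     # ...down each column

data = [1, 2, 3, 4, 5, 6]             # written as rows [1,2,3] / [4,5,6]
print(row_column_interleave(data, 2, 3))   # [1, 4, 2, 5, 3, 6]
```

The regular structure is what the slide means by "little randomness": nearby input positions stay at fixed, predictable distances in the output.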
17. Turbo Decoding Criterion: for n probabilistic processors working together to estimate common symbols, all of them should agree on the symbols, with the same probabilities that a single (optimal) decoder would produce.
27. Conclusion: End of Search. Turbo codes approach the theoretical limits with only a small gap. They gave rise to new codes such as low-density parity-check (LDPC) codes. Improvements in decoding delay are still needed.