
CODING THEORY

A Bird’s Eye View

Textbooks

Shu Lin and Daniel J. Costello, Jr., “Error Control Coding: Fundamentals and Applications”, Prentice Hall Inc.

R. E. Blahut, “Theory and Practice of Error Control Coding”, McGraw-Hill.

References:

Rolf Johannesson and Kamil Sh. Zigangirov, “Fundamentals of Convolutional Coding”, Universities Press (India) Ltd., 2001.

J. G. Proakis, “Digital Communications”, McGraw-Hill.

Introduction

Role of Channel Coding in a Digital Communication System

Block Codes and Convolutional Codes

Channel Models

Decoding Rules

Error Correction Schemes

References

Shannon’s Theorem (1948)

Noisy Coding Theorem, due to Shannon:

Roughly: consider a channel with capacity C. If we are willing to settle for a rate of transmission strictly below C, then there is an encoding scheme for the source data that will reduce the probability of a decision error to any desired level.

Problem: the proof is not constructive! To this day, no one has found a way to construct the coding schemes promised by Shannon’s theorem.

Shannon’s Theorem (1948)-contd

Additional concerns:

Is the coding scheme easy to implement, both in encoding and decoding?

May require extremely long codes.

The Shannon-Hartley Theorem

Gives us the theoretical maximum bit-rate that can be transmitted with an arbitrarily small bit-error rate (BER), at a given average signal power, over a channel of bandwidth B Hz affected by AWGN.

For any given BER, however small, we can find a coding technique that achieves it; the smaller the given BER, the more complicated the coding technique will be.

Shannon-Hartley Theorem-contd.

Let the channel bandwidth be B Hz and the signal-to-noise ratio be S/N (not in dB). Then

$C = B \log_2(1 + S/N)$ bits/sec
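
As a quick numerical check, a minimal sketch of the capacity formula (the 3 kHz bandwidth and 30 dB SNR below are illustrative values, not from the slides):

```python
import math

def capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N) in bits/sec."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Example: a 3 kHz telephone-grade channel at 30 dB SNR.
snr_db = 30.0
snr = 10 ** (snr_db / 10)          # convert dB to a linear power ratio
print(capacity_bps(3000, snr))     # ~29,902 bits/sec
```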

Shannon-Hartley Theorem-contd.

For a given bandwidth B and a given S/N, we can find a way of transmitting data at a bit-rate of R bits/second, with a bit-error rate (BER) as low as we like, as long as R < C.

Now assume we wish to transmit at an average energy per bit of Eb, and that the AWGN has a two-sided power spectral density of N0/2 Watts per Hz. It follows that the signal power is S = Eb R Watts and the noise power is N = N0 B Watts.

Shannon-Hartley Theorem-contd.

The ratio R/B is called the bandwidth efficiency, in bits/sec/Hz: how many bits per second we get for each Hz of bandwidth. We want this to be as high as possible. Eb/N0 is the normalised average energy per bit, where the normalisation is with respect to the one-sided PSD of the noise. The law gives the following bounds:

Shannon-Hartley Theorem-contd.

$\frac{R}{B} \le \log_2\!\left(1 + \frac{E_b}{N_0}\,\frac{R}{B}\right)$

$\left(\frac{E_b}{N_0}\right)_{\min} = \frac{2^{R/B} - 1}{R/B}$
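
A quick numerical check of the bound (a minimal sketch; the sample R/B points are chosen for illustration). It also confirms that (Eb/N0)min tends to ln 2, about -1.59 dB, as R/B goes to zero:

```python
import math

def ebn0_min(eta: float) -> float:
    """Minimum Eb/N0 (linear) allowed by the Shannon-Hartley law
    at bandwidth efficiency eta = R/B bits/sec/Hz."""
    return (2 ** eta - 1) / eta

for eta in (4.0, 2.0, 1.0, 0.5, 0.01):
    print(f"R/B = {eta:5.2f}   (Eb/N0)min = {10 * math.log10(ebn0_min(eta)):6.2f} dB")

# As R/B -> 0, (2^eta - 1)/eta -> ln 2 ~ 0.693, i.e. about -1.59 dB:
print(10 * math.log10(math.log(2)))   # -1.5917...
```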

Shannon Limit

The bound gives the minimum possible normalised energy per bit satisfying the Shannon-Hartley law.

If we draw a graph of (Eb/N0)min against R/B, we observe that (Eb/N0)min never goes below about 0.69 (= ln 2), which is about -1.6 dB.

Therefore, if our normalised energy per bit is less than -1.6 dB, we can never satisfy the Shannon-Hartley law, however inefficient (in terms of bits/sec/Hz) we are prepared to be.

Shannon Limit-contd.

There exists a limiting value of Eb/N0 below which there cannot be error-free communication at any transmission rate.

The curve R = C divides the achievable and non-achievable regions.

Modulation-Coding trade-off

For Pb = 10^-5, BPSK modulation requires Eb/N0 = 9.6 dB (the optimum uncoded binary modulation).

For this case, Shannon’s work promised a performance improvement of 11.2 dB over uncoded binary modulation, through the use of coding techniques.

Today, “Turbo Codes” are capable of achieving an improvement close to this.

“Turbo Codes” are near-Shannon-limit error-correcting codes.
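
As a sanity check of the 9.6 dB figure: for uncoded BPSK over AWGN the standard result is Pb = Q(sqrt(2 Eb/N0)) = 0.5 erfc(sqrt(Eb/N0)); a small bisection sketch recovers the operating point:

```python
import math

def bpsk_ber(ebn0_db: float) -> float:
    """Uncoded BPSK over AWGN: Pb = Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0))."""
    return 0.5 * math.erfc(math.sqrt(10 ** (ebn0_db / 10)))

# Bisection: BER decreases monotonically in Eb/N0; find where it hits 1e-5.
lo, hi = 0.0, 20.0
for _ in range(60):
    mid = (lo + hi) / 2
    if bpsk_ber(mid) > 1e-5:
        lo = mid          # still too noisy: need more energy per bit
    else:
        hi = mid
print(f"Eb/N0 for Pb = 1e-5: {hi:.2f} dB")    # ~9.59 dB
# Distance to the -1.59 dB Shannon limit: 9.59 - (-1.59) ~ 11.2 dB of
# potential coding gain, the improvement Shannon's work promised.
```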

Coding Theory-Introduction

Main problem:

A stream of source data, in the form of 0’s and 1’s, is being transmitted over a communication channel, such as a telephone line. Occasionally, disruptions can occur in the channel, causing 0’s to turn into 1’s and vice versa.

Question: How can we tell when the original data has been changed, and when it has, how can we recover the original data?

Coding Theory-Introduction

Easy things to try:

Do nothing. If a channel error occurs with probability p, then the probability of making a decision error is p.

Send each bit 3 times in succession. The bit that occurs the majority of the time gets picked (e.g. 010 => 0).

Repetition codes!!
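
A minimal sketch of this triple-repetition scheme with majority-vote decoding (the function names are mine, for illustration):

```python
def rep3_encode(bits):
    """Repeat each source bit three times: 0 -> 000, 1 -> 111."""
    return [b for b in bits for _ in range(3)]

def rep3_decode(received):
    """Majority vote over each group of three received bits."""
    return [int(sum(received[i:i+3]) >= 2) for i in range(0, len(received), 3)]

coded = rep3_encode([1, 0])          # [1, 1, 1, 0, 0, 0]
corrupted = [1, 1, 1, 0, 1, 0]       # one channel error in the second group
print(rep3_decode(corrupted))        # [1, 0] -- the error is corrected
```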

Coding Theory-Introduction

Generalize the above: send each bit n times and choose the majority bit. In this way we can make the probability of a decision error arbitrarily small, but this is inefficient in terms of transmission rate.

As n increases, the achievable BER falls, at the expense of increased codeword length (reduced code rate).

Repetition coding is inefficient…
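
A sketch of the trade-off. With n odd, the majority decoder errs exactly when more than half of the n copies are flipped, so Pe = sum over k > n/2 of C(n,k) p^k (1-p)^(n-k); the crossover probability p = 0.01 below is illustrative:

```python
from math import comb

def rep_error_prob(n: int, p: float) -> float:
    """Decision-error probability of an n-fold repetition code with
    majority voting: a block is decoded wrongly iff more than half
    of the n copies are flipped (n odd, crossover probability p)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

p = 0.01
for n in (1, 3, 5, 7):
    print(f"n={n}  rate={1/n:.3f}  Pe={rep_error_prob(n, p):.2e}")
# Pe shrinks rapidly with n, but the code rate 1/n shrinks too.
```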

Coding Theory Introduction (cont’d)

Encode the source information by adding additional information (redundancy) that can be used to detect, and perhaps correct, errors in transmission. The more redundancy we add, the more reliably we can detect and correct errors, but the less efficient we become at transmitting the source data.
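
The simplest instance of added redundancy is a single parity bit (the canonical example, not one named on the slide): one extra bit lets the receiver detect any odd number of bit flips, though it cannot locate or correct them. A minimal sketch:

```python
def add_parity(bits):
    """Append one even-parity bit, so every valid word has even weight."""
    return bits + [sum(bits) % 2]

def parity_ok(word):
    """A received word with odd weight must contain an odd number of errors."""
    return sum(word) % 2 == 0

sent = add_parity([1, 0, 1, 1])      # -> [1, 0, 1, 1, 1]
received = [1, 0, 0, 1, 1]           # one bit flipped in transit
print(parity_ok(received))           # False: the error is detected
```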

Error control applications

Data communication networks (Ethernet, FDDI, WAN, Bluetooth)

Satellite and Deep space communications

Cellular mobile communications

Modems

Computer buses

Magnetic disks and tapes

CDs, DVDs. Digital sound needs ECC!

Error control categories

The error control problem can be classified in several ways:

Types of error control coding: detection vs. correction

Types of errors: how much the errors cluster (random, burst, etc.)

Types of codes: block vs. convolutional

Error Control Strategies

Error detection.

Goal: avoid accepting faulty data.

Lost data may be unfortunate; wrong data may be disastrous.

(Forward) error correction (FEC or ECC).

Use redundancy in encoded message to estimate from the received data what message was actually sent.

The best estimate is usually the “closest” message. The optimal estimate is the message that is most probable given what was received.
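
A sketch of the “closest message” rule as minimum-Hamming-distance decoding; on a binary symmetric channel with crossover probability below 1/2 this coincides with the most-probable-message rule. The two-word codebook is invented for illustration:

```python
def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

def nearest_codeword(received, codebook):
    """Return the codeword at minimum Hamming distance from `received`."""
    return min(codebook, key=lambda c: hamming_distance(received, c))

# A toy codebook: two codewords at distance 3, so any single
# error still leaves the received word closest to what was sent.
codebook = [(0, 0, 0), (1, 1, 1)]
print(nearest_codeword((0, 1, 0), codebook))   # (0, 0, 0)
```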

Types of Channel Codes

Block Codes (codes with a strong algebraic flavor)

~1950: Hamming code (single error correction)

All codes in the 1950’s were too weak compared to the codes promised by Shannon.

Major breakthrough… 1960:

BCH codes…

Reed-Solomon codes…

Capable of correcting multiple errors
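
A sketch of the Hamming single-error-correcting idea, using the classic (7,4) code: three parity bits cover overlapping subsets of the seven positions, and the syndrome spells out the error position in binary. The bit layout (parity bits at positions 1, 2 and 4) follows the usual textbook convention:

```python
def hamming74_encode(d):
    """Encode 4 data bits as a 7-bit Hamming codeword.
    Positions (1-indexed): p1 p2 d1 p3 d2 d3 d4."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # covers positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # covers positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # covers positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(r):
    """Correct up to one flipped bit, then return the 4 data bits."""
    s1 = r[0] ^ r[2] ^ r[4] ^ r[6]
    s2 = r[1] ^ r[2] ^ r[5] ^ r[6]
    s4 = r[3] ^ r[4] ^ r[5] ^ r[6]
    pos = s1 + 2 * s2 + 4 * s4   # syndrome = 1-indexed error position, 0 if none
    if pos:
        r = list(r)
        r[pos - 1] ^= 1
    return [r[2], r[4], r[5], r[6]]

word = hamming74_encode([1, 0, 1, 1])
word[4] ^= 1                          # flip one bit in the channel
print(hamming74_decode(word))         # [1, 0, 1, 1] -- corrected
```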

Convolutional Codes

Codes with a probabilistic flavor.

Introduced in the late 1950’s, but gained popularity after the introduction of the Viterbi algorithm in 1967.

Developed from the idea of sequential decoding.

Non-block codes.

Codewords are generated by a convolution operation on the information sequence.
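
A sketch of that convolution operation: a rate-1/2, constraint-length-3 encoder with the standard textbook generator pair (7, 5) in octal (this particular pair is my choice of example, not one named on the slides):

```python
def conv_encode(bits):
    """Rate-1/2 convolutional encoder, generators g1 = 111, g2 = 101 (octal 7,5).
    The shift register holds the two previous input bits."""
    s1 = s2 = 0                      # previous bit, and the bit before that
    out = []
    for u in bits:
        out.append(u ^ s1 ^ s2)      # g1 = 1 + D + D^2
        out.append(u ^ s2)           # g2 = 1 + D^2
        s1, s2 = u, s1               # shift the register
    return out

print(conv_encode([1, 0, 1, 1]))     # [1, 1, 1, 0, 0, 0, 0, 1]
```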

Coding Schemes: Trend…

Since the 1970’s, the two avenues of research have been working together.

This has resulted in development towards the codes promised by Shannon.

Today, “Turbo Codes” are capable of achieving an improvement close to the Shannon limit.

Coding Schemes

Applications demand a wide range of data rates, block sizes and error rates.

No single error-protection scheme works for all applications. Some require the use of multiple coding techniques.

A common combination uses an inner convolutional code and an outer Reed-Solomon code.
