Coding Theory

TRANSCRIPT

Page 1: Coding Theory

Coding Theory

Page 2: Coding Theory

Communication System

(Figure: block diagram of a communication system)
Transmitter: Voice / Image / Data → Source encoder → CRC encoder → Channel encoder → Interleaver → Modulator
Channel: impairments such as noise and fading
Receiver: Demodulator → Deinterleaver → Channel decoder → CRC decoder → Source decoder
The channel encoder/decoder pair (with interleaver and deinterleaver) provides the error control.

Page 3: Coding Theory

Error control coding

• Limits in communication systems
  – Bandwidth limit
  – Power limit
  – Channel impairments
    • Attenuation, distortion, interference, noise and fading
• Error control techniques are used in digital communication systems to achieve reliable transmission under these limits.

Page 4: Coding Theory


Power limit vs. Bandwidth limit

Page 5: Coding Theory

Error control coding

• Advantage of error control coding
  – In principle:
    • Every channel has a capacity C.
    • If you transmit information at a rate R < C, then error-free transmission is possible.
  – In practice:
    • Reduce the error rates
    • Reduce the transmitted power requirements
    • Increase the operational range of a communication system
• Classification of error control techniques
  – Forward error correction (FEC)
  – Error detection: cyclic redundancy check (CRC)
  – Automatic repeat request (ARQ)

Page 6: Coding Theory

History

• Shannon (1948)
  – R: transmission rate for data
  – C: channel capacity
  – If R < C, it is possible to transfer information at error rates that can be reduced to any desired level (a standard capacity formula is given below).
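The transcript states the theorem without a formula. For reference, a standard special case not given in the slides: the Shannon-Hartley capacity of a band-limited AWGN channel with bandwidth B and signal-to-noise ratio S/N is

```latex
% Shannon-Hartley capacity of a band-limited AWGN channel
C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits/s}
```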

Page 7: Coding Theory

History

• Hamming codes (1950)
  – Single error correcting
• Convolutional codes (Elias, 1956)
• BCH codes (1960), RS codes (1960)
  – Multiple error correcting
• Goppa codes (1970)
  – Generalization of BCH codes
• Algebraic geometric codes (1982)
  – Generalization of RS codes
  – Constructed over algebraic curves
• Turbo codes (1993)
• LDPC codes

Page 8: Coding Theory

Channel

• Memoryless channel
  – The probability of error is independent from one symbol to the next.
• Symmetric channel
  – P( i | j ) = P( j | i ) for all symbol values i and j
  Ex) binary symmetric channel (BSC); a simulation sketch follows this list
• Additive white Gaussian noise (AWGN) channel
• Burst error channel
• Compound (or diffuse) channel
  – The errors consist of a mixture of bursts and random errors.
• Many codes work best if errors are random.
  – Interleaver and deinterleaver are added.
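As an illustration of the BSC (my own minimal sketch, not from the slides; the function name bsc is assumed): each transmitted bit is flipped independently with crossover probability p.

```python
import random

def bsc(bits, p, seed=None):
    """Pass bits through a binary symmetric channel: each bit is
    flipped independently with crossover probability p."""
    rng = random.Random(seed)
    return [b ^ (rng.random() < p) for b in bits]

tx = [1, 0, 0, 1, 0, 1]        # a transmitted codeword
rx = bsc(tx, p=0.1, seed=1)    # received vector, possibly with bit flips
print(tx, rx)
```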

Page 9: Coding Theory

Channel

• Random error channels
  – Deep-space channels
  – Satellite channels
  → Use random error correcting codes
• Burst error channels: channels with memory
  – Radio channels
    • Signal fading due to multipath transmission
  – Wire and cable transmission
    • Impulse switching noise, crosstalk
  – Magnetic recording
    • Tape dropouts due to surface defects and dust particles
  → Use burst error correcting codes

Page 10: Coding Theory

Encoding

• Block codes
• Encoding of an [n, k] block code
  – (Figure: the message stream is split into k-bit blocks; each k-bit message or information block is mapped to an n-bit codeword.)
  – Message m = (m1, m2, … , mk)
  – Codeword c = (m1, m2, … , mk, p1, p2, … , pn-k)
  – Add n – k redundant parity check symbols (p1, p2, … , pn-k); an encoding sketch follows below.
  – Redundancy: n – k
  – Code rate: k / n
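A minimal sketch of systematic encoding in Python (my own illustration; the parity rows are read off the [6, 3] codeword table on slide 13):

```python
# Parity rules of the systematic [6, 3] code on slide 13:
# message (m1, m2, m3) -> codeword (m1, m2, m3, p1, p2, p3).
P = [
    [1, 0, 1],  # parity contribution of m1 (from codeword 100101)
    [0, 1, 1],  # parity contribution of m2 (from codeword 010011)
    [1, 1, 1],  # parity contribution of m3 (from codeword 001111)
]

def encode(m):
    """Append the n - k = 3 parity check bits to the k = 3 message bits."""
    parity = [sum(m[i] * P[i][j] for i in range(3)) % 2 for j in range(3)]
    return list(m) + parity

print(encode([1, 0, 0]))  # -> [1, 0, 0, 1, 0, 1], matching the slide
```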

Page 11: Coding Theory

Decoding

• Decoding an [n, k] block code
  – Decide what the transmitted information was.
  – Minimum distance decoding is optimum in a memoryless channel.
  – Received data r = (r1, r2, … , rn)
  – Decoded message m̂ = (m̂1, m̂2, … , m̂k): correct errors and remove the n – k redundant symbols.
  – Error vector e = (e1, e2, … , en) = (r1, r2, … , rn) – (c1, c2, … , cn)

Page 12: Coding Theory

Decoding

• Decoding plane
  (Figure: received vector r among the codewords c1, c2, c3, c4, c5, c6; r is decoded to the nearest codeword.)

Page 13: Coding Theory

Decoding

Ex) Encoding and decoding procedure of the [6, 3] code

1. Generate the information (100) in the source.
2. Transmit the codeword (100101) corresponding to (100).
3. The vector (101101) is received.
4. Choose the nearest codeword (100101) to (101101).
5. Extract the information (100) from the codeword (100101).

Information   Codeword   Distance from (101101)
000           000000     4
100           100101     1
010           010011     5
110           110110     4
001           001111     2
101           101010     3
011           011100     3
111           111001     2

(A brute-force decoder for this codebook is sketched below.)
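A brute-force minimum distance decoder for this codebook, as a minimal Python sketch (my own; the codebook is taken verbatim from the table above):

```python
# Codebook of the [6, 3] code: message -> codeword (from the table above).
CODEBOOK = {
    (0, 0, 0): (0, 0, 0, 0, 0, 0),
    (1, 0, 0): (1, 0, 0, 1, 0, 1),
    (0, 1, 0): (0, 1, 0, 0, 1, 1),
    (1, 1, 0): (1, 1, 0, 1, 1, 0),
    (0, 0, 1): (0, 0, 1, 1, 1, 1),
    (1, 0, 1): (1, 0, 1, 0, 1, 0),
    (0, 1, 1): (0, 1, 1, 1, 0, 0),
    (1, 1, 1): (1, 1, 1, 0, 0, 1),
}

def hamming_distance(u, v):
    """Number of positions in which u and v differ."""
    return sum(a != b for a, b in zip(u, v))

def decode(r):
    """Minimum distance decoding: return the message whose codeword
    is nearest (in Hamming distance) to the received vector r."""
    return min(CODEBOOK, key=lambda m: hamming_distance(CODEBOOK[m], r))

print(decode((1, 0, 1, 1, 0, 1)))  # -> (1, 0, 0), as in steps 4-5 above
```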

Page 14: Coding Theory

Parameters of block codes

• Hamming distance dH(u, v)
  – The number of positions at which the symbols of two vectors differ
  Ex) u = (1 0 1 0 0 0), v = (1 1 1 0 1 0): dH(u, v) = 2
• Hamming weight wH(u)
  – The number of nonzero elements in a vector
  Ex) wH(u) = 2, wH(v) = 4
• Relation between Hamming distance and Hamming weight (checked in the sketch below)
  – Binary code: dH(u, v) = wH(u + v), where '+' means bitwise exclusive OR
  – Nonbinary code: dH(u, v) = wH(u – v)
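A quick check of the binary relation dH(u, v) = wH(u + v) with the slide's vectors (my own sketch):

```python
u = (1, 0, 1, 0, 0, 0)
v = (1, 1, 1, 0, 1, 0)

d = sum(a != b for a, b in zip(u, v))   # Hamming distance dH(u, v)
w = sum(a ^ b for a, b in zip(u, v))    # Hamming weight wH(u + v), '+' = XOR
assert d == w == 2
```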

Page 15: Coding Theory

Parameters of block codes

• Minimum distance d
  – d = min dH(ci, cj) over all ci ≠ cj in C
  – Any two codewords differ in at least d places.
• An [n, k] code with minimum distance d is denoted [n, k, d].
• Error detection and correction capability
  – Let s = number of errors to be detected and t = number of errors to be corrected (s ≥ t).
  – Then we need d ≥ s + t + 1.
• Error correction capability
  – Any block code correcting t or fewer errors satisfies d ≥ 2t + 1.
  – Thus, t = ⌊(d – 1) / 2⌋.

Page 16: Coding Theory

Parameters of block codes

Ex) d = 3, 4 → t = 1 : single error correcting (SEC) codes
    d = 5, 6 → t = 2 : double error correcting (DEC) codes
    d = 7, 8 → t = 3 : triple error correcting (TEC) codes

• Coding sphere
  (Figure: spheres of radius t around codewords ci and cj at distance d; s marks the detection radius.)

Page 17: Coding Theory

Code performance and coding gain

• Criteria for performance in the coded system
  – BER: bit error rate in the information after decoding, Pb
  – SNR: signal to noise ratio, Eb / N0
    • Eb = signal energy per bit
    • N0 = one-sided noise power spectral density in the channel
  – Coding gain (for a given BER):
    G = (Eb / N0)without FEC – (Eb / N0)with FEC [dB]
• At a given BER Pb, we can save G [dB] of transmission power over the uncoded system (a worked example follows).
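As a worked example (mine, not the slide's): a coding gain of G = 3 dB corresponds to a power ratio of

```latex
10^{G/10} = 10^{3/10} \approx 2
```

so at the same BER the coded system needs roughly half the transmit power of the uncoded one.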

Page 18: Coding Theory

Minimum distance decoding

• Maximum-likelihood decoding (MLD)
  – m̂: estimated message after decoding
  – ĉ: estimated codeword in the decoder
• Assume that c was transmitted.
  – A decoding error occurs if ĉ ≠ c (and then m̂ ≠ m).
• Conditional error probability of the decoder, given r:
  P(E | r) = P(ĉ ≠ c | r)
• Error probability of the decoder:
  P(E) = Σr P(E | r) P(r), where P(r) is independent of the decoding rule

Page 19: Coding Theory

Minimum distance decoding

• Optimum decoding rule: minimize the error probability P(E)
  – This is obtained by minimizing P(E | r) for each r, which is equivalent to maximizing P(ĉ = c | r).
• Optimum decoding rules
  – argmaxc P(c | r): maximum a posteriori probability (MAP)
  – argmaxc P(r | c): maximum likelihood (ML)
• Bayes' rule
  P(c | r) = P(r | c) P(c) / P(r)
  – If the codewords c are equiprobable, MAP = ML (the link to minimum distance decoding is sketched below).
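To connect ML decoding to the minimum distance rule of slide 11 (a standard derivation, not spelled out in the transcript): on a memoryless BSC with crossover probability p < 1/2 and d = dH(r, c),

```latex
% Likelihood of receiving r given codeword c on a BSC, d = d_H(r, c)
P(\mathbf{r} \mid \mathbf{c}) = p^{d}\,(1-p)^{n-d}
                              = (1-p)^{n} \left(\frac{p}{1-p}\right)^{d}
```

Since p/(1 − p) < 1, maximizing P(r | c) over the codewords is the same as minimizing d, so ML decoding reduces to minimum distance decoding.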

Page 20: Coding Theory

Problems

• Basic problems in coding
  – Find good codes
  – Find their decoding algorithms
  – Implement the decoding algorithms
• Cost of forward error correction schemes
  – If we use an [n, k] code, the number of transmitted bits increases from k to n.
    • The channel bandwidth increases by a factor of n / k, or the message transmission rate decreases by a factor of k / n.

Page 21: Coding Theory

Classification

• Classification of FEC
  – Block codes
    • Hamming, BCH, RS, Golay, Goppa, algebraic geometric codes (AGC)
  – Tree codes
    • Convolutional codes
  – Linear codes
    • Hamming, BCH, RS, Golay, Goppa, AGC, etc.
  – Nonlinear codes
    • Nordstrom-Robinson, Kerdock, Preparata, etc.
  – Systematic codes vs. nonsystematic codes