Information Theory & Coding - Jan 2014



Page 1: Information Theory & Coding Jan 2014

USN                                                                         06EC65

Sixth Semester B.E. Degree Examination, Dec. 2013 / Jan. 2014
Information Theory and Coding


Time: 3 hrs.                                                        Max. Marks: 100

Note: Answer FIVE full questions, selecting at least TWO questions from each part.

PART – A

1  a. Define the following:
      i) Self information   ii) Entropy of the source   iii) Extremal property
      iv) Additive property                                                        (04 Marks)
   b. A black and white TV picture consists of 525 lines of picture information. Assume that
      each line consists of 525 pixels, with each pixel having 256 equiprobable brightness
      levels. Pictures are repeated at the rate of r = 30 frames per second. Calculate the
      rate of information conveyed by a TV set to the viewer.                      (06 Marks)
   c. The state diagram of a Markov source is shown in Fig. Q1(c).
      i) Find the state probabilities.
      ii) Find the entropy of each state, Hi.
      iii) Find the entropy of the first-order source, H(S).
      iv) Find G1 and G2, and verify that G1 ≥ G2 ≥ H(S).
      [Fig. Q1(c): state diagram of the Markov source; not reproduced in this transcription.]
                                                                                   (10 Marks)
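A minimal Python sketch of the arithmetic behind 1(b): bits per pixel times pixels per frame times frames per second (variable names are illustrative, not from the paper):

```python
from math import log2

lines, pixels_per_line = 525, 525       # picture resolution given in the question
levels, frame_rate = 256, 30            # equiprobable brightness levels, frames per second

bits_per_pixel = log2(levels)           # 8 bits, since the 256 levels are equiprobable
pixels_per_frame = lines * pixels_per_line
rate = pixels_per_frame * bits_per_pixel * frame_rate
print(f"information rate = {rate / 1e6:.2f} Mbits/s")   # about 66.15 Mbits/s
```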

2  a. For the binary symmetric channel characterized by the channel matrix P(Y/X) and the
      given symbol input probability matrix P(X), find:
      i) All entropies H(X), H(Y), H(X/Y), H(Y/X), H(XY).
      ii) Data transmission rate.
      iii) Channel capacity, given rs = 1000 symbols/sec.
      iv) Channel efficiency and redundancy.
      [The numerical entries of P(Y/X) and P(X) given in the question paper are not legible
      in this transcription.]                                                      (10 Marks)
   b. A source emits an independent sequence of symbols from an alphabet consisting of five
      symbols A, B, C, D, E with probabilities 1/4, 1/8, 1/8, 3/16 and 5/16 respectively.
      Find the Shannon code using the Shannon encoding algorithm and compute the efficiency.
                                                                                   (06 Marks)
   c. Prove that H(XY) = H(Y) + H(X/Y).                                            (04 Marks)
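A rough Python sketch of the Shannon encoding algorithm asked for in 2(b): order the symbols by decreasing probability, take l_i = ceil(log2(1/p_i)), and read the first l_i bits of the binary expansion of the cumulative probability (function and variable names are my own):

```python
from fractions import Fraction as F
from math import ceil, log2

probs = {"A": F(1, 4), "B": F(1, 8), "C": F(1, 8), "D": F(3, 16), "E": F(5, 16)}

def shannon_code(p):
    """Shannon encoding: sort by decreasing probability; the code word for each symbol
    is the first ceil(log2(1/p_i)) bits of the binary expansion of the cumulative sum."""
    code, cum = {}, F(0)
    for sym, pi in sorted(p.items(), key=lambda kv: -kv[1]):
        length = ceil(log2(1 / pi))
        bits, frac = "", cum
        for _ in range(length):              # binary expansion of the cumulative probability
            frac *= 2
            bits += "1" if frac >= 1 else "0"
            frac -= int(frac)
        code[sym] = bits
        cum += pi
    return code

code = shannon_code(probs)
L = sum(p * len(code[s]) for s, p in probs.items())          # average code-word length
H = -sum(float(p) * log2(float(p)) for p in probs.values())  # source entropy (bits/symbol)
print(code, " L =", float(L), " H =", round(H, 4), " efficiency =", round(H / float(L), 4))
```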


Page 2: Information Theory & Coding Jan 2014

3  a. Design a ternary Huffman code for a source S = {s1, s2, s3, s4, s5, s6} with
      P = {1/3, 1/4, 1/8, 1/8, 1/12, 1/12} and code alphabet X = {0, 1, 2}, by moving the
      combined symbols i) as high as possible and ii) as low as possible. For each of the
      codes, find: i) Minimum average length  ii) Variance  iii) Code efficiency.  (12 Marks)
   b. For the binary erasure channel shown in Fig. Q3(b), find the following:
      i) Average mutual information.
      ii) Channel capacity.
      iii) Values of P(x1) and P(x2) for maximum mutual information.
      [Fig. Q3(b): binary erasure channel; transition probabilities not legible in this
      transcription.]                                                              (08 Marks)
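For 3(a), a rough Python sketch of the r-ary Huffman construction: pad with a dummy zero-probability symbol so that every merge combines exactly r nodes, then repeatedly merge the r least probable nodes. It returns one valid set of code-word lengths; the "as high as possible" / "as low as possible" placement that decides the variance still has to be tracked by hand:

```python
import heapq
from fractions import Fraction as F
from math import log2

def rary_huffman_lengths(probs, r=3):
    """Code-word lengths of an r-ary Huffman code (tie placement is arbitrary)."""
    n = len(probs)
    dummies = (-(n - 1)) % (r - 1)              # pad so the final merge is a full r-way merge
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heap += [(F(0), n + t, []) for t in range(dummies)]
    depth = [0] * n
    heapq.heapify(heap)
    while len(heap) > 1:
        total, members = F(0), []
        for _ in range(r):                      # merge the r least probable nodes
            p, _, m = heapq.heappop(heap)
            total += p
            members += m
        for s in members:
            depth[s] += 1                       # every merge adds one ternary digit
        heapq.heappush(heap, (total, min(members), members))
    return depth

probs = [F(1, 3), F(1, 4), F(1, 8), F(1, 8), F(1, 12), F(1, 12)]
lengths = rary_huffman_lengths(probs)
L_avg = sum(p * l for p, l in zip(probs, lengths))
var = sum(p * (l - L_avg) ** 2 for p, l in zip(probs, lengths))
H3 = -sum(float(p) * log2(float(p)) for p in probs) / log2(3)   # entropy in ternary units
print("lengths:", lengths, " L =", float(L_avg), " variance =", float(var),
      " efficiency =", round(H3 / float(L_avg), 4))
```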

PART – B

4  a. Derive an equation for the capacity of a channel of bandwidth B Hz affected by additive
      white Gaussian noise of power spectral density η/2, and hence prove Shannon's limit
      C = 1.44 S/η bps.                                                            (08 Marks)
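A compressed form of the limit asked for in 4(a), assuming a two-sided noise power spectral density η/2 so that the noise power in bandwidth B is N = ηB:

```latex
C = B\log_2\!\Bigl(1 + \tfrac{S}{\eta B}\Bigr)
  = \frac{S}{\eta}\cdot\frac{\eta B}{S}\log_2\!\Bigl(1 + \tfrac{S}{\eta B}\Bigr)
  \;\xrightarrow{\;B\to\infty\;}\; \frac{S}{\eta}\log_2 e \approx 1.44\,\frac{S}{\eta}\ \text{bits/s},
\qquad\text{since } \lim_{x\to 0}\frac{\log_2(1+x)}{x} = \log_2 e .
```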

   b. An analog signal has a 4 kHz bandwidth. The signal is sampled at 2.5 times the Nyquist
      rate and each sample is quantized into 256 equally likely levels. Assume that the
      successive levels are statistically independent.
      i) Find the information rate of the source.
      ii) Can the output of this source be transmitted without errors over a Gaussian channel
          of bandwidth 50 kHz and (S/N) ratio of 20 dB?
      iii) If the output of this source is to be transmitted without errors over an analog
          channel having (S/N) of 10 dB, compute the bandwidth requirement of the channel.
                                                                                   (08 Marks)
   c. Define differential entropy and briefly explain with relevant equations.     (04 Marks)
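The arithmetic for 4(b) as a minimal Python sketch (256 equally likely levels give 8 bits per sample; names are illustrative):

```python
from math import log2

B_sig = 4e3                              # signal bandwidth in Hz
fs = 2.5 * 2 * B_sig                     # 2.5 times the Nyquist rate = 20 ksamples/s
R = fs * log2(256)                       # i) information rate = 160 kbits/s

def capacity(B, snr_db):
    """Shannon capacity C = B log2(1 + S/N) of a Gaussian channel."""
    return B * log2(1 + 10 ** (snr_db / 10))

C = capacity(50e3, 20)                   # ii) 50 kHz channel at 20 dB
B_req = R / log2(1 + 10 ** (10 / 10))    # iii) bandwidth needed at S/N = 10 dB

print(f"R = {R / 1e3:.0f} kbps, C = {C / 1e3:.1f} kbps, "
      f"error-free transmission possible: {R <= C}, B required at 10 dB = {B_req / 1e3:.2f} kHz")
```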

5  a. Explain the matrix representation of linear block codes.                     (06 Marks)
   b. For a systematic linear block code, the parity matrix P is given by
         P = | 1 1 1 |
             | 1 1 0 |
             | 1 0 1 |
             | 0 1 1 |
      i) Find all possible code vectors.
      ii) Draw the corresponding encoding circuit.
      iii) A single error has occurred in each of the following received vectors. Detect and
           correct those errors:  RA = [0 1 1 1 1 1 0],  RB = [1 0 1 1 1 0 0]
      iv) Draw the syndrome calculation circuit.                                   (14 Marks)
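A minimal GF(2) sketch for 5(b), assuming the systematic convention G = [I_k | P] (message bits first, parity bits last) so that H = [P^T | I_3]; with the other common convention the bit order is simply reversed:

```python
import itertools

P = [[1, 1, 1],
     [1, 1, 0],
     [1, 0, 1],
     [0, 1, 1]]                 # 4 x 3 parity matrix from the question
k, m = len(P), len(P[0])        # k = 4 message bits, m = 3 parity bits, n = 7

def encode(msg):
    """Systematic encoding: code vector = message bits followed by parity bits."""
    parity = [sum(msg[i] * P[i][j] for i in range(k)) % 2 for j in range(m)]
    return list(msg) + parity

codewords = [encode(msg) for msg in itertools.product((0, 1), repeat=k)]   # i) all 16 vectors

# Parity-check matrix H = [P^T | I_m]; the syndrome of r is s = r H^T (mod 2).
H = [[P[i][j] for i in range(k)] + [1 if j == t else 0 for t in range(m)]
     for j in range(m)]

def syndrome(r):
    return tuple(sum(r[i] * H[j][i] for i in range(k + m)) % 2 for j in range(m))

def correct_single_error(r):
    s = syndrome(r)
    if s == (0,) * m:
        return list(r)                              # zero syndrome: accept as error free
    cols = [tuple(H[j][i] for j in range(m)) for i in range(k + m)]
    fixed = list(r)
    fixed[cols.index(s)] ^= 1                       # flip the bit whose H column matches s
    return fixed

for name, r in (("RA", [0, 1, 1, 1, 1, 1, 0]), ("RB", [1, 0, 1, 1, 1, 0, 0])):
    print(name, "syndrome =", syndrome(r), " corrected ->", correct_single_error(r))
```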


Page 3: Information Theory & Coding Jan 2014

6  a. What is a binary cyclic code? Discuss the encoding and decoding circuits used to
      generate binary cyclic codes.                                                (06 Marks)
   b. A (15, 5) linear cyclic code has the generator polynomial
         g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^10
      i) Draw the encoder and syndrome calculator circuit for this code.
      ii) Find the code polynomial for the message polynomial D(x) = 1 + x^2 + x^4 in
          systematic form.
      iii) Is V(x) = 1 + x^4 + x^6 + x^8 + x^14 a code polynomial? If not, find the syndrome
           of V(x).                                                                (14 Marks)
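For 6(b), a minimal Python sketch that represents GF(2) polynomials as coefficient lists (index = power of x) and uses polynomial division for both the systematic codeword and the syndrome:

```python
n, k = 15, 5
g = [1, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1]            # g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^10

def gf2_rem(a, b):
    """Remainder of a(x) divided by b(x) over GF(2); lists hold ascending powers."""
    a = a[:]
    while len(a) >= len(b):
        if a[-1]:
            shift = len(a) - len(b)
            for i, bi in enumerate(b):
                a[shift + i] ^= bi               # subtract (XOR) the shifted divisor
        a.pop()                                  # leading coefficient is now zero
    return a + [0] * (len(b) - 1 - len(a))

def systematic_codeword(d):
    """V(x) = x^(n-k) D(x) + remainder of x^(n-k) D(x) divided by g(x)."""
    shifted = [0] * (n - k) + d                  # x^(n-k) * D(x)
    rem = gf2_rem(shifted, g) + [0] * (n - (len(g) - 1))
    return [a ^ b for a, b in zip(rem, shifted)]

D = [1, 0, 1, 0, 1]                              # D(x) = 1 + x^2 + x^4
print("ii) codeword coefficients (x^0 .. x^14):", systematic_codeword(D))

V = [1 if p in (0, 4, 6, 8, 14) else 0 for p in range(n)]   # V(x) = 1 + x^4 + x^6 + x^8 + x^14
s = gf2_rem(V, g)
print("iii) syndrome of V(x):", s,
      "(an all-zero syndrome would mean V(x) is a code polynomial)")
```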

7     Explain the following error control codes:
      a. Golay codes.
      b. Shortened cyclic codes.
      c. RS codes.
      d. Burst and random error correcting codes.                                  (20 Marks)

8     Consider the (3, 1, 2) convolutional code with g(1) = (1 1 0), g(2) = (1 0 1) and
      g(3) = (1 1 1).
      i) Draw the encoder block diagram.
      ii) Find the generator matrix.
      iii) Find the codeword corresponding to the information sequence 1 1 1 0 1, using the
           time domain and transform domain approaches.                            (20 Marks)
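For part iii) of the last question, a time-domain Python sketch: convolve the information sequence with each generator over GF(2) and interleave the three output streams (the transform-domain route, multiplying U(D) by each generator polynomial, gives the same codeword):

```python
def gf2_convolve(u, g):
    """Convolution of two binary sequences with addition done modulo 2."""
    out = [0] * (len(u) + len(g) - 1)
    for i, ui in enumerate(u):
        if ui:
            for j, gj in enumerate(g):
                out[i + j] ^= gj
    return out

u = [1, 1, 1, 0, 1]                        # information sequence from the question
gens = ([1, 1, 0], [1, 0, 1], [1, 1, 1])   # g(1), g(2), g(3)

streams = [gf2_convolve(u, g) for g in gens]
codeword = [bit for trio in zip(*streams) for bit in trio]   # interleave v1 v2 v3 per time step
print("codeword:", "".join(map(str, codeword)))
```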
