Uploaded by prem-prakash on 26-Dec-2015.


Page 1: Tutorial Sheet Digital Comm

TUTORIAL SHEET – 1

DIGITAL COMMUNICATION (EC-1505)

1. A source emits one of four possible symbols during each signaling interval. The symbols occur with probabilities P0 = 0.4, P1 = 0.3, P2 = 0.2, and P3 = 0.1. Find the amount of information gained by observing the source emitting each of these symbols. 10.2
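Each symbol conveys I(s_k) = log2(1/p_k) bits of information. A quick numerical check (the symbol labels s0..s3 are just names for P0..P3):

```python
import math

# Self-information I(s_k) = -log2(p_k), in bits, for each symbol probability
probs = [0.4, 0.3, 0.2, 0.1]  # P0..P3 from the problem statement
info = [-math.log2(p) for p in probs]
for k, (p, i) in enumerate(zip(probs, info)):
    print(f"I(s{k}) = -log2({p}) = {i:.4f} bits")
# 1.3219, 1.7370, 2.3219, 3.3219 bits respectively
```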

2. A source emits one of four symbols S0, S1, S2, and S3 with probabilities 1/3, 1/6, 1/4, and 1/4, respectively. The successive symbols emitted by the source are statistically independent. Calculate the entropy of the source. 2.1.3
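For a memoryless source the entropy is H(S) = sum over k of p_k log2(1/p_k); a one-line check:

```python
import math

# Source entropy H(S) = sum_k p_k * log2(1/p_k), in bits per symbol
probs = [1/3, 1/6, 1/4, 1/4]
H = sum(p * math.log2(1/p) for p in probs)
print(f"H(S) = {H:.4f} bits/symbol")  # 1.9591 bits/symbol
```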

3. A high-resolution black-and-white TV picture consists of about 2 × 10^6 picture elements and 16 different brightness levels. Pictures are repeated at the rate of 32 per second. All picture elements are assumed to be independent, and all levels have equal likelihood of occurrence. Calculate the average rate of information conveyed by this TV picture source. 10.5
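With independent, equiprobable levels, each picture element carries log2(16) = 4 bits, so the rate is elements per frame times frames per second times bits per element:

```python
import math

# Average information rate of the TV source:
# R = (elements/frame) * (frames/s) * (bits/element)
elements = 2e6
frames_per_s = 32
bits_per_element = math.log2(16)  # 16 equiprobable levels -> 4 bits
rate = elements * frames_per_s * bits_per_element
print(f"R = {rate:.3e} bits/s")  # 2.560e+08 bits/s = 256 Mb/s
```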

4. Consider a telegraph source having two symbols, 'dot' and 'dash'. The 'dot' duration is 0.2 s. The 'dash' duration is 3 times the 'dot' duration. A 'dot' is twice as likely to occur as a 'dash', and the time between symbols is 0.2 s. Calculate the information rate of the telegraph source. 10.6
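A worked sketch for problem 4: the symbol probabilities follow from "dots are twice as likely" (2/3 and 1/3), and the average symbol duration must include the 0.2 s gap between symbols:

```python
import math

# Dots are twice as likely as dashes: p_dot = 2/3, p_dash = 1/3
p_dot, p_dash = 2/3, 1/3
t_dot, t_dash, t_space = 0.2, 0.6, 0.2  # seconds; dash = 3 * dot

H = -(p_dot * math.log2(p_dot) + p_dash * math.log2(p_dash))  # bits/symbol
T = p_dot * t_dot + p_dash * t_dash + t_space  # mean time per symbol, incl. gap
R = H / T  # information rate in bits/s
print(f"H = {H:.4f} bits/symbol, T = {T:.4f} s, R = {R:.3f} b/s")
# H = 0.9183 bits/symbol, T = 0.5333 s, R = 1.722 b/s
```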

5. Consider the binary channel shown in the figure. 10.7

a) Find the channel matrix of the channel.

b) Find P(Y1) and P(Y2) when P(X1) = P(X2) = 0.5.

c) Find the joint probabilities P(X1, Y2) and P(X2, Y1) when P(X1) = P(X2) = 0.5.

6. A discrete source transmits messages X1, X2, and X3 with probabilities 0.3, 0.4, and 0.3. The source is connected to the channel given in the figure. Calculate all the entropies, i.e. H(X), H(Y), H(X, Y), H(X/Y), H(Y/X).
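The entropy bookkeeping in problem 6 follows a fixed recipe once P(X) and the channel matrix are known. Since the channel figure is not reproduced here, the transition matrix below is a hypothetical placeholder used only to illustrate the method:

```python
import math

def H(probs):
    """Entropy in bits of a probability vector (zero terms contribute 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# P(X) from problem 6; the matrix P(Y|X) below is a HYPOTHETICAL stand-in
# for the channel figure, which is not reproduced here.
px = [0.3, 0.4, 0.3]
p_y_given_x = [[0.8, 0.1, 0.1],
               [0.1, 0.8, 0.1],
               [0.1, 0.1, 0.8]]

# Joint distribution P(x, y) = P(x) * P(y|x), and output marginal P(y)
pxy = [[px[i] * p_y_given_x[i][j] for j in range(3)] for i in range(3)]
py = [sum(pxy[i][j] for i in range(3)) for j in range(3)]

HX = H(px)
HY = H(py)
HXY = H([pxy[i][j] for i in range(3) for j in range(3)])
H_X_given_Y = HXY - HY   # chain rule: H(X/Y) = H(X, Y) - H(Y)
H_Y_given_X = HXY - HX   # chain rule: H(Y/X) = H(X, Y) - H(X)
print(HX, HY, HXY, H_X_given_Y, H_Y_given_X)
```

Swapping in the actual matrix from the figure gives the required answers; the chain-rule relations at the end hold for any channel.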

7. Consider a discrete memoryless source whose alphabet consists of K equiprobable symbols. 2.2.1

a) Explain why the use of a fixed-length code for the representation of such a source is about as efficient as any code can be.

b) What conditions have to be satisfied by K and the code-word length for the coding efficiency to be 100%?

8. Consider the four codes listed below: 2.2.3

Symbol  Code I  Code II  Code III  Code IV
S0      0       0        0         0
S1      10      01       01        01
S2      110     001      011       10
S3      1110    0010     110       110
S4      1111    0011     111       111

a) Two of these four codes are prefix codes. Identify them, and construct their individual decision trees.

b) Apply the Kraft-McMillan inequality to codes I, II, III, and IV. Discuss your results in light of those obtained in part (a).

9. A discrete memoryless source (DMS) has five equally likely symbols. 10.33

a) Construct a Shannon-Fano code for X, and calculate the efficiency of the code.


b) Construct a Huffman code and compare the results in terms of code efficiency.
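The Kraft-McMillan sums asked for in problem 8(b) depend only on the code-word lengths; a minimal check, taking the code words exactly as tabulated in problem 8 (a sum at most 1 is necessary for unique decodability):

```python
# Kraft-McMillan sum: sum_k 2^(-l_k) over the code-word lengths l_k
codes = {
    "I":   ["0", "10", "110", "1110", "1111"],
    "II":  ["0", "01", "001", "0010", "0011"],
    "III": ["0", "01", "011", "110", "111"],
    "IV":  ["0", "01", "10", "110", "111"],
}
sums = {name: sum(2 ** -len(w) for w in words) for name, words in codes.items()}
for name, s in sums.items():
    print(f"Code {name}: Kraft sum = {s}")
```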

10. A discrete memoryless source X has five symbols X1, X2, X3, X4, and X5, with P(X1) = 0.4, P(X2) = 0.19, P(X3) = 0.16, P(X4) = 0.15, and P(X5) = 0.1. 10.34

a) Construct a Shannon-Fano code for X, and calculate the efficiency and redundancy of the code.

b) Repeat for the Huffman code and compare the results.
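The Huffman average code-word length in problem 10 can be computed without building explicit code words: each merge of the two least probable nodes lengthens every code word beneath it by one bit, so the average length is simply the sum of the merged probabilities. A sketch:

```python
import heapq
import math

# Huffman average code-word length via a min-heap of probabilities;
# probabilities from problem 10
probs = [0.4, 0.19, 0.16, 0.15, 0.1]

heap = list(probs)
heapq.heapify(heap)
avg_len = 0.0
while len(heap) > 1:
    p1 = heapq.heappop(heap)
    p2 = heapq.heappop(heap)
    avg_len += p1 + p2   # every symbol under this node gains one bit
    heapq.heappush(heap, p1 + p2)

H = -sum(p * math.log2(p) for p in probs)  # source entropy in bits/symbol
efficiency = H / avg_len
redundancy = 1 - efficiency
print(avg_len, H, efficiency, redundancy)
# avg_len = 2.2 bits/symbol, H ~ 2.1498 bits, efficiency ~ 0.977
```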

11. Find the mutual information and channel capacity of the channel shown in the figure, given P(X1) = 0.6 and P(X2) = 0.4.

12. Consider a discrete memoryless source with alphabet S0, S1, S2 and symbol probabilities 0.7, 0.15, and 0.15 for its output. 2.3.3

a) Apply the Huffman algorithm to this source. Hence, show that the average code word length of the Huffman code equals 1.3 bits/symbol.

b) Let the source be extended to order two. Apply the Huffman algorithm to the resulting extended source, and show that the average code-word length of the new code equals 1.1975 bits/symbol.

c) Compare the average code-word length calculated in part (b) with the entropy of the original source.
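Part (b) of problem 12 can be verified without drawing the 9-leaf tree: form the pair probabilities of the order-2 extension, then accumulate Huffman merge costs (each merge adds one bit to every code word under the merged node). This trick, together with the entropy comparison of part (c), in a few lines:

```python
import heapq
import math
from itertools import product

# Order-2 extension of the source in problem 12: pair probabilities are
# products of the single-symbol probabilities (source is memoryless)
probs = [0.7, 0.15, 0.15]
pairs = [p * q for p, q in product(probs, repeat=2)]  # 9 pair probabilities

heap = list(pairs)
heapq.heapify(heap)
avg_pair_len = 0.0
while len(heap) > 1:
    p1 = heapq.heappop(heap)
    p2 = heapq.heappop(heap)
    avg_pair_len += p1 + p2   # each merge adds one bit below this node
    heapq.heappush(heap, p1 + p2)

per_symbol = avg_pair_len / 2                      # bits per original symbol
H = -sum(p * math.log2(p) for p in probs)          # entropy of the source
print(per_symbol, H)  # 1.1975 bits/symbol vs H ~ 1.1813 bits/symbol
```

The per-symbol length 1.1975 sits between H and the single-symbol Huffman figure of 1.3, illustrating how extension closes in on the entropy bound.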

13. A binary symmetric channel is shown in the figure. Find the rate of information transmission over this channel when P = 0.9, 0.8, and 0.6; assume that the symbol (or bit) rate is 1000/sec.
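For a binary symmetric channel with equally likely inputs (assumed here, since the figure fixes the input distribution), the information transferred per symbol is 1 - Hb(P), where Hb is the binary entropy function. Hb is symmetric in P, so it does not matter whether P denotes the probability of error or of correct reception:

```python
import math

def Hb(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Rate = I(X;Y) per symbol * symbol rate, assuming equiprobable inputs
symbol_rate = 1000  # symbols per second
for p in (0.9, 0.8, 0.6):
    I = 1 - Hb(p)
    print(f"P = {p}: I = {I:.4f} bits/symbol, rate = {I * symbol_rate:.1f} b/s")
# roughly 531 b/s, 278 b/s, and 29 b/s
```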

14. Calculate the capacity of the discrete channel shown in the figure. Assume Rs = 1 symbol/sec.

P(X=0) = P

P(X=1) = Q

P(X=2) = Q

P(X=3) = P

15. A voice-grade channel of the telephone network has a bandwidth of 3.4 kHz. 2.9.1


a) Calculate the information capacity of the telephone channel for a signal-to-noise ratio of 30 dB.

b) Calculate the minimum signal-to-noise ratio required to support information transmission through the telephone channel at the rate of 9600 b/s.
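Both parts of problem 15 are direct uses of the Shannon-Hartley law C = B log2(1 + SNR); note that the 30 dB figure must first be converted to the power ratio 1000:

```python
import math

# (a) Shannon-Hartley capacity with SNR given in dB
B = 3400.0                 # channel bandwidth in Hz
snr = 10 ** (30 / 10)      # 30 dB -> power ratio 1000
C = B * math.log2(1 + snr)
print(f"(a) C = {C:.0f} b/s")  # about 33.9 kb/s

# (b) invert the law for the SNR that supports 9600 b/s over the same B
snr_min = 2 ** (9600 / B) - 1
print(f"(b) SNR_min = {snr_min:.2f} = {10 * math.log10(snr_min):.2f} dB")
# about 6.08, i.e. roughly 7.8 dB
```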

16. Consider an AWGN channel with 4 kHz bandwidth and noise power spectral density N0/2 = 10^-12 W/Hz. The signal power required at the receiver is 0.1 mW. Calculate the capacity of this channel. 10.23

17. Find the channel capacity of an ideal AWGN channel with infinite bandwidth. 10.22

18. An analog signal having 4 kHz bandwidth is sampled at 1.25 times the Nyquist rate, and each sample is quantized into one of 256 equally likely levels. Assume that the successive samples are statistically independent. 10.24

a) What is the information rate of this source?

b) Can the output of this source be transmitted without error over an AWGN channel with a bandwidth of 10 kHz and an S/N ratio of 20 dB?

c) Find the S/N ratio required for error-free transmission in part (a).

d) Find the bandwidth required for an AWGN channel for error-free transmission of the output of this source if the S/N ratio is 20 dB.
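All four parts of problem 18 can be cross-checked in a few lines (the sampling rate is 1.25 times the 8 kHz Nyquist rate, i.e. 10,000 samples/s, at 8 bits per sample):

```python
import math

# (a) information rate of the source
fs = 1.25 * 2 * 4000           # samples per second (1.25 x Nyquist rate)
R = fs * math.log2(256)        # 256 equally likely levels -> 8 bits/sample
print(f"(a) R = {R:.0f} b/s")  # 80000 b/s

# (b) capacity of a 10 kHz AWGN channel at 20 dB SNR (power ratio 100)
C = 10_000 * math.log2(1 + 100)
print(f"(b) C = {C:.0f} b/s, error-free transmission possible: {R <= C}")

# (c) SNR needed to push R through a 10 kHz channel: 2^(R/B) - 1
snr_c = 2 ** (R / 10_000) - 1
print(f"(c) SNR = {snr_c:.0f} = {10 * math.log10(snr_c):.1f} dB")  # 255, 24.1 dB

# (d) bandwidth needed at 20 dB SNR: B = R / log2(1 + 100)
B_d = R / math.log2(1 + 100)
print(f"(d) B = {B_d:.0f} Hz")  # about 12 kHz
```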

19. Calculate the bandwidth of the picture (video) signal in a television. The following are the available data:-

(i) Number of distinguishable brightness levels = 10; (ii) Number of elements per picture frame = 300,000; (iii) Picture frames transmitted per second = 30; (iv) S/N required = 30 dB. 2.9.3 (Similar to 10.5)
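Problem 19 combines the source-rate calculation of problem 3 with the Shannon-Hartley law; a sketch:

```python
import math

# Information rate = frames/s * elements/frame * log2(levels);
# required bandwidth from C = B log2(1 + SNR) at 30 dB (power ratio 1000)
elements = 300_000
levels = 10
frames = 30
R = frames * elements * math.log2(levels)   # bits per second
B = R / math.log2(1 + 1000)                 # Hz
print(f"R = {R:.3e} b/s, B = {B:.3e} Hz")   # roughly 3.0e7 b/s and 3.0e6 Hz
```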

20. State and derive the Shannon- Hartley Theorem.