Uploaded by allanki-sanyasi-rao on 15-Oct-2015

Tutorial Sheet 5
Subject: Digital Communication    Branch: III ECE, II Sem

1. The probabilities of 4 messages are 1/2, 1/4, 1/8 and 1/16. Calculate the entropy. If the source produces 100 symbols/sec, determine the maximum information rate.

2. Using the Shannon-Fano algorithm, find the code words for six messages occurring with probabilities 1/3, 1/3, 1/6, 1/12, 1/24 and 1/24.

3. Calculate the capacity of a standard 4 kHz telephone channel working in the range of 300 to 3400 Hz with an SNR of 32 dB.

4. Let messages 1, 2, 3 and 4 have probabilities 1/2, 1/4, 1/8 and 1/8.
   i) Calculate H.
   ii) Find R if r = 1 message per second.
   iii) What is the rate at which binary digits are transmitted if the signal is sent after encoding messages 1 to 4 as 00, 01, 10 and 11?

5. A transmitter has an alphabet consisting of five letters [A, B, C, D and E] and the receiver an alphabet of four letters [F, G, H and I]. The joint probabilities for the communication are given below:

        F     G     H     I
   A  0.25  0.00  0.00  0.00
   B  0.10  0.30  0.00  0.00
   C  0.00  0.05  0.10  0.00
   D  0.00  0.00  0.05  0.10
   E  0.00  0.00  0.05  0.00

Determine the marginal, conditional and joint entropies for this channel.

6. Construct the optimum source code for the following symbols:

   X     X1    X2    X3    X4    X5    X6
   P(x)  0.30  0.25  0.20  0.10  0.10  0.05

Find the coding efficiency.

7. An independent, discrete source transmits letters selected from an alphabet consisting of 3 letters A, B and C with respective probabilities 0.7, 0.2 and 0.1. If consecutive letters are statistically independent and two-symbol words are transmitted, find the entropy of the system of such words.

8. The noise characteristic of a channel is given by a transition matrix P(Y|X) [matrix entries not legible in the source], with P(x1) = 1/8, P(x2) = 1/8 and P(x3) = 6/8. Find the mutual information.

9. A source is transmitting messages Q1, Q2 and Q3 with probabilities P1, P2 and P3. Find the condition for the entropy of the source to be maximum.

10. Consider an AWGN channel with 4 kHz bandwidth; the noise PSD is [value not legible in the source] and the signal power required at the receiver is 0.1 mW. Find the capacity of the channel.

11. For the communication system whose channel characteristic is:

(Channel transition diagram: inputs X = 0, 1 and outputs Y = 0, 1, 2, with transition probabilities including 1/2 and 1/4; the diagram itself is not recoverable from the source.)

Given P(x=0) = P(x=1) = 1/2, verify that I[X; Y] = I[Y; X].

12. A signal x(t) has a uniform density function in the range 0 ≤ x ≤ 4; find H(X). If the same signal is amplified by a factor of 8, determine H(X).

13. Calculate the average information content in the English language, assuming that each of the 26 characters in the alphabet occurs with equal probability.

14. A discrete memoryless source X has four symbols X1, X2, X3 and X4 with probabilities 0.4, 0.3, 0.2 and 0.1 respectively. Find the amount of information contained in the messages X1 X2 X1 X3 and X4 X3 X3 X2.

15. Two binary channels are connected in cascade as shown. [Cascade diagram not reproduced in the source.]

   a) Find the overall matrix of the resultant channel.
   b) Find P(z1) and P(z2) when P(x1) = P(x2) = 0.5.

16. An analog signal having a 4 kHz bandwidth is sampled at 1.25 times the Nyquist rate and each sample is quantized into one of 256 equally likely levels. Assume that successive samples are statistically independent.
   i) What is the information rate of this source?
   ii) Find the channel bandwidth required for error-free transmission of the source output if the S/N ratio is 20 dB.

17. A discrete memoryless source has five equally likely symbols. Construct a Shannon-Fano code for this source.

18. Two dice are thrown and we are told that the sum of the faces is 7. Find the information content of this message.
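Most of the problems above reduce to three formulas: source entropy H = -Σ p·log2(p), the Shannon-Hartley capacity C = B·log2(1 + S/N), and self-information I = -log2(p). As a quick numerical cross-check (not part of the original sheet; the function names are my own), problems 3, 14 and 18 can be sketched as:

```python
import math

def entropy(probs):
    """Source entropy H = -sum(p * log2 p), in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon-Hartley capacity C = B * log2(1 + S/N), in bits/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Problem 14: entropy of a source with P = 0.4, 0.3, 0.2, 0.1
H = entropy([0.4, 0.3, 0.2, 0.1])      # ~1.846 bits/symbol

# Problem 3: telephone channel, usable band 300-3400 Hz, SNR = 32 dB
C = shannon_capacity(3400 - 300, 32)   # ~33 kbits/s

# Problem 18: P(sum of two dice = 7) = 6/36, so I = -log2(1/6)
I = -math.log2(6 / 36)                 # ~2.585 bits

print(f"H = {H:.3f} bits/symbol, C = {C:.0f} bits/s, I = {I:.3f} bits")
```

The same entropy helper applies to problems 1, 4, 6 and 17; for problem 16, multiply the entropy per sample (log2 256 = 8 bits) by the sampling rate to get the information rate.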

A.S.Rao Dept. of Electronics & Communication Engg.

Balaji Institute of Engineering & Sciences