
Digital Communications

1. Derive the probability of error expressions for the following keying techniques:
a. Amplitude Shift Keying
b. Frequency Shift Keying
c. Phase Shift Keying
d. Quadrature Phase Shift Keying
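
A minimal Python sketch, assuming coherent detection in AWGN with matched-filter reception and Eb taken as the
average energy per bit, that evaluates the standard closed-form results these derivations arrive at (the Eb/N0
value below is only an assumed example):

import numpy as np
from scipy.special import erfc

def Q(x):
    # Gaussian tail probability, Q(x) = 0.5*erfc(x/sqrt(2))
    return 0.5 * erfc(x / np.sqrt(2))

EbN0_dB = 8.0                       # example Eb/N0 in dB (assumed value)
g = 10 ** (EbN0_dB / 10)            # Eb/N0 as a plain ratio

pe_ask  = Q(np.sqrt(g))             # coherent ASK (on-off keying), Eb = average bit energy
pe_fsk  = Q(np.sqrt(g))             # coherent orthogonal BFSK
pe_psk  = Q(np.sqrt(2 * g))         # BPSK
pe_qpsk = Q(np.sqrt(2 * g))         # QPSK (Gray coded), error probability per bit

print(pe_ask, pe_fsk, pe_psk, pe_qpsk)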

2. Explain the following terms:
a. Amount of information and its properties
b. Entropy and its properties
c. Mutual Information and its properties
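
A small numeric illustration of the first two terms; the probability value and the example pmf below are
assumptions chosen only to show I(m) = log2(1/p) and H = Σ p·log2(1/p) at work. Mutual information is exercised
numerically in the sketch after question 4.

import math

p = 1/8
I = math.log2(1 / p)        # amount of information of one message: 3 bits (rarer => more information)

pmf = [0.5, 0.25, 0.25]     # example source pmf (assumed)
H = sum(pi * math.log2(1 / pi) for pi in pmf)   # entropy = average information per message
print(I, H)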

3. A DMS transmits four messages M1, M2, M3, M4 with probabilities 1/2, 1/4, 1/8, 1/8 respectively.
Calculate the entropy H. If r = 1 message/sec, calculate the information rate R.
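
A worked sketch of the arithmetic, taking r = 1 message/s as stated:

import math

p = [1/2, 1/4, 1/8, 1/8]
H = sum(pi * math.log2(1 / pi) for pi in p)   # = 1.75 bits/message
r = 1                                          # messages per second
R = r * H                                      # = 1.75 bits/s
print(H, R)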

4. i) Show that the entropy of a discrete memoryless source is maximum and equal to log2 M when the output
symbols are equally probable. ii) Show that H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y). iii)
Show that I(X; Y) = H(X) + H(Y) - H(X, Y) = H(X) - H(X|Y) = H(Y) - H(Y|X).
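
A numerical check of these identities on an example joint distribution; the 2x3 pmf below is an arbitrary
assumption, not part of the question:

import numpy as np

Pxy = np.array([[0.10, 0.20, 0.10],
                [0.25, 0.05, 0.30]])      # example joint pmf p(x, y), sums to 1
Px = Pxy.sum(axis=1)                      # marginal p(x)
Py = Pxy.sum(axis=0)                      # marginal p(y)

def H(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

Hxy = H(Pxy.flatten())
Hy_given_x = sum(Px[i] * H(Pxy[i] / Px[i]) for i in range(len(Px)))
Hx_given_y = sum(Py[j] * H(Pxy[:, j] / Py[j]) for j in range(len(Py)))

# chain rule: H(X,Y) = H(X) + H(Y|X) = H(Y) + H(X|Y)
print(Hxy, H(Px) + Hy_given_x, H(Py) + Hx_given_y)
# mutual information: the three expressions in part (iii) agree
print(H(Px) + H(Py) - Hxy, H(Px) - Hx_given_y, H(Py) - Hy_given_x)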

5. A DMS has symbols a, b, c with probabilities 0.65, 0.2, 0.15 respectively.
i) Calculate the entropy of the source.
ii) Calculate the entropy of the second-order extension of the source.
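
A short sketch for both parts; for a DMS the second-order extension satisfies H(S^2) = 2·H(S), which the code
confirms numerically:

import math
from itertools import product

p = {'a': 0.65, 'b': 0.2, 'c': 0.15}
H1 = sum(pi * math.log2(1 / pi) for pi in p.values())

# second-order extension: all ordered symbol pairs with product probabilities
p2 = {x + y: p[x] * p[y] for x, y in product(p, repeat=2)}
H2 = sum(pi * math.log2(1 / pi) for pi in p2.values())

print(H1, H2, 2 * H1)     # about 1.28 bits, 2.56 bits, 2.56 bits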

6. Consider a telegraph source having two symbols, dot and dash. The dot duration is 0.2 s. The dash
duration is 3 times that of the dot duration. The probability of the dot’s occurring is twice that of
the dash, and the time between symbols is 0.2 s. Calculate the information rate of the telegraph
source.
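
A worked sketch under the usual reading of the statement: P(dot) = 2/3, P(dash) = 1/3, dash duration 0.6 s, and
a 0.2 s gap following every symbol:

import math

p_dot, p_dash = 2/3, 1/3
t_dot, t_dash, t_gap = 0.2, 0.6, 0.2

H = p_dot * math.log2(1 / p_dot) + p_dash * math.log2(1 / p_dash)   # bits/symbol
T = p_dot * (t_dot + t_gap) + p_dash * (t_dash + t_gap)             # average time per symbol, s
r = 1 / T                                                            # symbols/s
R = r * H                                                            # bits/s
print(H, r, R)     # roughly 0.918 bits/symbol, 1.875 symbols/s, 1.72 bits/s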

7. An analog signal is band-limited to B Hz and sampled at the Nyquist rate. The samples are quantized
into 4 levels, each level representing one message. Thus there are 4 messages, and the probabilities of
occurrence of these 4 levels (messages) are p1 = p4 = 1/8 and p2 = p3 = 3/8. Find the information rate
of the source.
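
A sketch of the computation; the bandwidth B is not specified, so the value below is only an assumed
placeholder and the rate scales linearly with B:

import math

p = [1/8, 3/8, 3/8, 1/8]
H = sum(pi * math.log2(1 / pi) for pi in p)   # about 1.81 bits/sample

B = 1000            # example bandwidth in Hz (assumption)
r = 2 * B           # Nyquist sampling rate, samples/s
R = r * H           # information rate, about 3.62*B bits/s
print(H, R)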

8. Apply Shannon-Fano coding to a source emitting 8 messages with probabilities
1/2, 3/20, 3/20, 2/25, 2/25, 1/50, 1/100, and 1/100 respectively, and find the coding efficiency.
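
A compact Shannon-Fano sketch; the split point at each step is chosen to make the two group probabilities as
nearly equal as possible (ties resolved by the initial descending sort), which is one common convention for
this algorithm:

import math

def shannon_fano(items):
    # items: list of (symbol, probability) pairs, sorted by descending probability
    if len(items) == 1:
        return {items[0][0]: ''}
    total = sum(p for _, p in items)
    run, best_k, best_diff = 0.0, 1, float('inf')
    for k in range(1, len(items)):
        run += items[k - 1][1]
        diff = abs(2 * run - total)          # |left group sum - right group sum|
        if diff < best_diff:
            best_diff, best_k = diff, k
    codes = {s: '0' + c for s, c in shannon_fano(items[:best_k]).items()}
    codes.update({s: '1' + c for s, c in shannon_fano(items[best_k:]).items()})
    return codes

probs = [1/2, 3/20, 3/20, 2/25, 2/25, 1/50, 1/100, 1/100]
items = sorted(enumerate(probs), key=lambda sp: -sp[1])
code = shannon_fano(items)

H = sum(p * math.log2(1 / p) for p in probs)       # source entropy
L = sum(probs[s] * len(code[s]) for s in code)     # average code length
print(code, H, L, H / L)                           # efficiency = H / L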

9. For the following messages with the given probabilities, calculate the code words using both the
Shannon-Fano coding and Huffman encoding techniques, and also calculate the coding efficiency.

Message:      M1    M2    M3    M4    M5    M6    M7    M8    M9
Probability:  1/2   1/8   1/8   1/8   1/16  1/16  1/16  1/32  1/32
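
A minimal Huffman-coding sketch (binary Huffman built with a priority queue); the Shannon-Fano sketch given
after question 8 can be reused for the other half of this problem:

import heapq, itertools, math

probs = [1/2, 1/8, 1/8, 1/8, 1/16, 1/16, 1/16, 1/32, 1/32]   # M1 ... M9 as given

tie = itertools.count()                    # tie-breaker for nodes with equal probability
heap = [(p, next(tie), {i: ''}) for i, p in enumerate(probs)]
heapq.heapify(heap)

while len(heap) > 1:
    p1, _, c1 = heapq.heappop(heap)        # two least probable nodes
    p2, _, c2 = heapq.heappop(heap)
    merged = {s: '0' + w for s, w in c1.items()}
    merged.update({s: '1' + w for s, w in c2.items()})
    heapq.heappush(heap, (p1 + p2, next(tie), merged))

code = heap[0][2]
H = sum(p * math.log2(1 / p) for p in probs)
L = sum(probs[s] * len(code[s]) for s in code)
print(code, H, L, H / L)                   # efficiency = H / (average code length)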

10. For a (7,4) Linear Block Code, the generator matrix is

    G = [ 1 0 0 0 0 1 1
          0 1 0 0 1 1 0
          0 0 1 0 1 1 1
          0 0 0 1 1 0 1 ]

a. Obtain all code words
b. Find the parity check matrix H
c. Determine the error correction and detection capabilities of this code
d. Compute the syndrome for the received code vector R = [1 1 0 1 1 0 1]
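
A sketch that works through parts (a) to (d) with the generator matrix shown above, with all arithmetic
taken modulo 2:

import numpy as np
from itertools import product

G = np.array([[1, 0, 0, 0, 0, 1, 1],
              [0, 1, 0, 0, 1, 1, 0],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 1, 0, 1]])

P = G[:, 4:]                                    # parity part of G = [I4 | P]
H = np.hstack([P.T, np.eye(3, dtype=int)])      # parity-check matrix [P^T | I3]

codewords = [np.mod(np.array(m) @ G, 2) for m in product([0, 1], repeat=4)]
for m, c in zip(product([0, 1], repeat=4), codewords):
    print(m, c)                                 # (a) all 16 code words

d_min = min(int(c.sum()) for c in codewords if c.any())
print("d_min =", d_min,
      "-> detects", d_min - 1, "errors, corrects", (d_min - 1) // 2)   # (c)

R = np.array([1, 1, 0, 1, 1, 0, 1])
print("syndrome =", np.mod(H @ R, 2))           # (d)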

11. For a (7, 4) binary cyclic code, the generator polynomial is g(x) = 1 + x + x³. Find all code
vectors using the systematic encoding procedure.
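
A sketch of the systematic encoding rule c(x) = x^3·m(x) + [x^3·m(x) mod g(x)] for g(x) = 1 + x + x^3;
coefficient lists below are in ascending powers of x:

from itertools import product

n, k = 7, 4
g = [1, 1, 0, 1]                # g(x) = 1 + x + x^3

def poly_rem(dividend, divisor):
    # remainder of dividend(x) / divisor(x) over GF(2)
    r = list(dividend)
    deg = len(divisor) - 1
    for i in range(len(r) - 1, deg - 1, -1):
        if r[i]:
            for j, dj in enumerate(divisor):
                r[i - deg + j] ^= dj
    return r[:deg]

for m in product([0, 1], repeat=k):
    shifted = [0] * (n - k) + list(m)      # x^(n-k) * m(x)
    parity = poly_rem(shifted, g)          # n-k parity bits
    codeword = parity + list(m)            # systematic form: [parity | message]
    print(m, codeword)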

12. Draw the state diagram, tree diagram, and trellis diagram for the constraint length k = 3, rate-1/3
convolutional code generated by g1(x) = 1 + x², g2(x) = 1 + x, g3(x) = 1 + x + x².
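
A sketch that tabulates the state transitions and encoder outputs from which the state, tree, and trellis
diagrams are drawn; the tap vectors list the current input first, followed by the two previous bits held in
the shift register:

G_TAPS = [(1, 0, 1),     # g1(x) = 1 + x^2
          (1, 1, 0),     # g2(x) = 1 + x
          (1, 1, 1)]     # g3(x) = 1 + x + x^2

def step(state, bit):
    # state = (s1, s2): the two previous input bits, s1 being the more recent
    regs = (bit,) + state
    out = tuple(sum(t * r for t, r in zip(taps, regs)) % 2 for taps in G_TAPS)
    return (bit, state[0]), out

for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    for bit in (0, 1):
        nxt, out = step(state, bit)
        print(f"state {state}, input {bit} -> next state {nxt}, output {out}")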
