You have to communicate a signal in a language that has 3 symbols: A, B, and C. The probability of observing A is 50%, while that of observing B and C is 25% each. Design an appropriate encoding for this language. What is the entropy of this signal in bits?

Answer:

A Huffman code is used to encode the language. The entropy of the signal is 1.5 bits per symbol.

Explanation:

Using a Huffman coding scheme to encode:

Since A is twice as likely as B or C, the Huffman algorithm merges B and C first, then merges the result with A. This yields the prefix-free code:

A → 0

B → 10

C → 11

The expected code length is (1/2)(1) + (1/4)(2) + (1/4)(2) = 1.5 bits per symbol, which equals the entropy below, so this code is optimal.
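As a check, here is a minimal sketch of the Huffman construction in Python (heapq-based; the probabilities are those from the problem, and the 0/1 labels on each merge are an arbitrary but valid choice):

```python
import heapq

def huffman(probs):
    """Build a Huffman code for a dict of symbol -> probability."""
    # Each heap entry: (probability, tiebreaker, {symbol: code_so_far}).
    # The tiebreaker keeps tuple comparison away from the dicts.
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)  # two least-probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        # Prefix '0' to one subtree's codes and '1' to the other's.
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

codes = huffman({"A": 0.5, "B": 0.25, "C": 0.25})
print(codes)  # {'A': '0', 'B': '10', 'C': '11'} (0/1 labels may swap)
```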

To find the entropy, we use the formula below:

[tex]H = \sum_{i} p_i \log_{2} \frac{1}{p_i}[/tex]

where H is the entropy and p_i is the probability of symbol i.

p(A) = 50% = 1/2

p(B) = 25% = 1/4

p(C) = 25% = 1/4

[tex]H = \frac{1}{2}\log_{2}2 + \frac{1}{4}\log_{2}4 + \frac{1}{4}\log_{2}4\\
H = \frac{1}{2}(1) + \frac{1}{4}(2) + \frac{1}{4}(2)\\
H = \frac{1}{2} + \frac{1}{2} + \frac{1}{2}\\
H = \frac{3}{2} = 1.5 \text{ bits per symbol}[/tex]
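
The same calculation as a quick Python check (only math.log2 is needed; the probabilities are those given above):

```python
import math

probs = {"A": 0.5, "B": 0.25, "C": 0.25}

# H = sum over all symbols of p * log2(1/p)
H = sum(p * math.log2(1 / p) for p in probs.values())
print(H)  # 1.5
```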
