Answer:
A Huffman code is used to encode the language. The entropy works out to 1.5 bits per symbol.
Explanation:
Using the Huffman coding scheme to encode:
Huffman coding repeatedly merges the two least-probable symbols. Here B and C (1/4 each) merge into a node of probability 1/2, which then merges with A. One valid code assignment is A = 0, B = 10, C = 11, giving an average code length of 1/2(1) + 1/4(2) + 1/4(2) = 1.5 bits per symbol.
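For readers who want to reproduce the scheme, below is a minimal sketch of Huffman construction in Python using the standard-library heapq module; the symbols and probabilities come from the problem, while the function name and structure are illustrative assumptions:

```python
import heapq
from itertools import count

def huffman_codes(probabilities):
    # Illustrative helper (not from the original answer): builds a Huffman
    # code by repeatedly merging the two least-probable nodes.
    tiebreak = count()  # keeps heap entries comparable when probabilities tie
    # Each heap entry: (probability, tiebreaker, {symbol: code-so-far})
    heap = [(p, next(tiebreak), {sym: ""}) for sym, p in probabilities.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, codes1 = heapq.heappop(heap)
        p2, _, codes2 = heapq.heappop(heap)
        # Prepend "0" to the codes in one subtree and "1" to the other
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (p1 + p2, next(tiebreak), merged))
    return heap[0][2]

print(huffman_codes({"A": 0.5, "B": 0.25, "C": 0.25}))
# -> {'A': '0', 'B': '10', 'C': '11'} (bit labels may be swapped; lengths are what matter)
```

The printed code lengths (1 bit for A, 2 bits each for B and C) are exactly the lengths used in the average-length calculation above.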
To find the entropy, we use the formula given below:
[tex]H = \sum p \log_{2} \frac{1}{p}[/tex]
where H is the entropy in bits and p is the probability of each symbol (the sum runs over all symbols).
p(A) = 50% = 1/2
p(B) = 25% = 1/4
p(C) = 25% = 1/4
[tex]H = \frac{1}{2}\log_{2} 2 + \frac{1}{4}\log_{2} 4 + \frac{1}{4}\log_{2} 4\\H = \frac{1}{2}(1) + \frac{1}{4}(2) + \frac{1}{4}(2)\\H = \frac{1}{2} + \frac{1}{2} + \frac{1}{2}\\H = \frac{3}{2} = 1.5[/tex]
So the entropy is 1.5 bits per symbol, which equals the average Huffman code length; the two match exactly because every probability is a power of 1/2.
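The same arithmetic can be checked in a couple of lines of Python (a sketch; the probabilities are the ones given above):

```python
import math

probabilities = {"A": 0.5, "B": 0.25, "C": 0.25}

# H = sum over symbols of p * log2(1/p)
H = sum(p * math.log2(1 / p) for p in probabilities.values())
print(H)  # 1.5 (bits per symbol)
```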