
Problems: Chapter 3
3.1
A discrete source generates three independent symbols A, B and C with probabilities
0.9, 0.08 and 0.02 respectively.
a) Determine the entropy of the source.
b) Determine the redundancy of the source.
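A short Python sketch of the calculation in 3.1 (not part of the original problem set); redundancy is taken here as 1 − H/H_max with H_max = log2(M), so adjust if the course uses a different definition:

```python
import math

# Problem 3.1 sketch: entropy and redundancy of a three-symbol source.
probs = [0.9, 0.08, 0.02]

H = -sum(p * math.log2(p) for p in probs)   # source entropy, bits/symbol
H_max = math.log2(len(probs))               # maximum entropy for M = 3
redundancy = 1 - H / H_max                  # assumed definition of redundancy

print(f"H = {H:.4f} bits/symbol")
print(f"redundancy = {redundancy:.4f}")
```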
3.2
a) Consider a source having an M = 3 symbol alphabet where p(x1) = 1/2, p(x2) = p(x3) = 1/4 and the symbols are statistically independent. Calculate the information conveyed by the receipt of the symbol x1. Repeat for x2 and x3.
b) Consider a source whose statistically independent symbols consist of all possible binary sequences of length k. Assume all symbols are equiprobable. How much information is conveyed on receipt of any symbol?
c) Determine the information conveyed by the specific message x1x3x2x4 when it emanates from the following statistically independent symbol source: M = 4, p(x1) = 1/2, p(x2) = 1/4, p(x3) = p(x4) = 1/8.
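The three parts of 3.2 all reduce to the self-information I(x) = −log2 p(x); a small sketch (not part of the original problem set):

```python
import math

# Problem 3.2 sketch: self-information of a symbol with probability p.
def self_info(p):
    return -math.log2(p)

# (a) M = 3 source: p(x1) = 1/2, p(x2) = p(x3) = 1/4
print(self_info(1/2))   # 1 bit
print(self_info(1/4))   # 2 bits

# (b) all 2**k equiprobable binary sequences of length k, each p = 2**-k
k = 5                   # illustrative choice of k
print(self_info(2 ** -k))   # k bits

# (c) message x1 x3 x2 x4; independence lets per-symbol informations add
probs = {"x1": 1/2, "x2": 1/4, "x3": 1/8, "x4": 1/8}
msg = ["x1", "x3", "x2", "x4"]
print(sum(self_info(probs[s]) for s in msg))   # 1 + 3 + 2 + 3 = 9 bits
```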
3.3
Calculate the information loss due to noise, per transmitted digit, if a random binary signal is transmitted through a channel which adds zero-mean Gaussian noise, with an average signal-to-noise ratio of:
a) 0 dB
b) 5 dB
c) 10 dB
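One way to sketch 3.3 in Python, under assumptions that should be checked against the course's conventions: model the channel as a binary symmetric channel whose crossover probability comes from threshold detection in Gaussian noise, p = Q(sqrt(S/N)), and take the information loss per digit as the equivocation, i.e. the binary entropy of p. Texts differ on the SNR-to-p mapping (some use Q(sqrt(2·S/N))):

```python
import math

# Gaussian tail probability Q(x) via the complementary error function.
def Q(x):
    return 0.5 * math.erfc(x / math.sqrt(2))

# Problem 3.3 sketch (assumed model): equivocation of the induced BSC.
def info_loss_per_digit(snr_db):
    snr = 10 ** (snr_db / 10)        # dB -> linear power ratio
    p = Q(math.sqrt(snr))            # assumed crossover probability
    # binary entropy of p = information lost per digit, bits
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for db in (0, 5, 10):
    print(f"{db} dB -> {info_loss_per_digit(db):.4f} bits/digit")
```

As expected, the loss falls rapidly as the SNR improves.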
3.4
An information source contains 100 different, statistically independent, equiprobable symbols. Find the maximum code efficiency if, for transmission, all the symbols are represented by binary code words of equal length.
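A sketch of the arithmetic behind 3.4 (not part of the original problem set): the entropy is log2(100) bits per symbol, and the shortest equal-length binary word that can distinguish 100 symbols has ceil(log2(100)) = 7 bits:

```python
import math

# Problem 3.4 sketch: efficiency of a fixed-length binary code.
M = 100
H = math.log2(M)                  # entropy of equiprobable source, bits/symbol
L = math.ceil(math.log2(M))       # shortest equal-length word: 7 bits
efficiency = H / L

print(f"H = {H:.4f} bits, L = {L}, efficiency = {efficiency:.4f}")
```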
3.5
a) Apply Huffman’s algorithm to deduce an optimal code for transmitting the source
defined in problem 3.2a over a binary channel. Is your code unique?
b) Determine the efficiency of the code devised in part (a).
c) Construct another code for the source of part (a), assigning equal-length binary words irrespective of the occurrence probabilities of the symbols. Calculate the efficiency of this code.
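A Python sketch of 3.5 (not part of the original problem set). Ties in the merge order mean the codewords themselves are not unique, though the set of lengths is optimal:

```python
import heapq
import math

# Problem 3.5 sketch: Huffman code for the source of 3.2a.
probs = {"x1": 0.5, "x2": 0.25, "x3": 0.25}

# Min-heap of (probability, tiebreaker, symbols); each merge adds one bit
# to the codeword length of every symbol in the merged subtree.
lengths = {s: 0 for s in probs}
heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
tie = len(heap)
while len(heap) > 1:
    p1, _, a = heapq.heappop(heap)
    p2, _, b = heapq.heappop(heap)
    for s in a + b:
        lengths[s] += 1
    heapq.heappush(heap, (p1 + p2, tie, a + b))
    tie += 1

H = -sum(p * math.log2(p) for p in probs.values())   # entropy, bits/symbol
L_avg = sum(probs[s] * lengths[s] for s in probs)    # average Huffman length
L_fixed = math.ceil(math.log2(len(probs)))           # part (c): 2-bit words

print("lengths:", lengths)
print("Huffman efficiency:", H / L_avg)
print("fixed-length efficiency:", H / L_fixed)
```

Because the probabilities here are all negative powers of two, the Huffman code achieves efficiency 1; the fixed-length code of part (c) does not.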
3.6
An input alphabet (a keyboard on a word processor) consists of 100 characters.
a) If the keystrokes are encoded by a fixed-length code, determine the required number of bits for the encoding.
b) Assume that 10 of the keystrokes are equally likely, each occurring with probability 0.05, and that the remaining 90 keystrokes are also equally likely. Determine the average number of bits required to encode this alphabet using a variable-length Huffman code.
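A sketch for part (b) of 3.6 (not part of the original problem set): the 10 likely keystrokes carry total probability 0.5, so the remaining 90 share 0.5, each with probability 1/180. Building the Huffman tree gives the average codeword length directly:

```python
import heapq

# Problem 3.6b sketch: average Huffman codeword length for 100 keystrokes.
probs = [0.05] * 10 + [0.5 / 90] * 90

# Min-heap of (probability, tiebreaker, symbol indices); each merge adds
# one bit to the length of every symbol in the merged subtree.
lengths = [0] * len(probs)
heap = [(p, i, [i]) for i, p in enumerate(probs)]
heapq.heapify(heap)
tie = len(probs)
while len(heap) > 1:
    p1, _, a = heapq.heappop(heap)
    p2, _, b = heapq.heappop(heap)
    for i in a + b:
        lengths[i] += 1
    heapq.heappush(heap, (p1 + p2, tie, a + b))
    tie += 1

avg_bits = sum(p * l for p, l in zip(probs, lengths))
print(f"average codeword length: {avg_bits:.4f} bits")
```

The result must lie between the source entropy H and H + 1 bits per symbol.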
3.7
Encode the specific message AABACB using:
a) The 7-bit ASCII.
b) The adaptive Huffman code.
c) Compare the number of bits in the encoded messages of parts (a) and (b). Give your comments.
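A simplified sketch of the comparison in 3.7. This is NOT the FGK/Vitter adaptive Huffman algorithm: instead of updating the tree incrementally (with an NYT node for unseen symbols), it rebuilds a static Huffman code from the counts seen so far, all counts initialized to 1, before encoding each symbol. It only illustrates the adaptive idea and the bit-count comparison against 7-bit ASCII:

```python
import heapq

# Build a static Huffman codebook from symbol counts.
def huffman_codes(counts):
    heap = [(c, i, {s: ""}) for i, (s, c) in enumerate(sorted(counts.items()))]
    heapq.heapify(heap)
    tie = len(counts)
    while len(heap) > 1:
        c1, _, m1 = heapq.heappop(heap)
        c2, _, m2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in m1.items()}
        merged.update({s: "1" + w for s, w in m2.items()})
        heapq.heappush(heap, (c1 + c2, tie, merged))
        tie += 1
    return heap[0][2]

# Problem 3.7 sketch: encode AABACB, updating counts after each symbol.
message = "AABACB"
counts = {s: 1 for s in "ABC"}   # assumed known 3-symbol alphabet
bits = ""
for s in message:
    bits += huffman_codes(counts)[s]
    counts[s] += 1

ascii_bits = 7 * len(message)    # part (a): 7-bit ASCII -> 42 bits
print("ASCII:", ascii_bits, "bits; adaptive sketch:", len(bits), "bits")
```

Even this crude adaptive scheme uses far fewer bits than ASCII, since the code adapts to the skewed symbol frequencies.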
