INFORMATION THEORY
1. What is entropy?
Entropy is also called the average information per message. It is the ratio of the total information to the number of messages, i.e.,
Entropy, H = Total information / Number of messages
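As an illustration, entropy can also be computed directly from the message probabilities using the standard formula H = sum of p * log2(1/p); the probabilities below are assumed purely for the example.

```python
import math

def entropy(probabilities):
    """Average information per message, H = sum(p * log2(1/p)), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

# Assumed source with four messages of unequal probability
probs = [0.5, 0.25, 0.125, 0.125]
print(entropy(probs))  # 1.75 bits/message
```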
2. What is channel redundancy?
Redundancy = 1 – code efficiency
Redundancy should be as low as possible.
3. Name the two source coding techniques.
The two source coding techniques are a) Shannon-Fano coding and b) Huffman coding. Both of these produce prefix codes.
4. Write the expression for code efficiency in terms of entropy.
Code efficiency = Entropy (H) / Average codeword length (N)
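A minimal sketch tying this to Q2, with assumed message probabilities and codeword lengths: it computes the entropy H, the average codeword length N, the code efficiency H/N and the redundancy 1 – efficiency.

```python
import math

probs   = [0.5, 0.25, 0.125, 0.125]   # assumed message probabilities
lengths = [1, 2, 3, 3]                # assumed codeword lengths (bits)

H = sum(p * math.log2(1.0 / p) for p in probs)   # entropy, bits/message
N = sum(p * l for p, l in zip(probs, lengths))   # average codeword length
efficiency = H / N
redundancy = 1 - efficiency

print(H, N, efficiency, redundancy)  # 1.75  1.75  1.0  0.0
```

Here the assumed codeword lengths match the source statistics exactly, so the efficiency is 1 and the redundancy is 0; a fixed 2-bit code for the same source would give efficiency 1.75/2 = 0.875 and redundancy 0.125.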
5. What is a memoryless source? Give an example.
The alphabets emitted by a memoryless source do not depend upon the previous alphabets; every alphabet is independent. For example, a character generated by a keyboard represents a memoryless source.
6. Explain the significance of the entropy H(X/Y) of a communication system where X is the transmitter and Y is the receiver.
a) H(X/Y) is called conditional entropy. It represents the uncertainty of X, on average, when Y is known.
b) In other words, H(X/Y) is an average measure of the uncertainty in X after Y is received.
c) H(X/Y) represents the information lost in the noisy channel.
7. What is a prefix code?
In a prefix code, no codeword is the prefix of any other codeword. It is a variable-length code: the binary codewords are assigned to the messages according to their probabilities of occurrence.
8. Define information rate.
The information rate R is the average number of bits of information per second. It is calculated as
R = r H information bits/sec
where r is the rate at which symbols are generated (symbols/sec) and H is the entropy (bits/symbol).
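For example, with assumed numbers, a source emitting 2000 symbols/sec with an entropy of 1.75 bits/symbol has:

```python
r = 2000      # assumed symbol rate, symbols/sec
H = 1.75      # assumed entropy, bits/symbol
R = r * H
print(R)      # 3500.0 information bits/sec
```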
9. Calculate the entropy of a source with a symbol set containing 64 symbols, each with probability pi = 1/64.
Here there are M = 64 equally likely symbols. Hence the entropy of such a source is given as
H = log2 M = log2 64 = 6 bits/symbol
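A quick numerical check of this result against the general entropy formula:

```python
import math

M = 64
probs = [1.0 / M] * M                              # 64 equally likely symbols
H = sum(p * math.log2(1.0 / p) for p in probs)
print(H)                                           # 6.0 bits/symbol, i.e. log2(64)
```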
10. State the channel coding theorem for a discrete memoryless channel.
Statement of the theorem:
Given a source of M equally likely messages, with M >> 1, which is generating information at a rate R, and given a channel with capacity C, then if
R ≤ C
there exists a coding technique such that the output of the source may be transmitted over the channel with a probability of error in the received message which may be made arbitrarily small.
Explanation:
This theorem says that if R ≤ C, it is possible to transmit information with an arbitrarily small probability of error even in the presence of noise. Coding techniques are used to detect and correct the errors.
11. What is information theory?
Information theory deals with the mathematical modeling and analysis of a communication system rather than with physical sources and physical channels.
12. Explain Shannon-Fano coding.
An efficient code can be obtained by the following simple procedure, known as the Shannon-Fano algorithm (a code sketch is given after the steps).
Step 1: List the source symbols in order of decreasing probability.
Step 2: Partition the set into two sets that are as close to equiprobable as possible, and assign 0 to the upper set and 1 to the lower set.
Step 3: Continue this process, each time partitioning the sets with as nearly equal probabilities as possible, until further partitioning is not possible.
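A minimal recursive sketch of this procedure, with an assumed four-symbol source; the input is expected to be pre-sorted by decreasing probability as in Step 1.

```python
def shannon_fano(symbols):
    """Assign Shannon-Fano codewords.

    symbols: list of (symbol, probability) pairs, sorted by decreasing probability.
    Returns a dict mapping each symbol to its binary codeword string.
    """
    if len(symbols) == 1:
        return {symbols[0][0]: ""}

    # Find the split that makes the two groups as close to equiprobable as possible (Step 2).
    total = sum(p for _, p in symbols)
    best_split, best_diff, running = 1, float("inf"), 0.0
    for i in range(1, len(symbols)):
        running += symbols[i - 1][1]
        diff = abs(running - (total - running))
        if diff < best_diff:
            best_diff, best_split = diff, i

    # Recurse on both halves (Step 3), prefixing 0 for the upper set and 1 for the lower set.
    codes = {}
    for sym, code in shannon_fano(symbols[:best_split]).items():
        codes[sym] = "0" + code
    for sym, code in shannon_fano(symbols[best_split:]).items():
        codes[sym] = "1" + code
    return codes

# Assumed example source, already listed in order of decreasing probability (Step 1)
source = [("A", 0.4), ("B", 0.3), ("C", 0.15), ("D", 0.15)]
print(shannon_fano(source))  # {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```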
13. Define bandwidth efficiency.
The ratio of channel capacity to bandwidth is called bandwidth efficiency, i.e.,
Bandwidth efficiency = Channel capacity (C) / Bandwidth (B)
14. Define the channel capacity of the discrete memoryless channel.
The channel capacity of the discrete memoryless channel is given as the maximum average mutual information. The maximization is taken with respect to the input probabilities.
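A small numerical sketch of this definition for a binary symmetric channel (the crossover probability p = 0.1 is assumed): the mutual information I(X;Y) is evaluated over a grid of input probabilities and the maximum is taken as the capacity.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information_bsc(q, p):
    """I(X;Y) for a binary symmetric channel with input P(X=1) = q and crossover probability p."""
    py1 = q * (1 - p) + (1 - q) * p    # P(Y = 1)
    return h2(py1) - h2(p)             # I(X;Y) = H(Y) - H(Y|X), and H(Y|X) = h2(p) for a BSC

p = 0.1
# Maximize the mutual information over the input distribution (coarse grid search)
capacity = max(mutual_information_bsc(q / 1000, p) for q in range(1001))
print(capacity)                        # ~0.531 bits per channel use
```

The maximum occurs at an equiprobable input, recovering the familiar closed form C = 1 – h2(p) for the binary symmetric channel.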
GLOSSARY TERMS:
1. Entropy: the average amount of information per source symbol.
2. Information: a continuous function of the probability of occurrence of the message.
3. Channel: the medium through which the information is transmitted from the source to the destination.
4. Channel capacity: the maximum mutual information that may be transmitted through the channel.
5. Binary symmetric channel: the channel is symmetric because the probability of receiving a '1' if a '0' is transmitted is the same as the probability of receiving a '0' if a '1' is transmitted.
6. SNR: the ratio of signal power to noise power.