Telecommunication Networks: Information Theory
By: Vinh Dang
Outline
Introduction of telecommunication networks
Basic concepts
Information
Entropy
Source Encoding
Introduction
What is telecommunications?
The word is derived from the Greek "tele", meaning "far off", and the Latin "communicare", meaning "to share". Hence telecommunication is communication at a distance.
The true nature of telecommunications is the passing of information to one or more others, in any form that may be used.
Telecommunications
People tend to think of telecommunications in terms of telephones, computer networks, the Internet, and perhaps cable television. It includes the familiar electrical, electromagnetic, and optical means, but also simple wire, radio, or even visual forms of signalling.
Early Telecommunications
Drum and horn
Smoke/Fire Signal
Light
Pigeons
[Illustrations: smoke signal, drum, tower using mirrors, pigeons]
Advancing telecommunications
Telegraph (Morse)
Telephone (Bell)
Wireless communication
Satellite Communication
[Photos: a BTS (mobile communication), the VINASAT-1 satellite]
Telecommunication Networks
Telecommunication networks encompass:
Mobile communication systems
Wireless communication systems
Data communication systems
Satellite communication systems
Switching systems
Optical communication systems
Voice & television systems
A telecommunication network is a network of telecommunication links and nodes arranged so that messages may be passed from one part of the network to another over multiple links and through various nodes.
Telecommunication Networks
Telecommunication networks are subdivided into the following areas:
Transport: transmission facilities
Switching: switch, exchange, or central office (CO)
Access: equipment for the access of subscribers (access networks, AN)
Customer premises: subscriber terminal equipment, i.e. customer premises equipment (CPE)
Switching equipment and transmission facilities together form the core network.
The customer premises equipment is connected to the core network via the access network.
Transport and switching are the two basic functions for the transfer of user information.
[Diagram: access networks (AN) with subscriber terminals (CPE), connected through exchanges over the transport network]
Transport channels (inter-exchange connections):
• Trunks for transmitting user information
• Signaling links for transmitting the control information
Basic concepts
Block diagram of digital communication system
Source → Source encoder → Channel encoder → Modulator → Noisy channel → Demodulator → Channel decoder → Source decoder → Destination
What is Information Theory?
Information theory provides a quantitative measure of source information and of the information capacity of a channel.
It deals with coding as a means of utilizing channel capacity for information transfer.
Shannon's coding theorem:
"If the rate of information from a source does not exceed the capacity of a communication channel, then there exists a coding technique such that the information can be transmitted over the channel with an arbitrarily small probability of error, despite the presence of noise."
Information measure
Information theory: how much information
… is contained in a signal?
… can a system generate?
… can a channel transmit?
Information is the commodity produced by the source for transfer to some user at the destination.
Example: a football match, Barcelona vs GĐT-LA
Information measure
Consider the three results: win, draw, loss
Case 1, Barca wins: probability ≈ 1 (quite sure) → no information
Case 2, Barca draws with GĐT-LA: relatively low probability → more information
Case 3, Barca loses: very low probability of occurrence in a typical situation → a vast amount of information
The less likely the message, the more information it conveys.
How is information mathematically defined?
Information
Let x_j be an event and p(x_j) the probability that x_j is selected for transmission.
If x_j occurs, we gain
I(x_j) = \log_a \frac{1}{p(x_j)} = -\log_a p(x_j)    (1)
units of information. I(x_j) is called self-information.
For 0 ≤ p(x_j) ≤ 1:
I(x_j) ≥ 0
I(x_j) → 0 as p(x_j) → 1
I(x_j) > I(x_i) for p(x_j) < p(x_i)
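To make equation (1) concrete, here is a minimal Python sketch; the self_information name, its base parameter, and the example probabilities are illustrative choices rather than anything from the slides.

```python
import math

def self_information(p, base=2):
    """Self-information I(x) = log_a(1/p(x)) = -log_a p(x) of an event with probability p.

    base=2 gives bits, base=math.e gives nats, base=10 gives hartleys.
    """
    if not 0 < p <= 1:
        raise ValueError("p must lie in (0, 1]")
    return math.log(1 / p) / math.log(base)

# A sure event carries no information; less likely events carry more.
print(self_information(1.0))   # 0.0 bits
print(self_information(0.5))   # 1.0 bit
print(self_information(0.1))   # ~3.32 bits
```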
Information
The base a of the logarithm determines the unit:
a = e → the measure of information is the nat
a = 10 → the measure of information is the hartley
a = 2 → the measure of information is the bit
Example: a random experiment with 16 equally likely outcomes.
The information associated with each outcome is
I(x_j) = -\log_2(1/16) = \log_2 16 = 4 bits
The information is greater than one bit, since the probability of each outcome is much less than 1/2.
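A quick, self-contained Python check of the 16-outcome example in each of the three units (the change of base is done with natural logarithms); the loop and labels are illustrative.

```python
import math

# Self-information of one of 16 equally likely outcomes, in three units.
p = 1 / 16
for base, unit in ((2, "bits"), (math.e, "nats"), (10, "hartleys")):
    print(math.log(1 / p) / math.log(base), unit)   # 4.0 bits, ~2.77 nats, ~1.20 hartleys
```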
Entropy and Information rate
Consider an information source emitting a sequence of symbols from the set X = {x_1, x_2, ..., x_M}.
Each symbol x_i is treated as a message with probability p(x_i) and self-information I(x_i).
This source has an average rate of r symbols/s.
Discrete memoryless source: the amount of information produced by the source during an arbitrary symbol interval is a discrete random variable X.
The average information per symbol is then given by:
H(X) = E\{I(x_j)\} = -\sum_{j=1}^{M} p(x_j) \log_2 p(x_j)   bits/symbol    (2)
Entropy = information = uncertainty.
If a signal is completely predictable, it has zero entropy and no information.
Entropy = the average number of bits required to transmit the signal.
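A minimal Python sketch of equation (2); the entropy name is an illustrative choice, and terms with p = 0 are dropped by the usual convention 0·log 0 = 0.

```python
import math

def entropy(probs):
    """Entropy H(X) = -sum_j p(x_j) log2 p(x_j), in bits/symbol."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A completely predictable source has zero entropy and no information.
print(entropy([1.0, 0.0, 0.0]))   # 0.0 bits/symbol
```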
Example
Random variable with uniform distribution over 32 outcomes:
H(X) = -\sum_{i=1}^{32} \frac{1}{32} \log_2 \frac{1}{32} = \log_2 32 = 5 bits
Number of bits required to represent an outcome = \log_2 32 = 5 bits.
Therefore H(X) = the number of bits required to represent a random event.
How many bits are needed for:
the outcome of a coin toss?
the statement "tomorrow is a Thursday"?
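Reusing the same small entropy helper (repeated here so the snippet runs on its own), the uniform 32-outcome source and a fair coin toss come out as expected.

```python
import math

def entropy(probs):
    """H(X) = -sum_j p(x_j) log2 p(x_j), in bits (same helper as sketched above)."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(entropy([1 / 32] * 32))   # 5.0 bits: uniform over 32 outcomes
print(entropy([0.5, 0.5]))      # 1.0 bit: the outcome of a fair coin toss
```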
Entropy
The value of H(X) for a given source depends on the symbol probabilities p(x_i) and on M.
However,
0 \le H(X) \le \log_2 M    (3)
The lower bound corresponds to no uncertainty.
The upper bound corresponds to maximum uncertainty, occurring when all symbols are equally likely.
The proof of this inequality is shown in [2], Chapter 15.
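A quick numerical check of inequality (3): a deterministic source sits at the lower bound and a uniform source reaches the upper bound log2 M; M = 8 is an arbitrary choice for this sketch.

```python
import math

M = 8
uniform = [1 / M] * M
# Deterministic source (one symbol with p = 1): H = -1 * log2(1) = 0, the lower bound.
print(-sum(p * math.log2(p) for p in uniform))   # 3.0, the upper bound
print(math.log2(M))                              # 3.0 = log2 M
```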
Proof that 0 ≤ H(X) ≤ log2 M
The lower bound, for arbitrary M, follows by noting that a·log(a) → 0 as a → 0.
The proof of the upper bound is more complex
We must show
H(X) = -\sum_X p(x) \log_2 p(x) = -\sum_{i=1}^{M} p(x_i) \log_2 p(x_i) \le \log_2 M
We invoke the inequality ln a ≤ a - 1.
Proof that 0 ≤ H(X) ≤ log2 M (cont'd)
Consider:
H(X) - \log_2 M = -\sum_X p(x) \log_2 p(x) - \log_2 M
= -\sum_{i=1}^{M} p(x_i) \log_2 p(x_i) - \log_2 M \sum_{i=1}^{M} p(x_i) = \sum_{i=1}^{M} p(x_i) \log_2 \frac{1}{M p(x_i)}
= \log_2 e \sum_{i=1}^{M} p(x_i) \ln \frac{1}{M p(x_i)} \le \log_2 e \sum_{i=1}^{M} p(x_i) \left( \frac{1}{M p(x_i)} - 1 \right)
\Rightarrow H(X) - \log_2 M \le \log_2 e \sum_{i=1}^{M} \left( \frac{1}{M} - p(x_i) \right)
\Leftrightarrow H(X) - \log_2 M \le \log_2 e \left( \sum_{i=1}^{M} \frac{1}{M} - \sum_{i=1}^{M} p(x_i) \right) = 0
\Rightarrow H(X) \le \log_2 M
Equality holds when \frac{1}{M p(x_i)} = 1, i.e. p(x_i) = \frac{1}{M}.
Example
For a binary source (M = 2), let p(1) = α and p(0) = 1 - α = β.
From (2), we have the binary entropy:
H(X) = -\alpha \log_2 \alpha - (1 - \alpha) \log_2 (1 - \alpha)
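A short Python sketch of the binary entropy; the binary_entropy name is illustrative, and, consistent with bound (3) for M = 2, it peaks at 1 bit when α = 0.5.

```python
import math

def binary_entropy(alpha):
    """H(alpha) = -alpha log2 alpha - (1 - alpha) log2 (1 - alpha), in bits."""
    if alpha in (0.0, 1.0):
        return 0.0   # a fully predictable binary source has zero entropy
    return -alpha * math.log2(alpha) - (1 - alpha) * math.log2(1 - alpha)

for alpha in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(alpha, round(binary_entropy(alpha), 3))
# 0.0 and 1.0 give 0 bits; 0.1 and 0.9 give ~0.469; 0.5 gives the maximum, 1.0
```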
Source coding theorem
Information from a source producing different symbols can be described by the entropy H(X).
Source information rate (bits/s): R_s = r·H(X)
H(X): source entropy (bits/symbol)
r: symbol rate (symbols/s)
Assume this source is the input to a channel with:
C: capacity (bits/symbol)
S: available symbol rate (symbols/s)
S·C: maximum rate the channel can carry (bits/s)
Source coding theorem (cont’d)
Shannon’s first theorem (noiseless coding theorem):
"Given a channel and a source that generates information at a rate less than the channel capacity, it is possible to encode the source output in such a manner that it can be transmitted through the channel."
Demonstration of source encoding by an example:
Discrete binary source (source symbol rate r = 3.5 symbols/s) → Source encoder → Binary channel (C = 1 bit/symbol, S = 2 symbols/s, S·C = 2 bits/s)
Example of Source encoding
Discrete binary source: A (p = 0.9), B (p = 0.1)
Source symbol rate (3.5 symbols/s) > channel symbol rate (2 symbols/s), so the source symbols cannot be transmitted directly.
Check Shannon's theorem:
H(X) = -0.1 \log_2 0.1 - 0.9 \log_2 0.9 = 0.469 bits/symbol
R_s = r·H(X) = 3.5 × 0.469 ≈ 1.642 bits/s < S·C = 2 bits/s
Transmission is possible by source encoding to decrease the average symbol rate.
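A quick Python check of the numbers in this example; the variable names are illustrative.

```python
import math

r = 3.5                        # source symbol rate (symbols/s)
S, C = 2, 1                    # channel symbol rate (symbols/s) and capacity (bits/symbol)

H = -0.1 * math.log2(0.1) - 0.9 * math.log2(0.9)   # ~0.469 bits/symbol
Rs = r * H                                          # ~1.64 bits/s (the slide rounds to 1.642)

print(round(H, 3), round(Rs, 3), S * C)             # 0.469 1.641 2
print("source coding can make it fit:", Rs < S * C) # True
```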