Principles of Communications
By: Dang Quang Vinh
Faculty of Electronics and Telecommunications
Ho Chi Minh University of Natural Sciences
Convolutional codes
09/2008
Introduction
In block coding, the encoder accepts a k-bit message block and generates an n-bit codeword ⇒ block-by-block basis
The encoder must buffer an entire message block before generating the codeword
When the message bits come in serially rather than in large blocks, using a buffer is undesirable
⇒ Convolutional coding
Definitions
A convolutional encoder: a finite-state machine that consists of an M-stage shift register and n modulo-2 adders
An L-bit message sequence produces an output sequence with n(L+M) bits
Code rate:
r = L / (n(L+M)) (bits/symbol)
Since L >> M,
r ≈ 1/n (bits/symbol)
Definitions
Constraint length (K): the number of shifts over which a single message bit influences the output
M-stage shift register: a message bit needs M+1 shifts to enter the shift register and come out
⇒ K = M+1
Example
Convolutional code (2,1,2)
n=2: 2 modulo-2 adders, i.e., 2 outputs
k=1: 1 input
M=2: 2 stages of shift register (K = M+1 = 2+1 = 3)
[Encoder diagram: the input feeds a 2-stage shift register; path 1 and path 2 are modulo-2 adders whose two outputs are multiplexed into the output sequence]
Example
Convolutional code (3,2,1)
n=3: 3 modulo-2 adders, i.e., 3 outputs
k=2: 2 inputs
M=1: 1 stage in each shift register (K=2 for each)
[Encoder diagram: two inputs, each feeding a 1-stage shift register, connected to 3 modulo-2 adders]
Generator polynomials
A convolutional code is a nonsystematic code
Each path connecting the output to the input can be characterized by an impulse response or a generator polynomial
g^(i) = (g_0^(i), g_1^(i), g_2^(i), ..., g_M^(i)) denotes the impulse response of the i-th path
Generator polynomial of the i-th path:
g^(i)(D) = g_0^(i) + g_1^(i)D + g_2^(i)D^2 + ... + g_M^(i)D^M
D denotes the unit-delay variable ⇒ different from the X of cyclic codes
A complete convolutional code is described by the set of polynomials {g^(1)(D), g^(2)(D), ..., g^(n)(D)}
Example(1/8)
Consider the case of (2,1,2)
Impulse response of path 1 is (1,1,1)
The corresponding generator polynomial is g^(1)(D) = D^2 + D + 1
Impulse response of path 2 is (1,0,1)
The corresponding generator polynomial is g^(2)(D) = D^2 + 1
Message sequence (11001)
Polynomial representation: m(D) = D^4 + D^3 + 1
Example(2/8)
Output polynomial of path 1:
c^(1)(D) = m(D)g^(1)(D)
         = (D^4 + D^3 + 1)(D^2 + D + 1)
         = D^6 + D^5 + D^4 + D^5 + D^4 + D^3 + D^2 + D + 1
         = D^6 + D^3 + D^2 + D + 1
Output sequence of path 1: (1001111)
Output polynomial of path 2:
c^(2)(D) = m(D)g^(2)(D)
         = (D^4 + D^3 + 1)(D^2 + 1)
         = D^6 + D^5 + D^4 + D^3 + D^2 + 1
Output sequence of path 2: (1111101)
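The two output sequences above can be checked with a short GF(2) polynomial multiplication. A minimal sketch, assuming MSB-first coefficient lists (so (11001) stands for D^4 + D^3 + 1); the function name is illustrative:

```python
def gf2_polymul(a, b):
    """Multiply two GF(2) polynomials given as MSB-first coefficient lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj   # addition over GF(2) is XOR
    return out

m  = [1, 1, 0, 0, 1]   # m(D)     = D^4 + D^3 + 1
g1 = [1, 1, 1]         # g^(1)(D) = D^2 + D + 1
g2 = [1, 0, 1]         # g^(2)(D) = D^2 + 1

print(gf2_polymul(m, g1))  # → [1, 0, 0, 1, 1, 1, 1], i.e. (1001111)
print(gf2_polymul(m, g2))  # → [1, 1, 1, 1, 1, 0, 1], i.e. (1111101)
```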
Example(3/8)
m = (11001)
c^(1) = (1001111)
c^(2) = (1111101)
Encoded sequence c = (11,01,01,11,11,10,11)
Message length L = 5 bits
Output length n(L+K-1) = 14 bits
A terminating sequence of K-1 = 2 zeros is appended to the last input bit for the shift register to be restored to its zero initial state
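The encoding above can also be reproduced as a shift-register simulation. A minimal sketch, assuming the tap connections implied by the impulse responses (1,1,1) and (1,0,1); the function name is illustrative:

```python
def encode_212(message_bits):
    """Encode with the (2,1,2) encoder: path 1 taps all stages (111),
    path 2 taps the input and the last stage (101)."""
    s1 = s2 = 0                      # 2-stage shift register, initially zero
    out = []
    for b in message_bits + [0, 0]:  # K-1 = 2 terminating zeros
        v1 = b ^ s1 ^ s2             # path 1 (impulse response 111)
        v2 = b ^ s2                  # path 2 (impulse response 101)
        out.append((v1, v2))
        s1, s2 = b, s1               # shift the register
    return out

pairs = encode_212([1, 1, 0, 0, 1])
print(["%d%d" % p for p in pairs])
# → ['11', '01', '01', '11', '11', '10', '11']
```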
Example(4/8)
Another way to calculate the output: directly convolve the message sequence with each path's impulse response
Path 1: convolving m = (11001) with the impulse response (1,1,1) gives c^(1) = (1001111)
Example(5/8)
Path 2: convolving m = (11001) with the impulse response (1,0,1) gives c^(2) = (1111101)
Example(6/8)
Consider the case of (3,2,1)
g_i^(j) = (g_{i,0}^(j), g_{i,1}^(j), ..., g_{i,M-1}^(j), g_{i,M}^(j)) denotes the impulse response of the j-th path corresponding to the i-th input
[Encoder diagram: two inputs, each with a 1-stage shift register, connected to 3 modulo-2 adders]
Example(7/8)
g_1^(1) = (11) ⇒ g_1^(1)(D) = D + 1
g_2^(1) = (01) ⇒ g_2^(1)(D) = 1
g_1^(2) = (01) ⇒ g_1^(2)(D) = 1
g_2^(2) = (10) ⇒ g_2^(2)(D) = D
g_1^(3) = (11) ⇒ g_1^(3)(D) = D + 1
g_2^(3) = (10) ⇒ g_2^(3)(D) = D
[Encoder diagram: same as in Example(6/8)]
Example(8/8)
Assume that:
m^(1) = (101) ⇒ m^(1)(D) = D^2 + 1
m^(2) = (011) ⇒ m^(2)(D) = D + 1
Outputs are:
c^(1)(D) = m^(1)(D)g_1^(1)(D) + m^(2)(D)g_2^(1)(D)
         = (D^2 + 1)(D + 1) + (D + 1)(1)
         = D^3 + D^2 + D + 1 + D + 1 = D^3 + D^2
⇒ c^(1) = (1100)
c^(2)(D) = m^(1)(D)g_1^(2)(D) + m^(2)(D)g_2^(2)(D)
         = (D^2 + 1)(1) + (D + 1)(D)
         = D^2 + 1 + D^2 + D = D + 1
⇒ c^(2) = (0011)
c^(3)(D) = m^(1)(D)g_1^(3)(D) + m^(2)(D)g_2^(3)(D)
         = (D^2 + 1)(D + 1) + (D + 1)(D)
         = D^3 + D^2 + D + 1 + D^2 + D = D^3 + 1
⇒ c^(3) = (1001)
Multiplexed output c = (101,100,010,011)
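The three outputs above can be verified with GF(2) polynomial arithmetic. A hedged sketch, assuming MSB-first coefficient lists and illustrative names:

```python
def polymul(a, b):
    """Multiply over GF(2); coefficient lists are MSB-first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

def polyadd(a, b):
    """Add (XOR) two equal-length coefficient lists."""
    return [x ^ y for x, y in zip(a, b)]

m1, m2 = [1, 0, 1], [0, 1, 1]        # m^(1) = (101), m^(2) = (011)
gens = {                             # (g_1^(j), g_2^(j)) as MSB-first pairs
    1: ([1, 1], [0, 1]),             # g_1^(1) = D+1, g_2^(1) = 1
    2: ([0, 1], [1, 0]),             # g_1^(2) = 1,   g_2^(2) = D
    3: ([1, 1], [1, 0]),             # g_1^(3) = D+1, g_2^(3) = D
}
c = {j: polyadd(polymul(m1, g1), polymul(m2, g2))
     for j, (g1, g2) in gens.items()}
print(c[1], c[2], c[3])
# → [1, 1, 0, 0] [0, 0, 1, 1] [1, 0, 0, 1]
# Multiplexing bit-by-bit gives c = (101, 100, 010, 011)
```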
State diagram
Consider convolutional code (2,1,2)
4 possible states
Each node has 2 incoming branches and 2 outgoing branches
A transition from one state to another on input 0 is represented by a solid line; on input 1, by a dashed line
The output is labeled on the transition line

state  binary description
a      00
b      10
c      01
d      11

[State diagram: transitions labeled input/output
a→a: 0/00   a→b: 1/11
b→c: 0/10   b→d: 1/01
c→a: 0/11   c→b: 1/00
d→c: 0/01   d→d: 1/10]
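The transition table above can be regenerated programmatically. A minimal sketch, assuming the (2,1,2) tap equations from the earlier example (helper names are illustrative):

```python
# States are (s1, s2) with a=00, b=10, c=01, d=11.
names = {(0, 0): 'a', (1, 0): 'b', (0, 1): 'c', (1, 1): 'd'}

def step(state, b):
    """One encoder step: returns (next_state, (v1, v2))."""
    s1, s2 = state
    output = (b ^ s1 ^ s2, b ^ s2)   # path 1 and path 2 outputs
    return (b, s1), output           # input bit shifts into the register

for state in [(0, 0), (1, 0), (0, 1), (1, 1)]:
    for b in (0, 1):
        nxt, (v1, v2) = step(state, b)
        print("%s --%d/%d%d--> %s" % (names[state], b, v1, v2, names[nxt]))
```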
Example
Message 11001
Start at state a
Walk through the state diagram in accordance with the message sequence

Input:   1    1    0    0    1    0    0
State:  a → b → d → c → a → b → c → a
Output: 11   01   01   11   11   10   11
Trellis(1/2)
[Trellis diagram for the (2,1,2) code: states a=00, b=10, c=01, d=11 drawn at levels j = 0, 1, 2, 3, 4, 5, ..., L-1, L, L+1, L+2; each branch is labeled input/output (0/00, 1/11, 0/10, 1/01, 0/11, 1/00, 0/01, 1/10)]
Trellis(2/2)
The trellis contains (L+K) levels, labeled j = 0, 1, ..., L, ..., L+K-1
The first (K-1) levels correspond to the encoder's departure from the initial state a
The last (K-1) levels correspond to the encoder's return to state a
For a level j in the range K-1 ≤ j ≤ L, all the states are reachable
Example
Message 11001
Input:  1  1  0  0  1  0  0
Output: 11 01 01 11 11 10 11
[Trellis path for the message, over levels j = 0 to 7: a → b → d → c → a → b → c → a]
Maximum Likelihood Decoding of Convolutional codes
m denotes a message vector
c denotes the corresponding code vector
r denotes the received vector
Given r, the decoder is required to make an estimate m̂ of the message vector, or equivalently to produce an estimate ĉ of the code vector
m̂ = m if and only if ĉ = c; otherwise a decoding error happens
The decoding rule is said to be optimum when the probability of decoding error is minimized
The maximum likelihood decoder or decision rule is described as follows:
Choose the estimate ĉ for which the log-likelihood function log p(r|c) is maximum
Maximum Likelihood Decoding of Convolutional codes
Binary symmetric channel: both c and r are binary sequences of length N
r differs from c in d positions, i.e., d is the Hamming distance between r and c

p(r|c) = Π_{i=1..N} p(r_i|c_i)  ⇒  log p(r|c) = Σ_{i=1..N} log p(r_i|c_i)

with p(r_i|c_i) = p if r_i ≠ c_i, and 1-p if r_i = c_i

⇒ log p(r|c) = d log p + (N-d) log(1-p)
             = d log(p/(1-p)) + N log(1-p)
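A quick numeric check of the last identity: for a binary symmetric channel with p < 1/2, log p(r|c) decreases as d grows, so maximizing the likelihood is the same as minimizing the Hamming distance. A minimal sketch with illustrative names:

```python
import math

def log_likelihood(d, N, p):
    """log p(r|c) for a BSC, as a function of the Hamming distance d."""
    return d * math.log(p / (1 - p)) + N * math.log(1 - p)

N, p = 14, 0.1
vals = [log_likelihood(d, N, p) for d in range(N + 1)]
# Strictly decreasing in d whenever p < 1/2:
assert all(vals[d] > vals[d + 1] for d in range(N))
```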
Maximum Likelihood Decoding of Convolutional codes
The decoding rule is restated as follows:
Choose the estimate ĉ that minimizes the Hamming distance between the received vector r and the transmitted vector c
The received vector r is compared with each possible code vector c, and the one closest to r is chosen as the correct transmitted code vector
The Viterbi algorithm
Choose the path in the trellis whose coded sequence differs from the received sequence in the fewest positions
The Viterbi algorithm
The algorithm operates by computing a metric for every possible path in the trellis
The metric is the Hamming distance between the coded sequence represented by the path and the received sequence
For each node, two paths enter the node; the one with the lower metric survives, and the other is discarded
The computation is repeated at every level j in the range K-1 ≤ j ≤ L
Number of survivors at each level ≤ 2^(K-1) = 4
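The procedure above can be sketched as a small Viterbi decoder for the (2,1,2) code of the earlier examples (tap equations assumed as before; names are illustrative). It keeps one survivor per state and, after the two flush bits, reads off the path ending in the all-zero state:

```python
def step(state, b):
    """One (2,1,2) encoder step: returns (next_state, (v1, v2))."""
    s1, s2 = state
    return (b, s1), (b ^ s1 ^ s2, b ^ s2)

def viterbi_212(received, msg_len):
    """received: list of (bit, bit) pairs of length msg_len + 2."""
    INF = float('inf')
    states = [(0, 0), (1, 0), (0, 1), (1, 1)]
    metric = {s: (0 if s == (0, 0) else INF) for s in states}
    path = {(0, 0): []}
    for t, r in enumerate(received):
        new_metric = {s: INF for s in states}
        new_path = {}
        inputs = (0, 1) if t < msg_len else (0,)   # flush bits are 0
        for s in states:
            if metric[s] == INF:
                continue
            for b in inputs:
                ns, out = step(s, b)
                m = metric[s] + (out[0] != r[0]) + (out[1] != r[1])
                if m < new_metric[ns]:             # keep the survivor
                    new_metric[ns], new_path[ns] = m, path[s] + [b]
        metric, path = new_metric, new_path
    return path[(0, 0)][:msg_len]

# Received = encoded (11,01,01,11,11,10,11) with one bit flipped (pair 3)
rx = [(1, 1), (0, 1), (1, 1), (1, 1), (1, 1), (1, 0), (1, 1)]
print(viterbi_212(rx, 5))   # → [1, 1, 0, 0, 1]
```

Since the code's free distance is 5, a single channel error is corrected and the original message 11001 is recovered.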