LDPC Codes
Paul H. Siegel
Electrical and Computer Engineering
University of California, San Diego
5/31/07
Copyright 2007 by Paul H. Siegel
Outline
[Figure: Shannon's communication system — information source → transmitter → signal → channel (with noise source) → received signal → receiver → destination; a message enters at the source and is reproduced at the destination.]
Code rate (definition):

    R = (# information bits) / (channel use) < C

[Figure: the binary erasure channel (BEC) with erasure probability p (inputs 0 and 1 are erased to "?" with probability p) and the binary symmetric channel (BSC) with crossover probability p, shown with their capacity curves over 0 ≤ p ≤ 1: C_BEC = 1 − p and C_BSC = 1 − H_2(p).]

Here

    H_2(p) = −p log_2 p − (1 − p) log_2 (1 − p)

is the binary entropy function.
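As a sanity check on these formulas, the binary entropy function and the standard BEC/BSC capacities are easy to compute numerically. A minimal sketch (the function names are mine):

```python
import math

def h2(p):
    """Binary entropy function H2(p) = -p*log2(p) - (1-p)*log2(1-p)."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def capacity_bsc(p):
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

def capacity_bec(p):
    """Capacity of the binary erasure channel with erasure probability p."""
    return 1.0 - p

print(h2(0.5))            # 1.0
print(capacity_bec(0.3))  # 0.7
```

For example, a BSC with crossover probability around 0.11 has capacity close to 1/2, which is why rate-1/2 codes are studied at that operating point.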
5/31/07 LDPC Codes 8
More Channels and Capacities
For the AWGN channel with signal power P and noise variance σ^2,

    C = (1/2) log_2 (1 + P / σ^2)

[Figure: conditional output densities f(y | 0) and f(y | 1) of the binary-input AWGN channel, plotted over −10 ≤ y ≤ 10.]
[Figure: coding system block diagram]

    Source → x = (x_1, x_2, ..., x_k) → Encoder → c = (c_1, c_2, ..., c_n) → Channel → y = (y_1, y_2, ..., y_n) → Decoder → Sink

Code rate: R = k/n
Shannon's Coding Theorems

A regrettable review:
Doob, J.L., Mathematical Reviews, MR0026286 (10,133e):
"The discussion is suggestive throughout, rather than mathematical, and it is not always clear that the author's mathematical intentions are honorable."
Problem
Randomness + large block length + optimal decoding =
COMPLEXITY!
Solution
Long, structured, pseudorandom codes
Practical, near-optimal decoding algorithms
Examples
Turbo codes (1993)
Low-density parity-check (LDPC) codes (1960, 1999)
State-of-the-art
Turbo codes and LDPC codes have brought Shannon limits
to within reach on a wide range of channels.
[Figure: LDPC code performance curves, from Trellis and Turbo Coding, Schlegel and Perez, IEEE Press, 2004.]
Linear Block Codes - Basics

Encoding rule for the (7,4) Hamming code (Venn diagram with data positions 1-4 and parity positions 5-7 in three overlapping circles):
1. Insert data bits in positions 1, 2, 3, 4.
2. Insert parity bits in positions 5, 6, 7 to ensure an even number of 1s in each circle.

There are 2^k = 16 codewords. The systematic encoder places the input bits in positions 1, 2, 3, 4; the parity bits are in positions 5, 6, 7.

Codeword table (data bits | parity bits):

    0000 000    1000 111
    0001 011    1001 100
    0010 110    1010 001
    0011 101    1011 010
    0100 101    1100 010
    0101 110    1101 001
    0110 011    1110 100
    0111 000    1111 111
Hamming Code Parity Checks

Parity-check matrix (columns indexed 1-7):

        [ 1 1 1 0 1 0 0 ]
    H = [ 1 0 1 1 0 1 0 ],    H c^T = 0
        [ 1 1 0 1 0 0 1 ]

Generator matrix G, with u = [u_1, u_2, u_3, u_4] and c = [c_1, c_2, c_3, c_4, c_5, c_6, c_7]:

        [ 1 0 0 0 1 1 1 ]
    G = [ 0 1 0 0 1 0 1 ]
        [ 0 0 1 0 1 1 0 ]
        [ 0 0 0 1 0 1 1 ]

    u G = c
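The matrices above can be checked by brute force: encode all 2^4 messages with G and verify H c^T = 0 for every resulting codeword. A minimal sketch in Python (GF(2) arithmetic done with mod-2 sums):

```python
# Generator and parity-check matrices of the (7,4) Hamming code.
G = [[1,0,0,0,1,1,1],
     [0,1,0,0,1,0,1],
     [0,0,1,0,1,1,0],
     [0,0,0,1,0,1,1]]
H = [[1,1,1,0,1,0,0],
     [1,0,1,1,0,1,0],
     [1,1,0,1,0,0,1]]

def encode(u):
    """c = uG over GF(2)."""
    return [sum(u[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def syndrome(c):
    """H c^T over GF(2); all-zero iff c is a codeword."""
    return [sum(h[j] * c[j] for j in range(7)) % 2 for h in H]

codewords = []
for m in range(16):
    u = [(m >> 3) & 1, (m >> 2) & 1, (m >> 1) & 1, m & 1]
    c = encode(u)
    assert syndrome(c) == [0, 0, 0]   # every codeword satisfies all checks
    codewords.append(c)

print(len(codewords))   # 16
print(codewords[8])     # u = 1000 -> [1, 0, 0, 0, 1, 1, 1]
```

The output matches the codeword table above, e.g. data 1000 produces parity 111.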
Parity-Check Equations

        [ 1 1 1 0 1 0 0 ]        c_1 + c_2 + c_3 + c_5 = 0
    H = [ 1 0 1 1 0 1 0 ]        c_1 + c_3 + c_4 + c_6 = 0
        [ 1 1 0 1 0 0 1 ]        c_1 + c_2 + c_4 + c_7 = 0

[Figure: Tanner graph of the Hamming code — variable nodes c_1, ..., c_7 (bit nodes, left) connected to three check nodes (constraint nodes, right), one per parity-check equation.]
[Figure: performance comparison of regular (3,6) LDPC codes (Gallager, 1963) and irregular LDPC codes (Richardson, Shokrollahi, and Urbanke, 2001); n = 10^6, R = 1/2.]
Degree Distributions

Node-perspective degree distributions:

    L(x) = Σ_{i≥1} L_i x^i        R(x) = Σ_{i≥2} R_i x^i

    L_i = number of variable nodes of degree i
    R_i = number of check nodes of degree i

Edge-perspective degree distributions:

    λ(x) = Σ_{i≥1} λ_i x^{i−1}        ρ(x) = Σ_{i≥2} ρ_i x^{i−1}

Conversion rule:

    λ(x) = L'(x) / L'(1)        ρ(x) = R'(x) / R'(1)
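The conversion rule is easy to implement: an edge lands on a degree-i node with probability proportional to i times the node fraction. A small sketch (the half-degree-2 / half-degree-3 example is hypothetical, not from the slides):

```python
def node_to_edge(node_fracs):
    """Convert a node-perspective degree distribution {degree: fraction}
    to edge-perspective coefficients {degree: lambda_i or rho_i}.

    Implements lambda(x) = L'(x) / L'(1): an edge attaches to a degree-i
    node with probability proportional to i * L_i.
    """
    total = sum(i * f for i, f in node_fracs.items())   # L'(1) = total edges
    return {i: i * f / total for i, f in node_fracs.items()}

# Hypothetical example: half the variable nodes have degree 2, half degree 3.
lam = node_to_edge({2: 0.5, 3: 0.5})
print(lam)  # {2: 0.4, 3: 0.6}
```

As expected, edges over-represent the higher-degree nodes: degree-3 nodes hold 60% of the edge endpoints even though they are only half the nodes.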
Design rate

    R(λ, ρ) = 1 − (Σ_i ρ_i / i) / (Σ_i λ_i / i) = 1 − (∫_0^1 ρ(x) dx) / (∫_0^1 λ(x) dx)
Under certain conditions related to codewords of weight n/2,
the design rate is achieved with high probability as n increases.
Example:

    λ(x) = 1/4 + (1/2) x + (1/4) x^2,    ρ(x) = x^3

    R(λ, ρ) = 1 − 3/7 = 4/7
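The design-rate computation for this example can be reproduced exactly with rational arithmetic; a small sketch:

```python
from fractions import Fraction

def design_rate(lam, rho):
    """Design rate R(lambda, rho) = 1 - (integral of rho) / (integral of lambda).

    lam and rho map edge-degree i to the coefficient of x^(i-1), so the
    integral of each term over [0, 1] is coefficient / i.
    """
    int_lam = sum(Fraction(c) / i for i, c in lam.items())
    int_rho = sum(Fraction(c) / i for i, c in rho.items())
    return 1 - int_rho / int_lam

# The example above: lambda(x) = 1/4 + (1/2)x + (1/4)x^2, rho(x) = x^3.
lam = {1: Fraction(1, 4), 2: Fraction(1, 2), 3: Fraction(1, 4)}
rho = {4: Fraction(1)}
print(design_rate(lam, rho))  # 4/7
```

Here ∫λ = 1/4 + 1/4 + 1/12 = 7/12 and ∫ρ = 1/4, so R = 1 − (1/4)/(7/12) = 1 − 3/7 = 4/7.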
[Figure: parity-check matrix H drawn in lower-triangular form — an (n−k) × (n−k) lower-triangular block with 1s on the diagonal alongside an (n−k) × k block — so the parity bits can be computed by back-substitution (complexity ~ O(n^2)).]
Another general encoding technique, based upon approximate lower triangulation, has complexity no more than O(n^2), with the constant coefficient small enough to allow practical encoding for block lengths on the order of n = 10^5.
MAP Decoding on the BEC

    x̂_i^MAP(y) = argmax_{b ∈ {0,1}} Σ_{x ∈ C: x_i = b} P_{X|Y}(x | y)

Assume that codewords are transmitted equiprobably, and let X(y) denote the set of codewords consistent with the unerased positions of y.
If every codeword x ∈ X(y) satisfies x_i = b, then set x̂_i^MAP(y) = b.
Otherwise, declare a bit erasure in position i.
Estimated codeword: [0 0 0 0 0 0 0]. Decoding successful!
This procedure would have worked no matter which codeword was transmitted.

[Figure: Tanner graph with all seven variable nodes resolved to 0.]
Initialization:
- Forward known variable node values along outgoing edges
- Accumulate forwarded values at check nodes and record the parity
- Delete known variable nodes and all outgoing edges

[Figure: Tanner graph for the received word y = (0, ?, 0, ?, 0, ?, 1).]
Peeling Decoder - Initialization

Accumulate parity; delete known variable nodes and edges.

[Figure: residual Tanner graph after initialization — positions 2, 4, 6 remain erased, and each check node holds the accumulated parity of its known neighbors.]
Decoding with the Tanner Graph: a Peeling Decoder

Decoding step:
- Select, if possible, a check node with one edge remaining; forward its parity, thereby determining the connected variable node
- Delete the check node and its outgoing edge
- Follow the initialization procedure at the newly known variable node

Termination:
- If the remaining graph is empty, the codeword is determined
- If the decoding step gets stuck, declare decoding failure
Peeling Decoder - Step 1

[Figure: graph state after the first peeling step — the word so far is (0, 0, 0, ?, 0, ?, 1).]
Peeling Decoder - Step 2

Find a degree-1 check node; forward its accumulated parity; determine the variable node value.

[Figure: graph state — the word so far is (0, 0, 0, 1, 0, ?, 1).]
Peeling Decoder - Step 2 (continued)

Accumulate parity; delete the newly known variable node and its edges.

[Figure: residual graph with only position 6 unresolved.]
Peeling Decoder - Step 3

Find the last degree-1 check node; forward its accumulated parity; determine the final variable node value. Decoding is complete:

    (0, 0, 0, 1, 0, 1, 1)
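The whole walkthrough can be reproduced with a few lines of code. A minimal peeling decoder for the BEC (the check lists encode the Hamming-code Tanner graph with 0-indexed positions; `None` marks an erasure):

```python
def peel(checks, y):
    """Peeling decoder for the BEC.

    checks: list of variable-index lists, one per check node.
    y: received word with None for erasures.
    Returns the decoded word, or None if the decoder gets stuck.
    """
    c = list(y)
    progress = True
    while progress and None in c:
        progress = False
        for chk in checks:
            erased = [i for i in chk if c[i] is None]
            if len(erased) == 1:                    # degree-1 check in residual graph
                i = erased[0]
                c[i] = sum(c[j] for j in chk if j != i) % 2   # forward parity
                progress = True
    return c if None not in c else None

# Parity checks of the (7,4) Hamming code (0-indexed variable positions).
checks = [[0, 1, 2, 4], [0, 2, 3, 5], [0, 1, 3, 6]]
y = [0, None, 0, None, 0, None, 1]   # received word from the walkthrough
print(peel(checks, y))               # [0, 0, 0, 1, 0, 1, 1]
```

The decoder resolves position 2 from the first check, then position 4, then position 6, exactly as in the step-by-step figures.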
Message-Passing Decoding

On the BEC, each message is either a known bit value or an erasure "?".

Variable-to-check rule (on edge e): if the channel value or any incoming check message u is known, send v = u; if all are erasures, send v = ?.

Check-to-variable rule (on edge e): if all incoming variable messages are known, send their mod-2 sum, e.g. u = v_1 + v_2 + v_3; if any incoming message is an erasure, send u = ?.

[Figure: message passing on the example Tanner graph, received word y = (0, ?, 0, ?, 0, ?, 1).]
Message-Passing Example - Round 1

[Figure: variable-to-check and check-to-variable messages after Round 1; one erased position is resolved.]

Message-Passing Example - Round 2

[Figure: variable-to-check and check-to-variable messages after Round 2; a second erased position is resolved.]

Message-Passing Example - Round 3

Decoding complete.

[Figure: variable-to-check and check-to-variable messages after Round 3; all erasures are resolved.]
Sub-optimality of the Message-Passing Decoder

Hamming code: decoding of 3 erasures.
- There are 7 patterns of 3 erasures that correspond to the support of a weight-3 codeword. These cannot be decoded by any decoder!
- The other 28 patterns of 3 erasures can be uniquely filled in by the optimal decoder.
- We just saw a pattern of 3 erasures that was corrected by the local (message-passing) decoder. Are there any that it cannot correct?
Test: y = (?, ?, ?, 0, 0, 1, 0).

There is a unique way to fill the erasures and get a codeword:

    (1, 1, 0, 0, 0, 1, 0)

The optimal decoder would find it. But every parity check contains at least 2 erasures, so local (message-passing) decoding will not work!

For contrast, consider the weight-3 codeword (0, 0, 0, 1, 0, 1, 1) with support set S = {4, 6, 7}: erasing exactly those positions cannot be resolved by any decoder.

Message-passing is suboptimal!
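These counts can be verified exhaustively: a linear code fails to resolve an erasure set E under any decoder exactly when E contains the support of a nonzero codeword, so for |E| = 3 the bad sets are precisely the weight-3 codeword supports. A brute-force sketch:

```python
from itertools import combinations, product

# Enumerate all 16 codewords of the (7,4) Hamming code from G.
G = [[1,0,0,0,1,1,1], [0,1,0,0,1,0,1], [0,0,1,0,1,1,0], [0,0,0,1,0,1,1]]
codewords = []
for u in product([0, 1], repeat=4):
    codewords.append(tuple(sum(u[r] * G[r][j] for r in range(4)) % 2
                           for j in range(7)))

# Supports of the weight-3 codewords.
weight3_supports = {frozenset(i for i, b in enumerate(c) if b)
                    for c in codewords if sum(c) == 3}

# Classify all C(7,3) = 35 three-erasure patterns.
undecodable = 0
for E in combinations(range(7), 3):
    if frozenset(E) in weight3_supports:   # E equals a weight-3 support
        undecodable += 1

print(undecodable)        # 7
print(35 - undecodable)   # 28
```

Note that the test pattern above, erasures in positions 1-3 (0-indexed {0, 1, 2}), is not a weight-3 support, so the MAP decoder can fill it even though message passing gets stuck.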
Bad news:
- Message-passing decoding on a Tanner graph is not always optimal...

Good news:
- For any code C, there is a parity-check matrix on whose Tanner graph message-passing is optimal, e.g., the matrix of codewords of the dual code C^⊥.

Bad news:
- That Tanner graph may be very dense, so even message-passing decoding is too complex.

Example:

        [ 1 1 0 1 0 0 0 ]
    H = [ 0 0 1 1 0 1 0 ]        R = 4/7, d_min = 2
        [ 0 0 0 1 1 0 1 ]
Good news:
If a Tanner graph is cycle-free, the message-
passing decoder is optimal!
Bad news:
Binary linear codes with cycle-free Tanner graphs
are necessarily weak...
Good news:
The Tanner graph of a long LDPC code behaves
almost like a cycle-free graph!
Concentration
- With high probability, the performance of ℓ rounds of MP decoding on a randomly selected (n, λ, ρ) code converges to the ensemble average performance as the length n → ∞.

Convergence to cycle-free performance
- The average performance of ℓ rounds of MP decoding on the (n, λ, ρ) ensemble converges to the performance on a graph with no cycles of length ≤ 2ℓ as the length n → ∞.

Threshold calculation
- There is a threshold probability p*(λ, ρ) such that, for channel erasure probability p < p*(λ, ρ), the cycle-free error probability approaches 0 as the number of iterations ℓ → ∞.
Density Evolution - 1

Consider a check node of degree d with independent incoming messages. The outgoing message u is an erasure unless all d−1 incoming values are known:

    Pr(u = ? | degree d) = 1 − (1 − p_{ℓ−1})^{d−1}

The probability that an edge connects to a check node of degree d is ρ_d, so

    Pr(u = ?) = Σ_{d=1}^{d_c} ρ_d [1 − (1 − p_{ℓ−1})^{d−1}] = 1 − ρ(1 − p_{ℓ−1})
Density Evolution - 2

Consider a variable node of degree d with independent incoming messages. The outgoing message v on edge e is an erasure only if the channel message u_0 and all d−1 incoming check messages u_1, ..., u_{d−1} are erasures:

    Pr(v = ? | degree d) = Pr(u_0 = ?) · Pr(u_i = ?, for all i = 1, ..., d−1)
                         = p_0 [1 − ρ(1 − p_{ℓ−1})]^{d−1}

The probability that edge e connects to a variable node of degree d is λ_d, so

    Pr(v = ?) = Σ_{d=1}^{d_v} λ_d p_0 [1 − ρ(1 − p_{ℓ−1})]^{d−1} = p_0 λ(1 − ρ(1 − p_{ℓ−1}))

i.e.,

    p_ℓ = p_0 λ(1 − ρ(1 − p_{ℓ−1}))
Threshold Property

    p_ℓ = p_0 λ(1 − ρ(1 − p_{ℓ−1}))

If p_0 < p*(λ, ρ), then

    lim_{ℓ→∞} p_ℓ = 0

for sufficiently large block length n.
Example: (j,k) = (3,4)

    f(x, p) = p (1 − (1 − x)^3)^2

The recursion is x ← f(x, p); the threshold is p* ≈ 0.6474.

[Figure: f(x, p) versus x for p = 0.5, 0.6, 0.6474, and 0.7; for p < p* the curve lies below the line y = x, so the iteration converges to 0.]
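Both views of the threshold are easy to check numerically: the infimum characterization p* = inf_x x / λ(1 − ρ(1 − x)) (a standard formula for the BEC, consistent with the condition that f(x, p) stay below x), and the behavior of the recursion just below and above p*. A sketch for the (3,4) example:

```python
# (3,4)-regular LDPC on the BEC: lambda(x) = x^2, rho(x) = x^3.
def g(x):
    """x / lambda(1 - rho(1 - x)); its infimum over (0,1] is p*."""
    return x / (1 - (1 - x) ** 3) ** 2

pstar = min(g(i / 100000) for i in range(1, 100001))   # grid search on (0,1]

def de_limit(p, iters=2000):
    """Run the density-evolution recursion x <- p(1-(1-x)^3)^2 from x = p."""
    x = p
    for _ in range(iters):
        x = p * (1 - (1 - x) ** 3) ** 2
    return x

print(round(pstar, 4))   # ~0.6474, matching the slide
# Below threshold the erasure fraction dies out; above it, it does not.
print(de_limit(0.6) < 1e-9, de_limit(0.7) > 0.5)
```

At p = 0.6 < p* the iterates collapse to 0, while at p = 0.7 > p* they stall at a nonzero fixed point (around 0.63).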
Since

    p λ(1 − ρ(1 − x)) = p Σ_{i≥2} λ_i (1 − ρ(1 − x))^{i−1},

successful decoding requires p λ(1 − ρ(1 − x)) < x for all x in (0, 1], giving

    p*(λ, ρ) = inf_{x ∈ (0,1]} x / λ(1 − ρ(1 − x)).

Graphical determination of the threshold: write v_p(x) = p λ(x) and c(x) = 1 − ρ(1 − x). Convergence requires

    c(x) < v_p^{−1}(x), for all x ∈ (0, 1)

Graphically, this says that the curve for c(x) must lie below the curve for v_p^{−1}(x) for all p < p*.

Example: λ(x) = x^2, ρ(x) = x^3

    v_p(x) = p λ(x) = p x^2        c(x) = 1 − ρ(1 − x) = 1 − (1 − x)^3

    v_p^{−1}(x) = (x / p)^{1/2}

[Figure: c(x) = 1 − (1 − x)^3 plotted against v_p^{−1}(x) = (x/p)^{1/2} for various values of the initial erasure probability p (0.5, 0.6, 0.7, 0.8); the curves become tangent at p* ≈ 0.6474.]
[Figure: graphical iteration of density evolution for the (3,4) example with initial erasure probability p_0 = 0.6. Horizontal axis: p, the fraction of erasures from variable nodes; vertical axis: q, the fraction of erasures from check nodes. Successive iterates: q_1 = 0.936, p_1 ≈ 0.5257; q_2 ≈ 0.8933, p_2 ≈ 0.4788; q_3 ≈ 0.8584, p_3 ≈ 0.4421; the staircase continues down to 0 since p_0 = 0.6 < p* ≈ 0.6474.]
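The staircase values in the figure can be reproduced by alternating the check-node and variable-node updates, starting from p_0 = 0.6:

```python
# Graphical iteration for the (3,4) example: the check-node update gives
# q = 1 - (1 - p)^3, and the variable-node update gives p = p0 * q^2
# (since lambda(x) = x^2).
p0 = 0.6
p = p0
trace = []
for _ in range(3):
    q = 1 - (1 - p) ** 3     # erasure fraction out of the check nodes
    p = p0 * q ** 2          # erasure fraction out of the variable nodes
    trace.append((q, p))

for q, p in trace:
    print(round(q, 4), round(p, 4))
# Matches the figure: (0.936, 0.5257), (0.8933, 0.4788), (0.8584, 0.4421)
```

Each pass shrinks p, confirming that p_0 = 0.6 lies below the threshold.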
Bit-wise MAP Decoding

    x̂_i^MAP(y) = argmax_{x_i ∈ {±1}} Σ_{~x_i} P_{X|Y}(x | y)
               = argmax_{x_i ∈ {±1}} Σ_{~x_i} P_{Y|X}(y | x) P_X(x)
               = argmax_{x_i ∈ {±1}} Σ_{~x_i} Π_{j=1}^{n} P_{Y_j|X_j}(y_j | x_j) · f_C(x)

where Σ_{~x_i} denotes the summation over all coordinates of x except x_i, and f_C(x) is the indicator function for C.
Belief Propagation

For codes with cycle-free Tanner graphs, there is a message-passing approach to bit-wise MAP decoding. The messages are essentially conditional bit distributions, denoted u = [u(1), u(−1)]. The initial message presented by the channel to a variable node is u_ch = [p(y | x = 1), p(y | x = −1)].

Variable-to-check (variable node of degree d; u_0 = u_ch from the channel, u_1, ..., u_{d−1} from the other check nodes):

    v(b) = u_ch(b) Π_{k=1}^{d−1} u_k(b),    for b ∈ {±1}

Check-to-variable (check node of degree d; v_1, ..., v_{d−1} from the other variable nodes):

    u(b) = Σ_{x_1, ..., x_{d−1}} f(b, x_1, ..., x_{d−1}) Π_{k=1}^{d−1} v_k(x_k)

where f is the parity-check indicator function.
Interpretation of the variable-to-check rule

    v(b) = u_ch(b) Π_{k=1}^{d−1} u_k(b),    for b ∈ {±1}

Each incoming message is an estimate of the bit's conditional distribution. A reasonable estimate to send to check node 0, based upon the other estimates, is the product of those estimates (suitably normalized):

    P(x = b) = P_ch(x = b) Π_{k=1}^{d−1} P_k(x = b)
Log-likelihood ratio form. Note that

    u(1) = e^{L(u)} / (1 + e^{L(u)})    and    u(−1) = 1 / (1 + e^{L(u)})

Variable-to-check (sum of incoming LLRs, including the channel LLR L(u_0)):

    L(v) = Σ_{k=0}^{d−1} L(u_k)

Check-to-variable ("tanh rule"):

    L(u) = 2 tanh^{−1} ( Π_{k=1}^{d−1} tanh( L(v_k) / 2 ) )
For a degree-3 check node (incoming messages v_1, v_2, outgoing message u):

    (e^{L(u)} − 1) / (e^{L(u)} + 1) = [(e^{L(v_1)} − 1) / (e^{L(v_1)} + 1)] · [(e^{L(v_2)} − 1) / (e^{L(v_2)} + 1)]

Noting that

    (e^{L(a)} − 1) / (e^{L(a)} + 1) = (e^{L(a)/2} − e^{−L(a)/2}) / (e^{L(a)/2} + e^{−L(a)/2}) = tanh( L(a)/2 ),

we conclude

    tanh( L(u)/2 ) = tanh( L(v_1)/2 ) · tanh( L(v_2)/2 ).
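The tanh rule can be verified numerically by comparing it against the direct probability-domain computation of the parity of two bits; a small sketch (the function names are mine):

```python
import math

def llr_to_prob(L):
    """P(bit = +1) from its log-likelihood ratio L."""
    return math.exp(L) / (1 + math.exp(L))

def check_llr_direct(L1, L2):
    """LLR of the XOR of two bits, computed from probabilities:
    the parity is +1 iff both bits agree."""
    p1, p2 = llr_to_prob(L1), llr_to_prob(L2)
    p_even = p1 * p2 + (1 - p1) * (1 - p2)
    return math.log(p_even / (1 - p_even))

def check_llr_tanh(L1, L2):
    """The tanh rule for a degree-3 check node."""
    return 2 * math.atanh(math.tanh(L1 / 2) * math.tanh(L2 / 2))

for L1, L2 in [(0.5, -1.2), (2.0, 3.0), (-0.7, 0.1)]:
    assert abs(check_llr_direct(L1, L2) - check_llr_tanh(L1, L2)) < 1e-9
print("tanh rule verified")
```

The two computations agree because tanh(L/2) = 2P(+1) − 1, so the product of "soft signs" multiplies exactly as the derivation above shows.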
Threshold calculation

There is a threshold channel parameter p*(λ, ρ) such that, for any better channel parameter p, the cycle-free error probability approaches 0 as the number of iterations ℓ → ∞.

Density evolution for general channels tracks message densities rather than single erasure probabilities:

    P_ℓ = P_0 ⊛ λ( Γ^{−1}( ρ( Γ( P_{ℓ−1} ) ) ) )

where

    λ(P) = Σ_{i≥2} λ_i P^{⊛(i−1)}    and    ρ(P) = Σ_{i≥2} ρ_i P^{⊛(i−1)},

⊛ denotes convolution of densities, and Γ is the change of variables applied at the check nodes.

Threshold

The channel parameter is below threshold exactly when the error probability vanishes:

    lim_{ℓ→∞} ∫_{−∞}^{0} P_ℓ(z) dz = 0.
Thresholds for the binary-input AWGN channel, BiAWGN(σ):

    (j,k)    R        σ*      σ_Sh
    (3,4)    0.25     1.26    1.549
    (4,6)    0.333    1.01    1.295
    (3,5)    0.4      1.0     1.148
    (3,6)    0.5      0.88    0.979
    (4,8)    0.5      0.83    0.979

Optimized irregular codes, BiAWGN, rate R = 1/2 (σ_Sh = 0.979); maximum variable node degree versus threshold:

    d_max    σ*
    15       0.9622
    20       0.9646
    30       0.9690
    40       0.9718
[Figure: BER of LDPC codes on the AWGN channel, R = 1/2, n = 10^3, 10^4, 10^5, 10^6 (Richardson, Shokrollahi, and Urbanke, 2001).]

[Figure: optimized irregular LDPC code performing 0.0045 dB from the Shannon limit! (Chung, et al., 2001).]

[Figure: R = 1/2, (3,6)-regular LDPC code (Hou, et al., 2001).]

[Figure: rate-7/8, regular j = 3, n = 495 LDPC code (Kurkoski, et al., 2002).]

[Figure: R = 0.7, n = 10^6 LDPC code (Varnica and Kavcic, 2003).]