
Cooperative Techniques in Networks

Abdellatif Zaidi
Universite Paris-Est Marne La Vallee, France
abdellatif.zaidi@univ-mlv.fr
Spring School on 5G Communications
Hammamet, March 2014
Abdellatif Zaidi Cooperative Techniques in Networks 1 / 90
Tutorial Goals
Review and discuss some important aspects of cooperative communications in
networks
Provide intuition for basic concepts
Connect with recent results
Encourage research activity
Contribute to networking practice
Recurring Themes and Take-Aways
1 Many approaches, benefits, and challenges to utilizing relaying and cooperative
communications in networks.
2 Relaying also includes, or should also include, multihop (store-and-forward routing)
and network coding.
3 The capacity of relay systems is difficult to analyze, but the theory is surprisingly
flexible and diverse.
4 Although generalizations to networks with many nodes are in general not easy,
there are schemes which scale appropriately.
5 In networks with interference, relaying can help, not only by adding power/energy
spatially, but also by allowing distributed interference cancellation; in general
this boosts network capacity.
6 To cope with network interference efficiently, classic relaying schemes as such do
not in general suffice, and need to be combined carefully with other appropriate
techniques.
Caveats
1 I assume familiarity with: entropy, mutual information, capacity of discrete
memoryless and additive white Gaussian noise channels, rate-distortion theory
2 References are not provided in the main slides; only selected references are given at
the end. These, along with the references therein, point to many other recent
papers on relaying and cooperative communications in networks.
3 Time constraints limit the scope to less than originally planned.
4 I may go fast. Please feel free to stop me, whenever you want to.
Tutorial Outline
Part 1: Basics on Cooperation
Part 2: Cooperation in Presence of Interference
Part 3: Interaction and Computation
Basics on Cooperation

Part I : Basics on Cooperation


Basics on Cooperation
Outline: Part 1
1 Introduction and Models
2 Protocols
Amplify-and-Forward (AF)
Decode-and-Forward (DF)
Compress-and-Forward (CF)
3 Information Rates
Basics on Cooperation Introduction and Models
Wireless Network
A communication network has devices and
channels
Network purpose: enable message exchange
between nodes
Main features of wireless networks:
- Fading: electromagnetic scattering,
absorption, node mobility, humidity
- Broadcasting: nodes overhear wireless
transmissions (creates interference)
[Figure: network of N nodes; node u has message W_u and input/output pair (X_u, Y_u), connected through the channel p(y_1, ..., y_N | x_1, ..., x_N)]
Basics on Cooperation Introduction and Models
Fast and Slow Fading
Space-time models for h_{uv}:
- Deterministic: electromagnetic wave propagation equations
- Random: {h_{uv,i}}_{i=1}^n is a realization of an integer-time stochastic
process {H_{uv,i}}_{i=1}^n (the random model admits uncertainty and is simpler)
Marginal distributions of the H_{uv,i}:
- Assume the H_{uv,i}, i = 1, ..., n, have the same marginal distribution H_{uv}
during a given communication session
- No fading: H_{uv} is a known constant
- Rayleigh fading: H_{uv} is complex, Gaussian, 0-mean, unit var.
Temporal correlation: two extremes
- Fast fading: the h_{uv,i} are independent realizations of H_{uv}
- Slow fading: H_{uv,i} = H_{uv} for all i
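As a quick numerical illustration of the two fading extremes (a sketch, assuming Rayleigh marginals with unit variance as above; `n` and the seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # channel uses in one session

# Fast fading: the H_uv,i are i.i.d. complex Gaussian, zero mean, unit variance
fast = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)

# Slow fading: a single realization H_uv, held fixed for all i
h = (rng.standard_normal() + 1j * rng.standard_normal()) / np.sqrt(2)
slow = np.full(n, h)

avg_power = np.mean(np.abs(fast) ** 2)  # should be close to E[|H_uv|^2] = 1
print(avg_power)
```

Both processes have the same marginal; only the temporal correlation differs.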
Basics on Cooperation Introduction and Models
Discrete Memoryless Network Model
There are M source messages W_m, m = 1, 2, ..., M.
Message W_m is estimated as message \hat{W}_m(u) at certain nodes u
[Figure: network of N nodes; node u has message W_u and pair (X_u, Y_u), channel p(y_1, ..., y_N | x_1, ..., x_N)]
Source model:
- Messages are statistically independent
- Sources are not bursty
Device model:
- Node u has one input variable X_u and one output variable Y_u
- Causality: X_{u,i} = f_{u,i}(local messages, Y_u^{i-1}), i = 1, ..., n
- Cost constraint example: E[f(X_{u,1}, ..., X_{u,n})] <= P_u
Basics on Cooperation Introduction and Models
Discrete Memoryless Network Model (Cont.)
[Figure: network of N nodes; node u has message W_u and pair (X_u, Y_u), channel p(y_1, ..., y_N | x_1, ..., x_N)]
Channel model:
- A network clock governs operations: node u transmits X_{u,i} between
clock tick (i-1) and tick i, and receives Y_{u,i} at tick i
- Memoryless: Y_{u,i} is generated by the X_{u,i}, all u, via the channel
P_{Y_1 Y_2 Y_3 ... | X_1 X_2 X_3 ...}(.)
Capacity region: closure of the set of all (R_1, R_2, ..., R_M) for which
Pr[ ∪_{m,u} { \hat{W}_m(u) ≠ W_m } ]
can be made close to zero for large n
Basics on Cooperation Introduction and Models
Node Coding/Broadcasting
Traditional approach:
- Channels treated as point-to-point links
- Data packets traverse paths (sequences of nodes)
Other possibilities:
Broadcasting: nodes overhear wireless transmissions
Node coding: nodes process
- reliable message or packet bits (network coding)
- reliable or unreliable symbols (relaying/cooperation)
These concepts already appear in 3-node networks
Basics on Cooperation Protocols
Building Block
[Figure: three-node relay channel: Node 1 (source, message W), Node 2 (relay), Node 3 (destination, estimate \hat{W})]
Direct transmission from Node 1 to Node 3
Multihop transmission through Node 2
Amplify-and-Forward (AF)
Decode-and-Forward (DF)
Compress-and-Forward (CF)
Capacity still unknown
Multiaccess problem: from Nodes 1 and 2 to Node 3
Broadcast problem: from Node 1 to Nodes 2 and 3
Feedback problem: output at Node 2 is a form of feedback
Superposition coding, binning, feedback techniques, etc.
Basics on Cooperation Protocols
Direct Transmission
[Figure: Encoder W -> X_1^n, channel P_{Y_3|X_1,X_2}(y_3|x_1,x_2) with X_2^n = b^n, Decoder Y_3^n -> \hat{W}]
Relay does not participate, i.e., X_{2,i} = b (often 0) for all i
A standard random coding argument shows that rates
R < R_dir := max_{P_{X_1|X_2}(.|b), b} I(X_1; Y_3 | X_2 = b)
are achievable
R_dir is in fact the capacity if the relay channel is reversely degraded, i.e.,
P_{Y_2 Y_3 | X_1 X_2}(.) = P_{Y_3 | X_1 X_2}(.) P_{Y_2 | Y_3}(.)
(Think: Y_2 is a noisy version of Y_3)
Basics on Cooperation Protocols
Multihop Transmission
Node 1 transmits to Node 2, and Node 2 transmits to Node 3
Motivated by a cascade of two channels, i.e.,
P_{Y_2^n Y_3^n | X_1^n X_2^n}(.) = P_{Y_2^n | X_1^n}(.) P_{Y_3^n | X_2^n}(.)
Question: what should Node 2 transmit?
Non-Regenerative / Amplify-and-Forward (AF)
- Node 2 sets X_{2,i} = Y_{2,i-k} for some k >= 1
Compress-and-Forward (CF) / Estimate-Forward (EF)
- Node 2 conveys a quantized version of Y_2^n to Node 3
Regenerative / Decode-and-Forward (DF)
- Node 2 decodes W and re-encodes it into a codeword X_2^n
Transmissions usually occur in a pipeline of two (or more) blocks (potentially of
varying sizes)
Basics on Cooperation Protocols
Block Pipeline
[Figure: block pipeline over B blocks. Source: x_1(w_1), x_1(w_2), ..., x_1(w_{B-1}) in Blocks 1 to B-1; Relay: transmits f_2^{(i)}({y_2[j]}_{j=1}^{i-1}) in Block i, having received y_2[1], ..., y_2[B]; Destination: receives y_3[1], ..., y_3[B]]
B blocks, each of length n channel uses. Message W = (w_1, w_2, ..., w_{B-1})
Block 1 fills the pipeline; block B empties the pipeline
Blocks 2 through B can blend broadcast and multiaccess
Extremes:
- n = 1, B large: like an intersymbol interference (ISI) channel
- n large, B = 2: no interblock interference, half-duplex
- n, B large: effective rate (B-1)nR / (nB) = ((B-1)/B) R -> R
Memory within and among input blocks is allowed through f_2^{(i)}({y_2[j]}_{j=1}^{i-1})
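The pipeline fill/empty bookkeeping above is simple enough to state in code (a sketch):

```python
def effective_rate(R, B):
    """Per-use rate of a B-block pipeline: (B-1) message blocks sent over B blocks."""
    return (B - 1) / B * R

print(effective_rate(1.0, 2))    # 0.5  (half-duplex style: half the rate is lost)
print(effective_rate(1.0, 100))  # 0.99 (the loss vanishes as B grows)
```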
Basics on Cooperation Protocols : Amplify-and-Forward (AF)
Amplify-and-Forward (AF)
[Figure: block pipeline; in Block i the relay forwards (a scaled version of) y_2[i-1]]
Choose f_2^{(i)}(.) to be a linear function:
- Discrete channels: often x_2[i] = f_2^{(i)}({y_2[j]}_{j=1}^{i-1}) := y_2[i-1];
requires Y_2 ⊆ X_2
- Continuous channels: often x_2[i] = f_2^{(i)}({y_2[j]}_{j=1}^{i-1}) := a y_2[i-1],
with the gain a chosen subject to a power constraint
Basics on Cooperation Protocols : Amplify-and-Forward (AF)
Multihop AF, B = 2
[Figure: Block 1: source sends x_1(w), relay receives y_2[1]; Block 2: relay forwards y_2[1], destination receives y_3[2]]
If Node 3 decodes using only Block 2, rate R is achievable if
R < R_maf := max_{P_{X_{1,1}|X_{2,1}}(.|b), a, b} (1/2) I(X_{1,1}; Y_{3,2} | X_{2,1} = b, X_{1,2} = a)
I(X_{1,1}; Y_{3,2} | X_{2,1} = b, X_{1,2} = a) is computed for the effective channel
P_{Y_{3,2} | X_{1,1} X_{1,2} X_{2,1}}(.|., a, b) in which X_{2,2} = Y_{2,1}
Also known as non-regenerative repeating
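For a scalar AWGN instance this rate has a well-known closed form in terms of the per-hop receive SNRs (a sketch; the harmonic-like combination reflects the relay amplifying its own noise):

```python
import numpy as np

def multihop_af_rate(snr12, snr23):
    """Two-block multihop AF over AWGN: the relay scales Y_2 to meet its power
    constraint, so the end-to-end SNR is snr12*snr23/(snr12 + snr23 + 1).
    The factor 1/2 accounts for the two blocks."""
    snr_end = snr12 * snr23 / (snr12 + snr23 + 1)
    return 0.5 * np.log2(1 + snr_end)

r = multihop_af_rate(10.0, 100.0)
print(r)  # the end-to-end SNR is below both hop SNRs
```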
Basics on Cooperation Protocols : Amplify-and-Forward (AF)
Diversity AF, B = 2
[Figure: same two-block setting; the destination now also uses y_3[1]]
If Node 3 decodes using both Blocks 1 and 2, then R is achievable for
R < R_daf := max_{P_{X_{1,1}|X_{2,1}}(.|b), a, b} (1/2) I(X_{1,1}; Y_{3,1} Y_{3,2} | X_{2,1} = b, X_{1,2} = a)
Similar to repetition coding, except that X_{2,2} = Y_{2,1} is a corrupted version of X_{1,1}
Basics on Cooperation Protocols : Amplify-and-Forward (AF)
Non-Orthogonal AF (NAF), B = 2
[Figure: source sends x_1(w_1) in Block 1 and new information x_1(w_2) in Block 2; relay forwards y_2[1]]
If Node 1 sends new information in Block 2, and Node 3 decodes using both
Blocks 1 and 2, then R is achievable for
R < R_ndaf := max (1/2) I(X_{1,1} X_{1,2}; Y_{3,1} Y_{3,2} | X_{2,1} = b)
with the max over P_{X_{1,1}|X_{2,1}}(.|b) P_{X_{1,2}|X_{2,1}}(.|b) and b
A combination of DAF and direct transmission from Node 1 to Node 3
Basics on Cooperation Protocols : Amplify-and-Forward (AF)
Intersymbol Interference AF (IAF)
A pipeline with n = 1 and B -> ∞ creates an effective intersymbol interference (ISI)
channel
Input memory is important in this case (water-filling)
Additional improvements with bursty AF, mainly at low SNR
Basics on Cooperation Protocols : Amplify-and-Forward (AF)
AF Summary
Schemes discussed so far are all special cases of Block Pipeline AF.
Generally
R_maf < R_daf < R_ndaf < R_iaf
but coding and decoding grow increasingly complex
MAF, DAF, and NDAF with B = 2 are useful for half-duplex systems
IAF with B -> ∞ is useful for full-duplex systems
Basics on Cooperation Protocols : Decode-and-Forward (DF)
Decode-and-Forward (DF)
[Figure: block pipeline. Source: x_1(1, w_1), x_1(w_1, w_2), ..., x_1(w_{B-2}, w_{B-1}), x_1(w_{B-1}, 1); Relay: x_2(1), x_2(\hat{w}_1), ..., x_2(\hat{w}_{B-2}), x_2(\hat{w}_{B-1}); Destination: y_3[1], ..., y_3[B]]
f_2^{(i)}(.): in block i, the relay uses y_2[i-1] to decode message w_{i-1}, and re-encodes it
into x_2(\hat{w}_{i-1})
Joint typicality decoding: look for \hat{w}_{i-1} s.t. (x_1(\hat{w}_{i-2}, \hat{w}_{i-1}), y_2[i-1]) is jointly
typical given x_2(\hat{w}_{i-2})
Multi-user codebooks designed jointly:
- x_1[i] := x_1(w_{i-1}, w_i) and x_2[i] := x_2(\hat{w}_{i-1}) both depend upon w_{i-1}
- x_1[i] := x_1(w_{i-1}, w_i) also depends upon w_i
- Joint distributions P_{X_2}(.) P_{X_1|X_2}(.)
Basics on Cooperation Protocols : Decode-and-Forward (DF)
Multihop DF, B = 2
[Figure: Block 1 (fractional length q): source sends x_1(w); Block 2: relay sends x_2(\hat{w})]
Let q be the fractional length of Block 1 (Block 2 has fractional length q̄ := 1 - q)
At the end of Block 1, Node 2 decodes message w reliably if
R < q I(X_1; Y_2 | X_2 = b)
and then \hat{w} = w with high probability
If Node 3 decodes using only Block 2, R is achievable if
R < q̄ I(X_2; Y_3 | X_1 = a)
Basics on Cooperation Protocols : Decode-and-Forward (DF)
Multihop DF (MDF), B = 2
[Figure: same two-block multihop setting]
Thus, a rate R is achievable if
R < R_mdf := max_{0 <= q <= 1} min{ q max_{P_{X_1|X_2}(.|b), b} I(X_1; Y_2 | X_2 = b),
                                    q̄ max_{P_{X_2|X_1}(.|a), a} I(X_2; Y_3 | X_1 = a) }
Routing: time-sharing between direct transmission from Node 1 to Node 2 and direct
transmission from Node 2 to Node 3
Also known as regenerative repeating
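The time-sharing optimization over q is easy to evaluate numerically (a sketch with illustrative hop capacities; the optimum equalizes the two terms, giving C12*C23/(C12+C23)):

```python
import numpy as np

def mdf_rate(c12, c23, grid=10_001):
    """Multihop DF: max over the time-sharing fraction q of
    min(q*C12, (1-q)*C23), with C12, C23 the two hop capacities."""
    q = np.linspace(0.0, 1.0, grid)
    return float(np.max(np.minimum(q * c12, (1 - q) * c23)))

r = mdf_rate(2.0, 3.0)
print(r)  # close to 2*3/(2+3) = 1.2
```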
Basics on Cooperation Protocols : Decode-and-Forward (DF)
Diversity DF (DDF), B = 2
[Figure: same two-block setting; the destination uses both blocks]
If Node 3 decodes using both Blocks 1 and 2, then R is achievable for
R < R_ddf := max min{ q I(X_{1,1}; Y_{2,1} | X_{2,1} = b),
                      q I(X_{1,1}; Y_{3,1} | X_{2,1} = b) + q̄ I(X_{2,2}; Y_{3,2} | X_{1,2} = a) }
with the max over P_{X_{1,1}|X_{2,1}}(.|b) P_{X_{2,2}|X_{1,2}}(.|a), a, b, and 0 <= q <= 1
Basics on Cooperation Protocols : Decode-and-Forward (DF)
Non-Orthogonal DDF (NDDF), B = 2
[Figure: source sends x_1(w_1) in Block 1 and new information x_1(w_2) in Block 2; relay sends x_2(\hat{w}_1) in Block 2]
If Node 1 sends new information in Block 2, and Node 3 decodes using both
Blocks 1 and 2, then R is achievable for
R < R_{nddf,2} := max min{ q I(X_{1,1}; Y_{2,1} | X_{2,1} = b),
                           q I(X_{1,1}; Y_{3,1} | X_{2,1} = b) + q̄ I(X_{1,2} X_{2,2}; Y_{3,2}) }
with the max over P_{X_{1,1}|X_{2,1}}(.|b) P_{X_{1,2} X_{2,2}}(.), a, b, and 0 <= q <= 1
Basics on Cooperation Protocols : Decode-and-Forward (DF)
Non-Orthogonal DDF (NDDF), B
[Figure: full block pipeline as for DF]
All blocks of length n
Three encoding and decoding algorithms, differing in complexity and delay
requirements:
- Regular enc., sliding-window dec.: R < I(X_2; Y_3) + I(X_1; Y_3 | X_2)
- Regular enc., backward dec.: R < I(X_1, X_2; Y_3)
- Irregular enc., successive dec.: binning at the encoding (Cover, El Gamal)
Basics on Cooperation Protocols : Decode-and-Forward (DF)
Non-Orthogonal DDF (NDDF), B
[Figure: full block pipeline as for DF]
All blocks of length n
Rate R is achievable if
R < R_{nddf,∞} := max min{ I(X_1; Y_2 | X_2), I(X_1 X_2; Y_3) }
with the max over P_{X_1 X_2}(.)
R_{nddf,∞} is in fact the capacity if the relay channel is physically degraded, i.e.,
P_{Y_2 Y_3 | X_1 X_2}(.) = P_{Y_2 | X_1 X_2}(.) P_{Y_3 | Y_2 X_2}(.)
Basics on Cooperation Protocols : Decode-and-Forward (DF)
DF Summary
Schemes discussed so far are all special cases of Block Pipeline DF.
Generally
R_mdf < R_ddf < R_{nddf,2} < R_{nddf,∞}
but coding and decoding grow increasingly complex
MDF, DDF, and NDDF with B = 2 are useful for half-duplex systems
NDDF with B -> ∞ is useful for full-duplex systems
Basics on Cooperation Protocols : Compress-and-Forward (CF)
Multihop CF, B = 2
[Figure: Block 1: source sends x_1(w); Block 2: relay sends x_2(s_1), where s_1 is s.t. \hat{y}_2[s_1] := \hat{y}_2[1]]
Basic idea: in block i, the relay quantizes (scalar or vector) y_2[i-1] and
communicates it to the destination.
Details:
- Fix distributions P_{X_1|X_2}(.|b) and P_{\hat{Y}_2 | Y_2 X_2}, to be optimized later
- Generate 2^{n R̂} quantization codewords \hat{y}_2[s] independently and i.i.d.
according to the marginal P_{\hat{Y}_2|X_2}(\hat{y}_2 | b), s = 1, 2, ..., 2^{n R̂}.
Basics on Cooperation Protocols : Compress-and-Forward (CF)
Multihop CF
Details (cont.):
- Upon receiving y_2[1], the relay quantizes it by finding a jointly typical \hat{y}_2[1] in the
quantization codebook. This is likely for n large if
R̂ > I(Y_2; \hat{Y}_2 | X_2 = b)
- The destination first utilizes y_3[2] to decode the compression index s_1. This can
be done with vanishing error probability if
R̂ < max_{P_{X_2|X_1}(.|a), a} I(X_2; Y_3 | X_1 = a)
- Then, the destination utilizes \hat{y}_2[s_1] := \hat{y}_2[1] to decode message w. This can be
done with vanishing error probability if
R < R_mcf := max I(X_1; \hat{Y}_2 | X_2 = b)
with the max over P_{X_1|X_2}(.|b), P_{\hat{Y}_2 | Y_2 X_2}(.|., b), a and b such that
I(Y_2; \hat{Y}_2 | X_2 = b) < max_{P_{X_2|X_1}(.|a)} I(X_2; Y_3 | X_1 = a)
Basics on Cooperation Protocols : Compress-and-Forward (CF)
CF Wyner-Ziv Compression
[Figure: relay channel X_1^n -> (Y_2^n : X_2^n) and Y_3^n; message W estimated as \hat{W}; the relay forms a description \hat{Y}_2^n]
Send B - 1 independent messages over B blocks (each of length n)
At the end of block i, the relay chooses a description \hat{y}_2^n[i] of y_2^n[i]
Since the receiver has side information y_3^n[i] about y_2^n[i], we use Wyner-Ziv coding
to reduce the rate necessary to send \hat{y}_2^n[i]:
R̂ > I(Y_2; \hat{Y}_2 | X_2, Y_3)
   = I(Y_2; \hat{Y}_2 | X_2) - I(Y_3; \hat{Y}_2 | X_2)
The bin index is sent to the receiver in block i+1 via x_2^n[i+1]
Basics on Cooperation Protocols : Compress-and-Forward (CF)
CF Wyner-Ziv Compression (Cont.)
[Figure: same relay channel with Wyner-Ziv description \hat{Y}_2^n]
At the end of block i+1, the receiver first decodes x_2^n[i+1], from which it finds
\hat{y}_2^n[i]:
R̂ < I(X_2; Y_3)
It then finds the unique \hat{w}_i s.t. (x_1^n(\hat{w}_i), x_2^n[i], \hat{y}_2^n[i], y_3^n[i]) are jointly typical:
R < I(X_1; \hat{Y}_2, Y_3 | X_2)
Compress-and-Forward rate:
R_CF = max_{P_{X_1} P_{X_2} P_{\hat{Y}_2 | Y_2, X_2}} I(X_1; \hat{Y}_2, Y_3 | X_2)
subject to I(Y_2; \hat{Y}_2 | X_2, Y_3) <= I(X_2; Y_3)
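The side-information saving in the binning rate can be checked numerically for jointly Gaussian variables (a sketch with illustrative numbers, P1, N2, N3, Nh chosen arbitrarily; X_2 is dropped since it is independent of everything here):

```python
import numpy as np

def cond_mi(cov, A, B, C=()):
    """I(A;B|C) in bits for jointly Gaussian variables with covariance `cov`:
    I(A;B|C) = 1/2 log2( det S_AC * det S_BC / (det S_C * det S_ABC) )."""
    def d(idx):
        idx = sorted(set(idx))
        return 1.0 if not idx else np.linalg.det(cov[np.ix_(idx, idx)])
    A, B, C = list(A), list(B), list(C)
    return 0.5 * np.log2(d(A + C) * d(B + C) / (d(C) * d(A + B + C)))

# Indices: 0 = X1, 1 = Y2 = X1 + Z2, 2 = Y3 = X1 + Z3, 3 = Yhat2 = Y2 + Zh
P1, N2, N3, Nh = 4.0, 1.0, 1.0, 0.5
cov = np.array([
    [P1, P1,      P1,      P1],
    [P1, P1 + N2, P1,      P1 + N2],
    [P1, P1,      P1 + N3, P1],
    [P1, P1 + N2, P1,      P1 + N2 + Nh],
])

# Wyner-Ziv binning saves exactly the side-information term:
lhs = cond_mi(cov, [1], [3], [2])                      # I(Y2; Yhat2 | Y3)
rhs = cond_mi(cov, [1], [3]) - cond_mi(cov, [2], [3])  # I(Y2;Yhat2) - I(Y3;Yhat2)
print(lhs, rhs)
```

The identity holds because \hat{Y}_2 - Y_2 - Y_3 form a Markov chain.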
Basics on Cooperation Protocols : Compress-and-Forward (CF)
Summary
What we covered in this section:
Summarized basic elements of relay channels, including direct, multihop,
broadcast, and multiaccess transmission
Introduced mechanics of various kinds of relay processing, including
amplify-and-forward, decode-and-forward, and compress-and-forward
Basics on Cooperation Information Rates
Information Rates
The purpose of this section is to refine the above analysis, study numerical
examples, and develop insight based on rate.
[Figure: a network cut (S, S^c)]
The capacity region C is usually difficult to compute. A useful outer bound on C is
the cut-set bound.
Let S ⊂ N and let S^c be the complement of S in N. A cut separating W_m from
one of its estimates \hat{W}_m(u) is a pair (S, S^c) where W_m is connected (immediately)
to a node in S but not in S^c, and where u ∈ S^c.
Basics on Cooperation Information Rates : Cut-set Bound
Cut-Set Bound
Let X_S = {X_u : u ∈ S}
Consider any choice of encoders, and compute
P_{X_N Y_N}(a, b) = [ (1/n) Σ_{i=1}^n P_{X_N, i}(a) ] P_{Y_N | X_N}(b | a)
where P_{X_N, i}(.) is the marginal input distribution at time i.
Let M(S) be the set of messages separated from one of their sinks by the cut
(S, S^c).
Cut bound: any (R_1, R_2, ..., R_M) ∈ C satisfies
Σ_{m ∈ M(S)} R_m <= I(X_S; Y_{S^c} | X_{S^c})
Cut-set bound for fixed P_{X_N}(.): intersection over all S of the (R_1, ..., R_M) satisfying
the above bounds
Cut-set bound: union over P_{X_N}(.) of all such regions
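For the Gaussian relay channel the two cuts can be evaluated in closed form and maximized over the input correlation (a sketch of the standard evaluation; g_uv denote receive SNRs, e.g. g13 = P1|H13|^2/(d13^alpha * N)):

```python
import numpy as np

def cutset_bound(g12, g13, g23):
    """Gaussian relay-channel cut-set bound: max over the source-relay
    correlation rho of min{broadcast cut, multiple-access cut}."""
    rho = np.linspace(0.0, 1.0, 10_001)
    bc = np.log2(1 + (1 - rho**2) * (g12 + g13))                  # cut around the source
    mac = np.log2(1 + g13 + g23 + 2 * rho * np.sqrt(g13 * g23))   # cut around the destination
    return np.max(np.minimum(bc, mac))

print(cutset_bound(10.0, 10.0, 10.0))
```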
Basics on Cooperation Information Rates : Cut-set Bound
Cut-Set Bound Examples
Point-to-point channel:
C <= max_{P_X(.)} I(X; Y)
Relay channel:
C <= max_{P_{X_1 X_2}(.)} min{ I(X_1; Y_2 Y_3 | X_2), I(X_1 X_2; Y_3) }
[Figure: the two cuts of the relay channel: the broadcast cut X_1 -> (Y_2 : X_2, Y_3) and the multiple-access cut (X_1, X_2) -> Y_3]
The cut-set bound is usually loose, e.g., for two-way channels, broadcast channels,
relay channels, etc.
Basics on Cooperation Information Rates : Wireless Geometry
Wireless Geometry
Relay is a full-duplex device
Powers and noise:
- E[X_u^2] <= P_u, u = 1, 2
- Z_i, i = 2, 3, independent Gaussian, E[Z_i^2] = N, i = 2, 3
Source and destination are kept fixed. The relay moves on the circle
(distance d from the source, d_23 = sqrt(1 - d^2) from the destination;
source-destination distance normalized to 1). The model is:
Y_2 = (H_12 / d^{α/2}) X_1 + Z_2
Y_3 = H_13 X_1 + (H_23 / (1 - d^2)^{α/4}) X_2 + Z_3
To compare rates, we will consider two settings:
- No fading: H_uv is a known constant, CSIR + CSIT
- Fast uniform-phase fading: H_uv = e^{jθ_uv}, where θ_uv is uniform in
[0, 2π), with CSIR, no CSIT
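The geometry above determines the receive SNRs as a function of the relay position (a sketch, assuming the circle geometry with d13 = 1, d12 = d, d23 = sqrt(1 - d^2), |H_uv| = 1, and illustrative powers):

```python
import numpy as np

def receive_snrs(d, P1=10.0, P2=100.0, N=1.0, alpha=2.0):
    """Receive SNRs g_uv = P |H_uv|^2 / (d_uv^alpha * N) for the line/circle geometry."""
    g12 = P1 / (d**alpha * N)                     # source -> relay
    g13 = P1 / N                                  # source -> destination (distance 1)
    g23 = P2 / ((1 - d**2) ** (alpha / 2) * N)    # relay -> destination
    return g12, g13, g23

g12, g13, g23 = receive_snrs(0.1)
print(g12 > g23)  # relay close to the source: the source-relay link dominates
```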
Basics on Cooperation Information Rates : CF
CF Rates for AWGN Channels
Recall the compress-and-forward lower bound in the DM case:
C >= max_{P_{X_1} P_{X_2} P_{\hat{Y}_2 | Y_2, X_2}} I(X_1; \hat{Y}_2, Y_3 | X_2)
subject to I(Y_2; \hat{Y}_2 | X_2, Y_3) <= I(X_2; Y_3)
For AWGN channels, a natural choice is \hat{Y}_2 = Y_2 + \hat{Z}_2, where \hat{Z}_2 is
Gaussian with variance N̂_2.
X_1 and X_2 are chosen as independent, Gaussian, and with variances P_1 and P_2,
respectively.
The smallest possible N̂_2 is obtained when
I(Y_2; \hat{Y}_2 | X_2, Y_3) = I(X_2; Y_3)
which gives
N̂_2 = N [ P_1 ( |H_12|^2 / d_12^α + |H_13|^2 / d_13^α ) + N ] / ( P_2 |H_23|^2 / d_23^α )
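In terms of receive SNRs g_uv = P|H_uv|^2/(d_uv^alpha * N), the quantization noise and the resulting rate combine into a few lines (a sketch):

```python
import numpy as np

def cf_rate(g12, g13, g23):
    """Full-duplex CF over AWGN with Gaussian quantization Yhat2 = Y2 + Zhat2.
    nhat = Nhat2/N is the smallest quantization noise the relay-destination
    link supports."""
    nhat = (g12 + g13 + 1) / g23
    return np.log2(1 + g12 / (1 + nhat) + g13)

# As g23 grows, Nhat2 -> 0 and CF approaches the broadcast-cut rate log2(1 + g12 + g13):
print(cf_rate(10.0, 10.0, 1e9), np.log2(21))
```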
Basics on Cooperation Information Rates : CF
CF Rates for AWGN Channels
For full-duplex relays,
R = I(X_1; \hat{Y}_2, Y_3 | X_2)
  = log2( 1 + P_1 |H_12|^2 / (d_12^α (N + N̂_2)) + P_1 |H_13|^2 / (d_13^α N) )  bits/use
Comments:
As SNR_23 := |H_23|^2 P_2 / (d_23^α N) -> ∞, we have N̂_2 -> 0 and \hat{Y}_2 -> Y_2, and
R becomes
R = max_{P_{X_1 X_2}(.)} I(X_1; Y_2 Y_3 | X_2)
This is a cut rate, so CF is optimal as SNR_23 -> ∞.
Important insight: use CF when the relay is near the destination, and
not AF or DF (see the rate figure).
Basics on Cooperation Information Rates : DF
DF Rates for AWGN Channels
Recall the DF block structure where x_2(w_{b-1}) is generated by P_{X_2}, and
x_1(w_{b-1}, w_b) by P_{X_1|X_2}(.|x_{2,i}(w_{b-1})) for all i.
After block b, the DF relay decodes w_b at rate
R < I(X_1; Y_2 | X_2)
Backward decoder rate: decode w_b after block b+1 by using y_3[b+1],
b = B-1, B-2, ..., 1, at rate
R < I(X_1 X_2; Y_3)
Sliding-window decoder rate: decode w_b after block b+1 by using y_3[b] and
y_3[b+1], b = 1, 2, ..., B-1, at rate
R < I(X_1; Y_3 | X_2) + I(X_2; Y_3) = I(X_1 X_2; Y_3)
where the first information term is due to y_3[b], and I(X_2; Y_3) is due to y_3[b+1]
(treat x_1(w_b, w_{b+1}) as interference).
DF rate:
R = max_{P_{X_1 X_2}} min{ I(X_1; Y_2 | X_2), I(X_1 X_2; Y_3) }
Basics on Cooperation Information Rates : DF
DF Rates for AWGN Channels
For AWGN channels, choose Gaussian P_{X_1 X_2} with E[|X_1|^2] = P_1, E[|X_2|^2] = P_2,
and ρ = E[X_1 X_2^*] / sqrt(P_1 P_2).
The DF rate is
R = max_ρ min{ log2( 1 + P_1 |H_12|^2 (1 - |ρ|^2) / (d_12^α N) ),
               log2( 1 + P_1 |H_13|^2 / (d_13^α N) + P_2 |H_23|^2 / (d_23^α N)
                       + 2ρ sqrt(P_1 P_2 / (d_13^α d_23^α)) Re{H_13 H_23^*} / N ) }
Comments:
As SNR_12 := |H_12|^2 P_1 / (d_12^α N) -> ∞, the optimal ρ -> 1 and R becomes
R = max_{P_{X_1 X_2}} I(X_1 X_2; Y_3)
This is a cut rate, so DF is optimal as SNR_12 -> ∞.
Important insight: use DF when the relay is near the source (see the
rate figure).
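The max-min over the correlation ρ is easy to evaluate numerically (a sketch with real ρ and coherent phases; g_uv are receive SNRs):

```python
import numpy as np

def df_rate(g12, g13, g23):
    """Full-duplex DF over AWGN: max over the source-relay correlation rho of
    min{relay-decoding constraint, coherent multiple-access constraint}."""
    rho = np.linspace(0.0, 1.0, 10_001)
    r_relay = np.log2(1 + g12 * (1 - rho**2))
    r_mac = np.log2(1 + g13 + g23 + 2 * rho * np.sqrt(g13 * g23))
    return np.max(np.minimum(r_relay, r_mac))

# Very strong source-relay link: rho -> 1 is best, and DF hits the MAC cut log2(1 + 41):
print(df_rate(1e9, 10.0, 10.0))
```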
Basics on Cooperation Information Rates : AF
AF Rates for AWGN Channels
AF processing: set
X_{2,i} = a Y_{2,i-1} = a ( (H_{12,i-1} / d_12^{α/2}) X_{1,i-1} + Z_{2,i-1} )
where a is chosen to satisfy the relay power constraint.
Destination output:
Y_{3,i} = (H_{13,i} / d_13^{α/2}) X_{1,i}
        + a (H_{23,i} / d_23^{α/2}) ( (H_{12,i-1} / d_12^{α/2}) X_{1,i-1} + Z_{2,i-1} ) + Z_{3,i}
AF effectively converts the channel into a unit-memory intersymbol interference
channel. The transmitter should thus perform a water-filling optimization of the
spectrum of X_1^n.
It turns out the relay should not always transmit with maximum power.
More generally:
X_{2,i} = a^T ( Y_{2,i-1}, Y_{2,i-2}, ..., Y_{2,i-D} )
which amounts to filtering-and-forwarding.
Basics on Cooperation Information Rates : AF
Rates For AWGN Channels, No Fading
P_1/N = 10 dB, P_2/N = 20 dB, α = 1, H_uv = 1
[Figure: rate (bits/use) vs. relay position d ∈ [0, 1]: cut-set bound, DF rate, CF rate, and the relay-off rate. DF is best when the relay is near the source; CF is best when the relay is near the destination.]
Basics on Cooperation Information Rates : AF
Rates For Fast Uniform Phase Fading, CSIR and No CSIT
For CF, choose \hat{Y}_2 = Y_2 e^{-jθ_12} + \hat{Z}_2, where \hat{Z}_2 is Gaussian with
0-mean and variance N̂_2.
Straightforward algebra leads to the same CF rate as for AWGN relay channels.
For DF, straightforward algebra leads to
R = max_ρ min{ log2( 1 + P_1 (1 - |ρ|^2) / (d_12^α N) ),
               E[ log2( 1 + P_1 / (d_13^α N) + P_2 / (d_23^α N)
                          + 2ρ sqrt(P_1 P_2 / (d_13^α d_23^α)) Re{e^{j(θ_13 - θ_23)}} / N ) ] }
By Jensen's inequality and E[e^{j(θ_13 - θ_23)}] = 0, the best ρ is zero!
Intuition: without phase knowledge, source and relay transmissions cannot
combine coherently.
Important insight: coherent combining requires CSIT at either the source or the relay
node, and seems unrealistic in mobile environments.
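A quick Monte Carlo check of the key expectation (a sketch; sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
theta13 = rng.uniform(0, 2 * np.pi, 200_000)
theta23 = rng.uniform(0, 2 * np.pi, 200_000)

# Without CSIT the coherent-combining term averages to (nearly) zero,
# so the optimal input correlation rho is zero:
coherent_term = np.mean(np.exp(1j * (theta13 - theta23)))
print(abs(coherent_term))  # close to 0
```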
Basics on Cooperation Information Rates : AF
Rates For Fast Uniform Phase Fading Channels, CSIR and
No CSIT
P_1/N = 10 dB, P_2/N = 20 dB, α = 2, H_uv = e^{jθ_uv}
[Figure: rate (bits/use) vs. relay position d ∈ [0, 1] under fast uniform-phase fading: cut-set bound, DF rate, CF rate, and the relay-off rate.]
Basics on Cooperation Information Rates : AF
Summary
What we covered in Part I:
1 Summarized basic elements of relay channels, including direct, multihop,
broadcast, and multiaccess transmission
2 Introduced mechanics of various kinds of relay processing, including
amplify-and-forward, decode-and-forward, and compress-and-forward
3 Reviewed a cut-set bound
4 Reviewed information theory for cooperative protocols, including AF, CF, DF.
5 Examples of rate gains for a wireless relay channel
6 Gave insight on protocol choice based on geometry and constraints
Cooperation in Presence of Interference

Part II: Cooperation in Presence of Interference


Cooperation in Presence of Interference
Goal
The purpose of this section is to give a high-level overview of some issues that
arise in interference relay networks.
Show, through examples, that classic relaying techniques in general do not suffice
as such in such networks, and need to be combined appropriately with more advanced
techniques.
Discuss what roles relays can play in such networks, in addition to adding power,
reducing path loss, combating fading, and increasing coverage.
Cooperation in Presence of Interference
Outline: Part 2
1 CF Generalization to Networks / Noisy Network Coding
2 Relaying in Presence of Additive Outside Interference
- Interference Not Known
- Interference Known Only to Relay
- Interference Known Only to Source
3 Generalisation: Channels with States
Cooperation in Presence of Interference Noisy Network Coding
Wyner-Ziv Compression for Two Receivers or More
[Figure: relay channel with two receivers Y_3^n and Y_4^n; source X_1^n, relay (Y_2^n : X_2^n). Which binning rate should the relay use: R̂ = I(\hat{Y}_2; Y_2 | X_2, Y_4) or R̂ = I(\hat{Y}_2; Y_2 | X_2, Y_3)?]
CF is a good candidate for relaying in networks with no CSIT, such as mobile
environments.
In networks, which side information should be taken into account?
- Multiple-description coding is not optimal, and is complex for many users!
The binning rate (and so, the overall rate) depends on the considered side information
More generally, Wyner-Ziv type compressions require the quantized version \hat{y}_2^n[i] to
be tailored to a specific receiver
The problem is even more complex in networks with more than two receivers!
Cooperation in Presence of Interference Noisy Network Coding
Noisy Network Coding
[Figure: general N-node network; node u has message W_u and pair (X_u, Y_u), channel p(y_1, ..., y_N | x_1, ..., x_N)]
Alternate compression: noisy network coding (Kim, El Gamal 2011)
Standard compression, i.e., no binning!
Every message is transmitted b times
The receiver decodes using all blocks, without explicitly decoding the compression
indices
Cooperation in Presence of Interference Noisy Network Coding
Noisy Network Coding: Outline of Proof
The source node sends the same message b times; relays use compress-forward; decoders
use simultaneous decoding
No binning; we don't require the compression indices to be decoded correctly!
For simplicity, consider the proof for the relay channel
The relay uses independently generated compression codebooks:
B_j = { \hat{y}_2^n(l_j | l_{j-1}) : l_j, l_{j-1} ∈ [1 : 2^{n R̂_2}] },  j ∈ [1 : b]
l_{j-1} is the compression index of \hat{Y}_2^n(j-1), sent by the relay in block j
The senders use independently generated transmission codebooks:
C_j = { (x_1^n(j, m), x_2^n(l_{j-1})) : m ∈ [1 : 2^{nbR}], l_j, l_{j-1} ∈ [1 : 2^{n R̂_2}] },  j ∈ [1 : b]
Encoding: the sender transmits X_1^n(j, m) in block j ∈ [1 : b]. Upon receiving Y_2^n(j)
and knowing X_2^n(l_{j-1}), the relay finds a jointly typical \hat{Y}_2^n(l_j | l_{j-1}), and sends
X_2^n(l_j) in block j+1.
Cooperation in Presence of Interference Noisy Network Coding
Example: Interference Channel with Intermittent Feedback
[Figure: two-user discrete memoryless interference channel P_{Y_1, Y_2, Y_3, Y_4 | X_1, X_2, S} with Encoders 1, 2 (messages W_1, W_2, inputs X_1^n, X_2^n), Decoders 1, 2 (outputs Y_3^n, Y_4^n, estimates \hat{W}_1, \hat{W}_2), and intermittent feedback signals Y_1^n, Y_2^n gated by states S_1, S_2]
Feedback is provided only intermittently: at time i,
- Feedback event with prob. p_1 on the Dec. 1 -> Enc. 1 link, and p_2 on Dec. 2 -> Enc. 2:
Pr{Y_1[i] = Y_3[i]} = p_1 and Pr{Y_2[i] = Y_4[i]} = p_2
- Erasure event with prob. 1 - p_1 on Dec. 1 -> Enc. 1, and 1 - p_2 on Dec. 2 -> Enc. 2:
Pr{Y_1[i] = ∅} = 1 - p_1 and Pr{Y_2[i] = ∅} = 1 - p_2
We can model this type of feedback using a memoryless state pair (S_1, S_2), with
S_1 ~ Bern(p_1) and S_2 ~ Bern(p_2), and (S_1, S_2) ~ p_{S_1, S_2}(s_1, s_2)
The capacity region is unknown in general, even without feedback!
Classic partial-DF schemes are inefficient here, because of the intermittence
Cooperation in Presence of Interference Noisy Network Coding
Example: IC with Intermittent Feedback (Cont.)
[Figure: same interference channel with intermittent feedback]
Key idea: each transmitter compresses, a-la noisy network coding, its output feedback
and sends it to both receivers.
Optimal for the linear deterministic IC model
Y_3[i] = H_11 X_1[i] + H_12 X_2[i]
Y_4[i] = H_22 X_2[i] + H_21 X_1[i]
Recovers known results on the linear deterministic IC (Tse et al.) if p_1 = p_2 = 1
More generally, optimal for the Costa-El Gamal injective deterministic IC model
- Details in [Zaidi, "Interference channels with generalized and intermittent
feedback," IEEE Trans. IT, 2014]
Cooperation in Presence of Interference Interference Amplification
RC with Unknown Outside Interferer
Node 4 is an unknown interferer:
E[X_4^2] = Q, E[Z_i^2] = 1, i = 2, 3, E[X_i^2] = 1, i = 1, 2
[Figure: relay channel (Node 1 source W -> Node 2 relay -> Node 3 destination \hat{W}) with an outside interferer, Node 4]
We focus on the shown geometry. Node 4 interferes on both links, to the relay and
to the destination. The model is
Y_2 = H_12 X_1 + H_42 X_4 + Z_2
Y_3 = H_13 X_1 + H_23 X_2 + H_43 X_4 + Z_3
The previous schemes may all perform poorly if the transmit power at Node 4 is too large
(strong interferer).
Cooperation in Presence of Interference Interference Amplification
Amplifying the Interference
Important insight:
Interference X
4
is dierent from noise (has a structure !)
Treat the interference as desired information, and amplify it instead of
combating it!
The destination first decodes the interference, cancels its eect, and then
decodes message W interference-free
Rationale:
Consider the following IC, powers and noise variances set to unity for simplicity.
SNR
1
|g
11
|
2
, SNR
2
|g
22
|
2
INR
1
|g
21
|
2
, INR
2
|g
12
|
2

W
1
W
1

W
2
W
2
g
11
g
22
g
21
g
12
Strong interference regime: INR
1
SNR
1
and INR
2
SNR
2
Decoding interference is optimal in the strong interference regime.
The relay steers the network to the strong interference regime in which the
interference can be decoded and so its eect canceled out!
Abdellatif Zaidi Cooperative Techniques in Networks 58 / 90
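To make the decode-and-cancel rationale concrete, here is a hedged numeric sketch for a single Gaussian receiver (illustrative point-to-point numbers, not the slide's exact relay geometry): once the interference arrives strongly enough to be decoded at its own rate, subtracting it beats lumping it into the noise.

```python
import numpy as np

def rate_treat_as_noise(snr: float, inr: float) -> float:
    # Interference lumped into the noise floor.
    return 0.5 * np.log2(1 + snr / (1 + inr))

def rate_decode_and_cancel(snr: float, inr: float, r_int: float) -> float:
    # Decode the interference codeword (rate r_int) first, treating the desired
    # signal as noise, then subtract it; feasible when r_int is below the
    # interference's decodable rate at this receiver.
    if r_int <= 0.5 * np.log2(1 + inr / (1 + snr)):
        return 0.5 * np.log2(1 + snr)      # clean channel after cancellation
    return rate_treat_as_noise(snr, inr)   # otherwise fall back

snr, inr, r_int = 10.0, 100.0, 1.0         # strong interference: INR >> SNR
print(rate_treat_as_noise(snr, inr))       # ≈ 0.07 bit/use
print(rate_decode_and_cancel(snr, inr, r_int))  # ≈ 1.73 bit/use
```

This is the regime the relay "steers" the network into: by amplifying X_4, it pushes the INRs above the SNRs so that the cancellation branch applies.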
Cooperation in Presence of Interference / Binning
Partially Known Interferer
To gain intuition, consider the Gaussian model
  Y_2 = X_1 + S + Z_2
  Y_3 = X_1 + X_2 + S + Z_3
  E[S^2] = Q, E[X_i^2] = P_i, i = 1, 2
  E[Z_i^2] = N_i, i = 2, 3
The interference S can be known to all or only a subset of the nodes
Node k, k = 1, 2, 3, knows the interference from Node 4
- strictly causally: if, at time i, it knows S^{i-1} = (S_1, ..., S_{i-1})
- causally: if, at time i, it knows S^i = (S_1, ..., S_{i-1}, S_i)
- non-causally: if, at time i, it knows S^n = (S_1, ..., S_{i-1}, S_i, ..., S_n)
In all cases, the interference can be learned, e.g., through relaying or by means of cognition.
In general, asymmetric models are more difficult to solve!
Abdellatif Zaidi Cooperative Techniques in Networks 59 / 90
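The three knowledge patterns differ only in which prefix of the state sequence an encoder may use at time i; a minimal sketch (the function name and the sample sequence are ours, purely illustrative):

```python
def known_states(s, i, mode):
    """Portion of the state sequence s available to a node at time i (1-indexed)."""
    if mode == "strictly-causal":
        return s[:i - 1]   # S^{i-1} = (S_1, ..., S_{i-1})
    if mode == "causal":
        return s[:i]       # S^i = (S_1, ..., S_i)
    if mode == "non-causal":
        return s[:]        # the whole block S^n
    raise ValueError(f"unknown mode: {mode}")

s = [3, 1, 4, 1, 5]
print(known_states(s, 3, "strictly-causal"))  # [3, 1]
print(known_states(s, 3, "causal"))           # [3, 1, 4]
print(known_states(s, 3, "non-causal"))       # [3, 1, 4, 1, 5]
```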
Cooperation in Presence of Interference / Binning
Collaborative Binning Against Interference
Recall Costa's Dirty Paper Coding for a point-to-point AWGN channel
- Input-output relation: Y = X + S + Z
  E[X^2] ≤ P, E[S^2] = Q, E[Z^2] = N
  S known non-causally to Tx, but not to Rx
- Optimal precoding: Tx sends X = U - αS, with α = P/(P + N)
  Capacity C = I(U; Y) - I(U; S) = (1/2) log_2(1 + P/N)
  Intuition: Y = U + (1 - α)S + N, with 1 - α → 0 as P/N grows
Symmetric case: S is known to both source and relay (non-causally)
  X_2 = β (U_1 - α_1 S),  β = √P_1 / √P^(1),
  X_1 = (U_1 - α_1 S) + (U_2 - α_2 S) + X'_1,  X'_1 ~ N(0, P_1)
with P^(1) = (√P_1 + √P_2)^2, P^(2) = P_1, and
  α_k = P^(k) / ( P^(1) + P^(2) + (P_1 + N_2) ),  k = 1, 2.
Eliminates completely the effect of the interference S! (optimal if the channel is physically degraded).
Abdellatif Zaidi Cooperative Techniques in Networks 60 / 90
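The DPC recap above can be checked numerically: with jointly Gaussian (X, S, Z), all the mutual informations reduce to log-ratios of variances, and I(U;Y) - I(U;S) with Costa's α equals (1/2) log_2(1 + P/N) whatever Q is. A sketch with illustrative values:

```python
import numpy as np

def dpc_rate(P, Q, N, a):
    """I(U;Y) - I(U;S) for Y = X + S + Z, U = X + a*S, with independent
    X ~ N(0,P), S ~ N(0,Q), Z ~ N(0,N), from Gaussian covariances."""
    var_u = P + a**2 * Q
    var_y = P + Q + N
    cov_uy = P + a * Q
    i_uy = 0.5 * np.log2(var_u * var_y / (var_u * var_y - cov_uy**2))
    i_us = 0.5 * np.log2(var_u / P)   # var(U|S) = P
    return i_uy - i_us

P, Q, N = 1.0, 10.0, 1.0
alpha = P / (P + N)                   # Costa's inflation factor
print(dpc_rate(P, Q, N, alpha))       # ≈ 0.5 = 0.5*log2(1 + P/N), independent of Q
```

Re-running with Q = 100 gives the same rate: the known interference costs nothing, which is what collaborative binning exploits.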
Cooperation in Presence of Interference / Cognitive Relay
Interference Known Only at Relay
Case of no interference, with DF
  The source knows the relay input X_2^n
  The source can therefore jointly design X_1^n through P_{X_1,X_2}(x_1, x_2)
  This ensures some coherence gain, as in multi-antenna transmission
    R < I(X_1, X_2; Y_3)
Case of interference known only at the relay, with DF
  Issue: coherent transmission is difficult to obtain
  The relay should exploit the known S^n:
    X_{2,i} = φ_{2,i}(Y_2^{i-1}, S^n)
  The source does not know S^n, and therefore does not know X_{2,i}
Abdellatif Zaidi Cooperative Techniques in Networks 61 / 90
Cooperation in Presence of Interference / Cognitive Relay
Coding
Complete interference mitigation is impossible, due to the asymmetry
Main idea: decompose the relay input as
  X_2 = U_1 + X̃_2
  X̃_2: zero-mean Gaussian with variance θP_2, θ ∈ [0, 1]
  U_1: zero-mean Gaussian with variance θ̄P_2, θ̄ = 1 - θ
  U_1 is independent of S and correlated with X_1
  X̃_2 is correlated with S and independent of X_1
  E[U_1 X_1] = ρ_12 √(θ̄ P_1 P_2),  E[X̃_2 S] = ρ_2s √(θ P_2 Q)
  X̃_2 is generated using a Generalized DPC (ρ_2s ≤ 0)
    U_2 = X̃_2 + αS
Abdellatif Zaidi Cooperative Techniques in Networks 62 / 90
Cooperation in Presence of Interference / Cognitive Relay
Information Rate
Theorem
The capacity of the general Gaussian RC with interference known non-causally only at the relay is lower-bounded by
  R_G^in = max min { (1/2) log( 1 + (P_1 + θ̄P_2 + 2ρ_12 √(θ̄ P_1 P_2)) / (θP_2 + Q + N_3 + 2ρ_2s √(θ P_2 Q)) )
                       + (1/2) log( 1 + θP_2 (1 - ρ_2s^2) / N_3 ),
                     (1/2) log( 1 + P_1 (1 - ρ_12^2) / N_2 ) }
where the maximization is over parameters θ ∈ [0, 1], ρ_12 ∈ [0, 1], ρ_2s ∈ [-1, 0], and θ̄ = 1 - θ.
Abdellatif Zaidi Cooperative Techniques in Networks 63 / 90
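A hedged numeric sketch of the bound above: brute-force the maximization over (θ, ρ_12, ρ_2s) on a grid (the channel parameter values below are illustrative, not from the slides):

```python
import numpy as np

def r_in(P1, P2, Q, N2, N3, grid=25):
    """Grid-search the lower bound R_G^in over theta, rho_12, rho_2s."""
    best = 0.0
    for th in np.linspace(0.0, 1.0, grid):
        tb = 1.0 - th
        for r12 in np.linspace(0.0, 1.0, grid):
            # Source-to-relay decoding constraint (last term of the min).
            t3 = 0.5 * np.log2(1 + P1 * (1 - r12**2) / N2)
            for r2s in np.linspace(-1.0, 0.0, grid):
                # Coherent-gain term plus the relay's DPC-cleaned term.
                t1 = 0.5 * np.log2(1 + (P1 + tb * P2 + 2 * r12 * np.sqrt(tb * P1 * P2))
                                       / (th * P2 + Q + N3 + 2 * r2s * np.sqrt(th * P2 * Q)))
                t2 = 0.5 * np.log2(1 + th * P2 * (1 - r2s**2) / N3)
                best = max(best, min(t1 + t2, t3))
    return best

print(r_in(P1=10.0, P2=10.0, Q=10.0, N2=1.0, N3=1.0))
```

The denominator of the first term stays positive for all grid points, since θP_2 + Q + 2ρ_2s√(θP_2 Q) ≥ (√(θP_2) - √Q)^2.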
Cooperation in Presence of Interference / Cognitive Relay
Upper Bounding Technique
Source → Relay:
  R < I(X_1; Y_2, Y_3, S | X_2) = I(X_1; Y_2, Y_3 | S, X_2)
(Source, Relay) → Destination:
  R < I(X_1, X_2; Y_3 | S) - I(X_1; S | Y_3)
The term I(X_1; S | Y_3) is the rate loss due to not knowing the interference at the source as well
Has a connection with the MAC with asymmetric CSI
But, with a different proof technique
Abdellatif Zaidi Cooperative Techniques in Networks 64 / 90
Cooperation in Presence of Interference / Cognitive Relay
How Tight is the Lower Bound?
Proposition
For the physically degraded Gaussian RC, we have:
1) If P_1, P_2, Q, N_2, N_3 satisfy
  N_2 ≤ max_{ρ ∈ [-1,0]}  P_1 N_3 (P_2 + Q + N_3 + 2ρ√(P_2 Q))
        / ( P_1 N_3 + P_2 (1 - ρ^2)(P_1 + P_2 + Q + N_3 + 2ρ√(P_2 Q)) ),
then the channel capacity is given by
  C_G^D = (1/2) log(1 + P_1/N_2).
2) If the maximum for the upper bound is attained at the boundary ρ_12^2 + ρ_2s^2 = 1, then the lower bound is tight.
Abdellatif Zaidi Cooperative Techniques in Networks 65 / 90
Cooperation in Presence of Interference / Cognitive Relay
How Tight is the Lower Bound? (cont.)
Bounds for the Gaussian RC also meet in some extreme cases:
Arbitrarily strong interference, i.e., Q → ∞:
  C = min{ (1/2) log(1 + P_1/N_2), (1/2) log(1 + P_2/N_3) }   (degraded Gaussian RC)
  C = (1/2) log(1 + P_2/N_3), if P_2/N_3 ≤ P_1/N_2   (general Gaussian RC)
Zero power at the relay, i.e., P_2 = 0:
  C = (1/2) log(1 + P_1/(Q + N_3))   (degraded Gaussian RC)
  C = (1/2) log(1 + P_1/(Q + N_3)), if P_1/(Q + N_3) ≤ P_1/N_2   (general Gaussian RC)
No interference, i.e., Q = 0 (degraded Gaussian RC)
Abdellatif Zaidi Cooperative Techniques in Networks 66 / 90
Cooperation in Presence of Interference / Cognitive Relay
The Deaf Helper Problem
What if the relay cannot hear the source?
Finite interference
  subtract as much as possible from S
  not optimal in general
Arbitrarily strong interference
  Construct a codeword X_2 (independent of X_1) by DPC, X_2 = U_2 - αS
  At the destination: decode U_2 first and then X_1
  Cleans up the channel for X_1 if
    I(U_2; Y_3) - I(U_2; S) > 0.
  Transmission at
    I(X_1; Y_3 | S) = (1/2) log_2(1 + P_1/N_3)
Abdellatif Zaidi Cooperative Techniques in Networks 67 / 90
Cooperation in Presence of Interference / Cognitive Relay
Example: Degraded Gaussian RC
P_1 = P_2 = Q = N_3 = 10 dB
[Figure: rate bounds vs. P_1/N_2 in dB (0 to 30): lower bound, trivial upper bound, upper bound, trivial lower bound; second panel: the optimizing ρ_12^2 + ρ_2s^2 vs. P_1/N_2]
Abdellatif Zaidi Cooperative Techniques in Networks 68 / 90
Cooperation in Presence of Interference / Cognitive Source
RC with Interference Known Only at Source
Model
  Y_2 = X_1 + S + Z_2
  Y_3 = X_1 + X_2 + S + Z_3
  E[S^2] = Q, E[X_i^2] = P_i, i = 1, 2
  E[Z_i^2] = N_i, i = 2, 3
The interference S is known non-causally to only the source
Abdellatif Zaidi Cooperative Techniques in Networks 69 / 90
Cooperation in Presence of Interference / Cognitive Source
Coding
Two different techniques:
1 Lower bound 1: reveal the interference to the relay
  interference exploitation (binning) is performed also at the relay
  (share message and interference)
2 Lower bound 2: reveal to the relay just what the relay would send had the relay known the interference
  interference exploitation (binning) is performed only at the source
  (share X = x(V, S), suitable for oblivious relaying)
Abdellatif Zaidi Cooperative Techniques in Networks 70 / 90
Cooperation in Presence of Interference / Cognitive Source
Lower Bound 2
If the interference were also known at the relay
Beginning of block i:
  Source sends x_1[i] := x_1(w_{i-1}, w_i) | v(w_{i-1}, j*_V), u(w_{i-1}, w_i, j*_U), s[i]
  Relay sends x[i] | v(w_{i-1}, j*_V), s[i]
In our case (interference known only at the source)
  The source knows w_{i-1} and s[i], and so knows x[i]
  The source transmits x[i] to the relay, ahead of time, in block i - 1
  The relay estimates x[i] from y_2[i - 1], and sends x_2[i] ≈ x[i] in block i
Abdellatif Zaidi Cooperative Techniques in Networks 71 / 90
Cooperation in Presence of Interference / Cognitive Source
Lower Bound 2
Outline:
Beginning of block i:
  Source looks for u(w_i, j*_i) such that (u(w_i, j*_i), s[i]) ∈ T_ε^n
  Source looks for u(w_{i+1}, j*_{i+1}) such that (u(w_{i+1}, j*_{i+1}), s[i+1]) ∈ T_ε^n; and then computes x[i+1] | u(w_{i+1}, j*_{i+1}), s[i+1]
  Source quantizes x[i+1] into x̂[m_i]
[Figure: codebook structure: cloud center v(w_{i-1}); satellite codewords u(w_i, j*_{Ui}) and u_R(m_i, j*_{Ri}) carrying (w_i, m_i), combined via Marton's coding and superposition coding]
Abdellatif Zaidi Cooperative Techniques in Networks 72 / 90
Cooperation in Presence of Interference / Cognitive Source
Lower Bound 2: Marton's Coding
Theorem
The capacity of the DM RC with interference known only at the source is lower-bounded by
  R^lo = max I(U; Y_3) - I(U; S)
subject to the constraint
  I(X; X̂) < I(U_R; Y_2) - I(U_R; S) - I(U_R; U | S)
where the maximization is over all joint measures on S × U × U_R × X_1 × X_2 × X × X̂ × Y_2 × Y_3 of the form
  P_{S,U,U_R,X_1,X_2,X,X̂,Y_2,Y_3} = Q_S P_{U,U_R|S} P_{X_1|U,U_R,S} P_{X|U,S} P_{X̂|X} 1_{X_2 = X̂} W_{Y_2,Y_3|X_1,X_2,S}.
Abdellatif Zaidi Cooperative Techniques in Networks 73 / 90
Cooperation in Presence of Interference / Cognitive Source
Lower Bound 2 (Gaussian Case)
Test channel:
  X̂ = aX + X̃,  a := 1 - D/P_2,  X̃ ~ N(0, D(1 - D/P_2)),  0 ≤ D ≤ P_2
  X ~ N(0, P_2), with E[X X̃] = E[X S] = E[X̃ S] = 0
  X_1R ~ N(0, θP_1), with E[X_1R S] = 0, 0 ≤ θ ≤ 1
  U = ( √(θ̄ P_1 / P_2) + (P_2 - D)/P_2 ) X + αS
  U_R = X_1R + α_R ( S + ( √(θ̄ P_1) / (√(θ̄ P_1) + √(P_2 - D)) ) X ),
with
  α = (√(θ̄ P_1) + √(P_2 - D))^2 / ( (√(θ̄ P_1) + √(P_2 - D))^2 + (N_3 + D + θP_1) )
  α_R = θP_1 / (θP_1 + N_2).
Abdellatif Zaidi Cooperative Techniques in Networks 74 / 90
Cooperation in Presence of Interference / Cognitive Source
Lower Bound
Theorem
The capacity of the Gaussian RC with interference known only at the source is lower-bounded by
  R_G^lo = max (1/2) log( 1 + (√(θ̄ P_1) + √(P_2 - D))^2 / (N_3 + D + θP_1) ),
where
  D := P_2 N_2 / (N_2 + θP_1)
and the maximization is over θ ∈ [0, 1], with θ̄ := 1 - θ.
Abdellatif Zaidi Cooperative Techniques in Networks 75 / 90
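The bound above is a one-dimensional maximization once D is substituted; a hedged sketch that evaluates it by sweeping the power split θ over [0, 1] (the channel values are illustrative):

```python
import numpy as np

def r_lo(P1, P2, N2, N3, grid=1001):
    """Sweep theta in [0,1] and return the Gaussian lower bound R_G^lo."""
    theta = np.linspace(0.0, 1.0, grid)
    D = P2 * N2 / (N2 + theta * P1)   # distortion of the relay's estimate of x[i]
    num = (np.sqrt((1.0 - theta) * P1) + np.sqrt(P2 - D)) ** 2
    return float(np.max(0.5 * np.log2(1 + num / (N3 + D + theta * P1))))

print(r_lo(P1=10.0, P2=100.0, N2=1.0, N3=10.0))
```

Sanity checks match the extreme cases on the next slide: as N_2 → ∞, D → P_2 and the sweep collapses to (1/2) log_2(1 + P_1/(N_3 + P_2)).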
Cooperation in Presence of Interference / Cognitive Source
How Tight are the Bounds? (Cont'd)
Bounds for the Gaussian RC meet in some extreme cases:
Arbitrarily small noise at the relay, i.e., N_2 → 0:
  C_G = (1/2) log( 1 + (√P_1 + √P_2)^2 / N_3 ) - o(1)
where o(1) → 0 as N_2 → 0.
Arbitrarily strong noise at the relay, i.e., N_2 → ∞:
  R_G^up = (1/2) log(1 + P_1/N_3)
  R_G^lo = (1/2) log(1 + P_1/(N_3 + P_2)).
If P_1 → ∞, the bounds meet asymptotically in the power at the source if P_2 ≤ P_1, yielding
  C_G-orth = (1/2) log(1 + P_1/N_3) + o(1)
Abdellatif Zaidi Cooperative Techniques in Networks 76 / 90
Cooperation in Presence of Interference / Cognitive Source
Example
P_1 = Q = N_3 = 10 dB, P_2 = 20 dB
[Figure: rate bounds vs. P_1/N_2 in dB (-20 to 50): lower bound (Theorem 4), lower bound (Theorem 5), upper bound (Theorem 3), upper bound (cut-set bound), trivial lower bound]
Abdellatif Zaidi Cooperative Techniques in Networks 77 / 90
Cooperation in Presence of Interference / General Framework
RC with States
More generally, S may represent any information about the channel (fading, activity, ...)
[Figure: relay channel W_{Y_2,Y_3|X_1,X_2,S} with the state sequence S^n available either at the source or at the relay]
Thorough results, as well as the strictly-causal CSI case, in:
- Zaidi et al., "Bounds on the Capacity of the Relay Channel with Noncausal State at Source," IEEE Trans. Inf. Theory, vol. 59, no. 5, May 2013, pp. 2639-2672.
- Zaidi et al., "Cooperative Relaying with State Available Non-Causally at the Relay," IEEE Trans. Inf. Theory, vol. 56, no. 5, pp. 2272-2298, May 2010.
- Zaidi et al., "Capacity Region of Cooperative Multiple Access Channel with States," IEEE Trans. Inf. Theory, vol. 59, no. 10, 2013, pp. 6153-6174.
Abdellatif Zaidi Cooperative Techniques in Networks 78 / 90
Cooperation in Presence of Interference / General Framework
MAC with Delayed CSI
[Figure: two-encoder MAC W_{Y|X_1,X_2,S}; both encoders observe the delayed state S^{i-1}; the decoder outputs (Ŵ_c, Ŵ_1)]
Both encoders send W_c. Encoder 1 also sends W_1
Both encoders know the states only strictly causally
  φ_{1,i}: W_c × W_1 × S^{i-1} → X_1,  i = 1, ..., n
  φ_{2,i}: W_c × S^{i-1} → X_2,  i = 1, ..., n
Decoder:
  ψ: Y^n → W_c × W_1
Abdellatif Zaidi Cooperative Techniques in Networks 79 / 90
Cooperation in Presence of Interference / General Framework
Main Results
1 CSI given with delay to only the transmitters increases the capacity region
- Zaidi et al., Cooperative MAC with states, IT 2013
2 Gains are obtained through a Block-Markov coding scheme in which the encoders jointly compress the CSI of the last block, using an appropriate compression scheme
3 Reminiscent of quantizing-and-transmitting noise in a non-degraded BC example with common noise at the receivers, by Dueck (cf. "Partial feedback for two-way and broadcast channels," Inf. Contr., 1980)
- MAT scheme (Maddah-Ali, Tse): the interference plays the role of S here. In block i, the transmitter sends a linear function of (S[i-1], X[i-1]). This can be seen as a compressed version S̃ = V = f(X, S) ~ P_{V|X,S}
- Lapidoth et al., MAC with causal and strictly causal CSI, IT 2013
- Li et al., MAC with states known strictly causally, IT 2013
Abdellatif Zaidi Cooperative Techniques in Networks 80 / 90
Cooperation in Presence of Interference / General Framework
Capacity Region in Some Special Cases
Let D_MAC^sym be the class of discrete memoryless two-user cooperative MACs in which the channel state S, assumed to be revealed strictly causally to both encoders, can be obtained as a deterministic function of the channel inputs X_1 and X_2 and the channel output Y, as
  S = f(X_1, X_2, Y).
Theorem
For any MAC in the class D_MAC^sym defined above, the capacity region C_s-c is given by the set of all rate pairs (R_c, R_1) satisfying
  R_1 ≤ I(X_1; Y | X_2, S)
  R_c + R_1 ≤ I(X_1, X_2; Y)
for some measure
  P_{S,X_1,X_2,Y} = Q_S P_{X_1,X_2} W_{Y|S,X_1,X_2}.
Example: model Y = X_1 + X_2 + S. Capacity: R_c + R_1 ≤ log(1 + (√P_1 + √P_2)^2 / Q).
Abdellatif Zaidi Cooperative Techniques in Networks 81 / 90
Cooperation in Presence of Interference / General Framework
Delayed CSI Does Not Always Help!
Proposition
Delayed CSI at the encoders does not increase the sum capacity:
  max_{(R_c, R_1) ∈ C_s-c} R_c + R_1 = max_{p(x_1, x_2)} I(X_1, X_2; Y).
Proposition
Delayed CSI at only the encoder that sends both messages does not increase the capacity region of the cooperative MAC.
Abdellatif Zaidi Cooperative Techniques in Networks 82 / 90
Interaction and Computation

Part III: Interaction and Computation

Abdellatif Zaidi Cooperative Techniques in Networks 83 / 90
Interaction and Computation
Outline: Part 3
1 Interaction for Source Reproduction
2 Interaction for Function Computation
Abdellatif Zaidi Cooperative Techniques in Networks 84 / 90
Interaction and Computation
Setup
[Figure: terminal A observes X, terminal B observes Y; they exchange messages M_1, M_2, ..., M_t; A computes f_A(X, Y), B computes f_B(X, Y)]
Under what conditions is interaction useful?
How useful is interaction?
What is the best way to interact?
Abdellatif Zaidi Cooperative Techniques in Networks 85 / 90
Interaction and Computation
Interaction for Lossless Source Reproduction
Discrete memoryless multi-source (X_1, Y_1), ..., (X_n, Y_n) i.i.d. ~ p_{X,Y}(x, y)
Goal: reproduce X = (X_1, X_2, ..., X_n) at B with probability → 1 as n → ∞
[Figure: A observes (X_1, ..., X_n), B observes (Y_1, ..., Y_n); A sends a single message M_1 at rate R_1 = H(X|Y); B outputs (X̂_1, ..., X̂_n)]
One-round Slepian-Wolf coding is optimal
Abdellatif Zaidi Cooperative Techniques in Networks 86 / 90
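The one-round rate R_1 = H(X|Y) is easy to evaluate from the joint pmf; a sketch for a doubly symmetric binary source (the crossover probability p = 0.11 is an illustrative choice), for which H(X|Y) equals the binary entropy h(p):

```python
import numpy as np

def cond_entropy(p_xy: np.ndarray) -> float:
    """H(X|Y) in bits for a joint pmf p_xy[x, y]."""
    p_y = p_xy.sum(axis=0)
    h = 0.0
    for x in range(p_xy.shape[0]):
        for y in range(p_xy.shape[1]):
            if p_xy[x, y] > 0:
                h -= p_xy[x, y] * np.log2(p_xy[x, y] / p_y[y])
    return h

p = 0.11
p_xy = 0.5 * np.array([[1 - p, p], [p, 1 - p]])   # X ~ Bern(1/2), Y = X flipped w.p. p
print(cond_entropy(p_xy))                          # ≈ h(0.11) ≈ 0.4999 bits
```

Slepian-Wolf binning lets A send only these h(p) bits per symbol instead of the full H(X) = 1 bit, since B's side information Y resolves the rest.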
Interaction and Computation
Interaction for Lossy Source Reproduction
Discrete memoryless multi-source (X_1, Y_1), ..., (X_n, Y_n) i.i.d. ~ p_{X,Y}(x, y)
Goal: reproduce X̂ = (X̂_1, X̂_2, ..., X̂_n), with E[d(X, X̂)] ≤ D_X as n → ∞
[Figure: A observes (X_1, ..., X_n), B observes (Y_1, ..., Y_n); they exchange messages M_1, ..., M_t at rates R_1, ..., R_t; A outputs Ŷ = (Ŷ_1, ..., Ŷ_n), B outputs X̂ = (X̂_1, ..., X̂_n)]
Interaction is useful:
  R_sum := R_1 + R_2 + ... + R_t ≤ min_{P_{U_1|X}} I(U_1; X|Y) + min_{P_{U_2|Y}} I(U_2; Y|X)
Abdellatif Zaidi Cooperative Techniques in Networks 87 / 90
Interaction and Computation
Interaction for Function Computation
Discrete memoryless multi-source (X_1, Y_1), ..., (X_n, Y_n) i.i.d. ~ p_{X,Y}(x, y)
Goal: reproduce f̂_A = f̂_A(X, Y), with E[d(f̂_A(X, Y), f_A(X, Y))] ≤ D_A as n → ∞
[Figure: A observes (X_1, ..., X_n), B observes (Y_1, ..., Y_n); they exchange messages M_1, ..., M_t at rates R_1, ..., R_t; A reproduces f_A(X, Y), B reproduces f̂_B(X, Y)]
Interaction is useful
Abdellatif Zaidi Cooperative Techniques in Networks 88 / 90
Wrap-Up
Wrap-Up
1 Many approaches, benefits, and challenges to utilizing relaying and cooperative communications in networks.
2 Relaying also includes, or should also include, multihop (store-and-forward routing) and network coding.
3 The capacity of relay systems is difficult to analyze, but the theory is surprisingly flexible and diverse.
4 Although generalizations to networks with many nodes are in general not easy, there are schemes which scale appropriately.
5 In networks with interference, relaying can help, not only by adding power/energy spatially, but also by allowing distributed interference cancellation; in general this boosts network capacity.
6 To cope with network interference efficiently, classic relaying schemes as such do not suffice in general, and need to be combined carefully with other appropriate techniques.
Abdellatif Zaidi Cooperative Techniques in Networks 89 / 90
Wrap-Up
Selected References
- T. Cover and A. El Gamal, "Capacity theorems for the relay channel," IEEE Trans. Inf. Theory, vol. 25, Sep. 1979, pp. 572-584.
- G. Kramer, M. Gastpar and P. Gupta, "Cooperative strategies and capacity theorems for relay networks," IEEE Trans. Inf. Theory, Sep. 2005, pp. 3037-3063.
- J. N. Laneman, D. Tse and G. Wornell, "Cooperative diversity in wireless networks: Efficient protocols and outage behaviour," IEEE Trans. Inf. Theory, vol. 50, Dec. 2004, pp. 3062-3080.
- A. Zaidi, S. Kotagiri and J. N. Laneman, "Cooperative relaying with state available non-causally at the relay," IEEE Trans. Inf. Theory, vol. 56, no. 5, pp. 2272-2298, May 2010.
- A. Zaidi et al., "Bounds on the capacity of the relay channel with noncausal state at source," IEEE Trans. Inf. Theory, vol. 59, no. 5, May 2013, pp. 2639-2672.
- A. Zaidi et al., "Capacity region of cooperative multiple access channel with states," IEEE Trans. Inf. Theory, vol. 59, no. 10, 2013, pp. 6153-6174.
- S.-H. Lim, Y.-H. Kim and A. El Gamal, "Noisy network coding," IEEE Trans. Inf. Theory, vol. 57, May 2011, pp. 3132-3152.
- A. Avestimehr, S. Diggavi and D. Tse, "Wireless network information flow: a deterministic approach," IEEE Trans. Inf. Theory, vol. 57, Apr. 2011, pp. 1872-1905.
- A. El Gamal and Y.-H. Kim, Network Information Theory, Cambridge University Press, 2011.
Abdellatif Zaidi Cooperative Techniques in Networks 90 / 90