© 2003-2006 Communication Theory Group, ETH Zurich. SVN Revision: 1767, January 24, 2007.
Basic Probability

Sample Space and Sigma-Field [1, 1.2-5]. The sample space Ω is the set of all outcomes, or elementary events, of a random experiment. The power set of Ω contains all subsets of Ω and is written as {0, 1}^Ω. A collection F of subsets of Ω is called a σ-field if it satisfies the following conditions:
- ∅ ∈ F.
- If A1, ..., An ∈ F then ∪_{i=1}^n Ai ∈ F, where n = 1, 2, ... can be infinite.
- If A ∈ F then A^c ∈ F.
A ∈ F is called an event; σ-fields are also closed under countable intersections.

Probability Space [1, 1.3.1]. A probability measure P on (Ω, F) is a function P: F → [0, 1] satisfying:
- P(∅) = 0, P(Ω) = 1.
- If A1, A2, ... is a collection of disjoint members of F, in that Ai ∩ Aj = ∅ for all pairs i, j with i ≠ j, then

      P(∪_{i=1}^∞ Ai) = Σ_{i=1}^∞ P(Ai).

The triple (Ω, F, P) is called a probability space. An event A is called null if P(A) = 0. An event B is said to occur almost surely if P(B) = 1. A probability space (Ω, F, P) is called complete if all subsets of null sets, i.e., events of zero probability, are events themselves.

Properties of a Probability Space [1, 1.3].
- P(A^c) = 1 − P(A).
- If B ⊇ A then P(B) = P(A) + P(B \ A) ≥ P(A).
- P(A ∪ B) = P(A) + P(B) − P(B ∩ A).
- Let A1, A2, ... be an increasing sequence of events, so that A1 ⊆ A2 ⊆ ..., and write A = ∪_{i=1}^∞ Ai = lim_{i→∞} Ai for their limit. Then P(A) = lim_{i→∞} P(Ai). The same holds for a decreasing sequence of events and their intersection.

Conditional Probability [1, 1.4]. If P(B) > 0 then the conditional probability that A occurs given that B occurs is defined to be

    P(A | B) := P(A ∩ B) / P(B).

For any events A and B such that 0 < P(B) < 1,

    P(A) = P(A | B) P(B) + P(A | B^c) P(B^c).

More generally, let B1, B2, ..., Bn be a partitioning of Ω such that P(Bi) > 0 for all i. Then

    P(A) = Σ_{i=1}^n P(A | Bi) P(Bi).

Bayes' rule: let the Bi and A be as before, with P(A) > 0; then

    P(Bi | A) = P(A | Bi) P(Bi) / Σ_{j=1}^n P(A | Bj) P(Bj).

Independence [1, 1.5]. A family {Ai : i ∈ I} is called independent if

    P(∩_{i∈J} Ai) = Π_{i∈J} P(Ai)

for all finite subsets J of I.

Random Variables

Basics

Random Variables and Distribution Functions [1, 2.1]. A random variable (RV) is a function X: Ω → R with the property that {ω ∈ Ω : X(ω) ≤ x} ∈ F for each x ∈ R. Such a function is said to be F-measurable. The (cumulative) distribution function (CDF) of a RV X is the function FX: R → [0, 1] given by FX(x) := P(X ≤ x). A distribution function has the following properties:
- lim_{x→−∞} FX(x) = 0, lim_{x→∞} FX(x) = 1.
- If x < y then FX(x) ≤ FX(y).
- The CDF FX is right-continuous, that is, FX(x + h) → FX(x) as h ↓ 0.
- P(X > x) = 1 − FX(x).
- P(x < X ≤ y) = FX(y) − FX(x).
- P(X = x) = FX(x) − lim_{y↑x} FX(y).
The RV X is called continuous [2, 4.2] if its CDF FX is continuous; in that case FX(x⁻) = FX(x) for all x, and P(X = x) = 0. It is discrete if it takes values in some countable subset {x1, x2, ...} of R; in that case FX(x) is constant except for a countable number of jump discontinuities, and P(X = x) = FX(x) − FX(x⁻). It is of mixed type if FX(x) is piecewise continuous with a finite number of jump discontinuities.

The indicator function IA: Ω → R is defined as the binary RV

    IA(ω) = 1 if ω ∈ A,  0 if ω ∈ A^c.

Independence [1, 4.2]. Random variables X and Y (discrete or continuous) are called independent if {X ≤ x} and {Y ≤ y} are independent events for all x, y ∈ R, i.e., FX,Y(x, y) = FX(x) FY(y) for all x, y ∈ R.

The random vector [X Y] has the following properties, which hold analogously for N-dimensional random vectors:
- lim_{x,y→−∞} FX,Y(x, y) = 0, lim_{x,y→∞} FX,Y(x, y) = 1.
- If [x1 y1] ≤ [x2 y2] then FX,Y(x1, y1) ≤ FX,Y(x2, y2).
- FX,Y is continuous from above, in that FX,Y(x + u, y + v) → FX,Y(x, y) as u, v ↓ 0.
- The marginal distribution functions of X and Y are lim_{y→∞} FX,Y(x, y) = FX(x) and lim_{x→∞} FX,Y(x, y) = FY(y), written as FX(x) = FX,Y(x, ∞) and FY(y) = FX,Y(∞, y), respectively.
The definitions of discrete, continuous, and mixed RVs extend to random vectors.

Relationship Between Real-Valued and Complex-Valued Operations [3, 5]. A complex RV U = X + jY can be treated as a random vector [X Y]. Consider arbitrary complex vectors u, v ∈ C^N and a complex M × N matrix A ∈ C^{M×N}. Define uR := ℜ{u} and uI := ℑ{u}, and the real 2N-dimensional vector

    ũ := [uR; uI] = [ℜ{u}; ℑ{u}].

Then the complex-valued linear operation v = Au can be expressed in terms of the real quantities as ṽ = Ãũ, where a matrix Ã satisfying ṽ = Ãũ exists and is given by

    Ã = [AR  −AI; AI  AR] = [ℜ{A}  −ℑ{A}; ℑ{A}  ℜ{A}].

Let B be another complex matrix; then (AB)~ = Ã B̃.

Discrete Random Variables

Basics [1, 3.1-3.2]. The probability mass function (PMF) of a discrete RV X is the function f: R → [0, 1] given by fX(x) = P(X = x). The joint PMF of a random vector X is defined analogously. The PMF of a discrete RV satisfies:
- The set of x such that fX(x) ≠ 0 is countable.
- Σ_i fX(xi) = 1.
Discrete RVs X1, X2, ..., XN are independent if the events {X1 = x1}, {X2 = x2}, ..., {XN = xN} are independent for all x1, x2, ..., xN. If X and Y are independent and g, h: R → R, then g(X) and h(Y) are independent also. X1, X2, ..., XN are independent iff

    fX1,X2,...,XN(x1, x2, ..., xN) = fX1(x1) fX2(x2) ... fXN(xN)

for all x1, x2, ..., xN ∈ R.

Expectation [1, 3.3]. The expectation of the RV X with PMF fX is defined as

    E[X] := Σ_{x: fX(x)>0} x fX(x)

whenever the sum is absolutely convergent. The expectation of random vectors is defined element-wise.
- If X has PMF fX and g: R → R, then E[g(X)] = Σ_x g(x) fX(x) whenever the sum is absolutely convergent.
- If X ≥ 0 then E[X] ≥ 0.
- If a, b ∈ R then E[aX + bY] = a E[X] + b E[Y] (linearity).
- The random variable 1, taking the value 1 always, has expectation E[1] = 1.
- If X and Y are independent, then E[XY] = E[X] E[Y].

Sums of discrete RVs [1, 3.8]. The probability of the sum of two RVs X and Y having joint PMF fX,Y is given by

    P(X + Y = z) = Σ_x fX,Y(x, z − x).

If X and Y are independent, then

    fX+Y(z) = P(X + Y = z) = Σ_x fX(x) fY(z − x).

Continuous Random Variables

Density Functions [1, 4.1, 4.5]. If X is a continuous RV, its CDF FX(x) = P(X ≤ x) can be expressed as

    FX(x) = ∫_{−∞}^x fX(u) du.

The function fX is called the (probability) density function of the continuous RV X.
- ∫_{−∞}^∞ fX(x) dx = 1.
- P(X = x) = 0 for all x ∈ R.
- P(a ≤ X ≤ b) = ∫_a^b fX(x) dx.
The random variables X and Y are jointly continuous with joint PDF fX,Y: R² → [0, ∞) if

    FX,Y(x, y) = ∫_{−∞}^y ∫_{−∞}^x fX,Y(u, v) du dv

for each x, y ∈ R. If FX,Y is sufficiently differentiable then

    fX,Y(x, y) = ∂²/∂x∂y FX,Y(x, y).

The marginal densities are given as

    fX(x) = ∫ fX,Y(x, y) dy,   fY(y) = ∫ fX,Y(x, y) dx.

Conditional Distributions. The conditional CDF of X given Y = y is defined as

    FX|Y(x | y) = ∫_{−∞}^x fX,Y(u, y)/fY(y) du

for any y such that fY(y) > 0. The conditional density function is given by

    fX|Y(x | y) = fX,Y(x, y) / fY(y)

for any y such that fY(y) > 0.

Expectation [1, 4.3]. The expectation of a continuous RV X is

    E[X] := ∫ x fX(x) dx

whenever the integral exists.
- If X and g(X) are continuous random variables then E[g(X)] = ∫ g(x) fX(x) dx.
- If X has PDF fX with fX(x) = 0 when x < 0, and distribution function FX, then E[X] = ∫_0^∞ (1 − FX(x)) dx.
- If g: R^N → R is an F-measurable function then E[g(X1, X2, ..., XN)] equals Σ g(x1, ..., xN) fX1,...,XN(x1, ..., xN) in the discrete case (whenever the sum is absolutely convergent), and ∫ g(x1, ..., xN) fX1,...,XN(x1, ..., xN) dx1 ... dxN in the jointly continuous case.
- The expectation is linear: E[aX + bY] = a E[X] + b E[Y], for all a, b ∈ C.

Functions of Random Variables [1, 4.7, 4.8]. Let X1 and X2 be RVs with joint density function fX1,X2, and let T: (x1, x2) → (y1, y2) be a one-to-one mapping taking some domain D ⊆ R² onto some range R ⊆ R². If g: R² → R and T maps the set A ⊆ D onto the set B ⊆ R, then

    ∬_A g(x1, x2) dx1 dx2 = ∬_B g(x1(y1, y2), x2(y1, y2)) |J(y1, y2)| dy1 dy2,

where J denotes the Jacobian of the transform,

    J(y1, y2) = det [∂x1/∂y1  ∂x2/∂y1; ∂x1/∂y2  ∂x2/∂y2].

Then the pair Y1, Y2 given by (Y1, Y2) = T(X1, X2) has joint density function

    fY1,Y2(y1, y2) = fX1,X2(x1(y1, y2), x2(y1, y2)) |J(y1, y2)|

if (y1, y2) ∈ R(T), and 0 otherwise.

If the transformation is not one-to-one but piecewise one-to-one and sufficiently smooth, the more general transformation rule is the following: let I1, ..., IN be intervals which partition R, and suppose Y = g(X) is strictly monotone and continuously differentiable on every In. For each n, the function g: In → R is invertible on g(In), and we write hn for the inverse function. Then

    fY(y) = Σ_{n=1}^N fX(hn(y)) |hn′(y)|,

with the convention that the nth summand is 0 if hn is not defined at y.

If X and Y have joint density function fX,Y, then the sum of RVs X + Y has density function

    fX+Y(z) = ∫ fX,Y(x, z − x) dx.
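The partition theorem and Bayes' rule can be illustrated with a small numerical sketch. The two-hypothesis channel below (priors and likelihoods are made-up numbers for illustration, not from the text) computes P(A) over a partition and then inverts it:

```python
# Hypothetical binary channel: B1 = "bit 0 sent", B2 = "bit 1 sent"
# partition the sample space; A = "bit 0 received". All numbers are
# illustrative assumptions.
priors = {"B1": 0.6, "B2": 0.4}        # P(B_i); must sum to 1
likelihoods = {"B1": 0.9, "B2": 0.2}   # P(A | B_i)

# Partition theorem: P(A) = sum_i P(A | B_i) P(B_i)
p_A = sum(likelihoods[b] * priors[b] for b in priors)

# Bayes' rule: P(B_i | A) = P(A | B_i) P(B_i) / P(A)
posteriors = {b: likelihoods[b] * priors[b] / p_A for b in priors}

print(p_A)               # 0.9*0.6 + 0.2*0.4 = 0.62
print(posteriors["B1"])  # 0.54/0.62 ≈ 0.871
```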
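The discrete convolution formula fX+Y(z) = Σ_x fX(x) fY(z − x) can be checked by direct enumeration; a minimal sketch using two fair dice (the dice are an example of mine, not from the text):

```python
from itertools import product

# PMF of a fair six-sided die
die = {k: 1 / 6 for k in range(1, 7)}

def pmf_sum(fX, fY):
    """fX+Y(z) = sum_x fX(x) fY(z - x) for independent X and Y."""
    support = range(min(fX) + min(fY), max(fX) + max(fY) + 1)
    return {z: sum(fX[x] * fY.get(z - x, 0.0) for x in fX) for z in support}

conv = pmf_sum(die, die)

# Brute-force check: enumerate the joint outcomes directly.
brute = {}
for x, y in product(die, die):
    brute[x + y] = brute.get(x + y, 0.0) + die[x] * die[y]

assert all(abs(conv[z] - brute[z]) < 1e-12 for z in conv)
print(conv[7])  # 6/36, the most likely total of two dice
```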
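The piecewise-monotone transformation rule can likewise be verified on a standard case: for Y = X² with X ~ N(0, 1), the two branches h1(y) = √y and h2(y) = −√y give fY(y) = (fX(√y) + fX(−√y))/(2√y), which should coincide with the chi-square density with one degree of freedom. A minimal sketch (the test points are arbitrary):

```python
import math

# Standard normal density
def f_X(x):
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

# Piecewise rule for Y = X^2: branches h1(y) = sqrt(y), h2(y) = -sqrt(y),
# each with |hn'(y)| = 1/(2 sqrt(y)).
def f_Y(y):
    if y <= 0:
        return 0.0
    root = math.sqrt(y)
    return (f_X(root) + f_X(-root)) / (2 * root)

# Chi-square density with N = 1 degree of freedom
def chi2_1(y):
    return y ** -0.5 * math.exp(-y / 2) / (math.gamma(0.5) * 2 ** 0.5)

for y in (0.1, 1.0, 2.5, 7.0):
    assert abs(f_Y(y) - chi2_1(y)) < 1e-12
print("piecewise rule matches the chi-square(1) density")
```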
Table of Common Distributions
(PDF or PMF, support, mean, variance, skewness, and characteristic function φX(t) = E[e^{itX}], where available)

Poisson(λ): f(k) = λ^k e^{−λ}/k!, k = 0, 1, 2, ...; mean λ; variance λ; skewness λ^{−1/2}; φ(t) = exp(λ(e^{it} − 1)).

Continuous uniform on [a, b]: f(x) = 1/(b − a); mean (a + b)/2; variance (b − a)²/12; skewness 0; φ(t) = (e^{ibt} − e^{iat})/(it(b − a)).

Exponential(λ): f(x) = λ e^{−λx}, x ∈ [0, ∞), λ > 0; mean 1/λ; variance 1/λ²; skewness 2; φ(t) = λ/(λ − it).

Normal N(μ, σ²): f(x) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²)), x ∈ R; mean μ; variance σ²; skewness 0; φ(t) = exp(iμt − σ²t²/2).

Multivariate normal N(μ, K): f(x) = exp(−½ (x − μ)ᵀ K⁻¹ (x − μ)) / √((2π)ⁿ det K), x ∈ Rⁿ; mean μ; covariance K; φ(t) = exp(i tᵀμ − ½ tᵀKt).

Cauchy C(α, β): f(x) = (β/π)/(β² + (x − α)²), x ∈ R; mean and variance do not exist; φ(t) = e^{iαt − β|t|}.

Rayleigh(σ): f(x) = (x/σ²) e^{−x²/(2σ²)}, x ∈ [0, ∞); mean σ√(π/2); variance (2 − π/2)σ²; skewness 2√π (π − 3)/(4 − π)^{3/2}.

Rice: f(x) = (x/σ²) e^{−(x² + a²)/(2σ²)} I0(ax/σ²), x ∈ [0, ∞); with r = a²/(2σ²), mean σ√(π/2) e^{−r/2} [(1 + r) I0(r/2) + r I1(r/2)]; variance 2σ² + a² − (E[X])².

Log-normal(μ, σ²): f(x) = (1/(x√(2πσ²))) exp(−(ln x − μ)²/(2σ²)), x ∈ (0, ∞); mean e^{μ + σ²/2}; variance e^{2μ + σ²}(e^{σ²} − 1); skewness (e^{σ²} + 2)√(e^{σ²} − 1).

Central chi-square χ²_N: f(x) = x^{N/2 − 1} e^{−x/2} / (Γ(N/2) 2^{N/2}), x ∈ [0, ∞); mean N; variance 2N; skewness 2√(2/N); φ(t) = (1 − i2t)^{−N/2}.

Non-central chi-square χ²_N(λ): f(x) = ½ e^{−(x + λ)/2} (x/λ)^{N/4 − 1/2} I_{N/2−1}(√(λx)), x ∈ [0, ∞); mean λ + N; variance 2(2λ + N); skewness 2^{3/2}(3λ + N)/(2λ + N)^{3/2}.

Weibull(λ, k): f(x) = (k/λ)(x/λ)^{k−1} e^{−(x/λ)^k}, x ∈ [0, ∞); mean λ Γ(1 + 1/k); variance λ² [Γ(1 + 2/k) − Γ²(1 + 1/k)]; skewness [Γ(1 + 3/k) − 3Γ(1 + 1/k)Γ(1 + 2/k) + 2Γ³(1 + 1/k)] / [Γ(1 + 2/k) − Γ²(1 + 1/k)]^{3/2}.

Nakagami-m(m, Ω): f(x) = (2 m^m / (Γ(m) Ω^m)) x^{2m − 1} e^{−m x²/Ω}, x ∈ (0, ∞); mean (Γ(m + ½)/Γ(m)) √(Ω/m); variance Ω (1 − (1/m)(Γ(m + ½)/Γ(m))²).
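The table entries can be sanity-checked by simulation. For example, √(X² + Y²) with independent X, Y ~ N(0, σ²) is Rayleigh(σ), so its sample mean and variance should approach σ√(π/2) and (2 − π/2)σ². A minimal Monte Carlo sketch (sample size, seed, and σ are arbitrary choices):

```python
import math
import random

random.seed(1)
sigma = 2.0
n = 200_000

# R = sqrt(X^2 + Y^2) with X, Y ~ N(0, sigma^2) independent is Rayleigh(sigma)
samples = [math.hypot(random.gauss(0, sigma), random.gauss(0, sigma))
           for _ in range(n)]

mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n

print(mean, sigma * math.sqrt(math.pi / 2))   # both ≈ 2.507
print(var, (2 - math.pi / 2) * sigma ** 2)    # both ≈ 1.717
```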
References

[1] G. R. Grimmett and D. R. Stirzaker, Probability and Random Processes, 3rd ed. Oxford, U.K.: Oxford Univ. Press, 2001.

[2] A. Papoulis and S. U. Pillai, Probability, Random Variables and Stochastic Processes, 4th ed. Boston, MA, U.S.A.: McGraw-Hill, 2002.

[3] I. E. Telatar, "Mathematical preliminaries," lecture notes for Wireless Communication and Mobility, EPFL, Mar. 2002.

[4] R. G. Gallager, "Stochastic processes: a conceptual approach," EE226A class reader, University of California at Berkeley, Fall 2001.

[5] S. G. Wilson, Digital Modulation and Coding. Upper Saddle River, NJ, U.S.A.: Prentice Hall, 1996.