
LECTURE 11

Derived distributions; convolution; covariance and correlation

Readings: Finish Section 4.1; Section 4.2


A general formula

Let Y = g(X), with g strictly monotonic.

[Figure: plot of y = g(x); a small interval [x, x + δ] on the x-axis maps to the interval [y, y + δ · slope] on the y-axis, where slope = (dg/dx)(x) and y = g(x).]

The event {x ≤ X ≤ x + δ} is the same as {g(x) ≤ Y ≤ g(x + δ)},
or (approximately) {g(x) ≤ Y ≤ g(x) + δ |(dg/dx)(x)|}.

Hence,

    f_X(x) = f_Y(y) |(dg/dx)(x)|,   where y = g(x)


Example

f_{X,Y}(x, y) = 1 on the unit square. Find the PDF of Z = g(X, Y) = Y/X.

    F_Z(z) = z/2,           if 0 ≤ z ≤ 1
    F_Z(z) = 1 − 1/(2z),    if z > 1
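For X, Y independent and uniform on [0, 1] with Z = Y/X, the CDF is F_Z(z) = z/2 for z ≤ 1 and 1 − 1/(2z) for z > 1. A quick Monte Carlo sketch checks this (the seed and sample size are my own choices, not from the lecture):

```python
import numpy as np

# X, Y independent and uniform on [0, 1]; Z = Y / X
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(size=n)
y = rng.uniform(size=n)
z = y / x

# Compare the empirical CDF with F_Z(z) = z/2 (z <= 1) and 1 - 1/(2z) (z > 1)
for z0 in (0.5, 1.0, 2.0):
    empirical = np.mean(z <= z0)
    formula = z0 / 2 if z0 <= 1 else 1 - 1 / (2 * z0)
    print(f"z = {z0}: empirical {empirical:.4f}, formula {formula:.4f}")
```

The empirical values agree with the two-branch formula to within Monte Carlo error.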
The distribution of X + Y: the discrete case

W = X + Y;  X, Y independent

[Figure: in the (x, y) plane, the event {X + Y = w} is the line x + y = w; for w = 3 it passes through the points (0,3), (1,2), (2,1), (3,0).]

    p_W(w) = P(X + Y = w)
           = Σ_x P(X = x) P(Y = w − x)
           = Σ_x p_X(x) p_Y(w − x)

Mechanics:
- Put the pmfs on top of each other
- Flip the pmf of Y
- Shift the flipped pmf by w (to the right if w > 0)
- Cross-multiply and add


The continuous case

W = X + Y;  X, Y independent

    f_{W|X}(w | x) = f_Y(w − x)

    f_{W,X}(w, x) = f_X(x) f_{W|X}(w | x) = f_X(x) f_Y(w − x)

    f_W(w) = ∫ f_X(x) f_Y(w − x) dx
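The convolution sum p_W(w) = Σ_x p_X(x) p_Y(w − x) is exactly what np.convolve computes. A minimal sketch, with made-up pmfs for X and Y (the supports and probabilities are illustrative, not from the slides):

```python
import numpy as np

# pmf of X on {0, 1, 2, 3} and pmf of Y on {0, 1, 2, 3} (made-up values)
p_x = np.array([0.1, 0.2, 0.3, 0.4])
p_y = np.array([0.25, 0.25, 0.25, 0.25])

# p_W(w) = sum_x p_X(x) p_Y(w - x): the pmf of W = X + Y on {0, ..., 6}
p_w = np.convolve(p_x, p_y)

print(p_w)        # 7 entries, one per value of w
print(p_w.sum())  # total probability; should be 1
```

np.convolve performs the same "flip, shift, cross-multiply, add" mechanics described above.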
Two independent normal r.v.'s

X ~ N(μ_x, σ_x²),  Y ~ N(μ_y, σ_y²),  independent

    f_{X,Y}(x, y) = f_X(x) f_Y(y)
                  = (1 / (2π σ_x σ_y)) exp{ −(x − μ_x)² / (2σ_x²) − (y − μ_y)² / (2σ_y²) }

The PDF is constant on the ellipse where

    (x − μ_x)² / σ_x² + (y − μ_y)² / σ_y²

is constant. The ellipse is a circle when σ_x = σ_y.


The sum of independent normal r.v.'s

X ~ N(0, σ_x²),  Y ~ N(0, σ_y²),  independent

Let W = X + Y

    f_W(w) = ∫ f_X(x) f_Y(w − x) dx
           = ∫ (1 / (2π σ_x σ_y)) e^{−x²/(2σ_x²)} e^{−(w−x)²/(2σ_y²)} dx
    (algebra)
           = c e^{−w²/(2(σ_x² + σ_y²))}

Conclusion: W is normal, with mean 0 and variance σ_x² + σ_y²
(same argument for the nonzero-mean case)
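The conclusion that the sum of independent normals is normal with variances adding can be checked by simulation. A sketch with arbitrary values σ_x = 1, σ_y = 2 (my own choices for the check):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma_x, sigma_y = 1.0, 2.0   # arbitrary values for the check
n = 1_000_000

x = rng.normal(0.0, sigma_x, size=n)
y = rng.normal(0.0, sigma_y, size=n)
w = x + y                     # W = X + Y

# W should be approximately N(0, sigma_x^2 + sigma_y^2) = N(0, 5)
print(w.mean(), w.var())
```

The sample mean comes out near 0 and the sample variance near σ_x² + σ_y² = 5, as the derivation predicts.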
Covariance

    cov(X, Y) = E[ (X − E[X]) (Y − E[Y]) ]

Zero-mean case: cov(X, Y) = E[XY]

In general:

    cov(X, Y) = E[XY] − E[X] E[Y]

independent ⟹ cov(X, Y) = 0   (converse is not true)

    var( Σ_{i=1}^n X_i ) = Σ_{i=1}^n var(X_i) + Σ_{(i,j): i≠j} cov(X_i, X_j)


Correlation coefficient

Dimensionless version of covariance:

    ρ = E[ (X − E[X]) (Y − E[Y]) / (σ_X σ_Y) ]

    −1 ≤ ρ ≤ 1

[Figure: scatter plots of sample pairs (x, y) illustrating different values of ρ.]

    |ρ| = 1  ⟺  (X − E[X]) = c (Y − E[Y])   (linearly related)

independent ⟹ ρ = 0   (converse is not true)
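Both cov(X, Y) and ρ are easy to compute empirically. A minimal NumPy sketch, where the linear model y = 2x + noise is my own illustration (here cov(X, Y) = 2·var(X) = 2 and ρ = 2/√5 ≈ 0.894):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)   # Y depends linearly on X, plus noise

# cov(X, Y) = E[(X - E[X])(Y - E[Y])]; theoretically 2 * var(X) = 2 here
cov = np.mean((x - x.mean()) * (y - y.mean()))

# Equivalent form: cov(X, Y) = E[XY] - E[X] E[Y]
cov_alt = np.mean(x * y) - x.mean() * y.mean()

# Dimensionless correlation coefficient, always in [-1, 1]
rho = cov / (x.std() * y.std())

print(cov, cov_alt, rho)
```

Because Y is X plus independent noise rather than an exact linear function of X, ρ is large but strictly less than 1.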

MIT OpenCourseWare
http://ocw.mit.edu
6.041 / 6.431 Probabilistic Systems Analysis and Applied Probability
Fall 2010
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
