Multivariate Probability Distributions: Part II
Cyr Emile M'lan, Ph.D.
mlan@stat.uconn.edu
Introduction
Text Reference: Introduction to Probability and Its
Applications, Chapter 6.
Reading Assignment: Sections 6.4-6.5, April
13-April 15
If X and Y are discrete random variables with joint probability function p(x, y), then the expected value of g(X, Y) is

E[g(X, Y)] = \sum_{x} \sum_{y} g(x, y)\, p(x, y)

(The sum is in fact over all values of (x, y) for which p(x, y) > 0.)

If X and Y are continuous random variables with joint density function f(x, y), then the expected value of g(X, Y) is

E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f(x, y)\, dx\, dy

(The double integral is in fact over all values of (x, y) for which f(x, y) > 0.)

If g(X, Y) = XY, we have

E[XY] = \sum_{x} \sum_{y} x y\, p(x, y), if discrete

E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x y\, f(x, y)\, dx\, dy, if continuous
If g(X, Y) = X^k, we have

E[X^k] = \sum_{x} \sum_{y} x^k\, p(x, y), if discrete

E[X^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^k\, f(x, y)\, dx\, dy, if continuous

If g(X, Y) = Y^k, we have

E[Y^k] = \sum_{x} \sum_{y} y^k\, p(x, y), if discrete

E[Y^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} y^k\, f(x, y)\, dx\, dy, if continuous
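As a quick sanity check (a sketch, not part of the slides; the joint pmf below is made up for illustration), E[X] computed by summing x · p(x, y) over the whole joint table agrees with E[X] computed from the marginal distribution of X:

```python
# Made-up joint pmf for (X, Y); dyadic values chosen only for illustration.
p = {(0, 0): 0.125, (0, 1): 0.375, (1, 0): 0.25, (1, 1): 0.25}

# E[X] directly from the joint table: sum of x * p(x, y) over all pairs.
ex_joint = sum(x * pr for (x, y), pr in p.items())

# E[X] from the marginal of X: first collapse the table over y.
pX = {}
for (x, y), pr in p.items():
    pX[x] = pX.get(x, 0.0) + pr
ex_marginal = sum(x * pr for x, pr in pX.items())

print(ex_joint, ex_marginal)  # both 0.5
```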
f(x, y) = \begin{cases} 2(1 - x), & \text{if } 0 \le x \le 1,\; 0 \le y \le 1, \\ 0, & \text{otherwise.} \end{cases}
Using the beta function,

\int_0^1 x^{2-1} (1 - x)^{2-1}\, dx = B(2, 2) = \frac{\Gamma(2)\, \Gamma(2)}{\Gamma(4)} = \frac{1!\, 1!}{3!} = \frac{1}{6}
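The beta-function identity above can be checked numerically; the sketch below (not part of the slides) compares a midpoint Riemann sum for the integral of x(1 − x) on [0, 1] against \Gamma(2)\Gamma(2)/\Gamma(4):

```python
from math import gamma

def beta_midpoint(a, b, n=10000):
    """Midpoint Riemann sum for B(a, b) = integral of x^(a-1) (1-x)^(b-1) over [0, 1]."""
    h = 1.0 / n
    return sum(((i + 0.5) * h) ** (a - 1) * (1.0 - (i + 0.5) * h) ** (b - 1)
               for i in range(n)) * h

approx = beta_midpoint(2, 2)             # numerical value of the integral
exact = gamma(2) * gamma(2) / gamma(4)   # = 1! 1! / 3! = 1/6
print(approx, exact)                     # both very close to 1/6
```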
E[c_1 g_1(X, Y) + c_2 g_2(X, Y) + \cdots + c_k g_k(X, Y)] = c_1 E[g_1(X, Y)] + c_2 E[g_2(X, Y)] + \cdots + c_k E[g_k(X, Y)]

In particular,

E[X + Y] = E[X] + E[Y]

E[X - Y] = E[X] - E[Y]

E[c\, g(X, Y)] = c\, E[g(X, Y)]
f(x, y) = \begin{cases} 24 x y, & \text{if } x \ge 0,\; y \ge 0,\; x + y \le 1, \\ 0, & \text{otherwise.} \end{cases}
E[1 + X + 2Y] = 1 + E[X] + 2 E[Y]

= 1 + 24 \int_0^1 \int_0^{1-x} x (x y)\, dy\, dx + 48 \int_0^1 \int_0^{1-x} y (x y)\, dy\, dx

= 1 + 12 \int_0^1 x^2 (1 - x)^2\, dx + 16 \int_0^1 x (1 - x)^3\, dx

= 1 + 12 B(3, 3) + 16 B(2, 4)

= 1 + \frac{2}{5} + \frac{4}{5} = \frac{11}{5}
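The value 11/5 can be confirmed by crude numerical integration over the triangle; the midpoint-grid sketch below is for checking only (not part of the slides):

```python
# Midpoint Riemann sum for E[g(X, Y)] under f(x, y) = 24xy on the
# triangle x >= 0, y >= 0, x + y <= 1.
def expect(g, n=1000):
    h = 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        for j in range(n):
            y = (j + 0.5) * h
            if x + y <= 1.0:          # keep only grid points inside the triangle
                total += g(x, y) * 24.0 * x * y * h * h
    return total

mass = expect(lambda x, y: 1.0)             # total probability, close to 1
e_val = expect(lambda x, y: 1 + x + 2 * y)  # close to 11/5 = 2.2
print(mass, e_val)
```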
Joke
Theorem 6.6:
Let X and Y be independent random variables and g(X) and
h(Y ) be functions of only X and Y , respectively. Then
E[g(X)\, h(Y)] = E[g(X)]\, E[h(Y)]
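A tiny discrete check of Theorem 6.6 (a sketch, with made-up pmfs chosen only for illustration): under independence the joint pmf factors, so E[g(X)h(Y)] matches E[g(X)] E[h(Y)]:

```python
# Made-up marginal pmfs for independent X and Y.
pX = {0: 0.3, 1: 0.7}
pY = {1: 0.4, 2: 0.6}
g = lambda x: x * x + 1   # any function of X alone
h = lambda y: 2 * y       # any function of Y alone

# E[g(X) h(Y)] under the product pmf p(x, y) = pX(x) pY(y).
lhs = sum(g(x) * h(y) * px * py for x, px in pX.items() for y, py in pY.items())
# E[g(X)] * E[h(Y)].
rhs = sum(g(x) * px for x, px in pX.items()) * sum(h(y) * py for y, py in pY.items())

print(lhs, rhs)  # equal (both 5.44, up to float rounding)
```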
Cov(X, Y) = E[(X - \mu_X)(Y - \mu_Y)]

= \sum_{x} \sum_{y} (x - \mu_X)(y - \mu_Y)\, p(x, y), if X and Y are discrete

= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - \mu_X)(y - \mu_Y)\, f(x, y)\, dx\, dy, if X and Y are continuous
Cov(X, Y) = E[XY] - E[X]\, E[Y]
Solution:

E[XY] =

E[X] = 100(.50) + 250(.50) = 175

E[Y] = 0(.25) + 100(.25) + 200(.50) = 125
Solution:

E[X] = \frac{2}{5} = E[Y]

E[XY] = 24 \int_0^1 \int_0^{1-x} x^2 y^2\, dy\, dx = 8 \int_0^1 x^2 (1 - x)^3\, dx = 8 B(3, 4) = \frac{2}{15}

Cov(X, Y) = \frac{2}{15} - \frac{2}{5} \cdot \frac{2}{5} = -\frac{2}{75}
and we have

\rho = Corr(X, Y) = \frac{Cov(X, Y)}{\sqrt{Var(X)\, Var(Y)}}

with

-1 \le Corr(X, Y) \le +1
Solution:

We have E[X], Var(X), E[Y], and Var(Y).
Solution:

E[X^2] = 24 \int_0^1 \int_0^{1-x} x^3 y\, dy\, dx = \frac{1}{5} = E[Y^2]

Var(X) = \frac{1}{5} - \left( \frac{2}{5} \right)^2 = \frac{1}{25} = Var(Y)

Corr(X, Y) = \frac{-2/75}{\sqrt{\frac{1}{25} \cdot \frac{1}{25}}} = -\frac{2}{3}
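Moments of this density have a closed form via the beta function, which gives a quick way to re-derive Cov(X, Y) = -2/75 and Corr(X, Y) = -2/3; the helper below is a sketch (not part of the slides), obtained by integrating out y first:

```python
from math import gamma, sqrt

def moment(a, b):
    """E[X^a Y^b] for f(x, y) = 24xy on the triangle x, y >= 0, x + y <= 1.
    Integrating y first: 24 * integral of x^(a+1) (1-x)^(b+2) / (b+2) dx
                       = 24 * B(a+2, b+3) / (b+2)."""
    return 24.0 / (b + 2) * gamma(a + 2) * gamma(b + 3) / gamma(a + b + 5)

ex, ex2, exy = moment(1, 0), moment(2, 0), moment(1, 1)  # 2/5, 1/5, 2/15
cov = exy - ex * ex           # -2/75
var = ex2 - ex * ex           # 1/25; Var(Y) = Var(X) by symmetry of the density
corr = cov / sqrt(var * var)
print(corr)                   # -2/3, up to float rounding
```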
Theorem 6.7:
Let Y_1, Y_2, \ldots, Y_n and X_1, X_2, \ldots, X_m be random variables. For any constants a_1, a_2, \ldots, a_n and b_1, b_2, \ldots, b_m, the following hold:

(a) E\left[ \sum_{i=1}^{n} a_i Y_i \right] = \sum_{i=1}^{n} a_i E(Y_i)

(b) Var\left( \sum_{i=1}^{n} a_i Y_i \right) = \sum_{i=1}^{n} a_i^2 Var(Y_i) + 2 \sum_{1 \le i < j \le n} a_i a_j Cov(Y_i, Y_j)

(c) Cov\left( \sum_{i=1}^{n} a_i Y_i, \sum_{j=1}^{m} b_j X_j \right) = \sum_{i=1}^{n} \sum_{j=1}^{m} a_i b_j Cov(Y_i, X_j)

In particular, when Y_1, Y_2, \ldots, Y_n are independent,

Var\left( \sum_{i=1}^{n} a_i Y_i \right) = \sum_{i=1}^{n} a_i^2 Var(Y_i)
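Theorem 6.7(b) with n = 2 can be verified by brute force on a small made-up joint pmf (a sketch, not part of the slides):

```python
# Made-up joint pmf for (Y1, Y2) with dependence, and arbitrary constants.
p = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}
a, b = 2.0, -3.0

def E(g):
    """Expectation of g(y1, y2) under the joint pmf p."""
    return sum(g(y1, y2) * pr for (y1, y2), pr in p.items())

m1, m2 = E(lambda y1, y2: y1), E(lambda y1, y2: y2)
v1 = E(lambda y1, y2: (y1 - m1) ** 2)
v2 = E(lambda y1, y2: (y2 - m2) ** 2)
cov = E(lambda y1, y2: (y1 - m1) * (y2 - m2))

# Left side: Var(a Y1 + b Y2) computed directly from the definition.
mean_lin = a * m1 + b * m2
lhs = E(lambda y1, y2: (a * y1 + b * y2 - mean_lin) ** 2)
# Right side: a^2 Var(Y1) + b^2 Var(Y2) + 2ab Cov(Y1, Y2).
rhs = a * a * v1 + b * b * v2 + 2 * a * b * cov

print(lhs, rhs)  # equal up to float rounding
```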
Solution:

E[\bar{X}] = E\left[ \frac{1}{n} \sum_{i=1}^{n} X_i \right] = \frac{1}{n} \sum_{i=1}^{n} E(X_i) = \frac{1}{n} \cdot n \mu = \mu

Var(\bar{X}) = \frac{1}{n^2} \sum_{i=1}^{n} Var(X_i) = \frac{1}{n^2} \cdot n \sigma^2 = \frac{\sigma^2}{n}
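The two results above can be checked exactly for a small case by enumerating all outcomes of n iid draws (a sketch, not from the slides; the single-draw pmf is made up):

```python
from itertools import product

pX = {0: 0.25, 1: 0.5, 3: 0.25}                          # made-up pmf of one draw
mu = sum(x * p for x, p in pX.items())                   # population mean
sigma2 = sum((x - mu) ** 2 * p for x, p in pX.items())   # population variance
n = 3                                                    # sample size

e_bar = 0.0
e_bar2 = 0.0
for sample in product(pX, repeat=n):     # every possible n-tuple of outcomes
    pr = 1.0
    for x in sample:                     # independence: probabilities multiply
        pr *= pX[x]
    xbar = sum(sample) / n
    e_bar += xbar * pr
    e_bar2 += xbar ** 2 * pr

var_bar = e_bar2 - e_bar ** 2
print(e_bar, mu)             # E of the sample mean equals mu
print(var_bar, sigma2 / n)   # Var of the sample mean equals sigma^2 / n
```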
Joke