
Introduction to Statistics
Chapter: Properties of Random Variables

Jointly Distributed Random Variables

The joint probability mass function of X and Y is p(x, y) = P(X = x, Y = y).

The marginal probability mass function of X (and, in the continuous case, the marginal density of X):

$$p(x) = \sum_{y} p(x, y), \qquad f_X(x) = \int f(x, y)\,dy$$

The marginal probability mass function of Y:

$$p(y) = \sum_{x} p(x, y), \qquad f_Y(y) = \int f(x, y)\,dx$$
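As an illustration (an added sketch, not from the original slides), the marginal formulas amount to summing a joint pmf table over the other variable; the joint probabilities below are made up:

```python
# Minimal sketch: marginal pmfs from a joint pmf table.
import numpy as np

# p[i, j] = P(X = x_i, Y = y_j); rows index x, columns index y
p = np.array([[0.10, 0.20],
              [0.30, 0.40]])

p_x = p.sum(axis=1)  # p(x) = sum over y of p(x, y)
p_y = p.sum(axis=0)  # p(y) = sum over x of p(x, y)

print(p_x)  # [0.3 0.7]
print(p_y)  # [0.4 0.6]
```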

Expectation

$$E(X + Y) = \iint (x + y)\, f(x, y)\,dx\,dy$$
$$= \int x \left[ \int f(x, y)\,dy \right] dx + \int y \left[ \int f(x, y)\,dx \right] dy$$
$$= \int x f_X(x)\,dx + \int y f_Y(y)\,dy = E(X) + E(Y)$$

More generally,

$$E(X_1 + \cdots + X_n) = E(X_1) + \cdots + E(X_n)$$
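Note that no independence is needed for this; a quick simulation sketch (added here, with arbitrary distributions) makes that visible:

```python
# Sketch: E(X + Y) = E(X) + E(Y) holds even for dependent variables.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=2.0, size=100_000)
y = x + rng.normal(size=100_000)   # Y deliberately depends on X

print((x + y).mean())              # ~ E(X) + E(Y)
print(x.mean() + y.mean())         # same value up to rounding
```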

Independent R.V.

X and Y are independent if

$$p(x, y) = p(x)\,p(y) \;\text{(discrete case)}, \qquad f(x, y) = f(x)\,f(y) \;\text{(continuous case)}$$

For independent X and Y,

$$E(XY) = \iint xy\, f(x, y)\,dx\,dy = \iint xy\, f(x) f(y)\,dx\,dy = \int x f(x)\,dx \int y f(y)\,dy = E(X)\,E(Y)$$

It is possible to combine the preceding theorems to show that if the $a_i$ are constants and the $X_i$ are jointly distributed random variables, then

$$E\left( \sum_{i=1}^{k} a_i X_i \right) = \sum_{i=1}^{k} a_i\, E(X_i)$$

If X and Y are independent, then

$$E[g(X)h(Y)] = \iint g(x) h(y)\, f_1(x) f_2(y)\,dx\,dy = \int g(x) f_1(x)\,dx \int h(y) f_2(y)\,dy = E[g(X)]\, E[h(Y)]$$
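A Monte Carlo sketch of this factorization (added here; the choices of g, h, and the distributions are arbitrary, not from the text):

```python
# Sketch: E[g(X)h(Y)] = E[g(X)] E[h(Y)] when X and Y are independent.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000
x = rng.normal(size=n)      # X ~ N(0, 1), drawn independently of Y
y = rng.uniform(size=n)     # Y ~ U(0, 1)

g = lambda t: t**2
h = lambda t: np.cos(t)

print((g(x) * h(y)).mean())       # ~ E[g(X)h(Y)]
print(g(x).mean() * h(y).mean())  # ~ E[g(X)] E[h(Y)], should agree
```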

Example

As another example of the usefulness of the equation above, let us use it to obtain the expectation of a binomial random variable. Write X as a sum of independent indicator variables:

$$X_i = \begin{cases} 1, & \text{success} \\ 0, & \text{failure} \end{cases} \qquad E(X_i) = p, \quad V(X_i) = pq$$

$$X = X_1 + \cdots + X_n$$
$$E(X) = E(X_1) + \cdots + E(X_n) = np$$
$$V(X) = V(X_1) + \cdots + V(X_n) = npq \quad \text{(the trials are independent)}$$
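The indicator decomposition is easy to verify by simulation; a sketch (added here, with arbitrary n and p):

```python
# Sketch: a binomial r.v. as a sum of n independent Bernoulli(p) indicators.
import numpy as np

rng = np.random.default_rng(2)
n, p, trials = 20, 0.3, 200_000

# each row holds n indicator variables; row sums are Binomial(n, p)
x = (rng.random((trials, n)) < p).sum(axis=1)

print(x.mean(), n * p)            # ~ np = 6.0
print(x.var(), n * p * (1 - p))   # ~ npq = 4.2
```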

Example

At a party N men throw their hats into the center of a room. The hats are mixed up, and each man randomly selects one. Find the expected number of men that select their own hats.

Solution

Letting X denote the number of men that select their own hats, we can best compute E(X) by noting that X = X_1 + ... + X_N, where X_i is the indicator of the event that the ith man selects his own hat. Since each man is equally likely to end up with any of the N hats, P(X_i = 1) = 1/N, and so E(X_i) = 1/N. Hence E(X) = N * (1/N) = 1. So, no matter how many people are at the party, on average exactly one of the men will select his own hat.
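A small simulation of this example (added here; N = 10 is an arbitrary choice, and the point is that the mean does not depend on N):

```python
# Sketch: the expected number of fixed points of a random permutation is 1.
import numpy as np

rng = np.random.default_rng(3)
N, trials = 10, 100_000

matches = np.empty(trials)
for t in range(trials):
    perm = rng.permutation(N)                   # hat received by each man
    matches[t] = np.sum(perm == np.arange(N))   # men holding their own hat

print(matches.mean())  # ~ 1.0
```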

Covariance and Variance of Sums of Random Variables

The covariance of any two random variables X and Y, denoted by Cov(X, Y), is defined by

$$\mathrm{Cov}(X, Y) = E[(X - E[X])(Y - E[Y])] = E(XY) - E(X)E(Y) = \sigma_{XY}$$

If X and Y are independent, then Cov(X, Y) = 0 (though the converse does not hold in general).

Properties of Covariance

$$\mathrm{Cov}(X, X) = \mathrm{Var}(X)$$

$$\mathrm{Cov}(cX, dY) = E[c(X - E(X)) \cdot d(Y - E(Y))] = cd\,\mathrm{Cov}(X, Y)$$

$$\mathrm{Cov}(X, Y + Z) = E[X(Y + Z)] - E[X]E[Y + Z]$$
$$= E[XY] - E[X]E[Y] + E[XZ] - E[X]E[Z] = \mathrm{Cov}(X, Y) + \mathrm{Cov}(X, Z)$$

$$\mathrm{Cov}(X + Y, U + W) = \mathrm{Cov}(X + Y, U) + \mathrm{Cov}(X + Y, W)$$
$$= \mathrm{Cov}(X, U) + \mathrm{Cov}(Y, U) + \mathrm{Cov}(X, W) + \mathrm{Cov}(Y, W)$$

The last property easily generalizes to give

$$\mathrm{Cov}\left( \sum_i X_i,\; \sum_j Y_j \right) = \sum_i \sum_j \mathrm{Cov}(X_i, Y_j)$$

If X_1 and X_2 are random variables with joint pdf f(x_1, x_2), then

$$V(X_1 + X_2) = E\left[ \left( (X_1 - \mu_1) + (X_2 - \mu_2) \right)^2 \right]$$
$$= E\left[ (X_1 - \mu_1)^2 \right] + 2 E\left[ (X_1 - \mu_1)(X_2 - \mu_2) \right] + E\left[ (X_2 - \mu_2)^2 \right]$$
$$= V(X_1) + V(X_2) + 2\,\mathrm{Cov}(X_1, X_2)$$

For the more general case we can prove that

$$V(X_1 + X_2 + X_3) = V(X_1 + X_2) + V(X_3) + 2\,\mathrm{Cov}(X_1 + X_2, X_3)$$
$$= \sum_i V(X_i) + 2\,\mathrm{Cov}(X_1, X_2) + 2\,\mathrm{Cov}(X_1, X_3) + 2\,\mathrm{Cov}(X_2, X_3)$$
$$= \sum_i V(X_i) + 2 \sum_{i < j} \mathrm{Cov}(X_i, X_j)$$

Variance of a Sum of Variables

$$\mathrm{Var}\left( \sum_{i=1}^{n} X_i \right) = \mathrm{Cov}\left( \sum_{i=1}^{n} X_i,\; \sum_{j=1}^{n} X_j \right) = \sum_{i=1}^{n} \sum_{j=1}^{n} \mathrm{Cov}(X_i, X_j)$$
$$= \sum_{i=1}^{n} \mathrm{Cov}(X_i, X_i) + \sum_{i=1}^{n} \sum_{j \ne i} \mathrm{Cov}(X_i, X_j)$$
$$= \sum_{i=1}^{n} V(X_i) + 2 \sum_{i < j} \mathrm{Cov}(X_i, X_j)$$
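A numeric sketch of this identity using correlated Gaussian draws (added here; the covariance matrix is an arbitrary example):

```python
# Sketch: Var(sum X_i) = sum V(X_i) + 2 * sum_{i<j} Cov(X_i, X_j).
import numpy as np

rng = np.random.default_rng(4)
cov = np.array([[2.0, 0.5, 0.3],
                [0.5, 1.0, 0.2],
                [0.3, 0.2, 1.5]])
x = rng.multivariate_normal(np.zeros(3), cov, size=500_000)

lhs = x.sum(axis=1).var()
c = np.cov(x, rowvar=False)                    # sample covariance matrix
rhs = np.trace(c) + 2 * np.triu(c, k=1).sum()  # diag terms + twice the i<j terms

print(lhs, rhs)  # both ~ cov.sum() = 6.5
```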

Proposition

Suppose that X_1, ..., X_n are independent and identically distributed with expected value $\mu$ and variance $\sigma^2$, and let $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$. Then

$$\mathrm{Cov}(\bar{X},\, X_i - \bar{X}) = \mathrm{Cov}(\bar{X}, X_i) - \mathrm{Cov}(\bar{X}, \bar{X})$$
$$= \frac{1}{n}\,\mathrm{Cov}\left( X_i + \sum_{j \ne i} X_j,\; X_i \right) - \mathrm{Var}(\bar{X})$$
$$= \frac{1}{n}\,\mathrm{Cov}(X_i, X_i) + \frac{1}{n} \sum_{j \ne i} \mathrm{Cov}(X_j, X_i) - \mathrm{Var}(\bar{X})$$
$$= \frac{\sigma^2}{n} - \frac{\sigma^2}{n} = 0$$

That is, the sample mean is uncorrelated with each deviation from it.
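The proposition is also easy to check by simulation; a sketch (added here, with iid exponentials and n = 5 as arbitrary choices):

```python
# Sketch: Cov(Xbar, X_i - Xbar) ~ 0 for iid X_1, ..., X_n.
import numpy as np

rng = np.random.default_rng(5)
n, trials = 5, 400_000
x = rng.exponential(size=(trials, n))

xbar = x.mean(axis=1)
dev = x[:, 0] - xbar                 # deviation of X_1 from the sample mean

cov = np.mean(xbar * dev) - xbar.mean() * dev.mean()
print(cov)  # ~ 0
```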

Example

Sums of independent Poisson random variables: Let X and Y be independent Poisson random variables with respective means $\lambda_1$ and $\lambda_2$. Calculate the distribution of X + Y.

Solution: Since the event {X + Y = n} may be written as the union of the disjoint events {X = k, Y = n - k}, 0 <= k <= n, we have

$$P(X + Y = n) = \sum_{k=0}^{n} P(X = k,\, Y = n - k) = \sum_{k=0}^{n} P(X = k)\, P(Y = n - k)$$
$$= \sum_{k=0}^{n} e^{-\lambda_1} \frac{\lambda_1^{k}}{k!}\; e^{-\lambda_2} \frac{\lambda_2^{n-k}}{(n-k)!}$$
$$= \frac{e^{-(\lambda_1 + \lambda_2)}}{n!} \sum_{k=0}^{n} \frac{n!}{k!\,(n-k)!}\, \lambda_1^{k} \lambda_2^{n-k} = \frac{e^{-(\lambda_1 + \lambda_2)}\,(\lambda_1 + \lambda_2)^{n}}{n!}$$

That is, X + Y has a Poisson distribution with mean $\lambda_1 + \lambda_2$.
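A simulation sketch of this result (added here; the rates 2 and 3 are arbitrary): the empirical pmf of X + Y should match direct Poisson(l1 + l2) draws.

```python
# Sketch: a sum of independent Poissons is Poisson with the summed mean.
import numpy as np

rng = np.random.default_rng(6)
l1, l2, trials = 2.0, 3.0, 500_000

s = rng.poisson(l1, trials) + rng.poisson(l2, trials)
z = rng.poisson(l1 + l2, trials)   # direct Poisson(l1 + l2) draws

print(s.mean(), z.mean())          # both ~ 5.0
print(np.bincount(s, minlength=12)[:12] / trials)
print(np.bincount(z, minlength=12)[:12] / trials)  # near-identical pmfs
```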

If X and Y are random variables with variances $\sigma_X^2$, $\sigma_Y^2$ and covariance $\sigma_{XY}$, then the correlation coefficient between X and Y is

$$\rho_{XY} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}$$
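A short sketch estimating $\rho$ from samples (added here; the linear model is an arbitrary example chosen so that the true correlation is 0.8):

```python
# Sketch: correlation coefficient rho = sigma_XY / (sigma_X * sigma_Y).
import numpy as np

rng = np.random.default_rng(7)
x = rng.normal(size=100_000)
y = 0.8 * x + 0.6 * rng.normal(size=100_000)   # Var(Y) = 0.64 + 0.36 = 1

sigma_xy = np.cov(x, y)[0, 1]
rho = sigma_xy / (x.std() * y.std())

print(rho)                      # ~ 0.8
print(np.corrcoef(x, y)[0, 1])  # same value via the built-in
```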
