Lesson 5
Chapter 4: Jointly Distributed Random
Variables
Michael Akritas
Department of Statistics
The Pennsylvania State University
Outline
The Joint Probability Mass Function
Mean Value of Functions of Random Variables
Quantifying Dependence
Models for Joint Distributions
Motivation
When we record more than one characteristic from each
population unit, the outcome variable is multivariate, e.g.,
Y = age of a tree, X = the tree's diameter at breast height.
Y = propagation of an ultrasonic wave, X = tensile
strength of the substance.
Of interest here is not only each variable separately, but also to
a) quantify the relationship between them, and b) be able to
predict one from the other. The joint distribution (i.e., the joint
pmf or joint pdf) forms the basis for doing so.
The joint pmf satisfies
$\sum_{x}\sum_{y} p(x, y) = 1.$
Example
The joint pmf of X = the amount of drug administered to a
randomly selected rat, and Y = the number of tumors the rat
develops, is:

                           y
     p(x, y)        0      1      2
  x  0.0 mg/kg    .388   .009   .003
     1.0 mg/kg    .485   .010   .005
     2.0 mg/kg    .090   .008   .002

Thus 48.5% of the rats will receive the 1.0 mg/kg dose and will
develop 0 tumors, while 40% of the rats will receive the 0.0 mg/kg
dose (.388 + .009 + .003 = .400).
  Duration
    > 3:   0.25   0.10   0.07
    < 3:   0.30   0.15   0.13
                         y
     p(x, y)      0      1      2     pX(x)
  x  0 mg/kg    .388   .009   .003    .400
     1 mg/kg    .485   .010   .005    .500
     2 mg/kg    .090   .008   .002    .100
     pY(y)      .963   .027   .010   1.000
$p_Y(y) = \sum_{x \in S_X} p(x, y).$
$p_{Y|X}(y|x) = \frac{P(X = x, Y = y)}{P(X = x)} = \frac{p(x, y)}{p_X(x)}.$
   y                  0                1                2
   pY|X=0(y)   .388/.4 = .97   .009/.4 = .0225   .003/.4 = .0075
   pY|X=2(y)   .090/.1 = .9    .008/.1 = .08     .002/.1 = .02
Example
Find the conditional expected value and variance of the number
of tumors when X = 2.
Solution. Using the conditional pmf that we found before, we
have
$E(Y|X = 2) = 0(.9) + 1(.08) + 2(.02) = .12.$
Compare this with E(Y) = .047. Next,
$E(Y^2|X = 2) = 0^2(.9) + 1^2(.08) + 2^2(.02) = .16,$
so that $\sigma^2_{Y|X}(2) = .16 - .12^2 = .1456.$
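The conditional mean and variance above can be recomputed directly from the conditional pmf; a minimal sketch:

```python
# Recomputing E(Y | X = 2) and Var(Y | X = 2) from the conditional pmf
# p_{Y|X}(y|2) found in the example.
cond_pmf = {0: 0.90, 1: 0.08, 2: 0.02}

mean = sum(y * p for y, p in cond_pmf.items())               # E(Y | X = 2)
second_moment = sum(y**2 * p for y, p in cond_pmf.items())   # E(Y^2 | X = 2)
variance = second_moment - mean**2                           # Var(Y | X = 2)

print(round(mean, 4))      # 0.12
print(round(variance, 4))  # 0.1456
```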
Proposition
1. $p_Y(y) = \sum_{x \in S_X} p_{Y|X}(y|x)\,p_X(x).$
Example
In the example where X = amount of drug administered and
Y = number of tumors developed we have:

   y              0       1       2
   pY|X=0(y)    .97     .0225   .0075
   pY|X=1(y)    .97     .02     .01
   pY|X=2(y)    .9      .08     .02

The regression function of Y on X, E(Y|X = x), is:

   x             0        1       2
   E(Y|X=x)    0.0375   0.04    0.12
Example
Use the regression function obtained in the previous example, and the
pmf of X, which is

   x        0    1    2
   pX(x)   .4   .5   .1

to find $\mu_Y$.
Solution: $\mu_Y = 0.0375(.4) + 0.04(.5) + 0.12(.1) = .047.$
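This use of the law of total expectation can be sketched in a few lines:

```python
# E(Y) via the law of total expectation:
# E(Y) = sum_x E(Y | X = x) p_X(x), using the values from the example.
regression = {0: 0.0375, 1: 0.04, 2: 0.12}  # E(Y | X = x)
pX = {0: 0.4, 1: 0.5, 2: 0.1}               # marginal pmf of X

EY = sum(regression[x] * pX[x] for x in pX)
print(round(EY, 3))  # 0.047
```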
Example
Let X = 1 or 0, according to whether component A works or
not, and Y = 1 or 0, according to whether component B works
or not. From their repair history it is known that the joint pmf of
(X, Y) is

                    Y
     p(x, y)      0        1       pX(x)
  X  0         0.0098   0.9702    0.98
     1         0.0002   0.0198    0.02
     pY(y)     0.01     0.99      1.0

Are X, Y independent?
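The check can be carried out cell by cell, since X and Y are independent exactly when every joint probability factors into the product of the marginals; a sketch using the table above:

```python
# X, Y are independent iff p(x, y) = p_X(x) p_Y(y) for every cell.
joint = {(0, 0): 0.0098, (0, 1): 0.9702,
         (1, 0): 0.0002, (1, 1): 0.0198}

# Marginals by summing rows and columns of the joint pmf.
pX = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (0, 1)}
pY = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (0, 1)}

independent = all(abs(joint[(x, y)] - pX[x] * pY[y]) < 1e-12
                  for (x, y) in joint)
print(independent)  # True
```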
Proposition
1. If X and Y are independent, then g(X) and h(Y) are also
independent, for any functions g and h.
Example
Consider the two-component system of the previous example
and suppose that the failure of component A incurs a cost of
$500.00, while the failure of component B incurs a cost of
$750.00. Let CA and CB be the costs incurred by the failures of
components A and B respectively. Are CA and CB
independent?
Proposition
Each of the following statements implies, and is implied by, the
independence of X and Y:
1. $p_{Y|X}(y|x) = p_Y(y).$
2. $p_{X|Y}(x|y) = p_X(x).$
Example
X takes the value 0, 1, or 2 according to whether a child under 5
uses no seat belt, uses an adult seat belt, or uses a child seat.
Y takes the value 0 or 1 according to whether a child survives a
car accident or not. Suppose that

   y              0      1
   pY|X=0(y)    0.69   0.31
   pY|X=1(y)    0.85   0.15
   pY|X=2(y)    0.84   0.16

and

   x         0      1      2
   pX(x)   0.54   0.17   0.29

Are X and Y independent?
Example (Continued)
Here $p_{Y|X=x}(y)$ depends on x, and so X, Y are not
independent.
Alternatively, the joint pmf of X, Y is

                       x
     p(x, y)      0        1        2       pY(y)
  y  0         0.3726   0.1445   0.2436    0.7607
     1         0.1674   0.0255   0.0464    0.2393

Here
$p_X(0)\,p_Y(0) = 0.54 \times 0.7607 = 0.4108 \ne p(0, 0) = 0.3726.$
Thus X, Y are not independent.
Example
Show that if X and Y are independent then
E(Y |X = x) = E(Y ).
Thus, if X , Y are independent then the regression function of Y
on X is a constant function.
For a function h(X, Y),
$E[h(X, Y)] = \sum_{x}\sum_{y} h(x, y)\,p(x, y),$
and
$\sigma^2_{h(X,Y)} = E[h^2(X, Y)] - [E(h(X, Y))]^2.$
Example
Find E(X + Y) and $\sigma_{X+Y}$ if the joint pmf of (X, Y) is

                    y
     p(x, y)      0      1      2
  x  0          0.10   0.04   0.02        (3.1)
     1          0.08   0.20   0.06
     2          0.06   0.14   0.30

Solution: $E(X + Y) = \sum_x \sum_y (x + y)\,p(x, y) = 2.48.$ Next
Example (Continued)
$E[(X + Y)^2] = (0)0.1 + (1)0.04 + (2^2)0.02 + (1)0.08 + (2^2)0.2 + (3^2)0.06 + (2^2)0.06 + (3^2)0.14 + (4^2)0.3 = 7.84,$
so that $\sigma^2_{X+Y} = 7.84 - 2.48^2 = 1.6896$ and $\sigma_{X+Y} = 1.30.$
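The two moments can be checked by summing directly over the joint pmf (3.1); a minimal sketch:

```python
# E(X + Y) and sd(X + Y) computed directly from the joint pmf (3.1).
joint = {(0, 0): 0.10, (0, 1): 0.04, (0, 2): 0.02,
         (1, 0): 0.08, (1, 1): 0.20, (1, 2): 0.06,
         (2, 0): 0.06, (2, 1): 0.14, (2, 2): 0.30}

E_sum = sum((x + y) * p for (x, y), p in joint.items())        # E(X+Y)
E_sum2 = sum((x + y) ** 2 * p for (x, y), p in joint.items())  # E((X+Y)^2)
sd = (E_sum2 - E_sum ** 2) ** 0.5                              # sigma_{X+Y}

print(round(E_sum, 2), round(E_sum2, 2), round(sd, 2))  # 2.48 7.84 1.3
```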
Proposition
If X and Y are independent, then
E(g(X )h(Y )) = E(g(X ))E(h(Y ))
holds for any functions g(x) and h(y ).
Proposition
Let $X_1, \ldots, X_n$ be any r.v.s (i.e., discrete or continuous,
independent or dependent), with $E(X_i) = \mu_i$. Then
$E(a_1 X_1 + \cdots + a_n X_n) = a_1\mu_1 + \cdots + a_n\mu_n.$
Corollary
1. Let $X_1, X_2$ be any two r.v.s. Then
$E(X_1 - X_2) = \mu_1 - \mu_2, \quad E(X_1 + X_2) = \mu_1 + \mu_2.$
2. Let $X_1, \ldots, X_n$ be such that $E(X_i) = \mu$ for all i. Then
$E(T) = n\mu, \quad E(\bar X) = \mu,$
where $T = \sum_{i=1}^{n} X_i$ and $\bar X = T/n$.
Outline
The Joint Probability Mass Function
Mean Value of Functions of Random Variables
Quantifying Dependence
Models for Joint Distributions
Example
Tower 1 is constructed by stacking 30 segments of concrete
vertically. Tower 2 is constructed similarly. The height, in
inches, of a randomly selected segment is uniformly distributed
in (35.5,36.5). Find
a) the expected value of the height, T1 , of tower 1,
b) the expected value of the height, T2 of tower 2, and
c) the expected value of the difference $T_1 - T_2$.
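By linearity of expectation, the answers follow from the mean of a single segment; a sketch of the computation:

```python
# Expected tower heights: each of the 30 segment heights is
# Uniform(35.5, 36.5), whose mean is (35.5 + 36.5) / 2 = 36 inches.
n_segments = 30
mean_segment = (35.5 + 36.5) / 2      # mean of a Uniform(a, b) is (a + b)/2

E_T1 = n_segments * mean_segment      # linearity of expectation
E_T2 = n_segments * mean_segment      # tower 2 is constructed the same way
E_diff = E_T1 - E_T2                  # E(T1 - T2) = E(T1) - E(T2)

print(E_T1, E_T2, E_diff)  # 1080.0 1080.0 0.0
```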
Example
Let N be the number of accidents per month in an industrial
complex, and $X_i$ the number of injuries caused by the i-th
accident. Suppose the $X_i$ are independent of N and
$E(X_i) = 1.5$ for all i. If E(N) = 7, find the expected number of
injuries, $Y = \sum_{i=1}^{N} X_i$, in a month.
Solution: First note that
$E(Y|N = n) = E\Big(\sum_{i=1}^{N} X_i \,\Big|\, N = n\Big) = E\Big(\sum_{i=1}^{n} X_i \,\Big|\, N = n\Big) = \sum_{i=1}^{n} E(X_i|N = n) = \sum_{i=1}^{n} E(X_i) = 1.5n.$
Hence,
$E(Y) = \sum_{n \in S_N} E(Y|N = n)\,P(N = n) = \sum_{n \in S_N} 1.5\,n\,P(N = n) = 1.5\,E(N) = 1.5 \times 7 = 10.5.$
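The identity $E(Y) = 1.5\,E(N)$ can be verified numerically. The example only fixes E(N) = 7, so the pmf for N below is an illustrative assumption with that mean:

```python
# Numerical check of E(Y) = 1.5 * E(N) via the law of total expectation.
# Assumed pmf for N (only E(N) = 7 is given in the example).
pN = {5: 0.25, 7: 0.50, 9: 0.25}
E_N = sum(n * p for n, p in pN.items())    # 7.0 by construction
E_Xi = 1.5                                 # given: E(X_i) = 1.5

# E(Y) = sum_n E(Y | N = n) P(N = n) = sum_n 1.5 n P(N = n)
E_Y = sum(E_Xi * n * p for n, p in pN.items())

print(E_N)  # 7.0
print(E_Y)  # 10.5
```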
In general,
$E\Big(\sum_{i=1}^{N} X_i\Big) = E(N)\,E(X_1).$
$\mathrm{Var}(X + Y) = E\big[(X + Y - \mu_X - \mu_Y)^2\big] = \sigma_X^2 + \sigma_Y^2 + 2\,E\big[(X - \mu_X)(Y - \mu_Y)\big].$
Example
Let X = the deductible in car insurance, and Y = the deductible in
home insurance, of a randomly chosen home and car owner.
Suppose that the joint pmf of X, Y is

                      y
     p(x, y)      0     100    200    pX(x)
  x  100         .20    .10    .20     .5
     250         .05    .15    .30     .5
     pY(y)       .25    .25    .50    1.0
Solution:
First,
$E(XY) = \sum_x \sum_y x\,y\,p(x, y) = 23{,}750,$
and
$E(X) = 175, \quad E(Y) = 125.$
Thus,
$\sigma_{XY} = 23{,}750 - 175 \times 125 = 1875.$
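As a check, the covariance can be recomputed by summing over the cells of the joint pmf from the example:

```python
# Covariance of the deductibles, computed from the joint pmf
# (x = car deductible, y = home deductible, both in dollars).
joint = {(100, 0): 0.20, (100, 100): 0.10, (100, 200): 0.20,
         (250, 0): 0.05, (250, 100): 0.15, (250, 200): 0.30}

EXY = sum(x * y * p for (x, y), p in joint.items())
EX = sum(x * p for (x, y), p in joint.items())
EY = sum(y * p for (x, y), p in joint.items())
cov = EXY - EX * EY  # sigma_XY = E(XY) - E(X)E(Y)

print(round(EXY), round(EX), round(EY), round(cov))  # 23750 175 125 1875
```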
Proposition
1. If $X_1, X_2$ are independent,
$\mathrm{Var}(X_1 + X_2) = \sigma_1^2 + \sigma_2^2, \quad \mathrm{Var}(X_1 - X_2) = \sigma_1^2 + \sigma_2^2.$
2. Without independence,
$\mathrm{Var}(X_1 + X_2) = \sigma_1^2 + \sigma_2^2 + 2\,\mathrm{Cov}(X_1, X_2),$
$\mathrm{Var}(X_1 - X_2) = \sigma_1^2 + \sigma_2^2 - 2\,\mathrm{Cov}(X_1, X_2).$
Proposition
1. If $X_1, \ldots, X_n$ are independent,
$\mathrm{Var}(a_1 X_1 + \cdots + a_n X_n) = a_1^2\sigma_1^2 + \cdots + a_n^2\sigma_n^2,$
where $\sigma_i^2 = \mathrm{Var}(X_i)$.
2. If $X_1, \ldots, X_n$ are dependent,
$\mathrm{Var}(a_1 X_1 + \cdots + a_n X_n) = a_1^2\sigma_1^2 + \cdots + a_n^2\sigma_n^2 + \sum_{i=1}^{n}\sum_{j \ne i} a_i a_j \sigma_{ij},$
where $\sigma_{ij} = \mathrm{Cov}(X_i, X_j)$.
Corollary
1. If $X_1, \ldots, X_n$ have common variance $\sigma^2$ and are independent, then
$\mathrm{Var}(T) = n\sigma^2 \quad \text{and} \quad \mathrm{Var}(\bar X) = \frac{\sigma^2}{n},$
where $T = \sum_{i=1}^{n} X_i$.
2. $\mathrm{Var}(\hat p) = \frac{p(1-p)}{n}.$
Example
Let X denote the number of short fiction books sold at the State
College, PA, location of the Barnes and Noble bookstore in a
given week, and let Y denote the corresponding number sold
online to State College residents. We are given that
E(X) = 14.80, E(Y) = 14.51, E(X²) = 248.0, E(Y²) = 240.7,
and E(XY) = 209. Find Var(X + Y).
Solution: Since $\mathrm{Var}(X) = 248.0 - 14.8^2 = 28.96$,
$\mathrm{Var}(Y) = 240.7 - 14.51^2 = 30.16$, and
$\mathrm{Cov}(X, Y) = 209 - (14.8)(14.51) = -5.75$, we have
$\mathrm{Var}(X + Y) = \mathrm{Var}(X) + \mathrm{Var}(Y) + 2\,\mathrm{Cov}(X, Y)$
$= 28.96 + 30.16 - 2 \times 5.75 = 47.62.$
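The arithmetic, starting from the given moments, can be sketched as:

```python
# Var(X + Y) from the given moments; note the covariance is negative.
EX, EY = 14.80, 14.51
EX2, EY2, EXY = 248.0, 240.7, 209.0

var_X = EX2 - EX**2           # ~ 28.96
var_Y = EY2 - EY**2           # ~ 30.16
cov_XY = EXY - EX * EY        # ~ -5.75 (negative!)
var_sum = var_X + var_Y + 2 * cov_XY

print(round(var_sum, 2))  # 47.62
```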
Proposition
1. Cov(X, Y) = Cov(Y, X).
2. Cov(X, X) = Var(X).
3. If X, Y are independent, then Cov(X, Y) = 0.
4. Cov(aX + b, cY + d) = ac Cov(X, Y), for any real numbers
a, b, c and d.
5. $\mathrm{Cov}\Big(\sum_{i=1}^{n} X_i, \sum_{j=1}^{m} Y_j\Big) = \sum_{i=1}^{n}\sum_{j=1}^{m} \mathrm{Cov}(X_i, Y_j).$
The converse of part 3 is not true. In the joint pmf below,
Cov(X, Y) = 0 although X and Y are dependent:

                      x
     p(x, y)     -1     0      1     pY(y)
  y  0            0    1/3     0     1/3
     1           1/3    0     1/3    2/3
     pX(x)       1/3   1/3    1/3    1.0
Example
Consider the example with the car and home insurance
deductibles, but suppose that the deductible amounts are given
in cents. Find the covariance of the two deductibles.
Solution: Let $X'$, $Y'$ denote the deductibles of a randomly
chosen home and car owner in cents. If X, Y denote the
deductibles in dollars, then $X' = 100X$, $Y' = 100Y$. According
to part 4 of the proposition,
$\sigma_{X'Y'} = 100 \times 100 \times \sigma_{XY} = 18{,}750{,}000.$
Example
On the first day of a wine tasting event, three randomly selected
judges are to taste and rate a particular wine before tasting any
other wine. On the second day the same three judges are to
taste and rate the wine after tasting other wines. Let $X_1, X_2, X_3$
be the ratings, on a 100-point scale, on the first day, and
$Y_1, Y_2, Y_3$ be the ratings on the second day. We are given that
the variance of each $X_i$ is $\sigma_X^2 = 9$, the variance of each $Y_i$ is
$\sigma_Y^2 = 4$, the covariance $\mathrm{Cov}(X_i, Y_i) = 5$ for all i = 1, 2, 3, and
$\mathrm{Cov}(X_i, Y_j) = 0$ for all $i \ne j$. Find the variance of the combined
rating $\bar X + \bar Y$.
Solution.
First note that
$\mathrm{Var}(\bar X + \bar Y) = \mathrm{Var}(\bar X) + \mathrm{Var}(\bar Y) + 2\,\mathrm{Cov}(\bar X, \bar Y).$
Next, with the information given, we have $\mathrm{Var}(\bar X) = 9/3$ and
$\mathrm{Var}(\bar Y) = 4/3$. Finally, using the proposition we have
$\mathrm{Cov}(\bar X, \bar Y) = \sum_{i=1}^{3}\sum_{j=1}^{3} \frac{1}{3}\cdot\frac{1}{3}\,\mathrm{Cov}(X_i, Y_j) = \sum_{i=1}^{3} \frac{1}{9}\,\mathrm{Cov}(X_i, Y_i) = 3\cdot\frac{1}{9}\cdot 5 = \frac{5}{3}.$
Thus $\mathrm{Var}(\bar X + \bar Y) = 3 + \frac{4}{3} + 2\cdot\frac{5}{3} = \frac{23}{3} \approx 7.67.$
Example (Variance of the Binomial RV and of $\hat p$)
Let $X \sim \mathrm{Bin}(n, p)$. Find $\mathrm{Var}(X)$ and $\mathrm{Var}(\hat p)$.
Solution: Use the expression
$X = \sum_{i=1}^{n} X_i,$
where the $X_i$ are iid Bernoulli(p) RVs. Since $\mathrm{Var}(X_i) = p(1-p)$,
$\mathrm{Var}(X) = np(1-p), \quad \mathrm{Var}(\hat p) = \mathrm{Var}\Big(\frac{X}{n}\Big) = \frac{p(1-p)}{n}.$
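The two formulas can be cross-checked by enumerating the binomial pmf directly; the values n = 10, p = 0.3 below are illustrative:

```python
from math import comb

# Cross-check Var(X) = n p (1 - p) for X ~ Bin(n, p) by direct
# enumeration of the binomial pmf.
n, p = 10, 0.3
pmf = {k: comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)}

EX = sum(k * q for k, q in pmf.items())
var_X = sum(k**2 * q for k, q in pmf.items()) - EX**2
var_phat = var_X / n**2  # Var(p-hat) = Var(X/n) = Var(X) / n^2

print(round(var_X, 6))     # 2.1
print(round(var_phat, 6))  # 0.021
```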
Similarly, writing
$X = \sum_{i=1}^{r} X_i,$
where the $X_i$ are iid geometric(p) RVs,
$\mathrm{Var}(X) = \sum_{i=1}^{r} \sigma_{X_i}^2 = r(1-p)/p^2.$
$\sigma_{X,Y} = \frac{1}{N}\sum_{i=1}^{N}(x_i - \mu_X)(y_i - \mu_Y),$
where $\mu_X = \frac{1}{N}\sum_{i=1}^{N} x_i$ and $\mu_Y = \frac{1}{N}\sum_{i=1}^{N} y_i$ are the
marginal expected values of X and Y.
$\rho_{X,Y} = \frac{\sigma_{XY}}{\sigma_X \sigma_Y}, \qquad -1 \le \rho(X, Y) \le 1.$
Example
Find the correlation coefficient of the deductibles in car and
home insurance of a randomly chosen car and home owner,
when the deductibles are expressed a) in dollars, and b) in
cents.
Solution: a) If X, Y denote the deductibles in dollars, we saw that
$\sigma_{XY} = 1875$. Omitting the details, it can be found that $\sigma_X = 75$
and $\sigma_Y = 82.92$. Thus,
$\rho_{XY} = \frac{1875}{75 \times 82.92} = 0.301.$
b) Multiplying X and Y by 100 multiplies the covariance and the
product $\sigma_X \sigma_Y$ by the same factor $100 \times 100$, so the correlation
in cents is also 0.301: the correlation coefficient is scale invariant.
Example
Let $X \sim U(0, 1)$, and $Y = X^2$. Find $\rho_{XY}$.
Solution: We have
$\sigma_{XY} = E(X^3) - E(X)E(X^2) = \frac{1}{4} - \frac{1}{2}\cdot\frac{1}{3} = \frac{1}{12}.$
Omitting calculations, $\sigma_X = \sqrt{\frac{1}{12}}$, $\sigma_Y = \sqrt{\frac{4}{45}} = \frac{2}{3\sqrt 5}$. Thus,
$\rho_{XY} = \frac{1/12}{\sqrt{1/12}\cdot 2/(3\sqrt 5)} = \frac{3\sqrt 5}{2\sqrt{12}} = 0.968.$
Example
Let $X \sim U(-1, 1)$, and let $Y = X^2$.
Check that here $\rho_{XY} = 0$.
Note that the dependence of X and Y is not monotone.
Definition
Two variables having zero correlation are called uncorrelated.
Independent variables are uncorrelated, but uncorrelated
variables are not necessarily independent.
Sample versions of the covariance and correlation coefficient:
$S_{X,Y} = \frac{1}{n-1}\sum_{i=1}^{n}(X_i - \bar X)(Y_i - \bar Y), \qquad r_{X,Y} = \frac{S_{X,Y}}{S_X S_Y}.$
Computational formula:
$S_{X,Y} = \frac{1}{n-1}\Big[\sum_{i=1}^{n} X_i Y_i - \frac{1}{n}\Big(\sum_{i=1}^{n} X_i\Big)\Big(\sum_{i=1}^{n} Y_i\Big)\Big].$
R commands for covariance and correlation:
cov(x,y) gives $S_{X,Y}$
cor(x,y) gives $r_{X,Y}$
Example
To calibrate a method for measuring lead concentration in
water, the method was applied to twelve water samples with
known lead content. The concentration measurements, y, and
the known concentration levels, x, are (in R notation)
x=c(5.95, 2.06, 1.02, 4.05, 3.07, 8.45, 2.93, 9.33, 7.24, 6.91,
9.92, 2.86) and y=c(6.33, 2.83, 1.65, 4.37, 3.64, 8.99, 3.16,
9.54, 7.11, 7.10, 8.84, 3.56)
Compute the sample covariance and correlation coefficient.
Hierarchical Models
Hierarchical models use the multiplication rules, i.e.,
$p(x, y) = p_{Y|X=x}(y)\,p_X(x)$ for the joint pmf,
$f(x, y) = f_{Y|X=x}(y)\,f_X(x)$ for the joint pdf,
in order to specify the joint distribution of X, Y by first
specifying the conditional distribution of Y given X = x,
and then specifying the marginal distribution of X.
This hierarchical method of modeling yields a very rich and
flexible class of joint distributions.
Example
Let X be the number of eggs an insect lays and Y the number
of eggs that survive. Model the joint pmf of X and Y.
Solution: With the principle of hierarchical modeling, we need
to specify the conditional pmf of Y given X = x, and the
marginal pmf of X. One possible specification is
$Y|X = x \sim \mathrm{Bin}(x, p), \qquad X \sim \mathrm{Poisson}(\lambda),$
which leads to the following joint pmf of X and Y:
$p(x, y) = p_{Y|X=x}(y)\,p_X(x) = \binom{x}{y} p^y (1-p)^{x-y}\,\frac{e^{-\lambda}\lambda^x}{x!}.$
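The joint pmf above can be checked numerically: it should sum to 1, and by the law of total expectation $E(Y) = E[E(Y|X)] = pE(X) = p\lambda$. The values lam = 2, p = 0.3 are illustrative assumptions:

```python
from math import comb, exp, factorial

# Sanity check of the hierarchical model Y | X = x ~ Bin(x, p),
# X ~ Poisson(lam). The infinite Poisson support is truncated at
# X_MAX, so the totals are exact up to a negligible tail.
lam, p = 2.0, 0.3
X_MAX = 60

def joint(x, y):
    """p(x, y) = C(x, y) p^y (1-p)^(x-y) * e^(-lam) lam^x / x!"""
    return (comb(x, y) * p**y * (1 - p) ** (x - y)
            * exp(-lam) * lam**x / factorial(x))

total = sum(joint(x, y) for x in range(X_MAX + 1) for y in range(x + 1))
E_Y = sum(y * joint(x, y) for x in range(X_MAX + 1) for y in range(x + 1))

print(round(total, 6))  # 1.0
print(round(E_Y, 6))    # 0.6  (= lam * p)
```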
Commentaries (Continued):
3. $\sigma_\varepsilon^2$ is the conditional variance of Y given X and is called
the error variance.
4. See relation (4.6.3) for the joint pdf of X and Y. A more
common form of the bivariate pdf involves $\mu_X$ and $\mu_Y$, $\sigma_X^2$,
$\sigma_Y^2$ and $\rho_{XY}$; see (4.6.12).
5. Any linear combination, $a_1 X + a_2 Y$, of X and Y has a
normal distribution.
A plot of a bivariate normal pdf is given in the next slide:
[Surface plot of a bivariate normal pdf over (x1, x2).]
[Surface plot of a bivariate normal pdf, z against (x1, x2).]
Regression Models
Regression models focus primarily on the regression
function of a variable Y on another variable X.
For example, X = speed of an automobile and Y = the
stopping distance.
Proposition
In the linear regression model $Y = \beta_0 + \beta_1 x + \varepsilon$:
1. $\sigma_Y^2 = \beta_1^2\sigma_X^2 + \sigma_\varepsilon^2$.
2. $\rho_{X,Y} = \beta_1\,\dfrac{\sigma_X}{\sigma_Y}$.
Example
Suppose $Y = 5 - 2x + \varepsilon$, and let $\sigma_\varepsilon = 4$, $\mu_X = 7$ and $\sigma_X = 3$.
a) Find $\sigma_Y^2$ and $\rho_{X,Y}$.
b) Find $\mu_Y$.
Solution: For a) use the formulas in the above proposition:
$\sigma_Y^2 = 4^2 + (-2)^2\, 3^2 = 52,$
$\rho_{X,Y} = \beta_1\,\sigma_X/\sigma_Y = -2 \cdot 3/\sqrt{52} = -0.832.$
For b), $\mu_Y = 5 - 2\mu_X = 5 - 2 \times 7 = -9.$
Example
Suppose $Y = 5 - 2x + \varepsilon$, and let $\varepsilon \sim N(0, \sigma_\varepsilon^2)$ where $\sigma_\varepsilon = 4$. Let
$Y_1, Y_2$ be observations taken at X = 1 and X = 2, respectively.
a) Find the 95th percentile of $Y_1$ and $Y_2$.
b) Find $P(Y_1 > Y_2)$.
Solution:
a) $Y_1 \sim N(3, 4^2)$, so its 95th percentile is
$3 + 4 \times 1.645 = 9.58$. $Y_2 \sim N(1, 4^2)$, so its 95th percentile
is $1 + 4 \times 1.645 = 7.58$.
b) $Y_1 - Y_2 \sim N(2, 32)$ (why?). Thus,
$P(Y_1 > Y_2) = P(Y_1 - Y_2 > 0) = 1 - \Phi\Big(\frac{0 - 2}{\sqrt{32}}\Big) = 0.6382.$
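Both parts can be verified with the standard library's NormalDist; a minimal sketch:

```python
from statistics import NormalDist

# Y1 ~ N(3, 4^2) at x = 1 and Y2 ~ N(1, 4^2) at x = 2, as in the example.
Y1 = NormalDist(mu=3, sigma=4)
Y2 = NormalDist(mu=1, sigma=4)

p95_Y1 = Y1.inv_cdf(0.95)   # ~ 3 + 4 * 1.645 = 9.58
p95_Y2 = Y2.inv_cdf(0.95)   # ~ 1 + 4 * 1.645 = 7.58

# Y1 - Y2 ~ N(2, 32); P(Y1 > Y2) = P(Y1 - Y2 > 0)
D = NormalDist(mu=2, sigma=32 ** 0.5)
prob = 1 - D.cdf(0)

print(round(p95_Y1, 2), round(p95_Y2, 2), round(prob, 4))
```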