Random vectors
Independence
The theory of linear algebra provides us with a good grounding to
analyse our linear models. However, we must still do some more
groundwork. Once we have done this, the theoretical results come
out quite easily!
Previously, we were thinking of matrices and vectors simply as a
bunch of numbers. However, there is no reason why we can't think
of them as a bunch of random variables!
We can then extend the traditional concepts of expectation,
variance, etc. to random vectors.
Expectation
Although random variables are traditionally denoted by capital
letters, in keeping with our linear algebra notation we will denote
them by lowercase letters.
We define the expectation of a random vector y to be the vector
of expectations of its components:

If y = (y1, y2, . . . , yk)^T then E[y] = (E[y1], E[y2], . . . , E[yk])^T.
Expectation properties

Example. Let

A = [ 2 3 ; 1 4 ],   y = ( y1 ; y2 ).

Then

Ay = ( 2y1 + 3y2 ; y1 + 4y2 ),

so

E[Ay] = ( E[2y1 + 3y2] ; E[y1 + 4y2] )
      = ( 2E[y1] + 3E[y2] ; E[y1] + 4E[y2] ) = AE[y].

In general, for any constant matrix A, E[Ay] = AE[y]: expectation
is linear.
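The example can be checked by simulation. This is only a sketch: the means in mu, the sample size, and the seed are illustrative assumptions, not values from the slides.

```r
# Monte Carlo check that E[Ay] = A E[y].
# The means in mu are hypothetical values chosen for illustration.
set.seed(1)
A <- matrix(c(2, 1, 3, 4), 2, 2)  # A = [2 3; 1 4] (column-major fill)
mu <- c(1, 2)                     # assumed E[y]
y <- rbind(rnorm(1e5, mu[1]), rnorm(1e5, mu[2]))  # columns are draws of y
rowMeans(A %*% y)                 # approximately A %*% mu = (8, 9)
```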
Variance

Defining the variance of a random vector is slightly trickier. We
want to include not just the variances of the variables themselves,
but also how the variables vary together.
Recall that the variance of a random variable Y with mean μ is
defined to be E[(Y − μ)²]. Now let y be as before, a k × 1 vector
of random variables. We define the variance of y (sometimes
known as the covariance matrix) to be

var y = E[(y − μ)(y − μ)^T]

where μ = E[y].
Variance properties
Example. Let y = (y1, y2, y3)^T be a random vector such that
var yi = σ² for all i, and such that the elements of y are
independent. This means that cov(yi, yj) = 0 for i ≠ j, so the
covariance matrix of y is

var y = V = [ σ² 0 0 ; 0 σ² 0 ; 0 0 σ² ] = σ² I.
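A quick simulated illustration of this example (σ = 2, the sample size, and the seed are assumptions for the sketch): the sample covariance matrix of independent components is close to σ² I.

```r
# Independent components with standard deviation 2 (so sigma^2 = 4):
# the sample covariance matrix should be close to 4 * I.
set.seed(1)
Y <- matrix(rnorm(1e5 * 3, sd = 2), ncol = 3)  # rows are draws of y
round(cov(Y), 2)                               # approximately 4 * diag(3)
```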
Multivariate normal

Definition
Let z be a k × 1 vector of i.i.d. standard normal r.v.s, A an n × k
matrix, and b an n × 1 vector. Then we say that

x = Az + b

is (an n-dimensional) multivariate normal, with mean μ = E[x] = b
and covariance matrix Σ = var x = AA^T.
We write x ~ MVN(μ, Σ) or just x ~ N(μ, Σ).
For any μ and any symmetric positive semidefinite matrix Σ, let z
be a vector of i.i.d. standard normals; then

μ + Σ^(1/2) z ~ MVN(μ, Σ).
library(MASS)
a <- matrix(c(3, 1), 2, 1)            # mean vector
V <- matrix(c(1, .8, .8, 1), 2, 2)    # covariance matrix
y <- mvrnorm(100, mu = a, Sigma = V)  # 100 draws from MVN(a, V)
plot(y[, 1], y[, 2])
[Figure: scatter plot of y[, 1] against y[, 2] for the mvrnorm
sample. (Linear Statistical Models: Random vectors)]
P <- eigen(V)$vectors
sqrtV <- P %*% diag(sqrt(eigen(V)$values)) %*% t(P)  # V^(1/2) via the eigendecomposition
z <- matrix(rnorm(200), 2, 100)                      # 100 standard normal vectors
y_new <- sqrtV %*% z + rep(a, 100)                   # a + V^(1/2) z, column by column
points(y_new[1, ], y_new[2, ], col = "red")
[Figure: the same scatter plot with the a + V^(1/2) z points
overlaid in red.]
Theorem
Let y be a random vector with E[y] = μ and var y = V, and let
A be a matrix of constants. Then

E[y^T Ay] = tr(AV) + μ^T Aμ.
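The theorem is easy to check by simulation. In this sketch, mu, V, and A are illustrative values, not taken from the slides.

```r
# Monte Carlo check of E[y^T A y] = tr(AV) + mu^T A mu.
set.seed(42)
mu <- c(1, 2)                          # illustrative mean
V  <- matrix(c(2, 0.5, 0.5, 1), 2, 2)  # illustrative covariance
A  <- matrix(c(1, 1, 1, 3), 2, 2)      # symmetric constant matrix
Z  <- matrix(rnorm(2 * 1e5), 2)
Y  <- t(mu + t(chol(V)) %*% Z)         # rows are draws of y ~ N(mu, V)
qf <- rowSums((Y %*% A) * Y)           # y^T A y for each draw
mean(qf)                               # approximately 23
sum(diag(A %*% V)) + drop(t(mu) %*% A %*% mu)  # tr(AV) + mu^T A mu = 23
```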
Noncentral χ² distribution

Definition
Let y = (yi) be a k × 1 normally distributed random vector with
mean μ and variance I. Then

x = y^T y = y1² + y2² + · · · + yk²

follows a noncentral χ² distribution with k degrees of freedom and
noncentrality parameter λ = ½ μ^T μ. We write x ~ χ²_{k,λ}.
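We can check this definition against R's built-in noncentral χ² functions. One caution: R parameterises the noncentrality as ncp = μ^T μ, i.e. 2λ in the notation above. The mean vector, sample size, and seed below are illustrative assumptions.

```r
# y^T y for y ~ N(mu, I_k) versus R's noncentral chi-squared.
# Note R's ncp convention: ncp = sum(mu^2) = 2 * lambda.
set.seed(1)
k  <- 4
mu <- c(1, 2, 0, 0)                          # illustrative mean
Y  <- sweep(matrix(rnorm(1e5 * k), ncol = k), 2, mu, "+")
x  <- rowSums(Y^2)                           # y^T y for each draw
mean(x)                                      # approximately k + sum(mu^2) = 9
mean(rchisq(1e5, df = k, ncp = sum(mu^2)))   # approximately 9 as well
```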
[Figure: densities of the noncentral χ² distribution with 4 degrees
of freedom, for λ = 0, 1, 2.]
Theorem
Let X²_{k1,λ1}, X²_{k2,λ2}, . . . , X²_{kn,λn} be a collection of n
independent noncentral χ² random variables, with k1, k2, . . . , kn
degrees of freedom respectively and noncentrality parameters
λ1, λ2, . . . , λn respectively. Then

X²_{k1,λ1} + X²_{k2,λ2} + · · · + X²_{kn,λn}

has a noncentral χ² distribution with k1 + k2 + · · · + kn degrees of
freedom and noncentrality parameter λ1 + λ2 + · · · + λn.
Theorem
Let y be an n × 1 normally distributed random vector with mean μ
and variance I and let A be an n × n symmetric matrix. Then
y^T Ay has a noncentral χ² distribution with k degrees of freedom
and noncentrality parameter λ = ½ μ^T Aμ if and only if A is
idempotent and has rank k.
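A natural source of symmetric idempotent matrices is the hat matrix X(X^T X)^(-1) X^T of a design matrix X. The sketch below, with an illustrative 5 × 2 design, checks idempotency, the rank, and the resulting χ² mean by simulation.

```r
# The hat matrix is symmetric and idempotent with rank = ncol(X),
# so y^T A y with y ~ N(0, I_5) is chi-squared with 2 degrees of freedom.
X <- cbind(1, 1:5)                      # illustrative design matrix
A <- X %*% solve(t(X) %*% X) %*% t(X)   # hat matrix
max(abs(A %*% A - A))                   # effectively 0: A is idempotent
qr(A)$rank                              # 2
set.seed(1)
Y  <- matrix(rnorm(1e5 * 5), ncol = 5)  # rows are draws of y ~ N(0, I_5)
qf <- rowSums((Y %*% A) * Y)            # y^T A y for each draw
mean(qf)                                # approximately 2, the mean of chi^2_2
```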
Corollary
Let y be an n × 1 normally distributed random vector with mean 0
and variance I and let A be an n × n symmetric matrix. Then
y^T Ay has an (ordinary) χ² distribution with k degrees of freedom
if and only if A is idempotent and has rank k.

Corollary
Let y be an n × 1 normally distributed random vector with mean μ
and variance σ² I and let A be an n × n symmetric matrix. Then
(1/σ²) y^T Ay has a noncentral χ² distribution with k degrees of
freedom and noncentrality parameter λ = (1/(2σ²)) μ^T Aμ if and
only if A is idempotent and has rank k.
y^T Ay = ( y1  y2 ) (1/2)[ 1 1 ; 1 1 ] ( y1 ; y2 )
       = ½ y1² + y1 y2 + ½ y2²
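A quick check in R that the matrix in this example is idempotent with rank 1, so by the first corollary y^T Ay here is χ² with one degree of freedom when y ~ N(0, I):

```r
# The matrix A = (1/2) * [1 1; 1 1] from the example.
A <- matrix(1/2, 2, 2)
all.equal(A %*% A, A)   # TRUE: A is idempotent
qr(A)$rank              # 1
```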
Theorem
Let y be an n × 1 normal random vector with mean μ and variance
V, and let A be an n × n symmetric matrix. Then y^T Ay has a
noncentral χ² distribution with k degrees of freedom and
noncentrality parameter λ = ½ μ^T Aμ if and only if AV is
idempotent and has rank k.
Corollary
Let y be an n × 1 normal random vector with mean 0 and variance
V and let A be an n × n symmetric matrix. Then y^T Ay has an
(ordinary) χ² distribution with k degrees of freedom if and only if
AV is idempotent and has rank k.

Corollary
Let y be an n × 1 normal random vector with mean μ and variance
V of full rank. Then y^T V^(-1) y has a noncentral χ² distribution
with n degrees of freedom and noncentrality parameter
λ = ½ μ^T V^(-1) μ.
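The last corollary can be checked with base R's mahalanobis(), which computes y^T V^(-1) y directly. The values of mu and V below are illustrative; recall that R's ncp convention is 2λ, so the simulated mean should be n + μ^T V^(-1) μ.

```r
# y^T V^{-1} y for y ~ N(mu, V) has mean n + mu^T V^{-1} mu.
set.seed(1)
mu <- c(1, 1)                           # illustrative mean
V  <- matrix(c(2, 0.5, 0.5, 1), 2, 2)   # illustrative full-rank covariance
Z  <- matrix(rnorm(2 * 1e5), 2)
Y  <- t(mu + t(chol(V)) %*% Z)          # rows are draws of y ~ N(mu, V)
d2 <- mahalanobis(Y, center = c(0, 0), cov = V)  # y^T V^{-1} y per row
mean(d2)                                # approximately 2 + mu^T V^{-1} mu
drop(2 + t(mu) %*% solve(V) %*% mu)
```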
[Figure: histogram of x, with the density shown on the vertical
axis.]
Theorem
Let y be an n × 1 normal random vector with mean μ and variance
V of full rank, and let A and B be symmetric n × n matrices.
Then y^T Ay and y^T By are independent if and only if

AVB = 0.
AVB = [ 1 0 ; 0 0 ] [ 1 c ; c 1 ] [ 0 0 ; 0 1 ]
    = [ 1 c ; 0 0 ] [ 0 0 ; 0 1 ]
    = [ 0 c ; 0 0 ],

which is zero if and only if c = 0.
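A numerical version of this example, assuming (as the displayed computation suggests) A = diag(1, 0) and B = diag(0, 1), so that y^T Ay = y1² and y^T By = y2²:

```r
# AVB = 0 exactly when c = 0, and then y1^2 and y2^2 are uncorrelated;
# for c = 0.8 they are clearly dependent.
set.seed(1)
A <- diag(c(1, 0)); B <- diag(c(0, 1))
for (c_ in c(0, 0.8)) {
  V <- matrix(c(1, c_, c_, 1), 2, 2)
  print(A %*% V %*% B)                  # zero matrix only when c_ = 0
  Z <- matrix(rnorm(2 * 1e5), 2)
  Y <- t(t(chol(V)) %*% Z)              # rows are draws of y ~ N(0, V)
  print(cor(Y[, 1]^2, Y[, 2]^2))        # near 0 for c_ = 0; large for c_ = 0.8
}
```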
Corollary
Let y be a random normal vector with mean μ and variance σ² I,
and let A and B be symmetric matrices. Then y^T Ay and y^T By
are independent if and only if AB = 0.
Theorem
Let y be an n × 1 normal random vector with mean μ and variance
V, and let A be an n × n symmetric matrix and B an m × n matrix.
Then y^T Ay and By are independent if and only if BVA = 0.

Lastly, we can combine several of the theorems we have seen
before to determine when a group of quadratic forms (more than
two) are independent.
Theorem
Let y be a normal random vector with mean μ and variance I, and
let A1, A2, . . . , Am be a collection of m symmetric matrices. If any
two of the following statements are true:

- Ai Aj = 0 for all i ≠ j;
When A1 + A2 + · · · + Am = I, the previous result can be seen as
a special case of the following result (which we will not prove):

(1/σ²) y^T y = (1/σ²) y^T A1 y + · · · + (1/σ²) y^T Am y,

with r(A1) + · · · + r(Am) = n.
Example
> A <- matrix(1, 2, 2)
> B <- matrix(c(1, -1, -1, 1), 2, 2)
> A %*% B
     [,1] [,2]
[1,]    0    0
[2,]    0    0
>
>
>
>
[1] 0.0662571
[Figure: scatter plot of x2 against x1.]