
Eigenvalues and eigenvectors

Eigenspace corresponding to an eigenvalue


The eigenspace corresponding to an eigenvalue lambda is the set of all solutions of (A - lambda I)x = 0, i.e. the null space of A - lambda I. It consists of all eigenvectors for lambda together with the zero vector, and a basis for it is formed from a set of linearly independent eigenvectors.
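As a quick sketch (the matrix A and the use of scipy.linalg.null_space are my own choices for illustration), an eigenspace can be computed as the null space of A - lambda I:

```python
import numpy as np
from scipy.linalg import null_space

# Example matrix with eigenvalues 3 and -1
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

lam = 3.0
# Eigenspace for lambda = 3 = null space of A - lambda*I
E = null_space(A - lam * np.eye(2))
print(E)                             # one basis column, proportional to [1, 1]
print(np.allclose(A @ E, lam * E))   # True: the basis columns are eigenvectors
```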
Elementary row operations do not preserve eigenvalues
Linear independence of eigenvectors associated with distinct eigenvalues
If an n x n matrix A has n distinct eigenvalues, then A has n linearly independent eigenvectors and is therefore diagonalizable. More generally, eigenvectors associated with distinct eigenvalues are linearly independent.
Characteristic polynomial and the characteristic equation
The characteristic polynomial of A is det(A - lambda I), and the characteristic equation is det(A - lambda I) = 0; its roots are the eigenvalues of A.
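A small symbolic sketch (the example matrix is mine) of the characteristic polynomial and equation, using sympy:

```python
import sympy as sp

lam = sp.symbols('lambda')
A = sp.Matrix([[1, 2],
               [2, 1]])

# Characteristic polynomial: det(A - lambda*I)
p = (A - lam * sp.eye(2)).det()
print(sp.factor(p))                 # (lambda - 3)*(lambda + 1)

# Characteristic equation det(A - lambda*I) = 0 gives the eigenvalues
print(sp.solve(sp.Eq(p, 0), lam))   # [-1, 3]
```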
Algebraic multiplicity and geometric multiplicity of an eigenvalue
A matrix A is diagonalizable if and only if alg(lambda_i) = geom(lambda_i) for every eigenvalue lambda_i of A (and the algebraic multiplicities add up to n). The geometric multiplicity of lambda_i is the dimension of its eigenspace; it is always at least 1 and at most the algebraic multiplicity.
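As an illustrative sketch (the matrix is my own example), the matrix below has eigenvalue 2 with algebraic multiplicity 2 but geometric multiplicity 1, so it is not diagonalizable:

```python
import numpy as np
from scipy.linalg import null_space

# lambda = 2 is a double root of the characteristic polynomial (algebraic multiplicity 2)
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

E = null_space(A - 2.0 * np.eye(2))
print(E.shape[1])   # 1 -> geometric multiplicity 1 < 2, so A is not diagonalizable
```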
Relationship between the eigenvalues of a matrix and the determinant of a matrix
The determinant of A equals the product of its eigenvalues (counted with algebraic multiplicity). In particular, an n x n matrix A is invertible if and only if its determinant is not equal to zero, which happens exactly when zero is not an eigenvalue of A.
Relationship between the eigenvalues of a matrix and the trace of a matrix
The trace of a matrix is the sum of the diagonal entries of that matrix, and it equals the sum of the eigenvalues (counted with algebraic multiplicity).
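A quick numerical check of both relationships (the matrix is my own example):

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [2.0, 3.0, 1.0],
              [0.0, 1.0, 5.0]])

eigvals = np.linalg.eigvals(A)
print(np.isclose(np.prod(eigvals), np.linalg.det(A)))   # True: det = product of eigenvalues
print(np.isclose(np.sum(eigvals), np.trace(A)))         # True: trace = sum of eigenvalues
```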
Similar matrices
If two matrices are similar, then they have the same characteristic polynomial and hence the same eigenvalues (with the same multiplicities).
A and B are similar if A = PBP^(-1) for some invertible matrix P.
Diagonalization and the diagonalization theorem
The Diagonalization Theorem
An n x n matrix A is diagonalizable if and only if A has n linearly independent eigenvectors. In fact, A = PDP^(-1) with D a diagonal matrix if and only if the columns of P are n linearly independent eigenvectors of A. The diagonal entries of D are the eigenvalues of A corresponding to the eigenvectors in P.
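A minimal numerical sketch of the theorem (the matrix is my own example): build P from eigenvectors and D from eigenvalues, then check A = PDP^(-1).

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
P = eigvecs            # columns are linearly independent eigenvectors of A
D = np.diag(eigvals)   # diagonal matrix of the corresponding eigenvalues

print(np.allclose(A, P @ D @ np.linalg.inv(P)))   # True: A = P D P^(-1)
```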
Discrete linear dynamical systems
Formula: x_k = c_1(lambda_1)^k v_1 + ... + c_n(lambda_n)^k v_n, where v_i is the eigenvector for lambda_i.
The use of eigenvalues and eigenvectors to determine the behavior of solutions to
discrete linear dynamical systems

The eigenvalues of the matrix determine whether solutions of the linear system tend toward zero, away from zero, or both toward and away from zero (in different directions) as k, time, approaches infinity.
Classifying the origin as an attractor, repeller or saddle point for a discrete linear
dynamical system
The origin is an attractor if the magnitudes of all eigenvalues of A are less than 1. The origin is a repeller if the magnitudes of all eigenvalues of A are greater than 1. It is a saddle point if some eigenvalues have magnitude greater than 1 and others have magnitude less than 1, so that solutions tend toward the origin along some directions and away from it along others as k, time, approaches infinity.
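A small sketch (the matrix and starting vector are my own choices) that classifies the origin from the eigenvalue magnitudes and iterates x_(k+1) = A x_k:

```python
import numpy as np

A = np.array([[0.9, 0.0],
              [0.0, 1.2]])   # one eigenvalue magnitude below 1, one above -> saddle point
x = np.array([1.0, 1.0])

mags = np.abs(np.linalg.eigvals(A))
if np.all(mags < 1):
    print("origin is an attractor")
elif np.all(mags > 1):
    print("origin is a repeller")
else:
    print("origin is a saddle point")

# Iterate the dynamical system x_(k+1) = A x_k
for k in range(5):
    x = A @ x
print(x)   # first component shrinks toward 0, second grows
```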
Linear systems of differential equations
Initial value problems for linear systems of differential equations
The use of eigenvalues and eigenvectors to give a formula for solutions of initial value problems
Formula: x(t) = c_1 v_1 e^(lambda_1 t) + ... + c_n v_n e^(lambda_n t), where v_i is the eigenvector for lambda_i.
Classifying the origin as an attractor (sink), repeller (source) or saddle point for a linear system of differential equations
If all eigenvalues are negative, then the origin is an attractor (sink). If all eigenvalues are positive, then the origin is a repeller (source). If some eigenvalues are negative and some are positive, then the origin is a saddle point.
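A brief sketch (the system and initial condition are my own example) of using eigenvalues and eigenvectors to evaluate the solution formula for x' = Ax, x(0) = x0: solve Pc = x0 for the weights c, then x(t) = c_1 v_1 e^(lambda_1 t) + ... + c_n v_n e^(lambda_n t).

```python
import numpy as np

A = np.array([[-2.0, 0.0],
              [0.0, -0.5]])      # all eigenvalues negative -> origin is a sink
x0 = np.array([3.0, 1.0])

eigvals, P = np.linalg.eig(A)    # columns of P are the eigenvectors v_i
c = np.linalg.solve(P, x0)       # weights from x(0) = c_1 v_1 + ... + c_n v_n

def x(t):
    # x(t) = c_1 v_1 e^(lambda_1 t) + ... + c_n v_n e^(lambda_n t)
    return P @ (c * np.exp(eigvals * t))

print(x(0.0))   # [3. 1.] -> matches the initial condition
print(x(5.0))   # both components decay toward zero
```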
Euclidean dot product, norm and orthogonality
Unit vectors and normalizing vectors
Normalizing u: (1/||u||) * u
||u|| = sqrt(u dot u)
Euclidean distance
Distance between u and v: ||u - v||
Orthogonal vectors, orthogonal sets, and orthonormal sets
Two vectors u and v are orthogonal to each other if u dot v = 0 (see also the Pythagorean Theorem below).
An orthogonal set {u, v} is orthonormal if, in addition, ||u|| = 1 and ||v|| = 1 (i.e. every vector has length 1).
Orthogonal basis
A basis for a subspace is an orthogonal basis if its vectors are pairwise orthogonal.
Orthonormal basis

A basis is an orthonormal basis if its vectors are pairwise orthogonal and each has length 1.
Orthogonal complement
Let W be a subspace of R^n. If a vector x is orthogonal to every vector in W, then x is said to be orthogonal to W. The set of all vectors orthogonal to W is called the orthogonal complement of W, denoted W^⊥ (read "W perp").
The orthogonal complement of the row space of a matrix A is the null space of A, and the orthogonal complement of the column space of A is the null space of A^T.
Pythagorean theorem
Two vectors u and v are orthogonal if and only if ||u+v||^2=||u||^2+||v||^2.
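A small numerical sketch (the vectors are my own example) tying these definitions together:

```python
import numpy as np

u = np.array([3.0, 0.0, 4.0])
v = np.array([0.0, 2.0, 0.0])

print(np.dot(u, v))              # 0.0 -> u and v are orthogonal
print(np.linalg.norm(u))         # 5.0 = sqrt(u dot u)
print(u / np.linalg.norm(u))     # normalized u (a unit vector)
print(np.linalg.norm(u - v))     # Euclidean distance between u and v

# Pythagorean theorem for orthogonal vectors
lhs = np.linalg.norm(u + v) ** 2
rhs = np.linalg.norm(u) ** 2 + np.linalg.norm(v) ** 2
print(np.isclose(lhs, rhs))      # True
```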
Angles between vectors
Expressing vectors as a linear combination of vectors from an orthogonal basis
Orthogonal projection (and its geometric interpretation)
Orthogonal matrices
The Best Approximation Theorem
Let W be a subspace of R^n, let y be any vector in R^n, and let y^ be the orthogonal projection of y onto W. Then y^ is the closest point in W to y, in the sense that ||y - y^|| < ||y - v|| for all v in W distinct from y^.
Using an orthogonal basis for a subspace to find an orthogonal projection onto
that subspace
An orthogonal basis is a set of vectors that are pairwise orthogonal (i.e. the dot product of any two distinct vectors is 0). Given an orthogonal basis X = {x_1, ..., x_n} for a subspace,
Proj_X v = ((v dot x_1)/(x_1 dot x_1)) x_1 + ((v dot x_2)/(x_2 dot x_2)) x_2 + ... + ((v dot x_n)/(x_n dot x_n)) x_n.
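A sketch (the vectors are my own example) of projecting v onto a subspace spanned by an orthogonal basis, using the formula above:

```python
import numpy as np

# Orthogonal basis for a plane in R^3
x1 = np.array([1.0, 1.0, 0.0])
x2 = np.array([1.0, -1.0, 0.0])
v = np.array([2.0, 3.0, 5.0])

# Proj_X v = sum of ((v . x_i)/(x_i . x_i)) x_i over the basis vectors
proj = sum((v @ xi) / (xi @ xi) * xi for xi in (x1, x2))
print(proj)       # [2. 3. 0.] -> the component of v lying in the plane
print(v - proj)   # [0. 0. 5.] -> orthogonal to both x1 and x2
```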
Gram-Schmidt Process
Given {x_1, x_2, x_3}, find an orthogonal basis {v_1, v_2, v_3}:
Let v_1 = x_1
v_2 = x_2 - proj_v1 x_2
v_3 = x_3 - proj_v1 x_3 - proj_v2 x_3
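A minimal sketch of the process in code (the input vectors are my own example); each step subtracts the projections onto the previously constructed v's:

```python
import numpy as np

def gram_schmidt(xs):
    """Turn linearly independent vectors into an orthogonal basis."""
    vs = []
    for x in xs:
        v = x.astype(float)
        for w in vs:
            v = v - (x @ w) / (w @ w) * w   # subtract proj_w(x)
        vs.append(v)
    return vs

xs = [np.array([1.0, 1.0, 0.0]),
      np.array([1.0, 0.0, 1.0]),
      np.array([0.0, 1.0, 1.0])]
v1, v2, v3 = gram_schmidt(xs)
# All pairwise dot products are (numerically) zero
print(np.isclose(v1 @ v2, 0), np.isclose(v1 @ v3, 0), np.isclose(v2 @ v3, 0))
```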
The general least squares problem
A^T A x = A^T b
Least squares solutions to linear systems
A^T A x = A^T b; solve for x to obtain a least squares solution.
The normal equations for Ax=b

A^T A x = A^T b (its solutions coincide with the least squares solutions of Ax = b)
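A short sketch (the data are my own example) solving the normal equations and checking against numpy's built-in least squares routine:

```python
import numpy as np

A = np.array([[4.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
b = np.array([2.0, 0.0, 11.0])

# Normal equations: A^T A x = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)
print(x_hat)                                  # [1. 2.]

# Same answer from numpy's least squares solver
print(np.linalg.lstsq(A, b, rcond=None)[0])   # [1. 2.]
```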

The relationship between orthogonal projection and least squares solutions to linear systems
A least squares solution x^ of Ax = b satisfies Ax^ = proj_Col(A) b, the orthogonal projection of b onto the column space of A. If the columns of A are orthogonal, the projection formula gives proj_Col(A) b directly; then form the augmented matrix [A  proj_Col(A) b] and use row operations to find the RREF, which gives the least squares solution x^.
Line of best least squares fit, and other least squares fits of data
Form the design matrix X = [1 x] and the observation vector y from the given data, then solve X^T X beta = X^T y for the coefficient vector beta.
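A sketch of a least squares line fit (the data points are my own example); the design matrix X has a column of ones and a column of x-values:

```python
import numpy as np

# Example data points roughly along a line
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.1, 2.9, 4.2])

# Design matrix [1 x]; solve the normal equations X^T X beta = X^T y
X = np.column_stack([np.ones_like(x), x])
beta = np.linalg.solve(X.T @ X, X.T @ y)
print(beta)       # [intercept, slope] of the best-fit line
print(X @ beta)   # predicted values at each data point
```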
Observation vector, design matrix and predicted values from least squares fits
Eigenvalue and eigenvector properties of real symmetric matrices
A symmetric matrix is a matrix A such that A^T = A.
If a matrix A is symmetric, then any two eigenvectors from different eigenspaces of
A are orthogonal.
If a matrix A is symmetric, then the eigenspaces of A are mutually orthogonal.
Orthogonally diagonalizable
An n x n matrix A is orthogonally diagonalizable if and only if A^T = A (i.e. A is symmetric). To orthogonally diagonalize, use the equation A = PDP^(-1) = PDP^T. Find the eigenvalues and eigenvectors of A. P is, in this case, formed from the normalized (orthonormal) eigenvectors of A, so P^(-1) = P^T, and D is the diagonal matrix of the eigenvalues of A corresponding to the eigenvectors in P.
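A numerical sketch (the symmetric matrix is my own example) using numpy's symmetric eigensolver, whose eigenvector matrix is already orthogonal, so P^(-1) = P^T:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])          # symmetric: A^T = A

eigvals, P = np.linalg.eigh(A)      # eigh returns orthonormal eigenvectors for symmetric A
D = np.diag(eigvals)

print(np.allclose(P.T @ P, np.eye(2)))   # True: P is an orthogonal matrix
print(np.allclose(A, P @ D @ P.T))       # True: A = P D P^T
```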
The spectral theorem for real symmetric matrices
An n x n symmetric matrix A has the following properties:
a. A has n real eigenvalues, counting multiplicities.
b. The dimension of the eigenspace for each eigenvalue lambda equals the
multiplicity of lambda as a root of the characteristic equation.
c. The eigenspaces are mutually orthogonal, in the sense that eigenvectors
corresponding to different eigenvalues are orthogonal.
d. A is orthogonally diagonalizable.
