Eigenvalues of a matrix determine whether the linear system tends toward zero, away
from zero, or both toward and away from zero (in different directions) as k, the time step, approaches infinity.
Classifying the origin as an attractor, repeller or saddle point for a discrete linear
dynamical system
The origin is an attractor if all eigenvalues of A have magnitude less than 1. The
origin is a repeller if all eigenvalues of A have magnitude greater than 1. It is a
saddle point if some eigenvalues have magnitude greater than 1 and others have
magnitude less than 1, so trajectories tend toward the origin in some directions
and away from it in others as k approaches infinity.
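This classification can be checked numerically. A minimal plain-Python sketch, using an assumed 2x2 diagonal matrix whose eigenvalues are 0.5 and 0.8 (both magnitudes less than 1, so the origin should attract):

```python
def mat_vec(A, x):
    """Multiply a 2x2 matrix A by a vector x."""
    return [A[0][0] * x[0] + A[0][1] * x[1],
            A[1][0] * x[0] + A[1][1] * x[1]]

A = [[0.5, 0.0],
     [0.0, 0.8]]      # diagonal, so its eigenvalues are 0.5 and 0.8
x = [1.0, 1.0]        # initial state x_0

for k in range(50):   # iterate x_{k+1} = A x_k
    x = mat_vec(A, x)

print(x)  # both components shrink toward 0: the origin is an attractor
```

Replacing the diagonal entries with values of magnitude greater than 1 would make the iterates grow without bound (a repeller), and mixing one of each would give saddle behavior.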
Linear systems of differential equations
Initial value problems for linear systems of differential equations
The use of eigenvalues and eigenvectors to give a formula for solutions of initial
value
problems
Formula: x(t) = c1 v1 e^(λ1 t) + … + cn vn e^(λn t)
Classifying the origin as an attractor (sink), repeller (source) or saddle point for a
linear
system of differential equations
If all eigenvalues are negative, then the origin is an attractor (sink). If all eigenvalues
are positive, then the origin is a repeller (source). If some eigenvalues are negative
and some are positive, then the origin is a saddle point.
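As a sketch of the formula and classification above, take the assumed example A = [[1, 0], [0, -2]]: the eigenvalues 1 and -2 have mixed signs, so the origin is a saddle point, and the formula x(t) = c1 e^(t) [1,0] + c2 e^(-2t) [0,1] can be checked against x' = Ax with a finite-difference derivative:

```python
import math

# Assumed example: A = [[1, 0], [0, -2]], eigenvalues 1 and -2 with
# eigenvectors e1 and e2. Mixed signs => the origin is a saddle point.

def x(t, c1=1.0, c2=1.0):
    """Solution from the eigenvalue formula: c1*e^t*[1,0] + c2*e^(-2t)*[0,1]."""
    return [c1 * math.exp(1.0 * t), c2 * math.exp(-2.0 * t)]

# Along the positive eigenvalue the solution grows; along the negative one it decays:
x5 = x(5.0)
print(x5[0])  # ~ e^5, large
print(x5[1])  # ~ e^-10, tiny

# Numerical check that x(t) solves x' = A x (central finite difference):
h = 1e-6
t = 1.0
deriv = [(x(t + h)[i] - x(t - h)[i]) / (2 * h) for i in range(2)]
Ax = [1.0 * x(t)[0], -2.0 * x(t)[1]]
print(all(abs(deriv[i] - Ax[i]) < 1e-4 for i in range(2)))  # True
```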
Euclidean dot product, norm and orthogonality
Unit vectors and normalizing vectors
Normalizing u: (1/||u||) u
||u|| = sqrt(u · u)
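These definitions translate directly into code. A minimal plain-Python sketch:

```python
import math

def dot(u, v):
    """Euclidean dot product."""
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    """||u|| = sqrt(u . u)."""
    return math.sqrt(dot(u, u))

def normalize(u):
    """(1/||u||) * u, a unit vector in the direction of u."""
    n = norm(u)
    return [ui / n for ui in u]

u = [3.0, 4.0]
print(norm(u))       # 5.0
print(normalize(u))  # [0.6, 0.8], which has length 1
```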
Euclidean distance
Distance between u and v: ||u - v||
Orthogonal vectors, orthogonal sets, and orthonormal sets
Two vectors u and v are orthogonal to each other if u · v = 0. (also see
Pythagorean Theorem)
Orthogonal vectors u and v are orthonormal if, in addition, ||u|| = 1 and ||v|| = 1 (i.e. each has length 1).
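A short sketch with the assumed pair u = (1, 1) and v = (1, -1), which are orthogonal but not orthonormal until rescaled:

```python
import math

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

u = [1.0, 1.0]
v = [1.0, -1.0]
print(dot(u, v))         # 0.0 -> orthogonal
print(norm(u), norm(v))  # both sqrt(2), not 1: orthogonal but NOT orthonormal

# Rescaling each vector to unit length makes the pair orthonormal:
u_hat = [ui / norm(u) for ui in u]
v_hat = [vi / norm(v) for vi in v]
print(norm(u_hat), norm(v_hat))  # both 1 (up to rounding)
```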
Orthogonal basis
A set of vectors is an orthogonal basis if all vectors are orthogonal to each other.
Orthonormal basis
A set of vectors is an orthonormal basis if all vectors are orthogonal to each other
and if their lengths are equal to 1.
Orthogonal complement
Let W be a subspace of Rn. If a vector x is orthogonal to every vector in W,
then x is said to be orthogonal to W, and the set of all such vectors x is called
the orthogonal complement of W, denoted W⊥.
The orthogonal complement of the row space of a matrix A is the null space of A, and
the orthogonal complement of the column space of A is the null space of A^T.
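As a sketch of the row-space fact, take the assumed example A = [[1, 2, 3], [0, 1, 1]] and n = (-1, -1, 1), which satisfies An = 0 (so n is in the null space of A):

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

# Assumed example: n is in Nul(A), so it must be orthogonal to every row of A,
# illustrating that the orthogonal complement of Row(A) is Nul(A).
A = [[1.0, 2.0, 3.0],
     [0.0, 1.0, 1.0]]
n = [-1.0, -1.0, 1.0]

print([dot(row, n) for row in A])  # [0.0, 0.0]: n is orthogonal to each row
```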
Pythagorean theorem
Two vectors u and v are orthogonal if and only if ||u+v||^2=||u||^2+||v||^2.
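A quick numerical check of the theorem with the assumed orthogonal pair u = (3, 0) and v = (0, 4):

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def norm_sq(u):
    """||u||^2 = u . u."""
    return dot(u, u)

u = [3.0, 0.0]
v = [0.0, 4.0]   # u . v = 0, so u and v are orthogonal

s = [ui + vi for ui, vi in zip(u, v)]  # u + v
print(norm_sq(s))                # 25.0
print(norm_sq(u) + norm_sq(v))   # 25.0, equal because u . v = 0
```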
Angles between vectors
Expressing vectors as a linear combination of vectors from an orthogonal basis
Orthogonal projection (and its geometric interpretation)
Orthogonal matrices
The Best Approximation Theorem
Let W be a subspace of Rn, let y be any vector in Rn, and let y^ be the orthogonal
projection of y onto W. Then y^ is the closest point in W to y, in the sense that
||y - y^|| < ||y - v|| for all v in W distinct from y^.
Using an orthogonal basis for a subspace to find an orthogonal projection onto
that subspace
An orthogonal basis is a set of vectors that are orthogonal to each other (i.e. the
dot product of each pair of distinct vectors is 0). Given an orthogonal basis
X = {x1, x2, …, xn} for a subspace, the orthogonal projection of v onto that subspace is
proj_X v = ((v · x1)/(x1 · x1)) x1 + ((v · x2)/(x2 · x2)) x2 + … + ((v · xn)/(xn · xn)) xn.
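The projection formula above can be sketched in plain Python (the basis must be orthogonal for the coefficient formula to be valid); the example projects onto the xy-plane in R^3:

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def project(v, basis):
    """Orthogonal projection of v onto span(basis); basis must be orthogonal."""
    p = [0.0] * len(v)
    for x in basis:
        c = dot(v, x) / dot(x, x)  # coefficient (v . xi)/(xi . xi)
        p = [pi + c * xi for pi, xi in zip(p, x)]
    return p

# Assumed example: the xy-plane in R^3, spanned by the orthogonal vectors e1, e2.
basis = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
v = [2.0, 3.0, 7.0]
print(project(v, basis))  # [2.0, 3.0, 0.0], the z-component is removed
```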
Gram Schmidt Process
Given {x1, x2, x3}, find an orthogonal basis {v1, v2, v3}:
Let v1 = x1
v2 = x2 - proj_v1 x2
v3 = x3 - proj_v1 x3 - proj_v2 x3
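The steps above generalize to any number of vectors; a minimal sketch with an assumed pair of vectors in R^3:

```python
def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def gram_schmidt(xs):
    """Turn a list of linearly independent vectors xs into an orthogonal basis."""
    vs = []
    for x in xs:
        v = list(x)
        for w in vs:                 # subtract proj_w(x) for each earlier w
            c = dot(x, w) / dot(w, w)
            v = [vi - c * wi for vi, wi in zip(v, w)]
        vs.append(v)
    return vs

xs = [[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]]
v1, v2 = gram_schmidt(xs)
print(v1)           # [1.0, 1.0, 0.0]  (v1 = x1)
print(v2)           # [0.5, -0.5, 1.0] (x2 minus its projection onto v1)
print(dot(v1, v2))  # 0.0 -> the new basis is orthogonal
```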
The general least squares problem
A^T A x = A^T b
Least squares solutions to linear systems
A^T A x = A^T b; solve for x.
The normal equations for Ax=b
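A sketch of solving the normal equations for a small assumed example, fitting a line y = c0 + c1*t to three points (the resulting 2x2 system is solved by Cramer's rule to keep the sketch self-contained):

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def mat_mul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

def solve_2x2(M, y):
    """Solve M x = y for a 2x2 matrix M by Cramer's rule."""
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(y[0] * M[1][1] - M[0][1] * y[1]) / det,
            (M[0][0] * y[1] - y[0] * M[1][0]) / det]

# Assumed data: the points (0, 1), (1, 2), (2, 3), which lie on y = 1 + t.
# Each row of A is [1, t_i]; b holds the y-values.
A = [[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]]
b = [1.0, 2.0, 3.0]

At = transpose(A)
AtA = mat_mul(At, A)                                         # A^T A
Atb = [sum(r * bi for r, bi in zip(row, b)) for row in At]   # A^T b
x = solve_2x2(AtA, Atb)                                      # normal equations
print(x)  # [1.0, 1.0] -> intercept 1, slope 1
```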