
Math 415 Final Exam: Definitions

Triangular factorization: A = LU. L is lower triangular with 1s on the diagonal and the multipliers l_ij from elimination below the diagonal. U is the upper triangular matrix which appears after forward elimination, and its diagonal entries are the pivots.
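A quick numerical check, sketched with scipy (scipy.linalg.lu also returns a permutation P, since row exchanges may be needed; the matrix is an arbitrary example):

import numpy as np
from scipy.linalg import lu

A = np.array([[2., 1., 1.],
              [4., -6., 0.],
              [-2., 7., 2.]])
P, L, U = lu(A)                       # A = P @ L @ U, L has 1s on its diagonal
print(np.allclose(A, P @ L @ U))      # True
print(np.diag(U))                     # the pivots sit on the diagonal of U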
Inverse of a matrix: a matrix B such that BA = AB = I. There is at most one inverse B.
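A minimal numpy check (assuming A is square and invertible; the matrix here is just an example):

import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
B = np.linalg.inv(A)
print(np.allclose(B @ A, np.eye(2)))   # True: BA = I
print(np.allclose(A @ B, np.eye(2)))   # True: the inverse is two-sided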
Vector space: a set V of vectors with addition and scalar multiplication is a vector space if
it contains the zero vector, with x+0=x for every x
it is closed under addition (if x and y are in V, x+y is also)
it is closed under scalar multiplication (if x is in V, cx is in V for every c in R)
addition is commutative and associative, and scalar multiplication distributes over addition
for every x in V there is some -x such that -x+x=0
subspace: a subset of a vector space that is itself a vector space; it is enough to check that it contains 0 and is closed under addition and scalar multiplication
column space: the subspace of R^m spanned by the columns of A. The dimension is at most n. aka the subspace formed by all b in R^m such that Ax=b has a solution
null space: the space of all x in R^n such that Ax=0
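A sketch of computing the null space numerically (scipy.linalg.null_space returns an orthonormal basis for it; the matrix is an arbitrary rank-1 example):

import numpy as np
from scipy.linalg import null_space

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])          # rank 1, so the null space has dimension 3 - 1 = 2
N = null_space(A)                     # orthonormal basis for {x : Ax = 0}
print(N.shape[1])                     # 2
print(np.allclose(A @ N, 0))          # True: every basis vector solves Ax = 0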
echelon form: first nonzero entry in each row is a pivot, below each pivot are zeros, every pivot lies to the right of the pivot in the row above, any all-zero rows are at the bottom
reduced row echelon form: an echelon matrix in which every pivot is 1 and all entries above and below each pivot are zero
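sympy computes the reduced form exactly; a small sketch (the matrix is arbitrary):

from sympy import Matrix

A = Matrix([[1, 2, 1],
            [2, 4, 0],
            [3, 6, 1]])
R, pivot_cols = A.rref()    # reduced row echelon form and the pivot columns
print(R)
print(pivot_cols)           # (0, 2): pivots in columns 0 and 2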
linear dependence: if v1, ..., vn are vectors in R^m, they are linearly independent if, whenever c1v1 + ... + cnvn = 0, all ci are zero. If some choice with a nonzero ci gives 0, the set is dependent
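One way to test independence numerically, as a sketch: stack the vectors as columns and compare the rank to the number of vectors:

import numpy as np

v1, v2, v3 = np.array([1., 0.]), np.array([0., 1.]), np.array([1., 1.])
V = np.column_stack([v1, v2, v3])
# independent exactly when the rank equals the number of vectors
print(np.linalg.matrix_rank(V) == V.shape[1])   # False: v3 = v1 + v2, so dependent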
basis: a basis is a set of vectors which is linearly independent and which spans the space
linear transformation: any transformation which satisfies the rule of linearity: for any numbers c and d and vectors x and y, if a matrix A models the transformation, A(cx+dy) = cAx+dAy
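A minimal check of the linearity rule on a random matrix (any A will do):

import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
x, y = rng.standard_normal(3), rng.standard_normal(3)
c, d = 2.0, -1.5
print(np.allclose(A @ (c*x + d*y), c*(A @ x) + d*(A @ y)))   # True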
orthogonal subspaces: two subspaces V and W in R^n are orthogonal if every vector v in V is orthogonal to every vector w in W
Orthogonal complement: for a subspace V, the set of all vectors orthogonal to V, written V^perp
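When V is a column space, the complement is the null space of A^T (the fundamental theorem of linear algebra: C(A)^perp = N(A^T)); a sketch using scipy:

import numpy as np
from scipy.linalg import null_space

A = np.array([[1., 0.],
              [0., 1.],
              [0., 0.]])              # columns span a plane V in R^3
W = null_space(A.T)                   # basis for V^perp
print(np.allclose(A.T @ W, 0))        # True: every column of A is orthogonal to V^perp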
Determinants:
the determinant of I is 1
the determinant of a matrix changes sign when any two rows are interchanged
taking the determinant is linear in the first row:
| a+a'  b+b' |   | a  b |   | a'  b' |
|  c     d   | = | c  d | + |  c   d |
and k times the first row gives k|A|
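All three rules are easy to confirm numerically; a sketch with a 2 x 2 example:

import numpy as np

A = np.array([[1., 2.],
              [3., 4.]])
print(np.isclose(np.linalg.det(np.eye(2)), 1.0))              # det I = 1
swapped = A[[1, 0]]                                            # interchange the two rows
print(np.isclose(np.linalg.det(swapped), -np.linalg.det(A)))   # True: sign changes
B = A.copy(); B[0] *= 5                                        # 5 times the first row
print(np.isclose(np.linalg.det(B), 5 * np.linalg.det(A)))      # True: gives 5|A|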
orthonormal basis: if vectors q1, ..., qn are a basis for a space, the basis is orthonormal if qi^T qi = 1 and qi^T qj = 0 for i not equal to j
orthogonal matrix: a square matrix Q with orthonormal columns, so Q^T Q = I and Q^-1 = Q^T
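numpy's QR factorization produces such a Q from any basis (a sketch; the input matrix is arbitrary):

import numpy as np

A = np.random.default_rng(1).standard_normal((3, 3))
Q, R = np.linalg.qr(A)                       # Q has orthonormal columns
print(np.allclose(Q.T @ Q, np.eye(3)))       # True: Q^T Q = I
print(np.allclose(np.linalg.inv(Q), Q.T))    # True: Q^-1 = Q^T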
Fourier series: a sum of sines and cosines which models a function f(x) on some interval. Discrete means a finite sum which insists on agreement at some finite number of evenly-spaced points on the interval
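A sketch of the discrete case using the FFT: the N coefficients make a finite exponential (sine/cosine) sum agree with f at N evenly spaced points (the sample function is arbitrary):

import numpy as np

N = 8
x = np.linspace(0, 2*np.pi, N, endpoint=False)   # evenly spaced points
f = np.exp(np.sin(x))                            # values to match
c = np.fft.fft(f) / N                            # discrete Fourier coefficients
k = np.arange(N)
rebuilt = np.array([np.sum(c * np.exp(1j * k * xi)) for xi in x]).real
print(np.allclose(rebuilt, f))                   # True: agreement at all N points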
eigenvalue: a number λ associated to a matrix A such that |A-λI| = 0 (and A-λI is singular); equivalently, Ax = λx for some nonzero eigenvector x
characteristic equation: |A-λI| = 0
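Both characterizations are easy to check with numpy, as a sketch:

import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])
vals, vecs = np.linalg.eig(A)
lam, x = vals[0], vecs[:, 0]
print(np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0))   # True: A - λI is singular
print(np.allclose(A @ x, lam * x))                           # True: Ax = λx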
diagonalization: if an n x n matrix A has n linearly independent eigenvectors (spanning R^n), and these eigenvectors form the columns of a square matrix S, the diagonalization of A is A = SΛS^-1, where Λ is the diagonal matrix with the eigenvalues along the diagonal
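A numpy sketch (the eigenvector matrix from np.linalg.eig plays the role of S):

import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])
vals, S = np.linalg.eig(A)                # columns of S are the eigenvectors
Lam = np.diag(vals)                       # Λ: eigenvalues on the diagonal
print(np.allclose(A, S @ Lam @ np.linalg.inv(S)))   # True: A = SΛS^-1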
difference equation: an equation which can be modeled in the form uk+1 = Auk or un = A^n u0
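A classic instance, sketched in numpy: the Fibonacci numbers follow uk+1 = Auk:

import numpy as np

A = np.array([[1., 1.],
              [1., 0.]])                      # uk = (F_{k+1}, F_k)
u0 = np.array([1., 0.])                       # (F_1, F_0)
print(np.linalg.matrix_power(A, 10) @ u0)     # u_10 = A^10 u0 = [89., 55.]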
stability of difference equation: a difference equation is stable if all eigenvalues have absolute value less than one. It is neutrally stable if some eigenvalue has absolute value exactly 1 (and none exceed 1), and unstable if at least one eigenvalue has absolute value greater than 1
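A small classifier, as a sketch (the helper name is mine; the test matrices are diagonal so their eigenvalues are visible):

import numpy as np

def classify(A):
    mags = np.abs(np.linalg.eigvals(A))
    if np.all(mags < 1):
        return "stable"
    if np.all(mags <= 1):
        return "neutrally stable"
    return "unstable"

print(classify(np.diag([0.5, 0.9])))   # stable
print(classify(np.diag([1.0, 0.9])))   # neutrally stable
print(classify(np.diag([2.0, 0.9])))   # unstable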
stability of differential equations: du/dt = Au is stable and e^At approaches 0 when Re{λi} < 0 for all i, neutrally stable when some Re{λi} = 0 (and the rest are negative), and unstable if any Re{λi} is greater than zero
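scipy's matrix exponential makes the decay visible; a sketch with eigenvalues -1 and -2 (both real parts negative, so stable):

import numpy as np
from scipy.linalg import expm

A = np.diag([-1., -2.])                    # Re(λ) < 0 for both eigenvalues
for t in (1., 5., 20.):
    print(np.linalg.norm(expm(A * t)))     # norms shrink toward 0 as t grows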
hermitian matrix: a matrix such that A^H = A, i.e. the conjugate transpose of A is A itself
unitary matrix: a complex matrix U such that U^H U = UU^H = I and U^H = U^-1
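The two definitions meet in the spectral theorem: a hermitian matrix has real eigenvalues and a unitary eigenvector matrix. A numpy sketch:

import numpy as np

A = np.array([[2., 1j],
              [-1j, 3.]])
print(np.allclose(A.conj().T, A))               # True: A^H = A, hermitian
vals, U = np.linalg.eigh(A)                     # eigh is for hermitian matrices
print(vals)                                     # real eigenvalues
print(np.allclose(U.conj().T @ U, np.eye(2)))   # True: U^H U = I, unitary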

similarity transformation: if M is an invertible matrix and B = M^-1 AM, then B and A are similar matrices (they have the same eigenvalues), and the map from A to B is called a similarity transformation
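That shared-eigenvalue fact is easy to confirm; a sketch with a diagonal A and an arbitrary invertible M:

import numpy as np

A = np.diag([1., 2., 3.])
M = np.array([[1., 2., 0.],
              [0., 1., 1.],
              [1., 0., 1.]])                 # invertible (det = 3)
B = np.linalg.inv(M) @ A @ M
print(np.allclose(np.sort(np.linalg.eigvals(B)), [1., 2., 3.]))   # True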
