Linear Algebra Topics

1. Introduction to Vectors
1.1. Vectors and Linear Combinations
1.1.1. Column/Row vectors
1.1.2. Scalar multiplication
1.1.3. 2D/3D vectors, R^3 space
1.2. Lengths and Dot Products
1.2.1. Norm (L2, L1, min/max, etc)
1.2.2. Unit vector
1.2.3. Angle between vectors
1.2.4. Cosine formula
1.2.5. Schwarz inequality
1.2.6. Triangle inequality
1.3. Matrices
1.3.1. Multiplying matrices
1.3.2. Linear equations
1.3.3. Inverse matrix
1.3.4. Independence/dependence
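The dot-product topics in 1.2 (norm, unit vectors, cosine formula, Schwarz inequality) fit in a few lines of code. A minimal sketch in plain Python — the language choice is this note's, not the outline's:

```python
import math

def dot(u, v):
    # Dot product of two vectors given as lists: u . v = sum of u_i * v_i
    return sum(ui * vi for ui, vi in zip(u, v))

def norm(v):
    # Euclidean (L2) length: ||v|| = sqrt(v . v)
    return math.sqrt(dot(v, v))

def angle(u, v):
    # Cosine formula: cos(theta) = (u . v) / (||u|| ||v||)
    return math.acos(dot(u, v) / (norm(u) * norm(v)))

u, v = [3, 4], [4, 3]
print(dot(u, v))   # 24
print(norm(u))     # 5.0
# Schwarz inequality: |u . v| <= ||u|| ||v||
assert abs(dot(u, v)) <= norm(u) * norm(v)
```

A unit vector is then just `[x / norm(v) for x in v]`, and the triangle inequality `norm(u + v) <= norm(u) + norm(v)` follows from the Schwarz inequality.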
2. Solving Linear Equations
2.1. Vectors and Linear Equations
2.1.1. Linear combinations of vectors
2.1.2. Matrix form of equations
2.2. The Idea of Elimination
2.2.1. Upper triangular matrices
2.2.2. Using elimination to solve systems of equations
2.3. Elimination Using Matrices
2.3.1. One step of elimination
2.3.2. Row exchange
2.3.3. Augmented matrix
2.4. Rules for Matrix Operations
2.4.1. Index form of matrix elements (A_ij)
2.4.2. Matrix multiplication of rows and columns
2.4.3. Commutative/associative/distributive laws
2.4.4. Block matrices and block multiplication
2.5. Inverse Matrices
2.5.1. Invertible/singular matrices
2.5.2. Identity matrix
2.5.3. 2x2 inverse formula
2.5.4. Inverse of matrix product
2.5.5. Inverse with Gauss-Jordan elimination
2.6. Elimination = Factorization: A = LU
2.6.1. LU and LDU factorization
2.6.2. Lower and upper triangular matrix properties
2.6.3. Computational cost of elimination
2.7. Transposes and Permutations
2.7.1. Sum, product, inverse of transposed matrices
2.7.2. Inner product
2.7.3. Symmetric matrices
2.7.4. Symmetric factorization (LDL^T)
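Sections 2.2–2.6 all revolve around one computation: elimination, recorded as A = LU. A small Python sketch of Doolittle-style elimination (illustrative only; it does no row exchanges, so it assumes every pivot is nonzero):

```python
def lu(A):
    # Gaussian elimination recording multipliers: returns L (unit lower
    # triangular) and U (upper triangular) with A = LU
    n = len(A)
    L = [[float(i == j) for j in range(n)] for i in range(n)]
    U = [row[:] for row in A]
    for j in range(n):                  # eliminate below pivot U[j][j]
        for i in range(j + 1, n):
            m = U[i][j] / U[j][j]       # multiplier l_ij
            L[i][j] = m
            for k in range(j, n):       # row_i -= m * row_j
                U[i][k] -= m * U[j][k]
    return L, U

L, U = lu([[2, 1], [6, 8]])   # L = [[1, 0], [3, 1]], U = [[2, 1], [0, 5]]
```

When a zero pivot forces a row exchange (2.3.2), a permutation matrix P enters and the factorization becomes PA = LU; pulling the pivots out of U into a diagonal D gives the LDU form of 2.6.1.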
3. Vector Spaces and Subspaces
3.1. Spaces of Vectors
3.1.1. Definition of vector space
3.1.2. Subspaces
3.1.3. Column space
3.1.4. Span
3.2. The Nullspace of A: Solving Ax = 0
3.2.1. Finding nullspace through elimination
3.2.2. Reduced row echelon form
3.3. The Rank and the Row Reduced Form
3.3.1. Pivot/free variables
3.3.2. Rank one matrices
3.3.3. Pivot column
3.3.4. Special solutions
3.3.5. Nullspace matrix
3.4. The Complete Solution to Ax = b
3.4.1. Particular solution
3.4.2. Full row rank/full column rank
3.5. Independence, Basis, and Dimension
3.5.1. Independent vectors
3.5.2. Spanning a space
3.5.3. Bases of a space
3.5.4. Dimension of a space
3.5.5. Function spaces
3.6. Dimensions of the Four Subspaces
3.6.1. Row space, column space, nullspace, left nullspace
3.6.2. Big picture
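Most of chapter 3 — nullspace, rank, pivot/free variables, special solutions — comes out of the reduced row echelon form (3.2.2). A rough Gauss-Jordan sketch in Python (again an illustrative choice of language, with a tolerance standing in for exact "zero"):

```python
def rref(A, tol=1e-12):
    # Gauss-Jordan: reduce A to rref R and return (R, pivot_columns).
    # rank = len(pivot_columns); dim of nullspace = n - rank
    R = [[float(x) for x in row] for row in A]
    m, n = len(R), len(R[0])
    pivots, r = [], 0
    for c in range(n):
        if r == m:
            break
        # choose the largest entry at or below row r as the pivot
        pr = max(range(r, m), key=lambda i: abs(R[i][c]))
        if abs(R[pr][c]) < tol:
            continue                    # no pivot: c is a free column
        R[r], R[pr] = R[pr], R[r]       # row exchange
        R[r] = [x / R[r][c] for x in R[r]]   # scale pivot row to pivot 1
        for i in range(m):              # clear the rest of the column
            if i != r:
                f = R[i][c]
                R[i] = [x - f * y for x, y in zip(R[i], R[r])]
        pivots.append(c)
        r += 1
    return R, pivots

R, piv = rref([[1, 2, 3], [2, 4, 6], [1, 1, 1]])   # rank 2, one free column
```

The special solutions of 3.3.4 are read off R by setting one free variable to 1 at a time; the four-subspace dimensions of 3.6 (r, r, n−r, m−r) follow from the pivot count.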
4. Orthogonality
4.1. Orthogonality of the Four Subspaces
4.1.1. Orthogonal vectors
4.1.2. Orthogonal bases
4.1.3. Orthogonal spaces
4.1.4. Orthogonal matrices
4.1.5. Orthogonal complement
4.1.6. Combining bases
4.2. Projections
4.2.1. Projection formula
4.2.2. Projection onto a line
4.2.3. Projection onto a subspace
4.2.4. A^T A
4.3. Least Square Approximations
4.3.1. Least squares formula
4.3.2. Minimizing error
4.3.3. Fitting a line
4.3.4. Fitting a parabola
4.4. Orthogonal Bases and Gram-Schmidt
4.4.1. Orthonormal basis
4.4.2. Projection using orthogonal bases
4.4.3. Gram-Schmdit process
4.4.4. Factorization A=QR
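The Gram-Schmidt process of 4.4 is projection (4.2) applied repeatedly: subtract from each new vector its components along the q's already built, then normalize. A compact Python sketch (the modified, one-vector-at-a-time form; assumes the inputs are independent):

```python
import math

def gram_schmidt(vectors):
    # Orthonormalize a list of independent vectors: for each a, subtract
    # its projection onto every earlier q, then divide by the length
    qs = []
    for a in vectors:
        v = list(a)
        for q in qs:
            c = sum(qi * vi for qi, vi in zip(q, v))   # coefficient q . v
            v = [vi - c * qi for vi, qi in zip(v, q)]  # remove that component
        n = math.sqrt(sum(x * x for x in v))
        qs.append([x / n for x in v])
    return qs

qs = gram_schmidt([[1, 1, 0], [1, 0, 1]])   # two orthonormal vectors
```

Collecting the q's as columns of Q and the coefficients c (plus the lengths n) into an upper triangular R gives the A = QR factorization of 4.4.4, which also turns the least-squares equations of 4.3 into the easy system Rx = Q^T b.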
5. Determinants
5.1. The Properties of Determinants
5.1.1. 2D determinant
5.1.2. N-dimensional determinant
5.2. Permutations and Cofactors
5.2.1. Pivot formula
5.2.2. Big formula for determinant
5.2.3. Cofactor determinant
5.3. Cramer's Rule, Inverses, and Volumes (skipped)
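The cofactor determinant of 5.2.3 has a direct recursive translation. A Python sketch for illustration — fine for small matrices, though the pivot formula of 5.2.1 (product of pivots from elimination, times ±1 for row exchanges) is far cheaper for large n:

```python
def det(A):
    # Cofactor expansion along row 0:
    # det A = sum over j of (-1)^j * A[0][j] * det(minor_0j)
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # minor: delete row 0 and column j
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[1, 2], [3, 4]]))   # -2, the 2D formula ad - bc
```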
6. Eigenvalues and Eigenvectors
6.1. Introduction to Eigenvalues
6.1.1. Finding eigenvalues/eigenvectors
6.1.2. Equation for eigenvalue
6.1.3. Trace
6.1.4. Imaginary eigenvalues
6.2. Diagonalizing a Matrix
6.2.1. Eigenvalue matrix
6.2.2. Matrix powers
6.2.3. Nondiagonalizable matrices
6.3. Applications to Differential Equations
6.3.1. du/dt = Au
6.3.2. Second order equations
6.3.3. Exponential of a matrix
6.4. Symmetric Matrices
6.4.1. Symmetric diagonalization
6.4.2. Complex eigenvalues
6.5. Positive Definite Matrices (skipped)
6.6. Similar Matrices (skipped)
6.7. Singular Value Decomposition (SVD)
6.7.1. SVD formula
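For a 2x2 matrix, the eigenvalue topics of 6.1 — the characteristic equation, the trace, and imaginary eigenvalues — reduce to the quadratic formula: lambda^2 − (trace)lambda + det = 0. A small Python sketch:

```python
import math

def eig2(A):
    # Eigenvalues of a real 2x2 matrix from its characteristic equation.
    # The discriminant trace^2 - 4*det decides real vs complex pair.
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4 * det
    if disc >= 0:
        r = math.sqrt(disc)
        return ((tr + r) / 2, (tr - r) / 2)
    r = math.sqrt(-disc)                     # imaginary eigenvalues (6.1.4)
    return (complex(tr / 2, r / 2), complex(tr / 2, -r / 2))

print(eig2([[2, 1], [1, 2]]))    # (3.0, 1.0): symmetric, real eigenvalues
print(eig2([[0, -1], [1, 0]]))   # (1j, -1j): 90-degree rotation
```

Note that the eigenvalues sum to the trace and multiply to the determinant, the checks suggested by 6.1.3.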