
1. Why is Ax = λx a nonlinear equation? (On the R.H.S. there are two unknowns.)
2. For A^k, is it required that all the eigenvalues are distinct?
3. Is the column space (particular solution) not a subspace? Then the row space (range) is not a subspace of the column space (domain) at any instance, because a subspace of a space always contains the origin, but the particular solution does not contain it.
4. How did the particular solution become parallel to the null space?
5. To find the particular solution, do we need to consider Ux = c?
6. For a fat matrix (Ax = b), is there always a possibility that b is in the column space?
7. How can we say that i) two vectors are dependent if they lie on the same line, and ii) three vectors are dependent if they lie in the same plane?
8. Null space and spanning set.
9. Dimensional vs. dimension.
10. Is the number of independent rows and the number of independent columns each equal to the rank of a rectangular matrix?
11. Can a vector space be defined as the span of some vectors?
12. For an m by n matrix, if m > n then is it true that the column space is a subspace of the row space, and if m < n then the column space is a superset (mapped from a higher to a lower space) of the row space?
13. Is the concept of column space and null space literally valid for a complex matrix?
14. The column space is sometimes called the range. Why not always? (Because, due to the null space, some outputs are transformed to the zero vector, which is not expressed by the column space.)
15. Are the column space and the null space perpendicular to each other?
16. What is the difference between an echelon matrix and the matrix obtained by elimination?
17. How can we say that the row space and the column space have the same dimension?
18. Sylvester's inequality.
19. Can any set of independent vectors be considered a basis of a vector space? (Probably not, because independence is not the only condition; it is also essential that the span of the vector set defines the full space.) Strang: pp. 129.
20. How can we say that a matrix transformation is a linear transformation?
21. Physical interpretation of a space defined by functions.
22. Strang: pp. 493.
23. Big formula for n by n determinants. Strang: pp. 482.
24. Block multiplication.
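Question 10 (and question 17 on dimensions) can be checked numerically: the number of independent rows and the number of independent columns both equal the rank. A minimal NumPy sketch; the matrix below is an illustrative choice, not taken from the notes:

```python
import numpy as np

# Illustrative 3x4 rectangular matrix with one dependent row.
A = np.array([[1., 2., 3., 4.],
              [2., 4., 6., 8.],    # 2 * row 1 -> dependent
              [0., 1., 1., 0.]])

row_rank = np.linalg.matrix_rank(A)    # independent rows of A
col_rank = np.linalg.matrix_rank(A.T)  # independent columns of A
print(row_rank, col_rank)              # 2 2
```

The same check works for any rectangular matrix: rank(A) = rank(A^T) always.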


Null space is a subspace of Rn, just as the column space was a subspace of Rm.
If there is a zero vector then the complete vector set is linearly dependent.
The columns of A are independent exactly when N(A) = {zero vector}.
A set of n vectors in Rm must be linearly dependent if n > m. Every system Ac = 0 with more unknowns than equations has solutions c ≠ 0, i.e., the null space must contain nonzero vectors.
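This can be seen concretely: with more unknowns than equations, Ac = 0 has a nonzero solution. A small NumPy check, where the 2 by 3 matrix and the vector c are illustrative choices of my own:

```python
import numpy as np

# 2 equations, 3 unknowns: a nonzero null vector must exist.
A = np.array([[1., 2., 3.],
              [4., 5., 6.]])
c = np.array([1., -2., 1.])   # one nonzero solution of Ac = 0

print(A @ c)                  # [0. 0.]
```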
No zeros on the diagonal implies full rank.
Spanning involves the column space, and independence involves the nullspace.
The nonzero rows of any echelon matrix U must be independent.
A basis for V is a sequence of vectors having two properties at once: 1. The vectors are linearly independent (not too many vectors). 2. They span the space V (not too few vectors).
Every vector in the space is a combination of the basis vectors, because they span.
It also means that the combination is unique: if v = a1v1 + ··· + akvk and also v = b1v1 + ··· + bkvk, then subtraction gives 0 = Σ (ai − bi)vi. Now independence plays its part; every coefficient ai − bi must be zero. Therefore ai = bi. There is one and only one way to write v as a combination of the basis vectors.
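The uniqueness of the coefficients can be illustrated numerically: when the basis vectors are the columns of an invertible matrix, solving for the coefficients has exactly one answer. A hedged sketch with a basis of R2 of my own choosing:

```python
import numpy as np

# Columns of B form a basis of R^2 (independent and spanning).
B = np.array([[1., 1.],
              [0., 1.]])
v = np.array([3., 2.])

# The coefficient vector a with B @ a = v is unique because B is invertible.
a = np.linalg.solve(B, v)
print(a)   # [1. 2.]  ->  v = 1*(1,0) + 2*(1,1)
```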

If the columns are to be a basis for the whole space Rn, then the matrix must be square and invertible.
By convention, the empty set is a basis for the space containing only the zero vector, and its dimension is zero.
Any linearly independent set in V can be extended to a basis, by adding more vectors if
necessary.
Any spanning set in V can be reduced to a basis, by discarding vectors if necessary.
A vector space has infinitely many different bases. Whenever a square matrix is
invertible, its columns are independent and they are a basis for Rn.
A vector space does not have a unique basis.

If v1,...,vm and w1,...,wn are both bases for the same vector space, then m = n. The number of vectors is
the same.

A basis is a maximal independent set. A basis is also a minimal spanning set.


The dimension of the column space equals the rank of the matrix.
When r = m there is a right-inverse, and Ax = b always has a solution. When r = n there is a
left-inverse, and the solution (if it exists) is unique, therefore only a square matrix can achieve
both existence and uniqueness. Only a square matrix has a two-sided inverse.
EXISTENCE: Full row rank r = m. Ax = b has at least one solution x for every b if and only if the columns span Rm. Then A has a right-inverse C such that AC = Im (m by m). This is possible only if m ≤ n.

UNIQUENESS: Full column rank r = n. Ax = b has at most one solution x for every b if and only if the columns are linearly independent. Then A has an n by m left-inverse B such that BA = In. This is possible only if m ≥ n.
Every matrix of rank 1 has the simple form A = uv^T = column times row.
A rectangular matrix cannot have both existence and uniqueness. If m is different from
n, we cannot have r = m and r = n.
A square matrix is the opposite. If m = n, we cannot have one property without the other. A
square matrix has a left-inverse if and only if it has a right-inverse. Existence implies
uniqueness and uniqueness implies existence, when the matrix is square.
One-sided inverses: B = (A^T A)^(-1) A^T and C = A^T (A A^T)^(-1).
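The one-sided inverse formulas can be verified directly. A NumPy sketch; the tall matrix A below is an illustrative choice with full column rank:

```python
import numpy as np

# Illustrative tall matrix, full column rank r = n = 2.
A = np.array([[1., 0.],
              [0., 1.],
              [1., 1.]])

# Left-inverse B = (A^T A)^(-1) A^T satisfies B A = I_n.
B = np.linalg.inv(A.T @ A) @ A.T
print(np.allclose(B @ A, np.eye(2)))    # True

# The fat matrix A^T has full row rank, so it has a right-inverse
# C = A (A^T A)^(-1) with (A^T) C = I.
C = A @ np.linalg.inv(A.T @ A)
print(np.allclose(A.T @ C, np.eye(2)))  # True
```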

Explicit
Physical significance (consistent vs inconsistent)
7. Proof of the SVD decomposition.
10. A brief idea of norm.
11. A notion of rank deficiency.
12. Maxima, minima and saddle points.
13. Row space, column space, basis, vector.
14. The space of m by n matrices is mn-dimensional; then how can this be realised for a symmetric matrix?
15. Distinction between subset and subspace.
16. Does a rank-deficient (in column rank) matrix map a vector into a lower-dimensional space?
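Question 16 can be probed numerically: a rank-deficient matrix sends every input into a subspace whose dimension is its rank, which the SVD makes visible through the number of nonzero singular values. A NumPy sketch; the matrix is an illustrative choice:

```python
import numpy as np

# Rank-deficient 3x3 matrix: third column = col1 + col2, third row zero,
# so every output A @ x lies in a 2-dimensional subspace of R^3.
A = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 0.]])

U, s, Vt = np.linalg.svd(A)
print(np.linalg.matrix_rank(A))   # 2
print(int(np.sum(s > 1e-10)))     # 2 nonzero singular values
```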
