Rank
In linear algebra, the rank of a matrix is the dimension of its row space or column space. It is an important fact that the row space and column space of a matrix have equal dimensions.
Let $A$ be a matrix. If $R(A)$ denotes the row space of $A$ and $C(A)$ denotes the column space of $A$, then $\dim R(A) = \dim C(A)$. This quantity is called the rank of $A$, and denoted $\operatorname{rank}(A)$.
Intuitively, the rank measures how far the linear transformation represented by a matrix is from being injective or surjective. Suppose $A$ is an $m$-by-$n$ matrix representing a linear transformation $T\colon \mathbb{R}^n \to \mathbb{R}^m$. Since $C(A) = \operatorname{im}(T)$ is a subspace of $\mathbb{R}^m$, the rank of $A$ is an integer between $0$ and $m$, and equals $m$ precisely when $T$ is surjective. A similar interpretation exists for $n$ in terms of injectivity: the rank of $A$ is an integer between $0$ and $n$, and equals $n$ precisely when $T$ is injective, since the rank-nullity theorem gives $\operatorname{rank}(A) = n - \dim \ker(T)$.
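As a quick numerical illustration, here is a minimal NumPy sketch computing the rank of a small matrix; the rank never exceeds either dimension of the matrix and is unchanged by transposing:

```python
import numpy as np

# A 3-by-4 matrix: its rank can be at most min(3, 4) = 3.
A = np.array([
    [1.0, 2.0, 3.0, 4.0],
    [2.0, 4.0, 6.0, 8.0],   # twice the first row, so it adds nothing new
    [0.0, 1.0, 0.0, 1.0],
])

print(np.linalg.matrix_rank(A))    # 2
print(np.linalg.matrix_rank(A.T))  # 2: transposing swaps rows and columns but preserves rank

# Since 2 < 3 (the number of rows), the map R^4 -> R^3 is not surjective,
# and since 2 < 4 (the number of columns), it is not injective either.
```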
Proof of Column Rank = Row Rank
Let $A$ be an $m$-by-$n$ matrix, representing a linear transformation $T\colon \mathbb{R}^n \to \mathbb{R}^m$. We define the row rank of $A$ to be $\dim R(A)$, and similarly the column rank to be $\dim C(A)$. A priori, these numbers are not necessarily equal, but we will prove that they actually are.
Suppose the row rank of $A$ is $r$. Our approach will be to produce $r$ linearly independent vectors in $C(A)$. This will prove $\dim C(A) \geq \dim R(A)$. By applying the same argument to the transpose of $A$, we obtain the reverse inequality $\dim R(A) \geq \dim C(A)$, hence the result; this works because taking the transpose simply swaps the row and column spaces.
How could one produce $r$ linearly independent vectors in $C(A)$? A possible approach is to choose $r$ linearly independent vectors in $\mathbb{R}^n$, and then apply $A$ to each of them, hoping the resulting vectors in $C(A)$ will themselves be linearly independent. To ensure this works, the set of $r$ linearly independent vectors in $\mathbb{R}^n$ should have some additional structure we can work with; they cannot just be random vectors. But there is a very natural set of $r$ linearly independent vectors in $\mathbb{R}^n$: a basis for the row space $R(A)$.
Accordingly, let $v_1, v_2, \ldots, v_r$ be a basis for $R(A)$. We must prove $Av_1, Av_2, \ldots, Av_r$ are linearly independent. Suppose they aren't; then there are scalars $c_1, c_2, \ldots, c_r$ (not all zero) such that
\[
c_1 Av_1 + c_2 Av_2 + \cdots + c_r Av_r = 0.
\]
By construction, we know $v = c_1 v_1 + c_2 v_2 + \cdots + c_r v_r$ is in the row space of $A$. Furthermore, since $Av = c_1 Av_1 + c_2 Av_2 + \cdots + c_r Av_r = 0$, a computation done in the article on row and column spaces implies $v$ is orthogonal to every row of $A$. But $v$ is a linear combination of these rows, so $v$ is orthogonal to itself; thus, $v = 0$. This implies the relation $c_1 v_1 + c_2 v_2 + \cdots + c_r v_r = 0$ with the $c_i$ not all zero, which is absurd on account of the linear independence of the collection $\{v_1, v_2, \ldots, v_r\}$.
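The key step above, that applying $A$ to a basis of its row space produces linearly independent vectors in $C(A)$, can be spot-checked numerically. The following NumPy sketch builds a random rank-$3$ matrix and uses the rows of $V^T$ from its SVD as a convenient orthonormal basis of the row space:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random 5-by-7 matrix of rank 3 (product of 5x3 and 3x7 factors).
A = rng.standard_normal((5, 3)) @ rng.standard_normal((3, 7))
r = np.linalg.matrix_rank(A)           # 3

# The first r rows of Vt form an orthonormal basis v_1, ..., v_r of the row space R(A).
_, _, Vt = np.linalg.svd(A)
basis = Vt[:r]                         # shape (r, 7)

# Apply A to each basis vector; the images A v_1, ..., A v_r lie in C(A).
images = A @ basis.T                   # shape (5, r)

# They are linearly independent exactly when this matrix of images has rank r.
print(np.linalg.matrix_rank(images) == r)   # True
```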
Examples and Problems
What is the rank of the matrix $A$?
Note that the first two columns of $A$ are linearly independent, since they are not multiples of each other. Thus, $\operatorname{rank}(A) \geq 2$.
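As an illustration of this style of reasoning, here is a NumPy sketch with a hypothetical matrix (not the one from the problem): two columns that are not multiples of each other force the rank to be at least $2$, and a third column that is a combination of them adds nothing more.

```python
import numpy as np

# Hypothetical matrix (not the one from the problem): the third column
# is the sum of the first two, so it contributes nothing new.
A = np.array([
    [1.0, 0.0, 1.0],
    [2.0, 1.0, 3.0],
    [3.0, 1.0, 4.0],
])

# Columns 1 and 2 are not scalar multiples of each other, so rank(A) >= 2;
# column 3 = column 1 + column 2, so the rank stays at exactly 2.
print(np.linalg.matrix_rank(A))  # 2
```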
Let $A$ be an $n$-by-$n$ square matrix. Prove that $A$ has rank $n$ if and only if $A$ is invertible.
$A$ has rank $n$ if and only if the row space $R(A)$ equals $\mathbb{R}^n$. But the kernel of $A$ is precisely the orthogonal complement of the row space, and hence $R(A) = \mathbb{R}^n$ if and only if $\ker(A) = \{0\}$. We know $A$ is invertible if and only if its kernel is zero, so we are done.
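This equivalence is easy to check numerically; in the NumPy sketch below, the full-rank matrix can be inverted while the rank-deficient one cannot:

```python
import numpy as np

full_rank = np.array([[2.0, 1.0],
                      [1.0, 1.0]])
deficient = np.array([[1.0, 2.0],
                      [2.0, 4.0]])    # second row is twice the first

print(np.linalg.matrix_rank(full_rank))   # 2, so it is invertible
print(np.linalg.inv(full_rank))           # succeeds

print(np.linalg.matrix_rank(deficient))   # 1, so it is not invertible
try:
    np.linalg.inv(deficient)
except np.linalg.LinAlgError as err:
    print("not invertible:", err)         # singular matrix
```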
Consider the $n$-by-$n$ matrix for which the entry in the $i^\text{th}$ row and $j^\text{th}$ column is .
What is the rank of this matrix?
Let $v_1, v_2, \ldots, v_n$ be a collection of vectors. The Gram matrix of this collection is defined to be the $n$-by-$n$ matrix whose entry in the $i^\text{th}$ row and $j^\text{th}$ column is $v_i \cdot v_j$, where $\cdot$ denotes the dot product.
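To make the definition concrete, here is a small NumPy sketch (an illustrative collection, not the one from the problem below): stacking the vectors as the rows of a matrix $V$, the Gram matrix is $VV^T$, and its rank equals the number of linearly independent vectors in the collection.

```python
import numpy as np

# Three vectors in R^3; the third is the sum of the first two.
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = v1 + v2

V = np.vstack([v1, v2, v3])      # rows are the vectors
G = V @ V.T                      # Gram matrix: G[i, j] = v_i . v_j

print(G)
# [[1. 0. 1.]
#  [0. 1. 1.]
#  [1. 1. 2.]]
print(np.linalg.matrix_rank(G))  # 2, the number of independent vectors
```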
Consider the Gram matrix of the collection:
What is the rank of this Gram matrix?