Let $A$ be a matrix. If $R(A)$ denotes the row space of $A$ and $C(A)$ denotes the column space of $A$, then $\dim R(A) = \dim C(A)$. This quantity is called the rank of $A$, and denoted $\text{rank}(A)$.
Intuitively, the rank measures how far the linear transformation represented by a matrix is from being injective or surjective. Suppose $A$ is an $m$-by-$n$ matrix representing a linear transformation $T \colon \mathbb{R}^n \to \mathbb{R}^m$. Since $\text{rank}(A) = \dim C(A)$ and $C(A)$ is precisely the image of $T$, the rank of $A$ is an integer between $0$ and $m$, and equals $m$ precisely when $T$ is surjective. A similar interpretation exists for $n$ in terms of injectivity: the rank of $A$ is an integer between $0$ and $n$, and equals $n$ precisely when $T$ is injective.
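These bounds can be checked numerically. The sketch below uses NumPy's `matrix_rank` on a small matrix chosen here purely for illustration (neither the matrix nor the library calls appear in the original text):

```python
import numpy as np

# A 2-by-3 matrix, representing a map T from R^3 to R^2 (m = 2, n = 3).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

m, n = A.shape
r = np.linalg.matrix_rank(A)

print(r)        # → 2
print(r == m)   # → True: T is surjective (rank equals m)
print(r == n)   # → False: T is not injective (rank is less than n)
```

Here the two rows are linearly independent, so the rank attains the maximum value $\min(m, n) = 2$.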
Let $A$ be an $m$-by-$n$ matrix, representing a linear transformation $T \colon \mathbb{R}^n \to \mathbb{R}^m$. We define the row rank of $A$ to be $\dim R(A)$, and similarly the column rank to be $\dim C(A)$. A priori, these numbers are not necessarily equal, but we will prove they actually are.
Suppose the row rank of $A$ is $r$. Our approach will be to produce $r$ linearly independent vectors in $C(A)$. This will prove $\dim C(A) \ge \dim R(A)$. By applying the same argument to the transpose of $A$, we obtain the reverse inequality $\dim R(A) \ge \dim C(A)$, hence the result; this works because taking the transpose simply swaps row and column spaces.
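The claim being proved, $\dim R(A) = \dim C(A)$, is equivalent to $\text{rank}(A) = \text{rank}(A^T)$, since transposing swaps the row and column spaces. As a hedged sanity check (not part of the proof), one can verify this numerically on random small matrices:

```python
import numpy as np

# Check rank(A) == rank(A^T) for many random integer matrices.
rng = np.random.default_rng(0)
all_equal = True
for _ in range(100):
    m, n = rng.integers(1, 6, size=2)
    A = rng.integers(-3, 4, size=(m, n)).astype(float)
    all_equal &= np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)

print(all_equal)   # → True
```

Of course, a numerical experiment is not a proof; the argument below supplies that.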
How could one produce $r$ linearly independent vectors in $C(A)$? A possible approach is to choose $r$ linearly independent vectors in $\mathbb{R}^n$, and then apply $A$ to each of them, hoping the resulting vectors in $C(A)$ will themselves be linearly independent. To ensure this works, the set of $r$ linearly independent vectors in $\mathbb{R}^n$ should have some additional structure we can work with; they cannot just be random vectors. But there is a very natural set of $r$ linearly independent vectors in $\mathbb{R}^n$: a basis for the row space $R(A)$.
Accordingly, let $v_1, v_2, \ldots, v_r$ be a basis for $R(A)$. We must prove $Av_1, Av_2, \ldots, Av_r$ are linearly independent. Suppose they aren't; then there are scalars $c_1, c_2, \ldots, c_r$ (not all zero) such that

$$c_1 Av_1 + c_2 Av_2 + \cdots + c_r Av_r = 0.$$
By construction, we know $v = c_1 v_1 + c_2 v_2 + \cdots + c_r v_r$ is in the row space of $A$. Furthermore, since $Av = c_1 Av_1 + c_2 Av_2 + \cdots + c_r Av_r = 0$, a computation done in the article on row and column spaces implies $v$ is orthogonal to every row of $A$. But $v$ is a linear combination of these rows, so $v$ is orthogonal to itself; thus, $v = 0$. This implies the relation $c_1 v_1 + c_2 v_2 + \cdots + c_r v_r = 0$, which is absurd on account of the linear independence of the collection $\{v_1, \ldots, v_r\}$.
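The mechanism of the proof can be illustrated concretely: take a basis of the row space, apply $A$ to each basis vector, and observe that the images in the column space are independent. The matrix below is an example chosen here for illustration, not taken from the original text:

```python
import numpy as np

# A 3-by-3 matrix of rank 2: the third row is the sum of the first two,
# so rows 1 and 2 form a basis for the row space R(A).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [1.0, 1.0, 2.0]])
v1, v2 = A[0], A[1]

# Apply A to each basis vector; the images lie in the column space C(A).
images = np.column_stack([A @ v1, A @ v2])

# As the proof predicts, the r = 2 images are linearly independent.
print(np.linalg.matrix_rank(images))   # → 2
```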
What is the rank of the matrix

$$A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}?$$

Note that the first two columns of $A$ are linearly independent, since they are not multiples of each other, so $\text{rank}(A) \ge 2$. On the other hand, $A$ has only two rows, so its rank is at most $2$. Thus, $\text{rank}(A) = 2$.
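This kind of column-counting argument can be double-checked numerically. The matrix below, with rows $(1, 2, 3)$ and $(4, 5, 6)$, is an illustrative example whose first two columns are not multiples of each other:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# Columns (1, 4) and (2, 5) are independent, and a 2-by-3 matrix
# has rank at most 2, so the rank should come out to exactly 2.
rank = np.linalg.matrix_rank(A)
print(rank)   # → 2
```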
Let $A$ be an $n$-by-$n$ square matrix. Prove that $A$ has rank $n$ if and only if $A$ is invertible.
$A$ has rank $n$ if and only if the row space equals $\mathbb{R}^n$. But the kernel of $A$ is precisely the orthogonal complement of the row space, and hence the row space is all of $\mathbb{R}^n$ if and only if $\ker(A) = \{0\}$. We know $A$ is invertible if and only if its kernel is zero, so we are done.
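A numerical spot-check of this equivalence, using the standard fact that a square matrix is invertible exactly when its determinant is nonzero (the random matrices and sample size below are arbitrary choices for illustration):

```python
import numpy as np

# For random 4-by-4 integer matrices, full rank (rank = n) should
# coincide exactly with invertibility (nonzero determinant).
rng = np.random.default_rng(1)
agree = True
for _ in range(50):
    A = rng.integers(-3, 4, size=(4, 4)).astype(float)
    full_rank = np.linalg.matrix_rank(A) == 4
    invertible = not np.isclose(np.linalg.det(A), 0.0)
    agree &= (full_rank == invertible)

print(agree)   # → True
```

Integer entries are used so that the determinant is an exact integer, which keeps the zero/nonzero test numerically robust.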