Row and Column Spaces
In linear algebra, when studying a particular matrix, one is often interested in determining vector spaces associated with the matrix, so as to better understand how the corresponding linear transformation operates. Two important examples of associated subspaces are the row space and column space of a matrix.
Suppose \(A\) is an \(m\)-by-\(n\) matrix, with rows \(r_1, \ldots, r_m \in \mathbb{R}^n\) and columns \(c_1, \ldots, c_n \in \mathbb{R}^m\). The row space \(R(A)\) is the subspace of \(\mathbb{R}^n\) spanned by the vectors \(\{r_i\}_{1\le i \le m}\); similarly, the column space \(C(A)\) is the subspace of \(\mathbb{R}^m\) spanned by the vectors \(\{c_i\}_{1\le i \le n}\).
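For readers who like to experiment, here is a minimal computational sketch of these definitions, written in Python with the sympy library (the choice of library is our assumption, not part of the text). The `rowspace` and `columnspace` methods return bases for \(R(A)\) and \(C(A)\), respectively.

```python
# A minimal sketch of the definitions above, assuming the sympy library.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [4, 5, 6]])   # a sample 2-by-3 matrix

# rowspace() returns a basis of R(A) as a list of 1-by-3 row vectors;
# columnspace() returns a basis of C(A) as a list of 2-by-1 column vectors.
print(A.rowspace())     # basis of R(A), a subspace of R^3
print(A.columnspace())  # basis of C(A), a subspace of R^2
```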
Example Computation
Consider the matrix
\[A = \begin{pmatrix} 2 & 1 & 0 \\ 3& -1 & 2 \end{pmatrix}.\]
Compute the row space \(R(A)\) and the column space \(C(A)\).
The rows of this matrix are \((2,1,0)\) and \((3,-1,2)\), so the row space \(R(A)\) is the span of these two vectors in \(\mathbb{R}^3\). In particular, since these rows are linearly independent, \(R(A)\) is a two-dimensional subspace of \(\mathbb{R}^3\). To determine this subspace explicitly, note that both rows are orthogonal to \((-2,4,5)\), which is (up to sign) the cross product of the two rows; hence they span the plane \(-2x + 4y + 5z = 0\).
To determine the column space of \(A\), first note the columns of the matrix are \((2,3)\), \((1,-1)\), and \((0,2)\). Since the first two of these vectors are linearly independent, \(C(A)\) is a two-dimensional subspace of \(\mathbb{R}^2\), and hence all of \(\mathbb{R}^2\). \(_\square\)
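A quick numerical check of this example, using Python with numpy (again an assumption on our part), confirms the rank, the plane equation, and the claim about the columns:

```python
# Numerical verification of the worked example, assuming numpy.
import numpy as np

A = np.array([[2, 1, 0],
              [3, -1, 2]])

print(np.linalg.matrix_rank(A))  # 2: dim R(A) = dim C(A) = 2

# Both rows are orthogonal to the normal vector (-2, 4, 5),
# so R(A) is the plane -2x + 4y + 5z = 0.
n = np.array([-2, 4, 5])
print(A @ n)                     # [0 0]

# The first two columns are linearly independent, so C(A) = R^2.
print(np.linalg.det(A[:, :2].astype(float)))  # -5.0, nonzero
```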
Note that, in the above computation, we have \(\dim\big(R(A)\big) = \dim\big(C(A)\big)\). This is an instance of the linear algebra fact that "row rank equals column rank," and is discussed in the article on rank.
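This equality can also be observed numerically: in the sketch below (numpy assumed), the rank of \(A\) agrees with the rank of \(A^T\), whose rows are exactly the columns of \(A\).

```python
# "Row rank equals column rank": rank(A) = rank(A^T), since the rows
# of A^T are exactly the columns of A. Assumes numpy.
import numpy as np

A = np.array([[2, 1, 0],
              [3, -1, 2]])
assert np.linalg.matrix_rank(A) == np.linalg.matrix_rank(A.T)  # both equal 2
```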
Consider the matrix
\[B= \begin{pmatrix} 8 & -5 & 2 \\0 & 2 & 1 \end{pmatrix}.\]
Note that the column space \(C(B)\) is just \(\mathbb{R}^2\), since the first two columns are linearly independent. The row space \(R(B)\) has equation \(ax+by+cz = 0\), where \(a,b,c\in \mathbb{Z}, a>0,\) and \(\gcd(a,b,c) =1\). What is \(a+b+c?\)
Interpretations of Row and Column Spaces
Let \(A\) be an \(m\)-by-\(n\) matrix, which corresponds to a linear transformation \(T: \mathbb{R}^n \to \mathbb{R}^m\). One can interpret the row and column spaces of \(A\) in terms of this transformation.
Suppose the columns of \(A\) are \(c_1, \ldots, c_n \in \mathbb{R}^m\). For any vector \((a_1, a_2, \ldots, a_n)\in \mathbb{R}^n\), one may compute \[T(a_1, \ldots, a_n) = a_1 c_1 + a_2 c_2 + \cdots + a_n c_n \in \mathbb{R}^m.\] This computation implies the image of \(T\) is precisely the column space \(C(A)\). One should think of this as a "coordinate-free" interpretation of the column space; no matter which matrix one chooses to represent the linear transformation \(T\), the column space of that matrix will always equal the image of \(T\).
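The identity \(T(a_1, \ldots, a_n) = a_1 c_1 + \cdots + a_n c_n\) is easy to verify directly; here is a small sketch (numpy assumed) comparing the matrix-vector product with the corresponding linear combination of columns:

```python
# T(a) as a linear combination of the columns of A. Assumes numpy.
import numpy as np

A = np.array([[2, 1, 0],
              [3, -1, 2]])
a = np.array([1, 2, 3])

# Matrix-vector product versus the combination a_1 c_1 + ... + a_n c_n.
combo = sum(a[i] * A[:, i] for i in range(A.shape[1]))
print(A @ a)   # [4 7]
print(combo)   # [4 7], the same vector, so the image of T lies in C(A)
```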
If \(A\) is a matrix representing a linear transformation \(T\), then the column space of \(A\) is the image of \(T\). In symbols, \[C(A) = \text{Im}(T).\]
Similarly, one can interpret the row space of \(A\) as follows: if the rows of \(A\) are \(r_1, \ldots, r_m \in \mathbb{R}^n\), then for any \(a\in \mathbb{R}^n\), one computes \[T(a) = (r_1 \cdot a, r_2 \cdot a, \ldots, r_m \cdot a),\] where \(\cdot\) denotes the dot product. Recall that the kernel of \(T\) is the subspace of \(\mathbb{R}^n\) consisting of all the vectors \(v\) such that \(Tv = 0 \in \mathbb{R}^m\). The computation above then shows that \(a\) is in the kernel of \(T\) if and only if it is orthogonal to each row \(r_i\). In other words, the kernel of \(T\) is precisely the space of vectors \(v\) such that \(v\cdot w = 0\) for all \(w\in R(A)\).
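For the example matrix \(A\) from earlier, the kernel is spanned by \((2,-4,-5)\); the following sketch (numpy assumed) checks that this vector is killed by \(T\) precisely because it is orthogonal to each row:

```python
# A kernel vector is exactly one orthogonal to every row. Assumes numpy.
import numpy as np

A = np.array([[2, 1, 0],
              [3, -1, 2]])
v = np.array([2, -4, -5])   # spans Ker(T) for this particular A

print(A @ v)                            # [0 0], so T(v) = 0
print([int(A[i] @ v) for i in range(2)])  # [0, 0]: v is orthogonal to each row
```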
Let \(V\) be a vector space and \(W \subset V\) a subspace; assume that \(V\) has a dot (inner) product \(\cdot\) defined on it. The orthogonal complement of \(W\) is the subspace \(W^{\perp}\) consisting of the vectors \(a\in V\) such that \(a\cdot b = 0\) for all \(b \in W\).
If \(A\) is a matrix representing a linear transformation \(T\), then the kernel of \(T\) is the orthogonal complement of \(R(A)\). In symbols, \[R(A)^{\perp} = \text{Ker}(T).\]
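This relationship can be checked computationally for the example matrix \(A\); the sketch below (sympy assumed) verifies that every nullspace basis vector is orthogonal to every row-space basis vector, and that the dimensions are complementary:

```python
# Checking R(A)^perp = Ker(T) for the example matrix, assuming sympy.
from sympy import Matrix

A = Matrix([[2, 1, 0],
            [3, -1, 2]])

kernel = A.nullspace()   # basis of Ker(T); here a single vector
rows = A.rowspace()      # basis of R(A); here two vectors

# Every kernel vector is orthogonal to every row-space basis vector.
for v in kernel:
    for r in rows:
        assert (r * v)[0, 0] == 0

# The dimensions are complementary inside R^3: dim R(A) + dim Ker(T) = 3.
print(len(rows) + len(kernel))  # 3

# The kernel basis vector is a scalar multiple of (-2, 4, 5),
# the normal vector to the row-space plane found earlier.
print(kernel[0].T)  # Matrix([[-2/5, 4/5, 1]])
```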