# Subspace

A **subspace** is a vector space that is entirely contained within another vector space. As a subspace is defined *relative* to its containing space, both are necessary to fully define one; for example, \(\mathbb{R}^2\) (identified with the set of vectors whose remaining components are zero) is a subspace of \(\mathbb{R}^3\), but also of \(\mathbb{R}^4\), \(\mathbb{C}^2\), etc.

The concept of a subspace is prevalent throughout abstract algebra; for instance, many of the common examples of a vector space are constructed as subspaces of \(\mathbb{R}^n\). Subspaces are also useful in analyzing properties of linear transformations, as in the study of fundamental subspaces and the fundamental theorem of linear algebra.

## Formal definition

Let \(V\) be a vector space. \(W\) is said to be a **subspace** of \(V\) if \(W\) is a nonempty subset of \(V\) and the following hold:

- If \(w_1, w_2 \in W\), then \(w_1 + w_2 \in W\).
- If \(w \in W\) and \(c\) is any scalar (e.g. a real number), then \(cw \in W\).

It can be shown that these two conditions are sufficient to ensure \(W\) is itself a vector space: taking \(c = 0\) shows that the zero vector lies in \(W\), taking \(c = -1\) produces additive inverses, and the remaining axioms (associativity, commutativity, distributivity, etc.) are inherited directly from \(V\).
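The closure conditions can only be *disproved* numerically (a finite check is never a proof), but a quick spot-check is a useful sanity test before attempting a proof. The sketch below is illustrative: `in_W` is a hypothetical membership rule picking out the plane \(x + y + z = 0\) inside \(\mathbb{R}^3\), and the sample vectors and scalars are arbitrary choices.

```python
# Hypothetical membership rule: the plane x + y + z = 0 in R^3.
def in_W(v):
    return abs(v[0] + v[1] + v[2]) < 1e-9

def spot_check_closure(in_W, samples, scalars):
    """Return True if every sampled sum and scalar multiple stays in W."""
    for u in samples:
        for w in samples:
            # Closure under addition: u + w should remain in W.
            if not in_W([a + b for a, b in zip(u, w)]):
                return False
        for c in scalars:
            # Closure under scalar multiplication: c*u should remain in W.
            if not in_W([c * a for a in u]):
                return False
    return True

# Sample vectors lying in the plane, plus a few scalars.
samples = [[t, -t, 0] for t in range(-2, 3)] + [[1, 1, -2]]
print(spot_check_closure(in_W, samples, [0, -1, 2.5]))   # True
```

A vector outside the plane, such as \((1, 0, 0)\), would immediately make the check fail, since its sum with itself already violates the membership rule.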

All vector spaces have at least two subspaces: the subspace consisting of the zero vector alone, and \(V\) itself. These are called the **trivial subspaces** and rarely have independent significance.

## Examples

The simplest way to generate a subspace is to restrict a given vector space by some rule. For instance, consider the set \(W\) of complex vectors \(\mathbf{v} \in \mathbb{C}^3\) such that

\[\begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot \mathbf{v} = 0\]

Of course, the set of \(\mathbf{v}\) satisfying this property is a subset of \(\mathbb{C}^3\), by the definition of "complex vector". Thus, if \(W\) can be shown to be a vector space in its own right, it is established as a subspace of \(\mathbb{C}^3\).

To show this, both conditions from the previous section have to be verified. But fortunately this is straightforward:

If \(\mathbf{w}_1, \mathbf{w}_2 \in W\), then \[\begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot \mathbf{w}_1 = 0, \quad \begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot \mathbf{w}_2 = 0\] by definition. Since the dot product distributes over addition, \[\begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot (\mathbf{w}_1 + \mathbf{w}_2) = \begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot \mathbf{w}_1 + \begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot \mathbf{w}_2 = 0 + 0 = 0,\] and so \(\mathbf{w}_1 + \mathbf{w}_2 \in W\).

Similarly, if \(\mathbf{w} \in W\) and \(c \in \mathbb{C}\), then scalars factor out of the dot product: \[\begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot (c\mathbf{w}) = c \left( \begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot \mathbf{w} \right) = c \cdot 0 = 0,\] so \(c\mathbf{w} \in W\) as desired.

Thus \(W\) is a subspace of \(\mathbb{C}^3\).
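The two closure checks above can also be spot-checked numerically. The sketch below is illustrative only (a finite check is not a proof): the member vectors and the scalar are arbitrary choices satisfying the defining equation.

```python
# Numerical spot-check of closure for W = { v in C^3 : (2, -3, 4) . v = 0 }.
def dot(a, b):
    """Dot product of two same-length vectors (works for complex entries)."""
    return sum(x * y for x, y in zip(a, b))

a = (2, -3, 4)

# Two members of W: 2*3 + (-3)*2 + 4*0 = 0 and 2*(2j) + 4*(-1j) = 0.
w1 = (3, 2, 0)
w2 = (2j, 0, -1j)
assert dot(a, w1) == 0 and dot(a, w2) == 0

# Closure under addition: the componentwise sum is again in W.
w_sum = tuple(x + y for x, y in zip(w1, w2))
print(dot(a, w_sum))   # 0j

# Closure under multiplication by an arbitrary complex scalar.
c = 1 - 2j
cw = tuple(c * x for x in w1)
print(dot(a, cw))      # 0j
```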

Not all subsets form subspaces, however. For example, consider the set \(W\) of complex vectors \(\mathbf{v} \in \mathbb{C}^3\) satisfying

\[\begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot \mathbf{v} = 12\]

This is not a subspace: any subspace must contain the zero vector of its parent space, but \(\mathbf{v} = \begin{pmatrix}0\\0\\0\end{pmatrix} \not\in W\), since its dot product with any vector is \(0 \ne 12\). The failure can also be seen directly from the closure conditions:

\[\begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot \begin{pmatrix}6\\0\\0\end{pmatrix} = 12 \implies \begin{pmatrix}6\\0\\0\end{pmatrix} \in W\] \[\begin{pmatrix}2\\-3\\4\end{pmatrix} \cdot \begin{pmatrix}0\\0\\3\end{pmatrix} = 12 \implies \begin{pmatrix}0\\0\\3\end{pmatrix} \in W\] but \(\begin{pmatrix}6\\0\\0\end{pmatrix} + \begin{pmatrix}0\\0\\3\end{pmatrix} = \begin{pmatrix}6 \\ 0 \\ 3\end{pmatrix}\) is not in \(W\), since its dot product with \(\begin{pmatrix}2\\-3\\4\end{pmatrix}\) is \(24 \ne 12\).
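This failure of closure under addition is easy to demonstrate in code as well; the helper function below is an ad hoc sketch, not part of the original argument.

```python
# Two vectors satisfying (2, -3, 4) . v = 12 whose sum does not:
# the defining rule is not closed under addition.
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

a = (2, -3, 4)
u = (6, 0, 0)
v = (0, 0, 3)

print(dot(a, u), dot(a, v))   # 12 12  -> both satisfy the rule
print(dot(a, tuple(x + y for x, y in zip(u, v))))   # 24 -> their sum does not
```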

Let \(V\) be the vector space \(\mathbb{C}^2\), or the set of vectors with two complex number components. Which of the following, if any, are subspaces of \(V\)?

- The set of vectors \(\begin{pmatrix}x\\y\end{pmatrix}\) for which \(xy = 0\).
- The set of vectors \(\begin{pmatrix}x\\y\end{pmatrix}\) for which \(x\) and \(y\) are both integers.

## Fundamental subspaces

*Main article: Fundamental subspaces*

There are also several fundamental subspaces of a given matrix \(A\): the **column space** (and, analogously, the **row space**) and the **nullspace**.

The **column space** of \(A\) is the span of the columns of \(A\), i.e. the set of all linear combinations of the columns. A span is automatically a vector space, but the columns themselves may be linearly dependent, in which case they form a redundant spanning set rather than a basis. A basis can always be extracted from them, and the number of vectors in any basis of the column space (its dimension) is called the **rank** of \(A\). A famous theorem -- which is part of the fundamental theorem of linear algebra -- states that the dimension of the column space is the same as the dimension of the row space (defined analogously as the span of the rows of \(A\)).
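The equality of column rank and row rank can be observed on a concrete matrix. The sketch below computes rank by Gaussian elimination with exact rational arithmetic; the matrix is an arbitrary example chosen to have a dependent row.

```python
# Rank via row reduction, illustrating that rank(A) = rank(A^T).
from fractions import Fraction

def rank(matrix):
    """Rank of a matrix (given as a list of rows), via Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in matrix]
    count, rows, cols = 0, len(m), len(m[0])
    for col in range(cols):
        # Find a row at or below `count` with a nonzero entry in this column.
        pivot = next((r for r in range(count, rows) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[count], m[pivot] = m[pivot], m[count]
        # Eliminate the entries below the pivot.
        for r in range(count + 1, rows):
            factor = m[r][col] / m[count][col]
            m[r] = [x - factor * y for x, y in zip(m[r], m[count])]
        count += 1
    return count

A = [[1, 2, 3],
     [2, 4, 6],   # a multiple of the first row, so the rows are dependent
     [0, 1, 1]]
At = [list(col) for col in zip(*A)]   # transpose: its rows are A's columns

print(rank(A), rank(At))   # 2 2 -- column rank equals row rank
```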

The **nullspace** (or **kernel**) is defined as the set of vectors \(\mathbf{v}\) for which \(A\mathbf{v}=0\). Unlike the column space, it is not immediately evident that this forms a vector space, but it is straightforward to show:

If \(\mathbf{v_1}, \mathbf{v_2} \in N\) (where \(N\) is the nullspace of \(A\)), then \(A\mathbf{v_1} = A\mathbf{v_2} = 0\) by definition. Then \[A(\mathbf{v_1} + \mathbf{v_2}) = A\mathbf{v_1} + A\mathbf{v_2} = 0\] so \(\mathbf{v_1} + \mathbf{v_2} \in N\).

Similarly, \(\mathbf{v} \in N \implies A\mathbf{v} = 0 \implies A(c \mathbf{v}) = c \cdot 0 = 0\), so \(c\mathbf{v} \in N\) for any scalar \(c\).

This establishes that the nullspace is a vector space as well, and hence a subspace of the domain of \(A\) (that is, of \(\mathbb{R}^n\) or \(\mathbb{C}^n\) when \(A\) has \(n\) columns).
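The nullspace closure argument above can be spot-checked on a concrete matrix; the matrix and sample vectors below are illustrative choices, not a proof.

```python
# Spot-check: sums and scalar multiples of nullspace vectors stay in the nullspace.
def matvec(A, v):
    """Matrix-vector product, with A given as a list of rows."""
    return [sum(a * x for a, x in zip(row, v)) for row in A]

A = [[1, 2, -1],
     [2, 4, -2]]   # second row is twice the first

# Two nullspace members: 1*1 + 2*0 - 1*1 = 0 and 1*0 + 2*1 - 1*2 = 0.
v1 = [1, 0, 1]
v2 = [0, 1, 2]
assert matvec(A, v1) == [0, 0] and matvec(A, v2) == [0, 0]

# Their sum and a scalar multiple are also sent to zero, as shown above.
v_sum = [x + y for x, y in zip(v1, v2)]
print(matvec(A, v_sum))                 # [0, 0]
print(matvec(A, [5 * x for x in v1]))   # [0, 0]
```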

In fact, the column space and nullspace are intricately connected by the rank-nullity theorem, which states that for an \(m \times n\) matrix \(A\), \[\operatorname{rank}(A) + \dim N(A) = n.\] This theorem is in turn part of the fundamental theorem of linear algebra.