
Basis of \(\mathbb{R}^3\)

In this note, we are going to generalize a technique we used in another note to the entirety of \(\mathbb{R}^3\). In the aforementioned note, we constructed a linear transformation that helped us prove a certain property of the cross product. In particular, we showed that we may "shift" our basis vectors so that the coordinates of other vectors are easier to deal with. Here we will show that any three linearly independent vectors in \(\mathbb{R}^3\) form a basis of \(\mathbb{R}^3\). More precisely, we will prove:

Given any three linearly independent vectors \(\vec{p}\), \(\vec{q}\) and \(\vec{v}\) in \(\mathbb{R}^3\), there exists a linear transformation \(T : \mathbb{R}^3 \rightarrow \mathbb{R}^3\), \(T(\textbf{x}) = \textbf{Ax}\), such that:

If \(\textbf{x} = \left[\begin{array}{ccc} \vec{p} & \vec{q} & \vec{v}\end{array} \right]\), then \(T(\textbf{x}) = \textbf{I}\)


Before we prove the above statement, we must pick vectors that we are certain are linearly independent. To do this, we will need to prove two things:

(1) If \(\vec{a}, \vec{b} \in \mathbb{R}^n\) are both nonzero and \(\exists\) \(k \in \mathbb{R}\) such that \(\vec{a} = k \vec{b}\), then \(\vec{a} \cdot \vec{b} \neq 0\)

(2) If \(\vec{a}, \vec{b} \in \mathbb{R}^3\) are both nonzero and \(\vec{a} \neq k \vec{b}\) for any \(k \in \mathbb{R}\), then there do not exist \(c_1, c_2 \in \mathbb{R}\) such that \(\vec{a} \times \vec{b} = c_1\vec{a} +c_2\vec{b}\)

(1) says that two linearly dependent, nonzero vectors can never have a null inner product. (2) says that the cross product of any two linearly independent vectors is linearly independent of the two vectors that are crossed.

Proof for (1):

Let \(\vec{b} = <b_1,...,b_n>\). If \(\vec{a} = k \vec{b}\), then:

\(\vec{a} \cdot \vec{b} = k\vec{b} \cdot \vec{b} = k|\vec{b}|^2 = k \displaystyle \sum_{i=1}^n b_i^2\)

Suppose that \(\vec{a} \cdot \vec{b} = 0\)

Then \(k \displaystyle \sum_{i=1}^n b_i^2 = 0\)

Since \(\vec{a} \neq \vec{0}\) and \(\vec{a} = k\vec{b}\), we must have \(k \neq 0\), and so \(\displaystyle \sum_{i=1}^n b_i^2 = 0\)

But \(b_i^2 \geq 0 \) \(\forall\) \(1 \leq i \leq n\), and a sum of nonnegative terms is zero only if every term is zero. It follows that every \(b_i = 0\), i.e. \(\vec{b} = \vec{0}\), contrary to our assumption that \(\vec{b}\) is nonzero. This contradiction forces us to conclude that \(\vec{a} \cdot \vec{b} \neq 0\)

This proves (1).
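For a quick numerical illustration (the specific vectors here are chosen arbitrarily and are not part of the proof): take \(\vec{b} = <1,2,2>\) and \(\vec{a} = 3\vec{b} = <3,6,6>\). Then \(\vec{a} \cdot \vec{b} = 3|\vec{b}|^2 = 3(1^2 + 2^2 + 2^2) = 27 \neq 0\), exactly as (1) predicts.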

Proof for (2):

Define \(\vec{a} = <a_1,a_2,a_3>\)

\(\vec{b} = <b_1,b_2,b_3>\)

We have: \(\vec{a} \times \vec{b} = <a_2b_3 - a_3b_2,-(a_1b_3-a_3b_1),a_1b_2 - a_2b_1>\)
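For example (an illustrative computation with arbitrarily chosen vectors): if \(\vec{a} = <1,2,0>\) and \(\vec{b} = <0,1,1>\), the formula gives \(\vec{a} \times \vec{b} = <2 \cdot 1 - 0 \cdot 1, -(1 \cdot 1 - 0 \cdot 0), 1 \cdot 1 - 2 \cdot 0> = <2,-1,1>\).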

Supposing that \(\exists\) \(c_1, c_2 \in \mathbb{R}\) such that \(\vec{a} \times \vec{b} = c_1\vec{a} + c_2 \vec{b}\), we have:

\(a_2b_3 - a_3b_2 = c_1a_1 +c_2b_1\)

\(-(a_1b_3-a_3b_1) = c_1a_2 + c_2b_2\)

\(a_1b_2 - a_2b_1=c_1a_3 + c_2b_3\)

\(\Rightarrow\) \(\left[ \begin{array}{c} a_2b_3 - a_3b_2 \\ -(a_1b_3-a_3b_1) \\ a_1b_2 - a_2b_1\end{array} \right] = \left[ \begin{array}{cc} a_1 & b_1 \\ a_2 & b_2 \\ a_3 & b_3 \end{array} \right] \left[ \begin{array}{c} c_1\\ c_2 \end{array} \right]\)

A solution \((c_1, c_2)\) exists only if \(\vec{a} \times \vec{b}\) lies in the column space of the \(3 \times 2\) matrix above, that is, in the span of \(\vec{a}\) and \(\vec{b}\). But a direct check with the component formula above shows that \(\vec{a} \times \vec{b}\) is orthogonal to both \(\vec{a}\) and \(\vec{b}\). Taking the dot product of \(\vec{a} \times \vec{b} = c_1\vec{a} + c_2\vec{b}\) with \(\vec{a} \times \vec{b}\) therefore gives \(|\vec{a} \times \vec{b}|^2 = c_1 \cdot 0 + c_2 \cdot 0 = 0\), i.e. \(\vec{a} \times \vec{b} = \vec{0}\). For nonzero vectors this happens only when \(\vec{a} = k\vec{b}\) for some \(k \in \mathbb{R}\), contradicting our hypothesis. Then such \(c_1,c_2\) do not exist.

This proves (2).
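Returning to the earlier example (again just an illustrative check): with \(\vec{a} = <1,2,0>\), \(\vec{b} = <0,1,1>\) and \(\vec{a} \times \vec{b} = <2,-1,1>\), we have \((\vec{a} \times \vec{b}) \cdot \vec{a} = 2 - 2 + 0 = 0\) and \((\vec{a} \times \vec{b}) \cdot \vec{b} = 0 - 1 + 1 = 0\), while \(|\vec{a} \times \vec{b}|^2 = 6 \neq 0\). So no combination \(c_1\vec{a} + c_2\vec{b}\) can equal \(\vec{a} \times \vec{b}\), just as (2) asserts.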

Now we proceed to prove our original statement. Choose any vector \(\vec{v}\) in \(\mathbb{R}^3\).

Define \(\vec{p} \in \mathbb{R}^3\) such that: \(\vec{v} \cdot \vec{p} = 0\)

Define \(\vec{q} \in \mathbb{R}^3\) such that \(\vec{q} = \vec{v} \times \vec{p}\)

Assume none of these vectors are the null vector.

By (1), since \(\vec{v} \cdot \vec{p} = 0\) and both vectors are nonzero, \(\vec{v}\) cannot be a scalar multiple of \(\vec{p}\). By (2), \(\vec{q} = \vec{v} \times \vec{p}\) is then not a linear combination of \(\vec{v}\) and \(\vec{p}\). Hence \(\vec{p}, \vec{q}, \vec{v}\) are linearly independent. Define:

\(\textbf{x} = \left[\begin{array}{ccc} \vec{p} & \vec{q} & \vec{v}\end{array} \right]\)

Now, we need to find a transformation matrix \(A\) such that:

\(A\textbf{x} = \textbf{I}\)

If this statement is true, then clearly \(A=\textbf{x}^{-1}\). Then \(A\) exists if and only if \(\det( \textbf{x}) \neq 0\). We could do some matrix algebra here (write our vectors in components and compute the determinant), but this is unnecessary. All we need to observe is that if \(\det( \textbf{x}) = 0\), then the columns of \(\textbf{x}\) must be linearly dependent. We have already shown, through our choice of \(\vec{p}, \vec{q}, \vec{v}\) together with (1) and (2), that this cannot happen. Therefore, \(\det( \textbf{x}) \neq 0\) and \(A\) exists.

If we define \(T(\textbf{x}) = A\textbf{x}\), we've shown that \(T(\textbf{x}) =\textbf{I}\) and hence proved our statement.
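As a concrete check of the whole construction (the vectors here are just one convenient choice, not required by the proof): take \(\vec{v} = <1,1,0>\) and \(\vec{p} = <1,-1,0>\), so that \(\vec{v} \cdot \vec{p} = 0\), and let \(\vec{q} = \vec{v} \times \vec{p} = <0,0,-2>\). Then \(\textbf{x} = \left[ \begin{array}{ccc} 1 & 0 & 1 \\ -1 & 0 & 1 \\ 0 & -2 & 0 \end{array} \right]\) and \(\det(\textbf{x}) = 4 \neq 0\), so \(A = \textbf{x}^{-1}\) exists and \(T(\textbf{x}) = A\textbf{x} = \textbf{I}\).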


We've actually proven three very important concepts here. The primary statement that we successfully proved is fundamental to the field of Linear Algebra (in its more general form). Essentially, this proof justifies the "tilted axis" technique many students of elementary physics use to solve problems. Also, this proof implicitly establishes that three dimensional space can be represented by an orthonormal basis of your choosing. This is incredibly useful, as we saw in the aforementioned note. As elementary and simple as this proof is, I find the implications of it to be far reaching in a practical sense.

Note by Ethan Robinett
2 years, 2 months ago


Nice note, @Ethan Robinett – Anuj Shikarkhane · 2 years, 2 months ago


@Anuj Shikarkhane Thanks! – Ethan Robinett · 2 years, 2 months ago


