Linear Algebra

Dot Products and Inner Products

For many applications, it is important to know how “close” two vectors are, in terms of the angle between them. To this end, we define the dot product of two real-valued vectors to be \[a \cdot b = \left \lVert a \right \rVert \left \lVert b \right \rVert \cos\theta,\] where \( \left\lVert a \right\rVert \) is the norm (i.e. length) of \(a\) and \(\theta\) is the angle between the two vectors. In this way, the dot product is a function that takes in two vectors (of the same size) and returns a single number.

The definition of the dot product we've just seen is not always easy to work with. An alternate form is often more convenient (we'll see later why the two are equivalent): if \(a = (a_1, a_2, \ldots, a_n)\) and \(b = (b_1, b_2, \ldots, b_n)\), then \[a \cdot b = a_1b_1 + a_2b_2 + \cdots + a_nb_n.\] Consider the vectors \(a = (1, 2, 2)\) and \(b = (-3, 4, 0)\). If the angle between them is \(\theta\), what is \(\cos\theta?\)
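To see the two forms agree on a concrete example, here is a minimal Python sketch (using NumPy purely for illustration) that computes the coordinate form and then recovers \(\cos\theta\) from the geometric definition:

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([-3.0, 4.0, 0.0])

# Coordinate form: a1*b1 + a2*b2 + ... + an*bn
dot = np.dot(a, b)  # -3 + 8 + 0 = 5

# Geometric form, rearranged: cos(theta) = (a . b) / (||a|| ||b||)
cos_theta = dot / (np.linalg.norm(a) * np.linalg.norm(b))
print(cos_theta)  # 5 / (3 * 5) = 1/3
```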

The equivalence of these definitions also gives us an important inequality. Since \(\lvert \cos \theta \rvert \leq 1\), we have \[\lvert a_1b_1 + \cdots + a_nb_n \rvert = \lvert a \cdot b \rvert = \left \lVert a \right \rVert \left \lVert b \right \rVert \lvert \cos\theta \rvert \leq \left \lVert a \right \rVert \left \lVert b \right \rVert. \] Squaring both sides then gives \[\big(a_1^2 + a_2^2 + \cdots + a_n^2\big)\big(b_1^2 + b_2^2 + \cdots + b_n^2\big) \geq (a_1b_1 + a_2b_2 + \cdots + a_nb_n)^2.\] This is called the Cauchy-Schwarz inequality.
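The inequality is easy to spot-check numerically. The sketch below (dimension and sample count chosen arbitrarily) asserts it over many random vectors:

```python
import numpy as np

rng = np.random.default_rng(0)
for _ in range(1000):
    a = rng.normal(size=5)
    b = rng.normal(size=5)
    # (a1^2 + ... + an^2)(b1^2 + ... + bn^2) >= (a1*b1 + ... + an*bn)^2
    assert np.sum(a**2) * np.sum(b**2) >= np.dot(a, b)**2 - 1e-12
print("Cauchy-Schwarz held on all samples")
```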

Suppose \(x, y, z\) are real numbers satisfying \(x^2 + 2y^2 + 3z^2 = 6\). What is the maximum possible value of \(x + y + z?\) (Before attempting this problem, you may want to consult our wiki on Cauchy-Schwarz.)
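If you'd like to check your answer numerically after solving the problem by hand, one option is a constrained optimizer; this sketch uses SciPy's minimize (the setup here is our own, not part of the problem):

```python
import numpy as np
from scipy.optimize import minimize

# Maximize x + y + z subject to x^2 + 2y^2 + 3z^2 = 6
# by minimizing the negative of the objective.
constraint = {"type": "eq",
              "fun": lambda p: p[0]**2 + 2*p[1]**2 + 3*p[2]**2 - 6}
result = minimize(lambda p: -(p[0] + p[1] + p[2]),
                  x0=[1.0, 1.0, 1.0], constraints=[constraint])
print(-result.fun)  # compare with the bound Cauchy-Schwarz gives
```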

The dot product tells us how close real-valued vectors are, but as we’ve seen throughout this chapter, there are lots of vector spaces where our current definition of the dot product doesn’t make sense (e.g. what is the dot product of two matrices?). To resolve this, we extend the dot product to inner products on real vector spaces, which are essentially generalized dot products.

Inner products are denoted by \(\langle v, w \rangle\), where \(v, w\) are vectors and the output is a scalar. The important properties an inner product needs to satisfy are motivated by the dot product:

1. Distributivity: \(\langle u + v, w \rangle = \langle u, w \rangle + \langle v, w \rangle\)

2. Linearity: \(c\langle v, w \rangle = \langle cv, w \rangle\)

3. Commutativity: \(\langle v, w \rangle = \langle w, v \rangle\)

4. Positive-definiteness: \(\langle v, v \rangle \geq 0\) for all \(v\), and \(\langle v, v \rangle = 0\) if and only if \(v = 0\).

It’s a good exercise to make sure “our” dot product satisfies all of these!
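If you want to carry out that exercise numerically rather than on paper, here is a minimal sketch that spot-checks each property on random vectors (a check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(1)
u, v, w = rng.normal(size=4), rng.normal(size=4), rng.normal(size=4)
c = rng.normal()

# 1. Distributivity: <u + v, w> = <u, w> + <v, w>
assert np.isclose(np.dot(u + v, w), np.dot(u, w) + np.dot(v, w))
# 2. Linearity: c<v, w> = <cv, w>
assert np.isclose(c * np.dot(v, w), np.dot(c * v, w))
# 3. Commutativity: <v, w> = <w, v>
assert np.isclose(np.dot(v, w), np.dot(w, v))
# 4. Positive-definiteness: <v, v> >= 0, and <0, 0> = 0
assert np.dot(v, v) >= 0 and np.dot(np.zeros(4), np.zeros(4)) == 0
```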

Which of the following is an inner product on the vector space of real numbers?

It’s even possible to extend inner products to non-numeric vector spaces. For instance, we’ve seen previously that continuous functions on \([a,b]\) form a vector space. We can define an inner product for them as \[\langle f, g \rangle = \int_a^b f(x)g(x)dx,\] which isn’t too hard to verify. Even random variables, which again form a vector space, have an inner product \[\langle X, Y \rangle = \mathbb{E}[XY].\] Finally, we can extend inner products to complex vector spaces as well, substituting the commutativity condition with \(\langle v, w \rangle = \overline{\langle w, v \rangle}\) (for real vector spaces this reduces to the original).
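As a concrete illustration, the following sketch evaluates the function inner product numerically (using SciPy's quad; the functions and interval are our own choices):

```python
import numpy as np
from scipy.integrate import quad

def inner(f, g, a, b):
    """<f, g> = integral from a to b of f(x) g(x) dx, computed numerically."""
    value, _error = quad(lambda x: f(x) * g(x), a, b)
    return value

# <sin, cos> on [0, pi] is 0: sin and cos are orthogonal on this interval.
print(inner(np.sin, np.cos, 0.0, np.pi))  # ~0.0
```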

As we’ll see momentarily, orthogonal vectors are extremely important in constructing bases. Recall that, for real vectors, orthogonal means the same thing as perpendicular. With this in mind, what is the dot product of two orthogonal vectors?

An orthogonal set is a set of vectors, any pair of which have dot product 0. For instance, the standard basis of \(\mathbb{R}^3\), \[\left\{ \begin{pmatrix}1\\0\\0\end{pmatrix}, \begin{pmatrix}0\\1\\0\end{pmatrix}, \begin{pmatrix}0\\0\\1\end{pmatrix} \right\},\] is orthogonal. Which of the following is true of any orthogonal set that contains no zero vector?

Orthogonal sets are often much easier to work with than some of the “weird” bases we’ve seen before: compare e.g. the standard basis to \(\left\{ \begin{pmatrix}4\\2\end{pmatrix}, \begin{pmatrix}2\\4\end{pmatrix} \right\}.\) Thus, given a basis for a vector space, it is useful to transform it into an orthogonal basis.

This can be achieved using the Gram-Schmidt algorithm, which we’ll look at over the next few problems. We define \[\text{proj}_u(v) = \frac{u \cdot v}{u \cdot u}u,\] which projects \(v\) onto \(u\).

For example, what point represents \(\text{proj}_{ \begin{pmatrix} 3\\4\end{pmatrix}}\left( \begin{pmatrix} 2\\1\end{pmatrix} \right)\)?
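Translated directly into code, the projection operator is a one-liner; this sketch applies it to the example above:

```python
import numpy as np

def proj(u, v):
    """proj_u(v) = ((u . v) / (u . u)) * u."""
    return (np.dot(u, v) / np.dot(u, u)) * u

u = np.array([3.0, 4.0])
v = np.array([2.0, 1.0])
print(proj(u, v))  # (u . v)/(u . u) = 10/25, so the result is (1.2, 1.6)
```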

Now we can use the projection operator to describe the Gram-Schmidt algorithm. Given a basis \[(v_1, v_2, \ldots, v_n),\] we make a new set \[(u_1, u_2, \ldots, u_n),\] where \[ \begin{align*} u_1 &= v_1 \\ u_2 &= v_2 - \text{proj}_{u_1}(v_2) \\ u_3 &= v_3 - \text{proj}_{u_1}(v_3) - \text{proj}_{u_2}(v_3) \\ u_4 &= v_4 - \text{proj}_{u_1}(v_4) - \text{proj}_{u_2}(v_4) - \text{proj}_{u_3}(v_4), \end{align*} \] and so on. This produces an orthogonal basis (why?).

For instance, let’s look again at the basis of \(\mathbb{R}^2\) given by \[\left\{\begin{pmatrix}4\\2\end{pmatrix}, \begin{pmatrix}2\\4\end{pmatrix}\right\}.\] The Gram-Schmidt algorithm produces an orthogonal basis with two vectors. One of them is \(\begin{pmatrix}4\\2\end{pmatrix}\). What is the other?
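Here is a minimal Python sketch of the algorithm as just described (the helper names are our own); running it on the basis above lets you check your answer:

```python
import numpy as np

def proj(u, v):
    """proj_u(v) = ((u . v) / (u . u)) * u."""
    return (np.dot(u, v) / np.dot(u, u)) * u

def gram_schmidt(vectors):
    """Subtract from each v_k its projections onto the u's built so far."""
    us = []
    for v in vectors:
        u = v - sum((proj(u_prev, v) for u_prev in us),
                    start=np.zeros_like(v))
        us.append(u)
    return us

basis = [np.array([4.0, 2.0]), np.array([2.0, 4.0])]
for u in gram_schmidt(basis):
    print(u)  # the first vector is unchanged; the second is orthogonal to it
```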

In this quiz, we explored the dot product, which is a way to “multiply” vectors and tells us how “close” the vectors are. We also generalized the dot product to vector spaces other than \(\mathbb{R}^n\), even non-numeric ones such as function spaces, using inner products. Finally, we applied this to our continuing exploration of bases, by observing how the dot product relates to orthogonal bases and in particular how they can be constructed via the Gram-Schmidt algorithm.
