# Cauchy-Schwarz Inequality

The *Cauchy-Schwarz inequality* states that for all sequences of real numbers \( a_i\) and \(b_i \), we have \(\left(\displaystyle \sum_{i=1}^n a_i^2\right)\left( \displaystyle \sum_{i=1}^n b_i^2\right)\ge \left( \displaystyle \sum_{i=1}^n a_ib_i\right)^2\). Equality holds if and only if \(a_i=kb_i \) for some constant \( k\in\mathbb{R}\), i.e. when the two sequences are proportional. It can be generalized to Hölder's inequality.

Not only is this inequality useful for proving Olympiad inequality problems, but it also appears in multiple branches of mathematics, including linear algebra, probability theory, and mathematical analysis.


## Using the Cauchy-Schwarz Inequality

We begin with the statement of the inequality and then move on to how it is used.

The Cauchy-Schwarz inequality states that for all real numbers \( a_i\) and \(b_i \), we have \[\left(\sum_{i=1}^n a_i^2\right)\left(\sum_{i=1}^n b_i^2\right)\ge \left(\sum_{i=1}^n a_ib_i\right)^2,\]

where equality holds if and only if \(\frac{a_i}{b_i}=k\) for some constant \( k\in\mathbb{R}\), for all \(1\le i\le n\) which have \( a_ib_i \neq 0 \). \(_\square\)

There are many reformulations of this inequality, including a vector form and a complex-number version, but the elementary form above is all we need to tackle Olympiad problems and problems in other areas.

To get a feel for it, let's consider the case of \(2\) terms. Let the two sequences be \(\{a,b\}\) and \(\{c,d\}\). Then by the Cauchy-Schwarz inequality we have

\[\left(a^2+b^2\right)\left(c^2+d^2\right)\ge \left(ac+bd\right)^2.\]

This easily follows from the Fibonacci-Brahmagupta identity

\[\left(a^2+b^2\right)\left(c^2+d^2\right)=\left(ac+bd\right)^2+\left(ad-bc\right)^2\ge \left(ac+bd\right)^2.\]

Notice that equality holds when \(ad-bc=0,\) i.e. \(\dfrac{a}{c}=\dfrac{b}{d}\) when the denominators are non-zero. The case of \(3\) terms states that

\[ \left( a^2 + b^2 + c^2 \right) \left( d^2 + e^2 + f^2 \right) \geq ( ad + be + cf )^2. \]
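As a quick numerical sanity check of the two-term form and the Fibonacci-Brahmagupta identity, here is a short Python sketch (the sample values are arbitrary, chosen just for illustration):

```python
# Check the two-term Cauchy-Schwarz inequality and the
# Fibonacci-Brahmagupta identity on arbitrary sample values.
a, b, c, d = 3.0, -1.0, 2.0, 5.0

lhs = (a**2 + b**2) * (c**2 + d**2)   # (9 + 1)(4 + 25) = 290.0
rhs = (a*c + b*d)**2                  # (6 - 5)^2       = 1.0
gap = (a*d - b*c)**2                  # (15 + 2)^2      = 289.0

assert lhs >= rhs
# The identity says the gap between the two sides is exactly (ad - bc)^2.
assert abs(lhs - rhs - gap) < 1e-9
```

Swapping in any other real values leaves both assertions true, mirroring the fact that the identity holds for all reals.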

Let's work through an example followed by a problem to try yourself.

## Given \(a_1, a_2, \ldots, a_n\in\mathbb{R} \) such that \( a_1+a_2+\cdots+a_n=1\), prove that

\[a_1^{2}+a_2^{2}+\cdots+a_n^{2}\ge\frac{1}{n}.\]

By Cauchy-Schwarz, we have \[\begin{align} (1\cdot a_1+1\cdot a_2+\cdots+1\cdot a_n)^{2}&\le \left(a_1^{2}+a_2^{2}+\cdots+a_n^{2}\right)\left(1^2+1^2+\cdots+1^2\right)\\ 1&\le n\left(a_1^{2}+a_2^{2}+\cdots+a_n^{2}\right)\\ a_1^{2}+a_2^{2}+\cdots+a_n^{2}&\ge\frac{1}{n}, \end{align}\] where the second line uses \( a_1+a_2+\cdots+a_n=1\). We have equality when \( \dfrac{a_i} { 1} \) is a constant, or when \( a _ i \) is a constant. This happens when \[ a_1 = a_2 = \cdots = a_n = \frac{1}{n}. \ _\square \]
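We can also probe this bound numerically; the following Python sketch (illustrative only, not part of the proof) rescales a random positive sequence so it sums to \(1\) and checks the inequality and its equality case:

```python
import random

# For a random sequence rescaled to sum to 1, the sum of squares
# should be at least 1/n, with the minimum attained at a_i = 1/n.
random.seed(0)
n = 8
raw = [random.uniform(0.1, 1.0) for _ in range(n)]
total = sum(raw)
a = [x / total for x in raw]          # rescale so the entries sum to 1

sum_sq = sum(x * x for x in a)
assert abs(sum(a) - 1.0) < 1e-9
assert sum_sq >= 1.0 / n

# Equality case: the constant sequence a_i = 1/n gives exactly 1/n.
assert abs(sum((1.0 / n) ** 2 for _ in range(n)) - 1.0 / n) < 1e-12
```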

The following is one of the most common examples of the use of Cauchy-Schwarz. We can easily generalize this approach to show that if \( x^2 + y^2 + z^2 = 1 \), then the maximum value of \( ax + by + cz \) is \( \sqrt{ a^2 + b^2 + c^2 } \).

## If \( x^2 + y^2 + z^2 = 1 \), what is the maximum value of \( x + 2y + 3z \)?

We have \( ( x + 2y + 3z ) ^ 2 \leq \left( 1^2 + 2^2 + 3^2 \right) \left( x^2 + y^2 + z^2 \right) = 14 .\)

Hence, \( x + 2y + 3z \leq \sqrt{14} \). Equality holds when \( \frac{x}{1} = \frac{y}{2} = \frac{z}{3} \).

Together with \( x^2 + y^2 + z^2 = 1 \), we get \[ x = \frac{1}{ \sqrt{14} }, y = \frac{2}{ \sqrt{14} } , z = \frac{3} { \sqrt{14} }. \ _\square \]
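To see this maximum in action, here is a hedged Python sketch (sampling is purely illustrative; the sample count and seed are arbitrary) that samples random points on the unit sphere and confirms \( x + 2y + 3z \) never exceeds \( \sqrt{14} \), while the equality point from the solution attains it:

```python
import math
import random

# Sample random directions on the unit sphere and check that
# x + 2y + 3z never exceeds sqrt(14).
random.seed(1)
bound = math.sqrt(14)

best = -float("inf")
for _ in range(10000):
    v = [random.gauss(0, 1) for _ in range(3)]
    r = math.sqrt(sum(t * t for t in v))
    x, y, z = (t / r for t in v)          # a uniform point on the sphere
    best = max(best, x + 2*y + 3*z)

assert best <= bound + 1e-9

# The equality point from the solution lies on the sphere and hits the bound.
x, y, z = 1/bound, 2/bound, 3/bound
assert abs(x**2 + y**2 + z**2 - 1) < 1e-9
assert abs(x + 2*y + 3*z - bound) < 1e-9
```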

Making the right choice of \(a_i\) and \(b_i\) can simplify a problem, as the following problem shows:

## For positive reals \(a, b, c\), show that

\[ abc (a +b + c) \leq a^3 b + b^3 c + c^3 a .\]

At first glance, it is not clear how we can apply Cauchy-Schwarz, as there are no squares that we can use. Furthermore, the RHS is not a perfect square. The power of Cauchy-Schwarz is that it is extremely versatile, and the right choice of \( a_i\) and \(b_i \) can simplify the problem.

By Cauchy-Schwarz, we have

\[ \small \left ( \frac{a}{ \sqrt{c} } \times \sqrt{c} + \frac{ b}{ \sqrt{a} } \times \sqrt{a} + \frac{c}{ \sqrt{b} } \times \sqrt{b} \right) ^2 \leq \left ( \frac{ a^2 }{c} + \frac{ b^2}{a} + \frac{ c^2 } { b} \right) ( c + a + b ). \]

Dividing both sides by \( a + b + c > 0 \), we get \( a + b + c \leq \frac{ a^2 }{c} + \frac{ b^2}{a} + \frac{ c^2 } { b} \). Multiplying throughout by \( abc\), we obtain the given inequality. \( _ \square \)
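A brute-force numerical spot-check of the inequality, sketched in Python (the ranges and trial count are arbitrary choices for illustration):

```python
import random

# Spot-check abc(a+b+c) <= a^3*b + b^3*c + c^3*a over random positive triples.
random.seed(2)

def holds(a, b, c):
    # Small slack guards against floating-point rounding.
    return a*b*c*(a + b + c) <= a**3*b + b**3*c + c**3*a + 1e-6

trials = [(random.uniform(0.01, 10),
           random.uniform(0.01, 10),
           random.uniform(0.01, 10)) for _ in range(1000)]
all_hold = all(holds(a, b, c) for a, b, c in trials)
assert all_hold
```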

Now give it one more try yourself with this practical use of the Cauchy-Schwarz inequality:

Suppose \(a, b, c\) \((a<b<c)\) are coprime integers, and there are the following two sets of gift boxes:

- \(3\) pink cubic boxes of respective side lengths \(a, b, c\)
- \(3\) identical blue cuboid boxes of dimensions \(a\times b\times c\).

Which set has a larger total surface area?

An often-used consequence of the Cauchy-Schwarz inequality is Titu's lemma. It is so important that we devote an entire page to it. It states that for \(a_i,b_i\in\mathbb{R}\) with \(b_i>0\),

\[\frac{a_1^{2}}{b_1}+\frac{a_2^{2}}{b_2}+\cdots+\frac{a_n^{2}}{b_n}\ge\frac{(a_1+a_2+\cdots+a_n)^{2}}{b_1+b_2+\cdots+b_n},\]

and the equality similarly holds iff \({\dfrac{a_1}{b_1}=\dfrac{a_2}{b_2}=\cdots=\dfrac{a_n}{b_n}}\). \(_\square\)
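Titu's lemma is easy to sanity-check numerically; this Python sketch (random data, illustrative only) verifies it for one random sequence pair with positive denominators:

```python
import random

# Spot-check Titu's lemma: sum(a_i^2 / b_i) >= (sum a_i)^2 / (sum b_i)
# for random numerators and positive denominators.
random.seed(3)
a = [random.uniform(-5, 5) for _ in range(6)]
b = [random.uniform(0.1, 5) for _ in range(6)]

lhs = sum(ai * ai / bi for ai, bi in zip(a, b))
rhs = sum(a) ** 2 / sum(b)
assert lhs >= rhs
```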

In the left diagram, the red rectangle is inscribed in a larger outside rectangle.

On the right, in the same outside rectangle, two blue rectangles are formed by the perpendicular lines arising from 2 adjacent vertices of the red rectangle.

Which one has a larger area, the red region or the blue region?

## Proof of the Cauchy-Schwarz Inequality

There are various ways to prove this inequality. A short proof is given below.

Consider the function

\[f(x)=\left(a_1x-b_1\right)^2+\left(a_2 x-b_2\right)^2+\cdots +\left(a_nx-b_n\right)^2.\]

Being a sum of squares, \(f(x)\) is always non-negative. Now we expand out \(f(x)\) and collect terms. Then it becomes

\[f(x)=\left(\sum_{i=1}^n a_i^2\right)x^2-2\left(\sum_{i=1}^n a_i b_i\right)x+\left(\sum_{i=1}^n b_i^2\right),\]

which is a quadratic in \(x\). The graph of this quadratic is an upward-opening parabola. Since \(f\ge 0\), the graph either touches the \(x\)-axis or lies entirely above it. That means \(f\) either is a perfect square (has a repeated root) or has no real root, so its discriminant is \(\le 0\). Computing the discriminant, we have

\[4\left(\sum_{i=1}^n a_i b_i\right)^2-4\left(\sum_{i=1}^n a_i^2\right)\left(\sum_{i=1}^n b_i^2\right)\le 0.\]

Dividing by \(4\) and rearranging yields the Cauchy-Schwarz inequality.

Notice that equality holds when \(f\) has a (necessarily repeated) real root \(x\). From the first form of \(f(x)\), a sum of squares equals \(0\) only when each square equals \(0\), so \(a_i x-b_i=0\), i.e. \(b_i = x\,a_i\), for all \(1\le i\le n\). That is exactly when the two sequences are proportional. \(_\square\)
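The discriminant argument can be illustrated numerically; this Python sketch (random sequences, illustrative only) checks that the discriminant is non-positive, and exactly zero for proportional sequences:

```python
import random

# For random sequences, the quadratic
#   f(x) = (sum a_i^2) x^2 - 2 (sum a_i b_i) x + (sum b_i^2)
# has non-positive discriminant.
random.seed(4)
a = [random.uniform(-3, 3) for _ in range(5)]
b = [random.uniform(-3, 3) for _ in range(5)]

A = sum(x * x for x in a)
B = sum(x * y for x, y in zip(a, b))
C = sum(y * y for y in b)

disc = 4 * B * B - 4 * A * C
assert disc <= 1e-9

# Proportional sequences give a zero discriminant (the equality case).
k = 2.5
b_prop = [k * x for x in a]
B2 = sum(x * y for x, y in zip(a, b_prop))
C2 = sum(y * y for y in b_prop)
assert abs(4 * B2 * B2 - 4 * A * C2) < 1e-5
```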

We can also derive the Cauchy-Schwarz inequality from the more general Hölder's inequality: simply put \( m = 2 \) and \( r = 2 \), and we arrive at Cauchy-Schwarz. As such, we say that Hölder's inequality generalizes Cauchy-Schwarz.

## Vector Form of Cauchy-Schwarz

This section is devoted to defining and proving the vector form of Cauchy-Schwarz inequality.

For all vectors \(x\) and \(y\) of a real inner product space, \[ |\langle x,y\rangle| ^2 \leq \langle x,x\rangle \cdot \langle y,y\rangle, \] where \(\langle\cdot,\cdot\rangle\) is the inner product.

Equivalently, \(|\langle x,y\rangle| \leq \|x\| \cdot \|y\|.\)

Equality holds if and only if the two vectors are linearly dependent. \(_\square\)

For a complex vector space, we have

\[ \left| \sum_{i=1}^n x_i \bar{y}_i \right|^2 \leq \sum_{j=1}^n |x_j|^2 \sum_{k=1}^n |y_k|^2.\]

Equality holds if and only if \(x\) and \(y\) are linearly dependent, that is, one is a scalar multiple of the other (which includes the case when one or both are zero).
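Python's built-in complex numbers let us check the complex form directly; the following sketch (hand-picked values, purely illustrative) verifies the inequality and the equality case for linearly dependent vectors:

```python
# Check |sum x_i * conj(y_i)|^2 <= (sum |x_i|^2)(sum |y_i|^2)
# using Python's built-in complex type.
x = [1 + 2j, 3 - 1j, -2 + 0.5j]
y = [2 - 1j, 0 + 1j, 1 + 1j]

inner = sum(xi * yi.conjugate() for xi, yi in zip(x, y))
lhs = abs(inner) ** 2
rhs = sum(abs(xi) ** 2 for xi in x) * sum(abs(yi) ** 2 for yi in y)
assert lhs <= rhs + 1e-9

# Linearly dependent vectors (y2 = (2+3j) * x) give equality.
y2 = [(2 + 3j) * xi for xi in x]
inner2 = sum(xi * yi.conjugate() for xi, yi in zip(x, y2))
rhs2 = sum(abs(xi) ** 2 for xi in x) * sum(abs(yi) ** 2 for yi in y2)
assert abs(abs(inner2) ** 2 - rhs2) < 1e-6
```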

Now we prove the vector form of Cauchy-Schwarz, writing \(u\) and \(v\) for the two vectors.

If \(v = 0,\) it is clear that we have equality, and in this case \(u\) and \(v\) are also trivially linearly dependent. We henceforth assume that \(v\) is non-zero. We also assume that \(\langle u, v \rangle \ne 0\) because otherwise the inequality is obviously true.

Let

\[ z= u-\frac {\langle u, v \rangle} {\langle v, v \rangle} v. \]

From the theory of vectors, \(z \) is the component of \(u\) orthogonal to \(v\), that is, the projection of \(u\) onto the hyperplane orthogonal to \(v\). We verify this by showing that \(z\) is orthogonal to the vector \(v\), or equivalently that \( \langle z , v \rangle = 0 \). By the linearity of the inner product in its first argument, one has

\[ \langle z, v \rangle = \left\langle u -\frac {\langle u, v \rangle} {\langle v, v \rangle} v, v\right\rangle = \langle u, v \rangle - \frac {\langle u, v \rangle} {\langle v, v \rangle} \langle v, v \rangle = 0.\]

Since these vectors are orthogonal, we can apply the Pythagorean theorem to

\[ u =\dfrac {\langle u, v \rangle} {\langle v, v \rangle} v+z,\] which gives

\[ \left\|u\right\|^2 = \left|\frac{\langle u, v \rangle}{\langle v, v \rangle}\right|^2 \left\|v\right\|^2 + \left\|z\right\|^2 = \frac{|\langle u, v \rangle|^2}{\left\|v\right\|^2} + \left\|z\right\|^2 \geq \frac{|\langle u, v \rangle|^2}{\left\|v\right\|^2}. \]

After multiplying by \(\|v\|^2\), we obtain the Cauchy-Schwarz inequality.

If the relation in the above expression is actually an equality, then \(\|z\|^2 = 0\) and hence \(z = 0\). From the definition of \(z\) as the projection vector, this implies that \(u\) is a scalar multiple of \(v\), so the two vectors are linearly dependent. This establishes the theorem. \(_ \square \)
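The projection step of this proof is easy to see concretely; this Python sketch (arbitrary sample vectors, illustrative only) builds \(z\), confirms it is orthogonal to \(v\), and checks the resulting inequality:

```python
# Illustrate the projection step: with z = u - (<u,v>/<v,v>) v,
# the vector z is orthogonal to v, and ||u||^2 >= |<u,v>|^2 / ||v||^2.
def dot(p, q):
    return sum(pi * qi for pi, qi in zip(p, q))

u = [1.0, -2.0, 4.0]
v = [3.0, 0.5, -1.0]

coeff = dot(u, v) / dot(v, v)
z = [ui - coeff * vi for ui, vi in zip(u, v)]

assert abs(dot(z, v)) < 1e-9                       # z is orthogonal to v
assert dot(u, u) >= dot(u, v) ** 2 / dot(v, v) - 1e-9
```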

Another, purely algebraic, approach is given below.

Consider two sequences \(A=\{a_{i}\}\) and \(B=\{b_{j}\}\) of \(n\) terms each, and the \(n \times n\) matrix whose \((i,j)\) entry is \(M_{ij}={a_{i}}^{2}{b_{j}}^{2}\):

\[\begin{pmatrix} {a_{1}}^{2}{b_{1}}^{2} & {a_{1}}^{2}{b_{2}}^{2} & \ldots & {a_{1}}^{2}{b_{n}}^{2}\\ {a_{2}}^{2}{b_{1}}^{2} & {a_{2}}^{2}{b_{2}}^{2} & \ldots & {a_{2}}^{2}{b_{n}}^{2}\\ \vdots & \vdots & \ddots & \vdots\\ {a_{n}}^{2}{b_{1}}^{2} & {a_{n}}^{2}{b_{2}}^{2} & \ldots & {a_{n}}^{2}{b_{n}}^{2} \end{pmatrix}.\]

Summing all the entries of the matrix, by the distributive property, gives

\[ S_{M}=\sum_{i=1}^n \sum_{j=1}^n a_i^2 b_j^2 = \left( \displaystyle \sum_{i=1}^n a_i^2\right) \left( \displaystyle \sum_{j=1}^n b_j^2\right) .\]

On the other hand, expanding the square gives

\[ \left( \displaystyle \sum_{i=1}^n a_ib_i\right)^2 = \sum_{i=1}^n \sum_{j=1}^n (a_i b_i)(a_j b_j).\]

Subtracting the second double sum from the first and pairing the \((i,j)\) and \((j,i)\) terms, we obtain Lagrange's identity:

\[ S_M - \left( \displaystyle \sum_{i=1}^n a_ib_i\right)^2 = \frac{1}{2}\sum_{i=1}^n \sum_{j=1}^n \left(a_i b_j - a_j b_i\right)^2 \geq 0,\]

since the right-hand side is a sum of squares. Therefore

\[ \left( \displaystyle \sum_{i=1}^n a_i^2\right) \left( \displaystyle \sum_{j=1}^n b_j^2\right) \geq \left( \displaystyle \sum_{i=1}^n a_ib_i\right)^2. \ _\square \]
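This sum-over-a-matrix argument can be checked numerically; the following Python sketch (random sequences, illustrative only) compares the total of the matrix entries with the squared sum, using the Lagrange-identity form of the gap:

```python
import random

# The total S_M of the matrix entries a_i^2 * b_j^2 dominates
# (sum a_i b_i)^2, and the gap equals
# (1/2) * sum over i, j of (a_i b_j - a_j b_i)^2  (Lagrange's identity).
random.seed(5)
n = 6
a = [random.uniform(-2, 2) for _ in range(n)]
b = [random.uniform(-2, 2) for _ in range(n)]

S_M = sum(a[i]**2 * b[j]**2 for i in range(n) for j in range(n))
S_D = sum(ai * bi for ai, bi in zip(a, b)) ** 2
gap = 0.5 * sum((a[i]*b[j] - a[j]*b[i])**2
                for i in range(n) for j in range(n))

assert S_M >= S_D - 1e-9
assert abs(S_M - S_D - gap) < 1e-9
```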

The following example is motivated by the vector form of Cauchy-Schwarz.

## What is the maximum of the function \( a \sin \theta + b \cos \theta \)?

By Cauchy-Schwarz,

\[ ( a \sin \theta + b \cos \theta ) ^2 \leq \left( a^2 + b^2 \right) \left( \sin ^2 \theta + \cos ^2 \theta \right) = a^2 + b^2 . \]

Hence, the maximum is \( \sqrt{ a^ 2 + b ^2 } \), which is achieved when \( \tan \theta = \frac{ a}{b} \). \(_ \square \)
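A quick numerical check of this maximum, sketched in Python with the sample values \(a = 3,\ b = 4\) (chosen only for illustration):

```python
import math

# Check that a*sin(t) + b*cos(t) peaks at sqrt(a^2 + b^2),
# attained where tan(t) = a/b (for positive a, b).
a, b = 3.0, 4.0
bound = math.sqrt(a*a + b*b)           # 5.0

# Scan a fine grid over one full period [0, 2*pi).
best = max(a * math.sin(t / 1000) + b * math.cos(t / 1000)
           for t in range(0, 6284))
assert best <= bound + 1e-9
assert best > bound - 1e-4             # the grid gets very close to the peak

t_star = math.atan2(a, b)              # tan(t*) = a/b
assert abs(a * math.sin(t_star) + b * math.cos(t_star) - bound) < 1e-9
```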

## Additional Problems Using Cauchy-Schwarz

In this section we explore further applications of the Cauchy-Schwarz inequality through examples and problems.

## For \(x,y,z\in\mathbb{R^{+}}\), prove that

\[\sqrt{x(3x+y)}+\sqrt{y(3y+z)}+\sqrt{z(3z+x)}\le 2(x+y+z).\]

By Cauchy-Schwarz, we have

\[\begin{align} &\sqrt{x(3x+y)}+\sqrt{y(3y+z)}+\sqrt{z(3z+x)} \\ \leq& \sqrt{\left(\left(\sqrt{x}\right)^{2}+\left(\sqrt{y}\right)^{2}+\left(\sqrt{z}\right)^{2}\right)\left(\left(\sqrt{3x+y}\right)^{2}+\left(\sqrt{3y+z}\right)^{2}+\left(\sqrt{3z+x}\right)^{2}\right)}\\ =& \sqrt{4(x+y+z)^{2}}\\ =&2(x+y+z). \ _\square \end{align}\]

It is also interesting that the triangle inequality in \(n\) dimensions follows from the Cauchy-Schwarz inequality, as the following short argument shows.

For real sequences \( \{ a_i \} \) and \( \{ b_i \} \), show that

\[ \small \sqrt{ a_1 ^2 + a_2 ^2 + \cdots+ a_n^2 } + \sqrt{ b_1 ^2 + b_2^2 + \cdots+ b_n^2 } \geq \sqrt{ ( a_1 + b_1)^2 + ( a_2 + b_2) ^2 + \cdots + ( a_n+ b_n)^2}. \]

Squaring both sides and subtracting common terms, we want to show that

\[ \small 2 \sqrt{ a_1 ^2 + a_2 ^2 + \cdots+ a_n^2 } \times \sqrt{ b_1 ^2 + b_2^2 + \cdots +b_n^2 } \geq 2 \left( a_1 b_1 + a_2 b_2 + \cdots + a_n b_n \right). \]

Since the left side is non-negative, this holds automatically when \( \sum a_i b_i \le 0 \); otherwise, squaring this and dividing by \(4\), we get

\[ \left( a_1 ^2 + a_2 ^2 + \cdots+ a_n^2 \right) \left( b_1 ^2 + b_2^2 + \cdots + b_n^2 \right) \geq \left( a_1 b_1 + a_2 b_2 + \cdots + a_n b_n \right)^2, \]

which is the Cauchy-Schwarz inequality, already proved above. \(_\square\)
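The \(n\)-dimensional triangle inequality itself is easy to spot-check; this Python sketch (random vectors, illustrative only) verifies it for one random pair:

```python
import math
import random

# Spot-check the n-dimensional triangle inequality
# ||a|| + ||b|| >= ||a + b|| on random vectors.
random.seed(6)
n = 7
a = [random.uniform(-4, 4) for _ in range(n)]
b = [random.uniform(-4, 4) for _ in range(n)]

def norm(v):
    return math.sqrt(sum(x * x for x in v))

assert norm(a) + norm(b) >= norm([x + y for x, y in zip(a, b)])
```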


Here are some additional problems you can try to prove on your own:

- Prove that if \( \{a_i\} \) and \( \{b_i\} \) are infinite sequences such that \( \sum a_i ^2 < \infty \) and \( \sum b_i ^2 < \infty \), then \( \sum |a_i b_i | < \infty \).

- [APMO 1991] For positive reals \( \{ a_i \} \) and \( \{ b_i \} \) such that \( a_1 + a_2 + \cdots + a_ n = b_1 + b_2 + \cdots+ b_n \), show that
\[ \frac{ a_1 ^2 } { a_1 + b_1 } + \frac{ a_2 ^2 } { a_2 + b_2 } + \cdots + \frac{ a_n^2 } { a_n + b_n } \geq \frac{ a_ 1 + a_2 + \cdots + a_n } { 2}. \]

- Show that for \( x, y, z > 0 \), we have
\[ x + y + z \leq 2 \left ( \frac{ x^2 } { y+z } + \frac{ y^2 }{ x+z } + \frac{ z^2 } { x+y } \right). \]

- Prove that \[ \left( \sum a_i b_i c_i \right)^2 \leq \left( \sum a_i ^2 \right ) \left( \sum b_i ^2 \right) \left( \sum c_i ^2 \right ). \]



**Cite as:** Cauchy-Schwarz Inequality. *Brilliant.org*. Retrieved from https://brilliant.org/wiki/cauchy-schwarz-inequality/