# Symmetric Polynomials

A symmetric polynomial is a polynomial that remains unchanged when any pair of its variables is switched. For example, \(x^2+y^2+z^2\) is a symmetric polynomial: switching any pair, say \(x\) and \(y\), yields \(y^2+x^2+z^2\), which is the same as the initial polynomial. Symmetric polynomials appear frequently in Vieta's formulas and Newton's identities.


## Definition

The polynomial \(P(x_1,x_2,\dots ,x_n)\) is **symmetric** if for any permutation \(\sigma\) of \(\{1,2,\dots ,n \}\),

\[P(x_1,x_2,\dots ,x_n) \equiv P(x_{\sigma(1)}, x_{\sigma(2)}, \dots , x_{\sigma(n)}).\]

Informally speaking, this means that all the variables play the same role in the polynomial and replacing one by another would have no effect. If \( f(..., x_a, ..., x_b, ... ) \) is a **symmetric polynomial**, then \[ f(..., x_a, ..., x_b, ... ) = f( ..., x_b, ..., x_a, ...) \]

Perhaps the simplest example is \[P(x, y) = P(y, x) = x + y.\]

Or, a slightly more complex example in three variables \[P(x,y, z) = x^n + y^n + z^n + x^{n-2} y z + y^{n-2} x z + z^{n-2} x y + xyz\]

## Elementary Symmetric Polynomials

Just as atoms build up matter, elementary symmetric polynomials are the building blocks of all symmetric polynomials. For \(n\) variables \(\{x_1,x_2,\dots ,x_n\}\), the elementary symmetric polynomials \(\{e_0,e_1,\dots ,e_n\}\) are defined as follows:

\[\begin{aligned} e_0 &= 1 \\ e_1 &= \sum_{1\leq j \leq n} x_j \\ e_2 &= \sum_{1\leq j < k \leq n} x_jx_k \\ e_3 &= \sum_{1\leq j < k < l \leq n} x_jx_kx_l \\ \vdots \\ e_n &= x_1x_2\dots x_n \end{aligned}\]

In general, for \(0\leq m\leq n\)

\[e_m = \sum_{1\leq j_1<j_2<\dots <j_m\leq n} x_{j_1}x_{j_2}\dots x_{j_m}\]

Elementary symmetric polynomials in \(n\) variables, \(\{x_1,x_2,\dots ,x_n\}\), can be generated using the polynomial

\[\prod_{i=1}^{n}(t+x_i)\]

where the \(k^{\text{th}}\) elementary symmetric polynomial, \(e_k\), will be the coefficient of \(t^{n-k}\).
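As a quick illustration, here is a short Python sketch (not part of the original article; the helper name `elementary_sym` is ours) that expands \(\prod_{i=1}^{n}(t+x_i)\) one factor at a time and reads off the coefficients:

```python
def elementary_sym(xs):
    """Coefficients of prod(t + x_i), highest power of t first.
    The coefficient of t^(n-k) is e_k, so the list reads [e_0, e_1, ..., e_n]."""
    coeffs = [1]  # start from the constant polynomial 1
    for x in xs:
        # multiply the current polynomial in t by (t + x)
        coeffs = [a + x * b for a, b in zip(coeffs + [0], [0] + coeffs)]
    return coeffs

print(elementary_sym([2, 3, 5]))  # [1, 10, 31, 30]
```

For \((t+2)(t+3)(t+5) = t^3 + 10t^2 + 31t + 30\), the coefficients are exactly \(e_0=1\), \(e_1=10\), \(e_2=31\), \(e_3=30\).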

## List all elementary symmetric polynomials that consist of \(3\) variables \(x,y,z\).

\[\begin{aligned} e_0 &= 1 \\ e_1 &= x+y+z \\ e_2 &= xy+yz+zx \\ e_3 &= xyz \end{aligned}\]

**Fundamental Theorem of Symmetric Polynomials:** Every symmetric polynomial \(P(x_1,x_2,\dots,x_n)\) can be rewritten in terms of the elementary symmetric polynomials \(e_1,e_2,\dots ,e_n\).

## Express \(x^3yz + xy^3z+xyz^3\) only in terms of the elementary symmetric polynomials.

\[\begin{aligned} x^3yz + xy^3z+xyz^3 &= xyz(x^2+y^2+z^2) \\ &= xyz[(x+y+z)^2-2(xy+yz+zx)] \\ &= e_3((e_1)^2-2e_2) \end{aligned}\]
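This identity can be spot-checked numerically. Here is a minimal Python sketch (the helper `esym` is hypothetical, not from the article):

```python
from itertools import combinations
from math import prod

def esym(k, xs):
    """k-th elementary symmetric polynomial evaluated at xs."""
    return sum(prod(c) for c in combinations(xs, k))

x, y, z = 2, 3, 5
lhs = x**3*y*z + x*y**3*z + x*y*z**3
e1, e2, e3 = esym(1, (x, y, z)), esym(2, (x, y, z)), esym(3, (x, y, z))
print(lhs == e3 * (e1**2 - 2*e2))  # True
```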

## Factorization Of Symmetric Polynomials

When factoring symmetric polynomials, it is useful to apply the Fundamental Theorem of Symmetric Polynomials and rewrite the original polynomial entirely in terms of the elementary symmetric polynomials, which often makes the factorization easier. When dealing with power sums, Newton's identities are helpful: they allow you to express the power sums in terms of the elementary symmetric polynomials.

Factor \( x^3 + y^3 + z^3 - 3xyz \).

Solution: The above polynomial is symmetric because switching any two variables leaves it unchanged.

Thus, we can express it in terms of the elementary symmetric polynomials. In this case, we can use Newton's identities for the power sum portion \( x^3 + y^3 + z^3 \), which can be rewritten as \( e_1 P_2 - e_2 P_1 + 3e_3 \), where \( e_n \) is the \(n^\text{th}\) elementary symmetric polynomial and \( P_n \) is the \(n^\text{th}\) power sum.

This gives us

\[ e_1 P_2 - e_2 P_1 + 3e_3 - 3e_3 \]

\[ = e_1 P_2 - e_2 P_1 \]

Since \( P_1 = e_1 \),

\[ = e_1 P_2 - e_1 e_2 \]

\[ = e_1 (P_2 - e_2 ) \]

\[ = \boxed{(x+y+z)(x^2 + y^2 + z^2 - xy - yz - xz )} \]
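A quick numerical spot-check of this factorization (a Python sketch, not part of the original solution):

```python
import random

# Spot-check x^3 + y^3 + z^3 - 3xyz = (x+y+z)(x^2+y^2+z^2-xy-yz-xz)
# on random integer triples.
for _ in range(100):
    x, y, z = (random.randint(-10, 10) for _ in range(3))
    lhs = x**3 + y**3 + z**3 - 3*x*y*z
    rhs = (x + y + z) * (x*x + y*y + z*z - x*y - y*z - x*z)
    assert lhs == rhs
print("factorization verified")
```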

## Inequalities Of Symmetric Polynomials

Symmetric polynomials are quite common in inequalities. For example, they appear in the AM-GM, AM-HM, and power mean inequalities.


Let \(a_1,a_2,\ldots,a_n\) be \(n\) positive real numbers. Define \(f_{AM}=\dfrac{a_1+a_2+\ldots+a_n}{n}\) as the arithmetic mean, and \(f_{GM}=\sqrt[n]{a_1a_2\ldots a_n}\) as the geometric mean. Then, the **AM-GM inequality** states that \(f_{AM} \geq f_{GM}\).

We have symmetric polynomials on both sides, so we can use elementary symmetric polynomials to restate it as \(\dfrac{e_1}{n}\geq\sqrt[n]{e_n}\).

Now let \(f_{HM}=\dfrac{n}{\frac{1}{a_1}+\frac{1}{a_2}+\cdots+\frac{1}{a_n}}\) be the harmonic mean. Then the **AM-HM inequality** states that \(f_{AM} \geq f_{HM}\). Again, both sides are symmetric, but it's not obvious how to express \(f_{HM}\) using elementary symmetric polynomials. Let's try adding the fractions:

\[f_{HM}=\dfrac{n\;a_1a_2\ldots a_n}{a_2a_3\ldots a_n+a_1a_3\ldots a_n+\cdots+a_1a_2\ldots a_{n-1}}\]

Now we recognize both the numerator and denominator as elementary symmetric polynomials:

\[f_{HM}=\dfrac{n\;e_{n}}{e_{n-1}}\]

And thus, the inequality becomes \(\dfrac{e_1}{n}\geq \dfrac{n\;e_n}{e_{n-1}}\) or \(e_1e_{n-1}\geq n^2 e_n\).

What would we get if we started from the GM-HM inequality? Try it!

\[f_{GM} \geq f_{HM} \implies \sqrt[n]{e_n} \geq \dfrac{n\;e_{n}}{e_{n-1}} \implies e_{n-1}^n \geq n^n e_n ^ {n-1}\]
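Both derived inequalities are easy to spot-check numerically. The following Python sketch (with a hypothetical helper `esym`, not from the article) tests them on random positive reals:

```python
import random
from itertools import combinations
from math import prod

def esym(k, xs):
    """k-th elementary symmetric polynomial evaluated at xs."""
    return sum(prod(c) for c in combinations(xs, k))

n = 5
for _ in range(200):
    a = [random.uniform(0.1, 10) for _ in range(n)]
    e1, en1, en = esym(1, a), esym(n - 1, a), esym(n, a)
    assert e1 * en1 >= n * n * en              # from AM >= HM
    assert en1 ** n >= n ** n * en ** (n - 1)  # from GM >= HM
print("both inequalities hold on random samples")
```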

It would be nicer if we could derive inequalities involving every elementary symmetric polynomial. Again, consider the set of positive real numbers \(A=\{a_1,a_2,\ldots,a_n\}\) and all their elementary symmetric polynomials \(e_k\):

We define the \(k^\text{th}\) **elementary symmetric mean** as \(E_k=\dfrac{e_k}{e_k(1)}=\dfrac{e_k}{\binom{n}{k}}\), where \(e_k(1)\) is the \(k^\text{th}\) elementary symmetric polynomial evaluated at \(n\text{ 1's}\). Do you see why \(e_k(1)=\binom{n}{k}\)? Note that \(e_k(1)\) counts the distinct terms of the elementary symmetric polynomial \(e_k\), which is the number of ways to choose a product of \(k\) variables from the set \(A\). This is just the binomial coefficient \(\binom{n}{k}\).
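The claim \(e_k(1)=\binom{n}{k}\) can also be checked directly; here is a small Python sketch (the helper `esym` is ours, not from the article):

```python
from itertools import combinations
from math import comb, prod

def esym(k, xs):
    """k-th elementary symmetric polynomial evaluated at xs."""
    return sum(prod(c) for c in combinations(xs, k))

n = 6
# Evaluating e_k at n ones counts the terms of e_k, i.e. C(n, k).
assert all(esym(k, [1] * n) == comb(n, k) for k in range(n + 1))
print("e_k(1) == C(n, k) for n =", n)
```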

Let's start with two important theorems:

**Newton's inequality:** \[(E_k)^2 \geq E_{k-1} E_{k+1} \quad \text{for all} \quad 1\leq k \leq n-1,\] with equality if and only if \(a_1=a_2=\ldots=a_n\).

Raise both sides to the \(k^\text{th}\) power: \[(E_k)^{2k} \geq E_{k-1}^k E_{k+1}^k\]

If we take the product from \(k=1\) to \(n-1\) of both sides, we obtain another theorem: \[(E_1)^2 (E_2)^4 (E_3)^6 \cdots (E_{n-1})^{2(n-1)} \geq (E_0 E_2)(E_1^2 E_3^2)(E_2^3 E_4^3)\cdots(E_{n-2}^{n-1} E_{n}^{n-1})\]

Almost everything cancels, and we end up with: \[(E_{n-1})^n \geq (E_n)^{n-1} \implies \sqrt[n-1]{E_{n-1}} \geq \sqrt[n]{E_n}\]

So we introduce the second theorem:

**Maclaurin's inequality:** \[\sqrt[j]{E_j} \geq \sqrt[k]{E_k}\quad\text{ if }\quad j \leq k,\] with equality if and only if \(a_1=a_2=\ldots=a_n\).

This allows us to obtain the following chain of inequalities: \[E_1 \geq \sqrt{E_2} \geq \sqrt[3]{E_3} \geq \ldots \geq \sqrt[n]{E_n}\]

Note that if we take \(E_1 \geq \sqrt[n]{E_n}\) we obtain the AM-GM inequality!
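The whole Maclaurin chain can be spot-checked numerically; a Python sketch (the helper `E` mirrors the elementary symmetric mean defined above, and is not from the article):

```python
import random
from itertools import combinations
from math import comb, prod

def E(k, xs):
    """k-th elementary symmetric mean of xs."""
    n = len(xs)
    return sum(prod(c) for c in combinations(xs, k)) / comb(n, k)

n = 5
xs = [random.uniform(0.1, 10) for _ in range(n)]
chain = [E(k, xs) ** (1 / k) for k in range(1, n + 1)]
# E_1 >= sqrt(E_2) >= cbrt(E_3) >= ... >= nth-root(E_n)
assert all(a >= b for a, b in zip(chain, chain[1:]))
print("Maclaurin chain holds")
```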

Prove that for positive reals \(a_1,a_2,\ldots,a_n\) where \(n>1\), the following inequality holds: \[\sum_{i=1}^{n} a_i^2 \geq \dfrac{2\displaystyle\left(\sum_{1\leq i<j\leq n} a_i a_j\right)}{n-1}\]

We have, by Maclaurin's inequality, \[\begin{align} E_1 &\geq \sqrt{E_2}\\ \dfrac{e_1}{\binom{n}{1}} &\geq \sqrt{\dfrac{e_2}{\binom{n}{2}}}\\ \dfrac{e_1^2}{n^2} &\geq \dfrac{2e_2}{n(n-1)}\\ e_1^2 &\geq \dfrac{2n\; e_2}{n-1} \end{align}\] We also know that \(a_1^2+a_2^2+\ldots+a_n^2=e_1^2-2e_2\), so: \[\begin{align} \sum_{i=1}^{n} a_i^2 + 2e_2 &\geq \dfrac{2n\; e_2}{n-1} \\ \sum_{i=1}^{n} a_i^2 &\geq \dfrac{2e_2}{n-1}.\ _\square \end{align}\]

Notice that when \(n=3\) we obtain the famous inequality \(a^2+b^2+c^2 \geq ab+ac+bc\).
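A quick random spot-check of this \(n=3\) case (a Python sketch, not part of the article):

```python
import random

# Spot-check a^2 + b^2 + c^2 >= ab + bc + ca on random positive reals.
for _ in range(100):
    a, b, c = (random.uniform(0, 10) for _ in range(3))
    assert a*a + b*b + c*c >= a*b + b*c + a*c
print("inequality holds on random samples")
```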


## Applications

**Problem Solving Principle:** If you can switch variables, do so.

An antisymmetric polynomial \(f(x,y)\) is one such that \(f(x,y) = - f(y,x)\).

Show that any antisymmetric polynomial can be expressed as \[f(x, y) = (x-y)g(x,y)\] where \(g\) is symmetric.

The first step is certainly to show that \((x-y)\) divides \(f(x,y)\). This is straightforward from the remainder-factor theorem: we need to show that \(f(x,x) = 0\). To do so, notice that by the definition of antisymmetry \[f(x, x) = -f(x, x) \\ \implies f(x,x) = 0 \]

Hence, \[f(x,y) = (x-y)g(x,y) \\ \implies g(x,y) = \frac{f(x,y)}{x-y}\]

Now, let us use our problem solving principle here.

\[ g(y, x) = \frac{f(y,x)}{y-x} \\ = \frac{-f(x,y)}{-(x-y)} \\ = \frac{f(x,y)}{x-y} = g(x, y) \]

Thus, \(g\) is symmetric.
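As a concrete instance, \(f(x,y)=x^3y-xy^3\) is antisymmetric, and its quotient by \((x-y)\) is the symmetric polynomial \(g(x,y)=xy(x+y)\). A quick Python spot-check (ours, not from the article):

```python
import random

# f(x, y) = x^3*y - x*y^3 satisfies f(y, x) = -f(x, y),
# and f factors as (x - y) * g with g(x, y) = x*y*(x + y) symmetric.
for _ in range(100):
    x, y = random.randint(-9, 9), random.randint(-9, 9)
    f = x**3 * y - x * y**3
    g = x * y * (x + y)
    assert f == (x - y) * g
print("f = (x - y) * g verified")
```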

**Problem Solving Principle:** Express your symmetric polynomials using the elementary symmetric polynomials.

Let \[ x + y = a \\ x^2 + y^2 = b \\ x^3 + y^3 = c \]

Assuming the system has a solution, what is the relationship between \(a\), \(b\) and \(c\)?

Let us use the principle we just stated:

\[ x+ y = e_1 = a \\ x^2 + y^2 = (x+y)^2 - 2xy = e_1^2 - 2 e_2 = b \\ x^3 + y^3 = (x+y)^3 - 3xy(x+y) = e_1^3 - 3 e_1 e_2 = c \]

The rest of the problem is trivial now. Substitute \(a = e_1\) in the second equation to get \[a^2 - 2 e_2 = b \\ \frac{a^2 - b}{2} = e_2 \]

Furthermore, substitute that into the last equation: \[a^3 - 3 a \left ( \frac{a^2 - b}{2} \right ) = c \]
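The derived relation can be verified on a sample pair \((x,y)\); a minimal Python check (not part of the original solution):

```python
# For x = 2, y = 5: a = x + y, b = x^2 + y^2, c = x^3 + y^3.
# Verify 2c = 2a^3 - 3a(a^2 - b), i.e. c = a^3 - 3a(a^2 - b)/2.
x, y = 2, 5
a, b, c = x + y, x*x + y*y, x**3 + y**3
assert 2 * c == 2 * a**3 - 3 * a * (a*a - b)
print(a, b, c)  # 7 29 133
```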

**Problem Solving Principle:** Replace uncomfortable terms with variables in the hope that symmetric polynomials could be formed.

Find \(x\) satisfying \[ \sqrt[3]{10 - x} + \sqrt[3]{x} = 1\]

Clearly, the terms under \(\sqrt[3]{\cdot}\) are uncomfortable to work with. But notice that \[ (10 - x ) + x = 10, \] which is a pretty simple equation.

Let us replace these terms with \(a\) and \(b\) \[ a + b = 1 \\ a^3 + b^3 = 10 \]

Now, let us use the previous problem solving principle \[ a + b = e_1 = 1 \\ a^3 + b^3 = e_1^3 - 3 e_1 e_2 = 10\]

By substitution, it is quite simple to find the value of \(e_2\). And now that we know \(a + b\) and \(ab\), we could use the quadratic formula to obtain them separately. (That is, if they exist, of course).

Then we report the value of \(b^3\) as the answer.
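Carrying the plan out numerically (a Python sketch, not part of the article; the real-cube-root helper `cbrt` is ours): from \(e_1 = 1\) and \(1 - 3e_2 = 10\) we get \(e_2 = ab = -3\), so \(a\) and \(b\) are the roots of \(t^2 - t - 3 = 0\), and both \(a^3\) and \(b^3\) solve the original equation.

```python
from math import sqrt

def cbrt(v):
    """Real cube root (handles negative arguments)."""
    return v ** (1 / 3) if v >= 0 else -((-v) ** (1 / 3))

# a + b = e1 = 1 and a^3 + b^3 = e1^3 - 3*e1*e2 = 10 give e2 = ab = -3,
# so a and b are the roots of t^2 - t - 3 = 0.
a = (1 + sqrt(13)) / 2
b = (1 - sqrt(13)) / 2

# Each root cubed is a solution x of cbrt(10 - x) + cbrt(x) = 1.
for x in (a**3, b**3):
    assert abs(cbrt(10 - x) + cbrt(x) - 1) < 1e-9
print(sorted((a**3, b**3)))
```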

**Cite as:** Symmetric Polynomials. *Brilliant.org*. Retrieved from https://brilliant.org/wiki/symmetric-polynomials-definition/