# Definite Integrals of Polynomials

We will prove:

Let $$P(x)$$ be a polynomial of degree $$n$$, i.e.:

$$P(x) = \displaystyle \sum_{i=0}^{n} b_ix^i, \quad b_n \neq 0$$

where $$\{b_i\}$$ is a set of fixed real coefficients and all roots of $$P(x)$$ are real and distinct. Let $$S$$ be the set of all pairs $$(\alpha, \beta)$$ with $$\alpha \neq \beta$$ and $$\alpha, \beta \in \mathbb{R}$$ such that:

$$\displaystyle \int_{\alpha}^{\beta} P(x) \,dx = 0$$

Then $$|S| \geq \frac{n(n+1)}{2}$$ for all such $$P(x)$$. Note that, in counting the cardinality of $$S$$, the pairs are unordered: $$(\alpha, \beta) = (\beta, \alpha)$$.

Before we start, I should say that finding this inequality wasn't particularly difficult, but finding the equality case is proving to be a struggle (at least for me; I have no doubt that some Brilliant users can do it easily). If anyone can do it, please post your proof in the comment section. I would really appreciate it, because it's kind of bugging me.

Proof:

Define all variables as above. Before we prove the inequality, we must first establish:

$$\displaystyle \int P(x) \,dx = \displaystyle \sum_{i=0}^n \frac{b_i x^{i+1}}{i+1} +C$$

Subproof:

Using induction, we first establish the base case $$n=1$$:

$$\Rightarrow$$ $$P(x) = \displaystyle \sum_{i=0}^{1} b_ix^i =b_0 +b_1x$$

$$\Rightarrow$$ $$\displaystyle \int P(x)\,dx = \displaystyle \int b_0 +b_1x \,dx = b_0x+\frac{b_1x^2}{2} +C = \displaystyle \sum_{i=0}^{1} \frac{b_i x^{i+1}}{i+1} +C$$

The base case clearly holds. Now for the inductive step, assume the statement holds for some $$n=k$$:

We have:

$$\displaystyle \int \displaystyle \sum_{i=0}^k b_ix^i \,dx = \displaystyle \sum_{i=0}^k \frac{b_i x^{i+1}}{i+1} +C$$

$$\displaystyle \int b_{k+1}x^{k+1} \,dx + \displaystyle \int \displaystyle \sum_{i=0}^k b_ix^i \,dx = \displaystyle \int b_{k+1}x^{k+1} \,dx + \displaystyle \sum_{i=0}^k \frac{b_i x^{i+1}}{i+1} +C$$

(adding $$\displaystyle \int b_{k+1}x^{k+1} \,dx$$ to both sides of the inductive hypothesis)

$$\Rightarrow$$ $$\displaystyle \int \displaystyle \sum_{i=0}^{k+1} b_ix^i \,dx = \displaystyle \sum_{i=0}^k \frac{b_i x^{i+1}}{i+1} +C + \frac{b_{k+1}x^{k+2}}{k+2}$$

$$\Rightarrow$$ $$\displaystyle \int \displaystyle \sum_{i=0}^{k+1} b_ix^i \,dx = \displaystyle \sum_{i=0}^{k+1} \frac{b_i x^{i+1}}{i+1} +C$$

So the statement is true for $$n=1$$, and if it is true for some $$n=k$$, then it is true for $$n=k+1$$. The result follows by induction.
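The antiderivative formula just proved can be sanity-checked mechanically. Below is a minimal Python sketch (my own illustration, not part of the original note, using exact `Fraction` arithmetic to avoid rounding) that computes the coefficients of $$\int P(x)\,dx$$ with $$C=0$$ and confirms that differentiating them recovers $$P(x)$$:

```python
from fractions import Fraction

def antiderivative(b):
    """Coefficients of the antiderivative of sum(b[i] * x**i), taking C = 0."""
    # each b_i moves to degree i+1 and is divided by i+1
    return [Fraction(0)] + [bi / (i + 1) for i, bi in enumerate(b)]

def derivative(c):
    """Coefficients of the derivative of sum(c[i] * x**i)."""
    return [i * ci for i, ci in enumerate(c)][1:]

# P(x) = 3 - 2x + 5x^2  (arbitrary example coefficients)
b = [Fraction(3), Fraction(-2), Fraction(5)]
g = antiderivative(b)        # 3x - x^2 + (5/3)x^3
assert derivative(g) == b    # differentiating the result recovers P exactly
```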

Now, let:

$$g(x) = \displaystyle \int P(x) \,dx$$

Then, by the fundamental theorem of calculus, we have:

$$\frac{dg(x)}{dx} = P(x)$$

If we set $$\frac{dg(x)}{dx} = 0$$, it is clear that the extrema of $$g(x)$$ occur exactly at the roots of $$P(x)$$. Furthermore, as we just proved above, $$g(x)$$ is a polynomial of degree $$n+1$$, because:

$$g(x) = \displaystyle \sum_{i=0}^n \frac{b_i x^{i+1}}{i+1} +C$$

We must now assume that all roots of $$g(x)$$ are real and distinct. This is a strong assumption that eliminates many polynomials $$P(x)$$, on top of our earlier assumption that all roots of $$P(x)$$ are real and distinct.

Then $$g(x)$$ has $$n$$ extrema and $$n+1$$ distinct roots. Choose $$\alpha \in \mathbb{R}$$ such that $$g(\alpha) = 0$$. Then there exist $$n$$ other values $$\beta \neq \alpha$$ such that $$g(\alpha) = g(\beta) = 0$$.

Again, by the fundamental theorem of calculus:

$$g(\beta) - g(\alpha) = \displaystyle \int_{\alpha}^{\beta} P(x)\,dx$$

Then there exist at least $$\dbinom{n+1}{2}$$ unordered pairs $$(\alpha, \beta)$$ such that $$\displaystyle \int_{\alpha}^{\beta} P(x)\,dx = 0$$.

If we define the set of all such pairs as $$S$$ and note that $$\dbinom{n+1}{2} = \frac{n(n+1)}{2}$$, we arrive at:

$$|S| \geq \frac{n(n+1)}{2}$$

This completes the proof.

QED
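As a concrete sanity check of the count, here is a short Python sketch (my own example, with $$n = 2$$): taking $$g(x) = x^3 - x$$, so that $$P(x) = g'(x) = 3x^2 - 1$$, both polynomials have distinct real roots, and each of the $$\dbinom{3}{2} = 3$$ unordered pairs of roots of $$g$$ yields a vanishing integral:

```python
from itertools import combinations

def P(x):
    return 3 * x**2 - 1   # P(x) = g'(x), distinct real roots at ±1/sqrt(3)

def g(x):
    return x**3 - x       # antiderivative of P with C = 0; roots -1, 0, 1

roots_of_g = [-1.0, 0.0, 1.0]
pairs = list(combinations(roots_of_g, 2))   # unordered pairs (alpha, beta)

n = 2
assert len(pairs) == n * (n + 1) // 2       # exactly 3 pairs here
for a, b in pairs:
    # by the fundamental theorem of calculus, the integral equals g(b) - g(a)
    assert abs(g(b) - g(a)) < 1e-12
```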

The reason equality does not follow from the above is that there may exist infinitely many such pairs. To illustrate this, consider $$P(x) = x$$. This is a polynomial of degree $$1$$ and hence has $$n=1$$ root. Now consider any $$d \in \mathbb{R}$$. We have:

$$\displaystyle \int_{-d}^d x\,dx = \frac{1}{2}\left(d^2-(-d)^2\right) = 0$$ for all pairs $$(d,-d)$$.

Hence there exist infinitely many such pairs, and it follows that:

$$|S| = +\infty > \frac{n(n+1)}{2} = 1$$
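This symmetry is easy to verify numerically. A quick Python sketch (the sample values of $$d$$ are chosen arbitrarily):

```python
def g(x):
    return x * x / 2   # antiderivative of P(x) = x, with C = 0

# by the fundamental theorem of calculus, the integral of x from -d to d
# equals g(d) - g(-d), which vanishes for every symmetric pair (d, -d)
for d in [0.5, 1.0, 2.0, 3.7, 100.0]:
    assert g(d) - g(-d) == 0.0
```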

This proof has been edited from its original version, which included complex roots. This inequality may or may not hold for complex roots.

Note by Ethan Robinett
3 years, 10 months ago


How do you evaluate $$\displaystyle \int_1^i x^2 dx$$?

If this can't work in any fashion, the inequality won't hold. Good idea though.

- 3 years, 10 months ago

Disregard my previous comment; I think I see what you're saying now. Integrals with complex bounds are actually contour integrals, if I'm not mistaken, hence the path needs to be specified. If that's correct, the above proof can luckily be modified fairly easily by restricting $$P(x)$$ and $$g(x)$$ to those polynomials whose roots are real. This is pretty limiting, unfortunately, but it's all we can do with this, unless you know of some other way.

- 3 years, 10 months ago

Yeah, I'm not sure either. I'm not too good with complex analysis so I don't know exactly the mechanics of contour integrals.

- 3 years, 10 months ago

I don't see how that integral implies anything about the inequality. If you're trying to say that you can't evaluate a definite integral with imaginary roots as bounds, it shouldn't matter, because in the above we are considering $$g(x)$$ to be what it is: a polynomial with complex-valued roots. In short, find the primitive of $$P(x)$$ and let the arbitrary constant equal zero; then, instead of integrating $$P(x)$$, you may use the fundamental theorem of calculus and evaluate $$g(\alpha) - g(\beta)$$, which will always equal zero when $$\alpha$$ and $$\beta$$ are roots of $$g(x)$$. Hence you don't even need to evaluate integrals like the one you mentioned. Also, $$1$$ and $$i$$ are not roots of $$\displaystyle \int x^2 \,dx$$, so the proof wouldn't apply to this integral, though I believe you're just using that integral as an example. I could be wrong, though; this is just my logic.

- 3 years, 10 months ago