# Taylor Series

This article uses summation notation.

A **Taylor series** is a polynomial of infinite degree that can be used to represent many different functions, particularly functions that aren't polynomials. Taylor series have applications ranging from classical and modern physics to the computations your handheld calculator makes when evaluating trigonometric expressions.

Taylor series is both useful...

$\int_{0}^{x}\frac{\sin t}{t}dt = x - \frac{x^{3}}{3\cdot3!} + \frac{x^{5}}{5\cdot5!} - \frac{x^{7}}{7\cdot7!} + \cdots = \sum_{n=0}^{\infty}(-1)^{n}\frac{x^{2n + 1}}{{(2n+1)}\cdot {(2n+1)}!}$

Here, a Taylor series is being used to evaluate an integral that cannot be computed using known methods.

...and beautiful:

$\sum_{n = 0}^{\infty} (-1)^{n}\frac{4}{2n + 1} = 4 - \frac{4}{3} + \frac{4}{5} - \frac{4}{7} + \cdots = \pi$

Here, elegant use of a Taylor series gives us the exact value of $\pi$.


## Introduction

Let $f(x)$ be a real-valued function that is infinitely differentiable at $x = x_0$. The **Taylor series** expansion of $f(x)$ centered at the point $x = x_0$ is given by

$\sum_{n=0}^{\infty}f^{(n)}(x_0)\frac{(x - x_0)^{n}}{n!},$

where $f^{(n)}(x_0)$ denotes the $n^\text{th}$ derivative of $f(x)$ evaluated at $x = x_0$.
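To see the formula in action numerically, here is a minimal Python sketch (not part of the original article; the function name is ours) that evaluates a truncated Taylor series from a list of derivative values:

```python
import math

def taylor_partial_sum(derivs_at_x0, x0, x):
    """Evaluate sum of f^(n)(x0) * (x - x0)^n / n! over the supplied derivatives."""
    return sum(d * (x - x0) ** n / math.factorial(n)
               for n, d in enumerate(derivs_at_x0))

# Every derivative of e^x at x0 = 0 equals 1, so this approximates e^1 = e.
print(taylor_partial_sum([1] * 15, 0.0, 1.0))
print(math.e)
```

With fifteen terms the truncated series already matches $e$ to better than nine decimal places.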

It is not immediately obvious how this definition constructs a polynomial of infinite degree equal to the original function $f(x)$. Perhaps we can gain some understanding by writing out the first several terms of the Taylor series for $f(x) = \cos x$ centered at $x = 0$. Note that there is nothing special about the choice $x = 0$ other than its computational convenience; any other center is allowed, and the best choice depends on the application.

We will now use the definition above to construct a polynomial representation of $\cos x$.

Because the formula for the Taylor series given in the definition above contains $f^{(n)}(x_0)$, we should build a list containing the values of $f(x)$ and its first four derivatives at $x = 0:$

$\begin{array}{rll} f(0) &= \cos 0 &= \color{#EC7300}1\\ f'(0) &= -\sin 0 &= \color{#3D99F6}0\\ f''(0) &= -\cos 0 &= \color{#EC7300}{-1}\\ f'''(0) &= \sin 0 &= \color{#3D99F6}0\\ f^{(4)}(0)&= \cos 0 &= {\color{#EC7300}{1}}. \end{array}$

We begin assembling the Taylor series by writing $f(x) =$ [the first number in our list] $\cdot \frac{(x - x_0)^0}{0!}$ like so:

$f(x) = {\color{#EC7300}1\cdot \displaystyle\frac{(x - 0)^0}{0!} = 1}.$

So far, our constructed function $f(x) = 1$ looks nothing like $f(x) = \cos x$; the two merely agree in the value $f(0) = 1$. Let us add more terms. We take the next number from our list, this time multiplied by $\frac{(x - x_0)^{1}}{1!}:$

$f(x) = {\color{#EC7300}1\cdot \displaystyle\frac{(x - 0)^0}{0!} + \boxed{\color{#3D99F6}0\cdot \displaystyle\frac{(x - 0)^1}{1!}} = 1}.$

Notice the exponent on $(x - 0)$ and the argument inside the factorial are both 1 this time, rather than 0 as they were in the previous term. This is because the summation dictates that we increment $n$ from 0 to 1. This process will continue by adding the next term from our list above, but again incrementing the power on $(x - 0)$ and the value inside the factorial:

$f(x) = {\color{#EC7300}1\cdot \displaystyle\frac{(x - 0)^0}{0!} + \color{#3D99F6}0\cdot \displaystyle\frac{(x - 0)^1}{1!} + \boxed{\color{#EC7300}({-1})\cdot \displaystyle\frac{(x - 0)^2}{2!}} = 1 - \displaystyle\frac{x^2}{2!}}.$

Let's stop and look at what we have so far. After three terms, our Taylor series has given us $f(x) = 1 - \frac{x^2}{2}$.

Interestingly enough, if we continue taking numbers from our list while appending incremented powers of $(x - 0)$ and incremented factorials, then our Taylor series slowly but surely conforms to the cosine curve:

$f(x) = {\color{#EC7300}1\cdot \displaystyle\frac{(x - 0)^0}{0!} + \color{#3D99F6}0\cdot \displaystyle\frac{(x - 0)^1}{1!} + \color{#EC7300}({-1})\cdot \displaystyle\frac{(x - 0)^2}{2!} + \color{#3D99F6}0\cdot \displaystyle\frac{(x - 0)^3}{3!} + \color{#EC7300}1\cdot \displaystyle\frac{(x - 0)^4}{4!} = 1 - \displaystyle\frac{x^2}{2!} + \displaystyle\frac{x^4}{4!}}.$

At this point, we can guess at the emerging pattern. The powers on $x$ are even, the arguments of the factorials in the denominators are even, and the terms alternate in sign. (More derivatives of the original function may be needed to discover a pattern, but four sufficed for this example.) We encode this pattern into a summation, which finally yields our Taylor series for $\cos x:$

$\cos x = \sum_{n=0}^{\infty}(-1)^{n}\frac{x^{2n}}{(2n)!}.$
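We can sanity-check this series numerically. The Python sketch below (the function name is ours) compares partial sums of the series against the standard library cosine:

```python
import math

def cos_taylor(x, terms):
    """Partial sum of the Maclaurin series for cos x: sum of (-1)^n x^(2n)/(2n)!."""
    return sum((-1) ** n * x ** (2 * n) / math.factorial(2 * n)
               for n in range(terms))

# More terms give a better fit, just as the construction above suggests.
for terms in (1, 2, 3, 6):
    print(terms, cos_taylor(1.0, terms))
print("math.cos:", math.cos(1.0))
```

With six terms the partial sum already agrees with `math.cos(1.0)` to many decimal places.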

In the animation below, each frame appends one more term to the previous frame's Taylor series. As more terms are added, the Taylor series fits the cosine function it approximates more and more closely.

Important note: Because this series expansion was centered at $x = 0$, this is also known as a **Maclaurin series**. A Maclaurin series is simply a Taylor series centered at $x = 0$.

So how does this work exactly? What is the intuition for this formula? Let's solidify our understanding of the Taylor series with a slightly more abstract demonstration. For the purposes of this next example, let $T(x)$ represent the Taylor series expansion of $f(x)$.

$\begin{aligned} T(x) &= \sum_{n=0}^{\infty}f^{(n)}(x_0)\frac{(x - x_0)^{n}}{n!} \\ &= f(x_0) + f'(x_0)(x - x_0) + f''(x_0)\frac{(x-x_0)^2}{2} + f'''(x_0)\frac{(x-x_0)^3}{6} + \cdots \end{aligned}$

It is important to note that the value of this summation at $x = x_0$ is simply $f(x_0)$, because all terms after the first will contain a 0 in their product. This means the value of the power series agrees with the value of the function at $x_0$ $\big($or that $T(x_0) = f(x_0)\big).$ Surely this is what we'd want from a series that purports to agree with the function! After all, if our claim is that the Taylor series $T(x)$ equals the function $f(x)$, then it should agree in value at $x = x_0$. Granted, there are an uncountable number of other functions that share the same value at $x_0$, so this equivalence is nothing special so far. Let's investigate by taking the derivative of the terms in the power series we have listed:

$T'(x) = 0 + f'(x_0) + f''(x_0)(x-x_0) + f'''(x_0)\frac{(x-x_0)^2}{2} + f^{(4)}(x_0)\frac{(x-x_0)^3}{3!}+ \cdots.$

If we evaluate the differentiated series at $x = x_0$, then all terms after $f'(x_0)$ vanish (again due to containing 0 in their product), leaving us with only $f'(x_0)$. So, in addition to $T(x_0) = f(x_0)$, we also have $T'(x_0) = f'(x_0)$, meaning the Taylor series and the function it represents agree in the value of their derivatives at $x_0$. One can repeatedly differentiate $T(x)$ and $f(x)$ and find that this pattern continues: the next derivative $T''(x)$ takes on the value $f''(x_0)$, the derivative after that $T'''(x)$ takes on the value $f'''(x_0)$, and so on, all at $x = x_0$.

This is a promising result! If we can ensure that the $n^\text{th}$ derivative of $T(x)$ agrees with the $n^\text{th}$ derivative of $f(x)$ at $x = x_0$ for all values of $n$, then we can expect the behavior of the Taylor series and $f(x)$ to be identical near $x = x_0$.

Now, there are rare, pathological exceptions to this conclusion: a function can be infinitely differentiable at $x_0$ and yet fail to equal its Taylor series anywhere except at $x_0$ itself. (The classic example is $f(x) = e^{-1/x^2}$, extended by $f(0) = 0$, whose Maclaurin series is identically zero.) Functions that do equal their Taylor series on a neighborhood of $x_0$ are called **analytic** at $x_0$.

There are already dozens of known Taylor series. Some of them are easy to derive on your own (and you should!) while others are far too complicated for the scope of this wiki:

$\begin{aligned} \cos x &= \sum_{n=0}^{\infty}(-1)^{n}\frac{x^{2n}}{(2n)!} & \sin x &= \sum_{n=0}^{\infty}(-1)^{n}\frac{x^{2n+1}}{(2n+1)!} & \tan^{-1} x &= \sum_{n=0}^{\infty}(-1)^{n}\frac{x^{2n+1}}{(2n+1)} \\ e^{x} &= \sum_{n=0}^{\infty}\frac{x^{n}}{n!} & \ln(1 + x) &= \sum_{n = 1}^{\infty}(-1)^{n+1}\frac{x^n}{n} & \frac{1}{1 - x} &= \sum_{n = 0}^{\infty}x^n. \end{aligned}$
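Any of these expansions can be spot-checked numerically. In the Python sketch below, the helper name `series_sum` is ours; it sums the first several terms of a series and lets us compare against the standard library:

```python
import math

def series_sum(term, n_terms):
    """Sum the first n_terms values of term(n) for n = 0, 1, 2, ...."""
    return sum(term(n) for n in range(n_terms))

x = 0.5
exp_approx = series_sum(lambda n: x ** n / math.factorial(n), 20)
atan_approx = series_sum(lambda n: (-1) ** n * x ** (2 * n + 1) / (2 * n + 1), 200)
geom_approx = series_sum(lambda n: x ** n, 60)

print(exp_approx, math.exp(x))
print(atan_approx, math.atan(x))
print(geom_approx, 1 / (1 - x))
```

Note that the $\tan^{-1}$ and geometric series require many more terms than $e^x$ for comparable accuracy: their terms shrink only geometrically, while factorials in the denominator shrink much faster.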

## Interval and Radius of Convergence

Main Article: Interval and Radius of Convergence

The **interval of convergence** for a Taylor series $\displaystyle\sum_{n = 0}^{\infty}a_{n}(x - x_{0})^{n}$ is the set of values of $x$ for which the series converges.

Examine the geometric power series $\frac{1}{1 - x} = 1 + x + x^2 + x^3 + x^4 +\cdots = \displaystyle\sum_{n = 0}^{\infty}x^{n}$. Recall that an infinite geometric progression

$S = a + a \cdot r + a \cdot r^2 + a \cdot r^{3} + \cdots$

converges to

$S = \frac{a}{1-r}$

precisely when $|r| < 1$. With $a = 1$ and $r = x$, the power series above converges to $\frac{1}{1-x}$ exactly when $|x| < 1$, so its interval of convergence is $(-1, 1)$ and its radius of convergence is 1.
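A quick numerical sketch (the helper name is ours) illustrates why the restriction on the ratio matters: partial sums settle down inside the interval of convergence and grow without bound outside it.

```python
def geometric_partial_sum(x, n_terms):
    # S_N = 1 + x + x^2 + ... + x^(N-1)
    return sum(x ** n for n in range(n_terms))

# Inside the interval of convergence, the partial sums approach 1/(1 - x)...
print(geometric_partial_sum(0.5, 50), 1 / (1 - 0.5))

# ...while outside it, they grow without bound.
print(geometric_partial_sum(2.0, 20))
print(geometric_partial_sum(2.0, 40))
```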

## Taylor Polynomial Derivation

Suppose we want to interpolate an infinite number of points on the Cartesian plane using a continuous and differentiable function $f$. How can this be done?

Given $n$ points on the Cartesian plane, the set of points can be interpolated using a polynomial of degree at most $n-1$. Given an infinite number of points to interpolate, we need an infinite polynomial

$f(x) = {a}_{0} + {a}_{1}(x-{x}_{0}) + {a}_{2}{(x-{x}_{0})}^{2} +\cdots,$

where $\left|x-{x}_{0}\right|$ is within the radius of convergence.

Differentiating the polynomial term by term and evaluating at $x = x_0$, we observe the following:

$\begin{aligned} f({x}_{0}) &= {a}_{0}\\ f'({x}_{0}) &= {a}_{1}\\ f''({x}_{0}) &= 2{a}_{2}\\ f'''({x}_{0}) &= 6{a}_{3}\\ {f}^{(4)}({x}_{0}) &= 24{a}_{4}\\ {f}^{(n)}({x}_{0}) &= n!{a}_{n}. \end{aligned}$

Solving each equation for its coefficient $a_n = \frac{f^{(n)}(x_0)}{n!}$ expands the original function into the infinite polynomial

$f(x) = \sum_{n=0}^{\infty}\frac{f^{(n)}(x_0)}{n!}(x - x_0)^{n}.$
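The observation $f^{(n)}(x_0) = n!\,a_n$ can be checked mechanically for a polynomial written in powers of $(x - x_0)$. In the Python sketch below, the sample coefficients are arbitrary:

```python
import math

def deriv_coeffs(coeffs):
    """Coefficients of the derivative: d/dx of sum a_k*u^k is sum k*a_k*u^(k-1),
    where u = x - x0."""
    return [k * a for k, a in enumerate(coeffs)][1:]

a = [3.0, 2.0, 5.0, -1.0]  # arbitrary sample coefficients a_0..a_3
c = a
for n in range(len(a)):
    # evaluating the nth derivative at x = x0 picks out the constant term,
    # which should equal n! * a_n
    print(n, c[0], math.factorial(n) * a[n])
    c = deriv_coeffs(c)
```

Each printed pair matches, confirming that the $n^\text{th}$ derivative at the center is $n!$ times the $n^\text{th}$ coefficient.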

## Using Taylor Series in Approximations

Main Article: Taylor Series Approximation

Imagine that you have been taken prisoner and placed in a dark cell. Your captors say that you can earn your freedom, but only if you can produce an approximate value of $\sqrt[3]{8.1}$. Worse than that, your approximation has to be correct to five decimal places! Even without a calculator in your cell, you can use the first few terms of the Taylor series for $\sqrt[3]{x}$ about the point $x = 8$ as a tool for making a quick and decent approximation.

We certainly won't be able to compute an infinite number of terms in a Taylor series expansion for a function. However, as more terms are calculated in the Taylor series expansion of a function, the approximation of that function is improved.

Using the first three terms of the Taylor series expansion of $f(x) = \sqrt[3]{x}$ centered at $x = 8$, approximate $\sqrt[3]{8.1}.$

We have $f(8) = \sqrt[3]{8} = 2$. Since $f'(x) = \frac{1}{3}x^{-2/3}$ and $f''(x) = -\frac{2}{9}x^{-5/3}$, we get $f'(8) = \frac{1}{12}$ and $f''(8) = -\frac{1}{144}$, so

$f(x) = \sqrt[3]{x} \approx 2 + \frac{(x - 8)}{12} - \frac{(x - 8)^2}{288} .$

The first three terms shown will be sufficient to provide a good approximation for $\sqrt[3]{x}$. Evaluating this sum at $x = 8.1$ gives an approximation for $\sqrt[3]{8.1}:$

$\begin{aligned} f(8.1) = \sqrt[3]{8.1} &\approx 2 + \frac{(8.1 - 8)}{12} - \frac{(8.1 - 8)^2}{288} \\ &=\color{#3D99F6}{2.008298}\color{#D61F06}{61111}\ldots \\ \\ \sqrt[3]{8.1} &= \color{#3D99F6}{2.008298}{\color{#D61F06}{85025}\dots}. \end{aligned}$

With just three terms, the formula above was able to approximate $\sqrt[3]{8.1}$ to six decimal places of accuracy. $_\square$
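As a quick check, the arithmetic above can be reproduced in a couple of lines of Python:

```python
# Three-term Taylor approximation of the cube root, centered at x = 8
approx = 2 + (8.1 - 8) / 12 - (8.1 - 8) ** 2 / 288
exact = 8.1 ** (1 / 3)

print(approx)               # ~2.0082986
print(exact)                # ~2.0082989
print(abs(approx - exact))  # error on the order of 1e-7
```

The error is a few parts in $10^7$, consistent with the six decimal places of agreement shown above.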