If the limit of a sequence is 0, does the series converge?
This is part of a series on common misconceptions.
What's the common misconception?
If the terms of a sequence are getting smaller and smaller, is it guaranteed that the sum of all the terms is some finite number? For example, this simple series, whose terms approach \(0\), has a sum that converges to \(2\):
\[1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \cdots = \sum_{n = 0}^{\infty} \frac{1}{2^n} = 2.\]
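One way to see where the value \(2\) comes from is the standard geometric series formula, applied here with ratio \(r = \frac{1}{2}\) (a quick sketch, valid for any \(|r| < 1\)):
\[\sum_{n = 0}^{\infty} r^n = \frac{1}{1 - r}, \qquad \text{so} \qquad \sum_{n = 0}^{\infty} \frac{1}{2^n} = \frac{1}{1 - \frac{1}{2}} = 2.\]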
Can this observation be generalized? To ask this question more formally, let \( a_n \geq 0 \) for \( n \in \{1, 2, 3, \ldots\} \) denote a sequence of nonnegative numbers \(a_1, a_2, a_3, \ldots.\)
Is this true or false?
\[ \lim_{n \rightarrow \infty} a_n = 0 \implies \sum_{n = 1}^{\infty} a_n < \infty \]
Why some people say it's true: When the terms of a sequence that you're adding up get closer and closer to 0, the sum is converging on some specific finite value. Therefore, as long as the terms get small enough, the sum cannot diverge.
Why some people say it's false: A sum does not converge merely because its terms are very small.
The statement that \(\lim\limits_{n \rightarrow \infty} a_n = 0 \implies \sum\limits_{n = 1}^{\infty} a_n < \infty \) is \( \color{red} {\textbf{false}}\).
Counterexample 1:
Consider the sequence \( \left\{\frac{1}{n}\right\}_{n \geq 1} = \left\{1, \frac{1}{2}, \frac{1}{3}, \ldots \right\}.\) Clearly, \( \lim\limits_{n \rightarrow \infty} \frac{1}{n} = 0, \) but it is well known that the harmonic series \(\sum\limits_{n = 1}^{N}\frac{1}{n}\) diverges as \(N \rightarrow \infty;\) a sketch of the standard argument is given below.
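The classical grouping argument: collect the terms into blocks of length \(1, 2, 4, 8, \ldots,\) and note that each block sums to at least \(\frac{1}{2}\):
\[1 + \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + \left(\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8}\right) + \cdots \;\geq\; 1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{2} + \cdots.\]
Since infinitely many blocks each contribute at least \(\frac{1}{2},\) the partial sums grow without bound.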
Counterexample 2:
We can also purposefully construct a series that very clearly will not converge to a finite sum, even though its terms eventually become arbitrarily close to 0. Consider this series:
\[1 + \frac{1}{2} + \frac{1}{2} + \frac{1}{3} + \frac{1}{3} + \frac{1}{3} + \frac{1}{4} + \frac{1}{4} + \frac{1}{4} + \frac{1}{4} + \frac{1}{5} + \cdots.\]
Just by grouping the equal terms, we can tell that this sum will not converge:
\[1 + \left(\frac{1}{2} + \frac{1}{2}\right) + \left(\frac{1}{3} + \frac{1}{3} + \frac{1}{3}\right) + \left(\frac{1}{4} + \frac{1}{4} + \frac{1}{4} + \frac{1}{4}\right) + \left(\frac{1}{5} + \cdots\right)+\cdots = 1 + 1 + 1 + 1 + 1 +\cdots = \infty.\]
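To make this precise: the \(k\)th group consists of \(k\) copies of \(\frac{1}{k},\) so
\[\underbrace{\frac{1}{k} + \frac{1}{k} + \cdots + \frac{1}{k}}_{k \text{ copies}} = 1 \quad \text{for every } k,\]
and the partial sums pass \(1, 2, 3, \ldots,\) exceeding every bound. At the same time, the individual terms do tend to \(0,\) since each value \(\frac{1}{k}\) appears only \(k\) times before the terms drop to \(\frac{1}{k+1}.\)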
Rebuttal:
One of the first theorems about series we learn is that \(\lim\limits_{n \rightarrow \infty} a_n \neq 0 \implies \sum\limits_{n = 1}^{N} a_n \text{ diverges as } N \rightarrow \infty .\) Therefore, if the limit of \(a_n\) is 0, then the sum should converge.
Reply:
Yes, one of the first things you learn about infinite series is that if the terms of the series are not approaching 0, then the series cannot possibly converge. This is true. However, the opposite claim is not true: as proven above, even if the terms of the series are approaching 0, that does not guarantee that the sum converges.
There is also a correct way to 'reverse' the statement in your claim, but this is the contrapositive: a syntactic reversal that produces a second statement logically equivalent to the first. The statement "if the terms of the series are not approaching 0, then the series cannot possibly be converging" is logically equivalent to the claim that "if a series converges, then it is guaranteed that the terms in the series approach 0." More formally,
\[\sum\limits _{ n=1 }^{ N } a_{ n } \text{ converges as } N \rightarrow \infty \implies \lim\limits _{ n\rightarrow \infty } a_{ n }=0.\]
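In symbols, the equivalence being used is the contrapositive: for any statements \(P\) and \(Q,\)
\[ (P \implies Q) \iff (\lnot Q \implies \lnot P), \]
here with \(P\) standing for "the series converges" and \(Q\) for "\(\lim\limits_{n \rightarrow \infty} a_n = 0.\)" The misconception instead asserts the converse \(Q \implies P,\) which is not logically equivalent and, as the counterexamples above show, is false.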
If your teacher said that the 'reverse' of the original statement is also true, this kind of reversal is likely what he or she meant.