I’m in the mood to do something a little more ‘math-y’! A few weeks ago, Tyler at PowerUp did a nice post about the divergence of the harmonic series, and that got me thinking about the weirdness of infinite series. Since I’ve been working on a book chapter on infinite series anyway, as a part of my upcoming ‘math methods’ textbook, I thought I’d talk a little about infinite series and some unusual results associated with them!
So what is an infinite series? Let us first imagine an infinite ordered collection of numbers $a_n$, where the index $n$ represents the location in the sequence. Such a collection is called an infinite sequence; an example is the sequence

$a_n = \frac{1}{2^n}, \quad n = 0, 1, 2, 3, \ldots, \quad \text{i.e.} \quad 1, \frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \ldots$
An infinite series is the sum of the terms of an infinite sequence; for the sequence above, the infinite series is written as

$S = 1 + \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = \sum_{n=0}^{\infty} \frac{1}{2^n}.$
The end of this equation is the mathematical shorthand for the summing of an infinite series: we ‘sum’ ($\Sigma$) from $n = 0$ to infinity ($\infty$) terms of the form $1/2^n$.
Such infinite series appear in mathematical physics all the time; the series above, with terms $1/2^n$, appeared very early in history in one of Zeno’s paradoxes! The so-called ‘dichotomy paradox‘ was summarized by Aristotle as follows:
Zeno’s arguments about motion, which cause so much disquietude to those who try to solve the problems that they present, are four in number. The first asserts the non-existence of motion on the ground that that which is in locomotion must arrive at the half-way stage before it arrives at the goal. This we have discussed above.
In other words, in order for a person to walk 2 miles, they must first walk 1 mile. Before they walk 1 mile, however, they must walk 1/2 mile. Before they walk 1/2 mile, they must walk 1/4 mile. Extending this argument ad infinitum, Zeno argued that motion was apparently impossible, because to travel a finite distance involved an infinite number of intermediate events:
The total distance traveled can be added up, working backwards: 1 mile + 1/2 mile + 1/4 mile + 1/8 mile + 1/16 mile, etc.!
Zeno considered the dichotomy argument a paradox because it seemed irreconcilable that traveling a finite distance (2 miles) could be broken into an infinite number of ‘events’ (crossing the 1 mile mark, 1/2 mark, 1/4 mark, and so forth).
The mathematics of infinite series, though not necessarily completely resolving the paradox, demonstrates that it is mathematically consistent to consider the path as a succession of progressively smaller ‘half-steps’. In fact, we can exactly determine the numerical value of the sum of a series of the form

$S = 1 + r + r^2 + r^3 + r^4 + \cdots = \sum_{n=0}^{\infty} r^n.$
A series of this form, where $r$ is a number whose absolute value is less than one, is known as a geometric series. We can determine the sum of this series using the following clever trick:
- Define a finite geometric series as
$S_N = 1 + r + r^2 + \cdots + r^N.$
- Note that $S_N$ may be written as
$S_N = 1 + r\left(1 + r + \cdots + r^{N-1}\right) = 1 + r S_{N-1}.$
- Note that $S_N$ may also be written as
$S_N = S_{N-1} + r^N.$
- Equate these two versions of $S_N$:
$1 + r S_{N-1} = S_{N-1} + r^N.$
- Solve for $S_{N-1}$:
$S_{N-1} = \frac{1 - r^N}{1 - r}.$
- In the limit $N \to \infty$, the term $r^N$ goes to zero provided $|r| < 1$, and $S_{N-1}$ becomes the sum $S$ of the infinite geometric series:
$S = \frac{1}{1 - r}.$
For the specific case of Zeno’s paradox, where $r = 1/2$, we find that the sum of the series is $S = \frac{1}{1 - 1/2} = 2$, which is exactly the total distance traversed in Zeno’s argument.
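If you want to check this result yourself, here is a quick numerical sanity check in Python (the function name is my own invention): it compares partial sums of the geometric series against the formula $1/(1-r)$ for a few values of $r$.

```python
# A numerical check of the geometric series formula S = 1/(1 - r).
# For |r| < 1, the partial sums should approach 1/(1 - r).

def geometric_partial_sum(r, n_terms):
    """Sum 1 + r + r^2 + ... + r^(n_terms - 1) directly."""
    total = 0.0
    for n in range(n_terms):
        total += r ** n
    return total

for r in [0.5, -0.3, 0.9]:
    approx = geometric_partial_sum(r, 200)
    exact = 1.0 / (1.0 - r)
    print(f"r = {r:5.2f}: partial sum = {approx:.10f}, 1/(1-r) = {exact:.10f}")
```

For $r = 1/2$, 200 terms already agree with the exact value 2 to machine precision, since $r^N$ shrinks exponentially fast.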
The next natural question to ask is: what types of infinite series sum to a finite value, i.e. “converge”? In other words, if I am given an infinite series, how do I determine if it sums to a finite value? Clearly a series of the form

$1 + 2 + 3 + 4 + 5 + \cdots = \sum_{n=1}^{\infty} n$
sums to an infinite value, i.e. “diverges”, simply because every term in the series is bigger than the previous one! In fact, one can show that the sum of the first $N$ terms of the series can be written as:

$S_N = \frac{N(N+1)}{2}.$
You can check this yourself for the first few terms of the finite series. As we let $N \to \infty$, we find that the sum must tend to infinity.
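A few lines of Python (a throwaway sketch of my own) do exactly that check, comparing the direct sum against the closed form for several values of $N$:

```python
# Check the closed form for the partial sums of 1 + 2 + 3 + ...:
# the first N terms sum to N(N+1)/2, which clearly grows without bound.

for N in [1, 2, 3, 10, 100]:
    direct = sum(range(1, N + 1))       # add the terms one by one
    closed_form = N * (N + 1) // 2      # the formula S_N = N(N+1)/2
    assert direct == closed_form
    print(f"N = {N:3d}: sum = {direct}")
```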
It is obvious, then, that for a series to be finite, the terms of the sum must grow smaller in size as we get further along in the summation; that is, the quantity $a_n \to 0$ as $n \to \infty$. This condition is necessary for the series to converge, but is it sufficient?
Here we come to the example of the harmonic series, which Tyler discussed. The harmonic series is of the form

$\sum_{n=1}^{\infty} \frac{1}{n} = 1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \frac{1}{5} + \cdots$
Each new term we add to the series is smaller than the previous one. As $n \to \infty$, $1/n \to 0$, so it would seem natural to assume that this series must converge to a finite value. However, it does not! To see this, we group the terms of the series in the following suggestive way:

$1 + \frac{1}{2} + \left(\frac{1}{3} + \frac{1}{4}\right) + \left(\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8}\right) + \left(\frac{1}{9} + \cdots + \frac{1}{16}\right) + \cdots$
The first parenthetical grouping is two terms, the next four terms, the next eight terms, and so on. Looking at the first grouping, we note that $\frac{1}{3} > \frac{1}{4}$. This means that

$\frac{1}{3} + \frac{1}{4} > \frac{1}{4} + \frac{1}{4} = \frac{1}{2}.$
Similarly, $\frac{1}{5} > \frac{1}{8}$, as are $\frac{1}{6}$ and $\frac{1}{7}$. This means that

$\frac{1}{5} + \frac{1}{6} + \frac{1}{7} + \frac{1}{8} > \frac{1}{8} + \frac{1}{8} + \frac{1}{8} + \frac{1}{8} = \frac{1}{2}.$
The same argument can be made for each grouping. Therefore, the sum of each group of terms is greater than 1/2; since the series is infinite, there are an infinite number of groups of this form, and therefore the sum of the series is infinite!
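It is instructive to watch this divergence numerically. The little Python sketch below (the helper name is mine) shows the partial sums creeping upward at a logarithmic pace, roughly like $\ln N$, which is why the divergence is so easy to miss on a computer:

```python
import math

# Partial sums of the harmonic series 1 + 1/2 + 1/3 + ...
# They grow without bound, but only logarithmically (roughly like ln N).

def harmonic_partial_sum(N):
    return sum(1.0 / n for n in range(1, N + 1))

for N in [10, 100, 1000, 10000]:
    print(f"N = {N:5d}: sum = {harmonic_partial_sum(N):.4f}, ln N = {math.log(N):.4f}")
```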
For a series to converge, it is not sufficient for the terms of the series simply to go to zero; the terms must go to zero at a very rapid rate. The geometric series described above has terms which go to zero sufficiently rapidly, while the harmonic series does not.
Now let’s get even weirder, and consider a ‘cousin’ of the harmonic series, the alternating harmonic series, defined as

$\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \frac{1}{6} + \cdots$
This series differs from the harmonic series in that the terms alternate in sign. At first, one might think that this series doesn’t converge, but one can show, theoretically and computationally, that the series, summed as shown, approaches the value $\ln 2 \approx 0.6931$, where $\ln$ represents the natural logarithm.
This series converges because the terms of opposing sign partially cancel each other out. This partial cancellation is enough to make the alternating harmonic series converge.
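A short Python computation (again, the names are my own) shows the partial sums closing in on $\ln 2$:

```python
import math

# Partial sums of the alternating harmonic series 1 - 1/2 + 1/3 - 1/4 + ...
# summed in the 'normal' order; they approach ln 2.

def alt_harmonic_partial_sum(N):
    return sum((-1) ** (n + 1) / n for n in range(1, N + 1))

for N in [10, 100, 10000, 1000000]:
    print(f"N = {N:7d}: sum = {alt_harmonic_partial_sum(N):.6f}")
print(f"ln 2 = {math.log(2):.6f}")
```

Note that the convergence is quite slow: the error after $N$ terms shrinks only like $1/N$, a hint that the series is converging by cancellation rather than by its terms dying off quickly.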
Here comes the weird part! We know from elementary arithmetic that addition is an order-independent operation; that is, when we add a finite set of numbers together, say $a$, $b$, and $c$, we get the same answer regardless of which numbers we add first:

$a + b + c = a + c + b = c + b + a,$
and so forth. Rearranging the order of summation does not change the value of the sum.
Is this also true if we rearrange the order of an infinite series such as the alternating harmonic series? We consider the following different addition schemes:
- 1:1 — the ‘normal’ addition, we add one positive term, then one negative term:
$1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} + \frac{1}{5} - \frac{1}{6} + \cdots$
- 1:2 — we add one positive term, then two negative terms:
$1 - \frac{1}{2} - \frac{1}{4} + \frac{1}{3} - \frac{1}{6} - \frac{1}{8} + \cdots$
- 2:1 — we add two positive terms, then one negative term:
$1 + \frac{1}{3} - \frac{1}{2} + \frac{1}{5} + \frac{1}{7} - \frac{1}{4} + \cdots$
- 4:1 — we add four positive terms, then one negative term:
$1 + \frac{1}{3} + \frac{1}{5} + \frac{1}{7} - \frac{1}{2} + \frac{1}{9} + \frac{1}{11} + \frac{1}{13} + \frac{1}{15} - \frac{1}{4} + \cdots$
Though we are performing an infinite rearrangement of the terms of the series, one might expect that the sum must be the same regardless of the order we take them, right?
Wrong! Let us numerically sum the series as described above and plot the partial sums; the horizontal axis roughly represents the number of terms of the series we have summed so far:

[Plot: partial sums of the four rearrangements of the alternating harmonic series, each leveling off at a different value.]
Rather remarkably, the series converges to different values depending on how we sum it! In fact, one can prove rigorously that, by making an infinite rearrangement of the terms of the alternating harmonic series, one can get it to sum to any value you desire, including infinity!
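We can run this numerical experiment ourselves. The Python sketch below (the function name is mine) sums $p$ positive terms, then $q$ negative terms, over and over:

```python
import math

# Numerically sum rearrangements of the alternating harmonic series:
# take p positive terms (1, 1/3, 1/5, ...), then q negative terms
# (-1/2, -1/4, -1/6, ...), and repeat. Different (p, q) give different limits!

def rearranged_sum(p, q, n_blocks):
    total = 0.0
    pos = 1  # next odd denominator (positive terms)
    neg = 2  # next even denominator (negative terms)
    for _ in range(n_blocks):
        for _ in range(p):
            total += 1.0 / pos
            pos += 2
        for _ in range(q):
            total -= 1.0 / neg
            neg += 2
    return total

for p, q in [(1, 1), (1, 2), (2, 1), (4, 1)]:
    print(f"{p} positive : {q} negative -> {rearranged_sum(p, q, 100000):.4f}")
```

In fact, a classical result states that the $p$-positive, $q$-negative rearrangement converges to $\ln 2 + \frac{1}{2}\ln(p/q)$, which matches these numerical sums.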
The infinite case is easy to show; simply add together all the positive terms before adding any negative terms! As we add the positive terms together, we have a series of the form,

$1 + \frac{1}{3} + \frac{1}{5} + \frac{1}{7} + \frac{1}{9} + \cdots$
But consider the following: $\frac{1}{2n-1} > \frac{1}{2n}$ for every value of $n$. This means that the sum satisfies the relation,

$1 + \frac{1}{3} + \frac{1}{5} + \frac{1}{7} + \cdots > \frac{1}{2} + \frac{1}{4} + \frac{1}{6} + \frac{1}{8} + \cdots = \frac{1}{2}\left(1 + \frac{1}{2} + \frac{1}{3} + \frac{1}{4} + \cdots\right).$
Formally, this series is half the sum of the harmonic series, which we know to be infinite. ‘Half of infinity’ is still infinite, so summing all positive terms first results in an infinite sum!
To try and understand the peculiar behavior of the alternating harmonic series, we try rearranging the order of another series, the ‘alternating quadratic’ series:

$\sum_{n=1}^{\infty} \frac{(-1)^{n+1}}{n^2} = 1 - \frac{1}{4} + \frac{1}{9} - \frac{1}{16} + \frac{1}{25} - \cdots$
It is still an alternating series, but the terms go to zero ‘faster’ than the alternating harmonic series. The corresponding computation results in a plot of the form:

[Plot: partial sums of the rearranged alternating quadratic series; every ordering approaches the same limit.]
No matter how we sum the series, it always eventually approaches a limit of approximately 0.8!
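The same kind of rearrangement experiment, sketched below in Python (names mine), makes the point numerically; the exact limit is known to be $\pi^2/12 \approx 0.8225$, and every ordering lands on it:

```python
import math

# Rearrangements of the alternating quadratic series 1 - 1/4 + 1/9 - 1/16 + ...:
# take p positive terms, then q negative terms, and repeat.
# Every ordering converges to the same limit, pi^2/12.

def rearranged_quadratic_sum(p, q, n_blocks):
    total = 0.0
    pos = 1  # next odd base (positive terms 1/pos^2)
    neg = 2  # next even base (negative terms -1/neg^2)
    for _ in range(n_blocks):
        for _ in range(p):
            total += 1.0 / pos ** 2
            pos += 2
        for _ in range(q):
            total -= 1.0 / neg ** 2
            neg += 2
    return total

for p, q in [(1, 1), (1, 2), (2, 1), (4, 1)]:
    print(f"{p} positive : {q} negative -> {rearranged_quadratic_sum(p, q, 100000):.6f}")
print(f"pi^2/12 = {math.pi ** 2 / 12:.6f}")
```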
Evidently there are two distinct ways in which a series can converge to a finite sum. Some series, like the alternating quadratic series, have terms which go rapidly to zero. In essence, the sum of the series is almost entirely determined by the low-$n$ terms of the series, and the higher terms contribute almost nothing. Other series, like the alternating harmonic series, only converge because adjacent members of the series partially cancel each other out. In essence, every term of the series contributes significantly to the total sum. When we rearrange the series, for instance by using more positive terms than negative terms, we are ‘favoring’ the positive terms: because we are summing infinitely, the negative terms never get a chance to ‘catch up’!
A series which sums like the alternating harmonic series is called conditionally convergent: it only sums on the ‘condition’ that the alternating terms cancel, and if we took the sum of the absolute values of the terms, i.e.

$\sum_{n=1}^{\infty} \left| \frac{(-1)^{n+1}}{n} \right| = \sum_{n=1}^{\infty} \frac{1}{n},$

we would find the series diverges (it would in fact just be the harmonic series). A series which sums like the alternating quadratic series is called absolutely convergent, and it converges even if we sum the absolute value of each of the terms, i.e. $\sum_{n=1}^{\infty} \frac{1}{n^2}$, though it doesn’t necessarily converge to the same value. In fact, the definition of an absolutely convergent series $\sum_n a_n$ is that the sum of the absolute values of the terms is finite, i.e.

$\sum_{n=1}^{\infty} |a_n| < \infty.$
Most problems in mathematical physics involve absolutely convergent series. Such series are convenient because they behave like ‘real’ numbers: we can add them together in any order we want, and the product of two absolutely convergent series is simply the product of the sums of the individual series.
It would be nice to say that conditionally convergent series never make an appearance in mathematical physics, but it isn’t true: such series appear in a number of physically relevant mathematical problems. It is just another example of the reality that we must always be a little careful when working with infinity in mathematics.
And it shows that infinite series are kinda weird!