Absolute convergence of a series – Serlo


In this article, you may learn about absolute convergence, which is a stronger property than convergence alone. Absolute convergence guarantees that a series keeps the same limit under re-arrangement of its elements. So an absolutely convergent series has a kind of unique limit, independent of the order of summation. The concept also turns out to be useful for defining integrals ("continuous sums") with unique values.


Does the value of a sum depend on the order of its elements? For finite sums, the answer is no, as you may have already learned in school. For instance,

$1 + 2 + 3 = 6$

is the same as

$3 + 2 + 1 = 6.$
This is a consequence of the "commutativity of the sum": there is $a + b = b + a$ for all $a, b \in \mathbb{R}$. Finitely many commutations of neighbouring elements will not change the value of the sum. However, for infinitely many commutations, it is not clear whether the value of the sum remains invariant. So may the value of the sum depend on the order, here? The answer turns out to be yes! A series like

$\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k} = 1 - \frac{1}{2} + \frac{1}{3} - \frac{1}{4} \pm \ldots$

might have a different value than its re-arranged version

$1 + \frac{1}{3} - \frac{1}{2} + \frac{1}{5} + \frac{1}{7} - \frac{1}{4} \pm \ldots$
It might even happen that a convergent series becomes divergent under re-arrangement and vice versa. Therefore, it is useful to have a criterion for when the value of a series does not depend on the order of its elements. And indeed, such a criterion exists:

The limit of a series is invariant under re-arrangement of the elements if and only if the series is absolutely convergent.

The idea is: there are series which contain both positive elements adding up to $+\infty$ and negative elements adding up to $-\infty$. The expression $\infty - \infty$ is not defined, and the limiting behaviour of such a series may depend on how its elements are combined. For instance, consider the series

$\sum_{k=0}^{\infty} (-1)^k = 1 - 1 + 1 - 1 \pm \ldots$

The partial sums of this series jump between 1 and 0, so they do not converge, but stay bounded. However, one may re-arrange the elements such that two positive elements are always followed by one negative element:

$1 + 1 - 1 + 1 + 1 - 1 + \ldots$

Since $1 + 1 - 1 = 1$, this series behaves like $1 + 1 + 1 + \ldots = \infty$. Equivalently, by taking "more negative than positive elements", we can reach $-\infty$.
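This behaviour can be checked numerically. The following Python sketch (the helper `partial_sums` is our own, not part of the article) compares the partial sums of the original ordering with those of the re-arranged one:

```python
def partial_sums(terms):
    """Return the list of partial sums of a finite list of terms."""
    sums, total = [], 0
    for t in terms:
        total += t
        sums.append(total)
    return sums

# Original ordering 1 - 1 + 1 - 1 +- ...: partial sums jump between 1 and 0.
original = [(-1) ** k for k in range(12)]
# Re-arranged ordering 1 + 1 - 1 + 1 + 1 - 1 + ...: each block of three
# elements adds up to 1, so the partial sums grow beyond every bound.
rearranged = [1, 1, -1] * 4

print(partial_sums(original))    # only the values 1 and 0 appear
print(partial_sums(rearranged))  # every block of three raises the sum by 1
```

The same elements appear in both lists; only their order differs, yet one sequence of partial sums stays bounded while the other diverges.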

How can we avoid these ambiguities? A good ansatz is to require that the sum of the absolute values of all positive elements and of all negative elements is finite. Then we do not run into the problem of an ill-defined limit $\infty - \infty$. Having a bounded sum of absolute values is exactly what characterizes an absolutely convergent series.


Definition (absolute convergence)

A series $\sum_{k=1}^{\infty} a_k$ converges absolutely if and only if $\sum_{k=1}^{\infty} |a_k|$ converges.

So a series is absolutely convergent if and only if the series of the absolute values of its elements converges. This forbids the case where the positive elements sum up to $+\infty$ or the negative elements sum up to $-\infty$.


If $a_k \geq 0$ for all $k \in \mathbb{N}$, then we have $|a_k| = a_k$. In that case, absolute convergence and convergence are identical.

Absolutely convergent series converge

Absolute convergence is a particularly strong form of convergence: it implies convergence, but not vice versa. The first part of this statement can be formulated as a mathematical theorem:

Theorem (Absolute convergence implies convergence.)

Every absolutely convergent series converges.

Proof (Absolute convergence implies convergence.)

Let $\sum_{k=1}^{\infty} a_k$ be an absolutely convergent series. That means, $\sum_{k=1}^{\infty} |a_k|$ has to converge. Let us consider the sequence of partial sums $S_n = \sum_{k=1}^{n} a_k$. We would like to show that this is a Cauchy sequence. So let $\varepsilon > 0$ be given. The sequence $\widetilde{S}_n = \sum_{k=1}^{n} |a_k|$ converges, so it is a Cauchy sequence (as every convergent sequence is Cauchy). Consequently, there is an $N \in \mathbb{N}$ such that $|\widetilde{S}_n - \widetilde{S}_m| < \varepsilon$ for all $n, m \geq N$. We consider the summands $a_k$ with $m < k \leq n$ and assume without loss of generality (w.l.o.g.) that $n > m$.

The partial sums of $\sum_{k=1}^{\infty} a_k$ can be bounded from above by this expression: for $n > m \geq N$, there is

$|S_n - S_m| = \left| \sum_{k=m+1}^{n} a_k \right| \leq \sum_{k=m+1}^{n} |a_k| = |\widetilde{S}_n - \widetilde{S}_m| < \varepsilon.$

So $(S_n)_{n \in \mathbb{N}}$ is a (real-valued) Cauchy sequence as well. Now, within the real numbers, all Cauchy sequences converge, so $\sum_{k=1}^{\infty} a_k$ converges, as well.

Alternative proof (proof by the Cauchy criterion)

Using the Cauchy criterion for series, we can get an even faster proof. Let $\sum_{k=1}^{\infty} a_k$ be an absolutely convergent series, i.e. $\sum_{k=1}^{\infty} |a_k|$ converges. The Cauchy criterion is equivalent to convergence, so $\sum_{k=1}^{\infty} |a_k|$ must satisfy the Cauchy criterion

$\forall \varepsilon > 0 \ \exists N \in \mathbb{N} \ \forall n > m \geq N: \ \sum_{k=m+1}^{n} |a_k| < \varepsilon.$

Now, the triangle inequality implies $\left| \sum_{k=m+1}^{n} a_k \right| \leq \sum_{k=m+1}^{n} |a_k|$. So for $n > m \geq N$, we also have

$\left| \sum_{k=m+1}^{n} a_k \right| \leq \sum_{k=m+1}^{n} |a_k| < \varepsilon,$

which means that $\sum_{k=1}^{\infty} a_k$ also satisfies the Cauchy criterion and hence converges. This is what we wanted to show.


The proof above also implies a bound for absolutely convergent series: the series of absolute values provides an upper bound for the absolute value of the original series,

$\left| \sum_{k=1}^{\infty} a_k \right| \leq \sum_{k=1}^{\infty} |a_k|.$
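As a numerical illustration (our own sketch, using the sample series $a_k = (-1)^{k+1}/k^2$, which is not from this article), both sides of this inequality can be compared for a large partial sum:

```python
# Compare |sum a_k| with sum |a_k| for the sample series a_k = (-1)**(k+1) / k**2.
# The inequality |sum a_k| <= sum |a_k| holds for every partial sum,
# and hence also in the limit.
N = 10_000
terms = [(-1) ** (k + 1) / k ** 2 for k in range(1, N + 1)]

lhs = abs(sum(terms))             # absolute value of the partial sum
rhs = sum(abs(t) for t in terms)  # partial sum of the absolute values

print(lhs, rhs)
assert lhs <= rhs
```

Here the left-hand side approximates $\pi^2/12$ and the right-hand side $\pi^2/6$, so the bound is far from tight; it only guarantees an upper bound.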

Not all convergent series converge absolutely

Every absolutely convergent series converges. However, the converse does not hold true (otherwise, one would not need the notion of an absolutely convergent series). An example of a series which converges, but does not converge absolutely, is the alternating harmonic series $\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k}$. Using the alternating series test, one can prove its convergence. However, the series of absolute values is just the harmonic series $\sum_{k=1}^{\infty} \frac{1}{k}$, which is known to diverge. We conclude:

Every absolutely convergent series converges, but there are convergent series, which do not converge absolutely.
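The contrast can be observed numerically. In the sketch below (our own; that the alternating harmonic series has limit $\ln 2$ is a classical fact, not proven here), the alternating partial sums stabilize while the partial sums of absolute values keep growing:

```python
import math

N = 100_000
# Partial sum of the alternating harmonic series: converges (to ln 2).
alternating = sum((-1) ** (k + 1) / k for k in range(1, N + 1))
# Partial sum of the harmonic series: grows like log(N) and diverges.
absolute = sum(1 / k for k in range(1, N + 1))

print(alternating, math.log(2))  # close to each other
print(absolute)                  # keeps growing as N increases
```

Increasing `N` further moves `alternating` ever closer to $\ln 2$, while `absolute` grows beyond every bound.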

Criteria for absolute convergence

Not every convergent series converges absolutely. Are there conditions under which a convergent series is known to converge absolutely? Indeed, there are! In order to find such a condition, we may ask: how can a series that does not converge absolutely still converge? We can decompose such a series into its positive and negative parts:

$a_k^+ = \max\{a_k, 0\} \quad \text{and} \quad a_k^- = \max\{-a_k, 0\},$

so that $a_k = a_k^+ - a_k^-$ and $|a_k| = a_k^+ + a_k^-$.
For instance, for the alternating harmonic series $a_k = \frac{(-1)^{k+1}}{k}$, there is

$(a_k^+)_{k \in \mathbb{N}} = 1, 0, \tfrac{1}{3}, 0, \tfrac{1}{5}, \ldots \quad \text{and} \quad (a_k^-)_{k \in \mathbb{N}} = 0, \tfrac{1}{2}, 0, \tfrac{1}{4}, \ldots$
We may think of $\sum_{k=1}^{\infty} a_k^+$ as the "budget of positive values" and of $\sum_{k=1}^{\infty} a_k^-$ as the "budget of negative values" of the series $\sum_{k=1}^{\infty} a_k$. Absolute convergence means that the combined budget $\sum_{k=1}^{\infty} (a_k^+ + a_k^-) = \sum_{k=1}^{\infty} |a_k|$ is finite. For absolutely divergent series, the combined budget must be infinite. Now, a convergent but not absolutely convergent series combines an infinite budget into a finite limit, by cancelling out infinities within the budget, like $\infty - \infty$. We can prevent this by requiring that the "budgets" $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ of the series are finite:

Theorem (criterion for absolute convergence)

The series $\sum_{k=1}^{\infty} a_k$ converges absolutely, if and only if the series $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ converge. In that case, $\sum_{k=1}^{\infty} |a_k| = \sum_{k=1}^{\infty} a_k^+ + \sum_{k=1}^{\infty} a_k^-$.

Proof (criterion for absolute convergence)

Proof step: If $\sum_{k=1}^{\infty} a_k$ converges absolutely, then $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ also converge absolutely.

Let $\sum_{k=1}^{\infty} a_k$ be absolutely convergent, so $\sum_{k=1}^{\infty} |a_k|$ converges. There is $0 \leq a_k^+ \leq |a_k|$ and $0 \leq a_k^- \leq |a_k|$. By means of the direct comparison test, the "smaller" series $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ also have to converge absolutely, so they converge in the usual sense.

Proof step: If $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ converge, then $\sum_{k=1}^{\infty} a_k$ converges absolutely.

Let $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ be convergent. There is $|a_k| = a_k^+ + a_k^-$, so by the computation rules for convergent series, there is $\sum_{k=1}^{\infty} |a_k| = \sum_{k=1}^{\infty} a_k^+ + \sum_{k=1}^{\infty} a_k^-$, which converges.
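The identities $a_k = a_k^+ - a_k^-$ and $|a_k| = a_k^+ + a_k^-$ used in this proof can be checked with a small sketch (the helper names `pos_part` and `neg_part` are our own):

```python
def pos_part(x):
    """Positive part x^+ = max(x, 0)."""
    return max(x, 0)

def neg_part(x):
    """Negative part x^- = max(-x, 0); note it is non-negative."""
    return max(-x, 0)

# Check the identities a = a^+ - a^- and |a| = a^+ + a^- on sample values.
for a in [3.5, -2.0, 0.0, -0.25, 7]:
    assert pos_part(a) - neg_part(a) == a
    assert pos_part(a) + neg_part(a) == abs(a)
print("identities hold")
```

Since both parts are non-negative, at most one of them is non-zero for each element.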

Example (absolutely convergent series)

The series $\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k^2}$ converges absolutely: we can take the absolute values of the series elements and get the "absolute series" $\sum_{k=1}^{\infty} \frac{1}{k^2}$. This series converges, as shown in the article Bounded series and convergence. Its limit is $\frac{\pi^2}{6}$.

Now, the criterion above implies that also $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ must converge. Conversely, if one can show that $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ converge, then the criterion implies that $\sum_{k=1}^{\infty} a_k$ must converge absolutely.
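As a numerical check (our own sketch; the limit values $\pi^2/8$ and $\pi^2/24$ for the reciprocal squares of the odd and even numbers are classical facts, not derived in this article), both budget series of this example indeed stabilize:

```python
# Budget series for a_k = (-1)**(k+1) / k**2: both converge, as the
# criterion predicts for an absolutely convergent series.
N = 10_000
a = [(-1) ** (k + 1) / k ** 2 for k in range(1, N + 1)]

budget_pos = sum(max(t, 0) for t in a)   # sum of a_k^+, approaches pi^2/8
budget_neg = sum(max(-t, 0) for t in a)  # sum of a_k^-, approaches pi^2/24

print(budget_pos, budget_neg)
# Their sum approximates sum |a_k| = pi^2/6,
# their difference approximates sum a_k = pi^2/12.
```

Both budgets are finite, so no cancellation of infinities is involved in the limit of this series.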

Question: Do the "budget series" $\sum_{k=1}^{\infty} a_k^+$ and $\sum_{k=1}^{\infty} a_k^-$ always converge whenever $\sum_{k=1}^{\infty} a_k$ converges in the original sense?

No, because the "budgets" might cancel as $\infty - \infty$. An example is the alternating harmonic series $\sum_{k=1}^{\infty} \frac{(-1)^{k+1}}{k}$. This series is convergent. However, the "budget series" $\sum_{k=1}^{\infty} a_k^+ = 1 + \tfrac{1}{3} + \tfrac{1}{5} + \ldots$ and $\sum_{k=1}^{\infty} a_k^- = \tfrac{1}{2} + \tfrac{1}{4} + \tfrac{1}{6} + \ldots$ both diverge, as together their elements make up the divergent harmonic series.
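A short numerical sketch (the helper `budgets` is our own) makes this divergence visible: both budgets of the alternating harmonic series grow roughly like $\tfrac{1}{2}\ln N$ and never settle down:

```python
# Budget series of the alternating harmonic series a_k = (-1)**(k+1) / k.
# The positive budget collects the odd reciprocals, the negative budget
# the even ones; each behaves like (1/2) * log(N) and diverges.
def budgets(N):
    pos = sum(1 / k for k in range(1, N + 1, 2))  # a_k^+: odd denominators
    neg = sum(1 / k for k in range(2, N + 1, 2))  # a_k^-: even denominators
    return pos, neg

for N in (100, 10_000, 1_000_000):
    print(N, budgets(N))  # both components keep growing with N
```

The difference of the two budgets stays close to the finite limit of the series, while each budget on its own grows without bound.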

Re-arrangement of series

In some cases, it might be convenient to re-arrange elements within a series. The important question is now: does the limit stay invariant under any re-arrangement? Mathematicians divide convergent series into two classes, depending on what the answer to that question is:

unconditional convergence
A series converges unconditionally, if it has the same limit as any series obtained by re-arranging its elements.
conditional convergence
A series converges conditionally, if there are re-arrangements which have a different limit than the original series.

For real-valued series, a series converges unconditionally, if and only if it converges absolutely. The proof will be given in the next article, Rearrangement theorem for series. The above notion of conditional and unconditional convergence becomes important for series that are not real-valued: then, absolute convergence might not be equivalent to unconditional convergence, and one needs two different notions to distinguish both situations.
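The conditional convergence of the alternating harmonic series can be seen numerically. The following sketch (our own; that this particular re-arrangement converges to $\tfrac{1}{2}\ln 2$ instead of $\ln 2$ is a classical result, not derived here) takes one positive element followed by two negative ones:

```python
import math

# Re-arrange the alternating harmonic series: one positive element
# (1, 1/3, 1/5, ...) followed by two negative ones (-1/2, -1/4, then
# -1/6, -1/8, ...). A classical computation shows this re-arrangement
# converges to (1/2) * ln 2 instead of ln 2.
def rearranged_sum(blocks):
    total = 0.0
    pos, neg = 1, 2  # next odd and next even denominator to use
    for _ in range(blocks):
        total += 1 / pos
        pos += 2
        total -= 1 / neg + 1 / (neg + 2)
        neg += 4
    return total

print(rearranged_sum(100_000), 0.5 * math.log(2))  # nearly equal
```

Since the limit changes under this re-arrangement, the alternating harmonic series converges only conditionally, consistent with the fact that it does not converge absolutely.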