Many problems in real analysis lectures (and in later applications) require investigating whether a certain series converges or diverges. This page offers a collection of methods for tackling such problems. We present strategies that experienced mathematicians use to successfully prove or disprove convergence. These strategies are then applied to some practical examples.
Methods for investigating convergence
Using the term test
A series $\sum_{k=1}^\infty a_k$ can never converge if the corresponding sequence $(a_k)_{k\in\mathbb{N}}$ does not converge to 0. It therefore makes sense to first try to find the limit of $(a_k)_{k\in\mathbb{N}}$. If this limit does not exist or is not 0, you instantly know that $\sum_{k=1}^\infty a_k$ diverges.
If you find out that $(a_k)_{k\in\mathbb{N}}$ converges to 0, the next question is: how fast does it converge to 0? If it goes to 0 more slowly than $\frac{1}{k}$ (harmonic series), you can expect divergence. The harmonic series is the "fastest decaying series that still diverges". By contrast, if $(a_k)_{k\in\mathbb{N}}$ decays faster than $q^k$ for some $0<q<1$ (geometric series), you know that it converges. The geometric series is "one of the slowest decaying series that still converges". It is a useful idea to compare $\sum_{k=1}^\infty a_k$ to a harmonic or geometric series in order to disprove or prove its convergence.
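For instance (the three series below are sample choices for illustration), the term test settles the first one immediately, and the speed heuristic correctly predicts the behaviour of the other two:

$$\lim_{k\to\infty}\frac{2k+1}{k+3}=2\neq 0\;\Rightarrow\;\sum_{k=1}^\infty\frac{2k+1}{k+3}\text{ diverges},$$

$$\frac{1}{\sqrt{k}}\geq\frac{1}{k}\;\Rightarrow\;\sum_{k=1}^\infty\frac{1}{\sqrt{k}}\text{ diverges},\qquad\frac{1}{3^k}=\left(\tfrac{1}{3}\right)^k\;\Rightarrow\;\sum_{k=1}^\infty\frac{1}{3^k}\text{ converges}.$$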
Using the ratio test
This test is virtually a comparison to a geometric series. It is often useful for series $\sum_{k=1}^\infty a_k$ whose elements $a_k$ are given in quotient form. If $\limsup_{k\to\infty}\left|\frac{a_{k+1}}{a_k}\right|<1$, then the series converges absolutely by the ratio test, since it is bounded by some geometric series $\sum_{k=1}^\infty C q^k$ (with some constant $C>0$). In that case, any $q$ with $\limsup_{k\to\infty}\left|\frac{a_{k+1}}{a_k}\right|<q<1$ serves for such a bound. However, for $\liminf_{k\to\infty}\left|\frac{a_{k+1}}{a_k}\right|>1$, the series diverges. In case $\lim_{k\to\infty}\left|\frac{a_{k+1}}{a_k}\right|=1$, we cannot say anything about the convergence and have to use a different test.
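For instance, applying the ratio test to the (illustrative) series $\sum_{k=1}^\infty \frac{2^k}{k!}$ gives

$$\left|\frac{a_{k+1}}{a_k}\right|=\frac{2^{k+1}}{(k+1)!}\cdot\frac{k!}{2^k}=\frac{2}{k+1}\;\xrightarrow{k\to\infty}\;0<1,$$

so this series converges absolutely.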
Using the root test
This test is also effectively a comparison to a geometric series. It is particularly useful for power series like $\sum_{k=1}^\infty c_k x^k$ or for series of powers like $\sum_{k=1}^\infty (b_k)^k$. Absolute convergence holds for $\limsup_{k\to\infty}\sqrt[k]{|c_k x^k|}<1$ or $\limsup_{k\to\infty}\sqrt[k]{|(b_k)^k|}=\limsup_{k\to\infty}|b_k|<1$. However, for $\limsup_{k\to\infty}\sqrt[k]{|c_k x^k|}>1$ or $\limsup_{k\to\infty}|b_k|>1$ we have divergence. If the sequence $\left(\sqrt[k]{|c_k x^k|}\right)_{k\in\mathbb{N}}$ or $\left(|b_k|\right)_{k\in\mathbb{N}}$ converges, we can replace the $\limsup$ with a $\lim$. If the $\limsup$ equals $1$, we again cannot make any conclusions and need a different test.
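As a (sample) power series illustration, consider $\sum_{k=1}^\infty k\,x^k$. Taking the $k$-th root gives

$$\sqrt[k]{\left|k\,x^k\right|}=\sqrt[k]{k}\,|x|\;\xrightarrow{k\to\infty}\;|x|,$$

so the series converges absolutely for $|x|<1$ and diverges for $|x|>1$; for $|x|=1$ the root test is inconclusive, but the term test shows divergence there as well.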
Alternating series test
Alternating series call for being treated by the alternating series criterion. According to it, any series $\sum_{k=1}^\infty (-1)^k a_k$ or $\sum_{k=1}^\infty (-1)^{k+1} a_k$ converges if $(a_k)_{k\in\mathbb{N}}$ is a monotonously decreasing null sequence. "Monotonously decreasing" makes sure that $a_{k+1}\leq a_k$ (if the $a_k$ are the positive elements). By "null sequence", we know that all differences $a_k-a_{k+1}$ sum up to $a_1$ at most and that the $a_k$ are positive.
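This intuition can be sketched as follows for the variant $\sum_{k=1}^\infty (-1)^{k+1}a_k$: grouping the partial sums in two ways,

$$(a_1-a_2)+(a_3-a_4)+\dotsb\qquad\text{and}\qquad a_1-(a_2-a_3)-(a_4-a_5)-\dotsb,$$

shows that the even partial sums increase, the odd partial sums decrease, and all of them stay between $0$ and $a_1$. Since $a_k\to 0$, the even and odd partial sums converge to the same limit.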
There are alternating series for which $(a_k)_{k\in\mathbb{N}}$ does not meet these two criteria. Then, the alternating series test does not work, even though we have an alternating series. If $(a_k)_{k\in\mathbb{N}}$ is not a null sequence, the series even diverges by the term test. If $(a_k)_{k\in\mathbb{N}}$ is a null sequence that does not decrease monotonously, we need to search for a different criterion.
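To see the test in action, take (as a sample choice) the series $\sum_{k=1}^\infty \frac{(-1)^k}{\sqrt{k}}$. Its positive elements satisfy

$$\frac{1}{\sqrt{k+1}}\leq\frac{1}{\sqrt{k}}\qquad\text{and}\qquad\lim_{k\to\infty}\frac{1}{\sqrt{k}}=0,$$

so the series converges by the alternating series test. It does not converge absolutely, though, since $\frac{1}{\sqrt{k}}\geq\frac{1}{k}$ and the harmonic series diverges.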
Direct comparison is useful if we have a fraction of polynomials, like $\sum_{k=1}^\infty \frac{p(k)}{q(k)}$ with polynomials $p$ and $q$. The ratio test and the root test may fail in this case. In particular, $\left|\frac{a_{k+1}}{a_k}\right|$ and $\sqrt[k]{|a_k|}$ might be complicated and therefore difficult to handle. It is easier to compare the fraction to the convergent series $\sum_{k=1}^\infty \frac{1}{k^2}$ (i.e. power 2 in the denominator) or to the divergent harmonic series $\sum_{k=1}^\infty \frac{1}{k}$ (power 1). In general, convergence can be established by comparison to $\sum_{k=1}^\infty \frac{1}{k^\alpha}$ with any power $\alpha\geq 2$ and divergence by comparison to $\sum_{k=1}^\infty \frac{1}{k^\alpha}$ with a power $\alpha\leq 1$. The convergence proof requires bounding the numerator $p(k)$ from above and the denominator $q(k)$ from below. The divergence proof works vice versa. If the polynomials $p$ and $q$ have degrees $n$ and $m$, the following rule of thumb holds: if $m-n\geq 2$, the series converges. If $m-n\leq 1$, it diverges. For a mathematical proof, we can then use a direct comparison to a series $\sum_{k=1}^\infty \frac{c_1}{c_2 k^{m-n}}$ with some constants $c_1,c_2>0$.
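As a quick (illustrative) instance of this rule of thumb: for $\sum_{k=1}^\infty \frac{k+1}{k^3+2}$ we have $m-n=3-1=2$, and indeed

$$\frac{k+1}{k^3+2}\leq\frac{2k}{k^3}=\frac{2}{k^2}\qquad\text{for all }k\geq 1,$$

so the series converges by comparison with $\sum_{k=1}^\infty\frac{2}{k^2}$. For $\sum_{k=1}^\infty \frac{k+1}{k^2+2}$ we have $m-n=2-1=1$, and

$$\frac{k+1}{k^2+2}\geq\frac{k}{3k^2}=\frac{1}{3k}\qquad\text{for all }k\geq 1,$$

so it diverges by comparison with the scaled harmonic series $\sum_{k=1}^\infty\frac{1}{3k}$.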
Decision tree for convergence and divergence
The tricks above can be visually represented in a decision tree. Let us now apply them to some practical examples.
We consider the series

$$\sum_{k=1}^\infty \frac{k^q}{2^k}\qquad\text{with a fixed rational exponent }q>0.$$

The coefficient sequence is $a_k=\frac{k^q}{2^k}$. This is a null sequence: we already proved $\lim_{k\to\infty}\frac{k^n}{2^k}=0$ for natural numbers $n$. The squeeze theorem implies $\lim_{k\to\infty}\frac{k^q}{2^k}=0$ for all rational $q>0$. The sequence is not alternating, since its elements are positive. As it is a sequence of quotients, we might have good luck with the ratio test:

$$\left|\frac{a_{k+1}}{a_k}\right|=\frac{(k+1)^q}{2^{k+1}}\cdot\frac{2^k}{k^q}=\frac12\left(\frac{k+1}{k}\right)^q=\frac12\left(1+\frac1k\right)^q$$

In the limit $k\to\infty$, there is

$$\lim_{k\to\infty}\left|\frac{a_{k+1}}{a_k}\right|=\frac12\cdot 1^q=\frac12<1$$

So the ratio test applies and our series $\sum_{k=1}^\infty \frac{k^q}{2^k}$ converges absolutely. Done with it!
Next, we consider the series

$$\sum_{k=1}^\infty \left(\frac{k}{2k+1}\right)^k$$

Again, we have a null sequence of elements: $a_k=\left(\frac{k}{2k+1}\right)^k\to 0$. This will get obvious when we write $\frac{k}{2k+1}<\frac{k}{2k}=\frac12$. There is $0\leq a_k<\left(\frac12\right)^k$, so also $\lim_{k\to\infty}a_k=0$. This sequence is again not alternating as all elements are positive. We have a quotient, which suggests taking the ratio test. But there is also a power of $k$, which suggests using the root test. Handling powers of $k$ by the ratio test is tedious, so we try the root test first, i.e. we take the $k$-th root:

$$\sqrt[k]{|a_k|}=\sqrt[k]{\left(\frac{k}{2k+1}\right)^k}=\frac{k}{2k+1}=\frac{1}{2+\frac1k}$$

Since

$$\lim_{k\to\infty}\sqrt[k]{|a_k|}=\lim_{k\to\infty}\frac{1}{2+\frac1k}=\frac12<1$$

the root test succeeds and our series $\sum_{k=1}^\infty \left(\frac{k}{2k+1}\right)^k$ converges absolutely.
Now, we investigate the alternating series

$$\sum_{k=1}^\infty \frac{(-1)^k}{2k+1}$$

The corresponding sequence $a_k=\frac{1}{2k+1}$ is a null sequence, since $0\leq\frac{1}{2k+1}\leq\frac1k\to 0$. At this point, the alternating series test seems the first option. In order to apply it, we need to check whether $\left(\frac{1}{2k+1}\right)_{k\in\mathbb{N}}$ is monotonously decreasing. For all $k\in\mathbb{N}$, there is:

$$a_{k+1}=\frac{1}{2(k+1)+1}=\frac{1}{2k+3}<\frac{1}{2k+1}=a_k$$

So $\left(\frac{1}{2k+1}\right)_{k\in\mathbb{N}}$ is monotonously decreasing. By the alternating series test, we know that the series $\sum_{k=1}^\infty \frac{(-1)^k}{2k+1}$ converges. But does it also converge absolutely?

We need to investigate whether the series $\sum_{k=1}^\infty \frac{1}{2k+1}$ converges. This scales like a harmonic series, so it should not converge. And indeed, we can compare it to a harmonic series:

$$\frac{1}{2k+1}\geq\frac{1}{2k+k}=\frac{1}{3k}=\frac13\cdot\frac1k$$

As the harmonic series $\sum_{k=1}^\infty \frac1k$ diverges, we also have divergence of the scaled version $\sum_{k=1}^\infty \frac{1}{3k}$ and by direct comparison, the series $\sum_{k=1}^\infty \frac{1}{2k+1}$ diverges. So our series $\sum_{k=1}^\infty \frac{(-1)^k}{2k+1}$ is convergent, but not absolutely convergent.
We consider the following quotient of polynomials:

$$\sum_{k=1}^\infty \frac{k+2}{k^3+4k+1}$$

The coefficient sequence is $(a_k)_{k\in\mathbb{N}}$ with

$$a_k=\frac{k+2}{k^3+4k+1}$$

The two polynomials in the fraction are $p(k)=k+2$ and $q(k)=k^3+4k+1$. Their degrees are $n=1$ and $m=3$, so $m-n=2$, which implies absolute convergence by direct comparison: the coefficient sequence scales like $\frac{1}{k^2}$ for large $k$. We can therefore bound it from above by $\frac{C}{k^2}$ with a suitable constant $C>0$. Explicitly, we can do this by increasing the numerator and decreasing the denominator:

$$\frac{k+2}{k^3+4k+1}\leq\frac{k+2k}{k^3}=\frac{3k}{k^3}=\frac{3}{k^2}$$

The series $\sum_{k=1}^\infty \frac{3}{k^2}$ converges, so also our series $\sum_{k=1}^\infty \frac{k+2}{k^3+4k+1}$ converges absolutely (and $\sum_{k=1}^\infty \frac{3}{k^2}=3\sum_{k=1}^\infty\frac{1}{k^2}$ is an upper bound for it).
Our last example is another alternating series:

$$\sum_{k=1}^\infty (-1)^k\,\frac{k}{k+1}$$

It may be tempting to use the alternating series test here. However, we should first check whether the sequence of elements is even a null sequence:

$$\lim_{k\to\infty}\frac{k}{k+1}=\lim_{k\to\infty}\frac{1}{1+\frac1k}=1\neq 0$$

Apparently, $\left(\frac{k}{k+1}\right)_{k\in\mathbb{N}}$ is not a null sequence! So we cannot apply the alternating series test. However, we instantly know that $\left((-1)^k\frac{k}{k+1}\right)_{k\in\mathbb{N}}$ is not a null sequence either, and can directly apply the term test. By means of the term test, our series $\sum_{k=1}^\infty (-1)^k\frac{k}{k+1}$ diverges.