# Examples for limits – Serlo

## Some important limits

The following limits are fundamental building blocks, which you can use to find the limits of many other sequences. You should know them by heart, and have a figure in mind of what they look like and how they converge:

• ${\displaystyle \lim _{n\rightarrow \infty }c=c}$ for all ${\displaystyle c\in \mathbb {R} }$
• ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {1}{n}}=0}$
• ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {1}{n^{k}}}=0}$ for all ${\displaystyle k\in \mathbb {N} }$
• ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {1}{\sqrt[{k}]{n}}}=0}$ for all ${\displaystyle k\in \mathbb {N} }$
• ${\displaystyle \lim _{n\rightarrow \infty }q^{n}=0}$ for all ${\displaystyle q\in \mathbb {R} }$ with ${\displaystyle |q|<1}$
• ${\displaystyle \lim _{n\rightarrow \infty }{\sqrt[{n}]{c}}=1}$ for all ${\displaystyle c>0}$
• ${\displaystyle \lim _{n\rightarrow \infty }{\sqrt[{n}]{n}}=1}$
• ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {n^{k}}{z^{n}}}=0}$ for all ${\displaystyle k\in \mathbb {N} }$ and ${\displaystyle z\in \mathbb {R} }$ with ${\displaystyle |z|>1}$
• ${\displaystyle \lim _{n\rightarrow \infty }n^{k}q^{n}=0}$ for all ${\displaystyle k\in \mathbb {N} }$ and ${\displaystyle q\in \mathbb {R} }$ with ${\displaystyle |q|<1}$
• ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {z^{n}}{n!}}=0}$ for all ${\displaystyle z\in \mathbb {R} }$ with ${\displaystyle |z|>1}$
• ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {n!}{n^{n}}}=0}$
• ${\displaystyle \lim _{n\rightarrow \infty }\left(1+{\tfrac {1}{n}}\right)^{n}=e}$
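As a quick numerical sanity check (not a proof), one can evaluate these sequences at a large index in Python; the index `n = 10**6` and the tolerance `0.02` below are arbitrary choices, and the ratio ${\displaystyle n^{k}/z^{n}}$ is computed in log space to avoid float overflow:

```python
import math

n = 10**6  # a large index; each sequence should be close to its limit here

# (sequence value at n, claimed limit)
checks = [
    (42.0, 42.0),                                          # constant sequence c
    (1 / n, 0.0),                                          # 1/n
    (1 / n**3, 0.0),                                       # 1/n^k with k = 3
    (1 / n ** (1 / 3), 0.0),                               # 1/n^(1/k) with k = 3
    (0.5**n, 0.0),                                         # q^n with q = 1/2
    (7 ** (1 / n), 1.0),                                   # n-th root of c, c = 7
    (n ** (1 / n), 1.0),                                   # n-th root of n
    (math.exp(3 * math.log(n) - n * math.log(1.5)), 0.0),  # n^k/z^n, k=3, z=1.5
]

for value, limit in checks:
    assert abs(value - limit) < 0.02, (value, limit)
```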

In the following, we will derive all of those limits using the Epsilon-definition of convergence. Only ${\displaystyle \lim _{n\rightarrow \infty }\left(1+{\tfrac {1}{n}}\right)^{n}=e}$ will be considered later in the article „Monotony criterion“.

Hint

In real analysis, it is helpful to have an intuition about how fast sequences grow. For instance, we have ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {n^{k}}{z^{n}}}=0}$, because the exponentially increasing ${\displaystyle (z^{n})_{n\in \mathbb {N} }}$ grows faster than the polynomially increasing ${\displaystyle (n^{k})_{n\in \mathbb {N} }}$. An experienced mathematician will not try to compute ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {n^{k}}{z^{n}}}}$ but instead use the intuitive picture "exponential beats polynomial". Both the numerator and the denominator grow infinitely large, but the denominator "wins" in this process and the sequence goes to ${\displaystyle {\frac {1}{\infty }}=0}$.

There is actually an entire hierarchy of growth speeds. We express "${\displaystyle b_{n}}$ wins against ${\displaystyle a_{n}}$" by writing ${\displaystyle a_{n}\ll b_{n}}$, which mathematically means ${\displaystyle \lim _{n\to \infty }{\tfrac {a_{n}}{b_{n}}}=0}$. The list of limits above then implies:

${\displaystyle 1\ll n^{k}\ll z^{n}\ll n!\ll n^{n}}$

Or intuitively, polynomial beats constant, exponential beats polynomial, factorial beats exponential and ${\displaystyle n^{n}}$ beats them all.
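This hierarchy can be illustrated numerically; the index `n = 80` and the exemplary values `k = 5`, `z = 2` below are arbitrary choices, large enough that each ratio ${\displaystyle a_{n}/b_{n}}$ is already tiny:

```python
import math

n = 80  # large enough that each ratio in the hierarchy is already tiny

poly = n**5                 # n^k with k = 5
expo = 2.0**n               # z^n with z = 2
fact = math.factorial(n)    # n!
tower = n**n                # n^n

assert 1 / poly < 1e-6      # 1 << n^k
assert poly / expo < 1e-6   # n^k << z^n
assert expo / fact < 1e-6   # z^n << n!
assert fact / tower < 1e-6  # n! << n^n
```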

## Constant sequences

Theorem (Limit of a constant sequence)

Every constant sequence converges to its constant value:

${\displaystyle \lim _{n\rightarrow \infty }c=c}$

Example (Limit of a constant sequence)

• ${\displaystyle \lim _{n\rightarrow \infty }42=42}$
• ${\displaystyle \lim _{n\rightarrow \infty }(-1)=-1}$

How to get to the proof? (Limit of a constant sequence)

The formal proof requires showing that ${\displaystyle |a_{n}-c|<\epsilon }$ for any ${\displaystyle \epsilon >0}$ and all ${\displaystyle n\in \mathbb {N} }$. But this is rather simple: since ${\displaystyle a_{n}=c}$, we have ${\displaystyle |c-c|=0}$ and

${\displaystyle \underbrace {|a_{n}-c|} _{=\ 0}<\epsilon }$

always holds. We do not even have to care about how to choose ${\displaystyle N}$ . For simplicity, we may just use ${\displaystyle N=1}$ . The proof now reads:

„Choose ${\displaystyle N=1}$. Let ${\displaystyle n\in \mathbb {N} }$ with ${\displaystyle n\geq N=1}$ be arbitrary. Then…“

This formulation can be simplified, since ${\displaystyle n\geq 1}$ just means that ${\displaystyle n}$ is a natural number:

„For all ${\displaystyle n\in \mathbb {N} }$ there is…“

Proof (Limit of a constant sequence)

Let ${\displaystyle a_{n}=c}$ be a constant sequence and ${\displaystyle \epsilon >0}$ arbitrary. For all ${\displaystyle n\in \mathbb {N} }$ (and hence, for almost all ${\displaystyle n\in \mathbb {N} }$) we have

${\displaystyle |a_{n}-c|=|c-c|=0<\epsilon }$

## Harmonic sequence

Theorem (Limit of the Harmonic sequence)

We have ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {1}{n}}=0}$.

How to get to the proof? (Limit of the Harmonic sequence)

We have to establish the inequality ${\displaystyle \left|{\tfrac {1}{n}}-0\right|<\epsilon }$. First, let us get rid of the absolute value ${\displaystyle \left|{\tfrac {1}{n}}-0\right|}$:

${\displaystyle \left|{\tfrac {1}{n}}-0\right|=\left|{\tfrac {1}{n}}\right|={\tfrac {1}{n}}}$

So we need to find some ${\displaystyle n}$, such that ${\displaystyle {\tfrac {1}{n}}<\epsilon }$. Which ${\displaystyle n}$ satisfy this? Let us solve the inequality for ${\displaystyle n}$:

${\displaystyle {\begin{array}{rrl}&{\tfrac {1}{n}}&<\epsilon \\\Leftrightarrow \ &1&<\epsilon \cdot n\\\Leftrightarrow \ &{\tfrac {1}{\epsilon }}&<n\end{array}}}$

So we need ${\displaystyle n>{\tfrac {1}{\epsilon }}}$ in order to fulfill ${\displaystyle {\tfrac {1}{n}}<\epsilon }$.

Question: Which threshold ${\displaystyle N}$ will we choose for the later proof?

Just choose any ${\displaystyle N\in \mathbb {N} }$ with ${\displaystyle N>{\tfrac {1}{\epsilon }}}$. If we consider any bigger number ${\displaystyle n\geq N}$, then also ${\displaystyle n>{\tfrac {1}{\epsilon }}}$.

How do we know that such an ${\displaystyle N}$ exists? In an exam or a textbook, this will be considered obvious. Nevertheless, it is useful to think through the exact reasoning at least once:

Question: Why does such an ${\displaystyle N}$ exist?

This follows from the Archimedean axiom: for every real number ${\displaystyle M}$ there is a larger natural number ${\displaystyle N}$, i.e. ${\displaystyle N>M}$. Here, we choose ${\displaystyle M={\tfrac {1}{\epsilon }}}$.

Proof (Limit of the Harmonic sequence)

Let ${\displaystyle \epsilon >0}$ be arbitrary. We choose an ${\displaystyle N\in \mathbb {N} }$ with ${\displaystyle N>{\tfrac {1}{\epsilon }}}$. The existence is provided by the Archimedean axiom. Let ${\displaystyle n\geq N}$ be arbitrary. Then

{\displaystyle {\begin{aligned}\left|{\tfrac {1}{n}}-0\right|&=\left|{\tfrac {1}{n}}\right|={\tfrac {1}{n}}\leq {\tfrac {1}{N}}<\epsilon \end{aligned}}}
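The proof's choice of threshold can be made concrete; this is a sketch assuming Python, where `harmonic_threshold` is a name of our own invention:

```python
import math

def harmonic_threshold(eps: float) -> int:
    """Return a natural number N > 1/eps, as the proof requires."""
    return math.floor(1 / eps) + 1

for eps in (0.5, 0.25, 0.125):
    N = harmonic_threshold(eps)
    assert N > 1 / eps
    # the proof's estimate: 1/n <= 1/N < eps for every n >= N
    assert all(1 / n < eps for n in range(N, N + 1000))
```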

## Inverse power sequences

Theorem

For all ${\displaystyle k\in \mathbb {N} }$ we have ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {1}{n^{k}}}=0}$.

How to get to the proof?

The proof is quite similar to that for the harmonic sequence above (which is in fact just the special case ${\displaystyle k=1}$). We start by simplifying ${\displaystyle \left|{\tfrac {1}{n^{k}}}-0\right|}$:

${\displaystyle \left|{\tfrac {1}{n^{k}}}-0\right|=\left|{\tfrac {1}{n^{k}}}\right|={\tfrac {1}{n^{k}}}}$

Now, we solve ${\displaystyle {\tfrac {1}{n^{k}}}<\epsilon }$ for ${\displaystyle n}$:

${\displaystyle {\begin{array}{rrl}&{\tfrac {1}{n^{k}}}&<\epsilon \\\Leftrightarrow \ &1&<\epsilon \cdot n^{k}\\\Leftrightarrow \ &{\tfrac {1}{\epsilon }}&<n^{k}\\\Leftrightarrow \ &{\tfrac {1}{\sqrt[{k}]{\epsilon }}}&<n\end{array}}}$

So we need to find an ${\displaystyle N}$ with ${\displaystyle N>{\frac {1}{\sqrt[{k}]{\epsilon }}}}$. This can be done by the Archimedean axiom.

Proof

Let ${\displaystyle \epsilon >0}$ be arbitrary. Choose ${\displaystyle N\in \mathbb {N} }$ such that ${\displaystyle N>{\tfrac {1}{\sqrt[{k}]{\epsilon }}}}$ . Existence is guaranteed by the Archimedean axiom. Let ${\displaystyle n\geq N}$ be arbitrary. Then,

${\displaystyle {\begin{array}{rrl}&n&\geq N\\[0.5em]&&{\color {OliveGreen}\left\downarrow \ N>{\frac {1}{\sqrt[{k}]{\epsilon }}}\right.}\\[0.3em]\Rightarrow \ &n&>{\frac {1}{\sqrt[{k}]{\epsilon }}}\\[0.3em]\Rightarrow \ &n\cdot {\sqrt[{k}]{\epsilon }}&>1\\[0.3em]\Rightarrow \ &{\sqrt[{k}]{\epsilon }}&>{\tfrac {1}{n}}\\[0.3em]\Rightarrow \ &\epsilon &>{\tfrac {1}{n^{k}}}\\[0.3em]\Rightarrow \ &\epsilon &>\left|{\tfrac {1}{n^{k}}}-0\right|\\[0.3em]\Rightarrow \ &\left|{\tfrac {1}{n^{k}}}-0\right|&<\epsilon \end{array}}}$
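The threshold ${\displaystyle N>{\tfrac {1}{\sqrt[{k}]{\epsilon }}}}$ from this proof can be checked numerically; `power_threshold` below is a name of our own invention, and the sample values of ${\displaystyle \epsilon }$ and ${\displaystyle k}$ are arbitrary:

```python
import math

def power_threshold(eps: float, k: int) -> int:
    """Return a natural number N > eps**(-1/k), as the proof requires."""
    return math.floor(eps ** (-1.0 / k)) + 1

for eps, k in ((0.2, 2), (0.1, 3), (0.5, 5)):
    N = power_threshold(eps, k)
    assert N > eps ** (-1.0 / k)
    # then 1/n^k <= 1/N^k < eps for every n >= N
    assert all(1 / n**k < eps for n in range(N, N + 100))
```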

## Inverse root sequence

Theorem

For all ${\displaystyle k\in \mathbb {N} }$ we have ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {1}{\sqrt[{k}]{n}}}=0}$.

How to get to the proof?

The proof is almost the same as above. We start by simplifying ${\displaystyle \left|{\tfrac {1}{\sqrt[{k}]{n}}}-0\right|}$:

${\displaystyle \left|{\tfrac {1}{\sqrt[{k}]{n}}}-0\right|=\left|{\tfrac {1}{\sqrt[{k}]{n}}}\right|={\tfrac {1}{\sqrt[{k}]{n}}}}$

Now, we solve ${\displaystyle {\tfrac {1}{\sqrt[{k}]{n}}}<\epsilon }$ for ${\displaystyle n}$:

${\displaystyle {\begin{array}{rrl}&{\tfrac {1}{\sqrt[{k}]{n}}}&<\epsilon \\\Leftrightarrow \ &1&<\epsilon \cdot {\sqrt[{k}]{n}}\\\Leftrightarrow \ &{\tfrac {1}{\epsilon }}&<{\sqrt[{k}]{n}}\\\Leftrightarrow \ &\left({\tfrac {1}{\epsilon }}\right)^{k}&<n\end{array}}}$

So we need to find an ${\displaystyle N}$ with ${\displaystyle N>{\frac {1}{\epsilon ^{k}}}}$ . This can be done by the Archimedean axiom:

Proof

Let ${\displaystyle \epsilon >0}$ be arbitrary. Choose ${\displaystyle N\in \mathbb {N} }$ such that ${\displaystyle N>{\tfrac {1}{\epsilon ^{k}}}}$ . Existence is guaranteed by the Archimedean axiom. Hence,

${\displaystyle {\begin{array}{rrl}&N&>{\frac {1}{\epsilon ^{k}}}\\[0.3em]\Rightarrow \ &N\cdot \epsilon ^{k}&>1\\[0.3em]\Rightarrow \ &\epsilon ^{k}&>{\tfrac {1}{N}}\\[0.3em]\Rightarrow \ &\epsilon &>{\tfrac {1}{\sqrt[{k}]{N}}}\\[0.3em]\end{array}}}$

Let now ${\displaystyle n\in \mathbb {N} }$ with ${\displaystyle n\geq N}$ be arbitrary. Then:

${\displaystyle {\begin{array}{rrl}&N&\leq n\\[0.5em]\Rightarrow \ &{\tfrac {1}{n}}&\leq {\tfrac {1}{N}}\\[0.5em]\Rightarrow \ &{\tfrac {1}{\sqrt[{k}]{n}}}&\leq {\tfrac {1}{\sqrt[{k}]{N}}}<\epsilon \\\end{array}}}$
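Here, too, the choice ${\displaystyle N>{\tfrac {1}{\epsilon ^{k}}}}$ can be checked numerically; `root_threshold` below is a name of our own invention, with arbitrary sample values:

```python
import math

def root_threshold(eps: float, k: int) -> int:
    """Return a natural number N > 1/eps**k, as the proof requires."""
    return math.floor(1 / eps**k) + 1

for eps, k in ((0.5, 2), (0.3, 3)):
    N = root_threshold(eps, k)
    assert N > 1 / eps**k
    # then 1/n^(1/k) <= 1/N^(1/k) < eps for every n >= N
    assert all(1 / n ** (1.0 / k) < eps for n in range(N, N + 100))
```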

## Geometric sequences

Theorem

For all ${\displaystyle q\in \mathbb {R} }$ with ${\displaystyle |q|<1}$ we have ${\displaystyle \lim _{n\rightarrow \infty }q^{n}=0}$.

This sequence decreases exponentially (and no longer just polynomially), so it goes to 0 even faster.

Example

• ${\displaystyle \lim _{n\rightarrow \infty }\left(-{\tfrac {1}{2}}\right)^{n}=0}$
• ${\displaystyle \lim _{n\rightarrow \infty }\left({\tfrac {1}{\pi }}\right)^{n}=0}$

How to get to the proof?

We start again by simplifying the absolute value ${\displaystyle \left|q^{n}-0\right|}$:

${\displaystyle \left|q^{n}-0\right|=\left|q^{n}\right|=|q|^{n}}$

Using the Bernoulli-inequality we can show:

„For each ${\displaystyle 0\leq a<1}$ and ${\displaystyle \epsilon >0}$ there is an ${\displaystyle N\in \mathbb {N} }$ with ${\displaystyle a^{N}<\epsilon }$.“

We set ${\displaystyle a=|q|}$. Then, there must be an ${\displaystyle N\in \mathbb {N} }$ with ${\displaystyle |q|^{N}<\epsilon }$, which is exactly what we need. For all larger ${\displaystyle n\geq N}$ we also have:

${\displaystyle |q|^{n}=\underbrace {|q|^{n-N}} _{\leq \ 1}\cdot \underbrace {|q|^{N}} _{<\ \epsilon }<\epsilon }$

Proof

Let ${\displaystyle \epsilon >0}$ and ${\displaystyle q\in \mathbb {R} }$ with ${\displaystyle |q|<1}$ be arbitrary. By the Bernoulli-inequality, we obtain an ${\displaystyle N\in \mathbb {N} }$ with ${\displaystyle |q|^{N}<\epsilon }$. For all ${\displaystyle n\geq N}$, we have:

{\displaystyle {\begin{aligned}\left|q^{n}-0\right|&=\left|q^{n}\right|=|q|^{n}=\underbrace {|q|^{n-N}} _{\leq \ 1}\cdot \underbrace {|q|^{N}} _{<\ \epsilon }<\epsilon \end{aligned}}}
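The ${\displaystyle N}$ whose existence the Bernoulli-inequality guarantees can simply be found by search; `geometric_threshold` below is a name of our own invention, with arbitrary sample values:

```python
def geometric_threshold(q: float, eps: float) -> int:
    """Search for an N with |q|**N < eps; existence is guaranteed by the
    Bernoulli-inequality argument from the text."""
    a, N = abs(q), 1
    while a**N >= eps:
        N += 1
    return N

for q, eps in ((-0.5, 0.01), (0.9, 0.001), (1 / 3, 0.25)):
    N = geometric_threshold(q, eps)
    # |q|^n = |q|^(n-N) * |q|^N < eps for every n >= N
    assert all(abs(q) ** n < eps for n in range(N, N + 50))
```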

Hint

We can also investigate convergence for ${\displaystyle |q|\geq 1}$. This requires distinguishing three cases:

• ${\displaystyle q=1:}$ This is just a constant sequence ${\displaystyle (q^{n})_{n\in \mathbb {N} }=(1^{n})_{n\in \mathbb {N} }=(1)_{n\in \mathbb {N} }}$. Above, we have shown that ${\displaystyle \lim _{n\to \infty }q^{n}=\lim _{n\to \infty }1=1}$.
• ${\displaystyle q=-1:}$ Here, we get the alternating sequence ${\displaystyle (q^{n})_{n\in \mathbb {N} }=\left((-1)^{n}\right)_{n\in \mathbb {N} }}$. It "jumps" between ${\displaystyle -1}$ and ${\displaystyle 1}$ and hence diverges. The mathematical proof is an exercise in the article „Exercises for convergence and divergence“.
• ${\displaystyle |q|>1:}$ The sequence ${\displaystyle (q^{n})_{n\in \mathbb {N} }}$ diverges, since it is unbounded. In the article „Unbounded sequences diverge“ we will show that all unbounded sequences must diverge.

## n-th root

Theorem (Limit of a sequence with the n-th root)

For all ${\displaystyle c>0}$ we have ${\displaystyle \lim _{n\rightarrow \infty }{\sqrt[{n}]{c}}=1}$.

How to get to the proof? (Limit of a sequence with the n-th root)

We need to establish an inequality of the form ${\displaystyle \left|{\sqrt[{n}]{c}}-1\right|<\epsilon }$. First, let us focus on the proof for ${\displaystyle c\geq 1}$. Then ${\displaystyle {\sqrt[{n}]{c}}\geq 1}$ and we can drop the absolute value:

Case 1: ${\displaystyle c\geq 1}$

In this case, ${\displaystyle \left|{\sqrt[{n}]{c}}-1\right|={\sqrt[{n}]{c}}-1}$, and we need to show ${\displaystyle {\sqrt[{n}]{c}}-1<\epsilon }$ for almost all ${\displaystyle n\in \mathbb {N} }$. We solve for ${\displaystyle n}$:

${\displaystyle {\begin{array}{rrl}&{\sqrt[{n}]{c}}-1&<\epsilon \\\Leftrightarrow \ &{\sqrt[{n}]{c}}&<1+\epsilon \\\Leftrightarrow \ &c&<(1+\epsilon )^{n}\\\end{array}}}$

Does this inequality make sense? Since ${\displaystyle \epsilon >0}$, we have ${\displaystyle 1+\epsilon >1}$. So ${\displaystyle \left((1+\epsilon )^{n}\right)_{n\in \mathbb {N} }}$ is a geometric sequence that gets arbitrarily large with increasing ${\displaystyle n}$. Hence, there must be an ${\displaystyle n}$ for which ${\displaystyle (1+\epsilon )^{n}}$ is bigger than ${\displaystyle c}$, which is equivalent to ${\displaystyle {\sqrt[{n}]{c}}-1<\epsilon }$. More precisely, by an „Implication of the Bernoulli-inequality“, we have:

For each ${\displaystyle p>1}$ and ${\displaystyle M\in \mathbb {R} }$ there is an ${\displaystyle n\in \mathbb {N} }$, such that ${\displaystyle p^{n}>M}$ .

If we now set ${\displaystyle p=1+\epsilon }$ and ${\displaystyle M=c}$, we get the desired ${\displaystyle N\in \mathbb {N} }$ with ${\displaystyle (1+\epsilon )^{N}>c}$. We only need to show ${\displaystyle (1+\epsilon )^{n}>c}$ for all larger natural numbers ${\displaystyle n\geq N}$, which is done via

${\displaystyle (1+\epsilon )^{n}=\underbrace {(1+\epsilon )^{n-N}} _{\geq \ 1}\underbrace {(1+\epsilon )^{N}} _{>\ c}>c}$

Case 2: ${\displaystyle c<1}$

This is similar to the case above. We set ${\displaystyle a={\tfrac {1}{c}}}$, so that ${\displaystyle a>1}$. Then:

{\displaystyle {\begin{aligned}\left|{\sqrt[{n}]{c}}-1\right|&=\left|{\sqrt[{n}]{\frac {1}{a}}}-1\right|\\&=\left|{\frac {1}{\sqrt[{n}]{a}}}-1\right|\\&=\left|{\frac {1-{\sqrt[{n}]{a}}}{\sqrt[{n}]{a}}}\right|\\&=\left|{\frac {1}{\sqrt[{n}]{a}}}\right|\cdot \left|1-{\sqrt[{n}]{a}}\right|\\&={\frac {1}{\sqrt[{n}]{a}}}\cdot \left|{\sqrt[{n}]{a}}-1\right|\\\end{aligned}}}

This makes sense, because ${\displaystyle \left|{\sqrt[{n}]{a}}-1\right|}$ gets arbitrarily small (as ${\displaystyle a>1}$). So we just need to estimate the term ${\displaystyle {\frac {1}{\sqrt[{n}]{a}}}}$ from above. We have

${\displaystyle {\begin{array}{rrl}&a&>1\\\Rightarrow \ &{\sqrt[{n}]{a}}&>1\\\Rightarrow \ &{\frac {1}{\sqrt[{n}]{a}}}&<1\end{array}}}$

So

${\displaystyle {\frac {1}{\sqrt[{n}]{a}}}\cdot \left|{\sqrt[{n}]{a}}-1\right|<\left|{\sqrt[{n}]{a}}-1\right|}$

and by case 1, we already know that ${\displaystyle \left|{\sqrt[{n}]{a}}-1\right|}$ gets arbitrarily small.

Proof (Limit of a sequence with the n-th root)

Case 1: ${\displaystyle c\geq 1}$

Let ${\displaystyle \epsilon >0}$ be arbitrary. We know that:

For all ${\displaystyle p>1}$ and all ${\displaystyle M\in \mathbb {R} }$ there is an ${\displaystyle n\in \mathbb {N} }$, such that ${\displaystyle p^{n}>M}$.

So there must be a natural number ${\displaystyle N\in \mathbb {N} }$ with ${\displaystyle (1+\epsilon )^{N}>c}$. Let ${\displaystyle n\geq N}$ be arbitrary. Then,

${\displaystyle (1+\epsilon )^{n}=\underbrace {(1+\epsilon )^{n-N}} _{\geq \ 1}\underbrace {(1+\epsilon )^{N}} _{>\ c}>c}$

So

${\displaystyle {\begin{array}{rrl}&c&<(1+\epsilon )^{n}\\\Rightarrow \ &{\sqrt[{n}]{c}}&<1+\epsilon \\\Rightarrow \ &{\sqrt[{n}]{c}}-1&<\epsilon \\\Rightarrow \ &\left|{\sqrt[{n}]{c}}-1\right|&<\epsilon \\\end{array}}}$

Case 2: ${\displaystyle c<1}$

Let ${\displaystyle \epsilon >0}$ be arbitrary. First, set ${\displaystyle a={\tfrac {1}{c}}}$. Then ${\displaystyle a>1}$, and by case 1 we know that there must be an ${\displaystyle N\in \mathbb {N} }$ with ${\displaystyle \left|{\sqrt[{n}]{a}}-1\right|<\epsilon }$ for all ${\displaystyle n\geq N}$. Let ${\displaystyle n\geq N}$ be arbitrary. Then,

${\displaystyle {\begin{array}{rrl}&a&>1\\\Rightarrow \ &{\sqrt[{n}]{a}}&>1\\\Rightarrow \ &{\frac {1}{\sqrt[{n}]{a}}}&<1\end{array}}}$

So

{\displaystyle {\begin{aligned}\left|{\sqrt[{n}]{c}}-1\right|&=\left|{\sqrt[{n}]{\frac {1}{a}}}-1\right|\\&=\left|{\frac {1}{\sqrt[{n}]{a}}}-1\right|\\&=\left|{\frac {1-{\sqrt[{n}]{a}}}{\sqrt[{n}]{a}}}\right|\\&=\left|{\frac {1}{\sqrt[{n}]{a}}}\right|\cdot \left|1-{\sqrt[{n}]{a}}\right|\\&={\frac {1}{\sqrt[{n}]{a}}}\cdot \left|{\sqrt[{n}]{a}}-1\right|\\&<\left|{\sqrt[{n}]{a}}-1\right|\\&<\epsilon \\\end{aligned}}}
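Both cases can be observed numerically: the error ${\displaystyle \left|{\sqrt[{n}]{c}}-1\right|}$ shrinks toward 0 for ${\displaystyle c\geq 1}$ and ${\displaystyle c<1}$ alike. The sample values of ${\displaystyle c}$ and the indices below are arbitrary choices:

```python
# |c^(1/n) - 1| shrinks toward 0, for c >= 1 (case 1) and c < 1 (case 2) alike
for c in (7.0, 1.0, 0.2):
    errors = [abs(c ** (1.0 / n) - 1.0) for n in (10, 100, 1000)]
    assert errors[0] >= errors[1] >= errors[2]  # the error decays
    assert errors[2] < 0.01                     # and is already small at n = 1000
```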

## n-th root of n

Theorem (Limit of the n-th root of n)

We have ${\displaystyle \lim _{n\rightarrow \infty }{\sqrt[{n}]{n}}=1}$.

How to get to the proof? (Limit of the n-th root of n)

We need to establish ${\displaystyle \left|{\sqrt[{n}]{n}}-1\right|<\epsilon }$. Let us try to solve for ${\displaystyle n}$:

${\displaystyle {\begin{array}{rrl}&\left|{\sqrt[{n}]{n}}-1\right|&<\epsilon \\[0.3em]&&{\color {OliveGreen}\left\downarrow \ {\sqrt[{n}]{n}}\geq 1\right.}\\[0.3em]\Leftrightarrow \ &{\sqrt[{n}]{n}}-1&<\epsilon \\\Leftrightarrow \ &{\sqrt[{n}]{n}}&<1+\epsilon \\\Leftrightarrow \ &n&<(1+\epsilon )^{n}\\\end{array}}}$

Both ${\displaystyle (1+\epsilon )^{n}}$ and ${\displaystyle n}$ grow to ${\displaystyle \infty }$. So the only way to succeed is to show that ${\displaystyle (1+\epsilon )^{n}}$ eventually gets bigger than ${\displaystyle n}$. Intuitively, the reason is that "exponential beats polynomial". Mathematically, we can show this by expanding ${\displaystyle (1+\epsilon )^{n}}$ with the binomial theorem:

${\displaystyle (1+\epsilon )^{n}=\sum _{k=0}^{n}{\binom {n}{k}}\epsilon ^{k}=1+n\epsilon +{\frac {n(n-1)}{2}}\epsilon ^{2}+\ldots +\epsilon ^{n}}$

Each term in this sum is ${\displaystyle \geq 0}$. So if we can show that ${\displaystyle n}$ is smaller than some part of the sum ${\displaystyle 1+n\epsilon +{\frac {n(n-1)}{2}}\epsilon ^{2}+\ldots +\epsilon ^{n}}$, then ${\displaystyle n}$ is also smaller than the entire sum ${\displaystyle 1+n\epsilon +{\frac {n(n-1)}{2}}\epsilon ^{2}+\ldots +\epsilon ^{n}=(1+\epsilon )^{n}}$. We choose the terms ${\displaystyle 1}$ and ${\displaystyle {\tfrac {n(n-1)}{2}}\epsilon ^{2}}$; these two suffice to establish

${\displaystyle n<1+{\frac {n(n-1)}{2}}\epsilon ^{2}}$

which in turn is smaller than

${\displaystyle 1+{\frac {n(n-1)}{2}}\epsilon ^{2}\leq 1+n\epsilon +{\frac {n(n-1)}{2}}\epsilon ^{2}+\ldots +\epsilon ^{n}=(1+\epsilon )^{n}}$

so

${\displaystyle n<(1+\epsilon )^{n}}$

What is a suitable ${\displaystyle n}$ to get ${\displaystyle n<1+{\tfrac {n(n-1)}{2}}\epsilon ^{2}}$? Let us solve for ${\displaystyle n}$:

${\displaystyle {\begin{array}{rrl}&n&<1+{\frac {n(n-1)}{2}}\epsilon ^{2}\\[0.3em]\Leftrightarrow \ &n-1&<{\frac {n(n-1)}{2}}\epsilon ^{2}\\[0.3em]\Leftrightarrow \ &1&<{\frac {n}{2}}\epsilon ^{2}\\[0.3em]\Leftrightarrow \ &2&<n\epsilon ^{2}\\[0.3em]\Leftrightarrow \ &{\tfrac {2}{\epsilon ^{2}}}&<n\end{array}}}$

By the Archimedean axiom, there is an ${\displaystyle N}$ with ${\displaystyle {\tfrac {2}{\epsilon ^{2}}}<N\leq n}$ for all ${\displaystyle n\geq N}$.

Proof (Limit of the n-th root of n)

Let ${\displaystyle \epsilon >0}$ be arbitrary. We choose ${\displaystyle N\in \mathbb {N} }$ by the Archimedean axiom, such that ${\displaystyle N>{\tfrac {2}{\epsilon ^{2}}}}$ . Let ${\displaystyle n\geq N}$ be arbitrary. Then,

${\displaystyle {\begin{array}{rrl}&{\frac {2}{\epsilon ^{2}}}&<n\\[0.3em]\Rightarrow \ &2&<n\epsilon ^{2}\\[0.3em]\Rightarrow \ &1&<{\frac {n}{2}}\epsilon ^{2}\\[0.3em]\Rightarrow \ &n-1&<{\frac {n(n-1)}{2}}\epsilon ^{2}\\[0.3em]\Rightarrow \ &n&<1+{\frac {n(n-1)}{2}}\epsilon ^{2}\leq (1+\epsilon )^{n}\\[0.3em]\Rightarrow \ &{\sqrt[{n}]{n}}&<1+\epsilon \\[0.3em]\Rightarrow \ &{\sqrt[{n}]{n}}-1&<\epsilon \\[0.3em]\Rightarrow \ &\left|{\sqrt[{n}]{n}}-1\right|&<\epsilon \end{array}}}$
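The proof's threshold ${\displaystyle N>{\tfrac {2}{\epsilon ^{2}}}}$ can be checked numerically; `nth_root_threshold` below is a name of our own invention, with arbitrary sample values of ${\displaystyle \epsilon }$:

```python
import math

def nth_root_threshold(eps: float) -> int:
    """Return a natural number N > 2/eps**2, the proof's choice."""
    return math.floor(2 / eps**2) + 1

for eps in (0.5, 0.2):
    N = nth_root_threshold(eps)
    assert N > 2 / eps**2
    # then |n^(1/n) - 1| < eps for every n >= N
    assert all(abs(n ** (1.0 / n) - 1.0) < eps for n in range(N, N + 200))
```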

## Ratio - power sequence / geometric sequence

In simple words: "exponential beats polynomial":

Theorem

Let ${\displaystyle k\in \mathbb {N} }$ be arbitrary and ${\displaystyle z}$ a real number with ${\displaystyle |z|>1}$. Then, ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {n^{k}}{z^{n}}}=0}$.

How to get to the proof?

This proof will be quite complicated, but it includes some useful tricks which may help you with other proofs as well. One of these tricks is writing ${\displaystyle |z|}$ as ${\displaystyle 1+x}$ with ${\displaystyle x=|z|-1>0}$. The term ${\displaystyle (1+x)^{n}}$ can then be expanded using the binomial theorem:

{\displaystyle {\begin{aligned}\left|{\frac {n^{k}}{z^{n}}}\right|&={\frac {\left|n^{k}\right|}{\left|z^{n}\right|}}\\[0.5em]&={\frac {n^{k}}{|z|^{n}}}\\[0.5em]&={\frac {n^{k}}{(1+x)^{n}}}\\[0.5em]&\quad {\color {Gray}\left\downarrow \ {\text{binomial theorem}}\right.}\\[0.5em]&={\frac {n^{k}}{\sum \limits _{l=0}^{n}{\binom {n}{l}}x^{l}}}\end{aligned}}}

The sum in the denominator is what makes this fraction complicated. It would be nice to have a simpler expression. We can achieve this by making the denominator smaller: all terms ${\displaystyle {\binom {n}{l}}x^{l}}$ are positive, so if we leave summands out, we simplify the denominator and make it smaller. The entire fraction then grows. And if the enlarged fraction still converges to 0, the original one does so as well. The interesting question is which terms to leave out.

Since the numerator includes ${\displaystyle n^{k}}$, it may be useful to replace the denominator by some term ${\displaystyle c_{k}\cdot n^{k+1}}$ with ${\displaystyle c_{k}\in \mathbb {R} ^{+}}$ not depending on ${\displaystyle n}$. This way, we can make sure that the fraction still converges to 0 after the simplification. We keep only the term with ${\displaystyle l=k+1}$, i.e. ${\displaystyle {\binom {n}{k+1}}x^{k+1}}$; all other terms are left out for simplicity. Expanding yields ${\displaystyle {\binom {n}{k+1}}={\tfrac {n(n-1)\cdot \ldots \cdot (n-k)}{(k+1)!}}}$ as the pre-factor of ${\displaystyle x^{k+1}}$, whenever ${\displaystyle n\geq k+1}$. In that case,

{\displaystyle {\begin{aligned}{\frac {n^{k}}{\sum _{l=0}^{n}{\binom {n}{l}}x^{l}}}&\leq {\frac {n^{k}}{{\binom {n}{k+1}}x^{k+1}}}\\[0.5em]&={\frac {n^{k}}{{\frac {n\cdot (n-1)\cdot (n-2)\dots (n-k)}{(k+1)!}}x^{k+1}}}\\[0.5em]&={\frac {(k+1)!}{x^{k+1}}}\cdot {\frac {\overbrace {n\cdot n\cdot n\dots n} ^{k{\text{ factors}}}}{\underbrace {n\cdot (n-1)\cdot (n-2)\dots (n-k)} _{(k+1){\text{ factors}}}}}\\[0.5em]&={\frac {(k+1)!}{x^{k+1}}}\cdot {\frac {\overbrace {n\cdot n\cdot n\dots n} ^{k{\text{ factors}}}}{\underbrace {(n-1)\cdot (n-2)\dots (n-k)} _{k{\text{ factors}}}}}\cdot {\frac {1}{n}}\end{aligned}}}

If we can now estimate the expression ${\displaystyle {\tfrac {n\cdot n\cdot n\dots n}{(n-1)\cdot (n-2)\dots (n-k)}}}$ from above by something independent of ${\displaystyle n}$, we have finally made it! The pre-factor ${\displaystyle {\tfrac {(k+1)!}{x^{k+1}}}}$ remains constant in the limit process ${\displaystyle n\to \infty }$ and does not prevent convergence to 0. However, we will additionally need to require ${\displaystyle n\geq 2k\iff {\tfrac {n}{2}}\geq k}$. Hence,

{\displaystyle {\begin{aligned}n-1&={\frac {n}{2}}+\underbrace {{\frac {n}{2}}-1} _{\geq 0}\geq {\frac {n}{2}},\\[0.3em]n-2&={\frac {n}{2}}+\underbrace {{\frac {n}{2}}-2} _{\geq 0}\geq {\frac {n}{2}},\\[0.3em]\vdots &\\[0.3em]n-k&={\frac {n}{2}}+\underbrace {{\frac {n}{2}}-k} _{\geq 0}\geq {\frac {n}{2}}\end{aligned}}}

This in turn implies

{\displaystyle {\begin{aligned}{\frac {n\cdot n\cdot n\dots n}{(n-1)\cdot (n-2)\dots (n-k)}}&\leq {\frac {\overbrace {n\cdot n\cdot n\dots n} ^{k{\text{ times}}}}{\underbrace {({\frac {n}{2}})\cdot ({\frac {n}{2}})\dots ({\frac {n}{2}})} _{k{\text{ times}}}}}\\[0.3em]&={\frac {n^{k}}{\frac {n^{k}}{2^{k}}}}\\[0.3em]&=2^{k}\end{aligned}}}

So all together,

{\displaystyle {\begin{aligned}{\frac {n^{k}}{\sum _{l=0}^{n}{\binom {n}{l}}x^{l}}}&\leq {\frac {n^{k}}{{\binom {n}{k+1}}x^{k+1}}}\\[0.5em]&\leq {\frac {(k+1)!}{x^{k+1}}}\cdot {\frac {n^{k}}{(n-1)\cdot (n-2)\dots (n-k)}}\cdot {\frac {1}{n}}\\[0.5em]&\leq {\frac {2^{k}(k+1)!}{x^{k+1}}}\cdot {\frac {1}{n}}\end{aligned}}}

Since the expression ${\displaystyle {\frac {2^{k}(k+1)!}{x^{k+1}}}}$ does not depend on ${\displaystyle n}$ and ${\displaystyle ({\tfrac {1}{n}})}$ is a null sequence, the last expression must also be a null sequence.

For a mathematically exact proof, we need to find, for every ${\displaystyle \epsilon >0}$, some ${\displaystyle N\in \mathbb {N} }$ such that for all ${\displaystyle n\geq N}$:

${\displaystyle \left|{\frac {n^{k}}{z^{n}}}-0\right|={\frac {n^{k}}{|z|^{n}}}<\epsilon }$

For the above estimates we need ${\displaystyle n\geq k+1}$ and ${\displaystyle n\geq 2k}$. Since ${\displaystyle 2k\geq k+1}$ for all ${\displaystyle k\in \mathbb {N} }$, we can restrict ourselves to ${\displaystyle n\geq 2k}$. Further,

${\displaystyle {\begin{array}{lrl}&{\frac {2^{k}(k+1)!}{x^{k+1}}}\cdot {\frac {1}{n}}&<\epsilon \\[1em]\iff \ &{\frac {2^{k}(k+1)!}{x^{k+1}}}&<\epsilon n\\[1em]\iff \ &{\frac {2^{k}(k+1)!}{x^{k+1}}}\cdot {\frac {1}{\epsilon }}&<n\end{array}}}$

So we need to satisfy a second condition ${\displaystyle n>{\frac {2^{k}(k+1)!}{x^{k+1}}}\cdot {\frac {1}{\epsilon }}}$. Both conditions are satisfied by choosing ${\displaystyle N>\max \left\{2k,{\frac {2^{k}(k+1)!}{x^{k+1}}}\cdot {\frac {1}{\epsilon }}\right\}}$. Then, we get the desired result

${\displaystyle \left|{\frac {n^{k}}{z^{n}}}\right|\leq {\frac {2^{k}(k+1)!}{x^{k+1}}}\cdot {\frac {1}{n}}<\epsilon }$

So, let us now write this down as a concise proof.

Proof

Let ${\displaystyle z\in \mathbb {R} }$ with ${\displaystyle |z|>1}$ be arbitrary. Set ${\displaystyle x=|z|-1}$. Then, ${\displaystyle |z|=1+x}$ with ${\displaystyle x>0}$.

Let ${\displaystyle \epsilon >0}$ be arbitrary. Choose a natural number ${\displaystyle N}$ with ${\displaystyle N>\max \left\{2k,{\frac {2^{k}(k+1)!}{x^{k+1}}}\cdot {\frac {1}{\epsilon }}\right\}}$. Let ${\displaystyle n\geq N}$ be arbitrary. Then,

{\displaystyle {\begin{aligned}\left|{\frac {n^{k}}{z^{n}}}\right|&={\frac {\left|n^{k}\right|}{\left|z^{n}\right|}}\\[0.3em]&={\frac {n^{k}}{|z|^{n}}}\\[0.3em]&={\frac {n^{k}}{(1+x)^{n}}}\\[0.3em]&\quad {\color {Gray}\left\downarrow \ {\text{binomial theorem}}\right.}\\[0.3em]&={\frac {n^{k}}{\sum _{l=0}^{n}{\binom {n}{l}}x^{l}}}\\[0.3em]&\leq {\frac {n^{k}}{{\binom {n}{k+1}}x^{k+1}}}\\[0.3em]&={\frac {(k+1)!\cdot \overbrace {n\cdot n\cdot n\dots n} ^{k{\text{ times}}}}{n\cdot (n-1)\cdot (n-2)\dots (n-k)\cdot x^{k+1}}}\\[0.3em]&={\frac {(k+1)!\cdot \overbrace {n\cdot n\cdot n\dots n} ^{k{\text{ times}}}}{\underbrace {(n-1)\cdot (n-2)\dots (n-k)} _{k{\text{ times}}}\cdot x^{k+1}}}\cdot {\frac {1}{n}}\\[0.3em]&\quad {\color {Gray}\left\downarrow \ n>2k\ \Longrightarrow \ n-1\geq {\frac {n}{2}},\ \ldots \ ,n-k\geq {\frac {n}{2}}\right.}\\[0.3em]&={\frac {(k+1)!\cdot \overbrace {n\cdot n\cdot n\dots n} ^{k{\text{ times}}}}{\underbrace {{\frac {n}{2}}\cdot {\frac {n}{2}}\dots {\frac {n}{2}}} _{k{\text{ times}}}\cdot x^{k+1}}}\cdot {\frac {1}{n}}\\[0.3em]&={\frac {2^{k}(k+1)!}{x^{k+1}}}\cdot {\frac {1}{n}}\\[0.3em]&\quad {\color {Gray}\left\downarrow \ n\geq N>{\frac {2^{k}(k+1)!}{x^{k+1}}}{\frac {1}{\epsilon }}\right.}\\[0.3em]&<{\frac {2^{k}(k+1)!}{x^{k+1}}}\cdot {\frac {1}{{\frac {2^{k}(k+1)!}{x^{k+1}}}{\frac {1}{\epsilon }}}}\\[0.3em]&={\frac {1}{\frac {1}{\epsilon }}}\\&=\epsilon \end{aligned}}}
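The proof's (very generous) threshold can also be checked numerically; `ratio_threshold` below is a name of our own invention, the sample values ${\displaystyle k=2}$, ${\displaystyle z=1.5}$, ${\displaystyle \epsilon =0.01}$ are arbitrary, and the final inequality is verified in log space because ${\displaystyle |z|^{n}}$ overflows a float for such large ${\displaystyle n}$:

```python
import math

def ratio_threshold(k: int, z: float, eps: float) -> int:
    """N > max(2k, 2^k (k+1)! / (x^(k+1) eps)) with x = |z| - 1, as in the proof."""
    x = abs(z) - 1.0
    bound = max(2 * k, 2**k * math.factorial(k + 1) / (x ** (k + 1) * eps))
    return math.floor(bound) + 1

k, z, eps = 2, 1.5, 0.01
N = ratio_threshold(k, z, eps)
# check n^k / |z|^n < eps in log space, to avoid overflowing |z|**n
assert all(k * math.log(n) - n * math.log(abs(z)) < math.log(eps)
           for n in range(N, N + 50))
```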

Hint

Sometimes, you may encounter sequences of the form ${\displaystyle (n^{k}q^{n})_{n\in \mathbb {N} }}$ with ${\displaystyle k\in \mathbb {N} }$ and ${\displaystyle q\in \mathbb {R} }$, ${\displaystyle |q|<1}$. We can reduce the limit ${\displaystyle \lim _{n\to \infty }n^{k}q^{n}=0}$ to the above case by setting ${\displaystyle z={\tfrac {1}{q}}}$. Obviously ${\displaystyle |z|={\tfrac {1}{|q|}}>1}$, so

${\displaystyle \lim _{n\to \infty }n^{k}q^{n}{\overset {z={\tfrac {1}{q}}}{=}}\lim _{n\to \infty }{\tfrac {n^{k}}{z^{n}}}=0}$

## Ratio - geometric sequence / factorial

In simple words: "factorial beats exponential":

Theorem

Let ${\displaystyle z}$ be a real number with ${\displaystyle |z|>1}$. Then, ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {z^{n}}{n!}}=0}$.

How to get to the proof?

We need to show that ${\displaystyle \left|{\tfrac {z^{n}}{n!}}\right|={\tfrac {|z|^{n}}{n!}}={\tfrac {|z|\cdot |z|\cdot \ldots \cdot |z|}{1\cdot 2\cdot \ldots \cdot n}}}$ approaches 0 as ${\displaystyle n}$ goes to infinity. This will be done by bounding it from above by a simpler expression which we already know converges to 0.

Let's take a closer look at the ratio we have to bound: the numerator and denominator consist of an equal number of factors (${\displaystyle n}$ of them). But those in the denominator gradually increase, whereas those in the numerator stay constant at ${\displaystyle |z|}$. So from some ${\displaystyle k>|z|\Leftrightarrow {\tfrac {|z|}{k}}<1}$ on, we expect the denominator to "overtake" the numerator and finally "win against it". For the actual bounding, we split each sequence element into two parts: the part where ${\displaystyle k}$ has not yet overtaken ${\displaystyle |z|}$ contains only a bounded number of factors. By contrast, the part with ${\displaystyle k>|z|}$ continues to grow with increasing ${\displaystyle n}$. We choose ${\displaystyle M\geq |z|}$ as the threshold after which "${\displaystyle k}$ has overtaken ${\displaystyle |z|}$" and consider ${\displaystyle n\geq M+1}$. Then

{\displaystyle {\begin{aligned}{\frac {|z|^{n}}{n!}}&={\frac {|z|\cdot |z|\cdot \ldots \cdot |z|}{1\cdot 2\cdot \ldots \cdot n}}\\[0.5em]&=\underbrace {\frac {|z|\cdot |z|\cdot \ldots \cdot |z|}{1\cdot 2\cdot \ldots \cdot M}} _{={\tfrac {|z|^{M}}{M!}}}\cdot \overbrace {\frac {|z|\cdot |z|\cdot \ldots \cdot |z|}{(M+1)\cdot (M+2)\cdot \ldots \cdot n}} ^{n-M{\text{ factors, each}}}\\&{\color {Gray}\left\downarrow \ {\tfrac {|z|}{M+1}},{\tfrac {|z|}{M+2}},\ldots ,{\tfrac {|z|}{n}}\leq {\tfrac {|z|}{M+1}}\right.}\\[0.5em]&\leq {\frac {|z|^{M}}{M!}}\cdot {\frac {|z|^{n-M}}{(M+1)^{n-M}}}\\&={\frac {|z|^{M}}{M!}}\cdot {\frac {|z|^{n}}{(M+1)^{n}}}\cdot {\frac {(M+1)^{M}}{|z|^{M}}}\\&=\underbrace {\frac {(M+1)^{M}}{M!}} _{{\text{constant w.r. to }}n}\cdot \left({\frac {|z|}{M+1}}\right)^{n}\end{aligned}}}

Since ${\displaystyle {\tfrac {|z|}{M+1}}<1}$, the sequence ${\displaystyle \left({\tfrac {|z|}{M+1}}\right)^{n}}$ is a geometric one and goes to 0. The factor ${\displaystyle {\tfrac {(M+1)^{M}}{M!}}}$ is constant with respect to ${\displaystyle n}$ and does not destroy the convergence to 0. Therefore, we should be able to show ${\displaystyle \lim _{n\to \infty }{\tfrac {z^{n}}{n!}}=0}$:

${\displaystyle {\frac {|z|^{n}}{n!}}\leq \underbrace {\underbrace {\frac {(M+1)^{M}}{M!}} _{{\text{constant w.r. to }}n}\cdot \underbrace {\left({\frac {|z|}{M+1}}\right)^{n}} _{\to 0}} _{\to 0}}$

The proof can be done explicitly, using the epsilon-definition. Let ${\displaystyle \epsilon >0}$ be given. We need to find an ${\displaystyle N\in \mathbb {N} }$, such that for all ${\displaystyle n\geq N}$ the inequality ${\displaystyle {\tfrac {|z|^{n}}{n!}}<\epsilon }$ holds. The elements ${\displaystyle {\tfrac {|z|^{n}}{n!}}}$ have already been bounded from above (see the inequality above). Will the right-hand side of this bound get smaller than ${\displaystyle \epsilon }$ for certain ${\displaystyle n\geq N}$? We have

{\displaystyle {\begin{aligned}&&{\frac {(M+1)^{M}}{M!}}\cdot \left({\frac {|z|}{M+1}}\right)^{n}&<\epsilon \\[0.5em]&\iff &\left({\frac {|z|}{M+1}}\right)^{n}&<\epsilon \cdot {\frac {M!}{(M+1)^{M}}}\end{aligned}}}

The right-hand side does not depend on ${\displaystyle n}$. Since ${\displaystyle {\tfrac {|z|}{M+1}}<1}$, the geometric sequence ${\displaystyle \left(\left({\tfrac {|z|}{M+1}}\right)^{n}\right)_{n\in \mathbb {N} }}$ is a null sequence, so there is an ${\displaystyle L\in \mathbb {N} }$, such that

${\displaystyle \left({\frac {|z|}{M+1}}\right)^{L}<\epsilon \cdot {\frac {M!}{(M+1)^{M}}}}$

And for all greater ${\displaystyle n\geq L}$, there is also

${\displaystyle \left({\frac {|z|}{M+1}}\right)^{n}\leq \left({\frac {|z|}{M+1}}\right)^{L}<\epsilon \cdot {\frac {M!}{(M+1)^{M}}}}$

Now, we have all the parts together for constructing the proof. Throughout our considerations, we imposed the conditions ${\displaystyle n\geq L}$ and ${\displaystyle n\geq M+1}$. Both are satisfied by setting ${\displaystyle N=\max\{M+1,L\}}$.

Proof

Let ${\displaystyle \epsilon >0}$. By the Archimedean axiom there is an ${\displaystyle M\in \mathbb {N} }$ with ${\displaystyle M\geq |z|}$. Now, for all ${\displaystyle n\geq M+1}$:

{\displaystyle {\begin{aligned}{\frac {|z|^{n}}{n!}}&={\frac {|z|\cdot |z|\cdot \ldots \cdot |z|}{1\cdot 2\cdot \ldots \cdot n}}\\[0.5em]&={\frac {|z|^{M}}{M!}}\cdot {\frac {|z|^{n-M}}{(M+1)\cdot (M+2)\cdot \ldots \cdot n}}\\[0.5em]&{\color {Gray}\left\downarrow \ {\tfrac {|z|}{M+1}},{\tfrac {|z|}{M+2}},\ldots ,{\tfrac {|z|}{n}}\leq {\tfrac {|z|}{M+1}}\right.}\\[0.5em]&\leq {\frac {|z|^{M}}{M!}}\cdot {\frac {|z|^{n-M}}{(M+1)^{n-M}}}\\&=\underbrace {\frac {(M+1)^{M}}{M!}} _{{\text{constant w.r. to }}n}\cdot \left({\frac {|z|}{M+1}}\right)^{n}\end{aligned}}}

Since ${\displaystyle {\tfrac {|z|}{M+1}}<1}$ there is an ${\displaystyle L\in \mathbb {N} }$ such that ${\displaystyle \left({\tfrac {|z|}{M+1}}\right)^{L}<\epsilon \cdot {\tfrac {M!}{(M+1)^{M}}}}$. For all ${\displaystyle n\geq N=\max\{M+1,L\}}$ we then have

{\displaystyle {\begin{aligned}\left|{\frac {z^{n}}{n!}}\right|&={\frac {|z|^{n}}{n!}}\\[0.5em]&\leq {\frac {|z|^{M}}{M!}}\cdot {\frac {|z|^{n-M}}{(M+1)^{n-M}}}\\[0.5em]&={\frac {(M+1)^{M}}{M!}}\cdot \left({\frac {|z|}{M+1}}\right)^{n}\\[0.5em]&\leq {\frac {(M+1)^{M}}{M!}}\cdot \left({\frac {|z|}{M+1}}\right)^{L}\\[0.5em]&<{\frac {(M+1)^{M}}{M!}}\cdot \epsilon \cdot {\frac {M!}{(M+1)^{M}}}\\[0.5em]&=\epsilon \end{aligned}}}
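The threshold ${\displaystyle N=\max\{M+1,L\}}$ can be computed concretely; `factorial_ratio_threshold` below is a name of our own invention, ${\displaystyle L}$ is found by simple search, and the sample values ${\displaystyle z=3}$, ${\displaystyle \epsilon =10^{-6}}$ are arbitrary:

```python
import math

def factorial_ratio_threshold(z: float, eps: float) -> int:
    """N = max(M+1, L) as in the proof: M >= |z|, and L found by search."""
    M = math.ceil(abs(z))
    target = eps * math.factorial(M) / (M + 1) ** M
    ratio, L = abs(z) / (M + 1), 1
    while ratio**L >= target:
        L += 1
    return max(M + 1, L)

z, eps = 3.0, 1e-6
N = factorial_ratio_threshold(z, eps)
# then |z|^n / n! < eps for every n >= N
assert all(abs(z) ** n / math.factorial(n) < eps for n in range(N, N + 30))
```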

## Ratio - factorial / ${\displaystyle n^{n}}$

In simple words: "${\displaystyle n^{n}}$ beats factorial":

Theorem

We have ${\displaystyle \lim _{n\rightarrow \infty }{\tfrac {n!}{n^{n}}}=0}$.

Proof

Let ${\displaystyle \epsilon >0}$. Choose ${\displaystyle N\in \mathbb {N} }$ such that ${\displaystyle N\geq {\tfrac {1}{\epsilon }}+1}$. For all ${\displaystyle n\geq N}$ we have

{\displaystyle {\begin{aligned}\left|{\frac {n!}{n^{n}}}-0\right|&={\frac {n!}{n^{n}}}={\frac {1}{n}}\cdot {\frac {2\cdot 3\cdot \ldots \cdot n}{n\cdot n\cdot \ldots \cdot n}}\\[0.3em]&={\frac {1}{n}}\cdot \prod _{k=2}^{n}{\frac {k}{n}}\\[0.3em]&{\color {OliveGreen}\left\downarrow \ {\frac {k}{n}}\leq 1{\text{ for all }}k\in \{2,\ldots ,n\}{\text{ and hence }}\prod _{k=2}^{n}{\frac {k}{n}}\leq 1\right.}\\[0.3em]&\leq {\frac {1}{n}}\leq {\frac {1}{{\tfrac {1}{\epsilon }}+1}}<\epsilon \end{aligned}}}
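The proof's chain ${\displaystyle {\tfrac {n!}{n^{n}}}\leq {\tfrac {1}{n}}<\epsilon }$ can be checked numerically; the value ${\displaystyle \epsilon =0.01}$ below is an arbitrary sample choice:

```python
import math

eps = 0.01
N = math.floor(1 / eps) + 1  # the proof's choice, N >= 1/eps + 1
for n in range(N, N + 50):
    # the proof's chain of estimates: n!/n^n <= 1/n < eps
    assert math.factorial(n) / n**n <= 1 / n < eps
```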