# Exercises: Derivatives 2 – Serlo


## Derivative of the inverse function

Exercise (Derivative of the inverse function)

Consider the function

${\displaystyle f:\mathbb {R} \to \mathbb {R} ,\ f(x)=e^{x}+x}$

Justify that the following derivatives and limits exist and calculate them:

1. ${\displaystyle f'(0)}$ and ${\displaystyle (f^{-1})'(1)}$
2. ${\displaystyle \lim _{y\to -\infty }f^{-1}(y)}$ and ${\displaystyle \lim _{y\to -\infty }(f^{-1})'(y)}$

Solution (Derivative of the inverse function)

Part 1:

Proof step: Existence and computation of ${\displaystyle f'(0)}$

${\displaystyle f}$ is differentiable on ${\displaystyle \mathbb {R} }$ as a sum of the differentiable functions ${\displaystyle \exp }$ and ${\displaystyle {\text{id}}}$ with derivative

${\displaystyle f'(x)=e^{x}+1}$

So ${\displaystyle f'(0)=e^{0}+1=2}$.

Proof step: Existence and computation of ${\displaystyle (f^{-1})'(1)}$

As shown above, ${\displaystyle f}$ is differentiable on ${\displaystyle \mathbb {R} }$ with ${\displaystyle f'(x)=e^{x}+1}$.

Further, ${\displaystyle f'(x)>0}$ for all ${\displaystyle x\in \mathbb {R} }$, since ${\displaystyle e^{x}>0}$. By the monotonicity criterion, ${\displaystyle f}$ is therefore strictly monotonically increasing and hence injective, so ${\displaystyle f:\mathbb {R} \to f(\mathbb {R} )}$ is bijective. Moreover, ${\displaystyle f(0)=e^{0}+0=1}$, so ${\displaystyle 1\in f(\mathbb {R} )}$, and ${\displaystyle f'(0)=2\neq 0}$. Hence, by the theorem on the derivative of the inverse function, ${\displaystyle f^{-1}}$ is differentiable at ${\displaystyle 1}$ with

${\displaystyle (f^{-1})'(1)={\frac {1}{f'(0)}}={\frac {1}{2}}}$
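
Both values can be checked numerically. The following sketch (Python standard library only; the bisection-based `f_inv` is our own helper, not part of the exercise) approximates ${\displaystyle f'(0)}$ and ${\displaystyle (f^{-1})'(1)}$ by central difference quotients:

```python
import math

def f(x):
    return math.exp(x) + x

def f_inv(y, lo=-50.0, hi=50.0):
    # f is continuous and strictly increasing, so bisection
    # converges to the unique preimage of y
    while hi - lo > 1e-13:
        mid = (lo + hi) / 2
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

h = 1e-6
d_f = (f(h) - f(-h)) / (2 * h)                    # central difference for f'(0), close to 2
d_finv = (f_inv(1 + h) - f_inv(1 - h)) / (2 * h)  # central difference for (f^-1)'(1), close to 1/2
```

The bisection exploits exactly the strict monotonicity established in the solution; no closed form for ${\displaystyle f^{-1}}$ is needed.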

Part 2:

Proof step: Computation of ${\displaystyle \lim _{y\to -\infty }f^{-1}(y)}$

We have

${\displaystyle \lim _{x\to -\infty }f(x)=\lim _{x\to -\infty }\underbrace {e^{x}} _{\to 0}+\underbrace {x} _{\to -\infty }=-\infty }$

Since ${\displaystyle f^{-1}}$ is strictly monotonically increasing (as the inverse of a strictly monotonically increasing function), it follows that

${\displaystyle \lim _{y\to -\infty }f^{-1}(y)=-\infty }$

Proof step: Computation of ${\displaystyle \lim _{y\to -\infty }(f^{-1})'(y)}$

We have ${\displaystyle \lim _{x\to \infty }f(x)=\infty }$ and, from the previous step, ${\displaystyle \lim _{x\to -\infty }f(x)=-\infty }$. Since ${\displaystyle f}$ is continuous, the intermediate value theorem yields ${\displaystyle f(\mathbb {R} )=\mathbb {R} }$, and together with the strict monotonicity, ${\displaystyle f:\mathbb {R} \to \mathbb {R} }$ is bijective. For all ${\displaystyle y\in \mathbb {R} }$ we thus have

${\displaystyle (f^{-1})'(y)={\frac {1}{f'(f^{-1}(y))}}={\frac {1}{\exp(f^{-1}(y))+1}}}$

So

${\displaystyle \lim _{y\to -\infty }(f^{-1})'(y)=\lim _{y\to -\infty }{\frac {1}{\exp(f^{-1}(y))+1}}={\frac {1}{0+1}}=1,}$

since ${\displaystyle \lim _{y\to -\infty }f^{-1}(y)=-\infty }$ and ${\displaystyle \lim _{u\to -\infty }\exp(u)=0}$.
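
The limit can be observed numerically. In the sketch below (our own stdlib helper; for very negative ${\displaystyle y}$ we have ${\displaystyle f(x)\approx x}$, so the preimage lies near ${\displaystyle y}$), the values ${\displaystyle (f^{-1})'(y)=1/(\exp(f^{-1}(y))+1)}$ approach ${\displaystyle 1}$ from below:

```python
import math

def f(x):
    return math.exp(x) + x

def f_inv(y):
    lo, hi = y - 1.0, y + 1.0  # valid bracket for y << 0, since f(x) ~ x there
    while hi - lo > 1e-12:
        mid = (lo + hi) / 2
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# (f^-1)'(y) = 1/(exp(f^-1(y)) + 1), evaluated at increasingly negative y
slopes = [1 / (math.exp(f_inv(y)) + 1) for y in (-5.0, -10.0, -30.0)]
```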

### Derivative of a general logarithm function

Exercise (Derivative of a general logarithm function)

Show that the function

${\displaystyle f:\mathbb {R} \to \mathbb {R} ^{+},\ f(x)=a^{x}=\exp(x\ln a)}$

with base ${\displaystyle a\in \mathbb {R} ^{+}\setminus \{1\}}$ is bijective and differentiable on ${\displaystyle \mathbb {R} }$. Show further that the inverse function

${\displaystyle f^{-1}=\log _{a}:\mathbb {R} ^{+}\to \mathbb {R} ,\ \log _{a}(y)={\frac {\ln y}{\ln a}}}$

is differentiable on all of ${\displaystyle \mathbb {R} ^{+}}$ with derivative ${\displaystyle (\log _{a})'(y)={\frac {1}{y\ln a}}}$.

Solution (Derivative of a general logarithm function)

Proof step: ${\displaystyle f}$ is bijective and differentiable

Case 1: ${\displaystyle a>1}$

${\displaystyle f}$ is continuous on ${\displaystyle \mathbb {R} }$ as a composition of continuous functions. Further, since ${\displaystyle \ln a>0}$, ${\displaystyle \lim _{x\to \infty }\exp(x)=\infty }$ and ${\displaystyle \lim _{x\to -\infty }\exp(x)=0}$, we have:

${\displaystyle \lim _{x\to \infty }a^{x}=\lim _{x\to \infty }\exp(x\ln a)=\infty }$

as well as

${\displaystyle \lim _{x\to -\infty }a^{x}=\lim _{x\to -\infty }\exp(x\ln a)=0}$

By the intermediate value theorem, ${\displaystyle f(\mathbb {R} )=\mathbb {R} ^{+}}$ and ${\displaystyle f}$ is therefore surjective. Moreover, ${\displaystyle f}$ is differentiable according to the chain rule as a composition of differentiable functions, and

${\displaystyle f'(x)=\underbrace {\ln a} _{>0}\cdot \underbrace {\exp(x\ln a)} _{>0}>0}$

for all ${\displaystyle x\in \mathbb {R} }$. By the monotonicity criterion, ${\displaystyle f}$ is strictly monotonically increasing and hence injective. So we have shown the bijectivity of ${\displaystyle f}$.

Case 2: ${\displaystyle 0<a<1}$

${\displaystyle f}$ is continuous on ${\displaystyle \mathbb {R} }$ as a composition of continuous functions. Further, since ${\displaystyle \ln a<0}$, ${\displaystyle \lim _{x\to \infty }\exp(x)=\infty }$ and ${\displaystyle \lim _{x\to -\infty }\exp(x)=0}$, we have:

${\displaystyle \lim _{x\to \infty }a^{x}=\lim _{x\to \infty }\exp(x\ln a)=0}$

as well as

${\displaystyle \lim _{x\to -\infty }a^{x}=\lim _{x\to -\infty }\exp(x\ln a)=\infty }$

By the intermediate value theorem, ${\displaystyle f(\mathbb {R} )=\mathbb {R} ^{+}}$ and ${\displaystyle f}$ is therefore surjective. Moreover, ${\displaystyle f}$ is differentiable according to the chain rule as a composition of differentiable functions, and

${\displaystyle f'(x)=\underbrace {\ln a} _{<0}\cdot \underbrace {\exp(x\ln a)} _{>0}<0}$

for all ${\displaystyle x\in \mathbb {R} }$. By the monotonicity criterion, ${\displaystyle f}$ is strictly monotonically decreasing and hence injective. So we have shown the bijectivity of ${\displaystyle f}$.

Proof step: ${\displaystyle f^{-1}}$ exists and is differentiable

${\displaystyle f}$ is bijective and hence invertible. Further, for ${\displaystyle y\in \mathbb {R} ^{+}}$ we have

${\displaystyle y=\exp(x\ln a)\iff \ln y=x\ln a\iff x={\frac {\ln y}{\ln a}}}$

So

${\displaystyle f^{-1}:\mathbb {R} ^{+}\to \mathbb {R} ,f^{-1}(y)=\log _{a}(y)={\frac {\ln y}{\ln a}}}$

Since in addition ${\displaystyle f'(x)=\ln a\cdot \exp(x\ln a)\neq 0}$ for all ${\displaystyle x\in \mathbb {R} }$, the theorem on the derivative of the inverse function yields that ${\displaystyle f^{-1}}$ is differentiable on ${\displaystyle \mathbb {R} ^{+}}$ with

${\displaystyle (f^{-1})'(y)={\frac {1}{f'(f^{-1}(y))}}={\frac {1}{f'({\tfrac {\ln y}{\ln a}})}}={\frac {1}{\ln a\cdot \exp({\tfrac {\ln y}{\ln a}}\ln a)}}={\frac {1}{\ln a\cdot \exp(\ln y)}}={\frac {1}{\ln a\cdot y}}}$
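
The derivative formula ${\displaystyle (\log _{a})'(y)={\tfrac {1}{y\ln a}}}$ can be spot-checked numerically. A minimal stdlib sketch (the bases and sample points are our own arbitrary choices), comparing central difference quotients of `math.log(y, a)` with the closed form:

```python
import math

def num_deriv(fn, y, h=1e-6):
    # symmetric (central) difference quotient
    return (fn(y + h) - fn(y - h)) / (2 * h)

# check 1/(y ln a) for bases a > 1 and 0 < a < 1 alike
for a in (2.0, 10.0, 0.5):
    for y in (0.5, 1.0, 3.0):
        exact = 1 / (y * math.log(a))
        assert abs(num_deriv(lambda t: math.log(t, a), y) - exact) < 1e-5
```

Note that the check covers both cases of the proof, ${\displaystyle a>1}$ and ${\displaystyle 0<a<1}$.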

Exercise (derivatives of ${\displaystyle \operatorname {arccot} }$ and the area functions)

Show that the functions

• ${\displaystyle \operatorname {arccot} :\mathbb {R} \to (0,\pi )}$
• ${\displaystyle {\text{arsinh}}:\mathbb {R} \to \mathbb {R} }$
• ${\displaystyle {\text{arcosh}}:(1,\infty )\to (0,\infty )}$
• ${\displaystyle {\text{artanh}}:(-1,1)\to \mathbb {R} }$

are differentiable, and determine their derivatives.

Proof (derivatives of ${\displaystyle \operatorname {arccot} }$ and the area functions)

Differentiability of ${\displaystyle \operatorname {arccot} }$:

For the restricted cotangent function ${\displaystyle \cot |_{(0,\pi )}=\left.{\tfrac {\cos }{\sin }}\right|_{(0,\pi )}}$ we have ${\displaystyle \cot '=-1-\cot ^{2}<0}$. Thus the function is differentiable and strictly monotonically decreasing, and hence injective. Further, ${\displaystyle \cot((0,\pi ))=\mathbb {R} }$. Thus ${\displaystyle \cot :(0,\pi )\to \mathbb {R} }$ is bijective. The inverse function

${\displaystyle \operatorname {arccot} :\mathbb {R} \to (0,\pi )}$

is differentiable according to the theorem on the derivative of the inverse function, and for ${\displaystyle {\tilde {x}}\in \mathbb {R} }$ we have:

${\displaystyle \operatorname {arccot} '({\tilde {x}})={\frac {1}{\cot '(\operatorname {arccot}({\tilde {x}}))}}={\frac {1}{-1-\cot ^{2}(\operatorname {arccot}({\tilde {x}}))}}={\frac {1}{-1-{\tilde {x}}^{2}}}=-{\frac {1}{1+{\tilde {x}}^{2}}}}$
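
Python's `math` module has no `arccot`, but the branch with range ${\displaystyle (0,\pi )}$ can be expressed as ${\displaystyle \operatorname {arccot}(x)={\tfrac {\pi }{2}}-\arctan(x)}$ (this representation is our own assumption, consistent with the chosen range). A small numerical check of the derivative:

```python
import math

def arccot(x):
    # inverse of cot restricted to (0, pi): arccot(x) = pi/2 - arctan(x)
    return math.pi / 2 - math.atan(x)

h = 1e-6
for x in (-2.0, 0.0, 1.5):
    approx = (arccot(x + h) - arccot(x - h)) / (2 * h)
    assert abs(approx - (-1 / (1 + x * x))) < 1e-6  # matches -1/(1+x^2)
```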

Differentiability of ${\displaystyle {\text{arsinh}}}$:

The hyperbolic sine function ${\displaystyle \sinh :\mathbb {R} \to \mathbb {R} }$ is differentiable with ${\displaystyle \sinh '=\cosh >0}$. Thus it is strictly monotonically increasing and hence injective. Further, ${\displaystyle \sinh(\mathbb {R} )=\mathbb {R} }$, so it is also surjective. The inverse function

${\displaystyle {\text{arsinh}}:\mathbb {R} \to \mathbb {R} }$

is differentiable by the theorem on the derivative of the inverse function, and for ${\displaystyle {\tilde {x}}\in \mathbb {R} }$ we have:

${\displaystyle {\text{arsinh}}'({\tilde {x}})={\frac {1}{\sinh '({\text{arsinh}}({\tilde {x}}))}}={\frac {1}{\cosh({\text{arsinh}}({\tilde {x}}))}}={\frac {1}{\sqrt {\sinh ^{2}({\text{arsinh}}({\tilde {x}}))+1}}}={\frac {1}{\sqrt {{\tilde {x}}^{2}+1}}}}$

Differentiability of ${\displaystyle {\text{arcosh}}}$:

The hyperbolic cosine function ${\displaystyle \cosh :(0,\infty )\to (1,\infty )}$ is differentiable with ${\displaystyle \cosh '=\sinh >0}$ on ${\displaystyle (0,\infty )}$. Thus it is strictly monotonically increasing, and hence injective. Further, ${\displaystyle \cosh((0,\infty ))=(1,\infty )}$. So it is also surjective. The inverse function

${\displaystyle {\text{arcosh}}:(1,\infty )\to (0,\infty )}$

is differentiable by the theorem on the derivative of the inverse function, and for ${\displaystyle {\tilde {x}}\in (1,\infty )}$ we have:

${\displaystyle {\text{arcosh}}'({\tilde {x}})={\frac {1}{\cosh '({\text{arcosh}}({\tilde {x}}))}}={\frac {1}{\sinh({\text{arcosh}}({\tilde {x}}))}}={\frac {1}{\sqrt {\cosh ^{2}({\text{arcosh}}({\tilde {x}}))-1}}}={\frac {1}{\sqrt {{\tilde {x}}^{2}-1}}}}$

Differentiability of ${\displaystyle {\text{artanh}}}$:

For the hyperbolic tangent function ${\displaystyle \tanh :\mathbb {R} \to (-1,1)}$ we have ${\displaystyle \tanh '=1-\tanh ^{2}>0}$. Thus the function is differentiable and strictly monotonically increasing, and hence injective. Further, ${\displaystyle \tanh(\mathbb {R} )=(-1,1)}$. Thus ${\displaystyle \tanh :\mathbb {R} \to (-1,1)}$ is bijective. The inverse function

${\displaystyle {\text{artanh}}:(-1,1)\to \mathbb {R} }$

is differentiable by the theorem on the derivative of the inverse function, and for ${\displaystyle {\tilde {x}}\in (-1,1)}$ we have:

${\displaystyle {\text{artanh}}'({\tilde {x}})={\frac {1}{\tanh '({\text{artanh}}({\tilde {x}}))}}={\frac {1}{1-\tanh ^{2}({\text{artanh}}({\tilde {x}}))}}={\frac {1}{1-{\tilde {x}}^{2}}}}$
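
All three area-function derivatives can be verified at once with Python's built-in `math.asinh`, `math.acosh`, and `math.atanh` (the sample points are our own choices inside each domain):

```python
import math

h = 1e-6
cases = [
    # (function, closed-form derivative, sample points in the domain)
    (math.asinh, lambda x: 1 / math.sqrt(x * x + 1), (-2.0, 0.0, 3.0)),
    (math.acosh, lambda x: 1 / math.sqrt(x * x - 1), (1.5, 2.0, 10.0)),
    (math.atanh, lambda x: 1 / (1 - x * x), (-0.5, 0.0, 0.9)),
]
for fn, exact, points in cases:
    for x in points:
        approx = (fn(x + h) - fn(x - h)) / (2 * h)  # central difference
        assert abs(approx - exact(x)) < 1e-4
```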

Exercise (Non-differentiable functions at zero)

Let ${\displaystyle \epsilon >0}$. Show that:

1. Let ${\displaystyle f,g:(-\epsilon ,\epsilon )\to \mathbb {R} }$ with ${\displaystyle f(0)=g(0)=0}$ and ${\displaystyle f(x)g(x)=x}$ for all ${\displaystyle x\in (-\epsilon ,\epsilon )}$. Then ${\displaystyle f}$ and ${\displaystyle g}$ are not simultaneously differentiable at zero.
2. Let ${\displaystyle f,g:(-\epsilon ,\epsilon )\to \mathbb {R} }$, and let ${\displaystyle f}$ be differentiable at zero. Further let ${\displaystyle f(0)=f'(0)=0}$ and ${\displaystyle g(f(x))=x}$ for all ${\displaystyle x\in (-\epsilon ,\epsilon )}$. Then ${\displaystyle g}$ is not differentiable at zero.

Solution (Non-differentiable functions at zero)

Part 1: Assume that ${\displaystyle f}$ and ${\displaystyle g}$ are both differentiable at zero. Then, by the product rule, ${\displaystyle fg}$ is also differentiable at zero, and since ${\displaystyle fg=\mathrm {id} }$ on ${\displaystyle (-\epsilon ,\epsilon )}$, evaluating ${\displaystyle (fg)'(0)=\mathrm {id} '(0)=1}$ with the product rule yields:

${\displaystyle f'(0)\underbrace {g(0)} _{=0}+\underbrace {f(0)} _{=0}g'(0)=1\iff 0=1}$, a contradiction.

So ${\displaystyle f}$ and ${\displaystyle g}$ cannot both be differentiable at zero.

Part 2: Assume that ${\displaystyle g}$ is differentiable at zero. Then, by the chain rule, ${\displaystyle g\circ f}$ is also differentiable at zero, and since ${\displaystyle g\circ f=\mathrm {id} }$ on ${\displaystyle (-\epsilon ,\epsilon )}$, evaluating ${\displaystyle (g\circ f)'(0)=1}$ with the chain rule yields:

${\displaystyle g'(\underbrace {f(0)} _{=0})\cdot \underbrace {f'(0)} _{=0}=1\iff 0=1}$, a contradiction.

So ${\displaystyle g}$ cannot be differentiable at zero.
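
Concrete instances make the obstruction visible. The functions below are our own illustrative choices, not part of the exercise: for part 1, ${\displaystyle f(x)={\sqrt {|x|}}}$ and ${\displaystyle g(x)=\operatorname {sgn}(x){\sqrt {|x|}}}$ satisfy ${\displaystyle f(x)g(x)=x}$; for part 2, ${\displaystyle f(x)=x^{3}}$ and ${\displaystyle g(y)={\sqrt[{3}]{y}}}$ satisfy ${\displaystyle g(f(x))=x}$ with ${\displaystyle f(0)=f'(0)=0}$. In both cases the difference quotients at zero blow up:

```python
import math

def g(y):
    # real cube root, the inverse of f(x) = x^3
    return math.copysign(abs(y) ** (1 / 3), y)

# g(h)/h = h^(-2/3) -> infinity as h -> 0, so g is not differentiable at zero
quotients = [g(h) / h for h in (1e-3, 1e-6, 1e-9)]
```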

## Derivatives of higher order

### Exercise 1

Exercise (Arbitrarily often differentiable function)

Show that the function ${\displaystyle f:\mathbb {R} \to \mathbb {R} ,\ f(x)=xe^{x}}$ is arbitrarily often differentiable and that for all ${\displaystyle n\in \mathbb {N} }$ we have:

${\displaystyle f^{(n)}(x)=(x+n)e^{x}}$

Proof (Arbitrarily often differentiable function)

The proof goes by induction over ${\displaystyle n\in \mathbb {N} }$:

Claim to be proven for all ${\displaystyle n\in \mathbb {N} }$:

${\displaystyle f^{(n)}(x)=(x+n)e^{x}}$

1. Base case (${\displaystyle n=1}$):

${\displaystyle f'(x){\underset {\text{rule}}{\overset {\text{product}}{=}}}1\cdot e^{x}+x\cdot e^{x}=(x+1)e^{x}}$

2. Inductive step:

2a. Inductive hypothesis:

${\displaystyle f^{(n)}(x)=(x+n)e^{x}}$

2b. Claim for ${\displaystyle n+1}$:

${\displaystyle f^{(n+1)}(x)=(x+n+1)e^{x}}$

2c. Proof of the inductive step:

${\displaystyle f^{(n+1)}(x)=(f^{(n)}(x))'{\underset {\text{assumption}}{\overset {\text{induction}}{=}}}\left((x+n)e^{x}\right)'{\underset {\text{rule}}{\overset {\text{product}}{=}}}1\cdot e^{x}+(x+n)e^{x}=(x+n+1)e^{x}}$
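
The induction result can be sanity-checked numerically for small ${\displaystyle n}$. A stdlib sketch (the evaluation point and step size are ad-hoc choices of ours) comparing finite differences of ${\displaystyle f(x)=xe^{x}}$ with ${\displaystyle (x+n)e^{x}}$ for ${\displaystyle n=1,2}$:

```python
import math

def f(x):
    return x * math.exp(x)

x, h = 0.7, 1e-4
d1 = (f(x + h) - f(x - h)) / (2 * h)            # approximates f'(x)
d2 = (f(x + h) - 2 * f(x) + f(x - h)) / h ** 2  # approximates f''(x)
# expected: f'(x) = (x+1)e^x and f''(x) = (x+2)e^x
```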

### Exercise 2

Exercise (Exactly one/two/three times differentiable functions)

Provide an example of a

1. function ${\displaystyle f\in C^{0}(\mathbb {R} )\setminus C^{1}(\mathbb {R} )}$
2. function that is differentiable, but not continuously differentiable on ${\displaystyle \mathbb {R} }$
3. function ${\displaystyle f\in C^{1}(\mathbb {R} )\setminus C^{2}(\mathbb {R} )}$
4. function ${\displaystyle f\in C^{2}(\mathbb {R} )\setminus C^{3}(\mathbb {R} )}$

Solution (Exactly one/two/three times differentiable functions)

Solution sub-exercise 1:

${\displaystyle f(x)=|x|}$ or ${\displaystyle f(x)={\begin{cases}x\sin \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$ or ${\displaystyle f(x)={\begin{cases}x\cos \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$

Solution sub-exercise 2:

${\displaystyle f(x)={\begin{cases}x^{2}\sin \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$ or ${\displaystyle f(x)={\begin{cases}x^{2}\cos \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$

Solution sub-exercise 3:

${\displaystyle f(x)=x\cdot |x|}$ or ${\displaystyle f(x)={\begin{cases}x^{3}\sin \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$ or ${\displaystyle f(x)={\begin{cases}x^{3}\cos \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$

Solution sub-exercise 4:

${\displaystyle f(x)=x^{2}\cdot |x|}$ or ${\displaystyle f(x)={\begin{cases}x^{5}\sin \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$ or ${\displaystyle f(x)={\begin{cases}x^{5}\cos \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$

Hint

Proceeding successively with this construction, for every ${\displaystyle k\in \mathbb {N} }$ we obtain functions ${\displaystyle f\in C^{k}(\mathbb {R} )\setminus C^{k+1}(\mathbb {R} )}$, for instance

${\displaystyle f(x)=x^{k}\cdot |x|}$ or ${\displaystyle f(x)={\begin{cases}x^{2k+1}\sin \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$ or ${\displaystyle f(x)={\begin{cases}x^{2k+1}\cos \left({\frac {1}{x}}\right)&{\text{ for }}x\neq 0,\\0&{\text{ for }}x=0.\end{cases}}}$
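
For instance, ${\displaystyle f(x)=x\cdot |x|}$ from sub-exercise 3 can be probed numerically: its derivative ${\displaystyle f'(x)=2|x|}$ is continuous, but the one-sided difference quotients of ${\displaystyle f'}$ at zero jump between ${\displaystyle -2}$ and ${\displaystyle 2}$. A small stdlib sketch:

```python
def f(x):
    return x * abs(x)    # f is C^1 but not C^2

def fprime(x):
    return 2 * abs(x)    # closed form of f'

h = 1e-7
# the central difference of f matches 2|x| at several points ...
for x in (-1.0, -0.3, 0.0, 0.5):
    assert abs((f(x + h) - f(x - h)) / (2 * h) - fprime(x)) < 1e-6
# ... but the one-sided difference quotients of f' at zero do not converge:
left = (fprime(-h) - fprime(0)) / (-h)
right = (fprime(h) - fprime(0)) / h
```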

### Exercise 3

Exercise (Application of the Leibniz rule)

Determine the following derivatives using the Leibniz rule

1. ${\displaystyle (x\sin(x))^{(10)}}$
2. ${\displaystyle (x^{3}\ln(x))^{(2016)}}$ for ${\displaystyle x>0}$

Solution (Application of the Leibniz rule)

Solution sub-exercise 1:

The functions ${\displaystyle {\text{id}}}$ and ${\displaystyle \sin }$ are arbitrarily often differentiable on ${\displaystyle \mathbb {R} }$. Hence we have

${\displaystyle {\begin{aligned}(x\sin(x))^{(10)}&{\underset {\text{rule}}{\overset {\text{Leibniz-}}{=}}}\sum _{k=0}^{10}{\binom {10}{k}}(x)^{(k)}(\sin(x))^{(10-k)}\\[0.3em]&\color {Gray}\left\downarrow \ (x)'=1,\ (x)^{(k)}=0{\text{ for }}k\geq 2,\ (\sin(x))^{(4k+1)}=\cos(x){\text{ and }}(\sin(x))^{(4k+2)}=-\sin(x){\text{ for all }}k\in \mathbb {N} \right.\\[0.3em]&={\binom {10}{0}}(x)^{(0)}(\sin(x))^{(10)}+{\binom {10}{1}}(x)'(\sin(x))^{(9)}\\[0.3em]&={\binom {10}{0}}x(-\sin(x))+{\binom {10}{1}}\cos(x)\\[0.3em]&=-x\sin(x)+10\cos(x)\end{aligned}}}$
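
This result can be verified exactly without any symbolic library: ${\displaystyle d/dx}$ maps the span of ${\displaystyle \{x\sin x,\ x\cos x,\ \sin x,\ \cos x\}}$ to itself, so the tenth derivative can be computed on coefficient tuples. The representation below is our own construction:

```python
# (a, b, c, d) stands for a*x*sin(x) + b*x*cos(x) + c*sin(x) + d*cos(x)
def deriv(v):
    a, b, c, d = v
    # (x sin x)' = sin x + x cos x,  (x cos x)' = cos x - x sin x,
    # (sin x)' = cos x,              (cos x)' = -sin x
    return (-b, a, a - d, b + c)

v = (1, 0, 0, 0)          # start with x*sin(x)
for _ in range(10):
    v = deriv(v)
# v now encodes -x*sin(x) + 10*cos(x)
```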

Solution sub-exercise 2:

The functions ${\displaystyle x\mapsto x^{3}}$ and ${\displaystyle \ln }$ are arbitrarily often differentiable on ${\displaystyle \mathbb {R} ^{+}}$. Hence we have

${\displaystyle {\begin{aligned}(x^{3}\ln(x))^{(2016)}&{\underset {\text{rule}}{\overset {\text{Leibniz-}}{=}}}\sum _{k=0}^{2016}{\binom {2016}{k}}(x^{3})^{(k)}(\ln(x))^{(2016-k)}\\[0.3em]&\color {Gray}\left\downarrow \ (x^{3})'=3x^{2},\ (x^{3})''=6x,\ (x^{3})'''=6,\ (x^{3})^{(k)}=0{\text{ for }}k\geq 4,\ (\ln(x))^{(k)}=(-1)^{k-1}{\tfrac {(k-1)!}{x^{k}}}{\text{ for all }}k\in \mathbb {N} \right.\\[0.3em]&={\binom {2016}{0}}x^{3}(-1)^{2015}{\tfrac {2015!}{x^{2016}}}+{\binom {2016}{1}}\cdot 3x^{2}(-1)^{2014}{\tfrac {2014!}{x^{2015}}}+{\binom {2016}{2}}\cdot 6x(-1)^{2013}{\tfrac {2013!}{x^{2014}}}+{\binom {2016}{3}}\cdot 6\cdot (-1)^{2012}{\tfrac {2012!}{x^{2013}}}\\[0.3em]&=-x^{3}{\tfrac {2015!}{x^{2016}}}+2016\cdot 3x^{2}{\tfrac {2014!}{x^{2015}}}-2016\cdot 2015\cdot 3x{\tfrac {2013!}{x^{2014}}}+2016\cdot 2015\cdot 2014\cdot {\tfrac {2012!}{x^{2013}}}\\[0.3em]&={\frac {-2015!+3\cdot 2016\cdot 2014!-3\cdot 2016\cdot 2015\cdot 2013!+2016\cdot 2015\cdot 2014\cdot 2012!}{x^{2013}}}={\frac {6\cdot 2012!}{x^{2013}}}\end{aligned}}}$
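
The 2016-fold derivative can also be reproduced exactly in software. The sketch below (our own representation of expressions ${\displaystyle p(x)\ln x+q(x)}$ with Laurent polynomials over exact rationals) differentiates ${\displaystyle x^{3}\ln x}$ 2016 times; the logarithmic part vanishes after four derivatives, and the result collapses to ${\displaystyle 6\cdot 2012!/x^{2013}}$:

```python
from fractions import Fraction
from math import factorial

# p, q are Laurent polynomials stored as {exponent: coefficient}
def deriv(p, q):
    # (p ln x + q)' = p' ln x + p/x + q'
    dp, dq = {}, {}
    for e, c in p.items():
        if e != 0:
            dp[e - 1] = dp.get(e - 1, Fraction(0)) + c * e  # p' ln x
        dq[e - 1] = dq.get(e - 1, Fraction(0)) + c          # p / x
    for e, c in q.items():
        if e != 0:
            dq[e - 1] = dq.get(e - 1, Fraction(0)) + c * e  # q'
    # drop zero coefficients so the log part can disappear entirely
    return ({e: c for e, c in dp.items() if c},
            {e: c for e, c in dq.items() if c})

p, q = {3: Fraction(1)}, {}   # start with x^3 * ln(x)
for _ in range(2016):
    p, q = deriv(p, q)
```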