Taylor Series




It is a common practice to approximate general functions by simpler functions, e.g., (truncated) polynomials.

Approximations allow us to study the $\textbf{local}$ behaviour of functions.


$\textbf{Linear approximation}$ means approximating a function near some point $c$ by a linear function, which is just the tangent line to the graph of the function at $x=c$.
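The idea can be checked numerically. A minimal sketch, with $f(x)=e^x$ and $c=0$ as our own illustrative choices (not fixed by the text):

```python
import math

# Linear approximation of f near c: g(x) = f(c) + f'(c)(x - c),
# i.e. the tangent line at c.  Here f(x) = exp(x), so f'(x) = exp(x),
# and we take c = 0.
def f(x):
    return math.exp(x)

def linear_approx(x, c=0.0):
    return f(c) + f(c) * (x - c)  # f'(c) = exp(c) = f(c) for this f

# The approximation is good near c and degrades farther away:
print(abs(f(0.1) - linear_approx(0.1)))  # ~0.005
print(abs(f(1.0) - linear_approx(1.0)))  # ~0.718
```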

Examples.



$\textbf{Quadratic approximation}$ means approximating a function with a quadratic polynomial which behaves similarly to the function near the point $c$.

Examples.



Similarly, we may approximate functions using $\textbf{cubic approximation}$ or, in general, $\textbf{$n$-th degree polynomial approximation}$.
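Higher-degree approximations can likewise be explored numerically. A sketch, again with the illustrative choice $f(x)=e^x$ at $c=0$ (convenient because every derivative at $0$ equals $1$):

```python
import math

# n-th degree Taylor polynomial of exp at c = 0: P_n(x) = sum x^i / i!.
def taylor_exp(x, n):
    return sum(x**i / math.factorial(i) for i in range(n + 1))

# Raising the degree improves the approximation at x = 1:
errors = [abs(math.exp(1.0) - taylor_exp(1.0, n)) for n in range(6)]
print(errors)  # strictly decreasing
```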

$\textbf{Error analysis}$.


Given a function $f(x)$ and its linear approximation $g(x)$ near the point $c$, what is the error of this approximation?

What is the relationship between this error $e=f(x)-g(x)$ and the distance $d=x-c$?

(Hint: compare $\frac{2e}{d^2}$ with the second derivative of the function within the interval with $c$ and $x$ as end points. Write $\displaystyle M=\max_t f''(t)$ and $\displaystyle m=\min_t f''(t), t$ between $c$ and $x$.)
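The hint can be verified numerically. A sketch with our own choices $f(x)=e^x$, $c=0$, $x=0.5$:

```python
import math

# g is the linear approximation of f(x) = exp(x) at c = 0,
# evaluated at x = 0.5 (our own example values).
c, x = 0.0, 0.5
g = math.exp(c) + math.exp(c) * (x - c)  # tangent line value at x
e = math.exp(x) - g                      # error of the approximation
d = x - c
ratio = 2 * e / d**2

# f''(t) = exp(t) is increasing, so on [0, 0.5] its min and max are:
m, M = math.exp(c), math.exp(x)
print(m <= ratio <= M)  # True
```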





Given a function $f(x)$ and its quadratic approximation $g(x)$ near the point $c$, what is the error of this approximation?

What is the relationship between this error $e=f(x)-g(x)$ and the distance $d=x-c$?

(Hint: compare $\frac{(3!)e}{d^3}$ with the third derivative of the function within the interval with $c$ and $x$ as end points. Write $\displaystyle M=\max_t f^{(3)}(t)$ and $\displaystyle m=\min_t f^{(3)}(t), t$ between $c$ and $x$.)





Observation: no matter what $c$ and $x$ we choose, $m \leq \frac{2e}{d^2} \leq M$ for linear approximation and $m \leq \frac{(3!)e}{d^3} \leq M$ for quadratic approximation.

By the intermediate value theorem, the fraction $\frac{2e}{d^2}$ (resp. $\frac{(3!)e}{d^3}$) must be equal to the second (resp. third) derivative of $f(x)$ at some point $t$ between $c$ and $x$.

$\textbf{Error analysis (cont.)}$.


$\textbf{Taylor's Theorem}$ tells us that:

If $f$ is $(n+1)$-times differentiable and $P_n(x)$ is the $n$-th degree Taylor polynomial, then the remainder $R_n(x)=f(x)-P_n(x)=\frac{f^{(n+1)}(t)}{(n+1)!}(x-c)^{n+1}$ where $t$ is some number between $c$ and $x$.


Examples:

The following diagram illustrates the $n$-th degree polynomial approximations to $\sin x, \cos x, e^x, \ln(1+x)$ and the corresponding estimation errors. ($n \leq 5$ is a non-negative integer)

Here, $e$ is the estimation error, $r=\frac{(n+1)!e}{d^{n+1}}$ where $\displaystyle d=x-c, M=\max_t f^{(n+1)}(t)$, and $\displaystyle m=\min_t f^{(n+1)}(t), t$ between $c$ and $x$.

Notice how $r$ is always bounded by $M$ and $m$.





In practice, it is quite difficult to find the value of such $t$.

However, if we can find a number $M$ such that $|f^{(n+1)}(t)|\lt M$ for any $t$ between $c$ and $x$,

then we may estimate $R_n(x)$ by $|R_n(x)|=\left| \frac{f^{(n+1)}(t)}{(n+1)!}(x-c)^{n+1} \right|\lt \frac{M}{(n+1)!} |x-c|^{n+1}$


Knowing this allows us to control the error of approximation by setting an upper bound.

This is very useful in real applications of the Taylor approximation.
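As an illustration (our own example, not from the text): for $f(x)=\sin x$ every derivative is $\pm\sin$ or $\pm\cos$, so $|f^{(n+1)}(t)|\leq 1$ and we may take $M=1$. The bound can then be checked directly:

```python
import math

# With M = 1 the estimate reads |R_n(x)| <= |x - c|^(n+1) / (n+1)!
# (here c = 0).
def sin_taylor(x, n):
    # Maclaurin polynomial of sin of degree <= n
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(n // 2 + 1) if 2*k + 1 <= n)

x, n = 0.5, 3
actual_error = abs(math.sin(x) - sin_taylor(x, n))
bound = abs(x)**(n + 1) / math.factorial(n + 1)
print(actual_error <= bound)  # True
```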

It is intuitive that tangent lines approximate functions linearly. What about higher-order approximations?

A suitable $n$-th degree polynomial is given by $\displaystyle P_n(x)=\sum_{i=0}^n \frac{f^{(i)}(c)}{i!}(x-c)^i$ (with the convention $f^{(0)}=f$), as justified by Taylor's Theorem.


Example.

Find the $n$-th degree Taylor polynomial of $f(x)=\frac{1}{4-x}$ at $x=0$.

Step 1: derive the derivatives of $f(x)$ up to the $n$-th order, where $n$ is the degree of your desired polynomial.

(Note: The derivatives of functions may or may not have general forms. Observe carefully.)

$f'(x)=\frac{1}{(4-x)^2},\quad f''(x)=\frac{2}{(4-x)^3},\quad\text{and in general}\quad f^{(n)}(x)=\frac{n!}{(4-x)^{n+1}}.$

The following diagram illustrates $f(x)$ and its $n$-th degree polynomial approximation. ($n$ from $0$ to $5$.)
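The approximation can be checked numerically. A sketch, assuming the pattern $f^{(i)}(0)/i! = 1/4^{\,i+1}$ (which one can verify by differentiating $f$ repeatedly):

```python
# For f(x) = 1/(4 - x), the n-th Taylor polynomial at 0 is
# P_n(x) = sum over i of x^i / 4^(i+1).
def f(x):
    return 1 / (4 - x)

def p(x, n):
    return sum(x**i / 4**(i + 1) for i in range(n + 1))

errs = [abs(f(1.0) - p(1.0, n)) for n in range(6)]
print(errs[0] > errs[5])  # True: higher degree, smaller error
```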

Example. (Substitution Technique)

Compute the Taylor series at $x=0$ for $f(x)=\frac{1}{4-x}$. Also find the interval of convergence.

Solution.

We may find the Taylor series as usual by computing derivatives. Here, we present an alternative method.

Let $g(x)=\frac{1}{1-x}$. We know that its infinite sum is the sum of a geometric series: $g(x)=\frac{1}{1-x}=1+x+x^2+\cdots, -1 \lt x \lt 1$. Writing $f(x)=\frac{1}{4-x}=\frac{1}{4}\cdot\frac{1}{1-\frac{x}{4}}=\frac{1}{4}\,g\!\left(\frac{x}{4}\right)$ and substituting $\frac{x}{4}$ into the series gives $f(x)=\frac{1}{4}\sum_{n=0}^{\infty}\left(\frac{x}{4}\right)^n=\sum_{n=0}^{\infty}\frac{x^n}{4^{n+1}}$, valid when $\left|\frac{x}{4}\right| \lt 1$, i.e., the interval of convergence is $(-4,4)$.

Exercises.

Find the Taylor series centered at $x=0$ and the intervals of convergence for the following functions without directly computing derivatives.

  1. $f(x)=\frac{1}{1-x^2}$.    Taylor expansion: Interval of convergence:
  2. $g(x)=\frac{1}{1+x}$.    Taylor expansion: Interval of convergence:
  3. $h(x)=\frac{x}{1-x}$.    Taylor expansion: Interval of convergence:
  4. $p(x)=e^{-x^2}$.    Taylor expansion: Interval of convergence:
  5. $q(x)=\ln(1+4x^2)$. Taylor expansion: Interval of convergence:

$\textbf{(i) } x+x^2+x^3+\cdots \textbf{ (ii) } 1-x^2+\frac{1}{2!}x^4-\frac{1}{3!}x^6+\cdots \textbf{ (iii) } 1+x^2+x^4+x^6+\cdots \textbf{ (iv) } 4x^2-\frac{4^2}{2}x^4+\frac{4^3}{3}x^6-\cdots \textbf{ (v) } 1-x+x^2-x^3+\cdots$

$\textbf{(a) } (-\infty, \infty) \textbf{ (b) } (-2,2) \textbf{ (c) } (-\frac{1}{2},\frac{1}{2}) \textbf{ (d) } (-1,1) \textbf{ (e) } [-1,1] $




Exercises.

  1. Compute the Taylor series for $f(x)=\sin x$ and $g(x)=\cos x$ around $x=0$.
    1. $\cos x = \sum_{n=0}^{\infty} (-1)^n \frac{x^{2n}}{(2n)!}$
    2. $\sin x = \sum_{n=0}^{\infty} (-1)^n \frac{x^{2n+1}}{(2n+1)!}$

    Remark.

    You might recall that $\sin x$ and $\cos x$, $x\in \mathbb{R}$, are defined through trigonometry. However, series representations are used to formally define sine and cosine as $\textbf{complex}$ functions.




  2. Compute the Taylor series for $f(x)=(1+x)^m$ around $x=0$.
    Solution.

    Case 1: $m$ is a non-negative integer.

    Compute derivatives: $f^{(n)}(x)=\left\{ \begin{array}{ll} m(m-1)(m-2)\cdots (m-n+1)(1+x)^{m-n} & n\leq m \\ 0 & n>m \\ \end{array}\right.$



    Case 2: $m$ is not a non-negative integer.

    Compute derivatives: $f^{(n)}(x)=m(m-1)(m-2) \cdots (m-n+1)(1+x)^{m-n}$ for any $n\in \mathbb{N}$.


    Remark.

    Case 1 corresponds to the Binomial Theorem, while Case 2 corresponds to the generalization of the Binomial Theorem to non-integer exponents (Binomial series).
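Both cases can be checked numerically. A sketch with our own parameter choices ($m=3$ for Case 1, $m=\frac{1}{2}$ for Case 2):

```python
import math

# The coefficient of x^n in the series is m(m-1)...(m-n+1)/n!.
def binom_series(x, m, terms):
    total, coeff = 0.0, 1.0
    for n in range(terms):
        total += coeff * x**n
        coeff *= (m - n) / (n + 1)  # next binomial coefficient
    return total

# Case 1: m = 3 -- the series terminates (plain Binomial Theorem).
print(abs(binom_series(0.3, 3, 10) - 1.3**3))          # ~0
# Case 2: m = 1/2 -- infinite binomial series for sqrt(1 + x).
print(abs(binom_series(0.2, 0.5, 20) - math.sqrt(1.2)))  # ~0
```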




$\textbf{Taylor's Theorem}$.

Roughly speaking, if $f$ is an $n$-times differentiable function, then we can find a degree-$n$ polynomial that approximates $f$, called the Taylor polynomial.

Most functions that we encounter are infinitely differentiable. Therefore, we may find the Taylor polynomials of these functions for any degree $n$.


The following diagram illustrates some common functions and their Taylor polynomials. The symbolic computation of Taylor polynomials is left as an exercise.


Theorem. (term-by-term differentiation and integration)

Suppose a function $f(x)$ has the power series $\displaystyle f(x)=\sum_{n=0}^{\infty}a_nx^n=a_0+a_1x+a_2x^2+\cdots$ which converges for $-R\lt x \lt R$. Then:

  1. The derivative of $f$ has the power series $\displaystyle f'(x)=\sum_{n=1}^{\infty}na_nx^{n-1}=a_1+2a_2x+3a_3x^2+\cdots$ which converges for $-R\lt x \lt R$.
  2. The antiderivative of $f$ has the power series $\displaystyle \int f(x)dx=C+\sum_{n=0}^{\infty}a_n\frac{x^{n+1}}{n+1}=C+a_0x+a_1\frac{x^2}{2}+a_2\frac{x^3}{3}+\cdots$ which converges for $-R \lt x \lt R$.

Remark.

  1. Although the radius of convergence is preserved, the convergence at the endpoints may differ.
  2. This theorem simply says that the sum rule for derivatives and integrals also applies to power series. This is not a trivial result because in general, the results we know for finite sums are not necessarily true for infinite sums.
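A quick numerical sanity check of item 1, using the geometric series as our example:

```python
# 1/(1-x) = sum x^n for |x| < 1, so term-by-term differentiation
# predicts 1/(1-x)^2 = sum n * x^(n-1) on the same open interval.
x, N = 0.5, 60
series = sum(n * x**(n - 1) for n in range(1, N))
print(abs(series - 1 / (1 - x)**2))  # essentially 0
```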

Exercises.

Using differentiation or integration on the Taylor series for suitable functions, compute the Taylor series for the following functions around $x=0$. Also find the interval of convergence.

  1. $f(x)=\ln(1-x)$.    Taylor expansion: Interval of convergence:
  2. $g(x)=\frac{1}{(1-x)^2}$.      Taylor expansion: Interval of convergence:
  3. $\displaystyle h(x)=\int x^2 e^x \, dx, \; h(0)=2$. Taylor expansion: Interval of convergence:

$\textbf{(i) } 1+2x+3x^2+\cdots \textbf{ (ii) } -(x+\frac{x^2}{2}+\frac{x^3}{3}+\cdots) \textbf{ (iii) } 2+\frac{x^3}{3}+\frac{x^4}{4}+\frac{x^5}{5\cdot 2!}+\cdots$

$\textbf{(a) } (-\infty, \infty) \textbf{ (b) } [-1,1) \textbf{ (c) } (-1,1)$




Definitions.

Let $\{ x_n\}_{n=1}^{\infty}$ be a sequence of real numbers.

The $\textbf{infinite series generated by } \{x_n\}_{n=1}^{\infty}$ is the sequence $\{S_n\}_{n=1}^{\infty}$ defined by $\displaystyle S_n = \sum_{i=1}^n x_i$.

The numbers $S_n$ are called the $\textbf{partial sums}$ of this series.
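A small sketch of these definitions, with $x_n = \frac{1}{2^n}$ as our own example sequence:

```python
# Partial sums S_n of the series generated by x_n = 1/2^n.
terms = [1 / 2**n for n in range(1, 11)]
partial_sums = []
running = 0.0
for t in terms:
    running += t
    partial_sums.append(running)

# S_10 = 1 - 2^(-10); the sequence of partial sums converges to 1.
print(partial_sums[-1])
```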


Let $\{ f_n\}_{n=1}^{\infty}$ be a sequence of functions defined in a real domain $D$.

The $\textbf{infinite series generated by } \{f_n\}_{n=1}^{\infty}$ is the sequence $\{S_n\}_{n=1}^{\infty}$ defined for $x\in D$ by $\displaystyle S_n(x) = \sum_{i=1}^n f_i(x)$.

The functions $S_n$ are called the $\textbf{partial sums}$ of this series.


If a sequence has a limit, then the sequence is said to be $\textbf{convergent}$. Otherwise, it is said to be $\textbf{divergent}$.


Proposition. (Ratio Test)

Let $\displaystyle S_n=\sum_{i=1}^n c_i$ be the partial sum of the series $\sum c_n$. Then $S_n$ is (i) convergent if $\displaystyle \lim_{n\to \infty} \left| \frac{c_{n+1}}{c_n} \right| \lt 1$; (ii) divergent if $\displaystyle \lim_{n\to \infty} \left| \frac{c_{n+1}}{c_n} \right| \gt 1$.

Proof.

Omitted. You may learn more about sequences and series from introductory real analysis. ($\text{MATH}2050, \text{MATH}2060$)

Remark.

If $\lim \left| \frac{c_{n+1}}{c_n} \right| = 1$, the ratio test is inconclusive. We may resort to other tests for convergence.
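As an illustration (our own example): for the power series of $\ln(1+x)$ the coefficients are $a_n=(-1)^{n+1}/n$, and the ratio test applied to the terms $a_nx^n$ shows convergence exactly when $|x| \lt \lim |a_n/a_{n+1}| = 1$:

```python
# |a_n / a_(n+1)| = (n+1)/n tends to 1, so the radius of
# convergence of the ln(1+x) series is 1.
def a(n):
    return (-1)**(n + 1) / n

def ratio(n):
    return abs(a(n) / a(n + 1))

print(ratio(10))      # ~1.1
print(ratio(100000))  # ~1.00001
```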


Definitions.

A series of real functions $\sum f_n$ is said to be a $\textbf{power series around } x=c$ if the function has the form $f_n(x)=a_n(x-c)^n$, where $a_n, c \in \mathbb{R}, n=0,1,2,\cdots$.

The $\textbf{interval of convergence}$ of the power series is the set $I$ of real numbers $x$ such that the power series converges.

The $\textbf{radius of convergence}$ of the power series is defined to be (i) infinity if $I=(-\infty,\infty)$, (ii) otherwise, the number $R\geq 0$ such that the series converges for $|x-c|\lt R$ and diverges for $|x-c|\gt R$.


Theorem. (Taylor's Theorem)

Let $n$ be a positive integer, $I=[a,b]$ be a closed interval on the real line, $x_0\in I$ and $f$ be a real-valued function defined on $I$ such that $f', f'', \cdots, f^{(n)}$ are continuous on $I$ and that $f^{(n+1)}$ exists on $(a,b)$.

Then for any $x\in I$, there exists some $c$ between $x$ and $x_0$ such that $f(x)=f(x_0)+f'(x_0)(x-x_0)+\frac{f''(x_0)}{2!}(x-x_0)^2+\cdots+\frac{f^{(n)}(x_0)}{n!}(x-x_0)^n+\frac{f^{(n+1)}(c)}{(n+1)!}(x-x_0)^{n+1}$.


Proof.

For any fixed $x\in I$, let $J$ be the closed interval with end points $x_0$ and $x$.

Define a function $F$ on $J$ by $F(t)=f(x)-f(t)-(x-t)f'(t)-\cdots-\frac{f^{(n)}(t)}{n!}(x-t)^n$ where $t\in J$.

Next, define another function $G$ on $J$ by $G(t)=F(t)-(\frac{x-t}{x-x_0})^{n+1}F(x_0)$.

Then $F'(t)=-\frac{f^{(n+1)}(t)}{n!}(x-t)^n$, since the sum telescopes when differentiated term by term. Since $G(x_0)=G(x)=0$, Rolle's Theorem gives some $c$ strictly between $x_0$ and $x$ with $G'(c)=0$, i.e., $F'(c)+(n+1)\frac{(x-c)^n}{(x-x_0)^{n+1}}F(x_0)=0$. Solving for $F(x_0)$ yields $F(x_0)=\frac{f^{(n+1)}(c)}{(n+1)!}(x-x_0)^{n+1}$, which is exactly the claimed remainder term.


Definitions.

We call the polynomial $P_n(x)=\sum_{i=0}^n \frac{f^{(i)}(x_0)}{i!}(x-x_0)^i = f(x_0)+f'(x_0)(x-x_0)+\frac{f''(x_0)}{2!}(x-x_0)^2+ \cdots +\frac{f^{(n)}(x_0)}{n!}(x-x_0)^n$ the $\textbf{$n$-th Taylor polynomial for } f$ at $x_0$ and the term $R_n(x)=f(x)-P_n(x)=\frac{f^{(n+1)}(c)}{(n+1)!}(x-x_0)^{n+1}$ is called the $\textbf{remainder}$.

If the $n$-th derivatives of $f$ exist for any order $n$, then $f$ may be represented by its $\textbf{Taylor series around }c$: $\displaystyle f(x)=\sum a_n (x-c)^n$ where $a_n=\frac{f^{(n)}(c)}{n!}$. Note that this series converges to $f$ only if the sequence of remainders $R_n(x)$ converges to $0$.

The $\textbf{$n$-th order Maclaurin polynomial of }f$ is the Taylor polynomial of $f$ centered at 0.

The $\textbf{Maclaurin series of }f$ is the Taylor series of $f$ centered at $0$.
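For example (our own numerical sketch), the Maclaurin partial sums of $\cos x$ converge to $\cos x$ for every real $x$, since the remainders tend to $0$:

```python
import math

# Maclaurin partial sums of cos x: sum (-1)^k x^(2k) / (2k)!.
def cos_maclaurin(x, terms):
    return sum((-1)**k * x**(2 * k) / math.factorial(2 * k)
               for k in range(terms))

print(abs(cos_maclaurin(2.0, 15) - math.cos(2.0)))  # essentially 0
```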