Chapter 8 Sums and Infinite Series

Mathematics is the work of the human mind, which is destined rather to study than to know, to seek the truth rather than to find it.

— Evariste Galois

The methods of Calculus were not discovered in the order that they are presented in this text. Indeed, we have arranged the book in a way that we feel facilitates understanding. Yet, Isaac Newton’s first forays into Calculus involved the calculation of sums with an infinite number of terms. We have been using such infinite sums ever since we encountered the integral. After all, we essentially defined the integral to be the limit of a sum as \(n\rightarrow\infty\). In this chapter, we will formally explore how such infinite sums can be used to represent functions themselves.

We have discovered that we can sum an infinite number of terms (the infinite series of the previous chapter). We can also sum an infinite number of functions: in particular, a function can be described as an infinite sum of terms called a power series. The first such example is derived below.

8.1 Derivation of the Maclaurin Series

Suppose you are given a function \(g(x)\). Our task is to find coefficients \(c_n\) satisfying the following equation:

\[g(x) = \sum_{n = 0}^{\infty} c_n x^n = c_0 + c_1 x + c_2 x^2 + \dots\]

In other words, our objective is to express the function \(g(x)\) as a polynomial \(c_0 + c_1 x + c_2 x^2 + \dots\), possibly involving an infinite number of terms. Initially, this may seem like a hopeless task. How can it possibly be the case that a polynomial can approximate a trigonometric function like \(g(x) = \sin(x)\)?

In the diagram above, the polynomial’s tails explode to \(-\infty\) or \(+\infty\) as \(x\) approaches \(-\infty\) or \(+\infty\). Meanwhile, the sine function is bounded between \(-1\) and \(1\). If the task seemed hopeless before, now it might seem especially so!

In math, however, simply playing about with ideas often yields interesting and useful results. In the spirit of carefree intellectual play, let’s see if we can find coefficients \(c_n\) such that the polynomial \(c_0 + c_1 x + c_2 x^2 + \dots\) equals a trigonometric function like \(\sin(x)\).

First, note that we can find the coefficient \(c_0\) very simply. We only need to set \(x=0\):

\[g(0) = c_0 + c_1\cdot 0 + c_2\cdot 0^2 + \dots = c_0\]

Fantastic! We’ve found our first coefficient; namely, \(\boxed{c_0 = g(0)}\).

Let’s take a derivative. The derivative of the constant term \(c_0\) is zero, while for every other term the power rule brings the exponent down into the coefficient:

\[g'(x) = c_1 + 2c_2 x + 3c_3 x^2 + 4c_4 x^3 + \dots\]

If we set \(x = 0\), we have \(\boxed{c_1 = g'(0)}\). Great! We’ve found our second coefficient. Let’s take another derivative:

\[g''(x) = 2\cdot 1\cdot c_2 + 3\cdot 2\cdot c_3 x + 4\cdot 3\cdot c_4 x^2 + \dots\]

Again, if we set \(x = 0\), we have \(\boxed{c_2 = \frac{g''(0)}{2\cdot 1}}\). Wonderful, we’ve found yet another coefficient! Let’s take one more derivative, and see if we can’t generalize our little procedure.

\[g'''(x) = 3\cdot 2\cdot 1 \cdot c_3 + 4\cdot 3\cdot 2\cdot c_4 x + \dots\]

We set \(x = 0\) yet again. Upon doing so, we find \(c_3 = \frac{g'''(0)}{3\cdot 2\cdot 1}\). We can continue this procedure to find an arbitrary coefficient \(c_n\). Recall that the factorial is defined by

\[n! = n\cdot (n - 1)\cdot (n - 2) \cdots 2\cdot 1\]

For example, \(4! = 4\cdot 3\cdot 2\cdot 1 = 24\). Based on what we’ve seen so far, it might seem natural to predict that

\[c_n = \frac{g^{(n)}(0)}{n!},\]

where \(g^{(n)}(0)\) is the \(n^{th}\) derivative of the function \(g\) evaluated at zero. For instance, \(c_3 = \frac{g^{(3)}(0)}{3!} = \frac{g'''(0)}{3\cdot 2\cdot 1}\), which is exactly what we found before.
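To see why this formula holds in general, differentiate the series \(n\) times. Every term of degree lower than \(n\) vanishes, the \(x^n\) term becomes the constant \(n!\,c_n\), and every higher-degree term still carries at least one factor of \(x\):

\[g^{(n)}(x) = n!\,c_n + \frac{(n + 1)!}{1!}\,c_{n + 1}\,x + \frac{(n + 2)!}{2!}\,c_{n + 2}\,x^2 + \dots\]

Setting \(x = 0\) eliminates every term except the first, leaving \(g^{(n)}(0) = n!\,c_n\). Therefore, we have found that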

Definition 8.1 (The Maclaurin Series) The Maclaurin Series for a function \(g\) is given by

\[\begin{equation} g(x) = \sum_{n = 0}^{\infty}\frac{g^{(n)}(0)}{n!}x^n \tag{8.1} \end{equation}\]

This power series is referred to as the Maclaurin Series, in honor of Scottish mathematician Colin Maclaurin.
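If you would like to check a Maclaurin series by machine, a computer algebra system can carry out exactly the recipe above: differentiate repeatedly, evaluate at zero, and divide by \(n!\). Below is a minimal sketch using Python’s sympy library (our choice of tool, not prescribed by the text):

```python
from sympy import symbols, sin, diff, factorial

x = symbols('x')
g = sin(x)  # try any smooth function here

# Compute the first few Maclaurin coefficients c_n = g^(n)(0) / n!
coefficients = []
for n in range(8):
    nth_derivative = diff(g, x, n)                    # g^(n)(x)
    c_n = nth_derivative.subs(x, 0) / factorial(n)    # evaluate at 0, divide by n!
    coefficients.append(c_n)

print(coefficients)  # for sin(x): [0, 1, 0, -1/6, 0, 1/120, 0, -1/5040]
```

These coefficients match the hand computation for the sine function carried out in the next section.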

8.2 Examples of the Maclaurin Series

Sine Function

Let’s first use our recipe above to determine the Maclaurin series of the sine function \(f(x) = \sin(x)\). Our only real task is to find the derivatives of the function \(f(x)\), then evaluate these derivatives at zero. We have

\[f(x) = \sin x \Rightarrow f(0) = \sin 0 = 0\]

\[f'(x) = \cos x \Rightarrow f'(0) = \cos 0 = 1\]

\[f''(x) = -\sin x \Rightarrow f''(0) = -\sin 0 = 0\]

\[f'''(x) = -\cos x \Rightarrow f'''(0) = -\cos 0 = -1\]

The values of the derivatives at zero cycle through \(0, 1, 0, -1\), so we could continue this process indefinitely to obtain better and better polynomial approximations of the sine function. Plugging these values into Equation (8.1), our polynomial looks like this:

\[\sin(x) = \frac{0}{0!}x^0 + \frac{1}{1!}x^1 + \frac{0}{2!}x^2 + \frac{-1}{3!}x^3 + \frac{0}{4!}x^4 + \frac{1}{5!}x^5 + \frac{0}{6!}x^6 + \frac{-1}{7!}x^7 + \dots\] or

\[\sin(x) = x - \frac{1}{6}x^3 + \frac{1}{120}x^5 - \frac{1}{5040} x^7 + \dots\]

No finite number of terms on the right will ever exactly equal the sine function. To reiterate, the polynomial is a better and better approximation of the sine function as terms are added, and “infinity” in this case just refers to the never-ending process in which we add more terms to obtain a more precise approximation.

Recall that the sine function is odd (1.2). Note also that all of the terms involved in the polynomial for sine (\(x\), \(x^3\), \(x^5\), \(x^7\), …) are also odd!

As usual, we mathematicians also want to visualize what we’ve created. Below is a sine function (in red) and the first \(K\) terms of the Maclaurin series for sine (in blue). Note that the blue function more closely approximates the sine function as the number of terms is made larger!
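If the figure is not in front of you, the same comparison can be made numerically. The sketch below (our own illustration, with a hypothetical helper `maclaurin_sin`) sums the first few nonzero terms of the series and compares the result with Python’s built-in sine:

```python
import math

def maclaurin_sin(x, terms=5):
    """Sum the first `terms` nonzero terms of x - x^3/3! + x^5/5! - ..."""
    total = 0.0
    for k in range(terms):
        n = 2 * k + 1  # only odd powers appear
        total += (-1) ** k * x ** n / math.factorial(n)
    return total

for x in [0.5, 1.0, 2.0]:
    print(x, maclaurin_sin(x), math.sin(x))
# Near x = 0 the partial sum is already extremely close to sin(x);
# farther from 0, more terms are needed for the same accuracy.
```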

Cosine Function

Let’s now perform our little procedure on the cosine function. Once again, we need only to compute the derivatives of \(f(x) = \cos(x)\), from which we can write down the Maclaurin series corresponding to cosine. We have the following

\[f(x) = \cos x \Rightarrow f(0) = \cos 0 = 1\] \[f'(x) = -\sin x \Rightarrow f'(0) = -\sin 0 = 0\]

\[f''(x) = -\cos x \Rightarrow f''(0) = -\cos 0 = -1\]

\[f'''(x) = \sin x \Rightarrow f'''(0) = \sin 0 = 0\]

This is very similar to the derivatives for sine; indeed, in both cases, the value at \(x=0\) repeats after four derivatives. Plugging these derivatives into the formula for the Maclaurin series, we have:

\[\cos(x) = \frac{f(0)}{0!}x^0 + \frac{f'(0)}{1!}x^1 + \frac{f''(0)}{2!}x^2 + \frac{f'''(0)}{3!}x^3 + \frac{f''''(0)}{4!}x^4 + \dots\]

or

\[\cos(x) = 1 + \frac{-1}{2!}x^2 + \frac{1}{4!}x^4 + \dots \] or

\[\cos(x) = 1 - \frac{1}{2}x^2 + \frac{1}{24}x^4 + \dots\]

Once again, no finite number of terms of the polynomial on the right will ever exactly equal the cosine function; the approximation simply becomes more accurate as more terms are added.
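For instance, estimating \(\cos(1)\) with just the three terms written above gives

\[\cos(1) \approx 1 - \frac{1}{2} + \frac{1}{24} = \frac{13}{24} \approx 0.5417,\]

which is already close to the true value \(\cos(1) \approx 0.5403\); adding the next term, \(-\frac{1}{6!}x^6\), closes most of the remaining gap.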

Recall that the cosine function is even (1.1). Note also that all of the terms involved in the polynomial for cosine (\(1\), \(x^2\), \(x^4\), …) are also even!

Below is a cosine function (in red) and the first \(K\) terms of the Maclaurin series for cosine (in blue). Note that the blue function more closely approximates the cosine function as the number of terms is made larger!

Exponential Function

Let’s use our recipe one last time to determine the Maclaurin series of \(f(x) = e^x\), the basic exponential function. Having also computed the Maclaurin series for sine and cosine, we will be able to uncover a result which will hint at a much deeper branch of mathematics known as Complex Analysis.

As usual, from the derivatives of \(f(x) = e^x\), we will be able to write down the Maclaurin series for \(e^x\), which we will then visualize. Remember that the derivative of \(e^x\) is itself (reference here), so computing the Maclaurin series in this case will be very easy:

\[f(x) = e^x \Rightarrow f(0) = e^0 = 1\] \[f'(x) = e^x \Rightarrow f'(0) = e^0 = 1\]

\[f''(x) = e^x \Rightarrow f''(0) = e^0 = 1\]

Every derivative of \(e^x\) evaluated at \(x = 0\) equals one. Therefore, the Maclaurin series is given by

\[e^x = \frac{1}{0!}x^0 + \frac{1}{1!}x^1 + \frac{1}{2!}x^2 + \frac{1}{3!}x^3 + \dots\]

or

\[e^x = 1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots\]

Notice that, if you take a derivative of the polynomial \(1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots\), you will get \(1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots\) again. This is exactly what we expect from the function \(f(x) = e^x\), since its derivative is itself.
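Concretely, differentiating term by term,

\[\frac{d}{dx}\left(1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \dots\right) = 0 + 1 + \frac{2x}{2!} + \frac{3x^2}{3!} + \dots = 1 + x + \frac{x^2}{2!} + \dots\]

each term slides down to take the place of the one before it, so the series reproduces itself.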

We provide an illustration of the Maclaurin series for the exponential function below.
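In the same spirit as the sine example earlier, here is a short numerical check (our own sketch, not the book’s figure) of how quickly the partial sums of this series approach \(e^x\):

```python
import math

def maclaurin_exp(x, terms=10):
    """Sum the first `terms` terms of 1 + x + x^2/2! + x^3/3! + ..."""
    return sum(x ** n / math.factorial(n) for n in range(terms))

for terms in [2, 5, 10]:
    print(terms, maclaurin_exp(1.0, terms), math.exp(1.0))
# The partial sums converge quickly: by ten terms the estimate of e = 2.71828...
# already agrees with math.exp(1.0) to several decimal places.
```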

Euler’s Formula

A very important connection can be made between exponential functions and trigonometric functions through the Maclaurin series and complex numbers. Recall from Chapter 1 (REFERENCE HERE) that the imaginary unit \(i = \sqrt{-1}\) gives rise to a larger number system than the real numbers. By definition, the imaginary unit is the number whose square is \(-1\); that is, \(i^2 = -1\).

We can continue multiplying the imaginary unit by itself and note that its powers cycle in a pattern very similar to the derivatives of sine and cosine at zero. We have

\[i = \sqrt{-1}, \hspace{3mm} i^2 = -1, \hspace{3mm} i^3 = -\sqrt{-1} = -i, \hspace{3mm} i^4 = 1, \hspace{3mm} i^5 = i = \sqrt{-1}, \dots\]
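Since \(i^4 = 1\), this pattern of four values repeats forever:

\[i^{n + 4} = i^n \cdot i^4 = i^n\]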

Something magical happens if we plug \(ix\) into the Maclaurin series for the exponential function. We have

\[e^{ix} = 1 + ix + \frac{(ix)^2}{2!} + \frac{(ix)^3}{3!} + \frac{(ix)^4}{4!} + \frac{(ix)^5}{5!} + \dots = \underbrace{1 + ix + i^2\frac{x^2}{2!} + i^3\frac{x^3}{3!} + i^4\frac{x^4}{4!} + i^5\frac{x^5}{5!} + \dots}_{*}\]

Using the cycle of powers of \(i\) above, the quantity \((*)\) becomes

\[e^{ix} = 1 + ix - \frac{x^2}{2!} - i\frac{x^3}{3!} + \frac{x^4}{4!} + i\frac{x^5}{5!} + \dots\]

We can group the terms with an \(i\) and group the terms without an \(i\). We have

\[e^{ix} = \left(1 - \frac{x^2}{2!} + \frac{x^4}{4!} + \dots \right) + i\left(x - \frac{x^3}{3!} + \frac{x^5}{5!} + \dots\right)\]

The first set of parentheses contains the Maclaurin series for cosine, while the second contains the Maclaurin series for sine. Therefore, we have derived a formula, called Euler’s Formula, which plays a central role in all of mathematics:

Theorem 8.1 (Euler's Formula) Euler’s Formula is given by

\[\begin{equation} e^{ix} = \cos x + i\sin x \tag{8.2} \end{equation}\]
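As a quick sanity check of Theorem 8.1, the two sides can be compared numerically using Python’s complex-number support (a sketch of our own, not part of the original text):

```python
import cmath
import math

for x in [0.0, 1.0, math.pi]:
    left = cmath.exp(1j * x)                   # e^{ix}
    right = complex(math.cos(x), math.sin(x))  # cos x + i sin x
    print(x, left, right)
# The two sides agree up to floating-point rounding. At x = pi this gives
# e^{i*pi} = -1, a celebrated special case often written e^{i*pi} + 1 = 0.
```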

Taylor Series

Derivation of Stirling’s Approximation

This section can be skipped without loss of continuity

Derivation of Jensen’s Inequality

This section can be skipped without loss of continuity