Chapter 4 Applications of the Derivative

An equation means nothing to me unless it expresses a thought of God.

— Srinivasa Ramanujan

4.2 The Second Derivative

In chapter 3, we explored the meaning of the first derivative of a function. In particular, we showed that the first derivative gives the slope of the tangent line to a function. Moreover, we learned that the derivative of a function is itself another function. Provided it exists, what happens if we take the derivative of that new function?

This other derivative is aptly named the second derivative. In fact, as long as the derivatives exist, we can take as many derivatives as we like. Computing the second derivative is not much different from computing the first. The following example illustrates how to take the second derivative of the function \(f(x) = \sin x\).

Example 4.3 (The Second Derivative)

Find the second derivative of \(f(x) = \sin x\).

We learned in the previous chapter that the first derivative of sine is cosine (3.3.3). That is, \(\frac{d}{dx} \sin x = \cos x\).

Therefore, the second derivative is given by

\[\frac{d^2}{dx^2} \sin x = \frac{d}{dx}\left(\frac{d}{dx} \sin x\right) = \frac{d}{dx} \cos x = - \sin x\]

Notice that the second derivative is just the derivative applied twice.
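If you want to double-check a computation like this, a computer algebra system can differentiate symbolically. The short sketch below uses the Python library sympy; it is only an illustrative aside, not part of the derivation.

```python
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x)

first = sp.diff(f, x)       # cos(x)
second = sp.diff(f, x, 2)   # -sin(x): the derivative applied twice

print(first, second)        # prints: cos(x) -sin(x)
```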

The illustration below provides a picture of the second derivative.

The important thing to notice here is that we have a two-step process. The green segment represents the derivative of the sine function, which produces the first derivative, cosine. The orange segment represents the derivative of the cosine function, which produces \(f(x) = -\sin x\). This is clear from the fact that the orange curve is the red curve, the original sine function, reflected across the \(x\)-axis. Use this idea to answer the following question.

What is the second derivative of the function \(f(x) = e^x\sin x\)?

\(e^x\sin x + e^x\cos x\)
\(2e^x\sin x + 2e^x\cos x\)
\(2e^x\sin x - e^x\cos x\)
\(2e^x\cos x\)

Another illustration of the second derivative is provided below.

The value of the slope of \(f(x)\) determines the derivative \(\color{red}{\frac{df}{dx}}\). Then, the slope of the derivative \(\color{red}{\frac{df}{dx}}\) determines the value of the second derivative, \(\color{green}{\frac{d^2 f}{dx^2}}\). Note that in the foregoing illustration, the derivative \(\color{red}{\frac{df}{dx}}\) is a straight line, so it has a constant slope. Therefore, the second derivative \(\color{green}{\frac{d^2 f}{dx^2}}\) is a constant function.
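The same slope-of-the-slope idea can be seen numerically. The sketch below is a hypothetical example of our own (it is not the function in the figure): it approximates derivatives of \(f(x) = x^2\) with finite differences via numpy.gradient, so the first derivative comes out as approximately the straight line \(2x\) and the second derivative as approximately the constant \(2\).

```python
import numpy as np

x = np.linspace(-3, 3, 601)
f = x**2                  # a function whose derivative is a straight line

df = np.gradient(f, x)    # numerical slope of f: approximately 2x
d2f = np.gradient(df, x)  # slope of the slope: approximately the constant 2

print(df[300], d2f[300])  # near x = 0: roughly 0.0 and 2.0
```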

4.3 Finding Maxima and Minima

Among the most important applications of Calculus is its ability to find optimal solutions. In particular, Calculus allows one to compute the maxima and minima of a function. To determine the maxima and minima of a function, all we need to compute are its first and second derivatives. Consider the following illustration:

What is the value of the slope when the tangent reaches the peak of the function above?

The foregoing illustration suggests a method we can use to find both the maxima and minima of a function: we first compute the derivative of the function, then find where the derivative is equal to zero. But how do we know whether the point we found is a maximum or a minimum? To decide, we need the second derivative of the function. Whenever the point is a maximum, the slope decreases as we pass through it. The picture below illustrates this:

Notice that as the tangent line is moved from left to right, its slope goes from being positive to being negative. In other words, the rate of change of the rate of change is negative. Whenever this occurs, it must be that the second derivative is negative. In math, we have:

\[\frac{d^2 y}{dx^2} < 0\]

Therefore, a maximum is characterized by two conditions: the first derivative is zero, \(\frac{dy}{dx} = 0\), and the second derivative is negative, \(\frac{d^2 y}{dx^2} < 0\).

Meanwhile, if the point is a minimum, the slope increases near the minimum. This is illustrated below.

In the figure above, as the tangent line is moved from the left to the right, the slope goes from being negative \(\frac{dy}{dx} < 0\) to being positive \(\frac{dy}{dx} > 0\). In other words, the rate of change of the rate of change is positive. Whenever this occurs, it must be that the second derivative is positive. Expressing this in notation, we have:

\[\frac{d^2 y}{dx^2} > 0\]

Therefore, a minimum is characterized by two conditions: the first derivative is zero, \(\frac{dy}{dx} = 0\), and the second derivative is positive, \(\frac{d^2 y}{dx^2} > 0\).

There is one last case that must be considered: what happens if the second derivative is equal to zero? In that case, the test by itself is inconclusive. The point may be an inflection point, where the curve changes concavity, though it can also be a maximum or a minimum (for instance, \(f(x) = x^4\) has a zero second derivative at \(x = 0\), yet that point is a minimum). Inflection points are illustrated below.

Example 4.5 (Finding Maxima and Minima) Find all critical points (maxima, minima, and inflection points) of the function \(f(x) = x^3 - 9x\).

Whenever doing a problem of this sort, we must first compute the derivative of the function and determine where that derivative is equal to zero. We have:

\[\frac{df}{dx} = 3x^2 - 9\]

Now we set this derivative equal to zero to determine where the critical points occur. We have

\[ 3x^2 - 9 = 0 \Rightarrow x^2 = 3 \Rightarrow x = \pm \sqrt{3}\]

Wonderful! We have our critical points. Now we must determine what the sign of the second derivative is at these critical points. Taking another derivative, we have:

\[\frac{d^2f}{dx^2} = 6x\]

For the critical point at \(x = \sqrt{3}\), we have

\[\frac{d^2 f}{dx^2} = 6\sqrt{3} > 0,\]

so this critical point is a minimum.

The critical point at \(x = -\sqrt{3}\) is a maximum, since

\[\frac{d^2 f}{dx^2} = -6\sqrt{3} < 0.\]

Finally, the second derivative \(\frac{d^2 f}{dx^2} = 6x\) equals zero at \(x = 0\) and changes sign there, so the function has an inflection point at \(x = 0\). This is illustrated in the plot below.
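We can also verify this classification symbolically. The sketch below (using sympy, purely as a check of the algebra above) solves \(\frac{df}{dx} = 0\) and inspects the sign of the second derivative at each critical point.

```python
import sympy as sp

x = sp.symbols('x')
f = x**3 - 9*x

df = sp.diff(f, x)                  # 3x^2 - 9
critical_points = sp.solve(df, x)   # [-sqrt(3), sqrt(3)]
d2f = sp.diff(f, x, 2)              # 6x

for c in critical_points:
    value = d2f.subs(x, c)
    label = "minimum" if value > 0 else "maximum" if value < 0 else "inconclusive"
    print(c, label)                 # -sqrt(3) is a maximum, sqrt(3) is a minimum
```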

Find the critical points of the function \(f(x) = \frac{3x^2 + 5x - 2}{x - 2}\).

\(3 \pm \frac{4}{3}\sqrt{17}\)
\(7 \pm \frac{2}{3}\sqrt{14}\)
\(6 \pm \frac{3}{2}\sqrt{12}\)
\(4 \pm \frac{3}{4}\sqrt{15}\)
\(2 \pm \frac{2}{3}\sqrt{15}\)
\(3 \pm \frac{2}{3}\sqrt{17}\)

4.4 The Newton-Raphson Method and Finding Roots

The first derivative can also be used to help find the roots of a function \(f(x)\). The roots of a function are those values \(a\) such that \(f(a) = 0\). Finding the roots of a function will be very useful when discussing physics in the final chapter of this textbook.

Consider the following diagram:

Imagine that we do not have a convenient graph from which we can simply read off where the function is equal to zero. We need a procedure that allows us to compute the roots of a function numerically. We begin with some initial guess, which we call \(x_0\). From there, we compute the derivative of the function at that point and follow the tangent line until it hits the \(x\)-axis. We call the point at which the tangent intersects the \(x\)-axis \(x_1\), as illustrated in the diagram.

How can we determine \(x_1\) with a formula? We have two points, \((x_0, f(x_0))\) and \((x_1, 0)\), and we wish to derive a formula for \(x_1\). We know the derivative at \(x_0\) is \(f'(x_0)\), and this is the slope \(m\) of the line through the two points above. Therefore, we have

\[f'(x_0) = \frac{0 - f(x_0)}{x_1 - x_0}\]

We want to solve this formula for \(x_1\). Rearranging, we have

\[(x_1 - x_0)\cdot f'(x_0) = -f(x_0) \Rightarrow x_1 = x_0 - \frac{f(x_0)}{f'(x_0)}\]

Wonderful! But we cannot guarantee that \(x_1\) is a root of the function. Therefore, we repeat the exact same process to obtain a new point \(x_2\). We have

\[x_2 = x_1 - \frac{f(x_1)}{f'(x_1)}\]

Taking this procedure further and further, we have

\[x_{n + 1} = x_n - \frac{f(x_n)}{f'(x_n)}\]
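As a computational aside, here is a minimal sketch of this iteration in Python. The function name, the tolerance, and the iteration cap are our own choices rather than anything prescribed by the method; the zero-derivative check anticipates a failure mode discussed shortly.

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n) / f'(x_n) until successive values stop changing."""
    x = x0
    for _ in range(max_iter):
        slope = df(x)
        if slope == 0:
            raise ZeroDivisionError("f'(x) = 0: the tangent line never meets the x-axis")
        x_next = x - f(x) / slope
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x   # best estimate after max_iter steps

# usage: approximate the square root of 2 as a root of x^2 - 2
print(newton_raphson(lambda x: x**2 - 2, lambda x: 2*x, x0=1.0))  # about 1.41421356
```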

For most functions, this little procedure will find the roots of the function, as illustrated in the diagram above. It is often called the Newton-Raphson method, or simply Newton's method. However, there are some cases for which the method will not work. Consider the diagram below.

When we select our initial value \(x_0\), we follow the same procedure as before: we compute \(f(x_0)\) and \(f'(x_0)\), and the tangent line meets the \(x\)-axis at the value \(x_1\). From there, we move up to \(f(x_1)\), compute the derivative \(f'(x_1)\), and follow the tangent line. This time, instead of converging toward a zero of the function, the tangent returns to \(x_0\)! This process repeats forever and fails to find a zero of this function for the given value of \(x_0\).

There are other reasons why Newton’s method might fail. Consider the following diagram:

We choose some initial value, in this case \(x_0 = 0.453\), and follow exactly the same procedure as in the previous two examples to reach the value \(x_1 = -0.328\). This happens to be the \(x\)-coordinate of the minimum of the function \(f(x)\), shown in red. When we drop the vertical line to \(f(x_1)\), we find that the tangent at that point is horizontal: it is parallel to the \(x\)-axis, so it never intersects the \(x\)-axis. Because the tangent never reaches the \(x\)-axis, there is no point \(x_2\), and the algorithm fails to find a zero of the function \(f(x)\).

Example 4.6 (Applying Newton's Method)

Without using a graphing tool, determine the zero of the function \(f(x) = \sin(x - 5) + x\).

As was true in the previous illustrations, we must provide an initial guess to the algorithm. For simplicity, let’s choose \(x_0 = 0\). We must first find the derivative of \(f(x)\):

\[f'(x) = \cos(x - 5) + 1\]

Applying the formula for Newton’s Method once, we have:

\[x_1 = x_0 - \frac{f(x_0)}{f'(x_0)} = 0 - \frac{\sin(-5) + 0}{\cos(-5) + 1} \approx -\frac{0.959}{1.284} \approx -0.747\]

Applying this formula again, we have:

\[x_2 = x_1 - \frac{f(x_1)}{f'(x_1)} = -0.747 - \frac{\sin(-0.747 - 5) - 0.747}{\cos(-0.747 - 5) + 1} \]

\[\approx -0.747 - \frac{-0.236}{1.860} \approx -0.620\]

We can apply the formula one last time to obtain

\[x_3 = x_2 - \frac{f(x_2)}{f'(x_2)} = -0.620 - \frac{\sin(-0.620 - 5) - 0.620}{\cos(-0.620 - 5) + 1} \]

\[\approx -0.620 - \frac{-0.00434}{1.788} \approx -0.6176\]

Notice that the change between \(x_3\) and \(x_2\) is smaller than the change between \(x_2\) and \(x_1\). This is a good indication that the algorithm is converging as we wish. As illustrated in the picture below, the zero of the function is approximately \((-0.618, 0)\). The algorithm therefore came within \(0.0004\) of the real zero in only three iterations, demonstrating its power.
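To double-check the arithmetic above, we can run the same iteration with an off-the-shelf routine. The sketch below uses scipy.optimize.newton, which implements this method; the result agrees with the iterates computed by hand.

```python
import math
from scipy.optimize import newton

f = lambda x: math.sin(x - 5) + x        # the function from Example 4.6
fprime = lambda x: math.cos(x - 5) + 1   # its derivative

root = newton(f, x0=0.0, fprime=fprime)
print(root)   # approximately -0.6176, matching x_3 above
```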

Using Newton’s Method, which of the following is a zero of the function \(x^{3}-3x^{2}+7\)?

-1.364
-1.279
-0.854
0.124

4.5 The Mean Value Theorem and L’Hopital’s Rule

4.5.1 The Mean Value Theorem

A valuable rule that arises frequently in proving results in Calculus is the Mean Value Theorem. Consider the illustration below:

We have connected the points \((a, f(a))\) and \((b, f(b))\) with a line. From the previous chapter, we know that such a line is called a secant. The Mean Value Theorem states that, as long as the function \(f(x)\) is continuous on the interval \([a, b]\) and differentiable on the interval \((a, b)\), we can always find a value \(x = c\) between \(a\) and \(b\) such that the derivative at \(c\) is equal to the slope of the secant line through \((a, f(a))\) and \((b, f(b))\). This is stated mathematically below.

Theorem 4.1 (The Mean Value Theorem) Let \(f(x)\) be a continuous function on the interval \([a, b]\) and differentiable on \((a, b)\). Then there exists some \(c\) in \((a, b)\) such that

\[f'(c) = \frac{f(b) - f(a)}{b - a} \]
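As a concrete illustration (an example of our own choosing, not one from the text), take \(f(x) = x^3\) on the interval \([0, 2]\). The secant slope is \(\frac{8 - 0}{2 - 0} = 4\), and the sketch below uses sympy to solve \(f'(c) = 4\) and keep the solution lying in \((0, 2)\).

```python
import sympy as sp

x, c = sp.symbols('x c')
f = x**3
a, b = 0, 2

secant_slope = (f.subs(x, b) - f.subs(x, a)) / (b - a)                 # (8 - 0) / 2 = 4
solutions = sp.solve(sp.Eq(sp.diff(f, x).subs(x, c), secant_slope), c)

# the Mean Value Theorem guarantees at least one solution inside (a, b)
print([s for s in solutions if a < s < b])   # [2*sqrt(3)/3], about 1.155
```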

4.5.2 L’Hopital’s Rule

There is one last derivative rule we will need before we proceed. It is called L’Hopital’s Rule. We will state the general case of the rule, but we will only prove a specific case we will need for the remainder of the text.

Theorem 4.2 (L'Hopital's Rule) Suppose we wish to compute the following limit:

\[ \lim_{x \rightarrow a} \frac{f(x)}{g(x)} \]

Furthermore, suppose the following four conditions hold:

  1. \(\lim_{x\rightarrow a} f(x) = \lim_{x\rightarrow a} g(x) = 0\) or \(\pm \infty\),

  2. \(f(x)\) and \(g(x)\) are differentiable on an open interval containing \(a\), except possibly at \(x = a\) itself,

  3. \(g'(x) \neq 0\) for all \(x\) in that interval with \(x \neq a\), and

  4. \(\lim_{x \rightarrow a} \frac{f'(x)}{g'(x)}\) exists.

If these conditions are satisfied, then the following is true:

\[ \lim_{x\rightarrow a}\frac{f(x)}{g(x)} = \lim_{x\rightarrow a}\frac{f'(x)}{g'(x)}. \]

We will prove this theorem only in the restricted case where \(f(x)\) and \(g(x)\) are continuously differentiable. The proof of the general case can be found in a standard real analysis textbook.

Proof (Special Case of L'Hopital's Rule). Suppose that \(f(x)\) and \(g(x)\) have continuous first derivatives, \(f(a) = g(a) = 0\), and \(g'(a) \neq 0\). Then we have the following:

\[ \lim_{x\rightarrow a}\frac{f(x)}{g(x)} = \lim_{x\rightarrow a}\frac{f(x) - 0}{g(x) - 0} = \lim_{x \rightarrow a}\frac{f(x) - f(a)}{g(x) - g(a)}\]

\[ = \lim_{x\rightarrow a}\frac{\frac{f(x) - f(a)}{x - a}}{\frac{g(x) - g(a)}{x - a}} = \frac{\lim_{x\rightarrow a}\frac{f(x) - f(a)}{x - a}}{\lim_{x\rightarrow a}\frac{g(x) - g(a)}{x - a}} = \frac{f'(a)}{g'(a)} = \lim_{x\rightarrow a} \frac{f'(x)}{g'(x)},\] where splitting the limit of the quotient is justified because the limit in the denominator is \(g'(a) \neq 0\), and the last equality holds because \(f\) and \(g\) have continuous derivatives at \(x = a\).
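As a quick sanity check of the rule (with an example of our own choosing), the sketch below compares \(\lim_{x \to 0} \frac{1 - \cos x}{x^2}\) with the limit of the ratio of derivatives; both come out to \(\frac{1}{2}\).

```python
import sympy as sp

x = sp.symbols('x')
f = 1 - sp.cos(x)   # numerator: tends to 0 as x -> 0
g = x**2            # denominator: tends to 0 as x -> 0

original = sp.limit(f / g, x, 0)                            # 1/2
after_rule = sp.limit(sp.diff(f, x) / sp.diff(g, x), x, 0)  # sin(x) / (2x) -> 1/2

print(original, after_rule)   # both print 1/2
```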