
Section 5.2 Introduction to Optimization

The previous section emphasized that knowing the sign of the derivative \(f'(x)\) of a function \(f(x)\) tells us on which intervals the function \(f(x)\) is increasing or decreasing. We can use this information to identify extreme values.

Applications involving the identification of extreme values are often called optimization problems. The task in optimization is to identify the value of an independent variable in the system that will maximize or minimize some objective. Aside from the calculus involved in finding the extreme values, creating an appropriate function to serve as the objective is often the greatest challenge.

Subsection 5.2.1 Local Extreme Values

When the derivative \(f'(x)\) changes sign at a point where \(f(x)\) is continuous, the function has a local or relative extreme value. We begin by making precise what we mean by a local extreme value. A local extreme is a point where the function attains its highest or lowest value on some interval around that point. The function might exceed that value somewhere else in its domain, but the value must be extreme within a neighborhood of the point.

Definition 5.2.1 Local (Relative) Extreme Values

A function \(f(x)\) has a local maximum at a point \(x=c\) in the domain if there is an interval \((a,b)\) with \(c \in (a,b)\) so that \(f(x) \le f(c)\) for all \(x \in (a,b)\).

A function \(f(x)\) has a local minimum at a point \(x=c\) in the domain if there is an interval \((a,b)\) with \(c \in (a,b)\) so that \(f(c) \le f(x)\) for all \(x \in (a,b)\).

The first derivative can often provide enough information to identify local extreme values. The applicable theorem is called the first derivative test.
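One standard formulation of the first derivative test (the exact phrasing in a given text may differ) is the following. Suppose \(f(x)\) is continuous at \(x=c\) and differentiable on an interval around \(c\), except possibly at \(c\) itself.

\begin{itemize}
\item If \(f'(x)\) changes sign from positive to negative at \(x=c\), then \(f(x)\) has a local maximum at \(x=c\).
\item If \(f'(x)\) changes sign from negative to positive at \(x=c\), then \(f(x)\) has a local minimum at \(x=c\).
\item If \(f'(x)\) does not change sign at \(x=c\), then \(f(x)\) has neither a local maximum nor a local minimum at \(x=c\).
\end{itemize}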


The function \begin{equation*}\displaystyle f(x) = \frac{x}{x^2+3}\end{equation*} has a derivative \begin{equation*}f'(x) = \frac{3-x^2}{(x^2+3)^2}.\end{equation*} Describe the local extreme values of \(f(x)\).
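A sketch of the analysis: the denominator \((x^2+3)^2\) is always positive, so the sign of \(f'(x)\) matches the sign of the numerator \(3-x^2\). Thus

\begin{equation*}
f'(x) \lt 0 \text{ for } x \lt -\sqrt{3}, \qquad
f'(x) \gt 0 \text{ for } -\sqrt{3} \lt x \lt \sqrt{3}, \qquad
f'(x) \lt 0 \text{ for } x \gt \sqrt{3}.
\end{equation*}

By the first derivative test, \(f(x)\) has a local minimum at \(x=-\sqrt{3}\) with value \(f(-\sqrt{3}) = -\frac{\sqrt{3}}{6}\) and a local maximum at \(x=\sqrt{3}\) with value \(f(\sqrt{3}) = \frac{\sqrt{3}}{6}\).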


We will learn later how to identify whether a function has global extreme values.


Optimization is the application of finding extreme values to either maximize or minimize some quantity of interest. In general, we will have a system where there is some variable that we have freedom to vary and some quantity that is a function of that variable that we want to be at a maximum or minimum value. The variable that we vary is the independent variable. The quantity that we optimize is the dependent variable and is often called the objective function.

Frequently, the more challenging aspect of an optimization problem is identifying the appropriate function. Once the function is identified, the task is reduced to identifying local extreme values and determining whether one of those might also be the extreme value of interest. If there are multiple independent variables, we generally find a constraint that allows us to treat one as a dependent variable so that we ultimately only have a single independent variable.

Several simple examples come from geometry where we need to construct a shape that has some feature (like a given perimeter, area or volume) and we wish to make some other feature as large as possible. We use these examples not because they are practical but because they illustrate the principles of optimization effectively.


Suppose we want to cut out a rectangle that has an area of \(500 \; \mathrm{cm}^2\), but horizontal and vertical cuts have different costs. If a horizontal cut costs twice as much per centimeter as a vertical cut, how should we choose the dimensions of the rectangle to minimize our cost?
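One way to set up this problem (assuming, for illustration, one horizontal cut whose length equals the width \(x\), one vertical cut whose length equals the height \(y\), and a vertical cutting cost of \(k\) per centimeter): the area constraint \(xy = 500\) gives \(y = \frac{500}{x}\), so the cost as a function of the single variable \(x\) is

\begin{equation*}
C(x) = 2kx + ky = 2kx + \frac{500k}{x}, \qquad x \gt 0.
\end{equation*}

Setting the derivative to zero,

\begin{equation*}
C'(x) = 2k - \frac{500k}{x^2} = 0 \quad \Rightarrow \quad x^2 = 250 \quad \Rightarrow \quad x = 5\sqrt{10} \approx 15.8 \; \mathrm{cm},
\end{equation*}

with \(y = \frac{500}{x} = 10\sqrt{10} \approx 31.6 \; \mathrm{cm}\). Since \(C'(x)\) changes sign from negative to positive at \(x = 5\sqrt{10}\), this is a local (and in fact global) minimum. Note that \(y = 2x\): the optimal rectangle makes the cheaper vertical cut twice as long as the more expensive horizontal cut.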


A biological example follows. A fundamental hypothesis of biology is that evolution drives organisms to maximize their fitness, which corresponds to the number of surviving offspring. There is often a trade-off between the number of offspring and the probability that the offspring survive. Let \(f\) (fecundity) represent the number of offspring an organism produces and let \(s\) (survival) represent the probability that an offspring will survive. The fitness is then given by \(F=f \cdot s\), the average number of offspring that survive.


Suppose that the survival probability is related to fecundity so that it decreases exponentially according to the equation \begin{equation*}s = e^{-0.012f}.\end{equation*} How many offspring should the organism have to maximize fitness?
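A sketch of the solution: substituting the survival relation into the fitness gives a function of the single variable \(f\),

\begin{equation*}
F(f) = f \, e^{-0.012f}.
\end{equation*}

By the product rule,

\begin{equation*}
F'(f) = e^{-0.012f} + f \cdot (-0.012) e^{-0.012f} = (1 - 0.012f) \, e^{-0.012f}.
\end{equation*}

Since \(e^{-0.012f} \gt 0\) for all \(f\), the sign of \(F'(f)\) matches the sign of \(1 - 0.012f\), which changes from positive to negative at \(f = \frac{1}{0.012} \approx 83.3\). The first derivative test identifies this as a local (and global) maximum, so fitness is maximized at roughly \(83\) offspring.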