When we are looking at one-dimensional root-finding algorithms, there are a variety of options. The first is the bisection method. I like the bisection method because it makes delightful use of the intermediate value theorem. We can use the fact that a continuous function takes on every value between its values at two points. It may take on a value more than once, but that is someone else’s problem. Formally stated, the intermediate value theorem says:
If $f$ is continuous on a closed interval $[a, b]$, and $y$ is any number between $f(a)$ and $f(b)$ inclusive, then there is at least one number $c$ in the closed interval $[a, b]$ such that $f(c) = y$.
As a root-finding algorithm, bisection takes advantage of the fact that we know the function’s value crosses 0. Take, for example, the plot of a continuous function that is negative at one end of an interval and positive at the other. We can see that the function crosses zero. Of course, what we see is irrelevant; what is important is that the image tells us a story.
Going back to our theorem, we can say: let $y = 0$. If $f(a) < 0$ and $f(b) > 0$ (or vice versa), the theorem guarantees at least one $c$ in $[a, b]$ with $f(c) = 0$. Bisection then repeatedly halves the interval, keeping whichever half still has the sign change, and closes in on that root.
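To make that concrete, here is a minimal sketch of bisection in Python. The name `bisect`, the tolerance, and the example function are my own illustrative choices, not anything fixed by the discussion above.

```python
def bisect(f, a, b, tol=1e-10, max_iter=200):
    """Find a root of f on [a, b], assuming f(a) and f(b) differ in sign."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = (a + b) / 2
        fm = f(m)
        if fm == 0 or (b - a) / 2 < tol:
            return m
        # Keep the half-interval whose endpoints still bracket the root.
        if fa * fm < 0:
            b, fb = m, fm
        else:
            a, fa = m, fm
    return (a + b) / 2

# Example: x^2 - 2 changes sign on [1, 2], so bisection homes in on sqrt(2).
print(bisect(lambda x: x**2 - 2, 1.0, 2.0))  # ~1.4142135623...
```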
But there’s a problem here. What if there are no negative values of the function? Take, for example, a function like $f(x) = x^2$: it has a root at $x = 0$, yet it never dips below zero, so no interval gives bisection the sign change it needs.
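As a quick check, feeding such a function to the `bisect` sketch above trips the bracketing test immediately (again, $f(x) = x^2$ is just an illustrative choice):

```python
# x^2 is never negative, so no pair of endpoints has opposite signs.
try:
    bisect(lambda x: x**2, -1.0, 1.0)
except ValueError as e:
    print(e)  # f(a) and f(b) must have opposite signs
```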
We can solve this problem with Newton’s method. Of course, the problem with Newton’s method is that you need to know the derivative of the function.
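Here is a similarly minimal sketch of Newton’s method, with the derivative passed in explicitly, which is exactly the inconvenience just mentioned. The name `newton`, the tolerance, and the starting point are again my own choices.

```python
def newton(f, df, x0, tol=1e-10, max_iter=50):
    """Find a root of f starting from x0, given its derivative df."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        # Newton step: follow the tangent line down to the x-axis.
        x = x - fx / df(x)
    return x

# Example: f(x) = x^2 never goes negative, but Newton still reaches its root.
# Convergence is slow here because x = 0 is a double root.
print(newton(lambda x: x**2, lambda x: 2 * x, x0=1.0))  # ~7.6e-06, near 0
```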