In his 1823 work, Résumé des leçons données à l'École royale polytechnique sur le calcul infinitésimal, Augustin Cauchy provided another form of the remainder for Taylor series.
Theorem 7.3.1.
Cauchy’s Form of the Remainder
Suppose \(f\) is a function such that \(f^{(n+1)}(t)\) is continuous on an interval containing \(a\) and \(x\text{.}\) Then
\begin{equation*}
f(x)-\left(\sum_{j=0}^{n}\frac{f^{(j)}(a)}{j!}(x-a)^j\right)=\frac{f^{(n+1)}(c)}{n!}(x-c)^n(x-a)
\end{equation*}
where \(c\) is some number between \(a\) and \(x\text{.}\)
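As a concrete illustration (an example not in the original text), Cauchy's form of the remainder applied to \(f(x)=e^x\) with \(a=0\) reads:

```latex
e^x \;=\; \sum_{j=0}^{n}\frac{x^j}{j!}
\;+\; \underbrace{\frac{e^{c}}{n!}\,(x-c)^n\, x}_{\text{Cauchy remainder}},
\qquad c \text{ between } 0 \text{ and } x.
```

For fixed \(x\text{,}\) since \(|x-c|\leq|x|\) and \(e^c\leq e^{|x|}\text{,}\) the remainder is bounded by \(e^{|x|}|x|^{n+1}/n!\text{,}\) which goes to \(0\) as \(n\to\infty\text{.}\)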
With this in mind, let \(x\) be a fixed number with \(-1\lt x\lt 0\text{,}\) and recall that the binomial series is the Maclaurin series for the function \(f(x)=(1+x)^{\frac{1}{2}}\text{.}\) As we saw before,
Notice that if \(-1\lt x\leq c\text{,}\) then \(0\lt 1+x\leq 1+c\text{.}\) Thus \(0\lt \frac{1}{1+c}\leq\frac{1}{1+x}\) and \(\frac{1}{\sqrt{1+c}}\leq\frac{1}{\sqrt{1+x}}\text{.}\) Hence we have
Suppose \(-1\lt x\leq c\leq 0\) and consider the function \(g(c)=\frac{c-x}{1+c}\text{.}\) Show that on \([x,0]\text{,}\) \(g\) is increasing and use this to conclude that for \(-1\lt x\leq c\leq 0\text{,}\)
\begin{equation*}
\frac{c-x}{1+c}\leq|x|\text{.}
\end{equation*}
Use this fact to finish the proof that the binomial series converges to \(\sqrt{1+x}\) for \(-1\lt x\lt 0\text{.}\)
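As a hedged sketch of one way to begin (this derivation is not part of the original exercise), differentiating \(g\) shows why it is increasing:

```latex
g(c) = \frac{c-x}{1+c},
\qquad
g'(c) = \frac{(1+c)-(c-x)}{(1+c)^2} = \frac{1+x}{(1+c)^2} > 0
\quad\text{since } -1 < x.
```

Since \(g\) is increasing on \([x,0]\text{,}\) its maximum there occurs at \(c=0\text{,}\) where \(g(0)=\frac{0-x}{1+0}=-x=|x|\) because \(x\lt 0\text{.}\)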
The proofs of both the Lagrange form and the Cauchy form of the remainder for Taylor series made use of two crucial facts about continuous functions. First, we assumed the Extreme Value Theorem: any continuous function on a closed, bounded interval attains its maximum and minimum somewhere on the interval. Second, we assumed the Intermediate Value Theorem: if a continuous function takes on two different values, then it must take on every value between those two.
Mathematicians in the late \(1700\)s and early \(1800\)s typically considered these facts to be intuitively obvious. This was natural, since the understanding of continuity at that time was purely intuitive. Intuition is a useful tool, but as we have seen before, it can be unreliable. For example, consider the following function.
Is this function continuous at 0? Near zero its graph looks like this:
but this graph must be taken with a grain of salt, as \(\sin\left(\frac{1}{x}\right)\) oscillates infinitely often as \(x\) nears zero.
Whatever your guess may be, it is clearly hard to analyze such a function armed with only an intuitive notion of continuity. We will revisit this example in the next chapter.
As with convergence, continuity is more subtle than it first appears.
We put convergence on solid ground by providing a completely analytic definition in the previous chapter. What we need to do in the next chapter is provide a completely rigorous definition for continuity.