NoteNextra-origin/content/Math4121/Math4121_L4.md
2025-07-06 12:40:25 -05:00


Math4121 Lecture 4

Chapter 5. Differentiation

The continuity of the derivative

Theorem 5.12

Suppose f is differentiable on [a,b]. Then f' attains every intermediate value between f'(a) and f'(b).

Proof:

Assume without loss of generality that f'(a)<f'(b), and let \lambda\in (f'(a),f'(b)). We need to show that there exists x\in (a,b) such that f'(x)=\lambda.

Let g(x)=f(x)-\lambda x. Then g is differentiable on [a,b] and


g'(x)=f'(x)-\lambda.

So g'(a)=f'(a)-\lambda<0 and g'(b)=f'(b)-\lambda>0.

We need to show that g'(x)=0 for some x\in (a,b).

Since g'(a)<0, \exists t_1\in (a,b) such that g(t_1)<g(a).

If not, then g(t)\geq g(a) for all t\in (a,b). But then g'(a)=\lim_{t\to a^+}\frac{g(t)-g(a)}{t-a}\geq 0, which contradicts g'(a)<0.

Similarly, since g'(b)>0, \exists t_2\in (a,b) such that g(t_2)<g(b).

Since g is continuous on [a,b], it attains its infimum on [a,b]; by the two observations above, the infimum is not attained at a or b, so it is attained at some x\in (a,b). Then this x is a local minimum of g on (a,b).

So g'(x)=0 and f'(x)=\lambda.

QED
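The construction in the proof can be checked numerically. Below is a minimal sketch (plain Python, assuming the hypothetical example f(x)=x^3 on [a,b]=[-1,2]): it minimizes g(x)=f(x)-\lambda x on a grid and checks that the interior minimizer x satisfies f'(x)=\lambda. Note that this particular f' happens to be continuous; the real content of the theorem is that the conclusion holds even when f' is not.

```python
# A numerical sketch of the proof of Theorem 5.12 (Darboux), assuming the
# example f(x) = x^3 on [a, b] = [-1, 2], so f'(a) = 3 and f'(b) = 12.
# Pick an intermediate slope lam in (f'(a), f'(b)) and minimize
# g(x) = f(x) - lam*x on a grid; the interior minimizer x has f'(x) = lam.
import math

def f(x):
    return x ** 3

a, b = -1.0, 2.0
lam = 7.0  # f'(a) = 3 < 7 < 12 = f'(b)

n = 200_000
xs = [a + (b - a) * i / n for i in range(n + 1)]
x_min = min(xs, key=lambda x: f(x) - lam * x)

# The minimizer is interior and satisfies f'(x) = 3x^2 = lam,
# i.e. x = sqrt(7/3) ~ 1.5275.
print(x_min, 3 * x_min ** 2)
```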

L'Hôpital's Rule

Theorem 5.13

Suppose f and g are differentiable on (a,b) and g'(x)\neq 0 for all x\in (a,b), where -\infty\leq a<b\leq \infty. Suppose


\frac{f'(x)}{g'(x)}\to A \text{ as } x\to a.

If


f(x)\to 0 \text{ and } g(x)\to 0 \text{ as } x\to a,

or


g(x)\to \infty \text{ as } x\to a,

then


\frac{f(x)}{g(x)}\to A \text{ as } x\to a.

Note that A here may be \infty or -\infty (we work on the extended real line).

We're using the open neighborhood definition of \to here. An open neighborhood of \infty is an interval of the form (c,\infty) for some c\in \mathbb{R}.

Recall Definition 3.1.
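Before the proof, the 0/0 case can be sanity-checked numerically. A small sketch, assuming the hypothetical example f(x)=1-\cos x, g(x)=x^2 as x\to 0^+, where A=1/2:

```python
# Numerical check of the 0/0 case of L'Hopital's rule for the assumed
# example f(x) = 1 - cos(x), g(x) = x^2 as x -> 0+. Here
# f'(x)/g'(x) = sin(x)/(2x) -> 1/2, and the rule predicts f/g -> 1/2 too.
import math

def f(x):  return 1 - math.cos(x)
def g(x):  return x * x
def fp(x): return math.sin(x)   # f'
def gp(x): return 2 * x         # g'

for x in (1e-1, 1e-2, 1e-3):
    print(f"x={x:g}  f/g={f(x)/g(x):.8f}  f'/g'={fp(x)/gp(x):.8f}")
# both columns approach A = 1/2
```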

Proof:

Main step:

Suppose -\infty\leq A<\infty, and let q>A, with corresponding neighborhood (-\infty,q). Then \exists c\in (a,b) such that \frac{f(x)}{g(x)}<q,\forall x\in (a,c).

Proof of the main step:

Fix A<r<q. Then \exists c\in (a,b) such that \frac{f'(x)}{g'(x)}<r,\forall x\in (a,c).

Now, for any a<x<y<c, by the generalized mean value theorem, \exists t\in (x,y) such that


\frac{f(x)-f(y)}{g(x)-g(y)}=\frac{f'(t)}{g'(t)}

Since t\in (a,c), we get \frac{f(x)-f(y)}{g(x)-g(y)}=\frac{f'(t)}{g'(t)}<r.
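The point t produced by the generalized (Cauchy) mean value theorem can be located numerically. A sketch, assuming the hypothetical choice f=\sin, g=\cos on (x,y)=(0.2,1):

```python
# Locate the point t from the generalized (Cauchy) mean value theorem:
# (f(x) - f(y)) / (g(x) - g(y)) = f'(t)/g'(t) for some t in (x, y).
# Assumed example: f = sin, g = cos on (x, y) = (0.2, 1.0).
import math

f, g = math.sin, math.cos
fp = math.cos                  # f'
gp = lambda t: -math.sin(t)    # g'
x, y = 0.2, 1.0

target = (f(x) - f(y)) / (g(x) - g(y))

# h(t) = 0 exactly when f'(t)/g'(t) equals the difference quotient.
def h(t):
    return fp(t) * (g(x) - g(y)) - gp(t) * (f(x) - f(y))

lo, hi = x, y                  # h changes sign on (x, y) for this example
while hi - lo > 1e-12:
    mid = (lo + hi) / 2
    if h(lo) * h(mid) <= 0:
        hi = mid
    else:
        lo = mid
t = (lo + hi) / 2
print(t, fp(t) / gp(t), target)
```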

Case 1: f(x)\to 0 and g(x)\to 0 as x\to a.

Fix y\in (a,c) and let x\to a. Since f(x)\to 0 and g(x)\to 0,


\frac{f(y)}{g(y)}=\lim_{x\to a}\frac{f(x)-f(y)}{g(x)-g(y)}\leq r<q.

So \forall y\in (a,c), \frac{f(y)}{g(y)}<q, which proves the main step in this case.

Case 2: g(x)\to \infty as x\to a.

Fix y\in (a,c). Since g(x)\to \infty as x\to a, we can find c_1\in (a,y) such that g(x)>g(y) and g(x)>0 for all x\in (a,c_1).

Therefore,


\begin{aligned}
\frac{f(x)-f(y)}{g(x)}&<\frac{r[g(x)-g(y)]}{g(x)}\\
\frac{f(x)}{g(x)}&<r-\frac{rg(y)}{g(x)}+\frac{f(y)}{g(x)}
\end{aligned}

To make the right-hand side less than q, it suffices to have


\frac{|rg(y)|+|f(y)|}{|g(x)|}<q-r

so,


|g(x)|>\frac{|rg(y)|+|f(y)|}{q-r}

There exists c_2\in (a,c_1) such that |g(x)|>\frac{|rg(y)|+|f(y)|}{q-r},\forall x\in (a,c_2).

So \forall x\in (a,c_2),


\frac{f(x)}{g(x)}<r-\frac{rg(y)}{g(x)}+\frac{f(y)}{g(x)}\leq r+\frac{|rg(y)|+|f(y)|}{|g(x)|}<r+(q-r)=q

So \forall x\in (a,c_2), \frac{f(x)}{g(x)}<q, which proves the main step in this case.

This establishes the main step. Finally, if p<A, applying the main step to -f (whose derivative quotient tends to -A<-p) gives \frac{f(x)}{g(x)}>p on some (a,c). Combining the two statements, \frac{f(x)}{g(x)}\to A as x\to a.

QED
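Note that Case 2 assumes only g(x)\to\infty; f need not tend to 0 or to any limit. A numerical sketch, assuming the hypothetical example f(x)=\ln x, g(x)=1/x as x\to 0^+ (here A=0, even though f(x)\to -\infty):

```python
# Numerical check of Case 2 (g -> infinity) for the assumed example
# f(x) = ln(x), g(x) = 1/x as x -> 0+. Then
# f'(x)/g'(x) = (1/x) / (-1/x^2) = -x -> 0, and indeed
# f(x)/g(x) = x*ln(x) -> 0, even though f(x) -> -infinity.
import math

def f(x):  return math.log(x)
def g(x):  return 1 / x
def fp(x): return 1 / x         # f'
def gp(x): return -1 / x ** 2   # g'

for x in (1e-2, 1e-4, 1e-6):
    print(f"x={x:g}  f/g={f(x)/g(x):.3e}  f'/g'={fp(x)/gp(x):.3e}")
```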