Punkyc7
Let f:[a,b][itex]\rightarrow[/itex]R be continuous on [a,b] and differentiable on (a,b). Show that if [itex]\lim_{x \to a^+} f'(x) = A[/itex], then f'(a) exists and equals A.
So I was thinking this has to do with either the mean value theorem or Darboux's theorem.
I have that
f(b) - f(a) = f'(c)(b - a) for some c in (a,b), by the mean value theorem.
From here I'm stuck on how to get x into the equation.
Would I say: let x = c?
Then we would have
f'(x) = [itex]\frac{f(b)-f(a)}{b-a}[/itex] = A.
If so, how would I work in f'(a)?
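One way the MVT idea might be made to work (a sketch, not necessarily the intended route): instead of applying the theorem once on all of [a,b], apply it on [a,x] for each x in (a,b). Then for each such x there is some [itex]c_x[/itex] with [itex]a < c_x < x[/itex] and
[tex]\frac{f(x)-f(a)}{x-a} = f'(c_x).[/tex]
As [itex]x \to a^+[/itex] we are forced to have [itex]c_x \to a^+[/itex] (since [itex]a < c_x < x[/itex]), so [itex]f'(c_x) \to A[/itex] by hypothesis. The difference quotient on the left therefore tends to A, which is exactly the statement that f'(a) exists and equals A.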