VeeEight
The following two questions are practice problems that I have been stuck on.
Use the Mean Value Theorem to show that e^x > 1 + x for all x > 0
Mean Value Theorem: If f: [a,b] to R is continuous on [a,b] and differentiable on (a,b), then there exists a point c in (a,b) where [tex]f'(c) = \frac{f(b) - f(a)}{b - a}[/tex]
I can do the question by induction, so I was thinking about first showing the inequality is true for some x > 0, and then using the Mean Value Theorem in the second step to show that this implies e·e^x > x + 2 (that is, e^(x+1) > (x+1) + 1) for all x > 0. The only thing is that I don't know how to use the Mean Value Theorem in this situation: I've tried a few choices at random, but I can't see what my interval should be in order to get the desired conclusion.
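This is not part of the problem, but as a quick numeric sanity check (my own illustration, not a proof) the claimed inequality can be spot-checked at a few sample points:

```python
import math

# Spot-check e^x > 1 + x at a few positive x values.
# This only checks the claim numerically; it is not a proof.
for x in [0.001, 0.5, 1.0, 2.0, 10.0]:
    assert math.exp(x) > 1 + x, f"inequality fails at x={x}"
print("e^x > 1 + x holds at all sampled points")
```
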
The second problem:
Suppose f'(0) exists and f(x + y) = f(x)f(y) for all x and y. Prove that f' exists for all x.
Here are some things I gathered from the given information.
f'(0) exists implies that the limit as x approaches 0 of [tex]\frac{f(x) - f(0)}{x}[/tex] exists.
Differentiating f(x + y) = f(x)f(y) with respect to x (holding y fixed, and assuming for the moment that f' exists there) gives f'(x + y) = f'(x)f(y)
so f'(0) = f'(x + (-x)) = f'(x)f(-x)
and so f'(x)f(-x) = the limit as x approaches 0 of [tex]\frac{f(x) - f(0)}{x}[/tex]
I'm not sure whether I am just going down the wrong path here; I tried to rearrange the above equation into something more useful, but I got nowhere.
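Again as my own illustration (not part of the problem): functions of the form f(x) = a^x satisfy the functional equation f(x + y) = f(x)f(y), and for such a function the difference quotient at any x numerically looks like f'(0)·f(x). A quick check:

```python
import math

# Illustration: f(x) = a**x satisfies f(x + y) == f(x) * f(y),
# the functional equation in the problem. (a = 1.7 is arbitrary.)
a = 1.7
f = lambda x: a ** x
for x, y in [(0.3, 1.2), (-2.0, 0.5), (4.0, -1.0)]:
    assert math.isclose(f(x + y), f(x) * f(y))

# Numerically, the derivative of f looks proportional to f itself:
h = 1e-6
fprime0 = (f(h) - f(0)) / h          # forward-difference estimate of f'(0)
for x in [0.0, 1.0, -2.5]:
    fprime_x = (f(x + h) - f(x)) / h
    assert math.isclose(fprime_x, fprime0 * f(x), rel_tol=1e-4)
```
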