Limit of derivative as x goes to infinity

In summary: since [itex]f'[/itex] is continuous and has [itex]f[/itex] as an antiderivative, and both limits as [itex]x\to\infty[/itex] are assumed to exist, it follows that [itex]\displaystyle\lim_{x\to\infty}f'(x) = 0[/itex].
  • #1
Adorno

Homework Statement


Suppose that [itex]f[/itex] and [itex]f'[/itex] are continuous functions on [itex]\mathbb{R}[/itex], and that [itex]\displaystyle\lim_{x\to\infty}f(x)[/itex] and [itex]\displaystyle\lim_{x\to\infty}f'(x)[/itex] exist. Show that [itex]\displaystyle\lim_{x\to\infty}f'(x) = 0[/itex].


Homework Equations


Definition of derivative: [itex]f'(x) = \displaystyle\lim_{h\to0}\frac{f(x+h) - f(x)}{h}[/itex]
Fundamental theorem of calculus: [itex]f(x) = \frac{d}{dx}\displaystyle\int^x_a f(t)dt[/itex]


The Attempt at a Solution


At first I just wrote it in terms of the definition of the derivative: [tex]\displaystyle\lim_{x\to\infty}f'(x) = \displaystyle\lim_{x\to\infty}\left(\displaystyle\lim_{h\to0}\frac{f(x+h) - f(x)}{h}\right)[/tex] Then I thought that you could change the order of the limits (since both limits exist and the function [itex]\frac{f(x+h) - f(x)}{h}[/itex] is continuous, right?):
[tex]\displaystyle\lim_{x\to\infty}f'(x) = \displaystyle\lim_{h\to0}( \displaystyle\lim_{x\to\infty} \frac{f(x+h) - f(x)}{h} )[/tex] And then since [itex]h[/itex] is just a constant it should follow that [itex]\displaystyle\lim_{x\to\infty}f(x+h) = \displaystyle\lim_{x\to\infty}f(x) = c[/itex], so that [itex]\displaystyle\lim_{x\to\infty}(f(x+h) - f(x)) = c - c = 0[/itex]. Then we have [tex]\displaystyle\lim_{x\to\infty}f'(x) = \displaystyle\lim_{h\to0}0 = 0.[/tex] I'm not sure about this though. It seems a little too simple and doesn't seem to use all of the information given. Also, I'm not sure if I'm allowed to change the order of the limits, so maybe this doesn't work at all. Could anyone help?
 
  • #2
Adorno said:
Then I thought that you could change the order of the limits (since both limits exist and the function [itex]\frac{f(x+h) - f(x)}{h}[/itex] is continuous right?)


This seems like a mighty big leap of logic. Maybe it's valid, maybe it's not, but either way it's not obvious. Can you say exactly what theorem you are using?
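For what it's worth, interchanging limits can fail even when both iterated limits exist. A standard cautionary example (not from this thread, just to illustrate the danger) is [tex]g(x,h) = \frac{xh}{xh+1}, \qquad \lim_{h\to0}\left(\lim_{x\to\infty}g(x,h)\right) = 1 \neq 0 = \lim_{x\to\infty}\left(\lim_{h\to0}g(x,h)\right),[/tex] so some extra hypothesis, such as uniform convergence in one of the variables (the Moore–Osgood theorem), is needed to justify the swap.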

Offhand I would think that you should start with the fact that

[tex]\lim_{x \rightarrow \infty} f'(x)[/tex]

exists. Call the limit L. Then see if you can obtain a contradiction if you assume that L > 0 or L < 0.
 
  • #3
Note that the assumption

[tex]\lim_{x\rightarrow +\infty}{f^\prime(x)}~\text{exists}[/tex]

is necessary. If the assumption does not hold, then

[tex]f(x)=\frac{\sin(x^3)}{x}[/tex]

is a counterexample. I'm just telling you this because finding that very counterexample was once one of my exam questions, and I thought you might find it interesting :smile:
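To spell out why it works (a quick check that isn't in the original post): here [itex]f(x) \to 0[/itex] as [itex]x \to \infty[/itex], but differentiating gives [tex]f'(x) = 3x\cos(x^3) - \frac{\sin(x^3)}{x^2},[/tex] and the [itex]3x\cos(x^3)[/itex] term oscillates with ever-growing amplitude, so [itex]\displaystyle\lim_{x\to\infty}f'(x)[/itex] does not exist.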
 
  • #4
Could you assume the opposite and say that, if the limit as x->infinity of f' doesn't equal 0, then the function would be divergent (i.e., it would imply that the limit as x->infinity of f doesn't exist)? What is your given definition of convergent?
 
  • #5
That is interesting, micromass. I wasn't sure whether that condition was actually necessary.

TylerH and jbunniii: Yeah, I thought about that as well, but I'm not sure where the contradiction would come from. As for a definition of convergence, I take it you mean this: if [itex]\displaystyle\lim_{x\to\infty}f(x) = L[/itex] then given any [itex]\epsilon > 0[/itex] there exists some [itex]N[/itex] such that [itex]|f(x) - L| < \epsilon[/itex] [itex]\forall x > N[/itex]. I don't know about using this because it doesn't involve the derivative at all.

Also, yes, I think my initial method was totally wrong.
 
  • #6
Let

[tex]\lim_{x \rightarrow \infty} f'(x) = L[/tex].

If L > 0, then there is some N such that

[tex]f'(x) > L/2[/tex]

for all x > N.

Now what happens if you integrate? Can you get a contradiction?
 
  • #7
I see. So if we integrate we get something like [itex] f(x) > (L/2)x + c[/itex]. And the right-hand side goes to infinity as [itex] x \to \infty [/itex], which is a contradiction since [itex]\displaystyle\lim_{x \to \infty}f(x)[/itex] was assumed to exist. Is that right?
 
  • #8
Adorno said:
I see. So if we integrate we get something like [itex] f(x) > (L/2)x + c[/itex]. And the right-hand side goes to infinity as [itex] x \to \infty [/itex], which is a contradiction since [itex]\displaystyle\lim_{x \to \infty}f(x)[/itex] was assumed to exist. Is that right?

That is good! Except for one small thing. You said that [itex]f'(x) > L/2[/itex]; on its own, this does not mean that

[tex]\int_0^x{f'(t)\,dt}>\int_0^x{\frac{L}{2}\,dt}[/tex]

It merely means that [itex]\geq[/itex] holds. But your proof still holds...
 
  • #9
Adorno said:
I see. So if we integrate we get something like [itex] f(x) > (L/2)x + c[/itex]. And the right-hand side goes to infinity as [itex] x \to \infty [/itex], which is a contradiction since [itex]\displaystyle\lim_{x \to \infty}f(x)[/itex] was assumed to exist. Is that right?

Yes, more or less. Just be careful about the interval over which you integrate. The inequality only holds for x > N.

Note that you use the continuity of [itex]f'[/itex] in order to integrate it and to invoke the fundamental theorem of calculus.

By the way, I don't think the continuity of [itex]f'[/itex] is required, as long as all the other assumptions are satisfied. Integrability of [itex]f'[/itex] and the fact that [itex]f'[/itex] has [itex]f[/itex] as an antiderivative should suffice. You also don't need the assumption that [itex]f[/itex] is continuous, as it's automatically true given the existence of [itex]f'[/itex] (at least for sufficiently large x).
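Collecting the steps above into one chain (this just assembles what's already in the thread): for [itex]x > N[/itex], [tex]f(x) = f(N) + \int_N^x f'(t)\,dt \geq f(N) + \frac{L}{2}(x - N) \to \infty \quad \text{as } x \to \infty,[/tex] which contradicts the assumption that [itex]\displaystyle\lim_{x\to\infty}f(x)[/itex] exists. The case [itex]L < 0[/itex] is symmetric (apply the same argument to [itex]-f[/itex]), so [itex]L = 0[/itex].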
 

FAQ: Limit of derivative as x goes to infinity

What is the limit of a derivative as x approaches infinity?

The limit of a derivative as x approaches infinity describes the eventual slope of a function as its input grows without bound. It tells you whether the graph levels off (limit 0), settles toward a fixed steepness (a nonzero limit), or never settles on a slope at all (the limit does not exist).

How do you calculate the limit of a derivative as x approaches infinity?

To calculate it, differentiate the function and then evaluate [itex]\displaystyle\lim_{x\to\infty}f'(x)[/itex] using the usual limit techniques (comparing dominant terms, L'Hôpital's rule, or squeeze arguments). Note that infinity is not a number you can "plug in"; you must take a genuine limit.
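For example (a simple illustration, not taken from the thread above): [tex]f(x) = \arctan x, \qquad f'(x) = \frac{1}{1+x^2}, \qquad \lim_{x\to\infty} f'(x) = 0,[/tex] which is consistent with the theorem discussed above, since [itex]\displaystyle\lim_{x\to\infty}\arctan x = \pi/2[/itex] also exists.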

What does it mean if the limit of a derivative as x approaches infinity is positive?

If the limit of the derivative as x approaches infinity is a positive number L, the function eventually climbs with slope close to L, so it grows roughly like Lx and tends to infinity. In particular, f itself cannot converge to a finite limit, which is exactly the contradiction exploited in the thread above.

What does it mean if the limit of a derivative as x approaches infinity is negative?

Symmetrically, if the limit of the derivative is a negative number L, the function eventually descends with slope close to L and tends to negative infinity, so again f cannot converge. Only a limiting slope of 0 is compatible with f itself having a finite limit, as the sketch below makes precise.
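One way to make both of the preceding answers precise (a standard fact, sketched here rather than quoted from the thread): if [itex]f'(x) \to L[/itex] as [itex]x \to \infty[/itex], then by L'Hôpital's rule [tex]\lim_{x\to\infty}\frac{f(x)}{x} = \lim_{x\to\infty}\frac{f'(x)}{1} = L,[/tex] so whenever [itex]L \neq 0[/itex] the function grows (or falls) roughly like [itex]Lx[/itex] and cannot have a finite limit.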

Why is it important to understand the limit of a derivative as x approaches infinity?

Understanding the limit of a derivative as x approaches infinity allows us to analyze the behavior of functions as their input values become very large. This can be useful in various fields of science, such as physics and economics, where we are interested in understanding how a system behaves at extreme values.
