Limit Theorem: Does f'(x) Approach 0 as x Goes to Infinity?

  • Thread starter: vincent_vega
  • Tags: Limits
In summary: the OP asks whether, given that f(x) approaches 0 as x goes to infinity, its derivative f'(x) must also approach 0.
  • #1
vincent_vega
Suppose there is a function f(x), and its limit as x goes to infinity is 0.

Is there a theorem that says its derivative, f'(x), also approaches 0 as x goes to infinity?

Thanks.
 
  • #2
Probably not, since it is not true. Consider

[tex]f(x)=\frac{\sin(x^2)}{x}[/tex]

What is true is that if [itex]f^\prime(x)[/itex] has a limit as x goes to infinity, then this limit must be 0.
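A quick numerical look at this example (plain Python; the derivative formula f'(x) = 2cos(x²) − sin(x²)/x² follows from the quotient rule): f(x) shrinks like 1/x, but f'(x) keeps swinging between roughly −2 and 2.

```python
import math

# Counterexample sketch: f(x) = sin(x^2)/x tends to 0, but its
# derivative f'(x) = 2*cos(x^2) - sin(x^2)/x^2 keeps oscillating.

def f(x):
    return math.sin(x**2) / x

def f_prime(x):
    return 2 * math.cos(x**2) - math.sin(x**2) / x**2

# Sample f and f' at large x: f is tiny, f' is not.
xs = [1000 + 0.001 * k for k in range(2000)]
max_f = max(abs(f(x)) for x in xs)
max_fp = max(abs(f_prime(x)) for x in xs)
print(max_f)   # at most 0.001
print(max_fp)  # stays close to 2
```

The sample window and step size are arbitrary choices; any window of large x shows the same picture.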
 
  • #3
If f is differentiable in (-oo,oo), use the mean value theorem:

f(b)-f(a)=f'(c)(b-a). Maybe you can partition [0,oo) into [0,1],[1,2],...,[n,n+1].

Then:

f(1)=f(0)+f'(c_0) ; f(2)=f(1)+f'(c_1) ; ... ; f(n)=f(n-1)+f'(c_(n-1)).

Then you can find a closed form for f(n).
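Telescoping those MVT steps gives a closed form of the kind suggested above:

[tex]f(n) = f(0) + \sum_{k=0}^{n-1} f'(c_k), \qquad c_k \in (k, k+1).[/tex]

If f(n) → 0, the partial sums on the right must converge, but note this constrains f' only at the sample points c_k, not everywhere.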
 
  • #4
Bacle2 said:
If f is differentiable in (-oo,oo) , use the mean value theorem:
micromass's example shows this isn't going to work.
 
  • #5
Of course I'm assuming f'(x) is defined as x-->oo , that is implied in my argument.

Basically, take [0, b]. Then

f(b)-f(0)=f'(c)·b

If f' is defined everywhere and we let b-->oo, then f(b) can approach 0 only if f'(c)·b stays bounded, which forces f'(c) to decrease to zero along those points c.

The OP's phrase "its derivative approaches 0 as x --> infinity" seems to me to assume that the derivative is defined as x-->oo.
 
Last edited:
  • #6
Bacle2 said:
Of course I'm assuming f'(x) is defined as x-->oo , that is implied in my argument.
You mean, that the limit is defined? If your argument relies on that then it needs to be stated.
 
  • #7
From the OP: "its derivative, f'(x), also approaches 0 as x goes to infinity?"

This looks to me like an assumption that f'(x) is defined as x-->oo.
 
  • #8
Maybe the OP can clarify the conditions of the problem to eliminate ambiguity?
 
  • #9
Bacle2 said:
From the OP:" its derivative, f'(x), also approaches 0 as x goes to infinity? "
This looks to me like an assumption that f'(x) is defined as x-->oo
No, that's what he's trying to prove. He didn't say "if f' approaches a limit, that limit is 0", so the most reasonable interpretation is that he wants to prove "f' approaches a limit, and that limit is 0".
 
  • #10
It's not clear to me either way. f'(x) is said to exist without any qualification; I see no reason to assume it exists only on a specific subset of the real line, nor reason to assume otherwise. In your interpretation, why didn't the OP say something like "is f'(x) defined, and if so, what is its limit?" He refers to f'(x), which asserts that f'(x) exists; it may exist somewhere or everywhere.

The problem is posed sloppily; I think, out of basic manners, the OP should clarify.
 
  • #11
Bacle2 said:
f'(x) is said to exist without any qualification

Then it does not follow that the limit ##\lim_{x\rightarrow\infty}f'(x)## is defined. The statement "f' exists" is limited to ##x \in \mathbb{R}## because the domain of f is the real numbers and not the extended reals. What happens as ##x\rightarrow\infty## is considered a separate condition, and must explicitly be mentioned.
 

FAQ: Limit Theorem: Does f'(x) Approach 0 as x Goes to Infinity?

What is the Limit Theorem?

In this context, the relevant idea is the limit of a function: a foundational concept in calculus that describes how a function behaves as its input, or independent variable, approaches a specific value or grows without bound. Limits let us analyze a function near points where it is not defined, or where its value is difficult to determine directly.

What is f'(x)?

f'(x) represents the derivative of the function f(x), which describes the rate of change of the function at a particular point. It is defined as the slope of the tangent line at that point, and it helps us understand how the function is changing at a specific input value.
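As an illustration (a minimal sketch, not part of the original thread), the derivative's limit definition can be approximated numerically with a central difference quotient; the step size h here is an arbitrary illustrative choice:

```python
import math

# Central-difference approximation to f'(x): the slope of the
# secant line through (x-h, f(x-h)) and (x+h, f(x+h)).
def derivative(f, x, h=1e-6):
    return (f(x + h) - f(x - h)) / (2 * h)

# The slope of sin at 0 should be cos(0) = 1.
print(derivative(math.sin, 0.0))
```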

How does f'(x) approach 0 as x goes to infinity?

Not necessarily. Even if f(x) approaches a constant value as x increases, f'(x) need not approach 0; for example, f(x) = sin(x²)/x approaches 0 while its derivative keeps oscillating. What is true is that if the limit of f'(x) as x goes to infinity exists at all, and f(x) approaches a finite constant, then that limit must be 0, meaning the function becomes increasingly flat, or horizontal, at larger values of x.
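A minimal numeric illustration of the flattening case, using the simple example f(x) = 1/x (an assumption of this sketch, chosen because its derivative's limit clearly exists):

```python
# For f(x) = 1/x, both f and f' = -1/x**2 shrink as x grows,
# so here the derivative's limit exists and equals 0.
def f(x):
    return 1.0 / x

def f_prime(x):
    return -1.0 / x**2

for x in (10, 100, 1000):
    print(x, f(x), f_prime(x))
```

Note how f' vanishes even faster than f: the curve flattens out as it approaches its horizontal asymptote.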

Why is it important to understand the behavior of f'(x) as x goes to infinity?

Understanding the behavior of f'(x) as x goes to infinity can help us determine the overall behavior of the function f(x) at large input values. It can also help us identify important characteristics of the function, such as the presence of horizontal asymptotes or the existence of a limit at infinity.

How is the Limit Theorem used in real-world applications?

The Limit Theorem is used in various fields such as physics, engineering, and economics to model and predict real-world phenomena. For example, it can be used to model population growth or predict stock market trends. In these applications, understanding how a function behaves at large input values is crucial for making accurate predictions and decisions.
