Limits and derivative: is this proof accurate enough?

In summary: the limit cannot simply be split into ##\lim f(x) + \lim f'(x)##, nor can the ##x \to \infty## and ##h \to 0## limits be interchanged, since neither step is valid without knowing the individual limits exist; the suggested route is an ##\epsilon##-style argument using the fact that ##f(x)+f'(x)## is eventually arbitrarily small.
  • #1
Felafel

Homework Statement



f is differentiable in ##\mathbb{R^+}## and
##\displaystyle \lim_{x \to \infty} (f(x)+f'(x))=0##
Prove that
##\displaystyle \lim_{x \to \infty}f(x)=0##
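For a quick sanity check of the claim (an illustrative example, not part of the assigned problem): take ##f(x)=\frac{1}{x}##, so ##f'(x)=-\frac{1}{x^2}##. Then ##\displaystyle \lim_{x \to \infty}\left(\frac{1}{x}-\frac{1}{x^2}\right)=0## and, consistently, ##\displaystyle \lim_{x \to \infty}\frac{1}{x}=0##.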

The Attempt at a Solution



I can split the limit in two:
##(\displaystyle \lim_{x \to \infty} f(x)+\displaystyle \lim_{x \to \infty} f'(x))=0##
I consider the second one and, by the definition of the derivative, I have:
##\displaystyle \lim_{x \to \infty} \displaystyle \lim_{h \to 0} \frac{f(x+h)-f(x)}{h}##
As f is differentiable, the second limit exists and is 0.
So, I have ##\displaystyle \lim_{x \to \infty} 0 =0##
And then, by hypothesis:
##\displaystyle \lim_{x \to \infty} (f(x)+0)=0##
Are the steps logically correct?
Thank you in advance!
 
  • #2
Felafel said:
I can split the limit in two:
##(\displaystyle \lim_{x \to \infty} f(x)+\displaystyle \lim_{x \to \infty} f'(x))=0##

No, you can't do that. You can only do that if you know that both limits exist. So:

[tex]\lim_{x\rightarrow +\infty} (f(x)+g(x)) = \lim_{x\rightarrow +\infty} f(x) + \lim_{x\rightarrow +\infty} g(x)[/tex]

is not true in general; it holds only if you know that the two limits on the right actually exist.

As a counterexample:

[tex]0=\lim_{x\rightarrow +\infty} (x-x) \neq \lim_{x\rightarrow +\infty} x + \lim_{x\rightarrow +\infty} (-x),[/tex]

since neither limit on the right-hand side exists.
 
  • #3
micromass said:
No, you can't do that. You can only do that if you know that both limits exist.

Thanks, you saved me from a major mistake!
Maybe I should prove it this way, then:

By definition of derivative:
##f(x)+f'(x)=f(x)+ \displaystyle \lim_{h \to 0} \frac{f(x+h)-f(x)}{h} =##

##= \displaystyle \lim_{h \to 0} \left( f(x)-\frac{f(x)}{h}+\frac{f(x+h)}{h} \right) = \displaystyle \lim_{h \to 0} \left( f(x)\left(1-\tfrac{1}{h}\right)+\frac{f(x+h)}{h} \right)##

and, by hypothesis:

##\displaystyle \lim_{x \to \infty} \displaystyle \lim_{h \to 0} \left( f(x)\left(1-\tfrac{1}{h}\right)+\frac{f(x+h)}{h} \right) = 0##

Say:

##\displaystyle \lim_{x \to \infty}f(x)=L## so that the expression above becomes:

##\displaystyle \lim_{h \to 0} \left( L\left(1-\tfrac{1}{h}\right)+\frac{L}{h} \right) = 0##

##\displaystyle \lim_{h \to 0} L=0 \Rightarrow \displaystyle \lim_{x\to \infty }f(x)=L=0##
 
  • #4
Now you assume that

[tex]\lim_{h\rightarrow 0} \lim_{x\rightarrow +\infty} f(x,h) = \lim_{x\rightarrow +\infty}\lim_{h\rightarrow 0} f(x,h).[/tex]

This is also not true in general.
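For example (a standard counterexample of this kind, added for illustration), take ##f(x,h)=\frac{xh}{xh+1}## with ##x,h>0##:

[tex]\lim_{h\rightarrow 0^+}\lim_{x\rightarrow +\infty} \frac{xh}{xh+1} = \lim_{h\rightarrow 0^+} 1 = 1, \qquad \lim_{x\rightarrow +\infty}\lim_{h\rightarrow 0^+} \frac{xh}{xh+1} = \lim_{x\rightarrow +\infty} 0 = 0.[/tex]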
 
  • #5
micromass said:
Now you assume that

[tex]\lim_{h\rightarrow 0} \lim_{x\rightarrow +\infty} f(x,h) = \lim_{x\rightarrow +\infty}\lim_{h\rightarrow 0} f(x,h).[/tex]

This is also not true in general.

Oh... I've run out of good ideas then, any hint?
 
  • #6
The intuition is this: if x is very large, then f(x) is very close to -f'(x). In particular, if f(x) is very large, then f'(x) is very negative, and thus f(x) decreases very fast.

Try to work with an epsilon-delta definition. What does it mean that f(x) does not tend to 0??
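To see this intuition numerically (a rough sketch with an arbitrarily chosen right-hand side, not something from the original hint): pick any ##g(x) \to 0## and define ##f## through ##f'(x) = -f(x) + g(x)##, so that ##f(x)+f'(x)=g(x) \to 0## by construction. A crude Euler integration then shows ##f## itself decaying toward 0 even from a large starting value:

[code]
# Sketch: integrate f' = -f + g with g(x) -> 0 and watch f(x) -> 0.
# The choice g(x) = 1/(1+x), the step size and the horizon are arbitrary.

def g(x):
    return 1.0 / (1.0 + x)      # g(x) -> 0 as x -> infinity

f, x, dx = 5.0, 0.0, 1e-3       # deliberately large starting value f(0) = 5
while x < 50.0:
    f += dx * (-f + g(x))       # forward Euler step for f' = -f + g(x)
    x += dx

print(f)                        # ~0.02: f(50) is already close to 0
[/code]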
 
  • #7
Can I simply say, then, that:
##\forall \epsilon >0\ \exists \delta>0## such that ##0<|x-x_0|<\delta \Rightarrow |f(x)+f'(x)-0|<\epsilon##
And so, since the function is positive because it is defined on ##\mathbb{R^+}##:
##0<|f(x)|< \epsilon - f'(x) \Rightarrow |f(x)|< \epsilon##, and by the squeeze rule ##\lim f(x)=0##?
 

FAQ: Limits and derivative: is this proof accurate enough?

What is a limit in calculus?

A limit in calculus refers to the value that a function approaches as the input of the function gets closer and closer to a specific value. It is used to describe the behavior of a function near a certain point, and it is essential in understanding the concept of continuity and differentiability.
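In symbols (the standard definition, included here for reference), ##\lim_{x \to a} f(x) = L## means:

[tex]\forall \epsilon>0\ \exists \delta>0: \quad 0<|x-a|<\delta \implies |f(x)-L|<\epsilon,[/tex]

and for a limit as ##x\rightarrow +\infty## the condition ##0<|x-a|<\delta## is replaced by ##x>M## for a sufficiently large ##M## depending on ##\epsilon##.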

What is a derivative?

A derivative is a mathematical concept that represents the rate of change of a function at a specific point. It is calculated by finding the slope of the tangent line at that point, and it can be used to determine the instantaneous rate of change of a function.
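Formally (included here for reference), the derivative of ##f## at a point ##a## is the limit of difference quotients,

[tex]f'(a)=\lim_{h\rightarrow 0}\frac{f(a+h)-f(a)}{h},[/tex]

provided this limit exists.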

How do you prove the accuracy of a limit or derivative?

To justify a claimed limit or derivative rigorously, you work from the definitions: for example, the epsilon-delta definition of a limit (or its epsilon-M analogue for limits at infinity), or the limit laws, which are themselves proved from that definition. A claimed value is established once you show it satisfies the definition for every ##\epsilon > 0##.
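As a small worked example (added for illustration) of arguing directly from the definition: to verify ##\lim_{x \to \infty} \frac{1}{x} = 0##, fix ##\epsilon>0## and take ##M=\frac{1}{\epsilon}##; then

[tex]x>M \implies \left|\frac{1}{x}-0\right|=\frac{1}{x}<\frac{1}{M}=\epsilon.[/tex]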

Are there any common mistakes when proving limits and derivatives?

Yes, there are common mistakes when proving limits and derivatives. These include not properly understanding the definition or concept, making algebraic errors, and not showing all the necessary steps in the proof. It is important to carefully follow the steps and understand the concepts involved in order to avoid these mistakes.

How important are limits and derivatives in calculus?

Limits and derivatives are fundamental concepts in calculus and are essential in understanding and applying the principles of calculus. They are used in various areas of mathematics, physics, engineering, and economics, making them crucial for students pursuing these fields of study.
