A question about limits and infinity

In summary, the question asks if there is a more rigorous way to prove that a function such as f(x) = 1/x goes to infinity as x approaches 0. The conversation also explores a similar example with a function involving a square root. The expert suggests using the concept of delta epsilon to show that the function is sufficiently large for all values of x that are close to 0. However, it is noted that the epsilon-delta wording does not apply to infinity as it is not a number.
  • #1
0kelvin
I hope I can make this question clear enough.

When we have a function such as f(x) = 1/x and calculate the one-sided limits at x = 0, the right-hand limit goes to positive infinity and the left-hand limit goes to negative infinity. In calculus we are plugging in values closer and closer to zero and seeing what the value of f(x) is. For example: 1/10, then 1/100, then 1/500, and so on. Is there a more rigorous way to prove that the function is in fact going to infinity?

Another example: f(x) = 1/[sqrt(9 + x) - 3]. If x = 0 we have a division by zero. Now if I plug in something small such as 10^(-10), it's not zero, but we are going beyond the precision of a hand calculator. If I plot this graph and zoom in enough, at some point Google warns that the graph may be wrong due to precision errors. Is there some theory behind such precision errors that may lead us to think that the graph is increasing or decreasing when it's not?
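(A minimal numerical sketch of the precision issue, added here and not part of the original post: for tiny x, sqrt(9 + x) rounds to a double that is almost exactly 3, so the subtraction sqrt(9 + x) - 3 cancels away most of the significant digits. The algebraically equivalent form (sqrt(9 + x) + 3)/x avoids that subtraction.)

```python
import math

def f_naive(x):
    # Direct evaluation: the subtraction cancels badly for |x| << 1.
    return 1.0 / (math.sqrt(9.0 + x) - 3.0)

def f_stable(x):
    # Algebraically equivalent form (multiply by the conjugate): no cancellation.
    return (math.sqrt(9.0 + x) + 3.0) / x

for x in (1e-2, 1e-6, 1e-10, 1e-14, 1e-16):
    try:
        naive = f"{f_naive(x):.6e}"
    except ZeroDivisionError:
        # Around x ~ 1e-16, 9 + x rounds to exactly 9.0, so sqrt(9 + x) - 3 == 0.
        naive = "division by zero"
    print(f"x = {x:.0e}   naive = {naive}   stable = {f_stable(x):.6e}")
```

The naive form loses accuracy long before it fails outright, which is consistent with a plotting tool warning that the graph may be unreliable at small x.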
 
  • #2
0kelvin said:
I hope I can make this question clear enough.

When we have a function such as f(x) = 1/x and calculate the one-sided limits at x = 0, the right-hand limit goes to positive infinity and the left-hand limit goes to negative infinity. In calculus we are plugging in values closer and closer to zero and seeing what the value of f(x) is. For example: 1/10, then 1/100, then 1/500, and so on. Is there a more rigorous way to prove that the function is in fact going to infinity?
You show that given any number ##M>0## there will be a value ##x>0## such that ##f(x)>M##. Since ##M## was arbitrarily large, the function ##f(x)## grows beyond all bounds. The negative version is analogous.

0kelvin said:
Another example: f(x) = 1/[sqrt(9 + x) - 3]. If x = 0 we have a division by zero. Now if I plug in something small such as 10^(-10), it's not zero, but we are going beyond the precision of a hand calculator. If I plot this graph and zoom in enough, at some point Google warns that the graph may be wrong due to precision errors. Is there some theory behind such precision errors that may lead us to think that the graph is increasing or decreasing when it's not?
I don't understand this. Do you mean error calculations? This case isn't any different from the previous one.
 
  • #3
0kelvin said:
Another example: f(x) = 1/[sqrt(9 + x) - 3]. If x = 0 we have a division by zero. Now if I plug in something small such as 10^(-10), it's not zero, but we are going beyond the precision of a hand calculator. If I plot this graph and zoom in enough, at some point Google warns that the graph may be wrong due to precision errors. Is there some theory behind such precision errors that may lead us to think that the graph is increasing or decreasing when it's not?
Plugging in numbers might not be very helpful, but sketching a graph of ##f(x) = \frac 1 {\sqrt{x + 9} - 3}## would be very helpful.

First, sketch the graph of ##y = \sqrt{x + 9} - 3##. The left endpoint of this graph is at (-9, -3), and the graph passes through the origin. It is strictly increasing on its domain.

The reciprocal function, ##f(x) = \frac 1 {\sqrt{x + 9} - 3}##, will have a vertical asymptote at x = 0. Since the denominator is negative for -9 ≤ x < 0, the graph of f goes off to negative infinity as x approaches 0 from the left. Since the denominator is positive for x > 0, the graph of f goes off to positive infinity as x approaches 0 from the right.
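(A small algebraic supplement, added here and not part of the post above: multiplying by the conjugate makes the behaviour near x = 0 explicit,
$$f(x) = \frac{1}{\sqrt{x+9}-3} = \frac{\sqrt{x+9}+3}{(\sqrt{x+9}-3)(\sqrt{x+9}+3)} = \frac{\sqrt{x+9}+3}{x},$$
so for x near 0 the numerator is close to 6 and f(x) behaves like 6/x, which matches the vertical asymptote described above.)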
 
  • #4
fresh_42 said:
You show that given any number ##M>0## there will be a value ##x>0## such that ##f(x)>M##. Since ##M## was arbitrarily large, the function ##f(x)## grows beyond all bounds. The negative version is analogous.


I don't understand this. Do you mean error calculations? This case isn't any different from the previous one.
You need to use the delta-epsilon idea. Basically, if we add a positive value, delta, to x = 0 and constrain this value to be closer and closer to 0, you need to prove that 1/x will always be larger than M, a large constant, for all such values of delta. For the negative side, just subtract delta.
##0 < x < \delta \implies 1/x > 1/\delta##
Setting ##1/\delta = M## gives ##1/x > M##.

You have to prove that such an M exists and can be at most ##1/\delta## for every delta.
 
  • #5
Trollfaz said:
You need to use the delta-epsilon idea. Basically, if we add a positive value, delta, to x = 0 and constrain this value to be closer and closer to 0, you need to prove that 1/x will always be larger than M, a large constant, for all such values of delta. For the negative side, just subtract delta.
##0 < x < \delta \implies 1/x > 1/\delta##
Setting ##1/\delta = M## gives ##1/x > M##.

You have to prove that such an M exists and can be at most ##1/\delta## for every delta.
You mean that finding one value of ##x## is not enough? You need to show that the function is sufficiently large for all ##0 < x < \delta##, for some ##\delta## that depends on ##M##?
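(For reference, a short worked sketch added here, using the standard definition of an infinite one-sided limit:
$$\lim_{x\to 0^+} f(x) = +\infty \quad\Longleftrightarrow\quad \forall\, M>0\ \exists\, \delta>0:\ 0<x<\delta \implies f(x)>M.$$
For ##f(x)=1/x##, given ##M>0## choose ##\delta = 1/M##; then ##0<x<\delta## gives ##1/x > 1/\delta = M##, so ##1/x \to +\infty## as ##x \to 0^+##.)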
 
  • #6
Trollfaz said:
You need to use the delta-epsilon idea. Basically, if we add a positive value, delta, to x = 0 and constrain this value to be closer and closer to 0, you need to prove that 1/x will always be larger than M, a large constant, for all such values of delta. For the negative side, just subtract delta.
##0 < x < \delta \implies 1/x > 1/\delta##
Setting ##1/\delta = M## gives ##1/x > M##.

You have to prove that such an M exists and can be at most ##1/\delta## for every delta.
The ##\varepsilon -\delta## wording doesn't apply to infinity. The definition of ##\longrightarrow \pm \infty ## is different from ##\longrightarrow L## since infinity isn't a number.
 
  • #7
fresh_42 said:
The ##\varepsilon -\delta## wording doesn't apply to infinity. The definition of ##\longrightarrow \pm \infty ## is different from ##\longrightarrow L## since infinity isn't a number.
That may be so, but it doesn't change the fact that this is not correct:

fresh_42 said:
You show that given any number ##M>0## there will be a value ##x>0## such that ##f(x)>M##. Since ##M## was arbitrarily large, the function ##f(x)## grows beyond all bounds. The negative version is analogous.
Finding one value of ##x > 0## is not enough. By that definition the function ##e^x## would be unbounded as ##x \rightarrow 0##: for any ##M > 0## there is some ##x > 0## with ##e^x > M## (just take ##x## large enough), even though ##e^x \to 1## as ##x \to 0##.
 
  • #8
PeroK said:
That may be so, but it doesn't change the fact that this is not correct:

Indeed! That was more than just sloppy of me. I totally forgot the neighborhood.
 

FAQ: A question about limits and infinity

What is the concept of limits and infinity in mathematics?

Limits and infinity are mathematical concepts that deal with the behavior of a function as its input values approach a certain value or go towards infinity. Essentially, it is a way to understand what happens to a function as it gets closer and closer to a specific point or as its input values get larger and larger.
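(For completeness, the standard formal statement behind this idea, added here as a sketch since the FAQ does not state it, is
$$\lim_{x\to a} f(x) = L \quad\Longleftrightarrow\quad \forall\,\varepsilon>0\ \exists\,\delta>0:\ 0<|x-a|<\delta \implies |f(x)-L|<\varepsilon,$$
with a separate definition, using an arbitrary bound ##M## instead of ##\varepsilon##, for limits that are ##\pm\infty##.)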

How do you calculate limits and infinity in a mathematical function?

To calculate limits and infinity in a mathematical function, you can use various techniques such as substitution, factoring, and algebraic manipulation. However, in more complex cases, you may need to use advanced techniques such as L'Hopital's Rule or Taylor series expansions.
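(For instance, a standard textbook illustration, added here and not part of the FAQ: L'Hopital's Rule applied to the indeterminate form 0/0 gives
$$\lim_{x\to 0}\frac{\sin x}{x} = \lim_{x\to 0}\frac{\cos x}{1} = 1.$$)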

What is the significance of limits and infinity in real-world applications?

Limits and infinity have many real-world applications, especially in fields such as physics, engineering, and economics. For example, they can be used to model the behavior of a system as it approaches a critical point or to understand the behavior of a function as its input values get larger and larger.

Can limits and infinity have multiple values?

No, a limit, if it exists, has only one value. This value can be a real number, or the function may be said to tend to infinity or negative infinity. However, in some cases a function may approach one value at a specific point from the left and a different value from the right; these are known as one-sided limits, and the two-sided limit exists only when the two agree.
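(A standard example, added for illustration: for ##f(x) = |x|/x##,
$$\lim_{x\to 0^-}\frac{|x|}{x} = -1, \qquad \lim_{x\to 0^+}\frac{|x|}{x} = +1,$$
so the one-sided limits exist but the two-sided limit at 0 does not.)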

Are there any limitations to using limits and infinity in mathematics?

Yes, there are some limitations to using limits and infinity in mathematics. For example, some functions may not have a limit at a certain point, or the limit may not exist if the function oscillates or has a vertical asymptote. Additionally, the concept of infinity can be challenging to grasp and can lead to paradoxes and contradictions if not used carefully.
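(For example, added for illustration: ##f(x) = \sin(1/x)## oscillates between -1 and 1 infinitely often as ##x \to 0##, so ##\lim_{x\to 0}\sin(1/x)## does not exist.)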
