Show that f is continuous at 0 (easy but confused)

In summary, the conversation discusses how to show that a function, f, is continuous at 0. It is mentioned that it is easy to see that f(0)=0, and then a solution is presented using the definition of continuity. The solution manual does not give a specific value for α, but instead states that it can be any value less than the given ε (E). The concept of limits is also briefly mentioned.
  • #1
Andrax

Homework Statement



Let f be a function such that [itex]|f(x)| \leq |x|[/itex] for all x.
Show that f is continuous at 0.

The Attempt at a Solution


It's easy to see that f(0) = 0 (since [itex]|f(0)| \leq |0| = 0[/itex]).
Now we need: [itex]\forall \varepsilon > 0 \ \exists \alpha > 0 \ \forall x \in D: |x| < \alpha \implies |f(x) - f(0)| < \varepsilon[/itex]. In the solution manual they just put it like this: since [itex]|x| < \alpha[/itex] implies [itex]|f(x) - f(0)| < \varepsilon[/itex] (using f(0) = 0), f is continuous at 0. What I'm not getting is that they didn't give alpha a value; they just went from [itex]|x| < \alpha[/itex] to the result? I've been studying limits since 2012, so this is a weird issue to me. Please help.
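(A sketch of the step the solution manual leaves implicit, using only the hypothesis [itex]|f(x)| \leq |x|[/itex]: one valid choice is [itex]\alpha = \varepsilon[/itex], since then

[tex]|x| < \alpha = \varepsilon \ \implies \ |f(x) - f(0)| = |f(x)| \leq |x| < \varepsilon,[/tex]

so the definition is satisfied. Any [itex]\alpha \leq \varepsilon[/itex] works equally well, which is why the manual does not single out one value.)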
 
  • #3
Thanks, I forgot a bit about the first definition; that's why I had trouble. I've got it now.
 

FAQ: Show that f is continuous at 0 (easy but confused)

How do you show that f is continuous at 0?

To show that f is continuous at 0, we must prove that the limit of f(x) as x approaches 0 is equal to f(0). This can be done by using the definition of continuity, which states that for any ε > 0, there exists a δ > 0 such that if |x - 0| < δ, then |f(x) - f(0)| < ε.
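For the function in this thread (any f satisfying [itex]|f(x)| \leq |x|[/itex]), the condition to verify can be written as

[tex]\forall \varepsilon > 0 \ \exists \delta > 0 : \ |x| < \delta \implies |f(x) - f(0)| < \varepsilon,[/tex]

and, as discussed above, it is met by taking for instance [itex]\delta = \varepsilon[/itex].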

What is the definition of continuity?

The definition of continuity is that a function f is continuous at a point x = a if the limit of f(x) as x approaches a is equal to f(a).
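In limit notation, f is continuous at a when

[tex]\lim_{x \to a} f(x) = f(a),[/tex]

which in particular requires that f(a) is defined and that the limit exists.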

Why is proving continuity at 0 important?

Proving continuity at 0 is important because it is a fundamental concept in calculus and is necessary for understanding the behavior of functions at specific points. It also allows us to make predictions about the values of a function at a given point based on its behavior around that point.

What is the role of epsilon and delta in proving continuity?

Epsilon and delta measure distances on the output and input sides of the function, respectively. Epsilon represents the desired precision or accuracy of the output, i.e. how far f(x) is allowed to be from f(a), while delta represents how close the input x must be to the point a at which we are trying to prove continuity.

Can you provide an example of proving continuity at 0?

For example, let f(x) = 2x + 1, so f(0) = 1. To show that f is continuous at 0, we must prove that the limit of f(x) as x approaches 0 is equal to f(0). Using the definition of continuity, for a given ε > 0 we need a δ > 0 such that |x - 0| < δ implies |f(x) - f(0)| < ε. Here |f(x) - f(0)| = |2x + 1 - 1| = 2|x|, so the requirement 2|x| < ε holds whenever |x| < ε/2. Choosing δ = ε/2 therefore works, the limit of f(x) as x approaches 0 equals f(0), and f is continuous at 0.
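Written as a single chain of estimates, the same argument for f(x) = 2x + 1 reads

[tex]|x - 0| < \delta = \frac{\varepsilon}{2} \ \implies \ |f(x) - f(0)| = |2x + 1 - 1| = 2|x| < 2 \cdot \frac{\varepsilon}{2} = \varepsilon.[/tex]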
