tmbrwlf730
Hi everyone. So the delta-epsilon proof that f(x) = x² is continuous goes a little like this: |f(x) − f(x₀)| = |x² − x₀²| = |x − x₀| |x + x₀|.
Here you want to bound the term |x + x₀| ≤ |x| + |x₀| by writing |x| = |x − x₀ + x₀| ≤ |x − x₀| + |x₀|.
Here you're supposed to take δ = 1; then |x − x₀| < δ gives |x − x₀| + |x₀| < 1 + |x₀|.
Putting this back into the earlier equation gives |x − x₀| |x + x₀| < (1 + 2|x₀|) |x − x₀|, and then you require
(1 + 2|x₀|) |x − x₀| < ε.
My question is: why don't you also set the last |x − x₀| in |x + x₀| |x − x₀| to 1, to get (1 + 2|x₀|) · 1 < ε? Why do you only set |x − x₀| to 1 in the |x + x₀| term but not in the other?
Thank you.
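For what it's worth, here's a quick numeric check I sketched of the bound above, assuming the usual final choice δ = min(1, ε / (1 + 2|x₀|)) (the function names are just mine for illustration):

```python
import random

def delta_for(x0, eps):
    # The standard choice that closes the proof sketched above:
    # delta = min(1, eps / (1 + 2|x0|)), so |x - x0| < delta forces
    # |x - x0| |x + x0| < (1 + 2|x0|) * delta <= eps.
    return min(1.0, eps / (1 + 2 * abs(x0)))

def check(x0, eps, trials=10_000):
    d = delta_for(x0, eps)
    for _ in range(trials):
        # sample points x with |x - x0| <= delta
        x = x0 + random.uniform(-d, d)
        assert abs(x**2 - x0**2) < eps
    return True

print(check(x0=3.0, eps=0.01))  # prints True: the bound holds for all samples
```

The check is only empirical, of course, but it confirms that this δ keeps |x² − x₀²| below ε near x₀ = 3.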