Why is the series equal to 0,if x=0?

  • MHB
  • Thread starter: evinda
  • Tags: Series
In summary: you can always write a series formally as the sum of two series, but the separation is only justified when both of the individual series converge; if one of them diverges, the splitting argument breaks down.
  • #1
evinda
Gold Member
MHB
Hey! :rolleyes:

I am looking at the following exercise:
Check if the series $\sum_{n=0}^{\infty} (1-x)x^n$ converges uniformly in $[0,1]$.

That's the solution that the assistant of the professor gave us:

  • $0 \leq x <1: \sum_{n=0}^{\infty} (1-x)x^n=(1-x) \sum_{n=0}^{\infty}x^n=1 $
  • $x=1: \sum_{n=0}^{\infty} (1-x)x^n=\sum_{n=0}^{\infty} 0=0$
So: $$ \sum_{n=0}^{\infty} (1-x)x^n=\left\{\begin{matrix}
1, & 0 \leq x<1\\
0, & x=1
\end{matrix}\right.$$

If $s(x)=\sum_{n=0}^{\infty} (1-x)x^n=\left\{\begin{matrix}
1, & 0 \leq x<1\\
0, & x=1
\end{matrix}\right.$, we see that $s(x)$ is not continuous on $[0,1]$, so the convergence is not uniform, since the functions $(1-x)x^n$ are continuous on $[0,1]$ and a uniform limit of continuous functions would be continuous.

But... why is it $\sum_{n=0}^{\infty} (1-x)x^n=1$ for $x=0$? (Worried) Isn't it $\sum_{n=0}^{\infty} (1-x)x^n=\sum_{n=0}^{\infty} 1 \cdot 0^n=\sum_{n=0}^{\infty} 0=0$? (Thinking)
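The assistant's conclusion can also be seen numerically from the partial sums. This is a minimal sketch of my own (the helper names `partial_sum` and `limit` are mine, not from the solution): the partial sums telescope to $s_N(x) = 1 - x^{N+1}$, so near $x = 1$ the error never shrinks, no matter how large $N$ is.

```python
# Sketch (my own illustration, not part of the solution): the partial sums
# s_N(x) = sum_{n=0}^{N} (1-x) x^n telescope to 1 - x^(N+1), so near x = 1
# the error |s_N(x) - s(x)| stays bounded away from 0 for every N --
# the convergence cannot be uniform on [0, 1].

def partial_sum(x, N):
    """s_N(x) = sum_{n=0}^{N} (1 - x) * x**n."""
    return sum((1 - x) * x**n for n in range(N + 1))

def limit(x):
    """Pointwise limit s(x) on [0, 1]."""
    return 0.0 if x == 1 else 1.0

for N in (10, 100, 1000):
    x = 1 - 1 / N                            # a point close to 1
    err = abs(partial_sum(x, N) - limit(x))  # equals x**(N+1)
    print(N, err)                            # stays near 1/e, never -> 0
```

At $x = 1 - 1/N$ the error is $(1-1/N)^{N+1} \approx e^{-1}$, so the supremum of the error over $[0,1]$ does not tend to 0.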
 
  • #2
evinda said:
Hey! :rolleyes:

I am looking at the following exercise:
Check if the series $\sum_{n=0}^{\infty} (1-x)x^n$ converges uniformly in $[0,1]$.

That's the solution that the assistant of the professor gave us:

  • $0 \leq x <1: \sum_{n=0}^{\infty} (1-x)x^n=(1-x) \sum_{n=0}^{\infty}x^n=1 $
  • $x=1: \sum_{n=0}^{\infty} (1-x)x^n=\sum_{n=0}^{\infty} 0=0$
So: $$ \sum_{n=0}^{\infty} (1-x)x^n=\left\{\begin{matrix}
1, & 0 \leq x<1\\
0, & x=1
\end{matrix}\right.$$

If $s(x)=\sum_{n=0}^{\infty} (1-x)x^n=\left\{\begin{matrix}
1, & 0 \leq x<1\\
0, & x=1
\end{matrix}\right.$, we see that $s(x)$ is not continuous on $[0,1]$, so the convergence is not uniform, since the functions $(1-x)x^n$ are continuous on $[0,1]$ and a uniform limit of continuous functions would be continuous.

But... why is it $\sum_{n=0}^{\infty} (1-x)x^n=1$ for $x=0$? (Worried) Isn't it $\sum_{n=0}^{\infty} (1-x)x^n=\sum_{n=0}^{\infty} 1 \cdot 0^n=\sum_{n=0}^{\infty} 0=0$? (Thinking)

Couldn't you just do

$\displaystyle \begin{align*} \sum_{n = 0}^{\infty} \left[ \left( 1 - x \right) \, x^n \right] &= \sum_{n = 0}^{\infty} \left( x^n - x^{n + 1} \right) \\ &= \sum_{n = 0}^{\infty} \left( x^n \right) - \sum_{n = 0}^{\infty} \left( x^{n+1} \right) \end{align*}$

Both of these are geometric series with common ratio $\displaystyle \begin{align*} x \end{align*}$, so both are convergent where $\displaystyle \begin{align*} |x| < 1 \end{align*}$, and thus your original series is convergent for $\displaystyle \begin{align*} |x|<1 \end{align*}$.
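The split can be checked numerically; this is a quick sketch of my own (the helper `geometric_partial` is mine). For $|x| < 1$ the two geometric sums are $\frac{1}{1-x}$ and $\frac{x}{1-x}$, so their difference is 1, matching the value of the original series on $[0,1)$:

```python
# Sketch (my own check, not from the post): for |x| < 1 the two split
# series are geometric, with sums 1/(1-x) and x/(1-x); their difference
# is 1, which is the value of the original series on [0, 1).

def geometric_partial(x, start, N):
    """Partial sum of sum_{n=start}^{N} x**n."""
    return sum(x**n for n in range(start, N + 1))

x, N = 0.7, 2000
first  = geometric_partial(x, 0, N)   # ~ 1/(1-x)
second = geometric_partial(x, 1, N)   # ~ x/(1-x)
print(first - second)                 # ~ 1.0
```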
 
  • #3
evinda said:
But... why is it $\sum_{n=0}^{\infty} (1-x)x^n=1$,for $x=0$?? (Worried) Isn't it: $\sum_{n=0}^{\infty} (1-x)x^n=\sum_{n=0}^{\infty} 1 \cdot 0^n=\sum_{n=0}^{\infty} 0=0$ ? (Thinking)

All the terms of the series $\displaystyle \sum_{n=0}^{\infty} 0^{n}$ are equal to 0... with the only exception of $n=0$, because $0^{0} = 1$, so that $\displaystyle \sum_{n=0}^{\infty} 0^{n} = 1$ (Sun)...

Kind regards

$\chi$ $\sigma$
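As a side note (my own illustration, not from the post): Python's `**` operator follows the same $0^0 = 1$ convention, so the computation can be reproduced term by term:

```python
# Python follows the 0**0 == 1 convention, so summing the terms of the
# series at x = 0 picks up exactly one nonzero term, the n = 0 one.

x = 0
terms = [(1 - x) * x**n for n in range(10)]
print(terms[0])     # 1, since 0**0 == 1
print(sum(terms))   # 1: all later terms vanish
```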
 
  • #4
chisigma said:
All the terms of the series $\displaystyle \sum_{n=0}^{\infty} 0^{n}$ are equal to 0... with the only exception of $n=0$, because $0^{0} = 1$, so that $\displaystyle \sum_{n=0}^{\infty} 0^{n} = 1$ (Sun)...

Kind regards

$\chi$ $\sigma$

I understand (Nod) thanks a lot! :)

- - - Updated - - -

Prove It said:
Couldn't you just do

$\displaystyle \begin{align*} \sum_{n = 0}^{\infty} \left[ \left( 1 - x \right) \, x^n \right] &= \sum_{n = 0}^{\infty} \left( x^n - x^{n + 1} \right) \\ &= \sum_{n = 0}^{\infty} \left( x^n \right) - \sum_{n = 0}^{\infty} \left( x^{n+1} \right) \end{align*}$

Both of these are geometric series with common ratio $\displaystyle \begin{align*} x \end{align*}$, so both are convergent where $\displaystyle \begin{align*} |x| < 1 \end{align*}$, and thus your original series is convergent for $\displaystyle \begin{align*} |x|<1 \end{align*}$.

So can we just separate one series into two? :confused:
 
  • #5
evinda said:
I understand (Nod) thanks a lot! :)

- - - Updated - - -
So can we just separate one series into two? :confused:

In this case you can. Look at the individual terms:

$\displaystyle \begin{align*} \sum_{n = 0}^{\infty} \left( x^n - x^{n + 1} \right) &= \left( x^0 - x^1 \right) + \left( x^1 - x^2 \right) + \left( x^2 - x^3 \right) + \left( x^3 - x^4 \right) + \dots + \left( x^n - x^{ n + 1 } \right) + \dots \\ &= \left( x^0 + x^1 + x^2 + x^3 + \dots + x^n + \dots \right) - \left( x^1 + x^2 + x^3 + x^4 + \dots + x^{n + 1} + \dots \right) \\ &= \sum_{n = 0}^{\infty} \left( x^n \right) - \sum_{n = 0}^{\infty} \left( x^{n + 1} \right) \end{align*}$
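The telescoping above can be verified on partial sums; a small sketch of my own (the helper `telescoped` is mine), using the identity $\sum_{n=0}^{N}(x^n - x^{n+1}) = 1 - x^{N+1}$:

```python
# Sketch: the telescoping identity sum_{n=0}^{N} (x^n - x^(n+1)) = 1 - x^(N+1)
# can be checked directly, since all the intermediate powers cancel.

def telescoped(x, N):
    """Sum the terms x**n - x**(n+1) for n = 0, ..., N."""
    return sum(x**n - x**(n + 1) for n in range(N + 1))

x, N = 0.3, 40
print(telescoped(x, N))   # ~ 1 - x**(N+1), very close to 1 here
```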
 
  • #6
Prove It said:
In this case you can. Look at the individual terms:

$\displaystyle \begin{align*} \sum_{n = 0}^{\infty} \left( x^n - x^{n + 1} \right) &= \left( x^0 - x^1 \right) + \left( x^1 - x^2 \right) + \left( x^2 - x^3 \right) + \left( x^3 - x^4 \right) + \dots + \left( x^n - x^{ n + 1 } \right) + \dots \\ &= \left( x^0 + x^1 + x^2 + x^3 + \dots + x^n + \dots \right) - \left( x^1 + x^2 + x^3 + x^4 + \dots + x^{n + 1} + \dots \right) \\ &= \sum_{n = 0}^{\infty} \left( x^n \right) - \sum_{n = 0}^{\infty} \left( x^{n + 1} \right) \end{align*}$

Ah ok... In general, can we only separate a series $\sum(a_n+b_n)$ into two series $\sum_{n=1}^{\infty} a_n +\sum_{n=1}^{\infty} b_n$ if we know that at least one of $\sum a_n$, $\sum b_n$ converges? (Thinking)
 
  • #7
evinda said:
Ah ok... In general, can we only separate a series $\sum(a_n+b_n)$ into two series $\sum_{n=1}^{\infty} a_n +\sum_{n=1}^{\infty} b_n$ if we know that at least one of $\sum a_n$, $\sum b_n$ converges? (Thinking)

You don't even need to know anything about the convergence of each individual series. $\displaystyle \begin{align*} \sum{ \left( a_n + b_n \right) } = \sum{ \left( a_n \right) } + \sum{ \left( b_n \right) } \end{align*}$ is always true.
 
  • #8
Prove It said:
You don't even need to know anything about the convergence of each individual series. $\displaystyle \begin{align*} \sum{ \left( a_n + b_n \right) } = \sum{ \left( a_n \right) } + \sum{ \left( b_n \right) } \end{align*}$ is always true.

Erm... suppose $a_n=1, b_n=-1$...
 
  • #9
I like Serena said:
Erm... suppose $a_n=1, b_n=-1$...

So, should one of the two series converge? (Wasntme)
 
  • #10
evinda said:
So, should one of the two series converge? (Wasntme)

If both individual series converge, then their sum will also converge. (Nerd)
 
  • #11
I like Serena said:
If both individual series converge, then their sum will also converge. (Nerd)

Ok! But can I separate a series into two in every case? Or, if for example we have the series $\sum_{n=1}^{\infty} (a_n+b_n)$, do I have to know that either $\sum_{n=1}^{\infty} a_n$ or $\sum_{n=1}^{\infty} b_n$ converges before separating the series? :confused:
 
  • #12
evinda said:
Ok! But can I separate a series into two in every case? Or, if for example we have the series $\sum_{n=1}^{\infty} (a_n+b_n)$, do I have to know that either $\sum_{n=1}^{\infty} a_n$ or $\sum_{n=1}^{\infty} b_n$ converges before separating the series? :confused:

You can try to separate the series into 2 series.
Then, if you find that both those 2 series converge, you know that the original series also converges. (Sun)

If however it turns out that one of those 2 series does not converge, the argument falls apart.
That would mean that you'd have to figure out something else. (Doh)
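I like Serena's counterexample $a_n = 1$, $b_n = -1$ can be made concrete (a small sketch of my own): the combined series converges to 0, but the partial sums of each split series grow without bound.

```python
# I like Serena's counterexample, made concrete: with a_n = 1 and b_n = -1
# the combined series sum (a_n + b_n) converges (every partial sum is 0),
# but neither split series converges -- their partial sums diverge.

N = 1000
a = [1] * N
b = [-1] * N

combined = sum(x + y for x, y in zip(a, b))   # partial sum of sum(a_n + b_n)
print(combined)          # 0 for every N: the combined series converges
print(sum(a), sum(b))    # N and -N: both split partial sums diverge
```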
 
  • #13
I like Serena said:
You can try to separate the series into 2 series.
Then, if you find that both those 2 series converge, you know that the original series also converges. (Sun)

If however it turns out that one of those 2 series does not converge, the argument falls apart.
That would mean that you'd have to figure out something else. (Doh)

Ok... I understand. Thanks a lot! :cool:
 
  • #14
Just an aside: not all mathematicians agree that $0^0 = 1$, it is a CONVENTION.

The reason for this, is that the function $f(x,y) = x^y$ is not continuous at (0,0), so there is good reason to leave it "undefined".

Still, some regard $m^n$ for POSITIVE integers $m,n$ as $m$ multiplied by itself $n$ times, and hold that $m^0$ should be the "empty product" (which ought to be the multiplicative identity, just as the "empty sum" is the additive identity).

I urge you to read this web page:

Q: What does 0^0 (zero raised to the zeroth power) equal? Why do mathematicians and high school teachers disagree? | Ask a Mathematician / Ask a Physicist

Which not only contains a delightful exposition, but also some responses from various people, some learned, some not so much.

In short, your professor's argument has a small "hole" in it, and if I were you, I would press him/her to justify their claim.
 
  • #15
A rigorous proof of the fact that $0^{0}=1$ was given in ...

http://mathhelpboards.com/analysis-50/never-ending-dispute-2060.html

I don't think it is appropriate, at least for me, to devote time and effort to discuss the matter further ...

Kind regards

$\chi$ $\sigma$
 
  • #16
chisigma said:
A rigorous proof of the fact that $0^{0}=1$ was given in ...

http://mathhelpboards.com/analysis-50/never-ending-dispute-2060.html

I don't think it is appropriate, at least for me, to devote time and effort to discuss the matter further ...

Kind regards

$\chi$ $\sigma$

Except for the fact that it's not a fact...
 
  • #17
There are, depending on context, different "answers" to the question:

"What is $0^0$"?

Firstly, one should ask oneself, "why do I need to know"? For example, if one is evaluating the polynomial:

$\displaystyle \sum_{k = 0}^n a_kx^k$ at $x = 0$, there is "only one answer that makes sense".

In this particular case, I believe the professor has glossed over a significant point, probably in favor of expedience. I feel evinda is to be commended for feeling a bit "unsure" of what is surely "hand-waving" by the professor.

chisigma has linked to another thread, which in turn links to another thread on another forum. In that forum, I made THIS post:

0^0 - Math Help Forum

In the thread that chisigma linked to, I am very much of the same mind as CaptainBlack.

With all due respect for the impressive display of analytic acumen shown in the linked thread, I must demur. The exception I take is with the very first words:

chisigma said:
A rigorous proof of the fact that $0^{0}=1$...

Those are some very strong words. At least two people (CaptainBlack and myself) on these forums, and others on that "other forum", have some qualms about this.

Philosophically, there are deeper issues at stake, which I will only touch on briefly here:

1) Mathematics involves definitions. The definitions used can greatly impact the conclusions reached. One could (but I will not) go so far as to say "none" of mathematics is FACT, but only consistent conclusions based on initial assumptions. Often, what we say is true, depends on what character we want our mathematical systems to have.

2) Rigor is kind of a "squishy" term. Using high-powered mathematics doesn't make something "true", rather it often has the undesirable side effect of hiding assumptions.

To underscore what I mean: in the linked thread, chisigma makes reference to Taylor series of two variables. To PROPERLY apply this theorem, and thus any formulae we may obtain, $f(x,y)$ must be DEFINED in an open disk containing $(x_0,y_0)$.

He derives a formula for $\ln(a^x)$, and (to make a long post short) essentially shows that the FUNCTIONS:

$\displaystyle \lim_{a \to 0} f_a$ where $f_a(x) = a^x$ converge to $f = 0$ on $(0,1)$.

The problem is, if you look at what $f_a$ is actually converging to, it is the union of the $x$ and $y$-axes (on the unit square), which isn't a function!

This is, in my opinion, tantamount to "proof by intimidation", it looks very complicated, so it MUST be true, right?

In point of fact, I am prepared to "go along" with the convention that $0^0 = 1$ for most purposes. In the post of mine that I linked to, I present some contexts in which it seems "natural".

HOWEVER, I think chisigma's response is a little "too strong": logarithms and power series are not the "last word" in mathematical definitions, even though they are very useful tools. And I think it would BENEFIT the original poster to have the professor explain themselves. I am NOT saying the professor is WRONG (or even that chisigma is wrong), I am saying: be CLEAR about the assumptions and definitions being used. There is much to be learned from exploring what might seem like "a simple issue".
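Deveno's point about $f_a(x) = a^x$ can be illustrated numerically (my own sketch, not from the linked thread): for any fixed $x > 0$, $a^x \to 0$ as $a \to 0^+$, yet $a^0 = 1$ for every $a > 0$, so the limiting object jumps at $x = 0$ and $x^y$ admits no continuous extension to $(0,0)$.

```python
# A numeric look at the discontinuity of x**y at (0, 0) (my own
# illustration): for fixed x > 0, a**x shrinks toward 0 as a -> 0+,
# while a**0 stays exactly 1 for every a > 0.

for a in (0.1, 0.01, 0.0001):
    print(a, a**0.5, a**0)   # a**0.5 shrinks toward 0; a**0 is always 1.0
```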
 
  • #18
Deveno said:
... philosophically, there are deeper issues at stake... the problem is, if you look at what $f_a$ is actually converging to, it is the union of the $x$ and $y$-axes (on the unit square), which isn't a function!... I am prepared to 'go along' with the convention that $0^0 = 1$ for most purposes. In the post of mine that I linked to, I present some contexts in which it seems 'natural'... I think chisigma's response is a little 'too strong': logarithms and power series are not the 'last word' in mathematical definitions, even though they are very useful tools... I am not saying the professor is wrong (or even that chisigma is wrong), I am saying: be clear about the assumptions and definitions being used... there is much to be learned from exploring what might seem like 'a simple issue'...

... for my part I can add that, for a few millennia, mathematical thinking has known no such expressions as 'philosophically', 'actually converging', 'convention', 'natural', or 'last word in mathematical definitions', but only expressions like 'it is' or 'it isn't' (Wasntme)...

Kind regards

$\chi$ $\sigma$
 
  • #19
chisigma said:
... for my part I can add that, for a few millennia, mathematical thinking has known no such expressions as 'philosophically', 'actually converging', 'convention', 'natural', or 'last word in mathematical definitions', but only expressions like 'it is' or 'it isn't' (Wasntme)...

Kind regards

$\chi$ $\sigma$

I suppose that depends on your view of what is, and isn't mathematics.

I myself am strongly opposed to the idea that mathematics is (merely) Turing-complete, a decision algorithm that eventually spits out "yes" or "no". This is a "meta-mathematical" position, and as such I can certainly discuss mathematics with mathematicians who hold opposing views without problem.

These forums are about math, but we are not forbidden to use things besides LaTeX (for example, English, German, Italian or Russian) to express the ideas we hold about mathematics.

You are skirting a perilous precipice my friend, when you say (things like) "this is undeniably true", even in such an unambiguous language as mathematics. Truth is even MORE elusive than mathematics, even though I daresay we all have some "idea" of what it ought to be.

I find it likely that in this case that the professor is using "the well-known result"

$\displaystyle \sum_{n = 0}^{\infty} x^n = \dfrac{1}{1 -x}, \qquad |x| < 1$

where $x^0$ is merely a "short-hand" for 1 (which could be listed separately as an initial term without affecting convergence of the series), much as we do for polynomials. As I indicated before, this is a CONVENTION, based on USAGE (and a rather convenient one, at that), and I would NOT go so far as to say $0^0 = 1$, but rather (in this case, or: here) we take $x^0 = 1$ for all $x$.

To quote from Wikipedia:

However, not all sources define $0^0$ to be 1, particularly in the context of continuously varying exponents.

To sum up, you (appear to) feel that mathematical statements divide cleanly:

1. True
2. False

While I feel that statements fall like so:

1. True in this context
2. False in this context
3. Undecidable in this context
4. Unknown

Even after the monumental contributions by Weierstrass and Cauchy, calculus still has some "fuzzy bits" to it. I feel it is worth remembering that the real numbers are a CONSTRUCT of mankind, and not something we empirically deduced.
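Deveno's reading (that inside a power series one takes $x^0 = 1$ for all $x$) matches how polynomial evaluation works in code; here is a minimal sketch of my own, where `eval_poly` is a hypothetical helper:

```python
# Sketch of the "x**0 = 1 inside a power series" convention (my own
# illustration, eval_poly is a hypothetical helper): evaluating a
# polynomial at x = 0 picks out exactly the constant term a_0, because
# Python's ** already follows the 0**0 == 1 convention.

def eval_poly(coeffs, x):
    """Evaluate sum_k coeffs[k] * x**k, relying on x**0 = 1."""
    return sum(c * x**k for k, c in enumerate(coeffs))

coeffs = [7, 3, -2]           # 7 + 3x - 2x^2
print(eval_poly(coeffs, 0))   # 7: only the constant term survives
```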
 

