Proof that 0 + 0 +....+ 0 +.... = 0

In summary, the conversation discusses a possible proof for the statement that the sum of an infinite number of zeroes is equal to zero. The proof uses the concept of geometric series and the idea that as the ratio approaches 0, the sum also approaches 0. The conversation also brings up the importance of defining the terms used in the proof and considering the implications of different types of infinity.
  • #1
deRoy
Is there any rigorous way of proving this?

I tried using geometric series of ever diminishing ratio and noticing that 0 is always less than each term of the series, then 0 + 0 +...+ 0 +... must be always less than ## \frac{1 } {1-r} - 1 ##. (*)

Eventually, as r goes to 0 so does ## \frac{1 } {1-r} - 1 ##, so by (*) argument 0 + 0 +...+ 0 +... = 0.

But I don't know if my proof is rigorous enough...

I haven't found any proof of this by googling it so any help is appreciated.
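The squeeze in (*) is easy to check numerically. Below is a minimal sketch (the helper name `tail_bound` is mine, purely illustrative): any finite partial sum of zeros sits below the bound ## \frac{1}{1-r} - 1 ## for every ##r##, and the bound itself shrinks to 0 as ##r \to 0##.

```python
# Sketch of the squeeze argument: every partial sum of 0 + 0 + ... equals 0,
# and the bound 1/(1-r) - 1 = r/(1-r) shrinks as r -> 0.
def tail_bound(r):
    """Value of 1/(1-r) - 1, i.e. the geometric series r + r^2 + ..."""
    return 1.0 / (1.0 - r) - 1.0

partial_sum = sum([0.0] * 1000)   # a finite partial sum of zeros
for r in [0.5, 0.1, 0.01, 0.001]:
    assert 0.0 <= partial_sum < tail_bound(r)   # squeezed by the bound
assert tail_bound(0.001) < 0.002                # and the bound itself -> 0
```

This only exercises finitely many partial sums, of course; it illustrates the argument rather than proving it.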
 
  • #2
deRoy said:
Is there any rigorous way of proving this? […]

Certainly the sum of N zeroes, as N approaches infinity (in this case, Aleph Null), is zero.
I'm not sure that more of a proof is needed. But if you insist, try this:
A = 0 + 0 + 0 + ...
2A = 2·0 + 2·0 + 2·0 + ... = 0 + 0 + 0 + ...
Therefore 2A = A, so A = 0.
 
  • #3
If there's anything not rigorous, it's the statement of the result. Try to state 0 + 0 + 0 + ... + 0 = 0 as a limit and without using an ellipsis.
 
  • #4
First you need a rigorous definition of what that means, because an infinite sum cannot end.
If you mean 0+0+0+..., then it is sufficient to note that all partial sums are 0, and ##\lim_{n \to \infty} 0 = 0## directly follows from the definition of limits for sequences.
 
  • #5
mfb said:
First you need a rigorous definition of what that means, because an infinite sum cannot end.
Where in my post did I assert this?

willem2 said that 0+0+0 + ... + 0 = 0 , but never did I make such a preposterous claim!
 
  • #6
Imagine an N-dimensional sphere of radius 1. The path along the diagonal on which all coordinates are positive hits the sphere when all coordinates equal ##1/\sqrt{N}##.
Now take the limit as N goes to infinity. The diagonal hits the sphere where 0+0+0+... = 1 ;)
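This sphere picture can be played with numerically. A small sketch (assuming the standard Euclidean norm): each coordinate ##1/\sqrt{N}## goes to 0 individually, while the sum of their squares stays exactly 1.

```python
import math

# Each coordinate of the intersection point is 1/sqrt(N): individually they
# shrink to 0 as N grows, yet the sum of squares (the norm) stays 1.
for N in [1, 100, 10**6]:
    coord = 1.0 / math.sqrt(N)
    assert abs(N * coord**2 - 1.0) < 1e-9   # the point stays on the unit sphere
assert 1.0 / math.sqrt(10**6) == 0.001       # ...while each coordinate -> 0
```

This is exactly why the naive reading 0+0+0+... = 1 is suspect: the number of summands changes along with the terms.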
 
  • #7
deRoy said:
Where in my post did I assert this?

willem2 said that 0+0+0 + ... + 0 = 0 , but never did I make such a preposterous claim!
Ah you didn’t have a zero at the end. The zero in the middle is still odd.
 
  • #8
deRoy said:
Is there any rigorous way of proving this?

Induction?
 
  • #9
Induction indeed!
I had thought of this before and since we concur I am happy now.
 
  • #10
deRoy said:
Where in my post did I assert this?

willem2 said that 0+0+0 + ... + 0 = 0 , but never did I make such a preposterous claim!
As long as you don't say what ##0+0+\ldots +0+\ldots ## means, every answer is meaningless, because they all make assumptions about what it might mean. It is your turn to deliver. It is a summation, that's clear, so we have ##\sum_\mathcal{I} 0##. The dots further suggest that ##|\mathcal{I}| = \infty##. What has to be defined is: which infinity, and which summation rules. I assume the result will always be zero, although I'm not sure there aren't strange summation rules according to which it is not; but you asked for a rigorous proof, so both questions have to be answered in advance.
 
  • #11
fresh_42 said:
As long as you don't say what ##0+0+\ldots +0+\ldots ## means, every answer is meaningless […] What has to be defined is: which infinity, and which summation rules.
Thanks for the input, but are you implying that if the infinity is aleph-one, the result may not be 0?

I am very curious to learn now...
 
  • #12
deRoy said:
Thanks for the input, but are you implying that if the infinity is aleph-one, the result may not be 0?

I am very curious to learn now...
I'm only implying that it cannot be said for sure as long as the notation allows ambiguities. If we write ##0+0+\ldots ## we normally consider a countable index set for the summation, but sums can consist of uncountably many summands. E.g. if we consider the vector space of continuous functions, then it has (as every vector space, assuming the validity of the axiom of choice) a basis ##\{f_\iota\,\vert \,\iota \in \mathcal{I}\}## which is uncountable in this case. So if we write ##f= \sum_{\iota \in \mathcal{I}} c_\iota f_\iota ## we therefore have to add the condition that almost all (= all up to finitely many) ##c_\iota## are equal to ##0##. This is just meant as an example that such sums aren't a priori finite or countable. Therefore the dots have to be explained; otherwise it is just a convention to assume countably many. But this is only a convention and not set in stone.

The summation itself is also only defined as a binary operation: ##(a,b) \mapsto a+b##. Already a third term involves an assumption: is it ##a+b+c=(a+b)+c## or is it ##a+b+c=a+(b+c)##? Of course it doesn't matter if we speak about, e.g., real numbers, because addition is associative for them, i.e. the order doesn't matter. However, not all mathematical structures are associative, and we cannot assume it without mention. So to imply ##0 \in \mathbb{N}## or ##0 \in \mathbb{Z}## or ##0 \in \mathbb{Q}## or ##0 \in \mathbb{R}## or ##0 \in \mathbb{C}## is another hidden assumption, or if you like, a convention, as the ##"+"## operation is usually only used for associative and commutative binary operations. Nevertheless, it is an assumption or a convention.

Even in case it is the usual addition, there are concepts (outside usual calculus) which consider different summation processes, e.g. Ramanujan summation. Of course this is nothing we have to bother with here, as it is more of an analytical tool than a summation rule, but it demonstrates that the summation isn't defined in itself. Again we have the convention that ##"+"## denotes the summation of real numbers if not stated otherwise, but you've asked for rigor, in which case it has to be said.

That brings me to another point which I previously forgot to list. What is ##0\,##? We all followed the convention here that it is the neutral element of addition; however, nobody has said so. E.g. ## x \cdot y := \dfrac{2xy-x-y}{xy-1}## defines a multiplication with neutral element ##0##. Because of this overall convention about the notation of ##0##, it would be abhorrent to use ##0## differently. Nevertheless, in a purely logical framework it would have to be mentioned.

Finally, to remove any doubts here: those conventions make perfect sense and there is usually no need to mention them, so don't request it. But in this internet context, where nobody knows in advance what people know, what they are currently dealing with, or - as here - to which extent the term rigor is meant, things are far less clear in advance.
 
  • #13
deRoy said:
Thanks for the input, but are you implying that if the infinity is aleph-one, the result may not be 0?
I may have misstated. I was referring to this summation:
$$ \sum_{n=1}^{\infty}{0} $$
In that case, the number of "add"s is Aleph Null. It can't be Aleph-1, because the terms in the summation can be listed serially.
Note that this is not ##\aleph_0 \times 0##, because ##\aleph_0## is not a number.
 
  • #14
.Scott said:
I may have misstated. I was referring to this summation:
$$ \sum_{n=1}^{\infty}{0} $$
In that case, the number of "add"s is Aleph Null. It can't be Aleph-1, because the terms in the summation can be listed serially.
Note that this is not ##\aleph_0 \times 0##, because ##\aleph_0## is not a number.
True, but in your proof, how are you equating A = 0 + 0 + 0 + ... = 0 + 0 + 0 + ... = 2A in advance, without first establishing that a limit exists? This is begging the question. If you haven't proven that a limit exists first, this is like saying: 0 = 1 - 1 + 1 - 1 + ... = 1 - 1 + 1 - 1 + 1 - ... = 1.
 
  • #15
fresh_42 said:
I'm only implying that it cannot be said for sure as long as the notation allows ambiguities. […]
Thank you for your answer. I hadn't imagined that such a simple question as this has so many implications and I am grateful that you took the time to write such a lengthy reply.
Still, you haven't answered my question: suppose we have aleph-one many (uncountably many) summands, so that ## \sum_{n=1}^{\infty}{0} ## does not apply. Is there a way of proving that 0 + 0 + 0 + ... = 0, or is such a question nonsense, so that no such proof exists at all?
 
  • #16
deRoy said:
Thank you for your answer. I hadn't imagined that such a simple question as this has so many implications and I am grateful that you took the time to write such a lengthy reply.
Still, you haven't answered my question: suppose we have aleph-one many (uncountably many) summands, so that ## \sum_{n=1}^{\infty}{0} ## does not apply. Is there a way of proving that 0 + 0 + 0 + ... = 0, or is such a question nonsense, so that no such proof exists at all?
I'm a bit rusty on that matter, but transfinite induction may be a possibility.
 
  • #17
Hmmm I thought I posted an answer but now it is gone.

Write the integers in base ##-2##, the negative binary system. Every integer is expressed as a sum of powers of ##-2##. One knows that ##0 = 1 + (-1)##, since they are additive inverses of each other. In the negative binary system this is the addition ##1 + 11##, since ##-1 = (-2)^1+(-2)^0##.

Adding ##1## to ##11##: in the ones column ##1 + 1 = 110##, so write down the zero and carry the ##11##. In the ##-2##'s column one again has ##1+1##, so write down another zero and carry the ##11##. This goes on forever and generates an unending sum of zeros.

If one thinks of the infinite addition of zeros as signifying all of these computations then the answer is it equals zero.
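The base ##-2## bookkeeping can be checked in code. A sketch (the helper `to_negabinary` is mine, not from the post):

```python
def to_negabinary(n):
    """Digit string of the integer n in base -2, most significant digit first."""
    if n == 0:
        return "0"
    digits = []
    while n != 0:
        n, r = divmod(n, -2)
        if r < 0:        # Python's floored division can leave r = -1; fix it up
            r += 2
            n += 1
        digits.append(str(r))
    return "".join(reversed(digits))

assert to_negabinary(1) == "1"
assert to_negabinary(-1) == "11"   # -1 = (-2)^1 + (-2)^0, as in the post
assert to_negabinary(2) == "110"   # 1 + 1 = "110": write the 0, carry the "11"
assert 1 + (-1) == 0               # the sum whose column-wise zeros never end
```

The last two assertions show the carry mechanism: each column of ##1 + 11## produces the digit 0 and regenerates the carry, so the column digits form the unending string of zeros described above.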
 
  • #18
fresh_42 said:
...transfinite induction may be a possibility.
...And Bob's your uncle! :biggrin:
 
  • #19
deRoy said:
True, but in your proof how are you equating A = 0 + 0 + 0 + ... = 0 + 0 + 0 + ... = 2 A in advance, without first establishing that a limit exists?
Ok. How about this. Start from this summation:
$$ \sum_{n=1}^{\infty}{2^{-n}} = 1$$
Multiplying both sides by zero:
$$ \sum_{n=1}^{\infty}{0\times2^{-n}} = 0$$
Thus:
$$ \sum_{n=1}^{\infty}{0} = 0$$
 
  • #20
.Scott said:
Ok. How about this. Start from this summation:
$$ \sum_{n=1}^{\infty}{2^{-n}} = 1$$
Multiplying both sides by zero:
$$ \sum_{n=1}^{\infty}{0\times2^{-n}} = 0$$
Thus:
$$ \sum_{n=1}^{\infty}{0} = 0$$
Nice proof and better than mine. I like it.
 
  • #21
deRoy said:
Nice proof and better than mine. I like it.
Note that there is a minor loss of rigor in the step where multiplication of the infinite sum by zero is rewritten as multiplication of each summand by zero. It turns out that multiplication by a fixed constant does distribute over [convergent] infinite sums, but that is something that needs to be proved rather than taken for granted.
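That lemma can at least be sanity-checked on partial sums. A sketch (not a proof - it only tests finitely many terms of the geometric series used above):

```python
# Check on partial sums that a constant distributes over the series sum:
# c * sum(a_n) == sum(c * a_n) for a_n = 2^-n, including the c = 0 case.
terms = [2.0 ** -n for n in range(1, 60)]   # partial sums approach 1
for c in [0.0, 3.0, -0.5]:
    assert abs(c * sum(terms) - sum(c * t for t in terms)) < 1e-12
assert sum(0.0 * t for t in terms) == 0.0    # the c = 0 instance from the thread
```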
 
  • #22
.Scott said:
Ok. How about this. Start from this summation:
$$ \sum_{n=1}^{\infty}{2^{-n}} = 1$$
Multiplying both sides by zero:
$$ \sum_{n=1}^{\infty}{0\times2^{-n}} = 0$$
Thus:
$$ \sum_{n=1}^{\infty}{0} = 0$$
Proving the first one is more effort than proving the last one.
And you also have to prove that you can commute the product with the series.
You are using a lot of advanced methods - methods that require knowledge of the last result already - to prove something very elementary.
 
  • #23
mfb said:
Proving the first one is more effort than proving the last one.
And you also have to prove that you can commute the product with the series.

I wasn't trying to create a Bertrand Russell type proof - only one that would build on what the OP would take as given.
 
  • #24
A monotone Cauchy sequence (assuming you have a countably-infinite sum) converges to its lub/glb?
 
  • #25
One of the issues here is "what is rigorous"? It's not (no pun intended) a 0/1 thing. (Okay, so maybe the pun was a little intended.) There are degrees of rigor.

In most cases, what we have proven is that if this series converges to anything at all, it converges to 0. So there needs to be one more piece that shows it converges. Unfortunately, many convergence tests are stymied by the fact that all the terms are zero - for example, any test involving a ratio. Probably the way to prove this is to show that 1 + 1/2 + 1/4 + 1/8 + ... converges to 2, and that 0 + 0 + 0 + ... is non-negative and term-by-term smaller.
 
  • #26
Vanadium 50 said:
One of the issues here is "what is rigorous"? […]

But you have a result for the real numbers that a Cauchy sequence converges. A Cauchy sequence is one where the difference between terms beyond a fixed one becomes as small as you want, and a monotone decreasing one converges to its greatest lower bound.
 
  • #27
Vanadium 50 said:
One of the issues here is "what is rigorous"? […]

Let ##s_k = \sum_{n=1}^{k} 0##.

By definition: ##\sum_{n=1}^{\infty} 0 = \lim_{k \rightarrow \infty} s_k##

(Note that ##\forall k: s_k = 0##)

Let ##\epsilon > 0##. Take ##N =1##:

##k > N \Rightarrow |s_k - 0| = |0 - 0| = 0 < \epsilon##

Hence ##\lim_{k \rightarrow \infty} s_k = 0##

QED
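For completeness, this ##\epsilon##-##N## argument is exactly what a proof assistant certifies in one line. A sketch in Lean 4 with mathlib (assuming mathlib's `tendsto_const_nhds` lemma; a sketch, not checked here):

```lean
import Mathlib

-- The sequence of partial sums s_k = 0 is constant, so it tends to 0.
example : Filter.Tendsto (fun _ : ℕ => (0 : ℝ)) Filter.atTop (nhds 0) :=
  tendsto_const_nhds
```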
 
  • #28
Any constant sequence in a metric space converges. There is nothing special about a sequence of finite sums of zeros. Also, one does not need the usual metric on the real numbers. Any metric will work.

The metric is used to interpret an infinite summation as a limit of a sequence. Without the metric one might still ask whether one can make sense of it.
 
  • #29
Well, the standard way is to go by limits.

Let ##S_{N}=\sum_{n=0}^{N} 0##. Then ##(\forall N>0)(S_{N}=0)##. Thus ##\sum_{n=0}^{\infty} 0 =\lim_{N\rightarrow \infty}S_{N}=\lim_{N\rightarrow \infty}0=0##.
 
  • #30
If we let ##0## correspond to ##\emptyset## and the binary operation ##+## to ##\cup##, then by the algebra of sets:

## \emptyset \cup \cdots \cup \emptyset \,=\, \emptyset ##

that is

## 0+ \cdots + 0\,=\, 0##.
Ssnow
 
  • #31
With the same logic ##1+ \cdots + 1\,=\, 1## because ##\{1\} \cup \cdots \cup \{1\} = \{1\}##?
That approach brings more problems than it is supposed to solve, especially as it is trivial to show the limit directly with the definitions (as done in posts 27 and 29).
 
  • #32
mfb said:
With the same logic ##1+ \cdots + 1\,=\, 1## because ##\{1\} \cup \cdots \cup \{1\} = \{1\}##?

No, because defining the successor function for sets as ##S(A)=A\cup\{A\}##, you have ##1=S(\emptyset)=\emptyset \cup \{\emptyset\}=\{\emptyset\}## and ##1+1=S(1)=\{\emptyset,\{\emptyset\}\}=2##, so ##1+1 \not = 1##.
Ssnow
 
  • #33
That is different from what you suggested before, and it is still wrong. If you identify addition with the union of sets then x+x=x for all x because ##s \cup s=s\quad \forall s##.

If you identify the addition with ##f(s,t)=\{s,\{t\}\}## then 1+1 works but nothing else works any more.
 
  • #34
mfb said:
If you identify the addition with ##f(s,t)=\{s,\{t\}\}## then 1+1 works but nothing else works any more.

Yes, I was thinking of "classes" rather than "sets" ...
 

FAQ: Proof that 0 + 0 +....+ 0 +.... = 0

1. What is the proof that 0 + 0 +....+ 0 +.... = 0?

The standard proof treats the infinite sum as the limit of its partial sums. Every partial sum 0 + 0 + ... + 0 equals 0 by the additive identity property, so the limit, and hence the infinite sum, is 0.

2. Can you explain why this statement is true?

This statement is true because zero is the additive identity: adding zero to anything changes nothing. Therefore every partial sum of zeros is zero, and so is the limit.

3. Is this statement universally accepted in mathematics?

Yes, this statement is universally accepted and is considered a fundamental property in mathematics. It is often used in algebraic equations and proofs.

4. How does this statement relate to the concept of infinity?

This statement is closely related to the concept of infinity because we are adding an infinite number of zeros. Here the infinite sum converges, and its limit is zero, which illustrates the concept of convergence in mathematics.

5. Can this statement be applied to other numbers besides zero?

Yes, this statement can be applied to any number. For example, 5 + 0 +....+ 0 +.... = 5, since the additive identity property applies to all numbers, not just zero.
