Don Page - cosmological doomsday argument

In summary: the paper by Sean Carroll and collaborators discussed below argues that Boltzmann brain production effectively shuts off once the quantum vacuum relaxes to its stationary state, so the total number of BBs stays small no matter how long the universe lasts.
  • #1
durant35
In his 2005 paper titled 'Lifetime of the universe' [ https://arxiv.org/abs/hep-th/0510003 ], Canadian physicist Don Page argues that our universe must end on a timescale of ##10^{60}## years to avoid having more Boltzmann brains (BBs) than normal observers. If it doesn't, the spacetime volume of our comoving region will reach ##e^{10^{50}}## and, in his words, this will create more BBs.

My question is, isn't that too short a lifetime? The expected timescale for a single BB to fluctuate is a double exponential, while his timescale for getting more BBs is only a single exponential. What is wrong with this argument?
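To put rough numbers on the comparison that bothers me (my own order-of-magnitude figures, taken from the usual far-future estimates rather than from Page's paper): the commonly quoted waiting time for a single BB to fluctuate into existence is of order
$$ t_{\rm BB} \sim 10^{10^{50}}\ \text{years}, $$
a double exponential, whereas Page's doomsday bound is only
$$ t_{\rm Page} \sim 10^{60}\ \text{years}, $$
a single exponential, vastly shorter than the first number.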

Also, in recent years Sean Carroll has mounted an impressive line of defence against BBs, but his timescale beyond which BBs no longer occur is also a double exponential number, much larger than Page's prediction for the doomsday. It seems that these two models are in contradiction, and I can't see where, so any help would be greatly appreciated.
 
  • #2
durant35 said:
Canadian physicist Don Page argues that our universe must end on a timescale of ##10^{60}## years to avoid having more Boltzmann brains (BBs) than normal observers. If it doesn't, the spacetime volume of our comoving region will reach ##e^{10^{50}}## and, in his words, this will create more BBs.

You appear to be mixing up the numbers somewhat. The time scale of ##10^{60}## years assumes that the universe continues exponential expansion (i.e., dark energy dominated). The ##e^{10^{50}}## number is not a volume, it's a different time scale (in years), based on an alternate assumption that the expansion eventually switches from exponential to power law (i.e., either radiation or matter dominated), based on speculations that dark energy might be unstable and might eventually decay into either radiation or some kind of matter field.
 
  • #3
durant35 said:
The expected timescale for a single BB to fluctuate is a double exponential

Do you have a reference for this?

durant35 said:
his timescale for getting more BBs is only a single exponential

Page's time scales are not for "getting more BBs". They are for the universe ending in a Big Crunch.
 
  • #4
PeterDonis said:
You appear to be mixing up the numbers somewhat. The time scale of ##10^{60}## years assumes that the universe continues exponential expansion (i.e., dark energy dominated). The ##e^{10^{50}}## number is not a volume, it's a different time scale (in years), based on an alternate assumption that the expansion eventually switches from exponential to power law (i.e., either radiation or matter dominated), based on speculations that dark energy might be unstable and might eventually decay into either radiation or some kind of matter field.

I'm not referring to the abstract and the section where he indeed mentions the model in which dark energy decays, but to the section of the paper where he says that the spacetime volume needs to reach ##e^{10^{50}}## for vacuum fluctuation observations to outnumber human observations, and that if the exponential expansion continues this will be accomplished in ##10^{60}## years.

Anyway, I'm pretty sure that I'm mixing up the numbers somewhere but the part I just mentioned is what he wrote down. That's the part I don't understand.
PeterDonis said:
Do you have a reference for this?

Well, the Wikipedia article 'Timeline of the far future' gives an expected timescale, and I could quote it, but I would rather stay relevant and focus on Page's assumptions.

PeterDonis said:
Page's time scales are not for "getting more BBs". They are for the universe ending in a Big Crunch.

Please re-read the section of the paper I mentioned.
 
  • #5
durant35 said:
the section of the paper where he says that the spacetime volume needs to reach ##e^{10^{50}}## for vacuum fluctuation observations to outnumber human observations, and that if the exponential expansion continues this will be accomplished in ##10^{60}## years.

Ah, yes, I see; the ##e^{10^{50}}## number is both a spacetime volume and a timescale for the case where the expansion becomes power law in the far future (with exponent of order unity).
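Roughly, as I read it (putting in a Hubble time of order ##10^{10}## years by hand, which is my own shorthand rather than a number quoted at that point in the paper): under continued exponential expansion the 4-volume of a comoving region grows like ##e^{3Ht}##, so it reaches ##e^{10^{50}}## when
$$ 3Ht \sim 10^{50} \quad\Longrightarrow\quad t \sim 10^{50}\,H^{-1} \sim 10^{60}\ \text{years}, $$
whereas under power-law expansion the 4-volume only grows like some power ##t^{\,n}## with ##n## of order a few, so reaching ##e^{10^{50}}## takes
$$ t \sim e^{10^{50}/n} \sim e^{10^{50}}\ \text{years}. $$
That's why both a single-exponential timescale (##10^{60}## years) and a double-exponential one (##e^{10^{50}}## years) appear in the same paper, depending on the assumed expansion history.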

durant35 said:
Please re-read the section of the paper I mentioned.

I have, and I don't see anywhere that he talks about "getting more BBs". The only definition of what he means by "lifetime of the universe" is in the section on lower bounds, and there it is clear that he means "time until the universe recollapses in a Big Crunch".
 
  • #6
PeterDonis said:
I have, and I don't see anywhere that he talks about "getting more BBs". The only definition of what he means by "lifetime of the universe" is in the section on lower bounds, and there it is clear that he means "time until the universe recollapses in a Big Crunch".
He doesn't necessarily talk about recollapse. Rather, he seems to focus on the possibility of quantum vacuum decay (after which no new BBs would be created).

My guess is that this work is undercut by the results of this paper: https://arxiv.org/abs/1405.0298

The main issue here is that you can't just compare the lifetimes. If the above paper is correct, then the quantum vacuum will settle into a non-fluctuating steady state long before any significant number of Boltzmann Brains are created. A quote from the paper (on page 24):
If the horizon volume is initially in an excited state (as it is if the dark energy is a positive cosmological constant), then the cosmic no-hair theorem dictates that correlations fall off exponentially with time as the excitations leave the horizon. The total number of Boltzmann brains will thus be finite and presumably small, given the vast exponential suppression of macroscopic fluctuations. Thus, if enough observers are produced before de Sitter space approaches the vacuum (e.g., in a period of structure formation) the vast majority of observers can, in fact, trust their memories and observations.
This isn't definitive, I don't think: they don't appear to have gone through the very challenging calculation of actually estimating the number of Boltzmann Brains, but my guess is that under very reasonable assumptions it would be easy to show that the total number would be tiny absent a cosmological constant. But it does demonstrate that Don Page's calculations are likely to be invalid.
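A cartoon version of that logic (my own schematic, not anything taken from the paper): if the rate of BB fluctuations per horizon volume decays as the state relaxes to the vacuum, say ##\Gamma(t) \approx \Gamma_0 e^{-t/\tau}## with ##\tau## of order the relaxation time, then the expected total number is
$$ N_{\rm BB} \approx \int_0^\infty \Gamma(t)\,dt = \Gamma_0\,\tau, $$
which stays finite (and, given how tiny ##\Gamma_0## is, presumably tiny) no matter how long the de Sitter phase lasts. A constant rate, by contrast, integrated over an unbounded time or an exponentially growing 4-volume, would eventually dominate. That's the sense in which you can't just compare lifetimes.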
 
  • #7
kimbyd said:
He doesn't necessarily talk about recollapse. Rather, he seems to focus on the possibility of quantum vacuum decay (after which no new BBs would be created).

My guess is that this work is undercut by the results of this paper: https://arxiv.org/abs/1405.0298

The main issue here is that you can't just compare the lifetimes. If the above paper is correct, then the quantum vacuum will settle into a non-fluctuating steady state long before any significant number of Boltzmann Brains are created. A quote from the paper (on page 24):

This isn't definitive, I don't think: they don't appear to have gone through the very challenging calculation of actually estimating the number of Boltzmann Brains, but my guess is that under very reasonable assumptions it would be easy to show that the total number would be tiny absent a cosmological constant. But it does demonstrate that Don Page's calculations are likely to be invalid.
Yup, this is the paper I was talking about. I think the second main issue is that Don Page makes his estimate without including the exponential suppression that Carroll and colleagues mention. Also, Page's paper is cited as reference 42 as an example where BBs are created in a Minkowski vacuum, so it seems that his paper really uses assumptions about the Minkowski vacuum and generalizes them to an expanding universe.

Anyway, thanks for the reply. If I might ask, have you read Page's paper? Do you have any clue where his calculations might have gone wrong?
 
  • #8
durant35 said:
Yup, this is the paper I was talking about. I think the second main issue is that Don Page makes his estimate without including the exponential suppression that Carroll and colleagues mention. Also, Page's paper is cited as reference 42 as an example where BBs are created in a Minkowski vacuum, so it seems that his paper really uses assumptions about the Minkowski vacuum and generalizes them to an expanding universe.

Anyway, thanks for the reply. If I might ask, have you read Page's paper? Do you have any clue where his calculations might have gone wrong?
I didn't read it in detail, but I think he's just assuming fluctuations due to the Hawking temperature of de Sitter space. That's not necessarily wrong, since this isn't settled science. But there is reason to doubt.
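For reference, the temperature in question would be the Gibbons-Hawking temperature of the de Sitter horizon,
$$ T_{\rm dS} = \frac{\hbar H}{2\pi k_B}, $$
which for the observed dark energy density comes out to roughly ##10^{-30}## K, so any thermal fluctuation into something as macroscopic as a brain carries a fantastically small Boltzmann factor. (That's my gloss on what I take Page to be assuming, not a formula quoted from his paper.)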
 
  • #9
kimbyd said:
I didn't read it in detail, but I think he's just assuming fluctuations due to the Hawking temperature of de Sitter space. That's not necessarily wrong, since this isn't settled science. But there is reason to doubt.

That's exactly the motivation behind my thread. Hawking radiation from the de Sitter horizon becomes relevant much later; as far as I know, the relevant numbers are all exponentially greater than his ##10^{60}## years, which is basically nothing in comparison to the timescale on which the universe will approach the de Sitter vacuum.

What he is basically saying is that, if the universe keeps expanding exponentially, a bunch of Boltzmann brains will appear much earlier because of the size of the universe - or the total volume of spacetime it will reach - rather than because of the time component. As far as I know, and as you confirmed, most cosmologists don't believe things work that way; they think it is basically the 'true' de Sitter phase that is the problem, as in the analysis of the paper you mentioned. But Page is suggesting otherwise, and I want to know why - on what basis did he arrive at the calculation you described as unlikely to be true?

I've also read his second paper, 'Susskind's challenge to the Hartle-Hawking no-boundary proposal and possible resolutions', where he refers to the paper I cited in this thread with these words:

"In [68] I used SE(medium brain) ∼ 1050 to estimate that if the de Sitter spacetime lasts longer than about 1050t0 ∼ 1060 years, then the spacetime 4-volume would be so large that one would expect many observers to fluctuate into existence in it, rather than having just a very low probability per de Sitter spacetime."

So it seems that he takes the (tiny) probability per spacetime 4-volume that can contain a brain and extrapolates it to a spacetime volume huge enough that BBs come to dominate - in contrast with the conventional picture of a low probability per de-Sitter-universe-contained-in-a-horizon.
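If that reading is right, the estimate amounts to something like (my paraphrase of the logic, not an equation written out in the paper)
$$ N_{\rm BB} \sim V_4 \, e^{-S_E(\text{brain})}, \qquad S_E \sim 10^{50}, $$
so the expected number of BBs exceeds one as soon as the total 4-volume reaches ##V_4 \sim e^{10^{50}}##, which under continued exponential expansion happens after about ##10^{50} t_0 \sim 10^{60}## years, long before any single horizon volume would be expected to fluctuate a BB on its own.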

Could this be right, or are there other suggestions as to what is wrong or different in Page's original argument?
 
  • #10
I still don't understand why Page's and Carroll's calculations differ so dramatically. Carroll's model seems very natural, but the numbers are drastically different from Page's. Maybe someone with better insight can read both papers and point out specifically where Page's argument goes wrong.
 

