A new approach to the information paradox

In summary, new black hole simulations incorporating quantum gravity suggest that when a black hole dies, it produces a gravitational shock wave that radiates information. This finding could potentially solve the information paradox surrounding black holes. The simulations show that black holes are optimal scramblers of information and are protected from descrambling by external observers as long as the remaining black hole is macroscopic. However, if the radiation emitted by the black hole is not pseudorandom, an external observer may be able to create a firewall by applying a polynomial-time quantum computation. There therefore appears to be a race condition between the scrambling capacity of a black hole and the decoding capacity of an external agent, which highlights the importance of studying black hole information in the final stages of evaporation.
  • #1
Oldman too
Summary: New black hole simulations that incorporate quantum gravity indicate that when a black hole dies, it produces a gravitational shock wave that radiates information, a finding that could solve the information paradox.

Hello, please excuse the rather "conversational" approach I'm using; the facts are simply that I'm posting on a subject beyond my "skill set". This of course can be problematic on PF, and I'm hoping to avoid the appearance of not knowing or caring about that point. My reason for posting this article is twofold: first, to get an informed opinion from members, and second, to toss onto the table, so to speak, for discussion and consideration, what may be a rather novel approach to a somewhat vexing problem. The level I'm posting at is a sort of middle ground; please adjust as required. I'll do my best to follow, but the subject will quickly outpace my understanding of the topic, and I wouldn't want a discussion limited to my "B thread" understanding of the science involved. That being said, I want to thank, in advance, everyone for their thoughts on the paper I'm citing. As a side note I should add that I was drawn to the theory by the rather intriguing fact that most theories avoid speculation concerning what goes on "inside" the BH; this one is different.

https://physics.aps.org/articles/v15/s35
https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.128.121301

Thanks again, Scott
 
  • #2
Black hole dies? When and how?
 
  • #4
Moderator's note: Moved thread to Beyond the Standard Model forum since it has to do with quantum gravity models.
 
  • #5
I have no comments on the simulations but my opinion is that the generic idea that information can be recovered from the radiation of a black hole, regardless of exactly how it is emitted, is a very reasonable and rational resolution to the information paradox.

The view that black holes are optimal scramblers of information, and that at least large black holes are protected from descrambling by outside observers up to those observers' limited decoding capacity, is as I see it the generic perspective here, setting aside specific models of what actually happens in the final stages of evaporation.

The ghost in the radiation: Robust encodings of the black hole interior​

"This encoded interior is entangled with the late outgoing Hawking quanta emitted by the old black hole, and is inaccessible to computationally bounded observers who are outside the black hole. Specically, ecient operations acting on the radiation, those with quantum computational complexity polynomial in the entropy of the remaining black hole, commute with a complete set of logical operators acting on the encoded interior, up to corrections which are exponentially small in the entropy. Thus, under our pseudorandomness assumption, the black hole interior is well protected from exterior observers as long as the remaining black hole is macroscopic. On the other hand, if the radiation is not pseudorandom, an exterior observer may be able to create a rewall by applying a polynomial-time quantum computation to the radiation."
-- https://arxiv.org/abs/2003.05451

There seems to be a race condition between the scrambling capacity of a black hole and the decoding capacity of an external agent. This is IMO the most natural and logical explanation of why there is a big difference between a macroscopic (classical) black hole and a microscopic one. Conclusions drawn from semiclassical ideas (i.e., a macroscopic BH) likely aren't valid for a small, evaporating BH.
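To make "optimal scrambler" concrete, here is a standard toy calculation (my own sketch, not from either paper): model the hole plus its radiation as a Haar-random pure state on n qubits and track the entanglement entropy of the radiation as the hole evaporates qubit by qubit.

```python
# Toy "Page curve": entanglement entropy of the radiation vs. fraction emitted.
# My own sketch (not from either paper): hole + radiation are modeled as a
# Haar-random pure state on n qubits, the usual cartoon of an optimal scrambler.
import numpy as np

rng = np.random.default_rng(0)
n = 12                                   # total qubits: black hole + radiation
dim = 2 ** n

# Haar-random pure state: complex Gaussian vector, normalized.
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)

for k in range(n + 1):                   # k = qubits already radiated away
    # Schmidt values across the radiation|hole cut, via SVD.
    m = psi.reshape(2 ** k, 2 ** (n - k))
    p = np.linalg.svd(m, compute_uv=False) ** 2
    p = p[p > 1e-15]
    S = max(0.0, -np.sum(p * np.log2(p)))  # entanglement entropy in bits
    print(f"radiated {k:2d}/{n} qubits: S = {S:.3f} bits")
```

The entropy rises until half the qubits have been emitted and only then comes back down (the familiar Page curve): the information is in the radiation all along, but extracting it early means undoing a (pseudo)random encoding, which is exactly the complexity-theoretic race above.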

/Fredrik
 
  • #6
Fra said:
I have no comments on the simulations but my opinion is that the generic idea that information can be recovered from the radiation of a black hole, regardless of exactly how it is emitted, is a very reasonable and rational resolution to the information paradox.
...
There seems to be a race condition between the scrambling capacity of a black hole and the decoding capacity of an external agent.
Hello Fra, thank you for your reply as well as your thoughts on my post. I would imagine that you read the initial post as intended, that being basically a disclaimer of my "skill set" concerning the physics of black holes. It would be safe to say that 99% of what I do know has been learned here on PF; still, that isn't the same as, say, completing the education most people here have. With that being said, I'll share my thoughts on your post and hope you, and everyone else, will forgive my limited technical skills.

I'm curious about your statement, "I have no comments on the simulations" as the simulations appear to me to be at the heart of the paper cited.

"Analyzing the results of their simulations, the team uncovers three surprising findings: First, they see that matter “bounces” inside a black hole, moving both toward and away from the hole’s central point. Second, dying black holes emit a gravitational shock wave that should be detectable by a distant observer. Third, black holes have much shorter lives than previous theories predict. Husain says that the root of both findings is the removal by quantum gravity of a point of infinite density that Einstein’s theory predicts exists at a black hole’s center."

I do realize that for the most part we are both citing theory, and these theories are works in progress; however, they are the best models available at the time, so all one can do is keep chipping away at the problem and hope for good data. Perhaps between the Event Horizon Telescope, JWST and related projects there will be solid evidence in our lifetimes; we can always hope.

From your point of view, how do the papers we have both mentioned compare? That is to say, are they complementary in any sense, or divergent on main concepts? I've got to admit that the idea of pseudorandomness is a new one to me (this should be a good indicator of my familiarity with the concepts discussed). I can of course search the term and get a good idea of the meaning, but that usually ends in a lack of context, thus my query on your thoughts.

I'm already learning from this thread and look forward to hearing more.
Thanks, Scott
 
  • #7
Oldman too said:
I'm curious about your statement, "I have no comments on the simulations" as the simulations appear to me to be at the heart of the paper cited.
...

From your point of view, how do the papers we have both mentioned compare? That is to say, are they complementary in any sense, or divergent on main concepts? I've got to admit that the idea of pseudorandomness is a new one to me (this should be a good indicator of my familiarity with the concepts discussed). I can of course search the term and get a good idea of the meaning, but that usually ends in a lack of context, thus my query on your thoughts.
Yes, they use a specific model of QM, run a simulation, and find some results that differ from the semiclassical ones, for example a different dependence of the lifetime on the mass. As my own approach takes a different route, I leave commenting on their choice of model to others.

From my perspective (which seeks a yet unknown reconstruction of QM, in which gravity is expected to emerge naturally), I see the paper as still a sort of semiclassical toy model, but even if not perfect, such models and simulations often give interesting insights. I think one has to do research both from semiclassical or toy models and from radical new ideas, and hope to meet somewhere.

I commented only on the point that a solution to the information paradox seems to lie in the fact that the communication/radiation from the black hole, once it is small, is no longer random and thus may be decodable. I see that as a more generic or "principal" insight, independent of the detailed models. But the fact that various toy simulations support this direction is interesting.

This is why I refrain from commenting on their choice of variables, which, as they say, is also motivated by loop quantum gravity.

/Fredrik
 
  • #8
Hi, thank you very much for the reply. Your statements and answers seem very logical; I have new things to consider on the subject now, an interesting viewpoint.
An example would be that, with my limited knowledge, I wasn't aware the paper was actually saying that small black holes are no longer random, an insight in itself (you have, in a sense, made it "decodable").

"This is why I refrain from commenting their choice of variables, which as they say also is motivated by loop quantum gravity." To myself most "variables" are of the "hidden" variety, I must add that I respect your reasoning concerning "commenting their choice of variables". As Ben Barres said "Science proceeds at its best when it includes diverse studies performed by diverse scientists."

I know we are supposed to stay away from personal theories on PF, so I'm going to ask you for a personal opinion instead. (I hope that is an acceptable alternative.)
Concerning the loop quantum gravity variable choice the authors pursue, is there a particular reason that you choose not to comment on it? By that I mean, are you aware of another theory, besides the "discrete chunk" approach that LQG uses, that is more to your liking, or do you have a policy of not embracing any one theory over another? This is a sincere question; I don't want to be misconstrued as "asking a dumb question", it's just that from my level of (expertise?) framing the question in that form makes it easy for me to pick up useful information. I hope the mods will take this into account and let the personal view slide.

Thanks again for your response, Scott
 
  • #9
Oldman too said:
Concerning the loop quantum gravity variable choice the authors pursue, is there a particular reason that you choose not to comment on it? By that I mean, are you aware of another theory, besides the "discrete chunk" approach that LQG uses, that is more to your liking, or do you have a policy of not embracing any one theory over another?
The field of beyond-the-standard-model physics, and the quest for unification, is very broad: it involves foundational problems and potential reconstruction of QM, "quantisation of gravity", and unification of all forces. Some research programs focus on some things while ignoring others, and some try to address more issues at once. Each program has some starting points from which one seeks explanations. Typically, the starting points of one program are open questions from the perspective of other programs.

I am neither an expert on nor a fan of LQG, so I leave it to others more qualified to comment; my preferred focus is elsewhere anyway.

My main perspective and vision is a deeper reconstruction of all interactions and of QM from the perspective of agent-agent interactions, which is an information-theoretic, gaming abstraction of "interactions"; there is still work to do to map the new abstractions onto physical concepts such as mass and energy. The starting points I prefer are very different from those of LQG or string theory.

/Fredrik
 
  • #10
Hello, thank you for the excellent reply, very informative. Your approach of treating the issue from an agent-agent viewpoint is something I'm not familiar with; do you find it more "user friendly" than string or LQG?
Cheers, Scott
 
  • #11
Oldman too said:
do you find it more "user friendly" than string or LQG?
No.

If one thinks QM is weird, this is even weirder and more radical, because the focus is on learning and evolving algorithms, not pre-fixed rules/laws that one discovers. It requires a new paradigm for thinking about and analysing things.

This is an immature topic, but some work by others that is tangent to this, and which is only part of the full story, may still give ideas; one example is

Law without law: from observer states to physics via algorithmic information theory​

-- https://arxiv.org/abs/1712.01826

Other ideas are QBism, and computational-complexity analyses of dualities, where the connection is that there are evolutionary advantages (due to minimal information resources) for nature (an agent) to prefer one dual encoding over the other. So when one asks which predictive system is "best", it's not about some in-absurdum asymptotic consistency; one also needs to consider the here-and-now constraints of computational resources. This is how the method becomes more important than some imaginary final result, and it is where the processing and computational parts come in. But there are plenty of unsolved problems here, for example the exact mapping of the abstractions onto conventional terms, spacetime, etc.; this may also lead nowhere.
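As a crude, computable cartoon of "preferring the cheaper of two dual encodings" (my own toy, nothing more): Kolmogorov complexity is uncomputable, but compressed length gives an upper bound that a resource-bounded agent could actually use to rank two descriptions of the same data. The strings and the zlib proxy below are purely illustrative assumptions.

```python
# Toy of an agent ranking two "dual" encodings of the same data by description
# length. Compressed size is a crude, computable upper bound on algorithmic
# (Kolmogorov) complexity -- a cartoon, not a claim about any real duality.
import zlib

# The same process described two ways: the raw samples vs. its generating rule.
samples = "".join(str((7 * k) % 10) for k in range(10000))   # encoding A: data
rule = "digits of (7*k) % 10 for k in range(10000)"          # encoding B: rule

def cost(description: str) -> int:
    """Description length in bytes after compression (complexity proxy)."""
    return len(zlib.compress(description.encode()))

for name, enc in [("raw samples", samples), ("generating rule", rule)]:
    print(f"{name:16s}: {cost(enc):5d} bytes")
# A bounded agent would prefer whichever description is cheaper to store.
```

The point is only that "which encoding is best" becomes a here-and-now resource question, not an asymptotic one.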

/Fredrik
 
  • #12
This is a whole new paradigm (to me), basically a purely mathematical approach with no physical aspects involved?
 
  • #13
Oldman too said:
This is a whole new paradigm (to me), basically a purely mathematical approach with no physical aspects involved?
Don't let the abstractions confuse you. The idea is of course some isomorphism between physical interactions and the abstractions. On the contrary, I take the physical constraints of the agent more seriously; I am not into this for the math itself.

/Fredrik
 
  • #14
mathman said:
Black hole dies? When and how?
My thoughts exactly.

Stellar-sized and larger black holes generally don't die, because they gain more mass from cosmic background radiation and interstellar dust than they lose to Hawking radiation. You'd need an extremely pure vacuum for this conclusion not to hold, and I'm not convinced that such a pure vacuum could exist even in theory (although it is less obvious in that case).

It could happen for primordial black holes, which can in principle be smaller than that and do evaporate, but there is no evidence that they exist, and it doesn't solve the information paradox (to the extent that it really is a paradox) for "normal" black holes for which we have strong observational evidence.
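A quick back-of-the-envelope check of the first claim, a sketch using only the standard Hawking temperature formula T_H = ħc³/(8πGMk_B) and CODATA constants (the specific masses are just illustrative):

```python
# Back-of-the-envelope: a stellar-mass black hole is far colder than the CMB,
# so today it absorbs more than it radiates. Standard Hawking formula only.
import math

hbar, c, G, kB = 1.054571817e-34, 2.99792458e8, 6.67430e-11, 1.380649e-23
M_sun, T_cmb = 1.98892e30, 2.725      # kg; K (today's CMB temperature)

def hawking_T(M_kg: float) -> float:
    """Hawking temperature (K) of a Schwarzschild hole of mass M_kg."""
    return hbar * c**3 / (8 * math.pi * G * M_kg * kB)

print(f"T_H(1 M_sun) = {hawking_T(M_sun):.2e} K  vs  T_CMB = {T_cmb} K")

# Break-even mass: a hole is hotter than today's CMB only below this mass.
M_be = hbar * c**3 / (8 * math.pi * G * kB * T_cmb)
print(f"break-even mass ~ {M_be:.2e} kg (~{M_be / 7.342e22:.2f} lunar masses)")
```

A stellar-mass hole sits nearly eight orders of magnitude colder than today's CMB; only holes below roughly a lunar mass would currently out-radiate what they absorb.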
 
  • #15
ohwilleke said:
I'm not convinced that such a pure vacuum could exist even in theory
I'm not sure what you mean by this. I'm not aware of any theory that says there is a theoretical lower limit to energy density.

In a universe with a positive cosmological constant (which ours appears to be), the cosmological horizon has a finite temperature and emits black body radiation at that temperature, so any black hole whose mass is large enough that its Hawking temperature is less than the horizon temperature will be a net absorber of radiation forever, regardless of what happens with anything else (and over long enough time scales matter and the CMBR will dilute to the point where their effects are negligible compared to the horizon temperature). That is a valid reason to doubt whether sufficiently large black holes will ever evaporate even if the theoretical prediction that black hole evaporation is possible is correct. But it has nothing to do with any theoretical limit on "pure vacuum".
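To put rough numbers on this (my arithmetic, standard formulas only: de Sitter horizon temperature T_dS = ħH/(2πk_B), with an assumed late-time Hubble rate for our universe):

```python
# Rough numbers for the horizon-temperature argument above. Standard formulas:
# de Sitter temperature T_dS = hbar*H/(2*pi*kB); Hawking temperature as usual.
# H_Lambda is an assumed late-time Hubble rate, ~sqrt(Lambda c^2 / 3).
import math

hbar, c, G, kB = 1.054571817e-34, 2.99792458e8, 6.67430e-11, 1.380649e-23
H_Lambda = 1.9e-18                      # 1/s, rough value for our universe

T_dS = hbar * H_Lambda / (2 * math.pi * kB)
print(f"T_dS ~ {T_dS:.1e} K")

# A hole whose Hawking temperature is below T_dS is a net absorber forever;
# that happens for masses above this threshold:
M_thresh = hbar * c**3 / (8 * math.pi * G * kB * T_dS)
print(f"threshold ~ {M_thresh:.1e} kg (~{M_thresh / 1.98892e30:.1e} M_sun)")
```

The threshold comes out around 10²² solar masses, far beyond any known black hole, which is why the caveat applies only to "sufficiently large" holes.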
 
  • #16
ohwilleke said:
and it doesn't solve the information paradox (to the extent that it really is a paradox) for "normal" black holes for which we have strong observational evidence.
The conceptual resolution for a large black hole is then that the information is not necessarily irreversibly lost; it may be thought of as just hidden/encrypted/screened beyond the decoding capacity of a typical outside observer. And indeed, unless the BH evaporates, it will remain hidden and effectively lost?

What remains unsolved is to reconstruct QM in a way that gives each agent a formalism that represents this, and that allows for information to be created (learned). This is not allowed in QM as it stands; and if by "paradox" one means to show that QM cannot handle this, then I agree it's not resolved. But conceptually this idea moves it from a conceptual mystery to a problem of revising QM.

/Fredrik
 
  • #17
Fra said:
But conceptually this idea moves it from a conceptual mystery to a problem of revising QM.
I like that thought.
 
  • #18
PeterDonis said:
I'm not sure what you mean by this. I'm not aware of any theory that says there is a theoretical lower limit to energy density.

In a universe with a positive cosmological constant (which ours appears to be), the cosmological horizon has a finite temperature and emits black body radiation at that temperature, so any black hole whose mass is large enough that its Hawking temperature is less than the horizon temperature will be a net absorber of radiation forever, regardless of what happens with anything else (and over long enough time scales matter and the CMBR will dilute to the point where their effects are negligible compared to the horizon temperature). That is a valid reason to doubt whether sufficiently large black holes will ever evaporate even if the theoretical prediction that black hole evaporation is possible is correct. But it has nothing to do with any theoretical limit on "pure vacuum".
Temperature is, more fundamentally, derived from fundamental particles. And if you have enough fundamental particles with enough momentum/mass-energy in the vicinity of every black hole, then you don't have a sufficiently "pure" vacuum (since it has too many particles in it), and the temperature in the vicinity of the black hole is raised above its Hawking temperature.

Is there a sufficient lower bound on that temperature to prevent all black holes from evaporating? Maybe. It isn't something someone can know without doing math with real world parameters, although your answer seems to indicate that the answer is yes.

I was thinking about a couple of things that could impose the limit:

1) the cosmological constant recast as a homogeneous dark energy scalar like quintessence (and hence not truly a vacuum) and

2) the bounds on the distribution of matter and energy in a universe that is approximately homogeneous, subject to variations in density governed by cosmological parameters and the observed mix of interstellar matter and radiation, relative to the minimum spatial extent of minimum-sized stellar black holes (with some additional limitations based upon how structure emerges around black holes).

My terminology may not be quite as conventional as would be ideal, but I think that you and I are actually on the same page here.
 
  • #19
ohwilleke said:
if you have enough fundamental particles with enough momentum/mass-energy in the vicinity of every black hole, then you don't have a sufficiently "pure" vacuum (since it has too many particles in it)
In other words, if you don't have a vacuum state, since there are too many particles, then you don't have a vacuum state, since there are too many particles. This is true, since it's a tautology, but I'm not sure what the point is of stating this tautology.

ohwilleke said:
Is there a sufficient lower bound on that temperature to prevent all black holes from evaporating?
That depends on the global properties of spacetime. If there is a positive cosmological constant, then the horizon temperature associated with it sets a lower bound on temperature, yes, since the horizon radiates at that temperature.

In spacetimes with zero cosmological constant, however, I'm not aware of any lower bound on temperature. It would not make sense for there to be one since it would amount to disallowing the quantum field from being in the vacuum state (which has zero temperature), as well as possibly other states, and I don't know of any theoretical basis for such an exclusion.

ohwilleke said:
It isn't something someone can know without doing math with real world parameters
I'm not sure what you mean by this. The theoretical issue is pretty simple; see above.
 
  • #20
Firstly, on one of the topics discussed: I haven't read it, but among the references is one on this account - https://arxiv.org/abs/2010.13480.

A second thought is that there is no reference to any QBist takes that would require notions of knowledge or subjectivity; rather, the information theory in the LQG literature seems to be relative information based on Shannon.
 
  • #21
Perhaps this could help, especially the questions and answers at the end
 

FAQ: A new approach to the information paradox

What is the information paradox?

The information paradox is a problem in physics that arises when trying to reconcile the principles of quantum mechanics and general relativity. It refers to the apparent contradiction between the idea that information cannot be destroyed, as stated by quantum mechanics, and the fact that information can seemingly be lost in a black hole, as predicted by general relativity.

What is the new approach to the information paradox?

One new approach, discussed in this thread, is to simulate black hole evaporation with quantum-gravity corrections; these simulations suggest that information is not actually lost in a black hole but is radiated away, in particular in a gravitational shock wave emitted when the hole dies. This is related to the broader idea that infalling information is encoded at the event horizon and can eventually be retrieved, as expressed by the holographic principle, which states that all of the information contained in a volume of space can be represented by information on the surface of that space.

How does the holographic principle explain the information paradox?

The holographic principle states that the information of a three-dimensional volume can be encoded on a two-dimensional surface. In the case of a black hole, the information that falls into it is encoded on the event horizon, which is a two-dimensional surface. This means that the information is not actually lost, but rather it is stored in a different form.
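To give a sense of the scale of this surface encoding, here is a quick computation of the Bekenstein-Hawking entropy S = k_B A/(4 l_P²) for one illustrative mass (standard formula only, not tied to any specific model in this thread):

```python
# How much information the horizon "surface" can hold: Bekenstein-Hawking
# entropy S = kB * A / (4 * l_p^2), evaluated for a solar-mass hole.
# Standard formulas; one illustrative mass, nothing model-specific.
import math

hbar, c, G, kB = 1.054571817e-34, 2.99792458e8, 6.67430e-11, 1.380649e-23
M = 1.98892e30                        # kg, one solar mass

r_s = 2 * G * M / c**2                # Schwarzschild radius (m)
A = 4 * math.pi * r_s**2              # horizon area (m^2)
l_p2 = hbar * G / c**3                # Planck length squared (m^2)
S_over_kB = A / (4 * l_p2)            # entropy in units of kB
print(f"horizon area A ~ {A:.2e} m^2")
print(f"S/kB ~ {S_over_kB:.2e}  (~{S_over_kB / math.log(2):.2e} bits)")
```

That is on the order of 10⁷⁷ bits for a solar-mass hole; the point is just that the "surface" has ample room for everything that fell in.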

What evidence supports the holographic principle?

There is significant theoretical support for the holographic principle: it is consistent with other frameworks such as string theory and black hole thermodynamics, and studies of black hole entropy and of the behavior of entangled particles lend it further support. Direct experimental confirmation, however, is still lacking.

What are the implications of the holographic principle?

The implications of the holographic principle could change our understanding of the universe: it suggests that all the information in a region of space may be encoded on a lower-dimensional surface, which could lead to a better understanding of the nature of space and time and potentially help resolve other long-standing puzzles in physics.
