#141 .Scott (Science Advisor, Homework Helper)
kith said:
> This idea of Penrose might be interesting for you.

Just looking at the chapter titles in sections 1 and 2 of his book, it appears Penrose and I are on the same page.
This won't be the first time. I give kudos to Penrose for being emphatic that consciousness, at its root, is an artifact of QM. I think most physicists would agree - but it's not an allowed topic on this board.
kith said:
> I still have some objections. Unfortunately, your non-standard terminology makes it hard to get to the core of the issue. For example, it doesn't make sense to call an event which reduces entropy a "decoherence event", although the underlying idea may well be valid.

Given my background, and specifically my lack of a serious physics background, I am certainly open to challenges on my use of terminology. Although I was familiar with the term before using it on this board, my usage here was based on how previous posts were using it.
Having reviewed the wiki article, I see that the term "decoherence" carries some baggage that is interesting but not fundamental to how I am describing the increase in information or entropy.
What is essential to my arguments is that decoherence (or a rose by any other name) creates a set of outcomes with no possibility, even in principle, of knowing which one you will observe. The real problem is with the notion of "increasing entropy" or "adding information".

I noticed that the title of Penrose's section 2 is "The oddly special nature of the Big Bang". The Big Bang did create something special: it created an environment filled with clocks. We know it happened about 13.8 billion years ago! In our current universe, there is no reasonable doubt about which world predates which. Whenever there is a decoherence event, all of the possible outcomes create worlds that have never occurred before, so our formula for added information is not challenged. However, if you are in a world where heat death has occurred and your world has limited mass and space, then you cannot presume that entropy has increased - even though it is the very same type of event.
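To be concrete, the "formula for added information" I mean here is just the standard self-information of an outcome (my shorthand, not Penrose's notation): an outcome with probability $p$ contributes

$$I = -\log_2 p \ \text{bits},$$

and averaging over all outcomes of the event gives the usual Shannon entropy $H = -\sum_i p_i \log_2 p_i$.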
For example, in the current world a single photon leaves a flood lamp through a minuscule aperture and strikes one of a billion atoms, each of those atoms representing a transition into a world that has never existed since the Big Bang. So if, for one particular atom, its chance of being struck is one in a billion, then the world entered when the photon hits that atom carries an additional 30 bits relative to the pre-decoherence world (since $-\log_2 10^{-9} \approx 30$).
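A quick numeric check of that figure (a minimal sketch; the one-in-a-billion probability is the value assumed above):

```python
import math

# Self-information of an outcome with probability p: I = -log2(p) bits.
p = 1.0e-9                 # assumed chance this particular atom is struck
bits = -math.log2(p)       # bits added relative to the pre-decoherence world
print(f"{bits:.1f} bits")  # prints "29.9 bits", i.e. roughly 30
```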
But that same scenario in a world that has, in principle, lost track of time will add 30 bits but may land you in a world that has already been "created" - one with a shorter timeline than the world in which the photon had not yet decohered. The decoherence event is fundamentally the same. The probability is the same. You still have 30 "added" bits, but they are not really new bits, because they do not land you in a unique, never-before-seen world.
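As a toy illustration of that "already-created world" point (my own sketch, not a physical model): if the number of distinguishable worlds is capped at some $N$, then repeated branching events, each modeled as a uniform draw from those $N$ worlds, must eventually repeat a world - and by the birthday effect this happens after only on the order of $\sqrt{N}$ events:

```python
import random

def events_until_repeat(n_worlds: int, seed: int = 0) -> int:
    """Count branching events until some world occurs a second time."""
    rng = random.Random(seed)
    seen = set()
    events = 0
    while True:
        world = rng.randrange(n_worlds)  # outcome of one decoherence event
        events += 1
        if world in seen:                # landed in an already-created world
            return events
        seen.add(world)

# With a billion distinguishable worlds, a repeat shows up after only
# tens of thousands of events, not after a billion.
print(events_until_repeat(10**9))
```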
kith said:
> The most obvious point is about the Bekenstein bound. The bound takes its maximum entropy value for a black hole. A black hole is not isolated from its environment: it absorbs matter and emits Hawking radiation. My understanding is that the bound occurs in the first place because a region of a certain radius which contains a certain amount of matter (resp. energy) cannot be isolated better from its surroundings than a black hole. I don't see how it makes sense to apply this bound to the universe as a whole.
>
> /edit: Also I think we need to keep in mind that we are not talking about the MWI here but about a speculative combination of the MWI and general relativity. As far as I know, the Bekenstein bound is derived from both GR and QM. We know that the simple combination of GR and QM is impossible at least in some cases. So the bound could be an expression of this incompatibility.

The part of the Bekenstein bound that is critical to my argument is that it puts a cap on the amount of information that can be held by any world. If there were no such limit, there would be no upper bound on entropy, we would not have to worry as much about heat death, and we would not have to worry at all about being, in principle, unable to track time.
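For reference, the bound in question is $S \le 2\pi k R E / (\hbar c)$, or about $2\pi R E / (\hbar c \ln 2)$ bits. A minimal numeric sketch (the 1 kg / 1 m example values are my own illustration, not anything from the thread):

```python
import math

hbar = 1.054_571_817e-34   # reduced Planck constant, J*s
c = 2.997_924_58e8         # speed of light, m/s

def bekenstein_bits(radius_m: float, energy_j: float) -> float:
    """Bekenstein bound on information: I <= 2*pi*R*E / (hbar*c*ln 2)."""
    return 2 * math.pi * radius_m * energy_j / (hbar * c * math.log(2))

# Example: 1 kg of rest energy (E = m c^2) confined to a 1 m sphere.
energy = 1.0 * c**2
print(f"{bekenstein_bits(1.0, energy):.3e} bits")  # ~2.577e43 bits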