Classical chaos and quantum mechanics

In summary, Urs Schreiber is saying that the thermal interpretation of quantum mechanics provides a resolution to the problem of the origin of quantum probabilities, but that this remains a conjecture until there is a proof.
  • #1
A. Neumaier
Peter Donis said:
Whatever is keeping us from making deterministic predictions about
the results of quantum experiments, it isn't chaos due to nonlinear
dynamics of the quantum state.

A. Neumaier said:
However, it is chaos in the (classical) part of the quantum state that
is accessible to measurement devices.

A. Neumaier said:
What we directly observe in a measurement device are only macroscopic (what I called ''classical'') observables, namely the expectation values of certain smeared field operators. These form a tiny minority of all conceivable observables in the conventional QM sense. For example, hydromechanics is derived in this way from quantum field theory. It is well known that hydromechanics is highly chaotic in spite of the underlying linearity of the Schrödinger equation defining (nonrelativistic) quantum field theory, from which hydromechanics is derived. Thus linearity in a very vast Hilbert space is not incompatible with chaos on a much smaller accessible manifold of measurable observables.

stevendaryl said:
I really don't think that chaos in the macroscopic world can explain the indeterminism of QM. In Bell's impossibility proof, he didn't make any assumptions about [...]

A. Neumaier said:
Bell doesn't take into account that a macroscopic measurement is actually done by recording field expectation values. Instead he argues with the traditional simplified quantum mechanical idealization of the measurement process. The latter is known to be only an approximation to the quantum field theory situation needed to be able to treat the detector in a classical way. Getting a contradiction from reasoning with approximations only shows that at some point the approximations break down.

stevendaryl said:
Well, that is certainly far from being an accepted resolution. I don't see how anything in his argument depends on that.

It is the resolution given by my thermal interpretation, and this resolution is valid (independent of the thermal interpretation) even without being accepted.

Bell assumes that measurement outcomes follow Born's rule for a von Neumann measurement strictly and with infinite precision. But the latter is known to be an idealization.
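For reference (standard textbook material, not specific to this thread): a von Neumann measurement of an observable ##A=\sum_k a_k P_k## with projections ##P_k## assigns to the outcome ##a_k## the exact probability
$$p_k = \langle\psi|P_k|\psi\rangle,$$
and it is this claimed exact validity for every real measurement that is being questioned here.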
 
  • #2
By the Ehrenfest theorem, localized wave packets move according to the classical laws. In this sense the linearity of QM is compatible with classical chaos.
 
  • #3
A. Neumaier said:
It is the resolution given by my thermal interpretation, and it is valid even without being accepted.

Last we talked about this on PO, you seemed to admit here that the claim you want to make, namely that there remain loopholes for a realistic interpretation, remains a conjecture. Maybe it's an interesting conjecture; certainly it is a strong one, in that a proof would make a huge splash in the community. But until there is such a proof, I find your way of speaking about the would-be result a little misleading.
 
  • #4
Urs Schreiber said:
Last we talked about this on PO, you seemed to admit here that the claim you want to make, namely that there remain loopholes for a realistic interpretation, remains a conjecture. Maybe it's an interesting conjecture; certainly it is a strong one, in that a proof would make a huge splash in the community. But until there is such a proof, I find your way of speaking about the would-be result a little misleading.
One does not need my thermal interpretation to point out
  1. that negative arguments based on idealizations prove nothing about the real case,
  2. the well-known facts that hydromechanics is chaotic and is derivable from quantum field theory,
  3. that the macroscopic variables in this derivation are field expectations.
These facts are valid independent of my thermal interpretation. Thus there can be no doubt that at least certain probabilistic features of our universe arise from deterministic quantum motion in the form of the quantum Liouville equation, which is equivalent to the Schrödinger equation.
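For reference, the quantum Liouville (von Neumann) equation mentioned here reads
$$i\hbar\,\frac{\partial\rho}{\partial t} = [H,\rho],$$
which for a pure state ##\rho=|\psi\rangle\langle\psi|## is equivalent to the Schrödinger equation ##i\hbar\,\partial_t|\psi\rangle = H|\psi\rangle##.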

In addition, these facts lend strong support to my conjecture that my thermal interpretation is sound and will, in due time, lead to a resolution of the problem of the origin of quantum probabilities. The latter requires detailed arguments why microscopic observable quantum effects behave statistically according to the usual laws. My claims in this respect are still conjectural, since my arguments are suggestive only and significant work remains to be done to turn them into powerful theorems.
 
  • #5
Demystifier said:
By Ehrenfest theorem, localized wave-packets move according to the classical laws.
Only in some approximation (approximating ##\langle f(q,p)\rangle## by ##f(\langle q\rangle,\langle p\rangle)##), valid only for very short times. Over longer times, the wave packets become strongly delocalized.
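To make the approximation explicit (standard textbook material, added for reference): for ##H = p^2/2m + V(q)##, Ehrenfest's theorem gives exactly
$$\frac{d}{dt}\langle q\rangle = \frac{\langle p\rangle}{m},\qquad \frac{d}{dt}\langle p\rangle = -\langle V'(q)\rangle,$$
and the classical equations of motion emerge only upon replacing ##\langle V'(q)\rangle## by ##V'(\langle q\rangle)##, which is exact for at most quadratic ##V## and otherwise degrades as the wave packet spreads.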
 
  • #6
When there is interaction with the environment that causes decoherence, the wave packet may stay localized for a long time.
 
  • #7
Demystifier said:
When there is interaction with the environment that causes decoherence, the wave packet may stay localized for a long time.
But this does not follow from Ehrenfest's theorem, which assumes exact dynamics (i.e., an isolated system).
 
  • #8
A. Neumaier said:
It is the resolution given by my thermal interpretation, and this resolution is valid (independent of the thermal interpretation) even without being accepted.

Well, you claim it’s valid, but I don’t think it is.
 
  • #9
stevendaryl said:
Well, you claim it’s valid, but I don’t think it is.
Which of my 3 points listed here is not valid, according to your thinking, and why not?
 
  • #10
A. Neumaier said:
Which of my 3 points listed here is not valid, according to your thinking, and why not?
Well, at first blush, it appears to contradict Bell’s theorem. If it is correct, that’s really big news, which shouldn’t be broken to the world in physics forums. It should be published in a refereed journal.
 
  • #11
stevendaryl said:
Well, at first blush, it appears to contradict Bell’s theorem.
No, because Bell's theorem is a mathematical theorem of the same kind as von Neumann's earlier theorem about the nonexistence of hidden variable theories. Both remain always true as purely mathematical facts.

But (something well recognized in the case of von Neumann's theorem) they apply to reality only insofar as the interpretation of the assumptions and conclusions is valid in reality.

The latter is not the case here, since Bell assumes the exact validity of Born's rule for every measurement, and at least measurements in hydrodynamics don't follow this pattern.
stevendaryl said:
It should be published in a refereed journal.
These 3 points are well-known stuff and cannot be published. I'll publish once I have the positive results conjectured in the remainder of my post #4.
 
  • #12
@A. Neumaier:
Bell's theorem at full rigour is of the form: Let ##A_\alpha, B_\beta : \Lambda\rightarrow[-1,1]## be random variables (for every ##\alpha,\beta\in[0,2\pi]##) on a probability space ##(\Lambda,\Sigma,\mathrm d\mu)##. Then the CHSH inequality holds. No further assumptions than the ones I listed go into the proof. I don't see which ones you want to relax. Can you elaborate?
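For reference, with ##\langle\cdot\rangle## denoting the expectation ##\int\cdot\,\mathrm d\mu##, the CHSH inequality in this notation reads
$$\bigl|\langle A_\alpha B_\beta\rangle + \langle A_\alpha B_{\beta'}\rangle + \langle A_{\alpha'} B_\beta\rangle - \langle A_{\alpha'} B_{\beta'}\rangle\bigr| \le 2$$
for all settings ##\alpha,\alpha',\beta,\beta'##.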
 
  • #13
Well, maybe you could go through, in a separate topic, how it works for spin-1/2 anticorrelated EPR pairs.
 
  • #14
rubi said:
@A. Neumaier:
Bell's theorem at full rigour is of the form: Let ##A_\alpha, B_\beta : \Lambda\rightarrow[-1,1]## be random variables (for every ##\alpha,\beta\in[0,2\pi]##) on a probability space ##(\Lambda,\Sigma,\mathrm d\mu)##. Then the CHSH inequality holds.
I know that; I published a paper on the subject, which was the germ of my thermal interpretation.

This is a purely mathematical theorem.
rubi said:
No further assumptions than the ones I listed go into the proof. I don't see which ones you want to relax. Can you elaborate?
I don't dispute the correctness of the theorem, and I stated that in the above discussion. Thus nothing needs to be relaxed.

I challenge the applicability to reality: real states are not exactly of the form required for them to be interpreted in terms of the CHSH inequality, and real observables cannot be exactly interpreted as random variables of the required form. Only idealizations that strip away most observable degrees of freedom except a few spin variables can be interpreted in these terms.
 
  • #15
A. Neumaier said:
I challenge the applicability to reality: real states are not exactly of the form required for them to be interpreted in terms of the CHSH inequality, and real observables cannot be exactly interpreted as random variables of the required form. Only idealizations that strip away most observable degrees of freedom except a few spin variables can be interpreted in these terms.
But for the applicability of the theorem, it doesn't matter whether ##A_\alpha## is a spin variable or a complicated function that takes into account the environment and whatnot. In an experiment, we record measurements that take the values ##-1## or ##+1##, and the only thing the theorem asks for is that these numbers can be predicted by some function on a space of hidden variables.
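A minimal numerical sketch of this point (an editorial illustration, not from the thread; the response functions below are an arbitrary hypothetical choice): any ##\pm1##-valued functions ##A_\alpha, B_\beta## of a shared hidden variable respect the CHSH bound 2, however complicated they are, while the quantum singlet correlation ##-\cos(\alpha-\beta)## reaches ##2\sqrt{2}##.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical hidden-variable model: deterministic +/-1 responses
# depending on the local setting and a shared hidden angle lam.
def A(alpha, lam):
    return np.sign(np.cos(alpha - lam))

def B(beta, lam):
    return -np.sign(np.cos(beta - lam))

def E_hidden(alpha, beta, n=200_000):
    # Monte Carlo estimate of the correlation, lam uniform on [0, 2*pi)
    lam = rng.uniform(0.0, 2 * np.pi, n)
    return np.mean(A(alpha, lam) * B(beta, lam))

def chsh(E, a, ap, b, bp):
    return abs(E(a, b) + E(a, bp) + E(ap, b) - E(ap, bp))

a, ap, b, bp = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4  # standard CHSH angles
E_qm = lambda x, y: -np.cos(x - y)  # quantum singlet-state correlation

print(chsh(E_hidden, a, ap, b, bp))  # ~ 2.0, at the classical bound
print(chsh(E_qm, a, ap, b, bp))      # = 2*sqrt(2) ~ 2.83, violating it

Swapping in any other measurable response functions (however environment-dependent) cannot push the first number above 2; that is the point being made here.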
 
  • #16
rubi said:
the only thing the theorem asks for is that these numbers can be predicted by some function on a space of hidden variables.
The theorem asks nothing. It is just a statement in probability theory, not a statement about measurements and their interpretation.

Its application to reality suffers from all the foundational problems that the application of the concept of probability to reality has, unless you specify this relation in a mathematically unambiguous way. But this seems impossible, as it only shifts the border between mathematical concepts and operational concepts. At some point everyone is left without reason (i.e., mathematical arguments) and resorts to pure belief (i.e., personal preferences in philosophy).
 
  • #17
May I ask if there is some mathematical/theoretical definition of what "truly" random events are, compared to classically chaotic ones (like dice)...

I have trouble understanding what the "probability space" of an observable would be. For example, isn't quantum spin linked to conserved quantities? So if (in a thought experiment) we measure the spin of every particle in the universe (at once, along some identical axis), aren't we supposed to measure a sum total of zero, even though the distribution of values is unpredictable?
Or is it the case that if we redo the same experiment a sufficient number of times, the average would converge to zero?
 
  • #18
Boing3000 said:
May I ask if there is some mathematical/theoretical definition of what "truly" random events are
In mathematics, randomness is
  • either defined as lack of algorithmic accessibility (Chaitin) and then related to nonconstructive mathematics (axiom of choice),
  • or something undefined, implicitly given through a probability measure.
The former, complexity-theoretic case applies to individual infinite sequences of digits or numbers, and defines some sort of ''true'' randomness. But this notion is neither associated with probability, nor does it give any hint of how to interpret the phrase for a finite sequence of measurement results, the situation relevant for physics.

In the latter, probability-theoretic case, ''truly random'' is a meaningless phrase, since even a deterministic observable is a random number, just one with zero variance, given by a 1-point measure.

In the applications, one usually uses the second definition, to which the following applies.

Boing3000 said:
I have trouble understanding what the "probability space" of an observable would be.
The probability space can be thought of as the collection of all imaginable experiments or contexts.

Statements are made only about expectation values (without telling how to interpret these in a real-life setting), not about single events. In particular, probabilities are expectation values of binary random variables. There are proxies for the frequency interpretation in the form of laws of large numbers. But as explanations these are circular, since they are themselves statements holding only with some probability.
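In symbols (added for reference): for an event ##E## with indicator function ##\chi_E##,
$$\Pr(E) = \langle \chi_E\rangle,$$
so probability is itself an expectation value.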
Boing3000 said:
[...] compared to classically chaotic ones (like dice).
Concerning chaos, there are rigorous results relating chaos in classical hyperbolic systems to probability distributions. This is the subject of ergodic theory. The probability measure arises from considering the infinite-duration limit of time averages. This applies to certain simple deterministic dynamical systems and shows how mathematicians derive probability from determinism. But dice, I believe, are already far too complicated to be covered by this approach.
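Schematically (a standard formulation, added for reference): for an ergodic system with trajectory ##x(t)##, Birkhoff's theorem gives
$$\lim_{T\to\infty}\frac1T\int_0^T f(x(t))\,dt = \int f\,d\mu$$
for almost every initial condition, so the probability measure ##\mu## is recovered from purely deterministic time averages.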
 
  • #19
I'm trying to connect to what you think is the core problem. I read your thermal interpretation page, and while I can see what you write, I am not sure I follow your visions or problem descriptions.

One detail caught my attention
A. Neumaier said:
problem of the origin of quantum probabilities. The latter requires detailed arguments why microscopic observable quantum effects behave statistically according to the usual laws.
Are you raising the question of how come, or rather "how can we defend", that single systems magically seem to "obey", albeit approximately, timeless exact laws? I.e., how to combine concepts that on one hand require infinite time and infinite repetitions for an actual inference from measurement with the apparent success of applying the statistical laws to a single system while the inference process is "incomplete"? Are we just lucky, or is there a deeper reason for this stability?

/Fredrik
 
  • #20
A. Neumaier said:
problem of the origin of quantum probabilities. The latter requires detailed arguments why microscopic observable quantum effects behave statistically according to the usual laws.

In order to see if I get your general direction, do you in any way relate this to these ideas?

Precedence and freedom in quantum physics
"
We also propose that laws of quantum evolution arise from a principle of precedence according to which the outcome of a measurement on a quantum system is selected randomly from the ensemble of outcomes of previous instances of the same measurement on the same quantum system
...
This can be understood as saying that it is more efficient for nature to store a simple rule to generate the ensemble of outcomes than it is to store the whole ensemble of outcomes itself. This leads to a hypothesis that nature chooses the simplest rule, in the sense of algorithmic information theory, which accounts for the first small number of precedents
...
This suggests that the laws of nature are the result of a minimalization, not of an action, but of the information needed to express a rule that propagates future cases from past cases. So rather than a principle of least action we will formulate dynamics as a principle of least information.
...
One might however, still ask how a system knows what its precedents are? This is like asking how an elementary particle knows which laws of nature apply to it. The postulate that general timeless laws act on systems as they evolve in time requires a certain set of metaphysical presuppositions. The hypothesis given here, that instead systems evolve by copying the responses of precedents in their past, requires a different set of metaphysical presuppositions.
"-- https://arxiv.org/pdf/1205.3707.pdf, Lee Smolin

/Fredrik
 
  • #21
Fra said:
"how can we defend", that single systems magically seems to "obey" albeit approximately, timeless exact laws?
No. Nature need not be defended, only explained. I referred to a deduction of Born's rule from a mathematically fully defined (idealized) measurement setting, in the spirit of the work by Allahverdyan et al., but easier to comprehend, so that it can gain universal acceptance.
Fra said:
it is more efficient for nature to store a simple rule to generate the ensemble of outcomes than it is to store the whole ensemble of outcomes itself.
Nature is the storage and the single realization of everything that is. It does not need to care about efficiency. It is only us poor humans who, collecting some information on our journey along our world lines, need to think about how to efficiently store the little we know about the vast realms of Nature.
 
  • #22
A. Neumaier said:
But this does not follow from Ehrenfest's theorem, which assumes exact dynamics (i.e., an isolated system).
Starting from the Heisenberg equations of motion, it is easy to derive the Ehrenfest theorem for mixed states.
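For reference, a one-line version (standard material, added here as an illustration): in the Heisenberg picture with a fixed state ##\rho##, the Heisenberg equation ##\dot A = \frac{i}{\hbar}[H,A]## gives
$$\frac{d}{dt}\langle A\rangle = \frac{d}{dt}\mathrm{Tr}(\rho A) = \frac{i}{\hbar}\,\mathrm{Tr}(\rho\,[H,A]) = \frac{i}{\hbar}\,\langle[H,A]\rangle,$$
valid for arbitrary pure or mixed ##\rho##.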
 
  • #23
A. Neumaier said:
No. Nature need not be defended, only explained. I referred to a deduction of Born's rule from a mathematically fully defined (idealized) measurement setting, in the spirit of the work by Allahverdyan et al., but easier to comprehend, so that it can gain universal acceptance.
I see, then I was mistaken.

( I didn't mean defend nature, only defend our inference of deductive rules about nature. Nature here is the "black box" we are trying to understand. To explain a position is the same as to defend the logic of inference used to arrive at it. Explanatory value lies entirely in the justification of the inferences made. I was thinking of how to defend "deterministic rules of probability" in inference scenarios where it is obvious that the inferences are truncated. )

A. Neumaier said:
Nature is the storage and the single realization of everything that is.
I agree. But to understand the inner structure of nature, we need to understand how one sub-storage relates and interacts with another.
A. Neumaier said:
It does not need to care about efficiency. It is only us poor humans who, collecting some information on our journey along our world lines, need to think about how to efficiently store the little we know about the vast realms of Nature.
I disagree here. The reason is that in my view nature is a web of relations: parts of nature interacting with other parts, and encoding these relations. Here you run into physical constraints, where you do NEED to worry about the efficiency of how relations are coded, as it reflects their stability.

And we assume nature, whatever it is, is at least stable, except if we bring in cosmological scales and the first seconds of the big bang.

And I apply the inference not just to ME, but to any part of the universe. If you think a human is poor, then consider a truly non-classical observer, such as an atom. I am not distinguished from an atom in any way other than by complexity.

My vision here is to ultimately extend the inference, exemplified by typical QM and a classical measurement device (whether distributed or not), to the general case where the measurement device or observer is not classical anymore, as it is my conviction that this is the key to understanding unification and gravity.

(I pass on discussing the Born rule; I have seen lots of arguments starting from Cox's axioms and similar things, but they are only partially satisfactory, as some of the assumptions going in involve continuum mathematics, and this is precisely one thing that causes complications for an algorithmic perspective. I have chosen another path.)

/Fredrik
 
  • #24
Demystifier said:
Starting from the Heisenberg equations of motion, it is easy to derive the Ehrenfest theorem for mixed states.
Yes, but for the system including the environment, where states quickly delocalize; not for the decohered system without the environment, which you used to defend localization.
 
  • #25
Fra said:
how one sub-storage relates and interacts with another.
This is not a matter of philosophical assumptions about information; it is one of nonequilibrium statistical mechanics, which on the fundamental level tells us how storage mechanisms in nature work.
 
  • #26
A. Neumaier said:
This is not a matter of philosophical assumptions about information; it is one of nonequilibrium statistical mechanics, which on the fundamental level tells us how storage mechanisms in nature work.

I agree it is kind of about "nonequilibrium statistical mechanics", but my point is:

1) How do you justify the probabilistic methods for change when we are not in classical mechanics and do not have the case of actual statistics?

2) In CM we also in principle measure things, but the whole issue gets more complex in QM, as the inference itself becomes an integral part of causality in a way that isn't the case in CM.

Thus in my view the justification for statistical methods becomes (if you take a Bayesian view) observer-dependent in a way that is more hidden in CM.

So the explanatory power of the statistical method needs, imo, another level of justification in QM, at least if you consider anything other than small subsystems that allow infinite repetition to provide actual statistics.

I am not saying one can't justify applying statistical reasoning to single cases in nature; I just don't see your version of its justification.

/Fredrik
 
  • #27
A. Neumaier said:
nonequilibrium statistical
Another point of mine, which I think is a relevant complication (to be resolved) that connects to algorithmic approaches, is that the equilibration processes take place alongside, or are even indistinguishable from, the statistical inferences subsystems make of each other.

The normal resolution is to imagine the system in a bigger context where you can apply statistics. But then this corresponds to a different observer or observational scale, and in CM this is more acceptable. But in QM one would first need to prove that the inside inference and the expectation value of the inside inference as measured by another observer agree, which I suspect is not the case in general. This disagreement could encode and explain a physical process. In CM there typically is no such interaction and no disagreement.

My point is that if we seek to explain A from B, then assuming C, of which B is a reduction, and showing C => A and C => B is not a valid inference, as you have added information.

/Fredrik
 
  • #28
A. Neumaier said:
The theorem asks nothing. It is just a statement in probability theory, not a statement about measurements and their interpretation.
But if we apply it to an EPRB experiment, we need to associate the measurement results with some objects in the theory. In a hidden variable theory, the correlations are computed as ##C_{\alpha\beta} = \int A_\alpha(\lambda) B_\beta(\lambda) \mathrm d\mu(\lambda)##. It seems like you don't agree with this, so I'm asking you for your proposal. Can you clearly define the mathematical setting of your proposed resolution and give an expression for the correlations?
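For comparison (standard QM, added for reference): for the spin-1/2 singlet state with ##\pm1##-valued outcomes along coplanar directions at angles ##\alpha,\beta##, quantum mechanics predicts
$$C_{\alpha\beta} = -\cos(\alpha-\beta),$$
which at suitable angles gives the CHSH combination the value ##2\sqrt2 > 2##, so no correlation of the form ##\int A_\alpha(\lambda) B_\beta(\lambda)\,\mathrm d\mu(\lambda)## can reproduce it.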

A. Neumaier said:
Its application to reality suffers from all the foundational problems that the application of the concept of probability to reality has, unless you specify this relation in a mathematically unambiguous way. But this seems impossible, as it only shifts the border between mathematical concepts and operational concepts. At some point everyone is left without reason (i.e., mathematical arguments) and resorts to pure belief (i.e., personal preferences in philosophy).
I don't see how this helps solve the problem. You seem to be suggesting that QM probabilities arise from some statistics of an underlying deterministic theory. Then you will have to explain in what way your theory is different from the class of theories excluded by Bell. Bell's class has a clear mathematical definition.
 
  • #29
Fra said:
in CM we also in principle measure things, but the whole issue gets more complex in QM, as the inference itself becomes an integral part of causality in a way that isn't the case in CM.
This is only because in interpreting classical mechanics the idealization is made that the observation can be done in principle without error. But a finite subsystem of a classical system cannot get perfect information about a disjoint classical subsystem with which it interacts. Once this is properly modeled, the situation becomes not very different from the quantum situation.
Fra said:
(if you take a Bayesian view) observer-dependent
I don't take a Bayesian view. Physics is concerned with objectively valid statements about Nature, not with subjective views.
 
  • #30
rubi said:
if we apply it to an EPRB experiment, we need to associate the measurement results with some objects in the theory.
Yes. But I question that the measurements are simply measurements of the earlier prepared state, without any reference to the complete context that is hidden in the idealized classical description usually given for everything except the few spin degrees of freedom. A whole lot of assumptions go into this, including the quantum field theory of the transport process. For example, the photons transported in an optical fiber cable are not the photons of free QED generated in the cavity of a laser, but quasiparticles obtained by a complicated renormalization process. Many idealizing assumptions are involved that are ignored in the short statement and proof of the CHSH inequality.

rubi said:
Can you clearly define the mathematical setting of your proposed resolution
I don't yet have a resolution, and didn't claim that I had one. (The pieces I have are on my web page on the thermal interpretation, but they do not yet constitute a full resolution.) I only claim that we already know how to get certain probabilistic observable effects (namely those of hydrodynamics) from deterministic quantum mechanics, by a mechanism involving expectation values only. And I claim that a proof about an idealized situation (as in Bell-type theorems) does not tell us anything conclusive about the real, nonideal situation.
rubi said:
Then you will have to explain in what way your theory is different from the class of theories excluded by Bell. Bell's class has a clear mathematical definition.
The contexts in which Bell-type theorems are interpreted always assume some form of locality.

But nonlocality is explicitly built into the very foundations of quantum mechanics as conventionally presented. For a single massive particle, Born's rule states that ##|\psi(x,t)|^2## is the probability density for locating the particle, at a given time ##t##, at a particular position ##x## anywhere in the universe, and that the Fourier transform ##|\widetilde\psi(p,t)|^2## is the probability density for finding it, at a given time ##t##, with a particular momentum ##p##. Now a basic theorem of harmonic analysis says that one or both of these two functions must have unbounded support. This implies that when a particle has been prepared in an ion trap (and hence is there with certainty), there is a positive probability that an arbitrarily short time afterwards it is detected a light year away. Clearly this is an unphysical simplification. (This also proves that Born's rule cannot be taken as the exact and unquestionable foundation of everything.)
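The harmonic-analysis fact invoked here (added for reference) is of Paley-Wiener type: if ##\psi(\cdot,t)## has compact support, its Fourier transform
$$\widetilde\psi(p,t) = \frac{1}{\sqrt{2\pi\hbar}}\int e^{-ipx/\hbar}\,\psi(x,t)\,dx$$
extends to an entire analytic function of ##p## and therefore cannot vanish on any open set; hence ##\psi## and ##\widetilde\psi## cannot both have bounded support unless ##\psi=0##.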

Thus, in its basic operational interpretation, the wave function already encodes highly nonlocal information. Most people don't see it, blinded by tradition, and hence think that Bell's theorem uncovers something deep about quantum mechanics. It took me a long time to free myself from these limitations.

Since it processes nonlocal information, standard deterministic quantum mechanics defined by the Schrödinger equation doesn't fall under the scope of Bell-type theorems. So there is nothing in my thermal interpretation that would contradict established theorems.
 
  • #31
A. Neumaier said:
I challenge the applicability to reality: real states are not exactly of the form required for them to be interpreted in terms of the CHSH inequality, and real observables cannot be exactly interpreted as random variables of the required form. Only idealizations that strip away most observable degrees of freedom except a few spin variables can be interpreted in these terms.

Can we modify any of the better-known toy examples of Bell's inequality (e.g. the "EPR apparatus" of http://www.theory.caltech.edu/classes/ph125a/istmt.pdf or the consumer survey of http://quantumuniverse.eu/Tom/Bells Inequalities.pdf) to illustrate how the thermal approach would predict (or at least tolerate) a violation of the inequality?

Those toy examples deal with reality at the level of probability. They don't postulate any underlying state vectors as the causes of probability. What kind of ensemble is attributed thermodynamic properties in the thermal approach to QM? Are these ensembles whose individuals are described by state vectors? Or are they ensembles whose individuals have only classical properties?
 
  • #32
A. Neumaier said:
This is only because in interpreting classical mechanics the idealization is made that the observation can be done in principle without error. But a finite subsystem of a classical system cannot get perfect information about a disjoint classical subsystem with which it interacts. Once this is properly modeled, the situation becomes not very different from the quantum situation.

I certainly agree with the bolded statement: the point I made can be made an issue even in classical mechanics.

However, I think that there is, coincidentally, a reason (general relativity) why we kind of "get away" with this in classical mechanics. GR describes to us in an alternative way what might otherwise be explained in a more complex way. This is also one good reason why GR is preferred in CM. But in unification, we might in fact need the more complex theory.

Confession: I (secretly) think that this "issue" in classical mechanics may provide a deeper insight into what gravity is, in a way that makes the marriage with standard model physics more natural. This idea also implies that gravity is an emergent phenomenon at "low" energy, explained by a balance between universal negotiations, which are attractive, and inertia, which resists them. But after all, GR describes how matter and energy define curvature; it does not explain the mechanism in terms of something else.

But I cannot see how to understand unification in terms of inference while ignoring this information-coding limitation. I cannot ignore this detail.
A. Neumaier said:
Physics is concerned with objectively valid statements about Nature, not with subjective views.
Superficially this is of course true.

However, a more careful analysis of the inferences suggests that things are more complicated.

An inference is, by definition, conditional on its choice of inference logic; this is what I mean by subjective. And part of my understanding of QM is that inferences are indistinguishable from physical interactions. Two interacting systems are effectively making measurements on each other, but in general without a classical backdrop. This is the hard part to grasp; it needs to be solved, and it likely implies a reconstruction of quantum theory, yielding current QM as a limiting case of a classically dominant observer (so that the information-coding limit never becomes "relevant").

What we expect of a stable situation is a universe where the observers (= matter content; here I count any piece of matter as an observer) "populating it" are interacting with each other in such a way that the laws of interaction, as inferred by any choice of these observers, are consistent.

Then either this situation is stable, and both the population of observers (matter) and the laws are stable; this is, BTW, what we expect in our universe today.
Or the situation is not stable, and both the population of observers and the laws are evolving (maybe in the early big bang, at TOE energy scales). This is also, in essence, Smolin's idea of evolving law. (Smolin doesn't present the whole picture I argue for here, but on the specifics of evolving law he has the same view as me.)

/Fredrik
 
  • #33
Stephen Tashi said:
Can we modify any of the better-known toy examples of Bell's inequality (e.g. the "EPR apparatus" of http://www.theory.caltech.edu/classes/ph125a/istmt.pdf or the consumer survey of http://quantumuniverse.eu/Tom/Bells Inequalities.pdf) to illustrate how the thermal approach would predict (or at least tolerate) a violation of the inequality?

Those toy examples deal with reality at the level of probability. They don't postulate any underlying state vectors as the causes of probability. What kind of ensemble is attributed thermodynamic properties in the thermal approach to QM? Are these ensembles whose individuals are described by state vectors? Or are they ensembles whose individuals have only classical properties?
The thermal interpretation makes no predictions that deviate anywhere from quantum mechanics; so the standard examples of Bell violations by QM apply.
 
  • #34
Fra said:
may provide a deeper insight into what gravity is
I am primarily interested in interpreting the well-understood part of quantum theory.

Fra said:
An inference is, by definition, conditional on its choice of inference logic; this is what I mean by subjective.
The logic used in quantum field theory, the deepest level currently understood, is still classical logic. The only subjective part is the choice of assumptions. Logic then objectively defines the possible conclusions.
 
  • #35
A. Neumaier said:
The thermal interpretation makes no predictions that deviate anywhere from quantum mechanics; so the standard examples of Bell violations by QM apply.

The toy examples illustrate situations where Bell's inequality does apply. I thought that perhaps a thermal interpretation of those situations would reveal why Bell's inequality need not apply.
 
