Do Bell and PBR together point toward nonlocal reality?

In summary: I think it is a reasonable assumption to make, but I'm not sure that I have all the details worked out yet. The model I've been developing is a kind of "proto-model" of non-local reality, not yet a fully developed model. On the other hand, the "realist" model of reality that I've been developing in that thread seems to be a pretty reasonable one. That model is also not fully developed, but it has received much more development than the non-local model, and I think it is fairly well thought out. So I think we're actually getting quite close to being able to make a rational decision about "real
  • #36
Ilja said:
In standard dBB theory the wave function is real, so what could be the problem if PBR proves that the wave function has to be real?

dBB, as I understand it (and I am probably missing something on this point) says that all observables would be predictable with certainty if you only knew all relevant particle positions (which are in turn unknowable). That would include non-commuting observables.

On the other hand: PBR, as I understand it (and I am probably missing something on this point as well) says that non-commuting observables cannot both have well-defined values at all times.

I know this is an over-simplification. And I realize that PBR requires some assumptions that allow an "escape" for Bohmian-class theories. My point is that the general tenor of these approaches is contradictory, even though there may be an escape. So we will need to see how these issues shake out, if they do at all.
 
  • #37
DrChinese said:
dBB, as I understand it (and I am probably missing something on this point) says that all observables would be predictable with certainty if you only knew all relevant particle positions (which are in turn unknowable). That would include non-commuting observables.
They would be predictable if one knew not only the configuration of the "observed" object, but also the configuration of the thing named "measurement instrument". So it would be better described as the result of an interaction, not of a measurement.

So instead of non-commuting "observables" there are non-commuting interactions.

On the other hand: PBR, as I understand it (and I am probably missing something on this point as well) says that non-commuting observables cannot both have well-defined values at all times.
This has, as far as I understand, nothing to do with PBR. This is about the impossibility of non-contextuality - von Neumann, Kochen-Specker, Gleason, and so on. dBB is a contextual theory: the result of a "measurement" depends on the "measurement" itself and in particular on its configuration.

And I realize that PBR requires some assumptions that allow an "escape" for Bohmian-class theories. My point is that the general tenor of these approaches is contradictory, even though there may be an escape. So we will need to see how these issues shake out, if they do at all.

That dBB is free of contradictions can easily be seen by looking at Bohm's original theory, because the theory is completely defined there. All the equations are there. The equivalence of dBB in quantum equilibrium to standard QT is a triviality, so if you think QT is free of contradiction, there is not much room for believing that dBB is contradictory.
 
  • #38
Ilja said:
I disagree. Linguistic analysis shows that "non-locality" means "non-Einstein-causality"


non-signaling.
 
  • #39
Ilja said:
I disagree. Linguistic analysis shows that "non-locality" means "non-Einstein-causality", and, because Einstein causality is an invention of the XXth century, this does not imply any mysticism. It allows for a very non-mystical solution - a return to the classical, pre-Einsteinian notion of causality.

I've entertained this idea too. It seems as though causality, on a fundamental level, isn't really relativistic, and that the underlying physics is nonrelativistic (things like Bell inequality experiments don't 'obey' Lorentz/Poincare symmetry [but that doesn't necessarily mean they obey Galilean symmetry instead]). There are even some very nice analogous phenomena in condensed matter physics, especially Bose-Einstein condensation and superconductivity, that might hint at that possibility. I have it on the authority of one of my professors, who specializes in AdS/CFT, that M-theory actually does assume that the theory is fundamentally nonrelativistic.

You're right that (Einsteinian/Lorentzian) relativity is a creation of the 20th century and that it's actually quite a radical idea unto itself. Most physicists at the time found Galilean relativity far more believable and sensible, and thought Einstein's theory was crazy. From a modern perspective, though, without Einstein's relativity there are serious problems with the basic notions of simultaneity and hence causality, so it's hard to believe in the old Galilean relativity.

It's clear that causality can't just be assumed to be as simple as what Galileo and Newton had in mind; Bell's theorem can teach us a lot about the subtleties of actual causality, or what I like to call "quantum causality", which does reflect the relativistic principle of a speed limit on the sending of information, despite the presence of superluminal correlations.
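To make that last point concrete, here is a small Python sketch (my own illustration, not from the thread; it assumes the textbook singlet statistics P(x,y|a,b) = (1 - xy cos(a-b))/4 and the standard CHSH-optimal angles). It shows the correlations violating the CHSH bound while Alice's marginal statistics stay independent of Bob's setting, so the correlations carry no signal:

[code=python]
import math

def E(a, b):
    """Singlet correlation E(a, b) = -cos(a - b) for analyzer angles a, b."""
    return -math.cos(a - b)

a0, a1 = 0.0, math.pi / 2           # Alice's two settings
b0, b1 = math.pi / 4, -math.pi / 4  # Bob's two settings (CHSH-optimal)

S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
print(f"|S| = {abs(S):.3f}  (local hidden variables require |S| <= 2)")

def alice_marginal(a, b):
    """P(Alice = +1) for setting a, given Bob measures at setting b."""
    return sum((1 - (+1) * y * math.cos(a - b)) / 4 for y in (+1, -1))

# The marginal is 0.5 no matter what Bob does: no superluminal signalling.
print(alice_marginal(a0, b0), alice_marginal(a0, b1))
[/code]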
 
  • #40
Ilja said:
This has, as far as I understand, nothing to do with PBR. This is about the impossibility of non-contextuality - von Neumann, Kochen-Specker, Gleason, and so on. dBB is a contextual theory: the result of a "measurement" depends on the "measurement" itself and in particular on its configuration.

I am not so sure. From Matt Leifer's excellent summary of PBR:

1. Wavefunctions are epistemic and there is some underlying ontic state. Quantum mechanics is the statistical theory of these ontic states in analogy with Liouville mechanics.
2. Wavefunctions are epistemic, but there is no deeper underlying reality.
3. Wavefunctions are ontic (there may also be additional ontic degrees of freedom, which is an important distinction but not relevant to the present discussion).

I will call options 1 and 2 psi-epistemic and option 3 psi-ontic. Advocates of option 3 are called psi-ontologists, in an intentional pun coined by Chris Granade. Options 1 and 3 share a conviction of scientific realism, which is the idea that there must be some description of what is going on in reality that is independent of our knowledge of it. Option 2 is broadly anti-realist, although there can be some subtleties here[2].

The theorem in the paper attempts to rule out option 1, which would mean that scientific realists should become psi-ontologists. I am pretty sure that no theorem on Earth could rule out option 2, so that is always a refuge for psi-epistemicists, at least if their psi-epistemic conviction is stronger than their realist one.

I would classify the Copenhagen interpretation, as represented by Niels Bohr[3], under option 2.

...

Pretty much all of the well-developed interpretations that take a realist stance fall under option 3, so they are in the psi-ontic camp. This includes the Everett/many-worlds interpretation, de Broglie-Bohm theory, and spontaneous collapse models. Advocates of these approaches are likely to rejoice at the PBR result, as it apparently rules out their only realist competition, and they are unlikely to regard anti-realist approaches as viable.


http://mattleifer.info/2011/11/20/can-the-quantum-state-be-interpreted-statistically/

Now admittedly Matt is agreeing with you that dBB is a member of option 3. My point is that what dBB is saying sounds to me a lot more like 1 than 3: if you COULD know all of the relevant variables and then make a certain prediction, that sounds more like 1 than 3.
 
  • #41
More explicitly, from:
--------
http://physics.stackexchange.com/qu...e-interpreted-statistically-again/36390#36390

That goes with the epistemic, ontic, or complete interpretations of the quantum state.

By the way, the options are:

- only one pure quantum state correspondent/consistent with various ontic states;
- various pure quantum states correspondent/consistent with only one ontic state;
- only one pure quantum state correspondent/consistent with only one ontic state.

Maximally epistemic interpretations of the quantum state and contextuality
http://arxiv.org/pdf/1208.5132.pdf
...If one could prove, without auxiliary assumptions, that the support of every distribution in an ontological model must contain a set of states that are not shared by the distribution corresponding to any other quantum state, then these results would follow. Whether this can be proved is an important open question...

Einstein, incompleteness, and the epistemic view of quantum states
http://arxiv.org/pdf/0706.2661.pdf
...ψ-ontic if every complete physical state or ontic state in the theory is consistent with only one pure quantum state; we call it ψ-epistemic if there exist ontic states that are consistent with more than one pure quantum state...

...The simplest possibility is a one-to-one relation. A schematic of such a model is presented in part (a) of Fig. 1, where we have represented the set of all quantum states by a one-dimensional ontic state space Λ labeled by ψ. We refer to such models as ψ-complete because a pure quantum state provides a complete description of reality...

a conference about New Perspectives on the Quantum State*
-----

*What de Broglie--Bohm Mechanics tells us about the Nature of the Quantum State
http://streamer.perimeterinstitute.ca/Flash/5a14272c-198f-4135-8a7d-bdc357b02f91/viewer.html
 
  • #42
@audioloop: non-signalling is different, and weaker. Violations of Einstein causality may be hidden, associated with a hidden preferred frame. One choice of preferred frame allows an explanation with a superluminal influence A->B, another choice with a superluminal influence B->A. There is no explanation without superluminal influence. But you cannot use this for superluminal signalling: signalling A->B would be incompatible with the explanation B->A, and signalling B->A incompatible with the explanation A->B. So there is no superluminal signalling, but nonetheless no Einstein causality.
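To see the frame-dependence concretely, here is a tiny Python sketch (my own, with illustrative event coordinates and units where c = 1): for two spacelike separated events, a boost can put either event first, so a choice of preferred frame fixes the direction of the superluminal influence, while no frame can make the pair local:

[code=python]
import math

def boosted_dt(dt, dx, v):
    """Time separation of two events in a frame boosted with velocity v (c = 1)."""
    gamma = 1.0 / math.sqrt(1.0 - v * v)
    return gamma * (dt - v * dx)

dt, dx = 1.0, 2.0   # spacelike separation: |dx| > |dt|
for v in (-0.9, 0.0, 0.9):
    print(f"v = {v:+.1f}:  dt' = {boosted_dt(dt, dx, v):+.3f}")
# v = -0.9 puts A first, v = +0.9 puts B first: A->B and B->A are both
# available as explanations, depending on the choice of preferred frame.
[/code]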

@Jolb: What is proposed is not a return to Galilean symmetry. Violations of Bell's inequality are in themselves in agreement with relativistic symmetry. The point is that any realistic explanation has to violate relativistic symmetry. So realistic theories need a preferred frame. Classically, this frame is hidden; from the point of view of purely observational quantum effects it remains hidden too, but a realistic explanation needs it.

Regarding observation, introducing a preferred frame is unproblematic. In special relativity, it is simply the Lorentz-Poincare interpretation. In general relativity, one has to modify GR to introduce a preferred frame, but this is not that difficult. See arXiv:gr-qc/0205035 and arXiv:1003.144 for my proposal for how to introduce a preferred frame into GR, and arXiv:0909.1408 for a completely independent quantum gravity argument in favour of introducing a background into GR.

@DrChinese: Sorry, but I cannot do more than you have already done. dBB is well-known, its ontology too: it consists of the wave function psi(q,t) and the position q(t), so it fits into 3, since the wave function is ontic together with something else. These are fixed and well-defined things, nothing of the type "sounds more like". Such "sounds more like" I prefer to leave to Copenhagen, many worlds, and other interpretations already criticized by Bell as "unprofessionally vague".
 
  • #43
I have often read that there is an analogy between dBB and classical statistical mechanics (I think Demystifier wrote about this several times). Is this analogy invalid now due to the PBR theorem or was it never in the sense of option 1 from Matt Leifer? If so, what is the analogy?
 
  • #44
kith said:
I have often read that there is an analogy between dBB and classical statistical mechanics (I think Demystifier wrote about this several times). Is this analogy invalid now due to the PBR theorem or was it never in the sense of option 1 from Matt Leifer? If so, what is the analogy?

I am really scratching my head over why PBR has anything to do with dBB. It's the realist interpretation par excellence. PBR is irrelevant, since it's only concerned with interpretations holding that the state is not real but some aspects of it are - Matt's option 1 - and it shows that option 1 is really option 3 in disguise. dBB is well and truly option 3 right from the get-go.

I have said it before and will say it again: PBR is interesting, but since nearly all interpretations are of type 2 or 3, it's not really that much of a game changer. Yet we have people bringing it up all over the place. I never thought Matt's type 1 interpretations were that popular - and that's all PBR affects.

Thanks
Bill
 
  • #45
Ilja said:
Sorry, but I cannot do more than you have already done. dBB is well-known, its ontology too: it consists of the wave function psi(q,t) and the position q(t), so it fits into 3, since the wave function is ontic together with something else. These are fixed and well-defined things, nothing of the type "sounds more like". Such "sounds more like" I prefer to leave to Copenhagen, many worlds, and other interpretations already criticized by Bell as "unprofessionally vague".
I thought I should mention that even within the pilot wave camp, there are at least 2 different varieties:

1. ψ is seen as some type of "field/object" : Valentini, deBroglie, Bohm
2. ψ is viewed as nomological (a law) : Durr/Goldstein/Zanghi (DGZ), Maudlin

It seems to me that option 3 given by Leifer (i.e. wavefunctions are ontic) is more in line with ψ as some type of field than with ψ as nomological. So, does this mean that PBR may rule out even certain Bohmian sub-interpretations? It seems Valentini was very enthusiastic about PBR, but I haven't seen much written from the Durr/Goldstein/Zanghi/Maudlin group. But then again, I'm not sure. Since PBR rules out ψ-epistemic theories within the realist camp, does it also rule out Bohmian versions where ψ is treated as a law (nomological)? I'm still a bit confused by this.

The ψ as nomological view has been criticized by some authors including some posters on this forum supportive of the pilot-wave model. Maaneli writes:
There is a very serious and obvious problem with their interpretation; in claiming that the wavefunction is nomological (a law-like entity like the Hamiltonian as you said), and because they want to claim deBB is a fundamentally complete formulation of QM, they also claim that there are no underlying physical fields/variables/mediums in 3-space that the wavefunction is only a mathematical approximation to (unlike in classical mechanics where that is the case with the Hamiltonian or even statistical mechanics where that is the case with the transition probability solution to the N-particle diffusion equation). For these reasons, they either refuse to answer the question of what physical field/variable/entity is causing the physically real particles in the world to move with a velocity field so accurately prescribed by this strictly mathematical wavefunction, or, when pressed on this issue (I have discussed this issue before with DGZ), they simply deny that this question is meaningful. The only possibility on their view then is that the particles, being the only physically real things in the world (along with their mass and charge properties of course), just somehow spontaneously move on their own in such a way that this law-like wavefunction perfectly prescribes via the guiding equation. This is totally unconvincing, in addition to being quite a bizarre view of physics, in my opinion, and is counter to all the evidence that the equations and dynamics from deBB theory are suggesting, namely that the wavefunction is either a physically real field on its own or is a mathematical approximation to an underlying and physically real sort of field/variable/medium, such as in a stochastic mechanical type of theory.
https://www.physicsforums.com/showthread.php?t=247367&page=2

Belousek makes the same point in his exceptionally well-written paper on the different varieties of "Bohmian" mechanics. He writes:
On the DGZ view, then, the guidance equation allows for only the prediction of particle trajectories. And while correct numerical prediction via mathematical deduction is constitutive of a good physical explanation, it is not by itself exhaustive thereof, for equations are themselves 'causes' (in some sense) of only their mathematical-logical consequences and not of the phenomena they predict. So we are left with just particles and their trajectories as the basis within the DGZ view of Bohmian mechanics. But, again, are particle trajectories by themselves sufficient to explain quantum phenomena? Or, rather are particle trajectories, considered from the point of view of Bohmian mechanics itself, as much a part of the quantum phenomena that needs to be explained?...the mere existence of those trajectories is by itself insufficient for explanation. For example, to simply specify correctly the motion of a body with a certain mass and distance from the sun in terms of elliptical space-time orbit is not to explain the Earth's revolving around the sun but rather to redescribe that state of affairs in a mathematically precise way. What remains to be explained is how it is that the Earth revolves around the sun in that way, and within classical mechanics, Newton's law of universal gravitation and second law provide that explanation.
Formalism, Ontology and Methodology in Bohmian Mechanics
http://www.ingentaconnect.com/content/klu/foda/2003/00000008/00000002/05119217
 
  • #46
bohm2 said:
does this mean that PBR may rule out even certain Bohmian sub-interpretations? ... Since PBR rules out ψ-epistemic theories within the realist camp, does it also rule out Bohmian versions where ψ is treated as a law (nomological)?
No, it does not, because the PBR theorem requires the assumption that different epistemic states can refer to the same ontic state, which is not true of Bohmian mechanics.

See the work of Maxim Raykin for a new equation of motion for Bohmian trajectories that does not utilize a pilot wave at all.
 
  • #47
bohm2 said:
So, does this mean that PBR may rule out even certain Bohmian sub-interpretations?

Everything I have read about dBB says the pilot wave is real and the wave function is real. Positions and momenta are real and exist at all times. Probabilities enter into it due to lack of knowledge about initial conditions. If that's not option 3, I don't know what is.

However, I guess the final word on it would have to come from an expert in it - and I know, without naming names, that a number of people who have contributed to this thread are - so I guess it's over to them.

Thanks
Bill
 
  • #48
kith said:
I have often read that there is an analogy between dBB and classical statistical mechanics (I think Demystifier wrote about this several times). Is this analogy invalid now due to the PBR theorem or was it never in the sense of option 1 from Matt Leifer? If so, what is the analogy?
dBB is still analogous to classical statistical mechanics. But the point is that it is analogous to the statistical mechanics of particles in some external potential. In classical mechanics, not only the particles are real, but the potential is real as well. The role of the potential is somewhat different from that of the particles, which is why you can call it nomological rather than ontological. But if you define the notion of reality in the PBR sense, then the potential is real, and not merely epistemic.
 
  • #49
DrChinese said:
dBB, as I understand it (and I am probably missing something on this point) says that all observables would be predictable with certainty if you only knew all relevant particle positions (which are in turn unknowable). That would include non-commuting observables.
It is a matter of textbook QM that all position operators (at a given time) commute. Therefore, knowing all particle positions (at a given time) does NOT include non-commuting observables.
 
  • #50
bohm2 said:
1. ψ is seen as some type of "field/object" : Valentini, deBroglie, Bohm
2. ψ is viewed as nomological (a law) : Durr/Goldstein/Zanghi (DGZ), Maudlin

It seems to me that option 3 given by Leifer (i.e. wavefunctions are ontic) is more in line with ψ as some type of field than with ψ as nomological. So, does this mean that PBR may rule out even certain Bohmian sub-interpretations?

This is something worth discussing. I would add a third direction, my own, in which the wave function is purely Bayesian; see arXiv:1103.3506. But the answer is no in all cases.

The point is that one has to distinguish here between the wave function of the universe and the effective wave function of a small system. The dBB formula which defines the effective wave function is [itex]\psi(q)=\Psi(q,q_{env})[/itex], where [itex]q_{env}[/itex] is the configuration of the remaining universe except the system under consideration. Thus, whether [itex]\Psi[/itex] is nomological or not does not matter that much; it may as well be purely epistemic (Bayesian). In any case, the effective wave function [itex]\psi[/itex] of the system depends on [itex]q_{env}[/itex], which is ontic in all three subinterpretations.

Moreover, remember how we determine the actual state of a wave function. We measure; that means we prepare a system together with a measurement device and consider their interaction. The very construction of the situation has, of course, also some aspect of reality, but one can just as well consider it nomological (as defining the Hamiltonian of the initial measurement procedure) or Bayesian (our knowledge about it). Either way, it is the result of the measurement, visible in the measurement device and thus part of [itex]q_{env}[/itex], that defines the effective wave function.

So what we obtain after the preparation procedure is an effective wave function which is essentially ontic, even if the wave function of the universe is nomological or Bayesian.
 
  • #51
Ilja said:
Time-symmetric interpretations, with causal influences into the past, are interpretations for those who like science fiction and mysticism. There is not a single bit of empirical evidence in favour of causal influences from the future into the past.

We have, of course, very strong evidence against Einstein causality: it is not possible to give any realistic interpretation of violations of Bell's inequality that is compatible with Einstein causality. So it has to be given up. But that means we have to go back to classical causality, and there is no reason to go in the direction of the sci-fi mysticism of causal influences into the past.

I think there is disagreement about what is suggested by the empirical evidence. There is no evidence in favor of there being a direction of time in the laws of physics. There is no evidence of any breakdown in (local) Lorentz invariance of physics. So there is no empirical evidence in favor of the program you suggest, which is to give up Einstein causality in favor of a time-asymmetric, non-Lorentz-invariant theory.

Having said that, I don't think anyone needs empirical justification for exploring an idea. In the early stages of developing a theory, it's basically like brainstorming, nothing should be considered too far-out. May a thousand flowers bloom--or rather, may a thousand flowers be planted in the hopes that maybe one will bloom.

It definitely isn't scientific to criticize an approach based on the fact that it sounds silly. That's very subjective.
 
  • #52
Demystifier said:
It is a matter of textbook QM that all position operators (at a given time) commute. Therefore, knowing all particle positions (at a given time) does NOT include non-commuting observables.

But if you know the position of a particle at all times, then you know the velocity at all times (well, if the position is a differentiable function of time). Yet position and velocity are non-commuting.
 
  • #53
DrChinese said:
dBB, as I understand it (and I am probably missing something on this point) says that all observables would be predictable with certainty if you only knew all relevant particle positions (which are in turn unknowable). That would include non-commuting observables.

On the other hand: PBR, as I understand it (and I am probably missing something on this point as well) says that non-commuting observables cannot both have well-defined values at all times.

I know this is an over-simplification. And I realize that PBR requires some assumptions that allow an "escape" for Bohmian-class theories. My point is that the general tenor of these approaches is contradictory, even though there may be an escape. So we will need to see how these issues shake out, if they do at all.

Could you post a concise statement of PBR, or a link to such a statement? I remember reading the paper and yawning, because it didn't seem like it said anything that I didn't already know (or suspect).

[edit]Never mind, I found a good discussion here:
http://mattleifer.info/2011/11/20/can-the-quantum-state-be-interpreted-statistically/
 
  • #54
Ilja said:
That dBB is free of contradictions can easily be seen by looking at Bohm's original theory, because the theory is completely defined there. All the equations are there. The equivalence of dBB in quantum equilibrium to standard QT is a triviality, so if you think QT is free of contradiction, there is not much room for believing that dBB is contradictory.

I don't agree that it is a triviality that dBB is equivalent to standard quantum theory. Maybe the phrase "quantum equilibrium" works to address my worries, but it seems that standard quantum mechanics applies to tiny systems such as single molecules, where the notion of "equilibrium" is ill-defined.

Take an example of a single particle in some kind of potential well. The standard quantum approach is that the wave function gives a probability distribution on positions of the particle, and if you perform a measurement to find out the particle's location, then the wave function collapses to something sharply peaked at the observed location. A second measurement will have probabilities given by the collapsed wave function, not the original.

But in the Bohm model, the particle always has a definite position. So what is the relationship between the wave function and the particle's position? When you detect the particle at some location, does the wave function collapse to a sharply peaked one? If so, what is the mechanism for this? Presumably, this means that there is an interaction between the detector and the wave function, but such an interaction goes beyond ordinary quantum mechanics, it seems to me. I don't see that they are equivalent.

The usual argument for the equivalence of Bohm's model and standard quantum mechanics is for an ensemble of many particles with the same wave function. In such a scenario, the effect of detecting a single particle is negligible, and so there is not a big error introduced by using the same wave function as before.
 
  • #55
stevendaryl said:
I think there is disagreement about what is suggested by the empirical evidence. There is no evidence in favor of there being a direction of time in the laws of physics. There is no evidence of any breakdown in (local) Lorentz invariance of physics. So there is no empirical evidence in favor of the program you suggest, which is to give up Einstein causality in favor of a time-asymmetric, non-Lorentz-invariant theory.
I would personally prefer to change a little bit in the direction of the past, to when I was younger and healthier. Unfortunately, I cannot do this, and all the empirical evidence I have suggests that this is simply impossible. So empirical evidence strongly suggests that there is no time symmetry.

Thus, I conclude that there is something wrong with the time symmetry of our fundamental theories. By the way, the collapse in the Copenhagen interpretation, as well as the development toward quantum equilibrium in dBB theory, is not time-symmetric, so the fundamental theory is less time-symmetric than usually presented.

Ok, I agree, introducing hidden objects which break a symmetry, in a situation where we have not yet observed any violation of this symmetry, is not nice. That means one needs serious reasons. But there are very serious reasons - all one has to do to see this is to look at the alternatives.

The alternative is giving up realism. That's more than a nice word. It means, if taken seriously, giving up science. Ok, nobody takes it seriously, so we will continue to apply realism as usual in all the domains where science has already been successful. But, sorry, if it were a good idea, we should apply it everywhere - that means rejecting realism everywhere. If that is not a good idea, then giving it up in fundamental physics only is, maybe, not a good idea either.

More fundamental theories often have different symmetry groups. Thus, to think that the symmetry group of the current theory survives is an idea certainly worth trying, but no more; it is clearly not a necessity, or something having a fundamental connection with the scientific method itself. So giving up a particular symmetry group - especially in a situation where the two most fundamental theories we have have different symmetry groups - is not problematic.

But giving up realism is something completely different.
 
  • #56
Ilja said:
I would personally prefer to change a little bit in the direction of the past, to when I was younger and healthier. Unfortunately, I cannot do this, and all the empirical evidence I have suggests that this is simply impossible. So empirical evidence strongly suggests that there is no time symmetry.

Logically, the argument "The lack of evidence for X implies evidence against Y" requires you to establish that if Y were true, then X would follow. In the case X = changing the past, Y = time symmetric physics, it doesn't work. If you actually look at the mechanism for causality, you will see that it is ultimately about boundary conditions, not about the directionality of the laws of physics. Causality propagates in the direction of increasing entropy.

Now, there is a central mystery about cosmology, which is: Why was the entropy of the early universe so low? It's possible that new physics will be required to explain this, and that that new physics might be time-asymmetric. But for non-cosmological physics, involving small regions of spacetime, there is no need for time-asymmetric laws of physics in order to understand the asymmetry in causality.
 
  • #57
Ilja said:
Ok, I agree, introducing hidden objects which break a symmetry, in a situation where we have not yet observed any violation of this symmetry, is not nice. That means one needs serious reasons. But there are very serious reasons - all one has to do to see this is to look at the alternatives.

The alternative is giving up realism.

I think that it's important to distinguish between anti-realism in the form of solipsism--"Nothing exists other than my perceptions, and theories of physics are just ways of describing regularities in those perceptions"--and in the form of the conclusion that reality is very different from how it appears. A Many-Worlds type model is very different from the reality that we perceive, but it's not throwing away the idea of reality.

There is a philosophical issue here, which is to what extent a theory of physics should be as close as possible to what we directly observe. It certainly is logically possible that there could be a huge gap between the two.
 
  • #58
Demystifier said:
dBB is still analogous to classical statistical mechanics. But the point is that it is analogous to the statistical mechanics of particles in some external potential. In classical mechanics, not only the particles are real, but the potential is real as well. The role of the potential is somewhat different from that of the particles, which is why you can call it nomological rather than ontological. But if you define the notion of reality in the PBR sense, then the potential is real, and not merely epistemic.
Thanks, that's a nice point of view. I still don't understand something: both the quantum potential and the probabilities are derived from the wave function. The wave function and the potential are regarded as ontic in the PBR sense. So where does the epistemicity - which is reflected in the probabilities - come from? Or, speaking in terms of classical mechanics: we seem to have an equation for a state of knowledge ρ which describes the motion of some particles in an ontic potential V(ρ). I find this hard to reconcile.
 
  • #59
stevendaryl said:
If you actually look at the mechanism for causality, you will see that it is ultimately about boundary conditions, not about the directionality of the laws of physics. Causality propagates in the direction of increasing entropy.
Unless we can derive the second law from the time-symmetric laws, we have to add it as a fundamental (directional) law in its own right. I still haven't wrapped my head around what such a derivation could look like (Demystifier has written an interesting paper with different local arrows of time, but I haven't had the time to read it in detail).
 
  • #60
kith said:
Unless we can derive the second law from the time-symmetric laws, we have to add it as a fundamental (directional) law in its own right. I still haven't wrapped my head around what such a derivation could look like (Demystifier has written an interesting paper with different local arrows of time, but I haven't had the time to read it in detail).

No, you absolutely do not have to add the second law as a fundamental law. There is no need for it, since the time-symmetric laws of physics are overwhelmingly likely to evolve a low-entropy state into a higher-entropy state. The thing that you may have to add by hand, as an unexplained additional assumption, is that the universe started out in an extremely low-entropy state.
 
  • #61
stevendaryl said:
Take an example of a single particle in some kind of potential well. The standard quantum approach is that the wave function gives a probability distribution on positions of the particle, and if you perform a measurement to find out the particle's location, then the wave function collapses to something sharply peaked at the observed location. A second measurement will have probabilities given by the collapsed wave function, not the original.

But in the Bohm model, the particle always has a definite position. So what is the relationship between the wave function and the particle's position? When you detect the particle at some location, does the wave function collapse to a sharply peaked one? If so, what is the mechanism for this? Presumably, this means that there is an interaction between the detector and the wave function, but such an interaction goes beyond ordinary quantum mechanics, it seems to me. I don't see that they are equivalent.

The relationship between wave function and configuration is that the configuration q(t) follows the guiding equation defined by the wave function.

There is an interaction between the detector and the particle, and this interaction has to be described by dBB theory for the whole system. This may be impossible in practice but is unproblematic conceptually. There is a wave function of the combined system [itex]\Psi(q_{sys},q_{env},t)[/itex], which follows a Schrödinger equation, and configurations [itex]q_{sys}(t), q_{env}(t)[/itex], which follow the guiding equation. The point is that there is also an effective wave function of the system - which evolves equivalently whenever there is no interaction between the system and the environment - defined simply by [itex]\psi(q_{sys},t)=\Psi(q_{sys},q_{env}(t),t)[/itex]. But during the interaction, the effective wave function does not follow the Schrödinger equation for the system alone. Instead, its evolution describes the collapse of the wave function. The final result of this process depends on the configuration of the measurement device [itex]q_{env}(t_{fin})[/itex], or, in other words, on the measurement result which we see.

In some sense, this goes beyond QM, indeed: QM does not describe the measurement process. But everything QM tells us is recovered. The wave function collapses, and the resulting effective wave function of the system is uniquely defined by the result of the measurement. The resulting probabilities can be computed correctly, using the same QM formulas, if one assumes that the initial state of the whole system is [itex]\Psi(q_{sys},q_{env})=\psi(q_{sys})\psi_{env}(q_{env})[/itex] and that everything is in quantum equilibrium.
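This effective collapse can be illustrated numerically. The following Python sketch (my own toy example; the two-branch post-measurement state and the Gaussian packet parameters are illustrative assumptions, not anything from the posts above) conditions a two-branch entangled state on the actual pointer configuration and recovers a single branch:

[code=python]
import numpy as np

def gaussian(x, mu, sigma):
    """Normalized 1D Gaussian wave packet (real, zero momentum)."""
    return np.exp(-(x - mu)**2 / (4 * sigma**2)) / (2 * np.pi * sigma**2)**0.25

q_sys = np.linspace(-10, 10, 401)
q_env = np.linspace(-10, 10, 401)
QS, QE = np.meshgrid(q_sys, q_env, indexing="ij")

# Post-measurement state: two system branches correlated with two
# well-separated (non-overlapping) pointer packets of the device.
Psi = (gaussian(QS, +2.0, 0.5) * gaussian(QE, +5.0, 0.3)
       + gaussian(QS, -2.0, 0.5) * gaussian(QE, -5.0, 0.3)) / np.sqrt(2)

# Suppose the actual Bohmian pointer configuration ended up near q_env = +5.
j = np.argmin(np.abs(q_env - 5.1))
psi_eff = Psi[:, j]                       # conditional wave function psi(q_sys)
psi_eff /= np.sqrt(np.trapz(np.abs(psi_eff)**2, q_sys))

# The empty "-" branch no longer contributes: psi_eff is just the "+" branch.
print(np.max(np.abs(psi_eff - gaussian(q_sys, +2.0, 0.5))))  # approximately 0
[/code]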
 
  • #62
Ilja said:
There is an interaction between the detector and the particle, and this interaction has to be described by dBB theory for the whole system. This may be impossible in practice but is unproblematic conceptually. There is a wave function of the combined system [itex]\Psi(q_{sys},q_{env},t)[/itex], which follows a Schrödinger equation, and configurations [itex]q_{sys}(t), q_{env}(t)[/itex], which follow the guiding equation.

I understand that, but without at least an argument showing that the interaction of the detector with the wave function will cause an apparent collapse of the (effective single-particle) wave function to a sharply peaked delta-function, I don't think that you can say with certainty that the Bohm approach is empirically equivalent to the usual approach.
 
  • #63
kith said:
The wave function and the potential are regarded as ontic in the PBR sense. So where does the epistemicity - which is reflected in the probabilities - come from? Or, speaking in terms of classical mechanics: we seem to have an equation for a state of knowledge ρ which describes the motion of some particles in an ontic potential V(ρ).

The state of knowledge is introduced by the notion of quantum equilibrium.

You have a bottle - ontic. You put some water into the bottle. The water can now move in the bottle in a quite arbitrary way. But there is a somehow preferred state where the water is in "equilibrium", at rest. This equilibrium is clearly defined by the form of the bottle.

In a similar way, an epistemic probability distribution ρ(q) can be arbitrary, and the dBB equations tell us how it changes in time. But there is a special probability distribution - the quantum equilibrium [itex]\rho(q)=|\psi(q)|^2[/itex] - which is preferred: once the system is initially in this equilibrium, it remains there.
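This preservation property (often called equivariance in the dBB literature) can be checked numerically in the one case where the Bohmian trajectories are elementary: the free Gaussian packet, for which the guiding equation integrates to a pure scaling. A minimal Python sketch, assuming hbar = m = 1 and illustrative parameters:

[code=python]
import numpy as np

rng = np.random.default_rng(0)
sigma, t = 1.0, 3.0        # initial packet width and evolution time

# Quantum equilibrium at t = 0: positions distributed as |psi(0)|^2,
# a Gaussian of standard deviation sigma.
q0 = rng.normal(0.0, sigma, size=200_000)

# Free Gaussian packet: width s(t) = sigma * sqrt(1 + (t / (2 sigma^2))^2),
# and the Bohmian trajectories are q(t) = q(0) * s(t) / s(0).
s_t = sigma * np.sqrt(1 + (t / (2 * sigma**2))**2)
q_t = q0 * (s_t / sigma)

# |psi(t)|^2 is a Gaussian of standard deviation s_t; the transported
# ensemble matches it, i.e. equilibrium is preserved.
print(f"predicted std: {s_t:.4f}, ensemble std: {q_t.std():.4f}")
[/code]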
 
  • #64
stevendaryl said:
No, you absolutely do not have to add the second law as a fundamental law. There is no need for it, since the time-symmetric laws of physics are overwhelmingly likely to evolve a low-entropy state into a higher-entropy state.
Neither the Liouville nor the von Neumann equation evolves a low-entropy state to a higher-entropy state. So what time-symmetric laws are you referring to?
 
  • #65
stevendaryl said:
I understand that, but without at least an argument showing that the interaction of the detector with the wave function will cause an apparent collapse of the (effective single-particle) wave function to a sharply peaked delta-function, I don't think that you can say with certainty that the Bohm approach is empirically equivalent to the usual approach.

Of course you will not obtain δ-functions if you consider a realistic measurement process with finite energy. But any realistic description of measurements in QM has the same problem.

What is usually done in QM is to consider measurements as interactions such that [itex]\psi_s\psi_e \to \sum_i \psi_s^i\psi_e^i[/itex]. If the [itex]\psi_e^i(q_e)[/itex] are interpreted as macroscopic states of the measurement device after the measurement, then this translates into the condition that the [itex]\psi_e^i(q_e)[/itex] don't overlap, so that if we know [itex]q_e[/itex] we can uniquely identify the corresponding value i, the measurement result, because [itex]\psi_e^j(q_e)\approx 0[/itex] for all other j. But then [itex]\sum_j \psi_s^j\psi_e^j \approx \psi_s^i\psi_e^i[/itex] at the actual [itex]q_e[/itex].
 
  • #66
Thanks, Ilja. I think I get the main idea of the analogy now.

Ilja said:
But there is a special probability distribution - the quantum equilibrium [itex]\rho(q)=|\psi(q)|^2[/itex] - which is preferred: once the system is initially in this equilibrium, it remains there.
It seems a bit strange to me that the equilibrium probability distribution follows changes in the potential instantaneously.
 
  • #67
kith said:
Unless we can derive the second law from the time-symmetric laws, we have to add it as a fundamental (directional) law in its own right. I still haven't wrapped my head around what such a derivation could look like (Demystifier has written an interesting paper with different local arrows of time, but I haven't had the time to read it in detail).

I might point out that entropy can increase from "now" in both time directions. Obviously most lab situations are special cases in which entropy is made unusually lower than the surroundings. If you sampled the entropy of a typical environment in thermal equilibrium, wouldn't you expect it to be at a local minimum? I.e., the number of states it could have evolved from and the number of states it can evolve towards are both greater than now? That would be the statistical view, I believe. In a film of that, I do not believe you could discern its direction as forward or backward in any way (in contrast to the usual idea of a film of a glass breaking being an example of the time direction being obvious).

Or alternately, think of decoherence: entanglement disperses and decreases as you go forward in time. Does that require a fundamental time asymmetric law to describe as well?
 
  • #68
kith said:
Neither the Liouville nor the von Neumann equation evolves a low-entropy state to a higher-entropy state. So what time-symmetric laws are you referring to?

You have to be a little careful about what you mean by "entropy" when talking about the second law. In the case of both classical phase space and quantum wave functions, there is a notion of "entropy" that is unchanged by the evolution equations. But that is not the kind of entropy that we observe to always increase. When a quantity of gas expands rapidly to fill a vacuum, that's an irreversible process, even though the volume in phase space (which is what is preserved under the Liouville equation) remains constant. We don't observe phase-space volumes; we observe that gas expands to fill a vacuum, and it never happens that all the gas in a container spontaneously gathers into a small volume, leaving vacuum behind.

The kind of entropy that we observe to increase is coarse-grained entropy. Roughly speaking, the coarse-grained entropy of a state is the log of the number of microscopic states that "look the same" under the coarse-graining. Time-symmetric laws don't imply that this notion of entropy is constant.
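A toy Python sketch of that distinction (my own; the particle number, box size, and cell count are arbitrary choices): free particles bouncing in a box evolve perfectly reversibly, yet the coarse-grained entropy computed from cell occupation numbers rises from log(10) toward its equilibrium value log(20) when the gas starts in the left half:

[code=python]
import numpy as np

rng = np.random.default_rng(1)
N, L, n_cells = 100_000, 1.0, 20

x = rng.uniform(0.0, L / 2, N)   # low-entropy start: left half of the box
v = rng.normal(0.0, 1.0, N)      # Maxwellian velocities

def coarse_entropy(positions):
    """Shannon entropy of the cell occupation frequencies."""
    counts, _ = np.histogram(positions, bins=n_cells, range=(0.0, L))
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log(p)).sum()

for t in (0.0, 0.05, 0.2, 1.0):
    xt = np.remainder(x + v * t, 2 * L)   # unfold the billiard motion
    xt = np.where(xt > L, 2 * L - xt, xt) # reflecting walls at 0 and L
    print(f"t = {t:4.2f}   coarse-grained S = {coarse_entropy(xt):.3f}")
[/code]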
 
  • #69
stevendaryl said:
I think that it's important to distinguish between anti-realism in the form of solipsism--"Nothing exists other than my perceptions, and theories of physics are just ways of describing regularities in those perceptions"--and in the form of the conclusion that reality is very different from how it appears. A Many-Worlds type model is very different from the reality that we perceive, but it's not throwing away the idea of reality.
I know that one can play around a lot with different notions of realism and confuse people, especially by introducing interpretations which are not well-defined, like many worlds (it presupposes a fundamental decomposition of the universe into systems without any basis, and I have not seen a satisfactory derivation of the Born rule yet).

And I have no problem if someone tries the hard job of developing a weaker notion of realism which is nonetheless powerful enough to be comparable with common-sense realism but compatible with Einstein causality and the violation of Bell's inequality. I doubt this is possible, but who knows.

My point is that there is a simple and straightforward alternative - the quite trivial assumption that the symmetry group of subquantum theory is different from that of quantum theory, or that of quantum gravity different from that of classical gravity. There is no need to change a single bit in the fundamental notions of realism and causality. This is the simple, easy way out of the violation of Bell's inequality, and of a lot of other problems too.

But it is essentially forbidden. String theory publishes thousands of articles without a single empirical prediction. Based on the alternative approach, you can be happy if you succeed in publishing a single paper, even if you succeed in deriving the whole particle content of the standard model from simple principles applied to a quite simple model (arXiv:0908.0591), and you can be sure that nobody even looks at it, since this horrible approach requires a preferred frame.

There is a philosophical issue here, which is to what extent a theory of physics should be as close as possible to what we directly observe. It certainly is logically possible that there could be a huge gap between the two.
Almost everything is logically possible, so that's not the point. Of course, a theory closer to what we directly observe is preferable; the question is whether the competitor has other advantages.

Logically, the argument "The lack of evidence for X implies evidence against Y" requires you to establish that if Y were true, then X would follow. In the case X = changing the past, Y = time symmetric physics, it doesn't work. If you actually look at the mechanism for causality, you will see that it is ultimately about boundary conditions, not about the directionality of the laws of physics. Causality propagates in the direction of increasing entropy.
My point was not a logical proof, but that there is strong empirical evidence that there is no time symmetry in nature. That there is such an animal as a "mechanism for causality" is new to me; as far as I know, causality is fundamental, assumed as given from the start. But, I guess, we are thinking about different things named "causality".
 
  • #70
DrChinese said:
If you sampled the entropy of a typical environment in thermal equilibrium, wouldn't you expect it to be at a local minimum?
I'm not sure I understand your post correctly. By "environment" do you mean something like a cold vacuum in an experimental chamber which encases the system of interest?

DrChinese said:
In a film of that, I do not believe you could discern its direction as forward or backward in any way.
So this film would show the evacuation of the chamber before the experiment and the flooding with air afterwards?

DrChinese said:
Or alternately, think of decoherence: entanglement disperses and decreases as you go forward in time. Does that require a fundamental time asymmetric law to describe as well?
Obviously, I don't doubt that entanglement gets destroyed by decoherence and that a gas expands. ;-) I just don't really understand how these processes are derived from the time-symmetric laws.

stevendaryl said:
The kind of entropy that we observe to increase is coarse-grained entropy. Roughly speaking, the coarse-grained entropy of a state is the log of the number of microscopic states that "look the same" under the coarse-graining.
I never really got this distinction. The Liouville equation leaves the Shannon entropy of the probability distribution constant. What you call the coarse-grained entropy is also called the Boltzmann entropy, and I thought it was equivalent to the Shannon entropy. Maybe that's a misconception?

Phew, I think this really leads off topic.
 
