An interpretation of quantum mechanics in terms of Brownian-like motion.

In summary, Edward Nelson proposed a mathematical model for quantum mechanics based on a universal jiggling of particles. However, this theory is inconsistent with special relativity and has other problems, including the lack of a physical justification for a key quantization condition. Various attempts have been made to solve these issues, but none have been completely successful. Additionally, there have been criticisms that the probability density in Nelson's theory plays the role of a force field, but this is based on a misunderstanding of its physical meaning. Overall, while Nelson's insights may lead to a new testable theory, it is not currently a complete solution for quantum mechanics.
  • #1
miosim
Edward Nelson, a Princeton mathematician, showed that quantum mechanics could be derived from the principle that elementary particles are subject to a universal jiggling of unspecified cause.
As I understand it, the obstacle to using this mathematical model as a physical one is that the dynamics of the jiggling particles contradict relativity.

Is this true? Are there other problems with this model?
 
  • #2
The problem I have with this kind of thinking is that it just sounds like window dressing to me, in the absence of any observational handle on the nature of the jiggling. To me, the big surprise of quantum mechanics is the importance of indeterminacy-- the inability to say what is an apple and what is an orange, when we are faced with apple/orange superpositions. I would say the lesson there is that determinacy ain't what we cracked it up to be, but we can always ignore that lesson and cling to determinacy-- and just say that some kind of inscrutable jiggling is what only appears to spoil the determinacy. What's the point of doing that, other than excusing us to cling to outmoded ideas that don't work unless we force them to? Now, this doesn't make Nelson's insights a waste of time-- if they give us a way to probe the jiggling, such that it might have actual properties or ramifications that lead to a new testable theory, then it's the best that science can get. But it isn't there yet.
 
  • #3
Ken G said:
... Now, this doesn't make Nelson's insights a waste of time-- if they give us a way to probe the jiggling, such that it might have actual properties or ramifications that lead to a new testable theory ...

… Or if the new theory could offer a more reasonable interpretation of quantum events than mind-twisting interpretations such as “existence in many places at once” and “parallel worlds”.
 
  • #4
miosim said:
Edward Nelson, a Princeton mathematician, showed that quantum mechanics could be derived from the principle that elementary particles are subject to a universal jiggling of unspecified cause.
As I understand it, the obstacle to using this mathematical model as a physical one is that the dynamics of the jiggling particles contradict relativity.

Is this true? Are there other problems with this model?
It is true that Nelson's theory is inconsistent with special relativity in the sense that the dynamics of the particles in his theory is nonlocal, and hence picks out a preferred foliation of spacetime. But this isn't particular to Nelson's theory. Even deterministic hidden-variable theories like the de Broglie-Bohm theory are inconsistent with SR for the same reason as Nelson's.

Re other problems with Nelson's theory, Wallstrom pointed out back in the late '80s/early '90s that Nelson's theory only recovers QM if one imposes a special quantization condition on the mean particle momentum, grad S, namely that the integral of grad S around any closed loop in space is an integer multiple of Planck's constant:

http://www.springerlink.com/content/v07n5l128613x5g0/

http://pra.aps.org/abstract/PRA/v49/i3/p1613_1

This quantization condition turns out to be equivalent to the condition that the derived wavefunctions in Nelson's theory be single-valued and smooth, since S/ℏ plays the role of the phase of the derived wavefunctions. The problem, as Wallstrom saw it, was that while standard QM has 'natural' physical justifications for the conditions of single-valuedness and smoothness on psi (e.g. they result from requiring that psi satisfy the linear superposition principle, that |psi|^2 be single-valued and smooth, and that expectation values of quantum observables remain finite), he was unable to identify any independent physical justification for the quantization condition within the equations of Nelson's theory. He also noted that any attempt to use the linearity of the Schroedinger equation in such a justification would be logically circular (i.e. it would be tantamount to assuming quantum mechanics). So Wallstrom concluded that the quantization condition has no independent justification in Nelson's theory, and hence that Nelson's theory (and all equivalent versions of it) does not succeed in deriving QM.
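In symbols (my own gloss of the standard Madelung-type correspondence, not a quote from Wallstrom's papers), the condition reads:

```latex
% Derived wavefunction built from the density rho and the "phase" S:
\psi = \sqrt{\rho}\; e^{iS/\hbar}
% Wallstrom's quantization condition on the mean momentum, grad S:
\oint_C \nabla S \cdot d\mathbf{x} = n h, \qquad n \in \mathbb{Z},
% for every closed loop C. This is exactly what single-valuedness of
% psi requires, since the phase S/hbar may change only by integer
% multiples of 2*pi around a loop.
```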

Now, there have been various attempts to solve the problem raised by Wallstrom. The ones I know of that are in print are by Smolin, Fritsche and Haugk, and Schmelzer:

http://arxiv.org/abs/quant-ph/0609109

http://arxiv.org/abs/0912.3442

http://arxiv.org/abs/1101.5774

I am also working on a proposed solution, quite different from the above approaches, and hope to submit a preprint to the arXiv soon.

In my assessment, the proposals of Smolin and Fritsche/Haugk are unsuccessful, while Schmelzer's seems solid to me.

Smolin essentially wants to claim that discontinuous (and hence multi-valued) wavefunctions are physically admissible for QM, but doesn't seem to realize that (1) this causes the expectation values of quantum observables such as the momentum and total energy to diverge, and (2) it causes |psi|^2 to be multi-valued, thereby destroying its interpretation as a probability density.

Fritsche and Haugk's proposal is essentially that, in order for the Nelsonian particle probability density to be normalizable, it must be the case that the momentum, grad S, is quantized; however, to justify this claim, their argument assumes that the derived wavefunctions, out of which the probability density is constructed, satisfy the linear superposition principle, thereby begging the question.

Schmelzer's approach seems solid to me because it simply involves imposing a natural boundary condition on the Nelsonian particle probability density, namely that the probability density be a smooth function so that the Laplace operator acting on it remains finite. I had independently thought of a similar proposal previously, but Schmelzer beat me to the punch by actually fleshing out the details.
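For concreteness, the term that Schmelzer's smoothness condition is meant to control can be written schematically as follows (this is the standard Madelung-type quantum-potential term; the notation is mine, not a quote from the paper):

```latex
% In the Madelung/Nelson equations the density enters through a term
% of the form
Q = -\frac{\hbar^2}{2m}\,\frac{\nabla^2 \sqrt{\rho}}{\sqrt{\rho}},
% so requiring rho to be smooth, with a finite Laplacian, keeps Q
% finite wherever rho > 0.
```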

Another complaint that's been leveled at Nelson's theory (e.g. by Bohm and Hiley) is that it looks like the probability density plays the role of a real physical force field in the osmotic velocity equation, and that this is logically inconsistent with the interpretation of the probability density as a distribution over a statistical ensemble of systems (or as a measure of uncertainty for an observer, if you're a subjectivist about probability). However, in my opinion, this complaint is incorrect, as it's based on a basic misunderstanding about the physical meaning of the osmotic velocity.
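For reference, the osmotic velocity in Nelson's theory is fixed by the probability density as:

```latex
% Osmotic velocity in Nelson's stochastic mechanics:
\mathbf{u} = \frac{\hbar}{2m}\,\frac{\nabla \rho}{\rho}
           = \frac{\hbar}{m}\,\nabla \ln \sqrt{\rho}.
% The density rho enters only through its logarithmic gradient,
% which is why reading it as a real physical force field is contested.
```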

Hope the above info helps.
 

FAQ: An interpretation of quantum mechanics in terms of Brownian-like motion.

What is the main concept behind an interpretation of quantum mechanics in terms of Brownian-like motion?

The main concept is that the seemingly random and unpredictable behavior of particles at the quantum level can be explained by a type of Brownian motion, the random movement of a particle in a fluid due to collisions with smaller particles. This interpretation suggests that the uncertainty and indeterminacy of quantum mechanics are actually caused by interactions between particles and a fluid-like medium.
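As a purely classical illustration (not Nelson's actual dynamics), a minimal random-walk simulation shows the diffusive scaling that characterizes Brownian motion: the mean squared displacement grows linearly in time, ⟨x²⟩ ≈ 2Dt. The function name and parameter choices below are my own.

```python
import math
import random

def brownian_msd(n_paths=2000, n_steps=500, dt=1e-3, diffusion=1.0, seed=0):
    """Simulate 1-D Brownian paths x_{k+1} = x_k + sqrt(2*D*dt) * N(0, 1)
    and return the mean squared displacement at the final time.

    For pure diffusion this should be close to 2 * D * t_final.
    """
    rng = random.Random(seed)
    step = math.sqrt(2.0 * diffusion * dt)  # per-step standard deviation
    msd = 0.0
    for _ in range(n_paths):
        x = 0.0
        for _ in range(n_steps):
            x += step * rng.gauss(0.0, 1.0)
        msd += x * x
    return msd / n_paths

t_final = 500 * 1e-3
print(brownian_msd(), 2.0 * 1.0 * t_final)  # simulated MSD vs. theory, 2*D*t
```

With these parameters the theoretical value is 2 · 1.0 · 0.5 = 1.0, and the simulated estimate converges to it as the number of paths grows.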

How does this interpretation differ from other interpretations of quantum mechanics?

This interpretation differs from others, such as the Copenhagen interpretation, which views particles as having inherently random properties and behavior. The Brownian-like motion interpretation instead suggests that this randomness is a result of interactions with a surrounding medium.

What evidence supports this interpretation?

One piece of evidence is the phenomenon of decoherence, where quantum systems interact with their environment and lose their quantum properties, behaving more like classical systems. This can be seen as a form of Brownian motion at the quantum level. Additionally, experiments have shown that particles at the quantum level can exhibit behavior similar to Brownian motion.

Are there any potential implications of this interpretation?

Yes, this interpretation could have implications for our understanding of the fundamental nature of reality and the role of consciousness in observing and influencing quantum systems. It also has the potential to bridge the gap between quantum mechanics and classical mechanics, as it suggests that classical behavior can emerge from quantum interactions with a surrounding medium.

Is this interpretation widely accepted among scientists?

No, it is still a relatively new and debated concept in the scientific community. While some scientists find it intriguing and promising, others argue that it may not fully explain all aspects of quantum mechanics and that further research is needed to fully understand its implications.
