Quantum mechanics is not weird, unless presented as such

In summary, quantum mechanics may seem weird because of the way it is often presented to the general public. There is a long history of this approach, since it sells better; in reality, it can be an obstacle for those trying to truly understand the subject. The paper referenced in the conversation shows that quantum mechanics can actually be derived from reasonable assumptions, making it less weird than is often thought. However, this derivation is only one author's view and may not be the complete truth. There are also other interpretations of quantum mechanics, such as the ensemble interpretation, which may not be fully satisfactory. Overall, a proper derivation of quantum mechanics must account for all aspects, including the treatment of measurement devices and of the past before measurements.
  • #281
stevendaryl said:
If we assume (as Einstein did) that causal influences propagate at lightspeed or slower, then [...] we don't need to know what conditions are like everywhere, just in the backward lightcone of where we are trying to make a prediction.

But we need to know the complete details of the universe in the backward light cones with apex at the spacetime positions at which we measure. This means all the details of the preparation and transmission, including all the details of the preparation equipment and the transmission equipment. For a nonlocal experiment over 1 km, the two backward lightcones, at the time the common signal is prepared, span a spherical region at least this large, which is a huge nonlocal system on all of whose details the prediction at the final two points may depend.

Thus to ''know what conditions are like just in the backward lightcone'' is a very formidable task, as any lack of detail in our model of what we assume in this light cone contributes to the nondeterminism. You dismiss this task with the single word ''just''.

Not a single paper I have seen takes this glaring loophole into account.
 
  • #282
A. Neumaier said:
Yes. The missing dynamics is that of the environment.

In all descriptions of Bell-like experiments, the very complex environment (obviously nonlocal, since it is the remainder of the universe) is reduced to one single act - the collapse of the state. Thus even if the universe evolves deterministically, ignoring the environment of a tiny system to this extent is sufficient cause for turning the system into a random one. (The statistical mechanics treatment in the review paper that I cited and you found too long to study tries to do better than just postulating collapse.)

Well, that's interesting, but surely that's not a standard view, that the apparent nondeterminism of QM would be resolved by ignored details of the rest of the universe?
 
  • #283
stevendaryl said:
Well, that's interesting, but surely that's not a standard view, that the apparent nondeterminism of QM would be resolved by ignored details of the rest of the universe?
None of my views on the foundations of quantum mechanics, as argued in this thread, is standard. Does it matter? It resolves, or at least greatly reduces, all quantum mysteries - and that is all that matters.

In the past, I spent a lot of time (too much for the gains I got) studying the available interpretations of QM in detail, and found them wanting. Then I noticed more and more small but important things that people routinely ignore in foundational matters, although they are discussed elsewhere:

  • It is fairly well known that real measurements are rarely von Neumann measurements but rather POVM measurements. Nevertheless, people are content to base their foundations on the former.
  • It is well-known that real systems are dissipative, and it is known that these are modeled in the quantum domain by Lindblad equations (lots of quantum optics literature exists on this). Nevertheless, people are content to base their foundations on a conservative (lossless) dynamics.
  • It is well-known how dissipation results from the interaction with the environment. Nevertheless, people are content to ignore the environment in their foundations. (This has changed a little with time. There is now often lip service paid to decoherence, and also frequent claims that it settles things when taken together with the traditional assumptions. It doesn't, in my opinion.)
  • It is known (though less well-known) that models in which the electromagnetic field is treated classically and only the detector is quantized produce exactly the same Poisson statistics for photodetection as models employing a quantum field in a coherent state. This conclusively proves that the detector signals are artifacts produced by the detector and cannot be evidence of photons (since they are completely absent in the first model). Nevertheless, people are content to treat in their foundations detector signals as proof of photon arrival.
  • It is well-known that the most fundamental theory of Nature is quantum field theory, in which particles are mere field excitations and not the basic ontological entities. Nevertheless, people are content to treat in their foundations quantum mechanics in terms of particles.
Taken together, I could no longer take the mainstream foundational studies seriously, and lost interest in them. Instead, an alternative view formed in my mind and became more and more comprehensive with time. Where others saw weirdness, I saw lack of precision in the arguments and arguments resting on oversimplified assumptions; and I saw different ways of phrasing in ordinary language exactly the same math that underlies the standard, misleading language.

This being said, let me finally note that it is well-known that decoherence turns pure states into mixed states. Since the dynamics of pure states is deterministic, this shows that, for purely mathematical reasons - and independently of which ontological status one assigns to the wave function - accounting for the unmodelled environment produces statistical randomness in addition to the alleged irreducible quantum randomness inherent in the interpretation of the wave function. Thus, to answer your question,
stevendaryl said:
surely that's not a standard view, that the apparent nondeterminism of QM would be resolved by ignored details of the rest of the universe?
I conclude that it is a standard view that ignoring details of the rest of the universe introduces additional nondeterminism. The only nonstandard detail I am suggesting is that the same mechanism that is already responsible for a large part of the observed nondeterminism (all of statistical mechanics is based on it) can just as well be taken to be responsible for all randomness. Together with shifting the emphasis from the wave function (a mathematical tool) to the density matrix (a matrix well-known to contain the physical information, especially the macroscopic, classical information), all of a sudden many things make simple sense. See my post #257 and its context.
Those who believe in the power of Occam's razor should therefore prefer my approach. It also removes one of the philosophical problems of quantum mechanics - to give irreducible randomness an objective meaning.
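To make the decoherence statement above concrete, here is a minimal numerical sketch (Python with NumPy; the single-qubit ''environment'' and the CNOT-type coupling are illustrative choices, not the models analyzed in the papers mentioned in this thread): coupling a system to an unobserved environment and then tracing the environment out turns a pure state into a mixed one.

[code]
import numpy as np

# Qubit starts pure: |+> = (|0> + |1>)/sqrt(2)
plus = np.array([1.0, 1.0]) / np.sqrt(2)

# Toy environment: a single extra qubit in |0>; the CNOT coupling below is
# an illustrative choice that copies which-path information into it.
env = np.array([1.0, 0.0])
psi = np.kron(plus, env)   # joint pure state, system (x) environment

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
psi = CNOT @ psi

# Reduced density matrix of the system: trace out the environment
rho_joint = np.outer(psi, psi).reshape(2, 2, 2, 2)
rho_sys = np.trace(rho_joint, axis1=1, axis2=3)

print(rho_sys)                      # off-diagonal terms are gone
print(np.trace(rho_sys @ rho_sys))  # purity 0.5 < 1: pure -> mixed
[/code]

The off-diagonal elements of the reduced density matrix vanish and the purity ##\mathrm{Tr}\,\rho^2## drops from 1 to 1/2, which is the mathematical content of ''decoherence turns pure states into mixed states''.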
 
  • #284
A. Neumaier said:
None of my views on the foundations of quantum mechanics, as argued in this thread, is standard. Does it matter? It resolves, or at least greatly reduces, all quantum mysteries - and that is all that matters.

  • It is fairly well known that real measurements are rarely von Neumann measurements but rather POVM measurements. Nevertheless, people are content to base their foundations on the former.
  • It is well-known that real systems are dissipative, and it is known that these are modeled in the quantum domain by Lindblad equations (lots of quantum optics literature exists on this). Nevertheless, people are content to base their foundations on a conservative (lossless) dynamics.
  • It is well-known how dissipation results from the interaction with the environment. Nevertheless, people are content to ignore the environment in their foundations. (This has changed a little with time. There is now often lip service paid to decoherence, and also frequent claims that it settles things when taken together with the traditional assumptions. It doesn't, in my opinion.)
  • It is known (though less well-known) that models in which the electromagnetic field is treated classically and only the detector is quantized produce exactly the same Poisson statistics for photodetection as models employing a quantum field in a coherent state. This conclusively proves that the detector signals are artifacts produced by the detector and cannot be evidence of photons (since they are completely absent in the first model). Nevertheless, people are content to treat in their foundations detector signals as proof of photon arrival.
  • It is well-known that the most fundamental theory of Nature is quantum field theory, in which particles are mere field excitations and not the basic ontological entities. Nevertheless, people are content to treat in their foundations quantum mechanics in terms of particles.
I agree with all of that, but I'm not at all convinced that taking into account all of that complexity makes any difference. There is a reason that discussions of Bell's inequality and other foundational issues use simplified models, and that is that reasoning about the more realistic models is much more difficult. The assumption is that if we can understand what is going on in the more abstract model, then we can extend that understanding to more realistic models. It's sort of like how when Einstein was reasoning about SR, he used idealized clocks and light signals, and didn't try to take into account that clocks might be damaged by rapid acceleration, or that the timing of arrival of a light signal may be ambiguous, etc. To make the judgment that a simplified model captures the essence of a conceptual problem is certainly error-prone, and any conclusion someone comes to is always eligible to be re-opened if someone argues that more realistic details would invalidate the conclusion.

But in the case of QM, I really don't have a feeling that any of the difficulties with interpreting QM are resolved by the complexities you bring up. It seems to me, on the contrary, that the complexities can't possibly resolve them in the way you seem to be suggesting.

Whether it's QM or QFT, you have the same situation:
  • You have an experiment that involves a measurement with some set of possible outcomes: [itex]o_1, o_2, ..., o_N[/itex]
  • You use your theory to predict probabilities for each outcome: [itex]p_1, p_2, ..., p_N[/itex]
  • You perform the measurement and get some particular outcome: [itex]o_j[/itex]
  • Presumably, if you repeat the measurement often enough with the same initial conditions, the relative frequency of getting [itex]o_j[/itex] will approach [itex]p_j[/itex]. (If not, your theory is wrong, or you're making some error in your experimental setup, or in your calculations, or something)
What you seem to be saying is that the outcome [itex]o_j[/itex] is actually determined by the details you left out of your analysis. That seems completely implausible to me, in light of the EPR experiment (unless, as in Bohmian mechanics, the details have a nonlocal effect). In EPR, Alice and Bob are far apart. Alice performs a spin measurement along a particular axis, and the theory says that she will get spin-up with probability 1/2 and spin-down with probability 1/2. It's certainly plausible, considering Alice's result in isolation, that the details of her measuring device, or the electromagnetic field, or the atmosphere in the neighborhood of her measurement might affect the measurement process, so that the result is actually deterministic, and the 50/50 probability is some kind of averaging over ignored details. But that possibility becomes completely implausible when you take into account the perfect anti-correlation between her result and Bob's. How do the details of Bob's device happen to always produce the opposite effect of the details of Alice's device?

I understand that you can claim that in reality, the anti-correlation isn't perfect. Maybe it's only 90% anti-correlation, or whatever. But that doesn't really change the implausibility much. In those 90% of the cases where they get opposite results, it seems to me that either the details of Bob's and Alice's devices are irrelevant, or that mysteriously, the details are perfectly matched to produce opposite results. I just don't believe that that makes sense. Another argument that it can't be the details of their devices that make the difference is that it is possible to produce electrons that are guaranteed to be spin-up along a certain axis. Then we can test whether Alice always gets spin-up, or whether the details of her measuring device sometimes convert that into spin-down. That way, we can get an estimate as to the importance of those details. My guess is that they aren't important, but I need somebody who knows about experimental results to confirm or contradict that guess.

So if the ignored, microscopic details of Alice's and Bob's devices aren't important (and I just don't see how they plausibly can be), that leaves the ignored environment: the rest of the universe. Can details about the rest of the universe be what determines Alice's and Bob's outcomes? To me, that sounds like a hidden-variables theory of exactly the type that Bell tried to rule out. The hidden variable [itex]\lambda[/itex] in his analysis just represents any details that are common to Alice's and Bob's measurements. The common environment would certainly count. Of course, Bell's proof might have loopholes that haven't been completely closed. But it seems very implausible to me.

What I would like to see is some kind of simulation of the EPR experiment in which the supposed nondeterminism is actually resolved by the ignored details. That's what would convince me.
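(For reference, here is a minimal sketch of the statistics that any such simulation would have to reproduce, in Python with NumPy. Note that this code simply draws from the Born-rule probabilities of the singlet state, so it assumes the randomness rather than resolving it; it only fixes the target correlations.)

[code]
import numpy as np

rng = np.random.default_rng(0)

def singlet_pair(a, b, n):
    # Sample n measurement pairs on a spin singlet with analyzers at angles
    # a and b, drawn directly from the Born-rule probabilities:
    # P(opposite results) = cos^2((a - b)/2).
    alice = rng.choice([+1, -1], size=n)                 # marginal is 50/50
    opposite = rng.random(n) < np.cos((a - b) / 2) ** 2
    bob = np.where(opposite, -alice, alice)
    return alice, bob

# Same axis: perfect anti-correlation, E = -1
A, B = singlet_pair(0.0, 0.0, 100_000)
print(np.mean(A * B))                                    # ~ -1.0

# Generic angles reproduce E(a, b) = -cos(a - b)
A, B = singlet_pair(0.0, np.pi / 3, 100_000)
print(np.mean(A * B), -np.cos(np.pi / 3))                # ~ -0.5 vs -0.5
[/code]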
 
  • #285
stevendaryl said:
Another argument that it can't be the details of their devices that make the difference is that it is possible to produce electrons that are guaranteed to be spin-up along a certain axis. Then we can test whether Alice always gets spin-up, or whether the details of her measuring device sometimes convert that into spin-down. That way, we can get an estimate as to the importance of those details. My guess is that they aren't important, but I need somebody who knows about experimental results to confirm or contradict that guess.
If the input is all-spin-up and the measurement tests for spin-up, the result will be deterministic independent of the details of the detector. But if the input is all spin-up and the measurement tests in another direction, the random result will be produced by the detector. Both can be seen by considering a model that inputs a classical polarized field and uses a quantum detector sensitive to the polarization direction.
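For illustration, a minimal toy version of this (Python with NumPy): the detector here is modeled by nothing more than a per-trial response probability following Malus's law, an illustrative stand-in rather than the specific quantum detector model referred to above.

[code]
import numpy as np

rng = np.random.default_rng(1)

def detect(theta_field, theta_detector, n):
    # Toy detector for a classically polarized field: each trial gives +1
    # with the Malus-law probability cos^2(dtheta), else -1. All randomness
    # lives in the detector response, none in the input field.
    p = np.cos(theta_field - theta_detector) ** 2
    return np.where(rng.random(n) < p, +1, -1)

# Aligned input and test direction: deterministic, always +1
print(np.unique(detect(0.0, 0.0, 10_000)))            # [1]

# Test direction rotated by 45 degrees: detector-generated 50/50 randomness
print(np.mean(detect(0.0, np.pi / 4, 100_000) == 1))  # ~ 0.5
[/code]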
stevendaryl said:
So if the ignored, microscopic details of Alice's and Bob's devices aren't important (and I just don't see how they plausibly can be), that leaves the ignored environment: the rest of the universe.
The ignored environment includes the microscopic details of Alice's and Bob's devices and how they were influenced by the common past. As I haven't done the calculations (remember that the report I linked to needed 150 pages to make the case in the particular models studied there), I cannot tell what the mathematical result would be, but I suspect it would just give what is actually observed.

But you are mixing two topics that should be kept separate - the question of whether perfect anticorrelations can be explained classically, and the question of whether quantum randomness can be explained by restricting the deterministic quantum dynamics of the universe. Deterministic is far from equivalent to classical and/or Bell-local! Therefore these are very different questions.

The quantum mechanical correlations observed in a tiny quantum system come from the quantum mechanical dynamics of the density matrix of the universe - there is nothing classical in the latter, hence one shouldn't expect that the restriction to a tiny subsystem would be classical. On the contrary, all we know about the many actually studied subsystems of slightly larger quantum systems indicates that one gets exactly the usual quantum description of the isolated subsystem, plus correction terms that account for additional randomness - decoherence effects, etc. There is no ground at all to think that this should become different when the systems get larger, and ultimately universe-sized.
 
  • #286
A. Neumaier said:
The ignored environment includes the microscopic details of Alice's and Bob's devices and how they were influenced by the common past.

But it seems to me that the perfect anti-correlations imply that the details of Alice's and Bob's devices AREN'T important. Alice can independently fool with the details of her device, and that won't upset the perfect anti-correlations with Bob's measurement.

A. Neumaier said:
But you are mixing two topics that should be kept separate - the question of whether perfect anticorrelations can be explained classically, and the question of whether quantum randomness can be explained by restricting the deterministic quantum dynamics of the universe. Deterministic is far from equivalent to classical and/or Bell-local! Therefore these are very different questions.

Yes, I agree that they are different questions, but as I said, I find the idea that quantum nondeterminism can be explained through ignored details about the rest of the universe to be sufficiently like the classical case that I am very dubious that it can be made to work. There are more exotic variants of this idea, such as the Bohmian approach (where the extra details that resolve the nondeterminism are nonlocal) or the retrocausal approach (where the extra details are found in the future, not in the present). But I find it very implausible that extra details about the causal past can possibly explain the nondeterminism. As I said, it would take a simulation (or a calculation, if I could follow it) to convince me of such a resolution. I am not a professional physicist, so I don't have the qualifications or knowledge to state this with certainty, but it seems to me that your suggestion might be provably impossible.
 
  • #287
stevendaryl said:
I am not a professional physicist, so I don't have the qualifications or knowledge to state this with certainty, but it seems to me that your suggestion might be provably impossible.

To me that seems most likely, unless of course we're drifting into a superdeterministic interpretation, which is the feeling I'm getting.

Also, I'm still not even clear what exactly is being argued: the idealized model was rejected without any attempt at reframing it in this new view, so we didn't get a good look at it.

Not to be a bore, but I think that, to get at anything conclusive, the best bet is to go for last year's loophole-free experimental test. It is a realistic example, and the setup is, after all, relatively simple. The problem is, honestly, that we don't really believe in this idea; the burden of proof doesn't lie with the accepted framework (however unfair that may appear to be).
 
  • #288
ddd123 said:
To me it seems most likely, of course unless we're drifting into a superdeterministic interpretation which is a feeling I'm getting.

Yeah, well, superdeterminism is very irksome for philosophical and scientific reasons, but sometimes I wonder if it really is the answer. We think of the choices we make (about whether to measure this or that) as freely chosen, but since we are physical systems, obeying the same laws of physics as electrons, at some level we no more choose what we do than an electron does.
 
  • #289
stevendaryl said:
Yeah, well, superdeterminism is very irksome for philosophical and scientific reasons, but sometimes I wonder if it really is the answer. We think of the choices we make (about whether to measure this or that) as freely chosen, but since we are physical systems, obeying the same laws of physics as electrons, at some level we no more choose what we do than an electron does.

I don't think that's the problem. It's that the superdeterministic law would have to be concocted specifically to counter our fiddling with the instruments. It's more anthropocentric, not less, imho.
 
  • #290
ddd123 said:
I don't think that's the problem. It's that the superdeterministic law would have to be concocted specifically to counter our fiddling with the instruments. It's more anthropocentric, not less, imho.

I think that depends on the details of the superdeterministic theory. Just saying that there is a conspiracy is pretty worthless, but if someone could give a plausible answer to how the conspiracy is implemented, it might not be objectionable.
 
  • #291
stevendaryl said:
I think that depends on the details of the superdeterministic theory. Just saying that there is a conspiracy is pretty worthless, but if someone could give a plausible answer to how the conspiracy is implemented, it might not be objectionable.

Yes, I really can't imagine how that could be though. If such a theory ends up being non-magical-looking, wouldn't it be just a local realistic one, and thus nonexistent?
 
  • #292
ddd123 said:
Yes, I really can't imagine how that could be though. If such a theory ends up being non-magical-looking, wouldn't it be just a local realistic one, and thus nonexistent?

No. What Bell ruled out was the possibility of explaining the outcome of the EPR experiment by a function of the form:

[itex]P(A\ \&\ B\ |\ \alpha, \beta) = \sum_\lambda P(\lambda)P(A\ |\ \lambda, \alpha) P(B\ |\ \lambda, \beta) [/itex]

A superdeterministic theory would modify this to
[itex]P(A\ \&\ B\ |\ \alpha, \beta) = \Big[\sum_\lambda P(\lambda)P(A\ \&\ \alpha\ |\ \lambda) P(B\ \&\ \beta\ |\ \lambda)\Big]/P(\alpha\ \&\ \beta)[/itex]

Alice and Bob's settings [itex]\alpha[/itex] and [itex]\beta[/itex] would not be assumed independent of [itex]\lambda[/itex]. That's a different assumption, and the fact that the former is impossible doesn't imply that the latter is impossible.
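For concreteness, a minimal sketch of what the first, factorized form enforces (Python with NumPy; the particular hidden-variable model is an arbitrary illustrative choice): any model of that factorized form stays within the CHSH bound ##|S| \le 2##, whereas the quantum singlet prediction reaches ##2\sqrt 2##.

[code]
import numpy as np

rng = np.random.default_rng(2)
lam = rng.uniform(0, 2 * np.pi, 20_000)  # samples of the shared variable lambda

# Illustrative local deterministic model of the factorized form above:
# each side's outcome depends only on its own setting and on lambda.
angles = np.linspace(0, np.pi, 7)
outcome = {th: np.sign(np.cos(th - lam)) for th in angles}

def E(a, b):
    return np.mean(outcome[a] * outcome[b])

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
S_max = max(abs(E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp))
            for a in angles for ap in angles
            for b in angles for bp in angles)

print(S_max)           # stays <= 2 (up to sampling noise) for all settings
print(2 * np.sqrt(2))  # quantum singlet value ~2.83 lies beyond that bound
[/code]

A superdeterministic model escapes the bound precisely because lambda would no longer be sampled independently of the settings.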
 
  • #293
stevendaryl said:
I find the idea that quantum nondeterminism can be explained through ignored details about the rest of the universe to be sufficiently like the classical case that I am very dubious that it can be made to work.
I don't think this can be made to work.

But you misunderstood me. I am only claiming the first part, ''that quantum nondeterminism can be explained through ignored details about the rest of the universe'', but not that it makes the explanation sufficiently classical. It makes the explanation only deterministic, which for me is something completely different. Nevertheless it is a step forward. Unlike Bohmian mechanics, it requires not the slightest alteration to the quantum formalism.
 
  • #294
ddd123 said:
the idealized model was rejected without any attempt at reframing it
So far I haven't discussed it in detail, only because I haven't yet received the requested reassurance that there wouldn't be any further shifting of ground, like ''but I had intended ...'', or ''but there is another experiment where ...'', or ''but if you modify the setting such that ...'', where the ... are changes in the precise description for which my analysis (of adequacy to the real world, and of similarity to classical situations) would no longer be appropriate.

Once it is clear which absolutely fixed setting is under discussion, with all relevant details, assumptions, and arguments for its weirdness fully spelled out, I'll discuss the model.
 
  • #295
A. Neumaier said:
I don't think this can be made to work.

But you misunderstood me. I am only claiming the first part, ''that quantum nondeterminism can be explained through ignored details about the rest of the universe'', but not that it makes the explanation sufficiently classical.

Well, regardless of whether it's classical or not, I don't believe that it is possible without "exotic" notions of ignored details (such as those that work backward in time or FTL).

A. Neumaier said:
It makes the explanation only deterministic, which for me is something completely different. Nevertheless it is a step forward. Unlike Bohmian mechanics, it requires not the slightest alteration to the quantum formalism.

Well, if it works. That's what I find doubtful. Quantum mechanics through the Born rule gives probabilities for outcomes. For pure QM to give deterministic results means that the evolution of the wave function, when you take into account all the details of the environment, makes every probability go to either 0 or 1. That does not seem consistent with the linearity of quantum mechanics. If you have a wave function for the whole universe that represents Alice definitely getting spin-up, and you have a different wave function that represents Alice definitely getting spin-down, then the superposition of the two gives a wave function that represents Alice in an indeterminate state. So to me, either you go to Many Worlds, where both possibilities occur, or you go to something beyond pure QM, such as Bohm or collapse.
 
  • #296
stevendaryl said:
when you take into account all the details of the environment, makes every probability go to either 0 or 1. That does not seem consistent with the linearity of quantum mechanics.
Quantum mechanics is linear (von Neumann equation ##i \hbar \dot\rho=[H,\rho]##) only in the variables ##\rho## that we do not have experimental access to when the system has more than a few degrees of freedom (i.e., when a measuring device is involved). But it is highly nonlinear and chaotic in the variables that are measurable.

This can be seen already classically.
The analogue of the von Neumann equation for a classical multiparticle system is the Liouville equation ##\dot\rho=\{\rho,H\}##, which is also linear. But it describes faithfully the full nonlinear dynamics of the classical multiparticle system! The nonlinearities appear once one interprets the system in terms of the observable variables, where one obtains, through the nonlinear BBGKY hierarchy, the nonlinear Boltzmann equation of kinetic theory and the nonlinear Navier-Stokes equations of hydrodynamics.

Similarly, one can derive the nonlinear Navier-Stokes equations of hydrodynamics also from quantum mechanics.

Note also that many of the technical devices of everyday life that produce discrete results and change in a discrete fashion are also governed by nonlinear differential equations. It is well-known how to get bistability in a classical dissipative system from a continuous nonlinear dynamics involving a double-well potential! There is nothing mysterious at all in always getting one of two possible definite discrete answers, in a more or less random fashion, from a nonlinear classical dynamics - which becomes a linear dynamics once formulated (fully equivalently) as a dynamics of phase space functions, the classical analogue (and classical limit) of the linear Ehrenfest equation for quantum systems.
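For concreteness, a minimal classical sketch of this bistability (Python with NumPy; the double-well potential, noise level, and step count are illustrative choices): a continuous, noisy, dissipative dynamics in which every run nevertheless ends near one of two discrete answers.

[code]
import numpy as np

rng = np.random.default_rng(3)

def settle(n_runs=1000, steps=5000, dt=0.01, T=0.05):
    # Overdamped Langevin dynamics in the double well V(x) = (x^2 - 1)^2:
    # continuous nonlinear dynamics plus weak noise, started on the hump.
    x = np.zeros(n_runs)
    for _ in range(steps):
        drift = -4 * x * (x**2 - 1)   # -V'(x)
        x += drift * dt + np.sqrt(2 * T * dt) * rng.standard_normal(n_runs)
    return x

x = settle()
print(np.mean(x > 0))                        # ~0.5: which well looks random
print(np.mean(np.abs(np.abs(x) - 1) < 0.3))  # ~1.0: outcomes are discrete
[/code]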
 
  • #297
stevendaryl said:
without "exotic" notions of ignored details
The notion of ignored details I am referring to is nothing exotic at all, but technically precisely the same one routinely applied in the projection operator technique for deriving the equations of a reduced description. It is a very standard technique from statistical mechanics that can be applied (with a slightly different setting in each case) to a variety of situations, and in particular to the one of interest here (contraction of a quantum Liouville equation to a Lindblad equation for a small subsystem). The necessary background can be found in a book by Grabert. (Sorry, again more than just a few pages.)
 
  • #298
A. Neumaier said:
Quantum mechanics is linear (von Neumann equation ##i \hbar \dot\rho=[H,\rho]##) only in the variables ##\rho## that we do not have experimental access to when the system has more than a few degrees of freedom (i.e., when a measuring device is involved). But it is highly nonlinear and chaotic in the variables that are measurable.

As I said, what I would like to see is a demonstration (simulation, or derivation) that the evolution equations of QM (or QFT) lead to (in typical circumstances) selection of a single outcome out of a set of possible outcomes to a measurement. Is there really any reason to believe that happens? I would think that there is not; as a matter of fact, I would think that somebody smarter than me could prove that it doesn't happen. I'm certainly happy to be wrong about this.
 
  • #299
stevendaryl said:
I would like to see is a demonstration (simulation, or derivation) that the evolution equations of QM (or QFT) lead to (in typical circumstances) selection of a single outcome out of a set of possible outcomes
It is nothing particularly demanding, just a lot of technical work to get it right - like every detailed derivation in statistical mechanics. If I find the time I'll give a proper derivation - but surely not in the next few days, as it is the amount of work needed for writing a research paper.

Therefore I had pointed to an analogous result for a classical bistable potential. A 2-state quantum system (an electron with two basis states ''bound'' and ''free'', the minimal quantum measurement device) behaves qualitatively very similarly.
 
  • #300
A. Neumaier said:
Therefore I had pointed to an analogous result for a classical bistable potential. A 2-state quantum system (an electron with two basis states ''bound'' and ''free'', the minimal quantum measurement device) behaves qualitatively very similarly.

I understand how bistable potentials can be similar in some respects, but I don't think that works for distant correlations such as EPR. That's the demonstration that I would like to see: show how tiny details cause Alice and Bob to get definite, opposite values in the case where they are measuring spins along the same direction.
 
  • #301
stevendaryl said:
I understand how bistable potentials can be similar in some respects, but I don't think that works for distant correlations such as EPR.
The mathematics of projection operators does not distinguish between a tensor product of two qubits very close to each other and two qubits very far apart. It doesn't distinguish between whether a system is described only by diagonal density operators (a classical deterministic or stochastic system) or by nondiagonal ones (a quantum deterministic or stochastic system). Both together are enough to expect that it will work as well for long-distance entangled states of qubits as for classical multistable states, in both cases reproducing the expectations of the corresponding theories.

The detailed predictions are of course different, since the dynamics is different. But the statistical principle underlying both is exactly the same (projection operators - the same abstract formulas!), and the resulting qualitative dynamical principles (dissipation leads under the correct conditions to discrete limiting states, and these are approached following an exponential law in time) are also precisely the same. Moreover, there are already statistical mechanics investigations (such as the 160-page paper I had referred to) that show that the microscopic and the macroscopic are consistent, roughly in the way I discuss.

Thus I (a professional mathematician with many years of experience in building correct intuition about how to qualitatively relate different instances of a common mathematical scheme) don't have any doubt that the details will work out when pursued with the required persistence. It would be mathematically weird if they didn't. Of course, this is no proof, and occasionally mathematics produces weird truths. So there is merit in doing a detailed model calculation. But as any new detailed application of statistical mechanics to a not completely toy situation is a research project that can easily take on the dimensions of a PhD thesis, I haven't yet done such a model calculation, and don't know when I'll find the leisure to do it. (I have a full professor's share of work to do in mathematics, and do all physics in my spare time.)

So yes, I agree that detailed calculations are desirable and would give additional insight into the mechanism. But even without these detailed calculations, the nature of the mathematics is of the kind that leads me to expect that nothing surprising (i.e., deviating from the expected results outlined by me) would come out.

Thus you may view my scenario, outlined in the part of this discussion centering on the density matrix, as a conjecture well supported by qualitative arguments as well as by analogies drawn from detailed studies of related problems. Let us postpone the question of the actual validity of the conjecture until someone with enough time has taken up the challenge and written a thesis about it.
 
  • #302
There is and always will be a difference between a qualified mathematician and a qualified physicist. This thread is a testament that they are in different leagues.
 
  • #303
The central idea of your thread is that the apparent weirdness lies in the fact that people talk about QM in the wrong way and that we can reduce it by changing the way we talk about QM. In your book, you try to present the mathematics of QM and classical mechanics as close as possible.

What do you think about changing the way we talk about classical mechanics? Because the apparent weirdness of QM would also be reduced if we identified preconceived notions which aren't justified by the mathematics in the way we talk about classical mechanics.
 
  • #304
kith said:
What do you think about changing the way we talk about classical mechanics?
Talk about deterministic classical mechanics needs little change, as it leads to few conceptual problems. One must only avoid the use of the notion of point particles in the context of fields, and realize that particles in classical mechanics are in reality also extended. But in the approximation where particles can be treated as rigid impenetrable spheres and the field they generate can be neglected, one can perform a valid point particle limit and hence has a good justification of the point particle picture. The main use of the latter is the great simplification it brings to theory and computations.

On the other hand, traditional thinking in classical statistical mechanics needs some change. The concept of probability (and the associated ensembles) is philosophically thorny, and the concept of indistinguishable particles flies in the face of true classical thinking, though it is necessary to get the correct statistics. In my book I try to minimize the impact of both by emphasizing expectation rather than probability. The latter then appears as a derived concept in the spirit of Whittle's nice book, rather than as a basic entity.
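As a minimal illustration of probability as a derived concept (Python with NumPy; the Gaussian event is an arbitrary example): the probability of an event is simply the expectation of its indicator function.

[code]
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(4)

# P(A) as the expectation <1_A> of the indicator function of the event A.
samples = rng.standard_normal(1_000_000)    # an arbitrary example distribution
indicator = (samples > 1.0).astype(float)   # 1 on the event {X > 1}, else 0

print(indicator.mean())          # empirical P(X > 1), computed as <1_A>
print(0.5 * erfc(1 / sqrt(2)))   # exact value ~0.1587, for comparison
[/code]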

Did you have any other things in mind?
 
  • #306
What do you all mean by "weird"? Do you mean counterintuitive? Or inexplicably bizarre? Do you mean "does not fit with how we normally think of the world"?

I think you mean that it seems bizarre and inexplicable that the basic physical processes should be statistical, indeterminate, and with so little analogy to interactions on the classical scale. One can get all of classical mechanics starting from pushes and pulls and notions like ''longer than'', as B. Hartmann argues at http://arxiv.org/pdf/1307.0499.pdf. As far as I know, this can't be done with QM. That's what I think you mean by "weirdness".
 
  • #307
crastinus said:
What do you all mean by "weird"?

It is the widespread impression that something is deeply unsatisfactory in the foundations of quantum mechanics. For example,
stevendaryl said:
I find it weird for QM to split things into the three parts: (1) Preparation procedures, (2) Unitary evolution, (3) Measurements. At some level, (1) and (3) are just complicated physical processes, so that should be included in (2).
stevendaryl said:
When people say that the problem in understanding QM is because it is too far removed from human experience and human intuition, I don't agree. To me, what's weird is the parts (1) and (3) above, and what's weird about them is that they seem much too tightly tied to human actions (or to humanly comprehensible actions). Nature does not have preparation procedures and measurements, so it's weird for those to appear in a fundamental theory.
stevendaryl said:
It seems to me that the various ways of explaining away the mystery of QM are akin to trying to prove to somebody that a Mobius strip is actually a cylinder. You point to one section of the strip and say: "There's no twist in this section." You point to another section of the strip and say: "There's no twist in this section, either." Then, after going over every section, you conclude: "Clearly, there are no twists anywhere. It's a cylinder." The fact that it's twisted is a nonlocal property; you can always remove the twist from any one section.
 
  • #308
First of all, congratulations on your work! I recently graduated in electrical engineering and I look forward to studying the two pillars of physics: quantum mechanics and general relativity. Is it possible to start in these disciplines already? Regarding quantum mechanics, perhaps I am not the best person to argue about this, not having a desirable level of knowledge in this area, but it seems to me that the problem with quantum mechanics is not that it takes a weird approach but that it takes an inconsistent one. There are two sides of quantum mechanics, the Schrödinger equation and the act of performing a measurement, and these are incompatible. What would be your view on this aspect? I would like to thank you for any response :smile:.
 
  • #309
Cosmology2015 said:
There are two sides of quantum mechanics, the Schrödinger equation and the act of performing a measurement, and these are incompatible. What would be your view on this aspect? I would like to thank you for any response :smile:.

If "measuring" didn't end up with superluminal influences it would be far less weird and something could be worked out. There are other weird aspects in QM but the EPR physics is the cornerstone of weirdness, without it you could work something out to fix the other aspects.

Maybe an exception is "why the discrete chunks", as in the double slit. If, as Neumaier says, we take fields to be fundamental, why should we get clicks in detectors / why should a detector absorb the whole quantum of energy in one go? Or, if a field quantum takes the form of a spherical wave originating from a source, how can the whole energy distributed in this way end up in one single detector at a certain direction? That's weird too.
 
  • #310
Cosmology2015 said:
the Schrödinger equation and the act of performing a measurement, and these are incompatible.
The latter is an approximation of the former, obtained when one approximates the dynamics of a big system consisting of a small system and a detector by a dynamics for the small system only, combined with conditioning on the result of the experiment.
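A minimal toy illustration of this ''restriction plus conditioning'' picture (Python with NumPy; the one-qubit ''detector'' is an illustrative stand-in for a macroscopic device, not the statistical-mechanics treatment referred to earlier in this thread):

[code]
import numpy as np

alpha, beta = np.sqrt(0.3), np.sqrt(0.7)
system = np.array([alpha, beta])    # a|0> + b|1>
pointer = np.array([1.0, 0.0])      # detector pointer in its ready state |0>

# Pre-measurement interaction: a CNOT copies the basis into the pointer
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
psi = CNOT @ np.kron(system, pointer)        # a|00> + b|11>
rho = np.outer(psi, psi).reshape(2, 2, 2, 2)

# Restriction: trace out the pointer -> mixed state with Born-rule weights
rho_sys = np.trace(rho, axis1=1, axis2=3)
print(np.diag(rho_sys))                      # [0.3, 0.7]

# Conditioning on pointer reading '1': project and renormalize
P1 = np.kron(np.eye(2), np.diag([0.0, 1.0]))
psi_cond = P1 @ psi
psi_cond /= np.linalg.norm(psi_cond)
print(psi_cond)                              # |11>: system ''collapsed'' to |1>
[/code]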

Quantum mechanics is fully consistent (except for fine points in the construction of relativistic quantum field theory). The inconsistency is in its interpretation only, since the latter is always dominated by imprecise and subjective talk.

To start with quantum mechanics without the usual introduction to a mystery cult, you may try my book mentioned in #2 of this thread. (There the discussion of the mysteries is delayed until Chapter 10, where they are demystified.)
 
  • #311
ddd123 said:
why should we get clicks in detectors / why should a detector absorb the whole quantum of energy in one go
Because of the bistable electrons that make up the detector. An electron cannot fly away 11.578 percent of the way - either it flies away or it doesn't. If it does, it takes away a whole quantum of energy.
ddd123 said:
if a field quantum takes the form of a spherical wave originating from a source, how can the whole energy distributed in this way end up in one single detector at a certain direction.
That's discussed in a famous paper by Mott.
 
  • #312
A. Neumaier said:
Why is material existence absent when there is a mass density? Classically, in classical elasticity theory (which governs the behavior of all solids of our ordinary experience) and hydrodynamics (which governs the behavior of all liquids and gases of our ordinary experience), all you have about material existence is the mass density - unless you go into the microscopic domain where classical descriptions are not applicable.
I've been rolling this around in my head, and I'm still somewhat unclear about a couple of things. I think much of this is due to semantic ambiguity. I'm hoping that you could clarify two points that might help me understand.
1.) Would you mind trying to give me your definition of the term "material"?
2.) Having dismissed the concept of material particles, while viewing the quantum field as being ontologically material, should I interpret that to mean that you view the entire universe as a singular material object?
 
  • #313
Feeble Wonk said:
Would you mind trying to give me your definition of the term "material"?
You had introduced the term, and I used it more or less in the sense that you seemed to use it - physical existence of something you can feel or touch. (Thus excluding the massless and invisible electromagnetic field, which can be said to have physical but not material existence.)

In quantum field theory, the fields exist everywhere, but where they have zero (or small enough) mass or energy density they have no physical effect and are considered absent. For example, the solar system has an appreciable mass density concentrated on a limited number of bodies only (the Sun, the planets, asteroids, comets, and spacecraft and their debris), with only an additional tiny mass distribution in interplanetary space.

Feeble Wonk said:
you view the entire universe as a singular material object?
I view the universe as a single physical object (why else does it have a name?) composed of many material and nonmaterial parts. The material parts are called galaxies, stars, planets, houses, bricks, cells, molecules, atoms, quarks, etc.; the nonmaterial parts are called light, electric fields, magnetic fields, gravitational fields, etc.

So the universe has a density matrix, and by restriction one can get from it the density matrix of arbitrary parts of it (given a sufficiently well-defined operational definition of which part is meant). For example, one can look at the density matrix of the Sun, the Earth, or the gravitational field in between. Or of a beam of particles, or a detector, or the current Queen of England. (Well, in the last two cases, there will be some ambiguity concerning precisely which part of the universe belongs to the object. But strictly speaking, we have this problem already for objects like the Earth or the Sun, where the atmosphere gets thinner and thinner and one has to make an arbitrary cutoff.)

But one cannot consider the density matrix of Schroedinger's cat, since it is not a well-defined part of the physical universe.
 
  • #314
A. Neumaier said:
...physical existence of something you can feel or touch (Thus excluding the massless and invisible electromagnetic field, which can be said to have physical but not material existence.)

I don't intend to be trivially argumentative, but one can certainly feel an electromagnetic field. In fact, even using the symbolic "particle" concept, charged particles typically don't actually touch, because of the repulsive force of the field.

I assume that you are actually equating the degree of materiality to the content of mass. Yes?
 
  • #315
Feeble Wonk said:
you are actually equating the degree of materiality to the content of mass. Yes?
Yes. At least for the purpose of this discussion I take this as the definition of the word material. It corresponds fairly well to its meaning in ordinary life.
 
