Quantum physics vs Probability theory

In summary, because quantum mechanics uses a different concept of probability, it's not compatible with classical probability theory. However, this doesn't answer the question of why quantum mechanics went this route instead of using Kolmogorov's approach.
  • #71
Killtech said:
Sorry, I edited my prior post after posting to elaborate a little more. But I think you misunderstand my goal slightly: I am not trying to explain anything. Rather, I am looking for a clear construction principle for how to associate elements of the QM formalism with the macroscopic observations made in experiments - other than via the standard interpretations, which I am struggling with.

I don't believe you can. If we exclude highly specialist macroscopic objects that have been experimentally created to display QM phenomena, and look at "ordinary" macroscopic objects, like a particle detector.

You can't account for every particle in the detector and environment explicitly. Schrodinger's cat might be a good example. Just how, in QM terms, are you going to define a "live" cat and a "dead" cat? How do you define a cat, for that matter! You can do it in veterinary terms. But, there is no QM definition of a cat.

You have to accept that mathematical and physical reasoning from QM does not extend to a cat. Roughly you need at least:

QM
Molecular chemistry
Organic chemistry
Cell biology
Biology

QM underpins the whole edifice, but you can't understand a cat using only QM.

Theoretically, let's assume we could do it. But it's practically impossible.
 
  • #72
PeroK said:
I don't believe you can. If we exclude highly specialist macroscopic objects that have been experimentally created to display QM phenomena, and look at "ordinary" macroscopic objects, like a particle detector.

Theoretically, let's assume we could do it. But it's practically impossible.
Well, I have to disagree here. Initially I was simply looking for a way to express Schrödinger's equation in pictures to better understand what it does - I have found that, when dealing with differential equations, it is extremely helpful to depict them visually to get a good intuition for how solutions should look and why certain theorems hold - and the fact that the equation is complex valued was a bit of an obstacle. Using the polar representation ##\Psi = \sqrt{\rho} e^{i u}## I got around that and worked out the time evolution equations for both quantities. Since it always helps to find similar, already well understood equations as a shortcut to picturing these, I found that classical physics offers a lot: the time evolution of ##\rho## is the simple continuity equation, while the one for the probability density current can be written in terms of the Navier-Stokes equations of a superfluid with a peculiar non-linear self-interaction term ##\hbar^2 \frac{\nabla^2 \sqrt{\rho}}{2 m \sqrt{\rho}}## (a pressure term?). I then looked at how this object interacts with its environment, only to find that it does so again in a quite familiar fashion - along the lines of Schrödinger's original attempt to interpret the wave function as the charge density (albeit here ##\rho## is encoded in it via Born's rule), and motivated by the interpretation when the Dirac equation's continuity equation goes negative. Most convincing, however, is how intuitive this makes the hydrogen atom solutions: a problem with Bohr's model was that a charged particle with angular momentum would emit an EM wave and lose energy - but a charged fluid can take the form of a disk with non-zero angular momentum that does not change over time, and thus does not radiate.
Only if you combine two different energy solutions do you immediately find ##\partial_{t} \rho = \partial_{t} \langle E_{1}+E_{2}|E_{1}+E_{2}\rangle = \partial_{t}\, 2\,\mathrm{Re}\langle E_{1}|E_{2}\rangle \propto \sin((E_{2}-E_{1})\, t/\hbar)##, an oscillating charge distribution; such a solution should (classically) rapidly lose energy by emitting an EM wave of the corresponding frequency and collapse to the lower state. This behavior, which would make only a discrete set of energy eigenstates classically stable, is however only possible for a non-linear system.
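The beat-frequency claim above is easy to check numerically. A minimal sketch (NumPy, with ##\hbar = m = 1## and an infinite square well as an illustrative system, not anything specific from this thread): the density of an equal superposition of two stationary states oscillates with period ##2\pi\hbar/(E_2 - E_1)##.

```python
import numpy as np

# Infinite square well on [0, L] with hbar = m = 1 (illustrative units).
hbar = 1.0
L = 1.0
x = np.linspace(0.0, L, 500)

def phi(n):
    # Normalized stationary state and its energy E_n = n^2 pi^2 / 2.
    return np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L), (n * np.pi) ** 2 / 2.0

phi1, E1 = phi(1)
phi2, E2 = phi(2)

def rho(t):
    # Density of the equal-weight superposition at time t.
    psi = (phi1 * np.exp(-1j * E1 * t / hbar)
           + phi2 * np.exp(-1j * E2 * t / hbar)) / np.sqrt(2.0)
    return np.abs(psi) ** 2

# rho(x, t) = (phi1^2 + phi2^2)/2 + phi1*phi2*cos((E2 - E1) t / hbar),
# so the density returns to itself after one beat period T = 2 pi hbar / (E2 - E1),
# while at T/2 the cross term has flipped sign.
T = 2.0 * np.pi * hbar / (E2 - E1)
assert np.allclose(rho(0.0), rho(T), atol=1e-12)
assert not np.allclose(rho(0.0), rho(T / 2.0), atol=1e-3)
```

The global phase drops out of ##|\psi|^2##; only the relative phase ##(E_2-E_1)t/\hbar## drives the oscillation, which is what makes single energy eigenstates stationary.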

But what is generally stirring me up is that the time evolution of the probability density has a non-linear self-interaction term according to Schrödinger's equation - from a probability-theory point of view this is a no-go for a true probability density, because it would allow different realizations/outcomes of an experiment to interact with each other (this seems to be the root of all QM non-classicality). That said, interpreting this as an interaction with an alternate universe makes a lot of sense to me here. But the easiest way to remedy the problem is if ##\rho## were simply a physical density instead (to which a probability density is merely proportional), since those obviously do have such interactions. At this crossroads I view the latter approach as the more canonical, while most QM interpretations take the other road - and I must understand why.

But yeah, I know superfluids are not exactly common on the macroscopic level. Neither are non-linear systems of that kind easy to find; however, there are a lot of macroscopic non-linear examples that show interesting behavior, for example solitons: solutions to non-linear wave equations that exhibit particle-like behavior. So there may be no perfect macroscopic match for the wave function's behavior, but you can get quite close. And I find it sufficient to follow that visualization of QM rather than sticking to abstract point particles.

But without a connection to experiments it has limited usability.
 
  • #73
Killtech said:
I am terribly sorry to have misunderstood this classification. Is there a way I can remedy this mistake?

After reviewing the thread, I have changed the level to "I". Some aspects of the discussion probably can't be addressed fully except at the "A" level, but your posts indicate that you do not have the background needed for an "A" level discussion. Since the discussion is clearly beyond the "B" level at this point, "I" level seems like the best compromise.
 
  • #74
Stephen Tashi said:
If that refers to my questions, the problem is to show that "quantum logic" or "quantum probability" or "probability amplitudes" are organized mathematical topics that generalize ordinary probability theory. The alternative to that possibility is that these terms are not defined in some unified mathematical context, but are informal descriptions of aspects of calculations in QM.
What do you think about Streater's Classical and Quantum Probability?

There's also Hardy's Quantum Theory From Five Reasonable Axioms which tries to reconstruct both classical and quantum probabilistic theories.
 
  • #75
kith said:
What do you think about Streater's Classical and Quantum Probability?

There's also Hardy's Quantum Theory From Five Reasonable Axioms which tries to reconstruct both classical and quantum probabilistic theories.

I haven't looked at Streater's paper yet. Hardy's approach uses "physical probability". It's what I'd call the Axiom Of Average Luck. It modifies the Law Of Large Numbers to say that a probability can be physically approximated to any desired accuracy by independent experiments - as opposed to the mathematical statement of the law, which only deals with the probability of getting a good approximation.
 
  • #76
Stephen Tashi said:
Hardy's approach uses "physical probability". It's what I'd call the Axiom Of Average Luck. It modifies the Law Of Large Numbers to say that a probability can be physically approximated to any desired accuracy by independent experiments - as opposed to the mathematical statement of the law, which only deals with the probability of getting a good approximation.
Do you think this is enough to do physics or is there something missing? If it is enough, classical and quantum probabilities in physics are on equal footing (because rigorous probability theory itself isn't needed for physics if we take this point of view).

But try Streater, I think his treatment is much more aligned with what you are looking for.
 
  • #77
kith said:
If it is enough, classical and quantum probabilities in physics are on equal footing (because rigorous probability theory itself isn't needed for physics if we take this point of view).

Connecting probability theory with applications of probability theory is (yet another) problem of interpretation. Probability theory doesn't say that random variables have realizations, it doesn't say that we can do random sampling, and it doesn't comment on whether events are "possible" or "impossible". Probability theory is circular. It only talks about probabilities.

Attempts to connect the law of large numbers to physical reality seem to work well. However, attempts to use martingale methods of gambling may also seem to work well. Suppose the probability of an event can always(!) be approximated to two decimal places by 10,000 independent trials. How many times will Nature perform sets of 10,000 trials? Will there be a physical consequence if one set of these trials fails to achieve two-decimal accuracy? I don't know how to reconcile the concept of "physical probability" (results of repeated experiments) with a scheme for how many times Nature conducts such series of experiments. There is also the problem that if I look for places where Nature has repeated an experiment, it is me that is grouping things into batches of independent experiments. The frequency of successes will depend on how I group them.
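The gap between the mathematical law of large numbers and "physical probability" can be illustrated with a quick simulation (a hypothetical Monte Carlo sketch; the values p = 0.3, 10,000 trials per batch, and 2,000 batches are arbitrary choices): most batches estimate p to within two decimal places, but a non-negligible fraction do not - the mathematical law only makes that fraction probable, never certain.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n_trials, n_batches = 0.3, 10_000, 2_000

# Relative frequency in each batch of 10,000 independent Bernoulli(p) trials.
freqs = rng.binomial(n_trials, p, size=n_batches) / n_trials

# Fraction of batches whose estimate is within 0.005 of p ("two decimal places").
within = np.mean(np.abs(freqs - p) < 0.005)

# The mathematical LLN only bounds this fraction probabilistically: it is high,
# but strictly less than 1 -- some batches miss two-decimal accuracy.
assert 0.5 < within < 1.0
```

With these numbers the batch standard deviation is about ##\sqrt{0.3 \cdot 0.7 / 10000} \approx 0.0046##, so roughly a quarter of the batches land outside the two-decimal window.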
 
  • #78
From comments on this thread, I take away (among other things) that classical probability is fine in its domain, but there are some instances where it won't work (Bell, two-slit, etc.). But outside of the evident counter-examples, I am not always sure where the boundary lies. For example, if I google "Schrödinger equation and Brownian motion", I get a number of articles, such as those attempting to derive the equation using classical statistical methods, or to apply it to non-quantum phenomena:
https://www.researchgate.net/publication/237152270_Quantum_equations_from_Brownian_motion
https://www.springer.com/gp/book/9783540570301
https://onlinelibrary.wiley.com/doi...978(199811)46:6/8<889::AID-PROP889>3.0.CO;2-Z
But could such an endeavor (either deriving the Schrödinger equation by applying classical statistics to stochastic processes, or, conversely, applying it to a macroscopic phenomenon) even make sense?
 
  • #79
nomadreid said:
From comments on this thread, I take away (among other things) that classical probability is fine in its domain, but there are some instances where it won't work (Bell, two-slit, etc.)

Are there actually instances where classical probability theory "won't work"? Or are such failures the failure of the assumptions made in modeling phenomena with classical probability theory - for example, assuming events are independent when they (empirically) are not?

Griffiths uses the term "pre-probabilities" to describe mathematical structures that are used to derive probabilities, but which are not themselves probabilities. ( section 3.5 https://plato.stanford.edu/entries/qm-consistent-histories/ ). The manipulations of "pre-probabilities" can resemble the manipulations used for probabilities. Because the pre-probabilities of QM use complex numbers, one might call them "complex" or "quantum" probabilities. But the success of pre-probabilities does not imply that classical probability theory won't work. It does imply that modeling certain physical phenomena is best done by thinking in terms of pre-probabilities instead of making simplifying assumptions of independence and applying classical probability theory directly.
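The arithmetic behind "pre-probabilities" is simple to exhibit: complex amplitudes are added before squaring, so the result can differ from the naive sum of path probabilities. A toy two-slit sketch (the amplitude values are illustrative, not from any specific experiment):

```python
import numpy as np

# Toy two-slit amplitudes at one screen point (arbitrary illustrative values).
a1 = 0.5 * np.exp(1j * 0.0)    # amplitude via slit 1
a2 = 0.5 * np.exp(1j * np.pi)  # amplitude via slit 2, opposite phase

p_both_slits = np.abs(a1 + a2) ** 2               # amplitudes add, then square
p_naive_sum = np.abs(a1) ** 2 + np.abs(a2) ** 2   # classical "either/or" sum

# Destructive interference: the amplitude rule gives 0, the naive sum 0.5.
assert np.isclose(p_both_slits, 0.0)
assert np.isclose(p_naive_sum, 0.5)
```

Nothing here contradicts classical probability theory as mathematics; it only shows that the naive independence/additivity assumption about the two paths is the wrong model, which is exactly the point above.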
 
  • #80
Stephen Tashi said:
Are there actually instances where classical probability theory "won't work"?
No. In the objective Bayesian interpretation, probability is simply the logic of plausible reasoning. Logic always works. If logic seems to fail, the error is somewhere else.
nomadreid said:
But could such an endeavor (either deriving the S. equation by applying classical statistics to stochastic processes, or conversely, applying the S. equation to a macro phenomenon) even make sense?
It makes sense.

The classical derivation comes from
Nelson, E. (1966). Derivation of the Schrödinger Equation from Newtonian Mechanics, Phys. Rev. 150(4), 1079-1085
and is known as Nelsonian stochastics.

A conceptually IMHO much better variant comes from Caticha and is named "entropic dynamics":
Caticha, A. (2011). Entropic Dynamics, Time and Quantum Theory, J. Phys. A 44, 225303, arXiv:1005.2357

Both have a problem, known as the "Wallstrom objection": the Schrödinger equation is derived only for wave functions which have no zeros in the configuration space.
 
  • #81
Wasn't Nelson himself quite critical of his own baby recently? I'd have to search for the source where I read about it ;-)).
 
  • #82
Elias1960, thanks very much for the very informative answer.
Elias1960 said:
A conceptually IMHO much better variant comes from Caticha and is named "entropic dynamics":
Caticha, A. (2011). Entropic Dynamics, Time and Quantum Theory, J. Phys. A 44, 225303, arXiv:1005.2357

Both have a problem, known as the "Wallstrom objection": the Schrödinger equation is derived only for wave functions which have no zeros in the configuration space.

The Caticha variant has the added advantage that it is more accessible. :woot: Anyway, when I looked up the "Wallstrom objection", I got a lot of attempts to get around it, such as https://arxiv.org/abs/1101.5774, https://arxiv.org/abs/1905.03075, and others. Have any of them successfully served as a complement to either the classic derivation or to entropic dynamics?
 
  • #83
nomadreid said:
The Caticha variant has the added advantage that it is more accessible. :woot: Anyway, when I looked up the "Wallstrom objection", I got a lot of attempts to get around it, such as https://arxiv.org/abs/1101.5774, https://arxiv.org/abs/1905.03075, and others. Have any of them successfully served as a complement to either the classic derivation or to entropic dynamics?
It seems the first of your quoted approaches, https://arxiv.org/abs/1101.5774, would fail to save entropic dynamics. There would be no potential ##v^i(q) = \partial_i \phi(q)## at all, but entropic dynamics requires that such a function exists globally.
Instead, if one simply excludes all ##\psi(q)## with zeros somewhere explicitly, as in https://arxiv.org/abs/1905.03075, then a potential exists as a global function ##v^i(q) = \partial_i \phi(q)##, and Caticha's interpretation makes sense.
 
  • #84
Many thanks for that, Elias1960!
 
  • #85
Greetings all!

Stephen Tashi said:
Are there actually instances where classical probability theory "won't work"?
It depends. Ultimately quantum probabilities can be seen as classical probabilities that are implicitly conditional; see the works of Andrei Khrennikov for nice expositions (https://arxiv.org/abs/1406.4886). This would be related to the "pre-probability" view above. In essence every quantum probability is like ##P(E_{i}|Q)##, i.e. the chance of outcome ##E_{i}## given that observable ##Q## has been selected, whereas classical probability can have unconditional probabilities ##P(E_{i})##.
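The ##P(E_i|Q)## reading can be made concrete with a single qubit: each choice of observable ##Q## yields its own Born-rule conditional distribution, with no joint unconditional distribution over incompatible choices. A minimal NumPy sketch (the spin-z-up preparation is an illustrative choice):

```python
import numpy as np

# Qubit prepared spin-z up.
psi = np.array([1.0, 0.0], dtype=complex)

def born_probs(eigvecs, state):
    # P(E_i | Q): Born-rule distribution over the outcomes (eigenvector
    # columns) of the chosen observable Q.
    return np.abs(eigvecs.conj().T @ state) ** 2

sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)

# Eigenvector columns label the outcomes of each "question" Q.
_, vz = np.linalg.eigh(sigma_z)
_, vx = np.linalg.eigh(sigma_x)

p_given_z = born_probs(vz, psi)  # deterministic for this preparation
p_given_x = born_probs(vx, psi)  # uniform over the two sigma_x outcomes

assert np.allclose(sorted(p_given_z), [0.0, 1.0])
assert np.allclose(p_given_x, [0.5, 0.5])
```

Each distribution is a perfectly classical probability measure; what has no classical counterpart is an unconditional joint distribution over the outcomes of both ##\sigma_z## and ##\sigma_x## at once.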

However, consistently treating quantum probability this way is underdeveloped and probably more difficult than the standard way of folding all observable selections into a single non-commutative von Neumann algebra. It would be very difficult to treat quantum stochastic processes, such as those of Belavkin, this way.

Another way of phrasing the difference is that in quantum theory we can have fundamentally incompatible but non-contradictory events.
 
  • #86
vanhees71 said:
It's also obvious that with the SGE measurement of this spin component you change the state of the particle. Say you have prepared the particle to have a certain spin-z component ##\sigma_z = +\hbar/2##, and now you measure the spin-x component. Then your particle is randomly deflected up or down (with 50% probability each).
Oh, you invoked the collapse! I had thought this was a no-no for you!
 
  • #87
No, I did not invoke the collapse. The time evolution of the wave function is entirely described by unitary time evolution, and the probabilities for finding the particle in one or the other partial beam after the ##\sigma_x## measurement are entirely determined by Born's rule using the time-evolved wave function. There's no need for collapse, particularly not in this simple example, where you can solve the time-dependent SGE (almost) exactly analytically.
 
  • #88
Why would one "avoid" collapse? Isn't state updating a normal part of QM?
 
  • #89
The collapse is an ad-hoc prescription which works well as such, but it has very fundamental problems in connection with relativistic QFT. It contradicts the very construction of relativistic QFTs, where the only known (and very successful) models are those in which interactions are strictly local, i.e., local observable operators commute at spacelike separation of their arguments. This implies that a local measurement cannot have instant effects on far-distant parts of entangled systems, whereas a collapse would mean an effect across space-like separated measurement events.
 
  • #90
That's not true, though. State updating can be easily generalised to QFT without any problems with special relativity; see Hamhalter, J., "Quantum Measure Theory". It's a tough book, but Gleason's theorem and the Lüders rule are generalised there.

State updating doesn't lead to any problems, just like it doesn't cause signalling in entanglement in non-relativistic QM.

How do you update states in QFT if not via the usual rule? I know we don't do it normally in S-matrix calculations.
 
  • #91
What else do you need? I also don't need a collapse to understand the fascinating Bell measurements with entangled (multi-)photon states either. The correlation is simply not caused by local interactions of the photons with the measurement devices but is already there due to the preparation in the initial entangled state (though the single-photon properties like polarization states are maximally uncertain, i.e., maximum-entropy mixed states).
 
  • #92
vanhees71 said:
What else do you need?
Beyond the S-matrix? This is basically equivalent to asking why you would need to handle finite-time processes in QFT. I think our fundamental theory should be able to handle finite-time processes; otherwise, how could finite-time events dealt with in non-relativistic QM be considered limiting cases of QFT?

vanhees71 said:
I also don't need a collapse to understand the fascinating Bell measurements with entangled (multi-)photon states either.
Of course you don't need collapse to understand Bell correlations and of course the correlations are not caused by local interactions. The point is that you claimed state-reduction has problems with relativistic QFT. I'm saying that it has been mathematically proven that it does not.

However, take a complicated multiparticle entangled state, not just the pairs in a simple Bell experiment - something like the more general correlations considered by Gisin with ##L## particles, ##M## observables and ##N## outcomes, also known as ##(L,M,N)## Bell scenarios in the literature.

You have an initial entangled state, then measurements are performed on ##R## of the particles. How do you model the correlations conditioned on these observations for the remaining ##L - R## particles without the state update rule?
 
  • #93
How can it not have problems with relativistic causality, if you claim that by measuring a photon's polarization at point A, the state of another photon, which is most likely to be registered at a far distant place B, instantaneously collapses from the entangled state before the measurement, which is something like
$$[\hat{a}^{\dagger}(\vec{p},H) \hat{a}^{\dagger}(\vec{p}',V)-\hat{a}^{\dagger}(\vec{p}',H) \hat{a}^{\dagger}(\vec{p},V)]|\Omega \rangle,$$
to a "product state", which is something like
$$\hat{a}^{\dagger}(\vec{p},H) \hat{a}^{\dagger}(\vec{p}',V) |\Omega \rangle$$
(modulo normalization factors and integration over the momenta, such as to have proper wave packets rather than generalized momentum eigenstates, of course)?
 
  • #94
vanhees71 said:
How can it not have problems with relativistic causality...rest of post
It's just the standard no-signalling results extended to QFT where it's a bit harder to prove. ##B## is not able to infer ##A##'s choice of observable from the statistics of observables local to ##B##. Thus there is no violation of causality.

That updating of the state is just standard QM: conditioned on one of ##A##'s results, the global state is a product state. It doesn't mean that ##B##, over multiple runs of the experiment, can ever learn anything about ##A##'s results.

Of course in QFT we never have product states locally anyway, but that's a separate point.

Also how do you model the statistics of the ##L - R## particles conditioned on results on ##R## of them that I mentioned above?
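Both claims in this exchange - that Lüders updating conditions the state, and that ##B##'s local statistics are nonetheless unchanged (no signalling) - can be checked directly on a simple Bell pair. A minimal NumPy sketch (the singlet-like state and the H/V measurement basis are illustrative choices):

```python
import numpy as np

# Bell state (|01> - |10>)/sqrt(2) as a 4-vector in the A (x) B product basis.
psi = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

def measure_A(basis_vec):
    # Project A onto |basis_vec><basis_vec| (x) I; return the outcome
    # probability and the Lueders-updated (conditional) state.
    P = np.kron(np.outer(basis_vec, basis_vec.conj()), np.eye(2))
    p = np.real(np.trace(P @ rho @ P))
    return p, (P @ rho @ P) / p

def marginal_B(state):
    # Partial trace over A.
    return np.einsum('ijik->jk', state.reshape(2, 2, 2, 2))

# Unconditional marginal of B: maximally mixed.
assert np.allclose(marginal_B(rho), np.eye(2) / 2)

# A measures in the computational basis: each outcome has probability 1/2,
# and the conditional state leaves B in a pure (product) state.
for vec in (np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)):
    p, rho_cond = measure_A(vec)
    b = marginal_B(rho_cond)
    assert np.isclose(p, 0.5)
    assert np.isclose(np.real(np.trace(b @ b)), 1.0)  # purity 1: product state

# Averaging the conditional states over A's outcomes reproduces B's original
# marginal: B's local statistics are unchanged, so no signalling.
p0, r0 = measure_A(np.array([1, 0], dtype=complex))
p1, r1 = measure_A(np.array([0, 1], dtype=complex))
assert np.allclose(marginal_B(p0 * r0 + p1 * r1), np.eye(2) / 2)
```

The conditional states are what the update rule delivers; the last assertion is the toy version of the no-signalling theorem the post refers to.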
 
  • #95
Stephen Tashi said:
Or are such failures the failure of the assumptions made in modeling phenomena with classical probability theory - for example, assuming events are independent when they (empirically) are not.
Stephen Tashi said:
But the success of pre-probabilities does not imply that classical probability theory won't work. It does imply that modeling certain physical phenomena is best done by thinking in terms of pre-probabilities instead of making simplifying assumptions of independence and applying classical probability theory directly.
What kind of dependence did you have in mind, the ignoring of which leads to errors?
 
  • #96
QLogic said:
It's just the standard no-signalling results extended to QFT where it's a bit harder to prove. ##B## is not able to infer ##A##'s choice of observable from the statistics of observables local to ##B##. Thus there is no violation of causality.

That updating of the state is just standard QM: conditioned on one of ##A##'s results, the global state is a product state. It doesn't mean that ##B##, over multiple runs of the experiment, can ever learn anything about ##A##'s results.

Of course in QFT we never have product states locally anyway, but that's a separate point.

Also how do you model the statistics of the ##L - R## particles conditioned on results on ##R## of them that I mentioned above?
Yes sure, from the very foundations of local (microcausal) QFT it's clear that no problems can occur, but just putting the assumption of "state collapse" on top (and it's completely unnecessary too!) destroys this consistency of the formalism.

Look at the concrete experiments: You make a measurement protocol at each of the places and then evaluate them afterwards and postselect the different events. There's no collapse necessary to explain these outcomes but just Born's rule to calculate the probabilities for the outcomes of measurements and compare it with the result of the experiment.
 
  • #97
vanhees71 said:
Yes sure, from the very foundations of local (microcausal) QFT it's clear that no problems can occur, but just putting the assumption of "state collapse" on top (and it's completely unnecessary too!) destroys this consistency of the formalism.
It's a theorem found in Hamhalter's book that it doesn't.

vanhees71 said:
Look at the concrete experiments: You make a measurement protocol at each of the places and then evaluate them afterwards and postselect the different events. There's no collapse necessary to explain these outcomes but just Born's rule to calculate the probabilities for the outcomes of measurements and compare it with the result of the experiment.
Of course. However, what if one has observed only a subset of the particles and you wish to model the statistics of future experiments? Again, as I said, if only ##R## have been measured and you wish to model the statistics of the remaining ##L - R##, how is this done without conditioning/state reduction?

The classical analogue of what you're arguing is that in probability theory one doesn't need conditioning. I can't see how that is valid.
 
  • #98
Obviously again I don't understand what you are asking. Of course one needs conditioning in both classical and quantum statistics. Is this about philosophical quibbles about the meaning of probabilities in general? If so, it doesn't belong into the quantum physics forum at all (not even into the interpretation subforum).
 
  • #99
vanhees71 said:
Yes sure, from the very foundations of local (microcausal) QFT it's clear that no problems can occur, but just putting the assumption of "state collapse" on top (and it's completely unnecessary too!) destroys this consistency of the formalism.

Wrong.

vanhees71 said:
Look at the concrete experiments: You make a measurement protocol at each of the places and then evaluate them afterwards and postselect the different events. There's no collapse necessary to explain these outcomes but just Born's rule to calculate the probabilities for the outcomes of measurements and compare it with the result of the experiment.

Whatever you call it, there is no unitary time evolution of the quantum state.
 
  • #100
vanhees71 said:
Obviously again I don't understand what you are asking. Of course one needs conditioning in both classical and quantum statistics. Is this about philosophical quibbles about the meaning of probabilities in general? If so, it doesn't belong into the quantum physics forum at all (not even into the interpretation subforum).

Rubbish. It is you that rejects standard QM.
 
  • #101
vanhees71 said:
Is this about philosophical quibbles about the meaning of probabilities in general?
No. Not remotely.

In every textbook of either quantum mechanics or quantum information that I have read, state updating appears as either an axiom or, in some quantum information books, a very early theorem (books can use different basic axioms). In quantum information, state updating is used all the time to condition on measurements.

There are also solid mathematical proofs that state updating is compatible with relativity.

Thus I cannot make sense of the statement that one doesn't need state updating, since every textbook has it, or the statement that it is incompatible with special relativity, since it provably is not.

vanhees71 said:
Of course one needs conditioning in both classical and quantum statistics
Now I am extra confused. State updating is how you condition in quantum probability. They're synonyms. How can you accept conditioning and reject state updating?

What does conditioning without the usual state updating/projection postulate look like since you are advocating this?

atyy said:
Rubbish. It is you that rejects standard QM.
That's what it looks like to me. Perhaps I am mistaken though.
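The identification of state updating with conditioning can be made concrete in the special case of a diagonal (i.e., effectively classical) density matrix, where the Lüders rule reduces exactly to ordinary conditional probability. A minimal sketch (the distribution and the event are arbitrary illustrative choices):

```python
import numpy as np

# A classical distribution embedded as a diagonal density matrix.
p = np.array([0.1, 0.2, 0.3, 0.4])
rho = np.diag(p).astype(complex)

# Event Q: "outcome is 2 or 3", as a projector onto those basis states.
P = np.diag([0.0, 0.0, 1.0, 1.0]).astype(complex)

# Lueders update: rho -> P rho P / tr(P rho).
prob_Q = np.real(np.trace(P @ rho))
rho_cond = (P @ rho @ P) / prob_Q

# For diagonal states this is exactly classical conditioning P(i|Q).
classical_cond = np.array([0.0, 0.0, 0.3 / 0.7, 0.4 / 0.7])
assert np.isclose(prob_Q, 0.7)
assert np.allclose(np.real(np.diag(rho_cond)), classical_cond)
```

The quantum content only appears for non-diagonal states and non-commuting projectors; in the commuting case the update rule is plain Bayes-style conditioning, which is the sense of "synonyms" above.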
 
  • #102
kith said:
What do you think about Streater's Classical and Quantum Probability?
The famous paper by Redei and Summers is the most cited introduction:
https://arxiv.org/abs/quant-ph/0601158

Here you can see several differences between classical and quantum probability, such as the much stricter conditions under which conditional expectations exist, as well as better-known features like entanglement.
 
  • #103
atyy said:
Wrong.
Whatever you call it, there is no unitary time evolution of the quantum state.
Then you change QT to a new theory!
 
  • #104
QLogic said:
No. Not remotely.

In every textbook of either quantum mechanics or quantum information that I have read, state updating appears as either an axiom or, in some quantum information books, a very early theorem (books can use different basic axioms). In quantum information, state updating is used all the time to condition on measurements.

There are also solid mathematical proofs that state updating is compatible with relativity.

Thus I cannot make sense of the statement that one doesn't need state updating, since every textbook has it, or the statement that it is incompatible with special relativity, since it provably is not.

Now I am extra confused. State updating is how you condition in quantum probability. They're synonyms. How can you accept conditioning and reject state updating?

What does conditioning without the usual state updating/projection postulate look like, since you are advocating this?

That's what it looks like to me. Perhaps I am mistaken though.
I guess it's again some misunderstanding. For me, "collapse" is the assumption that through a measurement the state instantaneously changes as a physical process, affecting properties of far-remote parts of the system instantaneously - and this clearly violates causality in relativistic physics. No such thing is needed to calculate conditional probabilities, and the Bell experiments done with entangled photon states are well described within usual local QED without any assumption of such instantaneous actions at a distance. Also, in the evaluation of the real-world experiments you use measurement protocols taken locally and compare them afterwards to choose the wanted subensembles of measured photons. That's the famous "postselection" or "delayed choice" being discussed all the time in Bell experimental setups, all confirming the predictions of QT. We had a long debate about this a while ago, discussing several real-world experiments (quantum eraser, entanglement swapping, etc.), all of which can be described without invoking a collapse and in accordance with local relativistic QFT.

So, if you say the collapse is compatible with local relativistic QFT, you must mean something different than I assumed above.
 
  • #105
vanhees71 said:
For me "collapse" is the assumption that by a measurement the state instantaneously changes as a physical process affecting properties of far remote parts of the system instantaneously and this clearly violates causality in relativistic physics
This explains it then. For you the word "collapse" is interpretation loaded and you are rejecting that interpretation. That's fine I won't go into that.

I just meant that the usual state-reduction axiom is compatible with relativity, where I am taking it in its purely everyday sense of what conditioning looks like in quantum theory. Thus I thought you were rejecting conditioning, which I couldn't make sense of. Whereas you are saying it shouldn't be viewed as a process but simply as conditioning.

I think there's no actual disagreement here then.
 