Quantum physics vs Probability theory

In summary, because quantum mechanics uses a different concept of probability, it does not fit directly into classical (Kolmogorov) probability theory. However, this doesn't answer the question of why quantum mechanics went this route instead of using Kolmogorov's approach.
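As a rough illustration of the difference (a minimal sketch with arbitrarily chosen amplitudes and relative phase, not tied to any particular experiment), the quantum rule adds complex amplitudes before squaring, whereas the classical rule adds the probabilities of exclusive alternatives directly:

```python
import numpy as np

# Two indistinguishable alternatives (think of two slits), each with a complex amplitude.
a1 = (1 / np.sqrt(2)) * np.exp(1j * 0.0)
a2 = (1 / np.sqrt(2)) * np.exp(1j * 2 * np.pi / 3)   # arbitrary relative phase

# Classical (Kolmogorov) rule: probabilities of exclusive alternatives simply add.
p_classical = abs(a1) ** 2 + abs(a2) ** 2

# Quantum rule: amplitudes add first, then the modulus is squared.
p_quantum = abs(a1 + a2) ** 2

print(f"sum of the two probabilities : {p_classical:.3f}")   # 1.000
print(f"|a1 + a2|^2                  : {p_quantum:.3f}")     # 0.500
# The difference, 2*Re(a1*conj(a2)), is the interference term; it has no
# counterpart in a single classical probability space.
```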
  • #106
PeroK said:
All would be well, in fact, if some bright mathematician in the 19th century had invented complex probability spaces before QM forced it out. Then, QM would rely on classical PT. It's just that classical PT would have been already enriched by complex probability amplitudes!
You might find it interesting that Bruno De Finetti in his paper:

De Finetti, B. (1937). A proposito di “correlazione.” Supplemento Statistico ai Nuovi problemi
di Politica, Storia ed Economia, 3: 41–57.

recast probability theory in terms of Hilbert spaces and actually found the Tsirelson bound about 40 years before Tsirelson. He even gave these probabilities a non-classical reading. Just a nice note of history.
 
  • Like
  • Informative
Likes Auto-Didact and PeroK
  • #107
vanhees71 said:
For me "collapse" is the assumption that by a measurement the state instantaneously changes as a physical process affecting properties of far remote parts of the system instantaneously and this clearly violates causality in relativistic physics. No such thing is needed to calculate conditional probabilities, and the Bell experiments done with entangled photon states are well described within usual local QED without the assumption of such instaneous actions at a distance at all.

If we say that the calculation of a conditional probability is valid, do we accept that there is some physical process that causes the condition to affect the calculation? - perhaps not a process that is modeled by instantaneous action-at-a-distance. Or does the calculation merely imply a mental process on the part of the person calculating, one that is physical only in the sense that the person calculating is implemented by a physical process? (As Thomas Aquinas wrote: "... for I can think of France and afterwards of Syria, without ever thinking of Italy, which stands between them. Therefore much more can an angel pass from one extreme to another without going through the middle.")

For example, suppose I have a value for the probability that it rains on a randomly selected day in February. If I select a day at random and notice it is a cloudy morning, I can revise the probability to be the probability that it rains on a randomly selected day in February that has a cloudy morning. (I see nothing subjective about such a revision. A scientific theory predicts certain results based on certain "givens". Getting a different answer when different information is given is no more subjective than getting different answers for problem 1 and problem 25 in a textbook.) My own mental agility is the only limitation on how quickly I revise the probability. However, thinking classically, I also imagine that there is some physical process that takes place which makes days with cloudy mornings statistically different from the general population of days. In thinking this way, must I be implicitly thinking of hidden variables?
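To make the revision step concrete, here is a minimal counting sketch (the tallies are invented purely for illustration) that computes both probabilities from the same table of past February days; conditioning on "cloudy morning" is nothing more than restricting to a subensemble:

```python
# Made-up tallies of past February days; only the counting logic matters.
days = [
    # (cloudy_morning, rained)
    *([(True,  True)]  * 120),   # cloudy morning, rain
    *([(True,  False)] * 80),    # cloudy morning, no rain
    *([(False, True)]  * 40),    # clear morning, rain
    *([(False, False)] * 260),   # clear morning, no rain
]

p_rain = sum(rained for _, rained in days) / len(days)

cloudy = [(c, r) for c, r in days if c]
p_rain_given_cloudy = sum(r for _, r in cloudy) / len(cloudy)

print(f"P(rain)                  = {p_rain:.2f}")               # 0.32
print(f"P(rain | cloudy morning) = {p_rain_given_cloudy:.2f}")  # 0.60
# Nothing about the weather "collapses" when we condition; we have merely
# restricted attention to the subensemble of cloudy-morning days.
```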

For example, the paper by Ohanian (https://arxiv.org/pdf/1703.00309.pdf) deals with the collapse of probability distributions in spacetime, but he uses an example of a card game which seems (to me) to employ "hidden variables" in the form of the cards: "the pre-collapse merely affects what we know when and where, but it does not affect the ultimate outcome of the game—who wins and who loses—which is predetermined by the initial conditions imposed by the initial mailing of the playing cards."
 
  • #108
It's hard to discuss in such generality. It's better to pick some example of a real-lab experiment. Then it's clear how it works. In the usual Bell measurements, like the delayed-choice quantum eraser experiment, all you do is to choose subensembles based on the local measurements by Alice and Bob on each of "their" photons (which are simply defined as those photons of the pair measured at the detector placed at A's and B's position in space).

Your weather example makes it very clear that there's nothing mysterious in changing the probabilities given new information. If I have a probability for rain on an arbitrary day in February (given by weather data through statistical evaluation, just looking at all days of many Februaries in the past), it's of course different from the probability for rain on a day in February which starts cloudy in the morning (again given by weather data through statistical analysis, taking into account only the cloudy-in-the-morning days of many Februaries in the past, which is a chosen subensemble of the former). There's nothing collapsing here, nor is there anything mysterious about the two probabilities being different.

For me there's also no mystery in the probabilities provided by quantum-theoretical calculations: Of course the probabilities for the outcome of a measurement depend on the selection of the ensembles with different chosen conditions.
 
  • #109
vanhees71 said:
It's hard to discuss in such generality. It's better to pick some example of a real-lab experiment. Then it's clear how it works. In the usual Bell measurements, like the delayed-choice quantum eraser experiment, all you do is to choose subensembles based on the local measurements by Alice and Bob on each of "their" photons (which are simply defined as those photons of the pair measured at the detector placed at A's and B's position in space).

Your weather example makes it very clear that there's nothing mysterious in changing the probabilities given new information. If I have a probability for rain on an arbitrary day in February (given by weather data through statistical evaluation, just looking at all days of many Februaries in the past), it's of course different from the probability for rain on a day in February which starts cloudy in the morning (again given by weather data through statistical analysis, taking into account only the cloudy-in-the-morning days of many Februaries in the past, which is a chosen subensemble of the former). There's nothing collapsing here, nor is there anything mysterious about the two probabilities being different.

For me there's also no mystery in the probabilities provided by quantum-theoretical calculations: Of course the probabilities for the outcome of a measurement depend on the selection of the ensembles with different chosen conditions.

@vanhees71's statements on this are generally misleading. He rejects collapse to save locality, but locality cannot be saved unless one rejects "reality". However, there is no clear rejection of "reality" in his statements. In fact, he generally conveys the impression that interactions are all local, which cannot be the case (unless one adopts something else entirely, such as many-worlds).
 
  • #110
Moderator's note: Thread moved to the quantum foundations and interpretations forum.
 
  • #111
vanhees71 said:
For me there's also no mystery in the probabilities provided by quantum-theoretical calculations: Of course the probabilities for the outcome of a measurement depend on the selection of the ensembles with different chosen conditions.

It's difficult (for me) to express any distinction between QM calculations and the viewpoint you describe, but there are papers that assert that the Schrödinger equation is not equivalent to a stochastic process (e.g. http://www.physik.uni-augsburg.de/theo1/Talkner/Papers/Grabert_PRA_1979.pdf).

By a "classical" stochastic process, I shall mean this: I imagine a collection of deterministic processes and classify them to be examples of "the same" process ##S##. The processes each evolve deterministically in time. To this concept, I add the concept of genuine probability by saying that when I do an experiment on such a process, I pick one of the examples ##S## at random, giving each example an equal probability of being chosen.

From this point of view, if I measure some variable ##x## involved in ##S##, then the probability that ##x = 5## at time ##t = 2## may differ from the probability that ##x = 5## at time ##t = 2## given that I also measured ##x = 4## at time ##t = 1##. The distinction between those probabilities doesn't require any notion that my measurement at time ##t = 1## changed the physical state of the process. The distinction is explained by the fact that knowing ##x = 4## at ##t = 1## reveals that the example we are observing is a member of a proper subset of the examples initially available for selection.
vanhees71 said:
It's hard to discuss in such generality. It's better to pick some example of a real-lab experiment. Then it's clear how it works. In the usual Bell measurements, like the delayed-choice quantum eraser experiment, all you do is to choose subensembles based on the local measurements by Alice and Bob on each of "their" photons (which are simply defined as those photons of the pair measured at the detector placed at A's and B's position in space).

I don't understand which experiments make a distinction between the above description of a classical stochastic process and a theory that involves hidden variables. Each example of a classical stochastic process can be thought of as a sequence of measurements (e.g. (t=0, x=6), (t=1, x=4), ...). There is no explicit hidden variable whose value at t = 0 determines the values of the example at later times. Perhaps there is a clever and abstract way to define such a variable. For example, if I imagine a way to map each example to a distinct real number, then the value of that real number trivially determines all values in the sequence for that example.
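Here is a toy version of the kind of classical stochastic process described above (the trajectories are invented purely for illustration): each example is a deterministic sequence fixed in advance, probability enters only through the equally likely choice of example, and conditioning on the ##t = 1## observation just selects a subensemble.

```python
import random

random.seed(0)

# Ten deterministic "examples of the same process S": each row is the value of x
# at t = 0, 1, 2, completely fixed once the example is chosen.
examples = [
    (0, 0, 1), (1, 1, 7), (2, 2, 3), (3, 3, 8), (4, 4, 5),
    (5, 0, 2), (6, 1, 6), (7, 2, 5), (8, 3, 0), (9, 4, 2),
]

# Probability enters only through the random, equally likely choice of example.
draws = [random.choice(examples) for _ in range(100_000)]

p = sum(tr[2] == 5 for tr in draws) / len(draws)

sub = [tr for tr in draws if tr[1] == 4]                 # observed x = 4 at t = 1
p_given = sum(tr[2] == 5 for tr in sub) / len(sub)

print(f"P(x=5 at t=2)              ≈ {p:.2f}")           # ≈ 0.20
print(f"P(x=5 at t=2 | x=4 at t=1) ≈ {p_given:.2f}")     # ≈ 0.50
# The measurement at t = 1 changed nothing physically; it only told us that the
# example being observed lies in the subensemble of rows with x = 4 at t = 1.
```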
 
  • Like
Likes nomadreid
  • #112
Of course QT time evolution is not equivalent to a stochastic Markov process. Why should it be? I don't see any convincing interpretation of the QT probabilities in terms of classical statistical processes, at least not within a local and/or Markovian framework. There was the idea of a stochastic interpretation by Nelson, but I've never found it very convincing.

It's the other way around: the QT description of many-body systems is the most fundamental one we have so far, and classical statistics can be derived from it via various levels of approximation, not vice versa.
 
  • Like
Likes bhobba
  • #113
How can this proposal explain something as basic as quantum tunneling as a purely stochastic/deterministic process?
 
  • #114
Which proposal? QM describes quantum tunneling (of course not as a stochastic-deterministic process).
 
  • Like
Likes bhobba
  • #115
The proposal that classical stochastic processes can explain quantum tunneling.
 
  • #116
They can't!
 
  • Like
Likes bhobba and EPR
  • #117
vanhees71 said:
QM describes quantum tunneling (of course not as. . . process)
vanhees71 said:
They can't!
Just so we know without ambiguity. . . they can't? Thank you, and,

Carry on.
 
  • Skeptical
Likes weirdoguy
  • #118
I don't know what you want. I clearly stated that quantum tunneling cannot be described by a stochastic-deterministic process but only with QT.
 
  • Like
Likes bhobba
  • #119
vanhees71 said:
I clearly stated that quantum tunneling cannot be described by a stochastic-deterministic process but only with QT.
Works for me. . . . 😌 Carry on. . . .

 
  • Sad
Likes weirdoguy
  • #120
In which sense does it work for you?

One should remember what "tunnelling" in the strict sense means: It means that a particle prepared in an energy eigenstate can, with some non-zero probability, be registered in regions that are classically forbidden for a particle of this specific energy. There's no way to describe this classically, because it violates classical physics (the energy-conservation law).

It's also not the same as, e.g., a particle overcoming a potential barrier due to thermal fluctuations (as in the famous Kramers problem), which is completely in accordance with classical mechanics and energy conservation, because the particle gains some energy from the heat bath to overcome the potential barrier. It's an open system, and thus energy conservation doesn't hold for the particle alone, i.e., it can exchange energy with the heat bath and thus with some probability overcome the barrier. That's a stochastic process, which can be described as a usual Markovian Langevin process (but also as one with colored noise and some memory effects [1]).

[1]
B. Schüller, A. Meistrenko, H. van Hees, Z. Xu, C. Greiner, Kramers's escape rate within a non-Markovian description
Ann. Phys. (NY) 412, 168045 (2019)
arXiv: 1905.09652 [hep-ph]
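Coming back to tunnelling in the strict sense: a minimal sketch, using the standard rectangular-barrier transmission formula with illustrative numbers for an electron (the energies and barrier width are arbitrary choices), shows how a classically forbidden passage acquires a small but non-zero probability:

```python
import numpy as np

hbar = 1.054_571_817e-34   # J*s
m_e  = 9.109_383_7e-31     # kg, electron mass
eV   = 1.602_176_634e-19   # J

E  = 1.0 * eV              # particle energy (illustrative)
V0 = 2.0 * eV              # barrier height, V0 > E: classically forbidden
L  = 1.0e-9                # barrier width: 1 nm

# Decay constant of the wave function inside the barrier.
kappa = np.sqrt(2 * m_e * (V0 - E)) / hbar

# Exact transmission coefficient for a rectangular barrier with E < V0.
T = 1.0 / (1.0 + (V0**2 * np.sinh(kappa * L)**2) / (4 * E * (V0 - E)))

print(f"kappa * L = {kappa * L:.2f}")
print(f"transmission probability T ≈ {T:.2e}")
# Classically T would be exactly zero for E < V0; quantum mechanically a 1 eV
# electron still gets through a 2 eV, 1 nm barrier with probability of order 1e-4.
```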
 
  • #121
vanhees71 said:
Of course QT time evolution is not equivalent to a stochastic Markov process. Why should it be?

The question of whether a phenomenon is or isn't a Markov process isn't well posed until we specify the definition of "state". One thought is that the claim that QM predictions can't be modeled by a Markov process means that they cannot be modeled as a Markov process using the QM definition of state. (Given absolute freedom to define "state" as one wishes, how could we show that no Markov model of a phenomenon exists?)

However, if the state of a physical system evolves deterministically, then there is no probability involved in the model unless we regard it as a trivial Markov process where the state at time t transitions to the state at time t + dt with probability 1. But isn't the claim that the QM model is not a Markov process more than an objection to such triviality?

What definition of "state of a process" is being used when we say that the results of QM cannot be modeled by a Markov process?
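For what it's worth, here is a toy sketch of the "trivial Markov process" reading mentioned above (the state space and update rule are made up): a deterministic evolution written as a Markov chain whose transition matrix contains only 0s and 1s.

```python
import numpy as np

# A made-up deterministic update rule on a 4-element state space:
# state i at time t goes to next_state[i] at time t + 1, with certainty.
next_state = [2, 0, 3, 1]

# The same rule written as a Markov transition matrix: one 1 per row,
# so every "transition probability" is either 0 or 1.
P = np.zeros((4, 4))
for i, j in enumerate(next_state):
    P[i, j] = 1.0

state = np.array([1.0, 0.0, 0.0, 0.0])   # start in state 0 with certainty
for t in range(3):
    state = state @ P                     # Markov propagation = deterministic map
    print(t + 1, state)
# The chain is formally Markovian, but nothing is random: all the probability
# mass simply hops from one state to the next prescribed one.
```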
 
  • Like
Likes nomadreid
