Why randomness means incomplete understanding

In summary, we lack a fundamental understanding of the measurement process in quantum mechanics. This lack of understanding leads to problems with our ability to create a faithful miniuniverse or deterministic simulator.
  • #36
Sure, QT is about predicting the outcome of experiments, like any theory in physics. I have no clue how you come to the conclusion that it predicts nothing in the microscopic domain in a causal fashion; in fact it does precisely this. If that were not the case, we would need a new theory, and if there were an observation for which QT failed to predict the outcome accurately, we might already have a hint of how to modify it.

I don't think that Heisenberg is a good source concerning discussions about the interpretation. He is one of the main culprits behind all the fuss about this topic, and the above quote again shows that he didn't understand Bohr's very important correction of the flawed view of the uncertainty relation in Heisenberg's first paper, which he published without first discussing it with Bohr: the uncertainty is not due to the "measurability" of observables but due to the "preparability" of systems. The last sentence is also very revealing: Heisenberg fails to clearly distinguish between causality and determinism. He is right, though (more today than in his time, after all the investigations following Bell's important insights), in saying that it's highly speculative to think that there may be a "hidden determinism" (although referring to "causality" here is again wrong).

As long as there is not a clear contradiction between QT and observations, I'd say it's indeed highly speculative to think that there may be a deterministic, necessarily non-local, (hidden-variable?) theory behind the probabilistic nature of the quantum description.
 
  • #37
A. Neumaier said:
It now seems that in the simulation according to your recipe, nothing happens at all.
Oh, interesting! . . . a simulation that simulates nothing?? :DD

That couldn't even simulate a no bel prize. . . .
 
  • #38
vanhees71 said:
Bohr's very important correction
So let me quote Bohr (Nature 1928) on causality:
Niels Bohr said:
This postulate implies a renunciation as regards the causal space-time co-ordination of atomic processes. [...] there can be no question of causality in the ordinary sense of the word. [...] we learn from the quantum theory that the appropriateness of our usual causal space-time description depends entirely upon the small value of the quantum of action as compared to the actions involved in ordinary sense perceptions. [...] the radical departure from the causal description of Nature met with in radiation phenomena, to which we have referred above in connexion with the excitation of spectra. [...] Thus we see at once that no causal connexion can be obtained between observations leading to the fixation of a stationary state and earlier observations on the behaviour of the separate particles in the atom.
Probably you'll reply that you don't think that Bohr is a good source concerning discussions about the interpretation. Nobody is, since nobody has your views. Your views about interpretation are in many respects a minority view.
vanhees71 said:
I don't think that Heisenberg is a good source concerning discussions about the interpretation.
Since you speak by argument from authority, let me emphasize that I consider Heisenberg to be a more important authority than you. His views are not antiquated at all.
 
  • #39
OCR said:
Oh, interesting!. . . a simulation that simulates nothing ?? . :DD

That couldn't even simulate a nobel prize. . .
It simulates a wave function, but no events. Thus one gets a huge amount of data, but nothing happens.
 
  • #40
A. Neumaier said:
So let me quote Bohr (Nature 1928) on causality:

Probably you'll reply that you don't think that Bohr is a good source concerning discussions about the interpretation. Nobody is, since nobody has your views. Your views about interpretation are in many respects a minority view.

Since you speak by argument from authority, let me emphasize that I consider Heisenberg to be a more important authority than you. His views are not antiquated at all.
That's your decision to follow "authorities". I don't claim to be an authority in any respect, but I don't think that my view on interpretation is a minority view. In my environment I don't know any physicist who doesn't share this view, i.e., who does not follow the "orthodox interpretation" of QT. That may be due to the fact that our topic of theoretical-physics research (relativistic heavy-ion collisions) is very close to phenomenology and experiments, i.e., we have contact with real-world physics in the lab rather than with overly philosophical speculations.

That said, I indeed consider Bohr a better source for discussions on interpretation than Heisenberg, though I don't share your enthusiasm for his writings on the subject, which tend to be more confusing than necessary; Heisenberg, however, tops him in this respect.

Among the best writings on interpretation is the "Prologue" in the book

J. Schwinger, Quantum Mechanics: Symbolism for Atomic Measurements, Springer Verlag
 
  • #41
vanhees71 said:
In my environment I don't know any physicist who doesn't share this view
Everyone shares the math of quantum physics and the intuition for how to apply it.

But ask anyone about the details of how they interpret things and one finds huge differences. In fact, one tends to find more views than people asked, since the same person's views differ when asked in different contexts.
 
  • #42
OCR said:
That couldn't even simulate a no bel prize. . .

The space in "no bel" was not a simulation. . . it was deliberate. . :oldbiggrin:


 
  • #43
OCR said:
The space in "no bel" was not a simulation. . . it was deliberate. . :oldbiggrin:
This revealed a limitation in my biological OCR routine !-)
 
  • #44
PAllen said:
Why would you think this is theoretically impossible?
Perhaps it's a delicate mathematical point. To define "I go to the grocery store" requires that "I" do it. But "I" am not a well defined subset of events in the universe. (E.g. atoms come and go from "me", but "I" remain, in the judgement of myself, "the same".)

Macroscopic events are judged by macroscopic beings. Attempts to define them in microscopic detail are circular, since both the event and the definer of the event are not microscopically defined. A definition of "I go to the grocery store" in terms of microscopic events that "I" judge to be correct would be one where "I" agreed that the definition worked. But "I" am not microscopically defined - unless "I" succeed in defining "I" microscopically. That attempt at self-definition is utterly circular.

I don't know whether @A. Neumaier is saying something along these lines, but it seems relevant to the objection that known dynamical laws do not model the macroscopic events of measurements being taken.
 
  • #45
Stephen Tashi said:
I don't know whether @A. Neumaier is saying something along these lines
No. You are nitpicking.

To work on the theoretical level, one doesn't need to specify a macroscopic object (such as 'You') to the last detail, knowing precisely which atoms belong to it. One just needs an approximate model that captures the relevant features. (All our models in physics are approximate!) To define "You go to the grocery store" it is enough to have a stick model of you with movable joints, knowledge of the location of all joints and of the door of the grocery store, and a lattice quantum model of the material of which the sticks and joints are made, to be able to work from first principles.
 
  • #46
A. Neumaier said:
No. You are nitpicking.

To work on the theoretical level, one doesn't need to specify a macroscopic object (such as 'You') to the last detail, knowing precisely which atoms belong to it. One just needs an approximate model that captures the relevant features.

I agree I'm nitpicking, but notice that "one just needs" implies a judgement by a macroscopic being concerning whether needs are met.

The assertion that a system of dynamical laws doesn't model the macroscopic events of measurements being taken is an example of the familiar saying "It's hard to prove a negative". People can respond along the lines of "Of course it does. We let X be the subset of microscopic events that define 'Bob takes a measurement', and there you have it."

If we grant that macroscopic events can be defined as subsets of microscopic events in a dynamical system, then we cannot object that the system does not represent macroscopic measurements. We can only object that the system does not tell us "naturally" or by some convenient general definition which microscopic events represent macroscopic measurements.

Your assertion that "randomness means incomplete understanding" is more subtle than the above points, and apparently not specific to macroscopic events. However, some responses say, in effect, "What's the problem? We define 'Bob takes a measurement' as a set of microscopic events, run a simulation where we draw simulated random numbers at appropriate times, and there you have it."

I don't understand your reply to this type of argument. You issue challenges like "This still does not say what an event is, and how a quantum detector in the miniverse would recognize it." This seems to require that a proposed model include a general definition of "what an event is". Is that your requirement? Would you object to people defining the events in a model on a case-by-case basis? Perhaps your objection is that there is no general rule telling where to draw the random numbers.
 
  • #47
Stephen Tashi said:
"one just needs" implies a judgement by a macroscopic being concerning whether needs are met.
This is needed for all of science. With your argument you should stop being interested in it.
 
  • #48
Stephen Tashi said:
You issue challenges like "This still does not say what an event is, and how a quantum detector in the miniverse would recognize it." This seems to require that a proposed model include a general definition of "what an event is". Is that your requirement? Would you object to people defining the events in a model on a case-by-case basis?
If there is no formal meaning to the notion of events in terms of wave functions, one cannot simulate events by simulating wave functions. Since what happens is constituted of events, nothing happens in such a simulation.

Therefore a formal notion of event is needed to be able to simulate what happens. Of course it cannot depend on a case-by-case basis, since what happens in reality also happens without us having made up cases. Special cases can be distinguished on the basis of the general notion, just as special molecules make sense only given a concept of a molecule.
 
  • #49
vanhees71 said:
Sure, QT is about predicting the outcome of experiments, as any theory in physics.
Absolutely not. The prediction of quantum mechanics, in general, is not based on knowledge of the past, as far as measurement is concerned.

What would be the usefulness of a predictive theory that would not require any measurements?

/Patrick
 
  • #50
A. Neumaier said:
But then how does the simulation proceed in such a way that each simulated detector knows which value it has to display at which time (so that an event happens), while only propagating the wave function of the total system?

It now seems that in the simulation according to your recipe, nothing happens at all.

It simulates a wave function, but no events.

You are free to evolve the wavefunction, but that doesn't commit you to a statement like "nothing happens".

You could also, for example, decompose the wavefunction into a basis of mutually exclusive sequences of events ##|\Psi\rangle = \sum_\alpha C_\alpha|\Psi\rangle## and compute the aforementioned probabilities ##||C_\alpha |\Psi\rangle||^2## or, more generally, ##\mathbf{Tr}[C_\alpha\rho C_\alpha^\dagger]##.

Quantum mechanics lets you use whichever treatment of the system is most suitable for your purposes.
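As a minimal numerical sketch of the decomposition above (a toy qubit example of my own, not anything stated in the thread), one can check that the probabilities ##||C_\alpha |\Psi\rangle||^2## for a set of mutually exclusive projectors are exhaustive:

```python
import numpy as np

# Toy sketch (my own example): decompose |psi> = (|0> + |1>)/sqrt(2) over the
# mutually exclusive "event" projectors P0 = |0><0| and P1 = |1><1|, and
# compute the probabilities ||P_a |psi>||^2 discussed above.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
P0 = np.diag([1.0, 0.0])
P1 = np.diag([0.0, 1.0])

probs = [np.linalg.norm(P @ psi) ** 2 for P in (P0, P1)]
print(probs)           # -> 0.5 each, up to rounding

# The same numbers from the density-matrix form Tr[C rho C^dagger]:
rho = np.outer(psi, psi.conj())
probs_rho = [np.trace(P @ rho @ P.T).real for P in (P0, P1)]
print(sum(probs_rho))  # -> 1 (up to rounding): the decomposition is exhaustive
```

Both forms give the same numbers, and they sum to one, as an exhaustive decomposition must.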
 
  • #51
Morbert said:
a basis of mutually exclusive sequences of events
Well, into which basis? If any basis is allowed, then anything can happen. But then it is not determined by the simulation of the wave function but by the additional choice of the basis. This would mean that in our real universe, what happens depends not only on the Schrödinger dynamics but also on the choice of a basis. In other words, the basis elements constitute additional hidden variables needed to get real events from quantum mechanics.
 
  • #52
vanhees71 said:
It is important to be clear about the concepts. Quantum theory is completely causal, even in a strong sense: Knowing the state at time ##t_0## and knowing the Hamiltonian of the system, you know the state at any time ##t>t_0##.
The wave packet of a particle without interaction/measurement can spread throughout the universe.

/Patrick
 
  • #53
microsansfil said:
Absolutely not. The prediction of quantum mechanics, in general, is not based on knowledge of the past, as far as measurement is concerned.

What would be the usefulness of a predictive theory that would not require any measurements?

/Patrick
I'm not sure what you are asking. Quantum mechanics (which applies to everything as long as you can use non-relativistic physics) just predicts the outcome of experiments. What do you mean by "from the past"? Like any dynamical theory, QT starts from a description of the state the system is prepared in (or is observed to be prepared in) at time ##t_0## and provides, via the dynamical laws, what is to be expected to be observed in a later measurement; and this it does very well within its realm of applicability.
 
  • #54
microsansfil said:
The wave packet of a particle without interaction/measurement can spread throughout the universe.

/Patrick
Yes, of course, that's what comes out of a calculation you do in the QM 1 lecture in the first or second week. So what?
 
  • #55
vanhees71 said:
I'm not sure what you are asking. Quantum mechanics (which applies to everything as long as you can use non-relativistic physics) just predicts the outcome of experiments. What do you mean by "from the past"? Like any dynamical theory, QT starts from a description of the state the system is prepared in (or is observed to be prepared in) at time ##t_0## and provides, via the dynamical laws, what is to be expected to be observed in a later measurement; and this it does very well within its realm of applicability.
Pictures are often worth more than speeches:

Classical Mechanics

[attached slide]

Quantum Mechanics

[attached slides]

That's what you can see in a QM 1 lecture in the first or second week. Did you miss this passage during your studies?

Without measurements, it is only possible to predict probabilities; it is as if the properties were accessible only through measurement operations that at least disturb them or at most generate them. They are not deduced from the past in a deterministic way.

/Patrick
 

  • #56
That's a quite nice summary of QT, though I don't like the very problematic collapse postulate. What I meant with my statement was the spreading of a free wave packet in non-relativistic QT. Usually you get the propagation of a Gaussian wave packet according to the Schrödinger equation as a problem in the first few recitation sessions. Its meaning is of course given as on your French slide: ##|\psi(t,x)|^2## is the position-probability distribution at time ##t##, i.e., it gives the probability per (small) detector volume for a detector sitting at the point ##x## to click at time ##t##. That's all you need to know to make predictions concerning this position measurement.

What the particle does after detection is a question that cannot be part of the general formalism. If you have a von Neumann filter measurement, you indeed have to adapt the wave function due to the interaction of the particle with the measurement device, based on the knowledge that it registered the particle at time ##t## at a place ##x## with some uncertainty given by the position resolution of the detector. In this (and only this) case the "collapse postulate" is a valid FAPP description of a state-preparation procedure, but no more.
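The spreading mentioned above is easy to check numerically. Here is a small sketch (units ##\hbar = m = 1##; all parameter values are my own choices) that evolves a free Gaussian packet exactly in momentum space and compares the width of ##|\psi(t,x)|^2## with the textbook formula ##\sigma(t) = \sigma_0\sqrt{1 + (t/2\sigma_0^2)^2}##:

```python
import numpy as np

# Sketch: spreading of a free Gaussian wave packet, the standard QM-1
# exercise mentioned above. |psi(t,x)|^2 is the position-probability density.
x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]
sigma0 = 1.0
psi0 = (2 * np.pi * sigma0**2) ** -0.25 * np.exp(-x**2 / (4 * sigma0**2))

# Exact free evolution: multiply by exp(-i k^2 t / 2) in momentum space.
k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)
t = 5.0
psi_t = np.fft.ifft(np.exp(-1j * k**2 * t / 2) * np.fft.fft(psi0))

prob = np.abs(psi_t) ** 2
prob /= prob.sum() * dx                  # renormalise against round-off
var = np.sum(prob * x**2) * dx           # <x^2>; the mean stays 0 here
sigma_numeric = np.sqrt(var)
sigma_analytic = sigma0 * np.sqrt(1 + (t / (2 * sigma0**2)) ** 2)
print(sigma_numeric, sigma_analytic)     # the two widths closely agree
```

The numerically measured width of the probability distribution reproduces the analytic spreading law, illustrating that the packet's statistics, not any single detector click, is what the formalism pins down.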
 
  • #57
A. Neumaier said:
Well, into which basis? If any basis is allowed, then anything can happen. But then it is not determined by the simulation of the wave function but by the additional choice of the basis. This would mean that in our real universe, what happens depends not only on the Schrödinger dynamics but also on choosing a basis. in other words, the basis elements constitute additional hidden variables needed to get real events from quantum mechanics.

The quantum theory of the miniverse is in the dynamics and the initial conditions, but not the choice of basis. Different bases make clear different features of the miniverse we might wish to understand. They are not separate, alternative theories of the miniverse.

The theory does constrain our choice insofar as our decomposition has to be one that returns approximately standard probabilities, which is the case if ##\mathrm{Re}\,\mathbf{Tr}[C_{\alpha'}\rho C^\dagger_\alpha]\approx 0## for ##\alpha'\neq\alpha##. But this is a feature, not a bug, as it ensures that our physical understanding of the miniverse is always logically valid.
 
  • #58
vanhees71 said:
Quantum mechanics (which applies to everything as long as you can use non-relativistic physics) just predicts the outcome of experiments.
No. It leaves most details about the outcomes of experiments undetermined; only their gross statistics is determined.

According to all traditional interpretations, quantum mechanics alone never predicts the outcome of any single experiment but only the statistics of a large ensemble of similarly prepared experiments.

In contrast, the thermal interpretation predicts the outcomes of experiments individually (from the state of the universe) in terms of the quantum formalism alone, and only our limited knowledge of the latter forces us to statistical considerations.
 
  • #59
Morbert said:
The quantum theory of the miniverse is in the dynamics and the initial conditions, but not the choice of basis. Different bases make clear different features of the miniverse we might wish to understand. They are not separate, alternative theories of the miniverse.

The theory does constrain our choice insofar as our decomposition has to be one that returns approximately standard probabilities, which is the case if ##Re[\mathbf{Tr}[C_{\alpha'}\rho C^\dagger_\alpha]]\approx 0## for ##\alpha'\neq\alpha##. But this is a feature, not a bug, as it ensures our physical understanding of the miniverse is always logically valid.
So to predict/simulate events you need quantum mechanics plus a basis that must be added externally, though in reality things happen without having to specify a basis. Since according to you these additional choices are necessary (rather than implied by the quantum formalism), quantum mechanics alone is incomplete.
 
  • #60
A. Neumaier said:
No. It leaves most details about the outcomes of experiments undetermined; only their gross statistics is determined.

According to all traditional interpretations, quantum mechanics alone never predicts the outcome of any single experiment but only the statistics of a large ensemble of similarly prepared experiments.

In contrast, the thermal interpretation predicts the outcomes of experiments individually (from the state of the universe) from the quantum formalism alone, and only our ignorance of the latter forces us to statistical considerations.
Well, then can you explain to me, why QT is considered the most successful physical theory ever? What is undetermined in your opinion?

You say it's "only the statistics". But that's the point! Nature is not deterministic on the fundamental level according to QT. E.g., if you have a single radioactive atom (and today you can deal with single atoms, e.g., in traps or storage rings), there is no way to predict the precise time when it decays (given it is "here" now).

Of course, there's always the possibility that QT is not complete and we simply do not know the complete set of observables which might determine the precise time when the atom decays; but so far we don't have any hint that this might be true, and from the various Bell experiments, all confirming QT and disproving local deterministic HV theories, I tend to believe that QT is rather complete (apart from the description of gravity, which is today the only clear indication that QT is not complete). That's of course a belief I can't prove, but under this assumption QT tells us that nature is inherently probabilistic, i.e., certain things, like the decay of an unstable atom, simply are random. I don't see where a problem with this might be. It's rather amazing how accurately we are able to describe this inherent randomness with probability theory (a mathematical axiomatic system, which doesn't tell us anything about the concrete probability measure for a given real-world situation) together with QT (a physical theory that provides precise predictions for the probabilities of the inherently random processes observed in nature).

I think there's no reason to think that nature may not be random at the most fundamental level of describability.
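The radioactive-decay example above can be made concrete with a toy simulation (the rate and sample size are my own arbitrary choices): each individual decay time is unpredictable, yet the ensemble reproduces the predicted exponential survival law.

```python
import numpy as np

# Toy illustration of the point above: QT fixes only the statistics of decay.
# For a single unstable atom the decay time is exponentially distributed with
# rate lam; nothing in the theory predicts the individual time.
rng = np.random.default_rng(0)
lam = 0.1                                        # decay rate (arbitrary units)
times = rng.exponential(1 / lam, size=100_000)   # an ensemble of atoms

# Each entry of `times` is individually unpredictable, but the ensemble
# reproduces the predicted survival law N(t)/N0 = exp(-lam * t):
t = 10.0
empirical = np.mean(times > t)
predicted = np.exp(-lam * t)
print(empirical, predicted)   # both close to exp(-1), about 0.368
```

The agreement is only statistical: rerunning with a different seed changes every individual decay time while leaving the survival fraction essentially unchanged.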
 
  • #61
vanhees71 said:
Well, then can you explain to me, why QT is considered the most successful physical theory ever?
Because it actually determines the statistics with phenomenal success. This is quite a feat!
vanhees71 said:
What is undetermined in your opinion?
Each single outcome, and all details of the fluctuations. Thus most of the stuff that is observed.
But only in the traditional interpretations.

In my opinion, the true, complete quantum physics is the quantum formalism plus the thermal interpretation. It accounts for each single outcome, and for all details of the fluctuations.
 
  • #62
vanhees71 said:
I think there's no reason to think that nature may not be random at the most fundamental level of describability.
This is a completely unverifiable statement of your faith in the traditional quantum philosophy.
 
  • #63
vanhees71 said:
Nature is not deterministic on the fundamental level...

To my mind, there is a need for more profound thinking. The onsets of individual clicks in a counter seem to be totally lawless events, coming by themselves and thus being uncaused. Or can one point to a cause which compels these individual effects?
 
  • #64
Randomness is an artifact of measurables. That is not to say it doesn't exist, for lack of a better word. It is effective in its domain (dynamics/relations, the average outcome), like how flat space is treated in geometry and GR. TI accounts for both. I have little confidence in the unmalleability/universality of time in QM, which accounts for all the predictive values and observables in the SR/GR domain, even the weird ones. I can only suspect that randomness/indeterminism is not the underlying factor but a mere artifact of probability, in the same manner that flat space is not observable.
 
  • #65
The statistical character is not something we can get away from. This statistical behavior is described by the partition function

##\int d\varphi \, e^{-\varphi^T D \varphi/2} = (2\pi)^{n/2} (\det D)^{-1/2} = (2\pi)^{n/2} \exp(-\tfrac{1}{2} \mathrm{Tr} \ln D),##

where ##F = \tfrac{1}{2} \mathrm{Tr} \ln D## is the free energy (dropping the constant ##(2\pi)^{n/2}##). The first integral is the path integral

##\int e^{-S} = e^{-F},##

where ##F## is the free energy and the action is ##S = \tfrac{1}{2} \varphi^T D \varphi##. The formal similarity to thermodynamics tells us that these expressions can only be interpreted statistically. The free energy is defined as a Legendre transform in terms of the conjugate variables ##J## and ##\varphi##. Expressions such as ##S = -\mathrm{Tr}\, \rho \ln \rho## cannot be interpreted in terms of the usual microstates because they are defined through Wick rotation.
 
  • #66
PrashantGokaraju said:
The formal similarity to thermodynamics is something that tells us that these expressions can only be interpreted statistically.
No. There are many formal similarities in mathematics and physics that cannot be ascribed to a similar interpretation.

The formal similarity only tells that there is a possibility of interpreting it statistically.
 
  • #67
At least intuitively, any model involving randomness can be replaced by a deterministic model with extra variables controlling that randomness. Not knowing those variables can be called "incomplete understanding", though I think I'd be content, and would even consider our understanding complete, for models where certain variables are not even theoretically knowable, as long as their effect is well defined.

The problem with QM is that even such hidden-variable models are proven not to work. For me, my "incomplete understanding" stems not from the randomness itself but from the way the random outcomes under different parameters are related. But I guess this is a whole other topic.
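The intuition in the first paragraph can be sketched in a few lines (a deliberately trivial toy model of mine, not a claim about QM): a seeded pseudo-random generator plays the role of the "hidden variable" that controls the apparent randomness.

```python
import random

# Toy model: a "random" coin whose randomness is controlled by one extra
# hidden variable, the PRNG seed. With the seed unknown the flips look
# random; with it known, every outcome is determined in advance.
def coin_flips(hidden_seed, n):
    rng = random.Random(hidden_seed)
    return [rng.random() < 0.5 for _ in range(n)]

run1 = coin_flips(hidden_seed=42, n=10)
run2 = coin_flips(hidden_seed=42, n=10)
print(run1 == run2)   # True: same hidden variable, same "random" outcomes
```

As the second paragraph notes, Bell-type arguments rule out this kind of local hidden-variable bookkeeping for quantum correlations, which is exactly what makes the quantum case different from a classical coin.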
 
  • #68
A. Neumaier said:
Because it actually determines the statistics with phenomenal success. This is quite a feat!

Each single outcome, and all details of the fluctuations. Thus most of the stuff that is observed.
But only in the traditional interpretations.

In my opinion, the true, complete quantum physics is the quantum formalism plus the thermal interpretation. It accounts for each single outcome, and for all details of the fluctuations.
Ok, so we agree on the basic facts concerning QT as a very successful physical theory.

Now, it is obviously difficult, even after all these decades, to simply accept the simple conclusion that nature behaves inherently randomly. If this is true, as strongly suggested by QT and the successful experimental tests it has survived up to this day, then there's no way to predict a single measurement's outcome with certainty (except, of course, in the case where the system is prepared in a state in which the measured observable takes a determined value), because the observable doesn't take a determined value. Then the complete description is indeed the probabilities, and to test these probabilities you need an ensemble. Fluctuations also refer to an ensemble. So if you accept the probabilistic description as complete, there's nothing lacking in QT simply because the outcome of an individual measurement is inherently random.

I still don't understand the thermal interpretation: Recently you claimed that within the thermal interpretation the observables are what in the usual interpretation of QT is called the expectation value of the observable given the state, i.e., ##\langle O \rangle = \mathrm{Tr}(\hat{\rho} \hat{O})##. This is a single value, i.e., it's determined, given the state. Now you claim there are fluctuations. How do you define them? In the usual interpretations, where the state is interpreted probabilistically, it's clear: the fluctuations are determined by the moments or cumulants of the probability distribution or, equivalently, all expectation values of powers of ##O##, i.e., ##O_n =\mathrm{Tr} (\hat{\rho} \hat{O}^n)##, ##n \in \mathbb{N}##. But then you have the usual probabilistic interpretation back (no matter which flavor of additional "ontology" and "metaphysics" you prefer). Just renaming a probabilistic theory, avoiding the words statistics, randomness, and probability, does not change the mathematical content, as you well know!

So why then call it "thermal"? (The name is misleading anyway, because it seems to be your intention to provide a generally valid reinterpretation, not just one for thermal equilibrium.)
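The moments ##O_n = \mathrm{Tr}(\hat{\rho}\hat{O}^n)## mentioned above are straightforward to compute. Here is a minimal sketch (my own toy example: ##\sigma_z## in the state ##|+\rangle##, where the expectation vanishes but the fluctuation is maximal):

```python
import numpy as np

# Expectation and fluctuation of an observable O in state rho via
# O_n = Tr(rho O^n), illustrating the moments discussed above.
plus = np.array([1.0, 1.0]) / np.sqrt(2)   # the state |+>
rho = np.outer(plus, plus)                  # rho = |+><+|
O = np.array([[1.0, 0.0], [0.0, -1.0]])     # sigma_z

O1 = np.trace(rho @ O).real        # <O>   = 0 in this state
O2 = np.trace(rho @ O @ O).real    # <O^2> = 1 (sigma_z^2 is the identity)
variance = O2 - O1**2
print(O1, variance)                # -> 0.0 1.0
```

The point at issue in the exchange is precisely how to read these two numbers: as ensemble statistics of random outcomes, or (in the thermal interpretation) as determined properties with an intrinsic uncertainty.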
 
  • #69
A. Neumaier said:
No. There are many formal similarities in mathematics and physics that cannot be ascribed to a similar interpretation.

The formal similarity only tells that there is a possibility of interpreting it statistically.

This is more than a formal similarity. The Euclidean action essentially is the entropy.
 
  • #70
Lord Jestocost said:
To my mind, there is need to more profound thinking. The onsets of individual clicks in a counter seem to be totally lawless events, coming by themselves and thus being uncaused. Or can one denote a cause which compels these individual effects?
That's precisely my point (take the example of radioactive decay and a Geiger counter): the individual clicks ARE random according to QT. In the absence of any deterministic explanation (in view of all these accurate Bell tests confirming QT), my conclusion simply is that nature is inherently random, i.e., the time at which the individual atom decays, and thus the Geiger counter clicks, is random.

The mathematical model for describing random events is probability theory, and QT is the physical theory providing the probability measures to be used to describe measurement outcomes in experiments (which are necessarily random experiments due to the inherent randomness of nature); and it turns out to be anything but "lawless". On the contrary, QT provides the best estimates of probabilities for a vast number of observations (in fact, all observations so far!) ever. I don't know a single other application of probability theory/applied statistics that gives predictions for probabilities/statistics as accurate as QT's! Thus we have a precise probabilistic description of the "inherent randomness of nature". No more, no less.
 
