Jürg Fröhlich on the deeper meaning of Quantum Mechanics

In summary, the paper by Jürg Fröhlich discusses the problems with the standard formulation of quantum mechanics and its shortcomings. He felt that the subject had better remain a hobby for later in his career, but when he was approaching mandatory retirement he felt an urge to clarify his understanding of some of the subjects he had had to teach his students for thirty years. The paper presents a completion of QM, the "ETH-Approach to QM", which is too abstract to become popular. Interesting paper.
  • #36
DarMM said:
What do others think of Fröhlich's argument about the inequivalence of the Schrödinger and Heisenberg pictures?
I haven't yet understood what Fröhlich means by his nonequivalence claim.
dextercioby said:
I have always perceived that the equivalence of the Schroedinger and Heisenberg pictures is nothing but a disguised form of Born's rule
But it has nothing to do with Born's rule, unless you identify Born's rule with the existence of the expectation mapping (which, however, would empty Born's rule of all its empirical content).
Surely it is not equivalent to Born's rule, for it says nothing about measurement.

The equivalence just says that the time dependence of ##\mathrm{Tr}\,[A(t)\rho(t)]## can be distributed in different ways between ##A## and ##\rho##.
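
For concreteness, the textbook identity behind this is just cyclicity of the trace: with ##\rho(t)=U(t)\,\rho\,U(t)^\dagger## in the Schrödinger picture and ##A(t)=U(t)^\dagger A\,U(t)## in the Heisenberg picture,
$$\mathrm{Tr}\!\left[A\,U(t)\,\rho\,U(t)^\dagger\right] \;=\; \mathrm{Tr}\!\left[U(t)^\dagger A\,U(t)\,\rho\right],$$
so all expectation values agree no matter how the time dependence is split between ##A## and ##\rho##.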
 
  • Like
Likes vanhees71
  • #37
A. Neumaier said:
I haven't yet understood what Fröhlich means by his nonequivalence claim
He's basically referring to the fact that his interpretation has "constant collapse" for lack of a better word.

So Fröhlich says that at time ##t## we have the algebra of observables located at times ##\geq t##. This is denoted ##\mathcal{E}_{\geq t}##. An event is a particular set of projectors, ##\{\pi_{E,t}\}##, summing to unity. An event is then said to occur at ##t## if the expectation of the commutator of its projectors with all other observables in ##\mathcal{E}_{\geq t}## vanishes in the state ##\omega##:
$$\omega\left(\left[\pi_{E,t},A\right]\right) = 0 \qquad \text{for all } A \in \mathcal{E}_{\geq t}.$$

This is meant to be a purely mathematical condition with no need for observation as a primitive. In a given state ##\omega## and given a particular time ##t## and its associated observables ##\mathcal{E}_{\geq t}## there will be such a set of projectors. Thus there is always some event that occurs. After that event has occurred one should use the state ##\omega_{E,t}## given by the conventional state reduction rule.
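
For completeness, the "conventional state reduction rule" referred to here, written in this notation, would presumably be the Lüders form (this paraphrase is mine, not Fröhlich's wording): if ##\pi## is the projector in ##\{\pi_{E,t}\}## corresponding to the realized outcome, the updated state is
$$\omega_{E,t}(A) \;=\; \frac{\omega\!\left(\pi\,A\,\pi\right)}{\omega\!\left(\pi\right)}, \qquad A\in\mathcal{E}_{\geq t}.$$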

However, imagine I am an experimenter in a lab. I have performed a measurement and updated to ##\omega_{E,t}##. Fröhlich's point is that, under a proper mathematical analysis, there will then be some event ##\{\pi_{E^\prime,t^\prime}\}## that, via his condition, occurs at a later time ##t^\prime##. This will then cause an update to the state ##\omega_{E^\prime,t^\prime}##. However, under conventional QM the experimenter, having made no further measurement, continues to use ##\omega_{E,t}##. In the ETH-interpretation he has made an error by restricting the events that occur to be solely his measurement events. Thus his state is incorrect.

Fröhlich discusses why this is usually almost completely accurate: essentially because the event that follows at ##t^\prime## (under certain assumptions about the Hamiltonian) has projectors that almost overlap with those of the event that occurred at ##t##.

This results in the ETH-interpretation having slightly different predictions from standard QM.

Operators evolve under the Heisenberg equations of motion, but states between measurements do not exactly follow Schrödinger evolution. Thus the inequivalence.
 
  • Like
Likes Auto-Didact
  • #38
DarMM said:
Operators evolve under the Heisenberg equations of motion, but states between measurements do not exactly follow Schrödinger evolution. Thus the inequivalence.
But traditionally, if operators evolve under the Heisenberg equations of motion, states remain constant.

Thus Fröhlich changes the meaning of the Heisenberg picture!?

It seems to me that, when viewed in the Schrödinger picture, Fröhlich is proposing something like the piecewise deterministic processes (PDP) of Breuer & Petruccione referred to in my Part III. There is also old work by Jadczyk on PDP and event-enhanced quantum mechanics: https://arxiv.org/pdf/hep-th/9409189, https://arxiv.org/pdf/quant-ph/9506017, and a few more. But so far I haven't had the time to check the precise relation to Fröhlich's setting.
 
  • #39
A. Neumaier said:
But traditionally, if operators evolve under the Heisenberg equations of motion, states remain constant.

Thus Fröhlich changes the meaning of the Heisenberg picture!?
Yes, I would say so. Operators follow the Heisenberg equations of motion, but states do not remain constant. In standard QM they remain constant except upon "collapse", so constant except at measurements. Fröhlich however has "constant collapse" so states are truly always evolving even in the Heisenberg picture.
 
  • #40
A. Neumaier said:
It seems to me that, when viewed in the Schrödinger picture, Fröhlich is proposing something like the piecewise deterministic processes (PDP) of Breuer & Petruccione referred to in my Part III
There is a relation, I suspect, but for Fröhlich the evolution is fundamentally stochastic/random. The state update rule is not an "effective" prescription, but literally true.
 
  • #41
DarMM said:
Fröhlich however has "constant collapse" so states are truly always evolving even in the Heisenberg picture.
Do you mean continuous collapse - at every moment in time, as in continuous measurement theory?
DarMM said:
There is a relation, I suspect, but for Fröhlich the evolution is fundamentally stochastic/random. The state update rule is not an "effective" prescription, but literally true.
The same holds in PDP, except that the times of collapse are random, not continuous (else one has a quantum diffusion process - relevant for measuring operators with continuous spectra).
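
To make the comparison concrete, here is a minimal quantum-jump (PDP) trajectory for a spontaneously decaying two-level atom in the textbook Breuer & Petruccione style: deterministic non-unitary drift between jumps, with collapses at random times. It is only an illustrative sketch of the PDP idea under standard assumptions, not Fröhlich's construction, and all parameter values are arbitrary.

```python
import numpy as np

# Minimal quantum-jump (PDP) trajectory for a decaying two-level atom.
# Illustrative sketch only; parameters and the atom model are arbitrary.

rng = np.random.default_rng(0)

gamma = 1.0                                   # decay rate
dt = 1e-3                                     # time step
L = np.array([[0, 1], [0, 0]], dtype=complex) # lowering operator |g><e|
H = np.zeros((2, 2), dtype=complex)           # no coherent driving
H_eff = H - 0.5j * gamma * (L.conj().T @ L)   # non-Hermitian effective Hamiltonian

psi = np.array([0, 1], dtype=complex)         # start in the excited state |e>
t, jump_times = 0.0, []
while t < 5.0:
    # deterministic (non-unitary) drift between jumps
    psi = psi - 1j * dt * (H_eff @ psi)
    # probability of a jump (photon emission) in this time step
    p_jump = gamma * dt * np.vdot(psi, (L.conj().T @ L) @ psi).real / np.vdot(psi, psi).real
    if rng.random() < p_jump:
        psi = L @ psi                         # random jump: collapse to |g>
        jump_times.append(round(t, 3))
    psi = psi / np.linalg.norm(psi)           # renormalize after drift or jump
    t += dt

print("jump times:", jump_times)              # typically a single jump near t ~ 1/gamma
```

Averaging many such trajectories reproduces the Lindblad master-equation evolution of the density matrix, which is the sense in which the collapse times here are random rather than continuous.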
 
  • #42
A. Neumaier said:
Do you mean continuous collapse - at every moment in time, as in continuous measurement theory?
I believe so. He discusses only the case where time is discrete. There he has collapse at each discrete moment of time. The natural extension to continuous time is continuous collapse.

A. Neumaier said:
The same holds in PDP, except that the times of collapse are random, not continuous (else one has a quantum diffusion process - relevant for measuring operators with continuous spectra).
You're right of course. I had in mind your thermal interpretation's view of such cases when contrasting it with Fröhlich. PDP is very similar to Fröhlich, as you said.
 
  • #43
I should say that, as far as I can tell, Fröhlich doesn't consider the quantum state to be physically real, just a method of keeping track of which events might occur. So the collapse processes above are physical in the sense of specifying the occurrence of an event, but not the reduction of a physical state vector.

So in ETH the world is composed of a sequence of randomly realized events. Events from non-commuting projector sets are not comparable. A history only involves a subset of possible quantities. This is the typical counterfactual indefiniteness that distinguishes QM from a classical stochastic process, e.g. there will be an event where a value for ##S_x## is realized, not the whole spin vector ##\left(S_x, S_y, S_z\right)##.

In a Bell-Aspect experiment one cannot compare different measurement pair choices for Alice and Bob, since they occur in different histories.

So it's a Copenhagen variant very similar to decoherent histories and the "event" interpretation of Haag @bhobba. Again, I'm not really sure whether there is a true difference between Fröhlich, Haag and Bub here or just a difference of formulation.
 
  • #44
I've not read all the recent postings, but some of the proponents of the claim that there's a measurement problem raised two issues:

(a) how do measurement outcomes occur?
(b) the need to prove Born's rule.

I don't see any issues with either point, since a measurement result comes about through interactions of the measured system with the measurement device. QT gives an adequate and accurate description of all reproducible observations made so far.

Concerning (b), I consider the Born rule one of the fundamental postulates of QT, which cannot be derived from the other postulates. I think Englert is right!
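
For reference, the postulate in question in its standard trace form (stated here just for concreteness): for a state ##\rho## and an observable with spectral projectors ##P_a## for outcome ##a##,
$$p(a) \;=\; \mathrm{Tr}\!\left(\rho\,P_a\right).$$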
 
  • Like
Likes bhobba
  • #45
vanhees71 said:
I don't see any issues with either point, since a measurement result comes about through interactions of the measured system with the measurement device. QT gives an adequate and accurate description of all reproducible observations made so far.
I think people's issue is that it doesn't tell you which result will occur. There's also the unusual feature that only the observable you look at "occurs", e.g. for spin in the x-direction only an ##S_x## outcome occurs, so quantum observables are as much a property of the device as of the quantum system itself.

I think you are fine with this because you think there isn't anything but the statistics, i.e. you can't know which occurs because that's what the world is like.
 
  • #46
I consider the rules of the minimal interpretation to be outright contradictory. If something is a contradiction, it can't be correct. On the one hand, one of the rules of the minimal interpretation says that a measurement always results in an eigenvalue of the operator corresponding to the observable being measured. That means that after a measurement, the device is in a definite "pointer state". On the other hand, if you treat the measuring device (plus observer plus the environment plus whatever else is involved) as a quantum mechanical system that evolves under unitary evolution, then, unless the observable being measured initially has a definite value, after the measurement the measuring device (plus observer, etc.) will NOT be in a definite pointer state.

This is just a contradiction. Of course, you can make the use of the quantum formalism consistent by just imposing an ad hoc distinction between measurement devices (or more generally, macroscopic systems) and microscopic systems. But that's not a physical theory, that's a rule of thumb.
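
In equations (a standard textbook sketch of the von Neumann measurement chain, with hypothetical pointer states ##|P_{\uparrow}\rangle,|P_{\downarrow}\rangle## and ready state ##|R\rangle##): measuring a spin prepared in a superposition gives
$$|R\rangle\otimes\big(\alpha|{\uparrow}\rangle+\beta|{\downarrow}\rangle\big)\;\xrightarrow{\;U\;}\;\alpha\,|P_{\uparrow}\rangle\otimes|{\uparrow}\rangle+\beta\,|P_{\downarrow}\rangle\otimes|{\downarrow}\rangle,$$
which is not a definite pointer state unless ##\alpha## or ##\beta## vanishes.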
 
  • Like
Likes eloheim, Auto-Didact, dextercioby and 1 other person
  • #47
vanhees71 said:
I've not read all the recent postings, but some of the proponents of the claim that there's a measurement problem raised two issues:

(a) how do measurement outcomes occur?
(b) the need to prove Born's rule.

I don't see any issues with either point, since a measurement result comes about through interactions of the measured system with the measurement device. QT gives an adequate and accurate description of all reproducible observations made so far.

Concerning (b), I consider the Born rule one of the fundamental postulates of QT, which cannot be derived from the other postulates. I think Englert is right!
I agree, except perhaps one should say "... a measurement result comes about through non-unitary interactions...". It is the non-unitarity that seems to give people a problem.
 
  • #48
stevendaryl said:
This is just a contradiction
That has never been demonstrated.

Your contradiction equally applies to Spekkens' model, where the device measures a system and obtains an outcome ##a## from a set ##\{a,b\}##, but an observer isolated from the device models it as being in a superposition. However, one can explicitly see that there isn't a contradiction in Spekkens' model.
 
  • Like
Likes dextercioby and vanhees71
  • #49
stevendaryl said:
I consider the rules of the minimal interpretation to be outright contradictory. If something is a contradiction, it can't be correct. On the one hand, one of the rules of the minimal interpretation says that a measurement always results in an eigenvalue of the operator corresponding to the observable being measured. That means that after a measurement, the device is in a definite "pointer state". On the other hand, if you treat the measuring device (plus observer plus the environment plus whatever else is involved) as a quantum mechanical system that evolves under unitary evolution, then, unless the observable being measured initially has a definite value, after the measurement the measuring device (plus observer, etc.) will NOT be in a definite pointer state.

This is just a contradiction. Of course, you can make the use of the quantum formalism consistent by just imposing an ad hoc distinction between measurement devices (or more generally, macroscopic systems) and microscopic systems. But that's not a physical theory, that's a rule of thumb.
In other words your problem is that you don't want to accept the probabilistic nature of the quantum description. That's not a problem of QT, but just prejudice about how nature should be. Science, however, tells us how nature behaves, and the conclusion of this gain of knowledge, summarized accurately in the QT formalism (which leads to correct predictions and descriptions of all objective phenomena observed so far), is that nature is intrinsically probabilistic, i.e. there's no way to prepare a system such that all observables take determined values. Thus, there's no contradiction in the two postulates you cite. On the contrary, indeterminism in the above precise sense of QT makes it a consistent and accurate description of all our experience so far!
 
  • #50
Mentz114 said:
I agree, except perhaps one should say "... a measurement result comes about through non-unitary interactions...". It is the non-unitarity that seems to give people a problem.
There's not a single proof of non-unitarity. In some sense one can even say that everyday experience (the validity of thermodynamics) tells the opposite: unitarity ensures the validity of the principle of detailed balance.
 
  • #51
stevendaryl said:
if you treat the measuring device (plus observer plus the environment plus whatever else is involved) as a quantum mechanical system that evolves under unitary evolution

Then you are saying that no measurement occurred. That removes the contradiction; in its place is simply a choice of whether or not to treat the system as if a measurement occurred.

The issue with the minimal interpretation is that there is no rule that tells you when a measurement occurs. In practice the rule is that you treat measurements as having occurred whenever you have to in order to match the data. So in your example, since nobody actually observes observers to be in superpositions of pointer states, and observers always observe definite results, in practice we always treat measurements as having occurred by the time an observer observes a result.
 
  • Like
Likes dextercioby and vanhees71
  • #52
vanhees71 said:
There's not a single proof of non-unitarity. In some sense one can even say that everyday experience (the validity of thermodynamics) tells the opposite: unitarity ensures the validity of the principle of detailed balance.
I don't agree. My problem is irreversibility, which is demanded of the measurement by the purists but is unobtainable with unitary evolution.
 
  • Like
Likes Auto-Didact
  • #53
vanhees71 said:
In other words your problem is that you don't want to accept the probabilistic nature of the quantum description

No, the problem is you refuse to consider the time evolution of the measuring device itself as the unitary evolution of a quantum system. But this is the only thing that makes sense, since the device is made of electrons and nucleons, which everyone agrees are quantum systems.

You are implicitly dividing the world in two, where the meaning of quantum systems is defined only by the probabilistic responses they trigger in classical devices, which you independently assume to already exist. But there is no sensible way to explain how these classical devices can ever come to exist in the first place.
 
  • Like
Likes eloheim, Auto-Didact and dextercioby
  • #54
This is basically just a discussion about what's going on in Wigner's friend, right?

It would be interesting to see how it works out in Fröhlich's view, since he doesn't have observers in the usual sense. I think he'd just have his commutation condition determine, in an objective sense, when the measurement event has occurred.
 
  • #55
vanhees71 said:
In other words your problem is that you don't want to accept the probabilistic nature of the quantum description.

There is nothing I said that suggests that, and it's not true. That's ignoring what I actually said and pretending that I said something different that you have a prepared response for.
 
  • Like
Likes Auto-Didact
  • #56
The issue with quantum mechanics is that it is NOT a probabilistic theory, until you specify a basis. Then you can compute probabilities using the Born rule. But what determines which basis is relevant?

The minimal interpretation says it's whichever basis corresponds to the observable being measured. But what does it mean that a variable is being measured? It means, ultimately, that the interaction between the system being measured and the measuring device is such that values of the variable become correlated with macroscopic "pointer variables".

So, the minimal interpretation ultimately gives a preference to macroscopic quantities over other variables, but this preference is obfuscated by the use of the word "measurement". The inconsistency is that if you treat the macroscopic system as a huge quantum mechanical system, then no measurement will have taken place at all. The macroscopic system (plus the environment, and maybe the rest of the universe) will not evolve into a definite pointer state.

So whether or not you consider a macroscopic interaction to be a measurement leads to different results. That's an inconsistency in the formalism. The inconsistency can be resolved in an ad hoc manner by just declaring that macroscopic systems are to be treated differently from microscopic systems, but there is no support for this in the minimal theory. The minimal theory does not in any way specify that there is a limit to the size of system that can be analyzed using quantum mechanics and unitary evolution.
 
  • Like
Likes eloheim, Auto-Didact, mattt and 1 other person
  • #57
charters said:
You are implicitly dividing the world in two, where the meaning of quantum systems is defined only by the probabilistic responses they trigger in classical devices, which you independently assume to already exist. But there is no sensible way to explain how these classical devices can ever come to exist in the first place.

That's exactly right. The minimal interpretation requires two contradictory things: (1) that any system composed of quantum mechanical particles and fields, no matter how large, evolves unitarily according to the Schrödinger equation, and (2) that macroscopic measurement devices are treated as always having definite values for "pointer variables" (the results of measurements). These two requirements are contradictory.
 
  • Like
Likes eloheim and Auto-Didact
  • #58
stevendaryl said:
That's exactly right. The minimal interpretation requires two contradictory things: (1) that any system composed of quantum mechanical particles and fields, no matter how large, evolves unitarily according to the Schrödinger equation, and (2) that macroscopic measurement devices are treated as always having definite values for "pointer variables" (the results of measurements). These two requirements are contradictory.
What's the contradiction if one understands the quantum state probabilistically? This exact issue appears in Spekkens' model, where the resolution is clear. I don't understand what is different about QM that makes this a contradiction.
 
  • #59
vanhees71 said:
..... since a measurement result comes about through interactions of the measured system with the measurement device.

Here I disagree. In Renninger-type measurements the “reduction” of the wave function is accomplished without any physical interaction. As Nick Herbert writes in “Quantum Reality: Beyond the New Physics”:

The existence of measurements in which “nothing happens” (Renninger-style measurement), where knowledge is gained by the absence of a detection, is also difficult to reconcile with the view that irreversible acts cause quantum jumps. In a Renninger-style measurement, there must always be the “possibility of an irreversible act” (a detector must actually be present in the null channel), but this detector does not click during the actual measurement. If we take seriously the notion that irreversible acts collapse the wave function, Renninger measurements require us to believe that the mere possibility of an irreversible act is sufficient to bring about a quantum jump. The fact that such “interactionless” measurements are possible means that the wave function collapse cannot be identified with some specific random process occurring inside a measuring device.
 
  • #60
Lord Jestocost said:
Here I disagree. In Renninger-type measurements the “reduction” of the wave function is accomplished without any physical interaction. As Nick Herbert writes in “Quantum Reality: Beyond the New Physics”:

The existence of measurements in which “nothing happens” (Renninger-style measurement), where knowledge is gained by the absence of a detection, is also difficult to reconcile with the view that irreversible acts cause quantum jumps. In a Renninger-style measurement, there must always be the “possibility of an irreversible act” (a detector must actually be present in the null channel), but this detector does not click during the actual measurement. If we take seriously the notion that irreversible acts collapse the wave function, Renninger measurements require us to believe that the mere possibility of an irreversible act is sufficient to bring about a quantum jump. The fact that such “interactionless” measurements are possible means that the wave function collapse cannot be identified with some specific random process occurring inside a measuring device.

The irreversibility is not in the system being measured, but in the system doing the measuring. Any time knowledge is gained, that means that the system doing the measuring has been irreversibly changed.
 
  • #61
Mentz114 said:
I don't agree. My problem is irreversibility, which is demanded of the measurement by the purists but is unobtainable with unitary evolution.
The irreversibility comes into physics through coarse graining. Also in classical physics there's no irreversibility at the fundamental level. Of course, for philosophers this too opens a can of worms (or even Pandora's box, if you wish); there have been debates about this for even longer than there have been debates about QT. From the physics point of view there's no problem. On the contrary, it's well understood: the "arrow of time" comes into physics as a basic postulate in the sense of the "causal arrow of time". Like any fundamental assumption/postulate/axiom (however you want to call it) in the edifice of theoretical physics, it cannot be proven but is assumed based on experience, and this is the most fundamental experience of all: that there are "natural laws" which can be described mathematically. About this, too, you can build a lot of mysteries and philosophies of all kinds; from a physics point of view that's all irrelevant, but perhaps nice for your amusement in the sense of fairy tales.

The point with unitarity is that it guarantees that the "thermodynamical arrow of time" is inevitably consistent with the "causal arrow of time"; this is not a fundamental law but can be derived from the assumption of a causal arrow of time and the unitarity of the time evolution of closed quantum systems. With the thermodynamical arrow of time, irreversibility is also well determined, i.e., the fact that entropy increases. Also note that the entropy depends on the level of description, or coarse graining: it measures the missing information, given the level of description, relative to what's defined as "complete knowledge".
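
For concreteness, the information measure alluded to here is the standard Shannon/Gibbs form (a textbook reminder, not a quotation):
$$S \;=\; -k_B \sum_i p_i \ln p_i ,$$
where the ##p_i## are the probabilities of the coarse-grained macrostates at the chosen level of description; with complete knowledge (a pure state at the finest level) ##S=0##, while at a coarser level ##S## is generally positive.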
 
  • Informative
Likes Mentz114
  • #62
charters said:
No, the problem is you refuse to consider the time evolution of the measuring device itself as the unitary evolution of a quantum system. But this is the only thing that makes sense, since the device is made of electrons and nucleons, which everyone agrees are quantum systems.

You are implicitly dividing the world in two, where the meaning of quantum systems is defined only by the probabilistic responses they trigger in classical devices, which you independently assume to already exist. But there is no sensible way to explain how these classical devices can ever come to exist in the first place.
Why should I describe a measurement device like this? Do you describe a rocket flying to the moon as a quantum system? I don't believe that this makes much sense. That's why we use the adequately reduced (coarse-grained) description for measurement devices or rockets: it's because it's impossible to describe the microstate of a macroscopic system (apart from the rare cases where it is in a simple enough state, like some systems close to 0 temperature such as liquid He or a superconductor, etc.). As it turns out, the effective quantum description of macroscopic systems almost always leads to behavior of the relevant macroscopic degrees of freedom as described by classical physics (Newton's laws of motion, including gravity, for the moon rocket).
 
  • #63
Lord Jestocost said:
Here I disagree. In Renninger-type measurements the “reduction” of the wave function is accomplished without any physical interaction. As Nick Herbert writes in “Quantum Reality: Beyond the New Physics”:

The existence of measurements in which “nothing happens” (Renninger-style measurement), where knowledge is gained by the absence of a detection, is also difficult to reconcile with the view that irreversible acts cause quantum jumps. In a Renninger-style measurement, there must always be the “possibility of an irreversible act” (a detector must actually be present in the null channel), but this detector does not click during the actual measurement. If we take seriously the notion that irreversible acts collapse the wave function, Renninger measurements require us to believe that the mere possibility of an irreversible act is sufficient to bring about a quantum jump. The fact that such “interactionless” measurements are possible means that the wave function collapse cannot be identified with some specific random process occurring inside a measuring device.
Can you give a concrete example of a real-world experiment where a measurement occurs without interaction of something measured with some measurement device? I'd say that if the system doesn't interact with the measurement device, there cannot be a measurement to begin with. I've no clue what a "Renninger-style measurement" might be.
 
  • #64
Lord Jestocost said:
In a Renninger-style measurement, there must always be the “possibility of an irreversible act” (a detector must actually be present in the null channel), but this detector does not click during the actual measurement. If we take seriously the notion that irreversible acts collapse the wave function, Renninger measurements require us to believe that the mere possibility of an irreversible act is sufficient to bring about a quantum jump. The fact that such “interactionless” measurements
These are not interactionless - a null measurement is obtained, and as a consequence the state collapses (though not necessarily to an eigenstate).
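
A minimal way to write this null-result update, assuming an idealized projective description of the detector (a textbook form, not a quotation): if ##\Pi## projects onto the states the detector would have registered, a non-click updates the state to
$$\rho \;\longrightarrow\; \frac{(\mathbb{1}-\Pi)\,\rho\,(\mathbb{1}-\Pi)}{\mathrm{Tr}\!\left[(\mathbb{1}-\Pi)\,\rho\right]},$$
which is in general not an eigenstate of the measured observable, in line with the parenthetical remark above.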
 
  • #65
vanhees71 said:
Do you describe a rocket flying to the moon as a quantum system? I don't believe that this makes much sense.
Do you want to imply that a rocket flying to the moon is not a quantum system? What then is the size at which a system loses its describability as a quantum system?
vanhees71 said:
That's why we use the adequately reduced (coarse grained) description for measurement devices or rockets
Already the possibility of a reduced description requires that there is a theoretically possible, though unknown, complete description, which we can reduce by coarse graining.
 
  • Like
Likes eloheim, Auto-Didact and dextercioby
  • #66
vanhees71 said:
Can you give a concrete example of a real-world experiment...

As far as I know, Renninger-type measurements are thought experiments; see, for example:
Towards a Nonlinear Quantum Physics
 
  • #67
bob012345 said:
Feynman said nobody understands Quantum Mechanics. I think that's even more true today. I think it was Dirac who famously said something paraphrased as "shut up and calculate".

Much additional confusion arises when one doesn’t recognize that the “objects” addressed by quantum theory (QT) are - in a scientific sense - fundamentally different from the “objects” addressed by classical physical theories. As pointed out by James Jeans in his book “PHYSICS & PHILOSOPHY” (1948):

Complete objectivity can only be regained by treating observer and observed as parts of a single system; these must now be supposed to constitute an indivisible whole, which we must now identify with nature, the object of our studies. It now appears that this does not consist of something we perceive, but of our perceptions, it is not the object of the subject-object relation, but the relation itself. But it is only in the small-scale world of atoms and electrons that this new development makes any appreciable difference; our study of the man-sized world can go on as before.

QT deals with the temporal and spatial patterns of events which we perceive to occur on a space-time scene, our “empirical reality”. QT makes no mention of “deep reality” behind the scene, so QT cannot be the point of contact if one wants to know or to explain “what is really going on.”
 
  • #68
A. Neumaier said:
Do you want to imply that a rocket flying to the moon is not a quantum system? What then is the size at which a system loses its describability as a quantum system?

Already the possibility of a reduced description requires that there is a theoretically possible, though unknown, complete description, which we can reduce by coarse graining.
Of course not; as I said in the later paragraph, the classical behavior of the relevant macroscopic degrees of freedom is well understood via the usual coarse-graining procedures from quantum many-body theory. As far as matter is concerned, everything at the basic level is described by relativistic QFT. However, you cannot describe all systems in all microscopic detail, and thus one has to make approximations and find effective theories to describe the system at a level at which it can be described, and this can be done in various ways. E.g., bound-state problems are usually treated in non-relativistic approximations whenever this is possible, because it's much simpler than the relativistic description. Then, at the macroscopic level, one describes systems by classical physics, because that covers everything that's relevant at this level of description.
 
  • #69
Lord Jestocost said:
As far as I know, Renninger-type measurements are thought experiments; see, for example:
Towards a Nonlinear Quantum Physics
I don't have this book, but obviously it's not about quantum theory but some extension of it, so I cannot judge what it is supposed to solve or "correct" in standard quantum mechanics. As far as I could read on Google, no valid mathematical description of the described experiment was given, but it's clear that ##D_1## is in any case another obstacle in the path of the particle, with which it interacts, and thus it has to be taken into account to describe the system completely. It's a contradiction in itself to assume that ##D_1## is a detector and at the same time that the particles to be measured do not interact with it. Even if ##D_1## doesn't give a signal, the particle may still interact with it. The probability that ##D_1## gives a signal or not, as well as the corresponding probability for ##D_2##, has to be analyzed in detail. Usually there's some non-zero probability for a particle not to be detected at all, depending on the specific setup of the detector(s).
 
  • #70
vanhees71 said:
Of course not; as I said in the later paragraph, the classical behavior of the relevant macroscopic degrees of freedom is well understood via the usual coarse-graining procedures from quantum many-body theory. As far as matter is concerned, everything at the basic level is described by relativistic QFT. However, you cannot describe all systems in all microscopic detail
One cannot in practice. But foundations are about the principles, not the practice. All questions of interpretation concern the principles. There one has a single huge quantum system consisting of a tiny measured system, a macroscopic detector, and maybe a heat bath, and one wants to understand how the principles lead to unique outcomes (for example) - the measurement results of your coarse-grained description.

You answer all these foundational questions by replacing the in-principle existing (though unknown) description of this large quantum system with a classical description - just as Bohr did. The foundational problems are then associated with this change of description, where you just say that it works and hence is fine, but people interested in the foundations want to understand why.
 
  • Like
Likes eloheim, Auto-Didact and dextercioby
