How Does Environmentally Induced Decoherence Affect Quantum State Reduction?

In summary: environmentally induced decoherence involves the interaction between a system and its environment. The unitary evolution of the composite is exact, and the composite state stays pure, with zero entropy. The reduced density operator of the system alone is mixed and has higher entropy, and the same holds for the reduced density operator of the environment alone; only the composite retains the exact, pure, zero-entropy description. In this sense the system becomes "mixed" and increases in entropy while the composite remains pure.
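A minimal numerical sketch of this entropy bookkeeping (my own illustration, not from the thread; a maximally entangled two-qubit state stands in for "system + environment"):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), skipping zero eigenvalues."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return float(-np.sum(p * np.log2(p)))

# Pure entangled state of system (qubit 1) + environment (qubit 2):
# |Psi> = (|00> + |11>)/sqrt(2)
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())  # composite density matrix: a pure state

# Reduced density operator of the system: reshape to (sys, env, sys', env')
# and trace over the two environment indices.
rho_sys = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(von_neumann_entropy(rho))      # 0.0 bit -> composite: pure, zero entropy
print(von_neumann_entropy(rho_sys))  # 1.0 bit -> subsystem: maximally mixed
print(rho_sys.round(3))              # diag(0.5, 0.5); off-diagonals are gone
```

By symmetry the environment's reduced state has the same entropy; only the composite description is exact and pure.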
  • #71
vanhees71 said:
Of course, but then I don't use the reduced description but the state of the full system. It was exactly the example on p. 10 of the above-mentioned paper which led to my question. As so often in these interpretational discussions, it's much ado about nothing!

It is not much ado about nothing. Not distinguishing these has misled some into believing that decoherence solves the measurement problem, including physicists as distinguished as Anderson. If you don't like interpretation, it must be noted that most great physicists cared deeply about it. In modern times, one can read the comments about the importance of the distinction between proper and improper mixed states in https://www.amazon.com/dp/0198509146/?tag=pfamazon01-20.
 
  • #72
atyy said:
It is not much ado about nothing. Not distinguishing these has misled some into believing that decoherence solves the measurement problem, including physicists as distinguished as Anderson. If you don't like interpretation, it must be noted that most great physicists cared deeply about it. In modern times, one can read the comments about the importance of the distinction between proper and improper mixed states in https://www.amazon.com/dp/0198509146/?tag=pfamazon01-20.

Yeah, with the various threads on why quantum mechanics is not weird, I've been trying to clarify in my mind exactly why I still think it is weird. It's definitely the measurement problem, but I have a hard time formalizing exactly why it bothers me. But roughly speaking, orthodox quantum mechanics seems a little schizophrenic. On the one hand, most people like to assume that there is nothing going on in a measurement process that cannot be explained by quantum mechanics. But if you try to describe the whole composite system (system being measured plus system doing the measuring) using quantum mechanics, then I don't see that anything vaguely like the QM collapse postulate--after a measurement, the system is an eigenstate of the property being measured--happens. Nor do I see that anything vaguely like the more minimal description--you get some eigenvalue with probabilities given by the Born rule--happens, either. If we are using QM to describe the composite system, then it's hard to see why there should be definite outcomes for measurements at all, or why probabilities come into play at all.

Decoherence is where the schizophrenia comes in. If you take the density matrix of the complete system, and trace out the environmental degrees of freedom, then you end up with a mixed-state density matrix. But then people want to interpret the mixed state using the ignorance interpretation. That doesn't make sense to me--you KNOW that the mixed state didn't arise from ignorance about the true state, because you just created the mixed state by tracing out the environmental degrees of freedom. It seems as though you're willfully forgetting what you just did.

So to me, orthodox QM just doesn't make sense. Maybe one of the other interpretations--objective collapse, or many-worlds, or Bohmian mechanics--makes sense, but the orthodox interpretation doesn't. It seems like people are willfully fooling themselves.
 
  • #73
Well, maybe in realizing that you have to think hard to figure out what bothers you, you've really discovered that there's nothing to be bothered about. Quantum theory tells us that nature is inherently stochastic/probabilistic/statistical. So what? That's how it is!

What is (or was for quite a while) an interesting theoretical challenge is that in our everyday experience macroscopic objects obey almost exactly the laws of classical physics, and we do not see quantum interference effects in macroscopic objects. That's why it took some time to discover quantum behavior (starting with black-body radiation in the late 1880s). I think, contrary to what atyy said in #71, that this is clearly solved by decoherence: we are simply not able to resolve the fast microscopic dynamics of many-body systems. So we get the classical world from coarse-graining the description down to the macroscopically relevant slow observables at macroscopic scales. It must also have to do with the formalism of the renormalization group in QFT/statistical physics. The Wilsonian interpretation is precisely that picture of effective theories at low energy-momentum scales (slow, long-distance variations) emerging from more microscopic theories, which reveal themselves only at high energy-momentum (fast, short-distance) scales. In this sense classical theory is an effective theory of quantum theory with some range of applicability.

The so-called measurement problem is then simply the question of how microscopic systems, sufficiently isolated from the environment to reveal quantum behavior, interact with the measurement apparatus, which provides "the environment"; the necessary classicality of measurement apparatus was already discussed by Bohr in the early 1930s (i.e., before Heisenberg confused the quantum community with his collapse in the 1950s ;-)).

What always bothered me, before I learned about the work on decoherence, was this quantum-classical cut, introduced ad hoc as an explanation for the classical behavior of measurement apparatus, and the even more ad hoc assumption of a collapse of the state, which in almost all real measurements never occurs, because the quantum object is "destroyed" in the measurement process and thus it's not even necessary to find a description of it as an isolated quantum system anymore. What happens at or shortly after the "measurement" is entirely a property of the measurement apparatus and not of a general theory/model of the world.
 
  • #74
vanhees71 said:
Well, maybe in realizing that you have to think hard to figure out what bothers you, you've really discovered that there's nothing to be bothered about. Quantum theory tells us that nature is inherently stochastic/probabilistic/statistical. So what? That's how it is!

I am not arguing with either you or stevendaryl here (I have a great deal of sympathy for both positions), but the only takeaway here may be that the two of you have different thresholds for weirdness. There is a strong element of personal taste involved when considering whether an internally consistent and empirically supported position is also satisfactory.
 
  • #75
vanhees71 said:
Well, maybe in realizing that you have to think hard to figure out what bothers you, you've really discovered that there's nothing to be bothered about. Quantum theory tells us that nature is inherently stochastic/probabilistic/statistical.

But it doesn't tell us that, at all. The only way that probabilities come into it is by the dubious steps of separating the measurement apparatus from the thing being measured, and then treating the former in a way that is inconsistent with the way the latter is treated.
 
  • #76
stevendaryl said:
But it doesn't tell us that, at all. The only way that probabilities come into it is by the dubious steps of separating the measurement apparatus from the thing being measured, and then treating the former in a way that is inconsistent with the way the latter is treated.

Specifically, you treat the system being measured as something whose state evolves unitarily according to Schrödinger's equation, and you treat the measuring device as something that has definite outcomes for measurements. That seems inconsistent to me.
 
  • #77
vanhees71 said:
(i.e., before Heisenberg confused the quantum community with his collapse in the 1950s ;-)).
It was von Neumann who, in his 1932 book, where he made QM mathematically fully respectable, also made the collapse (then called state reduction) definite and prominent. Bohm then coined the name "collapse" for state reduction in 1951. From 1986 onward, the quantum optics community observed the collapse as quantum jumps in certain continuous measurements of single atoms in an ion trap, so that it is now in various quantum optics books; see, e.g., Section 8.2 of Gerry & Knight 2005.

It is not appropriate to blame Heisenberg for all this - I don't even know what Heisenberg contributed.
 
  • #78
Nugatory said:
I am not arguing with either you or stevendaryl here (I have a great deal of sympathy for both positions), but the only takeaway here may be that the two of you have different thresholds for weirdness. There is a strong element of personal taste involved when considering whether an internally consistent and empirically supported position is also satisfactory.

But that is not the issue. Bohr's position is fine - it's weird, live with it; we can do science with it. Dirac's position is also fine - it's weird, but it will presumably be solved by quantum theory not being the final theory.

What vanhees71 is claiming is that there is no measurement problem and no classical/quantum cut in a minimal interpretation - i.e., without BM or MWI. Vanhees71's claim is extremely controversial, and as far as I can tell, it is wrong, and not a matter of taste. The book by Haroche and Raimond rebuts vanhees71's position that decoherence solves the measurement problem.
 
  • #79
stevendaryl said:
If you take the density matrix of the complete system, and trace out the environmental degrees of freedom, then you end up with a mixed-state density matrix. But then people want to interpret the mixed state using the ignorance interpretation. That doesn't make sense to me--you KNOW that the mixed state didn't arise from ignorance about the true state, because you just created the mixed state by tracing out the environmental degrees of freedom. It seems as though you're willfully forgetting what you just did.

So to me, orthodox QM just doesn't make sense.
This only proves that the talk in orthodox QM about ignorance doesn't make sense. Once one accepts that the mixed state obtained by tracing out the environmental degrees of freedom is all there is to a state of a subsystem, nothing depends anymore on knowledge or ignorance. The mixed state is a complete description of the single system. In rare cases it happens to be a pure state, for example when one looks at a single silver atom in a Stern-Gerlach experiment, projects the state to the region occupied by one of the beams produced, and traces over all degrees of freedom except the silver atom spin. Every case of a preparation of a pure state can be explained in a similar way. Thus there is nothing at all that depends on knowledge or ignorance - except the common talk in the textbooks.
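A toy numerical sketch of that silver-atom example (my own encoding, assuming one qubit for the spin and one qubit for which beam region the atom occupies):

```python
import numpy as np

# State after the Stern-Gerlach magnet, basis |spin, beam>:
# |Psi> = (|up, upper> + |down, lower>)/sqrt(2)
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi)

# Without projection, the reduced spin state is maximally mixed:
rho_spin = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(rho_spin.round(3))        # diag(0.5, 0.5)

# Project onto the region of the upper beam, renormalize, then trace
# over everything except the spin:
P_upper = np.kron(np.eye(2), np.diag([1.0, 0.0]))  # projector onto |upper>
rho_proj = P_upper @ rho @ P_upper
rho_proj /= np.trace(rho_proj)
rho_spin_pure = np.trace(rho_proj.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(rho_spin_pure.round(3))   # |up><up|: a pure spin state, as claimed
```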
 
  • #80
A. Neumaier said:
This only proves that the talk in orthodox QM about ignorance doesn't make sense. Once one accepts that the mixed state obtained by tracing out the environmental degrees of freedom is all there is to a state of a subsystem, nothing depends anymore on knowledge or ignorance. The mixed state is a complete description of the single system.

Okay, but a mixed state can potentially describe a nonzero probability of (say) a cat being dead and a cat being alive. If you don't want to talk about cats, you can replace them by any other two macroscopically distinguishable possibilities. The mixed-state formalism can account for a nonzero probability for two different macroscopically distinguishable possibilities. So either both possibilities are real (which to me means many-worlds), or only one of them is real, in which case a single possibility was somehow selected.
 
  • #81
stevendaryl said:
but a mixed state can potentially describe a nonzero probability of (say) a cat being dead and a cat being alive.
A theoretical mixed state, but not a mixed state realized in Nature according to the tracing out rule given - unless the state of the big system from which this state was obtained by tracing out the environment was already very weird. A mixed state is admissible in the arguments only if we can tell how to prepare it, given the laws of Nature and the tracing out rule. We can do that for pure spin states and for superpositions of tensor products of a few spin states, but even that only in carefully controlled situations. But no apparatus in the universe would prepare a cat in a mixed state of the kind you proposed. At least no known one - which is sufficient to explain why we don't observe these strange things. Nothing needs to be selected, since the state cannot be prepared in the first place.
 
  • #82
atyy said:
The book by Haroche and Raimond rebuts vanhees71's position that decoherence solves the measurement problem.
In which paragraph or page?
 
  • #83
A. Neumaier said:
A theoretical mixed state, but not a mixed state realized in Nature according to the tracing out rule given - unless the state of the big system from which this state was obtained by tracing out the environment was already very weird.

Well, part of the difficulty here is that we really can't do quantum mechanics with ##10^{23}## particles except in heuristic ways. So the weirdness is perhaps lost in the complexity. But it seems to me that you could set up a situation in which a microscopic difference (whether an electron is spin-up or spin-down) is magnified to make a macroscopic difference. That's what Schrödinger's cat is about. For that matter, that's what any measurement does. So if you consider it weird for a microscopic difference to be magnified to become a macroscopic difference, then such weirdness is an inherent part of the empirical content of QM.

Suppose you set things up so that:
  • The detection of a spin-up electron leads to a dead cat.
  • The detection of a spin-down electron leads to a live cat.
Then you create an electron that is in a superposition ##\alpha |\text{up}\rangle + \beta |\text{down}\rangle##, and you send it to the detector. What happens? Well, the Copenhagen interpretation would tell us that macroscopic objects like cats are classical, not quantum. So rather than leading to a superposition of a dead cat and a live cat, what we would get is EITHER a dead cat, with probability ##|\alpha|^2##, or a live cat, with probability ##|\beta|^2##. But that seems inconsistent to me. Why, for small systems, do we get superpositions, rather than alternatives, but for large systems, we get alternatives? That's the weirdness, if not outright inconsistency, of standard quantum mechanics.

Of course, some people claim that decoherence explains why we get alternatives rather than superpositions, but I don't think it actually does that. What it explains is that superpositions rapidly spread with time: you start off with a single particle in a superposition of states; it interacts with more particles, putting that composite system into a superposition; and that composite system interacts with the environment (the electromagnetic field), putting it into a superposition of states. The superposition doesn't go away, but it spreads to infect the whole universe (or our little part of it, anyway). But then a trace over everything other than the system of interest gives us what looks like a mixed state, where we can interpret the components of the mixture as alternatives rather than superpositions.
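A toy sketch of that spreading in numbers (my own illustration, assuming each environment particle picks up a partial which-way record whose two versions overlap by ##\cos 2\theta##):

```python
import numpy as np

theta = 0.2  # how distinguishable one collision makes the two branches

# Environment states correlated with system "up" vs "down" after one collision
e_up   = np.array([np.cos(theta),  np.sin(theta)])
e_down = np.array([np.cos(theta), -np.sin(theta)])
overlap = abs(e_up @ e_down)  # <e_up|e_down> = cos(2*theta) < 1

# After n collisions the n-particle environment states overlap as overlap**n,
# and this factor multiplies the system's off-diagonal (interference) terms.
for n in (1, 10, 100, 1000):
    print(f"n = {n:4d}   remaining coherence = {overlap**n:.3e}")
# The superposition never disappears from the composite state; the system's
# local coherence just decays exponentially as the record spreads.
```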
 
  • #84
I think that's a misunderstanding of decoherence. We don't suddenly change the interpretation when we compute reduced density matrices. In fact, we never need to compute the reduced density matrix for decoherence. We could just work with the full quantum state. It's only a matter of convenience to compute the reduced density matrix. Quantum mechanics is a theory that predicts relative frequencies for certain events. It provides us with a probability distribution for each observable. In fact, we could get rid of the Hilbert space and operators completely and reformulate QM purely as a bunch of evolution equations for these probability distributions. Decoherence explains why those probability distributions don't usually exhibit oscillatory behaviour. For example, it explains why the probability distribution for the throw of a die is ##P_i = \frac{1}{6}## rather than, say, ##P_1 = P_3 = P_5 = \frac{1}{3}, P_2 = P_4 = P_6 = 0##. So decoherence explains why the probability distributions that QM predicts agree with those that we would expect classically.
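A two-outcome toy version of that claim (my own example, not rubi's; a qubit "coin" measured in phase-rotated bases plays the role of the die):

```python
import numpy as np

# "Coin" in an equal superposition: |psi> = (|0> + |1>)/sqrt(2)
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_coherent  = np.outer(psi, psi)               # off-diagonals present
rho_decohered = np.diag(np.diag(rho_coherent))   # decoherence deletes them

def prob(rho, phi):
    """Probability of the outcome |v> = (|0> + e^{i phi}|1>)/sqrt(2)."""
    v = np.array([1.0, np.exp(1j * phi)]) / np.sqrt(2)
    return float((v.conj() @ rho @ v).real)

for phi in (0.0, np.pi / 2, np.pi):
    print(f"phi = {phi:.2f}  coherent: {prob(rho_coherent, phi):.2f}"
          f"  decohered: {prob(rho_decohered, phi):.2f}")
# Coherent:  1.00, 0.50, 0.00 -- probabilities oscillate with the phase.
# Decohered: 0.50, 0.50, 0.50 -- flat, classical-coin statistics.
```

Decoherence doesn't change the diagonal probabilities; it removes the phase-dependent interference that would make them look non-classical.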

What more do you expect from a physical theory than a prediction of relative frequencies? And if you don't expect more, then why does QM have problems?
 
  • #85
rubi said:
Quantum mechanics is a theory that predicts relative frequencies for certain events.

I don't think it really does that. Can you say what an "event" is, without making a macroscopic/microscopic distinction?

[edit]What I should have said is that I don't think quantum mechanics gives probabilities (relative or otherwise) without additional assumptions that seem ad hoc.
 
  • #86
stevendaryl said:
I don't think it really does that. Can you say what an "event" is, without making a macroscopic/microscopic distinction?
Yes, I think so: Let ##A## be any observable you want. Let ##\sigma(A)## be its spectrum and ##\psi_a## be the generalized eigenvectors of ##A##. The set of events for this observable is ##\mathcal B(\sigma(A))##, the smallest sigma algebra that contains all the open sets of ##\sigma(A)##, and for each such event ##B##, its probability is given by ##P(B) = \int_B |\left<\psi_a,\Psi\right>|^2\,\mathrm d a##. For example, ##A## could be the position operator ##\hat x(t)## at time ##t## and ##B## could just be the event "the position at time ##t## lies between 2 and 3", which would mathematically be represented by the interval ##B=(2,3)##. This should account for every event you could think of.
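The ##B=(2,3)## example can be evaluated directly. A sketch assuming a hypothetical normalized Gaussian ##\Psi## (my choice of state, nothing from the post) and position as the observable, so that ##P(B)=\int_B |\Psi(x)|^2\,\mathrm dx##:

```python
import numpy as np
from scipy.integrate import quad

def Psi(x, mu=2.5, sigma=1.0):
    """A normalized Gaussian wavefunction (hypothetical example state)."""
    return (2 * np.pi * sigma**2) ** -0.25 * np.exp(-(x - mu)**2 / (4 * sigma**2))

# Event B = (2, 3): "the position lies between 2 and 3"
p_B, _ = quad(lambda x: abs(Psi(x))**2, 2.0, 3.0)
print(p_B)    # ~0.383

# The probabilities over the whole spectrum integrate to 1 (Born rule)
p_all, _ = quad(lambda x: abs(Psi(x))**2, -np.inf, np.inf)
print(p_all)  # ~1.0
```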
 
  • #87
rubi said:
Yes, I think so: Let ##A## be any observable you want. Let ##\sigma(A)## be its spectrum and ##\psi_a## be the generalized eigenvectors of ##A##. The set of events for this observable is ##\mathcal B(\sigma(A))##, the smallest sigma algebra that contains all the open sets of ##\sigma(A)##, and for each such event ##B##, its probability is given by ##P(B) = \int_B |\left<\psi_a,\Psi\right>|^2\,\mathrm d a##. For example, ##A## could be the position operator ##\hat x(t)## at time ##t## and ##B## could just be the event "the position at time ##t## lies between 2 and 3", which would mathematically be represented by the interval ##B=(2,3)##. This should account for every event you could think of.

Okay, this is definitely NOT the standard way of presenting quantum mechanics, which is what I had a complaint about. But let me take your presentation. It does not, so far, have any connection to anything with empirical content. To make a connection with something observable, you have to associate probabilities with measurement outcomes. Which means that you have to face the measurement problem, of what does it mean to measure some observable?
 
  • #88
stevendaryl said:
Okay, this is definitely NOT the standard way of presenting quantum mechanics, which is what I had a complaint about.
The formula ##P(B)## I wrote down is just the Born rule. I just wrote it in a way that allows you to directly plug in the events ##B## that you are interested in. I think it is fairly standard, at least we regularly teach it this way at my university.

But let me take your presentation. It does not, so far, have any connection to anything with empirical content. To make a connection with something observable, you have to associate probabilities with measurement outcomes.
The probabilities are given by ##P(B)##. For each observable, QM allows you to compute such a probability distribution. Let's say we measure the spin of a particle. Then my formula would give you probabilities ##P_\uparrow = P(\{\uparrow\})## and ##P_\downarrow = P(\{\downarrow\})##. These are the probabilities that predict the relative frequencies of spin measurements.

Which means that you have to face the measurement problem, of what does it mean to measure some observable?
I don't understand this question. Can you explain how you would answer this question in the case of classical mechanics and how it would be different from quantum mechanics? What would it mean to measure an observable in CM?
 
  • #89
rubi said:
I don't understand this question. Can you explain how you would answer this question in the case of classical mechanics and how it would be different from quantum mechanics? What would it mean to measure an observable in CM?

To measure an observable means to set things up so that there is a correspondence between possible values of the observable and macroscopically distinguishable states of the measuring device. An example might be a pointer that pivots in a semicircle. Then you set things up so that the angle of the pointer is affinely related to the value of a real-valued observable.

Implicit in this is the assumption that the pointer actually has a definite value. If the pointer could be in a superposition of positions, then I don't know what it would mean to say that it measures an observable. And that's the case with quantum mechanics. If the system being measured is in a superposition of different values of an observable, and you let the system interact with a measurement device, I would expect (if we analyzed the measurement device itself using quantum mechanics) the result to be that the measurement device would be put into a superposition of states. (or that a larger system, including measuring device + environment, would be put into a superposition of states).
 
  • #90
The problem, which to me seems like an inconsistency in the quantum formalism, is that for a small system, such as a single electron, observables don't have definite values, in general. If an electron has spin state ##\begin{pmatrix} \alpha \\ \beta \end{pmatrix}##, what is the z-component of its spin? The question doesn't have an answer. It's in a superposition of spin-up and spin-down. But if you take a macroscopic system such as a detector, and you measure the z-component of the spin, you don't get a superposition of answers, you get either spin-up or spin-down. The macroscopic system has a definite state.
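To make "the question doesn't have an answer" quantitative (standard spin-1/2 algebra; this just spells out what the post asserts):

$$P(\uparrow)=|\alpha|^2,\qquad P(\downarrow)=|\beta|^2,\qquad \langle S_z\rangle=\frac{\hbar}{2}\left(|\alpha|^2-|\beta|^2\right),$$
$$\operatorname{Var}(S_z)=\langle S_z^2\rangle-\langle S_z\rangle^2=\frac{\hbar^2}{4}\left[1-\left(|\alpha|^2-|\beta|^2\right)^2\right]=\hbar^2|\alpha|^2|\beta|^2,$$

which vanishes only if ##\alpha=0## or ##\beta=0##: any genuine superposition carries an irreducible statistical spread in ##S_z##, not merely an unknown definite value.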

Why do macroscopic systems have definite states, if microscopic systems don't?
 
  • #91
stevendaryl said:
To measure an observable means to set things up so that there is a correspondence between possible values of the observable and macroscopically distinguishable states of the measuring device. An example might be a pointer that pivots in a semicircle. Then you set things up so that the angle of the pointer is affinely related to the value of a real-valued observable.

Implicit in this is the assumption that the pointer actually has a definite value. If the pointer could be in a superposition of positions, then I don't know what it would mean to say that it measures an observable. And that's the case with quantum mechanics. If the system being measured is in a superposition of different values of an observable, and you let the system interact with a measurement device, I would expect (if we analyzed the measurement device itself using quantum mechanics) the result to be that the measurement device would be put into a superposition of states. (or that a larger system, including measuring device + environment, would be put into a superposition of states).
There is a difference between the mathematical formalism and reality. The fact that QM uses the mathematics of Hilbert spaces and superpositions doesn't mean that the concept of superposition somehow applies to real objects. It can only apply to mathematical objects, like vectors in a Hilbert space. The prediction of QM isn't that something is in a superposition. The prediction is rather that we will find the pointer at ##0^\circ## 50% of the time and at ##180^\circ## 50% of the time (for example). Superpositions are just an intermediate mathematical tool that allows us to obtain the numerical values for these relative frequencies, much like virtual particles are an intermediate mathematical tool. The correspondence between measurement apparatuses and the mathematics is given by observables. Every apparatus is mathematically represented as a self-adjoint operator. That doesn't mean that the apparatus is a self-adjoint operator, which of course it isn't. We use the phrase "the particle is in a superposition" just as a metaphor. It really means "the relative frequencies that describe the particle can be adequately modeled using the mathematics of superposition".
 
  • #92
rubi said:
There is a difference between the mathematical formalism and reality

Okay, fine. If you want to say that QM is just a recipe for getting answers, that's the "shut up and calculate" interpretation, which is fine, as far as it goes.
 
  • #93
atyy said:
The book by Haroche and Raimond rebuts vanhees71's position that decoherence solves the measurement problem.
I only found this:
"
We can say, according to Einstein terminology, that the death or life of the cat has, even before being recorded by a human mind, become an element of reality (since all entanglement has been destroyed by decoherence), but this element of reality cannot be predicted, only its probability can be estimated. Some physicists find this state of affairs uncomfortable. Others are ready to accept this inherently statistical feature of quantum theory."
 
  • #94
naima said:
I only found this:
"
We can say, according to Einstein terminology, that the death or life of the cat has, even before being recorded by a human mind, become an element of reality (since all entanglement has been destroyed by decoherence), but this element of reality cannot be predicted, only its probability can be estimated. Some physicists find this state of affairs uncomfortable. Others are ready to accept this inherently statistical feature of quantum theory."

If I remember correctly, Haroche and Raimond discuss decoherence and the measurement problem extensively around p81.
 
  • #95
stevendaryl said:
The problem, which to me seems like an inconsistency in the quantum formalism, is that for a small system, such as a single electron, observables don't have definite values, in general. If an electron has spin state ##\begin{pmatrix} \alpha \\ \beta \end{pmatrix}##, what is the z-component of its spin? The question doesn't have an answer. It's in a superposition of spin-up and spin-down. But if you take a macroscopic system such as a detector, and you measure the z-component of the spin, you don't get a superposition of answers, you get either spin-up or spin-down. The macroscopic system has a definite state.

Why do macroscopic systems have definite states, if microscopic systems don't?
If you drop the idea that mathematical terms can be directly applied to real objects ("ceci n'est pas une pipe"), this problem vanishes. A state is a mathematical representation of reality. A particle doesn't really have a position (i.e. a real number). There is no internal counter within the particle or anything like that. The real number that we ascribe to the particle is just our mathematical representation of facts about reality. You need to distinguish these concepts clearly. The idea that a list of real numbers is enough to capture all the details about the reality of a particle is flawed, and the violations of Bell's inequality show that this idea can't possibly be saved (BM doesn't save it either). It's impossible for a theory to assign definite values to both the up/down and the left/right spin components if the theory is supposed to agree with experiments. It is a fundamental fact about our world that this can't be done (unless you want to exploit loopholes), so a theory that acknowledges this fact can't be problematic because of it. If anything, the universe is problematic.
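That last fact is the CHSH form of Bell's inequality, and it can be checked in a few lines (standard singlet correlations with the textbook angle choice; my own illustration):

```python
import numpy as np

def E(a, b):
    """Singlet-state correlation for spin measurements along angles a and b."""
    return -np.cos(a - b)

# Textbook CHSH angle choice
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.83 > 2, the bound for any definite-value assignment
```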

Macroscopic systems don't have definite states (a list of real numbers that defines their physics completely) either. It's just that assuming they do is good enough for all practical purposes.

stevendaryl said:
Okay, fine. If you want to say that QM is just a recipe for getting answers, that's the "shut up and calculate" interpretation, which is fine, as far as it goes.
I'm saying that QM satisfies all properties that a physical theory must have and it doesn't have inconsistencies.
 
  • #96
stevendaryl said:
Why do macroscopic systems have definite states, if microscopic systems don't?
What do you say to the answer given in the discussion in posts #83 - #109 of another thread?
 
  • #97
rubi said:
If you drop the idea that mathematical terms can be directly applied to real objects

As I said, that's the "shut up and calculate" interpretation, which I agree works fine.
 
  • #98
rubi said:
I'm saying that QM satisfies all properties that a physical theory must have and it doesn't have inconsistencies.

And I'm saying that I don't agree. You're basically taking the Copenhagen, or "shut up and calculate", approach, which to me is inconsistent. It requires treating macroscopic objects in a way that is inconsistent with the way it treats microscopic objects. Since macroscopic objects are presumably made up of microscopic objects, that seems inconsistent to me.

You could say, as the Copenhagen people did, that no, macroscopic objects aren't made of microscopic objects. The microscopic world doesn't exist, it's just a mathematical fiction for doing calculations. That's fine. But then you need a different theory for macroscopic objects in order to build detectors and so forth. What theory is that? Copenhagen said that we basically treat macroscopic objects classically, which is fine as a heuristic. But to have two different theories--one for macroscopic objects and another for microscopic objects--is very distasteful to me.
 
  • #99
stevendaryl said:
As I said, that's the "shut up and calculate" interpretation, which I agree works fine.
But what more do you expect from a physical theory than a prediction of all relative frequencies?

It seems that you want the theory to assign a list of real numbers to each physical entity. This is not possible in our universe. So if the theory fails to do this, we should not blame the theory.

stevendaryl said:
And I'm saying that I don't agree.
If you claim that there is an inconsistency, you should be able to derive a contradiction from QM, i.e. you should be able to derive a statement of the form ##A\wedge\neg A##. Can you tell me what that statement ##A## could be?
 
  • #100
stevendaryl said:
Then you create an electron that is in a superposition ##\alpha |\text{up}\rangle + \beta |\text{down}\rangle##, and you send it to the detector. What happens? Well, the Copenhagen interpretation would tell us that macroscopic objects like cats are classical, not quantum. So rather than leading to a superposition of a dead cat and a live cat, what we would get is EITHER a dead cat, with probability ##|\alpha|^2##, or a live cat, with probability ##|\beta|^2##. But that seems inconsistent to me. Why, for small systems, do we get superpositions, rather than alternatives, but for large systems, we get alternatives?
The Copenhagen interpretation says (independent of the size of the system) that the state collapses upon measurement, giving the definite outcome rather than the superposition.
 
  • #101
rubi said:
But what more do you expect from a physical theory than a prediction of all relative frequencies?

I do not believe that continuing to discuss this with you would be fruitful. You are obviously uninterested in my concerns. That's fine, but there is no basis for discussion.
 
  • #102
stevendaryl said:
I do not believe that continuing to discuss this with you would be fruitful. You are obviously uninterested in my concerns. That's fine, but there is no basis for discussion.
I am totally interested in your concerns, I just appear to not understand them.
 
  • #103
stevendaryl said:
And I'm saying that I don't agree. You're basically taking the Copenhagen, or "shut up and calculate", approach, which to me is inconsistent. It requires treating macroscopic objects in a way that is inconsistent with the way it treats microscopic objects. Since macroscopic objects are presumably made up of microscopic objects, that seems inconsistent to me.

You could say, as the Copenhagen people did, that no, macroscopic objects aren't made of microscopic objects. The microscopic world doesn't exist, it's just a mathematical fiction for doing calculations. That's fine. But then you need a different theory for macroscopic objects in order to build detectors and so forth. What theory is that? Copenhagen said that we basically treat macroscopic objects classically, which is fine as a heuristic. But to have two different theories--one for macroscopic objects and another for microscopic objects--is very distasteful to me.
When I wrote my earlier post, your post contained only the first sentence.

I'm not doing Copenhagen. I'm probably closer to MWI or consistent histories. These interpretations don't require treating microscopic objects differently from macroscopic ones. They are treated exactly the same way. I'm basically taking MWI, but I don't accept the reality of the different branches. I just take the part of MWI that predicts all the relative frequencies correctly. I claim that the relative frequencies from MWI are all we need, and the idea that all the different branches really exist is nonsense or at best physically not relevant. (I think this is sometimes called the post-Everett interpretation or so.)
 
  • #104
rubi said:
I'm not doing Copenhagen. I'm probably closer to MWI or consistent histories.

Well, my complaints about standard quantum mechanics don't apply to MWI or consistent histories. There are other concerns about those, but that's a completely different subject.
 
  • #105
stevendaryl said:
Well, my complaints about standard quantum mechanics don't apply to MWI or consistent histories. There are other concerns about those, but that's a completely different subject.
Well, nowadays the old Copenhagen idea of a wave-function collapse is not taken seriously by the vast majority of physicists anymore. Of course, it's still used as a convenient mathematical shortcut, because even a simple calculation might become a PhD thesis in post-Everett interpretations. However, apart from the inconvenience, if one uses QM without collapse purely as a tool to obtain relative frequencies that can be compared to experiment, I don't think there is any problem or inconsistency in the theory. Of course, there is still some weirdness left, which is forced upon us by Bell, but as I said, we can't blame the theory for that. (I would consider every non-classical theory weird, but the world just happens to require a non-classical theory.)
 
