The typical and the exceptional in physics

In summary, the conversation discusses the superposition principle in quantum mechanics and its implications for macroscopic objects. While quantum mechanics places no limit on the standard deviation of a variable, it is argued that successful physics focuses on typical situations rather than exceptional ones. The use of mixed states in statistical mechanics is mentioned as a way to describe macroscopic objects, but it is noted that this already assumes a small standard deviation. The conversation concludes that while it is possible to ignore these problems, doing so is not a satisfying approach.
  • #316
Demystifier said:
That's very difficult, because vanhees71 has a very complex personality. By applying some amateur psychoanalysis on him, I arrived at the following conclusions:
1. No doubt, he is a very smart guy.
2. He is excellent in the shut-up-and-calculate "interpretation", and when he sticks to that kind of business he is usually consistent.
3. However, he is not completely satisfied with the shut-up-and-calculate business. He has a need to say something more about interpretations.
4. He also thinks that interpretations are irrelevant to physics.
5. Unfortunately, the facts 3. and 4. constitute a contradiction. This contradiction is the main source of complexity in his personality.
6. He tries to reconcile the contradiction between 3. and 4. by defending a sort of minimal interpretation.
7. However, the minimal interpretation does not really satisfy him, so sometimes in his arguments he goes beyond the minimal interpretation. This further increases inconsistency of his arguments and complexity of his personality.
8. Of course, it is very unlikely that he would admit that the above is true (except 1. and 2.)

In short, an interesting combination of high intelligence and unsharp views on quantum interpretations makes the discussions with him very challenging. :smile:

I don't see why it is a contradiction that my interests go beyond physics. I also think that the minimal interpretation is the only consistent one, but it's interesting to discuss other interpretations as well. So I agree with everything except 5, since it's no contradiction to consider something irrelevant for science while nevertheless being interested in it ;-).
 
  • Like
Likes atyy and Demystifier
  • #317
A. Neumaier said:
Isn't that a consequence of the Heisenberg uncertainty principle? I find that nobody here has both high intelligence and completely sharp views. These are strictly complementary variables.
That's correct. But some of the people here are not so far from a coherent state, which is a state that minimizes the product of uncertainties (product of inverse intelligence and unsharpness). vanhees71 is not close to a coherent state, which is probably good for creativity.
 
  • Like
Likes kith
  • #318
vanhees71 said:
So there are at least some observable and testable consequences of the prediction of black holes (space-time singularities), and thus it's science.

The universe as a whole is unobservable
There are at least some observables of the universe as a whole, for example the mass density in the observable part. There are also testable consequences, obtained by restriction of its unitary dynamics to some observable part and a semiclassical FLRW approximation for the remainder, which produces enough dissipation to decohere everything, and the framework for inflation studies that can be tested through observation of the microwave background.

Thus, according to your criterion for black holes, it's science, too.
 
  • #319
Demystifier said:
That's very difficult, because vanhees71 has a very complex personality. By applying some amateur psychoanalysis on him, I arrived at the following conclusions:
1. No doubt, he is a very smart guy.
2. He is excellent in the shut-up-and-calculate "interpretation", and when he sticks to that kind of business he is usually consistent.
3. However, he is not completely satisfied with the shut-up-and-calculate business. He has a need to say something more about interpretations.
4. He also thinks that interpretations are irrelevant to physics.
5. Unfortunately, the facts 3. and 4. constitute a contradiction. This contradiction is the main source of complexity in his personality.
6. He tries to reconcile the contradiction between 3. and 4. by defending a sort of minimal interpretation.
7. However, the minimal interpretation does not really satisfy him, so sometimes in his arguments he goes beyond the minimal interpretation. This further increases inconsistency of his arguments and complexity of his personality.
8. Of course, it is very unlikely that he would admit that the above is true (except 1. and 2.)

In short, an interesting combination of high intelligence and unsharp views on quantum interpretations makes the discussions with him very challenging. :smile:

I believe the wave function of vanhees71 is real :)
 
  • Like
Likes Simon Phoenix
  • #320
vanhees71 said:
So I agree with everything except 5
So, you agree that you are not completely satisfied with the minimal (ensemble) interpretation? :wink:
 
  • #321
A. Neumaier said:
Thanks for the papers. I knew Rydberg states, but their variance is not even bounded.

Note that the first paper only treats a 1D mock version of hydrogen, with bounded variance. The second paper treats the real thing and points out: ''That means that there will be a total dephasing in ##\phi##''.
Well, the second paper treats Klauder's hydrogen coherent states (see the link in my post #296) and you're ignoring half of the paper. It also states that ##\theta##, ##r##, the Lenz-Runge vector and the eccentricity vector have bounded variance. Moreover, the first paper, which explains Klauder's construction, says that the energy has bounded variance for high quantum numbers. ##\phi## is the only variable that has unbounded variance, but its variance grows very slowly for celestial bodies. This is what I said in post #299.

vanhees71 said:
I thought Hawking radiation is radiation due to quantum fluctuations around the event horizon of a black hole. Where do you need the wave function of the universe for that?
No, there is no derivation of Hawking radiation that goes like this. For some reason, however, popularizers explain it this way. The actual derivation of the Hawking effect is not even close to that. What Hawking really does is decompose the wave function of the universe into field modes that end up in the black hole and modes that reach future infinity. He then traces out the modes that end up in the black hole and finds a thermal state. Hawking's original paper is a bit dense, but it's explained well in Fabbri's book "Modeling Black Hole Evaporation".
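To make the structure of that derivation concrete: the Bogoliubov transformation across the horizon pairs interior and exterior modes in (to a good approximation) a two-mode squeezed state, and tracing out the interior member of each pair leaves the exterior mode thermal. Below is a minimal NumPy sketch of just that partial-trace step for a single pair of modes; the squeezing parameter r is an arbitrary illustrative choice, and this is a toy stand-in for the full field-theoretic calculation, not Hawking's actual computation.

```python
import numpy as np

# Toy model: a two-mode squeezed vacuum pairing an "interior" and an
# "exterior" mode, |psi> = sqrt(1 - t^2) * sum_n t^n |n, n>, with
# t = tanh(r).  Tracing out the interior mode leaves the exterior mode
# in a thermal (Bose-Einstein) state -- the structural core of the
# Hawking/Unruh effect.  The squeezing parameter r is an arbitrary choice.

r = 1.0                      # assumed squeezing parameter
t = np.tanh(r)
N = 40                       # Fock-space truncation

n = np.arange(N)
coeff = np.sqrt(1 - t**2) * t**n          # Schmidt coefficients c_n
psi = np.zeros((N, N))
psi[n, n] = coeff                          # |psi> = sum_n c_n |n>_in |n>_out

# Reduced density matrix of the exterior mode: rho_out = Tr_in |psi><psi|
rho_out = np.einsum('ij,ik->jk', psi, psi)

# Compare with a thermal state p_n = (1 - t^2) t^(2n)
p_thermal = (1 - t**2) * t**(2 * n)

print(np.allclose(np.diag(rho_out), p_thermal))          # True
print(np.allclose(rho_out, np.diag(np.diag(rho_out))))   # off-diagonals vanish
print("mean occupation:", np.sum(n * np.diag(rho_out)),
      "vs sinh^2(r) =", np.sinh(r)**2)
```

With the standard identification of ##\tanh^2 r## with the Boltzmann factor ##e^{-\hbar\omega/k_B T_H}##, the mean occupation ##\sinh^2 r## becomes the Bose-Einstein occupation at the Hawking temperature.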
 
  • Like
Likes dextercioby
  • #322
atyy said:
I believe the wave function of vanhees71 is real :)

As far as I can tell he hasn't collapsed yet o_O
 
  • Like
Likes atyy, vanhees71 and Demystifier
  • #323
vanhees71 said:
I don't see why it is a contradiction that my interests go beyond physics.
OK, perhaps I should have said "tension" (rather than contradiction).
 
  • Like
Likes vanhees71
  • #324
vanhees71 said:
I don't see why it is a contradiction that my interests go beyond physics. I also think that the minimal interpretation is the only consistent one, but it's interesting to discuss other interpretations as well. So I agree with everything except 5, since it's no contradiction to consider something irrelevant for science while nevertheless being interested in it ;-).

Well, the disagreement is sharper than that.

You think that the locality of relativistic QFT is inconsistent with collapse.

While I am agnostic about the reality of collapse, all evidence I know of shows that the nonlocality of collapse is consistent with the locality of relativistic QFT.

So we reach the point in the discussion where I show that collapse does not affect the Hamiltonian of the system, and does not allow superluminal transmission, which you agree with.

So you criticize that I am not including the measurement apparatus in the Hamiltonian. But we are at the point where you haven't indicated whether there is a quantum state of the LHC (the measurement apparatus).

Other sharp points of disagreement are that I understand the minimal interpretation to have a collapse (or updating) and a Heisenberg cut, whereas it is not clear whether you believe there should be a Heisenberg cut or not. My minimal interpretation is agreed upon by Landau and Lifshitz, and by Weinberg.
 
  • Like
Likes Demystifier
  • #325
rubi said:
It also states that θ, r, the Lenz-Runge vector and the eccentricity vector have bounded variance.
But this alone doesn't make the motion classical. The variance of all variables must be bounded for that, and you had erroneously claimed that in post #296.
 
  • #326
atyy said:
I show that collapse does not affect the Hamiltonian of the system.
There is nothing to show here, as the Hamiltonian of a system is fixed: whatever the state is, it cannot affect the Hamiltonian. The influence goes only in the other direction.
 
  • #327
atyy said:
Well, the disagreement is sharper than that.

You think that the locality of relativistic QFT is inconsistent with collapse.

While I am agnostic about the reality of collapse, all evidence I know of shows that the nonlocality of collapse is consistent with the locality of relativistic QFT.

So we reach the point in the discussion where I show that collapse does not affect the Hamiltonian of the system, and does not allow superluminal transmission, which you agree with.

So you criticize that I am not including the measurement apparatus in the Hamiltonian. But we are at the point where you haven't indicated whether there is a quantum state of the LHC (the measurement apparatus).

Other sharp points of disagreement are that I understand the minimal interpretation to have a collapse (or updating) and a Heisenberg cut, whereas it is not clear whether you believe there should be a Heisenberg cut or not. My minimal interpretation is agreed upon by Landau and Lifshitz, and by Weinberg.

As I said before, in Landau & Lifshitz I cannot find the word collapse (by searching the electronic copy I have ;-)), and Weinberg holds the view that the interpretational questions are undecided (after a brilliant analysis in an early chapter of his QM lectures book). Then he happily goes on using the standard representation.

I don't think that a Heisenberg cut is in any way justified by the formalism of QT, nor is it in any way justified by observations. It's just a matter of technological challenge to isolate macroscopic systems from perturbations sufficiently to avoid decoherence and demonstrate quantum effects on them as well. You have to decide from case to case at which point in an experimental setup you can treat things (semi-)classically. I think Bohr was right in saying that a measurement apparatus should lie within the range of validity of the semi-classical description. It must be an open system, so that the information on the measurements made can be stored, which is an irreversible process.

The interaction between the measured object and the macroscopic measurement device is part of the Hamiltonian and as such, according to the successful relativistic QFT, a local interaction. The assumption that such a local interaction can cause far-distant instantaneous responses is thus a contradictio in adiecto. According to the standard (minimal) interpretation there's also no need for such an explanation; the far-distant correlations are described by entanglement. It's all standard QT (or functional analysis if you wish).
 
  • #328
A. Neumaier said:
But this alone doesn't make the motion classical. The variance of all variables must be bounded for that.
Well, either that, or it must grow very slowly, as you indicated.

We will never be able to construct coherent states that have bounded variance for all possible observables. One must always pick some small set of observables that should have this property. It's true that bounded variance for ##\phi## is desirable, but sufficiently slow growth of the variance for large quantum numbers makes the system just as classical as bounded variance. Since hydrogen atoms behave very non-classically for low quantum numbers, the dynamics is not expected to be classical in that case. After all, electrons apparently don't actually revolve around the nucleus on elliptic orbits.
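To put rough numbers on "grows very slowly": for a Kepler-like spectrum ##E_n = -1/(2n^2)## (so ##\omega(n) = 1/n^3## in natural units), a packet of width ##\Delta n## around ##\bar n## dephases in angle at a rate of order ##|d\omega/dn|\,\Delta n \approx 3\Delta n/\bar n^4##. A heuristic classical-ensemble sketch of this estimate is below; it is not Klauder's construction, and the values of ##\bar n##, ##\Delta n## and the times are made up purely for illustration.

```python
import numpy as np

# Heuristic classical-ensemble picture of azimuthal dephasing (not
# Klauder's coherent-state construction).  "Hydrogen-like" orbits with
# E_n = -1/(2 n^2) have angular frequency omega(n) = 1/n^3 in natural
# units.  A packet of width dn around n_bar contains a spread of
# frequencies, so the angular spread grows roughly linearly in time:
# dphi(t) ~ |d omega/dn| * dn * t = 3 dn t / n_bar^4.
# All numbers below are made up purely for illustration.

def phi_spread(n_bar, dn, t):
    """Linear-in-time estimate of the angular spread (radians)."""
    return 3.0 * dn / n_bar**4 * t

# A highly excited atom: n_bar ~ 100, dn ~ 10
print(phi_spread(1e2, 1e1, 1e6))       # ~0.3 rad: dephasing already noticeable

# A "celestial" packet: enormous n_bar, after a much longer time
print(phi_spread(1e20, 1e10, 1e12))    # ~3e-58 rad: utterly negligible

# Monte-Carlo check of the same statement: sample frequencies around
# omega(n_bar) and watch the circular spread of phi grow slowly.
rng = np.random.default_rng(0)
n_bar, dn = 100.0, 10.0
times = np.array([1e3, 1e6, 1e9, 1e12])
omegas = 1.0 / rng.normal(n_bar, dn, size=20000)**3
for t in times:
    R = np.abs(np.mean(np.exp(1j * omegas * t)))   # circular concentration
    # circular standard deviation; saturates once the packet is fully dephased
    print(f"t={t:.0e}  circular spread ~ {np.sqrt(-2*np.log(R)):.3g} rad")
```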
 
  • Like
Likes A. Neumaier and vanhees71
  • #329
vanhees71 said:
As I said before, in Landau & Lifshitz I cannot find the word collapse (by searching the electronic copy I have ;-)), and Weinberg holds the view that the interpretational questions are undecided (after a brilliant analysis in an early chapter of his QM lectures book). Then he happily goes on using the standard representation.

I don't think that a Heisenberg cut is in any way justified by the formalism of QT, nor is it in any way justified by observations. It's just a matter of technological challenge to isolate macroscopic systems from perturbations sufficiently to avoid decoherence and demonstrate quantum effects on them as well. You have to decide from case to case at which point in an experimental setup you can treat things (semi-)classically. I think Bohr was right in saying that a measurement apparatus should lie within the range of validity of the semi-classical description. It must be an open system, so that the information on the measurements made can be stored, which is an irreversible process.

The interaction between the measured object and the macroscopic measurement device is part of the Hamiltonian and as such, according to the successful relativistic QFT, a local interaction. The assumption that such a local interaction can cause far-distant instantaneous responses is thus a contradictio in adiecto. According to the standard (minimal) interpretation there's also no need for such an explanation; the far-distant correlations are described by entanglement. It's all standard QT (or functional analysis if you wish).

LL give the update rule, which I and many others call collapse. LL also don't use the term "Heisenberg cut", but they describe it.

Weinberg, although he says interpretation is not settled, still says the minimal interpretation has a Heisenberg cut and collapse.
 
  • #330
A. Neumaier said:
There is nothing to show here, as the Hamiltonian of a system is fixed: whatever the state is, it cannot affect the Hamiltonian. The influence goes only in the other direction.

Yes, that's how I showed it :)
 
  • #331
vanhees71 said:
As I said before, in Landau & Lifshitz I cannot find the word collapse (by searching the electronic copy I have ;-)), and Weinberg holds the view that the interpretational questions are undecided (after a brilliant analysis in an early chapter of his QM lectures book). Then he happily goes on using the standard representation.

I don't think that a Heisenberg cut is in any way justified by the formalism of QT, nor is it in any way justified by observations. It's just a matter of technological challenge to isolate macroscopic systems from perturbations sufficiently to avoid decoherence and demonstrate quantum effects on them as well. You have to decide from case to case at which point in an experimental setup you can treat things (semi-)classically. I think Bohr was right in saying that a measurement apparatus should lie within the range of validity of the semi-classical description. It must be an open system, so that the information on the measurements made can be stored, which is an irreversible process.

The interaction between the measured object and the macroscopic measurement device is part of the Hamiltonian and as such, according to the successful relativistic QFT, a local interaction. The assumption that such a local interaction can cause far-distant instantaneous responses is thus a contradictio in adiecto. According to the standard (minimal) interpretation there's also no need for such an explanation; the far-distant correlations are described by entanglement. It's all standard QT (or functional analysis if you wish).

I replied to this above, but let me add another comment to show that Weinberg's standard interpretation clearly has a Heisenberg cut (p. 81, section 3.7):

"The discussion of probabilities in Section 3.1 was based on what is called the Copenhagen interpretation of quantum mechanics, formulated under the leadership of Niels Bohr. According to Bohr, “The essentially new feature of the analysis of quantum phenomena is ... the introduction of a fundamental distinction between the measuring apparatus and the objects under investigation."

So LL and Weinberg both have the Heisenberg cut. The Heisenberg cut is part of the standard or minimal interpretation.
 
  • #332
vanhees71 said:
The interaction between the measured object and the macroscopic measurement device is part of the Hamiltonian and as such, according to the successful relativistic QFT, a local interaction. The assumption that such a local interaction can cause far-distant instantaneous responses is thus a contradictio in adiecto. According to the standard (minimal) interpretation there's also no need for such an explanation; the far-distant correlations are described by entanglement. It's all standard QT (or functional analysis if you wish).
In order to speak about the local measured object you first have to trace out the distant part, and you trace it out in a particular basis. Now let's say that you trace out the distant part and after that measure the local object in a different basis than the one used for tracing out. Then do the same for the distant part. Can you still come up with correct predictions for the correlations?
 
  • #333
I don't understand what you want to say here. It's very clear how the far-distant parts of an entangled system are to be "traced out" to get the statistical operator of the corresponding other part of interest. It's also clear how A updates her knowledge about the entire system, including B's part, after gaining information by a measurement. For B to make use of A's information, he needs to get this information from A, since he cannot instantaneously get it somehow from his system, which is far distant from A (due to the locality of interactions in relativistic QFT, which I consider valid). The correlation is inherent in the system from the very beginning, by preparing it in the entangled state; it's not due to the local measurements of either A or B.
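The bookkeeping behind both the question and the answer above can be checked in a few lines for a spin singlet (this is just standard QM, not tied to anyone's interpretation): the partial trace does not depend on which basis it is summed over, a non-selective measurement on A's side leaves B's reduced state untouched (no signaling), and the joint correlations ##E(a,b) = -\cos(a-b)## come out of the full entangled state for whatever measurement directions are chosen. A minimal NumPy sketch, with arbitrary illustrative measurement angles:

```python
import numpy as np

# Spin singlet |psi> = (|01> - |10>)/sqrt(2); qubit A is the first factor,
# qubit B the second.  Measurement angles below are arbitrary.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
I2 = np.eye(2)

def spin_op(theta):
    """Spin component along an axis at angle theta in the x-z plane."""
    sz = np.array([[1.0, 0.0], [0.0, -1.0]])
    sx = np.array([[0.0, 1.0], [1.0, 0.0]])
    return np.cos(theta) * sz + np.sin(theta) * sx

def trace_out_B(rho4):
    """Reduced state of A (trace over B), computed via reshaping."""
    return np.trace(rho4.reshape(2, 2, 2, 2), axis1=1, axis2=3)

def trace_out_A(rho4):
    """Reduced state of B (trace over A)."""
    return np.trace(rho4.reshape(2, 2, 2, 2), axis1=0, axis2=2)

def trace_out_B_in_basis(rho4, U):
    """Same partial trace, summed over the orthonormal basis given by U's columns."""
    out = np.zeros((2, 2), dtype=complex)
    for k in range(2):
        K = np.kron(I2, U[:, k].conj())        # acts as (identity on A) x <k| on B
        out += K @ rho4 @ K.conj().T
    return out

# 1) The partial trace does not depend on which basis of B it is summed over.
_, U_rot = np.linalg.eigh(spin_op(1.234))       # some rotated basis for B
print(np.allclose(trace_out_B(rho), trace_out_B_in_basis(rho, U_rot)))   # True
print(np.allclose(trace_out_B(rho), I2 / 2))                              # maximally mixed

# 2) No signaling: a non-selective measurement of A along an arbitrary axis a
#    (projectors applied, outcomes averaged with Born weights) leaves B's
#    reduced state unchanged.
a, b = 0.7, 2.1
_, evecs = np.linalg.eigh(spin_op(a))
rho_after = np.zeros((4, 4), dtype=complex)
for k in range(2):
    P = np.kron(np.outer(evecs[:, k], evecs[:, k].conj()), I2)
    rho_after += P @ rho @ P
print(np.allclose(trace_out_A(rho_after), trace_out_A(rho)))              # True

# 3) Correlations for arbitrary measurement directions come from the full
#    entangled state: E(a,b) = Tr[rho (sigma_a x sigma_b)] = -cos(a - b).
E = np.real(np.trace(rho @ np.kron(spin_op(a), spin_op(b))))
print(np.isclose(E, -np.cos(a - b)))                                      # True
```

Check 2 is the statement that a non-selective update cannot be used for superluminal signaling; check 3 answers the question about measuring in a basis different from the one used when tracing out.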
 
  • #334
atyy said:
I replied to this above, but let me add another comment to show that Weinberg's standard interpretation clearly has a Heisenberg cut (p. 81, section 3.7):

"The discussion of probabilities in Section 3.1 was based on what is called the Copenhagen interpretation of quantum mechanics, formulated under the leadership of Niels Bohr. According to Bohr, “The essentially new feature of the analysis of quantum phenomena is ... the introduction of a fundamental distinction between the measuring apparatus and the objects under investigation."

So LL and Weinberg both have the Heisenberg cut. The Heisenberg cut is part of the standard or minimal interpretation.
In this sense, of course, the Heisenberg cut is there. The only thing I see no justification for is the claim that on a fundamental level there is a "quantum world" and a "classical world" governed by different dynamical laws. I think the classical behavior of macroscopic objects under usual conditions is a phenomenon that can be understood from QT, using the standard ("coarse-graining") techniques of many-body theory.
 
  • #335
vanhees71 said:
It's also clear how A updates her knowledge about the entire system, including B's part, after gaining information by a measurement. For B to make use of A's information, he needs to get this information from A, since he cannot instantaneously get it somehow from his system, which is far distant from A (due to the locality of interactions in relativistic QFT, which I consider valid). The correlation is inherent in the system from the very beginning, by preparing it in the entangled state; it's not due to the local measurements of either A or B.
Let's get this straight. There are two sides to the situation we try to model. One is the physical situation that results in the measurement records of A and B. The other is how we analyze the obtained measurement records by correlating them. This second side is of no concern for physical theory. Look at it this way: some hypothetical physical mechanism results in certain related measurement records for A and B. We test that this physical mechanism works as expected by looking at the correlations between A's and B's records.
 
  • #336
vanhees71 said:
In this sense, of course, the Heisenberg cut is there. The only thing I see no justification for is the claim that on a fundamental level there is a "quantum world" and a "classical world" governed by different dynamical laws. I think the classical behavior of macroscopic objects under usual conditions is a phenomenon that can be understood from QT, using the standard ("coarse-graining") techniques of many-body theory.

But your rules for working with quantum mechanics are different for macroscopic variables and microscopic variables. A macroscopic variable such as a pointer position has a definite value at all times, even though that value may be predictable from past states only probabilistically. In contrast, for a microscopic variable such as the z-component of an electron's spin, it doesn't make any sense to say that it has a definite value at all times; it only makes sense to say that when that variable is coupled to a macroscopic variable (via a measurement), it acquires a definite value.

I think it is false to say that the properties of macroscopic variables are derivable from the properties of microscopic variables through coarse-graining. If it were true that the same physics applies to both, you should be able to formulate the laws of physics in a way that is independent of whether you are talking about micro or macro. In contrast, the minimalist interpretation involves both.
 
  • #337
A pointer variable of a macroscopic measurement device has, on a very coarse-grained level, a definite value at all times. The statistical (quantum and thermal) fluctuations are much smaller than the required accuracy of this pointer variable's value!
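To indicate the orders of magnitude involved: if the coarse-grained pointer variable is an average over ##N## roughly independent microscopic contributions, its relative fluctuation scales like ##1/\sqrt{N}##, which for ##N \sim 10^{23}## is a few times ##10^{-12}## of the pointer reading, far below any realistic read-off accuracy. A small illustrative sketch (the numbers are made up, and the independence assumption is only a caricature of a real apparatus):

```python
import numpy as np

# If a coarse-grained pointer variable is an average over N roughly
# independent microscopic contributions, its relative fluctuation scales
# like 1/sqrt(N).  Illustrative numbers only.
for N in (1e6, 1e12, 1e23):
    print(f"N = {N:.0e}:  relative fluctuation ~ {1/np.sqrt(N):.1e}")

# Tiny Monte-Carlo sanity check of the scaling at a manageable N.
rng = np.random.default_rng(1)
N = 10**4
samples = rng.normal(0.0, 1.0, size=(1000, N)).mean(axis=1)
print("measured std of the average:", samples.std(), "~ 1/sqrt(N) =", N**-0.5)
```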
 
  • #338
To me, the issue about macroscopic versus microscopic is pretty straight-forwardly illustrated by an EPR experiment involving anticorrelated spin-1/2 particles.

Alice measures her particle's spin along the z-axis, and gets spin-up. The question is: Is the result, "Alice measured spin-up", an objective, physical fact about the universe? In the minimal interpretation, it is treated as an objective physical fact. But in contrast, for an electron that is in a superposition of spin-down and spin-up, neither "The electron is spin-up" nor "The electron is spin-down" is considered an objective physical fact about the universe. People often say that it has no spin until it is measured. So this seems to be a big distinction between microscopic objects and macroscopic objects: macroscopic objects have definite, objective properties, while microscopic objects do not.

I understand that macroscopic objects can't (for long) be in superpositions, because decoherence will rapidly cause correlations between the macroscopic object and a larger chunk of the universe. But decoherence doesn't make properties such as spin more definite and objective, it just enlarges the size of the system that must be considered to be in an indefinite state.

It seems to me that there are only three possibilities:
  1. There is some physics that governs macroscopic interactions (measurements) that doesn't apply to microscopic systems (objective collapse theories, for example), or
  2. Contrary to what is commonly believed, macroscopic systems do not have definite properties, either (that's Many-Worlds), or
  3. Contrary to what is commonly believed, microscopic systems do have definite properties (Bohmian mechanics, for example).
I can understand people who say that there is no need to resolve the question, since we have a recipe for using quantum mechanics that works well enough without answering the question. But I don't understand the people who claim that there is no issue to resolve in the first place.
 
  • #339
You are mixing theories all the time. According to classical physics, applicable to macroscopic objects (in a sense to be understood below!), any observable of an object has a definite value.

Quantum theory extends the validity of our theories tremendously beyond the range of validity of classical physics. Thinking about how the restricted validity of classical physics can be understood from the more comprehensive theory (in fact, today there's no known limit to the validity of QT, except our inability to construct a satisfactory quantum description of gravity) leads at least me to the conclusion that classical physics is valid in an average, coarse-grained sense. You get classical behavior for observables averaged over many appropriate microscopic observables. Then very often these coarse-grained macroscopic observables are sufficient to effectively describe the behavior of the macroscopic object in terms of classical dynamics. Within the accuracy of a coarse-grained description, the statistical fluctuations of the corresponding macroscopic observables are negligible, and Ehrenfest's theorem leads to the validity of classical dynamics in this sense.
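This can be made quantitative with a toy simulation: evolve a Gaussian wave packet with the split-step Fourier method and compare ##\langle x\rangle(t)## with the classical trajectory. For the harmonic potential used below, Ehrenfest's theorem makes the agreement exact (up to numerical error); for more general potentials it holds to the extent that the packet stays narrow compared to the scale on which the force varies. All parameters are arbitrary illustrative choices, in units with ##\hbar = m = 1##.

```python
import numpy as np

# Split-step (Fourier) integration of the 1D Schroedinger equation for a
# Gaussian packet in a harmonic well, hbar = m = 1.  Ehrenfest's theorem
# says <x>(t) follows the classical equation of motion (exactly so for a
# quadratic potential).  Parameters are arbitrary illustrative choices.
hbar = m = 1.0
omega = 1.0
L, Npts = 40.0, 2048
x = np.linspace(-L/2, L/2, Npts, endpoint=False)
dx = x[1] - x[0]
k = 2*np.pi*np.fft.fftfreq(Npts, d=dx)

x0, p0, sigma = 3.0, 1.0, 0.7          # initial packet: position, momentum, width
psi = np.exp(-(x - x0)**2/(4*sigma**2) + 1j*p0*x/hbar)
psi /= np.sqrt(np.sum(np.abs(psi)**2)*dx)

V = 0.5*m*omega**2*x**2
dt, nsteps = 0.005, 2000
expV = np.exp(-0.5j*V*dt/hbar)          # half-step potential propagator
expT = np.exp(-0.5j*hbar*k**2*dt/m)     # full-step kinetic propagator

ts, x_mean = [], []
for step in range(nsteps):
    psi = expV*psi
    psi = np.fft.ifft(expT*np.fft.fft(psi))
    psi = expV*psi
    ts.append((step + 1)*dt)
    x_mean.append(np.sum(x*np.abs(psi)**2)*dx)

ts = np.array(ts)
x_classical = x0*np.cos(omega*ts) + (p0/(m*omega))*np.sin(omega*ts)
print("max |<x>(t) - x_classical(t)| =",
      np.max(np.abs(np.array(x_mean) - x_classical)))
# Prints a small number (limited only by the grid and time step), illustrating
# that the coarse-grained (expectation-value) dynamics is classical.
```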
 
  • #340
ddd123 said:
I don't see how paradigm building and shifts, or even just mathematical intuition, could be reduced to noticing sameness and differences... it's a variety of qualitatively different operations that come into play.
Here I just mean, when Copernicus makes a model that says orbits are circles with the Sun at the center, the samenesses are the circles, and the differences are the different scales of the orbits. Call it symmetries and breaks in symmetries, if you like.
Actually I don't see why, in this context, we should worry about the nature of thinking itself! All we need to know about epistemology is that it concerns our ways of knowing, which involves a plurality of factors. We refer to that knowing with respect to the operations we perform in the lab: we can leave it at intuition, it's even simpler than having an ontology to worry about. We don't need a theory of mind to do physics, why are you worrying about it?
I agree that trying to put in some kind of theory of thinking is premature, we just don't know enough about the mind. I think it will advance physics dramatically once we understand better how we do it, but that's not going to happen in this thread, and maybe not for a thousand years for all I know. Your point is that it is a detour from the basic questions of ontology vs. epistemology, and that's true. So getting back to epistemology, the core issue here is that we like to motivate our epistemologies with ontologies, and my only point is that we always run into trouble when we take the ontologies too seriously by framing them as the purpose of the endeavor. We should instead regard them as tools of understanding, effective pictures we use as we think, because this solves all the problems we have with ontology-- in particular, it justifies why ontologies are always nonunique, and it explains why they invariably end up getting replaced with others that are almost completely different. The epistemology of science is to both seek out and discard useful ontologies, but not because we are looking for the right one, any more than a hermit crab is looking for the right shell.
 
  • #341
vanhees71 said:
You are mixing theories all the time. According to classical physics, applicable to macroscopic objects (in a sense to be understood below!), any observable of an object has a definite value.

Quantum theory extends the validity of our theories tremendously beyond the range of validity of classical physics. Thinking about how the restricted validity of classical physics can be understood from the more comprehensive theory (in fact, today there's no known limit to the validity of QT, except our inability to construct a satisfactory quantum description of gravity) leads at least me to the conclusion that classical physics is valid in an average, coarse-grained sense. You get classical behavior for observables averaged over many appropriate microscopic observables. Then very often these coarse-grained macroscopic observables are sufficient to effectively describe the behavior of the macroscopic object in terms of classical dynamics. Within the accuracy of a coarse-grained description, the statistical fluctuations of the corresponding macroscopic observables are negligible, and Ehrenfest's theorem leads to the validity of classical dynamics in this sense.

That's just not true. If an electron is in a superposition of spin-up and spin-down along the z-axis, and it interacts with a measuring device, then the measuring device will evolve into a superposition of "measuring spin-up" and "measuring spin-down". Decoherence then would propagate the indefiniteness to the rest of the universe---the universe would evolve into a superposition of one universe in which the measuring device measures spin-up and another universe in which the measuring device measures spin-down. Coarse graining is not going to change that. It's a complete red herring to bring it up.

If it is true that macroscopic objects obey the same physics as microscopic objects, then a many-worlds type ontology follows. You want to affirm the premise of that implication while rejecting its conclusion, but it is incoherent to do so. Bringing up coarse-graining and decoherence doesn't change the conclusion.
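The uncontroversial part of this can be spelled out with two qubits, a "system" and a toy "pointer" standing in for the apparatus: the unitary interaction produces an entangled global superposition, and tracing out the system leaves the pointer in a diagonal mixture. Whether that reduced mixture licenses talk of a definite outcome is precisely what is being argued about; the sketch below only does the bookkeeping (the amplitudes and the CNOT-type coupling are illustrative choices).

```python
import numpy as np

# Toy von Neumann measurement: system qubit in a|0> + b|1>, pointer qubit
# starts in |0>, and a CNOT-type interaction copies the basis information
# into the pointer.  The global state is an entangled superposition; the
# reduced state of the pointer alone is a diagonal mixture.
a, b = np.sqrt(0.3), np.sqrt(0.7)      # arbitrary amplitudes, |a|^2 + |b|^2 = 1

sys0 = np.array([a, b])                # system
ptr0 = np.array([1.0, 0.0])            # pointer "ready" state
psi_in = np.kron(sys0, ptr0)           # product state before the interaction

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])         # system controls, pointer is target

psi_out = CNOT @ psi_in                # = a|00> + b|11>: global superposition
rho_out = np.outer(psi_out, psi_out.conj())

def reduced_pointer(rho4):
    """Trace out the system qubit (first factor)."""
    return np.trace(rho4.reshape(2, 2, 2, 2), axis1=0, axis2=2)

rho_ptr = reduced_pointer(rho_out)
print(np.round(rho_ptr, 3))            # diag(|a|^2, |b|^2): mixed, no coherences
print("purity of the global state:", np.trace(rho_out @ rho_out).real)   # 1.0
print("purity of the pointer state:", np.trace(rho_ptr @ rho_ptr).real)  # < 1
```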
 
  • Like
Likes zonde
  • #342
Ken G said:
Here I just mean, when Copernicus makes a model that says orbits are circles with the Sun at the center, the samenesses are the circles, and the differences are the different scales of the orbits.

What about General Relativity then?

Call it symmetries and breaks in symmetries, if you like.

When you devise a new symmetry, you have to think about it. Whether it's a new 'sameness' or a new 'difference', you have to think about it with a mix of creativity, intuition and logical deduction. It's not like saying '0' or '1' and mixing them up. I mean, at least if you're not going into stuff like generating proofs of theorems using brute force within a Godel representation of maths or something like that (but even if that worked, it would work for maths, physics is more fuzzy and involves more intuitive concepts).

We should instead regard them as tools of understanding, effective pictures we use as we think, because this solves all the problems we have with ontology-- in particular, it justifies why ontologies are always nonunique, and it explains why they invariably end up getting replaced with others that are almost completely different. The epistemology of science is to both seek out and discard useful ontologies, but not because we are looking for the right one, any more than a hermit crab is looking for the right shell.

I think the final point is good, but it's not going to be convincing if you avoid the object of the process: understanding what? Thinking what? By avoiding that sort of question you end up putting it in the spotlight: instead of focusing on matter, you focus on mind, and then suggest that there's no separate reality being described etc (which implies that subject and object are interdependent, which is a philosophical theory, which is outside the scope of physics). In short, you're not agnostic enough. If you're saying that we're not describing an outside reality, and if there is an outside reality (which you're agnostic about), then you're making the ontological statement that we are NOT describing it. Half of your position is subtly ontological. Maybe your readers don't go this far into the nature of your arguments but they sense that you're still trying to superimpose a no-picture picture on reality and this ends up being unconvincing.

It's much smoother, IMHO, to go like this:

We start from intuitive concepts like 'measurement outcome' and 'experimental preparation' in the abstract; to that semantics we associate the structured and collected information which we model by devising a theory (which may also involve a picture as an intuition aid, but of course the picture is in our heads, not outside of our heads).

Here I made no claims whatsoever on the ontology and not even on the relationship between epistemology and ontology: it is pure epistemology in the proper sense, that I'm not even separating epistemology and ontology, which necessarily would be a partially ontological claim.
 
  • #343
stevendaryl said:
It seems to me that there are only three possibilities:
  1. There is some physics that governs macroscopic interactions (measurements) that doesn't apply to microscopic systems (objective collapse theories, for example), or
  2. Contrary to what is commonly believed, macroscopic systems do not have definite properties, either (that's Many-Worlds), or
  3. Contrary to what is commonly believed, microscopic systems do have definite properties (Bohmian mechanics, for example).
I can understand people who say that there is no need to resolve the question, since we have a recipe for using quantum mechanics that works well enough without answering the question. But I don't understand the people who claim that there is no issue to resolve in the first place.
The way to make the problem go away, more so than "resolve" it, is to look for a fourth possibility: that physics doesn't govern things, and that systems do not have properties. Or more correctly, that when we use anthropomorphic language about "laws" and "governances", we are finding associations between our experiences doing experiments and the basic structures with which we have everyday familiarity, and when we talk about "properties" we mean "constraints on how we can successfully think about systems." Using these more precise replacements, replacements that actually dovetail with what we can observe when we watch a scientist do science, we can still use the language to do all the same things for our science, but we can avoid the quagmire of taking our ontologies too seriously. None of this causes any trouble if we frame it all as modes of thought and approaches to manipulating information that involve pictures, cartoons really. The cartoons should have captions such as "I like to picture what is happening like this", but the problems only appear when that correct and well-tested caption gets simplified to the invariably incorrect "this is what is actually happening."
 
  • #344
stevendaryl said:
That's just not true. If an electron is in a superposition of spin-up and spin-down along the z-axis, and it interacts with a measuring device, then the measuring device will evolve into a superposition of "measuring spin-up" and "measuring spin-down".
That's not strictly correct: the measuring device itself is not in a superposition state, only the wavefunction that includes the measuring device is (if there is any such thing "in reality", which is very much the question). If you restrict attention to the degrees of freedom of the measuring device, you get a mixed state, not a superposition. However, that's not really the problem; the problem is in deciding what that mixed state means-- is it the actual state of the measuring device, on the grounds that a measuring device needs an actual unique state in reality (which I reject)? If so, then "collapse" hasn't happened yet, it only happens when we look, and get a single outcome, returning the measuring device to a unique state. In my view, the need to regard anything as having a state stems purely from our desire to be able to create correct expectations about that thing, and is in no way some kind of requirement of reality, if there's even any way to give that latter language physical meaning.
Decoherence then would propagate the indefiniteness to the rest of the universe---the universe would evolve into a superposition of one universe in which the measuring device measures spin-up and another universe in which the measuring device measures spin-down.
This is the pre-collapse state, if one takes a universal wavefunction seriously. But the problem hasn't appeared yet-- the problem is when we look at the outcome and only see one. Now we need an interpretation, because our description of this uber-superposition is no longer jibing with our perceived outcome. We can now say that our outcome only represents a small fraction of what is actually happening, in which case we are forced to conclude that what we care about (what happens to us) is a limited amount of the full information. But it's the information we have! So we start with ontology, and are led back to epistemology-- our information is all that matters to us. The ontology has become useless!

Or, we take the Copenhagen view, and say that our information, which is what matters, must be everything that happens. Here we have made ontology matter, but only by shoving epistemology down its throat-- so it really doesn't matter here either, the ontology is so subservient to the epistemology that all that remains is the epistemology anyway!

Or, we can take the Bohmian view, and say that we don't have all the information, so the uber-wavefunction you describe never happens. What happens to us is determined by information we have no access to. So we do achieve an ontology that goes beyond the epistemology, but we do it in the usual way-- by postulating the existence of essentially invisible and unknowable higher powers, here acting in the form of details of the preparation that we could never know. "Preparations work in mysterious ways," where have we seen that before?

So the bottom line is, either the ontology is no more than the epistemology, or anything more that it is becomes a matter of essentially religious interest only.
If it is true that macroscopic objects obey the same physics as microscopic objects, then a many-worlds type ontology follows.
Yet we must carefully track all the suppositions that are implicit in that hypothetical:
1) that there are such things as "objects" and they can be either micro or macroscopic, they are not just concepts we use to manipulate information
2) that objects "obey" laws, as in they are in some form of communication with these immutable laws, rather than "obeying laws" being a familiar language we can use to make sense of the behaviors we see
3) that our current version of those "laws", the Schroedinger equation, is not just the current approximation that is spectacularly accurate in isolated instances, it is the actual immutable law that the actual objects are actually in some form of communication with.

So yes, you do get a many-worlds ontology if you make all those assumptions, but identifying precisely what the assumptions actually are clarifies greatly why we should not be surprised they lead us to a bizarre ontology. Such is always the way.
 
  • #345
Simon Phoenix said:
I was walking across the golf course the other day and then suddenly this golf ball hit me right between the eyes. My doctor told me not to worry about the huge lump on my forehead as it was only epistemic :confused:
Epistemic means how your mind interacts with its stimuli. If anyone thinks it doesn't matter how your mind interacts with its stimuli, I have to wonder what they have been doing with their life all this time! When someone has late-stage terminal cancer, and is told they have only a week to live, and it will be the most painful and awful week of their entire life, and they are given an option to take a drug that will interfere with their brain's ability to detect that their body is dying in awful ways, do they care more about the ontology of the death of their body, or the epistemology of how they get to feel during that process? We always care more about epistemology than ontology; epistemology is no less than everything that matters to us, even if the true believer in ontology must see a dose of self-deception in epistemology.
 
  • Like
Likes vanhees71
  • #346
ddd123 said:
What about General Relativity then?
It stems from the equivalence principle, a classic example of our mind's ability to detect "sameness." But still, I agree this is a detour.
I think the final point is good, but it's not going to be convincing if you avoid the object of the process: understanding what?
If it is less convincing, that may be simply because we are so set in our ways of thinking even when they become barriers. (And I wager we've all seen that phenomenon!) I'm claiming that understanding is just understanding, period, and claiming that it is understanding "of something" is just the nature of how we understand. If someone understands the fundamental theorem of integral calculus, then what is the "something" they understand? The fundamental theorem of integral calculus, of course-- but that's just more information. We understand by manipulating information, and what we understand is more information-- how could it be any different? How can manipulating information transcend itself? The whole idea of a "something" that we understand is just that-- it is an idea. We can see this: we merely watch someone manipulate that idea, and say to ourselves "yup, they are clearly manipulating an idea there."

By avoiding that sort of question you end up putting it in the spotlight: instead of focusing on matter, you focus on mind, and then suggest that there's no separate reality being described etc (which implies that subject and object are interdependent, which is a philosophical theory, which is outside the scope of physics). In short, you're not agnostic enough.
That's what I don't agree with. If I have no need to regard what you mean by a "separate reality" as anything but your mode of interpreting and organizing the consistent information of your senses (and I think you can agree that is what you are doing there), then why do I need anything more than just that? It's not that I don't need a concept of a separate reality, that's a vastly useful epistemological tool-- it's that I don't need to regard it as anything more than an epistemological tool, because that's the only way it ever gets used. That's agnosticism-- we only take what we need, what we actually use.

If you're saying that we're not describing an outside reality, and if there is an outside reality (which you're agnostic about), then you're making the ontological statement that we are NOT describing it.
I don't say I'm not describing outside reality; I'm saying I never make use of the claim that I'm describing outside reality as it actually is, I only use what I mean by outside reality-- which is not "outside" me because I'm the one meaning something by it. I'm saying that all I ever use is the concept of an outside reality, a concept I need not take literally or seriously. I can just say "picture matter as though it were made of atoms", or "picture the sensory input you are experiencing as if it were coming from some separate outside reality that is independent of your sensations," and I do just fine. I get everything the true believer in ontology gets, without all the problems that the true believer in ontology has to contend with.

Now, sometimes people hear this wrong, and think I'm saying that if these are just pictures, then I shouldn't care if I drop a rock on my toe. But of course that's wrong, I do care if I experience pain, and the way I make sense of that experience is to say that a rock fell on my toe. I can avoid future pain using that picture, so I am in no way saying "don't model and test", I am saying "do model and test, but there's no need to pretend you are doing anything else."
We start from intuitive concepts like 'measurement outcome' and 'experimental preparation' in the abstract; to that semantics we associate the structured and collected information which we model by devising a theory (which may also involve a picture as an intuition aid, but of course the picture is in our heads, not outside of our heads).
And this is where we agree, I see a completely epistemological approach in your words there, and I would have put it very similarly. So the point is, notice how easily we frame the interpretations of QT in this light-- we say that a wavefunction is an informational structure that we use in the context of the Schroedinger equation and the Born rule, also informational structures, to make sense of the informational structure we like to call "reality." The way we manipulate information is present at every step, along with the limitations of our mental capacities, and the ultimate arbiter of our success is the outcome of observations-- which we have no need to regard as anything but yet another informational structure. The desire to take ontology seriously in any of this is essentially a form of religion, and leads to difficulties in the interpretations. All we need to use ontology for is to generate a nonunique picture that helps guide our sense of understanding, a sense that is itself purely epistemological but at the same time very important to us.

And to clarify, the act of noticing that it is epistemological, meaning it is the way we think, does not give us the power to think any way we like-- our epistemology is constrained to fit the information structures we are actually manipulating. So we create the notion of an objective reality to come to terms with why we cannot control the information we are manipulating, but of course that notion is just another example of how we manipulate the information that there are constraints on the information we manipulate. If someone says "all I mean by objective reality is the observation that I do not have complete control over the inputs to my senses, instead there are constraints that I share with other beings like myself", then that is a perfectly epistemological approach-- and would be refreshing to see in the context of QT interpretations because it puts the focus right where it belongs: how the mind choosing the interpretation copes with the constraints present in the informational structures they are attempting to understand. Above all, the information they are grappling with is not external to themselves, because information is not just a sequence of 0s and 1s, it is how we manipulate a sequence of 0s and 1s, the meaning we give to it.
 
  • #347
Ken G said:
That's what I don't agree with. If I have no need to regard what you mean by a "separate reality" as anything but your mode of interpreting and organizing the consistent information of your senses (and I think you can agree that is what you are doing there), then why do I need anything more than just that? It's not that I don't need a concept of a separate reality, that's a vastly useful epistemological tool-- it's that I don't need to regard it as anything more than an epistemological tool, because that's the only way it ever gets used. That's agnosticism-- we only take what we need, what we actually use.

That sounds more like pragmatism. But I simply think your approach is too convoluted: you have to say, okay I have this characterization of outside reality but I don't characterize the characterization as actually referring to an outside reality... It's so duplicated. It may simply be a matter of words, but look at how much simpler my version of epistemology is. What I renounce in it is simply talking about ontology at all, not even in the negative.

The desire to take ontology seriously in any of this is essentially a form of religion, and leads to difficulties in the interpretations.

Calling it a religion is kind of aggressive (both towards religion and non-religion), and I think it doesn't hold in the sense of dogma. You say ontology is not provable... At some point nothing is provable. It's just easier, but not more justified, to say that ontology is not provable: if you doubt that it characterizes external reality by doubting that it is possible to characterize external reality at all, the case is closed. Epistemology has the testing ground of being effective in knowing: but at this point you can doubt your criteria of effectiveness; maybe you're predicting results that are completely skewed from an experimental perspective because you put in confirmation biases, and so on... You can never be fully sure that your epistemology makes sense, or even that epistemology makes sense at all (how would you know? You'd need to apply epistemology to itself...).

It's much more honest to say: having an ontology is a philosophical position, having an epistemology is a philosophical position, I think this is stronger, more justified than that, for these reasons...
 
  • #348
stevendaryl said:
A macroscopic variable such as a pointer position has a definite value at all times
Only at nearly all times. During the (positive) time a measurement is in progress, the pointer reading is not well-defined. Even for classical measurements one has to wait till a sufficiently stationary situation has been achieved, before one can take a reliable reading.
 
  • #349
stevendaryl said:
you should be able to formulate the laws of physics in a way that is independent of whether you are talking about micro or macro.
Such a formulation is indeed given by my thermal interpretation.
 
  • #350
ddd123 said:
That sounds more like pragmatism. But I simply think your approach is too convoluted: you have to say, okay I have this characterization of outside reality but I don't characterize the characterization as actually referring to an outside reality... It's so duplicated. It may simply be a matter of words, but look at how much simpler my version of epistemology is. What I renounce in it is simply talking about ontology at all, not even in the negative.
But we have to talk about ontology, because it's everywhere. We have to be able to converse with someone who says "I believe my life is embedded in a vast superposition of lives of a spectacular number of people very similar to me, on planets very similar to Earth, in universes very similar to mine, because that is what the Schroedinger equation says must be true." So it is to those people I aim my words-- I see their ontology as doing more than motivating how they solve the Schroedinger equation, I see it as a world view that could affect the decisions they make-- in potentially frightening ways, quite frankly (consider, for example, Tegmark's "quantum suicide" paradigm!). So if those people regard themselves as scientists, I want to say to them, but notice that science never uses your world view, it only ever uses the things that matter to you, which are the things you perceive as actually happening. Hence, what you perceive as actually happening is more important than what you regard as actually happening in some more "absolute" sense. The recognition of the difference between what we care about and what is "True" is the path from ontology to epistemology, so we must start in the one place to get to the other.
Calling it a religion is kind of aggressive (both towards religion and non-religion), and I think it doesn't hold in the sense of dogma. You say ontology is not provable...
It's not even testable, which is more to the point. All we can test is that information A is close to information B, that's it. Of course we wish to picture what is going on in that test, and we regard there to be lessons to be learned, but we don't need to take the ontology seriously in science. I am not badmouthing religion, I have no issue with ontology in religion-- I am saying that religion is the proper sphere for ontology whenever the ontology is taken seriously, rather than purely as a cartoon that supports the epistemology (and often in a nonunique way).
It's much more honest to say: having an ontology is a philosophical position, having an epistemology is a philosophical position, I think this is stronger, more justified than that, for these reasons...
Then let me put it this way: having an epistemology is the philosophy of doing science, but having an ontology, in the sense of taking it seriously, is never consistent with the philosophy of doing science-- even though this is not widely recognized.
 
