Is Qualia Intrinsically Linked to Quantum Mechanics?

In summary, the thread's author believes that qualia are related to spacetime and quantum mechanics. The two are in conflict because in quantum mechanics the particle is always just the particle. The author thinks the best candidates among quantum interpretations are those that are friendly to qualia.
  • #36
jambaugh said:
I think it is an issue of distinct meaning of "more fundamental" within ontic metaphysics vs within epistemology. Back in the classical period of science we sought the most fundamental reality, the atoms out of which objects are made. In this quantum period what is most fundamental is the atomic act of knowing, say the boolean observation (which is an intimately thermodynamic process), and the object of that observation, the atomic unit of information, the qubit.
Yes, I think the next class of physical theories, those that go beyond the latest new particle, are going to combine theories of dynamics with theories of cognition. We are getting close to the place where we can no longer pretend we are not involved in our own knowing about reality. The device of separating ourselves from our questions got us pretty far, but we had to always know it was a fundamentally bogus approach. I think Douglas Adams had it right-- at some point knowledge looks not like figuring out the right answers for our questions, but rather figuring out the right questions for our answers.
 
  • #37
jambaugh said:
I think it is an issue of distinct meaning of "more fundamental" within ontic metaphysics vs within epistemology. Back in the classical period of science we sought the most fundamental reality, the atoms out of which objects are made. In this quantum period what is most fundamental is the atomic act of knowing, say the boolean observation (which is an intimately thermodynamic process), and the object of that observation, the atomic unit of information, the qubit.

I like that idea of the atomic boolean operation. Do you have any cites for this as an approach?

I come at this from the perspective of theoretical biology where people talk about infodynamics for instance. So in some sense, QM is about the smallest grain of observational resolution.

jambaugh said:
In both contexts we still must overlay dynamics, or rather we express dynamics in terms of activity of/between atoms. Dynamics is thus derivative. [**footnote] Where we should go from here is I think is say, a revised definition of metaphysics (metadynamics? dianetics? :wink:) which appropriately disinvokes the postulate of fundamental objective reality (but allows for the description of contingent realities) and in some way invokes dynamic action intrinsically. I'm thinking something like a generalization of Feynman diagrams to a language of actions or phenomena. (And yes I'm w.a.g.ing here.)

Here I would disagree. The presumption should instead be that all is dynamic, and it is only equilibrium states or processes that give the illusion of static, atomic, existence.

In an equilibrium, there is ceaseless change at the microscale, but no longer any global change.

But then you appear to be saying here both that dynamics is derivative and yet that we need a new metaphysics that "invokes dynamic action intrinsically". So maybe you do agree with me?

jambaugh said:
(** GR is exceptional to an extent in that it is a dynamic of dynamics type theory.)

Yes, GR is a model of global constraints. But it needs actual energy values plugged into it to make it dynamical. At the moment, this is most unsatisfactory - with, for example, the need to hand-build the cosmological constant into it. A thermo model of GR might find dark energy to be an irreducible fact of the Universe's dissipative structure, for instance.

Our standard way of thinking about geometry is rather cold and lifeless. It just lies there flat and static, not changing unless someone forces it to change. But thermodynamics is about the dynamics of gradients and equilibriums. So what would a "hot" geometry look like? :smile: (Open, hyperbolic, fractal, sum over histories?)
 
  • #38
Ken G said:
Yes, I think the next class of physical theories, those that go beyond the latest new particle, are going to combine theories of dynamics with theories of cognition. We are getting close to the place where we can no longer pretend we are not involved in our own knowing about reality. The device of separating ourselves from our questions got us pretty far, but we had to always know it was a fundamentally bogus approach. I think Douglas Adams had it right-- at some point knowledge looks not like figuring out the right answers for our questions, but rather figuring out the right questions for our answers.

Connecting back to the OP, one of the dangers is that QM seems to have something to do with observers, and so something to do with human consciousness. It is a slippery slope of speculation.

And even talking about cognition is problematic if we have no good theory of cognition.

Theories of cognition are in fact where I started out. Theoretical neurobiology. The computational model was obviously flawed (not untrue, but clearly not the whole (or holistic) story). And the dynamical systems approach was equally somewhat true, yet also fundamentally missing something.

Looking for the right ontological grounding, I found that theoretical biology had been through the same issues in the 1960s and 70s. As a result, in the 80s and 90s, theoretical biologists were realising that the grounding theory for them was some more sophisticated model of thermodynamics. One that gave primacy to the idea of development and gradients - the kind of open systems thermodynamics of Prigogine rather than the static, closed realms of early statistical mechanics.

Then in the 1990s, theoretical biologists made a connection to semiotics (of the Peircean kind) as a way to talk about meaning as well as information in a thermo perspective.

So there is definitely a movement in biology and neuroscience, if not yet a revolution, that sees thermodynamics in some rounded systems sense as its natural basis. The "physics" that grounds the sciences of life and mind is not the one of particles, fields, and other simple material stuff, but the physics of systems.

Then looking around, it seems obvious that even physics and cosmology are attempting to be more systems-based - more holistic and self-organising, less atomistic and background dependent. Which again means that the proper metaphysical grounding would be something like a thermodynamic modelling of causality.

So forget "qualia". That simply is the extension of atomism into phenomenology. The claim that consciousness is constructed from collection of subjective atomistic shards just does not fly with anyone who actually has studied neurocognitive processes. It is a fiction that gives some philosophers a respectable career - they look like they are doing good reductionist thinking. But it is a construct as lame as philogiston or aether or other things we now laugh about.

But Jambaugh's point about the boolean observation is right. We need an atomism in the sense that we need a definition of the smallest, or simplest, system-forming action. And there is also something new here because that "atom" is intrinsically dichotomistic. There has to be an observer and the observed in some sense. A meaningful relationship.

But we also have to find the language to describe the "atom of a system" in ways that don't have false connotations. And thermodynamics would seem to be the place to find a jargon that is neutral enough to apply equally well to physics or mind science, and yet also has the right kind of causal or ontological connotations.

Myself, I find that the dichotomy of degrees-of-freedom~constraints is very useful. A system is in general where you have some set of global constraints that act to particularise a set of degrees of freedom. This is the top-down~bottom-up view of hierarchy theory. A coupling of levels of causality in which the larger scale "observes" the smaller scale - that is, it resolves a broad number of degrees of freedom into the select few which are actually building the system (so it can continue to "observe" and so persist as a system).

So the atom of a system is this dyadic relation between degrees of freedom and constraints. In QM, for example, it would be the interaction between the experimentalist's set-up and the indeterminacy contained in some prepared initial conditions. In mind science, you get the interaction between global anticipatory state and localised sensory "surprises" or salient events.

(If you want the best current neurocognitive model, check out the Bayesian brain work and note how it is based on the very thermo concept of minimising a system's free energy - http://en.wikipedia.org/wiki/Bayesian_brain).
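
As a toy gloss of that free-energy idea (my own cartoon sketch, not Friston's actual formalism): a belief updated by gradient descent on surprise settles on the Bayesian compromise between prior and evidence.

```python
# Minimize a crude free energy F(mu) = (obs - mu)^2/(2*s_obs^2) + (mu - prior)^2/(2*s_prior^2)
def update_belief(mu, obs, prior, s_obs=1.0, s_prior=1.0, lr=0.1, steps=200):
    """Gradient descent on F; converges to the precision-weighted average
    of observation and prior, i.e. the Bayesian posterior mean."""
    for _ in range(steps):
        grad = (mu - obs) / s_obs**2 + (mu - prior) / s_prior**2
        mu -= lr * grad
    return mu

print(update_belief(0.0, obs=2.0, prior=0.0))  # ~1.0 for equal precisions
```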

To sum up, thermo is the physics of systems and so is naturally the foundation for complex systems science (such as biology and neurology). And I would argue that it has to be the foundation for "foundational" physics too. What is missing from the current modelling is the "observer" - a theory of the global constraints. And then we need an atomistic model of systems. But this is going to be an intrinsically dyadic story in which we find both observer and observed in their simplest possible form. Some version of the idea of emergent constraints in interaction with (sub)mergent (or otherwise constrained and directed) degrees of freedom.
 
  • #39
apeiron said:
To sum up, thermo is the physics of systems and so is naturally the foundation for complex systems science (such as biology and neurology). And I would argue that it has to be the foundation for "foundational" physics too. What is missing from the current modelling is the "observer" - a theory of the global constraints. And then we need an atomistic model of systems. But this is going to be an intrinsically dyadic story in which we find both observer and observed in their simplest possible form. Some version of the idea of emergent constraints in interaction with (sub)mergent (or otherwise constrained and directed) degrees of freedom.
I think a related theme to the top-down elements of constraints in thermodynamics is the concept of specialness, which likely relates to the concept of symmetry. If I painstakingly place 100 coins, all heads, on the floor of a room, we can all recognize the specialness there. But if I place those same 100 coins with 50 heads and 50 tails, we see no specialness, even though I may have carried out the same process of carefully placing each coin. Each individual pattern of heads and tails is equally unlikely; the only reason we have a concept of entropy is that we choose groupings or classes that we are going to consider similar in some way, and the smaller classes are then special.

It is thus our "eye to the similarity" within the classes that creates the concept of entropy, and as soon as we recognize a similarity that can be used to create a class, the universe will always step in to make sure that special classes give way to more generic ones. The second law of thermodynamics is a law about what happens as soon as our intelligence identifies similarity classes, but it has no insights to offer prior to that. Hence I would say that the second law is not only a law discovered by intelligence, it is a law that requires intelligence to have any meaning. It is not really something that the universe by itself is doing, because the universe by itself might not have any idea why we decide to lump together all distributions with 50 heads and 50 tails.
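
A quick sketch of the counting behind the coin example (standard combinatorics, just making the class sizes explicit):

```python
# "Specialness" here is just class size: how many microstates (exact
# head/tail patterns) get lumped into the macrostate class we named.
from math import comb, log

N = 100
all_heads_class = 1             # exactly one pattern is "all heads"
even_split_class = comb(N, 50)  # patterns lumped into "50 heads, 50 tails"

print(even_split_class)         # ~1.01e29 microstates in the generic class
# A Boltzmann-style entropy is the log of the class size, so it depends
# entirely on which patterns we chose to call similar:
print(log(all_heads_class), log(even_split_class))  # 0.0 vs ~66.8
```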

So there's a classic example of where the observer/constraints/concepts are not written into the fabric of the dynamics itself; it is rather a kind of template placed over the dynamics. It is not only the act of doing physics that involves intelligence, the whole business is predicated on the attributes of intelligence: we shouldn't say there are physicists because there is physics, we should say there is physics because there are physicists. We see this in the importance of identifying classes of "what matters" in thermodynamics (like the difference between hot and cold, concepts the universe by itself might find odd), in the importance of having a reference frame for relativistic dynamics (the universe itself might care for nothing beyond the invariants, with no concept of what a coordinate system is), and in the importance of having a macroscopic coupling to define the operators of quantum mechanics (the universe might not have the vaguest idea what either a superposition or a mixed state is). Physics is by its very nature a study of top-down constraints, so when it is used to describe how all behaviors percolate up from the atom, we are only seeing one half of the cycle.
 
  • #40
Ken G said:
The pattern is real, yes. But the interference? How is that real? It is an inference you make when you see the pattern. Inferences are not real, they are mental constructs. I'd say the crux of CI is noticing the difference between what is in the reality and what is in our minds. Granted, even the outcome of an observation is in our minds, but CI sees that kind of outcome as more concrete than the mathematical stories we build up around them.
The pattern is real. That the pattern was made by interference is not something you can test. What experiment can you make that comes out X if the pattern is made by interference, and Y if the pattern is made some other way?

So if the interference is in the equation, please tell me what physically happens to a single photon or electron between the emission and the detection in the double-slit experiment. Thanks.
 
  • #41
Varon said:
So if the interference is in the equation, please tell me what physically happens to a single photon or electron between the emission and the detection in the double-slit experiment.
The central thesis of the CI is that there is no answer to your question. Not just that we don't yet know the answer, that there is none. The reality is not set up to allow our intelligence a means of answering the question, and that is just exactly the same thing as the question having no answer.
 
  • #42
Ken G said:
I think a related theme to the top-down elements of constraints in thermodynamics is the concept of specialness, which likely relates to the concept of symmetry. If I painstakingly place 100 coins, all heads, on the floor of a room, we can all recognize the specialness there. But if I place those same 100 coins with 50 heads and 50 tails, we see no specialness, even though I may have carried out the same process of carefully placing each coin.

Of course, an exact 50/50 split would be somewhat special too. Quickly checking on a binomial distribution calculator (i.e. I may have the number wrong), random tossing would generate this outcome only about 8 percent of the time.

So "non special" - that is the outcome of some actual fair coin toss - would be naively expected to look more like 48/52 or whatever. This would look less like a deliberate pattern or special arrangement, more like the product of a random gaussian process in which the individual outcome states are highly constrained (to the definite binary states of heads or tails) but the individual choice of outcomes is free, or completely unconstrained.

Ken G said:
It is thus our "eye to the similarity" within the classes that is what creates the concept of entropy, and as soon as we recognize a similarity that can be used to create a class, the universe will always step into make sure that special classes give way to more generic ones. The second law of thermodynamics is a law about what happens as soon as our intelligence identifies similarity classes, but it has no insights to offer prior to that. Hence I would say that the second law is not only a law discovered by intelligence, it is a law that requires intelligence to have any meaning. It is not really something that the universe by itself is doing, because the universe by itself might not have any idea why we decide to lump together all distributions with 50 heads and 50 tails.

I see that you are taking a CI approach to statistical mechanics :smile:. That is an extreme but interesting way to go.

I think the more deflationary interpretation - but still radical in its own way - is the infodynamics view, where information is seen as any constraint on entropy production, and so "intelligence" is in there as part of the definition; but we don't mean it has to be a conscious intelligence, just that there is some kind of memory or boundary constraint in play that "makes a measurement" on the system.

See for instance this really excellent paper - http://arxiv.org/PS_cache/arxiv/pdf/0906/0906.3507v1.pdf
 
  • #43
Ken G said:
The central thesis of the CI is that there is no answer to your question. Not just that we don't yet know the answer, that there is none. The reality is not set up to allow our intelligence a means of answering the question, and that is just exactly the same thing as the question having no answer.

You are saying that it is possible nothing physically actually happens? Like the concept of spacetime, which is pure math? So is it related to the ultimate question of why math is so unreasonably effective in describing reality, as if there is a platonic realm where math is the primary reality, and in between emission and detection the particle is located in that pure platonic realm? Is this the essence of Copenhagen? Or is Copenhagen more of a "don't care" attitude, meaning the particle could be taking a real path as in Bohmian mechanics but Copenhagenists simply don't care?
 
  • #44
Varon said:
You are saying that it is possible nothing physically actually happens?
I'm saying the CI interpretation is that "what physically happens" is whatever we can assert, with our apparatus, physically happened. There is no other meaning to the term, the rest is practically identifiable with mysticism. I'd say it's a bit like watching a movie-- we are told that all we are seeing is a string of still pictures, yet our minds interpret motion there. Is that motion something physically happening? With the "movie magic" that is done these days, say in superhero epics, oftentimes the motions we perceive never occurred in any reality, the still pictures on the film are the only things that are real there. I believe the CI takes a similar skeptical approach to "the quantum realm."

So is it related to the ultimate question of why math is so unreasonably effective in describing reality, as if there is a platonic realm where math is the primary reality, and in between emission and detection the particle is located in that pure platonic realm?
That is certainly one interpretation, but it is closer to many worlds than CI. In many worlds, all those parallel worlds are a kind of "Platonic realm" when considered from our world, because their entire justification is to make the mathematics (not the observations) work out in a way that can be interpreted as "real". CI rejects the need to make the mathematics work out that literally; it is fine with viewing the mathematics as nothing but a tool for predicting (statistically) one reality. So CI sees that "mathematical detour" you are talking about as not part of the reality at all, it is merely a template laid over the reality to get it to fit. So we are still left wondering why it works-- many worlds has a readier answer to why it works, "because it's the reality." CI is a more skeptical stance: "prove it."

Or is Copenhagen more of a "don't care" attitude, meaning the particle could be taking a real path as in Bohmian mechanics but Copenhagenists simply don't care?
That's closer-- I don't think they'd say they don't care, I think they'd say they don't believe. They don't want physics to require faith in that which can't be observed but "makes sense." They don't expect it to make sense.
 
  • #45
Ken and others,

What do you think of the latest experiment "Observing the Average Trajectories of Single Photons in a Two-Slit Interferometer"

described in:

http://www.physorg.com/news/2011-06-quantum-physics-photons-two-slit-interferometer.html

http://scienceblogs.com/principles/2011/06/watching_photons_interfere_obs.php

My question is: they can do a weak measurement on a particle before full collapse. This means the particle has a trajectory, in contrast to the pure Copenhagen concept where a particle only pops up upon collapse of the wave function (which stands for a wave of possibilities of where the particle would be detected). What do you make of the latest experiment?
 
  • #46
Varon said:
They can do a weak measurement on a particle before full collapse. This means the particle has a trajectory, in contrast to the pure Copenhagen concept where a particle only pops up upon collapse of the wave function (which stands for a wave of possibilities of where the particle would be detected).
It doesn't mean that. The second article gives a much more nuanced description than the first. Nothing in that experiment is the trajectory of an individual photon. Instead, what they have seems to me equivalent to what you'd get if you put the detecting screen at various different places to create a field of detection densities, attributed the detection densities to trajectory densities (as could be done with any divergence-free field), and drew the "field lines", calling them average trajectories. I'll wager doing that would generate precisely the same figure. Much ado about nothing.

What they seem to be missing is that the classical picture of waves going through two slits could generate the same figure. What makes the quantum realm so weird is the quantization-- not the averaged behavior. I really don't see what "weak measurement" is adding to the question; it is still not true that you can say which slit any of those photons went through.
 
  • #47
Ken G said:
It doesn't mean that. The second article gives a much more nuanced description than the first. Nothing in that experiment is the trajectory of an individual photon. Instead, what they have seems to me equivalent to what you'd get if you put the detecting screen at various different places to create a field of detection densities, attributed the detection densities to trajectory densities (as could be done with any divergence-free field), and drew the "field lines", calling them average trajectories. I'll wager doing that would generate precisely the same figure. Much ado about nothing.

What they seem to be missing is that the classical picture of waves going through two slits could generate the same figure. What makes the quantum realm so weird is the quantization-- not the averaged behavior. I really don't see what "weak measurement" is adding to the question; it is still not true that you can say which slit any of those photons went through.

You mean when these detection densities were put at various different places, it collapsed the wave function at that point? But no, there was still a final interference pattern at the main screen. The momentum detection at different places from the slit didn't collapse the wave function. This is why it's called weak measurement: it only nudges them, not enough to collapse the wave function.

Now here's the question. Does it make sense for there even to be ensemble trajectories for an uncollapsed state? For you, do you think a single particle has a trajectory even before measurement? Or not? If not, how could a single particle not have a trajectory, yet an ensemble of them does? Note again that putting momentum-polarization weak measurements at various points after the slits doesn't collapse the wave function, because there is still final interference at the final detector screen.
 
  • #48
What I'm saying is, I'm not convinced that "weak measurement" is any different from "compiling average trajectories from treating the wave energy flux like a divergence-free field and drawing 2D lines of force for that field." I maintain you could get that exact same picture by measuring the energy flux of a classical wave passing between two slits, and drawing trajectories such that the line density is proportional to the energy flux density. This would be completely consistent with a macroscopic treatment of an energy flux as a photon number flux. Those trajectories don't really mean anything beyond a statistical treatment of where photons go in large aggregations; that they could get the same picture with "weak measurement" of "one photon at a time" doesn't strike me as being at all profound.

Let me put it another way. The key statement that we don't know the trajectory of an individual photon is that we cannot know which slit it went through and still have that photon participate in an interference pattern. Does this experiment tell us which slit any of those photons went through? No. So what? There are still no trajectories in the physical reality of what happened to those photons, and it's not at all clear that an "average trajectory" is anything different from the usual macro aggregate measurement in the classical limit. To me, all this experiment amounts to is a kind of consistency check that "weak measurement" can recover statistical aggregates, but I see no threat to the CI interpretation that the reality is still only what you measure and not what happens between the measurements. So they can create weak measurements that don't completely collapse the wave function, then recover the aggregate behavior in the same way that complete measurements, which do collapse the wave function, could easily do also. What does that tell us? That weak measurements don't mess up aggregate results? Why should we be surprised-- the weak measurements don't tell us the trajectories of any of those particles.
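
To make the construction I have in mind concrete, here is a toy numerical sketch of the "streamlines of the flux" idea (my own illustration, not the experimenters' method, assuming idealized cylindrical waves from the two slits):

```python
# Model two slits as two coherent point sources, compute the flux
# (probability current) of the superposed wave, and trace its streamlines.
import numpy as np

k = 2 * np.pi   # wavenumber (wavelength = 1, arbitrary units)
d = 5.0         # slit separation

def psi(x, z):
    """Superposition of two cylindrical waves from slits at (+/- d/2, 0)."""
    r1 = np.sqrt((x - d/2)**2 + z**2)
    r2 = np.sqrt((x + d/2)**2 + z**2)
    return np.exp(1j*k*r1)/np.sqrt(r1) + np.exp(1j*k*r2)/np.sqrt(r2)

def flux(x, z, eps=1e-4):
    """Flux j = Im(psi* grad psi), by central finite differences."""
    p = np.conj(psi(x, z))
    jx = np.imag(p * (psi(x+eps, z) - psi(x-eps, z)) / (2*eps))
    jz = np.imag(p * (psi(x, z+eps) - psi(x, z-eps)) / (2*eps))
    return jx, jz

# Euler-integrate streamlines from points just behind the slits; plotting
# the set of paths gives the "average trajectory" style figure.
streamlines = []
for x0 in np.linspace(-d, d, 15):
    x, z, path = x0, 1.0, []
    for _ in range(2000):
        jx, jz = flux(x, z)
        norm = np.hypot(jx, jz) + 1e-12
        x, z = x + 0.05*jx/norm, z + 0.05*jz/norm
        path.append((x, z))
    streamlines.append(path)
```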
 
  • #49
Ken G said:
What I'm saying is, I'm not convinced that "weak measurement" is any different from "compiling average trajectories from treating the wave energy flux like a divergence-free field and drawing 2D lines of force for that field." I maintain you could get that exact same picture by measuring the energy flux of a classical wave passing between two slits, and drawing trajectories such that the line density is proportional to the energy flux density. This would be completely consistent with a macroscopic treatment of an energy flux as a photon number flux. Those trajectories don't really mean anything beyond a statistical treatment of where photons go in large aggregations; that they could get the same picture with "weak measurement" of "one photon at a time" doesn't strike me as being at all profound.

Let me put it another way. The key statement that we don't know the trajectory of an individual photon is that we cannot know which slit it went through and still have that photon participate in an interference pattern. Does this experiment tell us which slit any of those photons went through? No. So what? There are still no trajectories in the physical reality of what happened to those photons, and it's not at all clear that an "average trajectory" is anything different from the usual macro aggregate measurement in the classical limit. To me, all this experiment amounts to is a kind of consistency check that "weak measurement" can recover statistical aggregates, but I see no threat to the CI interpretation that the reality is still only what you measure and not what happens between the measurements. So they can create weak measurements that don't completely collapse the wave function, then recover the aggregate behavior in the same way that complete measurements, which do collapse the wave function, could easily do also. What does that tell us? That weak measurements don't mess up aggregate results? Why should we be surprised-- the weak measurements don't tell us the particle trajectories.

Hmm... have you actually read the original paper? It has something to do with momentum and polarization giving away the positions. I wonder if this is similar or compatible to your idea that classical waves can produce the same positions too. Anyway, try to read the following descriptions of the experiment if you don't have access to the original:

http://www.scientificamerican.com/blog/post.cfm?id=what-does-the-new-double-slit-exper-2011-06-07

http://www.sciencedaily.com/releases/2011/06/110602143159.htm
 
  • #50
I know they are using a subtle approach to their weak measurements, that's not the point I'm making. I'm saying that no matter how they do it, the "average trajectories" they get are obviously the same as the streamlines of what we would call the "photon fluxes" in a completely classical limit where they are just the energy flux in a classical wave going between two slits. So I could easily draw their exact same figure with entirely classical measurements of an entirely classical wave. So their result (that figure) is nothing the least bit surprising. So what is their claim? That somehow the "weak measurements" are telling us something more than the exact same figure made purely classically? I see no evidence for that claim at all, if you have the exact same output as a classical approach, you don't have any additional information there, you just have a much more complicated way of extracting the same information.

The other way to get that same figure is to send one photon through at a time, and just let it hit a detector on a wall that is at variable distances from the slits, running the experiment over and over. Normalize the patterns on all those walls to have zero divergence, and draw the stream lines. Same picture again, still no trajectories of any individual photons, just an aggregate of different detector realities uniting to make a pretty picture.
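
And a sketch of this "variable wall distance" version, using the same toy wave as in the earlier sketch (again my own illustration, with the detection histograms faked by sampling the intensity rather than counting real photons):

```python
# Build "average trajectories" purely from detection densities at many
# wall distances: normalize each wall to the same total (zero divergence),
# then thread contours of constant cumulative detection fraction.
import numpy as np

zs = np.linspace(2.0, 30.0, 50)      # wall distances behind the slits
xs = np.linspace(-20.0, 20.0, 400)   # detector positions along each wall

def intensity(x, z, d=5.0, k=2*np.pi):
    r1 = np.sqrt((x - d/2)**2 + z**2)
    r2 = np.sqrt((x + d/2)**2 + z**2)
    p = np.exp(1j*k*r1)/np.sqrt(r1) + np.exp(1j*k*r2)/np.sqrt(r2)
    return np.abs(p)**2

rows = np.array([intensity(xs, z) for z in zs])
rows /= rows.sum(axis=1, keepdims=True)   # same detection total per wall

# A "trajectory" is the locus where the cumulative detection fraction to
# the left of x stays constant as the wall distance varies.
cum = np.cumsum(rows, axis=1)
trajectories = []
for frac in np.linspace(0.1, 0.9, 9):
    line_x = [xs[np.searchsorted(c, frac)] for c in cum]
    trajectories.append(list(zip(line_x, zs)))
```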
 
  • #51
Ken G said:
I know they are using a subtle approach to their weak measurements, that's not the point I'm making. I'm saying that no matter how they do it, the "average trajectories" they get are obviously the same as the streamlines of what we would call the "photon fluxes" in a completely classical limit where they are just the energy flux in a classical wave going between two slits. So I could easily draw their exact same figure with entirely classical measurements of an entirely classical wave. So their result (that figure) is nothing the least bit surprising. So what is their claim? That somehow the "weak measurements" are telling us something more than the exact same figure made purely classically? I see no evidence for that claim at all, if you have the exact same output as a classical approach, you don't have any additional information there, you just have a much more complicated way of extracting the same information.

The other way to get that same figure is to send one photon through at a time, and just let it hit a detector on a wall that is at variable distances from the slits, running the experiment over and over. Normalize the patterns on all those walls to have zero divergence, and draw the stream lines. Same picture again, still no trajectories of any individual photons, just an aggregate of different detector realities uniting to make a pretty picture.

You are saying a classical wave (without any particle) can also produce the same results. OK.
Try to apply a pure wave to the following description (see below). What is the counterpart to "photon polarization", or "Photons that enter the calcite perpendicular to the surface pass straight through", or "Photons that enter at a shallower angle follow a longer path through the calcite"? Can you put a pure wave into a calcite crystal? See below:

Excerpt from http://scienceblogs.com/principles/2011/06/watching_photons_interfere_obs.php

"How do you only measure a tiny bit of the momentum? Isn't that a "little bit pregnant" sort of contradiction? The system they used for this is really ingenious: they use the photon polarization as a partial indicator of the momentum. They send their original photons in in a well-defined polarization state, then pass them through a calcite crystal. Calcite is a "birefringent" material, which changes the polarization by a small amount depending on the amount of material the photon passes through.

Photons that enter the calcite perpendicular to the surface pass straight through, and travel a distance equal to the thickness of the calcite. Photons that enter at a shallower angle follow a longer path through the calcite (think of it like cutting a loaf of French bread on the bias-- the angle-cut pieces are longer than the thickness of the loaf), and thus experience a greater change in polarization. The polarization of an individual photon then depends on the angle it took through the calcite, which tells you the direction of its momentum. The magnitude of the momentum is determined by the wavelength, which is the same for all the photons, so this gives you the information you need for the trajectory."
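
For what it's worth, a cartoon of the geometry that excerpt describes (toy numbers of my own, not from the paper): the polarization rotation grows with path length through the calcite, and path length grows with entry angle.

```python
import numpy as np

def path_length(thickness, theta):
    """Path through a slab entered at angle theta from the normal --
    the 'loaf of French bread cut on the bias' picture above."""
    return thickness / np.cos(theta)

def entry_angle(rotation, rot_per_length, thickness):
    """Invert a measured polarization rotation back to the entry angle,
    using rotation = rot_per_length * thickness / cos(theta)."""
    return np.arccos(rot_per_length * thickness / rotation)

# Example: a photon whose rotation is 10% more than the straight-through
# value entered at about 25 degrees from the normal.
print(np.degrees(entry_angle(1.10, rot_per_length=1.0, thickness=1.0)))
```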
 
  • #52
Varon said:
You are saying a classical wave (without any particle) can also produce the same results. OK.
Try to apply a pure wave to the following description (see below). What is the counterpart to "photon polarization", or "Photons that enter the calcite perpendicular to the surface pass straight through", or "Photons that enter at a shallower angle follow a longer path through the calcite"? Can you put a pure wave into a calcite crystal? See below:
I'm saying the details of how they generate that figure don't matter; what matters is its information content, which I can get in much easier ways. Let me ask if you agree that the "average trajectories" they plot are indeed exactly the same as we would get via my method #2 above-- running one photon at a time through exactly their configuration, just putting the wall at different distances, and collecting the aggregate detections. Then build up a concept of the aggregate photon flux by taking those measurements, normalizing the total detection numbers to a constant total for every wall distance used (zero divergence), and drawing the "field line density" for that divergence-free detection field. That's exactly how we would generate a concept of "aggregate photon flux" in this very two-slit experiment, in a completely classical limit of many iterations of slightly different experimental setups (the distance to the wall being the sole variable).

If we can agree that I can get the exact same figure my way, with no subtle "weak measurements", then the question to ask is: what additional information are they extracting with their clever measurements if they end up with the exact same figure I get?

Note that it makes no difference how clever their measurements are-- if they can tell which slit the photon went through, they won't get that photon to participate in an interference pattern anywhere. That is all the CI needs to hold.
 
  • #53
I wish this thread were in the QM Forum so the quantum physicists could challenge your claim or comment. Anyway, why don't you participate any more in the QM Forum? Are you bored there, Ken?
 
  • #54
Nah, just busy! You're right, this really should be over there at this point. Hopefully someone will start a thread on this experiment over there, but my guess is that, since the result looks just like what anyone would call the photon flux pattern in a two-slit experiment, it won't create much of a stir. My opinion is, these folks have simply created a very roundabout way to measure what is easily construed as a classical wave energy flux distribution.
 
  • #55
Ken G said:
Nah, just busy! You're right, this really should be over there at this point. Hopefully someone will start a thread on this experiment over there, but my guess is that, since the result looks just like what anyone would call the photon flux pattern in a two-slit experiment, it won't create much of a stir. My opinion is, these folks have simply created a very roundabout way to measure what is easily construed as a classical wave energy flux distribution.

But even though the experiment didn't make simultaneous position and momentum measurements, at least it shows that between measurements the particle exists! There is a variant of Copenhagen which states that in between measurements the particle is not even in spacetime. A second variant believes the particle turns into a pure wave. So you mean this experiment didn't refute the second variant, where the particle just turns into a pure wave in between measurements, and only refutes the variant where particles are not located in spacetime in between measurements?
 
  • #56
Ken G said:
If we can agree that I can get the exact same figure my way, with no subtle "weak measurements", then the question to ask is: what additional information are they extracting with their clever measurements if they end up with the exact same figure I get?

The difference here is surely that the photon path is (weakly) observed. Yes, there is absolutely no surprise in the result. But the point was there could have been, with an actual observation.

For instance, the photons could all have come through just a single slit or tunneled through the barrier or whatever. Or been pure wave as Varon says. Not likely outcomes. But the only way to rule them out is observation.

The fact that each path is the average of some 32,000 events makes it very averaged. But still, this seems like new information.
 
  • #57
I can accept that the observation achieves a concept of "photon lines of flux" via a different observation strategy than the most obvious way to do it (with variable wall distances), and who knows, it might have obtained a different result, but it would have been very surprising if it had. From the point of view of "information is surprise", I'd have to say that getting the exact same result that classically aggregated "photon flux lines" would give can't be much in the way of new information. The experiment allows us to go on imagining that a classical wave-energy flux is the same thing as an aggregated discrete photon flux, but that's just what we would have imagined already. I can't say what has actually been learned here, but I certainly don't see it as a challenge to the CI stance that "individual particles do not follow trajectories unless they are observed in such a way as to establish a trajectory," nor that "particles don't participate in interference patterns if you know which slit they went through." You can still create a concept of aggregate photon fluxes, and draw streamlines consistent with that, and still have the interference pattern.

The prevailing point is that drawing photon "lines of flux" is just not the same thing as drawing individual photon trajectories, though the two are easily confused. I'd say what they have mostly done is found a very complicated and subtle way of making it easier to fall into that confusion.
 
  • #58
Ken G said:
I can accept that the observation achieves a concept of "photon lines of flux" via a different observation strategy than the most obvious way to do it (with variable wall distances), and who knows, it might have obtained a different result, but it would have been very surprising if it had. From the point of view of "information is surprise", I'd have to say that getting the exact same result that classically aggregated "photon flux lines" would give can't be much in the way of new information. The experiment allows us to go on imagining that a classical wave-energy flux is the same thing as an aggregated discrete photon flux, but that's just what we would have imagined already. I can't say what has actually been learned here, but I certainly don't see it as a challenge to the CI stance that "individual particles do not follow trajectories unless they are observed in such a way as to establish a trajectory," nor that "particles don't participate in interference patterns if you know which slit they went through." You can still create a concept of aggregate photon fluxes, and draw streamlines consistent with that, and still have the interference pattern.

The prevailing point is that drawing photon "lines of flux" is just not the same thing as drawing individual photon trajectories, though the two are easily confused. I'd say what they have mostly done is found a very complicated and subtle way of making it easier to fall into that confusion.


Hi Ken, there is an active thread in the QM Forum that discusses precisely this... and many people are confused about it. I shared your view with them that a pure classical EM wave can produce the same result, and some agree, some disagree. So please visit the thread and participate; here's the link.


https://www.physicsforums.com/showthread.php?t=503861&page=10

Thanks.
 
  • #59
Ken G said:
The prevailing point is that drawing photon "lines of flux" is just not the same thing as drawing individual photon trajectories, though the two are easily confused. I'd say what they have mostly done is found a very complicated and subtle way of making it easier to fall into that confusion.

Speaking up for ontology, I would say that there is equal danger in the "hey, anything could be happening" view. So I see this as evidence that reality is grainy, resolvable in an approximate way, but never in an absolute way.

Saying reality is determinate and saying it is constrained are two different things. And implying it is fundamentally unconstrained is another thing yet again.

So this experiment increases the evidence in support of a constraints-based view of ontology, and goes against both determinism and the "well, if it's not determined, it could be anything" alternative.

CI accepts the epistemic divide between observers and observables. But it is ontically agnostic. It has no theory about "observers". This has to be unsatisfactory in the long run.

Systems science already has developed semi-mathematical theories about constraints and degrees of freedom. It is an ontic framework that can generalise the relationship between observers and observables.
 
  • #61
Ken G said:
I'm saying the details of how they generate that figure don't matter; what matters is its information content, which I can get in much easier ways. Let me ask if you agree that the "average trajectories" they plot are indeed exactly the same as we would get via my method #2 above-- running one photon at a time through exactly their configuration, just putting the wall at different distances, and collecting the aggregate detections. Then build up a concept of the aggregate photon flux by taking those measurements, normalizing the total detection numbers to a constant total for every wall distance used (zero divergence), and drawing the "field line density" for that divergence-free detection field. That's exactly how we would generate a concept of "aggregate photon flux" in this very two-slit experiment, in a completely classical limit of many iterations of slightly different experimental setups (the distance to the wall being the sole variable).

If we can agree that I can get the exact same figure my way, with no subtle "weak measurements", then the question to ask is: what additional information are they extracting with their clever measurements if they end up with the exact same figure I get?

Note that it makes no difference how clever their measurements are-- if they can tell which slit the photon went through, they won't get that photon to participate in an interference pattern anywhere. That is all the CI needs to hold.

Ken, the experts in the forum I referred to above didn't agree that you could get the same figure by your method. If you have time, please go there and discuss your view, since it is the QM forum. Here in philosophy there are no quantum physicists, only armchair philosophers or metaphysicists who hold totally Newtonian views, as you agreed before. Or, in case you really missed the original paper and it produced new results, at least you would know so.
 
  • #62
Ken G said:
Now you are getting into the variants of Copenhagen. In "purist" Copenhagen, that of Bohr, the opposite is true-- nothing is quantum, there is no "quantum world." There is only the world of our observations-- the entire quantum realm is something just imagined, whatever we need to do the calculation to get the right prediction. von Neumann is bridging from the empiricist Copenhagen view to the rationalist many-worlds view, and his is the only one that I have a hard time seeing the consistency of. That seems to be the thrust of your issue too, but Bohr would not have had that problem.

Ken, we are now discussing your statement "the entire quantum realm is something just imagined" in the QM forum thread https://www.physicsforums.com/showthread.php?t=494788&page=11

A poster, my_wan, is confused by a certain meaning of your statement. He said, for example (see the messages starting at #165):

"Again, this is highly dependent on what is meant by "no" quantum world. Is Ken referring to nonexistent in the sense that our everyday world of observations is all there is because the quantum world is nothing more than that same world, or is it an existential nonexistence? It appears to me Ken is flirting with the existential version here, but even that is tricky. Because what exactly about it is existentially nonexistence if it is merely the world we experience? This I tried to qualitatively formulate previously by showing how even the limited frame dependent notion of space and time disappears at a fundamental level. So I need more to even guess at Ken's response."

So please go to that thread and clarify your confusing views. Thanks.
 
  • #63
First of all, they are not my confusing ideas, I am explaining Bohr's perspective. Which I agree with, and do not find confusing. But I'll do it on that thread, yes.
 
  • #64
Varon said:
Ken, the experts in the forum I referred to above didn't agree that you could get the same figure by your method. If you have time, please go there and discuss your view, since it is the QM forum. Here in philosophy there are no quantum physicists, only armchair philosophers or metaphysicists who hold totally Newtonian views, as you agreed before. Or, in case you really missed the original paper and it produced new results, at least you would know so.

Telling people to go to another forum? This thread is closed.
 
