Ballentine's Ensemble Interpretation Of QM

  • #71
bhobba said:
Indeed one could argue the Ensemble interpretation is basically just the QM mathematical formalism - but strictly speaking since it has a specific view of probability it has added an interpretive aspect - although IMHO a very minor one.
Interestingly, Khrennikov has argued that probability distributions of the Kolmogorov type are a mathematical assumption that may not be supported by actual physical quantum processes, and that a non-Kolmogorov approach/interpretation may, in fact, evade the non-locality of Bell's theorem:
In the frequency approach (if we follow to R. von Mises and define probabilities as limits of relative frequencies and not as abstract Kolmogorov measures) arguments related to locality and determinism do not play an important role in Bell’s framework.
Einstein and Bell, von Mises and Kolmogorov: reality and locality, frequency and probability
http://arxiv.org/pdf/quant-ph/0006016v2.pdf
In this review we remind the viewpoint that violation of Bell’s inequality might be interpreted not only as an evidence of the alternative-either nonlocality or “death of reality” (under the assumption the quantum mechanics is incomplete). Violation of Bell’s type inequalities is a well known sufficient condition of probabilistic incompatibility of random variables-impossibility to realize them on a single probability space. Thus, in fact, we should take into account an additional interpretation of violation of Bell’s inequality-a few pairs of random variables (two dimensional vector variables) involved in the EPR-Bohm experiment are incompatible. They could not be realized on a single Kolmogorov probability space. Thus one can choose between: a) completeness of quantum mechanics; b) nonlocality; c) “ death of reality”; d) non-Kolmogorovness. In any event, violation of Bell’s inequality has a variety of possible interpretations. Hence, it could not be used to obtain the definite conclusion on the relation between quantum and classical models.
Bell’s inequality: Physics meets Probability
http://arxiv.org/pdf/0709.3909.pdf
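As a rough numerical aside (standard CHSH angles, not taken from the papers above), the "probabilistic incompatibility" point can be checked directly: the four correlations of the EPR-Bohm experiment cannot all come from a single Kolmogorov probability space, since any such space bounds the CHSH combination by 2:

```python
import math

# Quantum correlation of a singlet pair for analyser angles a, b (radians)
def E(a, b):
    return -math.cos(a - b)

# Standard CHSH angle choices (illustrative, not from the cited papers)
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

# For four ±1-valued random variables defined on one Kolmogorov
# probability space, |S| <= 2 must hold (the CHSH inequality).
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ≈ 2.83 > 2
```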
 
  • #72
kye said:
How do you apply quantum computing (qubits) to it?

I am not into Quantum computing so I don't know any of its details.

BUT - if you are dealing with QM then it must be part of the formalism of QM without any actual interpretation. All the Ensemble interpretation does is add an interpretation of probability - nothing mysterious is going on.

Thanks
Bill
 
  • #73
bohm2 said:
a non-Kolmogorov approach/interpretation may, in fact, evade the non-locality of Bell's theorem:

Since the Kolmogorov axioms are the basis of probability, I have zero idea what a non-Kolmogorov approach even means. Every single book on probability I am aware of, such as Feller's classic that is my bible on it, bases probability on them.

Thanks
Bill
 
  • #74
Ken G said:
So you are saying that the ensemble interpretation asserts that QM is a theory about the behavior of large collections of similarly prepared subsystems, and is agnostic about the possibility of other theories that describe individual particles. If it asserts no more than that, however, I have a hard time distinguishing it in any significant way from the CI.

As I understand the term, the Copenhagen interpretation is by definition the view held by Niels Bohr and his followers. This included the collapse of the wave function upon measurement, the notoriously difficult and questionable idea of complementarity, and also the assertion that ##\psi## is a physically complete theoretical description of the experiment.

If any of these are removed, I think it best that the resulting view not be called the Copenhagen interpretation.

On the other hand, the statistical (or ensemble) interpretation most probably does not need to include the above assumptions. It is mainly about the two Born rules and probabilistic reasoning. In principle it permits further description (some additional variables), but I do not think that the statistical interpretation requires it.

That's the part I'm unclear on when someone holds to the ensemble interpretation-- are they expressing some expectation that the nature of ensembles is fundamentally different from the nature of individual systems, as appropriate targets for doing science, or is there no such assertion that could distinguish it from the CI?

Yes, there is a difference - the ensemble is just an auxiliary concept to handle the probability calculations, much as in Gibbs's work on statistical physics. The "individual system" usually refers to some concrete physical object - as one trajectory on a photograph from a Wilson cloud chamber or a bubble chamber refers to one concrete charged particle.
 
  • #75
Ken G said:
This sounds interesting, can you elaborate?

Hi Ken

It's simple mate.

It has long been known that the correct basis for probability is the Kolmogorov axioms. It's, for example, how one avoids circularity in the frequentist interpretation, the details of which you will find in any modern textbook on probability such as Feller's.

It equally well applies to the Bayesian view, and in fact any interpretation of probability is derivable from those axioms. Of course it includes a few reasonableness assumptions, as you find in any translation from pure to applied mathematics, but that's nothing new.

Now the Born rule simply talks about probability. So, since it's not talking of any particular interpretation, you base it on Kolmogorov's axioms. To apply it you need a particular interpretation. If you use the frequentist interpretation, which is based on a large ensemble of trials via the law of large numbers, you are naturally led to the Ensemble interpretation. If you use the Bayesian interpretation, where probability is the plausibility of, or subjective belief in, a particular outcome, then you are led to some version of Copenhagen. It's purely how you interpret it so you can apply it. When you do that, since the state determines probability, you have a de facto interpretation of the state.
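As a minimal sketch of that frequentist reading (the amplitudes below are made up for illustration), the Born-rule probability emerges as the limiting relative frequency over a conceptual ensemble of trials:

```python
import math
import random

random.seed(0)

# A toy qubit state a|0> + b|1>; the Born rule gives P(0) = |a|^2.
# (Illustrative numbers, not from any post in this thread.)
a = math.sqrt(0.2)
b = math.sqrt(0.8)
p0 = a * a  # Born-rule probability of outcome 0

# Frequentist reading: probability as the limiting relative frequency
# over a large ensemble of identically prepared trials (law of large numbers).
trials = 100_000
hits = sum(1 for _ in range(trials) if random.random() < p0)
print(hits / trials)  # close to 0.2
```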

Thanks
Bill
 
  • #76
bhobba said:
Since the Kolmogorov axioms are the basis of probability, I have zero idea what a non-Kolmogorov approach even means. Every single book on probability I am aware of, such as Feller's classic that is my bible on it, bases probability on them.
Here are a few links:

Non-Kolmogorov probability models and modified Bell's inequality
http://cds.cern.ch/record/429899/files/0003017.pdf

Interpretations of Probability
http://f3.tiera.ru/2/M_Mathematics/MV_Probability/Khrennikov%20A.%20Interpretations%20of%20probability%20(2ed.,%20de%20Gruyter,%202009)(ISBN%203110207486)(236s)_MV_.pdf

Non-Kolmogorov Probability Theory
http://link.springer.com/chapter/10.1007/978-94-009-1483-4_5
 
  • #77
Jano L. said:
As I understand the term, the Copenhagen interpretation is by definition the view held by Niels Bohr and his followers. This included the collapse of the wave function upon measurement, the notoriously difficult and questionable idea of complementarity, and also the assertion that ##\psi## is a physically complete theoretical description of the experiment.

That's true.

The key idea however is that the state applies to a single system. The view of the state could be that it's real, or simply something that tells us about the plausibility of, or subjective belief in, the outcomes of observations. That view is perfectly in line with the Bayesian view of probability, and if you interpret the probabilities of the Born rule that way, that's the view you are naturally led to.

If it's simply a belief, or an indicator of plausibility, and not real collapse, it's of zero consequence - it's simply something that happens in theoretical calculations. The issue with collapse is if you think it's actually real in some sense - but only some variants of Copenhagen do this - for most it's simply an indicator of plausibility or something similar.

I have to say sometimes Copenhagen is not explained too well, and I was caught in some confusion when I first started posting on this forum - it took a few discussions to understand the nuances.

The Wikipedia article explains it reasonably well:
http://en.wikipedia.org/wiki/Ensemble_interpretation

It is in this manner, that the ensemble interpretation is quite able to deal with “single” or individual systems on a probabilistic basis. The standard Copenhagen Interpretation (CI) is no different in this respect. A fundamental principle of QM is that only probabilistic statements may be made, whether for individual systems/particles, a simultaneous group of systems/particles, or a collection (ensemble) of systems/particles. An identification that the wave function applies to an individual system in standard CI QM does not defeat the inherent probabilistic nature of any statement that can be made within standard QM. To verify the probabilities of quantum mechanical predictions, however interpreted, inherently requires the repetition of experiments, i.e. an ensemble of systems in the sense meant by the ensemble interpretation. QM cannot state that a single particle will definitely be in a certain position, with a certain momentum at a later time, irrespective of whether or not the wave function is taken to apply to that single particle. In this way, the standard CI also “fails” to completely describe “single” systems.

However, it should be stressed that, in contrast to classical systems and older ensemble interpretations, the modern ensemble interpretation as discussed here, does not assume, nor require, that there exist specific values for the properties of the objects of the ensemble, prior to measurement.

Thanks
Bill
 
  • #78
bohm2 said:
Here are a few links:

Before going through them, to see if it's even worthwhile, which I seriously doubt because what probability is is VERY well known, can you detail how it is even possible to have probability without the Kolmogorov axioms?

When you have studied probability like I have that's a pretty wild claim - take my word for it. Strong claims require strong evidence.

What they may be talking about are some slight modifications to the axioms to include things like negative probabilities - that sort of thing is sometimes bandied about - but whether such a thing is probability in any usual sense is highly debatable.

Ok - to cut this discussion short I had a quick look.

The claim is that, like non-Euclidean geometry, you can relax one of Kolmogorov's axioms and get a non-standard version. However, you can be 100% guaranteed that the probabilities in the Born rule are assumed to obey those axioms.

Thanks
Bill
 
  • #79
Bill, I do not understand how you arrive at this connection Frequentist -- Ensemble interpretation, Bayesian -- Copenhagen. I do not think there is any such connection at all.

When thinking about probability questions, I adopt the Bayesian view, and when thinking about atoms/molecules, I think in terms of the statistical interpretation. I do not see any contradiction. Ensembles are just a way to work with probability - they are "conceptual". They do not require the frequentist view of probability. Gibbs used them in his work on statistical physics, and in his explanation of the entropy of mixing, I think one reads an example of Bayesian reasoning.
 
  • #80
Jano L. said:
Bill, I do not understand how you arrive at this connection Frequentist -- Ensemble interpretation, Bayesian -- Copenhagen. I do not think there is any such connection at all.

So you don't think a conceptual ensemble is related to the frequentist interpretation, where probabilities are looked on as the behavior of a large number of trials? The ensemble is not the conceptual realization of the outcomes of those trials?

And you don't think the view of probability as a subjective belief or objective plausibility is related to viewing the state the same way?

It's perfectly obvious to me, and I suspect to anyone else who thinks about it even superficially.

Still if you don't see it - shrug.

What can be said, of course, is that since ultimately probability is defined by the Kolmogorov axioms, it's purely a matter of personal preference how you interpret it in order to apply it. In that sense the Ensemble interpretation and Copenhagen are more alike than is usually thought.

Thanks
Bill
 
  • #81
how it is even possible to have probability without the Kolmogorov axioms?
Of course, Kolmogorov's axioms for probability are standard today, but they are not the only way to approach and develop the concept. They base probability on the mathematical theory of measure, but leave other aspects untouched. One could alternatively begin with other axioms and, in principle, derive Kolmogorov's assertions as theorems.

For example, my favourite book is Jaynes': "Probability theory: the logic of science", some parts accessible at

http://bayes.wustl.edu/etj/prob/book.pdf

He does not begin with Kolmogorov's axioms - he begins with some basic requirements of consistency, which are almost what everybody uses in daily speech. In the end, of course, he finds he agrees with Kolmogorov on technical points - but his theory is not based on them.
 
  • #82
Jano L. said:
For example, my favourite book is Jaynes': "Probability theory: the logic of science"

Ah yes - the Cox axioms - they are logically equivalent to Kolmogorov's axioms:
http://en.wikipedia.org/wiki/Cox's_theorem

I am no expert in the area, but my understanding is that all approaches are either equivalent to the Kolmogorov axioms or, to avoid issues like circularity, based on them or an equivalent set.

Thanks
Bill
 
  • #83
bhobba said:
If you use the frequentist interpretation, which is based on a large ensemble of trials via the law of large numbers, you are naturally led to the Ensemble interpretation. If you use the Bayesian interpretation, where probability is the plausibility of, or subjective belief in, a particular outcome, then you are led to some version of Copenhagen.
I believe I understand. You are saying that if I say I have a 1/5 chance of finding the particle in an excited state, I must say what I mean by that statement. I can then say that if I had a million such systems (conceptually), I would expect about 200,000 to be found in the excited state, in which case I have adopted the ensemble interpretation of whatever matrix element gave me that 1/5. Or, I can say that my expectation of finding the one particle in its excited state has a "degree" that is characterized by 1/5, which allows me to place it as significantly less than 1/2 (the intuitively "even" chance), but not spectacularly less, where the meaning of 1/5 quantifies what I mean by between significantly and spectacularly. This sounds more like CI, because it says that my matrix element is conveying some truth about the reality of this particular system, and my expectations about its results. That all makes reasonable sense to me. But since I don't really feel there is a large difference between a frequentist and a Bayesian attitude toward the meaning of a 1/5 probability, it brings me back to my core belief that CI and ensemble approaches are not nearly so different as, say, many-worlds or deBroglie-Bohm.

To make an ensemble interpretation a truly different animal, it seems to me one must further commit to the idea that QM is just not a theory about the behavior of individual systems, because one cannot test QM on individual systems without repeating the tests on an ensemble of them. If one does stipulate that, you then have solved some of the CI problems (you don't have to view collapse as mystical, because it is just a density matrix of correlations that do not make any statements about what is really happening in a single outcome), but you have introduced new ones (you are left wondering just what the heck is going on in a single system, since your interpretation of QM is agnostic about that issue). So to me, the ensemble interpretation feels like CI equipped with a means for "ducking" the questions that don't have clear-cut answers. There's nothing wrong with ducking a question, if it has no satisfactory answer, but it's different from taking a stand on that question.

ETA: But I see above that you are also of the opinion that CI and ensemble interpretations are more alike than they are different, so we agree there.
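A minimal sketch of the frequentist reading of that 1/5 (the numbers are the hypothetical ones from the post above): over a conceptual ensemble of a million preparations, roughly 200,000 excited outcomes are expected, with a typical binomial spread of sqrt(N p (1-p)) = 400:

```python
import random

random.seed(1)

# "A 1/5 chance of finding the particle excited", read frequentist-style:
# draw a conceptual ensemble of a million identically prepared systems.
N, p = 1_000_000, 0.2
excited = sum(1 for _ in range(N) if random.random() < p)

# Expected count 200,000; typical binomial fluctuation sqrt(N*p*(1-p)) = 400
print(excited)
```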
 
  • #84
Ken G said:
I believe I understand.

Hi Ken

I am pretty sure you do.

Thanks
Bill
 
  • #85
bhobba said:
Ah yes - the Cox axioms - they are logically equivalent to Kolmogorov's axioms:
http://en.wikipedia.org/wiki/Cox's_theorem

I am no expert in the area but my understanding is all approaches are either equivalent to or to avoid issues like circularity, based on the Kolmogorov or equivalent axioms.
I'm surprised you didn't mention the Cox axioms earlier, since Ballentine uses them as his starting point, not Kolmogorov. :eek:

But in any case, even the Cox axioms are not used in unadulterated form, since the last one must be modified for QM (which Ballentine defers until a much later chapter -- ch9 iirc.) :biggrin:
 
  • #86
strangerep said:
I'm surprised you didn't mention the Cox axioms earlier, since Ballentine uses them as his starting point, not Kolmogorov. :eek:

But in any case, even the Cox axioms are not used in unadulterated form, since the last one must be modified for QM (which Ballentine defers until a much later chapter -- ch9 iirc.) :biggrin:

Hey - thanks for reminding me of that. Indeed he uses axioms closer to those of Bayes and Cox than Kolmogorov's in showing his axioms obey the laws of probability - I had forgotten that and refreshed my memory by checking the text.

Of course it doesn't make any essential difference since they are equivalent - well essentially anyway - there are a couple of minor issues like countable additivity.

Of course having been involved in similar discussions some will probably lock onto that as an issue :rolleyes::rolleyes::rolleyes::rolleyes:

Thanks
Bill
 
  • #87
When you throw 50 coins to the floor, the probability distribution is close to 25 heads and 25 tails. Maybe this is what the ensemble interpretation is saying: that some of the electrons go through one slit and the rest through the other. Does anyone have other examples where this analogy doesn't work well? Let me share a brief passage (for discussion purposes) from Lee Smolin's "Time Reborn" about why he thinks the Ensemble Interpretation doesn't make much sense (do you agree with him, why or why not?)

Lee Smolin said in "Time Reborn":

"One afternoon in early fall of 2010, I went to a cafe, opened a notebook to a blank page, and thought about my many failed attempts to go beyond quantum mechanics. I began by thinking about a version of quantum mechanics called the ensemble interpretation. This interpretation ignores the futile hope of describing what goes on in an individual experiment and instead describes an imaginary collection of all the things that might be going on in the experiment. Einstein put it nicely: “The attempt to conceive the quantum-theoretical description as the complete description of the individual systems leads to unnatural theoretical interpretations, which become immediately unnecessary if one accepts the interpretation that the description refers to ensembles (or collections) of systems and not to individual systems.”

Consider the lone electron orbiting a proton in a hydrogen atom. According to the proponents of the ensemble interpretation, the wave is associated not with the individual atom but with the imaginary collection of copies of the atom. In different members of this collection, the electrons have different positions. Thus, if you were to observe the hydrogen atom, the result would be as if you had picked out an atom at random from this imaginary collection. The wave gives the probabilities of finding the electron in all those different places.

I had liked this idea for a long time, but all of a sudden it seemed totally crazy. How could an imaginary collection of atoms influence a measurement made on one real atom? This would contradict the principle that nothing outside the universe can act on something inside the universe. So I asked myself whether I could replace that imaginary collection with a collection of real atoms. Being real, they would have to exist somewhere in the universe. Well, there are in fact a great many hydrogen atoms in the universe. Could they be the “collection” that the ensemble interpretation of quantum mechanics refers to?"

Lee Smolin wrote a paper on arXiv about it that is shared in the other thread on this. I'd like to know if his critique of Ballentine's ensemble interpretation is sound, and whether you agree. But I think tossing the 50 coins in my first example is a counter-critique.
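Smolin's hydrogen example can be made concrete with a small sketch (units of the Bohr radius a0 are assumed, with a0 = 1): the ground-state radial density P(r) ∝ r² exp(−2r/a0) is a Gamma(shape 3, scale a0/2) distribution, and sampling it plays the role of drawing atoms at random from the imaginary ensemble:

```python
import random

random.seed(2)

# Ground-state radial distribution of hydrogen, P(r) ∝ r^2 exp(-2r/a0),
# i.e. a Gamma distribution with shape 3 and scale a0/2 (here a0 = 1).
# Each draw is one "member" of the conceptual ensemble.
samples = [random.gammavariate(3, 0.5) for _ in range(200_000)]

# Ensemble average radius; the textbook expectation value is <r> = 1.5 a0
mean_r = sum(samples) / len(samples)
print(mean_r)
```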
 
  • #88
Ken G said:
...To make an ensemble interpretation a truly different animal, it seems to me one must further commit to the idea that QM is just not a theory about the behavior of individual systems, because one cannot test QM on individual systems without repeating the tests on an ensemble of them. ... There's nothing wrong with ducking a question, if it has no satisfactory answer, but it's different from taking a stand on that question.

I would agree with your statement if the word “systems” was replaced with a proper concept. Paraphrasing one of your previous inputs I would say: "It is undeniable that the state vector can and should be thought of as a representation of the statistical property of an iterative run of a uniquely prepared experiment which delivers, at each run, one amongst a set of possible outcomes". Stating that the statistical property relates to the flow of qualitative pieces of information produced by an experiment is the only true minimal position that cannot be challenged. Stating that the distribution relates to some physical system or stating that each individual piece of information relates to an individual physical system, that already goes beyond the bare minimum since it cannot be proven experimentally.

Now everyone is (no doubt) aware that the strict minimal statement above, although it fully matches a neutral reading of the QM mathematical formalism, leads to a theory which does not tell anything about what might be an acceptable / efficient model for the physical world. It only deals with transforming the observed statistical outcome of a quantum experiment into the predicted outcome of another quantum experiment, for some specific categories of transformations of the experimental set-up. It is fully legitimate that physicists don't limit themselves to developing this minimal operative view, but they must use it as a reference in order to ascertain the consequences of adding interpretative hypotheses in order to convert the minimal reading of QM (merely dealing with describing experiments) into a model - actually a simulation - of what happens in the world. The rationale for this systematic comparison between the interpreted model and the neutral reading of QM is that the combination of several interpretative hypotheses may induce some side-effects leading to fictitious contradictions inside the simulation of the world. These must be recognised as being fictitious in order not to trigger complex developments of the theory which are not rooted in the experimental realm (on that basis I fully support your second sentence in the above quote).

A key example of such artificial contradictions lies with the famous “measurement problem” which results, in several interpretations of QM, from the combination of three physical hypotheses:
i) the state vector is a property of one single iteration of the experiment;
ii) the state vector is a property of “a physical system” traveling through the device;
iii) the measured value of the state vector holds at the boundary between the “preparation” and the “measurement apparatus”, inside the experimental device.
Then, noting that the location of this boundary is arbitrary in case two or more measurement devices are placed in series, many physicists concluded that the “quantum state of the physical system” changes in a discontinuous way inside the measurement device. This obviously contrasts with the non-interpreted minimal reading of the QM formalism, where the state vector, which is a property of the iterative experiment, can only evolve in response to a change in the experimental set-up - that is, it does not evolve inside the experimental device but inside a configuration space which represents a family of possible experiments.
Thanks.
 
  • #89
kye said:
Consider the lone electron orbiting a proton in a hydrogen atom. According to the proponents of the ensemble interpretation, the wave is associated not with the individual atom but with the imaginary collection of copies of the atom.

You misunderstand. The ensemble is of the state plus the observational apparatus.

The interpretation says nothing about the state independent of the measurement context.

Thanks
Bill
 
  • #90
bhobba said:
You misunderstand. The ensemble is of the state plus the observational apparatus.

The interpretation says nothing about the state independent of the measurement context.

Thanks
Bill


It's clear in all interpretations that the world cannot be classically real and existing at all times. It stopped bothering me when I first learned that electrons around the nucleus are not moving (and that if they moved classically, they would soon lose their energy and spiral onto the nucleus), yet in the presence of a measuring apparatus they are moving classically(-like), and this is experimentally verified thousands of times per day (and their kinetic energy and speed can be calculated). This is when the classical world died for me, and a host of other issues emerged that remain unresolved.
 
  • #91
kye said:
According to the proponents of the ensemble interpretation, the wave is associated not with the individual atom but with the imaginary collection of copies of the atom. In different members of this collection, the electrons have different positions. Thus, if you were to observe the hydrogen atom, the result would be as if you had picked out an atom at random from this imaginary collection. The wave gives the probabilities of finding the electron in all those different places.

I had liked this idea for a long time, but all of a sudden it seemed totally crazy. How could an imaginary collection of atoms influence a measurement made on one real atom?

The imaginary collection does not influence the measurement on an atom. It is the act of measurement which influences the atom!

maui said:
It's clear in all interpretations that the world cannot be classically real and existing at all times. It stopped bothering me when I first learned that electrons around the nucleus are not moving (and that if they moved classically, they would soon lose their energy and spiral onto the nucleus)

I am afraid here you go too far. The idea that in classical EM theory the atom necessarily has to collapse due to radiation from its nucleus and electron, and the calculation of the rate at which this happens, is actually based on a very implausible assumption:

The atom is the only one in the universe and no external EM fields act on it, hence the radiated energy has to come from its internal energy.

It is very easy to make this unwarranted assumption. But in any realistic model of the stability of atoms, the atoms are under the action of external EM fields, if only the thermal EM field, and perhaps also the zero-point radiation. In such a situation, classical theory predicts that the electron will be maintained in chaotic motion around the nucleus. See the paper

Daniel C. Cole, Yi Zou, Quantum Mechanical Ground State of Hydrogen Obtained from Classical Electrodynamics, 2003

http://arxiv.org/abs/quant-ph/0307154
 
  • #92
kye said:
Lee Smolin wrote a paper in arxiv about it that is shared in the other thread about this. I'd like to know if his sudden critique on ballentine ensemble interpretation is sound and if you agree too.
It sounds to me like he is asking a different question than Ballentine. Smolin seems to be asking the question "what is going on for a single real particle, such that it can behave like it is a random member of a large ensemble of real particles?" And he concludes, maybe that's because it is a random member of a large ensemble of real particles. In other words, he takes a literal interpretation of the ensemble, to achieve a more realist outcome. Ballentine, on the other hand, seems to be asking, "how can I conceptualize what is going on for a single real particle, such that I can understand what it means to say it behaves statistically?" The answer there is, we can give meaning to our words about the single particle by embedding it in a much larger imaginary collection, and use that abstraction to justify using the laws of probability. The difference is that for Ballentine, a "law" would need to be no more than an abstract way of thinking about something such that it makes sense, whereas Smolin appears to be searching for laws that are more literally true in some kind of absolute sense.

Of course both of the viewpoints have their issues-- Ballentine has the problem that Smolin points out, it's not very satisfactory from a realist perspective because the abstract ensemble could not have real influences on a single particle. Smolin has the problem that he seems to suggest that quantum mechanics couldn't work in a much smaller universe, it requires a kind of "Mach indeterminism", where if Mach says that inertia for mass here comes from all the masses over there, Smolin says that indeterminism for a particle here comes from all the particles over there.

Personally, I have no issue with Ballentine's approach, because I think it is just perfectly demonstrably true that a law of physics is, and has always been, an abstract conceptualization used by physicists to give a more formal or mathematically precise meaning to the language they invoke to make sense of observations.
 
  • #93
Jano L. said:
The imaginary collection does not influence the measurement on an atom. It is the act of measurement which influences the atom!



I am afraid here you go too far. The idea that in classical EM theory the atom necessarily has to collapse due to radiation from its nucleus and electron, and the calculation of the rate at which this happens, is actually based on a very implausible assumption:

The atom is the only one in the universe and no external EM fields act on it, hence the radiated energy has to come from its internal energy.

It is very easy to make this unwarranted assumption. But in any realistic model of the stability of atoms, the atoms are under the action of external EM fields, if only the thermal EM field, and perhaps also the zero-point radiation. In such a situation, classical theory predicts that the electron will be maintained in chaotic motion around the nucleus. See the paper

Daniel C. Cole, Yi Zou, Quantum Mechanical Ground State of Hydrogen Obtained from Classical Electrodynamics, 2003

http://arxiv.org/abs/quant-ph/0307154

Months ago I was interested in Stochastic Electrodynamics, but am no longer, because it can't even explain:

1. the right hyperfine splitting of the hydrogen atom
2. the anomalous magnetic moment of the electron with many decimal digits accuracy
3. the Lamb shift

So I was discouraged with SED.

About Ballentine's interpretation: it's not implausible. Remember electron spin doesn't live on a real axis but in an abstract space, the fundamental forces come from gauge symmetries which don't have nuts-and-bolts correlates, even the Higgs field is a result of terms in the equations added so as not to destroy the equations' symmetry, and general covariance shows spacetime is not a thing, so it's really possible electrons or other particles don't have real paths but only look like they have a path because of our measurement context.
 
  • #94
Sugdub said:
Stating that the statistical property relates to the flow of qualitative pieces of information produced by an experiment is the only true minimal position that cannot be challenged. Stating that the distribution relates to some physical system or stating that each individual piece of information relates to an individual physical system, that already goes beyond the bare minimum since it cannot be proven experimentally.
Spoken like a true empiricist! I agree that empiricism is the bedrock of science, but I would point out that other approaches to empirical constraints are possible. Many physicists prefer a more rationalist stance, whereby observations are used to say "which theory is most right", rather than using theories to "make sense of the observations." In the practical doing of science, that distinction is rather minor, which is how science works as a consensus endeavor despite these different philosophical perspectives. But in terms of what we think we are actually doing when we do science, those two perspectives are extremely different, and can create a kind of "tower of Babel" problem when people start to discuss interpretations!

It is fully legitimate that physicists don't limit themselves to developing this minimal operative view, but they must use it as a reference in order to ascertain the consequences of adding interpretative hypotheses which convert the minimal reading of QM (merely dealing with describing experiments) into a model - actually a simulation - of what happens in the world.
Yes, this is a vivid statement of the core of empiricism-- that which is real is what is observed, the rest is interpretive matrix. But the rationalist does not accept that stance-- they typically feel that what is real is necessarily perceived by our limited minds as something abstract, and our reliance on observation is like a crutch that sees "through the looking glass darkly", as it were.

The rationale for this systematic comparison between the interpreted model and the neutral reading of QM is that the combination of several interpretative hypotheses may induce side-effects leading to fictitious contradictions inside the simulation of the world. These must be recognised as fictitious in order not to trigger complex developments of the theory which are not rooted in the experimental realm (on that basis I fully support your second sentence in the above quote).
Here you might be talking about QM and GR, and the issues with unifying them. I agree, I think we take these models too seriously. But I would have to say that I tend to side with empiricists, though I also think that rationalism has its place. I kind of see them as different hats that we can put on depending on the dress code of the affair we are attending. The empiricist hat that you are describing so accurately seems to fit me best, but I can put on the other hat too, and see reasons and places where I might want to do that.

A key example of such artificial contradictions lies with the famous “measurement problem” which results, in several interpretations of QM, from the combination of three physical hypotheses:
i) the state vector is a property of one single iteration of the experiment;
ii) the state vector is a property of “a physical system” traveling through the device;
iii) the measured value of the state vector holds at the boundary between the “preparation” and the “measurement apparatus”, inside the experimental device.
Then, noting that the location of this boundary is arbitrary when two or more measurement devices are placed in series, many physicists concluded that the "quantum state of the physical system" changes in a discontinuous way inside the measurement device. This obviously contrasts with the non-interpreted minimal reading of the QM formalism, where the state vector, which is a property of the iterative experiment, can only evolve in response to a change in the experimental set-up; that is, it does not evolve inside the experimental device but inside a configuration space which represents a family of possible experiments.
That's an interesting way to frame the measurement problem, I will cogitate on it!
 
  • #95
I reviewed this on Wikipedia and read this:

"The Ensemble interpretation is not popular, and is regarded as having been decisively refuted by some physicists. John Gribbin writes:-

"There are many difficulties with the idea, but the killer blow was struck when individual quantum entities such as photons were observed behaving in experiments in line with the quantum wave function description. The Ensemble interpretation is now only of historical interest."[12]

The "12" rrefers to Q for quantum book. So if Lee Smolin is mistakened about it, Griffen can too?
Actually I have tried reading Ballentine book years ago. I thought he was referring to throwing coins in the table and getting heads up and tails up half of the time when you can see individual pieces producing the head, tail. So in Ballentine's. Do all agree with Hobba that it is basically about "The interpretation says nothing about the interpretation of the state independent of measurement context."? Because without this emphasis, it can confused generations, and confused griffin's too?
 
  • #96
kye said:
The Ensemble interpretation is not popular, and is regarded as having been decisively refuted by some physicists.
From my understanding, there seem to be sub-interpretations of the Ensemble interpretation just like there are sub-interpretations of all the other interpretations which just leads to more confusion.
 
  • #97
kye said:
The Ensemble interpretation is now only of historical interest.

I think someone needs to chat to Ballentine about it.

I mean, his textbook is called a modern approach, yet what it advocates is supposedly only of historical interest.

:rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes::rolleyes:

Thanks
Bill
 
  • #98
bhobba said:
You misunderstand. The ensemble is state and observational apparatus. The interpretation says nothing about the interpretation of the state independent of measurement context.
I don't think I agree. If we have prepared a state |ψ(t0)> the conceptual ensemble evolves in time according to the Schrödinger equation. How is the interpretation of the quantum state as an ensemble at an arbitrary later time t dependent on a (future) measurement context? I would agree that it is dependent on a preparation procedure but this is not part of the ensemble.

kye said:
How could an imaginary collection of atoms influence a measurement made on one real atom?
The ensemble is associated with a preparation procedure. If you know that your free hydrogen atom can be described by the state |ψ>, this implies that a process has occurred which prepared it in this state. Such processes can in principle not lead to a state where all observables have well-defined values, due to the uncertainty principle. So what influences the measurement of a single atom is not the "imaginary collection" but the fact that the preparation procedure is imperfect and thus compatible with different values of the observable you want to measure.
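The point that the ensemble's statistics at a later time are fixed by the preparation plus Schrödinger evolution, with no reference to a future measurement context, can be sketched numerically. This is a minimal illustration, assuming a hypothetical two-level system with a Larmor-type Hamiltonian; the frequency `omega` and the evolution time are arbitrary choices, not anything from the thread:

```python
import numpy as np

# Pauli matrix sigma_x (hbar = 1 throughout)
sx = np.array([[0, 1], [1, 0]], dtype=complex)

omega = 2.0          # illustrative precession frequency
H = 0.5 * omega * sx  # Hamiltonian driving the evolution

def evolve(psi0, t):
    """Schroedinger evolution |psi(t)> = exp(-i H t) |psi(0)>."""
    # eigendecomposition gives the matrix exponential for Hermitian H
    E, V = np.linalg.eigh(H)
    U = V @ np.diag(np.exp(-1j * E * t)) @ V.conj().T
    return U @ psi0

psi0 = np.array([1, 0], dtype=complex)   # prepared as |z+> at t0
psi_t = evolve(psi0, t=0.7)

# Ensemble statistics at t follow from the preparation alone:
p_up = abs(psi_t[0]) ** 2                # P(sigma_z = +1/2) at time t
norm = abs(psi_t[0]) ** 2 + abs(psi_t[1]) ** 2
print(p_up, norm)                        # norm stays 1 under unitary evolution
```

Nothing in the computation refers to a measurement apparatus; the apparatus only enters when one asks which observable's probabilities to read off from `psi_t`.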
 
  • #99
kye said:
Months ago I was interested in Stochastic Electrodynamics, but I am no longer interested because it can't even explain:

1. the right hyperfine splitting of the hydrogen atom
2. the anomalous magnetic moment of the electron to many decimal digits of accuracy
3. the Lamb shift

Kye, you probably wanted to say "did not predict" instead of "can't explain". Showing the latter would be quite an accomplishment. If it actually has been shown, could you please post a reference?

Anyway, I was answering Maui's argument, which was based on the belief that an atom cannot be stable in classical EM theory. In light of modern developments, the original reason for this belief is seen as unwarranted. There is no longer any "atom-collapse" problem in classical EM theory.

kith said:
I don't think I agree. If we have prepared a state |ψ(t0)> the conceptual ensemble evolves in time according to the Schrödinger equation. How is the interpretation of the quantum state as an ensemble at an arbitrary later time t dependent on a (future) measurement context? I would agree that it is dependent on a preparation procedure but this is not part of the ensemble.

Very good point! Our choice of the Psi function ##\psi(\mathbf r, 0)## or spin vector ##|S(0)\rangle## used to describe the system at ##t = 0## may depend on what happened to the atom previously, but in the standard use of these things, there is no part in them referring to any measurement apparatus.

It is only the resulting derived probability that may refer to some measurements. Here comes the source of the confusion: the situation is different for particle positions and for spin projections.

-> For particle positions, ##\psi(\mathbf r, t)## and the ensemble we use to interpret its squared modulus indeed do not depend on any measurements.

For example, the probability for volume ##\Delta V##

$$
P(\Delta V) = \int_{\Delta V} |\psi(\mathbf r, t)|^2\,dV
$$

is simply the probability that the particle is in ##\Delta V##, not that "it will be measured to be in ##\Delta V##". Such a measurement for small ##\Delta V## is impossible to do for atoms (short-wavelength light will ionize the atom), and for many-particle systems like ##\text{H}_2 \text{O}## the probability in question is for configurations of particles, which are even more unmeasurable.
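As a concrete check of this integral, here is a minimal numerical sketch, assuming the hydrogen 1s ground state in atomic units (an example of my choosing, not discussed in the post): the probability of finding the electron within one Bohr radius of the nucleus is ##1 - 5e^{-2} \approx 0.32##, and no measurement apparatus enters the calculation.

```python
import numpy as np

a = 1.0  # Bohr radius in atomic units

def psi_sq(r):
    """|psi_1s(r)|^2 for the hydrogen ground state, (1/sqrt(pi a^3)) e^{-r/a}."""
    return np.exp(-2 * r / a) / (np.pi * a**3)

# P(Delta V) = integral of |psi|^2 over Delta V; for a sphere of radius a,
# spherical symmetry reduces it to a radial integral with weight 4 pi r^2.
r = np.linspace(0.0, a, 100_001)
integrand = psi_sq(r) * 4 * np.pi * r**2

# trapezoidal rule
dr = r[1] - r[0]
p_inside = np.sum(0.5 * (integrand[:-1] + integrand[1:])) * dr

print(p_inside)   # close to the exact value 1 - 5*exp(-2)
```

The exact value follows by integrating ##(4/a^3) r^2 e^{-2r/a}## from 0 to ##a## by parts.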

-> For spins, it is very different, both mathematically and physically. In general, it is not possible to repeat the above reasoning and correctly interpret the spin ket ##|S\rangle = c_1 |z+\rangle + c_2 |z-\rangle## by saying that the spin has the value either ##+1/2## or ##-1/2## irrespective of measurement, with probabilities given by ##|c_1|^2, |c_2|^2##, except for the special case when either ##|c_1|^2 = 1## or ##|c_2|^2 = 1##. The SG-like experiments measuring spin projections along different axes refute such an idea - we know that prior to the measurement, the spin ket can have any orientation in space, not just the two we measure, and it is the interaction with the SG magnet that selects one value from ##+1/2, -1/2##.

So the role of measurement is different for continuous configurations and for the discrete results of SG-like measurements, and we should take this into consideration every time the confusion about measurements creeps in.
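The axis dependence described above can be sketched numerically. This is an illustrative example (the helper `prob_up_along` and the specific angles are my own, not from the thread): for the ket ##|z+\rangle##, the probabilities ##|c_1|^2, |c_2|^2## are defined only relative to a chosen measurement axis ##\mathbf o##.

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

def prob_up_along(psi, theta, phi):
    """P(+1/2) for an SG-type measurement along the axis n(theta, phi)."""
    n_dot_sigma = (np.sin(theta) * np.cos(phi) * sx
                   + np.sin(theta) * np.sin(phi) * sy
                   + np.cos(theta) * sz)
    # eigh returns eigenvalues in ascending order: -1 first, +1 second
    _, V = np.linalg.eigh(n_dot_sigma)
    plus = V[:, 1]                       # eigenvector for eigenvalue +1
    return abs(plus.conj() @ psi) ** 2

psi = np.array([1, 0], dtype=complex)    # the ket |z+>

p_z = prob_up_along(psi, 0.0, 0.0)        # measure along z: certain +1/2
p_x = prob_up_along(psi, np.pi / 2, 0.0)  # measure along x: 50/50
print(p_z, p_x)
```

For ##|z+\rangle## the result is ##\cos^2(\theta/2)## along any axis tilted by ##\theta## from z, which makes Jano L.'s point explicit: the value ##\pm 1/2## has no meaning until the axis is chosen.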
 
  • #100
Jano L. said:
I am afraid here you go too far. The idea that in classical EM theory the atom necessarily has to collapse due to radiation of its nucleus and electron, and the calculation of the rate at which this happens, is actually based on a very implausible assumption:

The atom is the only one in the universe and no external EM fields act on it, hence the radiated energy has to come from its internal energy.

It is very easy to make this unwarranted assumption. But in any realistic model of the stability of atoms, the atoms are under the action of external EM fields, if only the thermal EM field, and perhaps also the zero-point radiation. In such a situation, classical theory predicts that the electron will be maintained in chaotic motion around the nucleus. See the paper

Daniel C. Cole, Yi Zou, Quantum Mechanical Ground State of Hydrogen Obtained from Classical Electrodynamics, 2003

http://arxiv.org/abs/quant-ph/0307154

Correct me if I am wrong, but if electrons moved in trajectories around the nucleus, covalent bonds between atoms would be impossible and a big chunk of quantum chemistry would go out the window (e.g. H has just one electron and it's shared in the H2O molecule).
 
  • #101
Correct me if I am wrong, but if electrons moved in trajectories around the nucleus, covalent bonds between atoms would be impossible and a big chunk of quantum chemistry would go out the window (e.g. H has just one electron and it's shared in the H2O molecule).

I do not understand how you can derive such a conclusion. The theory of the chemical bond is based on Schroedinger's equation for point-like charged particles. It is not based on rejecting the existence of trajectories. Sure, it does not use the concept, but it also does not disprove it.

A covalent bond involves sharing of electrons, and the probability distribution for positions is spread all over the molecule, so the natural way to imagine this is that electrons move all around the molecule, visiting randomly all the nuclei in the vicinity. Since electrons are negatively charged and are with greatest probability in between the nuclei, they attract the positive nuclei and keep them at some average distance despite their mutual repulsion. If some atoms are very attractive to electrons, they can leave other atoms without the bonding electrons for long periods of time, and the gluing effect of the latter may cease - hence the dissociation of polar molecules into ions that happens in solvents. I believe this is quite a standard picture adopted in theoretical chemistry (see works by John Slater, David Cook, ...). It may not calculate exact trajectories, but for sure it does not disprove them either.
 
  • #102
Jano L. said:
For example, the probability for volume ##\Delta V##

$$
P(\Delta V) = \int_{\Delta V} |\psi(\mathbf r, t)|^2\,dV
$$

is simply the probability that the particle is at ##\Delta V##, not that "it will be measured to be in ##\Delta V##".
How could an experiment distinguish between these two statements?

Jano L. said:
The SG-like experiments measuring spin projections along different axes refute such idea [...]
Why can't we apply the same logic to a measurement sequence of position, momentum, position, ... ?
 
  • #103
But surely a classical picture is quite clunky, because the guts of quantum mechanics is that the classical action per mode of excitation is in some sense both quantized and bounded below at h. I'd have a hard time seeing how this clearly universal principle can arise from a picture of chaotic classical buffeting by some zero-point field. Why quantized? Why always h? Maybe it can be done, but only by bending over backward, at best. Rather than reverse-engineering a classical picture to work in a domain where it is clearly not well suited, it would seem better to embrace the new picture of wave mechanics and the straightforward path to quantization that it presents.
 
  • #104
kith said:
How could an experiment distinguish between these two statements?
You cannot measure the configuration of electronic position vectors in an atom. If you try to do so with light of wavelength comparable to the size of the atom, the radiation will have a great immediate impact on its configuration, and the analysis of the scattered radiation is very hard. X-ray scattering studies of the electronic frequency density around the nuclei of a molecule require crystals of many such molecules to accumulate the signal, and they provide only a frequency distribution averaged over those many molecules. Currently it is not possible to measure the positions of electrons in an individual molecule on the scale of its typical size.

The Born rule for ##|\psi(\mathbf r)|^2## used in treatment of atoms and molecules is not a rule to predict results of measurements of their configuration. It is a way to make sense of ##\psi##, independently of such measurements, a very important thing since they cannot currently be performed.

kith said:
Why can't we apply the same logic to a measurement sequence of position, momentum, position, ... ?

When we measure spin projection along some axis ##\mathbf o## chosen beforehand, the atoms split into two different trajectories and we label these by 1/2, -1/2. These two trajectories depend on the axis ##\mathbf o## chosen, so also the resulting value ##\sigma =1/2## or ##\sigma=-1/2## refers to this axis; it has no meaning without it. The actually measured value of projection ##\sigma## refers to the axis ##\mathbf o## and thus could not have existed in the atom before this axis was chosen.

This kind of reasoning does not apply in the case of position or momentum, because their meaning does not refer to any variable setting of the measurement apparatus.

When we measure position of a particle, we let the particle impinge on a detector screen. We do not choose any variable analogous to ##\mathbf o##, and the result of measurement ##\mathbf r## (center of the spot left by the particle, for example, a photo grain or charged CCD pixel) has direct meaning of position, independently of any choice in the setting of the detector screen.

We cannot rotate an axis ##\mathbf o## of the position detector, show that the positions will be different, and conclude that they too cannot exist before the orientation of the detector was chosen. Rotating the position detector has no such effect.

In short, position always means the same thing, irrespective of how we measure it, in contrast to the spin projection, which depends on how we orient the axis of the measuring magnet.
 
  • #105
Jano L. said:
Covalent bond involves sharing of electrons and the probability distribution for positions is spread all over the molecule, so the natural way to imagine this is that electrons move all around the molecule, visiting randomly all the nuclei in the vicinity. Since electrons are negatively charged and with greatest probability in between the nuclei, they attract positive nuclei and keep them in some average distance despite their mutual repulsion. If some atoms are very attractive to electrons, they can leave some other atoms without the bonding electrons for long periods of time and the glueing effect of the latter may cease - hence the dissociation of polar molecules into ions that happens in solvents. I believe this is quite standard picture adopted in theoretical chemistry (see works by John Slater, David Cook, ...) It may not calculate exact trajectories, but for sure it does not disprove them either.
You seem to misunderstand what I asked. In the H2O (water) molecule, each of the two H atoms has just one electron. That one electron is shared between the atoms, and the only case where that would be plausible is when the electron is spread out over the volume of two atoms at the same time, i.e. when the electron does not sit on a trajectory (at least not on a single trajectory). If the single electron in the H atom of the H2O molecule had a trajectory, the covalent bond would fall apart and the water would turn into H and O.
 
