Why Is Quantum Mechanics So Difficult? - Comments

In summary: I like Landau and Lifshitz too. Their Mechanics book was a revelation; QM, while good and better than most, wasn't quite as impressive to me as Ballentine. But like all books in that series it's, how to put it, terse, and the problems are, again how to put it, challenging, but to compensate actually relevant.
  • #36
bolbteppa said:
They seem like pretty fatal flaws to me, or at least good reasons to choose Landau instead of this potentially shaky stuff...

That's a complete misunderstanding of the ensemble interpretation.

It's a conceptual ensemble, exactly the same as a conceptual ensemble in the frequentist interpretation of probability.

If there is a flaw in it, there is a flaw in the frequentist interpretation of probability - which of course there isn't, since circularity has been removed by basing it on the Kolmogorov axioms - it would mean a flaw in those axioms and many areas would be in deep doo doo.

The Wikipedia article on it explains it quite well:
http://en.wikipedia.org/wiki/Ensemble_interpretation

The usual criticisms revolve around applying it to single systems - but as the article correctly says:
'However, the "ensemble" of the ensemble interpretation is not directly related to a real, existing collection of actual particles, such as a few solar neutrinos, but it is concerned with the ensemble collection of a virtual set of experimental preparations repeated many times. This ensemble of experiments may include just one particle/one system or many particles/many systems. In this light, it is arguably, difficult to understand Neumaier's criticism, other than that Neumaier possibly misunderstands the basic premise of the ensemble interpretation itself'

Thanks
Bill
 
Last edited:
  • #37
The "ensemble" is not only conceptual, it's created all the time when physicists measure things in the lab. They perform the experiment many times with realizations as independent as possible, measure the same quantities again and again, evaluate the outcome via statistical methods and quote the result of the measurement. Often quoting the value of the measurement is simple compared to giving a well-estimated systematic error.
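To make the statistical evaluation concrete, here is a minimal sketch (in Python, with made-up numbers) of how such a repeated measurement is typically reduced to a quoted value: the sample mean plus a statistical error that shrinks like 1/sqrt(n), with any systematic error to be estimated separately.

```python
import numpy as np

# Hypothetical example: n repeated, independently prepared measurements of the
# same quantity. The "result of the measurement" is the sample mean, and the
# statistical error shrinks like 1/sqrt(n); systematic errors are a separate issue.
rng = np.random.default_rng(0)
n = 10_000
true_value, noise = 1.23, 0.05          # assumed values, for illustration only
data = true_value + noise * rng.standard_normal(n)

mean = data.mean()
stat_err = data.std(ddof=1) / np.sqrt(n)
print(f"estimate = {mean:.5f} +/- {stat_err:.5f} (statistical)")
```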

For many-body systems you also have another type of "ensemble". The ensemble is realized by the many-body system itself. You do not ask about all positions of all gas molecules in a container (or the wave function of a [itex]10^{24}[/itex]-particle system) but look at pretty "coarse grained" quantities like the density, flow-velocity field, pressure, temperature, etc. Here the coarse-graining is over space-time volumes which can be taken as small on a scale over which such macroscopic quantities change considerably but large on a microscopic scale. It involves the average over some time interval and some volume containing still many particles. In this way you can derive the macroscopic behavior of everyday many-particle objects around us. The gas will be described by its thermodynamic equation of state (equilibrium or local equilibrium; hydrodynamical level of description) or by the Boltzmann(-Uehling-Uhlenbeck) equation (off-equilibrium; transport level of description), etc.
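As an illustration of the coarse-graining idea, here is a small sketch (purely illustrative, with assumed toy data) that bins many "microscopic" particle positions and velocities in a 1D box into spatial cells to obtain a number density and a flow-velocity field:

```python
import numpy as np

# Toy coarse-graining: start from "microscopic" data (positions and velocities of
# many particles in a 1D box) and average over spatial cells to obtain macroscopic
# fields (number density and flow velocity). All numbers are illustrative assumptions.
rng = np.random.default_rng(1)
N, L_box, n_cells = 1_000_000, 1.0, 50
x = rng.random(N) * L_box                                           # particle positions
v = rng.normal(0.0, 1.0, N) + 0.3 * np.sin(2 * np.pi * x / L_box)  # particle velocities

edges = np.linspace(0.0, L_box, n_cells + 1)
counts, _ = np.histogram(x, bins=edges)
momentum, _ = np.histogram(x, bins=edges, weights=v)

density = counts / (L_box / n_cells)              # particles per unit length in each cell
flow_velocity = momentum / np.maximum(counts, 1)  # mean velocity in each cell
print(density[:5])
print(flow_velocity[:5])
```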

Of course, there is a conceptual problem with physics (not only quantum theory!) concerning single events. You can only deduce physical laws from reproducible, well-defined, objective setups of an experiment. You cannot conclude much from a single event. E.g., the idea of asking for a "wave function/quantum state" of the entire universe is flawed, because whatever answer you might give, how should you experimentally verify or falsify this hypothesis? What we observe in cosmology are very small parts of the universe, like the measurement of the temperature fluctuations of the cosmic microwave background radiation or its polarization (COBE, WMAP, PLANCK satellites). Another example is the measurement of the redshift-distance relation of far-distant supernovae (Hubble space telescope etc.).
 
  • #38
bhobba said:
That's a complete misunderstanding of the ensemble interpretation.

It's a conceptual ensemble, exactly the same as a conceptual ensemble in the frequentist interpretation of probability.

If there is a flaw in it, there is a flaw in the frequentist interpretation of probability - which of course there isn't, since circularity has been removed by basing it on the Kolmogorov axioms - it would mean a flaw in those axioms and many areas would be in deep doo doo.

The Wikipedia article on it explains it quite well:
http://en.wikipedia.org/wiki/Ensemble_interpretation

The usual criticisms revolve around applying it to single systems - but as the article correctly says:
'However, the "ensemble" of the ensemble interpretation is not directly related to a real, existing collection of actual particles, such as a few solar neutrinos, but it is concerned with the ensemble collection of a virtual set of experimental preparations repeated many times. This ensemble of experiments may include just one particle/one system or many particles/many systems. In this light, it is arguably, difficult to understand Neumaier's criticism, other than that Neumaier possibly misunderstands the basic premise of the ensemble interpretation itself'

Thanks
Bill

If the ensemble is only notional, there is no difference between Ensemble and Copenhagen, if we take the probabilities in Copenhagen to be frequentist. In Copenhagen, the state vector is not necessarily real, but the outcomes and their probabilities are, so a frequentist interpretation is allowed. So basically if Ensemble is correct, then it is just Copenhagen renamed. Unfortunately, Ballentine disparages Copenhagen and wilfully deletes one axiom from it, rendering Ballentine's version of the Ensemble interpretation incorrect quantum mechanics. Basically, Ballentine appears to claim that Landau and Lifshitz and Weinberg are wrong! But I believe Landau and Lifshitz and Weinberg are correct, wherever there is a disagreement between Ballentine and them.

There is one error in the tradition that Landau and Lifshitz and Weinberg come from, but that error (as far as I know) does not appear in their books. That error is the von Neumann proof against hidden variables, which came to light partly through Bohm and Bell, although it was known before. Since (as far as I know) this error does not appear in Landau and Lifshitz or Weinberg, I recommend their books as good presentations of quantum mechanics.
 
Last edited:
  • #39
atyy said:
If the ensemble is only notional, there is no difference between Ensemble and Copenhagen... So basically if Ensemble is correct, then it is just Copenhagen renamed.

I prefer "Copenhagen without collapse" to "Copenhagen renamed", because the ensemble interpretation doesn't carry along the additional and somewhat problematic notion of collapse. If there's no collapse I don't have to worry about how measurements cause collapse, and because I'm just using the theory to generate statements about the outcomes of interactions I can put the Von Neumann cut wherever I find it computationally convenient.

Of course if you want something with deeper explanatory behavior, the ensemble interpretation is infuriating/exasperating/frustrating because it stubbornly refuses to say anything about why the probabilities are what they are.
 
  • #40
Nugatory said:
I prefer "Copenhagen without collapse" to "Copenhagen renamed", because the ensemble interpretation doesn't carry along the additional and somewhat problematic notion of collapse. If there's no collapse I don't have to worry about how measurements cause collapse, and because I'm just using the theory to generate statements about the outcomes of interactions I can put the Von Neumann cut wherever I find it computationally convenient.

Yes, one can have Copenhagen without collapse, if one always pushes all measurements to the end of the experiment. If all measurements occur at the end, and in the same location, then there are no further measurements, no need for a quantum state after the measurement, and no collapse. In this viable view, one simply denies the existence of measurements at spacelike separation.

However, in a Bell test, where there are simultaneously measurements at spacelike separation, those measurements will not be simultaneous in another reference frame. So if there are measurements at spacelike separation, and if any reference frame can be used in quantum mechanics, then there will be collapse in one frame.

Here is one example of how collapse might be used to analyse measurements at spacelike separation: http://arxiv.org/abs/1007.3977.
 
Last edited:
  • #41
Nugatory said:
Of course if you want something with deeper explanatory behavior, the ensemble interpretation is infuriating/exasperating/frustrating because it stubbornly refuses to say anything about why the probabilities are what they are.

It is the same with Copenhagen, except that since we are already agnostic about the reality of the wave function, and the wave function is just a tool to calculate the probabilities which we can observe, there is nothing problematic about collapsing the wave function - collapse is just another tool, like the wave function itself, that we use to calculate the probabilities of outcomes.
 
  • #42
atyy said:
However, in a Bell test, where there are simultaneously measurements at spacelike separation, those measurements will not be simultaneous in another reference frame. So if there are measurements at spacelike separation, and if any reference frame can be used in quantum mechanics, then there will be collapse in one frame.

The statement "If you perform the two measurements, the results will be correlated by cos2Θ" is frame-independent and doesn't care about the temporal ordering of the two measurements.

Whether it's satisfying or not is a different question.
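For reference, the quantum prediction behind that statement can be computed directly from the Born rule. The sketch below is an illustration, not anything taken from the posts above: it uses a polarization-entangled pair in the state (|HH> + |VV>)/√2 with two arbitrarily chosen polarizer angles, and prints both the probability that the two outcomes agree, cos²(a−b), and the correlation coefficient cos 2(a−b), which is presumably the relationship the "cos2Θ" shorthand refers to. The numbers do not depend on which measurement is treated as happening "first", which is the frame-independence being pointed out.

```python
import numpy as np

# Minimal Born-rule calculation for a polarization-entangled photon pair in the
# state (|HH> + |VV>)/sqrt(2), with linear polarizers at angles a and b (radians).
# The angles below are arbitrary assumptions for illustration.
def joint_probs(a, b):
    """Return P(++), P(+-), P(-+), P(--) for polarizer angles a, b."""
    # |+> at angle t is cos(t)|H> + sin(t)|V>; |-> is -sin(t)|H> + cos(t)|V>.
    def proj(t, sign):
        return np.array([np.cos(t), np.sin(t)]) if sign > 0 else np.array([-np.sin(t), np.cos(t)])
    psi = (np.kron([1, 0], [1, 0]) + np.kron([0, 1], [0, 1])) / np.sqrt(2)  # |HH> + |VV>
    probs = {}
    for sa in (+1, -1):
        for sb in (+1, -1):
            amp = np.kron(proj(a, sa), proj(b, sb)) @ psi
            probs[(sa, sb)] = amp ** 2
    return probs

a, b = 0.0, np.pi / 8
p = joint_probs(a, b)
agree = p[(1, 1)] + p[(-1, -1)]
E = p[(1, 1)] + p[(-1, -1)] - p[(1, -1)] - p[(-1, 1)]
print(f"P(same outcome) = {agree:.4f}   cos^2(a-b) = {np.cos(a - b)**2:.4f}")
print(f"correlation E   = {E:.4f}   cos 2(a-b) = {np.cos(2 * (a - b)):.4f}")
```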
 
  • #43
Nugatory said:
The statement "If you perform the two measurements, the results will be correlated by cos2Θ" is frame-independent and doesn't care about the temporal ordering of the two measurements.

Whether it's satisfying or not is a different question.

Sure, but in any one frame there is a temporal ordering, and in any one frame there is wave function evolution. So if you use wave function evolution in any one frame, part of the correct evolution of the wave function in that frame involves collapse.

Take a look at http://arxiv.org/abs/1007.3977.
 
  • #44
bhobba said:
That's a complete misunderstanding of the ensemble interpretation.

It's a conceptual ensemble, exactly the same as a conceptual ensemble in the frequentist interpretation of probability.

If there is a flaw in it, there is a flaw in the frequentist interpretation of probability - which of course there isn't, since circularity has been removed by basing it on the Kolmogorov axioms - it would mean a flaw in those axioms and many areas would be in deep doo doo.

Well first of all, this isn't correct - the limitations of the frequentist interpretation of probability:

Remark 1.8. (Limitations of Frequency Interpretation of Probability)
1. If an experiment is repeated a very large number of times or indefinitely, then the conditions of the experiment need not remain homogeneous. As a consequence, the frequency ratio of A is subject to change.

2. The frequency ratio of A, [itex]f_n(A) = \tfrac{n(A)}{n}[/itex], need not converge to a unique limit [itex]\lim_{n \to \infty} f_n(A) = P(A)[/itex]. Hence, P(A) is not well-defined.


(In a random experiment E repeated n times, an event A occurs n(A) times; thus [itex]f_n(A)[/itex] is its frequency ratio.)

These limitations (Ballentine mentions the second) mean that frequentist probability itself is flawed, and the probability in Ballentine's book is not frequentist. The term "propensity interpretation" is used on page 32 of the 1st edition as a means to take the good and leave the bad in the frequentist interpretation. Ballentine derives this propensity interpretation from Cox's probability axioms, which are similar to Kolmogorov's, but not the same...

(Frequentist flaws aren't somehow fixed by Kolmogorov btw, they can't be fixed inside a frequentist perspective. If frequentist probability gives a correct result, you can derive it from Kolmogorov's axioms, but the issue is that a frequentist foundation leads to problems while Kolmogorov's foundation doesn't. This is all irrelevant though, as Ballentine is working from Cox's probability)
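To make the convergence issue tangible, here is a minimal simulation (assuming i.i.d. Bernoulli trials with an arbitrarily chosen P(A) = 0.3) of the running frequency ratio f_n(A). Within the Kolmogorov framework the law of large numbers guarantees convergence for such idealized trials; the objection above is that a bare frequentist *definition* of probability cannot assume the existence of the limit in advance.

```python
import numpy as np

# Sketch of the frequency ratio f_n(A) = n(A)/n for repeated Bernoulli trials
# with an assumed P(A) = 0.3. Under the i.i.d. assumptions built into the
# Kolmogorov framework, the strong law of large numbers gives f_n(A) -> P(A)
# almost surely; a frequentist definition of P(A) cannot presuppose this.
rng = np.random.default_rng(42)
p_A, n = 0.3, 100_000
outcomes = rng.random(n) < p_A                   # True when event A occurs
f_n = np.cumsum(outcomes) / np.arange(1, n + 1)  # running frequency ratio

for k in (10, 100, 1_000, 10_000, 100_000):
    print(f"f_{k:>6}(A) = {f_n[k - 1]:.4f}")
```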


So Ballentine not only asks us to throw away axioms of quantum mechanics, he also asks us to throw away the most widely used and basic form of probability, Kolmogorov's probability. My issue is the following:

bhobba said:
The usual criticisms revolve around applying it to single systems - but as the article correctly says:
'However, the "ensemble" of the ensemble interpretation is not directly related to a real, existing collection of actual particles, such as a few solar neutrinos, but it is concerned with the ensemble collection of a virtual set of experimental preparations repeated many times. This ensemble of experiments may include just one particle/one system or many particles/many systems. In this light, it is arguably, difficult to understand Neumaier's criticism, other than that Neumaier possibly misunderstands the basic premise of the ensemble interpretation itself'

Thanks
Bill

I can see how that applies to Neumaier's criticism, but it doesn't say anything about Lubos' criticism of the ensemble interpretation as being nothing but a restricted and modest view of the power of QM. I'm curious what people think of this criticism.

In other words, why should we throw away both Kolmogorov probability and some axioms of standard quantum mechanics in favour of fewer axioms and another form of probability when all we get is a restricted and modest view of the power of QM?

To be clear, I haven't read Ballentine. Lubos's issues already turned me off a while ago, so I posted here to find a reason to give it a chance. After just finding out I also have to throw away Kolmogorov probability, I'm now even less inclined, but I'd still consider it if there's a good enough reason. Are you guys aware this is how deep into the rabbit hole you have to go?
 
  • #45
bolbteppa said:
Well first of all, this isn't correct - the limitations of the frequentist interpretation of probability

Sure, they need not remain homogeneous - but the conceptualisation is they do - it's a straw man argument.

Many, many books explain the validity of the frequentist interpretation when backed by the Kolmogorov axioms, eg
https://www.amazon.com/dp/0471257087/?tag=pfamazon01-20

bolbteppa said:
but the probability in Ballentine's book is not frequentist.

The conceptual ensemble the outcome is selected from is by definition frequentist.

bolbteppa said:
Ballentine derives this propensity interpretation from Cox's probability axioms

It's true he doesn't use the usual Kolmogorov axioms - he uses the Cox axioms - but they are equivalent. Usually however when people talk about the Cox axioms they mean the interpretation based on plausibility - he is not doing that.

He does use propensity - but I think he uses it simply as synonymous with probability - most certainly in the equations he writes that's its meaning.

What he assumes is that states apply to a very large number (an ensemble) of similarly prepared systems with a particular outcome of an observation. From the law of large numbers they occur in proportion to the probability of that outcome. That's the way it's frequentist.

He actually goes a bit further than that, thinking of them as infinite. I personally have a bit of difficulty with that - and think of them as very large - but not infinite. In applying the law of large numbers you imagine some probability so close to zero that for all practical purposes it is zero, and we have a large, but finite, number of trials whose entries are in proportion to their probability.

bolbteppa said:
Frequentist flaws aren't somehow fixed by Kolmogorov btw,

The law of large numbers says otherwise - again this is fully explained in books like Feller. I am pretty sure I know your issue - it's concerned with the law of large numbers converging in probability or almost surely - however simple assumptions made when applying it fix that issue. Again any good book on probability such as Feller will explain this - but it's simple. There is obviously a probability below which it's impossible in practice to tell from zero. That sort of assumption is made all the time in applying theories. That being the case, in the law of large numbers you simply assume the conceptual outcome of a large number of trials is well below that level.

bolbteppa said:
So Ballentine not only asks us to throw away axioms of quantum mechanics, he also asks us to throw away the most widely used and basic form of probability, Kolmogorov's probability

Errrr. He bases it on the two stated axioms in Chapter 2. Nothing is thrown out.

In fact it can be based on one axiom as detailed in post 137 of the link I gave previously.

Exactly what don't you get about Gleason and it showing (with the assumption of non-contextuality) that a state exists and it obeys the Born Rule?

This, IMHO, is clearer than Ballentine's approach that assumes two axioms then shows they are compatible with the axioms of probability.
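For concreteness, here is a small sketch of the Born rule in the form Gleason-type arguments deliver it: outcome probabilities given by Tr(ρE_i) for POVM elements E_i summing to the identity. The qubit state and the three-outcome POVM below are arbitrary illustrative choices, not anything taken from Ballentine or the posts above.

```python
import numpy as np

# Born rule in trace form: probability of outcome i is Tr(rho @ E_i), where the
# POVM elements E_i are positive and sum to the identity. State and POVM are
# assumed toy examples for a single qubit.
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])                 # a valid density matrix (trace 1, positive)

# A simple 3-outcome "trine" POVM: E_i = (2/3) |phi_i><phi_i|
angles = [0.0, 2 * np.pi / 3, 4 * np.pi / 3]
povm = []
for t in angles:
    phi = np.array([np.cos(t / 2), np.sin(t / 2)])
    povm.append((2 / 3) * np.outer(phi, phi))

assert np.allclose(sum(povm), np.eye(2))     # completeness: sum_i E_i = I

probs = [float(np.trace(rho @ E)) for E in povm]
print(probs, "sum =", sum(probs))            # non-negative probabilities summing to 1
```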

bolbteppa said:
but it doesn't say anything about Lubos' criticism of the ensemble interpretation as being nothing but a restricted and modest view of the power of QM. I'm curious what people think of this criticism.

To be frank I don't even understand Lubos's criticism - mind carefully explaining it to me?

bolbteppa said:
In other words, why should we throw away both Kolmogorov probability and some axioms of standard quantum mechanics in favour of less axioms and another form of probability when all we get is a restricted and modest view of the power of QM?

He doesn't do that.

bolbteppa said:
To be clear, I haven't read Ballentine.

You should. But what leaves me scratching my head is you seem to have all these issues with it - but haven't gone to the trouble to actually study it. I could understand that if it was generally considered crank rubbish - but it isn't. It's a very well respected standard textbook. It is possible for sources of that nature to have issues - and it does have a couple - but they are very minor.

It should be fairly obvious major issues with standard well respected textbooks are more than likely misunderstandings.

bolbteppa said:
Are you guys aware this is how deep into the rabbit hole you have to go?

Your misunderstandings are not flaws - just misunderstandings.

Thanks
Bill
 
Last edited by a moderator:
  • #46
bolbteppa said:
I can see how that applies to Neumaier's criticism, but it doesn't say anything about Lubos' criticism of the ensemble interpretation as being nothing but a restricted and modest view of the power of QM. I'm curious what people think of this criticism.
As often, I am skeptical about whether this criticism is specific to the quantum case. It seems to me that Lubos's thought experiment is not much different to the throwing of real (non-identical) coins. If a certain probability interpretation can be applied to this situation, I think it can also be applied to the quantum case.
 
  • #47
bolbteppa said:
To be clear, I haven't read Ballentine. Lubos's issues already turned me off a while ago, so I posted here to find a reason to give it a chance. After just finding out I also have to throw away Kolmogorov probability, I'm now even less inclined, but I'd still consider it if there's a good enough reason. Are you guys aware this is how deep into the rabbit hole you have to go?
The question is do you want to learn about the physics or the metaphysics?

For the first part, Ballentine is an excellent book. I know a good deal of standard textbooks and the only other book which gave me a similar feeling of understanding important things about the physics is Sakurai. Ballentine talks about quite a few things which I haven't read anywhere else and he goes more into detail than Sakurai (for example when he examines the implications of Galilean symmetry). On the other hand I really like Sakurai's writing style. I recommend to just try which book suits you better. As for Landau/Lifshitz and Weinberg, I have only skimmed them. They seem to be good books but I can't comment on them in detail.

For the second part, working through Ballentine completely is probably overkill. There are however many thought-provoking bits in different parts of the book and he is very outspoken about his opinion on interpretational issues. I think his view makes sense but even if you have issues with it, it will probably be enriching to read what he thinks. What I don't like is that he doesn't present it as an opinion.
 
  • #48
kith said:
As often, I am skeptical about whether this criticism is specific to the quantum case. It seems to me that Lubos's thought experiment is not much different to the throwing of real (non-identical) coins. If a certain probability interpretation can be applied to this situation, I think it can also be applied to the quantum case.

Mate - I can't follow it at all - I have zero idea what he is driving at.

I am also scratching my head at Bolbteppa's exact concern.

As far as I can see it's that Ballentine uses the term 'propensity' to describe probability, rather than say plausibility like Bayesians do.

I think philosophers get caught up in terms like that, but my background is applied math, and I really can't see the point. If you think in terms of plausibility, states of knowledge etc, you get something like Copenhagen. If, regardless of how you view probability - plausibility, or something abstract as in the Kolmogorov axioms, it doesn't really matter - you apply the law of large numbers, you get something like the ensemble, which is very frequentist-like.

I think most applied math types with a background in stochastic modelling (which is what I have) view it in a frequentist way backed by the Kolmogorov axioms via the law of large numbers. Most certainly books like Feller, and Ross (Introduction to Probability Models), that I have view it that way. For example it's the simplest way to view the important limit theorems of Markov chains.

There is an issue with the law of large numbers in that it converges in probability or almost surely, so a bit of care is required. But it's not a particularly difficult thing - you simply assume that some very small probability is for all practical purposes zero - it's the type of thing you do in applied math all the time.

I have discussed this sort of thing before, but I still don't understand why people worry about it - I guess it's a philosophy thing.

Thanks
Bill
 
Last edited:
  • #49
kith said:
The question is do you want to learn about the physics or the metaphysics?

I dug up my copy of Feller and reacquainted myself with what he says.

From page 3
'We shall no more attempt to explain the true meaning of probability than the modern physicist dwells on the real meaning of mass and energy or the geometer discusses the nature of a point. Instead we shall prove theorems and show how they are applied'

And that's exactly what's going on here. I mentioned the fundamental axiom I applied Gleason to:
'An observation/measurement with possible outcomes i = 1, 2, 3 ... is described by a POVM Ei such that the probability of outcome i is determined by Ei, and only by Ei, in particular it does not depend on what POVM it is part of.'

Probability is an assumed primitive of the theory. It's described by the Kolmogorov axioms. You can apply the law of large numbers and get a frequentist view - that would be Ballentine's Ensemble. You can call it 'propensity' - what difference it makes is beyond me. My view is Feller's - it's simply an assumed primitive. You can view it as plausibility, a state of knowledge, and get something like Copenhagen. That's fine.

I like Ballentine because it's pictorially nice - you simply view an observation as selecting an element from an ensemble. But that's all there is to it - it simply appeals to me.

Thanks
Bill
 
  • #50
bhobba said:
If you think in terms of plausibility, states of knowledge etc, you get something like Copenhagen. If, regardless of how you view probability - plausibility, or something abstract as in the Kolmogorov axioms, it doesn't really matter - you apply the law of large numbers, you get something like the ensemble, which is very frequentist-like.

It seems to me that you also have another possibility, developed by E. T. Jaynes: "probability theory as an extension of logic". In this context probability is not reduced to random variables.

A proof of Cox’s Theorem.

Patrick
 
  • #51
microsansfil said:
it seems to me that you have also an other possibility develop by E.T.Jaynes "probability theory as an extension of logic". In this context probability is not reduce to random variables.

That's the Bayesian view, where it's how plausible something is.

What you do is come up with reasonable axioms on what plausibility should be like - these are the so-called Cox axioms. They are logically equivalent to the Kolmogorov axioms, where exactly what probability is is left undefined.

Ballentine bases it on those axioms but calls it propensity - which isn't really how Cox's axioms are usually viewed. It's logically sound since it's equivalent to Kolmogorov's axioms - just a bit different.

In applied math what's usually done is simply to associate this abstract thing called probability, defined by the Kolmogorov axioms, with independent events. Then you have this thing called the law of large numbers (and it's a theorem derivable from those axioms) which basically says if you do a large number of trials the proportion of outcomes tends toward the probability. That's how you make this abstract thing concrete, and it's certainly how I suspect most people tend to view it.

Basically what Ballentine does is look at probability as a kind of propensity obeying the Cox axioms. Then he uses the law of large numbers to justify his ensemble idea.

There are no logical issues with this, but personally I wouldn't have used propensity - simply an undefined thing as per Kolmogorov's axioms.

But really its no big deal.

Thanks
Bill
 
Last edited:
  • #52
bhobba said:
just a bit different.
In his book E. T. Jaynes writes:

Foundations: From many years of experience with its applications in hundreds of real problems, our views on the foundations of probability theory have evolved into something quite complex, which cannot be described in any such simplistic terms as "pro-this" or "anti-that." For example, our system of probability could hardly be more different from that of Kolmogorov, in style, philosophy, and purpose. What we consider to be fully half of probability theory as it is needed in current applications - the principles for assigning probabilities by logical analysis of incomplete information - is not present at all in the Kolmogorov system.

As noted in Appendix A, each of his axioms turns out to be, for all practical purposes, derivable from the Polya-Cox desiderata of rationality and consistency. In short, we regard our system of probability as not contradicting Kolmogorov's; but rather seeking a deeper logical foundation that permits its extension in the directions that are needed for modern applications. In this endeavor, many problems have been solved, and those still unsolved appear where we should naturally expect them: in breaking into new ground.

However, our system of probability differs conceptually from that of Kolmogorov in that we do not interpret propositions in terms of sets, but we do interpret probability distributions as carriers of incomplete information. Partly as a result, our system has analytical resources not present at all in the Kolmogorov system. This enables us to formulate and solve many problems- particularly the so-called "ill posed" problems and "generalized inverse" problems - that would be considered outside the scope of probability theory according to the Kolmogorov system. These problems are just the ones of greatest interest in current applications.

E. T. Jaynes purposefully does not use the term "random variable", as it is much too restrictive a notion and carries with it all the baggage of the Kolmogorov approach to probability theory; but a random variable seems to be an example of unknown/incomplete information.

Possible point of view: Quantum mechanics is basically a mathematical recipe on how to construct physical models. Since it is a statistical theory, the meaning and role of probabilities in it need to be defined and understood in order to gain an understanding of the predictions and validity of quantum mechanics.

For instance, the statistical operator, or density operator, is usually defined in terms of probabilities and therefore also needs to be updated when the probabilities are updated by acquisition of additional data. Furthermore, it is a context-dependent notion.

Patrick
 
  • #53
bhobba said:
Sure, they need not remain homogeneous - but the conceptualisation is they do - it's a straw man argument.

Many, many books explain the validity of the frequentist interpretation when backed by the Kolmogorov axioms, eg
https://www.amazon.com/dp/0471257087/?tag=pfamazon01-20

Having looked through Feller, he actually doesn't claim that the frequency interpretation of probability is justified by Kolmogorov's axioms, and just to be clear - if such a passage actually existed then it would imply both Ballentine and I are wrong when we say frequentist probability is flawed. Ballentine mentions this issue of uniqueness of the limit on page 32:

One of the oldest interpretations is the limit frequency interpretation. If the conditioning event C can lead to either A or ∼A, and if in n repetitions of such a situation the event A occurs m times, then it is asserted that P(A|C) = [itex]\lim_{n \to \infty} (m/n)[/itex]. This provides not only an interpretation of probability, but also a definition of probability in terms of a numerical frequency ratio. Hence the axioms of abstract probability theory can be derived as theorems of the frequency theory. In spite of its superficial appeal, the limit frequency interpretation has been widely discarded, primarily because there is no assurance that the above limit really exists for the actual sequences of events to which one wishes to apply probability theory.

The defects of the limit frequency interpretation are avoided without losing its attractive features in the propensity interpretation. The probability P(A|C) is interpreted as a measure of the tendency, or propensity, of the physical conditions described by C to produce the result A. It differs logically from the older limit-frequency theory in that probability is interpreted, but not redefined or derived from anything more fundamental. It remains, mathematically, a fundamental undefined term, with its relationship to frequency emerging, suitably qualified, in a theorem. It also differs from the frequency theory in viewing probability (propensity) as a characteristic of the physical situation C that may potentially give rise to a sequence of events, rather than as a property (frequency) of an actual sequence of events.

Calling my argument a strawman argument is calling Ballentine's argument a strawman argument. I notice you only focused on homogeneity, but what about the issue of uniqueness of the limit that Ballentine and I brought up?

bhobba said:
He does use propensity - but I think he uses it simply as synonymous with probability - most certainly in the equations he writes that's its meaning.

As the quote from Ballentine given above shows, it seems he uses this word as a way to give the closest thing to a frequentist interpretation possible, but qualifies this by saying it's merely a word given to a theorem proven from Cox's axioms. That's an extremely important distinction in the sense that, logically, it's very different from taking the crass frequentist interpretation that you implied, and doubly important since you are claiming both that frequentist probability can be justified by Kolmogorov's axioms and that Ballentine is taking a frequentist interpretation when he clearly says he isn't...

So he's not using frequentist probability, he's using Cox's probability axioms and just interpreting some theorems in a way that lies closest to a frequentist interpretation possible. That's fine, but had I not checked that out I'd be left with a completely wrong impression of Ballentine based on this thread.

bhobba said:
To be frank I don't even understand Lubos's criticism - mind carefully explaining it to me?

All I'm going off is the conclusion which is that all we really get from the ensemble interpretation is a restricted and modest view of the power of QM. Hopefully someone who understands it fully will be able to challenge it.

bhobba said:
You should. But what leaves me scratching my head is you seem to have all these issues with it - but haven't gone to the trouble to actually study it. I could understand that if it was generally considered crank rubbish - but it isn't. Its a very well respected standard textbook. It is possible for sources of that nature to have issues - and it does have a couple - but they are very minor.

Well I want to find out about the book, which is why I'm posting. Thus far I have been given the impression that it's based on frequentist probability, and been told such a position can be justified by Kolmogorov's axioms, when in fact the book explicitly says it's not based on frequentist probability and actually uses Cox's axioms. Then we have the two main issues, one about the theory applying to a single particle, which may be more complicated than Neumaier implied http://physics.stackexchange.com/a/15553/25851 and also Lubos' claim that all we really get from the ensemble interpretation anyway is just a restricted and modest view of the power of QM. Sounds awfully unappealing at this stage.

From all the comments on Ballentine I've read on here that stick in my head, the only benefit compared to Landau is that a) it's easier than Landau, b) you can prove one or two things Landau assumes (though apparently at the price of a less general form of QM) as long as you take a different interpretation of QM to that of Landau, an interpretation that, at best, is ultimately no more justifiable than Landau's perspective, and at worst is less general. In that light, it seems like the book is a waste of time, but I'm happy to be wrong.
 
Last edited by a moderator:
  • #54
I mentioned--either in this thread, or another--the "propensity" interpretation of probabilities, but in my opinion, it's not an interpretation, at all. It's just another word for "probability". Maybe it's supposed to be that part of probability that is left over after all probabilities due to ignorance are stripped away. So in the context of QM in the density-matrix approach, pure states represent propensities, while mixed states combine propensities and subjective probabilities?
 
  • #55
stevendaryl said:
So in the context of QM in the density-matrix approach, pure states represent propensities, while mixed states combine propensities and subjective probabilities?
Here is a critique of Popper's interpretation of quantum mechanics and the claim that the propensity interpretation of probability resolves the foundational problems of the theory, by http://www.philosophy.umd.edu/people/bub.


Patrick
 
  • #56
bolbteppa said:
what about the issue of uniqueness of the limit that Ballentine and I brought up?
It's not an issue. The assignment of probabilities in the purely mathematical part of the theory is just an assignment of relative sizes to subsets. These assignments tell us nothing about the real world on their own. That's why the theory consists of the mathematics and a set of correspondence rules that tell us how to interpret the mathematics as predictions about results of experiments. Those rules tell us that the relative frequency of a particular result in a long sequence of identical measurements will be equal to the probability that has been assigned to (a subset that represents) that particular result.

The correspondence rules can't just say that probabilities are propensities, because we need to know how to test the accuracy of the theory's predictions. If we can't, it's not a theory.

The non-existence of a limit wouldn't be relevant even if we had a theory that has a chance of being exactly right, because

1. You can't perform an infinite sequence of measurements.
2. The measurements won't be perfectly accurate.
3. The measurements won't be identical.
4. If a very long sequence of identical measurements would (for example) sometimes go into the interval 1.000000001-1.000000002 and then hop around inside it, and in another experiment go into the interval 1.0000000005-1.0000000006 and then hop around inside it, the conclusion would be that somewhere around the tenth decimal, we're hitting the limits of the theory's domain of validity. This is not a problem, unless we had the completely unjustified belief that the theory was exactly right.
 
  • #57
bolbteppa said:
That's an extremely important distinction in the sense that, logically, it's very different from taking the crass frequentist interpretation that you implied

That's the precise problem. Ballentine and I are not advocating a 'crass' frequency interpretation. We are advocating the modern version where it is based on the Kolmogorov axioms (or equivalent) and applying the law of large numbers.

It matters not if you call it propensity, plausibility, or leave it semantically open - it implies exactly the same thing.

Thanks
Bill
 
  • #58
bolbteppa said:
From all the comments on Ballentine I've read on here that stick in my head, the only benefit compared to Landau is that a) it's easier than Landau, b) you can prove one or two things Landau assumes (though apparently at the price of a less general form of QM) as long as you take a different interpretation of QM to that of Landau, an interpretation that, at best, is ultimately no more justifiable than Landau's perspective, and at worst is less general. In that light, it seems like the book is a waste of time, but I'm happy to be wrong.

Since I'm in the extremely small minority that dislikes Ballentine's book, let me say that I don't think the criticisms from Neumaier and Motl are that relevant to my point of view (although Neumaier and Motl may be correct, but I won't comment on that, since Ballentine's Ensemble interpretation itself appears to have changed between his famous erroneous review and the book, and Neumaier and Motl might be commenting on the review). Neither is the issue about the interpretation of probability important to me. Clearly, Copenhagen works despite its acknowledged problem of having to postulate an observer as fundamental. One cannot just declare that individual systems don't have states, or that collapse is wrong, since that would mean Copenhagen is wrong (Ballentine erroneously claims that Copenhagen is wrong, but my point is that even if we forgive him that, that does not fix his problems). The major approaches to interpretation never claim that Copenhagen is wrong. Rather, they seek to derive Copenhagen, but remove the observer as a fundamental component of the postulates. Ballentine doesn't even try to do that, and his theory has a Heisenberg cut, so it is not really an interpretation. Rather it is at best a derivation of Copenhagen or "Operational Quantum Theory" from axioms other than those found in Landau and Lifshitz, Shankar, Sakurai and Napolitano, Weinberg, or Nielsen and Chuang. Excellent examples in this spirit are those of Hardy http://arxiv.org/abs/quant-ph/0101012 or Chiribella, D'Ariano and Perinotti http://arxiv.org/abs/1011.6451. So the question is does Ballentine's derivation work? I believe it doesn't, and that it is technically flawed.

The key question is whether Ballentine is able to derive his Eq 9.30. For comparison, one may see Laloe's treatment of the same equation in http://arxiv.org/abs/quant-ph/0209123, where it is Eq 37. If Ballentine did derive that equation, I think the other mistakes could be overlooked. If he did not, his interpretation has a hole and is not quantum mechanics.

Now should all approaches to interpretation be without flaw? No, but they should be clear where their flaws and issues are. For example, Wallace makes clear that the issue of how probability arises at all in Many-Worlds is still an issue, even if his derivation of the Born rule were to be correct. Similarly, there is the well known limitation that Bohmian Mechanics at present sits uncomfortably with exact Lorentz invariance. For the same reason, Landau and Lifshitz and Weinberg are excellent Copenhagen books because they explicitly point out the Heisenberg cut, rather than sweeping it under the rug.
 
Last edited:
  • #59
bhobba said:
We are advocating the modern version where it is based on the Kolmogorov axioms (or equivalent)
This leaves the impression that Kolmogorov's axiomatization was born fully grown. Kolmogorov only translated probability concepts, well known many years earlier, into an axiomatic/formal mathematical language. The mathematical theory of probability is now included in the mathematical theory of measure.

Measure theory is the branch of mathematics that deals with measure spaces and is the axiomatic foundation of probability theory.

The basic intuition in probability theory remains the notion of randomness, based on the notion of a random variable.

There are certain 'non-commutative' versions that have their origins in quantum mechanics, for instance K. R. Parthasarathy (An Introduction to Quantum Stochastic Calculus), that are generalizations of the Kolmogorov model.

Patrick
 
  • #60
atyy said:
since Ballentine's Ensemble interpretation itself appears to have changed between his famous erroneous review

It did.

He had to take on board Kochen-Specker.

He assumed, initially (in his original review article), that the system already had the property that was measured. Kochen-Specker says you can't do that. Fredrik put his finger on it - originally it was basically BM in disguise.

However with decoherence you can do that - but of course it still doesn't fully resolve the measurement problem - which it looked like was his hope.

Thanks
Bill
 
  • #61
microsansfil said:
Here a critique of Popper's interpretation of quantum mechanics and the claim that the propensity interpretation of probability resolves the foundational problems of the theory

Without even reading it, it's fairly obvious that calling probability propensity, plausibility or any other word you can think of will not change anything.

What probability is, is defined by the Kolmogorov axioms.

The rest is simply philosophical waffle IMHO.

Those axioms all by themselves are enough, via the law of large numbers, to show Ballentine's ensembles conceptually exist, which is all that's required to justify his interpretation.

If you think of probability as some kind of plausibility then you get something like Copenhagen - although the law of large numbers still applies and you can also conceptually define ensembles if you wish.

I sometimes say guys with a background in applied math like me and philosophers sometimes talk past one another.

Here's an example from Bub's paper:
'The propensity interpretation may be understood as a generalization of the classical interpretation. Popper drops the restriction to "equally possible cases," assigning "weights" to the possibilities as "measures of the propensity, or tendency, of a possibility to realize itself upon repetition." He distinguishes probability statements from statistical statements. Probability statements refer to frequencies in virtual (infinite) sequences of well-defined experiments, and statistical statements refer to frequencies in actual (finite) sequences of experiments. Thus, the weights assigned to the possibilities are measures of conjectural virtual frequencies to be tested by actual statistical frequencies: "In proposing the propensity interpretation I propose to look upon probability statements as statements about some measure of a property (a physical property, comparable to symmetry or asymmetry) of the whole experimental arrangement; a measure, more precisely, of a virtual frequency"'

My view is just like Feller's:
'We shall no more attempt to explain the true meaning of probability than the modern physicist dwells on the real meaning of mass and energy or the geometer discusses the nature of a point. Instead we shall prove theorems and show how they are applied'

Conceptual infinite ensembles are easily handled by simply assuming there is a very small probability below which it is indistinguishable in practical terms from zero. If you do that the law of large numbers leads to large, but finite ensembles.

For example we know there is a very small probability all the atoms in a room will go in the same direction at once and levitate a chair into the air - but in practice it never happens - we can safely assume probabilities that small can be neglected - just like in calculus at an applied level we often think of dx as a small increment in x such that dx^2 can be ignored.
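A quick back-of-the-envelope version of that example: treating each of N molecules as an independent fair coin flip for "which half-space is it moving toward", with an assumed N of order 10^24, gives a probability so small it can only be quoted through its logarithm.

```python
import math

# Illustration of "probabilities so small they are zero for all practical
# purposes": the chance that every one of N molecules is moving in the same
# half-space at one instant, modelled as N independent fair coin flips.
# N = 10**24 is an assumed, order-of-magnitude figure.
N = 10**24
log10_p = N * math.log10(0.5)      # log10 of (1/2)**N; the number itself underflows
print(f"P ~ 10^({log10_p:.3g})")   # roughly 10^(-3e23): unobservably small
```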

That's why guys with my background and those with a philosophical bent sometimes talk past each other.

Thanks
Bill
 
Last edited:
  • #62
atyy said:
Chiribella, D'Ariano and Perinotti http://arxiv.org/abs/1011.6451. So the question is does Ballentine's derivation work?

Alexei Grinbaum "THE SIGNIFICANCE OF INFORMATION IN QUANTUM THEORY"

Interest toward information-theoretic derivations of the formalism of quantum theory has been growing since early 1990s thanks to the emergence of the field of quantum computation.

In Part II we derive the formalism of quantum theory from information-theoretic axioms. After postulating such axioms, we analyze the twofold role of the observer as physical system and as informational agent. Quantum logical techniques are then introduced, and with their help we prove a series of results reconstructing the elements of the formalism. One of these results, a reconstruction theorem giving rise to the Hilbert space of the theory, marks a highlight of the dissertation. Completing the reconstruction, the Born rule and unitary time dynamics are obtained with the help of supplementary assumptions. We show how the twofold role of the observer leads to a description of measurement by POVM, an element essential in quantum computation.

Patrick
 
  • #64
bhobba said:
Can you detail the relevance to Atty's statement about Ballentine's interpretation?
Formal systems seem to be rigid because they are purely syntactic, but the semantics embedded in their axioms is unspoken. In QM I agree with the point of view that axiomatization has to be based on postulates that can be precisely translated into mathematical terms, but not vice versa. Alexei Grinbaum's work is an example among others.

Patrick
 
  • #65
bhobba said:
about Ballentine's interpretation?
About: "So the question is does Ballentine's derivation work?" being included in my quote is simply a cut-and-paste mistake.

What is the meaning of "work" in the context of interpretation?

Patrick
 
Last edited:
  • #66
microsansfil said:
What is the meaning of "work" in the context of interpretation?

Mate all I am asking is for you to detail the point you are trying to make because I am confused about it.

What Ballentine does is show the probability axioms are consistent with his two axioms. He calls probability propensity, but that's not really relevant; philosophers get caught up in that sort of thing, but mathematically it's the axioms that whatever it is obeys that matter. He uses the Cox axioms, but they are equivalent to the Kolmogorov axioms.

That implies the existence of ensembles, which is all that is required - it's got nothing to do with the semantics of the situation.

Is that what you mean by information theoretic?

If so information theoretic is not what I would use - axiomatic based would be my description.

Added Later:
While I was penning the above you did another post that hopefully clarified what you had in mind. Will address that.

Thanks
Bill
 
  • #67
bhobba said:
What probability is, is defined by the Kolmogorov axioms.

The area of relevance of a formal system is confined - by design - to the field of relevance of a hidden semantics, whose presence is unspoken.

Indeed, there is a comparability between other formalisms like the Cox-Jaynes approach to probability and de Finetti's. Yet as written in http://www.siam.org/pdf/news/86.pdf:

In summary, we see no substantive conflict between our system of probability and Kolmogorov’s as far as it goes;
rather, we have sought a deeper conceptual foundation which allows it to be extended to a wider class of applications, required by current problems of science.

Patrick
 
  • #68
bhobba said:
Mate all I am asking is for you to detail the point you are trying to make because I am confused about it.

What Ballentine does is show the probability axioms are consistent with his two axioms.
I don't know Ballentine's point of view. Is it another interpretation of QM or is it a new axiomatization of QM?

Patrick
 
  • #69
microsansfil said:
Formal systems seem to be rigid because they are purely syntactic, but the semantics embedded in their axioms is unspoken.

That's the whole point - they are semantic neutral.

Again - read what Feller said:
'We shall no more attempt to explain the true meaning of probability than the modern physicist dwells on the real meaning of mass and energy or the geometer discusses the nature of a point. Instead we shall prove theorems and show how they are applied'

This is the modern view.

BTW when I say modern it developed during the 19th century where a more cavalier attitude caused problems (eg 1 - 1 + 1 - 1 ... converged in naive Fourier series) and permeated all of modern pure and applied math - including physics. Many say the pure guys went a bit too far, which led to a bit of good natured ribbing between applied and pure camps, but both have taken on the central lesson.

Thanks
Bill
 
  • #70
microsansfil said:
I don't know Ballentine's point of view. Is it another interpretation of QM or is it a new axiomatization of QM?

His view is similar to Popper - and would not be my choice of how to attack it.

The key point I am trying to get across is his arguments depend on the axioms - not how you interpret them.

I have already posted my derivation of the two axioms that starts with a single axiom:
'An observation/measurement with possible outcomes i = 1, 2, 3 ... is described by a POVM Ei such that the probability of outcome i is determined by Ei, and only by Ei, in particular it does not depend on what POVM it is part of.'

That way you don't have to show it's compatible with probability - it's there right from the start - without any semantic baggage.

It's clearer IMHO what's going on that way.

Of course Ballentine isn't wrong - but as this thread shows it gets caught up in semantic baggage.

Thanks
Bill
 
Last edited:
