Eigenvalue degeneracy in real physical systems

In summary, according to quantum mechanics, degeneracies can be associated with symmetry or topological characteristics of the system. If a system has an odd number of electrons, for example, it will have at least a two-fold degeneracy. When all the operators in a system are represented by non-degenerate matrices, the eigenvalues are distinguishable. However, this holds only when all the observables in the system are measured; otherwise, the system is in a superposition of different eigenstates and the collapse postulate must be taken with a grain of salt.
  • #36
ErikZorkin said:
So, is it merely a way of defining operators corresponding to measurements of a specific range of outcomes rather than discrete values?
No. It accounts for a more general class of measurements. Maybe the discussion here helps.
 
  • Like
Likes ErikZorkin
  • #37
stevendaryl said:
Are you on a tight deadline for acquiring this understanding?

:wink:
Well, better said, I lack time badly! And I am neither a physicist nor that much interested in physics (rather, in its mathematical foundations).
 
  • #38
A. Neumaier said:
No. It accounts for a more general class of measurements. Maybe the discussion here helps.
Nice post! Could you give a link to an example where usage of POVMs is demonstrated along with measurement imperfection?
 
  • #39
ErikZorkin said:
nor am I that much interested in physics (rather, mathematical foundation thereof).
I don't give advice to superficial thinkers who think that instant understanding is just a few clicks away.

You won't get far in the mathematical foundations of physics without learning some physics and spending a lot of time. Take a slower pace and you'll benefit a lot from it.

The book by Peres is wholly about the foundations of quantum mechanics. For the foundations of measurement see e.g. https://labs.psych.ucsb.edu/ashby/gregory/klstv2.pdf . And these are only the tips of two huge icebergs...
 
Last edited by a moderator:
  • Like
Likes vanhees71
  • #40
Well, thanks for directing me to POVMs.
 
  • #41
However, POVMs can't resolve the mathematical computability issue that ErikZorkin brought up, since they can always be seen as PVMs on a larger Hilbert space, so if they could resolve the issue, then the issue with the PVMs would also be resolved, which is apparently impossible. I think the physical resolution is what I have written in posts #17 and #19 and of course it can also be formulated using POVMs.
 
  • Like
Likes ErikZorkin
  • #42
rubi said:
and we have observed the value ##a=5## with a precision of ##\sigma=0.5##. We don't need the projector onto the eigenspace of the eigenvalue ##5##, but rather a projector ##P(4.5,5.5)## onto the space of states which is spanned by eigenstates with eigenvalues ##4.5 \leq a \leq 5.5##.
This doesn't solve the problem of principle since the precision 0.5 is uncertain, too, whereas your construction assumes that it and the observed value are both known to infinite precision.

It is well known and experimentally verifiable that projection-valued measures are often far too crude, whereas POVMs (and their ''square roots'') give a generally good model for this kind of measurement.
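For concreteness, here is a minimal numerical sketch of such a range projector (the matrix and windows are made up for illustration); it also shows how sensitive the selected subspace is to the window:

[CODE=python]
import numpy as np

# Illustrative Hermitian matrix; the eigenvalues are chosen by hand.
A = np.diag([4.0, 4.7, 5.0, 5.3, 6.0])

def range_projector(A, lo, hi):
    """Projector onto the span of eigenvectors of A with eigenvalues in [lo, hi]."""
    vals, vecs = np.linalg.eigh(A)
    cols = vecs[:, (vals >= lo) & (vals <= hi)]
    return cols @ cols.conj().T

P = range_projector(A, 4.5, 5.5)
print(np.allclose(P @ P, P), np.trace(P))        # idempotent, rank 3

# A slightly different window already selects a different subspace,
# which is why an uncertain uncertainty matters:
print(np.trace(range_projector(A, 4.71, 5.5)))   # rank 2
[/CODE]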
 
  • #43
Well, the value ##0.5## is what the experimenter hands me. If he claims that his measurement uncertainty is ##0.5## and this leads to disagreements between the theory and the experiment, then either the theory is false or the experimenter has made systematic errors and his uncertainty isn't really ##0.5##, but rather something else.

I don't doubt that POVMs are better suited for realistic measurement. I just don't think that they resolve the specific problem the OP has brought up.
 
  • #44
That's kinda right. The suggestion on POVMs might be a bit misleading. The connection to PVMs is established by Naimark's theorem, which gives a one-to-one correspondence.
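For concreteness, here is a toy sketch of such a dilation (my own made-up example): the three-outcome "trine" qubit POVM realized as an ordinary PVM on a larger space:

[CODE=python]
import numpy as np

# Trine POVM on a qubit: three effects E_k = (2/3)|phi_k><phi_k| summing to I.
def ket(theta):
    return np.array([[np.cos(theta)], [np.sin(theta)]])

phis = [ket(2 * np.pi * k / 3) for k in range(3)]
E = [(2 / 3) * (p @ p.T) for p in phis]
assert np.allclose(sum(E), np.eye(2))            # POVM completeness

# Measurement ("Kraus") operators with E_k = M_k^T M_k (all real here).
M = [np.sqrt(2 / 3) * (p @ p.T) for p in phis]

# Naimark isometry V = sum_k M_k (x) |k>, mapping C^2 into C^2 (x) C^3.
basis = np.eye(3)
V = sum(np.kron(Mk, basis[:, [k]]) for k, Mk in enumerate(M))
assert np.allclose(V.T @ V, np.eye(2))           # V is an isometry

# On the dilated space the measurement is projection-valued:
for k in range(3):
    Pi = np.kron(np.eye(2), np.outer(basis[:, k], basis[:, k]))
    assert np.allclose(V.T @ Pi @ V, E[k])       # the PVM reproduces the POVM
print("dilation reproduces the trine POVM statistics")
[/CODE]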
 
Last edited:
  • #45
rubi said:
Well, the value ##0.5## is what the experimenter hands me. If he claims that his measurement uncertainty is ##0.5## and this leads to disagreements between the theory and the experiment, then either the theory is false or the experimenter has made systematic errors and his uncertainty isn't really ##0.5##, but rather something else.
The measurement error could also be nonsystematic.

It could be 0.51 or 0.49 - and would lead to a significantly different projector in case the wave function contains a large contribution in the symmetric difference of the two spectral domains.

What an experimenter hands you is always inaccurate, and the uncertainty is usually much more inaccurate than the value itself - because it is much less well determined operationally.

It is ridiculous to think that Nature responds to a quantum measurement according to whatever the experimenter hands you.
 
  • #46
A. Neumaier said:
The measurement error could also be nonsystematic.

It could be 0.51 or 0.49 - and would lead to a significantly different projector in case the wave function contains a large contribution in the symmetric difference of the two spectral domains.

What an experimenter hands you is always inaccurate, and the uncertainty is usually much more inaccurate than the value itself - because it is much less well determined operationally.
If the experimenter did not make any systematic errors and computed a value of ##0.5## for his measurement uncertainty, then the theory better predict the experimental results correctly, given this value for the uncertainty. Otherwise it is false and has to be rejected. The theory just wouldn't be compatible with the experimental results.

It is ridiculous to think that Nature responds to a quantum measurement according to whatever the experimenter hands you.
Well, the experimenter can't hand me any number he likes. He must hand me the value that he computed for his measurement uncertainty. However, I agree that this is ridiculous. I think that the projection postulate is nonsensical and will eventually be abandoned. I'm just answering from the point of view of a Copenhagenist, since this is what the OP (implicitly) asked for.
 
  • #47
rubi said:
If the experimenter did not make any systematic errors and computed a value of ##0.5## for his measurement uncertainty, then the theory better predict the experimental results correctly, given this value for the uncertainty.
No. The value for the uncertainty is always itself uncertain, and typically overconservative. There is a large literature about how to compute and report uncertainties, and it advises being conservative in case of doubt.

rubi said:
I agree that this is ridiculous. I think that the projection postulate is nonsensical and will eventually be abandoned. I'm just answering from the point of view of a Copenhagenist, since this is what the OP (implicitly) asked for.
He asked about the realistic situation. The real situation is often described by a POVM - but the experimenter will not know the precise parameters of the POVM, only an approximate description. And in most cases the optimally fitting POVM will not be projection-valued - hence treating it in the Copenhagen way will introduce a systematic error.

But even with an optimal POVM and an optimal assessment of result and uncertainty, the latter will deviate from the true result given by the POVM. This is unavoidable. There is always the error due to the modeling plus the additional error due to the actual reading.
 
  • #48
A. Neumaier said:
No. The value for the uncertainty is always itself uncertain, and typically over conservative. There is a large literature about how to compute and report uncertainties and they advise to be conservative in case of doubt.
Well, as a matter of fact, Copenhagen-style QM does have the projection postulate and its predictions depend on the uncertainty. I have never seen a Copenhagenist explain what uncertainty must be taken in order to get correct predictions. However, the only number that we actually have is the uncertainty computed by the experimenter. What other number do you propose? Unless we have such a number, Copenhagen-style QM isn't even a physical theory at all, since it doesn't tell us which projector to use in order to make predictions.

There must be a recipe that tells us the right projector to use in the projection postulate. This recipe can be falsified.

He asked about the realistic situation.
I interpreted his question to be about how we can make predictions with the projection postulate if the eigenspace decomposition is actually uncomputable. But maybe I just interpreted him wrongly.
 
  • #49
rubi said:
point of view of a Copenhagenist, since this is what the OP (implicitly) asked for.
Well, not necessarily. That's at least what I am familiar with. And by the way, it's more of a problem with the exact computation of operator spectra, which is impossible, than with interpretations of QM.

A. Neumaier said:
only an approximate description
Do POVMs admit constructive approximation (up to arbitrary precision)?

rubi said:
I interpreted his question to be about how we can make predictions with the projection postulate if the eigenspace decomposition is actually uncomputable. But maybe I just interpreted him wrongly.
This is exactly what I asked. In other words, it's the issue of uncomputability of spectra.
 
  • #50
rubi said:
There must be a recipe that tells us the right projector to use in the projection postulate.
There is no such recipe for a general measurement. The Born rule is well-defined (through a precise specification of the meaning of ''measurement'') only for interpreting the results of collision experiments, i.e., the S-matrix elements. Born originally had it only in the form of a law for predicting the result of collisions (where the measured operator is itself a projection), and it is verifiable in these situations.

Later it was abstracted into the modern form by von Neumann, who introduced an ''ideal'' measurement without a clear meaning - so that only conformance to the rule ''defines'' whether a particular measurement is ''ideal''. Almost none is. Neither photodetection nor electron detection works as claimed by the rule.

For the interpretation of real measurements one uses instead sophisticated models of Lindblad type that predict the dynamics of the state and the probabilities of the outcomes.
 
Last edited:
  • #51
ErikZorkin said:
This is exactly what I asked.
No. You had asked the following:
ErikZorkin said:
Now, if we were to simulate all (observable in real world) physical systems, we would need to know whether the eigenvalues of all Hermitian operators that correspond to the real physical systems are distinguishable. Otherwise, our "supercomputer" would be unable to determine, which eigenstate the system falls into after measurement. In particular, it is true when all the operators are represented by non-degenerate matrices.

Are there (or have there been observed) real-world physical systems known to have indistinguishable eigenvalues?
ErikZorkin said:
it's the issue of uncomputability of spectra.
The reference to the real world assumes real measurements of real systems. These are never known to infinite precision, hence the question of the uncomputability of the spectra is irrelevant - it would concern the spectrum of an operator that is inaccurate anyway, and it would apply only to the idealized situation. Since the measurement is not of the Copenhagen type, errors in the simulations don't matter.

Simulations are approximate also, by their very nature - so who cares about uncomputability? Already ##e^x## is uncomputable for most ##x## - since one needs an infinite time to get the exact answer. A simulation only uses approximations to everything. This eliminates all problems of uncomputability.
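For illustration (a minimal sketch using the mpmath library; the choice of ##e^{\sqrt{2}}## is arbitrary): any requested finite precision is reachable in finite time, while the exact value is not:

[CODE=python]
from mpmath import mp, exp, sqrt  # arbitrary-precision arithmetic

# Each run delivers e^sqrt(2) to a finite requested precision only;
# the exact value would take infinite time.
for digits in (15, 50, 200):
    mp.dps = digits               # decimal places of working precision
    print(digits, exp(sqrt(2)))
[/CODE]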

In cases where the Born rule applies (e.g., scattering events) one has an integral spectrum (highly degenerate but with a priori known projectors).

In other cases, for example when measuring energies through spectra, one has a discrete energy spectrum where energy differences are measured as spectral lines (with a width computable only by using more detailed models), etc.

If you pose the wrong question you shouldn't expect to get answers to what you had in mind.
 
Last edited:
  • #52
ErikZorkin said:
Well, not necessarily. That's at least what I am familiar with. And by the way, it's more of a problem with the exact computation of operator spectra, which is impossible, than with interpretations of QM.
This is exactly what I asked. In other words, it's the issue of uncomputability of spectra.
Well, this problem only appears in Copenhagen-style interpretations, where the collapse is an essential part of the dynamics. This view is out of fashion today anyway. However, it could in principle be resolved by what I've written in posts #17 and #19.

A. Neumaier said:
There is no such recipe for a general measurement.
Unless there is such a recipe, the projective dynamics is ill-defined. Different choices of projectors will lead to different predictions. Just look at these extreme examples:
1. We could choose the projector onto the whole Hilbert space, since its range certainly contains the measured eigenspace. This is equivalent to having no projection postulate at all and it can't explain, for instance, the quantum Zeno effect (a toy sketch follows below). (Note: I assume a model that explicitly does not include decoherence and uses the projection postulate instead!)
2. We could choose a very narrow projector. This might remove a part of the wave function that might later become important. For example, if we perform a filtering in a Stern-Gerlach experiment and somehow the filtered electrons are led back into the beam, this will impact the results of the experiment, and this impact wouldn't be reflected in our description, since we have removed the filtered electrons from the picture.

It is therefore crucial in a quantum theory with a projection postulate to know which projector must be chosen, and a canonical choice would be to take the measurement uncertainty. The dynamics is ill-defined if you don't supply such a choice. Of course, this does not apply to theories without a projection postulate, but the OP is specifically interested in projective dynamics.
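To make example 1 concrete, here is a toy sketch of the quantum Zeno effect under the projection postulate (the Hamiltonian and times are invented): a narrow projector freezes the state, while the whole-space "projector" is a no-op and just reproduces the uninterrupted dynamics:

[CODE=python]
import numpy as np
from scipy.linalg import expm

H = np.array([[0.0, 1.0], [1.0, 0.0]])   # toy Hamiltonian (sigma_x)
psi0 = np.array([1.0, 0.0], dtype=complex)
P0 = np.outer(psi0, psi0.conj())         # narrow projector onto |0>
T = 1.0                                  # total evolution time

def prob_in_0(N, proj):
    """Evolve in N steps, applying `proj` as the measurement after each
    step (projection postulate), then read out the |0> probability."""
    U = expm(-1j * H * (T / N))
    psi, p = psi0.copy(), 1.0
    for _ in range(N):
        amp = proj @ (U @ psi)
        q = np.vdot(amp, amp).real       # Born rule for this outcome
        p *= q
        psi = amp / np.sqrt(q)
    return p * abs(np.vdot(psi0, psi))**2

for N in (1, 10, 100, 1000):
    print(N, prob_in_0(N, P0))           # -> 1: the Zeno effect

# Choice 1 above (projector = whole Hilbert space) changes nothing, so the
# frequent "measurements" leave the uninterrupted value and no Zeno effect:
print(prob_in_0(1000, np.eye(2)))        # = cos(1)**2, as without collapse
[/CODE]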

The Born rule is well-defined (through a precise specification of the meaning of ''measurement'') only for interpreting the results of collision experiments, i.e., the S-matrix elements. Born originally had it only in the form of a law for predicting the result of collisions, and it is verifiable in these situations.

Later it was abstracted into the modern form by von Neumann, who introduced an ''ideal'' measurement without a clear meaning - so that only conformance to the rule ''defines'' whether a particular measurement is ''ideal''. Almost none is. Neither photodetection nor electron detection works as claimed by the rule.
The Born rule and the projection postulate are two different things. You can have the Born rule without having the projection postulate.

For the interpretation of real measurements one uses instead sophisticated models of Lindblad type that predict the dynamics of the state and the probabilities of the outcomes.
I am aware of that. It's just not what the OP asked for. His question is specifically about projective dynamics. When someone asks a question about the Bohr model, telling him that it is outdated and that he should really be considering quantum mechanics wouldn't be an appropriate answer either.

A. Neumaier said:
Simulations are approximate also, by their very nature - so who cares about uncomputability? Already ##e^x## is uncomputable for most ##x## - since one needs an infinite time to get the exact answer. A simulation only uses approximations to everything. This eliminates all problems of uncomputability.
Uncomputability is much worse than the fact that computations are approximate. If the predictions of a theory can't be computed in principle, then it's questionable whether the theory is a scientific theory at all. Computability theory is a part of the foundations of mathematics. It's not just an engineering topic. The exponential function is a computable function.
 
Last edited:
  • Like
Likes ErikZorkin
  • #53
rubi said:
The Born rule and the projection postulate are two different things. You can have the Born rule without having the projection postulate.
True, but ErikZorkin was explicitly interested in the projection version:
ErikZorkin said:
Otherwise, our "supercomputer" would be unable to determine, which eigenstate the system falls into after measurement.
So I wonder which reality he wants to simulate - since almost no part of reality satisfies the postulate!
rubi said:
When someone asks a question about the Bohr model, telling him that it is outdated and he should really be considering quantum mechanics, wouldn't be an appropriate answer either.
It would be fully appropriate if he'd first discuss the Bohr model and then ask which real-life atoms had two Bohr orbits with the same radius. It is exactly this kind of question that was asked.
 
  • #54

I think you totally misunderstand the term "computability". ##e^x## is computable.
 
  • #55
rubi said:
It is therefore crucial in a quantum theory with projection postulate to know, which projector must be choosen and a canonical choice would be to take the measurement uncertainty. The dynamics is ill-defined if you don't supply such a choice. Of course, this does not apply to theories without projection postulate, but the OP is specifically interested in projective dynamics.

Well, to be honest, I am starting to see so many flaws in this framework that maybe I'd better look for other interpretations. Let me generalize a bit. I feel that the major problem is with the spectral decomposition. What alternatives are there that are more suited for practice and more computable?
 
  • #56
ErikZorkin said:
I think you totally misunderstand the term "computability". ##e^x## is computable.
To arbitrary finite precision only, not exactly. Matters of computability in your sense don't matter in physics, only in the foundations of computer science.

In practical issues (including all simulation) it is completely irrelevant.

We don't even know whether solutions of the Navier-Stokes equations exist for natural initial conditions - let alone whether they are computable. Nevertheless, physicists in the airplane industry routinely compute solutions of interest using a precision of only 16 decimal digits in their computations - and they get results of a quality that we trust entering an airplane and expect exiting it at the destination.

That's the real world.
 
  • #57
ErikZorkin said:
the major problem is with the spectral decomposition. What alternatives are there that are more suited for practice and more computable?
In the POVM approach you only need the condition ##\sum_k P_k^*P_k=1##, which poses no difficulties at all.
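For instance (a made-up unsharp qubit measurement, just to exhibit the condition):

[CODE=python]
import numpy as np
from scipy.linalg import sqrtm

# Unsharp spin-z measurement with efficiency eta < 1 (toy example):
# effects E_± = (1 ± eta*sigma_z)/2 and "square roots" P_± = sqrt(E_±).
sz = np.diag([1.0, -1.0])
eta = 0.8
P = [sqrtm((np.eye(2) + s * eta * sz) / 2) for s in (+1, -1)]

# The only structural requirement is sum_k P_k^* P_k = 1:
print(np.allclose(sum(Pk.conj().T @ Pk for Pk in P), np.eye(2)))   # True
[/CODE]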
 
  • #58
A. Neumaier said:
To arbitrary finite precision only, not exactly. Matters of computability don't matter in physics, only in the foundations of computer science.

In practical issues (including all simulation) it is completely irrelevant. We don't even know whether solutions of the Navier-Stokes equations exist for natural initial conditions - let alone whether they are computable. Nevertheless, physicists in the airplane industry routinely compute solutions of interest - to a quality that we trust entering an airplane and expect exiting it at the destination.

That's the real world.

I'd like to avoid such a discussion to be honest.

A. Neumaier said:
In the POVM approach you only need the condition ##\sum_k P_k^*P_k=1##, which poses no difficulties at all.

I sympathize with this approach. But some subtleties, such as Naimark's theorem, get me worried. After all, how can you even claim that POVMs themselves are computable? I've googled a bit and found some approaches, but they don't seem to be recognized solutions. It seems that you substitute one uncomputable apparatus for another.
 
  • #59
ErikZorkin said:
I sympathize with this approach. But some subtleties, such as Naimark's theorem, get me worried. After all, how can you even claim that POVMs themselves are computable? I've googled a bit and found some approaches, but they don't seem to be recognized solutions. It seems that you substitute one uncomputable apparatus for another.
In real life you fit free parameters in a model of the ##P_k## to the available data. If done correctly, this gives a description of the real apparatus with the usual ##O(N^{-1/2})## accuracy for the resulting parameters. More is not needed for probabilistic modeling.
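As a toy illustration of such a fit (the apparatus model and all numbers are invented): estimating an unsharpness parameter from simulated counts shows the ##O(N^{-1/2})## behaviour:

[CODE=python]
import numpy as np

rng = np.random.default_rng(0)
eta_true = 0.8        # "unknown" apparatus parameter to be fitted

# For an unsharp spin-z model, a probe state |0> gives p(+) = (1 + eta)/2,
# so eta can be estimated from the observed frequency of "+" outcomes.
for N in (100, 10_000, 1_000_000):
    plus_counts = rng.binomial(N, (1 + eta_true) / 2)
    eta_hat = 2 * plus_counts / N - 1
    print(N, abs(eta_hat - eta_true))   # error shrinks roughly like N**-0.5
[/CODE]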

In a simulation, you would simply define the apparatus by specifying a family of ##P_k##s that does what you want it to do. Thus you have complete control over everything of computational relevance.

By the way, I am a math professor with a chair in computational mathematics. I know a lot about simulation in practice!
 
  • #60
A. Neumaier said:
In a simulation, you would simply define the apparatus by specifying a family of ##P_k##s that does what you want it to do. Thus you have complete control over everything of computational relevance.

I do sympathize with this framework as it (correct me if I am wrong) allows avoiding the use of the spectral theorem. But:

rubi said:
However, POVMs can't resolve the mathematical computability issue that ErikZorkin brought up, since they can always be seen as PVMs on a larger Hilbert space, so if they could resolve the issue, then the issue with the PVMs would also be resolved, which is apparently impossible. I think the physical resolution is what I have written in posts #17 and #19 and of course it can also be formulated using POVMs.

However, and I would like to encourage rubi to clarify, the mere equivalence between POVMs and PVMs might not be the issue. It's the spectral decomposition that leads to trouble. It turns out the spectral theorem is only computable in an approximate manner (whence we might drop important content of the final state, as pointed out by rubi), or exactly only if we know the multiplicities of the eigenvalues in advance, which is speculative.
 
  • #61
ErikZorkin said:
I do sympathize with this framework as it (correct me if I am wrong) allows avoiding usage of spectral theorem.
Nothing needs correction, except your interpretation of Naimark's theorem.

That you can simulate a POVM in a - different, nonphysical - Hilbert space doesn't have any practical relevance. POVMs work not because of Naimark's theorem but because of agreement with experiments.
 
  • Like
Likes ErikZorkin
  • #62
rubi said:
However, POVMs can't resolve the mathematical computability issue that ErikZorkin brought up, since they can always be seen as PVMs on a larger Hilbert space, so if they could resolve the issue, then the issue with the PVMs would also be resolved, which is apparently impossible. I think the physical resolution is what I have written in posts #17 and #19 and of course it can also be formulated using POVMs.

I think Naimark's equivalence between POVMs and PVMs (on a small or a larger Hilbert space, respectively) only applies to the Born rule part of the observables, not the collapse (I don't think the projection postulate exists for continuous variables).

However, I do agree with you that POVMs are not the solution, since the question asked by the OP can be stated for discrete variables. It needs some generalization for continuous variables, but the discrete version is not misleading.
 
  • #63
atyy said:
I think Naimark's equivalence between POVMs and PVMs (on a small or a larger Hilbert space, respectively) only applies to the Born rule part of the observables, not the collapse (I don't think the projection postulate exists for continuous variables).

However, I do agree with you that POVMs are not the solution, since the question asked by the OP can be stated for discrete variables. It needs some generalization for continuous variables, but the discrete version is not misleading.

Actually, I was thinking of a simple discrete example in the first place, not even a continuous one. Take the Stern-Gerlach experiment, for instance. There, you can easily demonstrate the degeneracy problem. If the beam splitting is simply undetectable, how can you "project" your state correctly? I thought POVMs could at least describe rigorously what an approximate measurement is (in terms of a measured range rather than an exact value). Which is, at least for me, reminiscent of post #17. Because POVMs describe the final state explicitly, without application of the spectral decomposition, as far as I understand.
 
  • #64
ErikZorkin said:
Actually, I was thinking of a simple discrete example in the first place, not even a continuous one. Take the Stern-Gerlach experiment, for instance. There, you can easily demonstrate the degeneracy problem. If the beam splitting is simply undetectable, how can you "project" your state correctly? I thought POVMs could at least describe rigorously what an approximate measurement is (in terms of a measured range rather than an exact value). Which is, at least for me, reminiscent of post #17. Because POVMs describe the final state explicitly, without application of the spectral decomposition, as far as I understand.

For discrete observables, POVMs and the old-fashioned projection rule are equivalent, depending on how big a Hilbert space one chooses to work with. I think rubi gave you the answer back around post #17? The generalization of the von Neumann rule that includes degenerate eigenspaces is called the Lüders rule: http://arxiv.org/abs/1111.1088v2.
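A minimal sketch of the Lüders rule for a degenerate outcome (observable and state are made up): the state is projected onto the whole degenerate eigenspace, so the superposition within that eigenspace survives the collapse:

[CODE=python]
import numpy as np

# Observable with a doubly degenerate eigenvalue a = 1 (toy 3x3 example).
A = np.diag([1.0, 1.0, 2.0])
psi = np.array([0.6, 0.48, 0.64])     # normalized: 0.36 + 0.2304 + 0.4096 = 1

# Lüders rule for outcome a = 1: project onto the whole 2d eigenspace.
P1 = np.diag([1.0, 1.0, 0.0])
post = P1 @ psi
post /= np.linalg.norm(post)
print(post)   # relative weights inside the eigenspace are preserved
[/CODE]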
 
  • #65
atyy said:
For discrete observables, POVMs and the old-fashioned projection rule are equivalent, depending on how big a Hilbert space one chooses to work with. I think rubi gave you the answer back around post #17? The generalization of the von Neumann rule that includes degenerate eigenspaces is called the Lüders rule: http://arxiv.org/abs/1111.1088v2.

Thanks for the hint!

Well, rubi gave a good answer, but then he himself pointed out some difficulties with it. See post #52
 
  • #66
ErikZorkin said:
Thanks for the hint!

Well, rubi gave a good answer, but then he himself pointed out some difficulties with it. See post #52

They are not real difficulties, as long as the variable is discrete. Within Copenhagen, what counts as a "measurement" is subjective. So we can always take the Lüders rule and add any unitary operation to it, and count the (Lüders rule + unitary operation) as the "measurement".

However, it should be said that if one considers the spirit of Copenhagen to be a "smaller Hilbert space" view, in the sense that a sensible interpretation of the wave function of the universe is not available, then POVMs are more fundamental than projection measurements. http://mattleifer.info/wordpress/wp-content/uploads/2008/11/commandments.pdf
 
  • #67
atyy said:
They are not real difficulties, as long as the variable is discrete

Well, does discrete mean all distinct? Because otherwise, we are in trouble. If we simply approximate the spectrum and the projections, we might drop something important.

By the way, what about other interpretations? Does, say, the Bohmian pilot wave interpretation also suffer from the spectral decomposition problem?
 
  • #68
ErikZorkin said:
Well, does discrete mean all distinct? Because otherwise, we are in trouble. If we simply approximate the spectrum and the projections, we might drop something important.

The projection postulate only holds for discrete variables. If a position measurement is made, the state after it cannot be a position eigenstate, because a position eigenstate is not a valid state (it is not square integrable).

ErikZorkin said:
By the way, what about other interpretations? Does, say, the Bohmian pilot wave interpretation also suffer from the spectral decomposition problem?

I didn't quite understand the spectral decomposition problem. I was referring to the collapse rule needing an additional assumption to become well-defined (the instrument and measurement operators, as can be seen from the generalized collapse rule for POVMs).
 
  • #69
atyy said:
I didn't quite understand the spectral decomposition problem.

Spectral decomposition is uncomputable. You can't even DEFINE eigenvalues/vectors/spaces/projections. Only approximately, or if it is known beforehand that the eigenvalues are distinct.
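A numerical illustration of what I mean (the matrices are toy examples): two Hermitian matrices that agree to within ##10^{-12}## can have completely different eigenvectors for a nearly degenerate pair, so any procedure that only ever sees an approximation of the operator cannot pin down the individual eigenprojections:

[CODE=python]
import numpy as np

eps = 1e-12
A = np.diag([1.0, 1.0 + eps, 2.0])
B = A.copy()
B[0, 1] = B[1, 0] = eps      # perturbation far below any realistic precision

# The spectra agree to ~eps, but the eigenvectors of the nearly degenerate
# pair are rotated by an O(1) angle:
for M in (A, B):
    vals, vecs = np.linalg.eigh(M)
    print(np.round(vals, 9), np.round(vecs[:, 0], 3))
[/CODE]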
 
  • #70
ErikZorkin said:
Spectral decomposition is uncomputable. You can't even DEFINE eigenvalues/vectors/spaces/projections. Only approximately, or if it is known beforehand that the eigenvalues are distinct.

Really, could you give a reference?
 