What is really that density matrix in QM?

In summary: This is the so-called density matrix. According to this interpretation, the density matrix formalism is a tool to derive the statistical operator from the laws of quantum mechanics. The basic assumptions of QM about the density matrix are:
  • There exists a wave function of the system.
  • The density operator (statistical operator) is the most general form of a quantum state.
  • The statistical operator is a self-adjoint, positive semidefinite operator with trace 1.
  • #36
vanhees71 said:
You are contradicting yourself, and I'm sure that is by some misunderstanding, because the math is indeed utterly simple and is taught in the 1st semester in the linear-algebra lecture: The trace of a matrix is independent of the basis used to calculate it.
Sure, the math is simple and I didn't make a mistake. I was giving a context where the proper entropy is not the trace. Thus there is no contradiction.
 
  • #37
atyy said:
(2) "Improper" mixtures: Assume a system is in a pure state, but you only make a measurement on a subsystem.
In reality, every quantum system is a subsystem of a larger system. Thus one must regard every state that is not artificially mixed as an improper mixture. These are essentially never in a pure state.

Even in the case of a Stern-Gerlach beam (with the second beam blocked), one strictly speaking does not have a pure up state, since it is impossible to prepare the magnetic field such that it defines a unique up direction.
 
  • #38
Well I suppose I still don't see this difference in the quantum and classical case, as there are two common ways of looking at the quantum entropy whereupon it looks very similar to the classical Shannon entropy. I will put this post in a spoiler for those who might not know information theory.

Firstly as has been mentioned above the Shannon entropy for a classical information source which has outcomes ##i=1\dots k## with probability ##p_i## is the "surprise" present asymptotically in a single outcome.

This is roughly because if we assume that the information source has transmitted a large number ##N## of outcomes, then we can restrict attention to "typical" sequences, those in which each outcome ##i## has occurred a fraction ##p_i## of the time. Atypical sequences become vanishingly rare due to the law of large numbers, so we ignore them.

Combinatorics gives us quite simply that the number of such typical sequences is:
$$\Omega = \frac{N!}{(n_1)!\dots(n_k)!}$$
with ##n_i## being the number of times outcome ##i## occurs, which in a typical sequence is ##Np_i##, thus:
$$\Omega = \frac{N!}{(Np_1)!\dots(Np_k)!}$$
To find the number of bits needed to encode such a sequence of ##N## outcomes we simply take the log of ##\Omega##, and since ##N## is large we can use Stirling's formula to obtain:
$$\ln\left(\Omega\right) = -N\sum_{i=1}^{k}{p_i \ln\left(p_i\right)}$$
Thus the information per outcome is:
$$\frac{\ln\left(\Omega\right)}{N} = -\sum_{i=1}^{k}{p_i \ln\left(p_i\right)}$$
The entropy. Thus entropy is asymptotically (i.e. large ##N##) how many bits per outcome are needed to identify which sequence of outcomes the source has output. Low-entropy sources are very likely to produce a small set of sequences, so a given outcome tells you very little, as you can mostly predict the sequence in advance, i.e. there is less surprise/knowledge gained in a single outcome.
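The limit above can be checked directly; here is a small Python sketch (function names my own) that compares the exact ##\ln(\Omega)/N##, computed with log-factorials rather than Stirling's approximation, against the entropy formula:

```python
import math

def shannon_entropy(p):
    """H(p) = -sum_i p_i ln p_i, in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def log_typical_count(p, N):
    """ln(Omega) = ln( N! / prod_i (N p_i)! ), exact via lgamma."""
    counts = [round(N * pi) for pi in p]
    return math.lgamma(N + 1) - sum(math.lgamma(n + 1) for n in counts)

p = [0.5, 0.25, 0.25]
H = shannon_entropy(p)
# ln(Omega)/N approaches H as N grows (this is Stirling's formula at work)
for N in (100, 10_000, 1_000_000):
    print(N, log_typical_count(p, N) / N)
print("H =", H)
```

The per-outcome log-count visibly converges to ##H## as ##N## increases.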

To distinguish quantum and classical entropy I will call the former ##S## and the latter ##H##.

In quantum mechanics, to have outcomes we need a context, i.e. a complete set of commuting observables. This is nothing more than complementarity/counterfactual indefiniteness/"unobserved outcomes have no results". Only the quantity we measure has outcomes; other variables/observables do not.

We can always map a quantum state ##\rho## into being a classical probability distribution for those observables alone using what is called a Gelfand representation denoted ##G##. Thus to see what classical probability distribution ##\rho## gives for a context (set of commuting observables) we transform ##\rho## into the basis associated with that context and map it with ##G##, i.e. ##G\left(U\rho U^{\dagger}\right)##

Note: In full detail ##G## maps ##\rho## into a probability distribution over the spectrum of the observables associated with the context. For a very simple case of an observable ##A## with eigenvalues ##\lambda_i## it maps ##\rho## into a distribution ##p\left(\lambda_i \right)##

It turns out that quantum entropy obeys:
\begin{align*}
S\left(\rho\right) & = -Tr\left(\rho\ln\left(\rho\right)\right)\\
& = \min_{U}\left[H\left(G\left(U\rho U^{\dagger}\right)\right)\right]
\end{align*}
That is, each context has a Shannon entropy, and the quantum entropy is the lowest entropy among the contexts, i.e. the surprise factor or knowledge gained in the context with the least amount of surprise/greatest predictability of results.
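For a qubit this minimization can be verified numerically. A numpy sketch (my own illustration, function names hypothetical): the Shannon entropy of the outcome distribution ##G\left(U\rho U^{\dagger}\right)## equals ##S(\rho)## when ##U## diagonalizes ##\rho##, and every other context gives a larger value (by the Schur-Horn theorem and Schur-concavity of Shannon entropy):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues (nats)."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return float(-(w * np.log(w)).sum())

def shannon(p):
    """Classical Shannon entropy of a probability vector (nats)."""
    p = p[p > 1e-12]
    return float(-(p * np.log(p)).sum())

# A mixed qubit state: rho = 0.8|0><0| + 0.2|+><+|
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = 0.8 * np.outer([1.0, 0.0], [1.0, 0.0]) + 0.2 * np.outer(plus, plus)

S = von_neumann_entropy(rho)

# Measuring in the eigenbasis of rho: the outcome distribution has entropy S
assert abs(shannon(np.linalg.eigvalsh(rho)) - S) < 1e-10

# Any other context (basis) has a larger Shannon entropy
rng = np.random.default_rng(1)
for _ in range(500):
    A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    U, _ = np.linalg.qr(A)                       # random unitary
    outcome_probs = np.real(np.diag(U @ rho @ U.conj().T))
    assert shannon(outcome_probs) >= S - 1e-10
print("S(rho) =", S)
```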

This is no surprise as quantum probability is a generalization of classical probability to the case of multiple entwined sample spaces.

Thus pure states are states of maximal knowledge because they contain one context which is utterly predictable.

Above @atyy mentioned that in some cases there are no pure states. In QFT finite volume systems have no pure states (it is possible that there are no infinite volume pure states either, due to Coulomb fields in QED, but that is still an open issue). Thus every system has non-zero entropy, and this is nothing more than the statement that finite systems treated realistically with QFT have no context with completely predictable outcomes.

Also note here we do not need to conceive of mixed states as uncertainty about pure states. They're just states and pure states are a special case with a totally predictable context.

Now we can understand the entropy of entangled states. The entire state is pure so there is a context where the outcomes are certain, e.g. measurement in the Bell basis for entangled spin-half particles with vanishing total angular momentum. However for somebody with access to only one particle no single particle context has completely predictable outcomes and so the entropy of a single particle is non-zero.

An alternative characterization of entropy is how many bits it takes to faithfully transmit a source of outcomes.

For a classical source ##W## of outcomes the noiseless source coding theorem tells us that a source can be faithfully transmitted with error less than ##\epsilon## if we have access to a resource of ##H\left(W\right) + \delta## bits for each outcome. ##\epsilon## and ##\delta## become smaller as ##N##, the number of outcomes the source generates, increases. The entropy then is a hard limit, the fundamental amount of information or knowledge in an outcome.

Similarly by Schumacher's theorem a quantum source ##\rho## of outcomes can be faithfully transposed (quantum analogue of transmission) with error less than ##\epsilon## if we have access to a resource of ##S\left(\rho\right) + \delta## qubits for each outcome.

Thus it seems to me from two separate views the quantum entropy is very similar to the classical entropy and has a similar "knowledge" or "surprise" based reading. However this knowledge is not to be understood as missing knowledge of a pure state but as the information content of an outcome in the most predictable context of measurements one can perform on the system.

If one feels better saying "information" rather than knowledge I won't argue semantics of these English language words.
 
Last edited:
  • Like
Likes dextercioby, microsansfil, atyy and 1 other person
  • #39
DarMM said:
I still don't really see the issue with thinking of it in terms of knowledge or surprise. I mean pick out a context and the resultant Gelfand representation makes it equivalent to the classical case in that context. Hence directly it seems to me that you can think of quantum entropy in similar terms to classical entropy, i.e. how informative/surprising a given measurement outcome is.

Perhaps it would be easier to ask what is entropy in QM then in your view?

As @A. Neumaier has mentioned, the classical entropy for continuous variables is not invariant under arbitrary smooth transformations, so one has to choose additional conditions to specify it eg. canonically conjugate variables to specify the entropy for Hamiltonian classical mechanics. For classical continuous variables if one needs an invariant notion of information, one must use relative notions such as the mutual information or relative entropy (also called the surprise or Kullback-Leibler divergence), as these make sense for discrete and continuous variables. So in the classical case, surprise is often considered to be different from the entropy.
 
Last edited:
  • Like
Likes DarMM
  • #40
DarMM said:
Well I suppose I still don't see this difference in the quantum and classical case, as there are two common ways of looking at the quantum entropy whereupon it looks very similar to the classical Shannon entropy.
I didn't know either of the two results you mention. They make indeed quantum entropy an information theoretic property of the state alone.
Can you please give references?
DarMM said:
To distinguish quantum and classical entropy I will call the former ##S## and the latter ##H##.
\begin{align*}
S\left(\rho\right) & = -Tr\left(\rho\ln\left(\rho\right)\right)\\
& = \min_{U}\left[H\left(G\left(U\rho U^{\dagger}\right)\right)\right]
\end{align*}
That is, each context has a Shannon entropy, and the quantum entropy is the lowest entropy among the contexts, i.e. the surprise factor or knowledge gained in the context with the least amount of surprise/greatest predictability of results.

[...] we do not need to conceive of mixed states as uncertainty about pure states. They're just states and pure states are a special case with a totally predictable context.

Now we can understand the entropy of entangled states. The entire state is pure so there is a context where the outcomes are certain, e.g. measurement in the Bell basis for entangled spin-half particles with vanishing total angular momentum. However for somebody with access to only one particle no single particle context has completely predictable outcomes and so the entropy of a single particle is non-zero.

An alternative characterization of entropy is how many bits it takes to faithfully transmit a source of outcomes.

For a classical source ##W## of outcomes the noiseless source coding theorem tells us that a source can be faithfully transmitted with error less than ##\epsilon## if we have access to a resource of ##H\left(W\right) + \delta## bits for each outcome. ##\epsilon## and ##\delta## become smaller as ##N##, the number of outcomes the source generates, increases. The entropy then is a hard limit, the fundamental amount of information or knowledge in an outcome.

Similarly by Schumacher's theorem a quantum source ##\rho## of outcomes can be faithfully transposed (quantum analogue of transmission) with error less than ##\epsilon## if we have access to a resource of ##S\left(\rho\right) + \delta## qubits for each outcome.
What is the meaning of ''faithfully transposed''?
DarMM said:
Thus it seems to me from two separate views the quantum entropy is very similar to the classical entropy and has a similar "knowledge" or "surprise" based reading. However this knowledge is not to be understood as missing knowledge of a pure state but as the information content of an outcome in the most predictable context of measurements one can perform on the system.

If one feels better saying "information" rather than knowledge I won't argue semantics of these English language words.
Both classically and quantum mechanically, information is a much more neutral word than knowledge, since nothing in your exposition depends on knowledge (in the usual sense) in any way. Information content has a precise, observer-independent mathematical definition independent of the meaning of the transmitted details, while knowledge is ambiguous, observer-dependent, and meaning-sensitive.

Knowledge is needed only marginally, in that one needs to know the state in order to find the optimal transmission protocol. But one needs to know the state for everything one wants to predict in science, hence this involvement of knowledge is nothing state-specific.

Nevertheless, even with your interpretation, the following remains valid:
A. Neumaier said:
Entropy is a basic physical quantity like energy, where we also cannot say what it ''is'' except through the mathematical relations it has to other physical concepts. This notion of entropy was undisputed and already eminently successful for 100 years before Jaynes discovered relations to the concept of information. His interpretation didn't add the slightest to the successes of the entropy concept, hence can safely be regarded as an - in my opinion interesting but often misleading - analogy.
One cannot faithfully transmit/transpose a macroscopic state, using the resources of a local region in spacetime. Thus information theory does not apply to thermodynamics. And one doesn't need information theory at all to derive thermodynamics from quantum physics.
 
  • Like
Likes DarMM
  • #41
I'm not sure if @A. Neumaier would agree with my reason, but I too am not a fan of Jaynes. For me Jaynes fails because the Gibbs entropy is not unique, it is only one of the Renyi entropies.
 
  • #42
atyy said:
I'm not sure if @A. Neumaier would agree with my reason, but I too am not a fan of Jaynes. For me Jaynes fails because the Gibbs entropy is not unique, it is only one of the Renyi entropies.
The most serious problem with Jaynes' subjectivism is that his maximum entropy principle predicts complete physical nonsense if you assume the knowledge of the expectation of ##H^2## rather than that of ##H##. One gets the correct expressions for the density operator only if one assumes the correct knowledge.

This is a problem like in betting: One succeeds in betting only if the believed subjective probabilities are close to the actual ones, as card game players know. Thus it is not the belief but the amount of agreement with reality that determines the probabilities.
 
  • Like
Likes atyy
  • #43
DarMM said:
Thus it seems to me from two separate views the quantum entropy is very similar to the classical entropy and has a similar "knowledge" or "surprise" based reading. However this knowledge is not to be understood as missing knowledge of a pure state but as the information content of an outcome in the most predictable context of measurements one can perform on the system.
Even if we can find a similar point of view regarding the notion of entropy, communication through a quantum channel cannot be described by the results of classical information theory (in particular the notion of capacity); it requires the generalization of classical information theory by quantum perception of the world.

This article, "A Survey on Quantum Channel Capacities", shows the difficulties in unifying the different use cases.

From "A Survey on Quantum Channel Capacities" :
Many new capacity definitions exist for quantum channels in comparison to a classical communication channel. In the case of a classical channel, we can send only classical information while quantum channels extend the possibilities, and besides the classical information we can deliver entanglement assisted classical information, private classical information, and of course, quantum information [53], [134]. On the other hand, the elements of classical information theory cannot be applied in general for quantum information –in other words, they can be used only in some special cases. There is no general formula to describe the capacity of every quantum channel model, but one of the main results of the recent researches was a simplified picture in which various capacities of a quantum channel (i.e., the classical, private, quantum) are all non-additive [242].

/Patrick
 
Last edited:
  • #44
microsansfil said:
Even if we can find a similar point of view regarding the notion of entropy, communication through a quantum channel cannot be described by the results of classical information theory (in particular the notion of capacity); it requires the generalization of classical information theory by quantum perception of the world
Certainly, but this has never been in doubt or disputed. To say otherwise would be the daft claim that quantum information theory is no more than classical information theory.
The claim has never been that a quantum information channel can be described by classical information theory, but that there is a similarity between quantum entropy and classical entropy.
 
  • #45
DarMM said:
Certainly, but this has never been in doubt or disputed.
...
but that there is a similarity between quantum entropy and classical entropy.
Yes, but classical capacity is defined with the use of the concept of mutual information, which is linked to the concept of entropy. So if there is a similarity between quantum entropy and classical entropy:

Why can't we also establish similarities between quantum capacity and classical capacity?

/Patrick
 
  • #46
A. Neumaier said:
I didn't know either of the two results you mention. They make indeed quantum entropy an information theoretic property of the state alone.
Can you please give references?
Schumacher's theorem is to be found in:
Schumacher, B., Quantum coding. Phys. Rev. A 51, 2738-2747 (1995).

As for the other theorem, I first learned of it in Scott Aaronson's "Quantum Computing Since Democritus", but only found a proof later in Section 12.2 of
Bengtsson, I., & Zyczkowski, K. (2006). Geometry of Quantum States: An Introduction to Quantum Entanglement. Cambridge: Cambridge University Press

A. Neumaier said:
What is the meaning of ''faithfully transposed''?
Schumacher explains this well in his paper.
 
  • Like
Likes vanhees71
  • #47
microsansfil said:
Why can't we also establish similarities between quantum capacity and classical capacity?
That would turn this thread into discussing every difference between Classical and Quantum Information theory. We also don't need the notion of discord in the Classical theory. There certainly are differences that's not in dispute.
 
  • #48
A. Neumaier said:
The most serious problem with Jaynes' subjectivism is that his maximum entropy principle predicts complete physical nonsense if you assume the knowledge of the expectation of ##H^2## rather than that of ##H##. One gets the correct expressions for the density operator only if one assumes the correct knowledge.

This is a problem like in betting: One succeeds in betting only if the believed subjective probabilities are close to the actual ones, as card game players know. Thus it is not the belief but the amount of agreement with reality that determines the probabilities.
Why does the MEM lead to "complete physical nonsense" assuming to know the expectation value of ##H^2## instead of assuming to know that of ##H##?

In this case MEM gives the stat. op.
$$\hat{\rho}=\frac{1}{Z} \exp (-\lambda \hat{H}^2).$$
What's wrong with this state?
 
  • #49
vanhees71 said:
Why does the MEM lead to "complete physical nonsense" assuming to know the expectation value of ##H^2## instead of assuming to know that of ##H##?

In this case MEM gives the stat. op.
$$\hat{\rho}=\frac{1}{Z} \exp (-\lambda \hat{H}^2).$$
What's wrong with this state?
It is time invariant hence stationary but leads to completely wrong predictions for thermal q-expectations such as the internal energy.
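As a toy numerical illustration of this point (my own construction, not from the thread): take a truncated harmonic-oscillator spectrum ##E_n = n## and tune ##\lambda## so that ##\exp(-\lambda \hat{H}^2)## reproduces the canonical internal energy; the energy fluctuations, and hence the heat capacity, still come out wrong:

```python
import numpy as np

# Truncated harmonic-oscillator spectrum, E_n = n (units hbar*omega = 1)
E = np.arange(200, dtype=float)

def moments(weights):
    """<H> and <H^2> in the diagonal state with the given weights."""
    p = weights / weights.sum()
    return (p * E).sum(), (p * E**2).sum()

beta = 1.0
U_can, H2_can = moments(np.exp(-beta * E))          # canonical ensemble

def U_gauss(lam):
    return moments(np.exp(-lam * E**2))[0]

# Bisect for the lambda matching the canonical internal energy
lo, hi = 1e-6, 10.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if U_gauss(mid) > U_can:
        lo = mid            # energy too high -> need larger lambda
    else:
        hi = mid
U_g, H2_g = moments(np.exp(-0.5 * (lo + hi) * E**2))

var_can = H2_can - U_can**2
var_g = H2_g - U_g**2
print("internal energies:", U_can, U_g)   # tuned to agree
print("energy variances:", var_can, var_g)  # disagree: physically different states
```

Even with the internal energy matched by hand, the two ensembles are distinguishable states; constraining the "wrong" observable in the MEM does not reproduce the thermal state.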
 
  • #50
atyy said:
the classical entropy for continuous variables is not invariant under arbitrary smooth transformations, so one has to choose additional conditions to specify it
A. Neumaier said:
Nevertheless, even wih your interpretation, the following remains valid
atyy said:
I'm not sure if @A. Neumaier would agree with my reason, but I too am not a fan of Jaynes. For me Jaynes fails because the Gibbs entropy is not unique, it is only one of the Renyi entropies.
Regardless of these issues with the interpretation of classical entropy, are we agreed that mixed states are just general quantum states, for two reasons:
  1. Their state space is the trace-class operators ##Tr\left(\mathcal{H}\right)##, not ##\mathcal{L}^{1}## probability distributions over pure states, thus they seem not to quantify classical ignorance of a pure state since they cannot be read as probability distributions over pure states
  2. In QFT finite volume systems have no pure states.
Pure states are then just a special case where you have one totally predictable context, they don't constitute "the true state" of which one is ignorant. Such a totally predictable context seems to be absent in QFT, there is always some measurement uncertainty in QFT thus only mixed states.
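One standard way to separate the special case from the general one is the purity ##Tr(\rho^2)##, which equals 1 exactly for pure states; a minimal numpy check (my own sketch):

```python
import numpy as np

def purity(rho):
    """Tr(rho^2): equals 1 iff rho is a pure state (a rank-1 projector)."""
    return float(np.real(np.trace(rho @ rho)))

# A pure state |psi> = (|0> + i|1>)/sqrt(2)
psi = np.array([1.0, 1.0j]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# The maximally mixed qubit state
rho_mixed = np.eye(2) / 2

print(round(purity(rho_pure), 10))   # -> 1.0
print(round(purity(rho_mixed), 10))  # -> 0.5
```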
 
  • #51
DarMM said:
That would turn this thread into discussing every difference between Classical and Quantum Information theory.
The thread has already changed the original question which was about: What is really that density matrix in QM?
If the similarities go beyond a simple point of view, we should detect the physical consequences that result from them.
/Patrick
 
Last edited:
  • #52
A. Neumaier said:
And to measure in an eigenbasis of rho itself, where the two agree, is infeasible in all but the simplest situations. This answers @Morbert's query.

This basis is often an energy eigenbasis, which is why quantum chemists are interested in the von Neumann entropy as the Shannon entropy of the natural orbitals/modes of the system (It's a step in various investigations about the relationship between independent fermion entropy and wavefunction compactness, correlation energy, interaction energy functionals etc.)

PS I should clarify that by maximal knowledge I meant with respect to other measurement contexts as opposed to other possible preparations of the system.
 
  • #53
microsansfil said:
The thread has already changed the original question which was about: What is really that density matrix in QM?

/Patrick
Not for me; I was only considering entropy in so far as it related to distinguishing pure and mixed states. Even ignoring this, you seemed to be arguing that quantum information sources cannot be modeled with Classical Information theory, which I don't think anybody would dispute or has disputed.
 
  • #54
Morbert said:
This basis is often an energy eigenbasis
For your recipe to apply one needs an eigenbasis of rho. This can be an eigenbasis of energy only if the state is stationary. While this case is important it is still a very special state.
 
  • Like
Likes vanhees71 and Morbert
  • #55
DarMM said:
Not for me, I was only considering entropy in so far as it related to distinguishing pure and mixed states.
OK

The same density matrix can represent different mixed states, i.e. the spectral decomposition of a density matrix is not unique (unless the state is pure) and there are several ways to achieve the same density matrix by mixing pure states.

/Patrick
 
  • #56
microsansfil said:
OK

The same density matrix can represent different mixed states, i.e. the spectral decomposition of a density matrix is not unique (unless the state is pure) and there are several ways to achieve the same density matrix by mixing pure states.

/Patrick
A density matrix is a mixed state; as far as I've seen they are synonymous terms. Though the decomposition into a sum of pure states, if it exists, is not unique, as you said. Though again that has not been in question.
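The textbook example of this non-uniqueness, as a short numpy sketch: two different equal-weight ensembles of pure states produce the identical density matrix ##I/2##.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Ensemble A: 50/50 mixture of |0> and |1>
rho_A = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

# Ensemble B: 50/50 mixture of |+> and |->
plus = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)
rho_B = 0.5 * np.outer(plus, plus) + 0.5 * np.outer(minus, minus)

# Different preparations, one and the same density matrix I/2
assert np.allclose(rho_A, rho_B)
assert np.allclose(rho_A, np.eye(2) / 2)
print(rho_A)
```

No measurement can distinguish the two preparations, since all predictions depend on the state only through the density matrix.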
 
  • #57
DarMM said:
Though again that has not been in question.
orisomech said:
Is the density matrix formulation necessary to understand results from real experiments, or is it possible to do without it?

If entropy represents an intrinsic property of a physical system, the ambiguity associated with the representation carried by a density matrix does not make it an appropriate tool for talking about entropy.
/Patrick
 
  • #58
DarMM said:
Regardless of these issues with the interpretation of classical entropy, are we agreed that mixed states are just general quantum states, for two reasons:
  1. Their state space is the trace-class operators ##Tr\left(\mathcal{H}\right)##, not ##\mathcal{L}^{1}## probability distributions over pure states, thus they seem not to quantify classical ignorance of a pure state since they cannot be read as probability distributions over pure states
  2. In QFT finite volume systems have no pure states.
Pure states are then just a special case where you have one totally predictable context, they don't constitute "the true state" of which one is ignorant. Such a totally predictable context seems to be absent in QFT, there is always some measurement uncertainty in QFT thus only mixed states.

I don't understand this, and will have to take your word for it, since this doesn't occur in elementary quantum mechanics (eg. it doesn't occur in anything Englert says). I have questions about the case in which pure states don't exist:

1) Does the Stinespring theorem fail? Usually the Stinespring theorem means that mixed states can be obtained as subsystems of pure states

2) If there are no pure states, how is time evolution defined? Is unitary evolution still fundamental? Isn't unitary evolution only expected for pure states or proper mixtures of pure states? For example, an improper mixture is not expected to undergo unitary evolution.
 
  • #59
atyy said:
I don't understand this, and will have to take your word for it, since this doesn't occur in elementary quantum mechanics (eg. it doesn't occur in anything Englert says).

For example, an improper mixture is not expected to undergo unitary evolution.
Unitary evolution is reserved for truly isolated systems. Restricting unitary dynamics to an isolated subsystem preserves unitary evolution. If the isolation is imperfect, the preservation is imperfect. This is the usual situation.
 
  • #60
atyy said:
1) Does the Stinespring theorem fail? Usually the Stinespring theorem means that mixed states can be obtained as subsystems of pure states

2) If there are no pure states, how is time evolution defined? Is unitary evolution still fundamental? Isn't unitary evolution only expected for pure states or proper mixtures of pure states? For example, an improper mixture is not expected to undergo unitary evolution.
The distinction between proper and improper mixed states breaks down in quantum field theory. See Section 4 of this review of entanglement and open systems in QFT:
https://arxiv.org/abs/quant-ph/0001107
Time evolution of any finite volume system will be non-unitary as everything is an open system in QFT.
 
  • Informative
Likes atyy
  • #61
microsansfil said:
If entropy represents an intrinsic property of a physical system, the ambiguity associated with the representation carried by a density matrix does not make it an appropriate tool for talking about entropy.
I don't understand, has anybody been discussing basing things on this decomposition?
 
  • #62
DarMM said:
The distinction between proper and improper mixed states breaks down in quantum field theory. See Section 4 of this review of entanglement and open systems in QFT:
https://arxiv.org/abs/quant-ph/0001107
Time evolution of any finite volume system will be non-unitary as everything is an open system in QFT.

Is it possible to understand that as a case in which the state of every subsystem is an improper mixture?

If so, wouldn't the state of the total system still be pure, so that unitary evolution still applies to the total system, and governs the evolution of the subsystems?
 
  • #63
atyy said:
Is it possible to understand that as a case in which the state of every subsystem is an improper mixture?

If so, wouldn't the state of the total system still be pure, so that unitary evolution still applies to the total system, and governs the evolution of the subsystems?
The total system would be the entire universe. Whether there are global pure states remains an open question in QFT. Complications involved are:
  1. QED Coulomb fields might always be mixed. This is the mathematically most intractable problem.
  2. Such a pure state has no operational meaning. The theorized global purification might be a state over an algebra of self-adjoint operators that includes Wigner's friend type "observing macroscopic systems to the atomic scale" type observables, which almost certainly lie outside the observable algebra. Thus over the true algebra of observables the state will still be mixed.
  3. Poorly understood complications from QFT in curved backgrounds, e.g. the information loss problem.
 
Last edited:
  • Like
Likes akvadrako
  • #64
A. Neumaier said:
It is time invariant hence stationary but leads to completely wrong predictions for thermal q-expectations such as the internal energy.
How do you come to this conclusion? It depends on the system!

Of course the MEM doesn't tell you what the relevant information for a given system is. That you have to determine yourself. The MEM is a very general principle; it lets you make the least biased ("objective") guess based on the available information about the system.

Now 2 issues are usually raised against it, also in this thread.

(1) Choice of the "relevant observables" to be constraint in maximizing the entropy

Your criticism is quite common: The requirement of a stationary state alone constrains the possible choice of "relevant information" to be either averages of conserved quantities (as for energy in the canonical and grand-canonical ensembles; in the latter case you also give the average of one or several conserved charge-like quantities) or strict constraints from conservation laws (energy lies strictly in an interval that is very small on macroscopic scales, as for the microcanonical ensemble).

That usually the Gibbs ensembles are preferred, i.e., giving constraints at least on ##U=\langle \hat{H} \rangle## and not on some other function ##f(\hat{H})##, is due to the fact that one considers the thermodynamical state as a small system coupled (weakly) to some reservoir, defining the canonical and grand-canonical ensembles (depending on whether or not you allow for exchange of conserved charges). Treating then the large closed (reservoir+system) system as a microcanonical ensemble (where it doesn't matter which function ##f(\hat{H})## you look at), you are inevitably led to the usual canonical or grand-canonical ensemble with ##\langle \hat{H}_{\text{sys}} \rangle## as the relevant constraint. In the thermodynamic limit higher cumulants don't play a role. For small systems it can be necessary to take such higher-order constraints into account. The same argument holds if you have other relevant conserved quantities to take into account: In the typical "reservoir situation" for macroscopic thermodynamical systems the additive conserved quantities are the relevant observables, and usually not more general functions thereof. For details, see

https://journals.aps.org/rmp/abstract/10.1103/RevModPhys.85.1115

and references therein.

This view is also solidified by the usual dynamical arguments using non-equilibrium descriptions of open quantum systems (master equations, transport equations, etc.): For short-range interactions in the collision term, truncating the BBGKY hierarchy (or the corresponding analogs for more detailed quantum descriptions, such as the general Schwinger-Dyson hierarchy of QFT) at the lowest order leads to the standard equilibrium distributions (Bose-Einstein, Fermi-Dirac, Maxwell-Boltzmann), corresponding to the choice of additive conserved quantities in the MEM as discussed above.

(2) Choice of the information measure

This is the question of which type of entropy to use. The Shannon-Jaynes one in the physics context refers to the classical Boltzmann-Gibbs entropy. It's well known that this doesn't work with long-range forces present (electromagnetic forces are pretty harmless, because in many-body systems you usually have Debye screening which comes to the rescue, but there's no such thing for gravity, and that's important for structure formation in the universe and our very existence). Here other (non-additive) entropy forms like the Renyi or Tsallis entropies may be the better choice. As far as I know, there's however no generally valid dynamical argument as in the Boltzmann-Gibbs case.
 
  • #65
microsansfil said:
OK

The same density matrix can represent different mixed states, i.e. the spectral decomposition of a density matrix is not unique (unless the state is pure) and there are several ways to achieve the same density matrix by mixing pure states.

/Patrick
No! The statistical operator uniquely determines the state of the system, and its spectral decomposition is unique, if you use a complete set of compatible observables, including the statistical operator. It's a self-adjoint operator!

It's of course right that mixing pure states is not unique, but why should it be?
 
  • #66
vanhees71 said:
No! The statistical operator uniquely determines the state of the system, and its spectral decomposition is unique, if you use a complete set of compatible observables, including the statistical operator. It's a self-adjoint operator!

It's of course right that mixing pure states is not unique, but why should it be?

here

[attached image: 1565603264136.png]


/Patrick
 
  • #67
DarMM said:
The total system would be the entire universe. Whether there are global pure states remains an open question in QFT. Complications involved are:
  1. QED Coulomb fields might always be mixed. This is the mathematically most intractable problem.
  2. Such a pure state has no operational meaning. The theorized global purification might be a state over an algebra of self-adjoint operators that includes Wigner's friend type "observing macroscopic systems to the atomic scale" type observables, which almost certainly lie outside the observable algebra. Thus over the true algebra of observables the state will still be mixed.
  3. Poorly understood complications from QFT in curved backgrounds, e.g. the information loss problem.

Is this related: https://arxiv.org/abs/1406.7304 ?

Is it really true then that there is no unitary time evolution in QFT? For measurement, I can buy that there are no pure states. But I find it hard to buy that there is no unitary evolution. Really? Then there will be complications with the information loss problem, as you say.
 
  • #68
atyy said:
Is this related: https://arxiv.org/abs/1406.7304 ?

Is it really true then that there is no unitary time evolution in QFT? For measurement, I can buy that there are no pure states. But I find it hard to buy that there is no unitary evolution. Really? Then there will be complications with the information loss problem, as you say.
It's probably true, see remark 15 (p.31) in this paper:
https://arxiv.org/abs/1412.5945
 
  • Informative
  • Like
Likes Auto-Didact and atyy
  • #69
DarMM said:
It's probably true, see remark 15 (p.31) in this paper:
https://arxiv.org/abs/1412.5945

But how about for flat spacetime? In the case of no pure states in flat spacetime, can we have unitary time evolution?
 
  • #70
That's an open question due to issues with infrared representations in QED.
 
