How Is Entropy Calculated for a Density Matrix with Eigenvalues 0 and 1?

In summary, the Von Neumann entropy for a density matrix with eigenvalues 0 and 1 is zero: such a matrix describes a pure state, which has minimum uncertainty. Numerically, the ill-defined ##0\ln 0## term is handled by taking the limit ##\displaystyle \lim_{\lambda_1\rightarrow 0^+}\left[\lambda_1\ln{\lambda_1}\right]=0##, which can be evaluated with L'Hospital's Rule. The physical interpretation is that entropy measures our uncertainty about the state of a system; a pure state has minimum uncertainty, and thus zero entropy.
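As a quick numerical illustration (not part of the thread; the example matrices are my own), here is a minimal Python/NumPy sketch that computes ##S=-\sum_i \lambda_i\ln\lambda_i## and handles zero eigenvalues via the convention ##0\ln 0 = 0##:

[code]
import numpy as np

def von_neumann_entropy(rho):
    """S = -sum_i lambda_i ln(lambda_i), skipping zero eigenvalues (0 ln 0 := 0)."""
    eigvals = np.linalg.eigvalsh(rho)      # real eigenvalues of a Hermitian matrix
    eigvals = eigvals[eigvals > 1e-12]     # drop zeros and tiny negative round-off
    return float(-np.sum(eigvals * np.log(eigvals)))

# Pure state |0><0|: eigenvalues 1 and 0, so S = 0
rho_pure = np.array([[1.0, 0.0],
                     [0.0, 0.0]])
print(von_neumann_entropy(rho_pure))    # 0.0

# For comparison, the maximally mixed qubit state gives S = ln 2
rho_mixed = np.eye(2) / 2
print(von_neumann_entropy(rho_mixed))   # ~0.693
[/code]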
  • #1
LagrangeEuler

Homework Statement


Calculate entropy for density matrix with eigenvalues ##0## and ##1##.



Homework Equations


##S=-\lambda_1 \ln \lambda_1-\lambda_2 \ln \lambda_2##
where ##\lambda_1## and ##\lambda_2## are eigenvalues of density matrix.


The Attempt at a Solution


How do I calculate this when ##\ln 0## is not defined?
 
  • #2
LagrangeEuler said:
How do I calculate this when ##\ln 0## is not defined?

I'm no expert, but...

What is the value of ##\displaystyle \lim_{\lambda_1\rightarrow 0}\left[\lambda_1\ln{\lambda_1}\right]##?

I think that is the only way to get a numerical answer here: let the eigenvalue approach zero instead of setting it to zero exactly.
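To see the behaviour of this limit numerically, here is a quick Python check (added as an illustration, not part of the original reply):

[code]
import numpy as np

# lambda * ln(lambda) shrinks toward 0 as lambda -> 0+,
# even though ln(lambda) itself diverges
for lam in [1e-1, 1e-3, 1e-6, 1e-9, 1e-12]:
    print(f"{lam:.0e}  {lam * np.log(lam):.3e}")
[/code]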
 
  • #3
For a density matrix ##\rho## with eigenvalues only 0 and 1, we have ##\rho = \rho^{2}##. This is true only for pure states, and thus we know the Von Neumann entropy must be zero. To calculate it numerically, I would guess the approach Mandelbroth suggested is valid.
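A short Python sketch of that purity test (the example state below is my own choice, not from the thread): a density matrix is pure exactly when ##\rho^2=\rho##, or equivalently ##\operatorname{Tr}(\rho^2)=1##.

[code]
import numpy as np

def is_pure(rho, tol=1e-10):
    """A density matrix is pure iff rho @ rho == rho (equivalently Tr(rho^2) == 1)."""
    return np.allclose(rho @ rho, rho, atol=tol)

# Example: |+><+| is a pure state with eigenvalues 1 and 0
plus = np.array([[1.0], [1.0]]) / np.sqrt(2)
rho = plus @ plus.T
print(is_pure(rho), np.trace(rho @ rho))   # True 1.0 (up to round-off)
[/code]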
 
  • #4
What is the interpretation of that? Why is the entropy zero for a pure state?
##-1\ln 1-\lim_{\lambda \rightarrow 0^+}\lambda \ln \lambda=-\lim_{\lambda \rightarrow 0^+}\lambda \ln \lambda##
How do I calculate this limit?
 
  • #5
LagrangeEuler said:
How do I calculate this limit?

To calculate the limit, let ##t = 1/\lambda##. Then ##\displaystyle \lim_{\lambda\to 0^+}\lambda\ln\lambda = \lim_{t\to\infty}\frac{\ln(1/t)}{t} = \lim_{t\to\infty}\frac{-\ln t}{t}##, which is an indeterminate form of type ##\frac{\infty}{\infty}##. Then use L'Hospital's Rule and you will get the answer. Otherwise, you could just type it into Wolfram Alpha.
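If you prefer to check the limit symbolically rather than by hand, a one-line SymPy sketch (assuming SymPy is installed) confirms it:

[code]
import sympy as sp

lam = sp.symbols('lam', positive=True)
print(sp.limit(lam * sp.log(lam), lam, 0, '+'))   # prints 0
[/code]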
 
  • #6
Thanks a lot! And physically, why is the entropy of a pure state zero?
 
  • #7
It is easy to see mathematically why the entropy of a pure state is zero. However, why it is true physically seems a much harder question, one I'm not sure I know how to answer.
 
  • #8
Entropy is in some sense a measure of our uncertainty about the state of a system. If a system is in a mixed state, is our uncertainty big or small? What about when it is in a pure state?
 
  • #9
Mute said:
Entropy is in some sense a measure of our uncertainty about the state of a system. If a system is in a mixed state, is our uncertainty big or small? What about when it is in a pure state?

A pure state has minimum uncertainty, so it makes sense that the entropy would be zero.
 

FAQ: How Is Entropy Calculated for a Density Matrix with Eigenvalues 0 and 1?

What is entropy in thermodynamics?

Entropy is a measure of the disorder or randomness of a system. In thermodynamics, it is often described as a measure of the energy in a closed system that is unavailable for doing useful work.

How is entropy calculated?

The entropy of a system can be calculated using the formula ##S = k_B \ln W##, where ##S## is the entropy, ##k_B## is the Boltzmann constant, and ##W## is the number of possible microstates that the system can be in.
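As a small worked example of that formula (the two-microstate case is just an illustration), in Python:

[code]
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def boltzmann_entropy(W):
    """S = k_B ln W for W equally likely microstates."""
    return k_B * np.log(W)

print(boltzmann_entropy(1))   # 0.0 -- a single microstate means zero entropy
print(boltzmann_entropy(2))   # ~9.57e-24 J/K for a two-state system
[/code]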

What is the relationship between entropy and energy?

Entropy limits how much of a system's energy is available to do work. As the entropy of a system increases, the amount of energy available for useful work (the free energy) decreases. This is because a system with high entropy is more disordered, and therefore has less potential for work to be done.

How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. In any spontaneous process, the combined entropy of the system and its surroundings increases, leading to a more disordered state.

What is the role of entropy in quantum statistics?

In quantum statistics, entropy is used to describe how particles are distributed over the available quantum states. Maximizing it subject to the appropriate constraints gives the probability of a particle occupying a specific energy level, and this helps to explain the behavior of particles at the quantum level.
