How Does Adding Energy Affect Entropy in Harmonic Oscillators?

In summary, the question concerns a system of 8 one-dimensional harmonic oscillators holding 3 quanta of energy. The entropy change is calculated from S = k*ln(omega), where omega = (q+N-1)!/(q!*(N-1)!). The attempt at a solution plugs in the values and takes the difference, but the first answer came out wrong because the equation sheet listed an incorrect value for the Boltzmann constant.
  • #1
gc33550

Homework Statement


Consider a system of 8 one-dimensional harmonic oscillators. Initially this system has 3 quanta of energy. By how much does the entropy change if you add one more quantum of energy?

Homework Equations


S=k*ln(omega)
omega=(q+N-1)!/(q!*(N-1)!)

The Attempt at a Solution


(6.7E-11*ln((4+8-1)!/(4!*(8-1)!))) - (6.7E-11*ln((3+8-1)!/(3!*(8-1)!)))

Edited:

(1.38E-23*ln((4+8-1)!/(4!*(8-1)!))) - (1.38E-23*ln((3+8-1)!/(3!*(8-1)!)))

I think this is how you solve the problem, but I am not getting the right answer.

I have the right answer now. My professor provided the wrong Boltzmann constant on the equation sheet. Thank you for your help.
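With the correct constant the calculation is straightforward to check numerically. The sketch below (a minimal illustration, not from the original thread; the function names are my own) counts the microstates of an Einstein solid with `comb(q+N-1, q)` and takes the difference of S = k*ln(omega) for q = 4 and q = 3 with N = 8:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

def multiplicity(q, N):
    """Number of ways to distribute q quanta among N oscillators:
    omega = (q+N-1)! / (q! * (N-1)!) = C(q+N-1, q)."""
    return comb(q + N - 1, q)

def entropy(q, N):
    """S = k * ln(omega)."""
    return k_B * log(multiplicity(q, N))

omega_3 = multiplicity(3, 8)   # 120 microstates
omega_4 = multiplicity(4, 8)   # 330 microstates
delta_S = entropy(4, 8) - entropy(3, 8)
print(omega_3, omega_4, delta_S)
```

Since ln(330) - ln(120) = ln(330/120) = ln(2.75) ≈ 1.012, the entropy change comes out to roughly 1.4e-23 J/K.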
 
  • #2
Don't create more than one thread for the same question, gc33550. Be patient.
 
  • #3
I apologize for making an extra thread. I was trying to edit the first one and hit the back button before I realized there was an edit button.
 

FAQ: How Does Adding Energy Affect Entropy in Harmonic Oscillators?

What is statistical physics?

Statistical physics is a branch of physics that uses statistical methods and concepts to study the behavior of large systems of particles. It seeks to understand the macroscopic properties of matter and energy by studying the microscopic behavior of individual particles.

What is entropy?

Entropy is a measure of the disorder or randomness of a system. In statistical physics, it is often used to describe the number of possible microstates that a system can have given its macroscopic properties.

How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that in an isolated system, the total entropy never decreases over time. This means that such systems tend to evolve toward more disordered states, i.e., toward macrostates with more microstates and therefore higher entropy.

What is the connection between entropy and information theory?

In information theory, entropy is used to quantify the amount of uncertainty or randomness in a system. It can also be seen as a measure of the information content of a system. This connection between entropy and information theory has applications in fields such as data compression and cryptography.
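The information-theoretic notion can be illustrated with Shannon entropy, H = -Σ p*log2(p), measured in bits. This short sketch (my own example, not part of the original FAQ) shows that a fair coin carries more uncertainty than a biased one:

```python
from math import log2

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) in bits; zero-probability outcomes contribute nothing."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1 bit
print(shannon_entropy([0.9, 0.1]))  # biased coin: ~0.47 bits
```

The parallel with statistical physics is that both expressions count, in logarithmic units, how much we don't know about the system's exact state.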

How does statistical physics explain the behavior of gases?

Statistical physics provides a microscopic explanation for the macroscopic behavior of gases, such as pressure, temperature, and volume. It does this by studying the behavior of individual gas particles and using statistical methods to describe the collective behavior of a large number of particles.
