Calculating entropy for a simple scenario

In summary: non-equilibrium thermodynamics is the study of systems that are not in equilibrium, and for such systems it is difficult or impossible to define entropy at an instant of time in macroscopic terms. So we may be out of luck.
  • #36
Stephen Tashi said:
The main point of my posts is that attempts to define entropy by methods 1) and 2) appear to be dead-ends.
Please try to see this from my point of view. I'm not posting here just because I'd like to convince you; my intention is only to solve the problem I ran into, and I need some help from others who have greater knowledge of the topic. I don't really need explanations of why I can't calculate entropy -- but describing the problems may help to solve them one by one. For example, if you say "It would be possible to calculate the entropy if we could find a way to quantify the volume change through non-equilibrium conditions", then maybe together we can find a good analytical approximation, and then we can step forward. I think I had several good ideas for this problem, but if that route is unnecessary or a dead end, then let's skip it and find another way to the entropy.

I answered your questions and responded to your criticisms, but then I decided not to post here any more, because I'm not sure it leads to a solution. If you are interested, I'd be happy to send you my thoughts in a personal message.

I think you would also like the result if I were able to create a simulation in which a deterministic process with particles demonstrates the change of macroscopic values (temperature, pressure, entropy). If you have any ideas or suggestions on how to accomplish this, I'd be thankful.
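For what it's worth, here is a minimal sketch of the kind of simulation described above -- an editorial illustration, not something proposed in the thread. It evolves a deterministic 2D ideal gas that starts in the left half of a box and tracks a kinetic temperature plus a coarse-grained Boltzmann entropy computed from cell-occupation probabilities; the particle count, box size, and grid resolution are all assumed values.

```python
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K
N = 2000            # number of particles (illustrative)
L = 1.0             # box side length (illustrative)
CELLS = 10          # coarse-graining grid is CELLS x CELLS

rng = np.random.default_rng(0)
pos = rng.uniform([0.0, 0.0], [L / 2, L], size=(N, 2))  # start in left half
vel = rng.normal(0.0, 1.0, size=(N, 2))                 # random velocities

def coarse_entropy(pos):
    """Coarse-grained Boltzmann entropy of the position distribution."""
    hist, _, _ = np.histogram2d(pos[:, 0], pos[:, 1],
                                bins=CELLS, range=[[0, L], [0, L]])
    p = hist.ravel() / N
    p = p[p > 0]                     # 0 * ln 0 = 0 by convention
    return -k_B * N * np.sum(p * np.log(p))

dt = 1e-3
for step in range(2001):
    pos += vel * dt
    for axis in range(2):            # elastic, deterministic wall bounces
        over = pos[:, axis] > L
        under = pos[:, axis] < 0.0
        pos[over, axis] = 2 * L - pos[over, axis]
        pos[under, axis] = -pos[under, axis]
        vel[over | under, axis] *= -1.0
    if step % 500 == 0:
        # kinetic "temperature" with mass m = 1: (1/2)<v^2> = (d/2) k_B T, d = 2
        T = np.mean(np.sum(vel**2, axis=1)) / (2.0 * k_B)
        print(f"step {step:5d}  S = {coarse_entropy(pos):.4e} J/K  T = {T:.4e} K")
```

Running this, S should climb from roughly k_B*N*ln(50) toward k_B*N*ln(100) as the gas spreads over all 100 cells, while T stays constant (elastic wall bounces conserve kinetic energy) -- a fully deterministic process exhibiting entropy increase under coarse-graining.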
 

FAQ: Calculating entropy for a simple scenario

What is entropy?

Entropy is a measure of the disorder or randomness in a system; in simple terms, it quantifies the uncertainty or unpredictability about which microscopic state the system is actually in.

How is entropy calculated?

In a simple scenario, entropy can be calculated using Boltzmann's formula S = k ln W, where S is the entropy, k is the Boltzmann constant, and W is the number of microstates consistent with the system's macrostate.
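As a quick numerical illustration (toy numbers, not from the thread): four distinguishable coins, where the macrostate "two heads" has W = C(4, 2) = 6 microstates.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Toy macrostate: "2 heads out of 4 distinguishable coins".
# Its number of microstates is the binomial coefficient C(4, 2).
W = math.comb(4, 2)
S = k_B * math.log(W)
print(f"W = {W}, S = k ln W = {S:.3e} J/K")  # W = 6, S ~ 2.47e-23 J/K
```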

What is a microstate?

A microstate is one specific arrangement or configuration of the particles in a system. The number of microstates belonging to a given macrostate is the number of distinct ways the particles can be arranged or distributed while still producing the same macroscopic values.
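As a concrete (illustrative) count: take N distinguishable particles that can each sit in the left or right half of a box. The macrostate "n particles on the left" then has W = C(N, n) microstates.

```python
import math

# N distinguishable particles, each in the left or right half of a box.
# The macrostate "n particles on the left" has C(N, n) microstates.
N = 10
for n in range(N + 1):
    print(f"n_left = {n:2d}  ->  W = {math.comb(N, n)}")
# The evenly-spread macrostate n = 5 has the most microstates (W = 252),
# which is why a gas is overwhelmingly likely to be found spread out.
```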

What is the relationship between entropy and energy?

In a closed system, increasing the energy generally increases the number of accessible microstates, and therefore the entropy. Isolated systems also tend to evolve toward macrostates with more microstates -- that is, toward higher entropy -- simply because those macrostates are overwhelmingly more probable.
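A standard textbook way to see this is the Einstein solid (illustrative numbers below): q indistinguishable energy quanta shared among N oscillators give W(N, q) = C(q + N - 1, q) microstates, so both W and S = k ln W grow as the energy grows.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Einstein solid: q energy quanta shared among N oscillators.
# W(N, q) = C(q + N - 1, q); more energy -> more microstates -> higher S.
N = 50
for q in (0, 10, 50, 100, 500):
    W = math.comb(q + N - 1, q)
    S = k_B * math.log(W)          # math.log accepts arbitrarily large ints
    print(f"q = {q:4d}  W = {W:.4e}  S = {S:.4e} J/K")
```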

Can entropy be negative?

No, not for the Boltzmann entropy. The number of microstates W can never be less than one, and the natural logarithm of one is zero, so S = k ln W is always zero or positive. The entropy of a subsystem can decrease during a process, but the absolute entropy itself never becomes negative.
