How to visualize entropy in thermodynamics

In summary, entropy is a measure of disorder in a system; for heat q absorbed reversibly at temperature T, the entropy change is ΔS = q/T. Understanding it practically can be challenging, especially when trying to solve calculus-based entropy problems. One statistical approach uses the formula S = (Boltzmann's constant) ln W, where W is the number of microstates of the system. Equating this with q/T can, in principle, give the change in the number of microstates, although the resulting number is astronomically large. The thermodynamic definition of entropy, dS = dQ/T, can also be used to build intuition, together with the statistical definition of temperature, 1/(Boltzmann's constant × T) = ∂ ln Ω/∂U.
  • #1
vijay123
Entropy is the measure of disorder... but I just cannot visualize it from a practical standpoint... I mean, the disorder taking place with so many particles when heated, and yet all I know is that S = q/T. Can anyone explain this concept better to me?

One more question: how does one prove, using statistical mechanics, that
S = (Boltzmann's constant) ln W, where W is the number of macrostates?

I was trying to do calculus-based entropy problems... but can I do them with reference to S = (Boltzmann's constant) ln W, by counting the number of macrostates available?

And is it possible to find the number of macrostates by equating (Boltzmann's constant) ln W = q/T? Is this a very large number?
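
For a rough sense of scale (an illustrative estimate, assuming q = 1 J of heat absorbed reversibly at T = 300 K), equating the two expressions gives the change in ln W rather than W itself:
[tex]\Delta S = \frac{q}{T} \approx 3.3\times10^{-3}\ \mathrm{J/K}, \qquad \ln\frac{W_f}{W_i} = \frac{\Delta S}{k_B} \approx \frac{3.3\times10^{-3}}{1.38\times10^{-23}} \approx 2.4\times10^{20}[/tex]
So the count of accessible states grows by a factor of roughly [tex]e^{2.4\times10^{20}}[/tex], far too large to write out, which is why one works with ln W (i.e. with S) rather than with W directly.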
 
  • #2
vijay123 said:
One more question: how does one prove, using statistical mechanics, that
S = (Boltzmann's constant) ln W, where W is the number of macrostates?
The entropy of a composite system is given by:
[tex]S^* = S_A + S_{A'}[/tex] -----(1)
The number of states accessible to A* is:
[tex]W^* = W_A W_{A'}[/tex] -----(2)

S*, S_A and S_{A'} are the entropies of the composite system A*, of A, and of A' respectively (A* is composed of A and A').
Entropy is a "state" function.
For (1) and (2) to hold simultaneously, what must be the form of S as a function of W?
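
One way to see what that form must be (a short sketch, assuming S = f(W) is differentiable):
[tex]f(W_A W_{A'}) = f(W_A) + f(W_{A'})[/tex]
Differentiating with respect to [tex]W_A[/tex] and then setting [tex]W_A = 1[/tex]:
[tex]W_{A'}\,f'(W_{A'}) = f'(1) \quad\Rightarrow\quad f(W) = f'(1)\ln W[/tex]
(the integration constant vanishes because setting both arguments to 1 forces f(1) = 0). Identifying the constant f'(1) with Boltzmann's constant, so that the statistical and thermodynamic temperature scales agree, gives S = k ln W.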
 
  • #3
W in S = k ln W is the number of microstates.

I was introduced to entropy through S = k ln W as the definition, so from that starting point it isn't something you prove. What definition of S do you use?
 
  • #4
vijay123 said:
Entropy is the measure of disorder... but I just cannot visualize it from a practical standpoint... I mean, the disorder taking place with so many particles when heated, and yet all I know is that S = q/T. Can anyone explain this concept better to me?

One more question: how does one prove, using statistical mechanics, that
S = (Boltzmann's constant) ln W, where W is the number of macrostates?

I was trying to do calculus-based entropy problems... but can I do them with reference to S = (Boltzmann's constant) ln W, by counting the number of macrostates available?

And is it possible to find the number of macrostates by equating (Boltzmann's constant) ln W = q/T? Is this a very large number?
Use the thermodynamic definition of entropy and forget about the concept of disorder: dS = dQ/T
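
A concrete case to practice on (a standard textbook example, assuming a reversible isothermal expansion of n moles of ideal gas from V1 to V2, which absorbs Q = nRT ln(V2/V1)):
[tex]\Delta S = \frac{Q_{rev}}{T} = nR\ln\frac{V_2}{V_1}[/tex]
The result is positive for an expansion: the same energy is spread over a larger volume, i.e. over more accessible states.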

AM
 
  • #5
You'll need to use the definition of temperature [tex]\frac{1}{k_{B}T} = \frac{\partial\ln\Omega}{\partial U}[/tex] as well.
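
A minimal sketch of how this links the two definitions (assuming constant volume, so that dU = dQ): with [tex]S = k_B\ln\Omega[/tex], the relation above is just
[tex]\frac{1}{T} = \frac{\partial S}{\partial U} \quad\Rightarrow\quad dS = \frac{dU}{T} = \frac{dQ}{T}[/tex]
which recovers the thermodynamic dS = dQ/T from the statistical definition.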
 
  • #6
...though you might want to show where that comes from; it might be kinda easy otherwise...
 

FAQ: How to visualize entropy in thermodynamics

What is entropy in thermodynamics?

Entropy is a measure of the disorder or randomness in a system. In thermodynamics, it is a measure of the energy in a closed system that is unavailable for doing useful work.

How can I visualize entropy?

Entropy can be visualized as the level of disorder or randomness in a system. For example, a neat and organized room has low entropy, while a messy and cluttered room has high entropy.
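
A toy count makes the analogy quantitative (illustrative numbers, assuming N particles that can sit in either half of a box): the number of microstates with n particles in the left half is
[tex]W(n) = \binom{N}{n} = \frac{N!}{n!\,(N-n)!}[/tex]
For N = 100, the evenly spread macrostate (n = 50) corresponds to about 10^29 microstates, while the "tidy" macrostate with every particle on one side corresponds to exactly one. Disordered macrostates are simply the ones that can be realized in overwhelmingly more ways.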

How does entropy change in a system?

The entropy of an isolated system tends to increase over time, as energy is dispersed and the system becomes more disordered. The entropy of a system that exchanges energy with its surroundings can decrease (for example, when heat is removed), but only at the cost of an equal or greater entropy increase in the surroundings.
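
A standard check of this statement (illustrative, assuming heat Q flows spontaneously from a hot reservoir at T_H to a cold one at T_C):
[tex]\Delta S_{total} = -\frac{Q}{T_H} + \frac{Q}{T_C} = Q\left(\frac{1}{T_C} - \frac{1}{T_H}\right) > 0 \quad \text{for } T_H > T_C[/tex]
Heat flowing from hot to cold always raises the total entropy of the combined, isolated system.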

What is the relationship between entropy and temperature?

Entropy and temperature are linked through the relation dS = dQ/T. Heating a system raises its entropy: at higher temperatures the molecules have more energy, can move more freely, and occupy more available states, increasing the system's disorder.
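
A quick worked number (illustrative, assuming a constant specific heat): heating 1 kg of water, with c ≈ 4184 J/(kg·K), from 300 K to 310 K gives
[tex]\Delta S = \int_{T_1}^{T_2}\frac{mc\,dT}{T} = mc\ln\frac{T_2}{T_1} \approx 4184\ln\frac{310}{300} \approx 137\ \mathrm{J/K}[/tex]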

How is entropy used in thermodynamics?

Entropy is an important concept in thermodynamics as it helps us understand how energy flows and changes in a system. It is used in calculations to determine the efficiency of processes and to predict the direction of spontaneous changes in a system.
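
One example of such a calculation (the standard Carnot result, with illustrative temperatures): a reversible engine produces no net entropy, so Q_H/T_H = Q_C/T_C, and its efficiency is capped at
[tex]\eta_{max} = 1 - \frac{Q_C}{Q_H} = 1 - \frac{T_C}{T_H}[/tex]
about 37% for T_H = 473 K and T_C = 300 K.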
