What is the role of S_hot and S_cold in the entropy equation?

In summary, ∫δQ/T equals the entropy change S2 - S1 only when evaluated over a reversible path; for an irreversible path the integral is less than the entropy change, and the difference is the generated entropy 1S2,gen.
  • #1
theBEAST

Homework Statement


In the entropy equation 1S2 + ∫δQ/T = S2 - S1, what is the difference between 1S2 and ∫δQ/T?

So, for example, if we have two reservoirs, one hot and one cold, with heat transferred between them, we have:

S_hot = Q/T_hot
S_cold = Q/T_cold

Where do the S_hot and S_cold terms go in that equation?
 
  • #2
A reservoir has a constant temperature by definition. So what does that enable you to do in evaluating ∫δQ/T ?

And BTW ∫δQ/T = 1S2 only if evaluated over a reversible path. Otherwise the integral is < entropy change.
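To make that simplification concrete: since a reservoir's temperature T is constant, it comes out of the integral, so ∫δQ/T reduces to Q/T. A minimal Python sketch of this (the 1000 J and 500 K figures are just illustrative numbers, not from the problem above):

```python
def reservoir_entropy_change(Q, T):
    """Entropy change of a constant-temperature reservoir.

    Because T is constant, the integral ∫δQ/T = (1/T)∫δQ = Q/T.
    Q: heat added TO the reservoir in joules (negative if heat leaves).
    T: reservoir temperature in kelvin.
    """
    return Q / T

# Example: 1000 J leaves a hot reservoir held at 500 K
print(reservoir_entropy_change(-1000.0, 500.0))  # -2.0 J/K
```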
 
  • #3
rude man said:
A reservoir has a constant temperature by definition. So what does that enable you to do in evaluating ∫δQ/T ?

And BTW ∫δQ/T = 1S2 only if evaluated over a reversible path. Otherwise the integral is < entropy change.

Sorry I meant ∫δQ/T = 1S2,generated.

But if ∫δQ/T = 1S2,gen, then that means S2 - S1 = 2∫δQ/T
 
  • #4
theBEAST said:
Sorry I meant ∫δQ/T = 1S2,generated.

But if ∫δQ/T = 1S2,gen, then that means S2 - S1 = 2∫δQ/T

No it does not. S2 - S1 = ∫δQ/T over a reversible path. S2 - S1 is the entropy change in ONE of the reservoirs. Since you have two reservoirs at two different temperatures T1 and T2, each has its own S2 - S1.

Have you figured out how that integral is simplified if you're dealing with entropy changes of reservoirs?
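A small Python sketch of the "each reservoir has its own S2 - S1" point, using made-up numbers (Q = 1000 J, T_hot = 500 K, T_cold = 250 K are assumptions for illustration): heat Q leaves the hot reservoir and enters the cold one, and the two entropy changes are computed separately.

```python
def two_reservoir_entropy_changes(Q, T_hot, T_cold):
    """Entropy changes when heat Q flows from a hot to a cold reservoir.

    Each reservoir has its own S2 - S1: heat leaves the hot one (-Q)
    and enters the cold one (+Q). The sum is the entropy generated
    by the irreversible transfer, and it is positive when T_hot > T_cold.
    """
    dS_hot = -Q / T_hot
    dS_cold = Q / T_cold
    return dS_hot, dS_cold, dS_hot + dS_cold

dS_hot, dS_cold, dS_total = two_reservoir_entropy_changes(1000.0, 500.0, 250.0)
print(dS_hot, dS_cold, dS_total)  # -2.0 4.0 2.0 (J/K): net entropy is generated
```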
 
  • #5
rude man said:
No it does not. S2 - S1 = ∫δQ/T over a reversible path. S2 - S1 is the entropy change in ONE of the reservoirs. Since you have two reservoirs at two different temperatures T1 and T2, each has its own S2 - S1.

Have you figured out how that integral is simplified if you're dealing with entropy changes of reservoirs?

Yeah I know how to do that. Thanks!
 

FAQ: What is the role of S_hot and S_cold in the entropy equation?

What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is a concept commonly used in physics, chemistry, and information theory.

Why is entropy important?

Entropy helps us understand the behavior of physical and chemical systems. It is also a key concept in thermodynamics, where it is used to explain the direction and efficiency of energy transfer.

What is the relationship between entropy and disorder?

In general, as entropy increases, so does disorder. This is because higher entropy means there are more possible arrangements or configurations of a system, and some of these arrangements may appear random or disordered.

Does entropy always increase?

In an isolated system, entropy tends to increase over time. This is known as the second law of thermodynamics. However, in an open system, such as the Earth, entropy can decrease locally as long as it is balanced by an increase in entropy elsewhere.

How can entropy be calculated?

The exact calculation of entropy depends on the system and its properties. In classical thermodynamics, the entropy change is computed as ΔS = ∫δQ/T over a reversible path. In statistical mechanics, entropy is given by the equation S = k ln W, where S is the entropy, k is the Boltzmann constant, and W is the number of microstates (possible arrangements) of the system.
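A quick Python sketch of S = k ln W (the choice of W = 2^100, i.e. 100 independent two-state particles, is an arbitrary illustrative assumption):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(W):
    """Statistical entropy S = k ln W for W microstates."""
    return K_B * math.log(W)

# Example: 100 independent two-state particles have W = 2**100 microstates,
# so S = 100 * k * ln 2, on the order of 1e-21 J/K.
S = boltzmann_entropy(2**100)
```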
