Exploring Entropy: Order vs Disorder in Physical Systems

  • #1
HARI A
In my thermodynamics classes I have to deal with entropy. But what does it really mean? I know that it is the degree of disorder and the equation ΔS = q/T. But what is this disorder? How can a physical system be aware of order and disorder, since that is an abstract concept? They say that solids have higher order than liquids and thus lower entropy. Does entropy have any relation to the van der Waals forces and the potential energy of the system, which predominate in solids more than in liquids? Why should the system try to increase entropy? Is it an attempt to reduce PE by keeping particles in a distinct manner? Please help. Thanks in advance.
 
  • #2
Rather than "disorder", know that entropy is the logarithm of the number of microstates available to a macrostate (or system). You can forget the logarithm for understanding; it's there to help with calculations. Entropy is, in essence, the number of microstates available to your macrostate. Entropy is a function of the macrostate you are considering.
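To make that counting concrete, here is a minimal illustrative sketch in Python (not from the thread; the toy four-particle system is my own choice). It enumerates every microstate of four two-level particles and tallies how many belong to each total-energy macrostate:

```python
import math
from itertools import product

# Toy system: 4 two-level particles, each carrying energy 0 or 1.
# A microstate is the full tuple of individual energies;
# the macrostate is just the total energy E.
counts = {}
for micro in product((0, 1), repeat=4):
    E = sum(micro)
    counts[E] = counts.get(E, 0) + 1

# Boltzmann entropy in units of k_B: S = ln(omega).
for E, omega in sorted(counts.items()):
    print(f"E = {E}: omega = {omega}, S = {math.log(omega):.3f} k_B")

# The middle macrostate (E = 2) has omega = 6 microstates,
# the most of any macrostate, and hence the highest entropy.
```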
 
  • #3
ModusPwnd said:
Rather than "disorder", know that entropy is the logarithm of the number of microstates available to a macrostate (or system). You can forget the logarithm for understanding; it's there to help with calculations. Entropy is, in essence, the number of microstates available to your macrostate. Entropy is a function of the macrostate you are considering.

Wow... that's confusing!

It makes sense that fluids have higher disorder because their particles have more motion; in a solid the atoms are packed close together, so there is less motion.

I don't know if that's the correct approach but it's the one that made sense to me.
 
  • #4
Well, thinking about it purely from a macro perspective, liquids and gases have higher entropy than solids. Water mixes easily with other liquids, and so does air: whenever you open the window, the air rushes to reach an equilibrium temperature, so the cold air comes in and the hot air goes out until they are perfectly mixed.

Subatomic particles and quantum states have their own entropy, and from those states macroscopic things and phenomena get their corresponding states and properties, although the two need not look similar.

Just as hot and cold air mix, with the hot air rushing towards the cold, all matter tends to go towards higher entropy, and energy tends to be converted via different mechanisms so that not only particles but whole systems tend towards lower energy levels with time.
 
  • #5
One of the least confusing and most flexible definitions of entropy I've come across is as follows:

Entropy represents the diversity of internal movement of a system.

It's from this site:

http://www.science20.com/train_thought/blog/entropy_not_disorder-75081

Personally, I would love to see the terms "order" and "disorder" stricken from any and all definitions of entropy.

The above, unfortunately, is not easy to convert to information entropy, and I'm sure there are other usages of entropy that it does not describe well. I don't think there is a single good definition of entropy that covers all its usages, at least not one that is easily understood by any student of any academic field or layman.
 
  • #6
Why should a system be aware of anything? No need to bring consciousness into it. Entropy as a statistical concept was defined in post #2, and that logic should lead you to why entropy tends to increase: if you put a system in a low-entropy state, then there are only a few microstates corresponding to whatever macrostate you have. But atoms in a gas or solid are always interacting, colliding and transferring energy to each other. This happens so much that, over time, the entropy will increase to the point where there are many microstates corresponding to whatever macrostate you have.

It's a purely statistical result. If you start with 100 bottles of beer on the wall and then leave them in a storm, after a while you're going to have many fewer than 100. The macrostate "all bottles still on the wall" corresponds to a single microstate ("bottle 1 on the wall, bottle 2 on the wall... bottle 100 on the wall"), and your system entropy is k_B*ln(1) = 0.

After 50 have blown off, your new macrostate "50 bottles left on the wall" corresponds to quite a lot of microstates. Bottles 1 through 50 may have fallen off. Bottles 1 through 49 plus bottle 51 may have fallen off. My maths is awful, but I'd guess that "100 choose 50" is the number of microstates of this new macrostate, so your new entropy is about 66*k_B.
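That guess can be checked directly; here is a minimal Python sketch (illustrative, not part of the original post):

```python
import math

# Macrostate: "50 of the 100 bottles remain on the wall".
# Microstates: which particular 50 fell off -> C(100, 50) of them.
omega = math.comb(100, 50)  # requires Python 3.8+

# Boltzmann entropy S = k_B * ln(omega), reported in units of k_B.
print(f"omega = {omega:.3e}")            # ~1.009e+29 microstates
print(f"S = {math.log(omega):.1f} k_B")  # ~66.8 k_B, matching the guess
```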

Obviously this example is pretty awful because there is a clear tendency for bottles to fall rather than get picked back up and replaced. But in a system of gas particles, or a set of paramagnets or all the standard examples, there is no macroscopic object (like the wind) forcing the bottles off the wall- just microscopic interactions which are completely random. And yet the system's macrostate will drift towards one with more and more corresponding microstates, so that entropy increases.

The more I think about it, the worse that example was.
 
  • #7
The definition in post #2 is good but not easily understood. Take my situation of having to explain entropy to new apprentice boiler operators or instrumentation techs; that's where I found the definition I posted most useful. Granted, this is a scientific forum, but that does not mean we cannot simplify a definition so that it is easily understood by a layman regardless of field of academic study.
 
  • #8
I'm not sure I agree with that article. Entropy is not heat capacity; it is the amount of heat per kelvin that is not available to do useful work. If your system has half the molecules at 100 K and half at 200 K, then its entropy is lower than that of a system with all its molecules at 150 K. That is because there are fewer microstates corresponding to the nonuniform case than to the uniform case. And, as a result, the nonuniform system can do more useful work than the uniform one, because a system in thermal equilibrium is in a state of maximum entropy and minimum free energy.
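A quick worked check of that claim, as a minimal Python sketch (the two identical bodies and the constant heat capacity C are assumptions for illustration):

```python
import math

# Two identical bodies with constant heat capacity C,
# one at 100 K and one at 200 K, brought into thermal contact.
C = 1.0             # J/K, arbitrary stand-in value
T1, T2 = 100.0, 200.0
Tf = (T1 + T2) / 2  # equal heat capacities -> they meet at 150 K

# dS = C dT/T for each body, integrated from its start temperature to Tf.
dS = C * math.log(Tf / T1) + C * math.log(Tf / T2)
print(f"Delta S = {dS:+.4f} J/K")
# ~ +0.1178 J/K: total entropy rises as the temperatures even out,
# so the nonuniform (100 K / 200 K) state is indeed the lower-entropy one.
```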
 
  • #9
Layman's explanation:

If you do a really bad fart, it's almost certain that everyone in the room will smell it!
Because...
The fart invariably diffuses.
Because...
There are many more permutations of ways the fart in the room can exist in a diffuse state compared to the eventualities where the fart stays clumped together.

The fart diffuses because of entropy.
Entropy increases with time.
Therefore...
Leave the room quickly after farting.
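For the statistically minded, the same point can be simulated; here is a minimal sketch of the standard Ehrenfest urn model in Python (illustrative, not from the original post):

```python
import random

# Ehrenfest urn model of diffusion: 100 gas molecules, each randomly
# hopping between the left and right halves of a room.
# Start with every molecule on the left (the "clumped" state).
N, hops = 100, 2000
left = N
for _ in range(hops):
    # Pick a molecule uniformly at random; it jumps to the other half.
    if random.randrange(N) < left:
        left -= 1
    else:
        left += 1

print(f"left/right after {hops} hops: {left}/{N - left}")
# The count quickly settles near 50/50 and stays there: the evenly
# spread macrostate has overwhelmingly more microstates, so random
# motion almost never wanders back to "everything on one side".
```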
 

FAQ: Exploring Entropy: Order vs Disorder in Physical Systems

What is entropy?

Entropy is a scientific concept that measures the level of disorder or randomness in a physical system. More precisely, it is a measure of the number of possible microscopic arrangements (microstates) consistent with the system's macroscopic state.

How does entropy relate to order and disorder?

In general, a system with low entropy is considered to be more ordered and structured, while a system with high entropy is considered to be more disordered and random. This is because as the number of possible arrangements or states increases, the system becomes more unpredictable and chaotic.

What causes entropy to increase?

The Second Law of Thermodynamics states that in an isolated system, entropy will always tend to increase over time. This is due to the natural tendency of energy to spread out and become more evenly distributed. In other words, as energy is transferred and work is done, entropy increases.

Can entropy be reversed or decreased?

While it is possible for entropy to decrease locally, the overall trend is always towards increasing entropy. Creating and maintaining order requires energy, and expending that energy generates at least as much entropy elsewhere. To keep a system's entropy low, energy must be continually supplied, and the total entropy of the system plus its surroundings still increases.

How does entropy impact our daily lives?

Entropy plays a role in many aspects of our daily lives, from the aging of our bodies to the deterioration of buildings and infrastructure. It also affects the efficiency of processes, such as the conversion of heat energy into mechanical energy. Understanding and managing entropy is crucial in fields such as engineering, ecology, and thermodynamics.
