Does Entropy Mean Less Information or More Complexity?

In summary, the conversation discusses the concept of entropy and its relationship to the information contained in a system. It is stated that entropy always increases and that nature tends to move from orderly states to disorderly states. This is then related to the amount of information an object contains, with the suggestion that nature tends to go from higher-information states to lower-information states. However, there is some disagreement and confusion about whether an intact glass or a broken glass contains more information, and about the difference between data and information.
  • #1
dhruv.tara

Homework Statement


I haven't studied thermodynamics, and we make only very brief use of thermo in my power plants course.

I have read that entropy always increases: the universe tends to move from an orderly state to a disorderly one. The teacher related this to probability, saying that nature tends to go from less probable states to more probable states.

All these statements were fine.

Then they related this to the information contained within the system. I was told that nature tends to go from higher-information states to lower-information states; things tend to lose information.

E.g. a glass -> has information associated with it: its structure and other properties.
A broken glass -> completely random; not much information associated with it.

I cannot convince myself of the above. Can't we say the opposite?
A glass has much less information associated with it, or most of its properties are redundant: given a few parameters, it is possible to give a complete description of the object.
A broken glass -> has too much information. We just can't classify or find parameters (or we need way too many parameters) to describe a broken glass, rather than saying that it's just random.

My guess is that it has something to do with the difference between data and information, but that's only a guess...

Thanks... Any help is appreciated.
 
  • #2
I think you might be mixing up the "information" needed to describe an object with the information the object itself possesses. Take some more time to think about the glass and the broken shards of glass, and ask which has more structured information.
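
One way to make that distinction concrete (a side illustration, not from the thread itself): the compressed size of a byte string is a rough proxy for the "description length" the original poster has in mind. In the Python sketch below, a repeating pattern stands in for the intact glass and a shuffled copy of the same bytes stands in for the broken one; the names and the 10,000-byte size are arbitrary choices.

```python
import random
import zlib

random.seed(0)

# "Intact glass": a highly regular byte string, describable by a short rule.
ordered = b"ABCD" * 2500  # 10,000 bytes of a repeating pattern

# "Broken glass": the very same symbols, but in a random order.
shuffled = bytearray(ordered)
random.shuffle(shuffled)
shuffled = bytes(shuffled)

# Compressed size ~ how many parameters you need to pin the object down.
print(len(zlib.compress(ordered)))   # small: the pattern compresses well
print(len(zlib.compress(shuffled)))  # much larger: randomness resists compression
```

By this measure the shuffled bytes do need a longer description, which matches the original poster's intuition; squaring that with the "things lose information" phrasing is exactly the data-versus-information distinction being asked about.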
 

FAQ: Does Entropy Mean Less Information or More Complexity?

What is entropy?

Entropy is a measure of the disorder or randomness in a system. In physics, it is a thermodynamic quantity that describes the amount of energy in a system that is unavailable for doing work.
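
As a quick worked example (an illustration added here, not part of the original answer), the Clausius definition gives the entropy change for a reversible transfer of heat Q at absolute temperature T:

$$\Delta S = \frac{Q_{\text{rev}}}{T}$$

Melting 1 kg of ice at 273 K absorbs about 334 kJ, so the entropy of the resulting water rises by roughly 334,000 / 273 ≈ 1220 J/K.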

How does entropy relate to information?

In information theory, entropy is a measure of uncertainty or randomness in a system. It is used to quantify the amount of information contained in a message or signal.
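
A minimal sketch of that idea (an illustration added here, not part of the original answer), assuming the symbol frequencies in a message are used as the probabilities:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits, estimated from symbol frequencies."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

print(shannon_entropy("aaaaaaaa"))  # 0.0 bits: a perfectly predictable message
print(shannon_entropy("abababab"))  # 1.0 bit per symbol
print(shannon_entropy("abcdefgh"))  # 3.0 bits per symbol: evenly spread over 8 symbols
```

The more evenly the symbols are spread out, the higher the entropy and the more bits are needed, on average, to encode each symbol.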

What is the relationship between entropy and order?

Entropy and order are inversely related. As the level of order or organization in a system increases, the entropy decreases. This is because a highly ordered system has less randomness or uncertainty.
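
A hypothetical coin-flip illustration of the same point (not from the original answer; the 100-coin system is an arbitrary stand-in for the particles of a real system): an ordered macrostate such as "all heads" corresponds to a single arrangement, while a disordered one such as "half heads" corresponds to an enormous number of arrangements, and Boltzmann's relation S = k_B ln W ties the larger count to the higher entropy.

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K

N = 100  # number of coins standing in for particles

W_ordered = comb(N, 0)          # "all heads": exactly 1 arrangement
W_disordered = comb(N, N // 2)  # "half heads": about 1e29 arrangements

print(W_ordered, W_disordered)
print(k_B * log(W_ordered), k_B * log(W_disordered))  # S = k_B * ln(W)
```

The disordered macrostate is overwhelmingly more probable simply because so many more microstates realize it, which is the probability argument mentioned at the start of the thread.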

What are some real-world examples of entropy?

Examples of entropy can be seen in everyday life, such as a messy room, a disorganized desk, or a pile of tangled wires. In these situations, the level of disorder or randomness is high, indicating a high level of entropy.

How is entropy used in different fields of science?

Entropy is used in various fields of science, including physics, chemistry, biology, and information theory. It is a fundamental concept in thermodynamics and is used to understand the behavior of systems, such as heat engines and chemical reactions. In information theory, entropy is used to measure the amount of information contained in a message or signal. It is also used in biology to describe the level of disorder or randomness in biological systems.
