lukephysics
- TL;DR Summary
- Why do they say things have entropy, such as "the early universe has low entropy", when they don't specify who the observer is and what they are predicting?
I always get a bit confused when listening to podcasts about the arrow of time and entropy in the universe, so I was reading more about information theory. I learned today that entropy is not defined for a physical system on its own. All it means is how much uncertainty an observer has when making a prediction about something of particular interest to them.
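A minimal sketch of the information-theoretic picture described above, assuming the Shannon definition of entropy: the entropy is a property of the probability distribution an observer assigns, so two observers with different knowledge of the same system assign different entropies.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# An observer who knows nothing about a coin assigns 50/50: maximal uncertainty.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit

# An observer who knows the coin is biased assigns 90/10: less uncertainty,
# lower entropy, even though the physical coin is the same.
print(shannon_entropy([0.9, 0.1]))  # about 0.469 bits
```

On this view, the distribution (and hence the entropy) encodes the observer's state of knowledge, which is what makes the question "entropy for whom, about what?" a natural one.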
So why do they say things have entropy, such as "the early universe has low entropy", when they don't say who the observer is and what they are predicting?
Another example is entropy in chemical reactions. Is that a different definition of entropy? Or is it fundamentally the same?