Entropy & Information Content: Examining the Difference

In summary, entropy in information theory quantifies the uncertainty or randomness of a source. It is commonly used to measure the information content of a message or signal: the higher the entropy, the more information the message carries on average. "Entropy" is the more technical of the two terms, while "information content" is used more loosely and can refer to any form of data or content.
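For a concrete sense of what is being measured, the Shannon entropy of a discrete source with symbol probabilities p_i is H = -Σ p_i log2(p_i), expressed in bits. Below is a minimal sketch in Python; the two distributions are made up purely for illustration and are not taken from the thread. It only shows that a distribution spread evenly over many symbols has far higher entropy than one concentrated almost entirely on a single symbol.

Python:
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Illustrative (made-up) distributions over 256 possible symbol values:
uniform = [1 / 256] * 256             # every symbol equally likely
peaked = [0.99] + [0.01 / 255] * 255  # one symbol dominates

print(shannon_entropy(uniform))  # 8.0 bits per symbol
print(shannon_entropy(peaked))   # about 0.16 bits per symbol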
  • #1
louislaolu
What does "entropy" mean in the following passage? Does it mean the same thing as the term "information content" that precedes it? Is entropy a more technical term than information content?

He remembered taking a class in information theory as a third-year student in college. The professor had put up two pictures: One was the famous Song Dynasty painting, full of fine, rich details; the other was a photograph of the sky on a sunny day, the deep blue expanse broken only by a wisp of cloud that one couldn't even be sure was there. The professor asked the class which picture contained more information. The answer was that the photograph's information content--its entropy--exceeded the painting's by one or two orders of magnitude.
 
  • #2
Borek
Google for a formal definition of entropy in the context of information theory.
 
  • #3
Borek said:
Google for a formal definition of entropy in the context of information theory.
... or Shannon entropy.
 
