Concavity of Entropy: Is it True?

  • Thread starter emma83
  • Tags: Entropy
In summary, Shannon entropy is a concave function defined as H(X)=-\sum_{x}p(x)\log p(x), and conditional Shannon entropy is defined as H(X|Y)=\sum_{y} p(y) H(X|Y=y)=-\sum_{y} p(y)\sum_{x}p(x|Y=y)\log p(x|Y=y). The question is whether an inequality of the form \sum_{y} p(y)H(X|Y=y)\geq H(X|Y=y) follows from the concavity of Shannon entropy; a reply notes that the right-hand side is presumably also meant to be summed over y, asks whether \geq or \leq is intended, and suggests thinking about the constraints on p(y).
  • #1
emma83

Homework Statement


Shannon entropy is a concave function defined as follows:
[tex]H(X)=-\sum_{x}p(x)\log p(x)[/tex]

Conditional Shannon entropy is defined as follows:
[tex]H(X|Y)=\sum_{y} p(y) H(X|Y=y)=-\sum_{y} p(y)\sum_{x}p(x|Y=y)\log p(x|Y=y)[/tex]

Can we deduce that:
[tex]\sum_{y} p(y)H(X|Y=y)\geq H(X|Y=y)[/tex]

Homework Equations


The Attempt at a Solution


I would say yes because of concavity, but I am confused by the two random variables.
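
A quick way to get a feel for which way the inequality goes is to try a small numerical example. The Python sketch below is not part of the original problem: the joint distribution pxy is a made-up example, used only to compute H(X), each H(X|Y=y), and the weighted average \sum_{y} p(y)H(X|Y=y).

[code]
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p (with 0 log 0 := 0)."""
    p = np.asarray(p, dtype=float)
    nz = p > 0
    return -np.sum(p[nz] * np.log2(p[nz]))

# Hypothetical joint distribution p(x, y): rows index x, columns index y.
pxy = np.array([[0.30, 0.10],
                [0.05, 0.25],
                [0.15, 0.15]])

py = pxy.sum(axis=0)      # marginal p(y)
px = pxy.sum(axis=1)      # marginal p(x)
px_given_y = pxy / py     # column j holds p(x | Y = y_j)

H_X = entropy(px)
H_X_given_y = np.array([entropy(px_given_y[:, j]) for j in range(len(py))])
H_X_given_Y = np.dot(py, H_X_given_y)   # sum_y p(y) H(X | Y = y)

print("H(X)                         =", H_X)
print("H(X|Y=y) for each y          =", H_X_given_y)
print("H(X|Y) = sum_y p(y) H(X|Y=y) =", H_X_given_Y)
[/code]

For this particular distribution the weighted average \sum_{y} p(y)H(X|Y=y) comes out smaller than H(X), which is consistent with what concavity predicts in general.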
 
  • #2
I'm assuming the RHS of your final expression is also meant to be summed over [tex]y[/tex], otherwise it doesn't make much sense...

The way to go about this is to think about the nature of [tex]p(y)[/tex]. What constraints do you know about the values that [tex]p(y)[/tex] can take?

PS Do you really mean [tex]\geq[/tex] in the last line, or do you mean [tex]\leq[/tex]?
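
One standard way to use those constraints (this step is not spelled out in the thread, so take it as a hint rather than the intended solution): since p(y)\geq 0 and \sum_{y}p(y)=1, the marginal p(x)=\sum_{y}p(y)p(x|Y=y) is a convex combination of the conditional distributions, and concavity of H (Jensen's inequality) gives

[tex]H(X) = H\left(\sum_{y} p(y)\, p(\cdot|Y=y)\right) \geq \sum_{y} p(y) H(X|Y=y) = H(X|Y)[/tex]

so the comparison that concavity actually delivers is between \sum_{y} p(y)H(X|Y=y) and the unconditional entropy H(X), with the weighted average on the smaller side.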
 

FAQ: Concavity of Entropy: Is it True?

What is concavity of entropy?

Concavity of entropy refers to the property that entropy, a measure of disorder or randomness in a system, is a concave function of its arguments. Informally it is a diminishing-returns property: for Shannon entropy, mixing probability distributions never yields less entropy than the corresponding average of their entropies, and for thermodynamic entropy, the rate at which entropy grows with energy slows down as the system becomes more disordered.
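
For the Shannon case this can be stated explicitly (a standard textbook form, added here for concreteness): for any two probability distributions p and q and any [tex]0\leq\lambda\leq 1[/tex],

[tex]H\left(\lambda p + (1-\lambda)q\right) \geq \lambda H(p) + (1-\lambda)H(q)[/tex]

i.e. the entropy of a mixture is at least the corresponding mixture of the entropies.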

Is it true that entropy always increases?

Essentially yes: according to the second law of thermodynamics, the total entropy of an isolated system (one that exchanges neither matter nor energy with its surroundings) never decreases over time; it increases in irreversible processes and stays constant only in idealized reversible ones. This reflects the natural tendency of systems to evolve toward more disordered, more spread-out states.

How does concavity of entropy relate to the second law of thermodynamics?

Concavity of entropy is not the second law itself, but the two work together: the second law says that an isolated system's entropy never decreases, while concavity of the entropy function is what ensures thermodynamic stability and a well-defined entropy maximum. As the system approaches equilibrium, entropy increases toward that maximum with diminishing returns.

Can entropy ever decrease?

In an isolated system, total entropy can never decrease. However, the entropy of a specific part of the system can decrease, as long as it is compensated by an equal or greater increase elsewhere, so that the total entropy does not decrease.

How is concavity of entropy important in scientific research?

Concavity of entropy is important in understanding and predicting the behavior of complex systems, such as chemical reactions, biological processes, and thermodynamic systems. It also has applications in fields such as information theory and statistical mechanics.
