Entropy question for a 2-input, 1-output system

In summary: the thread concerns one line in the solution to a problem, namely the claim that H(Y|X) = H(Z|X). The original poster asks why this equality holds, noting that their expansion of the definition of conditional entropy seems to require p(Y,X) = p(Z,X), which they cannot show. They later post the answer themselves and suggest that "Entropy of sum question" would have been a better title for the thread.
  • #1
perplexabot
Gold Member
Hey all. I have a question about the solution to a problem, both shown below (only part of the solution is included). Specifically, the line that states H(Y|X) = H(Z|X): why does this equality hold? Expanding with the definition of conditional entropy, it looks to me like p(Y,X) must equal p(Z,X) for the equality to hold, right? I am not able to show this. Any help would be greatly appreciated.

Here is the question with part of the solution:
[Attached image: 2015-05-25-222302_1366x768_scrot.png (the problem statement and part of the solution)]
  • #2
Hey again! I found the answer to my question; here it is:

[Attached image: temp_ans.png (the worked answer)]
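In case it helps anyone else, below is a minimal numerical sanity check of the equality. It assumes, as the "entropy of sum" title suggests, that Z = X + Y for discrete X and Y; the alphabets and joint distribution are made-up illustrative values, not the ones from the problem.

```python
import numpy as np

# Made-up joint distribution p(x, y) over small alphabets, just to test
# the identity numerically (NOT the distribution from the actual problem).
p_xy = np.array([[0.10, 0.20, 0.05],
                 [0.15, 0.05, 0.10],
                 [0.05, 0.20, 0.10]])
xs = np.array([0, 1, 2])  # alphabet of X
ys = np.array([0, 1, 2])  # alphabet of Y

def cond_entropy(p_joint):
    """H(B|A) in bits for a joint table p_joint[a, b] (rows = A, cols = B)."""
    p_a = p_joint.sum(axis=1, keepdims=True)
    p_b_given_a = np.divide(p_joint, p_a, out=np.zeros_like(p_joint), where=p_a > 0)
    logs = np.log2(p_b_given_a, out=np.zeros_like(p_b_given_a), where=p_b_given_a > 0)
    return -(p_a * p_b_given_a * logs).sum()

# Build the joint p(x, z) for Z = X + Y: mass at (x, y) moves to (x, x + y).
zs = np.unique(xs[:, None] + ys[None, :])
p_xz = np.zeros((len(xs), len(zs)))
for i, x in enumerate(xs):
    for j, y in enumerate(ys):
        p_xz[i, np.searchsorted(zs, x + y)] += p_xy[i, j]

print("H(Y|X) =", cond_entropy(p_xy))  # prints the same value as the next line
print("H(Z|X) =", cond_entropy(p_xz))
```

The reason the two values agree: given X = x, z = x + y is an invertible function of y, so the conditional distribution p(z|x) is just a relabeling of p(y|x) and the two have the same entropy. In particular, the equality does not require the joint distributions p(Y,X) and p(Z,X) to be equal.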


PS: Maybe a better name for this thread would have been "Entropy of sum question." I would have renamed it if I could.
Thanks for reading.
 

FAQ: Entropy question for a 2-input, 1-output system

What is entropy in a 2-input, 1-output system?

In the information-theoretic sense used in this thread, entropy measures the average uncertainty of a random variable. For a system with inputs X and Y and output Z, H(X) and H(Y) quantify the uncertainty of the inputs, and H(Z) quantifies the uncertainty of the output.

How is entropy calculated in a 2-input, 1-output system?

Entropy is typically calculated with the Shannon entropy formula, which accounts for the probability of each value a variable can take: H(X) = -Σ p(x) log₂ p(x), where the sum runs over the possible values x and p(x) is the probability of each. With the base-2 logarithm the result is measured in bits, and the same formula applies to each input and to the output.
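As a quick illustration of the formula, here is a minimal Python sketch; the example distributions are arbitrary:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)) over the distribution, in bits.
    Zero-probability outcomes contribute nothing, so they are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))    # biased coin: ~0.469 bits
print(shannon_entropy([0.25] * 4))    # uniform over 4 symbols: 2.0 bits
```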

What is the relationship between entropy and information in a 2-input, 1-output system?

In Shannon's framework, entropy is the average information content of a source: a high-entropy variable needs more bits, on average, to describe. The information the output carries about an input is measured by the mutual information, I(X;Z) = H(Z) - H(Z|X); the larger the leftover uncertainty H(Z|X), the less the output Z reveals about the input X.
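To make this concrete, here is a minimal Python sketch that computes a mutual information from a made-up joint distribution (the numbers are illustrative only, not from the thread). It uses the equivalent identity I(X;Z) = H(X) + H(Z) - H(X,Z):

```python
import numpy as np

# Made-up joint distribution p(x, z); rows index X, columns index Z.
p_xz = np.array([[0.30, 0.10],
                 [0.10, 0.50]])

def entropy_bits(dist):
    """Shannon entropy of a probability array, in bits."""
    dist = dist[dist > 0]          # 0 * log(0) is taken as 0
    return -(dist * np.log2(dist)).sum()

p_x = p_xz.sum(axis=1)             # marginal distribution of X
p_z = p_xz.sum(axis=0)             # marginal distribution of Z

# I(X;Z) = H(X) + H(Z) - H(X,Z): how much Z tells us about X.
mi = entropy_bits(p_x) + entropy_bits(p_z) - entropy_bits(p_xz.ravel())
print(f"I(X;Z) = {mi:.3f} bits")   # 0 would mean Z carries no information about X
```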

How does entropy change over time in a 2-input, 1-output system?

The claim that entropy tends to increase over time is the second law of thermodynamics, and it concerns the thermodynamic entropy of an isolated physical system. The Shannon entropy discussed in this thread is a property of fixed probability distributions and does not evolve in time. A loosely analogous information-theoretic fact is that the output of a deterministic map cannot be more uncertain than its inputs: H(Z) ≤ H(X, Y) when Z = f(X, Y).

Can entropy be reversed in a 2-input, 1-output system?

For thermodynamic entropy, the second law says the entropy of an isolated system never decreases; it can be lowered locally only at the cost of a larger increase elsewhere, sustained by a constant input of energy. For the information-theoretic entropy this thread is about, there is nothing to reverse: H(X), H(Y), and H(Z) are fixed properties of the underlying distributions rather than quantities that evolve.
