Relation Between Mutual Information and Expectation Values

In summary, mutual information quantifies the amount of information shared between two random variables, while expectation values represent the average outcomes of random variables. Because entropies are themselves expectation values, mutual information can be written as the expectation of the logarithm of the ratio of the joint distribution to the product of the marginal distributions. This connection makes mutual information a natural tool for quantifying dependencies and correlations between variables within probabilistic frameworks.
  • #1
blueinfinity
Homework Statement
Alice and Bob share the Bell state
\begin{align*}
|\psi\rangle = \frac{1}{\sqrt{2}}(|00\rangle+|11\rangle).
\end{align*}
Consider the pair of observables
\begin{align*}
\mathcal{O}_A =
\begin{pmatrix}
1 & 0 \\ 0 & \frac{1}{2}
\end{pmatrix}
, \qquad \mathcal{O}_B =
\begin{pmatrix}
1 & 0 \\ 0 & \frac{1}{3}
\end{pmatrix}
.
\end{align*}
Show the mutual information between Alice and Bob is larger than $(\langle\psi | \mathcal{O}_A \otimes \mathcal{O}_B| \psi\rangle - \langle\psi |\mathcal{O}_A|\psi \rangle \langle\psi |\mathcal{O}_B|\psi \rangle)^2 $
Relevant Equations
\begin{align*}
|\psi\rangle = \frac{1}{\sqrt{2}}(|00\rangle+|11\rangle).
\end{align*}
Consider the pair of observables
\begin{align*}
\mathcal{O}_A =
\begin{pmatrix}
1 & 0 \\ 0 & \frac{1}{2}
\end{pmatrix}
, \qquad \mathcal{O}_B =
\begin{pmatrix}
1 & 0 \\ 0 & \frac{1}{3}
\end{pmatrix}
.
\end{align*}
I've made progress in obtaining the value of the mutual information using the following:
$I(\rho_A:\rho_B) = S(\rho_A) +S(\rho_B) - S(\rho_{AB}) = 1 + 1 - 0 = 2.$
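(Here the reduced states are $\rho_A = \rho_B = \frac{1}{2}(|0\rangle\langle 0| + |1\rangle\langle 1|)$, each maximally mixed, so $S(\rho_A) = S(\rho_B) = 1$, while $\rho_{AB} = |\psi\rangle\langle\psi|$ is pure, so $S(\rho_{AB}) = 0$.)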

I would like to compute the expectation but I'm facing a problem in the case of $\langle\psi |\mathcal{O}_A|\psi \rangle$, since the sizes of the matrices in this multiplication do not match: namely, $\langle\psi|$ is of size $1\times 4$, $|\psi \rangle$ is of size $4\times 1$, and the matrix $\mathcal{O}_A$ is $2 \times 2$.
I'm very new to the subject and I would greatly appreciate some guidance on how the computation of this expectation is carried out.

Additionally, I have computed $\langle\psi | \mathcal{O}_A \otimes \mathcal{O}_B| \psi\rangle$ by first forming the tensor product of the two matrices $\mathcal{O}_A$ and $\mathcal{O}_B$ and then multiplying by the bra and ket of the state respectively, deducing
$$\langle\psi | \mathcal{O}_A \otimes \mathcal{O}_B| \psi\rangle = \frac{7}{12}.$$
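(Only the $|00\rangle$ and $|11\rangle$ components contribute: the corresponding diagonal entries of $\mathcal{O}_A \otimes \mathcal{O}_B$ are $1\cdot 1 = 1$ and $\frac{1}{2}\cdot\frac{1}{3} = \frac{1}{6}$, and $\frac{1}{2}\left(1 + \frac{1}{6}\right) = \frac{7}{12}$.)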

I would appreciate any insight on this.
 
  • #2
Welcome!

You need two hashes as the delimiter for inline LaTeX.
 
  • #3
Here it is edited:
Show the mutual information between Alice and Bob is larger than ##(\langle\psi | \mathcal{O}_A \otimes \mathcal{O}_B| \psi\rangle - \langle\psi |\mathcal{O}_A|\psi \rangle \langle\psi |\mathcal{O}_B|\psi \rangle)^2 ##

I've made progress in obtaining the value of the mutual information using the following:
##I(\rho_A:\rho_B) = S(\rho_A) +S(\rho_B) - S(\rho_{AB}) = 1 + 1 - 0 = 2.##

I would like to compute the expectation but I'm facing a problem in the case of ##\langle\psi |\mathcal{O}_A|\psi \rangle##, since the sizes of the matrices in this multiplication do not match: namely, ##\langle\psi|## is of size ##1\times 4##, ##|\psi \rangle## is of size ##4\times 1##, and the matrix ##\mathcal{O}_A## is ##2 \times 2##.
I'm very new to the subject and I would greatly appreciate some guidance on how the computation of this expectation is carried out.

Additionally, I have computed ##\langle\psi | \mathcal{O}_A \otimes \mathcal{O}_B| \psi\rangle## by first forming the tensor product of the two matrices ##\mathcal{O}_A## and ##\mathcal{O}_B## and then multiplying by the bra and ket of the state respectively, deducing...

I think that for ##\langle\psi |\mathcal{O}_A|\psi \rangle## you calculate ##\langle\psi | \mathcal{O}_A \otimes \mathcal I| \psi\rangle##, and for ##\langle\psi |\mathcal{O}_B|\psi \rangle##, it is ##\langle\psi |\mathcal I \otimes \mathcal{O}_B | \psi\rangle##.
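If that is right, reading off the diagonal entries in the same way gives ##\langle\psi | \mathcal{O}_A \otimes \mathcal I| \psi\rangle = \frac{1}{2}\left(1 + \frac{1}{2}\right) = \frac{3}{4}## and ##\langle\psi |\mathcal I \otimes \mathcal{O}_B | \psi\rangle = \frac{1}{2}\left(1 + \frac{1}{3}\right) = \frac{2}{3}##, so the right-hand side of the inequality would be ##\left(\frac{7}{12} - \frac{3}{4}\cdot\frac{2}{3}\right)^2 = \left(\frac{1}{12}\right)^2 = \frac{1}{144}##, which is indeed well below ##2##.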

(But I am new to it, too.)
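For anyone who wants a numerical sanity check of that prescription, here is a minimal sketch (assuming NumPy; the state and observables are the ones from the problem statement):

Code:
import numpy as np

# Bell state |psi> = (|00> + |11>)/sqrt(2) as a length-4 vector
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

O_A = np.diag([1.0, 0.5])        # observable on Alice's qubit
O_B = np.diag([1.0, 1.0 / 3.0])  # observable on Bob's qubit
I2 = np.eye(2)

exp_AB = psi @ np.kron(O_A, O_B) @ psi  # <psi| O_A (x) O_B |psi> = 7/12
exp_A = psi @ np.kron(O_A, I2) @ psi    # <psi| O_A (x) I   |psi> = 3/4
exp_B = psi @ np.kron(I2, O_B) @ psi    # <psi| I   (x) O_B |psi> = 2/3

print(exp_AB, exp_A, exp_B)
print((exp_AB - exp_A * exp_B) ** 2)    # (1/12)^2 = 1/144

This only checks the expectation values; the mutual information of ##2## still comes from the entropy calculation above.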
 

FAQ: Relation Between Mutual Information and Expectation Values

What is mutual information?

Mutual information is a measure of the amount of information that one random variable contains about another random variable. It quantifies the reduction in uncertainty about one variable given knowledge of the other and is used in various fields such as information theory, statistics, and machine learning.

How is mutual information related to expectation values?

Mutual information is related to expectation values through the concept of entropy, which itself is an expectation value. Specifically, mutual information can be expressed as the difference between the sum of the individual entropies of two variables and their joint entropy. Since entropy is an expected value of the information content, mutual information is inherently tied to expectation values.
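In symbols, entropy is the expectation of the negative log-probability, \( H(X) = E[-\log P(X)] \), and mutual information is \( I(X;Y) = H(X) + H(Y) - H(X,Y) \).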

How do you calculate mutual information using expectation values?

To calculate mutual information using expectation values, you can use the formula: \( I(X; Y) = E[\log(\frac{P(X,Y)}{P(X)P(Y)})] \), where \( P(X,Y) \) is the joint probability distribution of \( X \) and \( Y \), and \( P(X) \) and \( P(Y) \) are the marginal probability distributions. This formula leverages the expectation of the logarithm of the ratio of joint probability to the product of marginal probabilities.
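As a minimal numerical sketch of this formula (the 2×2 joint table below is made up purely for illustration, and NumPy is assumed):

Code:
import numpy as np

# Hypothetical joint distribution P(X, Y); rows index X, columns index Y.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1, keepdims=True)  # marginal P(X), as a column
p_y = p_xy.sum(axis=0, keepdims=True)  # marginal P(Y), as a row

# I(X;Y) = E[log2(P(X,Y) / (P(X) P(Y)))], expectation taken over P(X,Y)
mutual_info = np.sum(p_xy * np.log2(p_xy / (p_x * p_y)))
print(mutual_info)  # about 0.278 bits for this particular table

If the table contains zero probabilities, those terms should be skipped, since \( 0 \log 0 \) is conventionally taken to be 0.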

Why is mutual information important in the context of expectation values?

Mutual information is important because it provides a way to quantify the dependency between variables using the concept of expectation values. It allows researchers to understand and measure the amount of shared information between variables, which can be crucial in fields like machine learning for feature selection, in neuroscience for understanding brain connectivity, and in communication systems for optimizing data transmission.

Can mutual information be negative, and what does it imply about expectation values?

Mutual information cannot be negative; it is always non-negative. This follows because mutual information is the expectation of \( \log\frac{P(X,Y)}{P(X)P(Y)} \) under the joint distribution, which equals the Kullback–Leibler divergence between the joint distribution and the product of the marginals and is therefore never negative. A mutual information of zero implies that the variables are independent, meaning knowing one variable does not provide any information about the other. Positive mutual information indicates some level of dependency between the variables.
