Joint Probability/Mutual Information Normalization

In summary: the marginals P(x) and P(y) are found by integrating the joint distribution over the other variable, e.g. P(x) = ∫ p(x,y) dy. Because of the cross term −cxy the joint does not factor, so the integrals are done by completing the square; this gives the joint normalization constant α = √(ab − c²)/(2π) and Gaussian marginals, and the log of the ratio of the normalization constants reproduces the mutual information −(1/2) ln(1 − c²/ab).
  • #1
Nahtix

Homework Statement


Find the corresponding normalization constants for the joint Gaussian

[tex]p(x,y) = \alpha \exp\!\left(-\frac{ax^2}{2} - \frac{by^2}{2} - cxy\right)[/tex]

i.e., find the normalization constants of P(x), P(y), and P(x,y).


Relevant equations:
Mutual information: M(x,y)
[tex]M(x,y) = \sum p(x,y)\,\ln\!\frac{p(x,y)}{p(x)\,p(y)}[/tex]
(for continuous variables the sum becomes an integral over x and y)

The Attempt at a Solution


I've tried integrating over y and then x, but I end up with some messed-up erf that I can't get rid of. My professor told me that the real point of this is that later we can use mutual information to relate the ratio of the constants, which should equal [tex]-\tfrac{1}{2}\ln(1 - c^2/ab)[/tex], but I haven't gotten anywhere close. I really just need a hint about where to start, as I have gotten nowhere with it; any help is very appreciated.
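As a numerical sanity check of that target value, here is a rough sketch I put together (the coefficient values are arbitrary choices with ab > c², and the infinite integrals are approximated on a finite grid):

[code=python]
import numpy as np

# Arbitrary sample coefficients; normalizability requires a*b > c**2.
a, b, c = 2.0, 3.0, 1.0

# Grid wide enough that the density is negligible outside it.
x = np.linspace(-8, 8, 2001)
y = np.linspace(-8, 8, 2001)
dx, dy = x[1] - x[0], y[1] - y[0]
X, Y = np.meshgrid(x, y, indexing="ij")

k = np.exp(-(a * X**2) / 2 - (b * Y**2) / 2 - c * X * Y)  # unnormalized joint
p_xy = k / (k.sum() * dx * dy)                            # normalize numerically

p_x = p_xy.sum(axis=1) * dy   # marginal over y
p_y = p_xy.sum(axis=0) * dx   # marginal over x

# Mutual information by direct summation of p ln(p / (p_x p_y)).
mi = np.sum(p_xy * np.log(p_xy / np.outer(p_x, p_y))) * dx * dy

print("numerical MI:        ", mi)
print("-1/2 ln(1 - c^2/ab): ", -0.5 * np.log(1 - c**2 / (a * b)))
[/code]

The two printed numbers should agree closely, which at least confirms the sign and form of the target expression.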
 
  • #2




To find the normalization constant for a joint Gaussian distribution, we can use the fact that the integral of the joint distribution over all possible values of x and y must equal 1, since the total probability of any distribution is 1. Therefore, we can set up the following integral:

[tex]\int\!\!\int p(x,y)\,dx\,dy = 1[/tex]

Substituting in the given joint Gaussian:

[tex]\alpha \int\!\!\int \exp\!\left(-\frac{ax^2}{2} - \frac{by^2}{2} - cxy\right) dx\,dy = 1[/tex]

Because of the cross term cxy, the exponential does not factor into a function of x times a function of y, so the double integral cannot be split into a product of one-dimensional integrals. Instead, complete the square in y:

[tex]-\frac{by^2}{2} - cxy = -\frac{b}{2}\left(y + \frac{cx}{b}\right)^2 + \frac{c^2x^2}{2b}[/tex]

The y integral is then a shifted Gaussian, and shifting the integration variable does not change its value:

[tex]\int \exp\!\left(-\frac{b}{2}\left(y + \frac{cx}{b}\right)^2\right) dy = \sqrt{\frac{2\pi}{b}}[/tex]

What remains is a Gaussian integral in x with effective coefficient a - c^2/b:

[tex]\sqrt{\frac{2\pi}{b}} \int \exp\!\left(-\frac{1}{2}\left(a - \frac{c^2}{b}\right)x^2\right) dx = \sqrt{\frac{2\pi}{b}}\sqrt{\frac{2\pi b}{ab - c^2}} = \frac{2\pi}{\sqrt{ab - c^2}}[/tex]

Setting \alpha times this equal to 1 gives:

[tex]\alpha = \frac{\sqrt{ab - c^2}}{2\pi}[/tex]

which requires ab > c^2 for the distribution to be normalizable. The same completing-the-square step gives the marginals. Integrating the joint over y leaves:

[tex]P(x) = \sqrt{\frac{ab - c^2}{2\pi b}}\,\exp\!\left(-\frac{(ab - c^2)x^2}{2b}\right)[/tex]

and similarly for P(y) with a and b exchanged, so the marginal normalization constants are [tex]\alpha_x = \sqrt{(ab - c^2)/(2\pi b)}[/tex] and [tex]\alpha_y = \sqrt{(ab - c^2)/(2\pi a)}[/tex]. Taking the logarithm of the ratio of the constants:

[tex]\ln\frac{\alpha}{\alpha_x \alpha_y} = \ln\sqrt{\frac{ab}{ab - c^2}} = -\frac{1}{2}\ln\!\left(1 - \frac{c^2}{ab}\right)[/tex]

This is exactly the value your professor mentioned: in the mutual-information integral the expectation of the leftover quadratic terms vanishes, so M(x,y) reduces to the log of this ratio of normalization constants.
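To double-check these constants numerically, here is a minimal sketch (the sample values of a, b, c below are arbitrary choices satisfying ab > c²):

[code=python]
import numpy as np
from scipy import integrate

a, b, c = 2.0, 3.0, 1.0  # arbitrary sample values with a*b > c**2

alpha   = np.sqrt(a * b - c**2) / (2 * np.pi)        # joint constant
alpha_x = np.sqrt((a * b - c**2) / (2 * np.pi * b))  # P(x) constant
alpha_y = np.sqrt((a * b - c**2) / (2 * np.pi * a))  # P(y) constant

# With these constants each density should integrate to 1.
joint, _ = integrate.dblquad(
    lambda y, x: alpha * np.exp(-(a * x**2) / 2 - (b * y**2) / 2 - c * x * y),
    -np.inf, np.inf, -np.inf, np.inf)
marg, _ = integrate.quad(
    lambda x: alpha_x * np.exp(-(a * b - c**2) * x**2 / (2 * b)),
    -np.inf, np.inf)
print(joint, marg)  # both ~1.0

# The log-ratio of the constants reproduces the mutual information.
print(np.log(alpha / (alpha_x * alpha_y)))
print(-0.5 * np.log(1 - c**2 / (a * b)))
[/code]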
 

FAQ: Joint Probability/Mutual Information Normalization

1. What is joint probability?

Joint probability is a statistical measure that represents the likelihood of two or more events occurring together. For independent events it is the product of the individual probabilities; in general it must account for the dependence between the events, e.g. P(A and B) = P(A)·P(B | A).
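For instance, a minimal Python sketch (the events and numbers are hypothetical):

[code=python]
# Independent events: two fair coin flips both landing heads.
p_heads = 0.5
p_both_heads = p_heads * p_heads  # 0.25

# Dependent events: use P(A and B) = P(A) * P(B | A).
# E.g., drawing two aces in a row from a 52-card deck without replacement.
p_first_ace = 4 / 52
p_second_ace_given_first = 3 / 51
p_both_aces = p_first_ace * p_second_ace_given_first  # about 0.0045

print(p_both_heads, p_both_aces)
[/code]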

2. What is mutual information normalization?

Mutual information normalization is a technique used to measure the amount of information shared between two random variables on a common scale. It involves dividing the mutual information by an upper bound on its value, such as the smaller of the two marginal entropies, in order to obtain a value between 0 and 1.
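As an illustration, a minimal Python sketch (the 2x2 joint table is hypothetical, and dividing by the smaller marginal entropy is only one of several normalization conventions in use):

[code=python]
import numpy as np

# Hypothetical joint distribution P(X, Y); rows index x, columns index y.
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])

px = p.sum(axis=1)  # marginal of X
py = p.sum(axis=0)  # marginal of Y

mi = np.sum(p * np.log(p / np.outer(px, py)))  # mutual information (nats)
hx = -np.sum(px * np.log(px))                  # entropy of X
hy = -np.sum(py * np.log(py))                  # entropy of Y

# Since I(X;Y) <= min(H(X), H(Y)), this normalization lies in [0, 1].
nmi = mi / min(hx, hy)
print(mi, nmi)
[/code]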

3. How is joint probability used in data analysis?

Joint probability is commonly used in data analysis to understand the relationship between two or more variables. It can help identify patterns and correlations, and can be used to make predictions about future events.

4. What are the benefits of using mutual information normalization?

Mutual information normalization has several benefits, including being able to compare the strength of relationships between different pairs of variables, regardless of the units or scales used. It also allows for easier interpretation and comparison of results.

5. Can joint probability and mutual information normalization be applied to non-numerical data?

Yes, joint probability and mutual information normalization can be applied to both numerical and non-numerical data. However, the calculations may differ depending on the type of data being analyzed.
