Function of a random variable and conditioning

In summary, the conversation concerns the random variables Z, X1, X2, Y, and W and their distributions. The main goal is to find the conditional distribution of X1 and Y given W. There is some back-and-forth about technique, with one approach based on a change-of-variables transformation and another working through a joint distribution. The open question is which inverse and which derivative enter the calculation of the probability density of W.
  • #1
Hejdun
Ok, since nobody answered my last problem, I'll simplify it. :)

Let Z = γ1X1 + γ2X2, where the gammas are just constants
p(Z) = exp(Z)/(1 + exp(Z))
X1 and X2 are bivariate normal and put
Y = α + β1X1 + β2X2 + ε where ε ~ N(0,σ).

Now, we want to find f(p(Z)|X1,Y). In this case, is it legal to do the
operation f(p(Z)|X1,Y)=f(p(Z|X1,Y))?

That is, can we write
f(exp(Z|X1,Y)/(1 + exp(Z|X1,Y)))?

Thanks for any help!
/H
 
  • #2
Hejdun said:
Let Z = γ1X1 + γ2X2, where the gammas are just constants
p(Z) = exp(Z)/(1 + exp(Z))
X1 and X2 are bivariate normal and put
Y = α + β1X1 + β2X2 + ε where ε ~ N(0,σ).
Let [itex] X_1 [/itex] and [itex] X_2 [/itex] each be bivariate normal random variables
Let [itex] Z = \gamma_1 X_1 + \gamma_2 X_2 [/itex] where [itex] \gamma_1 [/itex] and [itex] \gamma_2 [/itex] are constants.
Let [itex] Y = \alpha + \beta_1 X_1 + \beta_2 X_2 + \epsilon [/itex] where [itex] \alpha,\beta_1,\beta_2 [/itex] are each constants and [itex] \epsilon [/itex] is a normal random variable with mean 0 and standard deviation [itex] \sigma [/itex].

Now, we want to find f(p(Z)|X1,Y).

Is that notation supposed to mean you want the probability density function for [itex] Z [/itex] given [itex] X_1 [/itex] and [itex] Y [/itex] ?

In this case, is it legal to do the
operation f(p(Z)|X1,Y)=f(p(Z|X1,Y))?
That is, can we write
f(exp(Z|X1,Y)/(1 + exp(Z|X1,Y)))?

I don't know what that notation means. The conditional density function of [itex] Z [/itex] is some function of the variables [itex] Z, X_1,Y [/itex] but what does the notation "exp(Z|X1,Y)" mean?
 
  • #3
Stephen Tashi said:
Is that notation supposed to mean you want the probability density function for [itex] Z [/itex] given [itex] X_1 [/itex] and [itex] Y [/itex] ?

I don't know what that notation means. The conditional density function of [itex] Z [/itex] is some function of the variables [itex] Z, X_1,Y [/itex] but what does the notation "exp(Z|X1,Y)" mean?

Yes.


For instance, if we want to know the distribution of p(Z) = exp(Z)/(1 + exp(Z)) and we know the distribution of Z, then we make a simple change of variables: substitute the inverse into the pdf of Z and multiply by the derivative, as usual.

However, the problem is finding the distribution of p(Z)|Y. My idea was then to substitute the inverse into the pdf of Z|Y and multiply by the derivative. I am not sure whether my approach is correct, but if you have another suggestion for how to proceed I would be grateful.

/H
 
  • #4
Let's try again:

Let [itex] X_1 [/itex] and [itex] X_2 [/itex] be random variables that have a joint bivariate normal distribution (rather than each of them being bivariate normal).
Let [itex] Z = \gamma_1 X_1 + \gamma_2 X_2 [/itex] where [itex] \gamma_1 [/itex] and [itex] \gamma_2 [/itex] are constants.
Let [itex] W = \exp(Z)/(1 + \exp(Z)) [/itex]
Let [itex] Y = \alpha + \beta_1 X_1 + \beta_2 X_2 + \epsilon [/itex] where [itex] \alpha,\beta_1,\beta_2 [/itex] are each constants and [itex] \epsilon [/itex] is a normal random variable with mean 0 and standard deviation [itex] \sigma [/itex].

Do you want the conditional distribution of [itex] W [/itex] given [itex] X_1 [/itex] and [itex] Y [/itex] ? (Your other thread mentioned a joint distribution instead of conditional distribution and also it mentioned that the final goal was to find an expected value.)
 
  • #5
Stephen Tashi said:
Let's try again:

Let [itex] X_1 [/itex] and [itex] X_2 [/itex] be random variables that have a joint bivariate normal distribution (rather than each of them being bivariate normal).
Let [itex] Z = \gamma_1 X_1 + \gamma_2 X_2 [/itex] where [itex] \gamma_1 [/itex] and [itex] \gamma_2 [/itex] are constants.
Let [itex] W = \exp(Z)/(1 + \exp(Z)) [/itex]
Let [itex] Y = \alpha + \beta_1 X_1 + \beta_2 X_2 + \epsilon [/itex] where [itex] \alpha,\beta_1,\beta_2 [/itex] are each constants and [itex] \epsilon [/itex] is a normal random variable with mean 0 and standard deviation [itex] \sigma [/itex].

Do you want the conditional distribution of [itex] W [/itex] given [itex] X_1 [/itex] and [itex] Y [/itex] ? (Your other thread mentioned a joint distribution instead of conditional distribution and also it mentioned that the final goal was to find an expected value.)

The final goal is to find the conditional distribution of X1 and Y given W. Of course, there are different ways of getting there depending on how you calculate the joint distribution of X1, Y, and W.

Answering my question in this thread may solve part of the problem, and it would also give the distribution of W given X1 and Y.
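
In case it helps, here is a minimal Monte Carlo sketch (Python) of the setup; the parameter values are placeholders assumed only for illustration, not from the problem. It approximates the conditional distribution of X1 and Y given W by keeping only the draws with W near a chosen value:

[code]
import numpy as np

rng = np.random.default_rng(0)

# Placeholder parameter values, assumed only for illustration
gamma1, gamma2 = 0.5, -1.0
alpha, beta1, beta2, sigma = 1.0, 2.0, 0.3, 0.7
mean = [0.0, 0.0]
cov = [[1.0, 0.4],
       [0.4, 1.0]]          # (X1, X2) jointly bivariate normal

n = 200_000
X1, X2 = rng.multivariate_normal(mean, cov, size=n).T

Z = gamma1 * X1 + gamma2 * X2                 # linear combination of X1 and X2
W = np.exp(Z) / (1.0 + np.exp(Z))             # logistic transform, so W lies in (0, 1)
Y = alpha + beta1 * X1 + beta2 * X2 + rng.normal(0.0, sigma, size=n)

# Rough look at the conditional distribution of (X1, Y) given W:
# keep only the draws whose W falls in a narrow window around a chosen value.
keep = np.abs(W - 0.6) < 0.01
print("kept", keep.sum(), "draws")
print("E[X1 | W near 0.6] is approximately", X1[keep].mean())
print("E[Y  | W near 0.6] is approximately", Y[keep].mean())
[/code]

Since W is a strictly increasing function of Z, conditioning on W = w is the same as conditioning on Z = log(w/(1 - w)), which may also help with the exact calculation.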
 
  • #6
Now that the problem is established, help me understand the question about technique.

Hejdun said:
For instance, if we want to know the distribution of p(Z) = exp(Z)/(1 + exp(Z)) and we know the distribution of Z, then we make a simple change of variables: substitute the inverse into the pdf of Z and multiply by the derivative, as usual.

Whose inverse and whose derivative are you talking about? Let's say [itex] Z [/itex] has cumulative distribution function [itex] F_Z(x) [/itex] with inverse function [itex] {F_Z}^{-1}(x) [/itex]. Using that notation, what is your claim about the probability density (or cumulative distribution) of [itex] W [/itex]?
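
For example (just to pin the notation down, writing [itex] f_Z [/itex] and [itex] f_W [/itex] for the densities of [itex] Z [/itex] and [itex] W [/itex], with [itex] W = \exp(Z)/(1 + \exp(Z)) [/itex] as defined above): is the claim that, because [itex] z \mapsto \exp(z)/(1+\exp(z)) [/itex] is strictly increasing with inverse [itex] z = \log\frac{w}{1-w} [/itex], we have [itex] F_W(w) = P(W \le w) = P\left( Z \le \log\frac{w}{1-w} \right) = F_Z\left( \log\frac{w}{1-w} \right) [/itex] for [itex] 0 < w < 1 [/itex], so that [itex] f_W(w) = f_Z\left( \log\frac{w}{1-w} \right) \cdot \frac{1}{w(1-w)} [/itex]?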
 

FAQ: Function of a random variable and conditioning

1. What is a random variable?

A random variable is a variable whose value is determined by the outcome of a random process. It can take on different values, and the probability of each value occurring is defined by a probability distribution.

2. What is the function of a random variable?

A function of a random variable, say W = g(Z), is itself a new random variable: each outcome of the underlying random process determines a value of Z, and then a value of W through g. Its probability distribution is determined by the distribution of Z together with the function g, which lets us model and analyze the transformed quantity mathematically.
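
A minimal illustration (a sketch in Python; the die example is assumed just for concreteness, not from the thread):

[code]
import numpy as np

rng = np.random.default_rng(1)

# X: outcome of a fair six-sided die, a random variable with values 1..6
x = rng.integers(1, 7, size=60_000)

# W = g(X) with g(x) = (x - 3)^2 is a new random variable with values {0, 1, 4, 9}
w = (x - 3) ** 2

# Its distribution follows from the distribution of X and the function g
vals, counts = np.unique(w, return_counts=True)
print(dict(zip(vals, counts / len(w))))   # roughly {0: 1/6, 1: 1/3, 4: 1/3, 9: 1/6}
[/code]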

3. How is conditioning related to a random variable?

Conditioning is the process of updating our knowledge or beliefs about a random variable based on new information. In other words, it is the process of revising the probability distribution of a random variable based on additional information.
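
A small numerical illustration of conditioning (a sketch in Python, again with a fair die assumed just as an example):

[code]
import numpy as np

rng = np.random.default_rng(2)

# X: fair die roll, uniform on {1, ..., 6}
x = rng.integers(1, 7, size=60_000)

# New information: the roll is even. Conditioning keeps only the outcomes
# consistent with this information and renormalises their probabilities,
# so the distribution of X updates from uniform on {1..6} to uniform on {2, 4, 6}.
even = (x % 2 == 0)
vals, counts = np.unique(x[even], return_counts=True)
print(dict(zip(vals, counts / even.sum())))   # roughly {2: 1/3, 4: 1/3, 6: 1/3}
[/code]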

4. What is the role of a probability distribution in understanding a random variable?

A probability distribution is a mathematical function that assigns probabilities to each possible value of a random variable. It helps us understand the likelihood of different outcomes occurring and allows us to make predictions about the behavior of the random variable.

5. How does the function of a random variable affect its properties?

Applying a function to a random variable can change its properties in various ways. For example, the function determines the range of values the new variable can take, the shape of its probability distribution, and its expected value and variance. The function can also determine whether the resulting variable is discrete or continuous, which affects the methods used for analysis and modeling.
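
To make that concrete (a sketch in Python, using the logistic transform from the thread with a normal Z assumed just for illustration):

[code]
import numpy as np

rng = np.random.default_rng(3)

z = rng.normal(1.0, 1.0, size=200_000)     # Z ~ N(1, 1): continuous, unbounded
w = np.exp(z) / (1.0 + np.exp(z))          # W = g(Z): still continuous, but confined to (0, 1)

# The transformation changes the range, the shape of the distribution,
# and the expected value and variance.
print("Z: mean", z.mean(), " variance", z.var(), " range", (z.min(), z.max()))
print("W: mean", w.mean(), " variance", w.var(), " range", (w.min(), w.max()))
[/code]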
