Are W and Z equal as random variables and do they have equal expected values?

In summary, two random variables are equal when they take the same value on (almost) every outcome of the sample space. This is stronger than merely having the same probability distribution: equal random variables necessarily share the same distribution, expected value, and variance, but identically distributed variables need not be equal. Equality is also different from independence, which means there is no relationship between two variables; a non-constant random variable is never independent of a variable equal to it. Two random variables with different distributions cannot be equal. Equality of random variables is transitive: if variable A is equal to variable B, and variable B is equal to variable C, then variable A is also equal to variable C. Statistical tests such as the Kolmogorov-Smirnov test and the chi-square test can check whether two samples appear to come from the same distribution, though not whether the underlying variables are equal outcome by outcome.
  • #1
kingwinner
Suppose the random variable Y has non-zero probability at 0, 1, 2, 3, ... (i.e. the support of Y is the set of non-negative integers).

Define a random variable W:
W = 0,      if Y = 0, 1, 2, or 3
W = Y - 3,  if Y = 4, 5, ...

Define a random variable Z:
Z = max{0, Y-3} = 0,      if Y ≤ 3
Z = max{0, Y-3} = Y - 3,  if Y > 3

And I have 2 questions...

1) Can I say that W and Z are equal as random variables (i.e. W = Z)?
(What bothers me is that W is undefined at e.g. Y = 0.5 or Y = 2.2, while Z is defined everywhere. My notes say that W and Z are equal random variables, but I struggle to understand why.)

2) Is it true that E(W) = E(Z)?

Hopefully someone can clarify this! Thank you!
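As a quick sanity check (my own sketch, not part of the thread), a minimal Monte Carlo simulation can probe both questions. The distribution chosen for Y is hypothetical — a geometric-style distribution on 0, 1, 2, ... — since the post only requires Y to be supported on the non-negative integers:

```python
import random

random.seed(0)

# Hypothetical choice of distribution for Y: P(Y = k) = 0.3 * 0.7**k.
# Any distribution supported on the non-negative integers would do.
def sample_y():
    y = 0
    while random.random() < 0.7:
        y += 1
    return y

def w(y):
    # W as defined in the post: 0 for y in {0, 1, 2, 3}, y - 3 otherwise
    return 0 if y <= 3 else y - 3

def z(y):
    # Z = max(0, y - 3)
    return max(0, y - 3)

samples = [sample_y() for _ in range(100_000)]
# W and Z agree on every integer outcome Y can actually take
assert all(w(y) == z(y) for y in samples)

mean_w = sum(w(y) for y in samples) / len(samples)
mean_z = sum(z(y) for y in samples) / len(samples)
print(mean_w == mean_z)  # True: identical values, so identical empirical means
```

Because W and Z are the same function of Y at every point of Y's support, the empirical means coincide exactly, which is the intuition behind question 2.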
 
  • #2
I'm having a little trouble following your specification. For example, Y can't have a uniform distribution if it's defined over all the non-negative integers. Do you know why? And if you had a binomial distribution, how would you define the mean (np) when n is infinite?

In general, probability distributions are defined by their mass functions (discrete) or density functions (continuous). Any given mass function or density function is defined by its parameters. For example, a Gaussian distribution is completely defined by the values of its first two moments: the mean and the variance. Specifying additional moments defines variations from the standard Gaussian (skewness, kurtosis). If random variables have the same distribution over the same range of values, we say they are identically distributed. Because they are random variables, we don't say they are "equal": by definition, the values they take cannot be predicted precisely.
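To illustrate the distinction being drawn here, a small sketch (my own, not from the thread): two independent rolls of a fair die are identically distributed, yet they take the same value only about one time in six, so "identically distributed" clearly does not mean "equal":

```python
import random

random.seed(1)

# Two independent rolls of a fair die: identically distributed,
# but they disagree on most outcomes.
a = [random.randint(1, 6) for _ in range(60_000)]
b = [random.randint(1, 6) for _ in range(60_000)]

agree = sum(x == y for x, y in zip(a, b)) / len(a)
print(round(agree, 2))  # near 1/6, far from 1
```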
 
  • #3
hmm...I don't think I said it's uniformly distributed.

Both W and Z are functions of the same Y.
W=g(Y)
Z=h(Y)

And I'm asking if we can say that W=Z.
(What bothers me is that W is undefined at e.g. Y = 0.5 or Y = 2.2, while Z is defined everywhere. My notes say that W and Z are equal random variables, but I struggle to understand why.)

Thanks!
 
  • #4
kingwinner said:
hmm...I don't think I said it's uniformly distributed.

Both W and Z are functions of the same Y.
W=g(Y)
Z=h(Y)

And I'm asking if we can say that W=Z.
(What bothers me is that W is undefined at e.g. Y = 0.5 or Y = 2.2, while Z is defined everywhere. My notes say that W and Z are equal random variables, but I struggle to understand why.)

Thanks!

I don't know the source of your notes, but random variables by definition take unpredictable values according to a probability distribution. If two random variables have the same distribution, we say they are identically distributed. On what basis would they be equal?

You show W and Z as two different functions of Y. On what basis do you say W=Z even if they were not random variables?

You say Y is a random variable over the (infinite) set of the non-negative integers. This implies a discrete distribution, and it doesn't matter whether the distribution is uniform or not. I also pointed out the problem of defining a binomial mean (np) when n would have to be infinite.

Note that if you defined Y in terms of a uniform, binomial or Poisson distribution for some finite n or k, this last point would not be a problem. This would define a finite subset of the set of non-negative integers.
 
  • #5
kingwinner said:
hmm...I don't think I said it's uniformly distributed.

Both W and Z are functions of the same Y.
W=g(Y)
Z=h(Y)

And I'm asking if we can say that W=Z.
(What bothers me is that W is undefined at e.g. Y = 0.5 or Y = 2.2, while Z is defined everywhere. My notes say that W and Z are equal random variables, but I struggle to understand why.)

Thanks!

Actually, there isn't enough information to say that W = Z; all we have is that P[W = Z] = 1, i.e. W is almost surely equal to Z.

For a counterexample, simply augment the sample space Ω with a point ω such that P[{ω}] = 0 and set Y(ω) and W(Y(ω)) to whatever values you like.
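This construction can be sketched concretely on a toy finite sample space (the outcome labels and values below are made up for illustration): W and Z are different functions on Ω, yet they agree except on an outcome of probability zero, so P[W = Z] = 1:

```python
# Toy sample space with one extra outcome "w0" of probability zero,
# where W and Z are forced to disagree.
omega = ["a", "b", "w0"]
prob = {"a": 0.5, "b": 0.5, "w0": 0.0}  # P({w0}) = 0

Z = {"a": 0, "b": 2, "w0": 2}
W = {"a": 0, "b": 2, "w0": 99}  # W(w0) set arbitrarily, as in the post

p_equal = sum(prob[o] for o in omega if W[o] == Z[o])
print(p_equal)  # 1.0: W = Z almost surely, yet W != Z as functions on omega
```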
 
  • #6
bpet said:
Actually, there isn't enough information to say that W = Z; all we have is that P[W = Z] = 1, i.e. W is almost surely equal to Z.

For a counterexample, simply augment the sample space Ω with a point ω such that P[{ω}] = 0 and set Y(ω) and W(Y(ω)) to whatever values you like.

OK. For continuous random variables we would be talking about P(W = Z) = 0. However, the fact that a probability is 0 does not mean the event is impossible: for two independent variables uniform on the real interval [0, 1], P(W = Z) is exactly 0 even though coinciding values are not impossible. For discrete distributions, the value of P(W = Z) depends on how the distribution of Y is defined (e.g. on n in the uniform and/or binomial case), which is why I made such a point of it. If you define both W and Z to be 0 for certain values of Y (as the OP seems to have done, although I have problems with how the OP defined this), then of course W = Z whenever Y takes those values.
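The continuous case mentioned here is easy to see numerically; in this sketch (my own) two independent Uniform(0, 1) samples essentially never produce an exact tie:

```python
import random

random.seed(2)

# Two independent Uniform(0, 1) draws: the event {X = Y} has probability 0,
# so exact ties essentially never occur in simulation.
x = [random.random() for _ in range(100_000)]
y = [random.random() for _ in range(100_000)]
ties = sum(a == b for a, b in zip(x, y))
print(ties)  # almost surely 0
```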
 

FAQ: Are W and Z equal as random variables and do they have equal expected values?

1. What is the definition of equality of random variables?

Equality of random variables means that the two variables take the same value on (almost) every outcome of the underlying sample space. This is stronger than having the same distribution: equal random variables necessarily share the same probability distribution, expected value, and variance, but identically distributed variables need not be equal.

2. How is equality of random variables different from independence?

Equality and independence are very different conditions. Independence means that knowing the value of one variable tells you nothing about the other, while equality means the two variables always take the same value. In fact, a non-constant random variable can never be independent of a variable equal to it, since each determines the other completely.

3. Can two random variables with different distributions be considered equal?

No. Equality of random variables implies that they have the same probability distribution, i.e. the same likelihood of taking on each value, so two random variables with different distributions cannot be equal. (The converse does not hold: having the same distribution does not by itself make two variables equal.)

4. Is equality of random variables a transitive property?

Yes, equality of random variables is transitive: if variable A is equal to variable B, and variable B is equal to variable C (each with probability 1), then variable A is equal to variable C with probability 1. This follows because equality of values is transitive, and the union of the two probability-zero exception sets still has probability zero.

5. How can we test for equality of random variables?

Statistical tests such as the Kolmogorov-Smirnov test and the chi-square test can be used to check whether two samples appear to come from the same distribution; they compare the observed data against the hypothesis that the two distributions agree and report a p-value. Note that these tests address identical distribution, not equality: no test on observed values alone can establish that two random variables are equal outcome by outcome. (A t-test is narrower still, comparing only the means.)
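As a hedged illustration (assuming SciPy is available), `scipy.stats.ks_2samp` compares the empirical distributions of two samples. It tests identical distribution only — here the two samples come from the same distribution but are clearly not equal value by value:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(3)

# Two independent samples drawn from the SAME distribution: the KS test
# compares their empirical distribution functions, nothing more.
x = rng.normal(size=500)
y = rng.normal(size=500)

stat, pvalue = ks_2samp(x, y)
# A large p-value means no evidence the distributions differ -- yet x and y
# are certainly not equal value-by-value.
print(f"KS statistic = {stat:.3f}, p-value = {pvalue:.3f}")
```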
