Property of independent random variables

In summary, for independent discrete random variables $X$ and $Y$ with $Z := X + Y$, the probability mass function of $Z$ is $f_Z(z) = \text{Pr}[Z=z] = \sum_{x \in W_X} f_X(x)\, f_Y(z-x)$, where $W_X$ is the set of possible values of $X$. The index of summation runs over those values $x$ for which both $X = x$ and $Y = z - x$ are possible.
  • #1
alfred2
hello!

I'm trying to understand the following property:
Let $X$ and $Y$ be independent random variables and define $Z := X + Y$. Then
\(\displaystyle f_Z(z) = \sum_{x \in W_X} f_X(x)\, f_Y(z-x)\)
where $f_Z(z)$ is the probability mass function of the discrete random variable $Z$, defined as follows:

\(\displaystyle f_Z(z) = \text{Pr}[Z = z]\)

and $W_X$ is the set of possible values of the random variable $X$.

Proof: Using the law of total probability, which states:
\(\displaystyle \text{Pr}[A] = \sum_{i} \text{Pr}[A \mid B_i]\,\text{Pr}[B_i]\), where the events $B_i$ form a partition of the sample space,
we obtain
\(\displaystyle f_Z(z) = \text{Pr}[Z=z] = \sum_{x \in W_X} \text{Pr}[Z=z \mid X=x]\,\text{Pr}[X=x] = \sum_{x \in W_X} \text{Pr}[Y=z-x \mid X=x]\,\text{Pr}[X=x] = \sum_{x \in W_X} \text{Pr}[Y=z-x]\,\text{Pr}[X=x] = \sum_{x \in W_X} f_X(x)\, f_Y(z-x),\)
where the next-to-last equality uses the independence of $X$ and $Y$.
Question: I don't see how the law of total probability helps us here; I don't even understand how we get the first line of the proof. (End of question.)

Then my book proposes this example: we roll two dice. Let $X$ and $Y$ be the random variables giving the number of points shown by the first and the second die, respectively. We calculate the density of $Z := X + Y$:
\(\displaystyle f_Z(z) = \sum_{x \in W_X} f_X(x)\, f_Y(z-x) = \sum_{x=\max\{1,z-6\}}^{\min\{6,z-1\}} \frac{1}{6}\cdot\frac{1}{6} = \sum_{x=\max\{1,z-6\}}^{\min\{6,z-1\}} \frac{1}{36}\)
For $2 \leq z \leq 7$ we obtain
\(\displaystyle f_Z(z) = \sum_{x=1}^{z-1} \frac{1}{36} = \frac{z-1}{36}\)
And for $7 < z \leq 12$:
\(\displaystyle f_Z(z) = \sum_{x=z-6}^{6} \frac{1}{36} = \frac{13-z}{36}\)
Here I only get as far as:
\(\displaystyle f_Z(z) = \sum_{x \in W_X} f_X(x)\, f_Y(z-x) = \frac{1}{6}\sum_{x=1}^{6} \text{Pr}[Y = z-x]\)
But I do not understand why the index of summation has those limits, nor why min and max appear in the next step.
Could you help me please? Thank you!
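A quick way to check the two closed forms above is to enumerate all 36 equally likely outcomes of two fair dice; here is a minimal Python sketch (purely illustrative, not from the book):

Code:
# Sketch: tabulate the pmf of Z = X + Y for two fair dice by enumeration
# and compare it with the closed forms (z-1)/36 and (13-z)/36.
from fractions import Fraction
from itertools import product

pmf = {}
for x, y in product(range(1, 7), repeat=2):      # all 36 equally likely outcomes
    pmf[x + y] = pmf.get(x + y, Fraction(0)) + Fraction(1, 36)

for z in range(2, 13):
    closed = Fraction(z - 1, 36) if z <= 7 else Fraction(13 - z, 36)
    assert pmf[z] == closed                      # both formulas match the enumeration
    print(z, pmf[z])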
 
  • #2
alfred said:
Then my book proposes this example: we roll two dice. Let $X$ and $Y$ be the random variables giving the number of points shown by the first and the second die, respectively. We calculate the density of $Z := X + Y$:
\(\displaystyle f_Z(z) = \sum_{x \in W_X} f_X(x)\, f_Y(z-x) = \sum_{x=\max\{1,z-6\}}^{\min\{6,z-1\}} \frac{1}{6}\cdot\frac{1}{6} = \sum_{x=\max\{1,z-6\}}^{\min\{6,z-1\}} \frac{1}{36}\)
For $2 \leq z \leq 7$ we obtain
\(\displaystyle f_Z(z) = \sum_{x=1}^{z-1} \frac{1}{36} = \frac{z-1}{36}\)
And for $7 < z \leq 12$:
\(\displaystyle f_Z(z) = \sum_{x=z-6}^{6} \frac{1}{36} = \frac{13-z}{36}\)
Here I only get as far as:
\(\displaystyle f_Z(z) = \sum_{x \in W_X} f_X(x)\, f_Y(z-x) = \frac{1}{6}\sum_{x=1}^{6} \text{Pr}[Y = z-x]\)
But I do not understand why the index of summation has those limits, nor why min and max appear in the next step.
Could you help me please? Thank you!
The number of spots on the first die is $x$, and on the second die is $y$. You want to know when the sum $x+y$ (the total number of spots on the two dice) is equal to $z$.

If the sum of the spots on the two dice is to be $z$, given that $X=x$, then obviously $x$ must be at least $1$ (because that is the smallest possible value for $x$). But also $x$ must be at least $z-6$ (because $y = z-x$, and $y$ cannot be larger than $6$). Thus we must have $x\geqslant \max\{1,z-6\}$. Next, $x$ cannot be bigger than $6$ (obviously), but also $x$ must not be bigger than $z-1$ (because $y = z-x$, and $y$ cannot be less than $1$). Thus we must have $x\leqslant \min\{6,z-1\}$. Provided both those conditions hold, there will then be a probability of $1/6$ that $y$ will be equal to $z-x$. That is where the expression \(\displaystyle \sum_{x=\max\{1,z-6\}}^{\min\{6,z-1\}}\frac1{36}\) comes from.

I think that if you follow this example carefully, you will begin to see how the proof in the first part of your post works.
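Written out as inequalities, the limits simply combine the two requirements that $1 \leqslant x \leqslant 6$ and $1 \leqslant z-x \leqslant 6$:

\(\displaystyle 1 \leqslant z-x \leqslant 6 \;\Longleftrightarrow\; z-6 \leqslant x \leqslant z-1,\)

so together with $1 \leqslant x \leqslant 6$ this is exactly $\max\{1,\,z-6\} \leqslant x \leqslant \min\{6,\,z-1\}$.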
 
  • #4
Okay, so we have got as far as \(\displaystyle \text{Pr}[Z=z] = \frac16 \sum_{x=\max\{1,z-6\}}^{\min\{6,z-1\}}\text{Pr}[Y= z-x]\). In that formula, $z$ is fixed. Once we are inside the summation, $x$ is also fixed, because at that stage we are dealing with what happens for a particular value of $x$. So $\text{Pr}[Y= z-x]$ is the probability that $Y$ takes the fixed value $z-x$. And of course the probability that $Y$ takes any given single value (in the range 1,...,6) is 1/6.

That gives the formula \(\displaystyle \text{Pr}[Z=z] = \sum_{x=\max\{1,z-6\}}^{\min\{6,z-1\}}\frac1{36} \). Thus each term in the sum is equal to 1/36, and to evaluate the sum we have to multiply 1/36 by the number of terms. If $z$ lies between 2 and 7, then $\max\{1,z-6\} = 1$ and $\min\{6,z-1\} = z-1$. So the sum goes from $x=1$ to $x=z-1$. Hence there are $z-1$ terms in the sum, and since each term is equal to 1/36, the sum is \(\displaystyle \frac{z-1}{36}\). In a similar way, you should be able to work out that if $z$ lies between 7 and 12 then the number of terms in the sum is $13-z$ and so their sum is \(\displaystyle \frac{13-z}{36}\).
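For instance, taking $z=10$ as a concrete check of the second case:

\(\displaystyle \text{Pr}[Z=10] = \sum_{x=\max\{1,10-6\}}^{\min\{6,10-1\}}\frac1{36} = \sum_{x=4}^{6}\frac1{36} = \frac{3}{36} = \frac{13-10}{36}.\)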
 
  • #5

Thank you very much! I understand it with your explanation. =D
 

FAQ: Property of independent random variables

What does it mean for two random variables to be independent?

When two random variables are independent, the outcome of one variable does not affect the outcome of the other. In other words, knowing the value taken by one variable does not change the probability distribution of the other.

How do you determine if two random variables are independent?

To determine whether two random variables are independent, use the definition of independence: the joint probability must equal the product of the individual probabilities for every pair of values. If this holds, the variables are independent. Checking correlation can also help, but only in one direction: if the variables are correlated they are certainly dependent, whereas zero correlation by itself does not prove independence.
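For random variables with finitely many values, that factorization condition can be checked directly; here is a minimal Python sketch (the function name and example tables are only illustrative):

Code:
# Sketch: decide independence of two discrete random variables from their
# joint pmf by checking Pr[X=x, Y=y] == Pr[X=x] * Pr[Y=y] for all x, y.
from fractions import Fraction

def is_independent(joint):
    """joint maps pairs (x, y) to probabilities; missing pairs have probability 0."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, Fraction(0)) + p      # marginal pmf of X
        py[y] = py.get(y, Fraction(0)) + p      # marginal pmf of Y
    return all(joint.get((x, y), Fraction(0)) == px[x] * py[y]
               for x in px for y in py)

# Two fair dice rolled independently: the joint pmf factorizes.
two_dice = {(x, y): Fraction(1, 36) for x in range(1, 7) for y in range(1, 7)}
print(is_independent(two_dice))    # True

# One die read twice (Y = X): clearly dependent, and the check says so.
same_die = {(x, x): Fraction(1, 6) for x in range(1, 7)}
print(is_independent(same_die))    # False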

Can two dependent random variables be considered independent in certain cases?

No. By definition, dependent random variables influence each other, so they cannot be considered independent in any case.

How does independence impact statistical analyses?

When working with independent random variables, statistical analyses are often simpler, because the variables can be treated separately. Independence also licenses useful identities, such as factoring the probability of a joint event into a product, factoring the expectation of a product, and adding the variances of a sum, which make many calculations easier.
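Concretely, for independent random variables $X$ and $Y$ (with the relevant moments finite), the identities referred to above read:

\(\displaystyle \text{Pr}[X=x,\,Y=y] = \text{Pr}[X=x]\,\text{Pr}[Y=y], \qquad E[XY] = E[X]\,E[Y], \qquad \text{Var}(X+Y) = \text{Var}(X) + \text{Var}(Y).\)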

Is independence the same as uncorrelated?

No, independence and uncorrelatedness are not the same. As mentioned before, independence means the variables have no influence on each other at all, whereas being uncorrelated only means there is no linear relationship between them. Independent variables are always uncorrelated (whenever their correlation is defined), but uncorrelated variables can still be dependent.
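A standard example of the last point: let $X$ be uniform on $\{-1,0,1\}$ and set $Y = X^2$. Then

\(\displaystyle E[XY] = E[X^3] = 0 = E[X]\,E[Y],\)

so $X$ and $Y$ are uncorrelated; yet $Y$ is a function of $X$, and indeed \(\displaystyle \text{Pr}[X=0,\,Y=1] = 0 \neq \tfrac13\cdot\tfrac23 = \text{Pr}[X=0]\,\text{Pr}[Y=1],\) so they are not independent.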
