Sum of two independent uniform random variables

In summary, the case 1 < z < 2 must be treated separately because the region where the integrand is nonzero changes with z: for 0 < z < 1 the conditions 0 < y < 1 and y < z combine to give 0 < y < z, while for 1 < z < 2 the conditions 0 < y < 1 and y > z - 1 combine to give z - 1 < y < 1.
  • #1
Pixel08
Hi,

http://www.dartmouth.edu/~chance/teaching_aids/books_articles/probability_book/Chapter7.pdf (see page 8, sum of two independent random variables).

I don't understand why they had to go further into the limits, 1 < z < 2. Why do they have to do that? And also, where did they get it from?

Can someone explain why? I've been looking at it for several hours now! :(
 
  • #2
Hi Pixel08! :smile:
Pixel08 said:
I don't understand why they had to go further into the limits, 1 < z < 2. Why do they have to do that?

In ∫ f(y) f(z-y) dy, the integrand is 0 unless 0 < y < 1 and 0 < z - y < 1

The second condition is the same as y < z and y > z - 1

If z < 1, we can ignore the last condition (because z - 1 < 0), so that's 0 < y < 1 and y < z, ie 0 < y < z

If z > 1, we can ignore y < z (because y < 1 anyway), so that's 0 < y < 1 and y > z - 1, ie z - 1 < y < 1 :wink:
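The case analysis above can be checked numerically. Here is a minimal Python sketch (the function name `density` and the Monte Carlo setup are illustrative, not from the thread) that encodes the resulting piecewise density and compares it against simulated sums:

```python
import random

# Density of Z = X + Y for independent X, Y ~ Uniform(0, 1),
# read off from the convolution limits above:
#   0 < z < 1: integrate 1 dy over 0 < y < z      ->  f(z) = z
#   1 < z < 2: integrate 1 dy over z - 1 < y < 1  ->  f(z) = 2 - z
def density(z):
    if 0 < z < 1:
        return z
    if 1 <= z < 2:
        return 2 - z
    return 0.0

# Monte Carlo check: the fraction of simulated sums landing in a small
# window [z0, z0 + width) should be close to the integral of f there.
random.seed(0)
n, width, z0 = 200_000, 0.05, 0.5
samples = [random.random() + random.random() for _ in range(n)]
frac = sum(z0 <= s < z0 + width for s in samples) / n
# Exact probability of the window: ((z0 + width)**2 - z0**2) / 2 = 0.02625
print(frac)
```

The simulated fraction should agree with the exact window probability to within ordinary Monte Carlo noise.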
 

FAQ: Sum of two independent uniform random variables

What is the formula for calculating the sum of two independent uniform random variables?

If X and Y are independent and uniform on [0, 1], their sum Z = X + Y has the triangular density f(z) = z for 0 ≤ z ≤ 1 and f(z) = 2 - z for 1 ≤ z ≤ 2, obtained from the convolution f(z) = ∫ f(y) f(z - y) dy. Note that the sum of two uniform random variables is not itself uniform.

What is the probability distribution of the sum of two independent uniform random variables?

The sum of two independent Uniform(a, b) random variables has a triangular distribution on the interval [2a, 2b], with its peak at a + b, the midpoint of that range.

What is the mean of the sum of two independent uniform random variables?

The mean of the sum of two independent uniform random variables is the sum of the means of the two individual variables: E(X + Y) = E(X) + E(Y). (This identity holds even when X and Y are not independent.)

What is the variance of the sum of two independent uniform random variables?

Because the variables are independent, the variance of the sum is the sum of the variances: Var(X + Y) = Var(X) + Var(Y). For Uniform(a, b) variables, each has variance (b - a)²/12, so the sum has variance (b - a)²/6.
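Both rules can be verified by simulation. This is a quick sketch (variable names are illustrative) for the Uniform(0, 1) case, where the expected values are E(X + Y) = 1 and Var(X + Y) = 1/12 + 1/12 = 1/6:

```python
import random

# Simulate many sums of two independent Uniform(0, 1) variables and
# compare the sample mean and variance to the theoretical values
# E(X + Y) = 1 and Var(X + Y) = 1/6.
random.seed(1)
n = 500_000
samples = [random.random() + random.random() for _ in range(n)]
mean = sum(samples) / n
var = sum((s - mean) ** 2 for s in samples) / n
print(mean, var)
```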

How can the central limit theorem be applied to the sum of two independent uniform random variables?

According to the central limit theorem, as the number of independent random variables increases, their sum (suitably standardized by its mean and standard deviation) approaches a normal distribution. Therefore, the sum of a sufficiently large number of independent uniform random variables is approximately normal.
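A standard illustration of this (the choice of 12 summands is just a convenient one, since it makes the standard deviation exactly 1) is the Irwin-Hall construction, sketched here:

```python
import random

# The sum of n independent Uniform(0, 1) variables has mean n/2 and
# variance n/12. With n = 12, sigma = 1, and by the CLT the sum is
# approximately normal; check the fraction of samples within one
# standard deviation of the mean against the normal value ~0.683.
random.seed(2)
n_vars, n_samples = 12, 100_000
mu = n_vars / 2
sigma = (n_vars / 12) ** 0.5  # equals 1 when n_vars = 12
within_1sd = 0
for _ in range(n_samples):
    s = sum(random.random() for _ in range(n_vars))
    if abs(s - mu) <= sigma:
        within_1sd += 1
print(within_1sd / n_samples)  # close to 0.683 for a normal distribution
```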
