Probability Review - Expectations

In summary: the student is reviewing basic probability. A and B are independent random variables, each uniformly distributed on [0,1]. Find E(min(A,B)).
  • #1
spitz

Homework Statement

I'm trying to review basic probability; haven't looked at it in a couple of years. Am I on the right track here?

A and B are independent random variables, uniform distribution on [0,1]. Find: E(min(A,B))

The attempt at a solution

[tex]\displaystyle\int_{0}^{1}\int_{0}^{a}b\,db\,da + \displaystyle\int_{0}^{1} \int_{a}^{1}a\,db\,da[/tex]

[tex]=\displaystyle\int_{0}^{1}\frac{a^2}{2}\,da+\int_{0}^{1}a-a^2\,da[/tex]

[tex]=1/6+3/6-2/6=1/3[/tex]
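As a sanity check, the analytic value 1/3 can be confirmed with a quick Monte Carlo simulation (an illustrative Python sketch, not required for the homework):

```python
import random

random.seed(0)

# Draw many independent pairs (A, B), each Uniform(0, 1),
# and average min(A, B) to estimate its expectation.
n = 1_000_000
estimate = sum(min(random.random(), random.random()) for _ in range(n)) / n
print(estimate)  # close to 1/3
```

With a million samples the estimate should agree with 1/3 to a few decimal places.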
 
  • #2
Looks good to me.
 
  • #3
Thanks. I also need to find [tex]E(|A-B|)[/tex] and [tex]E((A+B)^2)[/tex]

For the second one: [tex]E((A+B)^2)=E(A)+2E(A)E(B)+E(B)[/tex] and so on ...

Can somebody give me a hint for: [tex]E(|A-B|)[/tex]
 
  • #4
You'll need to break the integral up into two regions again, A > B and A < B. In the first region, |A-B| = A-B, and in the second, |A-B| = B-A.
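Once you have a value, you can check it numerically: abs() applies A-B on the region A > B and B-A on the region A < B, so a Monte Carlo average of |A-B| (a Python sketch) estimates the same expectation as the two-region integral:

```python
import random

random.seed(1)

# abs(a - b) equals a - b where a > b and b - a where a < b,
# matching the two integration regions.
n = 1_000_000
estimate = sum(abs(random.random() - random.random()) for _ in range(n)) / n
print(estimate)
```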
 
  • #5
spitz said:
Thanks. I also need to find [tex]E(|A-B|)[/tex] and [tex]E((A+B)^2)[/tex]

For the second one: [tex]E((A+B)^2)=E(A)+2E(A)E(B)+E(B)[/tex] and so on ...

Can somebody give me a hint for: [tex]E(|A-B|)[/tex]

The claim [tex]E((A+B)^2)=E(A)+2E(A)E(B)+E(B)[/tex] is false. For general bivariate (A,B) the correct result is [tex] E (A+B)^2 = E(A^2) + E(B^2) + 2E(AB).[/tex] If A and B happen to be independent (or, at least, uncorrelated) then we have [itex] E(AB) = E(A) \cdot E(B), [/itex] but for general (A,B) this fails. More generally, if A has variance [itex]\sigma_A^2[/itex], B has variance [itex] \sigma_B^2[/itex] and the pair (A,B) has covariance [itex] \sigma_{AB},[/itex] then
[tex] E(A+B)^2 = \mbox{Var}(A+B) + (EA + EB)^2 = \sigma_A^2 + \sigma_B^2 + 2 \sigma_{AB} + (EA + EB)^2. [/tex]

RGV
 
  • #6
Oh yes, I forgot to square [itex]A[/itex] and [itex]B[/itex]. For this problem they are independent (I should have mentioned that). So:
[tex]E((A+B)^2)=E(A^2)+E(B^2)+2E(A)E(B)[/tex]
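Plugging in the uniform moments E(A²) = E(B²) = 1/3 and E(A) = E(B) = 1/2 gives E((A+B)²) = 1/3 + 1/3 + 2·(1/2)·(1/2) = 7/6, which a quick simulation (a Python sketch) confirms:

```python
import random

random.seed(2)

# Moments of Uniform(0, 1): E(A^2) = 1/3, E(A) = 1/2.
predicted = 1/3 + 1/3 + 2 * (1/2) * (1/2)  # = 7/6

# Monte Carlo estimate of E((A + B)^2) for independent uniforms.
n = 1_000_000
estimate = sum((random.random() + random.random()) ** 2 for _ in range(n)) / n
print(predicted, estimate)
```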
 

FAQ: Probability Review - Expectations

What is the definition of expectation in probability?

The expectation in probability is the weighted average of all possible outcomes of a random variable, with each outcome being multiplied by its respective probability. It represents the predicted value or long-term average of a random variable.

How is expectation calculated in probability?

To calculate the expectation in probability, you multiply each possible outcome of a random variable by its respective probability, and then add all of these values together. This can also be represented mathematically as E(X) = ∑xP(x), where x represents each possible outcome and P(x) represents the probability of that outcome.
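For instance, applying E(X) = ∑xP(x) to a hypothetical fair six-sided die:

```python
from fractions import Fraction

# Fair six-sided die: each face x in {1, ..., 6} has P(x) = 1/6.
p = Fraction(1, 6)
expectation = sum(x * p for x in range(1, 7))
print(expectation)  # 7/2, i.e. 3.5
```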

What is the relationship between expectation and variance?

The expectation is a measure of central tendency in a probability distribution, while the variance is a measure of dispersion. The expectation represents the average value of a random variable, and the variance measures how spread out the values of the random variable are around the expectation: Var(X) = E(X²) − (E(X))².

How does the Law of Large Numbers relate to expectation?

The Law of Large Numbers states that as the number of trials or observations increases, the average of those values will approach the expected value. This means that in the long run, the observed outcomes will closely match the expected value. In other words, the larger the sample size, the more reliable and accurate the expectation will be.
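For example (a Python sketch): averaging Uniform(0,1) draws, whose expected value is 1/2, the running mean settles toward 1/2 as the sample size grows:

```python
import random

random.seed(3)

# Sample means of Uniform(0, 1) draws approach the expectation 1/2
# as the number of samples increases.
for n in (100, 10_000, 1_000_000):
    mean = sum(random.random() for _ in range(n)) / n
    print(n, mean)
```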

Can expectation be negative in probability?

Yes, the expectation can be negative. This happens when the probability-weighted sum of the outcomes is negative, i.e. when the negative outcomes carry enough probability mass to outweigh the positive ones. The expectation is still a useful measure of central tendency in this case, as it represents the average value of the random variable.
