Expectation of a sum of random variables

In summary, the linearity of expectation states that the expected value of a sum of random variables equals the sum of their expected values. Mathematically, if \(X_1, X_2, \ldots, X_n\) are random variables, then \(E(X_1 + X_2 + \ldots + X_n) = E(X_1) + E(X_2) + \ldots + E(X_n)\). This property holds whether or not the random variables are independent, which makes it a fundamental result in probability theory and statistics.
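For discrete random variables, linearity can be checked directly from the definition of expectation (a standard derivation included here as a sketch; it is not part of the thread below):
$$\begin{align*}
E[X+Y]&=\sum_{x}\sum_{y}(x+y)\,p_{X,Y}(x,y)\\
&=\sum_{x}x\sum_{y}p_{X,Y}(x,y)+\sum_{y}y\sum_{x}p_{X,Y}(x,y)\\
&=\sum_{x}x\,p_X(x)+\sum_{y}y\,p_Y(y)\\
&=E[X]+E[Y],
\end{align*}$$
with no independence assumption used at any step.
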
  • #1
docnet
Homework Statement
Given ##E[A]=0## and ##E[B]=b##, can ##E[(A+B)^2]## be simplified?
Relevant Equations
##E[A]## is the expectation of ##A##.
$$\begin{align*}
E[(A+B)^2]&=E[A^2+2AB+B^2]\\
&=E[A^2]+2E[AB]+E[B^2]\\
&=2E[AB]+E[B^2].
\end{align*}$$

Can the terms ##2E[AB]## and ##E[B^2]## be simplified any more? Thanks, friends.
 
  • #2
How did you estimate ##E[A^2]##?
##E[AB]=E[A]E[B]##
 
  • #3
anuttarasammyak said:
How did you estimate ##E[A^2]##?
##E[AB]=E[A]E[B]##
Oh, I have made a mistake... we do not know that ##E[A^2]=0##.

If ##A## takes values ##-1## and ##1## with equal probability, then ##A^2=1## with probability ##1##. So ##E[A^2]=1##.

Thank you!
 
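Spelled out, the example in post #3 gives
$$E[A^2]=(-1)^2\cdot\tfrac12+1^2\cdot\tfrac12=1\neq\big(E[A]\big)^2=0,$$
so the ##E[A^2]## term cannot be dropped merely because ##E[A]=0##.
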
  • #4
anuttarasammyak said:
How did you estimate ##E[A^2]##?
##E[AB]=E[A]E[B]##

This is only valid if [itex]A[/itex] and [itex]B[/itex] are independent (it doesn't hold, for example, when [itex]A = B[/itex]), and we are not told that they are.

Using [tex]\newcommand{\Var}{\operatorname{Var}}\Var(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2[/tex], we have [tex]
\mathbb{E}[(A + B)^2] = \Var(A + B) + (\mathbb{E}[A + B])^2 = \Var(A + B) + b^2.[/tex]
 
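As a quick numerical sanity check of this identity (a minimal sketch in Python, not from the thread; the normal distributions, the correlation between ##A## and ##B##, and the value ##b=2## are illustrative assumptions only):

[code]
# Monte Carlo check of E[(A+B)^2] = Var(A+B) + b^2 for dependent A and B.
# The distributions below are arbitrary choices; the identity itself needs
# only E[A] = 0 and E[B] = b.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000
b = 2.0

A = rng.normal(0.0, 1.0, n)                  # E[A] = 0
B = b + 0.5 * A + rng.normal(0.0, 1.0, n)    # E[B] = b, and B depends on A

lhs = np.mean((A + B) ** 2)       # estimate of E[(A+B)^2]
rhs = np.var(A + B) + b ** 2      # estimate of Var(A+B) + b^2

print(lhs, rhs)  # the two estimates agree up to sampling error
[/code]

Here ##E[AB] \neq E[A]E[B]##, so the check exercises the dependent case rather than the independent one.
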
  • #5
pasmith said:
This is only valid if [itex]A[/itex] and [itex]B[/itex] are independent (it doesn't hold, for example, when [itex]A = B[/itex]), and we are not told that they are.

Using [tex]\newcommand{\Var}{\operatorname{Var}}\Var(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2[/tex], we have [tex]
\mathbb{E}[(A + B)^2] = \Var(A + B) + (\mathbb{E}[A + B])^2 = \Var(A + B) + b^2.[/tex]
That is extremely useful! Thanks.
 
  • #6
pasmith said:
This is only valid if [itex]A[/itex] and [itex]B[/itex] are independent (it doesn't hold, for example, when [itex]A = B[/itex]), and we are not told that they are.

Using [tex]\newcommand{\Var}{\operatorname{Var}}\Var(X) = \mathbb{E}[X^2] - (\mathbb{E}[X])^2[/tex], we have [tex]
\mathbb{E}[(A + B)^2] = \Var(A + B) + (\mathbb{E}[A + B])^2 = \Var(A + B) + b^2.[/tex]
I can think further of your example when ##A=B \sim N(0,1)##, so that ##A^2 \sim \chi^2(1)##, with expected value ##1##, while ##E[A]=0##, so that ##E[A^2]=1 \neq E[A]E[A]=0\cdot 0=0##.
 
Last edited:
  • #8
WWGD said:
I can think further of your example when ##A=B## ~##N(0,1)##, so that ##A^2## ~##\chi^2(1)##, with expected value ##1##, while ##E[A]=0##, so that ##E[A^2]=1 \neq E[A]E[A]=0\cdot 0=0##
Try ##\sim##
 
  • #9
Orodruin said:
Try ##\sim##
Thanks, fixed it. Phew! Need an upgrade and refresher on my TeX.
 
Last edited:
  • #10
What is meant by "simplified"? ##E[(A+B)^2]## is simpler than ##E[A^2]+2E[AB]+E[B^2]## and demands less calculation. Only in the case where ##A## and ##B## are independent does ##E[(A+B)^2]## reduce to a simpler expression, namely ##E[A^2]+E[B^2]##, which demands less calculation.
 
  • #11
Gavran said:
What is meant by "simplified"? ##E[(A+B)^2]## is simpler than ##E[A^2]+2E[AB]+E[B^2]## and demands less calculation. Only in the case where ##A## and ##B## are independent does ##E[(A+B)^2]## reduce to a simpler expression, namely ##E[A^2]+E[B^2]##, which demands less calculation.

[itex]A[/itex] and [itex]B[/itex] being independent does not imply that [itex]E[AB] = 0[/itex].
 
  • #12
pasmith said:
[itex]A[/itex] and [itex]B[/itex] being independent does not imply that [itex]E[AB] = 0[/itex].
It does if you pair it with ##E[A]=0## as given in the OP.
 
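Putting the thread together (a summary sketch, using the identity ##E[AB]=\operatorname{Cov}(A,B)+E[A]E[B]##): since ##E[A]=0##,
$$E[(A+B)^2]=E[A^2]+2E[AB]+E[B^2]=E[A^2]+2\operatorname{Cov}(A,B)+E[B^2],$$
and if ##A## and ##B## are additionally independent then ##\operatorname{Cov}(A,B)=0##, leaving ##E[A^2]+E[B^2]## as in posts #10 and #12. This agrees with pasmith's form, because ##\operatorname{Var}(A+B)=E[A^2]+2\operatorname{Cov}(A,B)+E[B^2]-b^2##.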

FAQ: Expectation of a sum of random variables

What is the expectation of the sum of two random variables?

The expectation of the sum of two random variables is equal to the sum of their expectations. Mathematically, if \(X\) and \(Y\) are random variables, then \(E[X + Y] = E[X] + E[Y]\).

Does the expectation of the sum of random variables require them to be independent?

No, the expectation of the sum of random variables does not require them to be independent. The linearity of expectation holds regardless of whether the random variables are independent or not.
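A simple example: take \(Y = X\), the most extreme form of dependence. Then \(E[X + Y] = E[2X] = 2E[X] = E[X] + E[Y]\), so linearity still holds, even though \(E[XY] = E[X^2]\) generally differs from \(E[X]E[Y]\), as the thread above illustrates.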

How do you generalize the expectation of the sum for more than two random variables?

The expectation of the sum of more than two random variables is the sum of their individual expectations. For random variables \(X_1, X_2, \ldots, X_n\), the expectation is \(E[X_1 + X_2 + \cdots + X_n] = E[X_1] + E[X_2] + \cdots + E[X_n]\).

What is the expectation of the sum of a constant and a random variable?

If \(c\) is a constant and \(X\) is a random variable, the expectation of their sum is \(E[c + X] = c + E[X]\). This follows from the linearity of expectation.

Can the expectation of the sum be used to find the variance of the sum of random variables?

While the expectation of the sum is straightforward, finding the variance of the sum of random variables requires additional information about their covariance. For independent random variables \(X\) and \(Y\), the variance of their sum is \(Var(X + Y) = Var(X) + Var(Y)\). For dependent variables, the covariance term must be included: \(Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y)\).
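
The covariance term comes from expanding the square (a short derivation for reference, assuming only that the variances exist):
$$\begin{align*}
Var(X+Y)&=E[(X+Y)^2]-\big(E[X+Y]\big)^2\\
&=E[X^2]+2E[XY]+E[Y^2]-\big(E[X]+E[Y]\big)^2\\
&=Var(X)+Var(Y)+2\big(E[XY]-E[X]E[Y]\big)\\
&=Var(X)+Var(Y)+2Cov(X,Y).
\end{align*}$$
Setting \(Cov(X, Y) = 0\), as happens when \(X\) and \(Y\) are independent, recovers the first formula.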
