OhMyMarkov
Hello Everyone!
What $b$ minimizes $E[(X-b)^2]$, where $b$ is some constant? Isn't it $b=E[X]$? Is it right to go about the proof as follows:
$E[(X-b)^2] = E[X^2 + b^2 - 2bX] = E[X^2] + b^2 - 2bE[X]$, since $E[b^2] = b^2$ for a constant $b$. Differentiating with respect to $b$ and setting the derivative to zero gives $2b - 2E[X] = 0$, i.e. $b = E[X]$. Is this proof correct? I was thinking it was until I got this problem:
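For what it's worth, I also tried to convince myself of the same answer without calculus, using the usual add-and-subtract trick (assuming I'm applying it correctly):

$E[(X-b)^2] = E\big[\big((X-E[X]) + (E[X]-b)\big)^2\big] = \mathrm{Var}(X) + (E[X]-b)^2$,

since the cross term $2(E[X]-b)\,E[X-E[X]]$ vanishes. The right-hand side is smallest exactly when $b = E[X]$, which matches what I got above.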
What $Y$ minimizes $E[(Y-aX-b)^2]$? The answer given contains variances and covariances, but all I get is $Y = aE[X] + b$.
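To see where the variances and covariances might come from, I put together a quick numerical sketch (NumPy, with arbitrary made-up sample data) that brute-forces the $a$ and $b$ minimizing the expected squared error and compares them with my guess at the moment formulas:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary joint sample for (X, Y); the particular numbers are made up just
# to have something concrete to plug in.
x = rng.normal(loc=2.0, scale=1.5, size=5_000)
y = 3.0 * x + 1.0 + rng.normal(scale=2.0, size=5_000)

def mse(a, b):
    """Sample approximation of E[(Y - aX - b)^2]."""
    return np.mean((y - a * x - b) ** 2)

# Brute-force search over a grid of (a, b).
a_grid = np.linspace(0.0, 6.0, 121)
b_grid = np.linspace(-3.0, 5.0, 161)
best = min((mse(a, b), a, b) for a in a_grid for b in b_grid)
print(f"grid minimizer:  a ≈ {best[1]:.2f}, b ≈ {best[2]:.2f}")

# My guess at where the variances/covariances in the given answer come from:
# a = Cov(X, Y) / Var(X), b = E[Y] - a E[X].
a_hat = np.mean((x - x.mean()) * (y - y.mean())) / np.var(x)
b_hat = y.mean() - a_hat * x.mean()
print(f"moment formulas: a ≈ {a_hat:.2f}, b ≈ {b_hat:.2f}")
```

If I've set this up right, the grid search and the moment formulas should land on essentially the same $(a, b)$, which is not what my own derivation gives.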
What am I doing wrong here?
Any help is appreciated! :D