NotEuler
Hi,
I'm trying to figure out something I'm pretty sure is true, but I don't know how to prove it. I couldn't find it with a Google search, but hopefully someone here knows the answer!
So I have a linear least squares multiple regression model:
Y=a+bX1+cX2+e
where a is the intercept, X1 and X2 are the predictor (independent) variables, and e denotes the residuals.
The model (i.e. the values of a, b and c) is fitted so that Σe^2 is minimized.
How do I prove that cov(e,X1) = cov(e,X2) = 0?
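To show concretely what I mean, here's a quick numerical sketch (Python/numpy, with made-up data; the names X1, X2, a, b, c just mirror the model above). It fits the model by least squares and then checks the sample covariance of the residuals with each predictor:

```python
import numpy as np

# Made-up data just to illustrate the setup.
rng = np.random.default_rng(0)
n = 200
X1 = rng.normal(size=n)
X2 = rng.normal(size=n)
Y = 1.5 + 2.0 * X1 - 0.7 * X2 + rng.normal(size=n)

# Design matrix with an intercept column; lstsq minimizes the sum of squared residuals,
# giving the fitted a, b, c.
X = np.column_stack([np.ones(n), X1, X2])
coef, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Residuals e = Y - (a + b*X1 + c*X2)
e = Y - X @ coef

# Sample covariances of the residuals with each predictor.
print(np.cov(e, X1)[0, 1])
print(np.cov(e, X2)[0, 1])
```

Both printed covariances come out numerically indistinguishable from zero, so the property clearly holds in practice; I just can't see how to prove it in general.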
Thanks!
NotEuler