slakedlime
Homework Statement
Any help on this would be immensely appreciated! I am having trouble interpreting what my instructor is trying to say.
Consider a simple linear regression model: [itex]y_i = \beta_0 + \beta_1x_i + u_i[/itex]
(a) In regression through the origin, the intercept is assumed to be equal to zero. For this model, derive the formula for the OLS estimator. Let's denote this estimator by [itex]\tilde{\beta_1}[/itex].
(b) Show that [itex]\tilde{\beta_1}[/itex] is consistent (assuming [itex]\beta_0 = 0[/itex]).
The Attempt at a Solution
For part (a), I derived:
We know [itex]y_i = \beta_1x_i + u_i[/itex]. Hence:
[itex]\tilde{\beta_1} = \frac{\sum_{i=1}^{n}x_iy_i}{\sum_{i=1}^{n}x_i^2}[/itex]
If we want to show the error term, [itex]u_i[/itex], we plug in [itex]y_i = \beta_1x_i + u_i[/itex]:
[itex]\tilde{\beta_1} = \frac{\sum_{i=1}^{n}x_i(\beta_1x_i + u_i)}{\sum_{i=1}^{n}x_i^2} = \frac{\sum_{i=1}^{n}(\beta_1x_i^2 + u_ix_i)}{\sum_{i=1}^{n}x_i^2}[/itex]
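(For completeness, here is a sketch of where the formula comes from: it is the first-order condition of least squares. Choose [itex]\tilde{\beta_1}[/itex] to minimize the sum of squared residuals [itex]\sum_{i=1}^{n}(y_i - \tilde{\beta_1}x_i)^2[/itex]. Setting the derivative with respect to [itex]\tilde{\beta_1}[/itex] to zero gives
[itex]-2\sum_{i=1}^{n}x_i(y_i - \tilde{\beta_1}x_i) = 0[/itex],
so [itex]\sum_{i=1}^{n}x_iy_i = \tilde{\beta_1}\sum_{i=1}^{n}x_i^2[/itex], which rearranges to the formula above.)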
For part (b), I would have to prove that [itex]\tilde{\beta_1}[/itex] converges in probability to [itex]\beta_1[/itex]. My instructor hinted that this involves showing that the denominator of the formula derived in part (a) converges to a constant. I don't see how this is possible, since the raw sum [itex]\sum_{i=1}^{n}x_i^2[/itex] grows without bound as [itex]n \to \infty[/itex].
The only possible answer I can see is the following:
[itex]\tilde{\beta_1} = \frac{\sum_{i=1}^{n}\beta_1x_i^2}{\sum_{i=1}^{n}x_i^2} + \frac{\sum_{i=1}^{n}u_ix_i}{\sum_{i=1}^{n}x_i^2} = \beta_1 + \frac{\sum_{i=1}^{n}u_ix_i}{\sum_{i=1}^{n}x_i^2}[/itex]
So if [itex]u_i = 0[/itex] for all [itex]i[/itex], then the OLS estimator for regression through the origin equals [itex]\beta_1[/itex] exactly, given that [itex]\beta_0 = 0[/itex]. This would show that any inconsistency in the estimator [itex]\tilde{\beta_1}[/itex] is due to the error term [itex]u_i[/itex].
However, I think that my answer is incorrect because I haven't applied any convergence methods (law of large numbers, Slutsky's theorem, continuous mapping theorem).
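One standard route (a sketch, under the usual assumptions [itex]E[x_iu_i] = 0[/itex] and [itex]0 < E[x_i^2] < \infty[/itex], which are not stated in the problem but are presumably intended): divide both the numerator and the denominator of the error term by [itex]n[/itex]. This may be what the hint about the denominator refers to, since it is the average [itex]\frac{1}{n}\sum_{i=1}^{n}x_i^2[/itex], not the raw sum, that converges to a constant. Then
[itex]\tilde{\beta_1} = \beta_1 + \frac{\frac{1}{n}\sum_{i=1}^{n}u_ix_i}{\frac{1}{n}\sum_{i=1}^{n}x_i^2} \xrightarrow{p} \beta_1 + \frac{E[u_ix_i]}{E[x_i^2]} = \beta_1[/itex]
by the law of large numbers applied to each average and Slutsky's theorem for the ratio.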
Any help would be immensely appreciated. If someone could shed light on what my instructor is saying, that would be great too. Thank you!
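As a numerical sanity check of the consistency claim in part (b), here is a minimal simulation sketch in Python. The distributions for [itex]x_i[/itex] and [itex]u_i[/itex] below are illustrative assumptions (they are not given in the problem); what matters is that [itex]E[x_iu_i] = 0[/itex] and [itex]E[x_i^2][/itex] is finite and nonzero.

```python
import numpy as np

# Simulate regression through the origin: y_i = beta1 * x_i + u_i (beta0 = 0).
# Distributions are illustrative assumptions, not part of the original problem.
rng = np.random.default_rng(0)
beta1 = 2.0

def beta1_tilde(n):
    """OLS estimator through the origin: sum(x*y) / sum(x^2)."""
    x = rng.uniform(1.0, 3.0, n)   # E[x^2] finite and bounded away from zero
    u = rng.normal(0.0, 1.0, n)    # errors independent of x, so E[u*x] = 0
    y = beta1 * x + u
    return np.sum(x * y) / np.sum(x ** 2)

# The estimate should tighten around beta1 = 2 as n grows.
print(beta1_tilde(100))
print(beta1_tilde(1_000_000))
```

With a large sample the estimate lands very close to the true slope, consistent with the law-of-large-numbers argument.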