OLS standard error that corrects for autocorrelation but not heteroskedasticity

Question: By mapping the OLS regression into the GMM framework, write the formula for the standard error of the OLS regression coefficients that corrects for autocorrelation but *not* heteroskedasticity. Furthermore, show that in this case, the conventional standard errors are OK if the $x_t$'s are uncorrelated over time, even if the errors are correlated over time.

Attempt:
So the general model is $y_t = x_t'\beta + \varepsilon_t$. OLS picks the parameter estimate $\hat\beta$ to minimize the variance of the residual:
$$\min_{\hat\beta}\; E_T\!\left[\left(y_t - x_t'\hat\beta\right)^2\right],$$
where the notation $E_T[\cdot] = \frac{1}{T}\sum_{t=1}^{T}(\cdot)$ denotes the sample mean. We find $\hat\beta$ from the first-order condition, which states that:
$$g_T(\hat\beta) = E_T\!\left[x_t\left(y_t - x_t'\hat\beta\right)\right] = 0.$$
In the GMM context, the number of moments here equals the number of parameters. Thus, we set the sample moments exactly to zero and solve for the estimate analytically:
$$\hat\beta = \left[E_T\!\left(x_t x_t'\right)\right]^{-1} E_T\!\left(x_t y_t\right).$$
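As a quick sanity check (not part of the derivation), here is a minimal numerical sketch, with simulated data and made-up variable names, of solving this sample moment condition; it just reproduces the textbook OLS estimate $(X'X)^{-1}X'y$.

```python
import numpy as np

# Minimal sketch (simulated data, illustrative names): solve the exactly
# identified moment condition E_T[x_t (y_t - x_t' beta)] = 0 for beta.
rng = np.random.default_rng(0)
T = 500
X = np.column_stack([np.ones(T), rng.normal(size=T)])   # x_t, with a constant
beta_true = np.array([1.0, 0.5])
y = X @ beta_true + rng.normal(size=T)                   # y_t = x_t' beta + eps_t

Exx = X.T @ X / T                        # E_T[x_t x_t']
Exy = X.T @ y / T                        # E_T[x_t y_t]
beta_hat = np.linalg.solve(Exx, Exy)     # beta_hat = E_T[x x']^{-1} E_T[x y]
print(beta_hat)                          # close to beta_true
```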
Using the known result from GMM theory that
$$\operatorname{var}(\hat\beta) = \frac{1}{T}\,(ad)^{-1}\, a S a'\,(ad)^{-1\prime},$$
where in this case $a = I$ (the identity matrix), $d = -E\!\left(x_t x_t'\right)$, and $S = \sum_{j=-\infty}^{\infty} E\!\left(u_t u_{t-j}'\right)$ with $u_t = x_t \varepsilon_t$.

So the general formula for the standard error of OLS is
$$\operatorname{var}(\hat\beta) = \frac{1}{T}\, E\!\left(x_t x_t'\right)^{-1} \left[\sum_{j=-\infty}^{\infty} E\!\left(\varepsilon_t x_t\, x_{t-j}'\varepsilon_{t-j}\right)\right] E\!\left(x_t x_t'\right)^{-1}.$$
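For intuition about what this formula does in practice (not part of the question itself): in a finite sample the infinite sum over $j$ has to be truncated and usually down-weighted, Newey–West style, to keep $S$ positive semi-definite. Below is a self-contained sketch with simulated AR(1) errors and made-up names; the lag length is an arbitrary choice on my part.

```python
import numpy as np

# Illustrative sketch: the sandwich variance
#   var(beta_hat) = (1/T) E(xx')^{-1} [ sum_j E(eps_t x_t x_{t-j}' eps_{t-j}) ] E(xx')^{-1}
# with the sum truncated at lag L and Bartlett-weighted (Newey-West style).
rng = np.random.default_rng(1)
T = 1000
X = np.column_stack([np.ones(T), rng.normal(size=T)])
eps = np.zeros(T)
for t in range(1, T):                        # AR(1) errors -> autocorrelation
    eps[t] = 0.6 * eps[t - 1] + rng.normal()
y = X @ np.array([1.0, 0.5]) + eps

beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
u = X * (y - X @ beta_hat)[:, None]          # u_t = x_t * eps_hat_t, shape (T, k)

Exx = X.T @ X / T                            # E_T[x_t x_t']
L = 10                                       # truncation lag (arbitrary choice here)
S = u.T @ u / T                              # j = 0 term
for j in range(1, L + 1):
    w = 1 - j / (L + 1)                      # Bartlett weight
    Gamma_j = u[j:].T @ u[:-j] / T           # E_T[u_t u_{t-j}']
    S += w * (Gamma_j + Gamma_j.T)

Exx_inv = np.linalg.inv(Exx)
V = Exx_inv @ S @ Exx_inv / T                # sandwich variance of beta_hat
print(np.sqrt(np.diag(V)))                   # HAC standard errors
```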
Now I know from the OLS assumptions:

(i) No autocorrelation: $E\!\left(\varepsilon_t \varepsilon_{t-j}\, x_t x_{t-j}'\right) = 0$ for $j \neq 0$, so only the $j = 0$ term of the sum survives.

(ii) No heteroskedasticity: $E\!\left(\varepsilon_t^2\, x_t x_t'\right) = E\!\left(\varepsilon_t^2\right) E\!\left(x_t x_t'\right) = \sigma^2\, E\!\left(x_t x_t'\right)$.
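Writing out what these two assumptions imply for the general formula above (a standard textbook simplification, just to make explicit what I mean by the conventional standard errors):
$$\operatorname{var}(\hat\beta) \;=\; \frac{1}{T}\, E\!\left(x_t x_t'\right)^{-1} E\!\left(\varepsilon_t^2\, x_t x_t'\right) E\!\left(x_t x_t'\right)^{-1} \;=\; \frac{\sigma^2}{T}\, E\!\left(x_t x_t'\right)^{-1}, \qquad \sigma^2 \equiv E\!\left(\varepsilon_t^2\right).$$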



What would the OLS standard error become if I correct for autocorrelation but not heteroskedasticity? Also, how do I show that the conventional standard errors are OK if the $x_t$'s are uncorrelated over time, even if the errors are correlated over time?
 
