OLS standard error that corrects for autocorrelation but not heteroskedasticity

  • #1
Usagi
Question: By mapping the OLS regression into the GMM framework, write the formula for the standard error of the OLS regression coefficients that corrects for autocorrelation but *not* heteroskedasticity. Furthermore, show that in this case, the conventional standard errors are OK if the $x_t$'s are uncorrelated over time, even if the errors are correlated over time.

Attempt:
So the general model is $y_t = x_t'\beta + \varepsilon_t$. OLS picks parameters to minimize the variance of the residual:
$$\min_{\beta}\; E_T\!\left[(y_t - x_t'\beta)^2\right],$$
where the notation $E_T[\cdot] = \frac{1}{T}\sum_{t=1}^{T}[\cdot]$ denotes the sample mean. We find $\hat\beta$ from the first-order condition, which states that:
$$g_T(\beta) = E_T\!\left[x_t\,(y_t - x_t'\beta)\right] = E_T[x_t\varepsilon_t] = 0.$$
In the GMM context, the number of moments here equals the number of parameters. Thus, we set the sample moments exactly to zero and solve for the estimate analytically:
$$\hat\beta = \left(E_T[x_tx_t']\right)^{-1}E_T[x_ty_t].$$
Using the known result from GMM theory that
$$\operatorname{var}(\hat\beta) = \frac{1}{T}\,(ad)^{-1}\,a\,S\,a'\,(ad)^{-1\prime},$$
where in this case $a = I$ (the identity matrix), $d = \partial g_T(\beta)/\partial\beta' = -E_T[x_tx_t']$, and $S = \sum_{j=-\infty}^{\infty}E\!\left[(x_t\varepsilon_t)(x_{t-j}\varepsilon_{t-j})'\right]$ with $\varepsilon_t = y_t - x_t'\beta$.

So the general formula for the standard error of OLS is
$$\operatorname{var}(\hat\beta) = \frac{1}{T}\,E[x_tx_t']^{-1}\left[\sum_{j=-\infty}^{\infty}E\!\left[\varepsilon_t\varepsilon_{t-j}\,x_tx_{t-j}'\right]\right]E[x_tx_t']^{-1}.$$
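For concreteness, here is a minimal Python sketch of that general sandwich formula on simulated data. The data-generating process, the lag truncation `L`, and the Bartlett (Newey-West) weights are my own choices for illustration; the formula above sums over all lags.

```python
import numpy as np

# Sketch: OLS as exactly identified GMM, with the sandwich variance
# var(b) = (1/T) E[xx']^{-1} S E[xx']^{-1}.  Simulated data; the lag
# truncation L and Bartlett weights are assumptions for this example.

rng = np.random.default_rng(0)
T = 500
X = np.column_stack([np.ones(T), rng.standard_normal(T)])   # x_t (with a constant)
beta_true = np.array([1.0, 0.5])
eps = rng.standard_normal(T)
y = X @ beta_true + eps

# Exactly identified GMM: E_T[x_t (y_t - x_t' b)] = 0  =>  b = E_T[xx']^{-1} E_T[xy]
Exx = X.T @ X / T
b_hat = np.linalg.solve(Exx, X.T @ y / T)
e = y - X @ b_hat                      # residuals eps_t
u = X * e[:, None]                     # moment contributions u_t = x_t * eps_t

# Long-run covariance S = sum_j E[u_t u_{t-j}'], truncated with Bartlett weights
L = 4                                  # truncation lag (assumption)
S = u.T @ u / T
for j in range(1, L + 1):
    w = 1.0 - j / (L + 1.0)
    G = u[j:].T @ u[:-j] / T           # j-th sample autocovariance of u_t
    S += w * (G + G.T)

# Sandwich variance vs. the conventional OLS variance (sigma^2 / T) E[xx']^{-1}
Exx_inv = np.linalg.inv(Exx)
V_gmm = Exx_inv @ S @ Exx_inv / T
V_conv = (e @ e / T) * Exx_inv / T

print("GMM/HAC std errors:     ", np.sqrt(np.diag(V_gmm)))
print("Conventional std errors:", np.sqrt(np.diag(V_conv)))
```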
Now I know from the OLS assumptions:

(i) No autocorrelation: $E[\varepsilon_t\varepsilon_{t-j}\,x_tx_{t-j}'] = 0$ for $j \ne 0$.

(ii) No heteroskedasticity: $E[\varepsilon_t^2\,x_tx_t'] = E[\varepsilon_t^2]\,E[x_tx_t'] = \sigma^2\,E[x_tx_t']$.
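For reference, substituting both assumptions into the general formula (the step the conventional result relies on): under (i) only the $j = 0$ term of $S$ survives, and under (ii) that term factors, so
$$S = E[\varepsilon_t^2\,x_tx_t'] = \sigma^2\,E[x_tx_t'], \qquad \operatorname{var}(\hat\beta) = \frac{1}{T}\,E[x_tx_t']^{-1}\,\sigma^2 E[x_tx_t']\,E[x_tx_t']^{-1} = \frac{\sigma^2}{T}\,E[x_tx_t']^{-1},$$
which is the conventional OLS formula.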
What would the OLS standard error become if I correct for autocorrelation but not heteroskedasticity? Also, how do I show that the conventional standard errors are OK if the $x_t$'s are uncorrelated over time, even if the errors are correlated over time?
 
  • #2
Answer: To correct for autocorrelation but not heteroskedasticity, assume the errors are independent of the regressors, so that $E[\varepsilon_t\varepsilon_{t-j}\,x_tx_{t-j}'] = E[\varepsilon_t\varepsilon_{t-j}]\,E[x_tx_{t-j}']$. In this case, the OLS standard error would be:
$$\operatorname{var}(\hat\beta) = \frac{1}{T}\,E[x_tx_t']^{-1}\left[\sum_{j=-\infty}^{\infty}E[\varepsilon_t\varepsilon_{t-j}]\,E[x_tx_{t-j}']\right]E[x_tx_t']^{-1}.$$
The conventional standard errors are OK if the $x_t$'s are uncorrelated over time, even if the errors are correlated over time: if $E[x_tx_{t-j}'] = 0$ for $j \ne 0$, only the $j = 0$ term of the sum survives, so $S = E[\varepsilon_t^2]\,E[x_tx_t'] = \sigma^2\,E[x_tx_t']$ and the expression collapses to the conventional $\operatorname{var}(\hat\beta) = \frac{\sigma^2}{T}\,E[x_tx_t']^{-1}$, regardless of whether $E[\varepsilon_t\varepsilon_{t-j}] \ne 0$ for $j \ne 0$.
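As a rough numerical check of that claim, here is a sketch under assumptions I am adding myself (a single i.i.d. regressor with no constant, AR(1) errors with coefficient 0.7, Bartlett weights and a lag truncation of 20): with the $x_t$'s uncorrelated over time, the autocorrelation-only correction should come out close to the conventional standard error even though the errors are strongly autocorrelated.

```python
import numpy as np

# Rough check (my own simulated setup, not from the thread): i.i.d. x_t,
# AR(1) errors.  Because E[x_t x_{t-j}] = 0 for j != 0, the
# autocorrelation-only corrected SE should be close to the conventional SE.

rng = np.random.default_rng(1)
T, rho, L = 5000, 0.7, 20          # sample size, AR(1) coefficient, lag truncation (assumptions)

x = rng.standard_normal(T)         # i.i.d. regressor, uncorrelated over time
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + rng.standard_normal()
y = 0.5 * x + eps

Exx = x @ x / T
b = (x @ y / T) / Exx              # b = E_T[xx']^{-1} E_T[xy]
e = y - b * x
sigma2 = e @ e / T

# Autocorrelation-only correction: S = sum_j E[eps_t eps_{t-j}] * E[x_t x_{t-j}]
S = sigma2 * Exx
for j in range(1, L + 1):
    w = 1.0 - j / (L + 1.0)        # Bartlett weight (my choice)
    cov_e = e[j:] @ e[:-j] / T     # sample E[eps_t eps_{t-j}]
    cov_x = x[j:] @ x[:-j] / T     # sample E[x_t x_{t-j}]  (near zero here)
    S += 2.0 * w * cov_e * cov_x

se_auto = np.sqrt(S / Exx**2 / T)  # sqrt of E[xx']^{-1} S E[xx']^{-1} / T
se_conv = np.sqrt(sigma2 / Exx / T)

print("autocorrelation-corrected SE:", se_auto)
print("conventional SE:             ", se_conv)
```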
 
