Question: By mapping the OLS regression into the GMM framework, write the formula for the standard error of the OLS regression coefficients that corrects for autocorrelation but *not* heteroskedasticity. Furthermore, show that in this case, the conventional standard errors are OK if the $x$'s are uncorrelated over time, even if the errors $\varepsilon$ are correlated over time.
Attempt: So the general model is $y_t = \beta' x_t + \varepsilon_t$. OLS picks parameters $\beta$ to minimize the variance of the residual:
$$\min_{\beta} E_T[(y_t-\beta' x_t)^2] $$
where the notation $E_T(\cdot) = \frac{1}{T} \sum_{t=1}^T( \cdot )$ denotes the sample mean. We find $\widehat{\beta}$ from the first-order condition:
$$g_T(\beta) = E_T[x_t(y_t - x_t' \beta)] =0$$
Here the number of moments equals the number of parameters, so in the GMM framework we set the sample moments exactly to zero and solve for the estimate analytically:
$$\widehat{\beta} = [E_T(x_tx_t')]^{-1} E_T(x_t y_t)$$
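To make the mapping concrete, here is a minimal numerical sketch of forming the sample moments $E_T(x_t x_t')$ and $E_T(x_t y_t)$ and solving for $\widehat{\beta}$. The simulated data, dimensions, and variable names are my own illustration (only numpy is assumed), not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)
T, k = 500, 3

# Simulated regressors and errors (purely illustrative).
X = rng.normal(size=(T, k))          # rows are x_t'
beta_true = np.array([1.0, -0.5, 2.0])
eps = rng.normal(size=T)
y = X @ beta_true + eps

# Sample moments E_T(x_t x_t') and E_T(x_t y_t).
Exx = X.T @ X / T
Exy = X.T @ y / T

# Exactly identified GMM: set g_T(beta) = 0 and solve analytically.
beta_hat = np.linalg.solve(Exx, Exy)
print(beta_hat)
```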
Using the known result from GMM theory that
$$Var(\widehat{\beta}) = \frac{1}{T} (ad)^{-1} a S a' \left[(ad)^{-1}\right]'$$
where in this case $a = I$ (the identity matrix), $d = -E[x_t x_t']$, and $S = \sum_{j=-\infty}^{\infty} E[f(x_t, \beta) f(x_{t-j}, \beta)']$ with $f(x_t, \beta) = x_t(y_t - x_t'\beta) = x_t \varepsilon_t$.
So the general formula for the variance of the OLS estimates (whose diagonal square roots are the standard errors) is
$$Var(\widehat{\beta}) = \frac{1}{T}E(x_t x_t')^{-1} \left[\sum_{j=-\infty}^{\infty} E(\varepsilon_t x_t x_{t-j}' \varepsilon_{t-j})\right]E(x_t x_t')^{-1}$$
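In practice I would estimate the middle sum by truncating it at a finite lag with Bartlett (Newey–West style) weights. Below is a rough sketch under that assumption; the function name `hac_varcov` and the lag choice `q = 4` are placeholders of my own, and it can be applied to the simulated `X`, `y` from the sketch above:

```python
import numpy as np

def hac_varcov(X, y, q=4):
    """Sandwich estimate of Var(beta_hat), truncating the sum over j at lag q
    with Bartlett weights (the choice q = 4 is just a placeholder assumption)."""
    T = len(y)
    Exx = X.T @ X / T
    beta_hat = np.linalg.solve(Exx, X.T @ y / T)
    eps_hat = y - X @ beta_hat
    f = X * eps_hat[:, None]              # rows are f_t = x_t * eps_t

    S = f.T @ f / T                       # j = 0 term of the sum
    for j in range(1, q + 1):
        w = 1.0 - j / (q + 1.0)           # Bartlett weight
        Gamma_j = f[j:].T @ f[:-j] / T    # sample E(f_t f_{t-j}')
        S += w * (Gamma_j + Gamma_j.T)

    Exx_inv = np.linalg.inv(Exx)
    return Exx_inv @ S @ Exx_inv / T      # Var(beta_hat)

# e.g. with X, y from the sketch above:
# se = np.sqrt(np.diag(hac_varcov(X, y, q=4)))
```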
Now I know from the OLS assumptions (a quick check after the list confirms that imposing both gives back the conventional formula):
(i) No autocorrelation: $E(\varepsilon_t \mid x_t, x_{t-1}, \cdots, \varepsilon_{t-1}, \varepsilon_{t-2}, \cdots) =0$
(ii) No heteroskedasticity: $E(\varepsilon_t^2 \mid x_t, x_{t-1}, \cdots, \varepsilon_{t-1}, \cdots) = \text{constant} = \sigma_{\varepsilon}^2$
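As a sanity check (my own working, following the logic of the general formula above): if I impose *both* (i) and (ii), the $j \neq 0$ terms vanish and the $j = 0$ term factors, so the general expression collapses to the conventional OLS formula:
$$E(\varepsilon_t x_t x_{t-j}' \varepsilon_{t-j}) = 0 \;\; (j \neq 0) \text{ by (i)}, \qquad E(\varepsilon_t^2 x_t x_t') = \sigma_{\varepsilon}^2 E(x_t x_t') \text{ by (ii)},$$
$$Var(\widehat{\beta}) = \frac{1}{T} E(x_t x_t')^{-1} \, \sigma_{\varepsilon}^2 E(x_t x_t') \, E(x_t x_t')^{-1} = \frac{\sigma_{\varepsilon}^2}{T} E(x_t x_t')^{-1}.$$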
What would the OLS standard error become if I correct for autocorrelation but not heteroskedasticity? Also, how do I show that the conventional standard errors are OK if the $x$'s are uncorrelated over time, even if the errors $\varepsilon$ are correlated over time?