trap101
I'm trying to show that the method of moments estimators for the normal distribution are consistent. To show consistency, I have to:
1) Show [itex]E(\hat{\theta}) = \theta[/itex] (the estimator is unbiased)
2) Show [itex]\lim_{n \rightarrow \infty} \mathrm{Var}(\hat{\theta}) = 0[/itex]
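(As I understand it, these two conditions are enough because, for an unbiased estimator, Chebyshev's inequality gives
[tex]P(|\hat{\theta} - \theta| \geq \epsilon) \leq \frac{\mathrm{Var}(\hat{\theta})}{\epsilon^2} \rightarrow 0,[/tex]
so the estimator converges in probability to the parameter.)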
Since the normal distribution has two parameters ([itex]\mu[/itex], [itex]\sigma^2[/itex]), I have to prove that each of the two estimators is consistent.
To prove it for [itex]\hat{\mu}[/itex]:
[itex]E(\hat{\mu}) = E\left(\frac{1}{n}\sum X_i\right)[/itex]
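Assuming the [itex]X_i[/itex] are an i.i.d. sample, I believe this works out as
[tex]E(\hat{\mu}) = \frac{1}{n}\sum E(X_i) = \frac{n\mu}{n} = \mu.[/tex]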
Working along, I get that [itex]\hat{\mu}[/itex] is unbiased, and likewise for [itex]\hat{\sigma}^2[/itex], but I get tripped up showing the second condition for both estimators.
For [itex]\hat{\mu}[/itex]:
[itex]V(\hat{\mu}) = V\left(\frac{1}{n}\sum X_i\right) = \frac{1}{n^2}\sum V(X_i)[/itex]
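If I substitute [itex]V(X_i) = \sigma^2[/itex] for the i.i.d. sample, I think this becomes
[tex]V(\hat{\mu}) = \frac{1}{n^2} \cdot n\sigma^2 = \frac{\sigma^2}{n} \rightarrow 0,[/tex]
but I'm not sure that settles it for [itex]\hat{\mu}[/itex].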
Similarly for [itex]V(\hat{\sigma}^2)[/itex]:
[itex]\frac{1}{n^2}\sum V\left((X_i - \bar{X})^2\right)[/itex]
How do I proceed for these two estimators from here?
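As a sanity check (not a proof), here is a quick simulation I put together in Python/NumPy; the parameter values and variable names are just my own choices. It draws repeated samples of increasing size, computes both method of moments estimates, and prints their spread across replications, which is what the second condition is supposed to control.

[code]
import numpy as np

rng = np.random.default_rng(0)
mu_true, sigma2_true = 2.0, 3.0   # arbitrary "true" parameters for the check
reps = 1000                       # independent samples per sample size

for n in (10, 100, 1000, 10000):
    # reps samples of size n from N(mu_true, sigma2_true)
    x = rng.normal(mu_true, np.sqrt(sigma2_true), size=(reps, n))
    mu_hat = x.mean(axis=1)                                  # MoM estimate of mu
    sigma2_hat = ((x - mu_hat[:, None]) ** 2).mean(axis=1)   # MoM estimate of sigma^2 (divides by n)
    print(f"n={n:6d}  Var(mu_hat)={mu_hat.var():.5f}  Var(sigma2_hat)={sigma2_hat.var():.5f}")
[/code]

The printed variances shrink roughly like 1/n for both estimates, which is the behaviour I'm trying to establish analytically.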