How to Show That the Method of Moments Estimators for the Normal Distribution Are Consistent

In summary: assuming the estimate ##\hat\sigma^2 = \frac{1}{n}\sum_{i=1}^n (X_i - \bar{X})^2##, you need to square ##X_i - \bar{X}##, expand it out, then take the expectation; once you have that, it is just a matter of doing the algebra. The basic properties you need are ##E X_j^2 = \sigma^2 + \mu^2## and the fact that the different ##X_j## are independent, so that ##E X_i X_j## is easy to get for ##i \neq j##.
  • #1
trap101
So I'm trying to show that the method of moments estimators for the normal distribution are consistent. To show consistency, I have to:

1) Show E([itex]\hat{\theta}[/itex]) = [itex]\theta[/itex] (the parameter)

2) Show lim n→∞ Var([itex]\hat{\theta}[/itex]) = 0

Since the normal distribution has two parameters to estimate ([itex]\mu[/itex], [itex]\sigma^2[/itex]), I have to prove that each estimator is consistent.

To prove it for [itex]\hat{\mu}[/itex]:

E([itex]\hat{\mu}[/itex]) = E([itex]\frac{1}{n}\sum X_i[/itex])

Working along, I get that [itex]\hat{\mu}[/itex] is unbiased, as well as [itex]\hat{\sigma}^2[/itex], but now to show the second condition I get tripped up for both estimators.

For [itex]\hat{\mu}[/itex]:

V([itex]\hat{\mu}[/itex]) = V([itex]\frac{1}{n}\sum X_i[/itex])

= [itex]\frac{1}{n^2}\sum V(X_i)[/itex]

Similarly, for V([itex]\hat{\sigma}^2[/itex]):

[itex]\frac{1}{n^2}\sum V\big((X_i - \bar{X})^2\big)[/itex]

How do I proceed for these two estimators from here?
 
  • #2
trap101 said:
[...] Similarly, for V([itex]\hat{\sigma}^2[/itex]):

[itex]\frac{1}{n^2}\sum V\big((X_i - \bar{X})^2\big)[/itex]

How do I proceed for these two estimators from here?

If you are estimating ##\sigma^2## using
[tex] \text{est}(\sigma^2) = \frac{1}{n} \sum_{i=1}^n (X_i - \bar{X})^2[/tex]
then your estimate is biased.
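Indeed, since ##E[\hat\sigma^2] = \frac{n-1}{n}\sigma^2##, a quick simulation illustrates the bias (a sketch; the choices of ##n##, ##\sigma^2##, and the repetition count are arbitrary):

```python
# Numerical check that (1/n) * sum((x_i - xbar)^2) is a biased
# estimator of sigma^2: its expectation is (n-1)/n * sigma^2.
import random
random.seed(0)

n = 5          # small n makes the bias easy to see
sigma2 = 1.0   # true variance (standard normal draws)
reps = 100_000

total = 0.0
for _ in range(reps):
    xs = [random.gauss(0.0, 1.0) for _ in range(n)]
    xbar = sum(xs) / n
    total += sum((x - xbar) ** 2 for x in xs) / n

avg = total / reps
print(avg)  # close to (n-1)/n * sigma2 = 0.8, not 1.0
```

With n = 5 the empirical average sits near 0.8 rather than the true variance 1.0, matching the (n−1)/n factor.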
 
  • #3
Ray Vickson said:
If you are estimating ##\sigma^2## using
[tex] \text{est}(\sigma^2) = \frac{1}{n} \sum_{i=1}^n (X_i - \bar{X})^2[/tex]
then your estimate is biased.


Then I established that result incorrectly. I did this:

Assuming the [itex]\hat{\sigma}^2[/itex] estimate:

E([itex]\hat{\sigma}^2[/itex]) = [itex]\frac{1}{n}\sum E\big((X_i - \bar{X})^2\big)[/itex]

= [itex]\frac{1}{n}\sum \sigma^2[/itex]

= [itex]\frac{1}{n}(n\sigma^2)[/itex]

= [itex]\sigma^2[/itex]
 
  • #4
Your error is in stating that
[tex]
E\left(X_i - \overline{X}\right)^2 = \sigma^2
[/tex]
 
  • #5
trap101 said:
Then I established that result incorrectly. I did this: [...]

= [itex]\sigma^2[/itex]

As 'statdad' has explained to you, ##E(X_i - \bar{X})^2 \neq \sigma^2##.

Take ##i = 1##, for example. We have [tex]X_1 - \bar{X} = \left(1-\frac{1}{n}\right) X_1 - \frac{1}{n} X_2 - \cdots - \frac{1}{n} X_n[/tex]
You need to square this, expand it out, then take the expectation (and yes, I am perfectly serious!). Try it yourself; it is not as bad as you might think at first. The basic properties you need are ##E X_j^2 = \sigma^2 + \mu^2## and the fact that the different ##X_j## are independent, so that ##E X_i X_j## is easy to get for ##i \neq j##.
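For reference, carrying that expansion through with those two properties gives:

[tex]
\begin{align*}
E(X_1-\bar{X})^2 &= \left(1-\tfrac{1}{n}\right)^2 EX_1^2 + \tfrac{1}{n^2}\sum_{j=2}^n EX_j^2 + \tfrac{2}{n^2}\sum_{2\le j<k\le n} EX_jX_k - \tfrac{2}{n}\left(1-\tfrac{1}{n}\right)\sum_{j=2}^n EX_1X_j\\
&= \left[\left(1-\tfrac{1}{n}\right)^2 + \tfrac{n-1}{n^2}\right](\sigma^2+\mu^2) + \left[\tfrac{(n-1)(n-2)}{n^2} - \tfrac{2(n-1)}{n}\left(1-\tfrac{1}{n}\right)\right]\mu^2\\
&= \tfrac{n-1}{n}\,\sigma^2,
\end{align*}
[/tex]

since the ##\mu^2## terms cancel. Hence ##E\,\hat\sigma^2 = \frac{1}{n}\sum_i E(X_i-\bar{X})^2 = \frac{n-1}{n}\sigma^2##, which is exactly the bias noted above.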
 

FAQ: How to Show That the Method of Moments Estimators for the Normal Distribution Are Consistent

How do I use the method of moment estimators to find the parameters of a normal distribution?

The method of moment estimators for the normal distribution involves finding the sample moments (such as the mean and variance) from a given set of data and equating them to the theoretical moments of a normal distribution. Solving for the parameters using these equations will give you the estimated values for the mean and variance of the distribution.
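As a concrete sketch of the procedure (the data here are simulated, with made-up true parameters):

```python
# Method-of-moments estimates for a normal sample: equate the first two
# sample moments to the theoretical moments E[X] = mu, E[X^2] = sigma^2 + mu^2.
import random
random.seed(1)

data = [random.gauss(10.0, 2.0) for _ in range(1000)]  # true mu = 10, sigma^2 = 4

n = len(data)
m1 = sum(data) / n                 # first sample moment
m2 = sum(x * x for x in data) / n  # second sample moment

mu_hat = m1                        # mu = E[X]
sigma2_hat = m2 - m1 ** 2          # sigma^2 = E[X^2] - (E[X])^2

print(mu_hat, sigma2_hat)  # near 10 and 4
```

Note that ##m_2 - m_1^2## is algebraically the same as ##\frac{1}{n}\sum (x_i - \bar{x})^2##, the (biased) moment estimator of the variance.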

Is the method of moment estimators accurate for all sample sizes?

The accuracy of the method of moment estimators depends on the sample size. For larger sample sizes, the estimators tend to be more accurate and approach the true population parameters. However, for smaller sample sizes, the estimators may be less accurate and have a higher margin of error.
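A quick simulation makes this concrete (a sketch; the sample sizes and repetition count are arbitrary choices). Since ##\text{Var}(\bar{X}) = \sigma^2/n##, the spread of the mean estimate shrinks as n grows:

```python
# Sketch: the spread of the method-of-moments mean estimate shrinks with n,
# reflecting Var(xbar) = sigma^2 / n.
import random
random.seed(2)

def mean_estimate_spread(n, reps=2000, mu=0.0, sigma=1.0):
    """Empirical std. dev. of the sample mean over `reps` simulated datasets."""
    means = []
    for _ in range(reps):
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        means.append(sum(xs) / n)
    avg = sum(means) / reps
    return (sum((m - avg) ** 2 for m in means) / reps) ** 0.5

spread_small = mean_estimate_spread(10)    # about sigma / sqrt(10)   ~= 0.32
spread_large = mean_estimate_spread(1000)  # about sigma / sqrt(1000) ~= 0.032
print(spread_small, spread_large)
```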

Can the method of moment estimators be used for other distributions besides the normal?

Yes, the method of moment estimators can be used for other distributions as well. It involves equating sample moments to the theoretical moments of the desired distribution and solving for the parameters. However, the method may not always produce accurate estimators for all distributions.
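For instance, an exponential distribution with rate ##\lambda## has first moment ##E[X] = 1/\lambda##, so equating it to the sample mean gives ##\hat{\lambda} = 1/\bar{x}## (a sketch; the true rate here is an arbitrary choice):

```python
# Method of moments for an exponential distribution: E[X] = 1/lambda,
# so solving m1 = 1/lambda gives lambda_hat = 1 / xbar.
import random
random.seed(3)

true_rate = 2.5
data = [random.expovariate(true_rate) for _ in range(5000)]

xbar = sum(data) / len(data)
rate_hat = 1.0 / xbar
print(rate_hat)  # near 2.5
```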

How do I interpret the results from the method of moment estimators?

The results from the method of moment estimators give you the estimated values for the parameters of the normal distribution. The mean represents the center of the distribution, while the standard deviation (square root of the variance) represents the spread of the data. These values can be used to describe and make inferences about the population from which the sample was drawn.

What are the limitations of using the method of moment estimators?

The method of moment estimators relies on the assumption that the sample moments accurately represent the theoretical moments of the population. If this assumption is not met, the estimators may be biased and produce inaccurate results. Additionally, the method may not work well for distributions with heavy tails or outliers, as it is based on the first two moments of the distribution.
