Expectation of a Uniform distribution maximum likelihood estimator

In summary, the conversation discusses the determination of the maximum likelihood estimator for a Uniform distribution U(0,k) and the bias of this estimator. The expectation of the maximum is found by integrating against the density of the max, while the expectation of the second largest observation is raised as a follow-up question.
  • #1
artbio
Hi, I had this question on my last "Statistical Inference" exam, and I still have some doubts about it. I determined that the maximum likelihood estimator for a Uniform distribution U(0,k) is equal to the maximum value observed in the sample. That is correct; so say my textbooks. After that, the bias of the estimator was asked for, and to determine the bias I need to determine its expectation first. I am not sure whether that expectation is equal to k or k/2.

Is there any kind soul who can help?
Thanks.
 
  • #2
You can find the expectation by first writing down the density for the max, then proceeding with integration the same way you find any expectation. The answer is neither k nor k/2.
 
  • #3
So:

[tex]
E[max(x_1,...,x_n)]=\int_0^k \! max(x_1,...,x_n)\frac{1}{k} \, dx=\frac{1}{k}\int_0^k \!max(x_1,...,x_n)\, dx
[/tex]

Is this correct?

Now I have a problem: the "max" is itself a random variable, and I don't know its density function. How do I integrate this?
 
  • #4
Think about this (some steps are missing on purpose):

1) You are dealing with a random sample of size [tex] n [/tex], so the individual [tex] x_i [/tex] are independent

2) I'll call the maximum of the variables [tex] M [/tex] (non-standard, but it will work). The cumulative distribution function of the maximum is, by definition,

[tex]
F(t) \equiv P(M \le t)
[/tex]

3) If the maximum value is [tex] \le t [/tex], that means all of the variables are, so

[tex]
P(M \le t) = P(X_1 \le t \text{ and } X_2 \le t \text{ and } \dots \text{ and } X_n \le t)
[/tex]

What does independence tell you about how the statement immediately above can be simplified?

4) Once you have an expression for the distribution function, differentiate it w.r.t. [tex] t [/tex] to get the density - call it [tex] f(t) [/tex]

5) The expected value of the max is

[tex]
\int t f(t) \, dt
[/tex]

- integrate over the range suitable for the uniform distribution. The result will be an expression involving [tex] k [/tex] and [tex] n [/tex].
 
  • #5
Thanks man. You're awesome!

Let's see if I got it.

I will use the Greek letter [tex]\Theta[/tex] instead of a k because that was the one used in the original question.

1) Let [tex](X_1,X_2,...X_n)[/tex] be a random sample. The individual [tex]X_i[/tex] are independent identically distributed random variables that follow a Uniform distribution, [tex]X_i \sim U(0,\Theta)[/tex]

2) The cumulative distribution function of the maximum is, by definition:

[tex]F(t) \equiv P(M \le t)[/tex]

3) If the maximum value is [tex]\le t[/tex], that means all of the variables are, so:

[tex]P(M \le t) = P(X_1 \le t \; \cap \; X_2 \le t \; \cap \; \dots \; \cap \; X_n \le t)=\prod_{i=1}^n\!P(X_i \le t)[/tex]

The product follows because the individual [tex]X_i[/tex] are independent random variables.

4) The probability density of the above Uniform distribution is:

[tex]f(x)=\frac{1}{\Theta} \; , \; 0 \le x \le \Theta[/tex]

5) So its distribution function is:

[tex]F(t)=0 \; , \; t \le 0[/tex]
[tex]F(t)=\frac{t}{\Theta} \; , \; 0 < t \le \Theta[/tex]
[tex]F(t)=1 \; , \; t > \Theta[/tex]

6) So the product:

[tex]\prod_{i=1}^n\!P(X_i \le t)[/tex]

equals the product of n distribution functions defined in point 5), which yields:

[tex]F(t)=0 \; , \; t \le 0[/tex]
[tex]F(t)=(\frac{t}{\Theta})^n \; , \; 0 < t \le \Theta[/tex]
[tex]F(t)=1 \; , \; t > \Theta[/tex]

And the distribution of the maximum was found!

7) By differentiating the distribution function of the maximum one gets its density function:

[tex]f(t)=\frac{nt^{n-1}}{\Theta^n} \; , \; 0 \le t \le \Theta[/tex]

8) Finally the expected value of the max is:

[tex]E[max(X_1,X_2,...,X_n)]=E[M]=\int_{-\infty}^{+\infty}tf(t) \, dt=\int_0^{\Theta}t\frac{nt^{n-1}}{\Theta^n} \, dt=\frac{n}{\Theta^n}\int_0^{\Theta}t^n \, dt=\frac{n}{n+1}\Theta[/tex]

So the result is neither [tex]\Theta[/tex] nor [tex]\Theta \over 2[/tex] but a value that depends on the sample size and lies between these two. It is exactly [tex]\Theta \over 2[/tex] when the sample size equals one, and it tends to [tex]\Theta[/tex] as the sample size approaches infinity. This agrees with my original intuition, so I think it is correct.
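The derivation above can be sanity-checked numerically. This is a minimal Monte Carlo sketch (the parameter values are illustrative, not from the exam question):

```python
import random

def mean_of_max(n, theta, trials=200_000, seed=0):
    """Monte Carlo estimate of E[max(X_1, ..., X_n)] for X_i ~ U(0, theta)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += max(rng.uniform(0, theta) for _ in range(n))
    return total / trials

# Derived result: E[max] = n/(n+1) * theta.
n, theta = 5, 1.0
exact = n / (n + 1) * theta      # 0.8333...
approx = mean_of_max(n, theta)   # should land close to the exact value
```

With 200,000 trials the simulated mean typically agrees with n/(n+1)*theta to about three decimal places.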

What do you think?
 
  • #6
Looks good. Now you can find the bias.

Note: very soon after this type of problem (needing the distribution of the maximum in a particular case) it is customary to give a problem that requires determining the distribution of the MINIMUM value, so you may want to begin thinking about how you'd find the distribution function for that.
 
  • #7
Is the minimum the same except for
[tex]
\sum_{i=1}^n\!P(X_i \le t)
[/tex]
instead of the
[tex]
\prod_{i=1}^n\!P(X_i \le t)
[/tex]
which would make
[tex]
F_{M}(t) = n\frac{t}{\theta}
[/tex]
... ?
 
  • #8
No. If [itex] X_1 [/itex] is the minimum, setting up

[tex]
P(X_1 \le t)
[/tex]

won't help you (for any underlying distribution). Try starting with

[tex]
P(X_1 > t)
[/tex]

and then ask: if the minimum is larger than some value, what can I conclude about
all the rest of the values (compared to that same [tex] t [/tex])?
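The hint can be checked numerically: by independence, P(min > t) = P(all X_i > t) = (1 - t/θ)^n for X_i ~ U(0, θ). A minimal simulation sketch (the values of n, θ, and t below are illustrative):

```python
import random

def survival_of_min(n, theta, t, trials=100_000, seed=1):
    """Monte Carlo estimate of P(min(X_1, ..., X_n) > t) for X_i ~ U(0, theta)."""
    rng = random.Random(seed)
    hits = sum(
        1 for _ in range(trials)
        if min(rng.uniform(0, theta) for _ in range(n)) > t
    )
    return hits / trials

n, theta, t = 3, 1.0, 0.25
exact = (1 - t / theta) ** n           # 0.75**3 = 0.421875
approx = survival_of_min(n, theta, t)  # should land close to the exact value
```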
 
  • #9
Aha, yes
I think I get it now :)
Thank you
 
  • #10
This thread has been very helpful to me. I have a question: I was asked to find the bias of the jackknife estimator for the uniform distribution, and I managed to get the expectation. I now have to find the expectation of the second largest observation, that is [tex]X_{(n-1)}[/tex]. I proceeded in the same manner, but could not get the required answer. Do I have to use conditional probability?
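For reference, the standard order-statistic result for U(0, θ) is E[X_(k)] = kθ/(n+1), so the second largest observation has expectation (n-1)θ/(n+1); it follows from the density of the (n-1)-th order statistic rather than from conditional probability. A hedged Monte Carlo sketch of that target value (parameter values are illustrative):

```python
import random

def mean_second_largest(n, theta, trials=200_000, seed=2):
    """Monte Carlo estimate of E[X_(n-1)], the second largest of n draws from U(0, theta)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        draws = sorted(rng.uniform(0, theta) for _ in range(n))
        total += draws[-2]           # second largest observation
    return total / trials

n, theta = 5, 1.0
exact = (n - 1) * theta / (n + 1)    # 4/6 = 0.6666...
approx = mean_second_largest(n, theta)
```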
 

FAQ: Expectation of a Uniform distribution maximum likelihood estimator

1. What is the "Expectation of a Uniform distribution maximum likelihood estimator"?

The maximum likelihood estimator of the parameter k in a Uniform distribution U(0,k) is the sample maximum. Its expectation is the mean value this estimator takes over repeated samples, and it is the quantity needed to compute the estimator's bias.

2. How is the "Expectation of a Uniform distribution maximum likelihood estimator" calculated?

First derive the density of the sample maximum (for U(0,k) and a sample of size n it is f(t) = nt^(n-1)/k^n on [0,k]), then integrate t multiplied by that density over the range of the parameter. The result is nk/(n+1).

3. Why is the "Expectation of a Uniform distribution maximum likelihood estimator" important?

Because it reveals that the estimator is biased: its expectation is nk/(n+1), which is smaller than the true value k. Knowing the expectation lets you quantify the bias, -k/(n+1), and correct for it, for example by using (n+1)/n times the sample maximum, which is unbiased.

4. What is the relationship between the "Expectation of a Uniform distribution maximum likelihood estimator" and the sample mean?

The maximum likelihood estimator here is the sample maximum, not the sample mean. The sample mean estimates k/2, so twice the sample mean is an alternative unbiased (method-of-moments) estimator of k, but it generally has a larger variance than the bias-corrected maximum.

5. How does the sample size affect the "Expectation of a Uniform distribution maximum likelihood estimator"?

The expectation is nk/(n+1), so it does depend on the sample size: it increases toward the true value k as n grows, and the bias shrinks in magnitude like k/(n+1). For n = 1 the expectation is k/2.
