Maximum Likelihood Estimators for Uniform Distribution

In summary: if $\theta$ is bigger than the maximum value in the sample, the likelihood is smaller than it would be if $\theta$ were equal to that maximum; hence the maximum likelihood estimator is $\hat\theta = \max_i X_i$.
  • #1
Julio1
Find maximum likelihood estimators of a sample of size $n$ if $X\sim U(0,\theta].$

Hello MHB :)! Can anyone help me, please? I don't know how to proceed...
 
  • #2
Julio said:
Find maximum likelihood estimators of a sample of size $n$ if $X\sim U(0,\theta].$

Hello MHB :)! Can anyone help me, please? I don't know how to proceed...

Hi Julio!

We want to maximize the likelihood that some $\theta$ is the right one.

The likelihood that a certain $\theta$ is the right one given a random sample is:
$$\mathcal L(\theta; x_1, ..., x_n) = f(x_1|\theta) \times f(x_2|\theta) \times ... \times f(x_n|\theta)$$
where $f$ is the probability density function.

Since $X\sim U(0,\theta]$, $f$ is given by:
$$f(x|\theta)=\begin{cases}\frac 1 \theta&\text{if }0 < x \le \theta \\ 0 &\text{otherwise}\end{cases}$$
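To make the product of densities concrete, here is a minimal Python sketch (with a hypothetical sample; not part of the original exchange) that evaluates $\mathcal L(\theta)$ directly from the definition above:

```python
def uniform_density(x, theta):
    """Density of U(0, theta]: 1/theta on (0, theta], and 0 otherwise."""
    return 1.0 / theta if 0 < x <= theta else 0.0

def likelihood(theta, sample):
    """L(theta; x_1, ..., x_n) = f(x_1|theta) * ... * f(x_n|theta)."""
    L = 1.0
    for x in sample:
        L *= uniform_density(x, theta)
    return L

sample = [0.8, 2.3, 1.1, 1.9]   # hypothetical sample, n = 4
# If theta >= max(sample), every factor is 1/theta, so L = theta**(-n).
# If theta < max(sample), at least one factor is 0, so L = 0.
print(likelihood(2.3, sample))   # equals 2.3**(-4)
print(likelihood(2.0, sample))   # 0.0, since 2.3 > 2.0
```

Note how any $\theta$ smaller than the largest observation kills the whole product.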

Can you tell for which $\theta$ the likelihood will be at its maximum?
 
  • #3
Thanks, I like Serena :).

Good, so we have that the likelihood function is $L(\theta)=\dfrac{1}{\theta^n}.$ Taking logarithms,

$\ln (L(\theta))=\ln\left(\dfrac{1}{\theta^n}\right)=-n\ln(\theta).$ Now, differentiating with respect to $\theta$, we get $\dfrac{\partial}{\partial \theta}(\ln L(\theta))=\dfrac{\partial}{\partial \theta}(-n\ln(\theta))=-\dfrac{n}{\theta}.$ Setting this equal to zero gives $-\dfrac{n}{\theta}=0$, i.e., $n=0$, which is impossible.

But why does the parameter $\theta$ drop out? :( Does that mean there is no estimator for $\theta$?
 
  • #4
You're welcome Julio!

What's missing in your approach is that it doesn't take into account that the density is piecewise.
So we need to inspect what happens at the boundaries.

Note that any $x_i$ in the sample has to satisfy $x_i \le \theta$, because otherwise its density is $0$, which would make the whole likelihood $0$.
So $\theta$ has to be at least the maximum value in the sample.
What happens to the likelihood if $\theta$ is bigger than that maximum value?
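The boundary argument above can be checked numerically; this is a small Python sketch (with a hypothetical sample) showing that for $\theta \ge \max_i x_i$ the likelihood $\theta^{-n}$ strictly decreases, so the maximum sits exactly at the sample maximum:

```python
sample = [0.8, 2.3, 1.1, 1.9]   # hypothetical sample
n = len(sample)
m = max(sample)                 # candidate MLE: the largest observation

# For theta >= m the likelihood is theta**(-n), a strictly decreasing
# function of theta, so pushing theta past m only loses likelihood.
thetas = [m, m + 0.5, m + 1.0, m + 2.0]
values = [t ** (-n) for t in thetas]
assert values == sorted(values, reverse=True)

print("MLE estimate:", m)       # theta_hat = max(sample)
```

This is why the calculus-only approach fails: the maximum is attained at a boundary of the support constraint, not at a stationary point.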
 

FAQ: Maximum Likelihood Estimators for Uniform Distribution

What is a Maximum Likelihood Estimator (MLE)?

A Maximum Likelihood Estimator (MLE) is a statistical method used to estimate the parameters of a probability distribution. It is based on the principle of maximum likelihood, which states that the most likely values of the parameters are those that make the observed data most probable.

How is a Maximum Likelihood Estimator calculated?

A Maximum Likelihood Estimator is calculated by finding the values of the parameters that maximize the likelihood function, which is a function of the parameters and the observed data. This can be done analytically or numerically through optimization methods.
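As an illustration of the numerical route, here is a hedged Python sketch that maximizes a log-likelihood by crude grid search, using a hypothetical exponential-rate example (not the uniform case above) where the analytic MLE $\hat\lambda = n/\sum x_i$ is available for comparison:

```python
import math

def exp_loglik(lam, sample):
    """Log-likelihood of an Exponential(rate=lam) model for the sample."""
    return len(sample) * math.log(lam) - lam * sum(sample)

sample = [0.5, 1.2, 0.3, 2.0, 0.9]   # hypothetical data
# Crude grid search over candidate rates in (0, 5).
grid = [i / 1000 for i in range(1, 5000)]
lam_hat = max(grid, key=lambda lam: exp_loglik(lam, sample))

# The analytic MLE for the exponential rate is n / sum(x_i).
print("grid search:", lam_hat, "analytic:", len(sample) / sum(sample))
```

In practice one would use a proper optimizer rather than a grid, but the idea is the same: evaluate the (log-)likelihood over candidate parameters and keep the best.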

What are the assumptions of Maximum Likelihood Estimators?

The main assumptions of Maximum Likelihood Estimators are that the data is independent and identically distributed, and that the probability distribution being used to model the data is the true distribution.

What are the advantages of using Maximum Likelihood Estimators?

The advantages of using Maximum Likelihood Estimators include their consistency, efficiency, and asymptotic normality. Note, however, that MLEs are not unbiased in general — they are only asymptotically unbiased; the uniform case above is an example, since $\hat\theta = \max_i X_i$ systematically underestimates $\theta$ in finite samples.

When is Maximum Likelihood Estimation not appropriate?

Maximum Likelihood Estimation may not be appropriate when the assumptions of the method are violated, such as when the assumed probability model is misspecified, when the likelihood is unbounded, or when the sample is small or contaminated by outliers. In these cases, alternative estimation methods may be more suitable.
