Maximum Likelihood Estimation

  • #1
mathmari
Hey! :eek:

I am looking at the likelihood function. I have understood how we define that function, but once we have found the maximum of the likelihood function, what is the meaning of that?

What information do we have, when we have found the $\theta$ for which the likelihood function $L$ is maximized?

(Wondering)
 
  • #2
mathmari said:
Hey! :eek:

I am looking at the likelihood function. I have understood how we define that function, but once we have found the maximum of the likelihood function, what is the meaning of that?

What information do we have, when we have found the $\theta$ for which the likelihood function $L$ is maximized?

Hey mathmari! (Smile)

In statistics we typically try to find a probability distribution that describes a population.
It helps us to understand the population, and moreover to make predictions.
Often enough that's a normal distribution with a mean value and a standard deviation.
It's usually fairly straightforward to find an approximation for those particular parameters.

More generally, a population is described by an assumed distribution combined with the parameters that define it, such as a normal distribution characterized by $\mu$ and $\sigma$. We can also have different distributions and parameters, and it may not be so straightforward to find them.
Maximizing the likelihood function means we find the most likely values of the parameters given an assumed distribution and the observed data. (Thinking)
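As a small sketch of the normal case mentioned above (the data values are made up for illustration), the MLE has a closed form: the sample mean and the standard deviation with a $1/n$ divisor:

```python
import math

def normal_mle(data):
    """Closed-form MLE for a normal model: the sample mean, and the
    standard deviation with a 1/n (not 1/(n-1)) divisor."""
    n = len(data)
    mu = sum(data) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in data) / n)
    return mu, sigma

mu, sigma = normal_mle([4.8, 5.1, 5.0, 4.9, 5.2])  # hypothetical measurements
```

These are exactly the values of $\mu$ and $\sigma$ that maximize the normal likelihood for the given sample.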
 
  • #3
I like Serena said:
In statistics we typically try to find a probability distribution that describes a population.
It helps us to understand the population, and moreover to make predictions.
Often enough that's a normal distribution with a mean value and a standard deviation.
It's usually fairly straightforward to find an approximation for those particular parameters.

More generally, a population is described by an assumed distribution combined with the parameters that define it, such as a normal distribution characterized by $\mu$ and $\sigma$. We can also have different distributions and parameters, and it may not be so straightforward to find them.
Maximizing the likelihood function means we find the most likely values of the parameters given an assumed distribution and the observed data. (Thinking)

Let's consider a specific example.

An employee starts work around 8:00 am. The actual beginning of duty varies by up to $2$ minutes in either direction. We have the following:

$X$ = Beginning of duty (difference from 8 o'clock in minutes):
\begin{equation*}\begin{array}{c|cccc} x & -2 & -1 & 1 & 2 \\ \hline \mathbb{P}(X=x) & 0.2\,\theta & 0.3\,\theta & 0.5\,\theta & 1-\theta \end{array}\end{equation*}
For $10$ consecutive working days, the following values have occurred:
\begin{equation*}-1, \ \ \ 2, \ -2, \ -2, \ \ \ 1, \ \ \ 1, \ \ \ 2, \ -1, \ \ \ 1, \ -1\end{equation*}

From this information we get the likelihood function:
\begin{equation*}L(-1, 2, -2, -2, 1, 1, 2, -1, 1, -1 \mid \theta ) = 0.000135\cdot \theta^8\cdot (1-\theta)^2 \end{equation*}

The maximum Likelihood estimator is $\hat{\theta}=\frac{4}{5}$.
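(For reference, this value follows from the standard step of setting the derivative of the log-likelihood to zero:
\begin{equation*}\ln L = \ln(0.000135) + 8\ln\theta + 2\ln(1-\theta), \qquad \frac{d\ln L}{d\theta} = \frac{8}{\theta} - \frac{2}{1-\theta} = 0 \;\Rightarrow\; \hat{\theta} = \frac{8}{10} = \frac{4}{5}.)\end{equation*}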

Do we calculate each probability $\mathbb{P}(X=x_i)$ with this value of $\theta$, and then we have the predictions for what time the duty begins? (Wondering)
 
  • #4
mathmari said:
Do we calculate each probability $\mathbb{P}(X=x_i)$ with this value of $\theta$, and then we have the predictions for what time the duty begins? (Wondering)

Yes.
Now we have found the most likely distribution based on an assumed distribution with an unknown parameter.
To be fair, it looks like a somewhat unrealistic distribution. Then again, it's just an example of how maximum likelihood works. (Thinking)
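As a quick sketch (the function names are mine), the whole computation can be checked numerically: maximize the likelihood over a grid of $\theta$ values, then plug the estimate back into the assumed pmf to get the predicted probabilities:

```python
# Duty-time example: numerically maximize the likelihood, then plug the
# estimate back into the assumed pmf to get the predicted probabilities.
data = [-1, 2, -2, -2, 1, 1, 2, -1, 1, -1]

def pmf(x, theta):
    # the assumed distribution from the table in the thread
    return {-2: 0.2 * theta, -1: 0.3 * theta, 1: 0.5 * theta, 2: 1 - theta}[x]

def likelihood(theta):
    L = 1.0
    for x in data:
        L *= pmf(x, theta)
    return L

# crude grid search over (0, 1); the grid contains 0.8 exactly
theta_hat = max((t / 1000 for t in range(1, 1000)), key=likelihood)
predictions = {x: pmf(x, theta_hat) for x in (-2, -1, 1, 2)}
```

The grid search recovers $\hat{\theta} = 0.8 = \tfrac{4}{5}$, and `predictions` then gives the estimated probabilities for each starting time, e.g. $\mathbb{P}(X=1) = 0.5 \cdot 0.8 = 0.4$.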
 

FAQ: Maximum Likelihood Estimation

What is Maximum Likelihood Estimation (MLE)?

Maximum Likelihood Estimation is a statistical method used to estimate the parameters of a probability distribution by maximizing the likelihood function. It is based on the principle that the best estimate for the parameters is the one that makes the observed data most probable.

How is MLE different from other estimation methods?

MLE differs from other estimation methods in that it works through the likelihood function, which takes the full assumed model into account rather than only summary statistics such as the sample mean or median. Under standard regularity conditions, this makes MLE an asymptotically efficient method for estimating parameters.

How is MLE used in scientific research?

MLE is commonly used in scientific research to estimate the parameters of a statistical model, such as in genetics, ecology, and epidemiology. It is also used in machine learning and artificial intelligence for parameter estimation in various models.

What are the assumptions of MLE?

The main assumptions of MLE are that the data follows a known probability distribution and that the observations are independent and identically distributed (IID). Additionally, the desirable asymptotic properties of MLE (consistency, efficiency) require a sufficiently large sample, and the estimates can be sensitive to outliers or influential observations.

How do you determine the maximum likelihood estimate?

The maximum likelihood estimate is determined by finding the parameter values that maximize the likelihood function. This can be done analytically by taking the derivative of the likelihood function and setting it equal to zero, or numerically using optimization algorithms such as gradient descent.
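Both routes described above can be illustrated with a small sketch (the Bernoulli data here is made up): the analytic route sets the derivative of the log-likelihood to zero, which for a Bernoulli model gives the sample mean; a numerical grid search over the log-likelihood should agree:

```python
import math

# Hypothetical Bernoulli sample (coin flips); the analytic MLE from
# setting dlogL/dp = 0 is the sample mean.
data = [1, 0, 1, 1, 0, 1, 1, 1]

def log_likelihood(p):
    return sum(math.log(p if x == 1 else 1 - p) for x in data)

analytic = sum(data) / len(data)  # sample mean
numeric = max((t / 10000 for t in range(1, 10000)), key=log_likelihood)
```

Maximizing the log-likelihood instead of the likelihood itself is the usual choice in practice, since the logarithm turns products into sums and avoids numerical underflow, while leaving the location of the maximum unchanged.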
