Maximum Likelihood Estimation Formula Simplification and Solution for p?

In summary, the conversation discusses a maximum likelihood estimation problem from probability theory and statistics. The likelihood involves a parameter $p$ and a given probability distribution, and the goal is to find the maximum likelihood estimate of $p$, i.e. the value that makes the observed sample most likely. The conversation also states the underlying probability distribution and notes that its sum over $k$ converges to $1$.
  • #1
SheepLon
Hey guys!

My mother language is not English by the way. Sorry for spelling and gramme. :)

I'm curious to see whether you can help me with my problem. I have already tried for almost a week and have not reached a solution. I also know that maximum likelihood estimation belongs to statistics and probability theory, but since my question is about transforming the formula, I used the analysis forum.

The Maximum likelihood estimation formula is the following:

$L(\theta = p) = \prod\limits_{i=1}^N \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos^{k_i-1}{\left(\frac{m \pi}{r}\right)} \cdot \sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}}+ \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos^{k_i-1}{\left(\frac{m \pi}{r}\right)} \cdot \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k_i+b}{2}} \cdot (1-p)^{\frac{k_i-b}{2}} \right]$

$= \prod\limits_i \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos^{k_i-1}{\left(\frac{m \pi}{r}\right)} \cdot \left(\sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}} + \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k_i+b}{2}} \cdot (1-p)^{\frac{k_i-b}{2}} \right)\right]$

where $p \in [0,1]$, $0^0:=1$, and $a,b \in \mathbb{N}\backslash\{0\}$ with $a+b = r$.

I'm looking forward to your ideas.
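In case it helps the discussion, here is a minimal numerical sketch of this maximization in Python. The constants $a$, $b$ and the sample $k_i$ below are made-up illustrative values, not from my actual data:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Illustrative assumptions only: a, b, r and the sample k_i are made up.
a, b = 2, 3
r = a + b
k = np.array([4, 6, 8, 6])  # hypothetical observed values k_i

def P_k(k_i, p):
    """Single-observation probability from the formula above."""
    m = np.arange(1, r)
    common = (1 / r) * 2**k_i * np.sin(m * np.pi / r) * np.cos(m * np.pi / r)**(k_i - 1)
    term_a = np.sin(a * m * np.pi / r) * p**((k_i - a) / 2) * (1 - p)**((k_i + a) / 2)
    term_b = np.sin(b * m * np.pi / r) * p**((k_i + b) / 2) * (1 - p)**((k_i - b) / 2)
    return np.sum(common * (term_a + term_b))

def neg_log_likelihood(p):
    # Maximizing L(p) = prod_i P_{k_i}(p) equals minimizing -sum_i ln P_{k_i}(p).
    return -sum(np.log(P_k(k_i, p)) for k_i in k)

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6), method="bounded")
print("numerical MLE:", res.x)
```

That at least gives a number to compare any closed-form candidate against.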

"Hints":
- I exclude first the $p^{\frac{k_i}{2}} \cdot (1-p)^{\frac{k_i}{2}}$
That gave me at least another product without $m$, that I was able to pull out of the sum. However the other $p$'s I was not able to pull out.
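Written out, that factoring step gives, for $0 < p < 1$ and with $q := \frac{1-p}{p}$,

$L(\theta = p) = \left[\prod\limits_{i=1}^N \frac{2^{k_i}}{r} \cdot \left(p(1-p)\right)^{\frac{k_i}{2}}\right] \cdot \prod\limits_{i=1}^N \sum\limits_{m=1}^{r-1} \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos^{k_i-1}{\left(\frac{m \pi}{r}\right)} \cdot \left(\sin{\left(\frac{am \pi}{r}\right)} \cdot q^{\frac{a}{2}} + \sin{\left(\frac{bm \pi}{r}\right)} \cdot q^{-\frac{b}{2}}\right)$

and since the powers of $q$ do not depend on $m$, each inner sum splits: with the $p$-free constants $c_i := \sum\limits_{m=1}^{r-1} \sin{\left(\frac{m \pi}{r}\right)} \cos^{k_i-1}{\left(\frac{m \pi}{r}\right)} \sin{\left(\frac{am \pi}{r}\right)}$ and $d_i$ defined analogously with $b$, this becomes

$L(\theta = p) = \left[\prod\limits_{i=1}^N \frac{2^{k_i}}{r} \cdot \left(p(1-p)\right)^{\frac{k_i}{2}}\right] \cdot \prod\limits_{i=1}^N \left( c_i \, q^{\frac{a}{2}} + d_i \, q^{-\frac{b}{2}} \right).$

So inside each factor, $p$ enters only through the single ratio $q$; whether the resulting derivative equation has a closed-form root still depends on the $c_i$ and $d_i$.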

**********************************************************************

A second, very similar maximum likelihood estimation that I need is the following:

$L(\theta = p) = \prod\limits_{i=1}^N \left[\sum\limits_{m = 1}^{r-1} \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos^{k_i-1}{\left(\frac{m \pi}{r}\right)} \cdot \sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}} + \sum\limits_{m = 1}^{r-1} \frac{1}{r} \cdot 2^{h_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos^{h_i-1}{\left(\frac{m \pi}{r}\right)} \cdot \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{h_i+b}{2}} \cdot (1-p)^{\frac{h_i-b}{2}} \right]$

where $p \in [0,1]$, $0^0:=1$, $a,b \in \mathbb{N}\backslash\{0\}$ with $a+b = r$.
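The same numerical approach should carry over; a sketch under the same made-up assumptions, now with a second hypothetical sample $h_i$:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Same illustrative assumptions as in the first sketch; h_i is a second made-up sample.
a, b = 2, 3
r = a + b
k = np.array([4, 6, 8, 6])  # hypothetical k_i
h = np.array([5, 7, 5, 9])  # hypothetical h_i

def factor(k_i, h_i, p):
    """One factor of the second likelihood: the k_i sum plus the h_i sum over m."""
    m = np.arange(1, r)
    term_a = np.sum((1 / r) * 2**k_i * np.sin(m * np.pi / r) * np.cos(m * np.pi / r)**(k_i - 1)
                    * np.sin(a * m * np.pi / r) * p**((k_i - a) / 2) * (1 - p)**((k_i + a) / 2))
    term_b = np.sum((1 / r) * 2**h_i * np.sin(m * np.pi / r) * np.cos(m * np.pi / r)**(h_i - 1)
                    * np.sin(b * m * np.pi / r) * p**((h_i + b) / 2) * (1 - p)**((h_i - b) / 2))
    return term_a + term_b

neg_ll = lambda p: -sum(np.log(factor(k_i, h_i, p)) for k_i, h_i in zip(k, h))
res = minimize_scalar(neg_ll, bounds=(1e-6, 1 - 1e-6), method="bounded")
print("numerical MLE for the second model:", res.x)
```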
 
  • #2
What, exactly, is the question? To calculate a "maximum likelihood estimate", we have to be given a probability distribution depending on some parameter as well as a specific sample from that distribution. The "maximum likelihood estimate" of the parameter is the value that makes that specific sample most likely. I see none of those here. Instead you give a "formula" depending on a number of things with no explanation of what they are or what they mean.

"My mother language is not English by the way. Sorry for spelling and gramme."
Actually your only misspelling is of "grammar"!
 
  • #3
Thank you for the quick response. I thought you could read off the parameter for the ML estimator directly. I would like to determine $p$ from the observed values $k_i$ so that the observation is as plausible as possible. I have already set up the ML estimator above. The probability distribution for the first example is the following formula:

$P_k = \mathbb{P}\left(S_k \in \{0,r\}\right) = \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos^{k-1}{\left(\frac{m \pi}{r}\right)} \cdot \sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k-a}{2}} \cdot (1-p)^{\frac{k+a}{2}}+ \frac{1}{r} \cdot 2^{k} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos^{k-1}{\left(\frac{m \pi}{r}\right)} \cdot \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k+b}{2}} \cdot (1-p)^{\frac{k-b}{2}} \right]$

$= \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos^{k-1}{\left(\frac{m \pi}{r}\right)} \cdot \left(\sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k-a}{2}} \cdot (1-p)^{\frac{k+a}{2}}+ \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k+b}{2}} \cdot (1-p)^{\frac{k-b}{2}} \right)\right]$

where $p \in [0,1]$, $0^0:=1$, $a,b \in \mathbb{N}\backslash\{0\}$ with $a+b = r$, and $k \in \mathbb{N}\backslash\{0\}$. The sum over $k$ from $1$ to $\infty$ converges to $1$.
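To illustrate the normalization claim, here is a quick numerical check; the values of $a$, $b$ and $p$ are again made up, and the infinite sum is truncated:

```python
import numpy as np

# Illustrative assumptions: a, b and p are made up; truncate the sum at K.
a, b = 2, 3
r = a + b
p = 0.4
K = 200  # terms decay geometrically since |2*cos(m*pi/r)*sqrt(p*(1-p))| < 1

def P_k(k, p):
    """P_k from the formula above, for a single k."""
    m = np.arange(1, r)
    common = (1 / r) * 2**k * np.sin(m * np.pi / r) * np.cos(m * np.pi / r)**(k - 1)
    term_a = np.sin(a * m * np.pi / r) * p**((k - a) / 2) * (1 - p)**((k + a) / 2)
    term_b = np.sin(b * m * np.pi / r) * p**((k + b) / 2) * (1 - p)**((k - b) / 2)
    return np.sum(common * (term_a + term_b))

print(sum(P_k(k, p) for k in range(1, K + 1)))  # should print roughly 1.0
```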

Does that make any sense to you? Do you know what I mean? : )
 
  • #4
Is this too hard, or does someone have an idea? :)
 
  • #5
You still haven't given us a complete problem from which we can compute an estimate. In addition to a probability distribution containing a parameter, we must be given a specific sample. Then the "maximum likelihood estimate" for the parameter is the value of the parameter that makes that sample "most likely", i.e. gives it the highest probability.
 
  • #6
Sorry, but I don't see what further information is needed. In my opinion we have everything required to do an ML estimation for $p$. The solution should depend on the sample I get for all my $k$'s and on the values $a$ and $b$, which are fixed.

So you don't need more information. I have the ML equation, which needs to be simplified, differentiated, and solved for $p$.
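For completeness, the differentiation step would look like this: writing $\ell(p) = \ln L(p) = \sum\limits_{i=1}^N \ln P_{k_i}(p)$, the ML estimate solves the score equation

$\ell'(p) = \sum\limits_{i=1}^N \frac{P_{k_i}'(p)}{P_{k_i}(p)} = 0, \qquad p \in (0,1).$

Given the trigonometric sums inside $P_k$, a closed-form solution for $p$ seems unlikely, so solving this equation numerically (or maximizing $\ell$ directly, as sketched earlier in the thread) may be the practical route.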
 

FAQ: Maximum Likelihood Estimation Formula Simplification and Solution for p?

What is maximum likelihood estimation?

Maximum likelihood estimation is a statistical method used to estimate the parameters of a probability distribution by finding the values that make the observed data most likely to occur. It is based on the principle that the best estimate of a parameter is the one that maximizes the likelihood of the observed data.

How is maximum likelihood estimation different from other estimation methods?

Maximum likelihood estimation differs from other estimation methods in that it uses the full probability model of the data rather than only summary statistics (as, for example, the method of moments does). It also extends naturally to multiple parameters and can be used for both discrete and continuous data.

What are the assumptions of maximum likelihood estimation?

The main assumptions of maximum likelihood estimation are that the data are independent and identically distributed and that the probability distribution used to model the data is the correct one. Additionally, the sample should be sufficiently large and the parameters being estimated should be identifiable.

How is the likelihood function used in maximum likelihood estimation?

The likelihood function is used in maximum likelihood estimation to calculate the probability of observing the data given a set of parameter values. The goal is to find the parameter values that maximize this likelihood function, as they will be the most likely values to have produced the observed data.
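A standard worked example: for $n$ independent Bernoulli trials with $s$ successes, the likelihood is $L(p) = p^{s} \cdot (1-p)^{n-s}$; setting

$\frac{d}{dp} \ln L(p) = \frac{s}{p} - \frac{n-s}{1-p} = 0$

yields the maximum likelihood estimate $\hat{p} = \frac{s}{n}$.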

What are the advantages and disadvantages of maximum likelihood estimation?

The main advantages of maximum likelihood estimation are that it is widely applicable and provides consistent, asymptotically efficient estimates. However, it also has limitations: it can be sensitive to outliers, its guarantees are asymptotic and may require large samples, and it relies on correct specification of the probability distribution, which may not always be known.
