Maximum Likelihood Estimation Formula Simplification and Solution for p?

  • #1
SheepLon
Hey guys!

My mother language is not English by the way. Sorry for spelling and gramme. :)

I'm curious to see whether you can help me with my problem. I have already tried for almost a week and have not reached a solution. I also know that maximum likelihood estimation belongs to statistics and probability theory, but since my question is about transforming a formula, I posted it in the analysis forum.

The likelihood function for the maximum likelihood estimation is the following:

$L(\theta = p) = \prod\limits_{i=1}^N \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k_i-1} \cdot \sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}}+ \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k_i-1} \cdot \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k_i+b}{2}} \cdot (1-p)^{\frac{k_i-b}{2}} \right]$

$= \prod\limits_i \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k_i-1} \cdot \left(\sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}} + \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k_i+b}{2}} \cdot (1-p)^{\frac{k_i-b}{2}} \right)\right]$

where $p \in [0,1]$, $0^0:=1$, and $a,b \in \mathbb{N}\backslash\{0\}$ with $a+b = r$.

I'm looking forward to your ideas.

"Hints":
- I exclude first the $p^{\frac{k_i}{2}} \cdot (1-p)^{\frac{k_i}{2}}$
That gave me at least another product without $m$, that I was able to pull out of the sum. However the other $p$'s I was not able to pull out.
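
To make that factoring explicit (my own write-up of the hint; the abbreviation $q$ is mine, not from the original post): with $q := \sqrt{\frac{1-p}{p}}$ one has

$p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}} = p^{\frac{k_i}{2}} (1-p)^{\frac{k_i}{2}} \cdot q^{a} \quad \text{and} \quad p^{\frac{k_i+b}{2}} \cdot (1-p)^{\frac{k_i-b}{2}} = p^{\frac{k_i}{2}} (1-p)^{\frac{k_i}{2}} \cdot q^{-b},$

so the likelihood becomes

$L(p) = \prod\limits_{i=1}^N p^{\frac{k_i}{2}} (1-p)^{\frac{k_i}{2}} \cdot \frac{2^{k_i}}{r} \sum\limits_{m=1}^{r-1} \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k_i-1} \cdot \left( \sin{\left(\frac{am \pi}{r}\right)} q^{a} + \sin{\left(\frac{bm \pi}{r}\right)} q^{-b} \right),$

which shows why the remaining $p$-dependence (through $q$) cannot simply be pulled out of the sum over $m$: $q$ enters with different exponents in the $a$-term and the $b$-term.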

**********************************************************************

A second, very similar maximum likelihood estimation that I need is the following:
$L(\theta = p) = \prod\limits_{i=1}^N \left[\sum\limits_{m = 1}^{r-1} \frac{1}{r} \cdot 2^{k_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k_i-1} \cdot \sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k_i-a}{2}} \cdot (1-p)^{\frac{k_i+a}{2}} + \sum\limits_{m = 1}^{r-1} \frac{1}{r} \cdot 2^{h_i} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{h_i-1} \cdot \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{h_i+b}{2}} \cdot (1-p)^{\frac{h_i-b}{2}} \right]$

where $p \in [0,1]$, $0^0:=1$, and $a,b \in \mathbb{N}\backslash\{0\}$ with $a+b = r$.
 
  • #2
What, exactly, is the question? To calculate a "maximum likelihood estimate", we have to be given a probability distribution depending on some parameter as well as a specific sample of that distribution. The "maximum likelihood estimate" of the parameter is the value that makes that specific sample most likely. I see none of those here. Instead you give a "formula" depending on a number of things with no explanation of what they are or what they mean.
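
As a concrete illustration (a standard textbook example, not taken from this thread): if a coin with unknown parameter $p$ shows heads $s$ times in $n$ independent flips, the likelihood of that specific sample is $L(p) = p^{s}(1-p)^{n-s}$, and solving $\frac{d}{dp} \log L(p) = \frac{s}{p} - \frac{n-s}{1-p} = 0$ gives the estimate $\hat{p} = \frac{s}{n}$. That is the kind of complete setup, distribution plus sample, needed here.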

"My mother language is not English by the way. Sorry for spelling and gramme."
Actually your only misspelling is of "grammar"!
 
  • #3
Thank you for the quick response. I thought the parameter for the ML estimator could be read off directly. I would like to determine $p$ from observed values $k_i$ so that the observation is as plausible as possible. I have already set up the ML estimator. The probability distribution for the first example is the following formula:

$P_k = \mathbb{P}\left(S_k \in \{0,r\}\right) = \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k-1} \cdot \sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k-a}{2}} \cdot (1-p)^{\frac{k+a}{2}}+ \frac{1}{r} \cdot 2^{k} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k-1} \cdot \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k+b}{2}} \cdot (1-p)^{\frac{k-b}{2}} \right]$

$= \sum\limits_{m = 1}^{r-1}\left[ \frac{1}{r} \cdot 2^{k} \cdot \sin{\left(\frac{m \pi}{r}\right)} \cdot \cos{\left(\frac{m \pi}{r}\right)}^{k-1} \cdot \left(\sin{\left(\frac{am \pi}{r}\right)} \cdot p^{\frac{k-a}{2}} \cdot (1-p)^{\frac{k+a}{2}}+ \sin{\left(\frac{bm \pi}{r}\right)} \cdot p^{\frac{k+b}{2}} \cdot (1-p)^{\frac{k-b}{2}} \right)\right]$

where $p \in [0,1]$, $0^0:=1$, $a,b \in \mathbb{N}\backslash\{0\}$ with $a+b = r$, and $k \in \mathbb{N}\backslash\{0\}$. The sum over $k$ from $1$ to $\infty$ converges to $1$.
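
As a numerical sanity check of that convergence claim, here is a minimal sketch (my own code, not from the thread; the function name `P_k`, the truncation at $k=500$, and the values $r=5$, $a=2$, $b=3$, $p=0.4$ are illustrative assumptions):

```python
import numpy as np

def P_k(k, p, a, b, r):
    """P(S_k in {0, r}) at step k, as given by the formula above."""
    m = np.arange(1, r)  # m = 1, ..., r-1
    common = (1.0 / r) * 2.0**k * np.sin(m * np.pi / r) * np.cos(m * np.pi / r)**(k - 1)
    term_a = np.sin(a * m * np.pi / r) * p**((k - a) / 2) * (1 - p)**((k + a) / 2)
    term_b = np.sin(b * m * np.pi / r) * p**((k + b) / 2) * (1 - p)**((k - b) / 2)
    return float(np.sum(common * (term_a + term_b)))

# Illustrative parameters (assumed, not from the thread); note a + b = r
r, a, b, p = 5, 2, 3, 0.4
total = sum(P_k(k, p, a, b, r) for k in range(1, 500))
print(total)  # should approach 1 if the convergence claim holds
```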

Does that make any sense to you? Do you know what I mean? :)
 
  • #4
Is this too hard, or does someone have an idea? :)
 
  • #5
You still haven't given us a complete problem from which we can give an estimate. In addition to a probability distribution containing a parameter, we must be given a specific sample. Then the "maximum likelihood estimate" for the parameter is the value of the parameter that makes that sample "most likely", i.e. gives it the highest probability.
 
  • #6
Sorry, but I don't get what further information is needed. In my opinion we have everything required to do an ML estimation for $p$ with the highest probability. The solution should depend on the sample I get for all my $k$'s and on the values $a$ and $b$, which are fixed.

So you don't need more information. I have the ML equation, which needs to be simplified, differentiated, and solved for $p$.
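
If the algebra stays stuck, the maximization can at least be done numerically. A minimal sketch (my own illustration; the sample `ks` and the parameter values are assumed, and it reuses the `P_k` function from the sketch in post #3):

```python
import numpy as np
from scipy.optimize import minimize_scalar

def neg_log_likelihood(p, ks, a, b, r):
    """Negative log-likelihood of the observed sample ks under P_k."""
    return -sum(np.log(P_k(k, p, a, b, r)) for k in ks)

# Hypothetical observations and fixed values (a + b = r), for illustration only
ks = [3, 5, 2, 7, 4]
r, a, b = 5, 2, 3

res = minimize_scalar(neg_log_likelihood, bounds=(1e-6, 1 - 1e-6),
                      args=(ks, a, b, r), method="bounded")
print("ML estimate of p:", res.x)
```

This sidesteps the closed-form derivative entirely; it only produces the estimate for a given sample, not the simplified formula you are after.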
 