Stuck on obtaining a closed form for a parameter using MLE

In summary, this thread concerns a Markov Random Field and its log likelihood, which involves a sum over the node parameters ##\theta_s## and a sum over all possible configurations of the binary vector ##x##. Differentiating with respect to ##\theta_s## to find the maximum likelihood estimates does not immediately yield a closed form for ##\theta_s##, so it has been suggested to treat the cases ##x_s = 0## and ##x_s = 1## separately.
  • #1
NATURE.M

Homework Statement


We have a Markov Random Field with the log likelihood as such:
$$ l(\theta) = \sum\limits_{i=1}^L \log p(x^{(i)}|\theta) = \sum\limits_{i=1}^L \left( \sum\limits_{s \in V} \theta_{s} x_{s}^{(i)} - \log \sum\limits_{x} \exp \left\lbrace \sum\limits_{s \in V} \theta_{s} x_{s} \right\rbrace \right) $$

Note that L is the number of data examples.
Each ##x^{(i)}## is a vector whose components ##x_s^{(i)}## are binary variables taking the value 0 or 1.
The set V denotes the set of vertices/nodes of the Markov Random Field; the nodes are the components ##x_s## of the vector ##x##, and each node ##x_s## has a parameter ##\theta_s##.
The sum over ##x## runs over every possible configuration of the vector ##x##.
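
To make the setup concrete, here is a minimal numerical sketch of this likelihood (the toy sizes, names, and data below are my own illustration, not part of the problem). It evaluates ##l(\theta)## exactly by enumerating all ##2^{|V|}## binary configurations, which is only feasible for small ##|V|##:

```python
import itertools
import numpy as np

def log_likelihood(theta, X):
    """Node-only MRF log likelihood: sum_i (theta . x^(i) - log Z).

    theta : (S,) array of node parameters theta_s
    X     : (L, S) array of binary data vectors x^(i)
    """
    S = len(theta)
    # Enumerate all 2^S configurations of x to compute log Z exactly.
    configs = np.array(list(itertools.product([0, 1], repeat=S)))
    log_Z = np.log(np.exp(configs @ theta).sum())
    # Sum over the L data examples.
    return (X @ theta - log_Z).sum()

# Toy check: 3 nodes, 5 made-up binary data vectors.
theta = np.array([0.5, -1.0, 0.2])
X = np.array([[1, 0, 1],
              [0, 0, 1],
              [1, 1, 0],
              [1, 0, 0],
              [0, 0, 1]])
print(log_likelihood(theta, X))
```

A brute-force evaluation like this is handy for sanity-checking any closed form or gradient derived below.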

I then take the derivative with respect to ##\theta_s## to determine the ML estimates, writing ##t## for the dummy summation index inside the partition function so it stays distinct from the free index ##s##:
\begin{align*}
\frac{\partial l(\theta)}{\partial \theta_{s}} &= \sum\limits_{i=1}^L \left( x_{s}^{(i)} - \frac{\partial}{\partial \theta_{s}} \log \sum\limits_{x} \exp \left\lbrace \sum\limits_{t \in V} \theta_{t} x_{t} \right\rbrace \right)
\\ &= \sum\limits_{i=1}^L x_{s}^{(i)} - \dfrac{L \sum\limits_{x} x_{s} \exp \left( \sum\limits_{t \in V} \theta_{t} x_{t} \right)}{\sum\limits_{x} \exp \left( \sum\limits_{t \in V} \theta_{t} x_{t} \right)}
\end{align*}
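
Setting this derivative to zero and dividing through by L gives the usual moment-matching condition, which is worth writing out explicitly:
$$ \frac{1}{L}\sum\limits_{i=1}^L x_{s}^{(i)} \;=\; \frac{\sum\limits_{x} x_{s} \exp \left( \sum\limits_{t \in V} \theta_{t} x_{t} \right)}{\sum\limits_{x} \exp \left( \sum\limits_{t \in V} \theta_{t} x_{t} \right)} \;=\; \mathbb{E}_{\theta}[x_s] $$
That is, the empirical frequency of ##x_s = 1## must equal the model's expected value of ##x_s##.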

At this point I'm not sure how to proceed: I can't directly extract a closed form for ##\theta_s##. It has been suggested to consider the cases ##x_s = 0## and ##x_s = 1## separately, but I don't understand what that entails.
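
Edit: one possible reading of the hint (just a sketch, and it assumes the model really has only the node terms written above): split each sum over ##x## according to the value of ##x_s##. The terms with ##x_s = 0## drop out of the numerator, and both sums factor over the remaining coordinates ##x_{\setminus s}##:
$$ \sum\limits_{x} x_{s} \exp \Big( \sum\limits_{t \in V} \theta_{t} x_{t} \Big) = e^{\theta_s} \sum\limits_{x_{\setminus s}} \exp \Big( \sum\limits_{t \neq s} \theta_{t} x_{t} \Big), \qquad \sum\limits_{x} \exp \Big( \sum\limits_{t \in V} \theta_{t} x_{t} \Big) = (1 + e^{\theta_s}) \sum\limits_{x_{\setminus s}} \exp \Big( \sum\limits_{t \neq s} \theta_{t} x_{t} \Big) $$
The common factor cancels, so ##\mathbb{E}_{\theta}[x_s] = e^{\theta_s} / (1 + e^{\theta_s})##, and the zero-gradient condition would then give ##\theta_s = \log \big( \hat{p}_s / (1 - \hat{p}_s) \big)##, where ##\hat{p}_s = \frac{1}{L} \sum_i x_s^{(i)}## is the empirical frequency.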
 

FAQ: Stuck on obtaining a closed form for a parameter using MLE

What is MLE and why is it important?

MLE stands for Maximum Likelihood Estimation, a statistical method for estimating the parameters of a probability distribution. It is important because it selects the parameter values under which the observed data are most probable.
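
In symbols, given data ##x^{(1)}, \dots, x^{(L)}##, the MLE is the parameter value that maximizes the log likelihood:
$$ \hat{\theta}_{\mathrm{MLE}} = \arg\max_{\theta} \sum_{i=1}^{L} \log p \big( x^{(i)} \mid \theta \big) $$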

What do you mean by "obtaining a closed form" for a parameter using MLE?

A closed-form solution is a formula that expresses the answer in a finite number of standard algebraic operations. In the context of MLE, it means an analytical expression for the parameter estimate rather than an iterative or numerical solution.
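
A textbook example of a closed form: for L independent Bernoulli(p) observations, setting the derivative of the log likelihood to zero yields
$$ \hat{p} = \frac{1}{L} \sum_{i=1}^{L} x^{(i)}, $$
a single algebraic formula requiring no iteration.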

Why is obtaining a closed form solution for a parameter using MLE challenging?

In many models it is not possible to obtain a closed-form solution from MLE. This is because the likelihood function often involves integrals or summations that themselves have no closed form.
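
The MRF in this thread illustrates the obstruction: its log likelihood contains the log partition function
$$ \log Z(\theta) = \log \sum_{x \in \{0,1\}^{|V|}} \exp \Big( \sum_{s \in V} \theta_{s} x_{s} \Big), $$
a sum over ##2^{|V|}## configurations, and for models with interactions between nodes such a term generally has no simple algebraic form.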

Can MLE be used for any type of data?

MLE can be applied to any type of data for which a parametric probability model can be written down. When the data do not plausibly follow any known distribution, specifying that likelihood, and hence applying MLE, becomes difficult.

What are some alternative methods to MLE for estimating parameters?

Alternatives to MLE include Bayesian estimation, the method of moments, and least squares estimation. When a closed-form ML estimate is not available, numerically maximizing the likelihood is also a standard fallback.
