Maximum Likelihood and Fisher Information

In summary: to find the maximum likelihood estimate (MLE) of θ, take the log of the likelihood function and set its partial derivative with respect to θ equal to zero. Here, however, that equation (n/θ = 0) has no solution, and θ is constrained between 0 and the smallest observed x, so the MLE must sit at one of those two bounds. Fisher Information was also raised in the thread, though its role here is unclear since the support of the pdf depends on θ.
  • #1
dspampi

Homework Statement


Let X1, X2,...Xn be a random sample from pdf,
f(x|θ) = θ/x², where 0 < θ ≤ x < ∞

Find the MLE of θ.

My attempt:

Likelihood function: L(θ|x) = ∏ θxᵢ⁻² = θⁿ ∏ xᵢ⁻²

To find the MLE, I take the log of that function, take the partial derivative of log L(θ|x) with respect to θ, and set it equal to 0, which gives n/θ = 0. But that equation has no solution in θ.

However, I realize that θ ≤ x and θ > 0. What do I need to do to incorporate this into my likelihood function?
In class we discussed Fisher Information, and I have a guess that it is involved in this problem, but I'm not sure why, or how Fisher Information can be used here.
 
  • #2
Those two bounds are absolute limits: θ cannot be zero or negative, and it cannot be larger than the smallest observed x. If setting the derivative to zero gives no valid value, the maximum of the likelihood has to be at one of the two bounds. Here the log-likelihood, n log θ − 2 Σ log xᵢ, is strictly increasing in θ, so the likelihood is maximized at the upper bound: the MLE is the smallest observed x.
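A quick numerical check of this (a sketch using NumPy; the sample `x` below is made up for illustration) confirms that the log-likelihood n·log θ − 2·Σ log xᵢ keeps growing as θ increases, so its maximum over the allowed range (0, min(x)] lands at the right endpoint:

```python
import numpy as np

# Hypothetical sample; the support of the pdf requires theta <= min(x)
x = np.array([2.3, 4.1, 1.7, 5.0, 3.2])
n = len(x)

def log_likelihood(theta):
    # log L(theta | x) = n*log(theta) - 2*sum(log(x_i)), for 0 < theta <= min(x)
    return n * np.log(theta) - 2 * np.sum(np.log(x))

# Evaluate on a grid within the allowed range (0, min(x)]
grid = np.linspace(0.01, x.min(), 500)
vals = [log_likelihood(t) for t in grid]

# The maximum occurs at the right endpoint: theta_hat = min(x)
theta_hat = grid[np.argmax(vals)]
print(theta_hat, x.min())  # both ~1.7
```

The grid search is only a sanity check; the monotonicity argument above already pins the MLE to min(xᵢ) exactly.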
 

Related to Maximum Likelihood and Fisher Information

What is Maximum Likelihood?

Maximum Likelihood is a statistical method used to estimate the parameters of a probability distribution by finding the values that make the observed data most likely to occur.

How is Maximum Likelihood calculated?

Maximum Likelihood is calculated by taking the product of the probability density function of the distribution for each data point and maximizing this value over all possible parameter values.
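As a concrete sketch of that recipe (using an exponential distribution and made-up data, not the distribution from the thread): the log of the product of densities is n·log(λ) − λ·Σxᵢ, and setting its derivative to zero gives a closed-form maximizer:

```python
# Hypothetical sample, assumed drawn from an Exponential(rate) distribution
data = [0.8, 1.5, 0.3, 2.2, 1.1]

# log L(rate) = n*log(rate) - rate*sum(data);
# d/d(rate) log L = n/rate - sum(data) = 0  =>  rate_hat = n / sum(data)
rate_hat = len(data) / sum(data)
print(rate_hat)  # ≈ 0.847
```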

What is the Fisher Information?

The Fisher Information is a measure of the amount of information that a sample of data contains about the parameters of a probability distribution. It is calculated as the expected value of the negative second derivative of the log-likelihood function.

How is the Fisher Information used in Maximum Likelihood?

The Fisher Information is used in Maximum Likelihood to assess the quality of the estimated parameter values. It can also be used to calculate the standard errors of the estimated parameters.
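As a hedged sketch of the standard-error use (again with a simple exponential model and made-up data, since the distribution in the thread does not satisfy the usual regularity conditions): the second derivative of the exponential log-likelihood is −n/λ², so the Fisher information at the MLE and the resulting approximate standard error are:

```python
import math

data = [0.8, 1.5, 0.3, 2.2, 1.1]   # hypothetical sample
n = len(data)
rate_hat = n / sum(data)           # MLE of the exponential rate

# Second derivative of log L(rate) = n*log(rate) - rate*sum(data) is -n/rate^2,
# so the (observed) Fisher information evaluated at the MLE is:
fisher_info = n / rate_hat**2

# Approximate standard error of the MLE: 1 / sqrt(I(rate_hat))
se = 1 / math.sqrt(fisher_info)
print(rate_hat, se)
```

This yields se = rate_hat/√n, the familiar large-sample standard error for the exponential rate.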

What are the assumptions of Maximum Likelihood?

The main assumptions of Maximum Likelihood are that the data are independent and identically distributed, and that the probability distribution used to model the data is correctly specified.
