- #1
probabilityst
I actually have two questions, both of which are on the same topic
Homework Statement
Consider X = the number of independent trials until an event A first occurs. Show that X has the probability mass function f(x) = (1-p)^(x-1) p, x = 1, 2, 3, ..., where p is the probability that A occurs in a single trial. Also show that the maximum likelihood estimate p̂ of p is 1/x̄, where x̄ = (x_1 + x_2 + ... + x_n)/n is the sample mean. This experiment is referred to as the negative binomial experiment.
2. Find the maximum likelihood estimate for the parameter mu of the normal distribution with known variance sigma^2 = sigma_0^2 (the 0 is a subscript).
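Not a solution, but a quick numerical sanity check of the claimed result. This sketch assumes the geometric pmf f(x) = (1-p)^(x-1) p for x = 1, 2, 3, ... (so that E[X] = 1/p, which is what makes p̂ = 1/x̄ plausible); the values p = 0.3 and n = 100000 are arbitrary choices, not from the problem:

```python
import random

random.seed(0)

p = 0.3          # true success probability (arbitrary choice)
n = 100_000      # number of simulated experiments (arbitrary choice)

def trials_until_success(p):
    """Count independent Bernoulli(p) trials until the first success."""
    count = 1
    while random.random() >= p:
        count += 1
    return count

samples = [trials_until_success(p) for _ in range(n)]
xbar = sum(samples) / n   # sample mean of the observed trial counts
p_hat = 1 / xbar          # the claimed MLE: reciprocal of the sample mean

print(p_hat)  # should land close to the true p = 0.3
```

This only checks that 1/x̄ recovers p on simulated data; the actual exercise is to derive it by maximizing the likelihood L(p) = Π (1-p)^(x_i - 1) p over p.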
Homework Equations
The Attempt at a Solution
Um, I'm not actually sure how to go about solving this problem. Is the maximum likelihood estimate similar to the minimum variance unbiased estimator?
Any help is greatly appreciated, thank you
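For question 2, a small numerical sketch of what "maximum likelihood" means here: with sigma_0 known, the log-likelihood in mu is (up to a constant) -Σ(x_i - mu)²/(2 sigma_0²), a downward parabola whose peak sits exactly at the sample mean. The values mu_true = 2.0, sigma0 = 1.5, and n = 10000 below are arbitrary illustration choices:

```python
import random

random.seed(1)

mu_true = 2.0    # true mean (arbitrary choice)
sigma0 = 1.5     # known standard deviation (arbitrary choice)
n = 10_000

data = [random.gauss(mu_true, sigma0) for _ in range(n)]
xbar = sum(data) / n

def log_likelihood(mu):
    # Log-likelihood of the sample, up to an additive constant
    # that does not depend on mu.
    return -sum((x - mu) ** 2 for x in data) / (2 * sigma0 ** 2)

# Evaluate on a grid centred at the sample mean; since the
# log-likelihood is quadratic in mu, its maximum is at mu = xbar.
grid = [xbar + k * 0.001 for k in range(-100, 101)]
mu_hat = max(grid, key=log_likelihood)

print(mu_hat, xbar)  # the maximizer coincides with the sample mean
```

The derivation version of the same idea: set d/dmu of the log-likelihood to zero, i.e. Σ(x_i - mu)/sigma_0² = 0, and solve for mu.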