Estimation, bias and mean squared error

In summary, the proposed estimate of θ is m = max(x_1, ..., x_n), and the attempt at a solution differentiates with respect to θ to obtain the density function.
  • #1
stukbv

Homework Statement

(x_1, x_2, ..., x_n) is modelled as observed values of independent random variables X_1, X_2, ..., X_n, each with density 1/θ for x in [0, θ] and 0 otherwise.
A proposed estimate of θ is m = max(x_1, ..., x_n). Calculate the distribution of the random variable M = max(X_1, X_2, ..., X_n) and, considering M as an estimator for θ, find its bias and mean squared error.

The attempt at a solution
P(max(X_1, ..., X_n) ≤ m) = P(X_1 ≤ m) P(X_2 ≤ m) ... P(X_n ≤ m)
by independence of the X_i's.

Then, since they all have the same distribution, this is just
(m/θ)^n.

So to get the distribution, do I just differentiate with respect to θ?
Which would give me

n(m/θ)^(n-1) · (-m/θ²)

Is this the right way to think about it?

Thank you
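As a quick sanity check on the CDF above, here is a minimal simulation sketch in Python (NumPy and illustrative values of θ and n are assumed; this is not part of the original working):

import numpy as np

# Sanity check: empirical CDF of M = max(X_1, ..., X_n) versus (m/theta)^n,
# for arbitrary illustrative values of theta and n.
rng = np.random.default_rng(0)
theta, n, trials = 2.0, 5, 100_000

samples = rng.uniform(0.0, theta, size=(trials, n))
maxima = samples.max(axis=1)

for m in (0.5, 1.0, 1.5):
    empirical = np.mean(maxima <= m)
    theoretical = (m / theta) ** n
    print(f"m={m}: empirical {empirical:.4f}, (m/theta)^n {theoretical:.4f}")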
 
  • #2
stukbv said:
Is this the right way to think about it?

You have the (cumulative) distribution function F(m) = Pr{M ≤ m}. How do you get the density function of M from that? There is a standard formula; you just need to use it.

RGV
 
  • #3
I know that you differentiate to get the density function, but I can't work out whether it's with respect to θ (which is what I did above) or with respect to m.
 
  • #4
##f(m) = {d \over dm} F(m)##.
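
Applied to the CDF found in post #1, that formula gives (a worked step the thread leaves implicit):

$$f_M(m) = \frac{d}{dm}\left(\frac{m}{\theta}\right)^{n} = \frac{n\,m^{\,n-1}}{\theta^{\,n}}, \qquad 0 \le m \le \theta,$$

and ##f_M(m) = 0## otherwise.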
 
  • #5
The standard formula would tell you exactly what to do---no confusion!

RGV
 
  • #6
I see, thank you!
 

FAQ: Estimation, bias and mean squared error

What is estimation?

Estimation is the process of using a sample of data to make a prediction or inference about a larger population. It involves using statistical methods to estimate unknown parameters or characteristics of the population.

What is bias in estimation?

Bias in estimation refers to the systematic error or tendency for an estimator to consistently overestimate or underestimate the true value of a population parameter. It can be caused by various factors such as sampling methods, measurement errors, or the choice of estimator.

What is mean squared error (MSE)?

Mean squared error (MSE) measures the average squared difference between an estimator and the true value of a population parameter. Formally, it is the expected value of the squared difference between the estimator and the true parameter value.
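
In symbols, for an estimator ##\hat{\theta}## of a parameter ##\theta## (the standard definitions):

$$\operatorname{bias}(\hat{\theta}) = E[\hat{\theta}] - \theta, \qquad \operatorname{MSE}(\hat{\theta}) = E\big[(\hat{\theta} - \theta)^2\big] = \operatorname{Var}(\hat{\theta}) + \operatorname{bias}(\hat{\theta})^2.$$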

How is bias related to mean squared error (MSE)?

Bias and mean squared error (MSE) are both measures of the accuracy of an estimator, and they are linked by the decomposition MSE = variance + bias². An unbiased estimator therefore has MSE equal to its variance, which is generally not zero. Reducing bias lowers the MSE only if it does not increase the variance by more than the squared bias it removes.
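
As a worked illustration with the estimator from the thread above (a sketch using the density ##f_M(m) = n m^{n-1}/\theta^n## on ##[0, \theta]##):

$$E[M] = \int_0^\theta m\,\frac{n m^{n-1}}{\theta^n}\,dm = \frac{n}{n+1}\,\theta, \qquad \operatorname{bias}(M) = -\frac{\theta}{n+1},$$

$$\operatorname{Var}(M) = \frac{n\,\theta^2}{(n+1)^2(n+2)}, \qquad \operatorname{MSE}(M) = \frac{2\,\theta^2}{(n+1)(n+2)}.$$

So M slightly underestimates θ, and both the bias and the MSE shrink as n grows.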

How can bias and mean squared error (MSE) be minimized?

Bias can be reduced by choosing or correcting the estimator so that it is unbiased, and MSE generally decreases as the sample size grows; improving the quality of data collection also helps. Techniques such as cross-validation can be used to estimate and reduce bias in practice.
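
For instance, here is a brief Python simulation sketch comparing the thread's estimator M with the rescaled estimator (n+1)M/n, the standard unbiased correction for this uniform example (NumPy and illustrative values of θ and n are assumed):

import numpy as np

# Compare the MSE of M = max(X_i) with the bias-corrected estimator (n+1)/n * M
# for Uniform(0, theta) samples; the values of theta and n are illustrative.
rng = np.random.default_rng(1)
theta, n, trials = 2.0, 5, 200_000

maxima = rng.uniform(0.0, theta, size=(trials, n)).max(axis=1)
corrected = (n + 1) / n * maxima

mse_max = np.mean((maxima - theta) ** 2)
mse_corrected = np.mean((corrected - theta) ** 2)

print(f"MSE of M:             {mse_max:.5f}")
print(f"MSE of (n+1)/n * M:   {mse_corrected:.5f}")
print(f"Theoretical MSE of M: {2 * theta**2 / ((n + 1) * (n + 2)):.5f}")

In this example the correction removes the bias and also lowers the MSE, consistent with the formulas for M given above.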
