MSE estimation with random variables

In summary, the approach taken in the given problem on MSE estimation/linear prediction is incorrect: the estimate ##\hat S## cannot be substituted for ##S## when taking expected values. Instead, ##\sum c_i X_i## should be substituted for ##\hat S## in the MSE expression. Expanding and numerically minimising the resulting expression yields a solution with two coefficients near 1 and one near 0; the optimisation may also be solvable analytically.
  • #1
ashah99
Homework Statement
Please see below: finding an MSE estimate for random variables
Relevant Equations
Expectation formula; MSE = ##E[(\hat S - S)^2]##
Hello all, I am wondering if my approach is correct for the following problem on MSE estimation/linear prediction for a zero-mean random variable. My final answer would be c1 = 1, c2 = 0, and c3 = 1. If my approach is incorrect, I would certainly appreciate some guidance on the problem. Thank you.

Problem
[Attached image: 1667568284000.png — problem statement]

Approach:
[Attached image: 1667568360483.png — work shown]
 
  • #2
Yes, the approach is incorrect. When you take expected values, you assume that
$$E\left[ \hat S X_i\right] = E\left[ S X_i\right]$$
But we have no reason to suppose that is correct. ##\hat S## is only an estimate of ##S##, not identical to it, and cannot be substituted for it, except in very limited circumstances.

Instead, substitute ##\sum c_i X_i## for ##\hat S## in ##E[(\hat S - S)^2]##, then expand to get an expression in expected values of first and second order terms in ##X_1, X_2, X_3, S##, with unknowns ##c_1,c_2,c_3##. We have been given values for all those terms except ##E[S^2]##, which we can ignore, since it is not multiplied by any of the unknown coefficients. Numerically minimising that expression, I get a solution where two of the coefficients are near 1 and one is near 0. Possibly the optimisation can be solved analytically, but I didn't try.
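For concreteness, here is a sketch of that expansion and minimisation in Python. Writing the coefficients as a vector ##c##, the expansion gives ##E[(\sum c_i X_i - S)^2] = c^T R c - 2 c^T r + E[S^2]##, where ##R_{ij} = E[X_i X_j]## and ##r_i = E[S X_i]##; setting the gradient to zero gives the normal equations ##Rc = r##. The moment values below are made-up placeholders, since the actual numbers are in the attached problem image.

```python
import numpy as np

# Placeholder second-order moments (the real values come from the
# problem statement): R[i, j] = E[X_i X_j], r[i] = E[S X_i]
R = np.array([[1.0, 0.2, 0.1],
              [0.2, 1.0, 0.3],
              [0.1, 0.3, 1.0]])
r = np.array([1.0, 0.4, 0.9])

# Expanding E[(sum c_i X_i - S)^2] = c^T R c - 2 c^T r + E[S^2]
# and setting the gradient to zero gives the normal equations R c = r.
# E[S^2] does not involve the c_i, so it plays no role in the minimisation.
c = np.linalg.solve(R, r)
print(c)  # optimal coefficients c_1, c_2, c_3
```

With the real moments substituted for these placeholders, this directly gives the minimising coefficients, so the numerical minimisation and the analytic solution coincide here.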
 

FAQ: MSE estimation with random variables

What is MSE estimation with random variables?

MSE estimation with random variables means constructing an estimator of a random quantity that minimises the mean squared error (MSE): the expected squared difference between the estimated value and the true value. In the linear case, the estimate is a weighted sum of observed random variables, and the weights are chosen to make this expected squared error as small as possible.

How is MSE calculated?

The MSE of an estimator ##\hat S## of ##S## is ##E[(\hat S - S)^2]##. From a sample, it is estimated empirically by taking the squared difference between each estimated value and the corresponding true value, then averaging: the sum of the squared errors divided by the number of observations.
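As a quick illustration of that empirical average, with made-up numbers:

```python
import numpy as np

# Made-up true values and estimates, purely for illustration
true_vals = np.array([2.0, 3.5, 1.0, 4.0])
estimates = np.array([2.1, 3.2, 1.3, 3.8])

# MSE = average of the squared errors
mse = np.mean((estimates - true_vals) ** 2)
print(mse)  # (0.01 + 0.09 + 0.09 + 0.04) / 4 = 0.0575
```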

What is the purpose of MSE estimation with random variables?

The purpose of MSE estimation with random variables is to measure the accuracy of an estimator in predicting the true values of a population. It is commonly used in regression analysis and other statistical modeling techniques to evaluate the performance of different models and to choose the best one.

How is MSE estimation with random variables different from other estimation techniques?

MSE estimation differs from criteria based only on average error in that it penalises both bias and variance: the MSE of an estimator decomposes as its variance plus its squared bias. A criterion that looks only at the average error (the bias) ignores how widely the estimator is spread around its mean, whereas MSE accounts for both.

Can MSE estimation with random variables be used for any type of data?

MSE is well defined for any estimator of a numerical quantity, and it does not require the estimator to be unbiased; any bias simply contributes to the MSE through the squared-bias term. It is most natural for continuous data; for categorical data, other loss functions such as the misclassification rate are usually more appropriate.
