Mean squared error (why mean?)

In summary, the question concerns an equation from a machine learning book for minimizing the mean squared error. The equation carries a factor of $\frac{1}{2}$, which the questioner finds confusing: a mean should carry a factor of $\frac{1}{N}$, and nothing in the text restricts $y$ or $t$ to only 2 values, so it cannot be that $N = 2$. A reply asked for the exact reference (title, author, page number) before the question could be addressed.
  • #1
sunone
Hi
I found this equation in a machine learning book:
"we want to minimize the mean squared error:"
$E = \frac{1}{2} \sum_{n=1}^{N} (y_n - t_n)^2$

What I do not understand is the $\frac{1}{2}$: if it is a mean, it should be $\frac{1}{N}$.
Why are they restricting it to 2? There is no reference in the text to $y$ or $t$ being only 2, so it cannot be that $N = 2$.
 
  • #2
If you ask a question here, please always be sure to include the necessary reference. What book did you read this in? Title, author, page number?
 
  • #3
sunone said:
Hi
I found this equation in a machine learning book:
"we want to minimize the mean squared error:"
$E = \frac{1}{2} \sum_{n=1}^{N} (y_n - t_n)^2$

What I do not understand is the $\frac{1}{2}$: if it is a mean, it should be $\frac{1}{N}$.
Why are they restricting it to 2? There is no reference in the text to $y$ or $t$ being only 2, so it cannot be that $N = 2$.

Fix the LaTeX.
 

FAQ: Mean squared error (why mean?)

What is mean squared error (MSE)?

Mean squared error (MSE) is a commonly used metric in statistics and machine learning that measures the average squared difference between the predicted values and the actual values in a dataset. It is a standard way to evaluate a model's performance, particularly in regression analysis.
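
As a concrete illustration (the numbers below are made up), MSE can be computed directly from predictions and targets:

```python
import numpy as np

def mse(y_pred, y_true):
    """Mean squared error: the average of the squared residuals."""
    y_pred = np.asarray(y_pred, dtype=float)
    y_true = np.asarray(y_true, dtype=float)
    return np.mean((y_pred - y_true) ** 2)

# Three predictions versus their targets; residuals are -0.5, 0.5, 0.0.
print(mse([2.5, 0.0, 2.0], [3.0, -0.5, 2.0]))  # (0.25 + 0.25 + 0.0) / 3 ≈ 0.1667
```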

Why is mean used in MSE?

The "mean" in mean squared error refers to the average of the squared differences between the predicted and actual values. By taking the average, the MSE gives equal weight to all the data points and allows for a more comprehensive evaluation of the model's performance.

What are the advantages of using MSE?

MSE is a popular metric because it is easy to interpret and to compute. Because the errors are squared, it penalizes large errors more heavily than small ones, which is useful when large deviations are especially undesirable. It is also differentiable everywhere, which makes it well suited to gradient-based optimization algorithms.
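
A minimal sketch of the differentiability point, using a hypothetical one-parameter model $y = w x$ and made-up data: the gradient of the MSE supplies the update direction for gradient descent.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])    # made-up inputs
t = np.array([2.0, 4.1, 5.9])    # made-up targets, roughly t = 2 * x

w, lr = 0.0, 0.05                # initial weight and learning rate

for _ in range(100):
    y = w * x                            # model prediction
    grad = np.mean(2.0 * (y - t) * x)    # derivative of mean((w*x - t)**2) with respect to w
    w -= lr * grad                       # gradient-descent update
print(w)                                 # converges near the least-squares value ≈ 2
```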

What are the limitations of using MSE?

MSE is strongly influenced by outliers, since squaring amplifies large residuals; a single extreme point can dominate the result. Minimizing MSE corresponds to assuming roughly normally distributed errors (it is the maximum-likelihood criterion under Gaussian noise), which may not hold in practice. Furthermore, it reports only the magnitude of the errors, not their direction.
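
A short sketch (made-up residuals) of the outlier sensitivity: adding a single large error inflates the MSE far more than the mean absolute error.

```python
import numpy as np

errors         = np.array([0.1, -0.2, 0.1, 0.0])     # small, made-up residuals
errors_outlier = np.array([0.1, -0.2, 0.1, 10.0])    # same residuals plus one outlier

print(np.mean(errors ** 2),         np.mean(np.abs(errors)))           # MSE 0.015,  MAE 0.1
print(np.mean(errors_outlier ** 2), np.mean(np.abs(errors_outlier)))   # MSE ≈ 25.0, MAE 2.6
```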

How can MSE be improved upon?

Some alternatives to MSE include mean absolute error (MAE), root mean squared error (RMSE), and mean absolute percentage error (MAPE). RMSE is the square root of MSE and is expressed in the same units as the target variable; MAE is less sensitive to outliers; MAPE expresses errors as percentages of the targets. Depending on the data, one of these may be more appropriate, and using several metrics together gives a fuller picture of a model's strengths and weaknesses.
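
A sketch of the alternative metrics listed above, computed on made-up values:

```python
import numpy as np

y_pred = np.array([2.5, 0.5, 2.0, 8.0])   # made-up predictions
y_true = np.array([3.0, 0.4, 2.0, 7.0])   # made-up targets (no zeros, so MAPE is defined)
err = y_pred - y_true

mse  = np.mean(err ** 2)
rmse = np.sqrt(mse)                          # same units as the targets
mae  = np.mean(np.abs(err))                  # less sensitive to outliers than MSE
mape = np.mean(np.abs(err / y_true)) * 100   # average error as a percentage of the target

print(mse, rmse, mae, mape)
```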
