Juliayaho said:
Hi, does anyone know the formula/process for the least-squares method if you are given several (x, y) points and asked to find $y = A x^{2} + B$?
chisigma said: Let's suppose that the approximating polynomial is $y = A\ x + B$, so that we have the so-called 'least-squares straight line'. In this case, if we have a discrete set of $N+1$ points $(x_{i}, y_{i}),\ i = 0, 1, \dots, N$, then we have to minimize with respect to A and B the sum...
$\displaystyle S= \sum_{i=0}^{N} [y_{i} - A\ x_{i} - B]^{2}$ (1)
Proceeding in the standard fashion, we compute the partial derivatives and set them equal to zero...
$\displaystyle \frac{\partial{S}}{\partial{A}} = - 2\ \sum_{i=0}^{N} x_{i}\ (y_{i} - A x_{i} - B) = 0$
$\displaystyle \frac{\partial{S}}{\partial{B}} = - 2\ \sum_{i=0}^{N} (y_{i} - A x_{i} - B) = 0$ (2)
Rearranging, we arrive at the 2 × 2 linear system of equations...
$\displaystyle A\ \sum_{i=0}^{N} x_{i}^{2} + B\ \sum_{i=0}^{N} x_{i} = \sum_{i=0}^{N} x_{i}\ y_{i}$
$\displaystyle A\ \sum_{i=0}^{N} x_{i} + B\ (N+1) = \sum_{i=0}^{N} y_{i}$ (3)
... the solution of which is...
$\displaystyle A = \frac{(N+1)\ \sum_{i=0}^{N} x_{i}\ y_{i}- \sum_{i=0}^{N} x_{i}\ \sum_{i=0}^{N} y_{i}}{(N+1)\ \sum_{i=0}^{N} x_{i}^{2} - (\sum_{i=0}^{N} x_{i})^{2}}$
$\displaystyle B = \frac {\sum_{i=0}^{N} x_{i}^{2}\ \sum_{i=0}^{N} y_{i} - \sum_{i=0}^{N} x_{i}\ \sum_{i=0}^{N} x_{i}\ y_{i}}{(N+1)\ \sum_{i=0}^{N} x_{i}^{2} - (\sum_{i=0}^{N} x_{i})^{2}}$ (4)
Very well!... but what do we do if we want to use a polynomial of degree n > 1?... we will examine that [if necessary...] in a later post...
Kind regards
$\chi$ $\sigma$
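Since the model in the original question, $y = A\ x^{2} + B$, is also linear in the coefficients A and B, formulas (3) and (4) apply unchanged once each $x_{i}$ is replaced by $x_{i}^{2}$. A minimal Python sketch of that substitution (the sample data below are invented for illustration):

```python
# Least-squares fit of y = A*t + B via the normal equations (3)-(4).
# For the question's model y = A*x^2 + B, pass t_i = x_i^2.

def least_squares_line(t, y):
    """Return (A, B) minimizing sum((y_i - A*t_i - B)**2)."""
    n = len(t)                              # n = N + 1 points
    St, Sy = sum(t), sum(y)
    Stt = sum(ti * ti for ti in t)
    Sty = sum(ti * yi for ti, yi in zip(t, y))
    denom = n * Stt - St ** 2               # common denominator in (4)
    A = (n * Sty - St * Sy) / denom
    B = (Stt * Sy - St * Sty) / denom
    return A, B

xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 9.0, 19.0, 33.0]                 # exactly y = 2*x^2 + 1
A, B = least_squares_line([x * x for x in xs], ys)
print(A, B)                                 # 2.0 1.0 (noise-free data)
```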
I like Serena said:The set of derivatives of all the equations with respect to the coefficients is a Jacobian matrix J.
Solving the system is equivalent to multiplying by $(J^T J)^{-1}J^T$.
This can be generalized to more complicated expressions.
The solution remains the same as long as the expression is linear in the coefficients.
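As a sketch of this matrix formulation in NumPy (reusing the illustrative data from above): the design matrix J has one column per coefficient, and solving $J^{T} J\ c = J^{T} y$ realizes the multiplication by $(J^T J)^{-1}J^T$.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([3.0, 9.0, 19.0, 33.0])         # exactly y = 2*x^2 + 1

# Columns are the derivatives of the model A*x^2 + B w.r.t. A and B.
J = np.column_stack([x ** 2, np.ones_like(x)])

# Solve the normal equations J^T J c = J^T y, i.e. c = (J^T J)^{-1} J^T y.
A, B = np.linalg.solve(J.T @ J, J.T @ y)
print(A, B)                                   # 2.0 1.0
```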
chisigma said: From the purely theoretical point of view, what you say is true... from the practical point of view, when you have an approximating polynomial of degree n > 1, the procedure leads in most cases to an ill-conditioned linear problem, so that a different approach [discovered by the German mathematician Carl Friedrich Gauss...] has to be used...
Kind regards
$\chi$ $\sigma$
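chisigma does not spell out the alternative approach here, but the conditioning problem itself is easy to demonstrate: for a monomial basis of degree n > 1, the condition number of $J^{T} J$ is roughly the square of that of $J$, so library least-squares routines typically factor J directly (e.g. by QR or SVD) instead of forming the normal equations. A sketch, with invented data:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 1.0 + 2.0 * x - 3.0 * x ** 5 + 0.01 * rng.standard_normal(50)

degree = 8
J = np.vander(x, degree + 1)                 # monomial design matrix

# cond(J^T J) is roughly the square of cond(J): the ill conditioning
# chisigma warns about when the polynomial degree grows.
print(np.linalg.cond(J), np.linalg.cond(J.T @ J))

# lstsq works on J itself (via SVD), avoiding the squared condition number.
coeffs, *_ = np.linalg.lstsq(J, y, rcond=None)
```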
The least-squares method is a mathematical technique used to find the best-fitting line or curve for a set of data points. It minimizes the sum of the squared differences between the actual data points and the values predicted by the model.
The purpose of the least-squares method is to find a line or curve that best represents the relationship between two variables in a dataset. This helps in making predictions, identifying trends and patterns, and understanding the underlying relationship between the variables.
The least-squares method works by choosing the model's coefficients so that the sum of the squared differences between the actual data points and the model's predictions is as small as possible. For models that are linear in the coefficients, such as those above, this minimum is found directly by solving the normal equations; only nonlinear models generally require iterative adjustment.
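As a quick worked example of formulas (3)-(4) (the three points are chosen purely for illustration): for $(0, 0),\ (1, 1),\ (2, 3)$ we have $N + 1 = 3$, $\sum x_{i} = 3$, $\sum y_{i} = 4$, $\sum x_{i}^{2} = 5$, $\sum x_{i} y_{i} = 7$, hence

$\displaystyle A = \frac{3 \cdot 7 - 3 \cdot 4}{3 \cdot 5 - 3^{2}} = \frac{9}{6} = \frac{3}{2}, \qquad B = \frac{5 \cdot 4 - 3 \cdot 7}{6} = -\frac{1}{6}$

so the least-squares line is $y = \tfrac{3}{2} x - \tfrac{1}{6}$.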
The assumptions of the least-squares method include: the relationship between the variables is linear (or at least linear in the coefficients), the data points are independent, the errors are normally distributed, and the errors have equal variance.
The limitations of the least-squares method include: it assumes a relationship that is linear in the coefficients, it is sensitive to outliers, it does not account for the effect of other variables, and it may not be suitable for genuinely nonlinear relationships.