3D Least Squares Fit and some Linear Algebra

In summary, SVD is a method used to find the direction cosines of the line of best fit for a given set of data.
  • #1
mjdiaz89
Hello,

I am trying to write an algorithm to calculate the least squares fit line of a 3D data set. After doing some research on Google, I came across this document, http://www.udel.edu/HNES/HESC427/Sphere%20Fitting/LeastSquares.pdf (section 2, page 8), which explains the algorithm for fitting a line to 3D data.
It uses something from linear algebra I have never seen, called Singular Value Decomposition (SVD), to find the direction cosines of the line of best fit. What is SVD? And what is a direction cosine? The literal angle between the x, y, z axes and the line?

For simplicity's sake, I'm starting with the points (0.5, 1, 2); (1, 2, 6); (2, 4, 7). So the A matrix, as denoted in the document, is (skipping the mean computation and subtraction):
[tex]A = \left[ \begin{array}{ccc} -1.6667 & -1.1667 & -2.8333 \\ -2.0000 & -1.0000 & 3.0000 \\ -2.3333 & -0.3333 & 2.6667 \end{array} \right][/tex]

and the singular values of A come out as
[tex]\mathrm{svd}(A) = \left[ \begin{array}{c} 6.1816 \\ 0.7884 \\ 0.0000 \end{array} \right][/tex]
but the document says: "This matrix A is solved by singular value decomposition. The smallest singular value of A is selected from the matrix and the corresponding singular vector is chosen which [gives] the direction cosines (a, b, c)." What does that mean?

Any help will be greatly appreciated. Note: I am working in MATLAB R2009a.
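(A small MATLAB aside, since the column above is what a single-output call appears to produce: svd with one output returns only the singular values, while three outputs give the full decomposition needed to read off the singular vectors. A minimal check:)

[code]
A = [-1.6667 -1.1667 -2.8333;
     -2.0000 -1.0000  3.0000;
     -2.3333 -0.3333  2.6667];   % the centered data matrix from above

s = svd(A)            % one output: 3x1 vector of singular values, largest first
[U, S, V] = svd(A)    % three outputs: the full decomposition, A = U*S*V'
[/code]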

Thank you in advance!

*NOTE* I POSTED THIS IN THE WRONG MATH FORUM AND CANNOT DELETE THE FIRST POST.
 
  • #2
Singular Value Decomposition (SVD) is a standard linear algebra factorization. It decomposes the matrix A into three matrices U, S, and V such that A = USV^T, where U and V are orthogonal matrices and S is a diagonal matrix holding the singular values of A in decreasing order. The columns of V (the right singular vectors) are the eigenvectors of A^T A, which for mean-centered data is just a scaled version of the covariance matrix of the points.

A direction cosine is the cosine of the angle a line makes with a coordinate axis; for a unit direction vector, the three direction cosines are simply its components.

For the best-fit line you want the direction along which the centered data is most spread out, that is, the eigenvector of the covariance matrix with the largest eigenvalue. In SVD terms, the direction cosines (a, b, c) are the components of the column of V paired with the largest singular value; in your example that is the first column of V, the one paired with 6.1816. Note that the convention matters: the column of V paired with the smallest singular value is instead the normal of the best-fit plane, which is the usual context for a "smallest singular value" rule like the one in the passage you quoted.
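Here is a minimal MATLAB sketch of the whole procedure for your three points (variable names are illustrative, not from the cited document). MATLAB returns the singular values in decreasing order, so V(:,1) is the vector paired with the largest one:

[code]
P = [0.5 1 2;
     1   2 6;
     2   4 7];                     % one data point per row

c = mean(P, 1);                    % centroid: a point on the fitted line
A = P - repmat(c, size(P,1), 1);   % mean-centered data (repmat works in R2009a)

[U, S, V] = svd(A);                % A = U*S*V', singular values in decreasing order
d = V(:, 1);                       % unit direction of the best-fit line:
                                   % the direction of greatest spread
% d(1), d(2), d(3) are the direction cosines with the x, y, z axes,
% and the fitted line is L(t) = c + t*d'.
% V(:, end) would instead give the normal of the best-fit plane.
[/code]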
 

Related to 3D Least Squares Fit and some Linear Algebra

1. What is a 3D Least Squares Fit?

A 3D Least Squares Fit is a statistical method used to find the best-fitting line or plane for a set of three-dimensional data points. It minimizes the sum of the squared distances between the data points and the line/plane, resulting in a line/plane that closely represents the overall trend of the data.
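Concretely, for data points p_i, the fitted line through a point c with unit direction d is the one that minimizes the sum of squared perpendicular distances:

[tex]\min_{c,\, \left\| d \right\| = 1} \sum_{i=1}^{n} \left\| \left( p_i - c \right) - \left( \left( p_i - c \right) \cdot d \right) d \right\|^2[/tex]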

2. How is Least Squares Fit different from other regression methods?

Ordinary least squares regression minimizes only the vertical distance between the data points and the fitted line/plane, treating one coordinate as the dependent variable. The 3D fit described here is an orthogonal (total) least squares fit: it minimizes the perpendicular distance from each point to the line/plane, treating all three coordinates symmetrically. That makes it the natural choice when no axis plays the role of a dependent variable and errors occur in every coordinate.
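In two dimensions the contrast is easy to write down: for the line y = ax + b, ordinary least squares minimizes vertical residuals, while the orthogonal fit divides each squared residual by 1 + a^2, turning it into a squared perpendicular distance:

[tex]\text{ordinary: } \min_{a,b} \sum_i \left( y_i - a x_i - b \right)^2 \qquad \text{orthogonal: } \min_{a,b} \sum_i \frac{\left( y_i - a x_i - b \right)^2}{1 + a^2}[/tex]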

3. What is the role of Linear Algebra in 3D Least Squares Fit?

Linear Algebra plays a crucial role in 3D Least Squares Fit as it provides the mathematical framework for solving the system of equations that determine the parameters of the fitting line/plane. It also helps in understanding the geometric interpretation of the solution and finding the best-fitting line/plane.
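The key identity is that the right singular vectors of the centered data matrix A are the eigenvectors of the 3×3 scatter matrix A^T A, so computing the SVD solves exactly the eigenproblem that characterizes the fit:

[tex]A = U \Sigma V^{T} \;\Rightarrow\; A^{T} A = V \Sigma^{2} V^{T}, \qquad A^{T} A \, v_j = \sigma_j^{2} \, v_j[/tex]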

4. Is 3D Least Squares Fit suitable for all types of data?

No, 3D Least Squares Fit is most suitable for data sets that exhibit a linear or planar trend. If the data is highly non-linear or has a complex structure, other regression methods may be more appropriate.

5. What are the limitations of 3D Least Squares Fit?

3D Least Squares Fit assumes that the data points have equal and independent error in all directions. If this assumption is not met, the fitting model may not accurately represent the data. It also does not account for outliers, so the presence of extreme data points can significantly affect the results.
