Linear Least Squares: Solving 3D Data Points in C++

In summary, you set up the system Ax = b from the data points, form the normal equations A^T*A*x = A^T*b, and then use a Cholesky decomposition to solve for the unknown coefficient vector x.
  • #1
Lindley
I have a simple problem. I have a set of 3D data points and I want to fit a line through them using linear least squares. I understand the basic approach required: set up two matrices such that Ax = b, form the normal equations A^T*A*x = A^T*b so the system becomes square, then solve for x using a Cholesky decomposition. (I'm working in C++ using the Eigen 2.0 matrix library.)

The problem is that I'm not sure how to set up A and b to begin with. This would be simple if the data points were in 2D: b would be a column vector of all the y values, and A would be an Nx2 matrix with [x_i, 1] on each row. Then the unknowns would be x = [m, b] for the usual y = mx + b line.
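A minimal sketch of that 2D setup (written against the current Eigen 3 API rather than Eigen 2.0, so names may differ slightly; the function name and the row-per-point input layout are illustrative, not anything the library defines):

```cpp
#include <Eigen/Dense>

// Fit y = m*x + b to 2D points (one point per row of `pts`) by forming the
// normal equations A^T*A*x = A^T*b and solving with a Cholesky (LDLT) factorization.
Eigen::Vector2d fitLine2D(const Eigen::MatrixX2d& pts)
{
    const Eigen::Index n = pts.rows();
    Eigen::MatrixXd A(n, 2);
    Eigen::VectorXd b(n);
    for (Eigen::Index i = 0; i < n; ++i) {
        A(i, 0) = pts(i, 0);  // x_i multiplies the slope m
        A(i, 1) = 1.0;        // constant column multiplies the intercept
        b(i)    = pts(i, 1);  // y_i is the observation
    }
    return (A.transpose() * A).ldlt().solve(A.transpose() * b);  // x = [m, b]
}
```

The same (A^T*A)x = A^T*b pattern carries over to the 3D case discussed below.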

But with 3D points, it's less clear to me how I have to arrange things.

Also, any suggestions for how to make the operation more numerically stable (such as subtracting off the mean from each data point, I know that one at least) would be welcome.
 
  • #2
It depends on whether the line can be curved or must be straight. If you are fitting a straight line, the following approach works, provided the data points can be numbered from 1 to n. If they can, split the data into three groups of parametric pairs (t, x), (t, y), (t, z), each of which is a 2D problem you already know how to solve. Once you have all three fits, you have the parametric line <x(t), y(t), z(t)>.
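A sketch of that parametric idea in C++ with Eigen 3 (names are illustrative; the centroid subtraction mentioned in post #1 is included for numerical stability, and the points are assumed to be numbered t = 1..n along the line):

```cpp
#include <Eigen/Dense>

// Fit a straight line <x(t), y(t), z(t)> = origin + t * direction through 3D
// points (one point per row of `pts`) by solving three 2D least-squares
// problems at once via the normal equations and a Cholesky (LDLT) factorization.
void fitLine3D(const Eigen::MatrixX3d& pts,
               Eigen::Vector3d& origin, Eigen::Vector3d& direction)
{
    const Eigen::Index n = pts.rows();

    // Subtract the centroid first to improve numerical behaviour.
    const Eigen::RowVector3d centroid = pts.colwise().mean();
    const Eigen::MatrixX3d centered = pts.rowwise() - centroid;

    // A has a column for the parameter t and a constant column for the intercept.
    Eigen::MatrixXd A(n, 2);
    for (Eigen::Index i = 0; i < n; ++i) {
        A(i, 0) = static_cast<double>(i + 1);  // t
        A(i, 1) = 1.0;
    }

    // Solve A^T*A*X = A^T*B; B holds all three coordinates as columns,
    // so X is 2x3: row 0 = slopes (direction), row 1 = intercepts.
    const Eigen::MatrixXd X =
        (A.transpose() * A).ldlt().solve(A.transpose() * centered);

    direction = X.row(0).transpose();
    origin    = X.row(1).transpose() + centroid.transpose();  // undo the centering
}
```

Because the parameter column t is the same for x, y, and z, the three 2D fits can be solved in one call by stacking the coordinates as columns of the right-hand side; any point on the fitted line is then origin + t * direction.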
 
  • #3
I only need straight lines.

The parametric approach makes sense, thanks, I'll give that a try!
 

Related to Linear Least Squares: Solving 3D Data Points in C++

1. What is Linear Least Squares and why is it useful in solving 3D data points?

Linear least squares is a mathematical method for finding the line or curve that minimizes the sum of squared differences between a set of data points and the values predicted by the model. In 3D data analysis, it is useful for finding the model that best represents the relationship between three variables, and it is widely used in statistics, engineering, and computer graphics.
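In symbols (standard notation, not specific to this thread): given the data matrix A and observation vector b, linear least squares finds the coefficient vector x minimizing the sum of squared residuals,

$$\min_{x}\ \lVert Ax - b\rVert_2^{2} \;=\; \min_{x}\ \sum_{i=1}^{n}\left(a_i^{\mathsf T}x - b_i\right)^2,$$

where $a_i^{\mathsf T}$ is the $i$-th row of $A$ and $b_i$ the $i$-th observation. Setting the gradient to zero gives the normal equations $A^{\mathsf T}A\,x = A^{\mathsf T}b$ used in the thread above.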

2. How does Linear Least Squares work in C++?

In C++, linear least squares can be implemented with the Eigen library. Eigen has no function named least_squares; instead you assemble a matrix of data points A and a vector of observations b and solve min ||Ax - b||^2 using one of Eigen's decompositions, for example a Cholesky (LLT/LDLT) factorization of the normal equations, a Householder QR factorization, or the SVD. The resulting coefficients can then be used to predict values for new data points.
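As a sketch (Eigen 3 API; the helper function name is made up for illustration):

```cpp
#include <Eigen/Dense>

// Three common ways to solve min ||A*x - b||^2 with Eigen.
Eigen::VectorXd solveLeastSquares(const Eigen::MatrixXd& A, const Eigen::VectorXd& b)
{
    // Normal equations + Cholesky: fastest, least accurate if A is ill-conditioned.
    Eigen::VectorXd viaCholesky = (A.transpose() * A).ldlt().solve(A.transpose() * b);

    // Column-pivoting Householder QR: a good accuracy/speed compromise.
    Eigen::VectorXd viaQR = A.colPivHouseholderQr().solve(b);

    // SVD: slowest but most robust to rank deficiency.
    Eigen::VectorXd viaSVD = A.jacobiSvd(Eigen::ComputeThinU | Eigen::ComputeThinV).solve(b);

    // For well-posed problems all three give (numerically) the same coefficients.
    return viaQR;
}
```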

3. What is the difference between Linear Least Squares and other regression methods?

Linear least squares is a specific regression method that finds the best-fit model for a set of data points by minimizing squared errors. The "linear" refers to the model being linear in its unknown coefficients, not necessarily in the independent variable: polynomial regression, for instance, is still solved by linear least squares because a polynomial is linear in its coefficients. Methods such as logistic regression, or general non-linear least squares, are needed when the model is not linear in its parameters.
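For example, a quadratic model is a curve in x but linear in its coefficients, so it fits the same Ax = b framework:

$$y_i \approx c_0 + c_1 x_i + c_2 x_i^2
\quad\Longrightarrow\quad
A = \begin{bmatrix} 1 & x_1 & x_1^2 \\ \vdots & \vdots & \vdots \\ 1 & x_n & x_n^2 \end{bmatrix},\qquad
x = \begin{bmatrix} c_0 \\ c_1 \\ c_2 \end{bmatrix},\qquad
b = \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix}.$$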

4. Can Linear Least Squares be used for non-linear relationships?

Only in a limited sense. Linear least squares requires the model to be linear in its unknown coefficients, so it can fit curves such as polynomials (which are linear in their coefficients) but not models that are genuinely non-linear in their parameters, such as y = a·e^(b·x) fitted for a and b directly. For those, non-linear least squares methods (for example Gauss-Newton or Levenberg-Marquardt) are more appropriate.

5. What are the limitations of using Linear Least Squares?

The standard statistical assumptions concern the errors rather than the data points themselves: the residuals between the data and the fitted model are assumed to be independent, with zero mean and equal variance, and normality of the errors is needed for exact confidence intervals and tests. If these assumptions are not met, or if the data contain outliers (which squared errors weight heavily), the results can be misleading. Linear least squares can also overfit when the number of fitted coefficients is large relative to the number of data points.
