Least squares parameter correlation

In summary, this thread discusses the problem of highly correlated parameters in a least squares inversion. The original poster has tried Tikhonov regularization without success; a reply suggests examining L curves and S curves to find an appropriate value for the regularization parameter.
  • #1
vibe3
I am trying to solve a large least squares inversion (inverting data for the modeled sources), and I find that the parameters describing one source are highly correlated with the parameters describing the second source.

Can anyone recommend a technique or reference which discusses how to reduce the correlation between parameters in a least squares system? I have already tried Tikhonov regularization (damping each set of parameters) with no luck.
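Before trying to reduce the correlation, it helps to quantify it. Below is a minimal sketch of how one might diagnose parameter correlation in a linear least squares problem from the estimated parameter covariance matrix; the two-source model here is entirely made up (two nearly overlapping Gaussian basis functions standing in for two similar sources), chosen only so the effect is visible.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)

# Design matrix with two nearly collinear columns: two similar sources
# whose amplitudes the data can barely separate (hypothetical example).
G = np.column_stack([np.exp(-(x - 4.9) ** 2), np.exp(-(x - 5.1) ** 2)])
d = G @ np.array([1.0, 2.0]) + 0.01 * rng.standard_normal(x.size)

# Solve the least squares system, then estimate the parameter covariance
# cov(m) ~ sigma^2 * (G^T G)^-1 (independent Gaussian noise assumed).
m, *_ = np.linalg.lstsq(G, d, rcond=None)
sigma2 = np.sum((d - G @ m) ** 2) / (d.size - G.shape[1])
cov = sigma2 * np.linalg.inv(G.T @ G)

# Normalize to a correlation matrix; off-diagonal entries near +/-1
# flag parameter pairs the data cannot distinguish.
corr = cov / np.sqrt(np.outer(np.diag(cov), np.diag(cov)))
```

For two near-duplicate sources like these, `corr[0, 1]` comes out close to -1: increasing one amplitude and decreasing the other changes the fit very little, which is exactly the symptom described above.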
 
  • #2
I used Tikhonov regularization in my MSc thesis; it is a general method for ill-posed problems, and it works fine. What you usually find is some correlation between a set of data (the one you want to calculate) and some "noise" in that data. Tikhonov regularization uses the parameter [tex]\alpha[/tex] to regularize the solution. You can calculate the residual norm (RN) between your desired data and the data you obtain. Using Tikhonov regularization means adding the regularization term [tex]\alpha |S|^{2}[/tex] to the residual norm and minimizing the sum, where S is the solution. The problem with this method is that there is no well-established criterion for finding the appropriate value of the parameter.

I suggest you take a look at L curves and S curves and their definitions. I hope this gives you a taste of the method.
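The L-curve idea mentioned above can be sketched as follows: solve the Tikhonov problem for a range of regularization strengths and record the residual norm against the solution norm; the "corner" of the resulting curve is the usual heuristic choice of parameter. The problem setup here (a small Hilbert-like matrix) is an invented ill-conditioned example, not the poster's inversion.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20

# A classically ill-conditioned test system (Hilbert-like matrix).
G = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)
m_true = np.ones(n)
d = G @ m_true + 1e-4 * rng.standard_normal(n)

alphas = np.logspace(-8, 0, 30)
residual_norms, solution_norms = [], []
for alpha in alphas:
    # The augmented system below implements the Tikhonov objective
    # ||G m - d||^2 + alpha^2 ||m||^2 as an ordinary least squares solve.
    A = np.vstack([G, alpha * np.eye(n)])
    b = np.concatenate([d, np.zeros(n)])
    m, *_ = np.linalg.lstsq(A, b, rcond=None)
    residual_norms.append(np.linalg.norm(G @ m - d))
    solution_norms.append(np.linalg.norm(m))

# Plotting log(residual_norms) against log(solution_norms) traces the
# L curve; pick alpha near the corner of maximum curvature.
```

As alpha grows, the residual norm rises and the solution norm falls; the trade-off between the two is what the L curve visualizes.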
 

FAQ: Least squares parameter correlation

What is least squares parameter correlation?

Least squares parameter correlation is a statistical method used to measure the relationship between two or more variables. It calculates the best fit line through a set of data points in order to determine the strength and direction of the relationship between the variables.

How is least squares parameter correlation calculated?

The calculation of least squares parameter correlation involves minimizing the sum of the squared differences between the observed data points and the predicted values on the line of best fit. This is done by finding the slope and intercept of the line that minimizes the sum of squared residuals, which are the differences between the observed and predicted values.
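The calculation described above can be written out directly with invented example data, using the closed-form normal-equation solution for the slope and intercept of the best-fit line:

```python
# Hypothetical data points roughly following y = 2x.
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.1, 5.9, 8.2, 9.9]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# slope = sum((x - mean_x)(y - mean_y)) / sum((x - mean_x)^2),
# which minimizes the sum of squared residuals.
sxy = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
sxx = sum((xi - mean_x) ** 2 for xi in x)
slope = sxy / sxx
intercept = mean_y - slope * mean_x

# Residuals: observed minus predicted values on the fitted line.
residuals = [yi - (slope * xi + intercept) for xi, yi in zip(x, y)]
```

Any other slope or intercept would give a larger sum of squared residuals; that is what "best fit" means here.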

What is the significance of the correlation coefficient in least squares parameter correlation?

The correlation coefficient, denoted as r, is a measure of the strength and direction of the relationship between the variables. It ranges from -1 to 1, where a value of -1 indicates a perfect negative correlation, 0 indicates no correlation, and 1 indicates a perfect positive correlation. In least squares parameter correlation, a higher absolute value of r indicates a stronger correlation between the variables.

What are the assumptions made in least squares parameter correlation?

There are several assumptions made in least squares parameter correlation, including: the relationship between the variables is linear, the data points are independent and randomly sampled, the residuals are normally distributed, and there are no significant outliers or influential data points. Violation of these assumptions can affect the accuracy and validity of the results.
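Two of these assumptions, residuals centered on zero and the absence of extreme outliers, are easy to check numerically after a fit. The sketch below uses invented data and standardized residuals (values beyond about 3 flag potential outliers):

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.linspace(0.0, 1.0, 50)
y = 3.0 * x + 1.0 + 0.1 * rng.standard_normal(x.size)  # hypothetical data

# Fit a line and compute residuals (observed minus predicted).
slope, intercept = np.polyfit(x, y, 1)
residuals = y - (slope * x + intercept)

# Least squares with an intercept forces the residual mean to
# (numerically) zero; standardized residuals beyond ~3 suggest outliers.
standardized = residuals / residuals.std(ddof=2)
outliers = np.abs(standardized) > 3.0
```

A histogram or normal probability plot of `standardized` would additionally probe the normality assumption; formal tests exist but are beyond this sketch.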

How is least squares parameter correlation used in scientific research?

Least squares parameter correlation is commonly used in scientific research to determine the strength and direction of relationships between variables. It can also be used to make predictions and test hypotheses. This method is particularly useful in fields such as statistics, economics, and social sciences, where there are often multiple variables that may influence each other.
