Nonlinear regression in two or more independent variables

In summary: there is an established numerical method for nonlinear regression on two or more independent variables. The Gauss-Newton method extends directly to this case.
  • #1
maistral
Hi. I wanted to learn more about this topic, but it seems all the available resources on the internet point to using R, SPSS, MINITAB or EXCEL.

Is there an established numerical method for such cases? I am aware of the Levenberg-Marquardt, Gauss-Newton and similar methods for nonlinear regression on one independent variable only. I would like to ask how I can extend these methods for use on two or more independent variables.
 
  • #2
Hi,

I suggest you widen your search to include non-linear least squares minimization. Plenty of material there.
 
  • #3
Hi. Thanks for replying.

I did search for nonlinear LSM as well, but I keep getting information only about a single independent variable.
 
  • #4
? In the link I gave ##\bf x## is a vector -- or is that not what you are referring to ?
 
  • #5
Errr. If I understand this paper correctly, x is a vector containing the values of the independent variable. My problem refers to a case where there are two independent variables (and thus, two vectors).

Unless I'm not reading this correctly, if this is the case I extend my apologies.
 
  • #6
No need for apologies.
Two independent vectors with N and M components, respectively, make one vector with N+M components ...
 
  • #7
Yes. In this context, all the independent variables are usually consolidated into a single vector, X. That makes the math notation much more concise in terms of vector operations.
 
  • #8
This is making me feel sad. I am totally unable to comprehend the notation. I have no idea how to insert the second set of independent variables.

As I understood it, it should look like a matrix of N rows (data points) x M columns (independent variables). The Jacobian should be simple, I guess, as it is simply the partial derivative of the function with respect to each modelling parameter, evaluated at the independent variables.

Then I hit a dead end. I have no idea how to proceed further. I mean, I can evaluate ##J^T J## and such; but regarding the independent variables, do I just move forward with the matrix operations?
 
  • #9
maistral said:
comprehend the notation. I have no idea how to insert the second set of independent variables.
I have a feeling it's not the notation that's the barrier here. You have some experiments where you vary something (the independent variables ##\vec x## ) and you measure something (the dependent variables ##\vec y##). Holding those separate is already a major task in many cases... :rolleyes:.

And you have some idea in the form of a model that expresses the ##y_i## as a function of the ##x_j\, : \ \ \ \hat y_i = f_i (\vec x, \vec p) . \ \ ## Here ##\ \hat y_i\ ## is the expected (predicted) value of the actual measurement ##\ y_i \, .\ \ ## And ##\ \vec p\ ## is a vector of parameters, the things you are after.

What you want to minimize is ##\displaystyle {\sum {(y_i - \hat y_i)^2\over \sigma_i^2} } ## by suitably varying the ##p_k .\ \ ## The ##\sigma_i## are the (estimated) standard deviations in the ##\ y_i##.

This is a bit general; perhaps we can make this easier if you elaborate a bit more on what exactly you are doing, or by focusing on a comparable example ?
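The weighted objective described in the post above can be sketched in a few lines of Python. The model ##f##, the toy data, and the ##\sigma_i## values below are all hypothetical placeholders; only the structure of the sum matters:

```python
# Sketch of the weighted least-squares objective sum_i (y_i - f(x_i, p))^2 / sigma_i^2.
# The model, data, and sigmas are hypothetical; x is a tuple of independent variables.
import math

def model(x, p):
    # hypothetical model with two independent variables x[0], x[1]
    # and two parameters p = (a, b)
    a, b = p
    return a * math.exp(b * x[0]) * x[1]

def chi_square(params, xs, ys, sigmas):
    # sum over data points of squared residual / variance
    return sum((y - model(x, params)) ** 2 / s ** 2
               for x, y, s in zip(xs, ys, sigmas))

# toy data generated from the model with a=2.0, b=0.5 and no noise,
# so the objective is exactly zero at the true parameters
xs = [(0.0, 1.0), (1.0, 2.0), (2.0, 1.5)]
ys = [model(x, (2.0, 0.5)) for x in xs]
sigmas = [1.0, 1.0, 1.0]

print(chi_square((2.0, 0.5), xs, ys, sigmas))  # → 0.0
print(chi_square((1.0, 0.5), xs, ys, sigmas) > 0)  # → True
```

Note that the number of independent variables only changes what is packed into each `x`; the objective itself is unchanged.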
 
  • #10
Hi! Thank you very much for replying.

Yes, I'm actually looking for a general case. What I'm trying to do is; say, I have this set of data:
(Attached image: a table of sample ##x##, ##y## and measured ##Z(x,y)## data points.)

I wanted to fit the modelling parameters a, b, and z in Z(x,y). While this can be transformed into a multiple linear regression problem, I wanted to be able to do it using nonlinear regression. That's why I was asking whether there is a numerical method tailored to these kinds of problems, or a way to extend, at the very least, the Gauss-Newton method.

EDIT: I think I understand the strategy you mentioned. Using guess values of the parameters a, b, and z, I'll evaluate the modelling function at those initial guess values and the points x, y. Then I'll get the sum of the squares of the residuals (something like ##(Z_{measured} - Z_{predicted})^2##). Then the problem becomes a minimization problem which I can kill using something similar to the Newton-Raphson for optimization or steepest descent methods. I do not remember using that estimated variance though (I don't even know how to calculate that)...

My question was if there is a more 'robust' or algorithmic method in order to determine the parameters; something similar to the Levenberg-Marquardt.
 
  • #11
Hold on. I'm squealing here because of a tiny eureka moment. I'll try studying this on my own and see what happens.
 
  • #12
Aha. I made it work.

Basically I just need to run the algorithm normally since the partial derivatives are with respect to the modelling parameters anyway and the method would just treat the data points as constants. I had this idea since your post reminded me of that generic method of attempting to reduce the sum of the squares of residuals.

Moral lesson: do not over-analyze, lol.
Thanks again!
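The conclusion above — run Gauss-Newton unchanged, because the Jacobian is taken with respect to the parameters while the data points (x, y) are treated as constants — can be sketched as follows. The power-law model ##Z = a\,x^b y^c##, the starting guess, and the toy data are all hypothetical stand-ins for the thread's (lost) data table; a backtracking line search is added for robustness, which plain Gauss-Newton does not require near a solution:

```python
# Sketch of Gauss-Newton fitting a model with TWO independent variables (x, y).
# Model and data are hypothetical; the algorithm is the standard one.
import math

def model(x, y, a, b, c):
    # hypothetical power-law surface Z(x, y) = a * x^b * y^c
    return a * x**b * y**c

def jacobian_row(x, y, a, b, c):
    # partial derivatives with respect to the PARAMETERS a, b, c only;
    # the data point (x, y) enters as a constant
    f = model(x, y, a, b, c)
    return [f / a, f * math.log(x), f * math.log(y)]

def solve3(A, v):
    # Gaussian elimination with partial pivoting for a 3x3 system A s = v
    M = [row[:] + [v[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for k in range(col, 4):
                M[r][k] -= f * M[col][k]
    s = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        s[r] = (M[r][3] - sum(M[r][k] * s[k] for k in range(r + 1, 3))) / M[r][r]
    return s

def sse(data, p):
    # sum of squared residuals over (x, y, Z) triples
    return sum((Z - model(x, y, *p)) ** 2 for x, y, Z in data)

def gauss_newton(data, p, iters=50):
    for _ in range(iters):
        J = [jacobian_row(x, y, *p) for x, y, _ in data]
        r = [Z - model(x, y, *p) for x, y, Z in data]
        # normal equations (J^T J) dp = J^T r
        JTJ = [[sum(row[m] * row[n] for row in J) for n in range(3)]
               for m in range(3)]
        JTr = [sum(J[i][m] * r[i] for i in range(len(J))) for m in range(3)]
        dp = solve3(JTJ, JTr)
        # backtracking: halve the step while it would increase the error
        step = 1.0
        while step > 1e-10 and sse(data, [pi + step * di for pi, di in zip(p, dp)]) > sse(data, p):
            step *= 0.5
        p = [pi + step * di for pi, di in zip(p, dp)]
    return p

# noise-free toy data generated from a=2, b=0.5, c=1.5 on a 3x3 grid
true = (2.0, 0.5, 1.5)
data = [(x, y, model(x, y, *true))
        for x in (1.0, 2.0, 3.0) for y in (1.0, 2.0, 3.0)]
p = gauss_newton(data, [1.8, 0.4, 1.3])
print([round(v, 6) for v in p])
```

Starting from a guess near the true values, the iteration recovers roughly (2.0, 0.5, 1.5) on this noise-free data. Nothing in the loop refers to the number of independent variables: only `model` and `jacobian_row` would change for a different problem.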
 

Related to Nonlinear regression in two or more independent variables

1. What is nonlinear regression in two or more independent variables?

Nonlinear regression in two or more independent variables is a statistical method used to model the relationship between a dependent variable and two or more independent variables. Unlike linear regression, which assumes a linear relationship between the variables, nonlinear regression allows for more complex and nonlinear relationships to be modeled.

2. When is nonlinear regression in two or more independent variables used?

Nonlinear regression in two or more independent variables is commonly used when the relationship between the dependent variable and independent variables cannot be adequately described by a linear model. It is also used when the data exhibits a curvilinear relationship or when there are multiple independent variables that may interact with each other.

3. How is nonlinear regression in two or more independent variables different from linear regression?

Nonlinear regression in two or more independent variables differs from linear regression in that it allows more complex relationships to be modeled and does not assume a linear relationship between the variables. It also requires iterative numerical techniques, such as gradient descent or Gauss-Newton, to find the best-fit parameters for the model.
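A minimal gradient-descent sketch of the parameter search mentioned above, for a model with two independent variables. The model and all numbers are hypothetical; the model here happens to be linear in the parameters so that the loss is convex and the descent provably converges, but the loop itself is unchanged for a model that is nonlinear in the parameters:

```python
# Gradient descent on the sum of squared residuals of a hypothetical
# two-independent-variable model Z = a*x + b*exp(y).
import math

def model(x, y, a, b):
    return a * x + b * math.exp(y)

def grad_sse(data, a, b):
    # gradient of sum_i (f(x_i, y_i; a, b) - Z_i)^2 with respect to (a, b)
    ga = gb = 0.0
    for x, y, Z in data:
        r = model(x, y, a, b) - Z
        ga += 2 * r * x            # d/da of the squared residual
        gb += 2 * r * math.exp(y)  # d/db of the squared residual
    return ga, gb

# noise-free toy data generated with a=1.5, b=0.2
data = [(x, y, model(x, y, 1.5, 0.2)) for x in (0, 1, 2) for y in (0, 1)]

a, b = 0.0, 0.0   # initial guess
lr = 0.01         # learning rate, small enough for this toy problem
for _ in range(5000):
    ga, gb = grad_sse(data, a, b)
    a -= lr * ga
    b -= lr * gb
print(round(a, 4), round(b, 4))  # → 1.5 0.2
```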

4. What are the benefits of using nonlinear regression in two or more independent variables?

One of the main benefits of using nonlinear regression in two or more independent variables is that it allows for more accurate modeling of complex relationships between variables. This can lead to better predictions and insights compared to linear regression, which may not be able to capture the true nature of the data. Nonlinear regression also allows for the inclusion of multiple independent variables, which can provide a more comprehensive understanding of the relationship between the variables.

5. What are the limitations of nonlinear regression in two or more independent variables?

One limitation of nonlinear regression in two or more independent variables is that it can be more computationally intensive and may require more data compared to linear regression. Additionally, it can be more difficult to interpret the results of a nonlinear regression model compared to a linear regression model. It also requires the researcher to have a good understanding of the underlying data and the appropriate techniques to use for model selection and parameter estimation.
