Linear Gaussian parameter estimation

In summary, the conversation discusses estimating the parameters of a multivariate linear-Gaussian model. The original poster describes deriving the maximum likelihood estimates (MLE) of the matrices A and Q and solving for each parameter in turn. In a specific application, however, they do not want to estimate the entire matrix A; only certain scalar elements are unknown, which makes the matrix calculus difficult to carry out symbolically. They mention using Matlab's Symbolic toolbox to solve the equations instead, but the resulting estimates for the elements of A now depend on Q, which they believe is correct. The conversation ends with a question about how to proceed with solving parameters that all depend on each other.
  • #1
aydos
Hi,

I have a multivariate linear-gaussian model and I am trying to estimate a particular set of scalar parameters of the model.
I know how to derive the MLE in order to find the matrices A and Q (linear transfer function and covariance respectively).
I take the log of the joint distribution, take the derivatives with respect to the parameters above, set the expressions to zero and solve for each of the 2 above parameters. The expression for A depends only on the data. The expression for Q depends on A and on the data, so I solve for A first and then I can solve for Q.
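For concreteness, here is a minimal sketch of that closed-form MLE in Python/NumPy (not the poster's actual code; it assumes the model has the form x[t+1] = A x[t] + w with w ~ N(0, Q), and the A, Q and data are simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate x[t+1] = A x[t] + w,  w ~ N(0, Q)  (illustrative stable A and SPD Q)
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
Q_true = np.array([[0.2, 0.05],
                   [0.05, 0.1]])
T = 5000
x = np.zeros((T, 2))
L = np.linalg.cholesky(Q_true)
for t in range(T - 1):
    x[t + 1] = A_true @ x[t] + L @ rng.standard_normal(2)

# Setting the log-likelihood derivatives to zero gives:
#   A_hat = (sum x[t+1] x[t]^T)(sum x[t] x[t]^T)^{-1}   -- depends only on the data
#   Q_hat = residual covariance                          -- depends on A_hat and the data
X0, X1 = x[:-1], x[1:]                          # pairs (x[t], x[t+1])
A_hat = (X1.T @ X0) @ np.linalg.inv(X0.T @ X0)
resid = X1 - X0 @ A_hat.T
Q_hat = resid.T @ resid / (T - 1)               # T-1 transitions in the data
```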

However, I have a specific application where I do not want to estimate the entire matrix A. Some scalar elements of the matrix I know a priori and some other scalar elements are the parameters to be estimated.

This is where my problems start:
1- The matrix calculus gets very hairy and I do not know how to solve this symbolically.
2- I tried to skip the step above and expanded the linear equation into scalar expressions with the Matlab Symbolic toolbox, since the parameters to be estimated are now scalars. I then went through Matlab's differentiation and solving tools as well. It seems to work in principle.
3- On the original problem (the one I know how to solve), the expressions for A do not depend on Q. But now the Matlab solution shows that my estimated scalar Aij parameters do depend on Q, which I believe is correct. So now I have a set of parameters that all depend on each other and I am not sure how to solve them.
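A small symbolic sketch of the approach in point 2, using Python's SymPy in place of the Matlab Symbolic toolbox (the bivariate model, the known entries, the precision matrix and the two data pairs are all made up for illustration): with Q held fixed, the unknown entry a of A has a closed-form solution, but that solution does involve Q.

```python
import sympy as sp

# A has one unknown scalar a; the other entries are known a priori (illustrative).
a = sp.symbols('a')
A = sp.Matrix([[a, sp.Rational(1, 2)],
               [0, sp.Rational(4, 5)]])
Qinv = sp.Matrix([[2, -1], [-1, 3]])   # a known (assumed) precision matrix Q^{-1}

# Two data pairs (x[t], x[t+1]) stand in for a dataset.
pairs = [(sp.Matrix([1, 2]), sp.Matrix([2, 1])),
         (sp.Matrix([0, 1]), sp.Matrix([1, 1]))]

# Quadratic part of the negative log-likelihood (Q is fixed, so log|Q| drops out)
nll = sum(((x1 - A * x0).T * Qinv * (x1 - A * x0))[0, 0] for x0, x1 in pairs)

# d(nll)/da = 0 is linear in a, so there is a closed form -- but it depends on Qinv
a_hat = sp.solve(sp.diff(sp.expand(nll), a), a)[0]
```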

Any light on what I might need here would be appreciated.

Regards,
Carlos

BTW. this is my first post here, how do I insert latex expressions in these posts?
 
  • #2
Welcome to PF, Carlos.

You can insert latex into your post by typing

[ tex] your code here [ /tex]
or
[ itex] your code here [ /itex] (for an inline formula)

When you click on the [itex]\Sigma[/itex] symbol you are provided a short latex reference.

As for your question: have you tried to solve your equations in the special case of a bivariate Gaussian, just to get a feeling for how the calculations go and whether they can be done at all? :smile:
 
  • #3
Hi aydos. As mentioned before, it's a bit difficult to tell what you're doing if you aren't more explicit with the formulas. That said, I'd point out that you cannot generally find the MLE for each part of the parameters separately and have that work out to be the MLE over all of the parameters. The usual Gaussian problem is a special case, where the mean estimate doesn't depend on the covariance, so you can do it in a sequential fashion. But most of the time you need to estimate all of the parameters jointly.

Nevertheless, you can still use an iterated approach, wherein you hold all but one parameter constant, optimize that one, then move on to the next parameter, and so on. You may need to take only small steps in each parameter at each iteration, and do many iterations, to ensure convergence. Alternatively, it may be possible to solve the entire set of equations at once to get the answer, but this will usually not be the case and, even when it is, the solution may be very nasty.
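The iterated approach suggested here can be sketched as follows (a hypothetical NumPy example on simulated data, not the poster's actual setup: two entries of A are free, the rest are known a priori; the free entries are solved by generalized least squares with Q held fixed, then Q is re-estimated from the residuals, and the two steps alternate):

```python
import numpy as np

rng = np.random.default_rng(1)

# True model: x[t+1] = A x[t] + w, w ~ N(0, Q).  Only A[0,0] and A[1,1] are
# unknown; the off-diagonal entries are known a priori (illustrative setup).
A_true = np.array([[0.8, 0.2], [0.1, 0.7]])
Q_true = np.array([[0.3, 0.1], [0.1, 0.2]])
known_mask = np.array([[False, True], [True, False]])   # True = known entry
free = ~known_mask

T = 4000
x = np.zeros((T, 2))
L = np.linalg.cholesky(Q_true)
for t in range(T - 1):
    x[t + 1] = A_true @ x[t] + L @ rng.standard_normal(2)
X0, X1 = x[:-1], x[1:]

# Alternate: (1) solve the free entries of A by GLS with Q fixed;
#            (2) re-estimate Q from the residuals; repeat.
A = np.where(known_mask, A_true, 0.0)    # known entries fixed, free entries start at 0
Q = np.eye(2)
idx = list(zip(*np.nonzero(free)))       # positions (i, j) of the free entries
for _ in range(50):
    Qinv = np.linalg.inv(Q)
    # Normal equations for theta, where A(theta) = A_known + sum_k theta_k E_k
    # and E_k is the unit matrix with a 1 at the k-th free position.
    A_known = np.where(known_mask, A, 0.0)
    R = X1 - X0 @ A_known.T
    M = np.zeros((len(idx), len(idx)))
    b = np.zeros(len(idx))
    for k, (ik, jk) in enumerate(idx):
        b[k] = np.sum((R @ Qinv[:, ik]) * X0[:, jk])
        for l, (il, jl) in enumerate(idx):
            M[k, l] = Qinv[ik, il] * np.sum(X0[:, jk] * X0[:, jl])
    theta = np.linalg.solve(M, b)
    for k, (ik, jk) in enumerate(idx):
        A[ik, jk] = theta[k]
    resid = X1 - X0 @ A.T
    Q = resid.T @ resid / (T - 1)
```

Note the coupling the original poster observed: the GLS step for the free entries of A uses Q^{-1}, which is why the iteration is needed at all.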
 

FAQ: Linear Gaussian parameter estimation

What is a linear Gaussian model?

A linear Gaussian model is a statistical model that assumes a linear relationship between the independent and dependent variables, with the errors or residuals following a Gaussian or normal distribution.

What is parameter estimation?

Parameter estimation is the process of determining the values of unknown parameters in a statistical model based on the available data. In the case of linear Gaussian models, this involves finding the values of the slope and intercept that best fit the data.

How is parameter estimation done in linear Gaussian models?

In linear Gaussian models, parameter estimation is typically done using the method of least squares. This involves minimizing the sum of squared errors between the predicted values and the actual values of the dependent variable.
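For example, a minimal least-squares fit of the slope and intercept in Python/NumPy (illustrative data):

```python
import numpy as np

# Noisy samples of roughly y = 2x + 1 (made-up data)
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.1, 8.8])

# Design matrix with a column of ones for the intercept
X = np.column_stack([x, np.ones_like(x)])
slope, intercept = np.linalg.lstsq(X, y, rcond=None)[0]
# For this data: slope = 1.96, intercept = 1.10
```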

What are the assumptions of linear Gaussian models?

The main assumptions of linear Gaussian models include linearity, normality of errors, homoscedasticity (the errors have equal variances), and independence of errors. Violations of these assumptions can affect the accuracy of parameter estimates.

What are some applications of linear Gaussian parameter estimation?

Linear Gaussian parameter estimation is commonly used in various fields, including economics, engineering, and social sciences, to analyze and predict relationships between variables. It is useful for tasks such as forecasting, trend analysis, and hypothesis testing.
