fab13
TL;DR Summary: I am looking for a way to cross-correlate two Fisher matrices and obtain a final Fisher matrix whose inverse gives the constraints of this cross-correlation.
I have two Fisher matrices that carry information on the same variables (I mean that the rows/columns refer to the same parameters in both matrices).
Now I would like to combine these two matrices by applying, for each parameter, the well-known formula (coming from the maximum likelihood estimator method):
$$\dfrac{1}{\sigma_{\hat{\tau}}^{2}}=\dfrac{1}{\sigma_1^2}+\dfrac{1}{\sigma_2^2}\quad(1)$$
##\sigma_{\hat{\tau}}## is the uncertainty of the best estimator combining `sample1` (##\sigma_1##) and `sample2` (##\sigma_2##).
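For a single parameter, formula (1) is just inverse-variance combination. A minimal numerical sketch (the sigma values are made up for illustration):

```python
import numpy as np

# Hypothetical 1-sigma uncertainties from two independent samples
sigma1, sigma2 = 0.5, 0.3

# Equation (1): inverse variances (scalar Fisher information) add
sigma_combined = 1.0 / np.sqrt(1.0 / sigma1**2 + 1.0 / sigma2**2)

# The combined constraint is always tighter than either input alone
assert sigma_combined < min(sigma1, sigma2)
```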
Now I would like to do the same for my two Fisher matrices, i.e. from a matrix point of view.
To this end, I tried to diagonalize each of the two Fisher matrices. I then add the two diagonal matrices, which gives me a global diagonal Fisher matrix, but I don't know how to return to the starting space (since the diagonalization does not give the same combination of eigenvectors for each matrix).
If I could at the same time return to the starting parameter space, I could form the final Fisher matrix with matrix products:
$$\text{Fisher}_{\text{final,cross}} = P.\text{Fisher}_{\text{diag,global}}.P^{-1}\quad(2)$$
with ##P## the change-of-basis matrix (composed of eigenvectors), and I could then obtain the covariance matrix directly by inverting ##\text{Fisher}_{\text{final,cross}}##.
Starting from (2), how can I go back from the diagonal matrix ##\text{Fisher}_{\text{diag,global}}## to the starting space, i.e. to the individual parameters?
My difficulty comes from the fact that diagonalizing the two Fisher matrices produces different change-of-basis matrices ##P_1## and ##P_2##, that is to say different eigenvectors, and therefore a different linear combination of variables for each. I wrote the change-of-basis matrix ##P## above, but it is not defined; I think an expression for ##P## as a function of the matrices ##P_1## and ##P_2## is the key point of my issue.
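To make the obstacle concrete, here is a minimal sketch (the 2x2 matrices are made up, not real data) showing that each symmetric Fisher matrix diagonalizes in its own eigenbasis, so equation (2) holds for each matrix separately, but with two different change-of-basis matrices:

```python
import numpy as np

# Hypothetical 2x2 Fisher matrices over the same two parameters (illustration only)
F1 = np.array([[4.0, 1.0], [1.0, 3.0]])
F2 = np.array([[5.0, -1.0], [-1.0, 2.0]])

# eigh returns ascending eigenvalues and orthonormal eigenvectors,
# so each change-of-basis matrix satisfies P_i^{-1} = P_i^T
d1, P1 = np.linalg.eigh(F1)
d2, P2 = np.linalg.eigh(F2)

# Equation (2) recovers each Fisher matrix from its OWN eigenbasis...
assert np.allclose(P1 @ np.diag(d1) @ P1.T, F1)
assert np.allclose(P2 @ np.diag(d2) @ P2.T, F2)

# ...but the two eigenbases differ (compared up to sign conventions),
# which is exactly the difficulty described above
assert not np.allclose(np.abs(P1), np.abs(P2))
```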
There is surely a linear algebra property that could circumvent this issue of accounting for the two different linear combinations of variables while still allowing a return to the starting space, i.e. the space of the individual parameters that the Fisher matrices represent.
I hope I have been clear; if someone could help me perform this operation, I would be grateful. If you have any questions, don't hesitate, I would be glad to give you more information.
Some clarifications on notation:
1. ##D## is the diagonal matrix obtained by summing the two diagonalized matrices ##D_1## and ##D_2## (from the initial matrices `Fisher1` and `Fisher2`): ##D = D_1 + D_2##.
2. ##P_1## and ##P_2## are respectively the change-of-basis matrices (from the diagonalization of `Fisher1` and `Fisher2`), composed of eigenvectors.
3. ##M_1## is the `Fisher1` matrix and ##M_2## is the `Fisher2` matrix.
So I am looking for an endomorphism ##M## that satisfies
$$D=P^{-1}.M.P\quad(3)$$
where ##P## is the unknown change-of-basis matrix. There are therefore two unknowns in my problem:
1. The change-of-basis matrix ##P##, i.e. its eigenvectors (I am currently trying to build it from the matrices ##P_1## and ##P_2##).
2. The matrix ##M## representing this endomorphism.
Despite all these unknowns, I do know the eigenvalues of the desired endomorphism ##M##: they are the diagonal entries of the matrix ##D##.
For the moment, I tried combining ##P_1## and ##P_2## as (admittedly only a rough guess):
$$P=P_1+P_2$$
so that I can compute the cross Fisher matrix ##M## as:
$$M=(P_1+P_2).D.(P_1+P_2)^{-1}$$
But after inverting ##M## (to get the cross covariance matrix), the constraints are not good: the sigmas are larger than expected, for example larger than the sigma given by a single matrix alone.
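For reference, the attempted construction can be reproduced numerically (again with made-up 2x2 matrices; the sign fix on the eigenvector columns is only there to keep ##P_1+P_2## invertible, since a rotation plus a reflection is exactly singular):

```python
import numpy as np

# Hypothetical 2x2 Fisher matrices over the same two parameters (illustration only)
F1 = np.array([[4.0, 1.0], [1.0, 3.0]])
F2 = np.array([[5.0, -1.0], [-1.0, 2.0]])

d1, P1 = np.linalg.eigh(F1)
d2, P2 = np.linalg.eigh(F2)

# Fix the eigenvector sign ambiguity so that det(P_i) = +1;
# otherwise P1 + P2 can be exactly singular
for P in (P1, P2):
    if np.linalg.det(P) < 0:
        P[:, 0] *= -1

# The attempted construction: M = (P1 + P2) . D . (P1 + P2)^{-1}
D = np.diag(d1 + d2)
P = P1 + P2
M = P @ D @ np.linalg.inv(P)

# By construction the eigenvalues of M are the entries of d1 + d2 (paired in
# sorted order), which is not the spectrum of F1 + F2, so M differs from it
assert not np.allclose(M, F1 + F2)
```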
Could anyone help me find a way to build this change-of-basis matrix ##P## from ##P_1## and ##P_2##? As you can see, a simple sum is not enough.
If an exact construction of ##P## from ##P_1## and ##P_2## is not possible, is there a way to approximate it?
Regards