Hepth
Gold Member
I want to construct a completely correlated chi^2.
I have a two-dimensional dataset, and it's basically like:
{m1,m2,m3,m4}
{a1,a2,a3,a4}
{x0,x0,x0,x0}
So m1-m4 and a1-a4 are all different, but each x0 is the same. This happens when I am fitting 2D data where the function is required to go to zero (or to some fixed point) along one axis.
I have the entire correlation matrix and the covariance matrix; I just need to calculate the chi^2 from them. Normally one would take the inverse of the covariance matrix and sum the differences between my model and the target values, weighted by that inverse.
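In code, the standard construction I mean looks roughly like this (a minimal numpy sketch; model, data, and cov are just placeholder names for the flattened model values, measured values, and covariance matrix):
Code:
import numpy as np

def chi2_full(model, data, cov):
    """Standard correlated chi^2: r^T C^{-1} r."""
    r = np.asarray(model) - np.asarray(data)   # residuals between model and data
    return float(r @ np.linalg.solve(cov, r))  # solve() avoids forming the explicit inverse
This works as long as cov is invertible, which is exactly what fails in my case.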
The problem is that the covariance matrix is not invertible because it is singular: there are n rows that are repeated. (Basically the x0-by-x0 block is all 1's, so those rows are linearly dependent.)
Is there a simple way around this? It doesn't matter too much, as the error on the x0 measurements is large (60%), but the errors are the same for all x0.
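For example, would something like the Moore-Penrose pseudo-inverse be a legitimate replacement for the inverse here, or should I instead collapse the repeated x0 entries into a single point before inverting? A rough sketch of the pseudo-inverse version (again with placeholder names, not tested on my actual data):
Code:
import numpy as np

def chi2_pinv(model, data, cov):
    """chi^2 built with the Moore-Penrose pseudo-inverse, which stays
    finite even when cov has exactly repeated (degenerate) rows."""
    r = np.asarray(model) - np.asarray(data)
    return float(r @ np.linalg.pinv(cov) @ r)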
The chi^2 is used for training a neural network.
I have included the cov matrix here:
Code:
cov = {{1.0,0.98,0.94,0.89,0.83,0.77,0.71,-0.18,-0.16,-0.13,-0.10,-0.072,-0.042,-0.013,0.91,0.91,0.90,0.88,0.86,0.81,0.74,-0.22,-0.22,-0.22,-0.22,-0.22,-0.22,-0.22},
{0.98,1.0,0.99,0.96,0.91,0.87,0.83,-0.0039,0.020,0.045,0.071,0.098,0.12,0.14,0.95,0.94,0.93,0.92,0.89,0.84,0.76,-0.24,-0.24,-0.24,-0.24,-0.24,-0.24,-0.24},
{0.94,0.99,1.0,0.99,0.97,0.94,0.91,0.16,0.18,0.20,0.23,0.25,0.27,0.28,0.96,0.95,0.94,0.92,0.89,0.84,0.76,-0.24,-0.24,-0.24,-0.24,-0.24,-0.24,-0.24},
{0.89,0.96,0.99,1.0,0.99,0.98,0.96,0.29,0.31,0.33,0.35,0.37,0.39,0.39,0.95,0.94,0.93,0.91,0.87,0.82,0.74,-0.24,-0.24,-0.24,-0.24,-0.24,-0.24,-0.24},
{0.83,0.91,0.97,0.99,1.0,1.0,0.98,0.40,0.42,0.44,0.46,0.47,0.48,0.48,0.92,0.91,0.90,0.88,0.85,0.80,0.71,-0.24,-0.24,-0.24,-0.24,-0.24,-0.24,-0.24},
{0.77,0.87,0.94,0.98,1.0,1.0,1.0,0.48,0.50,0.52,0.54,0.55,0.56,0.54,0.89,0.88,0.87,0.85,0.82,0.76,0.68,-0.23,-0.23,-0.23,-0.23,-0.23,-0.23,-0.23},
{0.71,0.83,0.91,0.96,0.98,1.0,1.0,0.55,0.57,0.59,0.60,0.61,0.61,0.60,0.86,0.85,0.84,0.82,0.78,0.73,0.65,-0.23,-0.23,-0.23,-0.23,-0.23,-0.23,-0.23},
{-0.18,-0.0039,0.16,0.29,0.40,0.48,0.55,1.0,1.0,1.0,0.99,0.98,0.96,0.91,0.14,0.14,0.14,0.14,0.14,0.13,0.13,0.090,0.090,0.090,0.090,0.090,0.090,0.090},
{-0.16,0.020,0.18,0.31,0.42,0.50,0.57,1.0,1.0,1.0,1.0,0.99,0.97,0.93,0.16,0.16,0.16,0.17,0.17,0.17,0.16,0.11,0.11,0.11,0.11,0.11,0.11,0.11},
{-0.13,0.045,0.20,0.33,0.44,0.52,0.59,1.0,1.0,1.0,1.0,0.99,0.98,0.94,0.19,0.19,0.19,0.19,0.20,0.20,0.20,0.13,0.13,0.13,0.13,0.13,0.13,0.13},
{-0.10,0.071,0.23,0.35,0.46,0.54,0.60,0.99,1.0,1.0,1.0,1.0,0.99,0.96,0.21,0.22,0.22,0.23,0.23,0.24,0.24,0.17,0.17,0.17,0.17,0.17,0.17,0.17},
{-0.072,0.098,0.25,0.37,0.47,0.55,0.61,0.98,0.99,0.99,1.0,1.0,1.0,0.97,0.23,0.24,0.25,0.26,0.27,0.28,0.29,0.22,0.22,0.22,0.22,0.22,0.22,0.22},
{-0.042,0.12,0.27,0.39,0.48,0.56,0.61,0.96,0.97,0.98,0.99,1.0,1.0,0.99,0.25,0.26,0.28,0.29,0.31,0.33,0.35,0.30,0.30,0.30,0.30,0.30,0.30,0.30},
{-0.013,0.14,0.28,0.39,0.48,0.54,0.60,0.91,0.93,0.94,0.96,0.97,0.99,1.0,0.26,0.28,0.30,0.32,0.35,0.38,0.41,0.40,0.40,0.40,0.40,0.40,0.40,0.40},
{0.91,0.95,0.96,0.95,0.92,0.89,0.86,0.14,0.16,0.19,0.21,0.23,0.25,0.26,1.0,1.0,0.99,0.98,0.96,0.92,0.85,-0.15,-0.15,-0.15,-0.15,-0.15,-0.15,-0.15},
{0.91,0.94,0.95,0.94,0.91,0.88,0.85,0.14,0.16,0.19,0.22,0.24,0.26,0.28,1.0,1.0,1.0,0.99,0.97,0.94,0.87,-0.10,-0.10,-0.10,-0.10,-0.10,-0.10,-0.10},
{0.90,0.93,0.94,0.93,0.90,0.87,0.84,0.14,0.16,0.19,0.22,0.25,0.28,0.30,0.99,1.0,1.0,1.0,0.99,0.96,0.90,-0.043,-0.043,-0.043,-0.043,-0.043,-0.043,-0.043},
{0.88,0.92,0.92,0.91,0.88,0.85,0.82,0.14,0.17,0.19,0.23,0.26,0.29,0.32,0.98,0.99,1.0,1.0,1.0,0.98,0.93,0.031,0.031,0.031,0.031,0.031,0.031,0.031},
{0.86,0.89,0.89,0.87,0.85,0.82,0.78,0.14,0.17,0.20,0.23,0.27,0.31,0.35,0.96,0.97,0.99,1.0,1.0,0.99,0.96,0.12,0.12,0.12,0.12,0.12,0.12,0.12},
{0.81,0.84,0.84,0.82,0.80,0.76,0.73,0.13,0.17,0.20,0.24,0.28,0.33,0.38,0.92,0.94,0.96,0.98,0.99,1.0,0.99,0.24,0.24,0.24,0.24,0.24,0.24,0.24},
{0.74,0.76,0.76,0.74,0.71,0.68,0.65,0.13,0.16,0.20,0.24,0.29,0.35,0.41,0.85,0.87,0.90,0.93,0.96,0.99,1.0,0.39,0.39,0.39,0.39,0.39,0.39,0.39},
{-0.22,-0.24,-0.24,-0.24,-0.24,-0.23,-0.23,0.090,0.11,0.13,0.17,0.22,0.30,0.40,-0.15,-0.10,-0.043,0.031,0.12,0.24,0.39,1.0,1.0,1.0,1.0,1.0,1.0,1.0},
{-0.22,-0.24,-0.24,-0.24,-0.24,-0.23,-0.23,0.090,0.11,0.13,0.17,0.22,0.30,0.40,-0.15,-0.10,-0.043,0.031,0.12,0.24,0.39,1.0,1.0,1.0,1.0,1.0,1.0,1.0},
{-0.22,-0.24,-0.24,-0.24,-0.24,-0.23,-0.23,0.090,0.11,0.13,0.17,0.22,0.30,0.40,-0.15,-0.10,-0.043,0.031,0.12,0.24,0.39,1.0,1.0,1.0,1.0,1.0,1.0,1.0},
{-0.22,-0.24,-0.24,-0.24,-0.24,-0.23,-0.23,0.090,0.11,0.13,0.17,0.22,0.30,0.40,-0.15,-0.10,-0.043,0.031,0.12,0.24,0.39,1.0,1.0,1.0,1.0,1.0,1.0,1.0},
{-0.22,-0.24,-0.24,-0.24,-0.24,-0.23,-0.23,0.090,0.11,0.13,0.17,0.22,0.30,0.40,-0.15,-0.10,-0.043,0.031,0.12,0.24,0.39,1.0,1.0,1.0,1.0,1.0,1.0,1.0},
{-0.22,-0.24,-0.24,-0.24,-0.24,-0.23,-0.23,0.090,0.11,0.13,0.17,0.22,0.30,0.40,-0.15,-0.10,-0.043,0.031,0.12,0.24,0.39,1.0,1.0,1.0,1.0,1.0,1.0,1.0},
{-0.22,-0.24,-0.24,-0.24,-0.24,-0.23,-0.23,0.090,0.11,0.13,0.17,0.22,0.30,0.40,-0.15,-0.10,-0.043,0.031,0.12,0.24,0.39,1.0,1.0,1.0,1.0,1.0,1.0,1.0}}
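(For what it's worth, the rank deficiency is easy to confirm once that matrix is loaded as a numpy array: the last seven rows are identical copies of the x0 row, so the rank comes out below 28.)
Code:
import numpy as np

def check_rank(cov):
    """Report the numerical rank of the matrix; rank < dimension means it is singular."""
    cov = np.asarray(cov)
    rank = np.linalg.matrix_rank(cov)
    print(f"rank {rank} of {cov.shape[0]}")
    return rank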
Thanks for any possible help.