kelly0303
Hello! I am reading *Data Reduction and Error Analysis* by Bevington, 3rd edition. In Chapter 8.1, "Variation of ##\chi^2## Near a Minimum," he states that for enough data the likelihood function becomes a Gaussian function of each parameter, with the mean being the value that minimizes the chi-square: $$P(a_j)=Ae^{-(a_j-a_j')^2/2\sigma_j^2}$$ where ##A## is a function of the other parameters, but not of ##a_j##.

Is this the general formula, or is it a simplification that assumes zero correlation between the parameters? From some examples later in the chapter I guess it is just a particular case, and I assume the most general formula would be a multivariate Gaussian, but he doesn't state this explicitly anywhere. Can someone tell me what the actual formula is? Also, can someone point me to a proof of this? Thank you!
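To make my guess concrete (this is my assumption, not something Bevington writes explicitly): writing ##\mathbf{a}'## for the parameter vector at the chi-square minimum and ##C## for the parameter covariance matrix, I would expect the general form to be $$P(\mathbf{a}) \propto \exp\left[-\tfrac{1}{2}(\mathbf{a}-\mathbf{a}')^{T} C^{-1} (\mathbf{a}-\mathbf{a}')\right].$$ If ##C## is diagonal, this factorizes into one-dimensional Gaussians with ##\sigma_j^2 = C_{jj}##, which is why I suspect the formula above is the uncorrelated special case. Is that right?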