A glitch in Jorrie’s Cosmo-Calculator?

  • Thread starter JimJCW
  • #36
Jorrie said:
Here is a patched version of Lightcone7, dated 2022-05-14, which numerically corrects for the offset in the low-z regime.

First, the output z range is still different from the input z range:

[Screenshots: input z range and output z range]


I did the following experiment:

If I calculate Dnow using Gnedin's calculator and your 2022 calculator and plot 'DGnedin - DJorrie' vs z, I expect to see points fluctuating around the DGnedin - DJorrie = 0 line. But I got the following result:

[Figure: DGnedin - DJorrie vs z]


I wonder what clues we can get out of this.
 
  • #37
@JimJCW - You asked: Please let me know the line number and file name concerning the '1.001 + z' problem

In my python version there's a line near the bottom with a comment saying something like "to replicate a bug in Lightcone7 uncomment this line". In the Javascript version, no idea. I noticed it in the output - if you turn on both the ##S## and ##z## columns and bump up the number of dps they display with you can verify it yourself by calculating ##S-z##, but I don't know why it happens.
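To make that check concrete, something like this would flag the offset in a results table (sketch only; the row shape with S and z fields is an assumption, not the actual Lightcone7 output format):

JavaScript:
// Check the stretch/redshift relation in a results table: in a correct
// output S - z equals exactly 1 for every row, so the bug shows up as a
// constant extra 0.001.
function checkStretchOffset(rows) {
    for (const row of rows) {
        const offset = row.S - row.z - 1;   // 0 if S = 1 + z
        if (Math.abs(offset) > 1e-9) {
            console.log(`z = ${row.z}: S - (1 + z) = ${offset.toFixed(6)}`);
        }
    }
}

// Example with the spurious offset present:
checkStretchOffset([{ S: 1.101, z: 0.1 }, { S: 1.051, z: 0.05 }]);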
 
  • #38
Jorrie said:
I have used Tamara Davis' "Fundamental Aspects of the Expansion of the Universe and Cosmic Horizons" equations in the Tutorial, but the implementation in Lightcone7 was a bit different and quite possibly inaccurate.

Note that ##(D_{par}+D_{hor})S=\int_0^\infty T_Hds=\mathrm{const}## only holds for comoving distances and not for the proper distances used in Lightcone7, where ##D_{par}## tends to infinity and ##D_{hor}## tends to a finite constant value. This is clear from the Davis Three-panel Cosmological Expansion graphs in my sig.
Right. I'll do a bit more reading.
 
  • #39
JimJCW said:
First, the output z range is still different from the input z range:

Hi Jim, yes, there is that difference, but that's not the problem; the input range just has to be a little larger than the output range. As long as the 'distance now' output is correct for the displayed z, that's acceptable for now.

For a trial, I've just subtracted 0.001 (line 274 in the 2022-05-14 version) from the output z in order to cancel the origin's horizontal offset, which is caused by the way the distances are computed in the legacy algorithm. The question is: are those (new) values correct? Perhaps you can check them against Gnedin's.
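In other words, the trial change amounts to a constant shift of the reported redshift, roughly like this (the real statement is the one at line 274; the variable name here is just illustrative):

JavaScript:
// Trial correction only: shift the output redshift by the constant offset
// seen at the origin. This masks the symptom rather than fixing the
// underlying integration; 'result.z' is an illustrative name.
result.z = result.z - 0.001;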

I know that this is not the right way to handle it, so we should be investigating a purer algorithm, like @Ibix is looking at. My own time is slightly limited atm, but I will keep watching and contributing as it hopefully develops as a community project.

Thanks for contributing!
 

  • #40
Jorrie said:
Still pondering an overhaul of the legacy-ridden calculation.js module.
That might be an idea. Is the duplication of the line result.H_t = result.H_t*Ynow; in ScaleResults an error?

JavaScript:
      case "Normalized":
          result.Tnow = result.Tnow/Tage;
          result.Y = result.Y/Ynow;
          result.Dnow  = result.Dnow/Ynow;
          result.Dthen = result.Dthen/Ynow;
          result.Dhor = result.Dhor/Ynow;
          result.Dpar = result.Dpar/Ynow;
          result.H_t = result.H_t*Ynow;
          result.H_t = result.H_t*Ynow;
          result.TemperatureT = result.TemperatureT / 2.725 ;  
        break;
 
  • #41
Ibix said:
You could put the code on GitHub? I expect GitHub won't last forever (nothing does), but it has a huge amount of inertia. Your code wouldn't run there, but it would always be available for someone to host should your version disappear.
GitHub provides a (free for open source) service called GitHub Pages which can host this so that it WILL run there (as long as it doesn't go over the free usage limits which should be adequate).

Now that GitHub is owned by Microsoft, and they have moved a lot of their own code (both open source and proprietary) onto it, it is not going to disappear for a LONG time.
 
  • #42
pbuk said:
they [Microsoft] have moved a lot of their code (both open source and proprietary) onto it, it is not going to disappear for a LONG time.
As much as we might want it to. <sigh>
 
  • #43
pbuk said:
That might be an idea. Is the duplication of the line result.H_t = result.H_t*Ynow; in ScaleResults an error?

JavaScript:
      case "Normalized":
          result.Tnow = result.Tnow/Tage;
          result.Y = result.Y/Ynow;
          result.Dnow  = result.Dnow/Ynow;
          result.Dthen = result.Dthen/Ynow;
          result.Dhor = result.Dhor/Ynow;
          result.Dpar = result.Dpar/Ynow;
          result.H_t = result.H_t*Ynow;
          result.H_t = result.H_t*Ynow;
          result.TemperatureT = result.TemperatureT / 2.725 ; 
        break;
Yes, good spot! Case "Normalized" is not often used, so perhaps that's why the mistake escaped detection.
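For reference, the corrected branch would simply drop the repeated line (a sketch of the fix, same code as quoted above):

JavaScript:
      case "Normalized":
          result.Tnow = result.Tnow/Tage;
          result.Y = result.Y/Ynow;
          result.Dnow  = result.Dnow/Ynow;
          result.Dthen = result.Dthen/Ynow;
          result.Dhor = result.Dhor/Ynow;
          result.Dpar = result.Dpar/Ynow;
          result.H_t = result.H_t*Ynow;   // applied once only
          result.TemperatureT = result.TemperatureT / 2.725;
        break;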
 
  • #44
Ibix said:
@JimJCW - You asked: Please let me know the line number and file name concerning the '1.001 + z' problem

In my python version there's a line near the bottom with a comment saying something like "to replicate a bug in Lightcone7 uncomment this line". In the Javascript version, no idea. I noticed it in the output - if you turn on both the ##S## and ##z## columns and bump up the number of dps they display with you can verify it yourself by calculating ##S-z##, but I don't know why it happens.

Would you please calculate the following Dnow values using your calculator so I can do some comparison?

z        Gnedin     Jorrie, 2022    ibix
0.1      1.41004    1.4084
0.09     1.27214    1.270609
0.08     1.13354    1.132106
0.07     0.99426    0.992931
0.06     0.85427    0.85305
0.05     0.7136     0.712477
0.04     0.57225    0.571225
0.03     0.43023    0.429278
0.02     0.28748    0.286651
0.01     0.14408    0.14336
0        0          0.000621
-0.01    0.14474    0.14526
-0.02    0.29015    0.290567
-0.03    0.43623    0.436536
-0.04    0.58295    0.583161
-0.05    0.73032    0.730433
-0.06    0.87834    0.878358
-0.07    1.02699    1.026926
-0.08    1.17631    1.176124
-0.09    1.32623    1.325968
-0.1     1.47679    1.476416

I am puzzled by the plot shown in Post #36.
 
  • #45
I think the puzzles will go away if we get the distances algorithm sorted out.
 
  • #46
pbuk said:
GitHub provides a (free for open source) service called GitHub Pages which can host this so that it WILL run there (as long as it doesn't go over the free usage limits which should be adequate).
I will give GitHub Pages a try, because it seems a good place for a collaborative project.
 
  • #47
JimJCW said:
Would you please calculate the following Dnow values using your calculator so I can do some comparison?
Install python, man! If you're remotely interested in any scientific computing it's a useful tool. I believe Anaconda is the recommended Windows distribution, and I believe it comes with the scipy library I used in my code. Then you can play around with my code to your heart's content.

Here are the results. ##S## is set to the correct ##1+z##, not the compatibility ##1.001+z## mode.
z        Dnow (Gly)
0.1      1.41
0.09     1.27
0.08     1.13
0.07     0.994
0.06     0.854
0.05     0.713
0.04     0.572
0.03     0.43
0.02     0.287
0.01     0.144
0        0
-0.01    0.145
-0.02    0.29
-0.03    0.436
-0.04    0.583
-0.05    0.73
-0.06    0.878
-0.07    1.03
-0.08    1.18
-0.09    1.33
-0.1     1.48
 
  • #48
Jorrie said:
I think the puzzles will go away if we get the distances algorithm sorted out.
Yes, although this is not going to be as simple as porting @Ibix's code, which leverages a scipy module (which in turn leverages a FORTRAN module) to do integration with infinite limits and automatic step-size adjustment. That module is not available in the browser, so it is necessary to implement a similar algorithm "by hand".
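As a sketch of what "by hand" might look like in the browser (everything below is illustrative, not the actual module): an adaptive Simpson rule stands in for the scipy quadrature, and the substitution s = S + t/(1 - t) folds the infinite upper limit onto the finite interval t in [0, 1).

JavaScript:
// Adaptive Simpson quadrature: recursively halve the interval until the
// local error estimate is below the tolerance. Illustrative sketch only.
function adaptiveSimpson(f, a, b, tol = 1e-10, maxDepth = 30) {
    const simpson = (fa, fm, fb, h) => (h / 6) * (fa + 4 * fm + fb);
    function refine(a, b, fa, fm, fb, whole, depth) {
        const m = (a + b) / 2;
        const lm = (a + m) / 2, rm = (m + b) / 2;
        const flm = f(lm), frm = f(rm);
        const left = simpson(fa, flm, fm, m - a);
        const right = simpson(fm, frm, fb, b - m);
        const delta = left + right - whole;
        if (depth <= 0 || Math.abs(delta) < 15 * tol) {
            return left + right + delta / 15;   // Richardson correction
        }
        return refine(a, m, fa, flm, fm, left, depth - 1)
             + refine(m, b, fm, frm, fb, right, depth - 1);
    }
    const m = (a + b) / 2;
    const fa = f(a), fm = f(m), fb = f(b);
    return refine(a, b, fa, fm, fb, simpson(fa, fm, fb, b - a), maxDepth);
}

// Fold integral_S^infinity f(s) ds onto t in [0, 1) via s = S + t/(1 - t),
// ds = dt/(1 - t)^2, stopping just short of the singular endpoint t = 1.
function integrateToInfinity(f, S, tol = 1e-10) {
    const g = t => f(S + t / (1 - t)) / ((1 - t) * (1 - t));
    return adaptiveSimpson(g, 0, 1 - 1e-9, tol);
}

// Quick check: integral_0^infinity e^(-s) ds = 1
console.log(integrateToInfinity(s => Math.exp(-s), 0));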

I have some experience of collaborative Javascript projects, @Jorrie do you mind if I set something up as an illustration of how to transition the calculation module to a robust modern approach?
 
  • #49
Jorrie said:
I think the puzzles will go away if we get the distances algorithm sorted out.
I've been having a look at your Calculation.js module. Your ##D_{hor}## is just ##1/H(z)## multiplied by the appropriate unit conversion factor, so is actually the same as your ##R## column - I can do that.

However your ##D_{par}## seems to be ##a\int_0^a\frac{da'}{a'H(a')}## which, noting that ##a=1/S##, ##a'=1/s## and ##da'=-ds/s^2##, is ##\frac 1S\int_S^\infty\frac{ds}{sH(s)}##, which is what I've implemented, so I'm a bit confused. Also, I'm full of viruses today, so it's possible I've screwed something up.
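To pin down what is being compared, here is a small self-contained sketch of those two expressions exactly as written above (the Hubble function, parameter values and unit choices are illustrative assumptions, not the Lightcone7 ones; with H in 1/Gyr and c = 1, distances come out in Gly):

JavaScript:
// Illustrative flat-LCDM Hubble rate as a function of stretch s = 1 + z,
// in 1/Gyr, so 1/H is in Gyr and (with c = 1) distances are in Gly.
const H0 = 0.0693;   // roughly 67.7 km/s/Mpc
const Om = 0.31, Orad = 9e-5, OL = 1 - Om - Orad;
const H = s => H0 * Math.sqrt(OL + Om * s ** 3 + Orad * s ** 4);

// Plain trapezoidal integral over [a, b]; crude, but enough for a sketch.
function trapz(f, a, b, n = 200000) {
    const h = (b - a) / n;
    let sum = 0.5 * (f(a) + f(b));
    for (let i = 1; i < n; i++) sum += f(a + i * h);
    return sum * h;
}

// D_hor as described above: just the Hubble radius c/H at stretch S.
const Dhor = S => 1 / H(S);

// D_par as written above: (1/S) * integral_S^infinity ds / (s H(s)),
// with the infinite limit folded onto t in [0, 1) via s = S + t/(1 - t).
function Dpar(S) {
    const g = t => {
        const s = S + t / (1 - t);
        return 1 / (s * H(s)) / ((1 - t) * (1 - t));
    };
    return trapz(g, 0, 1 - 1e-7) / S;
}

console.log("Dhor(1) =", Dhor(1).toFixed(2), "Gly");   // ~14.4 Gly today
console.log("Dpar(1) =", Dpar(1).toFixed(2), "Gly");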
 
  • #50
pbuk said:
This module is not available in the browser so it is necessary to implement a similar algorithm "by hand".
This has an ODE solver: https://ccc-js.github.io/numeric2/index.html. I think that could be slightly abused to do integration. Or we could just use the approach in the existing module, which seems to work fine with the possible exception of the ##D_{par}## calculation that we need to revisit. I don't think it's a bad approach and seems pretty responsive (although I'd move some stuff around to make it a bit easier to see what's going on, and you can get rid of one pass of the integrator, I think). It's just grown messy and needs a dead code pruning round.
 
  • #51
Ibix said:
This has an ODE solver: https://ccc-js.github.io/numeric2/index.html. I think that could be slightly abused to do integration.
I don't think that is a very good idea: (i) the project is even more stale than Lightcone7 (no commits for 10 years); (ii) robust methods for quadrature are very different from methods for initial value problems.

Ibix said:
Or we could just use the approach in the existing module.
I think that would be better.

Ibix said:
I don't think it's a bad approach and seems pretty responsive (although I'd move some stuff around to make it a bit easier to see what's going on, and you can get rid of one pass of the integrator, I think). It's just grown messy and needs a dead code pruning round.
I agree with all of that. An approach I have found successful in the past is to extract the existing code into a stand-alone module and implement continuous integration with some high-level functional tests to check you don't break anything, before refactoring the existing code so that you can figure out how it works (and why it doesn't).

Once you have done this you can prioritize what needs to be done next (quick fix to address bugs? rewrite algorithm for better stability/accuracy/speed? Etc...).
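As a sketch of what such a high-level functional test could look like (the function name calculateDnow and the tolerance are placeholders; the reference values are the Gnedin figures quoted earlier in this thread):

JavaScript:
// Minimal functional test: compare the module's Dnow output against
// externally computed reference values before refactoring anything.
// 'calculateDnow' stands in for whatever the extracted module exports.
const referenceDnow = [
    { z: 0.1,  Dnow: 1.41004 },   // Gnedin values quoted earlier
    { z: 0.05, Dnow: 0.7136 },
    { z: 0.01, Dnow: 0.14408 },
];

function runDnowTests(calculateDnow, tolerance = 0.002) {
    let failures = 0;
    for (const { z, Dnow } of referenceDnow) {
        const got = calculateDnow(z);
        if (Math.abs(got - Dnow) > tolerance) {
            console.error(`Dnow(${z}) = ${got}, expected about ${Dnow}`);
            failures++;
        }
    }
    console.log(failures === 0 ? "all Dnow checks passed" : `${failures} failure(s)`);
}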
 
  • #52
pbuk said:
(ii) robust methods for quadrature are very different from methods for initial value problems.
While I have no comment on the referenced project, far more development of adaptive and variable step size has occurred for ODEs than quadrature. In particular, for spikey functions poorly approximated by polynomials, a given accuracy is often achieved with much less computation by casting the quadrature as an ODE, and using modern ODE methods.
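For what it's worth, a bare-bones sketch of that idea (purely illustrative, not tied to any particular library): treat I(x) = integral_a^x f(s) ds as the initial value problem I'(x) = f(x), I(a) = 0, and drive it with a step-doubling RK4.

JavaScript:
// One RK4 step for I'(x) = f(x); with no y-dependence this reduces to a
// Simpson evaluation over the step.
function rk4Step(f, x, y, h) {
    const k1 = f(x);
    const k2 = f(x + h / 2);
    const k3 = f(x + h / 2);
    const k4 = f(x + h);
    return y + (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4);
}

// Integrate f over [a, b] by marching the IVP with step-doubling control:
// compare one full step with two half steps and adapt h accordingly.
function integrateAsODE(f, a, b, tol = 1e-9) {
    let x = a, y = 0, h = (b - a) / 100;
    while (x < b) {
        if (x + h > b) h = b - x;                 // don't overshoot the end
        const oneStep = rk4Step(f, x, y, h);
        const half = rk4Step(f, x, y, h / 2);
        const twoSteps = rk4Step(f, x + h / 2, half, h / 2);
        const err = Math.abs(twoSteps - oneStep);
        if (err < tol || h < 1e-12) {
            x += h; y = twoSteps;                 // accept the step
            if (err < tol / 10) h *= 2;           // and grow the step size
        } else {
            h /= 2;                               // reject and retry smaller
        }
    }
    return y;
}

// Example: integral of exp(-s^2) over [0, 5], about sqrt(pi)/2 = 0.8862...
console.log(integrateAsODE(s => Math.exp(-s * s), 0, 5));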
 
  • #53
Here it is: https://lightcone7.github.io/LightCone7.html, with the calculations split out into a separate module (source here), making the bare minimum of changes to get it to work independently.

Needs some tidying up and documentation: PM me with github user details if you want to use/take over this.
 
  • #54
PAllen said:
In particular, for spikey functions poorly approximated by polynomials, a given accuracy is often achieved with much less computation by casting the quadrature as an ODE, and using modern ODE methods.
I didn't realize that, although these functions are generally smooth.

PAllen said:
Far more development of adaptive and variable step size has occurred for ODEs than quadrature.
Yes this is true in general, but there is not much implemented in Javascript for ODEs either.
 
  • #55
pbuk said:
I didn't realize that, although these functions are generally smooth.
Agreed.
pbuk said:
Yes this is true in general, but there is not much implemented in Javascript for ODEs either.
Probably true, but in other languages publicly available implementations of almost any method should be available. The SciPy quadrature interface @Ibix mentioned looks good. It uses adaptive step size. SciPy also has ODE methods.

[edit: I see this was all discussed before - browser availability is a requirement. I have no idea what is available open source in this context.]
 
  • #56
pbuk said:
I have some experience of collaborative Javascript projects, @Jorrie do you mind if I set something up as an illustration of how to transition the calculation module to a robust modern approach?
By all means, do that. :smile:
 
  • #57
Ibix said:
Install python, man! . . .

That sounds like a good idea. Thanks for calculating the Dnow values! A comparison is shown below:

z        Gnedin     Jorrie, 2022    ibix
0.1      1.41004    1.4084          1.41
0.09     1.27214    1.270609        1.27
0.08     1.13354    1.132106        1.13
0.07     0.99426    0.992931        0.994
0.06     0.85427    0.85305         0.854
0.05     0.7136     0.712477        0.713
0.04     0.57225    0.571225        0.572
0.03     0.43023    0.429278        0.43
0.02     0.28748    0.286651        0.28
0.01     0.14408    0.14336         0.144
0        0          0.000621        0
-0.01    0.14474    0.14526         0.145
-0.02    0.29015    0.290567        0.29
-0.03    0.43623    0.436536        0.436
-0.04    0.58295    0.583161        0.583
-0.05    0.73032    0.730433        0.73
-0.06    0.87834    0.878358        0.878
-0.07    1.02699    1.026926        1.03
-0.08    1.17631    1.176124        1.18
-0.09    1.32623    1.325968        1.33
-0.1     1.47679    1.476416        1.48

If we plot the three Dnow vs z curves, they are just about indistinguishable. If we add the (DGnedin – Dibix) result to the plot shown in Post #36, we get the following:

[Figure: DGnedin - DJorrie and DGnedin - Dibix vs z]

In general, (DGnedin - Dibix) fluctuates around the y = 0 line. The greater deviations from the line near the two ends are probably due to rounding of the quoted Dibix values. I am still puzzled by the slightly different behaviors of the two data sets.
 
  • #58
JimJCW said:
I am still puzzled by the slightly different behaviors of the two data sets.
I'm not. Jorrie's calculation uses a fixed step size, the integration steps don't match the result steps exactly, it doesn't treat the integration to infinity properly, and it only manages to get as close as it does by using far more integration steps than it needs to - in short, it needs rewriting.
 
  • #59
pbuk said:
in short it needs rewriting.
It looks to me that Calculate() can be split up into three steps
  1. Compute a list of the ##z## values requested, making sure ##z=0## is there
  2. Do the integrals ##\int_0^S\frac{1}{H(s)}ds## and ##\int_0^S\frac{1}{sH(s)}ds## for all ##S## corresponding to those ##z## and also for ##S=\infty##.
  3. Compute all output statistics for each ##z## (some of which depend on the integral up to ##S=1## or ##S=\infty##, hence doing this after the integration run).
I would expect that this is just reordering existing code, so it ought to make little to no difference to the output. Then we could work on doing the integration in a better way.
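To make the idea concrete, here is a rough, runnable sketch of that three-phase structure (the parameters, the simple midpoint integrator and the two example outputs are all illustrative, not the existing Calculation.js code):

JavaScript:
// Sketch of Calculate() reorganised into the three phases above.
// Units: H in 1/Gyr with c = 1, so distances are in Gly and times in Gyr.
const H0 = 0.0693, Om = 0.31, Orad = 9e-5, OL = 1 - Om - Orad;
const H = s => H0 * Math.sqrt(OL + Om * s ** 3 + Orad * s ** 4);

// Midpoint-rule integral; it never evaluates the endpoints, which keeps the
// u -> 0 limit below out of trouble.
function integrate(f, a, b, n = 20000) {
    const h = (b - a) / n;
    let sum = 0;
    for (let i = 0; i < n; i++) sum += f(a + (i + 0.5) * h);
    return sum * h;
}

function calculate(zInputs) {
    // Phase 1: the list of requested z values, with z = 0 guaranteed present.
    const zValues = [...zInputs];
    if (!zValues.includes(0)) zValues.push(0);
    zValues.sort((a, b) => b - a);

    // Phase 2: one integration pass per row; the infinite upper limit of the
    // age integral is folded onto u in (0, 1/S] via the substitution s = 1/u.
    const rows = zValues.map(z => {
        const S = 1 + z;
        return {
            z,
            distInt: integrate(s => 1 / H(s), 1, S),             // Gly
            ageInt: integrate(u => 1 / (u * H(1 / u)), 0, 1 / S)  // Gyr
        };
    });

    // Phase 3: per-row output statistics built from the stored integrals.
    return rows.map(r => ({ z: r.z, Dnow: r.distInt, ageThen: r.ageInt }));
}

console.log(calculate([0.1, 0.05, 0.01]));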

Any thoughts? I haven't looked at your github page yet @pbuk, but will do so when I'm on something with a slightly larger screen.
 
  • #60
Sounds good. The present Calculate() also has a separate T_age module, simply because there is a legacy requirement to immediately see the effect on the present age when any relevant input parameter is changed. But this could be integrated with the ##\int_0^S\frac{1}{H(s)}ds## module.
 
  • #61
Ibix said:
Any thoughts?
Yes, that's exactly what I was thinking (plus a step 0: sanitizing inputs and other initialization).
 
  • #62
I'm impressed with the apparent enthusiasm for the community project, which should finally become a PF Cosmo tool, decoupled from any one member and managed by an elected committee on a neutral platform.
 
  • #63
Well kind of self-elected! PM GitHub user ids to be added as project admins.
 
  • #64
If there is no good open source quadrature package available for JavaScript, it seems to me that the methods described in section 4.5 of the third edition of "Numerical Recipes" are more than adequate for the required integrals, and they require much less code than trying to reproduce something like QUADPACK.
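For illustration, here is the flavour of a transformation-based rule for a semi-infinite range (from memory, so whether this is exactly what that section describes is an assumption; the substitution, step size and cut-off are illustrative): substitute x = a + exp((pi/2) sinh t) and sum on an even grid in t.

JavaScript:
// Minimal exp-sinh (double-exponential) rule for integral_a^infinity f(x) dx.
// The endpoint terms are negligible, so a plain equal-weight sum in t behaves
// like the trapezoidal rule and converges very quickly for smooth integrands.
// Illustrative sketch, not production code.
function expSinhQuad(f, a, h = 0.1, tMax = 4) {
    let sum = 0;
    for (let t = -tMax; t <= tMax; t += h) {
        const u = (Math.PI / 2) * Math.sinh(t);
        const x = a + Math.exp(u);
        const dxdt = (Math.PI / 2) * Math.cosh(t) * Math.exp(u);
        sum += f(x) * dxdt;
    }
    return sum * h;
}

// Quick check: integral_0^infinity e^(-x) dx = 1
console.log(expSinhQuad(x => Math.exp(-x), 0));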
 
  • #65
I'm quite rusty on my cosmology... so I can't vouch for the equations.
When this thread started, I was interested in trying to reproduce the calculations in Desmos.

So, here is what I have so far:
https://www.desmos.com/calculator/dorewco5ms
See the table at the end to check the numerical calculations.

You'll have to check the equations (based on calculations by Jorrie and Ibix; see also work by pbuk), all documented in the Desmos file.

I don't have any more time to play around with it.
Maybe someone might find it useful (once the equations are verified).


(A note on Desmos: the ordering of the entries doesn't matter. Unlike GeoGebra or Trinket/GlowScript, every Desmos "save" generates a new unique URL, which the saver has to share for others to see.)
 
  • #66
@robphy, looking at the Desmos worksheet (really nice, by the way; I'd never heard of this tool), I have one question. You have a formula using ##f_T##, but I don't see a definition for it.
 
  • #67
PAllen said:
@robphy, looking at the Desmos worksheet (really nice, by the way; I'd never heard of this tool), I have one question. You have a formula using ##f_T##, but I don't see a definition for it.
It’s in the “constants” folder.
It’s a conversion unit.
 
  • #68
pbuk said:
I'm not. Jorrie's calculation uses a fixed step size, the integration steps don't match the result steps exactly, it doesn't treat the integration to infinity properly, and it only manages to get as close as it does by using far more integration steps than it needs to - in short, it needs rewriting.

I have been calculating Dnow using various calculators. The results are often slightly different from one another. The results calculated with the cosmic and Toth calculators are added to the table shown in Post #57. They are consistent with those calculated by @Ibix, but with more digits kept:

z        Gnedin     Jorrie, 2022   ibix     cosmic      Toth
0.1      1.41004    1.4084         1.41     1.409592    1.409608
0.09     1.27214    1.270609       1.27     1.27174     1.271756
0.08     1.13354    1.132106       1.13     1.133189    1.133219
0.07     0.99426    0.992931       0.994    0.993944    0.993964
0.06     0.85427    0.85305        0.854    0.854005    0.854024
0.05     0.7136     0.712477       0.713    0.713377    0.713399
0.04     0.57225    0.571225       0.572    0.572063    0.57209
0.03     0.43023    0.429278       0.43     0.430065    0.430062
0.02     0.28748    0.286651       0.287    0.287385    0.287382
0.01     0.14408    0.14336        0.144    0.144029    0.144017
0        0          0.000621       0        0           0
-0.01    0.14474    0.14526        0.145    0.144698    0.144702
-0.02    0.29015    0.290567       0.29     0.290061    0.290057
-0.03    0.43623    0.436536       0.436    0.436084    0.436097
-0.04    0.58295    0.583161       0.583    0.582763    0.582789
-0.05    0.73032    0.730433       0.73     0.730094    0.730101
-0.06    0.87834    0.878358       0.878    0.878072    0.878098
-0.07    1.02699    1.026926       1.03     1.026688    1.026715
-0.08    1.17631    1.176124       1.18     1.175941    1.175951
-0.09    1.32623    1.325968       1.33     1.325824    1.32584
-0.1     1.47679    1.476416       1.48     1.476332    1.476349

For the discussion below, let's assume that the cosmic result is correct and use it as a reference for comparison.

For two properly working calculators, the difference in their calculated Dnow’s as a function of z should be a slightly fluctuating plot around the y = 0 line, as exemplified by the Dcosmic - DToth plot shown in the following figure:

[Figure: Dcosmic - DToth and Dcosmic - DJorrie vs z]


The Dcosmic - DJorrie plot, however, shows a peculiar behavior: there is an abrupt break around z = 0 and DJorrie(z) deviates systematically from Dcosmic(z). It would be nice if we could get a clue about why this is happening. If you or @Jorrie have some ideas about the location(s) in the Calculation.js file responsible for this behavior, please let us know.
 
  • #69
JimJCW said:
If you or @Jorrie have some ideas about the location(s) in the Calculation.js file responsible for this behavior, please let us know.
It's more complicated than that; see post #55 above. Anyway, we have moved on from there: the Dnow calculations seem to be much improved over the range near z = 0, but other calculations need to be checked and a few other things sorted. The results in the table below are taken from the current development version (1.2.0) running at https://lightcone7.github.io/LightCone7-develop.html.

z        Gnedin     Jorrie, 2022   ibix     cosmic      Toth        v1.2.0
0.1      1.41004    1.4084         1.41     1.409592    1.409608    1.409700
0.09     1.27214    1.270609       1.27     1.27174     1.271756    1.271836
0.08     1.13354    1.132106       1.13     1.133189    1.133219    1.133276
0.07     0.99426    0.992931       0.994    0.993944    0.993964    0.940199
0.06     0.85427    0.85305        0.854    0.854005    0.854024    0.854071
0.05     0.7136     0.712477       0.713    0.713377    0.713399    0.713433
0.04     0.57225    0.571225       0.572    0.572063    0.57209     0.572107
0.03     0.43023    0.429278       0.43     0.430065    0.430097    0.
0.02     0.28748    0.286651       0.287    0.287385    0.287382    0.287407
0.01     0.14408    0.14336        0.144    0.144029    0.144017    0.144040
0        0          0.000621       0        0           0           0
-0.01    0.14474    0.14526        0.145    0.144698    0.144702    0.144040
-0.02    0.29015    0.290567       0.29     0.290061    0.290057    0.290084
-0.03    0.43623    0.436536       0.436    0.436084    0.436097    0.436119
-0.04    0.58295    0.583161       0.583    0.582763    0.582789    0.582810
-0.05    0.73032    0.730433       0.73     0.730094    0.730101    0.730153
-0.06    0.87834    0.878358       0.878    0.878072    0.878098    0.878142
-0.07    1.02699    1.026926       1.03     1.026688    1.026715    1.026771
-0.08    1.17631    1.176124       1.18     1.175941    1.175951    1.176036
-0.09    1.32623    1.325968       1.33     1.325824    1.32584     1.325931
-0.1     1.47679    1.476416       1.48     1.476332    1.476349    1.476451
 
  • #70
pbuk said:
It's more complicated than that; see post #55 above. Anyway, we have moved on from there: the Dnow calculations seem to be much improved over the range near z = 0, but other calculations need to be checked and a few other things sorted. The results in the table below are taken from the current development version (1.2.0) running at https://lightcone7.github.io/LightCone7-develop.html.

I noticed that, using version 1.2.0, I can't duplicate your Dnow(z) values in one run of the calculator; I had to calculate one or two values at a time. Some of the Dnow values probably were not entered correctly in your table, for example Dnow = 0 for z = 0.03.

If we add the (Dcosmic - Dv1.2.0) vs z curve to the figure shown in Post #68, we get the following result:

[Figure: the Post #68 comparison with (Dcosmic - Dv1.2.0) added]


The Dv1.2.0(z) result shows a big improvement over the Jorrie 2022 result, but it still deviates systematically from Dcosmic(z). A better indication of this is shown below:

[Figure: Dcosmic - Dv1.2.0 vs z]
 
