A proliferation of very large voids has been observed recently, at scales of 250-350 Mpc. Voids are believed to have average densities of about 10% of the cosmic average, and are conservatively believed to comprise at least 60% of the volume of the observable universe, perhaps more than 80%.
Voids are believed to have arisen from the gravitational amplification of primordial density fluctuations, such as those present at the time of last scattering and reflected in the CMB. The CMB anisotropy is measured to be very small, on the order of $10^{-5}$.
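As a rough check on the numbers (a back-of-the-envelope sketch using the standard Einstein-de Sitter linear growth law, $\delta \propto a$):

$$\delta_0 \sim \delta_{\rm rec}\,\frac{a_0}{a_{\rm rec}} \approx 10^{-5} \times 1100 \approx 10^{-2},$$

so fluctuations at the amplitude seen in the photons would still be linear today. My understanding is that the dark matter perturbations at last scattering were already larger (of order $10^{-3}$, having grown while the baryons were still coupled to the photons), which is what allows voids to reach $\delta \approx -0.9$ by now.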
Therefore, in order to have attained the degree of density inhomogeneity observed today, individual voids must have expanded at a rate in excess of the average FLRW rate over the history of the expansion.
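A sketch of why I say this, using the standard linear continuity argument: mass conservation in a region of density contrast $\delta$ gives a local volume $V \propto a^3/(1+\delta)$, so the local expansion rate is

$$H_{\rm loc} = \frac{1}{3}\frac{\dot V}{V} = H - \frac{1}{3}\frac{\dot\delta}{1+\delta},$$

and a region whose underdensity is deepening ($\dot\delta < 0$) must be expanding faster than the background, $H_{\rm loc} > H$.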
Even in an Einstein-de Sitter universe with Lambda (the cosmological constant) equal to zero, it seems to me that the increasing inhomogeneity over time represented by expanding voids would derail the Friedmann equation. As the matter overdensity of the void shells is squeezed into a smaller and smaller fraction of the total volume, its influence over the expansion rate of the void interiors must decrease, as a consequence of the inverse-square law of gravity: the gravitational source becomes more and more distant from an ever-increasing fraction of the volume. The equation must depart from equilibrium as the faster-expanding regions become an ever larger fraction of the total volume. [For example, if all of the matter of the observable universe were condensed into a single black hole, surely the total expansion rate of the observable universe (at its current size) would be faster than the Friedmann equation would calculate for a homogeneous matter distribution.]
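For reference, I believe this kind of departure is what the Buchert spatially averaged equations formalize (I quote them only to pin down what I mean, not to commit to a particular backreaction model). For irrotational dust averaged over a domain $\mathcal{D}$:

$$3\left(\frac{\dot a_{\mathcal D}}{a_{\mathcal D}}\right)^2 = 8\pi G\,\langle\rho\rangle_{\mathcal D} - \frac{1}{2}\langle\mathcal R\rangle_{\mathcal D} - \frac{1}{2}\mathcal Q_{\mathcal D}, \qquad 3\,\frac{\ddot a_{\mathcal D}}{a_{\mathcal D}} = -4\pi G\,\langle\rho\rangle_{\mathcal D} + \mathcal Q_{\mathcal D},$$

where the kinematical backreaction term

$$\mathcal Q_{\mathcal D} = \frac{2}{3}\left(\langle\theta^2\rangle_{\mathcal D} - \langle\theta\rangle_{\mathcal D}^2\right) - 2\langle\sigma^2\rangle_{\mathcal D}$$

shows that a large variance in the local expansion rate $\theta$ (fast voids, slow walls) enters the averaged acceleration equation with the same sign as an effective acceleration.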
If the matter shells of the voids had been gravitationally bound from the beginning, the void fraction would not have increased over time. I cannot picture any scenario in which the void fraction could increase to the extent it has and then subsequently undergo gravitational collapse. The historical expansion rate of voids leads me to conclude either that the total density is below the critical density, or alternatively that even if it is not, the inhomogeneity must nevertheless cause the expansion to accelerate rather than decelerate.
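To make the historical-expansion claim concrete, here is a toy numerical sketch (the parameters are illustrative assumptions, e.g. the initial underdensity delta_i = -1e-3 at last scattering). By Birkhoff's theorem, a spherically symmetric underdense patch evolves like its own open FLRW region, so I compare one against a flat Einstein-de Sitter background that starts with the same expansion rate:

```python
# Toy model: an underdense spherical patch in an EdS background,
# treated via Birkhoff's theorem as its own open FLRW region.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import brentq

a_i = 1.0 / 1100.0   # initial scale factor (last scattering)
delta_i = -1e-3      # assumed initial void underdensity (illustrative)
# Units: the initial Hubble rate H_i = 1, so time is in initial Hubble times.

def dadt(t, a, om_i, ok_i):
    # Friedmann equation normalized at the initial epoch:
    # H^2 = om_i*(a_i/a)^3 + ok_i*(a_i/a)^2, with om_i + ok_i = 1 at a = a_i.
    H = np.sqrt(om_i * (a_i / a[0])**3 + ok_i * (a_i / a[0])**2)
    return [a[0] * H]

t_span = (0.0, 3.0e4)
# Background: flat EdS (Omega_m = 1).  Void patch: slightly open,
# same initial expansion rate but lower density.
sol_bg = solve_ivp(dadt, t_span, [a_i], args=(1.0, 0.0),
                   dense_output=True, rtol=1e-10, atol=1e-14)
sol_v  = solve_ivp(dadt, t_span, [a_i], args=(1.0 + delta_i, -delta_i),
                   dense_output=True, rtol=1e-10, atol=1e-14)

# "Today" = the time at which the background scale factor reaches 1.
t0 = brentq(lambda t: sol_bg.sol(t)[0] - 1.0, 1e-6, t_span[1])
a_bg = sol_bg.sol(t0)[0]
a_v  = sol_v.sol(t0)[0]

# Density contrast today: rho_v/rho_bg = (1 + delta_i) * (a_bg/a_v)^3.
delta_today = (1.0 + delta_i) * (a_bg / a_v)**3 - 1.0
print(f"a_void / a_background today = {a_v / a_bg:.4f}")
print(f"void density contrast today = {delta_today:.4f}")
```

The patch should end up larger and emptier than the background average, i.e. the void fraction grows over time, which is the behavior I am describing above.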
Am I missing something basic here? I'm looking at this from an intuitive perspective rather than from the perspective of particular backreaction models. And I am not assuming that the spatial curvature of voids or of the universe as a whole is negative.
Jon