Our Leaking Universe
by John G. Cramer
Modern cosmology is not without its problems. One such problem is that the Hubble constant (symbol H0) has a lower value when extracted from cosmic microwave background data than when extracted from the red-shift recession velocities of “standard candle” stars and galaxies. Another problem is that the value of the cosmological constant (symbol Λ), as calculated from quantum field theory, is ridiculously large compared to the value extracted from cosmic microwave background data. Two recent theoretical works suggest fixes for these problems by hypothesizing “leaks” in our Universe through which dark matter is decaying and energy is disappearing. We will start with the Hubble constant problem.
The Hubble constant H0 is named for Edwin Hubble (1889–1953), the astronomer who first discovered that the Universe is expanding. It is the cosmological measure of how the recession velocity of distant stars depends on their distance, i.e., the rate at which the Universe is expanding. It has been best determined from the detailed structure of the cosmic microwave background radiation, as measured by the European Space Agency’s 2013 Planck mission. It can also be determined from direct observation of the red-shift recession rates of “standard candle” stars and galaxies. Curiously, these two ways of determining H0 give different values, with the Planck value about 10% lower than the standard candle value. This suggests that the rate of expansion of the Universe may have been significantly lower when the cosmic microwave background radiation was released, about 380,000 years after the Big Bang, than it is in the present era.
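As a quick illustration of the size of that disagreement, here is a back-of-envelope comparison. The specific H0 values below are my own illustrative assumptions (roughly the figures in circulation at the time), not numbers taken from this column:

```python
# Back-of-envelope comparison of the two H0 determinations.
# The values below are illustrative 2016-era figures, not quotes
# from the column itself.

H0_planck = 67.8   # km/s/Mpc, from the cosmic microwave background fit
H0_local = 73.2    # km/s/Mpc, from standard-candle red shifts

# Hubble's law: recession velocity v = H0 * d for a galaxy at distance d.
d = 100.0                  # Mpc, an example distance
v_planck = H0_planck * d   # km/s, predicted from the Planck value
v_local = H0_local * d     # km/s, predicted from the local value

# Fractional disagreement between the two determinations
tension = (H0_local - H0_planck) / H0_local
print(f"Planck H0 is {tension:.1%} below the standard-candle H0")
```

For these assumed inputs the gap comes out near 7–8%, in the same neighborhood as the roughly 10% figure quoted above.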
In July 2016, a group of Russian astrophysicists led by I. Tkachev of the Russian Academy of Sciences published a paper in Physical Review D suggesting that this difference in H0 values may indicate that a larger quantity of cold dark matter was present in the early Universe, slowing the rate of expansion at that time. They go on to suggest that the gravitationally attractive mass of dark matter has partially decayed away to photons and neutrinos in the present era, leading to less gravitational attraction, a more rapid expansion of the Universe, and a larger value of H0. To implement this scenario, they have re-analyzed the data from the Planck mission after introducing two extra fitting parameters: the fraction of cold dark matter that can decay and the decay rate of that fraction.
Of course, adding extra parameters is expected to improve any fit to data. However, they have shown that even beyond this improvement, their hypothesis leads to a superior fit to the Planck data and explains the increase in H0. Their best fits, with large error bars, indicate that perhaps 3.5% of the cold dark matter in the early Universe was unstable, decaying with a half-life of around 5% of the present age of the Universe. They go on to suggest that the very high energy neutrinos observed by the IceCube experiment in Antarctica, which are difficult to explain otherwise, may have been produced by such dark matter decay.
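Those best-fit numbers imply that the unstable component is essentially all gone by the present era. A small sketch of the arithmetic, using the rough best-fit values quoted above (3.5% unstable fraction, half-life of 5% of the Universe’s present age):

```python
# Rough best-fit values from the fit described above (large error bars):
unstable_fraction = 0.035  # fraction of cold dark matter that can decay
half_life = 0.05           # half-life, in units of the Universe's present age

elapsed = 1.0                     # one present-age of the Universe
halves = elapsed / half_life      # number of half-lives elapsed: 20

# Standard radioactive-decay survival: N/N0 = (1/2)**(t / half_life)
surviving = unstable_fraction * 0.5 ** halves
decayed = unstable_fraction - surviving
print(f"Fraction of all dark matter that has decayed: {decayed:.4f}")
# After 20 half-lives, only about one part in a million of the
# unstable component survives.
```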
At this point, the Planck data fitting with decaying dark matter is suggestive but not conclusive. Better data are needed, and perhaps an experiment that might provide the “smoking gun” of actually catching cold dark matter in the act of decay. In any case, there is now a strong possibility that the Universe contains more than one kind of dark matter and that some variety of dark matter is undergoing radioactive decay to photons or neutrinos.
This brings us to the cosmological constant. Theoretical physics has two major unsolved issues of great importance: the problem of quantum gravity and the problem of the cosmological constant Λ. Quantum gravity is an expected but not-yet-formulated theory that would include quantum mechanics and gravitation seamlessly within the same theoretical framework. Over the past few decades there has been much theoretical effort aimed at unifying the quantum and gravitational domains, but there is not much to show for all of this work. One monumental indication of the problems with such unification is that quantum field theory, our standard model for quantum processes at the highest energies, appears to overestimate the energy content of the quantum vacuum, what we call the cosmological constant Λ, by 120 orders of magnitude (i.e., by a factor of 10¹²⁰).
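The familiar back-of-envelope estimate behind that enormous factor, under the standard assumption that vacuum-fluctuation modes are cut off at the Planck energy scale (a detail not spelled out in this column), runs roughly as follows in natural units:

```latex
% Vacuum energy density with a Planck-scale cutoff:
\rho_{\mathrm{QFT}} \sim E_{\mathrm{Pl}}^{4}, \qquad E_{\mathrm{Pl}} \approx 10^{28}\ \mathrm{eV}

% Dark-energy density inferred from cosmological observations:
\rho_{\Lambda} \sim E_{\Lambda}^{4}, \qquad E_{\Lambda} \approx 10^{-3}\ \mathrm{eV}

% The ratio of the two estimates:
\frac{\rho_{\mathrm{QFT}}}{\rho_{\Lambda}} \sim \left(\frac{10^{28}}{10^{-3}}\right)^{4} = 10^{124}
```

The exact power of ten depends on where the cutoff is placed, which is why the mismatch is conventionally rounded to the famous "120 orders of magnitude."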
The cosmological constant is related to the fraction of dark energy present in the Universe. Analysis of Planck data indicates that the dark-energy fraction of mass-energy in the Universe is about 68.3%. Because the dark energy content of a given volume of space increases as that volume increases, dark energy creates a “negative pressure” that causes the expansion rate of the Universe to increase. If the cosmological constant Λ actually had the value calculated from quantum field theory, its huge gravitational effects would have prevented the formation of galaxies, stars, and planets in the early Universe, and we would not be here to worry about the issue. This is the Λ problem, the problem of the cosmological constant.
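The growing dominance of dark energy follows directly from that constant energy-per-volume behavior: as space expands, matter thins out while dark energy does not. A minimal sketch, my own illustration assuming for simplicity that everything other than dark energy dilutes like ordinary matter:

```python
# Sketch (my illustration, not from the column): as the Universe expands
# by scale factor a, matter density dilutes as a**-3 (volume grows as
# a**3), while dark-energy density stays constant, so its share of the
# total inexorably grows.

def densities(a, rho_m0=0.317, rho_de0=0.683):
    """Densities at scale factor a, normalized so the total is 1 at a=1.

    0.683 is the Planck dark-energy fraction quoted above; 0.317 is
    simply the remainder, treated here as pressureless matter."""
    rho_m = rho_m0 / a**3   # matter dilutes with volume
    rho_de = rho_de0        # dark energy: constant energy per unit volume
    return rho_m, rho_de

for a in (0.5, 1.0, 2.0):
    m, de = densities(a)
    print(f"a = {a}: dark-energy fraction = {de / (m + de):.2f}")
```

Doubling the scale factor from today pushes the dark-energy share from about 68% to well over 90% in this toy model, which is why the expansion accelerates.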
One possible way of dealing with the dilemma of the cosmological constant is to use a variation of general relativity called “unimodular gravity.” Unimodular gravity was first introduced by Albert Einstein in 1919, four years after he introduced general relativity, in an unsuccessful attempt to connect gravitation to elementary particles. Unimodular gravity has slightly different spacetime symmetry properties from standard general relativity (it is invariant only under coordinate transformations that preserve the spacetime volume element). Nevertheless, it appears, in general, to make the same observational predictions. However, it has two important differences. First, unimodular gravity has the property that vacuum fluctuations of the energy-momentum tensor, of the type that lead to the absurdly huge quantum-field-theory estimate of the cosmological constant, do not couple to gravity. This absence of coupling separates the energy present in quantum vacuum fluctuations from dark energy and the cosmological constant.
Second, energy and momentum are not strictly conserved in the framework of unimodular gravity. There is a “slow leak” in energy that can arise, for example, from space-granularity at the Planck scale. It is as if some “friction” within the particle interactions in the early Universe causes energy to slowly leak away.
In theoretical physics, failure to conserve energy is normally viewed as a “show-stopper,” an unacceptable flaw in any given theory. However, three theorists, T. Josset and A. Perez of the University of Toulon in France and D. Sudarsky of the Universidad Nacional Autónoma de México, posted a preprint in December 2016 in which they consider unimodular gravity’s failure to conserve energy to be a useful asset. In their scenario, the cosmological constant is not related to the energy present in quantum vacuum fluctuations. Instead, it represents the quantity of energy that has disappeared due to energy non-conservation during the evolution of the Universe. The energy that has leaked away since the Big Bang is reflected in the dark energy that we now observe to be accelerating the expansion of the Universe.
They implement this idea by using several foundational models (some rather peculiar) and produce estimates of the cosmological constant that are consistent with Planck mission data. They also show that in the evolution of the Universe, the cosmological constant fairly rapidly reaches an equilibrium value and holds thereafter as the Universe expands, consistent with the expectation that the cosmological constant is fairly constant in value.
If one takes this approach seriously, the implications are that (a) we should be using unimodular gravity instead of standard general relativity to analyze gravitational phenomena like neutron stars, black-hole accretion discs, and gravitational waves, and (b) the early Universe burned through 2/3 of its energy in its early evolution, leaving behind the “scar” of the dark energy we see today.
* * *
It is not clear whether either of these theoretical extrapolations will withstand the test of closer scrutiny. However, from the point of view of science fiction these new ideas may offer opportunities. If dark matter can decay, it can perhaps also be annihilated on demand as an energy source. It is also interesting that there may be circumstances in which both energy and momentum are not conserved and can be created or destroyed. One can imagine a space drive based on unimodular gravity, in which thrust for propulsion is achieved by creating new momentum. And if energy is not conserved and can disappear into dark energy, perhaps it can also be made to appear in situations where energy is needed or useful or destructive (i.e., a “unimodular bomb”).
Whenever cracks appear in the laws of physics, there may be opportunities for new physics and technology.
Dark Matter Decay: Dark matter decaying after recombination: Lensing constraints with Planck data, A. Chudaykin, D. Gorbunov, and I. Tkachev, Physical Review D 94, 023528 (2016); preprint arXiv:1602.08121 [astro-ph.CO].
Unimodular Gravity: Dark energy as the weight of violating energy conservation, Thibaut Josset, Alejandro Perez, and Daniel Sudarsky, preprint arXiv:1604.04183v3 [gr-qc].
John G. Cramer’s new book describing his transactional interpretation of quantum mechanics, The Quantum Handshake—Entanglement, Nonlocality, and Transactions (Springer, January 2016), is available online in printed and eBook editions at: http://www.springer.com/gp/book/9783319246406. EBook editions of his hard SF novels Twistor and Einstein’s Bridge are available from the Book View Café co-op at: http://bookviewcafe.com/bookstore/?s=Cramer. Electronic reprints of over 180 “The Alternate View” columns by John G. Cramer, previously published in Analog, are available online at: http://www.npl.washington.edu/av.
Copyright © 2017 John G. Cramer