Supernova explosions are fascinating because they’re so cataclysmic, powerful, and awe-inspiring. They’re Nature’s summer blockbusters. Humans have recorded them in ancient astronomical records and stone carvings and, in our own age, with telescopes.
Now, the Dark Energy Survey (DES) has uncovered the largest number of Type Ia supernovae ever found with a single telescope.
Finding large numbers of them is about more than just cataloguing these exploding stars. Type Ia supernovae serve as standard candles, reliable markers for determining astronomical distances. That means they can help us understand the expansion of the Universe and the force that drives it: Dark Energy.
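To make that concrete, here is the textbook standard-candle relation (a general relation, not an equation taken from the DES paper): since every Type Ia supernova peaks at roughly the same absolute magnitude M, measuring the apparent magnitude m gives the distance directly.

```latex
% Distance modulus for a standard candle (textbook relation):
\mu \equiv m - M = 5 \log_{10}\!\left(\frac{d_L}{10\,\mathrm{pc}}\right)
% Worked example with an illustrative M \approx -19.3 for a SN Ia:
% a supernova observed at m = 24 lies at
% d_L = 10\,\mathrm{pc} \times 10^{(m - M)/5} \approx 4.6\,\mathrm{Gpc}.
```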
That’s the goal of the international Dark Energy Survey (DES). The DES operates the Dark Energy Camera (DECam), which works in conjunction with the 4-meter (13 ft.) Blanco Telescope at the Cerro Tololo Inter-American Observatory (CTIO) in northern Chile. DECam’s 570-megapixel CCD array gives it a wide field of view and the sensitivity to see the redshifted light from distant galaxies; it also works in visible and ultraviolet light.
In the late 1990s, astronomers, using data that included observations from telescopes now part of the National Science Foundation’s NOIRLab, discovered that the expansion of the Universe is accelerating. This surprised scientists, who had thought the expansion was slowing down. Observations of Type Ia supernovae led to this new understanding, and scientists call the force that drives the accelerated expansion ‘Dark Energy.’
Now, scientists at the DES have turned to Type Ia supernovae again to see what else they can learn about Dark Energy. They presented their results at the 243rd AAS meeting and in a paper to be published in The Astrophysical Journal. The research is titled “The Dark Energy Survey: Cosmology Results With ~1500 New High-redshift Type Ia Supernovae Using The Full 5-year Dataset.” The paper has well over 100 authors, all with the DES Collaboration.
To perform the survey, DECam mapped almost one-eighth of the entire sky, taking 758 nights over six years to do it. The observations captured about two million distant galaxies and several thousand supernovae. After filtering the results, the team had over 1500 Type Ia supernovae.
“After accounting for the likelihood of each SN being an SN Ia, we find 1635 DES SN in the redshift range 0.10<z<1.13 that pass quality selection criteria and can be used to constrain cosmological parameters,” the authors write. That is five times as many high-quality Type Ia supernovae as previous leading studies produced, giving scientists “… the tightest cosmological constraints achieved by any SN data set to date,” according to the authors.
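As a toy illustration of that kind of probabilistic quality cut, here is a minimal Python sketch; the variable names, threshold, and data are invented for the example and are not the DES pipeline’s actual criteria.

```python
import numpy as np

# Toy catalogue: redshift and a classifier's probability that each
# candidate is a Type Ia supernova (values invented for illustration).
rng = np.random.default_rng(0)
z = rng.uniform(0.0, 1.5, size=5000)      # candidate redshifts
p_ia = rng.uniform(0.0, 1.0, size=5000)   # P(type = Ia) per candidate

# Keep candidates inside the survey's redshift window that have a high
# classification probability (the 0.9 threshold is a made-up example).
mask = (z > 0.10) & (z < 1.13) & (p_ia > 0.9)
print(f"{mask.sum()} of {len(z)} candidates pass the selection")
```

(The real analysis keeps the classification probabilities and folds them into the likelihood rather than making a single hard cut, but the filtering idea is the same.)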
When scientists discovered the accelerating expansion of the Universe, the discovery rested on only 52 high-redshift supernovae. Since then, supernova samples have been assembled by combining data from multiple surveys. This new survey is a much more homogeneous data set consisting of high-quality, well-calibrated light curves. Having a large data set from a single survey helps researchers eliminate the systematic errors that weakened previous results.
“It’s a really massive scale-up from 25 years ago when only 52 supernovae were used to infer dark energy,” said Tamara Davis, a professor at the University of Queensland in Australia and co-convener of the DES Supernova Working Group.
The massive increase in SN data will allow cosmologists to place constraints on their models of Dark Energy and Universal Expansion. Researchers combine each supernova’s distance with its redshift, a measure of how fast the expansion of the Universe, driven by Dark Energy, is carrying it away from Earth. Scientists are trying to answer a critical question: did the density of Dark Energy change as the Universe expanded over time, or has it remained constant?
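As a minimal sketch of how one such distance-redshift pair is placed on a Hubble diagram, assuming a flat LCDM cosmology (these are textbook relations with illustrative parameter values, not the DES fits):

```python
import numpy as np
from scipy.integrate import quad

C_KM_S = 299792.458      # speed of light [km/s]
H0 = 70.0                # Hubble constant [km/s/Mpc] (illustrative)
OMEGA_M = 0.3            # matter density today (illustrative)
OMEGA_L = 1.0 - OMEGA_M  # dark energy density (flatness assumed)

def E(z):
    """Dimensionless expansion rate H(z)/H0 in flat LCDM."""
    return np.sqrt(OMEGA_M * (1.0 + z)**3 + OMEGA_L)

def luminosity_distance_mpc(z):
    """d_L = (1+z) * (c/H0) * integral_0^z dz'/E(z'), in Mpc."""
    integral, _ = quad(lambda zp: 1.0 / E(zp), 0.0, z)
    return (1.0 + z) * (C_KM_S / H0) * integral

def distance_modulus(z):
    """mu = 5*log10(d_L / 10 pc); d_L converted from Mpc to pc."""
    return 5.0 * np.log10(luminosity_distance_mpc(z) * 1e6 / 10.0)

# One point on the Hubble diagram: a supernova at redshift 0.5.
print(f"mu(z=0.5) = {distance_modulus(0.5):.2f} mag")  # ~42.2
```

Comparing measured distance moduli against curves like this one, computed for different Dark Energy behaviors, is how the constraints are extracted.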
The standard model of how the cosmos works is called the Lambda Cold Dark Matter (LCDM) model. It explains how the Universe evolves and expands using the density of matter, the type of matter, and how Dark Energy behaves. The LCDM model is based on the assumption that the density of Dark Energy is constant over time and isn’t diluted as the Universe expands.
“As the Universe expands, the matter density goes down,” said DES director and spokesperson Rich Kron, who is a Fermilab and University of Chicago scientist. “But if the dark energy density is a constant, that means the total proportion of dark energy must be increasing as the volume increases.”
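Kron’s point follows from how the two densities scale; a quick sketch, assuming flat LCDM with an illustrative matter fraction of 0.3 (not a DES measurement), shows dark energy’s share of the total rising as the Universe expands:

```python
# Matter density dilutes as the cube of the scale factor a (a = 1 today),
# while a constant dark energy density does not dilute at all.
OMEGA_M, OMEGA_L = 0.3, 0.7  # illustrative present-day fractions

def dark_energy_fraction(a):
    """Dark energy's share of the total energy density at scale factor a."""
    rho_m = OMEGA_M * a**-3  # matter thins out with volume
    rho_l = OMEGA_L          # constant dark energy density
    return rho_l / (rho_m + rho_l)

for a in (0.25, 0.5, 1.0, 2.0):
    print(f"a = {a:4.2f}: dark energy fraction = {dark_energy_fraction(a):.2f}")
# Output climbs from ~0.04 at a = 0.25 to ~0.95 at a = 2.0.
```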
But these new results may be poking a hole in the LCDM model.
This is the first SN survey large enough, with enough distant supernovae, to make a detailed measurement of a critical time in the Universe’s expansion: the decelerating phase before the expansion began accelerating, about 9.8 billion years after the Big Bang.
The results support the idea that the density of Dark Energy is constant in the Universe. Cosmologists think that about three billion years ago, Dark Energy began to dominate the Universe’s energy density precisely because it’s constant and doesn’t dissipate with expansion. But the data also hint that the density may vary somewhat.
“There are tantalizing hints that dark energy changes with time,” said Davis. “We find that the simplest model of dark energy — ΛCDM — is not the best fit. It’s not so far off that we’ve ruled it out, but in the quest to understand what is accelerating the expansion of the Universe, this is an intriguing new piece of the puzzle. A more complex explanation might be needed.”
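One standard way to write down such a “more complex explanation” is to let the dark energy equation of state w drift with the scale factor a; whether the DES fits use exactly this form is my assumption, but the CPL parametrization below is the common choice in the literature:

```latex
% CPL parametrization of a time-varying dark energy equation of state:
w(a) = w_0 + w_a\,(1 - a)
% Here a is the scale factor (a = 1 today). The cosmological constant of
% \Lambda CDM is the special case w_0 = -1,\; w_a = 0; hints of evolving
% dark energy correspond to fits pulled away from that point.
```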
Any new explanation might stem from changes in our understanding of gravity. Our understanding of Dark Energy’s existence relies heavily on General Relativity and its description of gravity. Several competing modified theories of gravity exist, and if one of them proves correct, a cascade of new understandings will follow in cosmology, including our pictures of Dark Energy and Universal Expansion.
But, in an eloquent example of how things are intertwined, the speed of gravity measured in the ongoing study of gravitational waves eliminated many competing alternate theories of gravity used to explain Dark Energy. As it stands now, most astrophysicists believe that Dark Energy exists and drives Universal Expansion.
The DES has pioneered innovative new techniques for analyzing astrophysical data. These techniques will be put to work when the Nancy Grace Roman Space Telescope and the Vera C. Rubin Observatory come online. The Rubin Observatory, in particular, will generate an enormous amount of data that will require powerful analysis tools to yield results.
“We’re pioneering techniques that will be directly beneficial for the next generation of supernova surveys,” said Kron.
When it comes to the enduring puzzle of the Universe’s expansion, the DES’s supernova survey is just one of many diverse approaches needed to solve it. “We need as many diverse approaches as we can get in order to understand what dark energy is and what it isn’t,” said Nigel Sharp, a program director in NSF’s Astronomical Sciences Division. “This is an important route to that understanding.”
The analysis, unlike those of other supernova groups, refrains from estimating the Hubble rate H_0 directly and instead parametrizes it with H_0/t_age, where t_age is the age of the hot Big Bang universe.
Interestingly, in trying to resolve the H_0 tension, they find a flat (LCDM) universe to be the best alternative (to wLCDM). Removing the nearby low-z supernovae moves the result closer to flat, and subsampling shows LCDM is robust.
Conversely, assuming their best-fit free models gives a cosmological age ~10% younger than the Planck universe, conflicting with the ages of the oldest stars and with earlier supernova H_0 measurements. For instance, their model H_0 is 81 +/- 1.4 km/s/Mpc (using the footnote 8 data), more than 6 km/s/Mpc away from earlier supernova results, so roughly a 3 sigma tension.
I find this result encouraging for the simpler LCDM standing the test of time; it could even be preferred in a model-comparison analysis. Assuming a flat universe and a Planck t_age, their supernova Hubble rate would be a comfortable ~70 km/s/Mpc, which does not break LCDM physics at any time.
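A quick back-of-the-envelope check of that arithmetic (my own sketch with round numbers, assuming the fit effectively pins down the combination H_0 x t_age): fixing t_age to the Planck age rescales the inferred H_0.

```python
# If a fit constrains the product H0 * t_age, then changing the assumed
# age rescales H0 in inverse proportion. Round illustrative numbers only.
def h0_for_age(h0_ref, t_ref_gyr, t_new_gyr):
    """Rescale H0 [km/s/Mpc] so that H0 * t_age stays fixed."""
    return h0_ref * t_ref_gyr / t_new_gyr

# A fitted ~81 km/s/Mpc with a ~10% younger universe (~12.4 Gyr)
# maps back to the low 70s at the Planck age of ~13.8 Gyr, in the
# ballpark of the ~70 km/s/Mpc figure quoted above.
print(f"{h0_for_age(81.0, 12.4, 13.8):.1f} km/s/Mpc")  # ~72.8
```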
Also, if the more distant supernovae give such a high Hubble rate, I would suspect a problem with the supernova method’s calibration. But is that so? More likely I have made a simple mistake somewhere. We’ll see what the cosmologists say about the results.