A New Simulation of the Universe Contains 60 Trillion Particles, the Most Ever

Today, among the greatest mysteries facing astronomers and cosmologists are the roles that gravitational attraction and cosmic expansion play in the evolution of the Universe. To resolve them, researchers are taking a two-pronged approach: directly observing the cosmos to see these forces at work while seeking theoretical explanations for the observed behavior, such as Dark Matter and Dark Energy.

Bridging these two approaches, scientists model cosmic evolution with computer simulations to see whether observations align with theoretical predictions. The latest of these is AbacusSummit, a simulation suite created by the Flatiron Institute’s Center for Computational Astrophysics (CCA) and the Harvard-Smithsonian Center for Astrophysics (CfA). Capable of processing nearly 60 trillion particles, this suite is the largest cosmological simulation ever produced.

The creators of AbacusSummit announced the simulation suite in a series of papers that appeared in the Monthly Notices of the Royal Astronomical Society (MNRAS). Made up of more than 160 simulations, it models how particles in a box-shaped environment behave due to their mutual gravitational attraction. Models of this kind are known as N-body simulations and are essential to modeling how dark matter interacts with baryonic (a.k.a. “visible”) matter.

The simulated distribution of dark matter in galaxies. Credit: Brinckmann et al.

The development of the AbacusSummit simulation suite was led by Lehman Garrison, a research fellow at the CCA, together with Nina Maksimova and Daniel Eisenstein, a graduate student and a professor of astronomy at the CfA, respectively. The simulations were run on the Summit supercomputer at the Oak Ridge Leadership Computing Facility (OLCF) in Tennessee, which is overseen by the U.S. Department of Energy (DoE).

N-body calculations, which involve computing the gravitational interactions of planets and other massive objects, are among the greatest challenges facing astrophysicists today. Part of what makes them so daunting is that every object interacts with every other object, no matter how far apart they are: the number of pairwise interactions grows roughly as the square of the number of objects under study.
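
To make that scaling concrete, here is a minimal Python sketch of the brute-force pairwise calculation that underlies the problem. The function name, toy data, and softening parameter are illustrative choices for this article, not anything taken from the Abacus code itself:

    import numpy as np

    G = 6.674e-11  # Newton's gravitational constant (SI units)

    def pairwise_accelerations(positions, masses, softening=1e-3):
        """Direct-summation N-body accelerations.

        Every particle feels every other particle, so the work grows as
        N*(N-1), i.e. roughly N^2 -- which is why brute force is hopeless
        for trillions of particles.
        """
        n = len(masses)
        acc = np.zeros_like(positions)
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                r = positions[j] - positions[i]
                dist = np.sqrt(np.dot(r, r) + softening**2)
                acc[i] += G * masses[j] * r / dist**3
        return acc

    # Toy example: three particles at random positions (arbitrary units)
    rng = np.random.default_rng(0)
    print(pairwise_accelerations(rng.random((3, 3)), np.ones(3)))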

To date, there is still no general analytical solution for N-body problems involving three or more massive bodies, and the calculations available are numerical approximations. For example, the mathematics for calculating the interaction of three bodies, such as a binary star system and a planet (the famous “Three-Body Problem”), has never been solved in closed form. A common approach with cosmological simulations is therefore to stop the clock, calculate the total force acting on each object, advance time by a small step, and repeat.
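
In code, that stop-the-clock procedure looks roughly like the loop below. It is a generic kick-drift-kick (leapfrog) integrator, not the actual Abacus scheme, and it reuses the toy pairwise_accelerations routine sketched above:

    def evolve(positions, velocities, masses, dt, n_steps):
        """Advance the system by repeatedly freezing time, computing the
        force on every particle, and nudging the clock forward by dt."""
        acc = pairwise_accelerations(positions, masses)
        for _ in range(n_steps):
            velocities += 0.5 * dt * acc   # half "kick" from the current forces
            positions += dt * velocities   # "drift" the particles forward
            acc = pairwise_accelerations(positions, masses)
            velocities += 0.5 * dt * acc   # half "kick" from the new forces
        return positions, velocities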

For their research, which was led by Maksimova, the team designed their codebase (called Abacus) to take advantage of Summit’s parallel processing power, whereby many calculations can run simultaneously. They also relied on machine learning algorithms and a new numerical method, which allowed them to achieve 70 million particle updates per node per second at early times and 45 million particle updates per node per second at late times.

A snapshot of one of the AbacusSummit simulations, shown at various zoom scales: 10 billion light-years across, 1.2 billion light-years across, and 100 million light-years across. Credit: The AbacusSummit Team; layout by Lucy Reading-Ikkanda/Simons Foundation

As Garrison explained in a recent CCA press release:

“This suite is so big that it probably has more particles than all the other N-body simulations that have ever been run combined – though that’s a hard statement to be certain of. The galaxy surveys are delivering tremendously detailed maps of the Universe, and we need similarly ambitious simulations that cover a wide range of possible universes that we might live in.

“AbacusSummit is the first suite of such simulations that has the breadth and fidelity to compare to these amazing observations… Our vision was to create this code to deliver the simulations that are needed for this particular new brand of galaxy survey. We wrote the code to do the simulations much faster and much more accurately than ever before.”

In addition to the usual challenges, running full N-body simulations at this scale requires carefully designed algorithms because of the enormous amount of memory involved. Abacus could not simply hand each supercomputer node its own copy of the simulation to work on; instead, it divides each simulation into a grid. Distant particles, which play a smaller role than nearby ones, are handled with approximate calculations.

It then splits the nearby particles into multiple cells so that the computer can work on each cell independently, then combines each cell’s result with the approximation for distant particles, as in the sketch below. The research team found that this approach of uniform divisions makes better use of parallel processing and allows a large amount of the distant-particle approximation to be computed before the simulation even starts.
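
As a rough illustration of that divide-and-approximate strategy, the toy routine below bins particles onto a uniform grid, sums nearby cells exactly, and replaces every distant cell with a single point at its center of mass. This is only a sketch of the general idea; the real Abacus far-field method is a much more accurate multipole technique tuned to Summit’s architecture, and every name and parameter here is an invention for illustration:

    import numpy as np
    from collections import defaultdict

    def grid_accelerations(positions, masses, box_size, n_cells, G=6.674e-11):
        """Toy near-field/far-field split on a uniform grid.

        Particles in the same or adjacent cells are summed directly;
        every more distant cell is replaced by one point at its center
        of mass (a crude stand-in for Abacus' far-field approximation).
        """
        cell_size = box_size / n_cells
        cell_of = np.floor(positions / cell_size).astype(int) % n_cells

        # Bucket particle indices by cell; precompute each cell's mass and center of mass.
        buckets = defaultdict(list)
        for idx, cell in enumerate(map(tuple, cell_of)):
            buckets[cell].append(idx)
        com = {c: np.average(positions[ix], axis=0, weights=masses[ix])
               for c, ix in buckets.items()}
        cell_mass = {c: masses[ix].sum() for c, ix in buckets.items()}

        acc = np.zeros_like(positions)
        for i, ci in enumerate(map(tuple, cell_of)):
            for c, ix in buckets.items():
                if all(abs(a - b) <= 1 for a, b in zip(ci, c)):
                    # Near field: exact pairwise sum over this and neighboring cells.
                    for j in ix:
                        if j == i:
                            continue
                        r = positions[j] - positions[i]
                        acc[i] += G * masses[j] * r / (np.dot(r, r) + 1e-6) ** 1.5
                else:
                    # Far field: a single center-of-mass term per distant cell.
                    r = com[c] - positions[i]
                    acc[i] += G * cell_mass[c] * r / (np.dot(r, r) + 1e-6) ** 1.5
        return acc

    # Toy usage: 100 random particles in a unit box split into 4x4x4 cells
    rng = np.random.default_rng(1)
    acc = grid_accelerations(rng.random((100, 3)), np.ones(100), box_size=1.0, n_cells=4)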

Abacus’ parallel computer processing, visualized. Credit: Lucy Reading-Ikkanda/Simons Foundation

This is a significant improvement over other N-body codebases, which divide their simulations irregularly based on the distribution of particles. Thanks to its design, Abacus can update 70 million particles per node per second (where each particle represents a clump of Dark Matter with three billion solar masses). It can also analyze the simulation as it runs and search for patches of Dark Matter that indicate the presence of bright star-forming galaxies.

These and other cosmological objects will be the subject of future surveys that will map the cosmos in unprecedented detail. These include the Dark Energy Spectroscopic Instrument (DESI), the Nancy Grace Roman Space Telescope (RST), and the ESA’s Euclid spacecraft. One of the goals of these big-budget missions is to improve estimates of the cosmological and astrophysical parameters that determine how the Universe behaves and how it looks.

This, in turn, will allow for more detailed simulations that employ updated values for parameters such as Dark Energy. Eisenstein, a co-author on the papers, is also a member of the DESI collaboration. He and others like him are looking forward to what Abacus can do for these cosmological surveys in the coming years.

“Cosmology is leaping forward because of the multidisciplinary fusion of spectacular observations and state-of-the-art computing,” he said. “The coming decade promises to be a marvelous age in our study of the historical sweep of the universe.”

Further Reading: Simons Foundation, MNRAS

Matt Williams

Matt Williams is a space journalist and science communicator for Universe Today and Interesting Engineering. He's also a science fiction author, podcaster (Stories from Space), and Taekwon-Do instructor who lives on Vancouver Island with his wife and family.
