Understanding the Universe and how it has evolved over billions of years is a daunting task. On the one hand, it involves painstakingly looking billions of light-years into deep space (and thus billions of years back in time) to see how its large-scale structure has changed over time. On the other, it requires massive amounts of computing power to simulate what the Universe should look like (based on known physics) and to check whether observation and simulation match up.
That is exactly what a team of astrophysicists from the University of Zurich (UZH) did using the “Piz Daint” supercomputer. With this sophisticated machine, they simulated the formation of our entire Universe and produced a catalog of about 25 billion virtual galaxies. This catalog will be used by the ESA’s Euclid mission, which is scheduled to launch in 2020 and will spend six years probing the Universe to investigate dark matter.
The team’s work was detailed in a study that appeared recently in the journal Computational Astrophysics and Cosmology. Led by Douglas Potter, the team spent the past three years developing an optimized code to describe (with unprecedented accuracy) the dynamics of dark matter as well as the formation of large-scale structures in the Universe.
The code, known as PKDGRAV3, was specifically designed to make optimal use of the available memory and processing power of modern supercomputing architectures. After running on the “Piz Daint” supercomputer – located at the Swiss National Supercomputing Centre (CSCS) – for only 80 hours, it generated a virtual Universe of two trillion macro-particles, from which a catalog of 25 billion virtual galaxies was extracted.
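For a sense of scale, here is a back-of-envelope sketch in Python of what two trillion particles implies for memory alone. The bytes-per-particle figure is an assumption for illustration, not a number taken from the PKDGRAV3 paper:

```python
# Rough scale of the simulation described above. The bytes-per-particle
# figure is assumed for illustration only (positions, velocities, bookkeeping).
N_PARTICLES = 2 * 10**12       # two trillion macro-particles
BYTES_PER_PARTICLE = 36        # assumption, not a figure from the paper

total_tb = N_PARTICLES * BYTES_PER_PARTICLE / 1e12
print(f"Particle data alone: ~{total_tb:.0f} TB")   # ~72 TB
```

Numbers on that order make it clear why using memory optimally, and not just raw speed, had to be a central design goal.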
Intrinsic to their calculations was the way in which the dark matter fluid would have evolved under its own gravity, leading to the formation of small concentrations known as “dark matter halos”. It is within these halos – theorized structures that are thought to extend well beyond the visible extent of a galaxy – that galaxies like the Milky Way are believed to have formed.
Naturally, this presented quite the challenge. It required not only a precise calculation of how the structure of dark matter evolves, but also consideration of how this would influence every other part of the Universe. As Joachim Stadel, a professor with the Center for Theoretical Astrophysics and Cosmology at UZH and a co-author on the paper, told Universe Today via email:
“We simulated 2 trillion such dark matter ‘pieces’, the largest calculation of this type that has ever been performed. To do this we had to use a computation technique known as the ‘fast multipole method’ and use one of the fastest computers in the world, ‘Piz Daint’ at the Swiss National Supercomputing Centre, which among other things has very fast graphics processing units (GPUs) which allow an enormous speed-up of the floating point calculations needed in the simulation. The dark matter clusters into dark matter ‘halos’ which in turn harbor the galaxies. Our calculation accurately produces the distribution and properties of the dark matter, including the halos, but the galaxies, with all of their properties, must be placed within these halos using a model. This part of the task was performed by our colleagues at Barcelona under the direction of Pablo Fosalba and Francisco Castander. These galaxies then have the expected colors, spatial distribution and the emission lines (important for the spectra observed by Euclid) and can be used to test and calibrate various systematics and random errors within the entire instrument pipeline of Euclid.”
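PKDGRAV3’s fast multipole method is well beyond the scope of a blog post, but the gravity-only N-body idea behind it can be sketched in a few lines. The toy integrator below uses naive O(N²) direct summation in arbitrary code units and ignores cosmic expansion, so every number in it is an illustrative assumption:

```python
import numpy as np

# Toy gravity-only N-body integrator (kick-drift-kick leapfrog) using
# naive O(N^2) direct summation. PKDGRAV3 instead uses the fast multipole
# method; the units, particle count and softening here are assumptions.
G = 1.0      # gravitational constant in code units (assumption)
EPS = 0.05   # force-softening length to tame close encounters (assumption)

def accelerations(pos, mass):
    # diff[i, j] = pos[j] - pos[i]: vector from particle i to particle j
    diff = pos[None, :, :] - pos[:, None, :]
    dist2 = (diff ** 2).sum(axis=-1) + EPS ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)  # no self-force
    # a_i = G * sum_j m_j (r_j - r_i) / |r_j - r_i|^3
    return G * (diff * (mass[None, :] * inv_r3)[:, :, None]).sum(axis=1)

def run(pos, vel, mass, dt=0.01, steps=1000):
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc          # half kick
        pos += dt * vel                # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc          # half kick
    return pos, vel

rng = np.random.default_rng(0)
n = 512                                # tiny next to two trillion
pos = rng.standard_normal((n, 3))      # a random initial clump
vel = np.zeros((n, 3))
mass = np.full(n, 1.0 / n)             # total mass of 1 in code units
pos, vel = run(pos, vel, mass)
```

Real cosmological codes replace the pairwise sum with tree or multipole expansions so the cost grows roughly linearly with particle count – which is what makes two trillion particles feasible at all.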
Thanks to the high precision of their calculations, the team was able to turn out a catalog that met the requirements of the European Space Agency’s Euclid mission, whose main objective is to explore the “dark universe”. This kind of research is essential to understanding the Universe on the largest of scales, mainly because the vast majority of the Universe is dark.
Between the 23% of the Universe that is made up of dark matter and the 72% that consists of dark energy, only one-twentieth of the Universe consists of matter that we can see with normal instruments (aka. “luminous” or baryonic matter). Despite being proposed in the 1930s and 1990s, respectively, dark matter and dark energy remain two of the greatest cosmological mysteries.
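The “one-twentieth” figure follows directly from those percentages:

```latex
100\% - \underbrace{23\%}_{\text{dark matter}} - \underbrace{72\%}_{\text{dark energy}} = 5\% = \tfrac{1}{20}
```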
Since their existence is required for our current cosmological models to work, it has only ever been inferred through indirect observation. Making such indirect observations is precisely what Euclid will do over the course of its six-year mission: capturing light from billions of galaxies and measuring it for the subtle distortions caused by the presence of mass in the foreground.
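The physics behind those “subtle distortions” is gravitational lensing. In the simplest case – light passing a point mass M at impact parameter b – General Relativity predicts a deflection angle of twice the Newtonian value:

```latex
\hat{\alpha} = \frac{4GM}{c^{2}b}
```

Euclid will not see individual deflections this cleanly; it will instead measure the statistical stretching (“shear”) of billions of galaxy shapes and infer the intervening mass, dark matter included, from that.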
In much the same way that background light is distorted by a gravitational field lying between its source and the observer (a time-honored test of General Relativity), dark matter exerts a gravitational influence on the light Euclid will measure. As Stadel explained, their simulated Universe will play an important role in the Euclid mission – providing a framework that will be used both during and after the mission:
“In order to forecast how well the current components will be able to make a given measurement, a Universe populated with galaxies as close as possible to the real observed Universe must be created,” he said. “This ‘mock’ catalogue of galaxies is what was generated from the simulation and will be now used in this way. However, in the future when Euclid begins taking data, we will also need to use simulations like this to solve the inverse problem. We will then need to be able to take the observed Universe and determine the fundamental parameters of cosmology; a connection which currently can only be made at a sufficient precision by large simulations like the one we have just performed. This is a second important aspect of how such simulation work [and] is central to the Euclid mission.”
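Here is a cartoon of the inverse problem Stadel describes: choose the cosmological parameter whose simulated prediction best matches a measurement. The forward model below is a made-up stand-in, not anything from the Euclid or PKDGRAV3 pipelines:

```python
import numpy as np

# Cartoon of fitting a cosmological parameter to an observation.
# simulate_statistic() is a hypothetical stand-in for a full simulation.

def simulate_statistic(omega_m):
    # Assumed toy forward model: parameter -> predicted summary statistic
    return 2.5 * omega_m ** 0.6

grid = np.linspace(0.1, 0.5, 401)        # candidate parameter values
predictions = simulate_statistic(grid)   # "simulate" every candidate
observed = 1.25                          # pretend measurement

best = grid[np.argmin((predictions - observed) ** 2)]
print(f"Best-fit parameter: {best:.3f}")  # ~0.315 for this toy model
```

In practice the forward model is a suite of large simulations like the one described here, and the fit runs over many parameters at once – which is why the precision of the inferred cosmology is limited by the accuracy of the simulations.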
From the Euclid data, researchers hope not only to obtain new information on the nature of dark matter, but also to discover new physics that goes beyond the Standard Model of particle physics – e.g. a modified version of General Relativity or a new type of particle. As Stadel explained, the best outcome for the mission would be one in which the results do not conform to expectations.
“While it will certainly make the most accurate measurements of fundamental cosmological parameters (such as the amount of dark matter and energy in the Universe) far more exciting would be to measure something that conflicts or, at the very least, is in tension with the current ‘standard lambda cold dark matter’ (LCDM) model,” he said. “One of the biggest questions is whether the so called ‘dark energy’ of this model is actually a form of energy, or whether it is more correctly described by a modification to Einstein’s general theory of relativity. While we may just begin to scratch the surface of such questions, they are very important and have the potential to change physics at a very fundamental level.”
In the future, Stadel and his colleagues hope to run simulations of cosmic evolution that take into account both dark matter and dark energy. Someday, these exotic aspects of nature could form the pillars of a new cosmology, one that reaches beyond the physics of the Standard Model. In the meantime, astrophysicists from around the world will likely be waiting for the first batch of results from the Euclid mission with bated breath.
Euclid is one of several missions currently engaged in the hunt for dark matter and the study of how it shaped our Universe. Others include the Alpha Magnetic Spectrometer (AMS-02) experiment aboard the ISS, the ESO’s Kilo Degree Survey (KiDS), and CERN’s Large Hadron Collider. With luck, these experiments will reveal pieces of the cosmological puzzle that have remained elusive for decades.
Further Reading: UZH, Computational Astrophysics and Cosmology
One question:
1) If dark matter clumps … what stops it from forming globules, like stars and planets?
That’s an interesting question! I’d like to second that, Matt!
Dark matter stars and planets, you say? That IS a good question. I will have to refer this to one of our astrophysicists and see what they can come up with 🙂
I can’t remember the source of this, but here’s my notes on ‘Dark Stars’:
- DM neutralinos interact/annihilate with each other, producing quarks and anti-quarks + heat
- The heat = radiation pressure, preventing a protostar from collapsing and triggering fusion (rough scaling sketched below)
- The star would become a massive Brown Dwarf – 4 to 2,000 AU in diameter!
- Lifetime unknown – may last months, Myr or Gyr – depends on mass of neutralinos (unknown)
- Once the DM is exhausted (radiation pressure fades away) the star may collapse to form a normal star, or collapse to a BH, or the DM may still be active and in equilibrium (still exist today)
- If they exist today, they should emit gamma rays, neutrinos and anti-matter (from dark molecular clouds – one would not normally expect such emissions from star-forming clouds)
- Alternatively, 🙂 DM particles do not collapse/merge – they overcome gravity by their bee-swarm motion (same reason why Globular Clusters do not collapse)
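For what it’s worth, the rough scaling behind the heating step in those notes (standard in the “dark star” literature) ties the heating rate per unit volume to the dark matter density, the annihilation cross-section and the unknown neutralino mass:

```latex
Q_{\text{heat}} \sim \frac{\rho_{\chi}^{2}\,\langle\sigma v\rangle\,c^{2}}{m_{\chi}}
```

Since the heating rate falls as the neutralino mass grows, that unknown mass ends up controlling the lifetime estimates – hence the “months, Myr or Gyr” spread.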
I think it is all trapped in black holes, since it is hard to find and no one can measure its ‘density’.