
Modeling the Universe with data

20 Sept 2021

Space and time encoded in 320 million megabytes: LMU astrophysicist Klaus Dolag uses simulations to study how the cosmos evolved.

© Hirschmann et al./Magneticum Pathfinder

Astrophysicist Klaus Dolag will do anything for his very own little universe. He will get up at night (several times, if necessary) to check its condition and ensure that it’s in good shape.

But then, ‘little’ is not quite the right word here. This universe is a cube measuring 12.5 billion light-years on a side. Compared with the visible Universe as seen from Earth, the diameter of which cosmologists estimate at approximately 93 billion light-years, Dolag’s cosmos is not that small after all. But there is a much more salient difference between the two: Dolag’s slimmed-down version is a simulation, albeit one of the most detailed and complex yet constructed by cosmologists. He built it in 2015 with the aid of Munich’s supercomputer SuperMUC. The task took 3 weeks (including many sleepless nights), and he’s still working on it. After all, when you’ve built your own universe – even if it exists only on hard disks – there’s always something to be discovered and improved.

Striking similarity

In the simulation, a thin thread of gas millions of light-years long connects two galaxy clusters, closely resembling images from X-ray telescopes.

© Klaus Dolag

But why would astrophysicists want to simulate a universe in the first place? After all, we live in a real one, and it’s a lot bigger. “When we look at the sky around us, what we see is only a snapshot,” Dolag says. “We want to know how the Universe came to be what it now is, how it evolved and what sorts of physical processes were involved in its development.”

That puts the problem very succinctly. Solving it is much more complicated. To simulate cosmic evolution, one basically needs two things (in addition to huge reserves of data-processing power): the appropriate starting conditions with which each simulation begins, and the physical equations that describe how the system subsequently evolves.
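To make these two ingredients concrete, here is a minimal toy sketch in Python (an illustration for this article, not the Magneticum code itself): a handful of particles receives slightly perturbed starting positions, and Newtonian gravity, applied step by step, stands in for the evolution equations. All quantities are in arbitrary code units, and the particle number, time step and softening length are placeholders chosen for readability.

```python
# Toy sketch of the two ingredients of a cosmological N-body simulation:
# initial conditions and an evolution law. Not the Magneticum code.
import numpy as np

rng = np.random.default_rng(42)

# Ingredient 1: initial conditions -- a nearly uniform particle distribution
# with tiny perturbations, loosely analogous to the early density fluctuations.
n = 200
pos = rng.uniform(0.0, 1.0, size=(n, 3))        # positions in a unit box
pos += 1e-3 * rng.standard_normal((n, 3))       # small perturbations
vel = np.zeros((n, 3))                          # start from rest
mass = np.full(n, 1.0 / n)

G = 1.0      # gravitational constant in code units
soft = 1e-2  # softening length to avoid singular forces
dt = 1e-3    # time step

def accelerations(pos):
    """Direct-summation Newtonian gravity: O(n^2) pairwise forces."""
    diff = pos[None, :, :] - pos[:, None, :]            # (n, n, 3)
    dist2 = np.sum(diff**2, axis=-1) + soft**2
    inv_d3 = dist2**-1.5
    np.fill_diagonal(inv_d3, 0.0)                        # no self-force
    return G * np.einsum('ij,ijk,j->ik', inv_d3, diff, mass)

# Ingredient 2: the evolution law, applied step by step (leapfrog scheme).
acc = accelerations(pos)
for step in range(1000):
    vel += 0.5 * dt * acc
    pos += dt * vel
    acc = accelerations(pos)
    vel += 0.5 * dt * acc

print("final spread of particle positions:", pos.std(axis=0))
```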

Neither of these ingredients is very well understood, which introduces a lot of uncertainty. Nobody was around to witness the Big Bang around 13.8 billion years ago, when space and time began. In that momentous instant, the Universe must have expanded at an unfathomably rapid rate from an infinitesimal point into a gigantic ball of gas. How exactly this happened remains a matter over which theorists continue to puzzle.

In the beginning everything was simple

Luckily for astrophysicists like Klaus Dolag, who leads the Computational Center for Particle and Astrophysics (C2PAP) in the Munich Cluster of Excellence “ORIGINS”, we do have a photo of the Universe in its pram, as it were. It was taken shortly after the ‘fog’ that characterized the Universe’s early phase lifted, and it shows minuscule variations in what physicists call the cosmic microwave background. Clearly, even at this early time, the matter in the Universe was not uniformly distributed – and the denser clumps would later give rise to the structures we now know as stars, galaxies and galaxy clusters.

Crucially, these density fluctuations provide a sufficiently simple setting with which to initiate simulations. From here on, the physical processes involved in cosmic evolution become ever more complex – and detailed simulations become ever more costly, computationally speaking. “This is also where gravitation comes into the picture,” says Dolag. “Gravitation is the force that ultimately shapes our Universe.” Thanks to Einstein’s theory of General Relativity, the laws of gravity are well understood. But when tenuous filaments of gas slowly collapse into ever denser clouds and form stars, other effects come into play. These include effects driven by hydrodynamics, magnetic fields and interactions between highly energetic particles.

Guided by insights and probabilities

The basic problem lies in the wildly different scales that must be taken into account. Calculating the impact of the processes involved down to the atomic level is hopelessly impractical. The model consists of a cube with sides of 12.5 billion light-years, which is divided into 180 billion elements. Even at that resolution, one cannot track the evolution of individual stars. Instead, physicists make use of phenomenological models, which are based on insights and educated guesses. These include factors such as the mean lifetimes of classes of stars, how much energy is released when they self-destruct, what effect such cataclysms have on the local environment – and how these parameters affect rates of star formation and the large-scale evolution of the Universe. The magnitude of these effects in particular regions of the simulated universe depends strongly on the local conditions. “The higher the gas density in a stellar nursery, the more stars are likely to form, for example,” says Dolag. “That’s a universal law.”
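To give a flavor of such a recipe, the sketch below implements a deliberately simplified, density-dependent star-formation rule in Python. The threshold, exponent and efficiency are illustrative placeholders invented for this example; the sub-grid models used in production simulations like Magneticum are considerably more elaborate.

```python
# Highly simplified illustration of a phenomenological ("sub-grid") recipe:
# the star-formation rate of a gas element rises with its local gas density.
# Threshold, exponent and efficiency below are placeholders, not Magneticum values.
def star_formation_rate(gas_density, free_fall_time=1.0,
                        density_threshold=0.1, efficiency=0.02):
    """Return a toy star-formation rate for one resolution element."""
    if gas_density < density_threshold:
        return 0.0                  # gas too thin: no stars form
    # denser gas collapses faster, so it converts more mass into stars
    return efficiency * gas_density**1.5 / free_fall_time

# Example: a tenfold denser cloud forms roughly 30 times more stars per unit time.
print(star_formation_rate(1.0), star_formation_rate(10.0))
```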

But was that always the case? Or did the early – and far denser – Universe behave in a fundamentally different way? Here, Dolag shrugs his shoulders. “That’s not clear, and it’s one of the biggest deficiencies in all current simulations.” A further difficulty arises from the fact that physicists are far from understanding all the relevant processes that contributed to cosmic evolution. We now know that less than 5% of the Universe (i.e. its matter and energy) is made up of the sort of matter whose behavior is well described by the known laws of physics. The rest consists of what cosmologists call ‘dark matter’ and ‘dark energy’ – ‘dark’ because they cannot be observed directly and little is known about their precise nature. So simulations must make do with inferences based on observations of the effects of these constituents.

What even the best telescopes can’t see

Nevertheless, Dolag’s simulated universe is anything but a relaxing pastime. It’s not like running a farm on your phone, just for fun. Cosmological simulations are constructed in the hope that they will uncover phenomena that have not been detected in the existing Universe, and help astronomers to gain a better understanding of puzzling discoveries. Researchers have thought long and hard about the question of why only 3-4% of the ‘normal’ matter in the Universe is in the form of stars. A second, somewhat larger fraction is contributed by the hot gas found in galaxy clusters, which can only be seen with X-ray telescopes. And nothing at all is known about some 50% of the matter in our local Universe. Theorists assume that it consists of thin, relatively cool filaments of gas, like those that gave rise to galaxies, and that it forms a cosmic web that links galaxies together. With a calculated density of only about 10 atoms per cubic meter, these structures have remained below the detection limits of today’s best telescopes.

However, last year, the German X-ray telescope eROSITA detected a structure that may represent such a connecting thread – a gas filament over 50 million light-years in length that runs between two galaxy clusters. Is this a bit of the missing matter? Dolag took a closer look at his simulation, and found two linked clusters in almost exactly the same configuration. The astrophysicists involved in the project refer to the similarity as “striking”. It’s not proof, but it is very suggestive. And it indicates that Dolag’s universe can indeed reveal previously unknown phenomena.

A simulated cosmos also has other advantages. It allows physicists to play around with the constituents, such as the amount of dark matter, and see what happens. In the model, dark matter is assumed to consist of cold particles that interact with each other only through gravity. So what happens if one endows them with different properties, like those postulated by some theorists? Does this destabilize the simulated universe, or does it make the model more realistic?

Simulated cosmos

The most massive galaxy cluster calculated by the Magneticum Pathfinder consists of hundreds of galaxies (white), with hot gas (light blue) and cold gas (brown) in between.

© Hirschmann et al./Magneticum Pathfinder

A mammoth task – even for a supercomputer

Six years ago, Dolag repeatedly reset his alarm clock each night to avoid being confronted at breakfast with the discovery that the computer had crashed hours earlier. At that time, the simulation kept the supercomputer in Munich’s Leibniz Computing Center busy for 3 straight weeks. All of the 86,016 processing units were fully occupied with a simulation called Magneticum Pathfinder. “This made many other users very unhappy,” he recalls. The whole campaign took 25 million processor-hours, and generated 320 million MB of scientific data.

That sort of thing can’t be done every two years – not if one wants to remain on good terms with one’s colleagues, and certainly not when one considers that, over such a short interval, the available computing power cannot keep pace with the ever-increasing demand for higher resolution and precision in the simulation. If any given element in a simulation interacts with a million others, and not with 100,000 as in the previous run, the computational cost rises by a factor of 100. “In other words, if this segment used to take up 1% of the processing time, it suddenly takes up the entire capacity of the machine,” says Dolag. He therefore assumes that the next, bigger and better version of the simulation will take at least five years to design. In addition, early tests have shown that it will require the next generation of supercomputers – or perhaps the one after that – for its implementation.
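A back-of-the-envelope sketch of that scaling, under the assumption (not spelled out above) that a higher-resolution run increases both the number of elements and the number of partners each element interacts with:

```python
# Growth of the total pairwise-interaction count, a rough proxy for compute cost.
# The tenfold figures are illustrative, matching the example quoted in the text.
def relative_cost(element_growth, partner_growth):
    """Factor by which the total number of interactions grows."""
    return element_growth * partner_growth

# Ten times more interaction partners per element (100,000 -> 1,000,000),
# combined with ten times more elements, gives a hundredfold cost increase.
print(relative_cost(element_growth=10, partner_growth=10))   # -> 100
```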

In the meantime, Dolag is working on the details of future simulations. Last year, he received an Advanced Grant worth up to 2.5 million euros from the European Research Council. He and his team are using the money to study the behavior of the sparsely distributed ionized gas found in galaxy clusters. This simulation will pay more attention to processes at the atomic level, and Dolag hopes it will clarify what happens at the smallest scales. The insights gained will then be incorporated into the large-scale simulations with their enormous dimensions. The study will also explore the roles of factors such as the conductivity, viscosity and degree of turbulence within the gas, and consider how these can be encoded in future runs – whenever the latest generation of supercomputer (and the goodwill of its other users) permits.

Although Dolag works at the LMU Observatory, he has little interest in stargazing. “It’s much more exciting to have a universe of one’s own to hand than having to observe a small part of some other one,” he says, with a knowing smile.

Text: Alexander Stirn

© C. Olesinski/LMU

Dr. Klaus Dolag is an academic staff member in the Department of Computational Astrophysics at the LMU Observatory, and Head of the Computational Center for Particle and Astrophysics in the Munich-based Cluster of Excellence “ORIGINS”. Dolag (b. 1970) studied Physics at the Technical University of Munich (TUM) and obtained his PhD from LMU for research work done at the Max Planck Institute for Astrophysics in Garching. He went on to complete his Habilitation at LMU. In 2020, he received one of the prestigious Advanced Grants from the European Research Council (ERC).

