Millennium simulation

The Millennium Simulation (also known as the Millennium Run) is a project of the Virgo Consortium, a group of cosmologists from Germany, Great Britain, Canada, Japan and the USA led by the Max Planck Institute for Astrophysics in Garching near Munich. Its aim was to create a computer simulation that clarifies the cosmological question of how the large structures, galaxies and stars observable today could form from the largely structureless universe that existed shortly after the Big Bang. In the summer of 2005 the project presented results showing how large-scale irregularities grow out of the small inhomogeneities imposed at the start.

The simulation

At the center of the simulation is not ordinary matter but dark matter, which according to the prevailing view makes up about 80 percent of the mass in the universe. Unlike ordinary matter, dark matter is not affected by the intense electromagnetic radiation of the hot early universe, so it could not be driven apart by it and began to clump together earlier than "normal" matter. Dark matter therefore played the most important role in the structure formation of the universe.

Even with supercomputers it is not possible to model the processes in the entire known universe. The simulation was therefore limited to a cube-shaped region about 650 Mpc, or roughly 2 billion light-years, on a side. Dark matter with a total of about 10^19 solar masses was placed in this volume, distributed evenly over 2160³ ≈ 10 billion virtual particles. At the start of the simulation, tiny density fluctuations were imprinted on the distribution of the dark matter. Such irregularities must also have existed in reality, as is known from observations of the cosmic background radiation. The strength of the irregularities roughly corresponds to that of the real universe about 10 million years after the Big Bang. The program then calculated the movement of each particle under gravity with a time step of about a million years. As in the real universe, the space of the simulation expands over time. The simulation ended after about 11,000 time steps, corresponding to a span of 14 billion years, i.e. the age of today's universe. It ran for 28 days on 512 processors.
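
Conceptually, each of these time steps consists of computing the gravitational acceleration on every particle and then advancing positions and velocities. The following Python sketch shows the idea with a toy direct-summation leapfrog integrator; the particle count, softening length, time step and units are placeholders, and the actual Millennium code (a version of GADGET-2) instead uses far more efficient tree and particle-mesh methods in expanding (comoving) coordinates.

# Minimal sketch of the kind of N-body time stepping described above.
# Direct summation, toy units, no cosmological expansion: illustration only.
import numpy as np

G = 1.0           # gravitational constant in arbitrary code units
SOFTENING = 0.05  # force softening to avoid singular two-body encounters

def accelerations(pos, mass):
    """Pairwise softened gravitational accelerations (O(N^2), toy only)."""
    diff = pos[None, :, :] - pos[:, None, :]          # displacement vectors
    dist2 = (diff ** 2).sum(-1) + SOFTENING ** 2
    inv_r3 = dist2 ** -1.5
    np.fill_diagonal(inv_r3, 0.0)                     # no self-force
    return G * (diff * (mass[None, :, None] * inv_r3[:, :, None])).sum(axis=1)

def leapfrog(pos, vel, mass, dt, n_steps):
    """Advance the particle system with kick-drift-kick leapfrog steps."""
    acc = accelerations(pos, mass)
    for _ in range(n_steps):
        vel += 0.5 * dt * acc          # half kick
        pos += dt * vel                # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc          # half kick
    return pos, vel

# Tiny demonstration: 100 particles instead of 10 billion.
rng = np.random.default_rng(42)
pos = rng.uniform(0.0, 1.0, size=(100, 3))
vel = np.zeros((100, 3))
mass = np.full(100, 1.0 / 100)
pos, vel = leapfrog(pos, vel, mass, dt=0.01, n_steps=50)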

The simulation begins 397,000 years after the Big Bang, the time when the cosmic background radiation that can be received today was emitted. This radiation has been measured by astrophysical satellites for years (e.g. the COBE satellite). The inhomogeneities found in it were taken as the starting point for the observed structures in the distribution of matter. Starting from this initial distribution and applying the laws of physics, whose validity is assumed for the currently observable universe, the development of the spatial distribution of matter was followed in the mathematical model. Since the large-scale structures observable today (a sponge-like distribution of galaxies and galaxy clusters with filaments, "walls" and "voids") emerged in the course of the simulation, the basic assumptions of the simulation could be taken to be correct.
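
In practice, initial conditions of this kind are usually realized as a Gaussian random field whose statistics follow a prescribed power spectrum constrained by the background-radiation measurements. The Python sketch below illustrates the principle on a small grid; the grid size and the simple power-law spectrum are placeholders and are not the spectrum or the particle-displacement scheme actually used for the Millennium initial conditions.

# Sketch: generate density fluctuations as a Gaussian random field on a grid
# with a prescribed power spectrum P(k) ~ k^power_index (illustrative only).
import numpy as np

def gaussian_random_field(n=64, power_index=-2.0, seed=0):
    """Return an n^3 grid of density fluctuations with P(k) ~ k^power_index."""
    rng = np.random.default_rng(seed)
    # White noise in real space, transformed to Fourier space.
    noise_k = np.fft.fftn(rng.normal(size=(n, n, n)))
    # Wavenumber magnitude on the grid.
    k = np.fft.fftfreq(n)
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k_mag = np.sqrt(kx**2 + ky**2 + kz**2)
    k_mag[0, 0, 0] = 1.0                      # avoid division by zero at k = 0
    # Scale each mode by sqrt(P(k)) so the field has the desired power spectrum.
    amplitude = k_mag ** (power_index / 2.0)
    amplitude[0, 0, 0] = 0.0                  # remove the mean (k = 0) mode
    return np.fft.ifftn(noise_k * amplitude).real

density_contrast = gaussian_random_field()
print(density_contrast.std())   # r.m.s. amplitude of the fluctuations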

In a second step, normal matter was added to the model, following the distribution of the dark matter, which made it possible to visualize luminous stars and the shapes of galaxies.

On June 2, 2005, the first scientific results were published. The Sloan Digital Sky Survey had challenged current assumptions of cosmology by detecting black holes in very bright quasars at great distances, and thus at an unexpectedly early stage of the universe. The Millennium simulation was able to reproduce the very early formation of such quasars in its model, showing that their existence does not contradict the common assumptions of cosmology.

Millennium II

In 2009 the same group of astrophysicists performed the Millennium II simulation (MS-II), which examined a smaller cube of space with an edge length of 400 million light-years. It again followed 2160³ "particles", but each now represented only 6.9 million solar masses. This task was even more demanding to program than the original simulation, because distributing the computing work among the processors becomes harder when the matter gathers in dense clumps; this stronger spatial clustering of the "particles" is a consequence of their lower mass. For MS-II, 2048 processors of the Power6 computer in Garching near Munich were busy for about a month.
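
The load-balancing problem mentioned above is commonly solved by ordering the particles along a space-filling curve and cutting the ordered list into contiguous pieces of comparable cost, so that dense clumps are spread over the processors (the GADGET codes use a Peano-Hilbert curve for this). The Python sketch below illustrates the principle with a simpler Morton (Z-order) key and an equal-count split; it is not the actual Millennium-II decomposition.

# Simplified illustration of domain decomposition for load balancing:
# particles are ordered along a space-filling curve (a Morton / Z-order key
# here) and the ordered list is cut into one chunk per processor.
import numpy as np

def morton_key(ix, iy, iz, bits=10):
    """Interleave the bits of three integer grid coordinates into one key."""
    key = 0
    for b in range(bits):
        key |= ((ix >> b) & 1) << (3 * b)
        key |= ((iy >> b) & 1) << (3 * b + 1)
        key |= ((iz >> b) & 1) << (3 * b + 2)
    return key

def decompose(positions, n_domains, bits=10):
    """Assign each particle to a domain by splitting the Morton-ordered list."""
    grid = np.clip((positions * (1 << bits)).astype(int), 0, (1 << bits) - 1)
    keys = np.array([morton_key(*g, bits=bits) for g in grid])
    order = np.argsort(keys)
    # Equal particle counts per domain; a production code would weight the
    # split by estimated computational cost rather than raw particle number.
    domains = np.empty(len(positions), dtype=int)
    for d, chunk in enumerate(np.array_split(order, n_domains)):
        domains[chunk] = d
    return domains

rng = np.random.default_rng(1)
positions = rng.random((1000, 3))          # particle positions in a unit box
assignment = decompose(positions, n_domains=8)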

A further simulation was carried out with fewer "particles" under the same initial conditions, in order to check whether the results of the high-resolution run are also reproduced at lower resolution.

Millennium XXL

In 2010 the most complex simulation to date, Millennium XXL (MXXL), was carried out. This time a cube with an edge length of 12 billion light-years was chosen, in which 6720³ "particles" of 7 billion solar masses each were followed. MXXL thus covered a volume 216 times larger than the original Millennium simulation (the edge length is six times as long, and 6³ = 216). The simulation was carried out on JUROPA, one of the top 15 supercomputers at the time. The run used about 12,000 processor cores and 30 TiB of RAM and produced more than 100 terabytes of output data. Cosmologists have used the data from the MXXL simulation to study the distribution of galaxies and dark matter halos on very large scales and to further clarify how the largest structures in the universe came into being.

Millennium Observatory

The Millennium Run Observatory (MROb, roughly: an observatory fed with Millennium data) is a theoretical, virtual observatory that takes the predictions for the distribution of dark matter and galaxies from the Millennium simulations, produces "observations" of them with a simulated virtual telescope, and then compares these with actual observations. Astrophysicists use it to plan real observing sessions and can thereby reduce the amount of scarce telescope time they need. By continually comparing simulated "observations" with actual ones, the predictions of the Millennium simulations can be checked and refined. A first tranche of virtual "observations" from the MROb was made available to astronomers worldwide for analysis on the MROb website. The virtual MROb universe can be explored with an online tool, the MROb browser, which lets the user interact with the MROb database holding the data of millions of dark matter halos and their galaxies. Further updates to the MROb are planned.
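
The kind of interaction the MROb browser offers can be pictured as running queries against a large table of halo and galaxy properties. The following Python sketch is purely illustrative: the table layout, column names and values are invented for this example and are not the actual MROb schema or interface.

# Purely illustrative sketch of querying a halo catalogue the way the MROb
# browser lets users interrogate the Millennium databases. Table, columns and
# numbers are invented placeholders, not the real MROb schema or data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE halos (id INTEGER, mass REAL, x REAL, y REAL, z REAL)")
conn.executemany(
    "INSERT INTO halos VALUES (?, ?, ?, ?, ?)",
    [(1, 1.2e14, 10.0, 20.0, 30.0),
     (2, 3.5e12, 11.5, 19.0, 28.0),
     (3, 8.0e14, 50.0, 60.0, 70.0)],
)

# Select all massive halos in a sub-volume, as one might do before planning
# a real observation of a galaxy cluster.
rows = conn.execute(
    "SELECT id, mass FROM halos WHERE mass > 1e14 AND x BETWEEN 0 AND 40"
).fetchall()
print(rows)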

The result

The simulation shows impressively how the imposed density fluctuations gradually grow until a clumpy structure emerges like the one in today's universe. By the end of the simulation, concentrations of mass the size of galaxies and galaxy clusters had formed, arranged in a sponge-like structure with fractal properties.

The results of the Millennium simulation agree very well with observations of the universe, so that for the first time a valid model of the universe as a whole had been generated. However, newer simulations such as the Bolshoi simulation, based on newer data, have produced partly different results.
