The Millennium Run, or Millennium Simulation (the name refers to its size), is a computer N-body simulation used to investigate how the distribution of matter in the Universe has evolved over time, in particular how the observed population of galaxies was formed. It is used by scientists working in physical cosmology to compare observations with theoretical predictions.
Overview
A basic scientific method for testing theories in cosmology is to evaluate their consequences for the observable parts of the universe. One piece of observational evidence is the distribution of matter, including galaxies and intergalactic gas, observed today. Light emitted by more distant matter takes longer to reach Earth, so looking at distant objects is like looking further back in time. This means that the evolution in time of the universe's matter distribution can also be observed directly.
The Millennium Simulation was run in 2005 by the Virgo Consortium, an international group of astrophysicists from Germany, the United Kingdom, Canada, Japan and the United States. It starts at the epoch when the cosmic background radiation was emitted, about 379,000 years after the universe began. The cosmic background radiation has been studied by satellite experiments, and the observed inhomogeneities in the cosmic background serve as the starting point for following the evolution of the corresponding matter distribution. Using the physical laws expected to hold in currently favoured cosmological models, together with simplified representations of the astrophysical processes observed to affect real galaxies, the initial distribution of matter is allowed to evolve, and the simulation's predictions for the formation of galaxies and black holes are recorded.
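At its core, an N-body code of this kind integrates the gravitational motion of the dark-matter "particles" forward in time from those initial conditions. The following is a minimal, illustrative Python sketch of that idea, using direct summation and a leapfrog integrator; the particle count, units and parameters are arbitrary toy values, and the actual Millennium Run used the far more efficient tree-PM GADGET code in a periodic, expanding volume.

    import numpy as np

    # Toy parameters -- the real run evolved 2160^3 particles,
    # which is far beyond direct O(N^2) summation.
    N = 100          # number of particles (toy scale)
    G = 1.0          # gravitational constant in code units
    eps = 0.05       # softening length, avoids singular close encounters
    dt = 0.01        # time step
    steps = 1000

    rng = np.random.default_rng(42)
    pos = rng.uniform(-1.0, 1.0, size=(N, 3))   # initial positions
    vel = np.zeros((N, 3))                      # start from rest
    mass = np.full(N, 1.0 / N)                  # equal-mass particles

    def accelerations(pos, mass):
        """Direct-summation softened gravitational accelerations."""
        diff = pos[None, :, :] - pos[:, None, :]        # r_j - r_i
        dist2 = (diff ** 2).sum(axis=-1) + eps ** 2     # softened |r|^2
        inv_d3 = dist2 ** -1.5
        np.fill_diagonal(inv_d3, 0.0)                   # no self-force
        return G * (diff * inv_d3[..., None]
                    * mass[None, :, None]).sum(axis=1)

    # Leapfrog (kick-drift-kick) time integration
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc       # half kick
        pos += dt * vel             # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc       # half kick

Production codes replace the O(N²) force loop with tree and particle-mesh methods, which is what makes billions of particles tractable.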
Since the completion of the Millennium Run simulation in 2005, a series of ever more sophisticated and higher-fidelity simulations of the formation of the galaxy population have been built using its stored output and have been made publicly available over the internet. In addition to improving the treatment of the astrophysics of galaxy formation, recent versions have adjusted the parameters of the underlying cosmological model to reflect changing ideas about their precise values. As of mid-2018, more than 950 published papers had made use of data from the Millennium Run, making it, at least by this measure, the highest-impact astrophysical simulation of all time.
Size of the simulation
For the first scientific results, published on June 2, 2005, the Millennium Simulation traced 2160³, or just over 10 billion, "particles". These are not particles in the particle-physics sense; each "particle" represents approximately a billion solar masses of dark matter. The region of space simulated was a cube about 2 billion light-years on a side, and this volume was populated by about 20 million "galaxies". A supercomputer located in Garching, Germany, executed the simulation, which used a version of the GADGET code, for more than a month. The output of the simulation required about 25 terabytes of storage.
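These headline figures are mutually consistent, as a quick back-of-the-envelope check shows. The sketch below uses the rounded values quoted above, not the exact internal simulation units:

    n_side = 2160
    n_particles = n_side ** 3              # 10,077,696,000 -- "just over 10 billion"
    m_particle = 1e9                       # roughly a billion solar masses each
    total_mass = n_particles * m_particle  # ~1e19 solar masses of dark matter

    box_ly = 2e9                           # box side, ~2 billion light-years
    volume = box_ly ** 3                   # ~8e27 cubic light-years
    ly_per_galaxy = volume / 20e6          # ~4e20 cubic light-years per "galaxy"

    print(f"{n_particles:,} particles, ~{total_mass:.0e} solar masses in total")
    print(f"~{ly_per_galaxy:.0e} cubic light-years per galaxy")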
First results
The Sloan Digital Sky Survey had challenged the then-current understanding of cosmology by finding black hole candidates in very bright quasars at large distances, meaning that they were created much earlier than initially expected. By successfully producing quasars at such early times, the Millennium Simulation demonstrated that these objects do not contradict our models of the evolution of the universe.
Millennium II
In 2009, the same group ran the 'Millennium II' simulation (MS-II) on a smaller cube (about 400 million light-years on a side), with the same number of particles but with each particle representing 6.9 million solar masses. This is a more demanding numerical task, since splitting the computational domain evenly between processors becomes difficult when dense clumps of matter are present. MS-II used 1.4 million CPU hours over 2048 cores (i.e. about a month of wall-clock time) on the Power-6 computer at Garching; a simulation was also run with the same initial conditions and fewer particles, to check that features seen in the higher-resolution run were also present at lower resolution.
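The quoted wall-clock time follows directly from the CPU budget, as a quick check shows:

    cpu_hours = 1.4e6                 # total CPU-hours quoted for MS-II
    cores = 2048
    wall_days = cpu_hours / cores / 24
    print(f"{wall_days:.1f} days")    # ~28.5 days -- "about a month"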
Millennium XXL
In 2010, the 'Millennium XXL' simulation (MXXL) was performed, this time using a much larger cube (over 13 billion light-years on a side) and 6720³ particles, each representing 7 billion times the mass of the Sun. The MXXL spans a cosmological volume 216 times and 27,000 times the size of the Millennium and MS-II simulation boxes, respectively. The simulation was run on JUROPA, one of the top 15 supercomputers in the world in 2010. It used more than 12,000 cores for the equivalent of 300 years of CPU time and 30 terabytes of RAM, and it generated more than 100 terabytes of data. Cosmologists use the MXXL simulation to study the distribution of galaxies and dark matter halos on very large scales, and how the rarest and most massive structures in the universe came about.
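The volume ratios are simply the ratios of the box side lengths cubed, and the wall-clock time again follows from the core count. A sketch, assuming the side lengths of roughly 3000, 500 and 100 h⁻¹ megaparsecs commonly quoted for MXXL, Millennium and MS-II:

    # Assumed box side lengths in h^-1 megaparsecs (MXXL, Millennium, MS-II)
    mxxl, millennium, ms2 = 3000.0, 500.0, 100.0
    print((mxxl / millennium) ** 3)      # 216.0
    print((mxxl / ms2) ** 3)             # 27000.0

    cpu_years = 300.0                    # equivalent CPU time quoted
    cores = 12000
    print(cpu_years * 365.25 / cores, "days")  # ~9 days of wall-clock time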