Frequently Asked Questions

FAQ - Questions

What is the Bolshoi Cosmological Simulation?
Why a supercomputer simulation?
What observational input data were used?
What theory determined the computations?
What does the Bolshoi simulation show?
What supercomputer was used?
Was the Bolshoi simulation run just once?
Are the Bolshoi simulations the first cosmological simulations to be run?
Why is the Bolshoi simulation so important?
Why the name Bolshoi?
Where is more information about Bolshoi, BigBolshoi/MultiDark, and miniBolshoi?
What was the support for the Bolshoi simulation suite?

FAQ - Answers

What is the Bolshoi Cosmological Simulation?

The Bolshoi simulation is the best cosmological simulation yet made of the evolution of the large-scale structure of the universe, from shortly after the Big Bang to the present. Calculated on one of the world’s largest supercomputers, its results are being made publicly available to the world’s astronomers and astrophysicists.

Why a supercomputer simulation?

The physical universe is too vast and massive and time scales too long for bench-top laboratory experiments to test scientific theories. With supercomputers, astronomers can create a simulation—a model—of distant events that occurred over billions of years, based on known laws of physics, and observe that model in sped-up time, to make predictions that can be tested by actual observations of the real universe. In essence, supercomputing has transformed cosmology into an experimental science.

The Bolshoi simulation took observational data, input the measurements into a computational model of a representative sample of the universe, started time running shortly after the universe formed, and let the supercomputer crunch the numbers on an enormous scale. Along the way, the supercomputer regularly captured three-dimensional “snapshots,” like frames of a giant 3-D movie. These snapshots, called time steps, have been stored for future study by astronomers and astrophysicists worldwide.

What observational input data were used?

The input data to the Bolshoi simulation are based on the world’s most accurate available measurements of key cosmological parameters.

Specifically, Bolshoi is based on the cosmological parameters of WMAP5, a data set that combines the most reliable ground-based observations with a five-year run of cumulative data from the highly successful NASA Explorer mission WMAP (the Wilkinson Microwave Anisotropy Probe). Such a long run of cumulative data allowed the WMAP science team to produce high-resolution maps of the entire sky, meticulously plotting the anisotropy—that is, the unevenness—of the temperature and other characteristics of the cosmic microwave background radiation in great detail.

The cosmic microwave background radiation is radiation left over from the Big Bang that formed the universe 13.7 billion years ago (an age that WMAP has now determined to within 1% accuracy). Analysis of the tiny variations in this primordial radiation over the sky has revealed a wealth of information about the history and composition of the universe. It also provides a significant test of Einstein’s General Theory of Relativity, which describes the nature of gravity.

In 2010 the WMAP science team released even longer cumulative results of the WMAP mission’s first seven years of measurements, plus additional ground-based observations. These WMAP7 results are completely consistent with the WMAP5 results on which the Bolshoi simulation is based.

What theory determined the computations?

The Bolshoi simulation is based on Lambda Cold Dark Matter cosmogony (abbreviated ΛCDM), now accepted as the standard modern theoretical framework for understanding the formation of the large-scale structure in the universe.

Ordinary matter—known to physicists as “baryonic” matter because most of the mass of ordinary atoms comes from its baryons, the protons and neutrons in atomic nuclei—makes up less than 5 percent of the universe. About five times that much—about 23 percent—of the density of the universe is made of invisible, transparent “cold dark matter,” whose existence is felt through its gravitational influence. The remaining 72 percent of the cosmic density is dark energy (described more below). It is now known that every galaxy, including our own Milky Way, resides at the center of a giant halo of dark matter roughly ten times larger in radius and mass. The ΛCDM cosmogony, which includes dark energy, makes detailed predictions for the hierarchical growth of structure in the universe through gravitation: specifically, it predicts that repeated mergers of smaller things ultimately build up bigger things.
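As a quick consistency check, the round numbers quoted above can be added up (this is a sketch using the text's approximate figures; the exact WMAP5 values differ slightly):

```python
# Approximate WMAP5-era composition of the cosmic density,
# using the rounded percentages quoted in the text above.
baryonic = 0.05          # ordinary ("baryonic") matter: less than 5 percent
cold_dark_matter = 0.23  # invisible cold dark matter
dark_energy = 0.72       # dark energy (the cosmological constant, Lambda)

total = baryonic + cold_dark_matter + dark_energy
print(f"Total fraction of cosmic density: {total:.2f}")  # → 1.00

# "About five times that much" dark matter as baryonic matter:
ratio = cold_dark_matter / baryonic
print(f"Dark matter / baryonic matter: about {ratio:.0f}x")  # → about 5x
```

The three components sum to the full cosmic density, and the dark-matter-to-baryon ratio is roughly the factor of five the text describes.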

Thus, the Bolshoi simulation models not just how the visible minority of the universe—its stars, gas, and dust—evolved, but also how the invisible majority did. Indeed, one principal purpose of the Bolshoi simulation is to compute and model the evolution of dark matter halos—thereby rendering the invisible visible for astronomers to study, and predicting structures that astronomers could then seek to observe.

What does the Bolshoi simulation show?

Using the best available observational data (WMAP5/7) and following the best available theory (ΛCDM), the Bolshoi simulation computed how a representative volume of the universe evolved from the Big Bang to the present.

It did not model the entire universe. Just as many sciences seeking to understand large populations rely on analyzing a smaller representative sample, the Bolshoi simulation modeled a smaller representative volume. Specifically, it computed the evolution of a cubic volume measuring about 1 billion light-years on a side: a volume that would contain more than a million galaxies. (For comparison, the visible Milky Way galaxy is about 100,000 light-years across, and its dark matter halo is about 1.5 million light-years across.)

The Bolshoi simulation clock started about 24 million years after the Big Bang, based on a highly accurate calculation of the evolution of the universe to that time. The Bolshoi simulation then followed the evolution of 8.6 billion particles, each particle representing an amount of dark matter with a mass about 200 million times the mass of the sun (about 1/5000th the mass of the Milky Way dark matter halo). The resulting picture of all the dark matter particles and their motions was captured and stored 180 times during the simulated evolution of the universe, like frames in a monumental three-dimensional movie. These stored time steps will allow astrophysicists to explore the three-dimensional model of the universe and study how dark matter halos, their galaxies, and clusters of galaxies coalesced and evolved.
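The particle-mass figures above can be sanity-checked with back-of-the-envelope arithmetic. The Milky Way halo mass used here (about 10^12 solar masses) is a commonly quoted value assumed for illustration, not a number stated in the text:

```python
# Back-of-the-envelope check of the particle figures quoted above.
n_particles = 8.6e9     # particles followed by the simulation
particle_mass = 2e8     # solar masses per simulation particle ("200 million")
mw_halo_mass = 1e12     # ASSUMED Milky Way dark-matter-halo mass, solar masses

# Each particle as a fraction of the Milky Way halo:
print(f"Particle mass is about 1/{mw_halo_mass / particle_mass:.0f} of the halo")
# → about 1/5000, matching the text

# Total dark matter mass represented in the simulated volume:
total_mass = n_particles * particle_mass
print(f"Total simulated dark matter: {total_mass:.1e} solar masses")
```

With these inputs, one particle is indeed about 1/5000th of the assumed halo mass, consistent with the fraction given in the text.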


What supercomputer was used?

The full Bolshoi computation took 6 million CPU hours on the Pleiades supercomputer at the NASA Advanced Supercomputing facility (NAS) at NASA Ames Research Center, Moffett Field, California. Pleiades was ranked in June 2011 as the seventh fastest supercomputer in the world (see http://www.nasa.gov/home/hqnews/2011/jun/HQ-11-194_Supercomputer_Ranks.html ). The resulting 180 stored time steps—each about 0.5 terabytes—occupy nearly 90 terabytes of disk storage.
The Bolshoi simulation produced breathtaking images and animations (see http://hipacc.ucsc.edu/Bolshoi/Images.html and http://hipacc.ucsc.edu/Bolshoi/Movies.html ) that show the formation and evolution of every dark matter halo that can host galaxies, including the merging of these halos. These visualizations were made possible by custom software tools developed by the NAS data analysis and visualization team. More details about the Pleiades supercomputer and its visualization tools appear at http://www.nas.nasa.gov/hecc/resources/pleiades.html .
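The storage figure quoted above follows directly from the snapshot count and size; a one-line sketch makes the arithmetic explicit:

```python
# Storage arithmetic for the stored time steps, per the figures above.
n_snapshots = 180        # stored time steps
tb_per_snapshot = 0.5    # approximate terabytes per snapshot

total_tb = n_snapshots * tb_per_snapshot
print(f"Total storage: about {total_tb:.0f} TB")  # → about 90 TB
```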

Was the Bolshoi simulation run just once?

The Bolshoi simulation was the first of a suite of three separate simulations run on Pleiades.

A second simulation, known as BigBolshoi or MultiDark, was run with the same number of particles, but in a volume 4 billion light-years across—64 times larger. Although of lower resolution, BigBolshoi was run to predict the properties and distribution of galaxy clusters and other very large structures in the universe, and to help with projects such as the Baryon Oscillation Spectroscopic Survey (BOSS), which is attempting to determine the properties of dark energy. (Most of the density of the universe, about 72 percent, is yet another invisible component, called dark energy, which is causing the expansion of the universe to speed up. All available data suggest that dark energy is a cosmological constant, a possibility first proposed by Albert Einstein, who denoted it by the Greek capital letter lambda [Λ]. That is what the Bolshoi simulations assume.)
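The factor of 64 comes from cubing the ratio of box sizes, since volume scales as the cube of the side length. A minimal sketch, using the approximate box sizes quoted in the text:

```python
# Volume scaling between the Bolshoi and BigBolshoi simulation boxes.
bolshoi_side = 1.0       # billion light-years (approximate, per the text)
bigbolshoi_side = 4.0    # billion light-years

# Volume scales as the cube of the side length: (4/1)^3 = 64.
ratio = (bigbolshoi_side / bolshoi_side) ** 3
print(f"BigBolshoi volume is {ratio:.0f}x the Bolshoi volume")  # → 64x
```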

The Bolshoi and BigBolshoi simulation database is hosted by the MultiDark project at the Leibniz Institute for Astrophysics Potsdam (AIP) in Potsdam, Germany, supported by grants from Spain and Germany.

A third, higher-resolution simulation of a smaller region, called miniBolshoi, is now being run on Pleiades to model the formation and distribution of the tiniest galaxies.

Are the Bolshoi simulations the first cosmological simulations to be run?

No. The first gigantic cosmological supercomputer simulation was the Millennium simulation, published in 2005 by the Virgo consortium of scientists in Europe. The original Millennium simulation was run in a volume approximately 2 billion light-years across and stored outputs at 64 time steps. Based on the first release of WMAP data in 2003 (WMAP1), drawn from only one year of the WMAP mission's measurements, the Millennium simulation has proved very fruitful for modern cosmology, serving as the basis for more than 400 research papers. A higher-resolution simulation called Millennium-II was published in 2009, but it used the same WMAP1 cosmological parameters, now known to be incorrect.

The initial WMAP1 parameters have been superseded by the WMAP5 parameters (used for Bolshoi) and WMAP7 (which are consistent with the WMAP5 findings), whose longer runs of data-collecting (five and seven years) together with improved cosmological data from other sources significantly modified the initially released findings.

The Bolshoi simulation has the same force resolution as Millennium-II in a volume 16 times larger, and is based on the WMAP5/7 cosmological parameters.


Why is the Bolshoi simulation so important?

Bolshoi is the most accurate and highest-resolution large cosmological simulation run to date. Its results—including merger trees (basically the family trees) of dark matter halos and galaxies—will be made available this fall and winter to the world’s astronomers in a series of phased releases.

The Bolshoi and BigBolshoi/MultiDark simulations will allow astronomers to model the evolution of the large scale structure of the universe, including the evolution and distribution of galaxies and clusters, with unprecedented accuracy. For example, the distribution of galaxies and the properties of galaxy clusters implied by the Bolshoi simulations appear to be in excellent agreement with observations.

Huge cosmological simulations are essential for interpreting the results of ongoing astronomical observations, and for planning the new large surveys of the universe that are expected to help determine the nature of the mysterious dark energy.


Why the name Bolshoi?

“Bolshoi” is the Russian word for “big,” “great,” or “grand.”

Where is more information about Bolshoi, BigBolshoi/MultiDark, and miniBolshoi?

Technical papers resulting from the Bolshoi simulations are a growing list; links can be found at http://hipacc.ucsc.edu/Bolshoi/Papers.html . Links to additional relevant websites, images, videos, and press releases can be found at http://hipacc.ucsc.edu/Bolshoi/ .

What was the support for the Bolshoi simulation suite?

This research was supported by grants from NASA and NSF to Joel Primack and Anatoly Klypin, including massive grants of supercomputer time on the NASA Advanced Supercomputing (NAS) supercomputer Pleiades at NASA Ames Research Center, Moffett Field, CA. Hosting of the Bolshoi outputs and analyses at the Leibniz Institute for Astrophysics Potsdam (AIP) is partially supported by the MultiDark grant from the Spanish MICINN Consolider-Ingenio 2010 Programme.