1. The Purpose and History of Astronomical Computer Simulations
2. Algorithms
3. Systems/Architectures
4. Simulations/Projects
The Purpose of Astronomical Computer Simulations
What are astronomical computer simulations? They are computationally generated representations of astronomical objects (galaxies, clusters, dark matter, etc.) used to model subsets of the universe.
What purpose do they serve? Simulations act as experiments for testing cosmological theories. Each is set up with initial conditions chosen for the specific experiment; when run, an accurate simulation reproduces the behavior of its observable counterparts.
To what theories can simulations be applied? The expansion of the universe, galaxy mergers, star formation, dark matter, and more.
The Need for Simulations
Wait? Direct observation of galactic-scale processes could take millions of years.
Simulate! Use N-body algorithms instead. In 1941, Erik Holmberg had a bright idea [1]: he modeled two interacting galaxies with light bulbs and photocells, exploiting the fact that light intensity falls off as 1/r^2 just like gravity, in effect an analog N-body simulation.
The Development of Simulations
The first computer simulation was conducted in 1963 by Sverre Aarseth, for n ≈ 100 [2]. From there, different N-body algorithms developed over time.
Year     N
1970s    10^3
1980s    10^4
1990s
2000s    10^10+
The N-Body Problem
Particle-Particle (PP)
- Direct application of the N-body equations
- O(n^2)
- No approximation: accuracy is limited only by machine precision
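The direct PP approach can be sketched in a few lines; every pair of particles is evaluated, which is where the O(n^2) cost comes from. This is a minimal illustration in code units (G = 1), with an assumed softening parameter to avoid singularities at zero separation:

```python
import math

def accelerations(positions, masses, softening=1e-3):
    """Direct-summation (PP) gravitational acceleration on each particle, G = 1."""
    n = len(positions)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):          # every pair is visited: O(n^2)
            if i == j:
                continue
            dx = [positions[j][k] - positions[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + softening ** 2
            inv_r3 = 1.0 / (r2 * math.sqrt(r2))
            for k in range(3):
                acc[i][k] += masses[j] * dx[k] * inv_r3
    return acc

# Two equal masses one unit apart: accelerations are equal and opposite.
pos = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
m = [1.0, 1.0]
a = accelerations(pos, m)
```

Apart from the softening, the only error in the result is floating-point round-off, which is why PP accuracy is quoted as machine precision.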
Barnes-Hut Tree
- Most common tree-code approach
- O(n log n)
- Uses an octree in 3D
- Precision and work both depend on the chosen opening angle theta
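The heart of Barnes-Hut is the opening criterion: a cell of size s at distance d is replaced by its total mass at its center of mass whenever s/d < theta, and split otherwise. A toy 1-D version (an illustrative sketch, not the real 3-D octree) makes the trade-off visible:

```python
def force_on(x, particles, lo, hi, theta=0.5):
    """Barnes-Hut-style force (per unit test mass, G = 1) on point x
    from particles (position, mass) lying in the 1-D cell [lo, hi)."""
    cell = [(p, m) for (p, m) in particles if lo <= p < hi]
    if not cell:
        return 0.0
    mass = sum(m for _, m in cell)
    com = sum(p * m for p, m in cell) / mass
    size = hi - lo
    dist = abs(x - com)
    # Opening criterion: far-away (or single-particle) cells are
    # approximated by their total mass at the center of mass.
    if len(cell) == 1 or (dist > 0 and size / dist < theta):
        if dist == 0:
            return 0.0
        return mass * (com - x) / dist ** 3
    # Otherwise split the cell and recurse (the "tree walk").
    mid = (lo + hi) / 2.0
    return force_on(x, cell, lo, mid, theta) + force_on(x, cell, mid, hi, theta)

pts = [(0.1, 1.0), (0.2, 1.0), (0.9, 1.0)]
f = force_on(2.0, pts, 0.0, 1.0, theta=0.5)
# For comparison: the exact direct (PP) sum over the same particles.
direct = sum(m * (p - 2.0) / abs(p - 2.0) ** 3 for p, m in pts)
```

Smaller theta opens more cells, doing more work for higher precision; theta = 0 degenerates into the exact O(n^2) sum.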
Particle-Mesh (PM)
- Applies a mesh over the computational domain
- O(G log G) + O(N), where G is the number of grid cells
- Computes the mass density and potential field on the grid
- Low resolution: forces below the grid scale are smoothed away
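The O(N) part of PM is the mass-assignment step: particle masses are binned onto the mesh to form a density field, from which the potential is then solved on the grid (e.g. by FFT, the O(G log G) part). A 1-D sketch using the simplest scheme, nearest-grid-point assignment (the scheme choice is an assumption for illustration):

```python
def ngp_density(positions, masses, n_cells, box):
    """Nearest-grid-point mass assignment onto a periodic 1-D mesh.
    Returns the mass density (mass per unit length) in each cell."""
    dx = box / n_cells
    rho = [0.0] * n_cells
    for x, m in zip(positions, masses):
        i = int(x / dx) % n_cells   # cell containing the particle
        rho[i] += m / dx            # deposit the whole mass in one cell
    return rho

# Three particles on a 4-cell mesh over a unit box.
rho = ngp_density([0.1, 0.15, 0.8], [1.0, 1.0, 2.0], n_cells=4, box=1.0)
```

The two particles at 0.1 and 0.15 land in the same cell and become indistinguishable, which is exactly the low-resolution limitation noted above: structure below the grid spacing is lost.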
Particle-Particle/Particle-Mesh (P3M)
- A fix to PM's low-resolution problem
- Uses PP for small separations, the mesh for long-range forces
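The essence of P3M is splitting the force by separation: pairs closer than some cutoff radius are handled by direct PP summation, while everything beyond it is left to the smooth mesh force. A toy sketch of just that splitting step (the cutoff value and helper name are illustrative assumptions):

```python
def split_pairs(positions, r_cut):
    """Partition 1-D particle pairs into short-range (PP) and
    long-range (mesh) sets by a cutoff radius r_cut."""
    short, far = [], []
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            d = abs(positions[i] - positions[j])
            (short if d < r_cut else far).append((i, j))
    return short, far

short, far = split_pairs([0.0, 0.05, 0.9], r_cut=0.1)
```

Only the short list incurs the expensive O(n^2)-style pairwise work, restoring sub-grid resolution where it matters while the mesh keeps the long-range cost low.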
The Modern Landscape
RIT’s BlueSky Linux
-1000 CPUs 4TB memory 200TB storage NASA-AMES -About 250 times BlueSky Linux’s computing power
-over 4 PetaFLOPS Illustris Project -8,192 CPUs, 25 TB of RAM, 230 TB of gathered data GRAPE -512 PC 2 GRAPE-DR boards each
GRAPE-DR
- Hardware acceleration board for gravitational force calculations
- Works in parallel with the host CPU
- Far more power-efficient than comparable GPGPU hardware
             GRAPE-DR    NVIDIA Fermi
Cores        512         448
Transistors  400 M       3 B
Clock        400 MHz     1.15 GHz
FLOPS        400 G       1.03 T
Power        50 W        247 W
The Millennium Simulation
Run by the Virgo Consortium located at the Max Planck Institute for Astrophysics in Garching, Germany.
10 billion "particles", each representing 8.6 × 10^8 solar masses of dark matter
Astronomical Object    Corresponding Particle Count
Dwarf galaxy           ~100 particles
Milky Way galaxy       ~1,000 particles
Galaxy cluster         ~1,000,000+ particles
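The particle counts above follow directly from the mass resolution: multiplying by the 8.6 × 10^8 solar masses per particle gives the dark-matter mass each object represents. A quick arithmetic check:

```python
# Millennium Simulation mass resolution (from the figures above).
M_PARTICLE = 8.6e8  # solar masses of dark matter per simulation particle

def represented_mass(n_particles):
    """Dark-matter mass represented by a given number of particles."""
    return n_particles * M_PARTICLE

dwarf = represented_mass(100)        # dwarf-galaxy halo
milky_way = represented_mass(1000)   # Milky Way-scale halo
cluster = represented_mass(1_000_000)  # galaxy cluster
```

So a ~100-particle dwarf corresponds to roughly 8.6 × 10^10 solar masses of dark matter, and a million-particle cluster to about 8.6 × 10^14.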
- Spatial resolution of 5 kpc/h
- Contains over 20 million galaxies
- Allows accurate predictions for strong and weak gravitational lensing
- Large scale, yet relatively precise
- 28 days of wall-clock computation
- Published in Nature in June 2005
Illustris Project
- Ongoing group of astrophysical simulations run by many scientists
- Aims at a better understanding of the formation and evolution of galaxies
- Includes black holes, dark matter, and dark energy
- Main simulation run on the Curie supercomputer at CEA (France) and the SuperMUC computer at the Leibniz Computing Center (Germany)
- 8,192 CPU cores, 25 TB of RAM
- Galaxy matching between full-physics (FP) and dark-matter-only (DMO) runs
- Non-parametric stellar morphologies at z > 0