Some thoughts on Extreme Scale Computing
David J. Dean, Senior Advisor, Under Secretary for Science, Department of Energy


Page 1: Some thoughts on Extreme Scale Computing

Some thoughts on Extreme Scale Computing

David J. Dean, Senior Advisor, Under Secretary for Science, Department of Energy

Page 2: Some thoughts on Extreme Scale Computing

The tools have changed rapidly. These were our supercomputers in the 1970s and 1980s.

1986: X-MP/48; ~220 Mflop/s sustained; 120-150 kW (depending on model); $40M for computer + disks (FY09$)

Today:

NNSA: Roadrunner at 1.105 PF (LINPACK); LANL; 2.5 MW
SC/ASCR: Jaguar at 2.331 PF (LINPACK); ORNL; 6.9 MW

Factor of ~1x10^7 in speed; factor of ~18 in power
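The speed and power factors quoted on this slide can be checked with a quick back-of-envelope calculation. This is a sketch: the machine figures come from the slide, but the 135 kW X-MP power draw is an assumed midpoint of the quoted 120-150 kW range.

```python
# Back-of-envelope check of the slide's speed and power factors.
xmp_flops = 220e6             # X-MP/48, ~220 Mflop/s sustained (1986)
xmp_power_kw = 135.0          # assumed midpoint of the 120-150 kW range

jaguar_flops = 2.331e15       # Jaguar, 2.331 PF (LINPACK)
roadrunner_power_kw = 2500.0  # Roadrunner, 2.5 MW

speed_factor = jaguar_flops / xmp_flops
power_factor = roadrunner_power_kw / xmp_power_kw

print(f"speed: ~{speed_factor:.1e}x")  # ~1.1e+07x, the slide's factor of 1x10^7
print(f"power: ~{power_factor:.0f}x")  # ~19x, close to the slide's factor of 18
```

The power factor depends on which X-MP power figure is assumed; anywhere in the 120-150 kW range gives roughly the slide's factor of 18.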

Page 3: Some thoughts on Extreme Scale Computing

Various DOE computing assets serving the DOE mission space

Machine      Place   Speed (max)   On list since (rank)
Jaguar       ORNL    1.75 PF       2009 (#1)
Roadrunner   LANL    1.04 PF       2009 (#3)
Dawn         LLNL    0.478 PF      2007 (#7)
BG/P         ANL     0.458 PF      2007 (#8)
NERSC        LBL     0.266 PF      2008 (#15)
Red Storm    SNL     0.204 PF      2009 (#17)

Top 500 list, November 2009

Page 4: Some thoughts on Extreme Scale Computing

Leadership Computing: Scientific Progress at the Petascale

Nuclear Energy: High-fidelity predictive simulation tools for the design of next-generation nuclear reactors to safely increase operating margins.

Fusion Energy: Substantial progress in the understanding of anomalous electron energy loss in the National Spherical Torus Experiment (NSTX).

Nano Science: Understanding the atomic and electronic properties of nanostructures in next-generation photovoltaic solar cell materials.

Turbulence: Understanding the statistical geometry of turbulent dispersion of pollutants in the environment.

Energy Storage: Understanding the storage and flow of energy in next-generation nanostructured carbon tube supercapacitors.

Biofuels: A comprehensive simulation model of lignocellulosic biomass to understand the bottleneck to sustainable and economical ethanol production.

All known sustained petascale science applications to date have been run on the OLCF system.


Page 5: Some thoughts on Extreme Scale Computing

Process for identifying exascale applications and technology for DOE missions ensures broad community input

• Town Hall Meetings, April-June 2007
• Scientific Grand Challenges Workshops, November 2008 - October 2009
  • Climate Science (11/08)
  • High Energy Physics (12/08)
  • Nuclear Physics (1/09)
  • Fusion Energy (3/09)
  • Nuclear Energy (5/09)
  • Biology (8/09)
  • Material Science and Chemistry (8/09)
  • National Security (10/09)
• Cross-cutting workshops
  • Architecture and Technology (12/09)
  • Architecture, Applied Mathematics and Computer Science (2/10)
• Meetings with industry (8/09, 11/09)
• External Panels
  • ASCAC Exascale Charge
  • Trivelpiece Panel

MISSION IMPERATIVES

FUNDAMENTAL SCIENCE


Page 6: Some thoughts on Extreme Scale Computing

Simulation enables fundamental advances in basic science.

• High Energy Physics
  • Understanding of Dark Energy and Dark Matter
  • Testing QCD and physics beyond the standard model
• Nuclear Physics
  • Unification of nuclear physics from quark-gluon plasma to basics of nucleon structure to nucleosynthesis
  • Fundamental understanding of fission and fusion reactions
• Facility and experimental design
  • Effective design of accelerators
  • Probes of dark energy and dark matter
  • ITER shot planning and device control


[Slide images: ITER; ILC; Hubble image of lensing; structure of nucleons]

These breakthrough scientific discoveries and facilities require exascale applications and technologies.

Page 7: Some thoughts on Extreme Scale Computing

Computing applied to problems of national importance

• Climate
• Nuclear Energy
• Smart Grid
• Nuclear weapons
• Materials under extremes
• Combustion
• Competitiveness

See the ASCAC (Rosner Committee) and Trivelpiece reports. Simulations are a key part of the solutions.

Page 8: Some thoughts on Extreme Scale Computing

Understand and control chemical and physical phenomena in multicomponent systems from femtoseconds to millennia, at temperatures to 1000°C and radiation doses to hundreds of displacements per atom

Example: Fundamental science challenge for nuclear energy systems

• Microstructural evolution and phase stability

• Mass transport, chemistry, and structural evolution at interfaces

• Chemical behavior in actinide and fission-product solutes

• Solution phenomena

• Nuclear, chemical, and thermomechanical phenomena in fuels and waste forms

• First-principles theory for f-electron complexes and materials

• Predictive capability across length and time scales

• Material failure mechanisms

Basic Research Needs for Advanced Nuclear Energy Systems, Gaithersburg (2006)


Page 9: Some thoughts on Extreme Scale Computing

Example: The next decade will see Nuclear Energy models spanning multiple time and length scales.


Bridging length and time scales to resolve scientific unknowns [in nuclear energy] will require 3D simulations at 100x standard resolution: a 10-exaflop problem.

From the report Science-Based Nuclear Energy Systems Enabled by Advanced Modeling and Simulation at the Extreme Scale.

Page 10: Some thoughts on Extreme Scale Computing

Critical Exascale Technology Investments


• System power is a first-class constraint on exascale system performance and effectiveness.

• Memory is an important component of meeting exascale power and applications goals.

• Programming model: early investment in several efforts to decide in 2013 on an exascale programming model, allowing exemplar applications effective access to the 2015 system for both mission and science.

• Investment in exascale processor design to achieve an exascale-like system in 2015.

• An Operating System strategy for exascale is critical for node performance at scale and for efficient support of new programming models and run-time systems.

• Reliability and resiliency are critical at this scale and require application-neutral movement of the file system (for checkpointing, in particular) closer to the running applications.

• An HPC co-design strategy and implementation requires a set of hierarchical performance models and simulators as well as commitment from the applications, software, and architecture communities.
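The resiliency bullet above centers on checkpointing, which can be illustrated with a minimal application-level checkpoint/restart loop. This is a generic sketch, not any DOE system's actual mechanism; the file name, state layout, and checkpoint interval are invented for illustration.

```python
import json
import os
import tempfile

def save_checkpoint(path, state):
    """Write the state atomically: a crash mid-write must never
    clobber the last good checkpoint."""
    fd, tmp = tempfile.mkstemp(dir=os.path.dirname(path) or ".")
    with os.fdopen(fd, "w") as f:
        json.dump(state, f)
    os.replace(tmp, path)  # atomic rename on POSIX

def load_checkpoint(path):
    """Resume from the last checkpoint, or start fresh."""
    if os.path.exists(path):
        with open(path) as f:
            return json.load(f)
    return {"step": 0, "value": 0.0}

# Toy time-stepping loop that survives being killed and restarted.
ckpt = "sim.ckpt"  # invented file name
state = load_checkpoint(ckpt)
while state["step"] < 10:
    state["value"] += state["step"]  # stand-in for one simulation step
    state["step"] += 1
    if state["step"] % 5 == 0:       # checkpoint interval: a tunable trade-off
        save_checkpoint(ckpt, state)

print(state)  # {'step': 10, 'value': 45.0}
```

The checkpoint interval is the classic trade-off: checkpoint too often and I/O dominates, too rarely and a failure loses hours of work. At exascale failure rates this pressure is what pushes the file system closer to the running application, as the bullet notes.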

Page 11: Some thoughts on Extreme Scale Computing

Co-design expands the feasible solution space to allow better solutions


Page 12: Some thoughts on Extreme Scale Computing


Exascale Computing Need:

Enable dramatic advances in climate modeling, energy technologies, national security, and science via development of next-generation HPC

Challenges:

• The next 1000x improvement in computing capability cannot be achieved by simply scaling up today's hardware
• Power consumption needs to be dramatically reduced to make exascale feasible
• Millions of processors will present significant challenges for concurrency and resiliency
• New programming models will be required to exploit new architectures
• Applications, programming environment, and hardware must be co-developed
• New architectures will require rethinking applications and programming environment from the ground up
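The concurrency challenge above can be made concrete with Amdahl's law: even a tiny serial fraction caps the speedup obtainable from millions of processors. This is a generic illustration; the 0.1% serial fraction is an assumed figure, not from the slides.

```python
def amdahl_speedup(serial_fraction, n_procs):
    """Maximum speedup on n_procs when serial_fraction of the
    work cannot be parallelized (Amdahl's law)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

for n in (1_000, 1_000_000):
    s = amdahl_speedup(0.001, n)  # assume 0.1% of the code is serial
    print(f"{n:>9} processors -> speedup ~{s:,.0f}x")
```

Even with a million processors, 0.1% serial work limits the speedup to just under 1,000x, which is one reason the next 1000x cannot come from scaling up today's hardware and applications alone.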

Page 13: Some thoughts on Extreme Scale Computing


Exascale Computing Path Forward

We have begun exploratory research efforts (FY10-11). A concerted program would include:

• Goal: exascale capability by the end of the decade
• Lab/industry/academic partnerships to begin hardware and programming environment R&D
• Focus on key applications, including climate, nuclear security, and energy simulation topics
• High-level coordination required to ensure multiple research programs are appropriately integrated