
Page 1: European Exascale Software Initiative (spscicomp.org/wordpress/wp-content/uploads/2011/05/berthou-EESI…)

FP7 Support Action - European Exascale Software Initiative

DG Information Society and the unit e-Infrastructures

European Exascale Software Initiative

SPXXL/SCICOMP, Paris - May 2011

Jean-Yves Berthou, EDF R&D

EESI coordinator, www.eesi-project.eu


Page 2:

Factors that Necessitate Redesign of Our Software

- Steepness of the ascent from terascale to petascale to exascale

- Extreme parallelism and hybrid design

  - Preparing for million/billion-way parallelism

  - Tightening memory/bandwidth bottleneck

- Limits on power and clock speed, and their implications for multicore

  - The pressure to reduce communication will become much more intense

  - Memory per core changes; the byte-to-flop ratio will change

- Necessary fault tolerance

  - MTTF will drop; checkpoint/restart has limitations

Software infrastructure does not exist today
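The checkpoint/restart limitation above can be made concrete with Young's classic approximation for the optimal checkpoint interval. A minimal sketch; the node MTBF, node count, and checkpoint cost below are illustrative assumptions, not figures from the talk:

```python
import math

def system_mtbf(node_mtbf_s: float, n_nodes: int) -> float:
    """System MTBF shrinks roughly linearly with node count."""
    return node_mtbf_s / n_nodes

def optimal_checkpoint_interval(ckpt_cost_s: float, mtbf_s: float) -> float:
    """Young's approximation: tau_opt = sqrt(2 * C * MTBF)."""
    return math.sqrt(2.0 * ckpt_cost_s * mtbf_s)

# Illustrative assumptions: 5-year node MTBF, 100,000 nodes, 10-minute checkpoint.
node_mtbf = 5 * 365 * 24 * 3600.0
mtbf = system_mtbf(node_mtbf, 100_000)          # ~26 minutes for the full system
tau = optimal_checkpoint_interval(600.0, mtbf)  # optimal compute time between checkpoints
overhead = 600.0 / (tau + 600.0)                # rough fraction of time spent checkpointing
```

Under these assumptions roughly 30% of machine time would go to writing checkpoints, which is why global checkpoint/restart alone is not expected to scale to exascale.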

Page 3:

Potential System Architecture Targets

Page 4:

A Call to Action

- Hardware has changed dramatically while the software ecosystem has remained stagnant

- Previous approaches have not looked at co-design of multiple levels in the system software stack (OS, runtime, compiler, libraries, application frameworks)

- Need to exploit new hardware trends (e.g., manycore, heterogeneity, memory-per-socket trends) that cannot be handled by the existing software stack

- Emerging software technologies exist, but have not been fully integrated with system software (e.g., UPC, Cilk, CUDA, HPCS)

- Community codes are unprepared for the sea change in architectures

- No global evaluation of key missing components

www.exascale.org

Page 5:

International Community Effort

- We believe this needs to be an international collaboration, for reasons including:

  - The scale of the investment

  - The need for international input on requirements

  - US, European, Asian, and other groups are working on their own software, which should be part of a larger vision for HPC

  - No global evaluation of key missing components

  - Hardware features are uncoordinated with software development

www.exascale.org

Page 6:

Where We Are Today:

www.exascale.org

- Ken Kennedy's Petascale Software Project (2006)

- Nov 2008: SC08 (Austin, TX) meeting to generate interest

- Funding from DOE's Office of Science & NSF Office of Cyberinfrastructure, with sponsorship by European and Asian partners

- Apr 2009: US meeting (Santa Fe, NM), April 6-8, 2009; 65 people

- Jun 2009: European meeting (Paris, France), June 28-29, 2009; outline report

- Oct 2009: Asian meeting (Tsukuba, Japan), October 18-20, 2009; draft roadmap and refine report

- Nov 2009: SC09 (Portland, OR) BOF to inform others; public comment; draft report presented

- Apr 2010: European meeting (Oxford, UK), April 13-14, 2010; refine and prioritize roadmap; look at management models

- Oct 2010: Maui meeting, October 18-19, 2010

- Nov 2010: SC10 (New Orleans) BOF to inform others (Wed 5:30, Room 389)

- Apr 2011: San Francisco meeting, April 6-7, 2011

Page 7:

IESP Goal

Build an international plan for developing the next-generation open-source software for scientific high-performance computing.

Improve the world's simulation and modeling capability by improving the coordination and development of the HPC software environment.

www.exascale.org

Page 8:

Roadmap Components

www.exascale.org

Page 9:

Motivations for launching EESI

Coordinate the European contribution to IESP

Enlarge the European community involved in the software roadmapping activity

Build and consolidate a vision and roadmap at the European Level, including applications, both from academia and industry


Page 10:

Characteristics of the EESI project

Coordination and support action – FP7/Infrastructures
www.eesi-project.eu

Coordinator: EDF R&D, Jean-Yves Berthou
Starting date: 1st of June 2010, for 18 months
Requested EC contribution: 640 000 €
Consortium: 8 contractual partners, 17 associated participants, 11 contributing participants

The project has received funding from the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement n° 261513

Page 11:

EESI partners

STRATOS

Page 12:

EESI main goals

Build a European vision and roadmap to address the challenge of performing scientific computing on the new generation of computers, which will provide hundreds-of-Petaflops performance in 2015 and Exaflop performance in 2020:

- investigate where Europe stands, its strengths and weaknesses, in the overall international HPC landscape and competition

- identify priority actions

- identify the sources of competitiveness for Europe induced by the development of Peta/Exascale solutions and usages

- investigate and propose programs in education and training for the next generation of computational scientists

- identify and stimulate opportunities for worldwide collaborations

Page 13:

EESI main tasks

Coordination of the European participation in IESP:

- Make a thorough assessment of needs, issues and strategies

- Develop a coordinated software roadmap

- Provide a framework for organizing the software research community

- Engage and coordinate the vendor community in crosscutting efforts

- Encourage and facilitate collaboration in education and training

Cartography of existing HPC projects and initiatives in Europe, the US and Asia

Coordination of "disciplinary working groups" at the European level:

- Four groups on "Application Grand Challenges"

- Four groups on "Enabling technologies for Petaflop/Exaflop computing"

Synthesis, dissemination and recommendations to the European Commission

Page 14:

Initial cartography of existing HPC projects and initiatives in Europe, the US and Asia

- Initiatives in Europe:

  - Europe – Funding

  - Europe – Collaborative Projects & Existing Initiatives

  - National Initiatives in Europe

- Initiatives worldwide:

  - US Systems & Initiatives

  - Japan Systems & Initiatives

  - China Systems & Initiatives

  - South Korea Systems & Initiatives

  - India Systems & Initiatives

www.eesi-project.eu

Page 15:

European Exascale Software Initiative AGENDA

Timeline (T0 = June 1, 2010; the project runs 18 months, to November 2011):

- T0 to T0+5: Constitution of the working groups; setup of guidelines and organisation modes

- T0+5 (October 2010): Initial international workshop; initial cartography of existing HPC projects and initiatives in Europe, the US and Asia

- T0+5 to T0+13: Working groups run in parallel:

  - Application Grand Challenges: Weather, Climatology and Earth Sciences; Industrial and Engineering Applications (Transport, Energy); Life Sciences, Health, BPM; Fundamental Sciences (Chemistry, Physics)

  - Enabling technologies for Exaflop computing: Scientific software engineering; Software eco-systems; Hardware roadmap, links with vendors; Numerical libraries, solvers and algorithms

- T0+12 (June 2011): Internal workshop: presentation of each working group's results and roadmaps; updated cartography of existing HPC projects and initiatives in Europe, the US and Asia

- T0+13 to T0+17: Synthesis of all contributions and production of a set of recommendations

- T0+16: Presentation of EESI results to the EC

- T0+18 (November 2011): Final conference: public presentation of project results

Page 16:

EESI General framework: WP3 and WP4 interactions

- WP3 (Applications Grand Challenges) provides scientific needs to WP4

- WP4 (Enabling Technologies for Exaflop Computing) provides hardware & software roadmaps to WP3

- WP2: Link with IESP and other projects

- WP5: Communication and dissemination

IESP Meeting, San Francisco, 6-7 April 2011

Page 17:

Common Guidelines for Group Activity

- Common templates in order to:

  - Facilitate exchanges between WP3 and WP4

  - Identify and classify key issues (What, Who, Where, When, How much)

- Description of the scientific and technical perimeter of the WG (*)

- Social benefits; societal, environmental and economic impact (*)

- Scientific and technical hurdles

- Address crosscutting issues: resilience, power management, programmability, performance and reproducibility of results, …

- European strengths and weaknesses in the worldwide competition

- Sources of competitiveness for Europe

- Needs in education and training

- Potential collaborations outside Europe

- Existing funded projects and funding agencies

- Timeline, human-resource needs, provisional costs, …

- Building one (or several) exascale prototypes in Europe? By when?

- Facilitate elaboration of intermediate and final reports

(*) For Application WGs, description in terms of Application Grand Challenges

Page 18:

Expert distribution

- 115 experts, excluding Chairs/Vice-Chairs

- 14.4 experts per group on average, without Chairs/Vice-Chairs

- Still possible to enrol up to 5 additional experts

- 14 countries, with good European coverage

- Participation of 2 US, 1 Israeli and 1 Lithuanian experts

EESI Workshop, November 9, 2010

Page 19:

EESI participants


Page 20:

WP4 Enabling Technologies for Exaflop Computing: Organisation

Enabling technologies for Exaflop computing (JSC):

- Assess novel hardware and software technologies for Exascale challenges

- Review the IESP roadmap and discuss it in a broader EU context

- Economic impact & European competitiveness

- Build a European vision and a roadmap

4 working groups:

- WG 4.1: Hardware roadmap, links with vendors
  H. Huber (Chair, STRATOS/LRZ) & R. Brunino (Vice-Chair, CINECA)

- WG 4.2: Software eco-systems
  F. Cappello (Chair, INRIA) & B. Mohr (Vice-Chair, JSC)

- WG 4.3: Numerical libraries, solvers, and algorithms
  I. Duff (Chair, STFC) & A. Grothey (Vice-Chair, Univ. Edinburgh)

- WG 4.4: Scientific software engineering
  M. Ashworth (Chair, STFC) & A. Jones (Vice-Chair, NAG)

Page 21:

WG 4.1 : Hardware roadmap, links with Vendors


Leif Nordlund AMD SE Business Development Manager

Jean-Pierre Panziera Bull FR Director of Performance Engineering

Wilfried Oed Cray DE HPC Engineer, EMEA

Luigi Brochard IBM FR Distinguished HPC Engineer

Andrey Semin Intel RUS HPC Technology Manager, EMEA

Frank Baetke HP DE Worldwide Director of High Performance Computing

Michael Kagan Mellanox IL CTO of Mellanox Technologies

Dev Tyagi Supermicro UK General Manager at Supermicro, UK

Giampietro Tecchiolli Eurotech IT CEO/CTO of Eurotech

Salvatore Rinaudo ST Microelectronics IT

Rüdiger Wolff SGI DE Principal Systems Engineer & Systems Engineer Manager, EMEA

Dmitry Tkachev T-Platforms RUS Chief Architect

Pierre Lagier Fujitsu FR Technical Director

Ulrich Brüning ExTOLL DE Computer Architecture Research Manager

Aad van der Steen NCF NL HPC architectures & benchmarking

Alex Ramirez BSC ES Computer Architecture Research Manager

Dominik Ulmer CSCS CH HPC Center

Jean Gonnord CEA FR Numerical Simulation & Computer Sciences

Page 22:

WG 4.1: Hardware roadmap, links with Vendors

Questions for the experts:

- What are the major challenges/issues in HPC in the next 5 to 10 years?

- What efforts will your institution or company undertake to address these challenges/issues?

  a) If applicable, please mention all R&D on promising future multi-Peta to Exascale hardware technologies, such as hybrid many-core CPUs, novel memory technologies (3D-stacked memory, memristor, Spin-Torque-Transfer Magnetic RAM, graphene-based memory, etc.), novel background storage technologies, photonic interconnects, highly cooling-efficient system packaging technologies, etc.

  b) If applicable, please mention all R&D on promising future multi-Peta to Exascale software technologies, such as highly scalable programming languages, fault-tolerant parallel programming environments, parallel file systems with end-to-end data integrity, highly scalable and energy-efficient system management software, hierarchical storage solutions, operating systems with coherent inter-node scheduling mechanisms, etc.

  c) If applicable, please mention all R&D on promising, highly scalable numerical algorithms for multi-Peta to Exascale applications

Page 23:

WG 4.2: Software eco-systems

Chairs:
- Franck Cappello, INRIA & UIUC, FR: resilience and fault tolerance
- Bernd Mohr, JSC, DE: performance tools

Members:
- Jesus Labarta, BSC, ES: programming models
- Mark Bull, EPCC, UK: OpenMP / PGAS
- François Bodin, CAPS, FR: programming and compilers for GPUs
- Raymond Namyst, LaBRI Bordeaux, FR: MPI & OpenMP runtimes, GPUs
- Jean-François Méhaut, INRIA Grenoble, FR: performance modeling, applications
- Matthias Müller, TU Dresden, DE: validation and correctness checking
- Felix Wolf, GRS, DE: performance tools
- David Lecomber, Allinea, UK: parallel debugger
- Simon McIntosh-Smith, U. Bristol, UK: computer architecture & fault tolerance
- Vladimir Voevodin, MSU, RU: performance tools
- Thomas Ludwig, DKRZ, DE: power management
- Olivier Richard, INRIA, FR: job and resource management
- Jacques C. Lafoucriere, CEA, FR: I/O, file systems
- Toni Cortés, BSC, ES: I/O, storage
- Pascale Rossé, Bull, FR: OS, system management
- Karl Solchenbach, Intel, BE: all topics

Page 24:

WG 4.2 (Software eco-systems): Topics

- System software:

  - Operating systems, system management

  - Job and resource managers

  - Runtime systems

  - I/O systems

- Development environments:

  - Programming models, compilers

  - Debuggers

  - Performance tools

  - Correctness tools

- Crosscutting dimensions:

  - Resilience

  - Power management

Page 25:

WG 4.3 Numerical Libraries, Solvers and Algorithms

Chair: Iain Duff (STFC); Vice-Chair: Andreas Grothey (EPCC)

Expert Name | Field of Expertise | Organisation | Country
Jack Dongarra | HPC, numerical linear algebra | Manchester/Tennessee | UK/USA
Mike Giles | GPU, CFD/finance | Oxford University | UK
Gerard Meurant | HPC, PDE solution | ex-CEA, Paris | FR
Volker Mehrmann | Linear algebra, HPC applications | TU Berlin/Matheon | DE
Torsten Koch | Combinatorial optimization | Berlin | DE
Peter Arbenz | Eigenvalues, HPC | ETH Zurich | CH
Bo Kågström | Dense linear algebra | Umeå | SE
Julius Žilinskas | Global optimization | Vilnius University | LT
Salvatore Filippone | HPC, numerical software | Tor Vergata, Rome | IT
Luc Giraud | Iterative and hybrid methods | INRIA Bordeaux | FR
Patrick Amestoy | Direct methods, solvers | ENSEEIHT-IRIT, Toulouse | FR
Karl Meerbergen | Preconditioners | KU Leuven | BE
François Pellegrini | Mesh, load balancing | Bordeaux University, INRIA lab LaBRI | FR

Page 26:

WG 4.4 Scientific Software Engineering

Chair: Mike Ashworth (STFC); Vice-Chair: Andrew Jones (NAG)

Expert Name | Field of Expertise | Organisation | Country
Sebastian von Alfthan | HPC algorithms and optimisation | CSC | FI
John Biddiscombe | Scientific visualisation | CSCS | CH
Ian Bush | HPC algorithms and optimisation | NAG | UK
Iris Christadler | HPC software frameworks | LRZ | DE
Rupert Ford | HPC algorithms and optimisation | University of Manchester | UK
Yvan Fournier | HPC algorithms and optimisation | EDF | FR
Joachim Hein | HPC algorithms and optimisation | Lund University | SE
Mohammed Jowkar | HPC algorithms and optimisation | BSC | ES
Peter Michielse | HPC algorithms and optimisation | NWO | NL
Stephen Pickles | HPC algorithms and optimisation | STFC | UK
Felix Schuermann | Blue Brain project | EPFL | CH
Stephane Ploix | Visualisation | EDF | FR
Christopher Greenough | Software engineering | STFC | UK
Ash Vadgama | Performance modeling | AWE | UK

Page 27:

WG 4.4 Scientific Software Engineering

- Objectives:

  - Identify issues for Exascale application development

  - Develop a roadmap for an Exascale application development process

- First results:

  - Five themes were proposed and accepted:

    - Application frameworks and workflows
    - Visualization and data management
    - Fault-tolerant algorithms
    - Application design
    - Software engineering

  - Each theme was investigated and extensively discussed

  - Cross-cutting issues were identified

Page 28:

WP3 Application Grand Challenges: Organisation

Application Grand Challenges (GENCI):

- Identify the application drivers for Peta- and Exascale

- Needs & expectations of scientific applications

- Economic impact & European competitiveness

- Build a European vision and a roadmap

4 working groups:

- WG 3.1: Industrial & Engineering Apps (Transport, Energy)
  P. Ricoux (Chair, Total) & JC. André (Vice-Chair, CERFACS)

- WG 3.2: Weather, Climatology and Earth Sciences
  G. Aloisio (Chair, ENES-CMCC) & M. Cocco (Vice-Chair, INGV)

- WG 3.3: Fundamental Sciences
  G. Sutmann (Chair, CECAM-JSC) & JP. Nominé (Vice-Chair, CEA)

- WG 3.4: Life Sciences and Health
  M. Orozco (Chair, BSC) & J. Thornton (Vice-Chair, EBI)

Page 29:

WG 3.1: Industrial & Engineering Apps (Transport, Energy)
Led by: P. Ricoux (Chair, Total) & JC. André (Vice-Chair, CERFACS)

- Aeronautics: full MDO, CFD-based noise simulation, real-time CFD-based in-flight simulation: the digital aircraft

- Structural analysis: design of new composite compounds, …

- Specialty chemistry: atom-to-continuum simulation for macro-parameter estimation in catalysts, surfactants, tribology, interfaces, nano-systems, …

- Energy: computations with smaller and smaller scales in larger and larger geometries: turbulent combustion in closed engines and open furnaces, explosion prediction, nuclear plants, hydraulics, …

- Oil and gas industries: full 3D inverse waveform problem, multi-scale reservoir simulation models, …

- Engineering (in general): multi-scale CFD, multi-fluid flows, multi-physics modelling, complex systems modelling, …

Page 30:

WG 3.1: Industrial & Engineering Apps (Transport, Energy)
Led by: P. Ricoux (Chair, Total) & JC. André (Vice-Chair, CERFACS)

Areas of expertise represented: geophysics and multiphase flows (oil and gas); combustion and CFD; flight physics (aeronautics); applied mathematics and physics; HPC and particle dynamics; CFD and hydraulics; propulsion and engine flows (automotive); neutronics (nuclear industry); surface transportation (trains); chemical engineering; computing; computer-aided engineering.

Members: Henri Calendra (TOTAL, FR), Keld Nielsen (ENI, IT), Thierry Poinsot (CERFACS, FR), Eric Chaput (EADS/Airbus, FR), Demetrios Papageorgiou (Imperial College, UK), Ulrich Rüde (U. Erlangen, DE), Jean-Daniel Mattei (EDF, FR), Christian Hasse (BMW, DE), Heinz Pitsch (Univ. Aachen, DE), Norbert Kroll (DLR, DE), Tanguy Courau (EDF, FR), Ali Tabbal (ALSTOM, FR), Chuan-yu Wu (U. Birmingham, UK), Mark Sawyer (Univ. Edinburgh, UK), Jacek Marczyk (Ontonix).

(Figure: Code_Saturne-THYC coupled simulation, 3-55 Hz.)

Page 31:

WG 3.1: Industrial & Engineering Apps (Transport, Energy)

Questions and issues to address:
- To reach goals efficiently: which effort, and where should the money be spent?
- The roadmap must be implemented across several industries.

Roadmaps differ depending on the particular industry. For some domains, Exaflop capacities are needed to solve industrial problems:

- Aeronautics: Exaflop is not the ultimate goal; at least Zettaflop is needed for LES of a full aircraft. Many "farming" applications (almost independent simulations); data mining and data assimilation; steady cases (RANS, URANS) with improved physical modeling.

- Seismics, O&G: largely embarrassingly parallel; the major problems are the programming model, memory access and data management; Zettaflop is needed for the full inverse problem.

- Engineering: optimization, Monte Carlo-type simulations, LES/RANS/DNS, … (the main problems are the equations themselves, physics and coupling).

For these domains, production problems will be solved by "farming" applications.
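The "farming" pattern named above (many almost-independent simulations) can be sketched in a few lines. `run_case` and its angle parameter are hypothetical stand-ins for a real solver and its inputs; in production each case would be a separate batch job or MPI run rather than a thread:

```python
from concurrent.futures import ThreadPoolExecutor

def run_case(angle_deg: float) -> float:
    """Stand-in for one independent simulation (e.g., one RANS run)."""
    return 2.0 * angle_deg  # hypothetical lift-like output metric

def farm(cases, workers: int = 4):
    """Run independent cases concurrently; the cases never communicate,
    which is why farming workloads scale almost perfectly."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_case, cases))

results = farm([0.0, 2.0, 4.0, 6.0])
```

The point of the sketch is the absence of inter-case communication: throughput grows with the number of workers until the machine, not the algorithm, is the limit.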

Page 32:

WG 3.1: Industrial & Engineering Apps (Transport, Energy)

For some domains, Exaflop capabilities are needed to solve industrial problems:

- Combustion: turbulent combustion modeling, LES in large-scale reactors, coupling of multi-physics; Exaflop for combustion at the right scale; multi-cycle engines (weak scalability)

- Nuclear energy: steady CFD calculations on complex geometries; RANS, LES and quasi-DNS type calculations under uncertainties; Monte Carlo ultimate neutronic transport calculation (time-dependent & multi-physics coupling)

- Other domains: multi-fluid flows, fluid-structure interactions, fluidized beds (particle simulations, stochastic PDEs, multi-scale approaches, physics, fluid in particles)

Page 33:

WG 3.1: Industrial & Engineering Apps (Transport, Energy)

Common main issues to be addressed (1/2)

- At the level of the simulation environment:

  - Unified simulation framework and associated services: CAD, mesh generation, data-setting tools, computational-scheme editing aids, visualization, etc.

  - Multi-physics simulation: establishment of standard coupling interfaces and software tools, mixing legacy and new-generation codes

  - Open-source, jointly developed mesh-generation tool; automatic and adaptive meshing

  - Standardized, efficient parallel I/O and data management (sorting memory for fast access, allocating new memory as needed in smaller chunks, treating parts of memory that are rarely or never needed based on heuristic algorithms, …)
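The "allocate new memory as needed in smaller chunks" idea above can be sketched as a streaming writer that never holds the full dataset in memory; the chunk size and raw binary format here are arbitrary illustrative choices, not anything prescribed by the roadmap:

```python
from array import array

def write_chunked(path: str, values, chunk_len: int = 4096) -> int:
    """Stream values to disk in fixed-size chunks so peak memory stays
    bounded at chunk_len items rather than the full dataset size.
    Returns the number of values written."""
    buf = array("d")  # 8-byte doubles
    written = 0
    with open(path, "wb") as f:
        for v in values:
            buf.append(v)
            if len(buf) == chunk_len:
                buf.tofile(f)
                written += len(buf)
                del buf[:]  # reuse the same small buffer
        if buf:  # flush the final partial chunk
            buf.tofile(f)
            written += len(buf)
    return written
```

Feeding it a generator of a million samples keeps peak memory at `chunk_len` doubles regardless of the dataset size.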

Page 34:

WG 3.1: Industrial & Engineering Apps (Transport, Energy)

Common main issues to be addressed (2/2)

- At the level of codes/applications:

  - New numerical methods, algorithms, solvers/libraries; improved efficiency

  - Coupling between stochastic and deterministic methods

  - Numerical schemes involving stochastic HPC computing for uncertainty and risk quantification

  - Meshless methods and particle simulation

  - Scalable programs: strong and weak scalability, load balancing, fault-tolerance techniques, multi-level parallelism; issues identified with multicore chips with reduced memory bandwidth per core, collective communications, efficient parallel I/O

  - Standard programming models (MPI, OpenMP, C++, Fortran, …) handling multi-level parallelism and heterogeneous architectures

- Human resources, training (at what level?)
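The strong vs. weak scalability distinction mentioned above is commonly summarized by Amdahl's and Gustafson's laws. A minimal sketch; the 5% serial fraction is an illustrative assumption:

```python
def amdahl_speedup(serial_fraction: float, n_procs: int) -> float:
    """Strong scaling: fixed problem size; the serial fraction caps speedup."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_procs)

def gustafson_speedup(serial_fraction: float, n_procs: int) -> float:
    """Weak scaling: the problem grows with resources; speedup stays near-linear."""
    return n_procs - serial_fraction * (n_procs - 1)

# With a 5% serial fraction, strong scaling saturates near 1/0.05 = 20x,
# while weak scaling on 1,000,000 cores still predicts roughly 950,000x.
strong = amdahl_speedup(0.05, 1_000_000)
weak = gustafson_speedup(0.05, 1_000_000)
```

This is why the slide distinguishes the two regimes: codes that can grow their problem (e.g., multi-cycle engines) tolerate exascale core counts far better than fixed-size production runs.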

Page 35:

WG 3.2: Weather, Climatology and Earth Sciences
Led by: G. Aloisio (Chair, ENES-CMCC) & M. Cocco (Vice-Chair, INGV)

The big challenge is to model this complex system.

Warren M. Washington, NCAR
Scientific Grand Challenges Workshop Series: Challenges in Climate Change Science and the Role of Computing at the Extreme Scale, DOE Workshop (ASCR-BER), November 6-7, 2008

- Several complex processes to be simulated

- Several interacting processes

- Great range of time scales to be analyzed

- Great range of spatial scales to be considered

- Need for interdisciplinary sciences (physics, chemistry, biology, geology, …)

- Inherently non-linear governing equations

- Need for sophisticated numerics

- Need for huge computational resources

Page 36:

WG 3.2: Weather, Climatology and Earth Sciences
Led by: G. Aloisio (Chair, ENES-CMCC) & M. Cocco (Vice-Chair, INGV)

Organisation | Country | Area of Expertise
ENES-CMCC | IT | Exascale computing (Chair)
INGV | IT | Seismology (Vice-Chair)
CMCC | IT | Oceanography
IPSL | FR | Earth-system modelling
CERFACS | FR | Coupled climate models
IPG | FR | Solid Earth, seismology
BADC | UK | Climate data management
Manchester Univ. | UK | High-performance computing
ECMWF | UK | Data assimilation
MPI | DE | Earth-system modelling
DKRZ | DE | High-performance computing
LMU | DE | Solid Earth, geophysics
SMHI | SE | Climate-change modelling
FMI | FI | Meteorology
BSC | ES | Earth sciences

Country distribution: DE 3, FR 3, IT 3, UK 3, SE/FI/ES 3.

Page 37:

WG 3.2: Weather, Climatology and Earth Sciences
Led by: G. Aloisio (Chair, ENES-CMCC) & M. Cocco (Vice-Chair, INGV)

The WCES (Weather, Climate & Earth Sciences) main goals:

- High-resolution climate and Earth-system models, to better simulate the interactions and feedbacks among the complex physical and biological processes controlling natural phenomena

- Innovative approaches for modelling solid-Earth processes, to better understand the physical processes controlling earthquakes, volcanic eruptions and tsunamis, as well as those driving tectonics and Earth-surface dynamics

- Unified next-generation atmospheric simulation systems for weather and climate scientists, allowing a strong reduction in climate-prediction uncertainty

Several complex components have to be considered: atmosphere; ocean; land, soils, permafrost and vegetation cover (specified and interactive); sea ice; land ice; carbon and other biogeochemical cycles; clouds and related microphysics; hydrology; atmospheric chemistry; aerosols; ice sheets; human systems.

Page 38:

WG 3.2: Weather, Climatology and Earth Sciences

ENES foresight document (working draft): evolution diagram with global/regional models and resolution.

Ensemble runs are needed for quantifying uncertainty.
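The need for ensemble runs stated above comes from sensitivity to initial conditions: tiny perturbations diverge, so a single run says nothing about uncertainty. The toy logistic-map "model" below is purely illustrative, not a real climate code; the perturbation size and member count are arbitrary:

```python
def forecast(x0: float, r: float = 3.9, steps: int = 40) -> float:
    """Toy chaotic 'model': iterate the logistic map from state x0."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

def ensemble_spread(x0: float, n_members: int = 20, eps: float = 1e-7):
    """Run an ensemble of slightly perturbed initial states and return
    the ensemble mean and standard deviation as an uncertainty estimate."""
    members = [forecast(x0 + k * eps) for k in range(n_members)]
    mean = sum(members) / n_members
    std = (sum((m - mean) ** 2 for m in members) / n_members) ** 0.5
    return mean, std

mean, std = ensemble_spread(0.3)
```

A large spread flags a forecast that should not be trusted point-wise; this is exactly why ensemble capacity, not just single-run speed, drives the WCES computing requirement.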

Page 39:

Critical Exascale Software for WCES

Page 40

Multiscale

WG 3.3: Fundamental Sciences

40

[Diagram: domains arranged along time and space scales, from particles, plasmas, atoms and molecules up to large structures; the domains shown are sQCD, fusion for energy, astrophysics, nanosciences, laser/plasma interaction, chemistry/catalysis, soft matter/polymers, and quantum chemistry/electronic structure.]

Led by: G. Sutmann (Chair, CECAM-JSC) & J.P. Nominé (Vice Chair, CEA)

Fundamental sciences: Physics, Chemistry, Material Sciences, Astrophysics

Page 41

Name | Organization | Country | Area of Expertise
Volker Springel | MPI Astrophysik, Garching | Ger | Astrophysics
Romain Teyssier | ETH Zürich | Sui | Astrophysics
Maurizio Ottaviani | CEA | Fra | Fusion
Luis Silva | Universidade Tecnica de Lisboa | Por | Laser Plasma Interaction
Alexei Removich Kokhlov | Moscow State University | Rus | Soft Matter
Alessandro Curioni | IBM Research - Zurich | Sui | Materials Sciences
Gilles Zerah | CECAM - CEA | Fra | Materials Sciences
Nicola Marzari | University of Oxford | UK | Materials Sciences
Adrian Wander | STFC Daresbury | UK | Materials Sciences
Mike Payne | University of Cambridge | UK | Quantum Chemistry
Thierry Deutsch | CEA | Fra | Quantum Chemistry
Mike Ashworth | STFC Daresbury | UK | Methods and algorithms
Thomas Schultess | CSCS | Sui | Materials Sciences
Pieter in t'Veld | BASF | Ger | Soft matter

41

Ger: 2 Sui: 3 Fra: 3 Por: 1 Rus: 1 UK: 4 Total: 14

WG 3.3: Fundamental Sciences

Page 42

• Objectives (1): identify exascale-challenging problems such as:

• Quantum chemistry: electronic structure calculations for large systems, excited states

• Materials Science: ceramics, semi-conductors, design of nano-particles, catalysis

• Soft matter physics: simulation of long polymers, self-organisation

• Astrophysics: cosmology, high-resolution solar physics

• Plasma physics: fusion and laser-plasma interactions

• Particle physics

What is the impact, in terms of fundamental understanding and industrial relevance?

WG 3.3: Fundamental Sciences

42

Page 43

Science Drivers

43

• Materials Science: first-principles design of materials

e.g. energy conversion:
• chemical energy to electricity (fuel cells)
• sunlight to electricity (photovoltaic cells)
• nuclear energy to electricity (nuclear fission or fusion)

e.g. energy storage:
• batteries
• supercapacitors
• chemical storage in high-energy-density fuels (hydrogen, ethanol, methanol)

e.g. energy use:
• solid state lighting, smart windows, low-power computing, lightweight materials for transport

WG 3.3: Fundamental Sciences

Page 44

Summary

44

� Software issues for Fundamental Sciences

• Fault-tolerant and energy-efficient algorithms

• Software support to measure or estimate energy

• Data locality

• Optimal-order algorithms

• Mesh generation

• Algorithms with low communication

• Standard interfaces for multiscale simulations

• Support for several executables in one job

• Parallel I/O

• Load-balancing
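The load-balancing issue in the list above can be sketched with a simple, generic heuristic (a hypothetical illustration, not any specific EESI code): the greedy longest-processing-time rule, which assigns the heaviest remaining task to the currently least-loaded worker.

```python
# Hypothetical sketch of static load balancing: greedy
# longest-processing-time (LPT) partitioning of unequal task costs
# across a fixed number of workers. Task costs are invented.
import heapq

def lpt_partition(costs, n_workers):
    """Assign task costs to workers, heaviest task first, always to the
    currently least-loaded worker; returns {worker: [costs]}."""
    loads = [(0.0, w) for w in range(n_workers)]  # (current load, worker id)
    heapq.heapify(loads)
    assignment = {w: [] for w in range(n_workers)}
    for cost in sorted(costs, reverse=True):
        load, w = heapq.heappop(loads)      # least-loaded worker
        assignment[w].append(cost)
        heapq.heappush(loads, (load + cost, w))
    return assignment

work = lpt_partition([7, 5, 4, 3, 3, 2], n_workers=3)
print({w: sum(tasks) for w, tasks in work.items()})
# → {0: 9, 1: 8, 2: 7}
```

At exascale the real problem is dynamic and distributed, but the same objective (minimising the maximum per-worker load) underlies production balancers.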

WG 3.3: Fundamental Sciences

Page 45

Led by: M. Orozco (Chair, BSC) & J. Thornton (Vice Chair, EBI)

Simulation scenarios in Life Sciences:

• Molecular simulations

• Structural prediction

• Docking

• Atomistic simulation

• Cell-scale mesoscopic simulations

• Gene inter-relations

• Cell simulation

• Organ simulation

• Ecosystem simulation

• Drug design
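The atomistic-simulation item above rests on a small integration kernel. A minimal sketch (assumed for illustration, not taken from the talk) is velocity-Verlet time stepping, here applied to a 1-D harmonic oscillator with unit mass and spring constant:

```python
# Minimal velocity-Verlet sketch: the time-stepping core of molecular
# dynamics, shown on a 1-D harmonic oscillator (illustrative only).
def force(x):
    return -x  # harmonic restoring force, spring constant k = 1

def velocity_verlet(x, v, dt, n_steps):
    f = force(x)
    for _ in range(n_steps):
        v += 0.5 * dt * f          # first half-kick
        x += dt * v                # drift
        f = force(x)               # recompute force at new position
        v += 0.5 * dt * f          # second half-kick
    return x, v

# The integrator is symplectic: total energy stays near its initial
# value (0.5 here) over many steps instead of drifting.
x, v = velocity_verlet(1.0, 0.0, dt=0.01, n_steps=10_000)
energy = 0.5 * v * v + 0.5 * x * x
print(f"x={x:.3f} v={v:.3f} E={energy:.4f}")
```

Real MD codes evaluate millions of pairwise forces per step, which is why the slides stress parallelism, data volume, and memory per core rather than the integrator itself.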

J.M.Cela

WG 3.4 : Life Sciences and Health

Page 46

46

Helmut Grubmuller | Max Planck Institute
Paolo Carloni | German Research School
Wolfgang Wenzel | Karlsruhe Institute of Technology
Reinhard Schneider | EMBL
Janet Thornton / Andrew Lyall | EMBL-EBI
Charles Laughton | University of Nottingham
Modesto Orozco | BSC
Alfonso Valencia | CNIO
Richard Lavery | University of Nottingham
Nicolas Baurin | Sanofi-Aventis
Erik Lindahl | Stockholm University
Henry Markram / Felix Schuermann | EPFL
Manuel Peitsch | Swiss Institute of Bioinformatics
Anna Tramontano | University of Roma
Julian Tirado-Rives (US) | Yale University
Adrian E. Roitberg (US) | University of Florida

WG 3.4 : Life Sciences and Health

Page 47

Examples of MD simulations of biological systems:

• 1000 times bigger: realistic cell membrane models with drug permeation and binding => large amounts of data

• 1000 times more computationally complex: simulations of biomolecules based purely on quantum mechanical principles, possible at exascale but needing high memory per core

• Timescales 1000 times longer: studies of the dynamics of protein folding and molecular motors that take up to seconds of real time; Anton (a special-purpose architecture) is capable of running 100× longer (3~5 years ahead)

WG 3.4 : Life Sciences and Health

Page 48

Felix Schürmann, Henry Markram (Blue Brain Project, EPFL)

Medical simulation: Human Brain Project

WG 3.4 : Life Sciences and Health

Page 49

Summary

• Molecular Dynamics: data expected to grow 10~100 times, competitiveness vs ASICs, exascale capability for complex systems and biased-sampling techniques

• Genomics: data management, high I/O requirements, constant application development, training

• Systems Biology: multi-scale simulation, visualization tools, hardware flexibility for new software development, data handling

• Medical Simulation: data management, real-time monitoring, Exabyte memory, data transfer, multi-scale

WG 3.4 : Life Sciences and Health

Page 50

50

European Exascale Software Initiative AGENDA

T0 (June 1, 2010), 1 month: constitution of the working groups, setup of guidelines and organisation modes; initial cartography of existing HPC projects and initiatives in Europe, US and Asia.

October 2010: Initial International workshop.

Working group investigations (two tracks):
• Application Grand Challenges: Weather, Climatology and Earth Sciences; Industrial and Engineering Applications (Transport, Energy); Life science, Health, BPM; Fundamental Sciences (Chemistry, Physics).
• Enabling technologies for Exaflop computing: Scientific software engineering; Software eco-systems; Hardware roadmap, links with vendors; Numerical libraries, solvers and algorithms.

T0+12 (June 2011), 1 month: internal workshop, presentation of each working group's results and roadmaps; updated cartography of existing HPC projects and initiatives in Europe, US and Asia.

T0+13 to T0+16, 3 months: synthesis of all contributions and production of a set of recommendations.

T0+16 to T0+17: presentation of EESI results to the EC.

T0+17 to T0+18 (November 2011): final conference, public presentation of project results.

Page 51

EESI Final Conference, www.eesi-project.eu

51

EESI Final International Conference
10 – 11 October 2011, World Trade Center, Barcelona (Spain)

Objectives: The main objective of the EESI Final International Conference is to present the project results to stakeholders, scientists and policy makers, with presentations and discussions involving all experts who took part in the WG activities.

Policy makers, scientists, and participants of the EESI project are invited to attend this event.