High Energy and Nuclear Physics Collaborations and Links Stu Loken Berkeley Lab HENP Field Representative


Page 1

High Energy and Nuclear Physics Collaborations and Links

Stu Loken

Berkeley Lab

HENP Field Representative


Primary Goal:

A full understanding, achieved through numerical simulation, of how supernovae of all types explode and how the elements have been created in nature. Comparison of these results with astronomical observations and the abundance pattern seen in the sun.

SciDAC Supernova Science Center

UCSC LANL LLNL

U. Arizona

Page 11

Major Challenges

• For gravitational collapse supernovae – Three-dimensional hydrodynamics coupled to radiation transport including regions that are optically gray

• For thermonuclear supernovae – Turbulent combustion at low Prandtl number and very high Rayleigh number

• For nucleosynthesis – Standardized nuclear data in a machine-friendly format

• Computational – Optimization of codes on parallel computers; manipulation and visualization of large data sets; development of novel approaches to radiation transport and hydrodynamics on a grid
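To make the dimensionless numbers in the thermonuclear-supernova bullet concrete, here is a minimal sketch using the standard definitions of the Prandtl and Rayleigh numbers. Every physical value below is an assumed placeholder chosen only to illustrate the regime, not a number from the talk:

```python
# Back-of-envelope evaluation of the dimensionless numbers that set the
# turbulent-combustion regime. Definitions are standard; all physical
# values below are assumed placeholders, not from the slides.

def prandtl(nu, kappa):
    """Pr = nu / kappa: momentum diffusivity over thermal diffusivity."""
    return nu / kappa

def rayleigh(g, beta, delta_t, length, nu, kappa):
    """Ra = g * beta * dT * L^3 / (nu * kappa): buoyancy vs. diffusion."""
    return g * beta * delta_t * length**3 / (nu * kappa)

# Assumed white-dwarf-flame-like values: heat (electron conduction)
# diffuses vastly faster than momentum, so Pr << 1, while the large
# length scale drives Ra to enormous values.
nu = 1e-2        # kinematic viscosity, cm^2/s   (assumed)
kappa = 1e5      # thermal diffusivity, cm^2/s   (assumed)
g = 1e9          # gravity, cm/s^2               (assumed)
beta = 1e-9      # thermal expansion, 1/K        (assumed)
delta_t = 1e9    # temperature contrast, K       (assumed)
length = 1e7     # convective scale, cm          (assumed)

print(f"Pr = {prandtl(nu, kappa):.1e}")
print(f"Ra = {rayleigh(g, beta, delta_t, length, nu, kappa):.1e}")
```

With these placeholders Pr comes out far below one and Ra far above any laboratory value, which is what makes the combustion problem hard.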

Page 12

First three-dimensional calculation of a core-collapse supernova.

This figure shows the iso-velocity contours (1000 km/s) 60 ms after core bounce in a collapsing massive star. Calculated by Fryer and Warren at LANL using SPH (300,000 particles).

Resolution is poor and the neutrinos were treated artificially (trapped or freely streaming, no gray region), but such calculations will be used to guide our further code development. The box is 1000 km across.
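As a rough illustration of the SPH (smoothed particle hydrodynamics) method named above, here is a toy 1D density estimate with the standard cubic-spline kernel. This is a pedagogical sketch, not the Fryer–Warren code:

```python
import numpy as np

# Toy 1D SPH density estimate with the standard cubic-spline kernel,
# to illustrate the particle method used in the calculation above.

def cubic_spline_w(q, h):
    """1D cubic-spline kernel W(q) with support 2h; q = |dx| / h."""
    sigma = 2.0 / (3.0 * h)          # 1D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def density(x, m, h):
    """SPH density: rho_i = sum_j m_j W(|x_i - x_j| / h)."""
    rho = np.zeros(len(x))
    for i in range(len(x)):
        for j in range(len(x)):
            rho[i] += m[j] * cubic_spline_w(abs(x[i] - x[j]) / h, h)
    return rho

# 50 equal-mass particles on the unit interval: the interior density
# should come out nearly uniform (total mass 1 over unit length).
x = np.linspace(0.0, 1.0, 50)
m = np.full(50, 1.0 / 50)
rho = density(x, m, h=0.05)
```

A production code replaces the double loop with a neighbor search, which is why particle counts like 300,000 are feasible.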

Page 13

Nucleosynthesis in a 25 solar mass supernova compared with abundances in the sun.

Abundances for over 2000 isotopes of the elements from hydrogen through polonium were followed in each of approximately 1000 zones throughout the life of the star and its simulated explosion as a supernova. Our library will eventually include 300 such models.

A production factor of 20 for every isotope would mean that every isotope in the sun could be created if 1 gram in 20 of the primordial sun had been ejected by 25 solar mass supernovae like this one. Actually making the solar abundances requires many stars of different masses.
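The production-factor arithmetic above can be made explicit. The solar mass fraction used below is a made-up illustrative number, not data from the models:

```python
# Production-factor arithmetic: P = X_ejecta / X_solar for an isotope.
# If P = 20 for every isotope, diluting 1 part ejecta into 19 parts of
# gas carrying none of that isotope reproduces the solar abundance.
# x_solar below is a hypothetical illustrative value.

x_solar = 1.0e-3                      # hypothetical solar mass fraction
production_factor = 20.0
x_ejecta = production_factor * x_solar

# 1 gram in 20 of the primordial sun is supernova ejecta; the other
# 19 grams contribute nothing of this isotope:
x_mixed = x_ejecta * (1.0 / 20.0) + 0.0 * (19.0 / 20.0)
print(f"mixed abundance: {x_mixed}, solar: {x_solar}")
```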

Page 14

General Approach

• Develop an ensemble of appropriate codes – We are exploring several approaches to hydrodynamics, at both high and low Mach numbers. Monte Carlo transport will serve to test and validate other approaches

• Obtain short term results for guidance – Useful results can be obtained in 3D using a Lagrangian particle based code (SPH) and rudimentary neutrino transport. These can guide future work.

• Work with in-house experts in computer science – There are experts in radiation hydrodynamics (e.g., Monte Carlo) at LANL. The University of Arizona Center for Integrative Modeling and Simulation and the High Performance Distributed Computing Laboratory will help with code development and visualization.

• Work with other SciDAC centers
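The first bullet's point that Monte Carlo transport can validate other approaches can be sketched with a toy problem: estimate transmission through a purely absorbing slab of optical depth tau and compare with the exact answer exp(-tau). This is an illustrative example, not one of the center's codes:

```python
import math
import random

# Toy Monte Carlo transport as a validation tool: packets traverse a
# purely absorbing slab of optical depth tau; the exact transmitted
# fraction is exp(-tau), so the estimate can be checked directly.

def mc_transmission(tau, n_packets, seed=12345):
    rng = random.Random(seed)
    escaped = 0
    for _ in range(n_packets):
        # Optical depth traveled before absorption: s = -ln(xi),
        # with xi drawn from (0, 1] to avoid log(0).
        s = -math.log(1.0 - rng.random())
        if s > tau:
            escaped += 1
    return escaped / n_packets

tau = 2.0
estimate = mc_transmission(tau, 200_000)
exact = math.exp(-tau)
print(f"Monte Carlo: {estimate:.4f}, exact: {exact:.4f}")
```

The same idea scales to scattering and frequency-dependent opacities, where no closed-form answer exists but Monte Carlo remains an unbiased reference.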

Page 15

Other Related SciDAC Activities

• Terascale Supernova Initiative – We will make our presupernova models and nuclear data libraries publicly available. We plan joint meetings with the TSI teams.

• High Performance Data Grid Toolkit and DOE Science Grid – These three national co-laboratories and networking centers, working together with computer scientists on our team at Arizona, can help us optimize our codes for large-scale, distributed computing environments (Grid computing). The PIs of these centers have ongoing interactions with our team members at the HPDC laboratory in Arizona.

• Algorithmic and Software Framework for Applied Partial Differential Equations Center – Phil Colella is a co-author of one of our main codes (PPM). We are interested in learning new techniques, especially for following low Mach number flows.

Page 16

Other Related SciDAC Activities - continued

• Terascale High Fidelity Simulations of Turbulent Combustion with Detailed Chemistry – Type Ia (thermonuclear) supernovae are prime examples of turbulent combustion. We have already worked with experts at the Livermore Sandia Combustion Center for years.

• Scalable Systems Software Center and the Center for Component Technology for Terascale Simulation Software – We will collaborate with experts at these two centers to make our codes scalable, component-based, and run efficiently in Grid computing environments.

• Particle Physics Data Grid (PPDG) – Grid-enabled tools for data-intensive requirements. Also possible common interest in Monte Carlo and other particle transport schemes. Discussions have begun with Richard Mount.

Page 17

http://www.phy.ornl.gov/tsi/

TeraScale Supernova Initiative

Page 18

Investigator Team

Radiation Transport/Radiation Hydrodynamics

Blondin (NC State) Bruenn (FAU) Hayes (UCSD) Mezzacappa (ORNL) Swesty (SUNYSB)

Nuclear Structure Computations for EOS and Neutrino-Nucleus/Nucleon Interactions

Dean (ORNL, UT) Fuller (UCSD) Haxton (INT, Washington) Lattimer (SUNYSB) Prakash (SUNYSB) Strayer (ORNL, UT)

Linear System/Eigenvalue Problem Solution Algorithms for Radiation Transport and Nuclear Structure Computation

Dongarra (UT, ORNL) Saied (UIUC, NCSA) Saylor (UIUC, NCSA)

Visualization

Baker (NCSA) Toedte (ORNL)

Cross-Cutting Team – Long-Term Collaborations, Structured like SciDAC

Supernova Science

Blondin Bruenn Fuller Haxton Hayes Lattimer Meyer (Clemson) Mezzacappa Swesty

TOPS  SDM  CCA  PERC  TSTT

Page 19

ADVANCED COMPUTING FOR 21ST CENTURY ACCELERATOR SCIENCE & TECHNOLOGY

Accelerators are Crucial to Scientific Discoveries in High Energy Physics, Nuclear Physics, Materials Science, and Biological Science

Page 20

ASE Partnerships in Applied Math & Comp. Sci.

The success of the ASE will require close collaboration with applied mathematicians and computer scientists to enable the development of high-performance software components for terascale platforms.

Collaborating SciDAC Integrated Software Infrastructure Centers:

• TOPS – Eigensolvers, Linear Solvers
• TSTT – Mesh Generation & Adaptive Refinement
• CCTTSS – Code Components & Interoperability
• APDEC – Parallel Solvers on Adaptive Grids
• PERC – Load Balancing & Communication

Collaborating National Labs & Universities:

• NERSC – Eigensolvers, Linear Solvers
• Stanford – Linear Algebra, Numerical Algorithms
• UCD – Parallel Visualization, Multi-resolution Techniques
• LANL – Statistical Methods for Computer Model Evaluation

Page 21

• FNAL, BNL – High Intensity Beams in Circular Machines
• UCLA, USC, Tech-X – Plasma Based Accelerator Modeling
• UC Davis – Particle & Mesh Visualization
• Stanford, NERSC – Parallel Linear Solvers & Eigensolvers
• LBNL – Parallel Beam Dynamics Simulation
• SLAC – Large-Scale Electromagnetic Modeling
• SNL – Mesh Generation & Refinement
• U. Maryland – Lie Methods in Accelerator Physics
• LANL – High Intensity Linacs, Computer Model Evaluation

M = e^{:f2:} e^{:f3:} e^{:f4:} …,   N = A^{-1} M A
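The map expression in the U. Maryland entry is the standard Dragt–Finn Lie-algebraic factorization from accelerator physics. Reconstructed in conventional notation (the subscripts on f and the role of the operator A are inferred from that formalism, not spelled out on the slide):

```latex
% Dragt--Finn factorization of the one-turn map M and its normal form N.
\[
  \mathcal{M} \;=\; e^{:f_2:}\, e^{:f_3:}\, e^{:f_4:} \cdots, \qquad
  \mathcal{N} \;=\; \mathcal{A}^{-1}\, \mathcal{M}\, \mathcal{A},
\]
```

Here $:f:$ denotes the Lie operator defined by $:f:g = [f, g]$ (the Poisson bracket), each $f_n$ is a homogeneous polynomial of degree $n$ in the phase-space variables, and $\mathcal{A}$ is the transformation that brings $\mathcal{M}$ to normal form.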

Page 22

Collaborations and Links

ISIC link strengths (columns: Acc. Sim, QCD, TSI, SN Res.):
• TOPS – ** ? ** *
• CCTTSS – ** ** *
• TSTT – ** ** *
• APDEC – ** * **
• SSS – *
• PERC – ** ** ** *
• SDMC – * **

Page 23

More Possible Links

Link strengths (columns: Acc. Sim, QCD, TSI, SN Sci.):
• PPDG – ** ** *
• SAM – *
• Sci Grid – *
• Networks – **
• Combustion – * *
• Viz – * * *