Virtualization for HPC at NCI


DESCRIPTION

In this presentation from the Dell booth at SC13, Joseph Antony from NCI describes how they are using HPC Virtualization to meet user needs. Watch the video presentation: http://insidehpc.com/2013/12/05/panel-discussion-thought-hpc-virtualization-never-going-happen/

TRANSCRIPT

HPC + Virtualization?

Joseph Antony (joseph.antony@anu.edu.au)

National Computational Infrastructure (NCI)

Disclaimer: Views expressed are entirely mine.

WHAT IS NCI?

NCI – an overview

Mission:
• To foster ambitious and aspirational research objectives and to enable their realisation, in the Australian context, through world-class, high-end computing services

NCI is:
• being driven by research objectives
• a comprehensive, vertically-integrated research service
• providing national access on priority and merit, and
• being built on, and sustained by, a collaboration of national organisations and research-intensive universities

[Slide diagram: the NCI service stack, driven by Research Objectives and leading to Research Outcomes – Communities and Institutions / Access and Services; Expertise, Support and Development; Digital Laboratories; Data Centric Services; Compute (HPC/Cloud) and Data Infrastructure]

Climate Science has a solution

[Slide diagram: Raijin, the NCI cloud, and NCI + CoE technical support – integrated, intimately connected, robust, accessible – leading to impact]

ACCESS:
• A collaborative tool
• Under svn
• Co-support
• CoE/BoM/CSIRO PhDs
• Shared research(ers)

CMIP-5:
• A collaborative data set
• Co-supported
• Shared analyses
• CoE/BoM/CSIRO PhDs
• Shared research(ers)

In case you're wondering where we are located …

• In the nation's capital, at its national university …

HPC Virtualization?

• HPC procurements typically involve major CAPEX spend on Big Iron for attacking grand challenge problems

• Typically most large HPC centers have:
– Capability machines: land in the TOP500 around #10 to #20; these have special-purpose architectures and accelerators
– Workhorse machines: usually x86 + InfiniBand (IB)


HPC Virtualization? (2)

From ‘CERN Data Center Evolution’

Looming Iceberg …

Dribble.com – “Mr. Iceberg meets the Titanic”

Changing Environment for HPC Centers

[Slide diagram: the ship 'HMAS HPC' heading toward the Big Data iceberg – file system storage above the waterline; below it, the use/re-use needs: metadata, a priori analytics, deep storage retrieval, long-lived artifacts, multiple communities, publishing, data replication]

From http://www.exascale.org/

Is there life beyond a batch-oriented system?

• HPC centers will be forced to evolve beyond batch-oriented systems due to an immovable iceberg – 'Big Data'

• The NCI moved to virtualization in 1999 to handle non-traditional workloads due to complex data lifecycles:
– Satellite Image Processing
– CMIP5 Climate Data Processing
– Genomics Assembly Pipelines
– N-to-N Cancer Genomics Comparisons
– Interactive Volume Rendering
– Trawling YouTube and Analyzing Birthday Videos

Engaging with Priority Research: Environment.

Goal:
• To provide a single high-performance computing environment for environment research

Partners:
• CSIRO, GA, Bureau, Research Community
• Lockheed-Martin, GA, NCI, VPAC

Requirements:
• Provide a national processing environment for key satellite data (e.g. SEADAS)
• Provide a collaborative environment for tools that produce reference Digital Elevation Maps
• Provide a data environment for a fast, easy national nested grid

Engaging with Priority Research: Environment.

Data Intensive Activity:
• Data-processing-intensive pipelines (SEADAS) over large volumes of raw imagery

Key initial datasets:
• LANDSAT archive
• MODIS
• DEMs (9s, 3s, 1s)
• LIDAR
• Derivative products

Data-intensive query and analysis environment, e.g. Hadoop over nested grids.
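The slide names Hadoop over nested grids without further detail. As a rough illustration only, here is a minimal Hadoop Streaming sketch in Python that computes a per-tile mean over gridded records; the tab-separated tile-ID/value record layout, the file paths and the script name are assumptions for the example, not NCI's actual pipeline.

#!/usr/bin/env python3
# tile_stats.py - minimal Hadoop Streaming sketch: per-tile mean over nested-grid records.
# Assumed input, one observation per line, tab-separated (illustrative, not NCI's schema):
#     <nested-grid tile id> <TAB> <pixel value>
# Illustrative invocation (jar and HDFS paths are placeholders):
#     hadoop jar hadoop-streaming.jar \
#         -input /data/landsat_tiles -output /data/tile_means \
#         -mapper "tile_stats.py map" -reducer "tile_stats.py reduce" -file tile_stats.py
import sys

def mapper():
    # Emit (tile_id, value) pairs; skip malformed lines.
    for line in sys.stdin:
        parts = line.rstrip("\n").split("\t")
        if len(parts) < 2:
            continue
        tile_id, value = parts[0], parts[1]
        try:
            float(value)
        except ValueError:
            continue
        print(tile_id + "\t" + value)

def reducer():
    # Hadoop sorts mapper output by key, so all values for one tile arrive contiguously.
    current_tile, total, count = None, 0.0, 0
    for line in sys.stdin:
        tile_id, value = line.rstrip("\n").split("\t", 1)
        if current_tile is not None and tile_id != current_tile:
            print(current_tile + "\t" + str(total / count))
            total, count = 0.0, 0
        current_tile = tile_id
        total += float(value)
        count += 1
    if current_tile is not None:
        print(current_tile + "\t" + str(total / count))

if __name__ == "__main__":
    mode = sys.argv[1] if len(sys.argv) > 1 else "map"
    mapper() if mode == "map" else reducer()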

Engaging with Priority Research: Environment.

• Collaboration to provide better and common processing environments
• Next generation of tools (able to operate at national scale)
• New aggregation of tools and techniques under the TERN e-MAST project

Engaging with Priority Research: Climate (cont.)

Research Highlights: Life Sciences

Research Highlights: Materials/Nanotechnology

Research Highlights: Physical and Chemical Sciences

Research Highlights: Earth Sciences

From http://www.exascale.org

NCI's Cloud Node Architecture from Dell

NCI's Science Cloud Building Blocks from Dell

• From Dell, using C8000 chassis building blocks for OpenStack compute, Swift (S3-style object storage) and Ceph (EBS-style block storage) – a usage sketch follows after this list

• Hyperscale meets Exascale … (?)
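Because Swift is exposed through an S3-compatible interface, standard object-storage tooling works against it unchanged. A minimal sketch using boto3 as the S3 client; the endpoint URL, credentials, bucket and object names are placeholders, not NCI's actual service details.

# Minimal sketch: storing and retrieving an object on an S3-compatible Swift endpoint.
# Endpoint URL, credentials, bucket and key below are placeholders, not NCI's service.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://swift.example.org",     # hypothetical S3-compatible Swift endpoint
    aws_access_key_id="EC2_STYLE_ACCESS_KEY",     # placeholder credentials
    aws_secret_access_key="EC2_STYLE_SECRET_KEY",
)

# Create a container (bucket) and upload a result file produced on Raijin or a cloud VM.
s3.create_bucket(Bucket="climate-runs")
s3.upload_file("cmip5_summary.nc", "climate-runs", "cmip5/summary.nc")

# Any S3-aware client (boto3, s3cmd, rclone, ...) can read the object back the same way.
obj = s3.get_object(Bucket="climate-runs", Key="cmip5/summary.nc")
print(obj["ContentLength"], "bytes stored")

The same pattern would apply to Ceph when it is fronted by an S3-compatible gateway; EBS-style block volumes are instead attached to instances through the OpenStack volume service.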

Summary  

• HPC in the Cloud
– Clusters-in-the-cloud
– Offload, bursting

• Big Data needs
– Complex, long-lived data processing
– Community ecosystems

• Service provider abstraction

END  
