Introduction to the HPCC Dirk Colbry Research Specialist Institute for Cyber Enabled Research

Upload: lindsay-mccarthy

Post on 12-Jan-2016

TRANSCRIPT

Page 1: Introduction to the HPCC Dirk Colbry Research Specialist Institute for Cyber Enabled Research

Introduction to the HPCC

Dirk Colbry, Research Specialist

Institute for Cyber Enabled Research

Page 2

HPCC Online Resources

www.hpcc.msu.edu – HPCC home
wiki.hpcc.msu.edu – Public/Private Wiki
forums.hpcc.msu.edu – User forums
rt.hpcc.msu.edu – Help desk request tracking
mon.hpcc.msu.edu – System Monitors

Page 3

HPCC Cluster Overview

Linux operating system

Primary interface is text based, through Secure Shell (SSH)

All machines in the main cluster are binary compatible (compile once, run anywhere)

Each user has 50 GB of personal hard drive space: /mnt/home/username/

Users have access to 33 TB of scratch space: /mnt/scratch/username/

A scheduler is used to manage jobs running on the cluster

A submission script is used to tell the scheduler the resources required and how to run a job

A Module system is used to manage the loading and unloading of software configurations
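The scheduler and submission script mentioned above can be illustrated with a minimal PBS/Torque-style script; the directives, resource values, and module name here are assumptions for illustration, not exact HPCC defaults:

```shell
#!/bin/bash -login
#PBS -l walltime=01:00:00     # wall-clock time requested from the scheduler
#PBS -l nodes=1:ppn=4         # one node, four processor cores
#PBS -l mem=4gb               # total memory for the job
#PBS -N examplejob            # job name shown in the queue

# Load the software configuration the job needs (module name is an example).
module load OpenMPI

# Run from the directory the job was submitted from.
cd ${PBS_O_WORKDIR}
mpirun -np 4 ./my_program
```

The script would be handed to the scheduler with something like `qsub examplejob.qsub`.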

Page 4

gateway

Access to HPCC is primarily through the gateway machine: ssh [email protected]

Access to all HPCC services uses your MSU username and password.
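A sketch of the login sequence (substitute your own MSU username; the gateway hostname hpc.msu.edu is taken from the "Getting Connected" slide, and the developer node name is an example):

```shell
# Log in to the gateway with your MSU username and password.
ssh username@hpc.msu.edu

# The gateway itself is not for heavy work; hop to a developer node from there:
ssh dev-intel07
```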

Page 5

HPCC System Diagram

Page 6

Hardware Time Line

Year  Name     Description                          Cores/node  Memory           Total cores
2005  green    1.6GHz Itanium2 (very old)           128         576 (shared)     128

Main Cluster

2005  amd05    Dual-core 2.2GHz AMD Opterons        4           8GB              512
2007  intel07  Quad-core 2.3GHz Xeons               8           8GB              1024
2008  intel08  Sun x4450s (Fat Node)                16          64GB             32
2009  amd09    Sun Fire X4600 Opterons (Fat Node)   32          256GB            128

Total main-cluster cores: 1696

We are currently investigating two new purchases for 2009/2010: a Graphics Processing Unit (GPU) cluster and a new general-purpose large cluster.

Page 7

HPCC System Diagram

Page 8

Cluster Developer Nodes

Developer Nodes are accessible from gateway and used for testing.

ssh dev-amd05 – Same hardware as amd05
ssh dev-intel07 – Same hardware as intel07
ssh dev-amd09 – Same hardware as amd09

We periodically have some test boxes. These include:
ssh dev-intel09 – 8-core Intel Xeon with 24GB of memory
ssh gfx-000 – NVIDIA Graphics Processing Node

Jobs running on the developer nodes should be limited to two hours of walltime.

Developer nodes are shared by everyone.
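Putting the developer-node rules together, a short test session might look like this (the node choice, module, and program names are illustrative):

```shell
ssh dev-intel07               # shared test node, same hardware as intel07
module load GNU               # example compiler module
gcc -O2 -o mytest mytest.c    # compile once; the binary runs on any main-cluster node
./mytest small_input.dat      # keep test runs under the two-hour walltime guideline
```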

Page 9

HPCC System Diagram

Page 10

Available Software

Center Supported Development Software: Intel compilers, openmp, openmpi, mvapich, totalview, mkl, pathscale, gnu...

Center Supported Research Software: Matlab, R, fluent, abaqus, HEEDS, amber, blast, ls-dyna, starp...

Center Unsupported Software (module use.cus): gromacs, cmake, cuda, imagemagick, java, openmm, siesta...

Page 11

Steps in Using the HPCC

1. Connect to HPCC

2. Transfer required input files and source code

3. Determine required software

4. Compile programs (if needed)

5. Test software/programs on a developer node

6. Write a submission script

7. Submit the job

8. Get your results and write a paper!!
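The eight steps above can be sketched as one command-line session; the hostnames, file names, and the qsub/qstat commands assume a PBS-style scheduler and are illustrative:

```shell
# Steps 1-2: connect and transfer files (scp is run from your own machine).
scp project.tar.gz username@hpc.msu.edu:/mnt/home/username/
ssh username@hpc.msu.edu

# Steps 3-4: determine required software and compile.
module avail                  # see what is installed
module load GNU               # example module
tar xzf project.tar.gz && cd project
make

# Step 5: short test run on a developer node.
ssh dev-intel07 'cd ~/project && ./myprog test_input'

# Steps 6-7: write a submission script, then submit it.
qsub myjob.qsub

# Step 8: watch the queue, then collect your results.
qstat -u username
```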

Page 12

Module System

To maximize the different types of software and system configurations available to users, the HPCC uses a Module system.

Key Commands
module avail – show available modules
module list – list currently loaded modules
module load modulename – load a module
module unload modulename – unload a module
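For example, a session that swaps one loaded configuration for another (the module names here are illustrative):

```shell
module avail            # what is installed
module list             # what is currently loaded
module unload intel     # drop one configuration...
module load gnu         # ...load another
module list             # confirm the change
```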

Page 13

Getting Help

Documentation and User Manual - wiki.hpcc.msu.edu

User Forums - forums.hpcc.msu.edu

Contact HPCC and iCER Staff for:

Reporting System Problems

Consultation on HPC program writing/debugging

Help with HPC grant writing

System Requests

Other General Questions

Primary form of contact - www.hpcc.msu.edu/contact

HPCC Request tracking system – rt.hpcc.msu.edu

HPCC Phone – (517) 353-9309 9am-5pm

HPCC Office – Engineering Building 3200 9am-5pm

Page 14

Next Week - Getting Connected

Secure Shell - hpc.msu.edu
Putty
Windows Secure Shell

X11 Server (windowing)
xming
cygwin

File transfers
Mapped Network Drives - files.hpc.msu.edu