

1/30

NAMD and GROMACS

The Quest To Go Parallel

Peter Spijker

California Institute of Technology
Materials Process and Simulation Center, Biochemistry & Molecular Biophysics

Technische Universiteit Eindhoven
Department of Biomedical Engineering, Division of Biomedical Imaging and Modeling

March 2, 2004


2/30

Presentation Overview

• Purpose of presentation

• NAMD? GROMACS? What the heck?

• Discussion on both packages

• Compare both packages

• Examples

• Parameter files

• Parallelisation

• References

• Questions


3/30

Purpose Of This Presentation

• Give a first glimpse of NAMD and GROMACS

• A (very) short introduction to how to use them

• Explain reasons for using these packages

• Discuss the parallel experience


4/30

NAMD? GROMACS? What The Heck?

• NAMD is short for:

Not (just) Another Molecular Dynamics Program

• Developed by the Theoretical Biophysics Group (Klaus Schulten) at the University of Illinois at Urbana-Champaign

• GROMACS is short for:

Groningen Machine for Chemical Simulations

• Developed by the Berendsen Group, Department of Biophysical Chemistry, University of Groningen, The Netherlands


5/30

NAMD? GROMACS? What The Heck?

• Both packages are Molecular Dynamics Programs

• Both are aimed at high-performance simulations

• Both are designed for parallel systems, a development of recent years

• Both are pretty young (in their current form):

NAMD: 1999

GROMACS: 2001


6/30

Molecular Dynamics

• A couple of remarks on Molecular Dynamics:

• Simulations are classical

• Electrons are in the ground state

• Force fields are approximate

• Force field is pair-additive (see the formula below)

• Long range interactions are cut off

• Boundary conditions are unnatural
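As a reference for the pair-additive remark above, a generic force field of the kind both packages implement can be written as follows (a textbook form, not the exact functional form of either package):

    V = \sum_{\text{bonds}} \tfrac{1}{2} k_b (b - b_0)^2
      + \sum_{\text{angles}} \tfrac{1}{2} k_\theta (\theta - \theta_0)^2
      + \sum_{\text{dihedrals}} k_\phi \left[ 1 + \cos(n\phi - \delta) \right]
      + \sum_{i<j} \left[ 4\varepsilon_{ij} \left( \frac{\sigma_{ij}^{12}}{r_{ij}^{12}} - \frac{\sigma_{ij}^{6}}{r_{ij}^{6}} \right) + \frac{q_i q_j}{4\pi\varepsilon_0 r_{ij}} \right]

"Pair-additive" means the non-bonded part is a plain sum over independent atom pairs, with no explicit many-body polarisation.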


7/30

NAMD

• A short overview:

• Force Field Compatibility (CHARMM & X-PLOR)

• Full Electrostatics (Particle Mesh Ewald)

• Multiple Time Stepping (Verlet)

• Input and Output Compatibility (PDB & PSF & DCD → VMD)

• Dynamics Simulation Options (next slide)

• Easy to Modify and Extend (C++)

• Interactive MD Simulations

• Load Balancing


8/30

NAMD

• Dynamics Simulation Options:

• Constant Energy Dynamics

• Constant Temperature Dynamics

• Periodic Boundary Conditions

• Constant Pressure Dynamics

• Energy Minimization

• Fixed Atoms

• Rigid Waters

• Rigid Bonds to Hydrogens

• Harmonic Restraints

• Spherical or Cylindrical Boundary Restraints


9/30

NAMD

• Used forcefields

• CHARMM

• X-PLOR

• AMBER

• GROMACS

• But:

• Bond potentials are always approximated with harmonic or sinusoidal potentials

• For both AMBER and GROMACS restrictions apply


10/30

GROMACS

• A short overview:

• Claimed to be the fastest MD code in the world

• A good set of bonded and non-bonded interaction equations

• Long Range Electrostatics (Ewald, PME, PPPM)

• All-hydrogen forcefield

• Output compatibility (GRO → VMD)

• Possibility of Reduced Units with Lennard-Jones

• Simple forcefield files → easy to do coarse graining

• Large Analysis Toolkit

• Easy to extend (C)

• Uses a preprocessor (GROMPP) to optimise the input files


11/30

GROMACS

• Dynamics Simulation Options

• Shell Molecular Dynamics

• Constraint algorithms (SHAKE, LINCS)

• Simulated Annealing

• Stochastic Dynamics (Verlet)

• Brownian Dynamics (position Langevin)

• Energy Minimization (Steepest Descent, Conjugate Gradient)

• Normal Mode Analysis

• Free Energy Calculations

• Essential Dynamics Sampling (WHATIF)


12/30

GROMACS

• Non-Bonded Interactions:

• Lennard-Jones

• Buckingham

• Coulomb interaction

• Coulomb interaction with reaction field

• Modified non-bonded interactions (shift function, Ewald summation)

• Bonded Interactions:

• Bond Stretching Harmonic Potential

• Morse Potential Bond Stretching

• Cubic Bond Stretching Potential

• Harmonic Angle Potential

• Cosine based angle potential

• Improper and Proper dihedrals

• All types of restraints
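In textbook form, the first two bond-stretching options above read as follows (generic symbols, not GROMACS' exact notation):

    V_{\text{harmonic}}(b) = \tfrac{1}{2} k_b (b - b_0)^2
    V_{\text{Morse}}(b) = D_e \left[ 1 - e^{-\beta (b - b_0)} \right]^2

The Morse potential allows bond dissociation at large b, which the harmonic form cannot describe.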


13/30

NAMD vs GROMACS

• Both packages are very useful

• Both perform well on parallel systems

• NAMD is more focussed on biological simulations (proteins, lipid bilayers)

• GROMACS is aimed more at computational chemistry as a whole

• Both are highly compatible with other programs for analysis and visualisation

• GROMACS is very useful for coarse grain simulations


14/30

Examples

• NAMD

• Apolipoprotein (benchmark)

• Bovine Pancreatic Trypsin Inhibitor (interactive)

• Alanine (small, isolated)

• GROMACS

• Dipalmitoylphosphatidylcholine (DPPC)

• Full Atomistic

• Coarse Grain


15/30

Examples - NAMD

• Apolipoprotein (benchmark)

92,224 atoms, 1.0 ps, 300 K, periodic


16/30

Examples - NAMD

• Bovine Pancreatic Trypsin Inhibitor (interactive)

822 atoms, 20 ps, 300 K, minimisation

Interactive: watching the simulation while it runs, with the possibility to pull atoms to a position


17/30

Examples - NAMD

• Alanine

66 atoms, 10 ps, 300 K, free, runtime = 1.5 minutes

Show in VMD
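To view such a run, the structure and trajectory can be loaded into VMD directly from the command line (filenames hypothetical; this assumes VMD's usual file-type detection by extension):

    vmd alanine.psf alanine.dcd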


18/30

Examples - GROMACS

• Dipalmitoylphosphatidylcholine (DPPC, benchmark)

Full atomistic, 121,856 atoms, 10 ps, 323 K, periodic


19/30

Examples - GROMACS

• Dipalmitoylphosphatidylcholine (DPPC)

Coarse Grain, 27,324 particles, 400 ps, 323 K, periodic

± 17 min on 1 proc.


20/30

Forcefield file

• GROMACS Forcefield file

• Define Atomtypes

• Define Non-bond parameters

• Define Molecule Types

• Define Atoms

• Define Bonds, angles, dihedrals

• Define System
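A minimal sketch of such a topology (atom names, molecule names and numbers are made up for illustration; the [ defaults ] section and realistic parameters are omitted):

    [ atomtypes ]
    ; name   mass     charge  ptype  sigma   epsilon
      CX     12.011   0.000   A      0.350   0.276

    [ moleculetype ]
    ; name   nrexcl
      DUM    1

    [ atoms ]
    ;  nr  type  resnr  residue  atom  cgnr  charge
        1  CX    1      DUM      C1    1     0.000
        2  CX    1      DUM      C2    1     0.000

    [ bonds ]
    ;  ai  aj  funct  b0      kb
        1   2  1      0.153   334720.0

    [ system ]
    Two-particle test system

    [ molecules ]
    ; name   count
      DUM    100

This simple, human-readable format is what makes it easy to write coarse-grained forcefields by hand.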


21/30

Input And Output Files

• NAMD:

Input:

• PDB: coordinate & velocity file

• PSF: structure file

• FF: CHARMM, X-PLOR parameter file

• NAMD: Simulation parameter file

Output:

• COOR: final coordinate file

• VEL: final velocity file

• DCD: trajectory file

• DAT: run information (energies and so on)

• XSC: System configuration output
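To show how these pieces fit together, here is a minimal sketch of a NAMD simulation parameter file (filenames are hypothetical, and the periodic cell and PME grid settings are omitted; an illustration, not a production setup):

    # input
    structure          apoa1.psf
    coordinates        apoa1.pdb
    paraTypeCharmm     on
    parameters         par_all22_prot.inp
    temperature        300

    # integration and non-bonded settings
    timestep           1.0
    cutoff             12.0
    switching          on
    switchdist         10.0
    PME                on
    # cellBasisVector1..3 and PMEGridSize{X,Y,Z} would be set here

    # output
    outputName         apoa1_out
    DCDfreq            100

    run                1000

Running `namd2 apoa1.namd > apoa1.log` then produces the COOR, VEL, DCD and XSC files listed above.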


22/30

Input And Output Files

• GROMACS:

Input:

• GRO: coordinate & velocity file

• TOP: topology (i.e. structure) of the system

• ITP: force field (can be included in TOP)

• MDP: simulation parameter file

Preprocessed:

• TPR: simulation input file

Output:

• GRO: final configuration & velocity file

• TRR/XTC: trajectory file

• EDR: energies

• LOG: run information
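The typical workflow through these files, with hypothetical filenames (command names and flags as in the GROMACS 3.x series current at the time of this talk):

    # preprocess coordinates (GRO), topology (TOP/ITP) and parameters (MDP)
    # into a single run input file (TPR)
    grompp -f run.mdp -c conf.gro -p topol.top -o topol.tpr

    # run the simulation; writes trajectory, energies, log and final frame
    mdrun -s topol.tpr -o traj.trr -e ener.edr -g md.log -c confout.gro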


23/30

The Quest To Go Parallel

• Why parallel?

• Longer simulations

• Larger systems

• More complex systems

• Keep in mind that communication can be the bottleneck


24/30

Parallel Computation

• The basic idea is to divide the work of the job across a number of processors

• Involves intelligent communication between processors to share information

• Using Batch Scripting to submit jobs

• Using MPI to communicate and build topology

(Diagram: the BORG and HULK clusters, each built from multiple nodes)
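As an illustration, a batch script for such a run might look like this (a PBS-style sketch with made-up job and file names; directives and launch commands depend on the local cluster and on how the MD package was built):

    #!/bin/bash
    #PBS -l nodes=2:ppn=4
    #PBS -N dppc_run
    cd $PBS_O_WORKDIR
    # 8-way MPI run; NAMD is usually launched through charmrun/namd2,
    # GROMACS through an MPI-enabled mdrun
    mpirun -np 8 mdrun -s topol.tpr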


25/30

The Power Of Parallel Computation

• Benchmark System NAMD

• Apolipoprotein

92,224 atoms

Full atomistic

Coulomb interactions

Explicit water

• Run information

1000 steps → 1.0 ps

• Computational time

1 processor: 1 hour 30 minutes → 16 ps/day → S = 100%

4 processors: 25 minutes → 57.6 ps/day → S = 90%

8 processors: 14 minutes → 102.9 ps/day → S = 80%

Computations performed on BORG

S = scaling on the parallel system: S = per_N / (N × per_1), where per_x is the throughput in picoseconds per day on x processors and N is the number of processors
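The scaling figures on this and the following slides can be reproduced with a few lines of Python (throughput numbers taken from the benchmark above):

    def scaling(per_n, n, per_1):
        """Parallel scaling S = per_N / (N * per_1), throughputs in ps/day."""
        return per_n / (n * per_1)

    # Apolipoprotein benchmark: (processors, ps/day)
    for n, per_n in [(1, 16.0), (4, 57.6), (8, 102.9)]:
        print(n, "proc:", round(100 * scaling(per_n, n, 16.0)), "%")
    # prints 100 %, 90 %, 80 %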


26/30

The Power Of Parallel Computation

• Benchmark System GROMACS

• Dipalmitoylphosphatidylcholine

121,856 atoms

Full atomistic

Coulomb interactions

Explicit water

• Run information

5000 steps → 10 ps

• Computational time

1 processor: 2 hours 30 minutes → 96 ps/day → S = 100%

5 processors: 36 minutes → 390 ps/day → S = 82%

8 processors: 29 minutes → 480 ps/day → S = 62%

Computations performed on BORG


27/30

The Power Of Parallel Computation

• Coarse Grain System on GROMACS

• Dipalmitoylphosphatidylcholine

27,324 particles (± 110,000 atoms)

Coarse Grain

Coulomb interactions

Coarse grained water (1 particle)

• Run information

10,000 steps → 400 ps

• Computational time

1 processor: 17 minutes → 33,882 ps/day → S = 100%

4 processors: 9 minutes → 64,000 ps/day → S = 47%

Computations performed on BORG


28/30

Conclusions

• NAMD & GROMACS are useful for Molecular Dynamics

• NAMD scales better on parallel systems

• GROMACS has the higher absolute throughput (ps/day)

• Coarse Grain simulations are easy to perform

• Parallel simulations are easy to run

• Minor disadvantage: network data transfer


29/30

References

• NAMD User Manual

• GROMACS User Manual

• NAMD website: http://www.ks.uiuc.edu/Research/namd

• GROMACS website: http://www.gromacs.org

• Brooks et al., CHARMM, J. Comp. Chem. (1983)

• Lindahl et al., GROMACS, J. Mol. Model. (2001)

• Marrink et al., CG Model, J. Phys. Chem. B (2004)

• Manuals will be made available through the FF-group website

• For information on how to run: [email protected]


30/30

Further Questions?