TRANSCRIPT
Cyberinfrastructure Support at
Penn State
CRCC Workshop
April 2006
Vijay Agarwala, Kevin Morooney
Information Technology Services
vijay at psu.edu, kxm at psu.edu
Outline
• Service model description
• Some examples
• The Good, The Bad
Service Opportunities
• Computation
• Visualization
• Collaboration
• Networking
ITS Clusters
• Condominium model of usage and acquisition
– Started over 20 years ago, many lessons learned
– When you aren't using yours, others can
– When you are using yours, you can use more than yours
• We end up with a bigger single “image” to enable bigger science
• Each partner gets the perception and reality of computing more than they could if they went on their own
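The sharing policy those bullets describe can be sketched as a toy allocator. This is a hypothetical Python illustration, not the actual batch scheduler (in practice a production scheduler enforces this via fair-share and preemption policies); the group names and node counts are made up:

```python
# Toy sketch of the "condominium" model: each group buys a share of nodes,
# and idle nodes beyond a group's share can be used opportunistically.
# (Illustrative only; preemption/reclaim logic of a real scheduler is omitted.)

class CondoCluster:
    def __init__(self, shares):
        # shares: dict mapping group name -> number of nodes that group purchased
        self.shares = dict(shares)
        self.total = sum(shares.values())
        self.in_use = {g: 0 for g in shares}

    def free_nodes(self):
        return self.total - sum(self.in_use.values())

    def request(self, group, n):
        """Grant up to n idle nodes; a group may burst past its owned share
        whenever other partners' nodes are sitting idle."""
        granted = min(n, self.free_nodes())
        self.in_use[group] += granted
        return granted

    def release(self, group, n):
        self.in_use[group] = max(0, self.in_use[group] - n)

# "When you aren't using yours, others can; when you are, you can use more."
cluster = CondoCluster({"A": 10, "B": 10})
assert cluster.request("A", 15) == 15   # A bursts beyond its 10-node share
assert cluster.request("B", 10) == 5    # B gets only the 5 nodes still idle
```

The design choice is the point of the slide: because the partition is by policy rather than by hardware, every partner sees a bigger machine than they bought.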
A typical cluster - ownership view
[Diagram: the cluster's nodes divided into slices owned by ITS and Research Groups A through F]
A typical cluster - “weekly” view
[Diagram: in one week, Research Group A occupies its own slice plus those of idle partners]
A typical cluster - another “weekly” view
[Diagram: in another week, Research Group E dominates usage across several partners' slices]
One way to look at it...
[Diagram: ITS-research brokers between Faculty, Vendors/solutions, and Facilities, weighing methodologies, funding trends, funding availability, skillsets, direction, viability, performance, price, and “partner?” questions]
Typical Acquisition Timeline
[Diagram: acquisition timeline spanning years 0 through 3]
Some Base Philosophies
• Take technology risks so faculty don't have to
• Achieve institutional efficiencies
• Stay “in-line” with federal/regional/technology trends and facilities
• Do what faculty don't want to do/can't do/won't do/shouldn’t do
Computation - our value proposition
• Reliability
• Serviceability
• Security
• Performance
• Staying on a technology edge
• Maintaining vendor relationships
• Applications availability
Why would faculty ever want to go it alone?!?!?
Dr. James B. Anderson, Evan Pugh Professor of Chemistry
Dr. Craig S. Bishop, Associate Professor of Meteorology
Dr. Ottar N. Bjornstad, Assistant Professor of Entomology and Statistics
Dr. James G. Brasseur, Professor of Mechanical Engineering
Dr. Lance R. Collins, Professor of Chemical Engineering
Dr. William M. Frank, Professor of Meteorology
Dr. Barbara J. Garrison, Shapiro Professor of Chemistry
Dr. Ross C. Hardison, Professor of Biochemistry and Molecular Biology
Dr. Jainendra K. Jain, Erwin W. Mueller Professor of Physics
Dr. Izabela Makalowska, Director of the Computational Genomics Center
Dr. Wojciech Makalowski, Associate Professor of Biology
Dr. Webb C. Miller, Professor of Computer Science and Engineering
Dr. Katriona Shea, Assistant Professor of Biology
LION-XE Participants
LION-XE Computational Cluster
128 Dell PowerEdge 1550s
• Dual 1 GHz Intel Xeon Processors
• 1 GB of ECC RAM
• 36 GB of SCSI disk
BlueArc Fileserver
Dolphin SCI high-speed interconnect
http://gears.aset.psu.edu/hpc/systems/lionxe/
LION-XL Computational Cluster
176 Dell PowerEdge 2650s
• Dual 2.4 GHz Intel Xeon Processors
• 4 GB of ECC RAM
• 36 GB of SCSI disk
BlueArc Fileserver
Quadrics high-speed interconnect
Dr. James B. Anderson, Evan Pugh Professor of Chemistry
Dr. Long-Qing Chen, Professor of Materials Science and Engineering
Dr. Qiang Du, Professor of Mathematics
Dr. Barbara J. Garrison, Shapiro Professor of Chemistry
Dr. Daniel Haworth, Assistant Professor of Mechanical Engineering
Dr. Zi-Kui Liu, Associate Professor of Materials Science and Engineering
Dr. Anton Nekrutenko, Assistant Professor of Biochemistry and Molecular Biology
Dr. Padma Raghavan, Associate Professor of Computer Science and Engineering
Dr. Jorge Sofo, Director of the Materials Simulation Center
Dr. Jinchao Xu, Director of the Center for Computational Mathematics and Applications
Dr. Ludmil Zikatanov, Assistant Professor of Mathematics
LION-XL Participants
http://gears.aset.psu.edu/hpc/systems/lionxl/
LION-XM Computational Cluster
128 Dell PowerEdge 1750s
• Dual 3.06 GHz Intel Xeon Processors
• 4 GB of ECC RAM
• 36 GB of SCSI disk
BlueArc Fileserver
Myrinet high-speed interconnect
Dr. Douglas Cowen, Associate Professor of Physics, Astronomy and Astrophysics
Dr. Melik Demirel, Assistant Professor of Engineering Science and Mechanics
Dr. Dan Haworth, Associate Professor of Mechanical Engineering
Dr. Costas Maranas, Associate Professor of Chemical Engineering
Dr. Mike Modest, Professor of Mechanical Engineering
Dr. Martin Moeck, Assistant Professor of Architectural Engineering
Dr. Paul Plassman, Associate Professor of Computer Science and Engineering
Dr. Stephen Rathbun, Associate Professor of Statistics
LION-XM Computational Partners
http://gears.aset.psu.edu/hpc/systems/lionxm/
Pleiades Computational Cluster
The Pleiades Cluster, a partnership between Dr. Lee
Samuel Finn, Director of the Center for Gravitational
Wave Physics, and the GEaRS group, is dedicated to
the analysis of data from the Laser Interferometer
Gravitational-Wave Observatory (LIGO), whose
goal is the detection of gravitational waves and their
use as a new tool of astronomical discovery.
128 Sun V60x servers
• Dual 2.8 GHz Intel Xeon Processors
• 2 GB memory
28 Dell 1750 servers
• Dual 3 GHz Intel Xeon Processors
• 2 GB memory
9 Dell PowerEdge 1750 file servers
18 Dell PowerVault 220S SCSI enclosures
• 35.1 TB aggregate storage
168-port Alcatel 7800 Gigabit Ethernet switch
http://ligo.aset.psu.edu/
LION-XO Computational Cluster
80 Sun SunFire v20z Compute Servers
• Dual 2.4 GHz AMD Opteron Processors
• 8 GB of ECC RAM (Expandable to 16 GB)
• 219 GB of SCSI disk
52 Sun SunFire v40z Compute Servers
• Quad 2.6 GHz AMD Opteron Processors
• 40 Servers with 16 GB of ECC RAM
• 12 Servers with 32 GB of ECC RAM
• 584 GB of SCSI disk
Silverstorm Infiniband high-speed interconnect
Dr. Antonios Armaou, Assistant Professor of Chemical Engineering
Dr. Susan Brantley, Director, Earth and Environmental Systems Institute
Dr. Barbara Garrison, Shapiro Professor of Chemistry
Dr. James Kubicki, Associate Professor of Geosciences
Dr. Janna Maranas, Assistant Professor of Chemical Engineering
Dr. Mark Maroncelli, Professor of Chemistry
Dr. Patrick Reed, Assistant Professor of Civil and Environmental Engineering
Dr. Mark Shriver, Associate Professor of Anthropology
Dr. Thorsten Wagener, Assistant Professor of Civil Engineering
LION-XO Computational Partners
http://gears.aset.psu.edu/hpc/systems/lionxo/
LION-XA Computational Cluster
32 Dell PowerEdge 1850 Compute Servers
• Dual 3.6 GHz Intel Xeon EM64T Processors
• 8 GB of ECC RAM
• 146 GB of SCSI disk
Dr. Bryan Grenfell, Professor of Biology
Dr. Edward Holmes, Professor of Biology
Dr. David Lemmon, Center for Development and Health Research Methodology
Dr. Klaus Keller, Assistant Professor of Geosciences
Dr. Kateryna Makova, Assistant Professor of Biology
Dr. Randen Patterson, Assistant Professor of Biology
LION-XA Computational Partners
http://gears.aset.psu.edu/hpc/systems/lionxa/
Hammer Computational Cluster
8 Sun SunFire v40z Compute Servers
• Quad AMD Opteron 2.6 GHz Processors
• 32 GB of ECC RAM
• 876 GB of SCSI disk
Largely for interactive and code development work
http://gears.aset.psu.edu/hpc/systems/hammer/
Unisys ES7000 SMP Server
The Unisys ES7000 is targeted at serving research
needs that require either a large amount of
addressable memory or a large number of shared
memory processors.
2 ES7000 Domains
• 16 1.5 GHz Itanium2 processors
• 6 MB of L3 Cache
• 32 GB of RAM
1 ES7000 Domain
• 32 1.6 GHz Itanium2 processors
• 6 MB of L3 Cache
• 128 GB of RAM
What's next
• BlueGene, Cell processor, Galaxy servers (8-way dual core systems), GPUs
• HTX adapters (Pathscale/QLogic)
• Next gen Infiniband (Silverstorm)
Application Support
Finite Element Solvers: ABAQUS, ANSYS, FLUENT, LS-DYNA, MSC/Nastran
Mathematical and Statistical Libraries and Programs: ATLAS, BLAS, IMSL, LAPACK, Mathematica, Matlab, Intel MKL, PETSc, GNU R
Computational Chemistry and Materials Science: Gamess, Gaussian 03, NWChem, WIEN2K, Vasp
Computational Biology: BLAST, ClustalW, ClustalX, PHRED/PHRAP/Consed, RepeatMasker
Solid Modeling: I-DEAS, MSC/Patran
Compilers: Absoft ProFortran, GNU Pascal, IBM Java2, IBM XL Fortran Compiler, Intel C++ Compiler, Intel Fortran Compiler, Jikes, Lahey/Fujitsu Fortran 95 Pro, Portland PGI Compilers and Tools, SUN Java2
http://gears.aset.psu.edu/hpc/software/
Education/Training
• On demand teaching and learning
– where you want it
– when you want it
– what you want
• Go where the research groups are and teach customized material
– location, location, location
– no excuses
What’s going well
• Build it together, not “build it and they will come”
• More frequent publication acknowledgement, authorship
• More frequent and prominent role in faculty recruitment
• Inroads to new departments
• Better platform diversity
• Graduate Minor alive and well
• Joint funding of staff, live in our eco-system
What needs help
• Institutionalization of the condo
• Space, power, cooling
• Solving the VO problem
• It’s a two front battle - grass roots and leadership
• Staff retention and recruitment
Visualization
• A bit harder to define – means different things to different faculty
• Different kinds of facilities on campus
• We try to fill the niches which aren't being met
• Guiding principle – lower the bar for adoption both in terms of cost and usability
Tiled Display Wall
Scalable, Parallel, Open Source Graphics Development Environment
System Description:
• 12-projector tiled display
• High resolution (4096 x 2304 pixels)
• Large format (7 x 11 ft.)
• 24 CPU/12 node Linux cluster
• Allows display of significant detail throughout a larger spatial context
• Provides a large format, high-resolution, multi-window workspace
Adaptable Software Development Environment:
• Parallel clustered graphics applications
• Hi-resolution tiling of OpenGL applications
• Unified tiled Linux desktop
• Linux, Chromium, DMX, VTK, OpenDX
• Similar software environment now being applied in IEL clustered VR solution
http://gears.aset.psu.edu/viz/facilities/displaywall/
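As a sanity check, the wall's stated aggregate resolution is consistent with a 4 x 3 grid of XGA projectors. That layout is an inference from the totals, since the slide gives only the projector count and overall pixel dimensions:

```python
# Aggregate resolution of a tiled display is just the per-projector
# resolution multiplied out across the grid. A 4 x 3 grid of XGA
# (1024 x 768) projectors is an assumed layout matching the slide's numbers.

def grid_resolution(cols, rows, proj_w, proj_h):
    """Pixel dimensions of a cols x rows tiled display (no edge blending)."""
    return cols * proj_w, rows * proj_h

w, h = grid_resolution(4, 3, 1024, 768)
assert (w, h) == (4096, 2304)   # matches the stated wall resolution
assert 4 * 3 == 12              # matches the stated projector count
```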
ITS/SALA Immersive Environments Lab; A/E Engineering Education VR Lab
Affordable, Accessible Projection-Based VR for Research and Instruction
System description:
• 1, 2 or 3-screen (Windows, Linux, Mac) flexible display
• 2 + 1 screen Windows desktop VR and multimedia use
• 3-screen panoramic VR using Linux clustered graphics
• Unified Linux desktop for use of existing applications
• Scalable, open source development – Chromium, DMX, OpenSG, VRPN, Qt, JAVA3D, C++, VRML
Primary teaching and research facilitated:
• Undergraduate design studio (Kalisperis)
• Digital design media (Kalisperis, Muramoto)
• NSF engineering education, new facility funded (Messner)
• LARCH 3D perception/communication studies
Development items and extended impacts:
• Pending proposals (AE/NSF/VaTech, Arch/Comm/AE)
• IEL techniques applied in Sports Medicine research
• Investigating 3D telecollaboration and enhanced user interactions (e.g. in-room wireless, shared desktops among laptops & tablet PCs, touch screen, tracking)
• Collaborative opportunities with private firms
http://gears.aset.psu.edu/viz/facilities/iel/
Stereo-enabled Multimedia Technology Classroom
System description:
• Stereo-enabled enhanced multimedia projection for
lecture use in a technology classroom
• Based on IEL two-screen Windows desktop design
• Collaboration among ITS units (GEaRS, CLC, MTSS)
Implementation status:
• Initial installation completed in 158 Willard (Fall 2003)
• Room is in the generally schedulable classroom pool
• Several demonstrative 3D applications installed
• Now in marketing and “friendly user” period to better
assess and accommodate faculty needs
http://gears.aset.psu.edu/viz/facilities/stereoclassroom/
Sports Medicine VR Lab
Application description:
• Integration of VR display and graphics programming with motion tracking, force platform, EEG, fMRI and potentially other human performance measurement devices for kinesiology and psychology studies of perception, cognition and postural stability in normal and brain-injured subjects
• VR stimuli coding using open development tools, data analysis visualization using Matlab
Research tracks supported:
• Lasch building lab of adapted IEL design for working with concussed athletes (Slobounov, Sebastianelli, Ray)
• Computer building Idesk lab for Parkinson's patients (Newell, Haibach)
• Adaptation of previous balance protocols for use during fMRI measurement at NIH (Slobounov)
• Emotional image response posture studies (Ray)
• NIH proposal (Slobounov) pending review
http://gears.aset.psu.edu/viz/facilities/sportsmed/
Visualization-Applications and Programming Consulting, Collaboration and Training
GEaRS staff provide local expertise for Penn State faculty on the application of visualization and VR systems, applications and programming toolkits within a range of research and teaching contexts. Current focus areas include:
• Graphics, modeling and data visualization applications (e.g. Tecplot, FormZ, VMD, Matlab)
• Open source data visualization toolkits and development environments (e.g. VTK, OpenDX)
• 3D graphics programming/VR development (e.g. OpenSG, VRPN, JAVA3D, OpenGL, VRML, X3D)
• Parallel graphics development (e.g. Chromium, DMX, Paraview)
• Languages (e.g. C++, Tcl/Tk, Qt, JAVA, Python)
Staff expertise is made available to faculty through individual consulting and/or collaboration, staff participation or assistance in credit course instruction, customized on-demand seminars for specialized disciplines or research groups, etc.
ACCESS Grid Node
The ACCESS Grid (AG) node will support large-scale distributed meetings, collaborative work sessions, seminars, lectures, tutorials, etc. It enables group-to-group communications using multicast internet-working, voice and video teleconferencing, and desktop applications sharing among multiple remotely-located participants. The AG node enables rich multimedia exchanges among the participants, and is supported by a large number of organizations in academia and industry.
Recent conferences and meetings enabled:
• 2nd and 3rd Virtual Conferences on Genomics and Bioinformatics, SC Global, HPC Seminars, SC2 Visualization Group, EMS WUN (Dr. DiBiase), AG Art (Dr. Thurman), National Internet2 Day
Adaptable solutions:
• PIG setup/testing for individual access
• Connectivity with TNS H.323 via VRVS
• Currently adding Access Grid functionality to IEL for enhanced 3D telecollaboration
http://gears.aset.psu.edu/viz/facilities/gridnode/