TRANSCRIPT
R. Middleton GridPP8 – Bristol – 23rd September 2003 Slide 1
UK eScience BoF Session, AHM – Nottingham – September 2003
“Intersecting UK Grid and EGEE/LCG/GridPP”
Slide 2
BoF Agenda
• Applications & Requirements
• Technical Exchanges & Collaboration
• Common Strategies / Roadmap
• Discussion
Slide 3
Applications…
Slide 4
What does the eScience Grid currently look like?
Globus v2 installed at all regional eScience centres.
Heterogeneous resources (Linux clusters, SGI O2/3000, SMP Sun machines).
eScience certificate authority.
Mark Hayes
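The classic smoke test against such a GT2 installation is a trivial remote job through the gatekeeper. A minimal sketch, shown but not executed here, since it needs the GT2 client tools, a certificate from the eScience CA, and a reachable gatekeeper; the host name is made up:

```shell
# Sketch: the standard GT2 "hello world" against a gatekeeper.
# Requires Globus Toolkit 2 client tools and an eScience CA certificate;
# the gatekeeper host below is hypothetical.
GATEKEEPER="grid-compute.example.ac.uk"

# 1. Create a short-lived proxy credential from the user certificate:
#      grid-proxy-init
# 2. Run a trivial job through the gatekeeper's jobmanager:
#      globus-job-run "$GATEKEEPER" /bin/hostname

echo "smoke test target: $GATEKEEPER"
```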
Slide 5
GridPP & EDG
Dedicated Linux clusters running EDG middleware (Globus++).
Very homogeneous resources.
Resource broker (based on Condor).
LDAP-based VO management.
Mark Hayes
Slide 6
Applications
Applications on the eScience Grid
E-Minerals - Monte Carlo simulations of radiation damage to crystal structures (Condor-G, home-grown shell scripts)
Geodise - genetic algorithm for optimisation of satellite truss design (Java CoG plugins in Matlab)
GENIE - ocean-atmosphere modelling (flocked Condor pools)
Other tools in use: HPCPortal, InfoPortal, Nimrod/G, SunDCG

GridPP & EDG
(e.g.) ATLAS data challenge - Monte Carlo event generation, tracking & reconstruction
Large FORTRAN/C++ codes, optimised for Linux (and packaged as RPMs)
Runs scripted using EDG job-submit tools; GUIs under development (e.g. GANGA)
Mark Hayes
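To make the EDG job-submit tooling concrete, a minimal JDL (Job Description Language) file for a batch-style run can be sketched as follows. All names here (atlas-sim.sh, run01.*) are illustrative only, and actually submitting it requires an EDG User Interface machine:

```shell
# Sketch: a minimal EDG JDL file for a batch run such as an
# ATLAS data-challenge job. File and script names are hypothetical.
cat > job.jdl <<'EOF'
Executable    = "atlas-sim.sh";
Arguments     = "run01";
StdOutput     = "run01.out";
StdError      = "run01.err";
InputSandbox  = {"atlas-sim.sh"};
OutputSandbox = {"run01.out", "run01.err"};
EOF

# On an EDG User Interface host one would then submit with:
#   edg-job-submit job.jdl
# poll with edg-job-status, and retrieve with edg-job-get-output.
echo "wrote job.jdl"
```

The sandboxes name the files shipped to and back from the worker node; the resource broker matches the rest against published site information.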
Slide 7
Technical…
Slide 8
Comparison
[Diagram: side-by-side comparison of the two middleware stacks.
GridPP stack: PP applications over EDG 2.0 value-added components (EDG Info, VOMS, RB, Data Management) over GT2; only Linux 7.3 in places.
L2 Grid stack: applications over L2 Grid value-added components (UK MDS, simple user management, network monitoring, eScience CA, ???) over GT2.]
Peter Clarke
Slide 9
UK Grid : Deployment Phases
Level 0: Resources with Globus GT2 registering with the UK MDS at ginfo.grid-support.ac.uk.
Level 1: Resources capable of running the Grid Integration Test Script.
Level 2: Resources with one or more application packages pre-installed, capable of offering a service with local accounting and with tools for simple user management, discovery of applications and description of resources in addition to MDS.
Level 3: GT2 production platform with a widely accessible application base, distributed user and resource management, auditing and accounting. Resources signing up to Level 3 will be monitored to establish their availability and the service level offered.
– 7 Centres of Excellence
– Globus GT3 testbed (later talks and mini-workshop)
– JISC JCSR resources
Level 4: TBD, probably OGSA-based.
Rob Allan
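The Level 0 registration above can be inspected with plain LDAP tools, since GT2's MDS answers LDAP queries on port 2135. A sketch under that assumption; the GIIS named on the slide is long gone, so the live query is shown but left commented out:

```shell
# Sketch: querying a GT2 MDS/GIIS with standard LDAP tools.
# The UK GIIS from the slide no longer exists, so the network
# call itself is commented out.
GIIS_HOST="ginfo.grid-support.ac.uk"
GIIS_PORT=2135                      # default MDS LDAP port in GT2
BASE_DN="mds-vo-name=local,o=grid"  # conventional GT2 search base

# Live query (requires a reachable GIIS):
#   ldapsearch -x -h "$GIIS_HOST" -p "$GIIS_PORT" -b "$BASE_DN"

# Print the command that would be run:
echo ldapsearch -x -h "$GIIS_HOST" -p "$GIIS_PORT" -b "$BASE_DN"
```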
Slide 10
UK Grid : Key Components
• ETF Coordination: activities are coordinated through regular Access Grid meetings, e-mail and the Web site;
• Resources: the components of this Grid are the computing and data resources contributed by the UK e-Science Centres, linked through the SuperJanet4 backbone to regional networks;
• Middleware: many of the infrastructure services available on this Grid are provided by Globus GT2 software;
• Directory Services: a national Grid directory service using MDS links the information servers operated at each site and enables tasks to call on resources at any of the e-Science Centres;
• Security and User Authentication: the Grid operates a security infrastructure based on X.509 certificates issued by the e-Science Certificate Authority at the UK Grid Support Centre at CCLRC;
• Access Grid: on-line meeting facilities with dedicated rooms and multicast network access.
Rob Allan
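The X.509 certificates mentioned under Security can be examined with standard OpenSSL tools. A self-contained sketch: it generates a throwaway self-signed certificate with an eScience-style subject purely for illustration; a real UK e-Science certificate would be issued by the CA at CCLRC after an identity check, not self-signed:

```shell
# Sketch: make a throwaway self-signed certificate with an
# eScience-style distinguished name, then inspect it the way a
# Grid site would inspect a user certificate.
# NOT how real certificates are obtained; those come from the CA.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 \
    -keyout userkey.pem -out usercert.pem \
    -subj "/C=UK/O=eScience/OU=Demo/CN=demo user" 2>/dev/null

# Show the subject DN and validity period; the subject DN is what
# gets mapped to a local account (e.g. via a grid-mapfile).
openssl x509 -in usercert.pem -noout -subject -dates
```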
Slide 11
UK Grid : GT3 Testbeds
2 testbeds funded:
• Edinburgh / Newcastle / UCL / Imperial
– OGSA-DAI
– E-Materials e-Science pilot demonstrator
– AstroGrid application demonstrator
• Portsmouth / Daresbury / Westminster / Manchester / Reading
– Tackle issues of inter-working between OGSI implementations
– Report on deployment and ease of use
– HPCPortal services demonstrator
– CCP application demonstrator
– E-HTPX e-Science pilot demonstrator
– InfoPortal demonstrator using OGSA-DAI
Rob Allan
Slide 12
UK Grid : Issues to be Tackled
A number of areas significant for a production Grid environment have hardly been tackled using GT2. Issues include:
• Grid information systems, service registration, discovery and definition of facilities (schema important);
• Security, in particular role-based authorisation and security of middleware;
• Portable parallel job specifications;
• Meta-scheduling, resource reservation and ‘on demand’;
• Linking and interacting with remote data sources;
• Wide-area visualisation and computational steering;
• Workflow composition and optimisation;
• Distributed user, software and application management;
• Data management and replication services;
• Grid programming environments, PSEs and user interfaces;
• Auditing, advertising and billing in a Grid-based resource market;
• Semantic and autonomic tools; etc.
Rob Allan
Slide 13
Coordination & Management
Slide 14
SR2004 e-Science ‘soft-landing’
• Key e-Science Infrastructure components:
– Persistent National e-Science Research Grid
– Grid Operations Centre
– UK e-Science Middleware Infrastructure Repository
– National e-Science Institute (cf. Newton Institute)
– National Digital Curation Centre
– AccessGrid Support Service
– e-Science/Grid collaboratories Legal Service
– International Standards Activity
Paul Jeffreys
Slide 15
Post 2006 e-Science ‘soft-landing’
• Components foreseen:
– Baseline OST ‘Core Programme’, collaborating with JISC/JCSR on long-term support issues
– OST should support the Persistent Research Grid and the e-Science Institute
– JISC should support the Grid Operations Centre and the AccessGrid Support Service
– OST and JISC should jointly support the Repository, Curation Centre, e-Science Legal Service and International Standards Activity
Paul Jeffreys
Slide 16
Intersecting UK Grid and EGEE/LCG/GridPP
Coordination and Management
• GridPP
– Very substantial resources (computing and infrastructure)
– Built-in international connections: EGEE, LCG
– Need to create a ‘production’ service
• Possible points of intersection of the UK programme with GridPP/EGEE/LCG:
– Resources: shared sites, shared personnel
– Grid Operations and Support
– Security (operational level: firewalls etc.)
– Change management of the software suite
– OMII
– CA
– Interface to Europe
– Training, dissemination
– Collection of requirements
Paul Jeffreys
Slide 17
Discussion
Slide 18
Discussion
• EDG not yet heterogeneous; OGSA, timelines, packaging issues
• Get people working together as an investment for the future
– Working parties: Security, Resource Brokering, Information Services, Workflow, Ops & Support
• Avoid being too ambitious or complex
• Need an over-arching strategic / roadmap view
• Outcomes:
– Pursue grass-roots collaboration & the strategic view in parallel
– Set up a working party to recommend common elements of the roadmap
– Set up a small number of technical working parties… definition and details being taken forward by Andy Parker; report in preparation.
Slide 19
Status
[Diagram: the L2 Grid stack (Application over L2 Grid over GT2) and the PP stack (PP Application over PP Grid over GT2) converging, via “Shared” Engineering, on a “Common Core” layer; open questions marked “??? LCG-1/EGEE-0 ??? (GT2)” and “??? OGSA ???”.]
Slide 20
Conclusions
• Possible route for technical collaboration
– Shared engineering – “working together”
– Common objective in OGSA (similar timescales)
• No overall co-ordination – yet
• Much still to do to make it happen!