
Page 1:

A distributed Grid-based analysis system for the MAGIC telescope

Dr. Harald Kornmayer, "A distributed, Grid-based analysis system for the MAGIC telescope", CHEP 2004, Interlaken

H. Kornmayer, IWR, Forschungszentrum Karlsruhe
C. Bigongiari, INFN Padua
A. deAngelis, M. Paraccini, A. Forti, University of Udine
M. Delfino, PIC Barcelona
M. Mazzucato, CNAF Bologna

For the MAGIC Collaboration

Forschungszentrum Karlsruhe in der Helmholtz-Gemeinschaft

Page 2:

Outline

• What kind of MAGIC?
• Telescope requirements
• The basic architecture of the distributed system
• First results
• The future

Page 3:

MAGIC - introduction

MAGIC Telescope
• Ground-based Air Cherenkov Telescope
• La Palma, Canary Islands (28° North, 18° West)
• 17 m diameter
• In operation since autumn 2003 (still in commissioning)
• Collaborators: IFAE Barcelona, UAB Barcelona, Humboldt U. Berlin, UC Davis, U. Lodz, UC Madrid, MPI München, INFN / U. Padova, U. Potchefstrom, INFN / U. Siena, Tuorla Observatory, INFN / U. Udine, U. Würzburg, Yerevan Physics Inst., ETH Zürich

Page 4:

MAGIC – physics goals

• Active Galactic Nuclei (AGN) (Mkn 501, Mkn 421)
• Supernova Remnants
• Unidentified EGRET sources
• Gamma Ray Bursts
• etc.

[image: the Crab Nebula, by Chandra]

Page 5:

MAGIC – ground-based γ-ray astronomy

1. The primary particle creates an extensive air shower
2. The secondary particles produce Cherenkov light
3. The telescope collects the Cherenkov light onto the focal plane
4. The camera/DAQ system records the showers

The gamma-ray flux is low, so a huge collection area is required; only ground-based observations are possible.

The cosmic rays consist mainly of hadronic primaries, so a gamma/hadron separation based on MC simulations is needed.
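A toy sketch of what such a cut-based gamma/hadron separation can look like. The image parameters (Hillas-style width/length) are standard in the field, but the parameter names and cut values below are purely illustrative assumptions, not the actual MAGIC analysis cuts, which would be tuned on the MC simulations mentioned above.

```python
# Hypothetical cut-based gamma/hadron separation on shower-image
# parameters. Thresholds are illustrative, not real MAGIC cuts.

def is_gamma_candidate(width_deg, length_deg, size_phe):
    """Accept an event as gamma-like if its camera image is compact
    (small width/length) and bright enough to classify reliably."""
    if size_phe < 100:   # too faint: reject rather than misclassify
        return False
    return width_deg < 0.12 and length_deg < 0.30

events = [
    {"width_deg": 0.08, "length_deg": 0.20, "size_phe": 500},  # compact: gamma-like
    {"width_deg": 0.25, "length_deg": 0.60, "size_phe": 800},  # broad: hadron-like
]
candidates = [e for e in events if is_gamma_candidate(**e)]
print(len(candidates))  # 1
```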

Page 6:

MAGIC – the camera

• 577 photomultiplier tubes, ~3.5° field of view
• Read out with a 300 MHz FADC system
• Readout is based on multi-level triggers

[example camera images: gamma, proton, muon]

Page 7:

MAGIC – Monte Carlo simulations

• Based on the air shower simulation program CORSIKA
• Simulation of the hadronic background is very CPU-intensive:
  – to simulate the background of one night, 70 CPUs (P4, 2 GHz) need to run 19200 days
  – to simulate the gamma events of one night for a Crab-like source takes 288 days
  – the detector/atmosphere is volatile
• A good production strategy is needed.
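A back-of-envelope reading of why these figures motivate the production strategy (and the >1000-CPU requirement that appears later). The interpretation of the quoted 19200 days as total CPU-days of work for one night of background is my assumption, not stated on the slide; under that reading, wall-clock time simply scales inversely with farm size.

```python
# Assumed reading: 19200 days = total CPU-days for one night of
# hadronic background. Wall-clock time then scales as work / CPUs.

total_cpu_days = 19200            # assumed total work for one night

for cpus in (70, 1000):           # today's farm vs. the required >1000 CPUs
    wallclock_days = total_cpu_days / cpus
    print(cpus, "CPUs ->", round(wallclock_days, 1), "days per night")
```

Even with 1000 CPUs, each observing night of background costs weeks of simulation, which is why distributed, continuously running production matters more than any single farm.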

Page 8:

MAGIC - summary

• European collaboration
• In operation for one year, still in commissioning phase
• Parameters:
  – field of view: 3.5°
  – current trigger threshold: ~50 GeV
  – current trigger rate: ~200 Hz (typical)
  – current data volume: ~1.2-2.0 GB per hour
• Intensive Monte Carlo studies are needed to lower the threshold
• A second telescope is already funded.

Page 9:

MAGIC - summary

(summary bullets repeated from the previous slide, now shown with the first signal plots)

[signal plots: Crab (March 19 + 22), Mkn 421 (April 2004)]

Page 10:

Requirements

Storage / amount of data:
• real data: > 7 TB per year
• MC data: > 10 TB per year
• easy access to the data

Computation:
• MC: today 100 CPUs; need for > 1000 CPUs
• Analysis: today 40 CPUs; need for > 200; new methods in the future

General:
• Distributed: collaborators all over Europe
• Secure: restricted access
• Accessibility: easy access via a portal; easy access to the data
• Availability: 24x7
• Scalability: MAGIC II is coming; collaboration with other ACTs

Page 11:

Basic Architecture

MAGIC backbone:
• three computing centres:
  – CNAF (Italy)
  – PIC (Spain)
  – FZK (Germany)
• plus La Palma
• they run the main services: data storage, job scheduling, portal
• collaborators "plug in" their resources into the backbone

Page 12:

Basic architecture II

• A metadata catalog is needed to classify the observation data

Usage of existing components:
• based on LCG-2: CE/SE/UI, Resource Broker, Replica Location Service
• CrossGrid portal solution: Migrating Desktop, Roaming Access Server, JSS (Job Submission Service)
• metadata catalogs:
  – select real data by astronomical source
  – select MC data by input parameters
• Implementation of the services will start with the developments for Monte Carlo production
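The two selection use cases above can be sketched as queries against a small metadata catalog. The record fields, values, and logical file names below are illustrative assumptions, not the actual MAGIC catalog schema; in the real system the catalog would sit behind the Grid services listed above.

```python
# Hypothetical metadata catalog: real runs are tagged with an
# astronomical source, MC runs with their simulation parameters.
catalog = [
    {"type": "real", "source": "Mkn 421", "lfn": "lfn:/magic/real/0001"},
    {"type": "real", "source": "Crab",    "lfn": "lfn:/magic/real/0002"},
    {"type": "mc", "primary": "gamma",  "zenith_deg": 20, "lfn": "lfn:/magic/mc/0001"},
    {"type": "mc", "primary": "proton", "zenith_deg": 20, "lfn": "lfn:/magic/mc/0002"},
]

def select_real(source):
    """Logical file names of real runs for one astronomical source."""
    return [r["lfn"] for r in catalog
            if r["type"] == "real" and r["source"] == source]

def select_mc(primary, zenith_deg):
    """Logical file names of MC runs matching the input parameters."""
    return [r["lfn"] for r in catalog
            if r["type"] == "mc"
            and r["primary"] == primary and r["zenith_deg"] == zenith_deg]

print(select_real("Mkn 421"))  # ['lfn:/magic/real/0001']
print(select_mc("gamma", 20))  # ['lfn:/magic/mc/0001']
```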

Page 13:

MAGIC Grid Simulation Tool

• Run and control simulations in the distributed system
• Driven by three use cases:
  – submit jobs
  – monitor jobs
  – manage data
• Easy-to-use GUI (Java Swing) that hides the LCG commands
• Job monitoring and data management based on a dedicated database
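A minimal sketch of the bookkeeping idea behind such a dedicated database: every submitted run is recorded with its Grid job id and status, so the tool can monitor progress and find failed runs later. The table and column names are assumptions for illustration, not the tool's actual schema.

```python
# Minimal run-bookkeeping sketch using an in-memory SQLite database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE runs (
    run_id      INTEGER PRIMARY KEY,
    grid_job_id TEXT,
    status      TEXT)""")   # e.g. SUBMITTED / RUNNING / DONE / FAILED

# Submission records the Grid job id; monitoring updates the status.
conn.execute("INSERT INTO runs VALUES (1, 'job-0001', 'SUBMITTED')")
conn.execute("UPDATE runs SET status = 'DONE' WHERE run_id = 1")

status = conn.execute(
    "SELECT status FROM runs WHERE run_id = 1").fetchone()[0]
print(status)  # DONE
```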

Page 14:

MAGIC @ CrossGrid

Used for the first implementation steps:
• LCG-2 testbed with extensions from CrossGrid
• 16 sites in Europe
• First tests done:
  – 200 jobs submitted
  – 10% failed due to problems with the setup (Replica Location Service)
  – easy resubmission thanks to the MAGIC run database
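The easy-resubmit workflow can be sketched as follows: because the run database records the status of every job, the failed ones can be selected and handed back to submission without redoing the whole production. The `resubmit` function is a hypothetical placeholder for the real Grid submission step; the 200-job / 10%-failure numbers from the slide are used only to build the example.

```python
# Sketch: pick failed runs out of the bookkeeping and resubmit them.
runs = [{"run_id": i, "status": "FAILED" if i % 10 == 0 else "DONE"}
        for i in range(1, 201)]   # 200 jobs, every 10th one failed (10%)

def resubmit(run_id):
    """Placeholder for handing the run back to the Grid resource broker."""
    return {"run_id": run_id, "status": "SUBMITTED"}

failed = [r["run_id"] for r in runs if r["status"] == "FAILED"]
resubmitted = [resubmit(rid) for rid in failed]
print(len(resubmitted))  # 20
```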

Page 15:

MAGIC and EGEE

• EGEE is the biggest Grid project in Europe
• MAGIC applied as a generic application (NA4)
• The proposal was accepted in June/July
• Collaboration with EGEE has been under way since August

First results:
- Virtual Organisation VO-MAGIC set up at NIKHEF/SARA
- Usage of the GILDA testbed (Catania) agreed
- Integration of the first site of MAGIC collaborators planned for the end of the year

Page 16:

Conclusion & the Future

The MAGIC Grid
• is a good example of a distributed simulation and analysis system
• can be used to exploit the existing Grid middleware
• is on the way, and first results can be seen
• is a prototype for astroparticle physics
• can help to build collaborations with other experiments (ACTs, satellites, optical telescopes, ...)