29-30 March 2006, LHC Experiments' Software, L. Silvestris


Page 1: Software Domain Decomposition

[Diagram: software domain decomposition. Core software: PluginMgr, Dictionary, MathLibs, I/O, Interpreter, GUI, 2D and 3D Graphics, Geometry, Histograms, Fitters, NTuple, Collections, Foundation Utilities, OS binding, Batch, Interactive. Simulation: Engines, Generators. Data Management: Persistency, FileCatalog, DataBase, Conditions. Grid Services. Framework. Experiment frameworks: Simulation Program, Reconstruction Program, Analysis Program; Event, Detector, Calibration, Physics Algorithms.]

Page 2: Fast simulation (I)

Different levels of "fast" simulation at the four experiments:
- CMS extreme: swim particles through the detector, including material effects, radiation, etc.; imitates the full simulation, but much faster (1 Hz).
- ATLAS: particle-level smearing; VERY fast (kHz).
- LHCb: generator output directly accessible by the physics application programs.

But: ongoing work on bridging the gap.
- For example, shower parametrization in the Geant4 full simulation (ATLAS and CMS).

Common goal of all: output data at AOD level.

Page 3: Fast simulation: CMS (II)

Simplified (FAMOS) geometry: nested cylinders, fast propagation, fast material-effect simulation.

Detailed geometry: complicated geometry, propagation in short steps, full and slow simulation.

[Validation plot: pT of the 2nd jet in t tbar events, fast vs. full simulation.]

Page 4: Fast Simulation: GFLASH (III)

[GFLASH shower-parametrization figure.]

Page 5: Fast Simulation: GFLASH (IV)

Energy deposition in a 5x5 crystal matrix for 50 GeV electrons (CMS). Histograms: full Geant4 simulation. Red markers: shower parameterization.

Page 6: Reconstruction, Trigger and Monitoring

Page 7: Reconstruction, Trigger, Monitoring

General feature: all based on the corresponding framework (AliRoot, Athena, Gaudi, CMSSW).
- Multi-threading is necessary for the online environment.
- Most algorithms and tools are common with offline.

Two big versions:
- Full reconstruction.
- "Seeded", "partial", or "reconstruction inside a region of interest"; this one is used in the HLT.

Online monitoring and event displays:
- "Spying" on Trigger/DAQ data online, but also later in express analysis.

Online calibrations.

Page 8: Online selection

10^9 Ev/s (interaction rate)
  Lv1 rejects 99.99% (accepts 0.01%)
10^5 Ev/s (Lv1 output)
  HLT rejects 99.9% (accepts 0.1%)
10^2 Ev/s (HLT output)

Same hardware (Filter Subfarms), same software, but different situations.

CMS
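The cascade above is simple arithmetic on the accept fractions; a minimal sketch with the slide's round numbers (helper names are invented for illustration):

```cpp
// Event-rate cascade through the two trigger levels; rates in events/s.
constexpr double lv1_output(double collision_rate) {
    return collision_rate / 1.0e4;  // Lv1 accepts 0.01% (rejects 99.99%)
}
constexpr double hlt_output(double lv1_rate) {
    return lv1_rate / 1.0e3;        // HLT accepts 0.1% (rejects 99.9%)
}
```

Starting from the 10^9 Ev/s interaction rate, this reproduces the 10^5 Ev/s Lv1 output and the 10^2 Ev/s reaching storage.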

Page 9: High-Level Trigger

A huge challenge: a very large rejection factor (very small accept factor).
- In practice, startup will use smaller rates.
  - CMS example: 12.5 kHz (pilot run) and 50 kHz (at 10^33 cm^-2 s^-1).
  - Real startup conditions (beam, backgrounds, experiment) are unknown.
- Startup trigger tables are in progress; ATLAS and CMS have prototypes. Real values: when beam comes...

Lvl-1 is in hardware, the HLT in software:

                    ATLAS/CMS    LHCb     ALICE
  Interaction rate  10^9 Hz      10^7 Hz  10^4 Hz
  HLT input         100 kHz      1 MHz    1 kHz
  HLT accept        100-200 Hz   200 Hz   ~50 Hz

Page 10: Combined muon reconstruction: ATLAS example

[Event display: a muon traversing the ATLAS detector, combined across the inner detector, calorimeter, magnets and the muon spectrometer (MDT and RPC/TGC chambers), with hits marked along the trajectory.]

Page 11: Reconstruction: ATLAS Tracking

[Diagram: ATLAS tracking software for the Inner Detector and the Muon Spectrometer. Stages: preparation (clustering, drift-circle creation), track reconstruction (segment finding, track finding, fitting, vertexing) and post-processing (combined reconstruction, particle creation). Shared services/tools: magnetic field, detector description, material description, conditions database, calibration, region selector, extrapolation, reconstruction tools. Combined-reconstruction products: electron/photon, muon, tau, b tagging, missing Et, vertex finder.]

Page 12: ATLAS: Applications for inner detector reconstruction

Tracking runs in:
- Offline
- Combined test-beam
- High-level trigger
- Cosmics

[Four event displays: offline, cosmics, high-level trigger, combined test-beam.]

Page 13: ATLAS & CMS Tracking Performance

[Plots: tracking performance for the ATLAS Inner Detector and the full ATLAS and CMS systems, relevant for Higgs and new-physics searches. Tracking is an essential component for HLT algorithms.]

Page 14: Regional reco example: Muon Trigger

Muon trigger: a simple case. Conditions:
- High pT threshold, around 15 GeV.
- Primary muon: transverse impact parameter below 30 microns.
- Direction known from L1 with 0.5 rad accuracy.

Tracker information needed: confirm the existence of a track satisfying the selection criteria above. Using regional seeding and a pT cut in trajectory building, it takes about 10 ms to reject an L1 muon candidate.

The tracker can be used at Level 2!
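As a hedged illustration (invented names and field layout, not CMS code), the confirmation step can be sketched as a set of cuts on tracker tracks:

```cpp
#include <cmath>
#include <vector>

// Schematic Level-2 muon confirmation: an L1 candidate is confirmed only
// if some tracker track passes the pT, impact-parameter and direction
// cuts quoted on the slide. Struct and function names are invented.
struct Track {
    double pt_gev;   // transverse momentum [GeV]
    double d0_um;    // transverse impact parameter [micron]
    double phi_rad;  // azimuthal direction [rad]
};

bool confirms_l1_muon(const Track& t, double l1_phi_rad) {
    const double kTwoPi = 6.283185307179586;
    const double dphi = std::abs(std::remainder(t.phi_rad - l1_phi_rad, kTwoPi));
    return t.pt_gev > 15.0           // high-pT threshold (~15 GeV)
        && std::abs(t.d0_um) < 30.0  // prompt muon: |d0| < 30 microns
        && dphi < 0.5;               // within the 0.5 rad L1 window
}

bool region_has_muon(const std::vector<Track>& tracks, double l1_phi_rad) {
    for (const auto& t : tracks)
        if (confirms_l1_muon(t, l1_phi_rad)) return true;
    return false;
}
```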

Page 15: Regional reco example: CMS b-Jet Trigger

From pixel hits and calorimeters:
- The seed for track reconstruction is created around the LVL1 jet direction.
- The primary vertex is calculated.

Tracks are reconstructed in a cone of R > 0.15 around the jet direction. Tracks are conditionally reconstructed, and the jet direction is then refined using the reconstructed tracks.
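The cone association relies on the standard eta-phi distance, DeltaR = sqrt(deta^2 + dphi^2); a small sketch (helper names invented):

```cpp
#include <cmath>

// DeltaR between two directions in eta-phi, with the phi difference
// wrapped into [-pi, pi].
double delta_r(double eta1, double phi1, double eta2, double phi2) {
    const double kTwoPi = 6.283185307179586;
    const double dphi = std::remainder(phi1 - phi2, kTwoPi);
    const double deta = eta1 - eta2;
    return std::sqrt(deta * deta + dphi * dphi);
}

// True if a track direction lies inside a cone around the jet axis.
bool in_cone(double track_eta, double track_phi,
             double jet_eta, double jet_phi, double cone_size) {
    return delta_r(track_eta, track_phi, jet_eta, jet_phi) < cone_size;
}
```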

b-jet trigger: a complex case.

[Sketch: a b bbar event; primary and secondary tracks, the jet direction, the beam line, and the calorimeter region of interest (RoI) around the jet.]

Page 16: Regional reco example: CMS inclusive b tagging

b-tag efficiency versus background efficiency at low luminosity, offline and at the HLT: the offline-HLT difference is negligible.

Online inclusive b tagging at the HLT is possible, provided alignment is under control ...

Page 17: Example: Tracker @ HLT, exclusive Bs

Efficiency and rate summary:

  Lvl-1   HLT     Global   Events / 10 fb^-1   Trigger rate
  15.2%   33.5%   5.1%     47                  < 1.7 Hz

[Plot: Bs mass peak at the HLT and with the full tracker; mass resolutions sigma = 46 MeV and sigma = 74 MeV.]

- Lvl-1: 2 muons with pT > 3 GeV, efficiency = 15.2%.
- HLT strategy:
  - Select pixel seeds with pT > 4 GeV in an eta-phi region around the trigger muons.
  - Conditional tracking: stop if pT < 4 GeV/c at 5 sigma, or Nhit = 6, or sigma(pT)/pT < 0.02.
  - Bs reconstruction only if exactly 2 track candidates with opposite charge fall in a 150 MeV mass window.
  - Vertex quality cut (rendered as "Vertex20" in the source) and dr > 150 microns.

Average CPU time = 240 ms on a 1 GHz CPU.
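The stop conditions can be sketched as a predicate on the trajectory state. This is my reading of the slide (in particular, "@ 5 sigma" is interpreted as "pT below 4 GeV even allowing a 5-sigma upward fluctuation"); all names are invented:

```cpp
// Conditional-tracking stop rule: building halts as soon as the
// candidate is clearly soft, has enough hits, or is measured precisely
// enough. Purely illustrative, not CMS code.
struct TrajectoryState {
    double pt;        // current pT estimate [GeV]
    double sigma_pt;  // its uncertainty [GeV]
    int    nhits;     // hits collected so far
};

bool stop_building(const TrajectoryState& s) {
    return (s.pt + 5.0 * s.sigma_pt) < 4.0  // pT < 4 GeV/c at 5 sigma
        || s.nhits >= 6                     // Nhit = 6 reached
        || (s.sigma_pt / s.pt) < 0.02;      // sigma(pT)/pT < 0.02
}
```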

Page 18: Physics & Data Quality Monitoring: CMS (I)

DQM principle: use the same code to serve different customers.

[Diagram: HLT inputs, physics objects, triggers, etc. feed monitoring producers running on the filter-farm CPUs; the DQM infrastructure (collectors/servers) distributes their output to monitoring consumers (clients).]

Page 19: Physics & Data Quality Monitoring: CMS (II)

[Diagram: a "DQM" client receives monitoring information, performs comparison-to-reference and collation of similar objects, uses database tools (configuration, reference objects, historic plots, etc.), and emits "alarm" or "system ok".]

- Clear separation of the creation of monitoring information from its collection and processing.
- Used by all CMS detectors.
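The "comparison-to-reference" step can be sketched as a bin-by-bin check that raises an alarm on deviation (entirely schematic; names invented):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Compare a monitored histogram to a reference, bin by bin; any bin
// deviating by more than the tolerance (or a shape mismatch) raises
// an alarm, otherwise the system is declared OK.
enum class DQMStatus { SystemOk, Alarm };

DQMStatus compare_to_reference(const std::vector<double>& monitored,
                               const std::vector<double>& reference,
                               double tolerance) {
    if (monitored.size() != reference.size()) return DQMStatus::Alarm;
    for (std::size_t i = 0; i < monitored.size(); ++i)
        if (std::abs(monitored[i] - reference[i]) > tolerance)
            return DQMStatus::Alarm;
    return DQMStatus::SystemOk;
}
```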

Page 20: DQM: CMS Web and Qt Interface (III)

- "Monitoring producer" (and collector): at CERN.
- "Monitoring consumers" (clients): one at CERN, one in Florida (US).
- The screenshot shows a web browser running in a Florida office: live cosmic test data from the end-cap muon detector, and cosmic test data from the calorimeter (read from a file).

Page 21: Physics & Data Quality Monitoring: ATLAS (IV)

Page 22: Calibration and Alignment

Page 23: Calibration & Alignment (I)

Key part of commissioning activities:
- Dedicated calibration streams are part of the HLT output (e.g. the calibration stream in ATLAS, the express-line in CMS; different names and groupings, same content).

What needs to be put in place:
- Calibration procedure: what, in which order, when, how.
- Calibration "closed loop" (reconstruct, calibrate, re-reconstruct, re-calibrate, ...):
  - conditions data reading/writing/iteration;
  - reconstruction using the conditions database.

What is happening:
- Procedures defined in many cases; still not "final", but understanding is improving.
- Exercising conditions-database access and distribution infrastructure:
  - with the COOL (ATLAS & LHCb) conditions database, realistic data volumes and routine use in reconstruction;
  - in a distributed environment, with a true distributed conditions-DB infrastructure.
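The closed loop can be sketched as a fixed-point iteration on a single toy constant (everything here is invented for illustration; a real loop iterates over full reconstruction passes):

```cpp
#include <cmath>

// Toy closed loop: "reconstruct" with the current constant, derive a
// correction from the result, apply it, and repeat until the correction
// is consistent with unity. The "detector" is a single gain factor.
double calibrate(double true_gain, double tolerance = 1e-6) {
    double estimate = 1.0;  // start from the nominal constant
    for (int pass = 0; pass < 100; ++pass) {
        double reconstructed = true_gain / estimate;  // "reconstruct"
        double correction = reconstructed;            // "calibrate"
        if (std::abs(correction - 1.0) < tolerance) break;
        estimate *= correction;                       // "re-calibrate"
    }
    return estimate;
}
```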

Page 24: Calibration/Alignment: LHCb (II)

Misalignments are applied through the detector structure:
- "Interesting" detector elements have access to a misalignment matrix.
- A misalignment represents the change from nominal alignment in the reference frame of the detector element, i.e. relative to its parent detector element.

Runtime misalignments were added to detector components by extending the LHCb detector-description framework.

Misalignments are tied in to the Conditions Database framework:
- to allow both automatic runtime updating and propagation of changes, plus versioning and time dependence of the alignment parameters.

The functionality was tested within the LHCb reconstruction chain:
- LHCb subdetectors are using it to investigate detector alignment procedures and strategies, systematic effects, etc.

The extension is a non-intrusive enhancement:
- it respects the design principles of the LHCb detector-description suite.
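A toy one-dimensional sketch of parent-relative misalignment (invented structures, not the LHCb framework): each element stores its nominal offset and a misalignment delta in its parent's frame, and the global position is the sum along the parent chain:

```cpp
// One-dimensional stand-in for a transform hierarchy: real alignment
// composes 3-D transformation matrices, here reduced to offsets.
struct Element {
    const Element* parent;  // nullptr for the top-level element
    double nominal;         // nominal offset in the parent frame
    double delta;           // misalignment, also in the parent frame
};

double global_position(const Element& e) {
    double pos = e.nominal + e.delta;
    for (const Element* p = e.parent; p != nullptr; p = p->parent)
        pos += p->nominal + p->delta;
    return pos;
}
```

Because the delta lives in the parent frame, moving a parent automatically carries all of its children along, which is exactly the property the slide describes.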

Page 25: Calibration & Alignment: CMS Mental Model (IV)

The EventSetup provides a unified access mechanism for non-Event data: a "snapshot" of the detector at an instant in time. A Record holds data sharing the same interval of validity.

Not a new idea: it has been used by the CLEO experiment since 1998.

Page 26: Calibration & Alignment: CMS EventSetup Components

Components do the work of actually creating/reading the data. The EventSetup supports two types of dynamically loaded components:
- ESSource:
  - reads data from disk;
  - sets the 'interval of validity' for the data in a Record;
  - e.g. reads calibration information from a database for a particular run range.
- ESProducer:
  - creates data by running an algorithm;
  - obtains the data needed by the algorithm from Records in the EventSetup;
  - e.g. creates the tracking geometry by combining alignment shifts with the perfect positioning of material.

Page 27: Calibration & Alignment: CMS EventSetup Data Retrieval

To a user, the EventSetup appears to have all its data loaded. To avoid unnecessary computation, data is actually retrieved on the first request.

NOTE: an EDProducer is an Event module.
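A toy sketch of the Record/IOV idea with lazy, cached retrieval (deliberately not the CMSSW API; all names invented):

```cpp
#include <functional>
#include <map>
#include <string>
#include <utility>

// A Record holds payloads that share one interval of validity (IOV).
// Producers are registered but only run on the first get(), after which
// the value is cached: to the caller the data "appears loaded".
struct IOV { unsigned long first_run, last_run; };

class Record {
public:
    explicit Record(IOV iov) : iov_(iov) {}

    bool valid_for(unsigned long run) const {
        return run >= iov_.first_run && run <= iov_.last_run;
    }

    // Register a producer; nothing runs yet (lazy).
    void put(const std::string& key, std::function<double()> produce) {
        producers_[key] = std::move(produce);
    }

    // First call runs the producer; later calls return the cached value.
    double get(const std::string& key) {
        auto cached = cache_.find(key);
        if (cached != cache_.end()) return cached->second;
        double value = producers_.at(key)();
        cache_[key] = value;
        return value;
    }

private:
    IOV iov_;
    std::map<std::string, std::function<double()>> producers_;
    std::map<std::string, double> cache_;
};
```

A framework would pick the Record whose IOV matches the current run, so reconstruction code never handles validity intervals explicitly.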

Page 28: Calibration/Alignment: Data Flow

[Diagram: calibration & alignment data flow. The construction DB, equipment-management DB and online configuration DB feed the online condition DB and the Online Master Data Storage. A reconstruction-conditions data set is created and "poolified" into the Offline Reconstruction Conditions DB, with an ONline copy at P5 for the HLT and OFfline copies for Tier-0 and the GRID (replicated offline condition DBs, hosted in CERN building 513); old conditions are retained.]

Page 29: Calibration & Alignment (II)

Many open questions still:
- Inclusion in simulation; to what extent?
  - Geometry description and use of the conditions DB in distributed simulation and digitisation.
- Management:
  - Organisation and bookkeeping (run-number ranges, production system, ...).
  - How do we ensure all the conditions data for simulation is available with the right IOVs?
  - What about defaults for 'private' simulations?
- Reconstruction:
  - Ability to handle time-varying calibration.
  - Asymptotically: dynamic replication (rapidly propagate new constants) to support the closed loop and 'limited time' exercises.
    - Tier-0 delays: a maximum of ~4-5 days (!)
- Calibration algorithms:
  - Introduction of realism: misbehaving and dead channels; global calibrations (E/p); full data size; ESD/RECO input vs. RAW.

Page 30: Analysis: Physics Tools & Visualization

Page 31: Analysis (introduction)

Common understanding: early analysis will run off the RECO/ESD format.
- RECO/ESD (ATLAS/CMS): ~0.25-0.5 MB/event; ALICE/LHCb: ~0.04 MB/event.
- The reconstructed quantities, with frequent reference to the RAW data, at least until a basic understanding of the detector, its response and the software is in place.

Asymptotically, work off the Analysis Object Data (AOD):
- "MiniDST", for the youngsters in the audience.
- A reduction by a factor of ~5 w.r.t. the RECO/ESD format.
- Crucial: the definition of the AOD (what's in it) and its functionality.
- Prototypes exist in most cases; sizes and functionality are not within spec yet.
- One ~open issue: is there a need for a TAG format (a 1 kB summary)? E.g. ATLAS has one, in a database; CMS does not.

Page 32: Analysis: Data Tiers Example

CMS plans to implement a hierarchy of data tiers:
- RAW data: as from the detector (CMS: ~1.5 MB/event).
- RECO: the objects created by reconstruction (CMS: ~250 kB/event).
- Full Event: the previous two together, RAW + RECO.
- AOD: again a subset of the previous, sufficient for the large majority of "standard" physics analyses (CMS: ~50 kB/event).
  - Contains tracks, vertices, etc., and in general enough information to (for example) apply a different b tagging.
  - Can contain very partial hit-level information.
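The tier sizes imply the reduction factors quoted elsewhere in the talk; a trivial sketch with the slide's round numbers (constant names invented):

```cpp
// Per-event sizes of the CMS data tiers, in kB, as quoted on the slide.
constexpr double kRawKB  = 1500.0;
constexpr double kRecoKB = 250.0;
constexpr double kAodKB  = 50.0;

// Size reduction factor between two tiers.
constexpr double reduction(double from_kb, double to_kb) {
    return from_kb / to_kb;
}

// Total dataset size in GB for a given number of events.
constexpr double dataset_gb(double events, double kb_per_event) {
    return events * kb_per_event / 1.0e6;
}
```

For example, RECO to AOD is the factor ~5 mentioned on the introduction slide, and 10 million AOD events occupy about 500 GB.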

Page 33: Analysis "flow": an example

[Diagram: analysis flow. At Tier 0/Tier 1: RECO/AOD datasets. At Tier 1/Tier 2: signal and background AOD datasets, plus pre-candidates and user data. At Tier 2: AOD and candidate sets. Example numbers: ~500 GB at Tier 2; ~50 GB, small enough for a laptop(?).]

Page 34: User analysis: a brief history

1980s: mainframes, batch jobs, histograms back. Painful.

Late 1980s, early 1990s: PAW arrives.
- NTUPLEs bring physics to the masses.
- Workstations with "large" disks (holding data locally) arrive; looping over data and remaking plots becomes easy.

Firmly in the 1990s: laptops arrive.
- Physics-in-flight; interactive physics, in fact.

Late 1990s: ROOT arrives.
- All you could do before and more, in C++ this time.
- FORTRAN is still around. The "ROOT-TUPLE" is born.
- Side promise: if one inherits all one owns from TObject, reconstruction and analysis form a continuum.

2000s: two categories of analysis physicists: those who can only work off the ROOT-tuple, and those who can create/modify it.

Mid-2000s: WiFi arrives; physics-in-meeting.

Page 35: Analysis (I)

All-ROOT: ALICE.
- The event model has been improving.
- Event-level Tag DB deployed, in collaboration with ROOT.
- Batch distributed analysis being deployed.
- Interactive analysis prototype.
- New prototype for visualization.

Page 36: Analysis a la ALICE

Page 37: Analysis a la CMS

Goal: one format, one program for all (reconstruction and analysis).

1. Bare ROOT: open the POOL catalog and inspect the data objects; store "simple" structures that are browsable by plain ROOT.

2. CMSSW-Lite: load a small number of libraries; no access to calibrations, the magnetic-field map, etc.:

   gSystem->Load("libPhysicsToolsFWLite");
   AutoLibraryLoader::enable();
   TFile f("reco.root");
   Events.Draw("Tracks.phi()-TrackExtra.outerPhi():Tracks.pt()", "Tracks.pt()<10", "box");

3. Full CMSSW: full access to calibrations and full availability of libraries. Used mainly to produce reconstructed objects, from RAW data to the RECO or AOD tiers.
   - Same jet-finding and muon-matching code, same cluster corrections.
   - The issue is which data tier is available (RAW, RECO, AOD).

Page 38: Event Display (I)

[Event display: a cosmic muon in SX5 crossing a drift-tube chamber and the HCAL. CMS.]

Page 39: Event Display (II)

[ATLAS event display.]

Page 40: Analysis a la CMS: Particle Candidates

Establish a common "language" for analysis:
- Kinematics, navigation among constituents (i.e. daughters) and components (i.e. reco/generator/... information).

Provide a common interface to many physics tools:
- Constrained fits, combiners, ...
- It is also a standard intermediate stage of many analysis workflows.

Speed up the learning curve for newcomers:
- Learning by examples, web pages, ...
- Examples must be valid for all physics channels.
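The "common language" idea (one candidate type with a four-momentum and daughter navigation) can be sketched as follows; the names are invented and this is not the CMS candidate model:

```cpp
#include <vector>

// A minimal candidate: four-momentum plus pointers to daughters, so a
// composite (e.g. a dimuon) is navigated the same way as any leaf object.
struct P4 { double px, py, pz, e; };

struct Candidate {
    P4 p4;
    std::vector<const Candidate*> daughters;
};

// Combine two candidates into a composite: sum the four-momenta and
// keep references to the inputs for later navigation.
Candidate combine(const Candidate& a, const Candidate& b) {
    Candidate c;
    c.p4 = {a.p4.px + b.p4.px, a.p4.py + b.p4.py,
            a.p4.pz + b.p4.pz, a.p4.e + b.p4.e};
    c.daughters = {&a, &b};
    return c;
}
```

Because every tool sees the same interface, a selector or constrained fit written once works for any physics channel.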

Page 41: Analysis a la CMS: Particle Candidates for Jets

[Diagram: CaloTowers (t), Muons (m) and Electrons (e) are wrapped as JetConstituent candidates (c), which are clustered into Jet candidates (j). The candidates contain updated kinematics, so energy corrections can be applied on top of RECO, and further energy corrections later.]

Page 42: Analysis a la CMS: Common analysis modules

Common particle candidates are provided:
- Composite: by value, by reference.
- "Leaf": from Track, Muon, Electron, Photon, CaloTower.

Common functionalities can become common building blocks:
- Selectors (e, mu, ...)
- Combiner modules
- Constrained fits
- Boosters
- ...

Page 43: How will analysis actually be done?

It is not possible to enforce an analysis model:
- TAGs may turn out to be very useful and widely utilized; they may also turn out to be used by only a few people.

Many physicists will try to use what their experience naturally dictates to them. At some stage, users may want to dump ntuples; for sure *some* users will do this anyway.

The success of any model will depend on the advantages perceived by the analyzers. Extremely important:
- Communication: explain the advantages of modularity.
- Help users: make the transition process smooth.

Page 44: Software Deployment

Page 45: Software Development Tools (I)

Code management; ATLAS example: approximately 1124 CVS modules (packages):
- ~152 container packages: the container hierarchy is used for commit and tag management.
- ~900 leaf packages: contain source code or act as glue to external software.
- ~70 glue/interface packages: act as proxies for external packages.

Code distribution:
- Different layers of builds (nightly, weekly, developers', major releases, ...).

Testing and validation:
- A very complex process.
- The ultimate test: the "challenges".

Page 46: Software Development Tools (II)

The release process is very similar in ATLAS and CMS:
- Main problem: the large number of developers and their geographical diversity.
- The two use different tools for configuration management and builds.
- Quite some commonality in process (and some tools):
  - nightlies (NICOS);
  - collecting/controlling tags.

Page 47: Documentation (I)

Everyone says it's important; nobody usually does it.
- A really nice example from ATLAS: the ATLAS Workbook. Worth copying ...

See next slide ...

Page 48: Documentation (II)

CMS WorkBook:
https://twiki.cern.ch/twiki/bin/view/CMS/WorkBook

Reference Manual:
http://cmsdoc.cern.ch/Releases/CMSSW/latest_nightly/doc/html


What’s left to do


Injecting additional realism

Impact on detector performance/physics; e.g. ATLAS:

– cables and services from latest engineering drawings, barrel/end-cap cracks from installation

– realistic B-field map taking into account non-symmetric coil placement in the cavern (5-10 mm from survey)

– include detector "egg-shapes" if relevant (e.g. Tilecal elliptical shape, if it has an impact on the B-field…)

– displace detector (macro)-pieces to describe their actual position after integration and installation (e.g. ECAL barrel axis 2 mm below solenoid axis inside the common cryostat); break symmetries and degeneracy in Detector Description and Simulation

– mis-align detector modules/chambers inside macro-pieces

– include chamber deformations, sagging of wires and calorimeter plates, HV problems, etc. (likely at digitization/reconstruction level)

Technically very challenging for the software…


Real commissioning

Learning a lot from testbeams (e.g. the ATLAS integrated test-beam) and integrated tests (e.g. the CMS Magnet Test / Cosmic Challenge)

– But nothing like the real thing

Calibration/alignment challenges are a crucial step forward

– All experiments have some kind of system-wide test planned for mid- and end-2006

Detector synchronization

– Procedures (taking into account LHC beam structure and luminosity) being put in place; still a lot to do

Preparing for real analysis

– Currently: far from hundreds of users accessing (or trying to access) data samples


ATLAS integrated testbeam

All ATLAS sub-detectors (and the LVL1 trigger) integrated and run together with common DAQ and monitoring, "final" electronics, slow control, etc. Gained a lot of global operation experience during a ~6 month run.

[Figure: Geant4 simulation of the test-beam set-up, with x/y/z axes]


Cosmics Data

[Event displays: cosmic-ray data in ATLAS and CMS; tower energies ~2.5 GeV]


"Hardware Alignment System"

Four important ingredients:
• Internal muon alignment, barrel
• Internal muon alignment, endcap
• Internal tracker alignment
• Alignment of the muon system w.r.t. the tracker (Link System)

Specifications:
• Monitor tracker support structures at ~10 μm
• Monitor muon support structures at ~100 μm
• Monitor the muon system w.r.t. the tracker at ~100 μm

Readiness for: Muon @ MTCC; Tracker @ 25% Test

Magnet Test and Tracker Integration


Summary/Outlook


Summary

Overall shape: OK
– Common software in place
– Much of the experiments' software either complete, or nearly fully-functional prototypes in place

Difference between theory and practice: working on it, but still difficult to predict conditions at the time

A number of important tests/milestones on the way
– E.g. the calibration challenges, in parallel with Grid-related milestones: major sanity checks

Deployment has begun in earnest
– First pictures from detectors read out and reconstructed… at least locally

Performance (sizes, CPU, etc.): in progress


Outlook

Still a long way to go before some of the more complicated analyses are possible:

– Example from SUSY: if sparticles are produced with high cross-section, gauginos are produced in their decays, e.g.

• q̃L → χ̃2⁰ qL (SUGRA P5)
• g̃ → q̃q → χ̃2⁰ qq (GMSB G1a)

– Complex signatures/cascades:
(1) χ̃2⁰ → χ̃1⁰ h (dominates if allowed)
(2) χ̃2⁰ → χ̃1⁰ ℓ⁺ℓ⁻ or χ̃2⁰ → ℓ̃ℓ

– Has it all: (multi-)leptons, jets, missing ET, bb…
– This kind of study: in numerous yellow reports
• Complex signal; decomposition…

In between: readout, calib/align, HLT, reconstruction, AOD, measurement of the Standard Model…

– But we're getting ever closer!


More Material on LHC Experiments' SW

LCG Application Area: http://lcgapp.cern.ch/

ALICE Home Page: http://aliceinfo.cern.ch/index.html
– Offline Home Page: http://aliceinfo.cern.ch/Offline

ATLAS Home Page: http://atlas.ch/
– Offline Home Page: https://uimon.cern.ch/twiki/bin/view/Atlas/AtlasComputing


More Material on LHC Experiments' SW

CMS Home Page: http://cms.cern.ch/
– Offline Home Page: http://cmsdoc.cern.ch/cms/cpt/Software/html/General/

LHCb Home Page: http://lhcb.web.cern.ch/lhcb/
– Offline Home Page: http://lhcb-comp.web.cern.ch/lhcb-comp/


End Lecture 2


Backup on Distributed Analysis


Classical Parallel Data Analysis

[Diagram: a query runs myAna.C over files from a catalog; the data files are split into jobs submitted to batch-farm queues via a manager; job outputs are merged in storage for the final analysis]

– "Static" use of resources; jobs frozen, 1 job / CPU
– "Manual" splitting and merging
– Limited monitoring (only at the end of each single job)
– Possible large tail effects
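The "manual" splitting step above can be sketched in a few lines. This is an illustrative sketch, not any experiment's actual tool; the names `FileChunk` and `splitJobs` are made up for the example:

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// One batch job per fixed-size chunk of the file catalog; the outputs are
// then merged by hand at the end. Illustrative only.
struct FileChunk {
    std::vector<std::string> files;
};

std::vector<FileChunk> splitJobs(const std::vector<std::string>& catalog,
                                 std::size_t filesPerJob) {
    std::vector<FileChunk> jobs;
    for (std::size_t i = 0; i < catalog.size(); i += filesPerJob) {
        FileChunk chunk;
        for (std::size_t j = i; j < std::min(i + filesPerJob, catalog.size()); ++j)
            chunk.files.push_back(catalog[j]);
        jobs.push_back(chunk);  // frozen job: 1 job / CPU slot, no rebalancing
    }
    return jobs;
}
```

Because each chunk is frozen at submission time, one slow job delays the final merge: the "large tail effect" noted above.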


Interactive Parallel Data Analysis

[Diagram: a query (data file list + myAna.C) goes to the MASTER/scheduler of an interactive farm; workers read files from storage; merged feedbacks and merged final outputs return in real time]

– Farm perceived as an extension of the local PC
– More dynamic use of resources
– Automated splitting and merging
– Real-time feedback
– Much better control of tail effects
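By contrast, the dynamic behaviour of the interactive farm can be sketched as a master handing out small packets on demand, a pull model in the spirit of PROOF. `PacketizerSketch` is a hypothetical name, not the real scheduler:

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <optional>
#include <utility>

// The master keeps a cursor over the event range; an idle worker pulls the
// next small packet, so fast workers naturally process more packets and the
// tail effect shrinks. Illustrative sketch only.
class PacketizerSketch {
public:
    PacketizerSketch(std::int64_t nEvents, std::int64_t packetSize)
        : next_(0), total_(nEvents), size_(packetSize) {}

    // Returns {firstEvent, count} for the next packet, or nothing when done.
    std::optional<std::pair<std::int64_t, std::int64_t>> nextPacket() {
        if (next_ >= total_) return std::nullopt;
        const std::int64_t first = next_;
        const std::int64_t count = std::min(size_, total_ - next_);
        next_ += count;
        return std::make_pair(first, count);
    }

private:
    std::int64_t next_, total_, size_;
};
```

Automated splitting then falls out for free: packet boundaries are chosen by the master, never by the user.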


Batch-oriented D.A. Systems

GANGA: cooperation between LHCb and ATLAS
– Designed for analysis on the Grid (T1)
– Backend type in job definition: local, local batch, Grid (via DIRAC)
– Facilitates bookkeeping and job tracking

DIAL (ATLAS: Distributed Interactive Analysis on Large Datasets)
– Web-service framework giving a common interface to a large range of batch and workload management systems
– Insulates users from splitting, submission, merging and error recovery
– Performance for a reference dataset (see later):
• 1872 files, 100 evts each, replicated 6 times
• Reference atlasdev analysis (transformation)

CRAB (CMS)
– Python tool to facilitate creation of a large number of user analysis jobs
– Efficient access to data, hiding middleware complications
– Manages submission, tracking, monitoring and harvesting
– Used for the Physics TDR and SC3 (WLCG, OSG)


GANGA

U. Egede


CRAB

M. Corvo

[Plots: some statistics, covering the most accessed sites since July '05 and the CRAB jobs submitted so far]


Submission systems

ATLAS
– ProdSys: provides seamless access to all ATLAS Grid resources
• Emphasis on the batch model; interactive solutions are difficult to realize on top of the current middleware layer
– PanDA: Production and Distributed Analysis system (OSG)
• Job/executor interface, task buffer, brokerage, dispatcher, data service, job scheduler, logging/monitoring system
• Very recent project developed by the ATLAS U.S. team

CMS
– BOSS: Batch Object Submission System
• Tool for batch job submission, real-time monitoring and bookkeeping

LHCb
– DIRAC: workload and data management system
• Pull scheduling paradigm via pilot-agent technology
• Multi-threaded mode shown to reduce job start times
– Pilot agents request several jobs from the same user and run two jobs in parallel


ATLAS ProdSys

[Diagram: ProdDB feeding executors (Dulcinea, Lexor, CondorG, PanDA) that submit to CEs through resource brokers (RB)]

D. Liko


BOSS Workflow

[Diagram: boss submit / boss query / boss kill commands talk to the BOSS DB and the BOSS scheduler, which dispatches wrapped jobs to farm nodes]

The user specifies the job parameters, including:
– Executable name
– Executable type (turns on customized monitoring)
– Output files to retrieve (for sites without a shared file system or Grid)

The user tells BOSS to submit jobs specifying the scheduler, i.e. PBS, LSF, SGE, Condor, LCG, gLite, etc. A job consists of the job wrapper, the real-time monitoring service and the user's executable.

From "Evolution of BOSS" by Wakefield [240]
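The scheduler-agnostic submission described above can be sketched as a thin mapping from scheduler name to submit command, wrapped around the common job wrapper. The command strings below are assumptions for illustration, not BOSS's actual templates:

```cpp
#include <cassert>
#include <map>
#include <stdexcept>
#include <string>

// Build the submit command for the chosen scheduler. The real tool plugs in
// a wrapper that also starts the real-time monitoring service.
std::string submitCommand(const std::string& scheduler,
                          const std::string& wrapperScript) {
    static const std::map<std::string, std::string> submitters = {
        {"PBS", "qsub"},
        {"LSF", "bsub"},
        {"Condor", "condor_submit"},
    };
    const auto it = submitters.find(scheduler);
    if (it == submitters.end())
        throw std::runtime_error("unknown scheduler: " + scheduler);
    return it->second + " " + wrapperScript;
}
```

Keeping the scheduler behind one lookup is what lets the same user workflow target PBS, LSF, Condor or a Grid backend unchanged.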


New data flow

[Diagram: the BOSS client on the user interface submits or controls jobs through a local or Grid scheduler and gets the job running status; on the worker node, the BOSS job wrapper runs the user process while the BOSS real-time updater sets job logging info on the real-time BOSS DB server (possibly via proxy) and pops job monitoring info; a BOSS journal handles job control, logging and file I/O control; output files are retrieved into the BOSS DB]

S. Wakefield


DIRAC

CHEP 2006 (13th-17th February 2006), Mumbai, India

Introduction to DIRAC

The DIRAC Workload & Data Management System (WMS) is made up of central services and distributed agents

Realizes the PULL scheduling paradigm

Agents request jobs whenever the corresponding resource is available; the execution environment is checked before a job is delivered to the WN

A Service Oriented Architecture masks the underlying complexity

S. K. Paterson
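The pull paradigm can be reduced to one rule: a job leaves the central queue only when an agent asks for it and the execution environment checks out. A minimal sketch, with all names hypothetical; real DIRAC uses distributed services, not an in-memory queue:

```cpp
#include <cassert>
#include <deque>
#include <optional>
#include <string>

struct JobQueueSketch {
    std::deque<std::string> pending;

    // Agent-side pull: nothing is ever pushed to a worker. The execution
    // environment is verified BEFORE the job is handed over.
    std::optional<std::string> requestJob(bool slotFree, bool envOk) {
        if (!slotFree || !envOk || pending.empty())
            return std::nullopt;
        std::string job = pending.front();
        pending.pop_front();
        return job;
    }
};
```

Because a broken worker never passes the environment check, it simply pulls nothing, rather than silently failing jobs that were pushed to it.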


Back-up Slides


TTree Indices

Used to connect friend TTrees. Extended for TChains:
– A TChain re-uses its TTrees' indexes
– Requires the TTrees to be sorted

// Create index using Run and Event numbers
tree.BuildIndex("Run","Event");
// Read entry for Run=1234 and Event=56789
tree.GetEntryWithIndex(1234,56789);

[Tables: "Main Tree" and "User Tree" run/event columns, before and after indexing; the index lets main-tree entries be matched to user-tree entries by (run, event) even when the two trees are ordered differently]


pp Cross Section and Pile-up

Interactions/s:
• Lum = 10^34 cm^-2 s^-1 = 10^7 mb^-1 Hz
• σ_inel(pp) = 70 mb
• Interaction rate R = 7×10^8 Hz

Events / beam crossing:
• Δt = 25 ns = 2.5×10^-8 s
• Interactions/crossing = 17.5

Not all proton bunches are full:
• Approximately 4 out of 5 are full (2835 of 3564)
• Interactions per "active" crossing = 17.5 × 3564/2835 ≈ 22

Operating conditions:
1) A "good" event containing a Higgs decay, plus
2) ~20 extra "bad" (minimum-bias) interactions

[Event display: H → ZZ* → 2e2μ, showing all tracks with pT > 1 GeV]
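The pile-up arithmetic above can be checked in a few lines (design luminosity, σ_inel = 70 mb, 25 ns bunch spacing, 2835 filled bunch slots out of 3564); note the filled-crossing product comes out at ~22:

```cpp
#include <cassert>
#include <cmath>

// Design luminosity expressed in mb^-1 Hz, times the inelastic cross
// section, gives the raw interaction rate.
double interactionRateHz() {
    const double lumi_mb_hz = 1e7;  // 1e34 cm^-2 s^-1
    const double sigma_mb = 70.0;   // inelastic pp cross section
    return lumi_mb_hz * sigma_mb;   // 7e8 Hz
}

// Mean interactions per 25 ns bunch crossing.
double interactionsPerCrossing() {
    return interactionRateHz() * 25e-9;
}

// Only ~2835 of the 3564 bunch slots are filled, so pile-up per *filled*
// crossing is correspondingly higher.
double interactionsPerFilledCrossing() {
    return interactionsPerCrossing() * 3564.0 / 2835.0;
}
```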


Physics Program

Tracking detectors are essential not only for tracking but also for the trigger, particle identification and energy flow in the full energy range

S.M. Higgs search; MSSM Higgs bosons:
– A, H, H± cross-section ∝ tan²β
– Best sensitivity from A/H → ττ and H± → τν
– m_h < 135 GeV; m_A ≈ m_H ≈ m_H± at large m_A


Physics Program - 2

• Search for SUperSYmmetric (SUSY) particles and New Physics

• Heavy Flavour and precision physics: CP violation of B hadrons; rare B decays; top mass & couplings, W mass & couplings

• Heavy Ions physics


Track Parameter Accuracy vs Number of Hits

[Plots: impact parameter resolution and transverse momentum resolution for the full tracker; timing vs number of reconstructed hits]

– Tracking time is proportional to the number of hits
– Good efficiency/ghost rate and resolution with just 5 hits


Summary

Main pp experiments:
– ATLAS has a continuous tracker (TRT), CMS does not
• That will probably be seen in the CPU time per event!
– Inner tracker
• CMS has a better resolution (twice the magnetic field)
• All-silicon detector with different technologies (pixels and μ-strips)
• ATLAS: different detectors and technologies
– Muon tracker
• ATLAS has better resolution (air-core vs. iron-core solution)


Track Reconstruction Performance

Reconstruction of charged tracks with the Tracker (trajectory = helix) from the inside (pixels) out (μ-strip detectors). Pixel occupancy ~10^-4.

For lower-pT tracks multiple scattering becomes significant, and the dependence reflects the amount of material traversed by the tracks and the lever-arm effect


RDBMS Components (II)

Technologies supported:
– Oracle
• Fully implements the CORAL API
• Based on OCI version 10g
– MySQL
• Best suited where a low level of administration is required
• Based on the native C API version 4.0 (currently migrating to 5.0)
– SQLite
• File-based, embedded SQL engine
• No administration, very lightweight!
• A means of transferring small amounts of relational data?
– Frontier
• Squid caches between the client and an Oracle database server


Hadronic Calorimeter (HCAL)

HCAL studies on energy resolution and linearity, the e/π ratio, and shower profiles have been instrumental in validating Geant4 hadronic physics:
– Comparisons of single-particle measurements in test beams (2002-2004, different HCAL modules preceded by an ECAL prototype; beams of π, e and μ over a large energy range) with the G4 hadronic physics parametric (LHEP) and microscopic (QGSP) models
– Energy resolution and response linearity as a function of incident energy are in good agreement with the data, within the latter's large systematic uncertainties
– Transverse and longitudinal shower profiles, studied in the 1996 and 2004 test beams: showers predicted by G4 are narrower than those by G3
– Showers predicted by the QGSP (v2.7) list are shorter than those by the LHEP (v3.6) list, with the LHEP predictions closer to those from G3/GHEISHA

[Plots: test beam 2004 results]


New in ALICE Reconstruction

Tracking in a high-density environment; use the TRD detector for reconstruction

Do "local reconstruction":
– Excellent space resolution for high-momentum tracks → much improved momentum resolution
– Works in a high-density environment
– Biggest improvement due to the correct error parameterization

#394 – M. Ivanov, Track reconstruction in high density environment

[Plots: older algorithms vs. new]

#385 – M. Ivanov, Track reconstruction algorithms for the ALICE High-Level Trigger

Fast Hough-transform TPC tracking:
– Very good efficiency, stable to dN/dy ~ 8k
– Fast: ~5 s for a central Pb-Pb event with dN/dy ~ 4000
– pT resolution worsens linearly with pT

ITS tracking:
– Tracks efficiently propagated to the ITS
– Track parameter resolution greatly improved


ALICE High Level Trigger

Data rate from central Pb-Pb collisions (dN/dy ~ 2000-4000):
200 Hz × (30-60 MB) = 6-12 GB/s

Max mass-storage bandwidth: ~1.2 GB/s

The goal of HLT is to reduce the data rate without biasing important physics information:
– Event triggering
– "Regions of interest"
– Advanced data compression

Requirements:
– Fast and robust online reconstruction
– Sufficient tracking efficiency and resolution
– Fast analysis of important physics observables

[Diagram: Detectors → DAQ and HLT (12 GB/s in) → Mass Storage (1.2 GB/s out)]
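The rate mismatch above fixes the HLT's required rejection in one line; the GB event sizes are just the slide's 30-60 MB range rewritten:

```cpp
#include <cassert>
#include <cmath>

// Factor by which HLT must reduce the data volume so the output fits into
// the available mass-storage bandwidth.
double requiredReductionFactor(double rateHz, double eventSizeGB,
                               double storageGBperS) {
    return rateHz * eventSizeGB / storageGBperS;
}
```

For 200 Hz and 30-60 MB events against 1.2 GB/s, this gives a required reduction factor of 5-10.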


Muon Momentum Resolution

[Plot: crossover at ~70 GeV; below, the inner tracker dominates; above, the muon spectrometer dominates]


Timing Performance (example)

Timing of the reconstruction of ttH(bb) events on a 2.8 GHz Pentium 4. Results for the different steps in the reconstruction chain:
– Data preparation ~599 ms (POOL I/O, clustering, space-point formation)
– Track reconstruction ~357 ms
– Post-processing ~180 ms (primary vertex finding; particle creation 118 ms)
– Truth association + statistics ~530 ms (POOL I/O; about 60 ms for statistics…)

The new track reconstruction is similar in speed to the previous packages (our benchmark):
– iPatRec ~466 ms
– xKalman ~608 ms


Event Selection & Reconstruction

Interactions with the Reco framework:
1. Explicit scheduling: the steps needed to fulfil an operation are explicitly declared
2. Reco algorithms implemented as framework modules:
– Independent; communicate via the Event Store and have common abstract interfaces
– Different parameter sets can select different performance levels, via different algorithms or different parameters of the same algorithm

Data flow (data objects are central in this view):
Raw Data → unpack → Digis → run clustering → Clusters → run tracking → Tracks → …
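A toy version of such an explicitly scheduled module chain, with modules sharing only an abstract interface and communicating through an event store; the store is a string map purely for illustration:

```cpp
#include <cassert>
#include <map>
#include <string>

// Toy event store: product name -> product (a string stands in for data).
using EventStore = std::map<std::string, std::string>;

struct Module {
    virtual ~Module() = default;
    virtual void process(EventStore& evt) = 0;  // common abstract interface
};

struct Unpacker : Module {
    void process(EventStore& evt) override {
        evt["Digis"] = "digis(" + evt["RawData"] + ")";
    }
};
struct Clusterizer : Module {
    void process(EventStore& evt) override {
        evt["Clusters"] = "clusters(" + evt["Digis"] + ")";
    }
};
struct TrackFinder : Module {
    void process(EventStore& evt) override {
        evt["Tracks"] = "tracks(" + evt["Clusters"] + ")";
    }
};

// Explicit scheduling: the path is declared up front, not discovered.
void runPath(EventStore& evt) {
    Unpacker u;
    Clusterizer c;
    TrackFinder t;
    Module* path[] = {&u, &c, &t};
    for (Module* m : path) m->process(evt);
}
```

Swapping an algorithm or its parameters changes only which module instance is placed on the path; the neighbours see the same products in the store.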


Alignment


17 February 2006, Event Processing Applications summary

Track-based alignment in CMS

HIP (Hits & Impact Points) iterative method:
– To be used for CMS pixel alignment
– Basic idea: reconstruct tracks normally, then align individual detectors or composite structures
– Modules or alignment parameters can be free or fixed
– Rigid support structures can be aligned, as well as individual sensors
– Easily parallelizable method

#356 – T. Lampen, Track based alignment of composite detector structures
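The iteration at the heart of this approach can be illustrated in one dimension: shift a module by the mean residual of the hits assigned to it and repeat. This toy (`alignOneModule` is a made-up name) only shows the loop structure; the real method solves a small linear system per module:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// 1-D toy: three hits scattered around the module's true position; the
// module position estimate is updated by the mean residual each iteration.
double alignOneModule(double truePosition, double initialGuess, int iterations) {
    const std::vector<double> hits = {truePosition - 0.1, truePosition,
                                      truePosition + 0.1};
    double pos = initialGuess;
    for (int it = 0; it < iterations; ++it) {
        double meanResidual = 0.0;
        for (double h : hits) meanResidual += h - pos;
        meanResidual /= static_cast<double>(hits.size());
        pos += meanResidual;  // a parameter could also be held fixed here
    }
    return pos;
}
```

Because each module update only needs its own residuals, modules can be processed independently, which is what makes the method easily parallelizable.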


Variable ATLAS detector descriptions

Basic idea: have the possibility of building various ATLAS geometry layouts with every single version of the ATLAS software

The ATLAS geometry versioning system is based on hierarchical versioning of detector-description primary numbers stored in the ATLAS Geometry Database

To switch between different geometry layouts, it is enough to change a single parameter: the ATLAS top-level geometry tag

ATLAS geometry tags can be passed across job boundaries:
– Using persistent TagInfo objects
– Subsequent jobs can pick up the correct geometry configuration from the input file, bypassing manual configuration through job options

#67 – V. Tsulaia, Software Solutions for a Variable ATLAS Detector Description


COCOA

COCOA is a general-purpose alignment software package developed as a software-engineering project:
– The user describes the optical system in ASCII files
– COCOA reconstructs the unknown parameters and propagates the errors

COCOA has been stressed by years of use in CMS. Full CMS Link alignment system (2865 parameters):
– 25 minutes on an Athlon 1.3 GHz
– Memory: 590 MB (long double matrices)

Time and memory scale as (#params)²!
– Next challenge is to simulate the full CMS system (40k params)
– Methods under study

#321 – P. Arce, COCOA: General purpose software for simulation and reconstruction of optical alignment systems
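The quoted (#params)² law lets one extrapolate the 2865-parameter cost to the full 40k-parameter system; rough illustrative arithmetic only:

```cpp
#include <cassert>
#include <cmath>

// Scale a baseline cost (time or memory) quadratically in the number of
// alignment parameters.
double scaleQuadratic(double baselineCost, double nParamsOld,
                      double nParamsNew) {
    const double factor = nParamsNew / nParamsOld;
    return baselineCost * factor * factor;
}
```

Scaling 25 minutes and 590 MB by (40000/2865)² ≈ 195 suggests of order 3-4 days and over 100 GB, which is why alternative methods are under study.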


ALICE: Analysis Basic Concepts


Analysis Tools Nov 05 - Mar 06 (II)

Data tiers: modular RECO data products allow separation of a "core" component from an "extra" component. The "core" part is a natural candidate for the AOD.

[Diagram of track data tiers:]
– Tracks: kinematics (helix parameters)
– TracksExtra: track extrapolation, references to RecHits
– TracksHits: RecHits


Filter Farm architecture

[Diagram: Builder Units (BU 0 … BU 2n-1) feed "sub-farms" of Filter Units (FU) over GbE sub-farm data networks; each sub-farm has a Storage Manager and a DQM collector; events and monitoring flow to event/monitor consumers and a DQM manager; a Farm Manager and Run Control sit on the CS network; the MTCC "T0" is a ~1 TB disk server acting as hot buffer]

