Commissioning and Performance of the CMS High Level Trigger

Leonard Apanasevich, University of Illinois at Chicago
for the CMS collaboration
TIPP 2011, June 11, 2011
Introduction

Outline:
• Overview of the CMS Trigger System
• Trigger Commissioning in 2010
• Trigger Rates and Performance
• HLT Processing Times

The trigger is essential:
• At design L = 10^34 cm^-2 s^-1: ~1 GHz input interaction rate
• Event size ~1 MB (≈75M channels) → 1000 TB/s for a 1 GHz input rate
• Only ~300 MB/s is affordable
• An enormous rate reduction is necessary, but without losing interesting physics
The CMS Detector
CMS Trigger System
• The CMS trigger system is structured in two levels: Level-1 (L1) and the HLT
• The Level-1 trigger reduces the rate from 40 MHz to 100 kHz (max)
  – Custom-designed electronic boards and chips
• The High-Level Trigger further reduces the rate to O(200 Hz)
  – Software-based, using the full granularity of the CMS detector
  – Runs offline-quality algorithms on a ~5K-processor filter farm
[Diagram: |η| coverage of the trigger objects (electrons, photons, jets, MET, muons): 3<|η|<5, |η|<3, |η|<3, |η|<2.1, 0.9<|η|<2.4, |η|<1.2]
The Level-1 Trigger
• Runs synchronously at the LHC clock rate of 40 MHz
• Selects muons, electrons, photons, and jets
  – ET and location in the detector
  – Also Missing ET, Total ET, HT, and jet counts
• 128 Level-1 trigger bits; bits can be set using (up to 128) logical combinations of Level-1 objects
• Total decision latency: 3.2 μs

For details about the Level-1 system and its performance, see Thursday's talk by P. Klabbers.
High Level Trigger
• The HLT runs as a dedicated configuration of the CMS reconstruction software on a 672-node DAQ filter farm
  – Dual 4-core machines, 2.66 GHz, 16 GB RAM
  – 1 master / 7 event-processor processes per node
• With a nominal input rate of 100 kHz, running on 4704 processes in parallel, each process has an average of 47 ms available to:
  – read the input data
  – run all the trigger algorithms (~150 at the end of 2010; currently ~300)
  – take the final accept/reject decision
  – stream the data to the Storage Managers
• Nominal output rate: ~200 Hz
• For comparison, offline reconstruction takes ~5 s per event
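The 47 ms figure follows directly from the numbers above; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the HLT per-event time budget,
# using the farm and rate figures quoted on this slide.

input_rate_hz = 100_000      # nominal L1 accept rate: 100 kHz
n_processes = 672 * 7        # 672 nodes x 7 event-processor processes = 4704

budget_s = n_processes / input_rate_hz
print(f"{budget_s * 1e3:.0f} ms per event")   # -> 47 ms per event
```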
HLT Algorithm Design
• Each HLT trigger path runs independently of the others
• Processing of a trigger path stops as soon as a module returns false
• Reconstruction time is significantly reduced by regional data-unpacking and local reconstruction across the HLT
• All algorithms are regional, seeded by the previous levels (L1, L2, L2.5)

Typical path flow:
L1 seeds → L2 unpacking (MUON/ECAL/HCAL) → local reco (RecHit) → L2 algorithm → filter → L2.5 unpacking (pixels) → local reco (RecHit) → L2.5 algorithm

“Local”: using one sub-detector only. “Regional”: using a small (η, φ) region.
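The early-rejection behavior above can be sketched in a few lines. This is a minimal illustration, not CMS code; the module names and event fields are hypothetical:

```python
# Minimal sketch of an HLT path: modules run in order, and the path
# stops as soon as one filter module returns False (early rejection).

def run_path(event, modules):
    """Run modules in order; reject the event on the first failed filter."""
    for module in modules:
        if module(event) is False:   # filters return True/False
            return False             # early rejection: later modules never run
    return True                      # event accepted by this path

# Hypothetical modules illustrating the L1-seeded, staged structure:
path = [
    lambda ev: ev["l1_seed"],                   # L1 seed check (fast rejection)
    lambda ev: ev.setdefault("l2_reco", True),  # regional unpacking + local reco
    lambda ev: ev["l2_et"] > 30.0,              # L2 filter on the reconstructed object
]

print(run_path({"l1_seed": True, "l2_et": 45.0}, path))   # True
print(run_path({"l1_seed": False, "l2_et": 45.0}, path))  # False (rejected at the seed)
```

Because the cheapest checks (L1 seeding) come first, most events are rejected before the expensive regional reconstruction ever runs, which is what keeps the average processing time within budget.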
Trigger Commissioning in 2010
• The CMS trigger menu continuously adapted to the LHC conditions
  – ~12 trigger menus were developed, covering a large range of luminosity scenarios over the course of the 2010 run
• Low-luminosity regime:
  – e.g. menus for L = 10^28, 10^29, 4×10^29 cm^-2 s^-1
  – Trigger decisions based on simple threshold cuts
  – A large fraction of the bandwidth (~30%) reserved for calibration and minimum-bias triggers, to ensure complete understanding of the detector performance
• High(er)-luminosity regime:
  – e.g. the 2×10^31, 6×10^31, 2×10^32 menus
  – More elaborate paths: require isolation and identification in addition to thresholds
Trigger Menu Development
• Data were used to develop trigger menus for higher instantaneous luminosities
  – Most paths exhibit fairly linear rate behavior vs. luminosity
  – Extrapolation errors are minimized by using the most recent data, keeping rate non-linearities under control
  – Large discrepancies are understood to come from paths with significant cosmic/noise components, or from paths that depend on the beam conditions (beam gas and beam halo)
• Menus were prepared for each twofold increase in luminosity
• Target rate: between 200 and 400 Hz
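The extrapolation idea is simple to sketch: fit measured path rates vs. instantaneous luminosity with a straight line, then predict the rate at the next luminosity step. The numbers below are made up for illustration only:

```python
# Sketch of linear rate extrapolation vs. luminosity (toy numbers).
# (luminosity in units of 1e30 cm^-2 s^-1, rate in Hz) for one hypothetical path.

lumi = [10.0, 20.0, 40.0, 60.0]
rate = [ 2.1,  4.0,  8.3, 12.1]

# Least-squares straight-line fit: rate = a * lumi + b
n = len(lumi)
mx, my = sum(lumi) / n, sum(rate) / n
a = sum((x - mx) * (y - my) for x, y in zip(lumi, rate)) / sum((x - mx) ** 2 for x in lumi)
b = my - a * mx

# Predicted rate at twice the highest measured luminosity
print(f"expected rate at L=120: {a * 120 + b:.1f} Hz")   # -> 24.3 Hz
```

In practice the fit would use only the most recent fills, as noted above, since older data can hide rate non-linearities from changing beam conditions.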
CMS Web Based Monitoring
• Summary of all HLT paths; a similar page is available for L1
[Rate-monitoring plots with annotations: a sync problem; an ECAL problem (muon trigger unaffected)]

A dedicated talk on the WBM system will be given on Monday.
Lessons Learned…
• Overplan
  – never trust the machine schedule!
  – always be ready for an extra factor of 2-3 in luminosity
• Involve
  – HLT menus grow in complexity with luminosity
  – a tighter involvement/collaboration with the physics users is needed
• Document
  – the most common questions from users are:
    • “when was [insert favorite trigger here] first deployed?”
    • “why is it prescaled?”
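For readers unfamiliar with the term: a prescale of N means only every N-th event that fires a trigger is actually accepted, reducing its output rate by a factor N. A minimal sketch (not CMS code):

```python
# Illustration of a prescale: accept only every N-th event on which
# the trigger condition fired.

class PrescaledPath:
    def __init__(self, prescale):
        self.prescale = prescale
        self.counter = 0

    def accept(self, fired):
        """Return True for every prescale-th fired event."""
        if not fired:
            return False
        self.counter += 1
        return self.counter % self.prescale == 0

path = PrescaledPath(prescale=10)
accepted = sum(path.accept(True) for _ in range(1000))
print(accepted)   # 100: a prescale of 10 cuts the output rate by a factor of 10
```

Prescales let high-rate, low-threshold paths stay in the menu (for calibration and efficiency measurements) without consuming the full bandwidth.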
HLT Performance
Performance of Jet Triggers
• Jets at the HLT were reconstructed using an iterative-cone algorithm with cone size R = 0.5
  – Switched to anti-kT (R = 0.5) in 2011
• The jet algorithm is identical to the one used in offline analysis
• Efficiency for offline-reconstructed jets to pass HLT jet triggers with pT thresholds of 15, 30, and 50 GeV
  – Shown for the barrel and endcap regions
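The efficiency measurement above can be sketched as follows: bin the offline jets in pT, and in each bin count the fraction whose event also passed the HLT trigger. This toy example (made-up numbers, not CMS code) shows the mechanics of a turn-on curve:

```python
# Sketch of a trigger turn-on measurement: per-bin efficiency is the
# fraction of offline jets whose event passed the HLT jet trigger.

def turn_on(offline_pt, passed_hlt, bin_edges):
    """Return passed/total per offline-pT bin defined by bin_edges."""
    total = [0] * (len(bin_edges) - 1)
    passed = [0] * (len(bin_edges) - 1)
    for pt, ok in zip(offline_pt, passed_hlt):
        for i in range(len(bin_edges) - 1):
            if bin_edges[i] <= pt < bin_edges[i + 1]:
                total[i] += 1
                passed[i] += ok
                break
    return [p / t if t else 0.0 for p, t in zip(passed, total)]

# Toy sample: the trigger becomes fully efficient well above its threshold
pts = [20, 25, 32, 35, 50, 60, 80]   # offline jet pT (GeV)
hlt = [ 0,  0,  1,  0,  1,  1,  1]   # did the HLT trigger fire?
print(turn_on(pts, hlt, [0, 30, 45, 100]))   # [0.0, 0.5, 1.0]
```

A sharp rise from 0 to 1 near the threshold is the signature of a well-behaved trigger; the plateau efficiency is what matters for physics analyses.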
[Efficiency plots, compared with MC]
Performance of MET Triggers
• Missing ET is calculated from the vector sum of the transverse energies of calorimeter objects, plus muons
• Good agreement between MC and data

[Plots: L1 MET (threshold 20 GeV) and HLT MET (threshold 45 GeV) efficiencies vs. offline MET]
HLT Muon Reconstruction
• First stage:
  – Confirm the L1 “seeds”: refit hits in the muon chambers with the full granularity of the detector
    • Iterative Kalman-filter technique
  – Reconstruction in L1 regions of interest
• Second stage:
  – Inclusion of tracker hits, with regional tracker reconstruction
  – Combine first-stage objects with charged-particle tracks in the tracker
  – pT resolution much better than in the first stage
• Optional: isolation in the calorimeters (at the first stage) and in the tracker (at the second stage)
Performance of Muon Triggers
• Efficiency for a high-quality offline-reconstructed muon, matched to an L1 object, to pass the HLT single-muon trigger with a threshold of pT > 3 GeV, plotted as a function of pT
• Events collected with the minimum-bias trigger
• Lower-than-expected efficiency due to timing calibration at start-up
[Efficiency plots for the barrel and endcaps]
HLT e/γ Reconstruction
• Common stage for photons and electrons:
  – Spatial matching of energy deposits (clusters) in the Electromagnetic Calorimeter (ECAL) with e/γ candidates at L1
  – Formation of super-clusters (groups of clusters; recovers bremsstrahlung/conversions)
  – ET cut applied
  – ECAL super-cluster shape consistent with an electromagnetic object
  – Calorimetric (ECAL+HCAL) isolation
• Photons:
  – Tight track isolation in a solid cone
• Electrons:
  – Matching with hit pairs in the pixel detectors
  – Electron track reconstruction
  – Angular matching of the ECAL cluster and the full track
  – Loose track isolation in a hollow cone
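The shared-stage-plus-branches structure above can be sketched as follows. All candidate fields here are hypothetical placeholders, not CMS data formats:

```python
# Sketch of the e/gamma selection: a common stage shared by photons and
# electrons, followed by object-specific requirements.

def common_stage(cand, et_cut=15.0):
    """Shared requirements: L1 match, ET cut, cluster shape, calo isolation."""
    return (cand["l1_matched"]
            and cand["et"] > et_cut
            and cand["shape_ok"]
            and cand["calo_iso"])

def photon_path(cand):
    # Photons add tight track isolation in a solid cone
    return common_stage(cand) and cand["trk_iso_solid_cone"]

def electron_path(cand):
    # Electrons add pixel matching, track reconstruction, and a hollow-cone isolation
    return (common_stage(cand)
            and cand["pixel_match"]          # hit pairs in the pixel detector
            and cand["track_cluster_match"]  # angular ECAL-track matching
            and cand["trk_iso_hollow_cone"])

cand = {"l1_matched": True, "et": 22.0, "shape_ok": True, "calo_iso": True,
        "trk_iso_solid_cone": True, "pixel_match": True,
        "track_cluster_match": True, "trk_iso_hollow_cone": True}
print(photon_path(cand), electron_path(cand))   # True True
```

Sharing the common stage means every electron candidate is also a photon candidate up to that point, which is exploited in the efficiency measurements on the next slide.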
Performance of Photon and Electron Triggers
• Trigger efficiency for selected offline superclusters, matched to L1 objects, to pass a photon trigger with a threshold of ET > 15 GeV, plotted vs. the supercluster ET
• Efficiency for offline-reconstructed electrons that have passed a photon trigger with a threshold of ET > 15 GeV to pass an electron trigger with a similar ET threshold
HLT CPU Performance
• Study performed on a minimum-bias sample of collected data (average pile-up ~1.5 events/crossing)
• Filter-farm machine specs:
  – Processors: 2× quad-core Intel® Xeon® 5430
  – 2.66 GHz nominal frequency
  – 16 GB of memory
• Average CPU time budget at 100 kHz (50 kHz) input rate is 50 ms (100 ms)

[Timing breakdown plot:
  – L1 seeding and prescale: fast rejection of events
  – basic unpacking and book-keeping: common to all HLT paths
  – “L2” section: regional unpacking, jet reco, pixel tracking
  – “L3” section: full regional tracking, multiple objects]
Conclusions
• The CMS trigger performed extremely well in 2010
  – Successfully captured physics over 5 orders of magnitude in luminosity
  – Less than 30 minutes of downtime due to the HLT during the entire 2010 run
• Good understanding of the evolution of trigger rates and CPU timing with instantaneous luminosity allowed optimal operation of the HLT
• In general, the physics triggers exhibit sharp turn-on curves and are in good agreement with the simulation

Upcoming challenges in 2011
• The LHC has already achieved instantaneous luminosities of ~1.3×10^33 cm^-2 s^-1 (the 2010 maximum was ~2×10^32) and could reach as high as 5×10^33
• The trigger paths are becoming more and more complex and specialized as we try to maintain low thresholds to capture all the physics signals of interest
  – Over 350 paths in the current trigger menu (the 2010 maximum was 174)
  – Bandwidth issues are becoming important
• Strategies for controlling trigger rates in a high-pile-up environment must continue to be developed
  – Not a significant issue for the trigger so far, but there may be up to ~18 events/crossing before the end of the year (currently ~5)