
Machine Learning with the MIP Timing Detector

Margaret Lazarovits
Learning Machine Learning

February 28, 2019

Outline

1. Particle Physics Overview

• What are we looking for?

• How are we looking for it?

2. Detector Physics

• Compact Muon Solenoid (CMS)

• HL-LHC Upgrade & Issues

3. What is the MTD?

• Basics

• Design

4. Current ML Status

• Neural Network

5. Next Steps


What are we looking for?

• Standard Model Physics - completing the picture

• Mass Hierarchy

• Matter/Antimatter

• Unifying forces

• New Physics - Beyond Standard Model (BSM)

• SUSY

• Higgs Boson

• Dark Matter

• Cosmology

• Recreating early universe conditions

• Heavy ion collisions

• Cosmic rays


Higgs Boson discovery in diphoton channel (CERN, 2012)

How are we looking for it?

Accelerators

• SLAC - Stanford Linear Accelerator Center

• ILC - International Linear Collider

• Fermilab - Tevatron (RIP 1983 - 2011)

• LHC - Large Hadron Collider


Detectors

• DUNE - Deep Underground Neutrino Experiment

• IceCube

• MINERvA - Main Injector Experiment for ν-A

• D0 + CDF

• CMS + ATLAS

Compact Muon Solenoid (CMS)


HL-LHC Upgrade

High Luminosity LHC

• 10x more data

• Operational around 2026

• new, innovative technology


[CERN HL-LHC upgrade schematic (May 2016), showing the LHC tunnel with the ALICE, CMS, LHCb, and ATLAS experiments]

• FOCUSING MAGNETS: 12 more powerful quadrupole magnets for each of the ATLAS and CMS experiments, designed to increase the concentration of the beams before collisions.

• CIVIL ENGINEERING: 2 new 300-metre service tunnels and 2 shafts near to ATLAS and CMS.

• “CRAB” CAVITIES: 16 superconducting “crab” cavities for each of the ATLAS and CMS experiments to tilt the beams before collisions.

• COLLIMATORS: 15 to 20 new collimators and 60 replacement collimators to reinforce machine protection.

• BENDING MAGNETS: 4 pairs of shorter and more powerful dipole bending magnets to free up space for the new collimators.

• SUPERCONDUCTING LINKS: Electrical transmission lines based on a high-temperature superconductor to carry current to the magnets from the new service tunnels near ATLAS and CMS.

HL-LHC Issues

• Pileup


MIP Timing Detector

• Detects MIPs (Minimum Ionizing Particles) in space as well as time

• Precision Timing (~30 ps)

• Improve track + vertex reconstruction

• Improve missing pT resolution

• Reduce pileup rate


CMSSW-generated visualization of MTD endcaps (orange) and barrel (gray)

What is the MTD?

Endcap technology

• Silicon low gain avalanche detector (LGAD) on the bottom

• ASIC in the middle

• Flex circuit

• Surrounded by aluminum compound


How does the MTD measure time?

• Project at hand - could we potentially use neural networks?

• Previously:

• Used a reference time, interpolated the peak, and applied constant fraction discrimination (CFD)

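The CFD step above can be sketched in software: find the pulse peak, set a threshold at a fixed fraction of it, and linearly interpolate where the rising edge crosses that threshold. The waveform arrays and the 50% fraction below are illustrative assumptions, not the analysis code itself.

```python
import numpy as np

def cfd_time(times, voltages, fraction=0.5):
    """Estimate a pulse arrival time with a software constant-fraction
    discriminator: locate the peak, threshold at `fraction` of the peak
    amplitude, and linearly interpolate the rising-edge crossing time."""
    times = np.asarray(times, dtype=float)
    voltages = np.asarray(voltages, dtype=float)
    i_peak = int(np.argmax(voltages))
    threshold = fraction * voltages[i_peak]
    # walk back from the peak to the last sample still below threshold
    i = i_peak
    while i > 0 and voltages[i - 1] >= threshold:
        i -= 1
    if i == 0:
        return times[0]
    # linear interpolation between samples i-1 and i
    t0, t1 = times[i - 1], times[i]
    v0, v1 = voltages[i - 1], voltages[i]
    return t0 + (threshold - v0) * (t1 - t0) / (v1 - v0)
```

For a toy pulse peaking at 100 mV, the 50% crossing lands where the interpolated waveform passes 50 mV on the rising edge.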

[Figure: Event 4 pulse waveform, amplitude (mV) vs. time (ns)]

How does the MTD measure time?

• Collect time by CFD for all events to get time resolution

• Goal:


[Figure: normalized pulse shapes, amplitude (1/peak amplitude) vs. time (ns), CMS Preliminary W6 8e14]

[Figure: timestamp distribution with Gaussian fit, CMS Preliminary W6 8e14 (BV = 550 V): mu = 2.1858 ± 0.0014 ns, sigma = 0.04103 ± 0.00097 ns]

σ(t) ≈ 30 ps
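Extracting the resolution from the per-event CFD timestamps amounts to estimating the spread of their distribution. A quick numpy sketch, with toy Gaussian data standing in for the real testbeam events:

```python
import numpy as np

# Toy stand-in for per-event CFD timestamps (ns); in the real analysis
# these come from the testbeam waveforms, not a random generator.
rng = np.random.default_rng(0)
t_cfd = rng.normal(loc=2.186, scale=0.041, size=5000)

mu = t_cfd.mean()                  # fitted mean timestamp
sigma = t_cfd.std(ddof=1)          # time resolution estimate (ns)

print(f"mu = {mu:.4f} ns, sigma = {sigma * 1e3:.1f} ps")
```

The sample standard deviation here plays the role of the Gaussian fit's sigma; with enough events the two agree closely.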

Is there a better way?

• Currently using Keras to fit pulse times (inputs) to reference times (labels)

• Using a moving window to determine if there is a peak (1) or no peak (0)

• Data preprocessing

• Need to convert vector of sample voltages, sample times and reference times to something more NN friendly

• Transform into tensors (matrices) that match the shape of the input layer

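The preprocessing described above (moving window over the samples, tensors matching the input layer) might look roughly like this; the window length, peak normalization, and labeling rule are illustrative guesses, not the exact analysis code.

```python
import numpy as np

def make_windows(voltages, window=8):
    """Slice a sampled waveform into fixed-length, peak-normalized
    windows shaped like the network's input layer."""
    v = np.asarray(voltages, dtype=float)
    v = v / np.abs(v).max()                    # normalize to unit peak
    n = len(v) - window + 1
    return np.stack([v[i:i + window] for i in range(n)])  # (n, window)

def label_windows(X, threshold=0.9):
    """Toy labeling: a window counts as 'peak' (1) if it contains a
    sample near the normalized peak, else 'no peak' (0)."""
    return (X.max(axis=1) >= threshold).astype(int)
```

Each row of `X` is one network input; the 0/1 labels feed the categorical output described on the next slide.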

Is there a better way?

[Diagram: Keras Sequential model — a linear stack of layers, 128 neurons per dense layer]

Neural Net Architecture

• Dense layers (128 neurons)

• Activation functions

• Input: relu (rectified linear unit)

• Output: softmax (for probabilities)

• Loss function: categorical cross entropy

• Optimizer: Adam (adaptive learning rate)

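The architecture bullets map onto a short Keras `Sequential` sketch; the 8-sample input width is an assumed window size (matching the preprocessing), not something stated on the slides.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Linear stack of layers: dense relu hidden layers, softmax output for
# the two classes (peak / no peak), categorical cross entropy + Adam.
model = keras.Sequential([
    keras.Input(shape=(8,)),                 # one 8-sample window
    layers.Dense(128, activation="relu"),    # hidden dense layer
    layers.Dense(128, activation="relu"),
    layers.Dense(2, activation="softmax"),   # class probabilities
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Training would then call `model.fit(X, y)` with the windowed tensors and one-hot peak/no-peak labels.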

Next Steps

• Improve efficiency of code - slow run time

• Add dropout layers?

• Revise model

• More hidden layers?

• Revise activation function in last layer - categorical output (not continuous)

• Convolution layers?

• ???

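One possible revision along the lines sketched above, adding dropout layers between the dense layers; the layer sizes and the 0.2 dropout rate are guesses for illustration, not tuned choices.

```python
from tensorflow import keras
from tensorflow.keras import layers

# Revised model: same categorical softmax output, but with dropout
# after each hidden layer to regularize against overfitting.
revised = keras.Sequential([
    keras.Input(shape=(8,)),
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.2),                     # zero 20% of units per step
    layers.Dense(128, activation="relu"),
    layers.Dropout(0.2),
    layers.Dense(2, activation="softmax"),
])
revised.compile(optimizer="adam", loss="categorical_crossentropy")
```

Dropout is active only during training; at inference the full network is used, so the output probabilities stay deterministic.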