
When Dosimetry Goes Wrong in Therapy and Imaging

Keith Faulkner, Ola Holmberg, Joanna Izewska, Tommy Knöös, Pedro Ortiz-Lopez, Geoffrey S. Ibbott

IDOS, Nov. 9, 2010

Categories of errors

• Errors made at commissioning: new machine or procedure

• Errors made in QA, such as patient-specific dosimetry, or errors arising because QA/dosimetry was not done

• Errors introduced through calculation (incorrect dosimetry data, incorrect application of data, inadequate procedures)

• Dose error resulting from equipment malfunction or introduced by incorrect or unrecognized service work


Errors in Radiology and Nuclear Medicine

• Determinations of fetal dose

• Unknown pregnancy, errors in determining dose

• Unwanted exposure from incorrect imaging parameters

• Recent CT angiography events in the US

• Pediatric patient receiving repeated exposures

• Dose to members of the public from I-131 patients

• Use of public transportation or hotel rooms


Errors in Radiation Therapy

• Experience from TLD audits showing variations in machine calibration.

• Experience from RPC phantoms showing errors in IMRT and 3D dose delivery.

• Published reports of events including:

• Errors in basic machine calibration

• Errors in output factors (small fields, half-blocked fields)

• Errors in applying calculation parameters


Distribution of TLD results (within 7%)

Beam type    Number of beams   Avg. RPC/Inst.   St. dev.
Photons      3051              0.999            1.6%
Electrons    4310              0.998            1.7%
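To make the audit metric concrete, here is a minimal sketch (not from the talk; function and variable names are illustrative) of how such beam-output checks can be tabulated: compute the ratio of audit-measured dose to institution-stated dose for each beam, flag beams outside a chosen acceptance window (±7% here, matching the slide), and summarize the mean and standard deviation of the ratios.

```python
from statistics import mean, stdev

def audit_summary(beams, tolerance=0.07):
    """Summarize audit ratios (measured dose / institution-stated dose).

    `beams` is a list of (measured_dose, stated_dose) pairs in Gy.
    Returns the number of beams, the mean ratio, the spread of the ratios,
    and how many beams fall outside the tolerance window.
    """
    ratios = [measured / stated for measured, stated in beams]
    outliers = [r for r in ratios if abs(r - 1.0) > tolerance]
    return {
        "n_beams": len(ratios),
        "mean_ratio": mean(ratios),
        "stdev_percent": 100 * stdev(ratios),
        "n_outside_tolerance": len(outliers),
    }

# Toy example: three photon beams, one of them ~8% low.
print(audit_summary([(1.98, 2.00), (2.03, 2.00), (1.84, 2.00)]))
```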


[Chart: "Institutions with One or More Unacceptable TLD Measurements" – fraction of institutions vs. year]

[Chart: "IAEA/WHO TLD audit results within the 5% limit" – % acceptable results by world region (AF, AM, EM, EU, SE, WP), 1st participation vs. after follow-up]

In 2000-2009, 4440 beams were checked; 511 TLD results were followed up, of which ~50% were in Eastern Europe.


Reasons for discrepancies in TLD results (analysis of 511 discrepancies occurring in 2000-2009):

• set-up error: 32%
• calculation error: 16%
• other: 10%
• no data: 42%


No physicist, no dosimetry equipment?

Results of the IAEA/WHO TLD audits, 2000-2009:

• 1029 empty or incomplete TLD data sheets (no data on ion chamber dosimetry): 75% of results within 5%, with the remainder spread over the 5%-10%, 10%-20%, and >20% deviation bands
• 3411 correctly filled-out TLD data sheets: 93% of results within 5%, with the remainder spread over the same bands

IAEA TLD audits: use of dosimetry Codes of Practice

TLD results vs. dosimetry Code of Practice, 2000-2009 (N = 4440, mean = 1.007, SD = 3.8%):

CoP                        N           Mean D_TLD/D_stat   SD (%)
N_D,w                      1875 (13)*  1.003               2.6
N_K                        1162 (13)   1.005               3.3
N_X                        340 (7)     1.020               4.7
Ion chamber details only   129 (3)     1.020               5.4
No dosimetry data          858 (39)    1.011               5.1

*75 deviations beyond 20% were excluded; the number excluded per row is shown in brackets.

Share of beams by Code of Practice: N_D,w 42%, N_K 27%, N_X 8%, unknown 23%.
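For context (not on the slide), the N_D,w and N_K rows correspond to the two calibration formalisms in common use; in the usual IAEA notation (TRS-398 for the absorbed-dose-to-water approach, TRS-277 for the earlier air-kerma approach), the beam calibration is obtained roughly as:

```latex
% Absorbed-dose-to-water based CoP (N_{D,w}, e.g. IAEA TRS-398):
D_{w,Q} = M_Q \, N_{D,w,Q_0} \, k_{Q,Q_0}

% Air-kerma based CoP (N_K, e.g. IAEA TRS-277):
N_{D,\mathrm{air}} = N_K \,(1-g)\, k_{\mathrm{att}}\, k_m , \qquad
D_{w,Q} = M_Q \, N_{D,\mathrm{air}} \,(s_{w,\mathrm{air}})_Q \, p_Q
```

Here M_Q is the temperature-, pressure- and recombination-corrected chamber reading; the older exposure-based N_X formalism adds further conversion steps, which may help explain the larger spread seen for N_X and for beams with no dosimetry data.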

IAEA TLD results: use of ionization chambers

[Bar chart: percent of participants using each chamber type (Farmer 0.6 cc, small volume 0.1-0.3 cc, local make) in 2002-2003, 2004-2005, 2006-2007, and 2008-2009]

TLD results 2000-2009 (75 deviations beyond 20% excluded):

Chamber type              N      Mean D_TLD/D_stat   SD (%)
Farmer 0.6 cc             2807   1.005               2.9
Small volume 0.1-0.3 cc   316    1.008               3.9
Local make                229    1.016               5.1
No chamber details        1013   1.011               5.1

IAEA/WHO TLD results (2000-2009)

[Pie charts: distribution of deviations for medical accelerators vs. Co-60 units, grouped as within 5% (tolerance), 5-10%, 10-20% (misadministration?), and >20% (accident?)]

• Medical accelerators: 2427 high-energy X-ray beams, 148 deviations beyond the 5% level (94% within 5%)
• Co-60 units: 2013 Co-60 beams, 363 deviations beyond the 5% level (82% within 5%)
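As a minimal illustration of how these bands are applied (the band names and thresholds are the slide's, the code is mine and purely illustrative), a deviation between the TLD-measured dose and the stated dose can be binned as follows:

```python
def classify_deviation(measured_dose, stated_dose):
    """Bin a TLD audit result into the deviation bands used in the IAEA/WHO audit."""
    deviation = abs(measured_dose / stated_dose - 1.0) * 100  # percent deviation
    if deviation <= 5:
        return "within tolerance (<=5%)"
    elif deviation <= 10:
        return "5-10%"
    elif deviation <= 20:
        return "10-20% (misadministration?)"
    return ">20% (accident?)"

# Example: a Co-60 beam stated as 2.00 Gy but measured as 1.74 Gy (~13% low).
print(classify_deviation(1.74, 2.00))
```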


RPC visits: the only completely independent, comprehensive radiotherapy quality audit in North America. The IAEA QUATRO visit is similar.

On-Site Dosimetry Review Visit

• Identify errors in dosimetry and QA and suggest improvements.
• Collect and verify dosimetry data for chart review.
• Improve quality of patient care.


Reference Beam Calibration

Percent of beams out of criteria (since 2000):

              Photons   Electrons
TLD (±5%)     3-5%      5-8%
Visits (±3%)  2-4%      3-14%

Percent of institutions with ≥ 1 beam out of criteria (since 2000):

              Photons   Electrons
TLD (±5%)     7-11%     6-12%
Visits (±3%)  ~13%      ~15%

On-Site Dosimetry Review Visit


On-Site Dosimetry Review: selected discrepancies discovered 2004-2008

Errors regarding               Number of Institutions (%)
Review QA Program              127 (77%)
*Wedge Transmission            53 (32%)
*Photon FSD (small fields)     46 (28%)
Off-Axis, Beam Symmetry        42 (25%)
*Photon Depth Dose             34 (21%)
*Electron Calibration          25 (15%)
*Photon Calibration            22 (13%)
*Electron Depth Dose           19 (12%)

*70% of institutions received at least one of the significant dosimetry recommendations.


RPC Treatment Record Review

• Independent calculation of tumor dose, which must agree within 5% (15% for implants); a minimal sketch follows this list

• Verify dose, time, fractionation per protocol

• Notify institution if major deviation seen during review to prevent further deviations
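A minimal sketch (illustrative names, not RPC's actual software) of the kind of independent check described above: recompute the tumor dose from the chart parameters and flag any disagreement beyond the 5% criterion (15% for implants).

```python
def check_record(institution_dose, independent_dose, is_implant=False):
    """Compare the institution's stated tumor dose against an independent
    recalculation and flag deviations beyond the review criterion
    (5% for external beam, 15% for implants)."""
    tolerance = 0.15 if is_implant else 0.05
    deviation = abs(independent_dose / institution_dose - 1.0)
    return {
        "deviation_percent": round(100 * deviation, 1),
        "major_deviation": deviation > tolerance,
    }

# Example: chart says 60.0 Gy, independent calculation gives 56.2 Gy (~6% low).
print(check_record(60.0, 56.2))
```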


Examples of Systematic Errors > 5% (>15%)

Error                                                            Magnitude
TPS used wrong depth when head frame used                        27%
TPS did heterogeneity corrections incorrectly                    8.5%
Institution ignored effects when >50% of the field was blocked   5%
Point of calculation near edge of field                          6-7%
Non-measured output with average TLD > 5%                        7%
Lung correction used, not allowed on protocol                    9-13%
TPS wedge factor differs from clinical wedge factor              9%
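To see how a single mistaken beam parameter propagates directly into delivered dose, here is a toy illustration (my own, not from the talk): in a simple MU calculation the dose scales with the wedge transmission factor, so a TPS wedge factor that differs from the clinical one by 9% produces a ~9% dose error.

```python
def delivered_dose(planned_dose, tps_wedge_factor, actual_wedge_factor):
    """Toy model: MUs are set using the TPS wedge factor, but the beam is
    attenuated by the actual wedge, so the delivered dose scales by the
    ratio of the two transmission factors."""
    return planned_dose * actual_wedge_factor / tps_wedge_factor

# Example: 2.00 Gy planned, TPS wedge factor 0.72 vs. measured 0.655 (~9% lower).
print(delivered_dose(2.00, 0.72, 0.655))  # -> ~1.82 Gy, a ~9% underdose
```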


RPC Phantoms: Pelvis (10), Thorax (13), Liver (2), H&N (31), SRS Head (4), Spine (3)


Phantom Results

Comparison between institution’s plan and delivered dose.

Phantom           H&N      Prostate   Spine    Lung     Liver
Irradiations      752      174        19       174      23
Pass              585      143        13       124      12
Pass %            78%      82%        68%      71%      52%
Criteria          7%/4mm   7%/4mm     5%/3mm   5%/5mm   7%/4mm
Year introduced   2001     2004       2009     2004     2005
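The pass criteria in the table are dose-difference / distance-to-agreement pairs (e.g. 7%/4 mm). Below is a minimal 1D sketch of such a composite check, assuming simple point-sampled profiles rather than the TLD/film analysis the RPC actually performs: a measured point passes if either its local dose difference is within the dose criterion, or some reference point within the distance criterion matches its dose.

```python
import numpy as np

def dd_dta_pass_rate(ref_dose, meas_dose, positions, dose_crit=0.07, dist_crit=4.0):
    """Composite dose-difference / distance-to-agreement check on 1D profiles.

    ref_dose, meas_dose : doses (same units) sampled at `positions` (mm).
    dose_crit is a fraction of the reference maximum; dist_crit is in mm.
    Returns the fraction of measured points passing either test.
    """
    ref_dose = np.asarray(ref_dose, dtype=float)
    meas_dose = np.asarray(meas_dose, dtype=float)
    positions = np.asarray(positions, dtype=float)
    dose_tol = dose_crit * ref_dose.max()

    passed = 0
    for i, (pos, d_meas) in enumerate(zip(positions, meas_dose)):
        # Dose-difference test at the same location.
        if abs(d_meas - ref_dose[i]) <= dose_tol:
            passed += 1
            continue
        # Distance-to-agreement test: any nearby reference point with matching dose.
        nearby = np.abs(positions - pos) <= dist_crit
        if np.any(np.abs(ref_dose[nearby] - d_meas) <= dose_tol):
            passed += 1
    return passed / len(positions)

# Toy profiles sampled every 2 mm: measurement 5% hotter and shifted by 2 mm.
x = np.arange(0, 40, 2.0)
ref = np.exp(-((x - 20) / 8.0) ** 2)
meas = 1.05 * np.exp(-((x - 22) / 8.0) ** 2)
print(f"pass rate: {dd_dta_pass_rate(ref, meas, x):.0%}")
```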


Explanations for Failures

Explanation                                                            Minimum # of occurrences
Incorrect output factors in TPS                                        1
Incorrect PDD in TPS                                                   1
IMRT technique                                                         3
Software error                                                         1
Inadequacies in beam modeling at leaf ends (Cadman et al., PMB 2002)   14
QA procedures                                                          3
Errors in couch indexing with Peacock system                           3
Equipment performance                                                  2
Setup errors                                                           7


Apparent Causes of Error

• Inadequate training

• Insufficient staffing

• Inadequate procedures

• Inconsistent use of dosimetry protocols

• Poor equipment design

• Obsolete equipment (cobalt units, dosimetry equipment)

• Cluttered accelerator consoles

• Cumbersome and inaccurate methods of beam data entry into TPSs


Lessons Learned – Conventional Equipment

• Independent check of calibration of new beams

• Independent verification of basic data entered into TPS, or of model developed to represent data

• Independent check of TPS calculations, including regular checks and removal of obsolete data and files (a minimal sketch of such a check follows this list)

• Written policies and procedures

• Policy requiring verification by a physicist of all equipment repairs before treatment is resumed

• Patient monitoring upon completion of brachytherapy procedures
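A minimal sketch of what an independent check of a TPS calculation for a simple open field can look like, assuming a hand-calculation model (reference output in cGy/MU, a field-size output factor, and PDD at the calculation depth); all names and numbers are illustrative, not a method endorsed in the talk.

```python
def independent_point_dose(mu, output_cgy_per_mu, output_factor, pdd_percent):
    """Hand-style point-dose estimate for a simple open field:
    dose = MU x reference output x field-size output factor x PDD/100."""
    return mu * output_cgy_per_mu * output_factor * pdd_percent / 100.0

def check_against_tps(tps_dose_cgy, independent_dose_cgy, tolerance=0.05):
    """Flag the calculation if the independent estimate and the TPS disagree
    by more than the chosen tolerance (5% here)."""
    deviation = abs(independent_dose_cgy / tps_dose_cgy - 1.0)
    return deviation, deviation > tolerance

# Illustrative numbers: 230 MU, 1.000 cGy/MU at reference conditions,
# output factor 1.025, PDD 86.3% at the calculation depth.
indep = independent_point_dose(230, 1.000, 1.025, 86.3)
print(check_against_tps(tps_dose_cgy=200.0, independent_dose_cgy=indep))
```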


Lessons Learned – Advanced Technologies

• Commissioning procedures for conventional equipment are applicable to advanced technologies

• Implementing a new technology should be based on an evaluation of the expected benefit. A step-by-step approach should be followed, for example from 2D to 3D conformal therapy and from there to IMRT

• Proper training should always be included with new technologies

• Data integrity should be checked after any computer malfunction

• Dosimetry protocols for non-standard radiation beams are necessary

• Increased attention to image identification and understanding is critical


ASTRO Six-Point Response

• Database of errors

• Practice accreditation program

• Improved educational programs on QA and safety

• Education for patients and caregivers

• Improving data transfer

• New and expanded federal initiatives
