United States Environmental Protection Agency
Office of Solid Waste and Emergency Response (5102G)
EPA 540-R-98-038
OSWER 9230.0-83P
PB98-963307
September 1998

Quality Assurance Guidance for Conducting Brownfields Site Assessments




Quality Assurance Guidance for Conducting Brownfields Site Assessments

U.S. Environmental Protection Agency
Office of Solid Waste and Emergency Response

Washington, DC 20460


Notice

This guidance document describes key principles and best practices for Brownfields site assessment quality assurance and quality control based on program experience. It is intended as a reference for people involved in the Brownfields site assessment process. This guidance manual does not constitute a rulemaking by the U.S. Environmental Protection Agency (EPA). The policies set forth in this document are intended solely as guidance. They are not intended, nor can they be relied upon, to create any substantive or procedural rights enforceable by any party in litigation with the United States. EPA officials may decide to follow the guidance provided in this directive, or may take action that is at variance with the guidance, policies, and procedures in this directive, on the basis of an analysis of specific circumstances. The Agency also reserves the right to change this directive without public notice. Mention of trade names or commercial products does not constitute endorsement or recommendation for use.


TABLE OF CONTENTS

Executive Summary
Section 1. Introduction
Section 2. Brownfields Site Assessments
    Brownfields Site Assessment Purpose
    Brownfields Site Assessment Process
Section 3. Data Quality Objectives
    The DQO Process
    Step 1: Stating the Problem
    Step 2: Identifying the Decision
    Step 3: Identifying Inputs to the Decision
    Step 4: Defining the Boundaries of the Study
    Step 5: Developing a Decision Rule
    Step 6: Specifying Limits on Decision Errors
    Step 7: Optimizing the Design
    Estimating Costs
Section 4. Quality Assurance Programs & Sampling Design Strategies
    Quality Assurance
    Quality Control
    Sampling Design Strategies
        Multi-phase Investigations
        Types of Samples
        Background Samples
        Analytical Methods
    Quality Control in the Field
        Field Instrument/Equipment Inspection and Calibration
        Field Documentation
        Field System Audits
    Quality Control in the Laboratory
    Quality Control Samples
    Document Control
    Deliverables for Data Useability Review
    Improving Data Useability
    Summary

Appendix A - Model Quality Assurance Project Plan
Appendix B - Glossary of Terms
Appendix C - References

Exhibit 1 - Brownfields Site Assessment Process
Exhibit 2 - Summary of the DQO Process
Exhibit 3 - Comparison of Preliminary Assessment and Due Diligence Site Assessments
Exhibit 4 - The Springfield Site
Exhibit 5 - Number of Samples Required to Achieve Given Rates of False Positive and Negative, and MDRD
Exhibit 6 - Types of QC Samples


EXECUTIVE SUMMARY

Many sites across the nation once used for industrial and commercial purposes are now abandoned or under-used. Some of these sites — often referred to as “Brownfields” — are contaminated; others are perceived or suspected to be contaminated. In 1993, the Environmental Protection Agency (EPA) created the Brownfields Economic Redevelopment Initiative to empower States, Tribes, communities, and other stakeholders to work together in a timely manner to assess and safely clean up Brownfields to facilitate their reuse.

Through careful planning, the Brownfields site assessment team develops a conceptual site model and establishes and communicates the team’s goals and how the team will reach those goals using a Quality Assurance Project Plan (QAPP).

This guidance document serves to inform Brownfields site managers of important quality assurance concepts and issues, and provides a road map for identifying the type and quality of environmental data needed to present a clear picture of the site’s environmental conditions.

However, because of the wide range of site-specific issues, project goals, and the degree of difficulty that the Brownfields site assessment team may encounter, this document cannot anticipate every question likely to arise during a project. When questions arise, the reader should turn to the extensively referenced Internet and document resources provided in Appendix C for more detailed information.

Brownfields Site Assessments

The Brownfields site assessment requires a team approach encompassing a range of multidisciplinary knowledge and skills. The Brownfields site assessment should provide sufficient data of adequate quality to allow officials to confidently make decisions about the potential reuse of a Brownfields site.

Brownfields Site Assessment Process

The Brownfields site assessment process routinely involves one or more of the following activities: a review of historical records; a field investigation including sample collection and analysis; the assessment of data useability; and an evaluation of cleanup options and costs.

Quality Assurance/Quality Control

Brownfields team members should understand the benefits of strong quality assurance (QA) and quality control (QC) procedures.

Quality assurance is an integrated system of management activities involving planning, implementation, assessment, reporting, and quality improvement to ensure that a process, item, or service is of the type and quality needed and expected. Quality control is the overall system of technical activities (including checks on sampling and analysis) that measure the performance of a process against defined standards to verify that it meets predefined requirements. Since errors can occur in the field, laboratory, or office, QC must be part of each of these functions.

Document Control

Document control is a crucial but often overlooked component of quality assurance. It is critical to completion of the last stage of a Brownfields site assessment — review of data useability.

Data useability review depends on thorough documentation of predefined data specifications and the related events that take place during implementation of the project.

Data Quality Objectives (DQO) Process

Data credibility is one of the most important challenges facing municipalities, Tribes, and States managing a Brownfields site assessment. An important planning tool used to help ensure data credibility is the DQO process.


1. State the Problem: What is the purpose of the project?

2. Identify the Decision(s): What are the available options under consideration?

3. Identify Inputs to the Decision(s): What information is needed to make informed, defensible decisions?

4. Define the Boundaries of the Study: What are the geographical extent and the time and budget constraints for the project?

5. Develop a Decision Rule: Formulate “if...then” statements that relate the data to the decision they support.

6. Specify Limits on Decision Errors: Estimate how much uncertainty will be tolerated in the site decision(s).

7. Optimize the Design: Identify the most cost-effective means to gather the data needed. If obstacles exist, reassess all the steps of the DQO process to refine decisions and goals until a workable road map or decision tree is produced.

SUMMARY OF THE DQO PROCESS

These seven steps are used during the planning of the Brownfields site assessment process to ensure that field activities, data collection operations, and the resulting data meet the project objectives. The DQO process is iterative, and the output of one step may affect prior steps. This may lead the Brownfields site assessment team to revisit some previous steps but will ultimately lead to a more efficient data collection design.

The DQO process allows the Brownfields site assessment team to determine the level of data quality needed for specific data collection activities, and to estimate the cost associated with these activities.

Application of the DQO process is actually a “common sense” approach that translates broad consensus-based goals into specific tasks. In this way, the Brownfields team uses the DQO process to prepare a road map, which can guide the project, inform the public and other interested parties, and bring newcomers to the project quickly up to speed.
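The seven planning outputs above lend themselves to a simple structured record that a team could keep alongside its QAPP. The sketch below is purely illustrative and is not part of this guidance; the class, its field names, and the arsenic figures are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class DQORecord:
    """Outputs of the seven-step DQO process for one site decision."""
    problem: str        # Step 1: purpose of the project
    decision: str       # Step 2: options under consideration
    inputs: list        # Step 3: information needed for the decision
    boundaries: str     # Step 4: geographical, time, and budget limits
    decision_rule: str  # Step 5: "if...then" statement
    error_limits: dict  # Step 6: tolerable decision error rates
    design: str         # Step 7: optimized data collection design

record = DQORecord(
    problem="Assess a former plating shop for residential reuse",
    decision="Clean up, reuse as-is, or collect more data",
    inputs=["historical records", "soil arsenic concentrations"],
    boundaries="Top 2 ft of soil across the 3-acre parcel, this fiscal year",
    decision_rule="If mean arsenic > 20 ppm, then cleanup is required",
    error_limits={"false_positive": 0.05, "false_negative": 0.20},
    design="Systematic grid sampling, 30 locations, fixed-laboratory analysis",
)
```

Capturing the outputs in one place mirrors the road-map role the DQO process plays: the record can be circulated to newcomers and revised as steps are revisited.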

Quality Assurance Project Plan (QAPP)

The Environmental Protection Agency requires that all Federally funded environmental monitoring and measurement efforts participate in a centrally managed quality assurance program.

Any Brownfields site assessment team generating data under this quality assurance program has the responsibility to implement minimum procedures to ensure that the precision, accuracy, and completeness of its data are known and documented.

To ensure this responsibility is met uniformly, each Brownfields site assessment team should prepare a written QAPP. The QAPP is a formal document describing in comprehensive detail the necessary QA, QC, and other technical activities that should be implemented to ensure that the results of the work performed will satisfy the stated performance criteria.

The QAPP documents the project planning process (i.e., the DQO process), enhances the credibility of sampling results, produces data of known quality, and saves resources by reducing errors and the time and money spent correcting them.

This guidance document includes a description of a QAPP and template forms to prepare one.


CONCEPTS
QA (Quality Assurance) - an integrated system of planning, quality control, assessment, improvement, and reporting.
QC (Quality Control) - a system of technical activities that measure and control quality so that data meet users’ needs.
QAPP (Quality Assurance Project Plan) - a document containing detailed procedures for achieving data quality; generally prepared for all EPA environmental data collection activities and approved prior to data collection.
DQOs (Data Quality Objectives) - quantitative and qualitative statements that define the type, quantity, and quality of data needed to support the site decision, and acceptable levels of uncertainty in the data that form the basis for the decision.
DQO Process - a systematic planning tool that focuses on investigative goals and resultant decisions to help decision-makers plan to collect the type and quality of data that meet the acceptable level of uncertainty.

SECTION 1. INTRODUCTION

Many sites across the nation once used for industrial and commercial purposes are now abandoned or under-used. Some of these sites — often referred to as “Brownfields” — are contaminated; others are perceived or suspected to be contaminated. In 1993, the Environmental Protection Agency (EPA) created the Brownfields Economic Redevelopment Initiative to empower States, Tribes, communities, and other stakeholders to work together in a timely manner to assess and safely clean up Brownfields to facilitate their reuse. Successful Brownfields projects can help reverse the spiral of unaddressed contamination and its related problems and help maintain deterrents to future contamination.

Because concerns about future environmental risks and liability can hinder redevelopment, the Brownfields Initiative seeks to minimize the uncertainty surrounding actual or perceived contamination associated with these sites. Establishing and following comprehensive quality assurance (QA) procedures during the collection of environmental data relating to site contamination helps to minimize uncertainty.

This document provides municipalities, Tribes, and States with guidance for an overall approach to quality assurance for Brownfields site assessments. It includes a description of a Quality Assurance Project Plan (QAPP) and forms necessary to prepare one. (See Appendix A for the QAPP template.) The guidance presented here provides a road map for identifying the type and quality of environmental data needed to present a clear picture of the environmental conditions of the site. Knowing the quality of environmental measurement data will allow municipalities, Tribes, and States to make site redevelopment decisions that are both technically sound and financially feasible.

This document does not present step-by-step instructions on how to conduct all aspects of a Brownfields site assessment. Instead it outlines what the various tasks are, and the expertise necessary for the tasks. For those unfamiliar with Brownfields site assessments, it will provide the background necessary to communicate effectively with other Brownfields team members, contractors, and laboratories.

Section 2 describes the range of environmental activities expected during Brownfields site assessments. Section 3 outlines and provides examples of the data quality objectives (DQO) process. Section 4 discusses sampling design strategies and the importance of QA and quality control, building on the information in Section 3 with more specific sampling design examples. Section 4 also refers the reader to the QAPP template provided in Appendix A — a series of forms that can be used to develop a site-specific QAPP — and discusses the assessment of collected data, including whether they meet the objectives of the Brownfields site assessment as defined during the DQO process. This document contains a glossary (Appendix B) and a list of sources of information (Appendix C). Each section introduces a set of new concepts.


CONCEPTS
Brownfields Project Team - term applied to the group of individuals essential to the success of the project; the team is collectively skilled in analytical chemistry, environmental engineering, statistics, economics, public policy, etc.
Brownfields Site Assessment - a process to determine the feasibility of site redevelopment through various activities, including background investigations, site sampling and analysis, and evaluation of cleanup options and costs.

SECTION 2. BROWNFIELDS SITE ASSESSMENTS

A Brownfields site assessment should provide sufficient data of adequate quality to allow officials to confidently make decisions about the potential reuse of a Brownfields site. The site assessment should minimize the uncertainties inherent in environmental investigation and focus on producing data relevant to site-specific objectives.

Beneficial reuse of Brownfields sites requires a team approach encompassing a range of multidisciplinary knowledge and skills, including expertise in analytical chemistry, environmental engineering, geology, sample collection, statistics, public policy, and economics. Public satisfaction with the project will also depend on the Brownfields project team’s (team) efforts to build a consensus and meet the interests of the public, local community, and commercial sector.

Brownfields Site Assessment Purpose

Brownfields site assessments are conducted to facilitate the reuse of properties by determining whether contamination exists onsite, and if so, the characteristics of contamination, including the threat it poses, potential solutions for cleanup, and the cost of solutions necessary to prepare the site for redevelopment.

Brownfields Site Assessment Process

A Brownfields site assessment routinely involves one or more of the following activities: a background investigation; a field investigation including sample collection and analysis; an evaluation of cleanup options and costs; and the assessment of the useability of resulting data.

Frequently, the first step is to conduct a site background or historical investigation to identify past uses of the property, including types and amounts of chemicals that may have been used onsite and waste generation and disposal activities that may have contributed to contamination. The team can obtain this information through review of historical records, and through interviews with personnel who may have knowledge of past waste generation and disposal practices at the site. A site visit is also helpful for identifying visible signs of contamination.

A sampling and analysis investigation typically follows the background investigation. Sampling and analysis focuses on those areas of concern identified during the background investigation. The field sampling activities identify the contaminants (e.g., arsenic in soil), the concentrations of those contaminants (e.g., 50 parts per million (ppm)), and the areas of contamination that should be addressed before redevelopment can begin (e.g., all areas of contamination greater than 20 ppm arsenic in soil).

Another activity is estimating the cost of cleanup options based on future uses and redevelopment plans. Information on cleanup options can be found on EPA’s CLU-IN website located at http://www.clu-in.com/supply1.htm.

The next activity is assessing whether the data are sufficient for their intended purpose. For example, are they sufficiently reliable to determine that the site does not require cleanup prior to redevelopment?

This document focuses on preparing for a Brownfields site assessment (see Exhibit 1 below). Careful planning is critical to ensure collection of useful data at minimum cost.
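A Step 5-style “if...then” decision rule like the arsenic example above can be expressed in a few lines of code. The sketch below is an illustration only: the function name is invented, the 20 ppm action level is the hypothetical figure from the example, and comparing a simple sample mean to the action level is an assumption rather than a prescribed EPA procedure:

```python
def apply_decision_rule(arsenic_ppm, action_level_ppm=20.0):
    """If the mean arsenic concentration in an area of concern exceeds the
    action level, then that area must be addressed before redevelopment."""
    mean_conc = sum(arsenic_ppm) / len(arsenic_ppm)
    needs_cleanup = mean_conc > action_level_ppm
    return mean_conc, needs_cleanup

# Hypothetical soil results (ppm): their mean is 25.0, above the 20 ppm level
mean_conc, needs_cleanup = apply_decision_rule([50.0, 12.0, 31.0, 7.0])
```

Writing the rule down this explicitly, before any samples are collected, is the point of Step 5: the data requirements (how many samples, what detection limits) follow from the rule rather than the other way around.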


Planning tools that will help the team produce the data they need include the idea of a conceptual site model, the data quality objectives (DQO) process, and some important statistical concepts. These tools help the team complete the QAPP, which establishes and communicates the team’s goals and how the Brownfields site assessment will reach those goals.

The QAPP developed for Brownfields site assessments should combine planning for the entire project — management, sampling, analysis, data review/evaluation, and reporting — under one cover. The QAPP should be shared with all members of the Brownfields project team and contractors performing sampling and analytical work. During the preparation of the QAPP, the team should rely on its contractors and the laboratory it has chosen to perform analyses to provide assistance where needed.

Planning for data review is especially important to an effective QAPP. Data review involves comparing the actual data generated during site assessment against the DQOs established during project planning.

It is expected that a Brownfields team may use Federal funding for various Brownfields site assessment activities. These activities can range from developing an inventory of potential sites to extensive sampling events at individual sites to documenting the technical feasibility of cleanup options. The following sections present systematic methods for planning a cost-effective Brownfields site assessment with appropriate quality assurance that will produce data of adequate quality to meet project goals.

EXHIBIT 1
THE BROWNFIELDS SITE ASSESSMENT PROCESS


40 CFR 31.45 Quality Assurance
If the grantee’s project involves environmentally related measurements or data generation, the grantee shall develop and implement quality assurance practices consisting of policies, procedures, specifications, standards, and documentation sufficient to produce data of quality adequate to meet project objectives and to minimize loss of data due to out-of-control conditions or malfunctions. [53 FR 8076, Mar. 11, 1988]

EXHIBIT 2
SUMMARY OF THE DQO PROCESS

1. State the Problem: What is the purpose of the project?

2. Identify the Decision(s): What are the available options under consideration?

3. Identify Inputs to the Decision(s): What information is needed to make informed, defensible decisions?

4. Define the Boundaries of the Study: What are the geographical extent and the time and budget constraints for the project?

5. Develop a Decision Rule: Formulate “if...then” statements that relate the data to the decision they support.

6. Specify Limits on Decision Errors: Estimate how much uncertainty will be tolerated in the site decision(s).

7. Optimize the Design: Identify the most cost-effective means to gather the data needed. If obstacles exist, reassess all the steps of the DQO process to refine decisions and goals until a workable road map or decision tree is produced.

SECTION 3. DATA QUALITY OBJECTIVES

Data credibility is one of the most important challenges facing municipalities, Tribes, and States managing Brownfields assessments. In accepting a Brownfields grant, the recipient has agreed to comply with the quality assurance (QA) provisions set forth in 40 CFR 31.45 (see box).

Members of the Brownfields team who are involved in project planning, sample collection, laboratory analysis, data review, and data assessment should understand the benefits of QA and quality control (QC) procedures. These procedures will be used during the planning of the Brownfields site assessment to ensure that field activities and data collection operations, and the data they generate, meet the objectives of the project. The DQO process allows the team to determine the level of data quality needed for specific data collection activities, and to estimate the costs associated with these activities.

The DQO process is actually a “common sense” approach to translate broad consensus-based goals into specific tasks. Only after defining the overall goals of the project can the team identify the tasks that will produce the data needed to support decision-making at the end of the project. In this way, the team uses the DQO process to prepare a road map, which can guide the project, inform the public and other interested parties, and bring newcomers to the project up to speed.

It is not possible to provide a common set of DQOs applicable to Brownfields site assessments because site characteristics, decisions, and data quality needs vary from site to site. However, the following is an overview of the DQO framework and examples of its application to a hypothetical Brownfields site assessment.

An overview of the DQO process is summarized in Exhibit 2 above, and a more thorough discussion follows. The DQO process is iterative, and the output of one step may affect prior steps. This may lead the team to revisit some previous steps but ultimately will lead to a more efficient data collection design.

The first step in the DQO process is to develop a conceptual site model. A conceptual site model provides an understanding of the site based on currently available data prior to the site assessment. It identifies historical uses of the site, potential exposure pathways, cleanup concerns, and future land use. As data from the Brownfields site assessment become available, they are used to refine the model and give the team a clearer picture of the site.


CONCEPTS
Conceptual Site Model - the conceptual site model is dynamic in nature. It is initially based on the best-available information and is updated as additional data become available during the site assessment.
Uncertainty - the probability of making an erroneous decision based on available data.
Null Hypothesis - an assumption that is tested by a scientific investigation (e.g., an environmental investigation); the baseline condition assumed to be true in the absence of contrary evidence or an alternative hypothesis (e.g., the earth is flat unless proven round, the site is dirty unless proven clean); generally based on the case with the least desirable consequences.
Decision Error - an incorrect conclusion about a site (e.g., deciding site cleanup is not needed when it really is) caused by using data that are not representative of site conditions due to sampling or analytical error.
False Negative Decision Error - accepting the null hypothesis when it is actually false. Implications of the false negative decision error depend on the structure of the null hypothesis.
False Positive Decision Error - rejecting the null hypothesis when it is actually true. Implications of the false positive decision error depend on the structure of the null hypothesis.
Screening Assessments - short site inspections that may have already been conducted, or that the team may need to conduct, to gather sufficient data for an effective sampling and analysis plan. The data gathered will be useful in the initial conceptual site model.

A well-defined, detailed conceptual site model will help the team identify data necessary to support decisions about the property.

To determine the kinds of data to be collected, the DQO process translates the goals of the Brownfields site assessment into qualitative and quantitative statements that define the type of data needed to support decisions and that specify the amount of uncertainty (i.e., the chance of drawing an incorrect conclusion) the decision-maker is willing to accept. For example, different future uses may require that different environmental standards be met. Excessive sampling to detect contamination below the levels required for the planned future use can waste resources.

Keeping these goals in mind, the DQO process guides the team to define items such as the number and types of samples to be collected, analytical detection limits, and certainty. After these parameters (the DQOs) are established, analytical methods and instrumentation can be selected to develop the most cost-effective sampling design that will meet these objectives.

Uncertainty — the chance of drawing an incorrect conclusion — is addressed in Step 6 of the DQO process. For critical decisions, such as whether a Brownfields site can be safely reused as a public recreation area, the degree of certainty that the site will not pose a threat to human health must be quite high in order to gain public acceptance. Because the term “quite high” is indefinite, the level of safety at the site may be in question. Quantifying the amount of uncertainty present in a decision clarifies how confident the team can be that it is correct. For example, some decision-makers may require certainty of 90% to 95% before they feel comfortable making the decision to allow reuse of a site for public recreation. The only way to make such a definitive statement about certainty is by using a statistical sampling design. This type of design is discussed in Section 4 of this document.

Some stakeholders may demand a statistical expression of certainty in cases where the planned reuse will allow the public unrestricted access to potentially contaminated media at a site (e.g., residential or recreational reuse). For less critical decisions, for example, industrial reuse where much of the property is used for buildings and parking lots, less certainty may be acceptable or a qualitative statement may be sufficient.

The DQO process controls the potential for making decision errors due to uncertainty in the data by helping the team set limits on the probability of making a decision error (i.e., decision error rate). A hypothesis about the condition of the site is the basis for this determination. The hypothesis, referred to as the null hypothesis, should be designed to guard against making the decision error that has the most undesirable consequences.
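Limits on decision errors translate into a minimum number of samples (compare Exhibit 5). As a hedged illustration only, the sketch below uses a commonly cited approximation for a one-sample test of a mean against an action level, similar in form to formulas in EPA’s DQO guidance; the function name and the numerical inputs shown are hypothetical:

```python
import math
from statistics import NormalDist

def samples_needed(sigma, delta, alpha=0.05, beta=0.20):
    """Approximate sample count for a one-sample test of a mean.

    sigma: estimated standard deviation of the measurements;
    delta: width of the "gray region" between the action level and the
           alternative mean the team wants to distinguish;
    alpha: tolerated probability of rejecting a true null hypothesis;
    beta:  tolerated probability of accepting a false null hypothesis.
    """
    z_a = NormalDist().inv_cdf(1 - alpha)
    z_b = NormalDist().inv_cdf(1 - beta)
    n = (z_a + z_b) ** 2 * sigma ** 2 / delta ** 2 + 0.5 * z_a ** 2
    return math.ceil(n)
```

Note how tightening either error rate, or shrinking the gray region, drives the sample count (and therefore cost) up; this trade-off is exactly what Steps 6 and 7 ask the team to negotiate before sampling begins.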


hypothesis is derived from information in the The DQO process offers several benefits. ByStatement of the Problem (Step 1 of the DQO using the DQO process, the team can establishprocess), including what is known about the site, criteria for determining when data are sufficientthe projected site reuse scenario, and the for site decisions. This provides a “stoppingresources available for study and cleanup. rule” — a way for the team to determine whenTypical null hypotheses are the following: “the they have collected enough data of sufficientsite is clean enough” or “the site is too dirty for quality to achieve the desired objectives. Inthe reuse scenario.” The team will identify an addition, the DQO process helps the teamalternative hypothesis contrary to the null establish an adequate level of data review andhypothesis and the sampling and analysis plan is documentation. Data review is a process ofthen designed to test the null hypothesis by assessing data quality based on writtenproviding strong evidence to the contrary. performance-based acceptance criteria (e.g.,

Generally, the more severe consequences ofmaking the wrong decision at a Brownfields siteoccur when the site is actually contaminatedabove established health limits, but the decision-maker acts on data that erroneously indicate that Another benefit of the DQO process is that itthe site is clean. In this situation, human health focuses studies by clarifying vague objectivescould be endangered if reuse occurs without and identifying the decisions that should be madecleanup. Therefore, the null hypothesis at a prior to the selection of sampling and analysisBrownfields site is likely to be “the site is too parameters. This gives the team confidence thatdirty for the reuse scenario,” and the site the data collected will support the decisionsassessment is then designed to show that the site concerning redevelopment of the site.is clean, which is the alternative hypothesis. Additional explanation is provided under Step 6of the DQO process.

Because of the limited funding for Brownfieldssite assessments, it may not be possible to collectdata sufficient to achieve a desired level ofcertainty in site decisions. Because increasingcertainty usually requires the collection of moresamples, it can be costly. If the team can affordto collect and analyze only a limited number ofsamples, decision-makers must take care tocommunicate only what they know about theenvironmental conditions at a site and howconfident they are in that knowledge.

Limited funding highlights the need for a well-planned investigation that capitalizes on time-and cost-saving technologies. By following asystematic planning process, such as the DQOprocess, decision-makers will be able to strikethe best balance between what they want to knowabout a property and what they can afford toknow about a property given the realities of theirbudget.

samples must be analyzed by the laboratorywithin a specific period of time referred to as theholding time). Data review also determineswhether the data satisfy the predefined DQOs.

The DQO Process

The outputs of the DQO process include the information the team will need to complete most of the Quality Assurance Project Plan (QAPP). Forms D through N of the QAPP template provide space for describing the sampling and analysis plan (SAP). (See Appendix A.) The pre-defined objectives and decision statements that are a product of the DQO process form the basis for the SAP. Section 4 of this document discusses elements of the QAPP and SAP in more detail with reference to the corresponding forms in Appendix A.

Step 1: Stating the Problem

The first step of any decision-making process is to define the problem that has prompted the study. The team will develop a concise and complete description of the problem, which can be documented on form C of the QAPP template. This description provides the basis for DQO development and is built on information collected during the background investigation.


Project planning is perhaps the most critical component of the Brownfields assessment process, as it allows team members to determine fully the goals and scope of sampling events and the resources necessary to accurately characterize the site. Therefore, it is important that all interested parties (including project managers, engineers, chemists, field sampling personnel, statisticians, local government officials, and the public) be involved in the project from the conceptual design stage. Roles and responsibilities can be documented on forms A and B of the QAPP template.

The conceptual site model is an important part of Step 1, Stating the Problem. The conceptual site model should be updated as additional information becomes available, but should initially illustrate the following:

• Potential chemicals of concern;

• Media in which these chemicals may be present and to which they may migrate (surface and subsurface soil, surface water, groundwater, and onsite structures);

• Whether human or environmental receptors (i.e., targets) at or near the site may be exposed to contamination; and

• Current and anticipated land use.

The conceptual site model should include maps and site diagrams that illustrate structures and areas of potential contamination including locations of chemical handling, storage, and disposal. If facility records are unavailable, the team may find information through State, Resource Conservation and Recovery Act (RCRA), and National Pollutant Discharge Elimination System (NPDES) programs, and local records offices.

Some information may be available from current and past owners, lending institutions, and/or environmental regulatory and real estate agencies. Some States have laws that require property owners to disclose the available reports to prospective purchasers. These reports may answer some of the team’s questions about a site.

Two common examples of “screening” assessments are EPA’s Preliminary Assessment (PA) and ASTM’s Phase I Assessment (Phase I), which are briefly described in Exhibit 3 below. Because the scope of these assessments varies, they may not answer all of the questions the team wants to address. For example, to find a discussion about offsite receptors, the team will want to look at a PA rather than a Phase I.

EXHIBIT 3
COMPARISON OF PRELIMINARY ASSESSMENTS AND DUE DILIGENCE SITE ASSESSMENTS

Preliminary Assessment (PA):

• A site may consist of the legal property boundaries and other areas where contaminants have come to be located.
• Level of effort is approximately 120 hours.
• Examines all available site-related documents during file searches of Federal, State, and local agencies.
• Surveys on- and offsite pathway potential targets.
• Considers hazardous substance migration to offsite human and environmental targets.
• Identifies sources and estimates extent of contamination on- and offsite.
• Does not evaluate sources within secure buildings.
• Petroleum products are not considered.

Due Diligence - ASTM Phase I:

• A site within the context of a Due Diligence study is limited to the legal boundaries of the subject property.
• Level of effort is approximately 40 hours.
• Examines all “practically reviewable” and “reasonably ascertainable” site-related documents.
• Surveys onsite targets within property boundaries.
• Considers only onsite targets and impacts to the site from on- and offsite sources.
• Identifies sources and estimates extent of contamination only within property boundaries.
• Evaluates sources within all buildings (e.g., asbestos).
• Petroleum products are considered.

Source: Site Assessment: A Comparison of Federal Program Methods and Commercial Due Diligence. Journal of Environmental Law & Practice, pp. 15-25, March/April 1997.


Previously collected analytical data are particularly important to the conceptual site model. This information may identify some of the chemicals onsite and their locations. It also can indicate the variability of contamination onsite. The sampling and analytical methods that were used previously may also prove helpful to the Brownfields sampling and analysis plan.

Especially when using previously collected analytical data, the team needs to review the information for accuracy and completeness. If the data are several years old, reported analytical data and site features may not represent current site conditions.

Aerial photos are often helpful in reconstructing the history of a site with multiple prior owners. Aerial photos are available through http://mapping.usgs.gov.

The team should also ensure that the conceptual site model illustrates site conditions that may lead to an unacceptable threat or that are based on current and projected future land uses. For example, if local groundwater is used in households and businesses, the physical characteristics of the soil and local hydrogeology should be understood to better assess the threat to groundwater resources.

Finally, the team should document the available resources and relevant deadlines for the study in the problem statement. This description should specify the anticipated budget, available personnel, and contractual vehicles, where applicable.

An example problem statement follows:

The Springfield team is considering Brownfields redevelopment of a site comprised of 10 acres of waterfront property historically used for truck repair. The site, depicted in Exhibit 4, is currently fenced and abandoned. Aerial photos indicate a repair garage located on the western portion of the property, an office and employee parking on the east end near the water, and truck storage between these two areas. A central fence separates the garage and truck storage area from the office section of the property.

Facility records indicate that an underground fuel storage tank is located behind the northeast corner of the garage. A previous site assessment mentions the removal of four barrels of toluene by the County health department in 1983 and the presence of a second storage tank for spent solvent disposal south of the garage. Analytical results from the previous site assessment indicate that total petroleum hydrocarbons (TPH — constituents of fuel) and volatile organic compounds (VOCs — constituents of solvents) are present in soils near the areas of former storage of toluene and the spent solvent tank. Interviews with past employees indicate that trucks were washed on occasion in the truck parking area of the site.

The site appears to present a threat via exposure to soils; groundwater and surface water may also be contaminated. The site is located in a commercial/residential zone. The city wants to redevelop the site for commercial and public use.

EXHIBIT 4
THE SPRINGFIELD SITE

Step 2: Identifying the Decision

While environmental field investigations are designed to satisfy a broad array of objectives, the goal of a Brownfields site assessment is to collect adequate environmental data for decision-makers to determine if the site is suitable for a specific reuse. This determination may require several separate but related decisions.


The “decision statements” usually take the form of questions that the study will attempt to answer. Form C of the QAPP template (Appendix A) provides space to document these decisions. The decision statements are important because they indicate alternative actions and decision performance criteria in later steps of the DQO process.

Decision Statements for the Springfield site might include the following:

Historical information indicates that the eastern portion of the property has not been used for chemical management activities; the waterfront section may be reused as a park or other publicly accessible facility. To assess the feasibility of this option, the team will make the following decisions: Will the site need to be cleaned up before it can be reused as a park? If cleanup is too expensive, can the site be redeveloped for another use?

Because historical information indicates that the western portion of the site is at least partially contaminated, this area is being considered for commercial reuse. The team will make the following decisions to facilitate commercial financing: Is the site clean enough to attract a private sector developer? Have issues of concern to lenders been addressed? What level of cleanup or other actions is necessary to answer the questions of developers and lenders?

For the Springfield site, the team will make two distinct decisions based on different projected reuse options for different portions of the site.

Step 3: Identifying Inputs to the Decision

The team should identify the information needed to resolve the decision statement(s) and determine how this information will be obtained. For example, if groundwater use is a consideration in the site reuse scenario, the team should identify how samples of groundwater will be used to support the reuse decision. The team should review the conceptual site model to learn whether existing groundwater data provide the information needed for the study and identify data gaps in the model.

The team will identify the characteristics of the site that need to be measured based on a threshold value (i.e., a drinking water action level) that provides the criterion for choosing among alternative actions. Regulatory standards, such as State drinking water standards, usually form the basis for action levels. If no regulatory threshold or standard can be identified during this step, the team should identify information needed to develop a realistic concentration goal. This information will be critical to the final sampling design.

Later, in Step 7 of the DQO process, Optimizing the Design, the input identified during this step is reviewed and refined. The team should be aware that the decisions made during this step are “draft” and may be changed during optimization. The team will make the final decision on which analytical methods to use in Step 7.

The products of this step include a list of informational inputs needed to make the decision and a list of environmental characteristics that will be measured. In essence, the outputs of this step are actually the inputs to the decision.

Some inputs that might be identified for the Springfield site include the following:

All soil samples in the waterfront area slated for public access reuse will be collected to depths specified in State regulations. The samples will be analyzed for chemicals likely to be present using a State, EPA, or other analytical method that meets the objectives of the Brownfields assessment. To rule out the possibility of other contaminants of concern, one representative sample will be analyzed for a broad spectrum of compounds, including those whose presence is considered unlikely.

Because previous analytical data from the western portion of the site indicate contamination with TPH and VOCs in the former toluene storage and spent solvent storage areas, field techniques will be used to confirm contamination in these areas. Data gaps for this portion of the site include the following: whether the underground storage tank has leaked and if so, whether contamination has reached groundwater; whether contamination exists in the truck washing area; and remaining areas of


the site for which no previously collected data exist. Soils in these areas will be analyzed for a combination of TPH and VOCs using a State, EPA, or other analytical method that meets the objectives of the Brownfields assessment. To rule out the possibility of other contaminants of concern, at least one representative sample will be analyzed for a spectrum of compounds including those whose presence is considered unlikely. Groundwater at the site will be collected and analyzed for TPH and VOCs using a State, EPA, or other analytical method that meets the objectives of the Brownfields site assessment.

Step 4: Defining the Boundaries of the Study

The boundaries of the study refer to both spatial and temporal boundaries. To define the boundary of the decision, the team should identify the geographic area within which all decisions apply. A spatial boundary could be the property boundary, a portion of the property, potential exposure areas, or an area with a specific reuse projection. For example, a soil-sampling boundary may include the top 12 inches of soil where truck washing reportedly took place.

The scale of decision-making is the smallest area, volume, or time frame of the media for which the team wishes to make a decision. For example, to decide whether to clean up the top 2 feet of surface soil, the scale of decision-making might be related to the method of cleanup; if contaminated soil will be hauled in a 5-ton capacity truck, the boundary of decision-making might be a 2-foot deep, 65-square foot area. This example is based on the practicalities of cleanup rather than exposure scenarios.
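The truckload example is simple arithmetic: divide the truck's capacity by an assumed soil density to get a volume, then divide by the excavation depth to get an area. The guidance does not state the soil density behind its figures, so the density below is purely an illustrative assumption.

```python
# Sizing a decision unit from cleanup practicality: the area of soil,
# at a fixed excavation depth, that one truckload can remove.
# ASSUMPTION: loose-soil density of ~77 lb per cubic foot (not from
# the guidance; typical loose soils span roughly 75-100 lb/ft^3).
truck_capacity_lb = 5 * 2000            # 5-ton capacity truck
soil_density_lb_per_ft3 = 77.0          # assumed value for illustration
depth_ft = 2.0                          # excavation depth

volume_ft3 = truck_capacity_lb / soil_density_lb_per_ft3
area_ft2 = volume_ft3 / depth_ft        # on the order of 65 square feet
```

With this assumed density the result lands near the 65-square-foot figure in the text; a different density assumption would shift the area accordingly.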

The team should also define the temporal boundaries of the decision. The team may find it impossible to collect data over the full time period to which the decision will apply. The team will have to determine the most appropriate part of that period for gathering data that reflect the conditions of interest.

Practical boundaries that could also affect sampling are identified in this step. For example, seasonal conditions or the unavailability of personnel, time, or equipment may make sampling impossible. Form D of the QAPP template contains space for the team to document the boundaries of the investigation, including a project timeline.

Boundaries for the Springfield studies might include the following:

The park reuse decision applies to the area of the site east of the fence that divides the garage and truck parking area from the office area. This portion of the property is not further subdivided.

The commercial reuse decision applies to the areas west of the fence. This portion of the property is further subdivided into the groundwater and soil boundaries. The soil is further subdivided into the subsurface soil around the underground storage tank, the surface soils in the truck washing area, and the soils in the remaining portion of the site.

Temporal boundaries are addressed by performing studies when personnel are available, during the dry season, following notification of the local community and receipt of authorization from site owners.

Step 5: Developing a Decision Rule

The purpose of developing a decision rule is to integrate the output from the previous steps of the DQO process into a statement that estimates the parameter(s) of interest, delineates the scale of decision-making, specifies the action level, and describes the logical basis for choosing among alternative actions.

A decision rule is usually a comparison of a statistical parameter of interest (such as the average level of arsenic in soil, or the maximum level of toluene in groundwater) to a specific action level. The action level is the contaminant threshold that defines the conditions around which the team should select among the alternative actions and/or take different directions to solve the problems. For example, if the action level is exceeded, the team may choose to clean up the site.
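A comparison of this kind can be written down almost verbatim as a small function. The sketch below is illustrative only; the action level is a placeholder, not a regulatory value.

```python
def decision_rule(avg_concentration, action_level):
    """Compare a statistical parameter of interest (here, an average
    concentration) to an action level and return the indicated action."""
    if avg_concentration >= action_level:
        return "cleanup required before reuse"
    return "proceed with redevelopment"

# Hypothetical averages checked against a placeholder 100 mg/kg level:
outcome_clean = decision_rule(avg_concentration=42.0, action_level=100.0)
outcome_dirty = decision_rule(avg_concentration=180.0, action_level=100.0)
```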

The output for this step is an “if...then” statement that defines the conditions that would cause the


team to choose among alternative courses of action. Form D of the QAPP template provides space to record these decision rules.

Decision rules for the Springfield site include the following:

If the average levels of TPH and VOCs in soil samples are less than selected action levels, then the redevelopment project can proceed.

If the average levels of TPH and VOCs in soil samples are higher than selected action levels, then cleanup to the action level is required prior to reuse.

If the site assessment indicates that groundwater has become contaminated from site activities, the team should contact the State to discuss the impact on redevelopment scenarios for the western portion of the site.

Step 6: Specifying Limits on Decision Errors

Because of the limitations of environmental sampling and analysis, the team runs the risk of making the wrong decision because of incomplete information. Sampling may not capture all of the variations in concentrations, and analyses can only estimate the “true” value. The team must therefore develop means to limit or control the impact of errors in estimations on the likelihood of making a decision error. These limits should be incorporated into the sampling and analysis plan during Step 7 of the DQO process.

Decision-makers are interested in knowing the true state of some feature of the environment (e.g., the average concentration of arsenic in the top twelve inches of soil). The measurement data that describe this feature can be in error. Sampling “error” occurs when the sampling scheme (which determines the sampling locations) does not adequately detect the variability in the amount of contaminant in the environmental matrix from point to point across the site. Measurement errors can occur during sample collection, handling, preparation, and analysis when standard procedures as described in the SAP are not followed. Report preparation is another source of error. The sum of the sampling and measurement errors is called the total study error.

As mentioned earlier, sampling and measurement errors can lead to errors in data, thereby causing the decision-maker to select the wrong course of action. The DQO process helps the team control these errors through development and testing of the null hypothesis and selection of limits for erroneously accepting or rejecting the null hypothesis.

Although some Brownfields sites are only perceived to be contaminated, many may be contaminated at levels exceeding health-based action levels. In selecting the null hypothesis for a Brownfields site, the team should choose the circumstances that can have the most severe consequences. The null hypothesis in Step 6 will usually be “the site is contaminated” and needs some level of cleanup (i.e., the site is contaminated above certain action levels). To test this hypothesis, the team must provide ample evidence to the contrary or prove that the site is clean (i.e., any contamination present is below certain action levels). Erroneously accepting the null hypothesis as true (false negative decision error) may unnecessarily increase the cost of site cleanup because decision-makers may believe that action is warranted when it is not. Erroneously rejecting the null hypothesis (false positive decision error) can increase the risk of exposure at a property because a decision-maker may believe that no action is warranted when it is.

In contrast to the situation above, if the null hypothesis were “the site is clean enough,” the team would need to show the presence of contamination above certain action levels. In this case, erroneously accepting the null hypothesis (false negative decision error) can increase the risk of exposure at a property because a decision-maker may believe that no action is warranted when it is; whereas erroneously rejecting the null hypothesis (false positive decision error) may increase the cost of site cleanup. The amount and quality of data collected will depend on how the team plans to control decision error rates.
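Decision error rates of this kind can be made concrete with a small simulation. The sketch below (an illustration, not a procedure from this guidance) estimates how often a simple mean-versus-action-level rule erroneously declares a contaminated site clean, and how that rate falls as more samples are collected; the true mean, variability, and action level are invented for the example.

```python
import random

def simulate_error_rate(true_mean, action_level, n_samples,
                        sd=25.0, trials=10000, seed=1):
    """Estimate how often a mean-vs-action-level rule declares 'clean'
    for a site whose true mean concentration is `true_mean`.
    All numeric values are illustrative assumptions."""
    rng = random.Random(seed)
    declared_clean = 0
    for _ in range(trials):
        xs = [rng.gauss(true_mean, sd) for _ in range(n_samples)]
        if sum(xs) / n_samples < action_level:
            declared_clean += 1
    return declared_clean / trials

# A site truly contaminated slightly above a 100 mg/kg action level:
# the chance of erroneously calling it clean shrinks as n grows.
rate_small = simulate_error_rate(true_mean=110, action_level=100, n_samples=4)
rate_large = simulate_error_rate(true_mean=110, action_level=100, n_samples=40)
```

With only 4 samples the rule calls this contaminated site clean roughly a fifth of the time; with 40 samples the error becomes rare, which is the quantitative content of a decision error limit.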


In the Springfield site example, because the projected reuse of the eastern portion of the property involves its unrestricted public use, the null hypothesis is that the eastern portion of the property is dirty. The team will need to show it is clean. This hypothesis reflects what most of the public will probably assume before any environmental studies occur at the site. The team already has data showing that the western portion of the site is contaminated.

To completely avoid any decision errors (100% certainty), the team would have to sample all surface soil related to the decision whether to clean up the surface soil at the site. Because this is financially infeasible, the team must collect a number of representative samples from the area in a manner that reduces the decision error rate. The analytical results from these samples will be translated into an estimation of contamination on part of or the entire site. The more samples collected, the greater the certainty the team will have in its decision; however, the more samples collected, the more costly the investigation. The team needs to balance the level of certainty desired with the cost of that certainty (the cost of additional sample collection and analysis). Form D of the QAPP template can be used to document limits on decision errors.
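The cost of additional certainty can be sketched with the standard normal-approximation sample-size formula, n = (z·sigma / margin)^2. The variability (sigma) and margin below are invented for illustration and are not values from this guidance.

```python
import math

def samples_needed(sigma, margin, confidence=0.95):
    """Approximate number of samples so the estimated mean falls within
    `margin` of the true mean at the given confidence.
    Normal approximation; sigma is an assumed concentration variability."""
    z = {0.90: 1.282, 0.95: 1.645, 0.99: 2.326}[confidence]
    return math.ceil((z * sigma / margin) ** 2)

# Illustrative sigma = 50 mg/kg and margin = 20 mg/kg: pushing the
# confidence from 90% to 99% roughly triples the sampling effort.
n90 = samples_needed(50, 20, confidence=0.90)
n99 = samples_needed(50, 20, confidence=0.99)
```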

Limits on the decision errors for the Springfield site may include the following:

Because of public access reuse projections, reuse of the eastern portion of the property may result in children coming into contact with site soils; therefore, the null hypothesis is that the site is contaminated. The probability of false positive decision errors (erroneously rejecting the null hypothesis, or deciding that the site is clean when it is not) should therefore be minimized as much as possible. Errors that increase the probability of leaving soils in place when they contain substances at levels greater than the action level — false positive decision errors — will be considered acceptable no more than 10% of the time.

Errors that increase the probability of cleaning up soils when that action is not required — false negative decision errors — will be considered acceptable only 10% of the time.

Limits on the probability of errors in decision-making for the western portion of the site are not needed because the sample design is intended to simply define the boundaries of known contamination.

Step 7: Optimizing the Design

The team should evaluate the cost of sampling design options that meet the DQO constraints and select the most resource-effective option. The chosen alternative that meets the DQOs may be the lowest cost alternative, or it may be a relatively low-cost design that still performs well when design assumptions change. Because the role of statistics is very important when developing a sampling design that achieves specified decision error rates, the team member with statistical expertise should be consulted at this stage. (See Section 4 for a more thorough discussion of sampling strategies.)

The team should review the output of the previous steps to determine exactly how the selected limits on decision errors will define the required number and location of samples and the types of analyses. This step frequently involves refinement of initial design parameters.

Many different strategies could be employed to optimize the investigation at the Springfield site; only a few are presented here. Section 4 describes more options available to the team to reduce the project’s sampling and analysis costs while meeting DQOs.

Optimizing the SAP for the Springfield site:

The waterfront portion of the property is not likely to be significantly contaminated because waste-related activities were neither documented nor likely conducted there. It should therefore be possible to reject the null hypothesis — reject the assumption that the site is dirty by showing that the site is clean — by collecting 40 to 50 samples using a statistical design that maintains a 10% false positive decision error rate. This allows a 10% chance that the decision-maker will consider the site clean when it is not.

For the western portion of the site, the team will use field techniques to confirm previously detected levels of VOCs in the former toluene


storage and spent solvent storage areas. The same techniques will be used in areas suspected to be contaminated with VOCs and TPH — the areas around the underground storage tank and truck washing area.

This example assumes the variability in site contamination is approximately 25%. The effect of this factor on the sampling design is explained in Section 4. Errors may also result from mishandling of samples or improper field procedures. Section 4 of this document discusses the effect of variability, cross-contamination, and other problems on decision errors and how quality control samples can be used to identify and control the impact of some of these effects.
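One common way quality control samples quantify variability is through field duplicates: co-located sample pairs whose relative percent difference (RPD) reflects combined sampling and measurement error. A minimal sketch with invented duplicate results:

```python
import statistics

# Hypothetical field duplicate pairs (mg/kg); each pair is two samples
# collected side by side, so their disagreement estimates the combined
# sampling-plus-measurement variability.
pairs = [(100, 112), (55, 49), (210, 190), (80, 88)]

# Relative percent difference for each pair: |a - b| / mean(a, b) * 100.
rpds = [abs(a - b) / ((a + b) / 2) * 100 for a, b in pairs]
mean_rpd = statistics.mean(rpds)  # roughly 10% for these invented pairs
```

A mean RPD well above the variability assumed in the sampling design would signal that the design's decision error limits may not actually be met.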

Estimating Costs

The cost of conducting a Brownfields site assessment is driven by the adequacy of available historical data, the type and level of contamination, the site assessment technologies used, and the property’s projected site reuse. Estimating costs for Brownfields site assessments creates unique challenges. Although the tendency may be to expedite the planning period, care must be exercised to ensure that the interest of site owners, investors, purchasers, and lenders is maintained. The cost estimates should therefore be developed quickly while preserving the highest level of accuracy possible. As stated earlier in this section, increasing the quality of environmental measurements will likely increase the cost of a Brownfields site assessment.

Developing Brownfields site assessment cost estimates is hindered by a lack of detailed cost estimating literature that applies to typical Brownfields sites. The majority of available information is based on large Federal government and private sector sites such as abandoned rail yards and steel mills, not the smaller former industrial sites such as automotive repair shops or metal finishing facilities. Some guides that may assist in the development of cost estimates are listed in Appendix C.

Once the team has selected the final sampling design based on all considerations, including cost, it should properly document the design. This will protect the efficiency and effectiveness of predefined field sampling procedures, quality control procedures, and statistical procedures for data analysis. Forms D through N of the QAPP template provide space for the final sampling design. All drafts of the sampling and analysis design generated during the DQO process should be discarded once the final sampling design is selected and documented.

A complete discussion of DQOs and their use in developing the SAP can be found in the following documents available through EPA’s Quality Assurance Division (QAD) website (http://es.epa.gov/ncerqa/qa/qa_docs.html):

Guidance for the Data Quality Objectives Process, September 1994. EPA QA/G-4; EPA 600-R-96-055.

Guidance for Quality Assurance Project Plans, February 1998. EPA QA/G-5; EPA 600-R-98-018.

EPA Requirements for Quality Assurance Project Plans, Draft Final, October 1997. EPA QA/R-5 (final publication pending).


CONCEPTS

Sampling Design - scheme for sample collection that specifies the number of samples collected in a biased and/or unbiased pattern, as grabs or composites, etc.

Biased Sampling - collection of samples at locations based on the judgment of the designer.

Statistical Sampling - collection of samples in a systematic or random manner.

Multi-phase Sampling - sample collection in multiple stages; data are used to plan subsequent rounds.

Adaptive Sampling - when multi-phase sampling is performed in a single mobilization using field analytical methods which provide results in 24 hours or less.

Low Bias Analytical Error - when analytical data indicate that a substance is not present above a specified concentration, when in fact it is. Low bias errors can increase the risk of exposure because a decision-maker may be led to conclude that no action is warranted when it is.

High Bias Analytical Error - when analytical data indicate that a substance is present above a specified concentration, when in fact it is not. High bias errors can increase the cost of cleanup because a decision-maker may be led to conclude that action is warranted when it is not.

Grab Sample - sample from a single location useful for identifying and quantifying chemicals in an area where contamination is suspected.

Composite Sample - composed of more than one discrete sample taken at different locations, useful to quantify average contamination across a site.

Analytical Method - procedures used to identify and/or quantify chemicals in a sample.

Measurement Error - the difference between the true sample value and the measured analytical result.

Broad Spectrum Analysis - analytical procedure capable of identifying and quantifying a wide range of chemicals.

Field Analysis - measurement taken in the field; results are quick and quantitative or qualitative.

Data Useability - adequacy of data for decisions; determined by comparing resulting data quality with predefined quality needs documented in the QAPP (defined during the DQO process).
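The grab/composite distinction above has a practical consequence: compositing estimates average contamination but can dilute a localized hot spot to below an action level. A minimal illustration with invented numbers:

```python
# Four hypothetical discrete (grab) results in mg/kg; one is a hot spot.
grabs = [5.0, 4.0, 6.0, 380.0]
action_level = 100.0  # placeholder action level, not a regulatory value

composite = sum(grabs) / len(grabs)           # 98.75 mg/kg, just under the level
hot_spot_found = max(grabs) > action_level    # True: a grab reveals the hot spot
composite_exceeds = composite > action_level  # False: compositing masks it
```

This is why composites suit decisions keyed to an average, while grabs suit decisions keyed to localized exceedances.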

SECTION 4. QUALITY ASSURANCE PROGRAMS & SAMPLING DESIGN STRATEGIES

This section describes basic sampling design strategies that can be used to optimize Brownfields sampling and analysis plans (SAPs); introduces QA and QC concepts; and discusses terminology used during implementation of a Brownfields site assessment and the assessment of the resulting data. The concepts discussed in this section, related to preparation for collection, analysis, and review of site assessment data, will be helpful in the development and preparation of the QAPP for a Brownfields site assessment.

The last part of Section 4 discusses how the QAPP integrates all technical and quality aspects for the life cycle of the project — including planning, implementation, and assessment — to produce a project-specific road map for obtaining the type and quality of environmental data needed for a specific decision. Appendix A of this document contains forms that the team can use directly or as a guide for writing the QAPP. Some of the forms may not be necessary depending on site conditions and sampling design. The following elements related to sampling should be included in the QAPP:

•  Sampling design (form E of the QAPP template);
•  Sampling methods (form F-1);
•  Sample handling and custody (form K);
•  Analytical methods (forms F-1 and F-2);
•  Quality control (form M);
•  Instrument/equipment testing, inspection, and maintenance (forms G and I);
•  Instrument calibration frequency (form J); and
•  Data management (form N).

The design and extent of a Brownfields site assessment will be dictated largely by the conceptual site model, the availability of resources, and the required data quality and level of quality control exercised. The DQO development process should define all aspects of the sampling design, and these details should be documented in the QAPP.


Definitive Data

Definitive data are documented to be appropriate for rigorous uses that require both hazardous substance identification and concentration and are generated using methods that produce data suitable for scrutiny under the data validation and useability criteria described later in this section. Definitive data are analyte-specific, with confirmation of analyte identity and concentration. Methods produce tangible raw data (e.g., chromatograms, spectra, numerical values) as paper printouts or computer-generated electronic files. Definitive data may be generated at the site or at an offsite location, as long as the QA/QC requirements of the method are satisfied.

For data to be definitive, either analytical or total measurement error should be determined. For further guidance on definitive data, refer to Guidance for Performing Site Inspections Under CERCLA, Interim Final, and Guidance for Data Useability in Risk Assessment. These documents are available through NTIS at http://www.ntis.gov/envirn/envirn.htm.

Quality Assurance

QA is an integrated system of management activities that are part of a project's planning, implementation, assessment, reporting, and quality improvement. These activities ensure that products are of the type and quality expected.

QA will be an integral part of a Brownfields site assessment because it provides the road map to all activities necessary to collect data of known and adequate quality. Data of adequate quality will give the team a sufficient level of confidence in the data to make informed decisions about the redevelopment of the site, including the following: the threat posed by the contamination, potential site remediation alternatives, and additional projects needed to prepare the site for redevelopment.

The QAPP provides the framework for the Brownfields project's QA program by outlining activities that promote the collection of data with the accuracy and precision required for the project. Some elements of the QA program include the following:

•  Staff organization and responsibility (form B of the QAPP template);
•  Standard operating procedures (SOPs) for sampling and analytical methods (form F-1);
•  Field and laboratory calibration procedures (forms H and J);
•  Routine and periodic quality control activities (form M);
•  Data assessment procedures (form O); and
•  Data reduction, validation, and reporting procedures (forms P, Q-1, Q-2, and R).

Quality Control

Quality control (QC) is integral to the success of a QA program. It is the overall system of technical activities that measures the performance of a process against defined standards to verify that predefined requirements are met. Since errors can occur in the field, the laboratory, or the office, QC must be part of each of these functions.

An example of a QC activity is collection of a rinsate blank sample. When equipment is cleaned and reused in the field, the sampling team will collect a sample of the spent rinse water. Analysis of this sample will show whether the equipment was sufficiently cleaned or whether hazardous substances remained on the equipment that could contaminate the next sample. The data from the rinsate blank measure the performance of decontamination procedures in the field. If contaminants are found in the rinsate blank, other samples collected with the same equipment may also be contaminated and may not meet the requirements established by the DQOs. QA and QC procedures are discussed below in the context of expected environmental measurement activities for Brownfields site assessments.
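As an illustration of the rinsate-blank logic described above, the sketch below flags field samples that share equipment with a contaminated blank. This is not an EPA procedure; the analyte names, concentrations, and threshold are hypothetical.

```python
# Illustrative sketch (not from the guidance): flagging field samples when a
# rinsate (equipment) blank shows carryover contamination.
# Analytes, concentrations (ug/L), and the blank threshold are hypothetical.

def flag_samples(rinsate_blank, samples, threshold=5.0):
    """Return IDs of samples whose requested analytes appear in a dirty blank.

    rinsate_blank: dict analyte -> measured concentration in the blank
    samples: dict sample_id -> set of analytes requested for that sample
    threshold: blank concentration above which decontamination is suspect
    """
    dirty_analytes = {a for a, conc in rinsate_blank.items() if conc > threshold}
    return sorted(sid for sid, analytes in samples.items()
                  if analytes & dirty_analytes)

blank = {"lead": 12.0, "benzene": 0.4}          # lead exceeds the threshold
field_samples = {
    "SS-01": {"lead", "benzene"},
    "SS-02": {"benzene"},
    "SS-03": {"lead"},
}
print(flag_samples(blank, field_samples))        # → ['SS-01', 'SS-03']
```

Flagged samples would be qualified during data review rather than discarded outright, consistent with the useability assessment described later in this section.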

QA and QC parameters apply to the two primary types of data — definitive and nondefinitive — and depend on whether the data collection activity involves field measurements or laboratory measurements. The following boxes provide definitions of these terms.


Nondefinitive Data

Nondefinitive data are frequently collected during the first stage of a multi-phase screening assessment using rapid, less precise methods of analysis with less rigorous sample preparation. Nondefinitive data can provide analyte identification and quantification, although both may be relatively imprecise. Typically, 10% of nondefinitive samples, or all critical samples, are confirmed using the analytical methods and QA/QC procedures and criteria associated with definitive data. Nondefinitive data without associated confirmation data are of unknown quality.

Qualitative, nondefinitive data identify the presence of contaminants and classes of contaminants and can help focus the collection of definitive data, which is generally the more expensive of the two.

Each site assessment should be guided by a detailed description of the work to be performed in the SAP. The SAP takes the conceptual site model developed in the DQO process and translates it into a sampling and analysis design that identifies where, how, how many, and what types of samples will be collected; how the samples will be stored and transported; how, when, and by what method the samples will be analyzed; and what procedures and records will be used to track the samples through the process. Depending on the complexity of the Brownfields site, multiple SAPs may be needed. QA and QC parameters should be described in detail for each of these steps and include specific corrective actions to be taken if difficulties are encountered in the field. Guidance documents on sampling methods are listed in Appendix C.

Sampling Design Strategies

Sampling design strategies should factor in the conditions unique to the site being considered for redevelopment, including data gaps in the conceptual site model, exposure potential, projected site reuse, and available resources.

Step 7 of the DQO process should identify several possible sampling design strategies. Some of the variables that may be used in these strategies to bring down the cost of the project are described below. The details of the selected sampling design can be documented on forms E through N of the QAPP template; the overall sampling design is described on form E.

Unique site conditions that may call for a certain strategy include a site with buildings slated for reuse. In this situation, non-routine sampling and analysis may be required for unusual sample matrices, such as building materials.

The main sampling design decision is whether a statistical (probability-based) or judgmental (nonrandom or biased) sampling design should be employed. Judgmental sampling is a useful design when the team wants to characterize areas of suspected contamination. Statistical sampling designs are suited to evaluating trends and estimating the distribution of contaminants. Sometimes both judgmental and statistical sampling are required at a single site. An important distinction of statistical sampling designs is that they are usually required when the level of confidence needs to be quantified.

For surface soil sampling in residential reuse scenarios, a statistical sampling design is likely to be chosen because a quantitative statement of the decision error will be needed to show that the level of any contamination at the site is safe. Industrial reuse may not require as rigorous a result, or may be possible with a qualitative statement, at sites where exposure is not possible.

When historical data are unavailable to indicate discrete areas of contamination (i.e., hot spots), a useful strategy is to collect samples along a grid. Grid sampling is designed to cover the entire site with samples collected at regularly spaced intervals. The goal of grid sampling is to reduce the probability of making a decision error; it allows calculation of the probability that hot spots remain undetected. This technique could be particularly useful on the previously unsampled western portion of the Springfield site, where industrial activities are known to have been carried out. Different types of grids are available and are generally selected based on ease of layout in the field, sufficient detection capability, and cost.
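The grid-spacing tradeoff can be illustrated with a small simulation. The sketch below is illustrative only (the circular hot-spot geometry and all numbers are assumptions, not from the guidance): it estimates the probability that a circular hot spot escapes detection by a square grid and compares it with the closed-form value for a hot spot smaller than the grid spacing.

```python
# Monte Carlo sketch (illustrative, not from the guidance): a circular hot
# spot of radius r is "detected" when at least one node of a square grid
# with spacing s falls inside it.
import math
import random

def miss_probability(spacing, radius, trials=100_000, seed=1):
    random.seed(seed)
    misses = 0
    for _ in range(trials):
        # By symmetry, the hot-spot center can be placed in a single grid cell.
        x, y = random.uniform(0, spacing), random.uniform(0, spacing)
        # The nearest grid node is one of the four cell corners.
        dx, dy = min(x, spacing - x), min(y, spacing - y)
        if math.hypot(dx, dy) > radius:
            misses += 1
    return misses / trials

# For radius <= spacing/2 the exact miss probability is 1 - pi*r^2/s^2.
s, r = 10.0, 2.0
print(round(miss_probability(s, r), 3), round(1 - math.pi * r**2 / s**2, 3))
```

Halving the spacing quadruples the number of samples but sharply lowers the miss probability, which is the cost/detection tradeoff described above.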


Factors affecting the success of grid sampling include the size and shape of the grid and the size and shape of the hot spot. If the hot spot is small relative to the grid spacing, the probability of missing it will be relatively high. The cost of grid sampling is determined by the number of samples, which is in turn determined by the grid shape and spacing. Closer spacing yields a higher probability of detecting a hot spot.

If data are needed to determine whether groundwater onsite is contaminated, a statistical sampling design would be unnecessary and impractical because of the cost of installing groundwater wells. Judgmental groundwater samples might be collected from nearby wells, or, if the budget can bear it, one or more wells may be located where the contamination is most likely to be present. Appendix C identifies several groundwater sampling and monitoring guidance documents.

Errors in judgmental sampling may come from cross-contamination of samples in the field or improper calibration, maintenance, and use of field equipment (these issues are discussed later in this document). To ensure useable data if these problems arise, the field team should have previously defined alternative options, such as routing samples to a fixed laboratory. QC samples (discussed later in this section) will also be helpful.

Because of the need for quantification of the decision error in the public park reuse scenario, a statistically based sampling design will be required; such a design translates the results from a limited number of samples into an estimate of the contamination on part of the site.

While this document does provide some description of how statistics are used to assist in the decision process, statistical expertise and support should be enlisted throughout the project to ensure a defensible environmental decision is possible at the end of the project. For more information, refer to EPA Quality Assurance Division's (QAD) Guidance for the Data Quality Objectives Process and other documents listed in Appendix C. To quantitatively demonstrate that a specific level of certainty has been achieved for site decisions, individual data sets must be of sufficient quality, and the overall statistical analysis (which integrates information from the individual data sets into a site decision) must be able to support the site decision.

Section 3 introduced the concept of testing the null hypothesis and its relation to decision errors. A false positive decision error is the erroneous rejection of the null hypothesis; a false negative decision error is the erroneous acceptance of the null hypothesis. At the Springfield site, the null hypothesis is that the site is dirty. The false positive decision is concluding that the site is clean when it is dirty; the false negative decision is concluding that the site is dirty when it is clean.

If a goal of the project is to document whether onsite contaminant concentrations are higher than background concentrations, previous data may help estimate the relative difference between the background concentrations and the onsite concentrations. Using that estimate and the predefined degree of statistical certainty, the team can calculate the number of samples required. The greater the difference between background and site contaminant levels, the easier it is to document and quantify that difference; therefore, fewer samples are needed.

The amount of contaminant variability onsite also directly affects the planning and implementation of the sampling design. The higher the variability, the greater the number of samples that must be collected to adequately document that variability. If previous data can help estimate the amount of variability, sampling protocols can be optimized during project planning.
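The decision-error definitions above can be made concrete with a small simulation. The sketch below is hypothetical throughout (action level, variability, sample count, and the use of a one-sided z-test with known sigma are all simplifying assumptions): it estimates how often a truly dirty site would be declared clean, i.e., the false positive rate under the "site is dirty" null hypothesis.

```python
# Illustrative simulation (numbers hypothetical): with the null hypothesis
# "the site is dirty" (true mean >= action level), declaring the site clean
# when it is not is the false positive decision error described above.
import random
import statistics
from statistics import NormalDist

def declare_clean(measurements, action_level, sigma, alpha=0.10):
    """One-sided z-test sketch: reject the 'dirty' null only if the sample
    mean is convincingly below the action level (known sigma assumed)."""
    n = len(measurements)
    z = (statistics.mean(measurements) - action_level) / (sigma / n ** 0.5)
    return z < NormalDist().inv_cdf(alpha)   # inv_cdf(0.10) is negative

random.seed(7)
action_level, sigma, n = 400.0, 100.0, 42    # e.g. a soil contaminant, mg/kg
# Simulate a site whose true mean sits exactly at the action level (dirty).
false_positives = sum(
    declare_clean([random.gauss(action_level, sigma) for _ in range(n)],
                  action_level, sigma)
    for _ in range(5000)
)
print(false_positives / 5000)   # empirical rate, close to alpha = 0.10
```

The empirical rate hovers near the chosen alpha, which is exactly the "false positive decision error rate" that the DQO process asks the team to set in advance.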


EXHIBIT 5
NUMBER OF SAMPLES REQUIRED TO ACHIEVE GIVEN RATES OF
FALSE POSITIVE AND NEGATIVE, AND MDRD*

False Positive   False Negative   MDRD   Number of Samples
     10%              10%          10%          42
     10%              10%          20%          12
     20%              10%          20%           8
     20%              20%          10%          19
     20%              20%          20%           5
     20%              10%          40%           3

*The number of samples is based on known variability in contaminant levels represented by a coefficient of variation (CV) of 25%, a random sampling design, and a normal distribution. Minimum detectable relative difference (MDRD) is used when discriminating site levels from background levels; it is the percent difference between the two detected levels.
Source: EPA. Guidance for Data Useability in Risk Assessment. April 1992.
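The sample counts in Exhibit 5 are consistent with the standard normal-approximation sample-size formula, n = (z_fp + z_fn)^2 (CV/MDRD)^2 + 0.5 z_fp^2, rounded up. The sketch below reproduces several of the exhibit's rows under that assumption; treat it as an illustration of where the numbers come from, not a substitute for the statistical support recommended in the text.

```python
# Hedged reconstruction of Exhibit 5: normal-approximation sample size
#   n = (z_fp + z_fn)^2 * (CV / MDRD)^2 + 0.5 * z_fp^2,  rounded up,
# where z_fp and z_fn are the standard normal quantiles for the chosen
# false positive and false negative rates.
import math
from statistics import NormalDist

def samples_required(false_pos, false_neg, mdrd, cv=0.25):
    z = NormalDist().inv_cdf
    z_fp, z_fn = z(1 - false_pos), z(1 - false_neg)
    n = (z_fp + z_fn) ** 2 * (cv / mdrd) ** 2 + 0.5 * z_fp ** 2
    return math.ceil(n)

# Rows of Exhibit 5: (false positive, false negative, MDRD) -> samples
for fp, fn, mdrd in [(0.10, 0.10, 0.10), (0.10, 0.10, 0.20),
                     (0.20, 0.10, 0.20), (0.20, 0.20, 0.20),
                     (0.20, 0.10, 0.40)]:
    print(fp, fn, mdrd, samples_required(fp, fn, mdrd))
```

Note how the count falls rapidly as the MDRD grows (42 samples at a 10% difference versus 3 at 40%), which is the relationship the surrounding text describes.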

Exhibit 5 can be used in some cases to optimize the sampling design by keeping samples to a minimum while maintaining a known level of confidence. It illustrates the number of samples required for selected decision error rates given a statistical sampling design, the requirement to differentiate site levels from background levels (minimum detectable relative difference (MDRD)), and a 25% variability in contaminant levels onsite. It also assumes normally distributed data; lognormally distributed data should be transformed to a normal distribution before the methods described here can be applied.

These parameters, the MDRD and the variability in contaminant concentrations (coefficient of variation (CV)), can be estimated based on previously collected data at the site being assessed or by using data from sites with similar uses and substances. If the CV is low (that is, the concentrations on the site do not vary greatly), fewer samples are needed to achieve the same decision error rates. The MDRD represents the difference between concentrations of substances on the site and substances in the background. The greater the difference, the easier it is to distinguish between background and site levels, and therefore fewer samples are needed for the same decision error rate.

Exhibit 5 also provides an overview of the interplay of site characteristics and decision error rates in the sampling design by illustrating the number of samples needed to meet specified decision error rates in given circumstances. In general, the number of samples increases as the desired error rate decreases. The exhibit shows, for example, that achieving a false positive decision error rate of 20% requires 4 fewer samples than achieving a rate of 10%, other factors being equal. For a given decision error rate, the number of samples also tends to increase with decreasing MDRD. For example, to achieve the same decision error rate, 42 samples are required if background and onsite contaminant levels differ by only 10%; if they differ by 20%, only 12 samples are required.

A change in the coefficient of variation, the factor held constant in Exhibit 5, also can affect the number of samples required for a given decision error rate. In general, as the variability in the concentration of a chemical of concern increases, more samples are needed to adequately represent the onsite concentrations of that chemical.

An underestimation of the actual variability in existing contamination is the most likely reason that results would fall short of the desired confidence level. The team should prepare to perform the necessary calculations as soon as the data results become available to determine whether the desired confidence level is being met. If not, costly resampling or reanalysis of critical samples may be necessary.


For the public park reuse portion of the property in the Springfield example in Section 3, 40 to 50 samples collected in a statistical design (assuming no need to compare to background (MDRD) and a variability factor of 25%) will produce a statistical confidence of 10% false negative and positive decision error rates (assuming normally distributed data).

If the Springfield team had not optimized the sampling and analysis design through the DQO planning process, either too many or too few samples may have been collected.

•  If too many samples were collected and unnecessary analyses were conducted, money that could have been spent on other projects would have been wasted.

•  If too few samples were collected, the data may be insufficient to confidently allow safe reuse of the property.

Multi-phase Investigations

A single sampling event may not provide an adequate characterization of the contamination onsite, especially when the conceptual site model contains significant data gaps. In these situations, multi-phase sampling may be helpful. The need for this sort of investigation should be identified during the DQO process.

Based on a review of existing site assessment reports, the team should determine whether the previous data are of sufficient quality and quantity to guide the sampling design or whether some preliminary information must be obtained to properly plan for collection of the predefined samples. The team should base the assessment of existing data on a review of the documented quality of the data and the decision rule criteria established as part of the DQO process.

If a multi-phase assessment is selected, preliminary activities should be developed using guidelines found in ASTM E1527 (Standard Practice for Environmental Site Assessments: Phase I Environmental Site Assessment Process) or a similar, accepted protocol. When historical information is scarce, preliminary sampling designs may include measurement of contamination in soils, groundwater, and surface water. To develop this information relatively inexpensively, field technologies may be used, including real-time detection instruments such as a photoionization detector or organic vapor analyzer. These tools provide ranges of concentrations of classes of substances. More definitive results can be obtained with immunoassay and x-ray fluorescence instruments, which are discussed later in this document. Results of the preliminary study should enable the team to plan a more focused and cost-effective sampling design.

When no previous data are available to estimate the variability of the contamination at the site, a multi-phase investigation may be useful. A preliminary investigation may be used to calculate an estimate of variability in site contaminant concentrations; this estimate can then be used to determine the number of samples required to achieve the specified confidence level or decision error rate. Not knowing the actual variability in existing contamination is the most likely reason for results to fall short of the desired confidence level.

Because of the expense of multiple field mobilizations, a dynamic work plan that combines two phases of sampling into a single mobilization can help keep down costs. A dynamic work plan combines field technologies that quickly provide the data needed for planning the next phase of data acquisition with adaptive sampling designs that are flexible enough to respond to data generated in the field. For example, field data could be collected to demonstrate the variability of chemical concentrations, which influences the number of samples required to reach a specific decision error rate.
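The adaptive, two-phase idea above can be sketched as follows: a small phase-1 data set collected with field methods yields a CV estimate, which then sizes the phase-2 round within the same mobilization. The sample-size formula is the normal approximation consistent with Exhibit 5; the phase-1 readings are hypothetical.

```python
# Sketch of an adaptive (single-mobilization, two-phase) design.
# The phase-1 field readings are hypothetical; the sample-size formula is
# the normal approximation consistent with Exhibit 5.
import math
import statistics
from statistics import NormalDist

def phase2_sample_count(phase1_results, mdrd, false_pos=0.10, false_neg=0.10):
    """Estimate CV from phase-1 field data, then size the phase-2 round."""
    cv = statistics.stdev(phase1_results) / statistics.mean(phase1_results)
    z = NormalDist().inv_cdf
    z_fp, z_fn = z(1 - false_pos), z(1 - false_neg)
    return math.ceil((z_fp + z_fn) ** 2 * (cv / mdrd) ** 2 + 0.5 * z_fp ** 2)

phase1 = [210.0, 250.0, 190.0, 240.0, 230.0, 205.0]   # field readings, mg/kg
print(phase2_sample_count(phase1, mdrd=0.20))
```

Because the phase-1 readings here vary little (CV near 10%), very few phase-2 samples are needed; a noisier phase-1 data set would drive the count up, which is exactly why estimating variability early pays off.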

Types of Samples

The types of samples to be collected and analyzed are determined during the DQO process. Like other variables of the sampling design, the type of sample collected depends on the team's predefined decision error rate, the required degree of accuracy, the spatial and temporal variability of the media, and cost.


The paragraphs below discuss grab and composite samples and how they can be used in the sampling design.

A sample can be collected discretely or as a composite. Discrete samples, called grab samples, are taken at a single location and are useful for identifying and quantifying chemicals in areas of a site where contamination is suspected. For example, grab samples collected around tanks and drums can answer the questions of whether, and what, substances may have leaked from them.

To identify average contamination across a site, composite samples may be more appropriate. Composite samples are composed of more than one discrete sample taken at different locations. The discrete samples are mixed to obtain a homogeneous single composite sample, which is then analyzed. Composite sampling allows sampling of a larger area while controlling laboratory analytical costs, because several discrete samples are physically mixed and one or more samples are drawn from the mixture for analysis. The drawbacks of composite samples are related to the averaging of the contamination levels in the discrete samples: compositing minimizes the significance of low levels of contamination and may mask locations where contamination is above the action level. The number of composite samples, and the number of individual samples within a composite sample, should be based on the decision error rate goals established during the DQO process.

Collection of composite samples was not recommended at the Springfield site described in Section 3 because the samples were to be analyzed for compounds that may volatilize during mixing. If the contaminant of concern were a metal, then compositing samples to reduce the number of samples and analyses would have been cost-effective.

The Brownfields team should request standard operating procedures (SOPs) for sampling activities from its contractor. (SOPs are discussed in more detail later in this section.) The SOPs should describe all aspects of sample collection, including how to determine if the sample is representative, handling and custody requirements, and the sample volume that must be collected to provide enough material for all necessary analyses. The laboratory chosen to perform the analyses should be able to provide volume requirements for the selected analytical methods.

Sometimes SOPs will call for the collection of extra volumes of a sample. For example, when sampling for VOCs, loss of the analytes through breakage of a sample container or through volatilization (loss of the container's headspace) can render the sample or analysis useless.
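The compositing drawback noted above is simple arithmetic: a composite of equal-volume aliquots measures roughly their average, so a single discrete exceedance can be diluted below the action level. A minimal sketch with hypothetical values:

```python
# Arithmetic sketch of the compositing drawback (all values hypothetical):
# one hot-spot aliquot over the action level is masked once it is mixed
# with cleaner material.

def composite_concentration(discrete_results):
    """A composite of equal-volume aliquots measures roughly their average."""
    return sum(discrete_results) / len(discrete_results)

action_level = 100.0                      # e.g. mg/kg
aliquots = [320.0, 20.0, 15.0, 25.0]      # one hot spot among clean aliquots
measured = composite_concentration(aliquots)
print(measured, measured > action_level)  # → 95.0 False (exceedance masked)
```

This is why the number of aliquots per composite should be tied to the decision error goals from the DQO process rather than chosen purely to minimize analytical cost.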

Background Samples

Some action levels are derived from naturally occurring or background concentrations. In these cases, teams should collect background or upgradient samples from nearby areas that are not impacted by site contamination.

Background samples are analyzed for the same parameters as the site samples to establish background concentrations of target analytes and compounds. Because they are collected in areas unaffected by the site, they indicate whether the concentration of a particular analyte in a sample is related to site activities.

When it is necessary to compare site contamination levels to background levels, it is helpful to collect and analyze background samples prior to the final determination of the sampling design, since the number of samples necessary for a specific decision error will be reduced if background concentrations are low.
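One way to see why early background results help is to compute the relative difference between site and background means, which is the MDRD input discussed in connection with Exhibit 5: the larger the difference, the fewer samples the design needs. A minimal sketch with hypothetical data:

```python
# Sketch (hypothetical data): the relative difference between site and
# background means is the MDRD input used when sizing the sampling design.
import statistics

def relative_difference(site_results, background_results):
    site_mean = statistics.mean(site_results)
    bg_mean = statistics.mean(background_results)
    return (site_mean - bg_mean) / bg_mean

background = [12.0, 15.0, 13.0, 14.0]     # mg/kg, upgradient samples
site = [20.0, 24.0, 18.0, 22.0]           # mg/kg, onsite samples
print(round(relative_difference(site, background), 2))   # → 0.56
```

A 56% relative difference, as here, sits well above the 40% MDRD row of Exhibit 5, so only a handful of samples would be needed at the same error rates; a marginal difference would push the design toward the 42-sample end of the table.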

Analytical Methods

The samples called for in the SAP will be analyzed either onsite or in a laboratory according to analytical methods selected during the DQO process. Analytical methods can be classified based on the medium (e.g., soil, water) from which the sample was taken, the sample preparation method, the chemicals for which the analysis is requested (analytes), the expected levels of the analytes (detection limits or ranges), the level of confidence in the results, and cost. Due to the variety of analytical methods available, the team should work closely with the laboratory to select appropriate methods that meet the predefined DQOs.

The team should pay particular attention to the action levels for the site decision when selecting analytical methods. Different methods have different detection limits, the lowest concentration of a contaminant that can be detected by a particular test method or analytical instrument. As contaminant concentrations in a sample decrease and approach the detection limit, they become increasingly difficult to quantify, and the instrument readings or test results become less reliable. Measurements that fall below the detection limit are not reliable and are usually reported as less than or equal to the detection limit. The team will want to select a method with a detection limit appropriate for the intended use of the data. For the decision of whether to clean up the site for public park use, the team will want to use a method that can accurately quantify concentrations below the action level.

Analytical methods can be varied during optimization of the sampling design to produce more cost-effective results. For example, if a sampling design option calls for a multi-phase investigation, the preliminary study may collect data using a broad spectrum analytical method to identify classes of compounds. Subsequent analyses can focus on compounds for which analyte- and class-specific methods are available, thereby producing less expensive and more accurate data than full spectrum methods. The combination of a multi-phase strategy and varied analytical methods allows for the collection of more samples without a loss of confidence or an increase in cost. Also, an early broad spectrum analysis increases the probability of identifying all contaminants of concern. This sampling design strategy would have been useful for the park reuse portion of the Springfield site, for which no previous sampling data exist, had the site history indicated a high likelihood of contamination.

Forms F-1 and F-2 of the QAPP template provide space for documentation of the analytical methods to be used in the sampling design. SOPs should be attached to the QAPP for all analytical methods: standard, non-modified, publicly available methods as well as nonstandard or modified methods that accommodate site-specific conditions. An example of a nonroutine method is analysis of samples from building materials that may have become contaminated. All field or mobile laboratory methods should also have clearly written SOPs.

An alternative approach to selecting analytical methods is the Performance Based Measurement System (PBMS). This approach is a partnership-style relationship with the laboratory that facilitates cost-effective method adaptations that best serve site-specific project needs. Instead of prescribing how to accomplish a task, PBMS statements of work describe in objective terms what performance standards must be met. If the team chooses this approach, it should work closely with the laboratory to document the site-specific performance of the method and how this performance supports the predefined data quality needs. More information on PBMS can be found at http://www.epa.gov/epaoswer/hazwaste/test/pbms.htm.

If the team decides to analyze samples in the field, it should be aware of any limitations of the field methods under consideration and ensure that the DQOs will be met. Two common types of field analytical methods are immunoassays and x-ray fluorescence (XRF). These methods can produce rapid results at relatively little cost, but they may have limited ability to identify certain contaminants and to reach very low detection limits. Ongoing technological advances are rapidly expanding the capabilities and usefulness of field analytical technologies. Information on the performance of field analytical technologies can be obtained from the Cleanup Information website at http://www.clu-in.com/supply1.htm. See Appendix C for additional references.
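The detection-limit reporting convention described earlier (results at or below the limit reported as less than or equal to it) can be sketched as a trivial helper. The function name and string format are illustrative, not a laboratory standard:

```python
# Illustrative reporting convention (names and format hypothetical):
# measurements at or below a method's detection limit are unreliable and
# are reported as "less than or equal to" the limit, as described above.

def report_result(measured, detection_limit):
    if measured <= detection_limit:
        return f"<={detection_limit}"
    return str(measured)

print(report_result(0.3, 0.5))    # below the detection limit → "<=0.5"
print(report_result(2.7, 0.5))    # quantifiable result → "2.7"
```

The practical consequence for method selection is visible here: if the action level sat below the detection limit, every result relevant to the decision would come back censored, which is why the text advises choosing a method whose detection limit is below the action level.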


Matrix interferences are common obstacles to successful sample analysis. For example, a clayey matrix may not release analytes of concern during sample preparation, causing the resulting levels to be biased low. The team should explore and document contingency plans to guide field work in the event that site-specific interferences hinder the reliability of a particular method. Contingency plans, which can be documented on form O of the QAPP template, can save considerable time and money by averting the downtime of expensive field teams and limiting the costs of producing non-informative data and of subsequent resampling.

Quality Control in the Field

Field quality control requirements and documentation of all field sampling and observations are critical to provide a historical record for future reviews and analysis of the useability of the data produced. The official field log book will contain documentation of field activities that involve the collection and measurement of environmental data. Additional forms may be used in the field to record related activities, as explained below.

SOPs delineate the step-by-step approach that field personnel will follow in collecting samples, taking field measurements, calibrating instruments, etc. Sampling and field analytical SOPs that may be used during a Brownfields site assessment include the following:

•  Sampling of surface and/or subsurface soil;
•  Wipe sampling;
•  Sampling of concrete and debris;
•  Sampling of groundwater and surface water;
•  Use of field analytical instrumentation, such as onsite GC, GC/MS, XRF, or other field measurement methods;
•  Monitoring well installation and development; and
•  Direct push sampling.

Most qualified sampling contractors and State and Federally certified laboratories develop SOPs and analytical methods as part of their overall QA program. These SOPs and analytical methods can be used to evaluate a potential contractor's competence to perform field activities or as a standard for conducting a field audit (described later in this section). SOPs are especially important for field activities, where significant error can be introduced into data measurements. Data review and acceptance depend on documentation that SOP protocols were followed. Any modifications to the SOPs during field work should be thoroughly documented. All SOPs used for a Brownfields site assessment should be included as appendices to the QAPP (see form F-1 of the QAPP template). For further reference, see Guidance for the Preparation of Standard Operating Procedures for Quality-Related Operations (see Appendix C).

SOPs are necessary for field activities includingcalibration, decontamination, and preventivemaintenance and should be a part of the SAP. The field team should document what SOP theyare using in the field and any deviations from theSOP.

Decontamination protocols describe methods, tools, and products used to clean reusable sampling equipment after sample collection to prevent contaminating the next collected sample. These protocols are sometimes dictated by the specific sampling SOPs. A field preventive maintenance protocol involves ensuring that all field equipment has been properly calibrated, charged, and inspected prior to and at the end of each working day, and that replacement parts are available.

Field Instrument/Equipment Inspection and Calibration

Sampling and analysis generally require the use of varied equipment and tools in the gathering of environmental data. All field equipment needs to be inspected to determine whether it is adequate for the media and parameters to be sampled and the tests to be performed.

Data may be generated onsite through the use of real-time equipment, such as a photoionization detector, an organic vapor analyzer, or a pH meter. A more detailed analysis may call for mobile lab-generated data.

The field-testing and mobile laboratory equipment should be examined to ensure that it is in working condition and properly calibrated. The calibration of field instruments should be performed according to the method and schedule of an SOP — usually based on the manufacturer's operating manual. Calibration should be performed more often as field conditions dictate.

Field Documentation

Generally, the Brownfields team records field activities in ink, in a bound notebook with prenumbered pages, or on a preprinted form. For each sampling event, the field team provides the site name and location, date, sampling start and finish times, names of field personnel, level of protection, documentation of any deviation from protocol, and signatures of field personnel.

For individual samples, field teams should document the exact location and time the sample was taken, any measurement made (with real-time equipment), a physical description of the sample, sample number, depth, volume, type of sample, and equipment used to collect the sample. This information can be critical to later evaluations of the resulting data's useability.

Individual samples should be labeled in the field. Labels should include sample location, sample number, date and time of collection, sample type, sampler's name, and method used to preserve the sample, if applicable. (Sample preservation involves the treatment of a sample, usually through the addition of a compound that adjusts pH, to retain the sample properties, including concentration of substances, until it can be analyzed.) The field team should follow a sample summary table similar to form F-2 of the QAPP template for each sampling event. The table should include a listing of the total number of samples, types of sample matrices, all analyses planned for each sample (differentiating critical measurements), and other information that may be relevant to later assessments of the data's useability.

The team should track the transfer of samples from the field to the laboratory with chain-of-custody forms. Information on the chain-of-custody forms should include the name of the laboratory, persons relinquishing and receiving samples, quantity of sample material, preservation solutions, test methods requested, units of measurement, and signatures of laboratory personnel. Custody procedures should be discussed in QAPP template form K: Sample Handling and Custody Requirements.

Field System Audits

During the initial stages of field activities, a QA representative from the Brownfields team should determine whether the field activities are following the protocols delineated in the QAPP. If, during the audit, the QA representative identifies deviations from the prescribed procedures, the field team manager should take on-the-spot actions to ensure that field activities are conducted in accordance with the QAPP. The QA representative should document any deficiencies encountered and the corrections made. Results of the audit should be maintained at the site assessment office as part of the document control program. Document control is discussed later in this section.
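The chain-of-custody record described above is essentially an append-only log of hand-offs. The following sketch shows one way such a record could be represented in project tooling; the class and field names are hypothetical illustrations, not a format defined by this guidance.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CustodyTransfer:
    """One hand-off of a sample between two custodians."""
    relinquished_by: str
    received_by: str
    timestamp: str  # e.g., "1998-09-14 10:30"

@dataclass
class ChainOfCustody:
    """Minimal chain-of-custody record for one sample shipment."""
    sample_number: str
    laboratory: str
    quantity: str
    preservation: str
    test_methods: List[str]
    transfers: List[CustodyTransfer] = field(default_factory=list)

    def sign_over(self, relinquished_by: str, received_by: str, timestamp: str) -> None:
        """Record a hand-off; each transfer keeps the custody trail unbroken."""
        self.transfers.append(CustodyTransfer(relinquished_by, received_by, timestamp))

# Hypothetical sample and personnel names, for illustration only.
coc = ChainOfCustody("SS-01", "Acme Labs", "2 x 4 oz", "4 deg C, dark",
                     ["CLP Organic SOW OLM 03.2"])
coc.sign_over("J. Field", "K. Courier", "1998-09-14 10:30")
coc.sign_over("K. Courier", "L. Lab", "1998-09-14 16:05")
print(len(coc.transfers))  # -> 2
```

A reviewer can then reconstruct every custodian of the sample from collection to analysis by walking the transfer list in order.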

Quality Control in the Laboratory

The team should select laboratories that have defined QA protocols. All laboratories used to analyze samples should have an overall Quality Assurance Plan available for review, including SOPs and analytical methods, internal QA/QC procedures and logs, and data review procedures. The team may decide to conduct a laboratory system audit to ensure that these plans and procedures are in place and in use.

The team may also decide to audit its laboratory by submitting performance evaluation (PE) samples to the laboratory with the other environmental samples collected at the Brownfields site. A PE sample is a sample of known composition provided for laboratory analysis to monitor laboratory and method performance. A PE sample can be used to rate the laboratory's ability to produce analytical results within the pre-set limits documented in the QAPP. PE samples may be the simplest and most cost-effective way to audit a laboratory.

Laboratories that participate in EPA's Contract Laboratory Program (CLP) and State programs typically analyze PE samples on a routine basis. The team should request a copy of the laboratory's PE results as part of its audit program. The team should rely on existing audit information, if available and relevant, to determine the reliability of a laboratory.

Quality Control Samples

QC samples are collected and analyzed to determine whether sample concentrations have changed between the time of sample collection and sample analysis, and if so, when and how. For example, cross-contamination may occur during sampling, and degradation may occur during storage. A field QC sample is a sample that is collected and produced in the field. A laboratory QC sample is prepared by the person conducting the first step of the sample analysis. The number of field and laboratory QC samples used during the project depends on the analytical method and the requirements of the QAPP. The general rule is that 10% of samples should be QC samples. This means that if 20 samples are collected, at least two additional samples should be submitted as QC samples. Exhibit 6 lists typical QC samples and the data they provide.

Three basic types of QC samples that are prepared in the field and laboratory include blanks, spikes, and replicates. A blank sample is a clean sample that has not been exposed to the sample medium being analyzed but is subjected to the same procedures used in the preparation and analysis of samples from that medium. The blanks may be exposed to the same decontamination, transport, storage, or analytical process as the regular samples to obtain a baseline value that may be used later to evaluate other data.
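The 10% rule of thumb is straightforward to apply when sizing a sampling plan. The helper below is a hypothetical planning aid, not part of the guidance itself; it rounds up so that any partial increment of ten planned samples still triggers one more QC sample.

```python
import math

def qc_sample_count(planned_samples: int, qc_fraction: float = 0.10) -> int:
    """Minimum number of QC samples under the 10% rule of thumb.

    Rounds up: 20 planned samples -> 2 QC samples,
    45 planned samples -> 5 QC samples.
    """
    if planned_samples < 0:
        raise ValueError("planned_samples must be non-negative")
    return math.ceil(planned_samples * qc_fraction)

# The guidance's example: 20 collected samples call for at least 2 QC samples.
print(qc_sample_count(20))  # -> 2
```

The same function reproduces the planning arithmetic for a 40-to-50-sample design, which yields four to five QC samples.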

EXHIBIT 6
TYPES OF QC SAMPLES

QC Sample                    Information Provided
Blanks                       Bias introduced during sampling and analysis
  field blanks                 field handling or transport
  rinsate blanks               contaminated equipment
  reagent blanks               contaminated reagent
  method blank                 any aspect of laboratory analytical system
Spikes                       Bias introduced in laboratory
  matrix spike                 preparation and analysis
  matrix spike duplicate       preparation and analysis precision
  analysis matrix spike        instrumentation
  surrogate spike              analysis
Duplicates, Splits, etc.     Precision
  co-located samples           sampling and analysis precision
  field duplicates             precision of all steps after sample collection
  field splits                 shipping and interlaboratory precision
  laboratory duplicates        analytical precision
  laboratory splits            interlaboratory precision
  analysis duplicates          instrument precision

Source: Guidance for Quality Assurance Project Plans, EPA QA/G-5, EPA 600-R-98-018, February 1998.
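For planning scripts, the mapping in Exhibit 6 can be captured as a simple lookup table. The entries below are transcribed from the exhibit; the dictionary representation itself is just one convenient, hypothetical encoding.

```python
# Exhibit 6 as a lookup: QC sample type -> information provided.
QC_SAMPLE_INFO = {
    "field blank": "field handling or transport",
    "rinsate blank": "contaminated equipment",
    "reagent blank": "contaminated reagent",
    "method blank": "any aspect of laboratory analytical system",
    "matrix spike": "preparation and analysis",
    "matrix spike duplicate": "preparation and analysis precision",
    "analysis matrix spike": "instrumentation",
    "surrogate spike": "analysis",
    "co-located samples": "sampling and analysis precision",
    "field duplicates": "precision of all steps after sample collection",
    "field splits": "shipping and interlaboratory precision",
    "laboratory duplicates": "analytical precision",
    "laboratory splits": "interlaboratory precision",
    "analysis duplicates": "instrument precision",
}

print(QC_SAMPLE_INFO["rinsate blank"])  # -> contaminated equipment
```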


A spike sample is a sample to which a known amount of a chemical has been added for the purpose of determining the efficiency of recovery of the analytes. These QC samples are particularly helpful during analysis of complex matrices (e.g., sediment or sludge). A duplicate sample is a second sample taken from the same source at the same time and analyzed under identical conditions. A split is a duplicate that is sent to a different laboratory for analysis. When more than one duplicate is collected, it is called a replicate.

For the portion of the Springfield site slated for reuse as a public park, the team planned for 40 to 50 samples. Following the 10% rule of thumb stated above, the team will need four to five QC samples. Considering the importance of the accuracy of the data, the team will be particularly interested in knowing if any bias is present in the data. Because historical data did not identify hazardous waste activities on this portion of the property, the team will not be as concerned about cross-contamination. One field QC sample will be collected to detect any contamination introduced by field procedures. Two laboratory QC samples, a matrix spike and a matrix spike duplicate, will be collected to check bias and precision. The fourth sample, a co-located sample, might be collected to test the precision of field collection procedures. Many of the QC samples are defined in the Glossary of Terms in Appendix B of this document.

Document Control

Document control is a crucial component of QA. Although it is critical to completion of the last stage of a Brownfields site assessment (review of data useability), it is sometimes overlooked. Data useability review depends on thorough documentation of predefined data specifications and the events that take place during implementation of the project which may cause the data to fall short of the predefined specifications.

The team should identify those documents that will be necessary to check for compliance with all field and laboratory procedures. As explained under the next heading of this document, expertise in analytical chemistry and statistics is necessary for the last stages of the assessment of data useability. Therefore, during planning for the Brownfields site assessment, these team members should be consulted to identify the types of documentation they will need. Critical types of documentation include the following:

Field Logbook
• Site sketch or map with location of each sample collection point
• Full descriptions of all deviations from analytical SOPs, SAP, and the QAPP
• Description of field sampling conditions and physical parameter data as appropriate for the media involved

QAPP
• Site description, including surrounding structures and terrain features, nearby populations, flow directions of relevant media, and a description of active industrial processes
• Description and rationale for sampling design and procedures and references to all SOPs

Field SOPs
• Sampling, decontamination, and calibration procedures

Analytical SOPs
• Analytical methods used, sample tracking and log-in procedures

Laboratory Deliverables
• Narrative explanation of the level of analytical data review used by the laboratory and the resulting data qualifiers, indicating direction of bias based on the assessment of QC samples (e.g., blanks, field and laboratory spikes)
• Results for each analyte and sample qualified for analytical limitations
• Sample quantitation limits (SQLs) and detection limits for undetected analytes, with an explanation of the detection limits reported and any qualifications
• Instrument printouts and logbooks, spectra, and raw data

Laboratory Notebook
• Full descriptions of all deviations from analytical SOPs, SAP, and the QAPP

Custody Records
• Chain-of-custody forms
• Laboratory custody records

Additionally, project-specific documentation may include status reports, teleconference records, and other correspondence.

The documents and records that should be tracked and secured should be listed on form P of the QAPP template. The list should identify the party responsible for producing these reports and provide directions for where they should be stored or sent.

Documentation permits the reviewer to trace a sample from collection to analysis and reporting of results. The goal of the document control program is to account for all necessary project documents produced during planning, implementation, or analysis.

Grantees under EPA's Brownfields program should adhere to program requirements for record retention found in 40 CFR Subpart O. Requirements include a numerical document control system, a document inventory procedure, and a central filing system with a designated person(s) responsible for its maintenance.

Deliverables for Data Useability Review

Review of the useability of data culminates in the determination of whether actual data meet the data objectives. This determination is impossible without the development, implementation, and documentation of procedures used for sample collection, shipment, analysis, and data reporting. This information will allow QA reviewers to determine, with reasonable certainty, whether data collected during the Brownfields site assessment meet the DQO criteria documented in the QAPP. As in all aspects of a Brownfields site assessment, the activities that occur during this stage of the project should be planned and documented in the QAPP. Forms Q-1, Q-2, and R provide space for the team to describe what procedures will be performed during data useability review.

Through data useability review, the team determines whether the project has performed within the specifications in the planning objectives. The decision to accept data, reject data, or accept only a portion of the data should be made after consideration and analysis of all parameters described in the QAPP. The team should have an analytical chemist and a statistician for portions of this assessment, but some items do not require special expertise. The data useability review begins with an analysis of the data on their own merit and ends with a statistical reconciliation of the data with site-specific data quality objectives.

The stages of data useability review generally begin with an evaluation of the effectiveness of the sampling operations, their conformance to the SAP and SOPs, and whether any unusual circumstances are documented in the field logs.

The five data quality indicators that should be assessed during the review of field procedures are completeness, comparability, representativeness, precision, and bias (accuracy). These terms are described in Appendix B of this document.

Some activities that should be carried out include identifying all samples (including locations and analytes) called for in the QAPP and comparing those to the sample locations documented in the field log and on a field map. These samples are also compared to the sample results submitted by the laboratory to see if the predefined number of samples was actually analyzed. This review should also check whether the predefined analytical methods and detection limits were used. These reviews should give an indication of the completeness of the data. Any discrepancies that cannot be resolved may affect the level of certainty in the final decision about site reuse.
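The completeness review described above (planned samples versus the field log versus laboratory results) reduces to simple set comparisons. The sketch below is a hypothetical illustration of that check, using made-up sample identifiers.

```python
def completeness_report(planned, logged, analyzed):
    """Compare planned sample IDs against the field log and lab results.

    Returns the IDs missing at each stage plus a completeness percentage
    (analyzed samples as a share of planned samples).
    """
    planned, logged, analyzed = set(planned), set(logged), set(analyzed)
    return {
        "not_collected": sorted(planned - logged),
        "not_analyzed": sorted(logged - analyzed),
        "completeness_pct": 100.0 * len(planned & analyzed) / len(planned),
    }

report = completeness_report(
    planned=["SS-01", "SS-02", "SS-03", "SS-04"],
    logged=["SS-01", "SS-02", "SS-03"],
    analyzed=["SS-01", "SS-02"],
)
print(report["completeness_pct"])  # -> 50.0
```

Any IDs surfacing in "not_collected" or "not_analyzed" are exactly the discrepancies that, if unresolved, reduce certainty in the final decision about site reuse.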


The next step is to verify the analytical data. This activity is often guided by the analytical method used, and several guidance documents are available that provide a framework for data verification based on the analytical method. QAPP form Q-2 provides space to describe the process to be used during data verification.

During data verification (some EPA guidance documents use the terms verification and validation interchangeably), an analytical chemist reviews the results of QC samples to identify sources of error in the data overall, on a sample basis, and analyte by analyte. The reviewer will look at sample holding times that may affect the data from a single sample, and calibration and analyte recoveries that may affect the quantification of individual analytes. After the data verification is complete, the team should have a better idea of what chemicals are present at what levels and what threats the site poses.

The five data quality indicators listed above should also be applied when verifying the analytical data. The QAPP should describe the level of verification that will be required.

The last stage of data useability review is validation, which should be carried out by the statistician to determine whether the data can support their intended use. The QAPP should explain on form R how the results will be reconciled with the predefined data requirements. Methods for determining possible deviations from planning assumptions should be described. The QAPP should also specify how the limitations on data use will be reported. EPA's Guidance for Data Quality Assessment provides some information on this determination (see Appendix C for the full reference).

Improving Data Useability

The team should plan for corrective actions to improve data useability when performance fails to meet objectives for data that are critical to the Brownfields site assessment. Corrective actions can be costly (resampling) or relatively inexpensive (requesting additional information from the laboratory). Much of this information is determined by the team as part of Step 6 of the DQO process: Specifying Limits on Decision Errors. Corrective actions are intended to improve data quality and reduce uncertainty, and may eliminate the need to qualify or reject data. Some corrective actions include the following:

• Resolving technical or procedural problems by requesting additional explanation or clarification from the technical team;
• Requesting reanalysis of sample(s) from the extract stored at the laboratory;
• Requesting construction and re-interpretation of analytical results from the laboratory or team chemist;
• Requesting additional sample collection and analysis for site or background characterization; modeling potential impacts on uncertainty using sensitivity analysis to determine the range of effect;
• Retrieving missing information;
• Adjusting or questioning data based on approved default options and routines; and
• Qualifying or rejecting data for use in the site assessment.

Summary

This document has presented the DQO process as a systematic planning tool for cost-effective site assessments. It has introduced elements of a QAPP, which helps the team maintain control of the project and achieve its objectives through use of QA and QC measurements. It also discussed elements of sampling strategies and how to determine whether resulting data meet project objectives. These tools will help municipalities, Tribes, and States reduce the environmental uncertainty associated with Brownfields sites by producing environmental measurement data of known and adequate quality. This will, in turn, support defensible decision-making and stakeholder satisfaction. The QAPP template that follows in Appendix A can be used to guide the design of a QAPP, or the forms can simply be reproduced and completed as the QAPP.


A-1

Appendix A - Model Quality Assurance Project Plan

INTRODUCTION

The EPA requires that all environmental monitoring and measurement efforts participate in a centrally managed quality assurance (QA) program.

Any Brownfields team generating data under this quality assurance program has the responsibility to implement minimum procedures to ensure that the precision, accuracy, completeness, and representativeness of its data are known and documented. To ensure the responsibility is met uniformly, each party should prepare a written QA Project Plan (QAPP) covering each project it is to perform.

The QAPP documents the project planning process, enhances the credibility of sampling results, produces data of known quality, and saves resources by reducing errors and the time and money spent correcting them. The QAPP is a formal document describing in comprehensive detail the necessary QA, quality control (QC), and other technical activities that should be implemented to ensure that the results of the work performed will satisfy the stated performance criteria.

All QA/QC procedures should be in accordance with applicable professional technical standards, EPA requirements, government regulations and guidelines, and specific project goals and requirements.

The tables and figures contained in the Appendices to this document can be used to compile the Brownfields Site QAPP. These forms can be reproduced or downloaded from EPA's Brownfields web page located at http://www.epa.gov/swerosps/bf/, or similar forms with the same requirements can be created. Standard Operating Procedures (SOPs) and standard analytical methods should be referenced in the text and included as appendices to the Brownfields Site QAPP. SOPs should be referenced in the QAPP by title, date, revision number, and the originator's name.
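The referencing convention above (title, date, revision number, and originator's name for each SOP) can be carried into project tooling as a small record type. This is an illustrative sketch only; the class name and citation format are assumptions, not anything prescribed by the guidance.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SOPReference:
    """An SOP citation carrying the four fields the QAPP text should give."""
    title: str
    date: str
    revision: str
    originator: str

    def citation(self) -> str:
        # Render the four required fields as a single citation string.
        return f"{self.title}, {self.date}, Rev. {self.revision} ({self.originator})"

# Hypothetical SOP, for illustration only.
sop = SOPReference("Surface Soil Sampling", "1998-06-01", "2", "J. Smith")
print(sop.citation())  # -> Surface Soil Sampling, 1998-06-01, Rev. 2 (J. Smith)
```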


A-3

Brownfields Quality Assurance Project Plan

TABLE OF CONTENTS

FORMS

Project Management
Form A    Title and Approval Page
Form B    Project Organization and Responsibility
Form C    Problem Definition
Form D    Project Description/Project Timeline

Measurement Data Acquisition
Form E    Sampling Design
Form F-1  Method and SOP Reference Table
Form F-2  Sampling and Analytical Methods Requirements
Form G    Preventive Maintenance - Field Equipment
Form H    Calibration and Corrective Action - Field Equipment
Form I    Preventive Maintenance - Laboratory Equipment
Form J    Calibration and Corrective Action - Laboratory Equipment
Form K    Sample Handling and Custody Requirements
Form L    Analytical Precision and Accuracy
Form M    Field Quality Control Requirements/Laboratory Quality Control Requirements
Form N    Data Management and Documentation

Assessment/Oversight
Form O    Assessment and Response Actions
Form P    Project Reports

Data Validation and Useability
Form Q-1  Verification of Sampling Procedures
Form Q-2  Data Verification and Validation
Form R    Data Useability


A-5

Form A

Title and Approval Page

Document Title

Prepared by: (Preparer’s Name and Organizational Affiliation)

Address and Telephone Number

Day/Month/Year

Project Manager: Signature

Printed Name/Date

Project QA Officer:

Signature

Printed Name/Date

U.S. EPA Project Manager Approval: Signature

Printed Name/Date

U.S. EPA QA Officer Approval: Signature

Printed Name/Date


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-6

Form B

Project Organization and Responsibility
(Fill in the blanks, if applicable; otherwise insert another project-specific chart.)
Develop an organizational chart that identifies the chain of command of key personnel, including the QA representative. Include titles, responsibilities, and organizational affiliation of all project participants.


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-7

Form C

Problem Definition (use multiple pages if needed)
Briefly state the specific problem that the data collection project is designed to solve or the decisions to be made (i.e., the project objectives). Include relevant characteristics of the site, such as site use history, suspected locations and identification of contaminants, range of contaminant concentrations, media that may be affected, and likely migration routes. Cite previous studies that indicate why the project is needed.


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-8

Form D

Project Description (use multiple pages if necessary)
Provide a detailed description of the work to be performed, e.g., identify media to be sampled, whether field or fixed laboratories will be used, whether field analytical methods will be used, likely action levels, work schedules, required reports, etc.


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-9

Form D (Cont.)

Project Timeline
Prepare an overall project timetable that outlines beginning and ending dates for the entire project as well as specific activities and products within the project.

Activities (list products)    Activity Start Dates (MM/DD/YY)    Activity End Dates (MM/DD/YY)


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-10

Form E

Sampling Design (use multiple pages if needed)
Discuss the project sampling design and provide a rationale for the choice of sampling locations for each parameter/matrix to be sampled during this project, e.g., a judgmental sampling strategy with broad spectrum analysis using methods from SW-846. Identify action levels. Attach a detailed site map with anticipated sampling locations. State whether and how field analytical techniques will be used, and identify the number of field-analyzed samples that will be sent for confirmation by a permanent laboratory.


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-11

Form F-1

Method and SOP Reference Table (use multiple pages if needed)
Use this form to create an SOP Reference Table. The appropriate number/letter reference from this table will be used to complete Forms F-2 through J, and Form L. Attach all referenced Project Analytical and Sampling SOPs to the QAPP.

Analytical Method Reference:                      Project Analytical SOPs:
(Include document title, method name/number,      (Include document title, date, revision
revision number, date)                            number, and originator's name)

1a.        1b.
2a.        2b.
3a.        3b.
4a.        4b.

Project Sampling SOPs:*
(Include document title, date, revision number, and originator's name)

1c.
2c.
3c.
4c.

* Project Sampling SOPs include sample collection, sample preservation, equipment decontamination, preventive maintenance, etc.


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-12

Form F-2

Sampling and Analytical Methods Requirements (use multiple pages if needed)
Describe the details of the data collection and analysis design for the project. Insert the appropriate SOP number/letter reference in the table. Form F-1 contains the Method and SOP Reference Table. Attach analytical SOPs for sample collection and analysis for each parameter/matrix. The following is example data.

Parameter        Matrix  Number of           Analytical    Sampling  Containers per Sample    Preservation           Maximum Holding
                         Samples             Method*       SOP*      (number, size and type)  Requirements           Time at Lab
                         (include field QC)                                                   (temperature, light,   (preparation/
                                                                                              chemical)              analysis)

benzo(a)pyrene   soil    23                  CLP Organic   1c        2 x 4 oz, flint glass    stored in dark         extract in 7 days
                                             SOW OLM 03.2            jar, polyprop cap,       at 4°C
                                                                     teflon liner

* Insert the appropriate SOP number/letter reference in the above table. Form F-1 contains the Method and SOP Reference Table.


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-13

Form G

Preventive Maintenance - Field Equipment
Identify the equipment and/or systems requiring periodic preventive maintenance. Cite references on how periodic preventive and corrective maintenance of measurement or test equipment shall be performed to ensure availability and satisfactory performance of the systems. Cite descriptions of how to resolve deficiencies and when re-inspection will be performed. Describe the availability of spare parts identified in the manufacturer's operating instructions and how SOPs will be maintained.

Instrument    Activity    Frequency    SOP Ref.*

* Insert the appropriate reference number/letter from Form F-1, Method and SOP Reference Table.


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-14

Form H

Calibration and Corrective Action - Field Equipment
Identify all tools, gauges, instruments, and other equipment used for data collection activities that must be calibrated to maintain performance within specified limits. Reference calibration procedures to be conducted using certified equipment and standards with known relationships to recognized performance standards. Reference procedures on the maintenance of records of calibration.

Instrument    Activity    Frequency    Acceptance Criteria    Corrective Action    SOP Ref.*

* Insert the appropriate reference number/letter from Form F-1, Method and SOP Reference Table.


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-15

Form I

Preventive Maintenance - Laboratory Equipment
Identify the equipment and/or systems requiring periodic preventive maintenance. Cite references on how periodic preventive and corrective maintenance of equipment shall be performed to ensure availability and satisfactory performance. Cite discussions of how the availability of critical spare parts, identified in the manufacturer's operation instructions and/or SOPs, will be assured and maintained. Cite corrective actions for calibration check samples that exceed the control limits, for drift in the calibration curve, or for a reagent blank that indicates contamination.

Instrument    Activity    Frequency    SOP Ref.*

* Insert the appropriate reference number/letter from Form F-1, Method and SOP Reference Table.


Title:                    Revision Number:
Site Name:                Revision Date:
Site Location:            Page: ___ of ___

A-16

Form J

Calibration and Corrective Action - Laboratory Equipment
Identify all tools, gauges, instruments, and other equipment used for data collection activities that must be calibrated to maintain performance within specified limits. Reference calibration procedures to be conducted using certified equipment and/or standards with known relationships to recognized performance standards. Reference procedures on the maintenance of records of calibration.

Instrument    Activity    Frequency    Acceptance Criteria    Corrective Action    SOP Ref.*

* Insert the appropriate reference number/letter from Form F-1, Method and SOP Reference Table.


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-17

Form K

Sample Handling and Custody Requirements (use multiple pages if needed)

Describe the procedures for sample handling and custody. Include chain-of-custody forms; identify the sampling tags and custody seals the field teams should use. Refer to SOPs for collecting, transferring, storing, analyzing, and disposing of samples.


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-18

Form L

Analytical Precision and Accuracy (use multiple pages if needed)

Identify the analytical methods and equipment required, including sub-sampling or extraction methods, laboratory decontamination procedures and materials, waste disposal requirements (if any), and specific performance requirements (i.e., quantitation limits, precision, and accuracy) for each method.

Analyte | Analytical Method* | Detection Limit (water/soil) (units) | Quantitation Limit (water/soil) (units) | Precision (water/soil) | Accuracy (water/soil)

* Insert the appropriate reference number/letter from Form F-1, Method and SOP Reference Table.
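Precision for duplicate pairs is most often reported as relative percent difference (RPD), and accuracy for spiked samples as percent recovery. Neither formula is prescribed by this form, so the sketch below simply illustrates the usual conventions; the function names and example concentrations are invented for illustration.

```python
def relative_percent_difference(result_1, result_2):
    """Precision of a duplicate pair, expressed as relative percent difference (RPD)."""
    mean = (result_1 + result_2) / 2.0
    if mean == 0:
        return 0.0
    return abs(result_1 - result_2) / mean * 100.0


def percent_recovery(spiked_result, unspiked_result, amount_spiked):
    """Accuracy of a matrix spike, expressed as percent recovery."""
    return (spiked_result - unspiked_result) / amount_spiked * 100.0


# Hypothetical duplicate pair: 10 and 12 mg/kg lead in soil.
print(round(relative_percent_difference(10.0, 12.0), 1))   # 18.2
# Hypothetical spike: 9.5 mg/L measured, 2.0 mg/L native, 8.0 mg/L added.
print(round(percent_recovery(9.5, 2.0, 8.0), 2))           # 93.75
```

The computed values would be compared against the precision and accuracy limits entered in the table above for each analyte and matrix.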


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-19

Form M

Field Quality Control Requirements

The procedures and requirements contained in EPA Requirements for Quality Assurance Project Plans, EPA QA/R-5 (Draft Final), October 1997, or the latest revision, should be followed and referenced below.

QC Sample | Frequency* | Acceptance Criteria | Corrective Action
Duplicate | 5% per parameter per matrix or ______________
Equipment Blank | 5% per parameter per matrix or ______________
VOA Trip Blank | 1 per Cooler or ______________
Cooler Temperature Blank | 1 per Cooler or ______________
Bottle Blank | 1 per Lot # or ______________
Other (specify)

* Circle criteria listed or indicate alternative criteria.
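The default frequencies above (5% per parameter per matrix; one trip or temperature blank per cooler) translate directly into QC sample counts for a sampling event. The following is a minimal sketch of that arithmetic; the function name and the convention of rounding fractional counts up are illustrative assumptions, not requirements of the form.

```python
import math

def field_qc_counts(n_samples, n_coolers, rate=0.05):
    """Field QC sample counts for one parameter in one matrix, at a 5% default rate."""
    per_rate = math.ceil(n_samples * rate)   # round fractional counts up (assumed convention)
    return {
        "duplicates": per_rate,
        "equipment_blanks": per_rate,
        "voa_trip_blanks": n_coolers,        # 1 per cooler
        "temperature_blanks": n_coolers,     # 1 per cooler
    }

# Hypothetical event: 43 samples shipped in 3 coolers.
print(field_qc_counts(43, 3))
```

For 43 samples, the 5% rule yields three duplicates and three equipment blanks; each of the three coolers gets its own trip and temperature blank.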


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-20

Form M (Cont.)

Laboratory Quality Control Requirements

QC Sample | Frequency* | Acceptance Criteria | Corrective Action
VOA Reagent/Method Blank | Daily or ______________
Reagent/Method Blank | 5% per parameter per matrix or ______________
Duplicate | 5% per parameter per matrix or ______________
Matrix Spike | 5% per parameter per matrix or ______________
Performance Evaluation (PE) Sample | 5% per parameter per matrix per concentration level or ______________
Other
Other

* Circle criteria listed or indicate alternative criteria.


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-21

Form N

Data Management and Documentation (use multiple pages if needed)

Briefly discuss data documentation and management from field collection and laboratory analysis to data storage and use. Analytical data packages should include all relevant documents (for example, a laboratory narrative; tabulated summary forms for laboratory standards, quality control, and field sample results in order of analysis; raw data for laboratory standards and quality control; and laboratory log book sheets). Describe procedures for detecting and correcting errors during data reporting and data entry. Provide examples of any forms or checklists, such as chain-of-custody or field calibration forms.

Types of information to request from the laboratory:
a) Data Results Sheets (include any performance evaluation sample results)
b) Method Blank Results
c) Surrogate Recoveries and Acceptance Limits
d) Matrix Spike/Matrix Spike Duplicate Results and Acceptance Limits
e) Spike/Duplicate Results and Acceptance Limits
f) Laboratory Control Sample Results and Acceptance Limits
g) ICP Serial Dilution Results
h) ICP Interference Check Sample Results
i) Project Narrative which contains all observations and deviations
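One simple error-detection step of the kind Form N calls for is reconciling the sample IDs on the chain-of-custody against those the laboratory reports back. A hedged sketch follows; the sample IDs and function name are invented purely for illustration.

```python
def reconcile_sample_ids(coc_ids, lab_ids):
    """Cross-check chain-of-custody sample IDs against laboratory-reported IDs."""
    coc, lab = set(coc_ids), set(lab_ids)
    return {
        "missing_from_lab": sorted(coc - lab),   # collected but never reported
        "unknown_to_field": sorted(lab - coc),   # reported but not on the chain-of-custody
    }

# Hypothetical IDs for illustration only.
report = reconcile_sample_ids(
    coc_ids=["BF-001", "BF-002", "BF-003"],
    lab_ids=["BF-001", "BF-003", "BF-004"],
)
print(report)   # {'missing_from_lab': ['BF-002'], 'unknown_to_field': ['BF-004']}
```

Any non-empty entry in the report would trigger the error-correction procedures the QAPP describes, before the data are accepted for use.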


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-22

Form O

Assessment and Response Actions (use multiple pages if needed)

Describe procedures for identifying and correcting any problems encountered during specific project operations.


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-23

Form P

Project Reports (use multiple pages if needed)

Identify the frequency, content, and distribution of project reports that detail project status, results of internal assessments, corrective actions implemented, and project results. For example, the field team may be required to submit daily status reports comprised of field log sheets describing any field measurements taken, the number of samples collected and their status (shipped, at lab, or awaiting shipment), deviations from SOPs, etc.


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-24

Form Q - 1

Verification of Sampling Procedures (use multiple pages if needed)

Describe the process to be used to review the sampling procedures to verify that they conform to requirements in the sampling and analysis plan.


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-25

Form Q - 2

Data Verification and Validation (use multiple pages if needed)

Describe the process to be used to verify conformance of the analytical data with predefined requirements. Describe the process to be used to validate conformance of the analytical data to the predefined needs of the Brownfields site assessment.
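One routine verification check against predefined requirements is confirming that each sample was analyzed within its holding time; as the Glossary notes, exceeding the holding time flags the data rather than rejecting them. A minimal sketch follows, assuming a 14-day holding period purely for illustration; actual holding times are analyte-specific and set in the SAP.

```python
from datetime import datetime, timedelta

def holding_time_qualifier(collected, analyzed, holding_days):
    """Return an "H" data qualifier when analysis occurred after the holding time expired.

    Exceeding the holding time qualifies ("flags") the result; it does not
    automatically negate it.
    """
    if analyzed - collected > timedelta(days=holding_days):
        return "H"
    return ""

# Hypothetical collection and analysis dates, 14-day holding period.
print(holding_time_qualifier(datetime(1998, 9, 1), datetime(1998, 9, 18), 14))  # H
print(holding_time_qualifier(datetime(1998, 9, 1), datetime(1998, 9, 10), 14))  # (no qualifier)
```

A verification report would attach the qualifier to the affected results so that data users can weigh them accordingly during the useability assessment.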


Title: Revision Number: Site Name: Revision Date: Site Location: Page: ___ of ___

A-26

Form R

Data Useability (use multiple pages if needed)

Describe the process for determining whether the data successfully meet the requirements for their intended use. Outline methods to be used to identify anomalies and departures from assumptions in the sampling and analysis design. Discuss how limitations of the data will be reported.


B-1

Appendix B - Glossary of Terms

Accuracy A measure of the closeness of an individual measurement or the average of a number of measurements to the true value. Accuracy is influenced by a combination of random error (precision) and systematic error (bias) components which are due to sampling and analytical operations. EPA recommends that this term not be used and that precision and bias be used to convey the information usually associated with accuracy.

Analyte The chemical for which a sample is analyzed.

ASTM American Society for Testing and Materials — An organization which develops and publishes standard methods of analysis and standards for materials and procedures.

Background A level of hazardous substances that approximates the level that would be present in the medium of concern if the source of contamination under analysis did not exist.

Bias The systematic or persistent distortion of a measurement process which causes errors in one direction (i.e., the expected sample measurement is different from the sample's true value). Bias can result from improper data collection, poorly calibrated analytical or sampling equipment, or limitations or errors in analytical methods and techniques.

Bioaccumulation The tendency of a hazardous substance to be taken up and accumulated in the tissue of organisms, either directly or through consumption of food containing the hazardous substance. Bioaccumulation typically results in increasing concentrations of hazardous substances in tissues of organisms higher up the food chain.

Blank A sample that has not been exposed to the analyzed sample stream, in order to monitor contamination during sampling, transport, storage, or analysis. The blank is subjected to the same analytical or measurement process as other samples to establish a zero baseline value and is sometimes used to adjust or correct routine analytical results.

Brownfields Site Manager Person appointed by the cooperative agreement recipient or lead agency to oversee cleanups at specific sites.

Calibration Comparison of a measurement standard, instrument, or item with a standard or instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those inaccuracies by adjustments.

Calibration standard Standards prepared by successive dilution of a standard solution, covering the full concentration range required and expected to be seen in the samples for the organic or inorganic analytical method. The calibration standard must be prepared using the same type of acid or solvent used to prepare samples for analysis.


B-2

CERCLA Comprehensive Environmental Response, Compensation, and Liability Act of 1980, as amended.

Chain-of-Custody An unbroken trail of accountability that ensures the physical security of samples, data, and records.

CLP U.S. EPA's Contract Laboratory Program. Refers to laboratory specifications, analytical methods, and QA/QC protocols required for Superfund and related activities.

Comparability The confidence with which one data set can be compared to another.

Completeness A measure of the amount of valid data obtained from a measurement system compared to the amount that was expected to be obtained under correct, normal conditions.

Composite sample Non-discrete samples composed of one or more individual samples taken at different locations at a site. Composite samples are representative of the average concentrations of contaminants across a large area.

Control Sample A QC sample introduced into a data collection process to monitor the performance of the system.

Cooperative Agreement A form of assistance provided by a Federal agency in which substantial interaction is anticipated between the Federal agency and the assistance recipient (e.g., State, Tribal, or local government or other) during the performance of the contemplated activity.

Data Validation Confirmation through examination and provision of objective evidence that requirements for a specific intended use have been met. The process of examining the analytical data to determine conformance to user needs.

Data Verification Confirmation through examination and provision of objective evidence that predefined requirements for a specific intended use have been met. The process of examining the result of a given activity to verify conformance to stated requirements for that activity.

Definitive Data Data that are documented as appropriate for rigorous uses that require both hazardous substance identification and concentration. Definitive data are often used to quantify the types and extent of releases of hazardous substances. Guidance for Performing Site Inspections Under CERCLA, Interim Final, p. 99; Guidance for Data Useability in Site Assessment, Draft, pp. 13 and 14.

DL Detection Limit — the lowest concentration or amount of the target analyte that can be determined to be different from zero by a single measurement at a stated level of probability.

Duplicate Sample A second sample taken from and representative of the same population and carried through all steps of the sampling and/or analytical procedures


B-3

in an identical manner. See Field Duplicate, Matrix Duplicate, and Matrix Spike Duplicate.

DQOs Data Quality Objectives — Qualitative and quantitative statements (derived from the DQO Process) that clarify the objectives of studies, technical processes, and quality assurance programs; define the appropriate type of data; and specify tolerable levels of potential decision errors that will be used as the basis for establishing the quality and quantity of data needed to support decisions.

Equipment Blank Also called the Equipment Rinsate. A sample of analyte-free reagent taken after completion of decontamination and prior to sampling at the next sample location. It is used to check field decontamination procedures to ensure that analytes from one sample location have not contaminated a sample from the next location.

False Positive Decision Error The erroneous decision that the null hypothesis is incorrect (i.e., rejecting the null hypothesis when it is in fact true).

False Negative Decision Error The erroneous decision that the null hypothesis is correct (i.e., accepting the null hypothesis when it is in fact false).

Field Blank A blank used to provide information about contaminants that may be introduced during sample collection, storage, and transport. A clean sample, carried to the sampling site, exposed to sampling conditions, and returned to the laboratory and treated as an environmental sample.

Field Duplicate An independent sample collected from the same location or source, as close as possible to the same point in space and time. Duplicates are stored in separate containers and analyzed separately for the purpose of documenting the precision of the sampling process. (Laboratory variability will also be introduced into the samples' results.)

GC Gas Chromatography — An analytical technique used to analyze environmental matrices for contaminants.

GC/MS Gas Chromatography/Mass Spectrometry — A gas chromatography analyzer combined with a mass spectrometer detector. The mass spectrometer uses the difference in mass-to-charge ratio (m/e) of ionized atoms or molecules to separate them from each other and to quantify their concentrations.

Grab Samples Discrete samples that are representative of a specific area and a specific time. Useful in identifying "hot spots" of contamination at a site.

Hazardous Substances CERCLA hazardous substances, pollutants, and contaminants, as defined in CERCLA Sections 101(14) and 101(33).

Holding Time The period a sample may be stored prior to its required analysis. Although exceeding the holding time does not necessarily negate the veracity of analytical results, it causes the qualifying or "flagging" of the data for not meeting all of the specified acceptance criteria.


B-4

Human Exposure Any exposure of humans to a release of one or more hazardous substances via inhalation, ingestion, or dermal contact. Amdur, Mary O., John Doull, and Curtis D. Klaassen, Toxicology, The Basic Science of Poisons, Fourth Edition, 1991, p. 14; Hazard Ranking System Guidance Manual, Interim Final, pp. 153, 259, 293, 317, 363, and 411.

Interference An element, compound, or other matrix effect present in a sample which interferes with detection of a target analyte, leading to inaccurate concentration results for the target analyte.

Matrix The substrate containing the analyte of interest — examples are soil, water, sediments, and air. Also called medium or media.

Matrix Duplicate A duplicate field sample used to document the precision of sampling and homogeneity of a given sample matrix. (Same as field duplicate.)

Matrix Spike (MS) A sample prepared by adding a known mass of target analyte to a specified amount of matrix sample for which an independent estimate of target analyte concentration is available. Spiked samples are used, for example, to determine the effect of the matrix on a method's recovery efficiency.

Matrix Spike Duplicate (MSD) A split sample, both portions of which are spiked with identical concentrations of target analytes, for the purpose of determining the bias and precision of a method in a particular sample matrix.

Maximum Contaminant Level (MCL) Maximum concentration of a contaminant allowed in drinking water systems by the National Primary Drinking Water Regulations: 40 CFR 141.11 (inorganic chemicals) and 141.12 (organic chemicals).

Method Blank A clean sample processed simultaneously with, and under the same conditions as, samples containing an analyte of interest through all steps of the analytical procedure.

Method Detection Limit (MDL) The minimum concentration of an analyte that can be measured and reported with 99% confidence. It is determined by analysis of samples with known concentrations at various dilutions. This limit is matrix-specific (e.g., soils vs. waters).

Null Hypothesis Presumed or baseline condition. In the case of environmental investigations, generally either that the site is contaminated or that the site is clean.

ppb Parts per billion; µg/kg (micrograms per kilogram); µg/l (micrograms per liter).

ppm Parts per million; mg/kg (milligrams per kilogram); mg/l (milligrams per liter).


B-5

Precision A measure of mutual agreement among individual measurements of the same property, usually under prescribed similar conditions, expressed generally in terms of the standard deviation.

Priority Pollutants List of inorganic and organic analytes commonly tested for in the National Pollutant Discharge Elimination System (NPDES) program.

QA Quality Assurance — An integrated system of management activities involving planning, implementation, assessment, reporting, and quality improvement to ensure that a process, item, or service is of the type and quality needed and expected.

QAPP Quality Assurance Project Plan — A formal document describing in comprehensive detail the necessary QA, QC, and other technical activities that must be implemented to ensure that the results of the work performed will satisfy the stated performance criteria.

QC Quality Control — The overall system of technical activities that measures the attributes and performance of a process, item, or service against defined standards to verify that they meet the stated requirements established by the customer; operational techniques and activities that are used to fulfill requirements for quality.

QL Quantitation Limit — The level above which quantitative results may be obtained with a specified degree of confidence.

RCRA The Resource Conservation and Recovery Act of 1976, as amended.

Release Any spilling, leaking, pumping, pouring, emitting, emptying, discharging, injecting, escaping, leaching, dumping or disposing into the environment (including the abandonment or discarding of barrels, containers, and other closed receptacles containing any hazardous substance or pollutant or contaminant). CERCLA § 101(22).

Representativeness A measure of the degree to which the measured results accurately reflect the medium being sampled. It is a qualitative parameter that is addressed through the design of the sampling program in terms of sample location, number of samples, and actual material collected as a "sample" of the whole.

SAP Sampling and Analysis Plan — Site- and event-specific plan detailing sampling rationale, protocols, and analyses planned per sample type. A part of the QAPP.

Screening Data Data that are appropriate for applications that only require determination of gross contamination areas and/or for site characterization decisions that do not require quantitative data. Screening data are often used to specify which areas to sample to collect definitive data. Guidance for Performing Site Inspections Under CERCLA, Interim Final, pp. 99 and 100; Guidance for Data Useability in Site Assessment, Draft, p. 15.


B-6

SOP Standard Operating Procedure — A written document that details the method for an operation, analysis, or action with thoroughly prescribed techniques and steps, and that is officially approved as the method for performing certain routine or repetitive tasks.

Source Area An area of contamination from which substances may have migrated to other media. Several source areas can be located within a site.

Spike A known quantity of a chemical that is added to a sample for the purpose of determining (1) the concentration of an analyte by the method of standard additions, or (2) analytical recovery efficiency, based on sample matrix effects and analytical methodology. Also called analytical spike.

Split Samples Two or more representative portions taken from one sample in the field or in the laboratory and analyzed by different analysts or laboratories. Split samples are used to replicate the measurement of the variable(s) of interest.

Standard Addition The practice of adding a known amount of an analyte to a sample immediately prior to analysis, used to evaluate interferences.

Standard Curve A plot of concentrations of known analyte standards versus the instrument response to the analyte.

Surrogate A pure substance with properties that mimic the analyte of interest. It is unlikely to be found in environmental samples and is added to them to establish that the analytical method has been performed properly.

SVOA Semi-Volatile Organic Analysis or Analyte.

SVOC Semi-Volatile Organic Compound. BNA; extractable organic compound.

SW-846 U.S. EPA "Test Methods for Evaluating Solid Waste," 1986 (Third Edition), plus Updates; a publication describing standard methods of analysis, sampling techniques, and QA/QC procedures.

Trip Blank A clean sample of matrix that is carried to the sampling site and transported to the laboratory for analysis without having been exposed to sampling procedures.

VOA Volatile Organic Analysis or Analyte.

VOC Volatile Organic Compound.


C-1

Appendix C - References

Brownfields General

Tool Kit of Information Resources for Brownfields Investigation and Cleanup. EPA 542-B-97-001. Washington, DC: U.S. Environmental Protection Agency (5102G). 1997.

Road Map to Understanding Innovative Technology Options for Brownfields Investigation and Cleanup. EPA 542-B-97-002. Washington, DC: U.S. Environmental Protection Agency (5102G). 1997.

SW-846/RCRA General

Test Methods for Evaluating Solid Waste, Physical/Chemical Methods (SW-846), vol.1, ch. 9 “Sampling Plan.” 3rd ed. Washington, DC: U.S. Environmental Protection Agency, 1986.

Test Methods for Evaluating Solid Waste, Physical/Chemical Methods (SW-846), vol.2, ch. 10 “Sampling Methods.” 3rd ed. Washington, DC: U.S. Environmental Protection Agency, 1986.

CLP/Superfund General

A Compendium of Superfund Field Operations Methods, NTIS PB88-181557; EPA 540-P-87-001. Washington, DC: U.S. Environmental Protection Agency, December 1987.

Data Quality Objectives for Remedial Response Activities - Development Process, EPA 540-G-87-003. Washington, DC: U.S. Environmental Protection Agency, March 1987.

Guidance for Conducting Remedial Investigations and Feasibility Studies Under CERCLA, NTIS PB89-184626; EPA 540-G-89-004. Washington, DC: U.S. Environmental Protection Agency, October 1988.

Sampler’s Guide to the Contract Laboratory Program, EPA 540-P-90-006. Washington, DC: U.S. Environmental Protection Agency, December 1990.

Superfund Analytical Review and Oversight, NTIS PB90-249541; EPA 9240.0-03. Washington, DC: U.S. Environmental Protection Agency, October 1988.

User's Guide to the Contract Laboratory Program, EPA/9240.0-1. Washington, DC: U.S. Environmental Protection Agency, December 1988.

General Sampling and Data Guidance - U.S. EPA

Decision Error Feasibility Trials (DEFT) Software for the Data Quality Objectives Process. EPA QA/G-4D. EPA 600-R-96-056, September 1994. <http://es.epa.gov/ncerqa/qa/qa_docs.html>.

DQO Software Tools. U. S. Department of Energy, Office of Environmental Management (EM-76), Pacific Northwest National Laboratory. <http://etd.pnl.gov:2080/DQO/software/intro.html>.

EMAP QA Terms, Office of Research and Development, Office of Modeling, Monitoring Systems and Quality Assurance. October 15, 1997. <http://www.epa.gov/emap/html/qa_terms.html#mm>.


C-2

EPA Requirements for Quality Assurance Project Plans, (Draft Final) EPA QA/R-5. Washington, DC: U.S. Environmental Protection Agency, October 1997.

Field Analytical and Site Characterization Technologies: Summary of Applications. EPA 542-R-97-011. Washington, DC: U.S. Environmental Protection Agency (5102G). November 1997.

Field Screening Methods Catalog: User’s Guide, NTIS PB89-134159; EPA 540-2-88-005. Washington, DC: U.S. Environmental Protection Agency, September 1988.

Guidance for the Data Quality Objectives Process, EPA QA/G-4: EPA 600-R-96-055. Washington, DC: U.S. Environmental Protection Agency, September 1994.

Guidance for Data Quality Assessment, EPA Office of Research and Development, EPA 600-R-96-084. Washington, DC: U.S. Environmental Protection Agency, January 1998.

Guidance for Data Useability in Risk Assessment (Part A), Final. NTIS PB92-963356; Publication 9285.7-09A. Washington, DC: U.S. Environmental Protection Agency, April 1992.

Guidance for Data Useability in Risk Assessment (Part B), Final. NTIS PB92-963362; Publication 9285.7-09B. Washington, DC: U.S. Environmental Protection Agency, May 1992.

Guidance for the Preparation of Standard Operating Procedures for Quality-Related Operations, EPA QA/G-6: Final - EPA 600-R-96-027. Washington, DC: U.S. Environmental Protection Agency, November 1995.

Guidance on Quality Assurance Project Plans, EPA QA/G-5; EPA 600-R-98-018. Washington, DC: U.S. Environmental Protection Agency, February 1998.

Keith, L.H., Environmental Sampling and Analysis: A Practical Guide. In Print. Washington, DC: American Chemical Society, 1990.

Quality Assurance/Quality Control Guidance for Removal Activities, Sampling QA/QC Plan and Data Validation Procedures, Interim Final. NTIS PB90-274481/CCE; EPA 540-G-90-004. Washington, DC: U.S. Environmental Protection Agency, April 1990.

Sample Custody: NEIC Policies and Procedures, EPA 330-9-78-DDI-R. Washington, DC: U.S. Environmental Protection Agency, Revised June 1985.

Standard Operating Procedures for Field Samplers, EPA Region VIII, Environmental Services Division, Denver, CO, March 1986.

Vendor Field Analytical and Characterization Technologies System (Vendor FACTS) v. 3.0. Office of Solid Waste and Emergency Response (5102G). CD-ROM from NCEPI or download from Internet at CLU-IN. 1998. <http://www.clu-in.com/supply1.htm>.

Gilbert, R.O., Statistical Methods for Environmental Pollution Monitoring, Van Nostrand Reinhold, New York, 1987.


C-3

Groundwater Sampling and Monitoring

Compendium of ERT Groundwater Sampling Procedures, NTIS/PB91-921274; EPA 540-P-91-007. Washington, DC: U.S. Environmental Protection Agency, January 1991.

Handbook - Ground Water. EPA 625-6-87-016. Washington, DC: U.S. Environmental Protection Agency, March 1987.

Practical Guide for Ground-Water Sampling, NTIS PB86-137304; EPA 600-2-85-104. Washington, DC: U.S. Environmental Protection Agency, September 1985.

Resource Conservation and Recovery Act (RCRA) Ground-Water Monitoring Technical Enforcement Guidance Document, NTIS PB87-107751. Washington, DC: U.S. Environmental Protection Agency, September 1986.

Test Methods for Evaluating Solid Waste, Physical/Chemical Methods (SW-846), vol. 2, ch. 11 “Ground Water Monitoring.” 3rd ed. Washington, DC: U.S. Environmental Protection Agency.

Surface Waters

Compendium of ERT Surface Water and Sediment Sampling Procedures, EPA 540-P-91-005. Washington, DC: U.S. Environmental Protection Agency, January 1991.

Kitrell, F.W. A Practical Guide to Water Quality Studies of Streams, NTIS PB-196367. Washington, DC: U.S. Environmental Protection Agency, 1969.

Geophysical Methods

Compendium of ERT Soil Sampling and Surface Geophysics Procedures, EPA 540-P-91-006. Washington, DC: U.S. Environmental Protection Agency, January 1991.

Geophysical Methods for Locating Abandoned Wells, NTIS PB84-212711. Washington, DC: U.S. Environmental Protection Agency, July 1984.

Geophysical Techniques for Sensing Buried Wastes and Waste Migration, NTIS PB84-198449; EPA 600-7-84-064. Washington, DC: U.S. Environmental Protection Agency, June 1984.

Subsurface Characterization and Monitoring Techniques, vols. 1 and 2. EPA 625-R-93-003a. Washington, DC: U.S. Environmental Protection Agency, May 1993.

Use of Airborne, Surface, and Borehole Geophysical Techniques at Contaminated Sites: A Reference Guide, EPA 625-R-92-007. Washington, DC: U.S. Environmental Protection Agency, September 1993.

Soils and Sediments

Compendium of ERT Surface Water and Sediment Sampling Procedures, EPA 540-P-91-005. Washington, DC: U.S. Environmental Protection Agency, January 1991.


C-4

Determination of Background Concentrations of Inorganics in Soils and Sediments at Hazardous Waste Sites (Engineering Forum Issue). Washington, DC: U.S. Environmental Protection Agency, December 1995. (http://www.epa.gov/crdlvweb/tsc/issue.htm)

Guidance Document for Cleanup of Surface Impoundment Sites, NTIS PB87-110664; EPA 9380.0-06. Washington, DC: U.S. Environmental Protection Agency, June 1986.

Methods for Evaluating the Attainment of Cleanup Standards, vol. 1, Soils and Solid Media, NTIS PB89-234959. Washington, DC: U.S. Environmental Protection Agency, February 1989.

Sediment Sampling Quality Assurance User’s Guide, NTIS PB85-233542; EPA 600-4-85-048. Washington, DC: U.S. Environmental Protection Agency, July 1985.

Hazardous Waste

Compendium of ERT Waste Sampling Procedures, EPA 540-P-91-008. Washington, DC: U.S. Environmental Protection Agency, January 1991.

Drum Handling Practices at Hazardous Waste Sites, NTIS PB86-165362; EPA 600-2-86-013. Washington, DC: U.S. Environmental Protection Agency, January 1986.

Guidance Document for Cleanup of Surface Tank and Drum Sites, NTIS PB87-110672; EPA 9380.0-03. Washington, DC: U.S. Environmental Protection Agency, May 1985.

Handbook for Stabilization/Solidification of Hazardous Wastes, EPA 540-2-86-001. Washington, DC: U.S. Environmental Protection Agency, June 1986.

Remediation Technologies

Innovative Site Remediation Technology: Phase I (Process Descriptions and Limitations)
vol. 1 "Bioremediation," EPA 542-B-94-006, June 1995
vol. 2 "Chemical Treatment," EPA 542-B-94-004, September 1994
vol. 3 "Soil Washing/Soil Flushing," EPA 542-B-93-012, November 1993
vol. 4 "Solidification/Stabilization," EPA 542-B-94-001, June 1994
vol. 5 "Solvent/Chemical Extraction," EPA 542-B-94-005, June 1995
vol. 6 "Thermal Desorption," EPA 542-B-93-011, November 1993
vol. 7 "Thermal Destruction," EPA 542-B-94-003, October 1994
vol. 8 "Vacuum Vapor Extraction," EPA 542-B-94-002, April 1995

(Washington, DC: U.S. Environmental Protection Agency, November 1993 to June 1995)

Innovative Site Remediation Technology: Phase II (Design and Application)
vol. 1 "Bioremediation," EPA 542-B-97-004, May 1998
vol. 2 "Chemical Treatment," EPA 542-B-97-005, September 1997
vol. 3 "Liquid Extraction Technologies," EPA 542-B-97-006, May 1998
vol. 4 "Solidification/Stabilization," EPA 542-B-97-007, September 1997
vol. 5 "Thermal Desorption," EPA 542-B-97-008, September 1997
vol. 6 "Thermal Destruction," EPA 542-B-97-009, August 1998
vol. 7 "Vacuum Extraction and Air Sparging," EPA 542-B-97-010, May 1998

(Washington, DC: U.S. Environmental Protection Agency, September 1997 to August 1998)


C-5

SOURCES OF DOCUMENTS

U.S. Environmental Protection Agency (no charge)
Center for Environmental Research Information (CERI)
ORD Publications
26 West Martin Luther King Drive
Cincinnati, OH 45268
(513) 569-7562

http://es.epa.gov/program/epaorgs/ord/ceri.html

U.S. EPA Technology Innovation Office
Clean-Up Information Internet Site

http://clu-in.com

Public Information Center (PIC) (no charge)
U.S. Environmental Protection Agency
Public Information Center (PIC)
PM-211B
401 M Street, S.W.
Washington, DC 20460
(202) 382-2080

e-mail: [email protected]

Superfund Docket and Information Center (SDIC) (no charge)
U.S. Environmental Protection Agency
Superfund Docket and Information Center (SDIC)
OS-245
401 M Street, S.W.
Washington, DC 20460
(202) 382-6940

http://www.epa.gov/earth100/records/a00108.html

National Technical Information Service (NTIS) (cost varies)
National Technical Information Service
U.S. Department of Commerce
5285 Port Royal Road
Springfield, VA 22161

http://www.ntis.gov/

National Center for Environmental Publications and Information (no charge)
U.S. EPA/NCEPI
P.O. Box 42419
Cincinnati, Ohio 45242-2419
(800) 490-9198; Fax (513) 489-8695

http://www.epa.gov/epahome/publications.htm