
Page 1:

“Evaluating Stormwater Technology Performance”

Module II Training in Support of the TARP Stormwater Protocol:

“Stormwater Best Management Practice Demonstrations”

April 2004

Prepared by Eric Winkler, Ph.D. and Nicholas Bouthilette

Center for Energy Efficiency and Renewable Energy, University of Massachusetts – Amherst

Page 2:

Meet the Instructors

Nancy Baker
Massachusetts Dept. of Environmental Protection
1 Winter Street, Boston, MA 02108
(617) 654-6524 (Voice), (617) 292-5850 (Fax)
[email protected]

Eric Winkler, Ph.D.

University of Massachusetts

160 Governors Drive, Amherst, MA 01003-9265
(413) 545-2853 (Voice), (413) 545-1027 (Fax)
[email protected]

Page 3:

Meet the Sponsors

TARP Stormwater Work Group:

► California
► Maryland
► Massachusetts
► New Jersey
► Pennsylvania
► Virginia

The states of Washington, Illinois, and New York, and the ETV program, are collaborating with TARP.


Page 4:

Goals of TARP Stormwater Work Group

► Use Protocol to Test New BMPs
► Approve Effective New Stormwater BMPs
► Get Credible Data on BMP Effectiveness
► Share Information and Data
► Increase Expertise on New BMPs
► Use Protocol for Appropriate State Initiatives

Page 5:

Evaluating Stormwater Technology Performance

Learning Objectives
► Use the TARP Stormwater Demonstration Protocol to review a test plan and a technology evaluation.

► Recognize data gaps and deficiencies.

► Develop, implement, and review a test plan.

► Understand, evaluate, and use statistical methods.

Logistical Reminders
► Phone Audience
  Keep phone on mute: press *6 to mute your phone and again to un-mute
  Do NOT put the call on hold
► Simulcast Audience
  Use the button at the top of each slide to submit questions

► Course Time = 2 hours

► 2 Question & Answer Periods

► Links to Additional Resources

► Your Feedback

Page 6:

Project Notice

Prepared by the Center for Energy Efficiency and Renewable Energy, University of Massachusetts Amherst, for submission under Agreement with the Environmental Council of States. The preparation of this training material was financed in part by funds provided by the Environmental Council of States (ECOS).

This product may be duplicated for personal and government use and is protected under copyright laws for the purpose of author attribution.

“Publication of this document shall not be construed as endorsement of the views expressed therein by the Environmental Council of States/ITRC or any federal agency.”

Page 7:

DISCLAIMER

The contents and views expressed are those of the authors and do not necessarily reflect the views and policies of the Commonwealth of Massachusetts, its agencies, or the University of Massachusetts. The contents of this training are offered as guidance. The University of Massachusetts and all technical sources referenced herein do not (a) make any warranty or representation, expressed or implied, with respect to the accuracy, completeness, or usefulness of the information contained in this training, or that the use of any information, apparatus, method, or process disclosed in this report may not infringe on privately owned rights; or (b) assume any liability with respect to the use of, or for damages resulting from the use of, any information, apparatus, method, or process disclosed in this report. Mention or images of trade names or commercial products do not constitute endorsement or recommendation of use.

Page 8:

Module I: Planning for a Stormwater BMP Demonstration

1. Factors Affecting Stormwater Sampling
2. Data Quality Objectives and the Test QA Plan
3. Sampling Design
4. Statistical Analyses
5. Data Adequacy: Case Study

Module II: Collecting and Analyzing Stormwater BMP Data

Page 9:

3. Sampling Design

► Stormwater Data Collection Guidance

► Locating Samples & Stations
► Selecting Water Quality Parameters
► QA/QC
► Sample Handling and Record Keeping
► Field Measures

Page 10:

Sampling Plan Elements: TARP Data Collection Criteria

(Section 3.3, TARP Protocol)

► Any relevant historic data

► Monthly mean rainfall and snowfall data (12 months over the period of record)

► Rainfall intensity over 15-minute increments

Page 11:

Protocol Minimum Criteria for Identifying a Qualifying Storm Event
(Section 3.3.1.2 and Section 3.3.1.3, TARP Protocol)

► Minimum rainfall event depth is 0.1 inch.
► Minimum inter-event duration of 6 hours (duration beginning at cessation of flow to the unit).
► Base flow should not be sampled.

Identification of a qualifying event requires verifying flow to the unit and rainfall concurrently.
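
The criteria above lend themselves to a simple automated screen of the precipitation record. The sketch below is illustrative only and is not part of the TARP Protocol: it assumes an hourly rainfall series in inches, groups wet hours into events, and judges the dry period from rainfall alone, so it does not replace the concurrent verification of flow to the unit that the protocol requires.

```python
# Illustrative sketch: screening an hourly rainfall record (inches) for events
# that meet the minimum-depth and inter-event-duration criteria above.
# Assumptions: hourly time step; dry period judged from rainfall only, whereas
# the protocol measures it from cessation of flow to the unit.
from dataclasses import dataclass
from typing import List

@dataclass
class StormEvent:
    start_hr: int     # index of the first wet hour
    end_hr: int       # index of the last wet hour
    depth_in: float   # total rainfall depth, inches

def qualifying_events(hourly_rain_in: List[float],
                      min_depth_in: float = 0.1,
                      min_interevent_hr: int = 6) -> List[StormEvent]:
    events: List[StormEvent] = []
    start = None
    end = 0
    depth = 0.0
    dry = 0
    for i, r in enumerate(hourly_rain_in):
        if r > 0:
            if start is None:
                start = i
            end = i
            depth += r
            dry = 0
        elif start is not None:
            dry += 1
            if dry >= min_interevent_hr:          # event has ended
                events.append(StormEvent(start, end, depth))
                start, depth, dry = None, 0.0, 0
    if start is not None:                         # close a trailing event
        events.append(StormEvent(start, end, depth))
    return [e for e in events if e.depth_in >= min_depth_in]
```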

Page 12:

Analysis for Determining Number of Samples

Paired sampling approach: significant difference in mean concentration
• Coefficient of variation
• Power of 80%
• 95% confidence

(Urban Stormwater BMP Performance Monitoring. USEPA, 2002.)

[Figure: required number of sample pairs as a function of coefficient of variation and difference in sample set means (%)]

Example: 80% difference in means, 100% CV = 20 sample pairs
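
For orientation, the relationship between coefficient of variation, detectable difference, and required sample pairs can be approximated with a standard normal-approximation formula. The sketch below is not the calculation behind the EPA (2002) chart and will not match it exactly; with an 80% difference and 100% CV it gives roughly 25 pairs, versus the 20 pairs read from the chart above.

```python
# Approximate sample-size calculation (normal approximation, two-sided test).
# Illustrative only; the EPA (2002) chart cited above may use a different
# formulation, so counts here are ballpark, not a reproduction of the figure.
import math
from statistics import NormalDist

def approx_sample_pairs(rel_difference: float, cv: float,
                        power: float = 0.80, alpha: float = 0.05) -> int:
    """rel_difference: detectable difference in means as a fraction (0.8 = 80%).
    cv: coefficient of variation as a fraction (1.0 = 100%)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_beta = NormalDist().inv_cdf(power)
    n = 2 * (z_alpha + z_beta) ** 2 * (cv / rel_difference) ** 2
    return math.ceil(n)

print(approx_sample_pairs(0.8, 1.0))  # ~25 pairs under this approximation
```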

Page 13:

Qualifying Event Sample: TARP Protocol Criteria

► 10 water quality samples per event
  10 influent and 10 effluent
  If composite: 2 composites, 5 sub-samples
► Data for flow rate and flow volume
► At least 50% of the total annual rainfall
  CA: monitor 80-90% of rainfall.

Page 14:

Qualifying Event Sample (continued)

► Preferably 20 storms, 15 minimum

► Sampling over the course of a full year to account for seasonal variation

► Composited flow-weighted samples should cover at least 70% of the storm flow (and as much of the first 20% as possible)

► Examples of variation within the TARP community:
  PA: temporary BMPs sized using the 2-year event
  NJ: water quality design based on the volume from a 1.25-inch event

Page 15:

Sampling Locations

► Location in close proximity to BMP technology inlet and outlet.
► Consider field personnel safety during equipment access.
► Secure equipment to avoid vandalism.
► Provide a scaled plan view of the site that indicates:
  All buildings
  Land uses
  Storm drain inlets
  Other control devices

Page 16:

Automated Sampling Equipment

► Water surface elevation should be less than 15 feet below the elevation of the pump in the sampler.

► Access to the equipment should take into consideration confined space safety.

► Flow measurement equipment should be located to avoid measuring turbulent flow.

Page 17:

Adequacy of Sampling and Flow Monitoring Procedures

► Primary and secondary flow measurement devices are required.

► Programmable automatic flow samplers with continuous flow measurements are recommended.

► Time-weighted composite samples are not acceptable unless flow is monitored and the event mean concentration can be calculated.

Page 18:

Monitoring Location Recommendations

► Monitoring location design should consider whether the upgradient catchment system is served by a separate storm drain system.

► Pay attention to potential combined sewer systems or illicit connections, which may contaminate the stormwater system.

► The storm drain system should be well understood to allow reliable delineation and description of the catchment area.

► Flow-measuring monitoring stations in open channels should have suitable hydraulic control and, where possible, the ability to install primary flow measurement devices.

Page 19:

Monitoring Location Recommendations (continued)

► Avoid steep slopes, pipe diameter changes, junctions, and irregular channel shaping (due to breaks, roots, debris).

► Avoid locations likely to be affected by backwater and tidal conditions.

► Pipe, culvert, or tunnel stationing should be located to avoid surcharging (pressure flow) over the normal range of precipitation.

► Use of a reference watershed and remote rainfall data is discouraged.

Page 20:

Selecting Applicable Water Quality Parameters

► Designated uses of the receiving water – consider stormwater discharge constituents

► Overall program objectives and resources – adjust parameter list according to resources (test method capability, personnel, funds, and time)

► Use of a “keystone” pollutant may vary from state to state

Page 21:

Resources for Standardized Test Methods

(Section 3.1, TARP Protocol)

► EPA Test Methods – pollutant analysis: www.epa.gov/epahome/Standards.html
► ASME Standards and Practices – pressure flow measurements
► ASCE Standards – hydraulic flow estimation
► ASTM Standards – precision open-channel flow measurements for water constituent analysis
► National Field Manual for Collection of Water Quality Data, Wilde et al., USGS: water.usgs.gov/owq/FieldManual/
► “Guidance Manual: Stormwater Monitoring Protocols” – Caltrans: www.dot.ca.gov/hq/env/stormwater/

Page 22:

Examples of Analytical Laboratory Methods

► Total Phosphorus – SM 4500-PE
► Nitrate and Nitrite – EPA 353.1
► Ammonia – EPA 350.1
► Total Kjeldahl Nitrogen – EPA 351.2
► TSS – SM 2540D
► SSC – ASTM D3977-97
► Enterococci – SM 9230C
► Fecal Coliform – SM 9222D
► Chronic Microtox Toxicity Test – Azur Environmental

Reference: ASTM, EPA, American Public Health Association. (Other non-standard methods or tests approved through an acceptable regulatory process, EPA or state authority.)

Page 23:

Constituent Listing w/ Detection Limits

Urban Stormwater BMP Performance Monitoring. US EPA, 2002.

Page 24:

Representative Sampling

► Sample Collection
  Uniformity varies between staff and technique
  Automated sampling allows for reproducible samples
  Standard Operating Procedures (SOPs) and Quality Assurance help track variability

Page 25:

Required Elements in Lab QA/QC (Section 3.3.5, TARP Protocol)

The QAPP Test QA Plan and Sampling and Analysis Plan should include:
  Laboratory and sampling equipment decontamination
  Sample preservation and holding time
  Sample volumes
  QC Samples (Spikes, Blanks, Splits, Field and Blank Lab Duplicates)
  QA on Sampling Equipment (Calibration)
  Packaging and Shipping
  Identification and Labeling
  Chain of Custody
  Lab Certification

Page 26:

Quality Control Tests and Acceptance Limits for Physical/Chemical Parameters Analyzed in the Laboratory

Parameter | Accuracy QC Test and Acceptance Limits | Precision QC Test and Acceptance Limits (RPD)a | Frequency (accuracy and precision)
Total Phosphorus | QCSb 90-110% rec.; LFBc 85-115% rec.; LFMd 80-120% rec. | Duplicates ≤ 20 | Each sample batch of ≤ 20 samples (≥ 5%)
Nitrate and Nitrite N | QCS 90-110% rec.; LFB 85-115% rec.; LFM 80-120% rec. | Duplicates ≤ 45 | Each sample batch of ≤ 20 samples (≥ 5%)
Ammonia-N | QCS 90-110% rec.; LFB 85-115% rec.; LFM 80-120% rec. | Duplicates ≤ 52 | Each sample batch of ≤ 20 samples (≥ 5%)
Total Kjeldahl N | QCS 90-110% rec.; LFB 85-115% rec.; LFM 80-120% rec. | Duplicates ≤ 28 | Each sample batch of ≤ 20 samples (≥ 5%)
Total Suspended Solids | QCS 75-125% rec. | Duplicates ≤ 25 | Each sample batch of ≤ 20 samples (≥ 5%)
Suspended Sediment Concentration | QCS 75-125% rec. | Duplicates ≤ 25 | Each sample batch of ≤ 20 samples (≥ 5%)

a RPD = relative percent difference among duplicates
b QCS = quality control sample from source outside of laboratory
c LFB = laboratory fortified blank sample
d LFM = laboratory fortified matrix sample
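
The accuracy and precision statistics referenced in this table (percent recovery and relative percent difference) follow standard QC definitions; a minimal sketch of those calculations is shown below. These are general formulas, not text from the TARP Protocol.

```python
# Standard QC statistics checked against the acceptance limits in the table above.
def relative_percent_difference(dup1: float, dup2: float) -> float:
    """RPD between duplicate results, in percent."""
    return abs(dup1 - dup2) / ((dup1 + dup2) / 2.0) * 100.0

def percent_recovery(measured: float, known: float) -> float:
    """Recovery of a QCS or LFB relative to its known concentration, in percent."""
    return measured / known * 100.0

def matrix_spike_recovery(spiked_result: float, unspiked_result: float,
                          spike_added: float) -> float:
    """Recovery of a laboratory fortified matrix (LFM) sample, in percent."""
    return (spiked_result - unspiked_result) / spike_added * 100.0
```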

Page 27:

Laboratory Records

► Sample data
  Number of samples, holding times, location, deviation from SOPs, time of day, and date.
  Corrective procedures for samples inconsistent with the protocol.

► Management of records
  Consider electronic filing.
  Document data validation, calculations and analysis, and data presentation.
  Review data reports for completeness, including requested analyses and all required QA.

Page 28:

Field Sample QC

► Field Matrix Spike
  A sample prepared at the sampling point by adding a known mass of the target analyte to a specified amount of sample.
  Used to determine the effect of sample preservation, shipment, storage, and preparation on analyte recovery efficiency.

► Field Split Sample
  The split of a sample into two representative portions to be sent off to different laboratories.
  Estimates inter-laboratory precision.

Page 29:

Quality Control

► Field Blanks
  A clean sample carried to the sampling site, exposed to sampling conditions, and returned to the laboratory.
  Provides useful information about pollutants and error that may be introduced during the sampling process.

► Field Duplicates
  Identical samples taken from the same sampling location and time (not split).
  Identical sampling and analytical procedures to assess variance of sampling and analysis.

Page 30:

Quality Assurance / Quality Control: Documentation and Records

► Documentation:
  Include all QC data
  Define critical records and information, as well as the data reporting format and document control procedures

► Reporting:
  Field operation records
  Laboratory records
  Data handling records

Page 31:

Field Operation Records

► Sample collection records
  Show that the proper sampling protocol was used in the field by indicating persons' names, sample numbers, collection points, maps, equipment/methods, climatic conditions, and unusual observations

► Chain of custody records
  Document the progression of samples as they travel from the sampling location to the lab to the disposal area

► QC sample records
  Document QC samples such as field blanks, trip blanks, and duplicate samples
  Provide information on frequency, conditions, level of standards, and instrument calibration history

Page 32:

Field Operation Records (continued)

► General field procedures
  Record field procedures and areas of difficulty in gathering samples.

► Corrective action reports
  Where deviations from standard operating procedures occur, report the methods used and/or details of the procedure, and a plan to resolve noncompliance issues.

Page 33:

Questions and Answers (1 of 3)

Page 34:

Working With Stormwater Data

1. Factors Affecting Stormwater Sampling
2. Data Quality Objectives and the Test QA Plan
3. Sampling Design
4. Statistical Analyses
5. Data Adequacy: Case Study

Page 35:

4. Statistical Analyses

► Data Reporting and Presentation
► Statistical Method Review
► Appropriate Forms of Analyses
► Results Interpretation

Page 36:

Presenting Statistical Data and Applicability: Efficiency Calculation Implications

► Determine the category of BMP.
► BMPs with well-defined inlets and outlets whose primary treatment depends upon extended detention storage of stormwater.
► BMPs with well-defined inlets and outlets that do not depend upon significant storage of water.
► BMPs that do not have a well-defined inlet and/or outlet.
► Widely distributed BMPs that use reference watersheds to evaluate effectiveness.

Page 37:

Analysis to Address Data Quality Objectives

► What degree of pollution control does the BMP provide under typical operating conditions?
► How does efficiency vary from pollutant to pollutant?
► How does efficiency vary with input concentrations?
► How does efficiency vary with storm characteristics?
► How do design variables affect performance?
► How does efficiency vary with different operational and/or maintenance approaches?

Page 38:

Analysis to Address Data Quality Objectives (Continued)

► Does efficiency improve, decay, or remain stable over time?
► How do efficiency, performance, and effectiveness compare to other BMPs?
► Does the BMP reduce toxicity?
► Does the BMP cause an improvement in, or protect, downstream biotic communities?
► Does the BMP have potential negative downstream impacts?

Page 39:

Efficiency Ratio (ER): TARP Protocol Recommended Method

Where Event Mean Concentration (EMC):
  V = volume of flow during period i
  n = total number of events
  C = average concentration associated with period j
  m = number of events measured
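
The EMC and ER equations shown on the original slide are not reproduced in this transcript. A reconstruction consistent with the variable definitions above, assuming the standard flow-weighted forms, is:

```latex
% Reconstruction (assumed standard forms), with j indexing the sampling periods
% within an event and k indexing the m measured events.
\[
\mathrm{EMC}_k \;=\; \frac{\sum_{j} V_j\, C_j}{\sum_{j} V_j}
\qquad\text{(flow-weighted event mean concentration for event } k\text{)}
\]
\[
\mathrm{ER} \;=\; 1 \;-\;
\frac{\overline{\mathrm{EMC}}_{\mathrm{outlet}}}{\overline{\mathrm{EMC}}_{\mathrm{inlet}}},
\qquad
\overline{\mathrm{EMC}} \;=\; \frac{1}{m}\sum_{k=1}^{m} \mathrm{EMC}_k
\]
```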

Page 40:

Efficiency Ratio Interpretation

► EMCs weight all storms equally.
► Most useful when loads are directly proportional to the relative magnitude of the storm – accuracy varies with BMP type.
► Minimizes impacts of smaller/cleaner storms.
► Allows for use of data where portions of the data are missing, since this would not significantly affect the average EMC.
• Log normalization can be applied to avoid equal weighting of events.

Page 41:

Summation of Loads

The sum of loads can be calculated using concentration and flow volume, as follows:

• Removal efficiencies calculated by the summation of loads tend to be dominated by larger storm events.

• The summation of loads method uses a mass balance approach.
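
The summation-of-loads equation referenced on this slide is not reproduced in the transcript; a standard mass-balance form, given here as a reconstruction, is:

```latex
% Reconstruction (assumed standard form): total outlet load over total inlet load.
\[
\mathrm{SOL} \;=\; 1 \;-\;
\frac{\sum_{k=1}^{m} V_{\mathrm{out},k}\,\mathrm{EMC}_{\mathrm{out},k}}
     {\sum_{k=1}^{m} V_{\mathrm{in},k}\,\mathrm{EMC}_{\mathrm{in},k}}
\]
```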

Page 42:

Regression of Loads

The Regression of Loads is defined as:  

Percent reduction in loads is approximated as:

β = slope term in the regression analysis.

• Data can be dominated by large storm events.
• It is recommended that the regression line not be forced through the origin.
• A polynomial fit may be required to achieve a higher R².
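
The regression-of-loads equations are also not reproduced in the transcript; the usual formulation regresses event outlet loads against inlet loads and reads the efficiency from the slope β:

```latex
% Reconstruction (assumed standard form): outlet load regressed on inlet load.
\[
L_{\mathrm{out},k} \;=\; \beta\, L_{\mathrm{in},k} + \alpha + \varepsilon_k,
\qquad
\text{percent reduction in loads} \;\approx\; (1 - \beta) \times 100\%
\]
```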

Page 43:

Mean Concentration

The Mean Concentration (MC) is defined as:

MC = 1 – (average of outlet concentrations / average of inlet concentrations)

• May be useful for threshold-level pollutants, e.g., bacteria or toxics.
• Weights samples equally and may result in bias due to variances in sampling protocols.
• Not amenable to a mass balance approach.
• Flow measurement must represent total event characteristics.

Page 44:

Efficiency of Individual Storm Loads

The efficiency of the BMP for a single storm is given by:

Average efficiency can be calculated as follows:

• Weights all storms equally.
• Must have paired data for inflow and outflow.
• Does not account for interrelationship between storm events.
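
The two equations on this slide are not reproduced in the transcript; under the usual definitions, the single-storm efficiency and its average over m paired storms are:

```latex
% Reconstruction (assumed standard form), with L the event load (volume x EMC).
\[
E_k \;=\; \frac{L_{\mathrm{in},k} - L_{\mathrm{out},k}}{L_{\mathrm{in},k}},
\qquad
\bar{E} \;=\; \frac{1}{m}\sum_{k=1}^{m} E_k
\]
```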

Page 45:

Interpreting Results and Inappropriate Analyses

► Efficiency Ratio – Most useful when loads are directly proportional to the relative magnitude of the storm – accuracy varies with BMP type.

► Summation of Loads – A small number of large storms can significantly influence results.

► Regression of Loads – Assumes removal efficiency is uniform over a range of operating conditions and concentrations.

► Mean Concentration – Not appropriate where flow-weighted sampling is performed – weighs all events equally.

► Average efficiency requires that inflow and outflow are related.
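
Because the methods weight storms differently, it is often worth computing several of them on the same paired event data set and comparing the results, as the case study later in this module does. The sketch below is illustrative only; variable names are assumed, and for simplicity a single event volume is used for both inlet and outlet loads (use separate measured volumes where they differ).

```python
# Illustrative comparison of removal-efficiency metrics on paired event data.
# Inputs: per-event volumes and influent/effluent EMCs (equal-length lists).
from statistics import mean

def efficiency_ratio(emc_in, emc_out):
    """ER = 1 - (average outlet EMC / average inlet EMC)."""
    return 1 - mean(emc_out) / mean(emc_in)

def summation_of_loads(vol, emc_in, emc_out):
    """SOL = 1 - (summed outlet load / summed inlet load)."""
    load_in = sum(v * c for v, c in zip(vol, emc_in))
    load_out = sum(v * c for v, c in zip(vol, emc_out))
    return 1 - load_out / load_in

def regression_of_loads(vol, emc_in, emc_out):
    """1 - slope of an ordinary least-squares fit of outlet load on inlet load
    (intercept included, i.e., the line is not forced through the origin)."""
    l_in = [v * c for v, c in zip(vol, emc_in)]
    l_out = [v * c for v, c in zip(vol, emc_out)]
    xb, yb = mean(l_in), mean(l_out)
    beta = (sum((x - xb) * (y - yb) for x, y in zip(l_in, l_out))
            / sum((x - xb) ** 2 for x in l_in))
    return 1 - beta

def average_storm_efficiency(emc_in, emc_out):
    """Mean of per-storm efficiencies; with equal inlet/outlet volumes this is
    1 - (effluent EMC / influent EMC) per storm, each storm weighted equally."""
    return mean((ci - co) / ci for ci, co in zip(emc_in, emc_out))
```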

Page 46:

Using Statistics Inappropriately

► Each method is likely to produce a different efficiency.

► Method should be chosen by its relevance and applicability to each scenario, not by the efficiency value it produces.

► Be aware of statistics being misused to support claims.

► Reporting of ranges may be more appropriate under certain test conditions, e.g., determination that data is qualitative versus quantitative.

Page 47:

Questions and Answers (2 of 3)

Page 48:

Working With Stormwater Data

1. Factors Affecting Stormwater Sampling
2. Data Quality Objectives and the Test QA Plan
3. Sampling Design
4. Statistical Analyses
5. Data Adequacy: Case Study

Page 49:

5. Case Study I: Test Plan and Data Adequacy

► Review of Data
  Test QA Plan and Data Quality Assurance Project Plan
  Field and lab data adequacy
  Data reporting and depiction of performance claims

► Potential problems
  Field and lab violations
  Data reduction issues

Page 50:

Background

► Study of a structural BMP.
► Performance claim made – 80% TSS removal.
► Study commenced prior to the TARP Protocol and other published, public-domain stormwater BMP monitoring protocols.
► Resources for conducting the study were borne by a single private entity.

► Study design included limited information on site, sampling equipment, sampler programming, calibration, sample collection and analysis.

► TSS/SSC primary water quality parameter of interest.

► Flow measurement and rain gauge equipment to be installed.

► Analytical testing to be performed by outside laboratory.

► Sample collection by technology developer.

Page 51:

Sample Plan Adequacy – Issues Identified

► Provide complete engineering drawings of the system, including:
  Entire drainage area connected to the system.
  Pipe sizing and inlet locations.
  Description of pervious and impervious surfaces.
  Design calculations used to size the unit.
  Climatic data used to design structures.
  Any additional structures or site details that may have bearing on the system.

► Provide use characteristics of the site, including:
  Vehicle and industrial usage.
  Maintenance of the site relative to sweeping, gutter maintenance, spill containment, and snow stockpiling/disposal.

► Indicate condition and maintenance of system prior to commencing test, e.g., was system cleaned out during or before initiation of this phase of the study?

Page 52:

Sample Plan Adequacy – Test Equipment and Sampling Issues

► Provide a description and/or reference to manufacturer’s documentation showing how the flow meter and sampling equipment are calibrated, and their location in the system (cross-section and plan view).

► Detailed description of the sampling program by design and storm event.

► Indicate when and how much sample is to be taken.
► Provide explanation and methodology for discrete sampling. Indicate who took these samples, and when and where in the system they were taken.

► Identify equipment used and calibrations used for sampling.

► Provide discussion/explanation for sampling without detention delay in the system between inlet and outlet.

Page 53:

Sample Plan Adequacy – Lab and Data Analysis Issues

► Identify the laboratory and controls for sample handling, and QA/QC on sample data.

► Identify the methodology used for PSD (particle size distribution), including sieve sizes and sample sizes.

► Analytical method for TSS stated in the protocol: EPA 160.2.

► Need explanation and equations used to calculate EMC, including calculation worksheets.

Page 54:

Field Test Issues

► Flow meters not calibrated for the first 10 events.
► Truncated sampling protocol.
► No documented instrument calibration.
► Once the flow measurement error was identified, an adjusted flow factor was applied to the first 10 events, based on the outcome of the second 10 events.
► Measured and calculated storm volumes vary by up to a factor of 10, resulting in adjustment of flow volumes.

Page 55:

Data Analysis Issue

Storm | Rainfall Depth (in) | Flow Volume (ft³)¹ | Number of Sub-Samples | Influent EMC (mg/L) | Effluent EMC (mg/L) | Event Removal Efficiency
1 | 0.3 | 21,600 (2880) | 28 | 65.9 | 50.3 | 24%
2 | 0.52 | 12,051 (1607) | 17 | 1010.7 | 149.2 | 85%
3 | 0.32 | 9,117 (1216) | 12 | 1364.9 | 63.6 | 95%
4 | 0.32 | 13,203 (1760) | 13 | 857.6 | 49.4 | 94%
5 | 0.53 | 9,630 (1284) | 11 | 367.6 | 145.9 | 60%
6 | 0.46 | 6,831 (911) | 18 | 533.2 | 57.8 | 89%
7 | 0.55 | 14,441 (1925) | 30 | 43 | 31 | 28%
8 | 0.75 | 18,045 (2406) | 40 | 1088.8 | 52 | 95%
9 | 0.10 | 2,183 (291) | 6 | 37.2 | 33.6 | 10%
10 | 0.17 | 4,559 (608) | 12 | 61 | 38 | 38%
11 | 5.45 | 147,586 | 123 | 88.8 | 59.1 | 33%
12 | 0.48 | 1,284 | 40 | 111.6 | 47.3 | 58%
13 | 0.53 | 13,210 | 70 | 46.2 | 19.8 | 57%
14 | 0.13 | 2,908 | 12 | 69.2 | 14.7 | 79%
15 | 0.43 | 9,543 | 40 | 33.1 | 12.6 | 62%
16 | 1.91 | 71,607 | 40 | 164.1 | 93.2 | 43%
17 | 1.02 | 29,378 | 80 | 233.6 | 102.4 | 56%
18 | 0.27 | 6,858 | 33 | 93.3 | 25.5 | 73%
19 | 0.25 | 6,614 | 32 | 57.4 | 21 | 63%
20 | 0.30 | 7,753 | 37 | 188.4 | 70.3 | 63%

¹ Values in parentheses are untransformed flow values in ft³.

Page 56:

Predicted / Measured Runoff Values

► The values highlighted in red indicate that the relationship between rainfall and total runoff is inconsistent.

► Applying the rational method (Q=CiA) leads reviewers to believe that measured runoff is inaccurate when compared to predicted values.

[Figure: runoff (ft³) versus precipitation (inches), comparing adjusted runoff, measured runoff, and predicted runoff, with linear 95% confidence bounds; y-axis 0 to 30,000 ft³, x-axis 0 to 2 inches.]
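
A reviewer can reproduce this kind of plausibility check by comparing measured event volumes against a simple runoff estimate. The slide cites the rational method (Q = CiA, a peak-flow formula); the sketch below uses its volumetric analogue (runoff volume ≈ runoff coefficient × rainfall depth × drainage area). The coefficient, area, and tolerance are placeholders, not values from the case study.

```python
# Illustrative plausibility check of measured runoff volumes.
ACRE_TO_SQFT = 43_560.0

def predicted_runoff_ft3(rain_depth_in: float, area_acres: float,
                         runoff_coeff: float) -> float:
    """Volumetric estimate: C * rainfall depth (ft) * drainage area (ft^2)."""
    return runoff_coeff * (rain_depth_in / 12.0) * area_acres * ACRE_TO_SQFT

def flag_inconsistent_events(rain_in, measured_ft3, area_acres, runoff_coeff,
                             tolerance: float = 3.0):
    """Return 1-based event numbers whose measured volume differs from the
    prediction by more than `tolerance` times in either direction."""
    flagged = []
    for k, (depth, measured) in enumerate(zip(rain_in, measured_ft3), start=1):
        predicted = predicted_runoff_ft3(depth, area_acres, runoff_coeff)
        ratio = measured / predicted if predicted > 0 else float("inf")
        if ratio > tolerance or ratio < 1.0 / tolerance:
            flagged.append(k)
    return flagged
```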

Page 57:

Variation in Performance Values

Removal Efficiencies for all Events:

Removal Efficiency by Efficiency Ratio: 83%

Removal Efficiency by Summation of Loads: 73%

Removal Efficiency by Regression of Loads: 77%

Removal Efficiency by Efficiency of Individual Storm Events: 60%

Removal Efficiencies for Events 11-20:

Removal Efficiency by Efficiency Ratio: 57%

Removal Efficiency by Summation of Loads: 44%

Removal Efficiency by Regression of Loads: 40%

Removal Efficiency by Efficiency of Individual Storm Events: 59%

Page 58:

Data Analysis and Presentation Issues

► Impact of adjusted flow on average net TSS/SSC removal resulted in positive bias in performance efficiency

► Missing raw data including documented deviations from sampling plan, lab analysis and data management

► Missing laboratory and data management QA/QC

► No independent validation of calibrations, sampling and analysis

► Missing statistical analysis, including relative percent differences (precision) and percent recovery (accuracy), as well as number of samples (n), standard deviations, means (specify arithmetic or geometric), and standard errors.

► Missing documentation of chain of custody protocol

Page 59:

Ensuring Adequate Data

► Provide QC activities including blanks, duplicates, matrix spikes, lab control samples, surrogates, or second column confirmation.

► State the frequency of analysis and the spike compound sources and levels.

► State the required control limits for each QC activity, and specify corrective actions and their effectiveness if limits are exceeded.

Page 60:

Summary of Data Quality Assessment

5 Steps:

1. Review the DQOs and Sampling Design.

2. Conduct a Preliminary Data Review.

3. Select the Statistical Test.

4. Verify the Assumptions of the Statistical Test.

5. Draw Conclusions from the Data.

Page 61:

Summary: Sampling Design

► Use of TARP Protocol data collection criteria
► Consideration of issues relating to sampling locations
► Sampling water quality parameters
► Lab analysis of water quality parameters

Page 62:

Summary: Statistical Analysis

► Protocol recommends the efficiency ratio
► Summation of loads, regression of loads, & mean concentration may be applicable
► Each method gives somewhat different results
► Check statistics to be sure they support claims

Page 63:

Summary: Case Study

► Learn to recognize limitations in test plans, equipment data and documentation deficiencies, and problems with the statistical analysis
► Make decisions on the usability of the data
► Share the data among TARP states and others

Page 64:

Module II: Retrospective

► Sampling Design
  Planning, Contingency Planning, & Flexibility

► Statistical Analysis
  Methods Show Different Results – Use Caution Interpreting Results

► Data Adequacy: Case Study
  Learning to Deal with Sampling & Data Deficiencies

Page 65:

What Have You Accomplished?

► Guidance for Using the TARP Protocol
► Exposure to Key Issues in a Technology Demonstration Field Test
► Knowledge of the TARP Stormwater Work Group and Others Evaluating Technologies

Page 66:

Questions and Answers (3 of 3)

Page 67:

Evaluating Stormwater Technology Performance

TARP: http://www.dep.state.pa.us/dep/deputate/pollprev/techservices/tarp/

Links to Additional Resources