


A HYDROLOGIC/WATER QUALITY MODEL APPLICATION PROTOCOL¹

Bernard Engel, Dan Storm, Mike White, Jeff Arnold, and Mazdak Arabi²

ABSTRACT: This paper presents a procedure for standard application of hydrologic/water quality models. To date, most hydrologic/water quality modeling projects and studies have not utilized formal protocols, but rather have employed ad hoc approaches. The procedure proposed is an adaptation and extension of steps identified from relevant literature, including guidance provided by the U.S. Environmental Protection Agency. This protocol provides guidance for establishing written plans prior to conducting modeling efforts. Eleven issues that should be addressed in model application plans were identified and discussed in the context of hydrologic/water quality studies. A graded approach for selection of the level of documentation for each item was suggested. The creation and use of environmental modeling plans is increasingly important as the results of modeling projects are used in decision-making processes that have significant implications. Standard modeling application protocols similar to the procedure proposed herein provide modelers with a roadmap to be followed, reduce modelers' bias, enhance the reproducibility of model application studies, and eventually improve acceptance of modeling outcomes.

(KEY TERMS: simulation; quality assurance/quality control; runoff; nonpoint source pollution.)

Engel, Bernard, Dan Storm, Mike White, Jeff Arnold, and Mazdak Arabi, 2007. A Hydrologic/Water Quality Model Application Protocol. Journal of the American Water Resources Association (JAWRA) 43(5):1223-1236. DOI: 10.1111/j.1752-1688.2007.00105.x

INTRODUCTION

Planning for modeling projects is just as important as planning traditional environmental measurements for data collection projects. If model predictions are to be used for regulatory purposes, research, or design, then the modeling effort should be scientifically sound, robust, and defensible. To ensure this and to lead to confidence in results, the U.S. Environmental Protection Agency (USEPA, 2002) recommends a planning process that incorporates the following elements:

(1) A systematic planning process including identification of assessments and related performance criteria.

(2) Peer-reviewed theory and equations.

(3) Carefully designed life-cycle development processes that minimize errors.

(4) Documentation of changes from the original plan.

¹Paper No. J05065 of the Journal of the American Water Resources Association (JAWRA). Received May 18, 2005; accepted February 5, 2007. ©2007 American Water Resources Association. Discussions are open until April 1, 2008.

²Respectively, Professor, Agricultural and Biological Engineering, Purdue University, 225 S. University Street, West Lafayette, Indiana 47907; Professor and Research Associate, 121 Agricultural Hall, Oklahoma State University, Stillwater, Oklahoma 74078-6016; Agricultural Engineer, USDA-ARS, 808 East Blackland Road, Temple, Texas 76502; and Postdoctoral Research Associate, Agricultural and Biological Engineering, Purdue University, 225 S. University Street, West Lafayette, Indiana 47907 (E-mail/Engel: [email protected]).

JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION 1223 JAWRA

JOURNAL OF THE AMERICAN WATER RESOURCES ASSOCIATION

Vol. 43, No. 5 AMERICAN WATER RESOURCES ASSOCIATION October 2007


(5) Clear documentation of assumptions, theory, and parameterization that is detailed enough so others can fully understand the model predictions.

(6) Input data and parameters that are accurate and appropriate for the application.

(7) Model prediction data that can be used to help inform decision-making.

A Quality Assurance Project Plan and good project management in modeling projects are closely linked. A good Quality Assurance Project Plan documents all criteria and assumptions in one place for easy review and reference. The plan can be used to guide project personnel through the model development or application process and helps ensure that choices are consistent with project objectives and requirements. However, it should be noted that many assumptions and decisions cannot be made until the modeling effort is underway. A well-prepared plan can be helpful in providing guidance in such situations. Assumptions and decisions made during the modeling process should be documented.

Quality assurance (QA) in hydrologic modeling is the procedural and operational framework put in place by the organization managing the modeling study to ensure adequate execution of all project tasks, and to ensure that all modeling-based analyses are verifiable and defensible (Taylor, 1985). The two major elements of QA are quality control (QC) and quality assessment. QC addresses the procedures that ensure the quality of the final product. These procedures include: (1) the use of appropriate methodology in developing and applying computer models; (2) suitable verification, calibration, and validation procedures; and (3) proper use of the methods and model. Quality assessment is applied to monitor the QC procedures (van der Heijde, 1987).

Use of a modeling protocol provides several potential benefits to projects that include a significant modeling component. These include: (1) reducing potential modeler bias, (2) providing a roadmap to be followed, (3) allowing others to assess decisions made in modeling the system of interest, (4) allowing others to repeat the study, and (5) improving acceptance of model results.

A modeling protocol, preferably written, should be established prior to conducting a modeling study. To date, most hydrologic/water quality modeling projects and studies have not utilized formal modeling protocols; rather, ad hoc approaches are typically employed. The goal of this paper is to define the content of a modeling protocol, or a modeling QA plan, that can be used to help hydrologic/water quality modelers establish such protocols for their modeling projects.

LITERATURE REVIEW

In following the scientific method, steps should be taken to minimize the potential influence of scientists' possible bias. The use of a modeling protocol or a QA plan in modeling projects can provide the documentation needed to assess the project and can be helpful in reducing potential bias. By definition, the scientific method is impartial, and the results from the application of the scientific method must be reproducible. Therefore, the modeling protocol and associated documentation must provide enough detail to allow the modeling project to be repeated. It should be noted that models are not hypotheses, but are simply tools that are used to evaluate a hypothesis. As applied to hydrologic modeling, the steps in the scientific method may be given as follows:

(1) Based on existing theory and data, develop a hypothesis that is consistent with the current understanding of the system being modeled.

(2) Based on the hypothesis, make predictions by applying an appropriate hydrologic model.

(3) Test the hypothesis by comparing model predictions with observed data.

(4) Accept or reject the hypothesis based on appropriate criteria.

(5) If needed, modify the hypothesis and repeat Steps 2-5.

Refsgaard (1997) defined a modeling protocol as depicted in Figure 1. Refsgaard makes a distinction between a model and a model code; a model is any hydrologic model established for a particular watershed. Others might refer to Refsgaard's definition of a model as a model "setup" or a "parameterized" model. Refsgaard (1997) defined a model code as a generalized software package, which, without changes, can be used to establish a model with the same basic types of equations (but allowing different parameter values) for different watersheds. Refsgaard (1997) defined model validation as the process of demonstrating that a given site-specific model is capable of making "sufficiently accurate" predictions, where "sufficiently accurate" will vary by application and project needs. A model is considered validated if its accuracy and predictive capability in the validation period have been proven to lie within acceptable limits. Again, acceptable limits will vary by application and project requirements. Interestingly, Refsgaard (1997) does not include a model sensitivity analysis in his steps. Sensitivity analyses, discussed in more detail later in the paper, can be helpful for a variety of purposes in modeling projects.



Developing efficient and reliable hydrologic/water quality models and applying them requires numerous steps, each of which should be taken conscientiously and reviewed carefully. Taking a systematic, well-defined, and controlled approach to all steps of the model development and application process is essential for successful model implementation. QA provides the mechanisms and framework to ensure that decisions made during this process are based on the best available data and analyses.

USEPA Quality Assurance

The USEPA uses the Quality Assurance Project Plan to help project managers and planners document the type and quality of data and information needed for making environmental decisions. The USEPA (2002) has developed a document, Guidance for Quality Assurance Project Plans for Modeling (EPA QA/G-5M), to provide recommendations on how to develop a Quality Assurance Project Plan for projects involving modeling (e.g., model development, model application, as well as large projects with a modeling component). A "model" is defined by USEPA as something that creates a prediction. The guidance regarding modeling is based on recommendations and policies from USEPA Quality Assurance Project Plan protocols, but is written specifically for modeling projects, because modeling projects have different QA concerns than traditional environmental monitoring data collection projects. The structure of the Quality Assurance Project Plan for modeling is consistent with the EPA Requirements for Quality Assurance Project Plans (QA/R-5) (USEPA, 2001) and EPA Guidance for Quality Assurance Project Plans (QA/G-5) (USEPA, 1998), though for modeling not all elements are included because not all are relevant.

The USEPA Quality System defined in USEPA Order 5360.1 A2 (USEPA, 2000), Policy and Program Requirements for the Mandatory Agency-Wide Quality System, includes environmental data produced from models. Environmental data includes any measurements or information that describe environmental processes, location, or conditions; ecological or health effects and consequences; or the performance of environmental technology. As defined by USEPA, environmental data includes information collected directly from measurements, produced from models, or compiled from other sources, such as databases or literature. The USEPA Quality System is based on the American National Standard ANSI/ASQC E4-1994.

Graded Approach to QA Project Plans

USEPA defines the graded approach as "the process of basing the level of application of managerial controls applied to an item or work according to the intended use of the results and degree of confidence needed in the quality of the results" (USEPA, 1998). This allows the application of QA and QC activities to be adapted to meet project-specific needs. Models that provide an initial "ballpark" estimate or serve nonregulatory priorities, for example, would likely not require the same level of QA and planning as would models that will be used to set regulatory requirements. However, USEPA provides no explicit categorizations or other specific guidelines for applying the graded approach (USEPA, 2002).

In applying the graded approach, USEPA suggests two aspects that are important for defining the level of QA that a modeling project needs: (1) the intended use of the model and (2) the project scope and magnitude (USEPA, 2002). The intended use of the model is a determining factor because it is an indication of the potential consequences or impacts that might occur because of QC problems. For example, higher standards might be set for projects that involve potentially large consequences, such as Congressional testimony, development of new laws and regulations, or the support of litigation. More modest levels of defensibility and rigor would often be acceptable for data used for technology assessment or "proof of principle," where no litigation or regulatory action is expected. Still lower levels of defensibility would likely apply to basic exploratory research requiring extremely fast turnaround, or high flexibility and adaptability. In such cases, the work may have to be replicated under more stringent controls, or the results carefully reviewed, prior to publication. The USEPA (2002) suggests peer review may be substituted, to some extent, for the level of QA. By analyzing the end-use needs, appropriate QA criteria can be established to guide the program or project. The examples presented are for illustration only, and the degree of rigor needed for any particular project should be determined based on an evaluation of the project needs and resources.

[Figure 1 (flowchart): Define purpose → Conceptual model → Code selection (if no suitable existing code, code development: numerical formulation → computer program → code verification, with comparison against analytical solutions, other codes, and field data) → Model construction → Performance criteria → Calibration → Validation → Simulation → Presentation of results → Postaudit; field data inform calibration, validation, and postaudit.]

FIGURE 1. A Hydrological Model Protocol as Proposed by Refsgaard (1997).

Other aspects of the QA effort can be established by considering the scope and magnitude of the project. The scope of the model development and application determines the complexity of the project; more complex models or modeling projects likely need more QA effort. The magnitude of the project defines the resources at risk if quality problems lead to rework and delays.

The QA Project Plan Elements for a Model Application Project

The USEPA (2002) defined the following nine model application tasks and mapped them into Quality Assurance Project Plan elements: (1) needs assessment; (2) purpose, objectives, and output specifications; (3) define quality objectives, desired performance criteria, and documentation needs for model output; (4) select the most appropriate model; (5) data development, model parameterization, and model calibration; (6) determine whether data, models, and parameters for the application meet desired performance criteria; (7) run the computer code; (8) model output testing and peer review; and (9) summarize results and document. Further details on how these modeling tasks fit within a potential modeling QA plan are described in Guidance for Quality Assurance Project Plans for Modeling (USEPA, 2002).

In this paper, we develop a standard protocol for conducting modeling efforts with an emphasis on hydrologic/water quality modeling. To this end, the work of USEPA (2002) and other relevant literature were reviewed to arrive at a preliminary list of recommended steps in conducting modeling studies. These steps were further extended based on the authors' experience to include issues such as representation of nonpoint source (NPS) best management practices (BMPs). A detailed description of the proposed steps follows.

MODEL APPLICATION PROTOCOL STEPS

A hydrologic/water quality model application protocol is proposed based on the authors' experiences and review of the literature, including the USEPA (2002) Guidance for Quality Assurance Project Plans for Modeling document. The authors recognize that a "graded" approach in implementing a modeling protocol will be required, and thus not all modeling QA plans will include all sections or issues suggested. Further, the level of detail in such plans will vary greatly depending on the purpose of the model application project. The USEPA (2002) suggests that a graded approach can be used to define the level of QA effort that a modeling project needs based on the intended use of the model and the project scope and magnitude (Table 1).

The following items or sections should be included in a hydrologic/water quality modeling protocol: (1) problem definition/background; (2) model application goals, objectives, and hypothesis; (3) model selection; (4) model sensitivity analysis; (5) available data; (6) data to be collected; (7) model representation issues (data, BMPs, etc.); (8) model calibration; (9) model validation; (10) model scenario prediction; and (11) results interpretation/hypothesis testing.
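The eleven items above lend themselves to being tracked as a checklist in a project's QA documentation. A minimal sketch of such a tracker follows; the structure and field names are our own illustration, not part of the protocol or of USEPA guidance:

```python
from dataclasses import dataclass, field

# The eleven protocol items, in the order proposed in the text
PROTOCOL_ITEMS = [
    "problem definition/background",
    "goals, objectives, and hypothesis",
    "model selection",
    "sensitivity analysis",
    "available data",
    "data to be collected",
    "model representation issues",
    "calibration",
    "validation",
    "scenario prediction",
    "results interpretation/hypothesis testing",
]

@dataclass
class ProtocolPlan:
    """Records the documented rationale for each protocol item."""
    notes: dict = field(default_factory=dict)

    def document(self, item, rationale):
        """Attach a written rationale to one protocol item."""
        if item not in PROTOCOL_ITEMS:
            raise ValueError(f"unknown protocol item: {item}")
        self.notes[item] = rationale

    def undocumented(self):
        """Items still lacking documentation, supporting a graded approach:
        not every project will need all eleven filled in."""
        return [i for i in PROTOCOL_ITEMS if i not in self.notes]
```

Under the graded approach, a project team might deliberately leave some items undocumented and record why, rather than treating an empty entry as an oversight.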

The proposed modeling protocol steps may be iterative. For example, the scientific literature and a preliminary sensitivity analysis using general data may initially be used to identify the model parameters that are the most sensitive. A more comprehensive sensitivity analysis may be performed later, once more detailed location-specific data have been collected or obtained.

Decisions made throughout the modeling effort and the rationale for these decisions should be documented. In most instances, it will be necessary to make various assumptions and decisions throughout the modeling project. Many of these assumptions are best made during the modeling project rather than before the modeling starts, as information from prior steps may impact decisions. The amount of documentation that should be created depends on the project goals and the consequences of decisions that will be made as a result of the project findings. Each of the modeling protocol steps is discussed in more detail in the sections that follow.

Item 1. Problem Definition/Background

Background information and preliminary data for the study area should be obtained to help initially define the overall problem that will be addressed by the study. The background information and data collected in this step will be useful to determine whether modeling will be necessary, to assist in defining the modeling objectives (if modeling is required), and to select the model or models to be used. More detailed objectives or hypotheses to be examined within the project are defined in the subsequent step. This initial step is similar to the initial observation phase commonly employed within the scientific method.

Questions that may be addressed when defining the problem include the following:

(1) What is the specific problem?

(2) What are the overall goals and objectives of this project that will address this problem?

(3) Why should a modeling approach be used to address the problem?

(4) How will modeling of the problem help to address the overall goals of the project?

It is also important to place the problem in context to provide a sense of the project's purpose relative to other project and program phases and initiatives. Questions that might be addressed include the following:

(1) What information, previous work, or previous data may currently exist that this project can use?

(2) Given that the problem is best solved by a modeling approach, what models currently exist (if any) that can be used to achieve this project's goals and objectives?

(3) What are the advantages and disadvantages of using these models?

The presentation of background information may also include a discussion of initial ideas or potential approaches for model application.

Item 2. Model Application Goals, Objectives, and Hypothesis

The specific objectives and/or hypotheses to be accomplished or tested by the modeling effort are defined based on the background information and data collected in the first step. The objectives or hypotheses should be stated in a manner such that they can be tested or evaluated using the model predictions.

In setting the objectives or hypotheses to be tested, one should keep in mind that models are typically more accurate when making relative comparisons rather than absolute predictions. Thus, an objective or hypothesis might be written to compare expected pollutant losses for different tillage systems rather than to examine whether a particular tillage system results in pollutant losses below a given magnitude. Model calibration can help improve the accuracy of absolute predictions, but adequate calibration data that represent the range of conditions of interest for the location of interest are not always available.
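The preference for relative comparisons can be made concrete: rather than testing whether a predicted loss falls below a fixed magnitude, compare scenarios against a baseline. A brief sketch, in which the function name and the tillage-loss values are purely hypothetical:

```python
def percent_reduction(baseline_loss, scenario_loss):
    """Relative change in predicted pollutant loss versus a baseline
    scenario. Models are typically more reliable for this kind of
    relative comparison than for the absolute magnitudes themselves."""
    return 100.0 * (baseline_loss - scenario_loss) / baseline_loss

# Hypothetical predicted sediment losses (t/ha/yr) for two tillage systems
conventional = 12.0
conservation = 7.5
reduction = percent_reduction(conventional, conservation)  # 37.5% reduction
```

Framing the hypothesis around `reduction` rather than around the absolute values of either scenario keeps the test within the region where the model is most trustworthy.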

A summary of the work to be performed and the "products" to be created by the model application effort should be identified. These will be described in more detail in subsequent sections.

TABLE 1. Examples of Modeling Projects With Differing Intended Uses (adapted from USEPA, 2002).

Purpose for Obtaining Model-Generated Information (intended use) | Typical QA Issues

Regulatory compliance; litigation; Congressional testimony | Legal defensibility of data sources; compliance with laws and regulatory mandates applicable to data gathering

Regulatory development; State Implementation Plan (SIP) attainment; verification of model | Compliance with regulatory guidelines; existing data obtained under suitable QA program; audits and data reviews

Trends monitoring (nonregulatory); technical development; "proof of principle" | Use of accepted data-gathering methods; use of widely accepted models; audits and data reviews

Basic research; bench-scale testing | QA planning and documentation at the facility level; peer review of novel theories and methodology

(The required level of QA decreases from the top row to the bottom row.)



Item 3. Model Selection

An appropriate model should be selected based on (1) the project goals, objectives, or hypotheses; (2) how model results will be used; (3) the characteristics of the hydrologic/water quality system that are important to the objectives or hypotheses; and (4) various other factors, including: (a) appropriate level of detail (space and time); (b) inclusion of the important chemical, physical, and biological processes; (c) calibration requirements; (d) data requirements and availability; (e) previous applications of the model and acceptance in the scientific, regulatory, and stakeholder communities; (f) ease of use; (g) sensitivity to processes of interest; and (h) available resources (e.g., modeler expertise, model technical support) and time.
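One common way to operationalize such multi-criteria selection is a weighted scoring matrix across candidate models. The sketch below is one possible approach, not a method prescribed by the protocol; the criteria, weights, scores, and model names are entirely illustrative:

```python
def rank_models(scores, weights):
    """Rank candidate models by weighted-sum score (higher is better).
    `scores` maps model name -> {criterion: 0-10 score};
    `weights` maps criterion -> relative importance (summing to 1)."""
    totals = {
        model: sum(weights[c] * s for c, s in crit.items())
        for model, crit in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical weights and scores for two unnamed candidate models
weights = {"data availability": 0.40, "process detail": 0.35, "ease of use": 0.25}
scores = {
    "Model A": {"data availability": 8, "process detail": 5, "ease of use": 9},
    "Model B": {"data availability": 6, "process detail": 9, "ease of use": 4},
}
best = rank_models(scores, weights)[0]
```

The value of the exercise lies less in the final ranking than in forcing the weights, and hence the project priorities, to be written down before a model is chosen.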

Item 4. Model Sensitivity Analysis

A model sensitivity analysis can be helpful in understanding which model inputs are most important or sensitive and in understanding potential limitations of the model. Additional care should be taken when estimating the model parameters that are the most sensitive. Data collection efforts that support the modeling study may focus on obtaining better data for the most sensitive parameters.

The sensitivity analysis can also identify potential limitations of the model. If a model is not sensitive to parameters that are to be varied in testing the project objectives or hypotheses, a different model may need to be selected. Models are abstractions of the systems they simulate and therefore typically represent system components with varying levels of detail. For example, the scientific literature may indicate that differences in tillage practices influence pesticide losses in surface runoff. In such a case, the use of a model that is not sensitive to tillage to examine the impact of switching from conventional tillage to conservation tillage on pesticide losses in surface runoff is likely inappropriate.

The literature and model documentation are often excellent sources of information on model sensitivity. For example, Muttiah and Wurbs (2002) identified the sensitivity of the Soil and Water Assessment Tool (SWAT) to various parameters. However, it may be necessary to conduct a sensitivity analysis for the study watershed if its conditions are significantly different from those for model sensitivity analyses reported in the literature, as model sensitivity may be specific to the model setup. Thus, limited data for parameterizing the model may need to be collected prior to conducting a sensitivity analysis. Generally, the sensitivity analysis should be completed using an uncalibrated model setup, as the sensitive parameters and those with the greatest uncertainty are typically used for model calibration. For example, Spruill et al. (2000) conducted a SWAT sensitivity analysis to evaluate parameters that were thought to influence stream discharge predictions. During calibration, the average absolute deviation between observed and simulated streamflows was minimized and used to identify optimum values or ranges for each parameter.

Item 5. Available Data

The goal of this step is to select the most appropriate data for the modeling effort. Data available for the modeling effort will likely come from numerous sources. An assessment of the available data, their quality, and the time period they cover should be made. The amount of data available for a watershed can vary greatly, as can the quality of the data. For example, flow and water quality data may be available for 1983 through 1988, while land use data might have been developed for conditions in 1995. This may result in a misrepresentation of the land uses that were present during the observed flow and water quality data period, especially for areas experiencing rapid urbanization. In other instances, differences in data collected at different dates may be negligible. For example, soil property data used in modeling runoff from a watershed would not typically change significantly over time, even over periods of tens of years. In instances where data, such as land use, may have changed significantly, it may be necessary to estimate data for the period of interest by interpolating between datasets for different time periods or by adjusting the data from the available time period using other sources of data and information.
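Interpolating between datasets from different periods, as suggested above, can be as simple as a linear blend of land-use class fractions between two inventory years. The years and fractions below are hypothetical, and the linearity assumption (steady change between inventories) should itself be documented:

```python
def interpolate_landuse(year_a, frac_a, year_b, frac_b, target_year):
    """Linearly interpolate land-use class fractions between two
    inventory years; assumes change was roughly steady in between."""
    w = (target_year - year_a) / (year_b - year_a)
    return {cls: (1 - w) * frac_a[cls] + w * frac_b[cls] for cls in frac_a}

# Hypothetical class fractions for 1985 and 1995 inventories,
# estimated for a 1988 water quality monitoring period
lu_1985 = {"cropland": 0.60, "urban": 0.10, "forest": 0.30}
lu_1995 = {"cropland": 0.45, "urban": 0.30, "forest": 0.25}
lu_1988 = interpolate_landuse(1985, lu_1985, 1995, lu_1995, 1988)
```

For rapidly urbanizing watersheds, a linear blend may still misstate the timing of change, and ancillary data (e.g., building permits or aerial photography) may justify a nonlinear adjustment instead.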

The USEPA (2002) indicates that a Quality Assurance Project Plan for modeling should address the following issues regarding how nondirect measurements (data and other information that have been previously collected or generated under some effort outside the specific project being addressed by the Quality Assurance Project Plan) are acquired and used in the project:

(1) The need and intended use of each type of data or information to be acquired;

(2) How the data will be identified or acquired, and the expected sources of these data;

(3) The method of determining the underlying quality of the data; and

(4) The criteria established for determining whether the level of quality for a given set of data is acceptable for use in the project.



Water quality and runoff data for the study watershed may be available from federal, state, or local government agencies. For example, the U.S. Geological Survey is often an excellent source of streamflow data, and the USEPA STORET database may provide useful water quality data. Datasets may also be available from past studies and often are documented in project reports. In many instances, these data will not be identified by simply conducting a literature search; rather, contacts with local universities, state and local agencies, and local watershed groups will likely be necessary.

Well documented and widely used datasets, such as soil properties from the U.S. Department of Agriculture Natural Resources Conservation Service, often have well understood properties and degree of uncertainty. It is useful to understand this degree of uncertainty, the assumptions in the data, and how these will likely impact the model results. Spatial or geographic information systems (GIS) data may be available from federal, state, and local government agencies. Increasingly, county and local governments are developing detailed spatial datasets. For example, many county governments with urban areas have developed detailed elevation datasets that provide more detail than state and national elevation datasets. Spatial data from these sources should have metadata available that describe the accuracy and other properties of the data that will be helpful in understanding the data quality and limitations.

Remotely sensed datasets from satellites and aerial photography can potentially provide land use and other data needed in hydrologic/water quality modeling studies. In addition, archived satellite data and aerial photography may be useful in creating land use information for the past. Remotely sensed datasets will require interpretation to create the land use or other data that are needed. Accuracy assessments of the interpreted results should be performed to provide information concerning the quality of the land use products created.

The scientific literature may contain some information about the study area. Project reports, however, are more likely to contain the detailed data typically required for a model application project. Scientific papers may also provide insight into transformation of various data into the data or parameters required by the model. In most cases, these data must be transformed into values and formats required by the model.

After identifying the data available and its various properties, including quality and temporal aspects, an assessment of the suitability of the data for use in the model that has been selected must be made. The model data requirements and the sensitivity of the model to various parameters should be considered when evaluating and selecting the data to use. The rationale for the data selected for use in the model should be well documented, as should any required data transformation.

The scientific literature contains numerous studies on the impacts that various data sources and data errors can have on model results. For example, Chaubey et al. (1999) explored the assumption of spatial homogeneity of rainfall when parameterizing models and concluded that large uncertainty in estimated model parameters can be expected if detailed variations in the input rainfall are not considered. Nandakumar and Mein (1997) examined the levels of uncertainty in rainfall-runoff model predictions as a result of errors in hydrological and climatic data, and considered the implications for prediction of the hydrologic effects of land use changes. Studies such as these highlight the importance of understanding the consequences of the data used in the project on the model results and their interpretation.

Item 6. Data to Be Collected

Based on the project objectives and hypotheses, available data and model sensitivity should be considered in deciding what, if any, additional data should be collected. After assessing these issues, the modeler may conclude that additional data should be collected. Following calibration or validation, the modeler may also decide that additional data should be collected in an attempt to improve model performance. The collection of additional data can be expensive, as well as require a significant amount of time. An appropriate QA plan for the collection of additional data should be prepared and followed (USEPA, 1998).

Item 7. Model Representation Issues – Data, Best Management Practices, etc.

Models are abstractions of the systems they are simulating. Therefore, the modeler will be required to make decisions on how to represent the various components of the system being modeled. This may include decisions on representation of components within the model and in the transformation of available data into the formats needed by the model. These decisions should be documented. The expected effect of these assumptions on the results, relative to alternative assumptions that could have been made, should also be documented.

One of the data representation issues typically faced is related to pollutant sources. It is typically impossible to include all pollutant sources in the modeling effort. For example, if the amount of phosphorus leaving a watershed is of interest, the modeler may decide not to include phosphorus losses from septic systems if they are small relative to other sources. Criteria should be established to determine which pollutant sources to include in the model and/or overall analysis. A simple mass balance for water and pollutants of interest may be helpful to identify the most important components of the hydrologic cycle and system to model and the most important sources of pollutants to consider. Another option is to use the selected model to perform a simple preliminary model simulation using limited data. Based on such a model, criteria to exclude pollutant sources that represent less than 5% (or other levels deemed appropriate) of the pollutant might be established. It should be noted, however, that pollutant sources less than this threshold may be included if these data are readily available and easy to incorporate into the model. When potential pollutant sources are not incorporated into the model, care in the interpretation of the final model results is required.
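
A screening rule like the 5% criterion described above can be sketched as a simple filter over preliminary load estimates; the source names and loads below are hypothetical.

```python
def screen_sources(loads, threshold=0.05):
    """Keep pollutant sources whose share of the total estimated load
    meets or exceeds the threshold (default 5%)."""
    total = float(sum(loads.values()))
    return {src: load for src, load in loads.items() if load / total >= threshold}
```

With hypothetical preliminary phosphorus loads of 70 (cropland), 26 (pasture), and 4 (septic systems) kg/yr, septic systems fall below the 5% threshold and would be excluded, subject to the interpretation caveats noted above.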

The representation of BMPs within the model may not be well defined. Model documentation and the scientific literature can often provide guidance in BMP representation (Bracmort et al., 2006). However, in most instances, these sources do not fully describe how a specific BMP, such as a grassed waterway, should be represented within a particular model; rather, the modeler must exercise judgment in the BMP representation decision. Therefore, the modeler will need to determine how BMPs will be represented in the application of a model to a given location.

The accuracy of hydrologic/water quality models also depends in part on how well model input parameters describe the relevant characteristics of the watershed. Data that are obtained for a watershed will typically require some transformation and interpretation to create the inputs required by the model. For example, soil properties in the SSURGO database are often reported with a range of values, while the model will require a single value for each soil property. The model documentation and scientific literature can often provide guidance in transforming commonly available data into the inputs required by the model. The data that were used and the decisions made in data transformations should be documented.
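
A common, if crude, way to collapse a reported (low, high) soil property range into the single value a model expects is the midpoint; this sketch is illustrative only, and other choices (e.g., a representative value published with the soil record) may be preferable and should be documented either way.

```python
def representative_value(low, high):
    """Collapse a (low, high) soil property range into one value (midpoint)."""
    return (low + high) / 2.0
```

For example, an available water capacity reported as the range 0.10 to 0.18 would be modeled as 0.14.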

Input parameter aggregation may have a substantial impact on model output. For example, FitzHugh and Mackay (2000) used SWAT to determine how the size or number of subwatersheds used to partition the watershed affect model output and the processes responsible for model behavior. Mankin et al. (2002) explored the errors introduced when translating GIS data into model-input data. Watershed modelers using GIS data should be aware of the issues related to appropriate grid cell sizes, generation of land-management practice GIS coverages, accuracy of GIS data, and accuracy of interface algorithms.

Refsgaard and Storm (1996) indicated that a rigorous model parameterization procedure is crucial to avoid methodological problems in subsequent phases of model calibration and validation. They suggest the following points are important to consider in model parameterization:

(1) Parameter classes (soil types, vegetation types, etc.) should be selected so that it is easy, in an objective way, to associate parameter values. Thus, when possible, parameter values in the classes should be determined based on available field data.

(2) Determine which parameters can be assessed from field data or the literature and which will require calibration. For parameters subject to calibration, the physically acceptable intervals for the parameter values should be estimated and documented.

(3) The number of calibration parameters should be minimized from both practical and methodological points of view. Fixing a pattern for a spatially varying parameter but allowing its value to be modified uniformly throughout the watershed can help minimize the number of calibrated parameters.
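
Point (3) can be illustrated as follows: the spatial pattern of a parameter is fixed from field data, and only one uniform multiplier is calibrated, with values clipped to the documented physically acceptable interval. The function below is a hypothetical sketch, not a prescribed implementation.

```python
def scale_parameter_field(base_values, multiplier, lower, upper):
    """Scale a spatially varying parameter field by one calibrated multiplier,
    keeping every value inside its physically acceptable interval."""
    return [min(max(v * multiplier, lower), upper) for v in base_values]
```

Only the multiplier is adjusted during calibration, so the number of calibrated parameters stays at one regardless of how many spatial units the watershed contains.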

Item 8. Model Calibration

The USEPA (2002) indicates that if no nationally recognized calibration standards exist, the basis for the calibration should be documented. Quality Assurance Project Plan guidance indicates that calibration for data collection efforts addresses calibration of the analytical instruments that will be utilized to generate analytical data. In modeling projects, by analogy, the "instrument" is the predictive tool (the model) that is to be applied (USEPA, 2002). All models, by definition, are a simplification of the processes they are intended to represent. When formulating the mathematical representations of these processes, there are relationships and parameters that need to be defined. Estimating parameters for these relationships is called calibration. Some model parameters may need to be estimated for every application of the model using site-specific data. Similar to an analytical instrument, models are calibrated by comparing the predictions (output) for a given set of assumed conditions to observed data for the same conditions. This comparison allows the modeler to evaluate whether the model and its parameters reasonably represent the environment of interest. Statistical methods typically applied when performing model calibrations include regression analyses and goodness-of-fit methods. An acceptable level of model performance should be defined prior to the initiation of model calibration. The details of the model calibration procedure, including statistical analyses that are involved, should be documented.

Calibration Procedures

Model calibration is often important in hydrologic modeling studies, as uncertainty in model predictions can be reduced if models are properly calibrated. Factors contributing to difficulties in model calibration include calibration data with limited metadata, data with measurement errors, and spatial variability of rainfall or watershed properties poorly represented by point measurements. Model calibration can be done manually or by a combination of manual and automatic procedures. Manual calibration can be subjective and time-consuming (Eckhardt and Arnold, 2001). Initial values can be assigned to parameters, which are then optimized by an automatic procedure (Gan et al., 1997). Chanasyk et al. (2002) calibrated SWAT until the predicted and observed results were visibly close. Many studies use comparable ad hoc approaches in calibration. However, approaches that use only visual comparison should be avoided. One of the advantages of an automated approach to calibration is that it adjusts the model parameters systematically, thereby removing potential modeler bias. With an ad hoc calibration approach, the modeler could potentially adjust model parameters during calibration to create a model setup or parameterization that would be more likely to provide desired results when testing the project objectives or hypotheses.
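
A minimal automated alternative to ad hoc adjustment is a systematic search that scores every candidate parameter value with the same objective function. The one-parameter brute-force sketch below is illustrative only; real calibrations typically use optimization algorithms over multiple parameters.

```python
def calibrate_one_parameter(run_model, observed, bounds, n_steps=101):
    """Search a single parameter over [lo, hi] on a uniform grid, returning
    the value that minimizes the sum of squared errors against observations."""
    lo, hi = bounds
    best_value, best_sse = lo, float("inf")
    for i in range(n_steps):
        value = lo + (hi - lo) * i / (n_steps - 1)
        simulated = run_model(value)
        sse = sum((s - o) ** 2 for s, o in zip(simulated, observed))
        if sse < best_sse:
            best_value, best_sse = value, sse
    return best_value
```

Because every candidate is scored with the same criterion, the procedure leaves no room for the modeler to steer parameters toward desired outcomes.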

Santhi et al. (2001a) presented a flowchart with the decision criteria used during the calibration of SWAT. This flowchart has been adapted by Arabi et al. (2004) and others for calibration of SWAT, and an adapted version is presented in Figure 2. In some instances, this approach is too rigid to be strictly followed because of the interactions between model parameters, and thus the modeler may need to deviate from such an approach.

The approach that will be followed in calibrating the model should be identified prior to beginning calibration. Performance criteria should also be established prior to beginning model calibration so that the modeler knows when the model has been successfully calibrated. The scientific literature can often provide an idea of the likely performance of the model following calibration. Statistical measures can be used to identify performance criteria for determining whether the model has been calibrated successfully. For some efforts, an ad hoc calibration approach may be acceptable, while in other instances it will be desirable to have a specific calibration protocol.

For projects supporting regulatory decision-making, the USEPA (2002) suggests the level of detail on model calibration in the Quality Assurance Project Plan should be sufficient to allow another modeler to duplicate the calibration method, if the modeler is given access to the model and to the data being used in the calibration process. For other projects (e.g., some basic research projects), it may be acceptable to provide less detail on this issue in the Quality Assurance Project Plan. In some instances, projects may use procedures that are somewhat different from standard calibration techniques, such as "benchmarking" procedures, and therefore the level of detail may differ from what is generally portrayed for calibration.

Examples of features that the model calibration portion of the Quality Assurance Project Plan may address include the following:

(1) Objectives of model calibration activities, including acceptance criteria;

(2) Details on the model calibration procedure;

(3) Method of acquiring the input data;

(4) Types of output generated during model calibration;

(5) Method of assessing the goodness-of-fit of the model calibration equation to calibration data;

(6) Method of quantifying variability and uncertainty in the model calibration results; and

(7) Corrective action to be taken if acceptance criteria are not met.

The calibration plan should identify the parameters that will be adjusted, the order in which they will be adjusted, and the ranges in which the adjusted parameters must fall. The ranges of parameters used in calibration and the calibration results obtained should be documented during calibration.

Not all models must be calibrated prior to using the model to test the project objectives or hypotheses. However, in most cases, calibration of the model to the conditions of the study watershed(s) can reduce the uncertainty in model predictions. If models are not calibrated, they should still be validated for the study watershed if data are available.

For hydrologic/water quality models, the hydrologic components are usually calibrated first. In the calibration of the hydrologic components of the model, it may be necessary to separate streamflow into direct or surface runoff and base flow. Once surface runoff and base flow volumes are balanced for the system of interest, the other components of the hydrologic budget can then be balanced. The model is typically calibrated first to obtain acceptable performance in the hydrologic components, then for sediment, and finally for nutrients, pesticides, bacteria, or other constituents.
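
Separating streamflow into surface runoff and base flow is often done with a recursive digital filter. The sketch below follows the widely used one-parameter Lyne-Hollick form in a single forward pass; practice often applies several passes, and the initialization used here is an arbitrary assumption.

```python
def separate_baseflow(streamflow, alpha=0.925):
    """One forward pass of the Lyne-Hollick digital filter.

    Returns the base flow series; quickflow is constrained to lie
    between zero and the total streamflow at each time step."""
    quick = [0.5 * streamflow[0]]  # assumed initial quickflow
    for t in range(1, len(streamflow)):
        f = (alpha * quick[-1]
             + 0.5 * (1.0 + alpha) * (streamflow[t] - streamflow[t - 1]))
        quick.append(min(max(f, 0.0), streamflow[t]))
    return [q - qf for q, qf in zip(streamflow, quick)]
```

The filter parameter alpha (commonly near 0.925 for daily data) controls how much of a flow rise is attributed to quickflow versus base flow.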

Calibration Data

Data that will be used for calibration should be identified. One common method is to split observed data into one dataset for calibration and one for validation. It is important that the calibration and validation datasets each have observed data of approximately the same magnitudes. For example, both calibration and validation datasets should have periods with high and low flows in order to increase the robustness of the model.
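
One simple way to keep the two datasets comparable is to rank years by annual flow and assign them alternately, so that each set spans both wet and dry conditions. This scheme and the example flows in the test are illustrative assumptions, not a prescribed method.

```python
def split_calibration_validation(annual_flows):
    """Rank years by annual flow and alternate assignment so the calibration
    and validation sets both contain high- and low-flow years.

    annual_flows maps year -> annual flow volume; returns (cal, val) year lists."""
    ranked = sorted(annual_flows, key=annual_flows.get, reverse=True)
    return sorted(ranked[0::2]), sorted(ranked[1::2])
```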

Yapo et al. (1996) used varying lengths of calibration data and found that approximately eight years of data were needed to obtain calibrations that were insensitive to the calibration period selected for their watershed. Gan et al. (1997) indicated that, ideally, calibration should use three to five years of data that include average, wet, and dry years so that the data encompass a sufficient range of hydrologic events to activate all the model components during calibration. However, the required amount of calibration data is project specific.

FIGURE 2. Example SWAT Calibration Flowchart (adapted from Santhi et al., 2001a).


Calibration Statistics

The goodness-of-fit statistics to be used in describing the model's performance relative to the observed data should be selected prior to calibration and validation. The American Society of Civil Engineers (ASCE) Task Committee (1993) recommended graphical and statistical methods useful for evaluating model performance. In most instances, both visual comparisons of predicted and observed data, as well as goodness-of-fit statistics, should be used. Plotting predicted and observed results along with the 1:1 line can be helpful in identifying model bias. The percent deviation of predicted values from observed values is one numerical goodness-of-fit criterion. A second basic goodness-of-fit criterion recommended by the ASCE Task Committee (1993) is the Nash-Sutcliffe coefficient, or coefficient of simulation efficiency. Legates and McCabe (1999) evaluated various goodness-of-fit measures for hydrologic model validation and suggested that correlation and correlation-based measures (e.g., the coefficient of determination) are oversensitive to extreme values and are insensitive to additive and proportional differences between model estimates and observed values. Thus, correlation-based measures can indicate that a model is a good predictor, even when it is not. Legates and McCabe (1999) concluded that measures such as the Nash-Sutcliffe coefficient of efficiency and the index of agreement are better measures for hydrologic model assessment than correlation-based measures. They suggested a modified Nash-Sutcliffe coefficient that is less sensitive to extreme values may be appropriate in some instances. They also suggested additional evaluation measures, such as summary statistics and absolute error measures (observed and modeled means and standard deviations, mean absolute error, and root mean square error), should be reported for model results.
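
The two basic criteria named above can be computed directly from paired series. A minimal sketch follows; the Nash-Sutcliffe coefficient equals 1 for a perfect fit and 0 when the model predicts no better than the observed mean.

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe coefficient of efficiency:
    1 - (sum of squared errors) / (variance of observations about their mean)."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_obs = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ss_obs

def percent_deviation(observed, simulated):
    """Percent deviation of total simulated volume from total observed volume."""
    return 100.0 * (sum(simulated) - sum(observed)) / sum(observed)
```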

There are no standard values or ranges of goodness-of-fit statistics that adjudge model performance as acceptable (Loague and Green, 1991). Ramanarayanan et al. (1997) suggested values of goodness-of-fit statistics computed based on monthly computations for determining the acceptable performance of the APEX model. They indicated that values close to zero for the correlation coefficient and/or the Nash-Sutcliffe coefficient indicated that the model performance was unacceptable or poor. They judged the model performance as satisfactory if the correlation coefficient was greater than 0.5 and the Nash-Sutcliffe coefficient was greater than 0.4. Santhi et al. (2001a) assumed a Nash-Sutcliffe coefficient greater than 0.5 and a goodness of fit (R2) greater than 0.6 indicated acceptable model performance when calibrating SWAT. However, acceptable statistical measures are project specific.

The literature can provide typical ranges of goodness-of-fit statistics for models. For example, Saleh et al. (2000) obtained Nash-Sutcliffe coefficients for average monthly flow, sediment, and nutrient loading at 11 locations with values ranging from 0.65 to 0.99, indicating reasonable SWAT predicted values. SWAT also adequately predicted monthly trends in average daily flow, sediment, and nutrient loading over the validation period, with Nash-Sutcliffe coefficients ranging from 0.54 to 0.94, except for NO3-N, which had a value of 0.27. Fernandez et al. (2002) developed a GIS-based, lumped parameter water quality model to estimate the spatial and temporal nitrogen-loading patterns for lower coastal plain watersheds in eastern North Carolina. Predicted nitrogen loads were highly correlated with observed loads (correlation coefficients of 0.99 for nitrate-nitrogen, 0.90 for total Kjeldahl nitrogen, and 0.96 for total nitrogen). However, the limitations of correlation coefficients, as discussed previously, should be considered in interpretation of these results. Spruill et al. (2000) evaluated SWAT and its parameter sensitivities for streamflow from a small central Kentucky watershed and concluded the model adequately predicted the trends in daily streamflow, although Nash-Sutcliffe coefficient values were 0.19 for calibration and -0.04 for validation. The Nash-Sutcliffe coefficients for monthly total flows were 0.58 for validation and 0.89 for calibration.

In some instances, model calibration may not yield results that are acceptable based on the predefined model performance criteria. If this occurs, the observed flow and pollutant data as well as the model input data should be examined for potential errors. The poor model performance may be an indication that more detailed model inputs are required. In other cases, this may be an indication that the model is unable to adequately represent the processes of interest for the watershed.

Item 9. Model Validation

When possible, it is important to reserve some observed data (e.g., flow and water quality data) for model validation. Additional discussion of the data for validation and calibration can be found in the Model Calibration section. Prior to beginning model validation, the criteria used to validate, that is, accept, reject, or qualify the model results, should be documented (USEPA, 2002). The same statistics used and reported for model calibration should be used in model validation. Typically, the values of these statistics are lower for validation than calibration. Acceptable levels of performance may be difficult to identify. Acceptable model performance levels that have been proposed are discussed in the Model Calibration section. The scientific literature can provide suggestions for levels of performance that might be anticipated for a given model. The specific purpose of the study, the available data, and other factors should be considered when establishing the performance criteria. For example, the time period considered can impact model performance. Typically, model performance is poorer for shorter periods than for longer periods (e.g., daily vs. monthly or yearly). For example, Yuan et al. (2001) found that AnnAGNPS provided an R2 of 0.5 for event comparison of predicted and observed sediment yields, while the agreement between monthly data had an R2 of 0.7.

In some instances, acceptable model performance may not be obtained during the validation step. Note that the utility of the model may not depend on a single performance indicator, and therefore, some judgment will be required by the modeler. The potential uncertainty associated with models and model setups that do not attain the desired level of performance during validation will be greater than for those whose performance is deemed acceptable. Unacceptable model performance for validation can be an indication that the validation period data ranges or conditions are significantly different from those for the calibration period. Therefore, care in the selection of the data for calibration and validation periods is needed. In other cases, poor performance during validation may be an indication that the model has not been adequately or properly calibrated. It is possible that numerous model setups or parameterizations can provide acceptable model results for calibration. However, during validation such setups may provide poor results. In such cases, the model should be re-calibrated and validation attempted again. In addition, in some cases, the lack of acceptable validation may be the result of inaccurate validation data.

If data are unavailable for validation, other approaches might be used to evaluate the potential performance of the model. The literature on the model may provide an indication of the model's expected performance. However, care should be taken in inferring the model's likely performance for the study watershed based on validation results found in the literature. Data used and model parameterization for studies reported in the literature are often not described with enough detail to allow a good assessment of the model's likely performance in other watersheds. Further, if the model study reported in the literature included calibration, assessment of the model's likely performance in the study watershed will be even more difficult, as the model will not be calibrated for the study watershed.

Observed runoff and water quality data from a similar watershed could potentially be used to determine the likely performance of the model for the study watershed. Sogbedji and McIsaac (2002) demonstrated the expected performance of the ADAPT model through calibration of the model using data from a comparable watershed and then applying it to similar watersheds. However, it may be desirable not to calibrate the model for the similar watershed, but rather simply validate the model for such watersheds, as data are unavailable for calibration of the model in the study watershed.

The USEPA (2002) indicates that a model can be evaluated by comparing model predictions of current conditions with similar field or laboratory data not used in the model calibration process, with comparable predictions from accepted models, or by other methods (e.g., uncertainty and sensitivity analyses). The results of a simple mass balance model could be compared with those of the model used in the study to see how well results match. Multiple comprehensive models might also be applied to the study watershed if data are unavailable for calibration and validation. If multiple models provide similar results, confidence in the results that are obtained may be increased. One must be cautious, though, with the interpretation of results in such a case, especially if the models use similar modeling components or approaches.

If validation is not possible, varying ranges of model inputs might be used in later stages of the modeling effort to determine the sensitivity of the model results to the model inputs. Monte Carlo techniques and other approaches can also be used to identify confidence limits on outputs. "Biasing" the model inputs may also be used in later stages of the modeling effort to determine the sensitivity of the results to assumptions in model inputs. In such a situation, the model inputs would be set to extreme values in their expected ranges. If the same conclusions are reached with these inputs, the confidence in the conclusions would be increased, since the conclusions are not sensitive to the model input assumptions.
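
Monte Carlo confidence limits of the kind described above can be obtained by repeatedly sampling the uncertain inputs, running the model, and taking empirical quantiles of the resulting outputs. The sampler and model passed in below are hypothetical placeholders for a real input distribution and watershed model.

```python
import random

def monte_carlo_limits(run_model, sample_inputs, n_runs=2000,
                       lower_q=0.025, upper_q=0.975):
    """Approximate confidence limits on a scalar model output by sampling
    inputs, running the model, and taking empirical output quantiles."""
    outputs = sorted(run_model(sample_inputs()) for _ in range(n_runs))
    return (outputs[int(lower_q * (n_runs - 1))],
            outputs[int(upper_q * (n_runs - 1))])
```

With enough runs, the returned pair brackets roughly the central 95% of the output distribution implied by the assumed input uncertainty.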

Item 10. Model Scenario Prediction

Once the model has been validated and the results are deemed acceptable, the model is ready to be parameterized to the conditions of interest (e.g., a land use change, implementation of BMPs). The parameterization of the model and the rationale for decisions regarding data and representations within the model should be documented to allow others to recreate the model setup (see the Model Representation Issues section). These data and representation decisions should be consistent with those used in setting up the model for calibration and validation.

If possible, the uncertainty in model predictions when parameterized for the condition(s) of interest should be explored. The results from the validation stage provide some basis for expected model performance and level of uncertainty. Monte Carlo and other techniques can also be used to place confidence intervals on the expected results. For example, Kuczera and Parent (1998) used two Monte Carlo-based approaches for assessing parameter uncertainty in complex hydrologic models.

An approach that can be helpful in exploring the extremes in the uncertainty of model predictions is to bias model inputs in a direction that would be expected to represent the "worst case." If the model results for such a case lead to the same conclusion being reached, the confidence in the conclusion should be high.

Item 11. Results Interpretation/Hypothesis Testing

Model results should be interpreted accounting for the expected uncertainty. Typically, the uncertainty in models cannot be quantified because of the complexity of interactions, and thus it will be necessary to qualitatively assess the objectives or hypotheses taking into account the expected uncertainty in the results. The approach to be utilized in testing the objectives or hypotheses should be identified and documented prior to initiating the modeling.

The literature contains numerous examples of interpretation of model results. For example, Kirsch et al. (2002) tested the SWAT model within pilot watersheds and then applied it throughout a larger watershed in Wisconsin to quantify impacts from the application of basin-wide BMPs. Modeling results indicated that implementation of improved tillage practices (predominantly conservation tillage) could reduce sediment yields by almost 20%. They deemed this a significant reduction relative to current conditions. Santhi et al. (2001b) applied SWAT, which had been validated for flow and sediment and nutrient transport, to a watershed to quantify the effects of BMPs related to dairy manure management and municipal wastewater treatment plant effluent. King and Balogh (2001) used 99-year SWAT simulations for three locations to test hydrologic/water quality impacts of continuous corn, a forested environment, a golf course built in a previously agricultural setting, and a golf course constructed in a previously forested setting. Differences in hydrologic, nitrate-nitrogen, and pesticide impacts were examined using Tukey's pairwise comparison to determine whether differences were statistically significant.

SUMMARY AND CONCLUSIONS

Data collection for environmental projects typically follows a QA/QC plan. Likewise, QA planning for environmental modeling projects should follow a standard procedure. A modeling protocol, preferably written, should be established prior to conducting modeling studies. A standard modeling protocol would: (1) reduce potential modelers' bias, (2) provide a roadmap to be followed, (3) allow others to repeat the study, and (4) improve acceptance of model results.

This paper presents a model application protocol for hydrologic/water quality studies. The present work is an adaptation and extension of the guidance available in the literature, including that of the USEPA, for QA project plans for modeling. Eleven issues that should be addressed in hydrologic/water quality model application plans were identified: (1) problem definition/background; (2) model application goals, objectives, and hypotheses; (3) model selection; (4) model sensitivity analysis; (5) available data; (6) data to be collected; (7) model representation issues – data, BMPs, etc.; (8) model calibration; (9) model validation; (10) model scenario prediction; and (11) results interpretation/hypothesis testing.

It is essential to document the decisions made for each of these items and the rationale for these decisions. The extent of documentation that should be prepared depends on various factors, including the purpose of the modeling study. A graded approach was recommended for defining the level of QA effort that is required for a modeling study. A detailed discussion of the above-mentioned steps was provided, with an emphasis on hydrologic/water quality studies, including considerations relevant to representation of NPS BMPs with watershed models and appropriate criteria for evaluation of the performance of hydrologic models.
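One widely used performance criterion of the kind discussed here, and among the "goodness-of-fit" measures examined by Legates and McCabe (1999), is the Nash-Sutcliffe efficiency (NSE). The observed and simulated flow values below are invented for illustration:

```python
# Nash-Sutcliffe efficiency: NSE = 1 - sum((O - S)^2) / sum((O - mean(O))^2).
# NSE = 1.0 is a perfect fit; NSE <= 0 means the model is no better than
# predicting the observed mean.
def nash_sutcliffe(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    ss_res = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ss_tot = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

obs = [10.0, 14.0, 8.0, 20.0, 12.0]   # e.g., monthly flows (m^3/s), invented
sim = [11.0, 13.0, 9.0, 18.0, 12.5]
print(f"NSE = {nash_sutcliffe(obs, sim):.3f}")  # → NSE = 0.915
```

An evaluation step in a written plan would state, before calibration begins, which such statistics will be computed and what values constitute acceptable performance.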

LITERATURE CITED

Arabi, M., R.S. Govindaraju, and M.M. Hantush, 2004. Source Identification of Sediments and Nutrients in Watersheds – Final Report. EPA/600/R-05/080. National Risk Management Research Laboratory, Office of Research and Development, U.S. Environmental Protection Agency, Cincinnati, Ohio, 120 pp. http://www.epa.gov/nrmrl/pubs/600r05080/600r05080.pdf, accessed August 3, 2007.

American Society of Civil Engineers, Task Committee, 1993. Criteria for Evaluation of Watershed Models. Journal of Irrigation and Drainage Engineering 119(3):429-442.

Bracmort, K., M. Arabi, J. Frankenberger, and B. Engel, 2006. Modeling Long-Term Water Quality Impacts of Structural BMPs. Transactions of the ASABE 49(2):367-374.

Chanasyk, D.S., E. Mapfumo, and W. Williams, 2002. Quantification and Simulation of Surface Runoff From Fescue Grassland Watersheds. Agricultural Water Management 1782(2002):1-17.

Chaubey, I., C.T. Haan, S. Grunwald, and J.M. Salisbury, 1999. Uncertainty in the Model Parameters Due to Spatial Variability of Rainfall. Journal of Hydrology 220(1999):48-61.

Eckhardt, K. and J.G. Arnold, 2001. Automatic Calibration of a Distributed Catchment Model. Journal of Hydrology 251(2001):103-109.

Fernandez, G.P., G.M. Chescheir, R.W. Skaggs, and D.M. Amatya, 2002. WATGIS: A GIS-Based Lumped Parameter Water Quality Model. Transactions of the ASAE 45(3):593-600.

FitzHugh, T.W. and D.S. Mackay, 2000. Impacts of Input Parameter Spatial Aggregation on an Agricultural Nonpoint Source Pollution Model. Journal of Hydrology 236(2000):35-53.

Gan, T.Y., E.M. Dlamini, and G.F. Biftu, 1997. Effects of Model Complexity and Structure, Data Quality, and Objective Functions on Hydrologic Modeling. Journal of Hydrology 192(1997):81-103.

King, K.W. and J.C. Balogh, 2001. Water Quality Impacts Associated With Converting Farmland and Forests to Turfgrass. Transactions of the ASAE 44(3):569-576.

Kirsch, K., A. Kirsch, and J.G. Arnold, 2002. Predicting Sediment and Phosphorus Loads in the Rock River Basin Using SWAT. Transactions of the ASAE 45(6):1757-1769.

Kuczera, G. and E. Parent, 1998. Monte Carlo Assessment of Parameter Uncertainty in Conceptual Catchment Models: The Metropolis Algorithm. Journal of Hydrology 211(1998):69-85.

Legates, D.R. and G.J. McCabe, Jr., 1999. Evaluating the Use of "Goodness-of-Fit" Measures in Hydrologic and Hydroclimatic Model Validation. Water Resources Research 35(1):233-241.

Loague, K. and R.E. Green, 1991. Statistical and Graphical Methods for Evaluating Solute Transport Models: Overview and Application. Journal of Contaminant Hydrology 7:51-73.

Mankin, K.R., R.D. DeAusen, and P.L. Barnes, 2002. Assessment of a GIS-AGNPS Interface Model. Transactions of the ASAE 45(5):1375-1383.

Muttiah, R.S. and R.A. Wurbs, 2002. Scale-Dependent Soil and Climate Variability Effects on Watershed Water Balance of the SWAT Model. Journal of Hydrology 256(2002):264-285.

Nandakumar, N. and R.G. Mein, 1997. Uncertainty in Rainfall-Runoff Model Simulations and the Implications for Predicting the Hydrologic Effects of Land-Use Change. Journal of Hydrology 192(1997):211-232.

Ramanarayanan, T.S., J.R. Williams, W.A. Dugas, L.M. Hauck, and A.M.S. McFarland, 1997. Using APEX to Identify Alternative Practices for Animal Waste Management. Paper No. 97-2209. ASAE, St. Joseph, Michigan.

Refsgaard, J.C., 1997. Parameterisation, Calibration and Validation of Distributed Hydrological Models. Journal of Hydrology 198(1997):69-97.

Refsgaard, J.C. and B. Storm, 1996. Construction, Calibration and Validation of Hydrological Models. In: Distributed Hydrological Modeling, M.B. Abbott and J.C. Refsgaard (Editors). Kluwer Academic Press, Dordrecht, The Netherlands, pp. 41-54.

Saleh, A., J.G. Arnold, P.W. Gassman, L.M. Hauck, W.D. Rosenthal, J.R. Williams, and A.M.S. McFarland, 2000. Application of SWAT for the Upper North Bosque River Watershed. Transactions of the ASAE 43(5):1077-1087.

Santhi, C., J.G. Arnold, J.R. Williams, W.A. Dugas, R. Srinivasan, and L.M. Hauck, 2001a. Validation of the SWAT Model on a Large River Basin With Point and Nonpoint Sources. Journal of the American Water Resources Association 37:1169-1188.

Santhi, C., J.G. Arnold, J.R. Williams, L.M. Hauck, and W.A. Dugas, 2001b. Application of a Watershed Model to Evaluate Management Effects on Point and Nonpoint Source Pollution. Transactions of the ASAE 44(6):1559-1570.

Sogbedji, J.M. and G.F. McIsaac, 2002. Evaluation of the ADAPT Model for Simulating Water Outflow From Agricultural Watersheds With Extensive Tile Drainage. Transactions of the ASAE 45(3):649-659.

Spruill, C.A., S.R. Workman, and J.L. Taraba, 2000. Simulation of Daily and Monthly Stream Discharge From Small Watersheds Using the SWAT Model. Transactions of the ASAE 43(6):1431-1439.

Taylor, J.K., 1985. What is Quality Assurance? In: Quality Assurance for Environmental Measurements, J.K. Taylor and T.W. Stanley (Editors). American Society for Testing and Materials Special Technical Publication No. 867, ASTM International, Philadelphia, Pennsylvania.

USEPA (U.S. Environmental Protection Agency), 1998. Guidance for Quality Assurance Project Plans QA/G-5. Report EPA/600/R-98/018, Office of Research and Development, Washington, D.C.

USEPA (U.S. Environmental Protection Agency), 2000. Policy and Program Requirements for the Mandatory Agency-Wide Quality System. EPA Order 5360.1 A2, Washington, D.C. http://www.epa.gov/quality/qs-docs/5360-1.pdf, accessed August 3, 2007.

USEPA (U.S. Environmental Protection Agency), 2001. EPA Requirements for Quality Assurance Project Plans QA/R-5. Report EPA/240/B-01/003, Office of Environmental Information, Washington, D.C.

USEPA (U.S. Environmental Protection Agency), 2002. Guidance for Quality Assurance Project Plans for Modeling. EPA QA/G-5M. Report EPA/240/R-02/007, Office of Environmental Information, Washington, D.C.

van der Heijde, P.K.M., 1987. Quality Assurance in Computer Simulations of Groundwater Contamination. Environmental Software 2(1):19-28.

Yapo, P.O., H.V. Gupta, and S. Sorooshian, 1996. Automatic Calibration of Conceptual Rainfall-Runoff Models: Sensitivity to Calibration Data. Journal of Hydrology 181(1996):23-48.

Yuan, Y., R.L. Bingner, and R.A. Rebich, 2001. Evaluation of ANNAGNPS on Mississippi Delta MSEA Watersheds. Transactions of the ASAE 44(5):1183-1190.
