Scene setting: Conceptual model uncertainty

Laura Urso, Philipp Hartmann, Martin Steiner (BfS)

TERRITORIES Workshop: Multidisciplinary forum to discuss the scientific basis for reducing uncertainties and improving risk assessment

Madrid, 13-14 June 2018

This project has received funding from the Euratom research and training programme 2014-2018 under grant agreement No 662287.




Total uncertainty

• Total uncertainty of a model consists of:

• Parameter uncertainty

• Conceptual model uncertainty

• Scenario uncertainty including monitoring uncertainty

• Measurement error uncertainty

• Modeler uncertainty

• Numerical uncertainty

• …


Scene setting

1. Definition

2. Techniques to quantify conceptual model uncertainty

3. The role of conceptual model uncertainty in radioecology

4. Approach in TERRITORIES


Conceptual model uncertainty

• Definition

• Deals with simplifications needed to translate a conceptual model into mathematics (EPA, 2009; EPA, 2014)

• Refers to incomplete understanding and simplified representations of modelled processes as compared to reality (Refsgaard, 2006)

→ If neglected, it may lead to uncertainty bands that are not sufficiently wide

→ Can indicate over-fitting to the available data (Draper, 1995; Engeland, 2005)


Literature overview

• Conceptual model uncertainty is addressed in different disciplines, e.g. hydrology, epidemiology, agronomy, ecology, environmental engineering, ...

• Peer-reviewed articles and various reports are available

• In radioecology, conceptual model uncertainty has not been addressed so far

Key questions when addressing conceptual uncertainty:

• One or multiple models?

• Are data available?


Available techniques (with data)

• For each model with available experimental data:

• Use a split-sample approach for calibrating and validating the model, and increase the range of parameter uncertainty to implicitly account for structural uncertainty.

• Estimate the total predictive forecast uncertainty with statistical analysis of the residuals and subtract the propagated parameter uncertainty from total uncertainty.

• Carry out a (process-) sensitivity analysis

• If multiple models are available and experimental data are available:

• Use advanced statistical tools such as MMI, BMA and GLUE → focus on model intercomparison
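The residual-subtraction idea above can be sketched numerically. The sketch below assumes additive variance contributions (an assumption questioned later in these slides); the residuals and the parameter variance are invented for illustration:

```python
def conceptual_variance(residuals, parameter_variance):
    """Estimate the conceptual (structural) variance contribution by
    subtracting the propagated parameter variance from the total variance
    of the validation residuals, assuming additive contributions."""
    n = len(residuals)
    mean = sum(residuals) / n
    total_var = sum((r - mean) ** 2 for r in residuals) / n
    return max(total_var - parameter_variance, 0.0)  # clip negative results at zero

# Hypothetical validation residuals (observed minus simulated) and a
# parameter variance obtained e.g. from Monte Carlo propagation:
residuals = [0.3, -0.5, 0.4, -0.2, 0.6, -0.4]
print(conceptual_variance(residuals, parameter_variance=0.1))
```

Clipping at zero is one pragmatic choice; a negative difference would simply indicate that the additivity assumption does not hold for the case at hand.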


Available techniques (lacking data)

• If multiple models are available and experimental data are not available or not sufficient to carry out a statistical analysis:

• Model comparison can be carried out. If models have a similar parameterization and/or parameter uncertainty can be assessed, the difference in the models’ outputs can be seen as the conceptual model uncertainty.

• If a single model is available and data are not available:

• Only qualitative analysis is possible, e.g. pedigree analysis or expert elicitation.
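For the multiple-models case, the spread of an ensemble of structurally different models can serve as a rough measure of conceptual model uncertainty. A minimal sketch; the two soil-to-plant transfer models and the deposition value are invented:

```python
import math

def model_spread(predictions):
    """Ensemble mean and range of model outputs; the range is read as a
    rough measure of conceptual model uncertainty when data are lacking."""
    return sum(predictions) / len(predictions), max(predictions) - min(predictions)

# Two hypothetical model structures for plant contamination after deposition:
deposition = 1000.0                                       # Bq/m^2, invented
tf_model = 0.02 * deposition                              # equilibrium transfer-factor model
kinetic_model = 0.05 * deposition * (1 - math.exp(-0.5))  # simple kinetic model
mean, spread = model_spread([tf_model, kinetic_model])
print(mean, spread)
```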


How about radioecology?

• Conceptual model uncertainty is present:

• When using the so-called transfer factors or concentration ratios, e.g. transfer factor soil-plant, soil-mushroom, soil-cow milk etc.

• When a model is simplified, e.g. the process of resuspension is neglected.

• Whenever the natural heterogeneity of the system is not accounted for.

• When assuming that the system is in an equilibrium state while this is not the case.


Challenges in radioecology

• Radioecological processes can be intrinsically heterogeneous (natural variability).

• Model calibration and validation for radioecological models is difficult because often a limited amount of experimental data exists.

• Radioecological models often differ in terms of calculated endpoints.

• Only a limited number of different modelling structures exist, e.g. multiplicative factors, parametric (linear or exponential) equations, linear ordinary differential equations and partial differential equations


What can we build upon in TERRITORIES?

• Sensitivity studies have been carried out (e.g. ECOFOR model)

• Results of elicitation processes available, e.g. TERRITORIES questionnaire about models

• Parameter uncertainty and model performance have already been quantified in:

• Sy et al., JER (2015)

• Gonze and Sy, STOTEN (2016)

• Simon-Cornu et al., JER (2015)

This provides a basis for quantifying conceptual model uncertainty!


Challenges

• How can scientific (physical, biological or chemical) hypotheses be translated into statistical ones so that the statistical tools can be applied properly?

• The assumption that uncertainties are additive is debatable. Assuming that uncertainty is multiplicative is just as reasonable.

• Only a few models are available, and one cannot be sure that the whole model space is covered by the available model ensemble.

• Asserting that models have a similar parameterization is not trivial: does each model need exactly the same parameters?


Food for thought:

Increased model complexity → reduced conceptual model uncertainty, but increased parameter uncertainty


Thank you!


Discussion

• Data unavailability limits the possibility to quantify conceptual model uncertainty.

• Residual analysis can be applied to quantify the overall uncertainty; in this case, the assumption that uncertainties are additive is often used to separate the various uncertainty contributions.

• Different numerical metrics exist and can be used: which one is best?

• The quantification of parameter uncertainty is often a prerequisite for determining the conceptual uncertainty.

• A Bayesian or probabilistic approach makes it possible to include expert judgment quantitatively in the analysis, i.e. to assign a probability distribution to the quantities of interest.

• The use of statistical models (emulators, e.g. linear regression models) to make the assessments and/or the use of a wrapper framework is often required.


Some lexicon and tools available

• (Linear) regression models and residual error

• Statistical metrics:

• Deviance Information Criterion

• Akaike Information Criterion

• Mean square error of prediction (MSEP)

• Posterior predictive loss criterion (PPLC)

• Standard Bayesian analysis

• Wrapping framework for Monte Carlo simulations: embed your own model in a tool that allows for Monte Carlo calculations


Hydrology

Jin et al., JoH (2010) apply the standard Bayesian method to their water balance model

• Simulation error quantifies conceptual model uncertainty

ε_t = Q_obs,t − Q_sim,t

L(θ ∣ Q_obs) = (2πσ²)^(−N/2) · exp[−(1/(2σ²)) · Σ_{t=1}^{N} (Q_obs,t − Q_sim,t)²]
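In this standard Bayesian setting, a Gaussian likelihood of the simulation residuals is evaluated for each candidate parameter set. A minimal sketch; the observed and simulated series are invented:

```python
import math

def gaussian_log_likelihood(q_obs, q_sim, sigma):
    """Log of the Gaussian likelihood of the residuals Q_obs,t - Q_sim,t
    (working in log space avoids numerical underflow for long series)."""
    n = len(q_obs)
    sse = sum((o - s) ** 2 for o, s in zip(q_obs, q_sim))
    return -0.5 * n * math.log(2.0 * math.pi * sigma ** 2) - sse / (2.0 * sigma ** 2)

# Invented discharge observations and water-balance simulations:
q_obs = [1.2, 0.9, 1.5, 1.1]
q_sim = [1.0, 1.0, 1.4, 1.2]
print(gaussian_log_likelihood(q_obs, q_sim, sigma=0.2))
```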


Ecology

• Lindenschmidt et al. EM (2007) address the conceptual model uncertainty for a water quality analysis model by means of sensitivity analysis

• Monte Carlo method and a wrapper framework setting.
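A wrapper framework of this kind re-runs an arbitrary model under sampled inputs without modifying the model itself. A minimal sketch; the two-parameter model and the input distributions are invented:

```python
import random

def monte_carlo_wrapper(model, samplers, n_runs, seed=42):
    """Wrap any callable model: draw each input from its sampler, run the
    model, and collect the outputs for uncertainty or sensitivity analysis."""
    random.seed(seed)
    return [model(**{name: draw() for name, draw in samplers.items()})
            for _ in range(n_runs)]

# Invented model and input distributions:
model = lambda a, b: a * b
samplers = {"a": lambda: random.uniform(0.8, 1.2),   # e.g. a rate constant
            "b": lambda: random.gauss(10.0, 1.0)}    # e.g. an inventory
outputs = monte_carlo_wrapper(model, samplers, n_runs=2000)
print(sum(outputs) / len(outputs))  # should be close to 1.0 * 10.0
```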


Epidemiology

• Kaiser et al., REB (2011) apply MMI to mortality data from Japanese A-bomb survivors

Model reference                                              # parameters   Deviance   AIC       w_i
EAR model of Preston et al. (2004)                           22             2,258.7    2,302.7   0
BEIR VII, phase 2 (2006)                                     19             2,255.2    2,293.2   0.001
Richardson et al. (2009)                                     21             2,250.1    2,292.1   0.001
Schneider and Walsh (2009) LQ                                13             2,258.0    2,284.0   0.069
Schneider and Walsh (2009) LQ-exp                            14             2,253.9    2,281.9   0.198
Little et al. (2008)                                         11             2,259.7    2,281.7   0.219
UNSCEAR (2006)                                               10             2,260.0    2,280.0   0.512
Richardson et al. (2009), main model for all types of
  leukemia (their Table 3)                                   448            1,915.5    2,811.5   0
Richardson et al. (2009), model 3 (their Appendix Table A1)  1086           1,560.3    3,732.3   0


Agronomy

• Alderman & Stanfill, EJA (2017) apply the standard Bayesian method to more than one model for predicting wheat phenology.

• Mean square error of prediction (MSEP).

• Mean variance includes variance for conceptual model uncertainty

• This method is also valid for non-linear models.

"#$ %&�'( ��)

� �*� �*)

� �+�

,-./ 0 � .*,)1,+2 % � 31 0; �1�
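The squared-bias-plus-variance decomposition can be checked on an empirical MSEP estimate. A minimal sketch; the observed and predicted wheat flowering dates (days of year) are invented:

```python
def msep_decomposition(y_obs, y_pred):
    """Empirical mean square error of prediction and its exact decomposition
    into squared bias plus (population) variance of the residuals."""
    n = len(y_obs)
    res = [o - p for o, p in zip(y_obs, y_pred)]
    msep = sum(r ** 2 for r in res) / n
    bias = sum(res) / n
    var = sum((r - bias) ** 2 for r in res) / n
    return msep, bias ** 2, var

# Invented observed vs. predicted flowering dates:
msep, bias2, var = msep_decomposition([120, 125, 131, 128], [118, 126, 129, 126])
print(msep, bias2, var)  # msep == bias2 + var
```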


Multimodel Inference (MMI)

• Akaike Information Criterion (AIC): numerical value by which to rank models in terms of information loss in approximating the unknowable truth

• Estimate of model parameters

• Unique set of data for all models i

• Fit of the models → deviance and AIC

• Multimodel averaging: ȳ = Σ_i w_i · ŷ_i

• Akaike weights: w_i = exp(−Δ_i / 2) / Σ_k exp(−Δ_k / 2),  where Δ_i = AIC_i − AIC_min

(Burnham and Anderson, 2002)
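The Akaike weights used in multimodel averaging follow directly from the AIC values. The sketch below reproduces the w_i column of the Kaiser et al. (2011) table above (first seven models) from its AIC column:

```python
import math

def akaike_weights(aics):
    """Akaike weights: w_i = exp(-delta_i/2) / sum_k exp(-delta_k/2),
    with delta_i = AIC_i - AIC_min (Burnham and Anderson, 2002)."""
    aic_min = min(aics)
    rel = [math.exp(-(a - aic_min) / 2.0) for a in aics]
    total = sum(rel)
    return [r / total for r in rel]

# AIC values of the first seven risk models in the Kaiser et al. table:
aics = [2302.7, 2293.2, 2292.1, 2284.0, 2281.9, 2281.7, 2280.0]
print([round(w, 3) for w in akaike_weights(aics)])
# → [0.0, 0.001, 0.001, 0.069, 0.198, 0.219, 0.512]
```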


Bayesian Model Averaging

Consider models M_1, M_2, …, M_K and Δ the quantity of interest.

For every model M_k:

1. p(θ_k ∣ M_k) is the prior density of θ_k given model M_k

2. p(D ∣ θ_k, M_k) is the likelihood of the data D given θ_k and M_k

3. p(M_k) is the prior probability that M_k is the true model

4. p(D ∣ M_k) = ∫ p(D ∣ θ_k, M_k) · p(θ_k ∣ M_k) dθ_k quantifies how well M_k reproduces the data

(Hoeting et al., 1999)
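Given the integrated likelihoods p(D ∣ M_k) from step 4, the posterior model probabilities and the model-averaged estimate of Δ follow from Bayes' rule. A minimal sketch; the likelihood values, the priors and the per-model predictions are invented:

```python
def posterior_model_probabilities(marginal_likelihoods, priors):
    """Posterior model probabilities p(M_k | D), proportional to
    p(D | M_k) * p(M_k) and normalised over the model ensemble."""
    joint = [ml * pr for ml, pr in zip(marginal_likelihoods, priors)]
    total = sum(joint)
    return [j / total for j in joint]

def bma_estimate(posteriors, predictions):
    """Model-averaged estimate of the quantity of interest Delta."""
    return sum(p * d for p, d in zip(posteriors, predictions))

# Invented integrated likelihoods p(D | M_k) and uniform model priors:
post = posterior_model_probabilities([0.02, 0.05, 0.03], [1 / 3, 1 / 3, 1 / 3])
print(post)                                    # posterior weights of M_1..M_3
print(bma_estimate(post, [10.0, 12.0, 11.0]))  # averaged prediction
```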


Generalized Likelihood Uncertainty Estimation Method (GLUE)

Models M_1, M_2, …, M_K

1. Determine the statistics of the models, parameters and variables via prior distributions p(θ_k ∣ M_k)

2. Run stochastic simulations based on the models, parameters and variables set in 1. to obtain unconditional estimates of the state variables

3. Compare every simulation to observed data: acceptance/non-acceptance of the simulation according to a fixed rule (to be determined)

4. Accept each simulation with a weight and build a likelihood function L(D ∣ θ_k, M_k)

5. Apply Bayes' rule and quantify how well M_k reproduces the data

(Beven and Binley, 1992)
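The five steps can be sketched as a Monte Carlo loop. Everything below is invented for illustration: the two alternative model structures, the uniform prior, the observed value, the inverse-squared-error likelihood measure and the acceptance threshold (GLUE deliberately leaves the likelihood measure and the acceptance rule to the analyst):

```python
import random

def glue(models, draw_theta, observed, threshold, n_runs, seed=1):
    """GLUE sketch: sample parameter sets from the prior (step 1), run
    stochastic simulations (step 2), accept 'behavioural' runs by a fixed
    rule (step 3), and weight predictions by the likelihood measure
    (steps 4-5)."""
    random.seed(seed)
    accepted = []                                      # (weight, prediction) pairs
    for _ in range(n_runs):
        model = random.choice(models)                  # sample a model structure
        theta = draw_theta()                           # sample a parameter set
        pred = model(theta)
        weight = 1.0 / (1.0 + (pred - observed) ** 2)  # likelihood measure
        if weight >= threshold:                        # acceptance rule
            accepted.append((weight, pred))
    total = sum(w for w, _ in accepted)
    estimate = sum(w * p for w, p in accepted) / total
    return estimate, len(accepted)

# Two invented model structures for the same endpoint and a uniform prior:
models = [lambda k: 5.0 * k, lambda k: 4.0 * k + 0.5]
estimate, n_behavioural = glue(models, lambda: random.uniform(0.5, 1.5),
                               observed=5.2, threshold=0.5, n_runs=2000)
print(estimate, n_behavioural)
```

The weighted mean over the behavioural runs stays within the band of accepted predictions, so the spread of that band can be read as the combined parameter and conceptual uncertainty.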