Experimentally Validated “Best Estimate + Uncertainty” Modeling of Complex Systems: the Cornerstone of Predictive Science

Dan Gabriel Cacuci, NCSU, March 10, 2011


Page 1:

Experimentally Validated “Best Estimate + Uncertainty” Modeling of Complex Systems: the Cornerstone of Predictive Science

Dan Gabriel Cacuci

Page 2:

Outline

• “Predictive Science” refers to the application of verified and validated computational simulations to predict properties of complex systems, particularly in cases where routine experimental tests are not feasible.

• Verification, Validation and Uncertainty Quantification of Models are the bricks underlying “Predictive Science”

• Terminology and Methodology:

Model Verification and Validation, Sensitivity & Uncertainty Quantification

Experimental Data Assimilation for Model Calibration and Best-Estimate Predictions

• Nuclear Engineering Illustrative Examples:

– Reactor Core Design (Gen-IV SFR)
– Validation & Best Estimate Prediction Activities within CASL (DOE Energy Innovation Hub)

• An Incomplete List of Open Issues…

Page 3:

Model Verification, Validation, Qualification

[Diagram: the classic V&V triangle. Analysis abstracts REALITY into the CONCEPTUAL (Mathematical) MODEL, assessed by Model Qualification; Programming turns the conceptual model into the COMPUTERIZED MODEL, assessed by Model Verification (“Numerics”); Computer Simulation confronts the computerized model with reality, assessed by Model Validation (“Physics”).]

Page 4:

Mathematical Model

• linear and/or nonlinear equations

• independent variables

• dependent variables

• parameters

• constraints, uncertainties, and/or pdf’s for parameters

• responses to be computed and/or optimized
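Purely as a hedged illustration of how these ingredients fit together, the sketch below collects them in a small Python structure; all names here (Parameter, ModelSpec, the decay example) are hypothetical and not from the talk.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Parameter:
    """An uncertain model parameter: nominal value, pdf descriptor, constraints."""
    name: str
    nominal: float
    std_dev: float                       # stand-in for a full pdf
    bounds: tuple = (float("-inf"), float("inf"))

@dataclass
class ModelSpec:
    """Minimal specification of a mathematical model, mirroring the list above."""
    parameters: List[Parameter]          # uncertain parameters
    equations: Callable                  # e.g., ODE right-hand side (may be nonlinear)
    responses: Callable                  # functionals of the solution to compute/optimize

# Illustrative example: radioactive decay dn/dt = -lambda * n,
# with the response taken as the population n at the final time.
decay = ModelSpec(
    parameters=[Parameter("decay_constant", nominal=0.5, std_dev=0.05,
                          bounds=(0.0, float("inf")))],
    equations=lambda t, n, p: -p[0] * n,  # independent var t, dependent var n
    responses=lambda n_final: n_final,
)
```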

Page 5:

Code verification: “Are you solving the mathematical model correctly?”

Code verification comprises Numerical Algorithm Verification and Software Quality Assurance Practices.

Types of Algorithm Testing:
• Analytic solutions for simplified physics
• Method of manufactured solutions
• ODE benchmark solutions
• PDE benchmark solutions
• Conservation tests
• Alternate coordinate system tests
• Symmetry tests
• Iterative convergence tests
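As one concrete illustration of the “method of manufactured solutions” entry above: choose an exact solution, derive the forcing it implies, and verify that the solver’s error shrinks at the formal rate. The 1-D Poisson problem and grid sizes below are illustrative choices, not code from the talk.

```python
import numpy as np

def solve_poisson(f, n):
    """Second-order finite-difference solve of -u'' = f on (0,1), u(0)=u(1)=0."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)       # interior grid points
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    return x, np.linalg.solve(A, f(x))

# Manufactured solution u(x) = sin(pi x)  =>  forcing f(x) = pi^2 sin(pi x)
u_exact = lambda x: np.sin(np.pi * x)
forcing = lambda x: np.pi**2 * np.sin(np.pi * x)

errors = []
for n in (15, 31, 63):                   # h = 1/16, 1/32, 1/64
    x, u = solve_poisson(forcing, n)
    errors.append(np.max(np.abs(u - u_exact(x))))

# Observed order should approach the formal order (2) under grid refinement
orders = [np.log2(e1 / e2) for e1, e2 in zip(errors, errors[1:])]
print(errors, orders)
```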

Software Quality Assurance Practices include Configuration Management and Software Quality Analysis and Testing; the latter spans Static Analysis, Dynamic Testing (Regression Testing, Black Box Testing, Glass Box Testing), and Formal Analysis.

Page 6:

Notes on “Model Verification”:

• Numerical Algorithm Verification (NAV) Goal: to determine the observed (demonstrated) order of accuracy of models (by using a priori and a posteriori methods); the Observed Accuracy could be less than the Formal Accuracy of the respective numerical method.

– A posteriori methods: Richardson’s extrapolation, adaptive grid refinement, “grid convergence index”, etc.

– Issues: programming errors, singularities, discontinuities, grid clustering, under-resolved grids, boundary-condition effects, non-asymptotic convergence, inadequate iterations, coupling of numerical errors to the appearance of new time and spatial scales, etc.

• Responsibility for NAV: code development team (code specific)
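A minimal sketch of the a posteriori order-of-accuracy estimate mentioned above (Richardson extrapolation plus a grid-convergence-index style error band); the three response values are placeholders, not results from the talk.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r=2.0):
    """Observed order of accuracy from three grid levels (refinement ratio r)."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def gci_fine(f_medium, f_fine, p, r=2.0, fs=1.25):
    """Grid-convergence-index style relative error band on the fine grid."""
    return fs * abs((f_medium - f_fine) / f_fine) / (r**p - 1.0)

# Placeholder response values from coarse, medium, and fine grids
f4h, f2h, fh = 0.9712, 0.9921, 0.9980
p = observed_order(f4h, f2h, fh)    # ~1.8 here: below a formal order of 2
print(p, gci_fine(f2h, fh, p))
```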

Page 7:

“Code validation”: “Does the model represent reality?”

• ASC: “Validation is the process of confirming that the predictions of a computer code adequately represent measured physical phenomena”.

• AIAA: “Validation is the process of determining the degree to which a model is an accurate representation of the real world from the perspective of the intended uses of the model”.

• Validation Experiments must be identified and designed within the application-specific Phenomena Identification and Ranking Table (PIRT), and must allow precise and conclusive comparisons of computations with experimental data for the purpose of quantifying model fidelity and credibility.

Page 8:

Code validation: “Does the model represent reality?”

[Diagram: the validation test loop. The REAL WORLD is abstracted into a CONCEPTUAL MODEL, implemented as a COMPUTATIONAL MODEL, and exercised to produce a COMPUTATIONAL SOLUTION; the VALIDATION TEST is a comparison and test of agreement against the correct answer provided by experimental data at four tiers: Unit Problems, Benchmark Cases, Subsystem Cases, Complete System.]

Page 9:

Phenomena Identification and Ranking Table (PIRT)

[Diagram: Application Requirements feed the PIRT, which interactively guides prioritization of validation tasks by weighing Physical Phenomena Importance against Experimental Adequacy, Conceptual Model Adequacy, Code Verification Adequacy, and Validation Metric Adequacy.]

Page 10:

Increasing Quality of Validation Metrics (1/2)

[Figure: response vs. input plots comparing experiment and computation. Panel (a): deterministic comparison. Panel (b): comparison including experimental uncertainty.]

• Validation Experiments must be identified and designed within the application-specific Phenomena Identification and Ranking Table (PIRT), and must allow conclusive comparisons of computations with experimental data for quantifying model fidelity and credibility.

Page 11:

Increasing Quality of Validation Metrics (2/2)

[Figure, continued: Panel (c): computational uncertainties included. Panel (d): “ensemble” computation. Panel (e): quantitative comparison, computation minus experiment vs. input.]
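A hedged sketch of the kind of quantitative comparison panel (e) points toward: normalize the computation-experiment difference by the combined uncertainties, assuming independent Gaussian errors. The function name and numbers are illustrative, not the specific metric used in the talk.

```python
import numpy as np

def validation_metric(computed, measured, sigma_comp, sigma_exp):
    """Normalized discrepancy between computation and experiment.

    Values near or below ~2 indicate agreement within the combined
    2-sigma uncertainties (assuming independent Gaussian errors).
    """
    combined = np.sqrt(np.asarray(sigma_comp)**2 + np.asarray(sigma_exp)**2)
    return np.abs(np.asarray(computed) - np.asarray(measured)) / combined

# Placeholder data: responses at three input settings
z = validation_metric(computed=[1.02, 0.97, 1.10],
                      measured=[1.00, 1.00, 1.00],
                      sigma_comp=[0.02, 0.02, 0.02],
                      sigma_exp=[0.03, 0.03, 0.03])
print(z)   # ~[0.55, 0.83, 2.77]: the third point is suspect
```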

Page 12:

Sensitivity and Uncertainty Quantification… are an Essential Ingredient of PIRT

• Sensitivity and uncertainty analysis procedures can be either local or global in scope.

• The objective of local analysis is to analyze the behavior of the system response locally around a chosen point (for static systems) or chosen trajectory (for dynamical systems) in the combined phase space of parameters and state variables.

• The objective of global analysis is to determine all of the system’s critical points (bifurcations, turning points, response maxima, minima, and/or saddle points) in the combined space of parameters and state variables, and subsequently analyze these critical points by local sensitivity and uncertainty analysis.

• The methods for sensitivity and uncertainty analysis are based on:
– statistical procedures, or
– deterministic procedures (especially efficient adjoint methods).

• In practice, deterministic methods are used mostly for local analysis, while statistical methods are used for both local and global analysis.
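To make the local/global distinction above concrete, here is a minimal sketch on an invented toy response: local sensitivities from finite differences around a nominal point, and a crude global (statistical) estimate of the response variance by sampling over the full parameter ranges. The response function, step size, and sampling ranges are arbitrary illustrative choices.

```python
import numpy as np

def response(a):
    """Toy response R(a) with a mild nonlinearity in a[1]."""
    return a[0] + a[1]**2 + 0.1 * a[0] * a[1]

nominal = np.array([1.0, 2.0])

# Local analysis: finite-difference sensitivities dR/da_i at the nominal point
eps = 1e-6
local = np.array([
    (response(nominal + eps * np.eye(2)[i]) - response(nominal)) / eps
    for i in range(2)
])

# Global (statistical) analysis: Monte Carlo sampling over the parameter ranges
rng = np.random.default_rng(0)
samples = rng.uniform([0.5, 1.0], [1.5, 3.0], size=(10_000, 2))
var_total = np.var([response(a) for a in samples])
print(local, var_total)   # local slopes vs. globally sampled response variance
```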


Page 14:

The Adjoint Sensitivity/Uncertainty Analysis Procedure (ASAP) should be used, wherever possible, to compute sensitivities efficiently…

Fundamental Goal of ASAP: use Adjoint Operators to compute deterministically the response sensitivities to system parameters exactly and efficiently.

• ASAP circumvents the need to perform repeatedly the expensive “Forward Sensitivity” calculations.

Responses of interest: $R_i, \; i = 1, \dots, m$.
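A minimal numerical illustration of the adjoint idea (a toy linear model, not the full ASAP formalism of the talk): for $A(\alpha)\,\varphi = b$ and response $R = c^T \varphi$, a single adjoint solve $A^T \psi = c$ yields the sensitivities to all parameters, whereas the forward approach needs one perturbed solve per parameter. All matrices and parameter dependences below are invented for illustration.

```python
import numpy as np

def A_of(alpha):
    """Illustrative parameter dependence of the system matrix A(alpha)."""
    return np.array([[2.0 + alpha[0], -1.0],
                     [-1.0, 2.0 + alpha[1]]])

b = np.array([1.0, 0.0])      # source (taken independent of alpha here)
c = np.array([0.0, 1.0])      # response weight: R = c^T phi
alpha0 = np.array([0.1, 0.2])

phi = np.linalg.solve(A_of(alpha0), b)      # one forward solve
psi = np.linalg.solve(A_of(alpha0).T, c)    # ONE adjoint solve serves all parameters

# For A(alpha) phi = b with b fixed:  dR/dalpha_i = -psi^T (dA/dalpha_i) phi
dA = [np.array([[1.0, 0.0], [0.0, 0.0]]),   # dA/dalpha_0
      np.array([[0.0, 0.0], [0.0, 1.0]])]   # dA/dalpha_1
adjoint_sens = np.array([-psi @ dAi @ phi for dAi in dA])

# Cross-check: the forward (finite-difference) approach costs one solve per parameter
eps = 1e-7
forward_sens = np.array([
    (c @ np.linalg.solve(A_of(alpha0 + eps * np.eye(2)[i]), b) - c @ phi) / eps
    for i in range(2)])
print(adjoint_sens, forward_sens)           # agree to O(eps)
```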

Page 15:

Predictive Estimation

• Predictive estimation (PE) starts with the identification and characterization of errors or uncertainties from all steps of the modeling and simulation process leading to a computational model prediction.

• Predictive estimation for computer experiments has three key elements:

(A) model calibration, (B) estimation of the validation domain, (C) model extrapolation.

• The result of the PE analysis is a probabilistic description of possible future outcomes based on all recognized errors and uncertainties.

Page 16:

A. Model Calibration (1/2)

The “Model Calibration” activity integrates all experimental and computational information, including uncertainties, for the purpose of updating (“calibrating”) the parameters of the computer model.

• All types and sources of uncertainties must be included: aleatory (inherent, irreducible variations) and epistemic (lack of knowledge, reducible) uncertainties (data uncertainties, numerical discretization errors …)

• The mathematical framework for model calibration is provided by data adjustment (reactor physics) or data assimilation (geophysical sciences) procedures.

• Response Sensitivities are a fundamental component of data assimilation (adjustment) procedures.

Page 17:

A. Model Calibration (2/2)

• Included within the scope of model calibration and best-estimate adjustment procedures is also the important subject of consistency among experiments and computations.

• Current methods for model calibration are hampered in practice by the significant computational effort required.

• Methods for reducing the computational effort are of great interest; adjoint methods show great promise in this regard.

• The end-product of “Model Calibration” is the “Best-Estimate + Uncertainty” (BE+U) Model, which yields:
– best-estimate values for parameters and responses, and
– reduced uncertainties (i.e., “smaller” values for the variance-covariance matrices) for the best-estimate adjusted parameters and responses.

Page 18:

C. Model Extrapolation

• Addresses the prediction uncertainty in new environments or conditions of interest, including both untested parts of the parameter space and higher levels of system complexity in the validation hierarchy.

• Extrapolation of models and the resulting increase of uncertainty are poorly understood, particularly the estimation of uncertainty that results from nonlinear coupling of two or more physical phenomena that were not coupled in the existing validation database.

Page 19:

Data Assimilation + Model Calibration (1/4)

• Discretized time-span: $J_t = \{1, \dots, N_t\}$

• Parameters: $\boldsymbol{\alpha} = \{\alpha_i \mid i \in J_\alpha\}$, $J_\alpha = \{1, \dots, N_\alpha\}$

• Parameter covariances, with elements $c_{ij} \equiv \mathrm{cov}(\alpha_i, \alpha_j)$, arranged in $N_t \times N_t$ time-blocks:
$$
\mathbf{C}_\alpha =
\begin{pmatrix}
\mathbf{C}_{11} & \mathbf{C}_{12} & \cdots \\
\mathbf{C}_{21} & \mathbf{C}_{22} & \cdots \\
\vdots & \vdots & \ddots \\
 & & \mathbf{C}_{N_t N_t}
\end{pmatrix}
$$

Page 20:

Data Assimilation + Model Calibration (2/4)

• Computed responses: $\mathbf{r} = \{r_n \mid n \in J_R\}$, $J_R = \{1, \dots, N_R\}$

• Covariance of computed responses (using first-order sensitivities $\mathbf{S}$ only): $\mathbf{C}_r = \mathbf{S}\,\mathbf{C}_\alpha\,\mathbf{S}^\dagger$

• Measured responses: $\mathbf{m} = \{m_n \mid n \in J_M\}$, $J_M = \{1, \dots, N_M\}$

• Covariances for measured responses: $\mathbf{C}_m$

• “Response deviations”: $\mathbf{d} \equiv \mathbf{r} - \mathbf{m}$

• Response-parameter covariances: $\mathbf{C}_{r\alpha}$

• Note: $\mathbf{C}_m$ and $\mathbf{C}_r$ have the same “time-structure” as $\mathbf{C}_\alpha$.

Page 21:

Data Assimilation + Model Calibration (3/4)

Bayes’ Theorem + “Incomplete Newton” evaluation of the posterior distribution yields:

Adjusted (“best-estimate”) parameters:
$$
\boldsymbol{\alpha}^{be} = \boldsymbol{\alpha}^0 - \left( \mathbf{C}_\alpha \mathbf{S}^\dagger - \mathbf{C}_{\alpha m} \right) \mathbf{K}_d^{-1}\, \mathbf{d},
$$
where
$$
\mathbf{d} \equiv \mathbf{r}(\boldsymbol{\alpha}^0) - \mathbf{m}, \qquad
\mathbf{C}_d \equiv \mathbf{C}_m + \mathbf{S}(\boldsymbol{\alpha}^0)\, \mathbf{C}_\alpha\, \mathbf{S}^\dagger(\boldsymbol{\alpha}^0) - \mathbf{C}_{mr} - \mathbf{C}_{rm},
$$
and $\mathbf{K}_d^{-1}$, with blocks $(\mathbf{C}_d^{-1})_{ij}$, $i, j = 1, \dots, N_t$, denotes the corresponding $(i,j)$-element of the block-matrix $\mathbf{C}_d^{-1}$.

Adjusted (“best-estimate”) responses:
$$
\mathbf{r}^{be} = \mathbf{m} + \left( \mathbf{C}_m - \mathbf{C}_{mr} \right) \mathbf{K}_d^{-1}\, \mathbf{d}.
$$

Page 22:

Data Assimilation + Model Calibration (4/4)

Adjusted (“best-estimate”) parameter covariances:
$$
\mathbf{C}_\alpha^{be} = \mathbf{C}_\alpha - \left( \mathbf{C}_\alpha \mathbf{S}^\dagger - \mathbf{C}_{\alpha m} \right) \mathbf{K}_d^{-1} \left( \mathbf{S}\,\mathbf{C}_\alpha - \mathbf{C}_{m\alpha} \right)
$$

Adjusted (“best-estimate”) parameter-response covariances:
$$
\mathbf{C}_{\alpha r}^{be} = \mathbf{C}_{\alpha m} - \left( \mathbf{C}_\alpha \mathbf{S}^\dagger - \mathbf{C}_{\alpha m} \right) \mathbf{K}_d^{-1} \left( \mathbf{C}_m - \mathbf{C}_{rm} \right)
$$

Adjusted (“best-estimate”) response covariances:
$$
\mathbf{C}_r^{be} = \mathbf{C}_m - \left( \mathbf{C}_m - \mathbf{C}_{mr} \right) \mathbf{K}_d^{-1} \left( \mathbf{C}_m - \mathbf{C}_{rm} \right)
$$

“C & E” Consistency Indicator:
$$
\chi^2 = \mathbf{d}^T \mathbf{C}_d^{-1}\, \mathbf{d}
$$
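A hedged numpy sketch of the adjustment equations above, for the simplest special case: a single time-block (so $\mathbf{K}_d^{-1} = \mathbf{C}_d^{-1}$) and no parameter-measurement or response-measurement correlations ($\mathbf{C}_{\alpha m} = \mathbf{C}_{mr} = 0$). All input values are placeholders, not data from the talk.

```python
import numpy as np

def adjust(alpha0, C_alpha, m, C_m, r0, S):
    """Best-estimate adjustment (single block; no alpha-m or m-r correlations).

    alpha0 : nominal parameters            C_alpha : parameter covariances
    m      : measured responses            C_m     : measurement covariances
    r0     : responses computed at alpha0  S       : sensitivities dr/dalpha
    """
    d = r0 - m                                    # response deviations
    C_d = C_m + S @ C_alpha @ S.T                 # deviation covariance
    K = np.linalg.inv(C_d)

    alpha_be = alpha0 - C_alpha @ S.T @ K @ d     # best-estimate parameters
    r_be = m + C_m @ K @ d                        # best-estimate responses
    C_alpha_be = C_alpha - C_alpha @ S.T @ K @ S @ C_alpha  # reduced
    C_r_be = C_m - C_m @ K @ C_m                  # reduced response covariances
    chi2 = float(d @ K @ d)                       # consistency indicator

    return alpha_be, r_be, C_alpha_be, C_r_be, chi2

# Placeholder problem: 2 parameters, 2 measured responses, linear toy model
alpha0 = np.array([1.0, 0.5])
C_alpha = np.diag([0.04, 0.01])
S = np.array([[1.0, 2.0], [0.5, -1.0]])           # dr/dalpha at alpha0
r0 = S @ alpha0
m = np.array([2.1, 0.2])
C_m = np.diag([0.02, 0.02])

alpha_be, r_be, C_a, C_r, chi2 = adjust(alpha0, C_alpha, m, C_m, r0, S)
print(alpha_be, r_be, chi2)   # chi2 near N_R indicates consistent data
```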

Page 23:

…Mathematical details are in the following recent books…

D.G. Cacuci, M.I. Navon, and M. Ionescu-Bujor: Computational Methods for Data Analysis and Assimilation, Chapman & Hall/CRC, Boca Raton, (under contract; scheduled for 2011).

Page 24:

Illustrative Examples:

Best Estimate Prediction through

Data Assimilation + Model Calibration

in Nuclear Engineering:

• Core Design of the French Gen-IV Prototype Sodium Cooled Fast Reactor

• Validation & Best Estimate Prediction Activities within CASL (DOE Energy Innovation Hub)

Page 25:

The Current R&D Program for Sodium Fast Reactors in France …

[Timeline (milestones at 2007, 2009, 2012, 2015, 2020): evaluation of options and R&D on technologies; consolidation of orientations; 1st phase of studies; choice of technologies and construction decision; safety reports (preliminary, final, …) and R&D for qualification; construction.]

… prepares the scientific and techno-economic elements necessary to enable, in 2012, the decision regarding the nature of and specifications for the Prototype SFR, which is to become operational in 2020, with industrial deployment of SFRs commencing around 2040.

Page 26:

Model Calibration (Cross Section Adjustment) Procedure at CEA

[Flowchart: the Experimental Data Base with its Covariances, together with the Cross Sections and Covariances (JEF 2.2), feed the Neutron Transport Computation of Sensitivities, which feeds Data Adjustment + Model Calibration, producing the ERALIB 1 Adjusted Library.]

Over 300 integral experimental values obtained during 1967-2000 from MASURCA, ZEBRA and SNEAK have been used to validate ERALIB-1 for SFR with (U,Pu)O2 core, fertile blanket, and steel structures.

Page 27:

CASL: The Consortium for Advanced Simulation of Light Water Reactors
A DOE Energy Innovation Hub for Modeling & Simulation of Nuclear Reactors

Core partners:
• Oak Ridge National Laboratory
• Electric Power Research Institute
• Idaho National Laboratory
• Los Alamos National Laboratory
• Massachusetts Institute of Technology
• North Carolina State University
• Sandia National Laboratories
• Tennessee Valley Authority
• University of Michigan
• Westinghouse Electric Company

Winning Proposal Team: $122,000,000 over 5 years (potential renewal for an additional 5 years)

Individual contributors:
• ASCOMP GmbH
• CD-adapco, Inc.
• City University of New York
• Florida State University
• Imperial College London
• Rensselaer Polytechnic Institute
• Southern States Energy Board
• Texas A&M University
• University of Florida
• University of Tennessee
• University of Wisconsin
• Notre Dame University

The CASL Team: a unique lab-university-industry partnership with remarkable assets

Page 28:

CASL’s technical focus areas will execute the plan: 18 integrated and interdependent projects.

MNM, Models and Numerical Methods (Bill Martin, Lead; Rob Lowrie, Deputy):
• Radiation transport
• Thermal hydraulics

VRI, Virtual Reactor Integration (John Turner, Lead; Randy Summers, Deputy; Rich Martineau, Deputy):
• Coupled multi-physics environment
• VR simulation suite
• Coupled mechanics

VUQ, Validation and Uncertainty Quantification (Jim Stewart, Lead; Dan Cacuci, Deputy):
• V&V and calibration through data assimilation
• Sensitivity analysis and uncertainty quantification

AMA, Advanced Modeling Applications (Jess Gehin, Lead; Zeses Karoutas, Deputy; Steve Hess, Deputy):
• VR requirements
• VR physical reactor qualification
• Challenge problem application
• VR validation
• NRC engagement

MPO, Materials Performance and Optimization (Chris Stanek, Lead; Sid Yip, Deputy; Brian Wirth, Deputy):
• Upscaling (CMPM)
• Fuel microstructure
• Clad/internals microstructure
• Corrosion
• CRUD deposition
• Failure modes

Page 29:

The validation hierarchy integrates all CASL Focus Areas, executed in a bottom-up and top-down way

[Diagram: validation hierarchy with tiers labeled AMA, VRI, and MNM, MPO.]

Page 30:

…an incomplete list of specific OPEN ISSUES…

• Quantifying unrecognized errors that cause discrepancies between experiments and computations beyond the accompanying computational and experimental uncertainties; new data assimilation / model calibration (DA/MC) methodology to incorporate such errors;

• Quantifying modeling errors that arise from omitted physical processes (“incomplete modeling”) in addition to discretization errors; new DA/MC methodology to incorporate such errors;

• Developing ASAP for systems with strong shocks; extending the ASAP for multi-physics systems undergoing phase-change? (Cacuci and Ionescu-Bujor, NSE, 151, 55-66, 2005);

• Exploiting the property of “importance functions” to extend the general use of adjoint functions (current use: acceleration and variance reduction for Monte Carlo methods);

Page 31:

…an incomplete list of specific OPEN ISSUES…

• Quantifying errors due to nonlinearities and non-Gaussian distributions, up to fourth-order in sensitivities and data distribution-moments (skewness & kurtosis), if available; new (fourth-order sensitivities) DA/MC methodology to incorporate such errors;

• Investigating the trade-off between using a high-order DA/MC vs. iterative application of low-order DA/MC procedures;

• Reducing the phase-space of the high-order predictive modeling (data assimilation / model calibration) methodology to be developed in items (i) through (iii) by combining forward and adjoint models (analogous to dual-weighted residual methods) appropriate to the fourth-order methodology developed in item (iii).

• Global analysis: computation of critical (maxima, minima, saddle, bifurcations) points followed by local sensitivity & uncertainty analysis.

• Exa-scale computing: speed vs. memory

Page 32:

Best Estimate Predictions Require…

1. Specification of Complex System
2. V&V + PIRT Planning Activities
3. Code Verification, SQA Activities, and Error Assessment
4. Validation Experiments Design and Execution
5. Definition of Validation Metrics
6. Quantification of Validation Metric Results (Forward + Adjoint / Dual Model, Sensitivities, Uncertainties, Reduced-Order Model)
7. Quantification of Predictive Capability
8. BE+U Model

Page 33:

…further challenges…

• Leading-edge simulation is beyond the capabilities of individual scientists or small groups;

• Scientific expertise is beyond a single field; techniques are broad, and visualization and understanding are challenging (e.g., particle-physics experiments, where petabytes of data are created and time is then spent sifting through the data…);

• Transition from lab experiments to centralized national and international facilities: the costs and time between upgrades are becoming so large that centralization becomes a must;

• Multi-core parallelism and petascale machines have already arrived… the next challenge is “predictive capabilities” in the exa-scale computing framework.

Page 34:

…thank you!

Page 35:

CASL scope: Develop and apply the VR to assess fuel design, operation, and safety criteria.

Near-term priorities (years 1–5):
• Deliver improved predictive simulation of PWR core, internals, and vessel
– Couple the VR to an evolving out-of-vessel simulation capability
– Maintain applicability to other NPP types
• Execute work in 5 technical focus areas to:
– Equip the VR with the necessary physical models and multiphysics integrators
– Build the VR with a comprehensive, usable, and extensible software system
– Validate and assess the VR models with self-consistent quantified uncertainties
• Focus on challenge problem solutions

Longer-term priorities (years 6–10):
• Expand activities to include structures, systems, and components beyond the reactor vessel
• Establish a focused effort on BWRs and SMRs
• Continue focus on delivering a useful VR to:
– Reactor designers
– NPP operators
– Nuclear regulators
– A new generation of nuclear energy professionals