
Page 1: Bayesian Techniques for Parameter Estimation

Bayesian Techniques for Parameter Estimation

"He has Van Gogh's ear for music," Billy Wilder

Reading: Sections 4.6, 4.8 and Chapter 12


Page 2: Bayesian Techniques for Parameter Estimation

Statistical Inference

Goal: Make conclusions about a phenomenon based on observed data.

Frequentist: Observations made in the past are analyzed with a specified model. The result is regarded as confidence about the state of the real world.

• Probabilities are defined as frequencies with which an event occurs if the experiment is repeated many times.

• Parameter Estimation:

o Relies on estimators derived from different data sets and a specific sampling distribution.

o Parameters may be unknown but are fixed and deterministic.

Bayesian: Interpretation of probability is subjective and can be updated with new data.

• Parameter Estimation: Parameters are considered to be random variables having associated densities.


Page 3: Bayesian Techniques for Parameter Estimation

Bayesian Inference

Framework:

• Prior Distribution: Quantifies prior knowledge of parameter values.

• Likelihood: Probability of observing the data given a set of parameter values; comes from the observation models in Chapter 5!

• Posterior Distribution: Conditional probability distribution of unknown parameters given observed data.

Joint PDF: Quantifies all combinations of parameters and observations

Bayes' Relation: Specifies posterior in terms of likelihood, prior, and normalization constant

Problem: Evaluation of normalization constant typically requires high dimensional integration.

\[ \pi(\theta, y) = \pi(y \mid \theta)\,\pi_0(\theta) \]

\[ \pi(\theta \mid y) = \frac{f(y \mid \theta)\,\pi_0(\theta)}{\int_{\mathbb{R}^p} f(y \mid \theta)\,\pi_0(\theta)\,d\theta} \]


Page 4: Bayesian Techniques for Parameter Estimation

Bayesian Inference

Uninformative Prior: No a priori information about parameters

Informative Prior: Use conjugate priors; the prior and posterior are from the same distribution family

Evaluation Strategies:

• Analytic integration (rare)

• Classical Gaussian quadrature; e.g., p = 1 - 4

• Sparse grid quadrature techniques; e.g., p = 5 - 40

• Monte Carlo quadrature techniques (see the sketch below)

• Markov chain methods

\[ \pi(\theta \mid y) = \frac{f(y \mid \theta)\,\pi_0(\theta)}{\int_{\mathbb{R}^p} f(y \mid \theta)\,\pi_0(\theta)\,d\theta} \]

e.g., \( \pi_0(\theta) = 1 \)


Page 5: Bayesian Techniques for Parameter Estimation

Bayesian Inference: Motivation

Example: Displacement-force relation (Hooke's Law)

Parameter: Stiffness E

Strategy: Use model fit to data to update prior information

[Diagram: prior information, combined with information provided by the model and data, yields updated information.]

Model:
\[ s_i = E e_i + \varepsilon_i, \quad i = 1, \dots, N, \qquad \varepsilon_i \sim N(0, \sigma^2) \]

[Figure: model vs. data, s (MPa).]

Non-normalized Bayes' Relation:
\[ \pi(E \mid s) \propto e^{-\sum_{i=1}^{N} [s_i - E e_i]^2 / 2\sigma^2}\, \pi_0(E) \]

Page 6: Bayesian Techniques for Parameter Estimation

Bayesian Inference

Bayes' Relation: Specifies posterior in terms of likelihood and prior:
\[ \pi(\theta \mid y) = \frac{f(y \mid \theta)\,\pi_0(\theta)}{\int_{\mathbb{R}^p} f(y \mid \theta)\,\pi_0(\theta)\,d\theta} \]
(numerator: likelihood times prior; denominator: normalization constant)

• Prior Distribution: Quantifies prior knowledge of parameter values.

• Likelihood: Probability of observing the data given a set of parameter values. For the Hooke's Law example,
\[ f(y \mid \theta) = e^{-\sum_{i=1}^{N} [s_i - E e_i]^2/2\sigma^2}, \qquad \theta = E, \quad y = [s_1, \dots, s_N]. \]

• Posterior Distribution: Conditional distribution of parameters given observed data.

Problem: Can require high-dimensional integration

• e.g., many applications: p = 10-50!

• Solution: Sampling-based Markov chain Monte Carlo (MCMC) algorithms.

• Metropolis algorithms were first used by nuclear physicists during the Manhattan Project era of the 1940s to understand the particle movement underlying the first atomic bomb.


Page 7: Bayesian Techniques for Parameter Estimation

Bayesian Model Calibration

Bayesian Model Calibration:

• Parameters assumed to be random variables

Bayes' Relation:
\[ P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)} \qquad\Longrightarrow\qquad \pi(\theta \mid y) = \frac{f(y \mid \theta)\,\pi_0(\theta)}{\int_{\mathbb{R}^p} f(y \mid \theta)\,\pi_0(\theta)\,d\theta} \]

Example: Coin Flip
\[ Y_i(\omega) = \begin{cases} 0, & \omega = T \\ 1, & \omega = H \end{cases} \]

Likelihood:
\[ \pi(y \mid \theta) = \prod_{i=1}^{N} \theta^{y_i}(1-\theta)^{1-y_i} = \theta^{N_1}(1-\theta)^{N_0} \]

Posterior with flat prior \( \pi_0(\theta) = 1 \):
\[ \pi(\theta \mid y) = \frac{\theta^{N_1}(1-\theta)^{N_0}}{\int_0^1 \theta^{N_1}(1-\theta)^{N_0}\,d\theta} = \frac{(N+1)!}{N_0!\,N_1!}\,\theta^{N_1}(1-\theta)^{N_0} \]
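As a check on this closed form, a short sketch (assuming numpy and scipy are available) comparing the factorial normalization constant with a numerical integral; the counts N₁ = 5, N₀ = 9 match one panel of the next slide:

```python
import numpy as np
from math import factorial
from scipy.integrate import quad

N1, N0 = 5, 9            # heads, tails
N = N1 + N0

def unnormalized(theta):
    # theta^{N1} (1 - theta)^{N0}: the likelihood under a flat prior
    return theta**N1 * (1 - theta)**N0

# Closed-form normalization from the slide: (N+1)! / (N0! N1!)
const = factorial(N + 1) / (factorial(N0) * factorial(N1))

# Numerical normalization: 1 / integral of the unnormalized posterior
Z, _ = quad(unnormalized, 0.0, 1.0)
print(const, 1.0 / Z)    # the two constants agree (30030 for this data)

# Posterior density at a few points; this is a Beta(N1+1, N0+1) density
for theta in (0.2, 0.357, 0.5):
    print(theta, const * unnormalized(theta))
```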


Page 8: Bayesian Techniques for Parameter Estimation

Bayesian Inference

Example: Posteriors for 1 head, 0 tails; 5 heads, 9 tails; and 49 heads, 51 tails.

[Figure: posterior densities for the three data sets.]


Page 9: Bayesian Techniques for Parameter Estimation

Bayesian Inference

Example: Now consider the informative prior
\[ \pi_0(\theta) = \frac{1}{\sigma\sqrt{2\pi}}\, e^{-(\theta-\mu)^2/2\sigma^2} \quad \text{with } \mu = 0.3 \text{ and } \sigma = 0.1. \]

[Figures: posteriors for 5 heads, 5 tails and for 50 heads, 50 tails.]

Note: A poor informative prior incorrectly influences results for a long time.
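A small grid-based sketch of how this N(0.3, 0.1²) prior pulls the posterior for the two data sets in the figure; the grid resolution is my own choice and the snippet is purely illustrative:

```python
import numpy as np

theta = np.linspace(1e-4, 1 - 1e-4, 2001)
mu, sigma = 0.3, 0.1
prior = np.exp(-(theta - mu) ** 2 / (2 * sigma**2))  # unnormalized N(0.3, 0.1^2)

for N1, N0 in [(5, 5), (50, 50)]:
    like = theta**N1 * (1 - theta) ** N0
    post = like * prior
    post /= np.trapz(post, theta)                    # normalize on the grid
    mean = np.trapz(theta * post, theta)
    print(f"{N1} heads, {N0} tails: posterior mean = {mean:.3f}")
# With 5/5 the prior drags the mean well below 0.5;
# with 50/50 the data have only begun to overwhelm the prior.
```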


Page 10: Bayesian Techniques for Parameter Estimation

Parameter Estimation Problem

Observation Model:
\[ y_i = f_i(\theta) + \varepsilon_i, \quad i = 1, \dots, n \]

Assumption: Assume that measurement errors are iid and \( \varepsilon_i \sim N(0, \sigma^2) \).

Likelihood:
\[ f(y \mid \theta) = L(\theta, \sigma \mid y) = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-SS_\theta/2\sigma^2} \]
where
\[ SS_\theta = \sum_{j=1}^{n} \left[ y_j - f_j(\theta) \right]^2 \]
is the sum of squares error.
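This likelihood transcribes directly into code for a generic model f(t, θ); a sketch assuming numpy, with function names of my own choosing:

```python
import numpy as np

def sum_of_squares(theta, t, y, model):
    # SS_theta = sum_j [y_j - f_j(theta)]^2
    return np.sum((y - model(t, theta)) ** 2)

def likelihood(theta, t, y, model, sigma):
    # f(y | theta) = (2 pi sigma^2)^{-n/2} exp(-SS_theta / 2 sigma^2)
    n = len(y)
    ss = sum_of_squares(theta, t, y, model)
    return (2 * np.pi * sigma**2) ** (-n / 2) * np.exp(-ss / (2 * sigma**2))
```

In practice one works with the log-likelihood, or with differences of SS_θ as on the midpoint-formula slide below, since the raw exponential underflows for realistic data sizes.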

Page 11: Bayesian Techniques for Parameter Estimation

Parameter Estimation: Example

Example: Consider the spring model
\[ \ddot{z} + C\dot{z} + Kz = 0, \qquad z(0) = 2, \quad \dot{z}(0) = -C \]
whose solution is
\[ z(t) = 2 e^{-Ct/2} \cos\!\left( \sqrt{K - C^2/4}\; t \right). \]

Note: Take K = 20.5 and \( C_0 = 1.5 \). Take K to be known and \( \theta = C \). Assume that \( \varepsilon_i \sim N(0, \sigma_0^2) \), where \( \sigma_0 = 0.1 \).
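A sketch that generates synthetic observations from this model with the stated values (K = 20.5, C₀ = 1.5, σ₀ = 0.1); the time grid is an assumption, since the slides do not give one:

```python
import numpy as np

K, C0, sigma0 = 20.5, 1.5, 0.1

def z(t, C):
    # Analytic solution: z(t) = 2 exp(-Ct/2) cos(sqrt(K - C^2/4) t)
    return 2 * np.exp(-C * t / 2) * np.cos(np.sqrt(K - C**2 / 4) * t)

rng = np.random.default_rng(1)
t = np.linspace(0, 5, 101)                          # assumed observation times
y = z(t, C0) + sigma0 * rng.normal(size=t.size)     # y_i = f_i(C0) + eps_i
```

These (t, y) arrays are reused in the sketches for the later slides of this example.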


Page 12: Bayesian Techniques for Parameter Estimation

Parameter Estimation: Example

Example: The sensitivity matrix is
\[ X(\theta) = \left[ \frac{\partial y}{\partial C}(t_1, \theta), \;\cdots,\; \frac{\partial y}{\partial C}(t_n, \theta) \right]^T \]
where
\[ \frac{\partial y}{\partial C} = e^{-Ct/2} \left[ \frac{Ct}{\sqrt{4K - C^2}} \sin\!\left( \sqrt{K - C^2/4}\; t \right) - t \cos\!\left( \sqrt{K - C^2/4}\; t \right) \right]. \]
Here
\[ V = \sigma_C^2 = \sigma_0^2 \left[ X^T(\theta) X(\theta) \right]^{-1} = 3.35 \times 10^{-4} \]
so that
\[ \hat{C} \sim N\!\left( C_0, \sigma_C^2 \right), \qquad \sigma_C = 0.0183. \]

[Figure: Sampling distribution compared with that constructed using 10,000 estimated values of C (density vs. optimal C, constructed vs. sampling).]

Note: In 10,000 simulations, 9455 of the confidence intervals contained the true parameter value.
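A sketch computing X(θ) and σ_C on the assumed time grid from the earlier snippet; the slide's value 3.35 × 10⁻⁴ presumably corresponds to the course's own grid, so the number here may differ slightly:

```python
import numpy as np

K, C0, sigma0 = 20.5, 1.5, 0.1
t = np.linspace(0, 5, 101)               # same assumed grid as before

omega = np.sqrt(K - C0**2 / 4)
# dy/dC = e^{-Ct/2} [ Ct / sqrt(4K - C^2) sin(omega t) - t cos(omega t) ]
dydC = np.exp(-C0 * t / 2) * (
    C0 * t / np.sqrt(4 * K - C0**2) * np.sin(omega * t) - t * np.cos(omega * t)
)

xtx = dydC @ dydC                        # X^T X (a scalar, since p = 1)
sigma_C = sigma0 / np.sqrt(xtx)          # sigma_C^2 = sigma_0^2 [X^T X]^{-1}
print("sigma_C ~", sigma_C)
```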


Page 13: Bayesian Techniques for Parameter Estimation

Parameter Estimation: Example

Bayesian Inference: Employ the flat prior
\[ \pi_0(\theta) = \chi_{[0,\infty)}(\theta). \]

Posterior Distribution:
\[ \pi(\theta \mid y) = \frac{e^{-SS_\theta/2\sigma_0^2}}{\int_0^\infty e^{-SS_\zeta/2\sigma_0^2}\,d\zeta} = \frac{1}{\int_0^\infty e^{-(SS_\zeta - SS_\theta)/2\sigma_0^2}\,d\zeta} \]

Midpoint formula:
\[ \pi(\theta \mid y) \approx \frac{1}{\sum_{i=1}^{k} e^{-(SS_{\zeta_i} - SS_\theta)/2\sigma_0^2}\, w_i} \]

Issue: \( e^{-SS_{\theta_{MAP}}} \approx 3 \times 10^{-113} \), so the unshifted exponentials underflow in double precision; subtracting \( SS_\theta \) in the exponent, as in the second form above, avoids this.

Note:
• Slow even for one parameter.
• Strategy: Create a Markov chain using random sampling so that the created chain has the posterior distribution as its limiting (stationary) distribution.

[Figure: posterior density vs. sampling density for damping parameter C.]
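A sketch of the midpoint normalization with the SS shift, reusing the spring example's synthetic (t, y) data; the integration interval and grid size are assumptions:

```python
import numpy as np

def SS(C, t, y):
    K = 20.5
    f = 2 * np.exp(-C * t / 2) * np.cos(np.sqrt(K - C**2 / 4) * t)
    return np.sum((y - f) ** 2)

sigma0 = 0.1
# Midpoint rule on an assumed interval [1.3, 1.7] containing the mass;
# t, y come from the earlier synthetic-data snippet
zeta, w = np.linspace(1.302, 1.698, 100, retstep=True)
ss_zeta = np.array([SS(c, t, y) for c in zeta])

def posterior(theta):
    ss_theta = SS(theta, t, y)
    # Subtracting SS_theta keeps the exponents O(1); e^{-SS/2 sigma0^2}
    # alone would underflow to zero for every theta
    return 1.0 / np.sum(np.exp(-(ss_zeta - ss_theta) / (2 * sigma0**2)) * w)

print(posterior(1.5))
```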

Page 14: Bayesian Techniques for Parameter Estimation

Bayesian Model Calibration

Bayesian Model Calibration:

• Parameters considered to be random variables with associated densities.

Problem:

• Often requires high-dimensional integration;

o e.g., p = 18 for MFC model

o p = thousands to millions for some models

Strategies:

• Sampling methods

• Sparse grid quadrature techniques


\[ \pi(\theta \mid y) = \frac{f(y \mid \theta)\,\pi_0(\theta)}{\int_{\mathbb{R}^p} f(y \mid \theta)\,\pi_0(\theta)\,d\theta} \]

[Figure: proposal distribution J(θ* | θ^{k−1}) and likelihood f(y | θ), with chain values θ₀, θ₁ = θ*, θ₂ = θ*, θ₃ = θ₂.]

Page 15: Bayesian Techniques for Parameter Estimation

Markov Chains

Definition: A Markov chain is a sequence of random variables in which the probability of the next state depends only on the current state.

Note: A Markov chain is characterized by three components: a state space, an initial distribution, and a transition kernel.

State Space:

Initial Distribution: (Mass)

Transition Probability: (Markov Kernel)


Page 16: Bayesian Techniques for Parameter Estimation

Markov Chain Techniques

Markov Chain: Sequence of events where the current state depends only on the last value.

Baseball:

• Assume that the team which won the last game has a 70% chance of winning the next game and a 30% chance of losing the next game.

• Assume the losing team wins 40% and loses 60% of next games.

• The percentage of teams who win/lose the next game is given below.

• Question: does the following limit exist?

States are S = {win, lose}. Initial state is \( p_0 = [0.8, 0.2] \).

[Diagram: two-state chain; win → win 0.7, win → lose 0.3, lose → win 0.4, lose → lose 0.6.]

\[ p_1 = [0.8\,,\, 0.2] \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} = [0.64\,,\, 0.36] \]

\[ p_n = [0.8\,,\, 0.2] \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix}^{\,n} \]


Page 17: Bayesian Techniques for Parameter Estimation

Markov Chain Techniques

Baseball Example: Solve the constrained relation
\[ \pi = \pi P, \qquad \sum_i \pi_i = 1 \]
\[ \Rightarrow\; [\pi_{\text{win}}\,,\, \pi_{\text{lose}}] \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} = [\pi_{\text{win}}\,,\, \pi_{\text{lose}}], \qquad \pi_{\text{win}} + \pi_{\text{lose}} = 1 \]
to obtain
\[ \pi = [0.5714\,,\, 0.4286]. \]


Page 18: Bayesian Techniques for Parameter Estimation

Markov Chain Techniques

Baseball Example: Solve the constrained relation
\[ \pi = \pi P, \qquad \sum_i \pi_i = 1 \]
\[ \Rightarrow\; [\pi_{\text{win}}\,,\, \pi_{\text{lose}}] \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} = [\pi_{\text{win}}\,,\, \pi_{\text{lose}}], \qquad \pi_{\text{win}} + \pi_{\text{lose}} = 1 \]
to obtain
\[ \pi = [0.5714\,,\, 0.4286]. \]

Alternative: Iterate to compute the solution (see the sketch below):

n   p_n                 n   p_n                 n    p_n
0   [0.8000, 0.2000]    4   [0.5733, 0.4267]     8   [0.5714, 0.4286]
1   [0.6400, 0.3600]    5   [0.5720, 0.4280]     9   [0.5714, 0.4286]
2   [0.5920, 0.4080]    6   [0.5716, 0.4284]    10   [0.5714, 0.4286]
3   [0.5776, 0.4224]    7   [0.5715, 0.4285]

Notes:

• Forms the basis for Markov chain Monte Carlo (MCMC) techniques

• Goal: construct chains whose stationary distribution is the posterior density
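A short sketch reproducing both the iteration table and the constrained solve; plain numpy, with nothing assumed beyond the transition matrix above:

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
p = np.array([0.8, 0.2])

# Iterate p_n = p_0 P^n and watch the convergence in the table
for n in range(11):
    print(n, np.round(p, 4))
    p = p @ P

# Direct solve: pi (P - I) = 0 subject to sum(pi) = 1
A = np.vstack([(P - np.eye(2)).T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print("stationary:", np.round(pi, 4))    # [0.5714, 0.4286]
```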

Page 19: Bayesian Techniques for Parameter Estimation

Irreducible Markov Chains

Irreducible: A Markov chain is irreducible if every state can be reached from every other state.

Reducible Markov Chain:

[Diagram: chain that splits into two disconnected components with distributions p1 and p2.]

Note: The limiting distribution is not unique if the chain is reducible.


Page 20: Bayesian Techniques for Parameter Estimation

Periodic Markov Chains

Example:

Periodicity: A Markov chain is periodic if parts of the state space are visited at regular intervals. The period of state i is defined as k = gcd{n ≥ 1 : Pⁿ(i, i) > 0}; the chain is aperiodic if k = 1 for all states.


Page 21: Bayesian Techniques for Parameter Estimation

Periodic Markov Chains

Example:


Page 22: Bayesian Techniques for Parameter Estimation

Stationary Distribution

Theorem: A finite, homogeneous Markov chain that is irreducible and aperiodic has a unique stationary distribution π, and the chain converges to it in the sense of distributions from any initial distribution.

Recurrence (Persistence): A state is recurrent (persistent) if, starting from that state, the chain returns to it with probability 1; otherwise the state is transient.

Example: State 3 is transient.

Ergodicity: A state is termed ergodic if it is aperiodic and recurrent. If all states of an irreducible Markov chain are ergodic, the chain is said to be ergodic.

Page 23: Bayesian Techniques for Parameter Estimation

Matrix Theory

Definition:

Lemma:

Example:


Page 24: Bayesian Techniques for Parameter Estimation

Matrix Theory

Theorem (Perron-Frobenius):

Corollary 1:

Proposition:


Page 25: Bayesian Techniques for Parameter Estimation

Stationary Distribution

Corollary:

Proof:

Convergence: Express
\[
U P V = \Lambda =
\begin{bmatrix}
1 & 0 & \cdots & 0 \\
0 & \lambda_2 & & \vdots \\
\vdots & & \ddots & \vdots \\
0 & \cdots & & \lambda_k
\end{bmatrix}
\]
where \( 1 > |\lambda_2| > \cdots > |\lambda_k| \) and \( V = U^{-1} \), so that \( P^n = V \Lambda^n U \). Then
\[
\lim_{n\to\infty} P^n = \lim_{n\to\infty} V
\begin{bmatrix}
1 & 0 & \cdots & 0 \\
0 & \lambda_2^n & & \vdots \\
\vdots & & \ddots & \vdots \\
0 & \cdots & & \lambda_k^n
\end{bmatrix}
U = V
\begin{bmatrix}
1 & 0 & \cdots & 0 \\
0 & 0 & & \vdots \\
\vdots & & \ddots & \vdots \\
0 & \cdots & & 0
\end{bmatrix}
U.
\]

Page 26: Bayesian Techniques for Parameter Estimation

Stationary Distribution

\( U P = \Lambda U \) implies
\[
\begin{bmatrix}
\pi_1 & \cdots & \pi_k \\
\vdots & & \vdots \\
u_{k1} & \cdots & u_{kk}
\end{bmatrix}
P
=
\begin{bmatrix}
1 & & & \\
& \lambda_2 & & \\
& & \ddots & \\
& & & \lambda_k
\end{bmatrix}
\begin{bmatrix}
\pi_1 & \cdots & \pi_k \\
\vdots & & \vdots \\
u_{k1} & \cdots & u_{kk}
\end{bmatrix},
\]
so the first row of U is the stationary distribution π. Moreover,
\[
V = U^{-1} \;\Rightarrow\;
UV =
\begin{bmatrix}
\pi_1 & \cdots & \pi_k \\
\vdots & & \vdots \\
u_{k1} & \cdots & u_{kk}
\end{bmatrix}
\begin{bmatrix}
1 & \cdots & v_{1k} \\
\vdots & & \vdots \\
1 & \cdots & v_{kk}
\end{bmatrix}
=
\begin{bmatrix}
1 & \cdots & 0 \\
\vdots & \ddots & \vdots \\
0 & \cdots & 1
\end{bmatrix},
\]
where the first column of V consists of ones. Hence
\begin{align*}
\lim_{n\to\infty} p_n &= \lim_{n\to\infty} p_0 P^n \\
&= \lim_{n\to\infty} \left[ p_1^0, \dots, p_k^0 \right]
\begin{bmatrix}
1 & \cdots & v_{1k} \\
\vdots & & \vdots \\
1 & \cdots & v_{kk}
\end{bmatrix}
\begin{bmatrix}
1 & & & \\
& \lambda_2^n & & \\
& & \ddots & \\
& & & \lambda_k^n
\end{bmatrix}
\begin{bmatrix}
\pi_1 & \cdots & \pi_k \\
\vdots & & \vdots \\
u_{k1} & \cdots & u_{kk}
\end{bmatrix} \\
&= \left[ p_1^0, \dots, p_k^0 \right]
\begin{bmatrix}
1 & \cdots & v_{1k} \\
\vdots & & \vdots \\
1 & \cdots & v_{kk}
\end{bmatrix}
\begin{bmatrix}
1 & & & \\
& 0 & & \\
& & \ddots & \\
& & & 0
\end{bmatrix}
\begin{bmatrix}
\pi_1 & \cdots & \pi_k \\
\vdots & & \vdots \\
u_{k1} & \cdots & u_{kk}
\end{bmatrix} \\
&= [\pi_1, \dots, \pi_k] \\
&= \pi,
\end{align*}
since \( \sum_i p_i^0 = 1 \) multiplies the first row of U.
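A numerical check of this argument on the baseball matrix; a sketch assuming numpy, obtaining left eigenvectors from the eigendecomposition of Pᵀ:

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Left eigenvectors of P (rows of U) are eigenvectors of P^T
lam, W = np.linalg.eig(P.T)
order = np.argsort(-np.abs(lam))        # put lambda_1 = 1 first
lam, W = lam[order], W[:, order]

pi = W[:, 0].real
pi /= pi.sum()                          # normalize the leading left eigenvector
print("pi =", np.round(pi, 4))          # [0.5714, 0.4286]

# P^n -> V diag(1, 0, ...) U: every row of the limit equals pi
print(np.round(np.linalg.matrix_power(P, 50), 4))
```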

Page 27: Bayesian Techniques for Parameter Estimation

Detailed Balance Conditions

Reversible Chains: A Markov chain determined by the transition matrix P is reversible if there is a distribution π that satisfies the detailed balance conditions
\[ \pi_i p_{ij} = \pi_j p_{ji} \quad \text{for all } i, j. \]
Such a π is a stationary distribution.

Proof: We need to show that π = πP. Indeed, for each j,
\[ (\pi P)_j = \sum_i \pi_i p_{ij} = \sum_i \pi_j p_{ji} = \pi_j \sum_i p_{ji} = \pi_j. \]

Example:

Page 28: Bayesian Techniques for Parameter Estimation

Markov Chain Monte Carlo Methods

Strategy: Markov chain simulation is used when it is impossible, or computationally prohibitive, to sample θ directly from
\[ \pi(\theta \mid y) = \frac{f(y \mid \theta)\,\pi_0(\theta)}{\int_{\mathbb{R}^p} f(y \mid \theta)\,\pi_0(\theta)\,d\theta}. \]

Note:

• In Markov chain theory, we are given a Markov chain, P, and we construct its equilibrium distribution.

• In MCMC theory, we are "given" a distribution and we want to construct a Markov chain that is reversible with respect to it.

• Create a Markov process whose stationary distribution is \( \pi(\theta \mid y) \).

Page 29: Bayesian Techniques for Parameter Estimation

Model Calibration Problem

Assumption: Assume that measurement errors are iid and \( \varepsilon_i \sim N(0, \sigma^2) \).

Likelihood:
\[ f(y \mid \theta) = L(\theta, \sigma \mid y) = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-SS_\theta/2\sigma^2} \]
where
\[ SS_\theta = \sum_{j=1}^{n} \left[ y_j - f_j(\theta) \right]^2 \]
is the sum of squares error.

Page 30: Bayesian Techniques for Parameter Estimation

Markov Chain Monte Carlo Methods

General Strategy:

• Current value: \( X_{k-1} = \theta^{k-1} \)

• Propose candidate \( \theta^* \sim J(\theta^* \mid \theta^{k-1}) \) from the proposal (jumping) distribution

• With probability \( \alpha(\theta^*, \theta^{k-1}) \), accept \( \theta^* \); i.e., \( X_k = \theta^* \)

• Otherwise, stay where you are: \( X_k = \theta^{k-1} \)

(See the Metropolis sketch below.)

Intuition: Recall that
\[ f(y \mid \theta) = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-\sum_{i=1}^{n}[y_i - f_i(\theta)]^2/2\sigma^2} = \frac{1}{(2\pi\sigma^2)^{n/2}}\, e^{-SS_\theta/2\sigma^2}. \]

[Figure: f(y | θ) and SS_θ evaluated at the candidate θ* and the current value θ^{k−1}.]
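A minimal random-walk Metropolis sketch for the damping parameter C, tying the pieces above together; the proposal width, chain length, and reuse of the synthetic (t, y) data are assumptions, not the course's reference implementation:

```python
import numpy as np

def log_like(C, t, y, sigma0=0.1, K=20.5):
    # log f(y | C) up to a constant: -SS_C / (2 sigma0^2)
    f = 2 * np.exp(-C * t / 2) * np.cos(np.sqrt(K - C**2 / 4) * t)
    return -np.sum((y - f) ** 2) / (2 * sigma0**2)

def metropolis(t, y, C0=1.5, width=0.05, n_iter=10_000, seed=2):
    rng = np.random.default_rng(seed)
    chain = np.empty(n_iter)
    C, ll = C0, log_like(C0, t, y)
    for k in range(n_iter):
        Cs = C + width * rng.normal()       # theta* ~ J(. | theta^{k-1})
        # flat prior chi_[0, inf): reject negative candidates outright
        lls = log_like(Cs, t, y) if Cs > 0 else -np.inf
        # accept with probability alpha = min(1, r), computed in log space
        if np.log(rng.uniform()) < lls - ll:
            C, ll = Cs, lls
        chain[k] = C                        # otherwise stay put
    return chain

# chain = metropolis(t, y); chain.mean(), chain.std() estimate C and sigma_C
```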

Page 31: Bayesian Techniques for Parameter Estimation

Markov Chain Monte Carlo Methods

Intuition:

• Consider
\[ r(\theta^* \mid \theta^{k-1}) = \frac{\pi(\theta^* \mid y)}{\pi(\theta^{k-1} \mid y)} = \frac{f(y \mid \theta^*)\,\pi_0(\theta^*)}{f(y \mid \theta^{k-1})\,\pi_0(\theta^{k-1})}. \]

  - If \( r < 1 \Rightarrow f(y \mid \theta^*) < f(y \mid \theta^{k-1}) \), accept with probability \( \alpha = r \).

  - If \( r > 1 \), accept with probability \( \alpha = 1 \).

Note: A narrower proposal distribution yields a higher probability of acceptance.

[Figure: f(y | θ) and SS_θ at θ* and θ^{k−1}.]
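Since r involves \( e^{-SS/2\sigma^2} \) factors that underflow (recall \( e^{-SS_{\theta_{MAP}}} \approx 3 \times 10^{-113} \)), the acceptance test is done in log space in practice, as in the Metropolis sketch above; a one-function illustration:

```python
import numpy as np

# r = exp(-(SS_star - SS_curr) / (2 sigma^2)); test log u < log r instead
def accept(ss_star, ss_curr, sigma, u):
    log_r = -(ss_star - ss_curr) / (2 * sigma**2)
    return np.log(u) < log_r     # equivalent to u < min(1, r) for u in (0, 1)
```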

Page 32: Bayesian Techniques for Parameter Estimation

Markov Chain Monte Carlo Methods

Note: A narrower proposal distribution yields a higher probability of acceptance.

[Figures: parameter value vs. chain iteration over 10,000 iterations (values roughly 1.44 to 1.58) for two proposal widths, together with the corresponding proposal distributions J(θ* | θ^{k−1}) and accepted candidate sequences θ₀, θ₁ = θ*, θ₂ = θ*, θ₃ = θ₂, etc.]

Page 33: Bayesian Techniques for Parameter Estimation

Proposal Distribution

Proposal Distribution: Significantly affects mixing.

• Too wide: Too many points are rejected and the chain stays still for long periods.

• Too narrow: The acceptance ratio is high, but the algorithm is slow to explore the parameter space.

• Ideally, the proposal should have a similar "shape" to the posterior distribution.

Problem:

• Anisotropic posterior, isotropic proposal;

• Efficiency nonuniform for different parameters.

Result:

• Recovers efficiency of the univariate case.

[Figure: (a) isotropic proposal J(θ* | θ^{k−1}) vs. (b) anisotropic posterior π(θ | y) in the (θ₁, θ₂) plane.]