Markov chain Monte Carlo algorithms for the Bayesian analysis of phylogenetic trees

Bret Larget ([email protected])
Departments of Botany and of Statistics, University of Wisconsin—Madison

June 25, 2008



Markov chain Monte Carlo

Markov chain Monte Carlo (MCMC) is a very general method to sample from probability distributions by means of simulation.

A Markov chain is a sequence of random variables where the distribution of each random variable depends only on the value of the previous random variable.

Given the present, the future is independent of the past.

The term Monte Carlo signifies a computer simulation of random numbers.

We first demonstrate MCMC with an example.

Introduction 2 / 32


Example

We have a function h(θ) from which we want to sample.

We only need to know h up to a normalizing constant.

[Figure: the target distribution h(θ)]

Introduction Example 3 / 32

Initial Point

We begin the Markov chain at a single point.

We evaluate the value of h at this point.

[Figure: the initial point on the target density]

Introduction Example 4 / 32

Proposal Distribution

Given our current state, we have a proposal distribution for the nextcandidate state.

[Figure: the proposal distribution]

Introduction Example 5 / 32

First Proposal

We propose a candidate new point.

Current state θ; Proposed state θ∗

This proposal is accepted.

[Figure: current state θ and proposed state θ∗; accept with probability 1]

Introduction Example 6 / 32


Second Proposal

The proposal was accepted, so the proposed state becomes the current state.

Current state θ; Proposed state θ∗; Make another proposal.

This proposal is rejected.

[Figure: current state θ and proposed state θ∗; accept with probability 0.153]

Introduction Example 7 / 32


Third Proposal

The proposal was rejected, so the current state is sampled again and remains current.

Current state θ; Proposed state θ∗; Make another proposal.

This proposal is accepted.

[Figure: current state θ and proposed state θ∗; accept with probability 0.536]

Introduction Example 8 / 32


Beginning of Sample

The first four sample points.

Vertical positions are jittered to separate points that fall at the same value.

[Figure: the sample so far]

Introduction Example 9 / 32

Larger Sample

Repeat this for 10,000 proposals and show the sample.

[Figure: the sample after 10,000 proposals]

Introduction Example 10 / 32
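The walk just described can be sketched in a few lines of Python. The bimodal function below is a hypothetical stand-in for the h(θ) pictured on the slides, not the actual target used in the lecture:

```python
import math
import random

random.seed(0)

def h(theta):
    # Unnormalized target: a bimodal mixture of two normal-shaped bumps.
    # (Illustrative only; any h known up to a constant works.)
    return math.exp(-0.5 * (theta - 1.0) ** 2) + \
        0.5 * math.exp(-0.5 * ((theta + 2.0) / 0.7) ** 2)

def metropolis(n_samples, step=1.0, theta0=0.0):
    theta = theta0
    out = []
    for _ in range(n_samples):
        proposal = theta + random.gauss(0.0, step)      # symmetric proposal
        accept_prob = min(1.0, h(proposal) / h(theta))  # 1 if the proposal is better
        if random.random() < accept_prob:
            theta = proposal        # accept: move to the proposed state
        # else: reject, and record the current state again
        out.append(theta)
    return out

sample = metropolis(10_000)
print(sum(sample) / len(sample))  # sample mean estimates the target mean
```

Because the proposal is symmetric, the acceptance probability reduces to the ratio of target values, exactly as in the pictures: better points are always accepted, worse ones only sometimes.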

Comparison to Target

[Figure: the sample compared with the target density]

Introduction Example 11 / 32

Things to Note

The resulting sample mimics the target distribution very well.

The shape of the proposal distribution did not depend on the target distribution at all: almost any type of proposal method would have worked.

There is a lot of autocorrelation: MCMC produces dependent samples.

The acceptance probabilities depend on the proposal distributions and relative values of the target.

Summaries of the sample are good estimates of corresponding target quantities:

- The sample mean converges to the mean of the target.
- The sample median converges to the median of the target.
- The sample tail area above 1.0 converges to the relative area above 1.0 in the target.

Introduction Example 12 / 32


How does MCMC work?

The MCMC I have shown is an example of a Metropolis-Hastings method.

There is an arbitrary proposal distribution.

Each proposal is either accepted or rejected.

Proposals to relatively good places are always accepted.

Proposals to relatively bad places are only accepted some of the time.

Good states may be resampled many times in succession.

Bad states are quickly abandoned.

Introduction Theory 13 / 32

The Theory

Suppose that h is a (potentially unnormalized) probability distribution over a space Θ.

The stationary distribution of a Markov chain is proportional to the long-run proportion of time that the Markov chain spends in each part of the space.

Thus, to sample from h we want a Markov chain where the stationary distribution π is proportional to h.

MCMC depends on the following theorem, where q̃ is the actual transition density.

Introduction Theory 14 / 32

Theorem


If the Markov chain is irreducible (can get from anywhere to anywhere else eventually) and if there exists a function π such that for any two states x, y ∈ Θ,

π(x)q̃(y | x) = π(y)q̃(x | y)

then π is proportional to the stationary distribution.

The condition of this theorem is called detailed balance.

It means that for any two states x and y, the stationary rate of moving from x to y is equal to the stationary rate of moving from y to x.

We can show that the Metropolis-Hastings algorithm satisfies detailed balance for h.

Introduction Theory 15 / 32
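Detailed balance can be checked numerically on a tiny discrete state space. The four-state target and uniform proposal below are illustrative choices, not from the slides:

```python
# A small discrete check of detailed balance for Metropolis-Hastings.
# States 0..3 carry unnormalized weights h; the proposal q picks any
# other state uniformly at random.
h = [1.0, 3.0, 2.0, 0.5]
n = len(h)

def q(y, x):
    # Probability of proposing state y from state x (uniform over others).
    return 1.0 / (n - 1) if y != x else 0.0

def q_tilde(y, x):
    # Actual transition density: propose y, then accept with the M-H probability.
    if y == x:
        return 0.0  # off-diagonal part only; the diagonal absorbs rejections
    accept = min(1.0, (h[y] * q(x, y)) / (h[x] * q(y, x)))
    return q(y, x) * accept

# Detailed balance: h(x) q~(y | x) == h(y) q~(x | y) for every pair of states.
for x in range(n):
    for y in range(n):
        assert abs(h[x] * q_tilde(y, x) - h[y] * q_tilde(x, y)) < 1e-12
print("detailed balance holds")
```

The assertion passes for any positive weights h, which is the point of the theorem: the construction forces detailed balance no matter what the target is.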

Metropolis-Hastings

The target distribution is h.

The Metropolis-Hastings algorithm uses a proposal distribution q, which can depend on the most recently sampled θi, to generate a proposal θ∗ that is accepted with some probability.

When accepted, θi+1 = θ∗.

When rejected, θi+1 = θi .

The proposal distribution q is essentially arbitrary provided it can move around the entire space Θ.

Metropolis-Hastings 16 / 32


Metropolis-Hastings Algorithm

The acceptance probability is

min { 1, [h(θ∗) / h(θ)] × [q(θ | θ∗) / q(θ∗ | θ)] }

Notice the target density appears only as a ratio, so the normalizing coefficient cancels.

Metropolis-Hastings 17 / 32
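When the proposal is asymmetric, the ratio q(θ | θ∗)/q(θ∗ | θ) matters. As a sketch, the Python below uses a multiplicative "multiplier" proposal (moves of this type are common for positive parameters such as edge lengths) on an invented Gamma-shaped target; for this particular proposal, θ∗ = θ·exp(λ(U − 1/2)), the proposal ratio works out to θ∗/θ:

```python
import math
import random

random.seed(3)

def h(theta):
    # Unnormalized target on theta > 0: Gamma(3, 1), h(theta) = theta^2 exp(-theta).
    # (Illustrative target; its mean is 3.)
    return theta ** 2 * math.exp(-theta)

lam = 1.5                 # tuning parameter of the multiplier move (invented value)
theta, sample = 1.0, []
for _ in range(20_000):
    # Asymmetric proposal: scale the current state by a random factor.
    prop = theta * math.exp(lam * (random.random() - 0.5))
    # Acceptance probability includes the proposal ratio q(θ|θ*)/q(θ*|θ) = prop/theta.
    accept = min(1.0, (h(prop) / h(theta)) * (prop / theta))
    if random.random() < accept:
        theta = prop
    sample.append(theta)

print(sum(sample) / len(sample))  # estimates the Gamma(3, 1) mean of 3
```

Omitting the correction factor prop/theta would bias the chain toward small values; the Hastings ratio is exactly what restores detailed balance for asymmetric moves.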


Detailed Balance of M-H

Metropolis-Hastings satisfies detailed balance.

The actual transition probability from θ to θ∗ is the probability of proposing and then accepting the proposal.

By the theorem, if

h(θ) × q̃(θ∗ | θ) = h(θ∗) × q̃(θ | θ∗)

then h is proportional to the stationary distribution.

h(θ) q̃(θ∗ | θ) = h(θ) q(θ∗ | θ) × min { 1, [h(θ∗) / h(θ)] × [q(θ | θ∗) / q(θ∗ | θ)] }
              = min { h(θ) q(θ∗ | θ), h(θ∗) q(θ | θ∗) }
              = h(θ∗) q̃(θ | θ∗)

Metropolis-Hastings 18 / 32

Bayesian Inference

In Bayesian inference, the posterior distribution is proportional to the product of the likelihood and the prior distribution.

For parameters θ and data D,

P {θ | D} = P {D | θ} P {θ} / P {D} .

The denominator is the marginal likelihood of the data, which is the integral of the likelihood against the prior distribution.

Bayesian Phylogenetics Mathematical Background 19 / 32


Bayesian Phylogenetics

For a phylogenetic problem, the parameter θ could include:

- the tree topology;
- edge lengths or divergence times;
- parameters for a nucleotide substitution model;
- a history of genome rearrangements;
- a history of insertions and deletions;
- and more.

Bayesian Phylogenetics Phylogenetics 20 / 32

Phylogenetic Inference

Given:

- a prior distribution π(θ);
- a likelihood model P {D | θ}; and
- data D;

the posterior distribution is, by Bayes' Theorem,

P {θ | D} = P {D | θ} × π(θ) / P {D}

We use MCMC to sample from the posterior distribution where

h(θ) = P {D | θ} × π(θ) ∝ P {θ | D} .

Bayesian Phylogenetics Phylogenetics 21 / 32
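In practice h is evaluated on the log scale, adding the log-likelihood to the log-prior. The sketch below substitutes a toy binomial model with a Beta-style prior for the (far more complex) phylogenetic likelihood; the data counts and tuning values are invented for illustration:

```python
import math
import random

random.seed(1)

# Toy stand-in for h(theta) = P{D | theta} * pi(theta):
# data D = 7 successes in 10 trials; unnormalized Beta(2, 2) prior on theta.
D_SUCCESS, D_TRIALS = 7, 10

def log_h(theta):
    if not 0.0 < theta < 1.0:
        return float("-inf")  # zero density outside the support
    log_lik = (D_SUCCESS * math.log(theta)
               + (D_TRIALS - D_SUCCESS) * math.log(1 - theta))
    log_prior = math.log(theta) + math.log(1 - theta)  # Beta(2, 2), unnormalized
    return log_lik + log_prior

# Metropolis on the log scale: accept when log h(theta*) - log h(theta) > log U.
theta, sample = 0.5, []
for _ in range(20_000):
    proposal = theta + random.gauss(0.0, 0.2)
    if math.log(random.random()) < log_h(proposal) - log_h(theta):
        theta = proposal
    sample.append(theta)

burned = sample[2000:]
print(sum(burned) / len(burned))  # posterior mean, roughly (7+2)/(10+4) ≈ 0.64
```

Working with log h avoids numerical underflow, which is essential in phylogenetics where likelihoods over long alignments are astronomically small.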

MCMC

Theory says that MCMC estimates converge to posterior quantities, provided that the chain is run sufficiently long.

Since Metropolis-Hastings is so general, some MCMC methods can be much more efficient than others.

Some MCMC methods for phylogenetics do not mix well for some data sets and can be very misleading.

There is no gold standard to ensure good MCMC results, since MCMC is often the only method that can perform a specific calculation.

There are, however, several ways to diagnose lack of convergence.

MCMC Convergence 22 / 32

Good MCMC Practices

Run many short runs and examine acceptance probabilities:

- change tuning parameters if some acceptance rates are out of range
- (a good range is often 0.10 to 0.50, but this is not always possible)

Run several (four or more) long runs using different random seeds from different starting places.

Examine trace plots to make sure that burn-in is long enough.

Examine post-burn-in estimates of several key variables (log-likelihood, clade probabilities, total tree length) and make sure that they are close.

Look at autocorrelation plots of several key variables.

MCMC Convergence 23 / 32
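A minimal autocorrelation diagnostic can be coded directly. The two chains below are synthetic stand-ins for a well-mixing run and a poorly mixing ("sticky") run:

```python
import random

random.seed(2)

def lag_autocorr(xs, lag):
    # Sample autocorrelation at a given lag, a standard mixing diagnostic.
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    cov = sum((xs[i] - mean) * (xs[i + lag] - mean) for i in range(n - lag)) / n
    return cov / var

# Synthetic chains: independent draws vs. a sticky AR(1)-style chain.
iid = [random.gauss(0, 1) for _ in range(5000)]
sticky = [0.0]
for _ in range(4999):
    sticky.append(0.95 * sticky[-1] + random.gauss(0, 1))

print(lag_autocorr(iid, 10))     # near 0: draws are nearly independent
print(lag_autocorr(sticky, 10))  # large: poor mixing; thin or run longer
```

High autocorrelation at large lags means the effective sample size is far below the nominal chain length, so posterior summaries are less precise than they appear.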


Trace Plots (good)

[Figure: trace plots of log-likelihood versus iteration; the "Full Runs" panel shows runs climbing from about −11000 to −9500, and the "Post-Burn in" panel (about −9390 to −9360) shows the runs overlapping around a common level]

MCMC Convergence 24 / 32

Trace Plots (bad)

[Figure: trace plots of log-likelihood versus iteration for a poorly behaved analysis; "Full Runs" panel and "Post-Burn in" panel (about −9420 to −9360), where the runs do not settle around a common level]

●●

●●●

●●

●●●

●●●

●●●

●●

●●

●●●●

●●

●●●

●●●

●●●

●●

●●●

●●●

●●

●●●●●

●●

●●●

●●

●●●

●●

●●

●●

●●●

●●

●●

●●●●

●●●●●

●●●●

●●●●●●●●●

●●

●●●

●●●●

●●●●●●

●●

●●

●●

●●

●●

●●●●●

●●●

●●

●●●

●●●●

●●

●●●●

●●

●●●

●●

●●

●●

●●●

●●●

●●

●●●

●●

●●

●●

●●●

●●●

●●

●●●●

●●●●●

●●

●●

●●

●●●

●●●

●●●●●●

●●●●

●●

●●

●●

●●

●●

●●●

●●

●●●●●●

●●

●●

●●●

●●

●●●

●●

●●

●●

●●

●●

●●

●●●

●●●

●●

●●

●●

●●●

●●●●●●

●●●

●●●

●●

MCMC Convergence 25 / 32

ACF Plot

[Figure: sample autocorrelation function (ACF, 0.0–1.0) plotted against lag (0–100).]

MCMC Convergence 26 / 32
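The autocorrelation function shown on this slide is a standard convergence diagnostic: high autocorrelation at large lags means the chain mixes slowly and the effective sample size is small. As a minimal sketch (not from the slides), the sample ACF of a trace can be computed directly; the AR(1) chain below is a hypothetical stand-in for a slowly mixing MCMC trace.

```python
import random

def acf(xs, max_lag):
    # Sample autocorrelation of a chain, normalized so that acf[0] == 1.
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    return [
        sum((xs[t] - mean) * (xs[t + lag] - mean) for t in range(n - lag)) / (n * var)
        for lag in range(max_lag + 1)
    ]

# A sticky AR(1) chain, mimicking a slowly mixing MCMC trace.
random.seed(0)
chain = [0.0]
for _ in range(2000):
    chain.append(0.9 * chain[-1] + random.gauss(0.0, 1.0))

r = acf(chain, 100)
print(round(r[1], 2))  # roughly 0.9: strong autocorrelation at lag 1
```

A trace whose ACF decays to near zero within a few lags behaves almost like an independent sample; a slowly decaying ACF like this one signals that many more iterations are needed.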

MCMCMC

Metropolis-coupled Markov chain Monte Carlo (MCMCMC) is a commonly used method to improve mixing and MCMC convergence.

In MCMCMC, several chains are run in parallel.

All chains but one are typically heated.

The heated chains sample from the wrong distribution, but are better able to traverse among peaks.

Occasionally, swaps of the states of the chains are proposed and possibly accepted.

This can allow the cold chain to "jump".

However, results can be misleading if the heated chains do not actually mix well.

Advanced Topics MCMCMC 27 / 32
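The recipe above can be sketched in a few lines. This is a hypothetical toy version (not the slides' implementation): each chain i targets a tempered distribution pi(x)^(1/T_i), adjacent chains occasionally propose to swap states, and only the cold chain (T = 1) is recorded. The bimodal one-dimensional target stands in for a multi-peaked posterior over trees.

```python
import math
import random

random.seed(1)

def log_target(x):
    # Toy bimodal target: equal mixture of N(-3, 1) and N(3, 1).
    return math.log(0.5 * math.exp(-0.5 * (x + 3.0) ** 2)
                    + 0.5 * math.exp(-0.5 * (x - 3.0) ** 2))

def mc3(n_iter=20000, temps=(1.0, 2.0, 4.0, 8.0), step=1.0):
    betas = [1.0 / t for t in temps]   # chain i targets pi(x) ** beta_i
    x = [0.0] * len(betas)             # current state of each chain
    cold = []
    for _ in range(n_iter):
        # Metropolis update within each chain at its own temperature.
        for i, b in enumerate(betas):
            prop = x[i] + random.gauss(0.0, step)
            if math.log(random.random()) < b * (log_target(prop) - log_target(x[i])):
                x[i] = prop
        # Propose swapping the states of a random adjacent pair of chains.
        i = random.randrange(len(betas) - 1)
        log_r = (betas[i] - betas[i + 1]) * (log_target(x[i + 1]) - log_target(x[i]))
        if math.log(random.random()) < log_r:
            x[i], x[i + 1] = x[i + 1], x[i]
        cold.append(x[0])              # only the cold chain samples the target
    return cold

samples = mc3()
frac_right = sum(s > 0 for s in samples) / len(samples)
print(round(frac_right, 2))  # near 0.5 when the cold chain visits both modes
```

Without the swap moves, the cold chain would almost certainly stay trapped in whichever mode it entered first; the heated chains cross the valley between modes and the swaps carry those crossings down to the cold chain.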

MCMCMC Example

Advanced Topics MCMCMC 28 / 32

Sample from True Distribution

[Figure: scatter plot of points drawn directly from the true distribution.]

Advanced Topics MCMCMC 29 / 32

MCMCMC Sample 1

[Figure: scatter plot of a sample produced by one MCMCMC run.]

Advanced Topics MCMCMC 30 / 32

MCMCMC Sample 2

[Figure: scatter plot of a sample produced by a second MCMCMC run.]

Advanced Topics MCMCMC 31 / 32

Summary

MCMC makes Bayesian phylogenetic inference possible.

MCMC samples can be summarized in many different ways to learn about different aspects of the posterior distribution.

Do not assume that MCMC samples are good: do several runs and examine convergence carefully.
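One common summary, sketched below with hypothetical toy data (not from the slides): the posterior probability of a clade is estimated by the fraction of sampled trees that contain it.

```python
# Hypothetical sampled trees, each reduced to its set of clades
# (a clade is the frozenset of taxa descended from one internal node).
sampled_trees = [
    {frozenset({"A", "B"}), frozenset({"A", "B", "C"})},
    {frozenset({"A", "B"}), frozenset({"A", "B", "D"})},
    {frozenset({"A", "C"}), frozenset({"A", "B", "C"})},
]

def clade_probability(trees, taxa):
    # Estimate the posterior probability of a clade by the fraction
    # of sampled trees containing it.
    clade = frozenset(taxa)
    return sum(clade in tree for tree in trees) / len(trees)

print(clade_probability(sampled_trees, {"A", "B"}))  # 2 of 3 trees contain it
```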

Go Bayes!

Summary 32 / 32