TRANSCRIPT
Spatio-temporal dependence: a blessing and a curse for computation and inference
(illustrated by compositional data modeling) (and with an introduction to NIMBLE)
NSF-CBMS Workshop on Bayesian Spatial Statistics, August 2017
Christopher Paciorek, UC Berkeley Statistics
Joint work with: the PalEON project team (http://paleonproject.org) and the NIMBLE development team (http://r-nimble.org)
Funded by various NSF grants to the PalEON and NIMBLE projects
PalEON Project
Goal: Improve the predictive capacity of terrestrial ecosystem models (Friedlingstein et al. 2006, J. Climate).
"This large variation among carbon-cycle models … has been called 'uncertainty'. I prefer to call it 'ignorance'." – Prentice (2013), Grantham Institute
Critical issue: model parameterization and representation of decadal- to centennial-scale processes are poorly constrained by data.
Approach: use historical and fossil data to estimate past vegetation and climate, and use this information for model initialization, assessment, and improvement.
Fossil Pollen Data
[Figure: pollen diagram for Berry Pond, W Massachusetts, 1000–2000 AD. Proportions of Pine, Hemlock, Birch, Oak, Hickory, Beech, Chestnut, and Grass & weeds, plus the charcoal:pollen ratio and % organic matter, with the onset of the Little Ice Age and European settlement marked.]
Settlement-era Land Survey Data
[Figure: survey grid in Wisconsin; surveyor notes.]
Raw oak tree proportions (on a grid in the western portion and in irregular township areas in the eastern portion).
Outline
• Application 1: Spatial smoothing of compositional data
  – Setting: multivariate data, high-dimensional quantities, non-conjugate models
  – A hierarchical multinomial probit model with a CAR spatial process
  – Data augmentation
  – How much smoothness (in space)?
  – Computational implications
• Computational tools
  – Overview of current software
  – Introduction to NIMBLE
• Application 2: Temporal prediction of biomass from compositional data
  – How much smoothness (in time)?
  – A hierarchical stick-breaking compositional model with generalized Pareto nonstationary temporal smoothing
  – Default MCMC and computational challenges
  – Customized MCMC using NIMBLE
• Concluding thoughts
Application 1: Spatial smoothing of compositional data
• Multivariate: ~20 taxa (species)
• Sum-to-one constraint on proportions
• 8 km by 8 km grid: ~10,000 grid points
• 1.3 million trees (> 20 cm diameter) in total; ~125 trees per grid cell
Application 1: Should we model spatial dependence?
• Yes:
  • We want to estimate composition at all locations.
  • We want to smooth over noise at observed locations.
  • We are interested in joint inference for multiple locations, so we need to account for posterior covariance.
• No:
  • We would need to model the spatial dependence, with the resulting computational implications.
Application 1: Should we model multivariate dependence?
• Yes:
  • Taxa do show correlated abundance (taxa have similarities in their ecological characteristics).
  • If joint inference on multiple tree species is desired, we need a multivariate correlation structure to properly characterize uncertainty given our actual knowledge.
• No:
  • Dependence varies by location (nonstationarity).
    • E.g., hemlock and beech are positively correlated in general, but beech is not present in some locations where hemlock appears (different western range limits).
  • It would require a more complex model.
  • Locations with data have data for all taxa, so imputation is only spatial, not multivariate.
  • With no measurement error and a separable covariance, the kriging prediction for a taxon depends only on data from that taxon at other locations.
  • Inference is not focused on multi-taxon functionals.
Application 1: Spatial smoothing of compositional data
Model overview:
• Multinomial likelihood (no over-dispersion)
• One spatial process per taxon
• Sum-to-one constraint based on a multinomial probit specification
• Otherwise, no multivariate structure
• Spatial process hyperparameters
Application 1: Standard Spatial Multinomial Logit Model
A spatial multinomial logit model:

    y_i ~ Multinomial(n_i, θ(s_i))
    θ_p(s_i) = exp(g_p(s_i)) / Σ_k exp(g_k(s_i))
    g_p(·) ~ GP(σ_p)

for location i and taxon p.

Computational implications:
• No conjugacy!
• Can't integrate analytically over the latent processes.
• How to propose good values of each g process?

Consider the McCulloch and Rossi (1994) multinomial extension of the Albert and Chib (1993) data augmentation (DA) trick for probit regression.
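As a concrete illustration of the logit link above, here is a small NumPy sketch (the talk's implementation is in R; array sizes and latent values below are toy stand-ins, not the talk's data) that maps latent process values to proportions and simulates multinomial tree counts:

```python
import numpy as np

rng = np.random.default_rng(1)

def multinomial_logit_probs(g):
    """Map latent values g (n_sites x n_taxa) to proportions via the
    softmax link theta_p = exp(g_p) / sum_k exp(g_k)."""
    g = g - g.max(axis=1, keepdims=True)   # stabilize the exponentials
    e = np.exp(g)
    return e / e.sum(axis=1, keepdims=True)

# toy latent values for 3 sites and 4 taxa (stand-ins for the GP draws)
g = rng.normal(size=(3, 4))
theta = multinomial_logit_probs(g)

# simulate tree counts y_i ~ Multinomial(n_i, theta(s_i))
n = np.array([125, 125, 125])              # ~125 trees per grid cell
y = np.array([rng.multinomial(n[i], theta[i]) for i in range(3)])
```

The sum-to-one constraint holds by construction, which is exactly why the likelihood is non-conjugate in the g's.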
Application 1: Spatial Multinomial Probit Model with Data Augmentation
A spatial multinomial probit model:

    y_ij = p  iff  w_ijp = max_k w_ijk
    w_ijp ~ N(g_p(s_i), 1)
    g_p(·) ~ GP(σ_p)

for location i, tree j, and taxon p.

Computational implications:
• The data augmentation version allows conjugate updates of each g process.
• But! It introduces a new level in the model, higher-dimensional and with the potential for cross-level dependence to impede MCMC performance.
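A minimal sketch of the latent-variable construction above, assuming a single site with toy latent values g. The redraw step uses rejection sampling purely for clarity; a real DA sampler (as in Albert and Chib) draws the truncated normals exactly:

```python
import numpy as np

rng = np.random.default_rng(2)

n_taxa, n_trees = 4, 1000
g = rng.normal(size=n_taxa)                  # latent process values at one site

# forward model: each tree j gets latent w_jp ~ N(g_p, 1); taxon = argmax
w = g + rng.normal(size=(n_trees, n_taxa))
y = w.argmax(axis=1)

def redraw_w(y, g, rng, max_tries=1000):
    """One DA step: redraw w given (y, g), respecting the observed argmax.
    Rejection sampling here is for illustration only."""
    w_new = np.empty((len(y), len(g)))
    for j, yj in enumerate(y):
        for _ in range(max_tries):
            cand = g + rng.normal(size=len(g))
            if cand.argmax() == yj:
                w_new[j] = cand
                break
        else:
            w_new[j] = g  # fallback; exact samplers use truncated normals
    return w_new

w_new = redraw_w(y, g, rng)
```

Given the redrawn w's, the update of g is a conjugate normal update, which is the point of the augmentation.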
Application 1: How much smoothness?
The application is based on an 8 km grid, so CAR-style (i.e., Markov random field) models are a natural choice. How smooth spatially?
• First-order (simple neighborhood) CAR models: not smooth spatially.
• Second-order (thin-plate spline) CAR models: very smooth spatially.
• Lindgren et al. (2011) SPDE approximation to a Matern-based Gaussian process: range parameter and limited control over the differentiability parameter.

    y_ij = p  iff  w_ijp = max_k w_ijk
    w_ijp ~ N(g_p(s_i), 1)
    g_p ~ N(0, σ²_p Q⁻)   (ICAR)
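To make the first-order (ICAR) structure concrete, here is a sketch that builds the sparse, singular precision matrix Q for a small rook-neighbor grid (the grid size is arbitrary; a dense array stands in for a true sparse format):

```python
import numpy as np

def icar_precision(nrow, ncol):
    """First-order ICAR precision Q for a nrow x ncol grid with rook
    neighbors: Q[i,i] = number of neighbors, Q[i,j] = -1 if i ~ j.
    Q is singular (rows sum to zero), hence the generalized inverse Q^-
    in g_p ~ N(0, sigma_p^2 Q^-)."""
    n = nrow * ncol
    Q = np.zeros((n, n))
    for r in range(nrow):
        for c in range(ncol):
            i = r * ncol + c
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < nrow and 0 <= cc < ncol:
                    Q[i, i] += 1
                    Q[i, rr * ncol + cc] = -1
    return Q

Q = icar_precision(4, 5)
```

Each row has at most five nonzeros regardless of grid size, which is the sparsity that conjugate updates exploit.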
Application 1: Smoothness and computation
• Sparse precision matrices:
  • Very computationally efficient for conjugate updates.
  • Without conjugacy, it is not clear how to generate good proposals for the entire spatial field for a taxon, so the computational efficiency is of limited relevance.
    • Location-specific updates would mix poorly when there is strong spatial dependence.
  • Simple CAR models may show reasonable mixing for spatial process values with fixed hyperparameters because of their lesser spatial smoothness.
• Cross-level dependence arises from separate updates of latent data values, spatial process values, and spatial hyperparameters:
  • Updates of the spatial process and hyperparameters are not directly informed by the data.
Application 1: MCMC design
• Cross-level dependence arises from separate updates of latent data values, spatial process values, and spatial hyperparameters.
• Adequate performance required joint (cross-level) updates of {g_p, σ_p}:
  • Metropolis proposal for σ_p with a conjugate proposal for g_p.
  • Equivalent to marginalizing over g_p, but avoids the correlated truncated normal density for w.
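The flavor of this joint update can be sketched on a deliberately simplified stand-in model: data w ~ N(g, 1) with an iid prior g ~ N(0, σ²I) instead of the ICAR prior, and a flat prior on σ. This is not the talk's actual sampler, just the Metropolis-on-σ-with-conjugate-g idea in miniature, where the acceptance ratio reduces to a ratio of marginal likelihoods with g integrated out:

```python
import numpy as np

rng = np.random.default_rng(3)

true_sigma = 2.0
n = 200
w = rng.normal(0, true_sigma, n) + rng.normal(size=n)  # w ~ N(g, 1), g ~ N(0, sigma^2)

def log_marginal(w, sigma):
    """Marginal likelihood with g integrated out: w ~ N(0, (sigma^2 + 1) I)."""
    v = sigma ** 2 + 1.0
    return -0.5 * (len(w) * np.log(2 * np.pi * v) + (w ** 2).sum() / v)

def draw_g(w, sigma, rng):
    """Conjugate full conditional for g given (w, sigma)."""
    shrink = sigma ** 2 / (sigma ** 2 + 1.0)
    return rng.normal(shrink * w, np.sqrt(shrink))

sigma = 1.0
sigmas = []
for it in range(2000):
    prop = sigma + 0.2 * rng.normal()          # Metropolis proposal for sigma
    if prop > 0 and np.log(rng.uniform()) < log_marginal(w, prop) - log_marginal(w, sigma):
        sigma = prop                           # accept via the MARGINAL ratio
    g = draw_g(w, sigma, rng)                  # conjugate draw given current sigma
    sigmas.append(sigma)

est = float(np.mean(sigmas[500:]))
```

Because g is proposed from its full conditional, the joint update behaves like marginal MCMC on σ alone, which is what breaks the cross-level dependence.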
Application 1: MCMC implementation
• The overall MCMC is written in R.
• Truncated normal computations are done in C++ via Rcpp (can also use OpenMP for parallelization).
• Joint {g_p, σ_p} samples are done in R using sparse matrix computations with the spam package (which uses Fortran).
• Even with customization, the MCMC takes on the order of two weeks.
• The computation pre-dates NIMBLE, but NIMBLE is designed to allow users to set up customized MCMC sampling for components of models.
  • E.g., the joint {g_p, σ_p} sampling could be coded as a user-defined sampler in NIMBLE (and NIMBLE provides such a sampler for some such situations).
Application 1: MCMC performance
Trace plots for taxon-specific hyperparameters
[Figure: trace plots over 250 iterations for Ash (~0.16–0.22), Basswood (~0.20–0.26), Beech (~0.65–0.80), Birch (~0.16–0.20), Gum (~0.2–1.0), and Cedar (~0.46–0.54).]
Application 1: Results
Model selection:
• First-order CAR and the Lindgren GP approximation have similar performance, but the GP approximation has anomalies at the spatial boundaries.
• Second-order (thin-plate spline) CAR is too smooth.
Prediction:
http://gandalf.berkeley.edu:3838/paciorek/setVegComp
Bayesian software landscape
Hand-coded algorithms:
• R, Python: fast to develop and easy to share, but slow computation
• C++, Rcpp: slower to develop and harder to share, but fast computation
• Julia: fast to develop and fast computationally, but less widely used
Black-box MCMC engines:
• JAGS: single-variable samplers with a focus on conjugate samplers
• Stan: Hamiltonian MC, variational Bayes
• PyMC3: flexible sampler choice, Hamiltonian MC, variational Bayes
NIMBLE:
• Customizable MCMC and other algorithms, plus a system for programming algorithms for hierarchical models in R
Application 1: Software needs
• Exploit sparsity
• Flexibility in choosing samplers for parts of the model
• Joint sampling of spatially-dependent process values
• Customized joint sampling of hyperparameters and the spatial process to improve mixing
• Use compiled code for computational bottlenecks

Notes:
• NIMBLE can't do all of this yet (no sparse matrices right now), but it is designed for such flexibility.
• It would be interesting to compare the performance of my customized sampling to Stan's HMC.
Existing software
[Diagram: a state-space model (X(1) → X(2) → X(3) with observations Y(1), Y(2), Y(3)) bound to a single Model → Algorithm pipeline.]
e.g., BUGS (WinBUGS, OpenBUGS, JAGS), INLA, Stan, various R packages
NIMBLE: The Goal
[Diagram: the same state-space model, plus an algorithm language, yields many algorithms for one model: MCMC flavor 1, MCMC flavor 2, particle filter, importance sampler, maximum likelihood, quadrature, MCEM, data cloning, your new method.]
Divorcing Model Specification from Algorithm
NIMBLE’s goals
– Retaining BUGS compa$bility – Providing a variety of standard algorithms – Allowing developers to add new algorithms (including modular combina8on of algorithms)
– Allowing users to operate within R – Providing speed via compila$on to C++, with R wrappers
Spa$o-‐temporal dependence: a blessing and a curse for computa$on and inference 23
statistical model (BUGS code) + algorithm (nimbleFunction)
→ R objects + R under the hood
→ R objects + C++ under the hood:
  • we generate C++ code,
  • compile and load it,
  • provide an interface object.
NIMBLE System Summary
NIMBLE
1. Model specification: BUGS language → R/C++ model object
2. Algorithm library: MCMC, particle filter/sequential MC, etc.
3. Programming algorithms: NIMBLE programming language within R → R/C++ algorithm object
The Success of R
You give NIMBLE:

littersCode <- nimbleCode({
  for(j in 1:G) {
    for(i in 1:N) {
      r[i, j] ~ dbin(p[i, j], n[i, j])
      p[i, j] ~ dbeta(a[j], b[j])
    }
    mu[j] <- a[j] / (a[j] + b[j])
    theta[j] <- 1.0 / (a[j] + b[j])
    a[j] ~ dgamma(1, 0.001)
    b[j] ~ dgamma(1, 0.001)
  }
})

You get this:

> littersModel$a[1] <- 5                        # set values in model
> simulate(littersModel, 'p')                   # simulate from prior
> p_deps <- littersModel$getDependencies('p')   # model structure
> calculate(littersModel, p_deps)               # calculate probability density
> getLogProb(littersModel, 'r')
NIMBLE: Programming with Models
NIMBLE also extends BUGS: multiple parameterizations, named parameters, and user-defined distributions and functions.
> littersMCMCconf <- configureMCMC(littersModel)
> littersMCMCconf$printSamplers()
[...snip...]
[3] RW sampler; targetNode: b[1], adaptive: TRUE, adaptInterval: 200, scale: 1
[4] RW sampler; targetNode: b[2], adaptive: TRUE, adaptInterval: 200, scale: 1
[5] conjugate_beta sampler; targetNode: p[1, 1], dependents_dbin: r[1, 1]
[6] conjugate_beta sampler; targetNode: p[1, 2], dependents_dbin: r[1, 2]
[...snip...]
> littersMCMCconf$addSampler('a[1]', 'slice', list(adaptInterval = 100))
> littersMCMCconf$addSampler('a[2]', 'slice', list(adaptInterval = 100))
> littersMCMCconf$addMonitors('theta')
> littersMCMC <- buildMCMC(littersMCMCconf)
> littersMCMC_Cpp <- compileNimble(littersMCMC, project = littersModel)
> littersMCMC_Cpp$run(20000)
User Experience: Specializing an Algorithm to a Model

littersModelCode <- modelCode({
  for(j in 1:G) {
    for(i in 1:N) {
      r[i, j] ~ dbin(p[i, j], n[i, j])
      p[i, j] ~ dbeta(a[j], b[j])
    }
    mu[j] <- a[j] / (a[j] + b[j])
    theta[j] <- 1.0 / (a[j] + b[j])
    a[j] ~ dgamma(1, 0.001)
    b[j] ~ dgamma(1, 0.001)
  }
})

sampler_slice <- nimbleFunction(
  setup = function(model, mvSaved, control) {
    calcNodes <- model$getDependencies(control$targetNode)
    discrete <- model$getNodeInfo()[[control$targetNode]]$isDiscrete()
    [...snip...]
  },
  run = function() {
    u <- getLogProb(model, calcNodes) - rexp(1, 1)
    x0 <- model[[targetNode]]
    L <- x0 - runif(1, 0, 1) * width
    [...snip...]
  }
  ...
)
NIMBLE
1. Model specification: BUGS language → R/C++ model object
2. Algorithm library: MCMC, particle filter/sequential MC, MCEM, etc.
3. Programming algorithms: NIMBLE programming language within R → R/C++ algorithm object
NIMBLE’s algorithm library – MCMC samplers:
• Conjugate, adap$ve Metropolis, adap$ve blocked Metropolis, slice, ellip$cal slice sampler, par$cle MCMC, specialized samplers for par$cular distribu$ons (Dirichlet, CAR) • Flexible choice of sampler for each parameter • User-‐specified blocks of parameters
– Sequen$al Monte Carlo (par$cle filters) • Various flavors
– MCEM – Write your own
Spa$o-‐temporal dependence: a blessing
and a curse for computa$on and inference 30
NIMBLE
1. Model specification: BUGS language → R/C++ model object
2. Algorithm library: MCMC, particle filter/sequential MC, etc.
3. Algorithm specification: NIMBLE programming language within R → R/C++ algorithm object
NIMBLE: Programming With Models
We want:
• High-level processing (model structure) in R
• Low-level processing in C++
sampler_myRW <- nimbleFunction(
  setup = function(model, mvSaved, targetNode, scale) {
    calcNodes <- model$getDependencies(targetNode)
  },
  run = function() {
    model_lp_initial <- calculate(model, calcNodes)
    proposal <- rnorm(1, model[[targetNode]], scale)
    model[[targetNode]] <<- proposal
    model_lp_proposed <- calculate(model, calcNodes)
    log_MH_ratio <- model_lp_proposed - model_lp_initial
    if(decide(log_MH_ratio)) jump <- TRUE
    else jump <- FALSE
    # ... various bookkeeping operations ...
  })
Two kinds of functions
NIMBLE: Programming With Models
In sampler_myRW, the setup code (calcNodes <- model$getDependencies(targetNode)) queries the model structure ONCE.
NIMBLE: Programming With Models
The run code is the actual (generic) algorithm.
NIMBLE: Programming With Models
The NIMBLE compiler (run code). Feature summary:
• R-like matrix algebra (using the Eigen library)
• R-like indexing (e.g., X[1:5, ])
• Use of model variables and nodes
• Model calculate (logProb) and simulate functions
• Sequential integer iteration
• If-then-else, do-while
• Access to much of Rmath.h (e.g., distributions)
• Automatic R interface / wrapper
• Call out to your own C/C++ or back to R
• Many improvements / extensions planned
NIMBLE: What can I program?
• Your own distribution for use in a model
• Your own function for use in a model
• Your own MCMC sampler for a variable in a model
• A new MCMC sampling algorithm for general use
• A new algorithm for hierarchical models
• An algorithm that composes other existing algorithms (e.g., MCMC-SMC combinations)
Status of NIMBLE and Next Steps
• First release was June 2014, with regular releases since. Lots to do:
  – Improve the user interface and speed up compilation
  – Refinement/extension of the NIMBLE programming language
    • e.g., automatic differentiation, parallelization, sparse matrices
  – Additional algorithms written in the NIMBLE DSL
    • e.g., normalizing constant calculations, Laplace approximations, HMC and other samplers
    • Bayesian nonparametrics with Claudia Wehrhahn Cortes and Abel Rodriguez (UCSC)
• Interested?
  – Announcements: nimble-announce Google site
  – User support/discussion: nimble-users Google site
  – Write an algorithm using NIMBLE!
  – Help with development of NIMBLE: email [email protected] or see github.com/nimble-dev
Application 2: Predicting biomass from compositional data
[Figure: the Berry Pond pollen diagram shown earlier.]
Prediction: based on the calibration model and pollen composition over time, predict biomass.
Calibration: at settlement time we have biomass estimates (based on survey data and a spatial model) and pollen composition (from sediment cores).
[Figure: map of biomass point estimates at settlement (Mg/ha), and a histogram of biomass estimates at ponds (roughly 0–150 Mg/ha).]
Application 2: Calibration model
• Pollen proportion for each taxon is determined by a transformation of a flexible (spline) function of biomass.
  • The shape1 and shape2 parameters of a beta distribution are splines of biomass.
  • The primary calibration parameters are the spline coefficients.
• Multinomial likelihood for pollen counts given the modeled proportions.
• Fit in NIMBLE (could be fit in various other packages).
Application 2: Calibration model fit
[Figure: calibration model fit. Observed and modeled pollen proportions vs. biomass (0–140 Mg/ha) for PINUSX, prairie, QUERCUS, and other herbs.]
Mean and variability of the modeled pollen proportions across ponds vary with biomass.
Application 2: Prediction Model
[Diagram: for time t and taxon k, biomass b_t evolves with innovation SD σ; a spline basis Z(b_t) with coefficients β1k, β2k gives shape parameters α1tk, α2tk, which give proportions p_tk and pollen counts Y_tk.]
for(t in 1:nTimes) {
  # pollen likelihood
  Y[t, 1] ~ dbetabin(alpha1[t, 1], alpha2[t, 1], n[t])
  for(k in 2:(nTaxa-1)) {
    Y[t, k] ~ dbetabin(alpha1[t, k], alpha2[t, k], n[t] - sum(Y[t, 1:(k-1)]))
  }
}
for(k in 1:nTaxa) {
  for(t in 1:nTimes) {
    # latent predictor
    alpha1[t, k] <- exp(Zb[t, 1:nKnots] %*% beta1[1:nKnots, k])
    alpha2[t, k] <- exp(Zb[t, 1:nKnots] %*% beta2[1:nKnots, k])
  }
}
for(t in 1:nTimes)
  Zb[t, 1:nKnots] <- bspline(b[t], knots[1:nKnots])
# biomass evolution
for(t in 2:nTimes)
  b[t] ~ dnorm(b[t-1], sd = sigma)
# hyperpriors
sigma ~ dunif(0, 10)   # Gelman (2006)
b[1] ~ dunif(0, 400)
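The sequential beta-binomial ("stick-breaking") pollen likelihood can be sketched generatively: each taxon draws a beta-binomial share of whatever the earlier taxa left over, and the last taxon receives the remainder. The shape parameters below are arbitrary toy values, not spline outputs as in the model:

```python
import numpy as np

rng = np.random.default_rng(4)

def rbetabin(a, b, n, rng):
    """Draw from a beta-binomial(a, b, n) by compounding."""
    return rng.binomial(n, rng.beta(a, b))

def stick_breaking_counts(alpha1, alpha2, n, rng):
    """Sequentially allocate n pollen grains to K taxa, enforcing the
    sum constraint: taxon K gets whatever is left."""
    K = len(alpha1) + 1
    Y = np.zeros(K, dtype=int)
    remaining = n
    for k in range(K - 1):
        Y[k] = rbetabin(alpha1[k], alpha2[k], remaining, rng)
        remaining -= Y[k]
    Y[K - 1] = remaining
    return Y

alpha1 = np.array([2.0, 1.0, 3.0])   # toy shape parameters (in the model
alpha2 = np.array([5.0, 4.0, 2.0])   # these come from splines of b_t)
Y = stick_breaking_counts(alpha1, alpha2, 300, rng)
```

The counts sum to n by construction, which is how the model encodes the compositional constraint without a Dirichlet.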
Application 2: Biomass prediction at one site
[Figure: map of calibration sites and the prediction site (red) in the upper Midwest (longitude −95 to −85, latitude 42–50), and predicted biomass (roughly 0–100 Mg/ha) over time, 8000 BC to 2000 AD, at the prediction site.]
Spatio-temporal dependence: a blessing and a curse for computation and inference
Key ecological question: how does biomass (carbon storage) evolve over time?
Statistical question: how do we model the temporal process? How much smoothness?
• A discrete first-order autoregressive (i.e., CAR) model is not smooth
• A discrete second-order autoregressive model (i.e., thin plate spline) is very smooth
• Nonstationarity?
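The smoothness contrast between the first- and second-order autoregressive priors can be seen by simulating one path from each. This small R sketch is illustrative only (not from the slides); the variable names are mine.

```r
# Simulated draws from first- and second-order random-walk priors,
# illustrating the smoothness difference noted above.
set.seed(1)
T <- 200
eps <- rnorm(T)
rw1 <- cumsum(eps)          # first order:  b_t = b_{t-1} + eps_t           (rough paths)
rw2 <- cumsum(cumsum(eps))  # second order: b_t = 2*b_{t-1} - b_{t-2} + eps_t (smooth paths)
plot(rw2, type = "l", xlab = "time", ylab = "process value")
lines(rw1, col = "red")
```

The cumulative-sum construction is equivalent to the autoregressive recursions in the comments, since differencing a random walk once (or twice) recovers white noise.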
Application 2: Generalized Pareto / Trend filtering
• A discrete autoregressive model is a model (prior) for temporal contrasts (in biomass)
• Nonstationarity could be achieved by setting some contrasts to zero
• Reversible jump
• L1 prior (Laplace / double exponential) a la the lasso
• The generalized Pareto extends the Laplace prior, building on extensive work on the properties of shrinkage priors (Carvalho et al. (2010), Tansey et al. (2016), Taddy (2013))
• Looks like the Laplace prior but with fatter tails
• Could consider first-order (piecewise constant), second-order (piecewise linear), or third-order (piecewise quadratic) contrasts
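The contrast orders listed above are just successive differences of the biomass series. A quick R illustration with a toy series (my own example, not from the slides):

```r
# kth-order contrasts are kth differences of the biomass series b
b <- c(1, 4, 9, 16, 25, 36)       # toy series: a quadratic in time
d1 <- diff(b)                     # first order: zeroing these gives piecewise-constant fits
d2 <- diff(b, differences = 2)    # second order: piecewise linear
d3 <- diff(b, differences = 3)    # third order: piecewise quadratic
d3                                # all zero, since b is exactly quadratic
```

Note that `diff(b, differences = 3)` yields b[t] − 3 b[t−1] + 3 b[t−2] − b[t−3], matching the third-order autoregressive mean used in the model that follows.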
• Marginalized model (third order):
  b_t ~ GenPar(3 b_{t−1} − 3 b_{t−2} + b_{t−3}, κ, λ)
• The sparsity-inducing prior and the modeling of contrasts produce very complicated, and often very strong, temporal dependence
• Hard to make good MCMC proposals
• Model (third order) with data augmentation:
  b_t ~ N(3 b_{t−1} − 3 b_{t−2} + b_{t−3}, ω_t)
  ω_t ~ Exp(λ_t² / 2)
  λ_t ~ Ga(κ, λ)
• Now we have a normal prior for b_{1:T} but no conjugacy, so it is still hard to find good proposals
• And we have additional hierarchical levels that can impede MCMC mixing
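A minimal NIMBLE sketch of the data-augmented model above. The hyperparameter names (`kappa`, `theta`) and the priors on them and on the initial values are illustrative stand-ins (the original symbols were garbled in extraction), not the project's actual choices.

```r
library(nimble)

# Sketch of the augmented trend-filtering model (third-order contrasts).
# Hyperparameter names and their priors are illustrative assumptions.
code <- nimbleCode({
  for(t in 4:T) {
    b[t] ~ dnorm(3*b[t-1] - 3*b[t-2] + b[t-3], var = omega[t])
    omega[t] ~ dexp(rate = lambda[t]^2 / 2)  # exponential mixing gives a Laplace marginal
    lambda[t] ~ dgamma(kappa, theta)         # gamma mixing on the scale fattens the tails
  }
  for(t in 1:3) {
    b[t] ~ dnorm(0, sd = 100)                # diffuse prior on the initial values
  }
  kappa ~ dunif(0, 100)
  theta ~ dunif(0, 100)
})
model <- nimbleModel(code, constants = list(T = 50))
```

Integrating ω_t and λ_t out of this hierarchy recovers the heavier-tailed Laplace-like marginal prior on the contrasts described above.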
Application 2: MCMC performance
[Figure: trace plots of biomass at −5000 years over 1000 MCMC iterations. One panel shows mixing with data augmentation using the default NIMBLE MCMC; the other shows mixing in the marginalized model using HMC in Stan, with two chains (recall the non-differentiable spike at zero from the generalized Pareto).]
Application 2: MCMC performance (2)
[Figure: Stan-based posterior correlations of biomass process values, shown separately for chains 1 and 2 (correlation scale roughly −0.5 to 1).]
Application 2: Customized block sampling in NIMBLE
1. Use data augmentation with a normal approximation to the likelihood [y|b] at each point to provide approximately conjugate proposals for the biomass process
   • Simple to approximate using the mode and curvature of the likelihood
2. Joint updates for (ω_t, λ_t, b_{t−l:t+l}): a bivariate random walk for the hyperparameters and an approximate conjugate update for the biomass process values
   • Joint updating of hyperparameters and process addresses cross-level dependency
   • Joint updating of multiple biomass values addresses temporal dependency
   • Local neighborhood updates for biomass reduce computation and avoid high-dimensional approximate conjugacy

Sampling is done in NIMBLE using a user-defined sampler, combined with standard samplers for the other model parameters.
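A scheme like this is written in NIMBLE as a user-defined sampler: a nimbleFunction with setup and run code that can be attached to an MCMC configuration. The skeleton below shows only the required structure; the joint proposal logic is elided and all names are illustrative, not the project's actual code.

```r
library(nimble)

# Skeleton of a user-defined block sampler; the actual joint proposal for
# (omega_t, lambda_t, b_{t-l:t+l}) is elided.
sampler_localBlock <- nimbleFunction(
  contains = sampler_BASE,
  setup = function(model, mvSaved, target, control) {
    calcNodes <- model$getDependencies(target)
  },
  run = function() {
    currentLogProb <- model$getLogProb(calcNodes)
    ## ... propose jointly for the hyperparameters and the local biomass
    ## values, compute the acceptance probability, accept or reject ...
    copy(from = model, to = mvSaved, row = 1, nodes = calcNodes, logProb = TRUE)
  },
  methods = list(reset = function() {})
)
## attach it in an MCMC configuration, e.g.:
## conf$addSampler(target = "b[10:14]", type = sampler_localBlock)
```

Writing the sampler as a nimbleFunction lets it be compiled along with the model, so the customized scheme runs at compiled speed alongside NIMBLE's standard samplers.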
Application 2: Customized MCMC performance
Mixing with data augmentation using the customized NIMBLE MCMC
[Figure: trace plots over 5000 iterations of biomass at −3000 years and of sigma, plus posterior correlations of the biomass process values.]
Application 2: Initial results
[Figure: estimated biomass over time (years −8000 to 0), with pointwise MLEs shown as points.]
Concluding thoughts
• The spatio(-temporal) dependence we need for smoothing/prediction can greatly affect algorithm performance.
• Blocked sampling can address dependence, but good proposals can be hard to find, particularly with:
  – non-conjugate models and
  – dependence across model levels.
• Even with algorithmic advances, computational limitations still greatly restrict our ability to fit rich model structures.
• NIMBLE provides a platform for:
  – customizing algorithms for particular models and
  – developing general-purpose algorithms for hierarchical models.
PalEON Acknowledgements
• Pollen-biomass collaborators: Ann Raiho, Jason McLachlan (Notre Dame Biology)
• PalEON investigators: Jason McLachlan (Notre Dame, PI), Mike Dietze (Boston U.), Andrew Finley (Michigan State), Amy Hessl (West Virginia), Phil Higuera (Idaho), Mevin Hooten (USGS/Colorado State), Steve Jackson (USGS/Arizona), Dave Moore (Arizona), Neil Pederson (Harvard Forest), Jack Williams (Wisconsin), Jun Zhu (Wisconsin)
• NSF Macrosystems Program
NIMBLE Acknowledgements
NIMBLE development team:
• Perry de Valpine (PI), UC Berkeley Environmental Science, Policy and Management (ESPM)
• Daniel Turek, Williams College
• Nick Michaud, UC Berkeley Statistics and ESPM
• Fritz Obermeyer, UC Berkeley Statistics and ESPM
• Duncan Temple Lang, UC Davis Statistics
• and various development team alumni

NIMBLE can be installed from CRAN in the usual way for an R package, and a full website with a link to the User Manual is at http://r-nimble.org.
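The standard CRAN installation pattern mentioned above:

```r
# Install NIMBLE from CRAN as an ordinary R package, then load it
install.packages("nimble")
library(nimble)
```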
References
• Albert, J. H., and S. Chib (1993). "Bayesian analysis of binary and polychotomous response data." Journal of the American Statistical Association 88(422): 669-679.
• Carvalho, C. M., N. G. Polson, and J. G. Scott (2010). "The horseshoe estimator for sparse signals." Biometrika 97(2): 465-480.
• de Valpine, P., et al. (2017). "Programming with models: writing statistical algorithms for general model structures with NIMBLE." Journal of Computational and Graphical Statistics 26(2): 403-413.
• Lindgren, F., H. Rue, and J. Lindström (2011). "An explicit link between Gaussian fields and Gaussian Markov random fields: the stochastic partial differential equation approach." Journal of the Royal Statistical Society: Series B 73(4): 423-498.
• McCulloch, R., and P. E. Rossi (1994). "An exact likelihood analysis of the multinomial probit model." Journal of Econometrics 64(1): 207-240.
• Taddy, M. (2013). "Multinomial inverse regression for text analysis." Journal of the American Statistical Association 108(503): 755-770.
• Tansey, W., A. Athey, A. Reinhart, and J. G. Scott (2017). "Multiscale spatial density smoothing: an application to large-scale radiological survey and anomaly detection." Journal of the American Statistical Association.