
Page 1: Bayesian Modeling and Inference for High-Dimensional

Bayesian Modeling and Inference for High-Dimensional Spatiotemporal Datasets

Sudipto Banerjee

University of California, Los Angeles, USA

Sudipto Banerjee (UCLA) Climate Informatics 2016: NNGP models

Page 2: Bayesian Modeling and Inference for High-Dimensional

Based upon projects involving:

- Abhirup Datta (Johns Hopkins University)
- Andrew O. Finley (Michigan State University)
- Nicholas A.S. Hamm (University of Twente)
- Martijn Schaap (TNO Built Environment and Geosciences)

Page 3: Bayesian Modeling and Inference for High-Dimensional

Example 1: U.S. forest biomass data

Figure: Observed biomass (left) and NDVI (right)

- Forest biomass data collected over 114,371 plots
- Normalized Difference Vegetation Index (NDVI) is a measure of greenness
- Forest biomass regression model:

  Biomass(ℓ) = β0(ℓ) + β1(ℓ) NDVI(ℓ) + error


Page 4: Bayesian Modeling and Inference for High-Dimensional

Example 2: European Particulate Matter (PM10) data

Figure: (a) PM10 levels in March, 2009; (b) PM10 levels in June, 2009. Monitoring stations plotted by Easting and Northing (km).

- Significant variation across space and time
- Daily observations at 308 stations for 2 years, i.e., n = 308 × 730 = 224,840


Page 5: Bayesian Modeling and Inference for High-Dimensional

Example 2: European PM10 data

- Computer models like the Chemistry Transport Model (CTM) consistently underestimate PM10 levels
- CTM outputs used as covariates to improve fits:

  log(PM10)(ℓ) = β0(ℓ) + β1(ℓ) CTM(ℓ) + ε(ℓ)


Page 6: Bayesian Modeling and Inference for High-Dimensional

Example 3: Tanana Valley (Alaska) forest canopy height analysis

Figure: Tanana Valley, Alaska, study region. (a) G-LiHT flight lines where canopy height was measured at ∼6 × 10^6 locations over the percent forest canopy covariate. (b) Occurrence of forest fire in the past 20 years and areas of interest for prediction illustration.


Page 7: Bayesian Modeling and Inference for High-Dimensional

Spatiotemporal regression models

- Y(ℓ) = β0(ℓ) + X(ℓ) β1(ℓ) + e(ℓ)

- Produce maps for the intercept and slope:

  {β0(ℓ) : ℓ ∈ L} and {β1(ℓ) : ℓ ∈ L}

- L is a spatial domain (e.g., D ⊂ ℝ^d) or a spatiotemporal domain (e.g., D ⊂ ℝ^d × ℝ^+)

- Potentially very rich: understand the spatially- and/or temporally-varying impact of predictors on the outcome.

- Model-based predictions: Y(ℓ0) | {y(ℓ1), y(ℓ2), ..., y(ℓn)}.


Page 8: Bayesian Modeling and Inference for High-Dimensional

Gaussian spatiotemporal process

- {w(ℓ) : ℓ ∈ L} ∼ GP(0, Kθ(·,·)) implies

  w = (w(ℓ1), w(ℓ2), ..., w(ℓn))ᵀ ∼ N(0, Kθ)

  for every finite set of points ℓ1, ℓ2, ..., ℓn.

- Kθ = {Kθ(ℓi, ℓj)} is a spatial variance-covariance matrix

- Stationarity: Kθ(ℓ, ℓ′) = Kθ(ℓ − ℓ′). Isotropy: Kθ(ℓ, ℓ′) = Kθ(‖ℓ − ℓ′‖).

- With a "nugget" (esp. when modeling data): Kθ = C(σ, φ) + Dτ, where θ = {σ, φ, τ}

- No nugget (esp. when modeling random effects): Kθ = C(σ, φ), where θ = {σ, φ} (see the covariance sketch below)
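As a concrete illustration, here is a minimal R sketch of one common choice, an exponential covariance with a nugget; the kernel, the parameter values, and the five random locations are illustrative assumptions, not taken from the talk.

## Illustrative sketch: exponential covariance with a nugget,
## K_theta(l, l') = sigma2 * exp(-phi * ||l - l'||) + tau2 * 1(l == l')
exp_cov <- function(coords, sigma2 = 1, phi = 0.5, tau2 = 0.1) {
  d <- as.matrix(dist(coords))            # pairwise Euclidean distances
  sigma2 * exp(-phi * d) + tau2 * diag(nrow(coords))
}

set.seed(1)
coords <- cbind(runif(5), runif(5))       # five random 2-D locations
K <- exp_cov(coords)
w <- drop(crossprod(chol(K), rnorm(5)))   # one draw w ~ N(0, K_theta)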


Page 9: Bayesian Modeling and Inference for High-Dimensional

Likelihood from (full rank) GP models

- L = {ℓ1, ℓ2, ..., ℓn} are the locations where data are observed

- y(ℓi) is the outcome at the ith location, y = (y(ℓ1), y(ℓ2), ..., y(ℓn))ᵀ

- Model: y ∼ N(Xβ, Kθ)

- Estimating process parameters from the likelihood:

  −(1/2) log det(Kθ) − (1/2) (y − Xβ)ᵀ Kθ⁻¹ (y − Xβ)

- Bayesian inference: priors on {β, θ}

- Challenges: storage of Kθ and computing chol(Kθ) = LDLᵀ, which costs O(n³) flops (see the sketch below)
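A minimal R sketch (reusing the illustrative exp_cov objects from the previous sketch) of how this log-likelihood is typically evaluated through a Cholesky factorization; the cubic cost of chol() is exactly the bottleneck that the rest of the talk addresses.

## Sketch: Gaussian log-likelihood for y ~ N(X beta, K_theta) via Cholesky
gp_loglik <- function(y, X, beta, K) {
  R <- chol(K)                              # O(n^3): the computational bottleneck
  r <- y - X %*% beta
  z <- backsolve(R, r, transpose = TRUE)    # solves t(R) z = r
  -sum(log(diag(R))) - 0.5 * sum(z^2)       # omits the constant -(n/2) log(2*pi)
}

## e.g., with coords, K, w from the previous sketch:
X <- cbind(1, rnorm(5)); beta <- c(1, 0.5)
y <- drop(X %*% beta) + w
gp_loglik(y, X, beta, K)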


Page 10: Bayesian Modeling and Inference for High-Dimensional

Burgeoning literature on spatial big data

- Low-rank models (Wahba, 1990; Higdon, 2002; Kamman & Wand, 2003; Paciorek, 2007; Rasmussen & Williams, 2006; Stein, 2007, 2008; Cressie & Johannesson, 2008; Banerjee et al., 2008, 2010; Gramacy & Lee, 2008; Sang et al., 2011, 2012; Lemos et al., 2011; Guhaniyogi et al., 2011, 2013; Salazar et al., 2013; Katzfuss, 2016)

- Spectral approximations and composite likelihoods (Fuentes, 2007; Paciorek, 2007; Eidsvik et al., 2016)

- Multi-resolution approaches (Nychka, 2002; Johannesson et al., 2007; Matsuo et al., 2010; Tzeng & Huang, 2015; Katzfuss, 2016)

- Sparsity (solve Ax = b with (i) sparse A or (ii) sparse A⁻¹):
  1. Covariance tapering (Furrer et al., 2006; Du et al., 2009; Kaufman et al., 2009; Shaby and Ruppert, 2013)
  2. GMRF approximations to GPs: INLA (Rue et al., 2009; Lindgren et al., 2011)
  3. LAGP (Gramacy et al., 2014; Gramacy and Apley, 2015)
  4. Nearest-neighbor models (Vecchia, 1988; Stein et al., 2004; Stroud et al., 2014; Datta et al., 2016)

Page 11: Bayesian Modeling and Inference for High-Dimensional

Reduced (Low) rank models

- Kθ ≈ Bθ K*θ Bθᵀ + Dθ

- Bθ is an n × r matrix of spatial basis functions, with r << n

- K*θ is an r × r spatial covariance matrix

- Dθ is either diagonal or sparse

- Examples: kernel projections, splines, predictive process, FRK, spectral basis, ...

- Computations exploit the above structure: roughly O(nr²) << O(n³) flops (see the Woodbury sketch below)
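The O(nr²) cost comes from solving with Kθ through the Sherman-Morrison-Woodbury identity rather than factoring the full n × n matrix. A minimal sketch under the assumption that Dθ is diagonal; B, Kstar, d, and the simulated numbers are illustrative placeholders, not the talk's implementation.

## Sketch: solve (B Kstar B' + D) x = b via the Woodbury identity, D diagonal
## (B Kstar B' + D)^{-1} = D^{-1} - D^{-1} B (Kstar^{-1} + B' D^{-1} B)^{-1} B' D^{-1}
lowrank_solve <- function(B, Kstar, d, b) {
  Dinv_b <- b / d
  Dinv_B <- B / d                             # row i of B scaled by 1/d_i
  M <- solve(Kstar) + crossprod(B, Dinv_B)    # r x r "capacitance" matrix
  Dinv_b - Dinv_B %*% solve(M, crossprod(B, Dinv_b))
}

set.seed(2)
n <- 1000; r <- 25
B <- matrix(rnorm(n * r), n, r); Kstar <- diag(r); d <- rep(0.5, n)
b <- rnorm(n)
x <- lowrank_solve(B, Kstar, d, b)            # only r x r dense factorizations
## check (feasible for moderate n): max(abs((B %*% Kstar %*% t(B) + diag(d)) %*% x - b))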


Page 12: Bayesian Modeling and Inference for High-Dimensional

Oversmoothing due to reduced-rank models

Figure: Comparing the full GP vs. a low-rank GP with 2,500 locations. Panels: (a) true w, (b) full GP, (c) predictive process with 64 knots. Panel (c) exhibits oversmoothing by the low-rank process.


Page 13: Bayesian Modeling and Inference for High-Dimensional

Simple method of introducing sparsity (e.g. graphical models)

Full dependency graph


p(y1) p(y2 | y1) p(y3 | y1, y2) p(y4 | y1, y2, y3) × p(y5 | y1, y2, y3, y4) p(y6 | y1, y2, ..., y5) p(y7 | y1, y2, ..., y6).


Page 14: Bayesian Modeling and Inference for High-Dimensional

Simple method of introducing sparsity (e.g. graphical models)

3−Nearest neighbor dependency graph


p(y1) p(y2 | y1) p(y3 | y1, y2) p(y4 | y1, y2, y3) p(y5 | y2, y3, y4) p(y6 | y1, y4, y5) p(y7 | y1, y2, y6), with each conditioning set truncated to the 3 nearest "past" neighbors.


Page 15: Bayesian Modeling and Inference for High-Dimensional

Gaussian graphical models: linearity

- Write a joint density p(w) = p(w1, w2, ..., wn) as:

  p(w1) p(w2 | w1) p(w3 | w1, w2) ··· p(wn | w1, w2, ..., wn−1)

- Example: for the Gaussian distribution N(w | 0, Kθ), we have the linear model

  w1 = 0 + η1
  w2 = a21 w1 + η2
  w3 = a31 w1 + a32 w2 + η3
  wi = ai1 w1 + ai2 w2 + ··· + ai,i−1 wi−1 + ηi,  i = 4, ..., n.

- More compactly: w = Aw + η, where η ∼ N(0, D).


Page 16: Bayesian Modeling and Inference for High-Dimensional

Simple method of introducing sparsity (e.g. graphical models)

- For the Gaussian distribution N(w | 0, Kθ),

  Kθ = (I − A)⁻¹ D (I − A)⁻ᵀ,  D = diag(var{wi | w{j<i}})

- If L is from chol(Kθ) = LDLᵀ, then L⁻¹ = I − A.

- The aij's are obtained from n − 1 linear systems implied by

  Σ_{j<i: j∼i} aij wj = E[wi | w{j<i}],  i = 2, ..., n

- Example:

for(i in 1:(n-1)) {
  ## row i+1 of A regresses w_{i+1} on all of w_1, ..., w_i
  a[i+1, 1:i] <- solve(K[1:i, 1:i], K[1:i, i+1])
}


Page 17: Bayesian Modeling and Inference for High-Dimensional

- Letting aij = 0 for all but the m nearest neighbors of node i implies solving

  Σ_{j∈N[i]} aij wj = E[wi | w{j∈N[i]}],  i = 2, ..., n,

  where N[i] = {j < i : j ∼ i} are the indices of the neighbors of i from its "past."

- Example:

for(i in 1:(n-1)) {
  ## only the (at most m) nearest "past" neighbors N[[i+1]] enter the solve
  a[i+1, N[[i+1]]] <- solve(K[N[[i+1]], N[[i+1]]], K[N[[i+1]], i+1])
}

- We need to solve n − 1 linear systems of size at most m × m
- We effectively model a (sparse) Cholesky factor instead of computing it (a full construction sketch follows below)
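Putting these pieces together, a minimal self-contained R sketch, illustrative only and not the spNNGP implementation: build A and D from an exponential covariance using at most m nearest "past" neighbors, draw w, and form the sparse precision matrix that appears on the next slide.

## Sketch: build the sparse factor (A, D) of the nearest-neighbor approximation
set.seed(3)
n <- 200; m <- 10
coords <- cbind(runif(n), runif(n))
K <- exp(-5 * as.matrix(dist(coords)))        # illustrative exponential covariance
w <- drop(crossprod(chol(K), rnorm(n)))       # a draw from the full GP, for later use

A <- matrix(0, n, n); Dvec <- numeric(n); Dvec[1] <- K[1, 1]
for (i in 1:(n - 1)) {
  past <- 1:i                                 # candidate "past" locations
  d <- sqrt((coords[past, 1] - coords[i + 1, 1])^2 +
            (coords[past, 2] - coords[i + 1, 2])^2)
  nn <- past[order(d)][1:min(m, i)]           # at most m nearest past neighbors
  a <- solve(K[nn, nn], K[nn, i + 1])
  A[i + 1, nn] <- a
  Dvec[i + 1] <- K[i + 1, i + 1] - sum(K[i + 1, nn] * a)  # conditional variance
}

Ktilde_inv <- t(diag(n) - A) %*% diag(1 / Dvec) %*% (diag(n) - A)
mean(Ktilde_inv != 0)     # fraction of non-zero entries: O(n m^2) of n^2
sum(log(1 / Dvec))        # log det of the precision: sum_i log(1 / D_ii)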


Page 18: Bayesian Modeling and Inference for High-Dimensional

Sparse precision matrices

N(wR | 0, Kθ) ≈ N(wR | 0, K̃θ),  where K̃θ⁻¹ = (I − A)ᵀ D⁻¹ (I − A)

Figure: sparsity patterns of (a) I − A, (b) D⁻¹, and (c) K̃θ⁻¹

- det(K̃θ⁻¹) = Π_{i=1}^{n} Dii⁻¹, and K̃θ⁻¹ is sparse with O(nm²) entries



Page 22: Bayesian Modeling and Inference for High-Dimensional

Sparse likelihood approximations (Vecchia, 1988; Stein et al., 2004)

- Let R = {ℓ1, ℓ2, ..., ℓr}

- With w(ℓ) ∼ GP(0, Kθ(·)), write the joint density p(wR) as:

  N(wR | 0, Kθ) = Π_{i=1}^{r} p(w(ℓi) | wH(ℓi)) ≈ Π_{i=1}^{r} p(w(ℓi) | wN(ℓi)) = N(wR | 0, K̃θ),

  where N(ℓi) ⊆ H(ℓi), and H(ℓi) = {ℓ1, ..., ℓi−1} is the "history" of ℓi.

- Shrinkage: choose N(ℓi) as the set of "m nearest neighbors" among H(ℓi). Theory: the "screening" effect of kriging.

- K̃θ⁻¹ depends on Kθ, but is sparser with at most nm² non-zero entries (see the log-likelihood sketch below)
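A minimal R sketch of how this approximation is evaluated in practice: the joint log-density becomes a sum of univariate conditional normal log-densities, each requiring only a solve of size at most m × m. The brute-force Euclidean neighbor search is an illustrative assumption, not the talk's implementation.

## Sketch: Vecchia-type log-density of w ~ N(0, K) with m nearest "past" neighbors
vecchia_loglik <- function(w, coords, K, m = 10) {
  n <- length(w)
  ll <- dnorm(w[1], mean = 0, sd = sqrt(K[1, 1]), log = TRUE)
  for (i in 2:n) {
    past <- 1:(i - 1)
    d <- sqrt((coords[past, 1] - coords[i, 1])^2 +
              (coords[past, 2] - coords[i, 2])^2)
    nn <- past[order(d)][1:min(m, i - 1)]
    a  <- solve(K[nn, nn], K[nn, i])        # conditional-mean weights
    mu <- sum(a * w[nn])                    # E[w_i | w_N(i)]
    v  <- K[i, i] - sum(K[i, nn] * a)       # Var[w_i | w_N(i)]
    ll <- ll + dnorm(w[i], mean = mu, sd = sqrt(v), log = TRUE)
  }
  ll
}
## e.g., vecchia_loglik(w, coords, K, m = 10) with the objects built above;
## as m grows the value approaches the exact N(0, K) log-density.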


Page 23: Bayesian Modeling and Inference for High-Dimensional

Extension to a GP (Datta et al., JASA, 2016)

- Fix a "reference" set R = {ℓ1, ℓ2, ..., ℓr} (e.g., the observed points)

- N(ℓi) is the set of at most m nearest neighbors of ℓi among {ℓ1, ℓ2, ..., ℓi−1}.

- First piece: model wR ∼ N(0, K̃θ) (the "Vecchia prior")

- Second piece: if ℓ ∉ R, then N(ℓ) is the set of m nearest neighbors of ℓ in R

- Third piece: w(ℓ) = Σ_{i=1}^{r} ai(ℓ) w(ℓi) + η(ℓ), with ai(ℓ) = 0 if ℓi ∉ N(ℓ).

- The nonzero ai(ℓ)'s are obtained by solving an m × m system:

  E[w(ℓ) | wN(ℓ)] = Σ_{i: ℓi∈N(ℓ)} ai(ℓ) w(ℓi)

  (see the prediction sketch below)
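A minimal R sketch of this third piece: interpolating w at a new location ℓ0 from its m nearest reference neighbors. The helper assumes the same illustrative exponential kernel (unit variance, decay phi = 5) and the coords and w objects from the earlier construction sketch.

## Sketch: NNGP-style interpolation at a new location l0 (illustrative)
nngp_predict <- function(l0, coords, w, m = 10, phi = 5) {
  covfun <- function(d) exp(-phi * d)              # same kernel as the reference prior
  d0 <- sqrt((coords[, 1] - l0[1])^2 + (coords[, 2] - l0[2])^2)
  nn <- order(d0)[1:m]                             # m nearest reference locations
  Knn <- covfun(as.matrix(dist(coords[nn, ])))     # m x m covariance among neighbors
  k0  <- covfun(d0[nn])                            # covariances between l0 and neighbors
  a   <- solve(Knn, k0)                            # the nonzero a_i(l0)
  list(mean = sum(a * w[nn]),                      # E[w(l0) | w_N(l0)]
       var  = covfun(0) - sum(k0 * a))             # conditional variance of w(l0)
}
## e.g., nngp_predict(c(0.5, 0.5), coords, w)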


Page 24: Bayesian Modeling and Inference for High-Dimensional

Neighbors in Space and Time

- No universal definition of distance in a space-time domain
- Use Kθ(·,·) itself as a proxy for distance
- Datta et al. (2016, AoAS): an efficient algorithm, ∼O(4nm) flops, to do this (a covariance-based neighbor search is sketched below)
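One way to read "Kθ as a proxy for distance": rank candidate neighbors by their covariance with the target point rather than by Euclidean distance, so that spatial and temporal separation are weighed on a common scale. The separable exponential space-time kernel and the simulated coordinates below are illustrative assumptions; this brute-force search is not the O(4nm) dynamic algorithm of Datta et al.

## Sketch: pick m "neighbors" of a space-time point by covariance, not distance
## Assumed separable kernel: K((s,t),(s',t')) = exp(-phi_s ||s - s'||) * exp(-phi_t |t - t'|)
st_neighbors <- function(target, st_coords, m = 10, phi_s = 5, phi_t = 0.5) {
  ds <- sqrt((st_coords[, 1] - target[1])^2 + (st_coords[, 2] - target[2])^2)
  dt <- abs(st_coords[, 3] - target[3])
  k  <- exp(-phi_s * ds) * exp(-phi_t * dt)   # covariance with the target point
  order(k, decreasing = TRUE)[1:m]            # indices of the m most "covariant" points
}

set.seed(4)
st_coords <- cbind(runif(500), runif(500), runif(500, 0, 30))  # (easting, northing, day)
st_neighbors(c(0.5, 0.5, 15), st_coords)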


Page 25: Bayesian Modeling and Inference for High-Dimensional

Example 1: Hierarchical NNGP model

- Start with a desired full GP specification: GP(0, Kθ(·))
- Derive the NNGP: NNGP(0, K̃θ(·))

  Y(ℓ) ∼ind Pθ, an exponential family;  g(E[Y(ℓ)]) = β0(ℓ) + X(ℓ) β1(ℓ)
  (β0(ℓ), β1(ℓ))ᵀ ∼ NNGP(β̃0 + X(ℓ)β̃1, K̃θ(·))
  (β̃0, β̃1)ᵀ ∼ N(0, Vβ);  θ ∼ p(θ)

- Posterior predictive inference for β0(ℓ0), β1(ℓ0), and Y(ℓ0)


Page 26: Bayesian Modeling and Inference for High-Dimensional

Example 2: Hierarchical NNGP model

- Start with a desired full GP specification for Y(ℓ): Y(ℓ) ∼ GP(xᵀ(ℓ)β, Kθ(·))

- Derive the NNGP: Y(ℓ) ∼ NNGP(xᵀ(ℓ)β, K̃θ(·))

  Y ∼ N(Xβ, K̃θ);  β ∼ N(0, Vβ);  θ ∼ p(θ)

- No need for a Cholesky factorization: it is modeled.
- Easy posterior predictive inference for Y(ℓ0) at a new ℓ0.
- But no latent spatial-temporal process


Page 27: Bayesian Modeling and Inference for High-Dimensional

Figure: (a) true w, (b) full GP, (c) predictive process with 64 knots, (d) NNGP with m = 10, (e) NNGP with m = 20.


Page 28: Bayesian Modeling and Inference for High-Dimensional

Figure: Choice of m in NNGP models: out-of-sample root mean squared prediction error (RMSPE) and mean width of the 95% posterior predictive credible intervals for the NNGP and the full GP over m = 1, ..., 25, for the univariate synthetic data analysis.


Page 29: Bayesian Modeling and Inference for High-Dimensional

Back to European PM10 data

Figure: (a) PM10 levels in March, 2009; (b) PM10 levels in June, 2009 (same monitoring network as before; Easting/Northing in km).

- Interest in estimating short- and long-term temporal (and spatial) decay (to improve the CTMs)

- log(PM10)(s, t) = β0 + β1 CTM(s, t) + w(s, t) + ε(s, t)

- w(s, t) ∼ DNNGP(0, K̃θ(·))

Page 30: Bayesian Modeling and Inference for High-Dimensional

European PM10 Dataset

- Significantly improved fit:

           OLS    DNNGP
  RMSPE    12.8   8.2

- Total time: 24 hrs


Page 31: Bayesian Modeling and Inference for High-Dimensional

European PM10 Dataset

Figure: (a) Predicted PM10 (µg m−3) for 04.03.2009; (b) Pr(predicted PM10 > 50 µg m−3).


Page 32: Bayesian Modeling and Inference for High-Dimensional

European PM10 Dataset

Figure: (a) Predicted PM10 (µg m−3) for 04.05.2009; (b) Pr(predicted PM10 > 50 µg m−3).


Page 33: Bayesian Modeling and Inference for High-Dimensional

Concluding remarks: Storage and computation

- Algorithms: Gibbs, RWM, HMC, VB, INLA; NNGP/HMC especially promising

- Model-based solution for spatial "BIG DATA"

- Never needs to store the n × n distance matrix; stores n small m × m matrices

- Total flop count per iteration is O(nm³), i.e., linear in n

- Scalable to massive datasets because m is small: you never need more than a few neighbors.

- Compare with reduced-rank models: O(nm³) << O(nr²).

- New R package spNNGP (on CRAN soon)


Page 34: Bayesian Modeling and Inference for High-Dimensional

Concluding remarks: Comparisons

- Are low-rank spatial models well and truly beaten?
  - Certainly do not seem to scale as nicely as NNGP
  - Have somewhat greater theoretical tractability (e.g., Bayesian asymptotics)
  - Can be used to flexibly model smoothness
  - Can be constructed for other processes, e.g., the Spatial Dirichlet Predictive Process
  - Compare with scalable multi-resolution frameworks (Katzfuss, 2016)

- Highly scalable meta-kriging frameworks (Guhaniyogi, 2016)

- Future work: high-dimensional multivariate spatial-temporal variable selection


Page 35: Bayesian Modeling and Inference for High-Dimensional

Thank You !
