TEMPORAL PROBABILISTIC MODELS PT 2
AGENDA
- Kalman filtering
- Dynamic Bayesian Networks
- Particle filtering
KALMAN FILTERING
In a nutshell:
- Efficient filtering in continuous state spaces
- Gaussian transition and observation models
- Ubiquitous for tracking with noisy sensors, e.g. radar, GPS, cameras
HIDDEN MARKOV MODEL FOR ROBOT LOCALIZATION
Use observations + transition dynamics to get a better idea of where the robot is at time t
[Diagram: hidden state variables X0 → X1 → X2 → X3; observed variables z1, z2, z3]
Predict – observe – predict – observe…
HIDDEN MARKOV MODEL FOR ROBOT LOCALIZATION
Use observations + transition dynamics to get a better idea of where the robot is at time t
Maintain a belief state bt over time: bt(x) = P(Xt=x | z1:t)
[Diagram: hidden state variables X0 → X1 → X2 → X3; observed variables z1, z2, z3]
Predict – observe – predict – observe…
BAYESIAN FILTERING WITH BELIEF STATES
Compute bt, given zt and prior belief bt-1
Recursive filtering equation
BAYESIAN FILTERING WITH BELIEF STATES
Compute bt, given zt and prior belief bt-1
Recursive filtering equation: bt(x) = (1/Z) P(zt | x) Σx' P(x | x') bt-1(x')
Predict P(Xt | z1:t-1) using dynamics alone, then update via the observation zt
IN CONTINUOUS STATE SPACES…
Compute bt, given zt and prior belief bt-1
Continuous filtering equation
GENERAL BAYESIAN FILTERING IN CONTINUOUS STATE SPACES
Compute bt, given zt and prior belief bt-1
Continuous filtering equation: bt(x) = (1/Z) P(zt | x) ∫ P(x | x') bt-1(x') dx'
How to evaluate this integral? How to calculate Z? How to even represent a belief state?
KEY REPRESENTATIONAL DECISIONS
Pick a method for representing distributions
- Discrete: tables
- Continuous: fixed parameterized classes vs. particle-based techniques
Devise methods to perform key calculations (marginalization, conditioning) on the representation
- Exact or approximate?
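As a minimal sketch of the discrete (table-based) option, the predict and update calculations can be written directly as sums. The function name and the two-state example are mine, not from the slides:

```python
def discrete_filter_step(belief, T, obs_probs):
    """One predict + update step with table representations.
    belief[x]: P(X_t = x | z_{1:t}); T[x][xp]: P(X_{t+1}=xp | X_t=x);
    obs_probs[xp]: P(z_{t+1} | X_{t+1}=xp)."""
    n = len(belief)
    # Predict: marginalize over the previous state.
    predicted = [sum(belief[x] * T[x][xp] for x in range(n)) for xp in range(n)]
    # Update: condition on the observation, then normalize.
    unnorm = [obs_probs[xp] * predicted[xp] for xp in range(n)]
    Z = sum(unnorm)
    return [w / Z for w in unnorm]

# Two-state example: a sticky chain, then an observation favoring state 0.
b = [0.5, 0.5]
T = [[0.9, 0.1], [0.1, 0.9]]
b = discrete_filter_step(b, T, [0.8, 0.2])
# b == [0.8, 0.2]: the uniform prediction is reweighted by the observation.
```

The normalization constant Z here is exactly the Z asked about on a later slide: the marginal likelihood of the new observation.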
GAUSSIAN DISTRIBUTION
Mean m, standard deviation s; distribution is denoted N(m,s)
If X ~ N(m,s), then P(X=x) = (1/Z) exp(-(x-m)²/(2s²))
with normalization factor Z = s√(2π)
LINEAR GAUSSIAN TRANSITION MODEL FOR MOVING 1D POINT
Consider position and velocity xt, vt
Time step h
Without noise:
xt+1 = xt + h vt
vt+1 = vt
With Gaussian noise of std s1:
P(xt+1 | xt) ∝ exp(-(xt+1 - (xt + h vt))²/(2s1²))
i.e. Xt+1 ~ N(xt + h vt, s1)
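A quick simulation of this transition model (the function and variable names are my own, not from the slides) shows the point drifting at velocity v with Gaussian jitter:

```python
import random

def step(x, v, h=0.1, s1=0.05, rng=random):
    """One step of the 1D linear-Gaussian transition model:
    x_{t+1} ~ N(x + h*v, s1), velocity held constant."""
    return rng.gauss(x + h * v, s1), v

# Simulate a short trajectory from x0 = 0 with velocity 1.
rng = random.Random(0)
x, v = 0.0, 1.0
xs = [x]
for _ in range(10):
    x, v = step(x, v, rng=rng)
    xs.append(x)
# With h = 0.1 and 10 steps, the noiseless endpoint would be exactly 1.0;
# the simulated endpoint lands near it, perturbed by the accumulated noise.
```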
LINEAR GAUSSIAN TRANSITION MODEL
If prior on position is Gaussian, then the posterior is also Gaussian
[Diagram: the prior shifts by vh and widens: N(m,s) → N(m+vh, √(s²+s1²)), since variances add]
LINEAR GAUSSIAN OBSERVATION MODEL
Position observation zt with Gaussian noise of std s2
zt ~ N(xt,s2)
LINEAR GAUSSIAN OBSERVATION MODEL
If prior on position is Gaussian, then the posterior is also Gaussian
m ← (s²z + s2²m) / (s² + s2²)
s² ← s²s2² / (s² + s2²)
[Plot: position prior, observation probability, and posterior probability]
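The posterior mean and variance updates on this slide can be sketched in a few lines; `gaussian_update` and the example values are illustrative, not from the slides:

```python
def gaussian_update(m, s, z, s2):
    """Condition a Gaussian prior N(m, s) on an observation z ~ N(x, s2).
    Returns the posterior mean and standard deviation."""
    var, var2 = s**2, s2**2
    m_post = (var * z + var2 * m) / (var + var2)   # precision-weighted mean
    var_post = var * var2 / (var + var2)           # variance always shrinks
    return m_post, var_post ** 0.5

# Equal prior and observation noise: the posterior mean lands halfway.
m_post, s_post = gaussian_update(0.0, 1.0, 2.0, 1.0)
# m_post == 1.0, s_post == sqrt(0.5)
```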
MULTIVARIATE GAUSSIANS
Multivariate analog in N-D space: mean (vector) m, covariance (matrix) S
If X ~ N(m,S), then P(X=x) = (1/Z) exp(-½ (x-m)ᵀ S⁻¹ (x-m))
with normalization factor Z = √((2π)^N |S|)
MULTIVARIATE LINEAR GAUSSIAN PROCESS
A linear transformation + multivariate Gaussian noise
If prior state distribution is Gaussian, then posterior state distribution is Gaussian
If we observe one component of a Gaussian, then its posterior is also Gaussian
y = A x + ε, where ε ~ N(m,S)
MULTIVARIATE COMPUTATIONS
Linear transformations of Gaussians:
If x ~ N(m,S) and y = A x + b, then y ~ N(Am+b, ASAᵀ)
Consequence: if x ~ N(mx,Sx), y ~ N(my,Sy) are independent and z = x + y, then z ~ N(mx+my, Sx+Sy)
Conditional of a Gaussian:
If [x1, x2] ~ N([m1; m2], [S11, S12; S21, S22]), then on observing x2 = z,
x1 ~ N(m1 + S12 S22⁻¹(z - m2), S11 - S12 S22⁻¹ S21)
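The conditioning formula can be checked numerically. `condition_gaussian` and the 2D example values are my own, assuming NumPy is available:

```python
import numpy as np

def condition_gaussian(m1, m2, S11, S12, S22, z):
    """Condition [x1, x2] ~ N([m1; m2], [S11, S12; S12^T, S22])
    on observing x2 = z. Returns mean and covariance of x1 | x2 = z."""
    K = S12 @ np.linalg.inv(S22)
    return m1 + K @ (z - m2), S11 - K @ S12.T

# Two unit-variance components with covariance 0.5:
# observing x2 = 2 shifts the belief about x1 to mean 1, variance 0.75.
m1, m2 = np.array([0.0]), np.array([0.0])
S11, S12, S22 = np.array([[1.0]]), np.array([[0.5]]), np.array([[1.0]])
mu, Sigma = condition_gaussian(m1, m2, S11, S12, S22, np.array([2.0]))
# mu == [1.0], Sigma == [[0.75]]
```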
KALMAN FILTER ASSUMPTIONS
xt ~ N(mx, Sx)
xt+1 = F xt + g + v,  v ~ N(0, Sv)  (dynamics noise)
zt+1 = H xt+1 + w,  w ~ N(0, Sw)  (observation noise)
TWO STEPS
Maintain mt, St, the parameters of the Gaussian distribution over state xt
Predict: compute distribution of xt+1 using dynamics model alone
Update (observe zt+1): compute P(xt+1 | zt+1) with Bayes rule
TWO STEPS
Maintain mt, St, the parameters of the Gaussian distribution over state xt
Predict: compute distribution of xt+1 using dynamics model alone:
xt+1 ~ N(F mt + g, F St Fᵀ + Sv); let these be N(m', S')
Update: compute P(xt+1 | zt+1) with Bayes rule
TWO STEPS
Maintain mt, St, the parameters of the Gaussian distribution over state xt
Predict: compute distribution of xt+1 using dynamics model alone:
xt+1 ~ N(F mt + g, F St Fᵀ + Sv); let these be N(m', S')
Update: compute P(xt+1 | zt+1) with Bayes rule; parameters of the final distribution mt+1 and St+1 are derived using the conditional distribution formulas
DERIVING THE UPDATE RULE
(1) Assumption, with unknowns a, B, C: joint [xt; zt] ~ N([m'; a], [S', B; Bᵀ, C])
(2) Assumption: xt ~ N(m', S')
(3) Assumption: zt | xt ~ N(H xt, Sw)
(4) Conditioning (1): zt | xt ~ N(a + Bᵀ S'⁻¹ (xt - m'), C - Bᵀ S'⁻¹ B)
(5) Set mean (4)=(3): a + Bᵀ S'⁻¹ (xt - m') = H xt  ⇒  a = H m', Bᵀ = H S'
(6) Set cov. (4)=(3): C - Bᵀ S'⁻¹ B = Sw  ⇒  C = H S' Hᵀ + Sw
(7) Conditioning (1): xt | zt ~ N(m' + B C⁻¹ (zt - a), S' - B C⁻¹ Bᵀ)
(8,9) Kalman filter:
mt = m' + S' Hᵀ C⁻¹ (zt - H m')
St = S' - S' Hᵀ C⁻¹ H S'
PUTTING IT TOGETHER
Transition matrix F, covariance Sx
Observation matrix H, covariance Sz
mt+1 = F mt + Kt+1 (zt+1 - H F mt)
St+1 = (I - Kt+1 H)(F St Fᵀ + Sx)
where Kt+1 = (F St Fᵀ + Sx) Hᵀ (H (F St Fᵀ + Sx) Hᵀ + Sz)⁻¹
Got that memorized?
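Putting the predict and update steps together, a minimal Kalman filter sketch might look like the following; the function name and the 1D random-walk example are mine, assuming NumPy:

```python
import numpy as np

def kalman_step(m, S, z, F, H, Sx, Sz, g=None):
    """One predict + update cycle for
    x_{t+1} = F x_t + g + v, z_{t+1} = H x_{t+1} + w."""
    if g is None:
        g = np.zeros(len(m))
    # Predict: push the belief through the dynamics.
    m_pred = F @ m + g
    S_pred = F @ S @ F.T + Sx
    # Update: the Kalman gain weighs prediction against observation.
    K = S_pred @ H.T @ np.linalg.inv(H @ S_pred @ H.T + Sz)
    m_new = m_pred + K @ (z - H @ m_pred)
    S_new = (np.eye(len(m)) - K @ H) @ S_pred
    return m_new, S_new

# 1D random walk observed directly: repeated observations of 1.0
# pull the estimate toward 1.0 while the variance shrinks.
F = H = np.array([[1.0]])
Sx, Sz = np.array([[0.01]]), np.array([[0.1]])
m, S = np.array([0.0]), np.array([[1.0]])
for _ in range(20):
    m, S = kalman_step(m, S, np.array([1.0]), F, H, Sx, Sz)
```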
PROPERTIES OF KALMAN FILTER
Optimal Bayesian estimate for linear Gaussian transition/observation models
Need estimates of covariance… model identification necessary
Extensions to nonlinear transition/observation models work as long as they aren't too nonlinear:
- Extended Kalman Filter
- Unscented Kalman Filter
[Figures: tracking the velocity of a braking obstacle; learning that the road is slick. Annotations: velocity initially uninformed; braking begins; obstacle slows; more distance measurements arrive; estimated vs. actual max deceleration; stopping distance (95% confidence interval); braking initiated; gradual stop]
NON-GAUSSIAN DISTRIBUTIONS
Gaussian distributions are a “lump”
[Figure: a non-Gaussian distribution and its Kalman filter estimate]
NON-GAUSSIAN DISTRIBUTIONS
Integrating continuous and discrete states
Splitting with a binary choice ("up" vs. "down")
EXAMPLE: FAILURE DETECTION
Consider a battery meter sensor: Battery = true level of battery, BMeter = sensor reading
Transient failures: send garbage at time t
Persistent failures: send garbage forever
EXAMPLE: FAILURE DETECTION
Consider a battery meter sensor: Battery = true level of battery, BMeter = sensor reading
Transient failures: send garbage at time t (5555500555…)
Persistent failures: sensor is broken (5555500000…)
DYNAMIC BAYESIAN NETWORK
[DBN: Batteryt-1 → Batteryt → BMetert]
BMetert ~ N(Batteryt, s)
(Think of this structure “unrolled” forever…)
DYNAMIC BAYESIAN NETWORK
[DBN: Batteryt-1 → Batteryt → BMetert]
BMetert ~ N(Batteryt, s)
Transient failure model: P(BMetert=0 | Batteryt=5) = 0.03
RESULTS ON TRANSIENT FAILURE
[Plot of E(Batteryt) over time: transient failure occurs, meter reads 55555005555…; curves shown with and without the transient failure model]
RESULTS ON PERSISTENT FAILURE
[Plot of E(Batteryt) over time: persistent failure occurs, meter reads 5555500000…; curve shown with the transient model]
PERSISTENT FAILURE MODEL
[DBN: Batteryt-1 → Batteryt → BMetert; Brokent-1 → Brokent → BMetert]
BMetert ~ N(Batteryt, s)
P(BMetert=0 | Batteryt=5) = 0.03
P(BMetert=0 | Brokent) = 1
Example of a Dynamic Bayesian Network (DBN)
RESULTS ON PERSISTENT FAILURE
[Plot of E(Batteryt) over time: persistent failure occurs, meter reads 5555500000…; curves shown with the transient model and with the persistent failure model]
HOW TO PERFORM INFERENCE ON DBN?
Exact inference on "unrolled" BN
- Variable elimination: eliminate old time steps
- After a few time steps, all variables in the state space become dependent! Lost sparsity structure
Approximate inference
- Particle filtering
PARTICLE FILTERING (AKA SEQUENTIAL MONTE CARLO)
Represent distributions as a set of particles
Applicable to non-Gaussian, high-dimensional distributions
Convenient implementations
Widely used in vision, robotics
PARTICLE REPRESENTATION
Bel(xt) = {(wk, xk)}
- wk are weights, xk are state hypotheses
- Weights sum to 1
- Approximates the underlying distribution
PARTICLE FILTERING
Represent a distribution at time t as a set of N "particles" St1, …, StN
Repeat for t = 0, 1, 2, …
- Sample S[i] from P(Xt+1 | Xt = Sti) for all i
- Compute weight w[i] = P(zt+1 | Xt+1 = S[i]) for all i
- Weighted resampling step: sample St+1i from S[·] according to weights w[·]
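The three steps above can be sketched as a bootstrap particle filter; the function name, the toy 1D tracking model, and all parameter values are illustrative, not from the slides:

```python
import math
import random

def particle_filter_step(particles, transition_sample, obs_likelihood, z, rng):
    """One cycle of the particle filter:
    propagate each particle, weight by the observation, resample."""
    # Sample S[i] from P(X_{t+1} | X_t = S_t^i).
    proposed = [transition_sample(x, rng) for x in particles]
    # Compute weight w[i] = P(z | X_{t+1} = S[i]).
    weights = [obs_likelihood(z, x) for x in proposed]
    # Weighted resampling: draw N new particles according to w.
    return rng.choices(proposed, weights=weights, k=len(particles))

# Toy problem: random-walk dynamics, Gaussian likelihood around the state.
rng = random.Random(1)
particles = [rng.uniform(-10.0, 10.0) for _ in range(500)]
transition = lambda x, r: x + r.gauss(0.0, 0.2)
likelihood = lambda z, x: math.exp(-0.5 * (z - x) ** 2)
for _ in range(10):
    particles = particle_filter_step(particles, transition, likelihood, 5.0, rng)
estimate = sum(particles) / len(particles)  # particle cloud concentrates near 5
```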
BATTERY EXAMPLE
[DBN: Batteryt-1 → Batteryt → BMetert; Brokent-1 → Brokent → BMetert]
Sampling step
BATTERY EXAMPLE
[DBN: Batteryt-1 → Batteryt → BMetert; Brokent-1 → Brokent → BMetert]
Suppose we now observe BMeter = 0
P(BMeter=0 | sample) = 0.03 if Broken = false, 1 if Broken = true
BATTERY EXAMPLE
[DBN: Batteryt-1 → Batteryt → BMetert; Brokent-1 → Brokent → BMetert]
Compute weights (drawn as particle size)
P(BMeter=0 | sample) = 0.03 if Broken = false, 1 if Broken = true
BATTERY EXAMPLE
[DBN: Batteryt-1 → Batteryt → BMetert; Brokent-1 → Brokent → BMetert]
Weighted resampling
BATTERY EXAMPLE
[DBN: Batteryt-1 → Batteryt → BMetert; Brokent-1 → Brokent → BMetert]
Sampling Step
BATTERY EXAMPLE
[DBN: Batteryt-1 → Batteryt → BMetert; Brokent-1 → Brokent → BMetert]
Now observe BMetert = 5
BATTERY EXAMPLE
[DBN: Batteryt-1 → Batteryt → BMetert; Brokent-1 → Brokent → BMetert]
Compute weights: ≈1 for working-sensor samples, 0 for Broken samples (a broken sensor reads 0 with probability 1)
BATTERY EXAMPLE
[DBN: Batteryt-1 → Batteryt → BMetert; Brokent-1 → Brokent → BMetert]
Weighted resample
APPLICATIONS OF PARTICLE FILTERING IN ROBOTICS
Simultaneous Localization and Mapping (SLAM)
- Observations: laser rangefinder
- State variables: position, walls
SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM)
Mobile robots
- Odometry: locally accurate, but drifts significantly over time
- Vision/ladar/sonar: inaccurate locally, but gives a global reference frame
Combine the two
- State: (robot pose, map)
- Observations: (sensor input)
GENERAL PROBLEM
xt ~ Bel(xt) (arbitrary p.d.f.)
xt+1 = f(xt, u, εp),  εp ~ arbitrary p.d.f.  (process noise)
zt+1 = g(xt+1, εo),  εo ~ arbitrary p.d.f.  (observation noise)
SAMPLING IMPORTANCE RESAMPLING (SIR) VARIANT
Predict
Update
Resample
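The resample step of SIR is often implemented with systematic (low-variance) resampling, which uses a single uniform draw rather than one per particle. This sketch and its names are mine, not from the slides:

```python
import random

def systematic_resample(particles, weights, rng):
    """Systematic (low-variance) resampling: one uniform offset,
    then evenly spaced pointers swept through the cumulative weights."""
    n = len(particles)
    step = sum(weights) / n
    u = rng.uniform(0.0, step)          # single random offset for all pointers
    out, cum, i = [], weights[0], 0
    for k in range(n):
        target = u + k * step           # k-th evenly spaced pointer
        while cum < target and i < n - 1:
            i += 1
            cum += weights[i]
        out.append(particles[i])
    return out

# Zero-weight particles are never selected; the full weight mass survives.
out = systematic_resample(["a", "b", "c"], [0.0, 1.0, 0.0], random.Random(0))
# out == ["b", "b", "b"]
```

Compared with independent weighted draws, this keeps the particle counts closer to n·wk, which reduces resampling variance.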
ADVANCED FILTERING TOPICS
Mixing exact and approximate representations (e.g., mixture models)
Multiple hypothesis tracking (assignment problem)
Model calibration
Scaling up (e.g., 3D SLAM, huge maps)
NEXT TIME
Putting it together: intelligent agents
Read R&N Ch. 2