
Implementation of Particle Filter-based Target Tracking

V. Rajbabu

rajbabu@ee.iitb.ac.in

VLSI Group Seminar

Dept. of Electrical Engineering, IIT Bombay

November 15, 2007


Outline

1 Introduction

2 Target Tracking

3 Bayesian Estimation

4 Particle Filter

5 Implementation

2 / 55



Introduction

Problem scenario

Multi-target tracking - batch measurements [Volkan Cevher]

Particle filters - computational complexity

Sensors operating under constrained environment

Problem Illustration

Objective

Efficient digital implementation of particle filters

Real-time

Low-power

4 / 55



Tracking

What is tracking?

Tracking - process measurements to sequentially estimate hidden states

Target tracking
Visual tracking

Applications

Surveillance and monitoring

Medical imaging

Robotics

Motion in sports

6 / 55


Target Tracking

We consider multi-target tracking

State vector - target or vehicle kinematic characteristics, e.g., 2-D position and velocity, or motion parameters in polar coordinates

Measurements - range or angle with respect to the sensor

7 / 55


State-space Formulation

Dynamic state-space problem - sequential

State-space model depends on physics of the problem

System transition equation

x_t = f_t(x_{t−1}, u_t),   u_t − system noise

f_t(·) − system evolution function (possibly nonlinear)

Observation equation

y_t = g_t(x_t, v_t),   v_t − measurement noise

g_t(·) − measurement function (possibly nonlinear)

8 / 55


Bearings-only Tracking

Automated estimation of a moving target’s state using angle measurements (bearings)

State update equation

X_t = F X_{t−1} + G u_t,

where X_t = [x v_x y v_y]_t^T, u_t = [u_x u_y]_t^T,

F = [1 1 0 0; 0 1 0 0; 0 0 1 1; 0 0 0 1]   and   G = [0.5 0; 1 0; 0 0.5; 0 1].

Measurement equation

z_t = arctan(y_t / x_t) + r_t
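A minimal Python sketch of this model (an illustration, not part of the original slides): it uses the F and G matrices above to simulate a target trajectory and noisy bearing measurements. The initial state and the noise levels sigma_u and sigma_r are assumed values chosen only for the demonstration.

import numpy as np

# Bearings-only model sketch: X_t = F X_{t-1} + G u_t and z_t = arctan(y_t/x_t) + r_t.
# State X = [x, vx, y, vy]; initial state and noise levels are illustrative assumptions.
F = np.array([[1.0, 1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 1.0],
              [0.0, 0.0, 0.0, 1.0]])
G = np.array([[0.5, 0.0],
              [1.0, 0.0],
              [0.0, 0.5],
              [0.0, 1.0]])

def simulate(steps=50, sigma_u=0.001, sigma_r=0.005, seed=0):
    rng = np.random.default_rng(seed)
    X = np.array([-0.05, 0.001, 0.7, -0.055])          # assumed initial state
    states, bearings = [], []
    for _ in range(steps):
        X = F @ X + G @ rng.normal(0.0, sigma_u, 2)    # state update with noise u_t
        z = np.arctan2(X[2], X[0]) + rng.normal(0.0, sigma_r)  # noisy bearing z_t
        states.append(X.copy())
        bearings.append(z)
    return np.array(states), np.array(bearings)

states, bearings = simulate()
print(states.shape, bearings[:3])

arctan2 is used instead of a plain arctan so the bearing stays well defined in all four quadrants.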

9 / 55


Batch-based Tracking – State Vector

Multiple-target state vector - independent partitions

Constant velocity during batch period T

X_t = [x_1^T(t), x_2^T(t), · · · , x_k^T(t)]^T

x_k(t) = [θ_k(t), R_k(t), v_k(t), φ_k(t)]^T,   k − target index

θ_k(t) − direction-of-arrival (DOA)

R_k(t) − log range

v_k(t) − velocity

φ_k(t) − heading direction

Figure: Target geometry in the x-y plane - ranges r(t) and r(t+T), DOAs θ(t) and θ(t+T), heading φ(t), and displacement vT over a batch period.

10 / 55


Batch-based Tracking – Observation Model

Image template-matching - batch measurements

DOA measurements from acoustic sensor - beamforming

Range measurements from radar sensor

Observability - batch of minimum three measurements

Batch of M measurements: y_t = {y_{t+mτ}(p)}_{m=0}^{M−1}, taken at times t, t+τ, . . . , t+(M−1)τ within the batch period; the batch contains DOA or range values, missing data, and clutter.

Figure: Template-matching

11 / 55

State-Space Model

Batch-based range tracking

Non-linear state transition equation

[θ_k(t+T), R_k(t+T), v_k(t+T), φ_k(t+T)]^T =

[ arctan{ (e^{R_k} sin θ_k + T v_k sin φ_k) / (e^{R_k} cos θ_k + T v_k cos φ_k) },
  (1/2) log{ e^{2R_k} + T² v_k² + 2T e^{R_k} v_k cos(θ_k − φ_k) },
  v_k,
  φ_k ]^T + u_k(t)

where u_k(t) − Gaussian process noise

Observation likelihood

p(y(t)|x_k(t)) ∝ ∏_{m=0}^{M−1} { 1 + (1−κ)/(κλ√(2πσ_r²)) ∑_{p_i} exp( −(h^r_{mτ}(x_k(t)) − y_{t+mτ}(p_i))² / (2σ_r²) ) }



Bayesian Estimation

Objective - estimate hidden state x_t from observations y_t

Probabilistic description - using pdfs

prior distribution - p(x_0)
state transition - p(x_t|x_{t−1})
data likelihood - p(y_t|x_t)

Bayesian estimation - minimum mean square estimate

Conditional mean E(x_t|y_t) of the posterior distribution p(x_t|y_t) ∝ p(y_t|x_t) · p(x_t|x_{t−1})

14 / 55


Sequential Representation

Cumulative state up to time t: X_t = {x_j, j = 0, . . . , t}

Cumulative measurement up to time t: Y_t = {y_j, j = 0, . . . , t}

Bayesian estimation

Estimate x_t based on all available measurements up to t by constructing the posterior p(X_t|Y_t)

15 / 55


Recursive Filter

Consists of two steps

Prediction step: p(x_{t−1}|Y_{t−1}) → p(x_t|Y_{t−1})

Update step: p(x_t|Y_{t−1}), y_t → p(x_t|Y_t)

Given the pdfs

Prediction step:

p(x_t|Y_{t−1}) = ∫ p(x_t|x_{t−1}) p(x_{t−1}|Y_{t−1}) dx_{t−1}

Update step:

p(x_t|Y_t) = p(y_t|x_t) p(x_t|Y_{t−1}) / p(y_t|Y_{t−1})

where p(y_t|Y_{t−1}) = ∫ p(y_t|x_t) p(x_t|Y_{t−1}) dx_t
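When the state is low dimensional, these two integrals can be evaluated numerically on a grid. The sketch below (an illustration, not from the slides) does this for an assumed scalar model with a Gaussian random-walk transition and a y_t = x_t + v_t measurement.

import numpy as np

# Grid-based recursive Bayes sketch for an assumed 1-D model:
#   transition  p(x_t | x_{t-1}) = N(x_t; x_{t-1}, sigma_u^2)
#   likelihood  p(y_t | x_t)     = N(y_t; x_t, sigma_v^2)
grid = np.linspace(-10.0, 10.0, 401)
dx = grid[1] - grid[0]

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)

def predict(belief, sigma_u=1.0):
    # Prediction step: p(x_t | Y_{t-1}) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | Y_{t-1}) dx_{t-1}
    transition = gauss(grid[:, None], grid[None, :], sigma_u)
    return transition @ belief * dx

def update(predicted, y, sigma_v=1.0):
    # Update step: p(x_t | Y_t) = p(y_t | x_t) p(x_t | Y_{t-1}) / p(y_t | Y_{t-1})
    posterior = gauss(y, grid, sigma_v) * predicted
    return posterior / (posterior.sum() * dx)

belief = gauss(grid, 0.0, 2.0)              # prior p(x_0)
for y in (1.2, 2.0, 2.7):                   # a few synthetic measurements
    belief = update(predict(belief), y)
print("MMSE estimate E(x_t | Y_t) =", np.sum(grid * belief) * dx)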

16 / 55


Recursive Filter

Recursive application of prediction and update steps provides the optimal Bayesian solution

Minimum mean square estimate (MMSE) is the conditional mean - E(x_t|Y_t)

Analytically intractable - integrals and pdfs

17 / 55


Kalman Filter

Optimal solution for the recursive problem exists

Kalman filter - optimal solution if

state and measurement models - linear, and
state and measurement noises - Gaussian

Extended Kalman filter (EKF) - extension of Kalman filter

state and/or measurement models - nonlinear, and
state and measurement noises - Gaussian

18 / 55



Particle Filter

Suboptimal filter - for nonlinear systems and non-Gaussian noise

handles nonlinearity as such - without linearisation
handles multimodal distributions

Based on Monte Carlo methods

Monte Carlo - “randomly chosen”

Other names

Sequential Monte Carlo (SMC) methods
Bootstrap filter
Sequential Importance Sampling (SIS) filter
CONDENSATION - CONditional DENSity propagATION

20 / 55


Monte Carlo Integration

To numerically evaluate: I = ∫ q(x) dx, where x ∈ R^{n_x}

Monte Carlo (MC) method

Factorize: q(x) = f(x) · π(x), s.t. π(x) is a pdf. We can draw N samples {x^(i); i = 1, . . . , N} from π(x)

MC estimate of I = ∫ f(x) π(x) dx is the sample mean

I_N = (1/N) ∑_{i=1}^{N} f(x^(i))
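A small numerical illustration (not from the slides): q(x) is factorized as f(x)·π(x) with f(x) = x² and π the standard normal density, so the true value of I is E[x²] = 1.

import numpy as np

# Monte Carlo integration sketch: I = ∫ q(x) dx with q(x) = f(x)·π(x),
# f(x) = x^2 and π(x) = N(0, 1), so the exact value is E[x^2] = 1.
rng = np.random.default_rng(0)
N = 100_000
x = rng.standard_normal(N)      # draw N samples x^(i) ~ π(x)
I_N = np.mean(x ** 2)           # sample mean of f(x^(i))
print(f"MC estimate I_N = {I_N:.4f} (exact value 1.0)")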

What if it is difficult to draw samples from π(x) ?

21 / 55


Importance Sampling

Choose an Importance distribution g(x), such that:

π(x) > 0 ⇒ g(x) > 0 for all x ∈ R^{n_x},

and it is easy to draw samples from g(·).

MC estimate - generate samples {x^(i)} ∼ g(x)

I_N = (1/N) ∑_{i=1}^{N} f(x^(i)) w̃(x^(i)),   where w̃(x^(i)) = π(x^(i)) / g(x^(i)).

22 / 55


Importance Sampling

Random support of N = 20 samples x^(j) ∼ N(µ, σ²), drawn from a Gaussian proposal distribution g(x) to approximate a target distribution π(x)

Figure: Target distribution π(x), Gaussian proposal distribution g(x), and the sampled support points.

Unnormalized weights: w_u^(j) = π(x^(j)) / [ (1/√(2πσ²)) exp( −(x^(j) − µ)² / (2σ²) ) ]

Target distribution’s expectation: E_π{F(x)} ≈ ∑_{j=1}^{20} F(x^(j)) w_u^(j) / ∑_{j=1}^{20} w_u^(j)
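A sketch of this self-normalized estimate (illustrative choices, not from the slides): the target π is an unnormalized two-component Gaussian mixture, the proposal is g = N(µ, σ²) with µ = 0 and σ = 3, and F(x) = x.

import numpy as np

# Self-normalized importance sampling sketch with a Gaussian proposal g = N(mu, sigma^2).
# The target π is an unnormalized Gaussian mixture chosen only for illustration; F(x) = x.
rng = np.random.default_rng(1)
mu, sigma, N = 0.0, 3.0, 20

def pi_unnorm(x):                       # target distribution (unnormalized)
    return 0.3 * np.exp(-0.5 * (x - 2.0) ** 2) + 0.7 * np.exp(-0.5 * (x + 1.0) ** 2)

def g_pdf(x):                           # proposal density N(mu, sigma^2)
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (np.sqrt(2.0 * np.pi) * sigma)

x = rng.normal(mu, sigma, size=N)       # support points x^(j) ~ g(x)
w_u = pi_unnorm(x) / g_pdf(x)           # unnormalized weights w_u^(j) = π(x^(j)) / g(x^(j))
estimate = np.sum(x * w_u) / np.sum(w_u)    # E_π{F(x)} ≈ Σ F(x^(j)) w_u^(j) / Σ w_u^(j)
print("self-normalized IS estimate of E_π[x]:", estimate)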

23 / 55


Particle Filter - (1/2)

Use randomly chosen “particles” to represent the posterior distribution

{x_t^(i), w_t^(i)}_{i=1}^{N}

x_t^(i) : support points
w_t^(i) : associated weights
N − number of particles

Discrete weighted approximation to the posterior

p(x_t|Y_t) ≈ ∑_{i=1}^{N} w_t^(i) δ(x_t − x_t^(i))

24 / 55


Particle Filter - (2/2)

Discrete approximation of distribution using

Importance Sampling

Particles - determine the ‘support’ region
Weights - proportional to probabilities
Proposal or importance function plays a critical part

Recursive update by propagating ’particles’ and ’weights’

Use updated distributions to obtain estimates

25 / 55


Resampling

Degeneracy of particles - after a few iterations most particles have negligible weights

Resampling - eliminate or replicate particles depending on their importance weights

Figure: Resampling illustration - weighted particles → estimation → resampling. Image adapted from Dr. Volkan Cevher.
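In practice, degeneracy is often monitored with the effective sample size N_eff ≈ 1/∑(w̃^(i))², a standard diagnostic that is not on the slide but fits here: resampling is triggered only when N_eff drops below a chosen threshold.

import numpy as np

# Effective sample size sketch: a standard degeneracy diagnostic for normalized weights.
def effective_sample_size(weights):
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                   # make sure the weights are normalized
    return 1.0 / np.sum(w ** 2)

w = np.array([0.70, 0.25, 0.03, 0.01, 0.01])
print(effective_sample_size(w))       # ≈ 1.8 out of 5 particles, so resampling is advisable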

26 / 55


Sequential Importance Resampling (SIR) Particle Filter

Given the observed data y_k at time k, do

1. For i = 1, 2, . . . , N,
   Sample particles: x_k^(i) ∼ g(x_k | x_{k−1}^(i), y_k).

2. For i = 1, 2, . . . , N,
   Calculate the importance weights: w_k^(i) = p(y_k|x_k^(i)) p(x_k^(i)|x_{k−1}^(i)) / g(x_k^(i)|x_{k−1}^(i), y_k).

   For i = 1, 2, . . . , N,
   Normalize the weights: w̃_k^(i) = w_k^(i) / ∑_{j=1}^{N} w_k^(j).

3. Calculate the state estimates: E{f(x_k)} = ∑_{i=1}^{N} w̃_k^(i) f(x_k^(i)).

4. Resample {x_k^(i), w̃_k^(i)} to obtain a new set of particles {x_k^(j), w_k^(j) = 1/N}.
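A compact Python sketch of these four steps (an illustration, not the tracker on the slides). The model-specific pieces - the proposal sampler and density, transition density, and likelihood - are passed in as functions; the usage at the bottom plugs in an assumed Gaussian random-walk model with a bootstrap proposal.

import numpy as np

# Generic SIR step following the four steps above. All model densities are
# user-supplied functions; the ones used in the example below are assumptions.
def sir_step(particles, y, propose, proposal_pdf, trans_pdf, lik_pdf, rng):
    N = len(particles)
    # 1. Sample particles from the proposal g(x_k | x_{k-1}^(i), y_k)
    new = np.array([propose(xp, y, rng) for xp in particles])
    # 2. Importance weights w = p(y|x) p(x|x_prev) / g(x|x_prev, y), then normalize
    w = lik_pdf(y, new) * trans_pdf(new, particles) / proposal_pdf(new, particles, y)
    w = w / w.sum()
    # 3. State estimate E{x_k} ≈ Σ w̃^(i) x_k^(i)
    estimate = np.sum(w * new)
    # 4. Resample (multinomial, for brevity) -> equal weights 1/N
    return new[rng.choice(N, size=N, p=w)], estimate

def gpdf(x, mu, s):                        # Gaussian density helper
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (np.sqrt(2.0 * np.pi) * s)

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, size=500)
particles, est = sir_step(
    particles, y=1.5, rng=rng,
    propose=lambda xp, y, r: xp + r.normal(0.0, 1.0),      # bootstrap proposal = transition
    proposal_pdf=lambda x, xp, y: gpdf(x, xp, 1.0),
    trans_pdf=lambda x, xp: gpdf(x, xp, 1.0),
    lik_pdf=lambda y, x: gpdf(y, x, 1.0),                  # y = x + noise assumed
)
print("estimate after one SIR step:", est)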

27 / 55


PF Flow - Suboptimal Proposal Function

Flow: Initialization → Propose particles (proposal function: state update) → Evaluate weights using the measurements, w*_t^(i) = w_{t−T}^(i) p(y_t | X_t^(i)) → Normalize weights → Resample particles → Estimate states, ∑_{i=1}^{N} w_t^(i) X_t^(i) → Output

28 / 55


PF Flow - Optimal Proposal Function

Flow: Initialization → Propose particles (proposal function: full posterior) → Evaluate weights using the measurements,

w*_t^(i) = w_{t−T}^(i) p(y_t | x_t^(i)) p(x_t^(i) | x_{t−T}) / ∏_k g_k(x_k^(i)(t) | y_t, x_k^(i)(t−T))

→ Normalize weights → Resample particles → Estimate states, ∑_{i=1}^{N} w_t^(i) X_t^(i) → Output

29 / 55


Example: 1D Estimation

Estimate states of a nonlinear, non-stationary state space model

State dynamic equation

x_k = 0.5 x_{k−1} + 25 x_{k−1} / (1 + x_{k−1}²) + cos(1.2(k − 1)) + w_k

Measurement equation

y_k = x_k² / 20 + v_k

w_k and v_k are zero-mean, Gaussian white noise.
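A bootstrap particle filter sketch for this 1-D model (the transition prior is used as the proposal, so the weights reduce to the likelihood). The noise variances and the initial particle distribution are assumptions, since the slide only states that w_k and v_k are zero-mean Gaussian.

import numpy as np

# Bootstrap PF sketch for the 1-D model above. Noise variances (10 and 1) and the
# initial particle spread are assumed values, not taken from the slide.
rng = np.random.default_rng(0)
N, T = 1000, 50
sigma_w, sigma_v = np.sqrt(10.0), 1.0

def f(x, k):                              # deterministic part of the state dynamics
    return 0.5 * x + 25.0 * x / (1.0 + x ** 2) + np.cos(1.2 * (k - 1))

x_true, xs_true, y_obs = 0.1, [], []
for k in range(1, T + 1):                 # simulate the true states and measurements
    x_true = f(x_true, k) + rng.normal(0.0, sigma_w)
    xs_true.append(x_true)
    y_obs.append(x_true ** 2 / 20.0 + rng.normal(0.0, sigma_v))

particles = rng.normal(0.0, 2.0, size=N)  # assumed prior particle cloud
for k, y in enumerate(y_obs, start=1):
    particles = f(particles, k) + rng.normal(0.0, sigma_w, size=N)  # propose via dynamics
    w = np.exp(-0.5 * ((y - particles ** 2 / 20.0) / sigma_v) ** 2) + 1e-300
    w = w / w.sum()                                                 # normalized weights
    estimate = np.sum(w * particles)                                # posterior mean
    particles = particles[rng.choice(N, size=N, p=w)]               # resample
print("final true state %.2f, PF posterior mean %.2f" % (xs_true[-1], estimate))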

30 / 55


Example - 1D Estimation

Figure: State (x_k) of the 1D model over time - true value and posterior mean estimate.

Figure: Posterior density over the sample space as a function of time, obtained using particle filters.

31 / 55


Target Tracking Results

Tracking result

32 / 55



Design Hierarchy

Various stages in developing and implementing a particle filter algorithm

34 / 55



DSP Implementation

Floating-point DSP – TI C6713

Internal memory (IRAM) or External memory (SDRAM)

Speed
Size

Sampling period of ∼0.3 seconds for K = 1 target, N = 1000 particles, at 225 MHz

Table: Memory sizes in the C6713-DSK - Single-target, N = 1000

Type       Section   Available   Occupied
Internal   IRAM      192 KB      190.16 KB

36 / 55


FPGA Realization

Matlab simulation

Finite word length identified from Matlab fixed-point simulation

Xilinx System Generator - Simulink-based tool

Xilinx blocks - bit- and cycle-true FPGA code
Nonlinear functions using the CORDIC algorithm
Device: Xilinx Virtex II Pro

Xilinx Embedded Development Kit (EDK)

To use the MicroBlaze soft-core and PowerPC hard-core

37 / 55

Particle State Update

Particle Proposal Stage

Data Likelihood Evaluation

Particle Weight Evaluation Stage

Resampling Stage

Resampling

Compute number of replications based on particle weight

Control circuitry and logic functions

Figure: Residual Systematic Resampling

1: U ∼ U[0, 1]
2: K = M / W_n
3: ind_r = 0, ind_d = M − 1
4: for m = 1 to M do
5:   temp = w_n^(m) · K − U
6:   r(ind_r) = ⌈temp⌉
7:   U = temp − r(ind_r)
8:   if r(ind_r) > 0 then
9:     i_r(ind_r) = m, ind_r = ind_r + 1
10:  else
11:    i_d(ind_d) = m, ind_d = ind_d − 1
12:  end if
13: end for
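A Python sketch of residual systematic resampling in its commonly published floor-based form (one uniform draw, a single pass, integer replication counts). The exact index bookkeeping on the slide - the separate replicated and discarded index lists i_r and i_d - is only partly recoverable from the transcript, so this should be read as an illustration of the same idea rather than the author's exact routine.

import numpy as np

# Residual systematic resampling (RSR) sketch: a single uniform random number and one
# pass over the weights produce integer replication counts that always sum to M.
def rsr_indices(weights, rng):
    w = np.asarray(weights, dtype=float)
    M = len(w)
    w = w / w.sum()                                      # normalize the weights
    u = rng.uniform(0.0, 1.0 / M)                        # single uniform draw
    counts = np.empty(M, dtype=int)
    for m in range(M):
        counts[m] = int(np.floor((w[m] - u) * M)) + 1    # replication count r(m)
        u = u + counts[m] / M - w[m]                     # carry the residual forward
    return np.repeat(np.arange(M), counts)               # indices of replicated particles

rng = np.random.default_rng(0)
idx = rsr_indices([0.4, 0.3, 0.2, 0.05, 0.05], rng)
print(idx, len(idx))        # len(idx) == 5: replications always sum to the particle count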


Approximation for Data Likelihood

Data likelihood

p(y(t)|x_k(t)) ∝ ∏_{m=0}^{M−1} { 1 + (1−κ)/(κλ√(2πσ_θ²)) ∑_{p_i} exp( −(h^θ_{mτ}(x_k(t)) − y_{t+mτ}(p_i))² / (2σ_θ²) ) }

DOA component of nonlinear state transition

h^θ_{mτ}(x_k(t)) = arctan{ [sin(θ_k(t)) + T e^{Q_k(t)} sin(φ_k(t))] / [cos(θ_k(t)) + T e^{Q_k(t)} cos(φ_k(t))] }

Laplace method to approximate distribution

Newton search – Jacobians and Hessians
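The mode search behind the Laplace approximation is a plain Newton iteration, x ← x − H⁻¹g, on the cost function. The sketch below (not the tracker's actual cost) runs it on a generic quadratic with an analytic gradient and Hessian.

import numpy as np

# Newton search sketch: iterate x <- x - H(x)^{-1} g(x) to find a stationary point of a
# cost function. The quadratic cost here is a stand-in for the tracker's log-likelihood.
def newton_search(x0, grad, hess, iters=20, tol=1e-9):
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        step = np.linalg.solve(hess(x), grad(x))
        x = x - step
        if np.linalg.norm(step) < tol:
            break
    return x

A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
grad = lambda x: A @ x - b          # gradient of 0.5 x^T A x - b^T x
hess = lambda x: A                  # constant Hessian
print(newton_search([0.0, 0.0], grad, hess))    # converges to A^{-1} b in one step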

43 / 55


Newton search in FPGA (1/2)

Hardware implementation of square-root and division uses Newton-Raphson search

Difficulty in using Newton search to identify the mode

Cost function, Hessian, and gradient evaluation - nonlinear functions

Software implementation in floating point - MicroBlaze soft-core or PowerPC hard-core processor

Avoids sensitivity to word length

Accelerated by configurable hardware

44 / 55


Newton search in FPGA (2/2)

MicroBlaze - soft-core IP (processor)

Configurable - has 64 kB RAM for code and data
Floating-point unit (FPU) - accelerates floating-point operations

PowerPC 405 - hard-core IP (processor)

Non-configurable - has 128 kB for data and 64 kB for code
Floating-point operations are emulated in software

Newton search for identifying mode - code size > 100 kB

45 / 55

FPGA - Resource Utilization

Batch-based tracker utilizes at least four times more resources

Excluding Newton-search

Requires large, recent FPGA device

Table: Resource utilization - Batch-based tracker

PF stage:    Proposal   Weight evaluation   Resampling   Overall
# Slices†:   7803       8869                374          17046
† For comparison, a 16-bit multiplier uses 153 slices.

Table: Resource utilization - Bearings-only tracker

PF stage:    Proposal   Weight evaluation   Resampling   Overall
# Slices:    2700       1215                374          4635


FPGA - Latency

Table: Update rate using an FPGA clock frequency of 100 MHz

                          Batch-based tracker   Bearings-only tracker
Latency (clock cycles)    3N + 307              3N + 50
Update rate, N = 200      9 µs                  3.5 µs
Update rate, N = 1000     33 µs                 30 µs

Update rate of 9 µs is sufficient to generate estimates every T = 1 s

47 / 55

DSP Implementation Results

Complete batch-based particle filter algorithm on TI C6713

Figure: Target DOAs - θ in degrees versus time in seconds.

Figure: Target x-y tracks - floating-point DSP, Matlab, and generic C estimates compared with the ground truth.

FPGA Implementation Results

Simulation setup using Matlab and ModelSim: propose particles → evaluate weights (using measurements) → resample particles → estimate states.

Figure: FPGA simulation setup.

FPGA Implementation Results

FPGA implementation of data likelihood evaluated in ModelSim

Figure: Target DOAs - θ in degrees versus time in seconds.

Figure: Target x-y tracks - FPGA and Matlab estimates compared with the ground truth, with the sensor location marked.


Comparison

Comparison of implementation strategies for particle filters

Characteristic               DSP      FPGA
Accuracy                     High     High
Speed                        Medium   High
Power dissipation            High     High
Implementation flexibility   High     Medium

51 / 55


Discussion (1/2)

Particle filter characteristics

Number of particles N can be large - complexity

Parallel computations for individual particles

Resampling can prevent parallelization - parallel resampling

Nonlinear functions

Constraints

Real-time operation

Low power

Accuracy - word length

52 / 55


Discussion (2/2)

Digital implementation of particle filter

Multiple FPGAs

SIMD architectures

General-purpose graphics processing units (GP-GPUs)

Random number generators

Analog or mixed-mode implementation of particle filter

Nonlinear functions in analog - arctan, Gaussian

Random number or noise generation

53 / 55

Acknowledgements and References

Thanks to

Advisor Prof. James H. McClellan, Georgia Tech.

Dr. Volkan Cevher, Rice University.

SMC Webpage: http://www-sigproc.eng.cam.ac.uk/smc/

A. Doucet, N. de Freitas, and N. Gordon, Sequential Monte Carlo Methods in Practice. Springer-Verlag, 2001.

V. Cevher, R. Velmurugan, and J. H. McClellan, “Acoustic Multi-Target Tracking Using Direction-of-Arrival Batches,” IEEE Trans. Signal Processing, June 2007.


Thanks !

Questions

55 / 55

Sequential Importance Sampling (SIS) Particle Filter

Illustration

Source: Michael Isard’s CONDENSATION demos, http://www.robots.ox.ac.uk/~misard/condensation.html

56 / 55

CORDIC Algorithm

COordinate Rotation DIgital Computer (CORDIC) algorithm

x_{n+1} = x_n − m d_n y_n 2^{−σ(n)}
y_{n+1} = y_n + d_n x_n 2^{−σ(n)}
z_{n+1} = z_n − d_n w_{σ(n)}

Circular (m = 1, σ(n) = n, w_k = tan^{−1} 2^{−k}):
  Rotation mode (d_n = sign z_n): x_n → K(x_0 cos z_0 − y_0 sin z_0), y_n → K(y_0 cos z_0 + x_0 sin z_0), z_n → 0
  Vectoring mode (d_n = −sign y_n): x_n → K√(x_0² + y_0²), y_n → 0, z_n → z_0 + tan^{−1}(y_0/x_0)

Hyperbolic (m = −1, σ(n) = n − k, w_k = tanh^{−1} 2^{−k}):
  Rotation mode (d_n = sign z_n): x_n → K′(x_1 cosh z_1 + y_1 sinh z_1), y_n → K′(y_1 cosh z_1 + x_1 sinh z_1), z_n → 0
  Vectoring mode (d_n = −sign y_n): x_n → K′√(x_1² − y_1²), y_n → 0, z_n → z_1 + tanh^{−1}(y_1/x_1)
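A Python sketch of CORDIC in circular vectoring mode (m = 1, d_n = −sign y_n), which drives y toward zero using only shift-and-add style updates while z accumulates arctan(y_0/x_0); floating-point arithmetic stands in for the fixed-point shifts an FPGA would use.

import math

# CORDIC sketch, circular vectoring mode: rotate (x, y) toward the x-axis; the angle
# accumulator z picks up the applied micro-rotations, so z -> z0 + atan(y0/x0) for x0 > 0.
def cordic_arctan(x, y, iterations=32):
    z = 0.0
    for n in range(iterations):
        d = -1.0 if y > 0 else 1.0              # d_n = -sign(y_n)
        x, y = x - d * y * 2.0 ** -n, y + d * x * 2.0 ** -n
        z = z - d * math.atan(2.0 ** -n)        # elementary angle w_n = arctan(2^-n)
    return z

print(cordic_arctan(1.0, 0.5), math.atan2(0.5, 1.0))    # both ≈ 0.4636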

57 / 55

FPGA Device Detail

Table: Xilinx Virtex II Pro FPGA - XC2VP30

# Logic cells†   # Slices‡   Block RAM   Clock frequency
30816            13696       2448 kb     100 MHz

† A logic cell ≈ one 4-input LUT + one flip-flop + carry logic.
‡ Each slice includes two 4-input function generators, carry logic, arithmetic logic gates, wide function multiplexers, and two storage elements.

58 / 55
