
Beyond the Kalman Filter: Particle filters for tracking applications

N. J. Gordon
Tracking and Sensor Fusion Group Intelligence, Surveillance and Reconnaissance Division Defence Science and Technology Organisation PO Box 1500, Edinburgh, SA 5111, AUSTRALIA. Neil.Gordon@dsto.defence.gov.au
N.J. Gordon : Lake Louise : October 2003 p. 1/47

Contents
- General PF discussion: history, review
- Tracking applications: sonobuoy, TBD, DBZ


What is tracking?
- Use models of the real world to estimate the past and present, and predict the future
- Achieved by extracting underlying information from a sequence of noisy/uncertain observations
- Perform inference on-line
- Evaluate an evolving sequence of probability distributions


Recursive filter

- System model: $x_t = f_t(x_{t-1}, v_t) \;\Rightarrow\; p(x_t \mid x_{t-1})$
- Measurement model: $y_t = h_t(x_t, w_t) \;\Rightarrow\; p(y_t \mid x_t)$
- Information available: $y_{1:t} = (y_1, \ldots, y_t)$ and the prior $p(x_0)$
- Want $p(x_{0:t} \mid y_{1:t})$ and especially $p(x_t \mid y_{1:t})$

Recursive filter

Prediction:
$$p(x_t \mid y_{1:t-1}) = \int p(x_t \mid x_{t-1})\, p(x_{t-1} \mid y_{1:t-1})\, dx_{t-1}$$

Update:
$$p(x_t \mid y_{1:t}) = \frac{p(y_t \mid x_t)\, p(x_t \mid y_{1:t-1})}{p(y_t \mid y_{1:t-1})}, \qquad p(y_t \mid y_{1:t-1}) = \int p(y_t \mid x_t)\, p(x_t \mid y_{1:t-1})\, dx_t$$

Alternatively:
$$p(x_{0:t} \mid y_{1:t}) = p(x_{0:t-1} \mid y_{1:t-1})\, \frac{p(y_t \mid x_t)\, p(x_t \mid x_{t-1})}{p(y_t \mid y_{1:t-1})}, \qquad p(x_t \mid y_{1:t}) = \int p(x_{0:t} \mid y_{1:t})\, dx_{0:t-1}$$
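The prediction-update recursion above can be sketched numerically on a discrete grid (a minimal 1-D illustration; the random-walk dynamics and Gaussian likelihood below are assumptions for the example, not from the slides):

```python
import numpy as np

def bayes_recursion(prior, transition, likelihood):
    """One step of the recursive filter on a discrete grid.

    prior[i]         ~ p(x_{t-1} = grid[i] | y_{1:t-1})
    transition[i, j] ~ p(x_t = grid[j] | x_{t-1} = grid[i])
    likelihood[j]    ~ p(y_t | x_t = grid[j])
    """
    predicted = prior @ transition                   # p(x_t | y_{1:t-1})
    evidence = np.sum(likelihood * predicted)        # p(y_t | y_{1:t-1})
    posterior = likelihood * predicted / evidence    # p(x_t | y_{1:t})
    return posterior, evidence

# Illustrative model: random-walk dynamics, Gaussian measurement noise.
grid = np.linspace(-5.0, 5.0, 201)
prior = np.exp(-0.5 * grid**2)
prior /= prior.sum()
trans = np.exp(-0.5 * (grid[:, None] - grid[None, :])**2 / 0.5**2)
trans /= trans.sum(axis=1, keepdims=True)            # rows are valid pmfs
y = 1.0
lik = np.exp(-0.5 * (y - grid)**2)
post, ev = bayes_recursion(prior, trans, lik)
print(post.sum())                                    # posterior is normalised
```

Grid filters like this are exact up to discretisation but scale badly with state dimension, which is one motivation for the Monte Carlo methods that follow.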

Tracking : what are the problems?


- On-line processing
- Target manoeuvres
- Missing measurements
- Spurious measurements
- Multiple objects and/or sensors
- Finite sensor resolution
- Prior constraints
- Signature information
- Nonlinear/non-Gaussian models

Analytic approximations
- EKF and variants: linearisation, Gaussian approximation, unimodal
- Score function moment approximation: Masreliez (75), West (81), Fahrmeir (92), Pericchi & Smith (92)
- Series-based approximation to score functions: Wu & Cheng (92)

Numerical approximations
- Discrete grid: Pole & West (88)
- Piecewise pdf: Kitagawa (87), Kramer & Sorenson (88)
- Series expansion: Sorenson & Stubberud (68)
- Gaussian mixtures: Sorenson & Alspach (72), West (92)
- Unscented filter: Julier & Uhlmann (95)

Monte Carlo Approximations


- Sequential Importance Sampling (SIS): Handschin & Mayne, Automatica, 1969
- Improved SIS: Zaritskii et al., Automation and Remote Control, 1975
- Rao-Blackwellisation: Akashi & Kumamoto, Automatica, 1977, ...

⇒ Too computationally demanding 20-30 years ago

Sequential Monte Carlo (SMC)


- SMC methods lead to an estimate of the complete probability distribution
- Approximation centred on the pdf rather than compromising the state-space model
- Known as particle filters, SIR filters, bootstrap filters, Monte Carlo filters, Condensation, etc.

Why random samples?


- Nonlinear/non-Gaussian models
- Whole pdf available: moments/quantiles, HPD intervals
- Re-parameterisation and constraints
- Association hypotheses independent over time
- Multiple models trivial
- Scalable (how big is $N$?) and parallelisable

Comparison
Kalman filter                        | Particle filter
-------------------------------------|--------------------------------------------
analytic solution                    | sequential MC solution based on simulation
restrictive assumptions              | no modelling restrictions
deduces state from measurement       | predicts measurements from states
KF optimal, EKF sub-optimal          | optimal (with infinite computational resources)

Book Advert
Sequential Monte Carlo Methods in Practice
Editors: Doucet, de Freitas, Gordon
Springer-Verlag (2001)

- Theoretical foundations
- Efficiency measures
- Applications: target tracking, missile guidance, image tracking, terrain-referenced navigation, exchange-rate prediction, portfolio allocation, in-situ ellipsometry, pollution monitoring, communications and audio engineering

Useful Information
Books
- Sequential Monte Carlo Methods in Practice, Doucet, de Freitas, Gordon, Springer, 2001
- Monte Carlo Strategies in Scientific Computing, Liu, Springer, 2001
- Beyond the Kalman Filter: Particle Filters for Tracking Applications, Ristic, Arulampalam, Gordon, Artech House, 2003?

Papers
- On sequential Monte Carlo sampling methods for Bayesian filtering, Statistics and Computing, Vol 10, No 3, pp 197-208, 2000
- IEEE Trans. Signal Processing special issue, February 2002

Web site
- www.cs.ubc.ca/~nando/smc/index.html (includes software)

Particle Filter
Represent uncertainty over $x_{1:t}$ using a set of weighted particles
$$\{x_{1:t}^i, w_t^i\}_{i=1}^N$$

Ideally:
$$x_{1:t}^i \sim p(x_{1:t} \mid y_{1:t}) = \frac{p(x_{1:t}, y_{1:t})}{p(y_{1:t})}, \qquad p(y_{1:t}) = \int p(x_{1:t}, y_{1:t})\, dx_{1:t}$$

What if we can't sample from $p(x_{1:t} \mid y_{1:t})$?

Particle Filter - Importance Sampling


- Sample from a convenient proposal distribution $q(x_{1:t} \mid y_{1:t})$
- Use importance sampling to modify the weights:
$$\int f(x_{1:t})\, p(x_{1:t} \mid y_{1:t})\, dx_{1:t} = \int f(x_{1:t})\, \frac{p(x_{1:t} \mid y_{1:t})}{q(x_{1:t} \mid y_{1:t})}\, q(x_{1:t} \mid y_{1:t})\, dx_{1:t} \approx \sum_{i=1}^N w_t^i f(x_{1:t}^i)$$
where
$$x_{1:t}^i \sim q(x_{1:t} \mid y_{1:t}), \qquad w_t^i \propto \frac{p(x_{1:t}^i \mid y_{1:t})}{q(x_{1:t}^i \mid y_{1:t})}$$

Particle Filter - Importance Sampling


- Pick a convenient proposal
- Define the un-normalised weight:
$$\tilde{w}_t^i = \frac{p(x_{1:t}^i, y_{1:t})}{q(x_{1:t}^i \mid y_{1:t})}$$
- Can then calculate an approximation to $p(y_{1:t})$:
$$p(y_{1:t}) \approx \frac{1}{N} \sum_{i=1}^N \tilde{w}_t^i$$
- The normalised weight is
$$w_t^i = \frac{\tilde{w}_t^i}{\sum_{j=1}^N \tilde{w}_t^j}, \qquad \text{since} \quad \frac{p(x_{1:t} \mid y_{1:t})}{q(x_{1:t} \mid y_{1:t})} = \frac{p(x_{1:t}, y_{1:t})}{q(x_{1:t} \mid y_{1:t})}\, \frac{1}{p(y_{1:t})}$$
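As a minimal numerical check of the weight construction above, a self-normalised importance-sampling sketch (the Gaussian target and proposal are illustrative assumptions, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Target: unnormalised p(x) proportional to N(2, 1); proposal q(x) = N(0, 3^2).
x = rng.normal(0.0, 3.0, N)                 # x^i ~ q
log_p = -0.5 * (x - 2.0) ** 2               # unnormalised log target
log_q = -0.5 * (x / 3.0) ** 2               # log proposal (common constants cancel)
w_tilde = np.exp(log_p - log_q)             # unnormalised weights w~^i
w = w_tilde / w_tilde.sum()                 # normalised weights w^i

est_mean = float(np.sum(w * x))             # self-normalised estimate of E_p[x]
print(est_mean)                             # close to 2.0
```

Note the target density only needs to be known up to a constant: the unknown normaliser cancels when the weights are normalised, exactly as in the slide's identity.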

Particle Filter - SIS


- To perform Sequential Importance Sampling (SIS), choose a proposal of the form
$$q(x_{1:t} \mid y_{1:t}) \triangleq q(x_{1:t-1} \mid y_{1:t-1})\, q(x_t \mid x_{t-1}, y_t)$$
(keep the existing path, then extend it)
- The un-normalised weight then takes the appealing form
$$\tilde{w}_t^i = \frac{p(x_{1:t}^i, y_{1:t})}{q(x_{1:t}^i \mid y_{1:t})} = \underbrace{\frac{p(x_{1:t-1}^i, y_{1:t-1})}{q(x_{1:t-1}^i \mid y_{1:t-1})}}_{\tilde{w}_{t-1}^i}\; \underbrace{\frac{p(y_t \mid x_t^i)\, p(x_t^i \mid x_{t-1}^i)}{q(x_t^i \mid x_{t-1}^i, y_t)}}_{\text{incremental weight}}$$
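One SIS step, with the prior as proposal so the incremental weight reduces to the likelihood $p(y_t \mid x_t^i)$ — the "bootstrap" choice. The scalar random-walk model and noise levels are illustrative assumptions:

```python
import numpy as np

def sis_step(particles, log_w, y, rng, q_std=0.5, r_std=1.0):
    """Extend each path by sampling the prior, reweight by the likelihood."""
    particles = particles + rng.normal(0.0, q_std, particles.shape)  # x_t^i ~ p(x_t | x_{t-1}^i)
    log_w = log_w - 0.5 * ((y - particles) / r_std) ** 2             # w_t^i prop. to w_{t-1}^i p(y_t | x_t^i)
    log_w -= np.max(log_w)                                           # stabilise before exponentiating
    w = np.exp(log_w)
    return particles, np.log(w / w.sum())

rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 500)              # initial particles from p(x_0)
log_w = np.full(500, -np.log(500.0))       # equal initial weights
x, log_w = sis_step(x, log_w, y=0.8, rng=rng)
print(np.sum(np.exp(log_w) * x))           # weighted posterior-mean estimate
```

Iterating this step without resampling is exactly where the degeneracy discussed on the next slides appears: the weight variance grows with $t$.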

Illustration of SIS


Illustration of SIS - data conflict


SIS
Problem: whatever the importance function, degeneracy is observed (Kong, Liu and Wong 1994).

- Introduce a selection scheme: discard particles $x_{0:t}^i$ with low importance weights and multiply those with high weights
- Resampling maps the weighted random measure $\{x_{0:t}^i, w_t^i\}$ onto the equally weighted random measure $\{x_{0:t}^{i*}, N^{-1}\}$
- The scheme generates $N_i$ children of particle $i$ such that $\sum_{i=1}^N N_i = N$ and satisfies $E(N_i) = N w_t^i$
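A sketch of one such selection scheme — systematic resampling, an O(N) choice (one of the redistribution schemes listed on a later slide) that satisfies $E(N_i) = N w_t^i$:

```python
import numpy as np

def systematic_resample(weights, rng):
    """Return N parent indices; particle i is chosen E(N_i) = N * weights[i] times."""
    N = len(weights)
    # One uniform offset shared across N evenly spaced strata in [0, 1).
    positions = (rng.random() + np.arange(N)) / N
    return np.searchsorted(np.cumsum(weights), positions)

rng = np.random.default_rng(0)
w = np.array([0.1, 0.2, 0.3, 0.4])
idx = systematic_resample(w, rng)
print(idx)                 # 4 parent indices; heavily weighted particles duplicated
```

The single shared uniform makes the child counts nearly deterministic (each $N_i$ is within one of $N w_i$), which gives lower resampling variance than independent multinomial draws.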

Illustration of SIR


Illustration of SIR


Ingredients for Particle filter

- Importance sampling function
  - Prior: $p(x_t \mid x_{t-1}^{(i)})$
  - Optimal: $p(x_t \mid x_{t-1}^{(i)}, y_t)$
  - UKF, linearised EKF, ...
- Redistribution scheme
  - Multinomial
  - Deterministic
  - Residual
  - Stratified
- Careful initialisation procedure (for efficiency)

Improvements to SIR
To alleviate degeneracy problems many other methods have been proposed:

- Local linearisation (Doucet, 1998; Pitt & Shephard, 1999): using the EKF to estimate the importance distribution, or the UKF (Doucet et al., 1999)
- Rejection methods (Müller, 1991; Hürzeler & Künsch, 1998; Doucet, 1998; Pitt & Shephard, 1999)
- Auxiliary particle filters (Pitt & Shephard, 1999)
- Kernel smoothing (Gordon, 1993; Liu & West, 2000; Musso et al., 2000)
- MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al., 1997; Gilks & Berzuini, 1999; Andrieu et al., 1999)
- Bridging densities (Clapp & Godsill, 1999)

Auxiliary SIR - ASIR


- Introduced by Pitt and Shephard (1999)
- Use the importance sampling function $q(x_t, i \mid y_{1:t})$, where the auxiliary variable $i$ refers to the index of a particle at time $t-1$
- The importance distribution is chosen to satisfy
$$q(x_t, i \mid y_{1:t}) \propto p(y_t \mid \mu_t^i)\, p(x_t \mid x_{t-1}^i)\, w_{t-1}^i$$
where $\mu_t^i$ is some characterisation of $x_t$ given $x_{t-1}^i$, e.g. $\mu_t^i = E(x_t \mid x_{t-1}^i)$ or $\mu_t^i \sim p(x_t \mid x_{t-1}^i)$
- This gives
$$w_t^j \propto w_{t-1}^{i_j}\, \frac{p(y_t \mid x_t^j)\, p(x_t^j \mid x_{t-1}^{i_j})}{q(x_t^j, i_j \mid y_{1:t})} = \frac{p(y_t \mid x_t^j)}{p(y_t \mid \mu_t^{i_j})}$$

ASIR
- Naturally uses points at $t-1$ which are close to the measurement $y_t$
- If the process noise is small then ASIR is less sensitive to outliers than SIR, because a single point $\mu_t^i$ characterises $p(x_t \mid x_{t-1}^i)$ well
- But if the process noise is large then ASIR can degrade performance, since a single point $\mu_t^i$ does not characterise $p(x_t \mid x_{t-1}^i)$

Regularised PF - RPF
- Resampling introduced to reduce degeneracy
- But it also reduces diversity
- RPF proposed as a solution: uses a continuous kernel-based approximation
$$\hat{p}(x_t \mid y_{1:t}) \approx \sum_{i=1}^N w_t^i\, K_h\!\left(x_t - x_t^i\right)$$

RPF


RPF
Kernel $K(\cdot)$ and bandwidth $h$ chosen to minimise the MISE:
$$\mathrm{MISE}(\hat{p}) = E\left[\int \{\hat{p}(x_t \mid y_{1:t}) - p(x_t \mid y_{1:t})\}^2\, dx_t\right]$$

For equally weighted samples, the optimal choice is the Epanechnikov kernel:
$$K_{\mathrm{opt}}(x) = \begin{cases} \dfrac{n_x + 2}{2 c_{n_x}}\,(1 - \|x\|^2) & \text{if } \|x\| < 1 \\ 0 & \text{otherwise} \end{cases}$$
where $c_{n_x}$ is the volume of the unit hypersphere in $\mathbb{R}^{n_x}$.

The optimal bandwidth can be obtained as a function of the underlying pdf. Assuming this is Gaussian with unit covariance matrix:
$$h_{\mathrm{opt}} = \left[8\, c_{n_x}^{-1} (n_x + 4)(2\sqrt{\pi})^{n_x}\, N^{-1}\right]^{1/(n_x + 4)}$$
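The optimal bandwidth can be evaluated directly from the formula above (a sketch; the helper `unit_sphere_volume` computes $c_{n_x}$ via the standard gamma-function formula):

```python
import math

def unit_sphere_volume(nx):
    """c_nx: volume of the unit hypersphere in R^nx."""
    return math.pi ** (nx / 2) / math.gamma(nx / 2 + 1)

def h_opt(nx, N):
    """Optimal RPF bandwidth, assuming a unit-covariance Gaussian density."""
    A = (8.0 / unit_sphere_volume(nx)) * (nx + 4) * (2 * math.sqrt(math.pi)) ** nx
    return (A / N) ** (1.0 / (nx + 4))

print(h_opt(1, 1000))      # bandwidth shrinks like N^{-1/(nx+4)}
```

The $N^{-1/(n_x+4)}$ rate means the kernel widths decay slowly in high dimensions, which is why RPF regularisation is most effective for modest state dimensions.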

MCMC Moves
- RPF moves are blind; instead, introduce a Metropolis-style acceptance step
- Resample to obtain $x_k^R$, giving the path $x_{1:k}^R = \{x_k^R, x_{1:k-1}^R\}$
- Then sample from a proposal distribution $q(\cdot \mid x_k^R)$ to create $x_k^P$, giving $x_{1:k}^P = \{x_k^P, x_{1:k-1}^R\}$
- Assume $q(\cdot \mid \cdot)$ is symmetric; then set
$$x_{1:k} = \begin{cases} x_{1:k}^P & \text{with probability } \alpha \\ x_{1:k}^R & \text{otherwise} \end{cases}$$
where
$$\alpha = \min\left\{1,\; \frac{p(x_{1:k}^P \mid y_{1:k})\, q(x_k^R \mid x_k^P)}{p(x_{1:k}^R \mid y_{1:k})\, q(x_k^P \mid x_k^R)}\right\} = \min\left\{1,\; \frac{p(y_k \mid x_k^P)\, p(x_k^P \mid x_{k-1}^R)}{p(y_k \mid x_k^R)\, p(x_k^R \mid x_{k-1}^R)}\right\}$$
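A sketch of this accept/reject step for one particle: with a symmetric proposal the path terms up to $k-1$ cancel, leaving only the likelihood and transition ratio. The scalar Gaussian model and all numerical values are illustrative assumptions:

```python
import numpy as np

def mcmc_move(x_k, x_km1, y_k, rng, step=0.2, q_std=0.5, r_std=1.0):
    """Metropolis move after resampling, symmetric random-walk proposal."""
    def log_target(x):
        # log p(y_k | x) + log p(x | x_{k-1}), up to additive constants
        return (-0.5 * ((y_k - x) / r_std) ** 2
                - 0.5 * ((x - x_km1) / q_std) ** 2)

    x_prop = x_k + rng.normal(0.0, step)          # x^P ~ q(. | x^R), symmetric
    log_alpha = log_target(x_prop) - log_target(x_k)
    if np.log(rng.random()) < log_alpha:
        return x_prop                              # accept the proposed move
    return x_k                                     # otherwise keep resampled value

rng = np.random.default_rng(0)
moved = mcmc_move(x_k=0.3, x_km1=0.0, y_k=1.0, rng=rng)
print(moved)
```

Because each particle already targets the posterior, a single move per particle suffices to restore diversity after resampling without any burn-in.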

Tracking dim targets


- Detection and tracking of low-SNR targets: better not to threshold the sensor data!
- The concept: track-before-detect (TBD)
- Conventional TBD approaches:
  - Hough transform (Carlson et al., 1994)
  - dynamic programming (Barniv, 1990; Arnold et al., 1993)
  - maximum likelihood (Tonissen, 1994)
- Performance improved by 3-5 dB in comparison to the MHT (thresholded data)

Recursive Bayesian TBD


- Drawbacks of conventional TBD approaches: batch processing; prohibit or penalise deviations from straight-line motion; require enormous computational resources
- A recursive Bayesian TBD (Salmond, 2001), implemented as a particle filter:
  - no need to store/process multiple scans
  - target motion modelled by a stochastic dynamic equation
  - valid for non-Gaussian and structured background noise
  - the effects of the point-spread function, finite resolution, and unknown, fluctuating SNR are accommodated
  - target presence and absence explicitly modelled
- Run demo

Mathematical formulation
Target state vector: $x_k = [\, x_k \;\; \dot{x}_k \;\; y_k \;\; \dot{y}_k \;\; I_k \,]^T$

State dynamics: $x_{k+1} = f_k(x_k, v_k)$

Target existence: a two-state Markov chain, $E_k \in \{0, 1\}$, with transition probability matrix
$$\Pi = \begin{bmatrix} 1 - P_b & P_b \\ P_d & 1 - P_d \end{bmatrix}$$

Mathematical formulation (Contd)


Sensor model: a 2D map, an image of a region divided into resolution cells of size $\Delta_x \times \Delta_y$. At each cell $(i, j)$ the measured intensity is
$$z_k^{(i,j)} = \begin{cases} h_k^{(i,j)}(x_k) + w_k^{(i,j)} & \text{if target present} \\ w_k^{(i,j)} & \text{if target absent} \end{cases}$$
where
$$h_k^{(i,j)}(x_k) \approx \frac{\Delta_x \Delta_y\, I_k}{2\pi \Sigma^2} \exp\left(-\frac{(i\Delta_x - x_k)^2 + (j\Delta_y - y_k)^2}{2\Sigma^2}\right)$$
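The point-spread contribution $h_k^{(i,j)}$ can be coded directly from the formula above (a sketch; the 20 × 20 grid, unit cell sizes, and blur $\Sigma = 0.7$ are illustrative assumptions):

```python
import numpy as np

def target_contribution(x, y, I, nx=20, ny=20, dx=1.0, dy=1.0, sigma=0.7):
    """Intensity deposited in each resolution cell by a point target at (x, y)
    with intensity I, blurred by the sensor point-spread function."""
    i = np.arange(1, nx + 1)[:, None] * dx     # cell centres along x (rows)
    j = np.arange(1, ny + 1)[None, :] * dy     # cell centres along y (columns)
    return (dx * dy * I / (2 * np.pi * sigma**2)) * np.exp(
        -((i - x) ** 2 + (j - y) ** 2) / (2 * sigma**2))

# Target-present frame: spread target energy plus unit-variance sensor noise.
rng = np.random.default_rng(0)
frame = target_contribution(x=10.3, y=5.7, I=20.0) + rng.normal(0.0, 1.0, (20, 20))
```

Because the spread of the target energy over neighbouring cells is modelled explicitly, the likelihood can accumulate sub-threshold evidence across frames — the essence of TBD.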

Example
- 30 frames; target present in frames 7 to 22
- SNR = 6.7 dB (unknown)
- 20 × 20 cells

[Figure: sample frames 2, 7, 12, 17, 22 and 27.]

Particle filter output (6 states)

[Figures suppressed to reduce file size.]

Particle filter output (6 states)

[Figure: true vs estimated target path in the x-y plane (start marked), and probability of target existence vs frame number.]

PF based TBD - Performance


- Derived CRLBs (as a function of SNR)
- Compared PF-TBD to the CRLBs
- Detection and tracking reliable at 5 dB (or higher)

Sonobuoy and Submarine


- Noisy bearing measurements from drifting sensors
- Uncertainty in sensor locations
- Sensor loss
- High proportion of spurious bearing measurements
- Run demo

Blind Doppler
- Blind Doppler zone used to filter out ground clutter and ground moving targets
- A simple EP measure against any CW or pulse-Doppler radar
- Causes track loss
- Aided by on-board RWR or ESM

Tracking with hard constraints


- Prior information on sensor constraints
- Posterior pdf is truncated (non-Gaussian)
- Example: 2-D tracking with a CV model, $(r, \theta, \dot{r})$ measurements, $p_d < 1$
- EKF and particle filter with identical gating and track scoring

EKF

[Figure: (a) range-rate [km/h] vs time [s]; (b) estimated trajectory in the X-Y plane [km]; (c) track score vs time [s], showing track loss.]

Particle Filter

[Figure: (a) range-rate [km/h] vs time [s]; (b) estimated trajectory in the X-Y plane [km], with the cloud of particles at t = 108 s; (c) track score vs time [s].]

Track continuity

[Figure: probability of track maintenance vs Doppler blind-zone limit [km/h] for the PF and the EKF, for revisit times T = 2 s and T = 4 s.]

Problems hindering SMC


- Convergence results: becoming available (Del Moral, Crisan, Chopin, Lyons, Doucet, ...)
- Communication bandwidth: large particle sets impossible to send
- Interoperability: need to integrate with varied tracking algorithms
- Computation: expensive, so look to minimise the Monte Carlo component
- Multi-target problems: Rao-Blackwellisation

Final comments
- Sequential Monte Carlo methods: optimal filtering for nonlinear/non-Gaussian models
- Flexible and parallelisable
- Not a black box: efficiency depends on careful design
- If the Kalman filter is appropriate for your application, use it
- A Kalman filter is a (one) particle filter
