Introduction to particle filters: a trajectory tracking example
Alexis Huet
23 May 2014
Outline
1. Definitions and issues: Markov chains; Hidden Markov models; Issues
2. Particle filtering algorithms: Sequential Importance Sampling algorithm; Sequential Importance Sampling Resampling algorithm; Comparison of algorithms
3. Real applications
Markov chains
We consider the following state space: $(\mathbb{R}^n, \mathcal{L}(\mathbb{R}^n))$.
Definition
A sequence of random variables $(X_k)_{k \in \mathbb{N}}$ taking values in $\mathbb{R}^n$ is a Markov chain if, for all $k \geq 1$, $x_0, \ldots, x_{k-1} \in \mathbb{R}^n$ and $A \in \mathcal{L}(\mathbb{R}^n)$:
$$P(X_k \in A \mid X_{k-1} = x_{k-1}, \ldots, X_0 = x_0) = P(X_k \in A \mid X_{k-1} = x_{k-1}).$$
Hypothesis
In the sequel, the Markov chain is homogeneous and admits a family of density functions $(p(\cdot \mid x))_{x \in \mathbb{R}^n}$ such that:
$$P(X_k \in A \mid X_{k-1} = x_{k-1}) = \int_A p(x_k \mid x_{k-1})\, dx_k.$$
Example
The state space here is $\mathbb{R}^2$. For all $x \in \mathbb{R}^2$, let $B(x, 1)$ be the unit ball centered at $x$.
Definition
For all $x \in \mathbb{R}^2$, the distribution $\mathrm{Unif}(B(x, 1))$ is defined by its density:
$$\frac{1}{\pi}\, \mathbf{1}(\cdot \in B(x, 1)).$$
For the initial condition, we take:
$$X_0 \sim \mathrm{Unif}(B(0, 1)),$$
and for the transition distributions:
$$X_k \mid (X_{k-1} = x_{k-1}) \sim \mathrm{Unif}(B(x_{k-1}, 1)).$$
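As an illustration, the chain can be simulated in a few lines. This is a minimal sketch, assuming Python with NumPy; the helper names sample_unit_ball and simulate_chain are ours, not from the slides.

import numpy as np

rng = np.random.default_rng(0)

def sample_unit_ball(center, rng):
    # Draw one point uniformly from B(center, 1) by rejection sampling.
    while True:
        u = rng.uniform(-1.0, 1.0, size=2)
        if u @ u <= 1.0:
            return center + u

def simulate_chain(m, rng):
    # Simulate x_0, ..., x_{m-1} with X_0 ~ Unif(B(0, 1)) and
    # X_k | (X_{k-1} = x_{k-1}) ~ Unif(B(x_{k-1}, 1)).
    x = np.zeros((m, 2))
    x[0] = sample_unit_ball(np.zeros(2), rng)
    for k in range(1, m):
        x[k] = sample_unit_ball(x[k - 1], rng)
    return x

path = simulate_chain(10, rng)  # ten positions, as in the plotted example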
[Figure: a simulated realization of the chain; the successive positions $x_0, \ldots, x_9$ are plotted in the plane.]
Hidden Markov models
Definition
$(X_k, Y_k)_{k \in 0:m-1}$ is a hidden Markov model if: $(X_k)_{k \in 0:m-1}$ is a Markov chain, $(Y_k)_{k \in 0:m-1}$ are independent conditionally on $(X_k)_{k \in 0:m-1}$, and for all $k$, $Y_k$ depends only on $X_k$.
Schematically, we have:
$$\begin{array}{ccccccccc}
X_0 & \longrightarrow & X_1 & \longrightarrow & X_2 & \longrightarrow & \cdots & \longrightarrow & X_{m-1} \\
\downarrow & & \downarrow & & \downarrow & & & & \downarrow \\
Y_0 & & Y_1 & & Y_2 & & \cdots & & Y_{m-1}
\end{array}$$
Example
We choose $(Y_k)$ taking values in $(-\pi, \pi]$. Conditionally on $X_k = x_k$, we define:
$$Y_k = \mathrm{Arg}(x_k) + \varepsilon,$$
where $\varepsilon \sim \mathcal{N}(0, \sigma^2)$.
The corresponding density is $y_k \longmapsto p(y_k \mid x_k)$.
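A minimal sketch of the observation step, under the same assumptions as the previous sketch (NumPy; path and rng come from that sketch, and the function names are ours). The density neglects the wrap-around of the normal distribution, which is reasonable for small sigma.

def simulate_observations(path, sigma, rng):
    # Y_k = Arg(x_k) + eps with eps ~ N(0, sigma^2), wrapped back into ]-pi, pi].
    angles = np.arctan2(path[:, 1], path[:, 0])
    y = angles + rng.normal(0.0, sigma, size=len(path))
    return np.angle(np.exp(1j * y))

def obs_density(y_k, x_k, sigma):
    # p(y_k | x_k): Gaussian density of the angular error between y_k and Arg(x_k).
    err = np.angle(np.exp(1j * (y_k - np.arctan2(x_k[..., 1], x_k[..., 0]))))
    return np.exp(-0.5 * (err / sigma) ** 2) / (sigma * np.sqrt(2.0 * np.pi))

sigma = 0.1
y = simulate_observations(path, sigma, rng)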
Example ($\sigma = 0.1$)

[Figure: the simulated positions $x_0, \ldots, x_9$ together with the directions given by the noisy observations $y_0, \ldots, y_9$.]
Issues
Now the hidden chain $(x_k)$ is unknown and we only have the observations $(y_k)$.
Aim: reconstruct $x_{0:m-1} := (x_0, \ldots, x_{m-1})$ conditionally on the observations.
Specifically, generate a sample according to the following density:
$$p(x_{0:m-1} \mid y_{0:m-1})\, dx_{0:m-1}.$$
Sequential Importance Sampling (SIS) algorithm
We try to simulate a sample from the density $p(x_0 \mid y_0)\, dx_0$:
$$p(x_0 \mid y_0) = \frac{p(x_0, y_0)}{p(y_0)} = \frac{1}{p(y_0)}\, p(y_0 \mid x_0)\, p(x_0).$$
We can generate a sample from $p(x_0)\, dx_0$.
We can compute $x_0 \mapsto p(y_0 \mid x_0)$.
$$p(x_0 \mid y_0)\, dx_0 = \frac{1}{p(y_0)}\, p(y_0 \mid x_0)\, p(x_0)\, dx_0.$$
SIS algorithm (first step):
Generate a sample of length $N$ from $p(x_0)\, dx_0$: $x_0^{(1)}, \ldots, x_0^{(N)}$.
Compute for each particle $j$:
$$w_0^{(j)} = \frac{1}{p(y_0)}\, p(y_0 \mid x_0^{(j)}).$$
The weighted sample $(x_0^{(j)})_{j \in 1:N}$ which approximates $p(x_0 \mid y_0)\, dx_0$ is:
$$\sum_{j=1}^{N} \frac{w_0^{(j)}}{\sum_{j'=1}^{N} w_0^{(j')}}\, \mathbf{1}_{x_0^{(j)}}(dx_0).$$
Since the weights are normalized, the unknown constant $1/p(y_0)$ can be dropped:
$$p(x_0 \mid y_0)\, dx_0 = \frac{1}{p(y_0)}\, p(y_0 \mid x_0)\, p(x_0)\, dx_0.$$
SIS algorithm (first step):
Generate a sample of length $N$ from $p(x_0)\, dx_0$: $x_0^{(1)}, \ldots, x_0^{(N)}$.
Compute for each particle $j$:
$$w_0^{(j)} = p(y_0 \mid x_0^{(j)}).$$
The weighted sample $(x_0^{(j)})_{j \in 1:N}$ which approximates $p(x_0 \mid y_0)\, dx_0$ is:
$$\sum_{j=1}^{N} \frac{w_0^{(j)}}{\sum_{j'=1}^{N} w_0^{(j')}}\, \mathbf{1}_{x_0^{(j)}}(dx_0).$$
Example (SIS), $N = 20$

[Figure: the $N = 20$ particles at the first site with their relative weights; only two particles have non-negligible weights (0.93 and 0.07), all the others are at 0.00.]
$$p(x_{0:i} \mid y_{0:i})\, dx_{0:i} = \frac{1}{p(y_{0:i})}\, p(y_0 \mid x_0) \cdots p(y_i \mid x_i)\, p(x_0)\, p(x_1 \mid x_0) \cdots p(x_i \mid x_{i-1})\, dx_{0:i}.$$
SIS algorithm (step $i$). For each particle $j \in 1:N$:
Conditionally on $x_{i-1}^{(j)}$, generate a sample according to $p(x_i \mid x_{i-1})\, dx_i$: $x_i^{(1)}, \ldots, x_i^{(N)}$.
Compute for each particle $j$:
$$w_i^{(j)} = w_{i-1}^{(j)} \times p(y_i \mid x_i^{(j)}).$$
The weighted sample $(x_{0:i}^{(j)})_{j \in 1:N}$ which approximates $p(x_{0:i} \mid y_{0:i})\, dx_{0:i}$ is:
$$\sum_{j=1}^{N} \frac{w_i^{(j)}}{\sum_{j'=1}^{N} w_i^{(j')}}\, \mathbf{1}_{x_{0:i}^{(j)}}(dx_{0:i}).$$
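Putting the two steps together gives the following sketch of SIS for this example. It reuses the hypothetical sample_unit_ball and obs_density helpers from the earlier sketches; log-weights are used only for numerical stability.

def normalized(logw):
    # Turn log-weights into weights summing to one.
    w = np.exp(logw - logw.max())
    return w / w.sum()

def sis(y, N, sigma, rng):
    # Propagate the particles with p(x_i | x_{i-1}), multiply the weights by
    # p(y_i | x_i), never resample. Returns trajectories (m, N, 2) and the
    # normalized weights (m, N) at every site.
    m = len(y)
    x = np.zeros((m, N, 2))
    weights = np.zeros((m, N))
    logw = np.zeros(N)
    x[0] = np.array([sample_unit_ball(np.zeros(2), rng) for _ in range(N)])
    logw += np.log(obs_density(y[0], x[0], sigma) + 1e-300)
    weights[0] = normalized(logw)
    for i in range(1, m):
        x[i] = np.array([sample_unit_ball(x[i - 1, j], rng) for j in range(N)])
        logw += np.log(obs_density(y[i], x[i], sigma) + 1e-300)
        weights[i] = normalized(logw)
    return x, weights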
Example (SIS), $N = 20$

[Figure: the weighted particle trajectories over the next sites; after a few sites a single particle carries a relative weight of 1.00 while all the others are at 0.00.]
Convergence and degeneracy problem (SIS)
Convergence:
for a fixed chain length $m$, a central limit theorem holds as the number of particles $N \to +\infty$.
Degeneracy problem:
many particles have a relative weight close to 0%;
for $N$ fixed, after a certain number of sites only one particle carries a significant relative weight.
→ Resampling of the particles, as sketched below.
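Multinomial resampling, the scheme used in the SISR algorithm that follows, can be sketched in a few lines (NumPy assumed; the function name resample is ours):

def resample(particles, weights, rng):
    # Draw N indices with probabilities proportional to the weights, duplicate
    # the selected particles and reset every weight to 1/N.
    N = len(weights)
    idx = rng.choice(N, size=N, p=weights / weights.sum())
    return particles[idx], np.full(N, 1.0 / N)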
$$p(x_0 \mid y_0)\, dx_0 = \frac{1}{p(y_0)}\, p(y_0 \mid x_0)\, p(x_0)\, dx_0.$$
SISR algorithm (first step):
Generate a sample of length $N$ according to $p(x_0)\, dx_0$: $x_0^{(1)}, \ldots, x_0^{(N)}$.
Compute for each particle $j$:
$$w_0^{(j)} = p(y_0 \mid x_0^{(j)}).$$
Generate $(x_0^{(j)})_{j \in 1:N}$ from:
$$\sum_{j=1}^{N} \frac{w_0^{(j)}}{\sum_{j'=1}^{N} w_0^{(j')}}\, \mathbf{1}_{x_0^{(j)}}(dx_0)$$
and let $(w_0^{(j)})_{j \in 1:N} \equiv 1$.
SISR algorithm (step $i$). For each particle $j \in 1:N$:
Conditionally on $x_{i-1}^{(j)}$, generate a sample according to $p(x_i \mid x_{i-1})\, dx_i$: $x_i^{(1)}, \ldots, x_i^{(N)}$.
Compute for each particle $j$:
$$w_i^{(j)} = w_{i-1}^{(j)} \times p(y_i \mid x_i^{(j)}) = p(y_i \mid x_i^{(j)}).$$
Generate $(x_i^{(j)})_{j \in 1:N}$ from:
$$\sum_{j=1}^{N} \frac{w_i^{(j)}}{\sum_{j'=1}^{N} w_i^{(j')}}\, \mathbf{1}_{x_i^{(j)}}(dx_i)$$
and let $(w_i^{(j)})_{j \in 1:N} \equiv 1$.
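A minimal SISR sketch for the same example, reusing the hypothetical sample_unit_ball and obs_density helpers. The slides state the resampling step for the current position $x_i$; here the stored trajectories are resampled as a whole so that the history of each surviving particle stays consistent.

def sisr(y, N, sigma, rng):
    # Propagate, weight by p(y_i | x_i), resample, reset the weights to 1.
    # Returns the resampled trajectories, shape (m, N, 2).
    m = len(y)
    x = np.zeros((m, N, 2))
    x[0] = np.array([sample_unit_ball(np.zeros(2), rng) for _ in range(N)])
    w = obs_density(y[0], x[0], sigma)
    x[0] = x[0][rng.choice(N, size=N, p=w / w.sum())]
    for i in range(1, m):
        x[i] = np.array([sample_unit_ball(x[i - 1, j], rng) for j in range(N)])
        w = obs_density(y[i], x[i], sigma)
        idx = rng.choice(N, size=N, p=w / w.sum())
        x[: i + 1] = x[: i + 1, idx]  # duplicate or discard whole trajectories
    return x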
Example (SISR)

[Figure: the SISR particles over the first sites; at each site the relative weights concentrate on a few particles (for instance 0.72 and 0.28, then 0.36, 0.35 and 0.22), and after each resampling all 20 particles again carry an equal weight of 0.05.]
Convergence (SISR)
The weight degeneracy problem is eliminated.
Convergence:
for a fixed chain length $m$, a central limit theorem holds as the number of particles $N \to +\infty$.
Example
Comparison with $m = 100$ sites and $N = 10000$ particles.
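With the sketches above, the comparison could be reproduced roughly as follows (hypothetical code, using the helper names introduced earlier):

m, N, sigma = 100, 10000, 0.1
rng = np.random.default_rng(42)

hidden = simulate_chain(m, rng)                # the true trajectory, unknown in practice
y = simulate_observations(hidden, sigma, rng)  # the only available data

x_sis, w_sis = sis(y, N, sigma, rng)
mean_sis = np.einsum('ij,ijk->ik', w_sis, x_sis)  # weighted posterior mean at each site

x_sisr = sisr(y, N, sigma, rng)
mean_sisr = x_sisr.mean(axis=1)                   # equally weighted mean at each site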
[Figure: the true hidden trajectory, $m = 100$ sites labeled 0 to 99, plotted in the plane.]
Example (SIS)

[Figure: SIS reconstruction of the hidden trajectory with $m = 100$ sites and $N = 10000$ particles.]

Example (SISR)

[Figure: SISR reconstruction of the same trajectory with $m = 100$ sites and $N = 10000$ particles.]
Path tracking
Observations: the noisy positions.
Hidden states: the true positions.
Parameters: the behavior of the moving body.
Voice recognition system
Observations: a spoken word, cut into 15 ms frames.
Hidden states: the phonemes that produced the spoken word.
Parameters: the set of all dictionary words.
Phylogenetic analysis
Observations: DNA sequences of several species at the leaves of a tree graph.
Hidden states: all DNA sequences from the common ancestral sequence up to the present time.
Parameters: mutation parameters, lengths of the tree branches.
Phylogenetic analysis
i:       1 2 3 4 5 6 7 8 9
t = −1:  A C G A G G T G A
         A C A A G G T G A
         A C A G G G T G A
         A C A G G G C G A
         A C A G G G C A A
t = 0:   A C A G G G C A A
In practice, only the present-day sequence is observed and the past sequences are hidden:

i:       1 2 3 4 5 6 7 8 9
t = −1:  ? ? ? ? ? ? ? ? ?
         ? ? ? ? ? ? ? ? ?
         ? ? ? ? ? ? ? ? ?
         ? ? ? ? ? ? ? ? ?
         ? ? ? ? ? ? ? ? ?
t = 0:   A C A G G G C A A