Post on 22-Dec-2015
SLAM: Simultaneous Localization and Mapping: Part II
by Tim Bailey and Hugh Durrant-Whyte
Presented by Chang Young Kim
These slides are based on: Probabilistic Robotics, S. Thrun, W. Burgard, D. Fox, MIT Press, 2005.
Many images are also taken from Probabilistic Robotics. http://www.probabilistic-robotics.com
Overview
• Review SLAM
• Reducing complexity: State Augmentation, Partitioned Updates, Sparsification
• Data association: Batch Gating, SIFT, Multi-Hypothesis
• Future work
What is SLAM?
Given:
• The robot's controls
• Observations of nearby features
Estimate:
• Map of features
• Path of the robot
A robot is exploring an unknown, static environment.
Terminology
• Robot state (or pose): position and heading, x_t = (x, y, θ); path x_{1:t} = {x_1, x_2, ..., x_t}
• Robot controls: robot motion and manipulation, u_{1:t} = {u_1, u_2, ..., u_t}
• Sensor measurements: range scans, images, etc., z_{1:t} = {z_1, z_2, ..., z_t}
• Landmark or map: m = {m_1, ..., m_n}
Terminology
• Observation model: P(z_t | x_t) or P(z_t | x_t, m), the probability of a measurement z_t given that the robot is at pose x_t in map m.
• Motion model: P(x_t | x_{t-1}, u_t), the posterior probability that action u_t carries the robot from x_{t-1} to x_t.
SLAM algorithm
Prediction:
\overline{bel}(x_t, m) = \int p(x_t \mid u_t, x_{t-1}) \, bel(x_{t-1}, m) \, dx_{t-1}
Update:
bel(x_t, m) = \eta \, p(z_t \mid x_t, m) \, \overline{bel}(x_t, m)
where bel(x_t, m) = p(x_t, m \mid z_{1:t}, u_{1:t})
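The prediction/update recursion above can be sketched as a minimal Bayes filter on a 1-D grid. This is a hypothetical toy example (the grid, the 0.8/0.2 motion probabilities, and the sensor likelihoods are all made up for illustration); a real SLAM filter estimates the pose and map jointly.

```python
# Minimal sketch of the Bayes-filter recursion behind the SLAM
# prediction/update equations, on a 1-D grid of cells.

def predict(belief, motion_model):
    """Prediction: bel_bar(x_t) = sum_{x_prev} p(x_t | u_t, x_prev) bel(x_prev)."""
    n = len(belief)
    new_belief = [0.0] * n
    for x_prev, p_prev in enumerate(belief):
        for x_t in range(n):
            new_belief[x_t] += motion_model(x_t, x_prev) * p_prev
    return new_belief

def update(belief, likelihood):
    """Update: multiply by p(z_t | x_t) and renormalise (the eta term)."""
    posterior = [b * l for b, l in zip(belief, likelihood)]
    eta = sum(posterior)
    return [p / eta for p in posterior]

# Hypothetical motion: move one cell right with prob 0.8, stay with 0.2.
def motion(x_t, x_prev):
    if x_t == x_prev + 1:
        return 0.8
    if x_t == x_prev:
        return 0.2
    return 0.0

bel = [1.0, 0.0, 0.0, 0.0]               # robot known to start in cell 0
bel = predict(bel, motion)               # mass spreads to cells 0 and 1
bel = update(bel, [0.1, 0.9, 0.1, 0.1])  # sensor strongly favours cell 1
```

After the update, nearly all probability mass sits on cell 1, illustrating how the measurement sharpens the predicted belief.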
EKF State Space Model
Maintained values: Bel(x_t, m) and its covariance matrix P_t. With N landmarks, the map state is a (3+2N)-dimensional Gaussian.
Bel(x_t, m) = \mathcal{N}\!\left(
\begin{pmatrix} x \\ y \\ \theta \\ m_1 \\ \vdots \\ m_N \end{pmatrix},
\begin{pmatrix}
\sigma_x^2 & \sigma_{xy} & \sigma_{x\theta} & \sigma_{x m_1} & \cdots & \sigma_{x m_N} \\
\sigma_{xy} & \sigma_y^2 & \sigma_{y\theta} & \sigma_{y m_1} & \cdots & \sigma_{y m_N} \\
\sigma_{x\theta} & \sigma_{y\theta} & \sigma_\theta^2 & \sigma_{\theta m_1} & \cdots & \sigma_{\theta m_N} \\
\sigma_{x m_1} & \sigma_{y m_1} & \sigma_{\theta m_1} & \sigma_{m_1}^2 & \cdots & \sigma_{m_1 m_N} \\
\vdots & \vdots & \vdots & \vdots & \ddots & \vdots \\
\sigma_{x m_N} & \sigma_{y m_N} & \sigma_{\theta m_N} & \sigma_{m_1 m_N} & \cdots & \sigma_{m_N}^2
\end{pmatrix}\right)
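The (3+2N)-dimensional joint state can be sketched directly. This is a minimal illustration with hypothetical initial values (known pose, highly uncertain landmarks); a real implementation would fill in the cross-covariances as landmarks are observed.

```python
# Sketch of the EKF-SLAM joint state: a (3+2N)-dimensional mean and a
# (3+2N)x(3+2N) covariance for a pose (x, y, theta) plus N 2-D landmarks.

N = 3                                    # number of landmarks (hypothetical)
dim = 3 + 2 * N                          # (3+2N)-dimensional Gaussian

mean = [0.0] * dim                       # pose and landmark estimates
cov = [[0.0] * dim for _ in range(dim)]  # full joint covariance P_t

# Pose initially known exactly; landmarks start with large uncertainty
# on their diagonal entries until they are observed.
for i in range(3, dim):
    cov[i][i] = 1e6
```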
EKF-SLAM
Overview
• Review SLAM
• Reducing complexity: State Augmentation, Partitioned Updates, Sparsification
• Data association: Batch Gating, SIFT, Multi-Hypothesis
• Future work
Complexity: O(N³) with N landmarks, due to the covariance matrix and the matrix multiplications with the Jacobians.
Can it handle hundreds of dimensions? The cost can be reduced by approximation methods:
• State augmentation for the prediction stage
• Partitioned updates for the update stage
• Sparsification using an information form
EKF-SLAM : Complexity
Prediction: \overline{bel}(x_t, m) = \int p(x_t \mid u_t, x_{t-1}) \, bel(x_{t-1}, m) \, dx_{t-1}
State Augmentation (Prediction)
Solution: state augmentation
• Separate the state into an augmented state.
• Update only the affected matrices; the map is static during prediction.
• Covariance prediction: O(N³) → O(N)
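The O(N) claim can be illustrated with a stripped-down covariance prediction. This sketch uses a 1-D pose and scalar landmarks (a hypothetical simplification of the full 2-D case) to show that only the pose block and the pose-map cross-covariances change; the map block is untouched.

```python
# Sketch of why state-augmented prediction is O(N): only the pose block
# P[0][0] and the pose-map cross-covariances P[0][j] change during
# prediction; the map block stays static.

def predict_covariance(P, F_x, Q):
    """P is (1+N)x(1+N); F_x is the scalar pose Jacobian; Q is motion noise.
    Touches O(N) entries instead of transforming the whole matrix."""
    n = len(P)
    P[0][0] = F_x * P[0][0] * F_x + Q   # pose block
    for j in range(1, n):               # cross-covariances: O(N) work
        P[0][j] = F_x * P[0][j]
        P[j][0] = P[0][j]
    return P                            # map block P[1:][1:] untouched

# Hypothetical 1 pose + 2 landmarks covariance.
P = [[1.0, 0.5, 0.2],
     [0.5, 2.0, 0.0],
     [0.2, 0.0, 3.0]]
predict_covariance(P, F_x=1.0, Q=0.1)
```

With F_x = 1 and Q = 0.1, only the pose variance grows; the landmark-landmark entries are never read or written.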
13
Partitioned Updates (Update)
Solution: partitioned update with a local submap.
• Confine the map update to a small local region.
• Update only that small local region at sensor rate.
• Update the whole map only at a much lower frequency.
Update: bel(x_t, m) = \eta \, p(z_t \mid x_t, m) \, \overline{bel}(x_t, m)
Partitioned Updates
Local state: Bel_L(x_t, m_L), updated by local SLAM over the current submap.
Global state: Bel_G(x_t, m_G), covering the whole map; the local estimates are periodically registered into it.
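The local/global split can be sketched as two maps maintained at different rates. Landmark IDs, values, and the registration period below are hypothetical; the point is only that the cheap local update runs every step while the expensive global registration runs rarely.

```python
# Sketch of the partitioned-update idea: observations update a small
# local submap at sensor rate; the global map is registered at a much
# lower frequency.

local_map = {}        # landmarks in the current local region
global_map = {}       # full map, updated infrequently
REGISTER_EVERY = 100  # register globally every 100 steps (hypothetical)

def local_update(landmark_id, estimate):
    local_map[landmark_id] = estimate  # cheap: touches the local region only

def register_globally():
    global_map.update(local_map)       # expensive step, done rarely
    local_map.clear()                  # start a fresh local region

for step in range(250):
    local_update(step % 5, float(step))
    if (step + 1) % REGISTER_EVERY == 0:
        register_globally()
```

Over 250 steps the global registration runs only twice, yet the global map still ends up consistent with the locally refined estimates.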
• The state Bel(x_t, m) and covariance matrix P_t form a Gaussian probability density,
• which implicitly describes the two central moments of the Gaussian.
• The same Gaussian can be written in moment form or information form.
• Sparsification: P_t → Y_t
• Many off-diagonal components of Y_t are very close to 0, so they can be set to zero.
Sparsification
Moment form and information form:
Bel(x_t, m) = \mathcal{N}(\hat{\mu}_t, P_t) \equiv \mathcal{N}^{-1}(\hat{y}_t, Y_t),
where Y_t = P_t^{-1} and \hat{y}_t = Y_t \hat{\mu}_t
Sparsification
Covariance prediction using the sparsified information form: O(N³) → O(N)
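The sparsification step itself is simple to sketch: near-zero off-diagonal entries of the information matrix Y_t are forced to exactly zero. The matrix values and the threshold below are hypothetical.

```python
# Sketch of sparsification in the information form: many off-diagonal
# entries of Y_t = P_t^{-1} are near zero and can be set to exactly
# zero, yielding a sparse matrix that is cheap to maintain.

def sparsify(Y, threshold=1e-3):
    n = len(Y)
    for i in range(n):
        for j in range(n):
            if i != j and abs(Y[i][j]) < threshold:
                Y[i][j] = 0.0          # drop weak off-diagonal links
    return Y

# Hypothetical 3x3 information matrix with two weak couplings.
Y = [[ 4.0,    0.0005,  0.2   ],
     [ 0.0005, 3.0,    -0.0001],
     [ 0.2,   -0.0001,  5.0   ]]
sparsify(Y)
```

Only the strong coupling (0.2) survives; the diagonal, which carries the marginal information, is never modified.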
Overview
• Review SLAM
• Computational complexity: State Augmentation, Partitioned Updates, Sparsification
• Data association: Batch Gating, SIFT, Multi-Hypothesis
• Future work
Data Association Problem
Which observation belongs to which landmark?
A robust SLAM system must consider the possible data associations.
Solutions, three key methods: Batch Gating, SIFT, Multi-Hypothesis
Batch Gating
• Basic principle of batch association: RANSAC
• Gating: constrained by the robot position estimate
< taken from T. Bailey, "Mobile robot localization and mapping in extensive outdoor environments," Ph.D. dissertation >
If the true robot movement is as shown, the left case is chosen by the gating.
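A single gate can be sketched as a chi-square test on the Mahalanobis distance between an observation and a landmark's predicted measurement. The innovation values and covariance below are hypothetical; batch gating applies such constraints to many associations jointly rather than one at a time.

```python
# Sketch of a chi-square gate: associate an observation with a landmark
# only if the squared Mahalanobis distance of the innovation falls
# inside the gate.

def mahalanobis2(innov, S):
    """Squared Mahalanobis distance for a 2-D innovation, 2x2 covariance S."""
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    Sinv = [[ S[1][1] / det, -S[0][1] / det],
            [-S[1][0] / det,  S[0][0] / det]]
    a, b = innov
    return (a * (Sinv[0][0] * a + Sinv[0][1] * b)
            + b * (Sinv[1][0] * a + Sinv[1][1] * b))

GATE = 5.99  # 95% gate for chi-square with 2 degrees of freedom

S = [[0.5, 0.0], [0.0, 0.5]]                  # hypothetical innovation covariance
accept = mahalanobis2([0.3, -0.2], S) < GATE  # small innovation: inside gate
reject = mahalanobis2([3.0,  3.0], S) < GATE  # large innovation: outside gate
```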
SIFT
• Batch gating alone is not enough for reliable data association; SIFT features have "landmark quality" for SLAM.
• SIFT correspondences tend to be reliable and recognizable under variable conditions.
< taken from "Distinctive Image Features from Scale-Invariant Keypoints", David G. Lowe, IJCV 2004 >
Multi-Hypothesis Data Association
• Multi-hypothesis data association generates a separate track estimate for each association hypothesis.
• Low-likelihood tracks are pruned.
• FastSLAM is inherently a multi-hypothesis solution because its data association is done on a per-particle basis.
(Figure: each particle holds a robot pose (x, y, θ) together with its own estimates of Landmark 1, Landmark 2, ..., Landmark M; Particle #1, Particle #2, ..., Particle #N.)
Per-Particle Data Association
Was the observation generated by the red or the blue landmark?
P(observation | red) = 0.3, P(observation | blue) = 0.7
• Per-particle data association: pick the most probable match.
• If the probability is too low, generate a new landmark.
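The per-particle decision rule above can be sketched in a few lines. The likelihood values reuse the red/blue example from the slide; the new-landmark threshold of 0.1 is a hypothetical choice.

```python
# Sketch of FastSLAM's per-particle data association: each particle
# independently picks the most probable landmark for an observation,
# or spawns a new landmark when every likelihood is too low.

NEW_LANDMARK_THRESHOLD = 0.1  # hypothetical cut-off

def associate(likelihoods):
    """likelihoods: {landmark_id: p(observation | landmark)} for one particle."""
    if not likelihoods:
        return "new"
    best = max(likelihoods, key=likelihoods.get)
    if likelihoods[best] < NEW_LANDMARK_THRESHOLD:
        return "new"              # generate a new landmark
    return best                   # most probable match

# The red/blue example from the slide: this particle picks blue.
choice = associate({"red": 0.3, "blue": 0.7})
lonely = associate({"red": 0.05})  # every match too unlikely: new landmark
```

Because each particle makes this choice with its own pose and map, different particles can commit to different hypotheses, and pruning happens implicitly through resampling.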
Future Works
• Large-scale mapping including many vehicles, in mixed environments with sensor networks and dynamic landmarks.
• The delayed data-fusion concept instead of batch association, and iterative smoothing, to improve estimation quality and robustness.