Statistics in the Image Domain for Mobile Robot Environment Modeling
L. Abril Torres-Méndez and Gregory Dudek
Centre for Intelligent Machines
School of Computer Science
McGill University
International Symposium of Robotics and Automation, August 25-27, 2004
Our Application
• Automatic generation of 3D maps
• Robot navigation, localization
  - E.g., for rescue and inspection tasks
• Robots are commonly equipped with camera(s) and a laser rangefinder.
We would like a full range map of the environment from simple acquisition of data.
Problem Context
• Pure vision-based methods: Shape-from-X remains challenging, especially in unconstrained environments.
• Laser line scanners are commonplace, but:
  - Volume scanners remain exotic, costly, and slow.
  - Incomplete range maps are far easier to obtain than complete ones.
Proposed solution: combine visual data with partial depth data.
Shape-from-(partial) Shape
Problem Statement
From incomplete range data combined with intensity, perform scene recovery.
From range scans like this, infer the rest of the map.
Overview of the Method
• Approximate the composite of intensity and range data at each point as a Markov process.
• Infer complete range maps by estimating joint statistics of observed range and intensity.
What knowledge does Intensity provide about Surfaces?
• Two examples of the kinds of inferences:
[Figure: side-by-side intensity and range images, annotated with regions of surface smoothness and variations in depth; range values span close to far]
What about Edges?
• Edges often signal depth discontinuities.
• Very useful in the reconstruction process!
[Figure: intensity edges and range edges]
Isophotes in Range Data
• Linear structures arise in the initial range data.
• All surface normals form the same angle with the direction to the eye.
[Figure: intensity image and range image with isophotes]
Range synthesis basis
Range and intensity images are correlated in complicated ways, and the correlation exhibits useful structure.
- This is the basis of shape from shading and shape from darkness, but those methods rest on strong assumptions.
The variations of pixels in the intensity and range images are related to the values elsewhere in the image(s).
Markov Random Fields
Related Work
• Probabilistic updating has been used for:
  - image restoration [e.g., Geman & Geman, TPAMI 1984], and
  - texture synthesis [e.g., Efros & Leung, ICCV 1999].
• Problems with pure extrapolation/interpolation:
  - it is suitable only for textures with a stationary distribution;
  - it can converge to inappropriate dynamic equilibria.
MRFs for Range Synthesis
States are described as augmented voxels V = (I, R, E).
Z_m = {(x,y) : 1 ≤ x,y ≤ m}: the m×m lattice over which the images are described.
I = {I_{x,y}}, (x,y) ∈ Z_m: intensity (gray or color) of the input image.
R = {R_{x,y}}, (x,y) ∈ Z_m: incomplete depth values.
E: a binary matrix (1 if an edge exists at a voxel, 0 otherwise).
We model V as an MRF; I and R are random variables.
[Figure: augmented range map combining I and R at each voxel v_{x,y}]
Markov Random Field Model
Definition: a stochastic process for which a voxel value is predicted by its neighborhood in range and intensity:

P(V_{x,y} = v_{x,y} | V_{k,l} = v_{k,l}, (k,l) ≠ (x,y)) = P(V_{x,y} = v_{x,y} | V_{k,l} = v_{k,l}, (k,l) ∈ N_{x,y})

N_{x,y} is a square neighborhood of size n×n centered at voxel V_{x,y}.
Computing the Markov Model
• From observed data, we can explicitly compute
  P(V_{x,y} = v_{x,y} | V_{k,l} = v_{k,l}, (k,l) ∈ N_{x,y})
[Figure: neighborhood N_{x,y} around voxel V_{x,y}, shown over intensity and over intensity & range]
• This can be represented parametrically or via a table.
  - To make it efficient, we use the sample data itself as a table.
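One way to realize "the sample data itself as a table" is to collect every fully observed n×n neighborhood of the augmented image as an exemplar and answer queries by nearest-neighbor lookup. A minimal sketch (function names are illustrative, not from the paper; missing range is assumed to be encoded as NaN):

```python
import numpy as np

def build_table(intensity, rng, n=3):
    """Collect every fully observed n x n neighborhood of the augmented
    image (intensity, range) as one flat exemplar row, together with the
    range value at the neighborhood's center."""
    h, w = intensity.shape
    half = n // 2
    rows, centers = [], []
    for y in range(half, h - half):
        for x in range(half, w - half):
            ipatch = intensity[y - half:y + half + 1, x - half:x + half + 1]
            rpatch = rng[y - half:y + half + 1, x - half:x + half + 1]
            if np.isnan(rpatch).any():        # skip neighborhoods with missing range
                continue
            rows.append(np.concatenate([ipatch.ravel(), rpatch.ravel()]))
            centers.append(rng[y, x])
    return np.array(rows), np.array(centers)

def ml_estimate(query, table, centers):
    """Return the center range value of the most similar stored neighborhood:
    a nonparametric maximum-likelihood estimate over the exemplar table."""
    d = ((table - query) ** 2).sum(axis=1)    # plain SSD over all exemplars
    return centers[int(np.argmin(d))]
```

In this nonparametric view, the "table" is never fit to a parametric family; estimating the conditional distribution reduces to searching the observed data for the closest neighborhood.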
Estimation using the Markov Model
• From P(V_{x,y} = v_{x,y} | V_{k,l} = v_{k,l}, (k,l) ∈ N_{x,y}), what should an unknown range value be?
• For an unknown range value with a known neighborhood, we can select the maximum likelihood estimate for V_{x,y}.
• Further, we can do this even with partial neighborhood information.
• Even further, if both intensity and range are missing, we can marginalize out the unknown neighbors.
Interpolate PDF
• In general, we cannot uniquely solve for the desired neighborhood configuration; instead, assume

P(R_{x,y} = r_{x,y} | I_{x,y} = i_{x,y}, V_{k,l} = v_{k,l}, (k,l) ∈ N_{x,y}) ≈ P(R_{u,v} = r_{u,v} | I_{u,v} = i_{u,v}, V_{p,q} = v_{p,q}, (p,q) ∈ N_{u,v})

whenever the values in N_{u,v} are similar to the values in N_{x,y}, (x,y) ≠ (u,v).
• Similarity measure: Gaussian-weighted SSD (sum of squared differences).
• The update schedule is purely causal and deterministic.
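A Gaussian-weighted SSD between two neighborhoods can be sketched as follows. The mask handles partial neighborhood information by counting only positions where both neighborhoods have data (the function name and the default sigma are illustrative choices, not from the paper):

```python
import numpy as np

def gaussian_weighted_ssd(patch_a, patch_b, mask, sigma=1.0):
    """Gaussian-weighted sum of squared differences between two n x n
    neighborhoods. Weights fall off with distance from the center voxel;
    positions where `mask` is False (missing data) are ignored."""
    n = patch_a.shape[0]
    half = n // 2
    yy, xx = np.mgrid[-half:half + 1, -half:half + 1]
    w = np.exp(-(xx ** 2 + yy ** 2) / (2.0 * sigma ** 2))
    w = w * mask                          # zero out missing neighbors
    if w.sum() == 0:
        return np.inf                     # nothing to compare
    return float((w * (patch_a - patch_b) ** 2).sum() / w.sum())
```

Normalizing by the total weight makes scores comparable between neighborhoods with different amounts of missing data.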
Order of Reconstruction
• Dramatically affects the quality of the result.
• Based on priority values of the voxels to be synthesized.
• Edges + isophotes indicate which voxels are synthesized first.
[Figure: region to be synthesized (target region), the contour of the target region, and the source region (intensity + range)]
Priority value computation
P(V_{x,y}) = C(V_{x,y}) · D(V_{x,y}) + 1/(1 + E)

Confidence term:
C(V_{x,y}) = Σ_{(p,q) ∈ N_{x,y} ∩ Ω} C(V_{p,q}) / |N_{x,y}|

Data term:
D(V_{x,y}) = α |∇I⊥_{x,y} · n_{x,y}|

where:
∇I⊥_{x,y}: the isophote (direction and intensity) at V_{x,y}
n_{x,y}: the unit vector orthogonal to the contour of the target region at V_{x,y}
α: a normalization factor
E: the number of voxels having an edge in N_{x,y}
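The priority of a single boundary voxel can be sketched directly from the formula P(V) = C(V)·D(V) + 1/(1+E). The choice α = 1/255 is an assumption here (a typical normalization for 8-bit intensity images), and the function name is illustrative:

```python
import numpy as np

def priority(confidence, isophote, normal, edge_count, alpha=1.0 / 255.0):
    """Priority of one boundary voxel: P = C(V) * D(V) + 1/(1+E).

    confidence  C(V): mean confidence over the known part of N_{x,y}
    isophote    2-vector: isophote direction scaled by its strength
    normal      unit 2-vector orthogonal to the target-region contour
    edge_count  E: number of edge voxels inside the neighborhood
    alpha       normalization factor for the data term (assumed 1/255)"""
    data_term = alpha * abs(float(np.dot(isophote, normal)))
    return confidence * data_term + 1.0 / (1.0 + edge_count)
```

The dot product is largest when the isophote flows straight into the target-region contour, so strong linear structures are continued first; the 1/(1+E) term additionally raises the priority of voxels near edges.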
Experimental Evaluation
Scharstein & Szeliski's data set (Middlebury College)
[Figures: input intensity image, intensity edge map, ground truth range, and input range image (65% of range unknown) -- the input data given to our algorithm]
Isophotes vs. no Isophotes Constraint
Case I: 65% of range is unknown
Case II: 62% of range is unknown
[Figures: initial range data, results without isophotes, results using isophotes -- synthesized range images alongside ground truth range]
More examples
Initial range data: 79% of range is unknown.
Synthesized result: MAR error 5.94 cm.
[Figures: input intensity image, intensity edge map, initial range data, ground truth range]
More examples
[Figures: input intensity image, intensity edge map, initial range data, ground truth range]
Initial range data: 70% of range is unknown.
Synthesized result: MAR error 5.44 cm.
More examples
[Figures: input intensity image, intensity edge map, initial range data, ground truth range]
Initial range data: 62% of range is unknown.
Synthesized result: MAR error 7.54 cm.
Adding Surface Normals
• We compute the normals by fitting a plane (smooth surface) in windows of m×m pixels.
• The normal vector is the eigenvector with the smallest eigenvalue of the covariance matrix.
• Similarity is now computed between surface normals instead of range values.
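The plane fit described above can be sketched as a small PCA: collect the 3D points of an m×m depth window, and take the eigenvector of their covariance matrix with the smallest eigenvalue as the normal. The pixel spacings fx, fy are illustrative assumptions (a real implementation would use the sensor's calibration):

```python
import numpy as np

def patch_normal(depth, fx=1.0, fy=1.0):
    """Estimate the surface normal of an m x m depth window by plane
    fitting: the normal is the eigenvector of the covariance matrix of
    the 3D points with the smallest eigenvalue."""
    m = depth.shape[0]
    ys, xs = np.mgrid[0:m, 0:m]
    pts = np.column_stack([xs.ravel() * fx, ys.ravel() * fy, depth.ravel()])
    pts = pts - pts.mean(axis=0)           # center the point cloud
    cov = pts.T @ pts / len(pts)
    vals, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    n = vecs[:, 0]                         # smallest-eigenvalue eigenvector
    return n if n[2] >= 0 else -n          # consistent orientation
```

Because the smallest-eigenvalue direction is the one along which the points vary least, it is perpendicular to the best-fit plane, which is exactly the surface normal for a locally smooth patch.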
Adding Surface Normals
[Figures: ground truth range, previous synthesized result, initial range data, synthesized result using surface normals]
More Experimental Results
[Figures: real intensity images with edge maps and initial range scans; synthesized range image alongside ground truth range]
Conclusions
• Works very well -- but is this consistent?
• Can be more robust than standard methods (e.g., shape from shading) due to limited dependence on a priori reflectance assumptions.
• Depends on an adequate amount of reliable range as input.
• Depends on statistical consistency between the region to be constructed and the region that has been measured.
Discussion & Ongoing Work
• Surface normals are needed when the input range data do not capture the underlying structure.
• Data from a real robot:
  - Issues: non-uniform scale, registration, correlation of different types of data.
  - Integration of data from different viewpoints.
Questions?