Digital Geometry Processing
Div. of Systems Science and Informatics, Laboratory of Digital Geometry Processing
Satoshi KANAI
- Robust Estimator -
Introduction
- Robust Estimation: Why needed?
- Least Squares Problem
- Least Squares and Outliers
- Hough Transform
- Robust M-estimation
- Weighted Least Squares
- Iteratively Reweighted Least Squares (IRLS) Algorithm
- Summary
References
1. C. V. Stewart: "Robust Parameter Estimation in Computer Vision", SIAM Review, 41(3), pp.513-537 (1999) (available on the Web)
2. P. J. Huber, E. M. Ronchetti: "Robust Statistics" (Wiley Series in Probability and Statistics), Wiley (2009)
3. Y. Oyanagi: "Robust Estimation and Data Analysis" (in Japanese), Butsuri (Journal of the Physical Society of Japan), 34(10), pp.884-891 (1979) (PDF downloadable from CiNii)
Robust Estimation

A common computational problem in vision is to estimate the parameters of a model from image data.
- Ex.: fitting lines to a set of edge pixels; fitting 3D planar regions to laser-scanned point clouds.

Key difficulties
- The models must be fit to noisy images or point clouds.
- Initial guesses for the models must be generated automatically.
- A model may occur multiple times in one data set.
- The data typically contains "outliers" (i.e., observations that do not belong to the model being fitted).
[Figure: a line fitted to data points, one of which is an outlier]
Least Squares Problem - Fit Error Estimation -

Parameters of the model (line $L$):
- $\mathbf{n}$: unit normal of $L$ ($\|\mathbf{n}\| = 1$)
- $d$: distance of $L$ from the origin

so that $L = \{\, \mathbf{x} \mid \mathbf{n} \cdot \mathbf{x} - d = 0 \,\}$. The signed fit error of a data point $\mathbf{x}_i \in \mathbb{R}^2$ is $e_i = \mathbf{n} \cdot \mathbf{x}_i - d$.
The same formulation is available in 2D, 3D, 4D, …
Least Squares Problem - Minimizing the Squared Error -

$$E(\mathbf{n}, d) = \sum_i (\mathbf{n} \cdot \mathbf{x}_i - d)^2 \;\rightarrow\; \min \quad \text{subject to } \|\mathbf{n}\| = 1$$
Least Squares Problem - Least Squares Solution (1) -

Setting $\partial E / \partial d = 0$ yields $d = \mathbf{n} \cdot \bar{\mathbf{m}}$ (6), where $\bar{\mathbf{m}} = \frac{1}{N}\sum_i \mathbf{x}_i$ is the centroid of the data. Substituting Eq.(6) into the line equation $\mathbf{n} \cdot \mathbf{x} - d = 0$ gives $\mathbf{n} \cdot (\mathbf{x} - \bar{\mathbf{m}}) = 0$, which shows that the optimal line must pass through $\bar{\mathbf{m}}$.
Least Squares Problem - Least Squares Solution (2) -

Substituting $d = \mathbf{n} \cdot \bar{\mathbf{m}}$ back into $E$ gives $E(\mathbf{n}) = \mathbf{n}^{\mathsf T} M \mathbf{n}$, where $M = \sum_i (\mathbf{x}_i - \bar{\mathbf{m}})(\mathbf{x}_i - \bar{\mathbf{m}})^{\mathsf T}$ is the scatter (covariance) matrix of the data: $M$ is $2 \times 2$ for $\mathbf{x}_i \in \mathbb{R}^2$ and $3 \times 3$ for $\mathbf{x}_i \in \mathbb{R}^3$.
Least Squares Problem - Least Squares Solution (3) -

$$\min_{\mathbf{n}} \; \mathbf{n}^{\mathsf T} M \mathbf{n} \quad \text{subject to } \|\mathbf{n}\| = 1$$

The minimizer satisfies $M\mathbf{n} = \lambda\mathbf{n}$, with $\lambda$ the smallest eigenvalue of $M$ (14).
Note: the eigenvector $\mathbf{n}$ of $M$ is generally not unique (any scalar multiple is also an eigenvector). Therefore, the $\mathbf{n}$ which satisfies the normalization condition $\|\mathbf{n}\| = 1$ should be selected.
Least Squares Problem - Least Squares Solution (Summary) -

- $\lambda$: the smallest eigenvalue of $M$
- $\mathbf{n}$: the eigenvector of $M$ for $\lambda$, with $\|\mathbf{n}\| = 1$
- $d = \mathbf{n} \cdot \bar{\mathbf{m}}$
Least Squares Problem - Graphical Interpretations -

$L$ (least-squares solution):
- Eigenvector for the smallest eigenvalue = the direction of minimum data distribution (the normal of the fitted line)
- Eigenvector for the largest eigenvalue = the direction of maximum data distribution (along the fitted line)
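The closed-form solution above (centroid plus smallest eigenvector of the scatter matrix) can be sketched in a few lines of Python. This is an illustrative sketch, not code from the lecture; `fit_line_ls` is a hypothetical name, and the 2x2 eigen-decomposition is done in closed form so no linear-algebra library is needed.

```python
import math

def fit_line_ls(points):
    """Least-squares fit of the line n.x - d = 0 to 2D points.

    n is the unit eigenvector of the 2x2 scatter matrix M for its
    smallest eigenvalue, and d = n . m so the line passes through
    the centroid m."""
    num = len(points)
    mx = sum(px for px, _ in points) / num
    my = sum(py for _, py in points) / num
    # scatter matrix M = sum_i (x_i - m)(x_i - m)^T = [[a, b], [b, c]]
    a = sum((px - mx) ** 2 for px, _ in points)
    b = sum((px - mx) * (py - my) for px, py in points)
    c = sum((py - my) ** 2 for _, py in points)
    # smallest eigenvalue of a symmetric 2x2 matrix, in closed form
    lam = 0.5 * ((a + c) - math.sqrt((a - c) ** 2 + 4.0 * b * b))
    if abs(b) > 1e-12:
        nx, ny = b, lam - a            # solves (M - lam I) n = 0
    else:
        nx, ny = ((1.0, 0.0) if a <= c else (0.0, 1.0))
    norm = math.hypot(nx, ny)
    nx, ny = nx / norm, ny / norm      # enforce the condition |n| = 1
    d = nx * mx + ny * my              # line passes through the centroid
    return (nx, ny), d
```

For points exactly on a line, every residual $\mathbf{n}\cdot\mathbf{x}_i - d$ comes out zero, which is a quick sanity check on the derivation.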
Least Squares and Outliers

Unfortunately, least-squares (LS) solutions are sensitive to outliers: the LS estimate is strongly influenced even by a small cluster of outliers.
The LS approach is NOT suitable when
- the data contains outliers, or
- multiple solutions are expected.
Robust M-estimation - Overview -

Replace the sum of squared errors by a sum of a robust function $\rho$ of the errors:
$$E(\boldsymbol{\beta}) = \sum_i \rho(e_i(\boldsymbol{\beta})) \;\rightarrow\; \min$$
where $\rho(e)$ grows more slowly than $e^2$ for large $|e|$, so that data with large errors (outliers) have less influence on the estimate.
Robust M-estimation - Typical estimators-13
estim
ator
r
estim
ator
r
Robust M-estimation - Influence Function -

Given the estimator $\rho(e)$, the influence function $\psi(e)$ is defined as
$$\psi(e) = \frac{d\rho(e)}{de} \qquad (16)$$
$\psi(e)$ gives the sensitivity of the overall estimate $(\theta, d)$ to a datum with error $e$.
Examples of the Estimators, Influence Functions and Weight Functions
[Figure: plots of $\rho(e)$, $\psi(e)$ and $w(e)$ for the typical estimators]
Objective Function Examples - LS and L1 -

Plotting the objective function $E(\theta, d) = \sum_i \rho(e_i)$ for $\mathbf{n} = (\cos\theta, \sin\theta)$:
- LS: $\rho(e) = e^2$. The LS estimator provides a unique local minimum, but data with large errors have a significant influence.
- L1: $\rho(e) = |e|$. The L1 estimator can have multiple local minima, and the shape of $E$ around the minima is "shallow".
Objective Function Examples - GM (with $\sigma = 1$) -

$\rho(e) = e^2 / (1 + e^2)$
- The influence of a datum with a large error vanishes.
- The objective function exhibits 3 strong minima (green lines).
- There are many weaker local minima (red lines).
Objective Function Examples - GM (with varying $\sigma$) -

$\rho(e) = e^2 / (\sigma^2 + e^2)$, shown for $\sigma = 2$ and $\sigma = 4$:
- Increasing $\sigma$ in the GM estimator smooths the objective function.
- There are fewer local minima for larger $\sigma$.
- As $\sigma$ is increased, outliers receive greater influence.
Robust M-estimation - Weight Function (1) -

It is sometimes convenient to re-express a robust estimator in a form resembling weighted least squares such as
$$E(\boldsymbol{\beta}) = \sum_i \rho(e_i(\boldsymbol{\beta})) \approx \sum_i w(e_i)\, e_i(\boldsymbol{\beta})^2 \qquad (a)$$
where $\boldsymbol{\beta}$ is the model parameter vector.
The gradient of the objective function $E(\boldsymbol{\beta}) = \sum_i \rho(e_i(\boldsymbol{\beta}))$ with respect to $\boldsymbol{\beta}$ is
$$\frac{\partial E}{\partial \boldsymbol{\beta}} = \sum_i \frac{d\rho(e_i)}{de} \cdot \frac{\partial e_i}{\partial \boldsymbol{\beta}} = \sum_i \psi(e_i) \cdot \frac{\partial e_i}{\partial \boldsymbol{\beta}} \qquad (17)$$
If we put $w(e) = \psi(e)/e$,
$$\frac{\partial E}{\partial \boldsymbol{\beta}} = \sum_i w(e_i) \cdot e_i(\boldsymbol{\beta}) \cdot \frac{\partial e_i}{\partial \boldsymbol{\beta}} \qquad (18)$$
Robust M-estimation - Weight Function (2) -

From Eq.(18),
$$\frac{\partial E}{\partial \boldsymbol{\beta}} = \sum_i w(e_i) \cdot e_i(\boldsymbol{\beta}) \cdot \frac{\partial e_i}{\partial \boldsymbol{\beta}} \qquad (b)$$
On the other hand, from Eq.(a) in the last slide, if we assume
$$E(\boldsymbol{\beta}) = \sum_i w(e_i)\, e_i(\boldsymbol{\beta})^2,$$
$$\frac{\partial E}{\partial \boldsymbol{\beta}} = \sum_i \frac{\partial}{\partial \boldsymbol{\beta}}\left[ w(e_i)\, e_i(\boldsymbol{\beta})^2 \right]$$
If we assume $w(e_i)$ is directly independent of $\boldsymbol{\beta}$,
$$\frac{\partial E}{\partial \boldsymbol{\beta}} = \sum_i 2\, w(e_i) \cdot e_i(\boldsymbol{\beta}) \cdot \frac{\partial e_i}{\partial \boldsymbol{\beta}}$$
which provides the same result as (b), up to a constant factor 2 that does not change the minimizer.
Robust M-estimation - Weight Function (3) -

Therefore, by taking the weight function for the error $e$ as
$$w(e) = \frac{\psi(e)}{e},$$
the estimation can be treated as a weighted least squares problem
$$E(\boldsymbol{\beta}) = \sum_i w(e_i)\, e_i(\boldsymbol{\beta})^2 \;\rightarrow\; \min.$$
For $\rho(e) = e^2$, the weights are constant, $w(e) = 2$, and the estimation becomes LS.
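As a concrete instance of $w(e) = \psi(e)/e$, here is a minimal sketch of the three functions for the Geman-McClure estimator used in the earlier objective-function examples, assuming $\rho(e) = e^2/(\sigma^2 + e^2)$; the function names are illustrative, not from the slides.

```python
def gm_rho(e, sigma=1.0):
    """Geman-McClure estimator rho(e) = e^2 / (sigma^2 + e^2)."""
    return e * e / (sigma * sigma + e * e)

def gm_psi(e, sigma=1.0):
    """Influence function psi(e) = d rho / d e = 2 sigma^2 e / (sigma^2 + e^2)^2."""
    s2 = sigma * sigma
    return 2.0 * s2 * e / (s2 + e * e) ** 2

def gm_weight(e, sigma=1.0):
    """IRLS weight w(e) = psi(e) / e = 2 sigma^2 / (sigma^2 + e^2)^2.

    Unlike psi(e)/e written literally, this form has a finite value
    2 / sigma^2 at e = 0."""
    s2 = sigma * sigma
    return 2.0 * s2 / (s2 + e * e) ** 2
```

Note the opposite behavior of $\rho$ and $w$: as $|e|$ grows, $\rho$ saturates toward 1 while the weight decays toward 0, so large-error data are effectively ignored.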
Examples of the Estimators, Influence Functions and Weight Functions
[Figure: plots of $\rho(e)$, $\psi(e)$ and $w(e)$ for each estimator. $\rho(e)$ and $w(e)$ have opposite properties!]
Examples of the Estimators, Influence Functions and Weight Functions
[Figure: plots of $\rho(e)$, $\psi(e)$ and $w(e)$, continued]
Robust M-estimation - Weight Function (5) -

Back to the robust line-fitting problem $E(\mathbf{n}, d) \rightarrow \min$: by extending the solution of slide 10, we use the weighted mean and the weighted covariance matrix for M-estimation:
$$\bar{\mathbf{m}} = \frac{1}{W} \sum_i w(e_i)\, \mathbf{x}_i \qquad (19)$$
$$M = \sum_i w(e_i)\, (\mathbf{x}_i - \bar{\mathbf{m}})(\mathbf{x}_i - \bar{\mathbf{m}})^{\mathsf T} \qquad (20)$$
where $W \equiv \sum_i w(e_i)$ and $e_i = \mathbf{n} \cdot \mathbf{x}_i - d$.
Similar to Eq.(14), a necessary condition for $(\mathbf{n}, d)$ is that
$$M\mathbf{n} = \lambda\mathbf{n}, \quad \lambda: \text{the smallest eigenvalue of } M \qquad (21)$$
$$d = \mathbf{n} \cdot \bar{\mathbf{m}}, \quad \|\mathbf{n}\| = 1 \qquad (22)$$
Robust M-estimation - Weight Function (6) -

A necessary condition for $(\mathbf{n}, d)$ in robust estimation is that
$$M\mathbf{n} = \lambda\mathbf{n}, \quad \lambda: \text{the smallest eigenvalue of } M \qquad (21)$$
$$d = \mathbf{n} \cdot \bar{\mathbf{m}}, \quad \|\mathbf{n}\| = 1 \qquad (22)$$
Properties
- $M$ typically depends on $(\mathbf{n}, d)$ through the weights $w(e_i)$, and therefore Eq.(21) is a nonlinear eigenvalue problem for $\mathbf{n}$.
- Only in the LS case does Eq.(21) become a linear eigenvalue problem.
- Eq.(22) implies the line must pass through the weighted mean $\bar{\mathbf{m}}$.

There are many ways to solve this nonlinear problem; the iteratively reweighted least squares algorithm is often effective.
Robust M-estimation - Iteratively Reweighted Least Squares (IRLS) Algorithm -

Based on repeatedly updating the parameters $(\mathbf{n}, d)$ from an initial guess $(\mathbf{n}_0, d_0)$:
1. Evaluate the weights $w_i = w(e_i)$, $e_i = \mathbf{n}_0 \cdot \mathbf{x}_i - d_0$.
2. Compute $\bar{\mathbf{m}}$ and $M$ in Eqs.(19) and (20).
3. Set $\mathbf{n}$ to be the eigenvector for the minimum eigenvalue of $M$.
4. Find $d$ by $d = \mathbf{n} \cdot \bar{\mathbf{m}}$.
5. Reset $\mathbf{n}_0 \leftarrow \mathbf{n}$, $d_0 \leftarrow d$, and repeat from 1 until the update of $(\mathbf{n}, d)$ becomes small.

Note that the updated estimate $(\mathbf{n}, d)$ is not necessarily a local minimum of Eq.(14), since $\bar{\mathbf{m}}$ and $M$ were computed using $(\mathbf{n}_0, d_0)$ instead of $(\mathbf{n}, d)$.
The solution may depend on the initial guess $(\mathbf{n}_0, d_0)$.
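The five steps above can be sketched as a self-contained Python routine. This is an illustrative sketch, not code from the lecture: `irls_line_fit` and `gm_weight` are hypothetical names, the Geman-McClure weight is one possible choice of $w(e)$, and the 2x2 eigen-decomposition is done in closed form.

```python
import math

def gm_weight(e, sigma=1.0):
    """Geman-McClure IRLS weight w(e) = psi(e)/e = 2 sigma^2 / (sigma^2 + e^2)^2."""
    s2 = sigma * sigma
    return 2.0 * s2 / (s2 + e * e) ** 2

def irls_line_fit(points, n0, d0, sigma=1.0, iters=100, tol=1e-10):
    """Fit the line n.x - d = 0 by iteratively reweighted least squares,
    starting from the initial guess (n0, d0)."""
    nx, ny, d = n0[0], n0[1], d0
    for _ in range(iters):
        # 1. weights w_i = w(e_i) from the current residuals
        w = [gm_weight(nx * px + ny * py - d, sigma) for (px, py) in points]
        sw = sum(w)
        # 2. weighted mean and weighted scatter matrix M (Eqs. 19, 20)
        mx = sum(wi * px for wi, (px, _) in zip(w, points)) / sw
        my = sum(wi * py for wi, (_, py) in zip(w, points)) / sw
        a = sum(wi * (px - mx) ** 2 for wi, (px, _) in zip(w, points))
        b = sum(wi * (px - mx) * (py - my) for wi, (px, py) in zip(w, points))
        c = sum(wi * (py - my) ** 2 for wi, (_, py) in zip(w, points))
        # 3. eigenvector of [[a, b], [b, c]] for its smallest eigenvalue
        lam = 0.5 * ((a + c) - math.sqrt((a - c) ** 2 + 4.0 * b * b))
        if abs(b) > 1e-12:
            ex, ey = b, lam - a
        else:
            ex, ey = ((1.0, 0.0) if a <= c else (0.0, 1.0))
        norm = math.hypot(ex, ey)
        ex, ey = ex / norm, ey / norm
        if ex * nx + ey * ny < 0.0:    # keep a consistent sign of n
            ex, ey = -ex, -ey
        # 4. d = n . m: the line passes through the weighted mean
        dn = ex * mx + ey * my
        # 5. stop when the update of (n, d) becomes small
        done = abs(ex - nx) + abs(ey - ny) + abs(dn - d) < tol
        nx, ny, d = ex, ey, dn
        if done:
            break
    return (nx, ny), d
```

On a set of points lying near a horizontal line plus a few gross outliers, the outlier weights shrink toward zero within a few iterations and the fit snaps to the dominant line; a poor initial guess can still lead to an undesirable local minimum, as the example results below show.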
Robust M-estimation - Example Results -

Line fit using the GM estimator.
- $\sigma$: std. deviation of the noise for the observations near the dominant line.
- From two different initial guesses (green), the iteratively reweighted least squares algorithm was applied (black) to get the final fit lines (blue).
- Left: outliers were correctly rejected in the final fit.
- Right: converged to an undesirable local minimum in the final fit.

How can we avoid the dependency on the initial guess?
Robust M-estimation - Controlling Leverage -

A simple strategy for dealing with this problem:
1. Estimate the distribution of data support along the fitted line (e.g., project the weights $w_i$ from $\mathbf{x}_i$ onto the line and blur).
2. Determine the longest contiguous region of support (high-weight interval) along the fitted line, i.e., find an interval of support without any large gaps.
3. Reduce the weights $w_i$ in the IRLS for points $\mathbf{x}_i$ significantly outside the region of contiguous support (e.g., set such weights to 0).

[Figure: initial guess; high weights inside the contiguous support, reduced-weight regions outside]
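The three steps above can be sketched as follows, assuming the points have already been projected to scalar tangential coordinates along the fitted line; `contiguous_support` is a hypothetical name, and the thresholds `gap` and `w_min` are illustrative choices, not values from the slides.

```python
def contiguous_support(t_coords, weights, gap=1.0, w_min=0.1):
    """Zero the IRLS weights of points outside the longest contiguous
    region of support along the fitted line.

    t_coords: tangential coordinate of each point projected onto the line.
    weights:  current IRLS weights of the points.
    A run of support ends wherever consecutive high-weight points are
    separated by more than `gap` along the line."""
    order = sorted(range(len(t_coords)), key=lambda i: t_coords[i])
    runs, run, prev_t = [], [], None
    for i in order:
        if weights[i] < w_min:
            continue                      # low-weight points do not extend support
        if prev_t is not None and t_coords[i] - prev_t > gap:
            runs.append(run)              # a large gap starts a new run
            run = []
        run.append(i)
        prev_t = t_coords[i]
    runs.append(run)
    best = max(runs, key=len)             # longest contiguous region of support
    if not best:
        return list(weights)              # no supported points at all
    lo = min(t_coords[i] for i in best)
    hi = max(t_coords[i] for i in best)
    # reduce (here: zero) weights of points well outside the support region
    return [w if lo - gap <= t <= hi + gap else 0.0
            for t, w in zip(t_coords, weights)]
```

In an IRLS loop, this filter would be applied to the weights after step 1, so the leverage of a distant outlier cluster is removed before the weighted scatter matrix is formed.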
Robust M-estimation - Leverage Example (1) -

[Figure: iterations 0-5 of the fit. Blue: data with significant weights. Green segment: contiguous region of support]
Robust M-estimation - Successive Estimation -

Edge-fitting problem on an edge image: the first fitted segment can then be removed from the active data set, and the process repeated. Satisfactory fitting results are obtained.
Summary of a Robust Line Estimation Algorithm

1. Initial Guess: Randomly sample from the data. Set $(\mathbf{n}_0, d_0)$ according to the position and orientation of the sampled edge pixels.
2. Iterative Fitting: Use the iteratively reweighted least squares algorithm to fit the line parameters. Maintain information about the support (from the data) for the line along its length, and use it to down-weight data likely to cause leverage problems.
3. Verification: Given a converged solution, decide whether that line has sufficient data support (e.g., consider the sum of the weights). Discard solutions with insufficient support.
4. Model Selection: For each verified line segment, remove the edge pixels that provide significant support for it from the original data set. Repeat steps 1 through 3.
Line Finder Examples
Supplement - Choice of the Scale Parameter c for the Estimator Function -

- Huber: $\rho(e) = e^2/2$ for $|e| \le c$; $c|e| - c^2/2$ for $|e| > c$. $c = 1.345$ if the variance $\sigma^2 = 1.0$.
- Tukey (biweight): $\rho(e) = \frac{c^2}{6}\left[1 - \left(1 - (e/c)^2\right)^3\right]$ for $|e| \le c$; $c^2/6$ for $|e| > c$. $c = 4.685$ if the variance $\sigma^2 = 1.0$.

In practice, the scale is estimated robustly from the residuals, e.g. $\hat{\sigma} = 1.4826 \cdot \mathrm{med}_i\, |e_i|$, and $c$ is scaled by $\hat{\sigma}$.
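The corresponding IRLS weight functions $w(e) = \psi(e)/e$ for the two estimators above can be sketched as follows; the function names are illustrative, and the defaults assume the standard constants $c = 1.345$ (Huber) and $c = 4.685$ (Tukey) for unit variance.

```python
def huber_weight(e, c=1.345):
    """Huber IRLS weight: w(e) = 1 for |e| <= c, c/|e| for |e| > c.

    Large errors keep a nonzero (but shrinking) influence."""
    return 1.0 if abs(e) <= c else c / abs(e)

def tukey_weight(e, c=4.685):
    """Tukey biweight: w(e) = (1 - (e/c)^2)^2 for |e| <= c, else 0.

    Data beyond c are rejected completely (redescending estimator)."""
    if abs(e) > c:
        return 0.0
    u = 1.0 - (e / c) ** 2
    return u * u
```

Either weight can be dropped into the IRLS loop in place of the GM weight; Huber keeps the objective convex, while Tukey rejects gross outliers entirely at the cost of possible local minima.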
Other Robust Fitting Methods

- Trimmed LS
- L-estimator, R-estimator
- LMedS (Least Median of Squares)
- RANSAC (Random Sample Consensus)
- RANSAC families:
  - Robust: MSAC, AMLESAC, uMLESAC
  - Fast: Progressive RANSAC, NAPSAC, PROSAC, GASAC
  - Accurate: MLESAC, QDEGSAC, MAPSAC, MSAC