
Digital Geometry Processing

Div. of Systems Science and Informatics, Laboratory of Digital Geometry Processing

Satoshi KANAI

- Robust Estimator -


Introduction

Robust Estimation
− Why needed?
− Least Squares Problem
− Least Squares and Outliers
− Hough Transform
− Robust M-estimation
− Weighted Least Squares
− Iteratively Reweighted Least Squares (IRLS) Algorithm
− Summary


References

1. C.V. Stewart: "Robust Parameter Estimation in Computer Vision", SIAM Review, 41(3), pp.513-537 (1999) (available on the Web)

2. Peter J. Huber, Elvezio M. Ronchetti: "Robust Statistics" (Wiley Series in Probability and Statistics), Wiley (2009).

3. ๅฐๆŸณ็พฉๅคซ๏ผš ใ€Œใƒญใƒใ‚นใƒˆๆŽจๅฎšๆณ•ใจใƒ‡ใƒผใ‚ฟ่งฃๆžใ€, ๆ—ฅๆœฌ็‰ฉ็†ๅญฆไผš่ชŒ๏ผŒ34(10), pp.884-891, (1979) ๏ผˆCiNiiใ‹ใ‚‰PDFใงใƒ€ใ‚ฆใƒณใƒญใƒผใƒ‰ๅฏ๏ผ‰


Robust Estimation

A common computational problem in vision: to estimate the parameters of a model from an image.
Ex.: fitting lines to a set of edge pixels; fitting 3D planar regions to laser-scanned point clouds.

Key difficulties:
− Must fit the models to noisy images or point clouds
− Initial guesses for the models must be generated automatically
− Multiple occurrences of the models in one data set
− The data typically contains "outliers" (異常値), i.e. observations that do not belong to the model being fitted.


[Figure: a fitted line and an outlier far from it]


Least Squares Problem – Fit error estimation (4)

Line model L: 𝒏·𝒙 − c = 0, with parameters 𝒏 (the unit normal of L) and c (the distance of L from the origin).

Fit error of a data point 𝒙ₖ: the signed distance eₖ = 𝒏·𝒙ₖ − c, with eₖ ∈ (−∞, ∞).

The same formulation is available in 2D, 3D, 4D, …

Least Squares Problem – Minimizing the squared error (5)

E(𝒏, c) = Σₖ eₖ² = Σₖ (𝒏·𝒙ₖ − c)² → min, subject to ‖𝒏‖ = 1.


Least Squares Problem – Least Squares Solution (1) (6)

Minimizing E over c gives c = 𝒏·𝒎, where 𝒎 = (1/N) Σₖ 𝒙ₖ is the centroid of the data. (6)

By substituting Eq.(6) into the line equation 𝒏·𝒙 − c = 0, 𝒏·(𝒙 − 𝒎) = 0 shows that the optimal line must pass through 𝒎.

Least Squares Problem – Least Squares Solution (2) (7)

Substituting c = 𝒏·𝒎 into E gives E = 𝒏ᵀM𝒏, with the scatter (covariance) matrix M = Σₖ (𝒙ₖ − 𝒎)(𝒙ₖ − 𝒎)ᵀ: M is 2×2 for 𝒙ₖ ∈ ℝ², 3×3 for 𝒙ₖ ∈ ℝ³.


Least Squares Problem – Least Squares Solution (3) (8)

With c = 𝒏·𝒎, minimizing E reduces to minimizing 𝒏ᵀM𝒏 over ‖𝒏‖ = 1, which is attained by an eigenvector of M for its smallest eigenvalue.

Note: The eigenvector 𝒏 of M is generally not unique (any scalar multiple is also an eigenvector). Therefore, the 𝒏 which satisfies the normalization condition ‖𝒏‖ = 1 should be selected.

Least Squares Problem – Least Squares Solution (Summary) (9)

M𝒏 = μ𝒏, c = 𝒏·𝒎 (14)
μ: the smallest eigenvalue of M; 𝒏: the eigenvector of M for μ.
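The summary above translates into a few lines of numpy. A minimal sketch (the function name `fit_line_ls` is illustrative, not from the slides):

```python
import numpy as np

def fit_line_ls(X):
    """Least-squares fit of the line n.x - c = 0 to points X (N x d).

    n is the eigenvector of the scatter matrix M for its smallest
    eigenvalue mu, and c = n.m, so the line passes through the centroid m.
    """
    m = X.mean(axis=0)            # centroid m
    D = X - m
    M = D.T @ D                   # scatter (covariance) matrix M
    mu, V = np.linalg.eigh(M)     # eigenvalues returned in ascending order
    n = V[:, 0]                   # eigenvector for the smallest eigenvalue
    return n, n @ m               # c = n.m

# Points exactly on the line y = 2x + 1
X = np.array([[0.0, 1.0], [1.0, 3.0], [2.0, 5.0], [3.0, 7.0]])
n, c = fit_line_ls(X)
residuals = X @ n - c             # signed distances e_k; all ~0 here
```

Because `eigh` returns normalized eigenvectors, ‖𝒏‖ = 1 holds automatically and `X @ n - c` gives exactly the signed distances eₖ of the slides.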


Least Squares Problem – Graphical Interpretations (10)

[Figure] L (least-squares solution):
− Eigenvector for the smallest eigenvalue = the direction of minimum data distribution (the line normal 𝒏)
− Eigenvector for the largest eigenvalue = the direction of maximum data distribution (the direction along the line)

Least Squares and Outliers (11)

Unfortunately, least squares (LS) solutions are sensitive to outliers: the LS estimate is strongly influenced even by a small cluster of outliers.

The LS approach is NOT suitable for cases where
− the data contains outliers, or
− multiple solutions are expected.

[Figure: LS line fit dragged toward a cluster of outliers]


Robust M-estimation – Overview (12)

Replace the squared error by an estimator function ρ(e):
E(𝒏, c) = Σₖ ρ(eₖ) → min.

Example (Tukey's biweight):
ρ(e) = (c²/6)·(1 − (1 − (e/c)²)³)  for |e| ≤ c
ρ(e) = c²/6                        for |e| > c

Robust M-estimation – Typical estimators (13)

[Figure: plots of typical estimator functions ρ(e) versus error e]


Robust M-estimation – Influence Function (14)

Given the estimator ρ(e), the influence function ψ(e) is defined as

ψ(e) = dρ(e)/de. (16)

ψ(e) gives the sensitivity of the overall estimate (𝒏, c) to a datum with error e.

Examples of the estimators, influence functions and weight functions (15)

[Figure: ρ(e), ψ(e) and w(e) for the typical estimators]


Objective Function Examples – LS and L1 (16)

Plotting the objective function E(𝒏(θ), c) for 𝒏(θ) = (cos θ, sin θ):
− The LS estimator provides a unique local minimum.
− The L1 estimator can have multiple local minima.
− Data with large errors still have significant influence.


LS๐œŒ ๐‘’ ๐‘’

L1๐œŒ ๐‘’ ๐‘’

LS

L1

ฮธ

The shape of E around the minima is โ€œshallowโ€
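The two objective surfaces can be reproduced numerically. A sketch under assumed data (clean points on the horizontal line y = 1, not the slides' example); for each θ the offset c is set to its 1-D optimum, the mean for LS and the median for L1:

```python
import numpy as np

# Clean points on the line y = 1, whose normal is n(pi/2) = (0, 1)
X = np.array([[x, 1.0] for x in np.linspace(0.0, 3.0, 7)])
thetas = np.linspace(0.0, np.pi, 181)

def objective(X, theta, rho):
    """E(n(theta), c) with n(theta) = (cos t, sin t); c is the optimal
    offset for this theta: the mean minimizes sum e^2, the median sum |e|."""
    n = np.array([np.cos(theta), np.sin(theta)])
    p = X @ n
    c = np.mean(p) if rho == "ls" else np.median(p)
    e = p - c
    return np.sum(e ** 2) if rho == "ls" else np.sum(np.abs(e))

E_ls = np.array([objective(X, t, "ls") for t in thetas])
E_l1 = np.array([objective(X, t, "l1") for t in thetas])
best_theta = thetas[np.argmin(E_ls)]   # ~ pi/2 for this clean data
```

Plotting `E_ls` and `E_l1` over `thetas` for data that also contains outliers reproduces the behavior described above: a single smooth minimum for LS, sharper and possibly multiple local minima for L1.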

Objective Function Examples - GM (with ๐œŽ 1) -17

๐œŒ ๐‘’

− The influence of a datum with large error vanishes.
− The objective function exhibits 3 strong minima (green lines).
− There are many weaker local minima (red lines).


Objective Function Examples – GM (with varying σ) (18)

๐œŒ ๐‘’

Increasing ๐œŽ in the GM-estimator smooth the objective function

Fewer local minima for larger ๐œŽ.

As ๐œŽ is increased, outliers receive greater influence


[Figure: the objective function for σ = 2 and σ = 4]

Robust M-estimation – Weight Function (1) (19)

It is sometimes convenient to re-express a robust estimator in a form resembling weighted least squares, such as

E(𝜽) = Σₖ ρ(eₖ(𝜽)) → Σₖ w(eₖ)·eₖ(𝜽)², (a)

where 𝜽 is the model parameter vector.

The gradient of the objective function E(𝜽) = Σₖ ρ(eₖ(𝜽)) with respect to 𝜽 is

∂E(𝜽)/∂𝜽 = Σₖ (dρ/de)(eₖ) · ∂eₖ(𝜽)/∂𝜽 = Σₖ ψ(eₖ) · ∂eₖ(𝜽)/∂𝜽. (17)

If we put w(e)·e = ψ(e),

∂E(𝜽)/∂𝜽 = Σₖ w(eₖ) · eₖ(𝜽) · ∂eₖ(𝜽)/∂𝜽. (18)



Robust M-estimation – Weight Function (2) (20)

From eq(18),

∂E(𝜽)/∂𝜽 = Σₖ w(eₖ) · eₖ(𝜽) · ∂eₖ(𝜽)/∂𝜽. (b)

On the other hand, from eq(a) in the last slide, if we assume E(𝜽) = Σₖ w(eₖ)·eₖ(𝜽)², then

∂E(𝜽)/∂𝜽 = Σₖ ∂/∂𝜽 [w(eₖ)·eₖ(𝜽)²].

If we assume w(eₖ) is directly independent of 𝜽,

∂E(𝜽)/∂𝜽 = Σₖ 2·w(eₖ) · eₖ(𝜽) · ∂eₖ(𝜽)/∂𝜽,

which provides the same result as (b) up to a constant factor that does not change the minimizer.


Robust M-estimation – Weight Function (3) (21)

Therefore, by taking the weight function for the error e as

w(e) = ψ(e)/e,

the estimation can be treated as a weighted least squares problem:

E(𝜽) = Σₖ w(eₖ)·eₖ(𝜽)² → min.


For ๐œŒ ๐‘’ ๐‘’ , the weights are constant, ๐‘ค ๐‘’ 2, and the estimation becomes LS.


Examples of the estimators, influence functions and weight functions (22)

๐œŒ ๐‘’

๐œ“ ๐‘’

๐‘ค ๐‘’

๐œŒ ๐‘’ & ๐‘ค ๐‘’ have opposite properties!

Examples of the estimators, influence functions and weight functions (23)

[Figure: ρ(e), ψ(e) and w(e) for the remaining estimators]


Robust M-estimation – Weight Function (5) (24)

Back to the "robust line fitting" problem E(𝒏, c) → min: by extending the solution of slide 10, we use the weighted mean and the weighted covariance matrix for M-estimation.

โ€ข ๐’Ž โˆ‘ ๐‘ค ๐‘’ ๐’™ (19)

โ€ข ๐‘€ โˆ‘ ๐‘ค ๐‘’ ๐’™ ๐’Ž ๐’™ ๐’Ž (20)

where ๐‘  โ‰ก โˆ‘ ๐‘ค ๐‘’ and ๐‘’ ๐’ ๐’™ ๐‘ .

Similar to eq(14), a necessary condition for (𝒏, c) is that

M𝒏 = μ𝒏, μ: the smallest eigenvalue of M (21)
c = 𝒏·𝒎, ‖𝒏‖ = 1. (22)


Robust M-estimation – Weight Function (6) (25)

A necessary condition for (𝒏, c) in robust estimation is that

M𝒏 = μ𝒏, μ: the smallest eigenvalue of M (21)
c = 𝒏·𝒎, ‖𝒏‖ = 1. (22)

Properties:
− M typically depends on (𝒏, c) through the weights w(eₖ), and therefore eq(21) is a nonlinear eigenvalue problem for 𝒏.
− Only in the LS case does eq(21) become a linear eigenvalue problem.
− Eq(22) implies the line must pass through the weighted mean 𝒎.

There are many ways to solve this nonlinear problem; the iteratively reweighted least squares algorithm is often effective.



Robust M-estimation – Iteratively Reweighted Least Squares (IRLS) Algorithm (26)

Based on repeatedly updating the parameters (𝒏, c) from an initial guess (𝒏⁰, c⁰):

1. Evaluate the weights wₖ = w(𝒏⁰·𝒙ₖ − c⁰).
2. Compute 𝒎 and M in eqs (19) and (20).
3. Set 𝒏 to be the eigenvector for the minimum eigenvalue of M.
4. Find c by c = 𝒏·𝒎.
5. Reset 𝒏⁰ ← 𝒏, c⁰ ← c, and repeat from 1 until the update of (𝒏, c) becomes small.

Note that the updated estimate (𝒏, c) is not necessarily a local minimum of eq(14), since 𝒎 and M were computed using (𝒏⁰, c⁰) instead of (𝒏, c).

The solution may depend on the initial guess (𝒏⁰, c⁰).
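The five steps translate into a short loop. A sketch assuming Tukey-style weights; the function names, the sign fix, and the stopping rule are illustrative choices, not from the slides:

```python
import numpy as np

def w_tukey(e, cc=4.685):
    # redescending weight: zero beyond |e| = cc
    return np.where(np.abs(e) <= cc, (1.0 - (e / cc) ** 2) ** 2, 0.0)

def irls_line(X, weight_fn, n0, c0, iters=50, tol=1e-10):
    """IRLS for the line n.x - c = 0 from the initial guess (n0, c0)."""
    n, c = np.asarray(n0, float), float(c0)
    for _ in range(iters):
        w = weight_fn(X @ n - c)              # 1. weights from residuals
        s = w.sum()
        m = (w[:, None] * X).sum(axis=0) / s  # 2. weighted mean, eq (19)
        D = X - m
        M = (w[:, None] * D).T @ D            #    weighted scatter, eq (20)
        _, V = np.linalg.eigh(M)
        n_new = V[:, 0]                       # 3. smallest-eigenvalue eigenvector
        if n_new @ n < 0:                     # fix the sign ambiguity of n
            n_new = -n_new
        c_new = n_new @ m                     # 4. c = n.m
        done = np.linalg.norm(n_new - n) < tol and abs(c_new - c) < tol
        n, c = n_new, c_new                   # 5. reset and repeat
        if done:
            break
    return n, c

# Inliers on y = 1 plus one gross outlier
X = np.array([[0.0, 1.0], [1.0, 1.0], [2.0, 1.0], [3.0, 1.0], [1.5, 30.0]])
n, c = irls_line(X, w_tukey, n0=[0.0, 1.0], c0=1.0)   # outlier gets weight 0
```

Starting instead from an initial guess near the outlier illustrates the dependence on (𝒏⁰, c⁰) discussed above.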


Robust M-estimation – Example Results (27)

Line fit using the GM estimator ρ(e) = e²/(σ² + e²), where σ is the std. deviation of the noise of the observations near the dominant line.
From two different initial guesses (green), the iteratively reweighted least squares algorithm was applied (black) to get the final fit lines (blue).
− Left: outliers were correctly rejected in the final fit.
− Right: converged to an undesirable local minimum in the final fit.


[Figure: two runs, each showing the initial guess (green), intermediate fits (black), and the final fit (blue), with inliers and outliers marked]

How can we avoid the dependency on the initial guess?


Robust M-estimation – Controlling Leverage (28)

A simple strategy for dealing with this problem:
1. Estimate the distribution of data support along the fitted line (e.g. project the weights wₖ from 𝒙ₖ onto the line and blur).
2. Determine the longest contiguous region of support (high-weight interval) along the fitted line, i.e. find an interval of support without any large gaps.
3. Reduce the weights wₖ in the IRLS for points 𝒙ₖ significantly outside the region of contiguous support (e.g. set such weights to 0).
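A minimal sketch of the three steps (the thresholds `w_min` and `max_gap` are assumed values, and the blurring of step 1 is replaced by a simple sort of the projections):

```python
import numpy as np

def trim_leverage(X, w, n, w_min=0.5, max_gap=2.0):
    """Zero the IRLS weights of points outside the longest gap-free
    support interval along the fitted line with unit normal n."""
    d = np.array([-n[1], n[0]])             # direction of the line
    t = X @ d                               # 1. project points onto the line
    order = np.argsort(t)
    t_s, w_s = t[order], w[order]

    sup = np.where(w_s >= w_min)[0]         # positions with significant weight
    if sup.size == 0:
        return w
    gaps = np.where(np.diff(t_s[sup]) > max_gap)[0]
    runs = np.split(sup, gaps + 1)          # 2. contiguous high-weight runs
    best = max(runs, key=len)               #    keep the longest one
    lo, hi = t_s[best[0]], t_s[best[-1]]

    w_out = w.copy()                        # 3. zero weights far outside it
    w_out[(t < lo - max_gap) | (t > hi + max_gap)] = 0.0
    return w_out

# Four supporting points near the line y = 1 and one far-away leverage point
X = np.array([[0.0, 1.0], [1.0, 1.0], [2.0, 1.0], [3.0, 1.0], [20.0, 1.2]])
w = np.array([1.0, 1.0, 1.0, 1.0, 0.9])
w2 = trim_leverage(X, w, np.array([0.0, 1.0]))   # leverage point zeroed
```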

[Figure: initial guess line; high weights inside the contiguous-support region, reduced-weight regions outside it]

Robust M-estimation – Leverage Example (1) (29)

Iteration 0 Iteration 1 Iteration 2

Iteration 3 Iteration 4 Iteration 5

Blue: data with significant weights. Green segment: contiguous region of support.


Robust M-estimation – Successive Estimation (30)

Edge fitting problem on an edge image: the first fitted segment can then be removed from the active data set, and the process repeated. Satisfactory fitting results are obtained.

Summary of a Robust Line Estimation Algorithm (31)

1. Initial Guess: Randomly sample from the data. Set (𝒏⁰, c⁰) according to the position and orientation of the sampled edge pixels.

2. Iterative Fitting: Use the iteratively reweighted least squares algorithm to fit the line parameters. Maintain information about the support (from the data) for the line along its length, and use it to down-weight data likely to cause leverage problems.

3. Verification: Given a converged solution, decide whether that line has sufficient data support (e.g. consider the sum of the weights). Discard solutions with insufficient support.

4. Model Selection: For each verified line segment, remove the edge pixels that provide significant support for it from the original data set. Repeat steps 1 through 3.



Line Finder Examples (32)

Supplement – Choice of the scale parameter c for the estimator function (33)


Huber:
ρ(e) = e²            for |e| ≤ c
ρ(e) = c(2|e| − c)   for |e| > c
c = 1.345 if variance(e) = 1.0

Tukey:
ρ(e) = (c²/6)·(1 − (1 − (e/c)²)³)  for |e| ≤ c
ρ(e) = c²/6                        for |e| > c
c = 4.685 if variance(e) = 1.0

If variance(e) ≠ 1.0, rescale the errors with the robust scale
s = median|e| / 0.6745,  ρ(e) → ρ(e/s).

[Figure: ρ(e) versus error e for the Huber and Tukey estimators]


Other Robust Fitting Methods (34)

− Trimmed LS
− L-estimator, R-estimator
− LMedS (Least Median of Squares)
− RANSAC (Random Sample Consensus)
− RANSAC families:
  · Robust: MSAC, AMLESAC, uMLESAC
  · Fast: Progressive RANSAC, NAPSAC, PROSAC, GASAC
  · Accurate: MLESAC, QDEGSAC, MAPSAC, MSAC
