
Transcript
Page 1:

Digital Geometry Processing

Div. of Systems Science and Informatics, Laboratory of Digital Geometry Processing

Satoshi KANAI

- Robust Estimator -

1

Introduction

Robust Estimation
− Why needed?
− Least Squares Problem
− Least Squares and Outliers
− Hough Transform
− Robust M-estimation
− Weighted Least Squares
− Iteratively Reweighted Least Squares (IRLS) Algorithm
− Summary

Page 2:

References

1. C. V. Stewart: “Robust Parameter Estimation in Computer Vision”, SIAM Review, 41(3), pp. 513-537 (1999) (available on the web)

2. Peter J. Huber, Elvezio M. Ronchetti: “Robust Statistics” (Wiley Series in Probability and Statistics), Wiley (2009).

3. 小柳義夫 (Yoshio Oyanagi): “Robust Estimation and Data Analysis” (in Japanese), 日本物理学会誌, 34(10), pp. 884-891 (1979) (PDF downloadable from CiNii)

2

Robust Estimation

A common computational problem in vision: estimating the parameters of a model from image data.
Ex. Fitting lines to a set of edge pixels; fitting 3D planar regions to laser-scanned point clouds.

Key difficulties:
− The models must be fit to noisy images or point clouds.
− Initial guesses for the models must be generated automatically.
− The model may occur multiple times in one data set.
− The data typically contains “outliers” (異常値), i.e. observations that do not belong to the model being fitted.

3

[Figure: a line fit to data points, with an outlier marked]

Page 3:

Least Squares Problem – Fit error estimation - 4

[Figure: a line L and the signed distances of data points from it]

Parameters of the model ($L$): the unit normal $\boldsymbol{n}$ ($\|\boldsymbol{n}\| = 1$) and $c$, the distance of $L$ from the origin, defining $L:\ \boldsymbol{n}\cdot\boldsymbol{x} - c = 0$.

Fit error of a data point $\boldsymbol{x}_k$: the signed distance from $L$,
$$e_k = \boldsymbol{n}\cdot\boldsymbol{x}_k - c, \qquad e_k \in (-\infty, \infty).$$

Available in 2D, 3D, 4D, ...
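As a concrete check of the error formula, a minimal NumPy sketch (the example numbers are mine):

```python
import numpy as np

# Line L: n . x - c = 0, with ||n|| = 1 (a 2D example).
n = np.array([0.6, 0.8])              # unit normal of L
c = 2.0                               # distance of L from the origin

X = np.array([[3.0, 0.5],             # data points x_k, one per row
              [1.0, 2.0],
              [4.0, 4.0]])

e = X @ n - c                         # signed distances e_k = n . x_k - c
print(e)                              # e_k in (-inf, inf); sign gives the side of L
```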

Least Squares Problem – Minimizing the squared error - 5

The least squares (LS) problem: find the line parameters minimizing the sum of squared errors,
$$E(\boldsymbol{n}, c) = \sum_k e_k^2 = \sum_k (\boldsymbol{n}\cdot\boldsymbol{x}_k - c)^2 \;\rightarrow\; \min, \qquad \text{subject to } \|\boldsymbol{n}\| = 1.$$

Page 4:

Least Squares Problem – Least Squares Solution (1) - 6

Setting $\partial E/\partial c = 0$ gives
$$c = \boldsymbol{n}\cdot\boldsymbol{m}, \qquad \boldsymbol{m} = \frac{1}{N}\sum_k \boldsymbol{x}_k \qquad (6)$$

By substituting Eq. (6) into the line equation $\boldsymbol{n}\cdot\boldsymbol{x} - c = 0$, the result $\boldsymbol{n}\cdot(\boldsymbol{x} - \boldsymbol{m}) = 0$ shows that the optimal line must pass through $\boldsymbol{m}$.

Least Squares Problem – Least Squares Solution (2) - 7

Substituting $c = \boldsymbol{n}\cdot\boldsymbol{m}$ into $E$ gives $E(\boldsymbol{n}) = \boldsymbol{n}^{\mathsf T} M\,\boldsymbol{n}$ with the scatter (covariance) matrix
$$M = \sum_k (\boldsymbol{x}_k - \boldsymbol{m})(\boldsymbol{x}_k - \boldsymbol{m})^{\mathsf T}$$

$M$: $2\times 2$ if $\boldsymbol{x}_k \in \mathbb{R}^2$; $3\times 3$ if $\boldsymbol{x}_k \in \mathbb{R}^3$.

Page 5:

Least Squares Problem – Least Squares Solution (3) - 8

Minimizing $E(\boldsymbol{n}) = \boldsymbol{n}^{\mathsf T} M\,\boldsymbol{n}$ subject to $\|\boldsymbol{n}\| = 1$ leads to the eigenvalue problem $M\boldsymbol{n} = \mu\boldsymbol{n}$; the minimum is attained by the eigenvector for the smallest eigenvalue of $M$.

Note: the eigenvector $\boldsymbol{n}$ of $M$ is generally not unique (it is determined only up to scale). Therefore, the $\boldsymbol{n}$ which satisfies the normalization condition $\|\boldsymbol{n}\| = 1$ should be selected.

Least Squares Problem – Least Squares Solution (Summary) - 9

$$M\boldsymbol{n} = \mu\boldsymbol{n}, \qquad c = \boldsymbol{n}\cdot\boldsymbol{m}, \qquad \|\boldsymbol{n}\| = 1$$

$\mu$: the smallest eigenvalue of $M$
$\boldsymbol{n}$: the eigenvector of $M$ for $\mu$
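A minimal NumPy sketch of this least squares solution (the function name is mine); it works unchanged for 2D or 3D points:

```python
import numpy as np

def fit_line_ls(X):
    """LS fit of n . x - c = 0 to points X (N x d array); returns (n, c)."""
    m = X.mean(axis=0)                    # data mean m, eq. (6)
    D = X - m
    M = D.T @ D                           # scatter matrix M (d x d)
    mu, V = np.linalg.eigh(M)             # eigenvalues in ascending order
    n = V[:, 0]                           # eigenvector for the smallest eigenvalue
    return n, n @ m                       # c = n . m: the line passes through m

# Usage: noisy points near the line y = x
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 10.0, 100)
X = np.column_stack([t, t]) + rng.normal(0.0, 0.1, (100, 2))
n, c = fit_line_ls(X)
print(n, c)                               # n is close to (1, -1)/sqrt(2) up to sign
```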

Page 6:

Least Squares Problem – Graphical Interpretations - 10

[Figure: data points and the least squares solution L]

Eigenvector for the smallest eigenvalue = the direction of minimum data distribution.

Eigenvector for the largest eigenvalue = the direction of maximum data distribution.

Least Squares and Outliers - 11

Unfortunately, least squares (LS) solutions are sensitive to outliers.

The LS estimate is strongly influenced by even a small cluster of outliers.

The LS approach is NOT suitable when
− the data contains outliers, or
− multiple solutions are expected.

[Figure: an LS line fit pulled toward a cluster of outliers]

Page 7:

Robust M-estimation - Overview - 12

M-estimation replaces the squared error of LS by a robust loss $\rho(e)$ that grows more slowly for large $|e|$:
$$E(\boldsymbol{n}, c) = \sum_k \rho(e_k) \;\rightarrow\; \min.$$

Example (Tukey's biweight estimator):
$$\rho(e) = \begin{cases} \dfrac{c^2}{6}\left[1 - \left(1 - \left(\dfrac{e}{c}\right)^2\right)^3\right] & (|e| \le c) \\[2mm] \dfrac{c^2}{6} & (|e| > c) \end{cases}$$

Robust M-estimation - Typical estimators - 13

[Table: typical M-estimators and their loss functions ρ(e)]

Page 8:

Robust M-estimation - Influence Function - 14

Given the estimator $\rho(e)$, the influence function $\psi(e)$ is defined as
$$\psi(e) = \frac{d\rho(e)}{de} \qquad (16)$$

$\psi(e)$ gives the sensitivity of the overall estimate $(\boldsymbol{n}, c)$ to a data point with error $e$.

Examples of the estimators, influence functions and weight functions - 15

[Figure: ρ(e), ψ(e) and w(e) curves for typical estimators]
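A small sketch of two such estimator/influence pairs: Tukey's biweight (slide 12) and the Geman-McClure (GM) loss in its common form ρ(e) = e²/(σ² + e²), which I assume is the one meant on the GM slides below. The finite-difference check of eq. (16) is mine:

```python
import numpy as np

def rho_tukey(e, c=4.685):
    """Tukey's biweight loss (slide 12); constant c^2/6 for |e| > c."""
    r = np.clip(np.abs(e) / c, 0.0, 1.0)
    return (c**2 / 6.0) * (1.0 - (1.0 - r**2) ** 3)

def psi_tukey(e, c=4.685):
    """Influence function psi(e) = d rho / d e; zero for |e| > c."""
    return np.where(np.abs(e) <= c, e * (1.0 - (e / c) ** 2) ** 2, 0.0)

def rho_gm(e, sigma=1.0):
    """Geman-McClure loss (assumed form rho(e) = e^2 / (sigma^2 + e^2))."""
    return e**2 / (sigma**2 + e**2)

def psi_gm(e, sigma=1.0):
    """Influence 2 e sigma^2 / (sigma^2 + e^2)^2; vanishes as |e| grows."""
    return 2.0 * e * sigma**2 / (sigma**2 + e**2) ** 2

# Finite-difference check of eq. (16), psi = d rho / d e:
e, h = np.linspace(-8.0, 8.0, 9), 1e-6
assert np.allclose((rho_tukey(e + h) - rho_tukey(e - h)) / (2 * h), psi_tukey(e), atol=1e-4)
assert np.allclose((rho_gm(e + h) - rho_gm(e - h)) / (2 * h), psi_gm(e), atol=1e-4)
```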

Page 9:

Objective Function Examples - LS and L1 -

Plotting the objective function $E(\boldsymbol{n}(\theta), c)$ for $\boldsymbol{n}(\theta) = (\cos\theta, \sin\theta)$:

− The LS estimator provides a unique local minimum.
− The L1 estimator can have multiple local minima.
− Data with large errors have a significant influence.
− The shape of $E$ around the minima is “shallow”.

16

[Figure: $E$ versus $\theta$ for LS ($\rho(e) = e^2$) and L1 ($\rho(e) = |e|$)]
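Such profiles can be reproduced with a simple grid search over θ and c (this two-level grid scan is my sketch, not the author's plotting code):

```python
import numpy as np

def objective_profile(X, rho, thetas, cs):
    """E(theta) = min over the c-grid of sum_k rho(n(theta) . x_k - c)."""
    E = np.empty(len(thetas))
    for i, th in enumerate(thetas):
        n = np.array([np.cos(th), np.sin(th)])
        e = (X @ n)[:, None] - cs[None, :]   # errors for every (point, c) pair
        E[i] = rho(e).sum(axis=0).min()      # best c on the grid
    return E

# Usage: compare the LS and L1 profiles on some 2D data X
# thetas = np.linspace(0, np.pi, 360); cs = np.linspace(-10, 10, 400)
# E_ls = objective_profile(X, np.square, thetas, cs)
# E_l1 = objective_profile(X, np.abs, thetas, cs)
```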

Objective Function Examples - GM (with σ = 1) - 17

$$\rho(e) = \frac{e^2}{\sigma^2 + e^2}$$

− The influence of a data point with large error vanishes.
− The objective function exhibits 3 strong minima (green lines).
− There are many weaker local minima (red lines).

Page 10:

Objective Function Examples - GM (with varying σ) - 18

$$\rho(e) = \frac{e^2}{\sigma^2 + e^2}$$

− Increasing σ in the GM estimator smooths the objective function.
− There are fewer local minima for larger σ.
− As σ is increased, however, outliers receive greater influence.

[Figure: $E$ versus $\theta$ for $\sigma = 2$ and $\sigma = 4$]

Robust M-estimation - Weight Function (1) -

It is sometimes convenient to re-express a robust estimator in a form resembling weighted least squares, such as

$$E(\boldsymbol{\theta}) = \sum_k \rho\big(e_k(\boldsymbol{\theta})\big) \;\rightarrow\; \sum_k w(e_k)\, e_k^2(\boldsymbol{\theta}) \qquad \text{(a)}$$

where $\boldsymbol{\theta}$: model parameter vector.

The gradient of the objective function $E(\boldsymbol{\theta}) = \sum_k \rho(e_k(\boldsymbol{\theta}))$ with respect to $\boldsymbol{\theta}$ is described as

$$\frac{\partial E(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} = \sum_k \frac{d\rho(e_k)}{de}\cdot\frac{\partial e_k(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} = \sum_k \psi(e_k)\cdot\frac{\partial e_k(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} \qquad (17)$$

If we put $w(e)\,e = \psi(e)$, then

$$\frac{\partial E(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} = \sum_k w(e_k)\cdot e_k(\boldsymbol{\theta})\cdot\frac{\partial e_k(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} \qquad (18)$$

19

Page 11:

Robust M-estimation - Weight Function (2) -

From eq. (18),

$$\frac{\partial E(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} = \sum_k w(e_k)\cdot e_k(\boldsymbol{\theta})\cdot\frac{\partial e_k(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} \qquad \text{(b)}$$

On the other hand, from eq. (a) on the last slide, if we assume

$$E(\boldsymbol{\theta}) = \sum_k w(e_k)\, e_k^2(\boldsymbol{\theta}),$$

$$\frac{\partial E(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} = \sum_k \frac{\partial}{\partial \boldsymbol{\theta}}\Big( w(e_k)\, e_k^2(\boldsymbol{\theta}) \Big)$$

If we assume $w(e_k)$ is directly independent of $\boldsymbol{\theta}$,

$$\frac{\partial E(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} = \sum_k 2\,w(e_k)\cdot e_k(\boldsymbol{\theta})\cdot\frac{\partial e_k(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}} \;\propto\; \sum_k w(e_k)\cdot e_k(\boldsymbol{\theta})\cdot\frac{\partial e_k(\boldsymbol{\theta})}{\partial \boldsymbol{\theta}}$$

Up to the constant factor 2 (which does not change the minimizer), this is the same result as (b).

20

Robust M-estimation - Weight Function (3) -

Therefore, by taking the weight function for the error $e$ as
$$w(e) = \frac{\psi(e)}{e},$$
the estimation can be treated as a weighted least squares problem:
$$E(\boldsymbol{\theta}) = \sum_k w(e_k)\, e_k^2(\boldsymbol{\theta}) \;\rightarrow\; \min.$$

For $\rho(e) = e^2$, the weights are constant, $w(e) = 2$, and the estimation becomes LS.

21
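A sketch of these weight functions w(e) = ψ(e)/e for the estimators used in this deck (the Huber and Tukey forms follow the supplement slide; the GM form is the assumed ρ(e) = e²/(σ² + e²)):

```python
import numpy as np

def w_ls(e):
    """LS: rho(e) = e^2, psi(e) = 2e, so w(e) = 2 (constant)."""
    return np.full_like(np.asarray(e, dtype=float), 2.0)

def w_huber(e, c=1.345):
    """Huber (rho = e^2 inside, c(2|e| - c) outside): w = 2 or 2c/|e|."""
    a = np.abs(e)
    return np.where(a <= c, 2.0, 2.0 * c / np.maximum(a, 1e-12))

def w_tukey(e, c=4.685):
    """Tukey biweight: w(e) = (1 - (e/c)^2)^2 for |e| <= c, else 0."""
    r = e / c
    return np.where(np.abs(r) <= 1.0, (1.0 - r**2) ** 2, 0.0)

def w_gm(e, sigma=1.0):
    """GM: w(e) = 2 sigma^2 / (sigma^2 + e^2)^2; large errors get ~0 weight."""
    return 2.0 * sigma**2 / (sigma**2 + e**2) ** 2

e = np.array([0.0, 1.0, 5.0, 50.0])
print(w_ls(e), w_huber(e), w_tukey(e), w_gm(e))  # weights shrink as |e| grows
```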

Page 12:

Examples of the estimators, influence functions and weight functions - 22

[Figure: ρ(e), ψ(e) and w(e) curves]

ρ(e) & w(e) have opposite properties!

Examples of the estimators, influence functions and weight functions - 23

[Figure: ρ(e), ψ(e) and w(e) curves for further estimators]

Page 13:

Robust M-estimation - Weight Function (5) -

Back to the “robust line fitting” problem $E(\boldsymbol{n}, c) \rightarrow \min$: by extending the solution of slide 10, we use the weighted mean and weighted covariance matrix for M-estimation.

$$\boldsymbol{m} = \frac{1}{s}\sum_k w(e_k)\,\boldsymbol{x}_k \qquad (19)$$

$$M = \sum_k w(e_k)\,(\boldsymbol{x}_k - \boldsymbol{m})(\boldsymbol{x}_k - \boldsymbol{m})^{\mathsf T} \qquad (20)$$

where $s \equiv \sum_k w(e_k)$ and $e_k = \boldsymbol{n}\cdot\boldsymbol{x}_k - c$.

Similar to eq. (14), a necessary condition for $(\boldsymbol{n}, c)$ is that
$$M\boldsymbol{n} = \mu\boldsymbol{n}, \qquad \mu:\ \text{the smallest eigenvalue of } M \qquad (21)$$
$$c = \boldsymbol{n}\cdot\boldsymbol{m}, \qquad \|\boldsymbol{n}\| = 1 \qquad (22)$$

24

Robust M-estimation - Weight Function (6) -

A necessary condition for $(\boldsymbol{n}, c)$ in robust estimation is that
$$M\boldsymbol{n} = \mu\boldsymbol{n}, \qquad \mu:\ \text{the smallest eigenvalue of } M \qquad (21)$$
$$c = \boldsymbol{n}\cdot\boldsymbol{m}, \qquad \|\boldsymbol{n}\| = 1 \qquad (22)$$

Properties
− $M$ typically depends on $(\boldsymbol{n}, c)$ through the weights $w(e_k)$, and therefore eq. (21) is a nonlinear eigenvalue problem for $\boldsymbol{n}$.
− Only in the LS case does eq. (21) become a linear eigenvalue problem.
− Eq. (22) implies the line must pass through the weighted mean $\boldsymbol{m}$.

There are many ways to solve this nonlinear problem; the iteratively reweighted least squares algorithm is often effective.

25

Page 14:

Robust M-estimation - Iteratively Reweighted Least Squares (IRLS) Algorithm -

Based on repeatedly updating the parameters $(\boldsymbol{n}, c)$ from an initial guess $(\boldsymbol{n}^0, c^0)$:

1. Evaluate the weights $w_k = w(\boldsymbol{n}^0\cdot\boldsymbol{x}_k - c^0)$.
2. Compute $\boldsymbol{m}$ and $M$ by eqs. (19) and (20).
3. Set $\boldsymbol{n}$ to be the eigenvector for the minimum eigenvalue of $M$.
4. Find $c$ by $c = \boldsymbol{n}\cdot\boldsymbol{m}$.
5. Reset $\boldsymbol{n}^0 \leftarrow \boldsymbol{n}$, $c^0 \leftarrow c$, and repeat from 1 until the update of $(\boldsymbol{n}, c)$ becomes small.

A code sketch of this loop follows below.
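A compact NumPy sketch of the IRLS loop (the function name, the Tukey weight choice, and the convergence test are my assumptions):

```python
import numpy as np

def w_tukey(e, c=4.685):
    """Tukey biweight weights (repeated here for self-containment)."""
    r = e / c
    return np.where(np.abs(r) <= 1.0, (1.0 - r**2) ** 2, 0.0)

def fit_line_irls(X, n0, c0, weight=w_tukey, iters=50, tol=1e-8):
    """IRLS fit of n . x - c = 0 to X (N x d); returns (n, c), ||n|| = 1."""
    n, c = n0 / np.linalg.norm(n0), c0
    for _ in range(iters):
        w = weight(X @ n - c)                  # step 1: weights from current errors
        s = w.sum()
        if s <= 0.0:                           # every point down-weighted: give up
            break
        m = (w[:, None] * X).sum(axis=0) / s   # step 2: weighted mean, eq. (19)
        D = X - m
        M = (w[:, None] * D).T @ D             # weighted covariance, eq. (20)
        _, V = np.linalg.eigh(M)
        n_new = V[:, 0]                        # step 3: smallest-eigenvalue eigenvector
        if n_new @ n < 0.0:                    # fix the sign ambiguity of n
            n_new = -n_new
        c_new = n_new @ m                      # step 4: line through weighted mean
        done = np.abs(n_new - n).max() + abs(c_new - c) < tol
        n, c = n_new, c_new                    # step 5: reset and repeat
        if done:
            break
    return n, c
```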

Note that the updated estimate $(\boldsymbol{n}, c)$ is not necessarily a local minimum of eq. (14), since $\boldsymbol{m}$ and $M$ were computed using $(\boldsymbol{n}^0, c^0)$ instead of $(\boldsymbol{n}, c)$.

The solution may depend on the initial guess $(\boldsymbol{n}^0, c^0)$.

26

Robust M-estimation - Example Results -

Line fit using the GM estimator $\rho(e) = \dfrac{e^2}{\sigma^2 + e^2}$, with σ set to the standard deviation of the noise for the observations near the dominant line.

From two different initial guesses (green), the iteratively reweighted least squares algorithm was applied (black) to get the final fit lines (blue).
− Left: outliers were correctly rejected in the final fit.
− Right: the iteration converged to an undesirable local minimum.

27

[Figure: two runs on inliers plus outliers, showing the initial guess, intermediate iterations, and the final fit]

How can we avoid the dependency on the initial guess?

Page 15:

Robust M-estimation - Controlling Leverage - 28

A simple strategy for dealing with this problem (a code sketch follows below):

1. Estimate the distribution of data support along the fitted line (e.g. project the weights $w_k$ from $\boldsymbol{x}_k$ onto the line and blur).

2. Determine the longest contiguous region of support (high-weight interval) along the fitted line, i.e. find an interval of support without any large gaps.

3. Reduce the weights $w_k$ in the IRLS for points $\boldsymbol{x}_k$ significantly outside the region of contiguous support (e.g. set such weights to 0).
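A rough 2D sketch of steps 1-3 under simple assumptions (histogram binning instead of blurring, a fixed occupancy threshold; all names and thresholds are mine):

```python
import numpy as np

def support_mask(X, n, c, w, bin_width=1.0, min_support=0.5):
    """Zero out weights outside the longest contiguous support interval.

    X: (N, 2) points; (n, c): current line; w: current IRLS weights.
    """
    t_dir = np.array([-n[1], n[0]])          # direction along the line
    t = X @ t_dir                            # step 1: project points onto the line
    bins = np.floor(t / bin_width).astype(int)
    lo, hi = bins.min(), bins.max()
    hist = np.zeros(hi - lo + 1)
    np.add.at(hist, bins - lo, w)            # weight histogram along the line

    occupied = hist >= min_support           # step 2: longest gap-free run of bins
    best, cur, best_rng, start = 0, 0, (0, 0), 0
    for i, occ in enumerate(occupied):
        cur = cur + 1 if occ else 0
        if cur == 1:
            start = i
        if cur > best:
            best, best_rng = cur, (start, i)

    keep = (bins - lo >= best_rng[0]) & (bins - lo <= best_rng[1])
    return np.where(keep, w, 0.0)            # step 3: zero the outside weights
```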

[Figure: initial guess with high weights along the contiguous support, and reduced-weight regions outside it]

Robust M-estimation - Leverage Example (1) - 29

[Figure: iterations 0-5 of IRLS with leverage control. Blue: data with significant weights. Green segment: contiguous region of support.]

Page 16:

Robust M-estimation - Successive Estimation - 30

Edge fitting problem on an edge image:
− The first fitted segment can then be removed from the active data set, and the process repeated.
− Satisfactory fitting results are obtained.

Summary of a Robust Line Estimation Algorithm

1. Initial Guess: Randomly sample from the data. Set $(\boldsymbol{n}^0, c^0)$ according to the position and orientation of the sampled edge pixels.

2. Iterative Fitting: Use the iteratively reweighted least squares algorithm to fit the line parameters. Maintain information about the support (from the data) for the line along its length, and use it to down-weight data likely to cause leverage problems.

3. Verification: Given a converged solution, decide whether the line has sufficient data support (e.g. consider the sum of the weights). Discard solutions with insufficient support.

4. Model Selection: For each verified line segment, remove the edge pixels that provide significant support for it from the original data set. Repeat steps 1 through 3 (a driver-loop sketch follows below).

31
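Putting the steps together, a hypothetical driver loop (it assumes fit_line_irls and w_tukey from the IRLS sketch above are in scope; the sampling rule and verification thresholds are made up):

```python
import numpy as np

def extract_lines(X, max_lines=5, min_weight_sum=20.0, seed=0):
    """Successive robust line extraction following steps 1-4 above."""
    rng = np.random.default_rng(seed)
    active = X.copy()
    lines = []
    for _ in range(max_lines):
        if len(active) < 2:
            break
        # 1. Initial guess from two randomly sampled points
        i, j = rng.choice(len(active), size=2, replace=False)
        d = active[j] - active[i]
        if np.linalg.norm(d) == 0.0:
            continue                            # degenerate sample: try again
        n0 = np.array([-d[1], d[0]]) / np.linalg.norm(d)
        c0 = n0 @ active[i]
        # 2. Iterative fitting (IRLS sketch above)
        n, c = fit_line_irls(active, n0, c0)
        # 3. Verification: sum of final weights as the support score
        w = w_tukey(active @ n - c)
        if w.sum() < min_weight_sum:
            continue                            # insufficient support: discard
        lines.append((n, c))
        # 4. Model selection: remove supporting pixels, then repeat 1-3
        active = active[w < 0.5]
    return lines
```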

Page 17:

Line Finder Examples - 32

Supplement – Choice of the scale parameter c for the estimator functions - 33

[Figure: ρ(e) versus error e for the Huber and Tukey estimators, with the scale parameter c marked]

Huber estimator:
$$\rho(e) = \begin{cases} e^2 & (|e| \le c) \\ c\,(2|e| - c) & (|e| > c) \end{cases} \qquad \boldsymbol{c = 1.345}\ \text{ if } \mathrm{variance}(e) = 1.0$$

Tukey's biweight estimator:
$$\rho(e) = \begin{cases} \dfrac{c^2}{6}\left[1 - \left(1 - \left(\dfrac{e}{c}\right)^2\right)^3\right] & (|e| \le c) \\[2mm] \dfrac{c^2}{6} & (|e| > c) \end{cases} \qquad \boldsymbol{c = 4.685}\ \text{ if } \mathrm{variance}(e) = 1.0$$

If $\mathrm{variance}(e) \ne 1.0$, estimate the scale robustly as
$$s = \frac{\mathrm{median}(|e|)}{0.6745}$$
and rescale the errors: $\rho(e) \rightarrow \rho(e/s)$.
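A small sketch of this scale normalization (the example residuals and the zero guard are mine):

```python
import numpy as np

def robust_scale(e):
    """MAD-based scale: s = median(|e|) / 0.6745, per the supplement slide."""
    s = np.median(np.abs(e)) / 0.6745
    return max(s, 1e-12)                  # guard against all-zero residuals

# Usage: normalize residuals, then apply rho with the tabulated c
e = np.array([0.10, -0.20, 0.05, 8.00])  # one gross outlier
s = robust_scale(e)
e_scaled = e / s                          # rho(e) -> rho(e / s), with
print(s, e_scaled)                        # c = 1.345 (Huber) or 4.685 (Tukey)
```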

Page 18:

Other Robust Fitting Methods

− Trimmed LS
− L-estimator, R-estimator
− LMedS (Least Median of Squares)
− RANSAC (Random Sample Consensus)
− RANSAC families:
  - Robust: MSAC, AMLESAC, uMLESAC
  - Fast: Progressive RANSAC, NAPSAC, PROSAC, GASAC
  - Accurate: MLESAC, QDEGSAC, MAPSAC, MSAC

34