Globally Optimal Estimates for Geometric Reconstruction Problems
Transcript of Globally Optimal Estimates for Geometric Reconstruction Problems
![Page 1: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/1.jpg)
Globally Optimal Estimates for Geometric Reconstruction Problems
Tom Gilat, Adi Lakritz
Advanced Topics in Computer Vision Seminar
Faculty of Mathematics and Computer Science
Weizmann Institute
3 June 2007
![Page 2: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/2.jpg)
outline
- Motivation and Introduction
- Background: Positive SemiDefinite matrices (PSD), Linear Matrix Inequalities (LMI), SemiDefinite Programming (SDP)
- Relaxations: Sum Of Squares (SOS) relaxation, Linear Matrix Inequalities (LMI) relaxation
- Application in vision: Finding optimal structure; Partial relaxation and Schur's complement
![Page 3: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/3.jpg)
Motivation
Geometric Reconstruction Problems
Polynomial optimization problems (POPs)
![Page 4: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/4.jpg)
Triangulation problem in L2 norm

Given: 3×4 camera matrices P1, ..., Pn and corresponding image points x1, ..., xn
Goal: estimate X, the source point in the world, from the image points

2 views – exact solution; multiple views – optimization
![Page 5: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/5.jpg)
Triangulation problem in L2 norm

Perspective camera i: the world point X projects to the reprojected point x'i = Pi X, and erri is the distance between the measured image point xi and x'i.

Given: 3×4 camera matrices P1, ..., Pn and corresponding image points x1, ..., xn
Goal: estimate X, the source point in the world, from the image points
![Page 6: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/6.jpg)
Triangulation problem in L2 norm

minimize the reprojection error in all cameras

error function: err(X) = L(X) = Σi=1..n d(xi, Pi(X))² – non-convex
domain is all points in front of all cameras: λi(X) > 0 – convex

min err(X) subject to λi(X) > 0 – a polynomial minimization problem, non-convex
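This cost can be sketched numerically. A minimal NumPy sketch, assuming the standard pinhole projection; the two toy cameras and the 3D point below are hypothetical values chosen only to exercise the formula, not data from the talk:

```python
import numpy as np

def reproject(P, X):
    """Project a homogeneous 3D point X through a 3x4 camera matrix P."""
    x = P @ X
    return x[:2] / x[2]  # perspective division; assumes depth x[2] > 0

def reprojection_cost(Ps, xs, X):
    """err(X) = sum_i d(x_i, P_i X)^2 -- the non-convex L2 triangulation cost."""
    return sum(np.sum((reproject(P, X) - x) ** 2) for P, x in zip(Ps, xs))

# Two toy cameras: identity pose, and a unit translation along the x axis.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0, 1.0])          # homogeneous world point
xs = [reproject(P1, X_true), reproject(P2, X_true)]

cost_at_truth = reprojection_cost([P1, P2], xs, X_true)  # zero at the true point
```

With noise-free measurements the cost vanishes at the true point and is positive elsewhere, which is exactly what an optimizer of err(X) exploits.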
![Page 7: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/7.jpg)
More computer vision problems
• Reconstruction problem:known cameras, known corresponding pointsfind 3D points that minimize the projection error of given image points– Similar to triangulation for many points and cameras
• Calculating homographygiven 3D points on a plane and corresponding image points, calculate homography
• Many more problems
![Page 8: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/8.jpg)
Optimization problems
![Page 9: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/9.jpg)
Introduction to optimization problems

minimize f0(x) – objective function
subject to fi(x) ≤ bi, i = 1..m – constraints
where x ∈ Rn and fi : Rn → R, i = 0..m

feasible set = all vectors that satisfy the constraints
x~ is optimal if x~ is in the feasible set and f0(x~) is smallest among all vectors in the feasible set
![Page 10: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/10.jpg)
optimization problems

Example: the partition problem. Can the sequence a1, ..., an be partitioned, i.e., are there x1, ..., xn ∈ {−1, 1} such that Σi ai xi = 0?

Equivalently: is min_x f(x) = (Σi ai xi)² + Σi (xi² − 1)² equal to 0?

NP-complete
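The reduction above can be checked by brute force, in exponential time, in line with NP-completeness; the sequences in the usage note are illustrative:

```python
import itertools

def partition_objective(a, x):
    """f(x) = (sum_i a_i x_i)^2 + sum_i (x_i^2 - 1)^2; its minimum is 0
    exactly when a can be split into two parts of equal sum."""
    return sum(ai * xi for ai, xi in zip(a, x)) ** 2 + \
           sum((xi * xi - 1) ** 2 for xi in x)

def is_partitionable(a):
    """Brute-force the minimum of f over x in {-1, 1}^n."""
    return any(partition_objective(a, x) == 0
               for x in itertools.product((-1, 1), repeat=len(a)))
```

For example, (1, 2, 3) splits as {1, 2} versus {3}, while (1, 2, 4) admits no equal split.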
![Page 11: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/11.jpg)
optimization problems

optimization:
- convex: Linear Programming (LP), SemiDefinite Programming (SDP) – solutions exist: interior point methods
- non convex – problems: local optimum or high computational cost
![Page 12: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/12.jpg)
non convex optimization

Many algorithms get stuck in local minima; min/max over a non-convex feasible set (figure: level curves of f, initialization point)
![Page 13: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/13.jpg)
optimization problems

optimization:
- convex: LP, SDP – solutions exist: interior point methods
- non convex – problems: local optimum or high computational cost

global optimization – algorithms that converge to the optimal solution via relaxation of the problem
![Page 14: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/14.jpg)
outline
- Motivation and Introduction
- Background: Positive SemiDefinite matrices (PSD), Linear Matrix Inequalities (LMI), SemiDefinite Programming (SDP)
- Relaxations: Sum Of Squares (SOS) relaxation, Linear Matrix Inequalities (LMI) relaxation
- Application in vision: Finding optimal structure; Partial relaxation and Schur's complement
![Page 15: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/15.jpg)
positive semidefinite (PSD) matrices

Definition: a matrix M in Rn×n is PSD, denoted by M ⪰ 0, if
1. M is symmetric: M = MT
2. xT M x ≥ 0 for all x ∈ Rn

M can be decomposed as M = AAT (Cholesky); proof: spectral theorem and nonnegative eigenvalues.
Conversely, M = AAT implies xT M x = xT A AT x = (AT x)T (AT x) = ||AT x||² ≥ 0.
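Both directions of the claim can be checked numerically. A sketch with NumPy; the random matrix is arbitrary:

```python
import numpy as np

def is_psd(M, tol=1e-10):
    """PSD per the definition: symmetric, and x^T M x >= 0 for all x,
    which for a symmetric M is equivalent to all eigenvalues being >= 0."""
    return np.allclose(M, M.T) and np.linalg.eigvalsh(M).min() >= -tol

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
M = A @ A.T                      # any A A^T is PSD: x^T M x = ||A^T x||^2

L = np.linalg.cholesky(M)        # Cholesky recovers a factor with M = L L^T
```

A symmetric matrix with a negative eigenvalue, e.g. diag(1, −1), fails the test, matching the definition.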
![Page 16: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/16.jpg)
positive semidefinite (PSD) matrices

Definition: a matrix M in Rn×n is PSD, denoted by M ⪰ 0, if
1. M is symmetric: M = MT
2. xT M x ≥ 0 for all x ∈ Rn

M can be decomposed as M = AAT (Cholesky)
if M is of rank 1, then M = v vT for some v ∈ Rn
![Page 17: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/17.jpg)
principal minors

The kth order principal minors of an n×n symmetric matrix M are the determinants of the k×k matrices obtained by deleting n − k rows and the corresponding n − k columns of M.

first order: elements on the diagonal
second order: determinants of the 2×2 submatrices (illustrated on a 3×3 matrix with entries M11, ..., M33)
![Page 18: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/18.jpg)
diagonal minors

The kth order principal minors of an n×n symmetric matrix M are the determinants of the k×k matrices obtained by deleting n − k rows and the corresponding n − k columns of M.

first order: elements on the diagonal
second order: determinants of the 2×2 submatrices
![Page 19: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/19.jpg)
diagonal minors

The kth order principal minors of an n×n symmetric matrix M are the determinants of the k×k matrices obtained by deleting n − k rows and the corresponding n − k columns of M.

first order: elements on the diagonal
second order: determinants of the 2×2 submatrices
![Page 20: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/20.jpg)
diagonal minors

A matrix M ⪰ 0 iff all the principal minors of M are nonnegative.

The kth order principal minors of an n×n symmetric matrix M are the determinants of the k×k matrices obtained by deleting n − k rows and the corresponding n − k columns of M.

first order: elements on the diagonal
second order: determinants of the 2×2 submatrices
third order: det(M)
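The minor criterion can be tested directly. A sketch; determinants are computed numerically, so a small tolerance absorbs round-off:

```python
import itertools
import numpy as np

def principal_minors(M):
    """Yield every principal minor of M: the determinant of the submatrix
    keeping an index subset S of rows and the same columns."""
    n = M.shape[0]
    for k in range(1, n + 1):
        for S in itertools.combinations(range(n), k):
            yield np.linalg.det(M[np.ix_(S, S)])

def psd_by_minors(M, tol=1e-10):
    """M is PSD iff all its principal minors are nonnegative."""
    return all(m >= -tol for m in principal_minors(M))
```

For example, [2 1; 1 2] passes (minors 2, 2, 3), while [1 2; 2 1] fails because its determinant is −3.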
![Page 21: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/21.jpg)
Set of PSD matrices in 2D

[x y; y z] ⪰ 0  ⟺  x ≥ 0, z ≥ 0, xz − y² ≥ 0
![Page 22: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/22.jpg)
Set of PSD matrices

This set is convex.

Proof:
M ⪰ 0, t ≥ 0 ⟹ xT (tM) x = t xT M x ≥ 0 ⟹ tM ⪰ 0
M ⪰ 0, L ⪰ 0 ⟹ xT (M + L) x = xT M x + xT L x ≥ 0 ⟹ M + L ⪰ 0
so tM + (1 − t)L ⪰ 0 for t ∈ [0, 1]
![Page 23: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/23.jpg)
LMI – linear matrix inequality

A(x) = A0 + x1 A1 + ... + xn An ⪰ 0
where x = (x1, ..., xn) ∈ Rn and the Ai are k×k symmetric matrices

The feasible set K = {x ∈ Rn | A(x) ⪰ 0} is convex.

Proof: x, y ∈ K, t ∈ [0, 1] ⟹ A(tx + (1 − t)y) = t A(x) + (1 − t) A(y) ⪰ 0 ⟹ tx + (1 − t)y ∈ K
![Page 24: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/24.jpg)
LMI

example: find the feasible set of the 2D LMI A(x) = A0 + x1 A1 + x2 A2 ⪰ 0 with

A0 = [1 0 0; 0 2 0; 0 0 1], A1 = [1 1 1; 1 0 0; 1 0 0], A2 = [0 1 0; 1 −1 1; 0 1 −1]

A(x) = [1+x1  x1+x2  x1;  x1+x2  2−x2  x2;  x1  x2  1−x2] ⪰ 0
![Page 25: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/25.jpg)
reminder
A matrix M ⪰ 0 iff all the principal minors of M are nonnegative.
![Page 26: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/26.jpg)
LMI

example: find the feasible set of the 2D LMI

A(x) = [1+x1  x1+x2  x1;  x1+x2  2−x2  x2;  x1  x2  1−x2] ⪰ 0

1st order principal minors: 1 + x1 ≥ 0, 2 − x2 ≥ 0, 1 − x2 ≥ 0
![Page 27: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/27.jpg)
LMI

example: find the feasible set of the 2D LMI

A(x) = [1+x1  x1+x2  x1;  x1+x2  2−x2  x2;  x1  x2  1−x2] ⪰ 0

2nd order principal minors:
(1 + x1)(2 − x2) − (x1 + x2)² ≥ 0,
(2 − x2)(1 − x2) − x2² ≥ 0,
(1 + x1)(1 − x2) − x1² ≥ 0
![Page 28: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/28.jpg)
LMI

example: find the feasible set of the 2D LMI

A(x) = [1+x1  x1+x2  x1;  x1+x2  2−x2  x2;  x1  x2  1−x2] ⪰ 0

3rd order principal minor: det A(x) ≥ 0

The feasible set is the intersection of all the inequalities.
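The feasible set of this example can be probed numerically; the matrices below encode A(x) as reconstructed above, and membership is tested via eigenvalues, which is equivalent to the minor conditions:

```python
import numpy as np

A0 = np.array([[1., 0., 0.], [0., 2., 0.], [0., 0., 1.]])
A1 = np.array([[1., 1., 1.], [1., 0., 0.], [1., 0., 0.]])
A2 = np.array([[0., 1., 0.], [1., -1., 1.], [0., 1., -1.]])

def A(x1, x2):
    """The LMI pencil A(x) = A0 + x1*A1 + x2*A2."""
    return A0 + x1 * A1 + x2 * A2

def feasible(x1, x2, tol=1e-10):
    """x lies in K = {x | A(x) >= 0} iff the smallest eigenvalue is >= 0."""
    return np.linalg.eigvalsh(A(x1, x2)).min() >= -tol
```

The origin is feasible, since A(0, 0) = A0 is diagonal with positive entries; a point violating a first-order minor, such as x1 = −2 where 1 + x1 < 0, is not.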
![Page 29: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/29.jpg)
Semidefinite Programming (SDP) = LMI

an extension of LP

LP: minimize cT x subject to Ax ≤ b
SDP: minimize cT x subject to A(x) ⪰ 0, Bx ≤ b
![Page 30: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/30.jpg)
outline
- Motivation and Introduction
- Background: Positive SemiDefinite matrices (PSD), Linear Matrix Inequalities (LMI), SemiDefinite Programming (SDP)
- Relaxations: Sum Of Squares (SOS) relaxation, Linear Matrix Inequalities (LMI) relaxation
- Application in vision: Finding optimal structure; Partial relaxation and Schur's complement
![Page 31: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/31.jpg)
Sum Of Squares relaxation (SOS)
Unconstrained polynomial optimization problem (POP): unconstrained means the feasible set is Rn
H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.
![Page 32: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/32.jpg)
Sum Of Squares relaxation (SOS)

Define:
N = {f | f(x) ≥ 0 for all x ∈ Rn, and f(x) is a polynomial}
SOS = {f | there are polynomials g1(x), ..., gn(x) s.t. f(x) = Σi gi(x)²}

SOS ⊆ N, SOS ≠ N
f(x) ∈ N \ SOS is rare
![Page 33: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/33.jpg)
SOS relaxation for unconstrained polynomials

P: find min_x f(x)        P': find max p s.t. f(x) − p ∈ N

Proof: if p = min_x f(x), then f(x) − p ≥ 0, and for any larger p this will not hold, so max p s.t. f(x) − p ≥ 0 equals min_x f(x)
![Page 34: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/34.jpg)
SOS relaxation for unconstrained polynomials

P: find min_x f(x)        P': find max p s.t. f(x) − p ∈ N
P'': find max q s.t. f(x) − q ∈ SOS – relaxation

P'' can be solved by SDP
q ≤ p: guarantees a bound on p
![Page 35: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/35.jpg)
monomial basis

vr(x) = vector of all monomials of degree ≤ r; for example
v2(x) = [1, x1, x2, x1², x1x2, x2²]T

dim(vr(x)) = d(r) = (n + r choose r)

example: if the degree of f is r, then f(x) = aT vr(x) for some a ∈ Rd(r)
![Page 36: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/36.jpg)
SOS relaxation to SDP

SOSr = set of SOS polynomials of degree ≤ 2r
     = {g(x) | g(x) = Σi gi(x)², deg(gi) ≤ r}
     = {g(x) | g(x) = Σi (aiT vr(x))², ai ∈ Rd(r)}
     = {g(x) | g(x) = vr(x)T (Σi ai aiT) vr(x)}
     = {g(x) | g(x) = vr(x)T V vr(x), V ⪰ 0}

SDPr: max q subject to f(x) − q = vr(x)T V vr(x), V ⪰ 0
![Page 37: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/37.jpg)
SOS relaxation to SDP

example: f(x) = 2x1 − 3x2 + 4x1²x2²

f(x) − q = v2(x)T V v2(x), where v2(x) = [1, x1, x2, x1², x1x2, x2²]T and V is a symmetric 6×6 matrix with entries Vij, V ⪰ 0

matching coefficients monomial by monomial gives the SDP:
max q subject to
−q = V11, 2 = 2V12, −3 = 2V13, 4 = 2V46 + V55, all other coefficient sums 0, V ⪰ 0
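The identity f(x) − q = v(x)T V v(x) with V ⪰ 0 can be illustrated on a small hypothetical polynomial (not the slide's example): pick a positive definite Gram matrix V, read off the polynomial it represents, and recover the explicit sum-of-squares form from the Cholesky factor.

```python
import numpy as np

# Basis v(x) = [1, x1, x2] and a hypothetical Gram matrix V (positive definite).
V = np.array([[1.0, 1.0, 0.0],
              [1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])

def f(x1, x2):
    """f(x) = v(x)^T V v(x) = 2*x1^2 + 2*x2^2 - 2*x1*x2 + 2*x1 + 1."""
    v = np.array([1.0, x1, x2])
    return v @ V @ v

L = np.linalg.cholesky(V)   # V = L L^T, so f = ||L^T v(x)||^2: a sum of squares

def f_sos(x1, x2):
    """The same polynomial written explicitly as a sum of squares."""
    v = np.array([1.0, x1, x2])
    return np.sum((L.T @ v) ** 2)
```

Since V ⪰ 0 certifies f ∈ SOS, f is nonnegative everywhere; the SDP above searches for such a V (and the largest shift q) instead of being handed one.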
![Page 38: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/38.jpg)
SOS for constrained POPs
possible to extend this method for constrained POPs
by use of the generalized Lagrange dual
![Page 39: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/39.jpg)
SOS relaxation summary

So we know how to solve a POP that is an SOS, and we have a bound on a POP that is not an SOS.

POP → (SOS relaxation) → SOS problem → SDP → global estimate

H. Waki, S. Kim, M. Kojima, and M. Muramatsu. SOS and SDP relaxations for POPs with structured sparsity. SIAM J. Optimization, 2006.
![Page 40: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/40.jpg)
relaxations

SOS: POP → (SOS relaxation) → SOS problem → SDP → global estimate
LMI: POP → (LMI relaxation) → linear & LMI problem → SDP → global estimate + convergence
![Page 41: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/41.jpg)
outline
- Motivation and Introduction
- Background: Positive SemiDefinite matrices (PSD), Linear Matrix Inequalities (LMI), SemiDefinite Programming (SDP)
- Relaxations: Sum Of Squares (SOS) relaxation, Linear Matrix Inequalities (LMI) relaxation
- Application in vision: Finding optimal structure; Partial relaxation and Schur's complement
![Page 42: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/42.jpg)
LMI relaxations
Constraints are handled
Convergence to optimum is guaranteed
Applies to all polynomials, not only SOS
![Page 43: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/43.jpg)
A maximization problem

max g0(x) = x2
s.t. g1(x) = 3 − 2x2 − x1² − x2² ≥ 0
     g2(x) = −x1 − x2 − x1x2 ≥ 0
     g3(x) = 1 + x1x2 ≥ 0

Note that: a. the feasible set is non-convex; b. the constraints are quadratic (figure: feasible set)
![Page 44: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/44.jpg)
LMI – linear matrix inequality, a reminder

A(x) = A0 + x1 A1 + ... + xn An ⪰ 0
where x = (x1, ..., xn) ∈ Rn and the Ai are k×k symmetric matrices

The feasible set K = {x ∈ Rn | A(x) ⪰ 0} is convex.
![Page 45: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/45.jpg)
Goal

Motivation: an SDP,
min cT x s.t. A0 + x1 A1 + ... + xn An ⪰ 0, Mx ≤ b

Polynomial Optimization Problem → SDP with solution close to the global optimum of the original problem

What is it good for? SDP problems can be solved much more efficiently than general optimization problems.
![Page 46: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/46.jpg)
LMI relaxation is an iterative process:

Step 1: introduce new variables
Step 2: relax constraints

POP → linear + LMI + rank constraints → SDP, applying higher order relaxations
![Page 47: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/47.jpg)
LMI relaxation – step 1

Replace monomials by "lifting variables". Rule: x1^i x2^j → yij (the R2 case)

Example: [1, x1, x2, x1², x1x2, x2²] → [1, y10, y01, y20, y11, y02]

g2(x) = −x1 − x2 − x1x2 ≥ 0 → g2(y) = −y10 − y01 − y11 ≥ 0
![Page 48: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/48.jpg)
Lifting: introducing lifting variables

original problem:
max g0(x) = x2
s.t. g1(x) = 3 − 2x2 − x1² − x2² ≥ 0
     g2(x) = −x1 − x2 − x1x2 ≥ 0
     g3(x) = 1 + x1x2 ≥ 0

after lifting:
max g0(y) = y01
s.t. g1(y) = 3 − 2y01 − y20 − y02 ≥ 0
     g2(y) = −y10 − y01 − y11 ≥ 0
     g3(y) = 1 + y11 ≥ 0
![Page 49: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/49.jpg)
Not equivalent to the original problem.
The lifting variables are not independent in the original problem:
y10 = x1, y01 = x2, y11 = x1x2, y20 = x1², ... so e.g. y11 = y10 · y01.
The new problem is linear, in particular convex.
![Page 50: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/50.jpg)
Goal, more specifically

Linear problem (obtained by lifting) + "relations constraints" on the lifting variables → SDP

(For example, we demand: y11 = y10 · y01)

Relaxation
![Page 51: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/51.jpg)
Question: how do we guarantee that the relations between the lifting variables hold?

y11 = y10 · y01
y20 = y10 · y10, and so on...
![Page 52: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/52.jpg)
LMI relaxation – step 2

Take the basis of the degree 1 polynomials: v1(x) = [1, x1, x2]T.

Note that:
v1(x) v1(x)T = [1 x1 x2; x1 x1² x1x2; x2 x1x2 x2²] ⪰ 0
because A = v vT ⟹ xT A x = (vT x)² ≥ 0, i.e. A is positive semidefinite.

Apply lifting and get:
M = [1 y10 y01; y10 y20 y11; y01 y11 y02] ⪰ 0
![Page 53: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/53.jpg)
If the relations constraints hold, then
M = [1 y10 y01; y10 y20 y11; y01 y11 y02] ⪰ 0 and rank M = 1.

This is because, assuming the relations hold (y20 = y10², y11 = y10 y01, and so on), we can decompose M as follows:
M = [1, y10, y01]T [1, y10, y01]
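The rank-1 decomposition can be verified by constructing M from a concrete x, so that all relations hold by construction; the values below are arbitrary:

```python
import numpy as np

def moment_matrix_order1(x1, x2):
    """M1(y) = v(x) v(x)^T with v(x) = [1, x1, x2]; its entries are the
    lifting variables y_ij = x1^i * x2^j, so the relations hold exactly."""
    v = np.array([1.0, x1, x2])
    return np.outer(v, v)

M = moment_matrix_order1(0.5, -1.5)
rank = np.linalg.matrix_rank(M)        # 1: the relations hold
min_eig = np.linalg.eigvalsh(M).min()  # nonnegative: M is PSD
```

Entry M[1, 2] equals y11 = y10 · y01 = M[0, 1] · M[0, 2], which is exactly a relations constraint.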
![Page 54: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/54.jpg)
We've seen:
relations constraints hold ⟹ M = [1 y10 y01; y10 y20 y11; y01 y11 y02] ⪰ 0, rank M = 1

What about the opposite:
M ⪰ 0, rank M = 1 ⟹ relations constraints hold?

This is true as well.
![Page 55: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/55.jpg)
LMI relaxation – step 2, continued

By the following:
M ⪰ 0 and rank M = 1 ⟹ M = v vT
and all the relations equalities are contained in the set of equalities Mij = [v vT]ij.
![Page 56: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/56.jpg)
Conclusion of the analysis

The lifted linear problem together with
M = [1 y10 y01; y10 y20 y11; y01 y11 y02] ⪰ 0, rank M = 1
restricts the "y feasible set" to the subset where the relations constraints hold.
![Page 57: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/57.jpg)
The original problem is equivalent to the following:

max g0(y) = y01
s.t. g1(y) = 3 − 2y01 − y20 − y02 ≥ 0
     g2(y) = −y10 − y01 − y11 ≥ 0
     g3(y) = 1 + y11 ≥ 0
     M1(y) ⪰ 0

together with the additional constraint rank M1(y) = 1, where we denote by
M1(y) = [1 y10 y01; y10 y20 y11; y01 y11 y02]
the moment matrix of order 1.

Relaxation, at last: relax by dropping the non-convex constraint rank M1(y) = 1.
![Page 58: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/58.jpg)
LMI relaxation of order 1

max g0(y) = y01
s.t. g1(y) = 3 − 2y01 − y20 − y02 ≥ 0
     g2(y) = −y10 − y01 − y11 ≥ 0
     g3(y) = 1 + y11 ≥ 0
     M1(y) ⪰ 0

(figure: feasible set)
![Page 59: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/59.jpg)
Rank constrained LMI vs. unconstrained

X = [1 x; x y]
![Page 60: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/60.jpg)
LMI relaxations of higher order
It turns out that we can do better:
Apply LMI relaxations of higher order
A tighter SDP
Relaxations of higher order incorporate the inequality constraints in LMI
• We show relaxation of order 2
• It is possible to continue and apply relaxations
• Theory guarantees convergence to global optimum
![Page 61: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/61.jpg)
LMI relaxations of second order

Let v2(x) = [1, x1, x2, x1², x1x2, x2²]T be a basis of the polynomials of degree ≤ 2.
Again, v2(x) v2(x)T ⪰ 0.
Lifting gives the moment matrix of order 2, M2(y) ⪰ 0.
Again, we will relax by dropping the rank constraint.
![Page 62: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/62.jpg)
Inequality constraints to LMI: replace our constraints by LMIs and obtain a tighter relaxation.

Linear constraint: g3(x) = 1 + x1x2 ≥ 0 → (lifting) → g3(y) = 1 + y11 ≥ 0
LMI constraint: g3(x) v1(x) v1(x)T ⪰ 0 → (lifting) → M1(g3 y) ⪰ 0

For example, the entries of M1(g3 y) are liftings of the entries of (1 + x1x2) v1(x) v1(x)T: the (1,1) entry is 1 + y11, and the (1,2) entry, the lifting of x1 + x1²x2, is y10 + y21.
![Page 63: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/63.jpg)
LMI relaxations of order 2

This procedure gives a new SDP:

max y01
s.t. M1(g1 y) ⪰ 0, M1(g2 y) ⪰ 0, M1(g3 y) ⪰ 0
     M2(y) ⪰ 0

The second SDP's feasible set is included in the first SDP's feasible set.
Similarly, we can continue and apply higher order relaxations.
![Page 64: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/64.jpg)
Theoretical basis for the LMI relaxations

If the feasible set defined by the constraints gi(x) ≥ 0 is compact, then under mild additional assumptions, Lasserre proved in 2001 that there is an asymptotic convergence guarantee:

lim_{k→∞} pk = g

where pk is the solution to the k'th relaxation and g is the solution of the original problem (finding a maximum).

Moreover, convergence is fast: pk is very close to g for small k.

Lasserre J.B. (2001) "Global optimization with polynomials and the problem of moments", SIAM J. Optimization 11, pp. 796–817.
![Page 65: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/65.jpg)
Checking global optimality

The method provides a certificate of global optimality:
rank Mn(y) = 1 ⟹ the SDP solution is the global optimum

An important experimental observation: minimizing trace(Mn(y)) yields a low rank moment matrix.
![Page 66: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/66.jpg)
We add to the objective function the trace of the moment matrix, weighted by a sufficiently small positive scalar ε:

max y01 − ε · trace(M2(y))
s.t. M1(g1 y) ⪰ 0, M1(g2 y) ⪰ 0, M1(g3 y) ⪰ 0
     M2(y) ⪰ 0
![Page 67: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/67.jpg)
LMI relaxations in vision
Application
![Page 68: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/68.jpg)
outline
- Motivation and Introduction
- Background: Positive SemiDefinite matrices (PSD), Linear Matrix Inequalities (LMI), SemiDefinite Programming (SDP)
- Relaxations: Sum Of Squares (SOS) relaxation, Linear Matrix Inequalities (LMI) relaxation
- Application in vision: Finding optimal structure; Partial relaxation and Schur's complement
![Page 69: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/69.jpg)
Finding optimal structure

A perspective camera (figure: camera center, image plane): the relation between a point U in 3D space and a point u in the image plane is given by

λ u = P U

where P is the camera matrix and λ is the depth; u is the reprojected point and ũ is the measured point.

The measured image points ũi, i = 1..N, are corrupted by independent Gaussian noise. We want to minimize the least squares error between the measured and reprojected points.
![Page 70: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/70.jpg)
We therefore have the following optimization problem:

min Σ_{i=1}^N d(ũi, ui(x))²  s.t.  λi(x) > 0

where d(·, ·) is the Euclidean distance and x is the set of unknowns.

Each term in the cost function can be written as:

d(ũi, ui(x))² = (fi1(x)² + fi2(x)²) / λi(x)²

where fi1(x), fi2(x), λi(x) are polynomials.

Our objective is therefore to minimize a sum of rational functions.
![Page 71: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/71.jpg)
How can we turn the optimization problem into a polynomial optimization problem?

Suppose that each term in Σ_{i=1}^N d(ũi, ui(x))² has an upper bound γi:

(fi1(x)² + fi2(x)²) / λi(x)² ≤ γi

Then our optimization problem is equivalent to the following:

min γ1 + ... + γN
s.t. fi1(x)² + fi2(x)² ≤ γi λi(x)²
     λi(x) > 0, i = 1..N

This is a polynomial optimization problem, to which we apply LMI relaxations.
Note that we introduced many new variables – one for each term.
Problem: an SDP with a large number of variables can be computationally demanding.
A large number of variables can arise from:
LMI relaxations of high order
Introduction of new variables as we’ve seen
This is where partial relaxations come in.
For that we introduce Schur’s complement.
Partial Relaxations
![Page 73: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/73.jpg)
Schur's complement

[A B; BT C] ⪰ 0, A ≻ 0  ⟺  C − BT A⁻¹ B ⪰ 0, A ≻ 0

Set:
A = [λi(x)² 0; 0 λi(x)²], B = [fi1(x); fi2(x)], C = γi
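The equivalence can be sanity-checked numerically on concrete blocks; the values below are toy numbers, and `psd` symmetrizes its input to absorb round-off:

```python
import numpy as np

def psd(M, tol=1e-9):
    """Numerical PSD test via the smallest eigenvalue."""
    return np.linalg.eigvalsh((M + M.T) / 2).min() >= -tol

def both_sides(A, B, C):
    """Return (left, right) of: [[A, B], [B^T, C]] >= 0  <=>  C - B^T A^-1 B >= 0,
    valid when A is positive definite."""
    block = np.block([[A, B], [B.T, C]])
    complement = C - B.T @ np.linalg.solve(A, B)
    return psd(block), psd(complement)

A = 2.0 * np.eye(2)
B = np.array([[1.0], [0.0]])
ok_block, ok_compl = both_sides(A, B, np.array([[1.0]]))     # both True
bad_block, bad_compl = both_sides(A, B, np.array([[0.25]]))  # both False
```

Both tests agree in each case, as the equivalence predicts: with C = 1 the complement is 1 − 1/2 = 1/2 ≥ 0, while with C = 1/4 it is −1/4 < 0.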
![Page 74: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/74.jpg)
Schur's complement – applying

[A B; BT C] ⪰ 0, A ≻ 0  ⟺  C − BT A⁻¹ B ⪰ 0, A ≻ 0

[λi(x)²  0  fi1(x);  0  λi(x)²  fi2(x);  fi1(x)  fi2(x)  γi] ⪰ 0, λi(x) > 0
⟺
γi λi(x)² − (fi1(x)² + fi2(x)²) ≥ 0, λi(x) > 0

Derivation of the right side: C − BT A⁻¹ B ≥ 0.
![Page 75: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/75.jpg)
Schur's complement allows us to state our optimization problem as follows:

min γ1 + ... + γN
s.t. [λi(x)²  0  fi1(x);  0  λi(x)²  fi2(x);  fi1(x)  fi2(x)  γi] ⪰ 0
     λi(x) > 0, i = 1..N

Partial relaxations

The only non-linearity is due to λi(x)². We can apply LMI relaxations only on x = [x1, x2, ...]T and leave γi, i = 1..N, unrelaxed.
If we were to apply full relaxations to all variables, the problem would become intractable even for small N.
![Page 76: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/76.jpg)
Partial relaxations
Disadvantage of partial relaxations: we are not able to ensure asymptotic convergence to the global optimum.
However, we have a numerical certificate of global optimality just as in the case of full relaxations:
If the moment matrix of the relaxed variables is of rank one, then the solution of the partially relaxed problem is the global optimum.
![Page 77: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/77.jpg)
Full relaxation vs. partial relaxation
Application: Triangulation, 3 cameras
Goal: find the optimal 3D point. Camera matrices are known, measured point is assumed to be in the origin of each view.
Camera matrices:
![Page 78: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/78.jpg)
Summary
• Geometric vision problems to POPs
  – Triangulation and reconstruction problems
• Relaxations of POPs
  – Sum Of Squares (SOS) relaxation
    • Guarantees a bound on the optimal solution
    • Usually the solution is optimal
  – Linear Matrix Inequalities (LMI) relaxation
    • First order LMI relaxation: lifting, dropping the rank constraint
    • Higher order LMI relaxation: linear constraints to LMIs
    • Guarantee of convergence (Lasserre)
    • Certificate of global optimality
• Application in vision
  – Finding optimal structure
  – Partial relaxation and Schur's complement
  – Triangulation problem, benefit of partial relaxations
![Page 79: Globally Optimal Estimates for Geometric Reconstruction Problems](https://reader035.fdocuments.us/reader035/viewer/2022062422/5681417b550346895dad6a8d/html5/thumbnails/79.jpg)
References

F. Kahl and D. Henrion. Globally Optimal Estimates for Geometric Reconstruction Problems. Accepted, IJCV.
H. Waki, S. Kim, M. Kojima, and M. Muramatsu. Sums of squares and semidefinite programming relaxations for polynomial optimization problems with structured sparsity. SIAM J. Optimization, 17(1):218–242, 2006.
J. B. Lasserre. Global optimization with polynomials and the problem of moments. SIAM J. Optimization, 11:796–817, 2001.
S. Boyd and L. Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
R. I. Hartley and A. Zisserman. Multiple View Geometry in Computer Vision. Second edition, Cambridge University Press, 2004.