3D Sensing

3D Sensing:

• 3D Shape from X
• Perspective Geometry
• Camera Model
• Camera Calibration
• General Stereo Triangulation
• 3D Reconstruction

3D Shape from X

Mainly research:
• shading
• silhouette
• texture

Used in practice:
• stereo
• light striping
• motion

Perspective Imaging Model (1D)

(Figure: a camera lens with center of projection O. The real image plane lies behind O; the front image plane, which we use, lies at focal distance f in front of O along the optical axis zc. A 3D object point B projects along a ray through O to a real image point behind the lens and to an image of B in the front image plane.)

xi / f = xc / zc

Perspective in 2D (Simplified)

(Figure: camera frame with axes Xc, Yc, and optical axis Zc. The 3D object point P = (xc, yc, zc) = (xw, yw, zw) projects along a ray to P′ = (xi, yi, f) on the front image plane at focal distance f.)

xi / f = xc / zc        yi / f = yc / zc

so

xi = (f / zc) xc
yi = (f / zc) yc

Here camera coordinates equal world coordinates.
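The 2D perspective equations above can be sketched as a small function (a minimal illustration; the function name and values are mine, not from the slides):

```python
def project_perspective(xc, yc, zc, f):
    """Project a camera-frame 3D point onto the front image plane:
    xi = (f/zc)*xc, yi = (f/zc)*yc."""
    if zc == 0:
        raise ValueError("point is in the plane of the center of projection")
    return (f / zc) * xc, (f / zc) * yc

# Doubling the depth halves the image coordinates:
print(project_perspective(2.0, 1.0, 10.0, 5.0))   # (1.0, 0.5)
print(project_perspective(2.0, 1.0, 20.0, 5.0))   # (0.5, 0.25)
```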

3D from Stereo

(Figure: a 3D point projects into corresponding locations in a left image and a right image.)

Disparity: the difference in image location of the same 3D point when projected under perspective to two different cameras.

d = xleft − xright

Depth Perception from Stereo: Simple Model, Parallel Optic Axes

(Figure: two cameras L and R with focal length f, their optical axes parallel and separated by baseline b along the X axis. A point P = (x, z) projects to xl on the left image plane and xr on the right image plane; the y-axis is perpendicular to the page.)

z / f = x / xl
z / f = (x − b) / xr
z / f = y / yl = y / yr

Resultant Depth Calculation

For stereo cameras with parallel optical axes, focal length f, baseline b, corresponding image points (xl, yl) and (xr, yr), and disparity d:

z = f b / (xl − xr) = f b / d
x = xl z / f   or   b + xr z / f
y = yl z / f   or   yr z / f

This method of determining depth from disparity is called triangulation.
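The depth calculation above can be coded directly (a minimal sketch; the function name and the numbers in the example are mine):

```python
def stereo_depth(xl, yl, xr, f, b):
    """3D position from a parallel-axis stereo pair:
    z = f*b/(xl - xr), then x and y follow from the perspective model."""
    d = xl - xr                       # disparity
    if d <= 0:
        raise ValueError("need positive disparity for a point in front of the cameras")
    z = f * b / d
    x = xl * z / f                    # equivalently: b + xr * z / f
    y = yl * z / f
    return x, y, z

# f = 1, baseline b = 0.1, disparity 0.01  ->  depth z = 10
x, y, z = stereo_depth(0.02, 0.01, 0.01, 1.0, 0.1)
print(round(x, 6), round(y, 6), round(z, 6))
```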

Finding Correspondences

• If the correspondence is correct, triangulation works VERY well.
• But correspondence finding is not perfectly solved. (What methods have we studied?)
• For some very specific applications it can be solved for those specific kinds of images, e.g. the windshield of a car.

3 Main Matching Methods

1. Cross correlation using small windows (dense)
2. Symbolic feature matching, usually using segments/corners (sparse)
3. Use the newer interest operators, e.g. SIFT (sparse)
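Method 1 can be sketched as window matching along a scanline. This toy version uses a sum-of-squared-differences cost rather than true normalized cross correlation, and the 1D "images" are made up for illustration:

```python
def best_disparity(left, right, col, w, max_d):
    """Match a (2w+1)-pixel window around `col` on one scanline of the
    left image against shifted windows in the right image, using sum of
    squared differences; return the disparity with the lowest cost."""
    patch = left[col - w : col + w + 1]
    best, best_cost = 0, float("inf")
    for d in range(max_d + 1):
        if col - d - w < 0:
            break                     # window would fall off the image
        cand = right[col - d - w : col - d + w + 1]
        cost = sum((a - b) ** 2 for a, b in zip(patch, cand))
        if cost < best_cost:
            best, best_cost = d, cost
    return best

# The bright feature at left[6] appears at right[4]: disparity 2.
left  = [0, 0, 0, 0, 0, 1, 9, 1, 0, 0]
right = [0, 0, 0, 1, 9, 1, 0, 0, 0, 0]
print(best_disparity(left, right, 6, 1, 4))   # 2
```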

Epipolar Geometry Constraint: 1. Normal Pair of Images

(Figure: two cameras with centers C1 and C2 separated by baseline b, image axes (y1, z1) and (y2, z2), and a common x axis. A 3D point P projects to P1 and P2; P, C1, and C2 define the epipolar plane.)

The epipolar plane cuts through the image plane(s), forming 2 epipolar lines.

The match for P1 (or P2) in the other image must lie on the same epipolar line.

Epipolar Geometry: General Case

(Figure: a 3D point P projects to P1 and P2 in two images with axes (x1, y1) and (x2, y2); e1 and e2 are the epipoles, where the line through the camera centers C1 and C2 pierces the two image planes.)

Constraints

(Figure: two 3D points P and Q project into both images; e1 and e2 are the epipoles for camera centers C1 and C2.)

1. Epipolar Constraint: Matching points lie on corresponding epipolar lines.
2. Ordering Constraint: Points usually appear in the same order along corresponding epipolar lines.

Structured Light

3D data can also be derived using:

• a single camera
• a light source that can produce stripe(s) on the 3D object

(Figure: a light source casts a light stripe across the 3D object, which is viewed by a camera.)

Structured Light: 3D Computation

(Figure: the camera sits at the origin (0, 0, 0) with focal length f along the x axis; an image point (x′, y′, f) back-projects to the 3D point (x, y, z), which is lit by a light plane at angle θ from a source at baseline distance b.)

                  b
[x, y, z] = ------------- [x′, y′, f]
            f cot θ − x′
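The stripe equation above can be evaluated directly. Here θ is the light-plane angle and b the baseline, with made-up values for illustration:

```python
import math

def stripe_point(x_img, y_img, f, b, theta):
    """[x, y, z] = (b / (f*cot(theta) - x')) * [x', y', f] for an image
    point (x', y', f) lying on a light stripe at angle theta."""
    k = b / (f / math.tan(theta) - x_img)
    return k * x_img, k * y_img, k * f

# theta = 45 degrees, so cot(theta) = 1 and k = b / (f - x'):
x, y, z = stripe_point(1.0, 0.5, 2.0, 1.0, math.pi / 4)
print(round(x, 6), round(y, 6), round(z, 6))
```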

Depth from Multiple Light Stripes

(Figure: range data computed from multiple light stripes.) What are these objects?

Our (former) System: 4-camera light-striping stereo

(Figure: a projector casts light stripes onto a 3D object on a rotation table, viewed by four cameras.)

Camera Model: Recall There Are 5 Different Frames of Reference

• Object
• World
• Camera
• Real Image
• Pixel Image

(Figure: a pyramid object with object frame A (axis a), world frame W (xw, yw, zw), camera frame C (xc, yc, zc), real image coordinates (xf, yf), and pixel image coordinates (xp, yp).)

The Camera Model

How do we get an image point IP from a world point P?

⎡ s·IPr ⎤   ⎡ c11 c12 c13 c14 ⎤ ⎡ Px ⎤
⎢ s·IPc ⎥ = ⎢ c21 c22 c23 c24 ⎥ ⎢ Py ⎥
⎣ s     ⎦   ⎣ c31 c32 c33  1  ⎦ ⎢ Pz ⎥
                                ⎣ 1  ⎦

image point = camera matrix C × world point

What's in C?
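Applying the camera matrix is one matrix-vector product followed by division by s. The matrix below is a made-up example with c34 = 1, not a calibrated one:

```python
def project(C, P):
    """Image point (r, c) from world point P = (Px, Py, Pz):
    (s*r, s*c, s) = C @ (Px, Py, Pz, 1), then divide through by s."""
    Ph = (P[0], P[1], P[2], 1.0)
    sr, sc, s = (sum(row[j] * Ph[j] for j in range(4)) for row in C)
    return sr / s, sc / s

# Illustrative 3x4 camera matrix with c34 = 1:
C = [[1.0, 0.0, 0.0, 0.0],
     [0.0, 1.0, 0.0, 0.0],
     [0.0, 0.0, 1.0, 1.0]]
print(project(C, (2.0, 4.0, 1.0)))   # (1.0, 2.0)
```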

The camera model handles the rigid body transformation from world coordinates to camera coordinates, plus the perspective transformation to image coordinates:

1. CP = T R WP
2. FP = the perspective transformation (focal length f) applied to CP:

⎡ s·FPx ⎤   ⎡ 1 0  0   0 ⎤ ⎡ CPx ⎤
⎢ s·FPy ⎥ = ⎢ 0 1  0   0 ⎥ ⎢ CPy ⎥
⎢ s·FPz ⎥   ⎢ 0 0  1   0 ⎥ ⎢ CPz ⎥
⎣ s     ⎦   ⎣ 0 0 1/f  0 ⎦ ⎣ 1   ⎦

This takes an image point from a 3D point in camera coordinates. Why is there not a scale factor here?

Camera Calibration

• In order to work in 3D, we need to know the parameters of the particular camera setup.
• Solving for the camera parameters is called calibration.

(Figure: world frame W (xw, yw, zw) and camera frame C (xc, yc, zc).)

• Intrinsic parameters are properties of the camera device.
• Extrinsic parameters describe where the camera sits in the world.

Intrinsic Parameters

• principal point (u0, v0)
• scale factors (dx, dy)
• aspect ratio distortion factor
• focal length f
• lens distortion factor (models radial lens distortion)

(Figure: camera center C, principal point (u0, v0), focal length f.)

Extrinsic Parameters

• translation parameters t = [tx ty tz]
• rotation matrix

        ⎡ r11 r12 r13 0 ⎤
    R = ⎢ r21 r22 r23 0 ⎥
        ⎢ r31 r32 r33 0 ⎥
        ⎣  0   0   0  1 ⎦

Are there really nine parameters?
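The answer the question hints at is no: the nine entries of R are constrained to be orthonormal (R·Rᵀ = I), leaving only three free angles. A small check, with an illustrative z-axis rotation:

```python
import math

def rotation_z(a):
    """3x3 rotation about the z axis by angle a."""
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def is_orthonormal(R, tol=1e-9):
    """True if R @ R.T is the identity -- the constraint that removes
    six of the nine apparent parameters."""
    n = len(R)
    for i in range(n):
        for j in range(n):
            dot = sum(R[i][k] * R[j][k] for k in range(n))
            if abs(dot - (1.0 if i == j else 0.0)) > tol:
                return False
    return True

print(is_orthonormal(rotation_z(0.7)))                  # True
print(is_orthonormal([[2.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0],
                      [0.0, 0.0, 1.0]]))                # False: row scaled
```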

Calibration Object

The idea is to snap images at different depths and get a lot of 2D-3D point correspondences.

The Tsai Procedure

• The Tsai procedure, developed by Roger Tsai at IBM Research, is the most widely used.
• Several images are taken of the calibration object, yielding point correspondences at different distances.
• Tsai's algorithm requires n > 5 correspondences

  { ((xi, yi, zi), (ui, vi)) | i = 1, …, n }

between 3D points and (real) image points.

In this version of Tsai's algorithm:

• The real-valued (u, v) are computed from their pixel positions (r, c):

  u = dx (c − u0),   v = −dy (r − v0)

where

  - (u0, v0) is the center of the image,
  - dx and dy are the center-to-center (real) distances between pixels and come from the camera's specs, and
  - a scale factor, learned from previous trials, is also applied.

This version is for single-plane calibration.

Tsai's Procedure

1. Given the n point correspondences ((xi, yi, zi), (ui, vi)), compute the matrix A with rows ai:

   ai = (vi·xi,  vi·yi,  −ui·xi,  −ui·yi,  vi)

These are known quantities. They will be used to solve for intermediate values, which will then be used to solve for the parameters sought.

Intermediate Unknowns

2. The vector of unknowns is μ = (μ1, μ2, μ3, μ4, μ5), where

   μ1 = r11/ty,  μ2 = r12/ty,  μ3 = r21/ty,  μ4 = r22/ty,  μ5 = tx/ty

and the r's and t's are unknown rotation and translation parameters.

3. Let the vector b = (u1, u2, …, un) contain the u image coordinates.

4. Solve the system of linear equations

   A μ = b

for the unknown parameter vector μ.
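A synthetic sanity check of steps 1-4 (the rotation, translation, and image values below are made up): for a planar target, each correspondence satisfies the linear constraint encoded by the rows ai, so A μ = b holds exactly for the true parameters:

```python
import math

# Hypothetical ground-truth rotation/translation parameters:
a = 0.3
r11, r12 = math.cos(a), -math.sin(a)
r21, r22 = math.sin(a),  math.cos(a)
tx, ty = 0.4, 1.5

mu = (r11 / ty, r12 / ty, r21 / ty, r22 / ty, tx / ty)   # step 2

for x, y in [(0.0, 1.0), (2.0, -1.0), (1.5, 0.7)]:
    # Image coordinates radially aligned with the transformed point:
    u = r11 * x + r12 * y + tx
    v = r21 * x + r22 * y + ty
    ai = (v * x, v * y, -u * x, -u * y, v)               # row of A, step 1
    lhs = sum(a_j * m_j for a_j, m_j in zip(ai, mu))
    assert abs(lhs - u) < 1e-12                          # A mu = b, with b = (..., ui, ...)
print("each row of A mu reproduces the corresponding u")
```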

Use μ to solve for ty, tx, and 4 rotation parameters

5. Let U = μ1² + μ2² + μ3² + μ4². Use U to calculate ty².

6. Try the positive square root ty = (ty²)^(1/2) and use it to compute translation and rotation parameters:

   r11 = μ1 ty
   r12 = μ2 ty
   r21 = μ3 ty
   r22 = μ4 ty
   tx = μ5 ty

Now we know 2 translation parameters and 4 rotation parameters, except…

Determine the true sign of ty and compute the remaining rotation parameters

7. Select an object point P whose image coordinates (u, v) are far from the image center.

8. Use P's 3D coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P. If its coordinates have the same signs as (u, v), keep ty; else negate it.

9. Use the first 4 rotation parameters to calculate the remaining 5.

Calculating the remaining 5 rotation parameters: solve another linear system

10. We have tx, ty, and the 9 rotation parameters. The next step is to find tz and f. Form a matrix A′ whose rows are

    ai′ = (r21·xi + r22·yi + ty,  vi)

and a vector b′ whose rows are

    bi′ = (r31·xi + r32·yi) · vi

11. Solve A′v = b′ for v = (f, tz).

Almost there

12. If f is negative, change signs (see text).

13. Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations via nonlinear regression.

14. All parameters have been computed. Use them in 3D data acquisition systems.

We use them for general stereo.

(Figure: a 3D point P projects to P1 = (r1, c1) in image 1 and P2 = (r2, c2) in image 2; the two calibrated cameras have camera matrices B and C, with epipoles e1 and e2.)

For a correspondence (r1, c1) in image 1 to (r2, c2) in image 2:

1. Both cameras were calibrated, so both camera matrices are known. From the two camera equations B and C we get 4 linear equations in 3 unknowns:

r1 = (b11 − b31·r1)·x + (b12 − b32·r1)·y + (b13 − b33·r1)·z
c1 = (b21 − b31·c1)·x + (b22 − b32·c1)·y + (b23 − b33·c1)·z
r2 = (c11 − c31·r2)·x + (c12 − c32·r2)·y + (c13 − c33·r2)·z
c2 = (c21 − c31·c2)·x + (c22 − c32·c2)·y + (c23 − c33·c2)·z

A direct solution using only 3 of the equations won't give reliable results.

Solve by computing the closest approach of the two skew rays

(Figure: one ray starts at P1 with unit direction u1, the other at Q1 with unit direction u2; V is the shortest segment connecting them, and P is its midpoint.)

If the rays intersected perfectly in 3D, the intersection would be P. Instead, we solve for the shortest line segment connecting the two rays and let P be its midpoint.

V = (P1 + a1·u1) − (Q1 + a2·u2)

((P1 + a1·u1) − (Q1 + a2·u2)) · u1 = 0
((P1 + a1·u1) − (Q1 + a2·u2)) · u2 = 0
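Solving the two perpendicularity conditions above for a1 and a2 gives a closed form; here is a small sketch (the example rays are made up and happen to intersect exactly):

```python
def closest_point(P1, u1, Q1, u2):
    """Midpoint of the shortest segment between rays P1 + a1*u1 and
    Q1 + a2*u2 (u1, u2 unit vectors), from V . u1 = 0 and V . u2 = 0."""
    dot = lambda p, q: sum(x * y for x, y in zip(p, q))
    w = [p - q for p, q in zip(P1, Q1)]
    b = dot(u1, u2)
    denom = 1.0 - b * b               # zero only for parallel rays
    a1 = (b * dot(w, u2) - dot(w, u1)) / denom
    a2 = (dot(w, u2) - b * dot(w, u1)) / denom
    p = [P1[i] + a1 * u1[i] for i in range(3)]
    q = [Q1[i] + a2 * u2[i] for i in range(3)]
    return [(pi + qi) / 2.0 for pi, qi in zip(p, q)]

# Two rays that happen to meet exactly at (1, 1, 0):
print(closest_point([0.0, 1.0, 0.0], [1.0, 0.0, 0.0],
                    [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))   # [1.0, 1.0, 0.0]
```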

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
Page 2: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

3D Shape from X

bull shadingbull silhouettebull texture

bull stereo bull light stripingbull motion

mainly research

used in practice

Perspective Imaging Model 1D

xi

xf

f

This is the axis of the real image plane

O O is the center of projection

This is the axis of the frontimage plane which we usezc

xc

xi xc

f zc

=

camera lens

3D objectpoint

B

D

E

image of pointB in front image

real imagepoint

Perspective in 2D(Simplified)

P=(xcyczc) =(xwywzw)

3D object point

xc

yc

zw=zc

yi

Yc

Xc

Zc

xi

F f

cameraPacute=(xiyif)

xi xc

f zc

yi yc

f zc

=

=

xi = (fzc)xc

yi = (fzc)ycHere camera coordinatesequal world coordinates

opticalaxis

ray

3D from Stereo

left image right image

3D point

disparity the difference in image location of the same 3Dpoint when projected under perspective to two different cameras

d = xleft - xright

Depth Perception from StereoSimple Model Parallel Optic Axes

f

f

L

R

camera

baselinecamera

b

P=(xz)

Z

X

image plane

xl

xr

z

z xf xl

=

x-b

z x-bf xr

= z y yf yl yr

= =y-axis is

perpendicularto the page

Resultant Depth Calculation

For stereo cameras with parallel optical axes focal length fbaseline b corresponding image points (xlyl) and (xryr)with disparity d

z = fb (xl - xr) = fbd

x = xlzf or b + xrzf

y = ylzf or yrzf

This method ofdetermining depthfrom disparity is called triangulation

Finding Correspondences

bull If the correspondence is correct triangulation works VERY well

bull But correspondence finding is not perfectly solved (What methods have we studied)

bull For some very specific applications it can be solved for those specific kind of images eg windshield of a car

deg deg

3 Main Matching Methods

1 Cross correlation using small windows

2 Symbolic feature matching usually using segmentscorners

3 Use the newer interest operators ie SIFT

dense

sparse

sparse

Epipolar Geometry Constraint1 Normal Pair of Images

x

y1

y2

z1 z2

C1 C2b

P

P1 P2

epipolarplane

The epipolar plane cuts through the image plane(s)forming 2 epipolar lines

The match for P1 (or P2) in the other image must lie on the same epipolar line

Epipolar GeometryGeneral Case

P

P1

P2

y1

y2

x1

x2

e1

e2

C1

C2

Constraints

P

e1

e2

C1

C2

1 Epipolar Constraint Matching points lie on corresponding epipolar lines

2 Ordering Constraint Usually in the same order across the lines

Q

Structured Light

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

camera

light stripe

Structured Light3D Computation

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

x axisf

(xacuteyacutef)

3D point(x y z)

b

b[x y z] = --------------- [xacute yacute f] f cot - xacute

(000)

3D image

Depth from Multiple Light Stripes

What are these objects

Our (former) System4-camera light-striping stereo

projector

rotationtable

cameras

3Dobject

Camera Model Recall there are 5 Different Frames of Reference

bull Object

bull World

bull Camera

bull Real Image

bull Pixel Image

yc

xc

zc

zwC

Wyw

xw

A

a

xf

yf

xp

yp

zppyramidobject

image

The Camera Model

How do we get an image point IP from a world point P

c11 c12 c13 c14

c21 c22 c23 c24

c31 c32 c33 1

s IPr

s IPc

s

Px

Py

Pz

1

=

imagepoint

camera matrix C worldpoint

Whatrsquos in C

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1 CP = T R WP2 FP = (f) CP

s FPx

s FPy

s FPz

s

1 0 0 00 1 0 00 0 1 00 0 1f 0

CPx

CPy

CPz

1

=

perspectivetransformation

imagepoint

3D point incamera

coordinates

Why is there not a scale factor here

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 3: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Perspective Imaging Model 1D

xi

xf

f

This is the axis of the real image plane

O O is the center of projection

This is the axis of the frontimage plane which we usezc

xc

xi xc

f zc

=

camera lens

3D objectpoint

B

D

E

image of pointB in front image

real imagepoint

Perspective in 2D(Simplified)

P=(xcyczc) =(xwywzw)

3D object point

xc

yc

zw=zc

yi

Yc

Xc

Zc

xi

F f

cameraPacute=(xiyif)

xi xc

f zc

yi yc

f zc

=

=

xi = (fzc)xc

yi = (fzc)ycHere camera coordinatesequal world coordinates

opticalaxis

ray

3D from Stereo

left image right image

3D point

disparity the difference in image location of the same 3Dpoint when projected under perspective to two different cameras

d = xleft - xright

Depth Perception from StereoSimple Model Parallel Optic Axes

f

f

L

R

camera

baselinecamera

b

P=(xz)

Z

X

image plane

xl

xr

z

z xf xl

=

x-b

z x-bf xr

= z y yf yl yr

= =y-axis is

perpendicularto the page

Resultant Depth Calculation

For stereo cameras with parallel optical axes focal length fbaseline b corresponding image points (xlyl) and (xryr)with disparity d

z = fb (xl - xr) = fbd

x = xlzf or b + xrzf

y = ylzf or yrzf

This method ofdetermining depthfrom disparity is called triangulation

Finding Correspondences

bull If the correspondence is correct triangulation works VERY well

bull But correspondence finding is not perfectly solved (What methods have we studied)

bull For some very specific applications it can be solved for those specific kind of images eg windshield of a car

deg deg

3 Main Matching Methods

1 Cross correlation using small windows

2 Symbolic feature matching usually using segmentscorners

3 Use the newer interest operators ie SIFT

dense

sparse

sparse

Epipolar Geometry Constraint1 Normal Pair of Images

x

y1

y2

z1 z2

C1 C2b

P

P1 P2

epipolarplane

The epipolar plane cuts through the image plane(s)forming 2 epipolar lines

The match for P1 (or P2) in the other image must lie on the same epipolar line

Epipolar GeometryGeneral Case

P

P1

P2

y1

y2

x1

x2

e1

e2

C1

C2

Constraints

P

e1

e2

C1

C2

1 Epipolar Constraint Matching points lie on corresponding epipolar lines

2 Ordering Constraint Usually in the same order across the lines

Q

Structured Light

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

camera

light stripe

Structured Light3D Computation

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

x axisf

(xacuteyacutef)

3D point(x y z)

b

b[x y z] = --------------- [xacute yacute f] f cot - xacute

(000)

3D image

Depth from Multiple Light Stripes

What are these objects

Our (former) System4-camera light-striping stereo

projector

rotationtable

cameras

3Dobject

Camera Model Recall there are 5 Different Frames of Reference

bull Object

bull World

bull Camera

bull Real Image

bull Pixel Image

yc

xc

zc

zwC

Wyw

xw

A

a

xf

yf

xp

yp

zppyramidobject

image

The Camera Model

How do we get an image point IP from a world point P

c11 c12 c13 c14

c21 c22 c23 c24

c31 c32 c33 1

s IPr

s IPc

s

Px

Py

Pz

1

=

imagepoint

camera matrix C worldpoint

Whatrsquos in C

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1 CP = T R WP2 FP = (f) CP

s FPx

s FPy

s FPz

s

1 0 0 00 1 0 00 0 1 00 0 1f 0

CPx

CPy

CPz

1

=

perspectivetransformation

imagepoint

3D point incamera

coordinates

Why is there not a scale factor here

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 4: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Perspective in 2D(Simplified)

P=(xcyczc) =(xwywzw)

3D object point

xc

yc

zw=zc

yi

Yc

Xc

Zc

xi

F f

cameraPacute=(xiyif)

xi xc

f zc

yi yc

f zc

=

=

xi = (fzc)xc

yi = (fzc)ycHere camera coordinatesequal world coordinates

opticalaxis

ray

3D from Stereo

left image right image

3D point

disparity the difference in image location of the same 3Dpoint when projected under perspective to two different cameras

d = xleft - xright

Depth Perception from StereoSimple Model Parallel Optic Axes

f

f

L

R

camera

baselinecamera

b

P=(xz)

Z

X

image plane

xl

xr

z

z xf xl

=

x-b

z x-bf xr

= z y yf yl yr

= =y-axis is

perpendicularto the page

Resultant Depth Calculation

For stereo cameras with parallel optical axes focal length fbaseline b corresponding image points (xlyl) and (xryr)with disparity d

z = fb (xl - xr) = fbd

x = xlzf or b + xrzf

y = ylzf or yrzf

This method ofdetermining depthfrom disparity is called triangulation

Finding Correspondences

bull If the correspondence is correct triangulation works VERY well

bull But correspondence finding is not perfectly solved (What methods have we studied)

bull For some very specific applications it can be solved for those specific kind of images eg windshield of a car

deg deg

3 Main Matching Methods

1. Cross-correlation using small windows (dense)

2. Symbolic feature matching, usually using segments/corners (sparse)

3. Use the newer interest operators, e.g., SIFT (sparse)
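Method 1 can be sketched for a rectified pair, where the search for a match runs along the same row; this is a minimal normalized cross-correlation matcher with illustrative names and window size:

```python
import numpy as np

def best_match_ncc(left, right, row, col, half=3):
    """Find the column in `right` whose window best matches the window
    centered at (row, col) in `left`, searching along the same row
    (rectified images), by normalized cross-correlation."""
    w = left[row-half:row+half+1, col-half:col+half+1].astype(float)
    w = (w - w.mean()) / (w.std() + 1e-9)       # standardize the template
    best_c, best_score = None, -np.inf
    for c in range(half, right.shape[1] - half):
        v = right[row-half:row+half+1, c-half:c+half+1].astype(float)
        v = (v - v.mean()) / (v.std() + 1e-9)
        score = (w * v).mean()                  # NCC score in [-1, 1]
        if score > best_score:
            best_score, best_c = score, c
    return best_c, best_score
```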

Epipolar Geometry Constraint: 1. Normal Pair of Images

[Figure: cameras C1 and C2 separated by baseline b, with parallel axes z1 and z2 and image axes (x, y1), (x, y2); 3D point P projects to P1 and P2, and the epipolar plane through P, C1, and C2 is shown.]

The epipolar plane cuts through the image plane(s), forming 2 epipolar lines.

The match for P1 (or P2) in the other image must lie on the same epipolar line.

Epipolar Geometry: General Case

[Figure: two converging cameras with centers C1 and C2 and image axes (x1, y1), (x2, y2) viewing point P, which projects to P1 and P2; the epipoles e1 and e2 are marked.]

Constraints

[Figure: points P and Q viewed by cameras C1 and C2, with epipoles e1 and e2.]

1. Epipolar Constraint: Matching points lie on corresponding epipolar lines.

2. Ordering Constraint: Usually in the same order across the lines.
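The epipolar constraint is often written with a fundamental matrix F (not introduced on these slides): a match p2 for p1 must satisfy p2ᵀ·F·p1 = 0, i.e., lie on the line F·p1 in image 2. A hypothetical sketch, with F assumed given:

```python
import numpy as np

def epipolar_line(F, p1):
    """Line l2 = F @ p1h in image 2 on which the match for p1 must lie.
    p1 is (x, y); returns (a, b, c) for the line a*x + b*y + c = 0."""
    p1h = np.array([p1[0], p1[1], 1.0])
    return F @ p1h

def on_epipolar_line(F, p1, p2, tol=1e-6):
    """Epipolar constraint: p2h^T F p1h == 0 for a true correspondence."""
    p2h = np.array([p2[0], p2[1], 1.0])
    return abs(p2h @ epipolar_line(F, p1)) < tol
```

For a normal (rectified) pair the constraint reduces to "same row": the F below encodes exactly y1 = y2.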

Structured Light

3D data can also be derived using:

• a single camera

• a light source that can produce stripe(s) on the 3D object

[Figure: a light source projecting a light stripe onto the 3D object, viewed by a camera.]

Structured Light: 3D Computation

3D data can also be derived using:

• a single camera

• a light source that can produce stripe(s) on the 3D object

[Figure: camera at (0,0,0) with image plane at focal length f along the x axis; the light source sits at baseline distance b and projects its plane at angle θ; a 3D point (x, y, z) images at (x′, y′, f).]

                  b
[x, y, z] = ------------- [x′, y′, f]
            f·cot θ − x′
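The stripe equation above can be sketched directly; θ (the angle between the light plane and the baseline, whose symbol is lost in the transcript) is reconstructed here:

```python
import math

def stripe_point(xp, yp, f, b, theta):
    """[x, y, z] = b / (f*cot(theta) - x') * [x', y', f] for an image
    point (x', y', f) of the light stripe, baseline b, plane angle theta."""
    k = b / (f / math.tan(theta) - xp)   # common scale factor
    return k * xp, k * yp, k * f
```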

Depth from Multiple Light Stripes

What are these objects?

Our (former) System: 4-camera light-striping stereo

[Figure: a projector and four cameras arranged around a rotation table holding the 3D object.]

Camera Model: Recall there are 5 Different Frames of Reference

• Object

• World

• Camera

• Real Image

• Pixel Image

[Figure: a pyramid object with object frame (xp, yp, zp), world frame W (xw, yw, zw), camera frame C (xc, yc, zc), and an image with real-image axes (xf, yf); A is a 3D point and a is its image.]

The Camera Model

How do we get an image point IP from a world point P?

    [ s·IPr ]   [ c11  c12  c13  c14 ] [ Px ]
    [ s·IPc ] = [ c21  c22  c23  c24 ] [ Py ]
    [   s   ]   [ c31  c32  c33   1  ] [ Pz ]
                                       [ 1  ]
    image point   camera matrix C      world point

What's in C?
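In code, the matrix equation above is a matrix-vector product followed by division by the scale s (a sketch; C is any 3×4 camera matrix):

```python
import numpy as np

def project(C, P):
    """Apply the 3x4 camera matrix: C @ [Px, Py, Pz, 1] = (s*IPr, s*IPc, s);
    dividing out the scale s yields the image point (IPr, IPc)."""
    s_r, s_c, s = np.asarray(C, dtype=float) @ np.array([P[0], P[1], P[2], 1.0])
    return s_r / s, s_c / s
```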

The camera model handles the rigid body transformation from world coordinates to camera coordinates, plus the perspective transformation to image coordinates.

1. CP = T · R · WP
2. FP = π(f) · CP

    [ s·FPx ]   [ 1  0   0   0 ] [ CPx ]
    [ s·FPy ] = [ 0  1   0   0 ] [ CPy ]
    [ s·FPz ]   [ 0  0   1   0 ] [ CPz ]
    [   s   ]   [ 0  0  1/f  0 ] [  1  ]
    image point   perspective     3D point in
                  transformation  camera coordinates

Why is there not a scale factor here?

Camera Calibration

• In order to work in 3D, we need to know the parameters of the particular camera setup.

• Solving for the camera parameters is called calibration.

[Figure: world frame W (xw, yw, zw) and camera frame C (xc, yc, zc).]

• intrinsic parameters are of the camera device

• extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

• principal point (u0, v0)

• scale factors (dx, dy)

• aspect ratio distortion factor

• focal length f

• lens distortion factor (models radial lens distortion)

[Figure: camera center C, principal point (u0, v0), focal length f.]

Extrinsic Parameters

• translation parameters: t = [tx ty tz]

• rotation matrix:

        [ r11 r12 r13 0 ]
    R = [ r21 r22 r23 0 ]      Are there really nine parameters?
        [ r31 r32 r33 0 ]
        [  0   0   0  1 ]

Calibration Object

The idea is to snap images at different depths and get a lot of 2D-3D point correspondences.

The Tsai Procedure

• The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used.

• Several images are taken of the calibration object, yielding point correspondences at different distances.

• Tsai's algorithm requires n > 5 correspondences

  {((xi, yi, zi), (ui, vi)) | i = 1, …, n}

  between (real) image points and 3D points.

In this version of Tsai's algorithm:

• The real-valued (u, v) are computed from their pixel positions (r, c):

  u = α·dx·(c − u0),   v = −dy·(r − v0)

  where

  - (u0, v0) is the center of the image,

  - dx and dy are the center-to-center (real) distances between pixels and come from the camera's specs,

  - α is a scale factor learned from previous trials.

• This version is for single-plane calibration.
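The pixel-to-real-image conversion above as a one-liner (the scale factor's symbol is garbled in the transcript; `alpha` stands in for it here):

```python
def pixel_to_image(r, c, u0, v0, dx, dy, alpha=1.0):
    """u = alpha*dx*(c - u0), v = -dy*(r - v0): real image coordinates
    from pixel (row r, column c), with image center (u0, v0)."""
    return alpha * dx * (c - u0), -dy * (r - v0)
```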

Tsai's Procedure

1. Given the n point correspondences ((xi, yi, zi), (ui, vi)):

   Compute matrix A with rows ai:

   ai = (vi·xi, vi·yi, −ui·xi, −ui·yi, vi)

   These are known quantities, which will be used to solve for intermediate values, which will then be used to solve for the parameters sought.

Intermediate Unknowns

2. The vector of unknowns is μ = (μ1, μ2, μ3, μ4, μ5):

   μ1 = r11/ty,  μ2 = r12/ty,  μ3 = r21/ty,  μ4 = r22/ty,  μ5 = tx/ty

   where the r's and t's are unknown rotation and translation parameters.

3. Let vector b = (u1, u2, …, un) contain the u image coordinates.

4. Solve the system of linear equations

   A·μ = b

   for the unknown parameter vector μ.
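Steps 1–4 can be sketched as a least-squares solve, assuming a planar calibration target (zi = 0) as on the slides; the function name and test data are illustrative:

```python
import numpy as np

def solve_stage1(pts3d, pts2d):
    """Tsai stage 1: rows a_i = (v*x, v*y, -u*x, -u*y, v), b_i = u;
    least-squares solve of A mu = b for
    mu = (r11/ty, r12/ty, r21/ty, r22/ty, tx/ty)."""
    A, b = [], []
    for (x, y, _z), (u, v) in zip(pts3d, pts2d):
        A.append([v * x, v * y, -u * x, -u * y, v])
        b.append(u)
    mu, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return mu
```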

Use μ to solve for ty, tx, and 4 rotation parameters

5. Let U = μ1² + μ2² + μ3² + μ4². Use U to calculate ty².

6. Try the positive square root ty = (ty²)^1/2 and use it to compute translation and rotation parameters:

   r11 = μ1·ty
   r12 = μ2·ty
   r21 = μ3·ty
   r22 = μ4·ty
   tx  = μ5·ty

   Now we know 2 translation parameters and 4 rotation parameters, except…
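The slides leave the ty² formula to the text; this sketch uses Tsai's (1987) closed form for the non-degenerate case, which is an assumption beyond what the slides state:

```python
import math

def stage1_params(mu):
    """From mu = (r11/ty, r12/ty, r21/ty, r22/ty, tx/ty), recover the
    positive root ty, then tx and the 4 rotation parameters (step 6)."""
    m1, m2, m3, m4, m5 = mu
    U = m1*m1 + m2*m2 + m3*m3 + m4*m4          # step 5: U on the slide
    d = m1*m4 - m2*m3                          # det of the 2x2 mu block
    # Tsai's closed form for ty^2 (non-singular case; see text)
    ty2 = (U - math.sqrt(max(U*U - 4*d*d, 0.0))) / (2*d*d)
    ty = math.sqrt(ty2)                        # try the positive root
    return ty, m5*ty, (m1*ty, m2*ty, m3*ty, m4*ty)
```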

Determine the true sign of ty and compute the remaining rotation parameters

7. Select an object point P whose image coordinates (u, v) are far from the image center.

8. Use P's coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P.

   If its coordinates have the same signs as (u, v), then keep ty; else negate it.

9. Use the first 4 rotation parameters to calculate the remaining 5.

Calculating the remaining 5 rotation parameters: solve another linear system

10. We have tx and ty and the 9 rotation parameters. The next step is to find tz and f.

    Form a matrix A′ whose rows are

    ai′ = (r21·xi + r22·yi + ty,  −vi)

    and a vector b′ whose rows are

    bi′ = (r31·xi + r32·yi)·vi

11. Solve A′·v = b′ for v = (f, tz).

Almost there

12. If f is negative, change signs (see text).

13. Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations via nonlinear regression.

14. All parameters have been computed. Use them in 3D data acquisition systems.

We use them for general stereo

[Figure: two calibrated cameras with camera matrices B and C, epipoles e1 and e2, and image axes (x1, y1), (x2, y2), viewing 3D point P with image points P1 = (r1, c1) and P2 = (r2, c2).]

For a correspondence (r1, c1) in image 1 to (r2, c2) in image 2:

1. Both cameras were calibrated, so both camera matrices are known. From the two camera equations B and C we get 4 linear equations in 3 unknowns:

   r1 = (b11 − b31·r1)x + (b12 − b32·r1)y + (b13 − b33·r1)z
   c1 = (b21 − b31·c1)x + (b22 − b32·c1)y + (b23 − b33·c1)z

   r2 = (c11 − c31·r2)x + (c12 − c32·r2)y + (c13 − c33·r2)z
   c2 = (c21 − c31·c2)x + (c22 − c32·c2)y + (c23 − c33·c2)z

   A direct solution using only 3 of the equations won't give reliable results.

Solve by computing the closest approach of the two skew rays

If the rays intersected perfectly in 3D, the intersection would be P. Instead, we solve for the shortest line segment V connecting the two rays and let P be its midpoint.

[Figure: one ray from P1 along direction u1, another from Q1 along direction u2; V is the shortest connecting segment and P its midpoint.]

V = (P1 + a1·u1) − (Q1 + a2·u2)

[(P1 + a1·u1) − (Q1 + a2·u2)] · u1 = 0
[(P1 + a1·u1) − (Q1 + a2·u2)] · u2 = 0
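The two dot-product conditions above give a 2×2 linear system in (a1, a2); a minimal sketch:

```python
import numpy as np

def midpoint_of_skew_rays(P1, u1, Q1, u2):
    """Solve V.u1 = 0 and V.u2 = 0, where
    V = (P1 + a1*u1) - (Q1 + a2*u2), then return the midpoint of the
    shortest segment between the rays (the reconstructed point P)."""
    P1, u1, Q1, u2 = map(np.asarray, (P1, u1, Q1, u2))
    w = P1 - Q1
    # a1*(u1.u1) - a2*(u2.u1) = -w.u1 ; a1*(u1.u2) - a2*(u2.u2) = -w.u2
    A = np.array([[u1 @ u1, -(u1 @ u2)],
                  [u1 @ u2, -(u2 @ u2)]])
    b = -np.array([w @ u1, w @ u2])
    a1, a2 = np.linalg.solve(A, b)      # singular only for parallel rays
    return 0.5 * ((P1 + a1 * u1) + (Q1 + a2 * u2))
```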

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 5: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

3D from Stereo

left image right image

3D point

disparity the difference in image location of the same 3Dpoint when projected under perspective to two different cameras

d = xleft - xright

Depth Perception from StereoSimple Model Parallel Optic Axes

f

f

L

R

camera

baselinecamera

b

P=(xz)

Z

X

image plane

xl

xr

z

z xf xl

=

x-b

z x-bf xr

= z y yf yl yr

= =y-axis is

perpendicularto the page

Resultant Depth Calculation

For stereo cameras with parallel optical axes focal length fbaseline b corresponding image points (xlyl) and (xryr)with disparity d

z = fb (xl - xr) = fbd

x = xlzf or b + xrzf

y = ylzf or yrzf

This method ofdetermining depthfrom disparity is called triangulation

Finding Correspondences

bull If the correspondence is correct triangulation works VERY well

bull But correspondence finding is not perfectly solved (What methods have we studied)

bull For some very specific applications it can be solved for those specific kind of images eg windshield of a car

deg deg

3 Main Matching Methods

1 Cross correlation using small windows

2 Symbolic feature matching usually using segmentscorners

3 Use the newer interest operators ie SIFT

dense

sparse

sparse

Epipolar Geometry Constraint1 Normal Pair of Images

x

y1

y2

z1 z2

C1 C2b

P

P1 P2

epipolarplane

The epipolar plane cuts through the image plane(s)forming 2 epipolar lines

The match for P1 (or P2) in the other image must lie on the same epipolar line

Epipolar GeometryGeneral Case

P

P1

P2

y1

y2

x1

x2

e1

e2

C1

C2

Constraints

P

e1

e2

C1

C2

1 Epipolar Constraint Matching points lie on corresponding epipolar lines

2 Ordering Constraint Usually in the same order across the lines

Q

Structured Light

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

camera

light stripe

Structured Light3D Computation

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

x axisf

(xacuteyacutef)

3D point(x y z)

b

b[x y z] = --------------- [xacute yacute f] f cot - xacute

(000)

3D image

Depth from Multiple Light Stripes

What are these objects

Our (former) System4-camera light-striping stereo

projector

rotationtable

cameras

3Dobject

Camera Model Recall there are 5 Different Frames of Reference

bull Object

bull World

bull Camera

bull Real Image

bull Pixel Image

yc

xc

zc

zwC

Wyw

xw

A

a

xf

yf

xp

yp

zppyramidobject

image

The Camera Model

How do we get an image point IP from a world point P

c11 c12 c13 c14

c21 c22 c23 c24

c31 c32 c33 1

s IPr

s IPc

s

Px

Py

Pz

1

=

imagepoint

camera matrix C worldpoint

Whatrsquos in C

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1 CP = T R WP2 FP = (f) CP

s FPx

s FPy

s FPz

s

1 0 0 00 1 0 00 0 1 00 0 1f 0

CPx

CPy

CPz

1

=

perspectivetransformation

imagepoint

3D point incamera

coordinates

Why is there not a scale factor here

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 6: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Depth Perception from StereoSimple Model Parallel Optic Axes

f

f

L

R

camera

baselinecamera

b

P=(xz)

Z

X

image plane

xl

xr

z

z xf xl

=

x-b

z x-bf xr

= z y yf yl yr

= =y-axis is

perpendicularto the page

Resultant Depth Calculation

For stereo cameras with parallel optical axes focal length fbaseline b corresponding image points (xlyl) and (xryr)with disparity d

z = fb (xl - xr) = fbd

x = xlzf or b + xrzf

y = ylzf or yrzf

This method ofdetermining depthfrom disparity is called triangulation

Finding Correspondences

bull If the correspondence is correct triangulation works VERY well

bull But correspondence finding is not perfectly solved (What methods have we studied)

bull For some very specific applications it can be solved for those specific kind of images eg windshield of a car

deg deg

3 Main Matching Methods

1 Cross correlation using small windows

2 Symbolic feature matching usually using segmentscorners

3 Use the newer interest operators ie SIFT

dense

sparse

sparse

Epipolar Geometry Constraint1 Normal Pair of Images

x

y1

y2

z1 z2

C1 C2b

P

P1 P2

epipolarplane

The epipolar plane cuts through the image plane(s)forming 2 epipolar lines

The match for P1 (or P2) in the other image must lie on the same epipolar line

Epipolar GeometryGeneral Case

P

P1

P2

y1

y2

x1

x2

e1

e2

C1

C2

Constraints

P

e1

e2

C1

C2

1 Epipolar Constraint Matching points lie on corresponding epipolar lines

2 Ordering Constraint Usually in the same order across the lines

Q

Structured Light

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

camera

light stripe

Structured Light3D Computation

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

x axisf

(xacuteyacutef)

3D point(x y z)

b

b[x y z] = --------------- [xacute yacute f] f cot - xacute

(000)

3D image

Depth from Multiple Light Stripes

What are these objects

Our (former) System4-camera light-striping stereo

projector

rotationtable

cameras

3Dobject

Camera Model Recall there are 5 Different Frames of Reference

bull Object

bull World

bull Camera

bull Real Image

bull Pixel Image

yc

xc

zc

zwC

Wyw

xw

A

a

xf

yf

xp

yp

zppyramidobject

image

The Camera Model

How do we get an image point IP from a world point P

c11 c12 c13 c14

c21 c22 c23 c24

c31 c32 c33 1

s IPr

s IPc

s

Px

Py

Pz

1

=

imagepoint

camera matrix C worldpoint

Whatrsquos in C

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1 CP = T R WP2 FP = (f) CP

s FPx

s FPy

s FPz

s

1 0 0 00 1 0 00 0 1 00 0 1f 0

CPx

CPy

CPz

1

=

perspectivetransformation

imagepoint

3D point incamera

coordinates

Why is there not a scale factor here

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 7: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Resultant Depth Calculation

For stereo cameras with parallel optical axes focal length fbaseline b corresponding image points (xlyl) and (xryr)with disparity d

z = fb (xl - xr) = fbd

x = xlzf or b + xrzf

y = ylzf or yrzf

This method ofdetermining depthfrom disparity is called triangulation

Finding Correspondences

bull If the correspondence is correct triangulation works VERY well

bull But correspondence finding is not perfectly solved (What methods have we studied)

bull For some very specific applications it can be solved for those specific kind of images eg windshield of a car

deg deg

3 Main Matching Methods

1 Cross correlation using small windows

2 Symbolic feature matching usually using segmentscorners

3 Use the newer interest operators ie SIFT

dense

sparse

sparse

Epipolar Geometry Constraint1 Normal Pair of Images

x

y1

y2

z1 z2

C1 C2b

P

P1 P2

epipolarplane

The epipolar plane cuts through the image plane(s)forming 2 epipolar lines

The match for P1 (or P2) in the other image must lie on the same epipolar line

Epipolar GeometryGeneral Case

P

P1

P2

y1

y2

x1

x2

e1

e2

C1

C2

Constraints

P

e1

e2

C1

C2

1 Epipolar Constraint Matching points lie on corresponding epipolar lines

2 Ordering Constraint Usually in the same order across the lines

Q

Structured Light

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

camera

light stripe

Structured Light3D Computation

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

x axisf

(xacuteyacutef)

3D point(x y z)

b

b[x y z] = --------------- [xacute yacute f] f cot - xacute

(000)

3D image

Depth from Multiple Light Stripes

What are these objects

Our (former) System4-camera light-striping stereo

projector

rotationtable

cameras

3Dobject

Camera Model Recall there are 5 Different Frames of Reference

bull Object

bull World

bull Camera

bull Real Image

bull Pixel Image

yc

xc

zc

zwC

Wyw

xw

A

a

xf

yf

xp

yp

zppyramidobject

image

The Camera Model

How do we get an image point IP from a world point P

c11 c12 c13 c14

c21 c22 c23 c24

c31 c32 c33 1

s IPr

s IPc

s

Px

Py

Pz

1

=

imagepoint

camera matrix C worldpoint

Whatrsquos in C

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1 CP = T R WP2 FP = (f) CP

s FPx

s FPy

s FPz

s

1 0 0 00 1 0 00 0 1 00 0 1f 0

CPx

CPy

CPz

1

=

perspectivetransformation

imagepoint

3D point incamera

coordinates

Why is there not a scale factor here

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8. Use P's coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P.

If its coordinates have the same signs as (u, v), then keep ty; else negate it.

9. Use the first 4 rotation parameters to calculate the remaining 5.
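Step 9 exploits the orthonormality of R: complete rows 1 and 2, then take row 3 as their cross product. A simplified sketch of one way to do this (the sign handling here is cruder than Tsai's actual procedure):

```python
import math

# Fill in the remaining five rotation entries from r11, r12, r21, r22.
# Rows of a rotation matrix are unit length and mutually orthogonal,
# so r13 and r23 follow up to sign, and row 3 = row1 x row2 keeps
# det(R) = +1. This is an illustrative sketch, not Tsai's exact rule.

def complete_rotation(r11, r12, r21, r22):
    r13 = math.sqrt(max(0.0, 1.0 - r11**2 - r12**2))
    r23 = math.sqrt(max(0.0, 1.0 - r21**2 - r22**2))
    if r11 * r21 + r12 * r22 > 0:   # flip r23 so row1 . row2 can be 0
        r23 = -r23
    row1 = (r11, r12, r13)
    row2 = (r21, r22, r23)
    row3 = (row1[1]*row2[2] - row1[2]*row2[1],   # cross product
            row1[2]*row2[0] - row1[0]*row2[2],
            row1[0]*row2[1] - row1[1]*row2[0])
    return row1, row2, row3

# Sanity check: starting from the identity's first two rows
# should reproduce the identity rotation.
rows = complete_rotation(1.0, 0.0, 0.0, 1.0)
```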

Calculating the remaining 5 rotation parameters

Solve another linear system

10. We have tx and ty and the 9 rotation parameters. The next step is to find tz and f.

Form a matrix A′ whose rows are

a′i = (r21·xi + r22·yi + ty,  vi)

and a vector b′ whose rows are

b′i = (r31·xi + r32·yi)·vi

11. Solve A′v = b′ for v = (f, tz).

Almost there

12. If f is negative, change signs (see text).

13. Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations via nonlinear regression.

14. All parameters have been computed. Use them in 3D data acquisition systems.

We use them for general stereo.

[Figure: general stereo pair. Two calibrated cameras with matrices B and C and epipoles e1, e2; 3D point P projects to P1 = (r1, c1) in image 1 and P2 = (r2, c2) in image 2.]

For a correspondence (r1, c1) in image 1 to (r2, c2) in image 2:

1. Both cameras were calibrated, so both camera matrices are known. From the two camera equations B and C we get 4 linear equations in 3 unknowns:

r1 = (b11 − b31·r1)x + (b12 − b32·r1)y + (b13 − b33·r1)z
c1 = (b21 − b31·c1)x + (b22 − b32·c1)y + (b23 − b33·c1)z

r2 = (c11 − c31·r2)x + (c12 − c32·r2)y + (c13 − c33·r2)z
c2 = (c21 − c31·c2)x + (c22 − c32·c2)y + (c23 − c33·c2)z

A direct solution using only 3 of the equations won't give reliable results.
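The overdetermined 4×3 system can instead be solved by least squares. A pure-Python sketch (normal equations plus Cramer's rule; the A and b below are fabricated toy values, whereas in practice their entries come from the calibrated camera matrices B and C as above):

```python
# Least-squares solve of an overdetermined system A x = b with 3 unknowns:
# form the 3x3 normal equations N x = t (N = A^T A, t = A^T b), then
# solve the small system by Cramer's rule.

def lstsq_3(A, b):
    n = len(A)
    N = [[sum(A[k][i] * A[k][j] for k in range(n)) for j in range(3)]
         for i in range(3)]
    t = [sum(A[k][i] * b[k] for k in range(n)) for i in range(3)]

    def det3(M):
        return (M[0][0] * (M[1][1]*M[2][2] - M[1][2]*M[2][1])
              - M[0][1] * (M[1][0]*M[2][2] - M[1][2]*M[2][0])
              + M[0][2] * (M[1][0]*M[2][1] - M[1][1]*M[2][0]))

    d = det3(N)
    xs = []
    for col in range(3):                  # Cramer's rule, column by column
        M = [row[:] for row in N]
        for i in range(3):
            M[i][col] = t[i]
        xs.append(det3(M) / d)
    return xs

# Consistent toy system whose solution is the point (1, 2, 3).
A = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]]
b = [1, 2, 3, 6]
x, y, z = lstsq_3(A, b)
```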

Solve by computing the closest approach of the two skew rays

[Figure: rays P1 + a1u1 and Q1 + a2u2 with unit directions u1 and u2; V is the shortest connecting segment.]

If the rays intersected perfectly in 3D, the intersection would be P. Instead we solve for the shortest line segment connecting the two rays and let P be its midpoint.

V = (P1 + a1u1) − (Q1 + a2u2)

(P1 + a1u1 − Q1 − a2u2) · u1 = 0
(P1 + a1u1 − Q1 − a2u2) · u2 = 0
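The two orthogonality conditions give a 2×2 linear system in (a1, a2), solvable in closed form. A pure-Python sketch, with example rays chosen to intersect exactly so the midpoint recovers the true P:

```python
# Closest approach of two skew rays P1 + a1*u1 and Q1 + a2*u2:
# require the connecting segment to be perpendicular to both directions,
# solve the resulting 2x2 system, and return the segment's midpoint.

def dot(a, b):   return sum(x*y for x, y in zip(a, b))
def add(a, b):   return tuple(x+y for x, y in zip(a, b))
def sub(a, b):   return tuple(x-y for x, y in zip(a, b))
def scale(a, s): return tuple(x*s for x in a)

def closest_point(P1, u1, Q1, u2):
    w = sub(Q1, P1)
    # (a1*u1 - a2*u2 - w) . u1 = 0  and  (a1*u1 - a2*u2 - w) . u2 = 0
    A11, A12 = dot(u1, u1), -dot(u1, u2)
    A21, A22 = dot(u1, u2), -dot(u2, u2)
    b1, b2 = dot(w, u1), dot(w, u2)
    det = A11*A22 - A12*A21
    a1 = (b1*A22 - A12*b2) / det
    a2 = (A11*b2 - b1*A21) / det
    p = add(P1, scale(u1, a1))      # closest point on ray 1
    q = add(Q1, scale(u2, a2))      # closest point on ray 2
    return scale(add(p, q), 0.5)    # midpoint = reconstructed P

# Two rays that truly intersect at (1, 1, 0):
P = closest_point((0, 0, 0), (1, 1, 0), (2, 0, 0), (-1, 1, 0))
```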

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsai's algorithm
  • Tsai's Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 8: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Finding Correspondences

• If the correspondence is correct, triangulation works VERY well.

• But correspondence finding is not perfectly solved. (What methods have we studied?)

• For some very specific applications it can be solved for those specific kinds of images, e.g. the windshield of a car.


3 Main Matching Methods

1. Cross correlation, using small windows — dense

2. Symbolic feature matching, usually using segments/corners — sparse

3. Use the newer interest operators, e.g. SIFT — sparse
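As a toy illustration of method 1, the sketch below slides a small window along a scanline and picks the column with the highest normalized cross-correlation. Window size, intensities, and the list-of-pixels image format are all invented for the example:

```python
# Normalized cross-correlation (NCC) of two equal-length windows:
# subtract each window's mean, then compute the cosine of the angle
# between the deviation vectors. Returns a score in [-1, 1].

def ncc(w1, w2):
    n = len(w1)
    m1, m2 = sum(w1) / n, sum(w2) / n
    num = sum((a - m1) * (b - m2) for a, b in zip(w1, w2))
    d1 = sum((a - m1) ** 2 for a in w1) ** 0.5
    d2 = sum((b - m2) ** 2 for b in w2) ** 0.5
    return num / (d1 * d2) if d1 and d2 else 0.0

def best_match(left_row, right_row, col, half=1):
    """Find the column in right_row best matching left_row around col."""
    win = left_row[col - half:col + half + 1]
    scores = [(ncc(win, right_row[c - half:c + half + 1]), c)
              for c in range(half, len(right_row) - half)]
    return max(scores)[1]

left  = [10, 20, 80, 20, 10, 10]
right = [10, 10, 20, 80, 20, 10]   # same pattern shifted right by one pixel
c = best_match(left, right, col=2)  # disparity = c - 2
```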

Epipolar Geometry, Constraint 1: Normal Pair of Images

[Figure: cameras C1 and C2 with baseline b along the x axis and parallel optical axes z1, z2; 3D point P projects to P1 and P2; the epipolar plane passes through P, C1, and C2.]

The epipolar plane cuts through the image plane(s), forming 2 epipolar lines.

The match for P1 (or P2) in the other image must lie on the same epipolar line.

Epipolar Geometry: General Case

[Figure: converging cameras C1 and C2 with epipoles e1 and e2; 3D point P projects to P1 = (x1, y1) and P2 = (x2, y2).]

Constraints

[Figure: points P and Q projecting into both images of cameras C1 and C2, with epipoles e1 and e2.]

1. Epipolar Constraint: Matching points lie on corresponding epipolar lines.

2. Ordering Constraint: Usually in the same order across the lines.

Structured Light

3D data can also be derived using

• a single camera

• a light source that can produce stripe(s) on the 3D object

[Figure: a light source casting a light stripe on the 3D object, viewed by a camera.]

Structured Light: 3D Computation

3D data can also be derived using

• a single camera

• a light source that can produce stripe(s) on the 3D object

[Figure: light source at baseline distance b from the camera center (0,0,0); the light plane makes angle θ with the x axis; image point (x′, y′, f) back-projects to the 3D point (x, y, z).]

                    b
[x, y, z]  =  --------------  [x′, y′, f]
               f·cot θ − x′
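The formula above transcribes directly into code. The angle θ, baseline b, focal length f, and the image point below are invented numbers for illustration:

```python
import math

# Structured-light back-projection: scale the image ray [x', y', f]
# by b / (f*cot(theta) - x') to get the 3D stripe point (x, y, z).

def stripe_point(xp, yp, f, b, theta):
    s = b / (f / math.tan(theta) - xp)   # cot(theta) = 1 / tan(theta)
    return (s * xp, s * yp, s * f)

# With theta = 45 degrees, cot(theta) = 1, so the scale is b / (f - x').
x, y, z = stripe_point(xp=1.0, yp=2.0, f=2.0, b=3.0, theta=math.pi / 4)
```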

Depth from Multiple Light Stripes

What are these objects?

Our (former) System: 4-camera light-striping stereo

[Figure: a projector and four cameras arranged around a rotation table holding the 3D object.]

Camera Model Recall there are 5 Different Frames of Reference

• Object

• World

• Camera

• Real Image

• Pixel Image

[Figure: a pyramid object with object frame (xp, yp, zp), world frame W (xw, yw, zw), camera frame C (xc, yc, zc), real image frame (xf, yf), and the image array.]

The Camera Model

How do we get an image point IP from a world point P?

⎡ s·IPr ⎤   ⎡ c11 c12 c13 c14 ⎤   ⎡ Px ⎤
⎢ s·IPc ⎥ = ⎢ c21 c22 c23 c24 ⎥ · ⎢ Py ⎥
⎣   s   ⎦   ⎣ c31 c32 c33  1  ⎦   ⎢ Pz ⎥
                                   ⎣ 1  ⎦

  image         camera matrix C     world
  point                             point

What's in C?
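A tiny numeric sketch of this projection: multiply the 3×4 camera matrix by the homogeneous world point, then divide by s (the third component) to recover (IPr, IPc). The camera matrix here is fabricated, chosen only so the arithmetic is easy to check:

```python
# Project a world point P through a 3x4 camera matrix C (with c34 = 1,
# as on the slide): image = (s*IPr, s*IPc, s), then divide out s.

def project(C, P):
    Ph = (P[0], P[1], P[2], 1.0)                       # homogeneous point
    s_r, s_c, s = (sum(row[i] * Ph[i] for i in range(4)) for row in C)
    return s_r / s, s_c / s                            # (IPr, IPc)

# Illustrative camera: identity-like rows, so s = Pz + 1.
C = [[1, 0, 0, 0],
     [0, 1, 0, 0],
     [0, 0, 1, 1]]
row, col = project(C, (2, 4, 1))
```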

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1. CP = T R WP     2. FP = Π(f) CP

⎡ s·FPx ⎤   ⎡ 1  0   0   0 ⎤   ⎡ CPx ⎤
⎢ s·FPy ⎥ = ⎢ 0  1   0   0 ⎥ · ⎢ CPy ⎥
⎢ s·FPz ⎥   ⎢ 0  0   1   0 ⎥   ⎢ CPz ⎥
⎣   s   ⎦   ⎣ 0  0  1/f  0 ⎦   ⎣  1  ⎦

perspective transformation: from the 3D point in camera coordinates to the image point

Why is there not a scale factor here?

Camera Calibration

• In order to work in 3D, we need to know the parameters of the particular camera setup.

• Solving for the camera parameters is called calibration.

[Figure: world frame W (xw, yw, zw) and camera frame C (xc, yc, zc).]

• intrinsic parameters are of the camera device

• extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

• principal point (u0, v0)

• scale factors (dx, dy)

• aspect ratio distortion factor

• focal length f

• lens distortion factor (models radial lens distortion)

[Figure: camera center C at focal distance f from the image plane; the optical axis meets the image at the principal point (u0, v0).]

Extrinsic Parameters

• translation parameters t = [tx ty tz]

• rotation matrix

        ⎡ r11 r12 r13 0 ⎤
R  =    ⎢ r21 r22 r23 0 ⎥      Are there really nine parameters?
        ⎢ r31 r32 r33 0 ⎥
        ⎣  0   0   0  1 ⎦

Page 11: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Epipolar GeometryGeneral Case

P

P1

P2

y1

y2

x1

x2

e1

e2

C1

C2

Constraints

P

e1

e2

C1

C2

1 Epipolar Constraint Matching points lie on corresponding epipolar lines

2 Ordering Constraint Usually in the same order across the lines

Q

Structured Light

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

camera

light stripe

Structured Light3D Computation

3D data can also be derived using

bull a single camera

bull a light source that can produce stripe(s) on the 3D object

lightsource

x axisf

(xacuteyacutef)

3D point(x y z)

b

b[x y z] = --------------- [xacute yacute f] f cot - xacute

(000)

3D image

Depth from Multiple Light Stripes

What are these objects

Our (former) System4-camera light-striping stereo

projector

rotationtable

cameras

3Dobject

Camera Model Recall there are 5 Different Frames of Reference

bull Object

bull World

bull Camera

bull Real Image

bull Pixel Image

yc

xc

zc

zwC

Wyw

xw

A

a

xf

yf

xp

yp

zppyramidobject

image

The Camera Model

How do we get an image point IP from a world point P

c11 c12 c13 c14

c21 c22 c23 c24

c31 c32 c33 1

s IPr

s IPc

s

Px

Py

Pz

1

=

imagepoint

camera matrix C worldpoint

Whatrsquos in C

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1 CP = T R WP2 FP = (f) CP

s FPx

s FPy

s FPz

s

1 0 0 00 1 0 00 0 1 00 0 1f 0

CPx

CPy

CPz

1

=

perspectivetransformation

imagepoint

3D point incamera

coordinates

Why is there not a scale factor here

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2. The vector of unknowns is μ = (μ1, μ2, μ3, μ4, μ5):

   μ1 = r11/ty,  μ2 = r12/ty,  μ3 = r21/ty,  μ4 = r22/ty,  μ5 = tx/ty

where the r's and t's are unknown rotation and translation parameters.

3. Let vector b = (u1, u2, …, un) contain the n u image coordinates.

4. Solve the system of linear equations

   A μ = b

for the unknown parameter vector μ.

Use μ to solve for ty, tx, and 4 rotation parameters

5. Let U = μ1² + μ2² + μ3² + μ4². Use U to calculate ty².

6. Try the positive square root ty = (ty²)^1/2 and use it to compute translation and rotation parameters:

   r11 = μ1·ty
   r12 = μ2·ty
   r21 = μ3·ty
   r22 = μ4·ty
   tx  = μ5·ty

Now we know 2 translation parameters and 4 rotation parameters
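Steps 1-6 can be sketched numerically. This is an illustrative helper, not code from the course: the explicit expression for ty² is the one from Tsai's paper (the slide only says "Use U to calculate ty²"), and the synthetic data below assumes a simple in-plane rotation:

```python
import numpy as np

def tsai_stage1(x, y, u, v):
    # Steps 1-4: build A and b, solve A mu = b by least squares.
    A = np.column_stack([v * x, v * y, -u * x, -u * y, v])
    mu, *_ = np.linalg.lstsq(A, u, rcond=None)
    # Step 5: U, then ty^2 (formula from Tsai's paper, common case).
    U = np.sum(mu[:4] ** 2)
    d = mu[0] * mu[3] - mu[1] * mu[2]          # det of the 2x2 rotation block / ty^2
    ty2 = (U - np.sqrt(max(U**2 - 4 * d**2, 0.0))) / (2 * d**2)
    # Step 6: try the positive square root (its sign is checked in step 8).
    ty = np.sqrt(ty2)
    r11, r12, r21, r22 = mu[:4] * ty
    tx = mu[4] * ty
    return ty, tx, (r11, r12, r21, r22)

# Synthetic check: rotation about z by 30 degrees, tx=2, ty=3, tz=10, f=1:
th = np.deg2rad(30.0)
x = np.array([1.0, 0.0, 1.0, 2.0, 1.0, 3.0])
y = np.array([0.0, 1.0, 1.0, 1.0, 2.0, 2.0])
u = (np.cos(th) * x - np.sin(th) * y + 2.0) / 10.0
v = (np.sin(th) * x + np.cos(th) * y + 3.0) / 10.0
ty, tx, rot = tsai_stage1(x, y, u, v)
print(round(ty, 6), round(tx, 6))   # -> 3.0 2.0
```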

except…

Determine true sign of ty and compute remaining rotation parameters

7. Select an object point P whose image coordinates (u, v) are far from the image center.

8. Use P's coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P.

   If its coordinates have the same signs as (u, v), then keep ty; else negate it.

9. Use the first 4 rotation parameters to calculate the remaining 5.

Calculating the remaining 5 rotation parameters

Solve another linear system

10. We have tx and ty and the 9 rotation parameters. The next step is to find tz and f.

    Form a matrix A′ whose rows are

    ai′ = (r21·xi + r22·yi + ty,  vi)

    and a vector b′ whose rows are

    bi′ = (r31·xi + r32·yi)·vi

11. Solve A′v = b′ for v = (f, tz).
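Steps 10-11 are another least-squares solve. The sketch below follows the slide's rows literally (function name and synthetic data are illustrative); note that with these rows tz comes out with flipped sign on the synthetic data, which is exactly the sign fix step 12 refers to:

```python
import numpy as np

def tsai_stage2(x, y, v, ty, r21, r22, r31, r32):
    # Rows a_i' = (r21*x_i + r22*y_i + ty, v_i) and
    # b_i' = (r31*x_i + r32*y_i) * v_i, per the slide; solve A'v = b'.
    A2 = np.column_stack([r21 * x + r22 * y + ty, v])
    b2 = (r31 * x + r32 * y) * v
    (f, tz), *_ = np.linalg.lstsq(A2, b2, rcond=None)
    return f, tz   # per step 12, signs may still need fixing (see text)

# Synthetic check: rotation about the x-axis by 30 degrees,
# calibration plane z = 0, f = 2, ty = 3, tz = 10:
r22, r32 = np.cos(np.deg2rad(30.0)), np.sin(np.deg2rad(30.0))
x = np.zeros(4)
y = np.array([0.0, 1.0, 2.0, 3.0])
v = 2.0 * (r22 * y + 3.0) / (r32 * y + 10.0)
f, tz = tsai_stage2(x, y, v, ty=3.0, r21=0.0, r22=r22, r31=0.0, r32=r32)
print(round(f, 6), round(abs(tz), 6))   # -> 2.0 10.0
```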

Almost there

12. If f is negative, change signs (see text).

13. Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations via nonlinear regression.

14. All parameters have been computed. Use them in 3D data acquisition systems.

We use them for general stereo

(Figure: 3D point P projects to P1 = (r1, c1) in image 1 and P2 = (r2, c2) in image 2; e1 and e2 are the epipoles; B and C are the two camera matrices.)

For a correspondence (r1, c1) in image 1 to (r2, c2) in image 2:

1. Both cameras were calibrated, so both camera matrices are known. From the two camera equations B and C we get 4 linear equations in the 3 unknowns (x, y, z):

   r1 = (b11 − b31·r1)x + (b12 − b32·r1)y + (b13 − b33·r1)z
   c1 = (b21 − b31·c1)x + (b22 − b32·c1)y + (b23 − b33·c1)z

   r2 = (c11 − c31·r2)x + (c12 − c32·r2)y + (c13 − c33·r2)z
   c2 = (c21 − c31·c2)x + (c22 − c32·c2)y + (c23 − c33·c2)z

A direct solution using only 3 of the equations won't give reliable results.

Solve by computing the closest approach of the two skew rays.

If the rays intersected perfectly in 3D, the intersection would be P. Instead we solve for the shortest line segment V connecting the two rays and let P be its midpoint.

(Figure: ray from P1 with direction u1, ray from Q1 with direction u2, and the shortest connecting segment V.)

   V = (P1 + a1·u1) − (Q1 + a2·u2)

   [(P1 + a1·u1) − (Q1 + a2·u2)] · u1 = 0
   [(P1 + a1·u1) − (Q1 + a2·u2)] · u2 = 0
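The two perpendicularity conditions are a 2×2 linear system in (a1, a2). A small sketch (function name is illustrative):

```python
import numpy as np

def triangulate_midpoint(P1, u1, Q1, u2):
    # Solve [(P1 + a1 u1) - (Q1 + a2 u2)] . u1 = 0 and
    #       [(P1 + a1 u1) - (Q1 + a2 u2)] . u2 = 0
    # for (a1, a2), then return the midpoint of the shortest segment.
    P1, u1 = np.asarray(P1, float), np.asarray(u1, float)
    Q1, u2 = np.asarray(Q1, float), np.asarray(u2, float)
    A = np.array([[u1 @ u1, -(u1 @ u2)],
                  [u1 @ u2, -(u2 @ u2)]])
    b = np.array([(Q1 - P1) @ u1, (Q1 - P1) @ u2])
    a1, a2 = np.linalg.solve(A, b)
    return 0.5 * ((P1 + a1 * u1) + (Q1 + a2 * u2))

# Two rays that happen to meet exactly at (1, 1, 1):
print(triangulate_midpoint([0, 0, 0], [1, 1, 1], [2, 0, 0], [-1, 1, 1]))
# -> [1. 1. 1.]
```

With noisy correspondences the rays are truly skew, and the midpoint is the natural compromise between them.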

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsai's algorithm
  • Tsai's Procedure
  • Intermediate Unknowns
  • Use μ to solve for ty, tx, and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
Page 12: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Constraints

(Figure: 3D points P and Q, camera centers C1 and C2, epipoles e1 and e2.)

1. Epipolar Constraint: matching points lie on corresponding epipolar lines.

2. Ordering Constraint: matching points usually appear in the same order along the epipolar lines.

Structured Light

3D data can also be derived using:

• a single camera

• a light source that can produce stripe(s) on the 3D object

(Figure: light source casting a light stripe on the 3D object, viewed by the camera.)

Structured Light: 3D Computation

With the single camera and the calibrated light source, each lit image point (x′, y′, f) yields a 3D point directly:

                 b
   [x y z] = ------------- [x′ y′ f]
              f·cot θ − x′

where b is the baseline between the camera origin (0, 0, 0) and the light source, and θ is the angle of the light plane.

(Figure: light source, camera x axis with focal length f, image point (x′, y′, f), and 3D point (x, y, z).)
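A minimal numeric sketch of this triangulation (the function name, and the symbol θ for the light-plane angle, are assumptions, since the angle symbol did not survive extraction):

```python
import numpy as np

def stripe_point(xp, yp, f, b, theta):
    # [x y z] = b / (f*cot(theta) - x') * [x' y' f]
    scale = b / (f / np.tan(theta) - xp)
    return scale * np.array([xp, yp, f])

# With theta = 45 deg (cot = 1), f = 1, baseline b = 2, image point (0.5, 0.25):
print(stripe_point(0.5, 0.25, 1.0, 2.0, np.pi / 4))   # -> [2. 1. 4.]
```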

Depth from Multiple Light Stripes

What are these objects?

Our (former) System: 4-camera light-striping stereo

(Figure: projector, rotation table, four cameras, and the 3D object.)

Camera Model: Recall there are 5 Different Frames of Reference

• Object

• World

• Camera

• Real Image

• Pixel Image

(Figure: pyramid object with object frame axes (xp, yp, zp), world frame W (xw, yw, zw), camera frame C (xc, yc, zc), and real-image axes (xf, yf).)

The Camera Model

How do we get an image point IP from a world point P?

    s·IPr        c11 c12 c13 c14      Px
    s·IPc   =    c21 c22 c23 c24      Py
    s            c31 c32 c33  1       Pz
                                      1

  image point    camera matrix C    world point

What's in C?

The camera model handles the rigid body transformation from world coordinates to camera coordinates, plus the perspective transformation to image coordinates:

1. CP = T R WP
2. FP = π(f) CP, where π(f) is the perspective transformation with focal length f
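Projecting through the composite camera matrix C is one matrix-vector product followed by a division by s; a minimal sketch (the function name and the toy matrix are illustrative, using the slide's normalization c34 = 1):

```python
import numpy as np

def project(C, world_point):
    # (s*IPr, s*IPc, s) = C (Px, Py, Pz, 1); divide by s to get (IPr, IPc).
    hp = C @ np.append(np.asarray(world_point, float), 1.0)
    return hp[:2] / hp[2]

# A toy 3x4 camera matrix with the slide's normalization c34 = 1:
C = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 0.1, 1.0]])
print(project(C, [2.0, 4.0, 10.0]))   # -> [1. 2.]
```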

  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 16: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Our (former) System: 4-camera light-striping stereo

[Figure: light-striping rig — projector, rotation table, four cameras, 3D object.]

Camera Model: Recall there are 5 Different Frames of Reference

• Object

• World

• Camera

• Real Image

• Pixel Image

[Figure: a pyramid object viewed by a camera — camera frame (xc, yc, zc) at C, world frame (xw, yw, zw) at W, object frame (xp, yp, zp), image frame (xf, yf); object point A images at point a.]

The Camera Model

How do we get an image point IP from a world point P?

[ s·IPr ]   [ c11 c12 c13 c14 ] [ Px ]
[ s·IPc ] = [ c21 c22 c23 c24 ] [ Py ]
[ s     ]   [ c31 c32 c33  1  ] [ Pz ]
                                [ 1  ]

image point = camera matrix C × world point

What's in C?
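As a quick numeric sketch of the matrix product above (the values in C and the point are made up for illustration; only the structure — a 3×4 matrix with the (3,4) entry normalized to 1, followed by dividing out s — comes from the slide):

```python
import numpy as np

# Hypothetical 3x4 camera matrix; the (3,4) entry is normalized to 1.
C = np.array([[800.0,   0.0, 320.0, 10.0],
              [  0.0, 800.0, 240.0,  5.0],
              [  0.0,   0.0,   1.0,  1.0]])

P = np.array([0.2, -0.1, 4.0, 1.0])              # homogeneous world point
s_IP = C @ P                                     # (s*IPr, s*IPc, s)
IPr, IPc = s_IP[0] / s_IP[2], s_IP[1] / s_IP[2]  # divide out the scale s
```

Here s = 5.0, so the image point is (290.0, 177.0).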

The camera model handles the rigid-body transformation from world coordinates to camera coordinates, plus the perspective transformation to image coordinates.

1. CP = T R WP   (world point WP to camera coordinates CP)
2. FP = π(f) CP  (camera coordinates to image point FP)

[ s·FPx ]   [ 1 0  0  0 ] [ CPx ]
[ s·FPy ] = [ 0 1  0  0 ] [ CPy ]
[ s·FPz ]   [ 0 0  1  0 ] [ CPz ]
[ s     ]   [ 0 0 1/f 0 ] [  1  ]

perspective transformation π(f): image point from a 3D point in camera coordinates

Why is there not a scale factor here?
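A numeric sketch of the perspective matrix at work (the point and f are made up). Note that the fourth component of the output is s = CPz/f — the scale is fixed by the point's own depth, which is why no free scale factor appears:

```python
import numpy as np

f = 2.0
Pi = np.array([[1.0, 0.0, 0.0,     0.0],
               [0.0, 1.0, 0.0,     0.0],
               [0.0, 0.0, 1.0,     0.0],
               [0.0, 0.0, 1.0 / f, 0.0]])

CP = np.array([3.0, 1.0, 4.0, 1.0])  # 3D point in camera coordinates
h = Pi @ CP                          # (s*FPx, s*FPy, s*FPz, s), with s = CPz/f
FP = h[:3] / h[3]                    # = (f*CPx/CPz, f*CPy/CPz, f)
```

For this point, FP = (1.5, 0.5, 2.0): the familiar f·x/z, f·y/z, with the third coordinate pinned to the image plane at z = f.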

Camera Calibration

• In order to work in 3D, we need to know the parameters of the particular camera setup.

• Solving for the camera parameters is called calibration.

[Figure: world frame W (xw, yw, zw) and camera frame C (xc, yc, zc).]

• intrinsic parameters describe the camera device itself

• extrinsic parameters describe where the camera sits in the world

Intrinsic Parameters

• principal point (u0, v0)

• scale factors (dx, dy)

• aspect ratio distortion factor

• focal length f

• lens distortion factor (models radial lens distortion)

[Figure: camera center C, principal point (u0, v0), focal length f.]

Extrinsic Parameters

• translation parameters t = [tx, ty, tz]

• rotation matrix

        [ r11 r12 r13 0 ]
    R = [ r21 r22 r23 0 ]
        [ r31 r32 r33 0 ]
        [  0   0   0  1 ]

Are there really nine parameters?
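To see why the nine entries of R are not independent: any rotation built from three angles automatically satisfies six orthonormality constraints (unit rows, mutual orthogonality), leaving only three degrees of freedom. A small check with arbitrary angles:

```python
import numpy as np

def rotation_from_angles(ax, ay, az):
    """Build a rotation from three angles (about x, y, z) - only 3 free parameters."""
    cx, sx = np.cos(ax), np.sin(ax)
    cy, sy = np.cos(ay), np.sin(ay)
    cz, sz = np.cos(az), np.sin(az)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

R = rotation_from_angles(0.1, 0.2, 0.3)
# The nine entries obey R^T R = I (six constraints) and det(R) = +1,
# so only three of them are truly free.
ortho_err = np.abs(R.T @ R - np.eye(3)).max()
```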

Calibration Object

The idea is to snap images at different depths and get a lot of 2D–3D point correspondences.

The Tsai Procedure

• The Tsai procedure was developed by Roger Tsai at IBM Research and is the most widely used.

• Several images are taken of the calibration object, yielding point correspondences at different distances.

• Tsai's algorithm requires n > 5 correspondences

{ ((xi, yi, zi), (ui, vi)) | i = 1, …, n }

between (real) image points and 3D points.

In this version of Tsai's algorithm:

• The real-valued (u, v) are computed from their pixel positions (r, c):

u = μ·dx·(c − u0),   v = −dy·(r − v0)

where

- (u0, v0) is the center of the image,

- dx and dy are the center-to-center (real) distances between pixels, which come from the camera's specs,

- μ is a scale factor learned from previous trials.

This version is for single-plane calibration.

Tsai's Procedure

1. Given the n point correspondences ((xi, yi, zi), (ui, vi)):

Compute matrix A with rows ai

ai = (vi·xi,  vi·yi,  −ui·xi,  −ui·yi,  vi)

These are known quantities, which will be used to solve for intermediate values that will then be used to solve for the parameters sought.

Intermediate Unknowns

2. The vector of unknowns is μ = (μ1, μ2, μ3, μ4, μ5):

μ1 = r11/ty,  μ2 = r12/ty,  μ3 = r21/ty,  μ4 = r22/ty,  μ5 = tx/ty

where the r's and t's are unknown rotation and translation parameters.

3. Let vector b = (u1, u2, …, un) contain the u image coordinates.

4. Solve the system of linear equations

A μ = b

for the unknown parameter vector μ.

Use μ to solve for ty, tx, and 4 rotation parameters

5. Let U = μ1² + μ2² + μ3² + μ4². Use U to calculate ty².

6. Try the positive square root ty = (ty²)^(1/2) and use it to compute translation and rotation parameters:

r11 = μ1·ty
r12 = μ2·ty
r21 = μ3·ty
r22 = μ4·ty
tx = μ5·ty
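Steps 1–6 can be sketched end-to-end on synthetic data. All the numbers below are made up for the check, and the closed-form expression for ty² in terms of U is the full-rank-case formula from Tsai's paper, since the slide defers that detail to the text:

```python
import numpy as np

def rot_x(a):
    return np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])

def rot_z(a):
    return np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])

# Hypothetical ground truth: rotation, translation with ty > 0, focal length.
R_true = rot_x(0.3) @ rot_z(0.2)
t_true = np.array([0.4, 2.0, 5.0])
f = 2.0

# Planar calibration points (z = 0) and their ideal real-image coordinates.
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, (20, 2)), np.zeros(20)])
cam = pts @ R_true.T + t_true
u = f * cam[:, 0] / cam[:, 2]
v = f * cam[:, 1] / cam[:, 2]

# Step 1: rows a_i = (v_i x_i, v_i y_i, -u_i x_i, -u_i y_i, v_i); step 3: b = u.
x, y = pts[:, 0], pts[:, 1]
A = np.column_stack([v * x, v * y, -u * x, -u * y, v])
mu = np.linalg.lstsq(A, u, rcond=None)[0]      # step 4: solve A mu = b

# Step 5: U, then ty^2 (closed form, full-rank case).
U = np.sum(mu[:4] ** 2)
d = mu[0] * mu[3] - mu[1] * mu[2]
ty2 = (U - np.sqrt(U ** 2 - 4 * d ** 2)) / (2 * d ** 2)

# Step 6: positive square root, then the first six parameters.
ty = np.sqrt(ty2)
r11, r12, r21, r22 = mu[:4] * ty
tx = mu[4] * ty
```

With noise-free correspondences the recovered r11, r12, r21, r22, tx, ty match the ground truth; the positive root happens to be correct here because the true ty was chosen positive.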

Now we know 2 translation parameters and 4 rotation parameters

excepthellip

Determine the true sign of ty and compute the remaining rotation parameters

7. Select an object point P whose image coordinates (u, v) are far from the image center.

8. Use P's coordinates and the translation and rotation parameters computed so far to estimate the image point that corresponds to P.

If its coordinates have the same signs as (u, v), then keep ty; else negate it.

9. Use the first 4 rotation parameters to calculate the remaining 5.

Calculating the remaining 5 rotation parameters

Solve another linear system
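As a sketch of step 9, the first two rows of R can be completed from the four known entries using the orthonormality of a rotation matrix. The ground-truth R here is arbitrary, and the positive root for r13 is an assumption — in practice its sign must be fixed using the constraints in the text (orthonormality and det(R) = +1):

```python
import numpy as np

def rot_x(a):
    return np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])

def rot_z(a):
    return np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])

R_true = rot_z(0.2) @ rot_x(0.3)       # ground truth for the check
r11, r12 = R_true[0, 0], R_true[0, 1]
r21, r22 = R_true[1, 0], R_true[1, 1]  # the four known parameters

r13 = np.sqrt(1.0 - r11**2 - r12**2)   # row 1 is unit length (+ root assumed)
r23 = -(r11 * r21 + r12 * r22) / r13   # rows 1 and 2 are orthogonal
row1 = np.array([r11, r12, r13])
row2 = np.array([r21, r22, r23])
row3 = np.cross(row1, row2)            # row 3 completes a right-handed frame
R = np.vstack([row1, row2, row3])
```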

10. We have tx and ty and the 9 rotation parameters. The next step is to find tz and f.

Form a matrix A′ whose rows are

a′i = (r21·xi + r22·yi + ty,  −vi)

and a vector b′ whose rows are

b′i = (r31·xi + r32·yi)·vi

11. Solve A′·v = b′ for v = (f, tz).
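Steps 10–11 as a sketch on synthetic data (R, the translation, and f are made up). The second entry of each row a′i is −vi, which follows from v·(r31x + r32y + tz) = f·(r21x + r22y + ty) for planar points with z = 0:

```python
import numpy as np

def rot_x(a):
    return np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])

R = rot_x(0.3)                         # assume rotation already recovered
ty, f_true, tz_true = 2.0, 2.0, 5.0

rng = np.random.default_rng(1)
x, y = rng.uniform(-1, 1, (2, 12))     # planar calibration points (z = 0)
Yc = R[1, 0] * x + R[1, 1] * y + ty
v = f_true * Yc / (R[2, 0] * x + R[2, 1] * y + tz_true)

# Rows a'_i = (r21 x_i + r22 y_i + ty, -v_i), b'_i = (r31 x_i + r32 y_i) v_i
A2 = np.column_stack([Yc, -v])
b2 = (R[2, 0] * x + R[2, 1] * y) * v
f_est, tz_est = np.linalg.lstsq(A2, b2, rcond=None)[0]
```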

Almost there

12. If f is negative, change signs (see text).

13. Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations with nonlinear regression.

14. All parameters have been computed. Use them in 3D data acquisition systems.

We use them for general stereo

[Figure: general stereo configuration — a 3D point P projects through camera matrices B and C to image points P1 = (r1, c1) and P2 = (r2, c2); image axes (x1, y1) and (x2, y2); e1, e2.]

For a correspondence (r1, c1) in image 1 to (r2, c2) in image 2:

1. Both cameras were calibrated, so both camera matrices are known. From the two camera equations B and C we get 4 linear equations in 3 unknowns:

r1 − b14 = (b11 − b31·r1)x + (b12 − b32·r1)y + (b13 − b33·r1)z
c1 − b24 = (b21 − b31·c1)x + (b22 − b32·c1)y + (b23 − b33·c1)z

r2 − c14 = (c11 − c31·r2)x + (c12 − c32·r2)y + (c13 − c33·r2)z
c2 − c24 = (c21 − c31·c2)x + (c22 − c32·c2)y + (c23 − c33·c2)z

A direct solution using only 3 of the equations won't give reliable results.
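For completeness, here is the overdetermined 4×3 system solved by least squares on made-up camera matrices B and C (a sketch; the slides instead recommend the closest-approach construction that follows):

```python
import numpy as np

# Hypothetical camera matrices, each with the (3,4) entry normalized to 1.
B = np.array([[1.0, 0.2, 0.0, 0.5],
              [0.1, 1.0, 0.0, 0.2],
              [0.0, 0.1, 0.2, 1.0]])
Cm = np.array([[0.9, 0.0, 0.1, 0.3],
               [0.0, 1.1, 0.2, 0.1],
               [0.1, 0.0, 0.2, 1.0]])

P_true = np.array([1.0, 2.0, 3.0])

def project(M, P):
    h = M @ np.append(P, 1.0)          # (s*r, s*c, s)
    return h[0] / h[2], h[1] / h[2]

r1, c1 = project(B, P_true)
r2, c2 = project(Cm, P_true)

# Stack the 4 linear equations in (x, y, z); the constant terms b14, b24
# (and their counterparts in C) move to the right-hand side.
rows, rhs = [], []
for M, (rr, cc) in [(B, (r1, c1)), (Cm, (r2, c2))]:
    rows.append([M[0,0] - M[2,0]*rr, M[0,1] - M[2,1]*rr, M[0,2] - M[2,2]*rr])
    rhs.append(rr - M[0,3])
    rows.append([M[1,0] - M[2,0]*cc, M[1,1] - M[2,1]*cc, M[1,2] - M[2,2]*cc])
    rhs.append(cc - M[1,3])
P_est = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]
```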

Solve by computing the closest approach of the two skew rays.

[Figure: two skew rays — one from P1 with unit direction u1, one from Q1 with unit direction u2; V is the shortest connecting segment, P its midpoint.]

If the rays intersected perfectly in 3D, the intersection would be P. Instead, we solve for the shortest line segment connecting the two rays and let P be its midpoint.

V = (P1 + a1·u1) − (Q1 + a2·u2)

[(P1 + a1·u1) − (Q1 + a2·u2)] · u1 = 0
[(P1 + a1·u1) − (Q1 + a2·u2)] · u2 = 0
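The two orthogonality conditions above reduce to a 2×2 linear system in (a1, a2); a minimal sketch:

```python
import numpy as np

def midpoint_of_skew_rays(P1, u1, Q1, u2):
    """Midpoint of the shortest segment between rays P1 + a1*u1 and Q1 + a2*u2."""
    # V.u1 = 0 and V.u2 = 0 expand to:
    #   a1 (u1.u1) - a2 (u1.u2) = (Q1 - P1).u1
    #   a1 (u1.u2) - a2 (u2.u2) = (Q1 - P1).u2
    M = np.array([[u1 @ u1, -(u1 @ u2)],
                  [u1 @ u2, -(u2 @ u2)]])
    rhs = np.array([(Q1 - P1) @ u1, (Q1 - P1) @ u2])
    a1, a2 = np.linalg.solve(M, rhs)
    return 0.5 * ((P1 + a1 * u1) + (Q1 + a2 * u2))

# Two rays that nearly intersect at (1, 0, 0):
P = midpoint_of_skew_rays(np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]),
                          np.array([1.0, -1.0, 0.001]), np.array([0.0, 1.0, 0.0]))
```

The system is solvable whenever the rays are not parallel; for parallel rays the determinant vanishes and the midpoint is not unique.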

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsai's algorithm
  • Tsai's Procedure
  • Intermediate Unknowns
  • Use μ to solve for ty, tx, and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
Page 17: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Camera Model Recall there are 5 Different Frames of Reference

bull Object

bull World

bull Camera

bull Real Image

bull Pixel Image

yc

xc

zc

zwC

Wyw

xw

A

a

xf

yf

xp

yp

zppyramidobject

image

The Camera Model

How do we get an image point IP from a world point P

c11 c12 c13 c14

c21 c22 c23 c24

c31 c32 c33 1

s IPr

s IPc

s

Px

Py

Pz

1

=

imagepoint

camera matrix C worldpoint

Whatrsquos in C

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1 CP = T R WP2 FP = (f) CP

s FPx

s FPy

s FPz

s

1 0 0 00 1 0 00 0 1 00 0 1f 0

CPx

CPy

CPz

1

=

perspectivetransformation

imagepoint

3D point incamera

coordinates

Why is there not a scale factor here

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 18: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

The Camera Model

How do we get an image point IP from a world point P

c11 c12 c13 c14

c21 c22 c23 c24

c31 c32 c33 1

s IPr

s IPc

s

Px

Py

Pz

1

=

imagepoint

camera matrix C worldpoint

Whatrsquos in C

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1 CP = T R WP2 FP = (f) CP

s FPx

s FPy

s FPz

s

1 0 0 00 1 0 00 0 1 00 0 1f 0

CPx

CPy

CPz

1

=

perspectivetransformation

imagepoint

3D point incamera

coordinates

Why is there not a scale factor here

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 19: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates

1 CP = T R WP2 FP = (f) CP

s FPx

s FPy

s FPz

s

1 0 0 00 1 0 00 0 1 00 0 1f 0

CPx

CPy

CPz

1

=

perspectivetransformation

imagepoint

3D point incamera

coordinates

Why is there not a scale factor here

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 20: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Camera Calibration

bull In order work in 3D we need to know the parameters of the particular camera setup

bull Solving for the camera parameters is called calibration

yw

xwzw

W

yc

xc

zc

C

bull intrinsic parameters are of the camera device

bull extrinsic parameters are where the camera sits in the world

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5. Let U = μ1² + μ2² + μ3² + μ4². Use U to calculate ty²; in the general case (μ1μ4 - μ2μ3 ≠ 0),

   ty² = ( U - [U² - 4(μ1μ4 - μ2μ3)²]^(1/2) ) / ( 2(μ1μ4 - μ2μ3)² )

6. Try the positive square root ty = (ty²)^(1/2) and use it to compute translation and rotation parameters:

   r11 = μ1 · ty
   r12 = μ2 · ty
   r21 = μ3 · ty
   r22 = μ4 · ty
   tx  = μ5 · ty
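Steps 1 through 6 can be sketched with a numpy least-squares solve. The function name is illustrative, the correspondences in the test are synthetic (generated from a made-up pose), and ty² is computed from U via the quadratic D²·ty⁴ - U·ty² + 1 = 0 (D = μ1μ4 - μ2μ3), which follows from the orthonormality of R in the general case D ≠ 0:

```python
import numpy as np

def tsai_steps_1_to_6(xy, uv):
    """Recover r11, r12, r21, r22, tx, ty from planar correspondences.

    xy: (n, 2) calibration-plane coordinates (z = 0);
    uv: (n, 2) real-valued image coordinates."""
    x, y = xy[:, 0], xy[:, 1]
    u, v = uv[:, 0], uv[:, 1]
    # Step 1: rows a_i = (v*x, v*y, -u*x, -u*y, v); steps 3-4: solve A mu = b.
    A = np.column_stack([v * x, v * y, -u * x, -u * y, v])
    mu, *_ = np.linalg.lstsq(A, u, rcond=None)
    # Step 5: U, then ty^2 as the smaller root of the quadratic in ty^2.
    U = np.sum(mu[:4] ** 2)
    D = mu[0] * mu[3] - mu[1] * mu[2]
    ty2 = (U - np.sqrt(U ** 2 - 4 * D ** 2)) / (2 * D ** 2)
    ty = np.sqrt(ty2)                 # step 6: try the positive root
    r11, r12, r21, r22 = mu[:4] * ty
    tx = mu[4] * ty
    return r11, r12, r21, r22, tx, ty
```

For correspondences generated from a pose with ty > 0 the recovered values match the true ones; with ty < 0, the sign test of steps 7-8 would flip them.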

Now we know 2 translation parameters and 4 rotation parameters…

except that the sign of ty has not yet been determined.

Determine true sign of ty and compute remaining rotation parameters

7. Select an object point P whose image coordinates (u, v) are far from the image center.

8. Use P's coordinates and the translation and rotation parameters computed so far to estimate the image point that corresponds to P. If its coordinates have the same signs as (u, v), keep ty; else negate it.

9. Use the first 4 rotation parameters to calculate the remaining 5.

Calculating the remaining 5 rotation parameters

Solve another linear system

10. We now have tx, ty, and the 9 rotation parameters. The next step is to find tz and f. Form a matrix A′ whose rows are

    a′i = (r21·xi + r22·yi + ty, -vi)

and a vector b′ whose elements are

    b′i = (r31·xi + r32·yi) · vi

11. Solve A′v = b′ for v = (f, tz).
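Steps 10-11 as a least-squares solve; the function name is illustrative, R and ty are assumed already recovered, and the data in the test are synthetic:

```python
import numpy as np

def solve_f_tz(xy, v, R, ty):
    """Solve A'v = b' for (f, tz), given planar points (z = 0), the v image
    coordinates, the rotation matrix R, and the translation component ty."""
    x, y = xy[:, 0], xy[:, 1]
    # Rows a'_i = (r21*x_i + r22*y_i + ty, -v_i); elements b'_i = (r31*x_i + r32*y_i)*v_i.
    Ap = np.column_stack([R[1, 0] * x + R[1, 1] * y + ty, -v])
    bp = (R[2, 0] * x + R[2, 1] * y) * v
    (f, tz), *_ = np.linalg.lstsq(Ap, bp, rcond=None)
    return f, tz
```

Each row restates v = f·(r21x + r22y + ty) / (r31x + r32y + tz) as a linear equation in f and tz, which is why a plain least-squares solve suffices.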

Almost there

12. If f is negative, change signs (see text).

13. Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations with nonlinear regression.

14. All parameters have now been computed. Use them in 3D data acquisition systems.

We use them for general stereo

[Figure: general stereo setup. Two calibrated cameras with matrices B and C view a 3D point P, which projects to P1 = (r1, c1) in image 1 and P2 = (r2, c2) in image 2.]

For a correspondence (r1, c1) in image 1 to (r2, c2) in image 2:

1. Both cameras were calibrated, so both camera matrices B and C are known. From the two camera equations we get 4 linear equations in 3 unknowns:

   r1 = (b11 - b31·r1)x + (b12 - b32·r1)y + (b13 - b33·r1)z
   c1 = (b21 - b31·c1)x + (b22 - b32·c1)y + (b23 - b33·c1)z

   r2 = (c11 - c31·r2)x + (c12 - c32·r2)y + (c13 - c33·r2)z
   c2 = (c21 - c31·c2)x + (c22 - c32·c2)y + (c23 - c33·c2)z

A direct solution using only 3 of the equations won't give reliable results.

Solve by computing the closest approach of the two skew rays.

If the rays intersected perfectly in 3D, the intersection would be P. Instead, we solve for the shortest line segment V connecting the two rays and let P be its midpoint.

With ray 1 given by P1 + a1·u1 and ray 2 by Q1 + a2·u2, where u1 and u2 are the ray direction vectors:

   V = (P1 + a1·u1) - (Q1 + a2·u2)

   ((P1 + a1·u1) - (Q1 + a2·u2)) · u1 = 0
   ((P1 + a1·u1) - (Q1 + a2·u2)) · u2 = 0
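The two perpendicularity conditions reduce to a 2x2 linear system in a1 and a2. A sketch (the function name and the example rays are made up; parallel rays would make the system singular and need a special case):

```python
import numpy as np

def closest_approach_midpoint(P1, u1, Q1, u2):
    """Midpoint of the shortest segment between rays P1 + a1*u1 and Q1 + a2*u2."""
    P1, u1, Q1, u2 = (np.asarray(a, dtype=float) for a in (P1, u1, Q1, u2))
    # From V . u1 = 0 and V . u2 = 0 with V = (P1 + a1*u1) - (Q1 + a2*u2):
    #   a1*(u1.u1) - a2*(u2.u1) = (Q1 - P1).u1
    #   a1*(u1.u2) - a2*(u2.u2) = (Q1 - P1).u2
    M = np.array([[u1 @ u1, -(u2 @ u1)],
                  [u1 @ u2, -(u2 @ u2)]])
    rhs = np.array([(Q1 - P1) @ u1, (Q1 - P1) @ u2])
    a1, a2 = np.linalg.solve(M, rhs)
    # Midpoint of the segment between the two closest points.
    return ((P1 + a1 * u1) + (Q1 + a2 * u2)) / 2.0

# Two rays that happen to meet exactly at (1, 2, 3): the "midpoint" is the
# intersection itself.
P = closest_approach_midpoint([0, 0, 0], [1, 2, 3], [1, 2, 0], [0, 0, 1])
```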

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
Page 21: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Intrinsic Parameters

bull principal point (u0v0)

bull scale factors (dxdy)

bull aspect ratio distortion factor

bull focal length f

bull lens distortion factor (models radial lens distortion)

C

(u0v0)

f

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 22: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Extrinsic Parameters

bull translation parameters t = [tx ty tz]

bull rotation matrix

r11 r12 r13 0r21 r22 r23 0r31 r32 r33 00 0 0 1

R = Are there reallynine parameters

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 23: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Calibration Object

The idea is to snapimages at differentdepths and get alot of 2D-3D pointcorrespondences

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 24: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

The Tsai Procedure

bull The Tsai procedure was developed by Roger Tsai at IBM Research and is most widely used

bull Several images are taken of the calibration object yielding point correspondences at different distances

bull Tsairsquos algorithm requires n gt 5 correspondences

(xi yi zi) (ui vi)) | i = 1hellipn

between (real) image points and 3D points

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 25: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

In this version of Tsairsquos algorithm

bull The real-valued (uv) are computed from their pixel positions (rc)

u = dx (c-u0) v = -dy (r - v0)

where

- (u0v0) is the center of the image

- dx and dy are the center-to-center (real) distances between pixels and come from the camerarsquos specs

- is a scale factor learned from previous trials

This version is for single-plane calibration

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 26: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Tsairsquos Procedure

1 Given the n point correspondences ((xiyizi) (uivi))

Compute matrix A with rows ai

ai = (vixi viyi -uixi -uivi vi)

These are known quantities which will be used to solve for intermediate values which will then beused to solve for the parameters sought

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4. Solve the system of linear equations

Aμ = b

for the unknown parameter vector μ.
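Steps 1–4 amount to an ordinary overdetermined least-squares solve. A minimal NumPy sketch (function and variable names are my own, not from the slides; it assumes the calibration points lie on the z = 0 plane of the calibration object, which is what the row formula for ai requires):

```python
import numpy as np

def solve_intermediate(world_pts, image_pts):
    """Steps 1-4: build A row by row from the n correspondences
    ((x, y, z), (u, v)) and solve A mu = b for the intermediate
    unknowns mu = (r11/ty, r12/ty, r21/ty, r22/ty, tx/ty).
    Assumes all calibration points have z = 0."""
    A, b = [], []
    for (x, y, _z), (u, v) in zip(world_pts, image_pts):
        A.append([v * x, v * y, -u * x, -u * y, v])  # a_i from step 1
        b.append(u)                                  # b_i from step 3
    mu, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return mu
```

With exact, noise-free correspondences the residual is zero; with real measurements, least squares over many points averages out the noise.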

Use μ to solve for ty, tx, and 4 rotation parameters

5. Let U = μ1^2 + μ2^2 + μ3^2 + μ4^2. Use U to calculate ty^2:

ty^2 = [U − sqrt(U^2 − 4(μ1μ4 − μ2μ3)^2)] / [2(μ1μ4 − μ2μ3)^2]   (when μ1μ4 − μ2μ3 ≠ 0)

6. Try the positive square root, ty = (ty^2)^(1/2), and use it to compute translation and rotation parameters:

r11 = μ1·ty
r12 = μ2·ty
r21 = μ3·ty
r22 = μ4·ty
tx = μ5·ty

Now we know 2 translation parameters and 4 rotation parameters

except…
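Steps 5–6 in code. The slide's equation for ty^2 did not survive extraction; the formula below is the one usually given for this step of Tsai's method (it follows from the orthonormality of the rotation rows), so treat it as a reconstruction rather than the slide's exact text:

```python
import numpy as np

def recover_from_mu(mu):
    """Steps 5-6 sketch: recover ty (positive root; its true sign is
    fixed in steps 7-8), then tx and the first four rotation entries.
    The ty^2 formula is the standard one for Tsai's method."""
    m1, m2, m3, m4, m5 = mu
    U = m1**2 + m2**2 + m3**2 + m4**2
    d = m1 * m4 - m2 * m3              # 2x2 minor of the rotation block
    if abs(d) > 1e-12:
        ty2 = (U - np.sqrt(U**2 - 4 * d**2)) / (2 * d**2)
    else:
        # degenerate case: the 2x2 block has a zero minor; fall back to
        # a row of mu with nonzero entries (simplified here)
        ty2 = 1.0 / max(m1**2 + m2**2, m3**2 + m4**2)
    ty = np.sqrt(ty2)
    r11, r12, r21, r22 = m1 * ty, m2 * ty, m3 * ty, m4 * ty
    tx = m5 * ty
    return ty, tx, r11, r12, r21, r22
```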

Determine true sign of ty and compute remaining rotation parameters

7. Select an object point P whose image coordinates (u, v) are far from the image center.

8. Use P's coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P. If its coordinates have the same signs as (u, v), then keep ty; else negate it.

9. Use the first 4 rotation parameters to calculate the remaining 5.
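Step 9 works because a rotation matrix has orthonormal rows: each row is a unit vector, and the third row is the cross product of the first two. A sketch (names my own):

```python
import numpy as np

def complete_rotation(r11, r12, r21, r22):
    """Step 9 sketch: fill in the remaining 5 entries of the rotation
    matrix using orthonormality of its rows. The sign of r13 is
    ambiguous from r11, r12 alone; a full implementation checks the
    result against an image point (here we take the positive root)."""
    r13 = np.sqrt(max(0.0, 1.0 - r11**2 - r12**2))
    if r13 > 1e-12:
        # rows 1 and 2 are orthogonal: r11*r21 + r12*r22 + r13*r23 = 0
        r23 = -(r11 * r21 + r12 * r22) / r13
    else:
        r23 = np.sqrt(max(0.0, 1.0 - r21**2 - r22**2))
    row1 = np.array([r11, r12, r13])
    row2 = np.array([r21, r22, r23])
    row3 = np.cross(row1, row2)  # third row of a proper rotation matrix
    return np.vstack([row1, row2, row3])
```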

Calculating the remaining 5 rotation parameters

Solve another linear system

10. We have tx and ty and the 9 rotation parameters. The next step is to find tz and f.

Form a matrix A′ whose rows are

a′i = (r21·xi + r22·yi + ty, −vi)

and a vector b′ whose elements are

b′i = (r31·xi + r32·yi)·vi

11. Solve A′v = b′ for v = (f, tz).
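Steps 10–11 are a second small least-squares solve. A sketch under the same planar-target (z = 0) assumption, with names my own:

```python
import numpy as np

def solve_f_tz(world_pts, image_pts, R, ty):
    """Steps 10-11: solve A'v = b' for v = (f, tz), given the full
    rotation matrix R and ty. The -v in the second column makes
    f*(r21*x + r22*y + ty) - tz*v equal v*(r31*x + r32*y)."""
    A, b = [], []
    for (x, y, _z), (_u, v) in zip(world_pts, image_pts):
        A.append([R[1, 0] * x + R[1, 1] * y + ty, -v])
        b.append((R[2, 0] * x + R[2, 1] * y) * v)
    (f, tz), *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return f, tz
```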

Almost there

12. If f is negative, change signs (see text).

13. Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations via nonlinear regression.

14. All parameters have been computed. Use them in 3D data acquisition systems.

We use them for general stereo

[Figure: a 3D point P projects to P1 = (r1, c1) in image 1 and P2 = (r2, c2) in image 2; the two cameras have matrices B and C, with epipoles e1 and e2.]

For a correspondence (r1, c1) in image 1 to (r2, c2) in image 2

1. Both cameras were calibrated, so both camera matrices, B and C, are known. From the two camera equations we get 4 linear equations in 3 unknowns:

r1 = (b11 − b31·r1)x + (b12 − b32·r1)y + (b13 − b33·r1)z
c1 = (b21 − b31·c1)x + (b22 − b32·c1)y + (b23 − b33·c1)z

r2 = (c11 − c31·r2)x + (c12 − c32·r2)y + (c13 − c33·r2)z
c2 = (c21 − c31·c2)x + (c22 − c32·c2)y + (c23 − c33·c2)z

A direct solution using only 3 of the equations won't give reliable results.

Solve by computing the closest approach of the two skew rays

[Figure: two skew rays P1 + a1·u1 and Q1 + a2·u2 with unit direction vectors u1 and u2; V is the shortest segment connecting them.]

If the rays intersected perfectly in 3D, the intersection would be P. Instead, we solve for the shortest line segment connecting the two rays and let P be its midpoint.

V = (P1 + a1·u1) − (Q1 + a2·u2)

Requiring V to be perpendicular to both rays gives two equations in the unknowns a1 and a2:

[(P1 + a1·u1) − (Q1 + a2·u2)] · u1 = 0
[(P1 + a1·u1) − (Q1 + a2·u2)] · u2 = 0
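Expanding the two perpendicularity conditions gives a 2×2 linear system in a1 and a2. A sketch (names my own):

```python
import numpy as np

def triangulate_midpoint(P1, u1, Q1, u2):
    """Midpoint of the shortest segment between the rays P1 + a1*u1
    and Q1 + a2*u2. Solves V . u1 = 0 and V . u2 = 0, where
    V = (P1 + a1*u1) - (Q1 + a2*u2), for a1 and a2."""
    P1, u1, Q1, u2 = (np.asarray(v, dtype=float) for v in (P1, u1, Q1, u2))
    d = Q1 - P1
    # substituting V into the two dot products gives:
    #   a1*(u1.u1) - a2*(u1.u2) = d.u1
    #   a1*(u1.u2) - a2*(u2.u2) = d.u2
    M = np.array([[u1 @ u1, -(u1 @ u2)],
                  [u1 @ u2, -(u2 @ u2)]])
    a1, a2 = np.linalg.solve(M, np.array([d @ u1, d @ u2]))
    return 0.5 * ((P1 + a1 * u1) + (Q1 + a2 * u2))
```

For rays that actually intersect, the midpoint coincides with the intersection point; M is singular only when the two rays are parallel.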

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 27: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Intermediate Unknowns

2 The vector of unknowns is = (1 2 3 4 5)

1=r11ty 2=r12ty 3=r21ty 4=r22ty 5=txty

where the rrsquos and trsquos are unknown rotation and translation parameters

3 Let vector b = (u1u2hellipun) contain the u image coordinates

4 Solve the system of linear equations

A = b

for unknown parameter vector

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 28: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Use to solve for ty tx and 4 rotation parameters

5 Let U = 12 + 2

2 + 32 + 4

2 Use U to calculate ty

2

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 29: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

6 Try the positive square root ty = (t 2 ) and use it to compute translation and rotation parameters

12

r11 = 1 ty

r12 = 2 ty

r21 = 3 ty

r22 = 4 ty

tx = 5 ty

Now we know 2 translation parameters and4 rotation parameters

excepthellip

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 30: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Determine true sign of ty and computeremaining rotation parameters

7 Select an object point P whose image coordinates (uv) are far from the image center

8 Use Prsquos coordinates and the translation and rotation parameters so far to estimate the image point that corresponds to P

If its coordinates have the same signs as (uv) then keep ty else negate it

9 Use the first 4 rotation parameters to calculate the remaining 5

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 31: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Calculating the remaining 5 rotation parameters

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 32: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Solve another linear system

10 We have tx and ty and the 9 rotation parameters Next step is to find tz and f

Form a matrix Aacute whose rows are

aiacute = (r21xi + r22yi + ty vi)

and a vector bacute whose rows are

biacute = (r31xi + r32yi) vi

11 Solve Aacutev = bacute for v = (f tz)

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 33: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

Almost there

12 If f is negative change signs (see text)

13 Compute the lens distortion factor and improve the estimates for f and tz by solving a nonlinear system of equations by a nonlinear regression

14 All parameters have been computed Use them in 3D data acquisition systems

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 34: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

We use them for general stereo

P

P1=(r1c1)P2=(r2c2)

y1

y2

x1

x2

e1

e2

B

C

For a correspondence (r1c1) inimage 1 to (r2c2) in image 2

1 Both cameras were calibrated Both camera matrices are then known From the two camera equations B and C we get 4 linear equations in 3 unknowns

r1 = (b11 - b31r1)x + (b12 - b32r1)y + (b13-b33r1)zc1 = (b21 - b31c1)x + (b22 - b32c1)y + (b23-b33c1)z

r2 = (c11 - c31r2)x + (c12 - c32r2)y + (c13 - c33r2)zc2 = (c21 - c31c2)x + (c22 - c32c2)y + (c23 - c33c2)z

Direct solution uses 3 equations wonrsquot give reliable results

Solve by computing the closestapproach of the two skew rays

V

If the rays intersected perfectly in 3D the intersection would be PInstead we solve for the shortest line segment connecting the two rays and let P be its midpoint

P1

Q1

Psolve forshortest

V = (P1 + a1u1) ndash (Q1 + a2u2)

(P1 + a1u1) ndash (Q1 + a2u2) u1 = 0(P1 + a1u1) ndash (Q1 + a2u2) u2 = 0

u1

u2

  • 3D Sensing
  • 3D Shape from X
  • Perspective Imaging Model 1D
  • Perspective in 2D (Simplified)
  • 3D from Stereo
  • Depth Perception from Stereo Simple Model Parallel Optic Axes
  • Resultant Depth Calculation
  • Finding Correspondences
  • 3 Main Matching Methods
  • Epipolar Geometry Constraint 1 Normal Pair of Images
  • Epipolar Geometry General Case
  • Constraints
  • Structured Light
  • Structured Light 3D Computation
  • Depth from Multiple Light Stripes
  • Our (former) System 4-camera light-striping stereo
  • Camera Model Recall there are 5 Different Frames of Reference
  • The Camera Model
  • The camera model handles the rigid body transformation from world coordinates to camera coordinates plus the perspective transformation to image coordinates
  • Camera Calibration
  • Intrinsic Parameters
  • Extrinsic Parameters
  • Calibration Object
  • The Tsai Procedure
  • In this version of Tsairsquos algorithm
  • Tsairsquos Procedure
  • Intermediate Unknowns
  • Use to solve for ty tx and 4 rotation parameters
  • Determine true sign of ty and compute remaining rotation parameters
  • Calculating the remaining 5 rotation parameters
  • Solve another linear system
  • Almost there
  • We use them for general stereo
  • For a correspondence (r1c1) in image 1 to (r2c2) in image 2
  • Solve by computing the closest approach of the two skew rays
  • Slide 37
  • Slide 38
  • Slide 39
  • Slide 40
  • Slide 41
  • Slide 42
  • Slide 43
  • Slide 44
  • Slide 45
  • Slide 46
  • Slide 47
  • Slide 48
  • Slide 49
Page 35: 3D Sensing 3D Shape from X Perspective Geometry Camera Model Camera Calibration General Stereo Triangulation 3D Reconstruction.

For a correspondence (r1,c1) in image 1 to (r2,c2) in image 2:

1. Both cameras were calibrated, so both camera matrices B and C are known. From the two camera equations we get 4 linear equations in 3 unknowns:

r1 = (b11 - b31 r1)x + (b12 - b32 r1)y + (b13 - b33 r1)z
c1 = (b21 - b31 c1)x + (b22 - b32 c1)y + (b23 - b33 c1)z
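Stacking all four equations and solving them together by least squares is a common way to use the full, overdetermined system. A sketch, assuming full 3x4 camera matrices B and C so that the constant terms (r·m34 - m14, etc.), which the simplified slide equations omit, appear on the right-hand side; the function name is mine:

```python
import numpy as np

def triangulate(B, C, r1, c1, r2, c2):
    """Least-squares 3D point (x, y, z) for a correspondence
    (r1,c1) <-> (r2,c2), given full 3x4 camera matrices B and C."""
    A, b = [], []
    for M, r, c in ((B, r1, c1), (C, r2, c2)):
        # (m11 - m31*r)x + (m12 - m32*r)y + (m13 - m33*r)z = r*m34 - m14
        A.append(M[0, :3] - r * M[2, :3]); b.append(r * M[2, 3] - M[0, 3])
        # (m21 - m31*c)x + (m22 - m32*c)y + (m23 - m33*c)z = c*m34 - m24
        A.append(M[1, :3] - c * M[2, :3]); b.append(c * M[2, 3] - M[1, 3])
    xyz, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return xyz
```

This is an algebraic alternative to the closest-approach-of-skew-rays construction the slides use; both recover the same point when the measurements are noise-free.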

