
Page 1: Automatic Image Alignment for 3D Environment Modeling

The UNIVERSITY of NORTH CAROLINA at CHAPEL HILL

Nathaniel Williams, Kok-Lim Low, Chad Hantak, Marc Pollefeys, Anselmo Lastra

Page 2: Motivation: Real World Models

Forensics

Historical Preservation

Education

Page 3: The Problem: Multiple Sensors

• Digital Camera: 2D color images
• Laser Scanner: 2D range map stores reflectance and depth

Page 4: The Problem: Alignment

• Manual alignment is very time consuming
  ♦ 5-10 minutes per image
• Modeling one room may require 10 scans and 100 images
• Multi-sensor alignment is difficult to automate
  ♦ Differences in sampling EM spectrum, illumination, occlusion, etc.

Page 5: Our Approach

• Obtain an initial estimate of the correct alignment

• Recast 2D to 3D registration into a fast 2D image-based process

• Refine the initial alignment by optimizing the chi-square test

Page 6: Previous Approaches

• Align medical images (e.g. CT, MR) by maximizing mutual information
  ♦ Viola & Wells [1995], Collignon et al. [1995], etc.
• Correlate edges in image & range map
  ♦ McAllister, Nyland, Popescu, Lastra, & McCue [1999]
• Align by comparing object silhouettes
  ♦ Lensch, Heidrich, & Seidel [2000]
• Global optimization of chi-square test
  ♦ Boughorbal et al. [1999, 2000]

Page 7: Data Acquisition

• Acquire range maps and color images of the environment
  ♦ Need more scans in complex scenes

• Annotate all data with initial estimates of the alignment

Page 8: Initial Pose Estimation [1]

• Constrain the sensors’ positions
  ♦ Rigidly mount camera above scanner
  ♦ Acquire from same center of projection

Page 9: Initial Pose Estimation [2]

• Track the sensors’ positions
  ♦ Use an optical tracker to measure the pose of the camera relative to the scanner (see the sketch below)
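
To make the geometry concrete, here is a minimal sketch of the pose composition this enables, assuming the tracker reports 4x4 homogeneous transforms for both sensors; the function and matrix names are illustrative, not the system's actual code:

```python
import numpy as np

def relative_pose(T_scanner_in_tracker: np.ndarray,
                  T_camera_in_tracker: np.ndarray) -> np.ndarray:
    """Pose of the camera expressed in the scanner's frame.

    Both inputs are 4x4 homogeneous transforms reported in the
    optical tracker's coordinate frame.
    """
    return np.linalg.inv(T_scanner_in_tracker) @ T_camera_in_tracker

# Toy example: identity scanner pose, camera translated 10 cm above it.
T_scanner = np.eye(4)
T_camera = np.eye(4)
T_camera[2, 3] = 0.10
print(relative_pose(T_scanner, T_camera))  # camera sits +0.10 m along z in the scanner frame
```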

Page 10: Camera & Tracker Calibration

• Calculate the orientation of the camera and scanner in the tracker’s coordinate frame
• Find the camera’s intrinsic parameters (a calibration sketch follows below)
  ♦ Tape the lens in place
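
The intrinsics step is, per reference [21], in the family of plane-based (Zhang-style) calibration. As a hedged illustration only, a minimal sketch using OpenCV's implementation of that approach; the checkerboard geometry, square size, and the calib/*.jpg image directory are assumptions, not details from the talk:

```python
import glob
import cv2
import numpy as np

# Checkerboard geometry (assumed: 9x6 inner corners, 25 mm squares).
pattern = (9, 6)
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 0.025

obj_pts, img_pts, size = [], [], None
for path in glob.glob("calib/*.jpg"):              # hypothetical image directory
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)
        size = gray.shape[::-1]

# Zhang-style calibration: recovers the 3x3 intrinsic matrix K
# and the lens distortion coefficients.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
print("reprojection RMS:", rms)
print("intrinsics:\n", K)
```

Taping the lens matters here because the recovered focal length and distortion are only valid while the lens settings stay fixed between calibration and capture.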

Page 11: Data Preprocessing

• Correct for image distortion (see the sketch after this list)
• Convert all range maps into a single polygonal model
  ♦ Texture map model with laser reflectance
• Simplify polygonal model
  ♦ Reduce millions of triangles by 99% or more
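
For the first bullet, a minimal undistortion sketch with OpenCV; the intrinsic matrix, distortion coefficients, and filenames below are illustrative placeholders rather than values from the authors' pipeline:

```python
import cv2
import numpy as np

# Intrinsics and distortion coefficients as estimated by calibration
# (illustrative values only).
K = np.array([[1500.0, 0.0, 960.0],
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.array([-0.12, 0.03, 0.0, 0.0, 0.0])   # k1, k2, p1, p2, k3

img = cv2.imread("color_0001.jpg")               # placeholder filename
undistorted = cv2.undistort(img, K, dist)        # remove radial/tangential lens distortion
cv2.imwrite("color_0001_undist.jpg", undistorted)
```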

Page 12: Multi-Sensor Data Alignment

• Recast 2D to 3D alignment into a fast 2D image-based process
• Visualize by projectively texture mapping the color image onto the model, given pose T (a projection sketch follows below)
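
Projective texture mapping amounts to projecting each model vertex through the camera at pose T and sampling the color image at the resulting pixel. Here is a minimal sketch of that projection using generic pinhole-camera math (not the authors' renderer); the matrix values and function names are illustrative:

```python
import numpy as np

def project(points_world: np.ndarray, T_world_to_cam: np.ndarray,
            K: np.ndarray) -> np.ndarray:
    """Project Nx3 world-space points to Nx2 pixel coordinates."""
    n = points_world.shape[0]
    homog = np.hstack([points_world, np.ones((n, 1))])     # Nx4 homogeneous points
    cam = (T_world_to_cam @ homog.T)[:3]                    # 3xN camera-space points
    pix = K @ cam                                           # apply intrinsics
    return (pix[:2] / pix[2]).T                             # perspective divide -> Nx2

# Toy usage: one point 2 m in front of an identity-pose camera.
K = np.array([[1500.0, 0.0, 960.0],
              [0.0, 1500.0, 540.0],
              [0.0, 0.0, 1.0]])
print(project(np.array([[0.0, 0.0, 2.0]]), np.eye(4), K))   # lands at the principal point
```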

Page 13: Image Comparison Framework

[Figure: the color image is converted into the reference image r by extracting intensity and down-sampling (performed once); the floating image f is extracted from the 3D model given pose T (performed often).]
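
As a hedged sketch of the "performed once" branch: intensity extraction and block-average down-sampling in plain NumPy. The luminance weights and down-sampling factor are assumptions, and rendering the floating image f from the model is left to the graphics pipeline and not shown:

```python
import numpy as np

def reference_image(rgb: np.ndarray, factor: int = 4) -> np.ndarray:
    """HxWx3 color image -> down-sampled intensity image (the reference r)."""
    intensity = rgb[..., :3] @ np.array([0.299, 0.587, 0.114])   # luminance-style weights
    h, w = intensity.shape
    h, w = h - h % factor, w - w % factor                        # crop to a multiple of factor
    blocks = intensity[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))                              # block-average down-sampling

r = reference_image(np.random.rand(1080, 1920, 3))               # placeholder for a real photo
print(r.shape)                                                    # (270, 480)
```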

Page 14: Chi-Square Test

• Statistical measure of dependence between random variables

• Estimate joint probability density from a joint histogram

[Figure: joint histogram over reference image and floating image intensities]
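
With $p_{ij}$ the joint density estimated from the histogram and $p_i$, $q_j$ its marginals, the statistic takes the usual form $\chi^2 = \sum_{i,j} (p_{ij} - p_i q_j)^2 / (p_i q_j)$: it is zero for independent images and grows with dependence. A minimal NumPy sketch, assuming simple equal-width binning (the exact histogramming used in the paper is not specified here):

```python
import numpy as np

def chi_square(ref: np.ndarray, flt: np.ndarray, bins: int = 32) -> float:
    """Chi-square dependence measure between two equally sized intensity images."""
    joint, _, _ = np.histogram2d(ref.ravel(), flt.ravel(), bins=bins)
    p = joint / joint.sum()                                 # joint density estimate
    expected = np.outer(p.sum(axis=1), p.sum(axis=0))       # product of the marginals
    mask = expected > 0
    return float(((p[mask] - expected[mask]) ** 2 / expected[mask]).sum())

# Quick sanity check: an image is more dependent on itself than on noise.
a = np.random.rand(64, 64)
print(chi_square(a, a) > chi_square(a, np.random.rand(64, 64)))   # True
```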

Page 15: Optimization

• Powell’s multidimensional direction set method
  ♦ Performs line minimizations given an initial pose estimate and a set of search directions
• The optimization is unconstrained, but the search is local given good initial estimates

$\hat{T} = \arg\max_{T} \chi^2(r, f \mid T)$
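
A self-contained sketch of this optimization loop using SciPy's Powell implementation. To keep it runnable without a renderer, the 6-DOF pose is replaced by a toy 2D image offset; the images, offsets, and helper names are illustrative, not the paper's data or code:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, shift
from scipy.optimize import minimize

# Toy stand-in for the real problem: the "pose" is a 2D image offset and the
# "floating image" is the reference translated by an unknown amount. The real
# system optimizes a 6-DOF rigid pose and renders f from the textured 3D model.
rng = np.random.default_rng(0)
reference = gaussian_filter(rng.random((128, 128)), sigma=6)
observed = shift(reference, (3.0, -2.0), order=1, mode="nearest")

def chi_square(ref, flt, bins=32):
    # Same statistic as sketched under the Chi-Square Test slide.
    joint, _, _ = np.histogram2d(ref.ravel(), flt.ravel(), bins=bins)
    p = joint / joint.sum()
    expected = np.outer(p.sum(axis=1), p.sum(axis=0))
    mask = expected > 0
    return float(((p[mask] - expected[mask]) ** 2 / expected[mask]).sum())

def objective(offset):
    # Undo the candidate offset, then score dependence with the reference;
    # minimizing the negative chi-square maximizes the alignment measure.
    return -chi_square(reference, shift(observed, -offset, order=1, mode="nearest"))

# Powell's direction set method: derivative-free line minimizations,
# started from the (deliberately rough) initial estimate.
result = minimize(objective, x0=np.zeros(2), method="Powell")
print(result.x)   # typically lands near the true offset (3, -2)
```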

Page 16: Video of 3D Alignment

Page 17: Results

• UNC Laboratory Model + 2 color images
  ♦ Data captured from 3 different points of view
  ♦ 6D optimization: 344 iterations, 28.5 sec
  ♦ Rendering = 16%, Readback = 33%, Chi-square = 51%

[Figure panels: Image | Model | Model + 2 Images]

Page 18: Results

• Global optimization can fail on complicated scenes

[Figure: Monticello Library and UNC Laboratory scenes, with the correct alignment indicated]

Page 19: Conclusions

• Initial pose estimation improves the robustness of automatic alignment

• Acquiring data from a common COP
  ♦ No occlusion makes the alignment more robust
  ♦ Inflexible: camera is mounted on the scanner
  ♦ Inexpensive: requires a simple bracket

• Decoupling the sensors
  ♦ Flexible: collect more surface information
  ♦ Expensive: tracking sensors takes more effort

Page 20: Future Work

• Determine the ideal tracking method for initial alignment estimation
  ♦ Criteria: portability, accuracy, and expense

• Experiment with other information metrics and optimization schemes

• Investigate error sources
  ♦ Camera calibration, tracker calibration, etc.

• Implement image comparison on graphics hardware

Page 21: Acknowledgements

• Kurtis Keller and John Thomas (UNC)

• Rich Holloway and 3rdTech, Inc.
• The U.S. National Science Foundation

Page 22: The End

• Questions?

Page 23: References

[1] P. K. Allen, I. Stamos, A. Troccoli, B. Smith, M. Leordeanu, and Y. C. Hsu. 3D modeling of historic sites using range and image data. In Proceedings of the International Conference on Robotics and Automation (ICRA), May 2003.
[2] F. Boughorbal, D. L. Page, C. Dumont, and M. A. Abidi. Registration and integration of multi-sensor data for photorealistic scene reconstruction. In Proceedings of the SPIE Conference on Applied Imagery Pattern Recognition, Oct. 1999.
[3] C. Buehler, M. Bosse, L. McMillan, S. J. Gortler, and M. F. Cohen. Unstructured lumigraph rendering. In Proceedings of ACM SIGGRAPH 2001, Computer Graphics Proceedings, Annual Conference Series, pages 425–432, Aug. 2001.
[4] W.-C. Chen, L. Nyland, A. Lastra, and H. Fuchs. Acquisition of large-scale surface light fields. In Proceedings of the SIGGRAPH 2003 Conference on Sketches & Applications, 2003.
[5] R. O. Duda, P. E. Hart, and D. G. Stork. Pattern Classification (2nd Edition). Wiley-Interscience, 2000.
[6] M. D. Elstrom. A stereo-based technique for the registration of color and ladar images. Master's thesis, University of Tennessee, Knoxville, Aug. 1998.
[7] H. Lensch, W. Heidrich, and H.-P. Seidel. Automated texture registration and stitching for real world models. In Proceedings of Pacific Graphics 2000, pages 317–326, Oct. 2000.
[8] K.-L. Low. Calibrating the HiBall wand. Technical Report TR02-018, Department of Computer Science, University of North Carolina at Chapel Hill, Apr. 2002.
[9] F. Maes, A. Collignon, D. Vandermeulen, G. Marchal, and P. Suetens. Multimodality image registration by maximization of mutual information. IEEE Transactions on Medical Imaging, 16:187–198, Apr. 1997.
[10] D. McAllister, A. Lastra, and W. Heidrich. Efficient rendering of spatial bi-directional reflectance distribution functions. In Proceedings of the 17th Eurographics/SIGGRAPH Workshop on Graphics Hardware, pages 79–88, Sept. 2002.
[11] D. McAllister, L. Nyland, V. Popescu, A. Lastra, and C. McCue. Real-time rendering of real world environments. In Proceedings of the 10th Eurographics Rendering Workshop, pages 145–160, 1999.
[12] L. Nyland. Capturing dense environmental range information with a panning, scanning laser rangefinder. Technical Report TR98-039, Department of Computer Science, University of North Carolina at Chapel Hill, Jan. 1999.
[13] W. H. Press, B. P. Flannery, S. A. Teukolsky, and W. T. Vetterling. Numerical Recipes in C: The Art of Scientific Computing. Cambridge University Press, Cambridge (UK) and New York, 2nd edition, 1992.
[14] M. Sallinen and T. Heikkilä. A simple hand-eye calibration method for a 3D laser range sensor. In Advances in Networked Enterprises, pages 421–430, 2000.
[15] I. Stamos and P. K. Allen. Automatic registration of 2-D with 3-D imagery in urban environments. In Proceedings of the Eighth International Conference on Computer Vision (ICCV-01), pages 731–737, July 2001.
[16] E. Trucco and A. Verri. Introductory Techniques for 3-D Computer Vision. Prentice Hall, 1998.
[17] P. Viola and W. M. Wells III. Alignment by maximization of mutual information. In Proceedings of the 5th International Conference on Computer Vision, pages 16–23, 1995.
[18] R. Wang and D. Luebke. Efficient reconstruction of indoor scenes with color. In Proceedings of the 4th International Conference on 3D Imaging and Modeling (3DIM), 2003.
[19] G. Welch, G. Bishop, L. Vicci, S. Brumback, K. Keller, and D. Colucci. The HiBall tracker: high-performance wide-area tracking for virtual and augmented environments. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology, 1999.
[20] S. You, U. Neumann, and R. Azuma. Orientation tracking for outdoor augmented reality registration. IEEE Computer Graphics and Applications, 19(6):36–42, Nov./Dec. 1999.
[21] Z. Zhang. Flexible camera calibration by viewing a plane from unknown orientations. In Proceedings of the 7th IEEE International Conference on Computer Vision (ICCV), pages 666–673, 1999.