View Planning Candidacy Exam Paul Blaer December 15, 2003.


Page 1:

View Planning

Candidacy Exam

Paul Blaer

December 15, 2003

Page 2:

The View Planning Problem:

Find a set of sensor configurations to efficiently and accurately fulfill a reconstruction or inspection task.

Positions are often found sequentially, so this is sometimes called the Next Best View (NBV) problem.

Pages 3-7:

Tasks:

Inspection

Surveillance

3D Models of Smaller Objects

3D Models of Large Objects (such as buildings)

Mapping for Mobile Robots

Page 8:

View Planning Literature

1. Model-Based Methods
• Cowan and Kovesi, 1988
• Tarabanis and Tsai, 1992
• Tarabanis, et al., 1995
• Tarbox and Gottschlich, 1995
• Scott, Roth, and Rivest, 2001

2. Non-Model-Based Methods
• Volumetric Methods
– Connolly, 1985
– Banta, et al., 1995
– Massios and Fisher, 1998
– (Papadopoulos-Orfanos, 1997)
– (Soucy, et al., 1998)
• Surface-Based Methods
– Maver and Bajcsy, 1993
– (Yuan, 1995)
– Zha, et al., 1997
– Pito, 1999
– Reed and Allen, 2000
– Klein and Sequeira, 2000
• Whaite and Ferrie, 1997

3. Art Gallery Methods
• (Xie, et al., 1986)
• Gonzalez-Banos, et al., 1997
• Danner and Kavraki, 2000

4. View Planning for Mobile Robots
• Gonzalez-Banos, et al., 2000
• Grabowski, et al., 2003
• Nuchter, et al., 2003

Page 9:

Typical View Planning Constraints

Fundamental – Increase knowledge of the viewing volume.

Scanning – Ensure that the viewing volume can be scanned.

Overlap – Resample part of the object already scanned and be able to identify that part.

Tolerance – Sample the object with a minimum accuracy.

Self-Termination – The algorithm should determine on its own when the task is complete.

Computational Burden – The algorithm should be able to compute the NBV in a computationally feasible amount of time.

Other constraints:

Few assumptions.

Generalizable.

Page 10:

“Automatic Sensor Placement from Vision Task Requirements,” C. K. Cowan and P. D. Kovesi, 1988

Find camera viewpoints for inspecting a scene.

Requirements:

Resolution Constraint
Focus Constraint
Field of View Constraint
Visibility Constraint

A view surface is computed for each constraint, and the surfaces are intersected.

The constraints extend to laser scanners.

Page 11:

“The MVP Sensor Planning System for Robotic Vision Tasks,” K. A. Tarabanis, R. Y. Tsai, and P. K. Allen, 1995

Given a CAD model of the scene and the task requirements, compute views that fulfill the task.

Requirements:

Resolution Constraint
Focus Constraint
Field of View Constraint
Feature Visibility Constraint – solved in “Computing Occlusion-Free Viewpoints,” Tarabanis and Tsai, 1992

The requirements are written as inequalities, and an optimization procedure is run to maximize the quality of the viewpoints.
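A minimal sketch of the write-constraints-as-inequalities idea using SciPy's general-purpose optimizer; the target point, the distance bounds, and the quality metric below are hypothetical stand-ins for MVP's actual resolution, focus, and field-of-view requirements.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-ins for MVP-style task constraints, each written as
# g(v) >= 0 so that a feasible viewpoint v satisfies every inequality.
target = np.array([0.0, 0.0, 0.0])  # feature to be inspected (assumed)

def resolution_ok(v):
    # Be within 2.0 units of the feature so pixel resolution suffices (assumed bound).
    return 2.0 - np.linalg.norm(v - target)

def focus_ok(v):
    # Stay at least 0.5 units away so the feature lies in the depth of field (assumed bound).
    return np.linalg.norm(v - target) - 0.5

def neg_quality(v):
    # Quality to maximize: how frontal the view is (an assumed metric);
    # return the negative because scipy minimizes.
    d = (v - target) / np.linalg.norm(v - target)
    return -np.dot(d, np.array([0.0, 0.0, 1.0]))

constraints = [{"type": "ineq", "fun": resolution_ok},
               {"type": "ineq", "fun": focus_ok}]
result = minimize(neg_quality, x0=np.array([0.3, 0.3, 1.0]), constraints=constraints)
print("optimized viewpoint:", result.x)
```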

Page 12:

“Planning for Complete Sensor Coverage in Inspection,” G. H. Tarbox and S. N. Gottschlich, 1995

“View Planning for Multistage Object Reconstruction,” W. R. Scott, G. Roth, and J.-F. Rivest, 2001

Model-based approaches.

Both use a camera and a laser with a fixed baseline.

A measurability matrix, C(i,k), is computed.

Tarbox and Gottschlich:

The next view is based on glancing angles and “difficulty to view.”

Scott, Roth, and Rivest:

Similar, but they add an incremental process and a constraint on sensor measurement error.

$$C(i,k) = \begin{cases} 1 & \text{if object location } i \text{ is unoccluded from range camera location } k \\ 0 & \text{otherwise} \end{cases}$$
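A minimal sketch of filling such a measurability matrix, assuming discretized object points and camera positions, and a simple sphere-based occlusion test standing in for the CAD-model visibility computation the papers use.

```python
import numpy as np

def is_unoccluded(p, c, occluders):
    # Hypothetical occlusion test: the segment from camera position c to
    # object point p must not pass through any occluding sphere (center, radius).
    d = p - c
    for center, radius in occluders:
        t = np.clip(np.dot(center - c, d) / np.dot(d, d), 0.0, 1.0)
        if np.linalg.norm(center - (c + t * d)) < radius:
            return False
    return True

def measurability_matrix(object_points, camera_positions, occluders):
    """C[i, k] = 1 iff object location i is unoccluded from camera location k."""
    C = np.zeros((len(object_points), len(camera_positions)), dtype=int)
    for i, p in enumerate(object_points):
        for k, c in enumerate(camera_positions):
            C[i, k] = int(is_unoccluded(p, c, occluders))
    return C
```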

Page 13:

“The Determination of Next Best Views,” C. I. Connolly, 1985

“The ‘Best-Next-View’ Algorithm for Three-Dimensional Scene Reconstruction Using Range Images,” J. E. Banta, et al., 1995

Connolly:

A volumetric, model-building approach with no prior information.

The volume is stored as an octree, with regions labeled empty, object surface, or unknown.

A sphere around the object is discretized into viewpoints.

The NBV is selected by picking the viewpoint that sees the most unknown voxels.
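A minimal sketch of Connolly's selection step, assuming the octree has been flattened to labeled voxels and that a hypothetical `visible_voxels` ray-casting routine is available; the sphere discretization follows the slide's description.

```python
import numpy as np

UNKNOWN, EMPTY, SURFACE = 0, 1, 2  # voxel labels from the octree

def sphere_viewpoints(radius, n_theta=12, n_phi=6):
    """Discretize a sphere around the object into candidate viewpoints."""
    views = []
    for theta in np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False):
        for phi in np.linspace(0.1, np.pi - 0.1, n_phi):
            views.append(radius * np.array([np.sin(phi) * np.cos(theta),
                                            np.sin(phi) * np.sin(theta),
                                            np.cos(phi)]))
    return views

def next_best_view(voxel_labels, viewpoints, visible_voxels):
    """Pick the viewpoint that sees the most UNKNOWN voxels.

    visible_voxels(v) -> indices of voxels seen from v (ray casting, assumed).
    """
    return max(viewpoints,
               key=lambda v: sum(voxel_labels[i] == UNKNOWN
                                 for i in visible_voxels(v)))
```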

Banta, et al.:

Similar to Connolly but voxels are only labeled as occupied or unoccupied.

Views are chosen at points of maximum curvature on the object.

Page 14:

“Occlusions as a Guide for Planning the Next View,” J. Maver and R. Bajcsy, 1993

• Occlusion-based approach.
• No prior knowledge.
• Camera-laser triangulation system.
• The planning is done in two stages:
– Resolve occlusions caused by the laser stripe not being visible to the camera. Correct by rotating within the scanning plane.
– Resolve occlusions caused by the laser line not reaching parts of the scene. Correct by rotating the scanning plane itself.

Page 15:

More Occlusion-Based Methods

“Active Modeling of 3-D Objects: Planning on the Next Best Pose (NBP) for Acquiring Range Images,” H. Zha, K. Morooka, T. Hasegawa, and T. Nagata, 1997

The NBV is computed by maximizing a linear combination of three weighted functions:

• Extending constraint for covering unexplored regions.

• Overlapping constraint for registration.

• Smoothness constraint for registration.

“A Best Next View Selection Algorithm Incorporating a Quality Criterion,” N. A. Massios and R. B. Fisher, 1998

Voxels are partitioned as empty, unseen, seen, or occlusion plane. Occlusion planes are computed along jump edges. A quality criterion is based on the difference between the incident angle of the scanner and the normal of the voxel being scanned. NBVs are chosen to be in the direction of an occlusion plane and also to maximize the quality of the voxels being imaged.

Zha et al.’s weighted combination: $f(\cdot) = w_e f_e(\cdot) + w_o f_o(\cdot) + w_s f_s(\cdot)$
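A minimal sketch of scoring candidate poses with the weighted sum above; the three component functions and the weight values are placeholders for Zha et al.'s extending, overlapping, and smoothness terms, which the paper defines over the acquired range data.

```python
def nbp_score(pose, terms, weights=(0.5, 0.3, 0.2)):
    """Weighted linear combination of the three NBP criteria.

    terms: callables (f_e, f_o, f_s) scoring a pose for extension into
    unexplored regions, overlap for registration, and surface smoothness.
    The weights here are illustrative, not the paper's values.
    """
    f_e, f_o, f_s = terms
    w_e, w_o, w_s = weights
    return w_e * f_e(pose) + w_o * f_o(pose) + w_s * f_s(pose)

def next_best_pose(candidate_poses, terms):
    # The NBP is the candidate that maximizes the combined score.
    return max(candidate_poses, key=lambda pose: nbp_score(pose, terms))
```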

Page 16:

“A Solution to the Next Best View Problem for Automated Surface Acquisition,”

R. Pito, 1999

No prior knowledge of the object.

The void volume is represented by void patches on its boundary.

Observation rays are computed from the surface and projected into positional space (PS).

Potential ranging rays are projected into PS, and collinear observation rays are found.

The NBV is the scanner position that can view the greatest number of void patches while still viewing a threshold number of patches from the existing model.
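A minimal sketch of that selection rule, assuming discretized candidate positions and a visibility predicate `sees` standing in for Pito's positional-space ray intersection; the overlap threshold value is illustrative.

```python
def pito_next_best_view(candidates, void_patches, model_patches,
                        sees, overlap_threshold=10):
    """Pick the scanner position that views the most void patches while
    re-imaging at least overlap_threshold patches of the existing model.

    sees(position, patch) is an assumed visibility predicate standing in
    for Pito's intersection of observation rays and ranging rays.
    """
    best, best_count = None, -1
    for pos in candidates:
        overlap = sum(sees(pos, p) for p in model_patches)
        if overlap < overlap_threshold:
            continue  # not enough resampled surface for registration
        count = sum(sees(pos, p) for p in void_patches)
        if count > best_count:
            best, best_count = pos, count
    return best
```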

Page 17:

“Constraint-Based Sensor Planning for Scene Modeling,” M. K. Reed and P. K. Allen, 2000

Constructs solid models from range imagery; no prior knowledge about the object.

The surface is tessellated from the range data and extruded to the bounding box. Each surface is labeled as either imaged or occlusion.

The N largest targets by surface area are chosen, and the set of positions from which the sensor can image each target is computed (the imaging set). A set of occlusion constraints is also computed. A set of possible views is then obtained by subtracting the occlusion constraints from the imaging set, and the next view is chosen from that set.

A new range image is incorporated into the model by intersecting it with the current model.
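A minimal sketch of the subtraction step, assuming both the imaging set and the occlusion constraints have been discretized into sets of candidate viewpoints (the paper performs the equivalent subtraction on continuous sensor-position volumes).

```python
def possible_views(imaging_set, occlusion_constraints):
    """Viewpoints that can image the target minus those ruled out by occlusion.

    imaging_set: set of candidate viewpoints that see the target (assumed).
    occlusion_constraints: iterable of sets of viewpoints blocked by occlusions.
    """
    return set(imaging_set) - set().union(*occlusion_constraints)
```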

Page 18:

“Autonomous Exploration: Driven by Uncertainty,” P. Whaite and F. P. Ferrie, 1997

Autonomous exploration with a laser range scanner.

Approximates the target with superellipsoids.

Parameters are estimated and an uncertainty ellipse is found.

The NBV is selected in the direction of least certainty.

Restricted to a single superellipsoid.

The superellipsoid inside-outside function:

$$f(\mathbf{x}, \mathbf{a}) = \left( \left( \frac{x}{a_x} \right)^{2/e} + \left( \frac{y}{a_y} \right)^{2/e} \right)^{e/n} + \left( \frac{z}{a_z} \right)^{2/n}$$
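A minimal evaluation of that inside-outside function (f = 1 on the surface, f < 1 inside, f > 1 outside); the parameter names follow the reconstructed formula above.

```python
def superellipsoid(x, y, z, ax, ay, az, e, n):
    # Inside-outside function: f == 1 on the surface, f < 1 inside, f > 1 outside.
    # abs() keeps fractional powers well-defined for negative coordinates.
    xy = (abs(x / ax) ** (2.0 / e) + abs(y / ay) ** (2.0 / e)) ** (e / n)
    return xy + abs(z / az) ** (2.0 / n)

# The unit sphere is the special case ax = ay = az = 1, e = n = 1:
print(superellipsoid(1.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0, 1.0))  # -> 1.0 (on surface)
```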

Page 19:

“View Planning for the 3D Modeling of Real World Scenes,” K. Klein and V. Sequeira, 2000

No prior knowledge of the object being scanned.

The surface is represented as two meshes: a known mesh and a void mesh, which is the boundary between the known and unknown regions.

A cost-benefit ratio is computed:

Benefit: how close each viewed point is to its desired sampling density, and how much void volume is viewed.
Cost: how hard it is to get to that viewpoint (manually computed).

To calculate the quality function at a given viewpoint, the mesh is partially rendered onto a view cube.

The view with the best cost/benefit ratio that maintains an overlap of at least 20% with known regions is selected.
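A minimal sketch of that selection rule; the benefit, cost, and overlap functions are assumed callables over candidate viewpoints, standing in for the paper's view-cube rendering.

```python
def select_view(candidates, benefit, cost, overlap, min_overlap=0.2):
    """Best cost-benefit view subject to >= 20% overlap with known regions.

    benefit, cost, and overlap are assumed callables over candidate
    viewpoints; min_overlap encodes the 20% requirement from the paper.
    """
    feasible = [v for v in candidates if overlap(v) >= min_overlap]
    if not feasible:
        return None
    return max(feasible, key=lambda v: benefit(v) / cost(v))
```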

Page 20:

“Randomized Planning for Short Inspection Paths,” T. Danner and L. E. Kavraki, 2000

Danner and Kavraki:

Extends Gonzalez-Banos, et al.’s (1997) randomized art gallery method to 3-D scenes.

The visibility volume of points on the surface is computed.

Random points within the volume are chosen.

Points are iteratively added to cover more of the surface.

An approximation of the traveling salesman problem (TSP) is used to connect the points and form the path.
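A minimal sketch of the two stages under simplifying assumptions: patches are hashable identifiers, `sample_viewpoint` draws random candidate views, `covers` is the visibility test, and a nearest-neighbor tour stands in for the paper's TSP approximation.

```python
import numpy as np

def greedy_cover(surface_patches, sample_viewpoint, covers, max_samples=1000):
    """Randomly sample viewpoints, keeping each one that sees still-uncovered
    surface, until the surface is covered or the sample budget runs out."""
    uncovered, views = set(surface_patches), []
    for _ in range(max_samples):
        if not uncovered:
            break
        v = sample_viewpoint()
        seen = {p for p in uncovered if covers(v, p)}
        if seen:
            views.append(v)
            uncovered -= seen
    return views

def nearest_neighbor_tour(views):
    """Cheap TSP approximation: repeatedly move to the closest remaining view."""
    if not views:
        return []
    pts = [np.asarray(v, dtype=float) for v in views]
    tour, remaining = [0], set(range(1, len(pts)))
    while remaining:
        nxt = min(remaining, key=lambda i: np.linalg.norm(pts[i] - pts[tour[-1]]))
        tour.append(nxt)
        remaining.remove(nxt)
    return [views[i] for i in tour]
```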

Page 21:

“Planning Robot Motion Strategies for Efficient Model Construction,” H. H. Gonzalez-Banos, et al., 2000

Goal: Construction of a 2D map of the environment

Uses a Sick laser range sensor

Takes a single scan and extracts polylines to represent the obstacles

The NBV is found by randomly picking locations in the free space and estimating how much new information will be gained.

The best location is chosen by maximizing the new information gained while minimizing the distance traveled.
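A minimal sketch of one way to combine the two objectives; the exponential distance penalty and the value of `lam` are illustrative assumptions, not the paper's exact utility function.

```python
import math

def nbv_utility(new_info, distance, lam=0.2):
    """Trade expected new information against travel cost (form assumed)."""
    return new_info * math.exp(-lam * distance)

def choose_next_view(candidates, estimate_new_info, travel_distance):
    # estimate_new_info and travel_distance are assumed callables per location.
    return max(candidates,
               key=lambda q: nbv_utility(estimate_new_info(q), travel_distance(q)))
```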

Page 22:

“Autonomous Exploration via Regions of Interest,” R. Grabowski, P. Khosla, and H. Choset, 2003

Goal: construct a 2D map of the environment with sonars.

Data is fused into an occupancy map.

Measurements with a low separation angle are highly coupled, so next best views are chosen at poses that are not highly coupled (i.e., with higher separation angles).

After a view is taken, the regions that can see the same feature, but from a different angle, are marked as regions of interest.
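A minimal sketch of the coupling measure, assuming the separation angle is the angle between the rays from two sensor poses to a common feature.

```python
import numpy as np

def separation_angle(pose_a, pose_b, feature):
    """Angle (radians) between the rays from two sensor poses to a feature.

    A low separation angle means the two measurements are highly coupled;
    the planner should prefer next views with higher separation angles.
    """
    da = np.asarray(feature, float) - np.asarray(pose_a, float)
    db = np.asarray(feature, float) - np.asarray(pose_b, float)
    cos = np.dot(da, db) / (np.linalg.norm(da) * np.linalg.norm(db))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))
```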

Page 23:

“Planning Robot Motion for 3D Digitalization of Indoor Environments,” A. Nuchter, H. Surmann, J. Hertzberg, 2003

Goal: construct a 3D model of the environment with a mobile robot.

Uses a pair of Sick laser scanners.

Scans the ground plane and extracts straight lines, then adds “unseen lines” to close these lines off into a polygon that bounds the free space.

The NBV is chosen by randomly sampling views in the free space and evaluating how much of the unseen lines each candidate can view.

Views at a great distance and with a substantial change in angle are penalized.

Page 24:

Discussion

[Diagram: typical model acquisition steps]

Older methods relied on a fixed and known sensor work space.

Interest is moving toward mobile robot platforms and exploration of complex indoor and outdoor environments.

In complex exploration tasks, many problems become interrelated:

Localization

Mapping

Navigation and Path Planning

Sensor Planning

Page 25:

Open Problems and Future Research

Improve Efficiency – to help with the move toward larger scenes.

Improve Accuracy and Robustness – as we move toward more unstructured environments, sensor error will increase.

Develop Online Planning Methods – take into account not only the changing model but also the changing workspace of the sensor.

Multisensor Fusion Approaches – construct models from multiple inputs and plan views that take into account the constraints and benefits of more than a single sensor.