Robot Vision: Control of robot motion from video. CMPUT 615/499, M. Jagersand.
Vision-Based Control (Visual Servoing)
(figure: initial image, desired image, user)
Vision-Based Control (Visual Servoing)
(figure sequence: current image features; desired image features)
Image-Space Error (u, v)
Current image features $s_t$; desired image features $s_d$:

$$e_t = s_t - s_d$$

One point (pixel coordinates):

$$E = \begin{bmatrix} y_u \\ y_v \end{bmatrix}^* - \begin{bmatrix} y_u \\ y_v \end{bmatrix}$$

Many points:

$$E = \begin{bmatrix} y_1 \\ \vdots \\ y_{16} \end{bmatrix}^* - \begin{bmatrix} y_1 \\ \vdots \\ y_{16} \end{bmatrix}_0$$
Conventional Robotics: motion command in base coordinates
• We focus on the geometric transforms

Problem: lots of coordinate frames to calibrate
• Camera: center of projection, different models
• Robot: base frame, end-effector frame, object
Image Formation is Nonlinear (camera frame)
Camera Motion to Feature Motion (camera frame)
Interaction matrix is a Jacobian.
This derivation is for one point.
Q. How to extend to more points?
Q. How many points needed?
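The two questions have standard answers: each point contributes one 2×6 block of the interaction matrix, stacking blocks for k points gives a 2k×6 matrix, and at least 3 non-collinear points are (generically) needed for full rank 6. A minimal NumPy sketch, using the standard point-feature interaction matrix for normalized image coordinates (u, v) with known depth Z (a textbook result, not taken verbatim from these slides):

```python
import numpy as np

def interaction_matrix(u, v, Z):
    """Standard 2x6 interaction matrix for one point feature
    (normalized image coordinates u, v; depth Z)."""
    return np.array([
        [-1.0 / Z, 0.0,      u / Z, u * v,       -(1 + u * u), v],
        [0.0,     -1.0 / Z,  v / Z, 1 + v * v,   -u * v,      -u],
    ])

def stacked_interaction(points):
    """Extend to more points: stack one 2x6 block per point -> (2k)x6."""
    return np.vstack([interaction_matrix(u, v, Z) for (u, v, Z) in points])

# Three non-collinear points give a 6x6 matrix (generically full rank;
# degenerate configurations exist, e.g. collinear points).
L = stacked_interaction([(0.1, 0.2, 1.0), (-0.3, 0.0, 2.0), (0.0, 0.4, 1.5)])
```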
Problem: lots of coordinate frames to calibrate
• Camera: center of projection, different models
• Robot: base frame, end-effector frame, object
Only covered one transform!
Other (easier) solution: image-based motion control
Motor-visual function: $y = f(x)$. Achieving 3D tasks via 2D image control.
Note: what follows will work for one or two (or 3..n) cameras, either fixed or on the robot. Here we will use two cameras.
Vision-Based Control
Image 1 Image 2
Image-based Visual Servoing
• Observed features: $y = [y_1\; y_2\; \ldots\; y_m]^T$
• Motor joint angles: $x = [x_1\; x_2\; \ldots\; x_n]^T$
• Local linear model: $\Delta y = J \Delta x$
• Visual servoing steps:
  1. Solve: $y^* - y_k = J \Delta x$
  2. Update: $x_{k+1} = x_k + \Delta x$
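The two steps fit in a few lines of NumPy. The damping gain and the example Jacobian below are illustrative additions (a gain below 1 is common practice, not part of the slide's equations):

```python
import numpy as np

def servo_step(J, y_k, y_star, x_k, gain=0.5):
    """One image-based servoing step under the local linear model dy = J dx."""
    # 1. Solve y* - y_k = J dx in the least-squares sense
    dx, *_ = np.linalg.lstsq(J, y_star - y_k, rcond=None)
    # 2. Update the joint angles (gain < 1 damps the step)
    return x_k + gain * dx

# illustrative Jacobian and feature vectors
J = np.array([[2.0, 0.0], [0.0, 1.0]])
x = servo_step(J, y_k=np.array([0.0, 0.0]),
               y_star=np.array([1.0, 1.0]), x_k=np.zeros(2))
```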
Find J, Method 1: Test movements along basis
• Remember: $J$ is an unknown $m \times n$ matrix
• Assume movements $\Delta x_1 = [1, 0, \ldots, 0]^T$, $\Delta x_2 = [0, 1, \ldots, 0]^T$, ..., $\Delta x_n = [0, 0, \ldots, 1]^T$
• Finite difference:

$$J = \begin{pmatrix} \frac{\partial f_1}{\partial x_1} & \cdots & \frac{\partial f_1}{\partial x_n} \\ \vdots & \ddots & \vdots \\ \frac{\partial f_m}{\partial x_1} & \cdots & \frac{\partial f_m}{\partial x_n} \end{pmatrix}, \qquad \hat{J} \approx \begin{pmatrix} \Delta y_1 & \Delta y_2 & \cdots & \Delta y_n \end{pmatrix}$$

(each column $\Delta y_i$ is the feature change observed for test move $\Delta x_i$)
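Method 1 in NumPy: move a small amount eps along each motor basis direction and record the feature change as one column of the estimate. The map f below is a hypothetical stand-in for the real motor-visual function:

```python
import numpy as np

def jacobian_finite_difference(f, x0, eps=1e-2):
    """Estimate J column by column: perturb each motor coordinate
    by eps and record the resulting feature change."""
    y0 = f(x0)
    m, n = y0.size, x0.size
    J = np.zeros((m, n))
    for i in range(n):
        dx = np.zeros(n)
        dx[i] = eps
        J[:, i] = (f(x0 + dx) - y0) / eps
    return J

# hypothetical linear motor-visual map, so the estimate is exact
f = lambda x: np.array([2.0 * x[0], x[0] + 3.0 * x[1]])
J = jacobian_finite_difference(f, np.zeros(2))
```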
Find J, Method 2: Secant Constraints
• Constraint along a line: $\Delta y = J \Delta x$
• Defines $m$ equations
• Collect $n$ arbitrary, but different, measures $\Delta y$
• Solve for $J$:

$$\begin{pmatrix} \Delta y_1^T \\ \Delta y_2^T \\ \vdots \\ \Delta y_n^T \end{pmatrix} = \begin{pmatrix} \Delta x_1^T \\ \Delta x_2^T \\ \vdots \\ \Delta x_n^T \end{pmatrix} J^T$$
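Method 2 as a least-squares problem: stacking the secant constraints row-wise gives $\Delta Y = \Delta X\, J^T$, solvable with one lstsq call. A sketch with a synthetic linear map standing in for real measurements:

```python
import numpy as np

def jacobian_from_secants(dX, dY):
    """dX: (k x n) test moves, dY: (k x m) observed feature changes.
    Each pair contributes m secant equations dy = J dx; stacked,
    dY = dX @ J.T, solved for J in the least-squares sense."""
    Jt, *_ = np.linalg.lstsq(dX, dY, rcond=None)
    return Jt.T

# synthetic ground truth for illustration
J_true = np.array([[2.0, 0.0], [1.0, 3.0]])
dX = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # arbitrary, different moves
dY = dX @ J_true.T
J = jacobian_from_secants(dX, dY)
```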
Find J, Method 3: Recursive Secant Constraints
• Based on an initial $J$ and one measure pair $(\Delta y, \Delta x)$
• Adjust $J$ s.t. $\Delta y = \hat{J}_{k+1} \Delta x$
• Rank-1 update:

$$\hat{J}_{k+1} = \hat{J}_k + \frac{(\Delta y - \hat{J}_k \Delta x)\, \Delta x^T}{\Delta x^T \Delta x}$$

• Consider rotated coordinates: the update is the same as finite differences for $n$ orthogonal moves
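The rank-1 (Broyden-style) update in NumPy; a sketch only, with shapes handled explicitly:

```python
import numpy as np

def broyden_update(J, dx, dy):
    """Rank-1 secant update: the returned J satisfies dy = J_new dx
    while changing the previous estimate as little as possible."""
    dx = dx.reshape(-1, 1)
    dy = dy.reshape(-1, 1)
    return J + (dy - J @ dx) @ dx.T / float(dx.T @ dx)

# illustrative values: starting from a zero estimate
J0 = np.zeros((2, 2))
dx = np.array([1.0, 0.0])
dy = np.array([2.0, 1.0])
J1 = broyden_update(J0, dx, dy)
```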
Achieving visual goals: Uncalibrated Visual Servoing
1. Solve for motion: $[y^* - y_k] = \hat{J} \Delta x$
2. Move robot joints: $x_{k+1} = x_k + \Delta x$
3. Read the actual visual move $\Delta y$
4. Update the Jacobian: $\hat{J}_{k+1} = \hat{J}_k + \frac{(\Delta y - \hat{J}_k \Delta x)\, \Delta x^T}{\Delta x^T \Delta x}$

Can we always guarantee when a task is achieved/achievable?
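The four steps combine into a complete uncalibrated servoing loop. A minimal sketch, assuming a motor-visual map f that the controller can only query through moves, and adding a damping gain (common in practice, not part of the slide):

```python
import numpy as np

def uncalibrated_servo(f, x0, y_star, J0, gain=0.5, iters=100, tol=1e-9):
    """Solve / move / read / rank-1-update loop with a Jacobian estimate."""
    x, J = x0.copy(), J0.astype(float).copy()
    y = f(x)
    for _ in range(iters):
        e = y_star - y
        if np.linalg.norm(e) < tol:
            break
        dx, *_ = np.linalg.lstsq(J, gain * e, rcond=None)  # 1. solve
        x = x + dx                                         # 2. move
        y_new = f(x)                                       # 3. read
        dy = y_new - y
        J = J + np.outer(dy - J @ dx, dx) / (dx @ dx)      # 4. update
        y = y_new
    return x, y

# hypothetical motor-visual map; J0 is a deliberately wrong initial guess
f = lambda x: np.array([2.0 * x[0], x[0] + 3.0 * x[1]])
x, y = uncalibrated_servo(f, np.zeros(2), np.array([1.0, 1.0]), J0=np.eye(2))
```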
Visually guided motion control
Issues:
1. What tasks can be performed? Camera models, geometry, visual encodings
2. How to do vision-guided movement? Hand-eye transform estimation, feedback, feedforward motion control
3. How to plan, decompose and perform whole tasks?
How to specify a visual task?

Task and Image Specifications
Task function $T(x) = 0$; image encoding $E(y) = 0$
(figure: quadrants relating task-space error/satisfied to image-space error/satisfied)
Visual specifications
• Point-to-point task "error":

$$E = y^* - y_0$$

$$E = \begin{bmatrix} y_1 \\ \vdots \\ y_{16} \end{bmatrix}^* - \begin{bmatrix} y_1 \\ \vdots \\ y_{16} \end{bmatrix}_0$$

Why 16 elements?
Visual specifications 2
• Point to Line

$$E_{pl}(y; l) = \begin{bmatrix} y_l \cdot l_l \\ y_r \cdot l_r \end{bmatrix}, \qquad l_l = \begin{bmatrix} y_3 \\ y_4 \end{bmatrix} \times \begin{bmatrix} y_1 \\ y_2 \end{bmatrix}, \qquad y_l = \begin{bmatrix} y_5 \\ y_6 \end{bmatrix}$$

Note: $y$ in homogeneous coordinates.
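In homogeneous coordinates the line through two image points is their cross product, so the point-to-line error reduces to one dot product per camera. A sketch of one camera's term (a stereo task stacks one such term per view); the names and sample coordinates are illustrative:

```python
import numpy as np

def hom(p):
    """Lift a 2D image point to homogeneous coordinates."""
    return np.array([p[0], p[1], 1.0])

def point_to_line_error(y_pt, a, b):
    """Incidence error of point y_pt w.r.t. the line through a and b.
    The homogeneous line is the cross product of the two homogeneous
    points; the error is zero iff the three points are collinear."""
    l = np.cross(hom(a), hom(b))
    return hom(y_pt) @ l

# point on the line through (0,0) and (1,1):
e_on = point_to_line_error([2.0, 2.0], [0.0, 0.0], [1.0, 1.0])
# point off that line:
e_off = point_to_line_error([2.0, 0.0], [0.0, 0.0], [1.0, 1.0])
```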
Parallel Composition example

$$E_{wrench}(y) = \begin{bmatrix} y_4 - y_7 \\ y_2 - y_5 \\ y_8 \cdot (y_3 \times y_4) \\ y_6 \cdot (y_1 \times y_2) \end{bmatrix}$$

(plus e.p. checks)
Achieving visual goals: Uncalibrated Visual Servoing
1. Solve for motion: $[y^* - y_k] = \hat{J} \Delta x$
2. Move robot joints: $x_{k+1} = x_k + \Delta x$
3. Read the actual visual move $\Delta y$
4. Update the Jacobian: $\hat{J}_{k+1} = \hat{J}_k + \frac{(\Delta y - \hat{J}_k \Delta x)\, \Delta x^T}{\Delta x^T \Delta x}$

Can we always guarantee when a task is achieved/achievable?
Task Ambiguity
• Will the scissors cut the paper in the middle?

Task Ambiguity
• Will the scissors cut the paper in the middle? NO!

Task Ambiguity
• Is the probe contacting the wire?

Task Ambiguity
• Is the probe contacting the wire? NO!

Task Ambiguity
• Is the probe contacting the wire?

Task Ambiguity
• Is the probe contacting the wire? NO!
A task $T(f) = 0$ is invariant under a group $G$ of transformations iff

$$\forall f \in V^n,\; g \in G \text{ with } g(f) \in V^n: \quad T(f) = 0 \;\Rightarrow\; T(g(f)) = 0$$

If $T(f) = 0$ here, then $T(f)$ must be 0 there too, if $T$ is $G$-invariant.
(figure: the same point configuration transformed under the projective (proj), injective (inj), similarity (sim) and affine (aff) groups)
Task decidability and invariance

All achievable tasks... are ALL projectively distinguishable coordinate systems.

Task primitives:
• Tpp: point coincidence
• Tcol: collinearity
• Tcopl: coplanarity
• Tcr: cross ratios

(figure: lattice of point configurations — general position; one-point, 2-point, 3-point, 4-point and 2×2-point coincidence; collinearity; coplanarity; cross ratio — relating the invariance classes Cinj (injective) and Cwk (weak/projective))
Composing possible tasks

Task operators (with the convention $T = 0$ when satisfied):
• AND: $T_1 \wedge T_2 = T_1 + T_2$
• OR: $T_1 \vee T_2 = T_1 \cdot T_2$
• NOT: $\neg T = 1$ if $T = 0$, 0 otherwise

Example (wrench: view 1, view 2):

$$T_{wrench}(x_{1..8}) = T_{col}(x_2, x_5, x_6) \wedge T_{col}(x_4, x_7, x_8) \wedge T_{pp}(x_1, x_5) \wedge T_{pp}(x_3, x_7)$$

Result: Task toolkit
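With the convention that a task function returns 0 exactly when satisfied, the operators can be sketched directly. The sum/product/flip encoding below is one common reading (assuming nonnegative error values), not necessarily the slide's exact definitions:

```python
# Task functions: return 0 exactly when the task is satisfied.
# Assumes nonnegative error values, so a sum is zero iff both terms are.
def t_and(t1, t2):
    return lambda f: t1(f) + t2(f)          # satisfied iff both are

def t_or(t1, t2):
    return lambda f: t1(f) * t2(f)          # satisfied iff either is

def t_not(t):
    return lambda f: 1 if t(f) == 0 else 0  # flips satisfied/unsatisfied

# illustrative primitive tasks
t_ok = lambda f: 0     # always satisfied
t_bad = lambda f: 2.0  # never satisfied
```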
Serial Composition: solving whole real tasks
• Task primitive/"link": $A = (E_{init}; M; E_{final})$
  1. Acceptable initial (visual) conditions
  2. Visual or motor constraints to be maintained
  3. Final desired condition
• Task = $A_1 A_2 \ldots A_k$
"Natural" primitive links
1. Transportation: coarse primitive for large movements; ≤ 3 DOF control of the object centroid; robust to disturbances
2. Fine manipulation: for high-precision control of both position and orientation; 6 DOF control based on several object features
Example: pick-and-place type of movement
3. Alignment??? To match the transport final condition to the fine-manipulation initial condition
More primitives
4. Guarded move: move along some direction until an external constraint (e.g. contact) is satisfied.
5. Open-loop movements: when the object is obscured, or for ballistic fast movements. Note: these can be done based on previously estimated Jacobians.
Solving the puzzle…
• Visual servoing: the process of minimizing a visually-specified task by using visual feedback for motion control of a robot.
• Is it difficult? Yes.
  – Controlling the 6D pose of the end-effector from 2D image features.
  – Nonlinear projection, degenerate features, etc.
• Is it important? Of course.
  – Vision is a versatile sensor.
  – Many applications: industrial, health, service, space, humanoids, etc.
Conclusions
What environments and tasks are of interest?
• Not the previous examples; those could be solved "structured".
• We want to work in everyday unstructured environments.
Use in space applications
• Space station
• On-orbit service
• Planetary exploration
Power line inspection
Model landscape: Edmonton railroad society
• An electric UAV carrying a camera, flown over the model
• Simulation: a robot arm holds the camera

Use in Power Line Inspection
Some References
M. Spong, S. Hutchinson, M. Vidyasagar. Robot Modeling and Control. Chapter 12: Vision-Based Control. John Wiley & Sons: USA, 2006. (Text)
S. Hutchinson, G. Hager, P. Corke, "A tutorial on visual servo control," IEEE Trans. on Robotics and Automation, October 1996, vol. 12, no. 5, p. 651-670. (Tutorial)
F. Chaumette, S. Hutchinson, "Visual Servo Control, Part I: Basic Approaches," and "Part II: Advanced Approaches," IEEE Robotics and Automation Magazine 13(4):82-90, December 2006 and 14(1):109-118, March 2007. (Tutorial)
B. Espiau, F. Chaumette, P. Rives, "A new approach to visual servoing in robotics," IEEE Trans. on Robotics and Automation, June 1992, vol. 8, no. 6, p. 313-326. (Image-based)
W. J. Wilson, C. C. Williams Hulls, G. S. Bell, "Relative End-Effector Control Using Cartesian Position Based Visual Servoing," IEEE Transactions on Robotics and Automation, Vol. 12, No. 5, October 1996, pp. 684-696. (Position-based)
E. Malis, F. Chaumette, S. Boudet, "2 1/2 D visual servoing," IEEE Trans. on Robotics and Automation, April 1999, vol. 15, no. 2, p. 238-250. (Hybrid)
M. Jägersand, O. Fuentes, R. Nelson, "Experimental Evaluation of Uncalibrated Visual Servoing for Precision Manipulation," Proc. IEEE Intl. Conference on Robotics and Automation (ICRA), pp. 2874-2880, 20-25 Apr. 1997. (Uncalibrated, model-free)
F. Chaumette, "Potential problems of stability and convergence in image-based and position-based visual servoing," The Confluence of Vision and Control, LNCS Series, No. 237, Springer-Verlag, 1998, p. 66-78.