HAL Id: inria-00352164
https://hal.inria.fr/inria-00352164
Submitted on 12 Jan 2009

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.

To cite this version: Youcef Mezouar, François Chaumette. Path planning in image space for robust visual servoing. IEEE Int. Conf. on Robotics and Automation, ICRA'00, San Francisco, California, 3, pp. 2759-2764, 2000. <inria-00352164>


Path Planning in Image Space for Robust Visual Servoing

Youcef Mezouar ([email protected])   François Chaumette ([email protected])

IRISA - INRIA Rennes
Campus de Beaulieu, 35042 Rennes Cedex, France

Abstract

Vision feedback control loop techniques are efficient for a great class of applications, but they come up against difficulties when the initial and desired positions of the camera are distant. In this paper we propose a new approach to resolve these difficulties by planning trajectories in the image. Constraints such that the object remains in the camera field of view can thus be taken into account. Furthermore, using this process, the current measurements always remain close to their desired values, and a control by image-based servoing ensures robustness with respect to modeling errors. We apply our method whether the object dimensions are known or not, and/or whether the calibration parameters of the camera are well or badly estimated. Finally, real-time experimental results using a camera mounted on the end effector of a six d.o.f. robot are presented.

1 Introduction

Visual servoing is classified into two main approaches [15, 6, 8]. The first one is called Position-based Control (PbC) or 3D visual servoing. In PbC the control error function is computed in the Cartesian space. Image features are extracted from the image, and a perfect model of the target is used to determine its position with respect to the camera frame. The main advantage of this approach is that it controls the camera trajectory directly in Cartesian space. However, there is no control in the image space, and the object may get out of the camera field of view during servoing. Furthermore, it is impossible to analytically demonstrate the stability of the system in the presence of modeling errors. Indeed, the sensitivity of pose estimation algorithms with respect to calibration errors and measurement perturbations is not available [2].

The second approach is called Image-based Control (IbC) or 2D visual servoing. In IbC the pose estimation is omitted and the control error function is computed in the image space. The IbC approach does not need precise calibration and modeling since a closed-loop scheme is performed. However, stability is theoretically ensured only in the neighborhood of the desired position. Therefore, if the initial and desired configurations are close, IbC is robust with respect to measurement and modeling errors. Otherwise, that is if the desired and initial positions are distant, stability is not ensured and the object can get out of the camera field of view [2]. Control laws taking this last constraint into account have been proposed, for example in [13, 12]. We propose in this paper a more robust approach.

A third approach, described in [11], is called 2 1/2 D visual servoing. In this case the control error function is computed in part in the Cartesian space and in part in the 2D image space. A homography, computed at each iteration, is used to extract the Cartesian part of the error function. Hence, this method does not need a model of the target. Contrarily to the previous approaches, it is possible to obtain analytical results about stability with respect to modeling and calibration errors. However, the main drawback of 2 1/2 D visual servoing is its relative sensitivity to measurement perturbations. Furthermore, keeping all of the object in the camera field of view is not obvious.

In this paper, a new method, robust and stable even if the initial and desired positions are distant, is described. The method consists in planning trajectories of a set of n points lying on the target in image space, and then tracking these trajectories by 2D visual servoing (see Figure 1). Using this process, the current measurements always remain close to their desired values. Thus the good behavior of IbC in such a configuration can be exploited. Moreover, it is possible to ensure that the object will always remain in the camera field of view by enforcing such a constraint on the trajectories.

There are few papers dealing with path planning in image space. In [7] a trajectory generator using a stereo system is proposed and applied to obstacle avoidance. In [14] an alignment task is realized using intermediate views of the target synthesized by image morphing. However, none of them deal with robustness issues. Our path planning strategy is based on the potential field method. This method was originally developed for on-line collision avoidance [9, 10]. In this approach the robot motions are under the influence of an artificial potential field V, defined as the sum of an attractive potential V_a pulling the robot toward the goal configuration Φ* and a repulsive potential V_r pushing the robot away from the obstacles. Motion planning is performed in an iterative fashion. At each iteration, an artificial force F(Φ), where the 6×1 vector Φ represents a parameterization of the robot workspace, is induced by the potential function. This force is defined as F(Φ) = −∇V(Φ), where ∇V(Φ) denotes the gradient vector of V at Φ. Using these conventions, F(Φ) can be decomposed as the sum of two vectors, F_a(Φ) = −∇V_a(Φ) and F_r(Φ) = −∇V_r(Φ), which are called the attractive and repulsive forces respectively. Path generation proceeds along the direction of F(Φ), and the discrete-time trajectory is given by the transition equation:

Φ_{k+1} = Φ_k + ε_k F(Φ_k) / ‖F(Φ_k)‖   (1)

where k is the increment index and ε_k a positive scaling factor denoting the length of the k-th increment.
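As an illustration, the iterative descent of equation (1) can be sketched as follows. This is a minimal sketch using only the attractive force; the gain, step length and tolerance are arbitrary choices, not the paper's values, and a full planner would add the repulsive force near the image limits.

```python
import numpy as np

# Sketch of the transition equation (1):
#   phi_{k+1} = phi_k + eps_k * F(phi_k) / ||F(phi_k)||
# with only the attractive force F_a = -kappa * (phi - phi_goal).

def attractive_force(phi, phi_goal, kappa=1.0):
    return -kappa * (phi - phi_goal)

def plan_path(phi_init, phi_goal, eps=0.05, tol=1e-6, max_iter=1000):
    phi = np.asarray(phi_init, dtype=float)
    path = [phi.copy()]
    for _ in range(max_iter):
        force = attractive_force(phi, phi_goal)
        norm = np.linalg.norm(force)
        if norm < tol:
            break
        step = min(eps, norm)          # shrink the final step to avoid oscillating
        phi = phi + step * force / norm
        path.append(phi.copy())
    return np.array(path)

path = plan_path(np.array([0.2, -0.1, 0.5, 0.1, 0.0, 0.3]), np.zeros(6))
```

The fixed-length step of (1) is capped by the remaining force norm here, so the iteration settles exactly at the goal instead of oscillating around it.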

The paper is organized as follows. We describe in Section 2 the method when a model of the target and the calibration of the camera are available. We present in Section 3 how we proceed if the object is planar but neither a model of the target nor an accurate calibration is available. In Section 4 we use the task function approach to track the trajectories. Experimental results are finally given in Section 5.

Figure 1: Block diagram of the method (trajectory planning from the initial and desired images, under constraints, produces the reference features s*(t), which are tracked by the control law from the extracted features s(t))

2 Known target

Here, we assume that the calibration parameters and a target model are available. The technique consists in planning a camera frame trajectory bringing it from the initial camera frame F_i (Φ = Φ_i) to the desired camera frame F_d (Φ = Φ*), and then projecting the target model in the image along the trajectory. Let ^dR_c, ^dt_c, u and θ be respectively the rotation matrix and the translation vector between the current camera frame F_c and F_d, and the rotation axis and angle obtained from ^dR_c. We choose as parameterization of the workspace Φ^T = [^dt_c^T  (uθ)^T]. We thus have Φ_i^T = [^dt_i^T  (uθ)_i^T] and Φ* = 0_{6×1}. Using a pose estimation algorithm [3], we can determine ^iR_o, ^it_o, ^dR_o and ^dt_o, which represent respectively the rotations and translations from the object frame F_o to F_i and F_o to F_d (see Figure 3). The vector Φ_i is then computed using the following relations:

^dR_i = ^dR_o ^iR_o^T
^dt_i = −^dR_i ^it_o + ^dt_o

According to (1), we construct a path as the sequence of successive path segments starting at the initial configuration Φ_i. We now present how the potential functions and the induced forces are defined and calculated.
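The pose composition above can be sketched numerically. The helper below assumes standard rotation-matrix conventions and extracts uθ with the usual matrix log map (valid for θ < π); variable names are illustrative, not the paper's code.

```python
import numpy as np

# Sketch: phi_i from the two estimated poses,
#   dR_i = dR_o @ iR_o.T,   dt_i = -dR_i @ it_o + dt_o,
# followed by extraction of the rotation vector u*theta from dR_i.

def axis_angle(R):
    """Rotation vector u*theta from a rotation matrix (log map, theta < pi)."""
    theta = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    if np.isclose(theta, 0.0):
        return np.zeros(3)
    u = np.array([R[2, 1] - R[1, 2],
                  R[0, 2] - R[2, 0],
                  R[1, 0] - R[0, 1]]) / (2.0 * np.sin(theta))
    return u * theta

def initial_parameterization(dR_o, dt_o, iR_o, it_o):
    dR_i = dR_o @ iR_o.T             # rotation from F_i to F_d
    dt_i = -dR_i @ it_o + dt_o       # translation from F_i to F_d
    return np.concatenate([dt_i, axis_angle(dR_i)])
```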

Attractive potential and force. The attractive potential field V_a is simply defined as a parabolic function, in order to minimize the distance between the current position and the desired one:

V_a(Φ) = (1/2) κ ‖Φ − Φ*‖² = (1/2) κ ‖Φ‖²

where κ is a positive scaling factor. The attractive force deriving from V_a is:

F_a(Φ) = −∇V_a(Φ) = −κΦ   (2)

Repulsive potential and force. A point P_j, which projects onto the camera image plane at a point with image coordinates s_j = [u_j v_j 1]^T, is observable by the camera if u_j ∈ [u_m, u_M] and v_j ∈ [v_m, v_M], where u_m, u_M, v_m, v_M are the limits of the image (see Figure 2). One way to create a potential barrier around the camera field of view, assuring that all features are always observable and do not affect the camera motion when they are sufficiently far away from the image limits, is to define the repulsive potential V_r as follows (see Figure 2):

V_r(s) = (1/2) η Σ_{j=1}^{n} [ 1/((u_j − u_m)(u_M − u_j)) + 1/((v_j − v_m)(v_M − v_j)) ]  if s ∈ S,
V_r(s) = 0  else.   (3)

where s is the vector made up of the coordinates u_j, v_j (j = 1…n), S is the set {s | ∃j : u_j ∈ [u_m, u_m+ρ] ∪ [u_M−ρ, u_M] or v_j ∈ [v_m, v_m+ρ] ∪ [v_M−ρ, v_M]}, ρ being a positive constant denoting the distance of influence of the image edges. The artificial repulsive force deriving from V_r is:

F_r(Φ) = −( ∂V_r(s)/∂Φ )^T = −( (∂V_r(s)/∂s) (∂s/∂r) (∂r/∂Φ) )^T


Figure 2: Repulsive potential (a barrier rises near the image limits u_m, u_M, v_m, v_M)

where r denotes the situation of the camera with respect to a reference frame. The previous equation can be written:

F_r(Φ) = −J_Φ^T L^T ( ∂V_r(s)/∂s )^T

where:
• L = ∂s/∂r is the image Jacobian (or interaction matrix) [4]. It relates the variation of the image features s to the velocity screw of the camera T_c: ṡ = L T_c. The well-known interaction matrix for a point P with coordinates (x, y, z) in the camera frame and coordinates s = (X, Y) in the image expressed in meters, for a one-meter focal length, is:

L(s, z) = [ −1/z    0     X/z    XY      −(1+X²)    Y
             0     −1/z   Y/z    1+Y²    −XY       −X ]

When s is composed of the image coordinates of n points, the corresponding interaction matrix is:

L(s, z) = [ L^T(s_1, z_1)  ⋯  L^T(s_n, z_n) ]^T   (4)

• J_Φ = ∂r/∂Φ is the 6×6 Jacobian matrix that relates the variation of r to the variation of Φ:

J_Φ = [ ^dR_c^T    0_{3×3}
        0_{3×3}    J_ω^{−1} ]

where [11]:

J_ω^{−1} = I_{3×3} − (θ/2) [u]_× + (1 − sinc(θ)/sinc²(θ/2)) [u]_×²

[u]_× being the antisymmetric matrix of the cross product associated with u.
• ∂V_r(s)/∂s is easily obtained according to (3).
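The point interaction matrix and its stacked form (4) can be sketched directly; the point coordinates used below are illustrative.

```python
import numpy as np

# Sketch of the classical point interaction matrix (one-meter focal length,
# image coordinates X, Y in meters, depth z), stacked for n points as in (4).

def interaction_matrix_point(X, Y, z):
    return np.array([
        [-1.0 / z, 0.0,       X / z, X * Y,       -(1.0 + X**2),  Y],
        [0.0,      -1.0 / z,  Y / z, 1.0 + Y**2,  -X * Y,        -X],
    ])

def interaction_matrix(points):
    """points: iterable of (X, Y, z); returns the stacked 2n x 6 matrix."""
    return np.vstack([interaction_matrix_point(X, Y, z) for X, Y, z in points])

L = interaction_matrix([(0.1, -0.2, 1.5), (0.0, 0.3, 2.0)])
```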

Let us note that, using (2), (3) and (1), we obtain a camera trajectory in the workspace. A PbC scheme could thus be used to follow it. However, it is more interesting to perform feature trajectories in the image, in order to exploit the good behavior of IbC when the current and desired camera positions are close.

2D trajectories. Let ^dR_k, ^dt_k and ^kR_o, ^kt_o be the rotations and translations mapping F_k to F_d and F_o to F_k, where F_k is the camera frame position at iteration k of the path planning. With these notations we have:

^kR_o = ^dR_k^T ^dR_o
^kt_o = ^dR_k^T ( ^dt_o − ^dt_k )

In order to perform visual servo control, we construct the trajectory of the projection s_j of each point P_j (j = 1…n) onto the image, using the known coordinates ^oX_j of P_j in F_o. The trajectory in the image is obtained using the classical assumption that the camera performs a perfect perspective transformation with respect to the camera optic center (pinhole model):

s_{j,k} = A [ ^kR_o  ^kt_o ] ^oX_j

where A is the matrix of camera intrinsic parameters. In the next part, we extend this method to the case where the target model is unknown.
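A minimal sketch of this pinhole projection step, with an assumed intrinsic matrix A and an illustrative pose and model point (not the paper's values):

```python
import numpy as np

# Pinhole projection s = A (R X + t), normalized by the homogeneous component.
# A, the pose and the model point below are illustrative values.

A = np.array([[800.0, 0.0, 160.0],
              [0.0, 800.0, 120.0],
              [0.0, 0.0, 1.0]])      # assumed intrinsic parameters

def project(A, R, t, X_o):
    """Pixel coordinates of a 3-D point X_o (object frame) for pose (R, t)."""
    p = A @ (R @ X_o + t)            # perspective projection
    return p[:2] / p[2]

s = project(A, np.eye(3), np.array([0.0, 0.0, 1.0]),
            np.array([0.05, -0.05, 0.0]))
```

Applying this to every model point at every planned pose (R_k, t_k) yields the image trajectories s_{j,k}.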

3 Unknown planar target

In this section, we assume that the target is planar but the target model is not available. After recalling the relations between two views of a planar target, we present the method with accurate calibration parameters, and then we prove its robustness with respect to calibration errors.

3.1 Euclidean reconstruction

Consider a reference plane π given in the desired camera frame F_d by the vector π^T = [n*^T  −d*], where n* is its unitary normal in F_d and d* the distance from π to the origin of F_d (see Figure 3). It is well known [5] that the projections of a point P_j lying on π in the current view, s_j = [u_j v_j 1]^T, and in the desired view, s*_j = [u*_j v*_j 1]^T, are linked by the projective relation:

α_j s_j = G s*_j   (5)

where G is a projective homography, expressed in pixels, of the plane π between the current and desired images, and α_j a scaling factor. We can estimate it from a set of N ≥ 8 points (three points defining π) in the general case, or from a set of N ≥ 4 points belonging to π [11, 5]. Assuming that the camera calibration is known, the Euclidean homography H is computed as follows:

H = A^{−1} G A   (6)

The matrix H can be decomposed using the motion parameters between F_d and F_c [5]:

H = ^cR_d + ( ^ct_d / d* ) n*^T = ^dR_c^T − ^dR_c^T t_{d*} n*^T   (7)

where t_{d*} = ^dt_c / d*.


From H it is possible to compute ^dR_c, t_{d*} = ^dt_c/d* and n*, using for example the algorithm presented in [5]. The ratio γ_j between the coordinate z_j of a point lying on π, with respect to the camera frame, and d*, which we will use in the sequel, can also be determined [11]:

γ_j = z_j / d* = ( 1 − n*^T t_{d*} ) / ( n*^T ^dR_c A^{−1} s_j )   (8)

Figure 3: Euclidean reconstruction (the reference plane π, with normal n* at distance d* from the desired frame; the initial, current and desired camera frames; the object frame; and a target point p)

3.2 Trajectory planning

We now choose the partial parameterization of the workspace as Φ^T = [t_{d*}^T  (uθ)^T]. We thus have Φ_i^T = [t_{d*,i}^T  (uθ)_i^T] and Φ* = 0_{6×1}. From the initial and desired images, it is possible to compute the homography H_i and then to obtain ^dR_i, t_{d*,i} = ^dt_i/d*, n*, and thus Φ_i. As in the previous section, we construct a path starting at Φ_i and oriented along the induced forces given by:

F_a(Φ) = −κΦ,   F_r(Φ) = −J_Φ^T(d*) L^T(s, d*) ( ∂V_r(s)/∂s )^T

According to (4) and (8), L(s, d*) can be written:

L(s, d*) = [ (1/d*) L_t   L_ω ]   (9)

where L_t = [L_{t1}^T ⋯ L_{tn}^T]^T and L_ω = [L_{ω1}^T ⋯ L_{ωn}^T]^T are two 2n×3 matrices independent of d*:

L_{tj} = [ −1/γ_j    0       X_j/γ_j
            0       −1/γ_j   Y_j/γ_j ]

L_{ωj} = [ X_j Y_j    −(1+X_j²)    Y_j
           1+Y_j²    −X_j Y_j    −X_j ]

The Jacobian matrix J_Φ(d*) is given by:

J_Φ(d*) = [ d* ^dR_c^T    0_{3×3}
            0_{3×3}       J_ω^{−1} ]   (10)

Using the above equations, the vector Φ_k can be computed at each iteration, and from Φ_k the rotation matrix ^dR_k and the vector t_{d*,k} = ^dt_k/d* are obtained.

2D trajectories. The homography matrix H_k of the plane π relating the current and desired images can be computed from Φ_k using (7):

H_k = ^dR_k^T − ^dR_k^T t_{d*,k} n*^T

According to (5), the image coordinates of the points P_j belonging to π at iteration k are given by:

α_j s_{j,k} = [ α_j u_{j,k}   α_j v_{j,k}   α_j ]^T = G_k s*_j   (11)

where G_k = A H_k A^{−1} is the corresponding pixel homography. s_{j,k} is easily obtained by dividing α_j s_{j,k} by its last component; thus equation (11) allows us to obtain the trajectories in the image.
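Equation (11) amounts to applying the pixel homography to each desired point and renormalizing; a small sketch with illustrative values (the intrinsic matrix A is not the paper's camera):

```python
import numpy as np

# Sketch of (11): alpha_j * s_{j,k} = A H_k A^{-1} s*_j, then divide by the
# last homogeneous component to recover the planned pixel point s_{j,k}.

A = np.array([[800.0, 0.0, 160.0],
              [0.0, 800.0, 120.0],
              [0.0, 0.0, 1.0]])      # illustrative intrinsic parameters

def planned_point(A, H_k, s_star):
    p = A @ H_k @ np.linalg.inv(A) @ s_star   # alpha_j * s_{j,k}
    return p / p[2]                           # divide by the last component

s_k = planned_point(A, np.eye(3), np.array([100.0, 50.0, 1.0]))
```

With H_k = I (camera already at the desired pose) the planned point coincides with the desired point, as expected.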

Influence of d*. The parameter d* appears in the repulsive force only through the matrix M composed of the product of J_Φ^T(d*) and L^T(s, d*). According to (9) and (10) we have:

M = J_Φ^T(d*) L^T(s, d*) = [ ^dR_c L_t^T
                              J_ω^{−T} L_ω^T ]

This proves that M, and thus the trajectories in the image, are independent of the parameter d*.

Influence of intrinsic parameters. If the camera is not perfectly calibrated and Â is used instead of A, the estimated homography matrix is:

Ĥ_i = Â^{−1} A H_i A^{−1} Â   (12)

Let us assume the following hypothesis (H1):

Ĥ_i = Â^{−1} A H_i A^{−1} Â  ⟹  Ĥ_k = Â^{−1} A H_k A^{−1} Â

This assumption means that the initial error in the estimated homography is propagated along the trajectory. According to (11) and (6) we obtain:

α̂_j ŝ_{j,k} = Â Ĥ_k Â^{−1} s*_j   (13)

Considering (H1), (12) and (13), we obtain:

α̂_j ŝ_{j,k} = A H_k A^{−1} s*_j = α_j s_{j,k}

Therefore, under assumption (H1), the trajectories in the image are not disturbed by errors on the intrinsic parameters. We will check this nice property on the experimental results given in Section 5.
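This invariance can also be checked numerically; the intrinsic matrices and the homography below are illustrative values only.

```python
import numpy as np

# Under hypothesis (H1), the trajectory computed with wrong intrinsics A_hat
# equals the one computed with the true A, since
#   A_hat (A_hat^-1 A H_k A^-1 A_hat) A_hat^-1 = A H_k A^-1.

A = np.array([[800.0, 0.0, 160.0], [0.0, 800.0, 120.0], [0.0, 0.0, 1.0]])
A_hat = 1.2 * A.copy()
A_hat[2, 2] = 1.0                                  # ~20% error on intrinsics

H_k = np.array([[0.95, 0.02, 0.01],                # some Euclidean homography
                [-0.02, 0.97, 0.00],
                [0.03, 0.01, 1.00]])
H_hat_k = np.linalg.inv(A_hat) @ A @ H_k @ np.linalg.inv(A) @ A_hat   # (H1)

s_star = np.array([100.0, 50.0, 1.0])
s_true = A @ H_k @ np.linalg.inv(A) @ s_star
s_est = A_hat @ H_hat_k @ np.linalg.inv(A_hat) @ s_star
# s_est and s_true agree up to numerical precision
```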

4 Control Scheme

In order to track the trajectories using an image-based control scheme, a vision-based task function e(r(t), t) [4] is defined as:

e = L̂⁺ ( s(r(t)) − s*(t) )   (14)

where s is composed of the current image coordinates, s* is the desired trajectory of s computed in Sections 2 and 3, and L̂⁺ is the pseudo-inverse of a chosen model of L. The value of L at the current desired position is used for L̂:
• if the target is known, L̂ = L(s*_k, z*_k), where z*_k is easily obtained from Φ_k and the target model;
• else, L̂ = L(s*_k, d̂*), d̂* being an estimated value of d*.

In order that e exponentially decreases toward 0, the velocity control law is given by [4]:

T_c = −λ e − ∂ê/∂t   (15)

where λ is a proportional gain and ∂ê/∂t denotes an estimated value of the time variation of e. If the target is motionless, we obtain from (14):

∂e/∂t = −L̂⁺ ∂s*/∂t   (16)
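Combining (15) and (16), the resulting tracking law can be sketched as follows. L̂ below is an arbitrary full-rank placeholder model of the interaction matrix, and the finite-difference feedforward is one possible estimate of ∂s*/∂t; none of the numeric values are the paper's.

```python
import numpy as np

# Sketch of the tracking control law:
#   T_c = -lam * e + Lhat_pinv * ds*/dt,   e = Lhat_pinv (s - s*),
# with the feedforward term estimated by finite differences.

def control_law(L_hat, s_cur, s_star_k, s_star_prev, lam=0.5, dt=0.04):
    L_pinv = np.linalg.pinv(L_hat)
    e = L_pinv @ (s_cur - s_star_k)                   # task function (14)
    feedforward = L_pinv @ (s_star_k - s_star_prev) / dt
    return -lam * e + feedforward                     # camera velocity screw

L_hat = np.eye(8, 6)                                  # placeholder 2n x 6 model
s = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
v = control_law(L_hat, s, s, s)   # exactly on a static trajectory: no motion
```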

According to (16), we rewrite (15):

T_c = −λ e + L̂⁺ ∂ŝ*/∂t

where the term L̂⁺ ∂ŝ*/∂t allows us to compensate the tracking error in following the specified trajectory [1]. It can be estimated as follows:

L̂⁺ ∂ŝ*/∂t = L̂⁺ ( s*_k − s*_{k−1} ) / Δt

The discretized control law at time kΔt can finally be written:

T_c = −λ L̂⁺ ( s_k − s*_k ) + L̂⁺ ( s*_k − s*_{k−1} ) / Δt

5 Experiments

The methods presented have been tested on a six d.o.f. eye-in-hand system. The target is a planar object with four white marks (see Figure 4). The displacement between the initial and final camera positions is very significant (several hundred millimeters of translation along each axis and several tens of degrees of rotation about each axis), and in this case classical image-based and position-based visual servoing fail. Figure 4(c) shows the importance of the repulsive potential, without which the visual features largely leave the camera field of view.

The results obtained with the method presented in Section 2 and correct intrinsic parameters (see Figure 5) are very satisfactory. The positioning task is accurately realized, with regular velocities (because the error s_k − s*_k keeps a regular value). After the complete realization of the trajectory, servoing is prolonged with a small gain and a constant reference. We can notice that the desired trajectories and the tracked trajectories are almost identical.

The method presented in Section 3 has been tested with two sets of parameters. In Figure 6, the intrinsic parameters given by the camera manufacturer and the real value of d* have been used; in Figure 7, an error of 20% is added on the intrinsic parameters as well as on the parameter d*. In both cases the results are satisfactory. In particular, and as expected, we note that the planned trajectories are practically identical in both cases.

Figure 4: Initial (a) and desired (b) images of the target, and trajectories planned without repulsive potential (c)

Figure 5: First case. (a) planned trajectories, (b) followed trajectories, (c) velocities (cm/s and dg/s), (d) error on pixel coordinates

6 Conclusion

In this paper, we have presented a powerful method to increase the application area of visual servoing to the cases where the initial and desired positions of the camera are distant. Experimental results show the validity of our approach and its robustness with respect to modeling errors. Future work will be devoted to introducing supplementary constraints in the planned trajectories, in order to avoid robot joint limits, kinematic singularities, occlusions and obstacles. Another perspective is to generate trajectories in image space of more complex features than n points, in order to apply our method to real objects.


Figure 6: Second case, without errors. (a) planned trajectories, (b) followed trajectories, (c) velocities (cm/s and dg/s), (d) error on pixel coordinates

Figure 7: Second case, with errors. (a) planned trajectories, (b) followed trajectories, (c) velocities (cm/s and dg/s), (d) error on pixel coordinates

References

[1] F. Berry, P. Martinet, and J. Gallice. Trajectory generation by visual servoing. Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, 2:1066-1072, 1997.

[2] F. Chaumette. Potential problems of stability and convergence in image-based and position-based visual servoing. The Confluence of Vision and Control, D. Kriegman, G. Hager, A. Morse (eds), LNCIS Series, Springer Verlag, 237:66-78, 1998.

[3] D. Dementhon and L.S. Davis. Model-based object pose in 25 lines of code. Int. Journal of Computer Vision, 15(1/2):123-141, June 1995.

[4] B. Espiau, F. Chaumette, and P. Rives. A new approach to visual servoing in robotics. IEEE Trans. on Robotics and Automation, 8(3):313-326, 1992.

[5] O. Faugeras and F. Lustman. Motion and structure from motion in a piecewise planar environment. Int. Journal of Pattern Recognition and Artificial Intelligence, 2(3):485-508, 1988.

[6] K. Hashimoto. Visual Servoing: Real Time Control of Robot Manipulators Based on Visual Sensory Feedback. World Scientific Series in Robotics and Automated Systems, Vol. 7, World Scientific Press, Singapore, 1993.

[7] K. Hosoda, K. Sakamoto, and M. Asada. Trajectory generation for obstacle avoidance of uncalibrated stereo visual servoing without 3D reconstruction. Proc. IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, 1(3):29-34, August 1995.

[8] S. Hutchinson, G.D. Hager, and P.I. Corke. A tutorial on visual servo control. IEEE Trans. on Robotics and Automation, 12(5):651-670, October 1996.

[9] O. Khatib. Real-time obstacle avoidance for manipulators and mobile robots. Int. Journal of Robotics Research, 5(1):90-98, 1986.

[10] J.C. Latombe. Robot Motion Planning. Kluwer Academic Publishers, 1991.

[11] E. Malis, F. Chaumette, and S. Boudet. 2 1/2 D visual servoing. IEEE Trans. on Robotics and Automation, 15(2):238-250, April 1999.

[12] E. Marchand and G.D. Hager. Dynamic sensor planning in visual servoing. IEEE Int. Conf. on Robotics and Automation, 3:1988-1993, May 1998.

[13] B.J. Nelson and P.K. Khosla. Integrating sensor placement and visual tracking strategies. IEEE Int. Conf. on Robotics and Automation, 2:1351-1356, May 1994.

[14] R. Singh, R.M. Voyle, D. Littau, and N.P. Papanikolopoulos. Alignment of an eye-in-hand system to real objects using virtual images. Workshop on Robust Vision for Vision-Based Control of Motion, IEEE Int. Conf. on Robotics and Automation, May 1998.

[15] L.E. Weiss, A.C. Sanderson, and C.P. Neuman. Dynamic sensor-based control of robots with visual feedback. IEEE Journal of Robotics and Automation, 3(5):404-417, October 1987.