Visual Navigation for Flying Robots
Computer Vision Group: D. Cremers, J. Sturm, J. Engel, C. Kerl
Department of Informatics, Technical University of Munich
Summer Term 2013

Sheet 2
Topic: Motion Models and Robot Odometry
Submission deadline: Tue, 21.05.2013, 10:15 a.m., visnav2013@vision.in.tum.de

General Notice: Please read first!

In this exercise sheet, you will learn how to use a Kalman filter to estimate the pose of the AR.Drone from its sensor measurements, using visual markers attached to the ground: In the first exercise, you will set up everything required to fly the drone with a joystick, and record a .bag file of the drone flying above visual markers. In the second exercise, you will develop and implement the Kalman filter, and use it to estimate the drone's trajectory from the .bag file recorded in exercise one. As we have more groups than AR.Drones, we provide an example .bag file, so that some of you can start with the second exercise (for which you do not require an AR.Drone or joystick).

Exercise 1: Manual Flight

(a) Set up the joystick: If you are working on your own laptop, you first have to install the ROS joystick packages using

    sudo apt-get install ros-fuerte-joystick-drivers

Now plug in the joystick, and start roscore and the joystick driver using

    rosrun joy joy_node

Inspect the topic /joy, and verify that the joystick works - you might have to press the P button once to initialize it.

(b) Set up the control node: We provide a node that translates the raw joystick messages into the correct control commands sent to the drone, which you can find here:

    git://github.com/tum-vision/ardrone_joystick.git

Clone it into your workspace, run "rosmake ardrone_joystick", and start the node using "rosrun ardrone_joystick ardrone_teleop". The axes and buttons are assigned as follows:

- The R1 button toggles the emergency state of the robot. Pressing R1 while flying will stop the rotors immediately. If the LEDs beneath the rotors are red (for example, after a crash), press R1 to reset the drone.

- The L1 button starts the motors of the quadrocopter.
It also works as a dead-man switch, so the robot will land if you release it during flight. The quadrocopter will ascend to one meter above ground and try to hold this position.

- The left stick can be used to control the roll and pitch angle of the drone. Keep in mind that these velocities are given in the local frame of the drone!

- The right stick controls the yaw-rate and the altitude.

- The select button can be used to switch between the two cameras. This can also be done by executing "rosservice call /ardrone/togglecam".

- The triangle button can be used to switch off the on-board position stabilization: By default, the drone hovers (i.e., stabilizes its horizontal position by keeping vx and vy at zero) when you do not touch the left control stick. It even actively decelerates when you let go of the left control stick. Pressing and holding the triangle button switches this off, i.e., gives you direct control over roll and pitch at all times - note how this leads to rapid drift in horizontal position.

(c) First Flight: Connect the drone as in the very first exercise sheet. Display the camera image using

    rosrun image_view image_view image:=/ardrone/image_raw

Fly a short round in the robot lab and practice your flying skills. Take a group picture of your team and add it to your report.

(d) Graph Visualization: Use rxgraph to visualize the running nodes and the topics in use. Take a screenshot, and attach it to your report.

(e) Record a .bag File: Put a marker on the ground. Switch to the bottom camera (the AR.Drone only streams one of the two camera images to the PC), and use rosbag to record approximately a 30 s flight. The marker should regularly be visible in the camera image, but should also leave it temporarily. You will need to record at least the camera image topic, the camera info topic, and the ardrone navdata topic, i.e.:

    rosbag record /ardrone/navdata /ardrone/image_raw /ardrone/camera_info -O flight.bag

Exercise 2: Extended Kalman Filter

In this exercise, you will learn how to use a Kalman filter to estimate the pose of the robot from a bag file or live data. You can either use the bag file from the website, or your own bag file recorded in Exercise 1. For the screenshots in your report, please use the bag file from our website.
Exit all running nodes and disconnect the drone and joystick; you will not need them for this exercise.

We provide you with a C++ framework of an extended Kalman filter for which you will have to implement the correction step. For simplicity, we model the quadrocopter only in the 2D plane, i.e., its state at time t is described by x_t = (x_t, y_t, ψ_t)^T, where ψ_t is the yaw angle.
The prediction function is given by the odometry from the previous exercise sheet, using the measured yaw angle and the horizontal velocities.

(a) Download the C++ framework for the Kalman filter from

    git://github.com/tum-vision/visnav2013_exercise2.git

Rename the folder to ex2 and build it using rosmake.

(b) Download and compile the visual marker detection node from

    http://robotics.ccny.cuny.edu/git/ccny-ros-pkg/ccny_vision.git

Launch the marker detection node using the launch file provided in ex2.

(c) Start rviz and add a grid, a tf display, a Marker display, and a MarkerArray display as in the previous exercise sheet. Remember to change the base coordinate system to /world, and to select the correct topics.

(d) Start the Kalman filter by running "rosrun ex2 visnav2". Replay the bag file using "rosbag play" and watch the result in rviz. Take a screenshot after a couple of seconds with the covariance ellipse. The Kalman filter also publishes the estimated pose on /ardrone/filtered_pose. Visualize the full estimated two-dimensional trajectory from the bag file using

    rxplot -p 48 /ardrone/filtered_pose/linear/x:y

i.e., restart the playback after starting rxplot. Take a screenshot and attach it to your report.

(e) Assume that the quadrocopter drifts more (say, two times more) than this in the global x-direction, for example because there is strong wind in this direction. Modify the process noise matrix Q accordingly in the source code, re-run the experiment, and take another two screenshots.

(f) What would be a good way to determine the noise values empirically? Describe briefly an experimental setup that could be used to determine these values.

(g) The framework detects markers in the environment and provides (in the case of a detection) an observation z = (x̄, ȳ, ψ̄)^T relative to the frame of the quadrocopter. Specify the observation function z = h(x) and compute its derivative (Jacobian) H = ∂h(x)/∂x.

(h) Implement the correction step in the Kalman filter using the observation function and the Jacobian. Take care when you compute the difference between angles (as required for the computation of z_t − h(μ̄_t) in the correction step). One simple solution is to normalize the angle afterwards, for example using ψ_normalized = atan2(sin(ψ), cos(ψ)).

(i) Use rxplot again to visualize the - now corrected - estimated two-dimensional trajectory from the bag file, take a screenshot of the trajectory, and add it to your report.

(j) From the screenshots, roughly estimate how far the pose estimate drifted without visual markers over the 48 s flight (in meters).

Submission instructions

A complete submission consists of both a PDF file with the solutions/answers to the bold questions on the exercise sheet and a TGZ (or ZIP) file containing the source code that you used to solve the given problems. Make sure that your TGZ file contains all files necessary to compile and run your code, but it should not contain any build files or binaries ("make clean", "rm -rf bin"). Please submit your solution via email to visnav2013@vision.in.tum.de.