VISUAL SERVOING SYSTEM FOR A MOBILE PLATFORM



References
[1] S. Hutchinson, G. Hager, and P. Corke. A tutorial on visual servo control. IEEE Transactions on Robotics and Automation, 1996.
[2] L. Meier, P. Tanskanen, F. Fraundorfer, and M. Pollefeys. PIXHAWK: A system for autonomous flight using onboard computer vision. Proc. IEEE International Conference on Robotics and Automation (ICRA), pp. 2992-2997, May 2011.

Zhang Xi, Zhang Chengcheng, Camilo Perez, Romeo Tatsambon, Martin Jagersand

Robotics Research, Dept. of Computing Science, Univ. of Alberta, Edmonton, AB

Introduction

Many robots have been used to explore unknown environments, both on Earth and on other planets. The Mars Exploration Rovers (MER) have demonstrated the advantages of a mobile platform.

We are working in the area of semi-autonomous supervisory control for space telerobotic applications, with a focus on mobile manipulation.

System Overview

Aim & Objectives

Objectives: (1) become familiar with the robotic platform;

(2) select an efficient tracker that is robust to occlusion and illumination changes;

(3) implement the control law for image-based visual servoing;

(4) study QGroundControl and use it as the main reference for our system interface;

Aim: provide a mobile visual servoing system with a user-friendly interface.
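Objective (2) asks for a tracker robust to occlusion and illumination changes. The poster does not name the tracker selected, so the sketch below shows one common baseline, zero-normalized cross-correlation (ZNCC) template matching, which is invariant to gain and offset changes in illumination:

```python
import numpy as np

def zncc_track(image, template):
    """Brute-force ZNCC template search; returns (row, col) of best match."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()
            denom = np.sqrt((wz ** 2).sum()) * t_norm
            if denom == 0:
                continue  # flat window, correlation undefined
            score = (wz * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Because ZNCC normalizes out gain and offset, the template is still
# found after a global illumination change of the whole image.
rng = np.random.default_rng(0)
img = rng.random((30, 30))
template = img[10:16, 12:18].copy()
found = zncc_track(2.0 * img + 0.5, template)  # brightened image
```

Real trackers add a search window around the previous position and a confidence threshold to detect occlusion; this brute-force version is illustrative only.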

Image-based visual servoing uses visual data to directly control the motion of a robot. The visual-motor function F relates the robot configuration q to the visual measurements s:

s = F(q)

Differentiating with respect to time gives

ṡ = (∂F/∂q) q̇ = J q̇ = J v_c

where J = ∂F/∂q is the image Jacobian and v_c is the commanded velocity.

The aim of vision-based control is to minimize an error that can be defined as

e(t) = s − s*

where s is the vector of measured features and s* contains their desired values. Image-Based Control Law:

v_c = −λ J⁺ e(t)

where J⁺ is the Moore-Penrose pseudo-inverse of J and λ > 0 is a gain.
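As a sanity check, the control law above can be simulated in a few lines. The sketch below is an illustration, not the poster's implementation: it assumes a constant 2x2 image Jacobian with made-up values and propagates the features with ṡ = J v_c, so the feature error decays exponentially toward zero.

```python
import numpy as np

def ibvs_step(J, s, s_star, lam=0.5):
    """One step of the image-based control law v_c = -lambda * J^+ * e."""
    e = s - s_star
    v_c = -lam * np.linalg.pinv(J) @ e
    return v_c

# Toy simulation with a constant Jacobian (illustrative values).
J = np.array([[1.0, 0.2],
              [0.0, 0.8]])
s = np.array([50.0, 30.0])        # current feature position (u, v)
s_star = np.array([64.0, 48.0])   # desired feature position (u*, v*)

dt = 0.1
for _ in range(200):
    v_c = ibvs_step(J, s, s_star)
    s = s + J @ v_c * dt          # feature kinematics: s_dot = J v_c
# After the loop, s has converged close to s_star.
```

With an invertible J the closed-loop error obeys ė = −λ e, which is why the exponential decay is independent of the particular Jacobian values chosen here.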

User Interface


QGroundControl is designed as a ground control station for small autonomous unmanned air, land, and water systems. We chose QGroundControl as the base for our system interface design.

Why QGroundControl?
1) Its open-source, modular design lets us extend each layer.
2) UDP and serial support let us communicate with the Segway and the camera.
3) Its image/video streaming component handles digital video transmission and display.
4) The integrated MAVLink protocol supports up to 255 vehicles or robots in parallel, and project-specific custom messages can be added.
5) Many other useful functions are integrated: 2D/3D aerial map support, route setting, real-time plotting of telemetry data, etc.
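Point 4 mentions project-specific custom messages. The fragment below is not real MAVLink framing (MAVLink adds headers, sequence numbers, and checksums); it is a hypothetical stand-in, using only the Python standard library, to show the general pattern of packing a custom telemetry message and sending it over UDP as QGroundControl-style ground stations do:

```python
import socket
import struct

# Hypothetical custom message layout (not real MAVLink framing):
# (system_id: uint8, feature_u: float32, feature_v: float32, error_norm: float32)
CUSTOM_FMT = "<Bfff"

def pack_feature_msg(system_id, u, v, err):
    return struct.pack(CUSTOM_FMT, system_id, u, v, err)

def unpack_feature_msg(payload):
    return struct.unpack(CUSTOM_FMT, payload)

# Loopback demonstration: a "ground station" socket receives the message.
rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
rx.bind(("127.0.0.1", 0))  # ephemeral port stands in for a real telemetry port
tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
tx.sendto(pack_feature_msg(1, 64.0, 48.0, 0.25), rx.getsockname())
msg = unpack_feature_msg(rx.recvfrom(64)[0])
tx.close()
rx.close()
```

In the actual system such messages would be defined in a MAVLink message XML and generated into C/Python bindings rather than hand-packed.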

Image-based Visual Servoing

[Block diagram] Image-based visual servoing loop: the desired features S* are compared with the current features S produced by the camera's feature tracker; the control law v_c = −λ J⁺ e feeds the Segway pose controller, and the loop is supervised from the QGroundControl user interface over MAVLink.

[Screenshot] Interface of QGroundControl.

Objectives (continued): (5) establish communication between QGroundControl and the camera;

(6) establish communication between QGroundControl and the visual servoing system;

(7) design new tools and widgets on QGroundControl for receiving data and sending commands.

System Requirements

[Diagram] The tracked feature moves from its initial position (μ, v) to the desired position (μ_d, v_d) in the image. The Segway's two degrees of freedom, translation x and rotation θ, are obtained from the feature velocities by inverting the image Jacobian:

(ẋ, θ̇)ᵀ = J⁻¹ (μ̇, v̇)ᵀ
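With only two degrees of freedom the Jacobian is 2x2 and the inversion is direct. A small numerical illustration (the Jacobian values are made up, not from the poster):

```python
import numpy as np

# Illustrative 2x2 image Jacobian relating the Segway's two degrees of
# freedom (x_dot, theta_dot) to the feature motion (mu_dot, v_dot).
J = np.array([[2.0, 0.5],
              [0.1, 1.5]])

s_dot = np.array([4.0, 1.0])       # desired feature velocity (mu_dot, v_dot)
q_dot = np.linalg.solve(J, s_dot)  # platform command (x_dot, theta_dot)
```

Using `np.linalg.solve` rather than forming J⁻¹ explicitly is the standard numerically preferable way to apply the inverse.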

[Diagram] Communication architecture: a USB 2.0 camera feeds QGroundControl through an image bus (shared memory); the MAVCONN middleware carries topic-filtered MAVLink broadcasts between QGroundControl and the Segway visual servoing system, which computes v_c = J⁻¹ ṡ.

[Diagram] The Segway has two degrees of freedom in task space; a selected feature is regulated in image space.

Fulfilled components: user interface, visual servoing system, communication system.

Servoing converges when the error e = s − s* = 0, i.e. when the current features S reach the desired features S*.

[Image] ME prototype of MER.