
An appearance-based visual compass for mobile robots

Jürgen Sturm

University of Amsterdam, Informatics Institute

Overview

Introduction to Mobile Robotics

Background (RoboCup, Dutch Aibo Team)

Approach

Results

Conclusions

Mobile robots

SICO at Kosair Children's Hospital (Dometic, Louisville, Kentucky)

Sony Aibos playing soccer (Cinekids, De Balie, Amsterdam)

Robot cranes and trucks unloading ships (Port of Rotterdam)

RC3000, the robocleaner (Kärcher)

Challenge Application: Robot Soccer

Robot localization

Robot localization is the problem of estimating the robot's pose relative to a map of the environment.

Probabilistic approaches deal with noise, ambiguity, and uncertainty.

Design

Sensors
Wheel sensors, GPS, laser scanner, camera, ...
Feature space

Map and belief representation
Grid-based maps, topological graphs
Single/multi-hypothesis trackers

Filters
Kalman filter, Monte-Carlo methods

Design of Classical Approaches

Artificial environments
(Electro-magnetic) guiding lines
(Visual) landmarks

Special sensors
GPS
Laser range scanners
Omni-directional cameras

Computationally heavy
Offline computation

Design of New Approach

Natural environments
Human environments
Unstructured and unknown to the robot

Normal sensors
Camera

Reasonable requirements
Real-time
On-board

Platform: Sony Aibo

Internal camera: 30 fps, 208x160 pixels

Computer: 64-bit RISC processor, 567 MHz, 64 MB RAM, 16 MB Memory Stick, WLAN

Actuators: legs 4 x 3 joints, head 3 joints

Approach

Demo Video: Visual Compass

Approach - Synopsis

[Block diagram of the approach:
Image data: raw image → color class image → sector-based feature extraction → measurement $z_t$.
Motion data: estimated motion → motion model $u_t,\ p(\varphi_t \mid \varphi_{t-1}, u_t)$.
Previously learned map $m$, correlation (sensor model) → likelihoods $z_t,\ p(z_t \mid \varphi_t, m)$.
Localization filter: prior $p(\varphi_{t-1})$ → odometry-corrected $p(\varphi_t \mid \varphi_{t-1}, u_t)$ → posterior $p(\varphi_t \mid \varphi_{t-1}, u_t, z_t)$.]

Sector-based feature extraction (1)

Camera field of view: 50°
Head field of view: 230°

Sector-based feature extraction (2)

For each sector:
Count color class transitions in the vertical direction
Compute relative transition frequencies (see the sketch below)
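The per-sector transition counting can be illustrated with a short Python sketch. This is a minimal illustration under assumptions, not the Aibo implementation: the color class image is taken to be a 2D array of integer class labels, each image column is assigned to one sector, and the function name `sector_features` is hypothetical.

```python
import numpy as np

def sector_features(color_class_img, n_sectors, n_classes):
    """Count color-class transitions in the vertical direction per sector
    and return the relative transition frequencies z^(ij)."""
    rows, cols = color_class_img.shape
    features = np.zeros((n_sectors, n_classes, n_classes))

    for col in range(cols):
        sector = col * n_sectors // cols            # sector this image column falls into
        column = color_class_img[:, col]
        for i, j in zip(column[:-1], column[1:]):   # vertically adjacent pixel pairs
            if i != j:                              # count only transitions between classes
                features[sector, i, j] += 1

    totals = features.sum(axis=(1, 2), keepdims=True)
    np.divide(features, totals, out=features, where=totals > 0)  # normalize per sector
    return features
```

Each sector's entries then satisfy the constraints of the sensor model on the following slides: non-negative and summing to one.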

Sensor model (1)

$z_t^{(ij)}(\varphi)$: relative frequency of transitions from color class $i$ to color class $j$ in direction $\varphi$, with
$0 \le z_t^{(ij)}(\varphi) \le 1, \qquad \sum_{i,j} z_t^{(ij)}(\varphi) = 1$

Frequency measurements originate from a probabilistic source (distribution):
$z_t^{(ij)}(\varphi) \sim Z^{(ij)}(\varphi)$

How can these distributions be approximated?

Sensor model (2)

Approximate each source by a histogram distribution (its parameters constitute the map):
$Z^{(ij)}(\varphi) \approx \left[ m^{(ij)(1)};\ m^{(ij)(2)};\ \ldots;\ m^{(ij)(k_{\mathrm{bins}})} \right]$

$p\left(z_t^{(ij)} \mid m^{(ij)}\right) =
\begin{cases}
m^{(ij)(1)} & \text{if } z_t^{(ij)} \ge 2^{-1} \\
m^{(ij)(k)} & \text{if } 2^{-k} \le z_t^{(ij)} < 2^{-(k-1)},\ 1 < k < k_{\mathrm{bins}} \\
m^{(ij)(k_{\mathrm{bins}})} & \text{if } z_t^{(ij)} < 2^{-(k_{\mathrm{bins}}-1)}
\end{cases}$

[Figure: learned histogram distribution $p(z_t(\varphi))$ over $z_t(\varphi)$, with bin heights $m^{(ij)(1)}, m^{(ij)(2)}, \ldots$]
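A minimal Python sketch of this histogram sensor model, assuming the logarithmically spaced bins at $2^{-1}, 2^{-2}, \ldots$ reconstructed above; the bin count, the smoothing floor, and the function names are illustrative assumptions, not the original implementation:

```python
import numpy as np

def bin_index(z, k_bins):
    """Map a relative frequency z in [0, 1] to a histogram bin (0-based)."""
    for k in range(1, k_bins):
        if z >= 2.0 ** (-k):
            return k - 1              # bin k-1 covers [2^-k, 2^-(k-1))
    return k_bins - 1                 # last bin catches the smallest frequencies

def learn_histogram(training_frequencies, k_bins=8):
    """Estimate m^(ij)(phi) = [m^(1); ...; m^(k_bins)] from repeated
    frequency measurements taken while facing a known direction."""
    counts = np.zeros(k_bins)
    for z in training_frequencies:
        counts[bin_index(z, k_bins)] += 1
    counts += 1e-6                    # small floor to avoid zero likelihoods
    return counts / counts.sum()      # normalized bin heights

def likelihood(z, m):
    """p(z | m): height of the learned histogram bin that z falls into."""
    return m[bin_index(z, len(m))]
```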

Sensor model (2), continued

Likelihood that a single frequency measurement originated from direction $\varphi$:
$p\left(z_t^{(ij)}(\varphi) \mid m^{(ij)}(\varphi)\right)$

Likelihood that a full feature vector (one sector) originated from direction $\varphi$:
$p\left(z_t(\varphi) \mid m(\varphi)\right) = \prod_{i,j} p\left(z_t^{(ij)}(\varphi) \mid m^{(ij)}(\varphi)\right)$

Likelihood that a whole camera image (the set of features covering the field of view) originated from direction $\varphi$:
$p\left(z_t([\varphi - \tfrac{FOV}{2},\ \varphi + \tfrac{FOV}{2}]) \mid m\right) = \int_{-FOV/2}^{+FOV/2} p\left(z_t(\varphi + \delta) \mid m(\varphi + \delta)\right)\, d\delta$
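Continuing the sketch above (and reusing its `likelihood` helper), combining the per-pair likelihoods might look as follows. The product over $(i, j)$ and the discrete sum over visible sectors standing in for the integral are assumptions of this illustration; `m` is taken to be a list over map directions, each entry holding one learned histogram per color-class pair.

```python
def sector_likelihood(z_sector, m_sector):
    """p(z(phi) | m(phi)): product over all color-class pairs (i, j)."""
    p = 1.0
    n_classes = z_sector.shape[0]
    for i in range(n_classes):
        for j in range(n_classes):
            p *= likelihood(z_sector[i, j], m_sector[i][j])
    return p

def image_likelihood(z_sectors, m, heading_index):
    """Approximate p(image | heading) by accumulating the likelihoods of the
    currently visible sectors against the map entries a robot facing
    heading_index would see (discrete stand-in for the integral over the FOV)."""
    total = 0.0
    for offset, z_sector in enumerate(z_sectors):
        map_index = (heading_index + offset) % len(m)   # wrap around 360 degrees
        total += sector_likelihood(z_sector, m[map_index])
    return total
```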

Localization filter: Orientational component

Use a Bayesian filter to update the robot's belief (circular grid buffer; see the sketch below):
prior $p(\varphi_{t-1})$ → odometry-corrected $p(\varphi_t \mid \varphi_{t-1}, u_t)$ → posterior $p(\varphi_t \mid \varphi_{t-1}, u_t, z_t)$,
using the motion model $u_t,\ p(\varphi_t \mid \varphi_{t-1}, u_t)$ and the sensor model $z_t,\ p(z_t \mid \varphi_t, m)$.

From this buffer, extract per time step:
Heading estimate
Variance estimate
$p(\varphi_t) \approx \mathcal{N}(\mu_t, \sigma_t^2)$
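A minimal Python sketch of such a Bayes filter on a circular orientation grid, assuming a fixed angular resolution and Gaussian odometry noise; the cell count, noise level, and function names are illustrative, not the values used on the Aibo.

```python
import numpy as np

N_CELLS = 72                                   # 5-degree grid over 360 degrees
belief = np.full(N_CELLS, 1.0 / N_CELLS)       # uniform prior p(phi_0)

def predict(belief, rotation_deg, noise_std_deg=5.0):
    """Motion update: shift the belief by the odometry rotation, then blur it
    with a circular Gaussian kernel to model odometry noise."""
    cell = 360.0 / len(belief)
    shifted = np.roll(belief, int(round(rotation_deg / cell)))
    offsets = np.arange(len(belief))
    offsets = np.minimum(offsets, len(belief) - offsets) * cell   # circular distance in degrees
    kernel = np.exp(-0.5 * (offsets / noise_std_deg) ** 2)
    kernel /= kernel.sum()
    predicted = np.real(np.fft.ifft(np.fft.fft(shifted) * np.fft.fft(kernel)))
    return predicted / predicted.sum()

def correct(belief, image_likelihoods):
    """Sensor update: multiply by p(z_t | phi, m) per cell and renormalize."""
    posterior = belief * image_likelihoods + 1e-12   # guard against all-zero likelihoods
    return posterior / posterior.sum()

def heading_estimate(belief):
    """Circular mean (heading estimate) and circular variance of the belief."""
    angles = np.deg2rad(np.arange(len(belief)) * 360.0 / len(belief))
    s, c = np.sum(belief * np.sin(angles)), np.sum(belief * np.cos(angles))
    return np.rad2deg(np.arctan2(s, c)) % 360.0, 1.0 - np.hypot(s, c)
```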

Results

Results: Brightly illuminated living room

[Plot: estimated orientation [deg] vs. true orientation [deg], 0-90°.]

Applicable in a natural indoor environment
Good accuracy (error < 5°)

Results: Daylight office environment

[Plot: error in orientation [deg] vs. distance from training spot [cm], roughly -720 cm to +560 cm.]

Applicable in a natural office environment
Very robust against displacement (error < 20° over 15 m)

Results: Outdoor soccer field

Applicable in a natural outdoor environment

Results: RoboLab 4-Legged soccer field

Applicable in the RoboCup soccer environment

Results: RoboLab 4-Legged soccer field

[Plot: error in orientation and confidence interval [deg] vs. distance from training spot [cm], 0-250 cm; curves for variance and error in orientation.]

True average error < 10° on a grid of 3 x 3 m

Results: Variable and Parameter Studies

Distance to training spot
Changes in illumination
Angular resolution
Scanning grid coverage
Number of color classes

Localization filter: Translational component

Use multiple training spots
Each (projectively distorted) patch yields slightly different likelihoods
Interpolate the translation from these likelihoods (see the sketch below)

Visual Homing
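One way the interpolation could be sketched, assuming a simple likelihood-weighted average of the training-spot positions; the weighting scheme, names, and example values are illustrative assumptions, not the original method.

```python
import numpy as np

def interpolate_position(training_spots, spot_likelihoods):
    """training_spots: (x, y) positions where maps were learned.
    spot_likelihoods: best image likelihood of the current view under each map.
    Returns a rough (x, y) estimate of the robot's position."""
    positions = np.asarray(training_spots, dtype=float)
    weights = np.asarray(spot_likelihoods, dtype=float)
    weights /= weights.sum()                    # normalize likelihoods to weights
    return tuple(weights @ positions)           # weighted average of spot positions

# Example: four training spots at the corners of a 2 m square (in cm)
spots = [(0, 0), (200, 0), (0, 200), (200, 200)]
likelihoods = [0.10, 0.40, 0.15, 0.35]
print(interpolate_position(spots, likelihoods))  # estimate pulled toward the right-hand spots
```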

Demo Video: Visual Homing

Results: Visual Homing

[Plot: positioning accuracy after homing, x and y in cm; the robot walks back to the center after being kidnapped.]

Proof of concept

Conclusions

Novel approach to localization:
Works in unstructured environments
Accurate, robust, efficient, scalable
Interesting approach for mobile robots

Future Research

Use Monte-Carlo localization
Extend to dynamic environments
Triangulation from two training spots

Announced follow-up projects:
Port to RoboCup Rescue Simulation (MSc project)
RoboCup 2007 Open Challenge (DOAS project)

3rd Prize, Technical Challenges of the 4-Legged League, RoboCup 2006 in Bremen

Thank You!