
Semantic Curiosity for Active Visual Learning

Devendra Singh Chaplot*, Helen Jiang*, Saurabh Gupta, Abhinav Gupta

Webpage: https://devendrachaplot.github.io/projects/SemanticCuriosity
ECCV 2020

Active Visual Learning


Static datasets: great for benchmarking; long-tail problem.
Active visual learning: personalization; learn to gather data.

Active Visual Learning


Given a pre-trained object detection/segmentation model, learn a self-supervised exploration policy to gather observations to improve the model.

Active Visual Learning

Given a pre-trained object detection/segmentation model, learn a self-supervised exploration policy to gather observations to improve the model.

A good exploration policy should:
- generate observations of objects, not walls and ceilings
- observe many unique objects
- observe images with incorrect object detections

Semantic Curiosity


[Figure: Mask R-CNN detections of the same object across consecutive frames, e.g. chair / couch / bed vs. couch / couch / couch]

Inconsistent detections = high reward
Consistent detections = low reward

Semantic Mapping

[Pipeline figure: RGB (I_t) → Mask R-CNN → semantic predictions; Depth (D_t) → point cloud (X, Y, Z plus category channels); semantic labels projected into a voxel representation → sum across height → category-wise semantic map (C × M × M)]
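The pipeline above can be sketched in a few lines of NumPy. This is a simplified illustration, not the released implementation: it assumes per-pixel category scores (derived from the Mask R-CNN masks), a pinhole camera with known intrinsics, and it skips pose-based registration of frames into a common world frame.

```python
# A minimal sketch (not the authors' code) of building the category-wise
# top-down semantic map: per-pixel category scores are back-projected with
# depth, binned into map cells, and collapsed across height into a binary
# C x M x M map. Intrinsics, map size, cell size and the score format are
# illustrative assumptions.
import numpy as np

def semantic_map_from_frame(depth, sem_scores, fx, cx,
                            map_size=240, cell_size_m=0.05, thresh=0.5):
    """depth: (H, W) metres; sem_scores: (C, H, W) per-pixel category scores."""
    C, H, W = sem_scores.shape

    # Back-project pixels into the camera frame (pinhole model). The vertical
    # coordinate is never used: the map is top-down ("sum across height").
    z = depth                                   # forward (length)
    x = (np.arange(W) - cx) * z / fx            # rightward (breadth)

    # Discretise (z, x) into top-down map cells, agent at the map centre-bottom.
    i = np.clip((z / cell_size_m).astype(int), 0, map_size - 1)
    j = np.clip((x / cell_size_m).astype(int) + map_size // 2, 0, map_size - 1)

    # Binary category-wise map: a cell is set for category c if any valid
    # pixel predicted as c lands in it.
    sem_map = np.zeros((C, map_size, map_size), dtype=bool)
    for c in range(C):
        mask = (sem_scores[c] > thresh) & (depth > 0)
        sem_map[c, i[mask], j[mask]] = True
    return sem_map
```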

Semantic Curiosity

Semantic mapping produces a binary category-wise semantic map M_Sem ∈ {0,1}^(C × M × M) (the voxel representation summed across height). The semantic curiosity reward is proportional to the sum of the map over all categories and map cells (length and breadth):

R_SC ∝ Σ_(c,i,j) M_Sem[c, i, j]

Maximizing the cumulative semantic curiosity reward encourages temporal inconsistencies (the same object labelled with different categories fills several category channels of the map) and encourages observing more unique objects.
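Given such maps aggregated along a trajectory, the reward itself is just a sum over the map. The sketch below is a hypothetical reading of the slide, with the per-step reward defined as the increase of that sum as new observations arrive; it illustrates why inconsistent labels and newly seen objects both raise the reward.

```python
# A minimal sketch of the semantic curiosity reward: with a binary
# category-wise map M_Sem in {0,1}^(C x M x M), the cumulative reward is
# proportional to the sum of all its entries. The per-step reward here is the
# increase of that sum -- one plausible reading, not necessarily the exact
# implementation.
import numpy as np

def semantic_curiosity_rewards(per_step_maps, scale=1.0):
    """per_step_maps: iterable of (C, M, M) boolean maps, one per time step."""
    rewards, aggregated, prev_sum = [], None, 0.0
    for step_map in per_step_maps:
        # Aggregate with a logical OR: a cell stays set for category c once
        # any frame predicted c there, so inconsistent labels for the same
        # object fill several channels and raise the sum.
        aggregated = step_map if aggregated is None else (aggregated | step_map)
        total = float(aggregated.sum())          # sum over c, i, j
        rewards.append(scale * (total - prev_sum))
        prev_sum = total
    return rewards                               # sum(rewards) ∝ final map sum
```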

Method Overview

Step 1: Learn Exploration Policy. In the exploration policy training environments (ℰ_U), the semantic module (a pre-trained Mask R-CNN) and semantic mapping produce a semantic map, which gives the semantic curiosity reward used to train the exploration policy.

Step 2: Train Semantic Module. The trained exploration policy is rolled out in the semantic module training environments (ℰ_tr); the sampled trajectories are labelled and used to fine-tune the semantic module (Mask R-CNN).

Step 3: Test on held-out data. The fine-tuned semantic module is evaluated on a randomly sampled held-out dataset from the semantic module test environments (ℰ_t).
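The sketch below is a compressed, hypothetical view of how these three steps compose; every helper is passed in as a callable, and the names and signatures are placeholders rather than the authors' API.

```python
# A minimal scaffold of the three-step pipeline. All callables and their
# signatures are illustrative placeholders, not the authors' code.
def active_visual_learning(train_policy,      # Step 1: RL with semantic curiosity reward
                           sample_and_label,  # Step 2a: roll out policy, label frames
                           finetune,          # Step 2b: fine-tune the semantic module
                           evaluate,          # Step 3: evaluate on held-out data
                           semantic_module,   # pre-trained Mask R-CNN
                           envs_U, envs_tr, envs_t):
    # Step 1: learn the exploration policy in E_U, rewarding it with the
    # semantic curiosity signal computed from maps built by semantic_module.
    policy = train_policy(envs_U, semantic_module)

    # Step 2: gather and label trajectories in E_tr with the trained policy,
    # then fine-tune the semantic module on them.
    labelled_data = sample_and_label(policy, envs_tr)
    semantic_module = finetune(semantic_module, labelled_data)

    # Step 3: test the fine-tuned module on a randomly sampled held-out
    # dataset from the test environments E_t.
    return evaluate(semantic_module, envs_t)
```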

Results

Quality of object detection (AP50) on training trajectories:

Method                     Chair  Bed   Toilet  Couch  Plant  Overall
Pre-Trained                46.7   28.2  46.9    60.3   39.1   44.24
Curiosity [1]              49.4   18.3   1.8    67.7   49.0   37.42
Object Exploration         54.3   24.8   5.7    76.6   49.6   42.2
Coverage Exploration [2]   48.5   23.1  69.2    66.3   48.0   51.02
Active Neural SLAM [3]     51.3   20.5  49.4    69.7   45.6   45.3
Semantic Curiosity         51.6   14.6  14.2    65.2   50.4   39.2

*Adapted from [1] Pathak et al., ICML 2017; [2] Chen et al., ICLR 2019; [3] Chaplot et al., ICLR 2020.

Results

Object detection performance (AP50) on held-out test data:

Method                     Chair  Bed   Toilet  Couch  Plant  Overall
Pre-Trained                41.8   17.3  34.9    41.6   23.0   31.72
Curiosity [1]              48.4   18.5  42.3    44.3   32.8   37.26
Object Exploration         50.3   16.4  40.0    39.7   29.9   35.26
Coverage Exploration [2]   50.0   19.1  38.1    42.1   33.5   36.56
Active Neural SLAM [3]     53.1   19.5  42.0    44.5   33.4   38.5
Semantic Curiosity         52.3   22.6  42.9    45.7   36.3   39.96

*Adapted from [1] Pathak et al., ICML 2017; [2] Chen et al., ICLR 2019; [3] Chaplot et al., ICLR 2020.

Demo Video


[Demo video: egocentric observation alongside the predicted semantic map]

Temporal Inconsistency

[Example frames over time: the same object receives different detection labels as the agent moves]


Semantic Curiosity for Active Visual Learning
Devendra Singh Chaplot, Helen Jiang, Saurabh Gupta, Abhinav Gupta
ECCV 2020

Webpage: https://devendrachaplot.github.io/projects/SemanticCuriosity

Devendra Singh Chaplot
Webpage: http://devendrachaplot.github.io/
Email: chaplot@cs.cmu.edu
Twitter: @dchaplot

Thank you