Is Deep Learning Safe for Robot Vision? Adversarial Examples against the iCub Humanoid
[Slide 1]
Pattern Recognition and Applications Lab
University of Cagliari, Italy
Department of Electrical and Electronic Engineering
Is Deep Learning Safe for Robot Vision? Adversarial Examples against the iCub Humanoid
2017 ICCV Workshop ViPAR, Venice, Oct. 23, 2017
Marco Melis, Ambra Demontis, Battista Biggio, Gavin Brown, Giorgio Fumera, Fabio Roli
Dept. of Electrical and Electronic Engineering, University of Cagliari, Italy
@biggiobattista
[Slide 2]
http://pralab.diee.unica.it @biggiobattista 2
The iCub Humanoid
The iCub is the humanoid robot developed at the Italian Institute of Technology as part of the EU project RobotCub, and adopted by more than 20 laboratories worldwide.
It has 53 motors that move the head, arms and hands, waist, and legs. It can see and hear, and it has the sense of proprioception (body configuration) and movement (using accelerometers and gyroscopes).
[http://www.icub.org]
The object recognition system of iCub uses visual features extracted with CNN models trained on the ImageNet dataset [G. Pasquale et al., MLIS 2015].
[Slide 3]
The iCub Robot-Vision System
[Slide 4]
The iCubWorld28 Dataset
[http://old.iit.it/projects/data-sets]
[Slide 5]
Crafting the Adversarial Examples
• Key idea: shift the attack sample towards the decision boundary, under a maximum input perturbation (Euclidean distance)
• Multiclass boundaries are obtained as the difference between the discriminant functions of the competing classes (e.g., in one-vs-all multiclass classification)
[Figure: decision regions of classes f1, f2, f3, and the pairwise difference f1 - f3 used as the multiclass boundary]
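The perturbation budget above can be enforced by projecting any candidate point back onto the Euclidean ball around the original sample. A minimal numpy sketch of that projection (function and variable names are illustrative, not from the authors' code):

```python
import numpy as np

def project_l2(x_adv, x0, eps):
    """Project a candidate adversarial sample back onto the Euclidean
    ball of radius eps centred on the original sample x0."""
    delta = x_adv - x0
    norm = np.linalg.norm(delta)
    if norm > eps:
        delta = delta * (eps / norm)   # rescale to the budget
    return x0 + delta

x0 = np.zeros(4)
x_adv = np.array([3.0, 4.0, 0.0, 0.0])   # distance 5 from x0
x_proj = project_l2(x_adv, x0, eps=2.5)
# after projection, the perturbation norm equals the budget eps
```

Any attack step that leaves the feasible ball is pulled back to its surface, so the final example is never more than eps away from the clean input.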
[Slide 6]
Error-generic Evasion
• Error-generic evasion: k is the true class (blue); l is the competing (closest) class in feature space (red)
• The attack minimizes the objective so that the sample is misclassified as the closest class (which could be any class!)
[Figure: indiscriminate evasion on a two-dimensional three-class example; the attack sample crosses into the closest competing class]
[Slide 7]
Error-specific Evasion
• Error-specific evasion: k is the target class (green); l is the competing class (initially, the blue class)
• The attack maximizes the objective so that the sample is misclassified as the target class
[Figure: targeted evasion on the same two-dimensional example; the attack sample is driven into the target (green) class]
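Both attack flavours hinge on the same quantity: the score of class k minus the best competing score. A numpy sketch of this margin (a paraphrase of the slides' objective, not the authors' exact code):

```python
import numpy as np

def evasion_objective(scores, k):
    """Score of class k minus the highest competing score.
    Error-generic evasion MINIMIZES this with k = the true class
    (push the sample out of its class); error-specific evasion
    MAXIMIZES it with k = the target class (pull the sample in)."""
    competing = np.delete(scores, k).max()
    return scores[k] - competing

scores = np.array([2.0, 0.5, 1.5])
evasion_objective(scores, k=0)   # 0.5: class 0 still wins by a margin
```

A negative value means class k has lost to some competing class, which is exactly the error-generic success condition; a positive value with k as the target class is the error-specific success condition.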
[Slide 8]
Gradient-based Evasion Attacks
• Solved with a projected gradient-based optimization algorithm
• The required gradient is obtained via the chain rule, back-propagating through the deep feature representation z: ∇fi(x) = ∂fi(z)/∂z · ∂z/∂x
[Figure: classifier outputs f1, f2, ..., fi, ..., fc, with the gradient propagated back through the network to the input image]
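The projected gradient loop can be sketched in a few lines. Here the caller supplies a function returning the true-class score and its input gradient (in a real attack this gradient comes from back-propagation through the network, per the chain rule above); the loop and toy linear "network" are illustrative, not the authors' exact implementation:

```python
import numpy as np

def pgd_evasion(score_and_grad, x0, eps, step=0.1, iters=50):
    """Error-generic sketch: repeatedly step along the negative
    gradient of the true-class score f_k(x), then project back onto
    the Euclidean ball of radius eps around the original sample x0."""
    x = x0.copy()
    for _ in range(iters):
        _, grad = score_and_grad(x)      # d f_k / d x (via the chain rule)
        x = x - step * grad              # descend the true-class score
        delta = x - x0
        norm = np.linalg.norm(delta)
        if norm > eps:                   # enforce the perturbation budget
            x = x0 + delta * (eps / norm)
    return x

# toy linear "network": f_k(x) = w . x, so the input gradient is just w
w = np.array([1.0, 2.0])
f = lambda x: (w @ x, w)
x0 = np.array([1.0, 1.0])
x_adv = pgd_evasion(f, x0, eps=0.5)
# x_adv stays within 0.5 of x0 but has a lower true-class score
```

For error-specific evasion one would ascend, rather than descend, the gradient of the target-class margin.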
[Slide 9]
Adversarial Examples against the iCub
An adversarial example from the class laundry-detergent, modified with our algorithm to be misclassified as cup
[Slide 10]
The 'Sticker' Attack against iCub
Adversarial example generated by manipulating only a specific region, to simulate a sticker that could be applied to the real-world object
This image is classified as cup
[Slide 11]
Why is ML Vulnerable to Evasion?
• Attack samples far from the training data are nonetheless assigned to 'legitimate' classes
• Rejecting such blind-spot evasion points should improve security!
[Figure: decision regions of an SVM-RBF with no reject vs. an SVM-RBF with a higher rejection rate; the latter rejects the blind-spot regions far from the training data]
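The reject option can be illustrated with a toy RBF scorer: a sample far from every training prototype gets a low score for all classes and is rejected instead of being forced into a 'legitimate' class. This is a stand-in for the slide's SVM-RBF, with made-up names and thresholds:

```python
import numpy as np

def rbf_scores(x, prototypes, labels, n_classes, gamma=1.0):
    """Per-class similarity scores from an RBF kernel against
    training prototypes (a toy stand-in for an SVM-RBF)."""
    sims = np.exp(-gamma * np.sum((prototypes - x) ** 2, axis=1))
    return np.array([sims[labels == c].sum() for c in range(n_classes)])

def classify_with_reject(x, prototypes, labels, n_classes, threshold):
    """Assign the top-scoring class, but reject (return -1) samples
    whose best score is low, i.e. far from all training data."""
    scores = rbf_scores(x, prototypes, labels, n_classes)
    best = int(scores.argmax())
    return best if scores[best] >= threshold else -1

proto = np.array([[0.0, 0.0], [1.0, 1.0]])
lab = np.array([0, 1])
classify_with_reject(np.array([0.1, 0.0]), proto, lab, 2, 0.1)   # 0
classify_with_reject(np.array([5.0, 5.0]), proto, lab, 2, 0.1)   # -1 (reject)
```

Because the RBF score decays with distance from the training data, blind-spot evasion points naturally fall below the threshold, which is the intuition behind the countermeasure.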
[Slide 12]
Countering Adversarial Examples
[Figure: security evaluation under error-specific evasion (similar results for error-generic attacks), as a function of the maximum input perturbation (Euclidean distance); small perturbations remain visually indistinguishable]
[Slide 13]
Conclusions and Future Work
• Adversarial examples against iCub
• Countermeasure based on rejecting blind-spot evasion attacks
• Main open issue: instability of deep features
Small changes in input space (pixels) aligned with the gradient direction...
...correspond to large changes in deep feature space!
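The instability claim can be probed by comparing how much a fixed-size input perturbation moves the feature representation, depending on its direction. A toy sketch with a hypothetical linear "feature map" that stretches one direction strongly (purely illustrative, not a real CNN):

```python
import numpy as np

def amplification(feature_map, x, delta):
    """Ratio between the change in feature space and the change in
    input space caused by perturbation delta: a simple probe of
    feature (in)stability."""
    dz = np.linalg.norm(feature_map(x + delta) - feature_map(x))
    dx = np.linalg.norm(delta)
    return dz / dx

# toy "deep feature" map that is very sensitive to one input direction
W = np.diag([100.0, 1.0])
phi = lambda x: W @ x

x = np.array([0.5, 0.5])
aligned = np.array([0.01, 0.0])      # aligned with the sensitive direction
orthogonal = np.array([0.0, 0.01])
amplification(phi, x, aligned)       # 100.0
amplification(phi, x, orthogonal)    # 1.0
```

Gradient-based attacks pick exactly the highly amplified directions, which is why tiny pixel changes can cause large displacements in deep feature space.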
[Slide 14]
https://sec-ml.pluribus-one.it/