Intro to the Mobile Vision API


+FrederikSchweiger

Easily detect faces and barcodes.

Recognizing Barcodes

MainActivity.java

Frame frame = new Frame.Builder()
        .setBitmap(myBitmap)
        .build();

BarcodeDetector barcodeDetector = new BarcodeDetector.Builder(context)
        .build();

SparseArray<Barcode> barcodes = barcodeDetector.detect(frame);
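
As a rough sketch (not part of the original slides), the returned SparseArray can be read like this; rawValue holds the decoded string:

// A minimal sketch, assuming the barcodes SparseArray from the snippet above.
for (int i = 0; i < barcodes.size(); i++) {
    Barcode barcode = barcodes.valueAt(i);
    Log.d("BarcodeDemo", "value=" + barcode.rawValue + ", format=" + barcode.format);
}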

Recognizing Faces

MainActivity.java

Frame frame = new Frame.Builder()
        .setBitmap(myBitmap)
        .build();

FaceDetector faceDetector = new FaceDetector.Builder(context)
        .setTrackingEnabled(false)
        .build();

SparseArray<Face> faces = faceDetector.detect(frame);

What you see is what you get

● face’s orientation (Euler X, Y and Z)
● up to 8 landmarks
● eyes open / closed
● smiling
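
As an illustrative sketch (not from the slides), those attributes can be read from a detected face roughly like this, assuming the faces array from the snippet above is non-empty and the detector was built with ALL_LANDMARKS and ALL_CLASSIFICATIONS (the probabilities return -1 otherwise):

// A minimal sketch of reading attributes from the first detected face.
Face face = faces.valueAt(0);
float rotY = face.getEulerY();                          // rotation around the vertical axis
float rotZ = face.getEulerZ();                          // in-plane rotation
float smiling = face.getIsSmilingProbability();         // 0..1, or -1 if not computed
float leftEyeOpen = face.getIsLeftEyeOpenProbability(); // 0..1, or -1 if not computed
for (Landmark landmark : face.getLandmarks()) {
    PointF position = landmark.getPosition();           // eyes, nose base, mouth corners, ...
}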

The Mobile Vision API also provides a high-performance, yet easy-to-use, video pipeline structure.

1. Build the detector

FaceDetector detector = new FaceDetector.Builder(context)
        .setTrackingEnabled(true)
        .setProminentFaceOnly(false)
        .setMode(FaceDetector.FAST_MODE)
        .setLandmarkType(FaceDetector.NO_LANDMARKS)
        .build();

2. Create a tracker

private class FaceTracker extends Tracker<Face> {
    public void onNewItem(int faceId, Face face) { … }
    public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) { … }
    public void onMissing(FaceDetector.Detections<Face> detectionResults) { … }
    public void onDone() { … }
}
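
As an illustration only (LoggingFaceTracker is a hypothetical name, not from the slides), a tracker that simply logs where each tracked face is could look like this:

private class LoggingFaceTracker extends Tracker<Face> {
    @Override
    public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
        // getPosition() returns the top-left corner of the detected face.
        Log.d("FaceTracker", "Face at " + face.getPosition().x + ", " + face.getPosition().y);
    }
}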

3. Create a factory

private class FaceTrackerFactory implements MultiProcessor.Factory<Face> {
    @Override
    public Tracker<Face> create(Face face) {
        return new FaceTracker(); // created in step two
    }
}

4. Create an associated processor

detector.setProcessor(new MultiProcessor.Builder<Face>(new FaceTrackerFactory()) // created in step three
        .build());

5. Create a camera source to capture images

mCameraSource = new CameraSource.Builder(getApplicationContext(), detector) // created in step four
        .setRequestedPreviewSize(640, 480)
        .setFacing(CameraSource.CAMERA_FACING_BACK)
        .setRequestedFps(30.0f)
        .build();
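
The source then has to be started with a preview surface. A minimal sketch, assuming a SurfaceView named mPreview in the layout and that the CAMERA permission has already been granted:

// Start delivering preview frames to the detector (e.g. in onResume()).
try {
    mCameraSource.start(mPreview.getHolder());
} catch (IOException e) {
    mCameraSource.release();
    mCameraSource = null;
}

// Stop the camera while the activity is not visible (e.g. in onPause()):
// mCameraSource.stop();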

So that was easy, wasn’t it?

...so can we finally start detecting faces?

Unfortunately not.

AndroidManifest.xml

<meta-data
    android:name="com.google.android.gms.vision.DEPENDENCIES"
    android:value="face" />

This tells Google Play services to download the native libraries in the background when the app is installed.

Unfortunately not (yet).

First you have to check whether the detector isOperational().
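
A sketch of that check, reusing the detector from step one:

// If the native libraries have not finished downloading yet, the detector
// returns no results, so warn the user instead of starting the pipeline.
if (!detector.isOperational()) {
    Log.w("FaceTracker", "Face detector dependencies are not yet available.");
}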

Now we are ready.

Demo time!

Thanks!

+FrederikSchweiger

Feel free to circle me on Google+ and send me your questions! You will find the slides in my Google+ stream.