
EyeDroid: An Open Source Mobile Gaze Tracker on Android for Eyewear Computers

Shahram Jalaliniya
IT University of Copenhagen
Rued Langgaards Vej 7
2300 Copenhagen S, Denmark
[email protected]

Daniel Garcia Garcia
IT University of Copenhagen
Rued Langgaards Vej 7
2300 Copenhagen S, Denmark
[email protected]

Diako Mardanbegi
IT University of Copenhagen
Rued Langgaards Vej 7
2300 Copenhagen S, Denmark
[email protected]

Ioannis Sintos
IT University of Copenhagen
Rued Langgaards Vej 7
2300 Copenhagen S, Denmark
[email protected]

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Ubicomp/ISWC ’15 Adjunct, September 07-11, 2015, Osaka, Japan
© 2015 ACM. ISBN 978-1-4503-3575-1/15/09 ... $15.00

DOI: http://dx.doi.org/10.1145/2800835.2804336

Abstract
In this paper we report on the development and evaluation of a video-based mobile gaze tracker for eyewear computers. Unlike most previous work, our system performs all of its processing workload on an Android device and sends the coordinates of the gaze point to an eyewear device over a wireless connection. We propose a lightweight software architecture for Android to increase the efficiency of the image processing needed for eye tracking. An evaluation of the system indicated an accuracy of 1.06 degrees and a battery lifetime of approximately 4.5 hours.

Author Keywords
Gaze tracking; Eyewear computer; Android; Google Glass

ACM Classification Keywords
H.5.2. [Information interfaces and presentation: User Interfaces]: Input devices and strategies

Introduction
With the emergence of a new generation of unobtrusive eyewear computers, such as Google Glass1 and the Vuzix smart glass2, it seems feasible that these devices will eventually play a role in everyday tasks. Due to the special form factor of eyewear devices, the delay between intention and action

1 https://developers.google.com/glass/
2 http://www.vuzix.com/consumer/productsm100/


UBICOMP/ISWC '15 ADJUNCT, SEPTEMBER 7–11, 2015, OSAKA, JAPAN


is very short compared to other mobile devices [10]. This opens new opportunities for eyewear computers to be used on the move and in parallel with real-world tasks. However, mobile interaction with eyewear devices is still challenging. For example, in a mobile scenario the user's hands may be busy with a manual activity, or the user might be engaged in a visually demanding task. This means the eyewear device needs to provide several interaction channels to support users in different situations. State-of-the-art eyewear devices already support head-gesture input, voice commands, and touch-based gestures. Eye gaze has also been studied as an input modality for head-mounted displays [3]. However, due to technical limitations, gaze-based interaction is still not supported by state-of-the-art eyewear devices. One of the main challenges is that the image processing required for gaze tracking is computationally complex and power-demanding. Unfortunately, this computational demand is far beyond what can be accomplished on existing eyewear devices such as Google Glass. In this paper, we investigate the possibility of using an Android smartphone to process eye images captured by a head-mounted camera and calculate gaze coordinates for an eyewear computer.

Related Work
Most recent mobile gaze trackers use a laptop in the user's backpack [5, 6] or a remote computer [9, 1] to analyze the eye image and calculate the gaze coordinates. The dependency of gaze tracking systems on a local or remote computer decreases the mobility of users. There are also commercial products from companies such as EyeTribe3, Tobii4, and Umoove5 that support eye

3 https://theeyetribe.com/
4 http://www.tobii.com/
5 http://www.umoove.me/

tracking on mobile and wearable devices, but commercial mobile gaze trackers are usually expensive and hard to afford. For this reason, some recent studies have tried to use cheap, small processors such as the Raspberry Pi [2] and micro-controllers [7] for eye tracking. Ferhat et al. [2] presented a cheap eye tracking solution running on a Raspberry Pi device. They based their work on the open-source Opengazer [8]. The average gaze estimation error of their system is about 1.4◦ for an image size of 640 × 480 pixels at a frame rate of 3 Hz. Although their system ran on a small device, it was only tested in a stationary setup for gaze tracking on a computer screen. More relevant to our paper is the iShadow eye tracker by Mayberry et al. [7], which focuses on head-mounted gaze tracking. They presented a fully mobile eye tracking solution using a low-power ARM Cortex-M3 micro-controller.

Figure 1: A schematic view of the system architecture: an extra USB camera sends the eye image to the EyeDroid app running on an Android smartphone, which streams gaze data to the GlassGaze app running on Google Glass.


Figure 2: EyeDroid software architecture. The eye tracking algorithm inside the core is decomposed into steps (filters) connected by pipes (arrows). Each composite is executed on a separate thread.

The focus of the iShadow system was mainly on implementing a very efficient video-based eye tracking approach that can run on a small micro-processor. They achieved real-time gaze tracking in an image captured by a front-view camera, and they reported an error of about 3 degrees for their system. Since the display of an eyewear computer is very small (less than 15 degrees), the accuracy of the eye tracker needs to be higher to provide graceful interaction. In this paper, to achieve higher accuracy in eye tracking (about 1 degree), we rely on the processing capacity of commonly used mobile devices. The proposed mobile eye tracker is an affordable open-source solution for gaze tracking on eyewear computers.

System Architecture
The proposed system comprises two main components: (1) our gaze tracking application (EyeDroid), running as a server on an Android smartphone, and (2) the client application on Google Glass (GlassGaze)6. A schematic view of the system architecture is shown in Figure 1.

EyeDroid: Gaze Tracker Server on Android
Hardware
The hardware requirements of the current implementation of the EyeDroid eye tracker are an Android mobile device (minimum API level 15) and a head-mounted USB 2.0 infrared camera connected directly to the Android phone through a USB cable. The first hardware prototype of EyeDroid is shown in Figure 5. The recommended camera resolution is 640 × 480 pixels. Because the Android platform does not natively support connecting an external USB camera, the phone must be rooted and customized camera video drivers must be used. EyeDroid uses open-source third-party drivers [4].

Software Architecture
The software architecture of the EyeDroid application is based on the pipes-and-filters design pattern. This architecture made it easy to test different algorithm configurations during system development. As part of the EyeDroid software platform we built and used the Java Lightweight Processing Framework (JLPF)7 (Figure 3) as an external library. This design allows for a fully configurable algorithm in terms of decomposability and scheduling of the steps on the available processing resources, instead of a monolithic algorithm that would perform poorly. Finally, in order to divide the algorithm into steps of equal execution time, the composite pattern was implemented to allow composition of individual steps (see Figure 2). Since performance is a

6 https://github.com/centosGit/GlassGaze
7 https://github.com/centosGit/JLPF


WORKSHOP


critical issue, we used the Android NDK support for C++ instead of the regular Android SDK for Java. This allowed the algorithm code to run natively on the processing resources and access system libraries directly, unlike Java code, which would run in a virtual machine.
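The pipes-and-filters decomposition described above can be sketched in a few lines. The snippet below is an illustrative Python model (the actual EyeDroid/JLPF code is Java and C++): each filter runs on its own thread and passes results downstream through a queue, which is what lets the steps of the algorithm execute in parallel on the available cores.

```python
import queue
import threading

def run_stage(stage, inbox, outbox):
    """Run one filter on its own thread, reading from an in-pipe and
    writing to an out-pipe, until a None sentinel arrives."""
    while True:
        item = inbox.get()
        if item is None:          # propagate the shutdown signal
            outbox.put(None)
            break
        outbox.put(stage(item))

def build_pipeline(stages):
    """Connect the filters with queues (pipes); return (source, sink)."""
    pipes = [queue.Queue() for _ in range(len(stages) + 1)]
    for stage, inbox, outbox in zip(stages, pipes, pipes[1:]):
        threading.Thread(target=run_stage,
                         args=(stage, inbox, outbox),
                         daemon=True).start()
    return pipes[0], pipes[-1]

# Hypothetical stand-ins for the real image-processing filters.
stages = [lambda x: x + 1, lambda x: x * 2]
source, sink = build_pipeline(stages)
for frame in [1, 2, 3]:
    source.put(frame)
source.put(None)
results = []
while (r := sink.get()) is not None:
    results.append(r)
print(results)  # [4, 6, 8]
```

Because each pipe is a thread-safe queue, a slow filter only stalls its own stage; composing two short steps into one composite (as in Figure 2) is a way to balance the per-stage execution times.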

Figure 3: Java Lightweight Processing Framework (JLPF) software architecture.

Gaze tracking method
In order to achieve real-time image processing, we skip detecting the corneal reflection in the image, which could compensate for small movements of the camera relative to the eye. Only the pupil center obtained from the eye image is used for gaze estimation. Pupil detection is done by applying a simple blob detection algorithm to the eye image as follows: (step 1) the image is first converted to grey-scale; (step 2) a morphological operation is applied to the resulting image; (step 3) a threshold is applied at a constant level of around 70; (step 4) after thresholding, a morphological operation is applied before blob detection (see Figure 4). To reduce computation time, in each frame a region of interest (ROI) is defined to which the image processing is applied. In the first frame, the ROI is the entire image. Once the pupil has been found in previously processed frames, the ROI is reduced to 30% of the image size and moved to the most recently computed pupil coordinates (from the last frame whose processing was completed).
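The ROI update described above can be sketched as follows. This is an illustrative Python version under two assumptions the paper does not spell out: that "30% of the image size" means 30% per dimension, and that the ROI is clamped to the image bounds.

```python
def next_roi(pupil, image_size, roi_scale=0.3):
    """Centre a reduced region of interest on the last pupil position.

    pupil: (x, y) of the most recently detected pupil centre, or None
           if the pupil has not been found yet (first frame).
    image_size: (width, height) of the full eye image.
    roi_scale: assumed to mean 30% per dimension (not 30% of area).
    Returns (x, y, width, height) of the ROI.
    """
    w, h = image_size
    if pupil is None:
        return (0, 0, w, h)  # first frame: search the whole image
    rw, rh = int(w * roi_scale), int(h * roi_scale)
    # Centre the ROI on the pupil, clamped so it stays inside the image.
    x = min(max(pupil[0] - rw // 2, 0), w - rw)
    y = min(max(pupil[1] - rh // 2, 0), h - rh)
    return (x, y, rw, rh)

print(next_roi(None, (640, 480)))        # (0, 0, 640, 480)
print(next_roi((320, 240), (640, 480)))  # (224, 168, 192, 144)
```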

Calibration
In the current implementation of the system, a homography transformation is used as the gaze mapping function. The mapping function is obtained from a calibration process consisting of a minimum of four calibration markers on the display.
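A homography can be fitted from the four (or more) marker correspondences with the standard direct linear transform (DLT). The sketch below is a minimal NumPy illustration, not the EyeDroid implementation; the pupil coordinates are hypothetical.

```python
import numpy as np

def fit_homography(src, dst):
    """Estimate a 3x3 homography H with dst ~ H @ src (homogeneous
    coordinates), from >= 4 point pairs, via the DLT: stack two linear
    constraints per pair and take the null vector of the system."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)
    return vt[-1].reshape(3, 3)   # null space = last right singular vector

def map_point(H, p):
    """Apply the homography to a 2D point."""
    x, y, w = H @ np.array([p[0], p[1], 1.0])
    return (x / w, y / w)

# Hypothetical calibration: pupil centres vs. the four marker positions
# at the corners of the 640 x 320 Glass display.
pupil = [(100, 80), (220, 80), (100, 180), (220, 180)]
screen = [(0, 0), (640, 0), (0, 320), (640, 320)]
H = fit_homography(pupil, screen)
print(map_point(H, (160, 130)))  # the display centre, (320, 160)
```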

GlassGaze: Gaze Tracker Client on Google Glass
GlassGaze [1] is an Android app for Google Glass, originally developed as a client for the open-source Haytham gaze tracker [1]. GlassGaze provides a convenient user interface that can be controlled by voice and finger gestures. The client runs an Android background service that receives the gaze data from the server and allows different applications to subscribe to its messages. This background service also allows applications on the Glass to communicate with the gaze tracking server. By applying the same messaging protocol


defined in GlassGaze, we could easily use this app as our client.
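The subscription mechanism described above is essentially the observer pattern. A minimal Python sketch (all names here are hypothetical; the real client is an Android background service):

```python
class GazeService:
    """Sketch of a background service that receives gaze samples from
    the tracking server and forwards them to subscribed applications."""

    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        """Register an application callback for incoming gaze samples."""
        self._subscribers.append(callback)

    def on_gaze_sample(self, x, y):
        """Called when an (x, y) gaze coordinate arrives; fan it out."""
        for cb in self._subscribers:
            cb(x, y)

service = GazeService()
received = []
service.subscribe(lambda x, y: received.append((x, y)))
service.on_gaze_sample(320, 160)
print(received)  # [(320, 160)]
```

Decoupling the apps from the network connection this way is what lets several Glass applications consume the same gaze stream without each opening its own link to the server.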

Figure 4: Pupil detection steps

System Performance
Accuracy of the Gaze Tracker
To measure the accuracy of our gaze tracker, we conducted an experiment with 10 participants recruited among students from our university. Participants were asked to wear the Google Glass and run the GlassGaze application. First, we had a training session in which they tried the system until they felt comfortable with it. We started the experiment with a four-point calibration. After calibration, 15 markers were displayed in random order on the Google Glass display, each for one second. Participants were asked to look at each marker immediately after it appeared (see Figure 5). The 15 markers were distributed evenly in 3 rows and 5 columns (Figure 6).

Figure 5: A participant performing the task.

To measure the gaze coordinate, the average gaze coordinate over the last 700 milliseconds of looking at each marker was calculated. The average deviation from the actual marker position was 52.61 pixels, with a standard deviation of 35.26 pixels. This corresponds to a gaze tracking error of 1.06 degrees. The distribution of the error for each marker is illustrated in Figure 6. The X and Y dimensions of the graph in Figure 6 represent the two dimensions of the Google Glass display (maximum 640 × 320), and the gray circles around each marker show the average error of the detected gaze point for that marker.
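The two reported figures imply the angular scale of the display. Assuming an approximately linear pixel-to-degree mapping across the 640-pixel display width (an assumption, since the paper does not give the display's angular span), the implied span works out to be consistent with the "less than 15 degrees" display size mentioned in the related work:

```python
# The paper reports a mean error of 52.61 px, equal to 1.06 degrees.
# Under a linear px-to-degree mapping, the 640 px display width then
# spans about 640 * 1.06 / 52.61 degrees of visual angle.
error_px = 52.61
error_deg = 1.06
px_per_deg = error_px / error_deg          # ~49.6 px per degree
display_span_deg = 640 / px_per_deg        # ~12.9 degrees
print(round(px_per_deg, 1), round(display_span_deg, 1))  # 49.6 12.9
```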


Figure 6: The average error of the gaze estimation for each marker (dots) is illustrated by circles around the markers; the axes show display position in pixels (X and Y).

Battery Life
To calculate the battery life of the mobile device (a brand-new LG G2 smartphone with 2 GB RAM, a quad-core 2.26 GHz Krait 400 processor, an Adreno 330 GPU, running Android 4.4) while running the EyeDroid application, we measured the battery charge (given by the Android built-in battery level indicator) every hour for 3 hours. To compensate for the inaccuracy of the built-in indicator, the device was fully charged before each experiment, all apps other than default Android services were closed, and the screen brightness was minimized. Each measurement was repeated three times and the results were averaged.

Figure 7: Comparison between EyeDroid and two other popular applications, showing cumulative energy consumption (%) per hour.

Since EyeDroid can optionally show the resulting coordinates drawn on top of the input video streamed to the device display, two different experiments were conducted: first with the video preview enabled, and second with the preview disabled. To establish a baseline for comparison, the battery life of the device running two other popular applications was measured in the same way: YouTube video streaming and the Hill Climbing racing game. The results suggest that EyeDroid behaves similarly to the Hill Climbing game, deviating by approximately 10% per hour. The maximum estimated battery life running EyeDroid in preview-disabled mode is approximately 4.5 hours.
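The lifetime figure follows from a linear extrapolation of the measured hourly drain. A small sketch of the arithmetic; the hourly readings here are hypothetical, chosen only to be consistent with the reported ~4.5 hour result:

```python
def estimated_lifetime_hours(levels):
    """Extrapolate battery life from hourly charge readings (percent),
    assuming a roughly linear drain, as in the paper's measurements."""
    drains = [a - b for a, b in zip(levels, levels[1:])]
    rate = sum(drains) / len(drains)   # mean drain in % per hour
    return 100.0 / rate                # hours from full to empty

# Hypothetical readings: full charge, then one reading per hour
# for three hours, as in the experiment described above.
print(round(estimated_lifetime_hours([100, 78, 56, 33]), 1))  # 4.5
```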

Discussion & Conclusion
In this paper we presented a monocular mobile gaze tracker running on an Android smartphone to support gaze-based interaction with eyewear devices. We used an efficient, lightweight software architecture to divide the image processing task into parallel threads. Using this approach, we reached a performance of 6.41 fps in the image processing task. The experimental study showed an


accuracy of 1.06 degrees for our gaze tracker. The error areas (gray zones) around each marker in Figure 6 show that our gaze tracker can be used for interaction with Google Glass, since users are able to accurately point to (at least) 15 different objects on the display. Although the head gear was fixed on the head, small movements of the camera relative to the eye could create a relatively large error in the gaze tracking result, because the gaze mapping uses only the pupil center. As future work, we will add glint detection to increase the robustness of the system.

Acknowledgments
This work was supported by the EU Marie Curie Network iCareNet under grant number 264738.

References
[1] Haytham gaze tracker. http://itu.dk/research/eye/, June 2014.

[2] Ferhat, O., Vilarino, F., and Sanchez, F. A cheap portable eye-tracker solution for common setups. Journal of Eye Movement Research 7, 3 (2014), 2.

[3] Jalaliniya, S., Mardanbeigi, D., Pederson, T., and Hansen, D. Head and eye movement as pointing modalities for eyewear computers. In 2014 11th International Conference on Wearable and Implantable Body Sensor Networks Workshops (BSN Workshops) (June 2014), 50–53.

[4] Lab, K. Usage of USB webcam with customized Galaxy Nexus (Android 4.0.3). http://brain.cc.kogakuin.ac.jp/research/usb-e.html.

[5] Li, D., Babcock, J., and Parkhurst, D. J. openEyes: A low-cost head-mounted eye-tracking solution. In Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, ETRA ’06, ACM (New York, NY, USA, 2006), 95–100.

[6] Lukander, K., Jagadeesan, S., Chi, H., and Muller, K. OMG!: A new robust, wearable and affordable open source mobile gaze tracker. In Proceedings of the 15th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI ’13, ACM (New York, NY, USA, 2013), 408–411.

[7] Mayberry, A., Hu, P., Marlin, B., Salthouse, C., and Ganesan, D. iShadow: Design of a wearable, real-time mobile gaze tracker. In Proceedings of the 12th Annual International Conference on Mobile Systems, Applications, and Services, MobiSys ’14, ACM (New York, NY, USA, 2014), 82–94.

[8] Nel, E., MacKay, D., Zielinski, P., Williams, O., and Cipolla, R. Opengazer: Open-source gaze tracker for ordinary webcams.

[9] Rantanen, V., Vanhala, T., Tuisku, O., Niemenlehto, P., Verho, J., Surakka, V., Juhola, M., and Lekkala, J. A wearable, wireless gaze tracker with integrated selection command source for human–computer interaction. IEEE Transactions on Information Technology in Biomedicine 15, 5 (Sept. 2011), 795–801.

[10] Starner, T. Project Glass: An extension of the self. IEEE Pervasive Computing 12, 2 (April 2013), 14–16.
