BackTap: Robust Four-Point Tapping on the Back of an Off-the-shelf Smartphone

Cheng Zhang, Aman Parnami, Caleb Southern, Edison Thomaz, Gabriel Reyes, Rosa I. Arriaga, Gregory D. Abowd

School of Interactive Computing Georgia Institute of Technology

Atlanta, Georgia, USA

ABSTRACT We present BackTap, an interaction technique that extends the input modality of a smartphone to add four distinct tap locations on the back case of a smartphone. The BackTap interaction can be used eyes-free with the phone in a user’s pocket, purse, or armband while walking, or while holding the phone with two hands so as not to occlude the screen with the fingers. We employ three common built-in sensors on the smartphone (microphone, gyroscope, and accelerometer) and feature a lightweight heuristic implementation. In an evaluation with eleven participants and three usage conditions, users were able to tap four distinct points with 92% to 96% accuracy.

Author Keywords Acoustic classification; mobile; heuristics; inertial sensors; touch; always-available input; lightweight interaction

ACM Classification Keywords H.5.m. Information interfaces and presentation: User interfaces; Input devices and strategies

General Terms Human Factors; Design; Measurement.

INTRODUCTION Currently, the primary mode of interaction with a smartphone is limited to the front-facing touchscreen and a small number of physical buttons along the sides. However, contemporary smartphones are equipped with an increasing number of sensors, specifically accelerometers, gyroscopes and microphones, which can be exploited to support a wider variety of software-detected input events without any further change to the devices themselves. Current interactions based on the accelerometer, gyroscope, and microphone have only scratched the surface of the potential these built-in sensors offer.

Traditionally, these sensors have been used separately, based on the purpose for which each was designed. In this paper, we demonstrate detection of taps at four distinct locations on the back corners of a phone, by combining data from the built-in accelerometer, gyroscope, and microphone on a commodity smartphone.

Our tapping gestures on surfaces other than the touchscreen offer three main advantages: 1. An alternative set of interactions that augments touchscreen input. 2. Eyes-free interaction, including when the phone is in a user’s pocket. 3. No occlusion of the screen by the fingers.

RELATED WORK We review related work for interactions on the back of a mobile device, tapping interactions, and the use of built-in sensors on commodity phones.

Several researchers have experimented with shifting mobile interaction away from the touchscreen to the side and back of the device, as a means to eliminate occlusion, such as Behind Touch [9], BlindSight [6], and LucidTouch [10]. Tapping is one subset of the finger gestures available for interaction with a mobile phone. TapInput [7], ForceTap [2], and Whack Gestures [5] detected tap gestures with additional information from the accelerometer. Hinckley et al. designed an application to recognize the user’s tap gesture on the two top corners of a phone through inertial sensors [3]. One trade-off with many of the approaches presented so far is that they require additional sensing hardware to be attached to the mobile phone. Recently, Hinckley et al. explored techniques using only inertial sensors and touch input for hand-held devices, providing touch-enhanced motion and motion-enhanced touch gestures [4]. GripSense is a system that leverages mobile device touchscreens and only their built-in inertial sensors and vibration motor to infer hand postures [1]. PocketTouch [8] enables capacitive touch-based gesture control and text input on a mobile device through fabric enclosing the device.

In our project, BackTap, we extend these interactions using only the commodity sensors (i.e., the microphone, accelerometer, and gyroscope) embedded in the phone.

BACKTAP IMPLEMENTATION The BackTap interaction allows the user to tap on each of the four corners on the back case of the phone: top-left, top-right, bottom-left, and bottom-right. The interaction may be used while the phone is in a pocket or while holding it in two hands (see Figure 1). BackTap employs only the built-in gyroscope, accelerometer, and microphone to detect each tap and its location on the back of the phone. We implemented the BackTap prototype on a Samsung Galaxy S3 running Android 4.0.4.

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s). Copyright is held by the author/owner(s). UIST’13 Adjunct, October 8–11, 2013, St. Andrews, United Kingdom. ACM 978-1-4503-2406-9/13/10. http://dx.doi.org/10.1145/2508468.2514735

Posters UIST’13, October 8–11, 2013, St. Andrews, UK

The basic detection principle behind the interaction is the shake of the body of the phone when it is tapped or touched by other objects. We applied heuristic techniques to the data from the three sensors to identify the characteristic signature of taps on each of the four corners of the back case. Our approach consists of two phases of tap event detection: 1) segmentation, where we determine that a tap occurred; and 2) classification, where we determine the location of the tap on the back case of the phone.
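The two-phase structure above can be sketched as follows. This is a minimal illustration only: the thresholds, the use of gyroscope magnitude for segmentation, and the sign-to-corner conventions are assumptions for the sketch, not the prototype's calibrated heuristics.

```python
import math

# Illustrative trigger threshold -- the paper does not publish its
# calibrated values, so this number is an assumption.
GYRO_MAGNITUDE_THRESHOLD = 0.8  # rad/s

def segment(gyro_sample):
    """Phase 1 (segmentation): decide whether a tap occurred at all,
    here from the magnitude of the angular velocity vector."""
    x, y, z = gyro_sample
    return math.sqrt(x * x + y * y + z * z) > GYRO_MAGNITUDE_THRESHOLD

def classify(gyro_sample):
    """Phase 2 (classification): map the initial rotation direction to
    one of the four back corners.  A corner tap twists the phone about
    both in-plane axes; the sign conventions below are assumed."""
    x, y, _ = gyro_sample
    vertical = "top" if x < 0 else "bottom"
    horizontal = "left" if y > 0 else "right"
    return f"{vertical}-{horizontal}"

def detect_tap(gyro_sample):
    """Run both phases; return a corner label, or None if no tap."""
    if not segment(gyro_sample):
        return None
    return classify(gyro_sample)
```

For example, a sample with strong negative pitch and positive roll rotation, `detect_tap((-1.2, 0.9, 0.1))`, would be labeled `"top-left"` under these assumed conventions, while a low-magnitude sample returns `None`.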

We detect taps from a combination of data from the built-in accelerometer, gyroscope, and microphone on a commodity smartphone. In our initial prototype, we used only the inertial sensors to detect the tap event. However, in pilot testing we observed that when the user held the phone very tightly, these two motion sensors often failed to detect the tap event, even with a strong tap on the back case. To address this problem, we added the microphone to the set of BackTap heuristics as a third input sensor. We consider only loudness from the microphone buffer, measured in decibels (dB). In turn, the data from the inertial sensors helps reject environmental noise picked up by the microphone.
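A sketch of the loudness cue and its fusion with the motion cue is shown below. The RMS-based dBFS formula over signed 16-bit PCM samples and the threshold value are common choices assumed for illustration; the paper does not specify its exact loudness computation.

```python
import math

def buffer_loudness_db(samples, full_scale=32768.0):
    """Loudness of one audio buffer in dB relative to full scale,
    computed from the RMS of signed 16-bit PCM samples (the format an
    Android microphone buffer typically delivers)."""
    if not samples:
        return float("-inf")
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0.0:
        return float("-inf")
    return 20.0 * math.log10(rms / full_scale)

def confirms_tap(loudness_db, motion_detected, db_threshold=-30.0):
    """Fuse both cues: the buffer must be loud enough AND the inertial
    sensors must agree, so ambient noise alone cannot trigger a tap."""
    return loudness_db > db_threshold and motion_detected
```

Requiring both cues is what lets each sensor cover the other's weakness: a tight grip damps the motion signal but not the acoustic one, while background noise raises loudness without any matching motion.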

Figure 1. Two-hand and in-pocket interactions

EVALUATION We evaluated the accuracy of the interaction under three conditions: 1) in-pocket and stationary; 2) in-pocket and walking; and 3) two-hand. We conducted the evaluation over two days with 11 participants (mean age 24.7; one female), who tapped a total of 3,300 times in response to randomized stimuli at the four locations. Participants were able to perform the BackTap tapping interaction under all three usage conditions, with accuracies ranging from 92.27% to 96.67%.

DISCUSSION AND FUTURE WORK The BackTap interaction suggests a number of compelling applications. The in-pocket interaction is suitable for a variety of contexts that require eyes-free interaction while the phone is in a pocket, purse, or armband. One primary application of the in-pocket usage case is using the four tap points to control a music player. Many people place their phone in a pocket, purse, or armband while listening to music, where it is inconvenient to take the phone out to access the touchscreen controls. The back corners of the case could be mapped to volume up, volume down, track forward, and previous track. Another application of the in-pocket usage case is sending quick predefined text message replies in situations where it is socially inappropriate to actively use a touchscreen. The two-hand usage scenario is most compelling for gaming applications, as it exposes the entire screen area and avoids finger occlusion while offering up to four button controls on the back.
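Wiring the classified corners to the music-player actions described above could be as simple as a dispatch table. Which corner maps to which action is left open in the text, so the assignment here is hypothetical:

```python
# Hypothetical corner-to-action assignment for the music-player use
# case; the paper names the four actions but not their placement.
MUSIC_CONTROLS = {
    "top-left": "volume_up",
    "bottom-left": "volume_down",
    "top-right": "track_forward",
    "bottom-right": "previous_track",
}

def handle_tap(corner):
    """Dispatch a classified tap location to a media action; taps that
    do not resolve to a known corner are ignored (None)."""
    return MUSIC_CONTROLS.get(corner)
```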

In our BackTap prototype, we calibrated the tap-detection heuristics specifically for the Samsung Galaxy S3 phone. In future work, we will port the BackTap interaction to other phones that include a microphone, gyroscope, and accelerometer. We believe it is likely that some of these threshold settings are interdependent, and that per-phone calibration may only involve changing a few parameters. We are also considering machine learning methods to further develop the interactions.

CONCLUSION We presented BackTap, an interaction technique for tapping on the four back corners of a smartphone. It can be used when the touchscreen is not readily available, such as when the phone is in a pocket, purse, or armband. It can also be used with the phone held in landscape, so as not to occlude the small screen with the fingers, which promises to improve the gaming experience on mobile phones.

REFERENCES 1. Goel, M., Wobbrock, J., and Patel, S. GripSense: using built-in sensors to detect hand posture and pressure on commodity mobile phones. Proc. UIST ’12, ACM Press (2012).

2. Heo, S., and Lee, G. Forcetap: extending the input vocabulary of mobile touch screens by adding tap gestures. Proc. MobileHCI ’11, ACM Press (2011).

3. Hinckley, K., and Horvitz, E. Toward more sensitive mobile phones. Proc. UIST ’01, ACM Press (2001).

4. Hinckley, K., and Song, H. Sensor synesthesia: touch in motion, and motion in touch. Proc. CHI ’11, ACM Press (2011).

5. Hudson, S. E., Harrison, C., Harrison, B. L., and LaMarca, A. Whack gestures: inexact and inattentive interaction with mobile devices. Proc. TEI ’10, ACM (2010).

6. Li, K. A. Blindsight: eyes-free access to mobile phones. Proc. CHI ’08 (2008), 1389–1398.

7. Ronkainen, S., Häkkilä, J., Kaleva, S., Colley, A., and Linjama, J. Tap input as an embedded interaction method for mobile devices. Proc. TEI ’07, ACM (2007).

8. Saponas, T. S., Harrison, C., and Benko, H. PocketTouch: through-fabric capacitive touch input. Proc. UIST ’11, ACM (2011).

9. Shigeo, H., Isshin, M., and Kiyoshi, T. Behind Touch: a text input method for mobile phone by the back and tactile sense interface. Information Processing Society of Japan, Interaction 2003.

10. Wigdor, D., Forlines, C., Baudisch, P., Barnwell, J., and Shen, C. Lucid Touch: a see-through mobile device. Proc. UIST ’07, ACM Press (2007).
