
1

TAPSENSE: ENHANCING FINGER INTERACTION ON TOUCH SURFACES

In Proceedings of the 24th ACM UIST Symposium, 2011, Santa Barbara, CA

2

• Recognized by MIT's Technology Review as one of the world's top 35 innovators under 35

• Recipient of Google's Ph.D. Fellowship in 2012

• Current Carnegie Mellon Ph.D. student

• Recipient of Microsoft's Ph.D. Fellowship in 2012

• Current Carnegie Mellon Ph.D. student

• Expert Juggler

• Professor in the HCI Institute at Carnegie Mellon University

• Inducted into the CHI Academy in 2006

• Founding director of the Ph.D. program in HCI at Carnegie Mellon University

CHRIS HARRISON, JULIA SCHWARZ, SCOTT E. HUDSON

3

• The use of touch interfaces has increased significantly, but the sophistication of touch input has not

• Tablet PC touchscreens

• Mobile touchscreens

• Fingers are highly dexterous and can be used far more richly than the simple x/y coordinate method currently used to measure finger input

4

• Gives the touchscreen the ability to identify the type of object used as input

• Can differentiate between a nail, tip, knuckle, and pad

• TapSense relies on the fact that different parts of the finger produce distinct acoustic signatures (see the sketch after this list)

• Many features build on what has been done with Skinput technology

Oh no! The dreaded Fat Fingers

• May solve two current issues with touch interaction:

• Finger overloading

• Breaking out functionality
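
The slides do not explain how the classification itself works, so here is a minimal, hypothetical sketch of the general idea: extract spectral features from a short tap window and feed them to a trained classifier. It assumes numpy and scikit-learn; the feature set and the SVM choice are illustrative stand-ins, not the paper's exact pipeline.

```python
import numpy as np
from sklearn.svm import SVC

def tap_features(samples, sr=44100):
    """Illustrative spectral features for one short tap window (not the paper's exact set)."""
    windowed = samples * np.hanning(len(samples))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), 1.0 / sr)
    power = spectrum ** 2
    centroid = np.sum(freqs * power) / (np.sum(power) + 1e-12)  # where the tap's energy sits
    band_energy = np.array([b.mean() for b in np.array_split(spectrum, 16)])  # coarse FFT bands
    return np.concatenate(([centroid, samples.std(), np.abs(samples).mean()], band_energy))

def train_classifier(tap_windows, labels):
    """Fit a classifier on labeled taps (pad / tip / knuckle / nail) and return it."""
    X = np.stack([tap_features(w) for w in tap_windows])
    clf = SVC(kernel="rbf")  # an SVM is one plausible choice; the slides do not specify one
    clf.fit(X, labels)
    return clf
```

Once trained, classifying a new tap is just clf.predict(tap_features(window).reshape(1, -1)), which is how the same touch location could trigger different actions for a pad, knuckle, or nail.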

5

• iPod Touch with an acoustic sensor attached

• Acoustic sensor connected to a computer for recording

• Discussed the possibility of mobile phones using their built-in microphones to implement TapSense (see the sketch below)

MOBILE PROOF OF CONCEPT
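
The slides note that a phone's built-in microphone could stand in for the externally attached acoustic sensor. Below is a rough, hypothetical sketch of capturing audio and segmenting candidate tap windows with a simple amplitude threshold; the sounddevice library and the threshold value are assumptions for illustration, not part of the paper.

```python
import numpy as np
import sounddevice as sd  # assumed audio-capture library; any microphone API would do

SR = 44100        # sample rate
THRESHOLD = 0.05  # illustrative amplitude threshold for detecting a tap onset
WINDOW = 2048     # samples kept around each detected tap, handed to the classifier

def detect_taps(duration_s=5.0):
    """Record from the default (built-in) microphone and return candidate tap windows."""
    audio = sd.rec(int(duration_s * SR), samplerate=SR, channels=1, dtype="float32")
    sd.wait()  # block until the recording finishes
    signal = audio[:, 0]
    taps, i = [], 0
    while i < len(signal) - WINDOW:
        if abs(signal[i]) > THRESHOLD:         # crude onset: amplitude crosses the threshold
            taps.append(signal[i:i + WINDOW])  # keep this window for feature extraction
            i += WINDOW                        # skip past the tap before looking again
        else:
            i += 1
    return taps
```

A real deployment would pair each acoustic tap with the touchscreen's x/y event so the classified finger part can modify the action at that location.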

6

TABLE PROOF OF CONCEPT

7

SAMPLE APPLICATION

8

• 18 Participants involved

• 9 Male, 9 Female

• 9 participants for the prototype mobile device

• 4 female, 5 male

• Tested an input set of pad, tip, knuckle, nail, and stylus

• 9 participants for the multitouch table

• 5 female, 4 male

• Tested all finger inputs as well as 7 input tools

• 10 taps at different screen locations, repeated 4 times

9

• Mobile Accuracy:

• 88.3% with all input types

• Tip was the worst-performing input

• Finger and pad had an accuracy rating of 99.4%

• Table Accuracy:

• 86.3% with all input types

• Tip was also the worst-performing input

• Finger and pen combo – accuracy of 99.7%

10

• Any other questions?

• Is TapSense limited to mobile devices, or could it be integrated with other technologies mentioned in the paper?

• How do you imagine TapSense helping you? Is this the end of buttons/keypads?

• Will we soon see technology like TapSense in our mobile devices?

QUESTIONS

THANKS!