The Glass Class at AWE 2015


Transcript of The Glass Class at AWE 2015

The Glass Class Rapid Prototyping for Wearables

June 8th, AWE 2015

Mark Billinghurst HIT Lab NZ

University of Canterbury [email protected]

Introduction

Mark Billinghurst ▪  Ex-Director of HIT Lab NZ, University of Canterbury

▪  PhD Univ. Washington

▪  Research on AR, mobile HCI, Collaborative Interfaces

▪  More than 300 papers in AR, VR, interface design

▪  Sabbatical in Glass team at Google [x] in 2013

Goals and Schedule

Goals

Learn simple interface design guidelines

Learn useful prototyping tools

Learn where further resources are

Schedule

10:00 Introduction

10:05 Wearable Interface Design (15 minutes)

10:20 Prototyping Tools (25 minutes)

10:45 Finish

Room → Desk → Lap → Hand → Head

Wearable Computing ▪  Computer on the body that is: ▪  Always on ▪  Always accessible ▪  Always connected

▪  Other attributes ▪  Augmenting user actions ▪  Aware of user and surroundings

History of Wearables ▪  1960-90: Early Exploration ▪  Custom built devices

▪  1990 - 2000: Academic, Military Research ▪  MIT, CMU, Georgia Tech, EPFL, etc ▪  1997: ISWC conference starts

▪  1995 – 2005+: First Commercial Uses ▪  Niche industry applications, Military

▪  2010 - : Second Wave of Wearables ▪  Consumer applications, Head Worn

● Second Gen. Systems ▪ Recon (2010 - ) ▪ Head worn displays for sports ▪ Ski goggle display ▪ Investment from Intel (2013)

▪ Google (2011 - ) ▪ Google Glass ▪ Consumer focus

● Recon Use Case ▪ While skiing show: ▪ maps, ▪ speed, ▪ altitude ▪ phone calls ▪ text messages

Google Glass (2011 - )

● View Through Google Glass

Always available peripheral information display

Combining computing, communications and content capture

● Smart Glass in 2015

Design Guidelines

How do you Design for this?

● Designing for Intimacy ▪  Interface Design for wearables means designing for the body: ▪  Designing for Attention

▪  Designing for Social Interaction

▪  User Experience Design

● Attention

●  Micro-Interactions

Using a mobile phone, people split their attention between the display and the real world

● Using Micro Interactions

Quick micro-interactions reduce divided attention and allow people to spend more time in the real world

● Typical Interaction Times


● Time Looking at Screen

Oulasvirta, A. (2005). The fragmentation of attention in mobile interaction, and what to do with it. interactions, 12(6), 16-18.

● Dividing Attention to World

▪ Number of times looking away from mobile screen

●  Design for MicroInteractions ▪  Design interactions less than a few seconds ▪  Tiny bursts of interaction ▪  One task per interaction ▪  One input per interaction

▪  Benefits ▪  Use limited input ▪  Minimize interruptions ▪  Reduce attention fragmentation

NHTSA Guidelines - www.nhtsa.gov

For technology in cars:
•  Any task by a driver should be interruptible at any time.
•  The driver should control the pace of task interactions.
•  Tasks should be completed with glances away from the roadway of 2 seconds or less.
•  Cumulative time glancing away from the road <= 12 secs.

Rule of Thumb: The interface should not take more than 4 seconds to complete a given step in the interaction.

● Designing for Interruptions

▪ Assume user is engaged in critical real world task ▪ Use context to filter interruptions (is it necessary?) ▪ Interrupt in way that consumes least attention ▪ Allow user to dismiss interruption with minimal effort ▪ Progressively disclose information and increase interaction

● Example: Interruptions on Glass

▪  Gradually increase engagement and attention load ▪  Respond to user engagement

Receiving SMS on Glass

Glass: “Bing” → Show Message → Start Reply

User: Look Up → Tap → Swipe → Say Reply

● Consider Cognitive Load (Input)

▪ Consider the Cognitive Load required for input ▪  Little user input = low cognitive load (e.g. Sensor) ▪  Constant user input = high cognitive load (e.g. touch)

Continuum of Cognitive Load for User Input

● Cognitive Load (Output)

▪  The Cognitive Resource consumed by system output ▪  Agents = low cognitive load (e.g. Web shopping agent) ▪  Multimedia = high cognitive load (e.g. VR system)

Continuum of Cognitive Load for Output

● Social Interactions

● Social Acceptance

▪ People don’t want to look silly ▪ Only 12% of 4,600 adults would be willing to wear AR glasses ▪ 20% of mobile AR browser users experience social issues

▪ Acceptance more due to Social than Technical issues ▪ Needs further study (ethnographic, field tests, longitudinal)

Rule of Thumb: Fashion First - It DOES NOT MATTER what the device does unless the user is willing to put it on the first time.

TAT Augmented ID

Rule of Thumb: Make the interface not just for the user, but also for the people around the user, both physically and socially.

● User Experience

Last year → Last week → Now → Forever

The Now machine: focus on location, contextual and timely information, and communication.

● Consider Your User

▪ Wearable User ▪ Probably Mobile ▪ One/no hand interaction ▪ Short application use ▪ Need to be able to multitask ▪ Use in outdoor or indoor environment ▪ Want to enhance interaction with real world

How would you take a note?

How would you take a note?

How would you take a note?

How does she take a note?

Rule of Thumb: Provide multiple ways of accessing functionality.

● Example: Glass Pictures

▪ On Glass there are three ways to take a picture 1/ Voice commands – “Ok Glass, Take a Picture” 2/ Touch navigation through menu

3/ Winking with right eye

▪ Which you use depends on context ▪ Riding a bike outdoors – voice commands ▪ During a meeting – winking

● Design For Device

▪ Simple, relevant information ▪ Complement existing devices

It's like a rear view mirror.

Don't overload the user. Stick to the absolutely essential, avoid long interactions. Be explicit.

Make it glanceable

Seek to rigorously reduce information density. Successful designs afford for recognition, not reading.

Bad Good

✓Reduce the number of info chunks

You are designing for recognition, not reading. Reducing the total # of information chunks will greatly increase the glanceability of your design.

Design single interactions to be faster than 4 s

Glanceability test done by Morten Just using a watch, comparing a watch face with 5-6 information chunks to a simplified face with 3 chunks.

5-6 chunk design, eye movements: chunk 1: 1 (230ms), chunk 2: 1 (230ms), chunk 3: 1 (230ms), chunk 4: 3 (690ms), chunk 5: 2 (460ms) → total ~1,840ms

3 chunk design, eye movements: chunk 1: 1-2 (460ms), chunk 2: 1 (230ms), chunk 3: 1 (230ms) → total ~920ms

✓ Test the glanceability of your design

✓ Do one thing at a time

✓ Test your design indoors + outdoors

● Don’t Get in the Way

▪ Enhance, not replace, real world interaction

● Keep it Relevant

▪ Information at the right time and place

Remember, people have an ever-growing ecosystem of wearables

Each device should be used when it’s most relevant and when it’s the easiest interaction available.

Interface Guidelines ▪  Design for device ▪  Use Micro Interaction ▪  Make it glanceable ▪  Do one thing at a time ▪  Reduce number of information chunks ▪  Design for indoor and outdoor use

Universal Design Principles ▪  Equitable use ▪  Flexibility ▪  Simple and intuitive ▪  Easy to perceive ▪  Low physical effort ▪  High tolerance for error

Summary

Attention: least visual-manual attention necessary, 4 second checkpoints, < 2 second access time

Social: graceful interfaces, multiple ways of accessing functionality

User Experience: glanceable interface, design for device, multiple ways of accessing information, keep it relevant

Prototyping

Why Prototype? ▪  Quick visual design ▪  Capture key interactions ▪  Focus on user experience ▪  Communicate design ideas ▪  “Learn by doing/experiencing”

● Google Glass Prototyping

Prototyping Tools ▪  Static/Low fidelity ▪  Sketching ▪  User interface templates ▪  Storyboards/Application flows

▪  Interactive/High fidelity ▪  Wireframing tools ▪  Mobile prototyping ▪  Native Coding

Important Note ▪  Most current wearables run Android OS ▪  eg Glass, Vuzix, Atheer, Epson, etc

▪  So many tools for prototyping on Android mobile devices will work for wearables

▪  If you want to learn to code, learn ▪  Java, Android, Javascript/PHP

Typical Development Steps ▪  Sketching ▪  Storyboards ▪  UI Mockups ▪  Interaction Flows ▪  Video Prototypes ▪  Interactive Prototypes ▪  Final Native Application

Increased Fidelity & Interactivity

Sketched Interfaces

▪  Sketch + Powerpoint/Photoshop/Illustrator

● Paper Prototype

▪ Use sketched interface in template

GlassSim – http://glasssim.com/

▪  Simulate the view through Google Glass ▪  Multiple card templates

GlassSim Card Builder ▪  Use HTML for card details ▪  Multiple templates ▪  Change background ▪  Own image ▪  Camera view

GlassSim Samples

Glass UI Templates

▪  Google Glass Photoshop Templates ▪  http://glass-ui.com/ ▪  http://dsky9.com/glassfaq/the-google-glass-psd-template/

Sample Slides From Templates

● Smart Watch Templates

▪ Eg https://dribbble.com/jaysuthar/buckets/260235-watch

Application Storyboard

▪  http://dsky9.com/glassfaq/google-glass-storyboard-template-download/

Application Flow

● Glassware Flow Designer

▪ Quick flow layout tool ▪ https://glassware-flow-designer.appspot.com/

● Viewing Design on Device ▪ Android Design Preview ▪ https://github.com/romannurik/AndroidDesignPreview

▪ View a portion of your desktop on Android device ▪ Select region of screen ▪ Mirror it on Android Device

▪ Use to view mock-ups on target device ▪ Eg Powerpoint for Glass mockups

Limitations ▪  Positives ▪  Good for documenting screens ▪  Can show application flow

▪  Negatives ▪  No interactivity/transitions ▪  Can’t be used for testing ▪  Can’t deploy on wearable ▪  Can be time consuming to create

Transitions

▪ Series of still photos in a movie format. ▪ Demonstrates the experience of the product ▪ Discover where concept needs fleshing out. ▪ Communicate experience and interface ▪ You can use whatever tools, from Flash to iMovie.

Video Sketching

See https://vine.co/v/bgIaLHIpFTB

Example: Video Sketch of Vine UI

UI Concept Movies

● Pop - https://popapp.in/

▪ Combining sketching and interactivity on mobiles ▪ Take pictures of sketches ▪ Link pictures together

● Using Pop

Interactive Wireframes

Interactive Wireframing ▪  Developing interactive interfaces/wireframes ▪  Transitions, user feedback, interface design

▪  Web based tools ▪  UXpin - http://www.uxpin.com/ ▪  proto.io - http://www.proto.io/

▪  Native tools ▪  Justinmind - http://www.justinmind.com/ ▪  Axure - http://www.axure.com/

UXpin - www.uxpin.com

▪  Web based wireframing tool ▪  Mobile/Desktop applications ▪  Glass templates, run in browser

https://www.youtube.com/watch?v=0XtS5YP8HcM

Proto.io - http://www.proto.io/ ▪  Web based mobile prototyping tool ▪  Features ▪  Prototype for multiple devices ▪  Gesture input, touch events, animations ▪  Share with collaborators ▪  Test on device

● Proto.io Android Wear Demo

▪ https://proto.io/showcase/android-wear/

Proto.io - Interface

Demo: Building a Simple Flow

Gesture Flow

(Flow diagram: screens Scr1–Scr6 linked by Tap and Swipe gesture transitions, starting from Scr1)

Demo

Wireframe Limitations ▪  Can’t deploy on Device ▪  No access to sensor data ▪  Camera, orientation sensor

▪  No multimedia playback ▪  Audio, video

▪  Simple transitions ▪  No conditional logic

▪  No networking

Processing for Wearables

Processing ▪  Programming tool for Artists/Designers ▪  http://processing.org ▪  Easy to code, Free, Open source, Java based ▪  2D, 3D, audio/video support

▪  Processing For Android ▪  http://wiki.processing.org/w/Android ▪  Strong Android support ▪  Generates Android .apk file

Processing - Motivation ▪  Language of Interaction ▪  Sketching with code ▪  Support for rich interaction

▪  Large developer community ▪  Active help forums ▪  Dozens of plug-in libraries

▪  Strong Android support ▪  Easy to run on wearables

http://processing.org/

http://openprocessing.org/

Development Environment

Basic Parts of a Processing Sketch

/* Notes comment */

// set up global variables
float moveX = 50;

// Initialize the Sketch
void setup() {
}

// draw every frame
void draw() {
}
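For illustration, a minimal sketch (an assumed example, not from the original slides) that fills in this skeleton and uses the moveX variable to animate a circle:

float moveX = 50;            // global variable: x position of the circle

// Initialize the Sketch: runs once
void setup() {
  size(640, 360);
}

// draw every frame
void draw() {
  background(0);                      // clear the frame
  ellipse(moveX, height/2, 40, 40);   // draw a circle at the current position
  moveX = moveX + 1;                  // move right a little each frame
  if (moveX > width) {
    moveX = 0;                        // wrap around at the edge of the screen
  }
}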

Importing Libraries ▪  Can add functionality by importing libraries ▪  Java archives - .jar files

▪  Include import code import processing.opengl.*;

▪  Popular Libraries ▪  Minim - audio library ▪  OCD - 3D camera views ▪  Physics - physics engine ▪  bluetoothDesktop - bluetooth networking

http://toxiclibs.org/
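As an illustration of the import pattern (a sketch of ours, not from the original slides; assumes the built-in P3D renderer), importing the OpenGL library and drawing a rotating box:

import processing.opengl.*;   // imported library, as shown above

float angle = 0;

void setup() {
  size(640, 360, P3D);        // use the 3D renderer
}

void draw() {
  background(0);
  translate(width/2, height/2);   // move the origin to the screen centre
  rotateY(angle);                 // spin around the vertical axis
  box(100);                       // draw a cube
  angle += 0.02;
}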

Processing and Glass ▪  One of the easiest ways to build rich interactive wearable applications ▪  focus on interactivity, not coding

▪  Collects all sensor input ▪  camera, accelerometer, touch

▪  Can build native Android .apk files ▪  Side load onto Glass

Example: Hello World

// called initially at the start of the Processing sketch
void setup() {
  size(640, 360);
  background(0);
}

// called every frame to draw output
void draw() {
  background(0);
  // draw a white text string showing Hello World
  fill(255);
  text("Hello World", 50, 50);
}

Demo

Hello World Image

PImage img;  // Create an image variable

void setup() {
  size(640, 360);
  // load the ok glass home screen image
  img = loadImage("okGlass.jpg");  // Load the image into the program
}

void draw() {
  // Displays the image at its actual size at point (0,0)
  image(img, 0, 0);
}

Demo

Touch Pad Input ▪  Tap recognized as DPAD input

void keyPressed() {
  if (key == CODED) {
    if (keyCode == DPAD) {
      // Do something ..
    }
  }
}

▪  Java code to capture rich motion events ▪  import android.view.MotionEvent;

Motion Event

// Glass Touch Events - reads from touch pad
public boolean dispatchGenericMotionEvent(MotionEvent event) {
  float x = event.getX();  // get x/y coords
  float y = event.getY();
  int action = event.getActionMasked();  // get code for action

  switch (action) {  // let us know which action code shows up
    case MotionEvent.ACTION_DOWN:
      touchEvent = "DOWN";
      fingerTouch = 1;
      break;
    case MotionEvent.ACTION_MOVE:
      touchEvent = "MOVE";
      xpos = myScreenWidth - x * touchPadScaleX;
      ypos = y * touchPadScaleY;
      break;

Demo

Sensors ▪  Ketai Library for Processing ▪  https://code.google.com/p/ketai/

▪  Support all phone sensors ▪  GPS, Compass, Light, Camera, etc

▪  Include Ketai Library ▪  import ketai.sensors.*; ▪  KetaiSensor sensor;

Using Sensors ▪  Setup in Setup( ) function

▪  sensor = new KetaiSensor(this); ▪  sensor.start(); ▪ sensor.list();

▪  Event based sensor reading

void onAccelerometerEvent(…) {
  accelerometer.set(x, y, z);
}
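Putting the sensor pieces together, a minimal sketch (assuming the Ketai library is installed in Android mode; the on-screen text display is an illustrative addition) that starts the sensor and shows accelerometer values:

import ketai.sensors.*;

KetaiSensor sensor;
PVector accelerometer = new PVector();

void setup() {
  sensor = new KetaiSensor(this);
  sensor.start();                 // begin listening for sensor events
  textSize(28);
}

void draw() {
  background(0);
  // show the latest accelerometer reading
  text("Accelerometer:\n" +
       "x: " + accelerometer.x + "\n" +
       "y: " + accelerometer.y + "\n" +
       "z: " + accelerometer.z, 20, 40);
}

// event based sensor reading: called by Ketai when new data arrives
void onAccelerometerEvent(float x, float y, float z) {
  accelerometer.set(x, y, z);
}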

Sensor Demo

Using the Camera ▪  Import camera library

▪  import ketai.camera.*; ▪  KetaiCamera cam;

▪  Setup in Setup( ) function ▪ cam = new KetaiCamera(this, 640, 480, 15);

▪  Draw camera image

void draw() {
  // draw the camera image
  image(cam, width/2, height/2);
}
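A minimal end-to-end camera sketch along the same lines (assuming the Ketai camera API; the tap-to-start handler and preview callback are illustrative additions):

import ketai.camera.*;

KetaiCamera cam;

void setup() {
  // 640 x 480 preview at 15 frames per second
  cam = new KetaiCamera(this, 640, 480, 15);
}

void draw() {
  background(0);
  if (cam.isStarted()) {
    // draw the camera image
    image(cam, width/2, height/2);
  }
}

// called by Ketai whenever a new preview frame is available
void onCameraPreviewEvent() {
  cam.read();
}

// tap the screen / touch pad to start the camera preview
void mousePressed() {
  if (!cam.isStarted()) {
    cam.start();
  }
}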

Camera Demo

Hardware Prototyping

Build Your Own Wearable

▪  MyVu display + phone + sensors

Beady-i

▪  http://www.instructables.com/id/DIY-Google-Glasses-AKA-the-Beady-i/

Raspberry Pi Glasses

▪  Modify video glasses, connect to Raspberry Pi

▪  $200 - $300 in parts, simple assembly ▪  https://learn.adafruit.com/diy-wearable-pi-near-eye-kopin-video-glasses

Physical Input Devices ▪  Can we develop unobtrusive input devices? ▪  Reduce need for speech, touch pad input ▪  Socially more acceptable

▪  Examples ▪  Ring, ▪  pendant, ▪  bracelet, ▪  gloves, etc

Prototyping Platform

Arduino Kit + Bluetooth Shield + Google Glass

Example: Glove Input

▪ Buttons on fingertips ▪ Map touches to commands

Example: Ring Input

▪ Touch strip, button, accelerometer ▪ Tap, swipe, flick actions

How it works

(Diagram: bracelet, armband, and glove prototypes map inputs 1, 2, 3, 4 to values/output)

● Light Blue Bean - punchthrough.com/bean/

▪  Low energy Bluetooth Arduino microcontroller ▪  Programmed wirelessly (Bluetooth 4.0) ▪  Runs off coin battery ▪  On-board sensors (accelerometer, temperature) ▪  Ideal for wearable sensor/input projects

● LittleBits - http://littlebits.cc/

▪  Quick and dirty prototyping ▪  Snap together electronics ▪  Dozens of input and output modules ▪  Arduino module to connect to wearable

Other Tools

Other Tools ▪  Wireframing ▪  pidoco ▪  FluidUI

▪  Rapid Development ▪  PhoneGap ▪  AppMachine

▪  Interactive ▪  App Inventor ▪  WearScript

● WearScript ▪ Combines power of Android development on Glass with the learning curve of a website ▪ Use a cloud IDE to write JavaScript that runs instantly on Glass inside a WebView ▪ Get started at http://wearscript.com

● WearScript Features ▪ Community of Developers ▪ Easy development of Glass Applications ▪ GDK card format ▪ Support for all sensor input

▪ Support for advanced features ▪ Augmented Reality ▪ Eye tracking ▪ Arduino input

● How it Works

● WearScript and Glass ▪ Create user interface “cards” ▪ can include HTML formatting

▪ Detect Gestures

● WearScript Playground

▪  Test code and run on Glass ▪ https://api.wearscript.com/

● Nested Cards

var select = function () { WS.say('select') };
var tap = function () { WS.say('tap') };
var log = function () { WS.log('log') };

var tree = new WS.Cards();
var subtree = new WS.Cards();
subtree.add('No Listeners', 's0');
subtree.add('Select', 's1', select);
subtree.add('Select + Tap', 's2', select, tap);
subtree.add('Menu', 's3', 'Tap', tap, 'Log', log);

tree.add('Subtree', '0', subtree);
WS.cardTree(tree);
WS.displayCardTree();

● Sensor Input ▪ Listen and respond to sensor data ▪ iBeacon ▪ Gyroscope ▪ GPS ▪ Accelerometer ▪ Magnetic field ▪ Orientation ▪ Light ▪ Gravity ▪ Linear acceleration ▪ Rotation vector

WS.sensorOn(WS.sensor('gps'), 5, function (data) {
  var latitude = data['values'][0];
  var longitude = data['values'][1];
  // do something with coordinates
});

● Connecting Sensors

<html style="width:100%; height:100%; overflow:hidden">
<body style="width:100%; height:100%; overflow:hidden; margin:0">
<script>
function main() {
  if (WS.scriptVersion(1)) return;
  WS.serverConnect('{{WSUrl}}', function () {
    WS.sensorOn('accelerometer', .25);
    WS.cameraOn(1);
    WS.dataLog(false, true, .15);
  });
}
window.onload = main;
</script>
</body>
</html>

● Compatible Devices ▪ Google Glass ▪ Arduino (microcontroller) ▪ Myo (armband) ▪ Pebble (watch) ▪ more...

● WearScript Links ▪ http://www.wearscript.com/en/latest/ ▪ Documentation ▪ https://api.wearscript.com ▪ cloud IDE ▪ https://github.com/kurtisnelson/wearscriptandroid ▪ all the code is open-source on Github ▪ https://plus.google.com/+BrandynWhite ▪ creator of WearScript, lots of neat videos detailing his research

Summary ▪  Prototyping for wearables is similar to mobiles

▪  Tools for UI design, storyboarding, wireframing

▪  Android tools to create interactive prototypes ▪  Processing, WearScript, etc

▪  Arduino can be used for hardware prototypes ▪  Once prototyped, native apps can be built

▪  Android + SDK for each platform

Conclusion

More Information

•  Mark Billinghurst –  Email: [email protected]

– Twitter: @marknb00

•  Website –  http://www.hitlabnz.org/

Resources

● Designing For Glass Video

▪ https://www.youtube.com/watch?v=6ERgbIJ6pCM

CHI Wearables Exhibit

Online at http://wcc.gatech.edu/exhibition

Glass Resources ▪  Main Developer Website ▪  https://developers.google.com/glass/

▪  Glass Apps Developer Site ▪  http://glass-apps.org/glass-developer

▪  Google Design Guidelines Site ▪  https://developers.google.com/glass/design/index?utm_source=tuicool

▪  Google Glass Emulator ▪  http://glass-apps.org/google-glass-emulator

● Recon Jet ▪ Developer Website ▪ http://www.reconinstruments.com/developers/getting-started/reconsdk/

▪ Application Prototyping ▪ http://www.reconinstruments.com/developers/resources/application-prototyping/

Other Resources ▪  AR for Glass Website ▪  http://www.arforglass.org/

▪  Vandrico Database of wearable devices ▪  http://vandrico.com/database

Books ▪  Programming Google Glass ▪  Eric Redmond

▪  Rapid Android Development: Build Rich, Sensor-Based Applications with Processing ▪  Daniel Sauter