Multimodal, Crossmedia, Multi Platform

Post on 23-Dec-2014



Input modalities

UXD minor theme ‘Multimodal, Crossmedia and Multi-Platform’

Theme program

March 23: ‘Input modalities’ (Hans), workshop and assignment kick-off

March 30: ‘Output modalities’ (Rolf) and assignment progress

April 6: Workshop with Pieter Jongerius (Fabrique)

April 13: No class (Easter)

April 20: Final presentations assignment

Theme in the scheme of things

Media, modalities and platforms provide us the nuts and bolts of the user experience.

The quality of the user experience is determined by our ability to utilize the media, modalities and platforms at our disposal.

Crossmedia

‘Crossmedia (also known as Cross-Media, Cross-Media Entertainment, Cross-Media Communication) is a media property, service, story or experience distributed across media platforms using a variety of media forms.’

http://en.wikipedia.org/wiki/Crossmedia

Multi-platform

‘In computing, cross-platform (also known as multi-platform) is a term used to refer to computer software or computing methods and concepts that are implemented and inter-operate on multiple computer platforms.’

http://en.wikipedia.org/wiki/Multiplatform

Multimodal

‘Multimodal interaction provides the user with multiple modes of interfacing with a system beyond the traditional keyboard and mouse input/output.’

http://en.wikipedia.org/wiki/Multimodal_interaction

Modality

‘A modality is a path of communication between the human and the computer.’

http://en.wikipedia.org/wiki/Modality_(human-computer_interaction)

Input modalities and output modalities

‘In human-computer interaction, a modality is the general class of:

a sense through which the human can receive the output of the computer (for example, vision modality)

a sensor or device through which the computer can receive the input from the human’

http://en.wikipedia.org/wiki/Modality_(human-computer_interaction)

Output modalities (computer-to-human)

‘Any human sense can be translated to a modality:

Major modalities: seeing or vision modality; hearing or audition modality

Haptic modalities: touch, tactile or tactition modality (the sense of pressure); proprioception modality (the perception of body awareness)

Other modalities: taste or gustation modality; smell or olfaction modality; thermoception modality (the sense of heat and cold); nociception modality (the perception of pain); equilibrioception modality (the perception of balance)’

http://en.wikipedia.org/wiki/Modality_(human-computer_interaction)

Input modalities (human-to-computer)

An input device is any peripheral (piece of computer hardware equipment) used to provide data and control signals to an information processing system (such as a computer).

http://en.wikipedia.org/wiki/Input_devices

Pointing devices

Ivan Sutherland (MIT) demoing Sketchpad (1962); demo footage introduced by Alan Kay in 1987

Pointing devices

‘Pointing devices are input devices used to specify a position in space.’

Direct/indirect, absolute/relative

http://en.wikipedia.org/wiki/Input_devices
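The absolute/relative distinction above can be made concrete in code. A minimal sketch (class and parameter names are illustrative, not from any real input API): an absolute device such as a touchscreen reports a screen position directly, while a relative device such as a mouse reports deltas that move a stored cursor.

```python
# Relative pointing (mouse-style): the device reports deltas, and the
# system accumulates them into a cursor position clamped to the screen.
# An absolute device (touchscreen, tablet in absolute mode) needs no
# such state: position = reported coordinate.

class RelativePointer:
    """Accumulates movement deltas into a clamped cursor position."""
    def __init__(self, width, height):
        self.width, self.height = width, height
        # Start the cursor in the middle of the screen.
        self.x, self.y = width // 2, height // 2

    def move(self, dx, dy):
        # Clamp so the cursor never leaves the screen, no matter
        # how far the physical device travels.
        self.x = min(max(self.x + dx, 0), self.width - 1)
        self.y = min(max(self.y + dy, 0), self.height - 1)
        return self.x, self.y

pointer = RelativePointer(1920, 1080)
# A huge leftward delta clamps at the screen edge.
assert pointer.move(-10000, 0) == (0, 540)
```

This clamping is also why a mouse can be lifted and repositioned without the cursor jumping, while an absolute device cannot.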

Fitts’ law

‘The time it takes to move from a starting position to a final target is determined by the distance to the target and the size of the object.’ (Saffer, 2007)
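This relationship is commonly written in the Shannon formulation, T = a + b · log2(D/W + 1), where D is the distance to the target and W its size. A small sketch (the constants a and b below are illustrative placeholders; real values come from fitting measured user data):

```python
import math

# Fitts' law (Shannon formulation): movement time grows with the
# index of difficulty ID = log2(D/W + 1).
def movement_time(distance: float, width: float,
                  a: float = 0.1, b: float = 0.15) -> float:
    """Predicted time (seconds) to acquire a target of size `width`
    at `distance`, given empirically fitted constants a and b."""
    index_of_difficulty = math.log2(distance / width + 1)
    return a + b * index_of_difficulty

# A small, far-away target takes longer to hit than a large, nearby one.
far_small = movement_time(distance=800, width=10)
near_large = movement_time(distance=100, width=50)
assert far_small > near_large
```

This is why screen edges and corners make excellent targets: the cursor stops there, which effectively gives them infinite size along one axis.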

Pointing devices

And you can point at more than merely pixels on a screen…

Alphanumeric input: keyboards

Alphanumeric input: keyboards

Alphanumeric input: keyboards

Alphanumeric input: speech recognition

Speaker-dependent / speaker-independent

Discrete-word / connected-word input

Limited / large vocabulary

Alphanumeric input: handwriting recognition

‘Recognition’ patents as early as 1914

‘Electronic ink’ and recognition in Vista

http://www.freepatentsonline.com/1117184.pdf

Pen Computing

‘The return of the pen’

Switching modes: ‘pointing’ vs. ‘ink’

Tap is the New Click

"One of the things our grandchildren will find quaintest about us is that we distinguish the digital from the real."

William Gibson (from: Saffer, 2009)

Ubiquitous computing

‘Ubiquitous computing (ubicomp) is a post-desktop model of human-computer interaction in which information processing has been thoroughly integrated into everyday objects and activities.’

http://en.wikipedia.org/wiki/Ubiquitous_computing

Wearable computing

‘Wearable computers are computers that are worn on the body.’

http://en.wikipedia.org/wiki/Wearable_computer

Tangible user interfaces

Hiroshi Ishii (MIT)

Sketching Mobile Experiences

Workshop in ‘Design This!’

Gestural Interfaces

Touchscreen vs. Free-form

Ergonomics of Interactive Gestures

"Hands are underrated. Eyes are in charge, mind gets all the study, and heads do all the talking. Hands type letters, push mice around, and grip steering wheels, so they are not idle, just underemployed."

—Malcolm McCullough, Abstracting Craft (from: Saffer, 2009)

Ergonomics of Interactive Gestures

Limitations due to anatomy, physiology and mechanics of the human body (kinesiology)

Left-handedness (7-10%)

Fingernails

Screen coverage

Designing Touch Targets

No smaller than 1×1 cm in an ideal world

In a not-so-ideal world: Iceberg Tips and Adaptive Targets
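The Iceberg Tip pattern can be sketched in a few lines: the touchable hit area is larger than the visible control, so fingers that land slightly off-target still register. Names and the slop margin below are illustrative assumptions, not a specific toolkit's API:

```python
# "Iceberg tip": the invisible hit area extends beyond the visible
# target by a 'slop' margin on every side. Units are pixels; the
# 1x1 cm guideline is roughly 38x38 px at 96 dpi (an assumption
# for this sketch).

def in_hit_area(touch_x, touch_y, target_x, target_y,
                visible_size, slop=12):
    """True if the touch lands within the visible square target
    expanded by an invisible 'slop' margin on every side."""
    half = visible_size / 2 + slop
    return (abs(touch_x - target_x) <= half and
            abs(touch_y - target_y) <= half)

# A touch 5 px outside a 20 px button still counts as a hit...
assert in_hit_area(115, 100, 100, 100, visible_size=20)
# ...but a touch far away does not.
assert not in_hit_area(150, 100, 100, 100, visible_size=20)
```

Adaptive Targets take this further by growing or shrinking the hit area based on what the user is likely to tap next.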

Designing Touch Targets

But even spaciously sized targets can be tricky…

Patterns for Touchscreens and Interactive Surfaces

Tap to open/activate

Patterns for Touchscreens and Interactive Surfaces

Tap to select

Patterns for Touchscreens and Interactive Surfaces

Drag to move object

Patterns for Touchscreens and Interactive Surfaces

Slide to scroll

Patterns for Touchscreens and Interactive Surfaces

Spin to scroll

Patterns for Touchscreens and Interactive Surfaces

Pinch to shrink and spread to enlarge
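The pinch/spread pattern reduces to a ratio of distances between the two touch points. A minimal sketch (function names are illustrative; real touch frameworks deliver these points through their own event objects):

```python
import math

# Pinch/spread: the scale factor applied to an object is the ratio
# of the current distance between two touch points to the distance
# when the gesture began.

def distance(p1, p2):
    """Euclidean distance between two (x, y) touch points."""
    return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

def pinch_scale(start_touches, current_touches):
    """scale > 1 means spread (enlarge), scale < 1 means pinch (shrink)."""
    return distance(*current_touches) / distance(*start_touches)

# Fingers moving apart double the distance: spread to enlarge.
assert pinch_scale(((0, 0), (100, 0)), ((0, 0), (200, 0))) == 2.0
# Fingers moving together halve it: pinch to shrink.
assert pinch_scale(((0, 0), (100, 0)), ((0, 0), (50, 0))) == 0.5
```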

Patterns for Free-Form Interactive Gestures

Proximity activates/deactivates

Patterns for Free-Form Interactive Gestures

Point to select/activate

Patterns for Free-Form Interactive Gestures

Rotate to change state

Patterns for Free-Form Interactive Gestures

Shake to change

Patterns for Free-Form Interactive Gestures

Tilt to move
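A tilt-to-move mapping can be sketched by turning accelerometer readings into an on-screen velocity. The sensitivity and dead-zone values below are illustrative assumptions; real devices report gravity (about 9.81 m/s² when at rest) split across their axes as they tilt:

```python
# "Tilt to move": map a device's accelerometer x/y readings into a
# 2D velocity. A dead zone keeps the object still when the device
# is held roughly flat, so sensor noise does not cause drift.

def tilt_to_velocity(ax, ay, sensitivity=10.0, dead_zone=0.5):
    """Convert accelerometer x/y readings (m/s^2) into a 2D velocity,
    ignoring readings smaller than the dead zone."""
    vx = sensitivity * ax if abs(ax) > dead_zone else 0.0
    vy = sensitivity * ay if abs(ay) > dead_zone else 0.0
    return vx, vy

# Held flat: small readings fall in the dead zone, so no movement.
assert tilt_to_velocity(0.1, -0.2) == (0.0, 0.0)
# Tilted along x: the object moves along x only.
vx, vy = tilt_to_velocity(2.0, 0.0)
assert vx > 0 and vy == 0.0
```

The dead zone is the ergonomic point: without it, no human can hold a device still enough to keep the object stationary.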

Reader

Wearable computers: Steve Mann. Eyetap.org. http://about.eyetap.org/

Ubiquitous computing:

Mark Weiser (1991). The Computer for the 21st Century. http://www.ubiq.com/hypertext/weiser/SciAmDraft3.html

Adam Greenfield (2006). Everyware: The Dawning Age of Ubiquitous Computing. New Riders, Berkeley, CA.

Donald Norman (1998). The Invisible Computer: Why Good Products Can Fail, the Personal Computer Is So Complex, and Information Appliances Are the Solution. The MIT Press, Cambridge, MA.

Reader

Input devices:

Doug Engelbart (1968). The Mother of All Demos. Google video stream; Wikipedia: http://en.wikipedia.org/wiki/The_Mother_of_All_Demos

Reader

Fitts’ Law:

Dan Saffer (2007). Designing for Interaction: Creating Smart Applications and Clever Devices. New Riders, Berkeley, CA. (page 53)

Speech recognition:

Microsoft. Microsoft Speech Technologies. http://www.microsoft.com/speech/speech2007/default.mspx

Reader

Handwriting recognition:

Wacom. Unleash Windows Vista With A Pen. http://www.wacom.com/vista/index.php

Gestural interfaces:

Dan Saffer (2009). Designing Gestural Interfaces. O’Reilly Media, Sebastopol, CA.

Ergonomics:

Henry Dreyfuss (1955). Designing for People. Allworth Press, New York, NY.

Theme assignment

Today’s workshop assignment

Work together in teams of 2-3 students on one input device. Each team will investigate the following:

What is the typical application of this device?

What are typical patterns applied with this device?

How can this device connect to a computer?

What driver or other software is available for this device?

How can I adjust the parameters of this device?

How can I create application prototypes with this device?

Build a simple demonstrator for the device, using your laptop computer

Analyze the user experience with your demonstrator

Present your demonstrator at the end of the afternoon

Document your findings in a PDF document

Link the document to a post on your blog

Today’s workshop assignment

Available devices:

Touch screen (2)

Wii Remote (4)

Xbox USB controller (2)

Wacom (3)

Web cam (5)

SpaceNavigator (1)

Presenter (3)

Smartboard (1)

iPhone (?)