Page 1: Tenth Conference on Human-Computer Interaction Speech ... people.cs.umu.se › dipak › SpainKeynote.pdf

Computer support for human activities “in the wild”

Dipak Surie, Cognitive computing group, Umeå University, Sweden

http://www.cs.umu.se/~dipak

Tenth Conference on Human-Computer Interaction, December 13-15, 2005, Oviedo (Spain)

Speech focuses on ...

• Ubiquitous and wearable computing
  a. What?
  b. Why?
  c. How?

• Interactions through context
  a. Context-aware (CA) computing?
  b. Issues?
  c. Implicit interaction?
  d. Support for human activities in CA computing?

• Ego-centric interaction
  a. Perceptual & cognitive space?
  b. Support personal human activities

• Approach of prototyping
  a. PHYVIR project
  b. Need for an alternative development methodology

• EasyADL project
  a. Virtual reality design studio
  b. Ethnographical approach
  c. Personal everyday activity recognition (PEAR) toolkit

“Prosthesis” – augment the capabilities of the human and overcome his limitations

Support human activities

Locomotion

Calculation

Communication

Computer support for human activities

• Era of desktop (Personal) computing

• Mainly restricted to office environments

• Limited support for everyday activities

• Interactions are explicit, limited, uncomfortable,...

• GUI, mouse & keyboard

Computer support for human activities

Motivational quotes!

“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it” (Mark Weiser)

“Design the tool to fit the task so well that the tool becomes part of the task, feeling like a natural extension of the work, a natural extension of the person” (Don Norman)

“Technology is for assisting everyday life and not just overwhelming it” (Abowd & Mynatt)

Page 2

Beginning of a new era?

• Scenario: a house without electric lamps?

• What about computers?
  a. We are surrounded by computers

b. Computers in everyday objects

c. We carry them and even wear them

d. Many computers per person

e. Universal information access

  f. Networking of devices

• Number of transistors on a chip doubles about every two years

• More performance for less cost, smaller size and more energy efficient
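The "doubles about every two years" rule of thumb can be made concrete with a toy projection. A minimal sketch (my own illustration; the numbers are made up):

```python
def transistors(start_count: int, start_year: int, year: int) -> int:
    """Project a transistor count assuming a doubling every two years."""
    doublings = (year - start_year) // 2
    return start_count * 2 ** doublings

# With made-up round numbers: 1 million transistors in 1990
# becomes 32 million a decade later (5 doublings).
print(transistors(1_000_000, 1990, 2000))  # 32000000
```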

Other technological advancements

• Storage – larger capacity, faster, ...

• Networking – global, local, ad-hoc, high bandwidth, low latencies,...

• Sensing – types, accuracy, cost, robustness, ...

• Actuation – many computer controlled applications

TecO's Particle computer

Tiny 1 cubic cm particle includes sensors, battery, CPU and communication

Sensing technologies

• Light & vision

• Audio

• Movement & acceleration

• Location and position

• Magnetic field & orientation

• Proximity, touch and user interaction

• Temperature, humidity & air pressure

• Weight

• Motion detection

• Gas-sensors & electronic noses

• Bio-sensors

• Zero-power sensors

GPS technology for outdoor localization

RFID technology for object identification

Beyond a quantitative increase in numbers!

Page 3

Definitions

“Pervasive computers are numerous, casually accessible, often invisible computing devices, frequently mobile or embedded in the environment and are connected to an increasingly ubiquitous network infrastructure composed of a wired core and wireless edges”

(Conference on Pervasive Computing, 2001)

“Wearable Computing pursues an interface ideal of a continuously worn, intelligent assistant that augments memory, intellect, creativity, communication and physical senses and abilities.” (Thad Starner)

Wearable interfaces ...

Wearable head-mounted display and video camera lens

Steve Mann, University of Toronto

Sousveillance – recording of an activity from the user's perspective

Wearable interfaces ...
Thad Starner, Georgia Institute of Technology

Gesture Pendant

Some issues in ubicomp...

• Uneven conditioning: computing resources available at various geographical locations are uneven

• Localized scalability: the density of interactions has to fall off as one moves away, or the user is overwhelmed by distant interactions that are of little relevance

• Privacy & security

Ubicomp vs. Wearcomp

Feature                  Ubicomp             Wearcomp
View                     Third perspective   First perspective
Technological issues     ?                   ? (battery, ...)
Resource management
Localized control
Localized information
Personalization
Privacy

You can interact with anything! How?
Hiroshi Ishii, MIT

• Interaction through everyday architectural space - walls, doors, ceilings,...

• Interaction through everyday manipulable objects - coffee cup, scribbling pad, ...

• Interaction through ambient background media

Tangible user interface

Page 4

How humans interact with the environment ?

• Interaction through multiple modalities
  - Vision
  - Audition
  - Tactility
  - Olfaction
  - Taste
  - Locomotion
  - Sensorimotor (movement of body and body parts)


• What about human cognition?
  - Memory
  - Knowledge
  - Belief

What is context?

“Context is any information that can be used to characterize the situation of an entity. An entity is a person, place, or object that is considered relevant to the interaction between a user and an application, including the user and application themselves” (Dey)

• Three categories of context (Schilit)
  1. Device context
  2. User context
  3. Physical context

“A system is context-aware if it uses context to provide relevant information and/or services to the user, where relevancy depends on the user's task” (Dey)

Context is beyond location awareness

(Schmidt)

Types of context include...

• Spatial – location, orientation, ...

• Temporal – date, time, week-end,...

• Identity
• Social – people nearby, activity, ...

• Environmental – temperature, pressure,...

• Resources – nearby, availability,...

• Cognitive
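These categories can be gathered into a single record that a context-aware application consults. A hedged sketch — the field names are my own, not from the talk:

```python
from dataclasses import dataclass, field

@dataclass
class Context:
    """Illustrative context record; one field per category listed above."""
    location: str                   # spatial
    timestamp: str                  # temporal
    user_id: str                    # identity
    people_nearby: list = field(default_factory=list)      # social
    temperature_c: float = 20.0     # environmental
    available_devices: list = field(default_factory=list)  # resources
    current_goal: str = ""          # cognitive

ctx = Context(location="kitchen", timestamp="2005-12-13T09:30", user_id="dipak")
print(ctx.location)  # kitchen
```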

Frame of reference?

Cyberguide

• GPS or infrared technology for location tracking

• Display location on screen
• Predefined points of interest?
• Travel journal

– Keep log of places seen and photographs taken

• Context: location, time

Abowd, Georgia Institute of Technology
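Cyberguide's travel-journal idea — a log of places seen, keyed on (location, time) context — can be sketched in a few lines (my own illustration, not the actual system):

```python
# Hypothetical travel journal in the spirit of Cyberguide.
journal = []

def log_visit(place, time, photo=None):
    """Record a visited place with its time context and an optional photo."""
    journal.append({"place": place, "time": time, "photo": photo})

log_visit("museum entrance", "10:15")
log_visit("exhibit hall", "10:40", photo="hall.jpg")
print(len(journal))  # 2
```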

Project Oxygen
Human-centred computing, MIT

• Human perception

• Restricted to vision & speech

Page 5

Mediacup

• Coffee cup with sensors, processing, and communication embedded in the base

• Cup on the table

• Temperature context (hot, warm, cold)

• Everyday objects as artifacts

Interactions through context

“Implicit Human-Computer Interaction is the interaction of a human with the environment and with artefacts which is aimed to accomplish a goal. Within this process the system acquires implicit inputs from the user and may present implicit output to the user”

“Implicit input are actions and behaviors of humans, which are done to achieve a goal and are not primarily regarded as interaction with a computer, but captured, recognized and interpreted by a computer system as input”

(Albrecht Schmidt)

“Implicit output is output of a computer that is not directly related to an explicit input and which is seamlessly integrated with the environment and the task of the user”

(Albrecht Schmidt)
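Schmidt's definitions can be illustrated with a toy loop: an everyday action is captured as implicit input, and the response is woven into the environment rather than shown in a dialog. All rules and names below are my own assumptions, not Schmidt's implementation:

```python
def sense_action(event: dict) -> str:
    """Interpret a captured everyday action as implicit input."""
    if event.get("object") == "cup" and event.get("temperature_c", 0) > 60:
        return "drinking_hot_coffee"
    return "unknown"

def implicit_output(state: str) -> str:
    """Pick an environment-integrated response (no explicit dialog)."""
    return "mute_office_phone" if state == "drinking_hot_coffee" else "none"

state = sense_action({"object": "cup", "temperature_c": 75})
print(implicit_output(state))  # mute_office_phone
```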

[Figure: the four spaces around the user, linked by locomotion]

1 – Manipulable space: space around the body, experienced through touch & sensorimotor senses (movement of body and body parts).
2 – Observable space: conceptual space experienced through vision and audio; no locomotion required.
3 – Environmental space: bigger than observable space and experienced through locomotion.
4 – Geographical space: large space that cannot be completely experienced through locomotion.

Ego-centric Interaction
• User as the frame of reference
• Perceptual space – manipulable, observable, environmental & geographical space

• Cognitive space – memory, knowledge, belief

• Implicit interaction through user activities in the physical world

• Components of Ego-centric Interaction
  - Capturing personal user activities
  - Modeling personal user activities
  - Providing user interfaces

• 24 hours a day, 7 days a week
• Real-time support while performing activities

Prototyping approach?

• Interaction is an experience

• Learn & understand the interaction

• Evaluation

• Design & re-design

Page 6

• Synchronizing the physical & virtual worlds
• Humans switch between the physical and virtual worlds frequently in performing modern everyday activities
• Develop artefacts that exist in both the physical world & the virtual world at the same time
• Physical world as an interface!

“Re-typing information into a virtual environment based on notes taken in the physical world”

Wearable object manipulation tracker

• Synchronizing object manipulations in the physical world and virtual world

• RFID technology & ultra-sound technology

• Localization

• Intra-family communication

• Integrating physical notes with virtual notes

Physical-virtual bulletin board

• Time-consuming to build

• Technological limitations (available sensors, battery capacities, processing power, storage issues, ...)

• Difficult to follow a design & re-design methodology

• Many researchers (in context-aware computing) are working on context sensing...

Hardware prototyping...

Simulating ubiquitous computing environment in Virtual Reality

• Accelerate (design & re-design) prototype development process

• Build prototypes ahead of technology

• “Virtual Reality vs. Actual Reality”
• VR design studio for simulating sensors and exploring ubicomp user interfaces
• Scenarios: home environment; “fika” (Swedish coffee break)

A coin has 2 sides

EasyADL – Independent life despite dementia

“Dementia – mental disorder caused by brain malfunction and resulting in a decline in the intellectual faculties and an abnormal perception of reality”

ADL – Activities of daily living

Problems
• Memory loss
• Breakdown of thought process
• Orientation disorders
• Lost sense of routine sequences and time
• Frustration in performing ADL
• Caregivers are under enormous stress

Page 7

EasyADL – Independent life despite dementia • Ubiquitous and wearable computing technology in real-world environments

• Intelligent assistant that is proactively operating in the background

• Wait until the user needs assistance (allow user to train their own cognitive abilities)

• Step-by-step help to do everyday activities

• Ensure safety:
  - fall down
  - forget to lock the door
  - forget to turn off the stove

• Humane & economic perspective

EasyADL – Independent life despite dementia

Cognitive prosthesis

“Cognitive prosthesis involves the study of human cognition, studying the human being as a system. Based on this knowledge, the focus of this activity is to augment the capabilities of the human and overcome his limitations”

• Not to replicate a human being through robotics, but to augment his capabilities

• Example: eyeglasses, which augment the eyes but do not replace them

Capture information:
- describing the current state of the human actor
- describing the current state of the surrounding environment

Identify and store successfully performed action sequences as “activities” for later use

Compare previous activities with current ones for determining potential “slips”

If necessary, guide the human actor into a working sequence of actions

Cognitive prosthesis interaction cycle
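The four steps of the cycle — capture, store successful sequences, compare for slips, guide — can be sketched as follows (a minimal illustration with hypothetical activities, not the EasyADL implementation):

```python
# Known activities: successfully performed action sequences stored earlier.
known_activities = {"make_tea": ["fill_kettle", "boil_water", "add_teabag", "pour"]}

def detect_slip(activity, performed):
    """Compare the current action sequence with the stored one;
    return the next expected action when the user deviates, else None."""
    expected = known_activities[activity]
    for i, action in enumerate(performed):
        if i >= len(expected) or action != expected[i]:
            return expected[min(i, len(expected) - 1)]
    return None

# The user filled the kettle, then reached for a teabag too early:
print(detect_slip("make_tea", ["fill_kettle", "add_teabag"]))  # boil_water
```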

Terminology for Activities

• Activity theory & cognitive science literature
• Objects
• Agents
• Activity – “is a goal-directed sequence of state-changes within a set of objects, initiated by one or more agent(s) and directly or indirectly controlled and monitored by this/these agent(s) during the full lifetime of the activity”
• Action – “is a clearly goal-directed activity carried out by one or more agent(s) in order to improve the status of an overall, higher-level activity”
• Operation – “is a functional sub-unit of an action”
• Personal everyday activities
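This terminology maps naturally onto a small data structure: an activity is a sequence of actions, and each action decomposes into operations. A sketch with hypothetical names (the "fika" example is my own):

```python
from dataclasses import dataclass, field

@dataclass
class Operation:
    name: str              # functional sub-unit of an action

@dataclass
class Action:
    goal: str              # clearly goal-directed, serves a higher-level activity
    operations: list = field(default_factory=list)

@dataclass
class Activity:
    goal: str              # goal-directed sequence of state-changes
    agents: list = field(default_factory=list)
    actions: list = field(default_factory=list)

fika = Activity(goal="have coffee", agents=["user"],
                actions=[Action(goal="brew coffee",
                                operations=[Operation("grind beans"),
                                            Operation("pour water")])])
print(fika.actions[0].operations[0].name)  # grind beans
```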

[Figure: PEAR toolkit architecture – layered processing pipeline]

• Sensing Layer (physical & logical sensors): s1, s2, ... are sensors used to capture user activities in a VR environment.

• Simple Feature Extraction Layer (server side): f1, f2, ... are features extracted from sensor readings; for example, standing and sitting are features extracted from the bend angle between the knee joints and hip joints. Features are passed over client-server communication.

• Feature Extraction Layer (client side): ff1, ff2, ... are features needed by the clustering layer to distinguish clusters.

• Clustering Layer (clusters c11, c23, c35, ...): several competitive clustering algorithms are used in parallel to address the curse of dimensionality.

• Classification Layer (e.g., cluster cc4).

• Action Labelling: explicit user-system interaction; the user gives user-defined labels to the action clusters classified by the PEAR toolkit.

• Activity Modeling Layer (State 1 → State 2 → State 3 → State 4): each classified & labelled cluster is considered a state (action). The PEAR toolkit recognizes activities by treating them as sequences of actions; the activity modeling layer keeps track of state changes to recognize activities.
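The layered pipeline above can be caricatured end to end: sensor readings become features, features become actions (here, simple rules stand in for the clustering, classification, and labelling layers), and an action sequence is matched against an activity model. Every rule, label, and activity here is my own illustration, not the PEAR toolkit itself:

```python
def extract_features(reading: dict) -> dict:
    """Feature extraction, e.g. posture from a knee-hip bend angle."""
    posture = "sitting" if reading["knee_hip_angle"] < 120 else "standing"
    return {"posture": posture, "holding": reading.get("object_in_hand")}

def classify_action(features: dict) -> str:
    """Rule-based stand-in for the clustering/classification/labelling layers."""
    if features["posture"] == "standing" and features["holding"] == "kettle":
        return "fill_kettle"
    if features["posture"] == "sitting" and features["holding"] == "cup":
        return "drink"
    return "idle"

# Activity modeling: an activity is a sequence of actions (states).
ACTIVITIES = {"make_tea": ["fill_kettle", "drink"]}

def recognize_activity(actions: list) -> str:
    """Match the observed sequence of actions against known activities."""
    for name, steps in ACTIVITIES.items():
        if [a for a in actions if a in steps] == steps:
            return name
    return "unknown"

stream = [{"knee_hip_angle": 170, "object_in_hand": "kettle"},
          {"knee_hip_angle": 90, "object_in_hand": "cup"}]
actions = [classify_action(extract_features(r)) for r in stream]
print(recognize_activity(actions))  # make_tea
```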

Page 8

• HCI is becoming interdisciplinary
• Ethnographical studies
  - How do the subjects perform personal everyday activities?
  - How are objects manipulated while performing activities?
  - First-person-view & third-person-view observations
  - Interviews with the subject and the caregiver

• Wearable audio & video recorder (First person view)

• Camera in the environment (Third person view)

• Simulating data for activity recognition algorithms

Ethnographical approach...

Final words...

• Computer support for human activities is beyond office environments

• HCI is beyond GUI
• It is an experience, experience it!

Questions?

[email protected]

Thank U!