NETMAN: The design of a collaborative wearable computer system



Mobile Networks and Applications 4 (1999) 49–58


Gerd Kortuem, Martin Bauer and Zary Segall

Department of Computer and Information Science, University of Oregon, Eugene, OR 97403, USA

This paper presents a wearable groupware system designed to enhance the communication and cooperation of highly mobile network technicians. It provides technicians in the field with the capabilities for real-time audio-conferencing, transmission of video images back to the office, and context-sensitive access to a shared notebook. An infrared location-tracking device allows for the automatic retrieval of notebook entries depending on the user’s current location.

1. Introduction

There is an obvious need for effective communication and collaboration in typical ‘wearable’ domains such as maintenance, repair, construction and manufacturing. Yet, most wearable computers are designed as stand-alone systems that provide users with automatic, context-sensitive access to information, but do not support interpersonal communication and collaboration (for example, [4,15,17,19,21]). Only a few collaborative wearable systems have been described in the literature so far [1,5,10,20].

In this paper we report on the design and implementation of NETMAN, a collaborative wearable system that supports computer technicians in their task of maintaining a campus-wide computer network. The main goal of the NETMAN system is to enhance the collaboration between field and office-based technicians. While the technicians in the field identify problems and perform actual on-site repairs, office-based technicians are presumed to be experts who support their colleagues in the field with advice. The entire NETMAN system consists of several components as shown in figure 1. The main component is a wearable computer that is worn by a technician in the pouch of a specially designed vest. It is a Personal Computer-like device with a head-mounted display (HMD) that integrates a video camera, microphone and speaker (figure 2).

The NETMAN wearable system enables field technicians to communicate and cooperate with office-based experts in the following ways:

• First, it allows technicians and experts to engage in real-time audio conversations.

• Second, by transmitting the image of the wearable video camera over a wireless network, the expert is able to see what the technician sees and direct his or her attention using a remote pointer, or telepointer (figure 3).

• Third, a shared notebook allows technicians to document their work in a way that makes this information available to fellow workers. An infrared-based location sensor makes it possible to retrieve notebook entries depending on the user’s current location.

Figure 1. NETMAN scenario.

In the remainder of this paper, we will describe the design and architecture of the NETMAN system. First, we introduce the application scenario that underlies the design of NETMAN. In section 3, we describe the overall system design, while in section 4, we detail the NETMAN software infrastructure. Section 5 discusses synchronous and asynchronous forms of collaboration in NETMAN. Section 6 reports our experiences with the NETMAN prototype and discusses issues of building collaborative wearable systems. Finally, in section 7 we summarize and discuss future work. Further information on the NETMAN system can be found in [5,15].

2. Application scenario

In this section we will introduce the application scenario that underlies the development of NETMAN, and we will describe how the introduction of wearable computer technology can enhance the collaboration of mobile network technicians.

Baltzer Science Publishers BV

50 G. Kortuem et al. / NETMAN: The design of a collaborative wearable computer system

Figure 2. The Oregon Wearable Computer.

Figure 3. Screen sharing and remote pointing in NETMAN. The field technician wears a wearable computer with head-mounted display and camera. The expert on a desktop computer uses a mouse to move a remote pointer on the wearable display.

2.1. The task: Network maintenance

Our goal for the NETMAN project is to design and develop a wearable system that helps technicians in their daily task of troubleshooting and repairing faults in computer network equipment. For collecting requirements we are working closely with the University of Oregon Computing Center, which is responsible for maintaining the computer and network installations throughout the campus. Typical tasks of technicians include installation of new network equipment such as routers, performing regularly scheduled maintenance work, troubleshooting of network faults, and repair and replacement of faulty equipment. Table 1 summarizes our application domain.

The technicians who are sent out to locate and resolve network problems are equipped with an array of communication devices such as a cellular phone, a walkie-talkie, a pager, and – in some cases – a notebook computer. Skills and experience of field technicians vary and can range from inexperienced student volunteers to highly trained experts.

Table 1. Domain summary.

Domain:           Maintenance of computer and network equipment on a large scale

Tasks:            Fault detection, equipment troubleshooting
                  Performing regularly scheduled maintenance tasks
                  Inspection and corrective maintenance

User population:  Advanced computer skills
                  Medium-level domain knowledge

Functionality:    Collaboration with remote experts using audio
                  Guidance for diagnosis and repair through shared video and remote pointing
                  Creation of group memory through shared notebook
                  Seamless access to repair histories through location-dependent access to shared notebook
                  Fast and easy access to email and online handbooks

In most cases, however, the technical knowledge of technicians is limited, but sufficient to perform routine repairs.

In addition to field technicians, the Computing Center employs a limited number of full-time employees who are knowledgeable experts in their domain. For the most part, they work in their office and do not perform routine repairs. The Computing Center also has an extensive library of online manuals and documentation, which is available electronically. This collection represents an invaluable resource for field technicians and is used extensively.

Field technicians often have to visit a particular site several times before they are able to resolve a problem, because they need to look up information, get additional equipment, or ask a more experienced technician for advice. Depending on the expertise of the technician, determining the cause of a fault and fixing it can require extensive communication between field technicians and experts back at the office. For example, field technicians might call into the office to ask for general advice (“How do I. . . ?”) or for particular pieces of information (“What is the IP address of the domain server?”). Similarly, technicians are sometimes called by the Computing Center, which asks for status reports or reassigns them to different tasks.

2.2. Observations

Looking at the current practice of network maintenance, we made the following observations:

Inefficiencies: Since the experts at the office are not always available to answer questions immediately, field technicians often have to interrupt their current mission until they have had an opportunity to talk with an expert in person or through email. Thus, a large part of the workday of a technician is spent going back and forth between work site and office.

Voice-only conversations are not always sufficient: While field technicians can call the office at any time and ask for advice, a phone call often is not sufficient to solve a particular problem. Phone conversations can be helpful for asking for specific bits of information, but are not effective in situations where an expert has to provide a technician with step-by-step directions. Verbal descriptions often do not convey enough information, and some conversations could be more effective if the expert at the office were able to see what the technician sees.

Lost knowledge: When troubleshooting a particular piece of equipment, it can be extremely helpful for technicians to know its repair history, that is, who was working on it in the past, what the problem was then, and when exactly the last repair or maintenance was performed. Without this knowledge field technicians often don’t know why equipment was set up in a certain way, or whom they could ask to find out. In the current practice there is no systematic provision for capturing this kind of knowledge. Individual technicians tend to remember what was done in the past, but over time this form of knowledge gets lost unless it is preserved in some way. The current situation makes it especially difficult for less experienced or newly hired technicians.

Too much equipment: Technicians carry many different communication and computing devices. This is not only a problem in terms of weight, but can also make it difficult to determine the most efficient way to communicate. Furthermore, as part of their work, technicians often have to perform manual activities like opening computers, moving furniture and equipment, dragging wires, and crawling under desks. These are activities which require the full use of both hands and which suggest a wearable computer design as opposed to a computer that has to be carried by hand.

The NETMAN system is designed to remedy the described shortcomings of the current practice. In the following section we will give a general outline of the NETMAN system. Afterwards, we will describe the collaboration functions in detail.

3. The NETMAN prototype

The NETMAN system is a distributed groupware system that consists of several hardware and software components as shown in figure 1:

• A wearable computer worn by field technicians during repair and maintenance tasks.

• Several desk-bound workstations used by expert technicians at the Computing Center.

• Application software running on both the wearable computer and the workstations.

• A central database server.

All computers, wearable as well as workstations, have direct access to the Internet. In the case of the wearable computer this is realized through a campus-wide wireless network. This network, which is provided by Metricom, covers the entire University of Oregon campus and provides a steady 28800 baud Internet connection.

3.1. The Oregon Wearable Computer

Our research group has designed and constructed a wearable computer, which we call the Oregon Wearable Computer (figure 2). The computer is housed in a specially designed vest that accommodates the various batteries and input devices. The central processing unit is fitted into a pouch on the back of the vest, and cables are run from the CPU out to the front pockets. These cables feed the batteries and input devices positioned in the front of the vest. The weight of the batteries and accessories counters the weight of the CPU pouch on the back, providing a comfortable fit.

The display unit is a monocular heads-up display with see-through capability. It is constructed from a pair of Virtual-IO glasses with a resolution of 180000 pixels. We chose to modify the Virtual-IO glasses by removing one of the eyepieces. This was done to provide an unobstructed and more natural view for enhanced reality. As keyboard we use the TwiddlerTM, a commercial chorded keyboard with 18 keys [13] (figure 4). Other device options of the Oregon Wearable Computer include (figure 5):

Figure 4. The TwiddlerTM, a commercial chorded keyboard for one-handed operation.

Figure 5. Wearable computer peripherals.

Figure 6. Wearable computer CPU.

• A speaker-phone attached to the head-mounted display.

• A glide point that is worn on a belt for controlling the cursor.

• A video camera that supports continuous and still video.

• Sensors for determining the current location and identities of nearby objects:

a) An infrared receiver mounted on top of the display allows the computer to determine its own location by picking up signals sent out by specially designed infrared transceivers, which are installed along the ceiling of buildings. The location tracking subsystem is explained in more detail later on.

b) An iButton reader from Dallas Semiconductors [11] is used for scanning electronic equipment tags. This allows technicians to access information from an online database without having to key in the identification number.

• A wireless modem to connect to the campus-wide wireless network.

The design philosophy followed in building the Oregon Wearable Computer was centered around COTS, or commercial-off-the-shelf, hardware. The COTS approach allowed us to purchase the parts of our wearable computer through traditional consumer outlets. The computer itself is based on a Texas Instruments Pentium 75 motherboard and runs Windows 95 (figure 6).

Our design is based on the following two principles:

Simplicity – our aim was to develop a simple system that would provide efficient support for basic functionality like data access and communication.

Robustness – we realized that the system could not be successful unless it is robust enough to sustain a 10-hour workday without a major breakdown. This again called for simplicity of the employed solutions.

4. Software environment

The NETMAN system is based on a distributed software infrastructure as shown in figure 7. The wearable computer and the stationary workstation of the remote expert function as peers in a peer-to-peer system, communicating over a wireless link. Both computers run identical software that consists of three components:

• The Application Manager.

• One or more Application Modules.

• The Session Manager.

In the following we will discuss the function of each component.

4.1. Application Manager

Because of the characteristics and limitations of the input devices of the wearable computer, we have abandoned some features typical of current GUI interfaces, most notably the desktop metaphor and the concept of movable and resizable windows. Both concepts have proven very successful for desktop computers, but seem inappropriate for wearable computers with limited screen space and restricted input device options.

The Application Manager is a simple GUI application that replaces the default desktop. It provides a streamlined user interface for switching between application modules. Using the Previous and Next buttons visible in figures 8, 9 and 10, the user can switch from application to application by pressing a single button. While several applications may be running at the same time, only one application is visible in the main window of the Application Manager at any given moment. By avoiding overlapping windows, the Application Manager simplifies the user interface and reduces the number of interaction steps.
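The one-button switching behavior is simple enough to sketch. The following is an illustrative model only, not the NETMAN implementation; the class and method names are our own assumptions.

```python
# Minimal sketch of the Application Manager's one-button switching:
# several modules stay loaded, but exactly one is visible at a time.
# All names here are illustrative, not taken from the NETMAN source.

class ApplicationManager:
    def __init__(self, modules):
        self.modules = list(modules)   # all running application modules
        self.current = 0               # index of the visible module

    @property
    def visible(self):
        return self.modules[self.current]

    def next(self):
        # The 'Next' button cycles forward with a single press.
        self.current = (self.current + 1) % len(self.modules)
        return self.visible

    def previous(self):
        # The 'Previous' button cycles backward.
        self.current = (self.current - 1) % len(self.modules)
        return self.visible


manager = ApplicationManager(["Camera Viewer", "Web Browser", "Emailer"])
```

Because only the visible module is rendered, no window management is needed; switching reduces to a single index update, which is what keeps the interaction count low.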

4.2. Application Modules

Each application module is an independent software entity that plugs into the Application Manager, displaying its user interface in the Application Manager’s window. Among others, the NETMAN system comprises the following application modules:

Camera Viewer: The Camera Viewer module functions as the viewfinder for the video camera of the wearable computer (figure 8). It simply displays the current video image and allows the user to control which scene is captured by the camera by moving the head.

Figure 7. Software architecture for application sharing.

Figure 8. Camera viewer: (a) local, (b) remote.

Figure 9. Wearable notebook.

Web Browser: This module allows technicians to access online manuals, help files, configuration files, etc., which are stored on a central LAN server. The Computing Center maintains an extensive collection of online documents, which have been created over time by technicians, capturing hands-on experience and describing routine procedures. This collection is an invaluable source of information for technicians when trying to fix a problem. The web browser provides a uniform interface to this collection of documents. Furthermore, it can be used to look up product specifications and other information published by manufacturers of network equipment.

Figure 10. User Manager.

Emailer: This module allows technicians to send and receive email messages while out in the field. A built-in alert function notifies the user of the arrival of new messages by means of either an acoustic or an optical signal. The email module can replace the pager normally carried by technicians.

4.3. Session Manager

The Session Manager (figure 7) is responsible for setting up network connections for audio and video between two machines. It also implements the screen sharing facility described in the next section. The Session Manager is implemented on top of Microsoft NetMeeting, a toolkit for building collaborative applications.

5. Collaboration functions

NETMAN provides support for synchronous and asynchronous collaboration of field technicians and experts:

• As a communication device it enables real-time audio conversations between technicians in the field and office-based experts, including the sharing of video images with remote users.

• A shared notebook application allows technicians to document their work and look up notes of other technicians which are relevant to the task at hand.

The collaboration aspects of NETMAN are summarized in table 2.

5.1. Synchronous collaboration

NETMAN includes two application modules designed to enable users to work together synchronously. The first module is the User Manager.

Table 2. Groupware dimensions of NETMAN.

                 Synchronous                               Asynchronous

Communication    Real-time audio conversations             Seamless and ubiquitous access to email

Cooperation      Screen sharing and remote pointing,       Shared notebook functions as public group
                 including sharing of video images         memory and personal reminder

User Manager: The User Manager (figure 10) allows users to start and end collaboration sessions. To start a session a user selects another user’s name from a list of possible communication partners. A dialog box will pop up on the remote machine, giving the called user a chance to accept or decline the call. As soon as a session has been created, a voice channel is opened between both machines, which enables users to talk to each other.

Once a session has been established, either user can decide to share the screen contents with the other user. Pressing the Share button visible in figures 8, 9, and 10 initiates a process whereby the contents of the application window are transmitted to the remote machine, where they are displayed in the Remote Application Viewer.

Remote Application Viewer: The Remote Application Viewer (figure 8(b)) is an application module that displays the contents of the shared application window of a remote machine.

The effect of sharing the application window is that the remote participant can watch what the local user is doing on his or her machine. While the application window can be shared regardless of which application module is active, there are only two modules where it really makes sense to do so: the Camera Viewer and the Web Browser.

For instance, if a technician wants to show an expert what equipment he is working on, he would select the Camera Viewer and press the Share button. The expert can then watch the remote video image using the remote viewer module. This allows the expert to answer questions like “Where do I plug in this connector?” As can be seen in figure 8, both users have exactly the same screen in front of them.

Conversely, by sharing the web browser of his or her workstation, the expert at the office can pull up online manuals and refer to them while explaining to the field technician how to perform a certain procedure.
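The share/view flow can be summarized in a small sketch. This is a toy model under our own naming assumptions; in the actual system the transmission is delegated to NetMeeting's screen sharing facility.

```python
# Toy model of a NETMAN collaboration session: either party may press
# 'Share', and the other side sees the frames in the remote viewer.
# Class and method names are hypothetical.

class Session:
    def __init__(self, caller, callee):
        self.users = (caller, callee)
        self.shared_frames = []        # frames sent over the wireless link

    def share(self, source_module, frame):
        # Pressing 'Share' transmits the active application window.
        self.shared_frames.append((source_module, frame))

    def remote_view(self):
        # The Remote Application Viewer shows the latest received frame.
        return self.shared_frames[-1] if self.shared_frames else None


session = Session("field technician", "office expert")
session.share("Camera Viewer", "image of the router closet")
```

The symmetry of the model reflects the peer-to-peer architecture: both machines run the same software, so either side can take the sharing role.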

5.1.1. Collaboration modes

Users who are engaged in a collaboration session can choose their respective application modules independently: a) they may decide to rely on voice only and use different modules on each machine; b) they may decide to use the same module without sharing (for example, both users may look at documentation using the web browser); or c) they may decide to collaborate more closely by having one of them share the application window and the other one use the remote application viewer. Users can switch modes at any time; there is no explicit protocol for switching between modes.

5.1.2. Remote pointing

In order to direct the attention of a remote participant, the viewer of a shared image can use a remote pointer, a technique well known from collaborative systems with shared workspaces. The remote pointer is displayed on the screen of the receiving (non-sharing) computer in addition to the regular cursor. The remote pointer mirrors the cursor of the sharing computer and is tied to the mouse of the receiving machine. Thus, the user who shares the application window sees two cursors: the regular one and the remote pointer controlled by the remote user. Cursor and remote pointer are distinguished by form and color.
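The two-cursor state on the sharing machine can be modeled in a few lines. This is an illustrative sketch only; the names and the event interface are assumptions, not the NETMAN code.

```python
# Toy model of the two-cursor state during remote pointing: the sharing
# screen renders its own cursor plus a pointer that mirrors the mouse of
# the receiving machine. Names are illustrative.

class SharedScreen:
    def __init__(self):
        self.local_cursor = (0, 0)     # regular cursor of the sharing user
        self.remote_pointer = (0, 0)   # mirrors the receiving user's mouse

    def on_local_mouse_move(self, x, y):
        self.local_cursor = (x, y)

    def on_remote_mouse_move(self, x, y):
        # Mouse events from the receiving machine arrive over the network
        # and move a second, differently styled pointer.
        self.remote_pointer = (x, y)


screen = SharedScreen()
screen.on_local_mouse_move(10, 20)
screen.on_remote_mouse_move(320, 180)
```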

5.2. Asynchronous collaboration: The Wearable Notebook

The Wearable Notebook is a multi-user application that gives technicians access to a shared text database (figure 9). It is realized as a distributed client–server application, with client software running on the wearable computers connected to a central LAN database server. To the user it appears as a simple text editor that allows technicians to write down notes using the one-handed keyboard. Notes are automatically stored on the central server as they are created. Technically, the Wearable Notebook is just another application module that runs within the Application Manager.
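The client-server write path can be sketched as follows. The interfaces are assumptions for illustration, not taken from the NETMAN implementation.

```python
# Sketch of the notebook's client-server write path: a note is pushed to
# the central server the moment it is created. All names are assumed.

class NotebookServer:
    """Stands in for the central LAN database server."""
    def __init__(self):
        self.entries = []

    def store(self, entry):
        self.entries.append(entry)


class WearableNotebook:
    """Client side, running on the wearable computer."""
    def __init__(self, server, author):
        self.server = server
        self.author = author

    def write_note(self, text):
        entry = {"author": self.author, "text": text}
        self.server.store(entry)       # stored automatically on creation
        return entry


server = NotebookServer()
notebook = WearableNotebook(server, "tech-1")
notebook.write_note("Port 3 on the hub is dead; rerouted to port 5.")
```

Storing on creation, rather than on an explicit save, is what makes every note immediately visible to the other technicians' clients.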

The Wearable Notebook is intended to give technicians a way to document their work on site. When troubleshooting a particular piece of equipment, technicians can describe the symptoms, the cause of the problem, and what they finally did to resolve it. These notes are accessible to other users so that technicians can share their experiences and knowledge. In this respect, the Wearable Notebook serves a purpose similar to other shared information repositories like AnswerGarden or the Virtual Notebook System [9].

The collection of notes and memos created by technicians over time represents the collective group memory of all technicians and allows them to effectively share hands-on experience. Having a detailed description of work that was previously performed enables a technician to continue where the last technician left off. If necessary, it enables a technician to contact the persons who performed repair tasks in the past. In sum, the Wearable Notebook captures and makes available the knowledge and expertise of the people performing the work on site, knowledge that is lost in the current practice.

5.2.1. Location-aware retrieval of notebook entries

The unique feature of the Wearable Notebook is the ability to retrieve notebook entries depending on the current location of the wearable computer. Technicians do not have to search through the entire notebook for entries relevant to their current task. Instead, the Wearable Notebook automatically presents the user with a list of all notebook entries created at or near the user’s current location. The list is automatically updated whenever the user moves from one location to the next (figure 9(a)).

Figure 11. Location tracking subsystem.

The combination of sensors and context-awareness has been employed before in various mobile and wearable computing systems [2,3,6–8,16]. However, all of these systems are single-user systems without collaboration aspects.

In NETMAN the wearable computer’s ability to determine its own location and to use this information to tag or retrieve notebook entries has been realized using infrared technology. We have designed a low-cost infrared transceiver that can be distributed in buildings throughout campus. These transceivers send out unique signals that are received and recognized by the IR receiver attached to the wearable computer (Starner et al. [23] describe a similar IR system).

Information about the exact location of each transceiver is stored in a central database. Upon receiving signals from a transceiver, the wearable computer queries this database for the transceiver’s location. The location of transceivers is described in terms of buildings and rooms, not in absolute physical locations. For example, using the most recently received signal, the wearable computer can determine its current location as “Deschutes Hall, Room 100”. All entries in the notebook are automatically tagged with the current location of the wearable device, without requiring the user to provide this information. When the user moves and a new IR signal is received by the wearable computer, the location is used as a key to search the database for matching entries, which are then displayed on the screen. A schematic of this architecture is shown in figure 11.
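The tagging and retrieval cycle just described can be sketched compactly. The transceiver IDs, room names, and API below are illustrative assumptions, not the NETMAN implementation.

```python
# Sketch of the location-tagging and retrieval cycle: an IR beacon sets
# the current location, entries are tagged with it automatically, and
# the location keys the lookup of matching entries. Names are assumed.

TRANSCEIVER_LOCATIONS = {              # central database: beacon -> place
    "ir-0042": "Deschutes Hall, Room 100",
    "ir-0043": "Deschutes Hall, Room 120",
}

class LocationAwareNotebook:
    def __init__(self):
        self.entries = []              # every entry carries a location tag
        self.location = None

    def on_ir_signal(self, transceiver_id):
        # The most recently received beacon defines the current location;
        # matching entries are retrieved as a side effect of moving.
        self.location = TRANSCEIVER_LOCATIONS[transceiver_id]
        return self.relevant_entries()

    def add_entry(self, text):
        # Entries are tagged automatically; the user never types a location.
        self.entries.append({"location": self.location, "text": text})

    def relevant_entries(self):
        return [e["text"] for e in self.entries
                if e["location"] == self.location]


notebook = LocationAwareNotebook()
notebook.on_ir_signal("ir-0042")
notebook.add_entry("Replaced the fan in router R7.")
```

Walking into Room 120 and back into Room 100 would yield an empty list and then the Room 100 entry again, mirroring the automatic list update the paper describes.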

Infrared-based systems have been used before to provide information about location or object identity, most notably in Active Badge systems [24]. However, the philosophy of such systems is contrary to what we aim at. Active Badge systems are based on ‘smart environments’ that observe and monitor the user’s movements. Because of privacy concerns we use the opposite approach, in which the wearable computer actively collects information about its environment. In the NETMAN system, there is no central instance that monitors the users’ movements.

The main advantage of the Wearable Notebook is the seamlessness with which information can be entered and accessed. The head-mounted display enables the user to ignore the displayed information completely. If the need should arise, however, the user can shift attention instantaneously. The automatic location-dependent retrieval function of the Wearable Notebook reduces the need to actively look for relevant entries, as this is done automatically as a side effect of the user’s movements in physical space.

5.2.2. Places vs. things

The idea to use location as an index for storing and retrieving notebook entries is based on the assumption that location is a good relevance indicator. Notebook entries that were created in the very room a technician is currently working in are most likely relevant to his or her task, since they relate to the same equipment the technician is working on. However, there is a problem whenever equipment is moved from one room to the next. Since notes are associated with places and not with the equipment itself, connections between a particular piece of equipment and its corresponding notebook entries are lost whenever it is moved.

This problem can be resolved in two ways: first, by maintaining a central database that indicates the current location of each piece of equipment; second, by virtually attaching notebook entries to equipment rather than to the rooms in which the equipment is located. The drawback of the first approach is that it requires a database that at all times accurately reflects the current location of each device. Such a database would have to be updated every time equipment is moved, a task that probably incurs too much overhead to be practical.

We have realized the second approach as an alternative to the IR-based location tracking system. We use iButtons from Dallas Semiconductor [11] to attach an electronically readable identification tag to each piece of equipment. An iButton is a 16 mm computer chip housed in a stainless steel can that, when attached to an object, enables users to store up-to-date information directly at the object. The iButton reader of the Oregon Wearable Computer is used to read and write to the iButton memory. After extracting the ID from an iButton, it is used in the same way as data received by the infrared receiver. Unfortunately, the user is required to physically connect the reader to the iButton; wireless transmission of information is not (yet) possible.
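The equipment-tag variant only changes the retrieval key: the scanned ID, rather than a room name, selects the notes. The sketch below is illustrative; identifiers and structure are our own assumptions.

```python
# Sketch of the equipment-tag alternative: the scanned tag ID, not the
# room, keys the note lookup, so notes follow equipment when it moves.
# Identifiers and structure are illustrative.

class EquipmentNotebook:
    def __init__(self):
        self.notes_by_tag = {}         # equipment tag ID -> list of notes

    def add_entry(self, tag_id, text):
        self.notes_by_tag.setdefault(tag_id, []).append(text)

    def on_tag_scanned(self, tag_id):
        # The ID read from the tag plays the same role as an IR location
        # signal in the location-aware variant: it keys the retrieval.
        return self.notes_by_tag.get(tag_id, [])


notebook = EquipmentNotebook()
notebook.add_entry("tag-7f3a", "Swapped the power supply.")
```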

6. Discussion

As of the writing of this paper, the NETMAN system is in an experimental state and has not yet undergone field-testing. Nevertheless, during the design and through experimentation we were able to gain important insights into technical and usability issues of collaborative wearable systems.

6.1. Head-mounted display

On the positive side, we found that wearing an HMD poses fewer usability problems than anticipated. Users in general get used to wearing an HMD and are able to switch attention between the physical surroundings and screen contents instantaneously. On the other hand, wearing the Virtual-IO display over long periods of time is not very comfortable due to the weight of the IO glasses. Similarly important is the fact that there are serious social barriers that might prevent technicians from using a head-mounted display in its current form in public. Yet most users expressed their willingness to use HMDs if the system provides significant benefits to them. In that respect we put much hope in a new generation of small, lightweight, eyeglass-mounted displays as exemplified by [22]. Such eyeglasses will remove most of the social and some of the usability concerns.

Nevertheless, we think that HMDs are not necessarily the best display option for this kind of wearable computer system. Instead, we intend to employ a wrist-mounted display as used, for example, in [17].

6.2. Text entry

Using the shared notebook feature of NETMAN requires technicians to input text. A chorded keyboard such as the Twiddler™ can be an efficient text input device once the user has undergone an extensive learning period. Some users claim a typing speed of up to 30 wpm (words per minute, where a word is 5 characters plus one space) after several months or years of practice. That would be comparable to the speed an average user can achieve using a miniature keyboard with a regular layout. Yet, based on our own experience, it is questionable to assume that prospective users are willing to invest the time it takes to become proficient with the Twiddler™ or a similar type of keyboard. For our purposes, the Twiddler™ is impractical due to its steep learning curve.

A speech interface could one day be the solution to the text entry problem of wearable computers. However, it interferes with the voice-conferencing function of our prototype, and so we decided not to include it.

Instead, we see two possible alternatives. First, we could use a wrist-mounted display/keyboard combination. Second, we could abandon text input altogether and resort to an approach where technicians create notebook entries out of predefined text blocks. With this approach, text input is no longer necessary; instead, users must only be able to select text blocks from a list.
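The second alternative can be sketched as follows. The block categories and phrases below are invented for illustration; a deployed vocabulary would be derived from the technicians’ actual reporting needs:

```python
# Sketch of note composition from predefined text blocks: the technician never
# types, but only selects one phrase per category from an on-screen list.
BLOCKS = {
    "device":  ["hub", "router", "workstation", "printer"],
    "problem": ["no link", "intermittent packet loss", "power failure"],
    "action":  ["replaced", "reconfigured", "escalated to office"],
}


def compose_note(device: int, problem: int, action: int) -> str:
    # Each argument is the list index the user selected for that category.
    return (f"{BLOCKS['device'][device]}: {BLOCKS['problem'][problem]}; "
            f"{BLOCKS['action'][action]}")


print(compose_note(1, 2, 0))  # → router: power failure; replaced
```

Selection from short lists fits the limited input devices of a wearable far better than free text entry, at the cost of expressiveness.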

6.3. Pointing devices and direct manipulation

Pointing devices like mice and track-balls are not well suited for wearable computers. Mice require a flat surface, while track-balls are sensitive to orientation – that is, the orientation of the track-ball changes with the user’s body position, requiring the user to adjust his or her finger movements accordingly. We are using a body-worn track-ball attached to the belt, but most users find it disorienting and not very precise. We have come to the conclusion that for our interface design it is best to avoid the need for direct cursor control altogether. Instead, we favor an approach where users can cycle through interface elements using a simple tabbing approach. Our current interface design reflects this approach.
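The tabbing approach reduces cursor control to a single “advance focus” action. A minimal sketch (class and element names are hypothetical, not taken from the NETMAN interface):

```python
# Sketch of tab-cycling focus: one button advances focus through the interface
# elements, wrapping around at the end; a second button activates the focused
# element. No 2-D cursor positioning is ever required.
class TabFocus:
    def __init__(self, elements):
        self.elements = elements
        self.index = 0  # index of the currently focused element

    def tab(self) -> str:
        # Advance focus, wrapping around with modular arithmetic.
        self.index = (self.index + 1) % len(self.elements)
        return self.elements[self.index]

    def focused(self) -> str:
        return self.elements[self.index]


ui = TabFocus(["Notebook", "Conference", "Camera", "Quit"])
ui.tab()             # focus moves to "Conference"
ui.tab()             # focus moves to "Camera"
print(ui.focused())  # → Camera
```

Because the only required inputs are “next” and “select”, this scheme works with two buttons on a belt or a chorded keyboard, with no fine motor control.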

6.4. Video communication in the presence of resource limitations

The resource limitations of wearable devices are an important but often ignored aspect of wearable computing. For example, wireless networks are limited in terms of communication delays, bandwidth and reliability. In contrast to non-wearable collaborative systems, which often rely on powerful workstations and high-speed network connections, the richness of the user experience in wearable systems is to a certain extent limited by such technical constraints. Although other researchers found no direct correlation between video quality and effectiveness of collaboration [1], our preliminary results suggest that the video quality as currently realized in the NETMAN system adversely affects the collaboration between technicians. The video quality is in general perceived as poor and inadequate.

There are two aspects to this problem, one related to frame rate and the other to lighting conditions. Our current system implementation achieves a frame rate of up to 2 frames/sec for a 200×200 pixel video image. This frame rate results in visible delays and jerky motion of the remote video image whenever the wearable user moves the head. This is a combined effect of delays in the camera and in the network, and probable inefficiencies in our current implementation. It is important to note that the problem of the moving camera is unique to wearable systems, since only wearable systems use cameras that point away from the user at the workspace. In contrast, traditional videoconference solutions (for example, [25]) rely on a talking-head approach where a still camera points at a more or less stationary user. The motion effect of a rotating camera, however, is much more severe than the effect of a user who is moving in front of a still camera.

Besides the motion blur of the camera, we found that insufficient lighting conditions inside buildings significantly contribute to the impression of poor video quality. Pictures often look not only blurry, but also too dark.

Both these effects together (motion blur and insufficient lighting) mean that the remote expert is often not able to tell what is shown on the video image. The video quality is good enough to allow the remote expert to determine where a wearable user is, in which direction he or she is looking, and what objects there are in the environment. Yet it is impossible to clearly identify details like the shape and type of connectors, or to read printed labels (“Is this a Sparc I or Sparc IV?”).

We are currently working on overcoming this problem. Our approach is based on the notion of ‘remote sensing’ and is described in [5].

7. Future work

Providing collaboration tools for highly mobile field workers, such as the network technicians in our example, is a challenging design task. While wearable technology allows new and interesting forms of collaboration, it also introduces serious technical and usability issues. We believe that NETMAN takes an innovative approach in addressing some of these questions:

• NETMAN for the first time combines synchronous and asynchronous collaboration functions in a wearable system;

• As a collaborative system, it is unique in its use of sensor technology for providing users with automatic, context-sensitive access to relevant information.

Future work on NETMAN will continue in two directions. First, we are currently performing a formal evaluation of the effectiveness of NETMAN’s video collaboration function, with particular regard to remote pointing. Second, we are working on integrating ‘remote sensing’ capabilities into NETMAN. The concept of ‘remote sensing’ means that a remote user has direct, unmediated access to the output of sensors attached to another user’s wearable computer. Thus, remote sensing lets users perceive a remote environment through the sensors of a remote wearable computer. We believe that remote sensing is a way to overcome the limitations of poor video quality by providing a remote expert with additional, more accurate information about the state of the remote environment. By creating a heightened sense of copresence and a rich shared conversational context, we see remote sensing as a way to significantly enhance the effectiveness of the collaboration of wearable users.
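The core of the remote-sensing idea can be illustrated with a small publish/subscribe sketch (class, sensor, and ID names are our own illustrative choices; the actual design is described in [5]):

```python
# Sketch of 'remote sensing': every raw sensor reading on the wearable side is
# relayed unmediated to observers on the remote (office) side, so the expert
# perceives the field environment through the wearable's own sensors.
class SensorHub:
    def __init__(self):
        self.remote_observers = []  # callbacks on the remote side

    def subscribe(self, observer) -> None:
        # A remote expert registers to receive raw sensor output.
        self.remote_observers.append(observer)

    def publish(self, sensor: str, value) -> None:
        # Relay each local reading as-is, without local interpretation.
        for observer in self.remote_observers:
            observer(sensor, value)


hub = SensorHub()
log = []
hub.subscribe(lambda sensor, value: log.append((sensor, value)))

# Readings from the wearable's sensors reach the remote side directly:
hub.publish("ir-location", "room-341")
hub.publish("ibutton", "7F00000012AB")
print(log)
```

In a real deployment the observer callback would sit behind the wireless link; a location or tag reading of a few bytes is far cheaper to transmit than video, which is precisely why it can compensate for poor video quality.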

References

[1] M.S. Ackerman, R.E. Kraut, M.D. Miller and J. Siegel, Collaboration in performance of physical tasks: Effects on outcome and communication, in: Proc. CSCW ’96, Boston, MA (1996).

[2] G.D. Abowd, C.G. Atkeson, J. Hong, S. Long, R. Kooper and M. Pinkerton, Cyberguide: A mobile context-aware tour guide, Wireless Networks 3 (1997) 421–433.

[3] G.D. Abowd, A.K. Dey, R. Orr and J. Brotherton, Context-awareness in wearable and ubiquitous computing, in: Proc. 1st ISWC ’97, Boston, MA (1997).

[4] L. Bass et al., The design of a wearable computer, in: Proc. CHI ’97, Atlanta, GA (1997).

[5] M. Bauer, T. Heiber, G. Kortuem and Z. Segall, A collaborative wearable system with remote sensing, in: Proc. 2nd ISWC ’98, Pittsburgh, PA (1998).


[6] H.W.P. Beadle, R. Gonzalez and J. Judge, Experiments with domestic hypermedia information systems, in: 8th IEEE Workshop on Local and Metropolitan Area Networks, Potsdam, Germany (1996).

[7] H.W.P. Beadle, B. Harper, J.G.Q. Maguire and J. Judge, Location aware mobile computing, in: Int. Conf. on Telecommunications (ICT ’97), Melbourne (1997).

[8] H.W.P. Beadle, J.G.Q. Maguire and M.T. Smith, Using location and environment awareness in mobile communications, in: Proc. ICICS ’97 (1997).

[9] L.M. Berlin, R. Jeffries, V.L. O’Day, A. Paepcke and C. Wharton, Where did you put it? Issues in the design and use of a group memory, in: Proc. INTERCHI ’93, Amsterdam (1993).

[10] M. Billinghurst, S. Weghorst and T.A. Furness, Wearable computers for three-dimensional CSCW, in: Proc. 1st ISWC ’97, Boston, MA (1997).

[11] Dallas Semiconductor: http://www.ibutton.com.

[12] A.K. Dey, G.D. Abowd and A. Wood, CyberDesk: A framework for providing self-integrating context-aware services, in: Intelligent User Interfaces ’98.

[13] Handykey Corporation: http://www.handykey.com.

[14] A.H. Ishii, TeamWorkStation: Towards a seamless shared workspace, in: Proc. CSCW ’90.

[15] G. Kortuem, Z. Segall and M. Bauer, Context-aware, adaptive wearable computers as remote interfaces to ‘Intelligent’ environments, in: Proc. 2nd ISWC ’98, Pittsburgh, PA (1998).

[16] S. Long, R. Kooper, G.D. Abowd and C.G. Atkeson, Rapid prototyping of mobile context-aware applications: The Cyberguide case study, in: Proc. MobiCom ’96.

[17] E. Matias, I.S. Mackenzie and W.A. Buxton, Wearable computer for use in microgravity space and other non-desktop environments, in: Proc. CHI ’97, Atlanta, GA (1997).

[18] L.J. Najjar, J.C. Thompson and J.J. Ockerman, A wearable computer for quality assurance inspectors in a food processing plant, in: Proc. 1st ISWC ’97, Boston, MA (1997).

[19] J.J. Ockerman, L.J. Najjar and J.C. Thompson, Wearable computers for performance support: Initial feasibility study, in: Proc. 1st ISWC ’97, Boston, MA (1997).

[20] J. Siegel, R.E. Kraut, B.E. John and K.M. Carley, An empirical study of collaborative wearable computer systems, in: Proc. CHI ’95 (1995).

[21] B. Smith, L. Bass and J. Siegel, On site maintenance using a wearable computer system, in: Proc. CHI ’95.

[22] M.B. Spitzer, N.M. Rensing, R. McClelland and P. Aquilino, Eyeglass-mounted displays for wearable computing, in: Proc. 1st ISWC ’97, Boston, MA (1997).

[23] T. Starner, D. Kirsch and S. Assefa, The locust swarm: An environmentally-powered, networkless location and messaging system, in: Proc. 1st ISWC ’97, Boston, MA (1997).

[24] R. Want, A. Hopper, V. Falcao and J. Gibbons, The active badge location system, ACM Trans. on Information Systems 10(1) (January 1992).

[25] K. Watabe, S. Sakata, K. Maeno, H. Fukuoka and T. Ohmori, Distributed multiparty desktop conferencing system: MERMAID, in: Proc. CSCW ’90 (1990).

Gerd Kortuem is a Ph.D. candidate at the Computer and Information Science Department, University of Oregon, where he is a member of the Wearable Computing Research Group. His current research interests include mobile and wearable computing, collaborative systems, mobile agent systems, and self-adapting distributed systems. He received Master’s degrees in computer science from both the University of Stuttgart, Germany, and the University of Oregon. Before entering the Ph.D. program, Gerd Kortuem worked on mobile and nomadic systems at the Advanced Technology Group at Apple Computer, Cupertino. Prior to that, he worked on terminological reasoning systems at the IBM Research Center in Stuttgart, Germany, and the Technical University of Berlin, Germany. In 1993, Gerd Kortuem received a Fulbright Fellowship to pursue his studies in the US. He is a member of the ACM and the IEEE Computer Society.
E-mail: [email protected]

Martin Bauer is a Master’s student at the Computer and Information Science Department, University of Oregon, where he is a member of the Wearable Computing Research Group. His research includes user-interface and systems aspects of collaborative wearable computer systems. In 1996, he received a Bachelor’s degree in computer science from the University of Stuttgart, Germany. He is currently a Fulbright grantee for the academic year 1997/98.

E-mail: [email protected]

Zary Segall is Professor and Department Head of the Computer and Information Science Department, University of Oregon. Prior to joining the University of Oregon in 1993, he spent 15 years with the Computer Science and Electrical and Computer Engineering Departments at Carnegie Mellon University. Dr. Segall has developed a number of technologies, including: technology for fast prototyping of performance-efficient parallel processors; a methodology and practical system for automated performance characterization of complex programs; a methodology and practical system for validation of fault-tolerant programs; a methodology for open-system, application-independent, fault-tolerant operating systems, such as FTM (Fault Tolerant Unix) and Total Recall for Windows; and the concept of self-organizing software architectures for wearable computers. Dr. Segall’s current interests include wearable information systems, dynamic validation of QoS in adaptable networks, and knowledge trading systems. Zary Segall is a fellow of the IEEE Computer Society and a member of the ACM.
E-mail: [email protected]