

HCI and new technologies in the interface design for medical ultrasound devices

Marcin Wichrowski, Polsko-Japońska Wyższa Szkoła Technik Komputerowych w Warszawie

Abstract

The development of new technologies contributes significantly to changes in the interaction and interface design of everyday devices and of more specialized medical devices. This article attempts to analyze which of these solutions may directly change the level of safety in use, the forms of interaction, and the appearance and usefulness of medical ultrasound equipment. Particular attention is devoted to new types of display screens (LCD, LED, plasma, FOLED, E-Ink, wearable displays), methods of interaction through gestures (multi-touch, body, hand and facial gesture control), speech synthesis and recognition, intelligent workspaces and software, networking technologies and robot-assisted equipment.

1. Introduction

Designing medical products is a challenging and complex task. Getting the design right means investigating the product's functional requirements and the users' needs and preferences. The most reliable way to accomplish this goal is to implement a structured Human-Computer Interaction (HCI) engineering program from the very beginning. HCI is the field concerned with the design, evaluation and implementation of interactive computing systems for human use, and it lies at the intersection of many disciplines (computer science, cognitive science, human factors, software engineering, management science, psychology, sociology and anthropology). In order to ensure high usability of a new product, designers should follow the principles of User-Centered Design (UCD), which derive from human factors psychology. In the UCD process, designers analyze the needs, wants and limitations of the end users of a product at each stage of development. The rapid growth of new medical devices based on software solutions makes the HCI field an increasingly important factor in the medical area. The combination of rapid prototyping and usability testing helps fix problems before the product is introduced and can point to deeper implications for re-design [10].

Standards for the application of design principles to medical devices indicate that a high level of usability is important for all components. Even parts that do not perform safety-critical functions could lead to scenarios of improper use and create dangerous situations for patients. Products designed without taking usability and human factors into account can impact safety because users may become fatigued, confused and frustrated. In this context, the use of advanced technology in the design of medical ultrasound devices could improve their usability and safety, but designers should take into account that it must be tested to avoid the emergence of new risks for doctors and patients.

According to the guidelines of the Food and Drug Administration (FDA), the International Electrotechnical Commission (IEC) and the International Organization for Standardization (ISO), risk analysis is needed to check how new technologies and user interface solutions affect the usability and safety of medical devices. Inspection is proposed in terms of interface design heuristics for medical devices, e.g. [9], together with the analysis and assessment of risk for each new UI technology [4].

Conducting tests of the suitability of the latest technologies for designing user interfaces for ultrasound equipment is a difficult task. First of all, it often involves building working prototypes, which are not only expensive but also very laborious. Moreover, manufacturers of medical equipment rarely replace proven technology, so it is hard to find similar medical devices using a new solution against which to perform a comparative analysis. This paper is therefore illustrative and tries to determine the suitability of the new technologies by referring to their characteristic features and to examples of their use in similar medical areas.

XIII International PhD Workshop OWD 2011, 22–25 October 2011


2. Application of large screen displays

The development of display technology has resulted in ultrasound equipment being fitted with better color LCD / LED / plasma displays. Their advantages over older CRT display technology are considerable: they are thinner and lighter and allow very precise, high-resolution image display without flicker, while retaining a wide viewing angle. This makes it possible to install displays in different locations on the ultrasound machine: on the control panel, on a side wing, or even as a unit detachable from the main device. With image diagonals above 24 inches it is possible to clearly present a large amount of information at the same time, such as comparing multiple images from different studies or viewing multiple readings, documents, etc. simultaneously. Removing the need to switch between screens and navigate between them may speed up work. The quality of the presented image is also very significant: high resolution, very good contrast and precise grayscale reproduction play a key role in reading the image from the probe.

Taking into account the use of screens in the construction of popular stationary ultrasound machines, the following configurations are usually encountered:

• a large touchless screen for visualizing the examination plus a large control panel,

• a large touchscreen plus a minimized version of the control panel with key functions,

• the above systems are sometimes accompanied by additional displays (both touch and touchless), built mostly into the control panel, which act as information / configuration devices and can change the functions of the controls surrounding the screen.

Fig. 1. Vscan - portable ultrasound device (General Electric Company) and S-FAST - universal ultrasound tool (SonoSite)

3. Application of small screen displays

Progressing miniaturization and the availability of very good quality small displays, combined with decreasing energy consumption, give new opportunities to build portable ultrasound devices. Compared to large screens, the remaining problem is less comfortable interaction with the device. For mobile devices there are two common design patterns:

• a device that is entirely a touchscreen, in which data entry is done via an onscreen keypad and controls,

• a device consisting of a touchless screen and an additional control panel with buttons and keypads.

The possibilities of small-screen displays can be extended by equipping mobile devices with mini projectors and docking stations with large displays. The inconvenience of a small working space can also be reduced by adding voice control or multi-touch gesture control.

4. Touchscreens vs. touchless displays

Both large and small screens can be augmented with touch technology. In this case, beyond their informational role, screen interfaces open up new possibilities for interaction. The tendency to popularize devices in which all interactions take place only through the touchscreen is growing; it includes tablets, mobile phones, satellite navigation, music players, etc. The idea of devices that consist entirely of touchable screens and interact with one another was presented by D. Merrill in Siftables [3]. This way of designing allows quick changes of the user interface without having to replace the entire device, and thus changes the experience of using it. Users can also easily adjust the appearance of the interface to suit their needs (e.g. change the position of buttons, increase their size, modify the proportions of interface components, etc.). Pros of touchscreens:

• quick changes in the appearance of the device interface,

• interactive workspaces created by the user,

• control by means of multi-touch gestures,

• more screen space for interaction on small devices.

Cons of touchscreens:

• some types of screens require a stylus,

• errors are easy to make when entering data (e.g. buttons that are too small),

• uncertainty whether the interaction was registered,

• the screen gets dirty easily.


Pros of touchless displays:

• a clear sense that the interaction was performed (a real button instead of a virtual one on the screen),

• the screen is easier to keep clean.

Cons of touchless displays:

• lack of a uniform interface design across manufacturers (different locations of buttons, symbols and nomenclature),

• inability to adapt the appearance of the interactive workspace to user requirements,

• the space needed for additional buttons enlarges the device and reduces the screen size, especially for portable devices.

5. FOLED (Flexible Organic Light Emitting Diodes) and E-Ink (Flexible Electrophoretic Ink) displays

FOLED (Flexible Organic Light Emitting Diode) and E-Ink (flexible electrophoretic ink) screens remove the restriction of having to design a flat display surface where interaction is limited to movement along the x and y axes. Displays of this kind can "wrap" around any three-dimensional object, for example presenting examination data directly on the probe, or take the form of ultra-thin films placed on different parts of the ultrasound device. These ideas gave rise to the name "Organic User Interfaces" (OUIs) [6]. Their spatial nature and ease of integration with any surface make it possible to use interaction methods similar to those known from everyday life; for example, changing a page can be as simple as bending the corner of the display (like turning the pages of a real book). Using such thin-film displays to present results creates many new possibilities of interaction: the user will be able to move data between them and arrange it visually in any way.

6. Wearable displays (video eyewear)

In order to release physicians from the need to simultaneously observe the patient and the ultrasound device screen, one can consider introducing video eyewear equipped with cameras, transmitting a real image augmented with information from the ultrasound device. There are several popular solutions of this type of wearable display using small built-in LCD or CRT screens. Unfortunately, their low resolution and image quality make it difficult to provide the good image perception that is necessary. The optical magnification needed to fill the entire field of view and the proximity of the displays cause eyestrain and difficulties in getting used to this type of image perception. The relatively large weight of such devices and their wiring also have a negative impact on usability and hinder ease of movement and freedom of work. The appearance of lightweight, translucent video eyewear offering high-precision video and wireless communication is a necessary condition to begin considering such alternatives for presenting a medical examination.

7. Multi-touch gesture control

With the growing popularity of devices with touchscreens or touch areas (smartphones, tablet PCs, touchpads, etc.), the introduction of touch gestures seems to be a natural enrichment of the forms of interaction with medical device interfaces. Gestures could be entered either directly on a touchscreen (the most natural solution) or via an active area of the desktop (touchpad).

In the case of ultrasound devices, this type of interaction would assist the process of reviewing and analyzing images (zoom, pan, rotate, photo selection, sorting) as well as setting the parameters of the study or adapting the workspace to users' needs (such as dragging and scaling application windows). In addition, the system could reserve specific gestures for common operating commands such as "next / previous screen", "start / default screen", "move up / down", "open in new window", "maximize / minimize window", etc., which are already used, for example, in some web browsers (executed by mouse movements). Intuitive gestures allow routine operations to be performed faster and more ergonomically.
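As a purely illustrative sketch (not part of the original paper or of any real device), the mapping of recognized gestures to operating commands described above can be expressed as a simple dispatch table; the gesture names and handler functions below are hypothetical.

# Illustrative sketch: mapping recognized multi-touch gestures to review commands.
# Gesture names and handler functions are hypothetical, not taken from any real device.

from typing import Callable, Dict

def zoom_image(factor: float) -> None:
    print(f"Zooming image by factor {factor}")

def pan_image(dx: int, dy: int) -> None:
    print(f"Panning image by ({dx}, {dy})")

def next_screen() -> None:
    print("Switching to next screen")

# Dispatch table: one recognized gesture name -> one operating command.
GESTURE_COMMANDS: Dict[str, Callable[[], None]] = {
    "pinch_out": lambda: zoom_image(1.25),
    "pinch_in": lambda: zoom_image(0.8),
    "two_finger_drag": lambda: pan_image(10, 0),
    "swipe_left": next_screen,
}

def on_gesture(name: str) -> None:
    """Execute the command bound to a recognized gesture, ignoring unknown ones."""
    handler = GESTURE_COMMANDS.get(name)
    if handler is not None:
        handler()

on_gesture("pinch_out")  # example: enlarge the current ultrasound image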

A necessary condition for efficient interaction with the device is a high-precision touchscreen / touch area and fast system feedback. It should be noted that during the examination the doctor will only be able to make gestures with one hand, because the other is occupied by the probe. Therefore, designers should aim to simplify these forms of interaction. Only after the examination is completed can the doctor use both hands to control the interface and, for example, quickly view and describe the results of the study.

8. Body / hand / facial gesture control

Technological progress has led to increasingly precise methods of motion analysis and recognition, resulting in a growing number of devices controlled by body movements, hand gestures, facial expressions, etc. This allows users to interact with the system more intuitively and naturally. The main condition for rapid acceptance of this type of communication is that it be simple, natural and comfortable. Moreover, gesture patterns must be easy to remember and perform, so that users learn them quickly without additional mental and physical load.

Gesture-controlled interactions provide advantages over conventional forms of computer interaction:

• access to information without contact, while maintaining sterility,

• greater accessibility for disabled users who have problems using a keyboard / mouse,

• more efficient exploration of large data sets, such as high-resolution images, with the benefit of interaction in 3D space [7].

The first argument is very valuable for medical devices, where sterility is critical in preventing infections. Gesture-controlled interfaces in the medical environment may be useful for controlling the distribution of resources in the hospital, interacting with medical devices, controlling visualization displays, and in the rehabilitation therapy of patients. For example, the Face Mouse system [5] allows doctors to control the movement of a laparoscope through facial expressions. Another example is the Gestix system [8], which allows MRI images to be browsed using gestures.

Fig. 2. Gestix system

In the case of ultrasound devices, hand gesture recognition could be used during the medical examination. For example, the doctor could increase the size of the image, change the visualization mode and adjust test parameters without touching the control panel. Standard interactions previously performed with buttons or other controls would be replaced by intuitive and easy-to-remember gestures. Another example is supporting the laborious process of viewing and analyzing images from the probe and later cataloguing and describing examination results. Implementing gesture control for medical ultrasound devices must meet several technical and usability requirements. The major ones are:

• Fast response time - only systems that give a sense of working in real time are perceived as effective by users. A response time of 45 ms is read as no delay, while above 300 ms the interaction is perceived as too slow and tiring.

• Easy to learn and intuitive - the proposed gestures should be as close as possible to natural patterns of human behavior and have a clear cognitive association with the functions users perform in the system (e.g. pointing a finger in order to select a menu item). This helps users learn them quickly and effectively. Designers should avoid gestures that require hard-to-remember, complicated movement trajectories or hand and finger configurations. Additionally, the sense of naturalness may be problematic because of the cultural diversity of users and their previous experience in reading the meanings of gestures.

• Comfort - selected gestures should be easy to make and should not require intensive, awkward movements or holding the hands in difficult, muscle-straining positions for too long.

• Precision of detection, tracking and recognition of gestures - this plays a key role in the interaction. Varied hand shapes, sizes and colors, different lighting conditions and image blur due to fast movements are common, serious obstacles to good gesture detection.

• Distinguishing proper gestures - while working with the device the user performs many other gestures that may be misinterpreted and introduce errors into the interaction. A solution to this problem may be to define characteristic gestures / motions that initiate and terminate recognition (see the sketch after this list). Another method is to precede and end the process with voice commands.

• Working immediately ("come as you are") - no need for additional elements supporting gestures, such as gloves, finger caps, or the preparation of special backgrounds, lighting, etc., which allows the interaction to start immediately, without unnecessary preparatory steps.

• Recognizing two-handed gestures - after completing the examination, two-handed gesture recognition can speed up interaction with the system, e.g. while navigating through resources and describing images.
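The "initiating / terminating gesture" idea from the list above can be illustrated with a minimal, hypothetical sketch; the gesture names and the gating logic are assumptions made for illustration only.

# Illustrative sketch of gating gesture recognition with an initiating and a
# terminating gesture, so that incidental hand movements are ignored.
# Gesture names ("open_palm", "fist", ...) are hypothetical.

from typing import Optional

class GestureGate:
    START_GESTURE = "open_palm"   # begins command recognition
    STOP_GESTURE = "fist"         # ends command recognition

    def __init__(self) -> None:
        self.active = False

    def process(self, gesture: str) -> Optional[str]:
        """Return a command gesture only while the gate is active."""
        if gesture == self.START_GESTURE:
            self.active = True
            return None
        if gesture == self.STOP_GESTURE:
            self.active = False
            return None
        return gesture if self.active else None

gate = GestureGate()
for g in ["wave", "open_palm", "swipe_left", "fist", "swipe_right"]:
    command = gate.process(g)
    if command:
        print("Executing:", command)   # only "swipe_left" is executed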

9. Proxemic interactions

In a greatly simplified explanation, the model of proxemic interaction assumes that devices have features such as user identification, calculation of the user's distance, position and motion relative to the device, and identification and location of other devices [2]. The intended uses of this type of interaction can be based on several sample scenarios. The first example is using displays that monitor the patient in the hospital as a temporary interface for portable ultrasound equipment: after detecting a nearby ultrasound device, the display connects to it, shows examination results in real time, and could also serve as an input device. Another example is identifying the user and adjusting the screen interface to his or her predefined preferences. A further idea is detecting the user's position relative to the device, such as checking whether the doctor is facing the screen. If not, the ultrasound device can switch to voice control and give additional spoken messages that cannot be observed at that moment. The level of detail of the interface may also vary depending on the user's distance from the screen: for example, when the doctor is close to the device it could display the entire interface with all functions, while further away it could show only the image from the probe.
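A minimal sketch of the distance-dependent level of interface detail described above; the distance thresholds and mode names are assumed values, not taken from any existing device.

# Illustrative sketch: choosing the level of interface detail from the user's
# distance to the screen, as in the proxemic scenario above.
# The distance thresholds (in metres) are assumed values, not from the paper.

def interface_mode(distance_m: float) -> str:
    if distance_m < 1.0:
        return "full_interface"            # all controls and functions visible
    elif distance_m < 3.0:
        return "image_with_basic_controls"
    else:
        return "probe_image_only"          # only the live ultrasound image

for d in (0.5, 2.0, 4.5):
    print(d, "m ->", interface_mode(d))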

10. Communication through speech recognition and synthesis

Automatic speech recognition (ASR) in medical devices is used, among other things, for positioning cameras, operating panels, taking notes, etc. However, due to its low reliability, it is limited to tasks not directly related to interaction with the patient. The main reasons why it is hard to conduct an entire medical procedure by voice control alone are: the need to prepare (train) the system for a new speaker, the use of a limited vocabulary of commands, imperfect error correction and the inability to manipulate natural language [4].

However, in the case of ultrasound examination, conditions are generally favorable for the use of ASR: the device itself does not generate a high volume of noise, the examination mostly takes place in controlled (quiet) conditions, and the doctor is usually facing the device and does not wear a mask distorting the voice. Moreover, the specificity of the procedure - the doctor usually has busy hands (one holding the probe, the other operating the controls to get the best picture) - indicates the need for additional support through voice commands. Changing the parameters of the image from the probe, enabling additional visualization modes, saving images, etc. could be done by voice control. Speech synthesis would serve to confirm the commands and provide information about the current operation. Speech can also be used to prepare medical documentation.
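A minimal sketch of voice control based on a limited command vocabulary, as discussed above; the command phrases, device actions and confirmation texts are hypothetical.

# Illustrative sketch of a limited command vocabulary for voice control,
# matching a recognized utterance to a device action and producing a
# confirmation string for speech synthesis. Commands and actions are hypothetical.

VOICE_COMMANDS = {
    "freeze image": "FREEZE",
    "save image": "SAVE",
    "increase gain": "GAIN_UP",
    "decrease gain": "GAIN_DOWN",
    "color doppler on": "DOPPLER_ON",
}

def handle_utterance(utterance: str) -> str:
    action = VOICE_COMMANDS.get(utterance.strip().lower())
    if action is None:
        return "Command not recognized, please repeat."   # text for speech synthesis
    # ... here the device would execute `action` ...
    return f"Confirmed: {utterance}."

print(handle_utterance("Freeze image"))
print(handle_utterance("open the window"))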

11. Intelligent workspaces / software

Despite many similarities, the control panels and screen interfaces of ultrasound devices usually show some differences in the construction and arrangement of the working space. The location and layout of groups of buttons, the shape and color of knobs / controls, command names, menu content, etc. are specific to each manufacturer. These differences may have a significant negative impact on the ergonomic quality and usability of the device and affect the comfort of a doctor who frequently carries over working habits from a previous device. Another problem associated with operating medical device interfaces is that users do not always go through proper training and generally have different levels of experience in operating the equipment.

The solution to these issues may be the introduction of customizable workspaces that allow the appearance of the screen to be adjusted to the needs of both novice users and experts. A physician might choose a workspace according to his or her experience or change it to fit his or her needs. A more sophisticated approach would be to analyze the interaction with the device (e.g. measuring the number of errors made by the user, delays in decision making, navigation paths followed while searching for a function, etc.) and automatically adjust the interface to the ability of the user. Techniques supporting learning how the equipment works could be realized in the form of interactive simulations using the device and through intelligent help / assistance suggesting the next steps of an action. Additionally, based on the analysis of the user's working methods, an advanced help system could be introduced to suggest the best solution for the current task.
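A minimal sketch of the idea of automatically adjusting the workspace to the measured ability of the user; the thresholds and layout names are assumptions for illustration.

# Illustrative sketch of adapting the workspace to the user's observed skill:
# if many input errors or long decision delays are measured, switch to a
# simplified novice layout. Thresholds and layout names are assumed values.

def choose_layout(error_count: int, mean_decision_time_s: float) -> str:
    if error_count > 5 or mean_decision_time_s > 8.0:
        return "novice_layout"    # fewer, larger controls with guided help
    return "expert_layout"        # full control panel, no extra hints

print(choose_layout(error_count=7, mean_decision_time_s=4.2))  # novice_layout
print(choose_layout(error_count=1, mean_decision_time_s=3.0))  # expert_layout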

12. Network technologies and Internet connection

Hospitals and medical centers frequently operate IT systems that help manage and utilize databases of patients and medical infrastructure. This helps to store and exchange the information related to the activities of medical units. An ultrasound device with access to such systems may collect data about the patient and inform doctors, e.g. about previous studies and the course of the treatment history. New examination results and diagnoses are saved automatically in the database. This saves the time needed to obtain important information, prevents its loss, and may also help to reduce errors in the data transmission process. Common methods of patient identification use bar codes or RFID (Radio Frequency Identification) tags. Adding a GPS module to portable ultrasound devices would allow them to be located quickly and transferred to patients who require immediate diagnosis [4].
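A minimal sketch of looking up and storing examination data by a patient identifier read from a bar code or RFID tag; the data structure and all values below are fictional stand-ins for a real hospital information system.

# Illustrative sketch: looking up a patient's previous examinations by the
# identifier read from a bar code or RFID tag. The in-memory dictionary stands
# in for a hospital information system; all data shown is fictional.

PATIENT_DB = {
    "PAT-0001": {"name": "Example Patient", "previous_exams": ["2010-05-12 abdominal"]},
}

def fetch_history(patient_id: str) -> list:
    record = PATIENT_DB.get(patient_id)
    return record["previous_exams"] if record else []

def store_result(patient_id: str, description: str) -> None:
    PATIENT_DB.setdefault(patient_id, {"name": "unknown", "previous_exams": []})
    PATIENT_DB[patient_id]["previous_exams"].append(description)

print(fetch_history("PAT-0001"))
store_result("PAT-0001", "2011-10-22 cardiac")
print(fetch_history("PAT-0001"))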


13. Remote control and robot-assisted devices

The development of the aforementioned technologies and the introduction of broadband Internet and satellite communications allowed telemedicine to emerge. Initially, its use was limited to the real-time transmission of telemetry readings, ECG data, X-ray images, MRI, etc., allowing medical centers to diagnose and examine patients at a distance. The role of telemedicine is particularly important in emergency care, where it allows examination results to be transferred, e.g. via mobile phone, directly from the scene to the hospital, where the doctor on duty immediately provides further guidance on handling the patient. This makes it possible to prepare equipment in the treatment room well in advance, while the patient is on the way to hospital, and to give needed medications in the ambulance. Over time, telemedicine has also been used to perform remotely controlled surgical procedures and operations. Experiments in telemedicine have also been undertaken in the field of ultrasound [1]. Thanks to a system consisting of a specially designed mechanical arm located near the patient and a remote controller, it is possible to manipulate the probe precisely and perform a remote examination. The position of the probe is controlled not only by the doctor (operator) but also by a computer system which analyzes the position of the probe and the forces acting on it, the acquired ultrasound images (for example, it recognizes the examination path), and learned motion trajectories consistent with the chosen type of study. The presence of a physician (or an assistant) near the patient is not required during the examination, and the advanced model of computer support allows a fully valuable study to be performed completely remotely in real time. In addition, it spares the physician the fatigue that may be caused by holding the probe for a long time in an uncomfortable position with the force necessary for good visualization of the examination results.
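One safety-related aspect of such robot-assisted probe control, scaling the operator's command by the measured contact force, can be sketched as follows; the force limit and the scaling rule are assumptions made for illustration, not the method described in [1].

# Illustrative sketch of one safety aspect of robot-assisted probe control:
# the operator's commanded displacement is reduced as the measured contact
# force approaches a limit. The force limit and scaling rule are assumed values.

MAX_CONTACT_FORCE_N = 10.0   # assumed safety limit

def safe_displacement(commanded_mm: float, measured_force_n: float) -> float:
    """Scale the operator's command down as the contact force nears the limit."""
    if measured_force_n >= MAX_CONTACT_FORCE_N:
        return 0.0                       # stop pressing further
    scale = 1.0 - measured_force_n / MAX_CONTACT_FORCE_N
    return commanded_mm * scale

print(safe_displacement(2.0, 3.0))   # gentle contact: most of the command passes
print(safe_displacement(2.0, 9.5))   # near the limit: command is almost cancelled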

14. Summary

Introducing new technologies into the design of user interfaces for medical ultrasound devices provides promising opportunities to improve their usefulness and safety. Many of the proposed solutions already operate successfully in other areas, but in order to reduce the potential risk it is recommended to apply HCI techniques, including human factors considerations, throughout the whole process of designing and introducing the proposed new forms of interfaces and interactions (in accordance with the guidelines of the FDA, IEC and ISO).

15. Bibliography

[1] Abolmaesumi P., Salcudean S.E., Zhu W.H., DiMaio S.P., Sirouspour M.R., A User Interface for Robot-Assisted Diagnostic Ultrasound, Proceedings of the 2001 IEEE International Conference on Robotics and Automation (ICRA), 2001.

[2] Greenberg S., Marquardt N., Ballendat T., Diaz-Marino R., Wang M., Proxemic interactions: the new ubicomp?, Interactions, Volume 18 Issue 1, January-February 2011, ACM.

[3] Merrill D., Kalanithi J., Maes P., Siftables: Towards Sensor Network User Interfaces, Proceedings of the First International Conference on Tangible and Embedded Interaction (TEI'07), February 15-17, 2007, Baton Rouge, Louisiana, USA.

[4] Muto W.H., Israelski E.W., How new technologies can help create better UIs for medical devices, HCI'07: Proceedings of the 12th International Conference on Human-Computer Interaction: Applications and Services, 2007.

[5] Nishikawa A., Hosoi T., Koara K., Negoro D., Hikita A., Asano S., Kakutani H., Miyazaki F., Sekimoto M., Yasui M., Miyake Y., Takiguchi S., Monden M., Face MOUSe: A novel human-machine interface for controlling the position of a laparoscope, IEEE Transactions on Robotics and Automation, 2003.

[6] Vertegaal R., The (re)usability of everyday computational things: why industrial design will be the new interaction design, Interactions, Volume 18 Issue 1, January-February 2011, ACM.

[7] Wachs J.P., Kölsch M., Stern H., Edan Y., Vision-based hand-gesture applications, Communications of the ACM, Volume 54 Issue 2, February 2011.

[8] Wachs J., Stern H., Edan Y., Gillam M., Feied C., Smith M., Handler J., A hand-gesture sterile tool for browsing MRI images in the OR, Journal of the American Medical Informatics Association, Volume 15 Issue 3, May-June 2008.

[9] Zhang J., Johnson T.R., Patel V.L., Paige D.L., Kubose T., Using usability heuristics to evaluate patient safety of medical devices, Journal of Biomedical Informatics, Volume 36 Issue 1-2, February 2003.

[10] Wiklund M.E., Wilcox S.B., Designing Usability into Medical Products, CRC Press, 2005.

Author:

MSc. Marcin Wichrowski
Polsko-Japońska Wyższa Szkoła Technik Komputerowych
ul. Koszykowa 86, 02-008 Warszawa
tel. (022) 58 44 500, fax (022) 58 44 501

email: [email protected]