University of Mauritius
Guidance System for the Visually Impaired
Kishan Yashveer Bhugul
David Kwet Chin Young Ten
Project submitted in partial fulfillment of the requirements for the degree of
BSc (Hons.) Computer Science
Supervisor: Associate Professor Dr. Kavi Kumar Khedo
Department of Computer Science and Engineering Faculty of Engineering
April 2014
Guidance System for the Visually Impaired Persons Table of Contents __________________________________________________________________________________
Table of Contents
Chapter 1 Introduction
1.1 Problem Statement
1.2 Scope
1.3 Aims and Objectives
1.4 Contributions
1.5 Scheduled Plan
1.6 Individual Contribution
Chapter 2 Background Study
2.1 Context Awareness
2.1.1 Context Awareness and Visually Impaired Persons
2.1.2 Existing Context Aware Systems
2.2 Navigational Systems
2.2.1 Navigation Systems Technologies
2.2.2 Path Determination Approaches
2.2.3 Existing Navigational Systems for Visually Impaired Persons
Chapter 3 Analysis
3.1 Domain Analysis
3.2 Problem Analysis
3.2.1 Scenario
3.3 Technological Analysis
3.3.1 Mobile Device
3.3.2 Shake Input
3.3.3 Voice Recognition
3.3.5 Text-to-Speech Output
3.3.6 Haptic Feedback and Sound Alert
3.3.7 Global Positioning System (GPS)
3.3.8 Digital Compass
3.4 Navigation Systems Techniques Analysis
3.4.1 Choice of Navigation Systems Technique
3.5 Requirements Analysis
3.5.1 Functional Requirements
3.5.2 Non-Functional Requirements
Chapter 4 Design
4.1 List of Assumptions
4.2 System Design Issues
4.2.1 Usability
4.2.2 Performance
4.2.3 Responsiveness
4.2.4 Map Modelling
4.2.5 Voice Instruction
4.3 System Architecture Design
4.3.1 Structure Diagram
4.3.2 Overall Architecture
4.3.2.1 Administrator System Architecture
4.3.2.2 User System Architecture
4.3.3 Component Diagram
4.3.4 Overall Class Diagram
4.3.4.1 Class MapItem
4.3.4.2 Classes from Navigation Package
4.3.4.3 Class Graph and NodeEntry
4.3.4.4 Classes from Voice Package
4.3.4.5 Classes from Schedule Package
4.3.5 Sequence Diagrams
4.3.6 Activity Diagrams
4.4 Software Design
4.4.1 Path Determination Algorithm
4.4.2 Determination of User Orientation
4.4.3 Traversing KML for Obstacle and Places Information
4.4.4 Obstacle Detection Mechanism
4.4.5 Traversing of XML File for Weather Information
4.4.6 Power Consumption Management
4.4.7 Timer Alarm Schedule
4.4.8 Re-routing Mechanism
4.5 Interaction Design
4.5.1 Human Computer Interaction Strategies
4.5.2 Interaction Modes Design
4.5.2.1 Voice Feedback Mechanism
4.5.2.2 Voice Recording Mechanism
4.5.2.3 Shake-to-Respond Mechanism
4.5.2.4 Haptic Feedback Mechanism
4.6 Graphical User Interface Design
Chapter 5 Implementation
5.1 Implementation Issues
5.1.1 Platform
5.1.2 Compatibility
5.1.3 Robustness
5.1.4 Real-time Concurrency
5.1.5 File Access
5.1.6 Interface and Class
5.2 Standard and Convention
5.2.1 Conventions
5.3 Development Environment
5.3.1 Mobile Device Configuration
5.3.1.1 Sensors
5.3.1.2 Other Hardware
5.3.2 Software Tools
5.3.2.1 Google Drive Cloud Storage
5.3.2.2 Google Earth
5.3.2.3 GPS Status & Toolbox
5.4 Implementation of Modules
5.4.1 Safest Path Module
5.4.2 User Heading Module
5.4.3 Timer Alarm Module
5.4.4 Navigation Module
5.4.5 Shake-to-Respond Module
5.4.6 Voice Annotation Module
5.4.7 Re-routing Module
5.4.8 Graphical User Interface Module
5.5 Module Context-triggered Action
5.6 Difficulties Faced
Chapter 6 Integration and Testing
6.1 Unit Testing
6.2 Integration Testing
6.3 System Testing
6.3.1 Accuracy Testing
6.3.2 Performance Testing
6.3.2.1 Downloading Time v/s File Size
6.3.2.2 Items Voiced Out
6.3.2.3 Re-routing
6.3.2.4 Loading Time v/s File Size
6.3.3 Stress Testing
6.3.4 Shake Sensitivity
6.4 User Acceptance Test
6.4.1 Shake-to-Respond
6.4.2 Voice Feedback Mechanism
6.4.3 Obstacle Detection Mechanism
6.4.4 Voice Recording Mechanism
6.5 Debugging
Chapter 7 Critical Appraisal and Future Works
7.1 Achievements
7.2 Limitations
7.3 Future Works
7.3.1 Modern Technologies
7.3.2 Intelligent Algorithms
7.3.3 Routing Algorithms
7.3.4 More Sensors
Chapter 8 Conclusion
References
Appendix 1
Interview with a Blind Person
Lessons Learnt From a Blind Person
List of Tables

Table 1.1 Individual Contribution
Table 5.1 Inbuilt interfaces used in the project
Table 5.2 Some inbuilt classes used in the project
Table 5.3 Guidance System Application Minimum Requirements
Table 5.4 Module Context-triggered Action
Table 6.1 Important modules of the project
Table 6.2 Stages of integration testing
Table 6.3 Diverse capabilities of each person involved in the test
Table 6.4 Actual wordings and desired wordings by Ramlo
Table 6.5 List of main bugs found while testing the application
List of Figures
Figure 1.1 Gantt chart
Figure 2.1 Guide dog (left) and white cane (right) (R & R Associates 2012)
Figure 2.2 Screenshot of SmartRotuaari services (Ojala 2010)
Figure 2.3 Dijkstra’s Algorithm Sample (Cornell University 2009)
Figure 2.4 Graph Path Planning Approach (Kim 2012)
Figure 2.5 Grid Path Planning Approach (Kim 2012)
Figure 2.6 System Architecture of Obstacle Detection System (Cardin and Vexo 2005)
Figure 3.1 Interview with Mr. R. Appadoo (February 2014)
Figure 3.2 Screenshot of Google Maps, Quatre Bornes, Mauritius
Figure 4.1 Google Earth Sample
Figure 4.2 Exporting of places information to KML file
Figure 4.3 Home studio setup for voice recording
Figure 4.4 Model of top-down design for the project
Figure 4.5 Administrator System Architecture
Figure 4.6 User System Architecture
Figure 4.7 Split Components
Figure 4.8 System Classes Overview
Figure 4.9 Class MapItem
Figure 4.10 Navigation Package
Figure 4.11 Class Graph and class NodeEntry
Figure 4.12 Voice Package
Figure 4.13 Schedule Package
Figure 4.14 User Usage
Figure 4.15 Time and Weather Condition Voice Feedback
Figure 4.16 Obstacles close to the device
Figure 4.17 Annotating objects
Figure 4.18 Low Battery
Figure 4.19 Illustration of a map in the form of a graph
Figure 4.20 Earth showing a path from Baghdad to Osaka
Figure 4.21 User heading on the flattened map
Figure 4.22 Extract of KML file containing information about obstacles in Quatre-Bornes
Figure 4.23 Obstacle Information Extraction Algorithm from a KML File
Figure 4.24 Illustration of an obstacle between two places
Figure 4.25 Using the heading angle algorithm
Figure 4.26 Simple illustration when approaching an obstacle
Figure 4.27 Obstacle/Place Detection Mechanism algorithm
Figure 4.28 Extract of weather XML file for the region of Quatre Bornes
Figure 4.29 Reading weather information algorithm
Figure 4.30 Remaining battery life algorithm
Figure 4.31 Thirty-minute schedule algorithm
Figure 4.32 Re-routing algorithm
Figure 4.33 Voice feedback algorithm
Figure 4.34 Voice Recording Algorithm
Figure 4.35 Shake-to-respond algorithm
Figure 4.36 Haptic feedback mechanism algorithm
Figure 4.37 Screen design 1
Figure 4.38 Screen design 2
Figure 5.1 Screenshot of ‘Location and security’ settings on Android 2.3
Figure 5.2 Screenshot of the public Google Drive folder VIS App
Figure 5.3 Screen capture of GPS Status & Toolbox Android application
Figure 5.4 Beginning of getReverseSafestPath function
Figure 5.5 Part updating weightage in getReverseSafestPath function
Figure 5.6 Part fetching next node to update to least weightage
Figure 5.7 Part setting least weightage
Figure 5.8 Part reversing the safest path at the end of function getReverseSafestPath
Figure 5.9 getHeading Function Code Snippet
Figure 5.10 ScheduleAlarm Constructor Code Snippet
Figure 5.11 Part of onReceive function in class AlarmReceiver Code Snippet
Figure 5.12 Map Items Detection Code Snippet
Figure 5.13 Function onSensorChanged Code Snippet
Figure 5.14 Function record from class VoiceRecorder Code Snippet
Figure 5.15 Re-routing Code Snippet
Figure 5.16 Graphical Interface after pressing the Menu button on the mobile device
Figure 5.17 Graphical Interface after tapping on the ‘Choose Destination’ option
Figure 5.18 Graphical Interface after tapping on category ‘Manze ek Boir’
Figure 6.1 Number of satellites fixed v/s Error graph (the trend is shown dotted)
Figure 6.2 Real path v/s path recorded by the device
Figure 6.3 Downloading time v/s File Size (the trend is shown dotted)
Figure 6.4 Map with path travelled and map items
Figure 6.5 Map showing re-routing
Figure 6.6 Loading time into memory v/s File Size
Figure 6.7 Loading maximum map items in RAM Code Snippet
Figure 6.8 OutOfMemory Exception
Figure 6.9 Successful Attempts v/s Attempt Number
Figure 6.10 Evidence of user acceptance test with Mr. R. Appadoo
Preface
Chapter 1: Introduction
The first chapter introduces the thesis and gives a brief description of the project. The aims and objectives of the project are stated. It also describes some of the problems faced daily by the visually impaired. At the end of the chapter, a Gantt chart shows the planned evolution of the project.
Chapter 2: Background Study
This chapter is a literature review of the fields of navigation and context awareness. It covers the features of existing systems and the algorithms, techniques and technologies they use.
Chapter 3: Analysis
In this chapter, an in-depth study of the project domain is carried out. It includes interviews, a technological analysis and a review of navigation techniques, and concludes with a list of requirements for the new system.
Chapter 4: Design
The Design chapter deals with the logic of the system and includes algorithms, architectures and explanations. A detailed design of the different aspects of the project is presented, and various design issues are discussed.
Chapter 5: Implementation
This chapter describes the processes used in building the guidance system for the visually impaired. It presents the different units and modules with code snippets and clear explanations.
Chapter 6: Integration and Testing
This chapter covers the different techniques used to test the system, ensuring that it runs without any identified bugs. It also contains graphs that illustrate the test results.
Chapter 7: Critical Appraisal and Future Works
This chapter compares and contrasts the final solution with the initial requirements. The project achievements are included. The limitations of the project along with future works are also discussed.
Chapter 8: Conclusion
The final chapter concludes the project. It briefly summarises the work carried out and reviews the test results.
Acknowledgements
It is with great pleasure that we write these lines to express our sincere thanks to all those who helped us complete this project.
Firstly, we would like to express our sincere gratitude to our project supervisor. It was a privilege to have been guided by Associate Professor Dr. Kavi Kumar Khedo. His guidance, support, kindness, commitment and invaluable assistance have made this project possible.
We are also grateful to all the visually impaired persons who collaborated with us. A special thanks goes to Mr. Ramlo Appadoo, who without any hesitation agreed to meet us on several occasions to discuss navigation for the visually impaired.
We would, moreover, like to express our sincere gratitude and appreciation to
Mrs. Soulakshmee Devi Nagowah for her feedback, constructive criticism and advice on our
project during the poster presentation.
All this would not have been possible without the support of our family members, so our last words go to them. They have accompanied us throughout this long journey we call education and never once left our side or let us down. It is thanks to their precious and wise advice that we have reached where we are today, and we will continue to thrive with their blessings.
UNIVERSITY OF MAURITIUS
PROJECT/DISSERTATION DECLARATION FORM

Name: Kishan Yashveer Bhugul
Student ID: 111 80 80
Programme of Studies: BSc (Hons.) Computer Science
Module Code/Name: CSE3000Y(5) Project
Title of Project/Dissertation: Guidance System for the Visually Impaired
Name of Supervisor(s): Associate Professor Dr. Kavi Kumar Khedo

Declaration:
In accordance with the appropriate regulations, I hereby submit the above dissertation for examination and I declare that:
(i) I have read and understood the sections on Plagiarism and Fabrication and Falsification of Results found in the University’s “General Information to Students” Handbook (20…./20….) and certify that the dissertation embodies the results of my own work.
(ii) I have adhered to the ‘Harvard system of referencing’ or a system acceptable as per “The University of Mauritius Referencing Guide” for referencing, quotations and citations in my dissertation. Each contribution to, and quotation in my dissertation from the work of other people has been attributed, and has been cited and referenced.
(iii) I have not allowed and will not allow, anyone to copy my work with the intention of passing it off as his or her own work.
(iv) I am aware that I may have to forfeit the certificate/diploma/degree in the event that plagiarism has been detected after the award.
(v) Notwithstanding the supervision provided to me by the University of Mauritius, I warrant that any alleged act(s) of plagiarism during my stay as registered student of the University of Mauritius is entirely my own responsibility and the University of Mauritius and/or its employees shall under no circumstances whatsoever be under any liability of any kind in respect of the aforesaid act(s) of plagiarism.
Signature: Date: 01-APR-2014
UNIVERSITY OF MAURITIUS
PROJECT/DISSERTATION DECLARATION FORM

Name: David Kwet Chin Young Ten
Student ID: 111 37 65
Programme of Studies: BSc (Hons.) Computer Science
Module Code/Name: CSE3000Y(5) Project
Title of Project/Dissertation: Guidance System for the Visually Impaired
Name of Supervisor(s): Associate Professor Dr. Kavi Kumar Khedo

Declaration:
In accordance with the appropriate regulations, I hereby submit the above dissertation for examination and I declare that:
(i) I have read and understood the sections on Plagiarism and Fabrication and Falsification of Results found in the University’s “General Information to Students” Handbook (20…./20….) and certify that the dissertation embodies the results of my own work.
(ii) I have adhered to the ‘Harvard system of referencing’ or a system acceptable as per “The University of Mauritius Referencing Guide” for referencing, quotations and citations in my dissertation. Each contribution to, and quotation in my dissertation from the work of other people has been attributed, and has been cited and referenced.
(iii) I have not allowed and will not allow, anyone to copy my work with the intention of passing it off as his or her own work.
(iv) I am aware that I may have to forfeit the certificate/diploma/degree in the event that plagiarism has been detected after the award.
(v) Notwithstanding the supervision provided to me by the University of Mauritius, I warrant that any alleged act(s) of plagiarism during my stay as registered student of the University of Mauritius is entirely my own responsibility and the University of Mauritius and/or its employees shall under no circumstances whatsoever be under any liability of any kind in respect of the aforesaid act(s) of plagiarism.
Signature: Date: 01-APR-2014
Abstract
In recent years there have been many advancements in technologies that help
millions of visually impaired persons across the world in their day-to-day lives. Solutions
ranging from simple text-to-speech output to more complex ones such as Braille printers have been
developed. Research on improving the everyday life of a visually impaired person is
nevertheless still uncommon, and more work is needed to ease the lives of these persons.
As far as navigation solutions are concerned, it has been observed that relatively few
contributions have been made.
In this project, a guidance system for visually impaired persons has been developed for
Android mobile devices. Since the population of blind persons in Mauritius is quite small, there
is a lack of support and few infrastructures have been adapted for proper navigation with
the white cane. Through intensive research in this field, it has been concluded that blind people
in Mauritius have very limited mobility. The solution implemented encourages mobility by giving
turn-by-turn voice instructions in the Creole language and by using several kinds of context-awareness
information. A custom map has been built, taking into consideration the whole nearby
environment, including obstacles, places of interest, current weather information and, among others,
the time and day of the week. The system also routes the user from his/her current position to his/her
destination following the safest path, based on complex parameters such as dynamic context
information. To allow interaction with the system anywhere and without any specific knowledge,
new human-computer interactions for visually impaired persons have been developed, such as
the shake-to-respond input.
After testing the system with collaborating visually impaired persons, it can be concluded
that, depending on the GPS device accuracy, good results are obtained. Context-awareness
information is presented in such a way that it allows the visually impaired person to “see”
his/her surroundings. Context-triggered actions, such as performing rerouting when the user heads
in another direction or dealing with constraints on the path, have also been implemented and tested.
List of Abbreviations
Abbreviation Meaning
CPI Content Provider Interface
CSV Comma-separated Values
EDGE Enhanced Data Rates for GSM Evolution
EOA Electronic Orientation Aids
ESRI Environmental Systems Research Institute
GPRS General Packet Radio Service
GPS Global Positioning System
IDE Integrated Development Environment
KB Kilobytes
KML Keyhole Markup Language
kNN k Nearest Neighbor
LAN Local Area Network
mAh Milli Ampere hour
MB Megabytes
MCB Mauritius Commercial Bank
MP3 MPEG-2 Audio Layer III
OS Operating System
POI Place of Interest
RAM Random Access Memory
RFID Radio Frequency Identification
SD Card Secure Digital Card
SDK Software Development Kit
SIM Subscriber Identity Module
URL Uniform Resource Locator
USB Universal Serial Bus
VIS Visually Impaired System
WLAN Wireless Local Area Network
XML Extensible Markup Language
Chapter 1 Introduction
It is difficult for a sighted person to imagine how different and difficult daily experiences
are without the ability to see. Fortunately, the challenges associated with visual impairments
are being addressed at a rapid pace by modern technology.
It is now possible to use technology to make a person with visual impairments more aware
of his/her surroundings. Researchers such as Anind K. Dey and Gregory D. Abowd have
described general approaches to making people feel better in their current environment through
technology, including providing additional information about the person's surroundings
(Dey and Abowd 1999).
This project is based on developing a guidance system for the visually impaired by using
navigation techniques and context awareness information.
1.1 Problem Statement
The most important travelling aid for the visually impaired person is still the white cane.
It is after all an excellent example of a good travelling aid as it is multifunctional, cheap and
reliable. It also signals to others that the person is visually impaired. Studies on navigation for
visually impaired persons (Abdelasalam et al. 2001, p.2-5) have noted that even a small
amount of extra information about the environment yields a remarkable increase in performance.
Therefore, the provision of extra information that can help visually impaired people needs to
be investigated.
Nowadays, modern technologies are within everyone's reach. It is therefore possible to
use them to develop a reliable tool that efficiently augments the user's current navigation
techniques.
1.2 Scope
The project will focus on implementing not a replacement for the white cane but a system
that augments it. The implemented system will be used by visually impaired persons in
Mauritius. A generic system will be developed; however, to demonstrate the capabilities of the
system, the region of Quatre Bornes will be modeled. Different aspects of navigation will be
taken into consideration, including computing the safest path. Context awareness will also be an
important part of this project: context information will be used both to devise an appropriate
path and to provide the user with useful information so as to ease navigation. Therefore, both
navigation techniques and context awareness will be studied.
1.3 Aims and Objectives
When a visually impaired person walks from one location to another, he/she lacks
many useful inputs such as weather conditions, the time and, among others, obstacles. The goal of the
project is to develop a system that augments a visually impaired person's pedestrian experience with
enough information to make him/her feel more comfortable on a walk from one location to another.
The system should constantly guide the blind user based on static and dynamic data.
Environment conditions and landmark information queried from a spatial database along the
route are provided on the fly through detailed explanatory voice cues.
Moreover, the system is expected to enable users to obtain spoken GPS navigational
information through intuitive speech output. The prototype system can tell the user where
he/she is currently located and provide spoken directions to travel to a destination.
In this project, it is important to meet blind people and discuss possible solutions with
them, to research navigation and context awareness, and finally to develop such a system.
1.4 Contributions
After focusing research on blind people, context awareness and navigation techniques,
time was spent with blind persons until it was understood how they travel from one place to
another. It then became easier to think about what features the system should present to the
user.
First of all, a custom map has been built taking into consideration all objects useful to a
blind person, such as obstacles. The backend part of the system has then been implemented using
cloud storage to ease the map update process. The map provided is generic and can be built
independently for different regions by adding objects such as obstacles and places of interest
along with their descriptions.
Furthermore, the map consists of nodes and objects such as obstacles and buildings, and
Dijkstra's algorithm has been modified in a way that allows the computation of the safest path.
Weather information and other context information are used in the computation of the navigation
path. When navigating, it is important to know the moving direction of the user; for this purpose,
an orientation algorithm has been developed to flatten the map in order to obtain the proper user
heading.
Also, several interaction mechanisms for visually impaired persons have been researched
and developed. These include the shake-to-respond interaction, haptic feedback, voice output of
context-aware information and, among others, sound alerts.
The solution has been implemented in Java and can be used on any mobile device
running the Android operating system.
1.5 Scheduled Plan
The Gantt chart below shows an estimate of the time taken to complete the whole
project.
Figure 1.1 Gantt chart
1.6 Individual Contribution
This is a group project involving different tasks. The task distribution is shown in the
table below.
Task
Contributions
BHUGUL Kishan
(111 80 80)
YOUNG TEN David
(111 37 65)
Research and Analysis
Navigation
Shake-to-Respond
Map Modelling
Haptic feedback
Voice out
Voice Annotation
Weather Reading
30-Minutes Information
Emergency Mode
Path determination
Button Input
Graphical User Interface
GPS Heading
Current Time
Report Writing
Table 1.1 Individual Contribution
Chapter 2 Background Study
In this chapter, research papers and other work related to this project are reviewed. A
deep analysis of the tools and technologies described in these papers is included. Main terms
such as context awareness and navigation systems are explicitly defined.
2.1 Context Awareness
In this section, the field of context awareness is introduced. Context awareness refers
to the identification of nearby people and objects, and of changes to these objects (Schilit and
Theimer 1994). Generally, context awareness involves every person and his/her surroundings,
since both are in perpetual change as time passes.
With people spending long periods in environments such as the home and the workplace,
it has become important to analyze how context awareness can support them in their tasks and
jobs.
For instance, when an individual enters his/her house, the air conditioner/heater is set to
his/her preferred temperature, without the need to use a remote control or walk to the device.
Another example: “Lights, chairs and tables automatically adjust as soon as the family gathers
in the living room to watch TV” (Meyer 2003, p.2). This is achieved by means of heat sensors
which detect the body temperature of the persons in a particular room (Meyer 2003, p.2).
Since the context in which a person is situated changes rapidly, these changes have to be taken
into consideration when designing a system, especially one that requires many interactions. In
context awareness, mobility is an important aspect, since the context varies depending on where
one is located. With a wide range of possible user scenarios, the services of the system need a
way to adapt appropriately and implicitly. For instance, context awareness has to be treated
differently at home and at the workplace. At work, for example, productivity has to be optimized,
and one way to increase throughput is better communication between the departments of
the company and the employees. To achieve this objective, the computers can be interconnected
in a local area network fashion.
2.1.1 Context Awareness and Visually Impaired Persons
Context awareness can help visually impaired persons by making them aware of obstacles
and of where they are situated. This is a crucial aspect while travelling from one place to another,
as blind persons are not fully aware of the possible dangers around them. Their other senses,
such as touch, hearing and smell, are therefore heavily used to build an ‘image’ of their location.
It is sometimes difficult even for a sighted person to be completely aware of his/her surroundings
in an unknown area; it is thus much more difficult for visually impaired persons.
Visually impaired persons generally make use of a white cane or a guide dog, shown in
figure 2.1, to get more information about their surroundings. However, these two aids do not
provide much information that significantly improves context awareness. Visually impaired
persons who do not have access to context-aware technologies, such as GPS, often prefer to
rely on repetitive and regular situations, that is, their past experiences. Unfortunately, they may
then not be aware of unexpected hazards. This is where voice recognition and synthesis can
help and make them more conscious of the hazards.
Figure 2.1 Guide Dog (left) and white cane (right) (R & R Associates 2012)
In many countries, warning bells begin ringing when the green pedestrian light appears.
A proximity sensor detecting approaching vehicles, together with a camera reading the color of
the signal lamp, makes crossing safer, since a vehicle may still be moving towards the crossing
area even though the red signal lamp for vehicles is lit.
Technologies such as the Global Positioning System and Geographic Information Systems
are used for navigation purposes and can offer contextual information to visually impaired
persons. Optimized routes can be computed based on user preferences and constraints such as
traffic congestion and dynamic obstacles. To get more information about the environment and
landmarks of the area where the person wants to navigate, a spatial database can be queried and
its output delivered through voice cues (Abdelasalam et al. 2001, p.1).
2.1.2 Existing Context Aware Systems
This section introduces various existing context-aware systems, focusing on the
technologies they use. Various approaches are also discussed and important aspects of
context-aware computing are analyzed.
1. Context-Aware Homes
Context-aware homes (Meyer 2003, p.2-3) use context-aware technologies such as
actuators, sensors and wireless networks. For example, “Phones only ring in rooms where the
addressee is actually present, preventing other people being disturbed by useless ringing”
(Meyer 2003, p.2). Another example is “The music being played in a room adapts automatically
to the people within and the pictures in the frames on the desk change depending on which
person is working there” (Meyer 2003, p.2). This can be achieved using smart sensors
containing microprocessors; infrared sensors, for example, are used in context-aware homes
mainly to detect the presence of human bodies.
2. ContextAlert
ContextAlert (Phithakkitnukoon and Dantu 2010, p.1) is a mobile context-aware system
designed for Android phones that uses components such as the accelerometer, GPS and
microphone. The system can, for example, switch the phone to vibrate mode during a meeting
or at a movie theater. This is achieved by using the accelerometer and GPS to determine the
user's movement and location respectively. From these parameters, the speed of movement is
calculated, the system predicts the user's situation and switches to the corresponding mode.
Thus, if the speed is above a given value, the user may be driving and the phone is automatically
switched to hands-free mode.
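As an illustration of this speed-based inference, the following Java sketch maps a measured speed to a coarse activity mode. The class name and threshold values are illustrative assumptions of ours, not values taken from the ContextAlert paper.

```java
/** Sketch (thresholds are assumptions, not from the ContextAlert paper):
    infer a coarse activity from the user's movement speed. */
public class ActivityGuess {
    enum Mode { STILL, WALKING, DRIVING }

    /** Guess the activity mode from a speed given in km/h. */
    static Mode fromSpeed(double kmh) {
        if (kmh < 1.0) return Mode.STILL;     // barely moving
        if (kmh < 15.0) return Mode.WALKING;  // pedestrian speeds
        return Mode.DRIVING;                  // likely in a vehicle
    }

    public static void main(String[] args) {
        System.out.println(fromSpeed(50.0)); // prints DRIVING
    }
}
```

A real system would smooth the speed over a window of GPS fixes before classifying, since single readings are noisy.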
3. SmartRotuaari
SmartRotuaari (Ojala 2010, p.2-4) is a wireless Internet service that provides context-aware
mobile multimedia information through a web-based Content Provider Interface, as shown in
figure 2.2. Its development started in 2002, motivated by the needs of companies and consumers
for mobile services; however, surveys, workshops and discussions showed that small retailers
were not aware of these services.
Figure 2.2 Screenshot of SmartRotuaari services (Ojala 2010)
2.2 Navigational Systems
A navigation system is a system, usually electronic, that aids navigation. Navigation
systems generally consist of one or several signal-emitting devices and a signal-receiving device
for each user – generally one sender and several receivers. Technologies ranging from radio
frequency identification to the Global Positioning System can be used to provide information
such as position and environment details.
2.2.1 Navigation Systems Technologies
The four most used navigation system technologies are described and briefly analyzed
below.
1. Radio Frequency Identification (RFID)
RFID tags are used in many current navigation systems and can be either passive or
active; some systems use both. Active RFID tags contain a battery and transmit signals
automatically. They have a large range, which reduces the number of tags that need to be
installed. Their drawback is that they require maintenance, as batteries need to be replaced.
Passive tags do not require a battery and are powered by the RFID reader in order to transmit.
While passive tags are much less expensive, they have a much shorter range and store much
less data. RFID tags can store an identifier, or location information may be embedded in the
tag itself. Active tags can store up to 128 KB while passive tags typically store less than 128 B.
RFID tags themselves are relatively inexpensive, but installing them in large environments may
be costly, since they need to be embedded in walls or objects, for example light poles, where
users can sense them with appropriate sensors (Fallah 2012, p.5).
2. Ultrasound Identification (USID)
USID uses emitters that broadcast short-wavelength ultrasound waves. Emitters are
installed in the infrastructure and the user carries a receiver on each shoulder. The time
difference between the ultrasound signals received from the two emitters closest to each receiver
is used to locate the user; with a receiver on each shoulder, the user's orientation can also be
calculated. Other systems have the user carry the emitter while the receivers are installed in the
environment, so the user's location is determined centrally. A disadvantage of ultrasound is that
walls may reflect or block the signals, resulting in less accurate localization. Another drawback
is that a line of sight is required between the receivers and emitters (Fallah 2012, p.6).
3. Bluetooth beacons
Bluetooth beacons can also be used for localization. The user has to walk more slowly
than with other techniques because of the device delay. Bluetooth beacons require a power
source and hence need to be maintained. Similar to RFID localization, a change in infrastructure
is one of the disadvantages of this technique, since the receivers or emitters need to be installed
along the path (Fallah 2012, p.6).
4. Global Positioning System (GPS)
The Global Positioning System is a space-based satellite navigation system that provides
location and time information in all weather conditions, anywhere on or near the Earth where there
is an unobstructed line of sight to GPS satellites. It is maintained by the United States government
and is freely accessible to anyone with a GPS receiver. A GPS receiver's job is to locate four or
more of these satellites, figure out the distance to each, and use this information to deduce its own
location. This operation is based on a mathematical principle called trilateration. However,
GPS signals are not completely accurate: obstacles such as buildings and trees can deflect the
signal, causing the reported position to be off by as much as three meters. Atmospheric
conditions (such as geomagnetic storms caused by solar activity) may also affect GPS accuracy.
For highway driving, this can cause a driver to miss a turn or exit (Guier and Weiffenbach 1997).
2.2.2 Path Determination Approaches
The main objective of a navigation system is to generate the best path in terms of distance,
obstacles and other useful parameters. Some relevant algorithms are described in this section.
1. Dijkstra's Algorithm
Dijkstra’s Algorithm can be used to calculate the shortest path from one point to
another. In navigation systems, however, the shortest path is generally not the safest one, and
the shortest path does not necessarily take the least time to travel. Furthermore, shortest
paths can be complex, with many turns, and people can get lost if, for instance, they do not turn
at the right place (Fallah et al. 2012, p.10). Figure 2.3 shows a sample diagram of Dijkstra’s
Algorithm.
Figure 2.3 Dijkstra’s Algorithm Sample (Cornell University 2009)
The pseudocode below illustrates Dijkstra’s Algorithm.
    S = ∅                                        // set of settled nodes
    P[v] = none for all nodes v                  // predecessor of each node
    C[start] = 0; C[v] = ∞ for all other nodes v // cost from the start node

    while (not all nodes are in S) {
        find the node K not in S with the smallest C[K]
        add K to S
        for each node J not in S adjacent to K {
            if (C[K] + cost(K, J) < C[J]) {
                C[J] = C[K] + cost(K, J)
                P[J] = K
            }
        }
    }
(Padua-Perez and Pugh 2006)
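The pseudocode can be realized compactly in Java, the language used for this project. The sketch below is illustrative (class and method names are our own, not from the dissertation) and uses a priority queue instead of a linear scan to find the smallest C[K]:

```java
import java.util.*;

/** Illustrative sketch: Dijkstra's algorithm over an adjacency map of
    edge costs, using a priority queue of (node, tentative cost) pairs. */
public class Dijkstra {
    /** Returns the cost C[v] from 'start' to every reachable node. */
    static Map<String, Double> shortestCosts(
            Map<String, Map<String, Double>> adj, String start) {
        Map<String, Double> cost = new HashMap<>();   // C[] in the pseudocode
        cost.put(start, 0.0);
        PriorityQueue<Object[]> pq =
                new PriorityQueue<>(Comparator.comparingDouble(e -> (double) e[1]));
        pq.add(new Object[]{start, 0.0});
        Set<String> done = new HashSet<>();           // S in the pseudocode
        while (!pq.isEmpty()) {
            Object[] top = pq.poll();
            String k = (String) top[0];
            if (!done.add(k)) continue;               // already settled
            for (Map.Entry<String, Double> e :
                    adj.getOrDefault(k, Map.of()).entrySet()) {
                double alt = cost.get(k) + e.getValue();
                if (alt < cost.getOrDefault(e.getKey(), Double.POSITIVE_INFINITY)) {
                    cost.put(e.getKey(), alt);        // relax the edge (K, J)
                    pq.add(new Object[]{e.getKey(), alt});
                }
            }
        }
        return cost;
    }

    public static void main(String[] args) {
        Map<String, Map<String, Double>> adj = new HashMap<>();
        adj.put("A", Map.of("B", 1.0, "C", 5.0));
        adj.put("B", Map.of("C", 2.0));
        System.out.println(shortestCosts(adj, "A").get("C")); // prints 3.0
    }
}
```

Keeping a predecessor map alongside `cost`, as P[] does in the pseudocode, would allow the actual path to be reconstructed, not just its cost.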
The best route is the one with the fewest obstacles or other hazards that can endanger the
traveller; for example, a path that bypasses busy lanes can be longer but safer. The path
generated from the graph after applying Dijkstra’s algorithm must be translated into directions
for the person to understand and follow in order to reach the destination. The computed path is
a sequence of vertices. As long as consecutive vertices lie on a straight line, the user continues
straight ahead; when the direction of the line changes, there is a turning point. At a turning point
where an intersection arises, the user is given the required commands. These steps are repeated
until the user arrives at the destination.
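The turning-point detection described above can be sketched in Java as follows. The class name, the flat-map coordinates and the angle threshold are illustrative assumptions, not details from the dissertation:

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative sketch: detect turning points in a computed path by
    comparing the heading into and out of each vertex. */
public class TurnDetector {
    /** Bearing in degrees from (x1, y1) to (x2, y2) on a flat map. */
    static double bearing(double x1, double y1, double x2, double y2) {
        return Math.toDegrees(Math.atan2(y2 - y1, x2 - x1));
    }

    /** Indices of vertices where the path changes direction by more
        than 'threshold' degrees. */
    static List<Integer> turningPoints(double[][] path, double threshold) {
        List<Integer> turns = new ArrayList<>();
        for (int i = 1; i < path.length - 1; i++) {
            double in  = bearing(path[i-1][0], path[i-1][1], path[i][0], path[i][1]);
            double out = bearing(path[i][0], path[i][1], path[i+1][0], path[i+1][1]);
            double diff = Math.abs(out - in) % 360;
            if (diff > 180) diff = 360 - diff;   // smallest angle between headings
            if (diff > threshold) turns.add(i);
        }
        return turns;
    }

    public static void main(String[] args) {
        // Straight east, then a 90-degree turn north at vertex 2
        double[][] path = {{0, 0}, {1, 0}, {2, 0}, {2, 1}, {2, 2}};
        System.out.println(turningPoints(path, 30.0)); // prints [2]
    }
}
```

In a real system the voice command ("turn left", "turn right") would be chosen from the sign of the heading change at each detected vertex.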
2. Graph Path Planning Approach
To plan paths, a graph path planning approach (Fallah et al. 2012, p.10-11) can be used.
The environment is divided into nodes and edges. Depending on the limitations, each node can
represent an object such as an intersection. Each edge is then given a weight depending on
different criteria, such as how dangerous the corresponding segment is to navigate; for
example, edges with stairs are given a higher weight. Figure 2.4 illustrates the graph approach.
Figure 2.4 Graph Path Planning Approach (Kim 2012)
Graphs also model many other domains; in a discussion forum, for instance, users
interacting about a topic or problem are related by the topic of the discussion (Tiwari et al.
2013, p.1), and a graph approach can be used to model these relations.
Benefit:
• Nodes and edges are created only where there are objects.
Constraints:
• Graphs can become large if there are many objects linked together.
• Creating graphs for a dynamic environment is complex and requires much processing, since
the graph has to be updated whenever there is a single change in the environment.
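The weighting of edges by danger can be sketched as follows. The hazard factor, node names and weight formula are illustrative assumptions, not the dissertation's actual scheme:

```java
import java.util.HashMap;
import java.util.Map;

/** Illustrative sketch: a safety-weighted graph in which hazardous
    features such as stairs inflate an edge's cost. */
public class SafetyGraph {
    // node -> (neighbour -> edge weight)
    final Map<String, Map<String, Double>> adj = new HashMap<>();

    /** Add an undirected edge; hazard > 0 inflates the base distance. */
    void addEdge(String a, String b, double distance, double hazard) {
        double weight = distance * (1.0 + hazard);
        adj.computeIfAbsent(a, k -> new HashMap<>()).put(b, weight);
        adj.computeIfAbsent(b, k -> new HashMap<>()).put(a, weight);
    }

    public static void main(String[] args) {
        SafetyGraph g = new SafetyGraph();
        g.addEdge("corner", "shop", 100.0, 0.0);   // flat pavement, no hazard
        g.addEdge("corner", "stairs", 80.0, 0.5);  // stairs: +50% cost
        // The stairs edge, though physically shorter, costs more (120 vs 100)
        System.out.println(g.adj.get("corner").get("stairs"));
    }
}
```

A shortest-path search over these weights then naturally prefers the safer, flatter route even when it is physically longer.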
3. Grid Path Planning Approach
Another approach is grid path planning (Fallah et al. 2012, p.10-11). In this approach,
the environment is divided into cells and each cell contains information about the objects at
that position, such as the terrain type; from the terrain type, the cell can be classified as
traversable or not. For instance, if there is vegetation or snow, the cell is classified as
non-traversable. The degree of traversability can be determined from the height of the terrain
compared to a reference level, which can be set by where the person is actually located.
Figure 2.5 illustrates this approach.
Figure 2.5 Grid Path Planning Approach (Kim 2012)
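The cell classification described in this approach can be sketched in Java as follows. The terrain types and the step threshold are illustrative assumptions, not values from the cited work:

```java
/** Illustrative sketch: classify a grid cell as traversable from its
    terrain type and its height relative to the user's reference level. */
public class GridCell {
    enum Terrain { PAVEMENT, VEGETATION, SNOW }

    final Terrain terrain;
    final double heightAboveUser; // metres relative to the user's level

    GridCell(Terrain terrain, double heightAboveUser) {
        this.terrain = terrain;
        this.heightAboveUser = heightAboveUser;
    }

    /** Non-traversable if covered by vegetation or snow, or if the
        terrain deviates too far from the user's reference level. */
    boolean isTraversable(double maxStep) {
        if (terrain == Terrain.VEGETATION || terrain == Terrain.SNOW) {
            return false;
        }
        return Math.abs(heightAboveUser) <= maxStep;
    }

    public static void main(String[] args) {
        GridCell kerb = new GridCell(Terrain.PAVEMENT, 0.1);
        System.out.println(kerb.isTraversable(0.2)); // prints true
    }
}
```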
Constraints:
• If a cell is much larger than the objects in it, much more information must be stored
for that particular cell.
• If a cell does not contain many objects, it may not hold valuable information but must
still be part of the path-planning process.
2.2.3 Existing Navigational Systems for Visually Impaired Persons
Some specialized navigation systems for visually impaired persons already exist.
Since this project is a navigation system, four popular systems are analyzed.
1. Drishti
Drishti (Abdelasalam et al. 2001, p.2-5) is a pedestrian navigation system for visually
impaired and disabled persons. It guides users to their chosen destinations through voice
direction feedback and can also be used for tourist guidance. Voice is output from the device
on which the system is installed. Along the route, warnings of nearby obstacles such as ramps
or stairs are given. If other paths are possible, the user can choose a route on which he/she will
be more comfortable. Another feature is that the user can attach notes to his/her current
position; for instance, a note about an obstacle discovered on a walk, for future reference.
2. Obstacle Detection System of Visually Impaired People
The Obstacle Detection System (Cardin and Vexo 2005, p.1-3) cautions the user about
neighbouring obstacles while wandering. This is done using a stereoscopic sonar system; a
haptic response then tells the user the location of the obstacle. Animate obstacles may produce
noise, so non-sighted people can infer the positions of such objects; static obstacles, however,
produce no sound, so touch has to be used and the person must be close to the item. Figure 2.6
illustrates the architecture of the system.
Figure 2.6 System Architecture of Obstacle Detection System (Cardin and Vexo 2005)
Since haptic feedback is used, the user's hearing is not occupied when interacting with the
system and remains available for other purposes, such as listening to the surroundings. The sonar
sensor has two ultrasonic transducers attached together: one emits an ultrasonic wave while the
other captures its echo. From the time at which the signal is emitted and the time at which the echo
is received, the distance to the nearest obstacle is calculated.
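The time-of-flight calculation described above can be sketched as follows. The `SonarRange` class and the 343 m/s speed-of-sound constant are illustrative assumptions, not details of the cited system:

```java
public class SonarRange {
    // Approximate speed of sound in air at ~20 °C, in m/s (assumed constant here).
    static final double SPEED_OF_SOUND = 343.0;

    // The ultrasonic pulse travels to the obstacle and back, so the
    // one-way distance is half the distance covered during the delay.
    public static double distanceMeters(double echoDelaySeconds) {
        return SPEED_OF_SOUND * echoDelaySeconds / 2.0;
    }
}
```

For an echo delay of 10 ms, this gives roughly 1.7 m to the nearest obstacle.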
3. Voice Operated Outdoor Navigation System for Visually Impaired Persons
The voice operated outdoor navigation system (Koley and Mishra 2012, p. 1-3) is portable
and is intended to be useful in congested, densely populated places. Using a low-power GPS
module together with voice output and an ultrasonic sensor, the user is given the information
needed to avoid obstacles along his/her way. A joystick is used as input to choose a direction, and
a speaker or headphones are used as output. For instance, if the East direction is chosen,
information about that direction is played via the speaker/headphones. The GPS receiver obtains
the latitude and longitude of the user's current position, which is then looked up in a CSV file on
the SD card. Places are stored on the SD card and played back to the user from audio files. If there
is an obstacle near the retrieved location, a warning is played back.
Chapter 3 Analysis
The goal of this chapter is to derive a list of functional and non-functional requirements for
the system. This is done by analyzing the domain and the system to be developed. Furthermore,
some of the tools and technologies, mentioned in the previous chapter, are analyzed and the most
appropriate ones are chosen for the project.
3.1 Domain Analysis
In this section, the everyday life of visually impaired persons is investigated. The goal of
the domain analysis for this project is to better understand how visually impaired persons navigate
in different situations and using different kinds of instruments.
There are two categories of visually impaired persons that should be considered, namely
partially blind persons and fully blind persons. Both categories need equipment to navigate from
one place to another, whether indoors or outdoors. Visually impaired persons sometimes need
special education to be able to move around; this kind of education and training is often very
costly, and its purpose is to help visually impaired persons adapt to any kind of environment. The
main piece of equipment used is the white cane, whose main purpose is to detect nearby objects
such as corners, walls and other obstacles.
Apart from touch, hearing is another sense that is often used, to detect possible dangers in
the surroundings such as the sound of vehicles or of people walking. Sometimes the sense of smell
is used to locate a fire, telling the person that he/she should not go in that particular direction.
In an interview with Mr. Ramlo Appadoo, a blind person, on Monday 3rd February 2014,
additional information on the perspective of a blind person was obtained.
Figure 3.1 Interview with Mr. R. Appadoo (February 2014)
From left to right, Mr. Ramlo Appadoo, Kishan Bhugul and, David Young Ten at Association Foyer Vivre Debout at Curepipe, Mauritius.
The interview took place at 'Association Foyer Vivre Debout'. The interviewee is currently
the President of the association, which shows that he remains an active person although he is blind.
He was not using his white cane to walk indoors, because he is used to walking from room to room
in the building. He could even skip a step in a room without probing for an obstacle. He said that
in order to travel outdoors, a visually impaired person should first be able to navigate indoors
easily. While travelling outdoors, blind persons sometimes need help from people around them.
For instance, at a zebra crossing, a blind person must ask whether it is safe to cross the road. Help
is often sought when the traffic light is not equipped with a buzzer. Details of the interview, in
question-and-answer form, are found in Appendix 1.
3.2 Problem Analysis
In this section the difficulties that visually impaired persons come across in their day to
day life are analyzed.
Very often, blind persons have difficulty navigating outdoors. A visually impaired person
may find himself/herself in surroundings that are unknown to him/her. In this case, the person
moves more slowly, step by step, using a white cane or a guide dog. Dogs are at least partially
color blind and cannot reliably distinguish between the red and green of traffic lights. Guide dogs
must undergo training, and their owners must take care of them and feed them appropriately.
Compared to sighted persons, who can see where they are walking, blind persons have difficulty
judging the ground level and its inclination. If a slope is too steep, a blind person may fall and hurt
himself/herself. Visually impaired persons can also easily slip on wet grass or a puddle of water
on the road.
3.2.1 Scenario
The goal of this project is to design a system that augments a visually impaired person's
pedestrian experience with enough information to make him/her feel comfortable on a walk from
one location to another. The user can be made to feel more at ease by providing him/her with
static and dynamic context information. This information includes precise locations,
environmental conditions and landmarks that can be queried from a spatial database hosted in the
Cloud.
For demonstration purposes, a very popular and busy area in Quatre Bornes, Mauritius, has
been chosen. Figure 3.2 on the next page shows an approximation of the selected area (red
rectangle), along with some places of interest already provided by Google Maps.
Figure 3.2 Screenshot of Google Maps, Quatre Bornes, Mauritius
As much information as possible about this selected area will be collected to create a spatial
database, which will then be stored in the Cloud. This will involve visiting the selected area at
different times of the day with a GPS device and collecting the precise coordinates of all places
of interest.
3.3 Technological Analysis
In this section, an analysis is done of the technologies available for navigation and
interaction systems for visually impaired persons. The objective is to determine which
technologies will be used for the system.
3.3.1 Mobile Device
Smartphones are equipped with components such as Wi-Fi and GPS that are useful for the
proposed system. The built-in processor is also of great use for this project, since it allows complex
instructions to be executed on the device itself. Other devices such as tablets offer the same
possibilities and are generally more powerful, with more functionality. However, some tablets
cannot make phone calls, and most of them are bigger than mobile phones.
Both modern mobile phones and tablets have audio capabilities, which are very helpful for
visually impaired persons. For such users, most communication with the mobile phone is done via
the built-in microphone and/or speaker. Another built-in component is the camera; with the
advancements in technology, image processing can be done on images taken with the camera. On
the other hand, if the mobile device does not have enough resources, the processing may have to
be done remotely. In that case an internet connection is needed, or the image has to be transferred
digitally to another machine for processing.
3.3.2 Shake Input
Shake input became popular with the arrival of smartphones in 2007. It generally makes
use of an accelerometer, a dynamic sensor capable of a wide range of sensing. Available
accelerometers can measure acceleration along one, two, or three orthogonal axes. They are
typically used in one of three modes:
• As an inertial measurement of velocity and position,
• As a sensor of inclination, tilt, or orientation in 2 or 3 dimensions, as referenced from the
acceleration of gravity (1 g = 9.8 m/s2),
• Or as a vibration or impact (shock) sensor.
The shake input method, based on the accelerometer sensor, has mostly been used for
functionalities like shuffling music on smartphones and MP3 players, or for gaming, as on the
Nintendo Wii console. The shake method can also be used to trigger functionalities for a blind or
visually impaired person; for example, one shake could voice out the time and two shakes the
date.
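A rough sketch of how accelerometer readings can be turned into a shake trigger is shown below. The threshold value and class name are assumptions for illustration; a real Android implementation would read the three axis values from a SensorEventListener:

```java
public class ShakeDetector {
    // Acceleration magnitude (m/s^2) above which a reading counts as a
    // shake; the 15.0 value is an illustrative assumption to be tuned.
    static final double SHAKE_THRESHOLD = 15.0;

    // Returns true when the magnitude of the acceleration vector
    // (x, y, z in m/s^2, gravity included) exceeds the threshold.
    public static boolean isShake(double x, double y, double z) {
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        return magnitude > SHAKE_THRESHOLD;
    }
}
```

At rest the magnitude stays near gravity (about 9.8 m/s²), so only a deliberate jerk crosses the threshold.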
3.3.3 Voice Recognition
One way to communicate with a computer is to speak to it. With voice recognition
software, the right hardware, and some time and patience, it is possible to train the computer to
recognize dictated text and issued commands. Microsoft's Windows XP operating system,
introduced in 2001, already offered a simple way of training the computer to recognize speech.
Today the technology has become much more powerful; one of the most widely used systems is
Google Now by Google, which makes use of natural language.
Voice recognition technology can be of great help to blind or visually impaired people:
they can say "Call mum", and the voice is analyzed and the action triggered within seconds.
However, the use of this technology requires a fast internet connection and works best in a
noise-free place.
3.3.4 Refreshable Braille Display
Braille displays provide access to on-screen information by electronically raising and
lowering different combinations of pins in braille cells. A display can show up to eighty characters
from the screen, and its content changes continuously as the user navigates using the command
keys on the device. The advantages of braille displays over synthetic speech are that they provide
direct access to information, allowing the user to check format, spacing, and spelling, and that
they are quiet.
The software that controls the display is called a screen reader. It gathers the content of
the screen from the operating system, converts it into braille characters and sends it to the display.
Screen readers for graphical operating systems are especially complex, because graphical elements
like windows or slide bars have to be interpreted and described in text form. Modern operating
systems usually have an Application Programming Interface to help screen readers obtain this
information. Nowadays, braille displays can also be connected to portable devices such as
smartphones and tablets.
3.3.5 Text-to-Speech Output
Text-to-speech programs allow blind or visually impaired users to listen to the text
displayed on the computer screen through a speech synthesizer. A screen reader is the interface
between the computer's operating system, its applications, and the user. The user issues commands
by pressing different combinations of keys on the keyboard, instructing the speech synthesizer
what to say and making it speak automatically when changes occur on the screen.
A text-to-speech function is present in most portable devices nowadays. Blind or visually
impaired users can configure their devices so that text messages are read out automatically, the
caller's name is announced whenever a call is received, or the time is voiced out every hour.
3.3.6 Haptic Feedback and Sound Alert
This technology is used in several ways for the blind and visually impaired. Some
applications use it to relay information about nearby objects back to the user through vibration
motors. The user can judge the distance from the frequency at which the motor pulses: the faster
the motor pulses, the closer the object. This kind of feedback can be used whenever it is hard for
the user to hear sound alerts. Sound alerts use the same principle, with sound pulses, to relay
information to the user, but that technique cannot be used in noisy places.
3.3.7 Global Positioning System (GPS)
The Global Positioning System (GPS) is a space-based satellite navigation system that
provides location and time information in all weather conditions, anywhere on or near the Earth
where there is an unobstructed line of sight to four or more GPS satellites. The system is freely
accessible to anyone with a GPS receiver. Most, if not all, mobile devices such as smartphones
and tablets are equipped with a GPS receiver, which can be used to obtain the user's precise
outdoor location.
3.3.8 Digital Compass
The digital compass is usually based on a sensor called magnetometer. It provides portable
devices with a simple orientation in relation to the Earth's magnetic field. As a result, the phone
always knows which way is North so it can auto rotate the digital maps depending on its physical
orientation. The digital compass can be used to know the orientation of the blind or visually
impaired person so that accurate directions of navigation can be given to him/her.
3.4 Navigation Systems Techniques Analysis
In this section, various techniques that are used in navigation are discussed in order to find
an appropriate technique to use in the system and possibly a combination of the techniques.
The most common algorithm used in navigation systems is Dijkstra's Algorithm
(Fallah et al. 2012, p.10). The purpose of this algorithm is to output the shortest path between two
points in a graph of vertices and edges. However, the shortest path is not always the best one for
visually impaired persons, as the resulting path may contain more obstacles and be unsafe given
their disabilities. Dijkstra's Algorithm may therefore be adapted so that it calculates the shortest
safe path.
Another approach is the graph approach (Fallah et al. 2012, p.10-11), on which Dijkstra's
Algorithm operates for path planning. The edges in the graph can be given weights representing
how dangerous a particular path is. The advantage of this approach is that nodes are created only
where there are objects such as obstacles and buildings. However, if the path is long and contains
many nodes, considerable resources are required to compute it.
The last approach is the grid-based method (Fallah et al. 2012, p.10-11), which overcomes
some of the constraints of the graph-based method by dividing the environment into cells of fixed
size. However, the grid-based approach becomes more complex when turns and crossroads need
to be considered, because one turn or crossroad may span two or more cells, whereas in the graph
approach a turn or crossroad is denoted by a single node.
3.4.1 Choice of Navigation Systems Technique
Instead of using only one of the techniques discussed, a combination of Dijkstra's
Algorithm and the graph-based path planning approach will be used. The grid-based approach
does not provide more advantages than this combination; its only advantage is that information is
stored in each cell, similar to storing information in a node in the graph-based approach.
Therefore, the two navigation techniques will be combined to obtain better paths for the visually
impaired.
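A minimal sketch of the chosen combination — Dijkstra's Algorithm running over a graph whose edge weights mix physical length with a danger factor — is given below. The `Edge` class, the multiplicative weighting and all values are illustrative assumptions, not the project's final design:

```java
import java.util.*;

public class SafestPath {
    // An edge whose effective weight combines physical length with a
    // danger factor (>= 1.0), so safer but longer paths can win.
    public static final class Edge {
        final int to; final double length; final double danger;
        public Edge(int to, double length, double danger) {
            this.to = to; this.length = length; this.danger = danger;
        }
        double weight() { return length * danger; }
    }

    // Standard Dijkstra over the weighted graph; returns the minimum
    // combined cost from 'start' to every node.
    public static double[] minCosts(List<List<Edge>> graph, int start) {
        double[] dist = new double[graph.size()];
        Arrays.fill(dist, Double.POSITIVE_INFINITY);
        dist[start] = 0.0;
        PriorityQueue<double[]> pq =
            new PriorityQueue<>(Comparator.comparingDouble(a -> a[0]));
        pq.add(new double[]{0.0, start});
        while (!pq.isEmpty()) {
            double[] top = pq.poll();
            int u = (int) top[1];
            if (top[0] > dist[u]) continue; // skip stale queue entries
            for (Edge e : graph.get(u)) {
                double cand = dist[u] + e.weight();
                if (cand < dist[e.to]) {
                    dist[e.to] = cand;
                    pq.add(new double[]{cand, e.to});
                }
            }
        }
        return dist;
    }
}
```

With multiplicative weights, a longer but safer detour can outrank a short, dangerous direct edge, which is exactly the "safest path" behaviour described above.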
3.5 Requirements Analysis
In this section, the requirements of the system are analyzed and derived. It is divided into
two types of requirements, namely, functional and non-functional requirements. The functional
and non-functional requirements describe the features and attributes of the system respectively.
3.5.1 Functional Requirements
A list of functional requirements is described below. Functional requirements define the
functions of a system or of its components; each function is described as a set of inputs, behaviors,
and outputs. Figure 3.3, shown below, is an overall representation of the functional requirements.
Figure 3.3 Use case diagram of functional requirements
FR1: The system shall provide information about obstacles in the surrounding.
If there are obstacles near the user, he/she shall be informed about what kind of obstacles
and in what direction they are.
FR2: The system shall provide information about buildings in the surrounding.
If there are buildings near the user, he/she shall be informed of the names of the buildings,
in some cases whether they are operating or not, and in what direction they are.
FR3: The system shall allow the user to add a voice note at his/her current location.
The user shall be able to add voice notes about objects such as obstacles or buildings when
he/she encounters them.
FR4: The system shall be able to guide the user towards his/her destination.
If the user inputs a destination, the system shall be able to guide the user from his/her
current location to the destination if there is a path for it.
FR5: The system shall be able to download maps.
To get some information about the places that the user wants to go or his/her surroundings,
the system shall be able to retrieve maps from an online repository.
FR6: The system shall allow the maps to be customized.
If there are real-world objects that are not yet present on the maps, the system shall allow
the administrator to add them. Likewise, if objects have been removed or changed, the system
shall allow the administrator to edit or remove them.
FR7: The system shall use voice to inform the user about nearby obstacles and buildings.
As the user is walking around, the system shall, based on the GPS location, automatically
retrieve obstacles, places and buildings from the map and give general information such as their
names and whether they are operating or not.
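The automatic retrieval above hinges on a distance test between the user's GPS fix and each map item. A sketch using the haversine formula is shown below; the 30-metre alert radius and the class name are assumptions for illustration:

```java
public class Proximity {
    // Mean Earth radius in metres.
    static final double EARTH_RADIUS_M = 6371000.0;
    // Alert radius in metres; the 30 m value is an illustrative assumption.
    static final double ALERT_RADIUS_M = 30.0;

    // Haversine great-circle distance between two GPS points, in metres.
    public static double distanceMeters(double lat1, double lon1,
                                        double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                   * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // A map item is "nearby" when it falls inside the alert radius.
    public static boolean isNearby(double userLat, double userLon,
                                   double itemLat, double itemLon) {
        return distanceMeters(userLat, userLon, itemLat, itemLon) < ALERT_RADIUS_M;
    }
}
```

At the latitude of Mauritius, 0.001° of latitude is roughly 111 m, so an item that far away correctly falls outside a 30 m alert radius.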
FR8: The system shall use haptic feedback to get the user's attention before any voice output
about obstacles.
Before any voice output, it is important to get the user's attention; this is why a haptic
feedback will precede every voice output.
FR9: The system shall voice out context awareness information every thirty minutes.
Every thirty minutes, the user will receive information such as the current time, the weather
conditions and the street name.
FR10: In case of low battery life, the system shall automatically switch to an emergency
mode.
If the battery level of the device falls below twenty-five percent, the system will switch to
an emergency mode, which consists of urging the user to return home immediately.
FR11: The user shall be able to trigger tasks.
To choose between different options, the system shall provide a mechanism to input the
user’s choice depending on the options he/she wants to trigger.
3.5.2 Non-Functional Requirements
A list of non-functional requirements is described below. Non-functional requirements
specify criteria that can be used to judge the operation of a system, rather than specific behaviors.
NFR1: The system shall not affect other devices or equipment the visually impaired or blind
person is using for navigation.
The user should be able to interact with the system using only one hand. The other hand
can still be used for conventional tools such as the white cane or the guide dog.
NFR2: The device where the system resides must be lightweight to the user.
While carrying the device, the user should not feel that it is heavy to carry.
NFR3: The system should provide simple interaction mechanisms for the user to use.
While navigating, there should be a minimum number of interactions between the system
and the user. For example, the user should not have to input the same instruction several times
before it is accepted.
NFR4: The system should be able to guide the user towards his/her destination along the
safest path.
If there are several paths towards the user's destination, the system should give directions
along the safest one, based on the path-planning techniques analyzed in section 3.4.
NFR5: The system should use a combination of Dijkstra's Algorithm and the graph-based
approach for generating the safest path.
The paths that the user will travel to reach his/her destination have to be generated using
Dijkstra's Algorithm together with the graph-based path planning approach.
NFR6: The system should notify the user about an obstacle at a distance that depends on the
speed at which the user travels.
The faster the user travels towards an obstacle, the earlier the system should notify the user
about that obstacle.
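One simple way to realise this requirement is to scale the alert distance with the user's speed. The five-second lead time, the minimum distance and the class name below are illustrative assumptions:

```java
public class ObstacleAlert {
    // Seconds of warning the user should get before reaching an
    // obstacle; the 5-second lead time is an illustrative assumption.
    static final double LEAD_TIME_S = 5.0;
    // Floor for very slow walking, so stationary users still get alerts.
    static final double MIN_DISTANCE_M = 3.0;

    // The faster the user moves, the farther away the alert fires:
    // distance = speed * lead time, never below the minimum.
    public static double warningDistanceMeters(double speedMetersPerSecond) {
        return Math.max(MIN_DISTANCE_M, speedMetersPerSecond * LEAD_TIME_S);
    }
}
```

At a typical walking speed of 1.4 m/s this triggers the alert about 7 m before the obstacle.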
Chapter 4 Design
In this chapter, the structure of the system to be implemented and its components are
described. Design issues like modelling and interactions of the system are also discussed.
4.1 List of Assumptions
In every project, there are always some unknown attributes or characteristics. For this
project, the following assumptions are made.
• The project will be limited to a specific region of Quatre-Bornes around St Jean Street (as
shown and discussed in section 3.2.1 – Scenario). However, it can easily be extended to
other regions.
• The points of interest will be limited to restaurants, banks, shopping centers, supermarkets,
bus stops, and private and local establishments such as the post office.
• It is assumed that no roads are closed and that there are no deviations due to road works.
• It is assumed that the user’s current location and starting point for routing will be within
the specified region of Quatre-Bornes.
• It is assumed that the user will have internet connection on the first use of the application
to load the map, weather service and other functionalities.
• It is assumed that the user will have a mobile phone running Android, as specified in the
project description, and the phone will have most sensors and components, present in
modern mobile phones, like GPS, accelerometer, vibrator, sound output and all of them are
enabled.
• It is also assumed, as specified in the project description, that the system is not a
replacement for the white cane; instead, it augments the white cane.
4.2 System Design Issues
In this section, relevant design issues in the system are discussed such as usability and map
modelling.
4.2.1 Usability
The usability of a system refers to how easy the system is to use and how easily it can be
learned. For this project, different aspects of the system's usability are considered. The biggest
challenge is that the users of the system are partially or completely blind; special care must
therefore be taken while designing the user interface.
4.2.2 Performance
The performance of the system is one of the priorities of this project. Through intensive
research, ways of building a system that performs accurately and efficiently are investigated.
Other performance issues, such as battery life and the accuracy of the GPS receiver and of the
accelerometer sensor, are also discussed.
4.2.3 Responsiveness
Due to the interactive nature of the system, information provided should be quick and
accurate. The user should be prompted to turn in a particular direction before passing it, not after.
In other words, a real-time service should be ensured. The movement of the user should be
tracked, with GPS coordinates updated accordingly and properly modelled on the map. A new
path should be determined if the user takes another direction, while still ensuring the shortest
route. Different algorithms should be considered so that routes are recalculated when the user
takes a wrong turn.
4.2.4 Map Modelling
The guidance system needs a map to guide the user and notify him/her about the objects
surrounding his/her location. To model the map, the main objects that should be included are
obstacles and places.
The information that the map should provide is:
• Name of the obstacles/places in Creole language where possible (voiced out)
o E. g. ‘lescalier’ for obstacle and ‘labanque MCB’ for building
• GPS location of obstacles/buildings
o E. g. Latitude -20.247167, Longitude 57.480734
• Category of place
o E. g. ‘Kentucky’ is in category ‘Manze’
• Opening hours of place
o E. g. M-07:30-20:00 means that the place is open between 07:30 and 20:00 on
Mondays
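A small sketch of how an opening-hours entry in this format could be parsed and checked is given below. The single-letter day codes and the class name are assumptions inferred from the example above:

```java
public class OpeningHours {
    // Parses entries like "M-07:30-20:00" (day code, opening time,
    // closing time) and checks whether the place is open at a given
    // minute of that day.
    public static boolean isOpen(String entry, String day, int hour, int minute) {
        String[] parts = entry.split("-");          // ["M", "07:30", "20:00"]
        if (!parts[0].equals(day)) return false;    // entry is for another day
        int now = hour * 60 + minute;
        return now >= toMinutes(parts[1]) && now < toMinutes(parts[2]);
    }

    // Converts "HH:MM" to minutes since midnight.
    private static int toMinutes(String hhmm) {
        String[] t = hhmm.split(":");
        return Integer.parseInt(t[0]) * 60 + Integer.parseInt(t[1]);
    }
}
```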
Figure 4.1, below, shows the location of MCB, a bank, in Quatre-Bornes.
Figure 4.1 Google Earth Sample
To facilitate the collection of locations, obstacles and places, a third party application
called Google Earth (http://www.google.com/earth/download/ge/) is used. Using a laptop with
Google Earth installed, one adds placemarks on the map along with the information that will be
used by the system. Some places of interest (POI) are already available on Google Earth, as shown
above; for these, only the category and opening hours of the place need to be surveyed, since the
GPS location can already be retrieved. However, to add GPS locations for objects that are not yet
on the map, especially obstacles, a GPS device must be used to record the coordinates at the place
where the object is found.
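Reading placemarks back out of such a KML file can be sketched with the JDK's built-in DOM parser. Note that real KML files declare an XML namespace and list coordinates as longitude,latitude,altitude; this simplified sketch (namespace-unaware parsing, hypothetical class name) glosses over the namespace:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

public class KmlReader {
    // Extracts "name -> coordinates" pairs from the <Placemark>
    // elements of a KML document string.
    public static Map<String, String> placemarks(String kml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new ByteArrayInputStream(kml.getBytes(StandardCharsets.UTF_8)));
        Map<String, String> result = new LinkedHashMap<>();
        NodeList marks = doc.getElementsByTagName("Placemark");
        for (int i = 0; i < marks.getLength(); i++) {
            Element mark = (Element) marks.item(i);
            String name = mark.getElementsByTagName("name")
                              .item(0).getTextContent().trim();
            String coords = mark.getElementsByTagName("coordinates")
                                .item(0).getTextContent().trim();
            result.put(name, coords);
        }
        return result;
    }
}
```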
Furthermore, the information about the obstacles and places will be uploaded to a server
(see section 4.3.2.2), and internet access is needed in order to download it. If the internet is not
available on the mobile phone at a given moment, the files can be downloaded on another machine,
such as a computer, and transferred to the phone by other means such as USB or Bluetooth. The
diagram on the next page shows the steps for adding information to the map, which is saved as a
KML file.
Adding folder for a region. (Each folder represents a region)
Add placemark to folder on seeing
the obstacle/place
Enter information about obstacle/place
Export as KML file
Figure 4.2 Exporting of places information to KML file
4.2.5 Voice Instruction
The voice of one of the project members, Kishan Bhugul, is used to record instructions that
will be voiced out to the user. A professional audio recording home studio has been set up for this
purpose. Figure 4.3 below shows the main components that are being used to record studio quality
sounds.
Figure 4.3 Home studio setup for voice recording
From left to right: Sound Equalizer optimized for vocal, Audio Mixer, Monitoring Speakers to
ensure good quality sound has been recorded, Adobe Audition for editing recorded sounds and
high quality microphone with pop filter.
It is to be noted that all instructions given to the user will be in Creole, Mauritius's most
widely used language.
4.3 System Architecture Design
In this section, the overall architecture of the system is defined and broken down so as to
explain each part in detail.
4.3.1 Structure Diagram
The figure below shows the structure diagram of the system. Its content is summarized as
the following hierarchy.

Figure 4.4 Model of top-down design for the project

Guidance System for the Visually Impaired
• Navigation
o GPS: track user, location of obstacles, location of places, orientation
o Algorithms: Dijkstra's Algorithm (safest path, emergency mode, get directions to destinations) and graph-based approach (weightage/difficulty-rate calculations between places)
• Context Awareness
o Time
o Weather: www.openweathermap.org, XML file
o Obstacles (KML file): bins, crossings, traffic lights, ramps, holes, lamp posts
o Interesting Places (KML file): transport, bank, entertainment, food/drink, rest place
• Feedback
o Haptic: pattern, once
o Audio: voice out (high quality, Creole language), notification (tone alert)
• Input
o Button: volume buttons
o Shake: accelerometer
• Annotation
o Voice recording
o GPS coordinates
4.3.2 Overall Architecture
In this section, block diagrams are used to represent an overview of the system. It is divided
into two parts: Administrator System Architecture and User System Architecture.
4.3.2.1 Administrator System Architecture
The following procedures are followed to update the map.
1. As mentioned in section 4.2.4, using Google Earth, an administrator plots the locations
of the obstacles and places. He/She adds other information about the places such as
opening hours on Google Earth itself. The information are saved as KML files on the
administrator’s computer.
2. The saved KML files are then uploaded on Google Drive.
Figure 4.5 Administrator System Architecture
4.3.2.2 User System Architecture
The guidance system for visually impaired persons operates on the mobile device. Sensors
are used when required, and the system reads and writes files such as maps. Some files have to be
downloaded from an online repository. The diagram below gives an overview of the system.
Figure 4.6 User System Architecture
4.3.3 Component Diagram
The GUI of the system depends on the Android OS of the mobile device: if the OS crashes,
the system will not be operable. Similarly, if the speaker or microphone of the device is defective,
the Voice component will not be operable either. The Navigation component guides the user,
while the Schedule component gives context information at regular intervals. Figure 4.7 below
gives an overall architecture split into components.
Figure 4.7 Split Components
4.3.4 Overall Class Diagram
On launching the application, class MainActivity is the first class to be utilized. Class Shaker listens to the user's shakes and class VoicePlayer plays voice instructions. Moreover, class Downloader downloads files, while class KMLFile reads KML files. Figure 4.8 below illustrates the system classes.
Figure 4.8 System Classes Overview
4.3.4.1 Class MapItem
Both classes Place and Obstacle extend the class MapItem. The two classes inherit the attributes name ('filename'), location and weightage ('normalWeightage'). The attribute 'category' of class Place is used to distinguish between different kinds of places.
Figure 4.9 Class MapItem
4.3.4.2 Classes from Navigation Package
Class LocationList contains LocationPoint objects, which are GPS locations. Class Direction is used to determine the direction in which the user is heading, while class Edge is used to create objects that connect two nodes, 'from' and 'to'.
Figure 4.10 Navigation Package
4.3.4.3 Class Graph and NodeEntry
Class Graph consists of functions to calculate the safest path. Class NodeEntry is used to keep track of the nodes with the least weightages and of the nodes that must be traversed to obtain the safest path.
Figure 4.11 Class Graph and class NodeEntry
4.3.4.4 Classes from Voice Package
Classes VoicePlayer and VoiceRecorder are used to play and record voices respectively.
Figure 4.12 Voice Package
4.3.4.5 Classes from Schedule Package
Class AlarmScheduler is intended to schedule an alarm using the class AlarmReceiver.
The latter uses the class WeatherReader to obtain weather information to be voiced out.
Figure 4.13 Schedule Package
4.3.5 Sequence Diagrams
Sequence diagrams are used to show the processing of use cases mentioned in the previous
chapter.
askShake starts a sensor listener to determine the number of shakes made by the user. After listening to the options available and choosing his/her destination by shaking, the function getSafestPath is called. Figure 4.14 shows the sequence diagram of typical user usage.
Figure 4.14 User Usage
Figure 4.15 below shows the sequence diagram for the time and weather condition voice feedback. Every thirty minutes, the time and weather condition are voiced out.
Figure 4.15 Time and Weather condition Voice Feedback
4.3.6 Activity Diagrams
In this section, activity diagrams are used to model how the system completes tasks in particular scenarios. The following activity diagram models what has to be accomplished when an object is near the operating device.
Figure 4.16 Obstacles close to device
Figure 4.17 below shows the activity diagram which models what has to be accomplished
when launching the function to annotate an object not found in the list of objects.
Figure 4.17 Annotating objects
The activity diagram in figure 4.18 below models what has to be accomplished when the
battery life of the device is low.
Figure 4.18 Low Battery
4.4 Software Design
In this section, the design of the different software components of the system is explained in detail.
4.4.1 Path Determination Algorithm
Given a destination, the system provides the safest path to it. A combination of a modified version of Dijkstra's algorithm and a graph-based algorithm is used for this purpose. In order to use the algorithms, a graph representing the map must be constructed. An illustration of a customized map is given below.
Figure 4.19: Illustration of a map in the form of a graph
For the design of the graph, the following representation is used.
• Each point where there is a change of direction is taken as a node (yellow placemark).
• Each path along which the user can travel between two nodes is taken as an edge (red line).
• Each obstacle (red placemark) is given a weightage. The higher the weightage, the more dangerous the obstacle is for the user.
• The weightage of an edge is determined by the sum of the weightages of the obstacles between the two nodes connected by the edge.
As a result, the path with the lowest total weightage is considered the safest.
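As an illustration, the weighting rule above can be sketched in Java as follows. The class and method names, as well as the example obstacle weightages, are hypothetical and are not the actual code of the system.

```java
import java.util.Arrays;
import java.util.List;

public class EdgeWeightSketch {

    // An edge's weightage is the sum of the weightages of the obstacles along it.
    static int edgeWeightage(List<Integer> obstacleWeightages) {
        return obstacleWeightages.stream().mapToInt(Integer::intValue).sum();
    }

    // The total weightage of a route expressed as a list of edges.
    static int routeWeightage(List<List<Integer>> edges) {
        return edges.stream().mapToInt(EdgeWeightSketch::edgeWeightage).sum();
    }

    public static void main(String[] args) {
        // Route 1: two edges, one passing a hole (weight 5) and a bin (weight 1).
        List<List<Integer>> route1 = Arrays.asList(Arrays.asList(5, 1), Arrays.asList(0));
        // Route 2: three edges with only minor obstacles.
        List<List<Integer>> route2 = Arrays.asList(
                Arrays.asList(1), Arrays.asList(1), Arrays.asList(1));
        System.out.println(routeWeightage(route1)); // 6
        System.out.println(routeWeightage(route2)); // 3 -> safer despite more edges
    }
}
```

The example shows why the safest route is not necessarily the shortest: a longer route made up of low-weightage edges is preferred over a direct edge passing a dangerous obstacle.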
4.4.2 Determination of user orientation
As for all GPS navigation tools, it is important to obtain the user's heading, that is, the direction in which the user is walking. The magnetometer sensor, which acts as a compass, is usually used for this purpose. The main disadvantage of this technique is that it requires the mobile device to be held in a specific position, either horizontal or vertical. In this project, it cannot be assumed that the mobile phone is always in a specific position. For this reason, another technique has been derived, which consists of flattening the map and computing the user's heading from two GPS coordinates. This is not as easy as it appears because of the curvature of the earth.
Figure 4.20 Earth showing a path from Baghdad to Osaka
Figure 4.20 shows a line from Baghdad to Osaka and as it can be seen, it is not a constant
bearing. To flatten the map, the Heading Angle formula is used.
Heading Angle = ATAN2(COS(lat1)*SIN(lat2)-SIN(lat1)*COS(lat2)*COS(lon2-lon1),
SIN(lon2-lon1)*COS(lat2))
The above formula returns the heading angle in radians, from –π to π, covering all four quadrants. Figure 4.21 below shows how to interpret the angle obtained.
Figure 4.21 User heading in the flattened map
(The figure relates the angles 0, π/2, π | –π and –π/2 to the directions in which the user is walking: forward left, forward right, backward left and backward right.)
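A minimal Java sketch of the heading-angle computation is given below. Note that the formula in the text is written in the spreadsheet-style ATAN2(x, y) convention, whereas Java's Math.atan2 takes (y, x), so the arguments are swapped here; the class name is hypothetical.

```java
public class HeadingSketch {

    // Latitudes/longitudes in radians; returns the heading angle in (-pi, pi].
    static double headingAngle(double lat1, double lon1, double lat2, double lon2) {
        double y = Math.sin(lon2 - lon1) * Math.cos(lat2);
        double x = Math.cos(lat1) * Math.sin(lat2)
                 - Math.sin(lat1) * Math.cos(lat2) * Math.cos(lon2 - lon1);
        return Math.atan2(y, x);
    }

    public static void main(String[] args) {
        // Moving from the equator towards a point slightly to the north: angle 0.
        System.out.println(headingAngle(0, 0, 0.01, 0)); // 0.0
        // Moving towards a point slightly to the east: angle pi/2.
        System.out.println(headingAngle(0, 0, 0, 0.01)); // 1.5707963267948966
    }
}
```

The two successive GPS fixes of the device are substituted for (lat1, lon1) and (lat2, lon2) to obtain the direction of travel without relying on the magnetometer.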
4.4.3 Traversing KML for Obstacle and Places Information
To read a KML file, an XML parser can be used, since KML is an XML-based format.
Figure 4.22 below shows an extract of the KML file containing the information about the obstacles.
Figure 4.22 Extract of KML file containing information about obstacles in Quatre-Bornes
The algorithm shown in figure 4.23 has been designed to extract information on obstacles
from a KML file.
Figure 4.23 Obstacles Information Extraction Algorithm from a KML File
A loop is used to traverse the map items in the KML file, and the information for each map item is retrieved.
Since the KML file for places has a similar structure, the same algorithm is used, with some modifications, to retrieve the categories and opening hours of the places.
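The traversal described above can be sketched with the standard Java DOM parser as follows. The sample KML string is a simplified, hypothetical stand-in for the project's file (real KML also carries an XML namespace and extra elements, omitted here), and the class name is likewise an assumption.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class KmlSketch {
    // Simplified stand-in for an obstacle KML file.
    static final String SAMPLE =
          "<kml><Document>"
        + "<Placemark><name>hole_1</name>"
        + "<Point><coordinates>57.48,-20.26,0</coordinates></Point></Placemark>"
        + "<Placemark><name>bin_2</name>"
        + "<Point><coordinates>57.49,-20.27,0</coordinates></Point></Placemark>"
        + "</Document></kml>";

    // Loop over the Placemark elements and collect "name @ coordinates" entries.
    static List<String> extractObstacles(String kml) {
        try {
            Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new ByteArrayInputStream(kml.getBytes(StandardCharsets.UTF_8)));
            NodeList placemarks = doc.getElementsByTagName("Placemark");
            List<String> items = new ArrayList<>();
            for (int i = 0; i < placemarks.getLength(); i++) {
                Element p = (Element) placemarks.item(i);
                String name = p.getElementsByTagName("name").item(0).getTextContent();
                String coords = p.getElementsByTagName("coordinates").item(0).getTextContent();
                items.add(name + " @ " + coords);
            }
            return items;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        extractObstacles(SAMPLE).forEach(System.out::println);
    }
}
```

The same loop, extended to read additional child elements, would retrieve the categories and opening hours of places.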
4.4.4 Obstacle Detection Mechanism
The weightage of an edge is determined by the sum of the weightages of the obstacles between the two nodes forming the edge. Consider the situation boxed in blue in figure 4.24. The obstacles present between two nodes must be determined. In figure 4.24, the obstacle 'la_montant' lies between the nodes 'gold_crest_hotel' and 'mikado'. Therefore, the edge between these nodes must have the same weightage as the obstacle 'la_montant'. To determine the presence of the obstacles, the heading angle formula from section 4.4.2 is used. The algorithm is shown in figure 4.25.
Figure 4.24 Illustration of obstacle between two places
Figure 4.25 Using the heading angle algorithm
Figure 4.26 shows an example where the algorithm returns the Boolean value true, since the obstacle lies between the two nodes.
(Key: the markers denote an obstacle/place and the user with the mobile device.)
Figure 4.26 Simple illustration when approaching an obstacle
The system alerts the user about an obstacle and/or an interesting place when the device is within three meters of it. This mechanism uses the GPS sensor to get the current location of the device. The current location is then compared to a list of GPS locations of obstacles and places. The algorithm in figure 4.27 has been designed to determine when the user should be informed of the obstacle.
Figure 4.27 Obstacle/Place Detection Mechanism algorithm
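The three-metre proximity check can be sketched as follows, using the haversine great-circle distance between the device's GPS fix and a known obstacle/place. The class and method names are hypothetical; the system's actual distance computation may differ.

```java
public class ProximitySketch {
    static final double EARTH_RADIUS_M = 6371000.0;

    // Great-circle (haversine) distance in metres between two points in degrees.
    static double distanceMetres(double lat1, double lon1, double lat2, double lon2) {
        double p1 = Math.toRadians(lat1), p2 = Math.toRadians(lat2);
        double dp = Math.toRadians(lat2 - lat1), dl = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dp / 2) * Math.sin(dp / 2)
                 + Math.cos(p1) * Math.cos(p2) * Math.sin(dl / 2) * Math.sin(dl / 2);
        return EARTH_RADIUS_M * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
    }

    // Alert when the device is within three metres of a known obstacle/place.
    static boolean shouldAlert(double userLat, double userLon,
                               double itemLat, double itemLon) {
        return distanceMetres(userLat, userLon, itemLat, itemLon) <= 3.0;
    }

    public static void main(String[] args) {
        System.out.println(shouldAlert(-20.2650, 57.4790, -20.2650, 57.4790)); // true
        System.out.println(shouldAlert(-20.2650, 57.4790, -20.2700, 57.4790)); // false
    }
}
```

In practice the check is evaluated against every entry in the list of obstacle and place locations each time a new GPS fix arrives.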
4.4.5 Traversing of XML File for Weather Information
To read any XML file, a parser can be used. A sample XML file was presented in the Analysis phase; this section describes how the file is traversed. Figure 4.28 below shows an extract of the important parts of the XML file, and the retrieval of the information is then described.
Figure 4.28 Extract of weather XML file for the region of Quatre Bornes
The piece of code in the red box describes the current weather condition. The following
algorithm is used to extract this information.
Figure 4.29 Reading weather information algorithm
A list of weather conditions and their corresponding codes is given in the table below.
Storm and Rain: 200, 201, 202, 230, 231, 232
Storm: 210, 211, 212, 221
Rain: 300, 301, 302, 310, 311, 312, 313, 314, 321, 500, 501, 502, 503, 504, 511, 520, 521, 522, 531
Snow: 600, 601, 602, 611, 612, 615, 616, 620, 621, 622
Fog: 701, 711, 721, 731, 741
Dust: 751, 761, 762
Tornado: 771, 781
Fine Weather: 800, 950, 951, 952, 953, 954, 955
Few Clouds: 801, 802, 803
Cloudy: 804
Extreme and Dangerous Conditions: 900, 901, 902, 903, 904, 905, 906, 956, 957, 958, 959, 960, 961, 962
Table 4.1: List of Weather Codes
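The table above can be turned directly into a lookup that returns the condition text to voice out for a given code. The sketch below transcribes Table 4.1; the class and method names are hypothetical.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class WeatherCodeSketch {
    // Condition -> codes, transcribed from Table 4.1.
    static final Map<String, int[]> CODES = new LinkedHashMap<>();
    static {
        CODES.put("Storm and Rain", new int[]{200, 201, 202, 230, 231, 232});
        CODES.put("Storm", new int[]{210, 211, 212, 221});
        CODES.put("Rain", new int[]{300, 301, 302, 310, 311, 312, 313, 314, 321,
                500, 501, 502, 503, 504, 511, 520, 521, 522, 531});
        CODES.put("Snow", new int[]{600, 601, 602, 611, 612, 615, 616, 620, 621, 622});
        CODES.put("Fog", new int[]{701, 711, 721, 731, 741});
        CODES.put("Dust", new int[]{751, 761, 762});
        CODES.put("Tornado", new int[]{771, 781});
        CODES.put("Fine Weather", new int[]{800, 950, 951, 952, 953, 954, 955});
        CODES.put("Few Clouds", new int[]{801, 802, 803});
        CODES.put("Cloudy", new int[]{804});
        CODES.put("Extreme and Dangerous Conditions", new int[]{900, 901, 902, 903,
                904, 905, 906, 956, 957, 958, 959, 960, 961, 962});
    }

    // Look up the condition text for a weather code extracted from the XML file.
    static String condition(int code) {
        for (Map.Entry<String, int[]> e : CODES.entrySet())
            for (int c : e.getValue())
                if (c == code) return e.getKey();
        return "Unknown";
    }

    public static void main(String[] args) {
        System.out.println(condition(804)); // Cloudy
        System.out.println(condition(500)); // Rain
    }
}
```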
4.4.6 Power Consumption Management
The remaining battery life is a very important aspect of this project, because the visually impaired person is very dependent on the system. In order to cater for low battery life, two measures are taken. If the battery life reaches thirty percent, the user is alerted that the battery is getting low, and if it reaches twenty-five percent, the system automatically switches to an emergency mode. The emergency mode consists of disabling features like haptic feedback and urging the user to set the next destination to the starting position.
To get information about the battery life on an Android device, a receiver should be registered with an intent, and the battery level is checked periodically. The algorithm below shows the task to be triggered if the battery reaches thirty or twenty-five percent.
Figure 4.30 Remaining battery life algorithm
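The two thresholds described above map naturally onto a small decision function. This is a minimal sketch (hypothetical names); on Android the battery percentage itself would come from the registered broadcast receiver.

```java
public class BatterySketch {
    enum Action { NONE, WARN_LOW, EMERGENCY_MODE }

    // 30% -> voice a low-battery warning; 25% or below -> switch to emergency mode.
    static Action onBatteryLevel(int percent) {
        if (percent <= 25) return Action.EMERGENCY_MODE;
        if (percent <= 30) return Action.WARN_LOW;
        return Action.NONE;
    }

    public static void main(String[] args) {
        System.out.println(onBatteryLevel(80)); // NONE
        System.out.println(onBatteryLevel(30)); // WARN_LOW
        System.out.println(onBatteryLevel(20)); // EMERGENCY_MODE
    }
}
```

Checking the twenty-five percent threshold first guarantees that a level below both thresholds triggers the emergency mode rather than only the warning.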
4.4.7 Timer Alarm Schedule
Every thirty minutes, an action is triggered by the system, and the current time along with the current weather condition are voiced out. In order to achieve this, an alarm needs to be scheduled and re-scheduled every thirty minutes. For this purpose, an intent and an alarm receiver are used. The intent sets the priority of the alarm and triggers the alarm receiver. The alarm manager is used to get the alarm service, and the alarm is then set based on a calculation. The calculation of the next schedule is shown in the algorithm below.
Figure 4.31 Thirty minutes schedule algorithm
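One possible form of the schedule calculation is sketched below: the delay until the next half-hour boundary is computed from the minutes and seconds past the hour. This is a simplified, hypothetical version; the project's actual calculation in figure 4.31 may differ in detail.

```java
public class ScheduleSketch {
    // Milliseconds until the next half-hour boundary, given the minutes and
    // seconds past the current hour.
    static long millisToNextHalfHour(int minute, int second) {
        int elapsed = (minute % 30) * 60 + second; // seconds since the last boundary
        return (30 * 60 - elapsed) * 1000L;        // milliseconds until the next one
    }

    public static void main(String[] args) {
        System.out.println(millisToNextHalfHour(17, 0));  // 780000 (13 minutes)
        System.out.println(millisToNextHalfHour(45, 30)); // 870000 (14.5 minutes)
    }
}
```

On Android, the returned delay would be added to the current time and passed to the alarm manager when (re-)scheduling the alarm.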
4.4.8 Re-routing Mechanism
When the user strays from the established path, the safest path is recalculated and another safest path towards his/her destination is set. The algorithm below is used.
Figure 4.32 Re-routing algorithm
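One simple off-path criterion is sketched below: re-routing is triggered once the user is farther than some threshold from every node of the current path. Both the criterion and the threshold are assumptions made for illustration, not the system's actual rule in figure 4.32.

```java
public class RerouteSketch {
    // Recalculate the safest path once the user is more than `thresholdM`
    // metres from every node of the current path (distances pre-computed here).
    static boolean needsReroute(double[] distancesToPathNodesM, double thresholdM) {
        for (double d : distancesToPathNodesM)
            if (d <= thresholdM) return false; // still close enough to the path
        return true;                           // drifted away: re-route
    }

    public static void main(String[] args) {
        System.out.println(needsReroute(new double[]{40, 12, 85}, 15)); // false
        System.out.println(needsReroute(new double[]{40, 32, 85}, 15)); // true
    }
}
```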
4.5. Interaction Design
In this section, the interaction mechanisms intended for both visually impaired and non-visually-impaired users are described.
4.5.1 Human Computer Interaction Strategies
A list of HCI strategies that have been considered when designing the various interfaces of the system is given below.
• Getting the user's attention in critical situations, such as when there is an obstacle nearby.
• Conveying system information, such as the battery level of the phone.
• Confirmation of action, such as when the user has chosen an option after shaking.
• Ergonomics, such as allowing the user to hold the device in any position to get the user heading.
• Reaction time when outputting information, such as voicing out an obstacle in time before reaching it.
• Haptic stimuli, such as vibration.
• Short option names, such as when choosing a category of places.
• Grouping of items, such as places grouped in different categories.
4.5.2 Interaction Modes Design
The different interaction modes that are provided in the system are described in this section.
These modes aim to ease the interaction between the visually impaired person and the system.
4.5.2.1 Voice Feedback Mechanism
To voice out an instruction, the MediaPlayer inbuilt function of the Android system is used. It is important that the navigation application obtains the audio focus in order to prevent any other application on the mobile phone from using the sound output. In this way, useful instructions are not drowned out by the notifications of other applications. To get the audio focus, the audio manager is needed. Once the focus is requested, a function can be implemented as shown below.
Figure 4.33 Voice feedback algorithm
4.5.2.2 Voice Recording Mechanism
To record a voice, the MediaRecorder inbuilt function of the Android system is used. For example, at his/her current location, a visually impaired person may want to add an obstacle/place that is not found in the KML file. For this purpose, the algorithm in figure 4.34 is proposed.
Figure 4.34 Voice Recording Algorithm
4.5.2.3 Shake-to-Respond Mechanism
The Shake-to-Respond mechanism is a new way for the user to interact with the system. The accelerometer sensor is used to detect shakes by the user. The functions used to implement this feature are described below, along with algorithms and explanations.
The sensor listener is first activated, and the sensor manager uses the accelerometer sensor. The sensor listener provides a function that executes when there are changes in the accelerometer sensor. The algorithm in figure 4.35 shows the shake-to-respond mechanism.
Figure 4.35 Shake-to-respond algorithm
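The shake detection itself can be sketched independently of the Android sensor framework: a shake is counted whenever the change in acceleration magnitude between two samples exceeds a threshold. The class name and the threshold value are assumptions for illustration; in the real system the samples would arrive through the sensor listener.

```java
public class ShakeSketch {
    static final double SHAKE_THRESHOLD = 12.0; // m/s^2, assumed; tuned per device
    private double lastMagnitude = 9.81;        // roughly gravity, device at rest
    private int shakeCount = 0;

    // Called for each accelerometer sample (x, y, z in m/s^2); a shake is
    // counted when the change in acceleration magnitude exceeds the threshold.
    void onSensorChanged(double x, double y, double z) {
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        if (Math.abs(magnitude - lastMagnitude) > SHAKE_THRESHOLD) shakeCount++;
        lastMagnitude = magnitude;
    }

    int getShakeCount() { return shakeCount; }

    public static void main(String[] args) {
        ShakeSketch shaker = new ShakeSketch();
        shaker.onSensorChanged(0, 0, 9.81); // at rest: no shake
        shaker.onSensorChanged(0, 0, 25.0); // sharp jolt: one shake counted
        shaker.onSensorChanged(0, 0, 22.0); // small change: no further shake
        System.out.println(shaker.getShakeCount()); // 1
    }
}
```

The number of shakes accumulated within a listening window is then interpreted as the option chosen by the user.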
4.5.2.4 Haptic Feedback Mechanism
The vibrate function is used to capture the user's attention before voicing out any instructions. As for any other inbuilt device/sensor on an Android phone, the system service for the vibrator is required. Once obtained, the vibrator can be invoked by specifying the duration of the vibration in milliseconds. The algorithm below shows the haptic feedback mechanism.
Figure 4.36 Haptic feedback mechanism algorithm
4.6 Graphical User Interface Design
A graphical interface is designed for a non-visually impaired person so that he/she can set
the destination for the blind person. Screen designs are shown below.
On tapping the Menu button, the screen shows the map together with the options 'Free Walk' and 'Choose Destination'.
Figure 4.37 Screen design 1
If the option 'Free Walk' is selected, it is then shown as disabled. On tapping 'Choose Destination', the categories of places are listed, for example 'Manze ek Boir', 'Place Repos', 'Transport' and 'Loisir'.
Figure 4.38 Screen design 2
Categories of places are shown, and scrolling is available if the list is long. Places of a selected category have the same layout as above.
(Screen annotations: a scroll bar; 'Free Walk' is tapped if the blind person does not have a specific destination to go to; tapping a category gives the list of places in that category.)
Guidance System for the Visually Impaired Persons Chapter 5: Implementation _____________________________________________________________________________________
Chapter 5 Implementation
This chapter describes the steps involved in the development process of the software.
Details on different implementation issues, standards and requirements that must be dealt with are
given along with an in-depth description of the classes and methods that are implemented and the
difficulties encountered.
5.1 Implementation Issues
In this section, an initial description of the technical aspect of the implementation is
discussed. It includes what should be taken into consideration before starting the implementation.
5.1.1 Platform
As per the project title, the solution is implemented for an Android-based device. Although other languages like C++ and C# can be used to write Android applications, Java is used for this project. This is because Java is the official programming language for writing Android applications, and Android is supported by Google, which holds the main patents, with a wealth of documentation and tutorials.
5.1.2 Compatibility
In recent years, Google has released several versions of the Android operating system. Two main devices are used for the development of the solution. The first device runs Android OS 2.3, also known as Android Gingerbread, and the other runs Android OS 4.2, also known as Android Jelly Bean. Although Android 4.4 KitKat was released last year, care has been taken to make sure that the solution will run on most, if not all, Android OS devices.
5.1.3 Robustness
Android OS, being a Linux-based operating system, already caters for robustness as far as resource allocation is concerned. The proposed application will nevertheless make sure that resources are used in an efficient way and that memory is freed as soon as it is no longer needed. This is important to prevent any kind of memory overflow, which can result in a temporarily unresponsive phone.
Another robustness issue for the application is how it reacts to the accuracy of the GPS sensor on a particular device. Unfortunately, it is difficult to cater for readjustments of the GPS coordinates for all kinds of devices, as the deviation differs from device to device. Such an algorithm would require years of testing and refinement before it could be deployed.
5.1.4 Real-time Concurrency
After intensive research, it can be said that threads, although possible, should be avoided when developing mobile applications. This is because, due to a lack of resources, the Android OS may kill tasks. For example, if one thread is put to sleep while another is being executed, and for some reason both threads happen to be sleeping at a particular time, the Android OS may consider the application as not responding and kill it.
Instead, inbuilt listeners, which are managed internally by the Android OS, can be used. The Android OS will then adjust and share resources in an efficient way without killing any applications.
5.1.5 File Access
The Android OS restricts access to the system files unless the device is rooted. The main files of the application are stored at the path /mnt/sdcard/VIS/. It is important to note that the sdcard is actually the internal memory and that, at this particular path, the VIS folder has rwx (read, write and execute) access.
5.1.6 Interface and Class
Inbuilt interfaces of the Android OS will be used. The table below describes them.
Interface Description
android.hardware.SensorListener When a sensor value changes, this listener is notified
android.widget.AdapterView.OnItemClickListener When an item is tapped, this listener is notified
android.location.LocationListener When the GPS location changes, this listener is notified
Table 5.1 Inbuilt interfaces used in the project
Inbuilt classes of the Android OS will be used. The table below describes the important
ones.
Class Description
android.app.Activity Acts as a window to add user interface components
android.content.Intent Describes an action between different applications or
activities
android.hardware.SensorManager Represents a specified sensor
android.content.BroadcastReceiver Receives intents
android.app.AlarmManager Provides alarm services and for this project it is used
to give time and weather at regular intervals
android.content.IntentFilter Compares intent values and filters them
android.os.Vibrator Operates the vibrator on the device
android.os.AsyncTask Helps to run a task like a thread mostly when user
interface components are involved
android.widget.ListView Widget to display items as a list
android.widget.SimpleAdapter Used to supply a list of data to be displayed in other widgets such as ListView
Table 5.2 Some inbuilt classes used in the project
5.2 Standard and Convention
For the coding standard, the reference book Clean Code: A Handbook of Agile Software Craftsmanship by Robert C. Martin has been used. This book describes the modern way of writing code, as well as the proper standards and conventions to be used. For this project, proper Java standards and conventions are followed. The book also describes modern naming conventions; for example, instead of using a meaningless variable name and then commenting on it, meaningful variable names are used. According to the book, the whole idea of modern programming is to avoid relying on comments: comments written in natural language take more time to understand than meaningful variable names, which let the programmer follow the flow of the program.
5.2.1 Conventions
The code convention describes the rules that are followed to write good, clean code. A list of conventions is given below.
1. The naming conventions used are given below.
• Class names should be in CamelCase, and a class name should be a noun: class Vibrator
• Method names should be in mixed case and can be verbs describing the method: void getDirection()
• Variable names should be in mixed case and represent the value of the variable: String xPoint
• Constant names should be in uppercase: static final int TIMEOUT_SECONDS
2. All the declarations must be made right at the beginning wherever possible.
3. Indentation is an important aspect that gives a better structure to the program and improves the clarity of the code.
4. Comments should be used only where appropriate and must be kept updated in parallel
with the source code since they are invaluable to a developer who must maintain a
particularly intricate or cumbersome piece of software.
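A short example that follows the conventions above is given below. The class, method and variable names are invented for illustration and are not part of the actual system.

```java
// CamelCase noun for the class name, uppercase constant, mixed-case method and
// variable names that describe their purpose without needing extra comments.
public class RouteGuide {
    static final int ALERT_DISTANCE_METRES = 3; // constant name in uppercase

    int remainingObstacles = 2; // meaningful variable name, declared up front

    // Verb-phrase method name in mixed case.
    boolean isWithinAlertDistance(int distanceMetres) {
        return distanceMetres <= ALERT_DISTANCE_METRES;
    }

    public static void main(String[] args) {
        RouteGuide guide = new RouteGuide();
        System.out.println(guide.isWithinAlertDistance(2)); // true
    }
}
```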
5.3 Development Environment
In this section, the hardware and software tools for the development of the system
environment are discussed.
5.3.1 Mobile Device Configuration
Firstly, so as to acquire updated information from the centralized server, the mobile phone should be able to connect to the Internet. Most Android devices can access the Internet using Wi-Fi and packet data technologies.
Normally, setting up Internet access over Wi-Fi is done only once per access point, and for packet data, only once per SIM card. The system, for security reasons, does not allow any automatic connection to open Wi-Fi access points. Moreover, depending on the service provider to which the mobile phone is subscribed, the subscriber may be charged additional fees for packet data.
5.3.1.1 Sensors
The GPS sensor and the accelerometer are the two sensors that the system makes use of. The GPS sensor should connect to the maximum number of GPS satellites and remain connected to them in order to receive real-time information about the current location while moving. It is to be noted that the more satellites the sensor connects to, the more accurate the current location is. In order to receive more accurate GPS data, two options can be enabled in the Android Settings: 'Use Wireless Network' and 'Use GPS Satellites'. A sample screen capture of the Android Settings is given in figure 5.1.
Figure 5.1 Screenshot of ‘Location and security’ settings on Android 2.3
The accelerometer sensor will be used to get the acceleration with which the mobile device is moving. From this information, the number of shakes can be determined.
5.3.1.2 Other Hardware
The solution has been designed so as to consume resources in an intelligent and efficient
way. The table below shows the minimum requirements of a device running this application.
Hardware Description
Processor Single core processor.
Memory 256 MB for smartphone and 25 MB for the application.
Display 256K Colors with size 240x320 pixels.
Vibrator Good enough to sense it in a pocket.
Connectivity • Bluetooth (if opting for Bluetooth GPS).
• Wi-Fi 802.11 b/g should be fast enough; there is no need for 802.11n access points.
• EDGE should be fast enough.
Battery 1200 mAh for up to 6 hours of navigation (depends on the device).
Table 5.3 Guidance System Application Minimum Requirements
5.3.2 Software Tools
In this section, the software tools that have been used in this project are described.
5.3.2.1 Google Drive Cloud Storage
Google Drive is used to store files such as logs and KML files in a shared folder. The
files are meant for the mobile application and can be accessed via the URL of the shared folder
whether it is on the phone or a computer. The application downloads the files on the internal
storage of the mobile device. The Google Drive folder has been set to ‘Public on the web’ so that
anyone on the Internet can find and view the contents of the folder. Figure 5.2 shows a screenshot
of the public Google Drive folder for VIS App.
Figure 5.2 Screenshot of the public Google Drive folder VIS App
5.3.2.2 Google Earth
Mainly used by the administrator, Google Earth helps in creating a customized map and adding information, such as the opening and closing hours of interesting places, to the map. If several administrators maintain the map, each can create a folder representing a region. Each administrator works on a specific region and uploads it to the Google Drive Cloud Storage.
5.3.2.3 GPS Status & Toolbox
GPS Status is an Android application available for free on the Play Store (https://play.google.com/store/apps/details?id=com.eclipsim.gpsstatus2). The application enables the user to download GPS assistance data that helps in connecting to GPS satellites as fast as possible. An Internet connection is only needed for the duration of the download, which is usually less than one second. This application can be installed alongside the proposed system. Note, however, that a non-visually-impaired person must download the assistance data, since there is no feature that enables the blind person to do so. Figure 5.3 below shows a screenshot of the GPS Status application.
Figure 5.3 Screen Capture of GPS Status & Toolbox Android Application
5.4 Implementation of modules
This section provides code snippets of the important functions in the system.
5.4.1 Safest Path Module
Given the locations of the starting position and the destination, the getReverseSafestPath function returns an array of node locations. To get the safest path, the array is read from end to start; to travel along the safest path, one has to pass through the nodes in the order the array is read.
At lines 40-43 in figure 5.4, the sum of all the weightages is calculated. The result
represents the unlimited weightage that Dijkstra’s algorithm needs at the beginning. At lines 45-
49 in the same figure, the information about all nodes is stored in a HashMap ‘safestEdges’. The
location of the node is the key and NodeEntry is the value. Each NodeEntry contains the weightage
from the starting position to another node. A NodeEntry also contains the location of the node
which must be traversed to reach the destination following the safest path (called ‘safestNode’)
Figure 5.4 Beginning of getReverseSafestPath function
At line 54 in figure 5.5, the location of the 'from' endpoint of an edge is checked against
the starting position. If they match, there is a path from the starting position to the 'to' endpoint
of the edge. If the weightage of that edge is less than the one stored in the NodeEntry, a safer
path has been found, and the weightage is updated to the lesser value. These steps are performed
for all the edges.
Figure 5.5 Part updating weightage in getReverseSafestPath function
At line 84 in figure 5.6, the NodeEntry with the least weightage that has not yet been
visited is fetched from the HashMap. It is the next entry to be checked in order to update the
weightages and consider the safest path to the node (variable 'nextLabel') contained in the
NodeEntry.
Figure 5.6 Part fetching next node to update to least weightage
At line 105 in figure 5.7, while looping through the edges, if either endpoint of an edge is
'nextLabel', the weightage of the edge is updated to the least weightage. The 'safestNode' is
updated with 'nextLabel', as it is the node with the least weightage.
Figure 5.7 Part setting least weightage
At line 128 in figure 5.8, the previous node ('safestNode') that must be traversed to reach
the destination is fetched. The previous node of that safestNode is fetched next, and so on until
the start node is reached. All the nodes that must be traversed to follow the safest path are stored
in an array, but in reverse order. The array is then returned by the function.
Figure 5.8 Part reverse safest path at end of function getReverseSafestPath
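The steps walked through in figures 5.4 to 5.8 can be sketched in plain Java. The snippet below is an illustrative reconstruction, not the project's actual code: the Edge class, the node labels and the method shape are assumptions. It follows the same idea, using the sum of all weightages as the "unlimited" starting weight, repeatedly fetching the least-weight unvisited node, and walking the predecessors ('safestNode') so that the path comes out in reverse order.

```java
import java.util.*;

public class SafestPath {
    /** Hypothetical stand-in for the project's edge type. */
    public static class Edge {
        final String from, to; final double weight;
        public Edge(String from, String to, double weight) {
            this.from = from; this.to = to; this.weight = weight;
        }
    }

    public static List<String> getReverseSafestPath(List<Edge> edges, String start, String dest) {
        // The sum of all weightages serves as the "unlimited" initial weight.
        double unlimited = 0;
        for (Edge e : edges) unlimited += e.weight;

        // One entry per node: least weight found so far, and the predecessor
        // ('safestNode') on the safest path from the start.
        Map<String, Double> weight = new HashMap<>();
        Map<String, String> safestNode = new HashMap<>();
        Set<String> nodes = new HashSet<>();
        for (Edge e : edges) { nodes.add(e.from); nodes.add(e.to); }
        for (String n : nodes) weight.put(n, unlimited);
        weight.put(start, 0.0);

        Set<String> visited = new HashSet<>();
        while (visited.size() < nodes.size()) {
            // Fetch the unvisited node with the least weightage so far.
            String next = null;
            for (String n : nodes)
                if (!visited.contains(n) && (next == null || weight.get(n) < weight.get(next)))
                    next = n;
            visited.add(next);
            // Relax every edge touching 'next' (edges treated as undirected).
            for (Edge e : edges) {
                String other = e.from.equals(next) ? e.to : e.to.equals(next) ? e.from : null;
                if (other != null && weight.get(next) + e.weight < weight.get(other)) {
                    weight.put(other, weight.get(next) + e.weight);
                    safestNode.put(other, next);
                }
            }
        }
        // Walk the predecessors from the destination back to the start:
        // the resulting list is the safest path in reverse order.
        List<String> reversed = new ArrayList<>();
        for (String n = dest; n != null; n = safestNode.get(n)) reversed.add(n);
        return reversed;
    }
}
```

Reading the returned list from end to start gives the nodes to traverse, exactly as the text above describes.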
5.4.2 User Heading Module
At lines 22-25, the latitudes and longitudes of two nodes are converted from degrees to
radians, because the heading angle formula requires values in radians. At line 28, the heading
angle formula (see section 4.4.2) is used to calculate the heading angle in radians. Finally, at
line 31, the getHeading function returns the angle in degrees.
Figure 5.9 getHeading Function Code Snippet
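The computation described above can be sketched as follows. Since section 4.4.2 is not reproduced here, the exact formula used by the system is an assumption; the sketch below uses the standard initial-bearing formula between two latitude/longitude points, with the same degrees-to-radians conversion and degrees result.

```java
public class Heading {
    /** Illustrative sketch of a getHeading function: initial bearing from
     *  point 1 to point 2, with inputs in degrees and the result in degrees. */
    public static double getHeading(double lat1Deg, double lon1Deg,
                                    double lat2Deg, double lon2Deg) {
        // Convert degrees to radians, as the trigonometric formula requires.
        double lat1 = Math.toRadians(lat1Deg);
        double lat2 = Math.toRadians(lat2Deg);
        double dLon = Math.toRadians(lon2Deg - lon1Deg);
        // Initial bearing, in radians.
        double y = Math.sin(dLon) * Math.cos(lat2);
        double x = Math.cos(lat1) * Math.sin(lat2)
                 - Math.sin(lat1) * Math.cos(lat2) * Math.cos(dLon);
        double bearing = Math.atan2(y, x);
        // Return the angle in degrees, normalized to [0, 360).
        return (Math.toDegrees(bearing) + 360) % 360;
    }
}
```

For example, a node due east of the user yields a heading of 90 degrees and a node due north yields 0 degrees.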
5.4.3 Timer Alarm Module
On creating a new instance of ScheduleAlarm, the function 'checkNextSchedule' (see
section 4.4.6) is called. It assigns the variable 'scheduledTime', the time at which the time and
the weather will be voiced out. For this purpose, at line 34, a receiver is prepared that calls the
necessary voice-out functions.
Afterwards, at line 37, an alarm manager is created, and at line 40 it is scheduled for the
time set in 'scheduledTime'. When 'scheduledTime' is reached, the onReceive function is
executed. An extract of the onReceive function is shown in the figure below.
Figure 5.10 ScheduleAlarm Constructor Code Snippet
At line 43 in figure 5.11, a tone alert is played to notify the user. At line 45, the current
time is voiced out. At lines 48-49, a WeatherReader is created that reads the weather codes (see
section 4.4.4) from the weather XML, and the function 'getWeatherCondition' calls the voice-out
function to voice out the appropriate weather condition.
Figure 5.11 Part of onReceive function in class AlarmReceiver Code Snippet
5.4.4 Navigation Module
At line 245 in figure 5.12, the array 'arrayObstacles' is looped over, and at line 246 the if
statement compares the distance between the current obstacle and the device with the specified
threshold of three meters. If true is returned, the device vibrates and the name of the obstacle is
voiced out. At lines 256-263, the same is done, except that the device does not vibrate.
Figure 5.12 Map Items Detection Code Snippet
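The detection loop described above can be sketched in plain Java. The MapItem class, the precomputed distances and the returned action list are assumptions made so that the behaviour is observable; the vibrate-before-voice ordering follows the amendment recorded later in Table 6.2.

```java
import java.util.*;

public class ItemDetector {
    /** Hypothetical stand-in for the project's map item type; the distance to
     *  the device is precomputed here for simplicity. */
    public static class MapItem {
        final String name; final double distanceMeters;
        public MapItem(String name, double distanceMeters) {
            this.name = name; this.distanceMeters = distanceMeters;
        }
    }

    static final double THRESHOLD_METERS = 3.0;

    /** For each obstacle within the three-meter threshold: vibrate first,
     *  then voice out its name. Returns the actions taken, in order. */
    public static List<String> detect(List<MapItem> obstacles) {
        List<String> actions = new ArrayList<>();
        for (MapItem o : obstacles) {
            if (o.distanceMeters <= THRESHOLD_METERS) {
                actions.add("vibrate");
                actions.add("voice:" + o.name);
            }
        }
        return actions;
    }
}
```

The same loop without the "vibrate" action corresponds to the handling of places at lines 256-263.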
5.4.5 Shake-to-Respond Module
At line 58 in figure 5.13, if the acceleration on any of the three axes is greater than the
specified value (in this case 15, which is greater than gravity on Earth), a shake is recorded on
the device (line 77). At line 66, an alert sound is played so that the user knows when a shake has
been recorded.
Figure 5.13 Function onSensorChanged Code Snippet
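The threshold check at the heart of onSensorChanged can be sketched as below. The class and method names are hypothetical; the logic is simply the per-axis comparison described above, with a threshold of 15 m/s² chosen because it exceeds Earth's gravity (about 9.8 m/s²), so ordinary handling of the phone does not register as a shake.

```java
public class ShakeDetector {
    // Threshold greater than gravity on Earth, so only deliberate shakes trigger.
    static final float THRESHOLD = 15f;

    /** Returns true when the acceleration on any of the three axes
     *  exceeds the threshold, i.e. a shake should be recorded. */
    public static boolean isShake(float x, float y, float z) {
        return Math.abs(x) > THRESHOLD || Math.abs(y) > THRESHOLD || Math.abs(z) > THRESHOLD;
    }
}
```

A device lying still reports roughly gravity on one axis, which stays below the threshold.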
5.4.6 Voice Annotation Module
At lines 23-35 in figure 5.14, the parameters for the voice recorder are set. At line 45, the
recorder starts recording; at line 48, it is automatically stopped by a timer, because the maximum
recording time is eight seconds.
Figure 5.14 Function record from class VoiceRecorder Code Snippet
5.4.7 Re-routing Module
At line 281 in figure 5.15, the distance from the current location to the next node on the
way to the user's destination is calculated ('currentDistance'). The distance from the previous
location to the next node has been calculated before ('previousDistance'). At line 282, an if
statement checks whether 'currentDistance' is greater than 'previousDistance'; if so, the user is
moving away from the next node, and at line 283 the safest path is recalculated from the node
nearest to his/her current location to the user's destination. At line 285, 'currentDistance' is
assigned to 'previousDistance', which is used to recheck whether re-routing is needed. All the
above steps are performed on each location change.
Figure 5.15 Re-routing Code Snippet
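The re-routing decision described above can be isolated into a small sketch. The class below is hypothetical and models only the decision, not the path recalculation itself: on each location update, the current distance to the next node is compared with the previously recorded distance, and growth means the user is moving away.

```java
public class Rerouter {
    // Distance to the next node recorded at the previous location update.
    private double previousDistance = Double.MAX_VALUE;

    /** Called on each location change with the freshly calculated distance
     *  to the next node. Returns true when the safest path must be
     *  recalculated, i.e. the user is moving away from the next node. */
    public boolean onLocationChanged(double currentDistance) {
        boolean reroute = currentDistance > previousDistance;
        // Remember the current distance for the next check.
        previousDistance = currentDistance;
        return reroute;
    }
}
```

A user closing in on the next node (shrinking distances) never triggers re-routing; the first increase does.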
5.4.8 Graphical User Interface Module
A non-visually impaired person can use the graphical interface below to set the guidance
system to 'Free Walk' mode or to choose a destination for the blind person. It is to be noted that
the visually impaired person may afterwards still choose his/her own destination by shaking.
Figure 5.16 shows the graphical interface after pressing the menu button on the mobile phone.
Figure 5.16 Graphical Interface after pressing the Menu button on mobile device
Note: The 'Free Walk' option is disabled because, on launch, the system is already set to that
option. The option is enabled when the system is in 'Choose Destination' mode.
Figure 5.17 Graphical Interface after tapping on ‘Choose Destination’ option
A list of categories of places is displayed and after the user has chosen a category, the list of places
of the selected category appears as shown below.
Figure 5.18 Graphical Interface after tapping on category ‘Manze ek Boir’
5.5 Module Context-triggered Action
Most of the modules implemented in the previous section are dynamic: context-aware
information is used by the modules and an action is triggered. The table below shows each
module, the context-awareness information used and the action triggered.
Module | Context Awareness Used | Triggered Action
Safest Path | Obstacles; current GPS position; destination GPS position; weather information; current time | The safest path is built.
User Heading Module | Movement of the user by GPS coordinates | Sets the moving direction and angle of the user.
Timer Alarm Module | Thirty-minute timer; current time | Voices out the current time, weather information and street name.
Navigation | Current GPS position | Voices out surrounding objects such as obstacles, interesting places and voice annotations.
Shake-to-Respond | A shake action of the user | Selects an option based on the number of shakes.
Volume Button Respond | A press action of the user | Selects an option based on the number of presses.
Voice Annotation | Current GPS position | Saves the current GPS position to an XML file and starts voice recording.
Emergency Mode | Battery life information; current GPS location | Urges the user to go to the initial position; (if the user chooses) a new path is built.
Table 5.4 Module Context-triggered Action
5.6 Difficulties Faced
Most of the time, after updating Android Studio, the IDE used to implement the system,
compatibility errors appeared. Investigations were carried out to resolve the issues; one solution
was to use older versions of libraries.
Sometimes, exceptions of type NullPointerException were thrown because variables
holding null values were accessed. To resolve this, either a try-catch statement was used to catch
the NullPointerException, or the variables were checked for null before being used.
Once, when voicing out nearby map items, the voices overlapped because several map
items were detected and voiced out at the same time. To resolve this problem, just after calling
the voice-out function, a sleep function is called for the duration of the voice-out.
Finally, another issue concerned the Android OS versions of the two devices being used.
After a module had been successfully tested on one device, it did not run properly on the other.
After investigation and research, it was found that version checks can be used to restrict access
to certain functions depending on the Android OS version.
Chapter 6 Integration and Testing
This chapter consists of the integration of the modules described in the previous chapter
and the testing of the overall system.
6.1 Unit Testing
After a module is implemented, it is first tested individually and then tested together with
the other modules; the latter is called integration testing (see section 6.2). The important modules
are shown in the table below.
Module | Description
Activity | Contains code to be run on launch and other components for user interaction
Voice | To play and record audio
Vibrator | To activate the vibrator of the mobile device
Shaker | To trigger operations through shakes
Downloader | To download and update KML and log files
KMLFile | To read information on obstacles and places from KML files
Graph | To construct the map in the form of a graph with obstacles, places, nodes and edges
MapItem | Contains classes to create place and obstacle objects
Navigation | To determine the direction in which the user is going, and the safest path from the current location to a particular destination
Schedule | To voice out time and weather at regular intervals
Battery | To determine the battery level and alert when the battery is low
Table 6.1 Important modules of the project

Note: Testing the Shaker module while the user is walking has also been successful.
6.2 Integration Testing
After unit testing each module separately, the modules are integrated one by one and
tested. The table below shows each integration stage, together with the issues found and the
amendments made.
Stage | Issues | Amendments
Activity + Downloader | Application crashes, because both the user interface (for visually showing download progress on screen) and the download processing try to run at the same time. | Acting like a thread, an AsyncTask downloads the files while the main thread gradually increments the progress bar.
KMLFile | NIL | NIL
Voice | NIL | NIL
Vibrator | Vibrates only after completely voicing out the name of the nearby obstacle; thus, the device vibrates too late. | The function to vibrate is called before the function to voice out.
Shaker | NIL | NIL
Graph + MapItem | NIL | NIL
Navigation | NIL | NIL
Battery | NIL | NIL
Schedule | Alarm continues to run even if the application has been closed. | The intent is killed after the application is closed.
Table 6.2 Stages of integration testing
6.3 System Testing
In this section, various aspects of the system, such as accuracy and performance, are
tested. For each aspect, the value of a variable is changed and the results obtained are used to
draw conclusions.
6.3.1 Accuracy Testing
The GPS Status application (see section 5.3.2.3) can be used to obtain the number of
satellites connected/fixed and the error margin at the same time. The graph below shows the
number of satellites connected/fixed plotted against the error (in meters). The value of the error
is defined as "the radius of 68% confidence. In other words, if you draw a circle centered at this
location's latitude and longitude, and with a radius equal to the accuracy, then there is a 68%
probability that the true location is inside the circle" (Android Developers 2013).
Figure 6.1 Number of satellites fixed v/s Error graph (the trend is shown in dotted)
The graph is not completely linear; a possible explanation is that some satellites provide
more accurate GPS data than others.
When the GPS device was connected to eight satellites, the GPS coordinates obtained
from the device were as shown on the map below. The green line is the real path of the user and
the red line is the path formed by the GPS coordinates obtained from the GPS sensor.
Figure 6.2 Real Path v/s Recorded path by device
Some accurate GPS coordinates were obtained by the device. However, large gaps
between the red and green lines represent inaccurate readings, caused by the device losing its
connection to some satellites. This can be explained by interference due to weather conditions
or objects obstructing the signal received from the satellites.
6.3.2 Performance Testing
In performance testing, the amount of work done during a period of time is assessed.
6.3.2.1 Downloading Time v/s File Size
The downloading time is measured for different file sizes. A graph of the downloading
time of the KML file (milliseconds) plotted against the size of the KML file (kilobytes) is shown
in figure 6.3.
Figure 6.3 Downloading time v/s File Size (the trend is shown in dotted)
Some files took more time than the trend suggests. This may be because the server was
receiving more requests or because less bandwidth was available than usual.
6.3.2.2 Items voiced out
At each GPS location, nearby items/objects (if any) are voiced out. The path recorded by
the device is plotted on a map (Google Maps) and each point has been numbered.
Figure 6.4 Map with path travelled and map items
The device was taken from location 1 to location 8 and the voice output obtained is shown
in the table below. Some items are voiced out repeatedly because the voice-out timeout occurs:
if, after three minutes, the user is still within range of the GPS coordinates of the
building/obstacle, the item is voiced out again.
Point | Order in which map items voiced out
1 | Pharmacy Ideale
2 | Pharmacy Ideale
3 | Terrain Glissant
4 | Terrain Glissant, Chez Marguerita
5 | Chez Marguerita
6 | Cafetino
7 | NIL
8 | Bus Stop
Table 6.3 Map items voiced out at different locations
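The repeat behaviour described above, where an item still in range is voiced out again only after a three-minute timeout, can be sketched as follows. The class is hypothetical (the project's actual bookkeeping is not shown in the text); times are passed in explicitly so the logic is deterministic.

```java
import java.util.*;

public class VoiceOutTimeout {
    // Three-minute timeout before the same item is voiced out again.
    static final long TIMEOUT_MS = 3 * 60 * 1000;
    private final Map<String, Long> lastVoiced = new HashMap<>();

    /** Returns true if the item should be voiced out at time nowMillis,
     *  i.e. it has never been voiced, or the timeout has elapsed. */
    public boolean shouldVoiceOut(String item, long nowMillis) {
        Long last = lastVoiced.get(item);
        if (last == null || nowMillis - last >= TIMEOUT_MS) {
            lastVoiced.put(item, nowMillis);
            return true;
        }
        return false;
    }
}
```

This reproduces the repetition seen in Table 6.3: an item encountered again within three minutes stays silent, while a later encounter voices it again.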
Obstacles are voiced out first even if a place is nearer to the device, and map items on the
other side of the road are not voiced out.
6.3.2.3 Re-routing
If the user is going in the wrong direction, the safest path is recalculated from his/her
current location to his/her destination. The paths are plotted on the map below: the initial safest
path is in blue, the path taken by the user is in red and the re-routed path is in black.
Figure 6.5 Map showing re-routing
After recalculating the safest path from the user's current location, the device voices out
instructions to go to the next node (labelled 'Next Node to go' on the map). Afterwards, the voice
instructions guide the user from the next node to his/her destination. The module works as
expected.
6.3.2.4 Loading Time v/s File Size
The time needed to load data from a file into RAM is assessed. The graph below shows
the time taken to load data plotted against the size of the file (kilobytes).
Figure 6.6 Loading time into memory v/s File Size
The smallest file size is around 1.75 KB, because every file contains metadata, that is,
basic information used by Google Earth.
6.3.3 Stress Testing
Stress testing involves checking the stability of the system when maximum resources,
such as memory, are used. The following code was used to load around four thousand places
and obstacles.
Figure 6.7 Loading maximum map items in RAM Code Snippet
Around two hundred thousand places and obstacles could be loaded into the system's
memory before the application crashed. It is to be noted that the system has been designed so
that the minimum number of items is loaded in memory at any particular instant. The exception
(in red) is shown below.
Figure 6.8 OutOfMemory Exception
6.3.4 Shake Sensitivity
To assess the shake sensitivity, the tests were based on one hundred attempts and involved
different types of persons. The maximum number of shakes was set to four. The test scenario
was as follows:
1. The person chose the operation he/she wanted.
2. The person had to shake the number of times needed to execute the operation.
3. For each attempt, the number of successful attempts was recorded.
It is to be noted that, since the user may at some point feel tired of continuously shaking
during the test, he/she was given the liberty to decide when to take a break. On average, a person
took a rest after approximately 10 attempts. The table below shows the diverse capabilities of
each person.
Person's name | Knowledge of shaker algorithm | Experience in using shaker | Training
Kishan Bhugul | Deep | Much | While debugging
David Young Ten | Fair | Little | Trained by Kishan Bhugul
Ramlo Appadoo (visually impaired) | Vague | Never used | Trained by David Young Ten
Anonymous (non-visually impaired third party not involved in the development of the project) | Vague | Never used | Trained by David Young Ten
Table 6.3 Diverse capabilities of each person involved in the test
For each person, the number of successful attempts has been plotted against the attempt
number. The trend line represents the ideal scenario in which every attempt is successful.
Figure 6.9 Successful Attempts v/s Attempt Number
From the graph, it can be inferred that this module is sensitive enough to detect users'
shakes. It can also be noticed that the results for the non-visually impaired person closely track
the results for the visually impaired person. The shaker has been tested in several situations:
• While walking
• While sitting
• While standing
• While being knocked by a nearby person
• Using different hands
• By handling the device in different ways
6.4 User Acceptance Test
In this section, the system is tested by the user. Mr. Ramlo Appadoo, a visually impaired
person, tested the system and the outcome of the test is discussed. The photo below shows
evidence of the user acceptance test.
Figure 6.10 Evidence of user acceptance test with Mr. R. Appadoo
6.4.1 Shake-to-Respond
Since Mr. Ramlo Appadoo had never used the shake-to-respond feature, he first
underwent training by one of the members of the project, David Young Ten. The training session
consisted of holding his hand and simulating the shake movements, since the movements could
not be explained in any other way. After completing the test, Mr. Ramlo Appadoo gave his
opinions, which are listed below.
• When choosing option n, it would be better if the device vibrated n times.
• Instead of using the same alert sound after choosing an option, different alert
sounds would be better for different options.
• It would be better if the timeout between shakes were four seconds instead of
three.
• The overall shaking experience is very good.
6.4.2 Voice Feedback Mechanism
Mr. Ramlo Appadoo listened to the voice instructions of the system and gave the
following feedback.
• It would be better if different types of obstacles were voiced out by different
persons, to act as checkpoints.
• Distances could be given in meters instead of steps, as blind persons have a good
notion of distance.
• It would be better to use other wordings for some voice instructions. The table
below shows the actual wordings and the wordings desired by Mr. Appadoo.
Meaning | Actual Wordings | Desired Wordings
Low battery life | Batterie faible | Mettre battery charger
Continue on your way | Contign droite | Ale droite
The destination will be on your right | Destination pu lor droit | Destination pu a coté
Cross the road | Saute Crosir | Traverse Crosir
The place of interest is open | Asterla li ouvert | Li ouvert
The place of interest is closed | Asterla li fermer | Li fermer
Table 6.4 Actual wordings and desired wordings by Mr. Appadoo

6.4.3 Obstacle Detection Mechanism
Mr. Ramlo Appadoo also tested the obstacle detection mechanism and gave the following
comment.
• The system gives useful information about obstacles, which is beneficial for a
visually impaired person.

6.4.4 Voice Recording Mechanism
Mr. Ramlo Appadoo tried to annotate an obstacle that was not present in the system. He
said that eight seconds is enough, but some persons may need between ten and twelve seconds,
because a user may want to record additional information about the obstacle or place.
6.5 Debugging
During the development of the system, several bugs were found. Some of them are shown
in the table below.
Issue ID | Module | Description | Remarks
1 | Downloader | Files not being downloaded | Resolved by specifying the https protocol in the download URL
2 | Voice | Recording time was halved | Resolved by setting the audio encoder to narrow band instead of wide band
3 | Vibrator | Application crashes | Resolved by adding a permission in AndroidManifest.xml for the application to gain access to the vibrator
4 | Shake-to-Respond | Application crashes | Resolved by putting the accelerometer sensor to sleep while voicing out instructions
6 | Read Weather Information | No data was obtained | Resolved by using the proper path to the XML file
Table 6.5 List of main bugs while testing the application
Chapter 7 Critical Appraisal and Future Works
This chapter deals with the verification of the requirements of the system, to see whether
they coincide with the actual output. The project's achievements and limitations are discussed;
it is important to appraise the system and the discrepancies between the specification and the
actual achievements. Finally, future works are discussed and explanations are given on how the
system can be further improved.
7.1 Achievements
All requirements have been achieved within the allocated time, fulfilling the aims and
objectives of the project. The initial requirements are listed below and the final features of each
are discussed.
1. The system shall provide information about obstacles in the surrounding.
Context awareness is one of the most important features of this project, and it has been
fully implemented. Time was spent building a special custom map for visually impaired persons,
and an experienced visually impaired person was contacted on several occasions so as to avoid
omitting any important obstacles. While the user is navigating, either in free walk mode or in
chosen destination mode, all obstacles are voiced out along the way. The requirement can
therefore be concluded as completely met.
2. The system shall provide information about buildings in the surroundings.
As for the previous requirement, information about buildings is included in the custom
map. As the user travels near buildings, their names and details, such as whether they are
currently open, are voiced out. Buildings have been classified in such a way as to ease choosing
the appropriate destination. The requirement can therefore be concluded as completely met.
3. The system shall allow the user to add a voice note at his/her current location.
Whenever the user wants to add a note about his/her navigation, he/she can do so by
triggering the option either by shaking or by tapping the volume button. He/she is then asked to
voice out the note. At the same time, the system gets the current GPS location, and as soon as
the user has completed the annotation, the audio is saved to the phone memory and the audio
name and location are saved to an XML file. When the user passes by this location, the
annotation is voiced out. The requirement can therefore be concluded as completely met.
4. The system shall be able to guide the user towards his/her destination.
This feature has been tested and concluded as fully implemented. After the safest path
has been established, precise voice instructions are given to the user through a one-sided
earpiece. As the user navigates, all the required instructions are given depending on the current
GPS location. Special care has been taken in the algorithm to adapt to the user's way of
navigating: if the user decides to change path for some reason, the system automatically adapts
to the change and regenerates another path from the current location. The requirement can
therefore be concluded as completely met.
5. The system shall be able to download maps.
When the application loads, Google Drive, a cloud technology, is used to retrieve the
latest map. A log system has been implemented to avoid unnecessary downloads when no
updated version of the map is available. The requirement can therefore be concluded as
completely met.
6. The system shall allow the maps to be customized.
Google Earth has been adapted to allow the map to be customized. The map is divided
into several regions to allow easy customization, and special marking designs can be used to add
different objects to the map. The requirement can therefore be concluded as completely met.
7. The system shall use voice to inform the user about nearby obstacles and buildings.
The local Creole language has been used to record instructions and information, because
Creole is the preferred language among visually impaired persons. High quality audio has been
recorded to make the instructions and information more understandable. The requirement can
therefore be concluded as completely met.
8. The system shall use haptic feedback to get the user’s attention for obstacles and
before any voice out.
Vibration is used only when needed to capture the user's attention; haptic feedback is
avoided when not necessary in order to improve the battery life of the mobile phone. The
requirement can therefore be concluded as completely met.
9. The system shall voice out context-awareness information every thirty minutes.
Every thirty minutes, the system voices out the current time, the current weather conditions
and the name of the street the user is on. The requirement can therefore be concluded as
completely met.
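The thirty-minute cadence described above can be sketched as a simple timestamp check. This is a minimal illustration under assumed names, not the dissertation's actual implementation; a real Android application would more likely drive the announcements from a scheduled handler or alarm.

```java
// Minimal sketch of the thirty-minute announcement cadence described
// above. The interval comes from the requirement; the class and method
// names are illustrative assumptions.
class ContextAnnouncer {
    static final long INTERVAL_MS = 30L * 60 * 1000; // thirty minutes

    // Initialised so that the very first check is immediately due.
    private long lastAnnouncedMs = -INTERVAL_MS;

    // Returns true when a new announcement (time, weather, street name)
    // should be voiced out at the given clock time in milliseconds.
    public boolean isDue(long nowMs) {
        if (nowMs - lastAnnouncedMs >= INTERVAL_MS) {
            lastAnnouncedMs = nowMs;
            return true;
        }
        return false;
    }
}
```

A periodic timer would call `isDue` with the current clock time and trigger the voice output whenever it returns true.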
10. In case of low battery life, the system shall automatically switch to an emergency
mode.
When the battery level reaches twenty-five percent or less, the system automatically switches to
emergency mode, that is, it uses the least possible number of processes and sensors to lessen battery drain.
The user is urged to return to his/her initial position but can also choose to continue to
his/her destination. The requirement can therefore be concluded as completely met.
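The mode switch can be sketched as a simple threshold rule. The twenty-five-percent threshold is taken from the requirement itself; the class name and mode enum are illustrative assumptions, and shedding the actual processes and sensors would happen elsewhere.

```java
// Sketch of the emergency-mode switch described above. In EMERGENCY
// mode the system would shed non-essential processes and sensors to
// lessen battery drain; names here are assumed for illustration.
class PowerPolicy {
    static final int EMERGENCY_THRESHOLD_PCT = 25; // from the requirement

    enum Mode { NORMAL, EMERGENCY }

    // Decide the operating mode from the current battery percentage.
    public static Mode modeFor(int batteryPct) {
        return batteryPct <= EMERGENCY_THRESHOLD_PCT ? Mode.EMERGENCY : Mode.NORMAL;
    }
}
```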
Guidance System for the Visually Impaired Persons Chapter 7: Critical Appraisal and Future Works _____________________________________________________________________________________
11. The user shall be able to trigger tasks.
Tasks can be triggered in two different ways. The first is shaking: the number
of shakes corresponds to the task number. The second is the volume button: likewise,
the number of presses on the volume button selects the task number. The requirement can therefore be
concluded as completely met.
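The counting behind both triggers can be sketched as follows: events (shakes or button presses) that arrive close together are counted as one sequence, and the count selects the task. The time window and all names are assumptions for illustration, not the dissertation's actual parameters.

```java
import java.util.*;

// Sketch of the trigger counting described above: the number of shakes
// or volume-button presses within a short window selects the task with
// that number. Window length and names are illustrative assumptions.
class TaskTrigger {
    static final long WINDOW_MS = 1500; // events closer than this count together

    private final List<Long> events = new ArrayList<>();

    // Record one shake or volume-button press at the given time (ms),
    // and return the task number selected so far in this sequence.
    public int record(long timeMs) {
        if (!events.isEmpty() && timeMs - events.get(events.size() - 1) > WINDOW_MS) {
            events.clear(); // too long since the last event: start a new count
        }
        events.add(timeMs);
        return events.size();
    }
}
```

In practice the selected task would only be executed once the window has elapsed with no further event, so that three quick shakes trigger task three rather than tasks one, two and three in turn.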
12. The system shall not affect other devices or equipment the visually impaired or blind
person is using for navigation.
The system augments the tools that visually impaired persons already use. It
can be operated with only one hand, leaving the other free for holding and using the white
cane. In addition, only a single-sided earpiece is used, so that the user's sense of hearing remains
available for navigating safely. The requirement can therefore be concluded as completely met.
13. The device where the system resides must be lightweight to the user.
The system has been designed to fit on any modern smartphone, so this feature
depends mainly on the user's smartphone. Modern smartphones, however, are very handy, small
and lightweight. The requirement can therefore be concluded as completely met.
14. The system should provide simple interaction mechanisms for the user to use.
Special care has been taken to make sure that the visually impaired person is comfortable
with the system. This includes fewer confirmations and more shortcuts to tasks. The requirement
can therefore be concluded as completely met.
15. The system should be able to guide the user towards his/her destination along the
safest path.
Algorithms based on precise parameters such as the weather, the day of the
week, the time and the objects present have been implemented to guide the user towards his/her
destination along the safest and shortest path. This is one of the main aspects of the project,
and it has been implemented with all kinds of possibilities taken into consideration. The
requirement can therefore be concluded as completely met.
16. The system should use a combination of Dijkstra's Algorithm and a graph-based
representation for generating the safest path.
Dijkstra's Algorithm is among the best at taking nodes and their weights into
consideration when generating a path. It operates on a graph representation of the nodes,
which has been modified to make more efficient use of resources while
computing the required path. The requirement can therefore be concluded as completely met.
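The core of the computation can be illustrated with a minimal Dijkstra sketch over an adjacency-list graph. The graph, weights and names below are hypothetical stand-ins; the dissertation's version works on its own modified node representation with safety-based edge weights.

```java
import java.util.*;

// Minimal sketch of Dijkstra's algorithm over an adjacency-list graph.
// Edge weights stand in for the safety costs described in the text;
// this is an illustration, not the dissertation's actual code.
class SafestPath {
    // graph.get(u) maps each neighbour v of node u to the weight of edge (u, v).
    // Returns the minimum total cost from src to every node
    // (Integer.MAX_VALUE for unreachable nodes).
    public static int[] dijkstra(List<Map<Integer, Integer>> graph, int src) {
        int n = graph.size();
        int[] dist = new int[n];
        Arrays.fill(dist, Integer.MAX_VALUE);
        dist[src] = 0;
        // Priority queue of {distance, node}, ordered by distance.
        PriorityQueue<int[]> pq = new PriorityQueue<>((x, y) -> Integer.compare(x[0], y[0]));
        pq.add(new int[]{0, src});
        while (!pq.isEmpty()) {
            int[] top = pq.poll();
            int d = top[0], u = top[1];
            if (d > dist[u]) continue; // stale queue entry, already improved
            for (Map.Entry<Integer, Integer> e : graph.get(u).entrySet()) {
                int v = e.getKey(), nd = d + e.getValue();
                if (nd < dist[v]) {
                    dist[v] = nd;
                    pq.add(new int[]{nd, v});
                }
            }
        }
        return dist;
    }
}
```

The lazy-deletion variant shown here (re-inserting nodes instead of decreasing keys) keeps the data structures simple, which suits a memory-constrained smartphone.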
17. The system should be able to notify the user when an obstacle is at a certain distance
depending on the speed at which the user travels.
Based on the speed at which the user is walking, the system automatically calculates the
moment at which to voice out the existence of the obstacle. If, for example, the user is walking
fast, the obstacle is announced earlier than if the user were walking slowly. The requirement
can therefore be concluded as completely met.
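One way to realise this speed-dependent rule is to announce an obstacle when it comes within a warning distance proportional to the walking speed. The lead time and minimum distance below are illustrative constants, not the dissertation's actual parameters.

```java
// Sketch of the speed-dependent warning rule described above: the
// faster the user walks, the greater the distance at which an obstacle
// is voiced out. Constants and names are illustrative assumptions.
class ObstacleWarning {
    static final double LEAD_TIME_S = 5.0;    // seconds of advance warning
    static final double MIN_DISTANCE_M = 3.0; // never announce later than this

    // Distance in metres at which to announce an obstacle, given the
    // user's current walking speed in metres per second.
    public static double warningDistance(double speedMps) {
        return Math.max(MIN_DISTANCE_M, speedMps * LEAD_TIME_S);
    }

    // True when the obstacle is close enough that it should be voiced out.
    public static boolean shouldAnnounce(double distanceToObstacleM, double speedMps) {
        return distanceToObstacleM <= warningDistance(speedMps);
    }
}
```

For instance, at a brisk 1.5 m/s the obstacle is announced 7.5 m away, while a very slow walker still gets the 3 m minimum.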
7.2 Limitations
The main limitation is the accuracy of the GPS receiver in smartphones. Most smartphone
GPS receivers are quite inaccurate and unsuitable for pedestrian navigation, because they
connect to only a limited number of GPS satellites, which is generally not enough to pinpoint
the user's exact current location. Better precision can, however, be obtained
with an external GPS device.
Moreover, as the number of obstacles at a particular location increases, the number of
nodes increases, and so do the resources required. It should be noted that
smartphones have a limited amount of memory and cannot allocate more than the amount defined
by the operating system. To overcome this, either a device with a larger amount of memory should be
used, or an algorithm designed to cater for a large number of obstacles at a particular location.
7.3 Future Works
The navigation and context-awareness fields are vast and still under active research, and
improvements towards even more accurate readings and predictions remain highly desirable. The work
accomplished here is only a glimpse of what can be achieved in navigation for the
visually impaired: however good and reliable a system may be, improvements are always
welcome. Although the time available was short and research in many different fields
was required, the essential features have all been implemented.
7.3.1 Modern Technologies
The system can be implemented alongside modern technologies like Google Glass,
which, for example, has a very accurate GPS receiver that has been optimized through intensive
research for pedestrian navigation.
Devices like Google Glass are also equipped with a high-resolution camera at eye
level, which allows objects to be identified according to their position and location. Image
processing could be used to voice out objects in real time instead of depending entirely on a custom
map.
7.3.2 Intelligent Algorithms
A database can be built from how the user navigates: on which days, at what times, the
places of interest, how obstacles are tackled and which paths are used. Data-mining
techniques can then be applied to this database to build paths from a source A to a destination B
based on the user's previous personal experience.
7.3.3 Routing Algorithms
More research can be carried out on how the safest path is determined. The main
algorithm adapted here is Dijkstra's algorithm. With intensive research and testing, more
appropriate algorithms could be derived that consume less memory and less processing power
while giving better paths.
7.3.4 More Sensors
More sensors, like the proximity sensor or other sensors for detecting nearby objects, can be
used. As mentioned above, computer vision can be used to detect objects, but only at a
particular height; objects at foot level cannot be determined. The white cane could
therefore be equipped with sensors to detect foot-level objects.
Guidance System for the Visually Impaired Persons Chapter 8: Conclusion _____________________________________________________________________________________
Chapter 8 Conclusion
The aim of the project was to design and implement an Android-based guidance system to
assist visually impaired persons. The system is generic and has been applied in the region of Quatre
Bornes, Mauritius.
This was a very research-intensive project. Research had to be carried out in
many fields, including visually impaired persons, context awareness, mobile application
development, human-computer interaction for the visually impaired, navigation techniques, navigation
techniques for the visually impaired and, among others, audio recording and editing techniques.
Many visually impaired persons collaborated on this project. Mr. Ramlo Appadoo,
one of them, was a great collaborator in helping to assess obstacles while
navigating. This enabled each obstacle to be given a dynamic rating, which is essential in
determining the safest path.
Since the system was tested using the mobile phone's integrated GPS receiver, it was
noticed that the readings are not very accurate and that, as the number of satellites decreases, so does
the accuracy of the system. Once the number of fixed satellites drops to four, the accuracy keeps on
decreasing.
The system also underwent several performance tests. One of them examined how the
system responds in real conditions. The device was carried along a path, and the objects voiced out
matched the expected set of objects. At the same time, different parts of this module were tested,
such as calculating the distance between the device and obstacles. All the results obtained were as expected.
Finally, the stress test showed that the system still runs smoothly even when the
application's memory footprint grows to its maximum, that is, no more than two hundred nodes.
The result obtained was also as expected.
Following those test results and the user acceptance test, it can be concluded that the system
is successful. It greatly helps the visually impaired by improving their navigation
experience and safety. In the future, the system may easily be extended to other regions of
Mauritius.
Guidance System for the Visually Impaired Persons References __________________________________________________________________________________
References
ABDELSALAM, H., MOORE S.E. and RAMACHANDRAN B., 2001. Drishti: An Integrated
Navigation System for Visually Impaired and Disabled, Proceedings of the 5th IEEE
International Symposium on Wearable Computers, 8-9 October 2001 Zurich. IEEE, 149-156.
Android Developers, 2013. Location | Android Developers [online]. Available
from: http://developer.android.com/reference/android/location/Location.html#getAccuracy%
28%29 [Accessed 31 January 2014].
ANGIN, P. AND BHARGAVA B.K., 2011. Real-time Mobile-Cloud Computing for Context-
Aware Blind Navigation. International Journal of Next-Generation Computing, 2 (2) 1-13.
BALDAUF, M., DUSTDAR, S. and ROSENBERG F., 2007. A survey on context-aware
systems. Int. J. Ad Hoc and Ubiquitous Computing, 2(4), 263-277.
CARDIN, S., THALMANN D. and VEXO F., 2005. Wearable Obstacle Detection System for
visually impaired People. HAPTEX ’05 Workshop on Haptic and Tactile Perception of
Deformable Objects, 1 December 2005 Hannover, Germany. 50-55.
DEY, A.K. and ABOWD G.D, 1999. Towards a Better Understanding of Context and Context-
Awareness. Proceedings of the 1st international symposium on Handheld and Ubiquitous
Computing. London: Springer-Verlag, 304-307.
DEY, A.K. 1998. Context-Aware Computing: The CyberDesk Project. Spring Symposium on
Intelligent Environments, Palo Alto: AAAI Press., 51-54.
DEY, A.K., ABOWD, G.D. and WOOD, A. 1999. CyberDesk: A Framework for Providing
Self-Integrating Context-Aware Services. Knowledge-Based Systems, 11, 3-13.
DEY, A.K., SALBER, D., FUTAKAWA, M. and ABOWD, G.D. 1999. An Architecture to
Support Context-Aware Computing.
FALLAH, N., APOSTOLOPOULOS I., BEKRIS K. and FOLMER E., 2012. Indoor Human
Navigation Systems - a Survey. Interacting with Computers, Oxford Journals, 25(1), 21-33.
GUIER W.H and WEIFFENBACH G.C, 1998. Genesis of Satellite Navigation. Johns Hopkins
APL Technical Digest, 19(1), 14-17.
KOLEY, S. AND MISHRA R., 2012. Voice operated outdoor navigation system for visually
impaired persons. International Journal of Engineering Trends and Technology, 3 (2) 153-
157.
MEYER, S. and RAKOTONIRAINY A., 2003. A Survey of Research on
Context-Aware Homes, Proceedings of the Australasian information security workshop
conference on ACSW frontiers, 2003 Adelaide. Australian Computer Society, Inc., 159-168.
Modern-Eyes Training Services Ltd, 2006. OUT AND ABOUT WITH VIPS [online].
Available from:
http://community.stroud.gov.uk/_documents/23_Out_and_about_with_VIPS__Visually_Imp
aired_Persons_.pdf [Accessed 11 September 2013].
NEUFELD J., ROBERTS J., WALSH S., SOKOLKSY M., MILSTEIN A. and BOWLING
M., 2008. Autonomous Geocaching: Navigation and Goal Finding in
Outdoor Domains. Proceedings of the 7th international joint conference on Autonomous
agents and multiagent systems. International Foundation for Autonomous Agents and
Multiagent Systems, 47-54.
NICHOLS, A., 1995. Why Use The Long White Cane? [online]. National Federation of the
Blind. Available from:
http://web.archive.org/web/20100330050804/http://www.blind.net/g42w0001.htm [Accessed
10 September 2013].
OJALA, T., 2010. Case studies on context-aware mobile multimedia services. JDIM, 8(1), 4-
15.
PADUA-PEREZ, N. and PUGH, B., 2006. Graphs & Graph Algorithms 2 [online]. Available
from: http://www.cs.umd.edu/class/spring2006/cmsc132/Slides/lec30.ppt [Accessed on 3rd
November 2013].
PHITHAKKITNUKOON S. and DANTU R., 2010. ContextAlert: context-aware alert mode
for a mobile phone. International Journal of Pervasive Computing and Communications, 6
(3), 311-332.
R & R Associates, 2012. Guide Dog Harness [online]. Available from: [Accessed on 17th
December 2013].
SALBER, D., DEY A.K. and ABOWD, G.D. 1998. Ubiquitous Computing: Defining an HCI
Research Agenda for an Emerging Interaction Paradigm.
SCHILIT, B. and THEIMER, M., 1994. Disseminating Active Map Information to Mobile
Hosts. IEEE Network, 8(5) 22-32.
THE ROBOTICS INSTITUTE, 2008. BlindAid: An Electronic Travel Aid for the Blind.
Pittsburgh, PA: The Robotics Institute, (CMU-RI-TR-07-39).
TIWARI R., SHUKLA A. and KALA R., 2013. Graph Based Path Planning. Intelligent
Planning for Mobile Robotics: Algorithmic Approaches. 26-53.
Guidance System for the Visually Impaired Persons Appendix 1 _____________________________________________________________________________________
Appendix 1
Interview with a Blind Person
Below is an extract of the interview with Mr. Ramlo Appadoo. Kishan Bhugul was the
interviewer.
Can you tell me about yourself?
My name is Ramlo Appadoo and I have two children: one son and one daughter. I am 42
years old and I am currently the President of ‘Association Vivre Debout’. In addition to that, I am
a member of ‘Rainbow Foundation’. I travel by myself everyday.
Can you tell me the technology that you use the most in your everyday life?
One of the technologies that I usually use is JAWS. JAWS uses the concept of a screen
reader. For example, if I am typing a letter using the keyboard of my computer, the characters that
I am typing are read aloud. This allows me to know if I have made any typing errors. JAWS is
also included in my mobile phone so that I can easily make calls and know who is calling me.
What are the other technologies or systems that you know about?
In America, there is a sort of box that consists of a GPS which can detect around two
thousand types of obstacles. Thus, wherever I go, it detects any obstacle recognized by the box.
In France, there is an electronic white cane that consists of one thousand alarms and a vibrator.
The sensors are included in the interior of the white cane. Moreover, there is a little device that
can be attached to clothes. This device voices out the color of the clothes. In Japan and Europe,
there are special tiles. When stepping on them, they let me know where I am located.
How is the white cane different from a baton? And why do you not use the latter?
A white cane consists of three components: the handle, the stem and the tip. The handle is
made of rubber and compared to a baton, it will not easily slip from my hand. The tip is made of
plastic and is three inches long. The stem is one meter long and if ever the tip hits a high voltage
object, I am protected by the materials of the white cane.
Can you describe the life of a blind person?
In the past, unfortunately, a blind person could not travel from one place to another due to the lack
of facilities. Since 1980, there has been a big change. Blind persons are now able to travel easily and
more and more people are willing to help them. The government has also made some effort in
constructing adapted infrastructure. Education is free for blind persons and in exams we are
given extra time so that we can read the paper in Braille. In some cases, the characters of the exam
papers are enlarged and special lamp tables are provided to visually impaired persons.
What do you think is the proper age for a person to start travelling alone?
Long ago, parents did not let their children travel by themselves outdoors. This is because
people sometimes made fun of them. Nowadays, people's mindsets have changed.
Handicapped persons are protected by laws set in 1981. There is also the ‘White Cane Safety Day’
and ‘International Day of Persons with disabilities’. To answer your question, it depends both on
the parents and the child whether the latter can travel by himself/herself. But mostly, it depends
on the parents who must not consider the child as a burden.
What are the criteria to be considered to know if a child is ready or not, to travel by
himself/herself?
Firstly, the child must know the orientation in which he/she is travelling. Secondly, he/she
must be strong psychologically. Thirdly, his/her hearing and sense of touch must be good enough
to know about his/her surroundings.
How do the parents help to achieve these criteria?
First of all, the mother is the first person to teach her child how to travel by himself/herself.
For instance, the child will listen to all kinds of sounds and the mother will give him/her more
information about them. The most important thing is that the mother has to be ready before the
child. She must be able to accept the child as he/she is. If the mother refuses to accept that her
child is handicapped, then the child will feel her frustration and will not be able to travel by
himself/herself.
Can you tell me how the mother teaches the child to travel by himself/herself?
At the beginning, the child will try by himself/herself and then will fall down. The mother
will help him/her to get back on his/her feet and guide him/her. This process goes on and on.
For example, when the child is on the streets, what are the things that he/she must
touch?
No, he/she must not touch anything when he/she is on the streets but must have
good hearing, which acts as his/her checkpoints. He/she must know the locations of ramps, canals and other
obstacles. But before that, he/she must know how to move from one place to another in his/her
house. At first it will be difficult, but the parents must advise him/her. Yet there must be some
restrictions on the kind of help given by the parents. Help must be limited so that the child does
not always depend on the parents.
How will a blind person know when to cross the road?
He/she will not know when to cross and must know how to ask people politely for help. A person
from the public will help the blind person to cross the road or tell him/her when the traffic lights
turn green.
What are the difficulties that you faced while travelling?
In Mauritius, there is a problem of accessibility. The pavements, bus stops, bus stations and
hospitals are not well structured for blind persons. Sometimes there are holes and the locations of
bus stops often change. There are also moving obstacles such as vehicles.
What are the places that you often go by yourself?
I usually go to supermarkets and to other places that I am used to. If I am not used to a
place, then I am accompanied by someone who guides me. In some countries, purchasers can check
out the articles they buy in a supermarket themselves, using technologies like the barcode reader.
The bill is given and one pays the required amount.
What kind of help do you get from the public while travelling alone?
At bus stops, bus conductors tell me the destination of the bus. In the bus, they let me know when
the destination has been reached. Some bus conductors are really nice and without them, blind
persons would not be able to travel. On the other hand, some bus conductors
will not say anything.
Can you tell me a scenario when you are travelling on your own?
If I am travelling to a particular area for the first time, I will be nervous. Therefore I will try to
look for checkpoints such as ramps and stairs. If someone calls me from behind, I will not turn
around immediately because if I do and there is an obstacle behind me, I might trip and fall. So,
first, I will raise my hand to let the person know that I am coming then I will turn around cautiously.
Would the use of an earphone while travelling be a problem for you?
No, it will not be a problem as I usually use earphones to listen to news on the radio while
travelling.
Do you use guide dogs?
No, in Mauritius guide dogs cannot be used as an aid. This is because there are too many stray
dogs in Mauritius and these dogs may distract the guide dog. However, in 1980, there was a guide
dog, and its owner was the director of a school. If a person has a car at his/her disposal and
is going to a specific place, the guide dog can be used as an aid, but I cannot guarantee that
other dogs will not attack the guide dog.
Lessons Learnt From a Blind Person
During meetings with Mr. Ramlo Appadoo, many lessons were learnt from a blind person's
perspective. Below is a list of the lessons.
1. Use of guide dogs is difficult in Mauritius.
2. A blind person uses checkpoints such as broken pavements and ramps in order to know
his/her location.
3. When it rains, the checkpoints on the ground are covered with water and the white
cane cannot be used properly.
4. Voice out systems are considered the best interaction systems for visually impaired
persons.
5. Visually impaired persons must not touch objects in their surroundings but should rather
use their white canes to identify them, because an object may be an electrical
component.
6. Markets are dangerous places as there are vegetables and other items on the floor;
anyone, whether visually impaired or not, can slip on them.
7. Some vehicles are parked on pavements and at bus stops, which makes it more
difficult to pass while travelling.