
Open Standard for Audio-based Wayfinding

Version: Candidate Recommendation 1.0 Published: 15th December 2016


Release notes

Following is a list of the sections that changed in Candidate Recommendation 1.0:

A change that applies to various sections is that the verb “walk” in the examples of audio instructions has been replaced with the word “move”, because some users may not be on foot when using a Wayfindr-enabled app.

1.0 Why an Open Standard?

New text was added to better explain why the Open Standard is needed.

1.2. How are the guidelines informed

The list of contributors changed as the Working Group that reviewed and proposed the Candidate Recommendation consisted of different people and organisations.

1.4. What is a Candidate Recommendation

Description text changed to reflect the Candidate Recommendation document.

2.0 Purpose of this section

More explanation was added about how a vision impaired person interprets their environment.

3.1.1 Involve users in the process

New text was added to list the different levels of visual impairment that participants should have when engaging them for research trials.

3.1.5 Provide reassuring information

New text was added to highlight the need for distance information in large open areas.

3.1.7 Provide different techniques for angular directions

The word “diagonal” was replaced by “angular” in the heading to better communicate the term.

3.1.8 Provide auditory cues

Windows Narrator was added as an example of accessible use of a modern smartphone.

4.1.4.0 Escalators - Overview

Text was added recommending that passengers be given a reminder of which side of the escalator they need to stand on.


4.1.7.0 Ticket Control - Overview

Text added to mention that accessible gates might work differently because they have often been designed for wheelchair users.

s4.1.8.1 Determining position in relation to the platform length

Added this section to explain the ‘acoustic properties’ of train stations - may need to be removed.

4.3.1.1 Enabling replay of previous audio instruction

Added text recommending that ‘reassuring messages’ not be replayed.

4.3.1.2 Enabling dictation for searching

Added text about dictation potentially being hampered by acoustics in train stations.

s4.3.1.5 Enabling users to choose voice type

Removed because the Operating System of the smartphone device determines what voice type someone can use.

5.1.3.1 Proximity-based approach

Added two new disadvantages of the proximity-based approach to the list.

5.1.4.2 Eddystone

Text was modified to better explain what the Eddystone-EID is.

5.1.6 References

Three Wikipedia references were removed as non-authoritative sources.


Table of Contents

1. Getting started
    1.0. Why an Open Standard?
    1.1. What types of content make up the Open Standard?
    1.2. How are the guidelines informed
    1.3. How to use the Open Standard
    1.4. What is a Candidate Recommendation?
2. Learning about mobility of vision impaired people
    2.0. Purpose of this section
    2.1. Some facts about vision impairment
    2.2. Primary mobility aids
    2.3. Orientation and Mobility (O&M) training
    2.4. Landmarks and clues
    2.5. The safest, not the fastest or shortest route
    2.6. User preference for discreet technology
    2.7. References
3. Designing for vision impaired people
    3.0. Purpose of this section
    3.1. Design principles
        3.1.0. Overview
        3.1.1. Involve users in the process
        3.1.2. Focus on the environment not the technology
        3.1.3. Use simple and concise messages
        3.1.4. Use active words
        3.1.5. Provide reassurance information
        3.1.6. Provide an instruction at every decision-making point
        3.1.7. Provide different techniques for angular directions
        3.1.8. Provide auditory cues
        3.1.9. Divide the route into clear segments
        3.1.10. References
    3.2. Effective messaging for audio instructions
        3.2.0. Overview
        3.2.1. The components of an audio instruction
        3.2.2. References
    3.3. Different types of audio instructions
        3.3.0. Overview
        3.3.1. The classification
        3.3.2. References
4. Guidelines
    4.0. Purpose of this section
    4.1. Guidelines for various environmental elements
        4.1.0. Purpose of this section
        4.1.1. Entrances & Exits
        4.1.2. Pathways
        4.1.3. Tactile paving
        4.1.4. Escalators
        4.1.5. Stairs
        4.1.6. Lifts
        4.1.7. Ticket control
        4.1.8. Platforms
    4.2. Guidelines for various types of built-environment
        4.2.0. Purpose of this section
        4.2.1. Mainline Rail and Metro stations
    4.3. Guidelines for mobile app development
        4.3.0. Purpose of this section
        4.3.1. Guidelines for mobile app functionality
        4.3.2. Guidelines for sound design
5. Wayfinding technologies
    5.0. Purpose of the section
    5.1. Bluetooth Low Energy beacons
        5.1.0. Purpose of this section
        5.1.1. What is a Bluetooth Low Energy beacon?
        5.1.2. Sources of Bluetooth signal distortion
        5.1.3. BLE beacon installation
        5.1.4. The parameters of a BLE beacon
        5.1.5. Maintenance and Operational considerations
        5.1.6. References
6. Open Source Wayfindr Demo mobile app
    6.0. The aim of the Wayfindr Demo mobile app
    6.1. How was Wayfindr Demo mobile app developed


1. Getting started

1.0. Why an Open Standard?

There are an estimated 285 million people worldwide living with sight loss. This can often lead to isolation, poverty and depression. Of the estimated 2 million vision impaired people in the UK, almost half say they would like to leave their home more often.

At the moment many vision impaired people are unable to travel independently, instead relying on other people to help them get around or just not venturing out at all.

What if vision impaired people were empowered to navigate independently using the smartphone they already have in their pockets?

Emerging indoor navigation technologies such as Bluetooth Low Energy (BLE) beacons hold the key to opening up the world for vision impaired people. However, in order to achieve the greatest impact globally, there is a pressing need to develop a consistent standard to be implemented across wayfinding systems. This will truly open up a world where vision impaired people are no longer held back by their sight loss, removing barriers to employment, to seeing friends and family and to engaging in their community.

Accessibility needs to be ‘baked in’ to the roll-out of all indoor navigation services. Venues need reassurance that their investment in the installation of indoor navigation services will improve the customer experience for all their customers. The Wayfindr Open Standard aims to do just that.

When individuals and organisations get behind a purposeful vision, solutions to what previously seemed like big challenges become attainable.

The aim is that this Open Standard will help lower the barrier for built-environment owners and digital navigation services to make their environments, products and services inclusive from the outset as we continue to weave technology into our cities.

Once the Open Standard is adopted across the built-environment and digital navigation services alike, vision impaired people will benefit from a consistent, reliable and seamless navigation experience.


1.1. What types of content make up the Open Standard?

The Wayfindr Open Standard contains different types of content:

• Factual information, e.g. as in Section 2 “Learning about mobility of vision impaired people” and Sections 5.1.1-5.1.2 and 5.1.4

• Recommendations for best practices, as seen in Section 3 “Designing for vision impaired people” and Section 5.1.3 “BLE beacon installation”

• Guidelines, as in Section 4, which have been validated through Wayfindr trials and other resources as seen in Section 1.2 below

• Suggestions for further investigation, which can be found in the Guidelines section and are prefixed with “s”, e.g. “s4.1.1.1”. As their name suggests, these paragraphs have the potential to become guidelines once they are validated through further investigations

• Considerations, e.g. as in Section 5.1.5, which include things that need to be considered for specific tasks

1.2. How are the guidelines informed

Six main resources have informed the content of the Candidate Recommendation 1.0:

• The work of the Working Group, which consisted of the following members:

o Martine Abel-Williamson, Objective Leader – Access to the Environment and Transport, World Blind Union

o Nicole Holmes, Technology Accessibility Officer, Guide Dogs NSW/ACT

o Manuel Ortega, Head of R&D, Ilunion Technology and Accessibility

o Kelly Prentice, Mobility Specialist, Guide Dogs NSW/ACT

o John Welsman, Policy Lead on Transport and Travel, Guide Dogs UK

• The qualitative user research conducted by Wayfindr in the trials in London and Sydney. In total, 75 user sessions have taken place in these trials.

• Academic research that supports the findings in Wayfindr trials. Wherever used, academic research resources can be found under “References”.

• Factual information about vision impairment and the navigation techniques of vision impaired people. The resources for this information can also be found under “References”.


• Input and feedback from industry experts: Professor Peter Barker OBE; Alan Brooks, Mobility Specialist; Henry Daw, Sound Designer and Audio Branding Consultant & Owner of Oblique Sound; Ann Frye, Independent Accessibility Consultant; the Google Beacon Platform Team; Sue Sharp, RLSB; Alastair Somerville, Sensory Design Consultant at Acuity Design

• Input and feedback from Wayfindr Community members such as BlindSquare, BlueCats, Estimote, Guide Dogs NSW/ACT, Kontakt.io and Nominet Innovation

1.3. How to use the Open Standard

Each section of the Wayfindr Open Standard might be relevant to different audiences. Some of the audiences that will benefit from the Open Standard are:

• Venue owners and their access consultants who want to make their estate accessible

• Developers and designers of navigation products and services that offer wayfinding for vision impaired people

• Researchers who are conducting research or experiments in the area of wayfinding for vision impaired people

Anyone who is involved in projects about wayfinding for vision impaired people should become familiar with Section 2 “Learning about mobility of vision impaired people”, which provides an introduction to the world of vision impaired people and their navigation and mobility techniques.

If you are a venue owner or an accessibility stakeholder, the following sections will be the most relevant:

• Section 4.2 “Guidelines for various types of built-environment”, with guidelines about different types of built-environments, such as a rail station, and the environmental elements that are likely to be found in them.

• Section 5 “Wayfinding technologies”, which includes information and recommendations about the installation, configuration and maintenance of specific technologies that can be used for wayfinding, such as Bluetooth Low Energy beacons.

If you are involved in digital navigation services as a researcher or designer, the following sections include the most relevant content:


• Section 3 “Designing for vision impaired people”, which provides an overview of the design principles that underpin the Wayfindr Open Standard, as well as a taxonomy of the elements that make up an audio instruction.

• Section 4.1 “Guidelines for various environmental elements”, with guidelines about the information that needs to be provided when vision impaired people interact with or encounter various elements in their environments, such as entrances, escalators, lifts etc.

• Section 4.3 “Guidelines for mobile app development”, with guidelines about the functionality that should be provided through a digital navigation service in order to provide a good user experience for vision impaired people. Section 4.3.2 is dedicated to guidelines around sound design, as sound is an important aspect of navigation for vision impaired people.

If you are involved in digital navigation services as a developer, the following sections will be relevant:

• Section 3 “Designing for vision impaired people”, which provides a taxonomy of the different types of instructions that are used in a wayfinding system.

• Section 4.3 “Guidelines for mobile app development”, with guidelines about the functionality that should be provided through a digital navigation service in order to provide a good user experience for vision impaired people.

• Section 6 “Open Source Wayfindr Demo mobile app”, with information about the latest version of the Wayfindr Demo mobile app and a link to a GitHub repository with access to the source code. The Demo app integrates some of the features recommended by the “Guidelines for mobile app development”. The app is open source and aimed at testing and demonstration purposes.

1.4. What is a Candidate Recommendation?

The Candidate Recommendation is a document of greater maturity than the Working Draft. It is the result of the collaborative work of the Working Group that was formed in September 2016 (see above for the members of the Working Group).

Made up of members of the Wayfindr Community, the Working Group received, digested and evaluated contributions, then proposed updates to the Working Draft.

The release of a Candidate Recommendation triggers an 8-week period for public feedback, during which extensive feedback is sought from the Wayfindr Community and the public.


If you have any feedback on the candidate recommendation or the Open Standard you can send it to [email protected].


2. Learning about mobility of vision impaired people

2.0. Purpose of this section

This section provides an introduction to vision impairment and the mobility aids and techniques used by vision impaired people to navigate their way around in the course of their everyday lives. The aim is to inform anyone who is involved in installing and operating wayfinding systems in built-environments, as well as creating navigation services and products for vision impaired people.

A person with a vision impairment moves through an environment processing what is happening in their immediate personal space to negotiate objects and changes in surface level, whilst keeping orientation to their final destination. Each blind or vision impaired person will use the strategies that work best for them and the information that is pertinent for them to achieve this. This information may be gathered from residual vision (very few vision impaired people in the world have no light perception), tactile information (e.g. underfoot surface texture, or following a building line), auditory information (e.g. hearing landmarks and cues) and other sensory information (e.g. smells from a coffee shop or bakery, or vibrations from an aid such as a cane).

Wayfinding technologies such as beacons are a tool that can be used to add another layer of information when moving through an environment. They may provide awareness of items in the person’s immediate space that they will need to negotiate. They can also provide confirmation that the person is on the way to their final destination or has arrived.
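As a rough illustration of how such a layer of information might be built, the sketch below maps beacon sightings to spoken instructions. It is a minimal sketch only: the beacon identifiers, the messages and the signal-strength threshold are hypothetical examples, not values taken from this Open Standard.

```python
# Hypothetical sketch: layering beacon-derived information on top of a
# traveller's own mobility techniques. Beacon IDs, messages and the RSSI
# threshold are illustrative assumptions, not part of the Open Standard.

BEACON_INSTRUCTIONS = {
    "entrance-01": "You are at the main entrance. The ticket hall is ahead.",
    "escalator-03": "Escalator ahead. Stand on the right.",
    "platform-02": "You have arrived at Platform 2.",
}

def instruction_for_sighting(beacon_id, rssi, threshold=-70):
    """Return an audio instruction when a known beacon is close enough.

    rssi is the received signal strength in dBm; stronger (less negative)
    readings suggest the user is nearer the beacon.
    """
    if rssi < threshold:
        return None  # too far away to be a reliable trigger
    return BEACON_INSTRUCTIONS.get(beacon_id)
```

In a real service the instruction would be passed to a screen reader or text-to-speech engine; the point here is only that the beacon layer confirms position rather than replacing the traveller’s own techniques.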

2.1. Some facts about vision impairment

According to the World Health Organisation (WHO) [9, 10], there are an estimated 285 million vision impaired people worldwide. The vast majority live in less developed countries, in low-income settings, and are aged over 50. The most common causes of vision impairment globally in 2010 were:

1. Refractive errors - errors in how the eye focuses light - 42%

2. Cataracts - a clouding of the lens in the eye - 33%

3. Glaucoma - a group of eye diseases which result in damage to the optic nerve and vision loss - 2%

The World Health Organisation (WHO) [9, 10] classifies the levels of vision impairment as follows:

• Normal (full) vision - no visual impairment

• Moderate vision impairment

• Severe vision impairment

• Blindness

About 15% of people who are registered as having vision loss cannot see anything at all [10]. The remaining 85-90% may have residual vision or other types of low vision and may have difficulties with colour, light, form or movement perception.
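The WHO categories above are defined in terms of presenting visual acuity. As a rough illustration only, the sketch below maps a Snellen acuity (in metres) onto those categories; the cut-off points (6/18, 6/60 and 3/60) follow the usual WHO/ICD convention but are an assumption here, not figures taken from this document.

```python
# Illustrative sketch: mapping Snellen visual acuity to the WHO categories
# listed above. The cut-offs are the conventional WHO/ICD boundaries and
# are an assumption on the part of this sketch, not part of the standard.

def who_category(numerator, denominator):
    """Classify presenting visual acuity, e.g. who_category(6, 18)."""
    acuity = numerator / denominator     # decimal acuity; 6/6 -> 1.0
    if acuity >= 6 / 18:
        return "normal vision"           # WHO also groups mild impairment here
    if acuity >= 6 / 60:
        return "moderate vision impairment"
    if acuity >= 3 / 60:
        return "severe vision impairment"
    return "blindness"
```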

2.2. Primary mobility aids

The most commonly used primary mobility aids in many countries are the (long) white cane and the guide dog (“seeing eye dog”). Vision impaired people might use one or both of them depending on the environments they are travelling through. Generally, guide dog users also have long cane skills, as good orientation and mobility skills are required to ensure that the guide dog guides the person who is vision impaired in the correct manner and along the routes that the user specifies.

In those countries and cultures that use them, the primary role of the guide dog is to walk in a straight line, avoiding obstacles in the person’s path of travel. It is the job of the guide dog user, however, to give the dog commands and direct the dog to find certain features of the environment and get to a destination. Guide dog users often ask their guide dog to target a specific landmark such as a door, steps or escalator, a task that is called “targeting”.

Guide dogs are able to memorise a route; however, the user always needs to be aware of the environment they are travelling through and cannot rely totally on the guide dog to get to a destination. The guide dog needs support and encouragement from the user to ensure that the person is safe and gets to their destination.

There are three types of cane that a person who is vision impaired may use:

• Long cane: Used by people with little or no vision.


o A long cane allows the user to detect obstacles and hazards, drop-offs/kerbs, ground-level changes and stairs in the path of travel.

o A long cane provides information from the environment that assists orientation. For example, the cane user can detect changes in surface texture between grass and concrete to follow a footpath.

o Many cane users experience an increase in confidence because they hesitate less about the safety of the next step.

o A long cane improves the user’s posture, because they don’t need to feel the ground with their feet while travelling or walk with their head down to check the surface directly at their feet.

• Identification/symbol cane: This cane is a tool that allows the general public to identify a person as having a vision impairment. It is not designed to make contact with the ground surface, but may be used to check the height of a step or drop-off/kerb.

• Support cane: The white support cane allows a person who requires balance and stability support to be identifiable as having a vision impairment.

A navigation service should not replace these primary mobility aids; rather, it should be treated as an orientation tool used alongside other skills to augment the user experience and reassure its users.

2.3. Orientation and Mobility (O&M) training

In many countries, some vision impaired people receive Orientation & Mobility (O&M) training. This is usually one-to-one training with a Mobility Specialist, in which the vision impaired person learns techniques and skills that will help them to travel safely and independently in their environment.

In the context of Orientation & Mobility training, orientation refers to an individual’s knowledge of where they are located in an environment and how they will get to a destination confidently and safely [6].

Some of the skills that may be covered in this type of training include using residual vision, sensory awareness, understanding how objects relate to each other in one’s environment, how to search for places and objects, personal safety and cane usage.

The techniques for long cane training are outlined below:


• The cane is used with the index finger held alongside it, so that the cane acts as an extension of the finger and provides a tactile preview of the ground one pace ahead of the user. Each user will be measured and fitted with a cane to match their height, length of stride and walking speed.

• Touching or Constant Contact: The cane user moves the cane in an arc from left to right, slightly wider than their shoulder width, keeping constant contact with the ground surface. This allows them to preview the environment before moving forward and detect features such as barriers, upward or downward slopes and obstacles.

• Two-point touch: The cane user taps the cane to the left and right instead of keeping constant contact with the ground. This gives the person audio feedback about the environment from the cane taps.

• Shorelining: Following a wall, kerb, hedge or other surface that contrasts with the one a person is walking on, in order to maintain a specific orientation while travelling through environments to arrive at a decision-making point. Shorelining allows a person to arc the cane, using either the constant contact or two-point touch technique, to cross an open space or travel easily towards a known point in a crowded environment. Shorelines are an important element of how a vision impaired person navigates. There are two shorelines:

o Inner shoreline, at the junction of the ground surface and a wall

o Outer shoreline, at the junction of the pavement and the kerbline

• Trailing: Trailing differs from shorelining in that the person trails a contrasting surface, such as a wall, with their hand. The degree to which people use shorelining or trailing depends on the individual, their orientation and mobility skills and the environment they are in at any one time. Generally, shorelining is the preferred technique in public places.

Importantly, experienced travellers may not use shorelining or trailing at all. Each traveller differs in their approach to different techniques, so designers should not refer to shorelines or trailing as the only techniques to use in their environments. Rather, they should design a system that refers to different features of the environment.


2.4. Landmarks and clues

Primary landmarks are defined as objects always found in the environment that are difficult to miss at a particular location, such as kerbs, a change in walking surface etc.

Clues are more transient and may include sounds, smells, changes of temperature and so on.

For example, the noise of machinery may be present only during the working day; the smell

of the baker’s shop or dry cleaner’s; or the change of temperature as you enter the open door of a building, provided the air conditioning or heating is working.

Landmarks and clues play an important role in wayfinding and navigation, as they reassure

the individual that they are walking in the right direction as well as helping to place themselves in the space [1].

Sighted people can easily identify features in the environment that they use for this

purpose. Vision impaired people also make use of landmarks and clues as they move,

although they may be different to those used by sighted people. Orientation & Mobility

experts define clues and landmarks as, “any familiar object, sound, smell, temperature, tactile

or visual clue that is easily recognised, is constant and has a discrete permanent location in

the environment that is known to the traveller.” A clue can include the sounds, smells,

temperature, tactile clues etc., whereas a landmark is a specific and permanent feature,

which is familiar to the user.

Vision impaired people tend to use different clues and landmarks than sighted people,

sometimes more detailed ones [2, 7]. Most seem to be those closest to the individual [6], in

other words in the areas that can be touched by the long cane or felt through the soles of the feet.

Non-visual cues such as wind direction, the smell of a bakery or soap shop, or heat from the

sun are less consistent and so considered less reliable. In order to increase the reliability of the

clues used, vision impaired people often combine the use of different senses. For example

they may use a tactile landmark followed by an auditory clue in order to confirm that they are approaching a landmark [2].

None of the above should be seen as supporting the myth that blindness sharpens other

senses; rather, it makes vision impaired people pay more attention to their available senses in order to cognitively process the information from the environment.


2.5. The safest, not the fastest or shortest route

For most digital navigation services, the main criteria for calculating a route are

distance and time taken. Some navigation services offer additional criteria such as quieter routes, less stressful routes and those with fewer changes.

During Wayfindr trials, and in work by other researchers [3, 5, 8], vision impaired

people reported that they are willing to walk further provided that the longer route is

considered safer and easier to manage. For example, instead of crossing a hotel lobby where

there are a lot of people waiting with luggage they might prefer to walk all the way round the

lobby rather than face obstacles and the potential danger of collision and/or loss of

orientation. This makes their journey less stressful and enhances confidence. Thus the shortest or quickest route may not be appropriate for some vision impaired people.

Routes should be planned using a combination of what is considered a safer route and less

challenging for the individual and with the advice of Orientation and Mobility specialists. A

number of different routes should be considered as some travellers will prefer certain routes

depending on their travel skills, the situation and where they need to get to at a given time.

Guide dog users, however, will utilise the dog’s ability to avoid obstacles, pedestrians and

other hazards along with its ability to target the interim or final destination, which may result in them taking the shortest, most direct route.

Some partially sighted people, for example, confident independent travellers, may still prefer

the shortest or quickest route to their destination. Therefore, it should not be assumed that a route that is difficult for one vision impaired person will be equally difficult for another.

NB. This functionality, offering alternative and personalised routes, is not currently demonstrated in the Wayfindr Demo iOS app v0.4.
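As an illustration of how a digital navigation service might implement the principle above, the sketch below runs a shortest-path search whose edge cost adds a safety penalty to plain distance, so a longer but safer route can win. The hotel-lobby layout, hazard scores and weighting factor are illustrative assumptions, not values from this Open Standard:

```python
import heapq

def safest_route(graph, start, goal, safety_weight=5.0):
    """Dijkstra search where each edge costs its length in metres plus a
    penalty scaled by how hazardous the segment is (hazard in 0.0-1.0).
    A higher safety_weight favours longer but safer routes."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path, cost
        if node in visited:
            continue
        visited.add(node)
        for neighbour, metres, hazard in graph.get(node, []):
            if neighbour not in visited:
                penalty = safety_weight * hazard * metres
                heapq.heappush(queue,
                               (cost + metres + penalty, neighbour, path + [neighbour]))
    return None, float("inf")

# Hypothetical hotel lobby: crossing the centre is shorter but crowded,
# following the perimeter is longer but clear.
graph = {
    "entrance": [("lobby_centre", 20, 0.8), ("left_wall", 15, 0.1)],
    "lobby_centre": [("reception", 20, 0.8)],
    "left_wall": [("back_wall", 15, 0.1)],
    "back_wall": [("reception", 15, 0.1)],
    "reception": [],
}
```

With the assumed weighting, the 45 m perimeter route is preferred over the 40 m crossing; setting the weight to zero reduces the search to an ordinary shortest-distance route.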

2.6. User Preference for discreet technology

Holding a device

During Wayfindr trials, vision impaired people reported that they would feel vulnerable

holding a smartphone in their hand when navigating public spaces, e.g. because of the risk of

having it stolen. They also pointed out that with one hand usually holding their primary

mobility aid, they need the other hand to be free to hold handrails or their ticket. For these reasons most people would prefer to keep the smartphone in their pocket.


Even with the device in a pocket, it should be able to communicate with Bluetooth beacons

and trigger instructions at the appropriate time. Depending on the Bluetooth antenna

orientation and signal reflections, instructions might be triggered early, but this should not

cause a problem for users if the beacons have been installed and calibrated following the best practices described in Section 5.1 “Bluetooth Low Energy beacons.”
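One way an app can make early triggering harmless is to fire each beacon’s instruction at most once, the first time the estimated range drops below a calibrated threshold. The log-distance model, calibrated transmit power and trigger range below are illustrative assumptions, not values from this Open Standard:

```python
def beacon_distance(rssi, tx_power=-59, path_loss_exponent=2.0):
    """Rough log-distance estimate in metres from a beacon's RSSI.
    tx_power is the measured RSSI at 1 m; both values are assumptions
    that must be calibrated per installation."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exponent))

class InstructionTrigger:
    """Fires each beacon's instruction once, the first time the
    estimated distance drops below the calibrated trigger range."""
    def __init__(self, trigger_metres=3.0):
        self.trigger_metres = trigger_metres
        self.fired = set()

    def on_advertisement(self, beacon_id, rssi):
        # Ignore beacons whose instruction has already been played.
        if beacon_id in self.fired:
            return None
        if beacon_distance(rssi) <= self.trigger_metres:
            self.fired.add(beacon_id)
            return f"play instruction for {beacon_id}"
        return None
```

Because the instruction never re-fires, a reflected signal that briefly makes the beacon look closer only moves the announcement slightly earlier rather than repeating it.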

Listening through headphones

In busy and noisy environments, hearing audio instructions from a smartphone speaker,

particularly when in a pocket, will be a challenge. During Wayfindr trials and in the work of

other researchers [4], vision impaired users reported that they do not like using headphones

whilst moving around as they block out auditory clues and warnings in the environment. Bone conducting headphones might offer a good solution.

Additionally, vision impaired people have reported that ideally they would like to use

wireless headphones to avoid wires becoming tangled. Bluetooth headphones might be a

solution. However, due to the limitations of Bluetooth technology, there may be a time delay

in the delivery of the audio instruction to the Bluetooth headphones that might be a problem for the user.

2.7. References

1. Allen, G. L. (1999). Spatial abilities, cognitive maps, and wayfinding. Wayfinding

behavior: Cognitive mapping and other spatial processes, 46-80.

2. Fryer, L., Freeman, J., & Pring, L. (2013). What verbal orientation information do

blind and partially sighted people need to find their way around? A study of everyday

navigation strategies in people with impaired vision. British Journal of Visual

Impairment, 31(2), 123-138.

3. Gaunet, F., & Briffault, X. (2005). Exploring the functional specifications of a

localized wayfinding verbal aid for blind pedestrians: Simple and structured urban

areas. Human-Computer Interaction, 20(3), 267-314. http://lpc.univ-

amu.fr/dir_provence/dir/gaunet/articles/Exploring%20the%20functional%20specif

ications%20of%20a%20localized%20wayfinding%20verbal%20aid.pdf (last

accessed: 24 February 2016)

4. Golledge, R., Klatzky, R., Loomis, J., & Marston, J. (2004). Stated preferences for

components of a personal guidance system for nonvisual navigation. Journal of

Visual Impairment & Blindness (JVIB), 98(03).


http://www.geog.ucsb.edu/~marstonj/PAPERS/2004_JVIB_GMLK.pdf (last

accessed: 24 February 2016)

5. Helal, A. S., Moore, S. E., & Ramachandran, B. (2001). Drishti: An integrated

navigation system for visually impaired and disabled. In Wearable Computers, 2001.

Proceedings. Fifth International Symposium on (pp. 149-156). IEEE.

http://www.cs.umd.edu/class/fall2006/cmsc828s/PAPERS.dir/wearableConf-1.pdf

(last accessed: 24 February 2016)

6. Long, R. G. & Giudice, N. A. (2010). Establishing and maintaining orientation for

orientation and mobility. In B. B. Blasch, W. R. Wiener & R. W. Welch (Eds.),

Foundations of orientation and mobility (3rd ed. Vol.1: History and Theory, pp. 45-

62). New York: American Foundation for the Blind.

http://www.vemilab.org/sites/default/files/Long%20%26%20Giudice(2010)-

%20Orientation%20and%20mobility%20(foundations%20of%20O%26M).pdf (last

accessed: 24 February 2016)

7. Passini, R., & Proulx, G. (1988). Wayfinding without vision an experiment with

congenitally totally blind people. Environment and Behavior, 20(2), 227-252.

http://www.stolaf.edu/people/huff/classes/Psych130F2010/Readings/Passini'88.pd

f (last accessed: 24 February 2016)

8. Swobodzinski, M & Raubal, M. (2009). An indoor routing algorithm for the blind:

development and comparison to a routing algorithm for the sighted. International

Journal of Geographical Information Science, 23(10), 1315-1343.

http://www.raubal.ethz.ch/Publications/RefJournals/Swobodzinski&Raubal_Indoor

BlindWayfinding_IJGIS09.pdf (last accessed: 24 February 2016)

9. World Health Organisation, Fact Sheet No 213, Blindness: Vision 2020 - The Global

Initiative for the Elimination of Avoidable Blindness (last accessed: 24 February

2016) http://www.who.int/mediacentre/factsheets/fs213/en/

10. World Health Organisation, Fact Sheet No 282, Visual impairment and blindness

(last accessed, 24 February 2016)

http://www.who.int/mediacentre/factsheets/fs282/en/


3. Designing for vision impaired people

3.0. Purpose of this section

This section outlines the design principles that underpin the Wayfindr Open Standard, the

different types of audio instructions and the various structural elements of an audio instruction.

3.1. Design principles

3.1.0. Overview

This section outlines the design principles that inform the guidelines as seen in Section 4. These principles provide guidance about:

• The design and development process of a wayfinding and digital navigation system

for vision impaired people

• The high level thinking for creating audio instructions

• The usage of sound in a wayfinding system for vision impaired people

3.1.1. Involve users in the process

In order to ensure that a wayfinding system is useful and usable, it is key to validate the

system with users within the environment in which it will be deployed. Users must be involved in the design and development process so that the system can be validated in its real setting.

It is recommended that a wide range of people with vision impairments are invited to

participate in the validation of the system. The recruitment factors that need to be considered are the following:


• Level of vision impairment: from partially sighted to blind

• Wide age group: 18-55 years old

• Gender: even split

• Mobility aids: no aid, symbol cane, long cane and guide dog.

• Confidence in independent travel: from people who travel independently worldwide

to people who only do familiar local routes without assistance

• Tech literacy: smartphone users from basic to expert (self-assessed)

• Venue familiarity: from not familiar at all to everyday users of the venue

Feedback and validation from vision impaired people should be sought:

• To understand user needs for each different type of built environment:

For example, understand what the main points of interest are, which landmarks

are used along the route, what the safest route options are, etc. Research

methods that could be used in this phase are interviews, observations, surveys and

walkthroughs.

• To validate the system while it is being developed: This might include

incremental testing of the following:

o The installation and the configuration of wayfinding technology, such as

Bluetooth Low Energy beacons (see Section 5.1 for best practices)

o The information and the terminology used in the audio instructions (for

information on how to design the appropriate audio instructions see Section

3.2 and the guidelines for each environmental element under Section 4.1)

o The usability and accessibility of the digital navigation service

Research methods that could be used in this phase are observations and

think-aloud one-to-one sessions in which participants try out the system.

3.1.2. Focus on the environment, not the technology

Vision impaired people pay considerable attention to their remaining senses in order to

understand their environment better and to identify environmental landmarks and clues. It

is important therefore that they should have as few distractions as possible from the technology that is assisting them.

With fewer distractions, vision impaired people are better able to memorise new routes.

When they have memorised a route their reliance on digital navigation technology for that

particular route will be reduced and their feeling of confidence and independence will be


increased. Confidence in using technology may also lead vision impaired people to use more routes in less familiar or more complex environments.

3.1.3. Use simple and concise messages

The “less is more” principle applies when designing audio instructions. Wayfindr trials and

the work of researchers [3] have shown that vision impaired people do not like long or very detailed instructions due to the time required to think about and process the information.

Limiting messages only to those that are relevant and essential to navigate a route will

greatly assist vision impaired people. This means excluding information that is not relevant

to their chosen route, for example, those obstacles and objects that can be detected and

easily avoided with their mobility aid (long cane or guide dog).

3.1.4. Use active words

Audio instructions that need to communicate movement must include active words, i.e.

verbs. For example, a phrase like “The stairs are in front of you” does not imply any action.

Some people might take the instruction literally and will not take any action when they reach

the stairs. Instead, a phrase like “Move forward and take the stairs up” makes the action clear.

Verbs used in audio instructions must be carefully considered. Some verbs may be vague and

open to different interpretations. An example is the verb “to bear”. When it is combined with

a direction such as, “bear left” this might cause the vision impaired person to move through a space in ways that were not anticipated when the messages were constructed.

3.1.5. Provide reassuring information

Some vision impaired people may not easily relate to distance information given in feet or

metres. Distance information described in terms of the number of steps may be more

familiar but there are factors to be considered. The length of each step may vary based on the

person’s height and how tired they are feeling. Even if the calibration of someone’s step

length is accurate, it has been reported in research studies [2] that it is demanding for an individual to count or memorise the number of steps required.

In enclosed environments such as a railway station, distance information may not be needed.

Instead, reassuring feedback can be sufficient in order to make a vision impaired person feel


secure that they are moving in the right direction or that they are approaching the next

landmark on their route. However, in large open areas a rough estimation of the distance

that they will be travelling before the next instruction might make them feel more comfortable.

3.1.6. Provide an instruction at every decision-making point

Initially, for vision impaired people learning a route, it is important to have an audio

instruction at every decision-making point on the route, even if the user is not changing

direction. The need to follow an instruction at every decision-making point is likely to be reduced as people become familiar with the route.

3.1.7. Provide different techniques for angular directions

There are various techniques for communicating directions for different angles. These techniques are divided into two broad categories depending on the user’s point of reference:

• Egocentric frame of reference is when the spatial layout and orientation is

communicated based on the individual’s current location and viewpoint. For

example, “Keep moving forwards, the escalators are in front of you” is information

described based on an egocentric frame of reference. The most common ways to

communicate directions like this are by using:

o Clock faces, when the space is divided into 12 parts as on a clock face with

the user situated in the middle of the dial, i.e. with 12 o’clock indicating

straight ahead and 6 o’clock behind. This technique seems to be very popular

with older people as it is taught during O&M training; however, younger

generations of vision impaired people do not have much experience with

clocks with fixed numbers and moving hands, which makes this technique

confusing for them. During Wayfindr trials many also reported that it is

difficult to distinguish between 1 and 2 o’clock. Thus, clock faces should only be

used to communicate the general direction.

o Degrees, when directions and position are communicated using degrees. For

example, “turn 45 degrees to your right.” Similarly to those using clock faces,

various vision impaired people expressed concerns around distinguishing

between 45 and 60 degrees for example.

o Proportional is when instructions such as “straight ahead,” “to the right,”

“diagonally to the left” or “slightly to your left” are being used to communicate


positioning and direction. Again, this technique is open to interpretation by

users. For example, during Wayfindr trials some vision impaired people

asked, “how many degrees diagonally do I have to turn?” It has been observed

that people interpret the word “slightly” in different ways.

o Orthogonal axis means that direction is communicated based on right

angles to a line going ahead, behind, left or right. In this way the risk of

confusion - as with diagonal directions above - is reduced. On the other hand

users may have to follow a less efficient or longer route based only on

orthogonal directions.

• Allocentric frame of reference is when the layout and the relationship between

objects are described independently from the individual’s current location and

viewpoint. For example, “the escalators are 10m straight ahead after the ticket

barriers” indicates a fixed relationship between these two landmarks. Allocentric

frames of reference are useful because they allow individuals to create a memorable

mental image of an environment that would help them recover should they get lost or

make a detour. The most common technique using an allocentric frame of reference

is:

o Cardinal coordinates when position and directions are communicated as

being “north,” “southeast,” “northwest” etc. Vision impaired people have

reported [1, 5] that they find it difficult to orientate themselves based on

cardinal coordinates as they have to first translate them into an egocentric

frame of reference before use. Use of cardinal coordinates also varies from one

country to another, and it can be difficult for anyone to orientate themselves using them indoors, regardless of whether they are sighted or not.

Based on the above, vision impaired people have no single preferred way of communicating

directions. Their preferences are based on various factors including previous Orientation &

Mobility education, experience of independent travel, familiarity with analogue clock metaphors and familiarity with existing digital navigation services. One size does not fit all.

A suggestion for further investigation on how to provide this choice to users through a digital navigation app can be found in Section S4.3.1.1.

NB. This functionality, i.e. enabling users to choose their preferred technique for diagonal directions, is not currently demonstrated in the Wayfindr Demo iOS app v0.4.
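As a rough sketch of how an app might offer the techniques described above as a user preference, the helper below renders one turn angle in three egocentric styles. The exact phrase wordings and the angle bands for the proportional style are illustrative assumptions, not wordings defined by this Open Standard:

```python
def describe_turn(angle, technique):
    """Render a turn angle (degrees, -180..180, positive = right)
    using one of the egocentric techniques described above."""
    angle = ((angle + 180) % 360) - 180  # normalise to -180..180
    if technique == "clock":
        hour = round(angle / 30) % 12 or 12  # 12 parts, 12 = straight ahead
        return f"turn towards {hour} o'clock"
    if technique == "degrees":
        side = "right" if angle >= 0 else "left"
        return f"turn {abs(round(angle))} degrees to your {side}"
    if technique == "proportional":
        side = "right" if angle >= 0 else "left"
        magnitude = abs(angle)
        if magnitude < 15:
            return "move straight ahead"
        if magnitude < 60:
            return f"move diagonally to the {side}"
        return f"turn to the {side}"
    raise ValueError(f"unknown technique: {technique}")
```

A navigation app could store the chosen technique in the user’s preferences and pass it to every instruction that involves a change of direction.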


3.1.8. Provide auditory cues

As seen in Section 2.4 “Landmarks and clues”, auditory information plays an important

role in the everyday life of a vision impaired person, whether this auditory information is

environmental, e.g. a car passing by, or pre-determined, e.g. any sound design component such as a keypad sound or alert.

Any pre-determined auditory cues, i.e. sound design components, designed specifically for

wayfinding, need to reflect a very high level of care and attention in their design, with the aim to guide and assist the user in a safe and reassuring manner.

The predetermined auditory information for digital wayfinding, i.e. the sound design components, comes in two forms:

• Notification alerts, which are sound alerts that precede the audio instructions

• Synthesised voice commands, which are computer-generated voice commands

often associated with the accessible use of a modern smartphone or computer, e.g.

VoiceOver mode on iOS, Windows Narrator or TalkBack mode on Android. These are

vital tools for vision impaired people.

Specific guidelines around sound design along with the proposed Wayfindr sounds can be found in Section 4.3.2.

3.1.9. Divide the route into clear segments

Research has shown that a very common wayfinding technique for vision impaired people is

to divide the route into a number of segments that they can memorise and follow, based on

identified landmarks [3]. This technique is called piloting or, in the psychology of learning, chaining.

During Wayfindr trials vision impaired people reported that they would like to know where

they were heading as this helps them create an image of the space in which they are moving.

Their preference is to move around in relation to particular areas or landmarks instead of

just simply following instructions. Additionally, some people enjoy having an extra layer of

information to “colour” their journey to give themselves a deeper understanding of their current environment. Therefore, it is good to divide the route into a number of segments.

An example with the route segments in a Mainline Rail and Metro Station can be found in Section 4.2.1.
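The segmentation technique above can be modelled as an ordered list of segments, each anchored to a landmark, with an announcement that names the area being entered and the one coming next. The segment names and message wording below are illustrative assumptions, not wordings from this Open Standard:

```python
from dataclasses import dataclass

@dataclass
class Segment:
    name: str          # distinctive area, e.g. "ticket hall"
    landmark: str      # landmark that marks the start of the segment
    instruction: str   # action to take within the segment

def announce(route, index):
    """Build the message for entering segment `index`, naming the
    upcoming area so the traveller can picture where they are heading."""
    segment = route[index]
    message = f"You are entering the {segment.name}. {segment.instruction}"
    if index + 1 < len(route):
        message += f" Next you will reach the {route[index + 1].name}."
    return message

# Hypothetical three-segment route through a metro station.
route = [
    Segment("ticket hall", "station entrance",
            "Move forward to the wide ticket gate."),
    Segment("escalator area", "ticket gates",
            "Take the escalator down to the platforms."),
    Segment("platform", "bottom of the escalator",
            "Turn right and move forward to the platform."),
]
```

Naming the next segment in each announcement gives the traveller the extra layer of orientation information described above without lengthening the core instruction.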


3.1.10. References

• 1 - Chen, H. E., Lin, Y. Y., Chen, C. H., & Wang, I. (2015). BlindNavi: a navigation app

for the visually impaired smartphone user. In Proceedings of the 33rd Annual ACM

Conference Extended Abstracts on Human Factors in Computing Systems (pp. 19-

24). ACM. http://nccur.lib.nccu.edu.tw/bitstream/140.119/74356/1/407291.pdf (last

accessed: 24 February 2016)

• 2 - Kalia, A. A., Legge, G. E., Roy, R., & Ogale, A. (2010). Assessment of indoor route-

finding technology for people with visual impairment. Journal of visual impairment

& blindness, 104(3), 135. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3160142

(last accessed: 24 February 2016)

• 3 - Nicolau, H., Guerreiro, T., & Jorge, J. (2009). Designing guides for blind people.

Departamento de Engenharia Informatica, Instituto Superior Tecnico, Lisboa.

http://www.di.fc.ul.pt/~tjvg/amc/blobby/files/paper-nicolau.pdf (last accessed: 24

February 2016)

• 4 - Swobodzinski, M., & Raubal, M. (2009). An indoor routing algorithm for the

blind: development and comparison to a routing algorithm for the sighted.

International Journal of Geographical Information Science, 23(10), 1315-1343.

http://www.raubal.ethz.ch/Publications/RefJournals/Swobodzinski&Raubal_Indoor

BlindWayfinding_IJGIS09.pdf (last accessed: 24 February 2016)


3.2. Effective messaging for audio instructions

3.2.0. Overview

This section describes how messages can be constructed to make audio instructions as

effective as possible. Other researchers’ work [1, 2, 3] has been used as a basis for the analysis of the different components of an audio instruction.

3.2.1. The components of an audio instruction

The vital components of an audio instruction are the following:

• Verbs, which communicate the action required. They are mainly about movement

but can convey other types of actions, such as “press a button.” Useful verbs for

instruction include move, walk, turn, take, go, follow, keep moving, keep walking,

keep following, and press.

• Orientation information, which communicates the user’s current

location in relation to their surroundings. Examples include: there are, it is, you

are now at, the trains leave, you are approaching, you are halfway, when inside.

• Environmental features that comprise:

o Segments are distinctive areas in an environment. As mentioned above, it is

helpful for vision impaired people to break the route down into sections and

announce the forthcoming segments. For the segments of a Mainline Rail or

Metro station see Section 4.2.1.

o Pathways are corridors, ramps, escalators, stairs or lifts (elevators) that

vision impaired people can use to get to their destination.

o Decision points such as crossings or intersections are where more than one

pathway meets and vision impaired people need to be instructed about the

direction they need to take when they reach these decision points.

o Landmarks as in Section 2.4 “Landmarks and clues”; vision impaired people

can identify many different landmarks and their choice of landmarks may be

subjective.

o Objects that require interaction. The call button on a lift is a typical object

that requires users to interact with it in the environment.

• Directional delimiters are words and phrases that communicate direction. They

usually follow a verb. Some commonly used examples are: forward, up, down, to,


right, left, through, in front (of you), the left (one), 45 degrees to your right, turn at 2

o’clock, between, on, on your left, at the platform, for your destination, from (your

left)

• Countable delimiters are words or phrases that use counting to identify an

object or an environmental feature. Some commonly used examples are: the first

(corridor), the next (crossing).

• Sequential delimiters are words or phrases that place one object in sequence

relative to another. Some commonly used examples are: after (the gates), at the bottom

of (the escalators), at the next (passage).

• Descriptive delimiters are words or phrases that describe an object or an

environmental feature. Some commonly used examples are: the up (escalator); the lower (concourse); the wide (gate).

To better understand these, below are some examples of how an audio instruction is structured based on combinations of the various elements above.

Audio Instruction

Example 1

“Turn left and take the escalator down to the platforms.

The down escalator is the one on the left.”

The instruction comprises the following elements:

Verb (i.e. turn), directional delimiter (i.e. left), verb (i.e. take), environmental feature is the

pathway (i.e. the escalator), directional delimiter (i.e. down), directional delimiter (i.e. to),

environmental feature is the area/segment (i.e. the platforms), directional delimiter (i.e. the

down), environmental feature is the pathway (i.e. escalator), state-of-being verb (i.e. is), directional delimiter (i.e. the one on the left).

Example 2

”At the bottom of the stairs, turn right and move forward to the platform.”

The instruction comprises the following elements:

Sequential delimiter (i.e. at the bottom), environmental feature is the pathway (i.e. the

stairs), verb (i.e. turn), directional delimiter (i.e. right), verb (i.e. move), directional delimiter

(i.e. forward), directional delimiter (i.e. to), environmental feature is the segment (i.e. the platform).
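The worked examples above can also be expressed programmatically: an instruction is a sequence of typed components joined into a sentence. The category labels follow the list above, while the helper itself is only an illustrative sketch, not a construction method defined by this Open Standard:

```python
def build_instruction(components):
    """Join (category, text) instruction components into a sentence and
    report which component categories were used, in order."""
    text = " ".join(word for _, word in components)
    categories = [category for category, _ in components]
    return text.capitalize() + ".", categories

# Example 2 above, decomposed into its components.
example = [
    ("sequential_delimiter", "at the bottom of"),
    ("environmental_feature", "the stairs,"),
    ("verb", "turn"),
    ("directional_delimiter", "right"),
    ("verb", "and move"),
    ("directional_delimiter", "forward to"),
    ("environmental_feature", "the platform"),
]
```

Keeping the category alongside each fragment makes it easy to check that every instruction contains at least one verb and links to an environmental feature before it is played.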


3.2.2. References

• 1 - Allen, G. L. (2000). Principles and practices for communicating route knowledge.

Applied cognitive psychology, 14(4), 333-359.

http://www.paulmckevitt.com/cre333/papers/allen.pdf (last accessed: 29 February

2016)

• 2 - Kulyukin, V. A., Nicholson, J., Ross, D. A., Marston, J. R., & Gaunet, F. (2008).

The Blind Leading the Blind: Toward Collaborative Online Route Information

Management by Individuals with Visual Impairments. In AAAI Spring Symposium:

Social Information Processing (pp. 54-59). http://lpc.univ-

amu.fr/IMG/UserFiles/Images/The%20Blind%20Leading%20the%20Blind.pdf (last

accessed: 24 February 2016)

• 3 - Nicolau, H., Guerreiro, T., & Jorge, J. (2009). Designing guides for blind people.

Departamento de Engenharia Informatica, Instituto Superior Tecnico, Lisboa.

http://www.di.fc.ul.pt/~tjvg/amc/blobby/files/paper-nicolau.pdf (last accessed: 24

February 2016)


3.3. Different types of audio instructions

3.3.0. Overview

This section classifies the different types of audio instructions that have been used during

Wayfindr indoor trials in places such as Mainline Rail and Metro stations. The analysis of

different audio instructions carried out by other researchers [1, 2] has been used as a

reference. This classification will continue to evolve as research and further trials are carried out in new environments.

3.3.1. The classification

The audio instructions can be classified as follows:

● Route starting are instructions that define the starting point for the route as well as

the kind of environment that the users are entering. It is important to identify the

user’s orientation first so that the audio instructions are aligned with it.

● Route ending are instructions that inform users that they have reached their

destination and that the navigation guidance is ending. This instruction can include

information about the physical layout of the destination in order to help vision

impaired people make better sense of the space and position themselves in relation to

other objects such as the exit, a ticket machine, platform edge or bus stop edge. A

specific guideline on Route Ending instructions can be found in Section 4.1.1.1.

● Orientation

○ Orientation in place for progression to the next environmental

feature: These are instructions for direction change that require immediate

action from users. They must link to the current environmental clue or

landmark: an area, a decision point, a pathway or an object. “At the bottom of the stairs, turn left and move forward”

“At the bottom of the stairs, turn left and move forward to the ticket gates”

The instruction above links to the next environmental feature namely the ticket gates.

○ Orientation in place for reassurance: these instructions do not require any change in direction but act to reassure users that they are on the right route. Usually they repeat the action from the previous audio instruction to assist positioning in relation to the user’s progression. For example, when there are no landmarks:

“Keep moving forward”

“You are halfway there”

If there are landmarks that can be referenced then use them, for example:

“Keep moving forward past the ticket gates on the left”

○ Orientation prior to reaching and passing an environmental feature: these instructions announce a change in direction prior to reaching an environmental feature. Usually these types of instructions help vision impaired people to prepare for the next move without having to stop or block the flow of people around them. For example, in environments where tactile walking surface indicators are used to provide guidance:

“At the next tactile paving intersection, follow the tactile paving route to the left”

“Take the first corridor left and move forward.”

Once the environmental feature has been negotiated, instructions are needed for a change in direction, for example:

“At the bottom of the stairs, move forward”

“At the end of the escalator, turn left and move to the platform”

The second instruction links to the next environmental feature, namely the platform.

● Alerts

○ Alerts about location of (next) environmental feature: these instructions inform users about the location of specific features in the environment. They are descriptive in nature. Usually, they follow immediately after instructions that orientate users in relation to the next environmental feature. For example:

“The down escalator is the left one.”

“The wide gate is the left one.”

“The call buttons are between the two lifts.”


○ Alert when approaching an environmental feature: these instructions advise users that they are approaching the environmental feature that they are aiming for. This type of instruction does not usually require any action but serves more as a warning to users. Normally these alerts are linked to the previous audio instructions that inform vision impaired people about the action required once they reach the environmental feature. For example:

“You are approaching the escalators.”

“You are approaching the stairs.”

○ Warning about user’s current location: these instructions inform users about the spatial layout and their positioning at their current location. They can be given on demand, namely when users ask for their current location, their progress along their route and information about what is around them.

NB. This functionality is not currently demonstrated in the Wayfindr Demo iOS app v0.4.
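A digital navigation service could model the classification above as a simple type. The following Python sketch is illustrative only: the enum names and helper function are assumptions for this example, not part of the Open Standard.

```python
from enum import Enum, auto

class InstructionType(Enum):
    """Illustrative model of the instruction types in Section 3.3.1."""
    ROUTE_STARTING = auto()                    # starting point and environment
    ROUTE_ENDING = auto()                      # destination reached, guidance ending
    ORIENTATION_IN_PLACE_PROGRESSION = auto()  # immediate direction change
    ORIENTATION_IN_PLACE_REASSURANCE = auto()  # no action needed, confirms the route
    ORIENTATION_PRIOR_TO_FEATURE = auto()      # announces an upcoming change
    ALERT_FEATURE_LOCATION = auto()            # describes where a feature is
    ALERT_APPROACHING_FEATURE = auto()         # warns that a feature is near
    ALERT_CURRENT_LOCATION = auto()            # on-demand "where am I?" information

def requires_immediate_action(kind: InstructionType) -> bool:
    """Only in-place orientation for progression demands an immediate action."""
    return kind is InstructionType.ORIENTATION_IN_PLACE_PROGRESSION
```

Distinguishing the types in code lets a service prioritise or suppress instructions, for example dropping reassurance messages when a more urgent orientation instruction is pending.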

3.3.2. References

• 1 - Gaunet, F., & Briffault, X. (2005). Exploring the functional specifications of a localized wayfinding verbal aid for blind pedestrians: Simple and structured urban areas. Human-Computer Interaction, 20(3), 267-314. http://lpc.univ-amu.fr/dir_provence/dir/gaunet/articles/Exploring%20the%20functional%20specifications%20of%20a%20localized%20wayfinding%20verbal%20aid.pdf (last accessed: 24 February 2016)

• 2 - Passini, R., & Proulx, G. (1988). Wayfinding without vision: an experiment with congenitally totally blind people. Environment and Behavior, 20(2), 227-252. http://www.stolaf.edu/people/huff/classes/Psych130F2010/Readings/Passini'88.pdf (last accessed: 24 February 2016)


4. Guidelines

4.0. Purpose of this section

This section includes guidelines with various techniques for providing effective navigation instructions and creating inclusive digital navigation services for vision impaired people. Specifically, in this section developers, designers and researchers can find:

● Guidelines on the information that needs to be provided when vision impaired people interact with or confront various elements in their environments, such as entrances, escalators, lifts etc.

● Information about different types of venues or environments, such as a railway or metro station, and the elements that are likely to feature in them.

● Guidelines about functionality that needs to be provided when developing a digital navigation mobile app for vision impaired people.

● In addition to guidelines, this section also includes “Suggestions for further investigation” for future trials. These suggestions have not yet been tested in Wayfindr trials and so are not included as guidelines. They serve as a log of items for evaluation in future trials.

We welcome submissions of research findings, suggestions for future investigation and feedback on existing elements in the Wayfindr Open Standard.

4.1. Guidelines for various environmental elements

4.1.0. Purpose of this section

This section includes guidelines about the information that needs to be provided when vision impaired people interact with or confront various elements in their environments, such as entrances, escalators, lifts etc. The list of environmental elements is by no means exhaustive, but represents the elements that have been included in the scope of Wayfindr trials.


4.1.1. Entrances & Exits

Overview

As seen in Section 2.5, the shortest or fastest route is not always a priority for vision impaired people. They often care more about getting to their destination by the safest and most accessible route. There may, however, be differences in priorities between cane users and guide dog users (see Section s4.3.1.7 – Enabling users to choose their mobility aid).

Taking this principle into account, vision impaired people should be guided into a venue via

the most accessible route in relation to their previous location. For example, when a vision

impaired person is getting off the bus in order to go into a railway or metro station, the

closest or the main entrance is not necessarily the most accessible option. Similarly, when

leaving a venue, they should be guided through the most accessible route based on where

they want to go next.

Announcing orientation after entering or exiting a venue

Inform vision impaired people about their position when they have entered or left a venue so that they can make sense of the area around them. For example:

“Welcome to Central Station. You are now on the main concourse. For your train, move forward to the ticket barriers”

“You have exited from Central station. You are now on the east side of Alpha High Street facing north towards the City Hall”

Allowing users to choose entrance or exit

Venues have different types of entrances and exits, which may be close to escalators, lifts, stairs, ramps or similar. Which option a vision impaired person chooses will depend on a variety of factors – their destination point, confidence, mobility aid (guide dog users may prefer not to use some escalators), familiarity with the route etc. This guideline therefore recommends that digital navigation services allow vision impaired people to choose how they enter and exit a venue based on their personal preferences.


Informing users about door type

There are different types of doors ranging from automatic sliding doors and revolving doors

through to push doors and the like. When a vision impaired person is approaching an

entrance or an exit with doors, inform them about the type of doors so that they are ready to take the necessary action in order to go through the door. For example:

“Proceed through the automatic door in front of you. It will open towards you.”

Or

“Proceed through the door in front of you. The door is a push door.”

Announcing location of door opening button

As with Lifts (see Section 4.1.6.1), some doors require users to press a button in order to go

through the door. In such cases it is imperative to inform vision impaired people about the

location of the button. Vision impaired people are likely to locate the door first and then wall-trail with their hands in order to locate the button. For example:

“Go through the door in front of you. To open the door, press the exit button on the wall to the left of the door. The button is at waist height.”

Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

s4.1.1.1 - Revolving doors: tell users where to find an accessible door

If the main door is a revolving door inform vision impaired people where to locate the

accessible door. A revolving door can be difficult or impossible for vision impaired people to negotiate, particularly those with guide dogs.

s4.1.1.2 - Informing users about the accessible door


In some cases, where the main door of an entrance, exit, toilet etc. might be difficult, or impossible, for vision impaired people to negotiate (particularly with guide dogs), inform users where to locate the accessible alternative.

4.1.2. Pathways

Overview

As seen in Section 3.2.1, a pathway is a track that people can follow to get to their

destination. There are many different types of pathways that can be found in a built-environment. Some examples include:

• Corridors

• Ramps

• Tunnels

• Subways

• Pavements

• Crossings

• Intersections and junctions

• Stairs

• Lifts

• Escalators

Although stairs, lifts and escalators are considered pathways, they have dedicated sections in the Open Standard.

Pathways often provide vision impaired people with clues that facilitate their wayfinding task. For example, in a Metro station a tunnel may have different lighting and air flow from a corridor intersection.

Providing guidance at each junction on the route

As mentioned in Section 2.4 “Landmarks and clues”, vision impaired people may, through the use of their remaining senses, identify when they reach a junction, especially indoors, but not everyone will do so, or do so consistently. It is therefore recommended to provide an audio instruction at all route junctions, even if the individual doesn’t need to change direction.


Cane users may use the shoreline to identify junctions. An adequate announcement may even give some travellers the freedom to move off the shoreline and still identify a junction. The announcement should be given 8 +/- 1 metres in advance, to give users time to rejoin the shoreline and identify the junction on the route.
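A service that tracks the user’s distance to the next junction could apply the 8 +/- 1 metre rule along the following lines. This Python sketch is a minimal illustration; the function name and the polling model are assumptions, not part of the Standard.

```python
# Trigger window for junction announcements: 8 +/- 1 metres ahead.
JUNCTION_ANNOUNCE_M = 8.0
TOLERANCE_M = 1.0

def should_announce_junction(distance_m: float, already_announced: bool) -> bool:
    """True once the user enters the announcement window for the next junction.

    `distance_m` is the user's current distance to the junction; the caller
    is assumed to poll this as positioning updates arrive.
    """
    return (not already_announced
            and distance_m <= JUNCTION_ANNOUNCE_M + TOLERANCE_M)
```

Tracking whether the junction has already been announced prevents the same instruction repeating on every positioning update.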

Providing reassurance over long distances where no change in direction is required

Where vision impaired people travel on a route of more than 50 metres without needing to change direction, they may wish to check that they are heading in the right direction. Once they hear an instruction they are reassured and may continue as before. It is recommended to give an audio instruction confirming that they are on the right track at 25-metre intervals. For example:

“Keep moving forward.”
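One way a navigation service might schedule these reassurance instructions is to precompute trigger distances along a straight segment. The following Python sketch is illustrative; the 50-metre threshold and 25-metre interval come from the guideline above, while the function name and return format are assumptions.

```python
def reassurance_points(segment_length_m: float, interval_m: float = 25.0) -> list:
    """Distances from the segment start at which to give a reassurance
    instruction such as "Keep moving forward."

    Segments of 50 metres or less need no extra reassurance under the
    guideline; longer segments get an instruction every `interval_m` metres.
    """
    if segment_length_m <= 50.0:
        return []
    points = []
    d = interval_m
    while d < segment_length_m:
        points.append(d)
        d += interval_m
    return points
```

For an 80-metre corridor this yields trigger points at 25, 50 and 75 metres from the segment start.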

Informing about a curve in the pathway

If a pathway has a significant curve, inform vision impaired people in advance so that they do not feel that they have taken a wrong turn. For example:

“Keep moving forward. The corridor curves to the left.”

Warning about obstacles on the pathway

If the pathway contains obstacles that might be hazardous or challenge vision impaired people, warn them upfront so that they are prepared to face them. For example,

“Beware, pillars ahead”

“Beware, benches on the left”

Providing information for shorelining

As seen in Section 2.3 “Orientation and Mobility (O&M) training”, one of the common navigation strategies of long cane users is shorelining. Therefore, inform users if there are opportunities for shorelining on their route so that they can keep a point of reference. For example:


“Follow the building line on the left”

“Follow the kerb line on the left”

“Keep left, past the shops”

Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

s4.1.2.1 - Advising to keep on one side on a two-way route

On a route where people are moving in both directions, advise vision impaired users to keep

on the side of the route that is best for the direction in which they are travelling. For example:

“Turn left and move forward. Keep on the left side of the corridor after turning”

s4.1.2.2 - Announcing the pathway type

There are different types of pathways such as corridors, tunnels, ramps etc. It is suggested to announce which type of pathway users are following. For example:

“Turn left and move forward into the tunnel.”

4.1.3. Tactile paving

Overview

Tactile paving (used in many countries) is specialist paving which can be detected underfoot

to convey information to vision impaired people. The paving is also often in a contrasting

colour or tone to the rest of the nearby ground surface to provide an additional cue to those

with useful residual vision. Internationally it is referred to as Tactile Walking Surface

Indicators (TWSI) and is the subject of an international standard [ISO 23599:2012, Assistive products for blind and vision impaired persons].


There are two different types of tactile paving:

• Warning paving, which is used variously to warn of pedestrian crossings, platform

edges and the tops and bottoms of staircases and ramps. In many cases, they can also

be used to signal change of direction.

• Directional paving used for guidance.

Tactile paving is useful for navigation as it provides a useful clue and reference point. Vision

impaired people may be trained in the location of tactile paving along their familiar routes.

References

1. ISO 23599:2012, Assistive products for blind and vision impaired persons

Mentioning tactile paving in all audio instructions

For the reasons mentioned above it is recommended to incorporate tactile paving in audio

instructions where it is used correctly for guidance or warning along a route. It will be important to check that the paving is laid correctly if the messaging is to be accurate.

When tactile paving is being used for guidance, the audio instructions can be expressed as in the following examples:

“Turn left and move forward until you locate the tactile guidance paving. Follow the guidance paving to your left.”

“At the next tactile paving intersection, continue to follow the tactile paving forward.”

“At the next warning tactile intersection, turn left and continue to follow the warning tactile indicators.”

“At the end of the tactile paving, keep moving forward.”

When tactile paving is being used for warning purposes, such as to indicate the edge of a platform, the audio instruction can be expressed as in the following example:


“The platform ahead has tactile paving to warn of the platform edge.”

4.1.4. Escalators

Overview

Vision impaired people become aware when they are approaching an escalator as a result of

different environmental stimuli. It might be a different floor texture (e.g. that covers the

escalator mechanism), the vibrations close to the escalator or the sound the escalators make

when they are moving. In some countries there may be warning TWSIs which indicate the approach to an escalator.

When vision impaired people are about to step onto an escalator they tend to look for the

handrail as an indicator of the direction of the escalator. When they are on the escalator they

tend to hold the handrail for safety and also as an indicator of when the end of the escalator is approaching.

White cane users tend to put their cane against the next step in the escalator. This acts as an

additional indicator when the end of the escalator is near. A guide dog user may stand with one foot ahead of the other for the same reason.

The side of the escalator on which people are encouraged to stand can vary between

countries. In the UK, for example, in many transport environments, it is recommended that

people stand on the right hand side of the escalator. You may want to include an audio

reminder that lets the user know on which side of the escalator to stand when they step onto

it. An audio reminder for vision impaired people may well prevent them being pushed by other passengers.

Announcing direction of travel of escalator

Vision impaired people need information about the direction they need to follow in order to orientate themselves. For example:

“Move forward and take the escalator down to the platforms.”

“Move forward and take the escalator up to the main concourse.”


If there is more than one escalator it is vital that vision impaired people are directed to the

one they require. For example after the instructions above it would be helpful to have these follow up instructions:

“... Both escalators are going down.”

“... The up escalator is the one on the left.”

“... The two escalators going up are the ones on the left.”

Given this information, vision impaired people can be expected to stand on the side of the corridor close to the escalator. Alternatively, if they are not in a corridor, they may be able to tell from the noise of the escalators and the flow of people around them which escalator they need to take. If in doubt they will ask staff or other passengers.

In some cases, the direction of an escalator is likely to change, particularly during peak hours. This change should be taken into account and the instructions updated accordingly.

Announcing proximity of escalators

If the instruction about taking the escalator is given more than 25 +/- 1 metres in advance,

an additional instruction is needed 8 +/- 1 metres before the escalator to reassure vision impaired people that they are moving towards the escalators as intended. For example:

“You are approaching the escalators.”
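The two thresholds above (an instruction given more than 25 +/- 1 metres out, plus an extra alert 8 +/- 1 metres before the escalator) could be sketched in code as follows. The function names and the polling model are assumptions for illustration, not part of the Standard.

```python
def needs_approach_alert(instruction_distance_m: float) -> bool:
    """True when the escalator instruction was given more than 25 +/- 1 m
    in advance, so an extra approach alert should be scheduled."""
    return instruction_distance_m > 25.0 + 1.0

def in_approach_alert_window(distance_m: float) -> bool:
    """True inside the 8 +/- 1 m window where the approach alert fires."""
    return 7.0 <= distance_m <= 9.0
```

A service would check `needs_approach_alert` once when issuing the original instruction, then watch for `in_approach_alert_window` as positioning updates arrive.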

Indicating next move while travelling on the escalator

When vision impaired people are travelling on an escalator they need to be prepared for their next move when they step off. This will enable them to avoid being pushed from behind by other passengers, and to avoid blocking others getting off the escalator behind them.

For example:

“At the bottom of the escalator, turn left and move forward to the ticket gates.”


4.1.5. Stairs

Overview

Short flights of stairs that are straight rather than curving, and which have wide steps, seem to be the preferred option as they allow vision impaired people to keep control over how they move.

Not all guide dogs are trained to use escalators. Users of dogs which are not trained, as well as some other vision impaired people, will prefer to use fixed stairways (but not spiral stairs).

For most vision impaired people having a handrail to hold on to is vital so that they are able

to maintain their balance. Some guide dogs have been trained to target the handrail on the right.

When going up the stairs, long cane users hold the cane vertically against the riser of the next

tread. In this way they detect the steps as they go and can identify when they have reached

the landing or top of the stairs. Similarly, when going down stairs long cane users hold the cane against the edge of the second next step so that they can sense where the steps end.

Announcing the direction of stairs

It is helpful to announce if the stairs are going up or down to assist vision impaired people in creating a mental map of their environment. For example:

“Turn left and take the stairs down to the ticket hall.”

“Move forward and take the stairs up to the main concourse.”

Describing the number of steps

Describing the number of steps enables vision impaired people to move forward with

confidence. However, only give the number if it is a short flight of 12 steps or fewer.

Counting steps on longer flights can be a distraction for vision impaired people, particularly in busy environments. For example:

“Move forward and take the stairs down to the ticket hall. There are nine steps.”

“Move forward and take the stairs down to the ticket hall. This is a long flight.”
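The 12-step rule above lends itself to a small helper in a navigation service. This Python sketch is illustrative; note that a production text-to-speech pipeline might spell numbers out as words (e.g. “nine”), as in the example instructions.

```python
def stairs_detail(step_count: int) -> str:
    """Trailing detail for a stairs instruction, per the 12-step rule:
    give the count only for short flights of 12 steps or fewer."""
    if step_count <= 12:
        return f"There are {step_count} steps."
    return "This is a long flight."
```

The returned phrase would be appended to the main instruction, e.g. “Move forward and take the stairs down to the ticket hall.”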


Announcing if there is more than one flight of stairs

Vision impaired people will use their primary mobility aid to indicate when they have

reached a landing and top/bottom of stairs. However, an audio instruction will also be

useful. There are various ways to communicate this, such as “landing between” or “break between” with a preference for the latter. For example:

“Move forward and take the stairs down to the ticket hall.

This is a long flight with a landing at the mid-point.”

If the direction changes at the landing point, inform vision impaired people about this direction change. For example:

“At the landing turn right for the next part of the flight.”

Indicating next move while using stairs

As with Escalators (Section 4.1.4) and Ticket control (Section 4.1.7) vision impaired people

need to prepare for what they will do once they leave the stairs. It is recommended to leave

a pause between groups of instructions particularly if the instructions are lengthy. For example:

“Move forward and take the stairs down to the ticket hall. This is a long flight with a landing at the midpoint. (Pause) At the bottom of the stairs, move forward.”

Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

s4.1.5.1 - Advising which side of stairs to use

As with Escalators (Section 4.1.4), on some staircases users will be advised to use one part of

the stairs, i.e. either the left or the right. Vision impaired people need to be guided towards that side of the stairs. Suggested example:

“Move forward and take the stairs up to the ticket hall. This is a long flight.

Keep to the left side of the staircase.”


s4.1.5.2 - Announcing open riser stairs

Open riser staircases present particular challenges for vision impaired people. For example, there is a danger that their white cane or their feet could get caught between the open risers. In such situations, it may be helpful to include notice of that detail in the audio instruction. Suggested example:

“Move forward and take the stairs up to the ticket hall. This is an open riser staircase.”

4.1.6. Lifts

Overview

When vision impaired people are about to use lifts (elevators), they may face challenges such as the examples below:

• Which lift will be the next to arrive?

• Are the doors automatic or do they need to be opened physically and how is that

done?

• Is the lift going up or down?

• Where are they currently, i.e. which floor or level are they on?

• Which side of the lift will the doors open on (if there are doors at both ends)?

It is increasingly the case that lifts have audible announcements to inform their users about the aspects above.

Lifts are a good option when they save the effort of navigating around the station, for example when a lift takes passengers from the platform straight to the ticket control area. However, some people report that they do not feel comfortable in the closed environment of a lift. There may also be no reasonable alternative to the lift, for example where the stepped access is very long and is only to be used in emergencies.


Announcing the location of the call button

When facing a lift, vision impaired people will first locate the lift door and then scan the surrounding area to locate the lift call button. It is recommended to announce the location of the call button in relation to the lift doors. For example:

“Turn right and take the lift down to the ticket hall. The call button is to the left of the lift doors at waist height.”

Announcing which button to press when travelling in a lift

If the lift serves more than two levels, the audio instruction should indicate where the buttons are inside the lift and which button to press to get to each level. For example:

“When inside, press the button marked minus one for the ticket hall.”

Modern lifts have tactile buttons and may also have braille information for the 10% of vision impaired people who use braille. This is in accordance with various standards (BS EN 81-70:2003; BS 8300:2009+A1:2010) and regulations.

References

● Approved Document M: access to and use of buildings, volume 1: dwellings (2015),

Department for Communities and Local Government.

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/50

6503/BR_PDF_AD_M1_2015_with_2016_amendments_V3.pdf (last accessed: 2

April 2016)

● BS EN 81-70:2003, Safety rules for the construction and installation of lifts.

Particular applications for passenger and goods passenger lifts. Accessibility to lifts

for persons including persons with disability, British Standards Institute.

● BS 8300:2009+A1:2010, Design of buildings and their approaches to meet the needs of disabled people. Code of practice. British Standards Institute.

● AS1735.12, Lifts, escalators and moving walks: Facilities for persons with disabilities.

Indicating next move before getting out of a lift

As with Escalators (Section 4.1.4), Stairs (Section 4.1.5) and Ticket control (Section 4.1.7), vision impaired people require advance warning of their next move before they get out of the lift. This helps maintain a continuous flow of people when the lift doors open, and means that vision impaired people will not inadvertently stand in front of the lifts. For example:

“Upon leaving the lift, turn left and move forward to the ticket hall.”

4.1.7. Ticket control

Overview

Ticket control gates usually make a distinctive sound that may be a useful clue for those approaching the gates.

Most travellers choose their gate depending on where they are going and where they came from. Vision impaired people using a white cane, or without any mobility aid, seem to prefer the gate with the most straightforward access, regardless of whether it is a narrow or wide gate. Guide dog users most often require directions to the wide, accessible gate. In some stations the accessible gate is designed for wheelchair users and may work differently from the regular gates. The user should be warned about this as they approach.

As it is common practice amongst transport operators to have staff standing next to the wide

gate, it may be preferable to guide vision impaired people towards the wide gate where they will find staff to assist them as required.

Choosing the correct gate to travel in desired direction

To control crowd flow, transport operators try to separate routes through the ticket gates for embarking and departing passengers. Vision impaired people should be guided to the gates that will take them in the direction they require. For example:

“Turn left and move forward to the ticket gates. The ticket gates leading to the platforms are on the left.”

When giving this instruction, expect some vision impaired people to bear left so that they align with the gates that lead to the platforms.


Describing the location of a wide gate

For the reasons stated in Section 4.1.7.0 above, vision impaired people should be informed about the position of a wide gate. For example:

“Go through the ticket gates in front of you. The wide gate is on the right.”

Guide dog users prefer the wide gate as they can maintain the correct guiding position in

relation to the dog whilst going through the barrier. If the gate is at the extreme left or right

of the ticket gates the dog will normally return to the initial line of travel without prompting. This is based on the Straight Line principle in guide dog training.

If a digital navigation service captures user options such as the mobility aid (e.g. guide dog or

long cane) (as suggested in Section s4.3.1.7) or preference to the type of gate (narrow or

wide), then the instructions should be personalised and should guide vision impaired people to the appropriate gate, i.e. narrow or wide.
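As an illustration, this personalisation can be sketched as a simple lookup keyed on the stored preferences. The function name, message texts and preference keys below are illustrative assumptions, not part of this Standard:

```python
# Hypothetical sketch: selecting a personalised ticket-gate instruction
# from a user's saved mobility-aid and gate preferences.

GENERIC = "Go through the ticket gates in front of you."
WIDE = "Go through the ticket gates in front of you. The wide gate is on the right."

def gate_instruction(mobility_aid: str, prefers_wide_gate: bool = False) -> str:
    """Return the gate instruction matching the user's profile.

    Guide dog users, and anyone who has opted for the wide gate,
    are directed to the wide gate; others get the generic instruction.
    """
    if mobility_aid == "guide_dog" or prefers_wide_gate:
        return WIDE
    return GENERIC
```

A digital navigation service would apply the same selection at every gate line on the route, so the user hears only the variant relevant to them.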

Indicating next move while approaching the ticket control

As with Escalators (Section 4.1.4) and Stairs (Section 4.1.5) vision impaired people need to

know what they should do once they go through the gates, otherwise they may obstruct passenger flows and be at risk of injury. For example:

“Go through the ticket gates in front of you. After the ticket gate, turn left.”

If the first part of the instruction includes a lot of information, pause before providing subsequent information about what to do after the gate. For example:

“Go through the ticket gates in front of you. The wide gate is on the left.

(Pause) After the gate, turn left and move forward.”

Using appropriate terminology to refer to the ticket control

Terminology depends on context and varies between countries. It is important therefore to

consider the terms to be used to describe the ticket control. The terms most frequently used for the ticket control are:

• ticket barriers

• gates

• wide gates or accessible gates


Avoid operational terms like “gate line” as those are not terms used by the general public.

Clearly communicating actions required

Verbs that clearly communicate an essential action should be used to guide vision impaired

people when they have to go through a gate. For example:

“Go through the ticket gates in front of you”

“Proceed through the ticket barriers in front of you”

Phrases like “cross the gate” may create confusion. If they are taken literally users may think that they simply need to walk past the gate rather than actually going through it.

4.1.8. Platforms

Overview

Some vision impaired people correctly identify when they arrive on a platform as a result of

stimuli such as an alteration in the movement of air around them, the number of people

standing close to them and the sounds coming from the train. This is not true for everyone. It

is therefore vital to make an announcement prior to their arrival on the platform, both to

warn them that they are entering a platform area and to make them aware of the type of platform, for example, an island platform.

Platforms are considered the most challenging part of a station for a number of reasons but

primarily because of the risk of falling on to the tracks. In many countries, mainline railway

and metro stations have tactile paving warning of the edge of the platform. Vision impaired

people look for this in order to be aware of where they are in relation to the edge of the

platform. Some metro stations have sliding doors between the train and the platform to reduce the risk of falling down on to the tracks.

Live information about the next train is extremely helpful for vision impaired passengers who need to know the direction of travel of the train.

Secondly, it can be challenging to know where the train doors will be when the train stops to

ensure easy boarding of the train. Many transport operators cannot predict precisely where the doors will be when the train stops.


Thirdly, it is vital to communicate the location and size of any gap between the train and the

platform edge and whether it is a step up or down to the train. Additionally, any difference

between the lengths of the train and the platform must be communicated. Vision impaired users need this information to step on and off safely.

Guidance is needed when some vision impaired users get off the train so that they can make their way safely to the nearest exit from the platform.

Announcing arrival on the platform

It is advisable to announce that passengers have reached the platform when they are about to board the train. For example:

“You are now at the [Southbound] [Central] line platform.”

“After the escalator, you will be on the [Southbound] [Central] line platform.”

On some platforms there will be tracks on either side, i.e. on island platforms. This should be announced:

“You are now at a platform where trains leave from both sides.

Southbound [Central line] trains leave from the right. Northbound [Central line] trains leave from the left.”

Determining orientation in relation to the platform edge

It is necessary to communicate with vision impaired passengers so that they can understand where they are in relation to the platform edge and the train. For example:

“You are now at the platform. The trains leave in front of you.”

“You are now at the platform. The trains leave from your left.”

“You are now at the platform. The platform edge is on your left.”


Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

s4.1.8.1 - Determining position in relation to the platform length

Vision impaired passengers require information to determine exactly where they are on the

platform in relation to the platform length. This information is valuable because it

provides more clues to help them make sense of space. Some vision impaired people might

also want to board a specific section of the train if this will help them find the exit more

easily when they leave the train. For example:

“You are standing at the front part of the platform.”

“You are standing in front of carriage 3.”

“You are standing between carriages 2 and 3.”

Note that these messages might not work for island platforms where the direction of trains

might be different or might not be as effective if the trains do not have a consistent number of carriages.

s4.1.8.2 - Determining orientation in relation to the direction of travel

To assist vision impaired passengers it is recommended that there is an announcement about the direction of approach of the train. For example:

“The train will arrive from your left as you face the platform edge”

s4.1.8.3 - Announcing close proximity of two platforms

Where two platforms are very close to each other, but are not island platforms, vision

impaired passengers require guidance about the correct platform that they wish to take. For example:

“You are now at the platform. [Central] line trains leave from the platform on the left.”


s4.1.8.4 - Warning if platform is part of pedestrian route

In some cases the route to another platform or an exit goes via a platform. As platforms are a

potentially risky area for passengers it is recommended that vision impaired users are

advised that they are required to walk along a platform so that they may be aware of the platform edge.

s4.1.8.5 - Announcing nearest way out before leaving the train

On exiting a train vision impaired people will generally want to move away from the platform as quickly and safely as possible by the nearest exit.

Taking into account the need for the safest route, as seen in Section 2.5, it is therefore advisable to provide directions that, where possible, avoid routes involving negotiating other platforms.

4.2. Guidelines for various types of built-environment

4.2.0. Purpose of this section

The purpose of this section is to provide information about different built-environments,

their main characteristics in terms of layout and the potential clues and landmarks that are likely to be used by vision impaired people in these environments.

Under each type of built-environment, there is a link to Guidelines for the environmental elements that are likely to be found in this environment.

4.2.1. Mainline Rail and Metro stations

Overview

As seen in Section 3.1.9, the route should be divided into a number of segments that could

help vision impaired people to understand better the environment they are in. These

segments serve as reference points and help vision impaired people to create an image of the space and understand where they are heading.


For mainline rail and metro stations, there are conventions in the station layout design that could be used as route segments. For example, most stations have:

• Interchange connections or surface transport, allowing passengers to

connect to and from other modes of transport

• A forecourt outside the station with pathways that might lead to other modes of

transport

• Multiple entrances with one usually designed to be the main entrance. Each

entrance might provide a different way to access the concourse, e.g. via stairs, lifts or

escalators

• A concourse, usually split into the unpaid and the paid areas, separated by

ticket barriers

• Platforms, where passengers board and alight from trains

• Station accommodation, that includes amenities such as retail outlets, toilets, lounges, private areas for staff etc.

As seen in Section 2.4 vision impaired people might identify various environmental elements

as clues or landmarks along their route based mainly on tactile or auditory cues. Examples of

landmarks that can be found in a mainline rail and metro station include:

• Escalators

• Lifts

• Stairs

• Ramps

• Tactile paving

• Pathway intersections

• Ticket gates

• Cash machines

• Ticket machines

• Info points

• Meeting points

• Toilets


4.3. Guidelines for mobile app development

4.3.0. Purpose of this section

Developers and designers of digital navigation services can find here:

• A list of guidelines for mobile app features, whose need has been identified through

Wayfindr’s research trials.

• A list of guidelines on sound design, as sounds play an important role in the overall

wayfinding experience and in particular in attracting users’ attention.

• A link to the open-source Wayfindr demo iOS app with some of the recommended

functionality implemented. This app is intended for testing and demonstration

purposes. Feel free to download and customise it in order to run your own research trials.

4.3.1. Guidelines for mobile app functionality

Providing user preview of all audio instructions

It is recommended that vision impaired people are able to preview the route in advance of their journey so that they may have an idea or create a mental map of what to expect.

Depending on their task or personal preferences, a vision impaired person should be able to choose between:

• Getting a high level description of the journey that covers key turns, changes in levels,

major decision points, with the additional option of any major points of interest

• Previewing all the turn-by-turn instructions of their journey

NB. This functionality can be partly demonstrated in the Wayfindr Demo iOS App v0.4
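For instance, the two preview levels could be derived from a single list of route segments, as in this sketch. The segment structure and field names ("text", "key_point") are assumptions for illustration only:

```python
# Hypothetical route model: each segment carries its spoken instruction
# and a flag marking key turns, level changes and major decision points.
ROUTE = [
    {"text": "Enter via the main entrance.", "key_point": True},
    {"text": "Move forward 10 metres.", "key_point": False},
    {"text": "Take the escalator down to the concourse.", "key_point": True},
    {"text": "Turn left after the escalator.", "key_point": False},
    {"text": "Go through the ticket gates.", "key_point": True},
]

def preview(route, high_level=False):
    """Return turn-by-turn instructions, or only the key decision
    points when a high-level overview is requested."""
    return [s["text"] for s in route if s["key_point"] or not high_level]
```

Keeping both previews as views over the same data means the overview can never drift out of sync with the turn-by-turn instructions.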

Enabling replay of previous audio instruction

It is helpful to provide users with the option to replay a previous instruction in the interests

of reducing distress while on the move. This feature allows vision impaired passengers to feel


more in control of their journey, as they are able to confirm that they have heard an instruction correctly.

Where the screen reader mode (e.g. VoiceOver) is on and the instruction is displayed as text on the screen, users

can scroll to the text to listen to it again. However, based on user research findings it is more

effective if there is a clear call-to-action button to “Repeat” the previous instruction. There is

a risk of interference with upcoming instructions while replaying a previous instruction,

which must be taken into account. To mitigate this, you could make it so that ‘reassuring messages’ cannot be replayed if they interfere with the next instruction.

NB. This functionality can be demonstrated in the Wayfindr Demo iOS App v0.4
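One possible shape for the mitigation described above is sketched below: a hypothetical player that suppresses the “Repeat” action for reassuring messages when the next instruction is imminent. The class name, fields and threshold are illustrative assumptions, not part of the Standard:

```python
# Sketch of a "Repeat" control that withholds low-priority reassurance
# messages when the next instruction is about to play.

class InstructionPlayer:
    def __init__(self):
        self.last_instruction = None
        self.last_was_reassurance = False
        self.next_due_in_s = None  # estimated seconds until next instruction

    def play(self, text, reassurance=False, next_due_in_s=None):
        self.last_instruction = text
        self.last_was_reassurance = reassurance
        self.next_due_in_s = next_due_in_s
        return text

    def repeat(self, min_gap_s=5.0):
        """Replay the previous instruction, unless it was a reassurance
        message that would collide with the upcoming instruction."""
        if self.last_instruction is None:
            return None
        if (self.last_was_reassurance
                and self.next_due_in_s is not None
                and self.next_due_in_s < min_gap_s):
            return None  # suppressed to avoid interference
        return self.last_instruction
```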

Enabling dictation for searching

It is recommended to use the dictation system native to the mobile operating system,

as this will allow users to search more quickly when on the move by dictating their

preference, rather than typing it into the mobile device whilst using their primary mobility aid.

A risk with this is that noisy environments could interfere with the dictated instructions that are being spoken into the device.

NB. This functionality can be demonstrated in the Wayfindr Demo iOS App v0.4

Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

s4.3.1.1 - Enabling selection of directional instructions

As mentioned in Section 3.1.7 there are various strategies for communicating directions to

vision impaired users and different approaches in different countries. Vision impaired users

have no clear single preference for the mode of communication, so they should be able to pick the means they prefer.

It is recommended therefore to enable users to choose between:


• Clock face directions, for example “turn at 1 o’clock” (common in the UK)

• Degrees, for example “turn at 45 degrees to your right”

• Proportional directions, for example “turn slightly to your left”

• Cardinal directions, for example “turn facing east” (common in the USA)
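A minimal sketch of rendering one relative turn angle in each of the four formats follows. The exact wording, thresholds and rounding choices are illustrative assumptions, not prescribed by this Standard:

```python
# Angle convention (assumed): degrees clockwise from straight ahead,
# in the range -180..180; negative values turn left.

def clock_face(angle):
    hour = round((angle % 360) / 30) % 12
    return f"turn at {12 if hour == 0 else hour} o'clock"

def in_degrees(angle):
    side = "right" if angle >= 0 else "left"
    return f"turn {abs(angle)} degrees to your {side}"

def proportional(angle):
    side = "right" if angle >= 0 else "left"
    a = abs(angle)
    if a < 15:
        return "move forward"
    if a < 60:
        return f"turn slightly to your {side}"
    return f"turn to your {side}"

def cardinal(angle, current_heading):
    # current_heading: degrees clockwise from north
    names = ["north", "east", "south", "west"]
    target = (current_heading + angle) % 360
    return f"turn facing {names[round(target / 90) % 4]}"
```

Storing the turn as a single angle and rendering it per user preference keeps the route data independent of the chosen instruction style.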

s4.3.1.2 - Enabling dictation for selection of options

It is recommended to provide vision impaired users with the means to dictate their selection

from a list of options, as this reduces the effort of making the selection on their device. It is also useful

for the user to be able to ask, by voice, for the previous instruction to be repeated or to make a selection from a list.

s4.3.1.3 - Providing guidance to the nearest help point

Where a passenger feels that they may have got lost in a controlled environment such as a

station it is important to provide guidance on how to seek help in an emergency. It is

possible to integrate this function with the venue owner’s operations. Alternatively, guide

vision impaired people to the nearest help points that can generally be found at various locations.

s4.3.1.4 - Suggesting the safest not the shortest or fastest route

As mentioned in Section 2.1.5 many vision impaired people prioritise route safety over

distance and duration. Investigating accessibility levels along each route to the destination

enables this metric to be calculated. However, what is considered as a “safe” route is

subjective for each individual, and more investigation is needed in order to better define the aspects that make a route “safe”.

s4.3.1.5 - Enabling users to choose voice type

This section was removed because the Operating System of the smartphone device

determines which voice someone can use.


s4.3.1.6 – Displaying the instructions as text

Some people might use braille displays to read the instructions, or might have enough residual

vision to read them on screen. Therefore, apart from communicating navigation instructions with audio, it

might also be helpful to display them as text on the screen.

NB. This functionality can be demonstrated in the Wayfindr Demo iOS App v0.4

s4.3.1.7 - Enabling users to choose their mobility aid

Enabling users to choose a route based on their mobility aid (white cane, guide dog, no aid) offers a more personalised user experience.

A cane user may prefer a route with good tactile clues and landmarks. A guide dog user may

prefer an alternative route that suits the dog’s ability with straight line working or “targeting” doors, steps, turns etc.

Based on their primary mobility aid, different instructions can be provided.

For example:

• For white cane users and no aid users: there may be no need to mention wide gates.

• For no aid users: since these users are more likely to have functional or residual

vision, they will be able to see the layout of the space. To make the system more

responsive to their needs the instruction should be to “take the first left” or “take the

second right.” You can also use proportional directions instead of clock faces or

degrees. For example, “go through to the gates on your left” instead of “turn 45 degrees to your left and go through the gates.”

s4.3.1.8 - Enabling saving of frequently used places

To reduce the effort required to input destinations it is recommended that users are enabled to save their most frequent destinations.

s4.3.1.9 - Enabling saving of personal landmarks

As described in Section 2.4 “Landmarks and clues”, vision impaired people use various

landmarks along a route. These landmarks might be different for every individual and might


be used repeatedly for various purposes. For example, a vision impaired person might use the landmarks as a reassurance that they are on the right track, as a meeting point etc.

Thus, allowing saving of personal landmarks with language that is more meaningful to each individual user is likely to provide a more personalised user experience.

4.3.2. Guidelines for sound design

Overview

As seen in Section 3.1.8 “Provide auditory cues”, sound plays an important role in the

wayfinding experience of vision impaired people, whether it is environmental, e.g. a car

passing by, or pre-determined, e.g. any sound design component such as a keypad sound or

alert. The predetermined auditory information for digital wayfinding, i.e. the sound design elements, comes in two forms:

• Notification alerts, which are sound alerts that precede the audio instructions

• Synthesised voice commands, which are computer-generated voice commands

often associated with the accessible use of a modern smartphone or computer, e.g.

VoiceOver mode on iOS or TalkBack mode on Android. These are vital tools for vision impaired people.

This section includes mainly guidelines about the sound design of notification alerts, as the synthesised voice commands are native to the mobile operating system.

Using sound to attract people’s attention

Using sound to attract people’s attention ahead of an audio instruction is helpful for the following reasons:

• It allows users to have their attention on the environment and to focus on the

technology only when they are required to listen to an instruction. If there is no clear

differentiation between the audio instruction and the environmental noises, they will be

constantly focusing on the technology for fear of missing an audio instruction.

• Similarly, without the use of sound to attract the user’s attention, environmental

noises may override the audio instruction resulting in users missing the audio

instruction.


• In addition, requesting the user’s attention only when it is needed enables them to

engage in conversation with companions. The sound will signal to them that an instruction is coming up, giving them the time to shift the focus of their attention.

Using different notification alerts for different purposes

Consider using two core notification alerts that serve different purposes, which are the

following:

• General Notification Alert is a short 2-note alert heard immediately prior to all

audio instructions, excluding the ‘Journey Completed’ audio instruction (see below).

This notification alert prepares vision impaired people for any upcoming voice

instruction, as seen in Section 3.1.8 “Provide auditory cues”. The proposed Wayfindr

sound can be heard and downloaded below. It is also integrated in the Wayfindr

Demo iOS App v0.4.

• Journey Completed Notification Alert is a short 3-note alert heard immediately

prior to any voice command that confirms the scheduled journey has been

completed. The proposed Wayfindr sound can be heard and downloaded below. It is also integrated in the Wayfindr Demo iOS App v0.4.

The notification alerts should alert the user without causing undue alarm or stress. They

should be minimally designed functional sounds that are simple and concise, providing

reassurance, whilst preparing the user for the imminent audio instruction. The proposed

Wayfindr sounds follow these principles and their use in digital navigation services is encouraged.

The two Wayfindr notification sound alerts above act as a coherent pair, yet they are varied

enough to suit their individual function. The two identically pitched notes used for the general

notification alert are short and generic, acting as a quick-fire prompt for the upcoming voice

instruction, whereas the three notes used for the journey completed alert have a rising pitch, indicating resolution: the successful completion of the intended journey.
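As a sketch, the two alert shapes described above could be synthesised as raw PCM samples: two identical-pitch notes for the general alert, three rising notes for journey completion. The frequencies and durations below are illustrative choices, not the actual Wayfindr sounds:

```python
import math

SAMPLE_RATE = 44100

def tone(freq_hz, dur_s=0.12):
    """One sine-wave note as a list of float samples in -1..1."""
    n = int(SAMPLE_RATE * dur_s)
    return [math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE) for i in range(n)]

def general_alert():
    return tone(880) + tone(880)               # two identical-pitch notes

def journey_completed_alert():
    return tone(660) + tone(880) + tone(1100)  # three notes, rising pitch
```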

Paired functional sounds (or UI sounds) are common practice, especially within the context

of smartphone use – renowned examples include the “Listen” and “Confirmation” sounds

used for the personal assistant “Siri” in Apple iOS, or the similarly-paired “Google Now”

sounds. Other examples of paired UI sounds typically include an on and off, lock and

unlock, and connect and disconnect pairing. The paired sounds will typically be similar in


tonality or instrumentation (sharing design principles that contribute to a consistent UX), but varied enough in their form to communicate a specific function.

Distinguishing notification alerts from other sounds

The notification alerts should be audible within a noisy environment and be distinguishable

from existing phone operating system alerts, ensuring the wayfinding experience is unique,

consistent and recognisable.

Suggestions for further investigation

The following paragraphs are not guidelines but suggested areas for future investigation through user research.

s4.3.2.1 - Using new sound alerts for more specific purposes

Additional notification alerts can be considered for future updates of the Open Standard –

such as specific warning alerts, categorised alerts for different objects or any other contextually-aware notifications.


5. Wayfinding technologies

5.0. Purpose of the section

There are various technologies that can be used for wayfinding purposes. The most common ones are GPS, Wi-Fi, RFID and Bluetooth.

In this section, as a built-environment owner who wants to provide wayfinding solutions to

make your environment accessible or as a developer who might be involved in wayfinding

projects, you will be able to find recommendations and best practices about the whole lifecycle of a technology, from the installation to long-term maintenance.

The Wayfindr research trials so far have been conducted using Bluetooth Low Energy

beacons. Thus, this version of the Open Standard includes recommendations about this type of technology.

5.1. Bluetooth Low Energy beacons

5.1.0. Purpose of this section

This section contains considerations and recommendations that are related to the whole

lifecycle of using Bluetooth Low Energy (BLE) beacons as a technology solution for digital wayfinding.

Venue owners, their accessibility stakeholders and developers who are involved with beacon deployment can find information on:

• Installation of BLE beacons in a built environment

• Configuration of the BLE beacons in order to make the most out of the installation

• Maintenance and Operational considerations that should be taken into account in order to manage a fleet of installed BLE beacons.


5.1.1. What is a Bluetooth Low Energy beacon?

Bluetooth is a global wireless communication standard for exchanging data over short distances; it was introduced in 1994 [3].

Bluetooth Low Energy (BLE) is a technology applied in the Bluetooth 4.0 protocol and above. It

was specifically designed with low power consumption in mind to support applications and

services that require connected devices (smartphones, tablets, laptops for example) located nearby. Most recent mobile devices support BLE.

A BLE beacon is an electronic device that repeatedly transmits a radio signal at periodic

intervals. The signal’s broadcast frequency and transmit power can be manually adjusted

(see Section 5.1.4). This radio signal carries data that allow each specific beacon to be identified by compatible devices once they are in range.

When a device receives the data from a BLE beacon, it can estimate how far away the beacon

is located. An application running on the device can detect the presence of BLE beacons and take actions

depending on the estimated distance and the data received. For example, in audio-based wayfinding for

vision impaired people the relevant audio instruction is triggered. All the logic about the

triggering of the correct audio instructions is in the app. See for example the Wayfindr Demo iOS app v0.4.
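For illustration, a common way for an app to turn a beacon's received signal strength (RSSI) into a rough distance is the log-distance path-loss model. The calibration values below (measured power at 1 metre, path-loss exponent) are typical assumptions, not values prescribed by this Standard:

```python
# Log-distance path-loss estimate: d = 10 ** ((txPower - RSSI) / (10 * n))
# tx_power_dbm: calibrated RSSI at 1 m from the beacon (vendor-supplied).
# n: path-loss exponent, ~2.0 in free space, higher in cluttered interiors.

def estimate_distance_m(rssi_dbm, tx_power_dbm=-59, n=2.0):
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * n))
```

With these example values, a reading of -59 dBm maps to roughly 1 metre and -79 dBm to roughly 10 metres; real readings fluctuate, which is why the distortion sources in the next section matter.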

5.1.2. Sources of Bluetooth signal distortion

The BLE signal is transmitted in the 2.4 GHz radio frequency band. This means that the BLE

signal may be distorted by interference from specific elements in the environment [2], such as:

• Metallic surfaces bouncing the signal off the surface in unexpected ways as it is

unable to penetrate the material

• Water absorbing BLE signal

• Human body mass absorbing and distorting BLE signal

• Concrete and bulletproof glass absorbing the signal

• Marble and bricks absorbing the signal

• Electronic devices operating in the 2.4 GHz frequency, thus emitting signal on the

same radio frequency which might overlap with the beacon signal


• Fluorescent lighting emitting signal in the 2.4 GHz frequency, thus likely to distort

beacon signal in unexpected ways

• Power sources such as electric railroad tracks or power lines also causing interference

When the Bluetooth signal is distorted, the mobile device will receive a signal that does not

reflect the real situation, e.g. the distance to the BLE beacon as read by the device might not

be accurate. Installing BLE beacons following the best practices proposed in the following section will mitigate this risk.
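In practice, apps also mitigate distortion in software by smoothing raw RSSI readings before estimating distance, for example with an exponential moving average. This sketch and its smoothing factor are illustrative only:

```python
# Exponential moving average over raw RSSI readings (dBm) to damp the
# spikes caused by reflection and absorption. alpha closer to 1 tracks
# new readings faster; closer to 0 smooths more aggressively.

class RssiFilter:
    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.value = None

    def update(self, rssi):
        if self.value is None:
            self.value = float(rssi)
        else:
            self.value = self.alpha * rssi + (1 - self.alpha) * self.value
        return self.value
```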

5.1.3. BLE beacon installation

Purpose of this section

Venue owners and other parties involved in BLE beacon deployment can find a description

of the two main approaches for installing BLE beacons (the proximity-based and the trilateration-based approach) along with best practices for BLE beacon positioning.

Proximity-based approach

Installing BLE beacons following a proximity-based approach means that beacons are placed

only at decision making points where people need to be instructed. These decision making points might be before and after doors, staircases and at pathway intersections.

Advantages of this approach

• The main advantage of the proximity-based approach is that a small number of BLE

beacons is needed to complete this type of installation.

• As a result, the costs of BLE beacon procurement, installation, configuration and

maintenance are reduced.

Disadvantages of this approach

• The decision making points in a venue where BLE beacons will be placed need to be

decided carefully.

• There are likely to be areas that the BLE beacon signal does not reach.

• It might be difficult to work successfully in large open areas.

• Misdirection or user error that leads a user out of the covered area is non-

recoverable.


• To detect the orientation of users, a combination of technologies might be needed, drawing on the smartphone’s orientation capabilities.

NB. The Wayfindr Demo iOS app v0.4 is developed for beacon deployments with a

proximity-based approach.

Trilateration-based approach

Installing BLE beacons following a trilateration-based approach means that BLE beacons are

placed so that they provide coverage to the whole area. In this way, the location of the user’s

smartphone device is estimated by measuring distance from the 3 closest BLE beacons using a trilateration algorithm [7].
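The geometric core of trilateration can be sketched in closed form for three beacons. Production systems typically use least-squares fitting over many noisy readings instead, so this is only an illustrative sketch:

```python
# Closed-form 2D trilateration: intersect three circles centred on the
# beacon positions with radii equal to the estimated distances.

def trilaterate(b1, b2, b3):
    """Each argument is ((x, y), distance). Returns the (x, y) estimate."""
    (x1, y1), r1 = b1
    (x2, y2), r2 = b2
    (x3, y3), r3 = b3
    # Subtracting pairs of circle equations yields two linear equations.
    A, B = 2 * (x2 - x1), 2 * (y2 - y1)
    C = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    D, E = 2 * (x3 - x2), 2 * (y3 - y2)
    F = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    x = (C * E - F * B) / (E * A - B * D)
    y = (C * D - A * F) / (B * D - A * E)
    return x, y
```

For example, beacons at (0, 0), (4, 0) and (0, 4) with distances √2, √10 and √10 resolve to the point (1, 1); noisy distance estimates move the solution accordingly.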

Advantages

• The main advantage of this approach is that the majority of a venue area is covered in

BLE beacons and as a result there are unlikely to be areas where the user position

cannot be estimated.

• With the trilateration method, the orientation of the user’s smartphone device can be

determined dynamically and as a result, the instructions given to the user can reflect that orientation.

Disadvantages

• A larger number of BLE beacons is required to achieve trilateration compared to the

proximity-based approach. This means increased costs for beacon installation,

configuration and maintenance.

• Location accuracy cannot be guaranteed as there are a few variables that are likely to

affect the stability of the Bluetooth signal such as the ones mentioned in Section 5.1.2

“Sources of Bluetooth Signal Distortion”.

• Changes to a built environment such as new construction, signage, temporary

fixtures, and sources of 2.4 GHz interference being introduced or removed can change the profile

of the space and affect a trilateration algorithm. If physical changes are common,

system calibration will be an on-going task and expense.

• Digital navigation services that do not use a trilateration algorithm to calculate routes might have difficulty providing a good user experience in built environments where beacons have been installed with trilateration in mind. Thus, ensure that beacons are positioned at decision-making points, landmarks or points of interest as described in the following section.


Best practices for BLE beacon positioning

BLE beacons can be installed in both indoor and outdoor environments. There is no single

way to install beacons, as layout and material vary across different environments. The

following best practice guidelines can be applied for efficient positioning across environments:

• Place the BLE beacons above head height (above 2.5 metres) in order to avoid interference from human body mass, which is likely to absorb a BLE beacon’s signal in busy environments.

• If the ceiling is up to 4 metres high, then place the BLE beacon on the ceiling.

• If there is an arch, then place the BLE beacon at the top and centre of the arch.

• If the ceiling or the top of arch is higher than 4 metres then use walls, placing the

BLE beacon at a height of around 2.5 metres (up to +1 metre) from the floor level.

Alternatively, if possible, suspend the BLE beacon from the ceiling on a cable.

• When the optimal position of a beacon interferes with other venue elements such as

metallic signage, major power conduit or fluorescent lighting, place the BLE beacon 1

metre away from these elements.

• If the BLE beacon is to be placed in a corridor less than 4 metres wide, then place the BLE beacon in the middle of the corridor to cover the full width equally.

• If the corridor is wider than 4 metres, consider using more BLE beacons to cover the

area evenly. In this case, place a BLE beacon at 4 metre intervals. For example, in an

entrance of a venue with multiple doors, the area is likely to be wider than 4 metres.

In this instance, to cover all doors with beacon signal, place a BLE beacon every 4

metres.

• Place a BLE beacon 4 +/- 1 metres before any landmarks or points of interest that people are likely to go through or interact with. This should be the case for both the proximity and the trilateration-based approach to installation. These landmarks and objects might be, for example:

o Entrance doors

o Pathways

o Escalators

o Stairs

o Lifts

o Ticket control gates

Read more about the information that needs to be provided when interacting with these landmarks in the Section 4.1 “Guidelines for various environmental elements”.
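As a rough illustration of the 4-metre spacing rule above, the helper below is a hypothetical sketch, not part of the Standard: it estimates how many beacons are needed to cover a corridor or entrance of a given width.

```python
import math

def beacons_for_width(width_m, spacing_m=4.0):
    """Number of BLE beacons needed to cover a corridor or entrance of the
    given width, placing one beacon per `spacing_m` metres; a single centred
    beacon suffices when the width is within one interval."""
    if width_m <= spacing_m:
        return 1
    return math.ceil(width_m / spacing_m)
```

For example, a 3-metre corridor needs a single centred beacon, whereas a 10-metre entrance with multiple doors would need three beacons at 4-metre intervals.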


• Most BLE beacons are suitable for outdoor installations; check with your BLE beacon supplier whether their BLE beacons are waterproof and what temperature range they work well in.

• Consider the orientation of the BLE beacon’s directional antenna. Depending on the BLE beacon manufacturer, some BLE beacons might not emit an all-round symmetrical signal, but instead emit a signal of elliptical form depending on which way the BLE beacon antenna is facing. Ask your BLE beacon manufacturer for details and orient the BLE beacon in a way that supports your needs.

5.1.4. The parameters of a BLE beacon

Configuration settings of BLE beacons vary based on the protocol on which they are configured. There are two main beacon protocols open to any beacon manufacturer in the market at the moment:

• iBeacon: The iBeacon protocol [1] is a communication format developed and

introduced by Apple in 2013 and is based on Bluetooth Low Energy technology. The

protocol is compatible with any iOS or Android device that supports Bluetooth 4.0

and above. The minimum operating system requirement is iOS 7 or Android 4.3 (Jelly Bean).

• Eddystone: Eddystone is an open beacon format developed and introduced by Google in 2015 [4] and is based on Bluetooth Low Energy technology. It is compatible with both iOS and Android devices that support Bluetooth 4.0 and above.

5.1.4.1 iBeacon

The iBeacon protocol [1] is a communication format developed and introduced by Apple in

2013 and is based on Bluetooth Low Energy technology. The protocol is compatible with any

iOS or Android device that supports Bluetooth 4.0 and above. The minimum operating system requirement is iOS 7 or Android 4.3 (Jelly Bean).

iBeacon Identifiers

A BLE beacon configured with the iBeacon format transmits its Unique ID, an advertising

packet that contains three customisable identifiers: the UUID, the Major and the Minor.

These three identifiers make an iBeacon identifiable and distinguishable from other iBeacons.


• Universally Unique Identifier (UUID): It is a 16 byte (128 bit) number that can

be used to identify a large group of BLE beacons. It is formatted in 32 hexadecimal

digits, split into 5 groups, separated with hyphen characters [8].

An example of a UUID is f3af8c82-a58a-45a1-b9b6-21e3cf47ded9.

It is common practice for the same UUID to be used to identify all the BLE beacons that belong to the same company or organisation. For example, a transport operator can have the same UUID for all the stations they manage. This is only a common practice and not a constraint, as the same company or organisation can generate more than one UUID. Although it is called a “unique identifier”, there is a possibility that different companies or organisations might be using the same UUID.

• Major: It is a 16-bit integer that can be used to identify a subgroup of BLE beacons that are under the same UUID. It is common practice for the same Major to be used for all the BLE beacons that belong to the same region or venue of the organisation. In the transport operator example, the BLE beacons that belong to the same station would be under the same Major. In this case, the Major becomes the venue identifier.

• Minor: It is a 16-bit integer that can be used to identify an individual beacon within a group of beacons with the same Major.

Things to keep in mind:

• The iBeacon format requires all three identifiers to be assigned.

• These three identifiers are advertised publicly. This means that anyone with an app

or a device that can detect BLE beacons will be able to capture these identifiers.

However, this does not necessarily mean that they can connect with them. Many BLE

beacon manufacturers have solutions to prevent “piggybacking” onto a fleet of BLE

beacons.

• Since Major and Minor are integers, they cannot include characters other than numbers. Therefore, every venue needs to be mapped to a number, which very often will be represented by the value of the Major.
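The three identifiers can be modelled as a simple value type. The sketch below is illustrative only — the class name is an assumption, not part of the iBeacon specification — and it enforces the 16-bit range of Major and Minor, reusing the example UUID from above:

```python
from dataclasses import dataclass
from uuid import UUID

@dataclass(frozen=True)
class IBeaconID:
    uuid: UUID   # shared by a large group of beacons, e.g. one organisation
    major: int   # subgroup, e.g. a venue or station (16-bit integer)
    minor: int   # individual beacon within the Major group (16-bit integer)

    def __post_init__(self):
        # Major and Minor must each fit in 16 bits (0-65535).
        for name in ("major", "minor"):
            value = getattr(self, name)
            if not 0 <= value <= 0xFFFF:
                raise ValueError(f"{name} must be a 16-bit integer")

beacon = IBeaconID(UUID("f3af8c82-a58a-45a1-b9b6-21e3cf47ded9"),
                   major=12, minor=3)
```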

5.1.4.2 Eddystone

Where an iBeacon transmits one advertising packet that includes the UUID, the Major and the Minor, Eddystone BLE beacons broadcast more than one type of advertising packet [4]. These advertising packets are called “frames”.


• The Eddystone-UID frame broadcasts a unique 16-byte BLE Beacon ID. It consists

of two parts: the 10-byte Namespace ID and the 6-byte Instance ID. The Namespace

ID can be used to specify a particular group of beacons, similar to the UUID in

iBeacon. The Instance ID can be used to identify individual beacons in a large fleet of

beacons.

• Eddystone-EID is a component of the Eddystone specification for BLE beacons.

Eddystone-EID BLE beacons broadcast an identifier that changes every few minutes.

The identifier can be resolved to useful information by a service that shares a key (the

Ephemeral Identity Key, or EIK) with the individual BLE beacon. Any use of

Eddystone-EID requires both a BLE beacon and a resolving service (such as the

Google Proximity Beacon API).

• The Eddystone-URL frame broadcasts a URL in a compressed encoding format.

Once received by a device, the URL is decoded and the user can select if they want to

visit the broadcasted URL. Although this advertising packet might not be very helpful

for wayfinding applications, it is useful for applications intended for web content

discovery.

• The Eddystone-TLM frame broadcasts data about the BLE beacon’s own operation,

the so-called telemetry information. This data is useful for monitoring the fleet of

beacons. When the Eddystone-TLM frame is transmitted, the following data can be

captured:

o BLE beacon battery level, in an encrypted format with the shared key, similar

to the Eddystone-EID

o Time that the BLE beacon has been active since it was last switched on

o The number of frames, i.e. advertising packets, the BLE beacon has transmitted since it was last switched on

o BLE beacon temperature
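As a minimal illustration of the Eddystone-UID layout described above (the function name is hypothetical), the 16-byte beacon ID splits into its 10-byte Namespace ID and 6-byte Instance ID:

```python
def split_eddystone_uid(beacon_id: bytes):
    """Split a 16-byte Eddystone-UID beacon ID into
    (namespace_id, instance_id): 10 bytes for the group, 6 for the beacon."""
    if len(beacon_id) != 16:
        raise ValueError("Eddystone-UID beacon ID must be 16 bytes")
    return beacon_id[:10], beacon_id[10:]
```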

5.1.4.3 Estimating the distance from a BLE beacon

As seen above in Section 5.1.2 the Bluetooth signal is open to distortion from different

sources. As a result it is difficult to accurately estimate the distance from a BLE beacon to a

device. Depending on the BLE beacon format that is used, i.e. iBeacon or Eddystone, there

are various parameters that could give an indication of the beacon’s distance from the device:

• Received Signal Strength Indicator (RSSI): This is an indication of the BLE

beacon signal strength as measured by a device. The RSSI is measured in dBm


(Decibel-milliwatts). The higher the RSSI value, the stronger the BLE beacon signal.

Based on the changes of the RSSI value, we can tell if a user is heading towards or

away from a BLE beacon. The RSSI values are specific to each beacon manufacturer

and based on how they have calibrated their BLE beacons (see “Measured Power or

Ranging Data” in Section 5.1.4.4). The RSSI values read by smartphones also vary between devices, as they depend on the Bluetooth chip that the device has on board.

• Proximity zones: For the iBeacon format only, the area around a beacon is divided

in four proximity zones based on the RSSI.

o Immediate, when the device is very close to the beacon, at distances of less than 50 cm

o Near, when the device is estimated to be at distances between 50 centimetres and 3 metres from the beacon

o Far, when the device is further away or the signal is fluctuating due to

distortions

o Unknown, when the distance cannot be estimated, mainly because the device is too far from the BLE beacon or because of distortions of the signal

The proximity zones can be used as filters in order to trigger content in context. For example, when a device enters a BLE beacon’s “Near” zone, a particular set of content can be triggered, whereas when the device is in the “Immediate” zone a different set of content can be triggered.

• Accuracy: This is a parameter in the iBeacon format only that indicates the

proximity value measured in metres. More specifically, it indicates the one sigma

horizontal accuracy, a parameter used in statistics [6]. The iBeacon documentation

suggests that the Accuracy parameter can be used to differentiate between beacons with the same proximity value. However, it also advises that the Accuracy should not be used to identify the exact distance of a user’s device from a beacon, because the Accuracy levels are affected by various sources of signal distortion and might not reflect the actual distance in metres.
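A common way to turn an RSSI reading into a rough distance, and that distance into the proximity zones above, is the log-distance path-loss model. The sketch below is an illustration under stated assumptions — the model, the default path-loss exponent of 2.0 (free space) and the function names are not part of the iBeacon specification. `measured_power_dbm` is the expected RSSI at 1 metre (see “Measured Power or Ranging Data” in Section 5.1.4.4).

```python
def estimate_distance_m(rssi_dbm, measured_power_dbm, path_loss_exponent=2.0):
    """Rough distance estimate (metres) from the log-distance path-loss
    model: distance = 10 ** ((measuredPower - RSSI) / (10 * n))."""
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def proximity_zone(distance_m):
    """Map an estimated distance onto the proximity zones described above."""
    if distance_m is None:
        return "unknown"
    if distance_m < 0.5:
        return "immediate"
    if distance_m <= 3.0:
        return "near"
    return "far"
```

For example, a reading of -79 dBm against a Measured Power of -59 dBm gives an estimate of 10 metres, i.e. the “far” zone. Given the distortion sources in Section 5.1.2, such estimates should be treated as indicative only.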

5.1.4.4 BLE Beacon configuration settings

Regardless of their protocol format, i.e. iBeacon or Eddystone, a BLE beacon can be configured by adjusting the following parameters in order to achieve best results:


• Advertising interval. It specifies how often a BLE beacon transmits its signal. The

values for the advertising interval range from 100 milliseconds to 2000 milliseconds.

The shorter the interval, the more often the BLE beacon transmits its signal, and the more stable the signal is. However, reducing the interval, i.e. making the BLE beacon emit its signal more often, has a big impact on the BLE beacon’s battery life. In most cases an advertising interval between 300 and 350 milliseconds will maintain a good balance between signal stability and battery life.

• Broadcasting or Transmission (TX) Power: It determines how far a BLE beacon emits its signal. The broadcasting power is measured in dBm (Decibel-milliwatts) and ranges between -100 dBm and +20 dBm. The higher the power, the further the signal travels. The maximum distance that a Bluetooth Low Energy signal can travel is several hundred metres, assuming that there are no walls or other sources of signal distortion.

Finding the right broadcasting power for a BLE beacon depends on the context. For

example, a BLE beacon installed in a large concourse might need to be set to a higher

broadcasting power. In cases where a BLE beacon must be triggered only when in the

boundaries of a point of interest or landmark, the broadcasting power will need to be lower so that the BLE beacon is not detected from further away.

The broadcasting power also has an impact on the BLE beacon’s battery life, but not as greatly as the advertising interval. As a rule of thumb, battery usage is 30% higher at maximum power than at minimum power.

• Measured Power or Ranging Data: This parameter is used in estimating

distance from a BLE beacon (see Section 5.1.4.3). iBeacon and Eddystone define this

parameter differently. In the iBeacon protocol, it is called Measured Power and it is

the expected value of Received Signal Strength Indicator (RSSI) at one meter

distance from the BLE beacon. In the Eddystone protocol, it is called Ranging Data

and it is measured at zero meter distance from the BLE beacon. However, the

Eddystone specification [4] proposes to “measure the actual output of the BLE

beacon from 1 meter away and then add 41 dBm to that. 41 dBm is the signal loss that

occurs over 1 meter.”

• Many BLE beacon manufacturers have a default factory calibration for the Measured Power that cannot be changed. However, it is advised that when beacons are installed indoors, Measured Power or Ranging Data samples be taken in situ and the value be set accordingly. In this way the real environment, with its potential sources of Bluetooth signal distortion, is taken into account (see Section 5.1.2).


Note that the Eddystone protocol uses the Ranging Data parameter only for the UID, EID and URL frames.
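The configuration parameters above can be captured in a small value object. This is an illustrative sketch only — the class, field names and defaults are assumptions, not any manufacturer’s API — checking the ranges given in this section:

```python
from dataclasses import dataclass

@dataclass
class BeaconConfig:
    advertising_interval_ms: int = 350  # 300-350 ms balances stability and battery
    tx_power_dbm: int = -12             # venue-dependent; lower for tight zones

    def validate(self):
        # Ranges taken from this section of the Standard.
        if not 100 <= self.advertising_interval_ms <= 2000:
            raise ValueError("advertising interval must be 100-2000 ms")
        if not -100 <= self.tx_power_dbm <= 20:
            raise ValueError("TX power must be between -100 and +20 dBm")
        return self
```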

5.1.5. Maintenance and Operational considerations

As well as considering installation and configuration of BLE beacons, it is important to consider maintenance and operation over the longer term for the fleet of installed BLE beacons to remain functional and up to date.

Things to consider:

• Which department in the organisation will own the BLE beacon infrastructure and be responsible for its installation, configuration and maintenance?

• What happens when the BLE beacon battery is due to run out? There are two options:

Replace the whole BLE beacon or replace the battery in the BLE beacon. The latter

ensures continuity as the beacon does not change.

• What are the organisational needs for battery health data collection? There are various options available:

o The battery health data can be collected manually by someone in the organisation, who will need to walk past all BLE beacons with a mobile device that collects battery levels

o The battery level can be automatically collected and sent anytime a mobile

device is connected to the BLE beacon. In this case, the collection of battery

levels is crowd-sourced through people passing through the venue

o Some BLE beacons have the ability to connect to the Internet themselves if

they are in the range of a wi-fi network. In this case they can send their

battery levels automatically at periodic intervals

• What are the organisational needs for a dashboard to monitor the status of the installed BLE beacon fleet?

• How will the system administrator be notified when the beacon battery is running

low, in order to arrange action for maintenance?

• How can battery life be preserved while the venue is closed?

• What security actions can take place in order to mitigate the risk of someone else piggybacking on the fleet of BLE beacons without the organisation’s permission? Although the BLE beacon identifiers are publicly transmitted data, there are ways available to increase the security of the beacon fleet.


• Where does the data from the user’s smartphone that is using the BLE beacon network for travel ultimately end up?

• How is the users’ travel data stored, deleted, shared or protected?

• How can the whole system be disabled when needed, for example when system maintenance is in progress?

• What are the available options for BLE beacon firmware upgrades provided by the beacon manufacturer? Updating the firmware keeps the beacon up to date with the latest software and security updates made by the BLE beacon manufacturer.

• What is the performance of the BLE beacon manufacturer’s SDK and API, and how often are they updated?

• To what extent are the BLE beacon manufacturer’s SDK and API documented? A well-documented SDK and API will help developers integrate them better into their mobile app.

• How engaged is the online community around the beacon manufacturer, in order to provide support and recommendations if any issues are faced?

• How responsive is the beacon manufacturer’s support team, and what capability do

they have to escalate questions to a developer when required.

• What capability does the beacon manufacturer provide in order to facilitate sharing of the beacon network with other third-party developers who might be interested in utilising it?
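Whichever battery data collection method is chosen, flagging beacons due for maintenance can be as simple as a threshold check. The helper below is a hypothetical sketch (the name and the 20% threshold are assumptions): given the latest battery percentage per beacon, it returns the IDs needing attention, lowest first.

```python
def low_battery_beacons(latest_readings, threshold_pct=20):
    """Given a mapping of beacon ID -> latest battery level (percent),
    e.g. crowd-sourced from Eddystone-TLM frames, return the IDs at or
    below the threshold, lowest battery first."""
    low = [(pct, bid) for bid, pct in latest_readings.items()
           if pct <= threshold_pct]
    return [bid for pct, bid in sorted(low)]
```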

5.1.6. References

• Apple, iBeacon for Developers. https://developer.apple.com/ibeacon/ (last accessed:

30 March 2016)

• Apple, Potential sources of Wi-Fi and Bluetooth interference

https://support.apple.com/en-us/HT201542 (last accessed: 30 March 2016)

• Bluetooth, What is Bluetooth technology? https://www.bluetooth.com/what-is-

bluetooth-technology (last accessed: 30 March 2016)

• Google, Eddystone specification https://github.com/google/eddystone (last

accessed: 27 April 2016)

• Google Developers, Beacons https://developers.google.com/beacons/ (last accessed:

27 April 2016)


6. Open Source Wayfindr Demo mobile app

6.0. The aim of the Wayfindr Demo mobile app

We know that there is a community of individuals and organisations worldwide interested in

exploring digital wayfinding for vision impaired people. We want to support these people

who are planning to explore audio wayfinding in different types of built-environments, e.g.

transport, retail, health, cultural and entertainment venues. We have open-sourced the

Wayfindr Demo mobile app that has been used in all the Wayfindr trials to date, in order to

provide developers, designers and researchers around the world with a free, open tool to conduct their research in context.

This app also serves as a demonstration of Section 4.3 “Guidelines for mobile app development” of the Wayfindr Open Standard.

We invite all the interested parties to download the Wayfindr Demo mobile app, customise it

based on their needs, run their experiments and then share new versions of the app with the Wayfindr community.

We would like to see the app become an evolving open tool to aid the research and development of wayfinding systems for vision impaired people.

6.1. How was the Wayfindr Demo mobile app developed

The Wayfindr Demo app has been developed over the last few years. The functionality of the app is based on user needs that have been identified in our trials, and it has been improved through continuous iteration.

The reasons behind this functionality and the guidelines for developing digital navigation

services for vision impaired people can be found in Section 4.3 “Guidelines for mobile app development” of the Wayfindr Open Standard.


Have a suggestion?

We welcome submissions of research findings, suggestions for future investigation and feedback on existing elements in the Wayfindr Open Standard.

Get in touch

[email protected]