
SEEE DIGIBOOK ON ENGINEERING & TECHNOLOGY, VOL. 01, MAY 2018

. 978-81-933187-0-6 © 2018 SEEEPEDIA.ORG

Society for Engineering Education Enrichment

A.Nagarajan, [email protected]; K.Sindhuja, [email protected]; C.R.Balamurugan, [email protected] ;

Smart Glass with Voice Recognition and Visual Processing Facility for

Visually Impaired People

A.Nagarajan, K.Sindhuja, C.R.Balamurugan Karpagam College of Engineering, Coimbatore, India

[email protected], [email protected], [email protected]

ABSTRACT - Smart-Glass is a wearable smart camera device built around a powerful microcontroller. It understands the user's voice requests and supplies the relevant information as auditory feedback through an earphone. The device aims to improve the quality of life of blind and visually impaired people and to help them understand their surroundings nearly as clearly as a sighted person. The device is built to fulfill the needs of a visually impaired person and is therefore capable of the following: it can identify traffic signal lights, detect vehicle and pedestrian movement while crossing roads, identify currency notes (10, 50, 100 and 500 rupee notes), recognize objects and obstacles, recognize alphabets and numerals, act as a night-time security camera, and play the user's favorite music.

Index Terms - Smart glass, impaired people, voice, visual, specs.

I. INTRODUCTION

The creation of new technological devices to be used early in life is a must. However, despite the huge improvement in technological devices specifically designed for visually impaired users, many of these solutions are not widely accepted by adults and are not easily adaptable to children. Brabyn [1] discussed developments in electronic aids for the blind and visually impaired. Brabyn et al [2] introduced Talking Signs, a remote signage solution for the blind, visually impaired and reading disabled. Panchanathan et al [3] proposed iCare, a user-centric approach to the development of assistive devices for the blind and visually impaired. Maingreaud et al [4] suggested a dynamic tactile map as a tool for space organization perception, applied to the design of an electronic travel aid for visually impaired and blind people. Dakopoulos and Bourbakis [5] proposed a 2D tactile vocabulary for navigation of the blind and visually impaired. Rastogi et al [6] discussed in detail the issues of using tactile mice by individuals who are blind and visually impaired. Ganz et al [7] introduced INSIGHT, an RFID- and Bluetooth-enabled automated space for the blind and visually impaired. Pathy et al [8] developed space technology for the blind and visually impaired. Silva et al [9] discussed an indoor guidance system for the blind and the visually impaired. Basit and Sultan [10] suggested easy learning of the Quran using mobile devices for the blind and visually impaired. Abdullah et al [11] discussed the reliability of the general sporting ability (GSA) protocols to identify sports talent among persons who are blind and visually impaired in Klang Valley. Anam et al [12] developed Expression, a dyadic conversation aid using Google Glass for people who are blind or visually impaired. Zhang et al [13] proposed a multimodal approach to image perception of histology for the blind or visually impaired. Lapyko et al [14] introduced a cloud-based outdoor assistive navigation system for the blind and visually impaired. Ferati et al [15] presented an exploratory study of accessibility requirements for the blind and visually impaired in a regional context. Yang et al [16] proposed assistive clothing pattern recognition for visually impaired people. Khan et al [17] developed a speech-based text correction tool for the visually impaired. Hoonlor et al [18] discussed a crowdsourcing application for visually impaired and blind persons on Android smartphones. Haddad et al [19] introduced a pattern recognition approach to making geographic images accessible to the blind and visually impaired. Vermol et al [20] presented work in progress on mapping the development of quality sensuous response through blind and visually impaired group (BVIG) touch interaction. Owayjan et al [21] proposed a smart assistive navigation system for blind and visually impaired individuals. Pawluk et al [22] discussed designing haptic assistive technology for individuals who are blind or visually impaired. Air et al [23] developed ORIENTOMA, a novel platform for autonomous and safe navigation for the blind and visually impaired. Ramer et al [24] proposed an adaptive, color-based lane detection scheme for a wearable jogging navigation system for the visually impaired on less structured paths. Panchanathan et al [25] introduced a social interaction assistant, a person-centered approach to enriching social interactions for individuals with visual impairments. Matsuo et al [26] discussed ShadowRine, an accessible game for blind users and an accessible action RPG for visually impaired gamers. Liu et al [27] developed Finger-Eye, a wearable text reading assistive system for the blind and visually impaired. Villan et al [28] proposed a face recognition and spoofing detection system adapted to visually impaired people. Lee et al [29] introduced a magnetic tensor sensor and a way-finding method based on geomagnetic field effects with applications for visually impaired users. Lan et al [30] developed a lightweight smart glass system with an audio aid for visually impaired people. Kardyś et al [31] suggested a new Android application for blind and visually impaired people. Jakob and Tick [32] developed a concept for the transfer of driver assistance algorithms to the blind and visually impaired. Cheraghi et al [33] proposed beacon-based indoor wayfinding for the blind, visually impaired and disoriented. Chaccour and Badr [34] introduced a novel indoor navigation system for visually impaired and blind people. Harry [35] gave an interactive demonstration on the use of existing apps on mobile technologies to teach basic photographic techniques to participants who are blind, visually impaired and sighted together.

II. BLOCK DIAGRAM

The LPC1764 is a Cortex-M3 microcontroller for embedded applications featuring a high level of integration and low power consumption at frequencies of up to 100 MHz. Features include 128 kB of flash memory, 32 kB of data memory, an Ethernet MAC, a USB Device interface, an 8-channel DMA controller, 4 UARTs, 2 CAN channels, 3 SSP/SPI interfaces, 3 I2C interfaces, an 8-channel 12-bit ADC, motor control PWM, a Quadrature Encoder interface, 4 general purpose timers, 6-output general purpose PWM, an ultra-low-power Real-Time Clock with a separate battery supply, and up to 70 general purpose I/O pins. The LPC1764 is pin-compatible with the 100-pin LPC2368 ARM7 MCU.

Fig. 1. Block Diagram

Fig. 2. Block Diagram


Fig. 3. Hardware kit

Fig. 4. Case Study

III. HARDWARE DESCRIPTION

The major components used in this work are the LPC1764, an ARM Cortex-M3 microcontroller; the OV7670 camera sensor; the AL422 FIFO memory chip; a 4-button capacitive touch keypad; the HC-05 Bluetooth transceiver; the VS1011e audio codec chip; a 2 GB microSD card; and an RGB LED. The features of the LPC1700 family are: 128 KB of on-chip flash program memory with In-System Programming (ISP) and In-Application Programming (IAP) capabilities, 32 KB of SRAM on the CPU with a local code/data bus for high-performance CPU access, a single 3.3 V power supply (2.4 V to 3.6 V), an ARM Cortex-M3 processor running at frequencies of up to 100 MHz, 52 general purpose I/O (GPIO) pins, an 8-channel 12-bit analog-to-digital converter (ADC) and a 10-bit digital-to-analog converter (DAC).

The Cortex-M3 processor is specifically developed to enable partners to develop high-performance low-cost platforms for a broad range of devices including microcontrollers, automotive body systems, industrial control systems and wireless networking and sensors.

This camera module can perform image processing such as auto white balance (AWB), automatic exposure (AE) and automatic gain control (AGC) on the video signal coming from the CMOS sensor. In addition, with other advanced techniques such as image enhancement under low illumination and intelligent prediction and suppression of image noise, the module outputs high-quality digital video signals over a standard CCIR656 interface. The OV7670's built-in JPEG encoder supports real-time encoding of the captured image, and an external controller can easily read the M-JPEG video stream, enabling a dual-stream camera design. The OV7670 supports motion detection and an OSD function for overlaying characters and patterns on the screen, with a self-definable detection area and sensitivity.

Fig. 5. Cortex – M3

Fig. 6. Camera OV 7670

The specifications of the camera are: VGA/QVGA resolution, a 30 frames/s frame rate, an 8-pin parallel data interface, low-voltage low-power CMOS technology, and a high-speed FIFO for data buffering with a 3 Mbit (384 KB) capacity.


Fig. 7. Capacitive touch keypad

The parameters of the capacitive touch keypad are: an MPR121 capacitive touch controller, support for up to 12 electrodes, and a 3 mm x 3 mm x 0.65 mm 20-lead QFN package. The MPR121 capacitive keypad uses the MPR121 and 12 touch-sensitive pads to provide a simple keypad with an I2C output. The board also has four mounting holes, allowing it to be used as an input system in place of traditional buttons. Check the example code below for an easy way to read the keypad.

Fig. 8. Bluetooth

The specifications of the Bluetooth module are: support for Android smartphone/tablet/PC communication, 3 Mbps modulation with a complete 2.4 GHz radio transceiver and baseband, low-power 1.8 V operation with 1.8 V to 3.6 V I/O, and a range of up to 30 feet.

Fig. 9. MP3 audio decoder

The decoder handles multiple formats: MP3, AAC, WMA, FLAC, Ogg Vorbis, WAV and MIDI. Its properties are low-power operation, a high-quality on-chip stereo DAC, and zero-cross detection for smooth volume changes. The decoder combines multiple functions that allow easy playback of music stored in a USB flash memory using an audio player.

IV. SOFTWARE USED

The software used is the LPCXpresso IDE and the LPCXpresso debugger. LPCXpresso is a complete toolchain for the LPC1000 series of Cortex-M microcontrollers. It is an Eclipse-based IDE with a GNU compiler, linker and libraries and an enhanced GDB debugger, and it supports the LPC-Link programmer and debugger. It was developed by NXP Semiconductors and Code Red Technologies. MCUXpresso IDE is now recommended for developers using LPC Cortex-M based microcontrollers; it offers a fully featured development environment for both LPC and Kinetis Cortex-M based microcontrollers and fully supports LPCOpen and the MCUXpresso SDK. The LPCXpresso IDE gives developers a low-cost way to create high-quality applications for LPC microcontrollers (MCUs). The LPC-Link2 debug probe is available as a standalone debug probe or built into LPCXpresso V2 and LPCXpresso V3 boards. When used in conjunction with the LPCXpresso IDE, it automatically downloads CMSIS-DAP or "Redlink" firmware into the RAM of the probe. LPC-Link2 can then be used with the LPCXpresso IDE to debug all NXP LPC Cortex-M MCUs. However, debugging of LPC2000 (ARM7) and LPC3000 (ARM9) parts is not supported by the LPCXpresso IDE via LPC-Link2. CMSIS-DAP and J-Link firmware images suitable for programming into flash are also available (using LPCScrypt).

Fig. 10. LPCXpresso IDE


Fig. 11. Entire Hardware Kit

V. CONCLUSION

The proposed work has been developed for blind people. The hardware kit is used to identify currency notes. The currency notes are scanned in advance and stored as images. When the camera reads a particular currency note, it is matched against the notes already scanned and stored in the database, and a voice message announces what is held near the camera. This device aims to improve the quality of life of blind and visually impaired people and helps them understand their surroundings nearly as clearly as a sighted person. Further identification features may also be added to this proposed work, such as identifying traffic signal lights, detecting vehicle and pedestrian movement while crossing roads, recognizing objects and obstacles, recognizing alphabets and numerals, acting as a night-time security camera, and playing the user's favourite music.

REFERENCES

[1] J. Brabyn, "Developments in Electronic Aids for the Blind and Visually Impaired," pp. 33–37, 1985.

[2] J. Brabyn, W. Crandall, and W. Gerrey, "Talking signs: a remote signage solution for the blind, visually impaired and reading disabled," Proc. Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., pp. 1309–1310, 1993.

[3] S. Panchanathan, J. Black, M. Rush, and V. Iyer, “ICare - A User Centric Approach to the Development of Assistive Devices for the Blind and Visually Impaired,” Proc. Int. Conf. Tools with Artif. Intell., pp. 641–648, 2003.

[4] F. Maingreaud, E. E. Pissaloux, R. Velázquez, F. Gaunet, M. Hafez, and J. Alexandre, "A Dynamic Tactile Map as a Tool for Space Organization Perception: Application to the Design of an Electronic Travel Aid for Visually Impaired and Blind People," IEEE Engineering in Medicine and Biology 27th Annual Conference, pp. 4–7, 2005.

[5] D. Dakopoulos and N. Bourbakis, "Towards a 2D Tactile Vocabulary for Navigation of Blind and Visually Impaired," IEEE International Conference on Systems, Man and Cybernetics, pp. 45–51, 2009.

[6] R. Rastogi, D. T. Pawluk, and J. M. Ketchum, “Issues of Using Tactile Mice by Individuals Who Are Blind and Visually Impaired,” IEEE transactions on neural systems and rehabilitation engineering, vol. 18, no. 3, pp. 311–318, 2010.

[7] A. Ganz, S. R. Gandhi, C. Wilson, and G. Mullett, "INSIGHT: RFID and Bluetooth Enabled Automated Space for the Blind and Visually Impaired," 32nd Annual International Conference of the IEEE EMBS, pp. 331–334, 2010.

[8] N. B. Pathy, N. M. Noh, S. I. Moslin, M. Din, and B. Subari, “Space Technology for the Blind and Visually Impaired,” IEEE International conference on space science and communication, pp. 206–210, 2011.

[9] M. da Silva Cascalheira, P. Pinho, D. Teixeira, and N. B. de Carvalho, “Indoor guidance system for the blind and the visually impaired,” IET Microwaves, Antennas Propag., vol. 6, no. 10, p. 1149, 2012.

[10] W. Basit and N. Sultan, "Easy Learning of Quran Using Mobile Devices for Blind and Visually Impaired," Proc. 2013 Taibah Univ. Int. Conf. Adv. Inf. Technol. Holy Quran Its Sci., pp. 155–158, 2013.

[11] N. M. Abdullah, W. Tumijan, R. A. Latif, and N. A. Hamid, “The reliability of the general sporting ability (GSA) protocols to identify sports talent among persons with blind and visually-impaired in Klang valley,” IEEE Business Engineering and Industrial Applications Colloquium, pp. 719–722, 2013.

[12] A. I. Anam, S. Alam, and M. Yeasin, “Expression: A Dyadic Conversation Aid using Google Glass for People who are Blind or Visually Impaired,” Proc. 6th Int. Conf. Mob. Comput. Appl. Serv., pp. 57–64, 2014.

[13] T. Zhang, G. J. Williams, B. S. Duerstock, and J. P. Wachs, “Multimodal Approach to Image Perception of Histology for the Blind or Visually Impaired,” IEEE International Conference on system, man and cybernetics, pp. 3924–3929, 2014.

[14] A. N. Lapyko, L. Tung, and B. P. Lin, “A Cloud-based Outdoor Assistive Navigation System for the Blind and Visually Impaired,” pp. 1-8, 2014.

[15] M. Ferati, B. Raufi, A. Kurti, and B. Vogel, "Accessibility Requirements for Blind and Visually Impaired in a Regional Context: An Exploratory Study," pp. 13–16, 2014.

[16] X. Yang, S. Yuan, and Y. Tian, "Assistive Clothing Pattern Recognition for Visually Impaired People," IEEE Transactions on Human-Machine Systems, vol. 44, no. 2, pp. 234–243, 2014.

[17] N. H. Khan, A. H. Arovi, H. Mahmud, K. Hasan, and H. A. Rubaiyeat, “Speech based text correction tool for the visually impaired,” 18th International Conference on computer and information technology, pp. 150–155, 2015.

[18] A. Hoonlor, S. P. N. Ayudhya, S. Harnmetta, S. Kitpanon, and K. Khlaprasit, "UCap: A Crowdsourcing Application for the Visually Impaired and Blind Persons on Android Smartphone," pp. 1–6, 2015.

[19] Z. Haddad, Y. Chen, and J. L. Krahe, "A Pattern Recognition Approach to Make Accessible the Geographic Images for Blind and Visually Impaired," ICIP 2015, pp. 3205–3209, 2015.

[20] V. V. Vermol, R. Anwar, O. H. Hassan, and S. Z. Abidin, "Work in Progress: Mapping the Development of Quality Sensuous Response through Blind and Visually Impaired Group (BVIG) Touch Interaction," International Conference on Interactive Collaborative Learning, pp. 25–29, 2015.

[21] M. Owayjan, A. Hayek, H. Nassrallah, and M. Eldor, “Smart Assistive Navigation System for Blind and Visually Impaired Individuals,” International Conference on advances in biomedical engineering, pp. 162–165, 2015.


[22] D. T. V. Pawluk, R. J. Adams, and R. Kitada, "Designing Haptic Assistive Technology for Individuals Who Are Blind or Visually Impaired," IEEE Transactions on Haptics, vol. 8, no. 3, pp. 258–278, 2015.

[23] G. Air, L. O. Russo, S. Rosa, M. Indaco, and P. Torino, "ORIENTOMA: A Novel Platform for Autonomous and Safe Navigation for Blind and Visually Impaired," 10th International Conference on Design and Technology of Integrated Systems in Nanoscale Era, pp. 1–6, 2015.

[24] C. Ramer, T. Lichtenegger, J. Sessner, M. Landgraf, and J. Franke, "An Adaptive, Color Based Lane Detection of a Wearable Jogging Navigation System for Visually Impaired on Less Structured Paths," 6th IEEE International Conference on Biomedical Robotics and Biomechatronics, pp. 741–746, 2016.

[25] S. Panchanathan, S. Chakraborty, and T. McDaniel, "Social Interaction Assistant: A Person-Centered Approach to Enrich Social Interactions for Individuals with Visual Impairments," IEEE Journal of Selected Topics in Signal Processing, pp. 1–10, 2016.

[26] M. Matsuo and T. Miura, "ShadowRine: Accessible Game for Blind Users, and Accessible Action RPG for Visually Impaired Gamers," IEEE International Conference on Systems, Man, and Cybernetics, pp. 2826–2827, 2016.

[27] Z. Liu, Y. Luo, J. Cordero, N. Zhao, and Y. Shen, "Finger-Eye: A Wearable Text Reading Assistive System for the Blind and Visually Impaired," IEEE International Conference on Real-Time Computing and Robotics, pp. 123–128, 2016.

[28] A. Fernandez Villan, J. L. Carus, R. Usamentiaga and R. Casado, “Face Recognition and Spoofing Detection System Adapted to Visually-Impaired People,” IEEE Lat. Am. Trans., vol. 14, no. 2, pp. 913–921, 2016.

[29] K. Lee, M. Li, and C. Lin, "Magnetic Tensor Sensor and Way-finding Method based on Geomagnetic Field Effects with Applications for Visually Impaired Users," IEEE Transactions on Mechatronics, pp. 1–10, 2016.

[30] F. Lan, G. Zhai, and W. Lin, “Lightweight smart glass system with audio aid for visually impaired people,” IEEE Reg. 10 Annu. Int. Conf. Proceedings/TENCON, January, pp. 4–7, 2016.

[31] P. Kardyś, A. Dąbrowski, M. Iwanowski, and D. Hunderek, "A new Android application for blind and visually impaired people," pp. 152–155, 2016.

[32] J. Jakob and J. Tick, "Concept for transfer of driver assistance algorithms for blind and visually impaired people," IEEE 15th International Symposium on Applied Machine Intelligence and Informatics, pp. 241–246, 2017.

[33] S. A. Cheraghi, V. Namboodiri, and L. Walker, "GuideBeacon: Beacon-Based Indoor Wayfinding for the Blind, Visually Impaired, and Disoriented," IEEE International Conference on Pervasive Computing and Communications, pp. 1–10, 2017.

[34] K. Chaccour and G. Badr, "Novel indoor navigation system for Visually Impaired and blind people," pp. 1–5.

[35] N. P. Harry, "Interactive demonstration on the use of existing apps on mobile technologies to teach basic photographic techniques to participants who are blind, visually impaired and sighted together," 14th IEEE Annual Consumer Communications and Networking Conference, pp. 623–624, 2017.
