Indian Sign Language Recognition Method For Deaf People


Transcript of Indian Sign Language Recognition Method For Deaf People

Page 1: Indian Sign Language Recognition Method For Deaf People

Presentation On Indian Sign Language Enhanced Recognition Methods For Deaf People

Presented By:
Anurag Prasad (120103024)
Takrim Ul Islam Laskar (120103006)

Page 2: Indian Sign Language Recognition Method For Deaf People

Overview
• Introduction
• Indian Sign Language
• Gesture Recognition
• Working Environment
• Implementation
• Conclusion
• Reference

Page 3: Indian Sign Language Recognition Method For Deaf People

Introduction
• Indian Sign Language was developed for the deaf community of India. It serves as a means of communication between deaf people and their friends and families.

• Sign language recognition is one of the fastest-growing fields of research today, and also one of the most challenging.

• Many new techniques have been developed recently in this field. In this project, we develop a system for converting Indian Sign Language to text using OpenCV.

• The project presents a methodology that recognizes Indian Sign Language (ISL) gestures and translates them into normal text.

Page 4: Indian Sign Language Recognition Method For Deaf People

Indian Sign Language
The following images show the ISL hand symbols for the corresponding English letters.

Figure 1: ISL Alphabets.

Page 5: Indian Sign Language Recognition Method For Deaf People

Gesture Recognition
Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms.

Hardware Tools used in Gesture Recognition:

1. Wired gloves

2. Stereo cameras

3. Controller-based gestures

4. Single camera

In our project, we have implemented sign language recognition using a single camera.

Page 6: Indian Sign Language Recognition Method For Deaf People

Algorithms
Broadly speaking, there are two different approaches to gesture recognition:

1. 3D model-based algorithms:
• Volumetric models
• Skeletal models

2. Appearance-based models:
• Deformable 2D templates
• Image sequences

Page 7: Indian Sign Language Recognition Method For Deaf People

Working Environment
• Tools: OpenCV 3.0, Python 2.7
• Environment: IDLE (Integrated Development and Learning Environment), the basic platform for Python.
• Experiment Platform: a Linux-based platform (e.g. Ubuntu 15.10, a Debian-based Linux operating system).

Page 8: Indian Sign Language Recognition Method For Deaf People

Implementation
• Acquire the image: acquiring frames in real time (a minimal capture loop is sketched after Figure 2).
• cap = cv2.VideoCapture(0)
• ret, img = cap.read()

Figure 2: Acquiring frames in real time.
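The two calls on this slide can be combined into a simple real-time capture loop. The following is a minimal sketch, assuming a webcam at index 0; the window name and the Esc-to-quit handling are illustrative additions, not taken from the slides.

import cv2

cap = cv2.VideoCapture(0)            # open the default camera
while True:
    ret, img = cap.read()            # grab one frame; ret is False on failure
    if not ret:
        break
    cv2.imshow('frame', img)         # show the live feed
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to stop
        break
cap.release()
cv2.destroyAllWindows()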

Page 9: Indian Sign Language Recognition Method For Deaf People

• Image Preprocessing (a sketch of this preprocessing chain appears after Figures 4 and 5)
o Morphological Transforms
o Blurring
o Thresholding

Figure 3: Morphological Transforms (a) Grayscale Image (b) Dilation and (c) Erosion.

Page 10: Indian Sign Language Recognition Method For Deaf People

Figure 4: Blurring. Figure 5: Thresholding.
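The preprocessing chain on the previous slides can be sketched as follows. This is an illustrative sequence only: the kernel size and threshold value are placeholder settings, not the authors' tuned parameters, and img is the frame acquired earlier.

import cv2
import numpy as np

gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)        # grayscale image (Figure 3a)
kernel = np.ones((5, 5), np.uint8)
dilated = cv2.dilate(gray, kernel, iterations=1)    # dilation (Figure 3b)
eroded = cv2.erode(dilated, kernel, iterations=1)   # erosion (Figure 3c)
blurred = cv2.GaussianBlur(eroded, (5, 5), 0)       # blurring (Figure 4)
ret, thresh = cv2.threshold(blurred, 70, 255,       # thresholding (Figure 5)
                            cv2.THRESH_BINARY)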

Page 11: Indian Sign Language Recognition Method For Deaf People

• Extract the largest contour using convex hull (see the sketch after Figure 6).

Figure 6: Extract the largest contour.
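A minimal sketch of this step, assuming the binary image thresh from the preprocessing stage; it uses the OpenCV 3.x findContours signature, which returns three values.

import cv2

_, contours, hierarchy = cv2.findContours(thresh.copy(), cv2.RETR_EXTERNAL,
                                          cv2.CHAIN_APPROX_SIMPLE)
if contours:
    largest = max(contours, key=cv2.contourArea)          # contour with the largest area
    hull = cv2.convexHull(largest)                        # convex hull of that contour
    cv2.drawContours(img, [largest], -1, (0, 255, 0), 2)  # draw the contour
    cv2.drawContours(img, [hull], -1, (0, 0, 255), 2)     # draw the hull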

Page 12: Indian Sign Language Recognition Method For Deaf People

• Contour shape matching
• diffvalue = cv2.matchShapes(object1, object2, method, parameter=0)
• Performs the comparison after the contours have been found.

• Extracting the matched sign (a fuller sketch follows Figure 7)
• if diffvalue < 0.1:

print '\nmatched with ', cntname[move], 'diff : ', diffvalue

cv2.putText(img, cntname[move], (100, 400), cv2.FONT_HERSHEY_SIMPLEX, 4, (255, 255, 255), 2)

Figure 7: Output.
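Putting the matching step together, a comparison loop might look like the sketch below. The templates list of (name, contour) pairs is a hypothetical stand-in for the stored sign contours referred to by cntname on the slide, and method 1 selects one of the Hu-moment based comparison modes of matchShapes.

import cv2

# templates: hypothetical list of (name, contour) pairs for the known signs
for name, template in templates:
    diffvalue = cv2.matchShapes(largest, template, 1, 0.0)  # Hu-moment comparison
    if diffvalue < 0.1:                                     # accept only close matches
        print('matched with %s diff: %f' % (name, diffvalue))
        cv2.putText(img, name, (100, 400),
                    cv2.FONT_HERSHEY_SIMPLEX, 4, (255, 255, 255), 2)
        break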

Page 13: Indian Sign Language Recognition Method For Deaf People

• Why have we chosen 0.1 as the maximum difference value?

• In Figure 8, the contour difference values are shown at 0.09, 0.1 and 0.2.

Figure 8: Contour difference values at 0.09, 0.1 and 0.2 respectively.

Page 14: Indian Sign Language Recognition Method For Deaf People

Conclusion
• Indian Sign Language recognition using object detection and recognition through computer vision was partly successful, with an accuracy rate of 82.69%.
• The question of perfection is another quest to deal with in the days to come.
• Hand gesture detection and recognition were the main topics and problems that were dealt with.

Page 15: Indian Sign Language Recognition Method For Deaf People

Reference
[1] R. Gopalan and B. Dariush, “Towards a Vision Based Hand Gesture Interface for Robotic Grasping”, The IEEE/RSJ International Conference on Intelligent Robots and Systems, October 11-15, 2009, St. Louis, USA, pp. 1452-1459.

[2] T. Kapuscinski and M. Wysocki, “Hand Gesture Recognition for Man-Machine interaction”, Second Workshop on Robot Motion and Control, October 18-20, 2001, pp. 91-96.

[3] D. Y. Huang, W. C. Hu, and S. H. Chang, “Vision-based Hand Gesture Recognition Using PCA+Gabor Filters and SVM”, IEEE Fifth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, 2009, pp. 1-4.

[4] C. Yu, X. Wang, H. Huang, J. Shen, and K. Wu, “Vision-Based Hand Gesture Recognition Using Combinational Features”, IEEE Sixth International Conference on Intelligent Information Hiding and Multimedia Signal Processing, 2010, pp. 543-546.

[5] J. L. Raheja, K. Das, and A. Chaudhury, “An Efficient Real Time Method of Fingertip Detection”, International Conference on Trends in Industrial Measurements and automation (TIMA), 2011, pp. 447-450.

[6] Manigandan M. and I. M. Jackin, “Wireless Vision based Mobile Robot control using Hand Gesture Recognition through Perceptual Color Space”, IEEE International Conference on Advances in Computer Engineering, 2010, pp. 95-99.

[7] A. S. Ghotkar, R. Khatal, S. Khupase, S. Asati, and M. Hadap, “Hand Gesture Recognition for Indian Sign Language”, IEEE International Conference on Computer Communication and Informatics (ICCCI), Jan. 10-12, 2012, Coimbatore, India.

[8] I. G. Incertis, J. G. G. Bermejo, and E.Z. Casanova, “Hand Gesture Recognition for Deaf People Interfacing”, The 18th International Conference on Pattern Recognition (ICPR), 2006.

Page 16: Indian Sign Language Recognition Method For Deaf People

[9] J. Rekha, J. Bhattacharya, and S. Majumder, “Shape, Texture and Local Movement Hand Gesture Features for Indian Sign Language Recognition”, IEEE, 2011, pp. 30-35.

[10] L. K. Lee, S. Y. An, and S. Y. Oh, “Robust Fingertip Extraction with Improved Skin Color Segmentation for Finger Gesture Recognition in Human-Robot Interaction”, WCCI 2012 IEEE World Congress on Computational Intelligence, June 10-15, 2012, Brisbane, Australia.

[11] S. K. Yewale and P. K. Bharne, “Hand Gesture Recognition Using Different Algorithms Based on Artificial Neural Network”, IEEE, 2011, pp. 287-292.

[12] Y. Fang, K. Wang, J. Cheng, and H. Lu, “A Real-Time Hand Gesture Recognition Method”, IEEE ICME, 2007, pp. 995-998.

[13] S. Saengsri, V. Niennattrakul, and C. A. Ratanamahatana, “TFRS: Thai Finger-Spelling Sign Language Recognition System”, IEEE, 2012, pp. 457-462.

[14] J. H. Kim, N. D. Thang, and T. S. Kim, “3-D Hand Motion Tracking and Gesture Recognition Using a Data Glove”, IEEE International Symposium on Industrial Electronics (ISIE), July 5-8, 2009, Seoul Olympic Parktel, Seoul, Korea, pp. 1013-1018.

[15] J. Weissmann and R. Salomon, “Gesture Recognition for Virtual Reality Applications Using Data Gloves and Neural Networks”, IEEE, 1999, pp. 2043-2046.

[16] W. W. Kong and S. Ranganath, “Sign Language Phoneme Transcription with PCA-based Representation”, The 9th International Conference on Information and Communications Security (ICICS), 2007, China.

[17] M. V. Lamar, S. Bhuiyan, and A. Iwata, “Hand Alphabet Recognition Using Morphological PCA and Neural Networks”, IEEE, 1999, pp. 2839-2844.

[18] O. B. Henia and S. Bouakaz, “3D Hand Model Animation with a New Data-Driven Method”, Workshop on Digital Media and Digital Content Management (IEEE Computer Society), 2011, pp. 72-76.

[19] M. Pahlevanzadeh, M. Vafadoost, and M. Shahnazi, “Sign Language Recognition”, IEEE, 2007.

[20] J. B. Kim, K. H. Park, W. C. Bang, and Z. Z. Bien, “Continuous Gesture Recognition System for Korean Sign Language based on Fuzzy Logic and Hidden Markov Model”, IEEE, 2002, pp. 1574-1579.