Gesture-Based Wheelchair Control for the Physically Challenged


Transcript of Gesture-Based Wheelchair Control for the Physically Challenged

  • Slide 1
  • Gesture-Based Wheelchair Control for the Physically Challenged
  • Slide 2
  • Slide 3
  • CONTENTS: Abstract, Problem Definition, Project Overview, Implementation, Verification, Future Work, References
  • Slide 4
  • ABSTRACT We have developed a gesture-controlled wheelchair system for use by quadriplegics and other physically disabled persons. Salient features: effortless to use, customizable, economical, power-efficient, non-intrusive.
  • Slide 5
  • PROBLEM DEFINITION Existing powered wheelchairs demand the exertion of force to control them, making them unusable for quadriplegics, a class of the physically challenged. Movements such as pressing buttons or operating a joystick are impossible for quadriplegics, who lack fine motor control.
  • Slide 6
  • PROJECT OVERVIEW Quadriplegics often retain some imprecise motion of their fingers. The best option is therefore gesture-based interaction with their environment, in particular their wheelchairs. We have developed a robust, real-time vision-based hand gesture recognition engine reliable enough to steer a wheelchair.
  • Slide 7
  • PROJECT OVERVIEW Hardware specification: IR-sensitive USB webcam; diffusion masks with mounting/enclosure; x64/x86 PC platform; stripped USB keyboard circuit; PIC16F877A microcontroller-based interface; DC motor steering mechanism with a ULN2804 driver and relay-based H-bridge. Software specification: MATLAB (The MathWorks); 32/64-bit Windows operating system; PIC firmware for H-bridge control.
  • Slide 8
  • Slide 9
  • IMPLEMENTATION Gesture Capture Module developed. A regular webcam was modified into an IR-sensitive version. An IR-illuminated backlit surface for gesture capture was created, using IR LEDs in an 8x8 matrix layout. An assembly of tracing-paper sheets served as a diffuser for the IR radiation, and an aluminium-foil mask defined the active area.
  • Slide 10
  • Slide 11
  • Slide 12
  • IMPLEMENTATION Gesture Recognition Module developed. Successfully captured and processed images from the IR camera in real time. Defined a protocol of gestures, extremely simplified so as to be easy for the physically challenged to use: move forward, move backward, turn left, turn right, stop/brake.
  • Slide 13
  • IMPLEMENTATION Gesture Recognition Module developed. Created a set of templates for these commands, which are correlated with the hand positions detected by the gesture-capture module. An initial training phase lets the user customize all movement gestures according to their specific needs.
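The slides describe matching captured hand positions against stored templates. The project itself was implemented in MATLAB; as an illustrative sketch only (function names, image sizes, and gesture labels here are hypothetical), a normalized cross-correlation classifier of the kind described might look like:

```python
import numpy as np

def normalized_corr(frame, template):
    """Normalized cross-correlation score between two equal-sized grayscale images."""
    f = frame - frame.mean()
    t = template - template.mean()
    denom = np.sqrt((f ** 2).sum() * (t ** 2).sum())
    return float((f * t).sum() / denom) if denom else 0.0

def classify_gesture(frame, templates):
    """Return the name of the stored template that best matches the frame."""
    scores = {name: normalized_corr(frame, tpl) for name, tpl in templates.items()}
    return max(scores, key=scores.get)
```

The training phase mentioned in the slide would correspond to capturing each user's own hand poses and storing them as the entries of `templates`.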
  • Slide 14
  • Slide 15
  • Slide 16
  • Slide 17
  • IMPLEMENTATION Interfacing Module developed. A USB keyboard circuit was stripped to expose its three indicator LEDs: Scroll Lock, Caps Lock, and Num Lock. The LEDs are driven from keyboard-control libraries linked to MATLAB according to the recognized gesture, giving a 3-bit code per gesture. These codes are read by a PIC16F877A microcontroller, which produces the outputs that drive an H-bridge circuit.
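The slide does not state which LED carries which bit, so the mapping below is an assumption for illustration (the real system drove the LEDs from MATLAB-linked keyboard libraries, not Python):

```python
# Map each recognized gesture to the 3-bit code carried on the keyboard's
# three indicator LEDs. Bit assignment (Scroll = bit 2, Caps = bit 1,
# Num = bit 0) is an assumption, not taken from the slides.
GESTURE_CODES = {
    "brake":   0b000,
    "forward": 0b001,
    "reverse": 0b010,
    "left":    0b011,
    "right":   0b100,
}

def led_states(gesture):
    """Return (scroll, caps, num) on/off states encoding the gesture."""
    code = GESTURE_CODES[gesture]
    return bool(code & 0b100), bool(code & 0b010), bool(code & 0b001)
```

On the hardware side, the PIC simply samples the three LED lines as a parallel 3-bit input.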
  • Slide 18
  • IMPLEMENTATION
    3-bit signal   Gesture      H-bridge control signal
    000            Brake        0000
    001            Forward      1001
    010            Reverse      0110
    011            Turn left    1001 0110
    100            Turn right   0110 1001
    101            Brake        0000
    110            Brake        0000
    111            Brake        0000
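The decode step performed by the PIC16F877A can be expressed as a simple lookup, with every code not assigned a motion (including the unused codes 101 to 111) falling back to brake, as in the slide's table. This Python sketch stands in for the actual PIC firmware:

```python
# 3-bit gesture code -> H-bridge relay drive pattern, per the slide's table.
HBRIDGE_PATTERNS = {
    0b001: "1001",       # forward
    0b010: "0110",       # reverse
    0b011: "1001 0110",  # turn left
    0b100: "0110 1001",  # turn right
}

def hbridge_pattern(code):
    """Return the H-bridge control pattern for a 3-bit code; default is brake."""
    return HBRIDGE_PATTERNS.get(code & 0b111, "0000")
```

Defaulting every unassigned code to brake is a fail-safe choice: a corrupted or unexpected signal stops the chair rather than moving it.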
  • Slide 19
  • IMPLEMENTATION Motor Control Module developed. Two DC motors are controlled based on inputs from the Interfacing Module. These inputs drive an H-bridge circuit through a ULN2804 driver. The H-bridge, consisting of 8 relays, directs the two motors attached to the rear wheels of the wheelchair.
  • Slide 20
  • IMPLEMENTATION Both rear wheels turn in the same direction for forward/reverse motion of the wheelchair. To turn left, the left wheel turns backward while the right wheel turns forward; the reverse holds for right turns. To brake, the wheels are first driven opposite to their current spin for a quarter of a second, then the motors are disconnected from the supply.
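The differential-drive and counter-spin braking behaviour described above can be sketched as follows. This is an illustrative model only: the real system switches relays from a PIC, and `set_wheels` here is a hypothetical stand-in for that relay-driving step.

```python
import time

# Per-wheel direction: +1 forward, -1 backward, 0 stopped.
WHEEL_DIRECTIONS = {
    "forward": (+1, +1),
    "reverse": (-1, -1),
    "left":    (-1, +1),   # left wheel back, right wheel forward
    "right":   (+1, -1),   # the reverse for right turns
}

def drive(command, current=(0, 0), set_wheels=print, brake_pulse_s=0.25):
    """Apply a motion command and return the new (left, right) wheel state."""
    if command == "brake":
        # Briefly counter-spin both wheels against their current direction...
        set_wheels((-current[0], -current[1]))
        time.sleep(brake_pulse_s)
        # ...then disconnect the motors from the supply.
        set_wheels((0, 0))
        return (0, 0)
    new = WHEEL_DIRECTIONS[command]
    set_wheels(new)
    return new
```

For example, braking out of a left turn first commands (+1, -1) for the pulse duration, then (0, 0).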
  • Slide 21
  • Slide 22
  • Slide 23
  • VERIFICATION Gesture detection verification: captured gestures were correlated in real time with the stored templates, with satisfactory results. DC motor motion test: all eight combinations of 3-bit data were fed externally to the microcontroller; the wheels connected to the DC motors turned as expected.
  • Slide 24
  • VERIFICATION System test: wheel motion was tested against input generated by the gesture-capture module. The input fed by the PC over the USB cable is decoded by the microcontroller; each gesture generates a distinct 3-bit signal, and no spurious signals were observed.
  • Slide 25
  • FUTURE PROSPECTS Porting the whole system to an embedded system based on the Intel Atom processor. Fine-tuning the system for power-efficiency and compactness.
  • Slide 26
  • REFERENCES Qing Chen, Real-Time Vision-Based Hand Tracking and Gesture Recognition. William K. Pratt, Digital Image Processing. Anil K. Jain, Fundamentals of Digital Image Processing. The MathWorks, Image Processing Toolbox User's Guide (for use with MATLAB).
  • Slide 27