Body area networks for movement
tracking applications Bojan Milosevic, PhD Student, XXV Ciclo
Electronics, Computer Science and Telecommunications, DEIS
Università di Bologna
Tutor: Prof. Luca Benini
Background
• Recent advances in Micro-Electro-Mechanical Systems (MEMS) have enabled the development of new, small and affordable inertial tracking devices, resulting in a new generation of integrated inertial sensors (accelerometers, gyroscopes, magnetometers, as well as presence, light, temperature and pressure sensors)
• Low-power wireless technologies (such as Bluetooth and ZigBee) allow embedded devices to efficiently exchange information or to stream sensor data to a central station for further processing.
Embedded Interactive Devices
• Scope: design and development of embedded devices to track human movement, to enhance user interaction or to monitor the user's activity
• Focus:
‒ low cost and ease of use of the device
‒ optimal HW/SW design and resource utilization
‒ building smarter devices using the embedded computation resources and sensor fusion techniques
Outline
• Scenario: 3D input/editing device
• Introduction
• Hardware
• On-board orientation estimation
• Results
• Summary
• Future Developments
NIIT4CAD – New Interactive and Innovative Technologies 4 CAD
The main idea is to combine a geometric engine based on subdivision surfaces with a simple interactive technology that allows easy design and editing of surfaces in traditional CAD systems.
‒ A pen-like Tangible User Interface (TUI) is tracked in 3D space using infra-red (IR) LEDs and low-cost webcams,
‒ integration of video tracking with on-board inertial sensors improves accuracy.
NIIT4CAD
[Beccari10] A fast interactive reverse-engineering system, Computer-Aided Design, 2010.
Smart Pen
• Easy to use interactive input device: advantages of a familiar daily tool, augmented with new functionalities
Hardware prototype developed:
• MCU: STM32F103VG
‒ 32-bit, high-performance ARM Cortex-M3 core, up to 72 MHz, no floating-point unit
• Full Inertial Measurement Unit (IMU):
‒ ST LSM303DLH: 3D accelerometer + magnetometer
‒ ST L3G4200D: 3D gyroscope
• Bluetooth module
• 4 IR LEDs for video tracking
‒ Different distances between LEDs
• Problem: position and orientation tracking of the Smart Pen
• Tools: low cost stereo vision setup, inertial sensors, IR LEDs
• Approach: tracking of the four LEDs, estimation of inclination and position, integration with inertial measurements
• Contribution:
‒ use the inertial sensors to improve the Computer Vision (CV) based tracking
‒ implementation of sensor fusion and estimation algorithms on board the device and workload balancing between the device and the PC
• porting of an Extended Kalman Filter (EKF) to the STM32 MCU
Contribution
[Block diagram: Wiimotes → 2D LED points → stereo triangulation → 3D LED points → pose estimation → pen position and orientation; IMU sensors (gyro, accel, mag) → EKF (prediction/correction) → pen orientation → pose estimation]
Implementation
The Wiimote includes a 128x96 monochrome camera with built-in IR filters and image processing:
• connects to the PC via Bluetooth,
• directly outputs the coordinates of up to 4 IR LEDs.
The PC collects data from the 2 Wiimotes and uses standard techniques (the OpenCV library) to perform stereo calibration and 3D triangulation, reconstructing the 3D position of the device
‒ processing is applied only to the individual points representing the IR LEDs, not to whole images
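As a rough illustration of the geometry behind the stereo step (the project itself relies on OpenCV's calibrated triangulation), a minimal depth-from-disparity computation for an idealized, rectified stereo pair; all parameter values here are purely illustrative:

```python
def triangulate_rectified(xl, yl, xr, focal_px, baseline_mm):
    """Triangulate one point seen by two rectified pinhole cameras.

    (xl, yl) and (xr, yl) are pixel coordinates relative to each
    camera's principal point; focal_px is the focal length in pixels
    and baseline_mm the distance between the two camera centres.
    """
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("point must be in front of both cameras")
    z = focal_px * baseline_mm / disparity   # depth
    x = xl * z / focal_px                    # lateral offset
    y = yl * z / focal_px                    # vertical offset
    return x, y, z
```

Real setups also need the calibration step (intrinsics, distortion, relative camera pose) that OpenCV provides; this sketch assumes that has already been done.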
The SmartPen samples the inertial sensors and corrects the readings for any calibration bias and offset
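The correction step can be sketched as a per-axis offset-and-scale operation; the constants below are made-up placeholders, not the SmartPen's actual calibration values:

```python
# Illustrative per-axis calibration constants: offset in raw sensor
# counts, scale in physical units per count. Real values come from a
# calibration procedure specific to each sensor unit.
ACCEL_OFFSET = (15, -8, 120)
ACCEL_SCALE = (0.001, 0.001, 0.001)  # g per count

def correct_reading(raw, offset, scale):
    """Apply offset and scale calibration to a raw 3-axis sample."""
    return tuple((r - o) * s for r, o, s in zip(raw, offset, scale))
```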
On the SmartPen, the Extended Kalman Filter (EKF) integrates the gyroscope measurements to predict the orientation of the device and corrects the prediction with accelerometer and magnetometer readings
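The predict/correct structure can be illustrated with a one-dimensional stand-in for the on-board filter (the real EKF estimates full 3-D orientation; the noise parameters here are invented for the example):

```python
class TiltFilter:
    """Scalar Kalman filter on a single tilt angle: gyro integration
    in the prediction step, an accelerometer-derived angle in the
    correction step. A 1-D stand-in for the SmartPen's orientation
    EKF; q and r are illustrative noise parameters."""

    def __init__(self, q=0.01, r=0.1):
        self.angle = 0.0   # estimated angle [rad]
        self.p = 1.0       # estimate variance
        self.q = q         # process noise (gyro drift)
        self.r = r         # measurement noise (accelerometer)

    def predict(self, gyro_rate, dt):
        # Integrate the gyro rate; uncertainty grows with time.
        self.angle += gyro_rate * dt
        self.p += self.q * dt

    def correct(self, accel_angle):
        # Blend in the absolute measurement via the Kalman gain.
        k = self.p / (self.p + self.r)
        self.angle += k * (accel_angle - self.angle)
        self.p *= (1.0 - k)
```

The same two-phase loop runs on the SmartPen, but with a quaternion state and the magnetometer correcting heading, which the gravity vector alone cannot observe.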
Pose Estimation:
• Principal Component Analysis to estimate the pen orientation from the 3D points,
• linear Kalman Filter to adjust the orientation with the inertial estimate,
• pen and LED geometry used to find the pen tip and orientation
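The PCA step can be sketched generically as follows (covariance matrix plus power iteration; this is an illustration of the technique, not the project's implementation):

```python
def principal_axis(points):
    """Dominant direction of a 3-D point cloud via PCA. For four
    roughly collinear LED points this yields the pen's long axis."""
    n = len(points)
    c = [sum(p[i] for p in points) / n for i in range(3)]
    centered = [[p[i] - c[i] for i in range(3)] for p in points]
    # 3x3 covariance matrix of the centered points
    cov = [[sum(q[i] * q[j] for q in centered) / n for j in range(3)]
           for i in range(3)]
    # Power iteration converges to the dominant eigenvector,
    # i.e. the direction of greatest variance.
    v = [1.0, 1.0, 1.0]
    for _ in range(50):
        w = [sum(cov[i][j] * v[j] for j in range(3)) for i in range(3)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v
```

The sign of the axis is ambiguous; in the presented system the different spacings between the LEDs allow the tip end of the pen to be identified.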
Results - On Board EKF
• Fixed-point implementation of the Extended Kalman Filter in C on the STM32 platform (GCC for ARM)
– Optimized matrix operations, including exponentials and matrix inversion (executed in floating point)
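The "10.22" format below stores 10 integer and 22 fractional bits in a 32-bit word; its basic arithmetic can be sketched as follows (shown in Python for illustration, though the on-board filter is written in C):

```python
FRAC_BITS = 22          # Q10.22: 10 integer bits, 22 fractional bits
ONE = 1 << FRAC_BITS

def to_fixed(x):
    """Convert a float to Q10.22 fixed point."""
    return int(round(x * ONE))

def to_float(a):
    """Convert a Q10.22 value back to float."""
    return a / ONE

def fix_mul(a, b):
    """Multiply two Q10.22 numbers: the wide product carries
    44 fractional bits, so shift back down by 22."""
    return (a * b) >> FRAC_BITS
```

On the Cortex-M3 the product must be kept in a 64-bit intermediate before the shift; Python's unbounded integers hide that detail.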
[Plot: roll angle [°] vs. time [s], comparing the raw data with the fixed-point, MATLAB and float implementations]
Reference implementation in MATLAB (double precision)
Floating point (32 bit, 10 Hz):
• time: 73 ms
• max RMSE: 2.4°
Fixed point (10.22 bits, 40 Hz):
• time: 22 ms
• max RMSE: 3.3°
• Fixed point @ 40 Hz vs. floating point @ 10 Hz
– Best choice: the fixed-point version, whose higher update rate compensates for the small loss in precision
Results - Tracking
[Plot: tracked trajectory, X vs. Y [mm], comparing video-only tracking with the MATLAB and on-board (MCU) EKF implementations]
• The Wiimotes output easy-to-use LED coordinates, but these are rather noisy
• Inertial sensors can be used to reconstruct the device's orientation
‒ the on-board fixed-point EKF achieves good performance, within 3° of the double-precision implementation, at rates up to 40 Hz
• The fusion of video and IMU sensors gives good overall performance
‒ it can be further improved using higher-resolution cameras
Summary
• Adaptation of the hardware platform to a miniaturized wearable node (pictured right), with the goal of allowing the user to wear several nodes to track whole-body movements.
• Integration of all nodes in a synchronized Body Area Network (BAN) to monitor human motion and activities.
• Use of an Android smartphone to collect and further process the data from the nodes.
Future Developments
Bojan Milosevic
Micrel Lab - DEIS - UniBo
http://www-micrel.deis.unibo.it/~milosevic
Thank You!