Kinect H4x Gesture Recognition and Playback Tools (+Inspiration)
SDK Version 1.0 - Out TODAY
What's New?
Ability to control which user(s) are under full skeletal tracking.
"Near mode" enables interaction as close as 40cm from the device. Includes "too far" and "too close" depth indicators.
Better everything
Mother of All Kinect Demos - Kinect Explorer sample app shows off all features (camera tilt, audio beam angles, etc.).
Gesture Recognition: Dynamic Time Warping
A sequence-matching algorithm that can handle sequences that vary in speed and timing (think Levenshtein distance, but generalized from comparing strings to matching any sort of input against stored data).
It measures the similarity between two sequences based on a cost function of how much it needs to "warp" the points forward/backward in time to have them line up.
In Kinect-land, this means an algorithm that can take streaming joint data and quickly find the closest match to a gesture 'on record'.
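The "warping" described above is just a small dynamic program. Here is a minimal, language-agnostic sketch in Python (not the DtwGestureRecognizer source) of the core DTW cost computation:

```python
# Minimal dynamic-time-warping distance between two sequences of
# feature vectors. Illustrative sketch only, not the KinectDTW code.
import math

def dtw_distance(seq_a, seq_b):
    """Cost of optimally warping seq_a onto seq_b in time."""
    n, m = len(seq_a), len(seq_b)
    # cost[i][j] = best cumulative cost aligning seq_a[:i] with seq_b[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two observation vectors
            d = math.dist(seq_a[i - 1], seq_b[j - 1])
            # Extend the cheapest of: match, warp a forward, warp b forward
            cost[i][j] = d + min(cost[i - 1][j - 1],
                                 cost[i - 1][j],
                                 cost[i][j - 1])
    return cost[n][m]

slow = [(0.0,), (0.0,), (1.0,), (1.0,), (2.0,)]
fast = [(0.0,), (1.0,), (2.0,)]
print(dtw_distance(slow, fast))  # 0.0 -- same shape, different speeds
```

Streaming recognition then amounts to running this cost against every stored gesture and taking the cheapest match under some threshold.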
There's an App for that -- DEMO
But how does it work? And how can I work it into my own project? Three classes:
Skeleton2DDataExtract.cs -- Takes in Kinect SDK joint data, spits out normalized 2d skeleton points.
Skeleton2DdataCoordEventArgs.cs -- Defines the event args that get emitted from the Skeleton2DDataExtract event handler once processed.
DtwGestureRecognizer.cs -- Parses the 2D skeleton data; call Recognize() to match against loaded gestures (see code for a loading/saving example).
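For intuition, here is a hedged Python sketch of the kind of normalization a class like Skeleton2DDataExtract performs -- centering and scaling joints so the same gesture matches regardless of where the user stands or how large they are. The reference joints and dict layout are illustrative assumptions, not the class's actual code:

```python
# Illustrative normalization sketch: the joint names and layout below
# are assumptions, not the Skeleton2DDataExtract implementation.
import math

def normalize_skeleton(joints, center_key="shoulder_center",
                       left_key="shoulder_left", right_key="shoulder_right"):
    """joints: dict of joint name -> (x, y). Returns a flat feature vector."""
    cx, cy = joints[center_key]
    # Shoulder width gives a per-user scale factor
    scale = math.dist(joints[left_key], joints[right_key]) or 1.0
    features = []
    for name in sorted(joints):          # fixed joint ordering
        x, y = joints[name]
        features.extend(((x - cx) / scale, (y - cy) / scale))
    return features
```

Vectors like this, emitted once per frame, are what the recognizer ultimately compares.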
Recognizing A Gesture -- Code
Really Advanced Hacks
The DtwGestureRecognizer can flexibly match any vectorized data stream.
We happened to use skeleton data in our example, but it should be fairly simple to incorporate a depth stream or color stream. Just ensure that each of the sequence objects you pass into AddOrUpdate and Recognize is an array of doubles (e.g. double[] observationPoint).
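As a sketch of that generality, here is a Python stand-in for the AddOrUpdate/Recognize pattern (the real API is C#; the class name, threshold, and DTW helper below are illustrative assumptions, not the DtwGestureRecognizer source):

```python
# Hedged sketch: the recognizer doesn't care what the vectors mean, only
# that every observation is a fixed-length numeric array.
import math

def _dtw(a, b):
    """Plain DTW cost between two sequences of numeric vectors."""
    n, m = len(a), len(b)
    c = [[math.inf] * (m + 1) for _ in range(n + 1)]
    c[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            c[i][j] = math.dist(a[i - 1], b[j - 1]) + min(
                c[i - 1][j - 1], c[i - 1][j], c[i][j - 1])
    return c[n][m]

class VectorGestureRecognizer:
    def __init__(self, threshold=1.0):
        self.templates = {}          # gesture name -> list of vectors
        self.threshold = threshold

    def add_or_update(self, name, sequence):
        self.templates[name] = sequence

    def recognize(self, sequence):
        """Return the closest stored gesture, or None if all are too far."""
        scores = {n: _dtw(sequence, t) for n, t in self.templates.items()}
        if not scores:
            return None
        best = min(scores, key=scores.get)
        return best if scores[best] <= self.threshold else None

# Works just as well for depth- or color-derived features as skeleton points:
r = VectorGestureRecognizer(threshold=0.5)
r.add_or_update("swipe", [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)])
r.add_or_update("raise", [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)])
print(r.recognize([(0.0, 0.0), (0.3, 0.0), (0.5, 0.0), (1.0, 0.0)]))  # swipe
```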
Working on P4 Collaboratively: Recording and Replaying Skeleton Data
Since many teams share a single Kinect, it's useful to be able to record a sequence of skeleton data, write it to a file, and replay it later through a dummy nui.SkeletonFrameReady handler.
Fortunately, the Kinect Toolbox (not to be confused with the Coding4Fun Kinect Toolkit) lets us do just that.
Get it at: http://kinecttoolbox.codeplex.com/
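The record/replay idea can be sketched as follows (the real Kinect Toolbox is a C# library; the JSON file format, function names, and handler signature here are invented for illustration). Frames are timestamped so playback preserves the original timing:

```python
# Conceptual record/replay sketch -- not the Kinect Toolbox API.
import json
import time

def record(frames, path):
    """frames: iterable of (timestamp_seconds, skeleton_data)."""
    with open(path, "w") as f:
        json.dump([{"t": t, "skeleton": s} for t, s in frames], f)

def replay(path, on_skeleton_frame_ready, speed=1.0):
    """Feed saved frames to a dummy SkeletonFrameReady-style handler,
    sleeping between frames to reproduce the recorded timing."""
    with open(path) as f:
        frames = json.load(f)
    prev = frames[0]["t"] if frames else 0.0
    for frame in frames:
        time.sleep(max(0.0, (frame["t"] - prev) / speed))
        prev = frame["t"]
        on_skeleton_frame_ready(frame["skeleton"])
```

A teammate without hardware can then point their frame handler at `replay(...)` instead of a live sensor.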
Demo -- Sample Code Walkthrough -- Saving and Playing Back Gestures
Candescent NUI (Demo)
The Cool News: Hand + Finger Tracking!
The Bad News: Behemoth, undocumented code library
Project available at http://candescentnui.codeplex.com
The Kinect SDK only provides depth values from 800mm out, while the finger-tracking code only works in the 0.8-1m range, so using this project in conjunction with the Kinect SDK will prove difficult. UPDATE: the new SDK 1.0 provides depth values from 400mm out (in near mode)!
Alternative (compatible with Candescent): OpenNI + NITE uses the raw point cloud to make best-guess tracking estimates < 800mm away. Good community; documentation at OpenNI.org / OpenKinect.org.
Anant - Shape Game Walkthrough
Inspirational Project:
Deixis Application To Children's Education Games
Main Idea: Ask children to point to and verbally identify ("this one!") a subset of objects (numbers, colors, animals).
Description + Video: http://www.cs.utoronto.ca/~uzmakhan/HCIProject.html
Inspirational Hacks
Gesture Enabled Garden Hose
Main Idea: Use a Netduino (network-enabled microcontroller) to drive servos (simple motors) in response to Kinect gestures: http://channel9.msdn.com/coding4fun/kinect/Kinect-to-Netduino-Cross-post
In Practice: http://channel9.msdn.com/coding4fun/kinect/A-Kinect--Netduino-controlled-squirt-gun
http://www.youtube.com/watch?v=FWINsKcP8oQ
Inspirational Hacks Pt. 2
Gesture Based Electronic Music Performance
http://vimeo.com/channels/pulseproject#30301433
Inspirational Hacks Pt. 3
EDEN: Interactive Ecosystem Simulation Software
http://vimeo.com/31940579
Main Idea: Create a topographical landscape on the iPad, fill it with (simulated) water, and project it onto a sandscape via depth data from an overhead Kinect + projector. Play with the sand to change the climate and topography and terraform your own sandscape.
More
Kinect Telepresence - http://www.youtube.com/watch?v=ecMOX8_GeRY
Home Security Camera - http://www.youtube.com/watch?v=UfGOR1Eg_qQ
Living Paintings - http://www.youtube.com/watch?v=UjDaHMKwQl4
Visually Impaired Navigation Tool - http://www.youtube.com/watch?v=l6QY-eb6NoQ
ZigFu: a single-bundle install of NITE, OpenNI, and the PrimeSense Sensor driver -- everything you need to work outside of the official SDK: http://zigfu.com/devtools.html