Kinect Library

Class:      Research Project
Instructor: Karen Liu / Sehoon Ha (PhD student)
Date:       Spring 2013 - Fall 2013
Language:   C++ (Windows)
TA'ed:      No
Code:       N/A

The Kinect Library I wrote was built on the Microsoft Kinect SDK 1.7. I implemented full skeleton handling/IK, voice recognition, hand-state tracking (open, closed, pointing) via a neural net I wrote that learned online, IR/RGB camera support, and numerous Kinect-friendly UI components, including a slider, a lever, a button, a check box, and even a text box.
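
To give a flavor of what the skeleton-handling side looked like, here is a minimal sketch against the native Kinect SDK 1.7 C++ API (NuiApi.h). The SDK calls are real; ClassifyHandState is a hypothetical stand-in for the online-learning neural net described above, not the library's actual classifier.

    #include <Windows.h>
    #include <NuiApi.h>

    enum class HandState { Open, Closed, Pointing };

    // Hypothetical stand-in: the real library ran hand features through a
    // small neural net that kept learning online. A constant keeps this buildable.
    HandState ClassifyHandState(const Vector4& hand, const Vector4& wrist)
    {
        (void)hand; (void)wrist;
        return HandState::Open;
    }

    bool InitKinect()
    {
        // Skeleton stream only here; the library also opened the RGB and IR streams.
        if (FAILED(NuiInitialize(NUI_INITIALIZE_FLAG_USES_SKELETON)))
            return false;
        return SUCCEEDED(NuiSkeletonTrackingEnable(NULL, 0));
    }

    void PollSkeletons()
    {
        NUI_SKELETON_FRAME frame = {0};
        if (FAILED(NuiSkeletonGetNextFrame(0, &frame)))
            return;
        NuiTransformSmooth(&frame, NULL);  // default jitter smoothing

        for (int i = 0; i < NUI_SKELETON_COUNT; ++i)
        {
            const NUI_SKELETON_DATA& s = frame.SkeletonData[i];
            if (s.eTrackingState != NUI_SKELETON_TRACKED)
                continue;

            Vector4 handR  = s.SkeletonPositions[NUI_SKELETON_POSITION_HAND_RIGHT];
            Vector4 wristR = s.SkeletonPositions[NUI_SKELETON_POSITION_WRIST_RIGHT];
            HandState state = ClassifyHandState(handR, wristR);
            // ...drive IK targets and the Kinect-friendly UI from joints + state.
            (void)state;
        }
    }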

The purpose of the library was to provide an interface for training a physically based character to perform complex parkour-like tricks and maneuvers. The user would perform a motion that served as an example for the character learning the maneuver. Swapping between "Teaching Mode" and "System Control Mode" had to happen via voice recognition so the system would never train on "reaching for the mouse", and a fully Kinect-friendly, accurate interface for communicating with the optimization engine was needed so the user never had to be near a computer.
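
Here is a hedged sketch of what voice-driven mode switching can look like with the standard SAPI 5 shared recognizer and the default microphone. The actual library listened through the Kinect microphone array, and the phrase strings "teaching mode" and "control mode" here are illustrative, not the project's real commands.

    #include <Windows.h>
    #include <sapi.h>
    #include <sphelper.h>
    #include <atlbase.h>
    #include <string>

    enum class Mode { Teaching, SystemControl };

    int main()
    {
        CoInitialize(NULL);
        Mode mode = Mode::SystemControl;

        CComPtr<ISpRecoContext> ctx;
        if (FAILED(ctx.CoCreateInstance(CLSID_SpSharedRecoContext)))
            return 1;

        // Dictation is enough for a sketch; a real command interface would load
        // a small fixed grammar and compare rule IDs rather than raw text.
        CComPtr<ISpRecoGrammar> grammar;
        ctx->CreateGrammar(1, &grammar);
        grammar->LoadDictation(NULL, SPLO_STATIC);
        grammar->SetDictationState(SPRS_ACTIVE);

        ctx->SetNotifyWin32Event();
        ctx->SetInterest(SPFEI(SPEI_RECOGNITION), SPFEI(SPEI_RECOGNITION));

        while (ctx->WaitForNotifyEvent(INFINITE) == S_OK)
        {
            CSpEvent evt;
            while (evt.GetFrom(ctx) == S_OK)
            {
                if (evt.eEventId != SPEI_RECOGNITION) continue;
                LPWSTR text = NULL;
                if (SUCCEEDED(evt.RecoResult()->GetText(
                        SP_GETWHOLEPHRASE, SP_GETWHOLEPHRASE, TRUE, &text, NULL)))
                {
                    std::wstring phrase(text);
                    CoTaskMemFree(text);
                    if (phrase == L"teaching mode")     mode = Mode::Teaching;
                    else if (phrase == L"control mode") mode = Mode::SystemControl;
                }
            }
        }
        CoUninitialize();
        return 0;
    }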

Here are some videos of the system during development; some functionality was not yet implemented at the time they were recorded. By the end of the research project, all the UI buttons, levers, and sliders were working, the hands supported push, press, and grab for both hands (assisted by the neural net I implemented), and the voice recognition was also complete. This project was done 9 years ago, so finding demo videos has been challenging, to say the least. I really want to find the few I made of the completed, working system, and when I do I'll put them up as well.

An early test of the hands' performance. Future iterations were not nearly as noisy or as sensitive. Note the various UI elements supported: button, slider, lever. The elements above are buttons and also indicate what the system thinks each hand is doing.
An early test of the slider UI element. Note the hand behavior is much more accurate (grab, push, press).