Media Player Gesture Controller

ETH Zurich Spring 2015

This project was completed as a semester capstone for the User Interface Engineering course at ETH Zurich. Our team built a support vector machine (SVM) classifier that recognizes hand gestures from depth data captured by the Leap Motion, a sensor equipped with infrared LEDs and cameras. The Leap Motion's hand pose estimation pipeline clusters similar regions of pixels, assigns a center to each region, and then tries to connect those centers into the skeleton of a human hand. Using this hand-skeleton data, we recognized the gestures signaled by a user and mapped them to media player commands, so that viewers can control their playback device hands-free.
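
The sketch below illustrates the general approach of training an SVM on hand-skeleton features; it is not the original project code. The feature layout (palm-relative fingertip positions), the gesture labels, and the use of scikit-learn are illustrative assumptions.

```python
# Minimal sketch (assumed, not the project's actual code): training an SVM
# gesture classifier on hand-skeleton feature vectors with scikit-learn.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def frame_to_features(palm_pos, finger_tips):
    """Flatten one tracked frame into a fixed-length feature vector.

    palm_pos:    (x, y, z) of the palm center
    finger_tips: five (x, y, z) fingertip positions
    Positions are expressed relative to the palm so the features do not
    depend on where the hand sits above the sensor.
    """
    palm = np.asarray(palm_pos, dtype=float)
    tips = np.asarray(finger_tips, dtype=float) - palm  # palm-relative
    return tips.ravel()  # 15-dimensional feature vector

# X: one feature row per recorded frame, y: gesture label per frame
# (labels such as "play_pause", "volume_up", "swipe_next" are placeholders).
# Random data stands in here for the recorded training set.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 15))
y = rng.choice(["play_pause", "volume_up", "swipe_next"], size=300)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# RBF-kernel SVM with feature scaling; scikit-learn handles multi-class
# classification internally via a one-vs-one scheme.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

At runtime, each tracked frame would be converted with frame_to_features and passed to clf.predict, and the predicted label dispatched to the corresponding media player command.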