Current Mood: studious
Blogs I Commented On:
Summary:
This gesture recognition paper focuses on the hardware side, specifically on ultrasonics for improving recognition accuracy. The paper first discusses three issues inherent in all ultrasonic positioning systems: reflections (false signals due to reflective materials in the environment), occlusions (lost signals due to a lack of line of sight between communicating devices), and temporal resolution (low transmission counts due to the distance between communicating devices). To perform tracking, the authors smooth the sonic data in two steps: first on the raw signals and then on the resulting coordinates. Classification is then done using C4.5 and k-Nearest Neighbor (k-NN) for motion sensor analysis. Each manipulative gesture in their experiment corresponds to an individually trained HMM for model-based classification, while a sliding-window approach is used for frame-based classification. After classification is performed on all frames of a gesture, a majority decision is applied to the results, yielding a filtered decision for that gesture. Finally, a fusion method combines separate classifications of the ultrasonic and motion signals. Their experiment involved manipulative gestures for a bicycle repair task, on which their fusion method performed best, beating k-NN, HMM, and C4.5 individually.
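The frame-based classification with majority filtering described above can be sketched roughly as follows. This is just my rough understanding, not the authors' actual implementation; the window width, step size, and per-frame classifier outputs here are all hypothetical:

```python
from collections import Counter

def sliding_windows(signal, width, step):
    """Split a signal into overlapping fixed-width frames for per-frame classification."""
    return [signal[i:i + width] for i in range(0, len(signal) - width + 1, step)]

def majority_decision(frame_labels):
    """Filter per-frame predictions down to one gesture label by majority vote."""
    return Counter(frame_labels).most_common(1)[0][0]

# Hypothetical per-frame classifier outputs for one gesture segment:
frames = ["turn_pedal", "turn_pedal", "screw", "turn_pedal", "turn_pedal"]
print(majority_decision(frames))  # -> "turn_pedal"
```

The majority step is what makes the frame-based pipeline robust: a few misclassified frames inside a gesture segment get outvoted by the correctly classified ones.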
Discussion:
Judging from this paper, ultrasonics appears potentially viable as a sensor for the hand gesture recognition domain. Performance was mixed across the traditional machine learning methods they tried, but accuracy was quite good with the fusion approach. I had a hard time understanding exactly how their fusion approach works, though; still, from a hardware perspective, supplementing our available sensors with ultrasonics wouldn't hurt.