Leightley, D, Li, B, McPhee, JS, Yap, MH and Darby, J (2014) Exemplar-Based Human Action Recognition with Template Matching from a Stream of Motion Capture. In: 11th International Conference on Image Analysis and Recognition (ICIAR), 2014.
Available under License In Copyright.
Abstract
Recent work on human action recognition has focused on representing and classifying articulated body motion. These methods require detailed knowledge of the action composition in both the spatial and temporal domains, which is difficult to obtain, most notably under real-time conditions. As such, there has been a recent shift towards the exemplar paradigm as an efficient, low-level and invariant modelling approach. Motivated by this recent success, we believe a real-time solution to the problem of human action recognition can be achieved. In this work, we present an exemplar-based approach in which only a single action sequence is used to model an action class. Notably, the rotations for each pose are parameterised in Exponential Map form. Delegate exemplars are selected using k-means clustering, where the cluster criterion is selected automatically. For each cluster, a delegate is identified and denoted as the exemplar by means of a similarity function. The number of exemplars is adaptive, based on the complexity of the action sequence. For recognition, Dynamic Time Warping and template matching are employed to compare the similarity between a streamed observation and the action model. Experimental results using motion capture demonstrate that our approach is superior to the current state of the art, with the additional ability to handle large and varied action sequences.
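The pipeline the abstract describes, selecting delegate exemplars from an action sequence via k-means and comparing a streamed observation to the model with Dynamic Time Warping, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: `k` is fixed here rather than chosen automatically, poses are flat feature vectors (e.g. concatenated exponential-map joint rotations), and plain Euclidean distance stands in for the paper's similarity function.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two pose sequences.

    a, b: arrays of shape (T, D), one D-dimensional pose vector per frame
    (e.g. flattened exponential-map joint rotations).
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def select_exemplars(poses, k):
    """Pick k delegate frames from one action sequence via k-means.

    The 'delegate' of each cluster is the member pose closest to the
    cluster centroid (an assumed stand-in for the paper's similarity
    function). Returns sorted frame indices of the chosen exemplars.
    """
    rng = np.random.default_rng(0)
    centroids = poses[rng.choice(len(poses), size=k, replace=False)]
    for _ in range(20):  # fixed number of Lloyd iterations for simplicity
        labels = np.argmin(
            np.linalg.norm(poses[:, None] - centroids[None], axis=2), axis=1)
        for c in range(k):
            members = poses[labels == c]
            if len(members):
                centroids[c] = members.mean(axis=0)
    exemplars = []
    for c in range(k):
        members = np.where(labels == c)[0]
        if len(members):
            best = members[np.argmin(
                np.linalg.norm(poses[members] - centroids[c], axis=1))]
            exemplars.append(int(best))
    return sorted(exemplars)
```

At recognition time, the exemplar frames of each class model would be compared against the incoming stream with `dtw_distance`, assigning the label of the closest model.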