Towards Real-Time Continuous Emotion Recognition from Body Movements
Host Publication: 4th International Workshop on Human Behavior Understanding
Authors: W. Wang, V. Enescu and H. Sahli
Publisher: Springer International Publishing
Publication Date: Oct. 2013
Number of Pages: 11
ISBN: 978-3-319-02713-5
Abstract: Social psychological research indicates that bodily expressions convey important affective information, although this modality is relatively neglected in the literature compared to facial expressions and speech. In this paper we propose a real-time system that continuously recognizes emotions from body movement data streams. Low-level 3D postural features and high-level kinematic and geometrical features are fed, through summarization (statistical values) or aggregation (feature patches), to a random forest classifier. In a first stage, the MoCap UCLIC affective gesture database was used to train the classifier, yielding an overall recognition rate of 78% under 10-fold cross-validation (leave-one-out). Subsequently, the trained classifier was tested on different subjects using continuous Kinect data, where a rate of 72% was reached in real time, demonstrating the efficiency and effectiveness of the proposed system.
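To make the described pipeline concrete, the following is a minimal sketch in Python of the summarize-then-classify idea: frame-level postural and kinematic features are summarized over sliding windows with simple statistics and passed to a scikit-learn RandomForestClassifier. The specific feature definitions, window length, statistics, and patch aggregation used in the paper are not given in the abstract, so every such choice below is an assumption for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def frame_features(joints_xyz):
    # joints_xyz: (n_frames, n_joints, 3) array of 3D joint positions.
    # Hypothetical stand-in for the paper's low-level postural features
    # (flattened positions) and high-level kinematic features (velocities).
    flat = joints_xyz.reshape(len(joints_xyz), -1)
    vel = np.vstack([np.zeros_like(flat[:1]), np.diff(flat, axis=0)])
    return np.hstack([flat, vel])

def summarize(window):
    # Summarization step: per-dimension statistics over a window
    # (the paper's exact statistics may differ).
    return np.hstack([window.mean(0), window.std(0), window.min(0), window.max(0)])

def windows(features, size=90, step=30):
    # ~3 s windows with a 1 s hop at 30 fps -- assumed values, not from the paper.
    for start in range(0, len(features) - size + 1, step):
        yield summarize(features[start:start + size])

def train(sequences, labels):
    # Train on pre-segmented MoCap sequences labeled with emotion classes.
    X = np.array([summarize(frame_features(seq)) for seq in sequences])
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X, labels)
    return clf

def recognize_stream(clf, joint_stream):
    # Continuous recognition: classify each sliding window of an incoming
    # skeleton stream (e.g. Kinect joint positions).
    feats = frame_features(joint_stream)
    return [clf.predict(w[None, :])[0] for w in windows(feats)]
```

In a streaming setting, `recognize_stream` would be called on a rolling buffer of the most recent skeleton frames so that a label is emitted once per hop, which is what makes window-based random forest classification amenable to real-time use.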