Human emotional state recognition using 3D facial expression features
Thesis posted on 22.05.2021, 12:17 by Yun Tie
In recent years there has been growing interest in improving all aspects of the interaction between humans and computers. Emotion recognition is a new research direction in human-computer interaction (HCI), grounded in affective computing, that is expected to significantly improve the quality of HCI systems and communications. Most existing works address this problem using 2D features, which are sensitive to head pose, clutter, and variations in lighting conditions. In light of these problems, two approaches based on 3D visual features are presented in this dissertation. First, we present a recognition method that uses a Gabor library for 3D visual feature extraction and an improved kernel canonical correlation analysis (IKCCA) algorithm for emotion classification. Second, to reduce the computational cost and provide a more general approach, we propose using a 3D face model controlled by fiducial points to recognize human emotion from video sequences. An elastic body spline (EBS) technique is applied for deformation feature extraction, and a discriminative Isomap (D-Isomap) based classification is used for the final decision. The most significant contributions of this work are the automatic detection and tracking of fiducial points in video sequences to construct a generic 3D face model, and the introduction of EBS deformation features for emotion recognition. The experimental results demonstrate the robustness and effectiveness of the proposed methods.
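To make the Gabor-based feature extraction concrete, the sketch below builds a small bank of 2D Gabor kernels over several scales and orientations and pools filter responses into a feature vector. This is only an illustrative, minimal sketch of the general technique; the kernel size, scales, orientation count, and the `gabor_features` pooling step are assumptions for demonstration, not the dissertation's actual Gabor library or its 3D extension.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam, psi=0.0, gamma=0.5):
    """Build one 2D Gabor kernel (real part): a Gaussian envelope
    modulated by a cosine carrier at orientation theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates by the orientation angle theta
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + gamma**2 * yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / lam + psi)
    return envelope * carrier

def gabor_bank(size=31, scales=(4.0, 6.0, 8.0), n_orient=6):
    """Bank of kernels over several scales and orientations
    (parameter choices here are illustrative assumptions)."""
    return [gabor_kernel(size, s, k * np.pi / n_orient, lam=2.0 * s)
            for s in scales for k in range(n_orient)]

def gabor_features(patch, bank):
    """Pool each kernel's response on an image patch into one number;
    real systems would convolve over the whole face region instead."""
    return np.array([float(np.abs(np.sum(patch * k))) for k in bank])
```

A 31x31 image patch run through this bank yields an 18-dimensional descriptor (3 scales x 6 orientations), which a classifier such as the IKCCA stage described above could then consume.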