Human Emotion Estimation from EEG and Face Using Statistical Features and SVM

Authors

Strahil Sokolov, Yuliyan Velchev, Svetla Radeva and Dimitar Radev, University of Telecommunications and Post, Bulgaria

Abstract

This paper presents an approach for automated estimation of human emotions from a combination of multimodal data: electroencephalogram (EEG) signals and facial images. The EEG features used are the Hjorth parameters calculated for the theta, alpha, beta, and gamma bands of predefined channels. For facial emotion estimation, PCA features are selected. Classification is performed with support vector machines. Since human emotions are modelled as combinations of physiological elements such as arousal, valence, dominance, and liking, these quantities are the classifier's outputs. The best correct classification rate achieved for EEG is about 76%. Classifier combination is used to return the final score for a particular subject.
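For illustration, below is a minimal sketch of how Hjorth parameters (activity, mobility, complexity) can be computed per band-filtered EEG channel and passed to an SVM. The band edges, filter order, channel layout, and SVM settings shown here are assumptions for the sketch, not the authors' exact pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC

def hjorth_parameters(x):
    """Return activity, mobility and complexity of a 1-D signal."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    activity = var_x
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return activity, mobility, complexity

def band_features(eeg, fs, bands=((4, 8), (8, 13), (13, 30), (30, 45))):
    """Hjorth parameters per channel and band (theta, alpha, beta, gamma).

    eeg: array of shape (n_channels, n_samples); fs: sampling rate in Hz.
    """
    feats = []
    for low, high in bands:
        b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
        for channel in eeg:
            feats.extend(hjorth_parameters(filtfilt(b, a, channel)))
    return np.asarray(feats)

# Hypothetical usage: rows of X_train are feature vectors from band_features,
# y_train holds binarized labels (e.g., high vs. low arousal or valence).
# clf = SVC(kernel="rbf", C=1.0).fit(X_train, y_train)
```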

Keywords

Multimodal human emotion, EEG, arousal, valence, BCI

Full Text | Volume 7, Number 2