Bimodal affect recognition based on autoregressive hidden Markov models from physiological signals
Background and objective: Affect provides contextual information about a person's emotional state as he or she communicates in verbal and non-verbal forms. While humans are adept at judging the emotional state of people they interact with in person, determining a person's emotional state computationally remains a challenging and largely unsolved problem.

Methods: A person's emotional state manifests in physiological biosignals such as the electrocardiogram (ECG) and electrodermal activity (EDA), because these signals are driven by the peripheral nervous system, which is strongly coupled with mental state. In this paper, we present a method that accurately recognizes six emotions (happiness, sadness, surprise, fear, anger, and disgust) by applying autoregressive hidden Markov models (AR-HMMs) and heart rate variability analysis to ECG and EDA signals.

Results: We evaluated our method on a comprehensive new dataset collected from 30 participants. The proposed method achieves an average accuracy of 88.6% in distinguishing the six emotions.

Conclusions: The key technical contributions are the use of AR-HMMs to model the EDA signal and the use of linear discriminant analysis (LDA) to enable accurate emotion recognition without requiring a large number of training samples. Unlike other studies, we take a hierarchical approach to classification: we first categorize an emotion as positive or negative, and then identify the exact emotion.
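The hierarchical scheme described in the conclusions (a valence stage followed by a per-valence emotion stage) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: it uses a nearest-centroid classifier as a stand-in for the paper's LDA/AR-HMM stages, operates on generic feature vectors rather than ECG/EDA features, and assumes a valence grouping (surprise as positive; sadness, fear, anger, and disgust as negative) that the abstract does not specify.

```python
import numpy as np

class NearestCentroid:
    """Tiny stand-in classifier. The paper uses LDA; this is illustrative only."""
    def fit(self, X, y):
        self.labels_ = sorted(set(y))
        y = np.asarray(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.labels_])
        return self

    def predict(self, X):
        # Squared Euclidean distance from each sample to each class centroid.
        d = ((X[:, None, :] - self.centroids_[None, :, :]) ** 2).sum(axis=2)
        return [self.labels_[i] for i in d.argmin(axis=1)]

# Assumed valence mapping (not given in the abstract).
VALENCE = {"happiness": "positive", "surprise": "positive",
           "sadness": "negative", "fear": "negative",
           "anger": "negative", "disgust": "negative"}

class HierarchicalEmotionClassifier:
    """Stage 1: positive vs. negative valence.
    Stage 2: exact emotion, using a classifier trained only on that valence."""
    def fit(self, X, emotions):
        X = np.asarray(X, dtype=float)
        valences = [VALENCE[e] for e in emotions]
        self.valence_clf = NearestCentroid().fit(X, valences)
        self.emotion_clfs = {}
        for v in ("positive", "negative"):
            idx = [i for i, val in enumerate(valences) if val == v]
            self.emotion_clfs[v] = NearestCentroid().fit(
                X[idx], [emotions[i] for i in idx])
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        vals = self.valence_clf.predict(X)
        return [self.emotion_clfs[v].predict(x[None, :])[0]
                for v, x in zip(vals, X)]
```

The design point the abstract emphasizes is that splitting the six-way problem into a binary valence decision plus two smaller within-valence decisions reduces the number of classes each stage must separate, which helps when training samples are scarce.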