Abstract
A complex emotion is an aggregate of two or more basic emotions and exhibits highly variable appearance, inter-dependence, and affective dynamics. These properties make it difficult to recognize with existing techniques such as action unit or valence-arousal detection. In this study, we propose a bionic two-system architecture for complex emotion recognition that mimics how the human brain responds to decision-making problems. System I is a fast compound sensing module; System II is a slower cognitive decision module that integrates the sensed information. System I contains one branch for facial expression representation, covering basic emotion, action unit, and valence-arousal detection, and one for physiological measurement, implemented image-only for practicality. In System II, a segmentation-based decision module ensures that the chosen period contains the emotion occurrence and iteratively refines the emotion information within each segment via reinforcement learning. The proposed method outperforms the state of the art on emotion recognition tasks, achieving an accuracy of 94.15% for basic emotion recognition on BP4D and 68.75% for binary valence-arousal classification on DEAP. For a subset of complex emotions, recognition accuracy exceeds 70% on both databases, a significant improvement.
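The sketch below illustrates the two-system pipeline as described in the abstract: a fast sensing stage with a facial-expression branch and an image-only physiological branch (System I), followed by a slower decision stage that selects and classifies the emotion-bearing segment (System II). All module names, feature dimensions, the soft segment selection, and the fusion strategy are assumptions for illustration only; the paper's actual architecture, including its reinforcement-learning policy, may differ.

```python
# Minimal sketch of the two-system structure; names and dimensions are hypothetical.
import torch
import torch.nn as nn

class SystemI(nn.Module):
    """Fast compound sensing: a facial-expression branch (basic emotion,
    action units, valence-arousal) and an image-only physiological branch
    (e.g. rPPG-derived features). Both branches are placeholder MLPs over
    precomputed per-frame features."""
    def __init__(self, face_dim=256, phys_dim=64, out_dim=128):
        super().__init__()
        self.face_branch = nn.Sequential(nn.Linear(face_dim, out_dim), nn.ReLU())
        self.phys_branch = nn.Sequential(nn.Linear(phys_dim, out_dim), nn.ReLU())

    def forward(self, face_feats, phys_feats):
        # Fuse the two sensing streams frame by frame.
        return torch.cat([self.face_branch(face_feats),
                          self.phys_branch(phys_feats)], dim=-1)

class SystemII(nn.Module):
    """Slower cognitive decision module: scores temporal positions so the
    selected segment covers the emotion occurrence, then classifies it.
    A reinforcement-learning policy (not shown) would iteratively refine
    the segment choice."""
    def __init__(self, feat_dim=256, n_classes=8):
        super().__init__()
        self.segment_scorer = nn.Linear(feat_dim, 1)   # which frames hold the emotion
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, fused_seq):                      # (T, feat_dim)
        weights = torch.softmax(self.segment_scorer(fused_seq), dim=0)
        pooled = (weights * fused_seq).sum(dim=0)      # soft segment selection
        return self.classifier(pooled)

# Example: 30 frames of hypothetical face (256-d) and physiological (64-d) features.
sys1, sys2 = SystemI(), SystemII()
face, phys = torch.randn(30, 256), torch.randn(30, 64)
logits = sys2(sys1(face, phys))
print(logits.shape)  # torch.Size([8])
```

In this reading, System I produces per-frame compound features cheaply, while System II spends its computation on deciding which temporal segment to trust before classifying, mirroring the fast/slow division the abstract describes.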
Original language | English |
---|---|
Pages (from-to) | 1-14 |
Number of pages | 14 |
Journal | IEEE Transactions on Affective Computing |
DOIs | |
State | Accepted/In press - 2023 |
Keywords
- action unit detection
- complex emotion recognition
- Emotion recognition
- Face recognition
- Faces
- facial expression detection
- Gold
- heart rate variability
- Physiology
- reinforcement learning
- remote photoplethysmography
- Sensors
- Task analysis
- valence-arousal detection