EEG-based emotion recognition in music listening

Yuan Pin Lin, Chi Hong Wang, Tzyy Ping Jung*, Tien Lin Wu, Shyh Kang Jeng, Jeng Ren Duann, Jyh Horng Chen

*Corresponding author for this work

Research output: Article › peer-review

536 Citations (Scopus)

Abstract

Ongoing brain activity can be recorded as an electroencephalogram (EEG) to discover the links between emotional states and brain activity. This study applied machine-learning algorithms to categorize EEG dynamics according to subject self-reported emotional states during music listening. A framework was proposed to optimize EEG-based emotion recognition by systematically 1) seeking emotion-specific EEG features and 2) exploring the efficacy of the classifiers. A support vector machine was employed to classify four emotional states (joy, anger, sadness, and pleasure) and obtained an average classification accuracy of 82.29% ± 3.06% across 26 subjects. Further, this study identified 30 subject-independent features that were most relevant to emotional processing across subjects and explored the feasibility of using fewer electrodes to characterize the EEG dynamics during music listening. The identified features were primarily derived from electrodes placed near the frontal and the parietal lobes, consistent with many of the findings in the literature. This study might lead to a practical system for noninvasive assessment of emotional states in everyday or clinical applications.
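The classification step described in the abstract — a multi-class support vector machine over a compact set of per-trial EEG features — can be sketched as follows. This is an illustrative sketch only, not the authors' pipeline: the synthetic Gaussian "feature" data, the linear kernel, and the 30-dimensional feature count (borrowed from the abstract) are assumptions for demonstration.

```python
# Illustrative sketch (NOT the authors' method): classify four emotion labels
# from per-trial feature vectors with a multi-class SVM.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
EMOTIONS = ["joy", "anger", "sadness", "pleasure"]

# Synthetic stand-in for 30 EEG-derived features per trial: each emotion
# class is a Gaussian cluster with a slightly shifted mean (assumption).
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(50, 30)) for i in range(4)])
y = np.repeat(np.arange(4), 50)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0
)

# Standardize features, then fit an SVM (scikit-learn's SVC handles
# multi-class problems via one-vs-one voting by default).
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
clf.fit(X_tr, y_tr)

acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.2%}")
```

On real EEG data the features would come from spectral power in standard frequency bands per electrode, and accuracy would be estimated with cross-validation per subject rather than a single split.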

Original language: English
Article number: 5458075
Pages (from-to): 1798-1806
Number of pages: 9
Journal: IEEE Transactions on Biomedical Engineering
Volume: 57
Issue number: 7
DOIs
Publication status: Published - July 2010
