Enhancing Low-Density EEG-Based Brain-Computer Interfacing With Similarity-Keeping Knowledge Distillation

Xin Yao Huang, Sung Yu Chen, Chun Shu Wei

Research output: Article, peer-reviewed


Electroencephalogram (EEG) has been one of the common neuromonitoring modalities for real-world brain-computer interfaces (BCIs) because of its non-invasiveness, low cost, and high temporal resolution. In recent years, the emergence of lightweight and portable EEG wearable devices with low-density montages has significantly increased the convenience and usability of BCI applications. However, the use of low-density EEG montages often leads to a loss in EEG decoding performance due to the reduced number of electrodes and limited coverage of scalp regions. To address this issue, we introduce knowledge distillation (KD), a learning mechanism developed for transferring information between neural network models, to enhance the performance of low-density EEG decoding. Our framework includes a newly proposed similarity-keeping (SK) teacher-student KD scheme that allows a low-density EEG student model to acquire the inter-sample similarity from a pre-trained teacher model trained on high-density EEG data. The experimental results validate that our SK-KD framework consistently improves motor-imagery EEG decoding accuracy for low-density EEG data and outperforms other KD methods across various model architectures. As the first KD scheme developed for enhancing EEG decoding, we foresee the proposed SK-KD framework facilitating the practicality of low-density EEG-based BCI in real-world applications.
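To make the similarity-keeping idea concrete, here is a minimal sketch of one plausible form of such a loss, in the spirit of similarity-preserving knowledge distillation: the student is penalized for deviating from the teacher's row-normalized inter-sample similarity (Gram) matrix computed over a batch. This is an illustration only; the function name, the NumPy formulation, and the exact normalization are assumptions, not the paper's reference implementation.

```python
import numpy as np

def similarity_keeping_loss(teacher_feats, student_feats):
    """Illustrative similarity-keeping KD loss (assumed form).

    teacher_feats, student_feats: (batch, features) activations from the
    high-density teacher and low-density student. The loss compares the
    row-normalized inter-sample similarity (Gram) matrices, so the
    student learns the teacher's pairwise sample relations rather than
    matching its raw features.
    """
    g_t = teacher_feats @ teacher_feats.T  # (B, B) teacher similarities
    g_s = student_feats @ student_feats.T  # (B, B) student similarities
    # Row-wise L2 normalization; epsilon guards against zero rows.
    g_t = g_t / (np.linalg.norm(g_t, axis=1, keepdims=True) + 1e-12)
    g_s = g_s / (np.linalg.norm(g_s, axis=1, keepdims=True) + 1e-12)
    b = teacher_feats.shape[0]
    return np.sum((g_t - g_s) ** 2) / (b * b)  # mean squared difference
```

In a full training loop, a term like this would typically be added to the student's task loss (e.g., cross-entropy on motor-imagery labels) with a weighting hyperparameter.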

Pages (from-to): 1-11
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence
Publication status: Accepted/In press - 2023

