TY - JOUR
T1 - Enhancing Low-Density EEG-Based Brain-Computer Interfacing With Similarity-Keeping Knowledge Distillation
AU - Huang, Xin-Yao
AU - Chen, Sung-Yu
AU - Wei, Chun-Shu
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Electroencephalogram (EEG) has been one of the common neuromonitoring modalities for real-world brain-computer interfaces (BCIs) because of its non-invasiveness, low cost, and high temporal resolution. In recent years, the emergence of lightweight and portable EEG wearable devices with low-density montages has significantly increased the convenience and usability of BCI applications. However, the use of low-density EEG montages often leads to a loss in EEG decoding performance due to the reduced number of electrodes and limited coverage of scalp regions. To address this issue, we introduce knowledge distillation (KD), a learning mechanism developed for transferring information between neural network models, to enhance the performance of low-density EEG decoding. Our framework includes a newly proposed similarity-keeping (SK) teacher-student KD scheme that allows a low-density EEG student model to acquire the inter-sample similarity from a pre-trained teacher model trained on high-density EEG data. The experimental results validate that our SK-KD framework consistently improves motor-imagery EEG decoding accuracy for low-density EEG data and outperforms other KD methods across various model architectures. As the first KD scheme developed for enhancing EEG decoding, we foresee the proposed SK-KD framework facilitating the practicality of low-density EEG-based BCI in real-world applications.
KW - Brain-computer interface (BCI)
KW - Electroencephalogram (EEG)
KW - Knowledge distillation (KD)
UR - http://www.scopus.com/inward/record.url?scp=85181563972&partnerID=8YFLogxK
U2 - 10.1109/TETCI.2023.3335943
DO - 10.1109/TETCI.2023.3335943
M3 - Article
AN - SCOPUS:85181563972
SN - 2471-285X
VL - 8
SP - 1156
EP - 1166
JO - IEEE Transactions on Emerging Topics in Computational Intelligence
JF - IEEE Transactions on Emerging Topics in Computational Intelligence
IS - 2
ER -