Decoding neural representations of rhythmic sounds from magnetoencephalography

Pei Chun Chang, Jia Ren Chang, Po Yu Chen, Li Kai Cheng, Jen-Chuen Hsieh, Hsin Yen Yu, Li-Fen Chen, Yong-Sheng Chen

Research output: Conference article › peer reviewed

1 citation (Scopus)


Neuroscience studies have revealed neural processes involved in rhythm perception, suggesting that the brain encodes rhythmic sounds and embeds this information in neural activity. In this work, we investigate how to extract rhythmic information embedded in brain responses and how to decode the original audio waveforms from the extracted information. A spatiotemporal convolutional neural network is adopted to extract compact rhythm-related representations from noninvasively measured magnetoencephalographic (MEG) signals evoked by listening to rhythmic sounds. These learned MEG representations are then used to condition an audio generator network that synthesizes the original rhythmic sounds. In the experiments, we evaluated the proposed method on MEG signals recorded from eight participants and demonstrated that the generated rhythms are highly related to those that evoked the MEG signals. Interestingly, we found that the auditory-related MEG channels are highly important for encoding rhythmic representations, that the distribution of these representations relates to the timing of beats, and that behavioral performance is consistent with neural decoding performance. These results suggest that the proposed method can synthesize rhythms by decoding neural representations from MEG.
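The core encoding step described above — a spatiotemporal network that reduces multichannel MEG trials to compact rhythm-related features — can be illustrated with a toy sketch. Everything here is assumed for illustration (channel count, trial length, kernel, and the two-stage spatial-then-temporal design); it is not the authors' architecture, only a minimal NumPy analogue of a spatial projection followed by a strided temporal convolution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for one MEG trial: 64 channels x 200 time samples.
# (Illustrative shapes only; the paper's channel count, sampling rate,
# and learned network differ.)
meg = rng.standard_normal((64, 200))

def spatiotemporal_encode(x, n_spatial=8, kernel=None, stride=4):
    """Toy spatiotemporal encoder: a spatial projection that mixes
    channels, then a strided temporal convolution with a ReLU,
    yielding a compact feature map. A sketch of the idea only."""
    n_ch, n_t = x.shape
    # Spatial stage: project 64 sensors down to n_spatial virtual channels
    # (random weights here; in the paper these are learned).
    w_spatial = rng.standard_normal((n_spatial, n_ch)) / np.sqrt(n_ch)
    s = w_spatial @ x                        # shape (n_spatial, n_t)
    # Temporal stage: strided 1-D convolution with a smoothing kernel.
    if kernel is None:
        kernel = np.hanning(9)
    feats = []
    for row in s:
        conv = np.convolve(row, kernel, mode="valid")[::stride]
        feats.append(np.maximum(conv, 0.0))  # ReLU nonlinearity
    return np.stack(feats)                   # shape (n_spatial, n_t_out)

z = spatiotemporal_encode(meg)
print(z.shape)  # compact representation: far fewer values than the raw trial
```

In the paper, a representation like `z` would then condition the audio generator network; here it simply demonstrates how a spatial filter plus a strided temporal convolution compresses a 64 x 200 trial into a small rhythm-scale feature map.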

Pages (from–to): 1280-1284
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Publication status: Published - June 2021
Event: 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021 - Virtual, Toronto, Canada
Duration: 6 June 2021 – 11 June 2021

