Continuous-Time Attention for Sequential Learning

Jen Tzung Chien, Yi Hsiang Chen

Research output: Conference contribution › Peer-reviewed

15 Citations (Scopus)

Abstract

The attention mechanism is crucial for sequential learning, where a wide range of applications have been successfully developed. This mechanism is basically trained to spotlight the region of interest in the hidden states of sequence data. Most attention methods compute the attention score by relating a query to a sequence represented as a discrete-time state trajectory. Such discrete-time attention cannot directly attend to a continuous-time trajectory, which is represented via a neural differential equation (NDE) combined with a recurrent neural network. This paper presents a new continuous-time attention method for sequential learning that is tightly integrated with the NDE to construct an attentive continuous-time state machine. Continuous-time attention is performed at all times over the hidden states for different kinds of irregularly sampled time signals. The missing information in sequence data due to sampling loss, especially in the presence of long sequences, can be seamlessly compensated and attended in representation learning. Experiments on irregular sequence samples from human activities, dialogue sentences and medical features show the merits of the proposed continuous-time attention for activity recognition, sentiment classification and mortality prediction, respectively.
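The core idea in the abstract can be illustrated with a minimal NumPy sketch, not the paper's implementation: a simple Euler integrator stands in for a learned NDE solver, the hidden-state trajectory is evaluated on a dense time grid to approximate attending "at all times", and standard dot-product attention pools the trajectory with a query. All function names, the toy linear dynamics, and the grid size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over attention scores.
    e = np.exp(x - x.max())
    return e / e.sum()

def ode_trajectory(h0, A, t_grid):
    """Euler-integrate the toy dynamics dh/dt = A h (a stand-in for a
    learned neural differential equation) and return the hidden states
    evaluated at every point of t_grid."""
    states = [h0]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        h = states[-1]
        states.append(h + (t1 - t0) * (A @ h))
    return np.stack(states)

def continuous_time_attention(query, states):
    """Scaled dot-product attention over the densely sampled trajectory:
    the dense grid approximates attending the continuous-time states."""
    scores = states @ query / np.sqrt(len(query))
    weights = softmax(scores)           # one weight per time point
    context = weights @ states          # attention-pooled representation
    return context, weights

d = 4
h0 = rng.standard_normal(d)
A = -0.5 * np.eye(d)                    # stable toy dynamics
t_grid = np.linspace(0.0, 1.0, 50)      # dense grid over the interval
states = ode_trajectory(h0, A, t_grid)  # shape (50, d)
query = rng.standard_normal(d)
context, weights = continuous_time_attention(query, states)
```

Because the trajectory is defined by the ODE rather than by the observation times, the same attention can be evaluated between irregular samples, which is the property the abstract highlights for irregular time signals.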

Original language: English
Host publication title: 35th AAAI Conference on Artificial Intelligence, AAAI 2021
Publisher: Association for the Advancement of Artificial Intelligence
Pages: 7116-7124
Number of pages: 9
ISBN (electronic): 9781713835974
Publication status: Published - 2021
Event: 35th AAAI Conference on Artificial Intelligence, AAAI 2021 - Virtual, Online
Duration: 2 Feb 2021 → 9 Feb 2021

Publication series

Name: 35th AAAI Conference on Artificial Intelligence, AAAI 2021
8B

Conference

Conference: 35th AAAI Conference on Artificial Intelligence, AAAI 2021
City: Virtual, Online
Period: 2/02/21 → 9/02/21
