Continuous-time self-attention in neural differential equation

Jen-Tzung Chien, Yi-Hsiang Chen

Research output: Conference article › Peer-reviewed

5 citations (Scopus)

Abstract

The neural differential equation (NDE) has recently been developed as a continuous-time state machine that can faithfully represent irregularly-sampled sequence data. NDE can be seen as a substantial extension of the recurrent neural network (RNN), which conducts discrete-time state representation for regularly-sampled data. This study presents a new continuous-time attention to improve sequential learning, where the region of interest in the continuous-time state trajectory, over observed as well as missing samples, is sufficiently attended. However, the attention score, calculated by relating a query to a sequence, is memory demanding because self-attention must treat all time observations as query vectors and feed them into the ordinary differential equation (ODE) solver. To deal with this issue, we develop a new form of dynamics for continuous-time attention in which the causality property is adopted, so that only query vectors up to the current time are fed into the ODE solver. Experiments on irregularly-sampled human activities and medical features show that this method obtains desirable performance with efficient memory consumption.
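To make the causality idea in the abstract concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): the ODE dynamics at time t attend only over observations with timestamps up to t, so later queries never enter the solver prematurely. All names, dimensions, and the fixed-step Euler integrator are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions and data (illustrative; not from the paper).
d = 4                                  # hidden state size
ts = np.linspace(0.0, 1.0, 6)          # observation timestamps (could be irregular)
obs = rng.normal(size=(len(ts), d))    # observations used as keys/queries

W = rng.normal(scale=0.1, size=(d, d)) # dynamics weight (illustrative)

def causal_attention(h, t):
    """Attend only over observations with timestamp <= t (causality)."""
    keys = obs[ts <= t]                    # (m, d): past and current observations
    scores = keys @ h                      # dot-product scores against the state
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ keys                  # context vector, shape (d,)

def dynamics(h, t):
    """dh/dt combines a linear term with the causal attention context."""
    return np.tanh(W @ h + causal_attention(h, t))

# Fixed-step Euler integration as a simple stand-in for a full ODE solver.
h = np.zeros(d)
t_grid = np.linspace(0.0, 1.0, 50)
for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
    h = h + (t1 - t0) * dynamics(h, t0)

print(h.shape)  # final continuous-time state
```

Because the attention mask grows with t rather than spanning the whole sequence, the solver never needs all query vectors in memory at once, which is the memory saving the abstract points to.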

Original language: English
Pages (from-to): 3290-3294
Number of pages: 5
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2021-June
DOIs
Publication status: Published - Jun 2021
Event: 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021 - Virtual, Toronto, Canada
Duration: 6 Jun 2021 - 11 Jun 2021
