Continuous-time self-attention in neural differential equation

Jen-Tzung Chien, Yi-Hsiang Chen

Research output: Contribution to journal › Conference article › peer-review


Abstract

The neural differential equation (NDE) has recently been developed as a continuous-time state machine that can faithfully represent irregularly-sampled sequence data. The NDE is a substantial extension of the recurrent neural network (RNN), which conducts discrete-time state representation for regularly-sampled data. This study presents a new continuous-time attention to improve sequential learning, in which the region of interest in the continuous-time state trajectory, over both observed and missing samples, is sufficiently attended. However, the attention score, calculated by relating a query to a sequence, is memory demanding, because self-attention has to treat all time observations as query vectors and feed them into the ordinary differential equation (ODE) solver. To deal with this issue, we develop a new form of dynamics for continuous-time attention in which the causality property is adopted, so that only the query vectors up to the current time are fed into the ODE solver. Experiments on irregularly-sampled human activities and medical features show that this method achieves desirable performance with efficient memory consumption.
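
The causal mechanism the abstract describes can be sketched in code. Below is a minimal illustration, assuming PyTorch and the torchdiffeq package; the class name CausalAttentionODE, the two-layer dynamics network, the hard time mask, and all dimensions are hypothetical stand-ins, not the authors' implementation.

```python
# A minimal sketch of causal continuous-time attention inside a neural ODE,
# assuming PyTorch and torchdiffeq. Names and sizes are illustrative only.
import torch
import torch.nn as nn
from torchdiffeq import odeint


class CausalAttentionODE(nn.Module):
    """Dynamics dh/dt = f(h(t), c(t)), where the context c(t) attends only
    over observations whose timestamps are <= t, so the ODE solver never
    consumes future query/key pairs."""

    def __init__(self, hidden_dim, obs_times, obs_values):
        super().__init__()
        self.obs_times = obs_times                    # (N,) irregular timestamps
        self.obs_values = obs_values                  # (N, D) observed features
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(obs_values.size(-1), hidden_dim)
        self.value = nn.Linear(obs_values.size(-1), hidden_dim)
        self.f = nn.Sequential(
            nn.Linear(2 * hidden_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, hidden_dim))

    def forward(self, t, h):
        # Causality: mask out observations arriving after the current
        # integration time t (a hard mask here; a smooth gate would be
        # kinder to adaptive step-size solvers).
        mask = self.obs_times <= t                    # (N,) boolean
        q = self.query(h)                             # (H,)
        k = self.key(self.obs_values)                 # (N, H)
        v = self.value(self.obs_values)               # (N, H)
        scores = (k @ q) / k.size(-1) ** 0.5          # (N,)
        if mask.any():
            attn = torch.softmax(scores.masked_fill(~mask, float('-inf')), dim=0)
            c = attn @ v                              # attended context (H,)
        else:
            c = torch.zeros_like(q)                   # no past observations yet
        return self.f(torch.cat([h, c], dim=-1))      # dh/dt


# Usage: integrate the attended hidden state over irregular sample times.
torch.manual_seed(0)
obs_times = torch.tensor([0.1, 0.4, 0.9, 1.7])
obs_values = torch.randn(4, 3)
func = CausalAttentionODE(8, obs_times, obs_values)
h0 = torch.zeros(8)
t_eval = torch.tensor([0.0, 0.5, 1.0, 2.0])
h_traj = odeint(func, h0, t_eval)                     # (4, 8) state trajectory
print(h_traj.shape)
```

Restricting attention to past observations is what keeps memory bounded in this sketch: the solver carries a single query for the current state rather than the full set of query vectors over all time points.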

Original language: English
Pages (from-to): 3290-3294
Number of pages: 5
Journal: ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
Volume: 2021-June
State: Published - Jun 2021
Event: 2021 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2021 - Virtual, Toronto, Canada
Duration: 6 Jun 2021 to 11 Jun 2021

Keywords

  • Attention mechanism
  • Causal attention
  • Neural differential equation
  • Sequential learning
