Stochastic Convolutional Recurrent Networks

Jen-Tzung Chien, Yu Min Huang

Research output: Conference contribution › peer-review

4 Citations (Scopus)

Abstract

Recurrent neural networks (RNNs) have been widely used for sequential learning and have achieved great success in different tasks. The temporal convolutional network (TCN), a variant of the one-dimensional convolutional neural network (CNN), was also developed for sequential learning in the presence of sequence data. RNN and TCN typically capture long-term and short-term features in the temporal or spatial domain, respectively. This paper presents a new sequential learning model, called the convolutional recurrent network (CRN), which employs a TCN as an encoder and an RNN as a decoder so that global semantics as well as local dependencies are simultaneously characterized from sequence data. To facilitate interpretation and robustness in neural models, we further develop stochastic modeling for the CRN based on variational inference. The merits of CNN and RNN are then incorporated in the inference of a latent space, which produces a generative model for sequential prediction. Experiments on language modeling show the effectiveness of the stochastic CRN when compared with other sequential machines.
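The abstract describes a TCN encoder, a variational latent space, and an RNN decoder. The following is a minimal PyTorch sketch of that general structure, not the authors' implementation: all layer sizes, the dilated-convolution encoder, the Gaussian latent variable with reparameterization, and the GRU decoder are illustrative assumptions.

```python
import torch
import torch.nn as nn

class StochasticCRN(nn.Module):
    """Illustrative stochastic convolutional recurrent network:
    a dilated 1-D convolutional (TCN-style) encoder, a Gaussian latent
    variable inferred per time step, and a GRU decoder. Hyperparameters
    and layer choices are assumptions, not taken from the paper."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, latent_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # TCN-style encoder: stacked dilated 1-D convolutions.
        self.encoder = nn.Sequential(
            nn.Conv1d(embed_dim, hidden_dim, kernel_size=3, padding=2, dilation=1),
            nn.ReLU(),
            nn.Conv1d(hidden_dim, hidden_dim, kernel_size=3, padding=4, dilation=2),
            nn.ReLU(),
        )
        # Variational inference: map encoder features to Gaussian parameters.
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        # RNN decoder consumes the sampled latent sequence.
        self.decoder = nn.GRU(latent_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        x = self.embed(tokens).transpose(1, 2)      # (B, E, T)
        h = self.encoder(x)[..., :tokens.size(1)]   # trim extra padded steps -> (B, H, T)
        h = h.transpose(1, 2)                       # (B, T, H)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        # Reparameterization trick: z = mu + sigma * eps.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        dec, _ = self.decoder(z)
        logits = self.out(dec)
        # KL divergence of the approximate posterior from a standard normal prior,
        # used as the regularization term of the evidence lower bound.
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return logits, kl
```

Training would minimize the token cross-entropy from `logits` plus the `kl` term, following the usual variational objective for such generative sequence models.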

Original language: English
Title of host publication: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728169262
DOIs
Publication status: Published - July 2020
Event: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Virtual, Glasgow, United Kingdom
Duration: 19 July 2020 - 24 July 2020

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks

Conference

Conference: 2020 International Joint Conference on Neural Networks, IJCNN 2020
Country/Territory: United Kingdom
City: Virtual, Glasgow
Period: 19/07/20 - 24/07/20
