Stochastic Convolutional Recurrent Networks

Jen-Tzung Chien, Yu Min Huang

Research output: Conference contribution, peer-reviewed

4 citations (Scopus)


Recurrent neural networks (RNNs) have been widely used for sequential learning and have achieved great success in a variety of tasks. The temporal convolutional network (TCN), a variant of the one-dimensional convolutional neural network (CNN), was also developed for sequential learning from sequence data. RNNs and TCNs typically capture long-term and short-term features in the temporal and spatial domains, respectively. This paper presents a new sequential learning model, called the convolutional recurrent network (CRN), which employs a TCN as the encoder and an RNN as the decoder, so that the global semantics as well as the local dependencies of sequence data are characterized simultaneously. To improve the interpretation and robustness of neural models, we further develop a stochastic formulation of the CRN based on variational inference. The merits of the CNN and the RNN are then incorporated in the inference of a latent space, which yields a generative model for sequential prediction. Experiments on language modeling show the effectiveness of the stochastic CRN compared with other sequential machines.
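The architecture described above — a TCN encoder summarizing the sequence, a variational Gaussian latent code, and an RNN decoder conditioned on that code — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; all layer sizes, the GRU decoder choice, the mean-pooled summary, and the way the latent code seeds the decoder's initial hidden state are assumptions.

```python
import torch
import torch.nn as nn

class StochasticCRN(nn.Module):
    """Hypothetical sketch: TCN encoder -> Gaussian latent -> RNN (GRU) decoder."""

    def __init__(self, vocab_size=100, embed_dim=32, hidden_dim=64, latent_dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Dilated 1-D convolutions over time play the role of the TCN encoder.
        self.tcn = nn.Sequential(
            nn.Conv1d(embed_dim, hidden_dim, kernel_size=3, padding=2, dilation=1),
            nn.ReLU(),
            nn.Conv1d(hidden_dim, hidden_dim, kernel_size=3, padding=4, dilation=2),
            nn.ReLU(),
        )
        # Amortized variational inference: map the encoder summary to a Gaussian.
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        # The latent code conditions the GRU decoder via its initial hidden state.
        self.latent_to_h = nn.Linear(latent_dim, hidden_dim)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens):
        x = self.embed(tokens)                      # (B, T, E)
        h = self.tcn(x.transpose(1, 2))             # (B, H, T') — convolve over time
        h = h[:, :, : tokens.size(1)]               # trim the extra padded steps
        summary = h.mean(dim=2)                     # global sequence summary (B, H)
        mu, logvar = self.to_mu(summary), self.to_logvar(summary)
        # Reparameterization trick: sample z while keeping gradients.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        h0 = torch.tanh(self.latent_to_h(z)).unsqueeze(0)  # (1, B, H)
        dec, _ = self.decoder(x, h0)
        logits = self.out(dec)                      # per-step next-token logits
        # KL divergence of q(z|x) from the standard-normal prior.
        kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1).mean()
        return logits, kl

model = StochasticCRN()
tokens = torch.randint(0, 100, (4, 12))
logits, kl = model(tokens)
```

Training would minimize the usual evidence lower bound: the decoder's cross-entropy reconstruction loss plus the KL term returned above.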

Host publication title: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Publication status: Published - July 2020
Event: 2020 International Joint Conference on Neural Networks, IJCNN 2020 - Virtual, Glasgow, United Kingdom
Duration: 19 July 2020 - 24 July 2020


Publication series: Proceedings of the International Joint Conference on Neural Networks


Conference: 2020 International Joint Conference on Neural Networks, IJCNN 2020
Country/Territory: United Kingdom
City: Virtual, Glasgow

