Meta-Transfer Learning for Low-Resource Abstractive Summarization

Yi Syuan Chen, Hong Han Shuai

Research output: Conference contribution › Peer-reviewed

30 Citations (Scopus)

Abstract

Neural abstractive summarization has been studied extensively and achieves great success with the aid of large corpora. However, when encountering novel tasks, one may not always benefit from transfer learning due to the domain shift problem, and overfitting can occur without adequate labeled examples. Furthermore, annotations for abstractive summarization are costly, since domain knowledge is often required to ensure ground-truth quality. There is thus a growing appeal for Low-Resource Abstractive Summarization, which aims to leverage past experience to improve performance given only limited labeled examples from the target corpus. In this paper, we propose to tackle this problem with two knowledge-rich sources: large pre-trained models and diverse existing corpora. The former provides the primary ability to perform summarization; the latter helps discover common syntactic or semantic information that improves generalization. We conduct extensive experiments on summarization corpora with various writing styles and forms. The results demonstrate that our approach achieves state-of-the-art performance on 6 corpora in low-resource scenarios, with only 0.7% of the trainable parameters used in previous work.
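
The abstract describes the method only at a high level. For intuition, below is a minimal sketch of first-order MAML-style meta-transfer learning in PyTorch, the general technique the title refers to. Every name here (SummarizerAdapter, task_loss, the toy data, the learning rates) is an illustrative assumption, not the authors' code; the 0.7% trainable-parameter figure suggests that only a small adapter over a frozen pre-trained model is meta-learned, which this sketch mirrors.

```python
# Hypothetical sketch of first-order MAML-style meta-transfer learning.
# Not the paper's implementation: module, loss, and data are stand-ins.
import copy
import torch
import torch.nn as nn

class SummarizerAdapter(nn.Module):
    """Small trainable adapter; stands in for the meta-learned parameters
    that would sit on top of a frozen pre-trained summarizer."""
    def __init__(self, dim=16):
        super().__init__()
        self.proj = nn.Linear(dim, dim)

    def forward(self, x):
        return self.proj(x)

def task_loss(model, batch):
    """Placeholder objective; a real system would use summarization NLL."""
    x, y = batch
    return nn.functional.mse_loss(model(x), y)

def sample_task_batches():
    """Toy stand-in for sampling support/query batches from one of the
    diverse source corpora used during meta-training."""
    x = torch.randn(8, 16)
    return (x, x), (torch.randn(8, 16), torch.randn(8, 16))

adapter = SummarizerAdapter()
meta_opt = torch.optim.Adam(adapter.parameters(), lr=1e-3)
inner_lr = 1e-2

for step in range(100):  # meta-training across source summarization tasks
    support, query = sample_task_batches()

    # Inner loop: adapt a copy of the adapter to the sampled task.
    fast = copy.deepcopy(adapter)
    loss = task_loss(fast, support)
    grads = torch.autograd.grad(loss, list(fast.parameters()))
    with torch.no_grad():
        for p, g in zip(fast.parameters(), grads):
            p -= inner_lr * g

    # Outer loop: evaluate the adapted weights on the query set.
    # First-order MAML copies the query gradients onto the meta-parameters.
    q_loss = task_loss(fast, query)
    q_grads = torch.autograd.grad(q_loss, list(fast.parameters()))
    meta_opt.zero_grad()
    for p, g in zip(adapter.parameters(), q_grads):
        p.grad = g.clone()
    meta_opt.step()
```

At test time, the meta-learned adapter is fine-tuned with the few labeled examples of the target corpus, which is the low-resource adaptation step the abstract motivates.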

Original language: English
Title of host publication: 35th AAAI Conference on Artificial Intelligence, AAAI 2021
Publisher: Association for the Advancement of Artificial Intelligence
Pages: 12692-12700
Number of pages: 9
ISBN (electronic): 9781713835974
DOIs
Publication status: Published - 2021
Event: 35th AAAI Conference on Artificial Intelligence, AAAI 2021 - Virtual, Online
Duration: 2 Feb 2021 → 9 Feb 2021

Publication series

Name: 35th AAAI Conference on Artificial Intelligence, AAAI 2021
14A

Conference

Conference: 35th AAAI Conference on Artificial Intelligence, AAAI 2021
City: Virtual, Online
Period: 2/02/21 → 9/02/21
