General then Personal: Decoupling and Pre-training for Personalized Headline Generation

Yun Zhu Song, Yi Syuan Chen, Lu Wang, Hong Han Shuai

Research output: Article › peer-review

1 Citation (Scopus)

Abstract

Personalized Headline Generation aims to generate unique headlines tailored to users' browsing history. In this task, understanding user preferences from click history and incorporating them into headline generation pose challenges. Existing approaches typically rely on predefined styles as control codes, but personal style lacks explicit definition or enumeration, making it difficult to leverage traditional techniques. To tackle these challenges, we propose General Then Personal (GTP), a novel framework comprising user modeling, headline generation, and customization. We train the framework using tailored designs that emphasize two central ideas: (a) task decoupling and (b) model pre-training. With the decoupling mechanism separating the task into generation and customization, two mechanisms, i.e., information self-boosting and mask user modeling, are further introduced to facilitate the training and text control. Additionally, we introduce a new evaluation metric to address existing limitations. Extensive experiments conducted on the PENS dataset, considering both zero-shot and few-shot scenarios, demonstrate that GTP outperforms state-of-the-art methods. Furthermore, ablation studies and analysis emphasize the significance of decoupling and pre-training. Finally, human evaluation validates the effectiveness of our approaches.
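To make the decoupling idea concrete, below is a minimal, hypothetical PyTorch sketch of a "general then personal" two-stage pipeline: a user encoder summarizes click history, a general stage produces a user-agnostic headline representation, and a customization stage conditions it on the user vector. The module names (UserEncoder, GeneralGenerator, Personalizer), dimensions, and pooling choices are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the generation/customization decoupling described in
# the abstract; all module names and dimensions are assumptions for illustration.
import torch
import torch.nn as nn

class UserEncoder(nn.Module):
    """Pools embeddings of clicked headlines into a single user preference vector."""
    def __init__(self, emb_dim=128):
        super().__init__()
        self.proj = nn.Linear(emb_dim, emb_dim)

    def forward(self, clicked_headline_embs):        # (num_clicks, emb_dim)
        pooled = clicked_headline_embs.mean(dim=0)   # simple mean pooling
        return torch.tanh(self.proj(pooled))         # (emb_dim,)

class GeneralGenerator(nn.Module):
    """Stage 1: builds a user-agnostic headline representation from the article."""
    def __init__(self, emb_dim=128):
        super().__init__()
        self.encoder = nn.Linear(emb_dim, emb_dim)

    def forward(self, article_emb):                  # (emb_dim,)
        return torch.relu(self.encoder(article_emb))

class Personalizer(nn.Module):
    """Stage 2: customizes the general representation with the user vector."""
    def __init__(self, emb_dim=128):
        super().__init__()
        self.fuse = nn.Linear(2 * emb_dim, emb_dim)

    def forward(self, general_repr, user_vec):
        return self.fuse(torch.cat([general_repr, user_vec], dim=-1))

# Toy usage with random embeddings standing in for real text encoders.
emb_dim = 128
user_enc, gen, pers = UserEncoder(emb_dim), GeneralGenerator(emb_dim), Personalizer(emb_dim)
clicks = torch.randn(5, emb_dim)                 # 5 previously clicked headlines
article = torch.randn(emb_dim)                   # candidate news article
user_vec = user_enc(clicks)
general_repr = gen(article)                      # generation, decoupled from the user
personal_repr = pers(general_repr, user_vec)     # customization conditioned on the user
print(personal_repr.shape)                       # torch.Size([128])
```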

Original language: English
Pages (from-to): 1588-1607
Number of pages: 20
Journal: Transactions of the Association for Computational Linguistics
Volume: 11
DOIs
Publication status: Published - 2023
