TY - JOUR
T1 - General then Personal: Decoupling and Pre-training for Personalized Headline Generation
AU - Song, Yun-Zhu
AU - Chen, Yi-Syuan
AU - Wang, Lu
AU - Shuai, Hong-Han
N1 - Publisher Copyright:
© 2023 Association for Computational Linguistics.
PY - 2023
Y1 - 2023
AB - Personalized Headline Generation aims to generate unique headlines tailored to users’ browsing history. In this task, understanding user preferences from click history and incorporating them into headline generation pose challenges. Existing approaches typically rely on predefined styles as control codes, but personal style lacks explicit definition or enumeration, making it difficult to leverage traditional techniques. To tackle these challenges, we propose General Then Personal (GTP), a novel framework comprising user modeling, headline generation, and customization. We train the framework using tailored designs that emphasize two central ideas: (a) task decoupling and (b) model pre-training. With the decoupling mechanism separating the task into generation and customization, two mechanisms, i.e., information self-boosting and mask user modeling, are further introduced to facilitate the training and text control. Additionally, we introduce a new evaluation metric to address existing limitations. Extensive experiments conducted on the PENS dataset, considering both zero-shot and few-shot scenarios, demonstrate that GTP outperforms state-of-the-art methods. Furthermore, ablation studies and analysis emphasize the significance of decoupling and pre-training. Finally, the human evaluation validates the effectiveness of our approaches.
UR - http://www.scopus.com/inward/record.url?scp=85180447299&partnerID=8YFLogxK
U2 - 10.1162/tacl_a_00621
DO - 10.1162/tacl_a_00621
M3 - Article
AN - SCOPUS:85180447299
SN - 2307-387X
VL - 11
SP - 1588
EP - 1607
JO - Transactions of the Association for Computational Linguistics
JF - Transactions of the Association for Computational Linguistics
ER -