General then Personal: Decoupling and Pre-training for Personalized Headline Generation

Yun-Zhu Song, Yi-Syuan Chen, Lu Wang, Hong-Han Shuai

Research output: Contribution to journal › Article › peer-review


Abstract

Personalized Headline Generation aims to generate unique headlines tailored to users’ browsing history. In this task, understanding user preferences from click history and incorporating them into headline generation pose challenges. Existing approaches typically rely on predefined styles as control codes, but personal style lacks an explicit definition or enumeration, making it difficult to leverage traditional techniques. To tackle these challenges, we propose General Then Personal (GTP), a novel framework comprising user modeling, headline generation, and customization. We train the framework with tailored designs that emphasize two central ideas: (a) task decoupling and (b) model pre-training. With the decoupling mechanism separating the task into generation and customization, two further mechanisms, information self-boosting and mask user modeling, are introduced to facilitate training and text control. Additionally, we introduce a new evaluation metric to address existing limitations. Extensive experiments conducted on the PENS dataset, considering both zero-shot and few-shot scenarios, demonstrate that GTP outperforms state-of-the-art methods. Furthermore, ablation studies and analyses emphasize the significance of decoupling and pre-training. Finally, a human evaluation validates the effectiveness of our approach.
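The abstract describes a two-stage, decoupled pipeline: a general headline is generated from the article alone, and a separate customization step then adapts it using a preference representation built from the user's click history. The sketch below illustrates that control flow only; the class names (GeneralGenerator, UserEncoder, Customizer) and their string-based placeholder logic are hypothetical stand-ins for the paper's neural components, not the authors' implementation.

```python
# Minimal sketch of a "general then personal" decoupled pipeline, as
# described in the GTP abstract. All components here are hypothetical
# placeholders, not the paper's actual models.

from dataclasses import dataclass
from typing import List, Set


@dataclass
class User:
    click_history: List[str]  # headlines the user previously clicked


class GeneralGenerator:
    """Stage 1 (generation): a user-agnostic headline from the article."""

    def generate(self, article: str) -> str:
        # Placeholder for a pre-trained seq2seq generator.
        return article.split(".")[0].strip()


class UserEncoder:
    """User modeling: summarize click history into a preference signal.

    The paper's "mask user modeling" pre-training would hide part of the
    click history and train the encoder to recover it; here we only
    extract a keyword set as a stand-in representation.
    """

    def encode(self, user: User) -> Set[str]:
        return {w.lower() for h in user.click_history for w in h.split()}


class Customizer:
    """Stage 2 (customization): adapt the general headline to the user."""

    def customize(self, headline: str, preferences: Set[str]) -> str:
        # Placeholder: a real customizer would condition generation on the
        # user representation; here we merely surface matched terms.
        hits = [w for w in headline.lower().split() if w in preferences]
        return f"{headline} [{' '.join(hits)}]" if hits else headline


def personalized_headline(article: str, user: User) -> str:
    """Decoupled pipeline: generate first, then customize."""
    general = GeneralGenerator().generate(article)
    return Customizer().customize(general, UserEncoder().encode(user))


if __name__ == "__main__":
    reader = User(click_history=["Electric cars hit record sales",
                                 "New battery tech unveiled"])
    print(personalized_headline("Battery startups race to cut costs. More inside.", reader))
```

The design choice the abstract emphasizes is visible in the signatures: GeneralGenerator never sees the user, and Customizer never sees the article, so generation quality and personalization can be trained and pre-trained separately.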

Original language: English
Pages (from-to): 1588-1607
Number of pages: 20
Journal: Transactions of the Association for Computational Linguistics
Volume: 11
DOIs
State: Published - 2023
