Prior art search and reranking for generated patent text

Jieh-Sheng Lee*, Jieh Hsiang

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review


Abstract

Generative models such as GPT-2 have recently demonstrated impressive results. A fundamental question we would like to address is: where did the generated text come from? This work is our initial effort toward answering that question by using prior art search. The purpose of the prior art search is to find the most similar prior text in the training data of GPT-2. We take a reranking approach and apply it to the patent domain. Specifically, we pre-train GPT-2 models from scratch using patent data from the USPTO. The input for the prior art search is the patent text generated by the GPT-2 model. We also pre-train BERT models from scratch for converting patent text to embeddings. The reranking steps are: (1) search for the most similar text in the training data of GPT-2 using a bag-of-words ranking approach (BM25), (2) convert the search results from text to BERT embeddings, and (3) produce the final result by ranking the BERT embeddings according to their similarity with the patent text generated by GPT-2. The experiments in this work show that such reranking is better than ranking with embeddings alone. However, our mixed results also indicate that calculating semantic similarities among long text spans remains challenging. To our knowledge, this work is the first to implement a reranking system that retrospectively identifies the most similar inputs to a GPT model based on its output.
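
The three-step reranking pipeline described in the abstract can be illustrated with a short Python sketch. This is not the authors' implementation: it assumes the rank_bm25 package for the BM25 stage and a generic "bert-base-uncased" checkpoint in place of the patent-specific BERT the paper pre-trains from scratch, and the corpus, generated text, and top_k value are placeholders.

    # Minimal sketch of the BM25 -> BERT-embedding reranking pipeline (illustrative only).
    # Assumptions not taken from the paper: rank_bm25 for BM25, a generic bert-base-uncased
    # checkpoint instead of the authors' patent-pretrained BERT, and [CLS] embeddings.
    import torch
    from rank_bm25 import BM25Okapi
    from transformers import AutoTokenizer, AutoModel

    corpus = [
        "A battery pack comprising a plurality of cells ...",      # placeholder training passages
        "A method for wireless charging of a mobile device ...",
        "An apparatus for cooling a lithium-ion battery ...",
    ]
    generated_text = "A battery module with a plurality of cooling channels ..."  # GPT-2 output

    # Step 1: bag-of-words ranking (BM25) over the whitespace-tokenized training corpus.
    bm25 = BM25Okapi([doc.lower().split() for doc in corpus])
    bm25_scores = bm25.get_scores(generated_text.lower().split())
    top_k = 2
    candidates = sorted(range(len(corpus)), key=lambda i: bm25_scores[i], reverse=True)[:top_k]

    # Step 2: convert the BM25 candidates and the generated text to BERT embeddings.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    def embed(text: str) -> torch.Tensor:
        """Return the [CLS] embedding of a text span."""
        inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
        with torch.no_grad():
            outputs = model(**inputs)
        return outputs.last_hidden_state[:, 0, :].squeeze(0)

    query_emb = embed(generated_text)

    # Step 3: rerank the BM25 candidates by cosine similarity to the generated text's embedding.
    reranked = sorted(
        candidates,
        key=lambda i: torch.cosine_similarity(query_emb, embed(corpus[i]), dim=0).item(),
        reverse=True,
    )
    for idx in reranked:
        print(corpus[idx])

In this sketch BM25 narrows the training data to a small candidate set, and the BERT embeddings are used only to reorder those candidates, mirroring the paper's finding that reranking outperforms ranking with embeddings alone.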

Original language: English
Pages (from-to): 18-24
Number of pages: 7
Journal: CEUR Workshop Proceedings
Volume: 2909
State: Published - 15 Jul 2021
Event: 2nd Workshop on Patent Text Mining and Semantic Technologies, PatentSemTech 2021 - Virtual, Online
Duration: 15 Jul 2021 → …

Keywords

  • Deep learning
  • Natural language generation
  • Natural language processing
  • Patent
  • Semantic search
