Phrase-BERT: Improved Phrase Embeddings from BERT with an Application to Corpus Exploration

09/13/2021
by Shufan Wang, et al.

Phrase representations derived from BERT often do not exhibit complex phrasal compositionality, as the model relies instead on lexical similarity to determine semantic relatedness. In this paper, we propose a contrastive fine-tuning objective that enables BERT to produce more powerful phrase embeddings. Our approach (Phrase-BERT) relies on a dataset of diverse phrasal paraphrases, which is automatically generated using a paraphrase generation model, as well as a large-scale dataset of phrases in context mined from the Books3 corpus. Phrase-BERT outperforms baselines across a variety of phrase-level similarity tasks, while also demonstrating increased lexical diversity between nearest neighbors in the vector space. Finally, as a case study, we show that Phrase-BERT embeddings can be easily integrated with a simple autoencoder to build a phrase-based neural topic model that interprets topics as mixtures of words and phrases by performing a nearest neighbor search in the embedding space. Crowdsourced evaluations demonstrate that this phrase-based topic model produces more coherent and meaningful topics than baseline word and phrase-level topic models, further validating the utility of Phrase-BERT.
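
As a rough illustration of the nearest-neighbor lookups described above, the sketch below loads a Phrase-BERT-style encoder through the sentence-transformers library and ranks candidate phrases by cosine similarity to a query phrase. This is not the authors' released code; the checkpoint id "whaleloops/phrase-bert" and the example phrases are assumptions used only for demonstration.

```python
# Minimal sketch (assumed setup, not the paper's official pipeline):
# embed phrases with a SentenceTransformer-style encoder and rank them
# by cosine similarity, mimicking the nearest-neighbor search in the
# embedding space described in the abstract.
from sentence_transformers import SentenceTransformer
from sklearn.metrics.pairwise import cosine_similarity

model = SentenceTransformer("whaleloops/phrase-bert")  # assumed checkpoint id

query = ["machine translation system"]
candidates = ["automatic translation software", "cold brew coffee", "neural network"]

# Encode the query phrase and the candidate phrases into fixed-size vectors.
query_vec = model.encode(query)       # shape: (1, dim)
cand_vecs = model.encode(candidates)  # shape: (len(candidates), dim)

# Rank candidates by cosine similarity to the query phrase.
sims = cosine_similarity(query_vec, cand_vecs)[0]
for phrase, score in sorted(zip(candidates, sims), key=lambda x: -x[1]):
    print(f"{score:.3f}  {phrase}")
```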

Related research

02/27/2022
UCTopic: Unsupervised Contrastive Learning for Phrase Representations and Topic Mining
High-quality phrase representations are essential to finding topics and ...

06/18/1999
Automatically Selecting Useful Phrases for Dialogue Act Tagging
We present an empirical investigation of various ways to automatically i...

08/30/2022
Combining keyphrase extraction and lexical diversity to characterize ideas in publication titles
Beyond bibliometrics, there is interest in characterizing the evolution ...

01/11/2022
D-Graph: AI-Assisted Design Concept Exploration Graph
We present an AI-assisted search tool, the "Design Concept Exploration G...

07/12/2022
Using Paraphrases to Study Properties of Contextual Embeddings
We use paraphrases as a unique source of data to analyze contextualized ...

07/21/2016
Exploring phrase-compositionality in skip-gram models
In this paper, we introduce a variation of the skip-gram model which joi...

08/09/2019
Using Semantic Role Knowledge for Relevance Ranking of Key Phrases in Documents: An Unsupervised Approach
In this paper, we investigate the integration of sentence position and s...
