ConGraT: Self-Supervised Contrastive Pretraining for Joint Graph and Text Embeddings

05/23/2023
by William Brannon, et al.

We propose ConGraT (Contrastive Graph-Text pretraining), a general, self-supervised method for jointly learning separate representations of texts and nodes in a parent (or "supervening") graph, where each text is associated with one of the nodes. Datasets fitting this paradigm are common, from social media (users and posts) to citation networks over articles to link graphs over web pages. We expand on prior work by providing a general, self-supervised, joint pretraining method that does not depend on a particular dataset structure or a specific task. Our method uses two separate encoders for graph nodes and texts, which are trained to align their representations within a common latent space. Training uses a batch-wise contrastive learning objective inspired by prior work on joint text and image encoding. Because graphs are more structured objects than images, we also extend the training objective to incorporate information about node similarity and plausible next guesses in matching nodes and texts. Experiments on several datasets show that ConGraT outperforms strong baselines on downstream tasks including node and text category classification and link prediction. Code and certain datasets are available at https://github.com/wwbrannon/congrat.
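To make the batch-wise contrastive objective concrete, below is a minimal sketch (not the authors' implementation) of a symmetric, CLIP-style InfoNCE loss over paired node and text embeddings, assuming PyTorch. The function and variable names (contrastive_graph_text_loss, node_emb, text_emb, temperature) are illustrative rather than taken from the ConGraT codebase, and the paper's extension that softens targets using graph-based node similarity is omitted here.

```python
# Hedged sketch of a batch-wise graph-text contrastive objective
# (CLIP-style symmetric InfoNCE), assuming PyTorch. Illustrative only.
import torch
import torch.nn.functional as F

def contrastive_graph_text_loss(node_emb: torch.Tensor,
                                text_emb: torch.Tensor,
                                temperature: float = 0.07) -> torch.Tensor:
    """Symmetric contrastive loss over a batch of (node, text) pairs.

    node_emb, text_emb: (batch_size, dim) outputs of the node and text
    encoders, where row i of each tensor comes from the same pair.
    """
    # Project both modalities onto the unit sphere before comparing.
    node_emb = F.normalize(node_emb, dim=-1)
    text_emb = F.normalize(text_emb, dim=-1)

    # Pairwise cosine similarities, scaled by the temperature.
    logits = node_emb @ text_emb.t() / temperature  # (batch, batch)

    # The matching text for each node (and vice versa) is on the diagonal.
    targets = torch.arange(logits.size(0), device=logits.device)

    # Cross-entropy in both directions: node->text and text->node.
    loss_n2t = F.cross_entropy(logits, targets)
    loss_t2n = F.cross_entropy(logits.t(), targets)
    return 0.5 * (loss_n2t + loss_t2n)
```

In this sketch the hard one-hot targets could be replaced by distributions derived from node similarity in the graph, which is the kind of extension the abstract describes for exploiting graph structure beyond exact pair matching.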


