G5: A Universal GRAPH-BERT for Graph-to-Graph Transfer and Apocalypse Learning

06/11/2020
by Jiawei Zhang, et al.

The recent GRAPH-BERT model introduces a new approach to learning graph representations based solely on the attention mechanism. GRAPH-BERT makes it possible to transfer pre-trained models and learned graph representations across different tasks within the same graph dataset. In this paper, we further investigate graph-to-graph transfer of a universal GRAPH-BERT for graph representation learning across different graph datasets; for simplicity, we refer to the proposed model as G5. Learning G5 poses several challenges: the model must adapt to the distinct input and output configurations of each graph data source, as well as to the differences in their information distributions. G5 introduces a pluggable model architecture: (a) each data source is pre-processed with its own input representation learning component; (b) each output application task has its own functional component; and (c) these diverse input and output components are connected to a universal GRAPH-BERT core component via an input size unification layer and an output representation fusion layer, respectively. The G5 model removes the last obstacle to cross-graph representation learning and transfer. For graph sources with very sparse training data, a G5 model pre-trained on other graphs can still be used for representation learning after the necessary fine-tuning. Moreover, the G5 architecture also allows us to learn a supervised functional classifier for data sources without any training data at all; we name this problem the Apocalypse Learning task. Two label reasoning strategies, i.e., Cross-Source Classification Consistency Maximization (CCCM) and Cross-Source Dynamic Routing (CDR), are introduced in this paper to address it.
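The pluggable architecture in (a)-(c) can be made concrete with a short sketch. Below is a minimal, hypothetical PyTorch approximation: a standard Transformer encoder stands in for the GRAPH-BERT core, input size unification is done by padding or truncating node sequences, and output fusion is mean pooling. All class names, dimensions, and dataset labels (SourceInputComponent, unified_len, "cora", "citeseer") are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of a G5-style pluggable architecture (assumptions noted above).
import torch
import torch.nn as nn


class SourceInputComponent(nn.Module):
    """Per-source input representation learning (here: a simple linear embedding)."""
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.embed = nn.Linear(in_dim, hidden_dim)

    def forward(self, x):            # x: (batch, num_nodes, in_dim)
        return self.embed(x)         # (batch, num_nodes, hidden_dim)


class G5(nn.Module):
    def __init__(self, source_dims, task_dims, hidden_dim=64,
                 unified_len=32, num_layers=2, num_heads=4):
        super().__init__()
        # (a) one input component per graph data source
        self.inputs = nn.ModuleDict({name: SourceInputComponent(d, hidden_dim)
                                     for name, d in source_dims.items()})
        # input size unification layer: pad/truncate every graph to `unified_len` nodes
        self.unified_len = unified_len
        # shared core (a Transformer encoder standing in for GRAPH-BERT)
        layer = nn.TransformerEncoderLayer(d_model=hidden_dim, nhead=num_heads,
                                           batch_first=True)
        self.core = nn.TransformerEncoder(layer, num_layers=num_layers)
        # (b) one functional component (head) per application task
        self.heads = nn.ModuleDict({name: nn.Linear(hidden_dim, d)
                                    for name, d in task_dims.items()})

    def unify(self, h):
        # pad or truncate the node dimension so every source fits the shared core
        b, n, d = h.shape
        if n >= self.unified_len:
            return h[:, :self.unified_len, :]
        pad = h.new_zeros(b, self.unified_len - n, d)
        return torch.cat([h, pad], dim=1)

    def forward(self, x, source, task):
        h = self.inputs[source](x)   # source-specific input component
        h = self.unify(h)            # input size unification layer
        h = self.core(h)             # universal core shared across all sources
        g = h.mean(dim=1)            # output representation fusion (mean pooling)
        return self.heads[task](g)   # task-specific functional component


# Usage: two sources with different feature sizes feed the same shared core.
model = G5(source_dims={"cora": 1433, "citeseer": 3703},
           task_dims={"graph_cls": 7})
x = torch.randn(2, 20, 1433)         # a batch of 20-node subgraphs from "cora"
logits = model(x, source="cora", task="graph_cls")
print(logits.shape)                  # torch.Size([2, 7])
```

Because only the input and output components are source- or task-specific, a core pre-trained on one graph can, in principle, be reused on another graph by swapping in a new input component and head and fine-tuning.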

