Conformal retrofitting via Riemannian manifolds: distilling task-specific graphs into pretrained embeddings

10/09/2020
by Justin Dieter, et al.

Pretrained (language) embeddings are versatile, task-agnostic feature representations of entities, such as words, that are central to many machine learning applications. These representations can be enriched through retrofitting, a class of methods that incorporate task-specific domain knowledge encoded as a graph over a subset of these entities. However, existing retrofitting algorithms face two limitations: they overfit the observed graph by failing to represent relationships with missing entities, and they underfit the observed graph by only learning embeddings in Euclidean manifolds, which cannot faithfully represent even simple tree-structured or cyclic graphs. We address these problems with two key contributions: (i) a novel conformality regularizer that preserves local geometry from the pretrained embeddings, enabling generalization to missing entities, and (ii) a new Riemannian feedforward layer that learns to map pretrained embeddings onto a non-Euclidean manifold that can better represent the entire graph. Through experiments on WordNet, we demonstrate that the conformality regularizer prevents even existing (Euclidean-only) methods from overfitting on link prediction for missing entities, and, together with the Riemannian feedforward layer, learns non-Euclidean embeddings that outperform them.
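The two contributions can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: `conformality_penalty` is a hypothetical proxy for the conformality regularizer (a conformal map preserves angles, so within each point's neighborhood the new pairwise distances should be a single scaled copy of the pretrained ones), and `poincare_expmap0` shows one common way a "Riemannian feedforward layer" can land its output on a non-Euclidean manifold, here the Poincaré ball of curvature -c.

```python
import numpy as np

def poincare_expmap0(v, c=1.0, eps=1e-9):
    """Map a tangent vector at the origin onto the Poincare ball (curvature -c).
    A typical Riemannian layer applies a Euclidean affine map, then projects
    the result onto the manifold with this exponential map."""
    norm = np.linalg.norm(v, axis=-1, keepdims=True)
    coef = np.tanh(np.sqrt(c) * norm) / (np.sqrt(c) * norm + eps)
    return coef * v  # output always lies strictly inside the unit ball

def conformality_penalty(X_pre, X_new, k=3):
    """Hypothetical conformality-style regularizer: for each point's k nearest
    neighbors (chosen in the pretrained space), fit a per-point scale s and
    penalize how far the new local distances stray from s times the old ones.
    Zero penalty means the retrofitted map is locally a pure scaling."""
    n = X_pre.shape[0]
    D_pre = np.linalg.norm(X_pre[:, None] - X_pre[None, :], axis=-1)
    D_new = np.linalg.norm(X_new[:, None] - X_new[None, :], axis=-1)
    total = 0.0
    for i in range(n):
        nbrs = np.argsort(D_pre[i])[1:k + 1]  # skip the point itself
        s = D_new[i, nbrs].sum() / (D_pre[i, nbrs].sum() + 1e-12)
        total += np.sum((D_new[i, nbrs] - s * D_pre[i, nbrs]) ** 2)
    return total / n
```

Because the scale s is fit per point, a global rescaling of the embeddings (a conformal map) incurs zero penalty, while a map that distorts local distance ratios, and hence angles, is penalized; in training this term would be added to the link-prediction loss.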
