RETRO: Relation Retrofitting For In-Database Machine Learning on Textual Data

11/28/2019
by   Michael Günther, et al.

There are massive amounts of textual data residing in databases, valuable for many machine learning (ML) tasks. Since ML techniques depend on numerical input representations, word embeddings are increasingly utilized to convert symbolic representations such as text into meaningful numbers. However, a naive one-to-one mapping of each word in a database to a word embedding vector is not sufficient and would lead to poor accuracy in ML tasks. Thus, we argue to additionally incorporate the information given by the database schema into the embedding, e.g., which words appear in the same column or are related to each other. In this paper, we propose RETRO (RElational reTROfitting), a novel approach to learn numerical representations of text values in databases, capturing the best of both worlds: the rich information encoded by word embeddings and the relational information encoded by database tables. We formulate relation retrofitting as a learning problem and present an efficient algorithm to solve it. We investigate the impact of various hyperparameters on the learning problem and derive good settings for all of them. Our evaluation shows that the proposed embeddings are ready to use for many ML tasks such as classification and regression, and even outperform state-of-the-art techniques in data integration tasks such as null value imputation and link prediction.
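To make the idea concrete, the following is a minimal sketch of a retrofitting update in the spirit described by the abstract: pre-trained word vectors are nudged toward the vectors of text values that are related through the database schema (e.g., values in the same column or linked by a foreign key), while staying anchored to their original embeddings. The objective, update rule, and hyperparameter names (`alpha`, `beta`) here follow the classic retrofitting scheme and are assumptions for illustration, not the exact RETRO formulation from the paper.

```python
import numpy as np

def retrofit(embeddings, relations, alpha=1.0, beta=0.5, iterations=10):
    """Iteratively pull related text values' vectors together.

    embeddings: dict mapping a text value to its pre-trained vector.
    relations:  dict mapping a text value to schema-related text values
                (e.g., same column or foreign-key neighbors) -- an
                illustrative stand-in for the relational information.
    """
    new = {term: vec.copy() for term, vec in embeddings.items()}
    for _ in range(iterations):
        for term, neighbors in relations.items():
            neighbors = [n for n in neighbors if n in new]
            if not neighbors:
                continue
            # Coordinate update: weighted mean of the original vector
            # (anchor, weight alpha) and the current vectors of the
            # schema-related neighbors (weight beta each).
            num = alpha * embeddings[term] + beta * sum(new[n] for n in neighbors)
            new[term] = num / (alpha + beta * len(neighbors))
    return new

# Toy example: two city names assumed to share a column, plus an
# unrelated term that keeps its original embedding.
emb = {"Dresden": np.array([1.0, 0.0]),
       "Leipzig": np.array([0.0, 1.0]),
       "banana":  np.array([5.0, 5.0])}
rel = {"Dresden": ["Leipzig"], "Leipzig": ["Dresden"]}
out = retrofit(emb, rel)
```

After retrofitting, the two related values end up closer together than their original embeddings were, while the unrelated term is untouched; this is the effect the schema information is meant to induce.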

Related research:

- Relational Word Embeddings (06/04/2019). While word embeddings have been shown to implicitly encode various forms...
- ML Based Lineage in Databases (09/13/2021). We track the lineage of tuples throughout their database lifetime. That ...
- A Comprehensive Survey on Word Representation Models: From Classical to State-Of-The-Art Word Representation Language Models (10/28/2020). Word representation has always been an important research area in the hi...
- Global Textual Relation Embedding for Relational Understanding (06/03/2019). Pre-trained embeddings such as word embeddings and sentence embeddings a...
- Enabling Cognitive Intelligence Queries in Relational Databases using Low-dimensional Word Embeddings (03/23/2016). We apply distributed language embedding methods from Natural Language Pr...
- Dynamic Database Embeddings with FoRWaRD (03/11/2021). We study the problem of computing an embedding of the tuples of a relati...
- An Algebraic Approach for High-level Text Analytics (05/03/2020). Text analytical tasks like word embedding, phrase mining, and topic mode...
