Small Language Models for Tabular Data

11/05/2022
by Benjamin L. Badger, et al.

Supervised deep learning is most commonly applied to difficult problems defined on large and often extensively curated datasets. Here we demonstrate the ability of deep representation learning to address problems of classification and regression from small and poorly formed tabular datasets by encoding input information as abstracted sequences composed of a fixed number of characters per input field. We find that small models have sufficient capacity to approximate various functions and achieve record classification benchmark accuracy. Such models are shown to form useful embeddings of various input features in their hidden layers, even if the learned task does not explicitly require knowledge of those features. These models are also amenable to input attribution, allowing for an estimation of the importance of each input element to the model output, as well as an indication of which input features are effectively embedded in the model. We present a proof-of-concept for the application of small language models to mixed tabular data without explicit feature engineering, cleaning, or preprocessing, relying on the model to perform these tasks as part of the representation learning process.
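As a concrete illustration of the fixed-width character encoding described above, the following minimal Python sketch renders each field of a tabular row as a string padded or truncated to a fixed number of characters, then maps the concatenated sequence to character-level tokens. The field width, padding character, and the encode_row helper are illustrative assumptions, not the paper's exact implementation:

    # A minimal sketch of fixed-width character encoding for tabular rows.
    # Field width and padding character are illustrative assumptions.

    def encode_row(fields, width=8, pad="_"):
        """Render each field as a string, then pad or truncate it to
        exactly `width` characters so every input has the same length."""
        encoded = []
        for value in fields:
            text = "" if value is None else str(value)  # missing values stay blank
            encoded.append(text[:width].ljust(width, pad))
        return "".join(encoded)

    # Example: a mixed-type row with a missing entry.
    row = [3.14, "male", None, 42]
    sequence = encode_row(row)
    print(sequence)       # '3.14____male____________42______'
    print(len(sequence))  # 32 characters: 4 fields x 8 chars each

    # A character-level vocabulary maps each character to an integer token,
    # so the sequence can be fed to a small sequence model.
    vocab = {ch: i for i, ch in enumerate(sorted(set(sequence)))}
    tokens = [vocab[ch] for ch in sequence]

Because every field occupies the same number of positions in the sequence, no per-column feature engineering or cleaning is required; missing or malformed entries simply become padded character spans that the model learns to handle during representation learning.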
