Functorial Language Models

03/26/2021
by Alexis Toumi et al.

We introduce functorial language models: a principled way to compute probability distributions over word sequences given a monoidal functor from grammar to meaning. This yields a method for training categorical compositional distributional (DisCoCat) models on raw text data. We provide a proof-of-concept implementation in DisCoPy, the Python toolbox for monoidal categories.
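To make this concrete, here is a minimal NumPy sketch of the idea. It is not the DisCoPy API itself (whose names have changed across releases) and not code from the paper: a functor assigns a tensor to each word, the cups of a pregroup parse determine which indices to contract, and normalising the resulting scores over a vocabulary (a softmax here, as an assumption) yields a probability distribution over word sequences. All names and numerical values below are illustrative.

    import numpy as np

    # Hypothetical toy model: a pregroup parse of "subject verb object" reduces
    # via two cups, contracting the subject and object wires of the verb tensor.
    # With a one-dimensional sentence type, the image of the parse is the scalar
    #     score(s, o) = sum_{i,j} s_i V[i, j] o_j.
    nouns = {"Alice": np.array([1.0, 0.0]), "Bob": np.array([0.0, 1.0])}
    loves = np.array([[0.0, 1.0],   # V[i, j]: illustrative tensor for "loves"
                      [0.0, 0.0]])

    def score(subj, obj):
        """Image of the parsed sentence under the functor: contract along the cups."""
        return nouns[subj] @ loves @ nouns[obj]

    # From functor to language model: normalise scores over candidate words,
    # e.g. the distribution over objects completing "Alice loves ...".
    logits = {w: score("Alice", w) for w in nouns}
    Z = sum(np.exp(v) for v in logits.values())
    print({w: float(np.exp(v) / Z) for w, v in logits.items()})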

Related research

11/08/2018 · Translating and Evolving: Towards a Model of Language Change in DisCoCat
The categorical compositional distributional (DisCoCat) model of meaning...

03/19/2022 · Dependency-based Mixture Language Models
Various models have been proposed to incorporate knowledge of syntactic ...

11/08/2018 · Internal Wiring of Cartesian Verbs and Prepositions
Categorical compositional distributional semantics (CCDS) allows one to ...

05/26/2020 · Guiding Symbolic Natural Language Grammar Induction via Transformer-Based Sequence Probabilities
A novel approach to automated learning of syntactic rules governing natu...

03/15/2022 · Signal in Noise: Exploring Meaning Encoded in Random Character Sequences with Character-Aware Language Models
Natural language processing models learn word representations based on t...

09/15/2023 · Headless Language Models: Learning without Predicting with Contrastive Weight Tying
Self-supervised pre-training of language models usually consists in pred...

08/27/2022 · On Unsupervised Training of Link Grammar Based Language Models
In this short note we explore what is needed for the unsupervised traini...
