Differentiable Generative Phonology

02/10/2021
by Shijie Wu, et al.

The goal of generative phonology, as formulated by Chomsky and Halle (1968), is to specify a formal system that explains the set of attested phonological strings in a language. Traditionally, a collection of rules (or constraints, in the case of Optimality Theory) and underlying forms (UFs) are posited to work in tandem to generate phonological strings. However, the degree of abstraction of UFs with respect to their concrete realizations is contentious. As the main contribution of our work, we implement the phonological generative system as an end-to-end differentiable neural model, rather than as a set of rules or constraints. In contrast to traditional phonology, the UFs in our model are continuous vectors in ℝ^d rather than discrete strings. As a consequence, UFs are discovered automatically rather than posited by linguists, and the model scales to the size of a realistic vocabulary. Moreover, we compare several modes of the generative process, considering: i) the presence or absence of an underlying representation between morphemes and surface forms (SFs); and ii) the conditional dependence or independence of UFs with respect to SFs. We evaluate the ability of each mode to predict attested phonological strings on two datasets covering 5 and 28 languages, respectively. The results corroborate two tenets of generative phonology, viz. the necessity for UFs and their independence from SFs. In general, our neural model of generative phonology learns both UFs and SFs automatically and at scale.
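To make the idea concrete, here is a minimal PyTorch sketch of one such generative mode: each morpheme is assigned a learned continuous UF in ℝ^d, and a neural decoder realizes the composed UFs as a surface phoneme string. This is not the authors' code; the class names, dimensions, the summation used to compose morpheme UFs, and the choice of an LSTM decoder are all illustrative assumptions.

```python
import torch
import torch.nn as nn

class DifferentiablePhonology(nn.Module):
    def __init__(self, num_morphemes, num_phonemes, d=64, hidden=128):
        super().__init__()
        # One learned continuous UF in R^d per morpheme, discovered by
        # gradient descent rather than posited by a linguist.
        self.uf = nn.Embedding(num_morphemes, d)
        self.phoneme_emb = nn.Embedding(num_phonemes, d)
        # A decoder realizes the composed UFs as a surface phoneme string.
        self.decoder = nn.LSTM(d, hidden, batch_first=True)
        self.init_h = nn.Linear(d, hidden)
        self.out = nn.Linear(hidden, num_phonemes)

    def forward(self, morpheme_ids, sf_prefix):
        # morpheme_ids: (batch, n_morphemes) indices of a word's morphemes
        # sf_prefix:    (batch, t) previously generated surface phonemes
        u = self.uf(morpheme_ids).sum(dim=1)          # compose morpheme UFs
        h0 = torch.tanh(self.init_h(u)).unsqueeze(0)  # condition decoder on UFs
        c0 = torch.zeros_like(h0)
        out, _ = self.decoder(self.phoneme_emb(sf_prefix), (h0, c0))
        return self.out(out)                          # logits over next phoneme

# Training maximizes the likelihood of attested surface forms, so the UFs and
# the realization function are learned jointly, end to end.
model = DifferentiablePhonology(num_morphemes=1000, num_phonemes=50)
logits = model(torch.randint(0, 1000, (8, 2)), torch.randint(0, 50, (8, 5)))
print(logits.shape)  # torch.Size([8, 5, 50])
```

Under these assumptions, dropping the UF embedding (decoding directly from discrete morpheme identities) or tying the UFs to surface forms would give the alternative generative modes that the paper compares.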
