SoPa: Bridging CNNs, RNNs, and Weighted Finite-State Machines

05/15/2018
by Roy Schwartz, et al.

Recurrent and convolutional neural networks comprise two distinct families of models that have proven to be useful for encoding natural language utterances. In this paper we present SoPa, a new model that aims to bridge these two approaches. SoPa combines neural representation learning with weighted finite-state automata (WFSAs) to learn a soft version of traditional surface patterns. We show that SoPa is an extension of a one-layer CNN, and that such CNNs are equivalent to a restricted version of SoPa, and accordingly, to a restricted form of WFSA. Empirically, on three text classification tasks, SoPa is comparable to or better than both a BiLSTM (RNN) baseline and a CNN baseline, and is particularly useful in small data settings.
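To make the abstract's core idea concrete, the sketch below illustrates the kind of computation a single soft pattern performs: a max-product forward pass of a linear-chain WFSA whose transition scores are computed from word vectors. This is a minimal illustration, not the paper's implementation; the function names (`soft_pattern_score`, `transition_scores`) and the fixed `self_loop` and `epsilon` weights are assumptions for exposition, whereas SoPa learns all of these scores jointly, uses many patterns, and works in log (max-sum) space.

```python
import numpy as np

def transition_scores(x, W, b):
    """Sigmoid score in (0, 1) for each main-path edge of the pattern,
    given one word vector x. W: (n_edges, dim), b: (n_edges,).
    (Illustrative parameterization, assumed for this sketch.)"""
    return 1.0 / (1.0 + np.exp(-(W @ x + b)))

def soft_pattern_score(tokens, W, b, self_loop=0.5, epsilon=0.1):
    """Max-product pass of one soft pattern (a linear-chain WFSA) over a
    sequence of word vectors; returns the best full-match score."""
    n_edges = W.shape[0]                  # pattern has n_edges + 1 states
    best = np.full(n_edges + 1, -np.inf)  # best score reaching each state
    best[0] = 1.0                         # a match may start anywhere
    doc_score = -np.inf
    for x in tokens:
        s = transition_scores(x, W, b)
        new = np.full_like(best, -np.inf)
        new[0] = 1.0                      # allow a restart at this position
        for i in range(n_edges):
            new[i + 1] = max(
                best[i] * s[i],           # consume token, advance one state
                best[i + 1] * self_loop,  # consume token, stay in place
            )
        for i in range(n_edges):          # epsilon edges: advance for free
            new[i + 1] = max(new[i + 1], new[i] * epsilon)
        best = new
        doc_score = max(doc_score, best[-1])  # final state reached
    return doc_score
```

Under this reading, the CNN equivalence the abstract mentions falls out of a restriction: with the self-loop and epsilon edges disabled and scoring done in max-plus (log) space, each pattern can only match contiguous windows of exactly n_edges tokens, so its document score reduces to a width-n_edges convolution filter followed by max-pooling, i.e., one filter of a one-layer CNN.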


Related research

07/05/2017
Multiple Range-Restricted Bidirectional Gated Recurrent Units with Attention for Relation Classification
Most neural approaches to relation classification have focused on fin...

09/05/2021
Learning Hierarchical Structures with Differentiable Nondeterministic Stacks
Learning hierarchical structures in sequential data – from simple algori...

09/20/2018
Symbolic Priors for RNN-based Semantic Parsing
Seq2seq models based on Recurrent Neural Networks (RNNs) have recently r...

08/28/2018
Rational Recurrences
Despite the tremendous empirical success of neural models in natural lan...

09/10/2020
On Computability, Learnability and Extractability of Finite State Machines from Recurrent Neural Networks
This work aims at shedding some light on connections between finite stat...

08/16/2017
Deconvolutional Paragraph Representation Learning
Learning latent representations from long text sequences is an important...
