Sample Complexity Bounds for Recurrent Neural Networks with Application to Combinatorial Graph Problems

01/29/2019
by   Nil-Jana Akpinar, et al.

Learning to predict solutions to real-valued combinatorial graph problems promises efficient approximations. As demonstrated on the NP-hard edge clique cover number, recurrent neural networks (RNNs) are particularly suited for this task and can even outperform state-of-the-art heuristics. However, the theoretical framework for estimating real-valued RNNs is only poorly understood. As our primary contribution, this is the first work to upper bound the sample complexity of learning real-valued RNNs; while such derivations exist for feed-forward and convolutional neural networks, none have previously been given for recurrent architectures. Given a single-layer RNN with a rectified linear units (a denoting the hidden width) and inputs of length b, we show that a population prediction error of ε can be realized with at most Õ(a^4b/ε^2) samples. We further derive comparable results for multi-layer RNNs. Accordingly, a size-adaptive RNN fed with graphs of at most n vertices can be learned with Õ(n^6/ε^2) samples, i.e., with only a polynomial number of samples; this follows from the single-layer bound with hidden width a = O(n) and input length b = O(n^2). For combinatorial graph problems, this provides a theoretical foundation that renders RNNs competitive.
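For intuition, here is a minimal sketch of the model class the bound covers: a single-layer recurrent network with a ReLU hidden units that consumes a length-b input sequence and emits one real value. It is written in plain NumPy; the class name, weight initialization, and the row-by-row adjacency encoding of the input graph are illustrative assumptions rather than the paper's exact construction.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

class SingleLayerReluRNN:
    """Single-layer RNN with `hidden_dim` (= a) ReLU units.

    Processes an input sequence of length b (here, the rows of an
    n x n adjacency matrix, so b = n and input_dim = n) and emits
    one real-valued prediction from the final hidden state.
    """

    def __init__(self, input_dim: int, hidden_dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hidden_dim)
        self.W_in = rng.normal(0.0, scale, (hidden_dim, input_dim))    # input-to-hidden
        self.W_rec = rng.normal(0.0, scale, (hidden_dim, hidden_dim))  # hidden-to-hidden
        self.w_out = rng.normal(0.0, scale, hidden_dim)                # hidden-to-output

    def predict(self, sequence: np.ndarray) -> float:
        """sequence: array of shape (b, input_dim)."""
        h = np.zeros(self.W_rec.shape[0])
        for x_t in sequence:
            # The same weights are shared across all b time steps.
            h = relu(self.W_in @ x_t + self.W_rec @ h)
        return float(self.w_out @ h)

# Example: a random undirected graph on n = 5 vertices, fed row by row.
n = 5
rng = np.random.default_rng(1)
adj = (rng.random((n, n)) < 0.3).astype(float)
adj = np.triu(adj, 1)
adj += adj.T  # symmetric, no self-loops
model = SingleLayerReluRNN(input_dim=n, hidden_dim=16)
print(model.predict(adj))  # untrained guess of the target graph invariant
```

Fitting such a model, e.g., by minimizing squared error against true edge clique cover numbers over a training set, is the estimation problem whose sample complexity the paper bounds.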


