On the Expressive Power of Deep Fully Circulant Neural Networks

01/29/2019
by   Alexandre Araujo, et al.

In this paper, we study deep fully circulant neural networks, that is, deep neural networks in which all weight matrices are circulant. We show that these networks outperform recently introduced deep networks with other types of structured layers. Besides introducing principled techniques for training these models, we provide theoretical guarantees regarding their expressivity. Indeed, we prove that the function space spanned by circulant networks of bounded depth includes the one spanned by dense networks with specific properties on their rank. We conduct a thorough experimental study to compare the performance of deep fully circulant networks with state-of-the-art models based on structured matrices and with dense models. We show that our models achieve better accuracy than their structured alternatives while requiring 2x fewer weights than the next best approach. Finally, we train deep fully circulant networks to build compact and accurate models on a real-world video classification dataset with over 3.8 million training examples.
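The key property that makes circulant layers attractive is not stated explicitly in the abstract but follows from standard linear algebra: an n x n circulant matrix is fully determined by its first column (n parameters instead of n^2), and its matrix-vector product can be computed in O(n log n) via the FFT. The following is an illustrative NumPy sketch of this idea, not the authors' implementation; the function name `circulant_matvec` is our own.

```python
import numpy as np

def circulant_matvec(c, x):
    """Multiply the circulant matrix defined by first column c with vector x.

    A circulant matrix C has entries C[i, j] = c[(i - j) mod n], so C @ x is
    the circular convolution of c and x. By the convolution theorem this
    equals ifft(fft(c) * fft(x)), costing O(n log n) instead of O(n^2).
    """
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))
```

A fully circulant layer would apply this product followed by a bias and nonlinearity, storing only the n-dimensional vector c per layer rather than a dense n x n weight matrix.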


