Universal Consistency of Deep Convolutional Neural Networks

by Shao-Bo Lin et al.

Compared with the intense practical research activity on deep convolutional neural networks (DCNNs), the study of their theoretical behavior lags far behind. In particular, the universal consistency of DCNNs remains open. In this paper, we prove that empirical risk minimization over DCNNs with expansive convolution (with zero-padding) is strongly universally consistent. Motivated by this universal consistency, we conduct a series of experiments showing that, without any fully connected layers, DCNNs with expansive convolution perform no worse than the widely used deep neural networks with a hybrid structure combining contracting (without zero-padding) convolutional layers and several fully connected layers.
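The distinction between expansive and contracting convolution that the abstract draws can be illustrated with a minimal 1-D sketch: with zero-padding ("full" convolution) the output is longer than the input, while without padding ("valid" convolution) it is shorter. The function names below are illustrative, not from the paper.

```python
import numpy as np

def expansive_conv1d(x, w):
    # Expansive (zero-padded) convolution: output length n + s - 1,
    # so the representation grows layer by layer.
    return np.convolve(x, w, mode="full")

def contracting_conv1d(x, w):
    # Contracting (no-padding) convolution: output length n - s + 1,
    # so the representation shrinks layer by layer.
    return np.convolve(x, w, mode="valid")

x = np.arange(1.0, 6.0)          # input of length n = 5
w = np.array([1.0, -1.0, 2.0])   # filter of length s = 3

print(expansive_conv1d(x, w).shape)    # (7,) = n + s - 1
print(contracting_conv1d(x, w).shape)  # (3,) = n - s + 1
```

This size difference is why hybrid architectures built on contracting convolutions typically append fully connected layers, whereas the expansive networks analyzed in the paper can dispense with them.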


Related papers:

- Equivalence of approximation by convolutional neural networks and fully-connected networks
- Deep Fried Convnets
- Theory of Deep Convolutional Neural Networks III: Approximating Radial Functions
- Unified Backpropagation for Multi-Objective Deep Learning
- Universal Approximation Theorems of Fully Connected Binarized Neural Networks
- Theory of Deep Convolutional Neural Networks II: Spherical Analysis
- Memory Bounded Deep Convolutional Networks