
Universal Consistency of Deep Convolutional Neural Networks

06/23/2021
by Shao-Bo Lin, et al.

Compared with the avid research activity on deep convolutional neural networks (DCNNs) in practice, the study of their theoretical behavior lags far behind. In particular, the universal consistency of DCNNs remains open. In this paper, we prove that implementing empirical risk minimization on DCNNs with expansive convolution (with zero-padding) is strongly universally consistent. Motivated by this universal consistency, we conduct a series of experiments showing that, without any fully connected layers, DCNNs with expansive convolution perform no worse than the widely used deep neural networks with a hybrid structure containing contracting (without zero-padding) convolutional layers and several fully connected layers.
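The structural distinction at the heart of the abstract is between expansive (zero-padded) and contracting (no-padding) convolution: for a 1-D input of dimension d and a filter of size s, the former produces an output of length d + s - 1, the latter of length d - s + 1. A minimal NumPy sketch of this difference follows; the function names and the specific sizes are illustrative assumptions, not taken from the paper.

import numpy as np

# Expansive convolution (zero-padding): output length grows to d + s - 1.
def expansive_conv(x, w):
    return np.convolve(x, w, mode="full")

# Contracting convolution (no zero-padding): output length shrinks to d - s + 1.
def contracting_conv(x, w):
    return np.convolve(x, w, mode="valid")

rng = np.random.default_rng(0)
x = rng.standard_normal(8)  # input of dimension d = 8
w = rng.standard_normal(3)  # filter of size s = 3

print(expansive_conv(x, w).shape)    # (10,) = d + s - 1
print(contracting_conv(x, w).shape)  # (6,)  = d - s + 1

Because each expansive layer enlarges the representation, a network can stack such layers without the fully connected layers that hybrid architectures append after their contracting convolutional layers, which is the comparison the paper's experiments draw.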

Related Research

09/04/2018
Equivalence of approximation by convolutional neural networks and fully-connected networks
Convolutional neural networks are the most widely used type of neural ne...

12/22/2014
Deep Fried Convnets
The fully connected layers of a deep convolutional neural network typica...

07/02/2021
Theory of Deep Convolutional Neural Networks III: Approximating Radial Functions
We consider a family of deep neural networks consisting of two groups of...

10/20/2017
Unified Backpropagation for Multi-Objective Deep Learning
A common practice in most of deep convolutional neural architectures is ...

02/04/2021
Universal Approximation Theorems of Fully Connected Binarized Neural Networks
Neural networks (NNs) are known for their high predictive accuracy in co...

07/28/2020
Theory of Deep Convolutional Neural Networks II: Spherical Analysis
Deep learning based on deep neural networks of various structures and ar...

12/03/2014
Memory Bounded Deep Convolutional Networks
In this work, we investigate the use of sparsity-inducing regularizers d...