Maximum-and-Concatenation Networks

07/09/2020
by Xingyu Xie, et al.

While successful in many fields, deep neural networks (DNNs) still suffer from open problems such as bad local minima and unsatisfactory generalization. In this work, we propose a novel architecture called Maximum-and-Concatenation Networks (MCN), which aims to eliminate bad local minima and improve generalization. Remarkably, we prove that MCN has a very nice property: every local minimum of an (l+1)-layer MCN is better than, or at least as good as, the global minima of the network consisting of its first l layers. In other words, by increasing its depth, MCN can autonomously improve the quality of its local minima. Moreover, it is easy to plug MCN into an existing deep model so that the combined model also enjoys this property. Finally, under mild conditions, we show that MCN can approximate certain continuous functions arbitrarily well and with high efficiency; that is, the covering number of MCN is much smaller than that of most existing DNNs such as deep ReLU networks. Based on this, we further derive a tight generalization bound that guarantees the inference ability of MCN on test samples.
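The abstract names the two operations that give MCN its name but not the exact layer formulation, which is defined in the full paper. As a rough illustrative sketch only, one might picture a layer that takes the elementwise maximum of two affine maps and concatenates the result with the layer's input so that earlier features are carried forward; the two-branch form, function names, and shapes below are assumptions for illustration, not the paper's definition:

```python
import numpy as np

def mcn_layer(x, W1, b1, W2, b2):
    """Hypothetical MCN-style layer (illustrative only, not the paper's
    exact formulation): elementwise maximum of two affine maps,
    concatenated with the input so earlier features are preserved."""
    h = np.maximum(W1 @ x + b1, W2 @ x + b2)  # "maximum" part
    return np.concatenate([h, x])             # "concatenation" part

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
W1, W2 = rng.standard_normal((3, 4)), rng.standard_normal((3, 4))
b1, b2 = rng.standard_normal(3), rng.standard_normal(3)

y = mcn_layer(x, W1, b1, W2, b2)
print(y.shape)  # (7,): 3 max-branch features + the 4 carried-forward inputs
```

Because each layer's output contains its input verbatim, a deeper network can always reproduce the function computed by its first l layers, which is the intuition behind local minima not getting worse with depth.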
