Improved Learning of One-hidden-layer Convolutional Neural Networks with Overlaps

05/20/2018
by Simon S. Du, et al.

We propose a new algorithm to learn a one-hidden-layer convolutional neural network where both the convolutional weights and the output weights are parameters to be learned. Our algorithm works for a general class of (potentially overlapping) patches, including commonly used structures for computer vision tasks. Our algorithm draws ideas from (1) isotonic regression for learning neural networks and (2) landscape analysis of non-convex matrix factorization problems. We believe these findings may inspire further development in designing provable algorithms for learning neural networks and other complex models.
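To make the model class concrete, the following is a minimal numpy sketch of a one-hidden-layer CNN with a single shared filter, ReLU activation, and potentially overlapping 1-D patches (overlap occurs whenever the stride is smaller than the filter length). This illustrates only the architecture being learned, not the authors' learning algorithm; all names and shapes here are illustrative assumptions.

```python
import numpy as np

def one_hidden_layer_cnn(x, w, a, stride=1):
    """Forward pass f(x) = sum_j a_j * relu(w . P_j x), where P_j extracts
    the j-th (possibly overlapping) patch of the input x.

    x : 1-D input vector
    w : shared convolutional filter (learned parameter)
    a : output weights, one per patch location (learned parameter)
    """
    k = len(w)
    # Extract patches P_j x; patches overlap when stride < k.
    patches = [x[j:j + k] for j in range(0, len(x) - k + 1, stride)]
    hidden = np.maximum(0.0, np.array([p @ w for p in patches]))  # ReLU units
    return float(a @ hidden)

# Example: length-8 input, length-3 filter, stride 1 gives 6 overlapping patches.
rng = np.random.default_rng(0)
x = rng.standard_normal(8)
w = rng.standard_normal(3)   # convolutional weights
a = rng.standard_normal(6)   # output weights
y = one_hidden_layer_cnn(x, w, a)
```

Both `w` and `a` are trainable here, matching the setting of the abstract; many earlier analyses fix the output layer and learn only the filter.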


