Convex Relaxations of Convolutional Neural Nets

12/31/2018
by Burak Bartan, et al.

We propose convex relaxations for convolutional neural nets with one hidden layer where the output weights are fixed. For convex activation functions such as rectified linear units, the relaxations are convex second-order cone programs, which can be solved very efficiently. We prove that the relaxation recovers the global minimum under a planted-model assumption, given sufficiently many training samples drawn from a Gaussian distribution. We also identify a phase-transition phenomenon in the relaxation's recovery of the global minimum.

Related research

05/25/2022 · Exact Phase Transitions in Deep Learning
This work reports deep-learning-unique first-order and second-order phas...

03/19/2022 · Efficient Neural Network Analysis with Sum-of-Infeasibilities
Inspired by sum-of-infeasibilities methods in convex optimization, we pr...

12/24/2020 · Vector-output ReLU Neural Network Problems are Copositive Programs: Convex Analysis of Two Layer Networks and Polynomial-time Algorithms
We describe the convex semi-infinite dual of the two-layer vector-output...

02/24/2020 · Neural Networks are Convex Regularizers: Exact Polynomial-time Convex Optimization Formulations for Two-Layer Networks
We develop exact representations of two layer neural networks with recti...

03/08/2020 · Π-nets: Deep Polynomial Neural Networks
Deep Convolutional Neural Networks (DCNNs) are currently the method of ch...

06/20/2020 · Deep Polynomial Neural Networks
Deep Convolutional Neural Networks (DCNNs) are currently the method of c...

01/06/2022 · Bio-inspired Min-Nets Improve the Performance and Robustness of Deep Networks
Min-Nets are inspired by end-stopped cortical cells with units that outp...
