Convex Relaxations of Convolutional Neural Nets

12/31/2018
by Burak Bartan et al.

We propose convex relaxations for convolutional neural nets with one hidden layer where the output weights are fixed. For convex activation functions such as rectified linear units, the relaxations take the form of second-order cone programs, which can be solved very efficiently. We prove that the relaxation recovers the global minimum under a planted model assumption, given sufficiently many training samples drawn from a Gaussian distribution. We also identify a phase transition in the relaxation's recovery of the global minimum.
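To make the idea concrete, here is a minimal sketch of one possible convex relaxation of this kind, written with cvxpy. It assumes fixed non-negative output weights and relaxes the ReLU equality constraint z = max(0, Xw) to the convex epigraph constraint z >= max(0, Xw); the dimensions, the planted-model data generation, and this particular relaxation are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
import cvxpy as cp

# Hypothetical dimensions: n training samples, d-dimensional inputs, K hidden units.
np.random.seed(0)
n, d, K = 50, 10, 3

X = np.random.randn(n, d)               # Gaussian training data (planted-model setting)
alpha = np.ones(K)                      # fixed, non-negative output-layer weights
W_true = np.random.randn(d, K)          # planted hidden-layer weights
y = np.maximum(X @ W_true, 0) @ alpha   # labels generated by the planted model

# Illustrative convex relaxation: replace the nonconvex equality Z = relu(X W)
# with the convex constraints Z >= 0 and Z >= X W (the epigraph of the ReLU),
# then minimize the squared loss of the relaxed network outputs.
# With non-negative output weights this is a convex program that standard
# solvers handle as a second-order cone program.
W = cp.Variable((d, K))
Z = cp.Variable((n, K))
constraints = [Z >= 0, Z >= X @ W]
objective = cp.Minimize(cp.sum_squares(Z @ alpha - y))
prob = cp.Problem(objective, constraints)
prob.solve()

print("relaxation objective:", prob.value)
```

Under the planted model above, one would check recovery by comparing the fitted W against W_true; the abstract's claim is that, with enough Gaussian samples, the relaxation's solution attains the global minimum of the original nonconvex training problem.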
