ReLU activated Multi-Layer Neural Networks trained with Mixed Integer Linear Programs

08/19/2020
by Steffen Goebbels, et al.

This paper is a case study demonstrating that, in principle, multi-layer feedforward neural networks with ReLU activations can be trained iteratively with Mixed Integer Linear Programs (MILPs). To this end, two simple networks were trained on the MNIST dataset of handwritten digits using a backpropagation-like algorithm.
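The abstract does not spell out the paper's MILP formulation, so the following is only a minimal sketch of the general idea, assuming the standard big-M encoding of a ReLU unit and the PuLP solver. It fits the weights and bias of a single ReLU neuron to one training target; the variable names, bounds, the big-M constant, and the choice of solver are all illustrative assumptions, not the authors' method. In a multi-layer, backpropagation-like scheme as described in the abstract, one would presumably solve such programs repeatedly while holding the other layers fixed, since optimizing all layers at once would introduce nonlinear products of variables.

```python
# Hypothetical sketch: fitting one ReLU neuron with a MILP (big-M encoding).
# PuLP, the bounds, and M = 100 are assumptions for illustration only.
import pulp

# One training sample: input x and target output t.
x = [0.5, -1.0, 2.0]
t = 0.7
M = 100.0  # big-M bound on the magnitude of the pre-activation

prob = pulp.LpProblem("relu_neuron_milp_fit", pulp.LpMinimize)

# Decision variables: the neuron's weights and bias ...
w = [pulp.LpVariable(f"w{i}", lowBound=-1, upBound=1) for i in range(len(x))]
b = pulp.LpVariable("b", lowBound=-1, upBound=1)

# ... the ReLU output y and a binary switch z (z = 1 means "unit active").
y = pulp.LpVariable("y", lowBound=0)
z = pulp.LpVariable("z", cat=pulp.LpBinary)

# Pre-activation w.x + b is linear in the variables because x is fixed data.
pre_act = pulp.lpSum(wi * xi for wi, xi in zip(w, x)) + b

# Big-M constraints enforcing y = max(0, pre_act).
prob += y >= pre_act                 # y is at least the pre-activation
prob += y <= pre_act + M * (1 - z)   # if z = 1, y cannot exceed pre_act
prob += y <= M * z                   # if z = 0, y is forced to 0

# Objective: minimize the absolute error |y - t| via an auxiliary variable.
e = pulp.LpVariable("e", lowBound=0)
prob += e >= y - t
prob += e >= t - y
prob += e

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("fitted output:", pulp.value(y), "error:", pulp.value(e))
```

With suitable weights the solver can reach the target exactly (error 0); replacing the single sample with a batch and the single neuron with a layer yields the kind of linear program that such an iterative training scheme would solve at each step.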
