A Tropical Approach to Neural Networks with Piecewise Linear Activations

05/22/2018 · by Vasileios Charisopoulos, et al.

We present a new, unifying approach following recent developments on the complexity of neural networks with piecewise linear activations. We treat neural network layers with piecewise linear activations, such as maxout or ReLU units, as polynomials in the (max, +) (so-called tropical) algebra. Following up on the work of Montufar et al. (arXiv:1402.1869), this approach enables us to improve their upper bound on the number of linear regions of layers with ReLU or leaky ReLU activations to min{2^m, 2·∑_{j=0}^n \binom{m-1}{j}}, where n and m are the numbers of inputs and outputs, respectively. Additionally, we recover their upper bounds on maxout layers. Our work parallels the improvements reported in arXiv:1711.02114 and arXiv:1611.01491, though exclusively under the lens of tropical geometry.
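To make the tropical viewpoint concrete, here is a minimal sketch in Python (function names are ours, not the paper's): the (max, +) semiring operations, a maxout/ReLU unit written as a tropical polynomial, and the upper bound stated in the abstract.

```python
from math import comb

# Tropical (max, +) semiring: "addition" is max, "multiplication" is +.
def trop_add(a, b):
    return max(a, b)

def trop_mul(a, b):
    return a + b

# A maxout unit max_i(w_i . x + b_i) is a tropical polynomial:
# a tropical sum of monomials with tropical coefficients b_i.
def maxout(x, weights, biases):
    terms = (b + sum(w_i * x_i for w_i, x_i in zip(w, x))
             for w, b in zip(weights, biases))
    return max(terms)

# ReLU is the special case max(z, 0): two tropical monomials, z and 0.
def relu(z):
    return trop_add(z, 0)  # max(z, 0)

def relu_region_upper_bound(n, m):
    """Upper bound from the abstract on linear regions of a ReLU or
    leaky-ReLU layer with n inputs and m outputs:
    min(2^m, 2 * sum_{j=0}^{n} C(m-1, j))."""
    return min(2**m, 2 * sum(comb(m - 1, j) for j in range(n + 1)))
```

Note that when n ≥ m − 1 the binomial sum equals 2^(m−1), so the two arguments of the min coincide at 2^m; the improvement over the trivial 2^m bound kicks in when the input dimension n is small relative to the number of units m.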





