A Tropical Approach to Neural Networks with Piecewise Linear Activations

05/22/2018
by Vasileios Charisopoulos, et al.

We present a new, unifying approach following some recent developments on the complexity of neural networks with piecewise linear activations. We treat neural network layers with piecewise linear activations, such as Maxout or ReLU units, as polynomials in the (max, +) (or so-called tropical) algebra. Following up on the work of Montufar et al. (arXiv:1402.1869), this approach enables us to improve their upper bound on the number of linear regions of layers with ReLU or leaky ReLU activations to min{ 2^m, 2 · ∑_{j=0}^{n} \binom{m-1}{j} }, where n, m are the number of inputs and outputs, respectively. Additionally, we recover their upper bounds on maxout layers. Our work is parallel to the improvements reported in (arXiv:1711.02114, arXiv:1611.01491), though exclusively under the lens of tropical geometry.
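The two ideas in the abstract can be illustrated concretely: ReLU is a tropical polynomial (max(z, 0), i.e. the tropical sum of z and 0), and the stated bound min{2^m, 2·∑_{j=0}^{n} C(m-1, j)} caps the number of linear regions of a single layer. The sketch below is not from the paper; the helper names (`relu_region_upper_bound`, the grid-sampling check) are illustrative, and the empirical count is only a lower estimate of the true region count, so it should never exceed the bound.

```python
import itertools
import math
import random

# Tropical (max, +) semiring: "addition" is max, "multiplication" is +.
def trop_add(a, b):
    return max(a, b)

def trop_mul(a, b):
    return a + b

# A ReLU unit is a tropical polynomial in one variable: ReLU(z) = z (+) 0.
def relu(z):
    return trop_add(z, 0.0)

def relu_region_upper_bound(n, m):
    """Bound from the abstract: min{2^m, 2 * sum_{j=0}^{n} C(m-1, j)}
    on the number of linear regions of a ReLU layer with n inputs, m outputs."""
    return min(2 ** m, 2 * sum(math.comb(m - 1, j) for j in range(n + 1)))

# Empirical sanity check: count distinct activation patterns of a random
# ReLU layer y = ReLU(Wx + b) on a grid; each pattern marks one linear region.
random.seed(0)
n, m = 2, 5
W = [[random.gauss(0, 1) for _ in range(n)] for _ in range(m)]
b = [random.gauss(0, 1) for _ in range(m)]

patterns = set()
grid = [i / 10.0 for i in range(-50, 51)]
for x in itertools.product(grid, repeat=n):
    pre = [sum(W[i][j] * x[j] for j in range(n)) + b[i] for i in range(m)]
    patterns.add(tuple(p > 0 for p in pre))

bound = relu_region_upper_bound(n, m)
print(f"observed regions: {len(patterns)}, upper bound: {bound}")
assert len(patterns) <= bound
```

For n = 2, m = 5 the bound evaluates to min{32, 2·(1 + 4 + 6)} = 22, noticeably tighter than the trivial 2^m = 32 activation-pattern count.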


Related research

07/14/2020: Bounding The Number of Linear Regions in Local Area for Neural Networks with ReLU Activations
03/05/2021: Precise Multi-Neuron Abstractions for Neural Network Certification
12/20/2013: On the number of response regions of deep feed forward networks with piece-wise linear activations
10/31/2018: Nearly-tight bounds on linear regions of piecewise linear neural networks
10/14/2021: Sound and Complete Neural Network Repair with Minimality and Locality Guarantees
05/31/2021: Towards Lower Bounds on the Depth of ReLU Neural Networks
10/02/2019: Identifying Weights and Architectures of Unknown ReLU Networks