Neural Polytopes

07/03/2023
by Koji Hashimoto, et al.

We find that simple neural networks with ReLU activation generate polytopes as approximations of the unit sphere in various dimensions. The species of polytope is regulated by the network architecture, such as the number of units and layers. For a variety of activation functions, a generalization of polytopes is obtained, which we call neural polytopes. They are smooth analogues of polytopes and exhibit geometric duality. This finding initiates research on generative discrete geometry: approximating surfaces by machine learning.
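The core observation can be illustrated in two dimensions: because ReLU is piecewise linear, a trained network's output is linear between activation kinks, so a network fitted to the unit circle traces out a polygon. The sketch below is an illustration under assumptions, not the authors' code: a hand-rolled one-hidden-layer ReLU network trained with plain full-batch gradient descent in NumPy.

```python
import numpy as np

rng = np.random.default_rng(0)
H = 8  # hidden ReLU units: the fitted curve has at most H kinks (polygon vertices)

# Parametrize the unit circle; inputs are normalized to [-1, 1].
t = np.linspace(-1.0, 1.0, 256)[:, None]            # (N, 1)
angle = np.pi * t
target = np.hstack([np.cos(angle), np.sin(angle)])  # (N, 2) points on the circle

# One hidden layer: g(t) = W2 @ relu(W1 t + b1) + b2
W1 = rng.normal(size=(1, H)); b1 = rng.normal(size=H)
W2 = rng.normal(scale=0.5, size=(H, 2)); b2 = np.zeros(2)

lr = 0.1
for _ in range(20000):                  # plain full-batch gradient descent on MSE
    z = t @ W1 + b1
    h = np.maximum(z, 0.0)              # ReLU -> piecewise-linear features of t
    out = h @ W2 + b2
    err = out - target
    gW2 = h.T @ err / len(t); gb2 = err.mean(0)
    dh = (err @ W2.T) * (z > 0)
    gW1 = t.T @ dh / len(t); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

mse = float(np.mean((out - target) ** 2))
print(f"final MSE vs. unit circle: {mse:.4f}")
# Between ReLU kinks the output is exactly linear, so the learned curve is a
# polygon inscribed near the unit circle -- a 2D "neural polytope".
```

Increasing `H` or adding layers changes which polygon is learned, mirroring the abstract's point that the network architecture regulates the species of polytope; replacing ReLU with a smooth activation yields the smooth analogues the authors call neural polytopes.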

