Log-sum-exp neural networks and posynomial models for convex and log-log-convex data

06/20/2018
by Giuseppe C. Calafiore, et al.

We show that a one-layer feedforward neural network with exponential activation functions in the inner layer and logarithmic activation in the output neuron is a universal approximator of convex functions. Such a network represents a family of scaled log-sum-exp functions, here named LSE_T, where T is the scaling (temperature) parameter. The proof uses a dequantization argument from tropical geometry. Under a suitable exponential transformation, the LSE_T class maps to a family of generalized posynomial functions, GPOS_T, which we also show to be universal approximators for log-log-convex functions. The key feature of the proposed approach is that, once an LSE_T network is trained on data, the resulting model is convex in its variables, which makes it readily amenable to efficient design based on convex optimization. Similarly, once a GPOS_T model is trained on data, it yields a posynomial model that can be efficiently optimized with respect to its variables by using Geometric Programming (GP). Many relevant phenomena in physics and engineering can indeed be modeled, either exactly or approximately, via convex or log-log-convex models. The proposed methodology is illustrated by two numerical examples in which LSE_T and GPOS_T models are first used to approximate data gathered from simulations of two physical processes (the vibration of a vehicle suspension system, and the peak power generated by the combustion of propane), and then to optimize these models.
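
As a concrete illustration (a minimal sketch, not code from the paper: the function names, the demo data, and the explicit parameterization LSE_T(x) = T log sum_j exp((a_j . x + b_j)/T) are assumptions on our part), the LSE_T forward pass and its posynomial counterpart under the exponential change of variables can be written in a few lines of NumPy:

```python
import numpy as np

def lse_t(x, A, b, T=1.0):
    """Scaled log-sum-exp model LSE_T: a one-layer network with
    exponential activations in the inner layer and a logarithmic
    output neuron,  f(x) = T * log(sum_j exp((a_j . x + b_j) / T)).
    A is (m, n) (one row per inner neuron), b is (m,), T > 0."""
    z = (A @ x + b) / T
    zmax = z.max()  # subtract the max to keep the exponentials stable
    return T * (zmax + np.log(np.exp(z - zmax).sum()))

def gpos_t(u, A, b, T=1.0):
    """Generalized posynomial GPOS_T obtained from LSE_T via the
    change of variables x = log(u), u > 0:
    g(u) = exp(f(log u)) = (sum_j exp(b_j/T) * prod_i u_i**(A[j,i]/T))**T."""
    return np.exp(lse_t(np.log(u), A, b, T))

rng = np.random.default_rng(0)
A, b, x = rng.normal(size=(5, 3)), rng.normal(size=5), rng.normal(size=3)

# Tropical (dequantization) limit: as T -> 0, LSE_T tends to the
# max-affine function max_j (a_j . x + b_j).
assert abs(lse_t(x, A, b, T=1e-6) - (A @ x + b).max()) < 1e-4

# Log-log correspondence: g(exp(x)) = exp(f(x)) for any T > 0.
assert np.isclose(gpos_t(np.exp(x), A, b, T=0.7),
                  np.exp(lse_t(x, A, b, T=0.7)))
```

Convexity of lse_t in x (and, correspondingly, log-log-convexity of gpos_t in u) is what makes a trained model of this form directly usable inside a convex program or a GP.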

Related research

05/21/2019
A Universal Approximation Result for Difference of log-sum-exp Neural Networks
We show that a neural network whose output is obtained as the difference...

12/10/2018
Disciplined Geometric Programming
We introduce log-log convex programs, which are optimization problems wi...

02/03/2016
A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks
We present the soft exponential activation function for artificial neura...

01/17/2022
Parametrized Convex Universal Approximators for Decision-Making Problems
Parametrized max-affine (PMA) and parametrized log-sum-exp (PLSE) networ...

01/20/2022
A Method of Sequential Log-Convex Programming for Engineering Design
A method of Sequential Log-Convex Programming (SLCP) is constructed that...

11/11/2022
Deep equilibrium models as estimators for continuous latent variables
Principal Component Analysis (PCA) and its exponential family extensions...

03/19/2022
Efficient Neural Network Analysis with Sum-of-Infeasibilities
Inspired by sum-of-infeasibilities methods in convex optimization, we pr...
