Deep Lattice Networks and Partial Monotonic Functions

09/19/2017
by Seungil You, et al.

We propose learning deep models that are monotonic with respect to a user-specified set of inputs by alternating layers of linear embeddings, ensembles of lattices, and calibrators (piecewise linear functions), with appropriate constraints for monotonicity, and jointly training the resulting network. We implement the layers and projections with new computational graph nodes in TensorFlow and use the ADAM optimizer and batched stochastic gradients. Experiments on benchmark and real-world datasets show that six-layer monotonic deep lattice networks achieve state-of-the-art performance for classification and regression with monotonicity guarantees.
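The layer types named above (calibrators, monotonic linear embeddings, lattices) map naturally onto Keras-style layers. Below is a minimal sketch, not the authors' released code, assuming the open-source TensorFlow Lattice package (tensorflow_lattice) with its PWLCalibration and Lattice layers; the feature names, keypoint counts, and lattice sizes are illustrative placeholders.

```python
# Minimal sketch: calibrators feeding a lattice, with monotonicity
# enforced on one user-specified input. Assumes the tensorflow_lattice
# package; names and hyperparameters here are placeholder assumptions.
import numpy as np
import tensorflow as tf
import tensorflow_lattice as tfl

# Two scalar inputs; only the first is constrained to be monotonic.
monotonic_in = tf.keras.Input(shape=(1,), name="monotonic_feature")
free_in = tf.keras.Input(shape=(1,), name="free_feature")

# Piecewise linear calibrators; the first is constrained to be increasing.
cal_mono = tfl.layers.PWLCalibration(
    input_keypoints=np.linspace(0.0, 1.0, num=10),
    output_min=0.0, output_max=1.0,
    monotonicity="increasing",
)(monotonic_in)
cal_free = tfl.layers.PWLCalibration(
    input_keypoints=np.linspace(0.0, 1.0, num=10),
    output_min=0.0, output_max=1.0,
)(free_in)

# A 2-D lattice, increasing in its first dimension only. Composing
# monotonic layers keeps the end-to-end function monotonic in the
# user-specified input.
output = tfl.layers.Lattice(
    lattice_sizes=[2, 2],
    monotonicities=["increasing", "none"],
)(tf.keras.layers.Concatenate()([cal_mono, cal_free]))

model = tf.keras.Model(inputs=[monotonic_in, free_in], outputs=output)
model.compile(loss="mse", optimizer=tf.keras.optimizers.Adam(0.01))
```

In this sketch the library enforces the monotonicity constraints internally during training; the paper itself describes implementing the layers and the constraint projections as new computational graph nodes in TensorFlow, trained with ADAM and batched stochastic gradients.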


Related research

11/20/2020 · Certified Monotonic Neural Networks
Learning monotonic models with respect to a subset of the inputs is a de...

05/24/2022 · Constrained Monotonic Neural Networks
Deep neural networks are becoming increasingly popular in approximating ...

09/24/2019 · Monotonic Trends in Deep Neural Networks
The importance of domain knowledge in enhancing model performance and ma...

05/30/2019 · Monotonic Gaussian Process Flow
We propose a new framework of imposing monotonicity constraints in a Bay...

06/16/2020 · Counterexample-Guided Learning of Monotonic Neural Networks
The widespread adoption of deep learning is often attributed to its auto...

11/28/2020 · Lattice Fusion Networks for Image Denoising
A novel method for feature fusion in convolutional neural networks is pr...

07/14/2023 · Expressive Monotonic Neural Networks
The monotonic dependence of the outputs of a neural network on some of i...
