Fixed Point Networks: Implicit Depth Models with Jacobian-Free Backprop

03/23/2021
by Samy Wu Fung, et al.

A growing trend in deep learning replaces fixed-depth models with approximations of the limit as network depth approaches infinity. This approach uses a portion of the network weights to prescribe behavior by defining a limit condition, which makes network depth implicit: it varies with the input data and an error tolerance. Moreover, existing implicit models can be implemented and trained with fixed memory costs, in exchange for additional computational costs. In particular, backpropagation through implicit depth models requires solving a Jacobian-based equation arising from the implicit function theorem. We propose fixed point networks (FPNs), a simple setup for implicit depth learning that guarantees convergence of forward propagation to a unique limit defined by the network weights and input data. Our key contribution is a new Jacobian-free backpropagation (JFB) scheme that circumvents the need to solve Jacobian-based equations while maintaining fixed memory costs. This makes FPNs much cheaper to train and easy to implement. Our numerical examples yield state-of-the-art classification results for implicit depth models and outperform corresponding explicit models.
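The contrast the abstract draws can be made concrete in a scalar toy model. The sketch below is illustrative only and is not the paper's actual FPN architecture: the map T, the weight w, and the bias b are assumptions chosen so that T is a contraction. The forward pass iterates T to its fixed point x* = T(x*); the implicit-function-theorem gradient requires solving a (here scalar) Jacobian-based equation, while the Jacobian-free alternative differentiates only the final application of T, treating the incoming fixed point as a constant.

```python
import math

def T(x, w, b):
    """One 'layer': a contraction map for |w| < 1, since |tanh'| <= 1."""
    return math.tanh(w * x + b)

def forward_fixed_point(w, b, x0=0.0, tol=1e-10, max_iter=1000):
    """Forward pass with implicit depth: iterate T until x* = T(x*)
    holds to within the error tolerance."""
    x = x0
    for _ in range(max_iter):
        x_new = T(x, w, b)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("fixed-point iteration did not converge")

def grad_implicit(x_star, w):
    """Exact dx*/dw via the implicit function theorem:
    dx*/dw = s * x* / (1 - s * w), where s = tanh'(w x* + b) = 1 - x*^2.
    In higher dimensions the (1 - s*w) division becomes a Jacobian solve."""
    s = 1.0 - x_star ** 2
    return s * x_star / (1.0 - s * w)

def grad_jfb(x_star):
    """Jacobian-free gradient: differentiate only the last application of T,
    treating the incoming fixed point x* as a constant. No solve needed."""
    s = 1.0 - x_star ** 2
    return s * x_star

w, b = 0.5, 1.0
x_star = forward_fixed_point(w, b)
print(f"x*            = {x_star:.6f}")
print(f"implicit grad = {grad_implicit(x_star, w):.6f}")
print(f"JFB grad      = {grad_jfb(x_star):.6f}")
```

The two gradients differ by the factor (1 - s*w)^{-1}, i.e. the inverse Jacobian term that JFB drops; the point of the paper's scheme is that the cheaper direction still suffices for training while keeping memory costs fixed.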


Related research

04/23/2023 · Efficient Training of Deep Equilibrium Models
Deep equilibrium models (DEQs) have proven to be very powerful for learn...

06/06/2021 · Robust Implicit Networks via Non-Euclidean Contractions
Implicit neural networks, a.k.a. deep equilibrium networks, are a class...

06/08/2023 · A Class of Smoothing Modulus-Based Iterative Method for Solving Implicit Complementarity Problems
In this paper, a class of smoothing modulus-based iterative method was p...

08/17/2019 · Implicit Deep Learning
We define a new class of "implicit" deep learning prediction rules that ...

08/16/2019 · Iterative Neural Networks with Bounded Weights
A recent analysis of a model of iterative neural network in Hilbert spac...

06/03/2021 · Convergent Graph Solvers
We propose the convergent graph solver (CGS), a deep learning method tha...

01/28/2022 · Mixing Implicit and Explicit Deep Learning with Skip DEQs and Infinite Time Neural ODEs (Continuous DEQs)
Implicit deep learning architectures, like Neural ODEs and Deep Equilibr...
