Iterative Neural Networks with Bounded Weights

08/16/2019
by Tomasz Piotrowski et al.

A recent analysis of a model of iterative neural networks in Hilbert spaces established fundamental properties of such networks, including the existence of fixed-point sets, convergence guarantees, and Lipschitz continuity. Building on these results, we show that a single mild condition on the weights of the network guarantees that the network converges to its unique fixed point. We provide a bound on the norm of this fixed point in terms of the norms of the weights and biases of the network. We also show why, under our assumption, this model of a feed-forward neural network cannot accommodate Hopfield networks.
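As a rough illustration of the kind of result described above (not the paper's actual construction), consider a single iterated layer x ↦ σ(Wx + b) with a 1-Lipschitz activation such as ReLU. If the spectral norm ‖W‖ < 1, the map is a contraction, so by Banach's fixed point theorem the iteration converges to a unique fixed point x*, and since ‖σ(z)‖ ≤ ‖z‖ one gets the bound ‖x*‖ ≤ ‖b‖ / (1 − ‖W‖). The sketch below checks both claims numerically; the 0.9 scaling factor and dimension are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Draw a random weight matrix and rescale it so its spectral norm is 0.9 < 1,
# making the layer map a contraction (assumed sufficient condition for this demo).
W = rng.standard_normal((n, n))
W *= 0.9 / np.linalg.norm(W, 2)
b = rng.standard_normal(n)

def layer(x):
    # One network iteration: x <- relu(W x + b). ReLU is 1-Lipschitz and relu(0) = 0.
    return np.maximum(W @ x + b, 0.0)

# Iterate from zero; the contraction factor 0.9 makes the error shrink geometrically.
x = np.zeros(n)
for _ in range(500):
    x = layer(x)

# x is (numerically) a fixed point, and its norm obeys ||x*|| <= ||b|| / (1 - ||W||).
residual = np.linalg.norm(layer(x) - x)
bound = np.linalg.norm(b) / (1.0 - np.linalg.norm(W, 2))
print(residual, np.linalg.norm(x), bound)
```

Starting the iteration from any other initial vector reaches the same fixed point, which is what uniqueness under the contraction condition predicts.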


Related research

09/15/2022 · Fixed-Point Centrality for Networks
This paper proposes a family of network centralities called fixed-point ...

02/23/2017 · A Converse to Banach's Fixed Point Theorem and its CLS Completeness
Banach's fixed point theorem for contraction maps has been widely used t...

06/03/2021 · Convergent Graph Solvers
We propose the convergent graph solver (CGS), a deep learning method tha...

06/30/2021 · Fixed points of monotonic and (weakly) scalable neural networks
We derive conditions for the existence of fixed points of neural network...

05/23/2018 · On the Relation of Impulse Propagation to Synaptic Strength
In neural network, synaptic strength could be seen as probability to tra...

03/23/2021 · Fixed Point Networks: Implicit Depth Models with Jacobian-Free Backprop
A growing trend in deep learning replaces fixed depth models by approxim...

03/23/2019 · Connections between spectral properties of asymptotic mappings and solutions to wireless network problems
In this study we establish connections between asymptotic functions and ...
