
Learning compositional functions via multiplicative weight updates

06/25/2020
by Jeremy Bernstein, et al.

Compositionality is a basic structural feature of both biological and artificial neural networks. Learning compositional functions via gradient descent incurs well-known problems such as vanishing and exploding gradients, making careful learning rate tuning essential for real-world applications. This paper proves that multiplicative weight updates satisfy a descent lemma tailored to compositional functions. Based on this lemma, we derive Madam, a multiplicative version of the Adam optimiser, and show that it can train state-of-the-art neural network architectures without learning rate tuning. We further show that Madam is easily adapted to train natively compressed neural networks by representing their weights in a logarithmic number system. We conclude by drawing connections between multiplicative weight updates and recent findings about synapses in biology.
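To make the core idea concrete, here is a minimal sketch of a multiplicative weight update in the spirit of Madam, written in plain NumPy. The function name and hyperparameter defaults are ours, and details of the published algorithm such as bias correction and weight clipping are omitted; this is an illustration of the shape of the update rule, not the reference implementation.

import numpy as np

def madam_style_step(w, grad, exp_avg_sq, lr=0.01, beta2=0.999, eps=1e-12):
    """One multiplicative update step (illustrative sketch, not the paper's exact algorithm)."""
    # Adam-style running estimate of the squared gradient.
    exp_avg_sq = beta2 * exp_avg_sq + (1.0 - beta2) * grad ** 2
    # Normalise the gradient to roughly unit scale (bias correction omitted for brevity).
    g_norm = grad / (np.sqrt(exp_avg_sq) + eps)
    # Multiplicative step: scale each weight by exp(-lr * sign(w) * g_norm), so its
    # magnitude changes by a bounded relative factor and its sign is preserved.
    return w * np.exp(-lr * np.sign(w) * g_norm), exp_avg_sq

# One illustrative step on the toy loss L(w) = 0.5 * ||w||^2, whose gradient is w itself.
w = np.array([0.5, -2.0])
state = np.zeros_like(w)
w, state = madam_style_step(w, grad=w.copy(), exp_avg_sq=state)

Because the step multiplies each weight, it is equivalent to an additive step on log|w|, which is why a logarithmic number system is a natural storage format for the natively compressed networks mentioned in the abstract.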


Related Research

08/07/2018 · Robust Implicit Backpropagation
Arguably the biggest challenge in applying neural networks is tuning the...

09/27/2020 · Faster Biological Gradient Descent Learning
Back-propagation is a popular machine learning algorithm that uses gradi...

01/09/2018 · Convergence Analysis of Gradient Descent Algorithms with Proportional Updates
The rise of deep learning in recent years has brought with it increasing...

10/24/2020 · Inductive Bias of Gradient Descent for Exponentially Weight Normalized Smooth Homogeneous Neural Nets
We analyze the inductive bias of gradient descent for weight normalized ...

02/24/2021 · Multiplicative Reweighting for Robust Neural Network Optimization
Deep neural networks are widespread due to their powerful performance. Y...

11/23/2020 · Natural-gradient learning for spiking neurons
In many normative theories of synaptic plasticity, weight updates implic...

11/16/2020 · Learning Associative Inference Using Fast Weight Memory
Humans can quickly associate stimuli to solve problems in novel contexts...

Code Repositories

madam

PyTorch and JAX code for the Madam optimiser (see the usage sketch after this list).



opt

Optimization Algorithms for Machine Learning with TensorFlow


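For orientation, here is a hypothetical usage pattern for the madam repository, assuming it exposes a torch.optim-style Madam class; the import path and constructor arguments below are assumptions of ours, not the repository's confirmed API.

import torch
import torch.nn.functional as F
from madam import Madam  # assumed import path, not confirmed from the repo

model = torch.nn.Linear(10, 2)
optimizer = Madam(model.parameters(), lr=0.01)  # assumed torch.optim-style constructor

# Standard torch.optim training step.
x, y = torch.randn(32, 10), torch.randn(32, 2)
optimizer.zero_grad()
loss = F.mse_loss(model(x), y)
loss.backward()
optimizer.step()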