Practical Gauss-Newton Optimisation for Deep Learning

06/12/2017
by Aleksandar Botev, et al.

We present an efficient block-diagonal approximation to the Gauss-Newton matrix for feedforward neural networks. Our resulting algorithm is competitive against state-of-the-art first-order optimisation methods, with sometimes significant improvement in optimisation performance. Unlike first-order methods, for which hyperparameter tuning of the optimisation parameters is often a laborious process, our approach can provide good performance even when used with default settings. A side result of our work is that for piecewise linear transfer functions, the network objective function can have no differentiable local maxima, which may partially explain why such transfer functions facilitate effective optimisation.
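To illustrate the idea of a per-layer (block-diagonal) Gauss-Newton step, here is a minimal NumPy sketch for a single linear layer with squared-error loss. All names and details are illustrative, not the paper's implementation; for squared error the output-space Hessian is the identity, so the layer's GN block reduces to J^T J, which for a linear layer has Kronecker structure.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((32, 4))   # inputs: 32 samples, 4 features
y = rng.standard_normal((32, 2))   # targets: 2 outputs
W = np.zeros((4, 2))               # weights of the single linear layer

def loss(W):
    return 0.5 * np.sum((X @ W - y) ** 2) / len(X)

for _ in range(20):
    r = X @ W - y                      # residuals
    g = (X.T @ r / len(X)).ravel()     # gradient w.r.t. vec(W), row-major
    # Per-layer GN block: for a linear layer the Jacobian of the outputs
    # w.r.t. vec(W) is Kronecker-structured, giving G = kron(X^T X / N, I).
    G = np.kron(X.T @ X / len(X), np.eye(2))
    lam = 1e-3                         # Tikhonov damping, as in practical GN
    step = np.linalg.solve(G + lam * np.eye(G.shape[0]), g)
    W = W - step.reshape(W.shape)
```

Because the loss here is quadratic, the damped GN step is essentially a full Newton step and the loop converges in very few iterations; in a multi-layer network one such block would be formed (and approximated) per layer rather than for the full parameter vector.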


research
09/10/2020

IEO: Intelligent Evolutionary Optimisation for Hyperparameter Tuning

Hyperparameter optimisation is a crucial process in searching the optima...
research
06/29/2023

Obeying the Order: Introducing Ordered Transfer Hyperparameter Optimisation

We introduce ordered transfer hyperparameter optimisation (OTHPO), a ver...
research
05/12/2019

Software System Design based on Patterns for Newton-Type Methods

A wide range of engineering applications uses optimisation techniques as...
research
01/25/2023

Optimisation of seismic imaging via bilevel learning

The implementation of Full Waveform Inversion (FWI) requires the a prior...
research
11/17/2016

Towards a Mathematical Understanding of the Difficulty in Learning with Feedforward Neural Networks

Training deep neural networks for solving machine learning problems is o...
research
06/25/2018

Predicting Effective Control Parameters for Differential Evolution using Cluster Analysis of Objective Function Features

A methodology is introduced which uses three simple objective function f...
research
10/18/2022

Optimisation Generalisation in Networks of Neurons

The goal of this thesis is to develop the optimisation and generalisatio...
