SkyNet: an efficient and robust neural network training tool for machine learning in astronomy

09/03/2013
by Philip Graff, et al.

We present the first public release of our generic neural network training algorithm, called SkyNet. This efficient and robust machine learning tool is able to train large and deep feed-forward neural networks, including autoencoders, for use in a wide range of supervised and unsupervised learning applications, such as regression, classification, density estimation, clustering and dimensionality reduction. SkyNet uses a 'pre-training' method to obtain a set of network parameters that has empirically been shown to be close to a good solution, followed by further optimisation using a regularised variant of Newton's method, where the level of regularisation is determined and adjusted automatically; the latter uses second-order derivative information to improve convergence, but without the need to evaluate or store the full Hessian matrix, by using a fast approximate method to calculate Hessian-vector products. This combination of methods allows for the training of complicated networks that are difficult to optimise using standard backpropagation techniques. SkyNet employs convergence criteria that naturally prevent overfitting, and also includes a fast algorithm for estimating the accuracy of network outputs. The utility and flexibility of SkyNet are demonstrated by application to a number of toy problems, and to astronomical problems focusing on the recovery of structure from blurred and noisy images, the identification of gamma-ray bursters, and the compression and denoising of galaxy images. The SkyNet software, which is implemented in standard ANSI C and fully parallelised using MPI, is available at http://www.mrao.cam.ac.uk/software/skynet/.

