Randomness In Neural Network Training: Characterizing The Impact of Tooling

06/22/2021
by Donglin Zhuang et al.

The quest for determinism in machine learning has disproportionately focused on characterizing the impact of noise introduced by algorithmic design choices. In this work, we address a less well understood and studied question: how does our choice of tooling introduce randomness into deep neural network training? We conduct large-scale experiments across different types of hardware, accelerators, state-of-the-art networks, and open-source datasets to characterize how tooling choices contribute to the level of non-determinism in a system, the impact of that non-determinism, and the cost of eliminating different sources of noise. Our findings are surprising and suggest that the impact of non-determinism is nuanced. While top-line metrics such as top-1 accuracy are not noticeably affected, model performance on certain parts of the data distribution is far more sensitive to the introduction of randomness. Our results suggest that deterministic tooling is critical for AI safety. However, we also find that the cost of ensuring determinism varies dramatically between neural network architectures and hardware types, e.g., with overheads of up to 746%, 241%, and 196% on a spectrum of widely used GPU accelerator architectures, relative to non-deterministic training. The source code used in this paper is available at https://github.com/usyd-fsalab/NeuralNetworkRandomness.
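A natural way to see the per-class sensitivity described above is to repeat training under identical settings and compare the spread of top-1 accuracy with the spread of per-class accuracy. The sketch below is a hypothetical illustration of that comparison, not code from the linked repository; the helper names `per_class_accuracy` and `variability_report` are ours, and the input `runs` is assumed to hold one predicted-label array per repeated run over a fixed test set.

```python
# Hypothetical sketch: compare run-to-run spread of top-line accuracy
# against run-to-run spread of per-class accuracy. Not the paper's code.
import numpy as np

def per_class_accuracy(preds: np.ndarray, labels: np.ndarray, num_classes: int) -> np.ndarray:
    """Accuracy restricted to each ground-truth class.
    Assumes every class appears at least once in `labels`."""
    return np.array([(preds[labels == c] == c).mean() for c in range(num_classes)])

def variability_report(runs: list, labels: np.ndarray, num_classes: int) -> None:
    """Summarize variability across repeated, identically configured runs."""
    top1 = np.array([(p == labels).mean() for p in runs])
    per_class = np.stack([per_class_accuracy(p, labels, num_classes) for p in runs])
    print(f"top-1 accuracy: mean={top1.mean():.4f}, std={top1.std():.4f}")
    # The abstract's observation: this spread can dwarf the top-line spread.
    print(f"largest per-class std: {per_class.std(axis=0).max():.4f}")

# Usage with synthetic predictions standing in for three training runs:
rng = np.random.default_rng(0)
labels = rng.integers(0, 10, size=1000)
runs = [np.where(rng.random(1000) < 0.9, labels, rng.integers(0, 10, size=1000))
        for _ in range(3)]
variability_report(runs, labels, num_classes=10)
```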
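In practice, "eliminating different sources of noise" at the tooling level means pinning framework-level randomness. The sketch below shows the standard PyTorch controls for this; it is a minimal, assumption-laden illustration rather than the paper's implementation (which lives in the repository above). The helper name `make_deterministic` and the seed value are hypothetical, while the individual mechanisms (`torch.manual_seed`, the cuDNN flags, `torch.use_deterministic_algorithms`, and the `CUBLAS_WORKSPACE_CONFIG` variable) are documented PyTorch/cuBLAS controls.

```python
# Hypothetical sketch (not from the paper's repository): the standard
# PyTorch switches for removing tooling-induced randomness.
import os
import random

import numpy as np
import torch

def make_deterministic(seed: int = 0) -> None:
    """Pin the tooling-level sources of randomness PyTorch exposes."""
    # Algorithmic seeds: Python, NumPy, and PyTorch.
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)  # also seeds all CUDA devices

    # cuDNN: disable autotuning (which selects kernels non-deterministically)
    # and request deterministic convolution algorithms.
    torch.backends.cudnn.benchmark = False
    torch.backends.cudnn.deterministic = True

    # Error out if any op lacks a deterministic implementation.
    torch.use_deterministic_algorithms(True)

    # Required by cuBLAS >= 10.2 for deterministic GEMM reductions;
    # set before the first cuBLAS call.
    os.environ["CUBLAS_WORKSPACE_CONFIG"] = ":4096:8"

    # For multi-worker data loading, also pass a seeded generator and a
    # worker_init_fn to torch.utils.data.DataLoader.

make_deterministic(seed=42)
```

These switches are also where the reported overheads come from: deterministic kernels give up cuDNN autotuning and atomics-based reductions, which helps explain why the measured cost of determinism varies so much across network architectures and GPU generations.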


