The Hidden Power of Pure 16-bit Floating-Point Neural Networks

01/30/2023
by Juyoung Yun et al.

Lowering the precision of neural networks below the prevalent 32-bit format has long been considered harmful to accuracy, despite the savings in memory and compute time. Many works propose techniques for implementing half-precision neural networks, but none study pure 16-bit settings. This paper investigates the unexpected performance gain of pure 16-bit neural networks over 32-bit networks on classification tasks. We present extensive experimental results comparing various 16-bit neural networks favorably against their 32-bit counterparts. In addition, we provide a theoretical analysis of the efficiency of 16-bit models, coupled with empirical evidence to support it. Finally, we discuss situations in which low-precision training is indeed detrimental.
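
To make the setting concrete, the following is a minimal sketch (in NumPy; not the authors' code, and the toy data and layer sizes are illustrative assumptions) of what "pure 16-bit" training means: every tensor involved -- weights, activations, gradients, and updates -- is kept in IEEE float16, with no 32-bit master copy of the weights as in mixed-precision schemes.

import numpy as np

# Sketch of pure 16-bit training: a one-hidden-layer classifier where
# every array (weights, activations, gradients, updates) is IEEE float16.
rng = np.random.default_rng(0)
f16 = np.float16

# Toy data standing in for a real classification set (e.g. MNIST).
X = rng.standard_normal((64, 20)).astype(f16)
y = rng.integers(0, 3, size=64)            # 3 classes
Y = np.eye(3, dtype=f16)[y]                # one-hot targets, float16

W1 = (rng.standard_normal((20, 32)) * 0.1).astype(f16)
W2 = (rng.standard_normal((32, 3)) * 0.1).astype(f16)
lr = f16(0.1)

for step in range(100):
    # Forward pass, entirely in float16.
    h = np.maximum(X @ W1, f16(0))                       # ReLU activations
    logits = h @ W2
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)                    # softmax, float16

    # Backward pass (softmax + cross-entropy), still float16 throughout.
    d_logits = (p - Y) / f16(len(X))
    dW2 = h.T @ d_logits
    dh = d_logits @ W2.T
    dh[h <= 0] = f16(0)                                  # ReLU gradient mask
    dW1 = X.T @ dh

    # Parameter updates happen in float16 as well: no float32 master weights.
    W1 -= lr * dW1
    W2 -= lr * dW2

assert W1.dtype == np.float16 and W2.dtype == np.float16

Mixed-precision training would instead keep W1 and W2 in float32 and cast them to float16 only for the forward and backward passes; the pure 16-bit setting studied in the paper removes that master copy entirely.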


