Anytime Neural Network: a Versatile Trade-off Between Computation and Accuracy

08/22/2017
by   Hanzhang Hu, et al.

Anytime predictors first produce crude results quickly, and then continuously refine them until the test-time computational budget is depleted. Such predictors are used in real-time vision systems and streaming-data processing to efficiently utilize varying test-time budgets, and to reduce average prediction cost via early exits. However, anytime prediction algorithms have difficulty utilizing the accurate predictions of deep neural networks (DNNs), because DNNs are often computationally expensive without competitive intermediate results. In this work, we propose to add auxiliary predictions in DNNs to generate anytime predictions, and to optimize these predictions jointly by minimizing a carefully constructed weighted sum of losses, where the weights also oscillate during training. The proposed anytime neural networks (ANNs) produce reasonable anytime predictions without sacrificing final performance or incurring noticeable extra computation. This enables us to assemble a sequence of exponentially deepening ANNs, which achieves, both theoretically and practically, near-optimal anytime predictions at every budget after spending a constant fraction of extra cost. The proposed methods are shown to produce anytime predictions at the state-of-the-art level on visual recognition datasets, including ILSVRC2012.
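The two ideas the abstract describes — refine a prediction stage by stage until the budget runs out, and weight the auxiliary losses with a schedule that oscillates across training steps — can be sketched in miniature. This is an illustrative pure-Python sketch under our own assumptions, not the paper's implementation; the names `AnytimePredictor` and `oscillating_weights`, and the exact weight schedule, are hypothetical.

```python
def oscillating_weights(num_exits, step):
    """Hypothetical oscillating loss-weight schedule: base weights grow
    toward the final exit (so final accuracy is not sacrificed), and at
    each training step one exit, chosen in rotation, is emphasized."""
    base = [1.0 / (num_exits - i) for i in range(num_exits)]
    boosted = step % num_exits  # exit emphasized at this step
    return [w * (2.0 if i == boosted else 1.0) for i, w in enumerate(base)]

class AnytimePredictor:
    """Toy anytime predictor: a chain of refinement stages, each taking
    the current prediction and returning an improved one."""

    def __init__(self, stages):
        self.stages = stages

    def predict(self, x, budget):
        # Produce a crude result immediately, then refine until the
        # budget (here, a stage count) is depleted; always return the
        # latest available prediction.
        pred = x
        for stage in self.stages[:budget]:
            pred = stage(pred)
        return pred

# Toy usage: each "stage" halves the remaining error toward the target 1.0.
stages = [lambda p: p + 0.5 * (1.0 - p)] * 4
model = AnytimePredictor(stages)
print(model.predict(0.0, budget=2))  # crude estimate: 0.75
print(model.predict(0.0, budget=4))  # refined estimate: 0.9375
```

The interruptible loop is the essence of anytime inference: whenever the budget expires, the most recent prediction is the answer, and more budget strictly refines it.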

