Consistency Training of Multi-exit Architectures for Sensor Data

09/27/2021
by   Aaqib Saeed, et al.

Deep neural networks have grown larger over the years, increasing the computational resources required for inference; this incurs steep costs and leaves little room for deployment on devices with limited battery and compute for real-time applications. Multi-exit architectures are deep neural networks interleaved with several output (or exit) layers at varying depths of the model. They offer a sound approach to improving the computation time and energy use of running a model by producing predictions from early exits. In this work, we present a novel, architecture-agnostic approach for robust training of multi-exit architectures, termed consistent exit training. The crux of the method lies in a consistency-based objective that enforces prediction invariance over clean and perturbed inputs. We leverage weak supervision to align model outputs with consistency training and jointly optimize the dual losses in a multi-task learning fashion over the exits of a network. Our technique enables exit layers to generalize better when confronted with increasing uncertainty, resulting in superior quality-efficiency trade-offs. Through extensive evaluation on challenging learning tasks involving sensor data, we demonstrate that our approach allows examples to exit earlier, with a better detection rate, without executing all the layers of the deep model.
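The training objective described above — a supervised loss at every exit plus a consistency term between predictions on clean and perturbed inputs — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the two linear heads (`W1`, `W2`), the KL-based consistency term, the weighting `lam`, and the confidence-threshold exit rule are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl(p, q, eps=1e-9):
    # KL divergence between per-example categorical distributions
    return np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)

# Toy two-exit "network": each exit is a linear classifier head, standing in
# for heads attached at different depths of a deep model.
W1 = rng.normal(size=(8, 3))  # early exit head (hypothetical)
W2 = rng.normal(size=(8, 3))  # final exit head (hypothetical)

def exit_logits(x):
    return [x @ W1, x @ W2]   # one set of logits per exit

def consistent_exit_loss(x_clean, x_perturbed, y, lam=1.0):
    """Sum, over all exits, of supervised cross-entropy on the clean input
    plus a KL consistency term between clean and perturbed predictions."""
    loss = 0.0
    for z_c, z_p in zip(exit_logits(x_clean), exit_logits(x_perturbed)):
        p_c, p_p = softmax(z_c), softmax(z_p)
        ce = -np.mean(np.log(p_c[np.arange(len(y)), y] + 1e-9))
        loss += ce + lam * np.mean(kl(p_c, p_p))
    return loss

def early_exit_predict(x, threshold=0.9):
    """At inference, stop at the first exit whose max softmax confidence
    clears the threshold; fall through to the last exit otherwise."""
    probs = [softmax(z) for z in exit_logits(x)]
    preds = np.full(len(x), -1)
    exits_used = np.zeros(len(x), dtype=int)
    for i in range(len(x)):
        for k, p in enumerate(probs):
            if p[i].max() >= threshold or k == len(probs) - 1:
                preds[i] = int(p[i].argmax())
                exits_used[i] = k
                break
    return preds, exits_used

x = rng.normal(size=(4, 8))
x_noisy = x + 0.05 * rng.normal(size=x.shape)
y = np.array([0, 1, 2, 1])
print(consistent_exit_loss(x, x_noisy, y))
print(early_exit_predict(x))
```

Because the KL term is non-negative and zero only when the two prediction distributions match, minimizing this joint loss pushes every exit toward predictions that are both accurate and invariant to input perturbation, which is the property that lets confident examples leave the network early.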

Related research

- 06/17/2022 · Binary Early-Exit Network for Adaptive Inference on Low-Resource Devices
  Deep neural networks have significantly improved performance on a range ...

- 11/28/2021 · Cross-Task Consistency Learning Framework for Multi-Task Learning
  Multi-task learning (MTL) is an active field in deep learning in which w...

- 02/13/2023 · SubTuning: Efficient Finetuning for Multi-Task Learning
  Finetuning a pretrained model has become a standard approach for trainin...

- 06/07/2020 · Robust Learning Through Cross-Task Consistency
  Visual perception entails solving a wide set of tasks, e.g., object dete...

- 09/06/2017 · BranchyNet: Fast Inference via Early Exiting from Deep Neural Networks
  Deep neural networks are state of the art methods for many learning task...

- 08/05/2020 · DANA: Dimension-Adaptive Neural Architecture for Multivariate Sensor Data
  Current deep neural architectures for processing sensor data are mainly ...

- 12/02/2018 · Neural Rejuvenation: Improving Deep Network Training by Enhancing Computational Resource Utilization
  In this paper, we study the problem of improving computational resource ...
