Towards Regression-Free Neural Networks for Diverse Compute Platforms

09/27/2022
by Rahul Duggal, et al.

With the shift towards on-device deep learning, ensuring consistent behavior of an AI service across diverse compute platforms becomes tremendously important. Our work tackles the emergent problem of reducing predictive inconsistencies that arise as negative flips: test samples that are correctly predicted by a less accurate model but incorrectly by a more accurate one. We introduce REGression-constrained Neural Architecture Search (REG-NAS) to design a family of highly accurate models that incur fewer negative flips. REG-NAS consists of two components: (1) a novel architecture constraint that enables a larger model to contain all the weights of the smaller one, thus maximizing weight sharing. This idea stems from our observation that greater weight sharing among networks leads to more similar sample-wise predictions and fewer negative flips; (2) a novel search reward that incorporates both Top-1 accuracy and negative flips in the architecture search metric. We demonstrate that REG-NAS can successfully find desirable architectures with few negative flips in three popular architecture search spaces. Compared to the existing state-of-the-art approach, REG-NAS enables a 33-48% relative reduction in negative flips.
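To make the negative-flip notion concrete, here is a minimal sketch of how one might measure it: the negative flip rate is simply the fraction of test samples the older (less accurate) model classifies correctly but the newer model gets wrong. The function name and the toy arrays below are illustrative, not from the paper; the composite search reward mentioned in the abstract is not specified here, so only the flip metric is shown.

```python
import numpy as np

def negative_flip_rate(old_preds, new_preds, labels):
    """Fraction of samples correct under the old model but wrong under the new one."""
    old_preds, new_preds, labels = map(np.asarray, (old_preds, new_preds, labels))
    neg_flips = (old_preds == labels) & (new_preds != labels)
    return neg_flips.mean()

# Toy example: both models reach 4/5 Top-1 accuracy,
# yet sample 2 flips from correct (old) to incorrect (new).
labels    = np.array([0, 1, 2, 3, 4])
old_preds = np.array([0, 1, 2, 0, 4])
new_preds = np.array([0, 1, 0, 3, 4])
print(negative_flip_rate(old_preds, new_preds, labels))  # 0.2
```

The toy run illustrates the paper's core point: equal (or even higher) aggregate accuracy does not rule out sample-wise regressions, which is why the search reward must account for flips in addition to Top-1 accuracy.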


Related research

08/31/2019 - HM-NAS: Efficient Neural Architecture Search via Hierarchical Masking
The use of automatic methods, often referred to as Neural Architecture S...

02/21/2019 - Evaluating the Search Phase of Neural Architecture Search
Neural Architecture Search (NAS) aims to facilitate the design of deep n...

07/22/2019 - Efficient Novelty-Driven Neural Architecture Search
One-Shot Neural architecture search (NAS) attracts broad attention recen...

08/14/2021 - FOX-NAS: Fast, On-device and Explainable Neural Architecture Search
Neural architecture search can discover neural networks with good perfor...

08/22/2021 - Pi-NAS: Improving Neural Architecture Search by Reducing Supernet Training Consistency Shift
Recently proposed neural architecture search (NAS) methods co-train bill...

06/13/2020 - Optimal Transport Kernels for Sequential and Parallel Neural Architecture Search
Neural architecture search (NAS) automates the design of deep neural net...

03/18/2023 - Weight-sharing Supernet for Searching Specialized Acoustic Event Classification Networks Across Device Constraints
Acoustic Event Classification (AEC) has been widely used in devices such...
