MemNet: Memory-Efficiency Guided Neural Architecture Search with Augment-Trim learning

07/22/2019
by   Peiye Liu, et al.

Recent studies on automatic neural architecture search have demonstrated significant performance, competitive with or even better than hand-crafted neural architectures. However, most existing network architectures tend to use residual and parallel structures, as well as concatenation blocks between shallow and deep features, to construct a large network. This requires large amounts of memory for storing both weights and feature maps, which is challenging for mobile and embedded devices that may not have enough memory to perform inference with such a large network model. To close this gap, we propose MemNet, an augment-trim learning-based neural architecture search framework that optimizes not only performance but also memory requirements. Specifically, it employs a memory-consumption-based ranking score that enforces an upper bound on memory consumption to guide the search process. Experimental results show that, compared to state-of-the-art efficient design methods, MemNet can find an architecture that achieves competitive accuracy while saving an average of 24.17% memory.
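The core idea of the abstract can be illustrated with a small sketch: estimate a candidate architecture's inference memory (weights plus feature maps) and use a ranking score that rejects candidates exceeding a hard memory budget. The function names, the memory model, and the accuracy/memory trade-off formula below are illustrative assumptions, not MemNet's actual equations.

```python
def estimate_memory_mb(weight_counts, feature_map_sizes, bytes_per_value=4):
    """Rough inference-memory estimate for a candidate architecture.

    weight_counts: number of parameters in each layer.
    feature_map_sizes: number of values in each layer's output feature map.
    """
    weight_bytes = sum(weight_counts) * bytes_per_value
    # During sequential inference, feature maps can often be freed layer by
    # layer, so peak activation memory is approximated by the largest map.
    feature_bytes = max(feature_map_sizes) * bytes_per_value
    return (weight_bytes + feature_bytes) / (1024 ** 2)


def rank_candidate(accuracy, memory_mb, memory_budget_mb, penalty=0.1):
    """Score a candidate architecture (higher is better).

    Candidates over the memory budget get a score of -inf, which enforces
    a hard upper bound on memory consumption during the search.
    """
    if memory_mb > memory_budget_mb:
        return float("-inf")
    # Otherwise trade accuracy off against relative memory use.
    return accuracy - penalty * (memory_mb / memory_budget_mb)
```

A search loop would then score each sampled candidate with `rank_candidate` and keep only the top-ranked, in-budget architectures for the next round of augmenting and trimming.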


Related research:

- MONAS: Multi-Objective Neural Architecture Search using Reinforcement Learning (06/27/2018): Recent studies on neural architecture search have shown that automatical...
- Enhanced Gradient for Differentiable Architecture Search (03/23/2021): In recent years, neural architecture search (NAS) methods have been prop...
- Neural Architecture Search for Energy Efficient Always-on Audio Models (02/09/2022): Mobile and edge computing devices for always-on audio classification req...
- EmProx: Neural Network Performance Estimation For Neural Architecture Search (06/13/2022): Common Neural Architecture Search methods generate large amounts of cand...
- Binarized ResNet: Enabling Automatic Modulation Classification at the resource-constrained Edge (10/27/2021): In this paper, we propose a ResNet based neural architecture to solve th...
- RADARS: Memory Efficient Reinforcement Learning Aided Differentiable Neural Architecture Search (09/13/2021): Differentiable neural architecture search (DNAS) is known for its capaci...
- BlockQNN: Efficient Block-wise Neural Network Architecture Generation (08/16/2018): Convolutional neural networks have gained a remarkable success in comput...
