aw_nas: A Modularized and Extensible NAS framework

11/25/2020
by Xuefei Ning, et al.

Neural Architecture Search (NAS) has received extensive attention due to its capability to discover neural network architectures in an automated manner. aw_nas is an open-source Python framework implementing various NAS algorithms in a modularized manner. Currently, aw_nas can be used to reproduce the results of mainstream NAS algorithms of various types. Also, due to the modularized design, one can easily experiment with different NAS algorithms for various applications with aw_nas (e.g., classification, detection, text modeling, fault tolerance, adversarial robustness, and hardware efficiency). Code and documentation are available at https://github.com/walkerning/aw_nas.
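To make the idea of a modularized NAS framework concrete, below is a minimal illustrative sketch of how such a system can be decomposed into interchangeable components (a search space, a controller that proposes architectures, and an evaluator that scores them). The class and method names here are assumptions chosen for illustration; they are not the actual aw_nas API.

```python
# Illustrative sketch of a modular NAS workflow (NOT the aw_nas API):
# a controller proposes architectures from a search space and an
# evaluator scores them; the search loop only touches these interfaces.
import random
from dataclasses import dataclass


@dataclass
class Architecture:
    # A toy architecture: one operation chosen per layer.
    ops: tuple


class SearchSpace:
    """Defines which architectures exist (here: 4 layers, 3 candidate ops)."""
    CANDIDATE_OPS = ("conv3x3", "conv5x5", "skip_connect")

    def random_sample(self):
        return Architecture(
            ops=tuple(random.choice(self.CANDIDATE_OPS) for _ in range(4))
        )


class RandomController:
    """Proposes architectures; a real controller could be RL- or gradient-based."""
    def __init__(self, search_space):
        self.search_space = search_space

    def sample(self):
        return self.search_space.random_sample()


class DummyEvaluator:
    """Scores architectures; a real evaluator would train and validate a network."""
    def evaluate(self, arch):
        # Toy proxy score: prefer architectures with fewer skip connections.
        return 1.0 - arch.ops.count("skip_connect") / len(arch.ops)


def search(controller, evaluator, n_rounds=20):
    """Generic search loop: works with any controller/evaluator pair."""
    best_arch, best_score = None, float("-inf")
    for _ in range(n_rounds):
        arch = controller.sample()
        score = evaluator.evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score


if __name__ == "__main__":
    space = SearchSpace()
    arch, score = search(RandomController(space), DummyEvaluator())
    print(f"best architecture: {arch.ops}, score: {score:.2f}")
```

Because the search loop depends only on the component interfaces, swapping in a different controller (e.g., evolutionary or differentiable) or a different evaluator (e.g., weight-sharing or hardware-cost-aware) does not require changing the loop itself, which is the kind of flexibility a modularized design aims for.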

