AugShuffleNet: Improve ShuffleNetV2 via More Information Communication

03/13/2022
by Longqing Ye, et al.

Based on ShuffleNetV2, we build a more powerful and efficient model family, termed AugShuffleNet, by introducing more frequent cross-layer information communication for better model performance. Evaluated on the CIFAR-10 and CIFAR-100 datasets, AugShuffleNet consistently outperforms ShuffleNetV2 in accuracy, at lower computational cost and with fewer parameters.
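The cross-layer information communication the abstract refers to builds on ShuffleNetV2's channel split and channel shuffle mechanism. The sketch below is a minimal PyTorch illustration of that baseline stride-1 unit, assuming the standard ShuffleNetV2 design; it is not the authors' AugShuffleNet block, and the names channel_shuffle and ShuffleV2Block are illustrative only.

```python
import torch
import torch.nn as nn

def channel_shuffle(x: torch.Tensor, groups: int) -> torch.Tensor:
    """Interleave channels across groups so information mixes between branches."""
    n, c, h, w = x.size()
    x = x.view(n, groups, c // groups, h, w)   # split channels into groups
    x = x.transpose(1, 2).contiguous()         # swap group and per-group channel axes
    return x.view(n, c, h, w)                  # flatten back to (N, C, H, W)

class ShuffleV2Block(nn.Module):
    """Stride-1 ShuffleNetV2 unit: split channels, transform one half, shuffle.
    Illustrative baseline only; AugShuffleNet modifies how often halves communicate."""
    def __init__(self, channels: int):
        super().__init__()
        half = channels // 2
        self.branch = nn.Sequential(
            nn.Conv2d(half, half, 1, bias=False),            # pointwise
            nn.BatchNorm2d(half),
            nn.ReLU(inplace=True),
            nn.Conv2d(half, half, 3, padding=1, groups=half, bias=False),  # depthwise
            nn.BatchNorm2d(half),
            nn.Conv2d(half, half, 1, bias=False),            # pointwise
            nn.BatchNorm2d(half),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        left, right = x.chunk(2, dim=1)          # channel split
        out = torch.cat((left, self.branch(right)), dim=1)
        return channel_shuffle(out, groups=2)    # cross-branch communication

if __name__ == "__main__":
    # Usage on a CIFAR-sized input (32x32), matching the evaluation setting above.
    block = ShuffleV2Block(channels=64)
    y = block(torch.randn(1, 64, 32, 32))
    print(y.shape)  # torch.Size([1, 64, 32, 32])
```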
