Dynamic ReLU

03/22/2020
by Yinpeng Chen, et al.

Rectified linear units (ReLU) are commonly used in deep neural networks. So far, ReLU and its generalizations (either non-parametric or parametric) have been static, performing identically for all input samples. In this paper, we propose Dynamic ReLU (DY-ReLU), a dynamic rectifier whose parameters are generated by a hyper function over all input elements. The key insight is that DY-ReLU encodes the global context into the hyper function and adapts the piecewise linear activation function accordingly. Compared to its static counterpart, DY-ReLU has negligible extra computational cost but significantly more representation capability, especially for lightweight neural networks. By simply using DY-ReLU in MobileNetV2, the top-1 accuracy on ImageNet classification is boosted from 72.0% to 76.2% with only 5% additional FLOPs.
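Concretely, the paper formulates DY-ReLU as a maximum over K input-dependent linear functions, y_c = max_k (a_k^c(x) * x_c + b_k^c(x)), whose coefficients are produced by a lightweight hyper network over globally pooled features. Below is a minimal sketch of the channel-wise variant (DY-ReLU-B), assuming PyTorch; the class name, reduction ratio, and the residual scales lambda_a and lambda_b are illustrative assumptions rather than the paper's exact configuration.

```python
# Minimal sketch of a DY-ReLU-B-style layer, assuming PyTorch.
# Hyperparameters below (reduction, k, lambda scales) are illustrative.
import torch
import torch.nn as nn


class DyReLUB(nn.Module):
    """Channel-wise dynamic ReLU: y = max_k(a_k(x) * x + b_k(x))."""

    def __init__(self, channels: int, reduction: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        # Hyper function: global context -> 2*k coefficients per channel.
        self.hyper = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, 2 * k * channels),
            nn.Sigmoid(),  # squash to [0, 1]; rescaled to [-1, 1] below
        )
        # Initialization around a standard ReLU: slopes (1, 0, ...), intercepts 0.
        self.register_buffer("init_a", torch.tensor([1.0] + [0.0] * (k - 1)))
        self.register_buffer("init_b", torch.zeros(k))
        self.lambda_a = 1.0  # scale of the learned residual on the slopes
        self.lambda_b = 0.5  # scale of the learned residual on the intercepts

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (N, C, H, W)
        n, c = x.shape[:2]
        context = x.mean(dim=(2, 3))             # global average pooling: (N, C)
        theta = 2.0 * self.hyper(context) - 1.0  # (N, 2*k*C), rescaled to [-1, 1]
        theta = theta.view(n, c, 2 * self.k)
        a = self.init_a + self.lambda_a * theta[..., : self.k]  # (N, C, k)
        b = self.init_b + self.lambda_b * theta[..., self.k :]  # (N, C, k)
        # Evaluate the k input-dependent lines and take their pointwise maximum.
        lines = (x.unsqueeze(-1) * a.reshape(n, c, 1, 1, self.k)
                 + b.reshape(n, c, 1, 1, self.k))
        return lines.max(dim=-1).values
```

For a 64-channel feature map, `DyReLUB(64)(torch.randn(2, 64, 32, 32))` returns a tensor of the same shape; with k = 2 the overhead is one small two-layer MLP per activation layer, in line with the abstract's claim of negligible extra cost.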


Related research

03/22/2018 · Deep Learning using Rectified Linear Units (ReLU)
We introduce the use of rectified linear units (ReLU) as the classificat...

08/02/2021 · Piecewise Linear Units Improve Deep Neural Networks
The activation function is at the heart of a deep neural network's nonlin...

06/01/2023 · Learning Prescriptive ReLU Networks
We study the problem of learning optimal policy from a set of discrete t...

07/26/2022 · One Simple Trick to Fix Your Bayesian Neural Network
One of the most popular estimation methods in Bayesian neural networks (...

10/23/2021 · Parametric Variational Linear Units (PVLUs) in Deep Convolutional Networks
The Rectified Linear Unit is currently a state-of-the-art activation fun...

05/20/2020 · ReLU Code Space: A Basis for Rating Network Quality Besides Accuracy
We propose a new metric space of ReLU activation codes equipped with a t...

05/13/2020 · The effect of Target Normalization and Momentum on Dying ReLU
Optimizing parameters with momentum, normalizing data values, and using ...
