Parametric Variational Linear Units (PVLUs) in Deep Convolutional Networks

10/23/2021
by Aarush Gupta, et al.

The Rectified Linear Unit (ReLU) is currently a state-of-the-art activation function in deep convolutional neural networks. To combat ReLU's dying neuron problem, we propose the Parametric Variational Linear Unit (PVLU), which adds a sinusoidal function with trainable coefficients to ReLU. Along with introducing nonlinearity and non-zero gradients across the entire real domain, PVLU allows for increased model generalization and robustness when implemented in the context of transfer learning. On a simple, non-transfer sequential CNN, PVLU led to a relative error decrease of 16.3% without data augmentation, relative to ReLU. PVLU is also tested on transfer learning problems. The VGG-16 and VGG-19 models experience relative error reductions of 9.5% and 10.7%, respectively, after substituting ReLU with PVLU. When training on Gaussian-filtered CIFAR-10 images, similar improvements are noted for the VGG models. Most notably, PVLU fine-tuning allows for relative error reductions up to and exceeding 10% on ResNet models for both CIFAR-10 and CIFAR-100.
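
The abstract describes PVLU only at a high level (a sinusoid with trainable coefficients added to ReLU), so the following is a minimal PyTorch sketch of one plausible formulation, f(x) = max(0, x) + a * sin(b * x) with learnable scalars a and b. The class name PVLU, the scalar (rather than per-channel) parameterization, and the initial coefficient values are assumptions for illustration, not the authors' reference implementation.

    import torch
    import torch.nn as nn


    class PVLU(nn.Module):
        """Sketch of a Parametric Variational Linear Unit.

        Assumes the form f(x) = relu(x) + a * sin(b * x), where the
        coefficients a and b are trainable. Initial values below are
        illustrative, not taken from the paper.
        """

        def __init__(self, init_a: float = 0.1, init_b: float = 1.0):
            super().__init__()
            # Trainable coefficients of the sinusoidal term.
            self.a = nn.Parameter(torch.tensor(init_a))
            self.b = nn.Parameter(torch.tensor(init_b))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # ReLU plus a sinusoid yields non-zero gradients for x < 0,
            # which is how this form avoids the dying-neuron problem.
            return torch.relu(x) + self.a * torch.sin(self.b * x)


    if __name__ == "__main__":
        # Used as a drop-in replacement for nn.ReLU in a small sequential CNN.
        model = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            PVLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            PVLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, 10),
        )
        out = model(torch.randn(4, 3, 32, 32))
        print(out.shape)  # torch.Size([4, 10])

In this sketch the sinusoidal coefficients are shared across the whole layer; a per-channel variant would simply give a and b a shape matching the channel dimension.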
