Average Biased ReLU Based CNN Descriptor for Improved Face Retrieval

04/02/2018
by Shiv Ram Dubey, et al.

Convolutional neural networks (CNNs) such as AlexNet, GoogleNet, and VGGNet have proven to be highly discriminative feature descriptors for many computer vision problems. A CNN model trained on one dataset performs reasonably well on another dataset of a similar type and outperforms hand-designed feature descriptors. The Rectified Linear Unit (ReLU) layer discards some information in order to introduce non-linearity. In this paper, it is proposed that the discriminative ability of the deep image representation obtained from a trained model can be improved by applying Average Biased ReLU (AB-ReLU) at the last few layers. AB-ReLU improves the discriminative ability in two ways: 1) it exploits some of the discriminative negative information that ReLU discards, and 2) it suppresses irrelevant positive information that ReLU passes through. The VGGFace model, pretrained in MatConvNet on the VGG-Face dataset, is used as the feature descriptor for face retrieval on other face datasets. The proposed approach is tested on six challenging, unconstrained and robust face datasets, including PubFig, LFW, PaSC, and AR, in a retrieval framework. It is observed that AB-ReLU consistently performs better than ReLU when used with the VGGFace pretrained model on these face datasets.
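The idea described above can be illustrated with a minimal NumPy sketch. Assuming that the "average bias" is the mean of the input feature vector (the exact formulation in the paper may differ, e.g. in how the bias is computed per channel), AB-ReLU thresholds each activation at that mean instead of at zero, so negative values above a negative mean survive while positive values below a positive mean are suppressed:

```python
import numpy as np

def ab_relu(x):
    """Average Biased ReLU (sketch): threshold activations at the mean
    of the input instead of zero.  Unlike standard ReLU, discriminative
    negative activations above a negative mean are retained, and weak
    positive activations below a positive mean are suppressed."""
    beta = x.mean()  # average bias of the input features (assumption)
    return np.where(x - beta > 0.0, x, 0.0)

def relu(x):
    """Standard ReLU for comparison."""
    return np.maximum(x, 0.0)
```

For example, with features `[-1, -5, -6, 2]` the mean is `-2.5`, so `ab_relu` keeps the negative activation `-1` (it lies above the mean) while standard ReLU would zero it out.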
