Understanding Unconventional Preprocessors in Deep Convolutional Neural Networks for Face Identification

03/27/2019
by   Chollette C. Olisah, et al.

Deep networks have achieved huge successes in application domains like object and face recognition. The performance gain is attributed to different facets of the network architecture, such as the depth of the convolutional layers, the activation function, pooling, batch normalization, forward and back propagation, and many more. However, very little emphasis is placed on the preprocessors. Therefore, in this paper, the network's preprocessing module is varied across different preprocessing approaches while keeping the other facets of the network architecture constant, to investigate the contribution preprocessing makes to the network. The commonly used preprocessors are data augmentation and normalization, and are termed conventional preprocessors. The others are termed unconventional preprocessors; they are: color space converters (HSV, CIE L*a*b*, and YCbCr); grey-level resolution preprocessors (full-based and plane-based image quantization); and illumination normalization and illumination-insensitive feature preprocessing using histogram equalization (HE), local contrast normalization (LN), and complete face structural pattern (CFSP). To achieve fixed network parameters, a CNN with transfer learning is employed: knowledge from the high-level feature vectors of the Inception-V3 network is transferred to offline-preprocessed LFW target data, and the features are trained with a softmax classifier for face identification. The experiments show that the discriminative capability of deep networks can be improved by preprocessing RGB data with the HE, full-based and plane-based quantization, rgbGELog, and YCbCr preprocessors before feeding it to the CNN. However, for best performance, the right setup of preprocessed data with augmentation and/or normalization is required. The plane-based image quantization is found to increase the homogeneity of neighborhood pixels while utilizing a reduced bit depth for better storage efficiency.
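Two of the preprocessors named above, histogram equalization and plane-based quantization, can be sketched in a few lines of NumPy. This is a minimal illustration of the general techniques, not the authors' exact pipeline: the bit depth, the per-plane independence, and the toy inputs below are assumptions made for the example.

```python
import numpy as np

def histogram_equalization(channel):
    """Equalize an 8-bit single-channel image via the CDF of its histogram."""
    hist = np.bincount(channel.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    # Map intensities so the cumulative distribution becomes roughly uniform,
    # stretching the used range toward the full 0..255 span.
    lut = np.round((cdf - cdf_min) / (channel.size - cdf_min) * 255).astype(np.uint8)
    return lut[channel]

def plane_based_quantization(rgb, bits=4):
    """Quantize each color plane independently to the given bit depth.

    Neighboring pixels collapse onto shared grey levels, which increases
    local homogeneity and reduces the number of distinct values to store.
    """
    step = 256 // (2 ** bits)
    return (rgb // step) * step

# Toy inputs: a low-contrast grayscale ramp and a random RGB patch.
rng = np.random.default_rng(0)
gray = (np.arange(16, dtype=np.uint8).reshape(4, 4)) * 16
rgb = rng.integers(0, 256, size=(4, 4, 3), dtype=np.uint8)

eq = histogram_equalization(gray)
q = plane_based_quantization(rgb, bits=4)
print(eq.min(), eq.max())      # equalization stretches values to the full range
print(np.unique(q).size)       # at most 2**4 = 16 levels remain per plane
```

In the paper's setup, transforms like these would be applied offline to the LFW images before the Inception-V3 features are extracted, with augmentation and/or normalization layered on top as needed.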


