Spectrum Translation for Cross-Spectral Ocular Matching

02/14/2020
by Kevin Hernandez-Diaz, et al.

Cross-spectral verification remains a significant challenge in biometrics, especially for the ocular area, because the features reflected in the images differ depending on the region and the spectrum used. In this paper, we investigate the use of Conditional Adversarial Networks for spectrum translation between near-infrared (NIR) and visible-light images for ocular biometrics. We analyze the transformation based on the overall visual quality of the transformed images and the accuracy drop of the identification system when trained on data from the opposite spectrum. We use the PolyU database and propose two different systems for biometric verification: the first based on Siamese Networks trained with Softmax and Cross-Entropy loss, and the second a Triplet Loss network. We achieve an EER of 1% when using a Triplet Loss network trained on NIR and computing the Euclidean distance between the real NIR images and the fake ones translated from the visible spectrum. We also outperform previous results obtained with baseline algorithms.
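The following is a minimal sketch, not the authors' code, of the verification pipeline the abstract describes: a conditional GAN generator translates a visible-light probe into a synthetic NIR image, a triplet-loss embedding network trained on NIR embeds both images, and the decision is based on the Euclidean distance between the embeddings. The names G, E, and the threshold value are illustrative assumptions.

```python
# Hedged sketch (assumed interfaces, not the authors' implementation):
# cross-spectral ocular verification via spectrum translation.
import torch
import torch.nn.functional as F

def verify(vis_image, nir_image, G, E, threshold=1.0):
    """Decide whether a visible-light probe matches an NIR gallery image.

    vis_image, nir_image: tensors of shape (1, C, H, W)
    G: cGAN generator mapping VIS -> synthetic NIR (assumed pre-trained)
    E: embedding network trained with triplet loss on NIR images (assumed)
    threshold: decision threshold on the Euclidean distance (assumed value)
    """
    with torch.no_grad():
        fake_nir = G(vis_image)                 # spectrum translation VIS -> NIR
        emb_probe = E(fake_nir)                 # embed the translated probe
        emb_gallery = E(nir_image)              # embed the real NIR gallery image
        dist = F.pairwise_distance(emb_probe, emb_gallery)  # Euclidean distance
    return (dist < threshold).item()
```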
