Arbitrary-sized Image Training and Residual Kernel Learning: Towards Image Fraud Identification

05/22/2020
by   Hongyu Li, et al.

Preserving the original noise residuals in an image is critical to image fraud identification. Because the resizing operation commonly used in deep learning damages the microstructure of these noise residuals, we propose a framework that trains directly on images at their original input scales, without resizing. Our arbitrary-sized image training method relies mainly on pseudo-batch gradient descent (PBGD), which bridges the gap between the input batch and the update batch so that model updates proceed normally for arbitrary-sized images. In addition, a three-phase alternate training strategy is designed to learn optimal residual kernels for image fraud identification. With the learned residual kernels and PBGD, the proposed framework achieves state-of-the-art results in image fraud identification, especially for images with small tampered regions or unseen images with different tampering distributions.
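One plausible reading of PBGD, suggested by the abstract's distinction between the input batch and the update batch, is gradient accumulation: each arbitrary-sized image is forwarded alone (so no resizing is needed to stack a tensor batch), and the optimizer steps only after a fixed number of per-image backward passes. The sketch below illustrates this reading only; the function name, the `pseudo_batch_size` parameter, and the assumption that the model is fully convolutional (accepting any H x W) are ours, not details confirmed by the paper.

```python
import torch

def train_pseudo_batch(model, images, labels, optimizer, loss_fn,
                       pseudo_batch_size=8):
    """Sketch of one interpretation of pseudo-batch gradient descent (PBGD):
    the input batch is a single arbitrary-sized image, while the update
    batch is `pseudo_batch_size` accumulated gradients."""
    model.train()
    optimizer.zero_grad()
    for i, (img, label) in enumerate(zip(images, labels), start=1):
        # img: tensor of shape (C, H, W); H and W vary per sample, no resizing
        out = model(img.unsqueeze(0))                 # input batch of one image
        loss = loss_fn(out, label.unsqueeze(0)) / pseudo_batch_size
        loss.backward()                               # gradients accumulate across images
        if i % pseudo_batch_size == 0:                # update batch reached
            optimizer.step()                          # single model update
            optimizer.zero_grad()
```

Dividing each per-image loss by `pseudo_batch_size` makes the accumulated gradient equal the mean over the pseudo-batch, mirroring what an ordinary mini-batch update would compute if the images could be stacked.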


