New Perspective on Progressive GANs Distillation for One-class Novelty Detection

09/15/2021
by   Zhiwei Zhang, et al.
One-class novelty detection aims to identify anomalous instances whose distributions differ from those of the expected normal instances. In this paper, the Generative Adversarial Network based on the Encoder-Decoder-Encoder scheme (EDE-GAN) achieves state-of-the-art performance. Two factors account for this: 1) unlike previous methods that rely on the reconstruction error between images, the EDE-GAN computes the distance between two latent vectors as the anomaly score; 2) the model obtains its best results when the batch size is set to 1. To illustrate their superiority, we design a new GAN architecture and compare performance across different batch sizes. Moreover, our experiments provide evidence of just how beneficial a constraint on the latent space is during model training. To learn compact and fast models, we present a new technique, Progressive Knowledge Distillation with GANs (P-KDGAN), which connects two standard GANs through a designed distillation loss. Two-step progressive learning continuously improves the performance of the student GAN, yielding better results than the single-step approach. Our experimental results on the CIFAR-10, MNIST, and FMNIST datasets show that P-KDGAN improves the performance of the student GAN by 2.44% when compressing the computation at ratios of 24.45:1, 311.11:1, and 700:1, respectively.
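To make the first factor concrete, below is a minimal NumPy sketch of the Encoder-Decoder-Encoder scoring idea: encode the input, decode a reconstruction, re-encode it, and use the distance between the two latent vectors as the anomaly score. The linear "encoders"/"decoder" and all shapes here are toy stand-ins, not the paper's learned convolutional networks.

```python
import numpy as np

def anomaly_score(z1, z2):
    """L2 distance between the two latent vectors (EDE-GAN-style score)."""
    return float(np.linalg.norm(z1 - z2))

# Toy stand-ins for the two encoders and the decoder (hypothetical shapes);
# the actual model trains these networks adversarially.
rng = np.random.default_rng(0)
W_enc1 = rng.standard_normal((8, 64))   # first encoder:  image -> latent
W_dec  = rng.standard_normal((64, 8))   # decoder:        latent -> reconstruction
W_enc2 = rng.standard_normal((8, 64))   # second encoder: reconstruction -> latent

x = rng.standard_normal(64)             # flattened "image"
z1 = W_enc1 @ x                         # latent code of the input
x_hat = W_dec @ z1                      # reconstructed image
z2 = W_enc2 @ x_hat                     # latent code of the reconstruction

score = anomaly_score(z1, z2)           # higher score -> more anomalous
```

Scoring in latent space rather than pixel space is what distinguishes this scheme from reconstruction-error methods: two latent codes can disagree sharply even when the pixel-level reconstruction looks plausible.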


