Compressing Facial Makeup Transfer Networks by Collaborative Distillation and Kernel Decomposition

09/16/2020
by Bianjiang Yang, et al.

Although facial makeup transfer networks achieve high-quality results in generating perceptually pleasing makeup images, their practical use is restricted by the heavy computation and storage demands of the network architecture. We address this issue by compressing facial makeup transfer networks with collaborative distillation and kernel decomposition. Collaborative distillation rests on the finding that an encoder-decoder pair forms an exclusive collaborative relationship, which can be regarded as a new kind of knowledge for low-level vision tasks. For kernel decomposition, we apply depthwise separation of convolutional kernels to build a lightweight Convolutional Neural Network (CNN) from the original network. Extensive experiments demonstrate the effectiveness of the compression method when applied to the state-of-the-art facial makeup transfer network, BeautyGAN.
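Both ideas are concrete enough to illustrate. Below is a minimal PyTorch sketch of the depthwise separation of convolutional kernels: a standard KxK convolution is factorized into a per-channel depthwise convolution followed by a 1x1 pointwise convolution, shrinking a wide layer's weight count by roughly a factor of K^2. The module name and channel sizes are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """Factorizes a standard KxK convolution into a depthwise KxK
    convolution (one filter per input channel, via groups=in_ch)
    followed by a 1x1 pointwise convolution that mixes channels."""
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1, padding=1):
        super().__init__()
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   stride=stride, padding=padding, groups=in_ch)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))

# Weight count for a 128 -> 128 channel, 3x3 layer (biases ignored):
#   standard conv:       128 * 128 * 3 * 3 = 147,456
#   depthwise separable: 128 * 3 * 3 + 128 * 128 = 17,536
x = torch.randn(1, 128, 64, 64)
print(DepthwiseSeparableConv(128, 128)(x).shape)  # torch.Size([1, 128, 64, 64])
```

Collaborative distillation is harder to pin down from the abstract alone, but the stated idea, that the encoder-decoder pair forms an exclusive collaborative relationship which itself is the knowledge, suggests an objective along the lines of the following sketch: the student's smaller features are lifted into the teacher's feature space by a learned projection and required both to match the teacher's features and to remain decodable by the frozen teacher decoder. The channel widths, the 1x1 projection, and the two loss terms here are assumptions for illustration; the paper's exact objective may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

T_CH, S_CH = 512, 128  # hypothetical teacher / student feature widths

# Learned 1x1 projection lifting student features into the teacher's space.
project = nn.Conv2d(S_CH, T_CH, kernel_size=1)

def collaborative_distillation_loss(student_feat, teacher_feat, teacher_decoder):
    """Sketch: projected student features should (1) match the teacher's
    features and (2) reconstruct the same output through the frozen
    teacher decoder, transferring the encoder-decoder collaboration."""
    lifted = project(student_feat)
    feat_loss = F.mse_loss(lifted, teacher_feat)
    recon_loss = F.mse_loss(teacher_decoder(lifted),
                            teacher_decoder(teacher_feat).detach())
    return feat_loss + recon_loss

# Example with a placeholder identity decoder and 32x32 feature maps:
s, t = torch.randn(1, S_CH, 32, 32), torch.randn(1, T_CH, 32, 32)
print(collaborative_distillation_loss(s, t, teacher_decoder=nn.Identity()))
```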
