Cut Inner Layers: A Structured Pruning Strategy for Efficient U-Net GANs

06/29/2022
by   Bo-Kyeong Kim, et al.

Pruning effectively compresses overparameterized models. Despite the success of pruning methods for discriminative models, applying them to generative models has received relatively little attention. This study conducts structured pruning on the U-Net generators of conditional GANs. A per-layer sensitivity analysis confirms that many unnecessary filters exist in the innermost layers near the bottleneck and can be pruned substantially. Based on this observation, we prune these filters from multiple inner layers, or propose alternative architectures that eliminate the layers entirely. We evaluate our approach with Pix2Pix for image-to-image translation and Wav2Lip for speech-driven talking face generation. Our method outperforms global pruning baselines, demonstrating the importance of properly choosing where to prune in U-Net generators.
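The idea of pruning inner layers more aggressively can be sketched with a simple structured-pruning routine. The snippet below is an illustrative NumPy sketch, not the paper's implementation: it uses an L1-norm filter-importance score (a common proxy; the paper derives its budget from a per-layer sensitivity analysis), and the per-layer keep ratios and layer widths are hypothetical values chosen to mimic an 8-level U-Net encoder in which outer layers are kept intact and layers near the bottleneck are pruned heavily.

```python
import numpy as np

def prune_filters_l1(weight, keep_ratio):
    """Structured pruning: rank conv filters by the L1 norm of their
    weights and keep only the top `keep_ratio` fraction.
    `weight` has shape (out_channels, in_channels, kH, kW)."""
    n_out = weight.shape[0]
    n_keep = max(1, int(round(n_out * keep_ratio)))
    scores = np.abs(weight).reshape(n_out, -1).sum(axis=1)  # L1 norm per filter
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])       # strongest filters, original order
    return weight[keep], keep

# Hypothetical keep ratios for an 8-level U-Net encoder: outer layers
# untouched, innermost layers (near the bottleneck) pruned the most,
# mirroring the sensitivity finding described in the abstract.
keep_ratios = [1.0, 1.0, 1.0, 1.0, 0.75, 0.5, 0.25, 0.25]

rng = np.random.default_rng(0)
# Illustrative encoder widths: 64, 128, 256, then 512 for the deep layers.
layers = [rng.standard_normal((64 * 2 ** min(i, 3), 3, 4, 4)) for i in range(8)]
pruned = [prune_filters_l1(w, r)[0] for w, r in zip(layers, keep_ratios)]
print([w.shape[0] for w in pruned])  # → [64, 128, 256, 512, 384, 256, 128, 128]
```

In a real network, the returned `keep` indices would also be used to slice the input channels of the following layer (and the matching skip connection in a U-Net), so that the pruned generator stays shape-consistent end to end.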

Related research

- Investigating Channel Pruning through Structural Redundancy Reduction - A Statistical Study (05/16/2019)
- Filter Pruning via Filters Similarity in Consecutive Layers (04/26/2023)
- Performance-aware Approximation of Global Channel Pruning for Multitask CNNs (03/21/2023)
- Automatic Attention Pruning: Improving and Automating Model Pruning using Attentions (03/14/2023)
- Co-Evolutionary Compression for Unpaired Image Translation (07/25/2019)
- Pruning via Iterative Ranking of Sensitivity Statistics (06/01/2020)
- Slimming Neural Networks using Adaptive Connectivity Scores (06/22/2020)
