Batch-Shaping for Learning Conditional Channel Gated Networks

07/15/2019
by Babak Ehteshami Bejnordi, et al.

We present a method for gating deep-learning architectures at a fine-grained level. Individual convolutional maps are turned on or off conditioned on features in the network. This allows us to train networks with large capacity but a lower inference cost than running the full network. To achieve this, we introduce a new residual block architecture that gates convolutional channels in a fine-grained manner. We also introduce a generally applicable tool, "batch-shaping," that matches the marginal aggregate posteriors of features in a neural network to a pre-specified prior distribution, and we use this novel technique to make gates more conditional on the data. We present results on CIFAR-10 and ImageNet for image classification and on Cityscapes for semantic segmentation. Our results show that our method can slim down large architectures conditionally, such that the average computational cost on the data is on par with that of a smaller architecture, but with higher accuracy. In particular, our gated ResNet34 achieves 72.55% top-1 accuracy on ImageNet, compared to the 69.76% of a ResNet18 baseline of similar computational complexity. We also show that the resulting networks automatically learn to use more features for difficult examples and fewer for simple examples.
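The gating idea can be pictured as a lightweight head attached to each residual block: it looks at the block's input and decides, per channel, whether that channel is worth computing. Below is a minimal PyTorch sketch under assumed design choices — a squeeze-style gate head (global average pool followed by a small MLP) and straight-through Gumbel-sigmoid sampling of binary gates. All names here (ChannelGate, GatedResidualBlock, hidden) are illustrative, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelGate(nn.Module):
    """Predicts a binary on/off decision per gated channel from the block input."""

    def __init__(self, in_channels, gated_channels, hidden=16):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(in_channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, gated_channels),
        )

    def forward(self, x):
        # Squeeze spatial dimensions, then predict one logit per gated channel.
        z = F.adaptive_avg_pool2d(x, 1).flatten(1)           # (B, C_in)
        logits = self.mlp(z)                                 # (B, C_gated)
        if self.training:
            # Straight-through Gumbel-sigmoid: hard stochastic gates in the
            # forward pass, gradients flow through the soft relaxation.
            u = torch.rand_like(logits).clamp(1e-6, 1 - 1e-6)
            soft = torch.sigmoid(logits + torch.log(u) - torch.log1p(-u))
            hard = (soft > 0.5).float()
            return hard + soft - soft.detach()               # values in {0, 1}
        return (logits > 0).float()                          # deterministic at test time


class GatedResidualBlock(nn.Module):
    """Residual block whose second conv only sees the channels the gate keeps."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.gate = ChannelGate(channels, channels)

    def forward(self, x):
        g = self.gate(x)                                     # (B, C) in {0, 1}
        h = F.relu(self.bn1(self.conv1(x)))
        h = h * g[:, :, None, None]                          # zero gated-off channels
        h = self.bn2(self.conv2(h))
        return F.relu(x + h), g                              # gates kept for shaping losses
```

Note that zeroing a channel only makes the second convolution's work on it redundant in principle; realizing the claimed average-cost reduction at inference additionally requires kernels that actually skip the gated-off channels.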
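Batch-shaping, as described, matches the marginal aggregate posterior of each gate to a prior. One way to sketch this is a Cramér–von Mises-style distance between the per-feature empirical CDF of the soft gate activations across a batch and the CDF of a Beta prior. The code below is a sketch under those assumptions: the Beta parameters a and b are placeholders, and SciPy supplies the Beta CDF/PDF since a differentiable Beta CDF is not readily available in PyTorch (the analytic gradient of the CDF, namely the PDF, is wired in through a custom autograd function).

```python
import torch
from scipy.stats import beta as beta_dist


class BetaCDF(torch.autograd.Function):
    """Beta CDF evaluated via SciPy, with its analytic gradient (the Beta PDF)
    supplied by hand so the loss remains differentiable w.r.t. the gates."""

    @staticmethod
    def forward(ctx, x, a, b):
        ctx.save_for_backward(x)
        ctx.a, ctx.b = a, b
        cdf = beta_dist.cdf(x.detach().cpu().numpy(), a, b)
        return torch.as_tensor(cdf, dtype=x.dtype, device=x.device)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        pdf = beta_dist.pdf(x.detach().cpu().numpy(), ctx.a, ctx.b)
        pdf = torch.as_tensor(pdf, dtype=x.dtype, device=x.device)
        return grad_out * pdf, None, None


def batch_shaping_loss(gates, a=0.6, b=0.4):
    """Cramér–von Mises-style distance between the per-feature empirical CDF of
    `gates` (shape: batch x features, values in (0, 1)) and a Beta(a, b) prior;
    a and b here are illustrative placeholders, not the paper's exact values."""
    n = gates.shape[0]
    x = gates.clamp(1e-6, 1 - 1e-6)          # keep the Beta PDF finite at 0/1
    x, _ = torch.sort(x, dim=0)              # order statistics per feature
    ecdf = (torch.arange(1, n + 1, dtype=x.dtype, device=x.device) - 0.5) / n
    prior_cdf = BetaCDF.apply(x, a, b)
    return ((prior_cdf - ecdf[:, None]) ** 2).sum()
```

In training, this term would be applied to the soft (pre-binarization) gate activations and added to the task loss, typically scaled by a small coefficient; pushing each gate's batch-wide distribution toward a bimodal prior is what discourages gates from collapsing to always-on or always-off and makes them conditional on the input.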

