FlexConv: Continuous Kernel Convolutions with Differentiable Kernel Sizes

10/15/2021
by David W. Romero, et al.

When designing Convolutional Neural Networks (CNNs), one must select the size of the convolutional kernels before training. Recent works show that CNNs benefit from different kernel sizes at different layers, but exploring all possible combinations is infeasible in practice. A more efficient approach is to learn the kernel size during training. However, existing works that learn the kernel size have limited bandwidth: they scale kernels by dilation, and thus the detail they can describe is limited. In this work, we propose FlexConv, a novel convolutional operation with which high-bandwidth convolutional kernels of learnable kernel size can be learned at a fixed parameter cost. FlexNets model long-term dependencies without the use of pooling, achieve state-of-the-art performance on several sequential datasets, outperform recent works with learned kernel sizes, and are competitive with much deeper ResNets on image benchmark datasets. Additionally, FlexNets can be deployed at higher resolutions than those seen during training. To avoid aliasing, we propose a novel kernel parameterization with which the frequency of the kernels can be analytically controlled. Our kernel parameterization shows higher descriptive power and faster convergence than existing parameterizations, leading to important improvements in classification accuracy.
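The core idea of a continuous kernel with a differentiable size can be illustrated with a minimal sketch. The following is a hypothetical PyTorch example, not the authors' implementation: a small network (here called KernelNet) generates kernel values from relative positions, and a Gaussian mask with a learnable width acts as a soft, differentiable kernel size applied at a fixed parameter cost; all names, layer sizes, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch of a continuous-kernel convolution with a differentiable
# kernel size (illustrative only; not the paper's exact parameterization).
import torch
import torch.nn as nn
import torch.nn.functional as F


class KernelNet(nn.Module):
    """Maps relative positions in [-1, 1] to kernel values (out_ch x in_ch)."""

    def __init__(self, in_channels, out_channels, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.GELU(),
            nn.Linear(hidden, hidden), nn.GELU(),
            nn.Linear(hidden, out_channels * in_channels),
        )
        self.in_channels = in_channels
        self.out_channels = out_channels

    def forward(self, positions):                    # positions: [kernel_size, 1]
        k = self.net(positions)                      # [kernel_size, out*in]
        return k.t().reshape(self.out_channels, self.in_channels, -1)


class FlexConv1d(nn.Module):
    def __init__(self, in_channels, out_channels, max_kernel_size=33):
        super().__init__()
        self.kernel_net = KernelNet(in_channels, out_channels)
        # Learnable log-width of the Gaussian mask: the effective kernel size.
        self.log_sigma = nn.Parameter(torch.zeros(1))
        self.max_kernel_size = max_kernel_size

    def forward(self, x):                            # x: [batch, in_ch, length]
        pos = torch.linspace(-1.0, 1.0, self.max_kernel_size, device=x.device)
        kernel = self.kernel_net(pos.unsqueeze(-1))  # [out, in, kernel_size]
        sigma = self.log_sigma.exp()
        mask = torch.exp(-0.5 * (pos / sigma) ** 2)  # Gaussian mask, learnable width
        kernel = kernel * mask                       # soft, differentiable kernel size
        return F.conv1d(x, kernel, padding=self.max_kernel_size // 2)


# Usage: the parameter count stays fixed regardless of the learned kernel size.
layer = FlexConv1d(in_channels=3, out_channels=16)
y = layer(torch.randn(2, 3, 128))                    # -> [2, 16, 128]
```

Because the kernel is generated by a fixed-size network and only masked to the desired extent, growing or shrinking the effective kernel size during training does not change the number of parameters.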


Related research

DWM: A Decomposable Winograd Method for Convolution Acceleration (02/03/2020)
Winograd's minimal filtering algorithm has been widely used in Convoluti...

Integrating Circle Kernels into Convolutional Neural Networks (07/06/2021)
The square kernel is a standard unit for contemporary Convolutional Neur...

DNArch: Learning Convolutional Neural Architectures by Backpropagation (02/10/2023)
We present Differentiable Neural Architectures (DNArch), a method that j...

Deep Networks with Adaptive Nyström Approximation (11/29/2019)
Recent work has focused on combining kernel methods and deep learning to...

CKConv: Continuous Kernel Convolution For Sequential Data (02/04/2021)
Conventional neural architectures for sequential data present important ...

Grassmannian Packings in Neural Networks: Learning with Maximal Subspace Packings for Diversity and Anti-Sparsity (11/18/2019)
Kernel sparsity ("dying ReLUs") and lack of diversity are commonly obser...

Naive Gabor Networks (12/09/2019)
In this paper, we introduce naive Gabor Networks or Gabor-Nets which, fo...
