Learning with convolution and pooling operations in kernel methods

11/16/2021
by Theodor Misiakiewicz, et al.

Recent empirical work has shown that hierarchical convolutional kernels inspired by convolutional neural networks (CNNs) significantly improve the performance of kernel methods in image classification tasks. A widely accepted explanation for the success of these architectures is that they encode hypothesis classes that are suitable for natural images. However, understanding the precise interplay between approximation and generalization in convolutional architectures remains a challenge. In this paper, we consider the stylized setting of covariates (image pixels) uniformly distributed on the hypercube, and fully characterize the RKHS of kernels composed of single layers of convolution, pooling, and downsampling operations. We then study the gain in sample efficiency of kernel methods using these kernels over standard inner-product kernels. In particular, we show that 1) the convolution layer breaks the curse of dimensionality by restricting the RKHS to 'local' functions; 2) local pooling biases learning towards low-frequency functions, which are stable under small translations; 3) downsampling may modify the high-frequency eigenspaces but leaves the low-frequency part approximately unchanged. Notably, our results quantify how choosing an architecture adapted to the target function leads to a large improvement in sample complexity.
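To make the abstract's construction concrete, the sketch below contrasts a standard inner-product kernel with an illustrative one-layer convolutional kernel (local patches plus average pooling) and a strided variant standing in for downsampling, on hypercube covariates. The choice kappa(t) = exp(t), the use of cyclic (wrap-around) patches of size q, and the stride parameter are assumptions made for illustration, not the paper's exact kernels.

```python
import numpy as np

def inner_product_kernel(x, y):
    # Standard inner-product kernel kappa(<x, y> / d) on the full d-dimensional input.
    d = x.shape[-1]
    return float(np.exp(x @ y / d))  # kappa(t) = exp(t) is an illustrative choice

def conv_pool_kernel(x, y, q):
    # Illustrative one-layer convolutional kernel with global average pooling:
    # apply kappa to every cyclic patch of size q, then average over patch positions.
    d = x.shape[-1]
    vals = []
    for k in range(d):
        idx = [(k + j) % d for j in range(q)]     # cyclic patch starting at position k
        vals.append(np.exp(x[idx] @ y[idx] / q))  # local inner-product kernel on the patch
    return float(np.mean(vals))                   # pooling = averaging over patch positions

def conv_downsample_kernel(x, y, q, stride):
    # Same construction, but averaging only over a strided subset of patch
    # positions, mimicking a downsampling layer.
    d = x.shape[-1]
    vals = []
    for k in range(0, d, stride):
        idx = [(k + j) % d for j in range(q)]
        vals.append(np.exp(x[idx] @ y[idx] / q))
    return float(np.mean(vals))

# Toy usage with covariates on the hypercube {-1, +1}^d, as in the paper's setting.
rng = np.random.default_rng(0)
d, q = 16, 4
x = rng.choice([-1.0, 1.0], size=d)
y = rng.choice([-1.0, 1.0], size=d)
print(inner_product_kernel(x, y))
print(conv_pool_kernel(x, y, q))
print(conv_downsample_kernel(x, y, q, stride=2))
```

In this toy version, the convolutional kernel only ever compares inputs through patches of size q, which is one way to picture the "locality" restriction of the RKHS, while averaging over patch positions makes the kernel invariant to cyclic translations, matching the pooling discussion in the abstract.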
