Deep Networks with Adaptive Nyström Approximation

11/29/2019
by Luc Giffon et al.

Recent work has focused on combining kernel methods and deep learning to exploit the best of both approaches. Here, we introduce a new neural-network architecture in which the top dense layers of standard convolutional architectures are replaced with an approximation of a kernel function based on the Nyström approximation. Our approach is simple and highly flexible: it is compatible with any kernel function and allows exploiting multiple kernels. We show that our architecture matches the performance of standard architectures on datasets such as SVHN and CIFAR100. One benefit of the method is its limited number of learnable parameters, which makes it particularly suited to small training sets, e.g. from 5 to 20 samples per class.
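To make the idea concrete, here is a minimal sketch of the standard Nyström feature map that such a layer builds on: a set of m landmark points induces the map phi(x) = k(x, L) W^{-1/2}, where W = k(L, L), so that phi(x) phi(y)^T approximates the kernel k(x, y). This is not the paper's exact architecture (which learns these features end-to-end inside a convolutional network); the function names and the RBF kernel choice below are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    # Pairwise RBF kernel: k(x, y) = exp(-gamma * ||x - y||^2)
    sq = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

def nystrom_features(X, landmarks, gamma=0.1):
    # Nystrom feature map: phi(x) = k(x, L) @ W^{-1/2}, W = k(L, L).
    W = rbf_kernel(landmarks, landmarks, gamma)
    # W^{-1/2} via eigendecomposition (W is symmetric PSD)
    vals, vecs = np.linalg.eigh(W)
    vals = np.maximum(vals, 1e-12)  # clamp tiny eigenvalues for stability
    W_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
    return rbf_kernel(X, landmarks, gamma) @ W_inv_sqrt

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
landmarks = X[:10]           # m = 10 landmark points subsampled from the data
Phi = nystrom_features(X, landmarks)
K_approx = Phi @ Phi.T       # low-rank approximation of the full kernel matrix
print(Phi.shape)             # (100, 10)
```

In the paper's setting, the input X would be the output of the convolutional feature extractor, and the landmark set plays the role of the replaced dense layers: the number of learnable parameters scales with m rather than with the width of fully connected layers, which is what makes the approach attractive for small training sets.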


Related research

03/08/2018 - Some Approximation Bounds for Deep Networks
In this paper we introduce new bounds on the approximation of functions ...

10/15/2021 - FlexConv: Continuous Kernel Convolutions with Differentiable Kernel Sizes
When designing Convolutional Neural Networks (CNNs), one must select the...

10/11/2013 - Deep Multiple Kernel Learning
Deep learning methods have predominantly been applied to large artificia...

12/28/2016 - A Deep Learning Approach To Multiple Kernel Fusion
Kernel fusion is a popular and effective approach for combining multiple...

08/30/2021 - A fast point solver for deep nonlinear function approximators
Deep kernel processes (DKPs) generalise Bayesian neural networks, but do...

06/24/2020 - Distribution-Based Invariant Deep Networks for Learning Meta-Features
Recent advances in deep learning from probability distributions enable t...

11/30/2020 - Every Model Learned by Gradient Descent Is Approximately a Kernel Machine
Deep learning's successes are often attributed to its ability to automat...
