Improving Sample Efficiency with Normalized RBF Kernels

07/30/2020
by Sebastian Pineda-Arango, et al.

Learning more from less data is becoming increasingly important for deep learning models. This paper explores how neural networks with normalized Radial Basis Function (RBF) kernels can be trained to achieve better sample efficiency. Moreover, we show how this kind of output layer can find embedding spaces where the classes are compact and well separated. To achieve this, we propose a two-phase method for training this type of neural network on classification tasks. Experiments on CIFAR-10 and CIFAR-100 show that, with the presented method, networks with normalized kernels as the output layer can achieve higher sample efficiency, high compactness, and good class separability compared to networks with a SoftMax output layer.
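The abstract does not spell out the exact layer definition or the two-phase training procedure, so the following is only a minimal sketch of what a normalized RBF kernel output layer could look like in PyTorch. The learnable per-class centers and the fixed kernel width `gamma` are assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class NormalizedRBFOutput(nn.Module):
    """Hypothetical sketch of a normalized RBF kernel output layer.

    Each class c gets a learnable center mu_c in the embedding space.
    The layer returns Gaussian kernel responses exp(-gamma * ||z - mu_c||^2)
    normalized over the classes, so the output rows sum to 1.
    """

    def __init__(self, embedding_dim: int, num_classes: int, gamma: float = 1.0):
        super().__init__()
        # One learnable center per class (assumed initialization).
        self.centers = nn.Parameter(torch.randn(num_classes, embedding_dim))
        self.gamma = gamma  # kernel width; treated as a fixed hyperparameter here

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        # Squared Euclidean distances between embeddings and class centers,
        # shape (batch, num_classes).
        d2 = torch.cdist(z, self.centers).pow(2)
        # Normalizing exp(-gamma * d2) over classes is mathematically a
        # softmax over -gamma * d2, which is the numerically stable form.
        return torch.softmax(-self.gamma * d2, dim=1)

# Example: attach to a feature extractor producing 64-d embeddings for CIFAR-10.
layer = NormalizedRBFOutput(embedding_dim=64, num_classes=10)
probs = layer(torch.randn(8, 64))  # (8, 10), each row sums to 1
```

Because the normalized responses decay with distance from each class center, training such a layer with a standard classification loss pulls same-class embeddings toward a shared center, which is one plausible reading of the compactness and separability the paper reports.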

Related research

Deep Multiple Kernel Learning (10/11/2013)
Deep learning methods have predominantly been applied to large artificia...

Global Normalization of Convolutional Neural Networks for Joint Entity and Relation Classification (07/24/2017)
We introduce globally normalized convolutional neural networks for joint...

Modularizing Deep Learning via Pairwise Learning With Kernels (05/12/2020)
By redefining the conventional notions of layers, we present an alternat...

On the Similarity between the Laplace and Neural Tangent Kernels (07/03/2020)
Recent theoretical work has shown that massively overparameterized neura...

What Can ResNet Learn Efficiently, Going Beyond Kernels? (05/24/2019)
How can neural networks such as ResNet efficiently learn CIFAR-10 with t...

OSLNet: Deep Small-Sample Classification with an Orthogonal Softmax Layer (04/20/2020)
A deep neural network of multiple nonlinear layers forms a large functio...

Basis Prediction Networks for Effective Burst Denoising with Large Kernels (12/09/2019)
Bursts of images exhibit significant self-similarity across both time an...
