Hyperplane bounds for neural feature mappings

01/15/2022
by Antonio Jimeno Yepes, et al.

Deep learning methods minimise the empirical risk using loss functions such as the cross-entropy loss. When minimising the empirical risk, the generalisation of the learnt function depends on the performance on the training data, the Vapnik-Chervonenkis (VC) dimension of the function, and the number of training examples. Neural networks have a large number of parameters, which correlates with their VC-dimension; this dimension is typically large but not infinite, and a large number of training instances is typically needed to train them effectively. In this work, we explore how to optimise feature mappings using a neural network with the intention of reducing the effective VC-dimension of the hyperplane found in the space generated by the mapping. An interpretation of the results of this study is that it is possible to define a loss that controls the VC-dimension of the separating hyperplane. We evaluate this approach and observe that its performance improves when the size of the training set is small.
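The abstract's idea of a loss that controls the VC-dimension of the separating hyperplane can be illustrated with a classical connection: for large-margin hyperplanes, the effective VC-dimension is bounded in terms of the weight norm, so adding a ||w||² penalty to a margin loss acts as a VC-dimension control. The sketch below is an assumption-laden illustration, not the paper's actual method: the feature mapping `phi` (a single random tanh layer), the hinge loss, and the penalty strength `lam` are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data, separable by the line x0 + x1 = 0
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

# Hypothetical neural feature mapping: one hidden tanh layer with
# fixed random weights (an illustrative stand-in, not the paper's network)
W1 = rng.normal(scale=0.5, size=(2, 16))
b1 = np.zeros(16)

def phi(X):
    return np.tanh(X @ W1 + b1)

# Separating hyperplane in the feature space
w = np.zeros(16)
b = 0.0

lam = 0.1   # strength of the ||w||^2 penalty (hedged choice)
lr = 0.05   # gradient step size

for _ in range(500):
    F = phi(X)                      # (n, 16) feature matrix
    margins = y * (F @ w + b)
    # Hinge loss + norm penalty: shrinking ||w|| enlarges the margin,
    # which bounds the effective VC-dimension of the hyperplane
    active = margins < 1.0          # examples violating the margin
    grad_w = 2 * lam * w - (y[active, None] * F[active]).sum(axis=0) / len(X)
    grad_b = -y[active].sum() / len(X)
    w -= lr * grad_w
    b -= lr * grad_b

acc = np.mean(np.sign(phi(X) @ w + b) == y)
```

The design choice worth noting is that the regulariser lives only on the final hyperplane `w`, matching the abstract's framing: the network supplies the feature space, while the capacity control targets the hyperplane in that space.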


