Learning Non-Parametric Invariances from Data with Permanent Random Connectomes

11/13/2019
by Dipan K. Pal, et al.

One of the fundamental problems in supervised classification, and in machine learning in general, is modelling the non-parametric invariances that exist in data. Most prior art has focused on enforcing priors in the form of invariance to parametric nuisance transformations that are expected to be present in data. Learning non-parametric invariances directly from data remains an important open problem. In this paper, we introduce a new architectural layer for convolutional networks that is capable of learning general invariances from the data itself. The layer can learn invariance to non-parametric transformations and, interestingly, motivates and incorporates permanent random connectomes; we therefore call the resulting architecture a Permanent Random Connectome Non-Parametric Transformation Network (PRC-NPTN). PRC-NPTN networks are initialized with random connections (not just weights), which form a small subset of the connections in a fully connected convolution layer. Importantly, once initialized, these connections remain permanent throughout training and testing. Permanent random connectomes make these architectures loosely more biologically plausible than many mainstream network architectures, which require highly ordered structures. We motivate randomly initialized connections as a simple method for learning invariance from data itself while invoking invariance to multiple nuisance transformations simultaneously. We find that these randomly initialized permanent connections have positive effects on generalization, outperforming much larger ConvNet baselines and the recently proposed Non-Parametric Transformation Network (NPTN) on benchmarks that require learning invariances from the data itself.
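The core mechanism described above, each output channel wired at initialization to a small random subset of input channels, with that wiring frozen for the life of the network, can be sketched in a few lines of NumPy. This is a toy illustration under our own assumptions (class and parameter names such as `PRCLayerSketch` and `fan_in` are ours, and the max over connected responses is a simple stand-in for the pooling used in NPTN-style layers), not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class PRCLayerSketch:
    """Toy sketch of a layer with a permanent random connectome.

    Each output channel is connected to a small random subset of the
    input channels (`fan_in` of them). The connectivity mask is sampled
    once at construction and never changes; only the kernel weights
    would be trained.
    """

    def __init__(self, in_channels, out_channels, fan_in):
        # Permanent random connectome: a fixed boolean mask of shape
        # (out_channels, in_channels), sampled once and then frozen.
        self.connectome = np.stack([
            np.isin(np.arange(in_channels),
                    rng.choice(in_channels, size=fan_in, replace=False))
            for _ in range(out_channels)
        ])
        # 3x3 kernels for every (output, input) pair; only the pairs
        # selected by the connectome are ever used.
        self.weights = 0.1 * rng.standard_normal(
            (out_channels, in_channels, 3, 3))

    def forward(self, x):
        # x: (in_channels, H, W). For each output channel, run a valid
        # 3x3 convolution against each *connected* input channel, then
        # max-pool across those responses (an invariance-inducing pool
        # over the random subset, standing in for NPTN-style pooling).
        C, H, W = x.shape
        out = []
        for o in range(self.connectome.shape[0]):
            responses = []
            for i in np.flatnonzero(self.connectome[o]):
                k = self.weights[o, i]
                r = np.zeros((H - 2, W - 2))
                for u in range(3):
                    for v in range(3):
                        r += k[u, v] * x[i, u:u + H - 2, v:v + W - 2]
                responses.append(r)
            out.append(np.max(responses, axis=0))
        return np.stack(out)

layer = PRCLayerSketch(in_channels=8, out_channels=4, fan_in=2)
y = layer.forward(rng.standard_normal((8, 16, 16)))
print(y.shape)  # (4, 14, 14)
```

The key design point is that `self.connectome` is never resampled or updated: randomness enters only at initialization, which is what distinguishes a permanent random connectome from, say, dropout-style stochastic connectivity that changes every forward pass.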

