Non-Parametric Representation Learning with Kernels

09/05/2023
by Pascal Esser et al.

Unsupervised and self-supervised representation learning has become popular in recent years for learning useful features from unlabelled data. Representation learning has been developed mostly in the neural network literature, and other models for representation learning remain surprisingly unexplored. In this work, we introduce and analyze several kernel-based representation learning approaches: first, two kernel Self-Supervised Learning (SSL) models using contrastive loss functions, and second, a Kernel Autoencoder (AE) model based on the idea of embedding and reconstructing data. We argue that the classical representer theorems for supervised kernel machines are not always applicable to (self-supervised) representation learning, and we present new representer theorems showing that the representations learned by our kernel models can be expressed in terms of kernel matrices. We further derive generalisation error bounds for representation learning with kernel SSL and AE, and we empirically evaluate these methods both in small-data regimes and in comparison with neural-network-based models.
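The abstract's representer-theorem claim, that the learned representations can be expressed in terms of kernel matrices, can be illustrated with a minimal sketch. The following is a hypothetical example (not the paper's actual construction): it assumes a representation of the finite kernel-expansion form f(x) = Aᵀk(x), where k(x) collects kernel evaluations of x against the training sample and A is a learnable coefficient matrix. The RBF kernel choice and all dimensions here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian RBF kernel matrix between rows of X and rows of Y (illustrative choice)."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))   # unlabelled training sample: n=20 points in R^5
K = rbf_kernel(X, X)           # n x n kernel (Gram) matrix
A = rng.normal(size=(20, 3))   # learnable coefficients; embedding dimension d=3 (assumed)

def embed(x_new):
    """Embed new points via the kernel expansion f(x) = A^T k(x)."""
    k_x = rbf_kernel(np.atleast_2d(x_new), X)  # (m, n) kernel evaluations against the sample
    return k_x @ A                             # (m, d) learned representations

# On the training points themselves, the embeddings reduce to K @ A:
Z = embed(X)
assert np.allclose(Z, K @ A)
```

In a full method, A would be fit by minimising a contrastive (SSL) or reconstruction (AE) objective; the point of the sketch is only that every learned representation lives in the span of kernel evaluations at the training data, so it is determined by the kernel matrix and a finite coefficient matrix.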
