HOME: High-Order Mixed-Moment-based Embedding for Representation Learning

07/15/2022
by   Chuang Niu, et al.

Minimizing redundancy among the elements of a latent embedding is a fundamental requirement, or at least a strong preference, in representation learning for capturing intrinsic informational structure. Current self-supervised learning methods minimize pairwise covariances to reduce feature redundancy and produce promising results. However, learned representations may still contain redundancy shared among more than two feature variables, which pairwise regularization cannot remove. Here we propose the High-Order Mixed-Moment-based Embedding (HOME) strategy to reduce redundancy among arbitrary sets of feature variables; to the best of our knowledge, this is the first attempt to exploit high-order statistics/information in this context. Multivariate mutual information is minimized if and only if the variables are mutually independent, which in turn implies the necessary condition that their mixed moments factorize. Based on these statistical and information-theoretic principles, we present the general HOME framework for self-supervised representation learning. Our initial experiments show that a simple three-order HOME scheme already significantly outperforms the current two-order baseline (i.e., Barlow Twins) under linear evaluation of the learned representation features.
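To make the abstract's idea concrete, below is a minimal, hypothetical PyTorch sketch of a third-order mixed-moment penalty on standardized embeddings of two augmented views. The function name `third_order_home_loss`, the choice of which view fills each moment slot, and the standardization details are illustrative assumptions, not the authors' implementation; the paper's actual HOME loss may be formulated differently.

```python
import torch


def third_order_home_loss(z1: torch.Tensor, z2: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Penalize third-order mixed moments of standardized embeddings (illustrative sketch).

    z1, z2: (batch, dim) embeddings of two augmented views of the same batch.
    After per-feature standardization, E[z_i z_j z_k] = 0 for distinct (i, j, k)
    whenever the features are mutually independent, so the squared magnitude of
    these moments measures redundancy that pairwise (second-order) covariance
    terms cannot capture.
    """
    def standardize(z: torch.Tensor) -> torch.Tensor:
        return (z - z.mean(dim=0)) / (z.std(dim=0) + eps)

    z1, z2 = standardize(z1), standardize(z2)
    n, d = z1.shape

    # Dense third-order moment tensor M[i, j, k] = mean_b(z1[b, i] * z1[b, j] * z2[b, k]).
    # Mixing the two views this way is one possible choice, not necessarily the paper's.
    # The dense einsum costs O(d^3) memory, so keep d modest or sample index triples.
    moments = torch.einsum('bi,bj,bk->ijk', z1, z1, z2) / n

    # Keep only triples with three distinct indices; entries where any two
    # indices coincide reduce to lower-order statistics.
    idx = torch.arange(d, device=z1.device)
    mask = torch.ones(d, d, d, dtype=torch.bool, device=z1.device)
    mask[idx, idx, :] = False
    mask[idx, :, idx] = False
    mask[:, idx, idx] = False

    return (moments[mask] ** 2).mean()


if __name__ == "__main__":
    z1, z2 = torch.randn(256, 32), torch.randn(256, 32)
    print(third_order_home_loss(z1, z2))
```

In practice such a penalty would presumably be combined with an invariance term between the two views, as in Barlow Twins, and the cubic cost of the dense moment tensor suggests restricting to a subset of index triples when the embedding dimension is large.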


