Computationally Efficient Approximations for Matrix-based Rényi's Entropy

12/27/2021
by Tieliang Gong, et al.

The recently developed matrix-based Rényi's entropy enables the measurement of information in data directly from the eigenspectrum of a symmetric positive semi-definite (PSD) matrix in a reproducing kernel Hilbert space, without estimating the underlying data distribution. This intriguing property has led the new information measure to be widely adopted in multiple statistical inference and learning tasks. However, computing this quantity involves the trace of a PSD matrix G raised to the power α (i.e., tr(G^α)), with a typical complexity of nearly O(n^3), which severely hampers its practical usage when the number of samples (i.e., n) is large. In this work, we present computationally efficient approximations to this new entropy functional that reduce its complexity to significantly less than O(n^2). To this end, we first develop randomized approximations to tr(G^α) that transform trace estimation into a matrix-vector multiplication problem. We extend this strategy to arbitrary values of α (integer or non-integer). We then establish the connection between the matrix-based Rényi's entropy and PSD matrix approximation, which enables us to exploit both the clustering and the block low-rank structure of G to further reduce the computational cost. We theoretically provide approximation accuracy guarantees and illustrate the properties of the different approximations. Large-scale experimental evaluations on both synthetic and real-world data corroborate our theoretical findings, showing promising speedup with negligible loss in accuracy.
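As a minimal sketch of the kind of randomized estimator the abstract describes (not the paper's actual algorithm), the following computes the matrix-based Rényi's entropy S_α(A) = (1/(1−α)) log₂ tr(A^α) for a trace-normalized Gram matrix A, and estimates tr(A^α) for integer α with a Hutchinson-style probe that needs only matrix-vector multiplications. The RBF kernel, function names, and probe count are illustrative assumptions:

```python
import numpy as np

def gram_matrix(X, sigma=1.0):
    # RBF-kernel Gram matrix, normalized to unit trace so its
    # eigenvalues form a probability distribution (illustrative choice).
    sq = np.sum(X ** 2, axis=1)
    K = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma ** 2))
    return K / np.trace(K)

def renyi_entropy_exact(A, alpha=2.0):
    # Exact eigenspectrum-based definition: costs O(n^3) via eigendecomposition.
    lam = np.clip(np.linalg.eigvalsh(A), 0.0, None)
    return np.log2(np.sum(lam ** alpha)) / (1.0 - alpha)

def hutchinson_trace_power(A, alpha, m=100, rng=None):
    # Randomized estimate of tr(A^alpha) for integer alpha >= 1:
    # average v^T A^alpha v over m Rademacher probe vectors,
    # applying A^alpha v as repeated mat-vecs (O(m * alpha * n^2) total).
    rng = np.random.default_rng(rng)
    n = A.shape[0]
    est = 0.0
    for _ in range(m):
        v = rng.choice([-1.0, 1.0], size=n)
        w = v.copy()
        for _ in range(int(alpha)):
            w = A @ w
        est += v @ w
    return est / m
```

From the randomized trace estimate, a hedged entropy approximation is `np.log2(hutchinson_trace_power(A, 2, m=500)) / (1 - 2)`; the estimator is unbiased for the trace, and accuracy improves as the number of probes m grows.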


Related research

05/16/2022 · Optimal Randomized Approximations for Matrix-based Rényi's Entropy
The Matrix-based Renyi's entropy enables us to directly measure informat...

11/30/2022 · Robust and Fast Measure of Information via Low-rank Representation
The matrix-based Rényi's entropy allows us to directly quantify informat...

12/18/2021 · Revisiting Memory Efficient Kernel Approximation: An Indefinite Learning Perspective
Matrix approximations are a key element in large-scale algebraic machine...

09/22/2022 · Randomized Low-rank Approximation of Monotone Matrix Functions
This work is concerned with computing low-rank approximations of a matri...

02/10/2021 · Fast and Stable Deterministic Approximation of General Symmetric Kernel Matrices in High Dimensions
Kernel methods are used frequently in various applications of machine le...

01/03/2018 · Randomized Linear Algebra Approaches to Estimate the Von Neumann Entropy of Density Matrices
The von Neumann entropy, named after John von Neumann, is the extension ...

06/03/2019 · MEMe: An Accurate Maximum Entropy Method for Efficient Approximations in Large-Scale Machine Learning
Efficient approximation lies at the heart of large-scale machine learnin...
