Fast Estimation of Information Theoretic Learning Descriptors using Explicit Inner Product Spaces

01/01/2020
by Kan Li, et al.

Kernel methods form a theoretically grounded, powerful, and versatile framework for solving nonlinear problems in signal processing and machine learning. The standard approach relies on the kernel trick to perform pairwise evaluations of a kernel function, which leads to scalability issues for large datasets due to the linear or superlinear growth of these evaluations with the size of the training data. Recently, we proposed no-trick (NT) kernel adaptive filtering (KAF), which leverages explicit feature-space mappings using data-independent bases with constant complexity. The inner product defined by the feature mapping corresponds to a positive-definite, finite-rank kernel that induces a finite-dimensional reproducing kernel Hilbert space (RKHS). Information theoretic learning (ITL) is a framework in which information-theoretic descriptors based on non-parametric estimators of Renyi entropy replace conventional second-order statistics in the design of adaptive systems. An RKHS for ITL defined on a space of probability density functions simplifies statistical inference for supervised and unsupervised learning. ITL criteria account for the higher-order statistical behavior of systems and signals as desired, but this comes at the cost of increased computational complexity. In this paper, we extend the NT-kernel concept to ITL for improved information extraction from signals without compromising scalability. Specifically, we focus on a family of fast, scalable, and accurate estimators for ITL using explicit inner product space (EIPS) kernels. We demonstrate the superior performance of the EIPS-ITL estimators, and of NT-KAF combined with EIPS-ITL cost functions, through experiments.
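The computational idea behind the abstract, replacing pairwise kernel evaluations with an explicit feature map whose inner product approximates the kernel, can be illustrated with random Fourier features. The sketch below is a minimal, hedged illustration rather than the paper's exact EIPS construction (the function names and parameters here are assumptions): it estimates Renyi's quadratic information potential, V(X) = (1/N^2) sum_ij k(x_i, x_j), in O(N) time as the squared norm of the mean feature vector, compared against the O(N^2) pairwise estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(x, n_features=500, sigma=1.0):
    """Map 1-D samples x to random Fourier features whose inner
    product approximates the Gaussian kernel
    k(x, y) = exp(-(x - y)^2 / (2 sigma^2)).
    (Illustrative explicit feature map, not the paper's exact EIPS basis.)"""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    w = rng.normal(0.0, 1.0 / sigma, size=(1, n_features))  # spectral samples
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)      # random phases
    return np.sqrt(2.0 / n_features) * np.cos(x @ w + b)

def information_potential_eips(x, **kw):
    """O(N) estimate of Renyi's quadratic information potential
    V(X) = (1/N^2) sum_ij k(x_i, x_j), computed as the squared norm
    of the mean feature vector (H2 entropy is then -log V)."""
    phi = rff_features(x, **kw)
    m = phi.mean(axis=0)
    return float(m @ m)

def information_potential_exact(x, sigma=1.0):
    """O(N^2) pairwise Gaussian-kernel estimate, for comparison."""
    x = np.asarray(x, dtype=float)
    d2 = (x[:, None] - x[None, :]) ** 2
    return float(np.exp(-d2 / (2.0 * sigma**2)).mean())
```

With a few thousand features, the O(N) estimate typically agrees with the pairwise one to a few percent, while its cost stays constant per sample, which is the scalability property the NT/EIPS approach exploits.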
