Domain Adaptive Learning Based on Sample-Dependent and Learnable Kernels

02/18/2021
by Xinlong Lu, et al.

Reproducing Kernel Hilbert Space (RKHS) is the common mathematical platform for various kernel methods in machine learning. The purpose of kernel learning is to learn an appropriate RKHS for different machine learning scenarios and training samples. Because an RKHS is uniquely generated by its kernel function, kernel learning can be regarded as kernel function learning. This paper proposes a Domain Adaptive Learning method based on Sample-Dependent and Learnable Kernels (SDLK-DAL). The first contribution of our work is a sample-dependent and learnable Positive Definite Quadratic Kernel (PDQK) framework. Unlike methods that learn the exponential parameter of a Gaussian kernel or the coefficients of a kernel combination, the proposed PDQK is a positive definite quadratic function whose symmetric positive semi-definite matrix is the learnable part in machine learning applications. The second contribution is the application of PDQK to Domain Adaptive Learning (DAL). Our approach learns the PDQK by minimizing the mean discrepancy between source-domain and target-domain data, and then transforms the data into the optimized RKHS generated by the PDQK. In a series of experiments in which the RKHS determined by PDQK replaces those used in several state-of-the-art DAL algorithms, our approach achieves better performance.
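As a rough illustration of the general idea (not the paper's implementation), the sketch below parameterizes a kernel by a learnable positive semi-definite matrix A = L Lᵀ and tunes it by minimizing an empirical mean-discrepancy estimate between source and target samples before mapping the data into the induced feature space. The bilinear kernel form, the biased MMD-style estimator, the trace regularizer, and all names (pdqk, mmd2, L, Xs, Xt) are assumptions made for illustration only; the actual PDQK definition and constraints are given in the paper.

```python
import torch

def pdqk(X, Y, L):
    # Bilinear kernel k(x, y) = x^T A y with A = L L^T,
    # which is positive semi-definite by construction.
    A = L @ L.T
    return X @ A @ Y.T

def mmd2(Xs, Xt, L):
    # Biased empirical estimate of the squared mean discrepancy
    # between source and target samples under the kernel above.
    Kss = pdqk(Xs, Xs, L)
    Ktt = pdqk(Xt, Xt, L)
    Kst = pdqk(Xs, Xt, L)
    return Kss.mean() + Ktt.mean() - 2.0 * Kst.mean()

d = 16
Xs = torch.randn(200, d)               # toy source-domain samples
Xt = torch.randn(200, d) + 0.5         # toy target-domain samples with a shift
L = torch.nn.Parameter(torch.eye(d))   # learnable factor of A = L L^T

opt = torch.optim.Adam([L], lr=1e-2)
for step in range(300):
    opt.zero_grad()
    A = L @ L.T
    # The trace penalty keeps A away from the trivial zero solution;
    # the paper's actual constraint may differ.
    loss = mmd2(Xs, Xt, L) + 1e-2 * (torch.trace(A) - d) ** 2
    loss.backward()
    opt.step()

# Map both domains into the learned feature space phi(x) = L^T x,
# whose inner product reproduces the learned kernel.
Zs, Zt = Xs @ L.detach(), Xt @ L.detach()
```

Downstream DAL algorithms would then operate on the transformed representations Zs and Zt (or, equivalently, on the learned kernel matrix) in place of their original RKHS.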
