A Unifying View of Explicit and Implicit Feature Maps for Structured Data: Systematic Studies of Graph Kernels

03/02/2017
by Nils M. Kriege, et al.

Non-linear kernel methods can be approximated by fast linear ones using suitable explicit feature maps, allowing their application to large-scale problems. To this end, explicit feature maps of kernels for vectorial data have been studied extensively. Since much real-world data is structured, various kernels for complex data such as graphs have been proposed. Indeed, many of them directly compute feature maps. However, the kernel trick is employed when the number of features is very large or the individual vertices of graphs are annotated by real-valued attributes. Can we still compute explicit feature maps efficiently under these circumstances? Triggered by this question, we investigate how general convolution kernels are composed from base kernels and construct corresponding feature maps. We apply our results to widely used graph kernels and analyze for which kernels and graph properties computation by explicit feature maps is feasible and actually more efficient. In particular, we derive feature maps for random walk and subgraph matching kernels and apply them to real-world graphs with discrete labels. Our theoretical results are confirmed experimentally by a phase transition in running time with respect to label diversity, walk length and subgraph size, respectively. Moreover, we derive approximate, explicit feature maps for state-of-the-art kernels supporting real-valued attributes, including the GraphHopper and Graph Invariant kernels. In extensive experiments we show that our approaches often achieve a classification accuracy close to that of the exact methods based on the kernel trick, but require only a fraction of their running time.
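The construction rests on standard closure properties of feature maps: for a sum of base kernels the feature maps are concatenated, for a product they are combined by a tensor (Kronecker) product, and for an R-convolution kernel the feature maps of the parts are summed. The following Python sketch illustrates these rules on a toy example with discrete vertex labels and a Dirac base kernel; the function names and the label-histogram example are illustrative assumptions and do not reproduce the authors' implementation.

import numpy as np

# Closure properties of explicit feature maps (illustrative sketch):
#   k1 + k2                   ->  concatenation of the base feature maps
#   k1 * k2                   ->  Kronecker (tensor) product of the base maps
#   R-convolution over parts  ->  sum of the part feature maps

def phi_dirac(label, alphabet):
    """One-hot feature map of the Dirac kernel k(x, y) = [x == y]."""
    vec = np.zeros(len(alphabet))
    vec[alphabet.index(label)] = 1.0
    return vec

def phi_sum(phi1, phi2):
    """Feature map of k1 + k2."""
    return np.concatenate([phi1, phi2])

def phi_product(phi1, phi2):
    """Feature map of k1 * k2."""
    return np.kron(phi1, phi2)

def phi_convolution(parts, phi_part):
    """Feature map of an R-convolution kernel: sum over the parts, so that
    <phi(G), phi(H)> equals the sum of base kernel values over all part pairs."""
    return np.sum([phi_part(p) for p in parts], axis=0)

# Toy example: graphs represented by their multisets of vertex labels.
alphabet = ["C", "N", "O"]
G, H = ["C", "C", "O"], ["C", "N", "O"]

phi_G = phi_convolution(G, lambda v: phi_dirac(v, alphabet))
phi_H = phi_convolution(H, lambda v: phi_dirac(v, alphabet))

# Implicit (kernel trick) and explicit computation agree:
implicit = sum(1.0 for v in G for w in H if v == w)  # 3 matching label pairs
explicit = float(phi_G @ phi_H)
assert np.isclose(implicit, explicit)

For discrete labels the composed feature vectors stay sparse and low-dimensional, which matches the abstract's observation that explicit computation pays off below a phase transition in label diversity, walk length and subgraph size.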
