
A Unifying View of Explicit and Implicit Feature Maps for Structured Data: Systematic Studies of Graph Kernels

by Nils M. Kriege, et al.
TU Dortmund
Washington University in St Louis

Non-linear kernel methods can be approximated by fast linear ones using suitable explicit feature maps, allowing their application to large-scale problems. To this end, explicit feature maps of kernels for vectorial data have been extensively studied. As much real-world data is structured, various kernels for complex data such as graphs have been proposed. Indeed, many of them directly compute feature maps. However, the kernel trick is employed when the number of features is very large or the individual vertices of graphs are annotated with real-valued attributes. Can we still compute explicit feature maps efficiently under these circumstances? Triggered by this question, we investigate how general convolution kernels are composed from base kernels and construct corresponding feature maps. We apply our results to widely used graph kernels and analyze for which kernels and graph properties computation by explicit feature maps is feasible and actually more efficient. In particular, we derive feature maps for random walk and subgraph matching kernels and apply them to real-world graphs with discrete labels. Our theoretical results are thereby confirmed experimentally through a phase transition in running time with respect to label diversity, walk length and subgraph size, respectively. Moreover, we derive approximate, explicit feature maps for state-of-the-art kernels supporting real-valued attributes, including the GraphHopper and Graph Invariant kernels. In extensive experiments we show that our approaches often achieve a classification accuracy close to that of the exact methods based on the kernel trick, while requiring only a fraction of their running time.
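To illustrate the core idea of replacing the kernel trick with an explicit feature map, the following is a minimal sketch (not the paper's construction): for a toy graph kernel over discrete vertex labels, each graph is mapped to a sparse count vector over its label alphabet, and the kernel value is recovered as an ordinary dot product of these explicit feature vectors. The label alphabet and graphs are made up for the example.

```python
from collections import Counter

def feature_map(vertex_labels):
    """Explicit feature map for a simple vertex-label kernel:
    maps a graph, given by its discrete vertex labels, to a
    sparse count vector over the label alphabet."""
    return Counter(vertex_labels)

def linear_kernel(phi_g, phi_h):
    """Dot product of two sparse feature vectors; equals the
    kernel value without invoking the kernel trick."""
    return sum(count * phi_h[label] for label, count in phi_g.items())

# Two toy graphs represented by their vertex labels (hypothetical data).
g = ["C", "C", "O", "N"]
h = ["C", "O", "O"]

phi_g, phi_h = feature_map(g), feature_map(h)
print(linear_kernel(phi_g, phi_h))  # 2*1 + 1*2 + 1*0 = 4
```

With an explicit map like this, a linear SVM on the feature vectors can replace a kernel SVM; this is efficient exactly when the feature space stays small, which is the trade-off the abstract's phase-transition observation concerns.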

