Sparse Learning over Infinite Subgraph Features

03/20/2014
by Ichigaku Takigawa, et al.

We present a supervised learning algorithm for graph data (a set of graphs) that handles arbitrary twice-differentiable loss functions and sparse linear models over all possible subgraph features. Prior work has shown that several types of sparse learning over all possible subgraph features, such as AdaBoost, LPBoost, LARS/LASSO, and sparse PLS regression, can be performed, with particular emphasis on learning relevant features simultaneously from an infinite set of candidates. We first generalize the techniques used in these preceding studies to derive a unifying bounding technique for arbitrary separable functions. We then carefully use this bound to make block coordinate gradient descent feasible over infinite subgraph features, resulting in a fast-converging algorithm that can solve a wider class of sparse learning problems over graph data. We also empirically study how our method differs from existing approaches in convergence behavior, selected subgraph features, and search-space size, and we discuss several previously unnoticed issues in sparse learning over all possible subgraph features.
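To make the abstract's setting concrete, here is a minimal sketch of greedy (block size 1) coordinate gradient descent for an L1-regularized squared loss, one instance of the sparse linear models described. Everything here is illustrative, not the paper's actual algorithm: a small finite indicator matrix (X[i, j] = 1 if a hypothetical subgraph pattern j occurs in graph i) stands in for the infinite feature set, and the coordinate-selection step below is the one that the paper's bounding technique would instead locate by pruned search over patterns.

```python
import numpy as np

# Illustrative stand-in data: binary subgraph-indicator features.
rng = np.random.default_rng(0)
n, p = 40, 30
X = (rng.random((n, p)) < 0.3).astype(float)
w_true = np.zeros(p)
w_true[:3] = [2.0, -1.5, 1.0]           # only three relevant features
y = X @ w_true + 0.1 * rng.standard_normal(n)

lam = 0.05                               # L1 penalty strength
w = np.zeros(p)

def soft_threshold(z, t):
    # Proximal operator of the L1 penalty.
    return np.sign(z) * max(abs(z) - t, 0.0)

def objective(w):
    r = y - X @ w
    return 0.5 * (r @ r) / n + lam * np.abs(w).sum()

losses = [objective(w)]
for _ in range(100):
    r = y - X @ w
    g = X.T @ r / n                      # negative gradient of the smooth part
    # Greedy selection: the column most correlated with the residual.
    # Over infinite subgraph features, this argmax cannot be computed by
    # scanning columns; a branch-and-bound search with an upper bound
    # (the role of the paper's separable-function bound) is needed.
    j = int(np.argmax(np.abs(g)))
    # Exact coordinate minimization for squared loss on coordinate j.
    rho = X[:, j] @ (r + X[:, j] * w[j]) / n
    z = (X[:, j] @ X[:, j]) / n          # per-coordinate curvature
    if z > 0:
        w[j] = soft_threshold(rho, lam) / z
    losses.append(objective(w))

print(f"nonzero weights: {(np.abs(w) > 1e-8).sum()} of {p}")
print(f"objective: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

Because each coordinate update exactly minimizes the objective along that coordinate, the objective is non-increasing, and the L1 proximal step keeps most weights at exactly zero, which is the sparsity the abstract refers to.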

