Efficient Multi-task Feature and Relationship Learning
In this paper, we propose a multi-convex framework for multi-task learning that improves predictions by learning relationships both between tasks and between features. Our framework is a generalization of related methods in multi-task learning that learn either task relationships or feature relationships, but not both. We start with a hierarchical Bayesian model and use the empirical Bayes method to transform the underlying inference problem into a multi-convex optimization problem. We propose a coordinate-wise minimization algorithm that has a closed-form solution for each block subproblem. Computed naively, these solutions would be expensive, but by using the theory of doubly stochastic matrices we reduce the underlying matrix optimization subproblem to a minimum-weight perfect matching problem on a complete bipartite graph, which we solve analytically and efficiently. For the weight-learning subproblem, we propose three different strategies, including a gradient descent method with a linear convergence guarantee when instances are not shared by multiple tasks, and a numerical solution based on the Sylvester equation when instances are shared. We demonstrate the efficiency of our method on both synthetic and real-world datasets. Experiments show that the proposed optimization method is orders of magnitude faster than an off-the-shelf projected gradient method, and that our model exploits the correlation structures among multiple tasks and features.
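As a rough illustration of the two subproblem solvers named in the abstract, the sketch below pairs standard SciPy routines with the stated structure: scipy.optimize.linear_sum_assignment for the minimum-weight perfect matching on a complete bipartite graph, and scipy.linalg.solve_sylvester for a weight subproblem assumed here to take the Sylvester form AW + WB = C. The cost matrix, the helper names match_spectra and solve_weights, and the reduction to a pairwise matching cost are illustrative assumptions, not the paper's actual derivation.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.linalg import solve_sylvester

# --- Covariance-block subproblem (hypothetical reduction) ---
# The abstract states that the matrix optimization subproblem reduces,
# via the theory of doubly stochastic matrices, to a minimum-weight
# perfect matching on a complete bipartite graph. `cost` is a
# hypothetical pairwise cost matrix for that matching.
def match_spectra(cost):
    """Minimum-weight perfect matching on a complete bipartite graph."""
    rows, cols = linear_sum_assignment(cost)
    return rows, cols, cost[rows, cols].sum()

# --- Weight-learning subproblem with shared instances ---
# Assumed stationarity condition of the form A W + W B = C, which
# scipy.linalg.solve_sylvester handles directly.
def solve_weights(A, B, C):
    return solve_sylvester(A, B, C)

# Tiny usage example with random data.
rng = np.random.default_rng(0)
cost = rng.random((4, 4))
print(match_spectra(cost))

A = np.diag(rng.random(3) + 1.0)  # well-conditioned diagonal blocks
B = np.diag(rng.random(3) + 1.0)
C = rng.random((3, 3))
W = solve_weights(A, B, C)
print(np.allclose(A @ W + W @ B, C))  # verify the Sylvester residual
```

Note that linear_sum_assignment is an exact cubic-time assignment solver, which is consistent with the abstract's claim that the matching subproblem can be solved efficiently once the reduction is in place; the paper's analytic solution may exploit additional structure beyond what this generic solver uses.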