
Efficient Tensor Kernel methods for sparse regression

03/23/2020
by Feliks Hibraj, et al.

Recently, classical kernel methods have been extended by the introduction of suitable tensor kernels so as to promote sparsity in the solution of the underlying regression problem. Indeed, such methods solve an ℓ^p-norm regularization problem, with p = m/(m-1) and m an even integer, which for large m is close to a lasso problem. However, a major drawback of the method is that storing tensors requires a considerable amount of memory, ultimately limiting its applicability. In this work we address this problem with two advances. First, we directly reduce the memory requirement by introducing a new, more efficient layout for storing the data. Second, we use a Nyström-type subsampling approach, which allows for a training phase with a smaller number of data points, thereby reducing the computational cost. Experiments on both synthetic and real datasets show the effectiveness of the proposed improvements. Finally, we implement the code in C++ to further speed up the computation.
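For context, the variational problem behind this line of work (the loss L and feature map φ below are generic placeholders, not notation from the paper) can be sketched as

\min_{w \in \ell^p} \; \frac{1}{n} \sum_{i=1}^{n} L\big(y_i, \langle w, \phi(x_i) \rangle\big) + \lambda \, \|w\|_p^p,
\qquad p = \frac{m}{m-1}, \quad m \text{ an even integer}.

For m = 2 this reduces to the familiar ℓ^2 (ridge) setting, while as m grows p approaches 1, so the penalty approaches the ℓ^1 penalty of the lasso; this is the sense in which the problem is "close to a lasso problem". The price of p ≠ 2 is that the associated representer theorem involves a kernel with m arguments, i.e. an m-th order tensor with on the order of n^m entries over n training points, which is exactly the memory bottleneck the paper targets.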
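The abstract does not spell out the proposed storage layout, but the general idea such layouts exploit can be illustrated: an m-th order tensor kernel is symmetric in its arguments, so only index tuples with i_1 ≤ i_2 ≤ … ≤ i_m need to be stored, shrinking storage from n^m entries to C(n+m-1, m), roughly an m!-fold saving. Below is a minimal Python sketch for order m = 4; the function names and the toy kernel are illustrative assumptions, not the paper's implementation.

```python
from itertools import combinations_with_replacement
from math import comb
import numpy as np

def pack_index(idx):
    """Map an index tuple (in any order) to a dense 0-based offset.

    Sorting exploits the tensor's symmetry; the shifted, strictly
    increasing tuple is then ranked via the combinatorial number
    system (colexicographic order).
    """
    strict = [c + t for t, c in enumerate(sorted(idx))]
    return sum(comb(c, t + 1) for t, c in enumerate(strict))

def packed_tensor(X, k4, order=4):
    """Evaluate a symmetric order-4 kernel k4 only on unique index tuples."""
    n = len(X)
    store = np.empty(comb(n + order - 1, order))
    for idx in combinations_with_replacement(range(n), order):
        store[pack_index(idx)] = k4(*(X[i] for i in idx))
    return store

# Toy fully symmetric 4-argument kernel (illustrative only).
k4 = lambda a, b, c, d: float((a + b + c + d) @ (a + b + c + d))

X = np.random.randn(30, 3)
T = packed_tensor(X, k4)   # comb(33, 4) = 40920 entries instead of 30**4 = 810000
assert T[pack_index((3, 1, 2, 0))] == T[pack_index((0, 1, 2, 3))]
```

Any permutation of (i, j, k, l) resolves to the same stored entry, so the full tensor is recoverable from the packed array alone.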
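How the Nyström idea is adapted to tensor kernels is specific to the paper, but the classical matrix-kernel version it starts from is standard: pick a small set of landmark points and approximate the full n×n Gram matrix K by K_nm K_mm^{-1} K_mn. A sketch of that baseline, assuming a Gaussian kernel and hypothetical function names:

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    # Pairwise Gaussian kernel matrix between rows of X and rows of Y.
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_features(X, num_landmarks, gamma=1.0, seed=None):
    """Nystrom feature map Z such that Z @ Z.T approximates the Gram matrix."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=num_landmarks, replace=False)
    landmarks = X[idx]
    K_mm = gaussian_kernel(landmarks, landmarks, gamma)
    K_nm = gaussian_kernel(X, landmarks, gamma)
    # Eigendecomposition of the small m x m block; clip for stability.
    w, V = np.linalg.eigh(K_mm)
    w = np.clip(w, 1e-12, None)
    # Z = K_nm K_mm^{-1/2}, hence Z Z^T = K_nm K_mm^{-1} K_mn.
    return K_nm @ (V / np.sqrt(w))

X = np.random.randn(1000, 5)
Z = nystrom_features(X, num_landmarks=100, seed=0)  # shape (1000, 100)
```

Training then runs on the thin n×m feature matrix Z, so the cost is driven by the number of landmarks rather than by n, which is the computational saving the abstract refers to.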

Related research

07/18/2017 · Solving ℓ^p-norm regularization with tensor kernels
In this paper, we discuss how a suitable family of tensor kernels can be...

05/25/2019 · Fast and Accurate Gaussian Kernel Ridge Regression Using Matrix Decompositions for Preconditioning
This paper presents a method for building a preconditioner for a kernel ...

03/18/2016 · Generalized support vector regression: duality and tensor-kernel representation
In this paper we study the variational problem associated to support vec...

01/02/2020 · A Parallel Sparse Tensor Benchmark Suite on CPUs and GPUs
Tensor computations present significant performance challenges that impa...

11/16/2022 · On some orthogonalization schemes in Tensor Train format
In the framework of tensor spaces, we consider orthogonalization kernels...

02/28/2019 · Tensor-variate Mixture of Experts
When data are organized in matrices or arrays of higher dimensions (tens...