Solving ℓ^p-norm regularization with tensor kernels

07/18/2017
by Saverio Salzo et al.

In this paper, we discuss how a suitable family of tensor kernels can be used to efficiently solve nonparametric extensions of ℓ^p regularized learning methods. Our main contribution is a fast dual algorithm, together with a proof that it solves the problem efficiently. These results contrast with recent findings suggesting that kernel methods cannot be extended beyond the Hilbert-space setting. Numerical experiments confirm the effectiveness of the method.
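As a rough illustration of the kind of problem the abstract refers to, the sketch below fits a linear model with an ℓ^p penalty for 1 < p < 2 (the sparsity-promoting regime). This is not the paper's tensor-kernel dual algorithm: it is a toy primal gradient-descent solver, and the function names, step size, and parameter values are illustrative assumptions.

```python
import numpy as np

def lp_objective(w, X, y, p, lam):
    # Least-squares data fit plus an l^p penalty with 1 < p < 2,
    # the regime where the penalty promotes sparsity but stays smooth.
    return 0.5 * np.mean((X @ w - y) ** 2) + lam * np.sum(np.abs(w) ** p)

def lp_gradient_descent(X, y, p=1.5, lam=0.05, lr=0.01, iters=3000):
    # For p > 1 the penalty |t|^p is differentiable, with derivative
    # p * sign(t) * |t|^(p - 1), so plain gradient descent applies.
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        grad_fit = X.T @ (X @ w - y) / n
        grad_pen = lam * p * np.sign(w) * np.abs(w) ** (p - 1)
        w = w - lr * (grad_fit + grad_pen)
    return w
```

In practice one would solve the dual problem instead, as the paper proposes; the primal iteration above is only meant to make the objective concrete.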


Related research

- 03/23/2020: Efficient Tensor Kernel methods for sparse regression
  Recently, classical kernel methods have been extended by the introductio...

- 01/30/2020: Reproducing kernels based schemes for nonparametric regression
  In this work, we develop and study an empirical projection operator sche...

- 12/16/2021: Randomized regularized extended Kaczmarz algorithms for tensor recovery
  Randomized regularized Kaczmarz algorithms have recently been proposed t...

- 03/18/2016: Generalized support vector regression: duality and tensor-kernel representation
  In this paper we study the variational problem associated to support vec...

- 05/16/2018: End-to-end Learning of a Convolutional Neural Network via Deep Tensor Decomposition
  In this paper we study the problem of learning the weights of a deep con...

- 05/09/2012: L2 Regularization for Learning Kernels
  The choice of the kernel is critical to the success of many learning alg...

- 09/06/2015: Theoretical and Experimental Analyses of Tensor-Based Regression and Classification
  We theoretically and experimentally investigate tensor-based regression ...
