On the Dualization of Operator-Valued Kernel Machines

October 10, 2019 · Pierre Laforgue et al.

Operator-Valued Kernels (OVKs) and Vector-Valued Reproducing Kernel Hilbert Spaces (vv-RKHSs) provide an elegant way to extend scalar kernel methods when the output space is a Hilbert space. First used in multi-task regression, this theoretical framework opens the door to various applications, ranging from structured output prediction to functional regression, thanks to its ability to deal with infinite-dimensional output spaces. This work investigates how to use the duality principle to handle families of loss functions as yet unexplored within vv-RKHSs. The difficulty posed by infinite-dimensional dual variables is overcome either by means of a Double Representer Theorem, when the loss depends solely on inner products, or by an in-depth analysis of the Fenchel-Legendre transform of integral losses. Experiments on structured prediction, function-to-function regression, and structured representation learning with ϵ-insensitive and Huber losses illustrate the benefits of this framework.
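To make the setting concrete, the following is a minimal sketch of vector-valued kernel ridge regression with a separable operator-valued kernel K(x, x') = k(x, x')·A, where k is a scalar Gaussian kernel and A a p×p positive semi-definite matrix encoding output correlations. This illustrates only the basic vv-RKHS regression setup the paper builds on (the square loss, where duality is classical), not the paper's method for ϵ-insensitive or Huber losses; all names and parameter choices below are illustrative assumptions.

```python
import numpy as np

def scalar_gram(X1, X2, gamma=1.0):
    # Gaussian kernel cross-Gram matrix: k(x, x') = exp(-gamma * ||x - x'||^2)
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2 * X1 @ X2.T)
    return np.exp(-gamma * d2)

def ovk_ridge_fit(X, Y, A, lam=0.1, gamma=1.0):
    # Dual coefficients of vv-RKHS ridge regression with the separable
    # kernel K(x, x') = k(x, x') * A: solve (K ⊗ A + n*lam*I) c = vec(Y).
    n, p = Y.shape
    G = np.kron(scalar_gram(X, X, gamma), A)   # (n*p) x (n*p) operator Gram
    c = np.linalg.solve(G + n * lam * np.eye(n * p), Y.reshape(-1))
    return c.reshape(n, p)                     # one dual vector c_i per sample

def ovk_ridge_predict(Xtr, C, A, Xte, gamma=1.0):
    # f(x) = sum_i k(x, x_i) * A @ c_i, vectorized over test points
    Kx = scalar_gram(Xte, Xtr, gamma)
    return Kx @ C @ A.T

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
Y = np.column_stack([np.sin(X[:, 0]), np.cos(X[:, 1])])   # 2-dim outputs
A = np.eye(2)                                             # independent outputs
C = ovk_ridge_fit(X, Y, A)
pred = ovk_ridge_predict(X, C, A, X)
print(pred.shape)  # (30, 2)
```

With A = I the model decouples into p independent scalar kernel ridge regressions; a non-diagonal A couples the output components, which is the multi-task effect OVKs were first introduced for. The paper's contribution is to extend the dual treatment beyond this square-loss case to losses such as ϵ-insensitive and Huber, and to genuinely infinite-dimensional outputs, where the dual variables themselves live in a Hilbert space.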
