Fine-grained Generalization Analysis of Vector-valued Learning

04/29/2021
by Liang Wu, et al.

Many fundamental machine learning tasks can be formulated as learning with vector-valued functions, where several scalar-valued functions are learned jointly. While generalization analyses exist for specific algorithms under the empirical risk minimization principle, a unifying analysis of vector-valued learning under a regularization framework is still lacking. In this paper, we initiate the generalization analysis of regularized vector-valued learning algorithms by presenting bounds with a mild dependency on the output dimension and a fast rate in the sample size. Our analysis relaxes existing assumptions on the restrictive constraints of hypothesis spaces, the smoothness of loss functions, and the low-noise condition. To understand the interaction between optimization and learning, we further use our results to derive the first generalization bounds for stochastic gradient descent with vector-valued functions. We apply our general results to multi-class classification and multi-label classification, obtaining the first bounds with a logarithmic dependency on the output dimension for extreme multi-label classification with Frobenius regularization. As a byproduct, we derive a Rademacher complexity bound for loss function classes defined in terms of a general strongly convex function.
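To make the setting concrete, the regularized learning problems discussed above can be written schematically as follows; the notation (linear model W, loss \ell, regularization parameter \lambda) is ours and is not fixed by the abstract:

\min_{W \in \mathbb{R}^{c \times d}} \; \frac{1}{n} \sum_{i=1}^{n} \ell(W x_i, y_i) + \lambda \, \Omega(W), \qquad \Omega(W) = \|W\|_F^2,

where c is the output dimension (the number of scalar-valued functions, e.g., classes or labels), n is the sample size, and \Omega is the regularizer, here instantiated as the Frobenius regularization mentioned in the abstract.

As an illustration of stochastic gradient descent with vector-valued functions, the sketch below trains a linear multi-class model with softmax cross-entropy loss and Frobenius regularization in Python/NumPy. The function name, loss choice, and hyperparameters are assumptions made for illustration; this is not the paper's algorithm.

    import numpy as np

    def sgd_vector_valued(X, Y, num_classes, lr=0.1, lam=1e-3, epochs=5, seed=0):
        # Illustrative sketch (assumed setup, not the paper's exact method):
        # SGD for a linear vector-valued predictor h(x) = W x, trained with
        # softmax cross-entropy loss plus the regularizer lam * ||W||_F^2.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        W = np.zeros((num_classes, d))        # one scalar-valued function per row
        for _ in range(epochs):
            for i in rng.permutation(n):
                scores = W @ X[i]             # vector-valued output in R^c
                scores -= scores.max()        # stabilize the softmax
                p = np.exp(scores) / np.exp(scores).sum()
                grad = np.outer(p, X[i])      # d(cross-entropy)/dW = (p - e_y) x^T
                grad[Y[i]] -= X[i]
                W -= lr * (grad + 2.0 * lam * W)  # Frobenius term adds 2*lam*W
        return W

    # Example usage on synthetic data:
    # X = np.random.randn(200, 10); Y = np.random.randint(0, 3, size=200)
    # W = sgd_vector_valued(X, Y, num_classes=3)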

Related research

09/11/2019 - Learning Vector-valued Functions with Local Rademacher Complexity
We consider a general family of problems of which the output space admit...

02/22/2020 - Optimistic bounds for multi-output prediction
We investigate the challenge of multi-output learning, where the goal is...

02/09/2022 - Towards Empirical Process Theory for Vector-Valued Functions: Metric Entropy of Smooth Function Classes
This paper provides some first steps in developing empirical process the...

05/31/2021 - Fine-grained Generalization Analysis of Structured Output Prediction
In machine learning we often encounter structured output prediction prob...

06/05/2016 - Bounds for Vector-Valued Function Estimation
We present a framework to derive risk bounds for vector-valued learning ...

01/31/2014 - A Unifying Framework in Vector-valued Reproducing Kernel Hilbert Spaces for Manifold Regularization and Co-Regularized Multi-view Learning
This paper presents a general vector-valued reproducing kernel Hilbert s...

09/19/2022 - Generalization Bounds for Stochastic Gradient Descent via Localized ε-Covers
In this paper, we propose a new covering technique localized for the tra...
