Theoretical Analysis of Divide-and-Conquer ERM: Beyond Square Loss and RKHS

03/09/2020
by   Yong Liu, et al.

Theoretical analysis of divide-and-conquer based distributed learning with least square loss in the reproducing kernel Hilbert space (RKHS) has recently been explored within the framework of learning theory. However, studies on learning theory for general loss functions and hypothesis spaces remain limited. To fill this gap, we study the risk performance of distributed empirical risk minimization (ERM) for general loss functions and hypothesis spaces. The main contributions are two-fold. First, we derive two tight risk bounds under certain basic assumptions on the hypothesis space, as well as the smoothness, Lipschitz continuity, and strong convexity of the loss function. Second, we develop a more general risk bound for distributed ERM without the restriction of strong convexity.
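To make the setting concrete, the divide-and-conquer scheme analyzed here partitions the sample, solves an ERM problem on each partition, and averages the local solutions. The sketch below illustrates this with square loss and a linear hypothesis space (ridge regression in closed form); this is only the special case the abstract contrasts with, and all function names and parameters (`local_erm`, `divide_and_conquer_erm`, `lam`, `num_partitions`) are illustrative, not from the paper.

```python
import numpy as np

def local_erm(X, y, lam=0.1):
    # Local regularized ERM with square loss: the ridge closed-form solution
    # argmin_w (1/n)||Xw - y||^2 + lam ||w||^2.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * len(y) * np.eye(d), X.T @ y)

def divide_and_conquer_erm(X, y, num_partitions=4, lam=0.1):
    # Split the sample into disjoint blocks, solve ERM on each block,
    # and average the local minimizers.
    parts = np.array_split(np.arange(len(y)), num_partitions)
    local = [local_erm(X[idx], y[idx], lam) for idx in parts]
    return np.mean(local, axis=0)

# Synthetic sanity check: the averaged estimator should recover the
# true weights up to noise and regularization bias.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 5))
w_true = np.arange(1.0, 6.0)
y = X @ w_true + 0.1 * rng.normal(size=2000)
w_hat = divide_and_conquer_erm(X, y, num_partitions=8, lam=1e-3)
```

For general (non-square) losses, the local solver would be replaced by an iterative optimizer, but the partition-then-average structure stays the same, which is what the risk bounds above target.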

