On the Connection between L_p- and Risk Consistency and its Implications on Regularized Kernel Methods

03/27/2023
by Hannes Köhler et al.

As a predictor's quality is often assessed by means of its risk, it is natural to regard risk consistency as a desirable property of learning methods, and many such methods have indeed been shown to be risk consistent. The first aim of this paper is to establish the close connection between risk consistency and L_p-consistency for a considerably wider class of loss functions than has been done before. The attempt to transfer this connection to shifted loss functions surprisingly reveals that this shift does not reduce the assumptions needed on the underlying probability measure to the same extent as it does for many other results. The results are applied to regularized kernel methods such as support vector machines.
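For orientation, the following sketch recalls the standard notions the abstract refers to; these are common definitions from statistical learning theory rather than statements quoted from the paper, and the symbols R_{L,P}, R_{L,P}^*, f_{L,P}^*, and f_n are assumed notation.

% Risk of a predictor f under loss L and distribution P on X x Y
R_{L,P}(f) := \int_{X \times Y} L\bigl(x, y, f(x)\bigr) \, \mathrm{d}P(x,y),
\qquad
R_{L,P}^* := \inf_{f \text{ measurable}} R_{L,P}(f).

% Risk consistency: the risks of the learned predictors f_n approach the Bayes risk
R_{L,P}(f_n) \;\xrightarrow[n \to \infty]{\;\mathbb{P}\;}\; R_{L,P}^*.

% L_p-consistency: the predictors themselves converge to a Bayes predictor f_{L,P}^* in L_p(P_X)
\bigl\| f_n - f_{L,P}^* \bigr\|_{L_p(P_X)} \;\xrightarrow[n \to \infty]{\;\mathbb{P}\;}\; 0.

% Shifted loss: subtracting L(x,y,0) keeps the risk finite without moment assumptions on Y
L^\star(x, y, t) := L(x, y, t) - L(x, y, 0).

The shifted loss L^\star is the device the abstract alludes to: for many results it removes moment conditions on the underlying probability measure, which is why it is notable that, according to the abstract, the shift does not relax the assumptions here to the same extent.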

