Efficient online learning with kernels for adversarial large scale problems

02/26/2019
by Rémi Jézéquel, et al.

We are interested in a framework of online learning with kernels for low-dimensional but large-scale and potentially adversarial datasets. Considering the Gaussian kernel, we study the computational and theoretical performance of online variants of kernel Ridge regression. The resulting algorithm is based on approximations of the Gaussian kernel through Taylor expansion. For d-dimensional inputs, it achieves a (close to) optimal regret of order O((log n)^(d+1)) with per-round time complexity and space complexity O((log n)^(2d)). This makes the algorithm a suitable choice as soon as n ≫ e^d, which is likely to happen in scenarios with small-dimensional and large-scale datasets.
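To make the construction concrete, the sketch below shows one way to realize the idea summarized above: an explicit feature map obtained by truncating the Taylor expansion of the Gaussian kernel, combined with online ridge regression on those features so that each round costs O(D^2) in the feature dimension D. This is a minimal illustration under assumptions, not the authors' exact algorithm; the names gaussian_taylor_features and OnlineRidgeForecaster, and the parameters sigma, degree and lam, are introduced here purely for illustration.

```python
import itertools
import math

import numpy as np


def gaussian_taylor_features(x, degree, sigma):
    """Explicit (truncated) feature map for the Gaussian kernel.

    Writes k(x, y) = exp(-|x|^2/(2s^2)) exp(-|y|^2/(2s^2)) exp(<x, y>/s^2)
    and truncates the Taylor series of the last factor at `degree`, so that
    <phi(x), phi(y)> approximates k(x, y).  The feature dimension is
    C(d + degree, degree), i.e. polynomial in `degree` for fixed d.
    """
    x = np.asarray(x, dtype=float)
    d = x.shape[0]
    feats = []
    for k in range(degree + 1):
        # One feature per monomial of total degree k (multi-index alpha),
        # weighted so that sum_alpha phi_alpha(x) phi_alpha(y) recovers
        # <x, y>^k / (k! sigma^(2k)).
        for alpha in itertools.combinations_with_replacement(range(d), k):
            counts = (np.bincount(alpha, minlength=d) if k > 0
                      else np.zeros(d, dtype=int))
            alpha_fact = np.prod([math.factorial(c) for c in counts])
            coeff = 1.0 / math.sqrt(alpha_fact * sigma ** (2 * k))
            feats.append(coeff * np.prod(x ** counts))
    return math.exp(-x @ x / (2 * sigma ** 2)) * np.array(feats)


class OnlineRidgeForecaster:
    """Plain online ridge regression on explicit features.

    Keeps the inverse regularized Gram matrix via rank-one
    (Sherman-Morrison) updates, so every round costs O(D^2) time and
    space in the feature dimension D.
    """

    def __init__(self, dim, lam=1.0):
        self.A_inv = np.eye(dim) / lam  # (lam * I)^{-1}
        self.b = np.zeros(dim)          # running sum of y_t * phi_t

    def predict(self, phi):
        return float((self.A_inv @ self.b) @ phi)

    def update(self, phi, y):
        Av = self.A_inv @ phi
        self.A_inv -= np.outer(Av, Av) / (1.0 + phi @ Av)
        self.b += y * phi


# Toy stream just to exercise the interface: predict, then observe y_t.
rng = np.random.default_rng(0)
d, sigma, degree = 2, 1.0, 5
model = None
for t in range(200):
    x = rng.uniform(-1.0, 1.0, size=d)
    y = math.sin(3.0 * x[0]) + 0.1 * rng.normal()
    phi = gaussian_taylor_features(x, degree, sigma)
    if model is None:
        model = OnlineRidgeForecaster(len(phi), lam=1.0)
    y_hat = model.predict(phi)  # prediction made before y is revealed
    model.update(phi, y)
```

If the truncation degree is grown like log n, the feature dimension C(d + degree, degree) scales as O((log n)^d) for fixed d, which is what drives the (log n)^(2d) per-round time and space complexity quoted in the abstract.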
