Using Multilevel Circulant Matrix Approximate to Speed Up Kernel Logistic Regression

08/19/2021
by Junna Zhang, et al.

Kernel logistic regression (KLR) is a classical nonlinear classifier in statistical machine learning. The Newton method, with its quadratic convergence rate, solves the KLR problem more effectively than gradient methods. However, an obvious limitation of the Newton method on large-scale problems is its O(n^3) time complexity and O(n^2) space complexity, where n is the number of training instances. In this paper, we employ a multilevel circulant matrix (MCM) to approximate the kernel matrix, saving storage space and accelerating the solution of KLR. Combining the structural properties of MCM with a careful algorithmic design, we propose an MCM-approximate Newton iterative method. We first simplify the Newton direction by exploiting the positive semi-definiteness of the kernel matrix, and then perform a two-step approximation of the Newton direction using MCM. Our method reduces the time complexity of each iteration to O(n log n) via the multidimensional fast Fourier transform (mFFT). In addition, the space complexity can be reduced to O(n) owing to the built-in periodicity of MCM. Experimental results on several large-scale binary and multi-class classification problems show that our method makes KLR scalable to large-scale problems, consumes less memory, and converges to comparable test accuracy in a shorter time.
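The key computational trick behind the O(n log n) iteration cost is that a circulant matrix is diagonalized by the discrete Fourier transform, so a matrix-vector product reduces to elementwise multiplication in the frequency domain. The following sketch (not the authors' code; the function name and example data are illustrative) shows the one-level case, which the multilevel version extends dimension-by-dimension with the mFFT:

```python
import numpy as np

def circulant_matvec(c, v):
    """Compute C @ v in O(n log n), where C is the circulant matrix
    whose first column is c. The eigenvalues of C are fft(c), so
    C @ v equals the circular convolution of c and v."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(v)))

# Illustrative data: first column of a symmetric circulant matrix.
c = np.array([4.0, 1.0, 0.5, 1.0])
v = np.array([1.0, 2.0, 3.0, 4.0])

# Dense reference for comparison: C[i, j] = c[(i - j) mod n].
n = len(c)
C = np.array([[c[(i - j) % n] for j in range(n)] for i in range(n)])

assert np.allclose(C @ v, circulant_matvec(c, v))
```

Note that only the first column c needs to be stored, which is where the O(n) space complexity quoted above comes from.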
