NeCPD: An Online Tensor Decomposition with Optimal Stochastic Gradient Descent
Multi-way data analysis has become an essential tool for capturing underlying structures in higher-order datasets stored as a tensor X ∈ ℝ^{I_1 × … × I_N}. CANDECOMP/PARAFAC (CP) decomposition has been extensively studied and applied to approximate X by N loading matrices A^(1), …, A^(N), where N is the order of the tensor. We propose NeCPD, a new efficient CP decomposition solver for this non-convex problem in multi-way online data, based on the stochastic gradient descent (SGD) algorithm. SGD is well suited to the online setting because it allows the model to be updated with each new observation X^(t+1) in a single step. In terms of global convergence, however, SGD is known to get stuck in saddle points on non-convex problems. We study the Hessian matrix to identify these saddle points and then escape them using a perturbation approach that adds a small amount of noise to the gradient update step. We further apply Nesterov's Accelerated Gradient (NAG) method within the SGD algorithm to optimally accelerate the convergence rate and compensate for the Hessian computation delay per epoch. Experimental evaluation in the field of structural health monitoring on laboratory-based and real-life structural datasets shows that our method provides more accurate results than existing online tensor analysis methods.
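The abstract combines three ingredients: entry-wise SGD updates of the CP loading matrices, a small random perturbation of each gradient step to escape saddle points, and Nesterov momentum to speed up convergence. The sketch below illustrates that combination for a 3-way tensor; it is a minimal illustration of the general idea, not the authors' NeCPD implementation, and all names and hyperparameters (rank R, step size lr, momentum beta, noise scale sigma) are illustrative assumptions.

```python
# Minimal sketch: CP decomposition of a 3-way tensor fitted by SGD with
# Nesterov-style momentum and a small isotropic perturbation added to each
# gradient (to help escape saddle points). Not the authors' NeCPD code.
import numpy as np

def cp_sgd_nag(X, R=3, lr=0.01, beta=0.9, sigma=1e-4, epochs=50, seed=0):
    rng = np.random.default_rng(seed)
    I, J, K = X.shape
    # Loading matrices A^(1), A^(2), A^(3)
    A = rng.standard_normal((I, R)) * 0.1
    B = rng.standard_normal((J, R)) * 0.1
    C = rng.standard_normal((K, R)) * 0.1
    vA, vB, vC = np.zeros_like(A), np.zeros_like(B), np.zeros_like(C)
    for _ in range(epochs):
        for _ in range(I * J * K // 4):
            # Sample one entry (i, j, k) per step: the online/stochastic update.
            i, j, k = rng.integers(I), rng.integers(J), rng.integers(K)
            # Nesterov look-ahead: evaluate the gradient at the extrapolated point.
            Ai = A[i] + beta * vA[i]
            Bj = B[j] + beta * vB[j]
            Ck = C[k] + beta * vC[k]
            err = X[i, j, k] - np.sum(Ai * Bj * Ck)
            # Gradients of the squared reconstruction error w.r.t. each factor
            # row, plus small noise (the perturbation step).
            gA = -err * Bj * Ck + sigma * rng.standard_normal(R)
            gB = -err * Ai * Ck + sigma * rng.standard_normal(R)
            gC = -err * Ai * Bj + sigma * rng.standard_normal(R)
            # Momentum (velocity) update followed by the parameter update.
            vA[i] = beta * vA[i] - lr * gA; A[i] += vA[i]
            vB[j] = beta * vB[j] - lr * gB; B[j] += vB[j]
            vC[k] = beta * vC[k] - lr * gC; C[k] += vC[k]
    return A, B, C

# Usage: recover rank-R factors of a random tensor and measure the fit.
rng = np.random.default_rng(1)
X = rng.random((8, 9, 10))
A, B, C = cp_sgd_nag(X)
X_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print("relative error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```

Updating only the sampled rows A[i], B[j], C[k] keeps each step O(R), which is what makes this style of solver attractive for online data streams.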