Partition-Insensitive Parallel ADMM Algorithm for High-dimensional Linear Models
Parallel alternating direction method of multipliers (ADMM) algorithms have gained popularity in statistics and machine learning for their efficient handling of large-sample problems. However, the parallel structure of these algorithms is built on the consensus formulation, which can require an excessive number of auxiliary variables for high-dimensional data. In this paper, we propose a partition-insensitive parallel framework based on the linearized ADMM (LADMM) algorithm and apply it to solve nonconvex penalized smooth quantile regression problems. Compared to existing parallel ADMM algorithms, our algorithm does not rely on the consensus formulation, which significantly reduces the number of variables that must be updated at each iteration. Notably, the solution produced by our algorithm is unchanged regardless of how the total sample is partitioned, a property known as partition insensitivity. Furthermore, under some mild assumptions, we prove that the iterative sequence generated by the parallel LADMM algorithm converges to a critical point of the nonconvex optimization problem. Numerical experiments on synthetic and real datasets demonstrate the feasibility and validity of the proposed algorithm.
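The partition-insensitive update pattern described above can be illustrated with a minimal sketch. This is not the paper's algorithm: it applies linearized ADMM to a lasso-penalized quantile regression (a convex stand-in for the nonconvex penalized smooth quantile loss studied in the paper), splitting the rows of the design matrix into blocks whose residual and dual updates run independently, while the coefficient update aggregates block-wise gradients exactly. All function names and parameter choices are illustrative assumptions.

```python
import numpy as np

def soft(z, t):
    # Soft-thresholding: proximal operator of t * ||.||_1.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_check(v, tau, mu):
    # Proximal operator of the quantile check loss rho_tau, scaled by 1/mu:
    # returns v shifted toward zero, clipped to the interval [(tau-1)/mu, tau/mu].
    return v - np.clip(v, (tau - 1.0) / mu, tau / mu)

def parallel_ladmm_qr(X_blocks, y_blocks, tau=0.5, lam=0.1, mu=1.0, iters=300):
    """Linearized ADMM sketch for min_beta sum_i rho_tau(y_i - x_i'beta) + lam*||beta||_1,
    with the sample split row-wise into blocks (hypothetical illustration)."""
    p = X_blocks[0].shape[1]
    # Linearization constant eta >= mu * ||X||_2^2, computed from the stacked
    # matrix so it does not depend on the partition (a distributed power
    # iteration could estimate it without stacking in practice).
    X_full = np.vstack(X_blocks)
    eta = 1.01 * mu * np.linalg.norm(X_full, 2) ** 2
    beta = np.zeros(p)
    r = [np.zeros(len(yb)) for yb in y_blocks]  # block-wise residual variables
    u = [np.zeros(len(yb)) for yb in y_blocks]  # block-wise scaled dual variables
    for _ in range(iters):
        # beta-update: block gradients are summed exactly, so the iterate is
        # identical for any row partition (partition insensitivity).
        g = sum(Xb.T @ (Xb @ beta + rb + ub - yb)
                for Xb, yb, rb, ub in zip(X_blocks, y_blocks, r, u))
        beta = soft(beta - mu * g / eta, lam / eta)
        # r- and u-updates are local to each block (embarrassingly parallel).
        for b, (Xb, yb) in enumerate(zip(X_blocks, y_blocks)):
            v = yb - Xb @ beta - u[b]
            r[b] = prox_check(v, tau, mu)
            u[b] = u[b] + Xb @ beta + r[b] - yb
    return beta
```

Because the coefficient update depends on the blocks only through an exact sum, running the sketch with one block or several blocks of the same rows yields the same iterates up to floating-point rounding, which mirrors the partition-insensitivity property claimed in the abstract.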