Two-Stage Stochastic Optimization via Primal-Dual Decomposition and Deep Unrolling
We consider a two-stage stochastic optimization problem in which a long-term optimization variable is coupled with a set of short-term optimization variables in both the objective and the constraint functions. Although two-stage stochastic optimization plays a critical role in various engineering and scientific applications, efficient algorithms are still lacking, especially when the long-term and short-term variables are coupled in the constraints. To overcome the challenge posed by tightly coupled stochastic constraints, we first establish a two-stage primal-dual decomposition (PDD) method that decomposes the two-stage problem into a long-term problem and a family of short-term subproblems. We then propose a PDD-based stochastic successive convex approximation (PDD-SSCA) algorithmic framework for finding KKT solutions of two-stage stochastic optimization problems. At each iteration, PDD-SSCA first runs a short-term sub-algorithm to find stationary points of the short-term subproblems associated with a mini-batch of state samples. It then constructs a convex surrogate for the long-term problem based on deep unrolling of the short-term sub-algorithm and backpropagation. Finally, the convex surrogate problem is solved to generate the next iterate. We establish the almost sure convergence of PDD-SSCA and customize the algorithmic framework to solve two important application problems. Simulations show that PDD-SSCA achieves superior performance over existing solutions.
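To make the iteration structure concrete, the following is a minimal sketch of a PDD-SSCA-style loop on a toy unconstrained problem. The objective f, the unrolled gradient-descent short-term solver, the quadratic surrogate form, and all step-size choices are illustrative assumptions rather than the paper's exact construction, and the primal-dual handling of coupled constraints is omitted for brevity. The sketch only illustrates the three per-iteration steps described above: solve the short-term subproblems on a mini-batch of states, differentiate through the unrolled short-term solver, and minimize a strongly convex surrogate to produce the next long-term iterate.

```python
# Sketch of a PDD-SSCA-style iteration on a toy problem (assumptions: quadratic
# coupled objective, unrolled gradient descent as the short-term sub-algorithm,
# a simple quadratic surrogate, and no coupled constraints / dual variables).
import jax
import jax.numpy as jnp

def f(theta, x, h):
    # Toy objective coupling the long-term variable theta with the
    # short-term variable x, parameterized by the random state h.
    return jnp.sum((x - h @ theta) ** 2) + 0.1 * jnp.sum(theta ** 2)

def unrolled_short_term(theta, h, steps=20, lr=0.1):
    # Deep-unrolled gradient descent on the short-term variable x for fixed theta.
    # Every step is a differentiable JAX op, so we can backpropagate through it.
    x = jnp.zeros(h.shape[0])
    grad_x = jax.grad(f, argnums=1)
    for _ in range(steps):
        x = x - lr * grad_x(theta, x, h)
    return f(theta, x, h)

def pdd_ssca(theta0, sample_state, num_iters=50, batch_size=8, tau=1.0, seed=0):
    # Stochastic successive convex approximation over the long-term variable:
    # average gradients obtained by backpropagating through the unrolled
    # short-term solver on a mini-batch of states, smooth them recursively,
    # and minimize a strongly convex quadratic surrogate in closed form.
    key = jax.random.PRNGKey(seed)
    theta = theta0
    g_bar = jnp.zeros_like(theta0)            # recursively smoothed gradient estimate
    grad_long = jax.grad(unrolled_short_term, argnums=0)
    for t in range(num_iters):
        key, sub = jax.random.split(key)
        states = sample_state(sub, batch_size)
        g = jnp.mean(jnp.stack([grad_long(theta, h) for h in states]), axis=0)
        rho = 1.0 / (t + 1) ** 0.6            # diminishing smoothing step size
        g_bar = (1 - rho) * g_bar + rho * g
        # Surrogate <g_bar, d> + (tau/2)||d||^2 is minimized at d = -g_bar / tau.
        theta_hat = theta - g_bar / tau
        gamma = 1.0 / (t + 1) ** 0.8          # diminishing update step size
        theta = (1 - gamma) * theta + gamma * theta_hat
    return theta

def sample_state(key, batch_size, dim=4):
    # Draw a mini-batch of random system states h (illustrative Gaussian model).
    return jax.random.normal(key, (batch_size, dim, dim))

theta_star = pdd_ssca(jnp.zeros(4), sample_state)
print(theta_star)
```

In an actual two-stage problem with coupled constraints, the short-term subproblems and the surrogate would also carry the dual (and penalty) terms introduced by the PDD step, which this toy sketch deliberately leaves out.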