Training Recurrent Neural Networks by Sequential Least Squares and the Alternating Direction Method of Multipliers
We propose using sequential least squares to determine the optimal parameters and hidden states of recurrent neural network models of nonlinear dynamical systems, trained from an input/output dataset under fairly general convex and twice-differentiable loss functions and regularization terms. In addition, to handle non-smooth regularization terms such as L1, L0, and group-Lasso regularizers, and to impose possibly non-convex constraints such as integer and mixed-integer constraints, we combine sequential least squares with the alternating direction method of multipliers (ADMM). The performance of the resulting algorithm, which we call NAILS (Nonconvex ADMM Iterations and Least Squares), is tested on a nonlinear system identification benchmark.
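To give a flavor of the alternation between a least-squares step and ADMM updates for a non-smooth regularizer, the following is a minimal illustrative sketch, not the paper's NAILS procedure for RNN training: it applies scaled ADMM to a simple L1-regularized least-squares (Lasso) problem, where the x-update is a ridge-regularized least-squares solve and the z-update is soft-thresholding. The function and parameter names (`admm_lasso`, `lam`, `rho`) are assumptions chosen for the example.

```python
# Illustrative sketch only: scaled ADMM for
#   min_x 0.5 * ||A x - b||^2 + lam * ||x||_1
# alternating a least-squares x-update, a soft-thresholding z-update,
# and a dual update. Not the NAILS algorithm from the paper.
import numpy as np

def admm_lasso(A, b, lam=0.1, rho=1.0, iters=100):
    n = A.shape[1]
    x = np.zeros(n)          # least-squares variable
    z = np.zeros(n)          # auxiliary variable carrying the L1 term
    u = np.zeros(n)          # scaled dual variable
    # Precompute the Cholesky factor of (A^T A + rho I) used in every x-update
    L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
    Atb = A.T @ b
    for _ in range(iters):
        # x-update: ridge-regularized least squares
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: proximal operator of (lam/rho)*||.||_1 (soft-thresholding)
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
        # dual update
        u = u + x - z
    return z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 20))
    x_true = np.zeros(20)
    x_true[:3] = [1.0, -2.0, 0.5]
    b = A @ x_true + 0.01 * rng.standard_normal(50)
    print(admm_lasso(A, b, lam=0.5)[:5])
```

In the paper's setting, the least-squares step would instead solve for network parameters and hidden states of the recurrent model, but the alternation pattern shown here is the same basic ADMM template.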