Distributed gradient-based optimization in the presence of dependent aperiodic communication
Iterative distributed optimization algorithms involve multiple agents that communicate with each other, over time, in order to minimize or maximize a global objective. In the presence of unreliable communication networks, the Age-of-Information (AoI), which measures the freshness of the data received, may be large and hence hinder algorithmic convergence. In this paper, we study the convergence of general distributed gradient-based optimization algorithms under communication that happens neither periodically nor at stochastically independent points in time. We show that convergence is guaranteed provided the random variables associated with the AoI processes are stochastically dominated by a random variable with finite first moment. This improves on previous work, which required boundedness of moments higher than the first. We then introduce stochastically strongly connected (SSC) networks, a new stochastic form of strong connectedness for time-varying networks. We show that, for any p ≥ 0, if the processes that describe the success of communication between agents in an SSC network are α-mixing with n^{p-1} α(n) summable, then the associated AoI processes are stochastically dominated by a random variable with finite p-th moment. Combined with our first contribution, this implies that distributed stochastic gradient descent converges in the presence of AoI if α(n) is summable.
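For concreteness, the two conditions can be sketched in LaTeX as follows. The notation is assumed here for illustration and may differ from the paper's own symbols: τ_i(t) denotes the AoI of the information held by agent i at time t, T is the dominating random variable, and α(n) are the mixing coefficients of the communication-success processes.

% Assumed notation, for illustration only: \tau_i(t) is the AoI at agent i and
% time t, T the dominating random variable, \alpha(n) the mixing coefficients.
\begin{align*}
  % (1) Stochastic dominance of the AoI processes, with finite first moment:
  \Pr\bigl(\tau_i(t) > s\bigr) &\le \Pr(T > s)
    \quad \text{for all } i, t, s, \qquad \mathbb{E}[T] < \infty. \\
  % (2) Mixing condition (for a given p >= 0) yielding a finite p-th moment of T:
  \sum_{n=1}^{\infty} n^{p-1}\, \alpha(n) &< \infty
    \;\Longrightarrow\; \mathbb{E}\bigl[T^{p}\bigr] < \infty.
\end{align*}

For p = 1, the second condition reduces to summability of α(n), which is how the convergence guarantee for distributed stochastic gradient descent stated above follows from the two contributions together.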