Fundamental limits of over-the-air optimization: Are analog schemes optimal?
We consider over-the-air convex optimization on a d-dimensional space where coded gradients are sent over an additive Gaussian noise channel with variance σ^2. The codewords satisfy an average power constraint P, resulting in a signal-to-noise ratio (SNR) of P/σ^2. We derive bounds on the convergence rate for over-the-air optimization. Our first result is a lower bound showing that any code must slow down the convergence rate by a factor of roughly √(d/log(1+SNR)). Next, we consider a popular class of schemes called analog coding, where a linear function of the gradient is sent. We show that a simple scaled-transmission analog coding scheme slows down the convergence rate by a factor of √(d(1+1/SNR)). This matches the lower bound above up to constant factors at low SNR, making the scaled-transmission scheme optimal in that regime. However, we show that a slowdown of this form is unavoidable for any analog coding scheme: in particular, a √d factor slowdown in convergence persists even as the SNR tends to infinity. Remarkably, we present a simple quantize-and-modulate scheme that uses Amplitude Shift Keying and almost attains the optimal convergence rate at all SNRs.
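To make the setup concrete, the following is a minimal sketch of the scaled-transmission analog idea on a toy quadratic objective: the sender scales the gradient so the transmitted vector meets the average power constraint P, the channel adds Gaussian noise of per-coordinate variance σ^2, and the receiver undoes the scaling before taking a gradient step. All specifics here (the objective, step size, and parameter values) are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
d, P, sigma2 = 10, 1.0, 0.5          # dimension, power budget, noise variance
x = rng.standard_normal(d)           # iterate for the toy objective f(x) = 0.5 * ||x||^2

def grad(x):
    return x                         # gradient of the toy quadratic

for t in range(1, 2001):
    g = grad(x)
    # Scale the gradient so the transmitted vector s satisfies the
    # average power constraint ||s||^2 / d <= P.
    scale = np.sqrt(d * P) / max(np.linalg.norm(g), 1e-12)
    s = scale * g
    y = s + rng.normal(0.0, np.sqrt(sigma2), size=d)   # AWGN channel
    g_hat = y / scale                # receiver rescales to estimate the gradient
    x = x - (1.0 / t) * g_hat        # SGD step with a decaying step size

print(np.linalg.norm(x))             # distance to the minimizer (0); should be small
```

Note that the noise seen by the optimizer is amplified by 1/scale, so small gradients (which force a large scale-up to fill the power budget, hence a large scale-down at the receiver) are received relatively cleanly, while large gradients incur more effective noise; this coupling between gradient norm and effective noise is what drives the √(d(1+1/SNR)) slowdown discussed above.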