Upper and Lower Bounds on the Smoothed Complexity of the Simplex Method
The simplex method for linear programming is known to be highly efficient in practice, and understanding its performance from a theoretical perspective is an active research topic. The framework of smoothed analysis, first introduced by Spielman and Teng (JACM '04) for this purpose, defines the smoothed complexity of solving a linear program with d variables and n constraints as the expected running time when Gaussian noise of variance σ^2 is added to the LP data. We prove that the smoothed complexity of the simplex method is O(σ^-3/2 d^13/4 log^7/4 n), improving the dependence on 1/σ compared to the previous bound of O(σ^-2 d^2 √(log n)). We accomplish this through a new analysis of the shadow bound, which was also key to earlier analyses. Illustrating the power of our new method, we use it to prove a nearly tight upper bound on the smoothed complexity of two-dimensional polygons. We also establish the first non-trivial lower bound on the smoothed complexity of the simplex method, proving that the shadow vertex simplex method requires at least Ω(min(σ^-1/2 d^-1/2 log^-1/4 d, 2^d)) pivot steps with high probability. A key part of our analysis is a new variation on the extended formulation for the regular 2^k-gon. We conclude with a numerical experiment suggesting that this analysis could be further improved.
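As a rough illustration of the improved dependence on 1/σ, the minimal sketch below evaluates the two asymptotic expressions numerically with the constants hidden by the O-notation dropped; the parameter values for d, n, and σ are arbitrary example choices, and the comparison is only indicative of growth rates, not of actual running times.

# Indicative comparison of the two smoothed-complexity bounds as sigma shrinks.
# The hidden O(.) constants are dropped, so only the growth rates are meaningful.
import math

def new_bound(sigma, d, n):
    # O(sigma^-3/2 * d^13/4 * log^7/4 n), constants suppressed
    return sigma ** (-1.5) * d ** (13 / 4) * math.log(n) ** (7 / 4)

def previous_bound(sigma, d, n):
    # O(sigma^-2 * d^2 * sqrt(log n)), constants suppressed
    return sigma ** (-2) * d ** 2 * math.sqrt(math.log(n))

d, n = 20, 10 ** 6  # arbitrary example dimensions
for sigma in (1e-1, 1e-2, 1e-3):
    print(f"sigma={sigma:g}: new ~ {new_bound(sigma, d, n):.3g}, "
          f"previous ~ {previous_bound(sigma, d, n):.3g}")

Since the new bound scales as σ^-3/2 rather than σ^-2, its advantage grows as σ → 0, which the printed values make visible for the example parameters above.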