Hyperbolic Polynomials I: Concentration and Discrepancy
The Chernoff bound is a fundamental tool in theoretical computer science, used extensively in the design of randomized algorithms and in stochastic analysis. Discrepancy theory, which deals with finding a bi-coloring of a set system such that the coloring of each set is balanced, has many applications in the design of approximation algorithms. A classical theorem of Spencer [Spe85] shows that any set system with n sets over n elements has discrepancy O(√n), whereas the Chernoff bound [Che52] only gives O(√(n log n)). The study of hyperbolic polynomials dates back to the early 20th century, with Gårding [Går59] using them to solve PDEs. In recent years, further applications have been found in control theory, optimization, real algebraic geometry, and elsewhere. In particular, the breakthrough result of Marcus, Spielman, and Srivastava [MSS15] uses the theory of hyperbolic polynomials to prove the Kadison-Singer conjecture [KS59], which is closely related to discrepancy theory. In this paper, we present two new results for hyperbolic polynomials:

∙ We show an optimal hyperbolic Chernoff bound for hyperbolic polynomials of any constant degree.
∙ We prove a hyperbolic Spencer theorem for vectors of any constant rank.

The classical matrix Chernoff and discrepancy results are based on the determinant polynomial, which is a special case of a hyperbolic polynomial. To the best of our knowledge, this paper is the first work that shows either concentration or discrepancy results for hyperbolic polynomials.
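For context, the standard definition behind the determinant claim above (the notation here is ours, not taken from the paper): a homogeneous polynomial p on R^n is hyperbolic with respect to a direction e if p(e) ≠ 0 and every real line in direction e meets the zero set of p only at real points. A minimal LaTeX sketch of the definition and of the canonical determinant example:

```latex
% Standard definition (notation ours): a homogeneous polynomial p is
% hyperbolic with respect to a direction e if
\[
  p(e) \neq 0
  \quad\text{and}\quad
  t \mapsto p(x - t e) \ \text{has only real roots for every } x \in \mathbb{R}^n .
\]
% Canonical example: the determinant on symmetric matrices is hyperbolic
% in the direction of the identity matrix, since
\[
  \det(X - t I) \;=\; \prod_{i=1}^{n} \bigl(\lambda_i(X) - t\bigr),
\]
% whose roots are the (real) eigenvalues of the symmetric matrix X.
% This is why matrix Chernoff and matrix discrepancy results are the
% determinant special case of the hyperbolic setting.
```

To make the quantitative gap between the two classical bounds concrete, here is a minimal numerical sketch (ours, not from the paper): for a uniformly random coloring of a random set system, Chernoff [Che52] plus a union bound yields discrepancy O(√(n log n)) with high probability, while Spencer [Spe85] guarantees the existence of a coloring achieving O(√n).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024  # n sets over n elements

# Random set system: element i belongs to set j with probability 1/2.
A = rng.integers(0, 2, size=(n, n))

# Uniformly random bi-coloring x in {-1, +1}^n.  Chernoff plus a union
# bound over the n sets gives max_j |sum_{i in S_j} x_i| = O(sqrt(n log n))
# with high probability; Spencer guarantees some coloring with O(sqrt(n)).
x = rng.choice([-1, 1], size=n)
disc = np.abs(A @ x).max()

print(f"random-coloring discrepancy: {disc}")
print(f"sqrt(n log n) = {np.sqrt(n * np.log(n)):.1f}")
print(f"sqrt(n)       = {np.sqrt(n):.1f}")
```

Running this typically prints a discrepancy close to the √(n log n) scale, illustrating that the random coloring does not reach Spencer's √n guarantee.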