Near-Optimal Statistical Query Lower Bounds for Agnostically Learning Intersections of Halfspaces with Gaussian Marginals
We consider the well-studied problem of learning intersections of halfspaces under the Gaussian distribution in the challenging agnostic learning model. Recent work of Diakonikolas et al. (2021) shows that any Statistical Query (SQ) algorithm for agnostically learning the class of intersections of k halfspaces over ℝ^n to constant excess error must either make queries of tolerance at most n^{-Ω̃(√(log k))} or make 2^{n^{Ω(1)}} queries. We strengthen this result by improving the tolerance requirement to n^{-Ω̃(log k)}. This lower bound is essentially best possible, since an SQ algorithm of Klivans et al. (2008) agnostically learns this class to any constant excess error using n^{O(log k)} queries of tolerance n^{-O(log k)}. We prove two variants of our lower bound, each of which combines ingredients from Diakonikolas et al. (2021) with (an extension of) a different earlier approach for agnostic SQ lower bounds in the Boolean setting due to Dachman-Soled et al. (2014). Our approach also yields lower bounds for agnostic SQ learning of the class of "convex subspace juntas" (studied by Vempala, 2010) and the class of sets with bounded Gaussian surface area; all of these lower bounds are nearly optimal, since they essentially match known upper bounds from Klivans et al. (2008).
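For readers unfamiliar with the Statistical Query model referenced above: an SQ learner never sees individual labeled examples; it submits bounded queries and receives each query's expectation up to an adversarially chosen error within the stated tolerance. The following Python sketch is purely illustrative of this generic oracle interaction (it is not the paper's construction, and all names here are hypothetical); the toy target is a single halfspace sign(x) under a standard Gaussian marginal:

```python
import random

def sq_oracle(phi, dist_sampler, tau, n_samples=100_000, rng=None):
    """Simulate an SQ oracle: given a query phi mapping a labeled example
    (x, y) into [-1, 1], return E[phi(x, y)] up to an adversarial
    perturbation of magnitude at most tau (the query's tolerance)."""
    rng = rng or random.Random(0)
    # Approximate the true expectation by an empirical average.
    est = sum(phi(*dist_sampler(rng)) for _ in range(n_samples)) / n_samples
    # The oracle may shift its answer arbitrarily within the tolerance.
    return est + rng.uniform(-tau, tau)

def sampler(rng):
    """Toy labeled distribution: x ~ N(0, 1), label y = sign(x)."""
    x = rng.gauss(0.0, 1.0)
    return x, (1.0 if x >= 0 else -1.0)

# Query the correlation E[y * sign(x - 0.5)] with tolerance 0.01.
answer = sq_oracle(lambda x, y: y * (1.0 if x >= 0.5 else -1.0),
                   sampler, tau=0.01)
```

The lower bounds in the abstract say, informally, that for intersections of k halfspaces every successful SQ algorithm must either use an extremely small tolerance tau (at most n^{-Ω̃(log k)}) or issue exponentially many such queries.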