Approximate Vanishing Ideal Computations at Scale

07/04/2022
by Elias Wirth, et al.

The approximate vanishing ideal of a set of points X = {𝐱_1, …, 𝐱_m} ⊆ [0,1]^n is the set of polynomials that approximately evaluate to 0 over all points 𝐱 ∈ X and admits an efficient representation by a finite set of polynomials called generators. Algorithms that construct this set of generators are extensively studied but ultimately find little practical application because their computational complexities are thought to be superlinear in the number of samples m. In this paper, we focus on scaling up the Oracle Approximate Vanishing Ideal algorithm (OAVI), one of the most powerful of these methods. We prove that the computational complexity of OAVI is not superlinear but linear in the number of samples m and polynomial in the number of features n, making OAVI an attractive preprocessing technique for large-scale machine learning. To further accelerate OAVI's training time, we propose two changes. First, as the name suggests, OAVI makes repeated oracle calls to convex solvers throughout its execution. By replacing the Pairwise Conditional Gradients algorithm, one of the standard solvers used in OAVI, with the faster Blended Pairwise Conditional Gradients algorithm, we illustrate how OAVI directly benefits from advancements in the study of convex solvers. Second, we propose Inverse Hessian Boosting (IHB): IHB exploits the fact that OAVI repeatedly solves quadratic convex optimization problems that differ only slightly and whose solutions can be written in closed form using inverse Hessian information. By efficiently updating the inverse of the Hessian matrix, IHB solves these convex optimization problems almost instantly, accelerating OAVI's training time by up to multiple orders of magnitude. We complement our theoretical analysis with extensive numerical experiments on data sets with millions of samples.
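The kind of closed-form update that IHB relies on can be sketched with a small numerical example. The snippet below is our own minimal illustration, not the paper's implementation: when one new column is appended to the design matrix of a least-squares subproblem, the inverse of the Hessian A^T A can be updated via the block-inversion (Schur complement) formula in O(k^2) for k existing columns, rather than re-inverted from scratch in O(k^3). The function name `update_inverse_hessian` and the use of NumPy are assumptions made purely for illustration.

```python
import numpy as np

def update_inverse_hessian(H_inv, A, a_new):
    """Update (A^T A)^{-1} after appending column a_new to A.

    Uses the block-inversion (Schur complement) formula, so the cost is
    O(k^2) for k existing columns instead of an O(k^3) re-inversion.
    Assumes A^T A and the enlarged Hessian are positive definite.
    """
    u = A.T @ a_new              # cross terms with the existing columns
    d = a_new @ a_new            # new diagonal entry of the Hessian
    H_inv_u = H_inv @ u
    s = d - u @ H_inv_u          # Schur complement, positive by assumption
    top_left = H_inv + np.outer(H_inv_u, H_inv_u) / s
    top_right = -H_inv_u[:, None] / s
    return np.block([
        [top_left,    top_right],
        [top_right.T, np.array([[1.0 / s]])],
    ])

# Sanity check against a full re-inversion on random data.
rng = np.random.default_rng(0)
A = rng.standard_normal((1_000, 5))
a_new = rng.standard_normal(1_000)
H_inv = np.linalg.inv(A.T @ A)
A_ext = np.column_stack([A, a_new])
assert np.allclose(update_inverse_hessian(H_inv, A, a_new),
                   np.linalg.inv(A_ext.T @ A_ext))
```

Because OAVI's successive quadratic subproblems differ only by such small changes to the design matrix, maintaining the inverse Hessian incrementally in this fashion avoids solving each subproblem from scratch.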


