Minimizing Quadratic Functions in Constant Time

08/25/2016
by Kohei Hayashi, et al.

A sampling-based optimization method for quadratic functions is proposed. Our method approximately solves the following n-dimensional quadratic minimization problem in constant time, which is independent of n: z^* = min_{v ∈ R^n} 〈v, Av〉 + n〈v, diag(d)v〉 + n〈b, v〉, where A ∈ R^{n × n} is a matrix and d, b ∈ R^n are vectors. Our theoretical analysis specifies the number of samples k(δ, ϵ) such that the approximated solution z satisfies |z - z^*| = O(ϵ n^2) with probability 1-δ. The empirical performance (accuracy and runtime) is positively confirmed by numerical experiments.

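As a rough illustration of the sampling idea described in the abstract, here is a minimal sketch in Python/NumPy. The function name, the strict-convexity assumption on the sampled sub-problem, and the (n/k)^2 rescaling of its optimal value are our own choices for illustration, not necessarily the paper's exact estimator.

```python
import numpy as np

def estimate_min_quadratic(A, d, b, k, seed=None):
    """Sketch of a constant-time-style estimate of
        z* = min_v <v, A v> + n <v, diag(d) v> + n <b, v>
    obtained by solving the problem induced on k uniformly sampled coordinates.
    Assumes the sampled quadratic part is positive definite.
    """
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    S = rng.choice(n, size=k, replace=False)       # sampled coordinates

    # Sub-problem with k playing the role of n: minimize w^T M w + c^T w.
    M = A[np.ix_(S, S)] + k * np.diag(d[S])
    M = (M + M.T) / 2.0                            # symmetrize the quadratic part
    c = k * b[S]

    # For positive definite M, min_w w^T M w + c^T w = -c^T M^{-1} c / 4.
    z_sub = -c @ np.linalg.solve(M, c) / 4.0

    # Rescale the sub-problem value back to the n-dimensional scale
    # (an assumed rescaling, consistent with the O(eps * n^2) error bound).
    return (n / k) ** 2 * z_sub
```

The cost of this sketch depends only on k (building and solving the k × k sub-problem), which is what makes an estimator of this kind constant time with respect to n.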

Related research

- A Limitation of V-Matrix based Methods (08/27/2018): To estimate the conditional probability functions based on the direct pr...
- Minimizing convex quadratic with variable precision Krylov methods (07/17/2018): Iterative algorithms for the solution of convex quadratic optimization p...
- Computing the Quadratic Numerical Range (05/25/2023): A novel algorithm for the computation of the quadratic numerical range i...
- A Class of Algorithms for Quadratic Minimization (02/17/2023): Certain problems in quadratic minimization can be reduced to finding the...
- Quadratic Payments with constrained probabilities (04/26/2021): Dealing with quadratic payments, marginal probability is usually conside...
- Structural Risk Minimization for C^1,1(R^d) Regression (03/29/2018): One means of fitting functions to high-dimensional data is by providing ...
- Towards the Application of Linear Programming Methods For Multi-Camera Pose Estimation (12/08/2015): We presented a separation based optimization algorithm which, rather tha...
