Coresets For Monotonic Functions with Applications to Deep Learning

02/21/2018
by   Elad Tolochinsky, et al.

A coreset (or core-set) in this paper is a small weighted subset Q of the input set P, with respect to a given monotonic function f: R→R, that provably approximates the fitting loss ∑_{p∈P} f(p·x) for any given x∈R^d. Using Q, we can obtain an approximation of x^* that minimizes this loss by running existing optimization algorithms on Q. We provide: (i) a lower bound proving that there are sets with no coreset smaller than n = |P|; (ii) a proof that a small coreset of size near-logarithmic in n exists for any input P, under a natural assumption that holds, e.g., for logistic regression and the sigmoid activation function; (iii) a generic algorithm that computes Q in O(nd + n log n) expected time; (iv) a novel technique for improving existing deep networks using such coresets; (v) extensive experimental results with open code.
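The following is a minimal sketch, not the authors' implementation, of the quantity the abstract describes: the fitting loss ∑_{p∈P} f(p·x) over the full set P versus its weighted approximation on a coreset Q. The choice of f as the sigmoid and the uniform sampling used to pick Q are illustrative assumptions only; the paper's construction of Q is more careful than uniform sampling.

```python
import numpy as np

def fitting_loss(points, x, f, weights=None):
    """Return sum_i w_i * f(p_i . x); weights default to 1 for the full set P."""
    vals = f(points @ x)
    if weights is None:
        return vals.sum()
    return (weights * vals).sum()

# Sigmoid activation as an example of a monotonic f: R -> R (an assumption here).
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n, d, m = 10_000, 5, 200              # |P|, dimension, coreset size (illustrative values)
P = rng.normal(size=(n, d))           # synthetic input set P
x = rng.normal(size=d)                # an arbitrary query point x in R^d

# Uniform sampling stands in for the paper's coreset construction;
# each sampled point is reweighted by n/m so the sums are comparable.
idx = rng.choice(n, size=m, replace=False)
Q, w = P[idx], np.full(m, n / m)

print("loss on P:      ", fitting_loss(P, x, sigmoid))
print("approx. on Q:   ", fitting_loss(Q, x, sigmoid, w))
```

A coreset in the paper's sense guarantees this approximation uniformly over every x∈R^d, which is what lets one optimize over Q instead of P.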

