
Optimal Sub-sampling with Influence Functions

by Daniel Ting, et al.

Sub-sampling is a common and often effective method for dealing with the computational challenges of large datasets. However, for most statistical models there is no well-motivated approach for drawing a non-uniform subsample. We show that the concept of an asymptotically linear estimator, together with its associated influence function, leads to optimal sampling procedures for a wide class of popular models. Furthermore, for linear regression models, which already have well-studied procedures for non-uniform sub-sampling, we show that our optimal influence-function-based method outperforms previous approaches. We demonstrate the improved performance of our method empirically on real datasets.
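To make the idea concrete, the sketch below illustrates influence-based subsampling for ordinary least squares. The influence function of the OLS estimator at observation i is (X'X/n)^{-1} x_i e_i, so one natural scheme samples points with probability proportional to the norm of their influence and refits with inverse-probability weights. This is a minimal illustration under assumptions of my own (synthetic data, residuals taken from a full-data fit); the paper's actual procedure, including how it avoids needing a full-data fit, may differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear-regression data (illustrative, not from the paper).
n, d = 10_000, 5
X = rng.normal(size=(n, d))
beta = rng.normal(size=d)
y = X @ beta + rng.normal(size=n)

# Full-sample OLS fit, used here only to obtain residuals for the sketch.
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_full

# Influence function of the OLS estimator: psi_i = (X'X/n)^{-1} x_i * e_i.
# Sampling probabilities proportional to the influence norm ||psi_i||.
Sigma_inv = np.linalg.inv(X.T @ X / n)
infl = resid[:, None] * (X @ Sigma_inv.T)
scores = np.linalg.norm(infl, axis=1)
probs = scores / scores.sum()

# Draw a weighted subsample and refit with inverse-probability weights,
# implemented as a square-root-weighted least-squares problem.
m = 500
idx = rng.choice(n, size=m, replace=True, p=probs)
w = 1.0 / (m * probs[idx])
sw = np.sqrt(w)
beta_sub, *_ = np.linalg.lstsq(X[idx] * sw[:, None], y[idx] * sw, rcond=None)

print("max coefficient deviation:", np.max(np.abs(beta_sub - beta_full)))
```

With only 5% of the data, the reweighted subsample estimate typically lands close to the full-data fit; high-influence points (large residual and/or high leverage) are sampled preferentially, which is the intuition the abstract formalizes.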



