A Look at the Effect of Sample Design on Generalization through the Lens of Spectral Analysis

by Bhavya Kailkhura, et al.

This paper provides a general framework for studying the effect of the sampling properties of training data on the generalization error of learned machine learning (ML) models. Specifically, we propose a new spectral analysis of the generalization error, expressed in terms of the power spectra of the sampling pattern and of the function being learned. The framework is built in Euclidean space using Fourier analysis and establishes a connection between certain high-dimensional geometric objects and the optimal spectral form of different state-of-the-art sampling patterns. Subsequently, we estimate the expected error bounds and convergence rates of different state-of-the-art sampling patterns as the number of samples and the dimension increase. We make several observations about the generalization error that hold irrespective of the approximation scheme (or learning architecture) and the training (or optimization) algorithm. Our results also shed light on how to formulate design principles for constructing optimal sampling methods for particular problems.
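To make the central quantity concrete, the sketch below estimates the power spectrum of a sampling pattern using the standard periodogram estimator P(k) = |S(k)|² / N with S(k) = Σⱼ exp(-2πi ⟨k, xⱼ⟩), which is the usual definition in the sampling literature; this is an illustrative NumPy sketch, not the paper's exact formulation, and the function and variable names are our own. It compares i.i.d. random samples (flat, white-noise spectrum) against jittered (stratified) samples, whose suppressed low-frequency power is the spectral property that typically improves error.

```python
import numpy as np

def power_spectrum(points, freqs):
    """Periodogram P(k) = |S(k)|^2 / N of a point set,
    where S(k) = sum_j exp(-2*pi*i <k, x_j>)."""
    n = len(points)
    phases = np.exp(-2j * np.pi * freqs @ points.T)  # shape (num_freqs, n)
    s = phases.sum(axis=1)
    return np.abs(s) ** 2 / n

rng = np.random.default_rng(0)
n = 1024

# i.i.d. uniform samples in the unit square
random_pts = rng.random((n, 2))

# jittered (stratified) samples: one uniform point per cell of a 32x32 grid
g = 32
cells = np.stack(np.meshgrid(np.arange(g), np.arange(g)), -1).reshape(-1, 2)
jittered_pts = (cells + rng.random((g * g, 2))) / g

# integer frequency lattice, excluding the DC term k = 0
ks = np.array([[kx, ky] for kx in range(-8, 9) for ky in range(-8, 9)
               if (kx, ky) != (0, 0)], dtype=float)

p_rand = power_spectrum(random_pts, ks)
p_jit = power_spectrum(jittered_pts, ks)

# Low-frequency power: roughly 1 (white noise) for random sampling,
# strongly suppressed for jittered sampling.
low = np.abs(ks).max(axis=1) <= 2
print("random:", p_rand[low].mean(), "jittered:", p_jit[low].mean())
```

Plugging such spectra into a spectral error expression is how a framework of this kind ranks sampling patterns: the less power a pattern places at the low frequencies where the target function's energy concentrates, the smaller the expected error.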
