A Generalized Weighted Optimization Method for Computational Learning and Inversion

01/23/2022
by   Kui Ren, et al.

The generalization capacity of machine learning models behaves differently in the under- and over-parameterized regimes. In this paper, we focus on regression models, such as feature regression and kernel regression, and analyze a generalized weighted least-squares optimization method for computational learning and inversion with noisy data. The highlight of the proposed framework is that it allows weighting in both the parameter space and the data space. The weighting scheme encodes both a priori knowledge of the object to be learned and a strategy for weighting the contribution of different data points in the loss function. We characterize the impact of the weighting scheme on the generalization error of the learning method, deriving explicit generalization errors for the random Fourier feature model in both the under- and over-parameterized regimes. For more general feature maps, error bounds are provided in terms of the singular values of the feature matrix. We demonstrate that appropriate weighting informed by prior knowledge can improve the generalization capability of the learned model.
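To make the setting concrete, here is a minimal sketch of weighted least-squares regression with weighting in both the data space and the parameter space. The Fourier feature map, the particular diagonal weight matrices W_d and W_p, and the regularized normal-equations solve are all illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative random Fourier-type feature map (frequencies are an assumption).
n, p = 50, 20                                # under-parameterized: n > p
x = rng.uniform(-1.0, 1.0, size=n)
freqs = np.arange(1, p + 1)
A = np.cos(np.outer(x, freqs))               # feature matrix, shape (n, p)
y = np.sin(2.0 * x) + 0.01 * rng.standard_normal(n)   # noisy data

# Weighting in data space (W_d) and parameter space (W_p). The power-law
# diagonal of W_p stands in for a priori smoothness knowledge: it favors
# low-frequency components of the solution.
W_d = np.diag(1.0 / (1.0 + np.abs(x)))       # weight the data-misfit terms
W_p = np.diag(freqs ** -2.0)                 # encode prior on the parameters

# Generalized weighted least-squares:
#   minimize ||W_d (A theta - y)||^2 + theta^T W_p^{-1} theta
# solved here via its regularized normal equations.
lhs = A.T @ W_d.T @ W_d @ A + np.linalg.inv(W_p)
rhs = A.T @ W_d.T @ W_d @ y
theta = np.linalg.solve(lhs, rhs)
```

In the over-parameterized regime (p > n) the analogous object would be a weighted minimal-norm interpolant; the normal-equations form above is the simpler under-parameterized case.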


Related research

- 03/28/2021, Understanding the role of importance weighting for deep learning: "The recent paper by Byrd & Lipton (2019), based on empirical observati..."
- 05/09/2016, Mean Absolute Percentage Error for regression models: "We study in this paper the consequences of using the Mean Absolute Perce..."
- 05/01/2020, Generalization Error of Generalized Linear Models in High Dimensions: "At the heart of machine learning lies the question of generalizability o..."
- 04/17/2023, Analysis of Interpolating Regression Models and the Double Descent Phenomenon: "A regression model with more parameters than data points in the training..."
- 02/22/2023, A Generalized Weighted Loss for SVC and MLP: "Usually standard algorithms employ a loss where each error is the mere a..."
- 06/15/2020, Weighted Optimization: better generalization by smoother interpolation: "We provide a rigorous analysis of how implicit bias towards smooth inter..."
- 08/14/2023, Locally Adaptive and Differentiable Regression: "Over-parameterized models like deep nets and random forests have become ..."
