Bias Correction for Regularized Regression and its Application in Learning with Streaming Data

03/15/2016
by Qiang Wu, et al.

We propose an approach to reduce the bias of ridge regression and the regularization kernel network. When applied to a single data set, the new algorithms achieve learning performance comparable to the original ones. When applied to incremental learning with blockwise streaming data, the new algorithms are more efficient because of the bias reduction. Both theoretical characterizations and simulation studies are used to verify the effectiveness of these new algorithms.
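The sketch below illustrates the general idea in a minimal form: a ridge estimator updated from blockwise streaming data via its sufficient statistics, together with one common plug-in bias correction (adding back an estimate of the ridge shrinkage bias). The class name, the choice of correction, and all parameters here are illustrative assumptions, not necessarily the exact estimator proposed in the paper.

```python
# Hedged sketch (assumed form): bias-corrected ridge regression on blockwise
# streaming data, maintained through the running sufficient statistics X'X, X'y.
import numpy as np

class StreamingBiasCorrectedRidge:
    def __init__(self, n_features, lam=1.0):
        self.lam = lam                                   # regularization parameter
        self.XtX = np.zeros((n_features, n_features))    # running X'X
        self.Xty = np.zeros(n_features)                  # running X'y

    def partial_fit(self, X_block, y_block):
        """Absorb one block of streaming data into the sufficient statistics."""
        self.XtX += X_block.T @ X_block
        self.Xty += X_block.T @ y_block
        return self

    def coef_(self):
        """Return the ridge estimate and a plug-in bias-corrected version.

        The ridge bias is -lam * (X'X + lam*I)^{-1} * beta, so adding
        lam * (X'X + lam*I)^{-1} * beta_ridge back reduces that bias.
        """
        A = self.XtX + self.lam * np.eye(self.XtX.shape[0])
        beta_ridge = np.linalg.solve(A, self.Xty)
        beta_bc = beta_ridge + self.lam * np.linalg.solve(A, beta_ridge)
        return beta_ridge, beta_bc

# Usage: feed synthetic data block by block, then read out both estimators.
rng = np.random.default_rng(0)
beta_true = np.array([2.0, -1.0, 0.5])
model = StreamingBiasCorrectedRidge(n_features=3, lam=5.0)
for _ in range(10):                                      # 10 streaming blocks
    X = rng.standard_normal((50, 3))
    y = X @ beta_true + 0.1 * rng.standard_normal(50)
    model.partial_fit(X, y)
print(model.coef_())
```

Because only X'X and X'y are stored, each new block is absorbed in constant memory and the (bias-corrected) estimate can be recomputed at any time without revisiting earlier blocks, which is the efficiency argument for the streaming setting.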


Related research:

08/07/2017 - Learning Theory of Distributed Regression with Bias Corrected Regularization Kernel Network
Distributed learning is an effective way to analyze big data. In distrib...

06/30/2021 - Real-Time Regression Analysis of Streaming Clustered Data With Possible Abnormal Data Batches
This paper develops an incremental learning algorithm based on quadratic...

08/01/2016 - Efficient Multiple Incremental Computation for Kernel Ridge Regression with Bayesian Uncertainty Modeling
This study presents an efficient incremental/decremental approach for bi...

11/05/2020 - Accurate inference in negative binomial regression
Negative binomial regression is commonly employed to analyze overdispers...

03/30/2022 - Remember to correct the bias when using deep learning for regression!
When training deep learning models for least-squares regression, we cann...

02/22/2021 - Debiased Kernel Methods
I propose a practical procedure based on bias correction and sample spli...

08/29/2023 - Streaming Compression of Scientific Data via weak-SINDy
In this paper a streaming weak-SINDy algorithm is developed specifically...
