A Note On Nonlinear Regression Under L2 Loss

03/30/2023
by Kaan Gokcesu, et al.

We investigate the nonlinear regression problem under L2 loss (square loss) functions. Traditional nonlinear regression models often result in non-convex optimization problems with respect to the parameter set. We show that a convex nonlinear regression model exists for the traditional least squares problem, which is promising for designing more complex systems with easier-to-train models.
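As a point of contrast, one standard way to obtain a regression model that is nonlinear in the input yet convex in its parameters under L2 loss is a fixed basis expansion: the model f(x) = Phi(x)·w is linear in w, so the squared loss is a convex quadratic with a global optimum. The sketch below illustrates this baseline idea; the basis functions and data here are illustrative assumptions, not the specific convex construction proposed in the paper.

```python
import numpy as np

# Illustrative data: a nonlinear target with additive noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
y = np.sin(2.0 * x) + 0.1 * rng.standard_normal(200)

# Fixed nonlinear features (assumed basis for illustration):
# f(x) = Phi(x) @ w is nonlinear in x but linear in w,
# so the L2 loss ||y - Phi w||^2 is convex in w.
Phi = np.column_stack([np.ones_like(x), x, x**2, x**3, np.sin(2.0 * x)])

# The convex problem is solved globally by ordinary least squares.
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
residual = y - Phi @ w
mse = float(np.mean(residual**2))
```

Because the loss is a convex quadratic in w, any stationary point is the global minimum, which is what makes such models easy to train compared with models that are nonlinear in their parameters.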


