Symbolic Regression with Fast Function Extraction and Nonlinear Least Squares Optimization

09/20/2022
by Lukas Kammerer, et al.

Fast Function Extraction (FFX) is a deterministic algorithm for solving symbolic regression problems. We improve the accuracy of FFX by adding parameters to the arguments of nonlinear functions. Instead of optimizing only the linear parameters, we optimize these additional nonlinear parameters with separable nonlinear least squares optimization using a variable projection algorithm. Both FFX and our new algorithm are applied to the PennML benchmark suite. We show that the proposed extensions of FFX lead to higher accuracy while providing models of similar length and with only a small increase in runtime on the given data. Our results are compared to a large set of regression methods that have already been published for the given benchmark suite.
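The sketch below illustrates the variable projection idea mentioned in the abstract, not the authors' implementation: for a model whose basis functions carry an extra nonlinear argument parameter (here a hypothetical single term c0 + c1 * log(x + theta)), the linear coefficients are projected out with ordinary least squares inside the residual function, and only the nonlinear parameter theta is handled by the outer nonlinear solver. The model form, parameter names, and data are illustrative assumptions.

```python
# Minimal sketch of separable nonlinear least squares via variable projection.
# Assumed model (not from the paper): y ≈ c0 + c1 * log(x + theta),
# where c0, c1 are linear parameters and theta is the additional
# nonlinear argument parameter.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)
x = rng.uniform(0.5, 5.0, size=200)
y = 1.5 + 2.0 * np.log(x + 0.7) + 0.05 * rng.normal(size=200)

def residuals(theta):
    # Basis matrix for the linear part: [1, log(x + theta)].
    A = np.column_stack([np.ones_like(x), np.log(x + theta[0])])
    # Project out the linear coefficients with ordinary least squares ...
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    # ... and return the residuals of the combined model.
    return A @ c - y

# Outer nonlinear optimization runs only over the nonlinear parameter theta.
fit = least_squares(residuals, x0=[1.0], bounds=([1e-6], [10.0]))
print("estimated theta:", fit.x[0])
```

Because the linear coefficients are eliminated analytically at every step, the outer optimizer searches a much smaller parameter space, which is the main appeal of variable projection for models that are linear in most of their parameters.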


