
Decoupling multivariate functions using a nonparametric filtered tensor decomposition

by Jan Decuyper, et al.

Multivariate functions arise naturally in a wide variety of data-driven models, with popular choices being expressions in the form of basis expansions or neural networks. While highly effective, the resulting functions tend to be hard to interpret, in part because of the large number of parameters they require. Decoupling techniques aim to provide an alternative representation of the nonlinearity. The so-called decoupled form is often a more efficient parameterisation of the relationship, while its high degree of structure favours interpretability. In this work, two new algorithms based on filtered tensor decompositions of first-order derivative information are introduced. The method returns nonparametric estimates of smooth decoupled functions. Direct applications are found in, among others, the fields of nonlinear system identification and machine learning.



