Decoupling multivariate functions using second-order information and tensors
The power of multivariate functions lies in their ability to model a wide variety of phenomena, but they have the disadvantage of lacking an intuitive or interpretable representation and often requiring a (very) large number of parameters. We study decoupled representations of multivariate vector functions, which are linear combinations of univariate functions of linear combinations of the input variables. This model structure provides a description with fewer parameters and reveals the internal workings in a simpler way, as the nonlinearities are one-to-one functions. In earlier work, a tensor-based method was developed for performing this decomposition using first-order derivative information. In this article, we generalize this method and study how second-order derivative information can be incorporated. In doing so, we are able to push the method towards more involved configurations, while preserving uniqueness of the underlying tensor decompositions. Furthermore, even for some non-identifiable structures, the method seems to return a valid decoupled representation. These results are a step towards a more general data-driven and noise-robust tensor-based framework for computing decoupled function representations.
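The following is a minimal illustrative sketch (not the paper's implementation) of the decoupled structure described above, written with assumed notation f(u) = W g(Vᵀu): W and V are mixing matrices and g collects the univariate branch nonlinearities. It also shows how first- and second-order derivatives of such a model inherit a low-rank (CP-like) structure, which is the kind of information the tensor-based method exploits.

```python
import numpy as np

# Sketch of a decoupled model f(u) = W g(V^T u), with r univariate
# branch functions g_i applied to internal variables z = V^T u.
# All symbols (W, V, g, sizes) are illustrative assumptions.

rng = np.random.default_rng(0)
n, m, r = 3, 2, 2                      # inputs, outputs, branches
V = rng.standard_normal((n, r))
W = rng.standard_normal((m, r))
g   = [np.tanh, lambda z: z**3]        # example univariate branch functions
dg  = [lambda z: 1 - np.tanh(z)**2,    # their first derivatives
       lambda z: 3 * z**2]
d2g = [lambda z: -2 * np.tanh(z) * (1 - np.tanh(z)**2),  # second derivatives
       lambda z: 6 * z]

def f(u):
    """Evaluate the decoupled function f(u) = W g(V^T u)."""
    z = V.T @ u
    return W @ np.array([gi(zi) for gi, zi in zip(g, z)])

def jacobian(u):
    """First-order information: J(u) = W diag(g'(V^T u)) V^T."""
    z = V.T @ u
    return W @ np.diag([dgi(zi) for dgi, zi in zip(dg, z)]) @ V.T

def hessian(u, j):
    """Second-order information: Hessian of output j,
    H_j(u) = sum_i W[j, i] * g_i''(v_i^T u) * v_i v_i^T."""
    z = V.T @ u
    return sum(W[j, i] * d2g[i](z[i]) * np.outer(V[:, i], V[:, i])
               for i in range(r))

# Stacking the Jacobians J(u_k) at N sampling points gives an m x n x N
# tensor whose rank-r CP factors involve W, V and the sampled derivatives;
# the Hessians supply additional, second-order slices of the same flavour.
U = rng.standard_normal((n, 5))
J = np.stack([jacobian(U[:, k]) for k in range(U.shape[1])], axis=2)
print(J.shape)  # (m, n, 5)
```

A CP decomposition of such derivative tensors (e.g., via an off-the-shelf tensor toolbox) would recover the mixing matrices up to the usual scaling and permutation indeterminacies; the sketch only illustrates the structure, not the paper's specific algorithm.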