Approximation with Tensor Networks. Part II: Approximation Rates for Smoothness Classes

06/30/2020
by Mazen Ali, et al.

We study the approximation by tensor networks (TNs) of functions from classical smoothness classes. The considered approximation tool combines a tensorization of functions in L^p([0,1)), which allows one to identify a univariate function with a multivariate function (or tensor), and the use of tree tensor networks (the tensor train format) to exploit low-rank structures of multivariate functions. The resulting tool can be interpreted as a feed-forward neural network whose first layers implement the tensorization, interpreted as a particular featuring step, followed by a sum-product network with a sparse architecture. In Part I of this work, we presented several approximation classes associated with different measures of the complexity of tensor networks and studied their properties. In this work (Part II), we show how classical approximation tools, such as polynomials or splines (with fixed or free knots), can be encoded as tensor networks with controlled complexity. We use this to derive direct (Jackson) inequalities for the approximation spaces of tensor networks, and then to show that Besov spaces are continuously embedded into these approximation spaces. In other words, arbitrary Besov functions can be approximated at optimal or near-optimal rate. We also show that, conversely, a function in these approximation classes need not possess any Besov smoothness, unless the depth of the tensor networks is restricted.
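To make the tensorization step concrete (a minimal sketch in our own notation, which may differ from the paper's exact construction): a point x in [0,1) is split into its first d binary digits i_1, ..., i_d and a remainder, x = sum_{k=1}^d 2^{-k} i_k + 2^{-d} x_bar, and a univariate function f is identified with the (d+1)-variate function (i_1, ..., i_d, x_bar) -> f(x). The Python snippet below is a hypothetical illustration, not the authors' code: it samples a cubic polynomial on a dyadic grid, reshapes the samples into an order-d tensor with mode size 2, and estimates the tensor-train (TT) ranks of its sequential unfoldings. The small ranks reflect the low-rank structure that the tensor networks exploit, consistent with the abstract's claim that polynomials can be encoded with controlled complexity.

```python
# Hypothetical sketch (not the authors' code): tensorize samples of a univariate
# function and estimate the tensor-train (TT) ranks of its unfoldings via SVDs.
import numpy as np

d = 10                                    # number of binary digits / tensor order
x = np.arange(2**d) / 2**d                # dyadic grid on [0, 1)
f = x**3 - 0.5 * x                        # a cubic polynomial as test function

# Tensorization: identify the samples of f with an order-d tensor of mode size 2,
# indexed by the binary digits (i_1, ..., i_d) of x (C-order: i_1 = leading digit).
F = f.reshape((2,) * d)

# TT ranks are the ranks of the sequential unfoldings of shape (2^k, 2^(d-k)).
tol = 1e-10
tt_ranks = []
for k in range(1, d):
    unfolding = F.reshape(2**k, 2**(d - k))
    s = np.linalg.svd(unfolding, compute_uv=False)
    tt_ranks.append(int(np.sum(s > tol * s[0])))

# For a degree-p polynomial the TT ranks are bounded by p + 1 (here: at most 4),
# independent of d -- the kind of controlled complexity the paper quantifies.
print(tt_ranks)
```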


