Tensor train-Karhunen-Loève expansion for continuous-indexed random fields using higher-order cumulant functions

07/15/2019
by   Ling-Ze Bu, et al.

The goals of this work are two-fold: first, to propose a new theoretical framework for representing random fields on a large class of multidimensional geometrical domains in the tensor train format; second, to develop a new algorithmic framework for accurately computing the modes and the second- and third-order cumulant tensors in moderate time. The core of the theoretical framework is the tensor train decomposition of cumulant functions, which is computed accurately with a novel rank-revealing algorithm. Compared with existing Galerkin-type and collocation-type methods, the proposed computational procedure entirely removes the need to select basis functions, collocation points, and quadrature points, which not only greatly enhances adaptivity but also avoids solving large-scale eigenvalue problems. Moreover, by computing with third-order cumulant functions, the new theoretical and algorithmic frameworks show great potential for representing general non-Gaussian, non-homogeneous random fields. Three numerical examples, including a three-dimensional random field discretization problem, illustrate the efficiency and accuracy of the proposed algorithmic framework.
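To make the central idea concrete, the abstract's tensor train format can be sketched with the standard TT-SVD procedure (sequential truncated SVDs of tensor unfoldings). The sketch below is a generic illustration under assumed names (`tt_svd`, `tt_to_full`, the tolerance `tol`, and the sample field `sin(x+y+z)`); it is not the paper's rank-revealing algorithm, which the abstract does not specify.

```python
import numpy as np

def tt_svd(tensor, tol=1e-10):
    # Generic TT-SVD sketch (not the paper's rank-revealing algorithm):
    # sweep through the dimensions, truncating an SVD of each sequential
    # unfolding; singular values below tol * s[0] are discarded.
    shape = tensor.shape
    cores, r = [], 1
    mat = tensor.reshape(r * shape[0], -1)
    for k in range(len(shape) - 1):
        U, s, Vt = np.linalg.svd(mat, full_matrices=False)
        rk = max(1, int(np.sum(s > tol * s[0])))      # truncated TT-rank
        cores.append(U[:, :rk].reshape(r, shape[k], rk))
        mat = (s[:rk, None] * Vt[:rk]).reshape(rk * shape[k + 1], -1)
        r = rk
    cores.append(mat.reshape(r, shape[-1], 1))        # last core
    return cores

def tt_to_full(cores):
    # Contract the TT cores back into a dense tensor for verification.
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=(-1, 0))
    return full[0, ..., 0]

# Example: the smooth 3-D field sin(x+y+z) has exact TT-ranks (2, 2),
# so the decomposition is both compact and numerically exact.
x = np.linspace(0.0, 1.0, 8)
T = np.sin(x[:, None, None] + x[None, :, None] + x[None, None, :])
cores = tt_svd(T)
print([c.shape for c in cores])               # TT cores (r_{k-1}, n_k, r_k)
print(np.max(np.abs(T - tt_to_full(cores))))  # reconstruction error
```

In a Karhunen-Loève setting, `T` would instead hold a discretized covariance (or higher-order cumulant) function; the low TT-ranks are what make storing and manipulating such tensors tractable in high dimensions.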


