Doubly Decomposing Nonparametric Tensor Regression

06/19/2015
by Masaaki Imaizumi, et al.

A nonparametric extension of tensor regression is proposed. Nonlinearity in a high-dimensional tensor space is broken into simple local functions by incorporating low-rank tensor decomposition. Compared with naive nonparametric approaches, this formulation substantially improves the convergence rate of estimation while, under specific conditions, remaining consistent over the same function class. To estimate the local functions, a Bayesian estimator with a Gaussian process prior is developed. Experimental results confirm the theoretical properties and demonstrate high performance in predicting a summary statistic of a real complex network.
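The model form described above can be illustrated with a minimal sketch. The code below is a toy illustration only, not the paper's estimator: it assumes a 3-mode input tensor given through its CP (rank-R) factor vectors, and evaluates a "doubly decomposed" prediction as a sum over ranks of products of simple local functions on each mode vector. Here `tanh` of the vector sum is a hypothetical stand-in for the local functions that the paper estimates with Gaussian process priors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a 3-mode input tensor of CP rank R = 2,
# represented by its mode-wise factor vectors x_d^(r).
R, dims = 2, (4, 5, 6)
factors = [rng.normal(size=(R, d)) for d in dims]

def local_f(v):
    # Toy nonlinear local function on one mode vector
    # (a stand-in for a GP-estimated local function).
    return np.tanh(v.sum())

def predict(factors):
    # Doubly decomposed prediction:
    # f(X) = sum over ranks r of the product over modes d of f_d^(r)(x_d^(r)).
    return sum(
        np.prod([local_f(factors[d][r]) for d in range(len(factors))])
        for r in range(R)
    )

y_hat = predict(factors)
print(y_hat)
```

Because the high-dimensional nonlinearity is carried entirely by univariate-style local functions on low-dimensional mode vectors, each local function is far easier to estimate than a single nonparametric function on the full tensor space.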

