
Doubly Decomposing Nonparametric Tensor Regression

by Masaaki Imaizumi, et al.

A nonparametric extension of tensor regression is proposed. Nonlinearity in a high-dimensional tensor space is broken into simple local functions by incorporating low-rank tensor decomposition. Compared with naive nonparametric approaches, our formulation substantially improves the convergence rate of estimation while remaining consistent for the same function class under specific conditions. To estimate the local functions, we develop a Bayesian estimator with a Gaussian process prior. Experimental results confirm its theoretical properties and demonstrate high performance in predicting a summary statistic of a real complex network.
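The abstract's core idea is that a nonlinear function of a high-dimensional tensor can be reduced to simple local functions of low-rank projections, each estimated with a Gaussian process prior. Below is a minimal illustrative sketch of that pipeline on a toy problem, not the paper's actual estimator: the data-generating model, the known rank-1 direction, and all variable names are hypothetical assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: order-2 tensors (matrices) X_i with responses generated by a
# nonlinear function of a rank-1 projection -- the kind of low-rank
# structure a double decomposition exploits.  (Hypothetical setup.)
n, d1, d2 = 80, 5, 4
u_true, v_true = rng.normal(size=d1), rng.normal(size=d2)
X = rng.normal(size=(n, d1, d2))
z_true = np.einsum('nij,i,j->n', X, u_true, v_true)  # <X_i, u (x) v>
y = np.sin(z_true) + 0.05 * rng.normal(size=n)

# Step 1 (tensor decomposition): reduce each tensor to a 1-D feature by
# projecting onto a rank-1 direction.  For illustration we use the true
# direction; in practice it would itself be estimated.
z = np.einsum('nij,i,j->n', X, u_true, v_true)

# Step 2 (Gaussian process regression): the "simple local function" is
# estimated as the GP posterior mean with an RBF kernel on the 1-D feature.
def rbf(a, b, lengthscale=1.0):
    return np.exp(-(a[:, None] - b[None, :]) ** 2 / (2 * lengthscale ** 2))

noise_var = 0.05 ** 2
K = rbf(z, z) + noise_var * np.eye(n)
alpha = np.linalg.solve(K, y)

def predict(z_new):
    """GP posterior mean at new projected features."""
    return rbf(np.atleast_1d(z_new), z) @ alpha

# The in-sample posterior mean should track the noiseless signal closely.
err = np.mean(np.abs(predict(z) - np.sin(z_true)))
```

The point of the sketch is dimensionality: after the low-rank projection, the nonparametric estimation problem is one-dimensional, which is what drives the improved convergence rate relative to estimating a nonparametric function directly on the full tensor space.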
