On the Statistical Efficiency of Compositional Nonparametric Prediction

04/06/2017
by   Yixi Xu, et al.

In this paper, we propose a compositional nonparametric method in which a model is expressed as a labeled binary tree of 2k+1 nodes, where each node is either a summation, a multiplication, or the application of one of q basis functions to one of p covariates. We show that in order to recover a labeled binary tree from a given dataset, O(k log(pq) + log(k!)) samples are sufficient and Ω(k log(pq) − log(k!)) samples are necessary. We further propose a greedy algorithm for regression and use it to validate our theoretical findings through synthetic experiments.
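The model class above can be made concrete with a short sketch. The following is an illustrative implementation, not the paper's code: the particular basis functions, the `Node` class, and the example tree are assumptions chosen for demonstration; the paper only specifies that each leaf applies one of q basis functions to one of p covariates and each internal node is a sum or product.

```python
import numpy as np

# Hypothetical basis functions (q = 3); the paper leaves the basis unspecified.
BASES = [np.sin, np.cos, lambda t: t ** 2]

class Node:
    """A node of the labeled binary tree: '+'/'*' if internal,
    or a (basis index, covariate index) pair if a leaf."""
    def __init__(self, op=None, left=None, right=None, basis=None, covariate=None):
        self.op, self.left, self.right = op, left, right
        self.basis, self.covariate = basis, covariate

    def evaluate(self, x):
        if self.op == '+':
            return self.left.evaluate(x) + self.right.evaluate(x)
        if self.op == '*':
            return self.left.evaluate(x) * self.right.evaluate(x)
        # Leaf: apply the chosen basis function to the chosen covariate.
        return BASES[self.basis](x[self.covariate])

# Example with k = 2 internal nodes, hence 2k + 1 = 5 nodes in total:
#   f(x) = sin(x_0) * (cos(x_1) + x_2^2)
tree = Node('*',
            left=Node(basis=0, covariate=0),
            right=Node('+',
                       left=Node(basis=1, covariate=1),
                       right=Node(basis=2, covariate=2)))

x = np.array([0.5, 1.0, 2.0])
print(tree.evaluate(x))  # sin(0.5) * (cos(1.0) + 2.0**2)
```

Note that a full binary tree with k internal (sum/product) nodes has k+1 leaves, which gives the 2k+1 node count stated in the abstract.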


