On Learning and Testing Decision Tree

08/10/2021
by Nader H. Bshouty, et al.

In this paper, we study learning and testing of decision trees whose size and depth are significantly smaller than the number of attributes n. Our main result addresses the problem of poly(n,1/ϵ)-time algorithms with poly(s,1/ϵ) query complexity (independent of n) that distinguish functions that are decision trees of size s from functions that are ϵ-far from every decision tree of size ϕ(s,1/ϵ), for some function ϕ(s,1/ϵ) > s. The best previously known result, which follows from Blanc, Lange and Tan, <cit.>, gives ϕ(s,1/ϵ)=2^O((log^3 s)/ϵ^3). In this paper, we give a new algorithm that achieves ϕ(s,1/ϵ)=2^O(log^2(s/ϵ)). Moreover, we study the testability of depth-d decision trees and give a distribution-free tester that distinguishes depth-d decision trees from functions that are ϵ-far from every depth-d^2 decision tree. In particular, for decision trees of size s, this result holds in the distribution-free model when the tree depth is O(log(s/ϵ)). We also give other new results on learning and testing size-s decision trees and depth-d decision trees that follow from results in the literature and from results we prove in this paper.
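For a side-by-side view of the improvement, the two bounds quoted above can be restated as follows; this is only a restatement of the abstract, and the subscript labels ϕ_prior and ϕ_new are introduced here purely for the comparison.

\begin{align*}
\phi_{\mathrm{prior}}(s, 1/\epsilon) &= 2^{O\left((\log^3 s)/\epsilon^3\right)} && \text{(Blanc, Lange and Tan, <cit.>)} \\
\phi_{\mathrm{new}}(s, 1/\epsilon)   &= 2^{O\left(\log^2 (s/\epsilon)\right)}   && \text{(this paper)}
\end{align*}

For constant ϵ, the exponent drops from O(log^3 s) to O(log^2 s); moreover, the dependence on 1/ϵ moves from a multiplicative 1/ϵ^3 factor in the exponent to appearing only inside the logarithm.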


