Dive into Decision Trees and Forests: A Theoretical Demonstration

01/20/2021
by Jinxiong Zhang, et al.

Based on decision trees, many fields have arguably made tremendous progress in recent years. In simple words, decision trees use a "divide-and-conquer" strategy to decompose the complex problem of modeling the dependency between input features and labels into smaller subproblems. While decision trees have a long history, recent advances have greatly improved their performance in computational advertising, recommender systems, information retrieval, etc. We introduce common tree-based models (e.g., Bayesian CART, Bayesian regression splines) and training techniques (e.g., mixed integer programming, alternating optimization, gradient descent). Along the way, we highlight probabilistic characteristics of tree-based models and explain their practical and theoretical benefits. Beyond machine learning and data mining, we also present theoretical advances on tree-based models from other fields such as statistics and operations research. We list reproducible resources at the end of each method.
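To make the divide-and-conquer idea concrete, the following minimal sketch (not taken from the paper) builds a small classification tree by greedily choosing the feature and threshold that most reduce Gini impurity, then recursing on each half of the data. All function names and parameters here are illustrative assumptions, not the paper's notation.

```python
# Minimal divide-and-conquer decision tree sketch (illustrative only).
import numpy as np

def gini(y):
    """Gini impurity of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y):
    """Greedy search over (feature, threshold) pairs that minimizes the
    weighted impurity of the two resulting child nodes."""
    best, best_score = None, gini(y)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            left = X[:, j] <= t
            right = ~left
            if left.sum() == 0 or right.sum() == 0:
                continue
            score = left.mean() * gini(y[left]) + right.mean() * gini(y[right])
            if score < best_score:
                best_score, best = score, (j, t)
    return best

def build_tree(X, y, depth=0, max_depth=3):
    """Divide-and-conquer: split the data, then recurse on each part."""
    split = best_split(X, y) if depth < max_depth else None
    if split is None:
        # Leaf: predict the majority class of the remaining labels.
        values, counts = np.unique(y, return_counts=True)
        return {"leaf": values[np.argmax(counts)]}
    j, t = split
    mask = X[:, j] <= t
    return {
        "feature": j,
        "threshold": t,
        "left": build_tree(X[mask], y[mask], depth + 1, max_depth),
        "right": build_tree(X[~mask], y[~mask], depth + 1, max_depth),
    }

if __name__ == "__main__":
    # Tiny usage example on synthetic data.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    print(build_tree(X, y))
```

Gradient-boosted and Bayesian variants discussed in the paper replace this greedy impurity criterion with, e.g., gradient statistics or posterior sampling over tree structures, but the recursive partitioning skeleton stays the same.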
