Neural Decision Trees

02/23/2017
by Randall Balestriero, et al.

In this paper we propose a synergistic melding of neural networks and decision trees (DT) that we call neural decision trees (NDT). An NDT is a decision-tree-like architecture in which each splitting node is an independent multilayer perceptron (MLP), allowing oblique decision functions, or arbitrary nonlinear decision functions if more than one layer is used. In this way, each MLP can be seen as a node of the tree. We then show that, under a weight-sharing assumption among those units, we end up with a Hashing Neural Network (HNN): a multilayer perceptron with a sigmoid activation function in the last layer instead of the standard softmax. The output units then jointly represent the probability of being in a particular region. The proposed framework allows global optimization, as opposed to the greedy optimization of DTs, and is differentiable w.r.t. all parameters and the input, allowing easy integration into any learnable pipeline, for example after CNNs for computer vision tasks. We also demonstrate the modeling power of the HNN, which can learn unions of disjoint regions for final clustering or classification, making it more general and powerful than a standard softmax MLP, which requires linear separability, and thus reducing the burden on the inner layers to perform complex data transformations. Finally, we present experiments for supervised, semi-supervised, and unsupervised tasks and compare results with standard DTs and MLPs.
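As a rough illustration of the HNN idea described in the abstract, the sketch below replaces the usual softmax head with element-wise sigmoid outputs, so each output unit acts as a soft region-membership indicator and a class score is obtained by combining several regions. The layer sizes and the linear region-to-class aggregation are illustrative assumptions, not the paper's exact construction.

```python
# Minimal HNN-style head sketch (PyTorch); illustrative only, not the authors' code.
import torch
import torch.nn as nn

class HashingNeuralNetwork(nn.Module):
    def __init__(self, in_dim=2, hidden=32, n_regions=8, n_classes=3):
        super().__init__()
        # Inner layers play the role of the tree's splitting MLPs (weights shared across nodes).
        self.features = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_regions),
        )
        # Region-to-class aggregation: lets one class be a union of disjoint regions
        # (an assumed, simple choice for this sketch).
        self.region_to_class = nn.Linear(n_regions, n_classes)

    def forward(self, x):
        # Element-wise sigmoid (not softmax): each unit independently estimates
        # the probability of the input lying in one region.
        regions = torch.sigmoid(self.features(x))
        return self.region_to_class(regions)  # class logits built from region memberships

model = HashingNeuralNetwork()
logits = model(torch.randn(5, 2))  # batch of 5 two-dimensional inputs
print(logits.shape)                # torch.Size([5, 3])
```

Because every operation above is differentiable, such a head can be trained end-to-end with gradient descent, in contrast to the greedy, non-differentiable splitting of a standard decision tree.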


