Manifold Forests: Closing the Gap on Neural Networks

09/25/2019
by Ronan Perry, et al.

Decision forests (DF), in particular random forests and gradient boosting trees, have demonstrated state-of-the-art accuracy compared to other methods in many supervised learning scenarios. In particular, DFs dominate other methods on tabular data, that is, when the feature space is unstructured, so that the signal is invariant to permuting feature indices. However, on structured data lying on a manifold---such as images, text, and speech---neural nets (NN) tend to outperform DFs. We conjecture that at least part of the reason for this is that the input to a NN is not simply the feature magnitudes, but also their indices (for example, the convolution operation uses "feature locality"). In contrast, naïve DF implementations fail to explicitly consider feature indices. A recently proposed DF approach demonstrates that DFs, at each node, implicitly sample a random matrix from some specific distribution. Here, we build on that insight to show that one can choose these distributions in a manifold-aware fashion. For example, for image classification, rather than randomly selecting pixels, one can randomly select contiguous patches. We demonstrate the empirical performance on data living on three different manifolds: images, time-series, and a torus. In all three cases, our Manifold Forest algorithm empirically dominates other state-of-the-art approaches that ignore feature space structure, achieving a lower classification error at all sample sizes. This dominance extends to the MNIST data set as well. Moreover, both training and test times are significantly faster for manifold forests as compared to deep nets. This approach, therefore, has promise to enable DFs and other machine learning methods to close the gap with deep nets on manifold-valued data.
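The patch-selection idea described above can be sketched in a few lines: instead of choosing pixel indices uniformly at random, each candidate split projects the image onto a random contiguous rectangle. The function name and parameters below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def random_patch_projection(height, width, max_patch=4, rng=None):
    """Sample a sparse projection vector that sums a random contiguous patch.

    A naive (tabular) DF split candidate would instead select pixel
    indices uniformly at random, ignoring the image's spatial structure.
    """
    rng = np.random.default_rng() if rng is None else rng
    ph = rng.integers(1, max_patch + 1)          # patch height
    pw = rng.integers(1, max_patch + 1)          # patch width
    top = rng.integers(0, height - ph + 1)       # top-left corner, in bounds
    left = rng.integers(0, width - pw + 1)
    proj = np.zeros((height, width))
    proj[top:top + ph, left:left + pw] = 1.0     # contiguous support
    return proj.ravel()                          # match vectorized pixels

# At each tree node, a few such projections are sampled; the candidate
# split feature for an image x (flattened) is x @ proj, and the best
# threshold is then chosen exactly as in a standard decision forest.
```

The only change relative to an unstructured random projection is the support of the sampled vector: a contiguous rectangle rather than scattered indices, which is what makes the distribution manifold-aware.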


Related research:

- Towards Robust Classification with Deep Generative Forests (07/11/2020): Decision Trees and Random Forests are among the most widely used machine...
- Neural Manifold Clustering and Embedding (01/24/2022): Given a union of non-linear manifolds, non-linear subspace clustering or...
- Hopular: Modern Hopfield Networks for Tabular Data (06/01/2022): While Deep Learning excels in structured data as encountered in vision a...
- Effects of Data Geometry in Early Deep Learning (12/29/2022): Deep neural networks can approximate functions on different types of dat...
- Robust Similarity and Distance Learning via Decision Forests (07/27/2020): Canonical distances such as Euclidean distance often fail to capture the...
- When are Deep Networks really better than Random Forests at small sample sizes? (08/31/2021): Random forests (RF) and deep networks (DN) are two of the most popular m...
- When Do Neural Nets Outperform Boosted Trees on Tabular Data? (05/04/2023): Tabular data is one of the most commonly used types of data in machine l...
