Universal guarantees for decision tree induction via a higher-order splitting criterion

Part of Advances in Neural Information Processing Systems 33 (NeurIPS 2020)


Authors

Guy Blanc, Neha Gupta, Jane Lange, Li-Yang Tan

Abstract

We propose a simple extension of {\sl top-down decision tree learning heuristics} such as ID3, C4.5, and CART. Our algorithm achieves provable guarantees for all target functions $f: \{-1,1\}^n \to \{-1,1\}$ with respect to the uniform distribution, circumventing impossibility results showing that existing heuristics fare poorly even for simple target functions. The crux of our extension is a new splitting criterion that takes into account the correlations between $f$ and {\sl small subsets} of its attributes. The splitting criteria of existing heuristics (e.g. Gini impurity and information gain), in contrast, are based solely on the correlations between $f$ and its {\sl individual} attributes. Our algorithm satisfies the following guarantee: for all target functions $f : \{-1,1\}^n \to \{-1,1\}$, sizes $s \in \mathbb{N}$, and error parameters $\varepsilon$, it constructs a decision tree of size $s^{\tilde{O}((\log s)^2/\varepsilon^2)}$ that achieves error $\le O(\mathrm{opt}_s) + \varepsilon$, where $\mathrm{opt}_s$ denotes the error of the optimal size-$s$ decision tree for $f$. A key technical notion that drives our analysis is the {\sl noise stability} of $f$, a well-studied smoothness measure of $f$.
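To make the contrast with first-order criteria concrete, the sketch below illustrates one way a higher-order splitting criterion could be instantiated under the uniform distribution: each attribute is scored by the empirical correlations of $f$ with parities over small subsets that contain it, rather than by its individual correlation alone. This is only an illustrative proxy under stated assumptions, not the paper's exact algorithm or splitting rule; the function names (`higher_order_scores`, `choose_split`) and the subset-size budget `k` are hypothetical choices made for this example.

```python
import itertools
import numpy as np

def subset_correlation(X, y, subset):
    """Empirical correlation E[f(x) * prod_{i in subset} x_i] over a sample.

    X: (m, n) array with entries in {-1, +1}; y: (m,) array of labels in {-1, +1}.
    """
    parity = np.prod(X[:, subset], axis=1)
    return float(np.mean(y * parity))

def higher_order_scores(X, y, k=2):
    """Score each attribute i by the total squared empirical correlation of f
    with parities over subsets of size <= k that contain i.

    Illustrative proxy for a criterion that uses correlations with small
    subsets of attributes; not the exact criterion analyzed in the paper.
    """
    n = X.shape[1]
    scores = np.zeros(n)
    for size in range(1, k + 1):
        for subset in itertools.combinations(range(n), size):
            corr = subset_correlation(X, y, list(subset))
            for i in subset:
                scores[i] += corr ** 2
    return scores

def choose_split(X, y, k=2):
    """Pick the attribute with the largest higher-order score.

    With k=1 this reduces to ranking attributes by their individual
    correlation with f, which is the kind of first-order information that
    Gini impurity and information gain rely on.
    """
    return int(np.argmax(higher_order_scores(X, y, k)))
```

For reference, the noise stability mentioned in the abstract is the standard quantity $\mathrm{Stab}_\rho[f] = \mathbb{E}_{x,\, y \sim N_\rho(x)}[f(x) f(y)]$, where $x$ is uniform over $\{-1,1\}^n$ and $y$ is a $\rho$-correlated copy of $x$.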