Inference with Mondrian Random Forests

📅 2023-10-15
📈 Citations: 6
Influential: 1
🤖 AI Summary
This paper addresses the lack of uncertainty quantification in Mondrian random forests for regression. We establish, for the first time, exact bias–variance decompositions and a central limit theorem for Mondrian forests, and propose a theory-driven debiasing strategy that achieves minimax-optimal convergence rates over β-Hölder function classes. Methodologically, our approach integrates Mondrian process-based tree construction, analytical variance estimation, and computationally tractable debiasing corrections, supporting both batch and online learning with rigorous computational complexity analysis. Key contributions include: (1) the first practical confidence interval construction with explicit error bounds; (2) a rigorous statistical inference framework enabling hypothesis testing and calibrated prediction; and (3) theoretical guarantees on error control, corroborated by simulations demonstrating high accuracy and robustness in finite-sample settings. Our work fills a fundamental theoretical gap in the statistical inference of Mondrian forests.
📝 Abstract
Random forests are popular methods for regression and classification analysis, and many different variants have been proposed in recent years. One interesting example is the Mondrian random forest, in which the underlying constituent trees are constructed via a Mondrian process. We give precise bias and variance characterizations, along with a Berry-Esseen-type central limit theorem, for the Mondrian random forest regression estimator. By combining these results with a carefully crafted debiasing approach and an accurate variance estimator, we present valid statistical inference methods for the unknown regression function. These methods come with explicitly characterized error bounds in terms of the sample size, tree complexity parameter, and number of trees in the forest, and include coverage error rates for feasible confidence interval estimators. Our novel debiasing procedure for the Mondrian random forest also allows it to achieve the minimax-optimal point estimation convergence rate in mean squared error for multivariate $\beta$-Hölder regression functions, for all $\beta > 0$, provided that the underlying tuning parameters are chosen appropriately. Efficient and implementable algorithms are devised for both batch and online learning settings, and we study the computational complexity of different Mondrian random forest implementations. Finally, simulations with synthetic data validate our theory and methodology, demonstrating their excellent finite-sample properties.
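The Mondrian process mentioned in the abstract partitions a box by axis-aligned cuts arriving over a finite "lifetime" (the tree complexity parameter): the waiting time to the next cut of a cell is exponential with rate equal to the cell's total side length, the cut dimension is chosen proportionally to side length, and the cut location is uniform. A minimal sketch of this recursion, assuming the standard definition (the function name, cell encoding, and seed are illustrative, not the paper's implementation):

```python
import random

def sample_mondrian_tree(cell, budget, rng=None):
    """Recursively sample an axis-aligned partition of `cell` from a
    Mondrian process with remaining lifetime `budget`.

    `cell` is a list of (low, high) intervals, one per coordinate.
    Returns a nested dict: a leaf holding its cell, or a split recording
    the cut dimension, cut location, and the two child subtrees.
    """
    rng = rng or random.Random(0)
    sides = [hi - lo for lo, hi in cell]
    total = sum(sides)
    # Waiting time to the next cut is Exp(total side length); if it
    # exceeds the remaining budget, the cell becomes a leaf.
    wait = rng.expovariate(total) if total > 0 else float("inf")
    if wait > budget:
        return {"leaf": True, "cell": cell}
    # Pick the cut dimension with probability proportional to side length.
    r = rng.uniform(0, total)
    dim, acc = 0, 0.0
    for i, side in enumerate(sides):
        acc += side
        if r <= acc:
            dim = i
            break
    lo, hi = cell[dim]
    cut = rng.uniform(lo, hi)  # cut location uniform within the side
    left = cell[:dim] + [(lo, cut)] + cell[dim + 1:]
    right = cell[:dim] + [(cut, hi)] + cell[dim + 1:]
    return {
        "leaf": False, "dim": dim, "cut": cut,
        "left": sample_mondrian_tree(left, budget - wait, rng),
        "right": sample_mondrian_tree(right, budget - wait, rng),
    }
```

A larger `budget` yields finer partitions on average, which is why the paper treats the lifetime as the key tuning parameter trading off bias against variance.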
Problem

Research questions and friction points this paper is trying to address.

- Characterize bias and variance in Mondrian random forests
- Develop valid inference methods for regression functions
- Achieve minimax-optimal convergence rates in estimation
Innovation

Methods, ideas, or system contributions that make the work stand out.

- Mondrian process-based construction of the constituent trees
- Debiasing approach that enables valid inference on the regression function
- Efficient algorithms for both batch and online learning settings
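The inference pipeline culminates in a CLT-based confidence interval. As an illustration of that final step only, here is a generic normal-approximation interval; note this is a hedged sketch, not the paper's procedure (the paper pairs a debiased forest estimate with its own variance estimator and Mondrian-specific rates, whereas `theta_hat`, `sigma_hat`, and the plain `sqrt(n)` scaling below are placeholder assumptions):

```python
import math
from statistics import NormalDist

def normal_ci(theta_hat, sigma_hat, n, alpha=0.05):
    """Two-sided (1 - alpha) confidence interval for a point estimate
    `theta_hat` with estimated standard deviation `sigma_hat`, assuming
    a sqrt(n)-rate central limit theorem holds (a stand-in for the
    Mondrian-specific rate and variance estimator in the paper)."""
    z = NormalDist().inv_cdf(1 - alpha / 2)  # standard normal quantile
    half_width = z * sigma_hat / math.sqrt(n)
    return theta_hat - half_width, theta_hat + half_width
```

For example, `normal_ci(1.0, 2.0, 100)` returns an interval of half-width roughly `1.96 * 2 / 10 ≈ 0.39` around the point estimate; the paper's contribution is making the analogous interval for Mondrian forests feasible, with explicit coverage error rates.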
👥 Authors

M. D. Cattaneo
Department of Operations Research and Financial Engineering, Princeton University

Jason M. Klusowski
Assistant Professor, Department of Operations Research & Financial Engineering
statistics · probability · machine learning · information theory

W. Underwood
Statistical Laboratory, University of Cambridge