The Uniformly Rotated Mondrian Kernel

📅 2025-02-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Standard Mondrian kernels lack rotational invariance and therefore approximate isotropic kernels poorly. To address this, we propose the uniformly rotated Mondrian process (URMP), which yields efficient, rotationally invariant random feature maps. We derive, for the first time, a closed-form expression for the rotationally invariant Mondrian kernel; establish a uniform convergence rate for its random feature estimator; and characterize the spherical symmetry of the typical cell of the superposition of uniformly rotated Mondrian tessellations. Our analysis combines tools from stochastic geometry, the theory of stationary random tessellations, and averaging over the rotation group. Empirically, URMP significantly outperforms the original Mondrian kernel on both synthetic and real-world datasets, particularly under distribution shift, demonstrating superior robustness. This work provides a theoretically grounded and practically effective tool for large-scale kernel learning.

📝 Abstract
First proposed by Rahimi and Recht, random features are used to decrease the computational cost of kernel machines in large-scale problems. The Mondrian kernel is one such example of a fast random feature approximation of the Laplace kernel, generated by a computationally efficient hierarchical random partition of the input space known as the Mondrian process. In this work, we study a variation of this random feature map by using uniformly randomly rotated Mondrian processes to approximate a kernel that is invariant under rotations. We obtain a closed-form expression for this isotropic kernel, as well as a uniform convergence rate of the uniformly rotated Mondrian kernel to this limit. To this end, we utilize techniques from the theory of stationary random tessellations in stochastic geometry and prove a new result on the geometry of the typical cell of the superposition of uniformly random rotations of Mondrian tessellations. Finally, we test the empirical performance of this random feature map on both synthetic and real-world datasets, demonstrating its improved performance over the Mondrian kernel on a debiased dataset.
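The construction described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration under my own simplifying assumptions, not the paper's implementation: `random_rotation`, `mondrian_leaves`, and the budget parameter `lam` are hypothetical names, and the kernel is estimated as the fraction of independently rotated Mondrian trees in which two points land in the same cell of the partition.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_rotation(d):
    """Haar-uniform orthogonal matrix via QR of a Gaussian matrix."""
    A = rng.standard_normal((d, d))
    Q, R = np.linalg.qr(A)
    return Q * np.sign(np.diag(R))  # sign fix makes the distribution uniform

def mondrian_leaves(X, lam, lower, upper, t=0.0):
    """Label each row of X with the index of its leaf cell in a Mondrian
    partition of the box [lower, upper], run until time budget lam."""
    lengths = upper - lower
    t = t + rng.exponential(1.0 / lengths.sum())  # rate = linear dimension
    if t > lam:                                   # budget exhausted: leaf cell
        return np.zeros(len(X), dtype=int)
    dim = rng.choice(len(lengths), p=lengths / lengths.sum())
    cut = rng.uniform(lower[dim], upper[dim])     # axis-aligned cut
    go_left = X[:, dim] <= cut
    upper_left, lower_right = upper.copy(), lower.copy()
    upper_left[dim] = cut
    lower_right[dim] = cut
    left = mondrian_leaves(X[go_left], lam, lower, upper_left, t)
    right = mondrian_leaves(X[~go_left], lam, lower_right, upper, t)
    labels = np.empty(len(X), dtype=int)
    labels[go_left] = left
    labels[~go_left] = right + (left.max() + 1 if left.size else 0)
    return labels

def urmp_kernel_estimate(X, lam=2.0, n_trees=50):
    """Monte Carlo estimate of the isotropic kernel: the fraction of
    independently rotated Mondrian trees in which two points share a leaf."""
    n, d = X.shape
    K = np.zeros((n, n))
    for _ in range(n_trees):
        Z = X @ random_rotation(d).T              # rotate the data uniformly
        pad = 1e-9                                # avoid zero-width boxes
        lower, upper = Z.min(axis=0) - pad, Z.max(axis=0) + pad
        labels = mondrian_leaves(Z, lam, lower, upper)
        K += labels[:, None] == labels[None, :]
    return K / n_trees

X = rng.standard_normal((8, 3))
K = urmp_kernel_estimate(X)
```

In this sketch the implicit feature map is the one-hot encoding of leaf membership across trees, so `K` is symmetric with unit diagonal and entries in [0, 1]; averaging over rotations is what makes the limiting kernel isotropic.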
Problem

Research questions and friction points this paper is trying to address.

Reduce computational cost of kernel machines
Approximate rotation-invariant kernels efficiently
Improve performance on debiased datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uniformly rotated Mondrian processes
Isotropic kernel closed-form expression
Stationary random tessellations theory