The Root Finding Problem Revisited: Beyond the Robbins-Monro procedure

📅 2025-08-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
When the derivative of the regression function is small, vanishes, or is discontinuous at its root, the Robbins–Monro (RM) procedure suffers reduced convergence rates and poor variance properties. To address this, the paper proposes the Sequential Probability Ratio Bisection (SPRB) algorithm, which combines sequential probability ratio testing with bisection search to adapt to the local behavior of the function near its root. SPRB attains the optimal convergence rate and minimal asymptotic variance even when the derivative at the root is small, and when the regression function is discontinuous at the root it converges exponentially fast, whereas RM converges only at rate 1/n. Theoretical contributions include: (i) a nonasymptotic bound on the expected sample size; (ii) a generalized central limit theorem under random stopping times; (iii) asymptotic optimality of the estimator's variance; and (iv) automatically generated time-uniform, nonasymptotic confidence sequences that do not require knowledge of the convergence rate. Simulation results demonstrate that SPRB significantly outperforms classical RM in these challenging settings.

📝 Abstract
We introduce Sequential Probability Ratio Bisection (SPRB), a novel stochastic approximation algorithm that adapts to the local behavior of the (regression) function of interest around its root. We establish theoretical guarantees for SPRB's asymptotic performance, showing that it achieves the optimal convergence rate and minimal asymptotic variance even when the target function's derivative at the root is small (at most half the step size), a regime where the classical Robbins-Monro procedure typically suffers reduced convergence rates. Further, we show that if the regression function is discontinuous at the root, Robbins-Monro converges at a rate of $1/n$ whilst SPRB attains exponential convergence. If the regression function has vanishing first-order derivative, SPRB attains a faster rate of convergence compared to stochastic approximation. As part of our analysis, we derive a nonasymptotic bound on the expected sample size and establish a generalized Central Limit Theorem under random stopping times. Remarkably, SPRB automatically provides nonasymptotic time-uniform confidence sequences that do not explicitly require knowledge of the convergence rate. We demonstrate the practical effectiveness of SPRB through simulation results.
Problem

Research questions and friction points this paper is trying to address.

Introduces SPRB algorithm for stochastic root finding problems
Addresses slow convergence when derivative is small or zero
Handles discontinuous regression functions with exponential convergence
Innovation

Methods, ideas, or system contributions that make the work stand out.

Sequential Probability Ratio Bisection algorithm
Adapts to local function behavior near root
Achieves optimal convergence rate and minimal variance
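The core idea described above, deciding the sign of the noisy regression function at each bisection midpoint via a sequential stopping rule, can be illustrated with a toy sketch. This is not the paper's exact procedure: the fixed crossing threshold, Gaussian noise model, and function names below are assumptions made for illustration.

```python
import random


def sequential_sign(query, x, threshold=6.0, max_samples=10_000):
    """Sample query(x) until the cumulative sum crosses +/- threshold,
    then declare the sign of the underlying mean at x. A crude stand-in
    for a sequential probability ratio test of H0: f(x) < 0 vs H1: f(x) > 0."""
    total = 0.0
    for _ in range(max_samples):
        total += query(x)
        if abs(total) >= threshold:
            break
    return 1 if total > 0 else -1


def sprb_sketch(query, lo, hi, iters=25):
    """Bisection for the root of a noisy *increasing* function:
    sequentially test the sign at the midpoint, then halve the bracket."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if sequential_sign(query, mid) > 0:
            hi = mid  # mean appears positive, so the root lies to the left
        else:
            lo = mid  # mean appears negative, so the root lies to the right
    return 0.5 * (lo + hi)


if __name__ == "__main__":
    random.seed(1)
    # Noisy observations of f(x) = x - 2, whose root is 2.
    noisy_f = lambda x: (x - 2.0) + random.gauss(0.0, 0.2)
    print(sprb_sketch(noisy_f, 0.0, 4.0))
```

Because the stopping rule spends more samples where the drift |f(x)| is small, the sign decisions stay reliable near the root, which is the intuition behind SPRB's adaptivity; the actual algorithm's test statistics and guarantees are given in the paper.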