Variation Matters: from Mitigating to Embracing Zero-Shot NAS Ranking Function Variation

📅 2025-02-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In zero-shot neural architecture search (NAS), ranking functions exhibit severe output variability due to stochastic factors such as weight initialization and batch sampling, undermining search stability and reliability. This work is the first to explicitly model zero-shot NAS rankings as random variables and, grounded in stochastic order theory, to formalize proxy performance metrics via stochastic dominance relations. It proposes a new paradigm of leveraging variability rather than suppressing it, combining Monte Carlo evaluation with statistical significance testing of ranking decisions: repeated stochastic evaluations characterize ranking uncertainty, and hypothesis tests determine when one architecture dominates another. Empirically, on standard benchmark search spaces (e.g., NAS-Bench-201, DARTS), the method substantially improves search stability and final model accuracy: top-1 architectures achieve average validation accuracy gains of 1.8–3.2 percentage points over state-of-the-art zero-shot baselines.

📝 Abstract
Neural Architecture Search (NAS) is a powerful automatic alternative to the manual design of neural networks. In the zero-shot version, a fast ranking function is used to compare architectures without training them. The outputs of ranking functions often vary significantly due to different sources of randomness, including the initialization of the evaluated architecture's weights or the batch of data used for the calculation. A common approach to addressing this variation is to average the ranking function output over several evaluations. We propose taking the variation into account in a different manner, by viewing the ranking function output as a random variable representing a proxy performance metric. During the search process, we strive to construct a stochastic ordering of these performance metrics to determine the best architecture. Our experiments show that the proposed stochastic ordering can effectively boost the performance of a search on standard benchmark search spaces.
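The core idea above, treating each noisy proxy score as a random variable, re-evaluating it repeatedly, and declaring one architecture better only when the dominance is statistically significant, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the proxy is simulated with Gaussian noise (an assumption), and a one-sided Mann-Whitney U test with a normal approximation stands in for whatever significance test the paper actually uses.

```python
import math
import random


def mann_whitney_u(xs, ys):
    """One-sided Mann-Whitney U test: H1 is that values in xs tend to be
    larger than values in ys (xs stochastically dominates ys).
    Returns (U statistic, approximate one-sided p-value)."""
    n1, n2 = len(xs), len(ys)
    # U counts pairs where an xs sample beats a ys sample (ties count half).
    u = sum((x > y) + 0.5 * (x == y) for x in xs for y in ys)
    mean = n1 * n2 / 2.0
    sd = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mean) / sd
    p = 0.5 * math.erfc(z / math.sqrt(2))  # normal approximation, one-sided
    return u, p


def noisy_proxy(true_score, noise=0.1, rng=random):
    """Stand-in for a zero-shot ranking function: its output varies from run
    to run with weight initialization and the data batch. The Gaussian noise
    model is an assumption made purely for this sketch."""
    return rng.gauss(true_score, noise)


def prefer(score_a, score_b, n_samples=30, alpha=0.05, seed=0):
    """Monte Carlo evaluation plus hypothesis test: re-evaluate each
    architecture n_samples times and prefer A over B only when A's proxy
    distribution dominates B's at significance level alpha."""
    rng = random.Random(seed)
    a = [noisy_proxy(score_a, rng=rng) for _ in range(n_samples)]
    b = [noisy_proxy(score_b, rng=rng) for _ in range(n_samples)]
    _, p = mann_whitney_u(a, b)
    return p < alpha, p
```

For example, two architectures whose underlying proxy scores are well separated (say 1.0 vs 0.7 with noise 0.1) are confidently ordered, whereas architectures with overlapping score distributions are left unordered rather than ranked on a single noisy evaluation, which is the behavior a simple one-shot or averaged comparison cannot provide.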
Problem

Research questions and friction points this paper is trying to address.

Addressing zero-shot NAS ranking function variation
Constructing stochastic ordering of performance metrics
Boosting performance in neural architecture search
Innovation

Methods, ideas, or system contributions that make the work stand out.

Embrace ranking function variation
Construct stochastic performance ordering
Boost NAS search performance