🤖 AI Summary
Existing methods for hyperparameter importance estimation struggle with the hierarchical dependencies, dynamic activation, and domain changes that hyperparameters exhibit in conditional search spaces, often yielding misleading results in high-performance regions. This work proposes the condPED-ANOVA framework, which provides the first formal definition of conditional hyperparameter importance. Building on PED-ANOVA, it incorporates conditional probabilistic modeling and a closed-form analytical estimator to accurately capture importance under complex conditional structures. Experiments demonstrate that condPED-ANOVA delivers accurate, consistent, and interpretable importance estimates in conditional settings, significantly outperforming current state-of-the-art approaches.
📝 Abstract
We propose conditional PED-ANOVA (condPED-ANOVA), a principled framework for estimating hyperparameter importance (HPI) in conditional search spaces, where the presence or domain of a hyperparameter can depend on other hyperparameters. Although the original PED-ANOVA provides a fast and efficient way to estimate HPI within the top-performing regions of the search space, it assumes a fixed, unconditional search space and therefore cannot properly handle conditional hyperparameters. To address this, we introduce a conditional HPI for top-performing regions and derive a closed-form estimator that accurately reflects conditional activation and domain changes. Experiments show that naive adaptations of existing HPI estimators yield misleading or uninterpretable importances in conditional settings, whereas condPED-ANOVA consistently provides meaningful importances that reflect the underlying conditional structure. Our code is publicly available at https://github.com/kAIto47802/condPED-ANOVA.
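To make the setting concrete, here is a minimal sketch of a conditional search space in plain Python. The hyperparameter names and ranges are hypothetical, not taken from the paper: a parent hyperparameter (`optimizer`) controls both which child hyperparameters exist (`momentum` is only active under SGD) and the domain a shared child (`lr`) is drawn from.

```python
import random

def sample_config(rng: random.Random) -> dict:
    """Sample one configuration from a toy conditional search space.

    The parent `optimizer` decides both which children are active
    (`momentum` exists only under SGD) and the domain of `lr`.
    All names and ranges are illustrative, not from the paper.
    """
    config = {"optimizer": rng.choice(["sgd", "adam"])}
    if config["optimizer"] == "sgd":
        config["lr"] = 10 ** rng.uniform(-3, -1)     # lr in [1e-3, 1e-1]
        config["momentum"] = rng.uniform(0.0, 0.99)  # active only for SGD
    else:
        config["lr"] = 10 ** rng.uniform(-5, -2)     # lr in [1e-5, 1e-2]
    return config

rng = random.Random(0)
samples = [sample_config(rng) for _ in range(1000)]
```

In such a space, `momentum` simply does not exist for Adam trials and `lr` lives on different domains per branch, so an estimator that assumes a fixed, unconditional search space has no principled way to attribute importance to these hyperparameters.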