Conditional PED-ANOVA: Hyperparameter Importance in Hierarchical & Dynamic Search Spaces

📅 2026-01-28
🤖 AI Summary
Existing methods for hyperparameter importance estimation struggle to handle hierarchical dependencies, dynamic activation, and domain shifts of hyperparameters in conditional search spaces, often yielding misleading results in high-performance regions. This work proposes the condPED-ANOVA framework, which formally defines conditional hyperparameter importance for the first time. Building upon PED-ANOVA, it incorporates conditional probabilistic modeling and a closed-form analytical estimator to accurately capture importance under complex conditional structures. Experimental results demonstrate that condPED-ANOVA delivers accurate, consistent, and interpretable importance estimates in conditional settings, significantly outperforming current state-of-the-art approaches.

📝 Abstract
We propose conditional PED-ANOVA (condPED-ANOVA), a principled framework for estimating hyperparameter importance (HPI) in conditional search spaces, where the presence or domain of a hyperparameter can depend on other hyperparameters. Although the original PED-ANOVA provides a fast and efficient way to estimate HPI within the top-performing regions of the search space, it assumes a fixed, unconditional search space and therefore cannot properly handle conditional hyperparameters. To address this, we introduce a conditional HPI for top-performing regions and derive a closed-form estimator that accurately reflects conditional activation and domain changes. Experiments show that naive adaptations of existing HPI estimators yield misleading or uninterpretable importances in conditional settings, whereas condPED-ANOVA consistently provides meaningful importances that reflect the underlying conditional structure. Our code is publicly available at https://github.com/kAIto47802/condPED-ANOVA.
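To make the abstract's setting concrete, the following is a minimal sketch of a conditional search space, where one hyperparameter's presence and another's domain depend on a parent choice. The space itself (an `optimizer` choice gating `momentum`, with a shifted `lr` domain) is a hypothetical illustration, not taken from the paper.

```python
import random

def sample_config(rng):
    """Sample from a toy conditional search space.

    `momentum` is only active when optimizer == "sgd" (conditional
    activation), and the domain of `lr` depends on the optimizer
    (conditional domain change) -- the two phenomena condPED-ANOVA
    is designed to handle.
    """
    config = {"optimizer": rng.choice(["sgd", "adam"])}
    if config["optimizer"] == "sgd":
        config["momentum"] = rng.uniform(0.0, 0.99)  # exists only under sgd
        config["lr"] = 10 ** rng.uniform(-3, -1)     # log-uniform in [1e-3, 1e-1]
    else:
        config["lr"] = 10 ** rng.uniform(-4, -2)     # shifted domain [1e-4, 1e-2]
    return config

rng = random.Random(0)
samples = [sample_config(rng) for _ in range(100)]
```

An unconditional HPI estimator applied to such samples would have to impute `momentum` for the "adam" configurations and would mix the two `lr` domains, which is exactly the failure mode the paper attributes to naive adaptations of existing estimators.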
Problem

Research questions and friction points this paper is trying to address.

hyperparameter importance
conditional search spaces
hierarchical hyperparameters
dynamic search spaces
conditional hyperparameters
Innovation

Methods, ideas, or system contributions that make the work stand out.

conditional hyperparameter importance
PED-ANOVA
hierarchical search spaces
closed-form estimator
hyperparameter optimization
Kaito Baba
The University of Tokyo
Deep Learning · Machine Learning · Algorithm · Learning Theory · Optimization
Yoshihiko Ozaki
Preferred Networks, Inc., Tokyo, Japan
Shuhei Watanabe
SB Intuitions Corp., Tokyo, Japan