HHNAS-AM: Hierarchical Hybrid Neural Architecture Search using Adaptive Mutation Policies

📅 2025-08-20
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Neural architecture search (NAS) for text classification suffers from excessively large, unstructured, and flat search spaces, impeding efficient traversal and convergence. To address these challenges, this paper proposes a hierarchical hybrid NAS framework. First, it incorporates domain-knowledge-guided structured architecture templates to construct a hierarchical and constrained search space. Second, it introduces a Q-learning-driven adaptive mutation mechanism to enable efficient, goal-directed exploration. Third, it integrates a full-probability search strategy to enhance the stability and reproducibility of high-performing architecture discovery. Experiments on the Spider dataset demonstrate that the proposed method achieves an 8% absolute accuracy improvement over baseline models. Moreover, it consistently converges to high-performance architectures across multiple independent runs, significantly improving both search efficiency and generalization capability.
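The Q-learning-driven adaptive mutation mechanism described above can be sketched, very roughly, as a bandit-style policy over mutation actions whose Q-values are updated from accuracy feedback. This is a minimal illustration only; the action names, reward signal, and hyperparameters below are assumptions for the sketch, not details taken from the paper:

```python
import random

# Illustrative sketch of a Q-learning adaptive mutation policy.
# Mutation names, rewards, and hyperparameters are assumed, not from the paper.
MUTATIONS = ["swap_layer", "change_width", "add_skip", "change_activation"]

class AdaptiveMutationPolicy:
    """Picks mutations epsilon-greedily; mutations that improved
    validation accuracy in past iterations get higher Q-values."""

    def __init__(self, actions=MUTATIONS, alpha=0.5, epsilon=0.2, seed=0):
        self.q = {a: 0.0 for a in actions}  # one Q-value per mutation
        self.alpha = alpha                  # learning rate
        self.epsilon = epsilon              # exploration probability
        self.rng = random.Random(seed)

    def select(self):
        if self.rng.random() < self.epsilon:
            return self.rng.choice(list(self.q))   # explore
        return max(self.q, key=self.q.get)         # exploit

    def update(self, action, reward):
        # Stateless (bandit-style) Q-update on the accuracy delta.
        self.q[action] += self.alpha * (reward - self.q[action])

policy = AdaptiveMutationPolicy()
# Toy feedback loop: pretend "add_skip" reliably improves accuracy.
for _ in range(50):
    a = policy.select()
    reward = 0.05 if a == "add_skip" else -0.01  # mocked accuracy delta
    policy.update(a, reward)

best = max(policy.q, key=policy.q.get)
print(best)  # → add_skip
```

The point of the sketch is only the feedback loop: mutations are sampled stochastically, scored by the change in validation performance, and the policy's preferences shift toward mutations that historically paid off, which is what makes the traversal goal-directed rather than uniformly random.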

📝 Abstract
Neural Architecture Search (NAS) has garnered significant research interest due to its capability to discover architectures superior to manually designed ones. Learning text representations is crucial for text classification and other language-related tasks. NAS models used for text classification lack a hybrid hierarchical structure and place no restrictions on the architecture, so the search space becomes very large and mostly redundant, and existing RL models cannot navigate it effectively. Moreover, a flat architecture search yields an unorganised search space that is difficult to traverse. To address this, we propose HHNAS-AM (Hierarchical Hybrid Neural Architecture Search with Adaptive Mutation Policies), a novel approach that efficiently explores diverse architectural configurations. We introduce architectural templates that organise the search space, where the templates are designed on the basis of domain-specific cues. Our method employs mutation strategies that dynamically adapt based on performance feedback from previous iterations using Q-learning, enabling a more effective and accelerated traversal of the search space. The proposed model is fully probabilistic, enabling effective exploration of the search space. We evaluate our approach on the database id (db_id) prediction task, where it consistently discovers high-performing architectures across multiple experiments. On the Spider dataset, our method achieves an 8% improvement in test accuracy over existing baselines.
Problem

Research questions and friction points this paper is trying to address.

Efficiently explores diverse architectural configurations for text classification
Reduces large redundant search space in Neural Architecture Search
Improves navigation using hierarchical hybrid structure and adaptive mutations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hierarchical hybrid neural architecture search method
Adaptive mutation policies using Q-learning feedback
Probabilistic model with domain-specific architectural templates
Anurag Tripathi
Info Origin INC
Ajeet Kumar Singh
Info Origin INC
Rajsabi Surya
Info Origin INC
Aum Gupta
Indian Institute of Technology, New Delhi (IITD)
Sahiinii Lemaina Veikho
Indian Institute of Technology, New Delhi (IITD)
Dorien Herremans
Associate professor, Singapore University of Technology and Design
machine learning, artificial intelligence, audio, music, fintech
S
Sudhir Bisane
Info Origin INC