Universal Neural Architecture Space: Covering ConvNets, Transformers and Everything in Between

📅 2025-10-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Problem: How can a universal neural architecture search (NAS) space be constructed that encompasses convolutional networks, Transformers, and their hybrids, enabling both the discovery of novel architectures and the systematic analysis of existing models? Method: We propose UniNAS, the first unified, differentiable, graph-structured NAS space that jointly models all three major architecture families. It employs a standardized training-and-evaluation protocol to ensure reproducibility and fair comparison, and introduces a lightweight graph representation with a customized search algorithm for efficient traversal and optimization over heterogeneous architectures. Contributions/Results: Under unified experimental settings, UniNAS-discovered architectures surpass state-of-the-art hand-crafted models (e.g., ConvNeXt, ViT) on ImageNet, demonstrating the space's expressiveness and practicality. Key contributions include: (1) the first general, extensible neural architecture representation framework; (2) a search paradigm balancing fairness and efficiency; and (3) a foundation for interpretable analysis of architectural evolution patterns.
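The page does not reproduce UniNAS's actual graph schema, so the following is only a rough illustration of how a heterogeneous, graph-structured search space mixing convolutional and attention operators might be represented and randomly sampled. All names here (`OP_SET`, `Node`, `sample_architecture`) are invented for the sketch, not taken from the paper:

```python
import random
from dataclasses import dataclass, field

# Hypothetical operator vocabulary spanning both architecture families.
OP_SET = ["conv3x3", "conv1x1", "depthwise_conv", "self_attention", "mlp", "identity"]

@dataclass
class Node:
    op: str                                      # operator applied at this node
    inputs: list = field(default_factory=list)   # indices of predecessor nodes

@dataclass
class Architecture:
    nodes: list  # topologically ordered DAG of Nodes

def sample_architecture(num_nodes: int, rng: random.Random) -> Architecture:
    """Sample a random DAG: each node picks an op and 1-2 earlier nodes as inputs."""
    nodes = [Node(op="identity")]  # node 0 acts as the input stem
    for i in range(1, num_nodes):
        k = rng.randint(1, min(2, i))
        preds = rng.sample(range(i), k)
        nodes.append(Node(op=rng.choice(OP_SET), inputs=sorted(preds)))
    return Architecture(nodes=nodes)

rng = random.Random(0)
arch = sample_architecture(6, rng)
for i, n in enumerate(arch.nodes):
    print(i, n.op, n.inputs)
```

Because every node's inputs point only to earlier indices, any sampled graph is acyclic by construction, which is what makes a single representation able to cover pure ConvNets, pure Transformers, and hybrids in between.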

📝 Abstract
We introduce Universal Neural Architecture Space (UniNAS), a generic search space for neural architecture search (NAS) which unifies convolutional networks, transformers, and their hybrid architectures under a single, flexible framework. Our approach enables the discovery of novel architectures as well as the analysis of existing ones within a common framework. We also propose a new search algorithm that allows traversal of the proposed search space, and demonstrate that the space contains interesting architectures which, under an identical training setup, outperform state-of-the-art hand-crafted architectures. Finally, a unified toolkit including a standardized training and evaluation protocol is introduced to foster reproducibility and enable fair comparison in NAS research. Overall, this work opens a pathway towards systematically exploring the full spectrum of neural architectures from a unified graph-based NAS perspective.
Problem

Research questions and friction points this paper is trying to address.

Unifying convolutional networks and transformers in one framework
Discovering novel architectures through unified search space
Outperforming hand-crafted models with automated architecture discovery
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unified search space for ConvNets and Transformers
Novel search algorithm for architecture discovery
Standardized toolkit for training and evaluation
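The search algorithm itself is not detailed on this page. Purely as a hedged sketch of the kind of loop such a search could use, here is a minimal evolutionary hill-climb over a flat operator-sequence encoding; the `OPS` vocabulary and the `fitness` proxy (a toy stand-in for validation accuracy) are invented for this example and are not the UniNAS method:

```python
import random

# Invented operator vocabulary for the sketch.
OPS = ["conv3x3", "self_attention", "mlp", "identity"]

def fitness(arch):
    # Toy proxy for validation accuracy: rewards operator diversity
    # and the presence of attention. A real search would train/evaluate.
    return len(set(arch)) + arch.count("self_attention") * 0.5

def mutate(arch, rng):
    # Replace one randomly chosen operator with a random alternative.
    child = list(arch)
    child[rng.randrange(len(child))] = rng.choice(OPS)
    return child

def evolve(length=8, steps=200, seed=0):
    rng = random.Random(seed)
    best = [rng.choice(OPS) for _ in range(length)]
    for _ in range(steps):
        cand = mutate(best, rng)
        if fitness(cand) >= fitness(best):  # greedy acceptance
            best = cand
    return best

best = evolve()
print(best, fitness(best))
```

In a real NAS setting the fitness call is the expensive part (training each candidate), which is why the paper's emphasis on a differentiable space and an efficient traversal algorithm matters.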