Hyper-Connections

πŸ“… 2024-09-29
πŸ›οΈ arXiv.org
πŸ“ˆ Citations: 2
✨ Influential: 0
πŸ€– AI Summary
This work addresses the "seesaw effect" in residual-connection variants: the inherent trade-off between vanishing gradients and representational collapse. The authors propose hyper-connections, a mechanism built on learnable gating and depth-aware weight allocation that enables, for the first time, *joint dynamic control* of connection strength and network topology. It supports adaptive cross-depth feature routing and on-the-fly structural reconfiguration of layers. Theoretical analysis shows that it circumvents the fundamental limitations of existing residual variants. Lightweight and plug-and-play, hyper-connections require no modifications to optimizers or training pipelines and are compatible with both dense and sparse large language models (LLMs) as well as vision tasks. Experiments demonstrate significantly accelerated convergence and improved downstream performance in LLM pretraining, with consistent gains on vision benchmarks, supporting the method's cross-modal generalizability and architectural universality.

πŸ“ Abstract
We present hyper-connections, a simple yet effective method that can serve as an alternative to residual connections. This approach specifically addresses common drawbacks observed in residual connection variants, such as the seesaw effect between gradient vanishing and representation collapse. Theoretically, hyper-connections allow the network to adjust the strength of connections between features at different depths and dynamically rearrange layers. We conduct experiments focusing on the pre-training of large language models, including dense and sparse models, where hyper-connections show significant performance improvements over residual connections. Additional experiments conducted on vision tasks also demonstrate similar improvements. We anticipate that this method will be broadly applicable and beneficial across a wide range of AI problems.
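The abstract says hyper-connections let the network adjust connection strengths between features at different depths and dynamically rearrange layers. As a rough illustration only (not the paper's exact parameterization; all names and shapes here are assumptions), a static variant can be sketched as keeping `n` parallel hidden states, mixing them into each layer's input with learnable depth weights, and mixing the layer's output back into the hidden states with learnable width weights:

```python
import numpy as np

def hyper_connection_step(H, layer, beta, alpha):
    """One layer step under a static hyper-connection sketch (illustrative).

    H:     (n, d) array of n parallel hidden states (the "expansion rate" n)
    layer: function mapping a (d,) vector to a (d,) vector
    beta:  (n,) depth-connection weights mixing H into the layer input
    alpha: (n+1, n) width-connection weights mixing the layer output and
           the current hidden states into the next n hidden states
    """
    x = beta @ H                          # weighted mix of hidden states -> layer input
    y = layer(x)                          # ordinary layer computation
    stacked = np.vstack([y[None, :], H])  # (n+1, d): layer output on top of hidden states
    return alpha.T @ stacked              # (n, d) updated hidden states


# With n = 1, beta = [1], and alpha = [[1], [1]], this reduces to an
# ordinary residual connection: new_H = layer(H) + H.
H = np.ones((1, 4))
beta = np.array([1.0])
alpha = np.array([[1.0], [1.0]])
out = hyper_connection_step(H, lambda v: 2 * v, beta, alpha)  # -> all threes
```

Training `beta` and `alpha` (rather than fixing them) is what would let the network reweight or effectively rearrange cross-depth connections.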
Problem

Research questions and friction points this paper is trying to address.

Addresses gradient vanishing and representation collapse in neural networks.
Enables dynamic adjustment of feature connection strengths across layers.
Improves performance in large language models and vision tasks.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hyper-connections serve as a drop-in replacement for residual connections.
Connection strengths between features at different depths are adjusted dynamically, and layers can be rearranged.
Consistent performance gains are shown on both language and vision tasks.
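The bullets above highlight *dynamic* adjustment of connection strengths. One hedged way to sketch this (a hypothetical parameterization, not the paper's exact design) is to predict the mixing weights from the current hidden states, so they vary per input rather than being fixed parameters:

```python
import numpy as np

def dynamic_weights(H, W_beta, W_alpha):
    """Sketch of input-dependent hyper-connection weights (illustrative).

    H:       (n, d) parallel hidden states
    W_beta:  (d,) projection producing one depth weight per hidden state
    W_alpha: (d, n) projection producing one row of width weights per state
    """
    # Normalize each hidden state so predicted weights stay well-scaled.
    h = H / (np.linalg.norm(H, axis=-1, keepdims=True) + 1e-6)
    beta = h @ W_beta    # (n,)   input-dependent depth weights
    alpha = h @ W_alpha  # (n, n) input-dependent width weights
    return beta, alpha
```

In a real implementation `W_beta` and `W_alpha` would be learned jointly with the rest of the network, making the connection pattern itself a function of the input.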