Dynamic neurons: A statistical physics approach for analyzing deep neural networks

📅 2024-10-01
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
🤖 AI Summary
Understanding the scaling behavior and critical phenomena in deep neural networks (DNNs) remains challenging due to the lack of principled frameworks linking their architecture to statistical-physics-inspired renormalization group (RG) analysis.

Method: We model neurons as dynamical degrees of freedom and, for the first time in DNNs, identify approximate translational symmetry. Building upon statistical physics, we develop a scalable dynamical modeling framework that unifies symmetry analysis, RG transformations, and coarse-graining of degrees of freedom.

Contribution/Results: Our approach provides a universal characterization of recurrent structural motifs across deep architectures. It bridges RG theory with deep learning by enabling rigorous analysis of critical dynamics, thereby significantly enhancing model interpretability. Most notably, it establishes the first self-consistent theoretical framework for "critical learning," offering foundational insights into how DNNs may operate near criticality during training and inference.

📝 Abstract
Deep neural network architectures often consist of repetitive structural elements. We introduce a new approach that reveals these patterns and can be broadly applied to the study of deep learning. Much as a power strip helps untangle and organize complex cable connections, this approach treats neurons as additional degrees of freedom in interactions, simplifying the structure and enhancing the intuitive understanding of interactions within deep neural networks. Furthermore, it reveals the translational symmetry of deep neural networks, which simplifies the application of the renormalization group transformation, a method for analyzing the scaling behavior of a system. By exploiting translational symmetry and renormalization group transformations, we can analyze critical phenomena. This approach may open new avenues for studying deep neural networks using statistical physics.
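To make the coarse-graining idea concrete, here is a minimal sketch, not the paper's actual method: a toy block-spin-style RG step that averages neighboring neuron activations, repeatedly halving the number of degrees of freedom in a layer. The function name `coarse_grain` and the use of a random "layer" of activations are illustrative assumptions.

```python
import numpy as np

def coarse_grain(activations, block=2):
    """One block-spin-style RG step (illustrative): average each group of
    `block` neighboring neuron activations, reducing the degrees of
    freedom by that factor."""
    n = (len(activations) // block) * block  # drop any ragged tail
    return activations[:n].reshape(-1, block).mean(axis=1)

rng = np.random.default_rng(0)
layer = rng.standard_normal(64)  # toy "layer" of 64 neuron activations

# Iterate the transformation: each step coarse-grains the layer further,
# mimicking how RG analysis probes the system at successive scales.
scales = [layer]
while len(scales[-1]) > 4:
    scales.append(coarse_grain(scales[-1]))

for s in scales:
    print(f"{len(s):2d} degrees of freedom, variance {s.var():.4f}")
```

In an actual RG analysis one would track how effective couplings flow under this transformation; here the sketch only shows the mechanical reduction of degrees of freedom that translational symmetry makes uniform across the network.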
Problem

Research questions and friction points this paper is trying to address.

Decoupling neurons to simplify deep neural network structures
Revealing translational symmetry in deep neural networks
Applying renormalization group analysis to study critical phenomena
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dynamic neuron approach decouples neurons
Utilizes translational symmetry in networks
Applies renormalization group transformations
Donghee Lee
Department of Physics, Korea Advanced Institute of Science and Technology, Daejeon 34141, Korea
Hye-Sung Lee
Department of Physics, Korea Advanced Institute of Science and Technology, Daejeon 34141, Korea
Jaeok Yi
Department of Physics, Korea Advanced Institute of Science and Technology, Daejeon 34141, Korea