Fractal and Regular Geometry of Deep Neural Networks

📅 2025-04-08
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study characterizes how the geometry of decision boundaries in deep random neural networks evolves as depth increases. Specifically, it establishes rigorous correspondences between boundary volume growth—governed by the regularity of the activation function—and fractal or spectral parameters: non-smooth activations (e.g., the Heaviside step function) yield fractal boundaries whose Hausdorff dimension increases monotonically with depth, whereas for regular activations (e.g., ReLU, logistic, tanh) the expected boundary volume is fully determined by a single, easily computable spectral parameter and either converges to zero, remains constant, or diverges exponentially. Methodologically, the work combines random field theory, fractal geometry (Hausdorff dimension analysis), boundary volume estimates for high-dimensional excursion sets, and Monte Carlo simulation, and its theoretical predictions are validated in numerical experiments. This provides the first quantitative mapping from the differential regularity of an activation function to the emergent geometric behavior of deep network decision boundaries.

📝 Abstract
We study the geometric properties of random neural networks by investigating the boundary volumes of their excursion sets for different activation functions, as the depth increases. More specifically, we show that, for activations which are not very regular (e.g., the Heaviside step function), the boundary volumes exhibit fractal behavior, with their Hausdorff dimension monotonically increasing with the depth. On the other hand, for activations which are more regular (e.g., ReLU, logistic and $\tanh$), as the depth increases, the expected boundary volumes can either converge to zero, remain constant or diverge exponentially, depending on a single spectral parameter which can be easily computed. Our theoretical results are confirmed in some numerical experiments based on Monte Carlo simulations.
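The abstract's setup can be sketched numerically: draw a random fully-connected network, evaluate it on a fine grid over the square, and estimate the volume (here, length) of the zero level set by counting sign changes between neighbouring cells. This is only an illustrative Monte Carlo sketch, not the paper's method; the width, Gaussian He-style weight scaling, absence of biases, and the sign-change length estimator are all assumptions made for the example.

```python
import numpy as np

def random_relu_net(depth, width=256, seed=0):
    """Random fully-connected net f: R^2 -> R with Gaussian weights
    (He-style scaling, no biases) and ReLU hidden layers.
    Illustrative only; widths and scalings are assumptions."""
    rng = np.random.default_rng(seed)
    Ws = []
    for i in range(depth + 1):
        fan_in = 2 if i == 0 else width
        fan_out = width if i < depth else 1
        Ws.append(rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_out, fan_in)))

    def f(x):  # x has shape (n, 2)
        h = x
        for W in Ws[:-1]:
            h = np.maximum(h @ W.T, 0.0)  # ReLU hidden layers
        return (h @ Ws[-1].T).ravel()     # linear read-out
    return f

def boundary_length_estimate(f, grid=400):
    """Crude estimate of the length of the level set {f = 0} in
    [-1, 1]^2: count sign changes between neighbouring grid points
    and scale by the cell side length."""
    t = np.linspace(-1.0, 1.0, grid)
    X, Y = np.meshgrid(t, t)
    vals = f(np.column_stack([X.ravel(), Y.ravel()])).reshape(grid, grid)
    s = np.sign(vals)
    changes = (np.count_nonzero(s[:, 1:] != s[:, :-1])
               + np.count_nonzero(s[1:, :] != s[:-1, :]))
    return changes * (2.0 / grid)  # each crossing ~ one cell side

for depth in (1, 3, 6):
    print(depth, boundary_length_estimate(random_relu_net(depth)))
```

Comparing the printed lengths across depths (and across activations, by swapping the ReLU for a step function) gives a rough empirical view of the convergence/constancy/divergence trichotomy the paper proves.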
Problem

Research questions and friction points this paper is trying to address.

Study geometric properties of random neural networks
Analyze boundary volumes for different activation functions
Determine fractal behavior vs. regularity in deep networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analyzes boundary volumes of neural network excursion sets
Links activation functions to fractal or regular geometry
Uses spectral parameter to predict boundary volume behavior
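The trichotomy the spectral parameter predicts (decay, constancy, or exponential growth of expected boundary volume with depth) can be probed empirically by fitting the slope of log-volume against depth. To be clear, the slope below is not the paper's closed-form spectral parameter, just a hypothetical empirical stand-in for its sign.

```python
import numpy as np

def fit_growth_rate(volumes):
    """Least-squares fit of log E[vol] ~ a * depth + b over depths 1..n.
    The sign of the slope a mimics the trichotomy: a < 0 -> volumes
    shrink, a ~ 0 -> roughly constant, a > 0 -> exponential blow-up.
    NOT the paper's spectral parameter; an empirical proxy only."""
    depths = np.arange(1, len(volumes) + 1)
    slope, _ = np.polyfit(depths, np.log(np.asarray(volumes, dtype=float)), 1)
    return slope

# Volumes doubling with each layer -> slope = log 2 (exponential growth)
print(fit_growth_rate([1.0, 2.0, 4.0, 8.0]))  # ≈ 0.693
```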
Simmaco Di Lillo
RoMaDS - Department of Mathematics, University of Rome Tor Vergata, Rome, Italy
Domenico Marinucci
Department of Mathematics, University of Rome Tor Vergata
Probability, Mathematical Statistics, Cosmology and Astrophysics
Michele Salvi
RoMaDS - Department of Mathematics, University of Rome Tor Vergata, Rome, Italy
Stefano Vigogna
University of Rome Tor Vergata
Machine Learning, Harmonic Analysis