FAN: Fourier Analysis Networks

📅 2024-10-03
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing general-purpose neural networks (e.g., MLPs, Transformers) lack a built-in inductive bias for periodic patterns, limiting their out-of-distribution (OOD) generalization and their performance on periodic tasks such as time-series forecasting and symbolic reasoning. To address this, the authors propose Fourier Analysis Networks (FAN), an architecture that deeply integrates Fourier principles into generic neural network design, enabling native support for periodic priors. Crucially, FAN remains task-agnostic and serves as a drop-in replacement for MLPs. Extensive experiments demonstrate that FAN outperforms both MLPs and Transformers across diverse benchmarks, including symbolic formula representation, time-series forecasting, language modeling, and image recognition, while using fewer parameters and FLOPs. Moreover, FAN achieves substantial improvements in OOD generalization, validating its enhanced capacity to capture and extrapolate periodic structure.

📝 Abstract
Despite the remarkable successes of general-purpose neural networks, such as MLPs and Transformers, we find that they exhibit notable shortcomings in modeling and reasoning about periodic phenomena, achieving only marginal performance within the training domain and failing to generalize effectively to out-of-domain (OOD) scenarios. Periodicity is ubiquitous throughout nature and science. Therefore, neural networks should be equipped with the essential ability to model and handle periodicity. In this work, we propose FAN, a novel general-purpose neural network that offers broad applicability similar to MLP while effectively addressing periodicity modeling challenges. Periodicity is naturally integrated into FAN's structure and computational processes by introducing the Fourier Principle. Unlike existing Fourier-based networks, which possess particular periodicity modeling abilities but are typically designed for specific tasks, our approach maintains the general-purpose modeling capability. Therefore, FAN can seamlessly replace MLP in various model architectures with fewer parameters and FLOPs. Through extensive experiments, we demonstrate the superiority of FAN in periodicity modeling tasks and the effectiveness and generalizability of FAN across a range of real-world tasks, e.g., symbolic formula representation, time series forecasting, language modeling, and image recognition.
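The abstract's claim that FAN "can seamlessly replace MLP" with periodicity built into its computation can be illustrated with a small sketch. The layer below is a hypothetical FAN-style layer, not the authors' reference code: part of the output width carries explicit periodic features via cos/sin of a learned linear projection, while the remainder is an ordinary nonlinear (MLP-like) branch; the names `W_p`, `W_g`, and `p_ratio` are illustrative assumptions.

```python
import numpy as np

class FANLayer:
    """Sketch of a FAN-style layer: periodic (cos/sin) branch + generic branch.

    Illustrative only; follows the concatenated-branch idea described in the
    paper, with made-up parameter names and initialization.
    """

    def __init__(self, d_in, d_out, p_ratio=0.25, seed=0):
        rng = np.random.default_rng(seed)
        d_p = int(d_out * p_ratio)   # width of the periodic branch (cos + sin share it)
        d_g = d_out - 2 * d_p        # width of the generic nonlinear branch
        self.W_p = rng.normal(0, 1 / np.sqrt(d_in), (d_in, d_p))
        self.W_g = rng.normal(0, 1 / np.sqrt(d_in), (d_in, d_g))
        self.b_g = np.zeros(d_g)

    def __call__(self, x):
        p = x @ self.W_p                      # learned frequencies
        g = np.tanh(x @ self.W_g + self.b_g)  # ordinary MLP-like branch
        # cos/sin pair gives the layer an explicit periodic basis,
        # while the concatenation preserves a plain MLP-layer interface.
        return np.concatenate([np.cos(p), np.sin(p), g], axis=-1)

layer = FANLayer(d_in=8, d_out=16)
y = layer(np.ones((4, 8)))
print(y.shape)  # (4, 16)
```

Because input and output widths match a standard linear-plus-activation layer, such a layer can be swapped in wherever an MLP block appears, which is the "drop-in replacement" property the abstract highlights.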
Problem

Research questions and friction points this paper is trying to address.

Periodic Phenomena
Neural Networks
Adaptability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fourier Analysis Network
Periodic Phenomena Processing
Universal Applicability
👥 Authors

Yihong Dong
Peking University
Code Generation · Large Language Models

Ge Li
Full Professor of Computer Science, Peking University
Program Analysis · Program Generation · Deep Learning

Yongding Tao
Peking University
LLM · Code Intelligence

Xue Jiang
School of Computer Science, Peking University

Kechi Zhang
Peking University
AI4SE

Jia Li
School of Computer Science, Peking University

Jing Su
ByteDance

Jun Zhang
ByteDance

Jingjing Xu
ByteDance