Learning to be Reproducible: Custom Loss Design for Robust Neural Networks

📅 2026-01-02
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the inherent instability and irreproducibility of deep learning models under identical training conditions, which often arise from random factors such as weight initialization and data shuffling. To mitigate this, the study explicitly incorporates training stability into the optimization objective and proposes a custom loss function (CLF) that significantly reduces model sensitivity to stochasticity while preserving predictive accuracy. Extensive experiments across diverse tasks (including image classification and time series forecasting) and various architectures demonstrate that the proposed method consistently improves training robustness, achieving both high accuracy and high reproducibility.

📝 Abstract
To enhance the reproducibility and reliability of deep learning models, we address a critical gap in current training methodologies: the lack of mechanisms that ensure consistent and robust performance across runs. Our empirical analysis reveals that even under controlled initialization and training conditions, the accuracy of the model can exhibit significant variability. To address this issue, we propose a Custom Loss Function (CLF) that reduces the sensitivity of training outcomes to stochastic factors such as weight initialization and data shuffling. By fine-tuning its parameters, CLF explicitly balances predictive accuracy with training stability, leading to more consistent and reliable model performance. Extensive experiments across diverse architectures for both image classification and time series forecasting demonstrate that our approach significantly improves training robustness without sacrificing predictive performance. These results establish CLF as an effective and efficient strategy for developing more stable, reliable and trustworthy neural networks.
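The abstract states that CLF balances predictive accuracy with training stability via tunable parameters, but does not give its exact form here. As an illustrative sketch only (the function name, the variance penalty, and the `lam` weight are assumptions, not the authors' formulation), one way to fold stability into the objective is to add a penalty on the variance of per-sample losses, discouraging solutions that are sensitive to stochastic factors:

```python
def stability_aware_loss(task_loss, per_sample_losses, lam=0.1):
    """Hypothetical sketch of a stability-aware objective.

    task_loss: scalar predictive loss (e.g., mean cross-entropy).
    per_sample_losses: list of individual sample losses from the batch.
    lam: assumed trade-off weight between accuracy and stability.

    Returns task_loss plus lam times the variance of the per-sample
    losses, so minimizing it favors both low error and low spread.
    """
    n = len(per_sample_losses)
    mean = sum(per_sample_losses) / n
    variance = sum((l - mean) ** 2 for l in per_sample_losses) / n
    return task_loss + lam * variance


# With zero spread, the penalty vanishes and only the task loss remains.
print(stability_aware_loss(1.0, [1.0, 1.0, 1.0]))        # 1.0
# With spread [0.0, 2.0]: variance = 1.0, so 1.0 + 0.5 * 1.0 = 1.5.
print(stability_aware_loss(1.0, [0.0, 2.0], lam=0.5))    # 1.5
```

Tuning `lam` upward trades some raw accuracy for lower run-to-run variability, mirroring the accuracy/stability balance the abstract describes.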
Problem

Research questions and friction points this paper is trying to address.

reproducibility
robustness
training stability
deep learning
model reliability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Custom Loss Function
Training Stability
Reproducibility
Robust Neural Networks
Stochastic Sensitivity
Waqas Ahmed
Friedrich Schiller University Jena, Jena, Germany
Sheeba Samuel
University of Technology Chemnitz, Chemnitz, Germany
Kevin Coakley
Norwegian University of Science and Technology, Trondheim, Norway
Birgitta Koenig-Ries
Friedrich Schiller University Jena, Jena, Germany
Odd Erik Gundersen
Associate Professor, NTNU
Artificial Intelligence · Machine Learning · Reproducibility · Applied AI