🤖 AI Summary
This work proposes a large language model–driven evolutionary optimization framework to address algorithmic performance bottlenecks in computational cosmology, specifically in tasks such as initial condition reconstruction, 21 cm foreground removal, and baryonic modeling in N-body simulations. The framework automatically identifies and optimizes free parameters by combining gradient-based and gradient-free strategies, iteratively evolving and improving scientific algorithms. It also generates comparative reports that explain the improvements and analyze the free parameters. Evaluated on three representative cosmological tasks, the optimized algorithms consistently outperform hand-tuned baselines by substantial margins. All code and benchmark tasks have been publicly released to facilitate reproducibility and community adoption.
📝 Abstract
We develop a general framework for discovering scientific algorithms and apply it to three problems in computational cosmology. Our code, MadEvolve, is similar to Google's AlphaEvolve but places a stronger emphasis on free parameters and their optimization. Starting from a baseline human-written implementation, it improves the algorithm's performance metrics by making iterative changes to its code. As a further convenience, MadEvolve automatically generates a report that compares the input algorithm with the evolved algorithm, describes the algorithmic innovations, and lists the free parameters and their roles. The code supports both auto-differentiable, gradient-based parameter optimization and gradient-free optimization methods. We apply MadEvolve to the reconstruction of cosmological initial conditions, the removal of 21cm foreground contamination, and effective baryonic physics in N-body simulations. In all cases, we find substantial improvements over the baseline algorithm. We make MadEvolve and our three tasks publicly available at madevolve.org.
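The abstract describes a two-level scheme: an outer evolutionary loop that mutates candidate algorithms, and an inner loop that tunes each candidate's free parameters (gradient-based or gradient-free). A minimal sketch of that structure, assuming a toy fitness function and a simple random-search inner tuner; all names here are hypothetical and do not reflect MadEvolve's actual API:

```python
import random

def fitness(params):
    # Hypothetical performance metric (higher is better), standing in
    # for e.g. reconstruction accuracy on a cosmology benchmark task.
    x, y = params
    return -(x - 1.0) ** 2 - (y + 2.0) ** 2

def tune_parameters(params, steps=200, scale=0.5):
    """Inner loop: gradient-free free-parameter optimization
    (simple random search) of a fixed candidate algorithm."""
    best, best_f = list(params), fitness(params)
    for _ in range(steps):
        cand = [p + random.gauss(0.0, scale) for p in best]
        f = fitness(cand)
        if f > best_f:
            best, best_f = cand, f
    return best, best_f

def evolve(population, generations=10):
    """Outer loop: mutate candidates, tune each one's free
    parameters, and keep the fittest half every generation."""
    for _ in range(generations):
        offspring = [[p + random.gauss(0.0, 1.0) for p in ind]
                     for ind in population]
        scored = [tune_parameters(ind) for ind in population + offspring]
        scored.sort(key=lambda t: t[1], reverse=True)
        population = [ind for ind, _ in scored[: len(population)]]
    return population[0], fitness(population[0])

random.seed(0)
best, best_f = evolve([[0.0, 0.0], [5.0, 5.0]])
```

In MadEvolve the outer loop mutates code (via an LLM) rather than a parameter vector, and the inner tuner may also use automatic differentiation when the candidate is differentiable; this sketch only illustrates the nested optimize-then-select structure.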