Cosmological Hydrodynamics at Exascale: A Trillion-Particle Leap in Capability

📅 2025-10-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
To support next-generation all-sky astronomical surveys, cosmological hydrodynamical simulations are urgently needed that simultaneously achieve high fidelity, large physical scale, and comprehensive astrophysical modeling. This paper introduces CRK-HACC, a novel simulation framework integrating multi-scale decoupled modeling, GPU-resident tree algorithms, in situ analysis pipelines, and a hierarchical I/O architecture. CRK-HACC enables, for the first time, end-to-end cosmic structure formation simulations at the four-trillion-particle scale over the full sky. The simulation achieves a peak performance of 513.1 PFLOPs, processes 46.6 billion particles per second, and generated over 100 PB of data in just over one week of runtime, representing an order-of-magnitude improvement in both scale and efficiency over prior work. This unprecedented simulation provides the largest and most physically realistic numerical foundation to date for precision cosmology.

📝 Abstract
Resolving the most fundamental questions in cosmology requires simulations that match the scale, fidelity, and physical complexity demanded by next-generation sky surveys. To achieve the realism needed for this critical scientific partnership, detailed gas dynamics, along with a host of astrophysical effects, must be treated self-consistently with gravity for end-to-end modeling of structure formation. As an important step on this roadmap, exascale computing enables simulations that span survey-scale volumes while incorporating key subgrid processes that shape complex cosmic structures. We present results from CRK-HACC, a cosmological hydrodynamics code built for the extreme scalability requirements set by modern cosmological surveys. Using separation-of-scale techniques, GPU-resident tree solvers, in situ analysis pipelines, and multi-tiered I/O, CRK-HACC executed Frontier-E: a four trillion particle full-sky simulation, over an order of magnitude larger than previous efforts. The run achieved 513.1 PFLOPs peak performance, processing 46.6 billion particles per second and writing more than 100 PB of data in just over one week of runtime.
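A quick back-of-envelope check of the rates implied by the figures quoted in the abstract (four trillion particles, 46.6 billion particles per second, over 100 PB written in roughly one week). The one-week runtime used for the bandwidth estimate is an approximation of the abstract's "just over one week":

```python
# Rough rates implied by the Frontier-E figures quoted above.
# Assumes decimal units (1 PB = 1e6 GB) and exactly one week of runtime.

N_PARTICLES = 4e12          # four trillion particles
RATE = 46.6e9               # particles processed per second
DATA_PB = 100               # petabytes written
RUNTIME_S = 7 * 24 * 3600   # ~one week of runtime, in seconds

# Time to sweep the full particle set once at the quoted rate
sweep_s = N_PARTICLES / RATE
print(f"one full-particle sweep: {sweep_s:.1f} s")        # ~86 s

# Sustained write bandwidth needed for 100 PB in a week
bandwidth_gbs = DATA_PB * 1e6 / RUNTIME_S
print(f"sustained write bandwidth: {bandwidth_gbs:.0f} GB/s")  # ~165 GB/s
```

In other words, every update step over the full four-trillion-particle set takes on the order of a minute and a half, while the output stream alone demands well over 100 GB/s of sustained write bandwidth, which is why the multi-tiered I/O design matters.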
Problem

Research questions and friction points this paper is trying to address.

Simulating cosmological structure formation with gas dynamics and gravity
Achieving exascale computing for survey-scale cosmological simulations
Developing scalable hydrodynamics code for trillion-particle cosmological modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

GPU-resident tree solvers for gravity calculation
Separation-of-scale techniques for efficient computation
Multi-tiered I/O system handling massive data output
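To give a flavor of the tree-solver idea listed above (approximating the pull of distant particle groups by their aggregate mass and center of mass), here is a toy CPU-side Barnes-Hut octree in Python. This is a sketch of the general technique only, not CRK-HACC's GPU-resident solver; the class names, the softening `eps`, and the opening angle `THETA` are all illustrative choices:

```python
# Toy Barnes-Hut octree: illustrative only, not CRK-HACC's implementation.
import numpy as np

THETA = 0.5  # opening angle: larger -> faster but less accurate

class Node:
    def __init__(self, center, half):
        self.center = np.asarray(center, float)  # cell center
        self.half = half                         # half side length
        self.mass = 0.0
        self.com = np.zeros(3)                   # cell center of mass
        self.children = None                     # sub-cells once split
        self.body = None                         # single particle, if leaf

    def insert(self, pos, m):
        pos = np.asarray(pos, float)
        if self.body is None and self.children is None:
            self.body = (pos, m)                 # empty leaf: store particle
        else:
            if self.children is None:
                self._split()                    # leaf -> internal node
            self._child_for(pos).insert(pos, m)
        # accumulate mass and center of mass on the way down
        self.com = (self.com * self.mass + pos * m) / (self.mass + m)
        self.mass += m

    def _split(self):
        self.children = {}
        body, self.body = self.body, None
        self._child_for(body[0]).insert(*body)   # re-insert held particle

    def _child_for(self, pos):
        octant = tuple(int(p > c) for p, c in zip(pos, self.center))
        if octant not in self.children:
            offset = (np.array(octant) - 0.5) * self.half
            self.children[octant] = Node(self.center + offset, self.half / 2)
        return self.children[octant]

def accel(node, pos, eps=1e-2):
    """Gravitational acceleration at `pos` (G = 1), walking the tree."""
    if node.mass == 0:
        return np.zeros(3)
    d = node.com - pos
    r = np.sqrt(d @ d + eps**2)                  # softened distance
    # open the cell if it is close or large; otherwise use its monopole
    if node.children is not None and 2 * node.half / r > THETA:
        return sum((accel(c, pos, eps) for c in node.children.values()),
                   np.zeros(3))
    if node.body is not None and np.allclose(node.body[0], pos):
        return np.zeros(3)                       # skip self-interaction
    return node.mass * d / r**3
```

The payoff is the usual one: force evaluation drops from O(N²) pairwise sums to roughly O(N log N) tree walks, which is what makes particle counts in the trillions thinkable at all once the walk is mapped onto GPUs.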
Nicholas Frontiere
Computational Science Division, Argonne National Laboratory
J. D. Emberson
Computational Science Division, Argonne National Laboratory
Michael Buehlmann
Computational Science Division, Argonne National Laboratory
Esteban M. Rangel
Computational Science Division, Argonne National Laboratory
Salman Habib
Arizona State University
Information Theory, Machine Learning
Katrin Heitmann
High Energy Physics Division, Argonne National Laboratory
Patricia Larsen
Computational Science Division, Argonne National Laboratory
Vitali Morozov
Argonne Leadership Computing Facility, Argonne National Laboratory
Adrian Pope
Computational Science Division, Argonne National Laboratory
Claude-André Faucher-Giguère
Department of Physics and Astronomy, Northwestern University
Antigoni Georgiadou
National Center for Computational Sciences, Oak Ridge National Laboratory
Damien Lebrun-Grandié
Computational Sciences and Engineering Division, Oak Ridge National Laboratory
Andrey Prokopenko
Oak Ridge National Laboratory
Numerical Analysis, Scientific Computing