Modeling Ocean Dynamics On Earth: Applications For Understanding Other Water Worlds
Editor’s note: one of the things we are looking for as we examine the icy worlds in our own solar system and exoplanets circling other stars is oceans. Earth wears its ocean on its surface. Ice-covered worlds like Enceladus, Europa, Ganymede, Titan and others have subsurface oceans. Already, some water worlds similar to our own have been observed. How do the oceans of these worlds affect their global climates – and, by extension, their habitability? Studying and modeling how this all happens on Earth is the first big step in understanding how worlds with oceans operate.
On the beach, ocean waves provide soothing white noise. But in scientific laboratories, they play a key role in weather forecasting and climate research. Along with the atmosphere, the ocean is typically one of the largest and most computationally demanding components of Earth system models like the Department of Energy’s Energy Exascale Earth System Model, or E3SM.
Most modern ocean models focus on two categories of waves: a barotropic system, which has a fast wave propagation speed, and a baroclinic system, which has a slow wave propagation speed. To help address the challenge of simulating these two modes simultaneously, a team from DOE’s Oak Ridge, Los Alamos and Sandia National Laboratories has developed a new solver algorithm that reduces the total run time of the Model for Prediction Across Scales-Ocean, or MPAS-Ocean, E3SM’s ocean circulation model, by 45%.
The researchers tested their software on the Summit supercomputer at ORNL’s Oak Ridge Leadership Computing Facility, a DOE Office of Science user facility, and the Compy supercomputer at Pacific Northwest National Laboratory. They ran their primary simulations on the Cori and Perlmutter supercomputers at Lawrence Berkeley National Laboratory’s National Energy Research Scientific Computing Center, and their results were published in the International Journal of High Performance Computing Applications.
Because Trilinos, a collection of open-source software libraries designed for solving scientific problems on supercomputers, is written in the C++ programming language, and Earth system models like E3SM are typically written in Fortran, the team took advantage of ForTrilinos, a related software library that provides Fortran interfaces to existing C++ packages, to design and customize the new solver, which focuses on barotropic waves.
“A useful feature of this interface is that we can use every component of the C++ package in the Fortran language so we don’t need to translate anything, which is very convenient,” said lead author Hyun Kang, a computational Earth system scientist at ORNL.
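In broad strokes, such an interface works by exposing each C++ package behind entry points that Fortran can bind to directly. The sketch below shows that general Fortran–C++ interoperability pattern using purely hypothetical names; it is an illustration of the idea, not the actual ForTrilinos or Trilinos API.

```cpp
// Minimal sketch (hypothetical names, not the ForTrilinos API) of the general
// pattern: a C++ solver exposed through plain-C entry points that a Fortran
// interface block can bind to via ISO_C_BINDING.
#include <cstdio>

class IterativeSolver {                    // stand-in for a C++ solver package
public:
    void solve(double* x, const double* b, int n) const {
        for (int i = 0; i < n; ++i) x[i] = b[i];   // placeholder "solve"
    }
};

extern "C" {
    // C-compatible shim functions a Fortran program can call directly.
    void* solver_create()              { return new IterativeSolver(); }
    void  solver_destroy(void* handle) { delete static_cast<IterativeSolver*>(handle); }
    void  solver_solve(void* handle, double* x, const double* b, int n) {
        static_cast<IterativeSolver*>(handle)->solve(x, b, n);
    }
}

// Fortran side (conceptually):
//   interface
//     subroutine solver_solve(handle, x, b, n) bind(C, name="solver_solve")
//       use iso_c_binding
//       type(c_ptr), value    :: handle
//       real(c_double)        :: x(*), b(*)
//       integer(c_int), value :: n
//     end subroutine
//   end interface

int main() {
    void* s = solver_create();
    double b[3] = {1.0, 2.0, 3.0}, x[3];
    solver_solve(s, x, b, 3);
    solver_destroy(s);
    std::printf("x = %g %g %g\n", x[0], x[1], x[2]);
    return 0;
}
```

Because the wrapper generates and maintains these bindings automatically, a Fortran model can treat the C++ solvers as if they were native routines, which is the convenience Kang describes.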
This work builds on results published in a previous Journal of Advances in Modeling Earth Systems paper, in which researchers from ORNL and Los Alamos National Laboratory hand-coded a solver to improve MPAS-Ocean. The new ForTrilinos-enabled solver overcomes the remaining drawbacks of that earlier solver, especially when users run MPAS-Ocean on a small number of compute cores for a given problem size.
MPAS-Ocean’s default solver relies on explicit subcycling, a technique that uses many small time intervals, or time steps, to advance the barotropic waves alongside the baroclinic calculations without destabilizing the model. If the baroclinic and barotropic waves can be advanced with time step sizes of 300 seconds and 15 seconds, respectively, the barotropic calculation must complete 20 times more iterations to cover the same simulated time, which takes a massive amount of computing power.
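The cost of subcycling follows directly from the ratio of the two step sizes, as the minimal sketch below shows; the advance_barotropic call is a hypothetical placeholder rather than real MPAS-Ocean code.

```cpp
// Back-of-the-envelope sketch of explicit subcycling: for every 300 s
// baroclinic (slow) step, the barotropic (fast) mode must be advanced in
// 15 s sub-steps to stay stable, i.e. 300 / 15 = 20 sub-iterations per step.
#include <cstdio>

int main() {
    const double baroclinic_dt = 300.0;  // seconds per slow-mode step
    const double barotropic_dt = 15.0;   // largest stable fast-mode step
    const int subcycles = static_cast<int>(baroclinic_dt / barotropic_dt);

    // One outer (baroclinic) step drives many inner (barotropic) sub-steps.
    for (int k = 0; k < subcycles; ++k) {
        // advance_barotropic(barotropic_dt);  // hypothetical update call
    }
    std::printf("barotropic sub-steps per baroclinic step: %d\n", subcycles);
    return 0;
}
```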
In contrast, the new solver for the barotropic system is semi-implicit, meaning it is unconditionally stable and thus allows researchers to advance the barotropic system with the same large time steps used for the baroclinic system without sacrificing accuracy, saving significant amounts of time and computing power.
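One way to see why an implicit update tolerates steps far beyond the explicit limit is a scalar model problem. The sketch below contrasts forward (explicit) and backward (implicit) Euler on a fast mode whose explicit stability limit is about 20 seconds: at a 300-second step the explicit update blows up while the implicit one stays bounded. It is only an illustration of unconditional stability, not the semi-implicit scheme actually used in MPAS-Ocean.

```cpp
// Model problem: du/dt = -lambda * u with a "fast" rate lambda, standing in
// for fast barotropic dynamics (illustrative only).
#include <cstdio>

int main() {
    const double lambda = 0.1;  // fast rate (1/s); explicit stability needs dt < 2/lambda = 20 s
    const double dt = 300.0;    // the large baroclinic-sized step we want to keep
    double u_explicit = 1.0, u_implicit = 1.0;

    for (int n = 0; n < 10; ++n) {
        // Explicit (forward Euler): amplifies once dt exceeds the stability limit.
        u_explicit = u_explicit - dt * lambda * u_explicit;
        // Implicit (backward Euler): solve (1 + dt*lambda) * u_new = u_old;
        // bounded for any dt, which is the sense in which the scheme is
        // unconditionally stable.
        u_implicit = u_implicit / (1.0 + dt * lambda);
    }
    std::printf("explicit: %g  implicit: %g\n", u_explicit, u_implicit);
    return 0;
}
```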
A community of software developers has spent years optimizing the algorithms in Trilinos and ForTrilinos for scientific applications, so the latest MPAS-Ocean solver that leverages this resource outperforms the hand-coded solver, allowing other scientists to accelerate their climate research efforts.
“If we had to individually code every algorithm, it would require so much more effort and expertise,” Kang said. “But with this software, we can run simulations right away at faster speeds by incorporating optimized algorithms into our program.”
Although the current solver still has scalability limitations on high-performance computing systems, it performs exceptionally well up to a certain number of processors. This limitation arises because the semi-implicit method requires all processors to communicate with one another at least 10 times per time step, which can slow the model down as the processor count grows. To overcome this obstacle, the researchers are currently optimizing processor communications and porting the solver to GPUs.
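That communication burden is typical of the iterative linear solves at the heart of semi-implicit methods: Krylov-type solvers compute several global dot products per iteration, and in a distributed model each dot product becomes an all-reduce across every processor. The serial conjugate gradient sketch below, on a toy diagonal system, marks where those reductions would occur; it is not the MPAS-Ocean barotropic solver.

```cpp
// Toy conjugate gradient on a diagonal system A x = b, highlighting the dot
// products that become MPI all-reduce operations in a distributed model.
#include <cstdio>
#include <vector>
#include <cmath>

double dot(const std::vector<double>& a, const std::vector<double>& b) {
    double s = 0.0;                        // in parallel: local sum + global all-reduce
    for (size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
    return s;
}

int main() {
    const int n = 4;
    std::vector<double> diag{4, 3, 2, 5}, b{1, 1, 1, 1};       // A = diag(4,3,2,5)
    std::vector<double> x(n, 0.0), r = b, p = r, Ap(n);

    double rr = dot(r, r);                                      // global reduction
    for (int it = 0; it < 20 && std::sqrt(rr) > 1e-12; ++it) {
        for (int i = 0; i < n; ++i) Ap[i] = diag[i] * p[i];
        double alpha = rr / dot(p, Ap);                         // global reduction
        for (int i = 0; i < n; ++i) { x[i] += alpha * p[i]; r[i] -= alpha * Ap[i]; }
        double rr_new = dot(r, r);                              // global reduction
        for (int i = 0; i < n; ++i) p[i] = r[i] + (rr_new / rr) * p[i];
        rr = rr_new;
    }
    for (int i = 0; i < n; ++i) std::printf("x[%d] = %g\n", i, x[i]);
    return 0;
}
```

With several such reductions per solver iteration, and multiple iterations per time step, the model can easily reach the roughly 10 all-to-all communications per step described above, which is why communication optimization and GPU porting are the next targets.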
Additionally, the team has updated the time stepping method for the baroclinic system to further improve MPAS-Ocean’s efficiency. Through these advances, the researchers aim to make climate predictions faster, more reliable and more accurate, upgrades that are essential for ensuring climate security and enabling timely decision-making and high-resolution projections.
“This barotropic mode solver enables faster computation and more stable integration of models, especially MPAS-Ocean,” Kang said. “Extensive use of computational resources requires an enormous amount of electricity and energy, but by speeding up this model we can reduce that energy use, improve simulations and more easily predict the effects of climate change decades or even thousands of years into the future.”
This research was supported by E3SM and the Exascale Computing Project, or ECP. E3SM is sponsored by the Biological and Environmental Research program in DOE’s Office of Science, and ECP is managed by DOE and the National Nuclear Security Administration. The Advanced Scientific Computing Research program in DOE’s Office of Science funds OLCF and NERSC.
UT-Battelle manages ORNL for DOE’s Office of Science, the single largest supporter of basic research in the physical sciences in the United States. The Office of Science is working to address some of the most pressing challenges of our time. For more information, please visit https://energy.gov/science. — Elizabeth Rosenthal
An implicit barotropic mode solver for MPAS-Ocean using a modern Fortran solver interface, The International Journal of High Performance Computing Applications