I’m Manuel Münsch, one of the senior researchers within CEEC. I work at the Institute of Fluid Mechanics (LSTM) of FAU.

I did my master’s in mechanical engineering with a specialization in aerospace engineering at RWTH Aachen University. Already during my studies, I was enthusiastic about fluid dynamics, although at that stage I was quite sure that I wouldn’t spend my working life mainly with computers. Then a seminar talk by Prof. Rainald Löhner of George Mason University in Fairfax, in which he showed computational fluid dynamics (CFD) results of the flow through an entire city, left me stunned: so many unknowns, so many data points, and such a large system of equations to be solved on a “supercomputer” – really impressive. After realizing that post-processing experimental data from particle image velocimetry also required programming, I switched to the CFD side of life for the time being. After my master’s thesis on computational aero-acoustics at DLR Brunswick, I joined LSTM to do my Ph.D. on the numerical simulation of fluid-structure interaction in turbulent flows. Through this work, I learned a lot about CFD in general, large-eddy simulation (a special approach to modelling turbulent flows), and turbulence, as well as code development, high-performance computing, parallelization, and code-to-code communication.
Following my doctorate, I expanded my experience into project and group leadership through several research projects dealing with multiphase flows, fluid-structure interaction, or heat transfer, and several industry-related research and development projects. When Philipp Schlatter joined FAU/LSTM in 2023, he presented CEEC to me and offered me the opportunity to join. Fascinated (again) by huge and relevant test cases, as well as large systems of equations with many unknowns to be solved on (pre-)exascale high-performance computers, I joined CEEC with enthusiasm.
Work Within CEEC
Within CEEC, I am leading our work on exascale techniques. Our goal is to advance the techniques and technologies required to run the lighthouse cases at exascale. Together with colleagues from BSC, USTUTT, and KTH, we address five tasks: the orchestration of workflows, machine-learning-based sub-models, visualization and data management, uncertainty quantification, and dynamic resource management.
As the leader of this work, my tasks include organizing the bi-weekly meetings, where we share findings and updates, and keeping track of the deliverables and milestones of our work package. Beyond these organizational aspects, I am currently focusing on wall-modeled large-eddy simulation (WMLES) together with our partners from USTUTT-IAG and BSC. While our partners focus on aerospace applications (shock–boundary-layer interaction and buffet on wings at the edge of the flight envelope, and aeroelastic simulation of the SFB 401 wing), at LSTM we address a hydrodynamic application – the flow around a ship hull, described in more detail below.
The motivation for using machine-learning-based sub-models lies in the fact that the resolution required to fully resolve (simulate) the turbulent structures around a ship hull is too computationally demanding even with exascale supercomputers. Thus, the flow field near the wall cannot be resolved in detail, requiring modeling through wall-modeled large eddy simulations. Since our application case (the ship) involves curved or stepwise surfaces that induce complex flow fields with separation and re-attachment, data-driven techniques such as artificial neural networks are used to enrich state-of-the-art analytical models.
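To make the starting point concrete, here is a minimal, illustrative sketch (not CEEC’s actual implementation) of the classical equilibrium wall model that such data-driven approaches aim to enrich: the LES velocity sampled at a small wall distance is fed into the logarithmic law of the wall, which is solved for the friction velocity, and hence the wall shear stress, by Newton iteration. All numerical values below (fluid viscosity, von Kármán constant, log-law intercept) are illustrative.

```python
import numpy as np

# Equilibrium wall model: given the LES velocity u at wall distance y,
# solve the log law  u/u_tau = (1/kappa) * ln(y * u_tau / nu) + B
# for the friction velocity u_tau via Newton iteration.
def wall_shear_from_log_law(u, y, nu, kappa=0.41, B=5.2, tol=1e-10):
    u_tau = max(np.sqrt(nu * u / y), 1e-8)  # initial guess (viscous estimate)
    for _ in range(100):
        f = u / u_tau - (np.log(y * u_tau / nu) / kappa + B)
        df = -u / u_tau**2 - 1.0 / (kappa * u_tau)
        step = f / df
        u_tau -= step
        if abs(step) < tol:
            break
    tau_w = u_tau**2  # kinematic wall shear stress tau_w / rho
    return u_tau, tau_w

# Illustrative call: velocity of 10 m/s sampled 1 mm above the wall, water-/air-like nu.
u_tau, tau_w = wall_shear_from_log_law(u=10.0, y=1e-3, nu=1.5e-5)
```

A neural-network-based model replaces or corrects this algebraic relation with a learned mapping from near-wall flow quantities to the wall shear stress, which is what allows it to cope with separation and re-attachment, where the log law no longer holds.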
Another task we are dealing with at LSTM, together with our partners from USTUTT-IAG, is uncertainty quantification (UQ), which is about characterizing the sources of uncertainty in our simulations or models and quantifying their impact on the results. Sources of uncertainty include, for example, initial and boundary conditions, turbulence models, or numerical methods in general. Sensitivity analysis is used to quantify how sensitive the quantities of interest are to uncertain inputs or (model) parameters.
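As an illustration of the idea (a minimal sketch with made-up numbers, not our actual UQ pipeline), uncertain inputs can be propagated through a model by Monte Carlo sampling, and crude first-order sensitivities can be estimated by letting one input vary while the others are held at their means:

```python
import numpy as np

# Toy quantity of interest: a drag-like output depending on an uncertain
# inlet velocity U and an uncertain model coefficient C. Both the model
# and the input distributions are purely illustrative.
def drag(U, C):
    return 0.5 * C * U**2

rng = np.random.default_rng(0)
n = 200_000
U = rng.normal(10.0, 0.5, n)   # uncertain inlet velocity
C = rng.normal(0.8, 0.05, n)   # uncertain model coefficient

D = drag(U, C)                 # Monte Carlo propagation
var_D = D.var()

# Crude first-order sensitivity estimates: fraction of the output variance
# produced when only one input varies (adequate when interactions are weak).
S_U = drag(U, C.mean()).var() / var_D
S_C = drag(U.mean(), C).var() / var_D

print(f"mean = {D.mean():.2f}, std = {D.std():.2f}")
print(f"S_U = {S_U:.2f}, S_C = {S_C:.2f}")
```

In this toy setup the velocity dominates (it enters squared), so S_U comes out larger than S_C; variance-based Sobol indices generalize this one-at-a-time estimate.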
The task’s progress and model development are directly connected to the development of the high-order spectral element framework Neko, a joint undertaking that goes beyond CEEC, with colleagues from KTH, DTU, and KAUST. In a first stage, the code and developments are tested on smaller, common benchmark cases such as the flow over periodic hills, a forward-facing step, or a channel flow configuration. After successful testing, Neko will be used to predict the flow around the application or lighthouse case.
In addition to my contributions to the work on exascale techniques, I am coordinating the work on the lighthouse case called “Merchant ship hull”, as mentioned above. In this lighthouse case, Neko is used to perform high-order computations of the flow around the hull of a merchant ship at model scale, specifically the Japan Bulk Carrier. We will perform large-eddy simulations and use the obtained data to analyze the physics of the flow. A critical component of evaluating the performance of a ship hull design is accurately predicting the flow structures forming near the hull’s surface and in its wake. This concerns both the prediction of the friction drag and the performance of the propulsion system, which operates in the turbulent wake formed behind the hull and is subject to the associated unsteady loads. Thus, in the long term, our developments within CEEC could be taken up by the marine shipping industry or research institutions, for example, to further optimize ship hulls or propulsion systems towards more environmentally friendly solutions.
Ultimate Goals
My ultimate goals for the project are the successful and joint completion of all tasks related to the exascale techniques together with my colleagues, the integration of our developments into the relevant codes for the lighthouse cases, and, ultimately, the demonstration of our capabilities within CEEC through convincing results for the “Merchant Ship Hull” lighthouse case. I would be very happy to see other research institutions and industry branches adopt our findings in the future. Additionally, I aim to build new connections and collaborate with our CEEC partners to learn from their expertise and findings.
Closing Thoughts
“Fortran is outdated” is something I heard during my studies. Fast forward a few years, and I found myself writing Fortran code during my Ph.D. Now we are tackling exascale CFD with Neko, a portable framework for high-order spectral element flow simulations, also written in (modern) Fortran. Turns out, Fortran never really left—it’s just been plotting its epic comeback all along.