Reliable and sustainable computations: An application-driven approach

In this talk, Roman Iakymchuk presents his work on strategies for assuring the accuracy and reproducibility of parallel iterative solvers, properties that may not hold due to the non-associativity of floating-point operations. These strategies primarily rely on guarding every bit of the result until the final rounding, and hence can be costly. Energy consumption constraints for large-scale computing encourage scientists to revise not only the architecture design of hardware but also applications, algorithms, and the underlying working/storage precision. The main aim is to make the computing cost sustainable and to apply the lagom principle (“not too much, not too little, the right amount”), especially when it comes to working/storage precision. Thus, he will introduce an approach to address the issue of sustainable, but still reliable, computations from the perspective of computer arithmetic tools. Before lowering precision, one must ensure that the simulation is numerically correct, e.g. by relying on alternative floating-point models/rounding to pinpoint numerical bugs and to estimate the accuracy. He and his co-authors employ VerifiCarlo and its variable precision backend to identify the parts of the code that benefit from smaller floating-point formats, and show preliminary results on proxy applications.
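The non-associativity mentioned above is easy to demonstrate. The following minimal Python snippet (an illustration, not material from the talk) shows how changing the summation order, as happens in parallel reductions, changes the rounded result:

```python
# Floating-point addition is not associative: the grouping of
# operations changes the rounding error that accumulates.
a, b, c = 0.1, 0.2, 0.3

left = (a + b) + c   # 0.6000000000000001
right = a + (b + c)  # 0.6

# A parallel reduction that regroups its operands may therefore
# produce a different (non-reproducible) result.
print(left == right)  # False
```

This is exactly why a solver run on a different number of processes, which groups the partial sums differently, can produce a different answer bit-for-bit.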

Sustainable and Reliable Computing with Tools: Analyzing Precision Appetites of CFD Applications with VerifiCarlo

Energy consumption constraints for large-scale computing encourage scientists to revise not only the architecture design of hardware but also applications, algorithms, and the underlying working/storage precision. I will introduce an approach to address the issue of sustainable, but still reliable, computations from the perspective of computer arithmetic tools. We employ VerifiCarlo and its variable precision backend to identify the parts of the code that benefit from smaller floating-point formats. Finally, we show preliminary results on proxies of CFD applications.

VPREC to analyze the precision appetites and numerical abnormalities of several proxy applications

The third in a series of presentations from Roman Iakymchuk on work using tools to investigate mixed-precision possibilities. He and his co-author Pablo de Oliveira Castro introduce an approach to address the issue of sustainable computations with computer arithmetic tools. They use the variable precision backend (VPREC) to identify parts of the code that can benefit from smaller floating-point formats and show preliminary results on several proxy applications.

Precision in linear algebra solvers and its cascade impact on applications

Until recent years, linear algebra solvers predominantly operated in double precision, or binary64. With the advent of AI, in particular deep learning, lower-precision floating-point formats were introduced to accommodate the needs of such computations; the spectrum has now shifted toward fixed-point and integer arithmetic, expanding to block floating-point arithmetic. This change […]

CEEC at the Euro HPC Summit Poster Session

Flanders Meeting and Convention Centre, A Room with a ZOO, Koningin Astridplein 20-26, Antwerp, Belgium

If you’re interested in our progress over the last year or hoping to ask us questions about our plans for the future, don’t miss the chance to talk with our own Niclas Jansson at the first poster session of the Euro HPC Summit on Tuesday, March 19th!

‘Enabling mixed-precision with the help of tools: A Nekbone case study’ at PPAM24

Ostrava, Czechia

If you’re going to PPAM24, September 8–11 in Ostrava, Czechia, make sure to check out the talk by Yanxiang Chen from our partner UMU on their progress implementing and optimizing mixed precision in Nekbone toward its integration in Neko and NekRS. The accompanying paper is also available as a pre-print online: https://arxiv.org/pdf/2405.11065

Enabling mixed-precision with VerifiCarlo: Sharing CEEC experience

Driven by the increasing need to reduce the energy consumption of computing centers and simulations, scientists have begun revising applications, algorithms, and their underlying working/storage precision not just for performance but also for energy efficiency. The goal is to make computational costs sustainable while adhering to the lagom principle—using precision that is “just right” to balance accuracy with efficiency. However, before lowering precision, one must ensure that the simulation is numerically correct. Verificarlo is an open-source framework designed to verify and optimize numerical accuracy in complex programs. In this webinar, we will introduce Verificarlo, showcase its backends for numerical bug detection and mixed-precision analysis, and present a success story highlighting the road from analysis of codes with Verificarlo to reliable mixed-precision codes.
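To make the mixed-precision idea concrete, the following Python sketch emulates a reduced working precision by rounding every intermediate of a dot product to binary32 and comparing against the full binary64 result. This is only an illustration of the principle, not Verificarlo itself (which instruments the compiled program and offers configurable precision through its VPREC backend); the helper names `round_to_binary32` and `dot` are hypothetical:

```python
import struct

def round_to_binary32(x: float) -> float:
    """Round a binary64 value to the nearest binary32 value by
    packing/unpacking it as a 32-bit float. A crude stand-in for
    a variable-precision backend with a fixed target precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

def dot(xs, ys, reduced=False):
    """Dot product; with reduced=True, every intermediate product
    and partial sum is rounded to binary32 to emulate running the
    kernel at a lower working precision."""
    acc = 0.0
    for x, y in zip(xs, ys):
        p = x * y
        if reduced:
            p = round_to_binary32(p)
        acc = acc + p
        if reduced:
            acc = round_to_binary32(acc)
    return acc

# An alternating-sign series as a toy kernel.
xs = [1.0 / (i + 1) for i in range(1000)]
ys = [(-1.0) ** i for i in range(1000)]

full = dot(xs, ys)
low = dot(xs, ys, reduced=True)
# The gap between the two results indicates how much precision
# this particular kernel actually "consumes".
print(abs(full - low))
```

If the gap stays within the application's accuracy target, the kernel is a candidate for a smaller floating-point format; if not, it must keep the higher precision. That is, in essence, the per-code-region analysis the webinar describes.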

Mixed-Precision and Energy-Efficient Computations

Join Yanxiang Chen at ISC in Foyer D-G on the 2nd floor to learn about our latest work using mixed precision to reduce energy consumption without compromising accuracy.

CEEC at ISC25

Join our own Niclas Jansson at the Euro HPC JU booth for a presentation on our project and progress so far!

RESEARCH POSTER PITCH with Yanxiang Chen

After listening to Niclas Jansson present our project at the EuroHPC JU booth at 1pm, stroll over to Hall E to hear from our own Yanxiang Chen. If you want the chance to ask questions about how we’re using mixed precision to drive energy-to-solution down without compromising accuracy, make sure to stop by the research […]