Precision in linear algebra solvers and its cascade impact on applications
October 11 @ 11:15 am - 12:00 pm
Until recent years, linear algebra solvers predominantly operated in double precision (binary64). With the advent of AI, in particular deep learning, lower-precision floating-point formats were introduced to accommodate the needs of such computations; the spectrum has since shifted toward fixed-point and integer arithmetic and is expanding to block floating-point arithmetic. This change has also influenced linear algebra algorithm development, where the concept of mixed precision was introduced to build faster yet still reliable solvers. In this talk, I will provide a brief overview of mixed-precision algorithmic work, introduce my own strategy for mixing precisions in a controllable way, and present preliminary results with applications.
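To make the idea concrete, a classic example of a mixed-precision solver is iterative refinement: factor and solve in a cheap low precision, then refine the solution using residuals computed in high precision. The sketch below (an illustrative textbook pattern, not the speaker's specific strategy) uses single precision for the solves and double precision for the residuals:

```python
import numpy as np

def mixed_precision_solve(A, b, iters=5):
    """Solve Ax = b via mixed-precision iterative refinement:
    solves in float32, residual corrections in float64."""
    A32 = A.astype(np.float32)
    # Initial solve entirely in low precision
    x = np.linalg.solve(A32, b.astype(np.float32)).astype(np.float64)
    for _ in range(iters):
        # Residual accumulated in double precision
        r = b - A @ x
        # Correction equation solved cheaply in single precision
        d = np.linalg.solve(A32, r.astype(np.float32)).astype(np.float64)
        x += d
    return x
```

For well-conditioned systems, a few refinement steps recover an answer accurate to double-precision level while the expensive solves stay in single precision; this is the cost/reliability trade-off mixed-precision algorithms aim to control.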