Randomized Numerical Linear Algebra for the Sciences
Time and place
12:30–1:30 PM on Monday, March 30th, 2026; Marshak 1128
Tyler Chen (JPMorganChase)
Abstract
Over the past two decades, randomized algorithms have revolutionized how we solve linear algebra problems numerically, offering both orders-of-magnitude speedups in real-world computation and the best known theoretical running times for many core linear-algebra tasks. Now, as the field of randomized numerical linear algebra (RandNLA) matures, there is a growing effort to bring its techniques to new application domains.
My research program is broadly aimed at operationalizing RandNLA for widespread use across the sciences. As a case study, I will discuss a recent line of work that uses tools from RandNLA, specifically randomized algorithms for matrix approximation, to resolve open theoretical challenges in Operator Learning, a pillar of the emerging field of Scientific Machine Learning. In particular, I will describe recent work on hierarchical matrix approximation using only black-box access via matrix-vector products (SODA '25). Hierarchical matrices, which capture the structure of many physical operators, are ubiquitous throughout the sciences, yet ours is the first work to describe algorithms that provably output a near-optimal hierarchical matrix approximation to an arbitrary matrix in this widely used access model.
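To make the matrix-vector product access model concrete, here is a minimal sketch of a classic RandNLA routine in that model: the randomized range-finder for low-rank approximation (in the style of Halko, Martinsson, and Tropp). This is an illustration only, not the hierarchical algorithm from the talk; the function name and the test matrix are hypothetical choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def matvec_lowrank_approx(matvec, rmatvec, n, k, oversample=10):
    """Rank-k approximation of an n x n matrix A accessed only as a
    black box: matvec(x) = A @ x and rmatvec(y) = A.T @ y."""
    ell = k + oversample
    # Probe the range of A with random Gaussian test vectors.
    Omega = rng.standard_normal((n, ell))
    Y = np.column_stack([matvec(Omega[:, j]) for j in range(ell)])
    Q, _ = np.linalg.qr(Y)  # orthonormal basis for the sampled range
    # Form B = Q^T A one product at a time, using A^T applied to columns of Q.
    B = np.column_stack([rmatvec(Q[:, j]) for j in range(ell)]).T
    # A small SVD of B yields the final rank-k factors.
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ U_hat[:, :k], s[:k], Vt[:k, :]

# Example: a synthetic matrix with rapidly decaying spectrum.
n, k = 200, 5
A = sum((0.5 ** i) * np.outer(rng.standard_normal(n), rng.standard_normal(n))
        for i in range(20))
U, s, Vt = matvec_lowrank_approx(lambda x: A @ x, lambda y: A.T @ y, n, k)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
print(f"relative error of rank-{k} approximation: {err:.2e}")
```

Note that the algorithm never reads entries of A directly; it only applies A and its transpose to vectors, which is exactly the access model the abstract describes.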