Fast Linear Algebra for Distance Matrices

A Google TechTalk, presented by Sandeep Silwal, 2022/06/21

ABSTRACT: I will talk about some of my recent work on fast algorithms for linear algebraic primitives tailored to distance matrices, a common family of dense matrices arising in many applications. The main upper bound results are linear time algorithms for computing matrix-vector products for some distance functions, including the L1 metric, which use only the underlying dataset rather than explicitly constructing the distance matrix. Efficient matrix-vector products automatically have many downstream applications (think power method or iterative methods), and many recent works have nailed down the complexity of various linear algebraic tasks, such as low-rank approximation, in terms of the number of matrix-vector queries required. The upper bound results are complemented by lower bounds, including a conditional Omega(n^2) lower bound for any algorithm that computes a matrix-vector product for the L_infinity distance matrix on a set of n points, showing a separation between the L1 and L_infinity cases. Time permitting, I will also talk about constructing approximate L2 distance matrices in time faster than the bound implied by the Johnson-Lindenstrauss lemma. This is joint work with Piotr Indyk.

Presented as part of the Google Algorithms, Theory, and Optimization speaker series.

About the Speaker: Sandeep is a 4th year PhD student at MIT working with Piotr Indyk. He seeks to work in the 'sweet spot' between theory and practice. Recently he has been working at the intersection of ML and classical algorithms, designing efficient algorithms for large datasets and using ML to inspire algorithm design.
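To make the flavor of the upper bound concrete, below is a minimal sketch of one standard way a matrix-vector product with an L1 distance matrix can be computed in near-linear time without ever forming the matrix: the d-dimensional L1 distance matrix decomposes into a sum of one-dimensional |x_i - x_j| matrices, one per coordinate, and each one-dimensional product reduces to a sort plus prefix sums. The function name l1_distance_matvec and the sort-and-prefix-sum route are illustrative assumptions for this note, not necessarily the algorithm presented in the talk.

import numpy as np

def l1_distance_matvec(X, z):
    # Computes y = A z where A[i, j] = ||X[i] - X[j]||_1, without forming A.
    # The L1 distance matrix is a sum over coordinates of one-dimensional
    # |x_i - x_j| matrices; each one-dimensional product is handled with a
    # sort and prefix sums, so the total cost is O(n d log n) instead of the
    # O(n^2 d) needed to build A explicitly.
    n, d = X.shape
    y = np.zeros(n)
    for k in range(d):
        x = X[:, k]
        order = np.argsort(x)
        xs, zs = x[order], z[order]
        cz = np.cumsum(zs)          # prefix sums of z in sorted order
        cxz = np.cumsum(xs * zs)    # prefix sums of x*z in sorted order
        # For the i-th point in sorted order:
        #   sum_{j <= i} (x_i - x_j) z_j  +  sum_{j > i} (x_j - x_i) z_j
        contrib = xs * cz - cxz + (cxz[-1] - cxz) - xs * (cz[-1] - cz)
        y[order] += contrib
    return y

# Quick check against the explicit distance matrix on a small instance.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
z = rng.normal(size=200)
A = np.abs(X[:, None, :] - X[None, :, :]).sum(axis=-1)
assert np.allclose(A @ z, l1_distance_matvec(X, z))

# Downstream use: a few steps of power iteration estimate the eigenvalue of
# A that is largest in magnitude, at the per-iteration cost of the fast
# product rather than of the full n x n matrix.
v = rng.normal(size=200)
for _ in range(50):
    v = l1_distance_matvec(X, v)
    v /= np.linalg.norm(v)
top_eig_estimate = v @ l1_distance_matvec(X, v)

This is only the L1 building block; the talk's results cover other distance functions as well, and the L_infinity lower bound mentioned in the abstract indicates that no comparably fast product is expected in that case.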