Galley: Modern Query Optimization for Sparse Tensor Programs
Speaker
Kyle Deeds
University of Washington
Host
Martin Rinard
MIT CSAIL
Abstract: Modern computing has become dominated by the tensor programming abstraction. This framework allows users to write high-performance programs for bulk computation via a high-level imperative interface. Recent work on sparse tensor compilation has extended this paradigm to handle data that cannot be efficiently represented with dense rectilinear arrays, such as graphs and physical simulations. However, these systems require users, often data scientists, to take on the role of performance engineer: they must make complex decisions about program structure and data layouts that have drastic impacts on the program's efficiency. In this talk, I will present Galley, a system for declarative sparse tensor computing. Galley builds on the rich history of query optimization in databases to perform a cost-based lowering from declarative sparse tensor programs to the imperative language of sparse tensor compilers. By doing so, we achieve up to 100x performance improvements on problems ranging from machine learning to subgraph counting.
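As an illustrative sketch (not code from the talk), the kind of structural decision the abstract describes can be seen even in a simple sparse matrix chain: two mathematically equivalent evaluation orders can have wildly different costs, which is exactly the decision space a cost-based optimizer like Galley navigates automatically. The matrices and densities below are hypothetical.

```python
# Hypothetical example: evaluation order of a sparse expression A @ B @ v.
# Both plans compute the same result, but their intermediate sizes differ
# drastically -- the kind of choice a cost-based optimizer makes for the user.
import numpy as np
import scipy.sparse as sp

rng = np.random.default_rng(0)
A = sp.random(1000, 1000, density=0.01, random_state=rng, format="csr")
B = sp.random(1000, 1000, density=0.01, random_state=rng, format="csr")
v = sp.random(1000, 1, density=0.5, random_state=rng, format="csr")

# Plan 1: (A @ B) @ v materializes a much denser 1000x1000 intermediate.
intermediate1 = A @ B
plan1 = intermediate1 @ v

# Plan 2: A @ (B @ v) only ever builds a 1000x1 intermediate.
intermediate2 = B @ v
plan2 = A @ intermediate2

# Same result, very different cost: intermediate1.nnz >> intermediate2.nnz.
print(intermediate1.nnz, intermediate2.nnz)
```

This mirrors classic join ordering in databases: the optimizer's job is to pick the cheap plan so the user can simply write the declarative expression.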
Bio: Kyle Deeds is currently a 5th year PhD student in the database lab at the University of Washington. He is advised by Dan Suciu and Magda Balazinska and interested in the intersection of databases, compilers, and HPC. Broadly, his work aims to make high performance computing accessible to domain experts through declarative programming coupled with automatic program optimization.