Program Optimization for Machine Learning

Speaker

Alex Aiken
Stanford University

Host

Julian Shun
MIT CSAIL
Abstract: Training deep neural networks (DNNs) can be expensive and slow, consuming enormous numbers of compute-hours on parallel machines. This talk will present results on using novel search procedures over programs to reduce training time. In particular, instead of greedily applying program-improving transformations to compute a single improved program, we search a space of programs, considering many possible candidates guided by a global cost function. The application of search-based optimization to two separate problems will be discussed: improving the partitioning and distribution of training data, and reducing the execution time of the DNN computation graph. Both methods speed up training by up to a factor of 3 over current state-of-the-art systems.
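To make the contrast with greedy rewriting concrete, the sketch below shows one generic way a cost-guided search over candidate programs can work. It is purely illustrative and not the speaker's actual system: it uses simulated annealing over a toy program representation, and `estimated_runtime`, the two rewrite rules, and the per-operator overhead model are all invented for this example.

```python
import math
import random

def estimated_runtime(program):
    # Global cost function (toy): total per-operator work plus a fixed
    # launch overhead of 0.1 per operator.
    return sum(program) + 0.1 * len(program)

def random_rewrite(program, rng):
    # Apply one randomly chosen "semantics-preserving" rewrite.
    if len(program) >= 2 and rng.random() < 0.5:
        # Fuse two adjacent operators: combined work is unchanged, but
        # one per-operator launch overhead disappears.
        i = rng.randrange(len(program) - 1)
        return program[:i] + [program[i] + program[i + 1]] + program[i + 2:]
    # Swap one operator for an alternative implementation whose cost may
    # be better or worse (e.g., a different kernel or data layout).
    j = rng.randrange(len(program))
    return program[:j] + [program[j] * rng.uniform(0.7, 1.3)] + program[j + 1:]

def search(initial, steps=5000, temperature=1.0, cooling=0.999, seed=0):
    # Search a space of candidate programs under the global cost
    # function, rather than greedily keeping only improvements.
    rng = random.Random(seed)
    current, current_cost = initial, estimated_runtime(initial)
    best, best_cost = current, current_cost
    for _ in range(steps):
        candidate = random_rewrite(current, rng)
        cost = estimated_runtime(candidate)
        # Simulated-annealing acceptance: always take improvements, and
        # sometimes take regressions so the search can escape local minima.
        if cost <= current_cost or rng.random() < math.exp(
            (current_cost - cost) / temperature
        ):
            current, current_cost = candidate, cost
            if cost < best_cost:
                best, best_cost = candidate, cost
        temperature *= cooling
    return best, best_cost

if __name__ == "__main__":
    program = [1.0, 0.5, 2.0, 0.3, 1.2]  # toy per-operator costs
    best, cost = search(program)
    print(f"optimized cost: {cost:.3f} (initial {estimated_runtime(program):.3f})")
```

Because regressions are occasionally accepted while the temperature is high, the search can pass through intermediate candidates that a purely greedy rewriter would discard, which is the key property the abstract highlights.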

Bio: Alex Aiken is the Alcatel-Lucent Professor of Computer Science at Stanford. Alex received his bachelor's degree in Computer Science and Music from Bowling Green State University in 1983 and his Ph.D. from Cornell University in 1988. Alex was a Research Staff Member at the IBM Almaden Research Center (1988-1993) and a Professor in the EECS department at UC Berkeley (1993-2003) before joining the Stanford faculty in 2003. His research interests are in programming languages and related areas. He is an ACM Fellow, a recipient of ACM SIGPLAN's Programming Languages Achievement Award and Phi Beta Kappa's Teaching Award, and a former chair of the Stanford Computer Science Department.