Abstract: We will discuss several new connections between discrete and continuous optimization.
The first part will present faster second-order convex optimization algorithms for classical graph problems. In particular, we will show that the running time of interior point methods is closely connected to spectral connectivity notions in the underlying graph, such as electrical conductance and effective resistance. We will explore these connections along two orthogonal directions: making manual interventions to the graph to improve connectivity, and tracking connectivity so as to make faster updates. These ideas lead to the first running-time improvement for the minimum cost flow problem in more than ten years, as well as faster algorithms for problems such as negative-weight shortest paths and minimum cost perfect matching.
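To make the connectivity notion concrete: the effective resistance between two nodes u and v of a graph is the standard quantity (e_u - e_v)^T L^+ (e_u - e_v), where L is the graph Laplacian and L^+ its pseudoinverse. The sketch below is illustrative only and is not the algorithm from the talk; the function name and the dense-pseudoinverse approach are our own choices for a small example.

```python
import numpy as np

def effective_resistance(edges, n, u, v):
    """Effective resistance between nodes u and v of an unweighted graph
    on n nodes, computed as (e_u - e_v)^T L^+ (e_u - e_v)."""
    L = np.zeros((n, n))
    for a, b in edges:          # build the Laplacian L = D - A
        L[a, a] += 1.0
        L[b, b] += 1.0
        L[a, b] -= 1.0
        L[b, a] -= 1.0
    Lp = np.linalg.pinv(L)      # pseudoinverse, since L is singular
    e = np.zeros(n)
    e[u], e[v] = 1.0, -1.0
    return float(e @ Lp @ e)

# Path graph 0-1-2 with unit edge weights: two unit resistors in series,
# so the effective resistance between the endpoints is 2.
r = effective_resistance([(0, 1), (1, 2)], 3, 0, 2)
```

Fast algorithms avoid the dense pseudoinverse entirely; this snippet only pins down the quantity being maintained.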
In the second part, we will discuss efficient optimization algorithms for machine learning problems with a discrete element, such as sparse or low-rank structure. We introduce a new technique, called adaptive regularization, which eliminates the sparsity performance degradation caused by l2 projections onto structured non-convex domains, such as the set of sparse vectors or low-rank matrices. This leads to improved sparsity guarantees for one of the best-known sparse optimization algorithms, iterative hard thresholding (IHT).
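For context, the baseline IHT method alternates a gradient step with an l2 projection onto the set of k-sparse vectors (hard thresholding). The sketch below shows this standard baseline, not the adaptive regularization technique from the talk; the step size and problem sizes are hypothetical choices for illustration.

```python
import numpy as np

def hard_threshold(x, k):
    """l2 projection onto k-sparse vectors: keep the k largest-magnitude
    entries of x and zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def iht(A, b, k, iters=200):
    """Baseline IHT for min ||Ax - b||^2 subject to ||x||_0 <= k."""
    eta = 1.0 / np.linalg.norm(A, 2) ** 2   # a simple step-size choice
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = hard_threshold(x - eta * A.T @ (A @ x - b), k)
    return x

# Small synthetic instance: recover a 3-sparse vector from 40 noiseless
# Gaussian measurements in 20 dimensions.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 20)) / np.sqrt(40)
x_true = np.zeros(20)
x_true[[3, 7, 11]] = [1.0, -2.0, 1.5]
x_hat = iht(A, A @ x_true, k=3)
```

Note that every iterate is k-sparse by construction; the talk's point is that the naive projection step is exactly where sparsity guarantees degrade, which adaptive regularization repairs.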