Maximizing Submodular Functions Exponentially Faster

Yaron Singer
Harvard University
Computer Science

Many iterative methods in optimization are inherently sequential and consequently cannot be efficiently parallelized. In this talk I'll describe a novel approach called adaptive sampling that yields algorithms whose parallel running time is exponentially faster than that of any previously known algorithm for a broad range of machine learning applications. The algorithms are designed for submodular function maximization, which is the algorithmic engine behind machine learning applications such as ranking, speech and document summarization, recommendation systems, clustering, Bayesian inference, feature selection, and network analysis, among many others. In the talk I'll introduce the concept of adaptivity and the adaptive sampling framework we recently developed, and present experimental results from various application domains.
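To give a concrete sense of the contrast the abstract draws, the sketch below (an illustrative assumption, not the algorithm presented in the talk) shows a submodular coverage objective, the classical greedy algorithm whose adaptivity, i.e. number of sequential rounds, grows with the solution size k, and a single adaptive-sampling-style round in which many random blocks are evaluated independently and could therefore be scored in parallel. All names (`coverage`, `greedy`, `adaptive_sampling_round`) and the threshold rule are hypothetical choices for illustration only.

```python
import random

def coverage(sets, S):
    """Submodular objective: number of elements covered by the union of chosen sets."""
    covered = set()
    for i in S:
        covered |= sets[i]
    return len(covered)

def greedy(sets, k):
    """Classical greedy: k sequential rounds, each picking the element with the
    largest marginal gain. Its adaptivity (sequential rounds) is k."""
    S = []
    for _ in range(k):
        best = max((i for i in range(len(sets)) if i not in S),
                   key=lambda i: coverage(sets, S + [i]) - coverage(sets, S))
        S.append(best)
    return S

def adaptive_sampling_round(sets, S, candidates, block_size, threshold, trials=20):
    """One adaptive-sampling-style round (illustrative sketch, not the talk's
    algorithm): sample random blocks of candidates and keep one whose average
    marginal contribution per element meets the threshold. The trials within a
    round are independent, so they could all be evaluated in parallel."""
    base = coverage(sets, S)
    for _ in range(trials):
        block = random.sample(candidates, min(block_size, len(candidates)))
        gain = coverage(sets, S + block) - base
        if gain >= threshold * len(block):
            return block
    return []  # no sampled block met the threshold this round
```

The point of the sketch is the round structure: greedy needs k dependent rounds, while an adaptive-sampling round adds a whole block after one batch of independent evaluations, which is what allows exponentially fewer sequential rounds.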
