Stochastic Gradient Methods

February 24 - 28, 2014

Schedule


Monday, February 24, 2014

9:00 - 9:45

10:15 - 11:00
Ben Recht (University of California, Berkeley (UC Berkeley))

Why we should all run Hogwild!

11:30 - 12:15

2:30 - 3:15
John Langford (Microsoft Research)

Learning to Interact

4:00 - 4:45
Alekh Agarwal (Microsoft Research)

Learning Sparsely Used Overcomplete Dictionaries


Tuesday, February 25, 2014

9:00 - 9:45

10:15 - 11:00
Peter Richtárik (University of Edinburgh)

Accelerated, Parallel and Proximal Coordinate Descent

11:30 - 12:15
Rachel Ward (University of Texas at Austin)

Stochastic Gradient Descent with Importance Sampling

2:30 - 3:15
Deanna Needell (Claremont McKenna College)

SGD and its connections to the Kaczmarz method

4:00 - 4:45

Wednesday, February 26, 2014

9:00 - 9:45
Yann LeCun (New York University)

Open problems in large-scale stochastic optimization for deep learning

10:15 - 11:00
Francis Bach (Institut National de Recherche en Informatique et en Automatique (INRIA))

Efficient and robust stochastic approximation through an online Newton method

11:30 - 12:15

2:30 - 3:15
John Duchi (University of California, Berkeley (UC Berkeley))

Optimal rates for zero-order optimization: the power of two function evaluations

4:00 - 4:45

Thursday, February 27, 2014

9:00 - 9:45

10:15 - 11:00

11:30 - 12:15

2:30 - 3:15
David McAllester (TTI-Chicago)

A PAC-Bayesian Analysis of Dropouts

4:00 - 4:45

Friday, February 28, 2014

9:00 - 9:45

10:15 - 11:00

11:30 - 12:15