PIMS-CORDS SFU Operations Research Seminar: Ahmet Alacaoglu
Topic
Towards Weaker Variance Assumptions for Stochastic Optimization: A Blast From the Past
Speakers
Ahmet Alacaoglu
Details
In this talk, I will present some recent advances in analyzing stochastic optimization methods without the bounded variance assumption. It is well known that the bounded variance assumption is violated even for the most standard problems, such as linear least squares. We will see that the analysis for obtaining optimal convergence rates under realistic variance assumptions builds on a connection between the classical stochastic approximation literature and the Halpern iteration for solving fixed-point problems. We will discuss extensions to proximal algorithms for solving regularized problems and stochastic convex nonlinear programs, as well as the ideas required to obtain rate guarantees for the last iterate of the algorithm, which is the iterate used in practice.
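The two central objects of the abstract can be illustrated numerically. The sketch below (my own minimal setup, not code from the talk) first shows that for linear least squares the per-sample stochastic gradient is unbiased but its variance grows with the norm of the query point, so no single constant bounds it over the whole space; it then runs a textbook Halpern (anchored) iteration on the gradient-step map of the same problem. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def full_grad(x):
    # gradient of f(x) = (1/2n) * ||Ax - b||^2
    return A.T @ (A @ x - b) / n

def grad_variance(x):
    # empirical E_i ||g_i(x) - grad f(x)||^2 over a uniformly sampled index i,
    # where g_i(x) = a_i (a_i^T x - b_i) is the (unbiased) per-sample gradient
    g = full_grad(x)
    G = A * (A @ x - b)[:, None]  # row i is g_i(x)
    return np.mean(np.sum((G - g) ** 2, axis=1))

# The variance grows (roughly quadratically) along any ray t * u,
# so sup_x E ||g_i(x) - grad f(x)||^2 is infinite: bounded variance fails.
u = rng.standard_normal(d)
u /= np.linalg.norm(u)
for t in [1.0, 10.0, 100.0]:
    print(f"||x|| = {t:6.1f}  variance ~ {grad_variance(t * u):.2e}")

# Halpern iteration on the nonexpansive map T(x) = x - eta * grad f(x):
#   x_{k+1} = beta_k * x0 + (1 - beta_k) * T(x_k),  beta_k = 1/(k+2).
# Its last iterate converges to the fixed point of T, i.e. the minimizer.
L = np.linalg.eigvalsh(A.T @ A / n).max()  # smoothness constant of f
eta = 1.0 / L
x0 = np.zeros(d)
x = x0.copy()
for k in range(2000):
    beta = 1.0 / (k + 2)
    x = beta * x0 + (1 - beta) * (x - eta * full_grad(x))

x_star = np.linalg.lstsq(A, b, rcond=None)[0]
print("last-iterate error:", np.linalg.norm(x - x_star))
```

The anchoring coefficient `beta_k = 1/(k+2)` is the classical Halpern step-size schedule; the point of the anchor term is that it yields guarantees directly on the last iterate rather than on an averaged iterate.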