Jelena Diakonikolas – On Min-Max Optimization and Halpern Iteration

August 13, 2020

When:
September 22, 2020 @ 12:00 pm – 1:00 pm

Abstract: Min-max optimization constitutes a fundamental class of problems with a range of applications in areas such as game theory, engineering, finance, and training of adversarial models in machine learning. In this talk, I will provide an overview of smooth convex-concave min-max optimization. I will then discuss connections between smooth min-max optimization and the problem of finding a fixed point of a nonexpansive (1-Lipschitz) map. This connection allows us to use variants of Halpern iteration – a classical method for finding fixed points of nonexpansive maps – to solve broad classes of min-max optimization problems. The resulting algorithms attain near-optimal convergence rates for different classes of smooth min-max problems, are parameter-free, and lead to guarantees both in terms of approximating the associated optimality gap and the appropriate notion of stationarity (in unconstrained optimization, an approximate stationary point is a point with small gradient). These are the first results that can simultaneously achieve all of the aforementioned guarantees and, up to logarithmic factors, completely resolve the complexity of smooth min-max optimization.
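To make the fixed-point connection concrete, here is a minimal sketch of the classical Halpern iteration, x_{k+1} = β_k x_0 + (1 − β_k) T(x_k) with the standard step sizes β_k = 1/(k+2). This is only the basic scheme, not the near-optimal parameter-free variants discussed in the talk; the map `T` below (a planar rotation, which is nonexpansive with fixed point at the origin) is an illustrative choice. Notably, plain Picard iteration x_{k+1} = T(x_k) never converges for this map, while Halpern iteration does.

```python
import numpy as np

def halpern(T, x0, num_iters=2000):
    """Classical Halpern iteration for a nonexpansive map T:
    x_{k+1} = beta_k * x0 + (1 - beta_k) * T(x_k), beta_k = 1/(k+2)."""
    x = x0.copy()
    for k in range(num_iters):
        beta = 1.0 / (k + 2)
        # Anchor toward the starting point x0 with a vanishing weight.
        x = beta * x0 + (1 - beta) * T(x)
    return x

# Example nonexpansive (1-Lipschitz) map: rotation by 90 degrees in the
# plane. Its unique fixed point is the origin.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
T = lambda x: R @ x

x_star = halpern(T, np.array([1.0, 1.0]), num_iters=5000)
print(np.linalg.norm(x_star))  # small residual near the fixed point 0
```

The anchoring term β_k x_0 is what distinguishes Halpern iteration from plain fixed-point iteration and is the source of its convergence guarantees for merely nonexpansive (rather than contractive) maps.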

Bio: Jelena Diakonikolas is an assistant professor in the Department of Computer Sciences at UW-Madison. Prior to joining UW-Madison, she held postdoctoral positions at UC Berkeley and Boston University. She received her PhD from Columbia University in 2016. Her research interests are in efficient optimization methods for large-scale problems and connections between optimization and other areas such as dynamical systems and Markov chain Monte Carlo methods. Jelena is a recipient of a Simons-Berkeley Research Fellowship, the Morton B. Friedman Prize for Excellence at Columbia Engineering, and a Qualcomm Innovation Fellowship.
