Ben Grimmer, “Radial Duality: Scalable, Projection-Free Optimization Methods”
Benjamin Grimmer, PhD
Assistant Professor, Applied Mathematics & Statistics, Johns Hopkins University
Join Zoom Meeting
https://wse.zoom.us/j/99481316345?pwd=UHNWSldld1g1bTc2UnVIbWdGVW8vZz09
Meeting ID: 994 8131 6345
Passcode: Clark
One tap mobile
+13017158592,,99481316345# US (Washington DC)
+16465588656,,99481316345# US (New York)
Title: “Radial Duality: Scalable, Projection-Free Optimization Methods”
Abstract: Generalizing ideas for solving conic programs due to Renegar (2016), this talk will introduce a new radial duality between optimization problems. This duality allows us to reformulate (potentially nonconvex) constrained optimization problems into an equivalent unconstrained, Lipschitz continuous form. Designing algorithms to solve this radially dual problem yields new projection-free first-order methods that avoid potentially costly orthogonal projection steps and require no Lipschitz continuity-type assumptions. The resulting radial optimization methods present an opportunity to scale up much more efficiently than classic alternatives like projected gradient descent or Frank-Wolfe.
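For orientation (the abstract itself does not define the duality), here is a minimal sketch following the definitions in Grimmer's Radial Duality papers; the regularity conditions under which the duality holds are developed there and omitted here:

\[
  \Gamma(x, u) = \left(\frac{x}{u}, \frac{1}{u}\right),
  \qquad
  f^{\Gamma}(y) = \sup\{\, v > 0 : v\, f(y/v) \le 1 \,\}.
\]

The point transformation \Gamma is an involution that, roughly speaking, maps the epigraph of f onto the hypograph of its radial dual f^{\Gamma}. Under the conditions in the papers, (f^{\Gamma})^{\Gamma} = f, and constrained minimization of f corresponds to unconstrained maximization of the Lipschitz continuous f^{\Gamma}, which first-order methods can attack without projection steps. As a one-dimensional check, f(x) = 1 - x has radial dual f^{\Gamma}(y) = \sup\{v > 0 : v - y \le 1\} = 1 + y, and applying the transformation again recovers f.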
Biography: Benjamin Grimmer recently joined the Johns Hopkins Department of Applied Mathematics & Statistics as an assistant professor after completing his PhD in Operations Research at Cornell University, advised by Jim Renegar and Damek Davis and supported by an NSF Graduate Research Fellowship. Ben's research focuses on mathematical programming and continuous optimization methods that work at large scale. An overarching theme in Ben's research is bridging the gap between our understanding of classical continuous optimization approaches and the potentially stochastic, nonconvex, nonsmooth, adversarial models employed in many modern data science and machine learning settings.