This is an online seminar. Registration is required.
【Continuous Optimization Team】
【Date】March 24 (Fri), 2023, 9:00-10:00 (JST)
【Speaker】Saeid Hajizadeh (University of Chicago)
Title: Large-scale Minimax Optimization Problems
Abstract
Minimax optimization has historically been an important topic in optimization. With applications in economics, game theory, control theory, and robust optimization, it has been the focus of many researchers since the middle of the twentieth century. More recently, the advent of new applications in GANs, reinforcement learning, and the robust training of machine learning models has piqued the interest of many researchers and shifted their attention towards this area. The scale of these recent applications in data science naturally leads to the use of first-order methods. However, these applications also often involve objective functions that are nonconvex-nonconcave, which prevents the application of standard methods such as Gradient Descent-Ascent, known to diverge even on bilinear problems. The aim of this talk is to cover first-order methods and their convergence to local solutions of these problems. We will first present the convergence of the Extra-Gradient Method, a cheap and scalable algorithm, under strong interaction dominance, and will subsequently discuss the convergence of Proximal-Point Methods under the same structural assumptions and in the presence of constraints. Finally, we will present some properties of a generalization of the Moreau envelope to primal-dual problems, called the saddle envelope, which could be of independent interest for the development of algorithms for solving modern minimax optimization problems.
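As a small illustration of the divergence phenomenon mentioned in the abstract (not part of the talk material), the following sketch compares Gradient Descent-Ascent with the Extra-Gradient Method on the bilinear problem min_x max_y x*y, whose saddle point is (0, 0). The step size and iteration count are arbitrary choices for demonstration.

```python
import numpy as np

# Illustrative sketch: GDA vs. Extra-Gradient on min_x max_y f(x, y) = x * y.
# grad_x f = y, grad_y f = x; the unique saddle point is (0, 0).
eta = 0.1      # step size (arbitrary choice)
steps = 100

# Gradient Descent-Ascent: simultaneous descent step in x, ascent step in y.
x, y = 1.0, 1.0
for _ in range(steps):
    x, y = x - eta * y, y + eta * x
print("GDA  distance to saddle point:", np.hypot(x, y))   # grows: GDA spirals outward

# Extra-Gradient: first extrapolate to a midpoint, then update using the
# gradients evaluated at that midpoint.
x, y = 1.0, 1.0
for _ in range(steps):
    xm, ym = x - eta * y, y + eta * x          # extrapolation (midpoint) step
    x, y = x - eta * ym, y + eta * xm          # update with midpoint gradients
print("EG   distance to saddle point:", np.hypot(x, y))   # shrinks toward 0
```

On this example the GDA iterates grow by a factor of sqrt(1 + eta^2) per step, while the Extra-Gradient iterates contract toward the saddle point, matching the behavior described above.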