This is an online seminar.
Registration is required.
【Speaker】Dr. Alexander Tyurin
King Abdullah University of Science and Technology
【Title】
New Theoretical Results in Distributed Optimization: Compressed Communication, Asynchronous Computations, and Beyond
【Abstract】
Distributed optimization and federated learning are important research directions for two reasons: i) the recent boom of large machine learning models (e.g., ChatGPT, DALL-E) that require substantial computational resources and time to train, and ii) even relatively small machine learning models are trained in distributed environments to preserve the privacy of each device and to avoid expensive communication between devices and servers. This talk presents new theoretical breakthroughs that address several important problems in distributed optimization: i) communication complexity between devices is one of the main metrics in distributed optimization, and we will discuss newly obtained communication complexities; ii) we then examine the problem of developing a method for the setting where devices compute stochastic gradients asynchronously; iii) beyond that, we will discuss other problems, such as partial participation of devices and correlated compressed communication.
Public events of RIKEN Center for Advanced Intelligence Project (AIP)