[4th EPFL CIS - RIKEN AIP Joint Seminar] Talks by Nicolas Flammarion, EPFL CIS, RIKEN AIP

Wed, 27 Oct 2021 17:00 - 18:00 JST
Free admission
Registration closes 27 Oct 17:30
- Time Zone: JST
- The seats are available on a first-come, first-served basis.
- When the seats are fully booked, we may stop accepting applications.
- Simultaneous interpretation will not be available.


EPFL CIS and RIKEN AIP launched the "EPFL CIS - RIKEN AIP Joint Seminar" series in October 2021.

EPFL, located in Switzerland, is one of the most vibrant and cosmopolitan science and technology institutions. It has both a Swiss and an international vocation and focuses on three missions: teaching, research and innovation.

The Center for Intelligent Systems (CIS) at EPFL, a joint initiative of the schools ENAC, IC, SB, STI and SV, seeks to advance research and practice in the strategic field of intelligent systems.

RIKEN is Japan's largest comprehensive research institution, renowned for high-quality research in a diverse range of scientific disciplines.

The RIKEN Center for Advanced Intelligence Project (AIP) houses more than 40 research teams, whose work ranges from the fundamentals of machine learning and optimization, through applications in medicine, materials, and disaster, to the analysis of the ethics and social impact of artificial intelligence.

【The 4th Seminar】

Date and Time: October 27th, 2021, 5:00pm – 6:00pm (JST)
Venue: Zoom webinar

Language: English

Speaker: Nicolas Flammarion, EPFL CIS

Title: Implicit Bias of SGD for Diagonal Linear Networks: a Provable Benefit of Stochasticity

Abstract: Understanding the implicit bias of training algorithms is crucial for explaining the success of overparametrised neural networks. In this talk, we study the dynamics of stochastic gradient descent over diagonal linear networks through its continuous-time version, namely stochastic gradient flow. We explicitly characterise the solution chosen by the stochastic flow and prove that it always enjoys better generalisation properties than the solution of gradient flow. Quite surprisingly, we show that the convergence speed of the training loss controls the magnitude of the biasing effect: the slower the convergence, the better the bias. Our findings highlight the fact that structured noise can induce better generalisation, and they help to explain the superior performance of stochastic gradient descent over gradient descent observed in practice.
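The setting of the talk can be illustrated with a minimal numerical sketch (this is not code from the talk; the problem sizes, step sizes and initialisation below are illustrative choices). A diagonal linear network parametrises a linear regressor as the elementwise product w = u * v; here we train it on an overparametrised, noiseless sparse regression task with both full-batch gradient descent and single-sample SGD, and compare the l1 norms of the solutions the two algorithms select:

```python
import numpy as np

rng = np.random.default_rng(0)

# Overparametrised sparse regression: more features than samples, noiseless labels.
n, d = 20, 40
X = rng.standard_normal((n, d))
w_star = np.zeros(d)
w_star[:3] = 1.0                       # 3-sparse ground-truth regressor
y = X @ w_star

def loss(w):
    """Mean squared error of the linear predictor w on the full data."""
    return 0.5 * np.mean((X @ w - y) ** 2)

def grads(u, v, idx):
    """Mini-batch gradients w.r.t. (u, v) of the network x -> <u * v, x>."""
    Xb, yb = X[idx], y[idx]
    r = Xb @ (u * v) - yb              # batch residuals
    g_w = Xb.T @ r / len(idx)          # gradient w.r.t. the product w = u * v
    return g_w * v, g_w * u            # chain rule through w = u * v

def train(batch_size, steps, lr, init=0.1):
    """Run (stochastic) gradient descent from a small uniform initialisation."""
    u = init * np.ones(d)
    v = init * np.ones(d)
    for _ in range(steps):
        idx = rng.choice(n, size=batch_size, replace=False)
        gu, gv = grads(u, v, idx)
        u -= lr * gu
        v -= lr * gv
    return u * v

w_gd = train(batch_size=n, steps=20000, lr=0.05)    # full-batch gradient descent
w_sgd = train(batch_size=1, steps=50000, lr=0.01)   # single-sample SGD

print(f"GD  solution: loss={loss(w_gd):.2e}, l1 norm={np.abs(w_gd).sum():.3f}")
print(f"SGD solution: loss={loss(w_sgd):.2e}, l1 norm={np.abs(w_sgd).sum():.3f}")
```

Comparing the l1 norms of the two interpolating solutions is one simple proxy for the implicit bias discussed in the abstract: on sparse problems, a solution with smaller l1 norm typically generalises better.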

Nicolas Flammarion is a tenure-track assistant professor in computer science at EPFL. Prior to that, he was a postdoctoral fellow at UC Berkeley, hosted by Michael I. Jordan. He received his PhD in 2017 from the École Normale Supérieure in Paris, where he was advised by Alexandre d'Aspremont and Francis Bach. His research focuses primarily on learning problems at the interface of machine learning, statistics and optimization.

All participants are required to agree with the AIP Seminar Series Code of Conduct.
Please see the URL below.

RIKEN AIP will expect adherence to this code throughout the event. We expect cooperation from all participants to help ensure a safe environment for everybody.
