
Talk by Mufan Li (Princeton) [Deep Learning Theory Team Seminar]

Tue, 20 May 2025 14:00 - 15:00 JST

Room 522 (5th floor), Faculty of Engineering Building 14, Hongo Campus, University of Tokyo, 7 Chome-3-1 Hongo, Bunkyō, Tokyo, 113-8654, Japan

Free admission

Description

Speaker: Mufan Li (Princeton)

Title: The Proportional Scaling Limit of Neural Networks

Abstract: Recent advances in deep learning performance have relied heavily on scaling up the number of parameters in neural networks, making asymptotic scaling limits a compelling approach to theoretical analysis. In this talk, we explore the proportional infinite-depth-and-width limit, in which the role of depth can be studied adequately while the limit remains a faithful model of finite-size networks. At initialization, we characterize the limiting distribution of the network via a stochastic differential equation (SDE) for the feature covariance matrix. Furthermore, in the linear-network setting, we can also characterize the spectrum of the covariance matrix in the large-data limit via a geometric variant of Dyson Brownian motion. Finally, we will briefly discuss ongoing work towards analyzing training dynamics.
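
For a concrete handle on the central object in the abstract, the following is a minimal numerical sketch (not the speaker's code): it pushes two inputs through a fully connected ReLU network at random initialization, with depth and width held in fixed proportion, and records the empirical feature covariance matrix layer by layer. The width, depth-to-width ratio, activation, and initialization below are illustrative assumptions only; rerunning with different seeds shows the final covariance remaining random rather than concentrating, which is the kind of depth-indexed randomness the covariance SDE in the abstract describes.

# Illustrative sketch of the proportional regime: depth d and width n grow together
# with d / n fixed, and we track the empirical feature covariance of two inputs.
import numpy as np

rng = np.random.default_rng(0)

n = 400             # width (illustrative)
ratio = 0.5         # depth-to-width ratio d / n, held fixed in the proportional limit
d = int(ratio * n)  # depth

# Two inputs, each normalized to unit average squared entry.
x = rng.standard_normal((n, 2))
h = x / np.sqrt((x ** 2).mean(axis=0, keepdims=True))

covs = []
for layer in range(d):
    W = rng.standard_normal((n, n)) / np.sqrt(n)  # i.i.d. Gaussian weights, variance 1/n
    h = np.sqrt(2.0) * np.maximum(W @ h, 0.0)     # ReLU with He-style scaling
    covs.append(h.T @ h / n)                      # empirical 2x2 feature covariance

print("feature covariance after the final layer:")
print(covs[-1])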

(The Zoom URL for hybrid broadcasting will be sent just before the seminar starts)

About this community

RIKEN AIP Public

Public events of RIKEN Center for Advanced Intelligence Project (AIP)