[17th EPFL CIS - RIKEN AIP Joint Seminar] Talks by Babak Falsafi, EPFL CIS

Wed, 27 Jul 2022 15:00 - 16:00 JST
Online Link visible to participants

Registration is closed


Free admission
- Passcode: X6dS8d05Vb
- Time zone: JST
- Seats are available on a first-come, first-served basis.
- When the seats are fully booked, we may stop accepting applications.
- Simultaneous interpretation will not be available.


EPFL CIS and RIKEN AIP have been holding the "EPFL CIS - RIKEN AIP Joint Seminar" series since October 2021.

EPFL is located in Switzerland and is one of the most vibrant and cosmopolitan science and technology institutions. EPFL has both a Swiss and international vocation and focuses on three missions: teaching, research and innovation.

The Center for Intelligent Systems (CIS) at EPFL, a joint initiative of the schools ENAC, IC, SB, STI and SV, seeks to advance research and practice in the strategic field of intelligent systems.

RIKEN is Japan's largest comprehensive research institution, renowned for high-quality research in a diverse range of scientific disciplines.

RIKEN Center for Advanced Intelligence Project (AIP) houses more than 30 research teams ranging from fundamentals of machine learning and optimization, applications in medicine, materials, and disaster, to analysis of ethics and social impact of artificial intelligence.

【The 17th Seminar】

Date and Time: July 27th, 3:00pm – 4:00pm (JST)
Venue: Zoom webinar

Language: English

Speaker: Babak Falsafi, EPFL CIS

Title: Numerical Encoding for DNN Accelerators

With the emergence of big data and the increased integration of deep learning into modern digital platforms, real-world applications of deep neural networks (DNNs) have become ubiquitous. While DNNs and their datasets are exploding in size, Moore's Law, which dictated decades of silicon density scaling since the 70's, is slowing down, steering researchers and vendors away from general-purpose platforms and towards accelerator design. Unfortunately, the two workloads laying the foundation for DNNs, training and inference, have vastly different computational and storage requirements, resulting in divergent accelerator designs. In this talk, I will present HBFP, a novel mixed-precision encoding for both training and inference that enables linear improvements in storage and bandwidth and quadratic improvements in arithmetic logic for accelerators. Moreover, HBFP allows for a unified training and inference accelerator design, mitigating the divergence in accelerators and eliminating quantization when moving from training to inference. The talk will include an overview of the ColTrain project at EPFL, which has been partly funded by Microsoft.
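As background for the abstract (this is not Prof. Falsafi's exact scheme), HBFP builds on block floating point: a block of values shares a single exponent, and each value keeps only a short fixed-point mantissa, which is why storage shrinks linearly and multiplier area shrinks quadratically with mantissa width. A minimal NumPy sketch of that underlying idea, with illustrative block and mantissa sizes:

```python
import numpy as np

def bfp_quantize(x, block_size=16, mantissa_bits=8):
    """Quantize a 1-D array to block floating point: each block of
    `block_size` values shares one exponent, and every value is
    rounded to a `mantissa_bits`-wide fixed-point mantissa."""
    pad = (-len(x)) % block_size
    blocks = np.pad(x, (0, pad)).reshape(-1, block_size)
    # Shared exponent per block: taken from the largest magnitude.
    max_abs = np.max(np.abs(blocks), axis=1, keepdims=True)
    exp = np.where(max_abs > 0, np.floor(np.log2(max_abs)), 0.0)
    scale = 2.0 ** (exp - (mantissa_bits - 1))
    # Fixed-point mantissas: round to the shared scale, then clamp
    # to the signed range representable in `mantissa_bits` bits.
    lo, hi = -(2 ** (mantissa_bits - 1)), 2 ** (mantissa_bits - 1) - 1
    mantissa = np.clip(np.round(blocks / scale), lo, hi)
    return (mantissa * scale).reshape(-1)[:len(x)]
```

Because the exponent is amortized over the whole block, the dominant dot-product arithmetic reduces to fixed-point multiply-accumulates, which is the property the abstract's "quadratic improvements in arithmetic logic" refers to.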

Babak Falsafi is a Professor at EPFL and the founder of EcoCloud. His contributions to computer systems include the first NUMA multiprocessors built by Sun Microsystems, memory streaming integrated in IBM BlueGene (temporal) and ARM cores (spatial), and performance evaluation methodologies that are in use by AMD, HP and Google PerfKit. He has shown that memory consistency models are neither necessary nor sufficient to achieve high performance in servers; these results led to fence speculation in modern CPUs. His work on workload-optimized server processors laid the foundation for the first generation of Cavium ARM server CPUs, ThunderX. He is a recipient of an Alfred P. Sloan Research Fellowship and a fellow of ACM and IEEE.

All participants are required to agree with the AIP Seminar Series Code of Conduct.
Please see the URL below.

RIKEN AIP expects adherence to this code throughout the event. We ask for the cooperation of all participants in helping to ensure a safe environment for everybody.

About this community



Public events of RIKEN Center for Advanced Intelligence Project (AIP)
