Doorkeeper

[Computational Learning Theory Team Seminar] Talks of Dr. Dimitra Tsigkari and Dr. Fion Mc Inerney (Telefonica, Spain)

2026-05-22 (Fri) 13:30 - 15:30 JST
Online. The link will be shown only to participants.
Admission free

Details

We are pleased to announce two research talks by Dr. Dimitra Tsigkari and Dr. Fion Mc Inerney (Telefonica, Spain) at the Open Space of the RIKEN Nihonbashi Office. Their talks concern the intersection of machine learning and network optimization, and learning theory, respectively.

Date: May 22, 2026
Time: 13:30–15:00 (a 45-minute talk each, including Q&A)
Online Venue: Open to all registered participants. The seminar will be delivered via Zoom. The URL will be provided only to registered participants.

On-site Venue: For RIKEN members only. Open Space at the RIKEN Nihonbashi Office

Speaker: Dr. Dimitra Tsigkari (Telefonica)
Title: Network Optimization for Split Learning
Abstract: Split Learning (SL) is a modern framework that allows devices (clients) with heterogeneous computing resources to collaboratively train a deep learning model. Relaxing the requirement of Federated Learning (FL), according to which the training takes place at the clients, SL allows clients to offload part of the training to a helper (typically a server in the cloud or at the network edge). This allows heterogeneous clients that lack the necessary resources to train a model locally to still participate in distributed learning, potentially making use of underutilized network resources while accelerating the training process. This talk will focus on the optimization challenges that SL introduces, in particular the minimization of training time, presenting results from our INFOCOM 2024 and 2026 papers. Moreover, we will present our recent work published in AAAI 2026 that investigates the phenomenon of catastrophic forgetting in SL in the presence of data heterogeneity.
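The cut-layer mechanics described in the abstract (client computes up to a cut layer, the helper finishes the forward and backward pass, and the gradient at the cut is sent back) can be sketched in a toy NumPy example. All names, the single-layer split, and the regression task are our own illustrative assumptions, not the speaker's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Client holds the layer before the cut; the helper holds the rest.
params = {
    "client": rng.normal(size=(4, 8)) * 0.1,
    "helper": rng.normal(size=(8, 1)) * 0.1,
}

def train_step(params, x, y, lr=0.1):
    # Client forward: compute activations at the cut ("smashed data"),
    # which are what gets transmitted to the helper.
    z = x @ params["client"]
    a = relu(z)
    # Helper forward: finish the pass and compute the loss.
    pred = a @ params["helper"]
    loss = 0.5 * np.mean((pred - y) ** 2)
    # Helper backward: update helper weights, return gradient at the cut.
    d_pred = (pred - y) / len(y)
    d_a = d_pred @ params["helper"].T
    params["helper"] -= lr * (a.T @ d_pred)
    # Client backward: finish backpropagation locally from d_a.
    d_z = d_a * (z > 0)
    params["client"] -= lr * (x.T @ d_z)
    return loss

# Toy regression data; the loss should decrease over training.
x = rng.normal(size=(32, 4))
y = x[:, :1] * 0.7
losses = [train_step(params, x, y) for _ in range(300)]
```

The choice of where to place the cut trades off client computation against communication (the smashed data and returned gradients cross the network every step), which is one source of the training-time optimization problems the talk addresses.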

Speaker: Dr. Fion Mc Inerney (Telefonica)
Title: Non-Clashing Teaching in Graphs

Abstract: In batch machine teaching models, given a concept class S, for each concept C in S, a teacher presents to a learner a carefully chosen set T(C) of correctly labeled examples from C in such a way that the learner can reconstruct C from T(C). This defines the teaching map T and the teaching sets T(C), for all C in S. The goal is to find a teaching map that minimizes the size of a largest teaching set (the teaching dimension). Non-clashing teaching was recently introduced by Kirkpatrick et al. [ALT 2019] and Fallat et al. [JMLR 2023], and shown to be the most efficient batch machine teaching model satisfying the benchmark for collusion-avoidance set by Goldman and Mathias [COLT 1993]. As any finite binary concept class can be equivalently represented by a set of balls in a graph, we study non-clashing teaching for balls in graphs. We present numerous algorithmic results (NP-hardness, W[1]-hardness, FPT algorithms) and combinatorial results (trees, cycles, interval graphs) for the non-clashing teaching dimension of balls in graphs from our COLT 2024, ICLR 2025, and ICLR 2026 papers, mainly focusing on the natural setting where the teacher can only present positive examples.
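The non-clashing condition can be illustrated with a toy check: a teaching map T is non-clashing if no two distinct concepts C, C' are both consistent with the combined examples T(C) ∪ T(C'). The domain, concept class, and all names below are our own illustrative assumptions (concepts here are intervals, i.e., balls, on a small path graph), not the speaker's code:

```python
from itertools import combinations

def consistent(concept, teaching_set):
    """Does `concept` label every example in `teaching_set` correctly?"""
    return all((x in concept) == label for x, label in teaching_set)

def is_non_clashing(teaching_map):
    """teaching_map: dict mapping frozenset concepts -> set of (x, label).
    Non-clashing: no two distinct concepts agree on T(C) | T(C')."""
    for C, D in combinations(teaching_map, 2):
        union = teaching_map[C] | teaching_map[D]
        if consistent(C, union) and consistent(D, union):
            return False  # clash: the learner cannot tell C and D apart
    return True

# Concept class: three balls (intervals) on the path 0-1-2-3.
concepts = [frozenset({0, 1}), frozenset({1, 2}), frozenset({2, 3})]

# Positive-examples-only teaching map, one example per concept.
T = {
    concepts[0]: {(0, True)},
    concepts[1]: {(1, True)},
    concepts[2]: {(3, True)},
}

# A bad map: teaching {0,1} with example 1 clashes with {1,2}'s set,
# since both concepts contain 1 and are consistent with the union.
T_bad = dict(T)
T_bad[concepts[0]] = {(1, True)}
```

Here T is non-clashing (each teaching set rules out the other two concepts), while T_bad clashes; the teaching dimension of this map is 1, since every teaching set has a single example.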

We look forward to your participation.

About the community

RIKEN AIP Public

Public events of RIKEN Center for Advanced Intelligence Project (AIP)
