This is an online seminar.
Registration is required.
【Speaker】Jung Yohan
Korea Advanced Institute of Science & Technology, Information & Electronics Research
【Title】
Approximate Bayesian Inference for Stationary Priors and Its Application with Neural Processes
【Abstract】
Stationarity, the assumption that the statistical properties of a dataset remain constant over time, is a widely used inductive bias for modeling data. Gaussian Processes (GPs) have frequently been employed to model this stationarity, using a stationary kernel function for the covariance structure. However, for a given dataset, selecting an appropriate kernel function from the extensive class of stationary kernels remains a challenge.
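(Background note, not part of the speaker's abstract: for a GP, stationarity means the kernel depends only on the difference between inputs, so the covariance structure is shift-invariant:

k(x, x') = Cov(f(x), f(x')) = k_s(x - x').)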
In this talk, I will first introduce the construction of a flexible kernel function, based on Bochner's theorem, that is capable of modeling any stationary process. I will then present an approximate Bayesian inference method designed to train the parameters of this flexible kernel. Next, I will discuss how the Neural Process (NP), a model built on deep neural networks (DNNs), can be used to model stationary processes. Finally, I will present a Bayesian extension of the NP that leverages task-dependent stationary priors to improve the modeling of stationary processes.
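(A brief sketch of the standard result underlying such constructions, given as background only; the speaker's specific parameterization may differ. Bochner's theorem states that a continuous stationary kernel on R^d is positive definite if and only if it is the Fourier transform of a finite non-negative spectral measure:

k(\tau) = \int_{\mathbb{R}^d} e^{2\pi i \, \omega^\top \tau} \, d\mu(\omega), \qquad \tau = x - x'.

Parameterizing the spectral measure with a flexible family, for instance a mixture of Gaussians as in spectral mixture kernels, therefore yields kernels that can approximate any stationary covariance, which is what makes a Bochner-based construction able to model any stationary process.)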
Public events of RIKEN Center for Advanced Intelligence Project (AIP)