Talk by Dr. Xiangming Meng (Huawei Technologies Co., Ltd.)

Wed, 27 Feb 2019 10:00 - 11:00 JST

RIKEN AIP Open

Nihonbashi 1-chome Mitsui Building, 15th floor, 1-4-1 Nihonbashi, Chuo-ku, Tokyo 103-0027, Japan

Description

Speaker: Dr. Xiangming Meng (Huawei Technologies Co., Ltd.)

Title: Approximate Bayesian Inference for Generalized Linear Models: A Message Passing Approach

Abstract:
A variety of fundamental problems in information theory and machine learning can be formulated as Bayesian inference for generalized linear models (GLM). In practice, however, exact Bayesian inference is challenging and often intractable, since (1) closed-form solutions do not exist for most problems of interest, and (2) the computational complexity grows exponentially with the problem dimension, a phenomenon known as the curse of dimensionality in the big-data setting.

In this talk, we resort to a class of approximate Bayesian inference methods known as message passing. Specifically, we first focus on a special, simple case of the GLM, the standard linear model (SLM), whose likelihood is Gaussian. In this setting, the well-known approximate message passing (AMP) algorithm is derived from expectation propagation (EP), explicitly establishing the intrinsic connection between AMP and EP. Then, to tackle the more general problem of GLM inference, we propose a unified Bayesian inference framework that iteratively reduces the original GLM to a sequence of simpler pseudo-SLM problems. This unified framework provides new perspectives on some existing GLM inference algorithms and, more importantly, significantly facilitates the design of novel GLM inference algorithms from SLM ones, e.g., sparse Bayesian learning (SBL).

Finally, we consider a more challenging extension of the GLM, the generalized bilinear model, in which the linear mixing matrix is also unknown. To address this problem, we combine expectation maximization (EM) and EP within the above unified framework and propose an efficient algorithm called bilinear adaptive generalized vector approximate message passing (BAd-GVAMP). Numerical simulation results demonstrate that BAd-GVAMP achieves near-oracle performance in compressed sensing with matrix uncertainty and in structured dictionary learning from nonlinear measurements.
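As background for the SLM case discussed in the talk, the following is a minimal sketch of AMP for a sparse linear model y = Ax + w, using a soft-thresholding denoiser. This is a generic textbook-style illustration, not the speaker's implementation; the threshold-tuning parameter `alpha` and the stopping rule are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, t):
    # Elementwise soft-thresholding denoiser, suited to a sparse prior.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def amp(y, A, n_iter=50, alpha=1.5):
    """Approximate message passing for y = A x + w with a sparse x.

    Assumes the entries of A are i.i.d. with variance 1/m (unit-norm
    columns on average), the regime in which AMP is typically analyzed.
    """
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        # Effective noise level, estimated from the current residual.
        tau = alpha * np.sqrt(np.mean(z**2))
        # Pseudo-data: matched-filter output plus the current estimate.
        r = x + A.T @ z
        x = soft_threshold(r, tau)
        # Onsager correction term: (1/m) * sum of the denoiser's
        # derivatives times the previous residual. This term is what
        # distinguishes AMP from plain iterative thresholding (ISTA).
        onsager = (np.count_nonzero(x) / m) * z
        z = y - A @ x + onsager
    return x
```

The Onsager term keeps the effective input to the denoiser approximately Gaussian across iterations, which is why simple scalar denoisers like soft thresholding work well here and why AMP converges much faster than ISTA on i.i.d. sensing matrices.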

About this community

RIKEN AIP Public

Public events of RIKEN Center for Advanced Intelligence Project (AIP)