
Talk by Dr. Xiangming Meng (Huawei Technologies Co., Ltd.)

2019-02-27 (Wed) 10:00 - 11:00 JST

RIKEN Center for Advanced Intelligence Project (AIP), Open Space

Nihonbashi 1-chome Mitsui Building 15F, 1-4-1 Nihonbashi, Chuo-ku, Tokyo 103-0027


Admission: Free

Details

Speaker: Dr. Xiangming Meng (Huawei Technologies Co., Ltd.)

Title: Approximate Bayesian Inference for Generalized Linear Models: A Message Passing Approach.

Abstract:
A variety of fundamental problems in information theory and machine learning can be formulated as Bayesian inference for generalized linear models (GLM). In practice, however, exact Bayesian inference is very challenging and in fact intractable, since (1) there are no closed-form solutions for most problems of interest and (2) the computational complexity grows exponentially with dimension, a phenomenon known as the curse of dimensionality in big-data settings. In this talk, we resort to a class of approximate Bayesian inference methods called message passing. Specifically, we first focus on a special and simple case of GLM, the standard linear model (SLM), whose likelihood is Gaussian. In this setting, the well-known approximate message passing (AMP) algorithm is derived from expectation propagation (EP), thereby explicitly establishing the intrinsic connections between AMP and EP. Then, to tackle the more general problem of GLM inference, we propose a unified Bayesian inference framework which iteratively reduces the original GLM to a sequence of simpler pseudo-SLM problems. This unified framework provides new perspectives on some existing GLM inference algorithms and, more importantly, significantly facilitates the design of novel GLM inference algorithms from SLM ones, e.g., sparse Bayesian learning (SBL). Finally, we consider a more challenging extension of GLM called generalized bilinear models, in which, unlike GLM, the linear mixing matrix is also unknown. To address this challenging problem, using a combination of expectation maximization (EM) and EP within the above unified framework, we propose an efficient algorithm called bilinear adaptive generalized vector approximate message passing (BAd-GVAMP). Numerical simulation results demonstrate that the proposed BAd-GVAMP achieves near-oracle performance in compressed sensing with matrix uncertainty and structured dictionary learning from nonlinear measurements.
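For readers unfamiliar with the SLM special case referenced above, the following is a minimal NumPy sketch of a textbook AMP iteration (matched-filter step, elementwise denoiser, Onsager correction) for a sparse signal observed through a Gaussian measurement matrix. It is an illustration only, not material from the talk: the soft-thresholding denoiser, the fixed threshold, and the toy problem sizes are assumptions chosen for simplicity.

```python
import numpy as np

def soft_threshold(v, theta):
    """Elementwise soft-thresholding denoiser (a simple sparsity-promoting choice)."""
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def amp_slm(y, A, n_iter=50, theta=0.05):
    """Toy AMP sketch for the SLM y = A x + noise.

    Assumes A has i.i.d. zero-mean Gaussian entries with variance 1/m and
    uses a fixed threshold `theta` instead of the adaptively tuned value
    used in practical AMP variants.
    """
    m, n = A.shape
    x = np.zeros(n)          # current signal estimate
    z = y.copy()             # corrected residual (x = 0 initially)
    for _ in range(n_iter):
        r = x + A.T @ z                      # pseudo-data for the denoiser
        x_new = soft_threshold(r, theta)     # elementwise denoising step
        eta_prime = np.mean(np.abs(r) > theta)   # average denoiser derivative
        z = y - A @ x_new + (n / m) * z * eta_prime  # residual + Onsager term
        x = x_new
    return x

# Toy usage: recover a sparse vector from random Gaussian measurements
rng = np.random.default_rng(0)
n, m, k = 500, 250, 25
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)
y = A @ x_true + 0.01 * rng.normal(size=m)
x_hat = amp_slm(y, A)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

The Onsager correction term is what distinguishes AMP from plain iterative thresholding and is also what the EP derivation mentioned in the abstract recovers for the Gaussian-likelihood case.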

About the community

RIKEN AIP Public

Public events of RIKEN Center for Advanced Intelligence Project (AIP)
