Mean field for the stochastic blockmodel: Optimization landscape and convergence issues

Document Type

Conference Article

Publication Title

Advances in Neural Information Processing Systems

Abstract

Variational approximation has recently been widely used in large-scale Bayesian inference; its simplest form imposes a mean field assumption to approximate complicated latent structures. Despite the computational scalability of mean field, theoretical studies of its loss function surface and of the convergence behavior of the iterative updates that optimize it are far from complete. In this paper, we focus on the problem of community detection for a simple two-class Stochastic Blockmodel (SBM). Using batch coordinate ascent variational inference (BCAVI) for the updates, we show that the convergence behavior depends on the initialization. When the model parameters are known, a random initialization can converge to the ground truth, whereas when the parameters themselves must be estimated, a random initialization converges to an uninformative local optimum.
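
To make the setting concrete, below is a minimal sketch (not code from the paper) of BCAVI mean-field updates for a balanced two-class SBM in the known-parameter case, assuming equal class priors, a within-class edge probability p, and a between-class edge probability q; the function names simulate_sbm and bcavi, and all parameter values, are illustrative.

    # Minimal sketch of batch coordinate ascent variational inference (BCAVI)
    # for a balanced two-class SBM with known connectivity parameters.
    # Hypothetical example; not taken from the paper.
    import numpy as np

    def simulate_sbm(n, p, q, rng):
        """Sample a balanced two-class SBM adjacency matrix and its labels."""
        z = rng.permutation(np.repeat([0, 1], n // 2))      # ground-truth labels
        probs = np.where(z[:, None] == z[None, :], p, q)    # p within, q between classes
        A = (rng.random((n, n)) < probs).astype(float)
        A = np.triu(A, 1)
        return A + A.T, z                                   # symmetric, no self-loops

    def bcavi(A, p, q, n_iter=50, rng=None):
        """Batch mean-field updates; psi[i] approximates P(node i is in class 1)."""
        n = A.shape[0]
        rng = rng or np.random.default_rng(0)
        psi = rng.random(n)                                 # random initialization
        # Per-edge log-likelihood ratio of "same class" vs "different class".
        M = A * np.log(p / q) + (1.0 - A) * np.log((1.0 - p) / (1.0 - q))
        np.fill_diagonal(M, 0.0)                            # exclude self-terms
        for _ in range(n_iter):
            score = M @ (2.0 * psi - 1.0)                   # log-odds of class 1
            psi = 1.0 / (1.0 + np.exp(-np.clip(score, -30, 30)))  # synchronous update
        return psi

    rng = np.random.default_rng(42)
    A, z = simulate_sbm(n=200, p=0.6, q=0.2, rng=rng)
    psi = bcavi(A, p=0.6, q=0.2, rng=rng)
    labels = (psi > 0.5).astype(int)
    accuracy = max(np.mean(labels == z), np.mean(labels != z))  # accuracy up to label swap
    print(f"clustering accuracy: {accuracy:.3f}")

Each batch iteration updates every variational probability psi[i] simultaneously from the current values, which is the synchronous update scheme the abstract refers to as BCAVI.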

First Page

10694

Last Page

10704

Publication Date

1-1-2018
