Mean field for the stochastic blockmodel: Optimization landscape and convergence issues
Advances in Neural Information Processing Systems
Variational approximation has recently been widely used in large-scale Bayesian inference, the simplest variant of which imposes a mean field assumption to approximate complicated latent structures. Despite the computational scalability of mean field methods, theoretical studies of the loss function surface and of the convergence behavior of the iterative updates used to optimize it are far from complete. In this paper, we focus on the problem of community detection for a simple two-class Stochastic Blockmodel (SBM). Using batch coordinate ascent variational inference (BCAVI) for the updates, we show that the convergence behavior depends on the initialization: when the model parameters are known, a random initialization can converge to the ground truth, whereas when the parameters themselves must be estimated, a random initialization converges to an uninformative local optimum.
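To make the setting concrete, the snippet below is an illustrative sketch of batch mean-field (BCAVI-style) updates for a two-class SBM with known within-class and between-class edge probabilities p and q. It is not the authors' code: the function name, interface, and implementation details are assumptions. Each node i maintains a probability psi[i] of belonging to class 1, and all nodes are updated simultaneously from the previous iterate (the "batch" scheme), using the expected complete-data log-likelihood under the current mean-field distribution.

```python
import numpy as np

def bcavi_two_class(A, p, q, psi0, n_iter=50):
    """Batch mean-field updates for a two-class SBM with known
    connectivity parameters p (within-class) and q (between-class).

    A     : n x n symmetric 0/1 adjacency matrix with zero diagonal
    psi0  : length-n initial probabilities of membership in class 1
    Returns the final membership probabilities after n_iter batch updates.
    """
    n = A.shape[0]
    psi = psi0.copy()
    B = np.array([[p, q], [q, p]])          # block connectivity matrix
    logB, log1mB = np.log(B), np.log(1.0 - B)
    for _ in range(n_iter):
        Psi = np.column_stack([1.0 - psi, psi])   # n x 2 class probabilities
        # S[i, a] = expected log-likelihood of node i's edges/non-edges
        # if node i is in class a, averaged over the other nodes' labels
        S = A @ Psi @ logB.T + (1.0 - A - np.eye(n)) @ Psi @ log1mB.T
        # softmax over the two classes, numerically stabilized
        S -= S.max(axis=1, keepdims=True)
        E = np.exp(S)
        psi = E[:, 1] / E.sum(axis=1)
    return psi
```

With p and q known and a sufficiently strong signal (p well separated from q), running this from a random initialization typically recovers the planted partition up to a label swap, matching the known-parameter regime discussed in the abstract.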
Mukherjee, Soumendu Sundar; Sarkar, Purnamrita; Wang, Y. X. Rachel; and Yan, Bowei, "Mean field for the stochastic blockmodel: Optimization landscape and convergence issues" (2018). Conference Articles. 117.