Abstract: MAD-Bayes (MAP-based Asymptotic Derivations) has recently been proposed as a general technique to derive scalable algorithms for Bayesian nonparametric models. However, the combinatorial nature of the objective functions derived from MAD-Bayes results in hard optimization problems, for which current practice employs heuristic algorithms analogous to k-means to find a local minimum. In this paper, we consider the exemplar-based version of the MAD-Bayes formulation for DP and Hierarchical DP (HDP) mixture models. We show that the exemplar-based MAD-Bayes formulation can be relaxed to a convex, structurally regularized program that, under cluster-separation conditions, shares the same optimal solution as its combinatorial counterpart. We then propose an algorithm based on the Alternating Direction Method of Multipliers (ADMM) to solve this program. In experiments on several benchmark data sets, the proposed method finds the optimal solution of the combinatorial problem and significantly improves on existing methods in terms of the exemplar-based objective.
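For orientation, here is a sketch of the kind of exemplar-based objective the abstract refers to, written under the standard MAD-Bayes small-variance asymptotics for the DP mixture; the notation (assignment matrix $Z$, distance matrix $D$, penalty $\lambda$) is illustrative and not necessarily the paper's exact formulation:

$$
\min_{Z \in \{0,1\}^{n \times n}} \;\; \sum_{i,j} D_{ij} Z_{ij} \;+\; \lambda \sum_{j=1}^{n} \max_i Z_{ij}
\qquad \text{s.t.} \;\; \sum_{j} Z_{ij} = 1 \;\; \forall i,
$$

where $D_{ij}$ is the squared distance from point $i$ to candidate exemplar $j$, the row constraints assign each point to exactly one exemplar, and $\sum_j \max_i Z_{ij}$ counts the number of exemplars used, so $\lambda$ plays the role of the DP penalty on the number of clusters. Relaxing $Z$ to $[0,1]^{n \times n}$ turns the column-max penalty into a structural (group-norm-type) regularizer, yielding a convex program of the kind described in the abstract.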
- Topics:
- Data Clustering
- Empirical Risk Minimization
- Graphical Models
- High-Dimensional Statistics
- Topic Models
Download: pdf, slides
Citation
- A Convex Exemplar-based Approach to MAD-Bayes Dirichlet Process Mixture Models (pdf, slides, software)
I. Yen, X. Lin, K. Zhong, P. Ravikumar, I. Dhillon.
In International Conference on Machine Learning (ICML), pp. 2418-2426, July 2015.