

Linear Mixing Random Measures Based Mixture Models

Volume 13, Number 2, March 2017 - Paper 11 - pp. 221-230
DOI: 10.23940/ijpe.17.02.p11.221230


College of Electronic and Information Engineering, Tongji University, Shanghai 201804, China

(Received on September 03, 2016, revised on October 16, 2016)


When observations are organized into groups with commonalities among them, traditional clustering models cannot discover clusters shared across groups. In this scenario, clustering based on dependent normalized random measures is a natural choice. The key property of the proposed linear mixing random measures (LMRM) based clustering is that clusters are assumed to be shared across groups, so the problem is addressed directly. We derive the appropriate exchangeable partition probability function and subsequently deduce its inference algorithm given any mixture-model likelihood. We provide all necessary derivations and solutions for this framework. For demonstration, we use a mixture-of-Gaussians likelihood combined with a dependent structure constructed from linear combinations of completely random measures. Our experiments show superior performance with this framework: the inferred values, including both the mixing weights and the number of clusters, respond appropriately to the number of completely random measures used.
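The generative construction described in the abstract can be illustrated with a short simulation. The sketch below is our own assumption-laden illustration, not the paper's inference algorithm: it approximates each completely random measure by a truncated Gamma process (truncation level `K` is an assumption), mixes the measures linearly per group, and normalizes to obtain group-specific mixing weights over a shared set of Gaussian atoms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sketch: simulate from a mixture of Gaussians whose group-level
# mixing weights come from linear combinations of D independent completely
# random measures (CRMs), approximated here by truncated Gamma processes.
D, K, G = 3, 20, 4          # number of CRMs, truncation level, number of groups

# Each CRM contributes K unnormalized atom weights; atoms are shared across groups.
crm_weights = rng.gamma(shape=1.0 / K, scale=1.0, size=(D, K))
atoms = rng.normal(0.0, 5.0, size=K)            # Gaussian component means

# Each group mixes the D CRMs with nonnegative coefficients, then normalizes.
mix_coeffs = rng.gamma(1.0, 1.0, size=(G, D))
group_weights = mix_coeffs @ crm_weights        # (G, K), unnormalized
group_weights /= group_weights.sum(axis=1, keepdims=True)

# Sample observations per group from the induced mixture of Gaussians.
n_per_group = 50
data = []
for g in range(G):
    z = rng.choice(K, size=n_per_group, p=group_weights[g])  # cluster labels
    x = rng.normal(atoms[z], 1.0)                            # unit-variance components
    data.append((z, x))

# Because atoms are shared, the same cluster can appear in multiple groups.
used = [set(z) for z, _ in data]
print("clusters shared by groups 0 and 1:", sorted(used[0] & used[1]))
```

Because every group's weights are built from the same `D` underlying measures, increasing `D` loosens the dependence between groups while the shared atoms still allow clusters to be borrowed across them, which is the behavior the experiments probe.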



