Alon, U., Barkai, N., Notterman, D.A., Gish, K., Ybarra, S. et al. (1999). Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays. Proceedings of the National Academy of Sciences 96, 6745–6750.
Andrews, J.L. and McNicholas, P.D. (2010). Extending mixtures of multivariate t-factor analyzers. Statistics and Computing 21, 361–373.
Akaike, H. (1973). Information theory and an extension of the maximum likelihood principle. In B.N. Petrov and F. Csáki (eds.), Second International Symposium on Information Theory, 267–281. Budapest: Akadémiai Kiadó.
Auguie, B. (2013). R gridExtra package: Functions in Grid graphics. R package version 0.9.1.
Baek, J., McLachlan, G.J. and Flack, L.K. (2010). Mixtures of factor analyzers with common factor loadings: applications to the clustering and visualization of high-dimensional data. IEEE Transactions on Pattern Analysis and Machine Intelligence 32, 1298–1309.
Baek, J. and McLachlan, G.J. (2011). Mixtures of common t-factor analyzers for clustering high-dimensional microarray data. Bioinformatics 27(9), 1269–1276.
Banfield, J.D. and Raftery, A.E. (1993). Model-based Gaussian and non-Gaussian clustering. Biometrics 49, 803–821.
Berger, J.O. (1985). Statistical Decision Theory and Bayesian Analysis. New York: Springer.
Biernacki, C., Celeux, G. and Govaert, G. (2000). Assessing a mixture model for clustering with the integrated completed likelihood. IEEE Transactions on Pattern Analysis and Machine Intelligence 22, 719–725.
Brooks, S.P. (2002). Discussion on the paper by Spiegelhalter, D.J., Best, N.G., Carlin, B.P. and van der Linde, A. (2002). Journal of the Royal Statistical Society, Series B 64(3), 616–618.
Hurley, C. (2012). R gclus package: Clustering Graphics. R package version 1.3.1.
Diebolt, J. and Robert, C. (1994). Estimation of finite mixtures through Bayesian sampling. Journal of the Royal Statistical Society, Series B 56, 363–375.
Kim, D.K. and Taylor, J.M.G. (1995). The restricted EM algorithm for maximum likelihood estimation under linear restrictions on the parameters. Journal of the American Statistical Association 90, 707–716.
Sarkar, D. (2013). R lattice package: Lattice Graphics. R package version 0.20-15.
Flury, B.N. (1984). Common principal components in k groups. Journal of the American Statistical Association 79, 892–898.
Forina, M., Armanino, C., Castino, M. and Ubigli, M. (1986). Multivariate data analysis as a discriminating method of the origin of wines. Vitis 25, 189–201.
Fokoué, E. and Titterington, D.M. (2003). Mixtures of factor analysers: Bayesian estimation and inference by stochastic simulation. Machine Learning 50, 73–94.
Frühwirth-Schnatter, S. and Pyne, S. (2010). Bayesian inference for finite mixtures of univariate and multivariate skew-normal and skew-t distributions. Biostatistics 11, 317–336.
Frühwirth-Schnatter, S. (2006). Finite Mixture and Markov Switching Models. Springer Series in Statistics. New York: Springer.
Galimberti, G., Montanari, A. and Viroli, C. (2009). Penalized factor mixture analysis for variable selection in clustered data. Computational Statistics & Data Analysis 53, 4301–4310.
Geman, S. and Geman, D. (1984). Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence 6, 721–741.
Warnes, G.R., Bolker, B., Bonebakker, L., Gentleman, R., Huber, W., Liaw, A., Lumley, T., Maechler, M., Magnusson, A., Moeller, S., Schwartz, M. and Venables, B. (2013). R gplots package: Various R programming tools for plotting data. R package version 2.11.0.1.
Ghahramani, Z. and Beal, M.J. (2000). Variational inference for Bayesian mixtures of factor analysers. In Advances in Neural Information Processing Systems 12. Cambridge, MA: MIT Press.
Ghahramani, Z. and Jordan, M.I. (1994). Supervised learning from incomplete data via an EM approach. In J.D. Cowan, G. Tesauro and J. Alspector (eds.), Advances in Neural Information Processing Systems 6, 120–127. San Francisco: Morgan Kaufmann.
Hubert, L. and Arabie, P. (1985). Comparing partitions. Journal of Classification 2, 193–218.
Hinton, G., Dayan, P. and Revow, M. (1997). Modeling the manifolds of images of handwritten digits. IEEE Transactions on Neural Networks 8, 65–73.
Lee, W.L., Chen, Y.C. and Hsieh, K.S. (2003). Ultrasonic liver tissues classification by fractal feature vector based on M-band wavelet transform. IEEE Transactions on Medical Imaging 22, 382–392.
Lin, T.I. (2009). Maximum likelihood estimation for multivariate skew normal mixture models. Journal of Multivariate Analysis 100, 257–265.
Lin, T.I. (2010). Robust mixture modeling using multivariate skew t distributions. Statistics and Computing 20, 343–356.
Lin, T.I., Lee, J.C. and Ho, H.J. (2006). On fast supervised learning for normal mixture models with missing information. Pattern Recognition 39, 1177–1187.
Lopes, H.F. and West, M. (2004). Bayesian model assessment in factor analysis. Statistica Sinica 14, 41–67.
Martella, F. (2006). Classification of microarray data with factor mixture models. Bioinformatics 22, 202–208.
McLachlan, G.J., Bean, R.W. and Peel, D. (2002). A mixture model-based approach to the clustering of microarray expression data. Bioinformatics 18, 413–422.
McLachlan, G.J. and Peel, D. (2000). Finite Mixture Models. New York: Wiley.
McLachlan, G.J., Peel, D. and Bean, R.W. (2003). Modelling high-dimensional data by mixtures of factor analyzers. Computational Statistics & Data Analysis 41, 379–388.
McNicholas, P.D. and Murphy, T.B. (2008). Parsimonious Gaussian mixture models. Statistics and Computing 18, 285–296.
Newton, M.A. and Raftery, A.E. (1994). Approximate Bayesian inference with the weighted likelihood bootstrap (with discussion). Journal of the Royal Statistical Society, Series B 56, 3–48.
Press, S.J. and Shigemasu, K. (1989). Bayesian inference in factor analysis. In Contributions to Probability and Statistics. New York: Springer-Verlag.
R Development Core Team (2009). R: A Language and Environment for Statistical Computing. Vienna, Austria: R Foundation for Statistical Computing. ISBN 3-900051-07-0. URL http://www.R-project.org.
Richardson, S. and Green, P.J. (1997). On Bayesian analysis of mixtures with an unknown number of components (with discussion). Journal of the Royal Statistical Society, Series B 59, 731–792.
Roberts, S.J., Husmeier, D., Rezek, I. and Penny, W. (1998). Bayesian approaches to Gaussian mixture modeling. IEEE Transactions on Pattern Analysis and Machine Intelligence 20, 1133–1142.
Rubin, D.B. (1987). Using the SIR algorithm to simulate posterior distributions. Bayesian Statistics 3, 395–402.
Schwarz, G. (1978). Estimating the dimension of a model. Annals of Statistics 6, 461–464.
Spiegelhalter, D.J., Best, N.G., Carlin, B.P. and van der Linde, A. (2002). Bayesian measures of model complexity and fit. Journal of the Royal Statistical Society, Series B 64, 583–639.
Brooks, S.P. and Gelman, A. (1998). General methods for monitoring convergence of iterative simulations. Journal of Computational and Graphical Statistics 7, 434–456.
Tan, M., Tian, G.L. and Ng, K.W. (2003). A noniterative sampling method for computing posteriors in the structure of EM-type algorithms. Statistica Sinica 13, 625–639.
Utsugi, A. and Kumagai, T. (2001). Bayesian analysis of mixtures of factor analyzers. Neural Computation 13, 993–1002.
Wang, W.L. and Fan, T.H. (2012). Bayesian analysis of multivariate t linear mixed models using a combination of IBF and Gibbs samplers. Journal of Multivariate Analysis 105, 300–310.
Wickham, H. and Chang, W. (2013). R ggplot2 package: An implementation of the grammar of graphics. R package version 0.9.3.1.
Xie, B., Pan, W. and Shen, X. (2010). Penalized mixtures of factor analyzers with application to clustering high-dimensional microarray data. Bioinformatics 26, 501–508.
Zhang, Z., Chan, K.L., Wu, Y. and Chen, C. (2004). Learning a multivariate Gaussian mixture model with the reversible jump MCMC algorithm. Statistics and Computing 14, 343–355.
Di Zio, M., Guarnera, U. and Luzi, O. (2007). Imputation through finite Gaussian mixture models. Computational Statistics & Data Analysis 51, 5305–5316.