Abstract
Minimum Message Length (MML) is an invariant Bayesian point estimation technique which is also statistically consistent and efficient. We provide a brief overview of MML inductive inference (Wallace C.S. and Boulton D.M. 1968. Computer Journal 11: 185–194; Wallace C.S. and Freeman P.R. 1987. J. Royal Statistical Society (Series B) 49: 240–252; Wallace C.S. and Dowe D.L. 1999. Computer Journal 42(4): 270–283), and how it has both an information-theoretic and a Bayesian interpretation. We then outline how MML is used for statistical parameter estimation, and how the MML mixture modelling program, Snob (Wallace C.S. and Boulton D.M. 1968. Computer Journal 11: 185–194; Wallace C.S. 1986. In: Proceedings of the Ninth Australian Computer Science Conference (ACSC-9), Vol. 8, Monash University, Australia, pp. 357–366; Wallace C.S. and Dowe D.L. 1994b. In: Zhang C. et al. (Eds.), Proc. 7th Australian Joint Conf. on Artif. Intelligence. World Scientific, Singapore, pp. 37–44. See http://www.csse.monash.edu.au/~dld/Snob.html), uses the message lengths from various parameter estimates to combine parameter estimation with selection of the number of components and estimation of the relative abundances of the components. The message length is (to within a constant) the negative logarithm of the posterior probability (not a posterior density) of the theory, so the MML theory can also be regarded as the theory with the highest posterior probability. Snob currently assumes that variables are uncorrelated within each component, and permits multi-variate data from Gaussian, discrete multi-category (or multi-state or multinomial), Poisson and von Mises circular distributions, as well as missing data. Additionally, Snob can do fully-parameterised mixture modelling, estimating the latent class assignments in addition to estimating the number of components, the relative abundances of the components and the component parameters.
We also report on extensions of Snob for data which has sequential or spatial correlations between observations, or correlations between attributes.
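The correspondence stated above, that the shortest message (to within a constant, the negative log posterior probability) identifies the theory of highest posterior probability, can be sketched on a toy discrete hypothesis space. The hypothesis names, priors and likelihoods below are invented for illustration only; they are not from the paper:

```python
import math

# Hypothetical hypotheses about the number of mixture components,
# each with a prior P(H) and a likelihood P(D|H) for some fixed data D.
hypotheses = {
    "one component":  (0.5, 1e-6),
    "two components": (0.3, 1e-3),
    "ten components": (0.2, 2e-3),
}

def message_length_bits(prior, likelihood):
    # Two-part message: -log2 P(H) bits to state the theory,
    # then -log2 P(D|H) bits to encode the data given the theory.
    return -math.log2(prior) - math.log2(likelihood)

lengths = {h: message_length_bits(p, l) for h, (p, l) in hypotheses.items()}
posteriors = {h: p * l for h, (p, l) in hypotheses.items()}  # unnormalised P(H)P(D|H)

best_mml = min(lengths, key=lengths.get)   # shortest two-part message
best_map = max(posteriors, key=posteriors.get)  # highest posterior probability
assert best_mml == best_map
```

Because the message length is the negative log of the (unnormalised) posterior probability, minimising one is exactly maximising the other; for continuous parameters, MML additionally discretises the parameter space so that a genuine probability, not a density, is encoded.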
References
Barron A.R. and Cover T.M. 1991. Minimum complexity density estimation. IEEE Transactions on Information Theory 37: 1034–1054.
Baxter R.A. and Oliver J.J. 1997. Finding overlapping distributions with MML. Statistics and Computing 10(1): 5–16.
Boulton D.M. 1975. The information criterion for intrinsic classification. Ph.D. Thesis, Dept. Computer Science, Monash University, Australia.
Boulton D.M. and Wallace C.S. 1969. The information content of a multistate distribution. Journal of Theoretical Biology 23: 269–278.
Boulton D.M. and Wallace C.S. 1970. A program for numerical classification. Computer Journal 13: 63–69.
Boulton D.M. and Wallace C.S. 1973a. An information measure for hierarchic classification. The Computer Journal 16: 254–261.
Boulton D.M. and Wallace C.S. 1973b. A comparison between information measure classification. In: Proceedings of ANZAAS Congress, Perth.
Boulton D.M. and Wallace C.S. 1975. An information measure for single-link classification. The Computer Journal 18(3): 236–238.
Chaitin G.J. 1966. On the length of programs for computing finite sequences. Journal of the Association for Computing Machinery 13: 547–549.
Cheeseman P., Self M., Kelly J., Taylor W., Freeman D., and Stutz J. 1988. Bayesian classification. In: Seventh National Conference on Artificial Intelligence, Saint Paul, Minnesota, pp. 607–611.
Conway J.H. and Sloane N.J.A. 1988. Sphere Packings, Lattices and Groups. London, Springer Verlag.
Dellaportas P., Karlis D., and Xekalaki E. 1997. Bayesian Analysis of Finite Poisson Mixtures. Technical Report No. 32. Department of Statistics, Athens University of Economics and Business, Greece.
Dowe D.L., Allison L., Dix T.I., Hunter L., Wallace C.S., and Edgoose T. 1996. Circular clustering of protein dihedral angles by minimum message length. In: Proc. 1st Pacific Symp. Biocomp., HI, U.S.A., pp. 242–255.
Dowe D.L., Baxter R.A., Oliver J.J., and Wallace C.S. 1998. Point estimation using the Kullback-Leibler loss function and MML. In: Proc. 2nd Pacific Asian Conference on Knowledge Discovery and Data Mining (PAKDD'98), Melbourne, Australia. Springer Verlag, pp. 87–95.
Dowe D.L. and Korb K.B. 1996. Conceptual difficulties with the efficient market hypothesis: towards a naturalized economics. In: Dowe D.L., Korb K.B., and Oliver J.J. (Eds.), Proceedings of the Information, Statistics and Induction in Science (ISIS) Conference, Melbourne, Australia. World Scientific, pp. 212–223.
Dowe D.L., Oliver J.J., Baxter R.A., and Wallace C.S. 1995. Bayesian estimation of the von Mises concentration parameter. In: Proc. 15th Maximum Entropy Conference, Santa Fe, New Mexico.
Dowe D.L., Oliver J.J., and Wallace C.S. 1996. MML estimation of the parameters of the spherical Fisher distribution. In: Sharma A. et al. (Eds.), Proc. 7th Conf. Algorithmic Learning Theory (ALT'96), LNAI 1160, Sydney, Australia, pp. 213–227.
Dowe D.L. and Wallace C.S. 1997. Resolving the Neyman-Scott problem by minimum message length. In: Proc. Computing Science and Statistics – 28th Symposium on the Interface, Vol. 28, pp. 614–618.
Dowe D.L. and Wallace C.S. 1998. Kolmogorov complexity, minimum message length and inverse learning. In: Proc. 14th Australian Statistical Conference (ASC-14), Gold Coast, Qld., Australia, pp. 144.
Edgoose T.C. and Allison L. 1998. Unsupervised Markov classification of sequenced data using MML. In: McDonald C. (Ed.), Proc. 21st Australasian Computer Science Conference (ACSC'98), Singapore. Springer-Verlag, ISBN: 981-3083-90-5, pp. 81–94.
Edgoose T.C., Allison L., and Dowe D.L. 1998. An MML classification of protein structure that knows about angles and sequences. In: Proc. 3rd Pacific Symp. Biocomp. (PSB-98), HI, U.S.A., pp. 585–596.
Edwards R.T. and Dowe D.L. 1998. Single factor analysis in MML mixture modelling. In: Proc. 2nd Pacific Asian Conference on Knowledge Discovery and Data Mining (PAKDD'98), Melbourne, Australia. Springer Verlag, pp. 96–109.
Everitt B.S. and Hand D.J. 1981. Finite Mixture Distributions. London, Chapman and Hall.
Fisher D.H. 1987. Conceptual clustering, learning from examples, and inference. In: Machine Learning: Proceedings of the Fourth International Workshop. Morgan Kaufmann, pp. 38–49.
Fisher N.I. 1993. Statistical Analysis of Circular Data. Cambridge University Press.
Fraley C. and Raftery A.E. 1998. Mclust: software for model-based clustering and discriminant analysis. Technical Report TR 342, Department of Statistics, University of Washington, U.S.A. Journal of Classification, to appear.
Georgeff M.P. and Wallace C.S. 1984. A general criterion for inductive inference. In: O'Shea T. (Ed.), Advances in Artificial Intelligence: Proc. Sixth European Conference on Artificial Intelligence, Amsterdam. North Holland, pp. 473–482.
Hunt L.A. and Jorgensen M.A. 1999. Mixture model clustering using the multimix program. Australian and New Zealand Journal of Statistics 41(2): 153–171.
Jorgensen M.A. and Hunt L.A. 1996. Mixture modelling clustering of data sets with categorical and continuous variables. In: Dowe D.L., Korb K.B., and Oliver J.J. (Eds.), Proceedings of the Information, Statistics and Induction in Science (ISIS) Conference, Melbourne, Australia. World Scientific, pp. 375–384.
Kearns M., Mansour Y., Ng A. Y., and Ron D. 1997. An experimental and theoretical comparison of model selection methods. Machine Learning 27: 7–50.
Kissane D.W., Bloch S., Dowe D.L., Snyder R.D., Onghena P., McKenzie D.P., and Wallace C.S. 1996. The Melbourne family grief study, I: Perceptions of family functioning in bereavement. American Journal of Psychiatry 153: 650–658.
Mardia K.V. 1972. Statistics of Directional Data. Academic Press.
McLachlan G.J. 1992. Discriminant Analysis and Statistical Pattern Recognition. New York, Wiley.
McLachlan G.J. and Basford K.E. 1988. Mixture Models. New York, Marcel Dekker.
McLachlan G.J. and Krishnan T. 1996. The EM Algorithm and Extensions. New York, Wiley.
McLachlan G.J., Peel D., Basford K.E., and Adams P. 1999. The EMMIX software for the fitting of mixtures of Normal and t-components. Journal of Statistical Software 4, 1999.
Neal R.M. 1998. Markov chain sampling methods for Dirichlet process mixture models. Technical Report 9815, Dept. of Statistics and Dept. of Computer Science, University of Toronto, Canada, pp. 17.
Neyman J. and Scott E.L. 1948. Consistent estimates based on partially consistent observations. Econometrica 16: 1–32.
Oliver J., Baxter R., and Wallace C. 1996. Unsupervised learning using MML. In: Proc. 13th International Conf. Machine Learning (ICML 96), San Francisco, CA. Morgan Kaufmann, pp. 364–372.
Oliver J.J. and Dowe D.L. 1996. Minimum message length mixture modelling of spherical von Mises-Fisher distributions. In: Proc. Sydney International Statistical Congress (SISC-96), Sydney, Australia, p. 198.
Patrick J.D. 1991. Snob: A program for discriminating between classes. Technical report TR 151, Dept. of Computer Science, Monash University, Clayton, Victoria 3168, Australia.
Prior M., Eisenmajer R., Leekam S., Wing L., Gould J., Ong B., and Dowe D.L. 1998. Are there subgroups within the autistic spectrum? A cluster analysis of a group of children with autistic spectrum disorders. J. Child Psychol. Psychiat. 39(6): 893–902.
Rissanen J.J. 1978. Modeling by shortest data description. Automatica 14: 465–471.
Rissanen J.J. 1989. Stochastic Complexity in Statistical Inquiry. Singapore, World Scientific.
Rissanen J.J. and Ristad E.S. 1994. Unsupervised classification with stochastic complexity. In: Bozdogan H. et al. (Eds.), Proc. of the First US/Japan Conf. on the Frontiers of Statistical Modeling: An Informational Approach. Kluwer Academic Publishers, pp. 171–182.
Roeder K. 1994. A graphical technique for determining the number of components in a mixture of normals. Journal of the American Statistical Association 89(426): 487–495.
Schou G. 1978. Estimation of the concentration parameter in von Mises-Fisher distributions. Biometrika 65: 369–377.
Solomonoff R.J. 1964. A formal theory of inductive inference. Information and Control 7: 1–22, 224–254.
Solomonoff R.J. 1995. The discovery of algorithmic probability: A guide for the programming of true creativity. In: Vitanyi P. (Ed.), Computational Learning Theory: EuroCOLT'95. Springer-Verlag, pp. 1–22.
Stutz J. and Cheeseman P. 1994. Autoclass: A Bayesian approach to classification. In: Skilling J. and Sibisi S. (Eds.), Maximum Entropy and Bayesian Methods. Dordrecht, Kluwer Academic.
Titterington D.M., Smith A.F.M., and Makov U.E. 1985. Statistical Analysis of Finite Mixture Distributions. John Wiley and Sons, Inc.
Vapnik V.N. 1995. The Nature of Statistical Learning Theory. Springer.
Viswanathan M. and Wallace C.S. 1999. A note on the comparison of polynomial selection methods. In: Proc. 7th Int. Workshop on Artif. Intelligence and Statistics. Morgan Kaufmann, pp. 169–177.
Viswanathan M., Wallace C.S., Dowe D.L., and Korb K.B. 1999. Finding cutpoints in noisy binary sequences. In: Proc. 12th Australian Joint Conf. on Artif. Intelligence.
Wahba G. 1990. Spline Models for Observational Data. SIAM.
Wallace C.S. 1986. An improved program for classification. In: Proceedings of the Ninth Australian Computer Science Conference (ACSC-9), Vol. 8, Monash University, Australia, pp. 357–366.
Wallace C.S. 1990. Classification by Minimum Message Length inference. In: Goos G. and Hartmanis J. (Eds.), Advances in Computing and Information – ICCI'90. Berlin, Springer-Verlag, pp. 72–81.
Wallace C.S. 1995. Multiple factor analysis by MML estimation. Technical Report 95/218, Dept. of Computer Science, Monash University, Clayton, Victoria 3168, Australia. J. Multiv. Analysis, (to appear).
Wallace C.S. 1989. False Oracles and SMML Estimators. In: Dowe D.L., Korb K.B., and Oliver J.J. (Eds.), Proceedings of the Information, Statistics and Induction in Science (ISIS) Conference, Melbourne, Australia. World Scientific, pp. 304–316. Tech. Rept. 89/128, Dept. Comp. Sci., Monash Univ., Australia.
Wallace C.S. 1998. Intrinsic Classification of Spatially-Correlated Data. Computer Journal 41(8): 602–611.
Wallace C.S. and Boulton D.M. 1968. An information measure for classification. Computer Journal 11: 185–194.
Wallace C.S. and Boulton D.M. 1975. An invariant Bayes method for point estimation. Classification Society Bulletin 3(3): 11–34.
Wallace C.S. and Dowe D.L. 1993. MML estimation of the von Mises concentration parameter. Technical Report TR 93/193, Dept. of Comp. Sci., Monash Univ., Clayton 3168, Australia. Aust. and N.Z. J. Stat, prov. accepted.
Wallace C.S. and Dowe D.L. 1994a. Estimation of the von Mises concentration parameter using minimum message length. In: Proc. 12th Australian Statistical Soc. Conf., Monash University, Australia.
Wallace C.S. and Dowe D.L. 1994b. Intrinsic classification by MML – the Snob program. In: Zhang C. et al. (Eds.), Proc. 7th Australian Joint Conf. on Artif. Intelligence. World Scientific, Singapore, pp. 37–44. See http://www.csse.monash.edu.au/~dld/Snob.html.
Wallace C.S. and Dowe D.L. 1996. MML mixture modelling of Multistate, Poisson, von Mises circular and Gaussian distributions. In: Proc. Sydney International Statistical Congress (SISC-96), Sydney, Australia, p. 197.
Wallace C.S. and Dowe D.L. 1997. MML mixture modelling of Multistate, Poisson, von Mises circular and Gaussian distributions. In: Proc. 6th Int. Workshop on Artif. Intelligence and Statistics, pp. 529–536.
Wallace C.S. and Dowe D.L. 1999. Minimum Message Length and Kolmogorov Complexity. Computer Journal (Special issue on Kolmogorov Complexity) 42(4): 270–283.
Wallace C.S. and Freeman P.R. 1987. Estimation and inference by compact coding. J. Royal Statistical Society (Series B), 49: 240–252.
Wallace C.S. and Freeman P.R. 1992. Single factor analysis by MML estimation. Journal of the Royal Statistical Society (Series B) 54: 195–209.
Wallace, C.S., Dowe, D.L. MML clustering of multi-state, Poisson, von Mises circular and Gaussian distributions. Statistics and Computing 10, 73–83 (2000). https://doi.org/10.1023/A:1008992619036