Department of Computer Science

Machine Learning Research Group

University of Texas at Austin Artificial Intelligence Lab

Publications: Unsupervised Learning, Clustering, and Self-Organization

Unsupervised learning requires no annotation or labeling from a human teacher; the goal is to discover the structure of the data from unlabeled examples alone. The most common unsupervised learning task is clustering, i.e., grouping instances into a discovered set of categories of similar instances. Self-organizing maps additionally visualize the topology of the clusters on a map. Our work in this area includes applications to lexical semantics, topic modeling, and discovering latent class models, as well as methods for laterally connected, hierarchical, sequential-input, and growing self-organizing maps.
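To make the clustering idea concrete, the following is a minimal sketch of k-means, the textbook clustering algorithm: instances are grouped by alternating between assigning each point to its nearest center and recomputing each center as the mean of its assigned points. This is an illustration only, not the method of any paper below; the toy data, `k`, and iteration count are hypothetical.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Cluster rows of X into k groups by Lloyd's algorithm (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    # Initialize centers as k distinct instances chosen at random.
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # Assign each instance to the nearest center (squared Euclidean distance).
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        # Recompute each center as the mean of its assigned instances.
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels, centers

# Two well-separated toy blobs; k-means should recover the two groups.
X = np.vstack([np.random.default_rng(1).normal(0.0, 0.1, (20, 2)),
               np.random.default_rng(2).normal(5.0, 0.1, (20, 2))])
labels, centers = kmeans(X, k=2)
```

A self-organizing map extends this picture by arranging the learned centers on a low-dimensional grid, so that nearby grid cells hold similar clusters and the topology of the data can be visualized.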
  1. A Mixture Model with Sharing for Lexical Semantics
    [Details] [PDF] [Slides]
    Joseph Reisinger and Raymond J. Mooney
In Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP-2010), 1173--1182, MIT, Massachusetts, USA, October 9--11, 2010.
  2. Cross-cutting Models of Distributional Lexical Semantics
    [Details] [PDF] [Slides]
    Joseph S. Reisinger
    June 2010. Ph.D. proposal, Department of Computer Sciences, University of Texas at Austin.
  3. Spherical Topic Models
    [Details] [PDF] [Slides]
    Joseph Reisinger, Austin Waters, Bryan Silverthorn, and Raymond J. Mooney
    In Proceedings of the 27th International Conference on Machine Learning (ICML 2010), 2010.
  4. Spherical Topic Models
    [Details] [PDF]
    Joseph Reisinger, Austin Waters, Bryan Silverthorn, and Raymond Mooney
    In NIPS'09 workshop: Applications for Topic Models: Text and Beyond, 2009.
  5. Model-based Overlapping Clustering
    [Details] [PDF]
Arindam Banerjee, Chase Krumpelman, Sugato Basu, Raymond J. Mooney, and Joydeep Ghosh
    In Proceedings of the Eleventh ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD-05), 2005.