Ensemble Learning
Ensemble learning combines multiple learned models under the assumption that "two (or more) heads are better than one": the predictions of several hypotheses are combined to produce more accurate results than any single model alone. Boosting and bagging are two popular approaches. Our work focuses on building diverse committees that are more effective than those built by existing methods and, in particular, are useful for active learning.

For a general, popular book on the utility of combining diverse, independent opinions in human decision-making, see The Wisdom of Crowds.
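To make the bagging idea concrete, here is a minimal, self-contained sketch (not code from any of the papers below): each committee member is a simple one-feature "decision stump" trained on a bootstrap resample of the data, and the ensemble predicts by majority vote. The dataset, stump learner, and function names are all illustrative assumptions.

```python
import random
from collections import Counter

def train_stump(data):
    """Fit a one-feature threshold classifier ("decision stump")
    by exhaustive search over thresholds on 1-D labeled data."""
    best = None
    for thresh in sorted({x for x, _ in data}):
        for polarity in (1, -1):
            correct = sum(
                1 for x, y in data
                if (polarity if x >= thresh else -polarity) == y
            )
            if best is None or correct > best[0]:
                best = (correct, thresh, polarity)
    _, thresh, polarity = best
    return lambda x: polarity if x >= thresh else -polarity

def bagging(data, n_models=11, seed=0):
    """Train n_models stumps, each on a bootstrap resample
    (sampling with replacement), and return a classifier
    that takes a majority vote over the committee."""
    rng = random.Random(seed)
    models = [
        train_stump([rng.choice(data) for _ in range(len(data))])
        for _ in range(n_models)
    ]
    def predict(x):
        votes = Counter(m(x) for m in models)
        return votes.most_common(1)[0][0]
    return predict

# Toy 1-D dataset: label +1 for large x, -1 for small x,
# with one mislabeled point at x=4 to make resamples differ.
data = [(1, -1), (2, -1), (3, -1), (4, 1),
        (6, 1), (7, 1), (8, 1), (9, 1)]
ensemble = bagging(data)
print(ensemble(2), ensemble(8))
```

Because each stump sees a different bootstrap sample, the committee members disagree near the decision boundary, and averaging their votes reduces the variance of any single stump; methods such as DECORATE (below) push this further by constructing artificial training examples specifically to increase committee diversity.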

Explainable Improved Ensembling for Natural Language and Vision 2018
Nazneen Rajani, PhD Thesis, Department of Computer Science, The University of Texas at Austin.
Stacking With Auxiliary Features for Visual Question Answering 2018
Nazneen Fatema Rajani, Raymond J. Mooney, In Proceedings of the 16th Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pp. 2217-2226, 2018.
Ensembling Visual Explanations for VQA 2017
Nazneen Fatema Rajani, Raymond J. Mooney, In Proceedings of the NIPS 2017 workshop on Visually-Grounded Interaction and Language (ViGIL), December 2017.
Stacking With Auxiliary Features 2017
Nazneen Fatema Rajani and Raymond J. Mooney, In Proceedings of the 26th International Joint Conference on Artificial Intelligence (IJCAI-17), pp. 2634-2640, Melbourne, Australia, 2017.
Using Explanations to Improve Ensembling of Visual Question Answering Systems 2017
Nazneen Fatema Rajani and Raymond J. Mooney, In Proceedings of the IJCAI 2017 Workshop on Explainable Artificial Intelligence (XAI), pp. 43-47, Melbourne, Australia, August 2017.
Combining Supervised and Unsupervised Ensembles for Knowledge Base Population 2016
Nazneen Fatema Rajani and Raymond J. Mooney, In Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing (EMNLP-16), 2016.
Stacking With Auxiliary Features 2016
Nazneen Fatema Rajani and Raymond J. Mooney, ArXiv preprint arXiv:1605.08764 (2016).
Stacking With Auxiliary Features for Combining Supervised and Unsupervised Ensembles 2016
Nazneen Fatema Rajani and Raymond J. Mooney, In Proceedings of the Ninth Text Analysis Conference (TAC 2016), 2016.
Knowledge Base Population using Stacked Ensembles of Information Extractors 2015
Vidhoon Viswanathan, Masters Thesis, Department of Computer Science, The University of Texas at Austin.
Knowledge Transfer Using Latent Variable Models 2015
Ayan Acharya, PhD Thesis, Department of Electrical and Computer Engineering, The University of Texas at Austin.
Stacked Ensembles of Information Extractors for Knowledge-Base Population 2015
Vidhoon Viswanathan, Nazneen Fatema Rajani, Yinon Bentor, and Raymond J. Mooney, In Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics (ACL-15), pp. 177-187, Beijing, China, July 2015.
Stacked Ensembles of Information Extractors for Knowledge-Base Population by Combining Supervised and Unsupervised Approaches 2015
Nazneen Fatema Rajani and Raymond J. Mooney, In Proceedings of the Eighth Text Analysis Conference (TAC 2015), November 2015.
Combining Bias and Variance Reduction Techniques for Regression 2005
Y. L. Suen, P. Melville and Raymond J. Mooney, In Proceedings of the 16th European Conference on Machine Learning, pp. 741-749, Porto, Portugal, October 2005.
Combining Bias and Variance Reduction Techniques for Regression 2005
Yuk Lai Suen, Prem Melville and Raymond J. Mooney, Technical Report UT-AI-TR-05-321, University of Texas at Austin. www.cs.utexas.edu/~ml/publication.
Creating Diverse Ensemble Classifiers to Reduce Supervision 2005
Prem Melville, PhD Thesis, Department of Computer Sciences, University of Texas at Austin. 141 pages. Technical Report TR-05-49.
Creating Diversity in Ensembles Using Artificial Data 2004
Prem Melville and Raymond J. Mooney, Journal of Information Fusion: Special Issue on Diversity in Multi Classifier Systems, Vol. 6, 1 (2004), pp. 99-111.
Diverse Ensembles for Active Learning 2004
Prem Melville and Raymond J. Mooney, In Proceedings of 21st International Conference on Machine Learning (ICML-2004), pp. 584-591, Banff, Canada, July 2004.
Experiments on Ensembles with Missing and Noisy Data 2004
Prem Melville, Nishit Shah, Lilyana Mihalkova, and Raymond J. Mooney, In {Lecture Notes in Computer Science:} Proceedings of the Fifth International Workshop on Multi Classifier Systems (MCS-2004), F. Roli, J. Kittler, and T. Windeatt (Eds.), Vol. 3077, pp. 293-3...
Constructing Diverse Classifier Ensembles Using Artificial Training Examples 2003
Prem Melville and Raymond J. Mooney, In Proceedings of the Eighteenth International Joint Conference on Artificial Intelligence (IJCAI-2003), pp. 505-510, Acapulco, Mexico, August 2003.
Creating Diverse Ensemble Classifiers 2003
Prem Melville, Technical Report UT-AI-TR-03-306, Department of Computer Sciences, University of Texas at Austin. Ph.D. proposal.