I am a Computer Science Ph.D. student at The University of Texas at Austin. My research focuses on Natural Language Processing and Machine Learning.

I am currently using nonparametric Bayesian techniques to learn CCG grammars from diverse forms of weak supervision. This work is in collaboration with Jason Baldridge, Noah Smith, Chris Dyer, and James Scott.
Publications
Low-Resource Machine Learning for NLP
Real-World Semi-Supervised Learning of POS-Taggers for Low-Resource Languages
Dan Garrette, Jason Mielens, and Jason Baldridge
To appear in Proceedings of ACL 2013
[code] [poster]
Learning a Part-of-Speech Tagger from Two Hours of Annotation
Dan Garrette and Jason Baldridge
Proceedings of NAACL 2013
[code] [slides .key] [slides .pdf] [talk video]
Type-Supervised Hidden Markov Models for POS Tagging with Incomplete Tag Dictionaries
Dan Garrette and Jason Baldridge
Proceedings of EMNLP 2012
[code]
Computational Semantics
A Formal Approach to Linking Logical Form and Vector-Space Lexical Semantics
Dan Garrette, Katrin Erk, and Raymond Mooney
In: Harry Bunt, Johan Bos, and Stephen Pulman (eds.), Computing Meaning, Vol. 4 (in press)
Montague Meets Markov: Deep Semantics with Probabilistic Logical Form
Islam Beltagy, Cuong Chau, Gemma Boleda, Dan Garrette, Katrin Erk, and Raymond Mooney
Proceedings of *SEM 2013
Integrating Logical Representations with Probabilistic Information using Markov Logic
Dan Garrette, Katrin Erk, and Raymond Mooney
Proceedings of the Ninth International Conference on Computational Semantics, 2011
An Extensible Toolkit for Computational Semantics
Dan Garrette and Ewan Klein
Proceedings of the Eighth International Conference on Computational Semantics, 2009