My students, collaborators, publications, and other information about my research group can be found on the TAUR lab website.
My research is primarily in the field of Natural Language Processing. My group focuses on improving techniques for accessing and reasoning about knowledge in text. Large language models like GPT-4 have dramatically advanced these frontiers; currently we are looking at where these systems succeed and fail and how to further enhance their capabilities, particularly by building modular NLP systems that use LLMs as primitives. Some examples of recent projects include:
CS388: Natural Language Processing (Online MSCS/MSDS version): video lectures, readings, and assignments for the online masters offering of this NLP course.
NLP Module for high schools: videos and hands-on assignments introducing n-gram language models and pre-trained Transformers.
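The n-gram language models the module introduces can be sketched in a few lines. Below is a minimal bigram model with add-one (Laplace) smoothing; the tiny two-sentence corpus is hypothetical and purely for illustration, not material from the actual module.

```python
from collections import defaultdict

# Hypothetical toy corpus, just for illustration.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
]

# Count unigrams and bigrams, padding each sentence with boundary tokens.
unigram_counts = defaultdict(int)
bigram_counts = defaultdict(int)
for sentence in corpus:
    tokens = ["<s>"] + sentence.split() + ["</s>"]
    for i, tok in enumerate(tokens):
        unigram_counts[tok] += 1
        if i > 0:
            bigram_counts[(tokens[i - 1], tok)] += 1

vocab_size = len(unigram_counts)

def bigram_prob(prev, word):
    """P(word | prev) with add-one smoothing."""
    return (bigram_counts[(prev, word)] + 1) / (unigram_counts[prev] + vocab_size)

# "sat" always follows "cat" in this corpus, so it scores well:
# count(cat, sat) = 1, count(cat) = 1, |V| = 9, so P = (1+1)/(1+9) = 0.2
print(bigram_prob("cat", "sat"))  # → 0.2
```

Pre-trained Transformers, the module's other topic, replace these count tables with learned contextual representations, but the underlying task of predicting the next token is the same.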
Undergraduate and graduate teaching:
Read about some of the course materials we use in our Teaching NLP workshop paper:
Contemporary NLP Modeling in Six Comprehensive Programming Assignments
Greg Durrett, Jifan Chen, Shrey Desai, Tanya Goyal, Lucas Kabela, Yasumasa Onoe, Jiacheng Xu. Proceedings of the Fifth Workshop on Teaching NLP.