Jointly Improving Parsing and Perception for Natural Language Commands through Human-Robot Dialog (2018)
Natural language understanding in robots needs to be robust to a wide range of both human speakers and human environments. Rather than force humans to use language that robots can understand, robots in human environments should dynamically adapt, continuously learning new language constructions and perceptual concepts as they are used in context. In this work, we present methods for parsing natural language to underlying meanings and for using robotic sensors to create multi-modal models of perceptual concepts. We combine these steps toward language understanding into a holistic agent that jointly improves parsing and perception on a robotic platform through human-robot dialog. We train and evaluate this agent on Amazon Mechanical Turk, then demonstrate it on a robotic platform initialized from the conversational data gathered there. Our experiments show that improving both the parsing and perception components from conversations improves communication quality and human ratings of the agent.
View:
PDF
Citation:
In RSS Workshop on Models and Representations for Natural Human-Robot Communication (MRHRC-18). Robotics: Science and Systems (RSS), June 2018.
Justin Hart Postdoctoral Fellow hart [at] cs utexas edu
Yuqian Jiang Ph.D. Student
Raymond J. Mooney Faculty mooney [at] cs utexas edu
Aishwarya Padmakumar Ph.D. Alumni aish [at] cs utexas edu
Jivko Sinapov Postdoctoral Alumni jsinapov [at] cs utexas edu
Peter Stone Faculty pstone [at] cs utexas edu
Jesse Thomason Ph.D. Alumni thomason DOT jesse AT gmail
Nick Walker Undergraduate Alumni nswalker [at] cs uw edu
Harel Yedidsion Postdoctoral Fellow harel [at] cs utexas edu