To truly understand language, an intelligent system must be able to connect
words, phrases, and sentences to its perception of objects and events in the
world. Ideally, an AI system would be able to learn language like a human
child, by being exposed to utterances in a rich perceptual environment. The
perceptual context would provide the necessary supervisory information, and
learning the connection between language and perception would ground the
system's semantic representations in its perception of the world. As a step in
this direction, we are developing systems that learn semantic parsers and
language generators from sentences paired only with their perceptual context.
This work is part of our broader research on natural language learning and is
supported by the National Science Foundation through grants
IIS-0712097 and IIS-1016312.
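The core learning setting, utterances paired only with an ambiguous perceptual
context rather than word-level labels, can be illustrated with a toy
cross-situational word learner. This is a minimal sketch with made-up data and
symbols, not the actual systems described above:

```python
from collections import defaultdict

# Hypothetical training data: each utterance is paired only with the set of
# objects perceived in its context -- no word is labeled with its referent.
data = [
    (["the", "red", "ball"], {"BALL", "TABLE"}),
    (["a", "red", "block"], {"BLOCK", "BALL"}),
    (["the", "blue", "ball"], {"BALL", "DOG"}),
    (["a", "blue", "block"], {"BLOCK", "TABLE"}),
]

# Cross-situational learning: count how often each word co-occurs with each
# perceptual symbol; the ambiguity washes out as contexts accumulate.
counts = defaultdict(lambda: defaultdict(int))
for words, percepts in data:
    for w in words:
        for p in percepts:
            counts[w][p] += 1

def best_referent(word):
    """Return the perceptual symbol seen most often when `word` is uttered."""
    c = counts[word]
    return max(c, key=c.get) if c else None

print(best_referent("ball"))   # BALL appears in both 'ball' contexts -> BALL
print(best_referent("block"))  # BLOCK appears in both 'block' contexts -> BLOCK
```

Real grounded-language systems replace the symbol sets with rich perceptual
representations and learn full meaning structures, but the supervisory signal
is the same: the perceptual context alone disambiguates the language.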
- Grounded Language Learning [Video Lecture]
Raymond J. Mooney, Invited Talk, AAAI, 2013.
- Learning Language from its Perceptual Context [Video Lecture]
Raymond J. Mooney, Invited Talk, ECML-PKDD, 2008.
Subareas: