Semantic parsing is the process of mapping a natural-language sentence into a
formal representation of its meaning. A shallow form of semantic
representation is a case-role analysis (a.k.a. semantic role labeling), which
identifies roles such as agent, patient, source, and destination. A deeper
semantic analysis provides a representation of the sentence in predicate logic
or another formal language that supports automated reasoning. We have developed
methods for automatically learning semantic parsers from annotated corpora
using
inductive logic programming and other learning
methods. We have explored learning semantic parsers for mapping
natural-language sentences to case-role analyses, formal database queries, and
formal command languages (e.g., the RoboCup coaching language used by
advice-taking learners). We have also explored methods for learning semantic lexicons,
i.e. databases of words or phrases paired with one or more alternative formal
meaning representations. Semantic lexicons can also be learned from
semantically annotated sentences and are an important source of knowledge for
semantic parsing. Learning for semantic parsing is part of our research on
natural language learning.
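As a toy illustration of these ideas (not the actual learned systems described above), the sketch below pairs a tiny hand-built semantic lexicon with a naive parser that conjoins word meanings into a Geoquery-style database query. The lexicon entries, predicate names, and query format are invented for illustration; a learned parser would induce such mappings from annotated sentence/meaning pairs rather than use a hand-coded table.

```python
# Hypothetical semantic lexicon: phrase -> logical-form fragment.
# Real systems learn these pairings from semantically annotated corpora.
LEXICON = {
    "capital": "capital(S, C)",
    "texas": "const(S, stateid(texas))",
}

def parse(sentence):
    """Collect meaning fragments for known words and conjoin them into a
    Geoquery-style logical query (a gross simplification of semantic
    parsing, with no syntactic composition)."""
    fragments = [LEXICON[w] for w in sentence.lower().split() if w in LEXICON]
    return "answer(C, (" + ", ".join(fragments) + "))"

print(parse("What is the capital of Texas"))
# -> answer(C, (capital(S, C), const(S, stateid(texas))))
```

The resulting logical query could then be executed against a geography database to retrieve the answer, which is what makes the deeper representation useful for automated reasoning.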
"The fish trap exists because of the fish. Once you've gotten the fish you can forget the trap. The rabbit snare exists because of the rabbit. Once you've gotten the rabbit, you can forget the snare. Words exist because of meaning. Once you've gotten the meaning, you can forget the words. Where can I find a man who has forgotten words so I can talk with him?"
-- The Writings of Chuang Tzu, 4th century B.C. (Original text in Chinese)
Demos of learned natural-language database interfaces:
Tutorial on semantic parsing presented at ACL 2010: