UTCS Colloquium/AI: Lenhart K. Schubert/University of Rochester: "Towards Generic Knowledge Acquisition from Text" TAY 3.128, Thursday, March 26, 2009 10:00 a.m.
There is a sign-up schedule for this event:
.cs.utexas.edu/department/webevent/utcs/events/cgi/eidshow.cgi?person=LenhartK.Schubert
Type of Talk: UTCS Colloquium/AI
Speaker/Affiliation: Lenhart K. Schubert/University of Rochester
Date/Time: Thursday, March 26, 2009 10:00 a.m.
Location: TAY 3.128
Host: Vladimir Lifschitz
Talk Title: "Towards Generic Knowledge Acquisition from Text"
Talk Abstract:
Knowledge extraction from text has become an increasingly attractive way to tackle the long-standing "knowledge acquisition bottleneck" in AI, thanks to the burgeoning of on-line textual materials and continuing improvements in natural language processing tools. In the KNEXT project (started 8 years ago at U. Rochester) we have been using compositional interpretation of parsed text to derive millions of general "factoids" about the world. Some examples, as translated from logical encodings into approximate English by KNEXT, are: CLOTHES CAN BE WASHED; PERSONS MAY SLEEP; A CHARGE OF CONSPIRACY MAY BE PROVEN IN SOME WAY; A MOUSE MAY HAVE A TAIL; A CAT MAY CATCH A MOUSE; etc. Viewed conservatively as existential or possibilistic statements, such factoids unfortunately do not provide a strong basis for reasoning. We would be better off with broadly quantified claims, such as that ANY GIVEN MOUSE ALMOST CERTAINLY HAS A TAIL, and IF A CAT CATCHES A MOUSE, IT WILL USUALLY KILL IT. How can we obtain such stronger knowledge? I will discuss several approaches that we are currently developing. Some involve further abstraction from KNEXT factoids using lexical semantic knowledge, while others involve direct interpretation of general facts stated in English. In all cases, issues in the formal representation of generic knowledge are encountered, of the type much-studied in linguistic semantics under such headings as "donkey anaphora", "dynamic semantics", and "generic passages". I will suggest a Skolemization approach which can be viewed as a method of generating frame-like or script-like knowledge directly from language.
Speaker Bio:
Lenhart Schubert is a professor of Computer Science at the University of Rochester, with primary interests in natural language understanding, knowledge representation and acquisition, reasoning, and self-awareness. While earning a PhD in Aerospace Studies at the University of Toronto, he became fascinated with AI and eventually joined the University of Alberta Computing Science Department and later (in 1988) the University of Rochester. He has over 100 publications in natural language processing and semantics, knowledge representation, reasoning, and knowledge acquisition, has chaired conference programs in these areas, and is an AAAI Fellow and former Alexander von Humboldt Fellow.