Efficient Markov Logic Inference for Natural Language Semantics (2014)
Using Markov logic to integrate logical and distributional information in natural-language semantics results in complex inference problems involving long, complicated formulae. Current inference methods for Markov logic are ineffective on such problems, so we propose a new inference algorithm based on SampleSearch that computes probabilities of complete formulae rather than of individual ground atoms. We also introduce a modified closed-world assumption that significantly reduces the size of the ground network, thereby making inference feasible. Our approach is evaluated on the recognizing textual entailment task, and experiments demonstrate its dramatic impact on the efficiency of inference.
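To give a rough intuition for why a closed-world assumption shrinks the ground network, the toy sketch below (an illustration of the general idea, not the paper's modified algorithm; the formula, constants, and evidence are invented for this example) grounds a single Markov logic clause over a small domain and drops groundings that are trivially satisfied when unlisted evidence atoms are assumed false:

```python
from itertools import product

# Illustrative only: grounding the clause
#   friends(x, y) ^ smokes(x) -> smokes(y)
# over a 4-constant domain, then pruning under a closed-world
# assumption that treats all friends(.,.) atoms not listed in the
# evidence as false.

constants = ["A", "B", "C", "D"]

# Hypothetical evidence: only these friends atoms are true.
true_friends = {("A", "B"), ("B", "C")}

# Naive grounding: one ground clause per (x, y) pair.
full_grounding = [(x, y) for x, y in product(constants, repeat=2)]

# If friends(x, y) is false under the closed-world assumption, the
# clause body is false, the clause is trivially satisfied, and its
# ground clause can be removed from the network.
pruned_grounding = [(x, y) for x, y in full_grounding
                    if (x, y) in true_friends]

print(len(full_grounding))    # ground clauses before pruning
print(len(pruned_grounding))  # ground clauses after pruning
```

Even on this tiny domain the grounding drops from 16 clauses to 2; on realistic domains the reduction grows quadratically (or worse) with the number of constants.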
In Proceedings of the Fourth International Workshop on Statistical Relational AI at AAAI (StarAI-2014), pp. 9--14, Quebec City, Canada, July 2014.

Islam Beltagy Ph.D. Student beltagy [at] cs utexas edu
Raymond J. Mooney Faculty mooney [at] cs utexas edu