Efficient Reasoning with Maximum Entropy Probabilistic Logic

Mark A. Paskin and Eyal Amir1
(Professor Stuart J. Russell)
(DOD-ONR) MURI FD N00014-01-1-0890, (DOD-ONR) MURI FD N00014-00-1-0637, and (NSF) FD98-73086-MALIK-09/01

We are exploring efficient techniques for reasoning under uncertainty in first-order domains. Given a knowledge base that contains sentences of first-order logic, each of which may be adorned with a degree of belief, i.e., a probability of truth, we would like to compute the degree of belief of a new sentence. Our technique transforms such a knowledge base into a probability distribution over possible worlds, which defines the degree of belief of every sentence of the language. Our current work focuses on constructing maximum entropy (or log-linear) distributions. We find that this approach has appealing inferential properties and admits a compact representation. In particular, maximum entropy distributions can be represented by graphical models whose structure mirrors that of the original knowledge base, and the conditional independencies in such models can be exploited to obtain efficient inference algorithms. Importantly, we have found that probabilistic reasoning under these maximum entropy distributions is no harder than the corresponding deterministic inference problem.
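To make the approach concrete, the sketch below builds a maximum entropy (log-linear) distribution over possible worlds for a toy propositionalized knowledge base: each sentence contributes a feature, the feature weights are fit by gradient ascent on the maximum entropy dual so that each sentence's marginal matches its stated degree of belief, and the degree of belief of a new sentence is the total probability of the worlds satisfying it. The atoms, sentences, and probabilities here are invented for illustration, and the brute-force enumeration of worlds stands in for the structured graphical-model inference described above.

```python
import itertools
import math

# Toy propositionalized domain (atom names invented for illustration).
atoms = ["Rain", "Sprinkler"]

# Knowledge base: each entry pairs a sentence (a predicate over worlds)
# with its stated degree of belief.
kb = [
    (lambda w: w["Rain"], 0.3),                          # P(Rain) = 0.3
    (lambda w: (not w["Rain"]) or w["Sprinkler"], 0.9),  # P(Rain -> Sprinkler) = 0.9
]

# Enumerate all possible worlds (truth assignments to the ground atoms).
worlds = [dict(zip(atoms, vals))
          for vals in itertools.product([False, True], repeat=len(atoms))]

def distribution(weights):
    """Log-linear model: P(w) is proportional to exp(sum_i weight_i * f_i(w))."""
    scores = [math.exp(sum(wt * float(f(w)) for (f, _), wt in zip(kb, weights)))
              for w in worlds]
    z = sum(scores)  # partition function
    return [s / z for s in scores]

def belief(sentence, probs):
    """Degree of belief = total probability of the worlds satisfying the sentence."""
    return sum(p for w, p in zip(worlds, probs) if sentence(w))

# Fit the weights by gradient ascent on the maximum entropy dual: the
# gradient for each feature is (target probability - current marginal),
# so each sentence's marginal is driven toward its stated degree of belief.
weights = [0.0] * len(kb)
for _ in range(5000):
    probs = distribution(weights)
    for i, (f, target) in enumerate(kb):
        weights[i] += 1.0 * (target - belief(f, probs))

# Query the degree of belief of a new sentence, e.g. Sprinkler.
probs = distribution(weights)
print("P(Sprinkler) =", round(belief(lambda w: w["Sprinkler"], probs), 3))
```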

1 Postdoctoral Researcher


