Electrical Engineering and Computer Sciences

COLLEGE OF ENGINEERING

UC Berkeley

   

2008 Research Summary

Probabilistic Inference with Unknown Objects

Stuart J. Russell, Rodrigo de Salvo Braz, and Nimar S. Arora

Probabilistic inference is a major area in artificial intelligence today, but most current approaches restrict themselves to propositional models: they define distributions over individual facts (propositions) rather than over classes of facts whose truth values depend on the objects they apply to. This has two major disadvantages: it forces us to build much larger models, since every dependency must be repeated for each instance of a fact, and the limited expressivity makes those models much harder to write.

BLOG (Bayesian Logic) is a language for writing probabilistic models with first-order expressivity. Distributions can be defined on classes of propositions parameterized by the objects they apply to. For example, while a propositional approach can model a dependency such as P(colleagues | friends) = 0.6, a first-order approach can express P(colleagues(X,Y) | friends(X,Y)) = 0.6, which makes it possible to track multiple relationships of the same type at the same time.
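The contrast can be illustrated with a small generative sketch in Python (not BLOG syntax). A single rule parameterized by the pair (x, y) covers every pair of people, whereas a propositional model would need a separate entry per pair; the prior on friendship and the probability of being colleagues among non-friends are assumed values chosen purely for illustration.

```python
import itertools
import random

random.seed(0)

# Hypothetical objects; a propositional model would need a separate
# conditional probability entry for every pair drawn from this list.
people = ["alice", "bob", "carol"]

# One first-order rule, parameterized by the pair (x, y):
#   P(colleagues(x, y) | friends(x, y)) = 0.6
P_COLLEAGUES_GIVEN_FRIENDS = 0.6
P_COLLEAGUES_GIVEN_NOT_FRIENDS = 0.1  # assumed value for illustration

def sample_world():
    """Sample truth values of friends(x, y) and colleagues(x, y) for all pairs."""
    world = {}
    for x, y in itertools.combinations(people, 2):
        friends = random.random() < 0.5  # assumed prior on friendship
        p = P_COLLEAGUES_GIVEN_FRIENDS if friends else P_COLLEAGUES_GIVEN_NOT_FRIENDS
        world[("friends", x, y)] = friends
        world[("colleagues", x, y)] = random.random() < p
    return world

world = sample_world()
for fact, value in sorted(world.items()):
    print(fact, value)
```

Adding a fourth person changes nothing in the rule itself; only the set of pairs it ranges over grows, which is exactly the economy a first-order representation buys.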

Every well-formed BLOG program defines a unique probability distribution over possible worlds. We have implemented inference algorithms that are provably complete for all well-formed programs.

Unlike other approaches to this problem, BLOG does not assume a fixed, known set of objects in the world. Instead, the population and its size are themselves random, with their own distribution. This allows BLOG to adapt to different situations and to infer which objects are present in each.
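A minimal sketch of this "unknown objects" idea: the number of objects is itself sampled from a prior before any attributes are drawn. The geometric prior and the per-object attribute below are assumptions made for illustration, not part of any actual BLOG model.

```python
import random

random.seed(1)

def sample_num_objects(p=0.3):
    """Sample a population size N >= 0 from a geometric prior (assumed)."""
    n = 0
    while random.random() > p:
        n += 1
    return n

def sample_world():
    # First draw how many objects exist, then sample each object's attributes.
    n = sample_num_objects()
    return [{"id": i, "observed": random.random() < 0.8} for i in range(n)]

worlds = [sample_world() for _ in range(5)]
print([len(w) for w in worlds])  # population size varies across sampled worlds
```

Because the population size varies across sampled worlds, inference can weigh hypotheses that posit different numbers of underlying objects, which is what citation matching requires when several citation strings may or may not refer to the same paper.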

Our current aim in the project is to apply BLOG to practical domains such as citation matching and activity recognition. These domains guide the development of faster inference algorithms as well as extensions such as DBLOG, which supports probabilistic reasoning over sequences of states (analogous to Dynamic Bayesian Networks, but with first-order expressivity).
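The DBN-style temporal structure can be sketched by unrolling one first-order dependency over time: each friends(x, y) fact at step t depends on its value at step t-1. The persistence and appearance probabilities below are assumed values for illustration only.

```python
import random

random.seed(2)

PERSIST = 0.9  # assumed P(friends_t(x, y) | friends_{t-1}(x, y))
APPEAR = 0.2   # assumed P(friends_t(x, y) | not friends_{t-1}(x, y))

def step(state):
    """Advance every friends(x, y) fact one time step via the same rule."""
    return {pair: random.random() < (PERSIST if val else APPEAR)
            for pair, val in state.items()}

# One relation instance tracked over five time steps.
state = {("alice", "bob"): True}
trajectory = [state]
for _ in range(4):
    state = step(state)
    trajectory.append(state)

print([s[("alice", "bob")] for s in trajectory])
```

As in the static case, the single transition rule applies to every pair of objects, so the model's size does not grow with the population.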
