Electrical Engineering and Computer Sciences

COLLEGE OF ENGINEERING

UC Berkeley

2008 Research Summary

Effective Bayesian Transfer Learning (EBTL)

Stuart J. Russell, Peter Bartlett, and Michael Jordan

Transfer learning is what happens when someone finds it much easier to learn to play chess, having already learned to play checkers; to recognize tables, having already learned to recognize chairs; or to learn Spanish, having already learned Italian. Achieving significant levels of transfer learning across tasks--that is, achieving cumulative learning--is the central problem facing machine learning.

The EBTL project is a technical unification of two previously disjoint areas of research: knowledge-intensive learning in the logical tradition and hierarchical Bayesian learning in the probabilistic tradition. The unification applies Bayesian learning methods with strong prior knowledge represented in an expressive first-order probabilistic language.
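
To make the hierarchical Bayesian side of this concrete, the Python sketch below is a deliberately simplified illustration, assuming a toy conjugate Gaussian model that is not part of EBTL itself: each task's parameter is drawn from a shared prior, the prior's hyperparameters are estimated from a population of source tasks (empirical Bayes), and the learned prior then sharpens inference on a data-poor target task. All names and numbers here are invented for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical generative model (illustration only, not EBTL's language):
    # each source task t has a latent parameter theta_t ~ N(mu, tau^2), and
    # observations within a task are y ~ N(theta_t, sigma^2).
    mu_true, tau, sigma, n_obs = 2.0, 0.5, 1.0, 50
    source_thetas = rng.normal(mu_true, tau, size=20)
    source_data = [rng.normal(th, sigma, size=n_obs) for th in source_thetas]

    # Empirical-Bayes fit of the shared prior from the source tasks: per-task
    # sample means estimate each theta_t; their spread, corrected for sampling
    # noise, estimates the between-task variance tau^2.
    task_means = np.array([d.mean() for d in source_data])
    mu_hat = task_means.mean()
    tau2_hat = max(task_means.var() - sigma**2 / n_obs, 1e-6)

    # A new target task observed with very little data.
    theta_new = rng.normal(mu_true, tau)
    y_new = rng.normal(theta_new, sigma, size=3)

    # Conjugate Gaussian update: the prior learned from the source tasks is
    # combined with the scarce target data, shrinking the estimate toward mu_hat.
    n = len(y_new)
    post_var = 1.0 / (1.0 / tau2_hat + n / sigma**2)
    post_mean = post_var * (mu_hat / tau2_hat + y_new.sum() / sigma**2)

    print(f"true theta = {theta_new:.3f}")
    print(f"target-only MLE = {y_new.mean():.3f}")
    print(f"posterior with transferred prior = {post_mean:.3f}")

With only three target observations, the maximum-likelihood estimate is noisy, while the prior transferred from the source tasks pulls the posterior toward the population of related tasks; this shrinkage is the basic mechanism by which strong priors contribute to transfer.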

The approach applies not just to learning declarative knowledge, but also to learning decision-related quantities such as reward functions, value functions, policies, and task hierarchies. This allows the same transfer methods to generalize across task environments: from one task instance to others with different initial states, objects, goals, and physical laws.
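
The sketch below is one hedged illustration of value-function transfer across task instances, using plain tabular Q-learning on an invented chain MDP rather than anything EBTL-specific: Q-values learned on a source task initialize learning on a target task whose goal and initial states differ, and the warm start typically cuts the exploration needed.

    import numpy as np

    # Toy chain MDP, invented for illustration: states 0..N-1, actions
    # 0 = left / 1 = right, reward 1.0 on reaching the goal state.
    def q_learn(n_states, goal, episodes, q_init, seed=1):
        q = q_init.copy()
        alpha, gamma, eps = 0.1, 0.95, 0.1
        rng = np.random.default_rng(seed)
        steps = 0
        for _ in range(episodes):
            s = int(rng.integers(n_states))        # random initial state
            while s != goal:
                if rng.random() < eps:             # epsilon-greedy exploration
                    a = int(rng.integers(2))
                else:                              # greedy, ties broken randomly
                    a = int(rng.choice(np.flatnonzero(q[s] == q[s].max())))
                s2 = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
                r = 1.0 if s2 == goal else 0.0
                q[s, a] += alpha * (r + gamma * q[s2].max() - q[s, a])
                s = s2
                steps += 1
        return q, steps

    N = 10
    # Source task: goal at the right end of the chain.
    q_src, _ = q_learn(N, goal=N - 1, episodes=500, q_init=np.zeros((N, 2)))

    # Target task: the goal has moved and initial states differ. Starting from
    # the source Q-values acts as a crude informed prior over the value function.
    _, steps_transfer = q_learn(N, goal=N - 2, episodes=50, q_init=q_src)
    _, steps_scratch = q_learn(N, goal=N - 2, episodes=50, q_init=np.zeros((N, 2)))
    print(f"target-task steps taken with transferred values: {steps_transfer}")
    print(f"target-task steps taken from scratch:            {steps_scratch}")

In a Bayesian treatment, the source knowledge would enter as a prior distribution over reward or value functions rather than a point initialization; the initialization here is simply the smallest way to show the cross-instance generalization the paragraph describes.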

Our theory of transfer learning is being tested on real-time strategy games and on simulated object manipulation and perception.

EBTL subcontractors include MIT (Leslie Kaelbling, Tomas Lozano-Perez, and Tommi Jaakkola), Stanford (Andrew Ng, Daphne Koller, and Sebastian Thrun), and Oregon State (Tom Dietterich, Alan Fern, and Prasad Tadepalli).