# Modeling Categorization as a Dirichlet Process Mixture

### Kevin Canini

EECS Department

University of California, Berkeley

Technical Report No. UCB/EECS-2007-69

May 18, 2007

http://www.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-69.pdf

I describe an approach to modeling the dynamics of human category learning using a tool from nonparametric Bayesian statistics called the Dirichlet process mixture model (DPMM). The DPMM has a number of advantages over traditional models of categorization: it is interpretable as the optimal solution to the category learning problem, given certain assumptions about learners' biases; it automatically adjusts the complexity of its category representations depending on the available data; and computationally efficient algorithms exist for sampling from the DPMM, despite its apparent intractability. When applied to the data produced by previous experiments in human category learning, the DPMM usually does a better job of explaining subjects' performance than traditional models of categorization due to its increased flexibility, despite having the same number of free parameters.
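The abstract's claim that the DPMM "automatically adjusts the complexity of its category representations" follows from its prior over partitions, the Chinese restaurant process (CRP): the number of clusters is not fixed in advance but grows with the data at a rate governed by the concentration parameter. The following is a minimal illustrative sketch of the CRP (not code from the report; the function name `crp_assignments` and the choice of `alpha` are assumptions for illustration):

```python
import random
from collections import Counter

def crp_assignments(n, alpha, seed=0):
    """Sample a partition of n items from a Chinese restaurant process
    with concentration parameter alpha -- the DPMM's prior over how
    items are grouped into clusters (categories)."""
    rng = random.Random(seed)
    assignments = []
    counts = Counter()  # cluster label -> number of items in it
    for i in range(n):
        # An existing cluster k is chosen with probability
        # counts[k] / (i + alpha); a brand-new cluster is opened
        # with probability alpha / (i + alpha).
        labels = list(counts.keys()) + [len(counts)]
        weights = [counts[k] for k in counts] + [alpha]
        k = rng.choices(labels, weights=weights)[0]
        assignments.append(k)
        counts[k] += 1
    return assignments

partition = crp_assignments(100, alpha=1.0)
print(len(set(partition)))  # number of clusters; grows roughly as alpha * log(n)
```

Because new clusters keep a small but nonzero probability of opening, the model can represent a category with one component or many, depending on what the observed exemplars support; this is the flexibility the abstract credits for the DPMM's fit to human data.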

**Advisor:** Stuart J. Russell

BibTeX citation:

```bibtex
@mastersthesis{Canini:EECS-2007-69,
    Author = {Canini, Kevin},
    Title = {Modeling Categorization as a Dirichlet Process Mixture},
    School = {EECS Department, University of California, Berkeley},
    Year = {2007},
    Month = {May},
    URL = {http://www.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-69.html},
    Number = {UCB/EECS-2007-69},
    Abstract = {I describe an approach to modeling the dynamics of human category learning using a tool from nonparametric Bayesian statistics called the Dirichlet process mixture model (DPMM). The DPMM has a number of advantages over traditional models of categorization: it is interpretable as the optimal solution to the category learning problem, given certain assumptions about learners' biases; it automatically adjusts the complexity of its category representations depending on the available data; and computationally efficient algorithms exist for sampling from the DPMM, despite its apparent intractability. When applied to the data produced by previous experiments in human category learning, the DPMM usually does a better job of explaining subjects' performance than traditional models of categorization due to its increased flexibility, despite having the same number of free parameters.}
}
```

EndNote citation:

```
%0 Thesis
%A Canini, Kevin
%T Modeling Categorization as a Dirichlet Process Mixture
%I EECS Department, University of California, Berkeley
%D 2007
%8 May 18
%@ UCB/EECS-2007-69
%U http://www.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-69.html
%F Canini:EECS-2007-69
```