Introduction to Probabilistic Models of Cognition

Josh Tenenbaum
Massachusetts Institute of Technology
Brain and Cognitive Sciences, Computer Science and Artificial Intelligence

I will give an introduction to several key themes of the summer school. Our goal is to understand how human minds perform inference and decision-making under uncertainty. We will focus on everyday problems of induction, such as learning concepts or causal relations from examples, understanding sentences, or interpreting visual images, where the mind makes inferential leaps that appear to go far beyond the direct data of experience. These inductive leaps are mostly successful, unconscious, and quick, yet they remain well beyond the scope of conventional machine intelligence systems. We want to understand the computational basis of these everyday inductive leaps.



To explain how human inductive inferences can work so well from so little data, it is clear that sophisticated prior knowledge must play a critical role in constraining the hypotheses that the mind considers. But how exactly does prior knowledge interact with observed data to guide human inferences and decisions? What forms does that prior knowledge take, across different domains and tasks, and how could it be acquired? When faced with new and potentially surprising data, how does the mind decide between assimilating the data to its current knowledge schema versus adjusting its knowledge schema to accommodate the new data? And how can accurate inductive inferences be made efficiently, even over complex hypothesis spaces?
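
The first of these questions has a standard formal answer, worth stating up front since everything that follows builds on it: Bayes' rule, which weights each candidate hypothesis h by its prior probability and by how well it predicts the observed data d,

$$ P(h \mid d) \;=\; \frac{P(d \mid h)\,P(h)}{\sum_{h'} P(d \mid h')\,P(h')}, $$

where the prior P(h) encodes background knowledge, the likelihood P(d | h) measures fit to the data, and the posterior P(h | d) is the updated degree of belief in h.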




To answer these questions, we will develop a suite of mathematical tools, including the following: Bayesian inference in probabilistic generative models; hierarchical probabilistic models, with inference at all levels of abstraction; probabilities defined over structured knowledge representations, such as graphs, grammars, predicate logic, schemas, or theories; nonparametric probabilistic models that are highly flexible in their degree of structure, with the tradeoff between model complexity and data fit managed adaptively by the Bayesian Occam's razor; and approximate methods of learning and inference, such as belief propagation, expectation-maximization, and a range of Monte Carlo methods.
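
To make two of these ideas concrete, Bayesian inference in a generative model and the Bayesian Occam's razor, here is a minimal sketch in Python, in the spirit of the "number game" often used to illustrate Bayesian concept learning. The hypothesis space, uniform prior, and example data below are illustrative assumptions, not material from the talk itself.

```python
# Minimal sketch of Bayesian concept learning over the numbers 1..100.
# Illustrates Bayesian inference in a generative model and the Bayesian
# Occam's razor via the size principle. Hypotheses and prior are
# illustrative assumptions.

hypotheses = {
    "even": [n for n in range(1, 101) if n % 2 == 0],
    "odd": [n for n in range(1, 101) if n % 2 == 1],
    "powers_of_two": [2 ** k for k in range(1, 7)],  # 2, 4, ..., 64
    "multiples_of_ten": list(range(10, 101, 10)),
}
prior = {h: 1.0 / len(hypotheses) for h in hypotheses}  # uniform prior

def likelihood(data, extension):
    """Size principle: each example is assumed drawn uniformly from the
    concept's extension, so smaller (more specific) hypotheses assign
    higher probability to data they are consistent with."""
    if any(x not in extension for x in data):
        return 0.0  # hypothesis ruled out by an inconsistent example
    return (1.0 / len(extension)) ** len(data)

def posterior(data):
    """Bayes' rule: prior times likelihood, normalized over hypotheses."""
    scores = {h: prior[h] * likelihood(data, ext)
              for h, ext in hypotheses.items()}
    z = sum(scores.values())
    return {h: s / z for h, s in scores.items()}

print(posterior([16]))            # several hypotheses remain plausible
print(posterior([16, 8, 2, 64]))  # "powers_of_two" comes to dominate
```

Even a single example shifts belief toward the most specific consistent hypothesis, because the size principle penalizes broad extensions; with more examples this preference sharpens. This is one simple instance of the adaptive tradeoff between model complexity and data fit mentioned above.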


