CS 281A / Stat 241A, Fall 2009:

Statistical Learning Theory



Office hours
Professor Peter Bartlett (bartlett@cs): Tue 3-4, 723 Sutardja Dai; Wed 3-4, 399 Evans.
Alekh Agarwal (alekh@cs): Tue, Thu 4-5, Soda 6th floor alcove.
Joe Neeman (joeneeman@gmail): Mon, Tue 5-6, Evans 387.

Lectures:  Soda 306. Tuesday/Thursday 11-12:30.

Discussion section:  Wed 5-6, Evans 332.

Course description

This course will provide an introduction to probabilistic and computational methods for the statistical modeling of complex, multivariate data. It will concentrate on graphical models, a flexible and powerful approach to capturing statistical dependencies in such data. In particular, the course will focus on the key theoretical and methodological issues of representation, estimation, and inference.
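To make the representation/inference distinction concrete, here is a minimal illustration (not course code; all probability values are made up): a three-node directed graphical model A -> B -> C over binary variables, whose joint distribution factorizes as P(A,B,C) = P(A) P(B|A) P(C|B), with a marginal computed by summing variables out one at a time in the spirit of the Eliminate algorithm covered early in the course.

```python
# Conditional probability tables for a chain A -> B -> C over binary
# variables. The numbers are arbitrary, chosen only for illustration.
P_A = {0: 0.6, 1: 0.4}
P_B_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
P_C_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}

def joint(a, b, c):
    """Joint probability read off the factorization P(A) P(B|A) P(C|B)."""
    return P_A[a] * P_B_given_A[a][b] * P_C_given_B[b][c]

# Inference by elimination: compute the marginal P(C) by summing out A
# first, then B. Each step only ever touches a pairwise factor, instead
# of building the full joint table over all three variables.
P_B = {b: sum(P_A[a] * P_B_given_A[a][b] for a in (0, 1)) for b in (0, 1)}
P_C = {c: sum(P_B[b] * P_C_given_B[b][c] for b in (0, 1)) for c in (0, 1)}

print(P_C)  # a proper distribution over the two states of C
```

For a chain of n binary variables this elimination order does O(n) work on 2x2 factors, while the naive sum over the joint is exponential in n; this gap is exactly why inference algorithms are a central topic of the course.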


Prerequisites

Previous coursework in linear algebra, multivariate calculus, basic probability, statistics, and algorithms is assumed. Previous coursework in graph theory, information theory, and optimization theory would be helpful but is not required. Familiarity with a matrix-oriented programming language or environment (such as NumPy, R, S-Plus, or Matlab) will be necessary.

Textbook

An Introduction to Probabilistic Graphical Models, by Michael I. Jordan, which is available as a reader from Copy Central at 2483 Hearst Avenue. (Ask for the CS C281A reader).
For an introduction, see the review paper: Michael I. Jordan, "Graphical models," Statistical Science, 2004.

Google discussion group


Grading

There will be a substantial project (40% of the grade) and regular homework assignments (60% of the grade; approximately one every two weeks).
It is appropriate to discuss homework assignments with other students, but each homework must be written up individually. If you discuss an assignment with other students, please list their names on your homework. See the policy on academic dishonesty. A late homework will have its grade reduced by 33.3% for each day it is late.


Schedule

The chapter numbers under 'Reading' refer to the text An Introduction to Probabilistic Graphical Models, Michael I. Jordan.
Date  Topic  Reading
Aug 27 Introduction; directed graphical models 1, 2.1
Sep 1 Directed graphical models 2.1
Sep 3 Undirected graphical models. The Eliminate algorithm. 2.2, 2.3, 3
Sep 8 Inference and chordal graphs 3, Notes on chordal graphs, 4.1
Sep 10 Factor graphs. Inference in factor trees. 4.2, 4.3
Sep 15 Parameter estimation 5, 6.3
Sep 17 Linear regression. Generative and discriminative classification. 6.4, 7.1, 7.2
Sep 22 Logistic regression. Exponential family. 7.3, 8.1
Sep 24 Exponential family. Generalized Linear Models 8.1, 8.2, 8.4
Sep 29 Estimation in Completely Observed Graphical Models 9, Notes on chordal graphs
Oct 1 IPF. Maximum likelihood. Maximum entropy 9, Variational Inference chapter
Oct 6 IPF as I-projection. Generalized iterative scaling. 9, Variational Inference chapter
Oct 8 EM - Alekh Agarwal 10, 11
Oct 13 EM - Joe Neeman 11
Oct 15 HMMs - Alekh Agarwal 12
Oct 22 Multivariate Gaussians. Factor Analysis. 13, 14. slides
Oct 27 Factor Analysis. State Space Models 14, 15. vec notes slides
Oct 29 State Space Models. Kalman Filter. 15. slides
Nov 3 Junction Tree Algorithm 17. slides
Nov 5 Junction Tree in Trees, HMMs, SSMs 18 slides
Nov 10 Monte Carlo Methods slides
Nov 12 Monte Carlo Methods slides
Nov 19 Variational Methods Variational Inference chapter slides
Nov 24 Variational Methods Variational Inference chapter slides
Nov 26 Thanksgiving
Dec 1 Project Poster Session (Stat 241A), 306 Soda.
Dec 3 Project Poster Session (CS 281A), 430 Soda (Woz Lounge).

Last update: Tue Nov 24 12:55:54 PST 2009