Syllabus---CS 281A / Stat 241A (Fall 2007)
Course Description:
This course will provide a thorough grounding in probabilistic
and computational methods for the statistical modeling of complex,
multivariate data. The emphasis will be on the unifying framework
provided by graphical models, a formalism that merges aspects of
graph theory and probability theory.
Outline:
Basics of graphical models, Markov properties, recursive decomposability, elimination algorithms
Sum-product algorithm, factor graphs, semi-rings
Frequentist and Bayesian methods
Bayesian classification, linear models and generalized linear models (GLIMs), online methods
Exponential family, sufficiency, conjugacy, reference priors
Density estimation, kernel methods, mixture models
The EM algorithm
Conditional mixture models, hierarchical mixture models
Hidden Markov models (HMMs)
Factor analysis, principal component analysis (PCA), canonical correlation analysis (CCA)
Kalman filtering and Rauch-Tung-Striebel smoothing
Markov properties of graphical models
Junction tree algorithm
Chains, trees, factorial models, coupled models, layered models
Importance sampling, Gibbs sampling, Metropolis-Hastings
Variational algorithms: mean field, belief propagation, convex relaxations
Dynamical graphical models
Model choice: cross-validation, AIC, BIC and Bayes factors
Nonparametric Bayes: Gaussian processes, Dirichlet processes
Decision networks, Markov decision processes and reinforcement learning
Applications to bioinformatics, error-control coding, speech and language, vision