CS 294-5: Statistical Natural Language Processing, Fall 2004

 
Instructor: Dan Klein
Lecture: MF 1:00-2:30pm, 310 Soda Hall
Office Hours: W 2:00-4:00pm, 765 Soda Hall, or by appointment

Announcements

12/3/04: Project presentation guidelines are here.
11/5/04: Homework 4 is available.
9/24/04: Homework 3 is available.
9/24/04: Check the newsgroup for some announcements about the homeworks.
9/24/04: Section on Viterbi, forward-backward for HMMs on 9/29.
9/24/04: Sections will be W 1-2pm, Soda 405 (not necessarily every week).
9/24/04: NO CLASS on Monday 9/27: CRFs pushed to a later date TBA.
9/24/04: Homework 2 is available.
9/10/04: Homework 1 is available.
9/4/04: Homework 0 (ungraded) is up, if you didn't get it in class.
8/31/04: Our class newsgroup is ucb.class.cs294-5. If you use it, I'll use it!
8/31/04: Lecture 1 and the course questionnaire are up.  Handouts and slides will be added to the syllabus below.
8/31/04: W 3-5pm wins the office hours vote by a landslide (email for appointments at other times).
8/31/04: Grading and cooperation policies are up.

Description

This course will explore current statistical techniques for the automatic analysis of natural (human) language data. The dominant modeling paradigm is corpus-driven supervised learning, but unsupervised methods and even hand-coded rule-based systems will be mentioned when appropriate.

In the first part of the course, we will examine the core tasks in natural language processing, including language modeling, word-sense disambiguation, morphological analysis, part-of-speech tagging, syntactic parsing, semantic interpretation, coreference resolution, and discourse analysis. In each case, we will discuss which linguistic features are relevant to the task, how to design efficient models that can accommodate those features, and how to estimate parameters for such models in data-sparse contexts. In the second part of the course, we will explore how these core techniques can be applied in user-facing applications such as information extraction, question answering, speech recognition, machine translation, and interactive dialog systems.
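
As a small, concrete illustration of the data-sparsity issue (this sketch is not part of the course materials, and the class and method names are purely illustrative): the Java fragment below estimates a unigram language model with add-one (Laplace) smoothing, the simplest stand-in for the more careful smoothing methods we will study, so that words never seen in training still receive a small nonzero probability.

import java.util.HashMap;
import java.util.Map;

public class UnigramModel {
    private final Map<String, Integer> counts = new HashMap<>();
    private int total = 0;

    // Count each token in the training text.
    public void train(String[] tokens) {
        for (String token : tokens) {
            counts.merge(token, 1, Integer::sum);
            total++;
        }
    }

    // Add-one (Laplace) smoothed estimate: every word, seen or unseen,
    // gets (count + 1) / (total + vocabSize). vocabSize should include
    // an allowance for unknown words.
    public double probability(String token, int vocabSize) {
        int count = counts.getOrDefault(token, 0);
        return (count + 1.0) / (total + vocabSize);
    }

    public static void main(String[] args) {
        UnigramModel model = new UnigramModel();
        model.train("the cat sat on the mat".split(" "));
        System.out.println(model.probability("the", 10)); // seen word: relatively high
        System.out.println(model.probability("dog", 10)); // unseen word: small but nonzero
    }
}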

Course assignments will highlight several core NLP tasks. For each task, we will construct a basic system, then improve it through a cycle of linguistic error analysis and model redesign. There will also be a final project, which will investigate a single topic or application in greater depth. This course assumes a familiarity with basic probability and the ability to program in Java. Prior experience with linguistics or natural languages is helpful, but not required.

NOTE: Marti Hearst's SIMS 290-2 is also being offered this term.  Both courses deal with statistical, corpus-based NLP.  CS 294-5 will emphasize NLP models and algorithms, while SIMS 290-2 will emphasize the applications of NLP technologies.

Readings

The texts for this course are:

Manning and Schütze, Foundations of Statistical Natural Language Processing (M+S in the syllabus below)
Jurafsky and Martin, Speech and Language Processing (J+M in the syllabus below)

The former is required (i.e. you'll want access to a copy) while the latter is recommended as supplementary reading. Both are on reserve in the Engineering library.

Syllabus

Week | Date | Topics | Techniques | Readings | Assignments (Out)
1 | Aug 30 | Course Introduction | | M+S 3, J+M 1-3,10 |
  | Sep 3 | Classical NLP | Chart Parsing, Semantic Interpretation | M+S 4, J+M 9,15; also see Chris Manning's handouts on syntax and semantics | HW0: Getting Set Up
2 | Sep 6 | NO CLASS | | |
  | Sep 10 | Speech and Language Modeling | Multinomial Smoothing | M+S 6, J+M 6-7, Chen & Goodman | HW1: Language Modeling
3 | Sep 13 | Text Categorization | Smoothing, Naive-Bayes Models | M+S 7 |
  | Sep 17 | Word-Sense Disambiguation | Maximum Entropy Models | Berger's tutorial |
4 | Sep 20 | Part-of-Speech Tagging | HMMs | M+S 9-10, J+M 7.1-7.4 |
  | Sep 24 | Part-of-Speech Tagging [no slides] | MEMMs | Toutanova & Manning, Brants' TnT paper | HW2: Maximum Entropy and POS Tagging
5 | Sep 27 | NO CLASS (CRFs will be rescheduled later) | | |
  | Oct 1 | Statistical Parsing | PCFGs | M+S 11, J+M 12 |
6 | Oct 4 | Statistical Parsing | Inference for PCFGs | Best-First Parsing, A* Parsing |
  | Oct 8 | Statistical Parsing [no slides] | Grammar Representations | DOP, Ratnaparkhi's Maxent Shift-Reduce |
7 | Oct 11 | Statistical Parsing [no slides] | Lexicalized Dependency Models | Charniak, Collins |
  | Oct 15 | Statistical Parsing [no slides] | Other Parsing Models | TAG, HPSG, CCG | HW3: Parsing and Grammars
8 | Oct 18 | NO CLASS | | |
  | Oct 22 | Semantic Representations | | Gildea and Jurafsky |
9 | Oct 25 | Semantic Representations [no slides] | | | FP: Project Requirements
  | Oct 29 | Coreference | | |
10 | Nov 1 | NO CLASS | | |
  | Nov 5 | Machine Translation | Word-to-Word Alignment Models | M+S 13, J+M 21, Brown et al. | HW4: Machine Translation
11 | Nov 8 | Machine Translation | Decoding Word-to-Word Models | HMM, Decoders, Phrase-Based |
  | Nov 12 | Machine Translation | Syntactic Translation Models | Syntactic TMs, Syntactic LMs, Transduction Grammars |
12 | Nov 15 | Unsupervised Learning | Document Clustering | |
  | Nov 19 | Unsupervised Learning [no slides] | Word Clustering | HMM Learning, Distributional Clustering |
13 | Nov 22 | Unsupervised Learning [no slides] | Grammar Induction | Model Merging, Distributional, Constituency/Dependency, Translingual Constraint |
  | Nov 26 | NO CLASS | | |
14 | Nov 29 | Unsupervised Learning | Grammar Induction | |
  | Dec 3 | Question Answering | | | FP: Preliminary Project Reports
15 | Dec 6 | Document Summarization | | |
  | Dec 10 | Final Project Presentations | | |