Course Syllabus for
CS 388: Natural Language Processing

Chapter numbers refer to the text: SPEECH and LANGUAGE PROCESSING
  1. Introduction
    Chapter 1. NLP tasks in syntax, semantics, and pragmatics. Applications such as information extraction, question answering, and machine translation. The problem of ambiguity. The role of machine learning. Brief history of the field.
  2. N-gram Language Models
    Chapter 4. The role of language models. Simple N-gram models. Estimating parameters and smoothing. Evaluating language models.
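As a toy illustration of the topics above (not from the text), a bigram model with add-one (Laplace) smoothing can be sketched in a few lines; the sentence format and padding symbols are assumptions for the example:

```python
from collections import Counter

def train_bigram(sentences):
    """Count unigrams and bigrams over sentences padded with <s>/</s>."""
    uni, bi = Counter(), Counter()
    for s in sentences:
        toks = ["<s>"] + s + ["</s>"]
        uni.update(toks)
        bi.update(zip(toks, toks[1:]))
    return uni, bi

def bigram_prob(w_prev, w, uni, bi, vocab_size):
    """Add-one (Laplace) smoothed estimate of P(w | w_prev)."""
    return (bi[(w_prev, w)] + 1) / (uni[w_prev] + vocab_size)
```

Evaluation (perplexity) then amounts to exponentiating the average negative log probability these estimates assign to held-out text.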
  3. Part-of-Speech Tagging and Sequence Labeling
    Chapters 5-6. Lexical syntax. Hidden Markov Models (Forward and Viterbi algorithms and EM training).
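The Viterbi decoding covered here can be sketched as follows; this is a minimal dictionary-based version, assuming probabilities are given as nested dicts rather than learned:

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely HMM state sequence for an observation sequence.
    V[t][s] = (best probability of reaching s at time t, predecessor state)."""
    V = [{s: (start_p[s] * emit_p[s][obs[0]], None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p][0] * trans_p[p][s])
            V[t][s] = (V[t - 1][prev][0] * trans_p[prev][s] * emit_p[s][obs[t]], prev)
    # Backtrace from the best final state.
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))
```

The Forward algorithm has the same recurrence with `max` replaced by a sum, and EM (Baum-Welch) reuses both to re-estimate the probability tables.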
  4. Basic Neural Networks
    Any basic introduction to perceptron and backpropagation such as section 18.7 in Artificial Intelligence: A Modern Approach (3rd ed), Chapter 4 of Machine Learning, or sections 5.0 - 5.3.3 of Pattern Recognition and Machine Learning.
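For orientation, the perceptron update rule discussed in those readings fits in a few lines; the data format (feature vectors with labels in {-1, +1}) is an assumption for this sketch:

```python
def train_perceptron(data, epochs=10):
    """Binary perceptron: data is a list of (feature vector, label in {-1, +1})."""
    dim = len(data[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in data:
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:  # misclassified: nudge weights toward the example
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b
```

Backpropagation generalizes this idea to multi-layer networks by propagating error gradients through differentiable activations.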
  5. LSTM Recurrent Neural Networks
"Understanding LSTM Networks" blog post; optionally, the original paper, "Long Short-Term Memory".
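The gate equations from those readings can be written out for a single scalar-valued LSTM cell; this toy version (everything 1-dimensional, weights passed as a dict) is an assumption for illustration only:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM time step with scalar input and state.
    W maps gate name -> (input weight, hidden weight, bias)."""
    f = sigmoid(W["f"][0] * x + W["f"][1] * h_prev + W["f"][2])    # forget gate
    i = sigmoid(W["i"][0] * x + W["i"][1] * h_prev + W["i"][2])    # input gate
    o = sigmoid(W["o"][0] * x + W["o"][1] * h_prev + W["o"][2])    # output gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h_prev + W["g"][2])  # candidate value
    c = f * c_prev + i * g   # new cell state: keep some memory, add some input
    h = o * math.tanh(c)     # new hidden state
    return h, c
```

The additive cell-state update `c = f * c_prev + i * g` is what lets gradients flow across long time spans, the motivation for the LSTM in the first place.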
  6. Syntactic Parsing
    Chapters 12-14. Grammar formalisms and treebanks. Efficient parsing for context-free grammars (CFGs). Statistical parsing and probabilistic CFGs (PCFGs). Lexicalized PCFGs. Neural shift-reduce dependency parsing (see the assigned paper).
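The efficient CFG parsing mentioned above is typically the CKY algorithm; a probabilistic version for a grammar in Chomsky normal form might look like the following, where the rule encoding (unary lexical rules and binary rules as tuples) is an assumption of the sketch:

```python
from collections import defaultdict

def pcky(words, unary, binary):
    """Probabilistic CKY for a PCFG in Chomsky normal form.
    unary: {terminal: [(lhs, prob), ...]}; binary: [(lhs, B, C, prob), ...].
    Returns best[i][j] = {nonterminal: best probability} over span words[i:j]."""
    n = len(words)
    best = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):                 # fill width-1 spans from the lexicon
        for lhs, p in unary.get(w, []):
            best[i][i + 1][lhs] = max(best[i][i + 1][lhs], p)
    for span in range(2, n + 1):                  # combine smaller spans bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for lhs, B, C, p in binary:
                    cand = p * best[i][k][B] * best[k][j][C]
                    if cand > best[i][j][lhs]:
                        best[i][j][lhs] = cand
    return best
```

Keeping backpointers alongside the probabilities recovers the best tree; the algorithm runs in O(n^3) over the grammar size.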
  7. Semantic Analysis
    Chapters 18-20. Lexical semantics and word-sense disambiguation. Compositional semantics. Semantic Role Labeling and Semantic Parsing.
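One classic word-sense disambiguation baseline covered in this material is the simplified Lesk algorithm: pick the sense whose dictionary gloss overlaps most with the context. A toy version (bag-of-words overlap, whitespace tokenization assumed):

```python
def simplified_lesk(word, context, glosses):
    """Pick the sense whose gloss shares the most words with the context.
    glosses: {sense_id: gloss string}; context: the sentence as a string."""
    ctx = set(context.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in glosses.items():
        overlap = len(ctx & set(gloss.lower().split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense
```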
  8. Information Extraction (IE)
    Chapter 22. Named entity recognition and relation extraction. IE using sequence labeling.
  9. Machine Translation (MT)
    Chapter 25. Basic issues in MT. Statistical translation, word alignment, phrase-based translation, and synchronous grammars.
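The word-alignment step above is classically IBM Model 1, trained with EM; a compact sketch (uniform initialization and the plain dict-of-pairs parameterization are simplifications for illustration):

```python
from collections import defaultdict

def ibm_model1(pairs, iterations=10):
    """EM for IBM Model 1 translation probabilities t(f | e).
    pairs: list of (foreign_sentence, english_sentence), each a list of words."""
    t = defaultdict(lambda: 1.0)  # uniform start; first E-step normalizes per f
    for _ in range(iterations):
        count = defaultdict(float)  # expected counts of (f, e) co-translations
        total = defaultdict(float)  # expected counts of e
        for f_sent, e_sent in pairs:
            for f in f_sent:
                z = sum(t[(f, e)] for e in e_sent)  # normalizer for this f
                for e in e_sent:
                    c = t[(f, e)] / z  # expected fraction of f aligned to e
                    count[(f, e)] += c
                    total[e] += c
        for (f, e), c in count.items():  # M-step: re-estimate t(f | e)
            t[(f, e)] = c / total[e]
    return t
```

On even a two-sentence corpus, EM concentrates probability on word pairs that consistently co-occur, which is the intuition behind statistical word alignment.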