Computational Learning Theory

Instructor: Adam Klivans

Course meets Mondays and Wednesdays at 11:00 AM in RLM 6.120

Office Hours: Mondays, 1:30-3:00 PM in TAY 3.148

Course Description

Provably efficient algorithms are essential for solving nontrivial machine learning problems. The primary goal of computational learning theory is to develop a rich set of algorithmic tools for efficiently solving common classification tasks. In this course, we survey well-established models of learning and describe a variety of algorithms for learning concept classes within these models. Topics include PAC learning, online learning, halfspace-based learning, Fourier-based learning, boosting, learning with noise, random projection, and query learning. In addition, we describe how to apply results from cryptography and computational complexity to prove the intractability of various machine learning problems. No background is required for this course beyond a working knowledge of basic probability and mathematical maturity. Coursework consists of problem sets (2/5 of the final grade), scribe notes (1/5), and a final project (2/5); the final project may be of a more applied nature.

This course consists of a selection of the best lectures from the two-semester course on computational learning theory that I taught in 2005.

Tentative Syllabus

Models of Learning: PAC, Exact, Online, Mistake Bound, and the relationships among them. (1-2 Weeks)

Learning by Polynomials: polynomial threshold functions, DNF formulas, decision trees, decision lists, intersections of halfspaces. (2 Weeks)
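
To make the polynomial-representation theme concrete, here is a small Python check (an illustration of my own, not course code) of the classical fact that a decision list with k rules is computed by a degree-1 polynomial threshold function: give rule j weight of magnitude 2^(k-j), so each rule outweighs all later rules plus the default.

    import itertools

    def eval_decision_list(x, rules, default):
        """rules: list of (index, wanted_bit, output_bit); the first matching rule fires."""
        for i, bit, out in rules:
            if x[i] == bit:
                return out
        return default

    def eval_as_threshold(x, rules, default):
        """The same function as a degree-1 PTF: rule j gets weight +-2^(k-j),
        so an earlier rule dominates every later rule plus the default."""
        k = len(rules)
        total = 1 if default == 1 else -1          # the default acts as a +-1 bias
        for j, (i, bit, out) in enumerate(rules):
            lit = 1 if x[i] == bit else 0          # indicator that rule j's literal fires
            total += (1 if out == 1 else -1) * (2 ** (k - j)) * lit
        return 1 if total > 0 else 0

    # Check agreement on every input for a small decision list over 4 variables.
    rules = [(0, 1, 1), (2, 0, 0), (3, 1, 1)]
    for x in itertools.product((0, 1), repeat=4):
        assert eval_decision_list(x, rules, 0) == eval_as_threshold(x, rules, 0)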

Generalization Error: Occam's Razor, VC-Dimension. (1-2 Weeks)
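
For orientation, the cardinality form of Occam's Razor can be stated as follows (a standard bound, included here for reference):

    m \;\geq\; \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right)

examples suffice so that, with probability at least 1 - delta over an i.i.d. sample, every hypothesis in a finite class H that is consistent with the sample has true error at most epsilon.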

Learning a Halfspace: Perceptron, Winnow, Boosting-Based Methods, Linear Programming. (2-3 Weeks)
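
As a concrete reference point, here is a minimal Python sketch of the classic Perceptron update rule (illustrative, not code from the course): on each mistake, add the misclassified example, scaled by its label, to the weight vector. The classical Novikoff bound guarantees at most (R/gamma)^2 mistakes when the data have norm at most R and margin gamma.

    import numpy as np

    def perceptron(examples, max_passes=100):
        """Classic Perceptron. examples: list of (x, y) with x a vector
        and y in {-1, +1}. If the data are linearly separable through
        the origin, the returned w separates them."""
        w = np.zeros(len(examples[0][0]))
        for _ in range(max_passes):
            mistakes = 0
            for x, y in examples:
                if y * np.dot(w, x) <= 0:      # mistake (or on the boundary)
                    w = w + y * np.asarray(x)  # update toward the example
                    mistakes += 1
            if mistakes == 0:                  # consistent with all examples: done
                break
        return w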

Boosting: Definitions, Weak to Strong Learning, Schapire's Booster, AdaBoost, Boosting with Branching Programs, Smooth Boosting, Applications. (2 Weeks)
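
The following Python sketch of AdaBoost (again illustrative; the interface for the weak learner is my own choice) shows the weak-to-strong conversion at the heart of this unit: after each round, reweight the sample so the next weak hypothesis is forced to focus on the previous mistakes.

    import math

    def adaboost(examples, weak_learner, rounds):
        """AdaBoost (Freund-Schapire). examples: list of (x, y) with
        y in {-1, +1}. weak_learner(examples, D) must return a hypothesis
        h (a function of x) with error < 1/2 under the distribution D."""
        m = len(examples)
        D = [1.0 / m] * m                        # start with the uniform distribution
        ensemble = []                            # list of (weight alpha, hypothesis h)
        for _ in range(rounds):
            h = weak_learner(examples, D)
            err = sum(D[i] for i, (x, y) in enumerate(examples) if h(x) != y)
            if err == 0:                         # perfect hypothesis: keep it and stop
                ensemble.append((1.0, h))
                break
            if err >= 0.5:                       # weak learning assumption violated
                break
            alpha = 0.5 * math.log((1 - err) / err)
            ensemble.append((alpha, h))
            # Reweight: mistakes get heavier, correctly labeled examples lighter.
            D = [D[i] * math.exp(-alpha * y * h(x))
                 for i, (x, y) in enumerate(examples)]
            Z = sum(D)
            D = [d / Z for d in D]               # renormalize to a distribution
        return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1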

Uniform Distribution Learning: Fourier-based learning. Agnostic learning. (1-2 Weeks)
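
A sketch (in Python, with hypothetical names) of the basic primitive behind low-degree Fourier learning: estimate a single Fourier coefficient f_hat(S) = E[f(x) * chi_S(x)] under the uniform distribution on {-1,+1}^n by random sampling.

    import random

    def chi(S, x):
        """Parity character chi_S(x) = product of x_i over i in S, x in {-1,+1}^n."""
        p = 1
        for i in S:
            p *= x[i]
        return p

    def estimate_fourier_coefficient(f, n, S, samples=100000):
        """Empirical estimate of f_hat(S) = E_x[f(x) * chi_S(x)] from
        uniformly random inputs."""
        total = 0
        for _ in range(samples):
            x = [random.choice((-1, 1)) for _ in range(n)]
            total += f(x) * chi(S, x)
        return total / samples

    # Example: majority on 3 bits has f_hat({i}) = 1/2 for each singleton.
    maj3 = lambda x: 1 if sum(x) > 0 else -1
    print(estimate_fourier_coefficient(maj3, 3, [0]))  # approximately 0.5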

Hardness: Relationship to Cryptography, Pseudorandom function generators, NP-hardness of proper learning. (2 Weeks)

Recommended Reading and Background

An undergraduate course in algorithms or complexity may be helpful. An Introduction to Computational Learning Theory by Kearns and Vazirani is a good reference for the topics covered here.

Links to Additional Reading Material

A nice survey of the Mistake Bound learning model by Avrim Blum.

Scribe Notes

A sample LaTeX template and the scribe.sty style file you will need are available for download.

Lecture 1, Mistake Bound Model, Learning Decision Lists

Lecture 2, The PAC Learning Model

Lecture 3, Learning DNF Formulas via PTFs, Part 1

Lecture 4, Learning Decision Trees

Lecture 5, Lower Bounds for PTFs, Learning ANDs of Halfspaces

Lecture 7, Best Experts, Weighted Majority

Additional Notes

Course material may vary depending on student interest.