Computational Learning Theory

Instructor: Adam Klivans

The course meets Mondays and Wednesdays at 11:00 AM in RLM 6.120.

Office Hours: Mondays, 2:00-3:30 PM in TAY 3.148

Course Description

This course focuses on fundamental theoretical problems in machine learning. We first survey mathematical frameworks for learning, such as Valiant's PAC model, Angluin's model of exact learning, the mistake-bound model, and on-line learning. We then present algorithms for learning concept classes within these models, with a strong emphasis on proving bounds on the resources (time, space, sample complexity) these algorithms require. We also discuss methods for proving the intractability of certain learning tasks and examine the relationship between learning and cryptography. Finally, we will read current research papers in the field; topics include learning with noise, regret bounds, and SVMs. Students will be required to work on an independent project of their choosing (related to either theory or practice).

Tentative Syllabus

Models of Learning: PAC, Exact, On-line, Mistake Bound, and the relationships among them. 1-2 weeks.

Generalization Error: Occam's Razor, VC-Dimension. 1-2 weeks.

Learning a Halfspace: Perceptron, Winnow, Boosting-Based Methods, Linear Programming. 2-3 weeks.

Boosting: Definitions, Weak to Strong Learning, Schapire's Booster, AdaBoost, Boosting with Branching Programs, Smooth Boosting, Applications. 2 weeks.

Hardness: Relationship to Cryptography. Pseudorandom function generators. 1 week.

Noise: Malicious Noise, Classification Noise. Statistical Queries. 1-2 weeks.

On-line Algorithms: Minimizing Regret. Weighted Majority. 1 week.

Topics: SVMs, regression. 1 week.

Lecture Notes

Introductory Lecture; Outline

A sample LaTeX template and the scribe.sty file you will need are available for preparing scribe notes.

Lecture 1, Mistake bounds, the Winnow algorithm, and the Weighted Majority algorithm.
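
For quick reference, here is a minimal sketch of the deterministic Weighted Majority algorithm for binary predictions; the function and variable names are illustrative, not taken from the notes.

# A minimal Weighted Majority sketch: predict by weighted vote, then
# multiply the weight of every expert that erred by beta in (0, 1).
def weighted_majority(expert_preds, outcomes, beta=0.5):
    """expert_preds: list of rounds, each a list of n binary predictions.
    outcomes: the true binary label (0 or 1) for each round."""
    n = len(expert_preds[0])
    weights = [1.0] * n
    mistakes = 0
    for preds, y in zip(expert_preds, outcomes):
        # Predict the side holding the larger total weight.
        vote_one = sum(w for w, p in zip(weights, preds) if p == 1)
        vote_zero = sum(w for w, p in zip(weights, preds) if p == 0)
        if (1 if vote_one >= vote_zero else 0) != y:
            mistakes += 1
        # Penalize each expert that erred on this round.
        weights = [w * beta if p != y else w
                   for w, p in zip(weights, preds)]
    return mistakes, weights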

Lecture 2, Introduction to PAC learning.
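
For reference, the standard form of Valiant's PAC criterion (notation may differ slightly from the notes): an algorithm PAC learns a concept c under distribution D if, given a sample S of m = poly(1/\epsilon, 1/\delta, n) labeled examples, it outputs a hypothesis h_S satisfying

% With probability at least 1 - \delta over the draw of the sample S,
% the hypothesis h_S has error at most \epsilon under D.
\Pr_{S \sim D^m}\Big[\ \Pr_{x \sim D}\big[h_S(x) \neq c(x)\big] \le \epsilon\ \Big] \ge 1 - \delta .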

Lecture 3, PAC learning continued, Occam's Razor.

Lecture 4, Using Set-Cover, Intro to VC-dimension.
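
A minimal sketch of the greedy Set-Cover heuristic used in this style of argument; it assumes the given sets do cover the universe, and the names are illustrative.

def greedy_set_cover(universe, sets):
    """universe: iterable of elements; sets: list of Python sets covering it."""
    # Repeatedly pick the set covering the most still-uncovered elements;
    # this is the classic O(log n)-approximation to the optimal cover.
    uncovered = set(universe)
    cover = []
    while uncovered:
        best = max(sets, key=lambda s: len(uncovered & s))
        cover.append(best)
        uncovered -= best
    return cover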

Lecture 5, The VC Theorem.
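
The headline bound, in its standard form (constants vary from source to source): for a concept class of VC-dimension d, any hypothesis consistent with a sample is, with probability at least 1 - \delta, \epsilon-accurate once the sample size reaches

m = O\!\left(\frac{1}{\epsilon}\left(d \log \frac{1}{\epsilon} + \log \frac{1}{\delta}\right)\right).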

Lecture 6, Introduction to Learning Halfspaces, Perceptron.
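
A minimal sketch of the Perceptron update for labels in {-1, +1}; the pass count and names here are illustrative.

def perceptron(examples, passes=100):
    """examples: list of (x, y) pairs, x a list of floats, y in {-1, +1}."""
    w = [0.0] * len(examples[0][0])
    for _ in range(passes):
        for x, y in examples:
            # Mistake-driven: update only when sign(w . x) disagrees with y.
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
    return w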

Lecture 7, Perceptron vs. Winnow. Using Winnow to learn halfspaces.

Lecture 8, The Averaging Algorithm for Learning Halfspaces.
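
A sketch of the averaging idea under the obvious reading of the title: the output halfspace's normal vector is the label-weighted mean of the examples. Details such as normalization follow the notes; the names here are illustrative.

def averaging_hypothesis(examples):
    """examples: list of (x, y) pairs, x a list of floats, y in {-1, +1}."""
    # Average the label-signed examples; the result serves as the normal
    # vector of the output halfspace sign(w . x).
    m = len(examples)
    w = [0.0] * len(examples[0][0])
    for x, y in examples:
        w = [wi + y * xi / m for wi, xi in zip(w, x)]
    return w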

Lecture 10, Introduction to Boosting.

Lecture 11, AdaBoost.
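
A minimal AdaBoost sketch for labels in {-1, +1}. The weak_learner(examples, dist) callback and its signature are assumptions for illustration; it should return a hypothesis h with weighted error strictly between 0 and 1/2.

import math

def adaboost(examples, weak_learner, rounds):
    m = len(examples)
    dist = [1.0 / m] * m              # start from the uniform distribution
    ensemble = []
    for _ in range(rounds):
        h = weak_learner(examples, dist)
        # Weighted error of h under the current distribution (in (0, 1/2)).
        eps = sum(d for d, (x, y) in zip(dist, examples) if h(x) != y)
        alpha = 0.5 * math.log((1 - eps) / eps)
        # Reweight: push weight toward examples h got wrong, then renormalize.
        dist = [d * math.exp(-alpha * y * h(x))
                for d, (x, y) in zip(dist, examples)]
        z = sum(dist)
        dist = [d / z for d in dist]
        ensemble.append((alpha, h))
    # Final hypothesis: sign of the alpha-weighted vote.
    return lambda x: 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1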

Lecture 13, Boosting with Branching Programs.

Grading and Homework

Assignments will include scribe notes, problem sets, and a final project. Grades will be based on performance on these assignments and class participation.

Recommended Reading and Background

Mathematical maturity is required, and a course in algorithms or complexity theory is recommended; such a course provides more than enough background for this seminar. An Introduction to Computational Learning Theory by Kearns and Vazirani is a good reference book for the topics covered here.

Additional Notes

Course material may vary depending on student interest.