The Computational Complexity of Machine Learning

Instructor: Adam Klivans

The course meets Tuesdays and Thursdays at 11:00AM in PAI 5.60

Course Description

This seminar will focus on recent progress regarding the computational complexity of fundamental tasks in machine learning. We will concentrate on efficient algorithms for learning rich concept classes under various distributions. Particular emphasis will be placed on open questions where progress may require discovering new algorithmic techniques.

The first part of the course will be dedicated to PAC learning Boolean functions under an arbitrary distribution. We will discuss applications of linear programming (and its variants) for learning halfspaces and polynomial threshold functions, among other concepts. In addition, we will discuss structural properties of Boolean functions related to their polynomial representations.
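As a taste of the linear-programming approach, here is a minimal Python sketch (the function names and the use of scipy's LP solver are illustrative choices, not the algorithm exactly as presented in lecture) that finds a halfspace consistent with a labeled sample by solving an LP feasibility problem:

    import numpy as np
    from scipy.optimize import linprog

    def consistent_halfspace(X, y):
        """Find (w, b) with y_i * (w . x_i + b) >= 1 for every example,
        via linear programming. Any strictly separating halfspace can be
        rescaled to satisfy these constraints.

        X: (m, n) array of examples; y: length-m array of +/-1 labels.
        Returns (w, b), or None if no consistent halfspace exists.
        """
        m, n = X.shape
        c = np.zeros(n + 1)  # feasibility only: minimize the zero objective
        # y_i * (w . x_i + b) >= 1, rewritten in linprog's A_ub @ z <= b_ub form:
        A_ub = -y[:, None] * np.hstack([X, np.ones((m, 1))])
        b_ub = -np.ones(m)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] * (n + 1))
        return (res.x[:n], res.x[n]) if res.success else None

The same LP viewpoint extends to polynomial threshold functions: map each example to its vector of monomials of degree at most d, then learn a halfspace over that expanded feature space.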

The second part of the course will focus on learning Boolean functions with respect to the uniform distribution. Here the main topic will be discrete Fourier analysis and its applications. We will discuss learning decision trees and DNF formulas, and we will combine this approach with boosting-based methods to push the envelope of what we can learn.
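Recall that every f: {-1,1}^n -> {-1,1} has the Fourier expansion f(x) = sum_S f^(S) chi_S(x), where chi_S(x) = prod_{i in S} x_i and f^(S) = E[f(x) chi_S(x)] under the uniform distribution. Here is a minimal sketch of the estimation step underlying the "low-degree" approach (the names and sample sizes are illustrative):

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)

    def fourier_coefficient(f, n, S, samples=10000):
        """Estimate f^(S) = E[f(x) * chi_S(x)] over uniform x in {-1,1}^n."""
        X = rng.choice([-1, 1], size=(samples, n))
        chi_S = np.prod(X[:, list(S)], axis=1)  # empty product is 1 for S = ()
        fX = np.apply_along_axis(f, 1, X)
        return np.mean(fX * chi_S)

    def low_degree_coefficients(f, n, d, samples=10000):
        """Estimate every Fourier coefficient of degree at most d."""
        return {S: fourier_coefficient(f, n, S, samples)
                for k in range(d + 1)
                for S in itertools.combinations(range(n), k)}

    # Example: each degree-1 coefficient of 3-bit majority is 1/2.
    maj = lambda x: 1 if np.sum(x) > 0 else -1
    print(low_degree_coefficients(maj, 3, 1))

Signing the truncated expansion built from these estimates is the low-degree algorithm behind the LMN result (Lecture 15); the Kushilevitz-Mansour algorithm (Lecture 13) instead uses membership queries to zero in on the large coefficients.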

The third part of the course will focus on learning with queries, including exact learning of automata and multiplicity automata. Additional discussion topics include polynomial interpolation, random projection, and cryptography-based hardness results.
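For a flavor of the random-projection technique (applied to halfspaces with a margin in Lectures 8-9), here is a minimal numpy sketch; the dimensions and test data are illustrative. A scaled Gaussian matrix maps examples down to k dimensions while approximately preserving norms and distances, which is the content of the JL lemma:

    import numpy as np

    rng = np.random.default_rng(0)

    def random_projection(X, k):
        """Project rows of X from n down to k dimensions with a random
        Gaussian matrix, scaled to preserve squared norms in expectation."""
        n = X.shape[1]
        A = rng.normal(size=(k, n)) / np.sqrt(k)
        return X @ A.T

    # Pairwise distances survive the projection up to a small distortion.
    X = rng.normal(size=(50, 1000))
    Xp = random_projection(X, 200)
    print(np.linalg.norm(X[0] - X[1]), np.linalg.norm(Xp[0] - Xp[1]))

Since margins are also approximately preserved, one can project the data first and then run a halfspace learner (such as the LP sketch above) in the much smaller dimension.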

We will have student paper presentations; topics will be chosen according to student interest.

Lecture Notes

Introductory Lecture; Outline

Lecture 1, Introduction to the mistake-bounded model; learning decision lists and decision trees

Lecture 2, Bshouty's algorithm for learning DNF

Lecture 3, Polynomial Threshold Functions; Expressing DNFs as PTFs Part 1

Lecture 4, Expressing DNFs as PTFs Part 2; Parity/MP PTF lower bound

Lecture 5, Voting Polynomials

Lecture 6, Learning Intersections of Halfspaces

Lecture 7, PTF lower bounds; Introduction to Margins

Lecture 8, Johnson-Lindenstrauss (JL) Lemma and Application: Learning Halfspaces with a Margin

Lecture 9, Finishing Halfspaces with a Margin; Learning Parity with Noise

Lecture 10, Hardness of Properly Learning ANDs of Halfspaces

Lecture 11, More Hardness results for Proper Learning

Lecture 12, Introduction to Fourier Analysis

Lecture 13, The Sparse Algorithm; Learning Decision Trees (KM)

Lecture 14, The Harmonic Sieve

Lecture 15, Learning Constant Depth Circuits (LMN)

(Coming Soon) Lecture 16, Noise Stability: Learning Intersections of Halfspaces

Lecture 17, Learning Constant Depth Circuits with MAJ gates

Lecture 18, Learning with Queries: Polynomial Interpolation

Lecture 19, Polynomial Interpolation Continued, Exact Learning DFAs

Lecture 20, Angluin's algorithm and applications

(Coming Soon) Lecture 21, Multiplicity Automata

Grading and Homework

Assignments will include scribe notes, a student paper presentation, and a (relatively short) problem set or two. Grades will be based on performance on these assignments and class participation.

Problem Set 1. Due March 3, 2005.

Solution to Problem Set 1.

Problem Set 2. Due April 28, 2005.

Solution to Problem Set 2.

Recommended Reading and Background

Mathematical maturity is required. A course in algorithms or complexity is recommended. A prior course in computational learning theory provides more than enough background for this seminar. An Introduction to Computational Learning Theory by Kearns and Vazirani is a good reference book for topics to be covered here.

Links to Additional Reading Material

Avrim Blum has a survey paper on online learning that is worth studying. He also has several learning-related tutorials you may want to check out. Follow this link: Avrim's Learning Survey Page.

A Survey on Fourier Learning

Yishay Mansour has written a great survey on learning Boolean functions using Fourier analysis. Follow this link: Mansour's Fourier Survey.

References to Results Proved in Class

Here is a link to Nader Bshouty's algorithm for learning DNF formulas. The improved result using the Chebyshev polynomial, "Learning DNF in Time..." (joint with Rocco Servedio), can be found on my publications page. Avrim Blum's slick proof of Ehrenfeucht and Haussler's result for learning decision trees can be found in Information Processing Letters, 42:183-185, 1992.

Additional Notes

Course material may vary depending on student interest.