Machine Learning

Instructor

Prof. Dana Ballard

TA

Ruohan Zhang

Reading Material
Text: Pattern Recognition and Machine Learning by Christopher Bishop
Recommended: Machine Learning: An Algorithmic Perspective, 2nd Ed., by Stephen Marsland
Supplementary Material: Andrew Ng's lecture notes and lecture videos.

Communication policy:
The homework assignments will be posted on this class website. We will be using Piazza for announcements and for discussing the material and homework. To join the class on Piazza, go here. Questions about the material or homework must be asked on Piazza so that the entire class can benefit from the discussion. Students are also encouraged to answer questions. However, questions requiring a lengthy explanation (more than a few sentences) should be saved for office hours, and questions about how an assignment was graded should be sent by email directly to the TA who graded it, or raised at that TA's office hours. Grades will be distributed over Blackboard (courses.utexas.edu).

Homework:
The assignments will generally be released on Wednesdays, the material for the assignment will be taught on the following Monday, and the assignment will be due on Sunday. For problem sets, your solutions should be submitted in class on Monday. They can be neatly handwritten or typeset in LaTeX. The official due dates will be posted on the website. Most assignments will require computer programming, which must be done in Matlab, Octave, Python (NumPy), or R; Matlab is the officially supported language, however. Learning how to use Matlab is relatively easy, and some decent tutorials can be found here and here. Matlab is available on the CS departmental machines -- just invoke matlab at the command line. To run graphical applications like Matlab remotely, you will need to use VNC, which you can learn about here.

Homework grading:
Here and here are some excellent reports from HW1. You should try to model yours after the example they set. It is very important that you point out problems with your implementation in your report. It is much better for you to be honest and state these problems in your report than for me to discover them by running your code. Here is a report that was commendable for its honesty, and pretty good overall (see the last section). The most important thing is to cover all the points listed at the end of the assignment handout.

Electronic submission:
For the programming assignments, you will submit your homework using Canvas. Always submit your report and code (if any), and put your name on both. If you don't have a CS account, get one.

Course Credit:
Credit will be based on the assignments (70%), a midterm (15%), and a final (15%).
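As an illustration only, the weights above combine as a simple weighted average (the function name here is my own, not part of any course code):

```python
def course_grade(assignments: float, midterm: float, final: float) -> float:
    """Weighted course score; each argument is a percentage in [0, 100]."""
    # Weights from the syllabus: assignments 70%, midterm 15%, final 15%.
    return 0.70 * assignments + 0.15 * midterm + 0.15 * final
```

For example, scores of 90 on assignments, 80 on the midterm, and 70 on the final combine to 0.70*90 + 0.15*80 + 0.15*70 = 85.5.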

Late Policy:
1 day: -10%
2 days: -20%
3 days: -30%
4 days: No credit
If extenuating circumstances will make it difficult for you to complete a project on time, contact the TA to work things out.
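The schedule above amounts to a 10%-per-day deduction, with no credit once a submission is four or more days late. A minimal sketch of that rule (the function name is my own):

```python
def late_penalty(days_late: int) -> float:
    """Fraction of credit deducted under the late policy above."""
    if days_late <= 0:
        return 0.0               # on time: no deduction
    if days_late >= 4:
        return 1.0               # 4+ days: no credit
    return 0.10 * days_late      # 1-3 days: 10% per day
```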

Academic dishonesty:
Students caught plagiarizing will fail the course. On some projects, we will use MOSS to analyze code. Avoid plagiarism by carefully citing all your sources. If you use a specific short code snippet found on a web page, mention that fact in a comment. If someone else told you how to solve a tricky part of an assignment, give them credit too. Do not copy code from other students. If you did copy and cited the source, you would not fail for plagiarism; you simply would not get credit for the assignment. Even so, don't do it.

Schedule

Week 1

Introduction (21 Jan)
Reading: Slides

Week 2

Linear Algebra: Solving Linear Equations (25 Jan)

Eigenvalues and Eigenfaces (27 Jan)
Reading: Notes, Notes, Bishop Appendix D and E
Homework: Eigendigits/Classification

Week 3

Probability: Bayes Rule and extensions (2 Feb)
Reading: Slides
Probability II: Distributions, conjugates, density functions (4 Feb)
Reading: Notes

Week 4

Information Theory (9 Feb)
Reading: Notes

Reading: Notes, Notes, Slides
ICA: Speech Separation (11 Feb)
Reading: Ng Notes, Notes
Homework: ICA

Week 5

Sampling Distributions: Algebraic Methods (16 Feb)
Sampling Distributions: Gibbs; MCMC (18 Feb)

Week 6

Decision Trees (23 Feb)
Reading: Bishop

Learning Theory (25 Feb)
Reading: Notes on VC dimension

Week 7

SVMs; Perceptrons (2 Mar)
Reading: Bishop 4.1, 4.2

SVMs; Kernels (4 Mar)
Reading: SVM Tutorial; Bishop pp. 325-345 (XOR example); New notes!
Homework: Problem Set (Homework #3)

Sampling Methods (26 Feb)

Week 8

Review (3 Mar)
Reading: Practice Midterms 1 2
Older Practice Midterms 3 4

Midterm (5 Mar) Midterm Answers

Week 9

Least Angle Regression (23 Mar)
Reading: Least angle regression Tutorial
Convolutional Networks / Deep Learning (25 Mar)
Reading: Restricted Boltzmann Machines Tutorial

Week 10

Bayes Nets: Inference (30 Mar)
Reading: Bishop 11.1-11.3
Homework: Bayesian Networks

Bayes Nets: Representation, D-separation (19 Mar)
Reading: Bishop 8.1, 8.2, Bayes Net Slides

Bayes Nets: Exact Inference
Reading: Bishop 8.3, 8.4

Bayes Nets: Learning (1 Apr)
Reading: Tutorial on learning BNs (read sec. 2,3,5,7,8,11)

Week 11

Hidden Markov Models (6 Apr)
Reading: Bishop, Slides

Reinforcement Learning (8 Apr)
Reading: Slides

Week 12

Reinforcement Learning (13-15 Apr)
Reading: Notes
Homework: Reinforcement Learning

Reinforcement Learning (9 Apr)
Reading: RL Notes2

Week 13

Genetic Algorithms (20-22 Apr)
Reading: GA Notes
Homework: Genetic Algorithms

Genetic Programming (16 Apr)

Week 14

Game Theory (27 Apr)
Reading: Slides, Hauert Paper

Game Theory (29 Apr)
Reading: Zhu (see background material)

Week 15

Review (4 May)
Reading: Practice Final 1, Practice Final 2, Practice Final 3

Second Exam (6 May)