Spring 2015


**Course staff:**

- Instructor: Ethan Fetaya
- Supervisor: Ohad Shamir

**Time and Location:** Thursday, 09:15-11:00, Ziskind, Rm 261

**Syllabus:**
This course will provide a comprehensive introduction to the statistical aspects of learning theory, focusing on generalization: when can we use a finite sample to make predictions about new examples, and if we can, how many samples do we need?

- PAC learning model.
- VC dimension.
- The Fundamental theorem of statistical learning.
- Rademacher complexity.
- Generalization properties of support vector machines.
- Stability & regularization.
- Compression Bounds.
- PAC-Bayesian Bounds.
- Boosting.
- Additional topics may be covered, time allowing.
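As a taste of the quantitative questions above, Hoeffding's inequality bounds the probability that an empirical mean of n bounded i.i.d. samples deviates from its expectation by more than some epsilon. The following is a minimal numerical sketch (not part of the course materials; all function names and parameter values are illustrative) comparing the observed deviation frequency to the bound 2·exp(-2nε²):

```python
import math
import random

def hoeffding_demo(n=200, eps=0.1, trials=5000, p=0.5, seed=0):
    """Empirically compare P(|empirical mean - p| > eps) against
    Hoeffding's bound 2*exp(-2*n*eps^2) for i.i.d. Bernoulli(p) samples,
    which are bounded in [0, 1]."""
    rng = random.Random(seed)
    deviations = 0
    for _ in range(trials):
        # Empirical mean of n Bernoulli(p) draws.
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) > eps:
            deviations += 1
    empirical = deviations / trials
    bound = 2 * math.exp(-2 * n * eps * eps)
    return empirical, bound

empirical, bound = hoeffding_demo()
print(f"empirical deviation frequency: {empirical:.4f}, Hoeffding bound: {bound:.4f}")
```

The bound is distribution-free, so the empirical frequency is typically well below it; the gap is part of what motivates the sharper, data-dependent complexity measures covered later in the course.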

**Prerequisites:**
Students are expected to be familiar with linear algebra and probability at an undergraduate level, and the course will require some mathematical maturity.

**Announcements:**

- No lectures on the 7th and 14th of May.
- The next lecture is on 30/4/2015 due to the Independence Day vacation.
- No lecture on 9/4/2015 due to the Passover vacation.

**Lectures:**

- Lecture 1 - Introduction and the Hoeffding inequality.
- Lecture 2 - Finite hypothesis spaces and the growth function.
- Lecture 3 - VC dimension, No-Free-Lunch, and the fundamental theorem for binary classification.
- Lecture 4 - Lower bounds on learning, Rademacher complexity.
- Lecture 5 - SVM, applications of Rademacher complexity.
- Lecture 6 - Stability and regularization.
- Lecture 7 - PAC-Bayes.
- Lecture 8 - PAC-Bayes examples, compression bounds.
- Lecture 9 - Boosting.
- Lecture 10 - Online learning - the realizable case.
- Lecture 11 - Online learning - learning with expert advice.

**Assignments:**

- Exercise 1 - due on the 16th of April.
- Exercise 2 - due on the 7th of May. Problem 1c is optional.
- Exercise 3 - due on the 28th of May.
- Exercise 4 - due on the 25th of June.
- Exercise 5 - due on the 9th of July.