Introduction to Statistical Learning Theory
Time and Location: Tuesday, 13:15-15:00, Ziskind, Rm 1
This course will provide a comprehensive introduction to the statistical aspects of learning theory, focusing on generalization: When can a finite sample be used to predict well on new examples? And if it can, how many samples are needed?
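To make the sample-size question concrete, here is a small numerical sketch based on Hoeffding's inequality (the subject of Lecture 1). The accuracy and confidence values are illustrative choices, not part of the course material.

```python
import math
import random

# Hoeffding's inequality: for n i.i.d. samples in [0, 1] with mean p,
#   P(|empirical mean - p| >= eps) <= 2 * exp(-2 * n * eps**2).
# Solving the right-hand side for n gives a sufficient sample size
# for accuracy eps with confidence 1 - delta.
def hoeffding_sample_size(eps, delta):
    return math.ceil(math.log(2 / delta) / (2 * eps ** 2))

eps, delta = 0.05, 0.01  # illustrative accuracy / confidence parameters
n = hoeffding_sample_size(eps, delta)
print(n)  # 1060 samples suffice for eps = 0.05, delta = 0.01

# Sanity check by simulation: estimate the bias of a fair coin from
# n flips; the empirical mean should rarely deviate from 0.5 by >= eps.
random.seed(0)
trials = 1000
failures = sum(
    abs(sum(random.random() < 0.5 for _ in range(n)) / n - 0.5) >= eps
    for _ in range(trials)
)
print(failures / trials)  # empirical failure rate, well below delta
```

Note that the simulated failure rate is far smaller than delta: Hoeffding's bound is loose for any fixed distribution, but it holds uniformly over all bounded distributions, which is what the generalization arguments in the course need.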
- PAC learning model.
- VC dimension.
- The fundamental theorem of statistical learning.
- Rademacher complexity.
- Generalization properties of Support vector machines.
- Stability & regularization.
- Compression Bounds.
- PAC-Bayesian Bounds.
- Additional topics may be covered, time allowing.
Students are expected to be familiar with linear algebra and probability at an undergraduate level, and the course will require some mathematical maturity.
- Lecture 1 - Introduction and the Hoeffding inequality.
- Lecture 2 - PAC learnability, growth function.
- Lecture 3 - VC dimension, no-free-lunch.
- Lecture 4 - Fundamental theorem of binary learning theory, lower bounds.
- Lecture 5 - Regression, fat-shattering dimension.
- Lecture 6 - Rademacher complexity.
- Lecture 7 - Rademacher complexity applications, SVM.
- Lecture 8 - Regularization & stability.
- Lecture 10 - PAC-Bayes.
- Lecture 11 - Applications of PAC-Bayes, compression bounds.
- Ex 1 - Due April 19th.
- Ex 2 - Due May 3rd.
- Ex 3 - Due May 31st.
- Ex 4 - Due June 14th.
- Ex 5 - Due July 5th.