Introduction to Statistical Learning Theory
Time and Location: Thursdays, 09:15-11:00, Ziskind, Rm 261
This course will provide a comprehensive introduction to the statistical aspects of learning theory, focusing on generalization: when can we use a finite sample to make accurate predictions on new examples? And if we can, how many samples do we need? Topics will include:
- PAC learning model.
- VC dimension.
- The fundamental theorem of statistical learning.
- Rademacher complexity.
- Generalization properties of support vector machines.
- Stability & regularization.
- Compression bounds.
- PAC-Bayesian bounds.
- Additional topics may be covered, time allowing.
Students are expected to be familiar with linear algebra and probability at an undergraduate level, and the course will require some mathematical maturity.
- No lectures on the 7th & 14th of May.
- Next lecture is on 30/4/2015 due to the Independence Day vacation.
- No lecture on 9/4/2015 due to the Passover vacation.
- Lecture 1 - Introduction and the Hoeffding inequality.
- Lecture 2 - Finite hypothesis space and the growth function.
- Lecture 3 - VC dimension, No-Free-Lunch, the fundamental theorem for binary classification.
- Lecture 4 - Lower bounds on learning, Rademacher complexity.
- Lecture 5 - SVM, an application of Rademacher complexity.
- Lecture 6 - Stability and regularization.
- Lecture 7 - PAC-Bayes.
- Lecture 8 - PAC-Bayes examples, compression bounds.
- Lecture 9 - Boosting.
- Lecture 10 - Online learning - the realizable case.
- Lecture 11 - Online learning - learning with expert advice.
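As a taste of the generalization question addressed in Lecture 1, the following small simulation (an illustrative sketch; the parameter choices are not taken from the course materials) compares how often the mean of a finite sample deviates from its expectation with the guarantee of the Hoeffding inequality, P(|mean - p| >= eps) <= 2*exp(-2*n*eps^2):

```python
import math
import random

def hoeffding_bound(n, eps):
    """Two-sided Hoeffding bound: P(|mean - p| >= eps) <= 2*exp(-2*n*eps^2)."""
    return 2.0 * math.exp(-2.0 * n * eps ** 2)

def deviation_frequency(n, eps, p=0.5, trials=2000, seed=0):
    """Fraction of trials in which the mean of n Bernoulli(p) samples
    deviates from p by at least eps. (p = 0.5 and the other parameters
    are illustrative choices.)"""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        mean = sum(rng.random() < p for _ in range(n)) / n
        if abs(mean - p) >= eps:
            hits += 1
    return hits / trials

if __name__ == "__main__":
    n, eps = 200, 0.1
    # The empirical deviation frequency should fall below the bound.
    print("empirical deviation frequency:", deviation_frequency(n, eps))
    print("Hoeffding bound:", hoeffding_bound(n, eps))
```

Note how the bound decays exponentially in n: doubling the sample size squares the factor exp(-2*n*eps^2), which is the source of the sample-complexity answers the course develops.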