CSC 588: Machine Learning Theory - Spring 2021

Tentative schedule

| Date | Topics | Scribe notes | Handwritten notes | Additional readings | Homework |
|---|---|---|---|---|---|
| Jan 14 | Introduction, motivation, course mechanics | | | SSBD App B.1 and B.2 | HW0 (calibration) |
| Jan 19 | The PAC learning framework; the consistency algorithm | Scribe note (0119/) (please feel free to make a copy of this as a starting point when scribing) | Note 2 | SSBD Chap 2, Sec 3.1 | |
| Jan 21 | Analysis of the consistency algorithm; agnostic PAC learning; Hoeffding's inequality | | Note 3 | SSBD Chap 2, Sec 3.1 | |
| Jan 26 | Proof of Hoeffding's inequality; Bernstein's inequality | Scribe note (0126/) by Brian Toner | Note 4 | SSBD B.4, B.5 | |
| Jan 28 | Analysis of ERM; VC theory | Scribe note (0128/) by Alonso Granados | Note 5 | SSBD Chap 4, Sec 6.1, 6.2 | |
| Feb 2 | VC dimension examples; Sauer's Lemma | Scribe note (0202/) by Katherine Best | Note 6 | SSBD Sec 6.2-6.5.1 | HW1 |
| Feb 4 | Proof of Sauer's Lemma; VC dimension of composite hypothesis classes | Scribe note (0204/) by Marium Yousuf | Note 7 | SSBD Sec 6.2-6.5.1 | |
| Feb 9 | Uniform convergence via Rademacher complexity | Scribe note (0209/) by Brady Gales | Note 8 | SSBD Sec 6.5.2, 26.1, 28.1 | |
| Feb 11 | Proof of uniform convergence for VC classes | Scribe note (0211/) by Yinan Li | Note 9 | SSBD Sec 6.5.2, 26.1, 28.1 | |
| Feb 16 | Lower bounds for PAC learning with VC classes; fundamental theorem of statistical learning | Scribe note (0216/) by Xiaolan Gu | Note 10 | SSBD Sec 5.1 | |
| Feb 18 | Error decomposition in statistical learning; model selection | Scribe note (0218/) by Jie Bian | Note 11 | SSBD Chap 7 | |
| Feb 23 | Finish SRM; AdaBoost and its training error analysis | Scribe note (0223/) by Yichen Li | Note 12 | SSBD Chap 10 | |
| Feb 25 | No class (reading day) | | | | |
| Mar 2 | Weak learnability implies "linear separability" via the minimax theorem; margin-based generalization error bounds for AdaBoost and linear classification | Scribe note (0302/) by Ryan Sullivant | Note 13 | SSBD 26.1, 26.2, 26.4 | HW2 |
| Mar 4 | Proof of margin-based generalization error bounds; contraction inequality for Rademacher complexity; SVM formulations | Scribe note (0304/) by Ruby Abrams | Note 14 | SSBD 26.3 | |
| Mar 9 | No class (reading day) | | | | |
| Mar 11 | ℓ2-norm-based margin bounds; extensions of SVM; regularized loss minimization formulations | Scribe note (0311/) by Shahriar Golchin | Note 15 | SSBD 26.3; Chap 15; Spectrally-normalized margin bounds for neural networks | |
| Mar 16 | Stability, strong convexity, and regularization | Scribe note (0316/) by Adrienne Kinney | Note 16 | SSBD Chap 13 | |
| Mar 18 | Stability-fitting tradeoff; online learning: definitions and examples | Scribe note (0318/) by Sarah Luca | Note 17 | SSBD Sec 13.4; O Chap 1; Haipeng Luo's online learning lecture notes 1 (on which this lecture is heavily based) | |
| Mar 23 | Online-to-batch conversion; Azuma's inequality; online gradient descent | Scribe note (0323/) by Sheldon Deeny | Note 18 | O Chap 3, Chap 2 | |
| Mar 25 | Analysis of online gradient descent; online mirror descent: basic definitions (class meeting cancelled; pre-recorded lecture on Panopto) | Scribe note (0325/) by Caleb Dahlke | Note 19 | O Chap 2, Sec 6.1-6.3 | |
| Mar 30 | Online mirror descent examples: p-norm, exponential weights; Fenchel conjugate | Scribe note (0330/) by Erik Wessel | Note 20 | O Theorem 2.19, Sec 5.2.1, 6.4.1, 6.6, 6.7 | HW3 |
| Apr 1 | Online mirror descent analysis; online learning odds & ends: unknown time horizon, lower bounds, Follow the Regularized Leader | Scribe note (0401/) by Yao Zhao | Note 21 | O Sec 6.4, 5.1, 7.1 | |
| Apr 6 | Online gradient descent for strongly convex functions; kernel methods | Scribe note (0406/) by Zisu Wang | Note 22 | SSBD Sec 14.4.4, 14.5.3, 15.5, 16.2, 16.3 | |
| Apr 8 | Finish kernel methods; online Newton step for exp-concave functions | Scribe note (0408/) by Robert Vacareanu | Note 23 | SSBD Sec 16.3; O 7.9 | |
| Apr 13 | Finish online Newton step; begin multi-armed bandits (MAB) | | Note 24 | LS Chap 4 | |
| Apr 15 | Explore-then-commit; upper confidence bound (UCB) algorithm and analysis | | Note 25 | LS Chap 6, 7 | |
| Apr 20 | Finish UCB analysis; adversarial MAB; EXP3 algorithm | | | | |
| Apr 22 | Linear bandits; MDPs | | | | |
| Apr 27 | | | | | |
| Apr 29 | Project presentation I | | | | |
| May 4 | Project presentation II | | | | |