Tentative schedule:
Date | Topics | Notes | Additional Readings | Homework |
---|---|---|---|---|
Aug 27 | Introduction, motivation, course mechanics | slides; probability review | SSBD App B.1 and B.2 | HW0 |
Aug 29 | Concentration of measure | concentration of measure note 1 | SSBD App B.4 | |
Sep 3 | Chernoff bound for Bernoulli random variables, McDiarmid’s Inequality | concentration of measure note 2 | SSBD App B.3, Lemma 26.4 | HW0 due in class |
Sep 5 | The PAC learning framework, finite classes | PAC learning note | SSBD Chap 2, Sec 3.1 | |
Sep 10 | Agnostic PAC learning, infinite classes | | SSBD Sec 3.2-3.3, Chap 4, Sec 6.1 | |
Sep 12 | VC Theory, Sauer’s Lemma | VC Theory note | SSBD Sec 6.2 - 6.5.1 | |
Sep 17 | Rademacher complexity | Uniform convergence / Rademacher complexity note | SSBD Sec 6.5.2, 26.1, 28.1 | HW1 |
Sep 19 | Information-theoretic lower bounds of PAC learning | | SSBD Sec 28.2.1 | |
Sep 24 | Le Cam’s method | | SSBD Sec 28.2.1 | |
Sep 26 | Assouad’s method, fundamental theorem of statistical learning | Lower bound note | SSBD Sec 28.2.2, Sec 5.1 | HW2 |
Oct 1 | Support Vector Machine (SVM); the maximum margin hyperplane problem | | SSBD Chap 15, Secs 26.2, 26.3 | |
Oct 3 | Soft margin, nonlinear mapping, the dual of SVM | | SSBD Sec 15.4 | |
Oct 8 | Kernel trick | | SSBD Chap 16 | |
Oct 10 | Margin bounds, generalization bounds of SVM | | SSBD Sec 26.3 | |
Oct 15 | Proof of margin bounds via Rademacher complexity | SVM note | SSBD Secs 26.2, 26.4 | |
Oct 17 | Model selection, structural risk minimization | Model selection note | SSBD Chap 7, Modern bias-variance tradeoff | Midterm (take home) |
Oct 22 | Boosting: AdaBoost | | SSBD Chap 10 | |
Oct 24 | Boosting and margin bound; two styles of margin bounds | Boosting note | SSBD Sec 26.2, Boosted ResNet paper | |
Oct 29 | Intro to online learning; online classification | | SSBD Sec 21.1 | HW3 |
Oct 31 | Minimax analysis: Littlestone’s dimension; Expert problem | Online classification note | SSBD Sec 21.1.1 | |
Nov 5 | Cover’s Impossibility Result; decision-theoretic online learning | Handwritten notes | SSBD Sec 21.2 | |
Nov 7 | Hedge; online to batch conversion | Online to batch conversion notes | SSBD Sec 21.2.1, Hedge paper; Loss range adaptivity | |
Nov 12 | Convexity, subgradients | | SSBD Sec 12.1, 14.2 | |
Nov 14 | Bregman divergence, norms, dual norms | |||
Nov 19 | Lipschitzness, strong convexity and smoothness | | SSBD Sec 12.1, 13.3 | HW4 |
Nov 21 | Legendre-Fenchel conjugacy, online convex optimization, Follow the regularized leader (FTRL) | OCO note | Section 2.7, 2.3 of Shalev-Shwartz’s Survey | |
Nov 26 | Analysis of FTRL; FTRL examples | | H Sec 5.3 | |
Nov 28 | No class: Thanksgiving holiday | | | |
Dec 3 | Online linear classification with margins; FTRL with adaptive regularization | | Section 3.3 of Shalev-Shwartz’s Survey; H Sec 5.6 | |
Dec 5 | Adagrad, online optimization of strongly convex functions | | H Sec 3.3 | |
Dec 10 | Final presentations | Topics to review | | |