CSC 480/580: Principles of Machine Learning - Spring 2024
Tentative schedule
Please refer to this course’s Fall 2022 version for a preview of the lecture slides.
Date | Topics | Notes | Readings | Homework |
---|---|---|---|---|
Jan 11 | Introduction, motivation, course mechanics | slides | CIML Chap. 1 | HW0 |
Jan 16 | Supervised learning paradigm; decision trees | slides | CIML Chap. 2 | |
Jan 18 | Finish decision trees; Bayes classifier and its error | | CIML Chap. 3 | HW0 due |
Jan 23 | Overfitting: detection and mitigation; Hyperparameter selection | slides | ||
Jan 25 | Geometry & Nearest neighbors | slides | CIML Chap. 3 | HW1 |
Jan 30 | K-means clustering; Probability review | slides | ||
Feb 1 | HW0 review; Begin linear models | | CIML Chap. 4 | |
Feb 6 | Linear models; the Perceptron algorithm | slides | CIML Chap. 5 | |
Feb 8 | Practical considerations I: features in supervised learning | |||
Feb 13 | Practical considerations II: performance measures beyond error rates ||||
Feb 15 | Practical considerations III: confidence intervals, hypothesis testing, debugging ML algorithms, bias-variance tradeoff | slides | | HW1 due |
Feb 20 | Linear models for regression I: least squares, maximum likelihood | | CIML Chap. 7 | |
Feb 22 | Linear models for regression II: regularization | slides | | HW2 |
Feb 27 | Linear models for classification: logistic regression | |||
Feb 29 | Linear models for classification: SVM; midterm review | slides | ||
Mar 5 | Spring Recess | |||
Mar 7 | Spring Recess | |||
Mar 12 | Midterm exam | |||
Mar 14 | Nonlinear models: basis functions; begin kernels | slides | CIML Chap. 11 | |
Mar 19 | Finish kernels | |||
Mar 21 | Unsupervised learning basics: clustering | | CIML Chap. 15 | Project proposal due Mar 22 |
Mar 26 | Unsupervised learning: PCA | slides | ||
Mar 28 | Finish PCA; Begin Probabilistic ML | | | HW3 (data: three.txt, eight.txt) |
Apr 2 | Probabilistic ML: Bayes nets; Basic examples | slides | CIML Chap. 9 | |
Apr 4 | Probabilistic ML: Naive Bayes | | CIML Chap. 16 | |
Apr 9 | Finish Naive Bayes; EM; Gaussian Mixture Models | slides | ||
Apr 11 | Finish EM for Gaussian Mixture Models; begin Neural Networks | slides | Stanford CS231n note: optimization | |
Apr 16 | Neural networks: backpropagation; considerations in architecture design | | Stanford CS231n note: backprop | |
Apr 18 | Neural networks: batch normalization; Begin convolutional neural networks | | Stanford CS231n note: convnets | HW4 |
Apr 23 | Finish convolutional neural networks; Autoencoders | | NanoGPT tutorial by Andrej Karpathy | |
Apr 25 | Reinforcement learning: MDP; policy evaluation | | Deep RL: Pong from pixels by Andrej Karpathy | |
Apr 30 | Project feedback session; Final review | slides | ||
May 7 | Final exam, 8:00–10:00 am | | | |