YSC3227 Machine Learning

Semester 1, 2017/2018
Yale-NUS College



Description:

The goal of machine learning is to enable machines/computers to identify patterns in data, extract those patterns, and, based on them, make inferences or predictions automatically. These capabilities are at the core of artificial intelligence (namely, making machines learn without being explicitly programmed with fixed, predetermined rules). The applications of machine learning are immense, since nowadays we are flooded with huge amounts of data from a wide variety of sources. We hope machine learning can make sense of this vast, seemingly random data.

This course focuses on the fundamentals of machine learning, including deep learning. It should interest students who want to study or work in big data, AI (artificial intelligence), and data science.


Prerequisite: Programming skill in Python.
Textbook: "Pattern Recognition and Machine Learning", by Christopher Bishop.
Instructor: Robby T. Tan (robby.tan [att] yale-nus.edu.sg)


Schedule:

The following schedule is tentative and may not be strictly followed.


Date | Topic | Lecture Notes
August 15 1. LEAST SQUARES

Reading:
  • Chapter 1 (Introduction): Sect. 1.1 (Polynomial Curve Fitting)

Additional resources (optional):
Lecture note 1
Assignment 1
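As a taste of this first topic, here is a minimal sketch of a linear least-squares fit in plain Python (an assumed illustration, not course-provided code):

```python
# Minimal sketch (assumed illustration): fit y ~ w0 + w1*x by
# minimizing the sum of squared errors, using the closed-form solution.

def least_squares_line(xs, ys):
    """Closed-form 1-D linear least-squares fit; returns (w0, w1)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope w1 = Cov(x, y) / Var(x); intercept w0 = mean_y - w1 * mean_x.
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    w1 = sxy / sxx
    w0 = mean_y - w1 * mean_x
    return w0, w1

# Points lying exactly on y = 2x + 1 recover those coefficients.
w0, w1 = least_squares_line([0, 1, 2, 3], [1, 3, 5, 7])
```

The polynomial curve fitting of Sect. 1.1 generalizes this idea to higher-degree polynomials.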
August 18 2. INTRO TO BAYESIAN INFERENCE

Reading:
  • Chapter 1 (Introduction): Sect. 1.2 (Probability Theory), Sect. 1.3 (Model Selection)
  • Introduction to Bayesian inference: video

Lecture note 2
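The flavor of Bayesian updating can be previewed with the classic coin-flip example (an assumed illustration, not from the lecture notes): a Beta prior on a coin's heads probability is conjugate to the Bernoulli likelihood, so each observation simply increments one of the prior's counts.

```python
# Minimal sketch (assumed example): Beta-Bernoulli conjugate updating.
# Prior Beta(a, b); observing heads increments a, tails increments b.

def update_beta(a, b, heads):
    """Posterior Beta parameters after observing one coin flip."""
    return (a + 1, b) if heads else (a, b + 1)

a, b = 1, 1                      # Beta(1, 1) = uniform prior
for flip in [1, 1, 0, 1]:        # observed data: H, H, T, H
    a, b = update_beta(a, b, flip)

posterior_mean = a / (a + b)     # E[theta | data] = a / (a + b)
```

After three heads and one tail the posterior is Beta(4, 2), with mean 4/6 — the prior pulls the estimate toward 1/2 relative to the raw frequency 3/4.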
August 22 3. MLE FOR REGRESSION

Reading:
  • Bayesian Inference: An Introduction to Principles and Practice in Machine Learning (Sect. 2.1, 2.1.1, and 2.1.2 only): pdf

Additional resources:
Lecture note 3
August 25 4. MAP FOR REGRESSION

Additional resources:
Lecture note 4
Assignment 2
August 29 5. BASIS FUNCTIONS

Reading:
  • Chapter 3: Linear Models for Regression (Sect. 3.1 only)

Additional resources:
Lecture note 5
September 5 6. GAUSSIAN DISTRIBUTIONS: COVARIANCE MATRIX

Reading:
  • Covariance matrix: wikipedia
  • Geometric interpretation of covariance matrix: website

Lecture note 6
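A sample covariance matrix for 2-D data can be computed directly from its definition (a minimal sketch, assuming the divisor-n convention; not course-provided code):

```python
# Minimal sketch (assumed illustration): sample covariance matrix of
# 2-D data points, C[i][j] = E[(x_i - mu_i) * (x_j - mu_j)].

def covariance_matrix(data):
    """data: list of (x, y) pairs; returns the 2x2 sample covariance."""
    n = len(data)
    mx = sum(p[0] for p in data) / n
    my = sum(p[1] for p in data) / n
    cxx = sum((p[0] - mx) ** 2 for p in data) / n
    cyy = sum((p[1] - my) ** 2 for p in data) / n
    cxy = sum((p[0] - mx) * (p[1] - my) for p in data) / n
    return [[cxx, cxy], [cxy, cyy]]

# Perfectly correlated points give equal variances and covariance.
C = covariance_matrix([(0, 0), (1, 1), (2, 2)])
```

Geometrically, the eigenvectors of this matrix give the principal axes of the data cloud, which is the interpretation discussed in the reading.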
September 8 7. SEQUENTIAL BAYESIAN LEARNING

Reading:
  • Chapter 3: Sect. 3.3.1 (Parameter Distribution)

Lecture note 7a
Lecture note 7b
September 12 8. PREDICTIVE DISTRIBUTION

Reading:
  • Chapter 3: Sect. 3.3.2 (Predictive Distribution)
  • Bayesian Inference: An Introduction to Principles and Practice in Machine Learning (From Section 1 to Section 3 only): pdf

Lecture note 8
Assignment 3
September 15 9. REVIEW: BAYESIAN METHODS FOR REGRESSION

Lecture note 9
September 19 10. QUIZ 1

Sample Questions
September 22 11. GAUSSIAN PROCESSES 1

Reading:
  • Chapter 6: Sect. 6.4.1 (Linear Regression Revisited) and Sect. 6.4.2 (Gaussian Processes for Regression)
Additional resources:
Lecture note 11
RECESS WEEK

October 3 12. GAUSSIAN PROCESSES 2

See Lecture note 11
Assignment 4

October 6 13. LEAST-SQUARES FOR CLASSIFICATION: 1
Reading:
  • Chapter 4: Sect. 4.1.1 (two classes), 4.1.2 (multiple classes), 4.1.3 (least squares for classification)

Lecture note 13
October 10 14. LEAST-SQUARES FOR CLASSIFICATION: 2

See Lecture note 13
October 13 15. PROBABILISTIC CLASSIFICATION: MLE

Reading:
  • Sect. 4.3.2 (logistic regression) and 4.3.3 (iterative reweighted least squares)
Lecture note 15
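The key ingredient of logistic regression is the logistic sigmoid, which squashes a linear score into a class probability in (0, 1). A minimal sketch (an assumed illustration, not course-provided code):

```python
import math

# Minimal sketch (assumed illustration): the logistic sigmoid maps a
# real-valued linear score a = w^T x to a probability p(C1 | x).

def sigmoid(a):
    """Logistic sigmoid: 1 / (1 + exp(-a)), monotone in a."""
    return 1.0 / (1.0 + math.exp(-a))

p = sigmoid(0.0)   # a score of 0 gives probability 0.5
```

Because the sigmoid is monotone, the decision boundary p = 0.5 corresponds to the linear surface w^T x = 0.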
October 17 16. LAPLACE APPROXIMATION

Reading:
  • Chapter 4: Sect. 4.4 (The Laplace Approximation)

Assignment 5
Lecture note 16
October 20 17. BAYESIAN LOGISTIC REGRESSION

Reading:
  • Chapter 4: Sect. 4.5 (Bayesian Logistic Regression)

October 24 18. GAUSSIAN PROCESSES FOR CLASSIFICATION

October 27 19. NEURAL NETWORKS: BASIC

October 31 20. NEURAL NETWORKS: BACKPROPAGATION

November 3 21. DEEP LEARNING: TRAINING

November 7 22. DEEP LEARNING: FURTHER ISSUES

November 10 23. DEEP LEARNING: ARCHITECTURES

November 14 24. MIXTURE MODELS + EM ALGORITHM

November 17 25. DISCUSSION

December 1 FINAL EXAM: 1pm to 4pm (Classroom 20)





Syllabus: