Machine Learning CS-433

This course is offered jointly with the Information Processing Group.
Previous year’s website: ML 2017

Contact us: use the Moodle discussion forum, or any of the contact details below:

Instructor: Martin Jaggi
Office: INJ 341
Phone: +41 21 69 35226
Email: martin.jaggi@epfl.ch
Office hours: by appointment

Instructor: Ruediger Urbanke
Office: INR 116
Phone: +41 21 69 37692
Email: ruediger.urbanke@epfl.ch
Office hours: by appointment
Lectures: Tuesday 17:15 – 19:00, Room CO1; Thursday 16:15 – 18:00, Room SG1
Exercises: Thursday 14:15 – 16:00, Rooms INF119 (A-Car), INJ218 (Cav-G), INF2 (H-Mo), INM202 (Mu-Sing), INR219 (Sinn-Z)
Language: English
Credits: 7 ECTS

For a summary of the logistics of this course, see the course info sheet here (PDF), as well as the official coursebook information here.

Special Announcements

  • The final exam is scheduled for Tuesday 15.01.2019 from 16h15 to 19h15 in STCC08328.
  • The first Tuesday lecture will take place in the Rolex Learning Center for capacity reasons.
  • The new exercise sheet, as well as the solution (code only) for the previous week’s lab session, will typically be made available each Tuesday (here and on GitHub).
  • Some old final exams: final 2016, solutions 2016, solutions 2017
    Some old mock exams: 2016, 2015, 2014
  • Projects: There will be two group projects during the course.
    • Project 1 counts for 10% of the grade and is due Oct 29.
    • Project 2 counts for 30% of the grade and is due Dec 20.
  • Labs and projects will be in Python. See Lab 1 to get started; a minimal example in the spirit of the labs is sketched after this list.
  • Code repository for labs, projects, and lecture notes: github.com/epfml/ML_course
  • The exam is closed book, but you are allowed one crib sheet (A4 paper, both sides may be used), either handwritten or typed in a font of at least 11 points. Bring a pen and a white eraser. The exam from last year, with solutions, is posted a few lines above.
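
Since the labs and projects are in Python, here is a minimal, self-contained sketch of the kind of code they involve: linear regression fitted by gradient descent with NumPy. This is not taken from the course material; the function names, step size, and toy data are illustrative only.

    import numpy as np

    def compute_mse(y, tx, w):
        # Mean squared error cost: (1/2N) * sum of squared residuals.
        e = y - tx @ w
        return 0.5 * np.mean(e ** 2)

    def gradient_descent(y, tx, initial_w, max_iters=200, gamma=0.1):
        # Plain gradient descent on the MSE cost (illustrative only).
        w = initial_w
        for _ in range(max_iters):
            e = y - tx @ w
            grad = -tx.T @ e / len(y)   # gradient of the MSE cost
            w = w - gamma * grad        # step against the gradient
        return w, compute_mse(y, tx, w)

    # Toy data: noisy samples from a known linear model, with a constant feature.
    rng = np.random.default_rng(0)
    tx = np.c_[np.ones(100), rng.normal(size=100)]
    y = tx @ np.array([1.0, 2.0]) + 0.1 * rng.normal(size=100)

    w, loss = gradient_descent(y, tx, np.zeros(2))
    print(w, loss)   # w should be close to [1.0, 2.0]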

Detailed Schedule

(tentative, subject to change)
Annotated lecture notes from each class are made available on GitHub here.

Date  | Topics Covered                                                     | Lectures | Exercises     | Projects
18/9  | Introduction (in RLC)                                              | 01a, 01b |               |
20/9  | Linear Regression, Cost Functions                                  | 01c, 01d | Lab 1         |
25/9  | Optimization                                                       | 02a      |               |
27/9  | Optimization                                                       |          | Lab 2         |
02/10 | Least Squares, Ill-Conditioning, Max Likelihood                    | 03a, 03b |               | Project 1 details
04/10 | Overfitting, Ridge Regression, Lasso                               | 03c, 03d | Lab 3         |
09/10 | Cross-Validation                                                   | 04a      |               |
11/10 | Bias-Variance Decomposition                                        | 04b      | Lab 4         |
16/10 | Research Talk: Transfer Learning & AutoML – Google Brain           |          |               |
18/10 | Research Talk: Community Detection and Stochastic Block Models – Emmanuel Abbe | | Q&A for proj. |
23/10 | Classification                                                     | 05a      |               |
25/10 | Logistic Regression                                                | 05b      | Lab 5         |
30/10 | Generalized Linear Models                                          | 06a      |               | Proj. 1 due 29.10.
01/11 | K-Nearest Neighbor                                                 | 06b      | Lab 6         |
06/11 | Support Vector Machines                                            | 07a      |               | Project 2 details
08/11 | Kernel Regression                                                  | 07b      | Lab 7         |
13/11 | Unsupervised Learning, K-Means                                     | 08a      |               |
15/11 | K-Means, Gaussian Mixture Models                                   | 08b, 08c | Lab 8         |
20/11 |                                                                    |          | Mock Exam     |
22/11 | Gaussian Mixture Models, EM Algorithm                              |          | Mock Exam     |
27/11 | Matrix Factorizations                                              |          |               |
29/11 | Text Representation Learning                                       |          | Lab 10        |
04/12 | SVD and PCA                                                        |          |               |
06/12 | SVD and PCA; Neural Networks – Basics                              |          | Lab 11        |
11/12 | Neural Networks – Representation Power                             |          |               |
13/12 | Neural Networks – Backpropagation, Activation Functions            |          | Q&A for proj. |
18/12 | Neural Networks – CNN, Regularization, Data Augmentation, Dropout  |          |               |
20/12 | Graphical Models – Bayes Nets                                      |          | Lab 13        | Proj. 2 due 20.12.

Textbooks

(not mandatory)

Christopher Bishop, Pattern Recognition and Machine Learning
Kevin Murphy, Machine Learning: A Probabilistic Perspective
Shai Shalev-Shwartz, Shai Ben-David, Understanding Machine Learning
Michael Nielsen, Neural Networks and Deep Learning