This course is offered jointly with the Information Processing Group.
Previous year’s website: ML 2017
Contact us: use the Moodle discussion forum, or the contact details below:
Instructor: Martin Jaggi, martin.jaggi@epfl.ch, Office INJ 341, Phone +41 21 69 35226, office hours by appointment
Instructor: Ruediger Urbanke, ruediger.urbanke@epfl.ch, Office INR 116, Phone +41 21 69 37692, office hours by appointment
Teaching Assistants  Student Assistants 
Lectures  Tuesday  17:15 – 19:00  Room: CO1 
Thursday  16:15 – 18:00  Room: SG1 

Exercises  Thursday  14:15 – 16:00  Rooms: INF119 (ACar), INJ218 (CavG), INF2 (HMo), INM202 (MuSing), INR219 (SinnZ) 
Language: English
Credits: 7 ECTS
For a summary of the logistics of this course, see the course info sheet here (PDF), as well as the official coursebook information.
Special Announcements

The final exam is scheduled for Tuesday 15.01.2019, from 16:15 to 19:15, in STCC08328.

The first Tuesday lecture will take place in the Rolex Learning Center for capacity reasons.

The new exercise sheet, as well as the solutions (code only) for the previous week's lab session, will typically be made available each Tuesday (here and on GitHub).

Some old final exams: final 2017, solutions 2017, final 2016, solutions 2016
Some old mock exams: 2016, 2015, 2014
Projects: There will be two group projects during the course.

Project 1 counts for 10% of the grade and is due Oct 29th.

Project 2 counts for 30% of the grade and is due Dec 20th.


Labs and projects will be in Python. See Lab 1 to get started.
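To give a flavor of the NumPy-based style used in the labs, here is a minimal sketch of fitting a linear regression by solving the normal equations. This is an illustration only, assuming just NumPy; the function names `least_squares` and `compute_mse` are chosen for readability and are not necessarily the exact lab templates.

```python
import numpy as np

# Toy data: y = 2*x + 1 plus a little noise.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=50)
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(50)

# Feature matrix with a constant column for the bias term.
tx = np.c_[np.ones_like(x), x]

def least_squares(y, tx):
    """Solve the normal equations (tx^T tx) w = tx^T y."""
    return np.linalg.solve(tx.T @ tx, tx.T @ y)

def compute_mse(y, tx, w):
    """Cost L(w) = ||y - tx w||^2 / (2 N)."""
    e = y - tx @ w
    return e @ e / (2 * len(y))

w = least_squares(y, tx)  # recovers roughly [1, 2]
print(w, compute_mse(y, tx, w))
```

Lab 1 covers installing Python, NumPy, and Jupyter, so snippets like this can be run directly in a notebook.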

Code repository for labs, projects, and lecture notes: github.com/epfml/ML_course

The exam is closed book, but you are allowed one crib sheet (A4 paper, both sides may be used), either handwritten or in at least 11-point font. Bring a pen and a white eraser. You can find last year's exam with solutions posted a few lines above.
SOLUTIONS for the exam. NOTE: there was a mistake in the grading of Problem 14: the training error will increase as a function of lambda (not decrease), since the model family becomes simpler. This will be corrected globally; no need to ask us about it individually.
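The point about Problem 14 can be checked numerically: the ridge training error is non-decreasing in lambda, because a larger penalty pulls the solution away from the unpenalized minimizer. A minimal sketch assuming NumPy; the closed-form solver below uses one common scaling of the penalty (2·N·lambda), and all names are illustrative:

```python
import numpy as np

# Random linear data set.
rng = np.random.default_rng(1)
N, D = 30, 5
tx = rng.standard_normal((N, D))
y = tx @ rng.standard_normal(D) + 0.1 * rng.standard_normal(N)

def ridge_regression(y, tx, lam):
    """Closed form: w = (tx^T tx + 2 N lam I)^{-1} tx^T y."""
    a = tx.T @ tx + 2 * len(y) * lam * np.eye(tx.shape[1])
    return np.linalg.solve(a, tx.T @ y)

def train_mse(y, tx, w):
    e = y - tx @ w
    return e @ e / (2 * len(y))

# Training error for increasing penalties: the sequence is
# non-decreasing in lambda.
errs = [train_mse(y, tx, ridge_regression(y, tx, lam))
        for lam in (0.0, 0.1, 1.0, 10.0)]
print(errs)
```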
Detailed Schedule
(tentative, subject to changes)
Annotated lecture notes from each class are made available on GitHub here.
Date  Topics Covered  Lectures  Exercises  Projects 

18/9  Introduction (in RLC)  01a,01b  
20/9  Linear Regression, Cost functions  01c,01d  Lab 1  
25/9  Optimization  02a  
27/9  Optimization  Lab 2  
02/10  Least Squares, ill-conditioning, Max Likelihood  03a,03b  Project 1 details  
04/10  Overfitting, Ridge Regression, Lasso  03c,03d  Lab 3  
09/10  Cross-Validation  04a  
11/10  Bias-Variance decomposition  04b  Lab 4  
16/10  Research Talk, Transfer Learning & AutoML – Google Brain  
18/10  Research Talk, Community detection and stochastic block models – Emmanuel Abbe  Q&A for proj.  
23/10  Classification  05a  
25/10  Logistic Regression  05b  Lab 5  
30/10  Generalized Linear Models  06a  Proj. 1 due 29.10.  
01/11  K-Nearest Neighbor  06b  Lab 6  
06/11  Support Vector Machines  07a  Project 2 details  
08/11  Kernel Regression  07b  Lab 7  
13/11  Unsupervised Learning, K-Means  08a  
15/11  K-Means, Gaussian Mixture Models  08b, 08c  Lab 8  
20/11  Mock Exam  
22/11  Gaussian Mixture Models, EM algorithm  09a  Mock Exam  solutions 
27/11  Matrix Factorizations  10a  
29/11  Text Representation Learning  10b  Lab 10  
04/12  SVD and PCA  11a  
06/12  SVD and PCA  Lab 11 + Q&A  
11/12  Neural Networks – Basics, Representation Power  12a, 12b  
13/12  Neural Networks – Backpropagation, Activation Functions  12c, 12d  Lab 12 + Q&A  
18/12  Neural Networks – CNN, Regularization, Data Augmentation, Dropout  13a, 13b  
20/12  Graphical Models – Bayes Nets  13c  Lab 13  Proj. 2 due 20.12. 
Textbooks
(not mandatory)
Christopher Bishop, Pattern Recognition and Machine Learning
Kevin Murphy, Machine Learning: A Probabilistic Perspective
Shai Shalev-Shwartz, Shai Ben-David, Understanding Machine Learning
Michael Nielsen, Neural Networks and Deep Learning
Projects
Final projects were chosen from among 5 topic options.
The interdisciplinary ‘ML for Science’ projects performed this year across campus were:
Compressive Sensing MRI using Deep Learning
Predicting the density of lightning activity from atmospheric and geographic features
Machine Learning for Science: Quantum Machine Learning
Solar Panel Recognition and Segmentation on Swiss Map using Convolutional Neural Networks
ML for crystal structure determination as an alternative to NMR spectroscopy
Human Behavior Modelling
Domain-invariant defect detection using deep learning
Comparing classification techniques for metabolic kinetic models selection
Spatially-Inferred Graphical Models for fMRI Data Mining
Crowded enzyme kinetics using simulation and machine learning
Quality of life in Swiss Cities
Ultrathin section segmentation
Correlations between cognitive performance and sensory stimuli in the work environment
Autism Diagnostic based on Machine Learning
Human performance modelling according to indoor temperature and light (quantity and colour)
Automatic Harmonization using Recurrent Neural Networks
Chord recognition on Beethoven string quartets
Machine Learning Privacy
Quality of Life Clustering of Swiss Cities from Insurance and Demographic Data
Machine learning for air quality measurement and modeling
The Case for Bagged Neural Networks: Evidence from Outlier Detection using Autoencoder Ensembles
Chord Prediction with The Annotated Beethoven Corpus
Predicting Forces on a Flapping Wing Model using Machine Learning
Brain Tissue Segmentation
Clustering and Predicting Swiss cities based on Insurance Data
Predicting the material properties of the arterial wall of a mouse
Predicting organic carbon with infrared spectra
3D Point Cloud Super-Resolution for Digital Humanities
Segmentation Of Silicon Wafers For Electron Microscopy Using Mask R-CNN
Classifying Nanopore Readings with Deep Learning
Wind Profile Prediction in an Urban Canyon: a Machine Learning Approach
A Stem Cell Classifier for Single Cell RNA Sequencing Data
Deep Convolutional Neural Networks for Cell Segmentation in Bright-Field Microscopy Images
Predicting Aerosol Particles: Sulfate, Nitrate and PM2.5
Healthy aging: age group prediction from chunking strategies during motor sequence learning
Multilingual text classifier for social media data
Implementation of an Improved Model for the Prediction of Effective Rate Constants in the Presence of Crowders
Architecture of Feelings
Evaluating the quality of videos through machine learning
Classifying segmentation defects in mutant zebrafish embryo
Fingerprinting DNS-over-HTTPS traffic
Statistics of Turkish researchers after the 2016 Coup d’Etat attempt
Analysis of the dismissal of Turkish researchers after the 2016 Coup d’Etat attempt
Evaluating the risk of relapse in melanoma
Improving Generalization and Stability of Generative Adversarial Networks
Meta-learning with differentiable closed-form solvers
A Resizable Mini-batch Gradient Descent based on a Multi-Armed Bandit
Detecting Adversarial Examples via Neural Fingerprinting
Learning Neural PDE Solvers with Convergence Guarantees
AutoLoss: Learning Discrete Schedules for Alternate Optimization
MAE: Mutual Posterior-Divergence Regularization for Variational Autoencoders
Hyper-Regularization: An Adaptive Choice for the Learning Rate in Gradient Descent