Machine Learning CS-433 – 2018

This is the OLD 2018 course website. For the current one, see here.

This course is offered jointly with the Information Processing Group.
Previous year’s website: ML 2017

Contact us: Use the Moodle discussion forum, or the contact details below:

Instructor: Martin Jaggi
Office: INJ 341
Phone: +41 21 69 35226
Email: [email protected]
Office Hours: By appointment

Instructor: Ruediger Urbanke
Office: INR 116
Phone: +41 21 69 37692
Email: [email protected]
Office Hours: By appointment
Lectures Tuesday 17:15 – 19:00 Room: CO1
Thursday 16:15 – 18:00 Room: SG1
Exercises Thursday 14:15 – 16:00 Rooms: INF119 (A-Car),  INJ218 (Cav-G),  INF2 (H-Mo),  INM202 (Mu-Sing),  INR219 (Sinn-Z)
Language: English
Credits : 7 ECTS

For a summary of the logistics of this course, see the course info sheet here (PDF).
(See also the official coursebook information here.)

Special Announcements

  • The final exam is scheduled for Tuesday 15.01.2019, from 16:15 to 19:15, in STCC08328.
  • The first Tuesday lecture will take place in the Rolex Learning Center for capacity reasons.
  • The new exercise sheet, as well as the solutions (code only) for the previous week’s lab session, will typically be made available each Tuesday (here and on GitHub).
  • Some old final exams: final 2017, solutions 2017, final 2016, solutions 2016
    Some old mock exams: 2016, 2015, 2014
  • Projects: There will be two group projects during the course.
    • Project 1 counts for 10% of the grade and is due Oct 29.
    • Project 2 counts for 30% of the grade and is due Dec 20.
  • Labs and projects will be in Python. See Lab 1 to get started.
  • Code repository for labs, projects, and lecture notes: github.com/epfml/ML_course
  • The exam is closed book, but you are allowed one crib sheet (A4 paper, both sides may be used), either handwritten or in a font of at least 11 points. Bring a pen and a white eraser. The exam from last year, with solutions, is posted a few lines above.
  • SOLUTIONS for the exam. NOTE: There was a mistake in the grading of problem 14: the training error will increase as a function of lambda (not decrease), since the model family becomes simpler. This will be corrected globally; there is no need to ask us about it individually.
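The note about problem 14 reflects a general property of ridge regression: as the regularization strength lambda grows, the fitted weights are pulled away from the unregularized least-squares minimizer, so the training error can only stay the same or increase. A minimal sketch of this effect, using synthetic data and the closed-form ridge solution (all names and numbers here are illustrative, not from the course material):

```python
import numpy as np

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
N, D = 50, 5
X = rng.standard_normal((N, D))
w_true = rng.standard_normal(D)
y = X @ w_true + 0.1 * rng.standard_normal(N)

def ridge_train_mse(lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y
    w = np.linalg.solve(X.T @ X + lam * np.eye(D), X.T @ y)
    residual = y - X @ w
    return float(residual @ residual / N)

# Training error is non-decreasing in lambda.
errors = [ridge_train_mse(lam) for lam in [0.0, 0.1, 1.0, 10.0]]
assert all(a <= b for a, b in zip(errors, errors[1:]))
```

Note that this monotonicity holds only for the training error; the test error typically first decreases with lambda (less overfitting) before increasing again (underfitting).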

Detailed Schedule

(tentative, subject to changes)
Annotated lecture notes from each class are made available on github here.

Date Topics Covered Lectures Exercises Projects
18/9 Introduction (in RLC) 01a,01b
20/9 Linear Regression, Cost functions 01c,01d Lab 1
25/9 Optimization 02a
27/9 Optimization Lab 2
02/10 Least Squares, ill-conditioning, Max Likelihood 03a,03b Project 1 details
04/10 Overfitting, Ridge Regression, Lasso 03c,03d Lab 3
09/10 Cross-Validation 04a
11/10 Bias-Variance decomposition 04b Lab 4
16/10 Research Talk, Transfer Learning & AutoML – Google Brain
18/10 Research Talk,  Community detection and stochastic block models – Emmanuel Abbe Q&A for proj.
23/10 Classification 05a
25/10 Logistic Regression 05b Lab 5
30/10 Generalized Linear Models 06a Proj. 1 due 29.10.
01/11 K-Nearest Neighbor 06b Lab 6
06/11 Support Vector Machines 07a Project 2 details
08/11 Kernel Regression 07b Lab 7
13/11 Unsupervised Learning, K-Means 08a
15/11 K-Means, Gaussian Mixture Models 08b, 08c Lab 8
20/11 Mock Exam
22/11 Gaussian Mixture Models, EM algorithm 09a Mock Exam solutions
27/11 Matrix Factorizations 10a
29/11 Text Representation Learning 10b Lab 10
04/12 SVD and PCA 11a
06/12 SVD and PCA Lab 11 + Q&A
11/12 Neural Networks – Basics, Representation Power  12a,  12b
13/12 Neural Networks – Backpropagation, Activation Functions  12c,  12d Lab 12 + Q&A
18/12 Neural Networks – CNN, Regularization, Data Augmentation, Dropout  13a,  13b
20/12 Graphical Models – Bayes Nets  13c Lab 13 Proj. 2 due 20.12.

Textbooks

(not mandatory)

Christopher Bishop, Pattern Recognition and Machine Learning
Kevin Murphy, Machine Learning: A Probabilistic Perspective
Shai Shalev-Shwartz, Shai Ben-David, Understanding Machine Learning
Michael Nielsen, Neural Networks and Deep Learning

Projects

Final projects were chosen from among 5 topic options.

The interdisciplinary ‘ML for Science’ projects performed this year across campus were:

Quality of Life in Swiss Cities based on OpenStreetMap
Compressive Sensing MRI using Deep Learning
Predicting the density of lightning activity from atmospheric and geographic features
Machine Learning for Science: Quantum Machine Learning
Solar Panel Recognition and Segmentation on Swiss Map using Convolutional Neural Networks
ML for crystal structure determination as an alternative to NMR spectroscopy
Human Behavior Modelling
Domain-invariant defect detection using deep learning
Comparing classification techniques for metabolic kinetic models selection
Spatially-Inferred Graphical Models for fMRI Data Mining
Crowded enzyme kinetics using simulation and machine learning
Quality of life in Swiss Cities
Ultrathin section segmentation
Correlations between cognitive performance and sensory stimuli in the work environment
Autism Diagnostic based on Machine Learning
Human performance modelling according to indoor temperature and light (quantity and colour)
Automatic Harmonization using Recurrent Neural Networks
Chord recognition on Beethoven string quartets
Machine Learning Privacy
Quality of Life Clustering of Swiss Cities from Insurance and Demographic Data
Machine learning for air quality measurement and modeling
The Case for Bagged Neural Networks: Evidence from Outlier Detection using Autoencoder Ensembles
Chord Prediction with The Annotated Beethoven Corpus
Predicting Forces on a Flapping Wing Model using Machine Learning
Brain Tissue Segmentation
Clustering and Predicting Swiss cities based on Insurance Data
Predicting the material properties of the arterial wall of a mouse
Predicting organic carbon with infrared spectra
3D Pointclouds Super-resolution for Digital humanities
Segmentation Of Silicon Wafers For Electron Microscopy Using Mask-RCNN
Classifying Nanopore Readings with Deep Learning
Wind Profile Prediction in an Urban Canyon: a Machine Learning Approach
A Stem Cell Classifier for Single Cell RNA Sequencing Data
Deep Convolutional Neural Networks for Cell Segmentation in Bright-Field Microscopy Images
Predicting Aerosol Particles: Sulfate, Nitrate and PM2.5
Healthy aging: age group prediction from chunking strategies during motor sequence learning
Multi-lingual text classifier for social media data
Implementation of an Improved Model for the Prediction of Effective Rate Constants in the Presence of Crowders
Architecture of Feelings
Evaluating the quality of videos through machine learning
Classifying segmentation defects in mutant zebrafish embryo
Fingerprinting DNS-over-HTTPS traffic
Statistics of Turkish researchers after the 2016 Coup d’Etat attempt
Analysis of the dismissal of Turkish researchers after the 2016 Coup d’Etat attempt
Evaluating the risk of relapse in melanoma
Links to the 3 official project competitions we offered on CrowdAI:
The following teams participated in the ICLR reproducibility challenge:

Improving Generalization and Stability of Generative Adversarial Networks
Meta-learning with differentiable closed-form solvers
A Resizable Mini-batch Gradient Descent based on a Multi-Armed Bandit
Detecting Adversarial Examples via Neural Fingerprinting
Learning Neural PDE Solvers with Convergence Guarantees
AutoLoss: Learning Discrete Schedules for Alternate Optimization
MAE : Mutual Posterior-Divergence Regularization for Variational Autoencoders
Hyper-Regularization: An Adaptive Choice for the Learning Rate in Gradient Descent