Centrale Lille Course Catalogue

Models for Machine Learning

Course label : Models for Machine Learning
Teaching department : EEA / Electrotechnics - Electronics - Control Systems
Teaching manager : Mr PIERRE-ANTOINE THOUVENIN
Education language : English
Potential ECTS : 0
Results grid :
Code and label (hp) : MR_DS_S2_MML - Models for Machine Learning

Education team

Teachers : Mr PIERRE-ANTOINE THOUVENIN
External contributors (business, research, secondary education): various temporary teachers

Summary

Fundamentals of Bayesian models and inference, with hands-on sessions. The usual methods and notions in this scope are summarized below:
- baseline concepts (likelihood, prior, posterior, predictive distribution);
- conjugate priors, uninformative priors, exponential family;
- Bayesian estimators (ML, MAP, MMSE, type 2 ML, type 2 MAP, ...);
- hierarchical models;
- graph representation with directed acyclic graphs (DAGs);
- Monte Carlo methods; exact inference with Markov chain Monte Carlo (MCMC) algorithms (Metropolis-Hastings, Gibbs sampler, effective sample size (ESS)); see the sketch after this list.
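For illustration, a minimal Python sketch of a random-walk Metropolis-Hastings sampler with a crude effective sample size estimate; the target density (a standard Gaussian), the proposal scale, and the chain length are illustrative assumptions, not course material:

```python
import numpy as np

def log_target(x):
    """Log-density of the target, up to a constant (standard Gaussian here)."""
    return -0.5 * x**2

def metropolis_hastings(n_samples, x0=0.0, step=1.0, rng=None):
    """Random-walk Metropolis-Hastings for a 1-D target."""
    rng = np.random.default_rng(0) if rng is None else rng
    chain = np.empty(n_samples)
    x, logp = x0, log_target(x0)
    for t in range(n_samples):
        prop = x + step * rng.normal()                 # symmetric Gaussian proposal
        logp_prop = log_target(prop)
        if np.log(rng.uniform()) < logp_prop - logp:   # accept w.p. min(1, ratio)
            x, logp = prop, logp_prop
        chain[t] = x
    return chain

def ess(chain, max_lag=200):
    """Naive effective sample size: n / (1 + 2 * sum of positive autocorrelations)."""
    x = chain - chain.mean()
    var = x @ x / len(x)
    rho_sum = 0.0
    for lag in range(1, max_lag):
        rho = (x[:-lag] @ x[lag:]) / (len(x) * var)
        if rho <= 0:                                   # truncate at first non-positive lag
            break
        rho_sum += rho
    return len(chain) / (1.0 + 2.0 * rho_sum)

chain = metropolis_hastings(5000, step=2.4)
print(f"sample mean ~ {chain.mean():.3f}, ESS ~ {ess(chain):.0f}")
```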

Educational goals

After successfully taking this course, a student should be able to:
- identify a relevant model in light of the available data;
- formalize the learning problem as an optimization problem;
- identify the nature of the distributions involved in the learning problem (posterior distribution, predictive distribution, marginals, full conditionals, ...); a toy conjugate-model example is sketched after this list;
- understand the connections between deterministic and probabilistic modeling;
- understand the implications of the chosen model on the results;
- implement a simple approach to solve a statistical problem (MCMC algorithm).
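As a toy illustration of the distributions and estimators named above, a short sketch on the conjugate Beta-Bernoulli model; the data and hyperparameters are made up for the example:

```python
import numpy as np

data = np.array([1, 0, 1, 1, 0, 1, 1, 1])   # hypothetical binary observations
n, s = len(data), data.sum()
a, b = 2.0, 2.0                              # Beta(a, b) prior hyperparameters

# Maximum likelihood: argmax of the likelihood alone.
theta_ml = s / n

# By conjugacy, the posterior is Beta(a + s, b + n - s).
a_post, b_post = a + s, b + n - s

# MAP estimator: posterior mode; MMSE estimator: posterior mean.
theta_map = (a_post - 1) / (a_post + b_post - 2)
theta_mmse = a_post / (a_post + b_post)

print(f"ML = {theta_ml:.3f}, MAP = {theta_map:.3f}, MMSE = {theta_mmse:.3f}")
```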

Sustainable development goals

Knowledge control procedures

Continuous Assessment
Comments: Continuous evaluation, based on:
- lab report(s), 50% of the overall grade, grading scale: (min) 0 – 20 (max);
- final exam, 50% of the overall grade, grading scale: (min) 0 – 20 (max).
2nd chance exam (session 2):
- grade on 20 points;
- final grade for the course: 50% session 1 (grade at the continuous assessment), 50% session 2.
Labs, grading scale: (min) 0 – 20 (max)
Exam, grading scale: (min) 0 – 20 (max)

Online resources

- Robert, C. P. (2007). The Bayesian Choice: From Decision-Theoretic Foundations to Computational Implementation (2nd ed.). New York: Springer.
- Bishop, C. M. (2006). Pattern Recognition and Machine Learning. New York: Springer.
- Murphy, K. P. (2012). Machine Learning: A Probabilistic Perspective. MIT Press.
- Jones, G. L., & Qin, Q. (2022). Markov Chain Monte Carlo in Practice. Annual Review of Statistics and Its Application, 9, 557-578.
- Robert, C. P., & Casella, G. (1999). Monte Carlo Statistical Methods. New York: Springer.

Pedagogy

Labs (4h) and tutorial sessions (3 x 2h). Final exam (2h). Language of instruction: English.

Sequencing / learning methods

Number of hours - Lectures : 12
Number of hours - Tutorial : 12
Number of hours - Practical work : 0
Number of hours - Seminar : 0
Number of hours - Half-group seminar : 0
Number of student hours in TEA (Autonomous learning) : 0
Number of student hours in TNE (Non-supervised activities) : 0
Number of hours in CB (Fixed exams) : 0
Number of student hours in PER (Personal work) : 0
Number of hours - Projects : 0

Prerequisites

Python and tools for research. Machine learning 1 or the equivalent. Probability 1 & 2. Statistics 1 & 2. Notions in optimization.

Maximum number of registrants

Remarks

Evaluations:
- 1 written exam (20 min) (exam1)
- 1 written exam (2h) (exam2)
- 1 lab report (2h) (lab)
Grade session 1: `Mark1 = 0.1 * exam1 + 0.45 * exam2 + 0.45 * lab`
2nd chance exam: 1 exam (2h) (exam3), taken if `Mark1 < 10/20`.
Overall grade after the 2nd chance exam: `Mark2 = 0.7 * Mark1 + 0.3 * exam3` (a worked example follows).
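A hypothetical worked example of these formulas (all marks are invented, out of 20):

```python
# Session 1 grade from the three hypothetical evaluation marks.
exam1, exam2, lab = 12.0, 8.0, 10.0
mark1 = 0.1 * exam1 + 0.45 * exam2 + 0.45 * lab   # 9.3 < 10, so the resit applies

# 2nd chance exam (hypothetical mark) and overall grade after the resit.
exam3 = 13.0
mark2 = 0.7 * mark1 + 0.3 * exam3                 # 0.7 * 9.3 + 0.3 * 13.0 = 10.41

print(mark1, mark2)   # 9.3 10.41
```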