Course label : | Models for Machine Learning |
---|---|
Teaching department : | EEA / Electrotechnics - Electronics - Control Systems |
Teaching manager : | Mr PIERRE-ANTOINE THOUVENIN |
Education language : | |
Potential ECTS : | 0 |
Results grid : | |
Code and label (hp) : | MR_DS_S2_MML - Models for Machine Learning |
Education team
Teachers : Mr PIERRE-ANTOINE THOUVENIN
External contributors (business, research, secondary education): various temporary teachers
Summary
Fundamentals of Bayesian models and inference, with hands-on sessions. The main methods and notions covered are:
- baseline concepts (likelihood, prior, posterior, predictive distribution);
- conjugate priors, uninformative priors, the exponential family;
- Bayesian estimators (ML, MAP, MMSE, type-2 ML, type-2 MAP);
- hierarchical models;
- graph representation with directed acyclic graphs (DAGs);
- inference with Markov chain Monte Carlo (MCMC) algorithms (Metropolis-Hastings, Gibbs sampler, effective sample size (ESS)).
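As an illustration of the conjugate-prior and Bayesian-estimator notions listed above, here is a minimal sketch of a Beta-Bernoulli conjugate update with the resulting MMSE (posterior mean) and MAP (posterior mode) estimators. The function name and the choice of the Beta-Bernoulli pair are illustrative assumptions, not taken from the course material.

```python
import numpy as np

def beta_bernoulli_posterior(data, a=1.0, b=1.0):
    """Conjugate update: Beta(a, b) prior on theta, Bernoulli likelihood.

    Because the Beta prior is conjugate to the Bernoulli likelihood,
    the posterior is again a Beta: Beta(a + sum(x), b + n - sum(x)).
    """
    data = np.asarray(data)
    n, s = data.size, data.sum()
    a_post, b_post = a + s, b + n - s
    # Point estimates read off the posterior:
    mmse = a_post / (a_post + b_post)                # posterior mean
    map_est = (a_post - 1) / (a_post + b_post - 2)   # posterior mode (valid for a_post, b_post > 1)
    return a_post, b_post, mmse, map_est

# Example: 3 successes out of 4 trials, uniform Beta(1, 1) prior.
a_post, b_post, mmse, map_est = beta_bernoulli_posterior([1, 1, 0, 1])
```

With a uniform prior, the posterior here is Beta(4, 2), so the MMSE estimate is 4/6 while the MAP estimate coincides with the maximum-likelihood estimate 3/4.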
Educational goals
After successfully completing this course, a student should be able to:
- identify a relevant model in light of the available data;
- formalize the learning problem as an optimization problem;
- identify the nature of the distributions involved in the learning problem (posterior distribution, predictive distribution, marginals, ...);
- understand the connections between deterministic and probabilistic modelling;
- understand the implications of the chosen model on the results;
- implement a simple approach to solve a statistical problem (an MCMC algorithm).
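The last goal above (implementing a simple MCMC algorithm) can be sketched as a random-walk Metropolis-Hastings sampler. The target density, step size, and function names below are illustrative assumptions, not course material.

```python
import numpy as np

def random_walk_mh(log_target, x0, n_samples, step=1.0, rng=None):
    """Random-walk Metropolis-Hastings for a 1-D target density.

    log_target: log-density of the target, known up to an additive constant.
    Returns the chain of samples and the empirical acceptance rate.
    """
    rng = np.random.default_rng(rng)
    x = x0
    samples = np.empty(n_samples)
    accepted = 0
    for i in range(n_samples):
        prop = x + step * rng.standard_normal()
        # Symmetric Gaussian proposal, so the acceptance probability
        # reduces to min(1, p(prop) / p(x)), evaluated in log space.
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
            accepted += 1
        samples[i] = x
    return samples, accepted / n_samples

# Example: sample from a standard normal, whose log-density is -t^2/2
# up to a constant (the constant cancels in the acceptance ratio).
samples, acc_rate = random_walk_mh(lambda t: -0.5 * t**2, 0.0, 5000, step=2.0, rng=0)
```

In practice the chain would be assessed with the diagnostics covered in the course, e.g. the effective sample size (ESS), which discounts the autocorrelation of the samples.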
Sustainable development goals
Knowledge control procedures
Continuous Assessment
Comments: Continuous evaluation, based on:
- lab report(s), 50% of the overall grade, grading scale: (min) 0 – 20 (max)
- final exam, 50% of the overall grade, grading scale: (min) 0 – 20 (max)
2nd chance exam (session 2):
- grade on 20 points
- final grade for the course: 50% session 1 (continuous-assessment grade) + 50% session 2
Labs, grading scale: (min) 0 – 20 (max)
Exam, grading scale: (min) 0 – 20 (max)
Online resources
- Robert, C. P. (2007). The Bayesian Choice: From Decision-Theoretic Foundations to Computational Implementation (2nd ed.). New York: Springer.
- Bishop, C. M. (2006). Pattern Recognition and Machine Learning. New York: Springer.
- Murphy, K. P. (2012). Machine Learning: A Probabilistic Perspective. MIT Press.
- Jones, G. L., & Qin, Q. (2022). Markov Chain Monte Carlo in Practice. Annual Review of Statistics and Its Application, 9, 557-578.
- Robert, C. P., & Casella, G. (1999). Monte Carlo Statistical Methods. New York: Springer.
Pedagogy
Labs (6h) and tutorial sessions (2 x 2h). Final exam (2h). Language of instruction is specified in the course offering information in the course and programme directory. English is the default language.
Sequencing / learning methods
Number of hours - Lectures : | 12 |
---|---|
Number of hours - Tutorial : | 12 |
Number of hours - Practical work : | 0 |
Number of hours - Seminar : | 0 |
Number of hours - Half-group seminar : | 0 |
Number of student hours in TEA (Autonomous learning) : | 0 |
Number of student hours in TNE (Non-supervised activities) : | 0 |
Number of hours in CB (Fixed exams) : | 0 |
Number of student hours in PER (Personal work) : | 0 |
Number of hours - Projects : | 0 |
Prerequisites
Python and research tools. Machine Learning 1 or equivalent. Probability 1 & 2. Statistics 1 & 2. Basic notions of optimization.