Centrale Lille Course Catalogue

Models for Machine Learning

Course label : Models for Machine Learning
Teaching department : EEA / Electrotechnics - Electronics - Control Systems
Teaching manager :
Education language :
Potential ECTS : 0
Results grid :
Code and label (hp) : MR_DS_S2_MML - Models for Machine Learning

Education team

Teachers :
External contributors (business, research, secondary education): various temporary teachers

Summary

Fundamentals of Bayesian models and inference, with hands-on sessions. The usual methods and notions in this scope are summarized below:
- baseline concepts (likelihood, prior, posterior, predictive distribution);
- conjugate priors, uninformative priors, exponential family;
- Bayesian estimators (ML, MAP, MMSE, type 2 ML, type 2 MAP);
- hierarchical models;
- graph representation with directed acyclic graphs (DAGs);
- inference with Markov chain Monte Carlo (MCMC) algorithms (Metropolis-Hastings, Gibbs sampler).
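As a small illustration of the notions listed above (conjugate priors and the ML, MAP and MMSE estimators), the sketch below works through the standard Beta-Bernoulli model; the data and prior hyperparameters are illustrative choices, not course material:

```python
# Illustrative sketch: conjugate Beta-Bernoulli model.
# With a Beta(a, b) prior on the success probability theta and k successes
# in n Bernoulli trials, the posterior is Beta(a + k, b + n - k), so the
# usual Bayesian point estimators have closed forms.

def beta_bernoulli_estimators(k, n, a=2.0, b=2.0):
    """Return (ML, MAP, MMSE) estimates of theta for k successes in n trials."""
    ml = k / n                              # maximum likelihood
    map_ = (a + k - 1) / (a + b + n - 2)    # posterior mode (MAP)
    mmse = (a + k) / (a + b + n)            # posterior mean (MMSE)
    return ml, map_, mmse

# Example: 7 successes in 10 trials under a Beta(2, 2) prior.
ml, map_, mmse = beta_bernoulli_estimators(k=7, n=10)
print(ml, map_, mmse)  # MAP and MMSE are shrunk toward the prior mean 0.5
```

Comparing the three values shows the effect of the prior: the ML estimate is 0.7, while the MAP and MMSE estimates are pulled toward 0.5, with the shrinkage vanishing as n grows.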

Educational goals

After successfully taking this course, a student should be able to:
- identify a relevant model in light of the available data;
- formalize the learning problem as an optimization problem;
- identify the nature of the distributions involved in the learning problem (posterior distribution, predictive distribution, marginals, ...);
- understand the connections between deterministic and probabilistic modelling;
- understand the implications of the chosen model on the results;
- implement a simple approach to solve a statistical problem (MCMC approach).
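The last goal, implementing a simple MCMC approach, can be sketched with a random-walk Metropolis-Hastings sampler. The target density, step size and chain length below are toy choices for illustration only:

```python
# Hedged sketch of a random-walk Metropolis-Hastings sampler (the MCMC
# method named in the goals above), targeting a standard normal density
# as a toy example.
import math
import random

def metropolis_hastings(log_target, x0, n_samples, step=1.0, seed=0):
    """Draw n_samples from log_target using a Gaussian random-walk proposal."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_samples):
        proposal = x + rng.gauss(0.0, step)
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed in log space for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        samples.append(x)
    return samples

# Toy target: standard normal, log density up to an additive constant.
chain = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=20000)
mean = sum(chain) / len(chain)
print(mean)  # close to the target mean 0
```

Note that only the log density up to a constant is needed, since the normalizing constant cancels in the acceptance ratio; this is what makes MCMC usable when the posterior is known only up to proportionality.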

Sustainable development goals

Knowledge control procedures

Continuous Assessment
Comments: Continuous evaluation.
Labs, grading scale: (min) 0 – 20 (max)
Exam, grading scale: (min) 0 – 20 (max)

Online resources

- Robert, C. P. (2007). The Bayesian Choice: From Decision-Theoretic Foundations to Computational Implementation (2nd ed.). New York: Springer.
- Bishop, C. M. (2006). Pattern Recognition and Machine Learning. New York: Springer.
- Murphy, K. P. (2012). Machine Learning: A Probabilistic Perspective. MIT Press.
- Robert, C. P., & Casella, G. (1999). Monte Carlo Statistical Methods. New York: Springer.
- Blei, D. M., Kucukelbir, A., & McAuliffe, J. D. (2017). Variational inference: A review for statisticians. Journal of the American Statistical Association, 112(518), 859–877.

Pedagogy

Labs and tutorial sessions. The language of instruction is specified in the course offering information in the course and programme directory; English is the default language.

Sequencing / learning methods

Number of hours - Lectures : 12
Number of hours - Tutorial : 12
Number of hours - Practical work : 0
Number of hours - Seminar : 0
Number of hours - Half-group seminar : 0
Number of student hours in TEA (Autonomous learning) : 0
Number of student hours in TNE (Non-supervised activities) : 0
Number of hours in CB (Fixed exams) : 0
Number of student hours in PER (Personal work) : 0
Number of hours - Projects : 0

Prerequisites

Python and tools for research. Machine learning 1 or equivalent. Probability 1 & 2. Statistics 1 & 2. Notions in optimization.

Maximum number of registrants

Remarks