Centrale Lille Course Catalogue

Machine learning 3: Deep learning

Course label : Machine learning 3: Deep learning
Teaching department : EEA / Electrotechnics - Electronics - Control Systems
Teaching manager : Mr PIERRE-ANTOINE THOUVENIN / Mr PIERRE CHAINAIS
Language of instruction :
Potential ECTS : 0
Results grid :
Code and label (hp) : MR_DS_S2_ML3 - Machine learning 3: Deep learning

Education team

Teachers : Mr PIERRE-ANTOINE THOUVENIN / Mr PIERRE CHAINAIS
External contributors (business, research, secondary education): various temporary teachers

Summary

● Review of ML1, with some complements: introduction to formal neural networks, the perceptron, training a perceptron, the multilayer perceptron, and a full presentation of backpropagation of the error gradient, incl. tricks to make it work in practice (a minimal sketch follows this list)
● Elements of the formal analysis of neural networks (incl. MLPs and their approximation capabilities)
● Limitations of the classical MLP + backpropagation approach (vanishing or exploding gradients, etc.)
● The renewal of neural networks: deep learning and convolutional networks (convolution / pooling layers)
● Deep networks as representation learners (auto-encoders, restricted Boltzmann machines)
● Efficient deep network training (batch normalization, dropout, regularization, etc.)
● Recurrent neural networks and long short-term memory (LSTM) networks
● Generative adversarial networks
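As a purely illustrative sketch of the perceptron and backpropagation material listed above (not part of the official course content), the following NumPy example trains a small multilayer perceptron on the XOR problem with a manually derived backward pass; the architecture, learning rate, and data are assumptions chosen only for the example.

```python
# Minimal sketch: a two-layer sigmoid MLP trained on XOR with manual
# backpropagation. Hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# XOR data: 4 samples, 2 inputs, 1 binary target.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# One hidden layer of 8 sigmoid units, one sigmoid output unit.
W1 = rng.normal(scale=1.0, size=(2, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(scale=1.0, size=(8, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0  # assumed learning rate for this toy problem

for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)        # hidden activations
    p = sigmoid(h @ W2 + b2)        # predicted probabilities
    loss = np.mean((p - y) ** 2)    # squared-error loss

    # Backward pass: chain rule applied layer by layer.
    dp = 2.0 * (p - y) / len(X)     # dLoss/dp
    dz2 = dp * p * (1 - p)          # through the output sigmoid
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0, keepdims=True)
    dh = dz2 @ W2.T
    dz1 = dh * h * (1 - h)          # through the hidden sigmoid
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0, keepdims=True)

    # Gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print("final loss:", loss)
print("predictions:", p.ravel().round(2))
```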

Educational goals

After successfully taking this course, a student should:
● know and understand the main concepts related to neural networks
● know the main types of neural networks (feedforward, convolutional, recurrent) and of neurons
● know the main algorithms used to train a neural network in practice
● understand the design of a deep neural network
● know how to use a neural network in practice to solve a given supervised learning problem
● understand the limits of neural networks
● know the main theoretical properties of deep networks

Sustainable development goals

Knowledge control procedures

Continuous Assessment
Comments:
Labs, grading scale: 0 (min) – 20 (max)
Exam, grading scale: 0 (min) – 20 (max)

Online resources

Hastie, Tibshirani & Friedman, The Elements of Statistical Learning, 2nd ed., Springer, 2009.
Goodfellow, Bengio & Courville, Deep Learning, MIT Press, 2016.

Pedagogy

24 hours: 6 lectures and 6 exercise / lab sessions. The language of instruction is specified in the course offering information in the course and programme directory; English is the default language.

Sequencing / learning methods

Number of hours - Lectures : 12
Number of hours - Tutorial : 12
Number of hours - Practical work : 0
Number of hours - Seminar : 0
Number of hours - Half-group seminar : 0
Number of student hours in TEA (Autonomous learning) : 0
Number of student hours in TNE (Non-supervised activities) : 0
Number of hours in CB (Fixed exams) : 0
Number of student hours in PER (Personal work) : 0
Number of hours - Projects : 0

Prerequisites

ML1, ML2, Python & tools for research, and the basics of optimization.

Maximum number of registrants

Remarks