Centrale Lille Course Catalogue

Numerical analysis and optimization

Course label : Numerical analysis and optimization
Teaching department : EEA / Electrotechnics - Electronics - Control Systems
Teaching manager : Mr PIERRE-ANTOINE THOUVENIN / Mr PIERRE CHAINAIS
Education language :
Potential ects : 0
Results grid :
Code and label (hp) : MR_DS_S2_NAO - Numerical analysis and optim

Education team

Teachers : Mr PIERRE-ANTOINE THOUVENIN / Mr PIERRE CHAINAIS
External contributors (business, research, secondary education): various temporary teachers

Summary

- Convexity, Lipschitz continuity
- Unconstrained optimization problems
- Back to Empirical Risk Minimization, Machine Learning and regularization
- Convergence analysis
- Stochastic gradient descent

Like many engineering problems, machine learning problems can be expressed as continuous optimization problems: given a Lipschitz continuous objective function, we look for the parameters that minimize it. Here, the set of admissible parameters is a convex set, often a convex polyhedron. Most of the time, finding exact minimizers is not possible, so we look for parameters that provide a good approximation of the theoretical minimal value of the objective function. In practice, we use algorithms that iteratively improve these parameters. The lectures present the basic theory of convex optimization and the associated efficient algorithms, together with applications to machine learning. We start by recalling some facts about unconstrained convex optimization and the associated gradient descent algorithms (first-order methods and Newton-like higher-order methods). We then show how this applies to neural networks and present important variants such as stochastic gradient descent.
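The following minimal Python/NumPy sketch (illustrative only, not course material; all names, data and parameter values are assumptions) shows gradient descent and stochastic gradient descent on a ridge-regularized least-squares problem, a simple convex empirical risk minimization instance of the kind discussed above:

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 200, 5
    A = rng.normal(size=(n, d))                  # design matrix (features)
    x_true = rng.normal(size=d)
    b = A @ x_true + 0.1 * rng.normal(size=n)    # noisy observations
    lam = 0.1                                    # regularization weight

    def f(x):
        """Objective: (1/2n)||Ax - b||^2 + (lam/2)||x||^2 (convex, smooth)."""
        return 0.5 * np.mean((A @ x - b) ** 2) + 0.5 * lam * x @ x

    def grad(x):
        """Full gradient of f."""
        return A.T @ (A @ x - b) / n + lam * x

    # Gradient descent with constant step size 1/L, where L is the
    # smoothness constant (largest eigenvalue of the Hessian A^T A / n + lam I).
    L = np.linalg.eigvalsh(A.T @ A / n).max() + lam
    x = np.zeros(d)
    for k in range(500):
        x = x - (1.0 / L) * grad(x)

    # Stochastic gradient descent: one random sample per step, decaying step size.
    x_sgd = np.zeros(d)
    for k in range(5000):
        i = rng.integers(n)
        g_i = (A[i] @ x_sgd - b[i]) * A[i] + lam * x_sgd  # unbiased gradient estimate
        x_sgd = x_sgd - (0.1 / (1 + 0.01 * k)) * g_i

    print(f(x), f(x_sgd))  # both should approach the minimal value of f

On this toy problem both methods approach the same minimal value; the full-gradient method uses the smoothness constant L to set a safe constant step size, while SGD trades a cheap per-step cost for a decaying step size.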

Educational goals

After successfully taking this course, a student should be able to:
- identify convex and non-convex problems;
- compute convergence rates of some approximation methods for some optimization problems;
- formulate a machine learning problem as an optimization problem;
- know which methods may be used to solve such a problem, and how to use them in practice.
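As an illustration of the convergence-rate goal (standard textbook results, quoted here for context rather than as part of the official syllabus): for a convex objective f with L-Lipschitz gradient, gradient descent with constant step size 1/L satisfies f(x_k) - f* <= L ||x_0 - x*||^2 / (2k), an O(1/k) rate; if f is additionally mu-strongly convex, the rate improves to linear, f(x_k) - f* <= (1 - mu/L)^k (f(x_0) - f*).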

Sustainable development goals

Knowledge control procedures

Continuous Assessment
Comments: Average passing grade = 10/20
- Labs, grading scale: (min) 13.5 – 20 (max)
- Exam, grading scale: (min) 6.5 – 20 (max)

Online resources

Stephen Boyd and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004.
Robert J. Vanderbei. Linear Programming: Foundations and Extensions. Springer, 2014.

Pedagogy

24 hours: 8 lectures and 4 exercise sessions. The language of instruction is specified in the course offering information in the course and programme directory; English is the default language.

Sequencing / learning methods

Number of hours - Lectures : 12
Number of hours - Tutorial : 12
Number of hours - Practical work : 0
Number of hours - Seminar : 0
Number of hours - Half-group seminar : 0
Number of student hours in TEA (Autonomous learning) : 0
Number of student hours in TNE (Non-supervised activities) : 0
Number of hours in CB (Fixed exams) : 0
Number of student hours in PER (Personal work) : 0
Number of hours - Projects : 0

Prerequisites

Machine Learning 1, Machine Learning 2, Python & tools for research, or their equivalent.

Maximum number of registrants

Remarks