Syllabus of the Centrale Lille programmes

Numerical analysis and optimization

Course title: Numerical analysis and optimization
Teaching department: EEA / Electronique Electrotechnique Automatique
Course coordinators: Mr PIERRE-ANTOINE THOUVENIN / Mr PIERRE CHAINAIS
Language of instruction:
Potential ECTS credits: 0
Grading grid:
Code and title (hp): MR_DS_S2_NAO - Numerical analysis and optim

Teaching team

Instructors: Mr PIERRE-ANTOINE THOUVENIN / Mr PIERRE CHAINAIS
External contributors (industry, research, secondary education): various adjunct lecturers

Summary

- Convexity, Lipschitz continuity
- Unconstrained optimization problems
- Back to empirical risk minimization, machine learning and regularization
- Convergence analysis
- Stochastic descent method

Like many engineering problems, machine learning problems can be expressed as continuous optimization problems: given a Lipschitz-continuous objective function, we look for the parameters that minimize it. Here, the set of admissible parameters is a convex set, often a convex polyhedron. Most of the time, finding exact minimizers is not possible, so we look for parameters that provide a good approximation of the theoretical minimal value of the objective function. In practice, we use algorithms that iteratively improve these parameters. The lectures present the basic theory of convex optimization and the associated efficient algorithms, together with applications to machine learning. We start by recalling some facts about unconstrained convex optimization and the associated gradient descent algorithms (first-order methods and Newton-like higher-order methods). We then show how this applies to neural networks and present important variants such as the stochastic descent method.
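The gradient descent iteration described above can be sketched in a few lines; this is a minimal illustration on a convex quadratic, not course material, and the matrix, step size, and iteration count are illustrative choices:

```python
import numpy as np

def gradient_descent(grad, x0, step, n_iter):
    """Iteratively improve x by moving against the gradient."""
    x = x0
    for _ in range(n_iter):
        x = x - step * grad(x)
    return x

# Convex quadratic f(x) = 0.5 x^T A x - b^T x with A symmetric positive definite.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b  # gradient of the quadratic

# The gradient is Lipschitz with constant L = largest eigenvalue of A;
# the classical step size 1/L guarantees convergence for this smooth convex problem.
L = np.linalg.eigvalsh(A).max()

x = gradient_descent(grad, np.zeros(2), 1.0 / L, 500)
x_star = np.linalg.solve(A, b)  # exact minimizer, for comparison
print(np.allclose(x, x_star, atol=1e-6))
```

On this strongly convex problem the iterates converge linearly, so 500 steps already reach the exact minimizer to high precision.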

Learning objectives

After successfully taking this course, a student should be able to:
- identify convex and non-convex problems;
- compute convergence rates of some approximation methods for some optimization problems;
- formulate a machine learning problem as an optimization problem;
- know which methods may be used to solve such a problem, and how to use them in practice.
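As a toy illustration of formulating a machine learning problem as an optimization problem and solving it with the stochastic descent method, here is a hedged sketch (not the course's lab code) of stochastic gradient descent on a least-squares empirical risk, min_w (1/n) Σ_i (x_iᵀw − y_i)²; the data, seed, and step-size schedule are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true  # noiseless data, so the empirical risk minimizer is w_true

w = np.zeros(d)
for t in range(1, 20001):
    i = rng.integers(n)                  # sample one training example
    g = 2.0 * (X[i] @ w - y[i]) * X[i]   # stochastic gradient of the i-th loss term
    w -= (0.1 / np.sqrt(t)) * g          # decreasing step size
print(w)
```

Each iteration uses the gradient of a single loss term instead of the full empirical risk, which is what makes the method cheap per step; the decreasing step size is a standard choice in its convergence analysis.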

Sustainable development goals

Assessment methods

Continuous assessment
Comments:
- Average passing grade: 10/20
- Labs, grading scale: 13.5 (min) – 20 (max)
- Exam, grading scale: 6.5 (min) – 20 (max)

Online resources

- Stephen Boyd and Lieven Vandenberghe, Convex Optimization, Cambridge University Press, 2004.
- Robert J. Vanderbei, Linear Programming: Foundations and Extensions, Springer, 2014.

Pedagogy

24 hours: 8 lectures and 4 exercise sessions. The language of instruction is specified in the course offering information in the course and programme directory; English is the default language.

Sequencing / learning methods

Hours of lectures (CM, cours magistraux): 12
Hours of tutorials (TD, travaux dirigés): 12
Hours of practical sessions (TP, travaux pratiques): 0
Hours of seminars: 0
Hours of half-seminars: 0
Student hours of supervised autonomous work (TEA, travail en autonomie): 0
Student hours of unsupervised work (TNE, travail non encadré): 0
Hours of scheduled assessment (CB, contrôle bloqué): 0
Student hours of personal work (PER): 0
Project hours: 0

Prerequisites

Machine learning 1, Machine Learning 2, Python & tools for research, or their equivalent.

Maximum number of enrolled students

Remarks