
Machine Learning

Instructors: Francis Bach, Lenaic Chizat

Teaching assistant:

ECTS: 3

Language of instruction: English

Description:

Statistical machine learning is a growing discipline at the intersection of computer science and applied mathematics (probability/statistics, optimization, etc.) that plays an increasingly important role in many other scientific disciplines.
Unlike traditional statistics, statistical machine learning focuses on the analysis of high-dimensional data and on the efficiency of the algorithms needed to process the large amounts of data encountered in application areas such as image or sound analysis, natural language processing, bioinformatics or finance.
The objective of this class is to present the main theories and algorithms of statistical machine learning, with simple proofs of the most important results. The practical sessions will lead to simple implementations of the algorithms seen in class.
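To give an idea of the level of these implementations, here is a minimal sketch of the kind of exercise a practical session might involve (it is not the official course material; the synthetic data, polynomial feature map and regularization value are illustrative assumptions): fitting an L2-regularized least-squares predictor and estimating its risk on held-out data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1D regression data: y = sin(2*pi*x) + noise
n = 200
x = rng.uniform(0.0, 1.0, size=n)
y = np.sin(2 * np.pi * x) + 0.3 * rng.standard_normal(n)

# Polynomial feature map (capacity is controlled by the degree)
def features(x, degree=8):
    return np.vander(x, degree + 1, increasing=True)

# Train / test split to estimate the risk on unseen data
X_train, X_test = features(x[:150]), features(x[150:])
y_train, y_test = y[:150], y[150:]

# L2-regularized least-squares (ridge) estimator in closed form
lam = 1e-3
d = X_train.shape[1]
w = np.linalg.solve(X_train.T @ X_train + lam * np.eye(d), X_train.T @ y_train)

# Empirical risk (mean squared error) on train and test sets
train_mse = np.mean((X_train @ w - y_train) ** 2)
test_mse = np.mean((X_test @ w - y_test) ** 2)
print(f"train MSE = {train_mse:.3f}, test MSE = {test_mse:.3f}")
```

Increasing the polynomial degree while setting the regularization parameter to zero makes the gap between training and test error grow, which is the over-fitting phenomenon discussed in the first sessions.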

Evaluation: practical sessions to be completed at home + a written in-class exam

Prerequisites: (1) fluency in linear algebra and differential calculus, (2) one probability class, (3) basic notions of Python

Syllabus and sessions:

- 10 January: Introduction to supervised learning (loss, risk, over-fitting and capacity control + cross-validation, Bayes predictor for classification and regression)
- 17 January: Least-squares regression (all aspects, from linear algebra to statistical guarantees and L2 regularization + practical session)
- 24 January: Statistical ML without optimization (learning theory, from a finite number of hypotheses to Rademacher complexity / covering numbers)
- 31 January: Local averaging techniques (K-nearest neighbors, Nadaraya-Watson regression: algorithms + statistical analysis + practical session)
- 7 February: Empirical risk minimization (logistic regression, loss-based supervised learning, probabilistic interpretation through maximum likelihood)
- 14 February: Convex optimization (gradient descent + non-smooth and stochastic versions + practical session on logistic regression; see the sketch after this list)
- 21 February: holidays
- 28 February: Model selection (feature selection, L1 regularization and high-dimensional inference + practical session)
- 6 March: Kernels (positive-definite kernels and reproducing kernel Hilbert spaces + practical session)
- 13 March: Neural networks (from one hidden layer to deep networks + practical session)
- 20 March: Unsupervised learning (K-means and PCA (potentially with kernels) + mixture models (potentially EM) + practical session)
- 27 March: Review
- 3 April: Exam
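As an illustration of the 7 and 14 February sessions, the following is a minimal sketch of gradient descent on the regularized logistic-regression empirical risk (the synthetic Gaussian data, constant step size and L2 regularization term are assumptions of this sketch, not the official practical-session code):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two Gaussian classes in 2D, labels in {-1, +1}
n = 400
X = np.vstack([rng.normal(-1.0, 1.0, (n // 2, 2)),
               rng.normal(+1.0, 1.0, (n // 2, 2))])
y = np.concatenate([-np.ones(n // 2), np.ones(n // 2)])

def risk_and_grad(w, X, y, lam=0.1):
    """Regularized empirical logistic risk and its gradient."""
    margins = y * (X @ w)
    risk = np.mean(np.logaddexp(0.0, -margins)) + 0.5 * lam * w @ w
    grad = -X.T @ (y / (1.0 + np.exp(margins))) / len(y) + lam * w
    return risk, grad

# Plain gradient descent with a constant step size
w = np.zeros(2)
step = 0.5
for t in range(200):
    risk, grad = risk_and_grad(w, X, y)
    w -= step * grad

accuracy = np.mean(np.sign(X @ w) == y)
print(f"final risk = {risk:.3f}, training accuracy = {accuracy:.2%}")
```

Replacing the full-gradient step with a step computed on a single random sample turns this into the stochastic version of gradient descent covered in the same session.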
