
Learning multiscale invariants from big data for physics

Stéphane Mallat (ENS Paris) — November 19, 2015

Machine learning requires finding low-dimensional models that govern the properties of high-dimensional functionals, which could almost be called physics. Algorithms have improved considerably over the last 10 years through the processing of massive amounts of data. In particular, deep neural networks have spectacular applications to image classification and to medical, industrial and physical data analysis.

We show that the approximation capabilities of deep convolutional networks come from their ability to compute representations that are invariant and continuous over complex groups, including diffeomorphisms. High-dimensional structures are represented by multiscale interference terms, computed with wavelets on appropriate groups. This yields new representations of stochastic processes, which will be illustrated on Ising models and fluid turbulence in statistical physics. We also show applications to the regression of quantum molecular energies from chemical databases, compared with density functional theory approximations.
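As a rough illustration of the kind of representation described above, here is a minimal sketch of first- and second-order scattering-style coefficients for a 1D signal, assuming simple Gabor filters as stand-in wavelets. The filter design, scales, and function names are illustrative assumptions, not the implementation discussed in the talk.

```python
# Minimal sketch (illustrative, not a reference implementation) of
# scattering-style coefficients: cascade of wavelet modulus operators
# followed by averaging, which gives near translation invariance.
import numpy as np

def gabor_filter(n, scale):
    """Complex Gabor filter of length n, centred, with width ~ scale."""
    t = np.arange(n) - n // 2
    envelope = np.exp(-0.5 * (t / scale) ** 2)
    carrier = np.exp(1j * 2 * np.pi * t / scale)   # centre frequency ~ 1/scale
    g = envelope * carrier
    return g / np.abs(g).sum()

def lowpass(x, width):
    """Gaussian averaging: invariant to translations smaller than width."""
    t = np.arange(len(x)) - len(x) // 2
    phi = np.exp(-0.5 * (t / width) ** 2)
    phi /= phi.sum()
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(np.fft.ifftshift(phi))))

def scattering(x, scales=(4, 8, 16, 32), avg_width=64):
    """Return order-0, order-1 and order-2 scattering-style coefficients."""
    n = len(x)
    filters = {s: np.fft.fft(np.fft.ifftshift(gabor_filter(n, s))) for s in scales}
    X = np.fft.fft(x)
    s0 = lowpass(x, avg_width)                       # order 0: local average
    s1, s2 = {}, {}
    for j1, scale1 in enumerate(scales):
        u1 = np.abs(np.fft.ifft(X * filters[scale1]))  # wavelet modulus |x * psi_j1|
        s1[scale1] = lowpass(u1, avg_width)            # order 1 coefficients
        U1 = np.fft.fft(u1)
        for scale2 in scales[j1 + 1:]:                 # coarser second scale (j2 > j1)
            u2 = np.abs(np.fft.ifft(U1 * filters[scale2]))
            s2[(scale1, scale2)] = lowpass(u2, avg_width)
    return s0, s1, s2

# Usage: coefficients change little when the input is slightly shifted.
rng = np.random.default_rng(0)
x = rng.standard_normal(1024)
s0, s1, s2 = scattering(x)
_, s1_shift, _ = scattering(np.roll(x, 5))
print(np.max(np.abs(s1[8] - s1_shift[8])))           # small: near translation invariance
```

The averaging scale controls the trade-off between invariance and discriminability; the second-order terms recover part of the information lost by averaging, which is the role the interference terms play in the representations above.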

Biography:
Stéphane Mallat was a Professor at the NYU Courant Institute of Mathematical Sciences from 1988 to 1995, a Professor of applied mathematics at Ecole Polytechnique from 1995 to 2012, and is now a Professor in the Computer Science department at ENS. He is a member of the French Academy of Sciences. His main research interests are harmonic analysis, signal processing and high-dimensional learning.

You can also watch this video on the ENS multimedia site: savoirs.ens.fr