
Around the Langevin algorithm: extensions and applications

Abstract: This thesis focuses on the problem of sampling in high dimension and is based on the unadjusted Langevin algorithm (ULA).

In the first part, we suggest two extensions of ULA and provide precise convergence guarantees for these algorithms. ULA is not feasible when the target distribution is compactly supported; thanks to a Moreau-Yosida regularization, it is nevertheless possible to sample from a probability distribution close enough to the distribution of interest. ULA diverges when the tails of the target distribution are too thin; by appropriately taming the gradient, this difficulty can be overcome.

In the second part, we give two applications of ULA. We provide an algorithm to estimate normalizing constants of log-concave densities, based on a sequence of distributions with increasing variance. By comparing ULA with the Langevin diffusion, we develop a new control variates methodology based on the asymptotic variance of the Langevin diffusion.

In the third part, we analyze Stochastic Gradient Langevin Dynamics (SGLD), which differs from ULA only in its stochastic estimation of the gradient. We show that SGLD, applied with the usual parameters, may be very far from the target distribution. However, with an appropriate variance reduction technique, its computational cost can be much lower than that of ULA for the same accuracy.
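To fix ideas, the two samplers named in the abstract can be sketched in a few lines. This is a minimal illustration, not the thesis's implementation: ULA iterates X_{k+1} = X_k - h ∇U(X_k) + sqrt(2h) Z_k for a target density proportional to exp(-U), and the tamed variant rescales the drift so a single step stays bounded when the gradient grows super-linearly (thin tails). The step size, iteration count, and Gaussian example target below are illustrative choices.

```python
import numpy as np

def ula(grad_U, x0, step, n_iters, rng):
    """Unadjusted Langevin algorithm:
    X_{k+1} = X_k - step * grad_U(X_k) + sqrt(2*step) * Z_k."""
    x = np.array(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for k in range(n_iters):
        noise = rng.standard_normal(x.size)
        x = x - step * grad_U(x) + np.sqrt(2 * step) * noise
        samples[k] = x
    return samples

def tamed_ula(grad_U, x0, step, n_iters, rng):
    """Tamed variant: the drift is divided by (1 + step*|grad|),
    which bounds each move and prevents divergence when the
    gradient of U grows faster than linearly."""
    x = np.array(x0, dtype=float)
    samples = np.empty((n_iters, x.size))
    for k in range(n_iters):
        g = grad_U(x)
        g = g / (1.0 + step * np.linalg.norm(g))  # taming
        x = x - step * g + np.sqrt(2 * step) * rng.standard_normal(x.size)
        samples[k] = x
    return samples

# Example: standard Gaussian target, U(x) = |x|^2 / 2, so grad_U(x) = x.
rng = np.random.default_rng(0)
samples = ula(lambda x: x, x0=[3.0], step=0.05, n_iters=20000, rng=rng)
burned = samples[5000:]  # discard burn-in
print(round(float(burned.mean()), 2), round(float(burned.std()), 2))
```

Note that ULA's invariant distribution is only approximately the target (the bias shrinks with the step size), which is the starting point of the convergence guarantees mentioned above.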

Cited literature: 236 references
Submitted on : Tuesday, July 23, 2019 - 3:08:40 PM
Last modification on : Wednesday, November 24, 2021 - 12:06:02 PM


Version validated by the jury (STAR)


  • HAL Id : tel-02430579, version 2



Nicolas Brosse. Autour de l'algorithme du Langevin : extensions et applications. Machine Learning [stat.ML]. Université Paris Saclay (COmUE), 2019. Français. ⟨NNT : 2019SACLX014⟩. ⟨tel-02430579v2⟩


