
Entropy-Regularized Optimal Transport for Machine Learning

Abstract: This thesis makes theoretical and numerical contributions to the use of Entropy-regularized Optimal Transport (EOT) in machine learning. We introduce Sinkhorn Divergences (SD), a class of discrepancies between probability measures based on EOT that interpolates between two well-known discrepancies: Optimal Transport (OT) and Maximum Mean Discrepancies (MMD). We develop an efficient numerical method that uses SD for density-fitting tasks, showing that a suitable choice of regularization can improve performance over existing methods. We derive a sample complexity theorem for SD which proves that choosing a large enough regularization parameter breaks the curse of dimensionality of OT and recovers asymptotic rates similar to those of MMD. We propose and analyze stochastic optimization solvers for EOT, yielding online methods that can handle arbitrary measures and are well suited to large-scale problems, unlike existing discrete batch solvers.
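To make the abstract's central object concrete, here is a minimal sketch of the Sinkhorn divergence between two discrete measures, computed via standard Sinkhorn iterations for the entropy-regularized OT cost. The function names, the fixed iteration count, and the squared-Euclidean ground cost are illustrative choices, not the thesis's implementation; the debiased form SD(a, b) = OT_eps(a, b) − ½ OT_eps(a, a) − ½ OT_eps(b, b) matches the interpolation described above (SD tends to OT as eps → 0 and to an MMD as eps → ∞).

```python
import numpy as np

def sinkhorn_cost(a, b, C, eps, n_iters=200):
    """Entropy-regularized OT cost OT_eps(a, b) between discrete
    measures with weights a, b and cost matrix C, via Sinkhorn
    fixed-point iterations on the scaling vectors (u, v)."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)             # alternating marginal projections
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]   # primal transport plan
    return np.sum(P * C)

def sinkhorn_divergence(x, y, a, b, eps):
    """Debiased Sinkhorn divergence between the empirical measures
    (x, a) and (y, b), with squared-Euclidean ground cost."""
    sqdist = lambda u, v: np.sum((u[:, None, :] - v[None, :, :]) ** 2, axis=-1)
    return (sinkhorn_cost(a, b, sqdist(x, y), eps)
            - 0.5 * sinkhorn_cost(a, a, sqdist(x, x), eps)
            - 0.5 * sinkhorn_cost(b, b, sqdist(y, y), eps))
```

By construction the divergence vanishes when the two measures coincide, while the raw entropic cost OT_eps(a, a) does not; this debiasing is what makes SD usable as a loss for density fitting.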

Cited literature: 89 references
Contributor: Aude Genevay
Submitted on: Thursday, October 17, 2019, 6:42:55 PM
Last modified on: Thursday, October 29, 2020, 3:01:17 PM




  • HAL Id: tel-02319318, version 1


Aude Genevay. Entropy-Regularized Optimal Transport for Machine Learning. Artificial Intelligence [cs.AI]. PSL University, 2019. English. ⟨tel-02319318⟩


