Entropy-Regularized Optimal Transport for Machine Learning

Abstract: This thesis proposes theoretical and numerical contributions for using Entropy-Regularized Optimal Transport (EOT) in machine learning. We introduce Sinkhorn Divergences (SD), a class of discrepancies between probability measures based on EOT which interpolates between two other well-known discrepancies: Optimal Transport (OT) and Maximum Mean Discrepancies (MMD). We develop an efficient numerical method for using SD in density-fitting tasks, showing that a suitable choice of regularization can improve performance over existing methods. We derive a sample complexity theorem for SD which proves that choosing a large enough regularization parameter allows one to break the curse of dimensionality from OT and to recover asymptotic rates similar to those of MMD. We propose and analyze stochastic optimization solvers for EOT, which yield online methods that can cope with arbitrary measures and are well suited to large-scale problems, unlike existing batch solvers restricted to discrete measures.
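To make the central object concrete, here is a minimal NumPy sketch of the Sinkhorn Divergence between two discrete measures: run Sinkhorn iterations to solve the entropy-regularized OT problem, then debias it with the two self-transport terms, SD(a, b) = OT_eps(a, b) - (OT_eps(a, a) + OT_eps(b, b)) / 2. All function names, the squared-Euclidean cost, and the "sharp" cost convention (omitting the entropy term from the reported value) are illustrative assumptions, not the thesis's own code.

```python
import numpy as np

def sinkhorn_cost(a, b, C, eps, n_iters=500):
    # Sinkhorn fixed-point iterations for entropy-regularized OT (sketch).
    # a, b: probability weights; C: cost matrix; eps: regularization strength.
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(a)
    v = np.ones_like(b)
    for _ in range(n_iters):
        u = a / (K @ v)                   # scale rows to match marginal a
        v = b / (K.T @ u)                 # scale columns to match marginal b
    P = u[:, None] * K * v[None, :]       # regularized transport plan
    return np.sum(P * C)                  # "sharp" cost: <P, C> without entropy

def sinkhorn_divergence(x, y, a, b, eps):
    # Debiased Sinkhorn Divergence for point clouds x, y with weights a, b,
    # using the squared Euclidean ground cost (an illustrative choice).
    Cxy = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    Cxx = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    Cyy = ((y[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return (sinkhorn_cost(a, b, Cxy, eps)
            - 0.5 * sinkhorn_cost(a, a, Cxx, eps)
            - 0.5 * sinkhorn_cost(b, b, Cyy, eps))
```

The two limits of the interpolation show up directly in `eps`: as `eps -> 0` the plan concentrates on an optimal coupling and SD approaches OT, while as `eps -> infinity` the plan tends to the product measure `a ⊗ b` and SD converges to an MMD with a kernel induced by the cost.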

Cited literature: 89 references

https://tel.archives-ouvertes.fr/tel-02319318
Contributor: Aude Genevay
Submitted on: Thursday, October 17, 2019
Last modified on: Wednesday, February 19, 2020

File

these_aude.pdf (produced by the author)

Identifiers

  • HAL Id: tel-02319318, version 1

Citation

Aude Genevay. Entropy-Regularized Optimal Transport for Machine Learning. Artificial Intelligence [cs.AI]. PSL University, 2019. English. ⟨tel-02319318⟩
