Nonsmooth Implicit Differentiation for Machine Learning and Optimization - Argumentation, Décision, Raisonnement, Incertitude et Apprentissage
Conference paper, 2021

Nonsmooth Implicit Differentiation for Machine Learning and Optimization

Abstract

In view of training increasingly complex learning architectures, we establish a nonsmooth implicit function theorem with an operational calculus. Our result applies to most practical problems (i.e., definable problems) provided that a nonsmooth form of the classical invertibility condition is fulfilled. This approach allows for formal subdifferentiation: for instance, replacing derivatives by Clarke Jacobians in the usual differentiation formulas is fully justified for a wide class of nonsmooth problems. Moreover, this calculus is entirely compatible with algorithmic differentiation (e.g., backpropagation). We provide several applications, such as training deep equilibrium networks, training neural nets with conic optimization layers, and hyperparameter tuning for nonsmooth Lasso-type models. To show the sharpness of our assumptions, we present numerical experiments showcasing the extremely pathological gradient dynamics one can encounter when applying implicit algorithmic differentiation without any hypothesis.
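The implicit differentiation the abstract builds on can be illustrated, in the smooth special case, on a toy fixed-point layer in the spirit of deep equilibrium networks. This is a minimal sketch, not the paper's code: the map tanh(W z + x), the contraction scaling, and all variable names are our own assumptions for illustration.

```python
import numpy as np

# Toy fixed-point layer: z* solves z = tanh(W z + x).
# Writing F(z, x) = z - tanh(W z + x) = 0, the (smooth) implicit
# function theorem gives, with D = diag(1 - tanh(W z* + x)^2),
#   dz*/dx = (I - D W)^{-1} D.
rng = np.random.default_rng(0)
n = 4
W = 0.1 * rng.standard_normal((n, n))  # small scale so the map contracts
x = rng.standard_normal(n)

# Solve the fixed point by plain iteration.
z = np.zeros(n)
for _ in range(200):
    z = np.tanh(W @ z + x)

# Jacobian via the implicit function theorem.
D = np.diag(1.0 - np.tanh(W @ z + x) ** 2)
J = np.linalg.solve(np.eye(n) - D @ W, D)

# Sanity check against finite differences.
eps = 1e-6
J_fd = np.zeros((n, n))
for j in range(n):
    xp = x.copy()
    xp[j] += eps
    zp = np.zeros(n)
    for _ in range(200):
        zp = np.tanh(W @ zp + xp)
    J_fd[:, j] = (zp - z) / eps

print(np.max(np.abs(J - J_fd)))  # agreement up to finite-difference error
```

The paper's contribution is precisely to justify this kind of formula beyond the smooth setting, with Clarke Jacobians replacing derivatives under a nonsmooth invertibility condition.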
Main file: implicitNonsmooth.pdf (1.58 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-03251332 , version 1 (07-06-2021)
hal-03251332 , version 2 (04-04-2022)

Identifiers

Cite

Jérôme Bolte, Tam Le, Edouard Pauwels, Antonio Silveti-Falls. Nonsmooth Implicit Differentiation for Machine Learning and Optimization. Advances in Neural Information Processing Systems, Dec 2021, Online, France. ⟨hal-03251332v1⟩
241 views
276 downloads
