Conference paper, Year: 2022

Automatic differentiation of nonsmooth iterative algorithms

Abstract

Differentiation along algorithms, i.e., piggyback propagation of derivatives, is now routinely used to differentiate iterative solvers in differentiable programming. The asymptotics are well understood for many smooth problems, but the nondifferentiable case has hardly been considered. Is there a limiting object for nonsmooth piggyback automatic differentiation (AD)? Does it have any variational meaning, and can it be used effectively in machine learning? Is there a connection with classical derivatives? All of these questions are addressed under appropriate nonexpansivity conditions, within the framework of conservative derivatives, which has proved useful in understanding nonsmooth AD. We characterize the attractor set of nonsmooth piggyback iterations as a set-valued fixed point which remains in the conservative framework. This has various consequences, in particular the almost-everywhere convergence of classical derivatives. Our results are illustrated on parametric convex optimization problems with the forward-backward, Douglas-Rachford, and Alternating Direction Method of Multipliers (ADMM) algorithms, as well as the Heavy-Ball method.
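To make the piggyback mechanism concrete, here is a minimal sketch (not the authors' code) in JAX, assuming the lasso problem min_x 0.5*||Ax - b||^2 + lam*||x||_1 solved by forward-backward splitting. Reverse-mode AD through the unrolled loop carries the derivative recursion J_{k+1} = ∂_x F(x_k, lam) J_k + ∂_lam F(x_k, lam) alongside the iterate x_{k+1} = F(x_k, lam); at the kinks of the soft-thresholding prox, the AD system selects one element of a conservative derivative, which is the setting analyzed in the paper. The names solve and soft_threshold are illustrative.

import jax
import jax.numpy as jnp

def soft_threshold(x, tau):
    # Prox of tau * ||.||_1; nonsmooth at |x_i| = tau, where AD picks
    # one element of a set-valued (conservative) derivative.
    return jnp.sign(x) * jnp.maximum(jnp.abs(x) - tau, 0.0)

def solve(lam, A, b, n_iter=500):
    # Forward-backward splitting for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    # A step size of 1/||A||_2^2 keeps the iteration map nonexpansive.
    step = 1.0 / jnp.linalg.norm(A, ord=2) ** 2
    def body(x, _):
        x = soft_threshold(x - step * A.T @ (A @ x - b), step * lam)
        return x, None
    x, _ = jax.lax.scan(body, jnp.zeros(A.shape[1]), None, length=n_iter)
    return x

A = jax.random.normal(jax.random.PRNGKey(0), (20, 10))
b = jax.random.normal(jax.random.PRNGKey(1), (20,))

# Piggyback derivative: Jacobian of the n_iter-th iterate with respect to
# the regularization parameter lam, obtained by AD through the loop.
dx_dlam = jax.jacobian(solve)(0.1, A, b)
print(dx_dlam.shape)  # (10,)

In the paper's terms, as the iterates converge these piggyback Jacobians accumulate to a set-valued fixed point, and the classical derivative is recovered for almost every lam; truncating after a fixed number of iterations, as here, is the usual practical compromise.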
Main file: nonsmooth_iterative_autodiff.pdf (894.49 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03681143, version 1 (30-05-2022)

Identifiers

  • HAL Id: hal-03681143, version 1

Cite

Jérôme Bolte, Edouard Pauwels, Samuel Vaiter. Automatic differentiation of nonsmooth iterative algorithms. Advances in Neural Information Processing Systems, Nov 2022, New Orleans, United States. 28 p. ⟨hal-03681143⟩