Extensions of algorithmic differentiation by source transformation inspired by modern scientific computing

Ala Taftaf
Abstract: The adjoint mode of Algorithmic Differentiation (AD) is particularly attractive for computing gradients. However, this mode needs to use the intermediate values of the original simulation in reverse order, at a cost that increases with the length of the simulation. AD research looks for strategies to reduce this cost, for instance by taking advantage of the structure of the given program. In this work, we consider on the one hand the frequent case of Fixed-Point loops, for which several authors have proposed adapted adjoint strategies. Among these strategies, we select the one introduced by B. Christianson. We specify the selected method further and describe the way we implemented it inside the AD tool Tapenade. Experiments on a medium-size application show a major reduction of the memory needed to store trajectories. On the other hand, we study checkpointing in the case of MPI parallel programs with point-to-point communications. We propose techniques to apply checkpointing to these programs, provide proofs of correctness of our techniques, and evaluate them on representative CFD codes.
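The key idea behind Christianson's strategy, as summarized in the abstract, is that the adjoint of a fixed-point loop does not need the trajectory of primal iterates: only the converged state is required, and the adjoint itself is obtained by a second fixed-point iteration. The following is a minimal illustrative sketch of this two-phase scheme on a scalar fixed point (all function names and the Babylonian-square-root example are our own choices, not code generated by Tapenade):

```python
# Two-phase adjoint of a fixed-point loop (Christianson's scheme), sketched
# on x = g(x, p) = 0.5 * (x + p / x), whose fixed point is x* = sqrt(p).

def g(x, p):
    # One primal fixed-point iteration (Babylonian step toward sqrt(p))
    return 0.5 * (x + p / x)

def dg_dx(x, p):
    # Partial derivative of g with respect to the state x
    return 0.5 * (1.0 - p / x**2)

def dg_dp(x, p):
    # Partial derivative of g with respect to the parameter p
    return 0.5 / x

def fixed_point(p, x0=1.0, tol=1e-12, max_iter=100):
    # Phase 1: converge the primal loop. No trajectory is stored;
    # only the converged state x* is kept for the adjoint phase.
    x = x0
    for _ in range(max_iter):
        x_new = g(x, p)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

def adjoint_fixed_point(p, xb=1.0, tol=1e-12, max_iter=100):
    # Phase 2: iterate the adjoint fixed point w = xb + dg/dx(x*, p) * w,
    # evaluated at the converged primal state only.
    x = fixed_point(p)
    w = 0.0
    for _ in range(max_iter):
        w_new = xb + dg_dx(x, p) * w
        if abs(w_new - w) < tol:
            w = w_new
            break
        w = w_new
    # Accumulate the parameter adjoint pb = dg/dp(x*, p)^T * w
    pb = dg_dp(x, p) * w
    return x, pb

x_star, pb = adjoint_fixed_point(4.0)
# x_star is close to 2.0 (= sqrt(4)); pb is close to 0.25 (= d sqrt(p)/dp at p = 4)
```

Because only the converged state feeds the adjoint iteration, the memory cost is independent of the number of primal iterations, which is the source of the memory reduction the abstract reports.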
Document type: Theses

Cited literature: 51 references

https://tel.archives-ouvertes.fr/tel-01503507
Contributor: Abes Star
Submitted on : Friday, April 7, 2017 - 11:07:06 AM
Last modification on : Saturday, December 15, 2018 - 3:26:40 AM
Long-term archiving on : Saturday, July 8, 2017 - 2:56:56 PM

File

2017AZUR4001.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id : tel-01503507, version 1


Citation

Ala Taftaf. Extensions of algorithmic differentiation by source transformation inspired by modern scientific computing. General Mathematics [math.GM]. Université Côte d'Azur, 2017. English. ⟨NNT : 2017AZUR4001⟩. ⟨tel-01503507⟩


Metrics

Record views: 360
File downloads: 225