Theses

Continual learning: tackling catastrophic forgetting in deep neural networks with replay methods

Abstract: Humans learn throughout their lives. They accumulate knowledge from a sequence of learning experiences and remember the essential concepts without forgetting what they learned previously. Artificial neural networks struggle to learn in this way. They often rely on rigorously preprocessed data to learn solutions to specific problems such as classification or regression. In particular, they forget their past learning experiences when trained on new ones. Artificial neural networks are therefore often inept at handling real-life settings, such as an autonomous robot that has to learn online to adapt to new situations and overcome new problems without forgetting its past learning experiences.

Continual learning (CL) is the branch of machine learning that addresses this type of problem. Continual algorithms are designed to accumulate and improve knowledge over a curriculum of learning experiences without forgetting. In this thesis, we propose to explore continual algorithms with replay processes. Replay processes gather together rehearsal methods and generative replay methods. Generative replay consists of regenerating past learning experiences with a generative model in order to remember them. Rehearsal consists of saving a core set of samples from past learning experiences to rehearse them later. Replay processes make possible a compromise between optimizing the current learning objective and past ones, enabling learning without forgetting in sequence-of-tasks settings.

We show that replay processes are very promising methods for continual learning. Notably, they enable the re-evaluation of past data with new knowledge and the confrontation of data from different learning experiences. We demonstrate their ability to learn continually through unsupervised, supervised, and reinforcement learning tasks.
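The rehearsal idea described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the thesis's actual implementation): a fixed-size memory of past samples, filled by reservoir sampling so that every sample seen so far has an equal chance of being retained, and a helper that mixes rehearsed samples into the current task's batch.

```python
import random

class RehearsalBuffer:
    """Fixed-size memory of past (input, label) pairs, filled by reservoir sampling."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.memory = []
        self.seen = 0  # total number of samples offered to the buffer

    def add(self, sample):
        self.seen += 1
        if len(self.memory) < self.capacity:
            self.memory.append(sample)
        else:
            # Reservoir sampling: each of the `seen` samples is kept
            # with probability capacity / seen.
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.memory[idx] = sample

    def sample(self, k):
        # Draw up to k stored samples to rehearse alongside the current batch.
        return random.sample(self.memory, min(k, len(self.memory)))


def mixed_batch(current_batch, buffer, rehearsal_size):
    """One training step's data: current-task samples plus rehearsed past samples."""
    return list(current_batch) + buffer.sample(rehearsal_size)
```

Training on such mixed batches is what realizes the compromise between the current learning objective and past ones: the gradient at each step reflects both the new task and the retained core set.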

Cited literature: 295 references

https://tel.archives-ouvertes.fr/tel-02906138
Contributor: ABES STAR
Submitted on: Friday, July 24, 2020 - 11:04:13 AM
Last modification on: Wednesday, May 11, 2022 - 3:20:03 PM
Long-term archiving on: Tuesday, December 1, 2020 - 6:32:53 AM

File

90535_LESORT_2020_archivage.pd...
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-02906138, version 1

Collections

Citation

Timothée Lesort. Apprentissage continu : S'attaquer à l'oubli foudroyant des réseaux de neurones profonds grâce aux méthodes à rejeu de données. Machine Learning [cs.LG]. Institut Polytechnique de Paris, 2020. English. ⟨NNT : 2020IPPAE003⟩. ⟨tel-02906138⟩

Metrics

Record views: 286
File downloads: 181