
Continual forgetting-free deep learning from high-dimensional data streams

Abstract: In this thesis, we propose a new deep-learning-based approach to online classification on streams of high-dimensional data. In recent years, Neural Networks (NN) have become the primary building block of state-of-the-art methods for a wide range of machine learning problems. Most of these methods, however, are designed for the static learning setting, in which all data are available at once at training time. Performing online deep learning is exceptionally challenging. The main difficulty is that NN-based classifiers usually rely on the assumption that the sequence of data batches used during training is stationary, or in other words, that the distribution of data classes is the same for all batches (the i.i.d. assumption). When this assumption does not hold, neural networks tend to forget the concepts that are temporarily unavailable in the stream. In the literature, this phenomenon is known as catastrophic forgetting. The approaches we propose in this thesis aim to guarantee the i.i.d. nature of each batch that comes from the stream and to compensate for the lack of historical data. To do this, we train generative and pseudo-generative models capable of producing synthetic samples from classes that are absent from or underrepresented in the stream, and we complete the stream's batches with these samples. We test our approaches in an incremental learning scenario and in a specific type of continual learning. Our approaches perform classification on dynamic data streams with accuracy close to the results obtained in the static classification setting, where all data are available for the whole duration of learning. In addition, we demonstrate the ability of our methods to adapt to unseen data classes and to new instances of already known data categories, while avoiding forgetting previously acquired knowledge.
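The batch-completion idea described above — padding each incoming stream batch with synthetic samples for missing or underrepresented classes so that training batches approximate the i.i.d. setting — can be sketched as follows. This is a minimal illustration, not the thesis's implementation: the `generate(cls, n)` callback is a hypothetical stand-in for a trained generative (or pseudo-generative) model, and the balancing policy (a fixed `per_class` target) is an assumption made for the example.

```python
import numpy as np

def complete_batch(real_x, real_y, all_classes, generate, per_class):
    """Pad a stream batch so that every class in `all_classes` is
    represented by `per_class` examples.

    real_x : (n, d) array of feature vectors from the stream batch
    real_y : (n,) array of integer class labels for real_x
    generate(cls, n) : stand-in for a trained generative model; must
        return an (n, d) array of synthetic samples for class `cls`
    """
    xs, ys = [real_x], [real_y]
    for cls in all_classes:
        # How many samples of this class are missing from the real batch?
        deficit = per_class - int(np.sum(real_y == cls))
        if deficit > 0:
            xs.append(generate(cls, deficit))
            ys.append(np.full(deficit, cls, dtype=real_y.dtype))
    x = np.concatenate(xs)
    y = np.concatenate(ys)
    # Shuffle real and synthetic samples together so the completed
    # batch looks like an i.i.d. draw over all known classes.
    order = np.random.permutation(len(y))
    return x[order], y[order]
```

For example, if the current stream batch contains only class 0, a toy per-class Gaussian generator can fill in the absent class 1 so the classifier keeps seeing both concepts.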

Cited literature: [72 references]
Contributor : Abes Star
Submitted on : Wednesday, February 19, 2020 - 3:51:08 PM
Last modification on : Monday, February 21, 2022 - 3:38:11 PM
Long-term archiving on: Wednesday, May 20, 2020 - 3:31:59 PM


Version validated by the jury (STAR)


  • HAL Id : tel-02484715, version 1


Andrey Besedin. Continual forgetting-free deep learning from high-dimensional data streams. Neural and Evolutionary Computing [cs.NE]. Conservatoire national des arts et métiers - CNAM, 2019. English. ⟨NNT : 2019CNAM1263⟩. ⟨tel-02484715⟩


