
Neural Architecture Search under Budget Constraints

Abstract : The recent increase in computing power and the ever-growing amount of available data have fueled the rise in popularity of deep learning. However, the expertise, data, and computing power required to build such algorithms, as well as the memory footprint and inference latency of the resulting systems, are all obstacles preventing the widespread use of these methods. In this thesis, we propose several methods that take a step toward a more efficient and automated procedure for building deep learning models. First, we focus on learning an efficient architecture for image processing problems. We propose a new model in which the architecture learning procedure can be guided by specifying a fixed budget and a cost function. Then, we consider the problem of sequence classification, where a model can be made even more efficient by dynamically adapting its size to the complexity of the incoming signal. We show that both approaches result in significant budget savings. Finally, we tackle the efficiency problem through the lens of transfer learning, arguing that a learning procedure can be made even more efficient if, instead of starting tabula rasa, it builds on knowledge acquired during previous experiences. We explore modular architectures in the continual learning scenario and present a new benchmark allowing a fine-grained evaluation of different kinds of transfer.
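The idea of guiding architecture search with a fixed budget and a cost function can be illustrated with a toy penalized objective. This is a minimal sketch, not the thesis's actual formulation: the names `task_loss`, `arch_cost`, `budget`, and the linear penalty are all illustrative assumptions.

```python
def budgeted_objective(task_loss, arch_cost, budget, penalty=10.0):
    """Toy penalized objective: the task loss plus a penalty term that
    activates only when the architecture's cost (e.g. FLOPs, latency,
    or memory) exceeds the specified budget.

    Illustrative sketch; not the formulation used in the thesis.
    """
    overshoot = max(0.0, arch_cost - budget)  # zero while within budget
    return task_loss + penalty * overshoot

# An architecture within budget is scored by its task loss alone...
print(budgeted_objective(0.5, arch_cost=80.0, budget=100.0))   # 0.5
# ...while an over-budget one pays a cost proportional to the overshoot.
print(budgeted_objective(0.5, arch_cost=120.0, budget=100.0))  # 200.5
```

Under this kind of objective, the search trades accuracy against cost only when the budget is violated, which is one simple way a practitioner-specified budget can shape the resulting architecture.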
Submitted on : Tuesday, July 19, 2022 - 2:13:11 PM
Last modification on : Tuesday, August 2, 2022 - 4:04:00 AM


Version validated by the jury (STAR)


  • HAL Id : tel-03727609, version 1


Tom Veniat. Neural Architecture Search under Budget Constraints. Machine Learning [cs.LG]. Sorbonne Université, 2021. English. ⟨NNT : 2021SORUS443⟩. ⟨tel-03727609⟩


