
Optimization of deep multi-task networks

Abstract: Multi-task learning (MTL) is a learning paradigm in which parameters are jointly optimized with respect to multiple tasks. By learning multiple related tasks, a learner receives more complete and complementary information about the input domain from which the tasks are drawn, allowing it to build a more accurate set of assumptions about the domain and thus a better understanding of it. In practice, however, the broader use of MTL is hindered by the inconsistent performance gains observed in deep multi-task networks, which often suffer from performance degradation caused by task interference. This thesis addresses the problem of task interference in multi-task learning in order to improve the generalization capabilities of deep neural networks.
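The setting described in the abstract can be illustrated with a minimal sketch: a shared trunk is jointly optimized against several task losses, and task interference shows up as conflicting per-task gradients on the shared parameters. The tiny linear model and all names below are illustrative assumptions, not the method proposed in the thesis.

```python
import numpy as np

# Minimal sketch of hard parameter sharing in multi-task learning (MTL):
# one shared trunk, per-task heads, and a joint objective that sums the
# per-task losses. All shapes and names are illustrative assumptions.

rng = np.random.default_rng(0)

x = rng.normal(size=(8, 4))                  # batch of 8 inputs, 4 features
W_shared = rng.normal(size=(4, 3))           # shared trunk parameters
heads = {t: rng.normal(size=(3, 1)) for t in ("task_a", "task_b")}
targets = {t: rng.normal(size=(8, 1)) for t in heads}

def task_loss_and_grad(W_s, W_head, y):
    """MSE loss of one task head and its gradient w.r.t. the shared weights."""
    h = x @ W_s                              # shared representation
    err = h @ W_head - y
    loss = float(np.mean(err ** 2))
    grad_shared = x.T @ (err @ W_head.T) * (2.0 / len(x))
    return loss, grad_shared

losses, grads = {}, {}
for t in heads:
    losses[t], grads[t] = task_loss_and_grad(W_shared, heads[t], targets[t])

joint_loss = sum(losses.values())            # the jointly optimized objective

# Gradient conflict: a negative cosine similarity between the two tasks'
# gradients on the shared weights is one common symptom of interference.
g_a, g_b = grads["task_a"].ravel(), grads["task_b"].ravel()
cosine = float(g_a @ g_b / (np.linalg.norm(g_a) * np.linalg.norm(g_b)))
print(f"joint loss = {joint_loss:.3f}, gradient cosine = {cosine:.3f}")
```

A gradient step on `W_shared` along the summed gradient improves both tasks when the cosine is positive; when it is negative, progress on one task degrades the other, which is the interference problem the thesis targets.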

https://tel.archives-ouvertes.fr/tel-03783509
Contributor: ABES STAR
Submitted on: Thursday, September 22, 2022 - 11:23:20 AM
Last modified on: Monday, September 26, 2022 - 4:29:37 PM

File

PASCAL_Lucas_2021_v2.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-03783509, version 1

Citation

Lucas Pascal. Optimization of deep multi-task networks. Neural and Evolutionary Computing [cs.NE]. Sorbonne Université, 2021. English. ⟨NNT : 2021SORUS535⟩. ⟨tel-03783509⟩


Metrics

Record views: 11
File downloads: 1