C. C. Aggarwal and P. S. Yu, Outlier detection for high dimensional data, ACM SIGMOD Record, vol.30, pp.37-46, 2001.

F. Alqallaf and C. Agostinelli, Robust inference in generalized linear models, Communications in Statistics - Simulation and Computation, vol.45, issue.9, pp.3053-3073, 2016.

R. Andersen, Modern Methods for Robust Regression. Quantitative Applications in the Social Sciences, 2007.

F. Bach, R. Jenatton, J. Mairal, and G. Obozinski, Optimization with sparsity-inducing penalties. Foundations and Trends® in Machine Learning, vol.4, pp.1-106, 2012.
URL : https://hal.archives-ouvertes.fr/hal-00613125

F. Bach, R. Jenatton, J. Mairal, and G. Obozinski, Structured sparsity through convex optimization, Statistical Science, vol.27, issue.4, pp.450-468, 2012.
URL : https://hal.archives-ouvertes.fr/hal-00621245

E. Bacry, M. Bompaire, S. Gaïffas, and S. Poulsen, Tick: a Python library for statistical learning, with an emphasis on Hawkes processes and time-dependent models, The Journal of Machine Learning Research, vol.18, issue.1, pp.7937-7941, 2017.

B. Bah and J. Tanner, Improved bounds on restricted isometry constants for Gaussian matrices, SIAM J. Matrix Analysis Applications, vol.31, issue.5, pp.2882-2898, 2010.

Y. Benjamini and Y. Hochberg, Controlling the false discovery rate: A practical and powerful approach to multiple testing, Journal of the Royal Statistical Society. Series B (Methodological), vol.57, issue.1, pp.289-300, 1995.

Y. Benjamini and D. Yekutieli, The control of the false discovery rate in multiple testing under dependency, The Annals of Statistics, vol.29, issue.4, pp.1165-1188, 2001.

M. Bogdan, E. van den Berg, C. Sabatti, W. Su, and E. J. Candès, Slope - adaptive variable selection via convex optimization, Annals of Applied Statistics, vol.9, issue.3, pp.1103-1140, 2015.

M. Bogdan, E. van den Berg, W. Su, and E. J. Candès, Statistical estimation and testing via the ordered ℓ1 norm, 2013.

P. Bühlmann and S. van de Geer, Statistics for high-dimensional data: methods, theory and applications, 2011.

P. C. Bellec, G. Lecué, and A. B. Tsybakov, Slope meets lasso: improved oracle bounds and optimality, 2016.

E. J. Candès, J. Romberg, and T. Tao, Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information, IEEE Transactions on Information Theory, vol.52, issue.2, pp.489-509, 2006.

E. J. Candès, J. K. Romberg, and T. Tao, Stable signal recovery from incomplete and inaccurate measurements, Communications on pure and applied mathematics, vol.59, issue.8, pp.1207-1223, 2006.

E. Cantoni and E. Ronchetti, Robust inference for generalized linear models, Journal of the American Statistical Association, vol.96, pp.1022-1030, 2001.

V. Chandrasekaran, B. Recht, P. A. Parrilo, and A. S. Willsky, The convex geometry of linear inverse problems, Foundations of Computational mathematics, vol.12, issue.6, pp.805-849, 2012.

S. S. Chen, D. L. Donoho, and M. A. Saunders, Atomic decomposition by basis pursuit, SIAM review, vol.43, issue.1, pp.129-159, 2001.

J. Chiang, The masking and swamping effects using the planted mean-shift outliers models, Int. J. Contemp. Math. Sciences, vol.2, issue.7, pp.297-307, 2007.

S. Chrétien and S. Darses, Sparse recovery with unknown variance: A lasso-type approach, IEEE Transactions on Information Theory, vol.60, issue.7, pp.3970-3988, 2014.

R. Cook and S. Weisberg, Residuals and influence in regression, 1982.

W. J. Dixon, Analysis of extreme values, The Annals of Mathematical Statistics, vol.21, issue.4, pp.488-506, 1950.

A. Duval, S. Rolland, A. Compoint, E. Tubacher, B. Iacopetta et al., Evolution of instability at coding and non-coding repeat sequences in human msi-h colorectal cancers, Hum Mol Genet, vol.10, issue.5, pp.513-518, 2001.

J. Fan and R. Li, Variable selection via nonconcave penalized likelihood and its oracle properties, Journal of the American Statistical Association, vol.96, issue.456, pp.1348-1360, 2001.

X. Gao and Y. Fang, Penalized weighted least squares for outlier detection and robust regression, ArXiv, 2016.

S. A. van de Geer and P. Bühlmann, On the conditions used to prove oracle results for the lasso, Electron. J. Stat., 2009.

D. Gervini and V. J. Yohai, A class of robust and fully efficient regression estimators, Annals of Statistics, pp.583-616, 2002.

C. Giraud, Introduction to High-Dimensional Statistics, 2014.

F. E. Grubbs, Procedures for detecting outlying observations in samples, Technometrics, vol.11, pp.1-21, 1969.

A. Gupta and S. Kohli, An mcdm approach towards handling outliers in web data: a case study using owa operators, Artificial Intelligence Review, vol.46, issue.1, pp.59-82, 2016.

N. H. Nguyen and T. D. Tran, Robust lasso with missing and grossly corrupted observations, IEEE Transactions on Information Theory, vol.59, issue.4, pp.2036-2058, 2013.

A. S. Hadi, A new measure of overall potential influence in linear regression, Computational Statistics and Data Analysis, vol.14, pp.1-27, 1992.

A. S. Hadi and J. S. Simonoff, Procedures for the identification of multiple outliers in linear models, Journal of the American Statistical Association, vol.88, issue.424, pp.1264-1272, 1993.

F. R. Hampel, E. M. Ronchetti, P. J. Rousseeuw, and W. A. Stahel, Robust statistics: the approach based on influence functions, vol.114, 2011.

D. M. Hawkins, Identification of outliers, vol.11, 1980.

Y. Hochberg and A. C. Tamhane, Multiple Comparison Procedures, 1987.

W. Hoeffding, Probability inequalities for sums of bounded random variables, Journal of the American Statistical Association, vol.58, issue.301, pp.13-30, 1963.

P. J. Huber, Robust estimation of a location parameter, The Annals of Mathematical Statistics, vol.35, issue.1, pp.73-101, 1964.

P. J. Huber, The 1972 Wald lecture. Robust statistics: A review, The Annals of Mathematical Statistics, pp.1041-1067, 1972.

P. J. Huber, Robust statistics. Wiley series in probability and mathematical statistics, pp.309-312, 1981.

P. J. Bickel, Y. Ritov, and A. B. Tsybakov, Simultaneous analysis of lasso and dantzig selector, The Annals of Statistics, vol.37, issue.4, pp.1705-1732, 2009.
URL : https://hal.archives-ouvertes.fr/hal-00401585

E. J. Candès and Y. Plan, Near-ideal model selection by ℓ1-minimization, The Annals of Statistics, vol.37, issue.5A, pp.2145-2177, 2009.

E. J. Candès, The restricted isometry property and its implications for compressed sensing, Comptes Rendus Mathématique, vol.346, issue.9, pp.589-592, 2008.

E. J. Candès and P. A. Randall, Highly robust error correction by convex programming, IEEE Transactions on Information Theory, vol.54, issue.7, pp.2829-2840, 2008.

E. J. Candès and T. Tao, Decoding by linear programming, IEEE Transactions on Information Theory, vol.51, issue.12, pp.4203-4215, 2005.

P. J. Huber, Robust Statistical Procedures: Second Edition. SIAM, 1996.

M. J. Wainwright, Sharp thresholds for high-dimensional and noisy sparsity recovery using ℓ1-constrained quadratic programming (lasso), IEEE Transactions on Information Theory, vol.55, issue.5, pp.2183-2202, 2009.

O. J. Dunn, Multiple comparisons among means, Journal of the American Statistical Association, vol.56, issue.293, pp.52-64, 1961.

T. Hastie, J. Friedman, and R. Tibshirani, The elements of statistical learning. Springer series in statistics, 2001.

V. Jonchere, Identification of positively and negatively selected driver gene mutations associated with colorectal cancer with microsatellite instability, Cellular and Molecular Gastroenterology and Hepatology, vol.6, issue.3, pp.277-300, 2018.

V. Koltchinskii, Oracle inequalities in empirical risk minimization and sparse recovery problems, Saint-Flour lectures, 2008.

K. L. Ayers and H. J. Cordell, Snp selection in genome-wide and candidate gene studies via penalized logistic regression, Genetic Epidemiology, vol.34, issue.8, pp.879-891, 2010.

D. L. Donoho and X. Huo, Uncertainty principles and ideal atomic decomposition, IEEE Transactions on Information Theory, vol.47, issue.7, pp.2845-2862, 2001.

B. Laurent and P. Massart, Adaptive estimation of a quadratic functional by model selection, Annals of Statistics, vol.28, issue.5, pp.1302-1338, 2000.

G. Lecué and M. Lerasle, Learning from MOM's principles: Le Cam's approach, 2017.

G. Lecué and M. Lerasle, Robust machine learning by median-of-means: theory and practice, 2017.

Y. Lee and J. A. Nelder, Hierarchical generalized linear models, Journal of the Royal Statistical Society: Series B (Methodological), vol.58, issue.4, pp.619-656, 1996.

G. Lugosi and S. Mendelson, Risk minimization by median-of-means tournaments, 2016.

P. Mccullagh and J. A. Nelder, Generalized Linear Models, Second Edition, 1989.

J. N. Laska, M. A. Davenport, and R. G. Baraniuk, Exact signal recovery from sparsely corrupted measurements through the pursuit of justice, 43rd Asilomar conference on signals, systems and computers, pp.1556-1560, 2009.

V. N. Vapnik, The Nature of Statistical Learning Theory, 2013.

R. D. Cook, N. Holschuh, and S. Weisberg, A note on an alternative outlier model, Journal of the Royal Statistical Society. Series B (Methodological), vol.44, issue.3, pp.370-376, 1982.

G. Raskutti, M. J. Wainwright, and B. Yu, Restricted eigenvalue properties for correlated Gaussian design, Journal of Machine Learning Research, vol.11, pp.2241-2259, 2010.

G. Raskutti, M. J. Wainwright, and B. Yu, Minimax rates of estimation for high-dimensional linear regression over ℓq-balls, IEEE Transactions on Information Theory, vol.57, pp.6976-6994, 2011.

K. Ro, C. Zou, Z. Wang, and G. Yin, Outlier detection for high-dimensional data, Biometrika, vol.102, issue.3, pp.589-599, 2015.

E. Roquain, Type I error rate control in multiple testing: a survey with proofs, Journal de la Société Française de Statistique, vol.152, issue.2, pp.3-38, 2011.

P. Rousseeuw and V. Yohai, Robust regression by means of s-estimators, Robust and nonlinear time series analysis, pp.256-272, 1984.

P. J. Rousseeuw and A. M. Leroy, Robust Regression and Outlier Detection, 1987.

J. Shao and X. Deng, Estimation in high-dimensional linear models with deterministic design matrices, The Annals of Statistics, vol.40, issue.2, pp.812-831, 2012.

Y. She and A. B. Owen, Outlier detection using nonconvex penalized regression, Journal of the American Statistical Association, 2010.

A. F. Siegel, Robust regression using repeated medians, Biometrika, vol.69, issue.1, pp.242-244, 1982.

W. Su, M. Bogdan, and E. J. Candès, False discoveries occur early on the lasso path, Annals of Statistics, vol.45, issue.5, pp.2133-2150, 2017.

W. Su and E. J. Candès, Slope is adaptive to unknown sparsity and asymptotically minimax, Annals of Statistics, vol.44, issue.3, pp.1038-1068, 2016.

T. Sun and C. Zhang, Scaled sparse linear regression, Biometrika, vol.99, issue.4, pp.879-898, 2012.

N. Suraweera, B. Iacopetta, A. Duval, A. Compoint, E. Tubacher et al., Conservation of mononucleotide repeats within 3' and 5' untranslated regions and their instability in msi-h colorectal cancer, Oncogene, vol.20, issue.51, pp.7472-7477, 2001.

R. Tibshirani, Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society. Series B (Methodological), vol.58, issue.1, pp.267-288, 1996.

M. Valdora, C. Agostinelli, and V. J. Yohai, Robust estimation in high dimensional generalized linear models, 2017.

M. Valdora and V. J. Yohai, Robust estimators for generalized linear models, Journal of Statistical Planning and Inference, vol.146, pp.31-48, 2014.

A. Virouleau, A. Guilloux, S. Gaïffas, and M. Bogdan, High-dimensional robust regression and outliers detection with slope, 2017.
URL : https://hal.archives-ouvertes.fr/hal-01798400

H. Wang, G. Li, and G. Jiang, Robust regression shrinkage and consistent variable selection through the lad-lasso, Journal of Business & Economic Statistics, vol.25, issue.3, pp.347-355, 2007.

T. Wang and Z. Li, Outlier detection in high-dimensional regression model, Communications in Statistics - Theory and Methods, 2016.

S. Weisberg, Applied linear regression, vol.528, 2005.

E. Yang, A. Tewari, and P. Ravikumar, On robust estimation of high dimensional generalized linear models, International Joint Conference on Artificial Intelligence (IJCAI), vol.13, 2013.

V. J. Yohai, High breakdown-point and high efficiency robust estimates for regression, Annals of Statistics, vol.15, issue.2, pp.642-656, 1987.

Y. Lee, S. N. MacEachern, and Y. Jung, Regularization of case-specific parameters for robustness and efficiency, Statistical Science, vol.27, issue.3, pp.350-372, 2012.

C. Yu and W. Yao, Robust linear regression: a review and comparison, Communications in Statistics - Simulation and Computation, 2016.

Y. Zhang, M. J. Wainwright, and M. I. Jordan, Lower bounds on the performance of polynomial-time algorithms for sparse linear regression, Proceedings of The 27th Conference on Learning Theory, pp.921-948, 2014.

P. Zhao and B. Yu, On model selection consistency of lasso, Journal of Machine Learning Research, issue.7, pp.2541-2563, 2006.

H. Zou, The adaptive lasso and its oracle properties, Journal of the American Statistical Association, vol.101, issue.476, 2006.