
K. J. Archer and R. V. Kimes, Empirical characterization of random forest variable importance measures, Computational Statistics & Data Analysis, vol.52, issue.4, pp.2249-2260, 2008.
DOI : 10.1016/j.csda.2007.08.015

N. Aronszajn, Theory of reproducing kernels, Transactions of the American Mathematical Society, vol.68, issue.3, pp.337-404, 1950.
DOI : 10.1090/S0002-9947-1950-0051437-7

B. Auder, A. De-crecy, B. Iooss, and M. Marquès, Screening and metamodeling of computer experiments with functional outputs. Application to thermal-hydraulic computations, Reliability Engineering & System Safety, vol.107, pp.122-131, 2012.
DOI : 10.1016/j.ress.2011.10.017

URL : https://hal.archives-ouvertes.fr/hal-00525491

L. Auret and C. Aldrich, Empirical comparison of tree ensemble variable importance measures, Chemometrics and Intelligent Laboratory Systems, vol.105, issue.2, pp.157-170, 2011.
DOI : 10.1016/j.chemolab.2010.12.004

F. R. Bach, Bolasso: model consistent Lasso estimation through the bootstrap, Proceedings of the 25th International Conference on Machine Learning, ICML '08, pp.33-40, 2008.
DOI : 10.1145/1390156.1390161

URL : https://hal.archives-ouvertes.fr/hal-00271289

C. Bahlmann, B. Haasdonk, and H. Burkhardt, On-line handwriting recognition with support vector machines - a kernel approach, Proceedings of the 8th IWFHR, 2002.

J. Baudry, C. Maugis, and B. Michel, Slope heuristics: overview and implementation, Statistics and Computing, vol.22, issue.2, pp.455-470, 2012.
DOI : 10.1007/s11222-011-9236-1

URL : https://hal.archives-ouvertes.fr/hal-00461639

A. Berlinet, G. Biau, and L. Rouvière, Functional supervised classification with wavelets, Annales de l'ISUP, vol.52, pp.61-80, 2008.

P. Besse and H. Cardot, Modélisation statistique de données fonctionnelles, Analyse des données, pp.167-198, 2003.

J. Bi, K. P. Bennett, M. Embrechts, C. M. Breneman, and M. Song, Dimensionality reduction via sparse support vector machines, Journal of Machine Learning Research, vol.3, pp.1229-1243, 2003.

G. Biau, Analysis of a random forests model, Journal of Machine Learning Research, vol.13, pp.1063-1095, 2012.
URL : https://hal.archives-ouvertes.fr/hal-00704947

G. Biau, F. Bunea, and M. Wegkamp, Functional Classification in Hilbert Spaces, IEEE Transactions on Information Theory, vol.51, issue.6, pp.2163-2172, 2005.
DOI : 10.1109/TIT.2005.847705

G. Biau, L. Devroye, and G. Lugosi, Consistency of random forests and other averaging classifiers, Journal of Machine Learning Research, vol.9, pp.2015-2033, 2008.
URL : https://hal.archives-ouvertes.fr/hal-00355368

L. Birgé and P. Massart, Gaussian model selection, Journal of the European Mathematical Society, vol.3, issue.3, pp.203-268, 2001.
DOI : 10.1007/s100970100031

L. Birgé and P. Massart, Minimal penalties for Gaussian model selection, Probability Theory and Related Fields, pp.33-73, 2006.

A. L. Blum and P. Langley, Selection of relevant features and examples in machine learning, Artificial Intelligence, vol.97, issue.1-2, pp.245-271, 1997.
DOI : 10.1016/S0004-3702(97)00063-5

B. E. Boser, I. Guyon, and V. Vapnik, A training algorithm for optimal margin classifiers, Proceedings of the fifth annual workshop on Computational learning theory, COLT '92, pp.144-152, 1992.
DOI : 10.1145/130385.130401

S. Boyd and L. Vandenberghe, Convex optimization, 2004.

L. Breiman, Bagging predictors, Machine Learning, vol.24, issue.2, pp.123-140, 1996.
DOI : 10.1007/BF00058655

L. Breiman, Out-of-bag estimation, 1997.

L. Breiman, Random forests, Machine Learning, vol.45, issue.1, pp.5-32, 2001.
DOI : 10.1023/A:1010933404324

L. Breiman, Consistency for a simple model of random forests, 2004.

L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and regression trees, 1984.

P. Bühlmann, P. Rütimann, S. Van-de-geer, and C. Zhang, Correlated variables in regression: Clustering and sparse estimation, Journal of Statistical Planning and Inference, vol.143, issue.11, pp.1835-1858, 2013.
DOI : 10.1016/j.jspi.2013.05.019

T. Cai and P. Hall, Prediction in functional linear regression, The Annals of Statistics, vol.34, issue.5, pp.2159-2179, 2006.
DOI : 10.1214/009053606000000830

T. Cai and H. Zhou, A data-driven block thresholding approach to wavelet estimation, The Annals of Statistics, vol.37, issue.2, pp.569-595, 2009.
DOI : 10.1214/07-AOS538

H. Cardot, F. Ferraty, and P. Sarda, Functional linear model, Statistics & Probability Letters, vol.45, issue.1, pp.11-22, 1999.
DOI : 10.1016/S0167-7152(99)00036-X

H. Cardot, F. Ferraty, and P. Sarda, Spline estimators for the functional linear model, Statistica Sinica, vol.13, pp.571-592, 2003.

H. Cardot and P. Sarda, Estimation in generalized linear models for functional data via penalized likelihood, Journal of Multivariate Analysis, vol.92, issue.1, pp.24-41, 2005.
DOI : 10.1016/j.jmva.2003.08.008

D. Chakraborty and N. R. Pal, Selecting Useful Groups of Features in a Connectionist Framework, IEEE Transactions on Neural Networks, vol.19, issue.3, pp.381-396, 2008.
DOI : 10.1109/TNN.2007.910730

G. Chastaing, Indices de Sobol généralisés pour variables dépendantes, 2013.

G. Chastaing, F. Gamboa, and C. Prieur, Generalized Hoeffding-Sobol decomposition for dependent variables - application to sensitivity analysis, Electronic Journal of Statistics, vol.6, issue.0, pp.2420-2448, 2012.
DOI : 10.1214/12-EJS749

URL : https://hal.archives-ouvertes.fr/hal-00649404

N. V. Chawla, K. W. Bowyer, L. O. Hall, and W. P. Kegelmeyer, SMOTE: Synthetic minority over-sampling technique, Journal of Artificial Intelligence Research, vol.16, pp.321-357, 2002.

C. Chen, A. Liaw, and L. Breiman, Using random forest to learn imbalanced data, 2004.

S. S. Chen, D. L. Donoho, and M. A. Saunders, Atomic Decomposition by Basis Pursuit, SIAM Journal on Scientific Computing, vol.20, issue.1, pp.33-61, 1998.
DOI : 10.1137/S1064827596304010

C. Cortes and V. Vapnik, Support-vector networks, Machine Learning, vol.20, issue.3, pp.273-297, 1995.
DOI : 10.1007/BF00994018

N. Cristianini and J. Shawe-taylor, An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods, 2000.
DOI : 10.1017/CBO9780511801389

A. Cuevas, M. Febrero, and R. Fraiman, Robust estimation and classification for functional data via projection-based depth notions, Computational Statistics, vol.22, issue.3, pp.481-496, 2007.
DOI : 10.1007/s00180-007-0053-0

S. Da Veiga, F. Wahl, and F. Gamboa, Local Polynomial Estimation for Sensitivity Analysis on Models With Correlated Inputs, Technometrics, vol.51, issue.4, pp.452-463, 2009.
DOI : 10.1198/TECH.2009.08124

URL : https://hal.archives-ouvertes.fr/hal-00266102

I. Daubechies, Ten Lectures on Wavelets, CBMS-NSF Regional Conference Series in Applied Mathematics, vol.61, SIAM, 1992.

J. Dauxois and A. Pousse, Les analyses factorielles en calcul des probabilités et en statistique : essai d'étude synthétique, Thèse d'état, 1976.

J. Dauxois, A. Pousse, and Y. Romain, Asymptotic theory for the principal component analysis of a vector random function: Some applications to statistical inference, Journal of Multivariate Analysis, vol.12, issue.1, pp.136-154, 1982.
DOI : 10.1016/0047-259X(82)90088-4

E. De-rocquigny, La maîtrise des incertitudes dans un contexte industriel - 1re partie : une approche méthodologique globale basée sur les exemples, Journal de la Société Française de Statistique, vol.147, issue.3, pp.33-71, 2006.

E. De-rocquigny, N. Devictor, and S. Tarantola, Uncertainty in Industrial Practice : A Guide to Quantitative Uncertainty Management, 2008.
DOI : 10.1002/9780470770733

J. Dean and S. Ghemawat, MapReduce: simplified data processing on large clusters, Communications of the ACM, vol.51, issue.1, pp.107-113, 2008.
DOI : 10.1145/1327452.1327492

J. Deville, Méthodes statistiques et numériques de l'analyse harmonique, Annales de l'INSEE, issue.15, pp.5-101, 1974.
DOI : 10.2307/20075177

L. Devroye, L. Györfi, and G. Lugosi, A probabilistic theory of pattern recognition, 1996.
DOI : 10.1007/978-1-4612-0711-5

R. Díaz-uriarte and S. Alvarez-de-andrés, Gene selection and classification of microarray data using random forest, BMC Bioinformatics, vol.7, issue.3, 2006.

D. L. Donoho and I. M. Johnstone, Ideal spatial adaptation by wavelet shrinkage, Biometrika, vol.81, issue.3, pp.425-455, 1994.
DOI : 10.1093/biomet/81.3.425

D. L. Donoho and I. M. Johnstone, Adapting to Unknown Smoothness via Wavelet Shrinkage, Journal of the American Statistical Association, vol.90, issue.432, pp.1200-1224, 1995.
DOI : 10.1080/01621459.1995.10476626

D. L. Donoho and I. M. Johnstone, Minimax estimation via wavelet shrinkage, The Annals of Statistics, vol.26, issue.3, pp.879-921, 1998.
DOI : 10.1214/aos/1024691081

D. L. Donoho, I. M. Johnstone, G. Kerkyacharian, and D. Picard, Wavelet shrinkage: Asymptopia?, Journal of the Royal Statistical Society, Series B, vol.57, pp.301-369, 1995.

D. L. Donoho, I. M. Johnstone, G. Kerkyacharian, and D. Picard, Density estimation by wavelet thresholding, The Annals of Statistics, vol.24, issue.2, pp.508-539, 1996.
DOI : 10.1214/aos/1032894451

H. Drucker, C. Burges, L. Kaufman, A. Smola, and V. Vapnik, Support vector regression machines, Advances in Neural Information Processing Systems, 1996.

K. Duan, J. Rajapakse, H. Wang, and F. Azuaje, Multiple SVM-RFE for Gene Selection in Cancer Classification With Expression Data, IEEE Transactions on Nanobioscience, vol.4, issue.3, pp.228-234, 2005.
DOI : 10.1109/TNB.2005.853657

T. Evgeniou, M. Pontil, and T. Poggio, Regularization networks and support vector machines, Advances in Computational Mathematics, vol.13, pp.1-50, 2000.

Y. Fan and G. James, Functional additive regression, The Annals of Statistics, vol.43, issue.5, 2013.
DOI : 10.1214/15-AOS1346

F. Ferraty, Recent Advances in Functional Data Analysis and Related Topics, 2011.
DOI : 10.1007/978-3-7908-2736-1

URL : https://hal.archives-ouvertes.fr/hal-00794868

F. Ferraty and P. Vieu, The functional nonparametric model and application to spectrometric data, Computational Statistics, vol.17, issue.4, pp.545-564, 2002.
DOI : 10.1007/s001800200126

F. Ferraty and P. Vieu, Curves discrimination: a nonparametric functional approach, Computational Statistics & Data Analysis, vol.44, issue.1-2, pp.161-173, 2003.
DOI : 10.1016/S0167-9473(03)00032-X

F. Ferraty and P. Vieu, Nonparametric Functional Data Analysis : Theory and Practice, 2006.

F. Ferraty and P. Vieu, Additive prediction and boosting for functional data, Computational Statistics & Data Analysis, vol.53, issue.4, pp.1400-1413, 2009.
DOI : 10.1016/j.csda.2008.11.023

URL : https://hal.archives-ouvertes.fr/hal-00628614

E. Fix and J. L. Hodges, Discriminatory Analysis. Nonparametric Discrimination: Consistency Properties, International Statistical Review / Revue Internationale de Statistique, vol.57, issue.3, p.477, 1951.
DOI : 10.2307/1403797

J. Friedman, T. Hastie, and R. Tibshirani, A note on the group lasso and a sparse group lasso, 2010.

M. Fromont and C. Tuleau, Functional Classification with Margin Conditions, 19th Annual Conference on Learning Theory, 2006.
DOI : 10.1007/11776420_10

URL : https://hal.archives-ouvertes.fr/hal-00457770

R. Genuer, Variance reduction in purely random forests, Journal of Nonparametric Statistics, vol.24, issue.3, pp.543-562, 2012.

R. Genuer, J. Poggi, and C. Tuleau-malot, Variable selection using random forests, Pattern Recognition Letters, vol.31, issue.14, pp.2225-2236, 2010.
DOI : 10.1016/j.patrec.2010.03.014

URL : https://hal.archives-ouvertes.fr/hal-00755489

J. Gertheiss, A. Maity, and A. Staicu, Variable selection in generalized functional linear models, Stat, vol.2, issue.1, pp.86-101, 2013.
DOI : 10.1002/sta4.20

S. Gey, Bornes de risque, détection de ruptures, boosting : trois thèmes autour de CART en régression, 2002.

S. Gey and E. Nédélec, Model Selection for CART Regression Trees, IEEE Transactions on Information Theory, vol.51, issue.2, pp.658-670, 2005.
DOI : 10.1109/TIT.2004.840903

URL : https://hal.archives-ouvertes.fr/hal-00326549

W. González-Manteiga and A. Martínez-Calvo, Bootstrap in functional linear regression, Journal of Statistical Planning and Inference, vol.141, issue.1, pp.453-461, 2011.
DOI : 10.1016/j.jspi.2010.06.027

P. J. Green and B. W. Silverman, Nonparametric regression and generalized linear models : a roughness penalty approach, Chapman and Hall, 1994.
DOI : 10.1007/978-1-4899-4473-3

B. Gregorutti, B. Michel, and P. Saint-Pierre, Correlation and variable importance in random forests, Statistics and Computing, 2014.
DOI : 10.1007/s11222-016-9646-1

URL : https://hal.archives-ouvertes.fr/hal-00879978

B. Gregorutti, B. Michel, and P. Saint-Pierre, Grouped variable importance with random forests and applications to multivariate functional data analysis, 2014.

U. Grömping, Variable Importance Assessment in Regression: Linear Regression versus Random Forest, The American Statistician, vol.63, issue.4, pp.308-319, 2009.
DOI : 10.1198/tast.2009.08199

I. Guyon and A. Elisseeff, An introduction to variable and feature selection, Journal of Machine Learning Research, vol.3, pp.1157-1182, 2003.

I. Guyon, J. Weston, S. Barnhill, and V. Vapnik, Gene selection for cancer classification using support vector machines, Machine Learning, vol.46, issue.1-3, pp.389-422, 2002.

P. Hall, G. Kerkyacharian, and D. Picard, On the minimax optimality of block thresholded wavelet estimators, Statistica Sinica, vol.9, pp.33-49, 1999.

A. Hapfelmeier and K. Ulm, A new variable selection approach using Random Forests, Computational Statistics & Data Analysis, vol.60, pp.50-69, 2013.
DOI : 10.1016/j.csda.2012.09.020

T. Hastie and R. Tibshirani, Generalized Additive Models, Statistical Science, vol.1, issue.3, pp.297-318, 1986.
DOI : 10.1214/ss/1177013604

T. Hastie and R. Tibshirani, Classification by pairwise coupling, The Annals of Statistics, vol.26, issue.2, pp.451-471, 1998.
DOI : 10.1214/aos/1028144844

T. Hastie, R. Tibshirani, and J. Friedman, The Elements of Statistical Learning, 2001.

A. Haury, P. Gestraud, and J. Vert, The Influence of Feature Selection Methods on Accuracy, Stability and Interpretability of Molecular Signatures, PLoS ONE, vol.6, issue.12, pp.1-12, 2011.
DOI : 10.1371/journal.pone.0028210

URL : https://hal.archives-ouvertes.fr/hal-00559580

A. Haury, L. Jacob, and J. Vert, Improving stability and interpretability of gene expression signatures, 2010.

Z. He and W. Yu, Stable feature selection for biomarker discovery, Computational Biology and Chemistry, vol.34, issue.4, pp.215-225, 2010.
DOI : 10.1016/j.compbiolchem.2010.07.002

W. Hoeffding, A Class of Statistics with Asymptotically Normal Distribution, The Annals of Mathematical Statistics, vol.19, issue.3, pp.293-325, 1948.
DOI : 10.1214/aoms/1177730196

T. Homma and A. Saltelli, Importance measures in global sensitivity analysis of nonlinear models, Reliability Engineering & System Safety, vol.52, issue.1, pp.1-17, 1996.
DOI : 10.1016/0951-8320(96)00002-6

L. Horváth and P. Kokoszka, Inference for Functional Data with Applications, 2012.
DOI : 10.1007/978-1-4614-3655-3

F. Ieva, A. M. Paganoni, D. Pigoli, and V. Vitelli, Multivariate functional clustering for the morphological analysis of electrocardiograph curves, Journal of the Royal Statistical Society: Series C (Applied Statistics), vol.62, issue.3, pp.401-418, 2013.
DOI : 10.1111/j.1467-9876.2012.01062.x

B. Iooss and P. Lemaître, A Review on Global Sensitivity Analysis Methods, 2014.
DOI : 10.1007/978-1-4899-7547-8_5

URL : https://hal.archives-ouvertes.fr/hal-00975701

B. Iooss, F. Van-dorpe, and N. Devictor, Response surfaces and sensitivity analyses for an environmental model of dose calculations, Reliability Engineering & System Safety, vol.91, issue.10-11, pp.1241-1251, 2006.
DOI : 10.1016/j.ress.2005.11.021

H. Ishwaran, Variable importance in binary regression trees and forests, Electronic Journal of Statistics, vol.1, issue.0, pp.519-537, 2007.
DOI : 10.1214/07-EJS039

L. Jacob, G. Obozinski, and J. Vert, Group lasso with overlap and graph lasso, Proceedings of the 26th Annual International Conference on Machine Learning, ICML '09, pp.433-440, 2009.
DOI : 10.1145/1553374.1553431

G. M. James, T. Hastie, and C. A. Sugar, Principal component models for sparse functional data, Biometrika, vol.87, issue.3, pp.587-602, 2000.
DOI : 10.1093/biomet/87.3.587

S. Janitza, C. Strobl, and A. Boulesteix, An AUC-based permutation variable importance measure for random forests, BMC Bioinformatics, vol.14, issue.1, p.119, 2013.
DOI : 10.1186/1471-2105-14-119

H. Jiang, Y. Deng, H. Chen, L. Tao, Q. Sha et al., Joint analysis of two microarray gene-expression data sets to select lung adenocarcinoma marker genes, BMC Bioinformatics, vol.5, issue.1, p.81, 2004.
DOI : 10.1186/1471-2105-5-81

I. M. Johnstone and B. W. Silverman, Wavelet Threshold Estimators for Data with Correlated Noise, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.59, issue.2, pp.319-351, 1997.
DOI : 10.1111/1467-9868.00071

A. Kalousis, J. Prados, and M. Hilario, Stability of feature selection algorithms: a study on high-dimensional spaces, Knowledge and Information Systems, vol.20, issue.1-2, pp.95-116, 2007.
DOI : 10.1007/s10115-006-0040-8

R. Kohavi and G. H. John, Wrappers for feature subset selection, Artificial Intelligence, vol.97, issue.1-2, pp.273-324, 1997.
DOI : 10.1016/S0004-3702(97)00043-X


P. Křížek, J. Kittler, and V. Hlaváč, Improving stability of feature selection methods, Computer Analysis of Images and Patterns, pp.929-936, 2007.

M. Kwemou, Non-asymptotic oracle inequalities for the Lasso and Group Lasso in high dimensional logistic model, ESAIM: Probability and Statistics, vol.20, 2012.
DOI : 10.1051/ps/2015020

URL : https://hal.archives-ouvertes.fr/hal-00703714

T. Laloë, A k-nearest neighbor approach for functional regression, Statistics & Probability Letters, vol.78, issue.10, pp.1189-1193, 2008.
DOI : 10.1016/j.spl.2007.11.014

B. Laurent and P. Massart, Adaptive estimation of a quadratic functional by model selection, The Annals of Statistics, vol.28, issue.5, pp.1302-1338, 2000.

C. Lazar, J. Taminau, S. Meganck, D. Steenhoff, A. Coletta et al., A Survey on Filter Techniques for Feature Selection in Gene Expression Microarray Analysis, IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol.9, issue.4, pp.1106-1119, 2012.
DOI : 10.1109/TCBB.2012.33

X. Leng and H. Müller, Classification using functional data analysis for temporal gene expression data, Bioinformatics, vol.22, issue.1, pp.68-76, 2006.
DOI : 10.1093/bioinformatics/bti742

S. López-pintado and J. Romo, Depth-based classification for functional data, DIMACS Series in Discrete Mathematics and Theoretical Computer Science, vol.72, p.103, 2006.

N. Louw and S. Steel, Variable selection in kernel Fisher discriminant analysis by means of recursive feature elimination, Computational Statistics & Data Analysis, vol.51, issue.3, pp.2043-2055, 2006.
DOI : 10.1016/j.csda.2005.12.018

S. Mallat, A theory for multiresolution signal decomposition: the wavelet representation, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.11, issue.7, pp.674-693, 1989.

S. Mallat, Une exploration des signaux en ondelettes, Éditions de l'École Polytechnique, 2000.

S. Mallat, A Wavelet Tour of Signal Processing, 2008.

C. L. Mallows, Some comments on Cp, Technometrics, vol.15, issue.4, pp.661-675, 1973.

T. Mara and S. Tarantola, Variance-based sensitivity indices for models with dependent inputs, Reliability Engineering & System Safety, vol.107, pp.115-121, 2012.
DOI : 10.1016/j.ress.2011.08.008

URL : https://hal.archives-ouvertes.fr/hal-01093038

P. Massart, Concentration inequalities and model selection, 2003.

H. Matsui, Variable and boundary selection for functional data via multiclass logistic regression modeling, Computational Statistics & Data Analysis, vol.78, issue.0, pp.176-185, 2014.
DOI : 10.1016/j.csda.2014.04.015

H. Matsui and S. Konishi, Variable selection for functional regression models via the L1 regularization, Computational Statistics & Data Analysis, vol.55, issue.12, pp.3304-3310, 2011.
DOI : 10.1016/j.csda.2011.06.016

L. Meier, S. Van-de-geer, and P. Bühlmann, The group lasso for logistic regression, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.70, issue.1, pp.53-71, 2008.
DOI : 10.1111/j.1467-9868.2007.00627.x

N. Meinshausen and P. Bühlmann, Stability selection, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.72, issue.4, pp.417-473, 2010.
DOI : 10.1111/j.1467-9868.2010.00740.x

J. Mercer, Functions of positive and negative type, and their connection with the theory of integral equations, Philosophical Transactions of the Royal Society of London, Series A, vol.209, pp.415-446, 1909.

M. Misiti, Y. Misiti, G. Oppenheim, and J. Poggi, Décomposition par ondelettes et méthodes comparatives : étude d'une courbe de charge électrique, pp.55-77, 1994.

E. Mostacci, C. Truntzer, H. Cardot, and P. Ducoroy, Multivariate denoising methods combining wavelets and principal component analysis for mass spectrometry data, PROTEOMICS, vol.10, issue.14, pp.2564-2572, 2010.
DOI : 10.1002/pmic.200900185

P. G. Neville, Controversy of variable importance in random forests, Journal of Unified Statistical Techniques, vol.1, pp.15-20, 2013.

K. K. Nicodemus, Letter to the Editor: On the stability and ranking of predictors from random forest variable importance measures, Briefings in Bioinformatics, vol.12, issue.4, pp.369-373, 2011.
DOI : 10.1093/bib/bbr016

K. K. Nicodemus and J. D. Malley, Predictor correlation impacts machine learning algorithms: implications for genomic studies, Bioinformatics, vol.25, issue.15, pp.1884-1890, 2009.
DOI : 10.1093/bioinformatics/btp331

K. K. Nicodemus, J. D. Malley, C. Strobl, and A. Ziegler, The behaviour of random forest permutation-based variable importance measures under predictor correlation, BMC Bioinformatics, vol.11, issue.1, p.110, 2010.
DOI : 10.1186/1471-2105-11-110

G. Obozinski, L. Jacob, and J. Vert, Group lasso with overlaps : the latent group lasso approach, 2009.
URL : https://hal.archives-ouvertes.fr/inria-00628498

J. B. Oliva, B. Poczos, T. Verstynen, A. Singh, J. Schneider et al., FuSSO: Functional Shrinkage and Selection Operator, Proceedings of the Seventeenth International Conference on Artificial Intelligence and Statistics, pp.715-723, 2014.

E. E. Osuna, R. Freund, and F. Girosi, Support vector machines : Training and applications, 1997.

D. B. Percival and A. T. Walden, Wavelet Methods for Time Series Analysis, 2000.

D. Pigoli and L. M. Sangalli, Wavelets in functional data analysis: Estimation of multidimensional curves and their derivatives, Computational Statistics & Data Analysis, vol.56, issue.6, pp.1482-1498, 2012.
DOI : 10.1016/j.csda.2011.12.016

J. C. Platt, Probabilistic outputs for support vector machines and comparisons to regularized likelihood methods, in Advances in Large Margin Classifiers, pp.61-74, 1999.

J. Poggi and C. Tuleau, Classification supervisée en grande dimension. Application à l'agrément de conduite automobile, Revue de Statistique Appliquée, vol.4, pp.39-58, 2006.

A. Rakotomamonjy, Variable selection using svm based criteria, Journal of Machine Learning Research, vol.3, pp.1357-1370, 2003.

J. O. Ramsay and B. W. Silverman, Functional Data Analysis, Springer Series in Statistics, 1997.

J. O. Ramsay and B. W. Silverman, Applied Functional Data Analysis : Methods and Case Studies, 2002.
DOI : 10.1007/b98886

J. O. Ramsay and B. W. Silverman, Functional Data Analysis, 2005.

T. W. Randolph, J. Harezlak, and Z. Feng, Structured penalties for functional linear models: partially empirical eigenvectors for regression, Electronic Journal of Statistics, vol.6, issue.0, pp.323-353, 2012.
DOI : 10.1214/12-EJS676

C. R. Rao, Linear statistical inference and its applications, Wiley series in probability and mathematical statistics : Probability and mathematical statistics, 1973.

F. Rossi, D. François, V. Wertz, and M. Verleysen, A Functional Approach to Variable Selection in Spectrometric Problems, Proceedings of 16th International Conference on Artificial Neural Networks, ICANN 2006, pp.11-20, 2006.
DOI : 10.1007/11840817_2

F. Rossi and N. Villa, Support vector machine for functional data classification, Neurocomputing, vol.69, issue.7-9, pp.730-742, 2006.
DOI : 10.1016/j.neucom.2005.12.010

URL : https://hal.archives-ouvertes.fr/hal-00144141

F. Rossi and N. Villa, Recent Advances in the Use of SVM for Functional Data Classification, Proceedings of the 1st International Workshop on Functional and Operatorial Statistics, 2008.
DOI : 10.1007/978-3-7908-2062-1_41

URL : https://hal.archives-ouvertes.fr/hal-00635480

A. Saltelli, Making best use of model evaluations to compute sensitivity indices, Computer Physics Communications, vol.145, issue.2, pp.280-297, 2002.
DOI : 10.1016/S0010-4655(02)00280-1

A. Saltelli, K. Chan, and E. Scott, Sensitivity Analysis, 2009.
URL : https://hal.archives-ouvertes.fr/inria-00386559

C. Saunders, A. Gammerman, and V. Vovk, Ridge regression learning algorithm in dual variables, Proceedings of the 15th International Conference on Machine Learning, pp.515-521, 1998.

B. Schölkopf and A. J. Smola, Learning with Kernels : Support Vector Machines, Regularization, Optimization, and Beyond, 2001.

G. Schwarz, Estimating the Dimension of a Model, The Annals of Statistics, vol.6, issue.2, pp.461-464, 1978.
DOI : 10.1214/aos/1176344136

E. Scornet, G. Biau, and J. Vert, Consistency of random forests, The Annals of Statistics, vol.43, issue.4, 2014.
DOI : 10.1214/15-AOS1321

URL : https://hal.archives-ouvertes.fr/hal-00990008

R. D. Shah and R. J. Samworth, Variable selection with error control: another look at stability selection, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.75, issue.1, pp.55-80, 2013.
DOI : 10.1111/j.1467-9868.2011.01034.x

J. Shawe-taylor and N. Cristianini, Kernel Methods for Pattern Analysis, 2004.
DOI : 10.1017/CBO9780511809682

H. Shimodaira, K. Noma, M. Nakai, and S. Sagayama, Dynamic time-alignment kernel in support vector machine, Advances in Neural Information Processing Systems 14, pp.921-928, 2001.

N. Simon, J. Friedman, T. Hastie, and R. Tibshirani, A Sparse-Group Lasso, Journal of Computational and Graphical Statistics, vol.22, issue.2, pp.231-245, 2013.
DOI : 10.1080/10618600.2012.681250

N. Simon and R. Tibshirani, Standardization and the Group Lasso Penalty, Statistica Sinica, vol.22, issue.3, pp.983-1001, 2012.
DOI : 10.5705/ss.2011.075

I. M. Sobol, Sensitivity estimates for nonlinear mathematical models, Mathematical Modelling and Computational Experiments, vol.1, pp.407-414, 1993.

J. J. Song, W. Deng, H. Lee, and D. Kwon, Optimal classification for time-course gene expression data using functional data analysis, Computational Biology and Chemistry, vol.32, issue.6, pp.426-432, 2008.
DOI : 10.1016/j.compbiolchem.2008.07.007

M. O. Stitson, A. Gammerman, V. Vapnik, V. Vovk, C. Watkins et al., Support vector regression with ANOVA decomposition kernels, in Advances in Kernel Methods, pp.285-292, 1997.

C. Strobl, A. Boulesteix, T. Kneib, T. Augustin, and A. Zeileis, Conditional Variable Importance for Random Forests, BMC Bioinformatics, vol.9, issue.1, p.307, 2008.
DOI : 10.1186/1471-2105-9-307

C. Strobl and A. Zeileis, Danger: High power! Exploring the statistical properties of a test for random forest variable importance, Proceedings of the 18th International Conference on Computational Statistics, 2008.

J. A. Suykens, J. De-brabanter, L. Lukas, and J. Vandewalle, Weighted least squares support vector machines: robustness and sparse approximation, Neurocomputing, vol.48, issue.1-4, pp.85-105, 2002.
DOI : 10.1016/S0925-2312(01)00644-0

V. Svetnik, A. Liaw, C. Tong, and T. Wang, Application of Breiman's Random Forest to Modeling Structure-Activity Relationships of Pharmaceutical Molecules, 2004.
DOI : 10.1007/978-3-540-25966-4_33

T. S. Tian and G. M. James, Interpretable dimension reduction for classifying functional data, Computational Statistics & Data Analysis, vol.57, issue.1, pp.282-296, 2013.
DOI : 10.1016/j.csda.2012.06.017

R. Tibshirani, Regression shrinkage and selection via the lasso, Journal of the Royal Statistical Society, Series B, vol.58, pp.267-288, 1996.

A. Tikhonov, Solution of incorrectly formulated problems and the regularization method, Soviet Mathematics Doklady, vol.4, pp.1035-1038, 1963.

L. Toloşi and T. Lengauer, Classification with correlated features: unreliability of feature ranking and solutions, Bioinformatics, vol.27, issue.14, pp.1986-1994, 2011.
DOI : 10.1093/bioinformatics/btr300

C. Tuleau, Sélection de variables pour la discrimination en grande dimension et classification de données fonctionnelles, 2005.

S. Van-de-geer, High-dimensional generalized linear models and the lasso, The Annals of Statistics, vol.36, issue.2, pp.614-645, 2008.
DOI : 10.1214/009053607000000929

M. J. Van-der-laan, Statistical Inference for Variable Importance, The International Journal of Biostatistics, vol.2, issue.1, pp.1-33, 2006.
DOI : 10.2202/1557-4679.1008

V. N. Vapnik, The nature of statistical learning theory, 1995.

V. N. Vapnik, Statistical Learning Theory, 1998.

K. Veropoulos, C. Campbell, and N. Cristianini, Controlling the sensitivity of support vector machines, Proceedings of the International Joint Conference on AI, pp.55-60, 1999.

E. Volkova, B. Iooss, and F. Van-dorpe, Global sensitivity analysis for a numerical model of radionuclide migration from the RRC "Kurchatov Institute" radwaste disposal site, Stochastic Environmental Research and Risk Assessment, vol.22, issue.1, pp.17-31, 2008.
DOI : 10.1007/s00477-006-0093-y

L. Wasserman, All of Nonparametric Statistics, 2006.

S. Xiang, X. Tong, and J. Ye, Efficient sparse group feature selection via nonconvex optimization, Proceedings of the 30th International Conference on Machine Learning, pp.284-292, 2013.

R. Yan, Y. Liu, R. Jin, and A. Hauptmann, On predicting rare classes with svm ensembles in scene classification, International Conference on Acoustics, Speech, and Signal Processing (ICASSP), pp.21-24, 2003.

K. Yang, H. Yoon, and C. Shahabi, A supervised feature subset selection technique for multivariate time series, Proceedings of the Workshop on Feature Selection for Data Mining : Interfacing Machine Learning with Statistics, 2005.

H. Yoon and C. Shahabi, Feature Subset Selection on Multivariate Time Series with Extremely Large Spatial Features, Sixth IEEE International Conference on Data Mining, Workshops (ICDMW'06), pp.337-342, 2006.
DOI : 10.1109/ICDMW.2006.81

M. Yuan and Y. Lin, Model selection and estimation in regression with grouped variables, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.68, issue.1, pp.49-67, 2006.
DOI : 10.1111/j.1467-9868.2005.00532.x

H. H. Zhang, Y. Liu, Y. Wu, and J. Zhu, Variable selection for the multicategory SVM via adaptive sup-norm regularization, Electronic Journal of Statistics, vol.2, issue.0, pp.149-167, 2008.
DOI : 10.1214/08-EJS122

P. Zhao, G. Rocha, and B. Yu, The composite absolute penalties family for grouped and hierarchical variable selection, The Annals of Statistics, vol.37, issue.6A, pp.3468-3497, 2009.
DOI : 10.1214/07-AOS584

N. Zhou and J. Zhu, Group variable selection via a hierarchical lasso and its oracle property, Statistics and Its Interface, vol.3, issue.4, pp.557-574, 2010.
DOI : 10.4310/SII.2010.v3.n4.a13

H. Zhu and D. D. Cox, A functional generalized linear model with curve selection in cervical pre-cancer diagnosis using fluorescence spectroscopy, Lecture Notes - Monograph Series, pp.173-189, 2009.

J. Zhu and T. Hastie, Classification of gene microarrays by penalized logistic regression, Biostatistics, vol.5, issue.3, pp.427-443, 2004.
DOI : 10.1093/biostatistics/kxg046

J. Zhu, S. Rosset, T. Hastie, and R. Tibshirani, 1-norm support vector machines, Advances in Neural Information Processing Systems, 2004.

R. Zhu, D. Zeng, and M. R. Kosorok, Reinforcement Learning Trees, Journal of the American Statistical Association, vol.110, issue.512, 2015.
DOI : 10.1080/01621459.2011.637468

H. Zou, The Adaptive Lasso and Its Oracle Properties, Journal of the American Statistical Association, vol.101, issue.476, pp.1418-1429, 2006.
DOI : 10.1198/016214506000000735

H. Zou and T. Hastie, Regularization and variable selection via the elastic net, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.67, issue.2, pp.301-320, 2005.
DOI : 10.1111/j.1467-9868.2005.00503.x