A catalogue of the performance of a number of methods (notably SVMs) is available at the following address: http://ida.first.fraunhofer.de/projects/bench/benchmarks.htm. Table 5.2 reports the performance of the following methods:
- the Support Vector Machines (SVM) proposed by Rätsch et al., 2001;

- the KPLS-SVC method (Kernel PLS followed by an SVM on the retained components) proposed by Rosipal et al., 2003.
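
A minimal sketch of this KPLS-SVC pipeline is given below, assuming an RBF kernel and illustrative settings for gamma, the number of components and the SVM cost C (none of which are the authors' choices); the test-score projection follows the formula of Rosipal and Trejo (2001).

```python
import numpy as np
from sklearn.svm import SVC

def rbf(A, B, gamma=0.1):
    """RBF kernel matrix between the rows of A and B."""
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kpls_svc(X_tr, y_tr, X_te, n_comp=5, gamma=0.1, C=1.0):
    n, n_te = len(X_tr), len(X_te)
    K, Kt = rbf(X_tr, X_tr, gamma), rbf(X_te, X_tr, gamma)
    # centre the training and test kernels consistently in feature space
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    Ktc = (Kt - np.ones((n_te, n)) @ K / n) @ J
    # NIPALS extraction of KPLS components; with a single response column
    # the inner loop converges in one pass, so it is written directly
    Y = y_tr.astype(float).reshape(-1, 1)
    Kd, Yd = Kc.copy(), Y.copy()
    T, U = np.zeros((n, n_comp)), np.zeros((n, n_comp))
    for m in range(n_comp):
        u = Yd / np.linalg.norm(Yd)
        t = Kd @ u
        t /= np.linalg.norm(t)
        T[:, [m]], U[:, [m]] = t, u
        P = np.eye(n) - t @ t.T          # deflate kernel and response
        Kd = P @ Kd @ P
        Yd = Yd - t @ (t.T @ Yd)
    Tt = Ktc @ U @ np.linalg.inv(T.T @ Kc @ U)   # project the test points
    # linear SVM trained on the retained components
    return SVC(C=C, kernel="linear").fit(T, y_tr).predict(Tt)
```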

To validate the KL-PLS in high-dimensional spaces, two datasets from the DNA microarray domain were used: «Ovarian Cancer» and «Lung Cancer». These datasets are available at the following address: http://sdmc.lit.org.sg/GEDatasets/. Ovarian Cancer is a two-class dataset in 15154 dimensions (150 training observations and 103 test observations). Lung Cancer is a two-class dataset in 12533 dimensions (100 training observations and 81 test observations).

The protocol for evaluating a method is as follows: the number of observations in the training and test samples is fixed at 150 and 103 for «Ovarian Cancer» and at 100 and 81 for «Lung Cancer». Thirty random partitions are then generated, and the mean error rate ± standard deviation, measured on the test set, is reported.
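
As a sketch of this protocol (not the code actually used for the experiments), the loop below draws thirty random partitions of the stated sizes and reports the mean test error and its standard deviation; `make_classifier` is a hypothetical factory standing in for any of the compared methods.

```python
import numpy as np
from sklearn.model_selection import train_test_split

def evaluate(X, y, make_classifier, n_train, n_splits=30, seed=0):
    """Mean test error +/- standard deviation over random fixed-size partitions."""
    rng = np.random.RandomState(seed)
    errors = []
    for _ in range(n_splits):
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, train_size=n_train, random_state=rng.randint(2**31 - 1))
        clf = make_classifier().fit(X_tr, y_tr)
        errors.append(np.mean(clf.predict(X_te) != y_te))
    return np.mean(errors), np.std(errors)

# e.g. for "Ovarian Cancer" (253 observations in total, so 103 go to the test set):
# mean_err, std_err = evaluate(X, y, lambda: SVC(kernel="linear"), n_train=150)
```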

In very high-dimensional spaces, a linear kernel is generally sufficient. [Orphaned fragment of a processing-pipeline list: 1. Image acquisition; 2. Lesion segmentation; 3. …]

Visualization makes it possible to delimit the model's domain of competence. Indeed, when one seeks to predict the class of a new individual, its position on the map can provide essential information: great confidence can be placed in the prediction for an individual embedded in the mass of its peers, and therefore characteristic of the predicted class. Conversely, a membership probability close to 1 for an individual far from its peers should undoubtedly be treated with caution.

Automating the choice of the number of components appears to be an important future development. The cross-validation on which we relied to select the tuning parameters, although costly in computation time, should serve as a reference for this purpose. One may also consider model selection based on the minimization of a bound, in the spirit of SRM. Indeed, the bound of inequality (1.3) is the sum of the empirical risk and a term depending on the ratio h/n. Thus, for fixed n, minimizing the bound provides a model selection criterion that requires no cross-validation procedure. In practice, the difficulty of such an approach lies in computing the VC dimension. In the context of models of the form (2.27), however, Cherkassky et al. (1999) have proposed a practical approach to model complexity control based on VC generalization bounds.
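
Inequality (1.3) is not reproduced in this section; for illustration, a classical form of the VC generalization bound (Vapnik, 1995), whose constants may differ from those of (1.3), states that with probability at least 1 − η, for a function class of VC dimension h and a sample of size n,

```latex
R(f) \;\le\; R_{\mathrm{emp}}(f)
  \;+\; \sqrt{\frac{h\left(\ln\frac{2n}{h} + 1\right) - \ln\frac{\eta}{4}}{n}}
```

Minimizing the right-hand side over a nested sequence of models of increasing h implements the SRM principle without any resampling.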

Finally, approaches based on Empirical Kernel Map transformations are in no way restricted to handling n×n matrices: rectangular matrices can be analysed without any modification of the algorithms. Variable selection methods (based, for example, on the Wald test), coupled with the KL-PLS or the KLM-PLS, are also interesting research perspectives. The KL-PLS and the KLM-PLS should then be applicable to problems where both the sample size and the number of variables are large.
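
As a minimal, hypothetical sketch of the Empirical Kernel Map idea (the RBF kernel, the random landmark selection and ordinary PLS standing in for the KL-PLS are all assumptions, not the algorithms developed in this work): each observation is represented by its kernel evaluations against m landmark points, yielding a rectangular n×m matrix that any linear method can process unchanged.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def empirical_kernel_map(X, landmarks, gamma):
    """RBF kernel evaluations k(x, z_j) of each row of X against each landmark z_j."""
    d2 = ((X**2).sum(1)[:, None] + (landmarks**2).sum(1)[None, :]
          - 2.0 * X @ landmarks.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))   # (n, m): rectangular when m < n

rng = np.random.RandomState(0)
X = rng.randn(200, 5000)                          # many variables, few observations
y = (X[:, 0] + 0.5 * rng.randn(200) > 0).astype(float)
Z = X[rng.choice(len(X), 50, replace=False)]      # m = 50 landmark points

Phi = empirical_kernel_map(X, Z, gamma=1.0 / X.shape[1])  # 200 x 50, not 200 x 200
pls = PLSRegression(n_components=3).fit(Phi, y)
y_hat = (pls.predict(Phi).ravel() > 0.5).astype(int)
```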

A. Albert and J. A. Anderson, On the existence of maximum likelihood estimates in logistic regression models, Biometrika, vol.71, issue.1, pp.1-10, 1984.
DOI : 10.1093/biomet/71.1.1

P. D. Allison, Logistic regression using the SAS system, 1999.

N. Aronszajn, Theory of reproducing kernels, Transactions of the American Mathematical Society, vol.68, pp.337-404, 1950.

F. R. Bach and M. I. Jordan, Predictive low-rank decomposition for kernel methods, Proceedings of the 22nd international conference on Machine learning , ICML '05, 2005.
DOI : 10.1145/1102351.1102356

G. Baffi, E. B. Martin, and A. J. Morris, Non-linear projection to latent structures revisited (the neural network PLS algorithm), Computers & Chemical Engineering, vol.23, issue.9, pp.1293-1307, 1999.
DOI : 10.1016/S0098-1354(99)00291-4

G. Baffi, E. B. Martin, and A. J. Morris, Non-linear projection to latent structures revisited: the quadratic PLS algorithm, Computers & Chemical Engineering, vol.23, issue.3, pp.395-411, 1999.
DOI : 10.1016/S0098-1354(98)00283-X

M. Barker and W. S. Rayens, Partial least squares for discrimination, Journal of Chemometrics, vol.17, issue.3, pp.166-173, 2003.
DOI : 10.1002/cem.785

P. Bastien, PLS-Cox model, Proceedings in Computational Statistics, pp.655-662, 2004.
URL : https://hal.archives-ouvertes.fr/hal-01125164

P. Bastien, V. E. Vinzi, and M. Tenenhaus, PLS generalised linear regression, Computational Statistics & Data Analysis, vol.48, issue.1, pp.17-46, 2005.
DOI : 10.1016/j.csda.2004.02.005

URL : https://hal.archives-ouvertes.fr/hal-01125098

K. P. Bennett and M. J. Embrechts, An Optimization Perspective on Kernel Partial Least Squares Regression, in Advances in Learning Theory: Methods, Models and Applications, NATO Science Series III: Computer & Systems Sciences, pp.227-250, 2003.

A. Berglund and S. Wold, INLR, implicit non-linear latent variable regression, Journal of Chemometrics, vol.11, issue.2, pp.141-156, 1997.
DOI : 10.1002/(SICI)1099-128X(199703)11:2<141::AID-CEM461>3.0.CO;2-2

C. Billard, Développement d'un outil de diagnostic différentiel des maladies à symptômes parkinsoniens, 2006.

M. Binder, H. Kittler, A. Seeber, A. Steiner, H. Pehamberger et al., Epiluminescence microscopy-based classification of pigmented skin lesions using computerized image analysis and an artificial neural network, Melanoma Research, vol.8, issue.3, pp.261-266, 1998.
DOI : 10.1097/00008390-199806000-00009

B. Boser, I. Guyon, and V. N. Vapnik, A training algorithm for optimal margin classifiers, Proceedings of the fifth annual workshop on Computational learning theory , COLT '92, pp.144-152, 1992.
DOI : 10.1145/130385.130401

L. Bottou, C. Cortes, J. Denker, H. Drucker, I. Guyon et al., Comparison of classifier methods: a case study in handwritten digit recognition, Proceedings of the 12th IAPR International Conference on Pattern Recognition (Cat. No.94CH3440-5), pp.77-87, 1994.
DOI : 10.1109/ICPR.1994.576879

E. Bredensteiner and K. P. Bennett, Multicategory Classification by Support Vector Machines, Computational Optimization and Applications, vol.12, 1999.
DOI : 10.1007/978-1-4615-5197-3_5

S. B. Bull, C. Mak, and C. M. Greenwood, A modified score function estimator for multinomial logistic regression in small samples, Computational Statistics & Data Analysis, vol.39, issue.1, 2002.
DOI : 10.1016/S0167-9473(01)00048-2

N. Butler and M. Denham, The peculiar shrinkage properties of partial least squares regression, Journal of the Royal Statistical Society: Series B (Statistical Methodology), vol.62, issue.3, pp.585-594, 2000.
DOI : 10.1111/1467-9868.00252

V. Cherkassky, X. Shao, F. P. Mulier, and V. N. Vapnik, Model complexity control for regression using VC generalization bounds, IEEE Transactions on Neural Networks, vol.10, issue.5, pp.1075-1089, 1999.
DOI : 10.1109/72.788648

O. C. Lingjaerde and N. Christophersen, Shrinkage Structure of Partial Least Squares, Scandinavian Journal of Statistics, vol.27, issue.3, pp.459-473, 2000.
DOI : 10.1111/1467-9469.00201

B. S. Dayal and J. F. MacGregor, Improved PLS algorithms, Journal of Chemometrics, vol.11, issue.1, pp.73-85, 1997.
DOI : 10.1002/(SICI)1099-128X(199701)11:1<73::AID-CEM435>3.0.CO;2-#

S. de Jong, PLS fits closer than PCR, Journal of Chemometrics, vol.7, issue.6, pp.551-557, 1993.
DOI : 10.1002/cem.1180070608

S. de Jong, SIMPLS: An alternative approach to partial least squares regression, Chemometrics and Intelligent Laboratory Systems, vol.18, issue.3, pp.251-263, 1993.
DOI : 10.1016/0169-7439(93)85002-X

S. de Jong, PLS shrinks, Journal of Chemometrics, vol.9, issue.4, pp.323-326, 1995.
DOI : 10.1002/cem.1180090406

M. Delaunay, Mélanome cutané : le diagnostic précoce, un devoir d'efficacité, La revue du praticien, 2004.

K. Duan and S. Keerthi, Which Is the Best Multiclass SVM Method? An Empirical Study, 2003.
DOI : 10.1007/11494683_28

T. Evgeniou and M. Pontil, On the V gamma dimension for regression in Reproducing Kernel Hilbert Space, Artificial Intelligence Lab, 1999.

T. Evgeniou, M. Pontil, and T. Poggio, Regularization networks and Support Vector Machines, Advances in Computational Mathematics, vol.13, issue.1, pp.1-50, 2000.
DOI : 10.1023/A:1018946025316

D. Firth, Bias reduction of maximum likelihood estimates, Biometrika, vol.80, issue.1, pp.27-38, 1993.
DOI : 10.1093/biomet/80.1.27

R. A. Fisher and F. Yates, Statistical tables for biological, agricultural and medical research, 1938.

G. Fort and S. Lambert-Lacroix, Classification using partial least squares with penalized logistic regression, Bioinformatics, vol.21, issue.7, pp.1104-1111, 2005.
DOI : 10.1093/bioinformatics/bti114

J. H. Friedman, Regularized Discriminant Analysis, Journal of the American Statistical Association, vol.84, issue.405, pp.165-175, 1989.
DOI : 10.1080/01621459.1989.10478752

J. H. Friedman, Another Approach to Polychotomous Classification, 1996.

H. Ganster, A. Pinz, and R. Rohrer, Automated melanoma recognition, IEEE Transactions on Medical Imaging, vol.20, issue.3, pp.233-239, 2001.
DOI : 10.1109/42.918473

P. H. Garthwaite, An Interpretation of Partial Least Squares, Journal of the American Statistical Association, vol.89, issue.425, pp.122-127, 1994.

C. Goutis, Partial Least Squares algorithm yields shrinkage estimators, The Annals of Statistics, vol.24, pp.816-824, 1996.

J. J. Grob and M. A. Richard, Épidémiologie et prévention du mélanome, La revue du praticien, 2004.

Y. Guermeur, Combining Discriminant Models with New Multi-Class SVMs, Pattern Analysis & Applications, vol.5, issue.2, pp.168-179, 2002.
DOI : 10.1007/s100440200015

URL : https://hal.archives-ouvertes.fr/inria-00107869

J. A. Hanley, Receiver Operating Characteristic methodology : the state of the art, Critical Reviews in Diagnostic Imaging, vol.29, pp.307-335, 1989.

G. Heinze and M. Schemper, A solution to the problem of separation in logistic regression, Statistics in Medicine, vol.21, issue.16, pp.2409-2419, 2002.
DOI : 10.1002/sim.1047

I. S. Helland, On the structure of partial least squares regression, Communications in Statistics - Simulation and Computation, vol.17, issue.2, pp.581-607, 1988.

A. E. Hoerl and R. W. Kennard, Ridge Regression: Biased Estimation for Nonorthogonal Problems, Technometrics, vol.12, issue.1, pp.55-67, 1970.

J. Horn, Étude colorimétrique et détection des mélanomes. Mémoire de DEA d'informatique médicale et technologies de la communication, 2004.

A. Höskuldsson, PLS regression methods, Journal of Chemometrics, vol.2, issue.3, pp.211-228, 1988.
DOI : 10.1002/cem.1180020306

H. Hotelling, Analysis of a complex of statistical variables into principal components., Journal of Educational Psychology, vol.24, issue.6, pp.417-441, 1933.
DOI : 10.1037/h0071325

C. W. Hsu and C. J. Lin, A comparison of methods for multiclass Support Vector Machines, IEEE Transactions on Neural Networks, vol.13, issue.2, pp.415-425, 2002.

V. V. Ivanov, On linear problems which are not well-posed, Soviet Math. Dokl., vol.3, pp.981-983, 1962.

T. Joachims, Making Large-Scale SVM Learning Practical, Advances in Kernel Methods - Support Vector Learning, 1999.

S. Keerthi, K. Duan, S. Shevade, and A. Poo, A Fast Dual Algorithm for Kernel Logistic Regression, Proceeding of the nineteenth international conference on machine learning, 2002.
DOI : 10.1007/s10994-005-0768-5

G. Kimeldorf and G. Wahba, Some results on Tchebycheffian spline functions, Journal of Mathematical Analysis and Applications, vol.33, issue.1, pp.82-95, 1971.
DOI : 10.1016/0022-247X(71)90184-3

N. Krämer, An overview on the shrinkage properties of partial least squares regression, Computational Statistics, vol.16, issue.2, 2006.
DOI : 10.1007/s00180-007-0038-z

U. Kreßel, Pairwise classification and support vector machines, in Advances in Kernel Methods - Support Vector Learning, pp.255-268, 1999.

L. Lebart, Méthodes factorielles, Modèles statistiques pour données qualitatives. Dunod, 1997.

Y. Lee, Y. Lin, and G. Wahba, Multicategory Support Vector Machines, Journal of the American Statistical Association, vol.99, issue.465, 2001.
DOI : 10.1198/016214504000000098

Y. Lin, Support Vector Machines and the Bayes Rule in Classification, Data Mining and Knowledge Discovery, vol.6, issue.3, pp.259-275, 2002.
DOI : 10.1023/A:1015469627679

F. Lindgren, P. Geladi, and S. Wold, The kernel algorithm for PLS, Journal of Chemometrics, vol.7, issue.1, pp.45-59, 1993.
DOI : 10.1002/cem.1180070104

F. Lindgren, P. Geladi, and S. Wold, Kernel-based PLS regression; Cross-validation and applications to spectral data, Journal of Chemometrics, vol.8, issue.6, pp.377-389, 1994.
DOI : 10.1002/cem.1180080604

R. Manne, Analysis of two partial-least-squares algorithms for multivariate calibration, Chemometrics and Intelligent Laboratory Systems, vol.2, issue.1-3, pp.187-197, 1987.
DOI : 10.1016/0169-7439(87)80096-5

H. Martens, Multivariate Calibration. Thesis, Technical University of Norway, 1985.

J. Mercer, Functions of positive and negative type and their connection with the theory of integral equations, Philosophical Transactions of the Royal Society of London, Series A, vol.209, pp.415-446, 1909.

D. Nguyen and D. Rocke, Tumor classification by partial least squares using microarray gene expression data, Bioinformatics, vol.18, issue.1, pp.39-50, 2002.
DOI : 10.1093/bioinformatics/18.1.39

D. Nguyen and D. Rocke, On partial least squares dimension reduction for microarray-based classification: a simulation study, Computational Statistics & Data Analysis, vol.46, issue.3, pp.407-425, 2004.
DOI : 10.1016/j.csda.2003.08.001

A. Nkengne, Segmentation du mélanome par apprentissage et étude de l'asymétrie. Mémoire de DEA d'informatique médicale et technologies de la communication, 2004.

E. Osuna, R. Freund, and F. Girosi, An improved training algorithm for Support Vector Machines, Neural Networks for Signal Processing, pp.276-285, 1997.

A. Phatak and F. de Hoog, Exploiting the connection between PLS, Lanczos methods and conjugate gradients: alternative proofs of some properties of PLS, Journal of Chemometrics, vol.16, issue.7, pp.361-367, 2002.
DOI : 10.1002/cem.728

J. C. Platt, Fast Training of Support Vector Machines using Sequential Minimal Optimization, Advances in Kernel Methods -Support Vector Learning, pp.169-185, 1999.

S. Rännar, P. Geladi, F. Lindgren, and S. Wold, A PLS kernel algorithm for data sets with many variables and few objects. Part II: Cross-validation, missing data and examples, Journal of Chemometrics, vol.9, issue.6, pp.459-470, 1995.
DOI : 10.1002/cem.1180090604

S. Rännar, F. Lindgren, P. Geladi, and S. Wold, A PLS kernel algorithm for data sets with many variables and fewer objects. Part I: Theory and algorithm, Journal of Chemometrics, vol.8, issue.2, pp.111-125, 1994.
DOI : 10.1002/cem.1180080204

G. Rätsch, T. Onoda, and K. R. Müller, Soft margins for AdaBoost, Machine Learning, vol.42, issue.3, pp.287-320, 2001.
DOI : 10.1023/A:1007618119488

R. Rifkin, Everything Old is New Again : A Fresh Look at Historical Approaches in Machine Learning, 2002.

R. Rifkin and A. Klautau, In defense of one-vs-all classification, Journal of Machine Learning Research, vol.5, pp.101-141, 2004.

B. Rosado, S. Menzies, A. Harbauer, H. Pehamberger, K. Wolff et al., Accuracy of Computer Diagnosis of Melanoma, Archives of Dermatology, vol.139, issue.3, pp.361-367, 2003.
DOI : 10.1001/archderm.139.3.361

R. Rosipal and L. J. Trejo, Kernel Partial Least Squares Regression in Reproducing Kernel Hilbert Space, Journal of Machine Learning Research, vol.2, pp.97-123, 2001.

R. Rosipal, L. J. Trejo, and B. Matthews, Kernel PLS-SVC for linear and nonlinear classification, Proceeding of the twentieth international conference on machine learning (ICML-2003), 2003.

S. Rosset, J. Zhu, and T. Hastie, Boosting as a regularized path to a maximum margin classifier, Journal of Machine Learning Research, vol.5, pp.941-973, 2004.

S. Rosset, J. Zhu, and T. Hastie, Margin maximizing loss functions, Advances in Neural Information Processing Systems (NIPS) 16, 2004.

C. Ruiz Dominguez, N. Kachenoura, S. Mulé, A. Tenenhaus, A. Delouche et al., Classification of Segmental Wall Motion in Echocardiography Using Quantified Parametric Images, Functional Imaging and Modeling of the Heart, 2005.
DOI : 10.1007/11494621_47

C. Saunders, A. Gammerman, and V. Vovk, Ridge regression learning algorithm in dual variables, Proceeding of the fifteenth international conference on machine learning, pp.515-521, 1998.

A. Sboner, C. Eccher, E. Blanzieri, P. Bauer, M. Cristofolini et al., A multiple classifier system for early melanoma diagnosis, Artificial Intelligence in Medicine, vol.27, issue.1, pp.29-44, 2003.
DOI : 10.1016/S0933-3657(02)00087-8

P. Schmid-Saugeon, Symmetry axis computation for almost-symmetrical and asymmetrical objects: Application to pigmented skin lesions, Medical Image Analysis, vol.4, issue.3, pp.269-282, 2000.
DOI : 10.1016/S1361-8415(00)00019-0

B. Schölkopf, A. J. Smola, and K. R. Müller, Nonlinear Component Analysis as a Kernel Eigenvalue Problem, Neural Computation, vol.10, issue.5, pp.1299-1319, 1998.

C. Serruys, Classification automatique des tumeurs noires de la peau par des techniques numériques d'analyse d'images fondées sur des méthodes d'apprentissage par l'exemple, 2003.

J. Shawe-Taylor and N. Cristianini, Kernel methods for pattern analysis, 2004.
DOI : 10.1017/CBO9780511809682

L. Shen and E. C. Tan, Dimension Reduction-Based Penalized Logistic Regression for Cancer Classification Using Microarray Data, IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol.2, issue.2, pp.166-175, 2005.
DOI : 10.1109/TCBB.2005.22

A. J. Smola and B. Schölkopf, Sparse greedy matrix approximation for machine learning, Proceeding of the seventeenth international conference on machine learning, 2000.

A. Tenenhaus, La Régression Logistique PLS validée par bootstrap, Mémoire de DEA de Statistique, 2002.

A. Tenenhaus, A. Giron, G. Saporta, and B. Fertil, Kernel Logistic PLS : a new tool for complex classification, 11th International Symposium on Applied Stochastic Models and Data Analysis, 2005.
URL : https://hal.archives-ouvertes.fr/hal-01125042

M. Tenenhaus, La régression logistique PLS, Modèles statistiques pour données qualitatives. Technip, 2005.

A. N. Tikhonov and V. Y. Arsenin, Solutions of Ill-posed problems, 1977.

V. N. Vapnik, Estimation of Dependences Based on Empirical Data, 1982.

V. N. Vapnik, The Nature of Statistical Learning Theory, 1995.

V. N. Vapnik, Statistical Learning Theory, 1998.

G. Wahba, Support Vector Machines, Reproducing Kernel Hilbert Spaces and randomized GACV, Advances in Kernel Methods -Support Vector Learning, pp.69-88, 1999.

J. Weston and C. Watkins, Multiclass Support Vector Machines, 1998.
URL : https://hal.archives-ouvertes.fr/hal-00750277

C. K. Williams and M. Seeger, Effect of the input density distribution on kernel-based classifiers, Proceeding of the seventeenth international conference on machine learning, 2000.

H. Wold, Estimation of principal components and related models by iterative least squares, 1966.

S. Wold, Non-Linear Partial Least Squares Modelling II: Spline inner function, Chemometrics and Intelligent Laboratory Systems, vol.14, pp.71-84, 1992.
DOI : 10.1016/0169-7439(92)80093-j

S. Wold, N. Kettaneh-Wold, and B. Skagerberg, Non-Linear PLS Modelling, Chemometrics and Intelligent Laboratory Systems, vol.7, pp.53-65, 1989.

S. Wold, H. Martens, and H. Wold, The multivariate calibration problem in chemistry solved by the PLS method, Proceedings Conf. Matrix Pencils, pp.286-293, 1983.

W. Wu, D. L. Massart, and S. de Jong, Kernel-PCA algorithms for wide data. Part II: Fast cross-validation and application in classification of NIR data, Chemometrics and Intelligent Laboratory Systems, vol.37, issue.2, pp.271-280, 1997.
DOI : 10.1016/S0169-7439(97)00027-0

J. Zhu and T. Hastie, Kernel Logistic Regression and the Import Vector Machine, Journal of Computational and Graphical Statistics, vol.14, issue.1, pp.185-205, 2005.
DOI : 10.1198/106186005X25619

L. Zwald, R. Vert, G. Blanchard, and P. Massart, Kernel Projection Machine : a New Tool for Pattern Recognition, Advances in Neural Information Processing Systems 17, pp.1649-1656, 2004.
URL : https://hal.archives-ouvertes.fr/hal-00373801