O. G. Lebrun, C. Lezoray, and H. Charrier, Communications internationales avec actes et comité de lec- ture 1. A new model selection method for SVM, Cardot 7th International Conference on Intelligent Data Engineering and Automated Learning , Volume LNCS 4224, pp.99-107, 2006.

C. Charrier, G. Lebrun, O. Lezoray, and C. , A Machine Learning-Based Color Image Quality Metric, pp.251-256, 2006.

C. Meurie, G. Lebrun, O. Lezoray, and A. , A supervised segmentation scheme for cancerology color images, Elmoataz IEEE International Symposium on Signal Processing and Information Technology, 2003.

G. Lebrun, C. Charrier, O. Lezoray, and H. , Réduction du temps d'apprentissage des SVM par Quantification Vectorielle, Cardot CORESA (COmpression et REprésentation des signaux Audiovisuels), pp.223-226, 2004.

[. L. Allwein, R. E. Schapire-&-y, . Singer, ]. Multiclass-to-binaryayat02, M. Ayat et al., « Feature Subset Selection Using Ant Colony Optimization « How to Use Ants for Hierarchical Clustering, Support Vector Machines for Pattern ClassificationANGIUL05] F. ANGIULLI. « Fast condensed nearest neighbor rule ». Dans the 22nd international conference on Machine learning Automatic model selection for the optimization of SVM kernels. ». Pattern Recognition BABICH & O. I. CAMPS. « Weighted Parzen Windows for Pattern Classification. Genetic Programming: An Introduction: On the Automatic Evolution of Computer Programs and Its ApplicationsBELLMA61] R. E. BELLMAN. « Adaptive Control Processes, pp.53-58, 1961.

A. L. Blum-&-p and . Langley, Selection of relevant features and examples in machine learning, Artificial Intelligence, vol.97, issue.1-2, pp.245-271, 1997.
DOI : 10.1016/S0004-3702(97)00063-5

M. [. Bonabeau, . Dorigo-&-g, . A. Theraulaz, S. Bordes, J. Ertekin et al., « A Training Algorithm for Optimal Margin Classifiers, Swarm Intelligence, From Natural to Artificial Systems Computational Learing TheoryBURBID02] R. BURBIDGE. « Stopping Criteria for SVMs « Simplified Support Vector Decision Rules. ». Dans ICMLCALLUT05] J. CALLUT & P. DUPONT. « Séparateurs à Vaste Marge Optimisant la Fonction F beta . ». Dans CAP, pp.1579-1619, 1992.

J. R. Cano, F. Herrera-&-m, and . Lozano, Using evolutionary algorithms as instance selection for data reduction in KDD: an experimental study, IEEE Transactions on Evolutionary Computation, vol.7, issue.6
DOI : 10.1109/TEVC.2003.819265

. [. Cauwenberghs-&-t, . Poggio, D. Incremental, . [. Support-vector-machine-learning, . J. Cerverón-&-f et al., « Model Selection for Support Vector Machines via Adaptive Step-Size Tabu Search « Another move toward the minimum consistent subset: a tabu search approach to the condensed nearest neighbor rule LIBSVM: a library for support vector machines Support Vector Machines : Principes d'induction, Réglage automatique et Connaissances à priori « Fusion of svm-based microscopic color images through colorimetric transformation, ICANNGA IEEE Transactions on Systems, Man, and Cybernetics Choosing Multiple Parameters for Support Vector Machines Machine Learning Thèse de doctoratCHARRI98] C. CHARRIER. « Vers l'optimisation statistique et perceptuelle de la qualité pour la compression des images couleur par quantification vectorielle Thèse de doctoratCHARRI06B] C. CHARRIER, G. LEBRUN & O. LEZORAY. « A Machine Learning- Based Color Image Quality Metric, pp.561-575, 1998.

X. [. Chen, &. Zhou, . I. Lin, S. Guyon, M. Gunn et al., « One-class SVM for learning in image retrieval « Combining SVMs with various feature selection strategies Feature extraction, foundations and applications, Image ProcessingCHEN05B] Y. CHEN & C. LIN. « Combining SVMs with various feature selection strategies rédacteurs , Feature extraction, foundations and applications, pp.34-37, 2001.

Y. G. Chou-&-l, . A. Shapiro-c, . B. Coello-&-g, . R. Veldhuizen, . Collobert-&-s et al., « Evolutionary Algorithms for Solving Multi-Objective Problems (volume 5) ». Kluwer Academic Three new metrics to measure the convergence of metaheuristics towards the Pareto frontier and the aesthetic of a set of solutions in biobjective optimization Support Vector Machines for Large-Scale Regression Problems « A Parallel Mixture of SVMs for Very Large Scale Problems, MANIEZZO. « An Investigation of some Properties of an Ant Algorithm. ». Dans PPSNCORNUÉ02] A. CORNUÉJOLS & L. MICLET. « Apprentissage artificiel ». EyrollesCRAMME01] K. CRAMMER & Y. SINGER. « On the Algorithmic Implementation of Multiclass Kernel-based Vector Machines. » Margin Analysis of the LVQ Algorithm. ». Dans NIPS Dynamically adapting kernels in support vector machines Proceedings of the 1998 conference on Advances in neural information processing systems II, pp.150-168, 1967.

[. Cristianini, J. Shawe-taylor, A. S. Elisseeff-&-j, and . Kandola, « An Introduction to Support Vector Machines and other kernel-bases learning methods », On Kernel-Target Alignment. ». Dans NIPSCUTZU03] F. CUTZU. « Polychotomous Classification with Pairwise Classifiers: A New Voting Principle. ». Dans Multiple Classifier Systems, pp.367-373, 2000.

[. C. Debuse-&-v and . Rayward-smith, The MIT Press Cambridge « Radius-margin Bound on the Leave-one-out Error of Multi-class SVMs ». Rapport technique, INRIA « On the Origin of Species by Means of Natural Selection or the Preservation of Favored Races in the Struggle for Live (MCS) identification for optimal nearest neighbor decision systems design, The Visible Differences Predictor: An Algorithm for the Assessment of Image Fidelity Feature Subset Selection within a Simulated Annealing Data Mining Algorithm.DECOST00] D. DECOSTE & K. WAGSTAFF. « Alpha seeding for support vector machines ». Dans Int. Conf. Knowledge Discovery Data Mining, pp.179-206, 1993.

V. S. Devi-&-m, . «. Murty, . Devijver-&-j, and . Kittler, An incremental prototype set building technique, DEVIJV80] P. A. DEVIJVER. « On the edited nearest neighbor rule International Conference on Pattern Recognition Pattern Recognition: A Statistical Approach DIETTERICH & G. BAKIRI. « Solving Multiclass Learning Problems via Error-Correcting Output Codes, pp.505-513, 1980.
DOI : 10.1016/S0031-3203(00)00184-9

J. X. Dong, A. Y. Krzyzak-&-c, . G. Suen, M. Dreyfus, J. Samuelides et al., Fast SVM training algorithm with decomposition on very large data sets, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.27, issue.4, pp.603-618, 2004.
DOI : 10.1109/TPAMI.2005.77

K. Duan, S. S. Keerthi-&-a, . Poo-[-duan05-]-k.-b, . S. Duan-&-s, and . Keerthi, « Evaluation of simple performance measures for tuning SVM hyperparameters « Which Is the Best Multiclass SVM Method? An Empirical Study « The distance-weighted k-nearest neighbor rule, Multiple Classifier Systems The Jackknife, the bootstrap and other resampling methods ». SIAM, pp.41-59, 1976.

. [. Efron-&-r, . J. Tibshirani-[-eshelm97-]-l, K. E. Eshelman, . D. Mathias-&-j, . Schaffer et al., Exploiting the Population Distribution « A Density-Based Algorithm for Discovering Clusters in Large Spatial Databases with Noise ». int, Conf. on Knowledge Discovery and Data Mining (KDD'96)FAIGLE37] U. FAIGLE & W. KERN. « Some convergence results for probabilistic tabu search ». ORSA Journal on Computing, pp.548-560, 1996.

R. E. Fan, P. H. Chen-&-c, . J. Linfogel66-]-l, A. J. Fogel, . J. Owens-&-m et al., « Working Set Selection Using the Second Order Information for, FORGY65] E. W. FORGY. « Cluster Analysis of Multivariable Data: Efficiency versus Interpretability Models FRANK & S. KRAMER. « Ensembles of nested dichotomies for multiclass problems. ». Dans ICML, pp.1889-1918, 1965.

H. Fröhlich, O. Chapelle-&-b, . Schölkopffukuna89-]-k, . R. Fukunaga-&-r, and . Hayes, « The reduced Parzen classifier Algorithmes évolutionnaires appliqués à la reconnaissance des formes et à la conception optique, Feature Selection for Support Vector Machines Using Genetic Algorithms Pairwise Classification as an Ensemble Technique. ». Dans ECML Thèse de doctoratGAGNÉ05B] C. GAGNÉ. « Algorithmes évolutionnaires appliqués à la reconnaissance des formes et à la conception optique Thèse de doctoratGERSHO92] A. GERSHO & R. M. GRAY. « Vector Quantization and Signal Compression, pp.791-800, 1989.

R. Gilad-bachrach, A. Navot-&-n, . Tishby-[-glover89a-]-f, and . Glover, « Margin based feature selection -theory and algorithms « Tabu search: part I, Tabu search: part II ». Dans on ComputingGLOVER97] F. GLOVER & M. LAGUNA. « Tabu search, pp.190-206, 1989.

. [. Glover-&-s, «. Y. Hanafi, S. Grandvalet, . Canu-&-s, and . Boucheron, Tabu search and finite convergence, Algorithmes génétiques Noise Injection: Theoretical Prospects. ». Neural Computation, pp.3-36, 1994.
DOI : 10.1016/S0166-218X(01)00263-3

D. Guérin, F. Cointault, C. Gée-&-j, . Y. Guillemi, A. Guermeur et al., « A comparative study of multi-class support vector machines in the unifying framework of large margin classifiers: Research Articles « CURE: an efficient clustering algorithm for large databases « An Introduction to Variable and Feature Selection, ACM SIGMOD International Conference on Management of DataHALKID01] M. HALKIDI, Y. BATISTAKIS & M. VAZIRGIANNIS. « On Clustering Validation Techniques, pp.549-560, 1997.

J. Hao, P. Galinier-&-m, and . Habib, « Métaheuristiques pour l'optimisation combinatoire et l'affectation sous contraintes, Intelligence Artificielle, vol.13, issue.2, pp.283-324, 1999.

P. E. Hart, S. Hastie, R. Rosset, . Tibshirani-&-j, and . Zhu, The condensed nearest neighbor rule « Classification by Pairwise Coupling, The Entire Regularization Path for the Support Vector Machine, pp.515-516, 1968.

S. [. Hastie, R. Rosset, . Tibshirani-&-j, and . Zhu, « The Entire Regularization Path for the Support Vector Machine, Neural Networks: a comprehensive foundation ». Tom RobbinsHERBRI02] R. HERBRICH. « Learning Kernel Classifiers: Theory and AlgorithmsHINNEB98] A. HINNEBURG & D. A. KEIM. « An Efficient Approach to Clustering in Large Multimedia Databases with Noise, pp.58-65, 1998.

C. Lin, « A Comparison of Methods for Multiclass Support Vector Machines, IEEE Transactions in Neural Networks, pp.415-425, 2002.

J. J. Hull, A database for handwritten text recognition research, HUSH03] D. R. HUSH & C. SCOVEL. « Polynomial-Time Decomposition Algorithms for Support Vector Machines. ». Machine Learning, pp.550-554, 1994.
DOI : 10.1109/34.291440

A. K. Jain-&-d, . K. Zongker-[-jain99-]-a, M. N. Jain, . J. Murty-&-p, and . Flynn, Feature selection: evaluation, application, and small sample performance, Data clustering: a review ». ACM Computing Surveys, pp.153-158, 1997.
DOI : 10.1109/34.574797

A. K. Jain, R. P. Duin-&-j, . J. Mao-[-joachi00-]-t, and . Krim, Statistical Pattern Recognition: A Review Estimating the Generalization Performance of a SVM Efficiently « A learning scheme for a fuzzy k-nn rule « Fast minimization of structural risk by nearest neighbor rule A comparative analysis of structural risk minimization by support vector machines and nearest neighbor rule, ICML-00 Chameleon: Hierarchical Clustering Using Dynamic Modeling, pp.4-37, 1983.

G. V. Kass, . Katagiri-&-s, and . Abe, An Exploratory Technique for Investigating Large Quantities of Categorical Data, Incremental training of support vector machines using hyperspheres, pp.119-127, 1980.
DOI : 10.2307/2986296

S. [. Keerthi, C. Shevade, . R. Bhattacharyya-&-k, . S. Mur-thy-s, O. C. Keerthi et al., « Improvements to Platt's SMO Algorithm for SVM Classifier Design « A fuzzy k-nearest neighbor algorithm Reducing the number of training samples for fast support vector machine classification « Wrappers for feature subset selection Adaptive feature selection for hyperspectral data analysis, Wrappers for Feature Subset Selection Self-organization maps ». Springer Series in Information SciencesKOWALI90] P. KOWALISKI. « Vision et mesure de la couleur ». Physique fondamentale et appliquée. Masson, 2ème édition, pp.637-649, 1985.

J. R. Koza, J. Genetic-programming, D. R. Krauskopf, . W. Williams-&-d, -. H. Heeley-[-krebel99-]-u et al., « Cardinal Directions of Color Space « Pairwise classification and support vector machines » Advances in kernel methods: support vector learning « On information and sufficiency « Fitness functions in editing k-NN reference set by genetic algorithms, KUNCHE97A] L. I. KUNCHEVA. « Editing for the k-nearest neigbors rule by genetic algorithmKUNCHE99] L. KUNCHEVA & L. C. JAIN. « Nearest neighbor classifier: Simultaneous editing and feature selection. ». Pattern Recognition LettersKUNCHE04] L. I. KUNCHEVA. « Combining Pattern Classifiers: Methods and Algorithms, pp.1123-1131, 1951.

C. [. Lebrun, . C. Charrier-&-h, C. Lebrun, . Charrier-&-o, . Lezoray-[-lebrun05a-]-g et al., SVM training time reduction using vector quantization, Proceedings of the 17th International Conference on Pattern Recognition, 2004. ICPR 2004., pp.160-163, 2004.
DOI : 10.1109/ICPR.2004.1334035

C. G. Dans, O. Lebrun, C. Lezoray, . Charrier-&-h, . Cardot-[-lebrun06b-]-g et al., « A new model selection method for SVM « Speed-up LOO-CV with SVM classifier Automatic Model Selection for Support Vector Machines, SIAM International Conference on Data Mining, pp.231-236, 2000.

M. Lee, S. Keerthi, C. Ong-&-d, and . Decoste, An Efficient Method for Computing Leave-One-Out Error in Support Vector Machines With Gaussian Kernels, IEEE Transactions on Neural Networks, vol.15, issue.3, pp.750-757, 2004.
DOI : 10.1109/TNN.2004.824266

H. L. Lezoray, A. Elmoataz-&-h, . O. Cardot, D. Lezoray, . Fournier-&-h et al., Segmentation d'images par morphologie mathématique et classification de données par réseaux de neurones : Application à la classification de cellules en cytologie des séreuses « Cooperation of color pixel classification schemes and color watershed: a study for microscopic images « A Color object recognition scheme: application to cellular sorting, Thèse de doctorat Machine Vision and Applications Neural network induction graph for pattern recognition. ». NeurocomputingLEZORA05] O. LEZORAY & H. CARDOT. « Combining Multiple Pairwise Neural Networks Classifiers: A Comparative Study. ». Dans ANNIIP, pp.156-164, 2000.

H. Lin and C. W. Lin-&-r, « A note on Platt's probabilistic outputs for support vector machines ». Rapport technique, Department of Computer Science and Information Engineering, A Study on Sigmoid Kernels for SVM and the Training of non-PSD Kernels by SMO-type Methods ». Rapport technique, 2003.

K. Lin-&-c, . Lin-[-linde80-]-y, A. Linde, . M. Buzo-&-r, and . Gray, « A study on reduced support vector machines « An algorithm for vector quantizer design, COM-28LITTLE86] N. LITTLESTONE & M. WARMUTH. « Relating data compression and learnability, pp.1449-1459, 1980.

H. , L. G. Loosli, S. Canu, S. Vishwanathan, A. J. Smola-&-m et al., Boite à outils SVM simple et rapide Méthodes à noyaux pour la détection de contexte « A reexamination of the distance-weighted k-nearest neighbor classification rule Some methods for classification and analysis of Multivariable Observations « A Database of Human Segmented Natural Images and its Application to Evaluating Segmentation Algorithms and Measuring Ecological Statistics « Learning to Detect Natural Image Boundaries Using Local Brightness, Color, and Texture Cues, Feature Selection for Knowledge Discovery and Data Mining Thèse de doctorat, Institut National des Sciences Appliquées de Rouen SVM et apprentissage des très grandes bases de données ». CAp , Conférence d'apprentissage 5th Berkeley Symposium on mathematical statistics and probability Proc. 8th Int'l Conf. Computer VisionMAYORA99] E. MAYORAZ & E. ALPAYDIN. « Support Vector Machines for Multiclass Classification.MEIL ? 05] M. MEIL ? A. « Comparing Clustering -An Axiomatic View ». ICML, 2005. [MERCER09] J. MERCER. « Functions of positive and negative type and their connection with the theory of integral equationsMEURIE03A] C. MEURIE, G. LEBRUN, O. LEZORAY & A. ELMOATAZ. « A comparison of supervised pixels-based color image segmentation methods. Application in cancerology ». WSEAS Transactions on Computers, pp.741-767, 1909.

G. [. Meurie, O. Lebrun, . Lezoray-&-a, and . Elmoataz, « A supervised segmentation scheme for cancerology color images [MEURIE05A] C. MEURIE. « Segmentation d'images couleur par classification pixellaire et hiérarchie de partitions « Combination of multiple pixel classifiers for microscopic image segmentation, ISSPIT (IEEE International Symposium on Signal Processing and Information Technology) Thèse de doctorat Special issue on Colour Image Processing and Analysis for Machine Vision, pp.664-667, 2003.

. [. Morinnguyen05-]-d, . B. Nguyen-&-t, and . Ho, CBM-TR 5-110 « A reappraisal of distance-weighted k-nearest neighbor classification for pattern recognition with missing data « An efficient method for simplifying support vector machines, The need for biais in learning generalizations ». Rapport technique Machine Learning The McGraw-Hill Companies Improved Pairwise Coupling Classification with Correcting Classifiers. ». Dans ECMLODORIC97] R. ODORICO. « Learning Vector Quantization with Training Count (LV- QTC). ». Neural NetworksOSUNA97] E. OSUNA, R. FREUND & F. GIROSI. « An improved training algorithm for support vector machines, pp.160-171, 1980.

Y. Y. Ou, C. Y. Chen, S. C. Hwang-&-y, and . Oyang, « Expediting Model Selection for Support Vector Machines Based on Data Reduction, IEEE Proc. SMC, pp.786-791, 2003.

J. P. Heuristics, Intelligent Search Strategies for Computer Problem Solving, 1984.

E. Peli, Contrast in complex images, Contrast in complex images, pp.2032-2040, 1990.
DOI : 10.1364/JOSAA.7.002032

. [. Platoniotis-&-a, . J. Venetsanopoulos-[-platt99a-], . J. Platt-a, P. Smola, B. Bartlett et al., Probabilistic outputs for support vector machines and comparison to regularized likelihood methods Advances in Large Margin Classifiers « Convergence Analysis of Adaptive Tabu Search « Floating search methods in feature selection. ». Pattern Recognition Letters Pairwise Classifier Combination in the Framework of Belief Functions « One-against-all classifier combination in the framework of belief functions, Color Image Processing and Applications Fast Training of Support Vector Machines using Sequential Minimal Optimization, Advances in Kernel Methods-Support Vector Learning Pairwise Neural Network Classifiers with Probabilistic Outputs. ». Dans NIPS Machine Learning Programs for Machine Learning Combinaison de classifieurs binaires dans le cadre du Modèle des Croyances Transférables ». Dans Rencontres Francophones sur la Logique Floue et ses Applications An Empirical Comparison of Hierarchical vs. Two-Level Approaches to Multiclass Problems. ». Dans Multiple Classifier SystemsRAKOTO05] A. RAKOTOMAMONJY & S. CANU. « Frames, Reproducing Kernels, Regularization and LearningRALAIV01] L. RALAIVOLA & F. D'ALCHÉ BUC. « Incremental Support Vector Machine Learning: A Local Approach. ». Dans ICANNRECHEN65] I. RECHENBERG. « Cybernetic Solution Path of an Experimental Problem ». Royal Aircraft Establishment Library TranslationRIFKIN04] R. M. RIFKIN & A. KLAUTAU. « In Defense of One-Vs-All Classification . ». Journal of Machine Learning Research, pp.185-208, 1965.

F. [. Robertson, . Wright-&-r, and . Dykstra, « Hypothesis selection and testing by the MDL principle. ». The Computer Journal The perceptron: a probabilistic model for information storage and organization in the brain, Modeling by the shortest data description Order Restricted Statistical Inference, pp.465-471, 1958.

C. Röver-&-g, . Szepannek-[-savick03-]-p, . Savický-&-j, . E. Fürnkranz-[-schapi97-]-r, Y. Schapire et al., A Simple Method For Estimating Conditional Probabilities For SVMs Boosting the margin: A new explanation for the effectiveness of voting methods, Application of a Genetic Algorithm to Variable Selection in Fuzzy Clustering ». Rapport technique LWA Combining Pairwise Classifiers with Stacking. ». Dans IDA « Less is More: Active Learning with Support Vector Machines ». Dans ICML « Support Vector Method for Novelty Detection. ». Dans NIPS Image Segmentation via Multiple Active Contour Models and Fuzzy Clustering with Biomedical Applications. ». Dans ICPRSCOTT02] C. SCOTT & R. NOWAK. « Dyadic Classification Trees via Structural Risk Minimization. ». Dans NIPS, pp.206-210, 1997.

H. Shin-&-s and . Cho, « Pattern Selection for Support Vector Classifiers, pp.469-474, 2002.

H. Shin-&-s and . Cho, « Fast Pattern Selection Algorithm for Support Vector Classifiers: Time Complexity Analysis, pp.1008-1015, 2003.

Z. Sun, X. Yuan, G. L. Bebis-&-s, . «tahaha03-]-f, . Tahahashi-&-s et al., Genetic feature subset selection for gender classication: A comparison study « Optimizing directed acyclic graph support vector machines, Proc. Artificial Neural Networks TAILLARD. « Comparison of Iterative Searches for the Quadratic Assignment ProblemTAILLA98] E. TAILLARD. « Programmation à mémoire adaptative et algorithmes pseudo-gloutons : nouvelles perspectives pour les métaheuristiques, pp.166-170, 1995.

D. M. Tax-&-r, . Duintibshi96-]-r, . Tibshirani, A. Bias, C. Trémeau et al., « Quantitative description of image distorsions linked to compression schemes Proceedings of The Int QMAT'97 « A problem of dimensionality: A simple example « Towards Principled Feature Selection: Relevancy, Filters, and Wrappers « Core Vector Machines500-10. « Méthodologie d'évaluation subjective de la qualité des images de télévision ». Rapport technique [VANDEN00A] N. VANDENBROUCKE. « Segmentation d'images couleur par classification de pixels dans des espaces d'attributs colorimétriques adaptés. Application à l'analyse d'images de football « Color Image Segmentation by Supervised Pixel Classification in a Color Texture Feature Space: Application to Soccer Image Segmentation « Color image segmentation by pixel classification in an adapted hybrid color space. Application to soccer image analysis, Support vector domain description ». Pattern Recognition Letters Color Pixels Classification in an Hybrid Color Space. Thèse de doctorat, Université des Sciences et Technologies de Lille 1VANDER99] R. J. VANDERBEI. « LOQO: An interior point code for quadratic programming ». Optimization Methods and SoftwareVAPNIK95] V. N. VAPNIK. « The nature of statistical learning theoryVAPNIK98] V. N. VAPNIK. « Statistical Learning Theory », pp.11-13, 1979.

N. [. Veropoulos, . Cristianini-&-c, . «. Campbell, . Vincent-&-y, and . Bengio, Controlling the Sensitivity of Support Vector Machines « K-Local Hyperplane and Convex Distance Nearest Neighbor Algorithms, International Joint Conference on Artificial IntelligenceVINCEN03] P. VINCENT. « Modèles à noyaux à structure locale Thèse de doctorat VURAL & J. G. DY. « A hierarchical method for multi-class support vector machines. ». Dans ICML, pp.55-60, 1999.

W. Wang, J. R. Yang-&-r, S. H. Muntz, . Wang-&-d, . B. Bell-[-watson87-]-a et al., « Extended k-Nearest Neighbours based on Evidence Theory « The Cortex transform: Rapid computation of simulated neural images ». Computer Vis. Graphics and Image Proces, Statistical Information Grid Approach to Spatial Data Mining Handbook of Image and Video Processing Asymtotic properties of nearest neighbor rules using edited data WOLPERT & W. G. MACREADY. « No Free Lunch Theorems for Search ». Rapport techniqueWOODRU93] D. L. WOODRUFF & E. ZEMEL. « Hashing vectors for tabu search, pp.186-195, 1972.

. [. Woodruff, Simulated annealing and tabu search: Lessons from a line search, Computers & Operations Research, vol.21, issue.8, pp.823-839, 1994.
DOI : 10.1016/0305-0548(94)90013-2

Q. Xie, C. A. Laszlo-&-r, and . Ward, « Vector Quantization Technique for Nonparametric Classifier Design, IEEE Trans. Pattern Anal. Mach. Intell, vol.15, issue.12, pp.1326-1330, 1993.

M. Ahuja, « A Geometric Approach to Train Support Vector Machines, pp.1430-1437, 2000.

H. Yu, J. Yang-&-j, and . Han, Classifying large data sets using SVMs with hierarchical clusters, Proceedings of the ninth ACM SIGKDD international conference on Knowledge discovery and data mining , KDD '03, pp.306-315, 2003.
DOI : 10.1145/956750.956786

H. Yu, J. H. Yang-&-j, R. Zhang, . Ramakrishnan-&-m, and . Livny, Classifying large data sets using SVMs with hierarchical clusters « Transforming classifier scores into accurate multiclass probability estimates « BIRCH: an efficient data clustering method for very large databases, the International Conference Management of Data (ACM-SIGMOD)ZHANG02] W. ZHANG & I. KING. « Locating support vectors via beta-skeleton technique ». Neural Information Processing, pp.306-315, 1996.