I. Guyon, V. Lemaire, G. Dror, and D. Vogel, Design and analysis of the KDD cup 2009, Conference Proceedings, pp.1-22, 2009.
DOI : 10.1145/1809400.1809414

R. Féraud, M. Boullé, F. Clérot, F. Fessant, and V. Lemaire, The Orange Customer Analysis Platform, Industrial Conference on Data Mining (ICDM), pp.584-594, 2010.
DOI : 10.1007/978-3-642-14400-4_45

B. Settles, Active learning literature survey, 2010.

R. S. Michalski, I. Mozetic, J. Hong, and N. Lavrac, The Multi- Purpose incremental Learning System AQ15 and its Testing Application to Three Medical Domains, Proceedings of the Fifth National Conference on Artificial Intelligence, pp.1041-1045, 1986.

J. Gama, R. Rocha, and P. Medas, Accurate decision trees for mining high-speed data streams, Proceedings of the ninth ACM SIGKDD international conference on Knowledge discovery and data mining , KDD '03, pp.523-528, 2003.
DOI : 10.1145/956750.956813

G. Forman and I. Cohen, Learning from Little: Comparison of Classifiers Given Little Training, Knowledge Discovery in Databases: PKDD 2004, pp.161-172, 2004.
DOI : 10.1007/978-3-540-30116-5_17

T. Lim, W. Loh, and Y. Shih, A comparison of prediction accuracy, complexity, and training time of thirty-three old and new classification algorithms, Machine Learning, vol.40, issue.3, pp.203-228, 2000.
DOI : 10.1023/A:1007608224229

Y. Muto and Y. Hamamoto, Improvement of the Parzen classifier in small training sample size situations, Intelligent Data Analysis, vol.5, issue.6, pp.477-490, 2001.

Y. Hamamoto, S. Uchimura, and S. Tomita, A bootstrap technique for nearest neighbor classifier design, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.19, issue.1, pp.73-79, 1997.
DOI : 10.1109/34.566814

M. Skurichina and R. P. Duin, Stabilizing classifiers for very small sample sizes, Proceedings of 13th International Conference on Pattern Recognition, p.891, 1996.
DOI : 10.1109/ICPR.1996.547204

A. Basavanhally, S. Doyle, and A. Madabhushi, Predicting classifier performance with a small training set: Applications to computer-aided diagnosis and prognosis, 2010 IEEE International Symposium on Biomedical Imaging: From Nano to Macro, 2010.
DOI : 10.1109/ISBI.2010.5490373

D. Brain and G. I. Webb, On the effect of data set size on bias and variance in classification learning, Proceedings of the Fourth Australian Knowledge Acquisition Workshop (AKAW '99), 1999.

R. Fisher, The use of multiple measurements in taxonomic problems, Annals of Eugenics, vol.7, pp.179-188, 1936.

P. Langley, W. Iba, and K. Thompson, An analysis of Bayesian classifiers, Proceedings of the National Conference on Artificial Intelligence, pp.223-228, 1992.

J. Gama, P. Medas, and P. Rodrigues, Learning decision trees from dynamic data streams, Proceedings of the 2005 ACM symposium on Applied computing , SAC '05, 2005.
DOI : 10.1145/1066677.1066809

I. H. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques, 2005.

M. Boullé, Khiops: A Statistical Discretization Method of Continuous Attributes, Machine Learning, pp.53-69, 2004.
DOI : 10.1023/B:MACH.0000019804.29836.05

G. John and P. Langley, Estimating continuous distributions in Bayesian classifiers, Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence, pp.338-345, 1995.

R. Bouckaert, Bayesian Network Classifiers in Weka, 2004.

D. Aha, D. Kibler, and M. Albert, Instance-based learning algorithms, Machine learning, pp.37-66, 1991.
DOI : 10.1007/BF00153759

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.138.635

S. le Cessie and J. C. van Houwelingen, Ridge Estimators in Logistic Regression, Applied Statistics, vol.41, issue.1, 1992.
DOI : 10.2307/2347628

C.-C. Chang and C.-J. Lin, LIBSVM: A library for support vector machines, ACM Transactions on Intelligent Systems and Technology, vol.2, issue.3, 2011.
DOI : 10.1145/1961189.1961199

Y. El-manzalawy and V. Honavar, WLSVM: Integrating LibSVM into Weka Environment, 2005.

Y. Freund and L. Mason, The alternating decision tree learning algorithm, Machine learning, pp.124-133, 1999.

J. R. Quinlan, C4.5: Programs for Machine Learning, 1993.

L. Breiman, J. Friedman, R. Olshen, and C. Stone, Classification and regression trees, 1984.

L. Breiman, Random forests, Machine Learning, vol.45, issue.1, pp.5-32, 2001.

G. Demiröz and H. Güvenir, Classification by Voting Feature Intervals, Machine Learning: ECML-97, pp.85-92, 1997.
DOI : 10.1007/3-540-62858-4_74

M. Boullé, Regularization and Averaging of the Selective Naïve Bayes classifier, The 2006 IEEE International Joint Conference on Neural Network Proceedings, pp.1680-1688, 2006.
DOI : 10.1109/IJCNN.2006.1716310

C. L. Blake and C. J. Merz, UCI Repository of machine learning databases, 1998.

T. Fawcett, ROC graphs: Notes and practical considerations for researchers, Machine Learning, pp.1-38, 2004.

I. Guyon, G. Cawley, G. Dror, and V. Lemaire, Results of the Active Learning Challenge, JMLR W&CP, Workshop on Active Learning and Experimental Design, pp.1-26, 2010.

P. Domingos and M. Pazzani, On the optimality of the simple Bayesian classifier under zero-one loss, Machine Learning, pp.103-130, 1997.

F. Cucker and S. Smale, Best Choices for Regularization Parameters in Learning Theory: On the Bias-Variance Problem, Foundations of Computational Mathematics, vol.2, issue.4, pp.413-428, 2002.
DOI : 10.1007/s102080010030

G. Bouchard and B. Triggs, The tradeoff between generative and discriminative classifiers, IASC International Symposium on Computational Statistics (COMPSTAT), pp.721-728, 2004.
URL : https://hal.archives-ouvertes.fr/inria-00548546

E. Bauer and R. Kohavi, An empirical comparison of voting classification algorithms: Bagging, boosting, and variants, Machine Learning, vol.36, issue.1/2, pp.105-139, 1999.
DOI : 10.1023/A:1007515423169

L. Breiman, Bagging predictors, Machine Learning, pp.123-140, 1996.
DOI : 10.1007/BF00058655

A. Prinzie and D. Van den Poel, Random Multiclass Classification: Generalizing Random Forests to Random MNL and Random NB, Database and Expert Systems Applications, Springer Lecture Notes in Computer Science, pp.349-358, 2007.
DOI : 10.1007/978-3-540-74469-6_35

A. Bifet, G. Holmes, R. Kirkby, and B. Pfahringer, MOA: Massive online analysis, J. Mach. Learn. Res., vol.11, pp.1601-1604, 2010.
DOI : 10.1007/978-3-642-41398-8_9

M. Boullé, MODL: A Bayes optimal discretization method for continuous attributes, Machine Learning, pp.131-165, 2006.
DOI : 10.1007/s10994-006-8364-x

G. Cormode and S. Muthukrishnan, An improved data stream summary: the count-min sketch and its applications, Journal of Algorithms, vol.55, issue.1, pp.58-75, 2005.
DOI : 10.1016/j.jalgor.2003.12.001

C. Cortes and M. Mohri, AUC optimization vs. error rate minimization, Advances in Neural Information Processing Systems, 2004.

P. Domingos and G. Hulten, Mining high-speed data streams, Proceedings of the sixth ACM SIGKDD international conference on Knowledge discovery and data mining , KDD '00, pp.71-80, 2000.
DOI : 10.1145/347090.347107

M. Greenwald and S. Khanna, Space-efficient online computation of quantile summaries, ACM SIGMOD Record, vol.30, issue.2, pp.58-66, 2001.
DOI : 10.1145/376284.375670

J. Langford, A. Strehl, and J. Wortman, Exploration scavenging, Proceedings of the 25th international conference on Machine learning, ICML '08, pp.528-535, 2008.
DOI : 10.1145/1390156.1390223

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.149.5626

L. Li, W. Chu, J. Langford, and R. E. Schapire, A contextual-bandit approach to personalized news article recommendation, Proceedings of the 19th international conference on World wide web, WWW '10, pp.661-670, 2010.
DOI : 10.1145/1772690.1772758

L. Li, W. Chu, J. Langford, and X. Wang, Unbiased offline evaluation of contextual-bandit-based news article recommendation algorithms, Proceedings of the fourth ACM international conference on Web search and data mining, pp.297-306, 2011.

N. C. Oza, Online Bagging and Boosting, 2005 IEEE International Conference on Systems, Man and Cybernetics, pp.2340-2345, 2005.
DOI : 10.1109/ICSMC.2005.1571498

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.32.8889

F. Provost and P. Domingos, Tree induction for probability-based ranking, Machine Learning, pp.199-215, 2003.

J. Rocchio, Relevance Feedback in Information Retrieval, SMART Retrieval System Experiments in Automatic Document Processing, pp.313-323, 1971.

D. W. Aha, Lazy Learning, 1997.

S. Amari, N. Murata, K.-R. Müller, M. Finke, and H. Yang, Asymptotic statistical theory of overtraining and cross-validation, IEEE Transactions on Neural Networks, vol.8, issue.5, pp.985-996, 1997.
DOI : 10.1109/72.623200

A. Bordes and L. Bottou, The Huller: a simple and efficient online SVM, Proceedings of the 16th European Conference on Machine Learning (ECML2005), 2005.

A. Bondu and M. Boullé, A supervised approach for change detection in data streams, International Joint Conference on Neural Networks (IJCNN), 2011.

A. Bordes, S. Ertekin, J. Weston, and L. Bottou, Fast kernel classifiers with online and active learning, Journal of Machine Learning Research, vol.6, pp.1579-1619, 2005.

L. Breiman, J. H. Friedman, R. A. Olshen, and C. J. Stone, Classification and regression trees, 1984.

A. Bifet and R. Gavalda, Learning from Time-Changing Data with Adaptive Windowing, SIAM International Conference on Data Mining, pp.443-448, 2007.
DOI : 10.1137/1.9781611972771.42

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.144.2279

B. E. Boser, I. M. Guyon, and V. N. Vapnik, A training algorithm for optimal margin classifiers, Proceedings of the fifth annual workshop on Computational Learning Theory, pp.144-152, 1992.

J. Beringer and E. Hüllermeier, Efficient instance-based learning on data streams, Intelligent Data Analysis, vol.11, issue.6, pp.627-650, 2007.

A. Bifet, G. Holmes, B. Pfahringer, R. Kirkby, and R. Gavaldà, New ensemble methods for evolving data streams, Proceedings of the 15th ACM SIGKDD international conference on Knowledge discovery and data mining, KDD '09, pp.139-147, 2009.
DOI : 10.1145/1557019.1557041

Y. Ben-Haim and E. Tom-Tov, A streaming parallel decision tree algorithm, Journal of Machine Learning Research, vol.11, pp.849-872, 2010.

C. M. Bishop, Neural Networks for Pattern Recognition, 1995.

A. Bifet and R. Kirkby, Data Stream Mining: A Practical Approach, 2009.

C. L. Blake and C. Merz, UCI Repository of machine learning databases, 1998.

H. Brighton and C. Mellish, Advances in instance selection for instance-based learning algorithms, Data Mining and Knowledge Discovery, vol.6, issue.2, pp.153-172, 2002.
DOI : 10.1023/A:1014043630878

J. Bártolo Gomes, E. Menasalvas, and P. Sousa, Tracking recurrent concepts using context, Proceedings of the 7th international conference on Rough sets and current trends in computing, pp.168-177, 2010.

M. Boullé, A Grouping Method for Categorical Attributes Having Very Large Number of Values, Machine Learning and Data Mining in Pattern Recognition, pp.228-242, 2005.
DOI : 10.1007/11510888_23

M. Boullé and . Modl, MODL: A Bayes optimal discretization method for continuous attributes, Machine Learning, p.131165, 2006.
DOI : 10.1007/s10994-006-8364-x

M. Boullé, Regularization and Averaging of the Selective Naive Bayes classifier, The 2006 IEEE International Joint Conference on Neural Network Proceedings, pp.1680-1688, 2006.

M. Boullé, Recherche d'une représentation des données efficace pour la fouille des grandes bases de données, 2007.

M. Boullé, Une méthode optimale d'évaluation bivariée pour la classification supervisée, Extraction et gestion des connaissances, pp.461-472, 2007.

M. Boullé, Optimum simultaneous discretization with data grid models in supervised classification: a Bayesian model selection approach, Advances in Data Analysis and Classification, 2009.

L. Breiman, Bagging predictors, Machine Learning, vol.24, issue.2, pp.123-140, 1996.
DOI : 10.1007/BF00058655

B. Buchanan and E. Shortliffe, Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project, Series in Artificial Intelligence, 1984.

G. Cormode and M. Hadjieleftheriou, Finding frequent items in data streams, Proceedings of the VLDB Endowment, pp.1530-1541, 2008.

M. Charikar, Finding frequent items in data streams, Theoretical Computer Science, vol.312, issue.1, pp.3-15, 2004.

G. Cormode and S. Muthukrishnan, An improved data stream summary: the count-min sketch and its applications, Journal of Algorithms, vol.55, issue.1, pp.58-75, 2005.

C. Cortes and V. Vapnik, Support-vector networks, Machine Learning, pp.273-297, 1995.
DOI : 10.1007/BF00994018

T. Dean and M. Boddy, An analysis of time-dependent planning, Proceedings of the seventh national conference on artificial intelligence, pp.49-54, 1988.

J. del Campo-Ávila, G. Ramos-Jiménez, J. Gama, and R. Morales-Bueno, Improving Prediction Accuracy of an Incremental Algorithm Driven by Error Margins, Knowledge Discovery from Data Streams, pp.305-318, 2006.

C. Domeniconi and D. Gunopulos, Incremental support vector machine construction, Proceedings 2001 IEEE International Conference on Data Mining, pp.589-592, 2001.
DOI : 10.1109/ICDM.2001.989572

P. Domingos and G. Hulten, Mining high-speed data streams, Proceedings of the sixth ACM SIGKDD international conference on Knowledge discovery and data mining, KDD '00, pp.71-80, 2000.
DOI : 10.1145/347090.347107

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.119.3124

P. Domingos and G. Hulten, Catching up with the data: Research issues in mining data streams, Workshop on Research Issues in Data Mining and Knowledge Discovery, 2001.

J. Dougherty, R. Kohavi, and M. Sahami, Supervised and Unsupervised Discretization of Continuous Features, Proceedings of the Twelfth International Conference on Machine Learning, 1995.
DOI : 10.1016/B978-1-55860-377-6.50032-3

J.-X. Dong, A. Krzyzak, and C. Y. Suen, Fast SVM training algorithm with decomposition on very large data sets, IEEE Transactions on Pattern Analysis and Machine Intelligence, pp.603-618, 2005.

T.-N. Do, V.-H. Nguyen, and F. Poulet, GPU-based parallel SVM algorithm.
URL : https://hal.archives-ouvertes.fr/hal-00652463

P. Domingos and M. Pazzani, On the optimality of the simple Bayesian classifier under zero-one loss, Machine Learning, pp.103-130, 1997.

A. Dries and U. Rückert, Adaptive concept drift detection, Statistical Analysis and Data Mining, vol.2, pp.311-327, 2009.

T. Fawcett, ROC graphs: Notes and practical considerations for researchers, Machine Learning, pp.1-38, 2004.

U. M. Fayyad and K. B. Irani, Multi-interval discretization of continuous-valued attributes for classification learning, Proceedings of the International Joint Conference on Uncertainty in AI, pp.1022-1027, 1993.

U. M. Fayyad, G. Piatetsky-Shapiro, P. Smyth, and R. Uthurusamy, Advances in Knowledge Discovery and Data Mining, American Association for Artificial Intelligence, 1996.

F. Ferrer-Troyano, J. S. Aguilar-Ruiz, and J. C. Riquelme, Incremental rule learning based on example nearness from numerical data streams, Proceedings of the 2005 ACM symposium on Applied computing (SAC '05), pp.568-572, 2005.

F. Ferrer-Troyano, J. S. Aguilar-Ruiz, and J. C. Riquelme, Data streams classification by incremental rule learning with parameterized generalization, Proceedings of the 2006 ACM symposium on Applied computing, pp.657-661, 2006.

J. Gama, Knowledge Discovery from Data Streams. Chapman and Hall, 2010.
DOI : 10.1201/ebk1439826119

R. Guigourès and M. Boullé, Optimisation directe des poids de modèles dans un prédicteur bayésien naïf moyenné, Extraction et gestion des connaissances EGC'2011, pp.77-82, 2011.

M. Greenwald and S. Khanna, Space-efficient online computation of quantile summaries, ACM SIGMOD Record, vol.30, issue.2, pp.58-66, 2001.

J. Gama and P. Kosina, Tracking Recurring Concepts with Meta-learners, Progress in Artificial Intelligence, pp.423-434, 2009.
DOI : 10.1145/502512.502568

I. Guyon, V. Lemaire, M. Boullé, G. Dror, and D. Vogel, Design and analysis of the KDD cup 2009, Conference Proceedings, pp.1-22, 2009.
DOI : 10.1145/1809400.1809414

J. Gama, P. Medas, G. Castillo, and P. Rodrigues, Learning with drift detection, Advances in Artificial Intelligence - SBIA, pp.286-295, 2004.

P. Gibbons, Y. Matias, and V. Poosala, Fast incremental maintenance of approximate histograms, ACM Transactions on Database Systems, vol.27, issue.3, pp.261-298, 2002.
DOI : 10.1145/581751.581753

J. Gama, P. Medas, and P. Rodrigues, Learning decision trees from dynamic data streams, Proceedings of the 2005 ACM symposium on Applied computing, 2005.

J. Gama and C. Pinto, Discretization from data streams: applications to histograms and data mining, Proceedings of the 2006 ACM symposium on Applied computing, pp.662-667, 2006.

A. Globerson and S. Roweis, Metric Learning by Collapsing Classes, Neural Information Processing Systems (NIPS), 2005.

J. Gehrke, R. Ramakrishnan, and V. Ganti, RainForest - a framework for fast decision tree construction of large datasets, Data Mining and Knowledge Discovery, vol.4, issue.2, pp.127-162, 2000.

J. Gama, R. Rocha, and P. Medas, Accurate decision trees for mining high-speed data streams, Proceedings of the ninth ACM SIGKDD international conference on Knowledge discovery and data mining, pp.523-528, 2003.

J. Gama, R. Sebastião, and P. P. Rodrigues, Issues in evaluation of stream learning algorithms, Proceedings of the 15th ACM SIGKDD international conference on Knowledge discovery and data mining, KDD '09, pp.329-338, 2009.
DOI : 10.1145/1557019.1557060

G. Hulten and P. Domingos, VFML -A toolkit for mining high-speed time-changing data streams

J. Hoeting, D. Madigan, and A. Raftery, Bayesian model averaging: a tutorial, Statistical Science, vol.14, issue.4, pp.382-417, 1999.

W. Hoeffding, Probability inequalities for sums of bounded random variables, Journal of the American Statistical Association, vol.58, pp.13-30, 1963.

G. Hulten, L. Spencer, and P. Domingos, Mining time-changing data streams, Proceedings of the seventh ACM SIGKDD international conference on Knowledge discovery and data mining, pp.97-106, 2001.

D. J. Hand and K. Yu, Idiot's Bayes - Not So Stupid After All?, International Statistical Review, vol.69, issue.3, pp.385-398, 2001.

G. John and P. Langley, Estimating continuous distributions in Bayesian classifiers, Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence, pp.338-345, 1995.

G. K. Kanji, 100 Statistical Tests, Sage, 2006.

G. V. Kass, An Exploratory Technique for Investigating Large Quantities of Categorical Data, Applied Statistics, vol.29, issue.2, pp.119-127, 1980.
DOI : 10.2307/2986296

R. Kerber, ChiMerge: Discretization of numeric attributes, Proceedings of the tenth national conference on Artificial intelligence, pp.123-128, 1992.

M. G. Kelly, D. J. Hand, and N. M. Adams, The impact of changing populations on classifier performance, Proceedings of the fifth ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '99, pp.367-371, 1999.

R. Kirkby, Improving Hoeffding Trees, 2008.

R. Kohavi and C. Kunz, Option decision trees with majority votes, ICML '97: Proceedings of the Fourteenth International Conference on Machine Learning, pp.161-169, 1997.

J. Z. Kolter and M. A. Maloof, Dynamic weighted majority: a new ensemble method for tracking concept drift, Third IEEE International Conference on Data Mining, pp.123-130, 2003.
DOI : 10.1109/ICDM.2003.1250911

R. Kohavi, Scaling up the accuracy of naive-Bayes classifiers: A decision-tree hybrid, Proceedings of the Second International Conference on Knowledge Discovery and Data Mining, 1996.

L. I. Kuncheva and C. O. Plumpton, Adaptive Learning Rate for Online Linear Discriminant Classifiers, Proceedings of the 2008 Joint IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition, pp.510-519, 2008.

R. Klinkenberg and I. Renz, Adaptive Information Filtering: Learning Drifting Concepts, Artificial Intelligence, 1998.

I. Kononenko and M. Robnik, Theoretical and Empirical Analysis of ReliefF and RReliefF, Machine Learning Journal, vol.53, pp.23-69, 2003.

D. Koller and M. Sahami, Toward Optimal Feature Selection, International Conference on Machine Learning, pp.284-292, 1996.

L. I. Kuncheva, Change Detection in Streaming Multivariate Data Using Likelihood Detectors, IEEE Transactions on Knowledge and Data Engineering, 2011.

P. Langley, W. Iba, and K. Thompson, An analysis of Bayesian classifiers, Proceedings of the National Conference on Artificial Intelligence, pp.223-228, 1992.

P. Langley and S. Sage, Induction of Selective Bayesian Classifiers, Proceedings of the Tenth Conference on Uncertainty in Artificial Intelligence, pp.399-406, 1994.

Association Rule Interestingness: Measure and Statistical Validation, in Quality Measures in Data Mining, volume 43 of Studies in Computational Intelligence, pp.251-275.

M. Lazarescu, S. Venkatesh, and H. Bui, Using multiple windows to track concept drift, Intelligent Data Analysis, vol.8, issue.1, pp.29-59, 2004.

J. Lu, Y. Yang, and G. Webb, Incremental Discretization for Naive-Bayes classifier, Advanced Data Mining and Applications, pp.223-238, 2006.

Y.-N. Law and C. Zaniolo, An adaptive nearest neighbor classification algorithm for data streams, Lecture Notes in Computer Science, 2005.

M. Mehta, R. Agrawal, and J. Rissanen, SLIQ: A fast scalable classifier for data mining, 1996.

S. Marsland, Novelty Detection in Learning Systems, Neural Computing Surveys, 2003.

M. A. Maloof and R. S. Michalski, Selecting examples for partial memory learning, Machine Learning, pp.27-52, 2000.

R. S. Michalski, I. Mozetic, J. Hong, and N. Lavrac, The Multi-Purpose Incremental Learning System AQ15 and its Testing Application to Three Medical Domains, Proceedings of the Fifth National Conference on Artificial Intelligence, pp.1041-1045, 1986.

J. I. Munro and M. S. Paterson, Selection and sorting with limited storage, 19th Annual Symposium on Foundations of Computer Science, pp.253-258, 1978.

G. S. Manku, S. Rajagopalan, and B. G. Lindsay, Approximate medians and other quantiles in one pass and with limited memory, 1998.

F. Moreno-Seco, L. Micó, and J. Oncina, Extending LAESA Fast Nearest Neighbour Algorithm to Find the k Nearest Neighbours, SSPR & SPR, pp.718-724, 2002.
DOI : 10.1007/3-540-70659-3_75

A. Narasimhamurthy and L. I. Kuncheva, A framework for generating data to simulate changing environments, Proceedings of the 25th IASTED International Multi-Conference: Artificial Intelligence and Applications, pp.384-389, 2007.

T. Oates and D. Jensen, The Effects of Training Set Size on Decision Tree Complexity, ICML '97: Proceedings of the Fourteenth International Conference on Machine Learning, pp.254-262, 1997.

O. Nicol, J. Mary, and P. Preux, ICML Exploration & Exploitation Challenge: Keep it simple, Journal of Machine Learning Research - Proceedings Track, vol.26, pp.62-85, 2012.

E. Page, Continuous inspection schemes, Biometrika, vol.41, issue.1-2, p.100, 1954.
DOI : 10.1093/biomet/41.1-2.100

F. Provost and P. Domingos, Tree induction for probability-based ranking, Machine Learning, 2003.

B. Pfahringer, G. Holmes, and R. Kirkby, Handling numeric attributes in Hoeffding trees, Advances in Knowledge Discovery and Data Mining, pp.296-307, 2008.

[. Provost and V. Kolluri, A Survey of Methods for Scaling Up Inductive Algorithms, Data Mining and Knowledge Discovery, vol.3, issue.2, pp.131-169, 1999.
DOI : 10.1023/A:1009876119989

L. Prechelt, Early Stopping - But When? In Neural Networks: Tricks of the Trade, LNCS, vol.1524, pp.55-69, 1997.

J. Quiñonero-Candela, M. Sugiyama, A. Schwaighofer, and N. D. Lawrence, Dataset Shift in Machine Learning, 2009.

J. R. Quinlan, C4.5 : programs for machine learning, 1993.

M. Riedmiller and H. Braun, A direct adaptive method for faster backpropagation learning: the RPROP algorithm, IEEE International Conference on Neural Networks, pp.586-591, 1993.
DOI : 10.1109/ICNN.1993.298623

G. Ramos-Jiménez, J. del Campo-Ávila, and R. Morales-Bueno, Incremental algorithm driven by error margins, Lecture Notes in Computer Science, vol.4265, pp.358-362, 2006.

T. Seidl, I. Assent, P. Kranen, R. Krieger et al., Indexing density models for incremental learning and anytime classification on data streams, Proceedings of the 12th International Conference on Extending Database Technology: Advances in Database Technology, pp.311-322, 2009.

M. Salganicoff, Tolerating concept and sampling shift in lazy learning using prediction error context switching, Artificial Intelligence Review, vol.11, issue.1/5, pp.133-155, 1997.
DOI : 10.1023/A:1006515405170

J. Shafer, R. Agrawal, and M. Mehta, SPRINT: A scalable parallel classifier for data mining, Proceedings of the International Conference on Very Large Data Bases, pp.544-555, 1996.

M. Stonebraker, U. Cetintemel, and S. Zdonik, The 8 requirements of real-time stream processing, ACM SIGMOD Record, vol.34, issue.4, pp.42-47, 2005.
DOI : 10.1145/1107499.1107504

J. C. Schlimmer and D. Fisher, A case study of incremental concept induction, 1986.

J. C. Schlimmer and R. H. Granger, Incremental learning from noisy data, Machine Learning, vol.1, issue.3, pp.317-354, 1986.
DOI : 10.1007/BF00116895

W. N. Street and Y. Kim, A streaming ensemble algorithm (SEA) for large-scale classification, Proceedings of the seventh ACM SIGKDD international conference on Knowledge discovery and data mining, pp.377-382, 2001.

M. Scholz and R. Klinkenberg, Boosting classifiers for drifting concepts, Intelligent Data Analysis, vol.11, issue.1, pp.3-28, 2007.

C. Salperwyck and V. Lemaire, Classification incrémentale supervisée : un panel introductif, Revue des Nouvelles Technologies de l'Information, Numéro spécial sur l'apprentissage et la fouille de données, pp.121-148, 2011.

C. Salperwyck and V. Lemaire, Incremental discretization for supervised learning, CLADAG: CLAssification and Data Analysis Group - 8th International Meeting of the Italian Statistical Society, 2011.

C. Salperwyck and V. Lemaire, Learning with few examples: An empirical study on leading classifiers, The 2011 International Joint Conference on Neural Networks, pp.1010-1019, 2011.
DOI : 10.1109/IJCNN.2011.6033333

C. Salperwyck and V. Lemaire, Arbres en ligne basés sur des statistiques d'ordre, Atelier CIDN de la conférence EGC 2012

C. Salperwyck and V. Lemaire, Incremental decision tree based on order statistics, The 2013 International Joint Conference on Neural Networks (IJCNN)
DOI : 10.1109/IJCNN.2013.6706907

URL : https://hal.archives-ouvertes.fr/hal-00758003

C. Salperwyck and V. Lemaire, A two layers incremental discretization based on order statistics, Advances in Data Analysis and Classification, 2013.

N. A. Syed, H. Liu, and K. K. Sung, Handling concept drifts in incremental learning with support vector machines, Proceedings of the fifth ACM SIGKDD international conference on Knowledge discovery and data mining, KDD '99, pp.317-321, 1999.
DOI : 10.1145/312129.312267

J. Sankaranarayanan, H. Samet, and A. Varshney, A fast all nearest neighbor algorithm for applications involving large point-clouds, Computers & Graphics, vol.31, issue.2, 2007.
DOI : 10.1016/j.cag.2006.11.011

C. Salperwyck and T. Urvoy, Stumping along a Summary for Exploration & Exploitation Challenge, Journal of Machine Learning Research - Proceedings Track, vol.26, pp.86-97, 2011.
URL : https://hal.archives-ouvertes.fr/hal-00757998

I. W. Tsang, J. T. Kwok, and P.-M. Cheung, Core vector machines: Fast SVM training on very large data sets, Journal of Machine Learning Research, vol.6, pp.363-392, 2005.

P. E. Utgoff, N. C. Berkman, and J. A. Clouse, Decision tree induction based on efficient tree restructuring, Machine Learning, vol.29, issue.1, pp.5-44, 1997.

P. E. Utgoff, Incremental induction of decision trees, Machine Learning, pp.161-186, 1989.

N. Voisine, M. Boullé, and C. Hue, Un critère d'évaluation Bayésienne pour la construction d'arbres de décision, Extraction et gestion des connaissances, pp.259-264, 2009.

J. S. Vitter, Random sampling with a reservoir, ACM Transactions on Mathematical Software, vol.11, issue.1, pp.37-57, 1985.

Vassef, C. Li, and V. Castelli, Combining fast search and learning for fast similarity search, Proceedings of SPIE - The International Society for Optical Engineering, pp.32-42, 2000.

I. Žliobaitė, Learning under Concept Drift: an Overview, 2010.

I. H. Witten and E. Frank, Data Mining: Practical Machine Learning Tools and Techniques, 2005.

H. Wang, W. Fan, P. S. Yu, and J. Han, Mining concept-drifting data streams using ensemble classifiers, Proceedings of the ninth ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '03, pp.226-235, 2003.
DOI : 10.1145/956750.956778

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.14.4071

G. Widmer and M. Kubat, Learning flexible concepts from streams of examples: FLORA2, Proceedings of the 10th European conference on Artificial intelligence, pp.463-467, 1992.

G. Widmer and M. Kubat, Learning in the presence of concept drift and hidden contexts, Machine Learning, pp.69-101, 1996.
DOI : 10.1007/BF00116900

K. Weinberger and L. Saul, Distance Metric Learning for Large Margin Nearest Neighbor Classification, The Journal of Machine Learning Research (JMLR), vol.10, pp.207-244, 2009.

Y. Yang and G. Webb, Discretization for naive-Bayes learning: managing discretization bias and variance, Machine Learning, pp.39-74, 2008.
DOI : 10.1007/s10994-008-5083-5

S. Zilberstein and S. Russell, Optimal composition of real-time systems, Artificial Intelligence, vol.82, issue.1-2, pp.181-213, 1996.
DOI : 10.1016/0004-3702(94)00074-3

D. A. Zighed and R. Rakotomalala, Graphes d'induction, 2000.

D. A. Zighed, S. Rabaséda, and R. Rakotomalala, FUSINTER: a method for discretization of continuous attributes, International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, vol.6, issue.3, pp.307-326, 1998.

Q. Zhang and W. Wang, A Fast Algorithm for Approximate Quantiles in High Speed Data Streams, 19th International Conference on Scientific and Statistical Database Management (SSDBM 2007), p.2929, 2007.
DOI : 10.1109/SSDBM.2007.27