Preprint / Working Paper
Available versions: v1 (2005-07-08), v2 (2005-09-13), v3 (2011-05-27)
 Fast learning rates for plug-in classifiers under the margin condition
It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, i.e., rates faster than $n^{-1/2}$. Prior work on this subject suggested two conjectures: (i) the best achievable fast rate is of order $n^{-1}$, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that neither conjecture is correct. In particular, we construct plug-in classifiers that achieve not only fast but also {\it super-fast} rates, i.e., rates faster than $n^{-1}$. We establish minimax lower bounds showing that the obtained rates cannot be improved.
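To fix ideas, a plug-in classifier first estimates the regression function $\eta(x) = \mathbb{P}(Y=1 \mid X=x)$ nonparametrically and then classifies by thresholding the estimate at $1/2$. The sketch below illustrates this idea with a simple k-nearest-neighbour estimate of $\eta$; the function name, the choice of estimator, and the toy data are illustrative assumptions, not the specific estimators analyzed in the paper.

```python
import math

def plugin_classifier(train, x, k=3):
    """Minimal plug-in classifier sketch (illustrative, not the paper's
    construction): estimate eta(x) = P(Y=1 | X=x) by averaging the labels
    of the k nearest training points, then threshold at 1/2.

    `train` is a list of (feature_tuple, label) pairs with labels in {0, 1}.
    """
    # Sort training points by Euclidean distance to the query point x.
    by_dist = sorted(train, key=lambda pair: math.dist(pair[0], x))
    # eta_hat(x): fraction of positive labels among the k nearest neighbours.
    eta_hat = sum(label for _, label in by_dist[:k]) / k
    # Plug-in rule: predict 1 exactly when the estimated eta exceeds 1/2.
    return 1 if eta_hat >= 0.5 else 0
```

For example, with training points clustered near 0 (label 0) and near 1 (label 1), a query at 0.05 is classified as 0 and a query at 1.15 as 1. The margin assumption controls how much mass $\eta$ places near the critical level $1/2$, which is what drives the fast rates studied in the paper.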
 Keyword(s) : classification – statistical learning – fast rates of convergence – excess risk – plug-in classifiers – minimax lower bounds
hal-00005882, version 3 – http://hal.archives-ouvertes.fr/hal-00005882
oai:hal.archives-ouvertes.fr:hal-00005882
From: Jean-Yves Audibert
Submitted on: Tuesday, 24 May 2011 10:31:16 – Updated on: Thursday, 9 June 2011 10:34:14