Article in a peer-reviewed journal
Problems of Information Transmission 43, 3 (2007) 167-189
Normalized information-based divergences
Jean-François Coeurjolly1, Rémy Drouilhet1, Jean-François Robineau1

This paper is devoted to the mathematical study of some divergences based on mutual information which are well-suited to categorical random vectors. These divergences are generalizations of the "entropy distance" and the "information distance". Their main characteristic is that they combine a complexity term with the mutual information. We then introduce the notion of (normalized) information-based divergence, propose several examples, and discuss their mathematical properties, in particular within a prediction framework.
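As a concrete illustration of the quantities the abstract refers to, the sketch below computes the classical entropy distance H(X|Y) + H(Y|X) and information distance max(H(X|Y), H(Y|X)) for two categorical samples, together with their commonly used normalized variants (dividing by the joint entropy H(X,Y) and by max(H(X), H(Y)), respectively). These are the standard definitions that the paper generalizes; the function names and the choice of empirical plug-in estimation are illustrative assumptions, not the authors' construction.

```python
import math
from collections import Counter


def entropy(counts):
    """Shannon entropy (in bits) of the empirical distribution
    given by a list of category counts."""
    n = sum(counts)
    return -sum(c / n * math.log2(c / n) for c in counts if c > 0)


def information_divergences(x, y):
    """Entropy distance, information distance, and their normalized
    variants for two equal-length categorical sequences.
    Standard definitions; a plug-in estimate from the samples."""
    hx = entropy(list(Counter(x).values()))
    hy = entropy(list(Counter(y).values()))
    hxy = entropy(list(Counter(zip(x, y)).values()))
    # Conditional entropies via the chain rule: H(X|Y) = H(X,Y) - H(Y)
    h_x_given_y = hxy - hy
    h_y_given_x = hxy - hx
    entropy_dist = h_x_given_y + h_y_given_x      # H(X|Y) + H(Y|X)
    info_dist = max(h_x_given_y, h_y_given_x)     # max(H(X|Y), H(Y|X))
    # Normalized forms lie in [0, 1]: divide by H(X,Y) and
    # by max(H(X), H(Y)), respectively.
    norm_entropy_dist = entropy_dist / hxy if hxy else 0.0
    norm_info_dist = info_dist / max(hx, hy) if max(hx, hy) else 0.0
    return entropy_dist, info_dist, norm_entropy_dist, norm_info_dist
```

For identical sequences every distance is 0, while for independent uniform sequences the normalized variants reach 1, matching the interpretation of these quantities as divergences between 0 (identical) and 1 (independent).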
1: LJK - Laboratoire Jean Kuntzmann, SAGAG
Keywords: information theory – entropy distance – information distance – triangle inequality