M. A. Aizerman, E. M. Braverman, and L. I. Rozonoer, Theoretical foundations of the potential function method in pattern recognition learning, Automation and Remote Control, vol.25, pp.821-837, 1964.

J. Baez, Bayesian probability theory and quantum mechanics, 2003.

H. Barlow, The exploitation of regularities in the environment by the brain, Behavioral and Brain Sciences, vol.24, issue.04, pp.602-607, 2001.
DOI : 10.1017/S0140525X01000024

R. J. Baxter, Exactly Solved Models in Statistical Mechanics, Academic Press, 1982.

J. Berger and J. Bernardo, Estimating a Product of Means: Bayesian Analysis with Reference Priors, Journal of the American Statistical Association, vol.84, issue.405, pp.200-207, 1989.
DOI : 10.1080/01621459.1989.10478756

P. Bessière, E. Dedieu, O. Lebeltel, E. Mazer, and K. Mekhnacha, Interprétation ou description (I) : Proposition pour une théorie probabiliste des systèmes cognitifs sensori-moteurs, Intellectica, vol.26-27, p.257, 1998.

P. Bessière, E. Dedieu, O. Lebeltel, E. Mazer, and K. Mekhnacha, Interprétation ou description (II) : Fondements mathématiques de l'approche F+D, Intellectica, vol.26-27, p.313, 1998.

M. J. Beal, Variational Algorithms for Approximate Bayesian Inference, 2003.

J. Bernoulli, Ars Conjectandi: Usum & Applicationem Praecedentis Doctrinae in Civilibus, Moralibus & Oeconomicis, 1713.

J. L. Bertrand, Calcul des probabilités. Gauthier-Villars, p.185, 1907.

J. Bernardo, Noninformative priors do not exist: A discussion, Journal of Statistical Planning and Inference, vol.65, pp.159-189, 1997.

C. Berrou and A. Glavieux, Near optimum error correcting coding and decoding: turbo-codes, IEEE Transactions on Communications, vol.44, issue.10, pp.1261-1271, 1996.
DOI : 10.1109/26.539767

D. M. Blei, T. L. Griffiths, M. I. Jordan, and J. B. Tenenbaum, Hierarchical topic models and the nested Chinese restaurant process, Advances in Neural Information Processing Systems, 2004.

M. J. Beal, Z. Ghahramani, and C. E. Rasmussen, The infinite hidden Markov model, Advances in Neural Information Processing Systems 14, 2002.

B. E. Boser, I. M. Guyon, and V. N. Vapnik, A training algorithm for optimal margin classifiers, Proceedings of the fifth annual workshop on Computational learning theory , COLT '92, pp.144-152, 1992.
DOI : 10.1145/130385.130401

J. Breese, D. Heckerman, and C. Kadie, Empirical analysis of predictive algorithms for collaborative filtering, Proceedings of the 14th Annual Conference on Uncertainty in Artificial Intelligence (UAI-98), pp.43-52, 1998.

C. M. Bishop, Neural Networks for Pattern Recognition, 1995.

C. M. Bishop, Pattern Recognition and Machine Learning, 2006.

R. Bell, Y. Koren, and C. Volinsky, Modeling relationships at multiple scales to improve accuracy of large recommender systems, Proceedings of the 13th ACM SIGKDD international conference on Knowledge discovery and data mining , KDD '07, pp.95-104, 2007.
DOI : 10.1145/1281192.1281206

É. Borel, Éléments de la théorie des probabilités, Hermann et Fils, 1909.

G. Bouchard and B. Triggs, The tradeoff between generative and discriminative classifiers, IASC International Symposium on Computational Statistics (COMPSTAT), pp.721-728, 2004.
URL : https://hal.archives-ouvertes.fr/inria-00548546

J. O. Berger and R. L. Wolpert, The likelihood principle, Institute of Mathematical Statistics Lecture Notes-Monograph Series, vol.6, 1988.

Y. L. Chan, C. N. Anderson, and E. A. Hadly, Bayesian estimation of the timing and severity of a population bottleneck from ancient DNA, PLoS Genetics, vol.2, 2006.

G. Campbell, Guidance for the use of Bayesian statistics in medical device clinical trials (draft), 2006.

C. M. Caves, C. A. Fuchs, and R. Schack, Subjective probability and quantum certainty, ArXiv Quantum Physics e-prints, 2006.
DOI : 10.1016/j.shpsb.2006.10.007

URL : http://arxiv.org/abs/quant-ph/0608190

D. M. Chickering and D. Heckerman, Efficient approximations for the marginal likelihood of Bayesian networks with hidden variables, Machine Learning, vol.29, issue.2/3, pp.181-212, 1997.
DOI : 10.1023/A:1007469629108

N. D. Chatterjee, R. Krüger, G. Haller, and W. Olbricht, The Bayesian approach to an internally consistent thermodynamic database: theory, database, and generation of phase diagrams, Contributions to Mineralogy and Petrology, vol.133, issue.1-2, pp.149-168, 1998.
DOI : 10.1007/s004100050444

L. Candillier, F. Meyer, and M. Boullé, Comparing state-of-the-art collaborative filtering systems, 5th International Conference on Machine Learning and Data Mining in Pattern Recognition (MLDM'2007), volume LNAI 4571 of LNCS, pp.548-562, 2007.

G. F. Cooper, The computational complexity of probabilistic inference using Bayesian belief networks, Artificial Intelligence, vol.42, issue.2-3, pp.393-405, 1990.

R. D. Cousins, Why isn't every physicist a Bayesian?, American Journal of Physics, vol.63, pp.398-410, 1995.

R. T. Cox, The Algebra of Probable Inference, 1961.

A. Caticha and R. Preuss, Maximum entropy and Bayesian data analysis: Entropic prior distributions, Physical Review E, vol.70, issue.4, p.46127, 2004.
DOI : 10.1103/PhysRevE.70.046127

C. Coué, C. Pradalier, C. Laugier, T. Fraichard, and P. Bessière, Bayesian Occupancy Filtering for Multitarget Tracking: An Automotive Application, The International Journal of Robotics Research, vol.25, issue.1, pp.19-30, 2006.
DOI : 10.1177/0278364906061158

P. Dangauthier, Distribution de maximum d'entropie de moyenne contrainte, Research report, vol.6279, 2007.
URL : https://hal.archives-ouvertes.fr/inria-00180739

H. A. David, The Method of Paired Comparisons, 1988.

J. Diard, P. Bessière, and E. Mazer, Merging probabilistic models of navigation: the Bayesian map, Proc. of the IEEE-RSJ Int. Conf. on Intelligent Robots and Systems, pp.668-673, 2005.
URL : https://hal.archives-ouvertes.fr/inria-00182041

P. Dangauthier, P. Bessière, and A. Spalanzani, Auto-supervised learning in the Bayesian Programming Framework, Proceedings of the 2005 IEEE International Conference on Robotics and Automation, 2005.
DOI : 10.1109/ROBOT.2005.1570581

URL : https://hal.archives-ouvertes.fr/hal-00019663

S. Deerwester, S. T. Dumais, G. W. Furnas, T. K. Landauer, and R. Harshman, Indexing by latent semantic analysis, Journal of the American Society for Information Science, vol.41, issue.6, pp.391-407, 1990.
DOI : 10.1002/(SICI)1097-4571(199009)41:6<391::AID-ASI1>3.0.CO;2-9

A. Doucet, N. De-freitas, and N. Gordon, Sequential Monte Carlo methods in practice, 2001.
DOI : 10.1007/978-1-4757-3437-9

L. Demortier, Bayesian Reference Analysis, Statistical Problems in Particle Physics, Astrophysics and Cosmology, p.11, 2006.

S. Deneve, Bayesian inference in spiking neurons, Advances in Neural Information Processing Systems 17, pp.353-360, 2005.

P. Dangauthier, R. Herbrich, T. Minka, and T. Graepel, TrueSkill through time: Revisiting the history of chess, Advances in Neural Information Processing Systems 21, 2007.
URL : https://hal.archives-ouvertes.fr/inria-00174642

T. G. Dietterich, Ensemble methods in machine learning, Lecture Notes in Computer Science, vol.1857, pp.1-15, 2000.

A. P. Dempster, N. M. Laird, and D. B. Rubin, Maximum likelihood from incomplete data via the EM algorithm, Journal of the Royal Statistical Society, Series B (Methodological), vol.39, issue.1, pp.1-38, 1977.

V. Dose, Bayesian inference in physics: case studies, Rep. Prog. Phys., 2003.

P. Domingos and M. J. Pazzani, On the optimality of the simple Bayesian classifier under zero-one loss, Machine Learning, vol.29, issue.2/3, pp.103-130, 1997.
DOI : 10.1023/A:1007413511361

P. J. Davis and P. Rabinowitz, Methods of Numerical Integration, Academic Press, 1984.

A. E. Elo, The Rating of Chessplayers, Past and Present, 1986.

G. Elidan, I. McGraw, and D. Koller, Residual belief propagation: Informed scheduling for asynchronous message passing, Proceedings of the Twenty-second Conference on Uncertainty in AI, 2006.

T. S. Ferguson, A Bayesian Analysis of Some Nonparametric Problems, The Annals of Statistics, vol.1, issue.2, pp.209-230, 1973.
DOI : 10.1214/aos/1176342360

S. E. Fienberg, When did Bayesian inference become "Bayesian"?, Bayesian Analysis, vol.1, issue.1, pp.1-40, 2005.

B. de Finetti, La prévision : ses lois logiques, ses sources subjectives, Annales de l'Institut Henri Poincaré, vol.7, pp.1-68, 1937. English translation by H. E. Kyburg Jr., Foresight: Its Logical Laws, its Subjective Sources, in Studies in Subjective Probability.

R. A. Fisher, Statistical Methods, Experimental Design, and Scientific Inference: A Re-issue of Statistical Methods for Research Workers, The Design of Experiments, and Statistical Methods and Scientific Inference, 1925.

Y. Freund and R. E. Schapire, A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting, European Conference on Computational Learning Theory, pp.23-37, 1995.
DOI : 10.1006/jcss.1997.1504

R. G. Gallager, Low-density parity-check codes, IRE Transactions on Information Theory, vol.8, issue.1, pp.21-28, 1962.
DOI : 10.1109/TIT.1962.1057683

A. Gelman, J. B. Carlin, H. S. Stern, and D. B. Rubin, Bayesian Data Analysis, 2003.

I. Guyon and A. Elisseeff, An introduction to variable and feature selection, Journal of Machine Learning Research, vol.3, pp.1157-1182, 2003.

S. Geman and D. Geman, Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.6, pp.721-741, 1984.

I. Guyon, S. R. Gunn, A. Ben-Hur, and G. Dror, Result analysis of the NIPS 2003 feature selection challenge, NIPS, 2004.

E. E. Ghiselli, Theory of Psychological Measurement, 1964.

M. E. Glickman and A. C. Jones, Rating the chess rating system, Chance, vol.12, pp.21-28, 1999.

M. E. Glickman, Parameter estimation in large dynamic paired comparison experiments, Applied Statistics, vol.48, pp.377-394, 1999.

D. Goldberg, B. Oki, D. Nichols, and D. B. Terry, Using collaborative filtering to weave an information tapestry, Communications of the ACM, vol.35, issue.12, pp.61-70, 1992.
DOI : 10.1145/138859.138867

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.104.3739

H. von Gottschall, Adolf Anderssen, der Altmeister deutscher Schachspielkunst. Sein Leben und Schaffen, 1912.

P. Green, Reversible jump Markov chain Monte Carlo computation and Bayesian model determination, Biometrika, vol.82, issue.4, pp.711-732, 1995.
DOI : 10.1093/biomet/82.4.711

P. Gargallo and P. Sturm, Bayesian 3D Modeling from Images Using Multiple Depth Maps, 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'05), 2005.
DOI : 10.1109/CVPR.2005.84

URL : https://hal.archives-ouvertes.fr/inria-00524394

J. Gill, D. Lee, and D. Walker, Elicited Priors for Bayesian Model Specifications in Political Science Research, The Journal of Politics, vol.67, issue.3, 2005.
DOI : 10.1111/j.1468-2508.2005.00342.x

A. Hájek, Interpretations of Probability, The Stanford Encyclopedia of Philosophy, The Metaphysics Research Lab, 2003.

M. A. Hall, Correlation-based Feature Selection for Machine Learning, 1998.

M. A. Hall, Correlation-based feature selection for discrete and numeric class machine learning, Proc. 17th International Conf. on Machine Learning, pp.359-366, 2000.

W. K. Hastings, Monte Carlo sampling methods using Markov chains and their applications, Biometrika, vol.57, pp.97-109, 1970.

R. Herbrich and T. Graepel, TrueSkill: A Bayesian Skill Rating System, 2006.

D. Heckerman, D. Geiger, and D. M. Chickering, Learning Bayesian networks: The combination of knowledge and statistical data, KDD Workshop, pp.85-96, 1994.

R. Herbrich, T. Graepel, and C. Campbell, Bayes point machines, J. Mach. Learn. Res, vol.1, pp.245-279, 2001.

J. Holland, Adaptation in Natural and Artificial Systems, 1975.

T. Heskes, M. Opper, W. Wiegerinck, O. Winther, and O. Zoeter, Approximate inference techniques with expectation constraints, Journal of Statistical Mechanics: Theory and Experiment, vol.2005, issue.11, 2005.
DOI : 10.1088/1742-5468/2005/11/P11015

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.106.4972

T. Hofmann and J. Puzicha, Latent class models for collaborative filtering, IJCAI '99: Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence, pp.688-693, 1999.

P. Jackson, Introduction to expert systems, 1986.

A. Jakulin, Modelling modelled, Computer and Information Science, 2004.

E. T. Jaynes, Information theory and statistical mechanics, Physical Review, vol.106, issue.4, p.620, 1957.

E. T. Jaynes, Probability in Quantum Theory, in Complexity, Entropy and the Physics of Information, pp.381-403, 1990.

E. T. Jaynes, Probability Theory: The Logic of Science, 2003.

F. V. Jensen, An Introduction to Bayesian Networks; Optimal junction trees, in Uncertainty in Artificial Intelligence, pp.360-366, 1994.

G. H. John, R. Kohavi, and K. Pfleger, Irrelevant Features and the Subset Selection Problem, International Conference on Machine Learning, pp.121-129, 1994.
DOI : 10.1016/B978-1-55860-335-6.50023-4

F. V. Jensen, S. L. Lauritzen, and K. G. Olesen, Bayesian updating in causal probabilistic networks by local computations, Computational Statistics Quarterly, vol.4, pp.269-282, 1990.

R. E. Kalman, A New Approach to Linear Filtering and Prediction Problems, Journal of Basic Engineering, vol.82, issue.1, pp.35-45, 1960.
DOI : 10.1115/1.3662552

G. Karypis, Recommendation Algorithms, Proceedings of the tenth international conference on Information and knowledge management , CIKM'01, pp.247-254, 2001.
DOI : 10.1145/502585.502627

F. R. Kschischang, B. J. Frey, and H.-A. Loeliger, Factor graphs and the sum-product algorithm, IEEE Transactions on Information Theory, vol.47, issue.2, pp.498-519, 2001.
DOI : 10.1109/18.910572

S. Kirkpatrick, C. Gelatt, and M. Vecchi, Optimization by Simulated Annealing, Science, vol.220, issue.4598, pp.671-680, 1983.
DOI : 10.1126/science.220.4598.671

D. E. Knuth, The Art of Computer Programming, Volume 4, Fascicle 2: Generating All Tuples and Permutations, 2005.

C. Koike, Bayesian Approach to Action Selection and Attention Focusing: Application in Autonomous Robot Programming, PhD thesis, Institut National Polytechnique de Grenoble, 2005.
URL : https://hal.archives-ouvertes.fr/tel-00011138

A. N. Kolmogorov, Grundbegriffe der Wahrscheinlichkeitsrechnung, 1933. Second English edition: Foundations of the Theory of Probability.

K. P. Körding and D. M. Wolpert, Bayesian integration in sensorimotor learning, Nature, vol.427, issue.6971, pp.244-247, 2004.
DOI : 10.1038/nature02169

J. R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection (Complex Adaptive Systems), 1992.

R. Kindermann and J. L. Snell, Markov Random Fields and Their Applications, 1980.
DOI : 10.1090/conm/001

D. Koller and M. Sahami, Toward optimal feature selection, International Conference on Machine Learning, pp.284-292, 1996.

J. Kleinberg and M. Sandler, Using mixture models for collaborative filtering, STOC '04: Proceedings of the thirty-sixth annual ACM symposium on Theory of computing, pp.569-578, 2004.

R. E. Kass and L. Wasserman, The Selection of Prior Distributions by Formal Rules, Journal of the American Statistical Association, vol.91, issue.435, pp.1343-1370, 1996.
DOI : 10.1080/01621459.1996.10477003

P.-S. de Laplace, Essai philosophique sur les probabilités, 1814.

[. Lebeltel, P. Bessière, J. Diard, and E. Mazer, Bayesian Robot Programming, Autonomous Robots, vol.16, issue.1, pp.49-79, 2003.
DOI : 10.1023/B:AURO.0000008671.38949.43

URL : https://hal.archives-ouvertes.fr/inria-00189723

J. A. Lasserre, C. M. Bishop, and T. P. Minka, Principled Hybrids of Generative and Discriminative Models, 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Volume 1 (CVPR'06), pp.87-94, 2006.
DOI : 10.1109/CVPR.2006.227

O. Lebeltel, Programmation Bayésienne des Robots, 1999.
DOI : 10.3166/ria.18.261-298

URL : https://hal.inria.fr/inria-00182069/file/BRP_RIA.pdf

P. Leray and O. Francois, Bayesian network structural learning and incomplete data, Proceedings of the International and Interdisciplinary Conference on Adaptive Knowledge Representation and Reasoning (AKRR 2005), pp.33-40, 2005.

P. Leray and P. Gallinari, Feature selection with neural networks, Behaviormetrika, vol.26, issue.1, 1999.
DOI : 10.2333/bhmk.26.145

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.54.4570

R. Le Hy, A. Arrigoni, P. Bessière, and O. Lebeltel, Teaching Bayesian behaviours to video game characters, Robotics and Autonomous Systems, vol.47, pp.177-185, 2004.
URL : https://hal.archives-ouvertes.fr/inria-00182073

T. J. Loredo, Computational technology for Bayesian inference, Astronomical Data Analysis Software and Systems VIII, p.297, 1999.

D. D. Lewis and M. Ringuette, A comparison of two learning algorithms for text categorization, Proceedings of SDAIR-94, 3rd Annual Symposium on Document Analysis and Information Retrieval, pp.81-93, 1994.

H. Liu and R. Setiono, Chi2: Feature selection and discretization of numeric attributes, Proceedings of 7th IEEE Int'l Conference on Tools with Artificial Intelligence, 1995.

D. J. Lunn, A. Thomas, N. Best, and D. Spiegelhalter, WinBUGS -- a Bayesian modelling framework: Concepts, structure, and extensibility, Statistics and Computing, vol.10, issue.4, pp.325-337, 2000.
DOI : 10.1023/A:1008929526011

R. J. McEliece and S. M. Aji, The generalized distributive law, IEEE Transactions on Information Theory, vol.46, pp.325-343, 2000.

J. B. Macqueen, Some methods for classification and analysis of multivariate observations, Proc. of the fifth Berkeley Symposium on Mathematical Statistics and Probability, pp.281-297, 1967.

D. J. Mackay, Ensemble learning and evidence maximization, 1995.

C. D. Manning, Foundations of Statistical Natural Language Processing, 1999.

B. Marlin, Modeling user rating profiles for collaborative filtering, Advances in Neural Information Processing Systems 16, 2004.

P. S. Maybeck, Stochastic Models, Estimation, and Control, Mathematics in Science and Engineering, vol.141, Academic Press, 1979.

T. Minka, The EP energy function and minimization schemes, 2001.

T. Minka, Expectation propagation for approximate Bayesian inference, Proceedings of the 17th Annual Conference on Uncertainty in Artificial Intelligence (UAI-01), pp.362-369, 2001.

T. Minka, A family of algorithms for approximate Bayesian inference, 2001.

T. Minka, Divergence measures and message passing, 2005.

T. M. Mitchell, Machine Learning, 1997.

D. Margaritis and S. Thrun, A Bayesian multiresolution independence test for continuous variables, Uncertainty in Artificial Intelligence: Proceedings of the Seventeenth Conference (UAI-2001), pp.346-353, 2001.

K. Murphy, Y. Weiss, and M. Jordan, Loopy belief propagation for approximate inference: An empirical study, Proceedings of the 15th Annual Conference on Uncertainty in Artificial Intelligence (UAI-99), 1999.

R. M. Neal, Probabilistic inference using Markov chain Monte Carlo methods, 1993.

R. M. Neal, Bayesian Learning for Neural Networks, 1996.

R. M. Neal, Defining priors for distributions using Dirichlet diffusion trees, 2001.

P. Narendra and K. Fukunaga, A Branch and Bound Algorithm for Feature Subset Selection, IEEE Transactions on Computers, pp.917-922, 1977.
DOI : 10.1109/TC.1977.1674939

N. M. Oliver, B. Rosario, and A. P. Pentland, A Bayesian computer vision system for modeling human interactions, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.22, issue.8, pp.831-843, 2000.

A. Paterek, Improving regularized singular value decomposition for collaborative filtering, Proceedings of KDD Cup and Workshop, 2007.

A. Pohorille and E. Darve, A Bayesian Approach to Calculating Free Energies in Chemical and Biological Systems, AIP Conference Proceedings, pp.23-30, 2006.
DOI : 10.1063/1.2423257

J. Pearl, Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference, 1988.

C. Pradalier, J. Hermosillo, C. Koike, C. Braillon, P. Bessière, and C. Laugier, The CyCab: a car-like robot navigating autonomously and safely among pedestrians, Robotics and Autonomous Systems, vol.50, issue.1, pp.51-68, 2005.
DOI : 10.1016/j.robot.2004.10.002

URL : https://hal.archives-ouvertes.fr/inria-00182049

H. Poincaré, Calcul des probabilités, 1912.

Y. Qi and T. Minka, Expectation propagation for signal detection in flat-fading channels, Proceedings of IEEE International Symposium on Information Theory, 2003.

J. R. Quinlan, C4.5: Programs for Machine Learning, 1993.

C. E. Rasmussen, The infinite Gaussian mixture model, Advances in Neural Information Processing Systems 12, pp.554-560, 2000.

P. Resnick, N. Iacovou, M. Suchak, P. Bergstrom, and J. Riedl, GroupLens: An open architecture for collaborative filtering of netnews, Proceedings of the 1994 ACM Conference on Computer Supported Cooperative Work, CSCW '94, pp.175-186, 1994.
DOI : 10.1145/192844.192905

O. Ritthoff, R. Klinkenberg, S. Fischer, and I. Mierswa, A hybrid approach to feature selection and generation using an evolutionary algorithm, Collaborative Research Center, vol.531, 2002.

B. Runyan, World Football Elo Ratings, 1997.

C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning), 2005.

L. J. Savage, The Foundations of Statistics, 1954.

G. E. Schwarz, Estimating the Dimension of a Model, The Annals of Statistics, vol.6, issue.2, pp.461-464, 1978.
DOI : 10.1214/aos/1176344136

M. Sahami, S. Dumais, D. Heckerman, and E. Horvitz, A Bayesian approach to filtering junk e-mail, Learning for Text Categorization: Papers from the 1998 Workshop, 1998.

G. Shakhnarovich, T. Darrell, and P. Indyk, Nearest-Neighbor Methods in Learning and Vision: Theory and Practice (Neural Information Processing), 2006.

J. Skilling, The Axioms of Maximum Entropy, in Maximum-Entropy and Bayesian Methods in Science and Engineering, 1988.

U. Shardanand and P. Maes, Social information filtering, Proceedings of the SIGCHI conference on Human factors in computing systems, CHI '95, pp.210-217, 1995.
DOI : 10.1145/223904.223931

H. Snoussi and A. Mohammad-djafari, Information geometry and prior selection, AIP Conference Proceedings, pp.307-327, 2003.
DOI : 10.1063/1.1570549

URL : http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.13.7222

G. K. Smyth, Linear models and empirical Bayes methods for assessing differential expression in microarray experiments, Statistical Applications in Genetics and Molecular Biology, vol.3, 2004.

A. Spalanzani, Algorithmes évolutionnaires pour l'étude de la robustesse des systèmes de reconnaissance de la parole, 1999.

A. R. Syversveen, Noninformative Bayesian priors: interpretation and problems with construction and applications, 1998.

S. Thrun, W. Burgard, and D. Fox, Probabilistic Robotics (Intelligent Robotics and Autonomous Agents), 2005.

M. Tipping, The relevance vector machine, Advances in Neural Information Processing Systems, 2000.

C. Tay, K. Mekhnacha, C. Chen, M. Yguel, and C. Laugier, An efficient formulation of the Bayesian occupation filter for target tracking in dynamic environments, International Journal of Vehicle Autonomous Systems, vol.6, issue.1/2, 2007.
DOI : 10.1504/IJVAS.2008.016483

URL : https://hal.archives-ouvertes.fr/inria-00182089

J. W. Cooley and J. W. Tukey, An algorithm for the machine calculation of complex Fourier series, Mathematics of Computation, vol.19, pp.297-301, 1965.

J. Uffink, The constraint rule of the maximum entropy principle, Studies in History and Philosophy of Modern Physics, pp.47-79, 1995.

L. G. Valiant, A theory of the learnable, STOC '84: Proceedings of the Sixteenth Annual ACM Symposium on Theory of Computing, pp.436-445, 1984.

V. N. Vapnik, The Nature of Statistical Learning Theory (Information Science and Statistics), 1999.

R. Vines, R. F. Evilia, and S. L. Whittenburg, Bayesian analysis investigation of chemical exchange above and below the coalescence point, The Journal of Physical Chemistry, vol.97, issue.19, pp.4941-4944, 1993.
DOI : 10.1021/j100121a013

A. J. Viterbi, Error bounds for convolutional codes and an asymptotically optimum decoding algorithm, IEEE Transactions on Information Theory, vol.13, issue.2, pp.260-269, 1967.
DOI : 10.1109/TIT.1967.1054010

M. G. Vozalis and K. G. Margaritis, Applying SVD on item-based filtering, ISDA '05: Proceedings of the 5th International Conference on Intelligent Systems Design and Applications, pp.464-469, 2005.

J. Walsh, Dual optimality frameworks for expectation propagation, IEEE Conference on Signal Processing Advances in Wireless Communications (SPAWC), 2006.

J. Winn and C. Bishop, Variational message passing, Journal of Machine Learning Research, vol.6, pp.661-694, 2005.

S. Wu, D. Chen, M. Niranjan, and S.-i. Amari, Sequential Bayesian Decoding with a Population of Neurons, Neural Computation, vol.15, issue.5, pp.993-1012, 2003.
DOI : 10.1162/089976698300017818

B. Webb, Netflix update: Try this at home. http://sifter.org/~simon, 2006.

D. J. Wilkinson, Bayesian methods in bioinformatics and computational systems biology, Brief Bioinform, 2007.

J. Weng, C. Miao, A. Goh, Z. Shen, and R. Gay, Trust-based agent community for collaborative recommendation, Proceedings of the Fifth International Joint Conference on Autonomous Agents and Multiagent Systems, AAMAS '06, pp.1260-1262, 2006.
DOI : 10.1145/1160633.1160860

J. Weng, J. McClelland, A. Pentland, O. Sporns et al., Autonomous mental development by robots and animals, Science, vol.291, pp.599-600, 2001.

M. Welling, T. Minka, and Y. W. Teh, Structured region graphs: Morphing EP into GBP, Proceedings of the 21st Annual Conference on Uncertainty in Artificial Intelligence (UAI-05), pp.609-616, 2005.

D. R. Wolf, Mutual Information as a Bayesian Measure of Independence, arXiv preprint 9511002, 1994.

B. L. Welch and H. W. Peers, On formulae for confidence points based on integrals of weighted likelihoods, Journal of the Royal Statistical Society, Series B, vol.25, pp.318-329, 1963.

J. S. Yedidia, W. T. Freeman, and Y. Weiss, Generalized belief propagation, NIPS, pp.689-695, 2000.

J. S. Yedidia, W. T. Freeman, and Y. Weiss, Understanding belief propagation and its generalizations, pp.239-269, 2003.

J. Yang and V. Honavar, Feature subset selection using a genetic algorithm, IEEE Intelligent Systems, vol.13, issue.2, pp.44-49, 1998.
DOI : 10.1109/5254.671091

S. Youssef, Physics with exotic probability theory, ArXiv High Energy Physics - Theory e-prints, 2001.

A. L. Yuille, S. M. Smirnakis, and L. Xu, Bayesian self-organization, Advances in Neural Information Processing Systems, pp.1001-1008, 1994.

O. Zoeter and T. Heskes, Gaussian quadrature based expectation propagation, AISTATS, 2005.

H. Zhu and R. Rohwer, Information geometric measurements of generalization, Dept. Comp. Sci. and Appl. Math, 1995.