
Geometric Methods in Learning and Memory

Abstract: This thesis is devoted to geometric methods in optimization, learning, and neural networks. Many problems of supervised and unsupervised learning, pattern recognition, and clustering require taking into account the intrinsic structure of the underlying space, which is not necessarily Euclidean. For Riemannian manifolds we construct computational algorithms for the Newton method, conjugate-gradient methods, and some non-smooth optimization methods such as the r-algorithm. For this purpose we develop methods for geodesic calculation in submanifolds based on the Hamilton equations and symplectic integration. We then construct a new type of neural associative memory capable of unsupervised learning and clustering; its learning is based on generalized averaging over Grassmann manifolds. A further extension of this memory involves implicit space transformation and kernel machines. We also consider geometric algorithms for signal processing and adaptive filtering. The proposed methods are tested on academic examples as well as on real-life problems of image recognition and signal processing. The application of the proposed neural networks is demonstrated on a complete real-life project of chemical image recognition (an electronic nose).
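The idea of computing geodesics via Hamilton equations and symplectic integration can be sketched as follows. This is a minimal illustration, not the thesis's actual algorithm: geodesic flow on a submanifold is written as a constrained Hamiltonian system with H(q, p) = ½|p|² and the holonomic constraint |q| = 1, and integrated with the symplectic RATTLE scheme. The unit sphere is chosen only because its exact geodesics (great circles) are known for comparison; the function names are illustrative.

```python
import numpy as np

def rattle_sphere_step(q, p, h):
    """One RATTLE step for the geodesic Hamiltonian flow on the unit sphere.

    H(q, p) = 0.5 * |p|^2 restricted to the constraint |q| = 1; the
    constrained flow traces great circles at constant speed |p|.
    """
    u = q + h * p
    uq = np.dot(u, q)
    # Lagrange multiplier enforcing |q + h*(p - lam*q)| = 1 (small root, O(h)).
    lam = (uq - np.sqrt(uq * uq - (np.dot(u, u) - 1.0))) / h
    p_half = p - lam * q
    q_new = u - h * lam * q
    # Project the momentum back onto the tangent space: q_new . p_new = 0.
    mu = np.dot(q_new, p_half) / np.dot(q_new, q_new)
    return q_new, p_half - mu * q_new

def sphere_geodesic(q0, p0, t, n_steps=1000):
    """Integrate the geodesic starting at q0 with initial velocity p0 up to time t."""
    q, p = np.asarray(q0, float), np.asarray(p0, float)
    h = t / n_steps
    for _ in range(n_steps):
        q, p = rattle_sphere_step(q, p, h)
    return q, p

# Great circle from the north pole heading along the x-axis; the exact
# geodesic is q(t) = cos(t)*q0 + sin(t)*v0 for a unit-speed start.
q0 = np.array([0.0, 0.0, 1.0])
v0 = np.array([1.0, 0.0, 0.0])
q1, p1 = sphere_geodesic(q0, v0, t=1.0)
```

Because RATTLE enforces the constraint exactly at every step, the computed point stays on the manifold to machine precision, while the symplectic structure keeps the speed |p| nearly constant over long integrations — the property that motivates symplectic schemes over generic ODE solvers for geodesic computation.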

Cited literature: 67 references
Contributor: Dimitri Novytskyi
Submitted on: Thursday, June 5, 2008 - 10:11:37 PM
Last modification on: Friday, January 10, 2020 - 9:08:06 PM
Long-term archiving on: Friday, November 25, 2016 - 10:57:25 PM


  • HAL Id: tel-00285602, version 1


Dimitri Novytskyi. Geometric Methods in Learning and Memory. Mathematics [math]. Université Paul Sabatier - Toulouse III, 2007. English. ⟨tel-00285602⟩


