Bayesian neuromorphic computing based on resistive memory

Abstract: Artificial intelligence is a field that, historically, has benefited from the combination of ideas from across inter-disciplinary boundaries, which have improved models of AI and the algorithms that operate on them. Biological nervous systems in particular have inspired various model topologies and algorithmic tricks that have led to leaps in performance. In contrast, as computing power and memory availability have increased relentlessly since the 1950s, models of artificial intelligence have largely failed to recognise the constraints imposed by, or to incorporate the opportunities offered by, the underlying computing hardware. While this is not immediately apparent in the cloud computing setting, the mismatch between model, algorithm and hardware is the limiting factor that currently curtails the efficient application of locally-adaptive artificially intelligent systems at the edge. In this thesis, the interdisciplinary boundary between machine learning, emerging technologies and biological nervous systems will be explored with the objective of proposing a new, hardware-focussed approach for the application of energy-efficient and locally-adaptive edge neuromorphic computing systems.

Resistive memories (RRAM) are a leading candidate as an enabling technology for AI to greatly reduce its energy requirements. This is largely owing to the efficient and parallelised implementation of the dot-product operation that pervades machine learning, as well as to the material-level compatibility of these devices with advanced CMOS processes. However, until now, the application of RRAM has been confined predominantly to implementations of gradient-based machine learning algorithms, namely backpropagation, to train RRAM-based multi-layer perceptron models. Yet the fundamental properties of RRAM, chiefly its conductance variability, are not compatible with learning algorithms based on the descent of error gradients.
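The parallelised dot-product mentioned above can be pictured with a brief sketch (an illustrative model, not taken from the thesis): in an idealised RRAM crossbar, each device conductance encodes a weight, and applying voltages to the rows produces column currents given by Ohm's and Kirchhoff's laws, i.e. an analogue matrix-vector multiplication.

```python
import numpy as np

# Idealised crossbar model (hypothetical illustration): conductances
# G[i, j] (siemens) encode weights; row voltages V (volts) produce
# column currents I_j = sum_i G[i, j] * V[i] (amperes).

def crossbar_dot_product(conductances, voltages):
    """Ideal crossbar read-out: I = G^T V."""
    return conductances.T @ voltages

# 2 input rows, 3 output columns
G = np.array([[1e-6, 2e-6, 3e-6],
              [4e-6, 5e-6, 6e-6]])
V = np.array([0.1, 0.2])

I = crossbar_dot_product(G, V)
```

In a physical array this multiply-accumulate happens in a single step across all columns at once, which is the source of the energy and latency advantage the abstract alludes to; device non-idealities (variability, line resistance) are ignored in this sketch.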
This thesis recognises that, in contrast, the intrinsic properties of this technology can be harnessed through Bayesian approaches to machine learning where, like device conductance states, model parameters are described as random variables. Markov chain Monte Carlo (MCMC) sampling algorithms are implemented with RRAM and applied to the training of RRAM-based models, and RRAM-based computing hardware capable of supporting such models is also proposed.

Inspired by the organisational principles of animal nervous systems, whereby memory and processing are distributed and arguably indistinguishable, this thesis proposes analogue circuit solutions for biological models of neurons and synapses and for a system-level architecture to interconnect such elements. Reflecting the role played by ion channels embedded in biological neuronal membranes, these circuits co-localise memory and computation by incorporating resistive memory devices directly into the circuits themselves, determining model parameters and the interconnectivity between these elements locally. Relative to similar approaches, this obviates the need for volatile on-chip working memory and for analogue-to-digital conversion, both of which entail significant energy demands. In recognition of the efficient solutions animals like insects have uncovered throughout the course of evolution, their nervous systems are used to guide the development of model architectures. Based on recent neurophysiological studies, models inspired by the cricket cercal system and the fruit fly motion detection system are proposed. To achieve performance equivalent to the cercal system model, multi-layer perceptrons require between one and two orders of magnitude more memory elements; the compactness of these bio-inspired models thus offers a means of scaling the proposed MCMC sampling algorithms to more complex tasks.
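To make the Bayesian framing concrete, the following is a generic Metropolis-Hastings loop (a textbook sketch, not the thesis's RRAM-based implementation): parameters are treated as random variables and samples are drawn from a posterior, here an unnormalised standard normal chosen purely for illustration.

```python
import numpy as np

# Generic Metropolis-Hastings sampler (illustrative sketch; the thesis
# implements MCMC with RRAM devices, not in software like this).

def log_target(theta):
    # Log of an unnormalised N(0, 1) density, standing in for a posterior
    return -0.5 * theta ** 2

def metropolis_hastings(n_samples, step=0.5, seed=0):
    rng = np.random.default_rng(seed)
    theta = 0.0
    samples = []
    for _ in range(n_samples):
        proposal = theta + rng.normal(0.0, step)  # random-walk proposal
        # Accept with probability min(1, p(proposal) / p(theta))
        if np.log(rng.uniform()) < log_target(proposal) - log_target(theta):
            theta = proposal
        samples.append(theta)
    return np.array(samples)

samples = metropolis_hastings(20000)
```

The resulting chain of samples approximates the target distribution; in the thesis's setting, the randomness that drives such sampling can be supplied by the devices' intrinsic conductance variability rather than a software random number generator.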
It is also discussed how the ‘small-worldness’ of networks of neurons found in animal nervous systems can provide a solution to the spatial connectivity constraints inherent to the proposed RRAM-based computing fabric.
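The small-world property referred to above can be sketched with a minimal Watts-Strogatz-style construction (an illustrative example; the thesis does not prescribe this particular procedure): start from a ring lattice where each node connects to its k nearest neighbours, then rewire each edge with probability p to a random target, yielding mostly short local wiring with a few long-range shortcuts.

```python
import random

# Minimal Watts-Strogatz-style small-world graph (hypothetical sketch).
# Local edges dominate (cheap to route on a physical fabric), while a
# few rewired long-range edges keep the network's path lengths short.

def small_world_edges(n, k, p, seed=0):
    rng = random.Random(seed)
    edges = set()
    for i in range(n):
        for j in range(1, k // 2 + 1):
            a, b = i, (i + j) % n  # ring-lattice neighbour
            if rng.random() < p:   # rewire to a random distinct target
                b = rng.randrange(n)
                while b == a or (min(a, b), max(a, b)) in edges:
                    b = rng.randrange(n)
            edges.add((min(a, b), max(a, b)))
    return edges

edges = small_world_edges(n=20, k=4, p=0.1)
```

The relevance to an RRAM computing fabric is that physical wiring cost grows with connection distance, so a topology that is mostly local yet retains short global path lengths fits the spatial connectivity constraints described above.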
Submitted on: Monday, December 6, 2021
Last modified on: Monday, December 6, 2021
Version validated by the jury (STAR)

  • HAL Id: tel-03466542, version 1
Thomas Dalgaty. Bayesian neuromorphic computing based on resistive memory. Micro and nanotechnologies/Microelectronics. Université Grenoble Alpes [2020-..], 2020. English. ⟨NNT : 2020GRALT087⟩. ⟨tel-03466542⟩


