Matches in SemOpenAlex for { <https://semopenalex.org/work/W3209305110> ?p ?o ?g. }
Showing items 1 to 57 of 57, with 100 items per page.
- W3209305110 abstract "Deep learning algorithms allow computers to perform cognitive tasks ranging from vision to natural language processing with performance comparable to that of humans. Although these algorithms are conceptually inspired by the brain, their energy consumption is orders of magnitude higher. The reasons for this high energy consumption are both architectural and algorithmic. The architecture of computers physically separates the processor from the memory where data is stored. This separation causes particularly intense and energy-hungry data movement for machine learning algorithms, limiting on-board and low-energy-budget applications. One solution is to create new neuromorphic architectures where the memory is as close as possible to the computation units. However, existing learning algorithms have limitations that make their implementation on neuromorphic chips difficult. In particular, the algorithmic limitations at the heart of this thesis are catastrophic forgetting and non-local credit assignment. Catastrophic forgetting is the inability to maintain the performance of a neural network when a new task is learned. Credit assignment in neural networks is performed by backpropagation. Although efficient, this algorithm is challenging to implement on a neuromorphic chip because it requires two distinct types of computation. These concepts are presented in detail in Chapter 1 of this thesis. Chapter 2 presents an algorithm inspired by synaptic metaplasticity to reduce catastrophic forgetting in binarized neural networks. Binarized neural networks are artificial neural networks with binary weights and activations, which makes them attractive for neuromorphic applications. Training binarized synaptic weights requires hidden variables whose meaning is poorly understood. We show that these hidden variables can be used to consolidate important synapses. The presented consolidation rule is local to the synapse, while being as effective as an established continual learning method from the literature. Chapter 3 deals with the local estimation of the gradient for training. Equilibrium Propagation is a learning algorithm that requires only one type of computation to estimate the gradient. However, scaling it up to complex tasks and deep architectures remained to be demonstrated. In this chapter, resulting from a collaboration with Mila, we show that a bias in the estimation of the gradient is responsible for this limitation, and we propose a new unbiased estimator that allows Equilibrium Propagation to scale up. We also show how to adapt the algorithm to optimize the cross-entropy loss instead of the quadratic cost. Finally, we study the case where synaptic connections are asymmetric. These results show that Equilibrium Propagation is a promising algorithm for on-chip learning. Finally, in Chapter 4, in collaboration with Aix-Marseille University and CEA-Leti in Grenoble, we present an architecture that implements ternary synapses using resistive memories based on hafnium oxide. We adapt a circuit originally intended to implement a binarized neural network, showing that a third synaptic weight value can be encoded by exploiting the low-supply-voltage regime, which is particularly suitable for on-board applications. The results presented in this thesis show that the joint design of algorithms and computational architectures is crucial for neuromorphic applications." @default. (Code sketches of the consolidation rule and the gradient estimator described here follow this listing.)
- W3209305110 created "2021-11-08" @default.
- W3209305110 creator A5077172922 @default.
- W3209305110 date "2021-10-06" @default.
- W3209305110 modified "2023-09-26" @default.
- W3209305110 title "Bio-inspired continual learning and credit assignment for neuromorphic computing" @default.
- W3209305110 hasPublicationYear "2021" @default.
- W3209305110 type Work @default.
- W3209305110 sameAs 3209305110 @default.
- W3209305110 citedByCount "0" @default.
- W3209305110 crossrefType "dissertation" @default.
- W3209305110 hasAuthorship W3209305110A5077172922 @default.
- W3209305110 hasConcept C108583219 @default.
- W3209305110 hasConcept C119857082 @default.
- W3209305110 hasConcept C138885662 @default.
- W3209305110 hasConcept C151927369 @default.
- W3209305110 hasConcept C154945302 @default.
- W3209305110 hasConcept C41008148 @default.
- W3209305110 hasConcept C41895202 @default.
- W3209305110 hasConcept C50644808 @default.
- W3209305110 hasConcept C7149132 @default.
- W3209305110 hasConceptScore W3209305110C108583219 @default.
- W3209305110 hasConceptScore W3209305110C119857082 @default.
- W3209305110 hasConceptScore W3209305110C138885662 @default.
- W3209305110 hasConceptScore W3209305110C151927369 @default.
- W3209305110 hasConceptScore W3209305110C154945302 @default.
- W3209305110 hasConceptScore W3209305110C41008148 @default.
- W3209305110 hasConceptScore W3209305110C41895202 @default.
- W3209305110 hasConceptScore W3209305110C50644808 @default.
- W3209305110 hasConceptScore W3209305110C7149132 @default.
- W3209305110 hasLocation W32093051101 @default.
- W3209305110 hasOpenAccess W3209305110 @default.
- W3209305110 hasPrimaryLocation W32093051101 @default.
- W3209305110 hasRelatedWork W1486687522 @default.
- W3209305110 hasRelatedWork W2004369097 @default.
- W3209305110 hasRelatedWork W2117933835 @default.
- W3209305110 hasRelatedWork W2170345111 @default.
- W3209305110 hasRelatedWork W2204252256 @default.
- W3209305110 hasRelatedWork W2263490141 @default.
- W3209305110 hasRelatedWork W2355715145 @default.
- W3209305110 hasRelatedWork W2761476586 @default.
- W3209305110 hasRelatedWork W2786465559 @default.
- W3209305110 hasRelatedWork W2897449544 @default.
- W3209305110 hasRelatedWork W2920039561 @default.
- W3209305110 hasRelatedWork W2943535966 @default.
- W3209305110 hasRelatedWork W3006378306 @default.
- W3209305110 hasRelatedWork W3008314020 @default.
- W3209305110 hasRelatedWork W3093146707 @default.
- W3209305110 hasRelatedWork W3104581290 @default.
- W3209305110 hasRelatedWork W3126611227 @default.
- W3209305110 hasRelatedWork W3132798516 @default.
- W3209305110 hasRelatedWork W3138564770 @default.
- W3209305110 hasRelatedWork W2302217834 @default.
- W3209305110 isParatext "false" @default.
- W3209305110 isRetracted "false" @default.
- W3209305110 magId "3209305110" @default.
- W3209305110 workType "dissertation" @default.
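
The abstract's Chapter 2 describes a consolidation rule in which the hidden real-valued variables behind binary weights gate their own updates. Below is a minimal sketch of one such metaplasticity-inspired rule in Python with NumPy: updates that would push a hidden weight back toward zero (toward a sign flip) are attenuated the more the synapse has already been consolidated. The function names, the tanh-shaped attenuation, and the strength parameter `m` are illustrative assumptions, not necessarily the exact rule from the thesis.

```python
import numpy as np

def binarize(w_hidden):
    """The binary weight actually used in the forward and backward passes."""
    return np.where(w_hidden >= 0, 1.0, -1.0)

def metaplastic_update(w_hidden, grad, lr=0.01, m=1.0):
    """One SGD step on the hidden variables behind binarized synapses.

    Updates that oppose the sign of the hidden weight (i.e. that would
    eventually flip the binary weight) are scaled down by a factor that
    decays with the hidden weight's magnitude, so strongly consolidated
    synapses become hard to flip, while reinforcing updates pass unchanged.
    """
    step = -lr * grad
    weakening = np.sign(step) != np.sign(w_hidden)
    f_meta = 1.0 - np.tanh(m * np.abs(w_hidden)) ** 2  # attenuation in (0, 1]
    return w_hidden + np.where(weakening, f_meta, 1.0) * step
```

The rule is local in the sense the abstract emphasizes: the attenuation factor depends only on the synapse's own hidden variable, with no task-level statistics to store or transport, which is what makes it amenable to on-chip implementation.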
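
The abstract's Chapter 3 attributes Equilibrium Propagation's scaling problems to a biased gradient estimate. A common way to reduce that bias is a symmetric (two-sided) estimator that nudges the output with both +β and −β. The sketch below demonstrates this on a deliberately tiny two-unit energy-based model and checks the estimate against finite differences; the energy function, relaxation schedule, and helper names are assumptions made for this example, not the thesis's setup.

```python
import numpy as np

def energy_grad_s(s, w, x):
    """dE/ds for E(s) = 0.5*||s||^2 - w1*x*s1 - w2*s1*s2 (two units)."""
    s1, s2 = s
    w1, w2 = w
    return np.array([s1 - w1 * x - w2 * s2, s2 - w2 * s1])

def dE_dw(s, w, x):
    """dE/dw at a given state; this is what Equilibrium Propagation reads out."""
    s1, s2 = s
    return np.array([-s1 * x, -s2 * s1])

def relax(w, x, y, beta, steps=2000, dt=0.05):
    """Gradient flow on the total energy E + beta*C, with C = 0.5*(s2 - y)^2."""
    s = np.zeros(2)
    for _ in range(steps):
        g = energy_grad_s(s, w, x)
        g[1] += beta * (s[1] - y)  # nudging force on the output unit
        s -= dt * g
    return s

def ep_gradient_symmetric(w, x, y, beta=0.1):
    """Two-sided estimate: relax with +beta and -beta, then difference."""
    s_pos = relax(w, x, y, +beta)
    s_neg = relax(w, x, y, -beta)
    return (dE_dw(s_pos, w, x) - dE_dw(s_neg, w, x)) / (2 * beta)

def loss(w, x, y):
    """Loss at the free (beta = 0) equilibrium, for the reference gradient."""
    s = relax(w, x, y, beta=0.0)
    return 0.5 * (s[1] - y) ** 2

w, x, y = np.array([0.5, 0.3]), 1.0, 0.5
eps = 1e-4
g_fd = np.array([(loss(w + eps * e, x, y) - loss(w - eps * e, x, y)) / (2 * eps)
                 for e in np.eye(2)])
print(ep_gradient_symmetric(w, x, y))  # should closely match g_fd
print(g_fd)
```

The one-sided estimate (difference between the +β and free states, divided by β) carries an O(β) bias; the symmetric version cancels the first-order term, which is the flavor of bias reduction the abstract refers to.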