Matches in SemOpenAlex for { <https://semopenalex.org/work/W3186703306> ?p ?o ?g. }
- W3186703306 endingPage "4796" @default.
- W3186703306 startingPage "4782" @default.
- W3186703306 abstract "Although spiking neural networks (SNNs) benefit from bioplausible neural modeling, the low accuracy achieved under the common local synaptic plasticity learning rules limits their application in many practical tasks. Recently, an emerging SNN supervised learning algorithm inspired by backpropagation through time (BPTT) from the domain of artificial neural networks (ANNs) has successfully boosted the accuracy of SNNs and helped improve their practicability. However, current general-purpose processors suffer from low efficiency when performing BPTT for SNNs due to their ANN-tailored optimizations. On the other hand, current neuromorphic chips cannot support BPTT because they mainly adopt local synaptic plasticity rules for simplified implementation. In this work, we propose H2Learn, a novel architecture that achieves high efficiency for BPTT-based SNN learning while ensuring high accuracy of SNNs. We first characterize the behaviors of BPTT-based SNN learning. Benefiting from the binary spike-based computation in the forward pass and weight update, we design look-up table (LUT)-based processing elements in the forward engine and weight update engine to make accumulations implicit and to fuse the computations of multiple input points. Second, benefiting from the rich sparsity in the backward pass, we design a dual-sparsity-aware backward engine that exploits both input and output sparsity. Finally, we apply a pipeline optimization between the different engines to build an end-to-end solution for BPTT-based SNN learning.
Compared with the modern NVIDIA V100 GPU, H2Learn achieves $7.38\times$ area saving, $5.74$-$10.20\times$ speedup, and $5.25$-$7.12\times$ energy saving on several benchmark datasets." @default.
- W3186703306 created "2021-08-02" @default.
- W3186703306 creator A5003935082 @default.
- W3186703306 creator A5018970859 @default.
- W3186703306 creator A5037216840 @default.
- W3186703306 creator A5038947607 @default.
- W3186703306 creator A5042532963 @default.
- W3186703306 creator A5047545398 @default.
- W3186703306 creator A5061225111 @default.
- W3186703306 creator A5064892269 @default.
- W3186703306 creator A5071458554 @default.
- W3186703306 date "2022-11-01" @default.
- W3186703306 modified "2023-09-24" @default.
- W3186703306 title "H2Learn: High-Efficiency Learning Accelerator for High-Accuracy Spiking Neural Networks" @default.
- W3186703306 cites W1486852018 @default.
- W3186703306 cites W1570411240 @default.
- W3186703306 cites W1781788132 @default.
- W3186703306 cites W1975398991 @default.
- W3186703306 cites W2016574277 @default.
- W3186703306 cites W2069552454 @default.
- W3186703306 cites W2108598243 @default.
- W3186703306 cites W2110137080 @default.
- W3186703306 cites W2112796928 @default.
- W3186703306 cites W2130360162 @default.
- W3186703306 cites W2138913040 @default.
- W3186703306 cites W2145339207 @default.
- W3186703306 cites W2158083362 @default.
- W3186703306 cites W2194775991 @default.
- W3186703306 cites W2341732087 @default.
- W3186703306 cites W2513853720 @default.
- W3186703306 cites W2619510810 @default.
- W3186703306 cites W2621826044 @default.
- W3186703306 cites W2740220207 @default.
- W3186703306 cites W2745005623 @default.
- W3186703306 cites W2749476078 @default.
- W3186703306 cites W2779025322 @default.
- W3186703306 cites W2783525259 @default.
- W3186703306 cites W2796323669 @default.
- W3186703306 cites W2801844931 @default.
- W3186703306 cites W2805588108 @default.
- W3186703306 cites W2888888896 @default.
- W3186703306 cites W2910507280 @default.
- W3186703306 cites W2919115771 @default.
- W3186703306 cites W2919634026 @default.
- W3186703306 cites W2922004937 @default.
- W3186703306 cites W2943288550 @default.
- W3186703306 cites W2962804204 @default.
- W3186703306 cites W2963355447 @default.
- W3186703306 cites W2964304804 @default.
- W3186703306 cites W2964338223 @default.
- W3186703306 cites W2966081953 @default.
- W3186703306 cites W2966513546 @default.
- W3186703306 cites W2979689765 @default.
- W3186703306 cites W2988246957 @default.
- W3186703306 cites W2989431475 @default.
- W3186703306 cites W2994470909 @default.
- W3186703306 cites W3006426821 @default.
- W3186703306 cites W3008199512 @default.
- W3186703306 cites W3016391357 @default.
- W3186703306 cites W3016542674 @default.
- W3186703306 cites W3018523095 @default.
- W3186703306 cites W3036450648 @default.
- W3186703306 cites W3042725081 @default.
- W3186703306 cites W3065747603 @default.
- W3186703306 cites W3091592563 @default.
- W3186703306 cites W3105841399 @default.
- W3186703306 cites W3124546450 @default.
- W3186703306 cites W3129643976 @default.
- W3186703306 cites W3136154048 @default.
- W3186703306 cites W3148444620 @default.
- W3186703306 cites W4231081240 @default.
- W3186703306 cites W4280564564 @default.
- W3186703306 doi "https://doi.org/10.1109/tcad.2021.3138347" @default.
- W3186703306 hasPublicationYear "2022" @default.
- W3186703306 type Work @default.
- W3186703306 sameAs 3186703306 @default.
- W3186703306 citedByCount "5" @default.
- W3186703306 countsByYear W31867033062022 @default.
- W3186703306 countsByYear W31867033062023 @default.
- W3186703306 crossrefType "journal-article" @default.
- W3186703306 hasAuthorship W3186703306A5003935082 @default.
- W3186703306 hasAuthorship W3186703306A5018970859 @default.
- W3186703306 hasAuthorship W3186703306A5037216840 @default.
- W3186703306 hasAuthorship W3186703306A5038947607 @default.
- W3186703306 hasAuthorship W3186703306A5042532963 @default.
- W3186703306 hasAuthorship W3186703306A5047545398 @default.
- W3186703306 hasAuthorship W3186703306A5061225111 @default.
- W3186703306 hasAuthorship W3186703306A5064892269 @default.
- W3186703306 hasAuthorship W3186703306A5071458554 @default.
- W3186703306 hasBestOaLocation W31867033062 @default.
- W3186703306 hasConcept C113775141 @default.
- W3186703306 hasConcept C11413529 @default.
- W3186703306 hasConcept C11731999 @default.
- W3186703306 hasConcept C119857082 @default.
- W3186703306 hasConcept C134835016 @default.
- W3186703306 hasConcept C154945302 @default.
- W3186703306 hasConcept C155032097 @default.
- W3186703306 hasConcept C199360897 @default.