Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285247025> ?p ?o ?g. }
- W4285247025 endingPage "60764" @default.
- W4285247025 startingPage "60738" @default.
- W4285247025 abstract "The field of Deep Learning (DL) has seen a remarkable series of developments with increasingly accurate and robust algorithms. However, the increase in performance has been accompanied by an increase in the parameters, complexity, and training and inference time of the models, which means that we are rapidly reaching a point where DL may no longer be feasible. On the other hand, some specific applications need to be carefully considered when developing DL models due to hardware limitations or power requirements. In this context, there is a growing interest in efficient DL algorithms, with Spiking Neural Networks (SNNs) being one of the most promising paradigms. Due to the inherent asynchrony and sparseness of spike trains, these types of networks have the potential to reduce power consumption while maintaining relatively good performance. This is attractive for efficient DL and if successful, could replace traditional Artificial Neural Networks (ANNs) in many applications. However, despite significant progress, the performance of SNNs on benchmark datasets is often lower than that of traditional ANNs. Moreover, due to the non-differentiable nature of their activation functions, it is difficult to train SNNs with direct backpropagation, so appropriate training strategies must be found. Nevertheless, significant efforts have been made to develop competitive models. This survey covers the main ideas behind SNNs and reviews recent trends in learning rules and network architectures, with a particular focus on biologically inspired strategies. It also provides some practical considerations of state-of-the-art SNNs and discusses relevant research opportunities." @default.
- W4285247025 created "2022-07-14" @default.
- W4285247025 creator A5009127277 @default.
- W4285247025 creator A5013827007 @default.
- W4285247025 creator A5066469570 @default.
- W4285247025 creator A5087373569 @default.
- W4285247025 date "2022-01-01" @default.
- W4285247025 modified "2023-10-06" @default.
- W4285247025 title "Spiking Neural Networks: A Survey" @default.
- W4285247025 cites W111381757 @default.
- W4285247025 cites W1498436455 @default.
- W4285247025 cites W1523493493 @default.
- W4285247025 cites W1532949105 @default.
- W4285247025 cites W1554576613 @default.
- W4285247025 cites W1570411240 @default.
- W4285247025 cites W1645800954 @default.
- W4285247025 cites W1699734612 @default.
- W4285247025 cites W1975398991 @default.
- W4285247025 cites W1975412204 @default.
- W4285247025 cites W1977295328 @default.
- W4285247025 cites W1984759703 @default.
- W4285247025 cites W1987927386 @default.
- W4285247025 cites W1993533483 @default.
- W4285247025 cites W1994565921 @default.
- W4285247025 cites W1997953085 @default.
- W4285247025 cites W2002679708 @default.
- W4285247025 cites W2003400208 @default.
- W4285247025 cites W2005108691 @default.
- W4285247025 cites W2006048762 @default.
- W4285247025 cites W2007815184 @default.
- W4285247025 cites W2008284899 @default.
- W4285247025 cites W2012592267 @default.
- W4285247025 cites W2014059210 @default.
- W4285247025 cites W2015656145 @default.
- W4285247025 cites W2018027127 @default.
- W4285247025 cites W2022160394 @default.
- W4285247025 cites W2027901343 @default.
- W4285247025 cites W2031937032 @default.
- W4285247025 cites W2037457092 @default.
- W4285247025 cites W2038511109 @default.
- W4285247025 cites W2054113233 @default.
- W4285247025 cites W2056631950 @default.
- W4285247025 cites W2081347291 @default.
- W4285247025 cites W2081490348 @default.
- W4285247025 cites W2091845343 @default.
- W4285247025 cites W2103822311 @default.
- W4285247025 cites W2108598243 @default.
- W4285247025 cites W2110654393 @default.
- W4285247025 cites W2112408199 @default.
- W4285247025 cites W2112796928 @default.
- W4285247025 cites W2116049552 @default.
- W4285247025 cites W2120905747 @default.
- W4285247025 cites W2127388521 @default.
- W4285247025 cites W2130360162 @default.
- W4285247025 cites W2136922672 @default.
- W4285247025 cites W2140362090 @default.
- W4285247025 cites W2147101007 @default.
- W4285247025 cites W2162827630 @default.
- W4285247025 cites W2164653071 @default.
- W4285247025 cites W2165396124 @default.
- W4285247025 cites W2194775991 @default.
- W4285247025 cites W2393696765 @default.
- W4285247025 cites W2419165387 @default.
- W4285247025 cites W2466047266 @default.
- W4285247025 cites W2513853720 @default.
- W4285247025 cites W2552737632 @default.
- W4285247025 cites W2598777030 @default.
- W4285247025 cites W2735633774 @default.
- W4285247025 cites W2735894830 @default.
- W4285247025 cites W2744013291 @default.
- W4285247025 cites W2745933219 @default.
- W4285247025 cites W2767226428 @default.
- W4285247025 cites W2775079417 @default.
- W4285247025 cites W2779025322 @default.
- W4285247025 cites W2787903027 @default.
- W4285247025 cites W2800613970 @default.
- W4285247025 cites W2805469146 @default.
- W4285247025 cites W2805588108 @default.
- W4285247025 cites W2808053377 @default.
- W4285247025 cites W2810193307 @default.
- W4285247025 cites W2884987480 @default.
- W4285247025 cites W2892306708 @default.
- W4285247025 cites W2898323475 @default.
- W4285247025 cites W2903271863 @default.
- W4285247025 cites W2919115771 @default.
- W4285247025 cites W2944546415 @default.
- W4285247025 cites W2951534927 @default.
- W4285247025 cites W2962804204 @default.
- W4285247025 cites W2962997068 @default.
- W4285247025 cites W2963206832 @default.
- W4285247025 cites W2963335874 @default.
- W4285247025 cites W2963510238 @default.
- W4285247025 cites W2963887423 @default.
- W4285247025 cites W2964296416 @default.
- W4285247025 cites W2964338223 @default.
- W4285247025 cites W2978391213 @default.
- W4285247025 cites W2979132631 @default.
- W4285247025 cites W2979879900 @default.
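The listing above is the result set for the quad pattern `{ <https://semopenalex.org/work/W4285247025> ?p ?o ?g. }`. As a rough illustration, the same pattern can be expressed in standard SPARQL (using a `GRAPH` clause in place of the four-position shorthand). The helper below is a hypothetical sketch for building such a query string; the endpoint and function name are assumptions, not part of the source listing.

```python
def build_work_query(work_id: str) -> str:
    """Build a SPARQL query returning every predicate/object/graph
    match for a SemOpenAlex work IRI (standard-SPARQL form of the
    quad pattern shown in the listing)."""
    iri = f"https://semopenalex.org/work/{work_id}"
    return (
        "SELECT ?p ?o ?g WHERE {\n"
        f"  GRAPH ?g {{ <{iri}> ?p ?o . }}\n"
        "}"
    )

# Query for the survey work shown above.
query = build_work_query("W4285247025")
print(query)
```

The resulting string could then be POSTed to a SPARQL endpoint (SemOpenAlex publishes one) to retrieve the rows listed above, e.g. `title`, `creator`, and the `cites` triples.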