Matches in SemOpenAlex for { <https://semopenalex.org/work/W3047109430> ?p ?o ?g. }
- W3047109430 endingPage "2000085" @default.
- W3047109430 startingPage "2000085" @default.
- W3047109430 abstract "Machine learning, particularly in the form of deep learning, has driven most of the recent fundamental developments in artificial intelligence. Deep learning is based on computational models that are, to a certain extent, bio-inspired, as they rely on networks of connected simple computing units operating in parallel. Deep learning has been successfully applied in areas such as object/pattern recognition, speech and natural language processing, self-driving vehicles, intelligent self-diagnostics tools, autonomous robots, knowledgeable personal assistants, and monitoring. These successes have been mostly supported by three factors: availability of vast amounts of data, continuous growth in computing power, and algorithmic innovations. The approaching demise of Moore's law, and the consequent expected modest improvements in computing power that can be achieved by scaling, raise the question of whether the described progress will be slowed or halted due to hardware limitations. This paper reviews the case for a novel beyond-CMOS hardware technology, memristors, as a potential solution for the implementation of power-efficient in-memory computing, deep learning accelerators, and spiking neural networks. Central themes are the reliance on non-von-Neumann computing architectures and the need for developing tailored learning and inference algorithms. To argue that lessons from biology can be useful in providing directions for further progress in artificial intelligence, we briefly discuss an example based on reservoir computing. We conclude the review by speculating on the big picture view of future neuromorphic and brain-inspired computing systems." @default.
- W3047109430 created "2020-08-10" @default.
- W3047109430 creator A5017736224 @default.
- W3047109430 creator A5030069824 @default.
- W3047109430 creator A5048595299 @default.
- W3047109430 creator A5052492386 @default.
- W3047109430 creator A5061085745 @default.
- W3047109430 creator A5078916182 @default.
- W3047109430 date "2020-08-02" @default.
- W3047109430 modified "2023-10-16" @default.
- W3047109430 title "Memristors—From In‐Memory Computing, Deep Learning Acceleration, and Spiking Neural Networks to the Future of Neuromorphic and Bio‐Inspired Computing" @default.
- W3047109430 cites W1498436455 @default.
- W3047109430 cites W1862155492 @default.
- W3047109430 cites W1937359183 @default.
- W3047109430 cites W2018774711 @default.
- W3047109430 cites W2020971886 @default.
- W3047109430 cites W2024122052 @default.
- W3047109430 cites W2025674646 @default.
- W3047109430 cites W2030671441 @default.
- W3047109430 cites W2030834420 @default.
- W3047109430 cites W2051513586 @default.
- W3047109430 cites W2076239964 @default.
- W3047109430 cites W2079000393 @default.
- W3047109430 cites W2081729575 @default.
- W3047109430 cites W2090722674 @default.
- W3047109430 cites W2092268242 @default.
- W3047109430 cites W2093002383 @default.
- W3047109430 cites W2103179919 @default.
- W3047109430 cites W2112181056 @default.
- W3047109430 cites W2138913040 @default.
- W3047109430 cites W2141467259 @default.
- W3047109430 cites W2147101007 @default.
- W3047109430 cites W2155954834 @default.
- W3047109430 cites W2162651880 @default.
- W3047109430 cites W2198142417 @default.
- W3047109430 cites W2254450385 @default.
- W3047109430 cites W2269100828 @default.
- W3047109430 cites W2307193480 @default.
- W3047109430 cites W2322088411 @default.
- W3047109430 cites W2334364695 @default.
- W3047109430 cites W2389556795 @default.
- W3047109430 cites W2462963692 @default.
- W3047109430 cites W2499230548 @default.
- W3047109430 cites W2500183209 @default.
- W3047109430 cites W2516467421 @default.
- W3047109430 cites W2525649597 @default.
- W3047109430 cites W2560615381 @default.
- W3047109430 cites W2604319603 @default.
- W3047109430 cites W2607085665 @default.
- W3047109430 cites W2610452801 @default.
- W3047109430 cites W2613569094 @default.
- W3047109430 cites W2624514417 @default.
- W3047109430 cites W2739411161 @default.
- W3047109430 cites W2749095639 @default.
- W3047109430 cites W2765081478 @default.
- W3047109430 cites W2766447205 @default.
- W3047109430 cites W2768104155 @default.
- W3047109430 cites W2769049661 @default.
- W3047109430 cites W2771385189 @default.
- W3047109430 cites W2772397789 @default.
- W3047109430 cites W2775771159 @default.
- W3047109430 cites W2778935320 @default.
- W3047109430 cites W2784200043 @default.
- W3047109430 cites W2785141883 @default.
- W3047109430 cites W2789588276 @default.
- W3047109430 cites W2790669755 @default.
- W3047109430 cites W2793600158 @default.
- W3047109430 cites W2802144660 @default.
- W3047109430 cites W2803163155 @default.
- W3047109430 cites W2805362231 @default.
- W3047109430 cites W2809897260 @default.
- W3047109430 cites W2810804228 @default.
- W3047109430 cites W2883451745 @default.
- W3047109430 cites W2883593518 @default.
- W3047109430 cites W2889448077 @default.
- W3047109430 cites W2890648591 @default.
- W3047109430 cites W2890917406 @default.
- W3047109430 cites W2894173111 @default.
- W3047109430 cites W2915404303 @default.
- W3047109430 cites W2921796727 @default.
- W3047109430 cites W2923010225 @default.
- W3047109430 cites W2923980602 @default.
- W3047109430 cites W2926419149 @default.
- W3047109430 cites W2949224578 @default.
- W3047109430 cites W2952615984 @default.
- W3047109430 cites W2957921024 @default.
- W3047109430 cites W2963385418 @default.
- W3047109430 cites W2963635267 @default.
- W3047109430 cites W2963809228 @default.
- W3047109430 cites W2964338223 @default.
- W3047109430 cites W2964402976 @default.
- W3047109430 cites W2966740705 @default.
- W3047109430 cites W2969226897 @default.
- W3047109430 cites W2969812992 @default.
- W3047109430 cites W2979706619 @default.
- W3047109430 cites W2982433817 @default.
- W3047109430 cites W2983460476 @default.
- W3047109430 cites W2984844508 @default.