Matches in SemOpenAlex for { <https://semopenalex.org/work/W2892267922> ?p ?o ?g. }
Showing items 1 to 87 of 87, with 100 items per page.
- W2892267922 endingPage "236" @default.
- W2892267922 startingPage "227" @default.
- W2892267922 abstract "Abstract Deep Neural Networks (DNN) have proven to be highly effective in extracting high-level abstractions of input data using multiple neural network layers. However, the huge training times for DNNs in traditional von Neumann machines have hindered their ubiquitous adoption in IoT and other mobile computing platforms. Recently, acceleration of DNN training with a time complexity of O(1) was proposed using the idea of stochastic weight update with resistive processing units (RPU). However, it has been projected that RPU devices require more than 1000 reliable conductance levels, which is a stringent requirement to realize in memristive devices. Here, we study the optimization of stochastic learning in DNNs for the hand-written digit classification benchmark using the characteristics of non-filamentary Pr0.7Ca0.3MnO3 (PCMO) devices fabricated with a standard lithography process. The electrical characteristics of these devices exhibit a linear conductance response with an on-off ratio of 1.8, 26 conductance levels, and significant programming variability. We captured these non-ideal behaviors of the experimental PCMO device in simulations to demonstrate stochastic learning with O(1) time complexity, achieving a test accuracy of 88.1% on the hand-written digit recognition benchmark. While the linearity, dynamic range, bit resolution, programming variability, and the reset rate of the device conductance (to account for its limited dynamic range) must be co-optimized to improve training efficiency, we show that programming variability plays the paramount role in determining network performance. We also show that if devices with reduced programming variability (5x smaller than our experimental device) can be developed while keeping all other parameters constant, it is possible to boost the network performance to as high as 95%. We further observe that the performance of stochastic DNNs with memristive synapses is independent of the on-off ratio of the devices for a fixed programming variability. Thus, programming variability represents a new optimization corner for on-chip learning of stochastic DNNs. Finally, we evaluate the performance of stochastic inference engines on noise-corrupted input test data as a function of the variability in the memristive devices. We demonstrate that noise-resilient inference engines can be achieved if 100 bits are used for stochastic encoding during inference, even though the expensive network training can be done with as few as 10 bits. Thus, our studies emphasize the need to optimize learning strategies for DNNs based on the non-ideal characteristics of experimental nanoscale devices." @default.
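The abstract's key device non-idealities (a finite on-off ratio of 1.8, 26 conductance levels, and Gaussian programming variability) can be illustrated with a minimal sketch. This is not the paper's code; the conductance units, the `variability` parameter (noise std as a fraction of the ideal step), and the pulse model are assumptions chosen for illustration only.

```python
import numpy as np

# Hypothetical memristive-synapse pulse model reflecting the non-idealities
# described in the abstract: limited conductance levels, a finite on-off
# ratio, and Gaussian programming variability.
rng = np.random.default_rng(0)

G_MIN = 1.0          # minimum conductance, arbitrary units (assumption)
ON_OFF_RATIO = 1.8   # on-off ratio reported for the PCMO device
G_MAX = G_MIN * ON_OFF_RATIO
N_LEVELS = 26        # number of conductance levels reported for the device
STEP = (G_MAX - G_MIN) / (N_LEVELS - 1)  # ideal per-pulse conductance change

def program_pulse(g, direction, variability=0.5):
    """Apply one potentiation (+1) or depression (-1) pulse.

    `variability` is the std of the programming noise as a fraction of the
    ideal step size -- a hypothetical knob for this sketch, analogous to the
    programming variability the paper identifies as the dominant factor.
    """
    noise = rng.normal(0.0, variability * STEP)
    g_new = g + direction * STEP + noise
    # Conductance saturates at the device's limited dynamic range.
    return float(np.clip(g_new, G_MIN, G_MAX))

# Example: drive a device upward with 10 potentiation pulses; with noise,
# the trajectory deviates from the ideal 10 * STEP increase.
g = G_MIN
for _ in range(10):
    g = program_pulse(g, +1)
```

Reducing `variability` (e.g. by 5x, as the abstract projects) tightens each update toward the ideal step, which is the mechanism behind the reported accuracy improvement from 88.1% toward 95%.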
- W2892267922 created "2018-09-27" @default.
- W2892267922 creator A5031808877 @default.
- W2892267922 creator A5047645198 @default.
- W2892267922 creator A5048595299 @default.
- W2892267922 creator A5073593093 @default.
- W2892267922 date "2018-12-01" @default.
- W2892267922 modified "2023-09-25" @default.
- W2892267922 title "Stochastic learning in deep neural networks based on nanoscale PCMO device characteristics" @default.
- W2892267922 cites W1498436455 @default.
- W2892267922 cites W1974055919 @default.
- W2892267922 cites W1993927936 @default.
- W2892267922 cites W1994074248 @default.
- W2892267922 cites W2006798699 @default.
- W2892267922 cites W2016922062 @default.
- W2892267922 cites W2018774711 @default.
- W2892267922 cites W2020971886 @default.
- W2892267922 cites W2040870580 @default.
- W2892267922 cites W2060501200 @default.
- W2892267922 cites W2077586448 @default.
- W2892267922 cites W2087748124 @default.
- W2892267922 cites W2088849133 @default.
- W2892267922 cites W2093320793 @default.
- W2892267922 cites W2101091847 @default.
- W2892267922 cites W2101765144 @default.
- W2892267922 cites W2112467652 @default.
- W2892267922 cites W2112980698 @default.
- W2892267922 cites W2117422822 @default.
- W2892267922 cites W2117489143 @default.
- W2892267922 cites W2307193480 @default.
- W2892267922 cites W2736698914 @default.
- W2892267922 cites W2801578543 @default.
- W2892267922 cites W4240778903 @default.
- W2892267922 doi "https://doi.org/10.1016/j.neucom.2018.09.019" @default.
- W2892267922 hasPublicationYear "2018" @default.
- W2892267922 type Work @default.
- W2892267922 sameAs 2892267922 @default.
- W2892267922 citedByCount "14" @default.
- W2892267922 countsByYear W28922679222019 @default.
- W2892267922 countsByYear W28922679222020 @default.
- W2892267922 countsByYear W28922679222021 @default.
- W2892267922 countsByYear W28922679222022 @default.
- W2892267922 crossrefType "journal-article" @default.
- W2892267922 hasAuthorship W2892267922A5031808877 @default.
- W2892267922 hasAuthorship W2892267922A5047645198 @default.
- W2892267922 hasAuthorship W2892267922A5048595299 @default.
- W2892267922 hasAuthorship W2892267922A5073593093 @default.
- W2892267922 hasConcept C108583219 @default.
- W2892267922 hasConcept C121332964 @default.
- W2892267922 hasConcept C154945302 @default.
- W2892267922 hasConcept C171250308 @default.
- W2892267922 hasConcept C192562407 @default.
- W2892267922 hasConcept C2778755073 @default.
- W2892267922 hasConcept C41008148 @default.
- W2892267922 hasConcept C45206210 @default.
- W2892267922 hasConcept C50644808 @default.
- W2892267922 hasConcept C62520636 @default.
- W2892267922 hasConceptScore W2892267922C108583219 @default.
- W2892267922 hasConceptScore W2892267922C121332964 @default.
- W2892267922 hasConceptScore W2892267922C154945302 @default.
- W2892267922 hasConceptScore W2892267922C171250308 @default.
- W2892267922 hasConceptScore W2892267922C192562407 @default.
- W2892267922 hasConceptScore W2892267922C2778755073 @default.
- W2892267922 hasConceptScore W2892267922C41008148 @default.
- W2892267922 hasConceptScore W2892267922C45206210 @default.
- W2892267922 hasConceptScore W2892267922C50644808 @default.
- W2892267922 hasConceptScore W2892267922C62520636 @default.
- W2892267922 hasLocation W28922679221 @default.
- W2892267922 hasOpenAccess W2892267922 @default.
- W2892267922 hasPrimaryLocation W28922679221 @default.
- W2892267922 hasRelatedWork W2126887587 @default.
- W2892267922 hasRelatedWork W2731899572 @default.
- W2892267922 hasRelatedWork W2939353110 @default.
- W2892267922 hasRelatedWork W2948658236 @default.
- W2892267922 hasRelatedWork W3009238340 @default.
- W2892267922 hasRelatedWork W3215138031 @default.
- W2892267922 hasRelatedWork W4312962853 @default.
- W2892267922 hasRelatedWork W4321369474 @default.
- W2892267922 hasRelatedWork W4327774331 @default.
- W2892267922 hasRelatedWork W4360585206 @default.
- W2892267922 hasVolume "321" @default.
- W2892267922 isParatext "false" @default.
- W2892267922 isRetracted "false" @default.
- W2892267922 magId "2892267922" @default.
- W2892267922 workType "article" @default.