Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285290717> ?p ?o ?g. }
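A minimal sketch of how a listing like the one below could be reproduced programmatically, assuming SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql and using the SPARQLWrapper Python package; the endpoint URL and result handling are assumptions for illustration, not part of the original listing.

```python
# Hypothetical sketch: fetch all { <work> ?p ?o } matches (per named graph ?g)
# for W4285290717 from an assumed SemOpenAlex SPARQL endpoint.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK = "https://semopenalex.org/work/W4285290717"

query = f"""
SELECT ?p ?o ?g
WHERE {{
  GRAPH ?g {{ <{WORK}> ?p ?o . }}
}}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(query)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

# Print each predicate/object pair, mirroring the listing that follows.
for b in results["results"]["bindings"]:
    print(b["p"]["value"], b["o"]["value"], b.get("g", {}).get("value", ""))
```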
- W4285290717 endingPage "2752" @default.
- W4285290717 startingPage "2740" @default.
- W4285290717 abstract "RRAM-based in-memory computing (IMC) effectively accelerates deep neural networks (DNNs). Furthermore, model compression techniques, such as quantization and pruning, are necessary to improve algorithm mapping and hardware performance. However, in the presence of RRAM device variations, low-precision and sparse DNNs suffer from severe post-mapping accuracy loss. To address this, in this work, we investigate a new metric, model stability, from the loss landscape to help shed light on accuracy loss under variations and model compression, which guides an algorithmic solution to maximize model stability and mitigate accuracy loss. Based on statistical data from a CMOS/RRAM 1T1R test chip at 65nm, we characterize wafer-level RRAM variations and develop a cross-layer benchmark tool that incorporates quantization, pruning, device variations, model stability, and IMC architecture parameters to assess post-mapping accuracy and hardware performance. Leveraging this tool, we show that a loss-landscape-based DNN model selection for stability effectively tolerates device variations and achieves a post-mapping accuracy higher than that with 50% lower RRAM variations. Moreover, we quantitatively interpret why model pruning increases the sensitivity to variations, while a lower-precision model has better tolerance to variations. Finally, we propose a novel variation-aware training method to improve model stability, in which there exists the most stable model for the best post-mapping accuracy of compressed DNNs. Experimental evaluation of the method shows up to 19%, 21%, and 11% post-mapping accuracy improvement for our 65nm RRAM device, across various precision and sparsity, on CIFAR-10, CIFAR-100, and SVHN datasets, respectively." @default.
- W4285290717 created "2022-07-14" @default.
- W4285290717 creator A5002498234 @default.
- W4285290717 creator A5004249628 @default.
- W4285290717 creator A5004534807 @default.
- W4285290717 creator A5046312510 @default.
- W4285290717 creator A5047916979 @default.
- W4285290717 creator A5058197956 @default.
- W4285290717 creator A5058375642 @default.
- W4285290717 creator A5059526101 @default.
- W4285290717 creator A5060335470 @default.
- W4285290717 creator A5063655609 @default.
- W4285290717 creator A5067550998 @default.
- W4285290717 creator A5091909540 @default.
- W4285290717 date "2022-11-01" @default.
- W4285290717 modified "2023-10-05" @default.
- W4285290717 title "Exploring Model Stability of Deep Neural Networks for Reliable RRAM-Based In-Memory Acceleration" @default.
- W4285290717 cites W1971319818 @default.
- W4285290717 cites W1999085092 @default.
- W4285290717 cites W2194775991 @default.
- W4285290717 cites W2518281301 @default.
- W4285290717 cites W2558662207 @default.
- W4285290717 cites W2612375349 @default.
- W4285290717 cites W2613989746 @default.
- W4285290717 cites W2883149906 @default.
- W4285290717 cites W2912811302 @default.
- W4285290717 cites W2925935411 @default.
- W4285290717 cites W2944898315 @default.
- W4285290717 cites W2945146780 @default.
- W4285290717 cites W2946047477 @default.
- W4285290717 cites W2946522000 @default.
- W4285290717 cites W2949989598 @default.
- W4285290717 cites W2962851801 @default.
- W4285290717 cites W2963735024 @default.
- W4285290717 cites W2998470761 @default.
- W4285290717 cites W3005619596 @default.
- W4285290717 cites W3035560939 @default.
- W4285290717 cites W3036092616 @default.
- W4285290717 cites W3046772465 @default.
- W4285290717 cites W3048606948 @default.
- W4285290717 cites W3083443371 @default.
- W4285290717 cites W3091835145 @default.
- W4285290717 cites W3091885635 @default.
- W4285290717 cites W3116379531 @default.
- W4285290717 cites W3158997543 @default.
- W4285290717 cites W3201613041 @default.
- W4285290717 cites W4240805545 @default.
- W4285290717 cites W4245731639 @default.
- W4285290717 doi "https://doi.org/10.1109/tc.2022.3174585" @default.
- W4285290717 hasPublicationYear "2022" @default.
- W4285290717 type Work @default.
- W4285290717 citedByCount "3" @default.
- W4285290717 countsByYear W42852907172022 @default.
- W4285290717 countsByYear W42852907172023 @default.
- W4285290717 crossrefType "journal-article" @default.
- W4285290717 hasAuthorship W4285290717A5002498234 @default.
- W4285290717 hasAuthorship W4285290717A5004249628 @default.
- W4285290717 hasAuthorship W4285290717A5004534807 @default.
- W4285290717 hasAuthorship W4285290717A5046312510 @default.
- W4285290717 hasAuthorship W4285290717A5047916979 @default.
- W4285290717 hasAuthorship W4285290717A5058197956 @default.
- W4285290717 hasAuthorship W4285290717A5058375642 @default.
- W4285290717 hasAuthorship W4285290717A5059526101 @default.
- W4285290717 hasAuthorship W4285290717A5060335470 @default.
- W4285290717 hasAuthorship W4285290717A5063655609 @default.
- W4285290717 hasAuthorship W4285290717A5067550998 @default.
- W4285290717 hasAuthorship W4285290717A5091909540 @default.
- W4285290717 hasBestOaLocation W42852907171 @default.
- W4285290717 hasConcept C108010975 @default.
- W4285290717 hasConcept C112972136 @default.
- W4285290717 hasConcept C113775141 @default.
- W4285290717 hasConcept C11413529 @default.
- W4285290717 hasConcept C119857082 @default.
- W4285290717 hasConcept C13280743 @default.
- W4285290717 hasConcept C147789679 @default.
- W4285290717 hasConcept C154945302 @default.
- W4285290717 hasConcept C17525397 @default.
- W4285290717 hasConcept C182019814 @default.
- W4285290717 hasConcept C185592680 @default.
- W4285290717 hasConcept C185798385 @default.
- W4285290717 hasConcept C205649164 @default.
- W4285290717 hasConcept C28855332 @default.
- W4285290717 hasConcept C41008148 @default.
- W4285290717 hasConcept C50644808 @default.
- W4285290717 hasConcept C6557445 @default.
- W4285290717 hasConcept C86803240 @default.
- W4285290717 hasConceptScore W4285290717C108010975 @default.
- W4285290717 hasConceptScore W4285290717C112972136 @default.
- W4285290717 hasConceptScore W4285290717C113775141 @default.
- W4285290717 hasConceptScore W4285290717C11413529 @default.
- W4285290717 hasConceptScore W4285290717C119857082 @default.
- W4285290717 hasConceptScore W4285290717C13280743 @default.
- W4285290717 hasConceptScore W4285290717C147789679 @default.
- W4285290717 hasConceptScore W4285290717C154945302 @default.
- W4285290717 hasConceptScore W4285290717C17525397 @default.
- W4285290717 hasConceptScore W4285290717C182019814 @default.
- W4285290717 hasConceptScore W4285290717C185592680 @default.
- W4285290717 hasConceptScore W4285290717C185798385 @default.