Matches in SemOpenAlex for { <https://semopenalex.org/work/W3217214923> ?p ?o ?g. }
Showing items 1 to 88 of 88, with 100 items per page.
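A listing like this can in principle be reproduced programmatically over the SPARQL protocol. Below is a minimal Python sketch, assuming the public SemOpenAlex SPARQL endpoint is reachable at https://semopenalex.org/sparql (an assumption; check the SemOpenAlex documentation for the current endpoint). It selects all predicate/object pairs for the work IRI shown above, which is what the `?p ?o ?g` pattern in the query enumerates.

```python
import requests

# Assumed endpoint URL; not confirmed by the listing above.
ENDPOINT = "https://semopenalex.org/sparql"

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W3217214923> ?p ?o .
}
"""

# Standard SPARQL 1.1 Protocol request asking for JSON results.
response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each predicate/object pair, mirroring the rows of the listing.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```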
- W3217214923 endingPage "43" @default.
- W3217214923 startingPage "29" @default.
- W3217214923 abstract "Many recent research efforts have exploited data sparsity for the acceleration of convolutional neural network (CNN) inferences. However, the effects of data transfer between main memory and the CNN accelerator have been largely overlooked. In this work, the authors propose a CNN acceleration technique that leverages hardware/software co-design and exploits the sparsity in input feature maps (IFMs). On the software side, the authors' technique employs a novel lossless compression scheme for IFMs, which are sent to the hardware accelerator via direct memory access. On the hardware side, the authors' technique uses a CNN inference accelerator that performs convolutional layer operations with their compressed data format. With several design optimization techniques, the authors have implemented their technique in a field-programmable gate array (FPGA) system-on-chip platform and evaluated their technique for six different convolutional layers in SqueezeNet. Results reveal that the authors' technique improves the performance by 1.1×–22.6× while reducing energy consumption by 47.7%–97.4% as compared to the CPU-based execution. Furthermore, results indicate that the IFM size and transfer latency are reduced by 34.0%–85.2% and 4.4%–75.7%, respectively, compared to the case without data compression. In addition, the authors' hardware accelerator shows better performance per hardware resource with less than or comparable power consumption to the state-of-the-art FPGA-based designs." @default.
- W3217214923 created "2021-12-06" @default.
- W3217214923 creator A5011131038 @default.
- W3217214923 creator A5023061567 @default.
- W3217214923 creator A5024203425 @default.
- W3217214923 date "2021-11-29" @default.
- W3217214923 modified "2023-10-17" @default.
- W3217214923 title "Sparse convolutional neural network acceleration with lossless input feature map compression for resource‐constrained systems" @default.
- W3217214923 cites W2094756095 @default.
- W3217214923 cites W2289252105 @default.
- W3217214923 cites W2516141709 @default.
- W3217214923 cites W2603836393 @default.
- W3217214923 cites W2604319603 @default.
- W3217214923 cites W2613119772 @default.
- W3217214923 cites W2623629680 @default.
- W3217214923 cites W2625457103 @default.
- W3217214923 cites W2795444169 @default.
- W3217214923 cites W2798317693 @default.
- W3217214923 cites W2931743911 @default.
- W3217214923 cites W2948229371 @default.
- W3217214923 cites W2962821792 @default.
- W3217214923 cites W2963140246 @default.
- W3217214923 cites W2973459602 @default.
- W3217214923 cites W2977634443 @default.
- W3217214923 cites W2979439447 @default.
- W3217214923 cites W2987852271 @default.
- W3217214923 cites W3093799822 @default.
- W3217214923 cites W3107327473 @default.
- W3217214923 cites W4240168186 @default.
- W3217214923 cites W639708223 @default.
- W3217214923 doi "https://doi.org/10.1049/cdt2.12038" @default.
- W3217214923 hasPublicationYear "2021" @default.
- W3217214923 type Work @default.
- W3217214923 sameAs 3217214923 @default.
- W3217214923 citedByCount "6" @default.
- W3217214923 countsByYear W32172149232022 @default.
- W3217214923 countsByYear W32172149232023 @default.
- W3217214923 crossrefType "journal-article" @default.
- W3217214923 hasAuthorship W3217214923A5011131038 @default.
- W3217214923 hasAuthorship W3217214923A5023061567 @default.
- W3217214923 hasAuthorship W3217214923A5024203425 @default.
- W3217214923 hasConcept C113775141 @default.
- W3217214923 hasConcept C13164978 @default.
- W3217214923 hasConcept C149635348 @default.
- W3217214923 hasConcept C154945302 @default.
- W3217214923 hasConcept C199360897 @default.
- W3217214923 hasConcept C2777904410 @default.
- W3217214923 hasConcept C41008148 @default.
- W3217214923 hasConcept C42935608 @default.
- W3217214923 hasConcept C78548338 @default.
- W3217214923 hasConcept C81081738 @default.
- W3217214923 hasConcept C81363708 @default.
- W3217214923 hasConcept C9390403 @default.
- W3217214923 hasConceptScore W3217214923C113775141 @default.
- W3217214923 hasConceptScore W3217214923C13164978 @default.
- W3217214923 hasConceptScore W3217214923C149635348 @default.
- W3217214923 hasConceptScore W3217214923C154945302 @default.
- W3217214923 hasConceptScore W3217214923C199360897 @default.
- W3217214923 hasConceptScore W3217214923C2777904410 @default.
- W3217214923 hasConceptScore W3217214923C41008148 @default.
- W3217214923 hasConceptScore W3217214923C42935608 @default.
- W3217214923 hasConceptScore W3217214923C78548338 @default.
- W3217214923 hasConceptScore W3217214923C81081738 @default.
- W3217214923 hasConceptScore W3217214923C81363708 @default.
- W3217214923 hasConceptScore W3217214923C9390403 @default.
- W3217214923 hasFunder F4320322120 @default.
- W3217214923 hasIssue "1" @default.
- W3217214923 hasLocation W32172149231 @default.
- W3217214923 hasLocation W32172149232 @default.
- W3217214923 hasOpenAccess W3217214923 @default.
- W3217214923 hasPrimaryLocation W32172149231 @default.
- W3217214923 hasRelatedWork W2032414556 @default.
- W3217214923 hasRelatedWork W2169853506 @default.
- W3217214923 hasRelatedWork W2169871401 @default.
- W3217214923 hasRelatedWork W2350586049 @default.
- W3217214923 hasRelatedWork W2385628723 @default.
- W3217214923 hasRelatedWork W2394342941 @default.
- W3217214923 hasRelatedWork W2461250372 @default.
- W3217214923 hasRelatedWork W2547124190 @default.
- W3217214923 hasRelatedWork W2948148442 @default.
- W3217214923 hasRelatedWork W3008492011 @default.
- W3217214923 hasVolume "16" @default.
- W3217214923 isParatext "false" @default.
- W3217214923 isRetracted "false" @default.
- W3217214923 magId "3217214923" @default.
- W3217214923 workType "article" @default.
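The `abstract` triple above describes a lossless compression scheme that exploits sparsity in input feature maps (IFMs) before they are transferred to the accelerator. The paper's actual compressed data format is not given in this listing; the following is only a generic zero run-length encoding sketch that illustrates how IFM sparsity can be exploited losslessly. It is not the authors' scheme.

```python
import numpy as np

def compress_ifm(ifm: np.ndarray):
    """Generic lossless zero run-length encoding of a sparse feature map.

    Illustrative only: NOT the compressed format from the paper.
    Returns (values, run_lengths, trailing_zeros), where run_lengths[i]
    counts the zeros preceding values[i] in the flattened map.
    """
    values, runs = [], []
    zeros = 0
    for x in ifm.ravel():
        if x == 0:
            zeros += 1
        else:
            values.append(x)
            runs.append(zeros)
            zeros = 0
    return np.array(values, dtype=ifm.dtype), np.array(runs, dtype=np.int32), zeros

def decompress_ifm(values, runs, trailing_zeros, shape, dtype):
    """Exact inverse of compress_ifm (lossless round trip)."""
    out = []
    for v, r in zip(values, runs):
        out.extend([0] * r)
        out.append(v)
    out.extend([0] * trailing_zeros)
    return np.array(out, dtype=dtype).reshape(shape)

# Round-trip check on a synthetic sparse map (~80% zeros).
rng = np.random.default_rng(0)
ifm = rng.integers(0, 5, size=(1, 8, 8)).astype(np.int8)
ifm[rng.random(ifm.shape) < 0.8] = 0
vals, runs, tail = compress_ifm(ifm)
restored = decompress_ifm(vals, runs, tail, ifm.shape, ifm.dtype)
assert np.array_equal(ifm, restored)
print("stored elements:", ifm.size, "-> compressed entries:", vals.size + runs.size)
```

The round-trip assertion demonstrates the "lossless" property; the size comparison only hints at why higher sparsity reduces the amount of data moved over DMA, as the abstract's reported 34.0%-85.2% IFM size reductions suggest.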