Matches in SemOpenAlex for { <https://semopenalex.org/work/W2908966236> ?p ?o ?g. }
Showing items 1 to 53 of 53, with 100 items per page.
- W2908966236 abstract "Training Convolutional Neural Networks (CNNs) is both memory- and computation-intensive. The resistive random access memory (ReRAM) has shown its advantage in accelerating such tasks with high energy efficiency. However, the ReRAM-based pipeline architecture suffers from low utilization of computing resources, caused by the imbalanced data throughput in different pipeline stages due to the inherent down-sampling effect in CNNs and the inflexible usage of ReRAM cells. In this paper, we propose a novel ReRAM-based bidirectional pipeline architecture, named HUBPA, to accelerate the training with higher utilization of the computing resources. Two stages of the CNN training, forward and backward propagation, are scheduled in HUBPA dynamically to share the computing resources. We design an accessory control scheme for the context switch of these two tasks. We also propose an efficient algorithm to allocate computing resources for each neural network layer. Our experiment results show that, compared with the state-of-the-art ReRAM pipeline architecture, HUBPA improves the performance by 1.7X and reduces the energy consumption by 1.5X, based on the current benchmarks." @default.
- W2908966236 created "2019-01-25" @default.
- W2908966236 creator A5011544360 @default.
- W2908966236 creator A5023632642 @default.
- W2908966236 creator A5045693138 @default.
- W2908966236 creator A5053801300 @default.
- W2908966236 creator A5056632010 @default.
- W2908966236 creator A5059787309 @default.
- W2908966236 date "2019-01-21" @default.
- W2908966236 modified "2023-10-18" @default.
- W2908966236 title "HUBPA" @default.
- W2908966236 cites W1677182931 @default.
- W2908966236 cites W2013028205 @default.
- W2908966236 cites W2117539524 @default.
- W2908966236 cites W2162639668 @default.
- W2908966236 cites W2613989746 @default.
- W2908966236 cites W2906043559 @default.
- W2908966236 cites W4240163901 @default.
- W2908966236 cites W4243519499 @default.
- W2908966236 cites W4245199738 @default.
- W2908966236 doi "https://doi.org/10.1145/3287624.3287674" @default.
- W2908966236 hasPublicationYear "2019" @default.
- W2908966236 type Work @default.
- W2908966236 sameAs 2908966236 @default.
- W2908966236 citedByCount "7" @default.
- W2908966236 countsByYear W29089662362019 @default.
- W2908966236 countsByYear W29089662362020 @default.
- W2908966236 crossrefType "proceedings-article" @default.
- W2908966236 hasAuthorship W2908966236A5011544360 @default.
- W2908966236 hasAuthorship W2908966236A5023632642 @default.
- W2908966236 hasAuthorship W2908966236A5045693138 @default.
- W2908966236 hasAuthorship W2908966236A5053801300 @default.
- W2908966236 hasAuthorship W2908966236A5056632010 @default.
- W2908966236 hasAuthorship W2908966236A5059787309 @default.
- W2908966236 hasConcept C41008148 @default.
- W2908966236 hasConceptScore W2908966236C41008148 @default.
- W2908966236 hasLocation W29089662361 @default.
- W2908966236 hasOpenAccess W2908966236 @default.
- W2908966236 hasPrimaryLocation W29089662361 @default.
- W2908966236 hasRelatedWork W2093578348 @default.
- W2908966236 hasRelatedWork W2130043461 @default.
- W2908966236 hasRelatedWork W2350741829 @default.
- W2908966236 hasRelatedWork W2358668433 @default.
- W2908966236 hasRelatedWork W2376932109 @default.
- W2908966236 hasRelatedWork W2382290278 @default.
- W2908966236 hasRelatedWork W2390279801 @default.
- W2908966236 hasRelatedWork W2748952813 @default.
- W2908966236 hasRelatedWork W2899084033 @default.
- W2908966236 hasRelatedWork W3004735627 @default.
- W2908966236 isParatext "false" @default.
- W2908966236 isRetracted "false" @default.
- W2908966236 magId "2908966236" @default.
- W2908966236 workType "article" @default.
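The listing above is the result of a triple-pattern query against SemOpenAlex. As a minimal sketch, the same data could be retrieved programmatically via a SPARQL request; the endpoint URL `https://semopenalex.org/sparql` and the `format=json` parameter are assumptions about the service's configuration, not confirmed by this page.

```python
# Sketch: fetch all (predicate, object) pairs for a SemOpenAlex work
# through its SPARQL endpoint. Endpoint URL is an assumption.
import json
import urllib.parse
import urllib.request

WORK_IRI = "https://semopenalex.org/work/W2908966236"
ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint location

# Same pattern as the page header: { <work> ?p ?o }, restricted to one work.
query = f"SELECT ?p ?o WHERE {{ <{WORK_IRI}> ?p ?o . }}"
url = ENDPOINT + "?" + urllib.parse.urlencode(
    {"query": query, "format": "json"}  # JSON results format assumed
)

def fetch_triples(request_url: str) -> list[tuple[str, str]]:
    """Run the query and return (predicate IRI, object value) pairs."""
    with urllib.request.urlopen(request_url) as resp:
        data = json.load(resp)
    return [
        (b["p"]["value"], b["o"]["value"])
        for b in data["results"]["bindings"]
    ]
```

Calling `fetch_triples(url)` would return the 53 rows shown above, e.g. the `doi` predicate paired with `https://doi.org/10.1145/3287624.3287674`.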