Matches in SemOpenAlex for { <https://semopenalex.org/work/W4297201308> ?p ?o ?g. }
- W4297201308 abstract "Artificial intelligence applications implemented with neural networks require extensive arithmetic capabilities through multiply-accumulate (MAC) units. Traditional designs based on voltage-mode circuits feature complex logic chains for such purposes as carry processing. Additionally, when a separate memory block is used (e.g., in a von Neumann architecture), data movement incurs on-chip communication bottlenecks. Furthermore, conventional multipliers have both operands encoded in the same physical quantity, which is either low cost to update or low cost to hold, but not both. This can be significant for low-energy edge operation. In this paper, we propose a mixed-signal multiply-accumulate unit design with in-memory computing to improve both latency and energy. The design is based on a single-bit multiplication cell consisting of a number of memristors and a single transistor switch (1TxM), arranged in a crossbar structure implementing the long-multiplication algorithm. The key innovation is that one operand is encoded in easy-to-update voltage while the other is encoded in non-volatile memristor conductance. This targets workloads such as machine learning, which feature asymmetric requirements for operand updates. Ohm’s Law and Kirchhoff’s Current Law (KCL) carry out the multiplication in the analog domain. When implemented as part of a neural network (NN), the MAC unit incorporates a current-to-digital stage to produce multi-bit voltage-mode output in the same format as the input. The computation latency consists of memory-writing and result-encoding operations, with the Ohm’s Law and KCL operations contributing negligible delay. Compared with other memristor-based multipliers, the proposed work shows an order-of-magnitude latency improvement in 4-bit implementations, partly because of the Ohm’s Law and KCL time savings and partly because of the short writing operations for the frequently updated operand represented by voltages.
In addition, the energy consumption per multiplication cycle of the proposed work improves by 74%–99% in corner cases. To investigate the usefulness of this MAC design in machine learning applications, its input/output relationship is characterized using multi-layer perceptrons to classify the well-known handwritten-digit dataset MNIST. This case study implements quantization-aware training and includes the non-ideal effects of our MAC unit, allowing the NN to learn around them and preserve its high accuracy. The simulation results show that the NN using the proposed MAC unit yields an accuracy of 93%, only 1% lower than its baseline." @default.
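The abstract's core analog principle (one operand as input voltage, the other as memristor conductance, with Ohm's Law producing per-cell currents and KCL summing them on each output line) can be sketched as an idealized crossbar model. All numeric values and names below are hypothetical illustrations, not taken from the paper:

```python
import numpy as np

def crossbar_mac(voltages, conductances):
    """Ideal memristor-crossbar MAC (sketch, ignoring device non-idealities).

    Each cell obeys Ohm's Law, I = V * G; KCL sums the cell currents on
    every output column, so output j is I_j = sum_i V_i * G_ij,
    i.e. a dot product of the voltage vector with conductance column j.
    """
    return voltages @ conductances

# Hypothetical example: 4 input rows, 2 output columns.
V = np.array([0.1, 0.2, 0.0, 0.3])        # volts: the frequently updated operand
G = np.array([[1e-4, 2e-4],
              [3e-4, 1e-4],
              [2e-4, 2e-4],
              [1e-4, 3e-4]])              # siemens: held in non-volatile state
I = crossbar_mac(V, G)                    # output currents, one per column
```

Because the multiplication and accumulation are just the physics of the array, only writing the voltage operand and digitizing the output currents contribute meaningful latency, which is the asymmetry the abstract exploits.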
- W4297201308 created "2022-09-27" @default.
- W4297201308 creator A5029446985 @default.
- W4297201308 creator A5046259966 @default.
- W4297201308 creator A5054894631 @default.
- W4297201308 creator A5058743869 @default.
- W4297201308 creator A5074769370 @default.
- W4297201308 creator A5077777787 @default.
- W4297201308 creator A5083933836 @default.
- W4297201308 date "2022-09-26" @default.
- W4297201308 modified "2023-10-18" @default.
- W4297201308 title "Energy-efficient neural network design using memristive MAC unit" @default.
- W4297201308 cites W115883071 @default.
- W4297201308 cites W1578783943 @default.
- W4297201308 cites W1988726414 @default.
- W4297201308 cites W2008901850 @default.
- W4297201308 cites W2025674646 @default.
- W4297201308 cites W2071892512 @default.
- W4297201308 cites W2077296610 @default.
- W4297201308 cites W2081729575 @default.
- W4297201308 cites W2112181056 @default.
- W4297201308 cites W2162651880 @default.
- W4297201308 cites W2244859704 @default.
- W4297201308 cites W2512274313 @default.
- W4297201308 cites W2569842290 @default.
- W4297201308 cites W2775771159 @default.
- W4297201308 cites W2795849550 @default.
- W4297201308 cites W2808697572 @default.
- W4297201308 cites W2946189696 @default.
- W4297201308 cites W2970608956 @default.
- W4297201308 cites W2980561559 @default.
- W4297201308 cites W2990049270 @default.
- W4297201308 cites W3010665448 @default.
- W4297201308 cites W3014166398 @default.
- W4297201308 cites W3040827038 @default.
- W4297201308 cites W3081302630 @default.
- W4297201308 cites W3206725821 @default.
- W4297201308 cites W3208067593 @default.
- W4297201308 cites W4251722996 @default.
- W4297201308 cites W4285548014 @default.
- W4297201308 doi "https://doi.org/10.3389/felec.2022.877629" @default.
- W4297201308 hasPublicationYear "2022" @default.
- W4297201308 type Work @default.
- W4297201308 citedByCount "0" @default.
- W4297201308 crossrefType "journal-article" @default.
- W4297201308 hasAuthorship W4297201308A5029446985 @default.
- W4297201308 hasAuthorship W4297201308A5046259966 @default.
- W4297201308 hasAuthorship W4297201308A5054894631 @default.
- W4297201308 hasAuthorship W4297201308A5058743869 @default.
- W4297201308 hasAuthorship W4297201308A5074769370 @default.
- W4297201308 hasAuthorship W4297201308A5077777787 @default.
- W4297201308 hasAuthorship W4297201308A5083933836 @default.
- W4297201308 hasBestOaLocation W42972013081 @default.
- W4297201308 hasConcept C119599485 @default.
- W4297201308 hasConcept C121332964 @default.
- W4297201308 hasConcept C127413603 @default.
- W4297201308 hasConcept C150072547 @default.
- W4297201308 hasConcept C154945302 @default.
- W4297201308 hasConcept C173608175 @default.
- W4297201308 hasConcept C24326235 @default.
- W4297201308 hasConcept C24890656 @default.
- W4297201308 hasConcept C2742236 @default.
- W4297201308 hasConcept C2780595030 @default.
- W4297201308 hasConcept C29984679 @default.
- W4297201308 hasConcept C41008148 @default.
- W4297201308 hasConcept C50644808 @default.
- W4297201308 hasConcept C55526617 @default.
- W4297201308 hasConcept C76155785 @default.
- W4297201308 hasConcept C9390403 @default.
- W4297201308 hasConceptScore W4297201308C119599485 @default.
- W4297201308 hasConceptScore W4297201308C121332964 @default.
- W4297201308 hasConceptScore W4297201308C127413603 @default.
- W4297201308 hasConceptScore W4297201308C150072547 @default.
- W4297201308 hasConceptScore W4297201308C154945302 @default.
- W4297201308 hasConceptScore W4297201308C173608175 @default.
- W4297201308 hasConceptScore W4297201308C24326235 @default.
- W4297201308 hasConceptScore W4297201308C24890656 @default.
- W4297201308 hasConceptScore W4297201308C2742236 @default.
- W4297201308 hasConceptScore W4297201308C2780595030 @default.
- W4297201308 hasConceptScore W4297201308C29984679 @default.
- W4297201308 hasConceptScore W4297201308C41008148 @default.
- W4297201308 hasConceptScore W4297201308C50644808 @default.
- W4297201308 hasConceptScore W4297201308C55526617 @default.
- W4297201308 hasConceptScore W4297201308C76155785 @default.
- W4297201308 hasConceptScore W4297201308C9390403 @default.
- W4297201308 hasFunder F4320334627 @default.
- W4297201308 hasLocation W42972013081 @default.
- W4297201308 hasLocation W42972013082 @default.
- W4297201308 hasOpenAccess W4297201308 @default.
- W4297201308 hasPrimaryLocation W42972013081 @default.
- W4297201308 hasRelatedWork W2515608498 @default.
- W4297201308 hasRelatedWork W2554427681 @default.
- W4297201308 hasRelatedWork W2605257365 @default.
- W4297201308 hasRelatedWork W2795874008 @default.
- W4297201308 hasRelatedWork W2946726301 @default.
- W4297201308 hasRelatedWork W2951280857 @default.
- W4297201308 hasRelatedWork W3134783211 @default.
- W4297201308 hasRelatedWork W3175953918 @default.
- W4297201308 hasRelatedWork W4285207471 @default.
- W4297201308 hasRelatedWork W4297201308 @default.