Matches in SemOpenAlex for { <https://semopenalex.org/work/W4362714362> ?p ?o ?g. }
Showing items 1 to 87 of 87 (100 items per page).
- W4362714362 endingPage "2510" @default.
- W4362714362 startingPage "2497" @default.
- W4362714362 abstract "Recurrent neural networks (RNNs) are used extensively in time-series data applications. Modern RNNs consist of three layer types: recurrent, Fully-Connected (FC), and attention. This paper introduces the design, acceleration, implementation, and verification of a complete reconfigurable RNN using a system-on-chip approach on an FPGA. This design is suitable for small-scale projects and Internet of Things (IoT) end devices, as it utilizes a small number of hardware resources compared to previous configurable architectures. The proposed reconfigurable architecture consists of three layers. The first layer is a Python software layer that contains a function serving as the architecture's user interface. The output of the Python function is three binary files containing the RNN architecture description and trained parameters. The second layer is an embedded software layer implemented on an on-chip ARM microcontroller. This layer reads the first layer's output files and configures the hardware layer with the configuration and parameters required to execute each layer in the RNN. The hardware layer consists of two Intellectual Properties (IPs) with different configurations. The Recurrent Layer Hardware IP implements the recurrent layer using either Long Short-Term Memory (LSTM) or Gated Recurrent Unit (GRU) cells as basic building blocks, while the ATTENTION/FC IP implements the attention layer and the FC layer. The proposed design allows the implementation of a recurrent layer on an FPGA with variable input and hidden vector lengths of up to 100 elements each. It also supports implementing an attention layer with up to 64 input vectors and a maximum vector length of 100 items. The FC layers can be configured to support a maximum value of 256 for both the input vector length and the number of neurons in each layer. The hardware design of the recurrent layer achieves a maximum performance of 1.958 and 2.479 GOPS for the GRU and LSTM models, respectively. The maximum performance of the attention and FC layers is 2.641 GOPS and 634.3 MOPS, respectively. The hardware design operates at a maximum frequency of 100 MHz." @default.
- W4362714362 created "2023-04-09" @default.
- W4362714362 creator A5012409956 @default.
- W4362714362 creator A5037468500 @default.
- W4362714362 date "2023-06-01" @default.
- W4362714362 modified "2023-09-23" @default.
- W4362714362 title "SoC Reconfigurable Architecture for Implementing Software Trained Recurrent Neural Networks on FPGA" @default.
- W4362714362 cites W1995341919 @default.
- W4362714362 cites W2064675550 @default.
- W4362714362 cites W2107878631 @default.
- W4362714362 cites W2154289978 @default.
- W4362714362 cites W2157331557 @default.
- W4362714362 cites W2588448445 @default.
- W4362714362 cites W2589086007 @default.
- W4362714362 cites W2593809812 @default.
- W4362714362 cites W2757698722 @default.
- W4362714362 cites W2788838111 @default.
- W4362714362 cites W2891504940 @default.
- W4362714362 cites W2915106038 @default.
- W4362714362 cites W2956083712 @default.
- W4362714362 cites W2962820060 @default.
- W4362714362 cites W3003697522 @default.
- W4362714362 cites W3031696893 @default.
- W4362714362 cites W3034342478 @default.
- W4362714362 cites W3041591210 @default.
- W4362714362 cites W3108690123 @default.
- W4362714362 cites W3157414788 @default.
- W4362714362 cites W3157959896 @default.
- W4362714362 cites W3201247842 @default.
- W4362714362 cites W3215120999 @default.
- W4362714362 doi "https://doi.org/10.1109/tcsi.2023.3262479" @default.
- W4362714362 hasPublicationYear "2023" @default.
- W4362714362 type Work @default.
- W4362714362 citedByCount "0" @default.
- W4362714362 crossrefType "journal-article" @default.
- W4362714362 hasAuthorship W4362714362A5012409956 @default.
- W4362714362 hasAuthorship W4362714362A5037468500 @default.
- W4362714362 hasConcept C111919701 @default.
- W4362714362 hasConcept C118524514 @default.
- W4362714362 hasConcept C142962650 @default.
- W4362714362 hasConcept C147168706 @default.
- W4362714362 hasConcept C149635348 @default.
- W4362714362 hasConcept C154945302 @default.
- W4362714362 hasConcept C178790620 @default.
- W4362714362 hasConcept C185592680 @default.
- W4362714362 hasConcept C2777674469 @default.
- W4362714362 hasConcept C2779227376 @default.
- W4362714362 hasConcept C41008148 @default.
- W4362714362 hasConcept C42935608 @default.
- W4362714362 hasConcept C50644808 @default.
- W4362714362 hasConcept C519991488 @default.
- W4362714362 hasConcept C9390403 @default.
- W4362714362 hasConceptScore W4362714362C111919701 @default.
- W4362714362 hasConceptScore W4362714362C118524514 @default.
- W4362714362 hasConceptScore W4362714362C142962650 @default.
- W4362714362 hasConceptScore W4362714362C147168706 @default.
- W4362714362 hasConceptScore W4362714362C149635348 @default.
- W4362714362 hasConceptScore W4362714362C154945302 @default.
- W4362714362 hasConceptScore W4362714362C178790620 @default.
- W4362714362 hasConceptScore W4362714362C185592680 @default.
- W4362714362 hasConceptScore W4362714362C2777674469 @default.
- W4362714362 hasConceptScore W4362714362C2779227376 @default.
- W4362714362 hasConceptScore W4362714362C41008148 @default.
- W4362714362 hasConceptScore W4362714362C42935608 @default.
- W4362714362 hasConceptScore W4362714362C50644808 @default.
- W4362714362 hasConceptScore W4362714362C519991488 @default.
- W4362714362 hasConceptScore W4362714362C9390403 @default.
- W4362714362 hasIssue "6" @default.
- W4362714362 hasLocation W43627143621 @default.
- W4362714362 hasOpenAccess W4362714362 @default.
- W4362714362 hasPrimaryLocation W43627143621 @default.
- W4362714362 hasRelatedWork W2011469574 @default.
- W4362714362 hasRelatedWork W2066159914 @default.
- W4362714362 hasRelatedWork W2093864534 @default.
- W4362714362 hasRelatedWork W2152926077 @default.
- W4362714362 hasRelatedWork W2361654132 @default.
- W4362714362 hasRelatedWork W2380396636 @default.
- W4362714362 hasRelatedWork W2572399654 @default.
- W4362714362 hasRelatedWork W2995926156 @default.
- W4362714362 hasRelatedWork W3124648670 @default.
- W4362714362 hasRelatedWork W3151546682 @default.
- W4362714362 hasVolume "70" @default.
- W4362714362 isParatext "false" @default.
- W4362714362 isRetracted "false" @default.
- W4362714362 workType "article" @default.