Matches in SemOpenAlex for { <https://semopenalex.org/work/W4308479898> ?p ?o ?g. }
Showing items 1 to 84 of 84, with 100 items per page.
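
The listing above corresponds to a simple quad pattern over the work's IRI. A minimal SPARQL sketch that should reproduce it, assuming SemOpenAlex exposes a public SPARQL endpoint (commonly given as https://semopenalex.org/sparql; treat the endpoint URL as an assumption, not something stated in this listing):

```sparql
# List every predicate/object pair recorded for the work,
# together with the named graph it comes from.
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W4308479898> ?p ?o .
  }
}
LIMIT 100   # matches the "100 items per page" shown above
```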
- W4308479898 endingPage "4099" @default.
- W4308479898 startingPage "4088" @default.
- W4308479898 abstract "Since Google proposed the Transformer in 2017, it has driven significant progress in natural language processing (NLP), but at the growing cost of computation and parameters. Prior work has proposed accelerator architectures for Transformer models on field-programmable gate arrays (FPGAs) to handle NLP tasks efficiently. The Transformer has since spread to computer vision (CV), where it has rapidly surpassed convolutional neural networks (CNNs) on various image tasks. However, the image data used in CV differs markedly from the sequence data used in NLP, and the models built around Transformer units in the two fields differ as well. The difference in data raises the problem of locality; the difference in model structure raises the problem of path dependence, which existing accelerator designs have not addressed. In this work, we therefore propose ViA, a novel vision transformer (ViT) accelerator architecture based on FPGA, to execute Transformer applications efficiently while avoiding the cost of these challenges. By analyzing the data structure of the ViT, we design a partition strategy that reduces the impact of data locality in images and improves the efficiency of computation and memory access. By observing the computing flow of the ViT, we apply half-layer mapping and throughput analysis to reduce the impact of the path dependence caused by the shortcut mechanism and to fully utilize hardware resources. Based on these optimization strategies, we design two reusable processing engines with internal streaming, departing from previous overlap or stream design patterns. We implement the ViA architecture on a Xilinx Alveo U50 FPGA and achieve about 5.2 times higher energy efficiency than an NVIDIA Tesla V100 and 4–10 times higher performance than related FPGA-based accelerators, reaching nearly 309.6 GOP/s of peak computing performance." @default.
- W4308479898 created "2022-11-12" @default.
- W4308479898 creator A5040104068 @default.
- W4308479898 creator A5051427882 @default.
- W4308479898 creator A5056314969 @default.
- W4308479898 creator A5061329069 @default.
- W4308479898 creator A5072915437 @default.
- W4308479898 creator A5077322091 @default.
- W4308479898 creator A5089833377 @default.
- W4308479898 date "2022-11-01" @default.
- W4308479898 modified "2023-10-18" @default.
- W4308479898 title "ViA: A Novel Vision-Transformer Accelerator Based on FPGA" @default.
- W4308479898 doi "https://doi.org/10.1109/tcad.2022.3197489" @default.
- W4308479898 hasPublicationYear "2022" @default.
- W4308479898 type Work @default.
- W4308479898 citedByCount "4" @default.
- W4308479898 countsByYear W43084798982022 @default.
- W4308479898 countsByYear W43084798982023 @default.
- W4308479898 crossrefType "journal-article" @default.
- W4308479898 hasAuthorship W4308479898A5040104068 @default.
- W4308479898 hasAuthorship W4308479898A5051427882 @default.
- W4308479898 hasAuthorship W4308479898A5056314969 @default.
- W4308479898 hasAuthorship W4308479898A5061329069 @default.
- W4308479898 hasAuthorship W4308479898A5072915437 @default.
- W4308479898 hasAuthorship W4308479898A5077322091 @default.
- W4308479898 hasAuthorship W4308479898A5089833377 @default.
- W4308479898 hasConcept C113775141 @default.
- W4308479898 hasConcept C11413529 @default.
- W4308479898 hasConcept C118524514 @default.
- W4308479898 hasConcept C119599485 @default.
- W4308479898 hasConcept C127413603 @default.
- W4308479898 hasConcept C13164978 @default.
- W4308479898 hasConcept C138885662 @default.
- W4308479898 hasConcept C149635348 @default.
- W4308479898 hasConcept C154945302 @default.
- W4308479898 hasConcept C165801399 @default.
- W4308479898 hasConcept C2779808786 @default.
- W4308479898 hasConcept C41008148 @default.
- W4308479898 hasConcept C41895202 @default.
- W4308479898 hasConcept C42935608 @default.
- W4308479898 hasConcept C45374587 @default.
- W4308479898 hasConcept C66322947 @default.
- W4308479898 hasConcept C9390403 @default.
- W4308479898 hasConceptScore W4308479898C113775141 @default.
- W4308479898 hasConceptScore W4308479898C11413529 @default.
- W4308479898 hasConceptScore W4308479898C118524514 @default.
- W4308479898 hasConceptScore W4308479898C119599485 @default.
- W4308479898 hasConceptScore W4308479898C127413603 @default.
- W4308479898 hasConceptScore W4308479898C13164978 @default.
- W4308479898 hasConceptScore W4308479898C138885662 @default.
- W4308479898 hasConceptScore W4308479898C149635348 @default.
- W4308479898 hasConceptScore W4308479898C154945302 @default.
- W4308479898 hasConceptScore W4308479898C165801399 @default.
- W4308479898 hasConceptScore W4308479898C2779808786 @default.
- W4308479898 hasConceptScore W4308479898C41008148 @default.
- W4308479898 hasConceptScore W4308479898C41895202 @default.
- W4308479898 hasConceptScore W4308479898C42935608 @default.
- W4308479898 hasConceptScore W4308479898C45374587 @default.
- W4308479898 hasConceptScore W4308479898C66322947 @default.
- W4308479898 hasConceptScore W4308479898C9390403 @default.
- W4308479898 hasFunder F4320321001 @default.
- W4308479898 hasFunder F4320321133 @default.
- W4308479898 hasFunder F4320322769 @default.
- W4308479898 hasFunder F4320335777 @default.
- W4308479898 hasIssue "11" @default.
- W4308479898 hasLocation W43084798981 @default.
- W4308479898 hasOpenAccess W4308479898 @default.
- W4308479898 hasPrimaryLocation W43084798981 @default.
- W4308479898 hasRelatedWork W1732210391 @default.
- W4308479898 hasRelatedWork W2295680811 @default.
- W4308479898 hasRelatedWork W2369375926 @default.
- W4308479898 hasRelatedWork W2398947563 @default.
- W4308479898 hasRelatedWork W2921258041 @default.
- W4308479898 hasRelatedWork W3088312824 @default.
- W4308479898 hasRelatedWork W3124648670 @default.
- W4308479898 hasRelatedWork W3147787617 @default.
- W4308479898 hasRelatedWork W4205290991 @default.
- W4308479898 hasRelatedWork W4295855328 @default.
- W4308479898 hasVolume "41" @default.
- W4308479898 isParatext "false" @default.
- W4308479898 isRetracted "false" @default.
- W4308479898 workType "article" @default.
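
The predicate names above appear in shortened form. As a follow-up sketch, the query below pulls a few of the fields shown in this listing for the same work; the soa: prefix is an assumed SemOpenAlex ontology namespace, since the listing does not show full predicate IRIs, so verify it against the SemOpenAlex ontology before use:

```sparql
# Fetch title, DOI, and citation count for the work.
# The soa: namespace below is an assumption; the listing only
# shows shortened predicate names, not their full IRIs.
PREFIX soa: <https://semopenalex.org/ontology/>

SELECT ?title ?doi ?citedByCount
WHERE {
  <https://semopenalex.org/work/W4308479898>
      soa:title         ?title ;         # shown above as: title
      soa:doi           ?doi ;           # shown above as: doi
      soa:citedByCount  ?citedByCount .  # shown above as: citedByCount
}
```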