Matches in SemOpenAlex for { <https://semopenalex.org/work/W4313250830> ?p ?o ?g. }
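A query of this shape can be issued programmatically. Below is a minimal sketch in Python using SPARQLWrapper; the endpoint URL https://semopenalex.org/sparql is an assumption and not part of this record, and the query simply mirrors the quad pattern shown above.

```python
# Minimal sketch: fetch all triples about this work from SemOpenAlex.
# Assumes the public SPARQL endpoint at https://semopenalex.org/sparql (assumption).
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL
WORK_IRI = "https://semopenalex.org/work/W4313250830"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(f"""
    SELECT ?p ?o ?g
    WHERE {{ GRAPH ?g {{ <{WORK_IRI}> ?p ?o . }} }}
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    # Each binding corresponds to one line of the listing below.
    print(row["p"]["value"], row["o"]["value"])
```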
- W4313250830 endingPage "103" @default.
- W4313250830 startingPage "90" @default.
- W4313250830 abstract "Classic deep neural network (DNN) pruning mostly leverages software-based methodologies to tackle the accuracy/speed tradeoff, which involves complicated procedures such as critical-parameter searching, fine-tuning, and sparse training to find the best plan. In this article, we explore the opportunities of hardware runtime pruning and propose a regularity-aware hardware runtime pruning methodology, termed “BitXpro”, to empower versatile DNN inference. The method targets bit-level sparsity and sparsity irregularity in the parameters, and pinpoints and prunes the useless bits on-the-fly in the proposed BitXpro accelerator. The versatility of BitXpro lies in: 1) requiring no software effort; 2) being orthogonal to software-based pruning; and 3) supporting multiple precisions (both floating point and fixed point). Empirical studies on various domain-specific artificial intelligence (AI) tasks highlight the following results: 1) up to 8.27× speedup over the original nonpruned DNN and 10.81× speedup in collaboration with the software-pruned DNN; 2) up to 0.3% and 0.04% higher accuracy for the floating- and fixed-point DNNs, respectively; and 3) 6.01× and 8.20× performance improvement over state-of-the-art accelerators, with 0.068 mm² area and 74.82 mW (32-bit floating point) and 40.44 mW (16-bit fixed point) power consumption under the TSMC 28-nm technology library." @default.
- W4313250830 created "2023-01-06" @default.
- W4313250830 creator A5000220618 @default.
- W4313250830 creator A5014244388 @default.
- W4313250830 creator A5023380073 @default.
- W4313250830 creator A5070288466 @default.
- W4313250830 creator A5086445887 @default.
- W4313250830 date "2023-01-01" @default.
- W4313250830 modified "2023-09-30" @default.
- W4313250830 title "BitXpro: Regularity-Aware Hardware Runtime Pruning for Deep Neural Networks" @default.
- W4313250830 cites W1191365092 @default.
- W4313250830 cites W1522734439 @default.
- W4313250830 cites W1861492603 @default.
- W4313250830 cites W2003690406 @default.
- W4313250830 cites W2108598243 @default.
- W4313250830 cites W2194775991 @default.
- W4313250830 cites W2276892413 @default.
- W4313250830 cites W2516141709 @default.
- W4313250830 cites W2541839172 @default.
- W4313250830 cites W2585720638 @default.
- W4313250830 cites W2607041014 @default.
- W4313250830 cites W2625457103 @default.
- W4313250830 cites W2794141774 @default.
- W4313250830 cites W2798729263 @default.
- W4313250830 cites W2819476901 @default.
- W4313250830 cites W2904902077 @default.
- W4313250830 cites W2931118404 @default.
- W4313250830 cites W2962851801 @default.
- W4313250830 cites W2962874694 @default.
- W4313250830 cites W2963091558 @default.
- W4313250830 cites W2963446712 @default.
- W4313250830 cites W2964137095 @default.
- W4313250830 cites W2964233199 @default.
- W4313250830 cites W2980285956 @default.
- W4313250830 cites W2982770724 @default.
- W4313250830 cites W3014447010 @default.
- W4313250830 cites W3203082935 @default.
- W4313250830 cites W4247198796 @default.
- W4313250830 doi "https://doi.org/10.1109/tvlsi.2022.3221732" @default.
- W4313250830 hasPublicationYear "2023" @default.
- W4313250830 type Work @default.
- W4313250830 citedByCount "0" @default.
- W4313250830 crossrefType "journal-article" @default.
- W4313250830 hasAuthorship W4313250830A5000220618 @default.
- W4313250830 hasAuthorship W4313250830A5014244388 @default.
- W4313250830 hasAuthorship W4313250830A5023380073 @default.
- W4313250830 hasAuthorship W4313250830A5070288466 @default.
- W4313250830 hasAuthorship W4313250830A5086445887 @default.
- W4313250830 hasConcept C108010975 @default.
- W4313250830 hasConcept C11413529 @default.
- W4313250830 hasConcept C119857082 @default.
- W4313250830 hasConcept C154945302 @default.
- W4313250830 hasConcept C173608175 @default.
- W4313250830 hasConcept C199360897 @default.
- W4313250830 hasConcept C2524010 @default.
- W4313250830 hasConcept C2776214188 @default.
- W4313250830 hasConcept C2777904410 @default.
- W4313250830 hasConcept C28719098 @default.
- W4313250830 hasConcept C33923547 @default.
- W4313250830 hasConcept C41008148 @default.
- W4313250830 hasConcept C45357846 @default.
- W4313250830 hasConcept C50644808 @default.
- W4313250830 hasConcept C6557445 @default.
- W4313250830 hasConcept C68339613 @default.
- W4313250830 hasConcept C80444323 @default.
- W4313250830 hasConcept C84211073 @default.
- W4313250830 hasConcept C86803240 @default.
- W4313250830 hasConcept C94375191 @default.
- W4313250830 hasConceptScore W4313250830C108010975 @default.
- W4313250830 hasConceptScore W4313250830C11413529 @default.
- W4313250830 hasConceptScore W4313250830C119857082 @default.
- W4313250830 hasConceptScore W4313250830C154945302 @default.
- W4313250830 hasConceptScore W4313250830C173608175 @default.
- W4313250830 hasConceptScore W4313250830C199360897 @default.
- W4313250830 hasConceptScore W4313250830C2524010 @default.
- W4313250830 hasConceptScore W4313250830C2776214188 @default.
- W4313250830 hasConceptScore W4313250830C2777904410 @default.
- W4313250830 hasConceptScore W4313250830C28719098 @default.
- W4313250830 hasConceptScore W4313250830C33923547 @default.
- W4313250830 hasConceptScore W4313250830C41008148 @default.
- W4313250830 hasConceptScore W4313250830C45357846 @default.
- W4313250830 hasConceptScore W4313250830C50644808 @default.
- W4313250830 hasConceptScore W4313250830C6557445 @default.
- W4313250830 hasConceptScore W4313250830C68339613 @default.
- W4313250830 hasConceptScore W4313250830C80444323 @default.
- W4313250830 hasConceptScore W4313250830C84211073 @default.
- W4313250830 hasConceptScore W4313250830C86803240 @default.
- W4313250830 hasConceptScore W4313250830C94375191 @default.
- W4313250830 hasFunder F4320321001 @default.
- W4313250830 hasFunder F4320321133 @default.
- W4313250830 hasIssue "1" @default.
- W4313250830 hasLocation W43132508301 @default.
- W4313250830 hasOpenAccess W4313250830 @default.
- W4313250830 hasPrimaryLocation W43132508301 @default.
- W4313250830 hasRelatedWork W1509211761 @default.
- W4313250830 hasRelatedWork W1531488649 @default.
- W4313250830 hasRelatedWork W1585350690 @default.
- W4313250830 hasRelatedWork W2133693067 @default.
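For orientation only: the abstract's notion of "bit-level sparsity" can be illustrated with a toy sketch that decomposes a fixed-point weight into its nonzero ("essential") bits, the only bits a bit-serial multiplier actually has to process. Everything below is an illustrative assumption about the general concept, not the BitXpro hardware algorithm described in the paper.

```python
# Toy illustration of bit-level sparsity in fixed-point weights
# (illustrative only; not the BitXpro algorithm from the paper).

def essential_bits(weight: int, width: int = 8) -> list[int]:
    """Return the positions of the nonzero bits of a weight's bit pattern."""
    w = weight & ((1 << width) - 1)          # reinterpret as an unsigned bit pattern
    return [i for i in range(width) if (w >> i) & 1]

def bit_serial_multiply(weight: int, activation: int, width: int = 8) -> int:
    """Multiply by summing one shifted partial product per essential bit.

    Zero bits contribute nothing, so skipping ("pruning") them at runtime
    removes work without changing the result for non-negative weights.
    """
    return sum(activation << i for i in essential_bits(weight, width))

weights = [0b00010010, 0b01000000, 0b00000000, 0b00111111]
total_bits = len(weights) * 8
used_bits = sum(len(essential_bits(w)) for w in weights)
print(f"bit-level sparsity: {1 - used_bits / total_bits:.0%} of weight bits are zero")
assert bit_serial_multiply(0b00010010, 5) == 0b00010010 * 5
```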